github
html_url | issue_url | id | node_id | user | created_at | updated_at | author_association | body | reactions | issue | performed_via_github_app |
---|---|---|---|---|---|---|---|---|---|---|---|
https://github.com/simonw/datasette/pull/1812#issuecomment-1249355888 | https://api.github.com/repos/simonw/datasette/issues/1812 | 1249355888 | IC_kwDOBm6k_c5Kd6hw | 22429695 | 2022-09-16T13:18:37Z | 2022-09-16T13:18:37Z | NONE | # [Codecov](https://codecov.io/gh/simonw/datasette/pull/1812?src=pr&el=h1&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison) Report Base: **91.70**% // Head: **91.70**% // No change to project coverage :thumbsup: > Coverage data is based on head [(`b3855e7`)](https://codecov.io/gh/simonw/datasette/pull/1812?src=pr&el=desc&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison) compared to base [(`b40872f`)](https://codecov.io/gh/simonw/datasette/commit/b40872f5e5ae5dad331c58f75451e2d206565196?el=desc&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison). > Patch has no changes to coverable lines. <details><summary>Additional details and impacted files</summary> ```diff @@ Coverage Diff @@ ## main #1812 +/- ## ======================================= Coverage 91.70% 91.70% ======================================= Files 38 38 Lines 4735 4735 ======================================= Hits 4342 4342 Misses 393 393 ``` Help us with your feedback. Take ten seconds to tell us [how you rate us](https://about.codecov.io/nps?utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison). Have a feature suggestion? [Share it here.](https://app.codecov.io/gh/feedback/?utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison) </details> [:umbrella: View full report at Codecov](https://codecov.io/gh/simonw/datasette/pull/1812?src=pr&el=continue&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison). 
:loudspeaker: Do you have feedback about the report comment? [Let us know in this issue](https://about.codecov.io/codecov-pr-comment-feedback/?utm_medium=referral&… | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1375930971 | |
https://github.com/simonw/datasette/pull/1812#issuecomment-1249746777 | https://api.github.com/repos/simonw/datasette/issues/1812 | 1249746777 | IC_kwDOBm6k_c5KfZ9Z | 9599 | 2022-09-16T19:50:45Z | 2022-09-16T19:50:45Z | OWNER | Main difference I can see: ![CleanShot 2022-09-16 at 12 49 47@2x](https://user-images.githubusercontent.com/9599/190719563-a7b1bcc7-bfdc-4759-95c1-e19bcd0217c3.png) | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1375930971 | |
https://github.com/simonw/datasette/pull/1812#issuecomment-1249745637 | https://api.github.com/repos/simonw/datasette/issues/1812 | 1249745637 | IC_kwDOBm6k_c5KfZrl | 9599 | 2022-09-16T19:49:12Z | 2022-09-16T19:49:12Z | OWNER | Preview looks good. | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1375930971 | |
https://github.com/simonw/datasette/issues/1809#issuecomment-1249985741 | https://api.github.com/repos/simonw/datasette/issues/1809 | 1249985741 | IC_kwDOBm6k_c5KgUTN | 9599 | 2022-09-17T03:04:51Z | 2022-09-17T03:04:51Z | OWNER | I'm going to throw an error in `ds.render_template()` if you haven't previously called `await ds.invoke_startup()`. | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1373595927 | |
https://github.com/simonw/datasette/issues/1809#issuecomment-1249985971 | https://api.github.com/repos/simonw/datasette/issues/1809 | 1249985971 | IC_kwDOBm6k_c5KgUWz | 9599 | 2022-09-17T03:06:32Z | 2022-09-17T03:06:32Z | OWNER | This is likely going to cause some tests in plugins to break, but I'm OK with that - I'll fix them as I find them once this release is out. | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1373595927 | |
https://github.com/simonw/datasette/issues/1809#issuecomment-1249986079 | https://api.github.com/repos/simonw/datasette/issues/1809 | 1249986079 | IC_kwDOBm6k_c5KgUYf | 9599 | 2022-09-17T03:07:24Z | 2022-09-17T03:07:24Z | OWNER | Datasette's own tests started to break because calls to the `TestClient` were performed without awaiting that method. I fixed that by adding this to `_request()` inside that class: ```python async def _request( self, path, follow_redirects=True, redirect_count=0, method="GET", cookies=None, headers=None, post_body=None, content_type=None, if_none_match=None, ): if not self.ds._startup_invoked: await self.ds.invoke_startup() ``` | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1373595927 | |
https://github.com/simonw/datasette/issues/1809#issuecomment-1249987643 | https://api.github.com/repos/simonw/datasette/issues/1809 | 1249987643 | IC_kwDOBm6k_c5KgUw7 | 9599 | 2022-09-17T03:19:24Z | 2022-09-17T03:19:24Z | OWNER | In looking at the documentation on [writing tests](https://docs.datasette.io/en/latest/testing_plugins.html), there are a lot of examples like this: ```python async def test_that_opens_the_debugger_or_errors(): ds = Datasette([db_path], pdb=True) response = await ds.client.get("/") ``` I really don't like having to tell people to add `await ds.invoke_startup()` to every test that might look like this. Since it's safe to call that function multiple times, I'm going to have `ds.client.get()` and friends call it for you too - so if you forget in a plugin test it won't matter. | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1373595927 | |
https://github.com/simonw/datasette/issues/1809#issuecomment-1249990033 | https://api.github.com/repos/simonw/datasette/issues/1809 | 1249990033 | IC_kwDOBm6k_c5KgVWR | 9599 | 2022-09-17T03:39:05Z | 2022-09-17T03:39:05Z | OWNER | New docs section on the need to call `await ds.invoke_startup()`: https://github.com/simonw/datasette/blob/ddc999ad1296e8c69cffede3e367dda059b8adad/docs/testing_plugins.rst#setting-up-a-datasette-test-instance | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1373595927 | |
https://github.com/simonw/datasette/issues/1813#issuecomment-1250901367 | https://api.github.com/repos/simonw/datasette/issues/1813 | 1250901367 | IC_kwDOBm6k_c5Kjz13 | 883348 | 2022-09-19T11:34:45Z | 2022-09-19T11:34:45Z | CONTRIBUTOR | Oh, and by writing this I just realized the difference: the URL on fly.io is with a custom SQL command whereas the local one is without. It seems that there is no pagination when using custom SQL commands, which makes sense. Sorry for this useless issue - maybe this can be useful for someone else / me in the future. Thanks again for this wonderful project! | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1377811868 | |
https://github.com/simonw/datasette/issues/1816#issuecomment-1251724180 | https://api.github.com/repos/simonw/datasette/issues/1816 | 1251724180 | IC_kwDOBm6k_c5Km8uU | 9599 | 2022-09-20T01:13:05Z | 2022-09-20T01:13:05Z | OWNER | Oops, that has a bug: ``` Error: Invalid setting '{key}' in settings.json ``` | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1378640768 | |
https://github.com/simonw/datasette/issues/1814#issuecomment-1251677220 | https://api.github.com/repos/simonw/datasette/issues/1814 | 1251677220 | IC_kwDOBm6k_c5KmxQk | 9599 | 2022-09-19T23:34:30Z | 2022-09-19T23:34:30Z | OWNER | The `settings.json` file can only be used with settings that are set using `--setting name value` - the full list of those is here: https://docs.datasette.io/en/stable/settings.html The `--static` option works differently. In configuration directory mode you can skip it entirely and instead have a `/static/` folder - so your directory structure would look like this: ``` bibliography/static/styles.css ``` And then when you run `datasette bibliography/` the following URL will work: http://127.0.0.1:8001/static/styles.css | { "total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 1, "eyes": 0 } |
1378495690 | |
https://github.com/simonw/datasette/issues/1814#issuecomment-1251677554 | https://api.github.com/repos/simonw/datasette/issues/1814 | 1251677554 | IC_kwDOBm6k_c5KmxVy | 9599 | 2022-09-19T23:35:06Z | 2022-09-19T23:35:06Z | OWNER | It might have been useful for Datasette to show an error when started against a `settings.json` file that contains an invalid setting though. | { "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1378495690 | |
https://github.com/simonw/datasette/issues/1816#issuecomment-1251682970 | https://api.github.com/repos/simonw/datasette/issues/1816 | 1251682970 | IC_kwDOBm6k_c5Kmyqa | 9599 | 2022-09-19T23:44:54Z | 2022-09-19T23:44:54Z | OWNER | I was going to add type validation too, but that's actually a bit tricky because the logic for that currently lives in Click option parsing here: https://github.com/simonw/datasette/blob/ddc999ad1296e8c69cffede3e367dda059b8adad/datasette/cli.py#L71-L88 | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1378640768 | |
https://github.com/simonw/datasette/issues/526#issuecomment-1254064260 | https://api.github.com/repos/simonw/datasette/issues/526 | 1254064260 | IC_kwDOBm6k_c5Kv4CE | 536941 | 2022-09-21T18:17:04Z | 2022-09-21T18:18:01Z | CONTRIBUTOR | Hi @simonw, this is becoming more of a bother for my [labor data warehouse](https://labordata.bunkum.us/). Is there any research or a spike I could do that would help you investigate this issue? | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
459882902 | |
https://github.com/simonw/datasette/issues/1646#issuecomment-1272149176 | https://api.github.com/repos/simonw/datasette/issues/1646 | 1272149176 | IC_kwDOBm6k_c5L03S4 | 9599 | 2022-10-07T23:06:17Z | 2022-10-07T23:06:17Z | OWNER | Updated documentation: https://docs.datasette.io/en/latest/settings.html#configuration-directory-mode | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1157182254 | |
https://github.com/simonw/datasette/issues/1362#issuecomment-1272228740 | https://api.github.com/repos/simonw/datasette/issues/1362 | 1272228740 | IC_kwDOBm6k_c5L1KuE | 9599 | 2022-10-08T05:03:56Z | 2022-10-08T05:03:56Z | OWNER | Useful example: how Play framework does this https://www.playframework.com/documentation/2.8.1/CspFilter | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
912864936 | |
https://github.com/simonw/datasette/issues/1836#issuecomment-1272344884 | https://api.github.com/repos/simonw/datasette/issues/1836 | 1272344884 | IC_kwDOBm6k_c5L1nE0 | 9599 | 2022-10-08T15:41:28Z | 2022-10-08T15:41:28Z | OWNER | Let's switch to `mode=ro` when the `inspect` command runs; we can use this issue for that. | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1400374908 | |
https://github.com/simonw/datasette/issues/1836#issuecomment-1272357976 | https://api.github.com/repos/simonw/datasette/issues/1836 | 1272357976 | IC_kwDOBm6k_c5L1qRY | 536941 | 2022-10-08T16:56:51Z | 2022-10-08T16:56:51Z | CONTRIBUTOR | When you are running from Docker, you will **always** want to run as `mode=ro`, because the same thing that is causing duplication in the inspect layer will cause duplication in the final container read/write layer when `datasette serve` runs. | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1400374908 | |
https://github.com/simonw/datasette/issues/1362#issuecomment-1272369443 | https://api.github.com/repos/simonw/datasette/issues/1362 | 1272369443 | IC_kwDOBm6k_c5L1tEj | 9599 | 2022-10-08T18:03:03Z | 2022-10-08T18:03:03Z | OWNER | Asked for tips on Twitter: https://twitter.com/simonw/status/1578561096520114176 | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
912864936 | |
https://github.com/simonw/datasette/issues/1362#issuecomment-1272369603 | https://api.github.com/repos/simonw/datasette/issues/1362 | 1272369603 | IC_kwDOBm6k_c5L1tHD | 9599 | 2022-10-08T18:03:56Z | 2022-10-08T18:03:56Z | OWNER | This document is useful: https://csp.withgoogle.com/docs/strict-csp.html | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
912864936 | |
https://github.com/simonw/datasette/issues/1362#issuecomment-1272369712 | https://api.github.com/repos/simonw/datasette/issues/1362 | 1272369712 | IC_kwDOBm6k_c5L1tIw | 9599 | 2022-10-08T18:04:31Z | 2022-10-08T18:05:05Z | OWNER | Also this series: https://scotthelme.co.uk/tag/csp/ - via https://twitter.com/adamchainz/status/1578762884481368065 | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
912864936 | |
https://github.com/simonw/datasette/issues/1362#issuecomment-1272376377 | https://api.github.com/repos/simonw/datasette/issues/1362 | 1272376377 | IC_kwDOBm6k_c5L1uw5 | 9599 | 2022-10-08T18:42:09Z | 2022-10-08T18:42:09Z | OWNER | And a useful cheat sheet https://scotthelme.co.uk/csp-cheat-sheet/ | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
912864936 | |
https://github.com/simonw/datasette/issues/526#issuecomment-1258337011 | https://api.github.com/repos/simonw/datasette/issues/526 | 1258337011 | IC_kwDOBm6k_c5LALLz | 536941 | 2022-09-26T16:49:48Z | 2022-09-26T16:49:48Z | CONTRIBUTOR | I think the smallest change that gets close to what I want is to change the behavior so that `max_returned_rows` is not applied in the `execute` method when we are asking for a CSV of a query. There are some infelicities in that approach, but I'll make a PR to make it easier to discuss. | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
459882902 | |
https://github.com/simonw/datasette/pull/1820#issuecomment-1258803261 | https://api.github.com/repos/simonw/datasette/issues/1820 | 1258803261 | IC_kwDOBm6k_c5LB9A9 | 536941 | 2022-09-27T00:03:09Z | 2022-09-27T00:03:09Z | CONTRIBUTOR | In the pattern in this PR, `max_returned_rows` controls the maximum rows rendered through HTML and JSON, and the CSV render bypasses that. I think it would be better to have each of these different query renderers take more direct control of how many rows to fetch, instead of relying on the internals of the `execute` method. Generally, users will not want to paginate through tens of thousands of results, but often will want to download a full query as JSON or as CSV. | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1386456717 | |
https://github.com/simonw/datasette/pull/1820#issuecomment-1258601033 | https://api.github.com/repos/simonw/datasette/issues/1820 | 1258601033 | IC_kwDOBm6k_c5LBLpJ | 22429695 | 2022-09-26T20:32:47Z | 2022-10-07T03:58:13Z | NONE | # [Codecov](https://codecov.io/gh/simonw/datasette/pull/1820?src=pr&el=h1&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison) Report Base: **92.50**% // Head: **92.51**% // Increases project coverage by **`+0.01%`** :tada: > Coverage data is based on head [(`9bead2a`)](https://codecov.io/gh/simonw/datasette/pull/1820?src=pr&el=desc&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison) compared to base [(`eff1124`)](https://codecov.io/gh/simonw/datasette/commit/eff112498ecc499323c26612d707908831446d25?el=desc&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison). > Patch coverage: 100.00% of modified lines in pull request are covered. 
<details><summary>Additional details and impacted files</summary> ```diff @@ Coverage Diff @@ ## main #1820 +/- ## ========================================== + Coverage 92.50% 92.51% +0.01% ========================================== Files 35 35 Lines 4400 4406 +6 ========================================== + Hits 4070 4076 +6 Misses 330 330 ``` | [Impacted Files](https://codecov.io/gh/simonw/datasette/pull/1820?src=pr&el=tree&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison) | Coverage Δ | | |---|---|---| | [datasette/app.py](https://codecov.io/gh/simonw/datasette/pull/1820/diff?src=pr&el=tree&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison#diff-ZGF0YXNldHRlL2FwcC5weQ==) | `94.11% <ø> (ø)` | | | [datasette/views/base.py](https://codecov.io/gh/simonw/datasette/pull/1820/diff?src=pr&el=tree&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison#diff-ZGF0YXNldHRlL3ZpZXdzL2Jhc2UucHk=) | `94.80% <10… | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1386456717 | |
https://github.com/simonw/datasette/issues/1821#issuecomment-1258692555 | https://api.github.com/repos/simonw/datasette/issues/1821 | 1258692555 | IC_kwDOBm6k_c5LBh_L | 9599 | 2022-09-26T22:06:39Z | 2022-09-26T22:06:39Z | OWNER | - https://github.com/simonw/datasette/actions/runs/3131344150 - https://github.com/simonw/datasette/releases/tag/0.63a0 - https://pypi.org/project/datasette/0.63a0/ | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1386734383 | |
https://github.com/simonw/datasette/issues/1818#issuecomment-1258735283 | https://api.github.com/repos/simonw/datasette/issues/1818 | 1258735283 | IC_kwDOBm6k_c5LBsaz | 9599 | 2022-09-26T22:47:19Z | 2022-09-26T22:47:19Z | OWNER | That's a really interesting idea: for a lot of databases (those made out of straight imports from CSV) `max(rowid)` would indeed reflect the size of the table, but would be a MUCH faster operation than attempting a `count(*)`. | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1384549993 | |
https://github.com/simonw/datasette/issues/1818#issuecomment-1258735747 | https://api.github.com/repos/simonw/datasette/issues/1818 | 1258735747 | IC_kwDOBm6k_c5LBsiD | 9599 | 2022-09-26T22:47:59Z | 2022-09-26T22:47:59Z | OWNER | Another option here is to tie into a feature I built in `sqlite-utils` with this problem in mind but never introduced on the Datasette side of things: https://sqlite-utils.datasette.io/en/stable/python-api.html#cached-table-counts-using-triggers | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1384549993 | |
https://github.com/simonw/datasette/issues/1819#issuecomment-1258738435 | https://api.github.com/repos/simonw/datasette/issues/1819 | 1258738435 | IC_kwDOBm6k_c5LBtMD | 9599 | 2022-09-26T22:52:19Z | 2022-09-26T22:52:19Z | OWNER | This is a good idea. | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1385026210 | |
https://github.com/simonw/datasette/issues/1818#issuecomment-1258738740 | https://api.github.com/repos/simonw/datasette/issues/1818 | 1258738740 | IC_kwDOBm6k_c5LBtQ0 | 5363 | 2022-09-26T22:52:45Z | 2022-09-26T22:55:57Z | NONE | Thoughts on the order of precedence to use: * the sqlite-utils count, if present - closest thing to a standard, I guess. * `max(rowid)`, if the ids of the first and/or last x rows are all contiguous - a cheap/imperfect heuristic for whether the table is a straight dump. If the check passes, still append `est.` to the display. * `count(*)`, if enabled in Datasette | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1384549993 | |
https://github.com/simonw/datasette/issues/1819#issuecomment-1258746600 | https://api.github.com/repos/simonw/datasette/issues/1819 | 1258746600 | IC_kwDOBm6k_c5LBvLo | 9599 | 2022-09-26T23:05:40Z | 2022-09-26T23:05:40Z | OWNER | Implementing it like this, so at least you can copy and paste the SQL query back out again: <img width="796" alt="image" src="https://user-images.githubusercontent.com/9599/192395953-48512c94-10e0-4cf8-8ae5-b9e65e3d7b0f.png"> I'm not doing a full textarea because this error can be raised in multiple places, including on the table page itself. It's not just an error associated with the manual query page. | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1385026210 | |
https://github.com/simonw/datasette/issues/1822#issuecomment-1258757544 | https://api.github.com/repos/simonw/datasette/issues/1822 | 1258757544 | IC_kwDOBm6k_c5LBx2o | 9599 | 2022-09-26T23:21:23Z | 2022-09-26T23:21:23Z | OWNER | Everything on https://docs.datasette.io/en/stable/internals.html that uses keyword arguments should do this I think. | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1386854246 | |
https://github.com/simonw/datasette/issues/1819#issuecomment-1258754105 | https://api.github.com/repos/simonw/datasette/issues/1819 | 1258754105 | IC_kwDOBm6k_c5LBxA5 | 9599 | 2022-09-26T23:16:15Z | 2022-09-26T23:16:15Z | OWNER | Demo: https://latest.datasette.io/_memory?sql=with+recursive+counter(x)+as+(%0D%0A++select+0%0D%0A++++union%0D%0A++select+x+%2B+1+from+counter%0D%0A)%2C%0D%0Ablah+as+(select+*+from+counter+limit+5000000)%0D%0Aselect+count(*)+from+blah | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1385026210 | |
https://github.com/simonw/datasette/issues/1817#issuecomment-1258756231 | https://api.github.com/repos/simonw/datasette/issues/1817 | 1258756231 | IC_kwDOBm6k_c5LBxiH | 9599 | 2022-09-26T23:19:34Z | 2022-09-26T23:19:34Z | OWNER | This is a good idea - it's something I should do before Datasette 1.0. I was a tiny bit worried about compatibility (Datasette is 3.7+) but it looks like they have been in Python since 3.0! | { "total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 1, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1384273985 | |
https://github.com/simonw/datasette/issues/1822#issuecomment-1258760299 | https://api.github.com/repos/simonw/datasette/issues/1822 | 1258760299 | IC_kwDOBm6k_c5LByhr | 9599 | 2022-09-26T23:25:12Z | 2022-09-26T23:25:55Z | OWNER | A start: ```diff diff --git a/datasette/utils/asgi.py b/datasette/utils/asgi.py index 8a2fa060..41ade961 100644 --- a/datasette/utils/asgi.py +++ b/datasette/utils/asgi.py @@ -118,7 +118,7 @@ class Request: return dict(parse_qsl(body.decode("utf-8"), keep_blank_values=True)) @classmethod - def fake(cls, path_with_query_string, method="GET", scheme="http", url_vars=None): + def fake(cls, path_with_query_string, *, method="GET", scheme="http", url_vars=None): """Useful for constructing Request objects for tests""" path, _, query_string = path_with_query_string.partition("?") scope = { @@ -204,7 +204,7 @@ class AsgiWriter: ) -async def asgi_send_json(send, info, status=200, headers=None): +async def asgi_send_json(send, info, *, status=200, headers=None): headers = headers or {} await asgi_send( send, @@ -215,7 +215,7 @@ async def asgi_send_json(send, info, status=200, headers=None): ) -async def asgi_send_html(send, html, status=200, headers=None): +async def asgi_send_html(send, html, *, status=200, headers=None): headers = headers or {} await asgi_send( send, @@ -226,7 +226,7 @@ async def asgi_send_html(send, html, status=200, headers=None): ) -async def asgi_send_redirect(send, location, status=302): +async def asgi_send_redirect(send, location, *, status=302): await asgi_send( send, "", @@ -236,12 +236,12 @@ async def asgi_send_redirect(send, location, status=302): ) -async def asgi_send(send, content, status, headers=None, content_type="text/plain"): +async def asgi_send(send, content, status, *, headers=None, content_type="text/plain"): await asgi_start(send, status, headers, content_type) await send({"type": "http.response.body", "body": content.encode("utf-8")}) -async def asgi_start(send, status, headers=None, 
content_type="text/plain"): +async def asgi_start(send, status, *, headers=None, con… | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1386854246 | |
https://github.com/simonw/datasette/issues/1817#issuecomment-1258818028 | https://api.github.com/repos/simonw/datasette/issues/1817 | 1258818028 | IC_kwDOBm6k_c5LCAns | 9599 | 2022-09-27T00:27:53Z | 2022-09-27T00:27:53Z | OWNER | Made a start on this: ```diff diff --git a/datasette/hookspecs.py b/datasette/hookspecs.py index 34e19664..fe0971e5 100644 --- a/datasette/hookspecs.py +++ b/datasette/hookspecs.py @@ -31,25 +31,29 @@ def prepare_jinja2_environment(env, datasette): @hookspec -def extra_css_urls(template, database, table, columns, view_name, request, datasette): +def extra_css_urls( + template, database, table, columns, sql, params, view_name, request, datasette +): """Extra CSS URLs added by this plugin""" @hookspec -def extra_js_urls(template, database, table, columns, view_name, request, datasette): +def extra_js_urls( + template, database, table, columns, sql, params, view_name, request, datasette +): """Extra JavaScript URLs added by this plugin""" @hookspec def extra_body_script( - template, database, table, columns, view_name, request, datasette + template, database, table, columns, sql, params, view_name, request, datasette ): """Extra JavaScript code to be included in <script> at bottom of body""" @hookspec def extra_template_vars( - template, database, table, columns, view_name, request, datasette + template, database, table, columns, sql, params, view_name, request, datasette ): """Extra template variables to be made available to the template - can return dict or callable or awaitable""" ``` ```diff diff --git a/datasette/app.py b/datasette/app.py index 03d1dacc..2f3a46fe 100644 --- a/datasette/app.py +++ b/datasette/app.py @@ -1036,7 +1036,9 @@ class Datasette: return await template.render_async(template_context) - async def _asset_urls(self, key, template, context, request, view_name): + async def _asset_urls( + self, key, template, context, request, view_name, sql, params + ): # Flatten list-of-lists from plugins: seen_urls = set() collected = 
[] @@ -1045,6 +1047,8 @@ class Datasette: database=context.get("database"), … | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1384273985 | |
https://github.com/simonw/datasette/issues/1822#issuecomment-1258827688 | https://api.github.com/repos/simonw/datasette/issues/1822 | 1258827688 | IC_kwDOBm6k_c5LCC-o | 9599 | 2022-09-27T00:44:04Z | 2022-09-27T00:44:04Z | OWNER | I'll do this in a PR. | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1386854246 | |
https://github.com/simonw/datasette/pull/1823#issuecomment-1258828509 | https://api.github.com/repos/simonw/datasette/issues/1823 | 1258828509 | IC_kwDOBm6k_c5LCDLd | 9599 | 2022-09-27T00:45:26Z | 2022-09-27T00:45:26Z | OWNER | I should update the documentation to reflect this change. | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1386917344 | |
https://github.com/simonw/datasette/pull/1823#issuecomment-1258828705 | https://api.github.com/repos/simonw/datasette/issues/1823 | 1258828705 | IC_kwDOBm6k_c5LCDOh | 9599 | 2022-09-27T00:45:46Z | 2022-09-27T00:45:46Z | OWNER | Also need to do a bit more of an audit to see if there is anywhere else that this style should be applied. | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1386917344 | |
https://github.com/simonw/datasette/pull/1823#issuecomment-1258833358 | https://api.github.com/repos/simonw/datasette/issues/1823 | 1258833358 | IC_kwDOBm6k_c5LCEXO | 22429695 | 2022-09-27T00:54:15Z | 2022-10-05T04:37:54Z | NONE | # [Codecov](https://codecov.io/gh/simonw/datasette/pull/1823?src=pr&el=h1&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison) Report Base: **91.58**% // Head: **92.50**% // Increases project coverage by **`+0.91%`** :tada: > Coverage data is based on head [(`b545b6a`)](https://codecov.io/gh/simonw/datasette/pull/1823?src=pr&el=desc&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison) compared to base [(`5f9f567`)](https://codecov.io/gh/simonw/datasette/commit/5f9f567acbc58c9fcd88af440e68034510fb5d2b?el=desc&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison). > Patch coverage: 90.47% of modified lines in pull request are covered. 
<details><summary>Additional details and impacted files</summary> ```diff @@ Coverage Diff @@ ## main #1823 +/- ## ========================================== + Coverage 91.58% 92.50% +0.91% ========================================== Files 36 35 -1 Lines 4444 4400 -44 ========================================== Hits 4070 4070 + Misses 374 330 -44 ``` | [Impacted Files](https://codecov.io/gh/simonw/datasette/pull/1823?src=pr&el=tree&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison) | Coverage Δ | | |---|---|---| | [datasette/utils/asgi.py](https://codecov.io/gh/simonw/datasette/pull/1823/diff?src=pr&el=tree&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison#diff-ZGF0YXNldHRlL3V0aWxzL2FzZ2kucHk=) | `91.06% <88.23%> (ø)` | | | [datasette/app.py](https://codecov.io/gh/simonw/datasette/pull/1823/diff?src=pr&el=tree&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison#diff-ZGF0YXNldHRlL2FwcC5weQ==) | `94.11%… | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1386917344 | |
https://github.com/simonw/datasette/issues/526#issuecomment-1258846992 | https://api.github.com/repos/simonw/datasette/issues/526 | 1258846992 | IC_kwDOBm6k_c5LCHsQ | 9599 | 2022-09-27T01:21:41Z | 2022-09-27T01:21:41Z | OWNER | My main concern here is that public Datasette instances could easily have all of their available database connections consumed by long-running queries - either accidentally or deliberately. I do totally understand the need for this feature though. I think it can absolutely make sense provided it's protected by authentication and permissions. Maybe even limit the number of concurrent downloads such that there's always at least one database connection free for other requests. | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
459882902 | |
https://github.com/simonw/datasette/issues/526#issuecomment-1258849766 | https://api.github.com/repos/simonw/datasette/issues/526 | 1258849766 | IC_kwDOBm6k_c5LCIXm | 536941 | 2022-09-27T01:27:03Z | 2022-09-27T01:27:03Z | CONTRIBUTOR | I agree with that concern! But if I'm understanding the code correctly, `max_returned_rows` does not protect against long-running queries in any way. | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
459882902 | |
https://github.com/simonw/datasette/issues/526#issuecomment-1258864140 | https://api.github.com/repos/simonw/datasette/issues/526 | 1258864140 | IC_kwDOBm6k_c5LCL4M | 9599 | 2022-09-27T01:55:32Z | 2022-09-27T01:55:32Z | OWNER | That recursive query is a great example of the kind of thing having a maximum row limit protects against. Imagine if Datasette CSVs did allow unlimited retrievals. Someone could hit the CSV endpoint for that recursive query and tie up Datasette's SQL connection effectively forever. Even if this feature becomes a permission-guarded thing we still need to take that case into account. At the very least it would be good if the query could be cancelled if the client disconnects - so if someone accidentally starts an infinite query they can cancel the request and free up the server resources. It might be a good idea to implement a page that shows "currently running" queries and allows users with the right permission to terminate them from that page. Another option: a "limit of last resort" - either a very high row limit (10,000,000 perhaps) or even a time limit, saying that all queries will be cancelled if they take longer than thirty minutes or similar. | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
459882902 | |
https://github.com/simonw/datasette/issues/526#issuecomment-1258860845 | https://api.github.com/repos/simonw/datasette/issues/526 | 1258860845 | IC_kwDOBm6k_c5LCLEt | 9599 | 2022-09-27T01:48:31Z | 2022-09-27T01:50:01Z | OWNER | The protection is supposed to be from this line: ```python rows = cursor.fetchmany(max_returned_rows + 1) ``` By capping the call to `.fetchmany()` at `max_returned_rows + 1` (the `+ 1` is to allow detection of whether or not there is a next page) I'm ensuring that Datasette never attempts to iterate over a huge result set. SQLite and the `sqlite3` library seem to handle this correctly. Here's an example: ```pycon >>> import sqlite3 >>> conn = sqlite3.connect(":memory:") >>> cursor = conn.execute(""" ... with recursive counter(x) as ( ... select 0 ... union ... select x + 1 from counter ... ) ... select * from counter""") >>> cursor.fetchmany(10) [(0,), (1,), (2,), (3,), (4,), (5,), (6,), (7,), (8,), (9,)] ``` `counter` there is an infinitely long table ([see TIL](https://til.simonwillison.net/sqlite/simple-recursive-cte)) - but we can retrieve the first 10 results without going into an infinite loop. | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
459882902 | |
https://github.com/simonw/datasette/issues/526#issuecomment-1258871525 | https://api.github.com/repos/simonw/datasette/issues/526 | 1258871525 | IC_kwDOBm6k_c5LCNrl | 536941 | 2022-09-27T02:09:32Z | 2022-09-27T02:14:53Z | CONTRIBUTOR | thanks @simonw, i learned something i didn't know about sqlite's execution model! > Imagine if Datasette CSVs did allow unlimited retrievals. Someone could hit the CSV endpoint for that recursive query and tie up Datasette's SQL connection effectively forever. why wouldn't the `sqlite_timelimit` guard prevent that? --- on my local version which has the code to [turn off truncations for query csv](#1820), `sqlite_timelimit` does protect me. ![Screenshot 2022-09-26 at 22-14-31 Error 500](https://user-images.githubusercontent.com/536941/192415680-94b32b7f-868f-4b89-8194-5752d45f6009.png) | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
459882902 | |
https://github.com/simonw/datasette/issues/526#issuecomment-1258878311 | https://api.github.com/repos/simonw/datasette/issues/526 | 1258878311 | IC_kwDOBm6k_c5LCPVn | 536941 | 2022-09-27T02:19:48Z | 2022-09-27T02:19:48Z | CONTRIBUTOR | this sql query doesn't trip up `maximum_returned_rows` but does timeout ```sql with recursive counter(x) as ( select 0 union select x + 1 from counter ) select * from counter LIMIT 10 OFFSET 100000000 ``` | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
459882902 | |
https://github.com/simonw/datasette/issues/526#issuecomment-1258905781 | https://api.github.com/repos/simonw/datasette/issues/526 | 1258905781 | IC_kwDOBm6k_c5LCWC1 | 9599 | 2022-09-27T03:03:35Z | 2022-09-27T03:03:47Z | OWNER | Yes good point, the time limit does already protect against that. I've been contemplating a permissioned-users-only relaxation of that time limit too, and I got that idea mixed up with this one in my head. On that basis maybe this feature would be safe after all? Would need to do some testing, but it may be that the existing time limit provides enough protection here already. | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
459882902 | |
https://github.com/simonw/datasette/issues/526#issuecomment-1258906440 | https://api.github.com/repos/simonw/datasette/issues/526 | 1258906440 | IC_kwDOBm6k_c5LCWNI | 9599 | 2022-09-27T03:04:37Z | 2022-09-27T03:04:37Z | OWNER | It would be really neat if we could explore this idea in a plugin, but I don't think Datasette has plugin hooks in the right place for that at the moment. | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
459882902 | |
https://github.com/simonw/datasette/issues/526#issuecomment-1258910228 | https://api.github.com/repos/simonw/datasette/issues/526 | 1258910228 | IC_kwDOBm6k_c5LCXIU | 536941 | 2022-09-27T03:11:07Z | 2022-09-27T03:11:07Z | CONTRIBUTOR | i think this feature would be safe, as it's really only the time limit that can (and, imo, should) protect against long-running queries, as it is pretty easy to make very expensive queries that don't return many rows. moving away from `max_returned_rows` will require some thinking about: 1. memory usage and data flows to handle potentially very large result sets 2. how to avoid rendering tens or hundreds of thousands of [html rows](#1655). | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
459882902 | |
https://github.com/simonw/datasette/issues/526#issuecomment-1259693536 | https://api.github.com/repos/simonw/datasette/issues/526 | 1259693536 | IC_kwDOBm6k_c5LFWXg | 9599 | 2022-09-27T15:42:55Z | 2022-09-27T15:42:55Z | OWNER | It's interesting to note WHY the time limit works against this so well. The time limit as-implemented looks like this: https://github.com/simonw/datasette/blob/5f9f567acbc58c9fcd88af440e68034510fb5d2b/datasette/utils/__init__.py#L181-L201 The key here is `conn.set_progress_handler(handler, n)` - which specifies that the handler function should be called every `n` SQLite operations. The handler function then checks to see if too much time has transpired and conditionally cancels the query. This also doubles up as a "maximum number of operations" guard, which is what's happening when you attempt to fetch an infinite number of rows from an infinite table. That limit code could even be extended to say "exit the query after either 5s or 50,000,000 operations". I don't think that's necessary though. To be honest I'm having trouble with the idea of dropping `max_returned_rows` mainly because what Datasette does (allow arbitrary untrusted SQL queries) is dangerous, so I've designed in multiple redundant defence-in-depth mechanisms right from the start. | { "total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 1, "rocket": 0, "eyes": 0 } |
459882902 | |
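The progress-handler pattern described in that comment can be sketched in a few lines of Python. This is a simplified illustration of the technique, not Datasette's actual `sqlite_timelimit` implementation — the function name, the 1,000-operation interval, and the millisecond deadline handling here are all assumptions for the sketch:

```python
import sqlite3
import time


def execute_with_time_limit(conn, sql, time_limit_ms):
    # SQLite calls the handler every n virtual-machine operations;
    # returning a non-zero value aborts the running query, which
    # surfaces as sqlite3.OperationalError ("interrupted").
    start = time.perf_counter()

    def handler():
        elapsed_ms = (time.perf_counter() - start) * 1000
        return 1 if elapsed_ms > time_limit_ms else 0

    conn.set_progress_handler(handler, 1000)  # check every 1,000 ops
    try:
        return conn.execute(sql)
    finally:
        # Clear the handler so later queries are not affected.
        conn.set_progress_handler(None, 1000)
```

Run against the infinite `counter` CTE from earlier in the thread, this raises `OperationalError` once the deadline passes — which is exactly the "maximum number of operations" side effect the comment describes.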
https://github.com/simonw/datasette/issues/526#issuecomment-1259718517 | https://api.github.com/repos/simonw/datasette/issues/526 | 1259718517 | IC_kwDOBm6k_c5LFcd1 | 536941 | 2022-09-27T16:02:51Z | 2022-09-27T16:04:46Z | CONTRIBUTOR | i think that `max_returned_rows` **is** a defense mechanism, just not for connection exhaustion. `max_returned_rows` is a defense mechanism against **memory bombs**. if you are potentially yielding out hundreds of thousands or even millions of rows, you need to be quite careful about data flow to not run out of memory on the server, or on the client. you have a lot of places in your code that are protective of that right now, but `max_returned_rows` acts as the final backstop. so, given that, it makes sense to have removing `max_returned_rows` altogether be a non-goal, and instead allow specific codepaths (like streaming CSVs) to bypass it. that could dramatically lower the surface area for a memory-bomb attack. | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
459882902 | |
https://github.com/simonw/datasette/issues/526#issuecomment-1260355224 | https://api.github.com/repos/simonw/datasette/issues/526 | 1260355224 | IC_kwDOBm6k_c5LH36Y | 9599 | 2022-09-28T04:01:25Z | 2022-09-28T04:01:25Z | OWNER | The ultimate protection against those memory bombs is to support more streaming output formats. Related issues: - #1177 - #1062 | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
459882902 | |
https://github.com/simonw/datasette/issues/1826#issuecomment-1260357583 | https://api.github.com/repos/simonw/datasette/issues/1826 | 1260357583 | IC_kwDOBm6k_c5LH4fP | 9599 | 2022-09-28T04:05:16Z | 2022-09-28T04:05:16Z | OWNER | This is deliberate. The Datasette plugin system allows you to specify only a subset of the parameters for a hook - in this example, only the `value` is needed so the others can be omitted. There's a note about this at the very top of that documentation page: https://docs.datasette.io/en/stable/plugin_hooks.html#plugin-hooks > When you implement a plugin hook you can accept any or all of the parameters that are documented as being passed to that hook. > > For example, you can implement the `render_cell` plugin hook like this even though the full documented hook signature is `render_cell(value, column, table, database, datasette)`: > ```python > @hookimpl > def render_cell(value, column): > if column == "stars": > return "*" * int(value) > ``` | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1388631785 | |
https://github.com/simonw/datasette/issues/1826#issuecomment-1260357878 | https://api.github.com/repos/simonw/datasette/issues/1826 | 1260357878 | IC_kwDOBm6k_c5LH4j2 | 9599 | 2022-09-28T04:05:45Z | 2022-09-28T04:05:45Z | OWNER | Though now I notice that the copy right there needs to be updated to reflect the new `row` parameter to `render_cell`! | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1388631785 | |
https://github.com/simonw/datasette/pull/1825#issuecomment-1260368122 | https://api.github.com/repos/simonw/datasette/issues/1825 | 1260368122 | IC_kwDOBm6k_c5LH7D6 | 22429695 | 2022-09-28T04:20:28Z | 2022-09-28T04:20:28Z | NONE | # [Codecov](https://codecov.io/gh/simonw/datasette/pull/1825?src=pr&el=h1&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison) Report Base: **91.58**% // Head: **91.58**% // No change to project coverage :thumbsup: > Coverage data is based on head [(`b16eb2f`)](https://codecov.io/gh/simonw/datasette/pull/1825?src=pr&el=desc&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison) compared to base [(`5f9f567`)](https://codecov.io/gh/simonw/datasette/commit/5f9f567acbc58c9fcd88af440e68034510fb5d2b?el=desc&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison). > Patch has no changes to coverable lines. > :exclamation: Current head b16eb2f differs from pull request most recent head e7e96dc. Consider uploading reports for the commit e7e96dc to get more accurate results <details><summary>Additional details and impacted files</summary> ```diff @@ Coverage Diff @@ ## main #1825 +/- ## ======================================= Coverage 91.58% 91.58% ======================================= Files 36 36 Lines 4444 4444 ======================================= Hits 4070 4070 Misses 374 374 ``` Help us with your feedback. Take ten seconds to tell us [how you rate us](https://about.codecov.io/nps?utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison). Have a feature suggestion? 
[Share it here.](https://app.codecov.io/gh/feedback/?utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison) </details> [:umbrella: View full report at Codecov](https://codecov.io/gh/simonw/datasette/pull/1825?src=pr&el=continue&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison… | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1388227245 | |
https://github.com/simonw/datasette/pull/1825#issuecomment-1260368537 | https://api.github.com/repos/simonw/datasette/issues/1825 | 1260368537 | IC_kwDOBm6k_c5LH7KZ | 9599 | 2022-09-28T04:21:18Z | 2022-09-28T04:21:18Z | OWNER | This is great, thank you very much! https://datasette--1825.org.readthedocs.build/en/1825/deploying.html#running-datasette-using-openrc | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1388227245 | |
https://github.com/simonw/datasette/issues/1826#issuecomment-1260373403 | https://api.github.com/repos/simonw/datasette/issues/1826 | 1260373403 | IC_kwDOBm6k_c5LH8Wb | 66709385 | 2022-09-28T04:30:27Z | 2022-09-28T04:30:27Z | NONE | I'm glad the bug report served some purpose. Frankly I just needed the method signature, that is why the documentation you mention wasn't read. | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1388631785 | |
https://github.com/simonw/datasette/issues/1062#issuecomment-1260909128 | https://api.github.com/repos/simonw/datasette/issues/1062 | 1260909128 | IC_kwDOBm6k_c5LJ_JI | 536941 | 2022-09-28T13:22:53Z | 2022-09-28T14:09:54Z | CONTRIBUTOR | if you went this route: ```python with sqlite_timelimit(conn, time_limit_ms): c.execute(query) for chunk in iter(lambda: c.fetchmany(chunk_size), []): yield from chunk ``` then `time_limit_ms` would probably have to be greatly extended, because the time spent in the loop will depend on the downstream processing. i wonder if this was why you were thinking this feature would need a dedicated connection? --- reading more, there's no real limit i can find on the number of active cursors (or more precisely active prepared statement objects, because sqlite doesn't really have cursors). maybe something like this would be okay? ```python with sqlite_timelimit(conn, time_limit_ms): c.execute(query) # step through at least one row to evaluate the statement, not sure if this is necessary yield c.fetchone() for chunk in iter(lambda: c.fetchmany(chunk_size), []): yield from chunk ``` it seems quite weird that there's not more of a limit on the number of active prepared statements, but i haven't been able to find one. | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
732674148 | |
https://github.com/simonw/datasette/issues/1062#issuecomment-1260829829 | https://api.github.com/repos/simonw/datasette/issues/1062 | 1260829829 | IC_kwDOBm6k_c5LJryF | 536941 | 2022-09-28T12:27:19Z | 2022-09-28T12:27:19Z | CONTRIBUTOR | for teaching `register_output_renderer` to stream it seems like the two options are 1. a [nested query technique](https://github.com/simonw/datasette/issues/526#issuecomment-505162238) to paginate through 2. a fetching model that looks something like ```python with sqlite_timelimit(conn, time_limit_ms): c.execute(query) for chunk in iter(lambda: c.fetchmany(chunk_size), []): yield from chunk ``` currently `db.execute` is not a generator, so this would probably need a new method? | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
732674148 | |
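The fetching model discussed in the two comments above amounts to a short generator. Here is a minimal runnable sketch using a plain `sqlite3` connection — it omits the `sqlite_timelimit` wrapper and Datasette's `db.execute` machinery, and the function name `stream_rows` is invented for illustration:

```python
import sqlite3


def stream_rows(conn, query, chunk_size=1000):
    # Pull rows in fixed-size chunks so a streaming renderer never
    # materializes the full result set in memory at once.
    cursor = conn.execute(query)
    while True:
        chunk = cursor.fetchmany(chunk_size)
        if not chunk:  # fetchmany() returns [] when exhausted
            break
        yield from chunk


conn = sqlite3.connect(":memory:")
conn.execute("create table t (x integer)")
conn.executemany("insert into t values (?)", [(i,) for i in range(10)])
rows = list(stream_rows(conn, "select x from t order by x", chunk_size=3))
```

A downstream CSV writer could consume this generator row by row, which is where the concern about `time_limit_ms` covering downstream processing time comes from.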
https://github.com/simonw/datasette/issues/1624#issuecomment-1261194164 | https://api.github.com/repos/simonw/datasette/issues/1624 | 1261194164 | IC_kwDOBm6k_c5LLEu0 | 38532 | 2022-09-28T16:54:22Z | 2022-09-28T16:54:22Z | NONE | https://github.com/simonw/datasette-cors seems to workaround this | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1122427321 | |
https://github.com/simonw/datasette/issues/370#issuecomment-1261930179 | https://api.github.com/repos/simonw/datasette/issues/370 | 1261930179 | IC_kwDOBm6k_c5LN4bD | 72577720 | 2022-09-29T08:17:46Z | 2022-09-29T08:17:46Z | CONTRIBUTOR | Just watched this video which demonstrates the integration of *any* webapp into JupyterLab: https://youtu.be/FH1dKKmvFtc Maybe this is the answer? | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
377155320 | |
https://github.com/simonw/datasette/pull/1827#issuecomment-1263570186 | https://api.github.com/repos/simonw/datasette/issues/1827 | 1263570186 | IC_kwDOBm6k_c5LUI0K | 22429695 | 2022-09-30T13:22:15Z | 2022-09-30T13:22:15Z | NONE | # [Codecov](https://codecov.io/gh/simonw/datasette/pull/1827?src=pr&el=h1&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison) Report Base: **92.50**% // Head: **92.50**% // No change to project coverage :thumbsup: > Coverage data is based on head [(`1f0c557`)](https://codecov.io/gh/simonw/datasette/pull/1827?src=pr&el=desc&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison) compared to base [(`34defdc`)](https://codecov.io/gh/simonw/datasette/commit/34defdc10aa293294ca01cfab70780755447e1d7?el=desc&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison). > Patch has no changes to coverable lines. <details><summary>Additional details and impacted files</summary> ```diff @@ Coverage Diff @@ ## main #1827 +/- ## ======================================= Coverage 92.50% 92.50% ======================================= Files 35 35 Lines 4400 4400 ======================================= Hits 4070 4070 Misses 330 330 ``` Help us with your feedback. Take ten seconds to tell us [how you rate us](https://about.codecov.io/nps?utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison). Have a feature suggestion? [Share it here.](https://app.codecov.io/gh/feedback/?utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison) </details> [:umbrella: View full report at Codecov](https://codecov.io/gh/simonw/datasette/pull/1827?src=pr&el=continue&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison). 
:loudspeaker: Do you have feedback about the report comment? [Let us know in this issue](https://about.codecov.io/codecov-pr-comment-feedback/?utm_medium=referral&… | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1392426838 | |
https://github.com/simonw/datasette/issues/1828#issuecomment-1264738081 | https://api.github.com/repos/simonw/datasette/issues/1828 | 1264738081 | IC_kwDOBm6k_c5LYl8h | 9599 | 2022-10-02T21:34:37Z | 2022-10-02T21:34:37Z | OWNER | I'm running a build of that demo instance here (takes ~30m) https://github.com/dogsheep/github-to-sqlite/actions/runs/3170164705 | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1393903845 | |
https://github.com/simonw/datasette/issues/1805#issuecomment-1264736537 | https://api.github.com/repos/simonw/datasette/issues/1805 | 1264736537 | IC_kwDOBm6k_c5LYlkZ | 9599 | 2022-10-02T21:25:37Z | 2022-10-02T21:25:37Z | OWNER | `word-wrap: anywhere` had some nasty side-effects, removing that: - #1828 | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1363552780 | |
https://github.com/simonw/datasette/issues/485#issuecomment-1264737290 | https://api.github.com/repos/simonw/datasette/issues/485 | 1264737290 | IC_kwDOBm6k_c5LYlwK | 9599 | 2022-10-02T21:29:59Z | 2022-10-02T21:29:59Z | OWNER | To clarify: the feature this issue is talking about relates to the way Datasette automatically displays foreign key relationships, for example on this page: https://github-to-sqlite.dogsheep.net/github/commits <img width="1233" alt="image" src="https://user-images.githubusercontent.com/9599/193476985-d41148cf-2b2f-49b9-b717-e92145afab31.png"> Each of those columns is a foreign key to another table. The link text that is displayed there comes from the "label column" that has either been configured or automatically detected for that other table. I wonder if this could be handled with a tiny machine learning model that's trained to help pick the best label column? Inputs to that model could include: - The names of the columns - The number of unique values in each column - The type of each column (or maybe only `TEXT` columns should be considered) - How many `null` values there are - Is the column marked as unique? - What's the average (or median or some other statistic) string length of values in each column? Output would be the most likely label column, or some indicator that no likely candidates had been found. My hunch is that this would be better solved using a few extra heuristics rather than by training a model, but it does feel like an interesting opportunity to experiment with a tiny ML model. Asked for tips about this on Twitter: https://twitter.com/simonw/status/1576680930680262658 | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
447469253 | |
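The heuristics listed in that comment could be prototyped without any machine learning at all. A speculative sketch — the column statistics, the scoring weights, and the function name are all invented for illustration and are not anything Datasette implements:

```python
def guess_label_column(columns):
    # Each entry in `columns` is assumed to be a dict with keys:
    # name, type, num_unique, num_null, total_rows, avg_length.
    best, best_score = None, 0
    for col in columns:
        if col["type"] != "TEXT":
            continue  # only TEXT columns are label candidates
        score = 0
        if col["name"].lower() in ("name", "title", "label"):
            score += 3  # column name strongly suggests a label
        if col["num_unique"] == col["total_rows"]:
            score += 2  # every value distinct, like an identifier
        if col["num_null"] == 0:
            score += 1  # labels should rarely be missing
        if 2 <= col["avg_length"] <= 80:
            score += 1  # human-readable length, not a blob of text
        if score > best_score:
            best, best_score = col["name"], score
    return best  # None if no likely candidate was found
```

The same scoring could later be replaced by a trained model taking these statistics as features, which is the experiment the comment proposes.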
https://github.com/simonw/datasette/issues/1828#issuecomment-1264753439 | https://api.github.com/repos/simonw/datasette/issues/1828 | 1264753439 | IC_kwDOBm6k_c5LYpsf | 9599 | 2022-10-02T23:01:17Z | 2022-10-02T23:01:17Z | OWNER | That change deployed and https://github-to-sqlite.dogsheep.net/github/commits now looks like this: <img width="1388" alt="image" src="https://user-images.githubusercontent.com/9599/193480158-de81ac0a-5cb2-4d53-a75c-025c78f293ee.png"> | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1393903845 | |
https://github.com/simonw/datasette/issues/1805#issuecomment-1264753725 | https://api.github.com/repos/simonw/datasette/issues/1805 | 1264753725 | IC_kwDOBm6k_c5LYpw9 | 9599 | 2022-10-02T23:02:17Z | 2022-10-02T23:02:17Z | OWNER | After reverting `word--wrap anywhere` https://latest.datasette.io/_memory?sql=select+%27https%3A%2F%2Fexample.com%2Faaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa
aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa… | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1363552780 | |
https://github.com/simonw/datasette/issues/1805#issuecomment-1264753894 | https://api.github.com/repos/simonw/datasette/issues/1805 | 1264753894 | IC_kwDOBm6k_c5LYpzm | 9599 | 2022-10-02T23:02:54Z | 2022-10-02T23:02:54Z | OWNER | I'm tempted to add `word-wrap: anywhere` only to links that are know to be longer than a certain threshold. | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1363552780 | |
https://github.com/simonw/datasette/issues/485#issuecomment-1264769569 | https://api.github.com/repos/simonw/datasette/issues/485 | 1264769569 | IC_kwDOBm6k_c5LYtoh | 9599 | 2022-10-03T00:04:42Z | 2022-10-03T00:04:42Z | OWNER | I love these tips - tools that can compile a simple machine learning model to a SQL query! Would be pretty cool if I could bundle a model in Datasette itself as a big in-memory SQLite SQL query: - https://github.com/Chryzanthemum/xgb2sql - https://github.com/konstantint/SKompiler | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
447469253 | |
https://github.com/simonw/datasette/issues/1805#issuecomment-1265161668 | https://api.github.com/repos/simonw/datasette/issues/1805 | 1265161668 | IC_kwDOBm6k_c5LaNXE | 562352 | 2022-10-03T09:18:05Z | 2022-10-03T09:18:05Z | NONE | > I'm tempted to add `word-wrap: anywhere` only to links that are know to be longer than a certain threshold. Make sense IMHO. | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1363552780 | |
https://github.com/simonw/datasette/issues/1829#issuecomment-1267708232 | https://api.github.com/repos/simonw/datasette/issues/1829 | 1267708232 | IC_kwDOBm6k_c5Lj7FI | 9599 | 2022-10-04T23:17:36Z | 2022-10-04T23:17:36Z | OWNER | Here's the relevant code from the table page: https://github.com/simonw/datasette/blob/4218c9cd742b79b1e3cb80878e42b7e39d16ded2/datasette/views/table.py#L215-L227 Note how `ensure_permissions()` there takes the table, database and instance into account... but the `private` assignment (used to decide if the padlock should display or not) only considers the `view-table` check. Here's the same code for the database page: https://github.com/simonw/datasette/blob/4218c9cd742b79b1e3cb80878e42b7e39d16ded2/datasette/views/database.py#L139-L141 And for canned query pages: https://github.com/simonw/datasette/blob/4218c9cd742b79b1e3cb80878e42b7e39d16ded2/datasette/views/database.py#L228-L240 | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1396948693 | |
https://github.com/simonw/datasette/issues/1829#issuecomment-1267709546 | https://api.github.com/repos/simonw/datasette/issues/1829 | 1267709546 | IC_kwDOBm6k_c5Lj7Zq | 9599 | 2022-10-04T23:19:24Z | 2022-10-04T23:21:07Z | OWNER | There's also a `check_visibility()` helper which I'm not using in these particular cases but which may be relevant. It's called like this: https://github.com/simonw/datasette/blob/4218c9cd742b79b1e3cb80878e42b7e39d16ded2/datasette/views/database.py#L65-L77 And is defined here: https://github.com/simonw/datasette/blob/4218c9cd742b79b1e3cb80878e42b7e39d16ded2/datasette/app.py#L694-L710 It's actually documented as a public method here: https://docs.datasette.io/en/stable/internals.html#await-check-visibility-actor-action-resource-none > This convenience method can be used to answer the question "should this item be considered private, in that it is visible to me but it is not visible to anonymous users?" > > It returns a tuple of two booleans, `(visible, private)`. `visible` indicates if the actor can see this resource. `private` will be `True` if an anonymous user would not be able to view the resource. Note that this documented method cannot actually do the right thing - because it's not being given the multiple permissions that need to be checked in order to completely answer the question. So I probably need to redesign that method a bit. | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1396948693 | |
https://github.com/simonw/datasette/issues/1832#issuecomment-1267918117 | https://api.github.com/repos/simonw/datasette/issues/1832 | 1267918117 | IC_kwDOBm6k_c5LkuUl | 9599 | 2022-10-05T04:19:52Z | 2022-10-05T04:19:52Z | OWNER | Code can go here: https://github.com/simonw/datasette/blob/b6ba117b7978b58b40e3c3c2b723b92c3010ed53/datasette/database.py#L511-L515 | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1397193691 | |
https://github.com/simonw/datasette/issues/1832#issuecomment-1267925830 | https://api.github.com/repos/simonw/datasette/issues/1832 | 1267925830 | IC_kwDOBm6k_c5LkwNG | 9599 | 2022-10-05T04:31:57Z | 2022-10-05T04:31:57Z | OWNER | Turns out this already works - `__bool__` falls back on `__len__`: https://docs.python.org/3/reference/datamodel.html#object.__bool__ > When this method is not defined, [`__len__()`](https://docs.python.org/3/reference/datamodel.html#object.__len__ "object.__len__") is called, if it is defined, and the object is considered true if its result is nonzero. I'll add a test to demonstrate this. | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1397193691 | |
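The fallback quoted above is easy to demonstrate with a toy class. This is a minimal illustration of the Python behaviour, not Datasette's actual results class:

```python
class FakeResults:
    # Defines __len__ but not __bool__, so truth testing falls back
    # on whether len() returns a non-zero value.
    def __init__(self, rows):
        self.rows = rows

    def __len__(self):
        return len(self.rows)
```

So `if results:` works on such an object with no extra code, which is the behaviour the planned test would lock in.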
https://github.com/simonw/datasette/issues/1824#issuecomment-1268398461 | https://api.github.com/repos/simonw/datasette/issues/1824 | 1268398461 | IC_kwDOBm6k_c5Lmjl9 | 562352 | 2022-10-05T12:55:05Z | 2022-10-05T12:55:05Z | NONE | Here is some working javascript code. There might be a better solution, I'm not a JS expert. ```javascript var show_hide = document.querySelector(".show-hide-sql > a"); var sql_element = document.querySelector(".sql"); /* selector assumed; the snippet as posted never defined sql_element */ // Hide SQL query if the URL opened with #_hide_sql var hash = window.location.hash; if(hash === "#_hide_sql") { hide_sql(); } show_hide.setAttribute("href", "#"); show_hide.addEventListener("click", toggle_sql_display); function toggle_sql_display() { if (show_hide.innerText === "hide") { hide_sql(); return; } if (show_hide.innerText === "show") { show_sql(); return; } } function hide_sql() { sql_element.style.cssText="display:none"; show_hide.innerHTML = "show"; show_hide.setAttribute("href", "#_hide_sql"); } function show_sql() { sql_element.style.cssText="display:block"; show_hide.innerHTML = "hide"; show_hide.setAttribute("href", "#_show_sql"); } ``` | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1387712501 | |
https://github.com/simonw/datasette/issues/1480#issuecomment-1268613335 | https://api.github.com/repos/simonw/datasette/issues/1480 | 1268613335 | IC_kwDOBm6k_c5LnYDX | 536941 | 2022-10-05T15:45:49Z | 2022-10-05T15:45:49Z | CONTRIBUTOR | running into this as i continue to grow my labor data warehouse. Here, a Cloud Run PM says the container size should **not** count against memory: https://stackoverflow.com/a/56570717 | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1015646369 | |
https://github.com/simonw/datasette/issues/1480#issuecomment-1268629159 | https://api.github.com/repos/simonw/datasette/issues/1480 | 1268629159 | IC_kwDOBm6k_c5Lnb6n | 536941 | 2022-10-05T16:00:55Z | 2022-10-05T16:00:55Z | CONTRIBUTOR | as a next step, i'll fetch the docker image from the google registry, and see what memory and disk usage looks like when i run it locally. | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1015646369 | |
https://github.com/simonw/datasette/issues/1480#issuecomment-1269275153 | https://api.github.com/repos/simonw/datasette/issues/1480 | 1269275153 | IC_kwDOBm6k_c5Lp5oR | 9599 | 2022-10-06T03:54:33Z | 2022-10-06T03:54:33Z | OWNER | I've been having success using Fly recently for a project which I thought would be too large for Cloud Run. I wrote about that here: - https://simonwillison.net/2022/Sep/5/laion-aesthetics-weeknotes/ | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1015646369 | |
https://github.com/simonw/datasette/issues/1480#issuecomment-1269847461 | https://api.github.com/repos/simonw/datasette/issues/1480 | 1269847461 | IC_kwDOBm6k_c5LsFWl | 536941 | 2022-10-06T11:21:49Z | 2022-10-06T11:21:49Z | CONTRIBUTOR | thanks @simonw, i'll spend a little more time trying to figure out why this isn't working on cloudrun, and then will flip over to fly if i can't. | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1015646369 | |
https://github.com/simonw/datasette/pull/1835#issuecomment-1270586897 | https://api.github.com/repos/simonw/datasette/issues/1835 | 1270586897 | IC_kwDOBm6k_c5Lu54R | 9599 | 2022-10-06T19:34:00Z | 2022-10-06T19:34:00Z | OWNER | Wow, great catch! The whole point of inspect data was to avoid this kind of expensive operation on startup so this makes total sense - I had no idea Datasette was still trying to hash a giant file every time the server started. | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1400121355 | |
https://github.com/simonw/datasette/pull/1835#issuecomment-1270595328 | https://api.github.com/repos/simonw/datasette/issues/1835 | 1270595328 | IC_kwDOBm6k_c5Lu78A | 22429695 | 2022-10-06T19:42:25Z | 2022-10-06T19:42:25Z | NONE | # [Codecov](https://codecov.io/gh/simonw/datasette/pull/1835?src=pr&el=h1&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison) Report Base: **91.71**% // Head: **92.50**% // Increases project coverage by **`+0.78%`** :tada: > Coverage data is based on head [(`b4b92df`)](https://codecov.io/gh/simonw/datasette/pull/1835?src=pr&el=desc&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison) compared to base [(`cb1e093`)](https://codecov.io/gh/simonw/datasette/commit/cb1e093fd361b758120aefc1a444df02462389a3?el=desc&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison). > Patch has no changes to coverable lines. 
<details><summary>Additional details and impacted files</summary> ```diff @@ Coverage Diff @@ ## main #1835 +/- ## ========================================== + Coverage 91.71% 92.50% +0.78% ========================================== Files 38 35 -3 Lines 4754 4400 -354 ========================================== - Hits 4360 4070 -290 + Misses 394 330 -64 ``` | [Impacted Files](https://codecov.io/gh/simonw/datasette/pull/1835?src=pr&el=tree&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison) | Coverage Δ | | |---|---|---| | [datasette/database.py](https://codecov.io/gh/simonw/datasette/pull/1835/diff?src=pr&el=tree&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison#diff-ZGF0YXNldHRlL2RhdGFiYXNlLnB5) | | | | [datasette/utils/shutil\_backport.py](https://codecov.io/gh/simonw/datasette/pull/1835/diff?src=pr&el=tree&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison#diff-ZGF0YXNldHRlL3V0aWxzL3NodXRpbF9iYWNrcG9ydC5weQ==) | | | | [datasette/\_\_… | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1400121355 | |
https://github.com/simonw/datasette/pull/1837#issuecomment-1270855853 | https://api.github.com/repos/simonw/datasette/issues/1837 | 1270855853 | IC_kwDOBm6k_c5Lv7it | 22429695 | 2022-10-07T00:01:20Z | 2022-10-07T00:01:20Z | NONE | # [Codecov](https://codecov.io/gh/simonw/datasette/pull/1837?src=pr&el=h1&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison) Report Base: **92.50**% // Head: **92.50**% // No change to project coverage :thumbsup: > Coverage data is based on head [(`c12447e`)](https://codecov.io/gh/simonw/datasette/pull/1837?src=pr&el=desc&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison) compared to base [(`eff1124`)](https://codecov.io/gh/simonw/datasette/commit/eff112498ecc499323c26612d707908831446d25?el=desc&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison). > Patch has no changes to coverable lines. <details><summary>Additional details and impacted files</summary> ```diff @@ Coverage Diff @@ ## main #1837 +/- ## ======================================= Coverage 92.50% 92.50% ======================================= Files 35 35 Lines 4400 4400 ======================================= Hits 4070 4070 Misses 330 330 ``` Help us with your feedback. Take ten seconds to tell us [how you rate us](https://about.codecov.io/nps?utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison). Have a feature suggestion? [Share it here.](https://app.codecov.io/gh/feedback/?utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison) </details> [:umbrella: View full report at Codecov](https://codecov.io/gh/simonw/datasette/pull/1837?src=pr&el=continue&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison). 
:loudspeaker: Do you have feedback about the report comment? [Let us know in this issue](https://about.codecov.io/codecov-pr-comment-feedback/?utm_medium=referral&… | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1400431789 | |
https://github.com/simonw/datasette/issues/1836#issuecomment-1271103097 | https://api.github.com/repos/simonw/datasette/issues/1836 | 1271103097 | IC_kwDOBm6k_c5Lw355 | 536941 | 2022-10-07T04:43:41Z | 2022-10-07T04:43:41Z | CONTRIBUTOR | @simonw, should i open up a new issue for investigating the differences between "immutable=1" and "mode=ro" and possibly switching to "mode=ro". Or would you like to keep that conversation in this issue? | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1400374908 | |
https://github.com/simonw/datasette/issues/1836#issuecomment-1271100651 | https://api.github.com/repos/simonw/datasette/issues/1836 | 1271100651 | IC_kwDOBm6k_c5Lw3Tr | 536941 | 2022-10-07T04:38:14Z | 2022-10-07T04:38:14Z | CONTRIBUTOR | > yes, and i also think that this is causing the apparent memory problems in #1480. when the container starts up, it will make some operation on the database in `immutable` mode which apparently makes some small change to the db file. if that's so, then the db files will be copied to the read/write layer which counts against cloudrun's memory allocation! > > running a test of that now. this completely addressed #1480 | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1400374908 | |
https://github.com/simonw/datasette/issues/1480#issuecomment-1271101072 | https://api.github.com/repos/simonw/datasette/issues/1480 | 1271101072 | IC_kwDOBm6k_c5Lw3aQ | 536941 | 2022-10-07T04:39:10Z | 2022-10-07T04:39:10Z | CONTRIBUTOR | switching from `immutable=1` to `mode=ro` completely addressed this. see https://github.com/simonw/datasette/issues/1836#issuecomment-1271100651 for details. | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1015646369 | |
https://github.com/simonw/datasette/issues/1836#issuecomment-1270923537 | https://api.github.com/repos/simonw/datasette/issues/1836 | 1270923537 | IC_kwDOBm6k_c5LwMER | 536941 | 2022-10-07T00:46:08Z | 2022-10-07T00:46:08Z | CONTRIBUTOR | i thought it might have to do with reading through all the files, but that does not seem to be the case if i make a little test file like:

```python
# test_read.py
import hashlib
import pathlib
import sys

HASH_BLOCK_SIZE = 1024 * 1024


def inspect_hash(path):
    """Calculate the hash of a database, efficiently."""
    m = hashlib.sha256()
    with path.open("rb") as fp:
        while True:
            data = fp.read(HASH_BLOCK_SIZE)
            if not data:
                break
            m.update(data)
    return m.hexdigest()


inspect_hash(pathlib.Path(sys.argv[1]))
```

then a line in the Dockerfile like

```docker
RUN python test_read.py nlrb.db && echo "[]" > /etc/inspect.json
```

just produces a layer of `3B` | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1400374908 | |
https://github.com/simonw/datasette/issues/1836#issuecomment-1270936982 | https://api.github.com/repos/simonw/datasette/issues/1836 | 1270936982 | IC_kwDOBm6k_c5LwPWW | 536941 | 2022-10-07T00:52:41Z | 2022-10-07T00:52:41Z | CONTRIBUTOR | it's not that the inspect command is somehow changing the db files. if i set them to read-only, the "inspect" layer still has the same very large size. | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1400374908 | |
https://github.com/simonw/datasette/issues/1836#issuecomment-1270988081 | https://api.github.com/repos/simonw/datasette/issues/1836 | 1270988081 | IC_kwDOBm6k_c5Lwb0x | 536941 | 2022-10-07T01:19:01Z | 2022-10-07T01:27:35Z | CONTRIBUTOR | okay, some progress!! running some sql against a database file causes that file to get duplicated, even though it apparently doesn't change the file. make a little test script like this:

```python
# test_sql.py
import sqlite3
import sys

db_name = sys.argv[1]
conn = sqlite3.connect(f'file:/app/{db_name}', uri=True)
cur = conn.cursor()
cur.execute('select count(*) from filing')
print(cur.fetchone())
```

then

```docker
RUN python test_sql.py nlrb.db
```

produced a layer that's the same size as `nlrb.db`!! | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1400374908 | |
https://github.com/simonw/datasette/issues/1836#issuecomment-1270992795 | https://api.github.com/repos/simonw/datasette/issues/1836 | 1270992795 | IC_kwDOBm6k_c5Lwc-b | 536941 | 2022-10-07T01:29:15Z | 2022-10-07T01:50:14Z | CONTRIBUTOR | fascinatingly, telling python to open sqlite in read-only mode makes this layer have a size of 0

```python
# test_sql_ro.py
import sqlite3
import sys

db_name = sys.argv[1]
conn = sqlite3.connect(f'file:/app/{db_name}?mode=ro', uri=True)
cur = conn.cursor()
cur.execute('select count(*) from filing')
print(cur.fetchone())
```

that's quite weird, because setting the file permissions to read-only didn't do anything. (on reflection, that chmod isn't doing anything because the dockerfile commands are run as root) | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1400374908 | |
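As a sanity check on the `mode=ro` behavior described above, here is a self-contained sketch (using a throwaway database rather than `nlrb.db`, which is not available here): a `mode=ro` connection can read but SQLite refuses any write with `sqlite3.OperationalError`, consistent with the layer size dropping to 0.

```python
import os
import sqlite3
import tempfile

# Build a throwaway database standing in for nlrb.db
tmpdir = tempfile.mkdtemp()
db_path = os.path.join(tmpdir, "demo.db")
conn = sqlite3.connect(db_path)
conn.execute("CREATE TABLE filing (id INTEGER)")
conn.commit()
conn.close()

# Reopen read-only: reads succeed, writes are refused by SQLite itself
ro = sqlite3.connect(f"file:{db_path}?mode=ro", uri=True)
count = ro.execute("SELECT count(*) FROM filing").fetchone()[0]

write_failed = False
try:
    ro.execute("INSERT INTO filing VALUES (1)")
except sqlite3.OperationalError:
    write_failed = True
ro.close()
```

Because the read-only enforcement happens inside SQLite's VFS layer rather than via file permissions, it works even when the process runs as root, which matches the chmod observation above.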
https://github.com/simonw/datasette/issues/1836#issuecomment-1271003212 | https://api.github.com/repos/simonw/datasette/issues/1836 | 1271003212 | IC_kwDOBm6k_c5LwfhM | 536941 | 2022-10-07T01:52:04Z | 2022-10-07T01:52:04Z | CONTRIBUTOR | and if we try immutable mode, which is how things are opened by `datasette inspect`, we duplicate the files!!!

```python
# test_sql_immutable.py
import sqlite3
import sys

db_name = sys.argv[1]
conn = sqlite3.connect(f'file:/app/{db_name}?immutable=1', uri=True)
cur = conn.cursor()
cur.execute('select count(*) from filing')
print(cur.fetchone())
```
 | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1400374908 | |
https://github.com/simonw/datasette/issues/1836#issuecomment-1271004167 | https://api.github.com/repos/simonw/datasette/issues/1836 | 1271004167 | IC_kwDOBm6k_c5LwfwH | 9599 | 2022-10-07T01:53:05Z | 2022-10-07T01:53:05Z | OWNER | Oh this is interesting! Is your hunch here that running this line is causing the file to be stored as a second layer? https://github.com/simonw/datasette/blob/5aa359b86907d11b3ee601510775a85a90224da8/datasette/utils/__init__.py#L399 I guess it's possible that running a non-read-only query against the database causes one or two bytes to be changed (maybe a transaction ID or similar?) Modifying the `inspect` command to use `?mode=ro` seems sensible to me. Except.... it should already be opening those files in immutable mode according to this line: https://github.com/simonw/datasette/blob/eff112498ecc499323c26612d707908831446d25/datasette/cli.py#L172-L173 Here's what opening as an immutable does: https://github.com/simonw/datasette/blob/eff112498ecc499323c26612d707908831446d25/datasette/app.py#L258-L260 https://github.com/simonw/datasette/blob/eff112498ecc499323c26612d707908831446d25/datasette/database.py#L98 | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1400374908 | |
https://github.com/simonw/datasette/pull/1838#issuecomment-1271009214 | https://api.github.com/repos/simonw/datasette/issues/1838 | 1271009214 | IC_kwDOBm6k_c5Lwg-- | 9599 | 2022-10-07T02:01:07Z | 2022-10-07T02:01:07Z | OWNER | The argument that has always convinced me NOT to use `target="_blank"` (even for links like this one) is that it breaks browser expectations. If you click a link with `target="_blank"` on it you get a new browser window... with a disabled back button. You have to then know to close that browser window in order to return to the previous page - as opposed to hitting the "back" button like usual. You'll note that Datasette doesn't use `target="_blank"` even on URLs presented in database tables - like these ones: https://latest.datasette.io/fixtures/roadside_attractions So I'm very firmly in the anti-target-blank camp! This is the kind of change which I'd suggest implementing as a plugin. `datasette-external-links-new-windows` could run a bit of JavaScript on every page that looks for `<a href="...">` elements that link to off-domain pages and adds `target="_blank"` to them via the DOM. That way people who like `target="_blank"` can have it! | { "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1400494162 | |
https://github.com/simonw/datasette/issues/1836#issuecomment-1271008997 | https://api.github.com/repos/simonw/datasette/issues/1836 | 1271008997 | IC_kwDOBm6k_c5Lwg7l | 536941 | 2022-10-07T02:00:37Z | 2022-10-07T02:00:49Z | CONTRIBUTOR | yes, and i also think that this is causing the apparent memory problems in #1480. when the container starts up, it will make some operation on the database in `immutable` mode which apparently makes some small change to the db file. if that's so, then the db files will be copied to the read/write layer which counts against cloudrun's memory allocation! running a test of that now. | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1400374908 | |
https://github.com/simonw/datasette/issues/1836#issuecomment-1271006020 | https://api.github.com/repos/simonw/datasette/issues/1836 | 1271006020 | IC_kwDOBm6k_c5LwgNE | 9599 | 2022-10-07T01:54:07Z | 2022-10-07T01:54:07Z | OWNER | Just overlapped with your comment here: https://github.com/simonw/datasette/issues/1836#issuecomment-1271003212 - which notes that opening with `?immutable=1` DOES seem to cause the file to be duplicated! | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1400374908 | |
https://github.com/simonw/datasette/issues/1836#issuecomment-1271020193 | https://api.github.com/repos/simonw/datasette/issues/1836 | 1271020193 | IC_kwDOBm6k_c5Lwjqh | 536941 | 2022-10-07T02:15:05Z | 2022-10-07T02:21:08Z | CONTRIBUTOR | when i hack the connect method to open non-mutable files with "mode=ro" instead of "immutable=1" https://github.com/simonw/datasette/blob/eff112498ecc499323c26612d707908831446d25/datasette/database.py#L79 then:

```bash
870 B   RUN /bin/sh -c datasette inspect nlrb.db --inspect-file inspect-data.json
```

the `datasette inspect` layer is only the size of the json file! | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1400374908 | |
https://github.com/simonw/datasette/pull/1838#issuecomment-1271024708 | https://api.github.com/repos/simonw/datasette/issues/1838 | 1271024708 | IC_kwDOBm6k_c5LwkxE | 4399499 | 2022-10-07T02:19:49Z | 2022-10-07T02:19:49Z | NONE | Ooh, I didn't even think about links in tables! You're definitely right on the approach to this. It might also be a really good "stupidly simple" plugin for me to try to build myself, which could be fun. | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1400494162 | |
https://github.com/simonw/datasette/issues/1301#issuecomment-1271035998 | https://api.github.com/repos/simonw/datasette/issues/1301 | 1271035998 | IC_kwDOBm6k_c5Lwnhe | 536941 | 2022-10-07T02:38:04Z | 2022-10-07T02:38:04Z | CONTRIBUTOR | the only mode that `publish cloudrun` supports right now is immutable | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
860722711 | |
https://github.com/simonw/datasette/pull/1838#issuecomment-1271803298 | https://api.github.com/repos/simonw/datasette/issues/1838 | 1271803298 | IC_kwDOBm6k_c5Lzi2i | 9599 | 2022-10-07T16:28:41Z | 2022-10-07T16:28:41Z | OWNER | ... and here's @ocdtrekkie's plugin! https://github.com/ocdtrekkie/datasette-external-links-new-tabs | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1400494162 | |
https://github.com/simonw/datasette/issues/1852#issuecomment-1291392887 | https://api.github.com/repos/simonw/datasette/issues/1852 | 1291392887 | IC_kwDOBm6k_c5M-Rd3 | 9599 | 2022-10-26T02:04:48Z | 2022-10-26T02:04:48Z | OWNER | Implemented that `dstok_` prefix and the thing where only the `actor["id"]` is copied to the `"a"` field. | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1421552095 | |
https://github.com/simonw/datasette/issues/1852#issuecomment-1291397623 | https://api.github.com/repos/simonw/datasette/issues/1852 | 1291397623 | IC_kwDOBm6k_c5M-Sn3 | 9599 | 2022-10-26T02:11:40Z | 2022-10-26T02:11:40Z | OWNER | Built a prototype of the `actor_from_request()` hook for this and now:

```
% curl http://127.0.0.1:8001/-/actor.json -H 'Authorization: Bearer dstok_eyJhIjoicm9vdCIsImUiOm51bGx9.6O1OxgNTFkAU6uw7xNcmXYX949A'
{"actor": {"id": "root", "dstok": true}}
``` | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1421552095 | |
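The token in the curl example above splits into a `dstok_` prefix, a base64-encoded JSON payload, and a signature. A short sketch can unpack the payload; this only decodes, since the signing scheme isn't shown in the comments, so no verification is attempted:

```python
import base64
import json

token = "dstok_eyJhIjoicm9vdCIsImUiOm51bGx9.6O1OxgNTFkAU6uw7xNcmXYX949A"
assert token.startswith("dstok_")

payload_b64, signature = token[len("dstok_"):].split(".", 1)
# Re-pad to a multiple of 4 before base64-decoding
payload_b64 += "=" * (-len(payload_b64) % 4)
payload = json.loads(base64.urlsafe_b64decode(payload_b64))

# "a" carries the actor id, per the comments above; "e" appears to be a (null) expiry
print(payload)  # {'a': 'root', 'e': None}
```

This matches the earlier note that only `actor["id"]` is copied into the `"a"` field.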
https://github.com/simonw/datasette/issues/1852#issuecomment-1291406219 | https://api.github.com/repos/simonw/datasette/issues/1852 | 1291406219 | IC_kwDOBm6k_c5M-UuL | 9599 | 2022-10-26T02:19:54Z | 2022-10-26T02:59:52Z | OWNER | I'm going to split the remaining work into separate issues: - [x] #1856 - [ ] #1855 | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1421552095 |