id,node_id,number,title,user,user_label,state,locked,assignee,assignee_label,milestone,milestone_label,comments,created_at,updated_at,closed_at,author_association,pull_request,body,repo,repo_label,type,active_lock_reason,performed_via_github_app,reactions,draft,state_reason 1052247023,I_kwDOBm6k_c4-uAPv,1505,Datasette should have an option to output CSV with semicolons,9599,simonw,open,0,,,,,1,2021-11-12T18:02:21Z,2021-11-16T11:40:52Z,,OWNER,,,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1505/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1102484126,I_kwDOBm6k_c5BtpKe,1595,Release notes for 0.60,9599,simonw,closed,0,,,7571612,Datasette 0.60,4,2022-01-13T22:23:14Z,2022-01-14T01:37:39Z,2022-01-14T01:37:39Z,OWNER,,,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1595/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1306984363,I_kwDOBm6k_c5N5v-r,1771,minor a11y: `",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1939/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 953218043,MDU6SXNzdWU5NTMyMTgwNDM=,1403,Labels explaining what hidden tables are for,9599,simonw,open,0,,,,,0,2021-07-26T19:29:22Z,2022-03-21T22:20:37Z,,OWNER,,"A reasonable question: ""What are those hidden tables for?"" This could be answered by adding a small piece of explanatory text to each table - based on if it's related to FTS or to SpatiaLite or configured to be hidden for some other reason.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1403/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1618249044,I_kwDOBm6k_c5gdIVU,2038,Consider a `strict_templates` setting,9599,simonw,open,0,,,,,2,2023-03-10T02:09:13Z,2023-03-10T02:11:06Z,,OWNER,,"A setting which turns on Jinja strict mode, so any templates that access undefined variables raise a hard error. 
Prototype here: ```diff diff --git a/datasette/app.py b/datasette/app.py index 40416713..1428a3f0 100644 --- a/datasette/app.py +++ b/datasette/app.py @@ -200,6 +200,7 @@ SETTINGS = ( ""Allow display of SQL trace debug information with ?_trace=1"", ), Setting(""base_url"", ""/"", ""Datasette URLs should use this base path""), + Setting(""strict_templates"", False, ""Raise errors for undefined template variables""), ) _HASH_URLS_REMOVED = ""The hash_urls setting has been removed, try the datasette-hashed-urls plugin instead"" OBSOLETE_SETTINGS = { @@ -399,11 +400,14 @@ class Datasette: ), ] ) + env_extras = {} + if self.setting(""strict_templates""): + env_extras[""undefined""] = StrictUndefined self.jinja_env = Environment( loader=template_loader, autoescape=True, enable_async=True, - undefined=StrictUndefined, + **env_extras, ) self.jinja_env.filters[""escape_css_string""] = escape_css_string self.jinja_env.filters[""quote_plus""] = urllib.parse.quote_plus ``` Explored this idea a bit in: - #1999",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2038/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 716988478,MDU6SXNzdWU3MTY5ODg0Nzg=,997,Documentation covering buildpack deployment,9599,simonw,closed,0,,,5971510,Datasette 0.50,3,2020-10-08T03:21:52Z,2020-10-08T23:56:03Z,2020-10-08T23:32:10Z,OWNER,,A tidied up version of https://til.simonwillison.net/til/til/digitalocean_datasette-on-digitalocean-app-platform.md - but mention that you can deploy to Heroku using the same mechanism.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/997/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 974987856,MDU6SXNzdWU5NzQ5ODc4NTY=,1442,Mechanism to cause specific branches to deploy their own demos,9599,simonw,closed,0,,,,,6,2021-08-19T19:41:39Z,2021-08-19T21:11:45Z,2021-08-19T21:09:40Z,OWNER,,"A useful capability would be if it was super-easy to say ""any pushes to branch X should be deployed to `latest-X.datasette.io`"". I'd like to use this for the column query information work in #1434",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1442/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 893537744,MDU6SXNzdWU4OTM1Mzc3NDQ=,1331,Add support for Jinja2 version 3.0,475613,MarkusH,closed,0,,,,,10,2021-05-17T17:14:36Z,2021-05-23T00:57:39Z,2021-05-23T00:57:39Z,NONE,,"A week ago, [The Pallets Project](https://github.com/pallets) released [new major versions of several of its projects](https://palletsprojects.com/blog/flask-2-0-released/). Among those updates is one for Jinja2, which bumps it to version 3.0.0. 
I'd like for datasette to support Jinaj2 version 3.0.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1331/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1421552095,I_kwDOBm6k_c5Uuynf,1852,Default API token authentication mechanism,9599,simonw,closed,0,,,8658075,Datasette 1.0a0,30,2022-10-24T22:31:07Z,2022-11-15T19:57:00Z,2022-10-26T02:19:54Z,OWNER,,"API authentication will be via `Authorization: Bearer XXX` request headers. I'm inclined to add a default token mechanism to Datasette based on tokens that are signed with the `DATASETTE_SECRET`. Maybe the root user can access `/-/create-token` which provides a UI for generating a time-limited signed token? Could also have a `datasette token` command for creating such tokens at the command-line. Plugins can then define alternative ways of creating tokens, such as the existing https://datasette.io/plugins/datasette-auth-tokens plugin. _Originally posted by @simonw in https://github.com/simonw/datasette/issues/1850#issuecomment-1289706439_ ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1852/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1425029242,I_kwDOBm6k_c5U8Dh6,1863,Update a single record in an existing table,9599,simonw,closed,0,,,8658075,Datasette 1.0a0,16,2022-10-27T04:53:17Z,2022-11-29T18:08:53Z,2022-11-29T18:06:37Z,OWNER,,"API design: ``` POST /db/table/row-pks/-/update { ""field"": ""updated_value"" } ``` Only the fields that you pass will be updated. Maybe this is the wrong design though? The design for insert currently looks like this: - https://github.com/simonw/datasette/issues/1851#issuecomment-1294224185 ``` POST /db/table/-/insert Authorization: Bearer xxx Content-Type: application/json { ""row"": { ""id"": 1, ""name"": ""New name"" } } ``` I could use the same format for `/-/update`, but in this case the API doesn't require you to pass every field so `""row""` doesn't seem like the right key. I think I'll go with this: ``` POST /db/table/1/-/update Authorization: Bearer xxx Content-Type: application/json { ""update"": { ""name"": ""New name"" } } ``` The benefit of having an `""update""` key is that it allows me to use other keys in the future. Maybe a `""alter"": true` key to indicate that new columns should be added if they are missing.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1863/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1425029275,I_kwDOBm6k_c5U8Dib,1864,Delete a single record from an existing table,9599,simonw,closed,0,,,8658075,Datasette 1.0a0,4,2022-10-27T04:53:22Z,2022-11-29T18:54:04Z,2022-11-29T18:54:04Z,OWNER,,"API design: ``` POST /db/table/row-pks/-/delete Or... 
DELETE /db/table/row-pks/-/delete ``` I'm just going to do `POST` for the moment, like I did here: - #1874 Permission: `delete-row` Still needed: - [ ] Tests for rowid tables - [ ] Tests for compound primary keys",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1864/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 641460179,MDU6SXNzdWU2NDE0NjAxNzk=,854,"Respect default scope[""actor""] if one exists",9599,simonw,closed,0,,,5533512,Datasette 0.45,0,2020-06-18T18:25:08Z,2020-06-18T18:39:22Z,2020-06-18T18:39:22Z,OWNER,,"ASGI wrapper plugins that themselves set the `actor` scope variable should be respected (though `actor_from_request` plugins should still execute and get the chance to replace that initial `actor` value). Relevant code: https://github.com/simonw/datasette/blob/09a3479a5402df96489ed6cab6cc9fd674bf3433/datasette/app.py#L910-L921",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/854/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 275476839,MDU6SXNzdWUyNzU0NzY4Mzk=,138,"Per-database and per-table metadata, probably using data-package",9599,simonw,closed,0,,,,,1,2017-11-20T19:50:10Z,2017-12-10T03:08:36Z,2017-12-10T03:08:26Z,OWNER,,"Ability to annotate databases and tables with extra metadata describing their purpose, providing source and licensing information and describing individual columns. http://frictionlessdata.io/specs/data-package/ looks like a great format for this, see #105 ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/138/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 322477187,MDU6SXNzdWUzMjI0NzcxODc=,255,Facets,9599,simonw,closed,0,,,,,16,2018-05-12T03:00:07Z,2019-05-29T21:39:12Z,2018-05-16T15:32:12Z,OWNER,,"Ability to display facets and facet counts on the table view. 
Facets can be specified in the URL with `?_facet=column&_facet=othercolumn` or the default facets for a table can be set using a new `""facets"": [...]` property in `metadata.json` - [x] Implement `?_facet=` - [x] Implement `metadata.json` `facets` key - [x] Design for how facets should be presented - [x] Facets should be able to toggle off as well as on - [x] Expand labels for facets that are foreign keys - [x] Suggest potential facets (if we can do so within a tight time limit) - [x] Documentation",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/255/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 446429421,MDU6SXNzdWU0NDY0Mjk0MjE=,481,Facet by date,9599,simonw,closed,0,,,,,1,2019-05-21T05:55:54Z,2019-05-29T21:39:12Z,2019-05-21T06:09:49Z,OWNER,,Ability to facet on datetime fields by their date.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/481/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 267515836,MDU6SXNzdWUyNjc1MTU4MzY=,4,Make URLs immutable,9599,simonw,closed,0,,,2857392,Ship first public release,8,2017-10-23T01:13:30Z,2017-10-24T02:38:24Z,2017-10-24T02:38:24Z,OWNER,,"Absolutely everything should have a far-future expires header Part of the URL will be the truncated sha1 hash of the database file itself, calculated at build time",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/4/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 612089949,MDU6SXNzdWU2MTIwODk5NDk=,756,Add pipx to installation documentation,9599,simonw,closed,0,,,,,2,2020-05-04T18:49:01Z,2020-05-04T19:19:06Z,2020-05-04T19:10:33Z,OWNER,,"Add to this page: https://datasette.readthedocs.io/en/stable/installation.html Here's how to install plugins: https://twitter.com/simonw/status/1257348687979778050 ``` $ datasette plugins [] $ pipx inject datasette datasette-json-html injected package datasette-json-html into venv datasette done! โœจ ๐ŸŒŸ โœจ $ datasette plugins [ { ""name"": ""datasette-json-html"", ""static"": false, ""templates"": false, ""version"": ""0.6"" } ] ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/756/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 630120235,MDU6SXNzdWU2MzAxMjAyMzU=,797,"Documentation for new ""params"" setting for canned queries",9599,simonw,closed,0,,,5512395,Datasette 0.44,3,2020-06-03T15:55:11Z,2020-06-09T04:00:40Z,2020-06-03T21:04:51Z,OWNER,,Added here: https://github.com/simonw/datasette/commit/aa82d0370463580f2cb10d9617f1bcbe45cc994a#diff-5e0ffd62fced7d46339b9b2cd167c2f9R236,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/797/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 733796942,MDU6SXNzdWU3MzM3OTY5NDI=,1075,PrefixedUrlString mechanism broke everything,9599,simonw,closed,0,,,6026070,0.51,6,2020-10-31T19:58:05Z,2020-10-31T20:48:51Z,2020-10-31T20:48:51Z,OWNER,,Added in 7a67bc7a569509d65b3a8661e0ad2c65f0b09166 refs #1026. 
Lots of tests are failing now.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1075/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 445855910,MDU6SXNzdWU0NDU4NTU5MTA=,475,Documentation for about and about_url metadata,9599,simonw,closed,0,,,4305096,0.28,0,2019-05-19T19:36:59Z,2019-05-19T20:13:36Z,2019-05-19T20:13:36Z,OWNER,,Added in https://github.com/simonw/datasette/commit/bf6b0f918de4aeee7c1036ac975ce2fb23237da7 without docs.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/475/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 459598080,MDU6SXNzdWU0NTk1OTgwODA=,520,asgi_wrapper plugin hook,9599,simonw,closed,0,9599,simonw,,,3,2019-06-23T17:16:45Z,2019-07-03T04:40:34Z,2019-07-03T04:06:28Z,OWNER,,"After #272 we can finally add this hook. It will allow plugins to wrap their own ASGI middleware around Datasette. Potential use-cases include: * adding authentication * custom CORS headers (see #454) * maybe gzip support? * possibly defining entirely new routes, though that may be better handled by a separate hook",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/520/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 465002978,MDU6SXNzdWU0NjUwMDI5Nzg=,550,Pull m2m faceting out of master so we can ship a release without it,9599,simonw,closed,0,,,4471010,Datasette 0.29,1,2019-07-07T23:10:48Z,2019-07-07T23:21:22Z,2019-07-07T23:21:22Z,OWNER,,After spending some time with #495 I believe I need to make some pretty major changes to how m2m faceting works. I don't want it to block the release of ASGI Datasette so I'm going to revert it back out of master for the moment and merge it back in after the release has gone out.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/550/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1452495049,I_kwDOBm6k_c5Wk1DJ,1899,Clicking within the CodeMirror area below the SQL (i.e. when there's only a single line) doesn't cause the editor to get focused ,95570,bgrins,closed,0,,,,,4,2022-11-17T00:29:52Z,2022-11-18T07:28:28Z,2022-11-18T07:20:53Z,CONTRIBUTOR,,"After the upgrade to 6 (#1893) I noticed this. I think it's because we're doing overflow:hidden to accomplish the CSS resizer. When there's a single line of SQL there's a gap below that line where clicking doesn't do anything. It should focus at the end of the line.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1899/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 568091133,MDU6SXNzdWU1NjgwOTExMzM=,676,?_searchmode=raw option for running FTS searches without escaping characters,58088336,tunguyenatwork,closed,0,,,,,9,2020-02-20T06:56:57Z,2020-02-25T05:57:24Z,2020-02-25T05:56:04Z,NONE,,"After the version 0.34. I am not able to use the wildchar in the _search option( or the full text search). It will not return any result unless I specify the whole word for text search. 
If I use 'match :search || ""*"" ' in the sql statement then it will work as expected.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/676/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 316123256,MDU6SXNzdWUzMTYxMjMyNTY=,229,Table view should support ?_size=400 parameter,9599,simonw,closed,0,,,,,1,2018-04-20T04:23:18Z,2018-04-26T04:49:46Z,2018-04-26T04:48:32Z,OWNER,,Allows callers to request more rows at once. The limit will still be `max_returned_rows` (defaults to 1000).,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/229/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 273537940,MDU6SXNzdWUyNzM1Mzc5NDA=,77,Add Travis CI badge to README,9599,simonw,closed,0,,,2857392,Ship first public release,0,2017-11-13T18:52:25Z,2017-11-13T21:24:15Z,2017-11-13T21:24:15Z,OWNER,,"Also fix this newline issue: ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/77/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1197926598,I_kwDOBm6k_c5HZujG,1705,How to upgrade your plugin for 1.0 documentation,9599,simonw,open,0,,,8755003,Datasette 1.0a-next,1,2022-04-08T23:16:47Z,2022-12-13T05:29:05Z,,OWNER,,"Among other things, needed by: - #1704",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1705/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 503218205,MDU6SXNzdWU1MDMyMTgyMDU=,586,Enable browser caching for plugin statics with datasette-auth,9599,simonw,closed,0,,,,,2,2019-10-07T03:47:14Z,2019-10-07T15:46:04Z,2019-10-07T15:46:03Z,OWNER,,"An authenticated Datasette I run is seeing delays on every page load. On looking at the network inspector it turns out it's because datasette-vega is nearly 1MB and a `cache-control: private` is preventing it from being cached! This may well turn out to be a bug in `datasette-auth-github` but it's still worth tracking here because caching of static assets from plugins is very important. ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/586/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 749981663,MDU6SXNzdWU3NDk5ODE2NjM=,1104,config.json in directory config mode should be settings.json,9599,simonw,closed,0,,,6055094,Datasette 0.52,2,2020-11-24T19:34:38Z,2020-11-24T20:37:42Z,2020-11-24T20:37:41Z,OWNER,,"Another knock-on effect of #992. 
https://github.com/simonw/datasette/blob/4bac9f18f9d04e5ed10f072502bcc508e365438e/docs/config.rst#L51-L55",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1104/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 924203783,MDU6SXNzdWU5MjQyMDM3ODM=,1379,Idea: ?_end=1 option for streaming CSV responses,9599,simonw,open,0,,,,,0,2021-06-17T18:11:21Z,2021-06-17T18:11:30Z,,OWNER,,"As discussed in this thread: https://twitter.com/simonw/status/1405554676993433605 - one of the disadvantages of Datasette's streaming CSV feature is that it's hard to tell if you got the whole file or if the connection ended early - or if an error occurred. Idea: offer an optional `?_end=1` parameter which, if enabled, adds a single row to the end of the CSV file that looks like this: `END,,,,,,,,,` For however many columns the CSV file usually has.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1379/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1338137350,I_kwDOBm6k_c5PwlsG,1781,Ensure Datasette Lite is promoted in docs and README,9599,simonw,closed,0,,,8303187,Datasette 0.62,1,2022-08-14T05:12:35Z,2022-08-14T15:24:40Z,2022-08-14T15:24:40Z,OWNER,,As of 0.62 https://lite.datasette.io is a supported piece of the overall Datasette ecosystem.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1781/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 749979454,MDU6SXNzdWU3NDk5Nzk0NTQ=,1103,Rename /-/config to /-/settings,9599,simonw,closed,0,,,6055094,Datasette 0.52,2,2020-11-24T19:31:00Z,2020-11-24T20:19:20Z,2020-11-24T20:19:19Z,OWNER,,"As part of rebranding config to settings, see also #992.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1103/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1197925865,I_kwDOBm6k_c5HZuXp,1704,File PRs against incompatible plugins pinning to datasette<1.0,9599,simonw,open,0,,,3268330,Datasette 1.0,0,2022-04-08T23:15:30Z,2022-04-08T23:15:30Z,,OWNER,,"As part of the preparation for the 1.0 release, test all existing known plugins against the alpha. 
For any that break, submit a PR suggesting they pin to a version <1.0 - and include a link to the documentation on how to upgrade the plugin for 1.0.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1704/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1450796965,I_kwDOBm6k_c5WeWel,1894,Initialize CodeMirror during DOMContentLoaded instead of onload,95570,bgrins,closed,0,,,,,0,2022-11-16T03:52:19Z,2022-11-18T07:29:02Z,2022-11-18T07:29:02Z,CONTRIBUTOR,,As per https://github.com/simonw/datasette/pull/1893/files#r1023248927 this should prevent a flash between the textarea being replaced by CodeMirror.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1894/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 309471814,MDU6SXNzdWUzMDk0NzE4MTQ=,189,Ability to sort (and paginate) by column,9599,simonw,closed,0,9599,simonw,,,31,2018-03-28T18:04:51Z,2018-04-15T18:54:22Z,2018-04-09T05:16:02Z,OWNER,,"As requested in https://github.com/simonw/datasette/issues/185#issuecomment-376614973 I've previously avoided this for performance reasons: sort-by-column on a column without an index is likely to perform badly for hundreds of thousands of rows. That's not a good enough reason to avoid the feature entirely though. A few options: * Allow sort-by-column by default, give users the option to disable it for specific tables/columns * Disallow sort-by-column by default, give users option (probably in `metadata.json`) to enable it for specific tables/columns * Automatically detect if a column either has an index on it OR a table has less than X rows in it We already have the mechanism in place to cut off SQL queries that take more than X seconds, so if someone DOES try to sort by a column that's too expensive it won't actually hurt anything - but it would be nice to not show people a ""sort"" option which is guaranteed to throw a timeout error. The vast majority of datasette usage that I've seen so far is on smaller datasets where the performance penalties of sort-by-column are extremely unlikely to show up. 
---- Still left to do: - [x] UI that shows which sort order is currently being applied (in HTML and in JSON) - [x] UI for applying a sort order (with rel=nofollow to avoid Google crawling it) - [x] Sort column names should be escaped correctly in generated SQL - [x] Validation that the selected sort order is a valid column - [x] Throw error if user attempts to apply _sort AND _sort_desc at the same time - [x] Ability to disable sorting (or sort only for specific columns) in metadata.json - [x] Fix ""201 rows where sorted by sortable_with_nulls "" bug ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/189/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1816857442,I_kwDOBm6k_c5sSwti,2106,`datasette install -e` option,9599,simonw,closed,0,,,,,3,2023-07-22T18:33:42Z,2023-07-26T18:28:33Z,2023-07-22T18:42:54Z,OWNER,,"As seen in LLM and now in `sqlite-utils` too: - https://github.com/simonw/sqlite-utils/issues/570 Useful for developing plugins, see tutorial at https://llm.datasette.io/en/stable/plugins/tutorial-model-plugin.html",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2106/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1531991339,I_kwDOBm6k_c5bUFUr,1989,Suggestion: Hiding columns,116795,pax,open,0,,,,,3,2023-01-13T09:33:32Z,2023-03-31T06:18:05Z,,NONE,,As there's the possibility of [hiding tables](https://docs.datasette.io/en/stable/metadata.html#hiding-tables) - I've run into the **need of hiding specific columns** - data that's either not relevant for public or can't be shown due to privacy reasons. ,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1989/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 273267081,MDU6SXNzdWUyNzMyNjcwODE=,70,Paginate views using OFFSET/LIMIT,9599,simonw,closed,0,,,2857392,Ship first public release,0,2017-11-12T21:30:29Z,2017-11-13T21:11:01Z,2017-11-13T21:11:01Z,OWNER,,"As with #69 these should obey a maximum offset setting, which can be over-ridden.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/70/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 267522549,MDU6SXNzdWUyNjc1MjI1NDk=,11,Code that generates compile-time properties about the database ,9599,simonw,closed,0,,,2857392,Ship first public release,1,2017-10-23T02:18:24Z,2017-10-23T16:04:23Z,2017-10-23T16:04:23Z,OWNER,,"At a minimum this will include: * sha hash of each database file * list of tables with row counts for each database file",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/11/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1174708375,I_kwDOBm6k_c5GBKCX,1673,Streaming CSV spends a lot of time in `table_column_details`,9599,simonw,open,0,,,,,1,2022-03-20T22:25:28Z,2022-03-20T22:34:06Z,,OWNER,,"At least I think it does. 
I tried running `py-spy top -p $PID` against a Datasette process that was trying to do: datasette covid.db --get '/covid/ny_times_us_counties.csv?_size=10&_stream=on' While investigating: - #1355 And spotted this: ``` datasette covid.db --get /covid/ny_times_us_counties.csv?_size=10&_stream=on' (python v3.10.2) Total Samples 5800 GIL: 71.00%, Active: 98.00%, Threads: 4 %Own %Total OwnTime TotalTime Function (filename:line) 8.00% 8.00% 4.32s 4.38s sql_operation_in_thread (datasette/database.py:212) 5.00% 5.00% 3.77s 3.93s table_column_details (datasette/utils/__init__.py:614) 6.00% 6.00% 3.72s 3.72s _worker (concurrent/futures/thread.py:81) 7.00% 7.00% 2.98s 2.98s _read_from_self (asyncio/selector_events.py:120) 5.00% 6.00% 2.35s 2.49s detect_fts (datasette/utils/__init__.py:571) 4.00% 4.00% 1.34s 1.34s _write_to_self (asyncio/selector_events.py:140) ``` Relevant code: https://github.com/simonw/datasette/blob/798f075ef9b98819fdb564f9f79c78975a0f71e8/datasette/utils/__init__.py#L609-L625 ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1673/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 275159710,MDU6SXNzdWUyNzUxNTk3MTA=,128,"Every visualization should have an ""embed"" button",9599,simonw,open,0,,,,,0,2017-11-19T13:38:13Z,2019-05-13T18:33:51Z,,OWNER,,"At least for the first round of visualizations, any time you construct one using the UI the result should include an ""embed this"" button that returns source code to copy and paste These examples should use unpkg.com (or similarl) urls with SRI hashes, eg https://www.srihash.org - and should load data from the datasette JSON API.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/128/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 2028698018,I_kwDOBm6k_c5463mi,2213,feature request: gzip compression of database downloads,536941,fgregg,open,0,,,,,1,2023-12-06T14:35:03Z,2023-12-06T15:05:46Z,,CONTRIBUTOR,,"At the bottom of database pages, datasette gives users the opportunity to download the underlying sqlite database. It would be great if that could be served gzip compressed. this is similar to #1213, but for me, i don't need datasette to compress html and json because my CDN layer does it for me, however, cloudflare at least, will not compress a mimetype of ""application"" (see list of mimetype: https://developers.cloudflare.com/speed/optimization/content/brotli/content-compression/)",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2213/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 754178780,MDU6SXNzdWU3NTQxNzg3ODA=,1121,Table actions cog is misaligned,3243482,abdusco,closed,0,,,,,1,2020-12-01T08:41:25Z,2020-12-03T01:03:19Z,2020-12-03T00:33:36Z,CONTRIBUTOR,,"At the moment it looks like this https://datasette-graphql-demo.datasette.io/github/repos ![image](https://user-images.githubusercontent.com/3243482/100716533-e6e2d300-33c9-11eb-866e-1e83ba228bf5.png) Adding a few flex statements fixes the alignment and centers `h1` text and the cog icon vertically. 
![image](https://user-images.githubusercontent.com/3243482/100716605-f8c47600-33c9-11eb-8d69-0e37499cf641.png) ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1121/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 377266351,MDU6SXNzdWUzNzcyNjYzNTE=,373,Views should be shown on root/index page along with tables,416374,gfrmin,closed,0,,,4305096,0.28,1,2018-11-05T06:28:41Z,2019-05-16T00:29:22Z,2019-05-16T00:29:22Z,CONTRIBUTOR,,"At the moment the number of views is given on a datasette ""homepage"", but not links to any views themselves",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/373/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 374953006,MDU6SXNzdWUzNzQ5NTMwMDY=,369,Interface should show same JSON shape options for custom SQL queries,416374,gfrmin,open,0,,,3268330,Datasette 1.0,2,2018-10-29T10:39:15Z,2020-05-30T17:24:06Z,,CONTRIBUTOR,,"At the moment the page returning a custom SQL query shows the JSON and CSV APIs, but not the multiple JSON shapes. However, adding the `_shape` parameter to the JSON API URL manually still works, so perhaps there should be consistency in the interface by having the same ""Advanced Export"" box for custom SQL queries.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/369/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 711627628,MDU6SXNzdWU3MTE2Mjc2Mjg=,981,Action menu for table columns,9599,simonw,closed,0,,,5971510,Datasette 0.50,16,2020-09-30T04:45:38Z,2020-10-08T23:55:00Z,2020-09-30T23:58:17Z,OWNER,,"At the very least I'd like a menu on each table column that lets me select sort-asc v.s. sort-desc without having to click twice. I'd also like to be able to indicate that a column should be used for faceting (possibly only for columns that are not floating point and do not have a unique index on them). This needs to be built with accessibility in mind - I don't want screenreaders to read out the contents of a menu as the ""th"" label for any given cell. Related: #690",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/981/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 273192789,MDU6SXNzdWUyNzMxOTI3ODk=,67,Command that builds a local docker container,9599,simonw,closed,0,,,2857392,Ship first public release,2,2017-11-12T02:13:29Z,2017-11-13T16:17:52Z,2017-11-13T16:17:52Z,OWNER,,Be nice to indicate that this isn't just for Now. 
Shouldn't be too hard either.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/67/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1237871948,I_kwDOBm6k_c5JyG1M,1743,`datasette.utils.to_css_class()` should be a documented internal,9599,simonw,open,0,,,,,0,2022-05-16T23:57:26Z,2022-05-16T23:57:26Z,,OWNER,,"Because I'm using it in this plugin: - https://github.com/simonw/datasette-upload-dbs/issues/1",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1743/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 442327592,MDU6SXNzdWU0NDIzMjc1OTI=,456,Installing installs the tests package,7725188,hellerve,closed,0,,,,,3,2019-05-09T16:35:16Z,2020-07-24T20:39:54Z,2020-07-24T20:39:54Z,CONTRIBUTOR,,"Because `setup.py` uses `find_packages` and `tests` is on the top-level, `pip install datasette` will install a top-level package called `tests`, which is probably not desired behavior. The offending line is here: https://github.com/simonw/datasette/blob/bfa2ae0d16d39bb82dbe4da4f3fdc3c7f6257418/setup.py#L40 And only `pip uninstall datasette` with a conflicting package would warn you by default; apparently another package had the same problem, which is why I get this message when uninstalling: ``` $ pip uninstall datasette Uninstalling datasette-0.27: Would remove: /usr/local/bin/datasette /usr/local/lib/python3.7/site-packages/datasette-0.27.dist-info/* /usr/local/lib/python3.7/site-packages/datasette/* /usr/local/lib/python3.7/site-packages/tests/* Would not remove (might be manually added): [ .. snip .. ] Proceed (y/n)? ``` This should be a relatively simple fix, and I could drop a PR if desired! Cheers",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/456/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 955316250,MDU6SXNzdWU5NTUzMTYyNTA=,1405,utils.parse_metadata() should be a documented internal function,9599,simonw,closed,0,,,,,3,2021-07-28T23:51:39Z,2021-07-29T23:33:30Z,2021-07-29T23:30:24Z,OWNER,,Because it's used by this plugin: https://github.com/simonw/datasette-remote-metadata,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1405/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 864969683,MDU6SXNzdWU4NjQ5Njk2ODM=,1305,Index view crashes when any database table is not accessible to actor,416374,gfrmin,closed,0,,,,,0,2021-04-22T13:44:22Z,2021-06-02T04:26:29Z,2021-06-02T04:26:29Z,CONTRIBUTOR,,"Because of https://github.com/simonw/datasette/blob/main/datasette/views/index.py#L63, the ```tables``` dict built does not include invisible tables; however, if https://github.com/simonw/datasette/blob/main/datasette/views/index.py#L80 is reached (because table_counts was not successfully initialized, e.g. due to a very large database) then as db.get_all_foreign_keys() returns ALL tables, a KeyError will be raised. This error can be recreated with the fixtures.db if any table is hidden, e.g. 
by adding something like ```""foreign_key_references"": { ""allow"": {} }``` to fixtures-metadata.json and deleting ```or not table_counts``` from https://github.com/simonw/datasette/blob/main/datasette/views/index.py#L77. I'm not sure how to fix this error; perhaps by testing if the table is in the aforementions ```tables``` dict.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1305/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 435819321,MDU6SXNzdWU0MzU4MTkzMjE=,436,400 Error when trying to register new user via https://publish.datasettes.com/,317694,nniiicc,closed,0,,,,,1,2019-04-22T17:55:00Z,2021-01-04T20:15:42Z,2021-01-04T20:15:41Z,NONE,,"Behavior: When registering a new user via Zeit - confirmation is sent and screen acknowledges registered user... When clicking grant access the next screen is a white 400 error message. Replicated: Chrome and Firefox; 2 different email accounts",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/436/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 691475400,MDU6SXNzdWU2OTE0NzU0MDA=,958,Upgrade to latest Black (20.8b1),9599,simonw,closed,0,,,5818042,Datasette 0.49,0,2020-09-02T22:24:19Z,2020-09-11T21:34:24Z,2020-09-02T22:25:10Z,OWNER,,"Black has some changes: https://black.readthedocs.io/en/stable/change_log.html#b0 - in particular: > - re-implemented support for explicit trailing commas: now it works consistently within any bracket pair, including nested structures (#1288 and duplicates) > - Black now reindents docstrings when reindenting code around it (#1053)",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/958/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1122416919,I_kwDOBm6k_c5C5rkX,1623,/-/patterns returns link: alternate JSON header to 404,9599,simonw,closed,0,,,3268330,Datasette 1.0,2,2022-02-02T21:42:49Z,2022-03-19T04:04:49Z,2022-02-02T21:48:56Z,OWNER,,"Bug from: - #1620 ``` % curl -s -I 'https://latest.datasette.io/-/patterns' | grep link link: https://latest.datasette.io/-/patterns.json; rel=""alternate""; type=""application/json+datasette"" ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1623/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 462117311,MDU6SXNzdWU0NjIxMTczMTE=,531,/database/-/inspect,9599,simonw,open,0,,,,,1,2019-06-28T16:33:41Z,2019-07-08T15:43:57Z,,OWNER,,"Build `/database/-/inspect` which shows tables, columns, column types and foreign keys It won't show table counts. Or maybe it will include them optionally but only for `-i` databases, in a special area of the JSON reserved for immutable-only inspect details. 
_Originally posted by @simonw in https://github.com/simonw/datasette/issues/465#issuecomment-506797086_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/531/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1066288689,I_kwDOBm6k_c4_jkYx,1538,Research pattern for re-registering existing Click tools with register_commands,9599,simonw,closed,0,,,,,3,2021-11-29T17:09:47Z,2021-11-29T17:32:44Z,2021-11-29T17:27:16Z,OWNER,,"Building a Datasette plugin that imports an existing Click CLI tool and re-registers it is proving hard - Click doesn't really want you to do that. I tried this: ```python from datasette import hookimpl from git_history.cli import file as git_history_file @hookimpl def register_commands(cli): cli.command(name=""git-history"")(git_history_file.callback) ``` But when I run this: ``` % datasette git-history --help Usage: datasette git-history [OPTIONS] Analyze the history of a specific file and write it to SQLite Options: --help Show this message and exit. ``` The options are all missing - which means that the command doesn't actually work. Will need to research this pattern separately. _Originally posted by @simonw in https://github.com/simonw/git-history/issues/21#issuecomment-981835305_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1538/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 465001185,MDU6SXNzdWU0NjUwMDExODU=,549,Send pull request to the repo that the _table.html template will break,9599,simonw,closed,0,,,4471010,Datasette 0.29,1,2019-07-07T22:45:17Z,2019-07-08T03:36:46Z,2019-07-08T03:36:45Z,OWNER,,"Bump this to 0.29 https://github.com/simonw/salaries-datasette/blob/master/requirements/base.txt And rename https://github.com/simonw/salaries-datasette/blob/master/templates/_rows_and_columns.html to _table.html",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/549/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1504352503,I_kwDOBm6k_c5Zqpj3,1968,Allow to hide some queries in metadata.yml,562352,CharlesNepote,open,0,,,,,0,2022-12-20T10:45:41Z,2022-12-20T10:45:41Z,,NONE,,"By default all queries are displayed. But there are many cases where it would be interesting to hide the queries by default: * the website is targeting non-tech people * the query is veeeeeery long ([eg.](https://mirabelle.openfoodfacts.org/products/energy_calculator)) * reading the query is not important for the users, they only want to see the result Of course, the user still could have the option to see the query. 
It could be an option in the metadata file: ```yml databases: awesome_db: tables: products: hide_sql: true queries: great_query: hide_sql: true sql: select * from products where code = :barcode ``` The priority could be: * no option in the metadata and nothing in the URL: query displayed * hide_sql in the metadata and nothing in the URL: query displayed as asked in the metadata * hide_sql in the metadata and &_hide_sql= in the URL: query as asked in the URL See also: #1824 ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1968/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1605959201,I_kwDOBm6k_c5fuP4h,2032,datasette errors when foreign key integrity is enabled,193185,cldellow,open,0,,,,,0,2023-03-02T01:27:51Z,2023-03-02T01:31:58Z,,CONTRIBUTOR,,"By default, [SQLite does not enforce foreign key constraints](https://www.sqlite.org/foreignkeys.html#fk_enable). I typically enable these checks by running: ```sql PRAGMA foreign_keys = ON; ``` inside of a `prepare_connection` hook. If a plugin causes the schema to change (eg datasette-scraper creating a new table, or datasette-edit-schema changing a column), then https://github.com/simonw/datasette/blob/0b4a28691468b5c758df74fa1d72a823813c96bf/datasette/utils/internal_db.py#L71-L77 will fail with: ``` FOREIGN KEY constraint failed ``` This could be resolved by either: - deleting from the `tables` column last - changing the schema so that the foreign keys have [ON DELETE CASCADE](https://www.sqlite.org/foreignkeys.html#fk_actions) Let me know if you'd be open to a PR that addresses this -- since foreign key constraints aren't enabled by default, I guess it's questionable whether this is a bug. I think I can workaround this by inspecting the database parameter in `prepare_connection` and trying not to enable fkey checks on the `_internal` database.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2032/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 534629631,MDU6SXNzdWU1MzQ2Mjk2MzE=,650,Add a glossary to the documentation,9599,simonw,open,0,,,,,3,2019-12-09T00:23:45Z,2022-01-13T22:04:56Z,,OWNER,,"Call it `glossary.rst` - it can use a definition list something like this: ```rst .. _glossary: Glossary ======== Term A definition of the term. Another term Another definition. ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/650/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 648435885,MDU6SXNzdWU2NDg0MzU4ODU=,878,"New pattern for views that return either JSON or HTML, available for plugins",9599,simonw,open,0,,,3268330,Datasette 1.0,26,2020-06-30T19:26:13Z,2022-03-19T16:19:30Z,,OWNER,,"Can be part of #870 - refactoring existing views to use `register_routes()`. > I'm going to put the new `check_permissions()` method on `BaseView` as well. If I want that method to be available to plugins I can do so by turning that `BaseView` class into a documented API that plugins are encouraged to use themselves. 
_Originally posted by @simonw in https://github.com/simonw/datasette/issues/832#issuecomment-651995453_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/878/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 761706858,MDU6SXNzdWU3NjE3MDY4NTg=,1137,Update README to reflect new datasette.io site,9599,simonw,closed,0,,,,,0,2020-12-10T23:22:06Z,2020-12-10T23:28:50Z,2020-12-10T23:28:50Z,OWNER,,Can finally close #659.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1137/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 734777631,MDU6SXNzdWU3MzQ3Nzc2MzE=,1080,"""View all"" option for facets, to provide a (paginated) list of ALL of the facet counts plus a link to view them",9599,simonw,open,0,,,3268330,Datasette 1.0,7,2020-11-02T19:55:06Z,2022-02-04T06:25:18Z,,OWNER,,Can use `/database/-/...` namespace from #296,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1080/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 345469355,MDU6SXNzdWUzNDU0NjkzNTU=,351,Automatically create a GitHub release linking to release notes for every tagged release,9599,simonw,closed,0,,,,,1,2018-07-28T18:31:12Z,2020-05-28T18:56:16Z,2020-05-28T18:56:15Z,OWNER,,"Can use this API called from Travis: https://developer.github.com/v3/repos/releases/#create-a-release The release it generates should look like this one: https://github.com/simonw/datasette/releases/tag/0.24",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/351/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 275228834,MDU6SXNzdWUyNzUyMjg4MzQ=,136,"""Reformat SQL"" button next to SQL editor textarea",9599,simonw,closed,0,,,,,0,2017-11-20T03:42:19Z,2019-10-14T03:46:13Z,2019-10-14T03:46:13Z,OWNER,,"Can use this: https://github.com/zeroturnaround/sql-formatter https://zeroturnaround.github.io/sql-formatter/ ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/136/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 640917326,MDU6SXNzdWU2NDA5MTczMjY=,852,canned_queries() plugin hook,9599,simonw,closed,0,,,5533512,Datasette 0.45,9,2020-06-18T05:24:35Z,2020-06-20T03:08:40Z,2020-06-20T03:08:40Z,OWNER,,"Canned queries are currently baked into `metadata.json` which is read once on startup. Allowing users to interactively create new canned queries - even if just through a plugin - would make a lot of sense. Is this a new plugin hook or some other mechanism? 
Lots to think about here.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/852/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 582517965,MDU6SXNzdWU1ODI1MTc5NjU=,698,Ability for a canned query to write to the database,9599,simonw,closed,0,,,5512395,Datasette 0.44,26,2020-03-16T18:31:59Z,2020-06-06T19:43:49Z,2020-06-06T19:43:48Z,OWNER,,"Canned queries are currently read-only: https://datasette.readthedocs.io/en/0.38/sql_queries.html#canned-queries Add a `""write"": true` option to their definition in `metadata.json` which turns them into queries that are submitted via POST and send their queries to the write queue. Then they can be used as a really quick way to define a writable interface and JSON API!",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/698/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 620969465,MDU6SXNzdWU2MjA5Njk0NjU=,767,Allow to specify a URL fragment for canned queries,2657547,rixx,closed,0,,,5471110,Datasette 0.43,2,2020-05-19T13:17:42Z,2020-05-27T21:52:25Z,2020-05-27T21:52:25Z,CONTRIBUTOR,,"Canned queries are very useful to direct users to prepared data and views. I like to use them with charts using datasette-vega a lot, because people get a direct impression at first glance. datasette-vega doesn't show up by default though, and users have to click through to it. Also, datasette-vega does not always guess the best way to render columns correctly though, so it would be nice if I could specify a URL fragment in my canned queries to make sure people see what I want them to see. My current workaround is to include a fragement link in ``description_html`` and ask people to reload the page, like [here](https://data.rixx.de/songs/show_by_bpm#g.mark=bar&g.x_column=bpm_floor&g.x_type=ordinal&g.y_column=bpm_count&g.y_type=quantitative), which is a bit hacky.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/767/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1536851861,I_kwDOBm6k_c5bmn-V,1994,Stuck on loading screen,10913053,jackhagley,open,0,,,,,1,2023-01-17T18:33:49Z,2023-01-23T08:21:08Z,,NONE,,"Canโ€™t actually open it! 
Downloaded today from the releases tab Running macOS13.1 ``` bin/python3.9 --version Python 3.9.6 Took 83ms bin/python3.9 --version Python 3.9.6 Took 113ms bin/pip install datasette>=0.59 datasette-app-support>=0.11.6 datasette-vega>=0.6.2 datasette-cluster-map>=0.17.1 datasette-pretty-json>=0.2.1 datasette-edit-schema>=0.4 datasette-configure-fts>=1.1 datasette-leaflet>=0.2.2 --disable-pip-version-check Requirement already satisfied: datasette>=0.59 in lib/python3.9/site-packages (0.63) Requirement already satisfied: datasette-app-support>=0.11.6 in lib/python3.9/site-packages (0.11.6) Requirement already satisfied: datasette-vega>=0.6.2 in lib/python3.9/site-packages (0.6.2) Requirement already satisfied: datasette-cluster-map>=0.17.1 in lib/python3.9/site-packages (0.17.2) Requirement already satisfied: datasette-pretty-json>=0.2.1 in lib/python3.9/site-packages (0.2.2) Requirement already satisfied: datasette-edit-schema>=0.4 in lib/python3.9/site-packages (0.5.1) Requirement already satisfied: datasette-configure-fts>=1.1 in lib/python3.9/site-packages (1.1) Requirement already satisfied: datasette-leaflet>=0.2.2 in lib/python3.9/site-packages (0.2.2) Requirement already satisfied: click>=7.1.1 in lib/python3.9/site-packages (from datasette>=0.59) (8.1.3) Requirement already satisfied: hupper>=1.9 in lib/python3.9/site-packages (from datasette>=0.59) (1.10.3) Requirement already satisfied: pint>=0.9 in lib/python3.9/site-packages (from datasette>=0.59) (0.20.1) Requirement already satisfied: PyYAML>=5.3 in lib/python3.9/site-packages (from datasette>=0.59) (6.0) Requirement already satisfied: httpx>=0.20 in lib/python3.9/site-packages (from datasette>=0.59) (0.23.0) Requirement already satisfied: aiofiles>=0.4 in lib/python3.9/site-packages (from datasette>=0.59) (22.1.0) Requirement already satisfied: asgi-csrf>=0.9 in lib/python3.9/site-packages (from datasette>=0.59) (0.9) Requirement already satisfied: asgiref>=3.2.10 in lib/python3.9/site-packages (from datasette>=0.59) (3.5.2) Requirement already satisfied: uvicorn>=0.11 in lib/python3.9/site-packages (from datasette>=0.59) (0.19.0) Requirement already satisfied: itsdangerous>=1.1 in lib/python3.9/site-packages (from datasette>=0.59) (2.1.2) Requirement already satisfied: click-default-group-wheel>=1.2.2 in lib/python3.9/site-packages (from datasette>=0.59) (1.2.2) Requirement already satisfied: janus>=0.6.2 in lib/python3.9/site-packages (from datasette>=0.59) (1.0.0) Requirement already satisfied: pluggy>=1.0 in lib/python3.9/site-packages (from datasette>=0.59) (1.0.0) Requirement already satisfied: Jinja2>=2.10.3 in lib/python3.9/site-packages (from datasette>=0.59) (3.1.2) Requirement already satisfied: mergedeep>=1.1.1 in lib/python3.9/site-packages (from datasette>=0.59) (1.3.4) Requirement already satisfied: sqlite-utils in lib/python3.9/site-packages (from datasette-app-support>=0.11.6) (3.30) Requirement already satisfied: packaging in lib/python3.9/site-packages (from datasette-app-support>=0.11.6) (21.3) Requirement already satisfied: python-multipart in lib/python3.9/site-packages (from asgi-csrf>=0.9->datasette>=0.59) (0.0.5) Requirement already satisfied: httpcore<0.16.0,>=0.15.0 in lib/python3.9/site-packages (from httpx>=0.20->datasette>=0.59) (0.15.0) Requirement already satisfied: certifi in lib/python3.9/site-packages (from httpx>=0.20->datasette>=0.59) (2022.9.24) Requirement already satisfied: rfc3986[idna2008]<2,>=1.3 in lib/python3.9/site-packages (from httpx>=0.20->datasette>=0.59) (1.5.0) Requirement 
already satisfied: sniffio in lib/python3.9/site-packages (from httpx>=0.20->datasette>=0.59) (1.3.0) Requirement already satisfied: h11<0.13,>=0.11 in lib/python3.9/site-packages (from httpcore<0.16.0,>=0.15.0->httpx>=0.20->datasette>=0.59) (0.12.0) Requirement already satisfied: anyio==3.* in lib/python3.9/site-packages (from httpcore<0.16.0,>=0.15.0->httpx>=0.20->datasette>=0.59) (3.6.2) Requirement already satisfied: idna>=2.8 in lib/python3.9/site-packages (from anyio==3.*->httpcore<0.16.0,>=0.15.0->httpx>=0.20->datasette>=0.59) (3.4) Requirement already satisfied: typing-extensions>=3.7.4.3 in lib/python3.9/site-packages (from janus>=0.6.2->datasette>=0.59) (4.4.0) Requirement already satisfied: MarkupSafe>=2.0 in lib/python3.9/site-packages (from Jinja2>=2.10.3->datasette>=0.59) (2.1.1) Requirement already satisfied: tabulate in lib/python3.9/site-packages (from sqlite-utils->datasette-app-support>=0.11.6) (0.9.0) Requirement already satisfied: python-dateutil in lib/python3.9/site-packages (from sqlite-utils->datasette-app-support>=0.11.6) (2.8.2) Requirement already satisfied: sqlite-fts4 in lib/python3.9/site-packages (from sqlite-utils->datasette-app-support>=0.11.6) (1.0.3) Requirement already satisfied: pyparsing!=3.0.5,>=2.0.2 in lib/python3.9/site-packages (from packaging->datasette-app-support>=0.11.6) (3.0.9) Requirement already satisfied: six>=1.5 in lib/python3.9/site-packages (from python-dateutil->sqlite-utils->datasette-app-support>=0.11.6) (1.16.0) Took 784ms ``` STUCK",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1994/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 675727366,MDU6SXNzdWU2NzU3MjczNjY=,919,"Travis should not build the master branch, only the main branch",9599,simonw,closed,0,,,,,3,2020-08-09T16:18:25Z,2020-08-09T16:26:18Z,2020-08-09T16:19:37Z,OWNER,,"Caused by #849 - since we are mirroring the two branches (to ensure old links to `master` keep working) Travis is building both. The following in `.travis.yml` should fix that: ``` branches: except: - master ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/919/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 516370822,MDU6SXNzdWU1MTYzNzA4MjI=,611,Static assets no longer loading for installed plugins,9599,simonw,closed,0,,,,,3,2019-11-01T22:07:00Z,2019-11-01T22:15:55Z,2019-11-01T22:15:55Z,OWNER,,"Caused by fix I made in #606 e.g. `/-/static-plugins/datasette_leaflet_geojson/datasette-leaflet-geojson.js` is a 404, but view-`/-/static-plugins/datasette-leaflet-geojson/datasette-leaflet-geojson.js` works correctly.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/611/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1841343173,I_kwDOBm6k_c5twKrF,2132,Get form fields on query page working again ,9599,simonw,closed,0,,,9700784,Datasette 1.0a3,1,2023-08-08T13:39:05Z,2023-08-08T13:45:10Z,2023-08-08T13:45:09Z,OWNER,,"Caused by: - #2112 https://latest.datasette.io/fixtures?sql=select+pk1%2C+pk2%2C+pk3%2C+content+from+compound_three_primary_keys+where+%22pk1%22+%3D+%3Ap0+order+by+pk1%2C+pk2%2C+pk3+limit+101&p0=b The `:p0` form field is missing. 
Submitting the form results in this error: ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2132/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 275089535,MDU6SXNzdWUyNzUwODk1MzU=,121,?_json=foo&_json=bar query string argument ,9599,simonw,closed,0,,,,,4,2017-11-18T16:09:55Z,2018-05-31T13:48:12Z,2018-05-28T18:11:51Z,OWNER,,"Causes the specified columns in the output to be treated as JSON, and returned deserialized in the .json or .jsono response. This will be particularly powerful when combined with https://sqlite.org/json1.html",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/121/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 788447787,MDU6SXNzdWU3ODg0NDc3ODc=,1194,?_size= argument is not persisted by hidden form fields in the table filters,9599,simonw,closed,0,,,6346396,Datasette 0.54,3,2021-01-18T17:41:52Z,2021-01-25T03:10:23Z,2021-01-25T03:10:23Z,OWNER,,"Click ""Apply"" on https://covid-19.datasettes.com/covid/ny_times_us_counties?_size=1000&county__exact=San+Francisco&state__exact=California&_sort_desc=date#g.mark=line&g.x_column=date&g.x_type=temporal&g.y_column=cases&g.y_type=quantitative and the `?_size=1000` parameter from the URL will no longer apply on the reloaded page.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1194/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 904071938,MDU6SXNzdWU5MDQwNzE5Mzg=,1345,?_nocol= does not interact well with default facets,9599,simonw,closed,0,,,,,7,2021-05-27T18:39:55Z,2021-05-31T02:40:44Z,2021-05-31T02:31:21Z,OWNER,,"Clicking ""Hide this column"" on `fips` on https://covid-19.datasettes.com/covid/ny_times_us_counties shows this error: https://covid-19.datasettes.com/covid/ny_times_us_counties?_nocol=fips > ## Invalid SQL > no such column: fips The reason is that https://covid-19.datasettes.com/-/metadata sets up the following: ```json ""ny_times_us_counties"": { ""sort_desc"": ""date"", ""facets"": [ ""state"", ""county"", ""fips"" ], ``` It's setting `fips` as a default facet, which breaks if you attempt to remove the column using `?_nocol`.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1345/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 430103450,MDU6SXNzdWU0MzAxMDM0NTA=,425,Submitting SQL on hide page is broken,9599,simonw,closed,0,,,,,2,2019-04-07T04:21:31Z,2019-04-12T05:12:13Z,2019-04-12T05:00:53Z,OWNER,,Clicking the submit button here doesn't work correctly: https://3a208a4.datasette.io/fixtures?sql=select+%2A+from+compound_three_primary_keys+order+by+pk1%2C+pk2%2C+pk3+limit+101&_hide_sql=1,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/425/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1469796454,I_kwDOBm6k_c5Xm1Bm,1920,Document Datasette.metadata() method,25778,eyeseast,open,0,,,,,0,2022-11-30T15:10:36Z,2022-11-30T15:10:36Z,,CONTRIBUTOR,,"Code is here: 
https://github.com/simonw/datasette/blob/main/datasette/app.py#L503 This will be the official way to access metadata from plugins.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1920/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 657747959,MDU6SXNzdWU2NTc3NDc5NTk=,895,SQL query output should show numeric values in a different colour,9599,simonw,closed,0,,,,,1,2020-07-16T00:28:03Z,2020-09-15T20:40:08Z,2020-09-15T20:40:08Z,OWNER,,"Compare https://latest.datasette.io/fixtures/sortable with https://latest.datasette.io/fixtures?sql=select+pk1%2C+pk2%2C+content%2C+sortable%2C+sortable_with_nulls%2C+sortable_with_nulls_2%2C+text+from+sortable+order+by+pk1%2C+pk2+limit+101 ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/895/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1122427321,I_kwDOBm6k_c5C5uG5,1624,Index page `/` has no CORS headers,9599,simonw,open,0,,,,,2,2022-02-02T21:56:10Z,2022-09-28T16:54:22Z,,OWNER,,"Compare the following: ``` % curl -I 'https://latest.datasette.io/fixtures' HTTP/1.1 200 OK link: https://latest.datasette.io/fixtures.json; rel=""alternate""; type=""application/json+datasette"" cache-control: max-age=5 referrer-policy: no-referrer access-control-allow-origin: * access-control-allow-headers: Authorization access-control-expose-headers: Link content-type: text/html; charset=utf-8 x-databases: _memory, _internal, fixtures, extra_database Date: Wed, 02 Feb 2022 21:55:49 GMT Server: Google Frontend Transfer-Encoding: chunked % curl -I 'https://latest.datasette.io/' HTTP/1.1 200 OK link: https://latest.datasette.io/.json; rel=""alternate""; type=""application/json+datasette"" content-type: text/html; charset=utf-8 x-databases: _memory, _internal, fixtures, extra_database Date: Wed, 02 Feb 2022 21:55:52 GMT Server: Google Frontend Transfer-Encoding: chunked ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1624/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 309558826,MDU6SXNzdWUzMDk1NTg4MjY=,190,Keyset pagination doesn't work correctly for compound primary keys,9599,simonw,closed,0,,,,,7,2018-03-28T22:45:06Z,2018-03-30T06:31:15Z,2018-03-30T06:26:28Z,OWNER,,"Consider https://datasette-issue-190-compound-pks.now.sh/compound-pks-9aafe8f/compound_primary_key ![2018-03-28 at 3 47 pm](https://user-images.githubusercontent.com/9599/38060388-56da86dc-329f-11e8-9f20-5576153ad55c.png) The next= link is to `d,v`: https://datasette-issue-190-compound-pks.now.sh/compound-pks-9aafe8f/compound_primary_key?_next=d%2Cv But that page starts with: ![2018-03-28 at 3 48 pm](https://user-images.githubusercontent.com/9599/38060402-6b0f5984-329f-11e8-85b8-44a666c4ee71.png) The next key in the sequence should be `d,w`. 
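For reference, a hedged sketch of compound-key keyset pagination (the table and values below are illustrative, not Datasette's actual query builder):

```python
import sqlite3

conn = sqlite3.connect(':memory:')
conn.execute('create table compound_primary_key (pk1 text, pk2 text, primary key (pk1, pk2))')
conn.executemany(
    'insert into compound_primary_key values (?, ?)',
    [('d', 'v'), ('d', 'w'), ('e', 'a'), ('e', 'b')],
)

# Keyset pagination over a two-column key: the page after ('d', 'v') must start at ('d', 'w').
# The expanded OR form works on any SQLite version; recent SQLite also accepts
# the row-value form: where (pk1, pk2) > ('d', 'v')
rows = conn.execute(
    'select pk1, pk2 from compound_primary_key '
    'where pk1 > :a or (pk1 = :a and pk2 > :b) '
    'order by pk1, pk2 limit 101',
    {'a': 'd', 'b': 'v'},
).fetchall()
print(rows)  # [('d', 'w'), ('e', 'a'), ('e', 'b')]
```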
Also we should return the full a-z of the ones that start with the letter e - in this example we only return `e-w`, `e-x`, `e-y` and `e-z`",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/190/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 448977444,MDU6SXNzdWU0NDg5Nzc0NDQ=,489,Pagination breaks when combined with expanded foreign keys,9599,simonw,closed,0,,,,,1,2019-05-27T19:56:56Z,2019-05-28T02:48:57Z,2019-05-28T02:23:27Z,OWNER,,"Consider https://edb3662.datasette.io/fixtures/roadside_attraction_characteristics?_sort=attraction_id&_size=2 The ""Next page"" link goes here, which returns 0 rows: https://edb3662.datasette.io/fixtures/roadside_attraction_characteristics?_size=2&_next=%257B%2527value%2527%253A%2B2%252C%2B%2527label%2527%253A%2B%2527Winchester%2BMystery%2BHouse%2527%257D%2C2&_sort=attraction_id That's because if you double-url-decode that `_next` link you get this: `_next={'value': 2, 'label': 'Winchester Mystery House'},2` It should be `_next=2,2`",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/489/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 525993034,MDU6SXNzdWU1MjU5OTMwMzQ=,637,"Custom queries with 0 results should say ""0 results""",9599,simonw,closed,0,,,,,3,2019-11-20T18:28:14Z,2019-11-23T06:17:23Z,2019-11-23T06:07:08Z,OWNER,,"Consider https://latest.datasette.io/fixtures/neighborhood_search?text=foop It's currently not obvious that the query executed and returned 0 results.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/637/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 784628163,MDU6SXNzdWU3ODQ2MjgxNjM=,1185,"""Statement may not contain PRAGMA"" error is not strictly true",9599,simonw,closed,0,,,6346396,Datasette 0.54,3,2021-01-12T22:07:10Z,2021-01-24T21:21:37Z,2021-01-12T22:26:26Z,OWNER,,"Consider https://latest.datasette.io/fixtures?sql=select+%27select%0D%0A%27+%7C%7C+group_concat%28%27++++case+when+%5B%27+%7C%7C+name+%7C%7C+%27%5D+is+not+null+then+%27+%7C%7C+quote%28name+%7C%7C+%27%2C+%27%29+%7C%7C+%27+else+%27%27%27%27+end%27%2C+%27+%7C%7C%0D%0A%27%29+%7C%7C+%27%0D%0A++as+columns%2C%0D%0A++count%28*%29+as+num_rows%0D%0Afrom%0D%0A++%5B%27+%7C%7C+%3Atable+%7C%7C+%27%5D%0D%0Agroup+by%0D%0A++columns%0D%0Aorder+by%0D%0A++num_rows+desc%27+as+query+from+pragma_ytable_info%28%3Atable%29&table=facetable It says ""Statement may not contain PRAGMA"" - but that's not actually true. Datasette has an allow-list of PRAGMA that are OK - in this case there was a typo in `pragma_ytable_info` which caused the error, but pragma_table_info` would have been OK. 
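For reference, `pragma_table_info` is a table-valued function, so the intended query pattern is an ordinary `select` (a quick generic example, not the exact query from the issue):

```python
import sqlite3

conn = sqlite3.connect(':memory:')
conn.execute('create table facetable (pk integer primary key, planet_int int, state text)')

# pragma_table_info() is a table-valued function, so it can appear in the FROM clause
for row in conn.execute(""select name, type from pragma_table_info('facetable')""):
    print(row)  # ('pk', 'INTEGER'), ('planet_int', 'INT'), ('state', 'TEXT')
```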
So the error message is misleading.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1185/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 327459829,MDU6SXNzdWUzMjc0NTk4Mjk=,298,URLify URLs in results from custom SQL statements / views,9599,simonw,closed,0,,,,,2,2018-05-29T19:41:07Z,2018-07-24T04:53:20Z,2018-07-24T03:56:50Z,OWNER,,"Consider this custom query: https://fivethirtyeight.datasettes.com/fivethirtyeight-5de27e3?sql=select+user%2C+%28%27https%3A%2F%2Ftwitter.com%2F%27+%7C%7C+user%29+as+user_url%2C+created_at%2C+text%2C+url+from+%5Btwitter-ratio%2Fsenators%5D+limit+10%3B ```select user, ('https://twitter.com/' || user) as user_url, created_at, text, url from [twitter-ratio/senators] limit 10;``` ![2018-05-29 at 12 38 pm](https://user-images.githubusercontent.com/9599/40681177-44a36d5c-633d-11e8-935b-c49dad7ac682.png) It would be nice if these URLs were turned into links, as happens on the table view page: https://fivethirtyeight.datasettes.com/fivethirtyeight-5de27e3/twitter-ratio%2Fsenators ![2018-05-29 at 12 39 pm](https://user-images.githubusercontent.com/9599/40681206-5c69c47c-633d-11e8-9f3a-08899f8659b8.png) This currently does not happen because the table view render logic takes a different path through `display_columns_and_rows()` which includes this bit: https://github.com/simonw/datasette/blob/b0a95da96386ddf99816911e08df86178ffa9a89/datasette/views/table.py#L195-L202",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/298/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1497577017,I_kwDOBm6k_c5ZQzY5,1957,Reconsider row value truncation on query page,9599,simonw,open,0,,,,,1,2022-12-14T23:49:47Z,2022-12-14T23:50:50Z,,OWNER,,"Consider this example: https://ripgrep.datasette.io/repos?sql=select+json_group_array%28full_name%29+from+repos ```sql select json_group_array(full_name) from repos ``` ![CleanShot 2022-12-14 at 15 48 32@2x](https://user-images.githubusercontent.com/9599/207739709-8177f683-f938-49a1-8225-42791fad88fe.png) My intention here was to get a string of JSON I can copy and paste elsewhere - see: https://til.simonwillison.net/sqlite/compare-before-after-json The truncation isn't helping here.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1957/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1055469073,I_kwDOBm6k_c4-6S4R,1513,Research: CTEs and union all to calculate facets AND query at the same time,9599,simonw,closed,0,,,,,12,2021-11-16T22:26:45Z,2021-11-16T23:41:46Z,2021-11-16T23:41:46Z,OWNER,,"Consider this page: https://global-power-plants.datasettes.com/global-power-plants/global-power-plants?_search=plant&_facet=owner&_facet=country_long&_facet=primary_fuel Datasette needs to run the main query for the rows on that page, a count query for the total query, then a separate query for each of those three specified facets. This is a `_search=` query, so it needs to execute the FTS code once for the rows, again for the count, and then three more times for each of the facets. 
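The rough shape of such a combined query, as a hedged sketch against a simplified table (no FTS here, just the CTE plus `union all` idea):

```python
import sqlite3

conn = sqlite3.connect(':memory:')
conn.execute('create table plants (name text, owner text, country_long text, primary_fuel text)')
conn.executemany(
    'insert into plants values (?, ?, ?, ?)',
    [('A', 'x', 'UK', 'Gas'), ('B', 'x', 'US', 'Coal'), ('C', 'y', 'US', 'Gas')],
)

# One round trip: the CTE holds the filtered rows (this is where the FTS match and
# other where clauses would go), then union all stacks the total count and each
# facet count into a single result set.
sql = '''
with filtered as (
  select * from plants
)
select 'count' as facet, null as value, count(*) as n from filtered
union all
select 'owner', owner, count(*) from filtered group by owner
union all
select 'country_long', country_long, count(*) from filtered group by country_long
'''
for facet, value, n in conn.execute(sql):
    print(facet, value, n)
```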
Could running that query as a CTE and doing the other queries as part of the same large query produce significant speed improvements?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1513/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 903986178,MDU6SXNzdWU5MDM5ODYxNzg=,1344,Test Datasette Docker images built for different architectures,9599,simonw,open,0,,,,,10,2021-05-27T16:52:29Z,2022-09-06T00:07:58Z,,OWNER,,"Continuing on from #1319 - now that we have the ability to build Datasette's Docker image against multiple architectures we should test that it works. We can do this with QEMU emulation, see https://twitter.com/nevali/status/1397958044571602945",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1344/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1421544654,I_kwDOBm6k_c5UuwzO,1851,API to insert a single record into an existing table,9599,simonw,closed,0,,,8658075,Datasette 1.0a0,22,2022-10-24T22:24:21Z,2022-11-15T19:59:18Z,2022-10-28T00:59:25Z,OWNER,,Controlled by a new `insert-row` permission.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1851/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 314665147,MDU6SXNzdWUzMTQ2NjUxNDc=,216,Bug: Sort by column with NULL in next_page URL,222245,carlmjohnson,closed,0,,,,,15,2018-04-16T14:03:18Z,2018-04-17T01:45:24Z,2018-04-17T01:45:24Z,NONE,,"Copy-pasting from https://github.com/simonw/datasette/issues/189#issuecomment-381429213, since that issue is closed: I think I found a bug. I tried to sort by middle initial in my salaries set, and many middle initials are null. The `next_url` gets set by Datasette to: http://localhost:8001/salaries-d3a5631/2017+Maryland+state+salaries?_next=None%2C391&_sort=middle_initial But then None is interpreted literally and it tries to find a name with the middle initial ""None"" and ends up skipping ahead to O on page 2. 
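A hedged sketch of the NULL-aware condition the next page needs instead of the literal string 'None' (column names and key values mirror the example above):

```python
import sqlite3

conn = sqlite3.connect(':memory:')
conn.execute('create table salaries (pk integer primary key, name text, middle_initial text)')
conn.executemany(
    'insert into salaries values (?, ?, ?)',
    [(390, 'Ann', None), (391, 'Bob', None), (392, 'Cal', None), (400, 'Dee', 'A')],
)

# The previous page ended on a NULL middle_initial at primary key 391, so the next page
# is 'the remaining NULLs after 391, then everything else' - not a lookup for the string 'None'
rows = conn.execute(
    'select pk, name, middle_initial from salaries '
    'where (middle_initial is null and pk > :pk) or middle_initial is not null '
    'order by middle_initial, pk limit 101',
    {'pk': 391},
).fetchall()
print(rows)  # starts with (392, 'Cal', None)
```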
",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/216/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 628156527,MDU6SXNzdWU2MjgxNTY1Mjc=,789,Mechanism for enabling pluggy tracing,9599,simonw,open,0,,,,,2,2020-06-01T05:10:14Z,2020-06-01T05:11:03Z,,OWNER,,"Could be useful for debugging plugins: https://pluggy.readthedocs.io/en/latest/#call-tracing I tried this out by adding these two lines in `plugins.py`: ```python pm = pluggy.PluginManager(""datasette"") pm.add_hookspecs(hookspecs) # Added these: pm.trace.root.setwriter(print) pm.enable_tracing() ``` Output looked something like this: ``` INFO: 127.0.0.1:52724 - ""GET /-/-/static/app.css HTTP/1.1"" 404 Not Found actor_from_request [hook] datasette: request: finish actor_from_request --> [] [hook] extra_body_script [hook] template: show_json.html database: None table: None view_name: json_data datasette: finish extra_body_script --> [] [hook] extra_template_vars [hook] template: show_json.html database: None table: None view_name: json_data request: datasette: finish extra_template_vars --> [] [hook] extra_css_urls [hook] template: show_json.html database: None table: None datasette: finish extra_css_urls --> [] [hook] extra_js_urls [hook] template: show_json.html database: None table: None datasette: finish extra_js_urls --> [] [hook] INFO: 127.0.0.1:52724 - ""GET /-/actor HTTP/1.1"" 200 OK actor_from_request [hook] datasette: request: finish actor_from_request --> [] [hook] ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/789/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 681334912,MDU6SXNzdWU2ODEzMzQ5MTI=,942,Support column descriptions in metadata.json,9599,simonw,closed,0,,,,,18,2020-08-18T20:52:00Z,2022-01-13T22:21:42Z,2021-08-12T23:53:24Z,OWNER,,"Could look something like this: ```json { ""title"": ""Five Thirty Eight"", ""license"": ""CC Attribution 4.0 License"", ""license_url"": ""https://creativecommons.org/licenses/by/4.0/"", ""source"": ""fivethirtyeight/data on GitHub"", ""source_url"": ""https://github.com/fivethirtyeight/data"", ""databases"": { ""fivethirtyeight"": { ""tables"": { ""mueller-polls/mueller-approval-polls"": { ""description_html"": ""

....

"", ""columns"": { ""name_of_column"": ""column_description goes here"" } ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/942/reactions"", ""total_count"": 4, ""+1"": 4, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1076057610,I_kwDOBm6k_c5AI1YK,1546,validating the sql,50336793,jadsongmatos,closed,0,,,,,1,2021-12-09T21:35:57Z,2021-12-18T02:05:17Z,2021-12-18T02:05:16Z,NONE,,Could someone tell me that part of the code is responsible for validating the sql that guarantees that only a table can be read,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1546/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 275755475,MDU6SXNzdWUyNzU3NTU0NzU=,140,Heatmap visualization plugin,9599,simonw,open,0,,,,,2,2017-11-21T15:34:23Z,2019-05-13T18:33:51Z,,OWNER,,Could use https://github.com/scottbedard/svelte-heatmap,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/140/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 837348479,MDU6SXNzdWU4MzczNDg0Nzk=,1269,Don't attempt to run count(*) against virtual tables,9599,simonw,closed,0,,,,,2,2021-03-22T05:57:43Z,2021-03-22T17:40:42Z,2021-03-22T17:40:41Z,OWNER,,"Counting the rows in a virtual table doesn't seem very interesting to me, and it's the cause of at least one crashing bug with SpatiaLite 5.0 on Linux, see https://github.com/simonw/datasette/issues/1268",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1269/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 634844634,MDU6SXNzdWU2MzQ4NDQ2MzQ=,817,Drop resource_type from permission_allowed system,9599,simonw,closed,0,,,,,1,2020-06-08T18:41:37Z,2020-06-08T19:00:12Z,2020-06-08T19:00:12Z,OWNER,,"Current signature: permission_allowed(datasette, actor, action, resource_type, resource_identifier) It turns out the `resource_type` is always the same thing for any given action, so it's not actually useful. I'm going to drop it. 
New signature will be: permission_allowed(datasette, actor, action, resource) Refs #811.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/817/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 278814220,MDU6SXNzdWUyNzg4MTQyMjA=,161,Support WITH query ,388154,wsxiaoys,closed,0,,,,,4,2017-12-03T20:00:40Z,2017-12-08T06:18:12Z,2017-12-04T04:52:41Z,NONE,,"Currently datasette fails with the error message: Statement must begin with SELECT Example query ```sql WITH RECURSIVE cnt(x) AS ( SELECT 1 UNION ALL SELECT x+1 FROM cnt LIMIT 1000000 ) SELECT x FROM cnt; ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/161/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1060631257,I_kwDOBm6k_c4_N_LZ,1528,"Add new `""sql_file""` key to Canned Queries in metadata?",15178711,asg017,open,0,,,,,3,2021-11-22T21:58:01Z,2022-06-10T03:23:08Z,,CONTRIBUTOR,,"Currently for canned queries, you have to inline SQL in your `metadata.yaml` like so: ```yaml databases: fixtures: queries: neighborhood_search: sql: |- select neighborhood, facet_cities.name, state from facetable join facet_cities on facetable.city_id = facet_cities.id where neighborhood like '%' || :text || '%' order by neighborhood title: Search neighborhoods ``` This works fine, but for a few reasons, I usually have my canned queries already written in separate `.sql` files. I'd like to re-use those instead of re-writing them. So, I'd like to see a new `""sql_file""` key that works like so: `metadata.yaml`: ```yaml databases: fixtures: queries: neighborhood_search: sql_file: neighborhood_search.sql title: Search neighborhoods ``` `neighborhood_search.sql`: ```sql select neighborhood, facet_cities.name, state from facetable join facet_cities on facetable.city_id = facet_cities.id where neighborhood like '%' || :text || '%' order by neighborhood ``` Both of these would work in the exact same way, where Datasette would instead open + include `neighborhood_search.sql` on startup. 
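One possible shape for that at startup (a hedged sketch - `resolve_sql_files` is a made-up helper, not anything Datasette ships today):

```python
import pathlib

import yaml

def resolve_sql_files(metadata_path):
    # Hypothetical helper: replace each sql_file key with the file's contents,
    # resolved relative to the metadata file itself
    metadata_path = pathlib.Path(metadata_path)
    metadata = yaml.safe_load(metadata_path.read_text())
    for database in metadata.get('databases', {}).values():
        for query in database.get('queries', {}).values():
            if isinstance(query, dict) and 'sql_file' in query:
                sql_path = metadata_path.parent / query.pop('sql_file')
                query['sql'] = sql_path.read_text()
    return metadata
```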
A few reasons why I'd like to keep my canned queries SQL separate from metadata.yaml: - Keeping SQL in standalone SQL files means syntax highlighting and other text editor integrations in my code - Multiline strings in yaml, while functional, are a tad cumbersome and are hard to edit - Works well with other tools (can pipe `.sql` files into the `sqlite3` CLI, or use with other SQLite clients easier) - Typically my canned queries are quite long compared to everything else in my metadata.yaml, so I'd love to separate it where possible Let me know if this is a feature you'd like to see, I can try to send up a PR if this sounds right!",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1528/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 406055201,MDU6SXNzdWU0MDYwNTUyMDE=,406,Support nullable foreign keys in _labels mode,9599,simonw,closed,0,9599,simonw,,,2,2019-02-03T05:34:20Z,2019-11-02T22:39:28Z,2019-11-02T22:30:27Z,OWNER,,"Currently if there's a null in a foreign key we get ""None"" displayed in the inflated view: ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/406/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1486036269,I_kwDOBm6k_c5Ykx0t,1941,Mechanism for supporting key rotation for DATASETTE_SECRET,9599,simonw,open,0,,,,,1,2022-12-09T05:24:53Z,2022-12-09T05:25:20Z,,OWNER,,"Currently if you change `DATASETTE_SECRET` all existing signed tokens - both cookies and API tokens and potentially other things too - will instantly expire. Adding support for key rotation would allow keys to be rotated on a semi-regular basis without logging everyone out / invalidating every API token instantly. Can model this on how Django does it: https://github.com/django/django/commit/0dcd549bbe36c060f536ec270d34d9e7d4b8e6c7",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1941/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1823428714,I_kwDOBm6k_c5sr1Bq,2120,Add __all__ to datasette/__init__.py,9599,simonw,open,0,,,,,0,2023-07-27T01:07:10Z,2023-07-27T01:07:10Z,,OWNER,,"Currently looks like this: https://github.com/simonw/datasette/blob/08181823990a71ffa5a1b57b37259198eaa43e06/datasette/__init__.py#L1-L6 Adding `__all__ = [""Permission"", ""Forbidden""...]` would let me get rid of those `# noqa` comments.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2120/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 407174173,MDU6SXNzdWU0MDcxNzQxNzM=,408,"Show metadata info (e.g. license, source) on custom SQL query pages",78356,stefanw,closed,0,,,,,0,2019-02-06T10:43:34Z,2019-10-14T03:53:22Z,2019-10-14T03:53:22Z,NONE,,"Currently metadata info is not displayed on custom SQL pages. E.g. compare the footer of [this normal table page](https://register-of-members-interests.datasettes.com/regmem-98dc8b7/categories) with the footer [this custom SQL page](https://register-of-members-interests.datasettes.com/regmem-98dc8b7?sql=select+*+from+categories). 
This is important in order to adhere to attribution license requirements.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/408/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 953352015,MDU6SXNzdWU5NTMzNTIwMTU=,1404,`register_routes()` hook should take `datasette` argument,9599,simonw,closed,0,,,,,1,2021-07-26T23:00:33Z,2021-07-26T23:27:07Z,2021-07-26T23:26:00Z,OWNER,,Currently that plugin hook takes no arguments at all. This means it's not possible to conditionally register routes based on Datasette plugin configuration.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1404/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1726531350,I_kwDOBm6k_c5m6McW,2079,Datasette should serve Access-Control-Max-Age,9599,simonw,closed,0,,,,,8,2023-05-25T21:50:50Z,2023-05-25T22:56:28Z,2023-05-25T22:08:35Z,OWNER,,"Currently the CORS headers served are: https://github.com/simonw/datasette/blob/9584879534ff0556e04e4c420262972884cac87b/datasette/utils/__init__.py#L1139-L1143 Serving `Access-Control-Max-Age: 600` would allow browsers to cache that for 10 minutes, avoiding additional CORS pre-flight OPTIONS requests during that time.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2079/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 316526433,MDU6SXNzdWUzMTY1MjY0MzM=,234,label_column option in metadata.json,9599,simonw,closed,0,,,,,3,2018-04-21T21:19:08Z,2018-04-22T20:47:12Z,2018-04-22T20:47:12Z,OWNER,,"Currently the column used for displaying a foreign key relationship is automatically detected by `inspect()` by looking for tables that have a primary key column and one other column. This doesn't work for tables with more than two columns. Let's allow the table section in `metadata.json` to optionally define a `label_column` which, if present, will be used for those displays.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/234/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 724878151,MDU6SXNzdWU3MjQ4NzgxNTE=,1032,Bring date parsing into Datasette core,9599,simonw,open,0,,,,,8,2020-10-19T18:30:45Z,2020-10-19T19:37:55Z,,OWNER,,"Currently this is mainly handled by a plugin - https://github.com/simonw/datasette-dateutil - but I realise now that this really needs to be core functionality. See also Twitter thread: https://twitter.com/simonw/status/1318234808653213696",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1032/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 808843401,MDU6SXNzdWU4MDg4NDM0MDE=,1226,--port option should validate port is between 0 and 65535,9599,simonw,closed,0,,,,,4,2021-02-15T22:01:33Z,2021-02-18T18:41:27Z,2021-02-18T18:41:27Z,OWNER,,"Currently throws an ugly error message: ``` (datasette-graphql) datasette-graphql % datasette fivethirtyeight.db -p 80094 INFO: Started server process [45497] INFO: Waiting for application startup. 
INFO: Application startup complete. Traceback (most recent call last): File ""/Users/simon/.local/share/virtualenvs/datasette-graphql-n1OSJCS8/bin/datasette"", line 8, in sys.exit(cli()) ... server = await loop.create_server( File ""/Users/simon/.pyenv/versions/3.8.2/lib/python3.8/asyncio/base_events.py"", line 1461, in create_server sock.bind(sa) OverflowError: bind(): port must be 0-65535. ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1226/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 811589344,MDU6SXNzdWU4MTE1ODkzNDQ=,1235,Upgrade Python version used by official Datasette Docker image,9599,simonw,closed,0,,,,,2,2021-02-19T00:47:40Z,2021-02-19T01:48:31Z,2021-02-19T01:48:30Z,OWNER,,"Currently uses 3.7.2: https://github.com/simonw/datasette/blob/73bed175631a79e13a521eee82f8451dd0477eb3/Dockerfile#L1 There's a security fix for Python which it would be good to ship in this image (even though I'm reasonably confident it doesn't affect Datasette): https://bugs.python.org/issue42938",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1235/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 634112607,MDU6SXNzdWU2MzQxMTI2MDc=,812,Ability to customize what happens when a view permission fails,9599,simonw,closed,0,,,5533512,Datasette 0.45,3,2020-06-08T04:26:14Z,2020-07-01T04:17:46Z,2020-07-01T04:17:45Z,OWNER,,"Currently view permission failures raise a `Forbidden` error which is transformed into a 403. It would be good if this page could offer a way forward - maybe just by linking to (or redirecting to) a login screen. This behaviour will vary based on authentication plugins, so a new plugin hook is probably the best way to do this.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/812/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 626211658,MDU6SXNzdWU2MjYyMTE2NTg=,778,Ability to configure keyset pagination for views and queries,9599,simonw,open,0,,,,,1,2020-05-28T04:48:56Z,2020-10-02T02:26:25Z,,OWNER,,"Currently views offer pagination, but it uses offset/limit - e.g. https://latest.datasette.io/fixtures/paginated_view?_next=100 This means pagination will perform poorly on deeper pages. If a view is based on a table that has a primary key it should be possible to configure efficient keyset pagination that works the same way that table pagination works. This may be as simple as configuring a column that can be treated as a ""primary key"" for the purpose of pagination using `metadata.json` - or with a `?_view_pk=colname` querystring argument.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/778/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 725996507,MDU6SXNzdWU3MjU5OTY1MDc=,1036,Make it possible to download BLOB data from the Datasette UI,9599,simonw,closed,0,,,6026070,0.51,16,2020-10-20T22:47:56Z,2021-01-18T17:45:00Z,2020-10-25T00:14:52Z,OWNER,,"Currently you can only extract binary BLOB data as base64-encoded JSON, which is not user friendly at all. 
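A bare-bones sketch of the download response this needs, written as a plain ASGI handler (the bytes and filename below are placeholders, not Datasette's actual route):

```python
async def download_blob(scope, receive, send):
    # value stands in for the BLOB bytes fetched from SQLite for the requested row/column
    value = b'\x89PNG\r\n\x1a\n...'
    await send({
        'type': 'http.response.start',
        'status': 200,
        'headers': [
            # octet-stream plus an attachment disposition makes browsers download the
            # bytes instead of trying to render them, which should blunt the XSS concern
            (b'content-type', b'application/octet-stream'),
            (b'content-disposition', b'attachment; filename=""blob.bin""'),
        ],
    })
    await send({'type': 'http.response.body', 'body': value})
```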
It should always be possible for end-users to get the binary data out. I'm worried about XSS vulnerabilities here, but hopefully sending `Content-Type: application/octet-stream` helps there? Need to research that.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1036/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 726094754,MDU6SXNzdWU3MjYwOTQ3NTQ=,1037,Add horizontal scrollbar to tables,9599,simonw,closed,0,,,6026070,0.51,3,2020-10-21T03:13:34Z,2020-10-27T20:52:04Z,2020-10-21T03:16:36Z,OWNER,,Currently you have to scroll the entire page sideways if a table is wide. Make the table `overflow-x: auto` instead.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1037/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 785573793,MDU6SXNzdWU3ODU1NzM3OTM=,1186,"script type=""module"" support",9599,simonw,closed,0,,,6346396,Datasette 0.54,1,2021-01-14T01:17:47Z,2021-01-24T21:21:41Z,2021-01-14T01:50:58Z,OWNER,,"Custom JavaScript can be loaded in `metadata.json` like this: ```json { ""extra_js_urls"": [ { ""url"": ""https://code.jquery.com/jquery-3.2.1.slim.min.js"", ""sri"": ""sha256-k2WSCIexGzOj3Euiig+TlR8gA0EmPjuc79OEeY5L45g="" } ] } ``` Add a `""module"": true` option which causes the resulting script element to use ` {% endfor %} {% block extra_head %}{% endblock %} {% endblock %} {% block extra_body_end %} {% include ""_close_open_menus.html"" %} {% for body_script in body_scripts %} {% endfor %} {% endblock %} ``` But this resulted in pages looking like this: Note that the cog menu is broken and the filter UI is unstyled. To get these working correctly I would need to copy over a whole lot of Datasette's default CSS - and that means that when Datasette changes in the future those pages could break in subtle ways.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1149/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 268462768,MDU6SXNzdWUyNjg0NjI3Njg=,38,Experiment with patterns for concurrent long running queries,9599,simonw,closed,0,,,,,5,2017-10-25T16:23:42Z,2018-05-28T20:47:31Z,2018-05-28T20:47:31Z,OWNER,,I want to understand how the system could perform under load with many concurrent long-running queries. Can we serve these without blocking the event loop?,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/38/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 459509126,MDU6SXNzdWU0NTk1MDkxMjY=,516,Enforce import sort order with isort,9599,simonw,open,0,,,,,8,2019-06-22T20:35:50Z,2023-08-23T02:15:36Z,,OWNER,,"I want to use isort to order imports. 
A few steps here: - [x] Add a .isort.cfg file (see below) - [x] Use `isort -rc` to reformat existing code - [ ] Commit this change - [x] Add a unit test that ensures future changes remain isort compatible",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/516/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 508100844,MDU6SXNzdWU1MDgxMDA4NDQ=,598,Character encoding bug with CSV export,46313,JoeGermuska,closed,0,,,,,1,2019-10-16T21:09:30Z,2021-06-17T18:13:20Z,2019-10-18T22:52:21Z,NONE,,"I was just poking around, and at [this URL](https://sql-murder-mystery.datasette.io/sql-murder-mystery/crime_scene_report.csv?_stream=on&type=arson&_size=max), I encountered this error: ``` 'latin-1' codec can't encode character '\u2019' in position 27: ordinal not in range(256) ``` ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/598/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 645975649,MDU6SXNzdWU2NDU5NzU2NDk=,867,register_routes() should support non-async view functions too,9599,simonw,closed,0,,,5533512,Datasette 0.45,1,2020-06-26T03:11:25Z,2020-06-27T18:30:41Z,2020-06-27T18:30:40Z,OWNER,,"I was looking at this: https://github.com/simonw/datasette-block-robots/blob/main/datasette_block_robots/__init__.py ```python from datasette import hookimpl from datasette.utils.asgi import Response async def robots_txt(): return Response.text(""User-agent: *\nDisallow: /"") @hookimpl def register_routes(): return [ (r""^/robots\.txt$"", robots_txt), ] ``` And I realized that if `register_routes()` could support non-async view functions it could be reduced to this: ```python @hookimpl def register_routes(): return [ (r""^/robots\.txt$"", lambda: Response.text(""User-agent: *\nDisallow: /"")), ] ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/867/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 459621683,MDU6SXNzdWU0NTk2MjE2ODM=,521,Easier way of creating custom row templates,9599,simonw,closed,0,,,,,6,2019-06-23T21:49:27Z,2019-07-03T03:23:56Z,2019-07-03T03:23:56Z,OWNER,,"I was messing around with a custom `_rows_and_columns.html` template and ended up with this: ```html {% for row in display_rows %}

  {% for cell in row %}
    {% if cell.column == ""First_Name"" %}
      {{ cell.value }}
    {% elif cell.column == ""Last_Name"" %}
      {{ cell.value }}
    {% elif cell.column == ""Short_Description"" %}
      {{ cell.column }}: {{ cell.value }}
    {% else %}
      {{ cell.column }}: {{ cell.value }}
    {% endif %}
  {% endfor %}
{% endfor %} ``` This is nasty. I'd like to be able to do something like this instead: ``` {% for row in display_rows %}
  {{ row[""First_Name""] }} {{ row[""Last_Name""] }}
... ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/521/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 592829135,MDU6SXNzdWU1OTI4MjkxMzU=,713,Support YAML in metadata - metadata.yaml,9599,simonw,closed,0,,,,,6,2020-04-02T18:10:05Z,2020-04-02T19:36:17Z,2020-04-02T19:30:55Z,OWNER,,"I was originally going to do this with a plugin - see #357 - but the more I work with `metadata.json` the more I want it to just accept YAML as an optional alternative to JSON. The best example why is still this one: https://github.com/simonw/russian-ira-facebook-ads-datasette/blob/master/russian-ads-metadata.yaml YAML is just SO much better than JSON for multi-line strings - in particular HTML and SQL, both of which are common in `metadata.json` files.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/713/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1083581011,I_kwDOBm6k_c5AliJT,1564,_prepare_connection not called on write connections,9599,simonw,closed,0,,,7571612,Datasette 0.60,1,2021-12-17T20:06:47Z,2022-01-20T21:29:43Z,2021-12-18T01:58:44Z,OWNER,,"I was trying to initalize SpatiaLite in a write connection: ```pycon >>> from datasette.app import Datasette >>> ds = Datasette(memory=True, files=[], sqlite_extensions=[""spatialite""]) >>> db = ds.add_memory_database('geo') >>> await db.execute_write(""select InitSpatialMetadata(1)"") UUID('3f143baa-4e3d-5842-a36f-4fa2f683b72f') no such function: InitSpatialMetadata ``` It looks like the code that loads additional modules only works on read-only connections, not on write connections: https://github.com/simonw/datasette/blob/92a5280d2e75c39424a75ad6226fc74400ae984f/datasette/database.py#L146-L153 Compared to: https://github.com/simonw/datasette/blob/92a5280d2e75c39424a75ad6226fc74400ae984f/datasette/database.py#L124-L132",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1564/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 400511206,MDU6SXNzdWU0MDA1MTEyMDY=,403,How does persistence work?,1794527,ccorcos,closed,0,,,,,2,2019-01-17T23:41:57Z,2019-01-19T05:47:55Z,2019-01-18T06:51:14Z,NONE,,I was under the impression that now.sh is for stateless microservices. So where are these SQLite databases stored and when do they get created and destroyed?,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/403/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 639072811,MDU6SXNzdWU2MzkwNzI4MTE=,849,Rename master branch to main,9599,simonw,closed,0,,,3268330,Datasette 1.0,10,2020-06-15T19:05:54Z,2022-10-27T13:57:08Z,2020-09-15T20:37:14Z,OWNER,,"I was waiting for consensus to form around this (and kind-of hoping for `trunk` since I like the tree metaphor) and it looks like `main` is it. I've seen convincing arguments against `trunk` too - it indicates that the branch has some special significance like in Subversion (where all branches come from trunk) when it doesn't. 
So `main` is better anyway.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/849/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 625930207,MDU6SXNzdWU2MjU5MzAyMDc=,770,register_output_renderer can_render mechanism,9599,simonw,closed,0,,,5471110,Datasette 0.43,4,2020-05-27T18:29:14Z,2020-05-28T05:57:16Z,2020-05-28T05:57:16Z,OWNER,,"I would like is the ability for renderers to opt-in / opt-out of being displayed as options on the page. https://www.niche-museums.com/browse/museums for example shows a atom link because the datasette-atom plugin is installed... but clicking it will give you a 400 error because the correct columns are not present. Here's the code that passes a list of renderers to the template: https://github.com/simonw/datasette/blob/2d099ad9c657d2cab59de91cdb8bfed2da236ef6/datasette/views/base.py#L411-L423 A renderer is currently defined as a two-key dictionary: ```python @hookimpl def register_output_renderer(datasette): return { 'extension': 'test', 'callback': render_test } ``` I can add a third key, `""should_suggest""` which is a function that returns `True` or `False` for a given query. If that key is missing it is assumed to return `True`. One catch: what arguments should be passed to the `should_suggest(...)` function? UPDATE: now calling it `can_render` instead. _Originally posted by @simonw in https://github.com/simonw/datasette/issues/581#issuecomment-634856748_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/770/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1148638868,I_kwDOBm6k_c5EdtaU,1639,Make datasette-redirect-forbidden unneccessary,9599,simonw,open,0,,,,,0,2022-02-23T22:18:46Z,2022-02-23T22:18:46Z,,OWNER,,"I wrote `datasette-redirect-forbidden` today because I needed 403 errors to redirect to `/-/login` and it was the quickest way to solve that problem. This should be a feature of Datasette core. - https://github.com/simonw/datasette-redirect-forbidden/issues/2",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1639/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 679700269,MDU6SXNzdWU2Nzk3MDAyNjk=,938,Pass columns to extra CSS/JS/etc plugin hooks,9599,simonw,closed,0,,,,,3,2020-08-16T06:37:47Z,2020-09-30T20:36:12Z,2020-08-16T18:09:59Z,OWNER,,"I'd like `datasette-cluster-map` to only add links to JavaScript on pages that have tables with latitude and longitude columns. Passing the names of the columns to the plugin hook can support this and will be backwards compatible thanks to pluggy.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/938/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 315738696,MDU6SXNzdWUzMTU3Mzg2OTY=,226,Unit tests for installable plugins,9599,simonw,closed,0,,,,,2,2018-04-19T06:05:32Z,2020-11-24T19:52:51Z,2020-11-24T19:52:46Z,OWNER,,"I'd like more thorough unit test coverage of the plugins mechanism - in particular for installable plugins. 
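A minimal `setup.py` for an installable test plugin might look something like this (the names are placeholders; the `datasette` entry point group is the same one real plugins register under):

```python
from setuptools import setup

setup(
    name='datasette-example-plugin',
    version='0.1',
    py_modules=['example_plugin'],
    # Datasette discovers installed plugins through the 'datasette' entry point group
    entry_points={'datasette': ['example_plugin = example_plugin']},
)
```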
I think I can do this while still having the code live in the same repo, by creating a subdirectory in tests/example_plugin with its own setup.py and then running `python setup.py install` as part of the test runner. I imagine I will need to bump the version number every time I change the plugin in case someone runs the test again in the same virtual environment. If that doesn't work I can instead ship a datasette-plugins-tests two to PyPI and add that as a tests_require dependency. Refs #14",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/226/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 635147716,MDU6SXNzdWU2MzUxNDc3MTY=,825,Way to enable a default=False permission for anonymous users,9599,simonw,closed,0,,,5512395,Datasette 0.44,6,2020-06-09T06:26:27Z,2020-06-09T17:19:19Z,2020-06-09T17:01:10Z,OWNER,,"I'd like plugins to be able to ship with a default that says ""anonymous users cannot do this"", but allow site administrators to over-ride that such that anonymous users can use the feature after all. This is tricky because right now the anonymous user doesn't have an actor dictionary at all, so there's no key to match to an allow block.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/825/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1059549523,I_kwDOBm6k_c4_J3FT,1526,"Add to vercel.json, rather than overwriting it.",192568,mroswell,closed,0,,,,,2,2021-11-22T00:47:12Z,2021-11-22T04:49:45Z,2021-11-22T04:13:47Z,CONTRIBUTOR,,"I'd like to be able to add to vercel.json. But Datasette overwrites whatever I put in that file. I originally reported this here: https://github.com/simonw/datasette-publish-vercel/issues/51 In that case, I wanted to do a rewrite... and now I need to do 301 redirects (because we had to rename our site). Can this be addressed? ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1526/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 632843030,MDU6SXNzdWU2MzI4NDMwMzA=,807,Ability to ship alpha and beta releases,9599,simonw,closed,0,,,5533512,Datasette 0.45,18,2020-06-07T00:12:55Z,2020-06-18T21:41:16Z,2020-06-18T21:41:16Z,OWNER,,I'd like to be able to ship alphas and betas to PyPI so in-development plugins can depend on them and help test unreleased plugin hooks.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/807/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 760605882,MDU6SXNzdWU3NjA2MDU4ODI=,1135,Feature: --create option to create database file if it does not yet exist,9599,simonw,closed,0,,,,,0,2020-12-09T19:23:58Z,2021-01-24T21:19:39Z,2020-12-09T19:45:52Z,OWNER,,"I'd like to be able to tell people to run the following in the Datasette documentation to get started: brew install datasette datasette install datasette-upload-csvs datasette data.db --create --root --open This would give them a local Datasette instance with the ability to drag-and-drop CSV files directly into it. 
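The per-file behaviour is small - a hedged sketch of what `--create` could do for each missing path:

```python
import pathlib
import sqlite3

def ensure_database_exists(path):
    # Hedged sketch of what --create could do for each file named on the command line
    p = pathlib.Path(path)
    if not p.exists():
        conn = sqlite3.connect(str(p))
        # vacuum forces SQLite to write the file header, so the new file is a
        # real (empty) database rather than a zero-byte placeholder
        conn.execute('vacuum')
        conn.close()
```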
Just one catch: I don't want to have to talk them through creating an empty SQLite database file. So I want to add a new `--create` option which means ""If any of the database files passed on the command-line do not yet exist, create them"".",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1135/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 324835838,MDU6SXNzdWUzMjQ4MzU4Mzg=,276,Handle spatialite geometry columns better,45057,russss,closed,0,,,,,21,2018-05-21T08:46:55Z,2022-03-21T22:22:20Z,2022-03-21T22:22:20Z,CONTRIBUTOR,,"I'd like to see spatialite geometry columns rendered more sensibly - at the moment they come through as well-known-binary unless you use custom SQL, and WKB isn't of much use to anyone on the web. In HTML: they should be shown either as simple lat/long (if it's just a point, for example), or as a sensible placeholder if they're more complex geometries. In JSON: they should be GeoJSON geometries, (which means they can be automatically fed into a leaflet map with no further messing around). In CSV: they should be WKT. I briefly wondered if this should go into a plugin, but I suspect it needs hooking in at a deeper level than the plugin architecture will support any time soon.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/276/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1200649889,I_kwDOBm6k_c5HkHah,1710,Guide for plugin authors to upgrade their plugins for 1.0,9599,simonw,closed,0,,,,,1,2022-04-11T22:58:25Z,2022-04-11T23:04:01Z,2022-04-11T23:03:25Z,OWNER,,I'll also encourage testing against both Datasette 0.x and Datasette 1.0 using a GitHub Actions matrix.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1710/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 860722711,MDU6SXNzdWU4NjA3MjI3MTE=,1301,Publishing to cloudrun with immutable mode?,5413548,louispotok,open,0,,,,,1,2021-04-18T17:51:46Z,2022-10-07T02:38:04Z,,CONTRIBUTOR,,"I'm a bit confused about immutable mode and publishing to cloudrun. (I want to publish with immutable mode so that I can support database downloads.) Running `datasette publish cloudrun --extra-options=""-i example.db""` leads to an error: > Error: Invalid value for '-i' / '--immutable': Path 'example.db' does not exist. However, running `datasette publish cloudrun example.db` not only works but seems to publish in immutable mode anyway! I'm seeing this both with `/-/databases.json` and the fact that downloads are working. When I just `datasette serve` locally, this succeeds both ways and works as expected.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1301/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1450312343,I_kwDOBm6k_c5WcgKX,1892,Merge 1.0-dev branch back to main,9599,simonw,closed,0,,,8658075,Datasette 1.0a0,3,2022-11-15T20:04:25Z,2022-11-29T19:40:23Z,2022-11-29T19:40:23Z,OWNER,,"I'm committed enough to the 1.0 work now that I'm ready for the `main` branch to reflect that instead. 
If I need to make any dot-releases against 0.63 I can do those from a branch.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1892/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 706486323,MDU6SXNzdWU3MDY0ODYzMjM=,973,'bool' object is not callable error,9599,simonw,closed,0,,,5971510,Datasette 0.50,2,2020-09-22T15:30:54Z,2020-10-08T23:54:32Z,2020-09-22T15:40:35Z,OWNER,,"I'm getting this when latest is deployed to Cloud Run: ``` Traceback (most recent call last): File ""/usr/local/bin/datasette"", line 8, in sys.exit(cli()) File ""/usr/local/lib/python3.8/site-packages/click/core.py"", line 829, in __call__ return self.main(*args, **kwargs) File ""/usr/local/lib/python3.8/site-packages/click/core.py"", line 782, in main rv = self.invoke(ctx) File ""/usr/local/lib/python3.8/site-packages/click/core.py"", line 1259, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/usr/local/lib/python3.8/site-packages/click/core.py"", line 1066, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/usr/local/lib/python3.8/site-packages/click/core.py"", line 610, in invoke return callback(*args, **kwargs) File ""/usr/local/lib/python3.8/site-packages/datasette/cli.py"", line 406, in serve inspect_data = json.load(open(inspect_file)) TypeError: 'bool' object is not callable ``` I think I may have broken things in #970 - a980199e61fe7ccf02c2123849d86172d2ae54ff",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/973/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 267713226,MDU6SXNzdWUyNjc3MTMyMjY=,15,Support multiple databases,9599,simonw,closed,0,,,2857392,Ship first public release,0,2017-10-23T15:29:51Z,2017-10-24T02:01:38Z,2017-10-24T02:01:38Z,OWNER,,"I'm going to loop through every database file in the app root directory and bundle all of them. Each one will be accessible at /databasename Note this is without the file extension, and we will disallow multiple files with the same name but different extensions. Supported extensions to start with will be `.db` and `.sqlite` and `.sqlite3`",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/15/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1501713288,I_kwDOBm6k_c5ZglOI,1963,0.63.3 bugfix release,9599,simonw,closed,0,,,,,2,2022-12-18T02:48:15Z,2022-12-18T03:26:55Z,2022-12-18T03:26:55Z,OWNER,,"I'm going to ship a release which back-ports these two fixes: - https://github.com/simonw/datasette/issues/1958 - https://github.com/simonw/datasette/issues/1955",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1963/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1874255116,I_kwDOBm6k_c5vtt0M,2164,Ability to only load a specific list of plugins,9599,simonw,closed,0,,,,,1,2023-08-30T19:33:41Z,2023-09-08T04:35:46Z,2023-08-30T22:12:27Z,OWNER,,"I'm going to try and get this working through an environment variable, so that you can start Datasette and it will only load a subset of plugins including those that use the `register_commands()` hook. 
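A hedged sketch of how the environment variable could be applied using pluggy's blocking mechanism (the `DATASETTE_LOAD_PLUGINS` name and matching on entry point names are assumptions for illustration):

```python
import importlib.metadata
import os

import pluggy

pm = pluggy.PluginManager('datasette')

# Hypothetical variable: a comma-separated allow-list of plugin entry point names
allow = os.environ.get('DATASETTE_LOAD_PLUGINS')
if allow is not None:
    allowed = {name.strip() for name in allow.split(',') if name.strip()}
    for distribution in importlib.metadata.distributions():
        for entry_point in distribution.entry_points:
            # Block every installed 'datasette' entry point that is not on the allow-list;
            # pluggy then skips it in load_setuptools_entrypoints()
            if entry_point.group == 'datasette' and entry_point.name not in allowed:
                pm.set_blocked(entry_point.name)

pm.load_setuptools_entrypoints('datasette')
```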
Initial research on this: - https://github.com/pytest-dev/pluggy/issues/422",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2164/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1468709531,I_kwDOBm6k_c5Xirqb,1915,Interactive demo of Datasette 1.0 write APIs,9599,simonw,closed,0,,,,,6,2022-11-29T21:16:03Z,2022-11-30T04:05:46Z,2022-11-30T04:05:46Z,OWNER,,"I'm going to try to get this working on https://latest.datasette.io/ - it already has a way for people to sign in as root, but none of the databases there are writable. So I'm going to build a plugin which adds a writable named in-memory database. And some kind of mechanism for clearing out that database on a regular basis - maybe tables in that database get deleted automatically an hour after they are created? (Would be neat to display their time-left-until-deleted too)",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1915/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1200649124,I_kwDOBm6k_c5HkHOk,1708,Datasette 1.0 alpha upcoming release notes,9599,simonw,open,0,,,8755003,Datasette 1.0a-next,2,2022-04-11T22:57:12Z,2022-12-13T05:29:06Z,,OWNER,,"I'm going to try writing the release notes first, to see if that helps unblock me. # ⚠️ Any release notes in this issue are a draft, and should not be treated as the real thing ⚠️ ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1708/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 542553350,MDU6SXNzdWU1NDI1NTMzNTA=,655,Copy and paste doesn't work reliably on iPhone for SQL editor,9599,simonw,closed,0,,,3268330,Datasette 1.0,3,2019-12-26T13:15:10Z,2020-09-30T20:36:12Z,2020-08-30T17:51:40Z,OWNER,,I'm having a lot of trouble copying and pasting from the codemirror editor on my iPhone.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/655/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 518506242,MDU6SXNzdWU1MTg1MDYyNDI=,616,Datasette FTS detection bug,49656826,null92,closed,0,,,,,2,2019-11-06T14:25:47Z,2019-11-08T15:31:33Z,2019-11-08T02:06:56Z,NONE,,"I'm having trouble with datasette. I deployed EXACTLY the same project on two different apps on Heroku. Both have databases (not all) with FTS activated but only one detects and works fine. 
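For reference, Datasette decides whether to offer search by looking for virtual tables declared with `USING FTS` in `sqlite_master`, so a rough way to compare what the two deployments actually contain is something like this (an approximation, not Datasette's exact query, and the filename is a placeholder):

```python
import sqlite3

# List tables that look FTS-configured in a local copy of the database.
conn = sqlite3.connect('my-database.db')
rows = conn.execute(
    'select name, sql from sqlite_master where sql like :p',
    {'p': '%USING FTS%'},
).fetchall()
for name, sql in rows:
    print(name)
    print(sql)
```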
You can take a look here: With search: http://teste-templates.herokuapp.com/amazonia_protege/car Without search: http://bases.vortex.media/amazonia_protege/car ![teste](https://user-images.githubusercontent.com/49656826/68306310-11a80e00-0088-11ea-8d1c-db3bd3375518.jpg) ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/616/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1858228057,I_kwDOBm6k_c5uwk9Z,2147,Plugin hook for database queries that are run,18899,jackowayed,open,0,,,,,6,2023-08-20T18:43:50Z,2023-08-24T03:54:35Z,,NONE,,"I'm interested in making a plugin that saves every query that gets run to a table in the database. (I know about datasette-query-history but thought it would be good to have a server-side option.) As far as I can tell reading the docs, there isn't really a hook set up to allow this. Maybe I could hack it with some of the hooks that are passed requests, but that doesn't seem good. I'm a little surprised this isn't possible, so I thought I would open an issue and see if that's a deeply considered decision or just ""haven't needed it yet."" I'm potentially interested in implementing the hook if the latter.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2147/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 327541975,MDU6SXNzdWUzMjc1NDE5NzU=,300,Hide sort select box on larger screens,9599,simonw,closed,0,,,,,0,2018-05-30T01:34:59Z,2018-05-31T14:43:13Z,2018-05-31T14:43:13Z,OWNER,,"On larger screens you can sort by clicking column headers, so no need to show the select box (which was added for the small screen layout that doesn't show headers)",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/300/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 642651572,MDU6SXNzdWU2NDI2NTE1NzI=,860,Plugin hook for instance/database/table metadata,9599,simonw,closed,0,,,,,10,2020-06-21T22:20:25Z,2022-01-13T22:21:42Z,2021-06-26T22:56:28Z,OWNER,,"I'm not happy with how `metadata.(json|yaml)` keeps growing new features. Rather than having a single plugin hook for all of `metadata.json` I'm going to split out the feature that shows actual real metadata for tables and databases - `source`, `license` etc - into its own plugin-powered mechanism. _Originally posted by @simonw in https://github.com/simonw/datasette/issues/357#issuecomment-647189045_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/860/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1552368054,I_kwDOBm6k_c5ch0G2,2000,rewrite_sql hook,193185,cldellow,open,0,,,,,1,2023-01-23T01:02:52Z,2023-01-23T06:08:01Z,,CONTRIBUTOR,,"I'm not sold that this is a good idea, but thought it'd be worth writing up a ticket. Proposal: add a hook like ```python def rewrite_sql(datasette, database, request, fn, sql, params) ``` It would be called from Database.execute, Database.execute_write, Database.execute_write_script, Database.execute_write_many before running the user's SQL. 
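As a sketch of the kind of plugin this would enable, assuming the hook landed with roughly that signature (the table name below is made up for illustration):

```python
from datasette import hookimpl


@hookimpl
def rewrite_sql(datasette, database, request, fn, sql, params):
    # Veto queries that mention a (made-up) sensitive table,
    # pass everything else through unchanged.
    if 'secret_audit_log' in sql:
        raise PermissionError('queries against secret_audit_log are not allowed')
    return sql
```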
`fn` would indicate which method was being used, in case that's relevant for the SQL inspection -- for example `execute` only permits a single statement. The hook could return a SQL statement to be executed instead, or an async function to be awaited on that returned the SQL to be executed. Plugins that could be written with this hook: - https://github.com/cldellow/datasette-ersatz-table-valued-functions would use this to avoid monkey-patching - a plugin to inspect and reject unsafe Spatialite function calls (reported by [Simon in Discord](https://discord.com/channels/823971286308356157/823971286941302908/1066438832293159004)) - a plugin to do more general rewrites of queries to enforce table or row-level security, for example, based on the currently logged in actor's ID - a plugin to maintain audit tables when users write to a table - a plugin to cache expensive queries (eg the queries that drive facets) - these could allow stale reads if previously cached, then refresh them in an offline queue Flaws with this idea: `execute_fn` and `execute_write_fn` would not go through this hook, which limits the guarantees you can make about it for security purposes.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2000/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 638238548,MDU6SXNzdWU2MzgyMzg1NDg=,845,Code coverage should ignore files in .coveragerc,9599,simonw,open,0,,,,,0,2020-06-13T21:45:42Z,2020-06-13T21:46:03Z,,OWNER,,"I'm not sure why this is, but the code coverage I have running in a GitHub Action doesn't take my `.coveragerc` file into account. It should: https://github.com/simonw/datasette/blob/cf7a2bdb404734910ec07abc7571351a2d934828/.github/workflows/test-coverage.yml#L31-L35 Here's the bit that's ignored: https://github.com/simonw/datasette/blob/cf7a2bdb404734910ec07abc7571351a2d934828/.coveragerc#L1-L2 As a result my coverage score is 84%, when it should be 92%: ``` 2020-06-13T21:41:18.4404252Z ----------- coverage: platform linux, python 3.8.3-final-0 ----------- 2020-06-13T21:41:18.4404570Z Name Stmts Miss Cover 2020-06-13T21:41:18.4404971Z -------------------------------------------------------- 2020-06-13T21:41:18.4405227Z datasette/__init__.py 3 0 100% 2020-06-13T21:41:18.4405441Z datasette/__main__.py 3 3 0% 2020-06-13T21:41:18.4405668Z datasette/_version.py 279 279 0% 2020-06-13T21:41:18.4405921Z datasette/actor_auth_cookie.py 20 0 100% 2020-06-13T21:41:18.4406135Z datasette/app.py 499 27 95% 2020-06-13T21:41:18.4406343Z datasette/cli.py 162 45 72% 2020-06-13T21:41:18.4406553Z datasette/database.py 236 17 93% 2020-06-13T21:41:18.4406761Z datasette/default_permissions.py 40 0 100% 2020-06-13T21:41:18.4406975Z datasette/facets.py 210 24 89% 2020-06-13T21:41:18.4407186Z datasette/filters.py 122 7 94% 2020-06-13T21:41:18.4407394Z datasette/hookspecs.py 34 0 100% 2020-06-13T21:41:18.4407600Z datasette/inspect.py 36 23 36% 2020-06-13T21:41:18.4407807Z datasette/plugins.py 34 6 82% 2020-06-13T21:41:18.4408014Z datasette/publish/__init__.py 0 0 100% 2020-06-13T21:41:18.4408240Z datasette/publish/cloudrun.py 57 2 96% 2020-06-13T21:41:18.4408786Z datasette/publish/common.py 19 1 95% 2020-06-13T21:41:18.4409029Z datasette/publish/heroku.py 97 13 87% 2020-06-13T21:41:18.4409243Z datasette/renderer.py 63 4 94% 2020-06-13T21:41:18.4409450Z datasette/sql_functions.py 5 0 100% 2020-06-13T21:41:18.4410480Z datasette/tracer.py 87 16 82% 
2020-06-13T21:41:18.4410972Z datasette/utils/__init__.py 504 31 94% 2020-06-13T21:41:18.4411755Z datasette/utils/asgi.py 264 24 91% 2020-06-13T21:41:18.4412173Z datasette/utils/shutil_backport.py 44 44 0% 2020-06-13T21:41:18.4412822Z datasette/version.py 4 0 100% 2020-06-13T21:41:18.4413562Z datasette/views/__init__.py 0 0 100% 2020-06-13T21:41:18.4414276Z datasette/views/base.py 288 19 93% 2020-06-13T21:41:18.4414579Z datasette/views/database.py 120 2 98% 2020-06-13T21:41:18.4414860Z datasette/views/index.py 57 2 96% 2020-06-13T21:41:18.4415379Z datasette/views/special.py 72 16 78% 2020-06-13T21:41:18.4418994Z datasette/views/table.py 418 18 96% 2020-06-13T21:41:18.4428811Z -------------------------------------------------------- 2020-06-13T21:41:18.4430394Z TOTAL 3777 623 84% ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/845/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 637395097,MDU6SXNzdWU2MzczOTUwOTc=,838,Incorrect URLs when served behind a proxy with base_url set,79913,tsibley,closed,0,,,6026070,0.51,14,2020-06-11T23:58:55Z,2021-11-20T19:35:48Z,2021-11-20T19:35:48Z,NONE,,"I'm running `datasette serve --config base_url:/foo/ โ€ฆ`, proxying to it with this Apache config: ProxyPass /foo/ http://localhost:8001/ ProxyPassReverse /foo/ http://localhost:8001/ and then accessing it via `https://example.com/foo/`. Although many of the URLs in the pages are correct (presumably because they either use absolute paths which include `base_url` or relative paths), the faceting and pagination links still use fully-qualified URLs pointing at `http://localhost:8001`. I looked into this a little in the source code, and it seems to be an issue anywhere `request.url` or `request.path` is used, as these contain the values for the request between the frontend (Apache) and backend (Datasette) server. Those properties are primarily used via the `path_with_โ€ฆ` family of utility functions and the `Datasette.absolute_url` method.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/838/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 2029908157,I_kwDOBm6k_c54_fC9,2214,CSV export fails for some `text` foreign key references,2874,precipice,open,0,,,,,1,2023-12-07T05:04:34Z,2023-12-07T07:36:34Z,,NONE,,"I'm starting this issue without a clear reproduction in case someone else has seen this behavior, and to use the issue as a notebook for research. I'm using Datasette with the [SWITRS](https://iswitrs.chp.ca.gov/) data set, which is a California Highway Patrol collection of traffic incident data from the past decade or so. I receive data from them in CSV and want to work with it in Datasette, then export it to CSV for mapping in Felt.com. Their data makes extensive use of codes for incident column data (`1` for `Monday` and so on), some of it integer codes and some of it letter/text codes. The text codes are sometimes blank or `-`. During import, I'm creating lookup tables for foreign key references to make the Datasette UI presentation of the data easier to read. If I import the data and set up the integer foreign keys, everything works fine, but if I set up the text foreign keys, CSV export starts to fail. The foreign key configuration is as follows: ``` # Some tables use integer ids, like sensible tables do. 
Let's import them first # since we favor them. for TABLE in DAY_OF_WEEK CHP_SHIFT POPULATION SPECIAL_COND BEAT_TYPE COLLISION_SEVERITY do sqlite-utils create-table records.db $TABLE id integer name text --pk=id sqlite-utils insert records.db $TABLE lookup-tables/$TABLE.csv --csv sqlite-utils add-foreign-key records.db collisions $TABLE $TABLE id sqlite-utils create-index records.db collisions $TABLE done # *Other* tables use letter keys, like they were raised by WOLVES. Let's put them # at the end of the import queue. for TABLE in WEATHER_1 WEATHER_2 LOCATION_TYPE RAMP_INTERSECTION SIDE_OF_HWY \ PRIMARY_COLL_FACTOR PCF_CODE_OF_VIOL PCF_VIOL_CATEGORY TYPE_OF_COLLISION MVIW \ PED_ACTION ROAD_SURFACE ROAD_COND_1 ROAD_COND_2 LIGHTING CONTROL_DEVICE \ STWD_VEHTYPE_AT_FAULT CHP_VEHTYPE_AT_FAULT PRIMARY_RAMP SECONDARY_RAMP do sqlite-utils create-table records.db $TABLE key text name text --pk=key sqlite-utils insert records.db $TABLE lookup-tables/$TABLE.csv --csv sqlite-utils add-foreign-key records.db collisions $TABLE $TABLE key sqlite-utils create-index records.db collisions $TABLE done ``` You can see the full code and import script here: https://github.com/radical-bike-lobby/switrs-db If I run this code and then hit the CSV export link in the Datasette interface (the simple link or the ""advanced"" dialog), export fails after a small number of CSV rows are written. I am not seeing any detailed error messages but this appears in the logging output: ``` INFO: 127.0.0.1:57885 - ""GET /records/collisions.csv?_facet=PRIMARY_RD&PRIMARY_RD=ASHBY+AV&_labels=on&_size=max HTTP/1.1"" 200 OK Caught this error: ``` (No other output follows `error:` other than a blank line.) I've stared at the rows directly after the error occurs and can't yet see what is causing the problem. I'm going to set up a development environment and see if I get any more detailed error output, and then stare more at some problematic lines to see if I can get a simple reproduction.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2214/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 562787785,MDU6SXNzdWU1NjI3ODc3ODU=,667,Allow injecting configuration data from plugins,870184,xrotwang,closed,0,,,,,2,2020-02-10T19:50:15Z,2020-02-12T16:18:22Z,2020-02-12T09:21:22Z,NONE,,"I'm trying to customize datasette as explorer for [CLDF](https://cldf.clld.org) datasets. Such datasets can be converted automatically to SQLite, which then can be fed to datasette, (e.g. https://github.com/cldf/cookbook/blob/master/recipes/datasette/README.md). Part of this customization would be support for the ""special"" data types described in the [CLDF ontology](https://cldf.clld.org/v1.0/terms.rdf). But while rendering of the values can be customized via the `render_cell` hook in a plugin, e.g. custom labels for foreign keys must be specified through the config file. 
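For the rendering side, something along these lines already works today - the column name is just an example:

```python
from datasette import hookimpl
import markupsafe


@hookimpl
def render_cell(value, column):
    # Illustrative only: link CLDF Glottocode values to their Glottolog page.
    if column == 'Glottocode' and value:
        return markupsafe.Markup(
            '<a href=""https://glottolog.org/resource/languoid/id/{0}"">{0}</a>'.format(
                markupsafe.escape(value)
            )
        )
    return None  # fall back to Datasette's default rendering
```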
It would be nice to be able to programmatically inject config data from plugins as well.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/667/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 906385991,MDU6SXNzdWU5MDYzODU5OTE=,1349,CSV ?_stream=on redundantly calculates facets for every page,9599,simonw,closed,0,,,,,9,2021-05-29T06:11:23Z,2021-06-17T18:12:32Z,2021-06-01T15:52:53Z,OWNER,,"I'm trying to figure out why a full CSV export from https://covid-19.datasettes.com/covid/ny_times_us_counties runs unbearably slowly. It's because the streaming endpoint works by scrolling through every page, and it turns out every page calculates facets and suggested facets!",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1349/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1571711808,I_kwDOBm6k_c5drmtA,2018,`check_visibility` gives confusing (wrong?) results if permission is `None`,193185,cldellow,open,0,,,,,0,2023-02-06T01:03:08Z,2023-02-06T01:03:46Z,,CONTRIBUTOR,,"I'm trying to gate access to an edit UI on the user having `update-row` on the underlying view or table. I expected [datasette.check_visibility](https://docs.datasette.io/en/latest/internals.html#await-check-visibility-actor-action-none-resource-none-permissions-none) to be a good way to do this: ```python visible, private = await datasette.check_visibility( request.actor, permissions=[ (""update-row"", (database, table)), ], ) if not visible: return None ``` But `visible` is returning true, even when there is no explicit `update-row` permission. (In this case, `request.actor` is `None`.) Based on [the update-row permissions docs](https://docs.datasette.io/en/latest/authentication.html#update-row), I expected this to be default deny, and so no explicit permission would result in false. I think the root cause is that `check_visibility` calls `ensure_permissions` and expects it to throw if the permission is not available. But `ensure_permissions` does not throw when `permission_allowed` returns None: https://github.com/simonw/datasette/blob/1.0a2/datasette/app.py#L825-L829",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2018/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 626663119,MDU6SXNzdWU2MjY2NjMxMTk=,781,request.url and request.scheme should obey force_https_urls config setting,9599,simonw,closed,0,,,,,3,2020-05-28T16:54:47Z,2020-05-28T17:39:54Z,2020-05-28T17:10:13Z,OWNER,,"I'm trying to get the https://www.niche-museums.com/browse/feed.atom feed to validate and I git this from https://validator.w3.org/feed/check.cgi?url=https%3A%2F%2Fwww.niche-museums.com%2Fbrowse%2Ffeed.atom > This feed is valid, but interoperability with the widest range of feed readers could be improved by implementing the following recommendations. 
> > [line 6](https://validator.w3.org/feed/check.cgi?url=https%3A%2F%2Fwww.niche-museums.com%2Fbrowse%2Ffeed.atom#l6), column 73: Self reference doesn't match document location [[help](https://validator.w3.org/feed/docs/warning/SelfDoesntMatchLocation.html ""more information about this error"")] > > I tried to fix this using `force_https_urls` ([commit](https://github.com/simonw/museums/commit/5dc8e2c717c59f9e949b65e47a59878e01f929e4)) but it didn't work - because that setting isn't respected by the Request class: https://github.com/simonw/datasette/blob/40885ef24e32d91502b6b8bbad1c7376f50f2830/datasette/utils/asgi.py#L15-L32",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/781/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1181432624,I_kwDOBm6k_c5Gazsw,1688,[plugins][documentation] Is it possible to serve per-plugin static folders when writing one-off (single file) plugins?,9020979,hydrosquall,closed,0,,,,,3,2022-03-26T01:17:44Z,2022-03-27T01:01:14Z,2022-03-26T21:34:47Z,CONTRIBUTOR,,"I'm trying to make a small plugin that depends on static assets, by following the guide [here](https://docs.datasette.io/en/stable/writing_plugins.html#writing-one-off-plugins). I made a `plugins/` directory with `datasette_nteract_data_explorer.py`. I am trying to follow the example of `datasette_vega`, and serving static assets. I created a `statics/` directory within `plugins/` to serve my JS and CSS. https://github.com/simonw/datasette-vega/blob/00de059ab1ef77394ba9f9547abfacf966c479c4/datasette_vega/__init__.py#L13 Unfortunately, datasette doesn't seem to be able to find my assets. Input: ```bash datasette ~/Library/Safari/History.db --plugins-dir=plugins/ ``` ![Image 2022-03-25 at 9 18 17 PM](https://user-images.githubusercontent.com/9020979/160218979-a3ff474b-5255-4a76-85d1-6f90ab2e3b44.jpg) Output: ![Image 2022-03-25 at 9 11 00 PM](https://user-images.githubusercontent.com/9020979/160218733-ca5144cf-f23f-43d8-a8d3-e3a871e57f3a.jpg) I suspect this issue might go away if I move away from ""one-off"" plugin mode, but it's been a while since I created a new python package so I'm not sure how much work there is to go between ""one off"" and ""packaged for PyPI"". I'd like to try to avoid needing to repackage a new `tar.gz` file and or reinstall my library repeatedly when developing new python code. 1. Is there a way to serve a static assets when using the `plugins/` directory method instead of installing plugins as a new python package? 2. If not, is there a way I can work on developing a plugin without creating and repackaging tar.gz files after every change, or is that the recommended path? Thanks for your help! ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1688/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 457201907,MDU6SXNzdWU0NTcyMDE5MDc=,513,Is it possible to publish to Heroku despite slug size being too large?,7936571,chrismp,closed,0,,,,,2,2019-06-18T00:12:02Z,2019-06-21T22:35:54Z,2019-06-21T22:35:54Z,NONE,,"I'm trying to push more than 1.5GB worth of SQLite databases -- 535MB compressed -- to Heroku but I get this error when I run the `datasette publish heroku` command. Compiled slug size: 535.5M is too large (max is 500M). 
Can I publish the databases and make datasette work on Heroku despite the large slug size?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/513/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 502355384,MDU6SXNzdWU1MDIzNTUzODQ=,580,Testing utilities should be available to plugins,9599,simonw,closed,0,,,,,5,2019-10-03T23:58:26Z,2020-02-28T07:58:46Z,2020-02-28T07:58:46Z,OWNER,,"I'm trying to write a plugin at the moment ([datasette-atom](https://github.com/simonw/datasette-atom)) which needs to run unit tests against a full in-memory Datasette instance, in the same way that the Datasette test suite itself works. I got it working by creating copies of the [TestClient and TestResponse classes](https://github.com/simonw/datasette/blob/a314b761866d250c16f1ff6dd682010cf4181eb4/tests/fixtures.py#L22-L96) within the plugin itself: https://github.com/simonw/datasette-atom/commit/c0e3bd9556d7b31f253a8bf666d42205cd24f4fc#diff-33337525d2d877f7cc7f33737bfd2d7b I had to do this because those classes are in the `tests/` directory within Datasette, so they don't get included in the package that ships to PyPI. It would be better if these classes were included in the main package in a way that made it easy for plugins to reuse them to write their own tests.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/580/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 961367843,MDU6SXNzdWU5NjEzNjc4NDM=,1422,Ability to default to hiding the SQL for a canned query,9599,simonw,closed,0,,,,,4,2021-08-05T02:51:39Z,2021-08-07T05:32:29Z,2021-08-07T05:32:29Z,OWNER,,"I'm working on a project with some HUGE (400+ lines of SQL) canned queries right now. Any time you land on the canned query page you have to scroll down a long distance to get to the results! Would be useful to be able to default to https://latest.datasette.io/fixtures/magic_parameters?_hide_sql=1 without needing the parameter.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1422/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 453131917,MDU6SXNzdWU0NTMxMzE5MTc=,502,Exporting sqlite database(s)?,7936571,chrismp,closed,0,,,,,3,2019-06-06T16:39:53Z,2021-04-03T05:16:54Z,2019-06-11T18:50:42Z,NONE,,"I'm working on datasette from one computer. But if I want to work on it from another computer and want to copy the SQLite database(s) already on the Heroku datasette instance, how do I copy the database(s) to the second computer so that I can then update it and push it online via datasette's command line code that pushes code to Heroku?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/502/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 319449852,MDU6SXNzdWUzMTk0NDk4NTI=,247,SQLite code decoupled from Datasette,11912854,jsancho-gpl,open,0,,,,,1,2018-05-02T08:03:28Z,2018-05-21T15:29:31Z,,NONE,,"I'm working on the possibility of using Datasette with other file formats that aren't SQLite, like files with [PyTables](https://github.com/PyTables/PyTables) format. 
In order to accomplish that, I've started [a fork for decoupling the code related with SQLite](https://github.com/jsancho-gpl/datasette/tree/feature/db-type-plugin) and putting it in an external connector to allow future connectors for a lot of file formats. It'd be nice if you could look at it and suggest improvements for a possible PR.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/247/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 659580487,MDU6SXNzdWU2NTk1ODA0ODc=,897,Request method for retrieving the unparsed request body,9599,simonw,closed,0,,,,,1,2020-07-17T19:51:40Z,2020-07-17T20:16:02Z,2020-07-17T20:12:50Z,OWNER,,"I'm writing a plugin (https://github.com/simonw/datasette-update-api/issues/2) that implements an API for inserting JSON data. As such, I'd like to `POST` a JSON blob rather than using `key=value` form encoded data. Right now there's a `request.post_vars()` method but no `request.post_body()` one: https://github.com/simonw/datasette/blob/c5f06bc356fb5917ef7fbb6fe4693f30d711cdb3/datasette/utils/asgi.py#L93-L103",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/897/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 411257981,MDU6SXNzdWU0MTEyNTc5ODE=,412,Linked Data(sette),43340,sfkeller,open,0,,,,,2,2019-02-18T00:38:14Z,2019-03-19T10:09:46Z,,NONE,,"I've a radical feature idea (possible first as an extension in order to experiment?): I'd like to link to a remote table from a remote database, e.g. with a function ""linked_datasette()"". So one could do following query: ``` SELECT foo.id, foo.a, remote_party.b FROM foo JOIN linked_datasette(""https://parlgov.datasettes.com/parlgov-b42a2f2"") AS remote_party ON foo.id=remote_party.id ``` This is inspired by SPARQL's SERVICE keyword for remote RDF ""endpoints"". There's a foundation in the SQL Standard called SQL/MED (https://rhaas.blogspot.com/2011/01/why-sqlmed-is-cool.html ). And here's an implementation from me in Postgres FDW to connect another Postgres ""endpoint"": https://pastebin.com/Fz2v64Cz .",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/412/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1223234932,I_kwDOBm6k_c5I6RV0,1733,Get Datasette compatible with Pyodide,9599,simonw,closed,0,,,,,9,2022-05-02T19:01:58Z,2022-05-04T15:14:01Z,2022-05-02T20:15:27Z,OWNER,,"I've already got this working as a prototype. 
Here are the changes I had to make: - Replace the two dependencies that don't publish pure Python wheels to PyPI: `click-default-group` and `python-baseconv` - Get Datasette to work without threading - which it turns out is exclusively used for database connections - Make the `uvicorn` dependency optional (only needed when Datasette runs in the CLI) TODO: - [x] Switch to `click-default-group-wheel` - [x] https://github.com/simonw/datasette/issues/1734 - [x] Work around `uvicorn` import error - [x] https://github.com/simonw/datasette/issues/1735 - [x] #1737 Goal is to be able to do the following directly in https://pyodide.org/en/stable/console.html ```python import micropip await micropip.install(""datasette"") from datasette.app import Datasette ds = Datasette() await ds.client.get(""/.json"") ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1733/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1218133366,I_kwDOBm6k_c5Imz12,1728,Writable canned queries fail with useless non-error against immutable databases,127565,wragge,closed,0,,,8303187,Datasette 0.62,13,2022-04-28T03:10:34Z,2022-08-14T16:34:40Z,2022-08-14T16:34:40Z,CONTRIBUTOR,,"I've been banging my head against a wall for a while and would appreciate any pointers... - I have a writeable canned query to update rows in the db. - I'm using the github-oauth plugin for authentication. - I have `allow` set on the query to accept my GitHub id and a GH organisation. - Authentication seems to work as expected both locally and on Cloudrun -- viewing `/-/actor` gives the same result in both environments - I can access the 'padlocked' canned query in both environments. Everything seems to be the same, but the canned query works perfectly when run locally, and fails when I try it on Cloudrun. I'm redirected back to the canned query page and the db is not changed. There's nothing in the Cloudstor logs to indicate an error. Any clues as to where I should be looking for the problem?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1728/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 569613563,MDU6SXNzdWU1Njk2MTM1NjM=,682,Mechanism for writing to database via a queue,9599,simonw,closed,0,,,,,10,2020-02-24T03:10:07Z,2020-02-25T04:45:10Z,2020-02-25T04:45:10Z,OWNER,,"I've been mulling this over for a long time, and I have a new approach that I think is worth exploring. The catch with writing to SQLite is that it should only accept one write at a time. I'm now thinking that an easy way to manage that would be with a write queue for each database which is then read by a single dedicated write thread which manages its own writable connection.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/682/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1822934563,I_kwDOBm6k_c5sp8Yj,2109,Plan for getting the new JSON format query views working,9599,simonw,closed,0,,,9700784,Datasette 1.0a3,5,2023-07-26T18:20:18Z,2023-07-27T00:24:47Z,2023-07-26T18:25:34Z,OWNER,,"I've been stuck on this for too long. 
I'm breaking it down into a full milestone: https://github.com/simonw/datasette/milestone/29",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2109/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1212823665,I_kwDOBm6k_c5ISjhx,1715,Refactor TableView to use asyncinject,9599,simonw,closed,0,,,,,13,2022-04-22T21:43:39Z,2022-12-01T21:15:18Z,2022-04-28T22:26:56Z,OWNER,,"I've been working on a dependency injection mechanism in a separate library: - https://github.com/simonw/asyncinject I think it's ready to try out with Datasette to see if it's a pattern that will work here. I'm going to attempt to refactor `TableView` to use it. There are two overall goals here: - Use `asyncinject` to add parallel execution of some aspects of the table page - most notably I want to be able to execute the `count(*)` query, the `select ...` query, the various faceting queries and the facet suggestion queries in parallel - and measure if doing so is good for performance. - Use it to execute different output formats (possibly with some changes to the existing `register_output_renderer()` plugin hook). I want CSV and JSON to use the same mechanism that plugins use. Stretch goal is to get this working with streaming data too, see: - #1101",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1715/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1076388044,I_kwDOBm6k_c5AKGDM,1547,Writable canned queries fail to load custom templates,127565,wragge,closed,0,,,7571612,Datasette 0.60,6,2021-12-10T03:31:48Z,2022-01-13T22:27:59Z,2021-12-19T21:12:00Z,CONTRIBUTOR,,"I've created a canned query with `""write"": true` set. I've also created a custom template for it, but the template doesn't seem to be found. If I look in the HTML I see (`stock_exchange` is the db name): `` My non-writeable canned queries pick up custom templates as expected, and if I look at their HTML I see the canned query name added to the templates considered (the canned query here is `date_search`): `` So it seems like the writeable canned query is behaving differently for some reason. Is it an authentication thing? I'm using the built in `--root` authentication. Thanks! ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1547/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 273247186,MDU6SXNzdWUyNzMyNDcxODY=,68,Support for title/source/license metadata,9599,simonw,closed,0,,,2857392,Ship first public release,4,2017-11-12T17:04:21Z,2017-12-04T04:55:43Z,2017-11-13T15:26:11Z,OWNER,,"I've decided this is important for launch: I want to set a precedent for people citing, licensing and documenting their datasets. Not sure how best to go about supporting this. I'd like to allow for the following data to be optionally attached to any given database: - Title - Description, potentially in markdown? - Original source URL - License I'd also like the ability to attach descriptions to individual tables - and maybe even to table columns? The question then becomes: how should this information be stored. A few options: - In the SQLite database itself, in a specially named table. 
Problem here is that this means having to modify SQLite databases before publishing them. - In a separate SQLite database that can be published alongside the databases we are publishing. - In a JSON file. This is neat, but JSON files are not a great editing experience once you start including multiple lines (e.g. a markdown description). - In a YAML file. This is a better format for multi-line descriptions, but still isn't a great editing experience. Whatever the format, it can be made much more usable by offering a web-based editing UI for populating it (a special mode the server can be run in).",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/68/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 509339999,MDU6SXNzdWU1MDkzMzk5OTk=,600,Don't auto-format SQL on first page load,9599,simonw,closed,0,,,,,0,2019-10-18T22:36:10Z,2019-10-18T23:56:46Z,2019-10-18T23:56:46Z,OWNER,,"I've gone back and forth on this a bit, but I've decided I'm not keen on the way Datasette now automatically formats SQL when a query (or canned query) page first loads. I like having an optional ""Format SQL"" button, but applying formatting automatically means that if the user has carefully formatted their SQL to a specific style their formatting will be automatically over-ridden.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/600/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 459397625,MDU6SXNzdWU0NTkzOTc2MjU=,514,Documentation with recommendations on running Datasette in production without using Docker,7936571,chrismp,closed,0,,,5971510,Datasette 0.50,27,2019-06-21T22:48:12Z,2020-10-08T23:55:53Z,2020-10-08T23:33:05Z,NONE,,"I've got some SQLite databases too big to push to Heroku or the other services with built-in support in datasette. So instead I moved my datasette code and databases to a remote server on Kimsufi. In the folder containing the SQLite databases I run the following code. `nohup datasette serve -h 0.0.0.0 *.db --cors --port 8000 --metadata metadata.json > output.log 2>&1 &`. When I go to `http://my-remote-server.com:8000`, the site loads. But I know this is not a good long-term solution to running datasette on this server. What is the ""correct"" way to have this site run, preferably on server port 80?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/514/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 396212021,MDU6SXNzdWUzOTYyMTIwMjE=,394,base_url configuration setting,9599,simonw,closed,0,,,5234079,Datasette 0.39,27,2019-01-05T23:48:48Z,2020-06-11T09:15:20Z,2020-03-25T00:18:45Z,OWNER,,"I've identified a couple of use-cases for running Datasette in a way that over-rides the default way that internal URLs are generated. 1. Running behind a reverse proxy. I tried running Datasette behind a proxy and found that some of the generated internal links incorrectly referenced `http://127.0.0.1:8001/fixtures/...` - when they should have been referencing `http://my-host.my-domain.com/fixtures/...` - this is a problem both for links within the HTML interface but also for the `toggle_url` keys returned in the JSON as part of the facets datastructure. 2. 
I would like it to be possible to host a Datasette instance at e.g. `https://www.mynewspaper.com/interactives/2018/election-results/` - either through careful HTTP proxying or, once Datasette has been ported to ASGI, by mounting a Datasette ASGI instance deep within an existing set of URL routes. I'm going to add a `url_prefix` configuration option. This will default to `""""`, which means Datasette will behave as it does at the moment - it will use `/` for most URL prefixes in the HTML version, and an absolute URL derived from the incoming `Host` header for URLs that are returned as part of the JSON output. If `url_prefix` is set to another value (either a full URL or a path) then this path will be appended to all generated URLs.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/394/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1347717749,I_kwDOBm6k_c5QVIp1,1791,Updating metadata.json on Datasette for MacOS,1780782,ment4list,open,0,,,,,1,2022-08-23T10:41:16Z,2022-08-23T13:29:51Z,,NONE,,"I've installed Datasette for Mac as per [the documentation](https://docs.datasette.io/en/stable/installation.html#datasette-desktop-for-mac) and it's working great! However, I'm not sure how to go about adding something like ""[Canned Queries](https://docs.datasette.io/en/stable/sql_queries.html#canned-queries)"" or utilising other advanced features or settings by manipulating the `metadata.json` or `settings.json` files. I can view these files from the Datasette App from the top right ""burger"" menu but it only shows the contents of the file with no way to edit or change it. Am I missing something? Where can I update the `metadata.json` file using the MacOS App? PS: This is a fantastic tool! Thanks so much for all the effort and especially adding a bunch of different ways to get started quickly!",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1791/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1663399821,I_kwDOBm6k_c5jJXeN,2058,"500 ""attempt to write a readonly database"" error caused by ""PRAGMA schema_version""",9599,simonw,open,0,,,,,9,2023-04-11T23:57:50Z,2023-04-13T16:35:21Z,,OWNER,,"I've not been able to replicate this myself yet, but I've seen log files from a user affected by it. 
``` File ""/usr/local/lib/python3.11/site-packages/datasette/views/base.py"", line 89, in dispatch_request await self.ds.refresh_schemas() File ""/usr/local/lib/python3.11/site-packages/datasette/app.py"", line 371, in refresh_schemas await self._refresh_schemas() File ""/usr/local/lib/python3.11/site-packages/datasette/app.py"", line 386, in _refresh_schemas schema_version = (await db.execute(""PRAGMA schema_version"")).first()[0] ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File ""/usr/local/lib/python3.11/site-packages/datasette/database.py"", line 267, in execute results = await self.execute_fn(sql_operation_in_thread) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File ""/usr/local/lib/python3.11/site-packages/datasette/database.py"", line 213, in execute_fn return await asyncio.get_event_loop().run_in_executor( ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File ""/usr/local/lib/python3.11/concurrent/futures/thread.py"", line 58, in run result = self.fn(*self.args, **self.kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File ""/usr/local/lib/python3.11/site-packages/datasette/database.py"", line 211, in in_thread return fn(conn) ^^^^^^^^ File ""/usr/local/lib/python3.11/site-packages/datasette/database.py"", line 237, in sql_operation_in_thread cursor.execute(sql, params if params is not None else {}) sqlite3.OperationalError: attempt to write a readonly database ``` That's running the official Datasette Docker image on https://fly.io/ - it's causing 500 errors on every page of their site.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2058/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1266207143,I_kwDOBm6k_c5LeMmn,1755,Gunicorn,1176293,ar-jan,open,0,,,,,0,2022-06-09T14:18:46Z,2022-06-09T14:18:46Z,,NONE,,"I've read issue #514 which resulted in running Datasette via systemd as recommended approach. We've also adopted this (for now), but I notice that Uvicorn [says the following](https://www.uvicorn.org/#running-with-gunicorn): > Uvicorn includes a Gunicorn worker class allowing you to run ASGI applications, with all of Uvicorn's performance benefits, while also giving you Gunicorn's fully-featured process management. > > This allows you to increase or decrease the number of worker processes on the fly, restart worker processes gracefully, or perform server upgrades without downtime. > > For production deployments we recommend using gunicorn with the uvicorn worker class. We usually deploy Python applications via Gunicorn for these process management features (e.g. `--daemon` and `--pid`). 
Is this something that would/could work with Datasette as well?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1755/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 398011658,MDU6SXNzdWUzOTgwMTE2NTg=,398,Ensure downloading a 100+MB SQLite database file works,9599,simonw,closed,0,,,3268330,Datasette 1.0,3,2019-01-10T20:57:52Z,2020-12-05T19:36:27Z,2020-12-05T19:36:27Z,OWNER,,I've seen attempted downloads of large files fail after about ten seconds.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/398/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1901416155,I_kwDOBm6k_c5xVU7b,2189,Server hang on parallel execution of queries to named in-memory databases,9599,simonw,closed,0,,,,,31,2023-09-18T17:23:18Z,2023-09-21T22:26:21Z,2023-09-21T22:26:21Z,OWNER,,"I've started to encounter a bug where queries to tables inside named in-memory databases sometimes trigger server hangs. I'm still trying to figure out what's going on here - on one occasion I managed to Ctrl+C the server and saw an exception that mentioned a thread lock, but usually hitting Ctrl+C does nothing and I have to `kill -9` the PID instead. This is all running on my M2 Mac. I've seen the bug in the Datasette 1.0 alphas and in Datasette 0.64.3 - but reverting to 0.61 appeared to fix it.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2189/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 527670799,MDU6SXNzdWU1Mjc2NzA3OTk=,639,updating metadata.json without recreating the app,172847,pkoppstein,open,0,,,,,6,2019-11-24T09:19:53Z,2019-11-30T06:08:50Z,,NONE,,"I've sucessfully ""uploaded"" an SQLite database (with a metadata.json file) to heroku using: $ datasette publish heroku so-sales.db -m metadata.json -n so-sales The question is: how can I modify the (small) metadata.json file without having to upload the (large) SQLite database. The directions on heroku indicate I should run: heroku git:clone -a so-sales But this just results in an empty directory with a warning: warning: You appear to have cloned an empty repository. I've been able to ""clone"" the heroku ""app"" using the command: $ heroku slugs:download -a so-sales but this is not a git repository.... Ideally, it seems to me, there'd be an option of the `datasette` CLI to allow a file to be updated, or there'd be some way to create a local git ""clone"" of the app so that the heroku instructions for ""Deploying with git"" would apply. (p.s. I ran `datasette publish heroku -m metadata.json -n so-sales` in the hope that that would not cause the .db file to be wiped, but of course it was.) (p.p.s. 
Thanks for Datasette!)",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/639/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1292370469,I_kwDOBm6k_c5NCAIl,1765,Document plugins providing new plugin hook-,9599,simonw,closed,0,,,,,1,2022-07-03T17:05:14Z,2023-08-31T23:08:24Z,2023-08-31T23:06:31Z,OWNER,,I've used this pattern twice now: https://til.simonwillison.net/datasette/register-new-plugin-hooks - in `datasette-graphql` and `datasette-low-disk-space-hook`. I should describe the pattern on https://docs.datasette.io/en/stable/writing_plugins.html,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1765/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1338278056,I_kwDOBm6k_c5PxICo,1782,Release notes for Datasette 0.62,9599,simonw,closed,0,,,8303187,Datasette 0.62,2,2022-08-14T15:26:45Z,2022-08-14T17:40:45Z,2022-08-14T17:32:54Z,OWNER,,"I've written a lot of these already for the alphas: - https://github.com/simonw/datasette/releases/tag/0.62a0 - https://github.com/simonw/datasette/releases/tag/0.62a1",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1782/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1487738738,I_kwDOBm6k_c5YrRdy,1942,Option for plugins to request that JSON be served on the page,9599,simonw,open,0,,,3268330,Datasette 1.0,1,2022-12-10T01:08:53Z,2022-12-10T01:11:30Z,,OWNER,,"Idea came from a conversation with @hydrosquall - what if a Datasette plugin could say ""I'd like the JSON for a page to be included in a variable on the HTML page""? `datasette-cluster-map` already needs this - the first thing it does when the page loads is `fetch()` a JSON representation of that same data. This idea fits with my overall goals to unify the JSON and HTML context too. 
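One possible shape for the opt-in, purely as an illustration - neither the hook name nor the behaviour exists today:

```python
from datasette import hookimpl


# Entirely hypothetical hook, invented here to illustrate the idea:
@hookimpl
def embed_page_json(datasette, database, table):
    # Returning True would ask Datasette to inline the page's JSON
    # representation in a script element on the HTML page, so the
    # plugin's JavaScript could read it without a second fetch().
    return True
```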
Refs: - #1711",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1942/reactions"", ""total_count"": 1, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 1, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1551113681,I_kwDOBm6k_c5cdB3R,1998,`datasette --version` should also show the SQLite version,9599,simonw,open,0,,,,,2,2023-01-20T16:11:30Z,2023-01-20T18:19:06Z,,OWNER,,Idea came up here: https://discord.com/channels/823971286308356157/823971286941302908/1066026473003159783,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1998/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1479920517,I_kwDOBm6k_c5YNcuF,1934,Return number of ignored/replaced items from /-/insert,9599,simonw,open,0,,,3268330,Datasette 1.0,0,2022-12-06T19:01:58Z,2022-12-06T19:02:03Z,,OWNER,,"Idea from here: - https://github.com/simonw/sqlite-utils/issues/516",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1934/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1473481262,I_kwDOBm6k_c5X04ou,1928,Hacker News Datasette write demo,9599,simonw,closed,0,,,,,7,2022-12-02T21:17:41Z,2022-12-02T23:47:11Z,2022-12-02T21:43:19Z,OWNER,,"Idea is to have my existing scraper at https://github.com/simonw/scrape-hacker-news-by-domain also write to my private Datasette Cloud account, then create an atom feed from it. ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1928/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 340039409,MDU6SXNzdWUzNDAwMzk0MDk=,336,Ensure --help examples in docs are always up to date,9599,simonw,closed,0,,,,,3,2018-07-10T23:20:01Z,2018-07-24T16:01:29Z,2018-07-24T16:01:29Z,OWNER,,"Ideally I would automatically generate the --help output shown in our docs, but I don't think I can get that working with readthedocs. Instead, I'm going to add a unit test that checks that those extracts in the documentation match the current output of the --help command.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/336/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 828858421,MDU6SXNzdWU4Mjg4NTg0MjE=,1258,Allow canned query params to specify default values,1385831,wdccdw,open,0,,,,,5,2021-03-11T07:19:02Z,2023-02-20T23:39:58Z,,NONE,,"If I call a canned query that includes named parameters, without passing any parameters, datasette runs the query anyway, resulting in an HTTP status code 400, and a visible error in the browser, with only a link back to home. This means that one of the default links on https://site/database/ will lead to a broken page with no apparent way out. ![image](https://user-images.githubusercontent.com/1385831/110748683-13e72300-820e-11eb-855c-32e03dfef5bf.png) Is there any way to skip performing the query when parameters aren't supplied, but otherwise render the usual canned query page? 
Alternatively, can I supply default values for my parameters, either when defining my canned queries or when linking to the canned query page from the default database template.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1258/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 438200529,MDU6SXNzdWU0MzgyMDA1Mjk=,438,Plugins are loaded when running pytest,45057,russss,closed,0,,,,,2,2019-04-29T08:25:58Z,2019-05-02T05:09:18Z,2019-05-02T05:09:11Z,CONTRIBUTOR,,"If I have a datasette plugin installed on my system, its hooks are called when running the main datasette tests. This is probably undesirable, especially with the inspect hook in #437, as the plugin may rely on inspected state that the tests don't know about.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/438/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 787098146,MDU6SXNzdWU3ODcwOTgxNDY=,1190,`datasette publish upload` mechanism for uploading databases to an existing Datasette instance,1024355,tomershvueli,closed,0,,,,,3,2021-01-15T18:18:42Z,2023-08-30T22:16:39Z,2023-08-30T22:16:38Z,NONE,,"If I have a self-hosted instance of Datasette up and running, I'd like to be able to the use the CLI to publish databases to that instance, not only Google or Heroku. Ideally there'd be a `url` parameter or something similar to which one could point the publish command to their instance. ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1190/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 291639118,MDU6SXNzdWUyOTE2MzkxMTg=,183,Custom Queries - escaping strings,82988,psychemedia,closed,0,,,,,2,2018-01-25T16:49:13Z,2019-06-24T06:45:07Z,2019-06-24T06:45:07Z,CONTRIBUTOR,,"If a SQLite table column name contains spaces, they are usually referred to in double quotes: `SELECT * FROM mytable WHERE ""gappy column name""=""my value"";` In the JSON metadata file, this is passed by escaping the double quotes: `""queries"": {""my query"": ""SELECT * FROM mytable WHERE \""gappy column name\""=\""my value\"";""}` When specifying a custom query in `metadata.json` using double quotes, these are then rendered in the *datasette* query box using single quotes: `SELECT * FROM mytable WHERE 'gappy column name'='my value';` which does not work. Alternatively, a valid custom query can be passed using backticks (\`) to quote the column name and single (unescaped) quotes for the matched value: ``""queries"": {""my query"": ""SELECT * FROM mytable WHERE `gappy column name`='my value';""}`` ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/183/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1385026210,I_kwDOBm6k_c5SjdKi,1819,Preserve query on timeout,2182,danp,closed,0,,,,,3,2022-09-25T13:32:31Z,2022-09-26T23:16:15Z,2022-09-26T23:06:06Z,CONTRIBUTOR,,"If a query hits the timeout it shows a message like: > SQL query took too long. The time limit is controlled by the [sql_time_limit_ms](https://docs.datasette.io/en/stable/settings.html#sql-time-limit-ms) configuration option. 
But the query is lost. Hitting the browser back button shows the query _before_ the one that errored. It would be nice if the query that errored was preserved for more tweaking. This would make it similar to how ""invalid syntax"" works since #1346 / #619.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1819/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 643510821,MDU6SXNzdWU2NDM1MTA4MjE=,862,Set an upper limit on total facet suggestion time for a page,9599,simonw,open,0,,,,,1,2020-06-23T03:57:55Z,2020-06-23T03:58:48Z,,OWNER,,"If a table has 100 columns the facet suggestion code will currently run 100 times, taking a max of `facet_suggest_time_limit_ms` which defaults to 50ms per column: https://github.com/simonw/datasette/blob/000528192eaf891118932250141dabe7a1561ece/datasette/facets.py#L142-L162 So for 100 columns, that's 100 * 50ms = 5s total time that might be spent attempting to calculate facets on a large table! I should implement a hard upper limit on the total amount of time taken suggesting facets - probably of around 500ms. If it takes longer than that the remaining columns will not be considered.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/862/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 970626625,MDU6SXNzdWU5NzA2MjY2MjU=,1435,Turn off suggest facets on tables with large numbers of columns,9599,simonw,open,0,,,,,0,2021-08-13T18:30:48Z,2021-08-13T18:30:48Z,,OWNER,,If a table has 200 columns it will take multiple seconds to try and suggest facets. I should either quit after the first 20 or not suggest facets at all.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1435/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 349827640,MDU6SXNzdWUzNDk4Mjc2NDA=,359,Faceted browse against a JSON list of tags,9599,simonw,closed,0,,,,,6,2018-08-12T17:01:14Z,2019-05-29T21:39:12Z,2019-05-03T00:21:44Z,OWNER,,"If a table has a `[""foo"", ""bar"", ""baz""]` JSON column allow that to be faceted against. - [x] Support `?column__arraycontains=x` filter queries - [x] Support `?_facet_array=column` faceting",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/359/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 507454958,MDU6SXNzdWU1MDc0NTQ5NTg=,596,Handle really wide tables better,9599,simonw,open,0,,,,,9,2019-10-15T20:05:46Z,2022-09-07T00:58:41Z,,OWNER,,"If a table has hundreds of columns the Datasette UI starts getting unwieldy. Addressing this would be neat. 
One option would be to only select the first 30 columns by default and provide a UI for selecting more.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/596/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 657572753,MDU6SXNzdWU2NTc1NzI3NTM=,894,?sort=colname~numeric to sort by column cast to real,9599,simonw,open,0,,,,,21,2020-07-15T18:47:48Z,2021-08-20T02:07:53Z,,OWNER,,"If a text column actually contains numbers, being able to ""sort by column, treated as numeric"" would be really useful. Probably depends on column actions enabled by #690",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/894/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1082765654,I_kwDOBm6k_c5AibFW,1561,"add hash id to ""_memory"" url if hashed url mode is turned on and crossdb is also turned on",536941,fgregg,closed,0,,,,,3,2021-12-17T00:45:12Z,2022-03-19T04:45:40Z,2022-03-19T04:45:40Z,CONTRIBUTOR,,"If hashed_url mode is turned on and crossdb is also turned on, then queries to _memory should have a hash_id. One way that it could work is to have the _memory hash be a hash of all the individual databases. Otherwise, crossdb queries can get quite out of date if using aggressive caching. ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1561/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 273128608,MDU6SXNzdWUyNzMxMjg2MDg=,58,"publish command should detect if ""now"" is installed",9599,simonw,closed,0,,,2857392,Ship first public release,0,2017-11-11T08:10:17Z,2017-11-11T16:00:07Z,2017-11-11T16:00:07Z,OWNER,,"If now is not installed, it should tell you where to get it.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/58/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 445855789,MDU6SXNzdWU0NDU4NTU3ODk=,474,Do not allow downloads of mutable databases,9599,simonw,closed,0,,,4305096,0.28,1,2019-05-19T19:35:32Z,2019-05-19T20:41:17Z,2019-05-19T20:41:16Z,OWNER,,If the file changes during download it will probably result in a corrupt download. 
Safer not to allow downloads at all of mutable databases.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/474/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 585390482,MDU6SXNzdWU1ODUzOTA0ODI=,702,Option in metadata.json to set default sort order for a table,9599,simonw,closed,0,,,5234079,Datasette 0.39,5,2020-03-21T00:19:56Z,2020-03-25T04:19:36Z,2020-03-22T02:40:35Z,OWNER,,If you access the table page without any `?_sort` or `?_sort_desc` arguments it currently defaults to order by primary key - would be neat to be able to change that.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/702/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 442330564,MDU6SXNzdWU0NDIzMzA1NjQ=,457,"Ability to ""publish cloudrun"" with no user input",9599,simonw,closed,0,,,,,2,2019-05-09T16:42:51Z,2019-05-09T19:41:31Z,2019-05-09T16:45:08Z,OWNER,,"If you attempt to deploy a new version of a cloudrun deployment, the script currently pauses and asks for user input for the service name like this: ```77d4d7de-3dfc-4acc-9a23-efe16230f318 2019-05-09T15:01:48+00:00 52S gs://datasette-222320_cloudbuild/source/1557414063.1-3a82df8096e9434b93511b0588d8d155.tgz gcr.io/datasette-222320/sf-trees (+1 more) SUCCESS Service name: (sf-trees): USER INPUT REQUIRED HERE Deploying container to Cloud Run service [sf-trees] in project [datasette-222320] region [us-central1] ✓ Deploying... Done. ✓ Creating Revision... ✓ Routing traffic... ✓ Setting IAM Policy... ``` This is incompatible with running under CI.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/457/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 503053243,MDU6SXNzdWU1MDMwNTMyNDM=,582,Datasette should not completely crash if one SQLite database is malformed,9599,simonw,open,0,,,,,0,2019-10-06T05:11:43Z,2019-10-06T05:11:43Z,,OWNER,,"If you run Datasette against a number of database files and one of them is malformed, you get this 500 error on the index page: It would be better if Datasette still worked and listed the databases that were NOT malformed, then showed an inline error message just for the one that could not be accessed.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/582/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 684111953,MDU6SXNzdWU2ODQxMTE5NTM=,947,datasette --get exit code should reflect HTTP errors,9599,simonw,closed,0,,,5818042,Datasette 0.49,1,2020-08-23T04:17:08Z,2020-09-11T21:33:15Z,2020-09-11T21:33:15Z,OWNER,,"If you run `datasette . --get /` and the result is a 500 or 404 error (anything that's not a 200 or a 30x) the exit code from the command should not be 0. It should still output the returned content to stdout. 
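For illustration only, a check built on that behaviour might look something like this minimal sketch (it assumes the proposed exit-code change has landed; the `datasette . --get /` invocation is the only part taken from this issue):

```python
import subprocess

# Hypothetical smoke test: render the homepage via --get and rely on
# the proposed non-zero exit code to signal an HTTP error.
result = subprocess.run(
    ['datasette', '.', '--get', '/'],
    capture_output=True,
    text=True,
)
if result.returncode != 0:
    raise SystemExit('homepage check failed:\n' + result.stdout)
print('homepage rendered OK')
```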
This will help with writing soundness checks, as seen in https://til.simonwillison.net/til/til/github-actions_grep-tests.md",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/947/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 503217375,MDU6SXNzdWU1MDMyMTczNzU=,585,"Databases on index page should display in order they were passed to ""datasette serve""?",9599,simonw,closed,0,,,,,1,2019-10-07T03:42:39Z,2019-10-14T03:52:34Z,2019-10-14T03:52:34Z,OWNER,,"If you run this: datasette serve -h 127.0.0.1 -p 8000 -m phone-locations.db healthkit.db locations.db genome.db Then the index page for that Datasette instance should show the databases in the order they were specified on the command-line. Mind you when we add pagination to that page in #468 we may want to do something different here.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/585/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 565552217,MDU6SXNzdWU1NjU1NTIyMTc=,674,Rethink how sanity checks work,9599,simonw,closed,0,,,,,5,2020-02-14T20:57:02Z,2020-03-26T17:19:23Z,2020-02-15T17:57:46Z,OWNER,,"If you specify a file to open using `files` or `-i` then Datasette should show a useful error message and fail to start. Files found by scanning a directory #672 should just be skipped. _Split off from comment by @simonw in https://github.com/simonw/datasette/issues/673#issuecomment-586455321_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/674/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 527710055,MDU6SXNzdWU1Mjc3MTAwNTU=,640,Nicer error message for heroku publish name clash,82988,psychemedia,open,0,,,,,1,2019-11-24T14:57:07Z,2019-12-06T07:19:34Z,,CONTRIBUTOR,,"If you try to publish to Heroku using no set name (i.e. the default `datasette` name) and a project already exists under that name, you get a meaningful error report on the first line followed by Py error messages that drown it out: ``` Creating datasette... ! 
โ–ธ Name datasette is already taken Traceback (most recent call last): File ""/usr/local/bin/datasette"", line 10, in sys.exit(cli()) File ""/usr/local/lib/python3.7/site-packages/click/core.py"", line 764, in __call__ return self.main(*args, **kwargs) File ""/usr/local/lib/python3.7/site-packages/click/core.py"", line 717, in main rv = self.invoke(ctx) File ""/usr/local/lib/python3.7/site-packages/click/core.py"", line 1137, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/usr/local/lib/python3.7/site-packages/click/core.py"", line 1137, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/usr/local/lib/python3.7/site-packages/click/core.py"", line 956, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/usr/local/lib/python3.7/site-packages/click/core.py"", line 555, in invoke return callback(*args, **kwargs) File ""/Users/NNNNN/Library/Python/3.7/lib/python/site-packages/datasette/publish/heroku.py"", line 124, in heroku create_output = check_output(cmd).decode(""utf8"") File ""/usr/local/Cellar/python/3.7.5/Frameworks/Python.framework/Versions/3.7/lib/python3.7/subprocess.py"", line 411, in check_output **kwargs).stdout File ""/usr/local/Cellar/python/3.7.5/Frameworks/Python.framework/Versions/3.7/lib/python3.7/subprocess.py"", line 512, in run output=stdout, stderr=stderr) subprocess.CalledProcessError: Command '['heroku', 'apps:create', 'datasette', '--json']' returned non-zero exit status 1. ``` It would be neater if: - the Py error message was caught; - the report suggested setting a project name using `-n` etc. It may also be useful to provide a command to list the current names that are being used, which I assume is available via a Heroku call?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/640/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1294641696,I_kwDOBm6k_c5NKqog,1767,Ability to set a custom favicon,9599,simonw,open,0,,,,,9,2022-07-05T18:41:12Z,2022-07-05T18:56:43Z,,OWNER,,"If you're running a website on Datasette, like https://www.niche-museums.com/ or https://til.simonwillison.net/ - you should have the ability to easily specify a custom favicon. 
Currently the `/favicon.ico` view is hard-coded to do this: https://github.com/simonw/datasette/blob/9f1eb0d4eac483b953392157bd9fd6cc4df37de7/datasette/app.py#L179-L188",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1767/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 793881756,MDU6SXNzdWU3OTM4ODE3NTY=,1207,"Document the Datasette(..., pdb=True) testing pattern",9599,simonw,closed,0,,,,,1,2021-01-26T02:48:10Z,2021-01-29T02:37:19Z,2021-01-29T02:12:34Z,OWNER,,"If you're writing tests for a Datasette plugin and you get a 500 error from inside Datasette, you can cause Datasette to open a PDB session within the application server code by doing this: ```python ds = Datasette([db_path], pdb=True) response = await ds.client.get(""/"") ``` You'll need to run `pytest -s` to interact with the debugger, otherwise you'll get an error.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1207/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1353088849,I_kwDOBm6k_c5Qpn9R,1795,Consider automatically cleaning up curly quotes in searches,9599,simonw,open,0,,,,,0,2022-08-27T16:35:25Z,2022-08-27T16:35:25Z,,OWNER,,"If your phone helpfully adds curly quotes for you then phrase searches against FTS won't work: “Rebecca Sugar” In regular (not `?_searchmode=raw`) search mode Datasette could clean these up for you to help avoid that mistake.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1795/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 269731374,MDU6SXNzdWUyNjk3MzEzNzQ=,44,?_group_count=country - return counts by specific column(s),9599,simonw,closed,0,,,,,7,2017-10-30T19:50:32Z,2018-04-26T15:09:58Z,2018-04-26T15:09:58Z,OWNER,,"Imagine if this: https://stateless-datasets-jykibytogk.now.sh/flights-07d1283/airports.jsono?country__contains=gu&_group_count=country Turned into this: https://stateless-datasets-jykibytogk.now.sh/flights-07d1283?sql=select%20country,%20count(*)%20as%20group_count_country%20from%20airports%20where%20country%20like%20%27%gu%%27%20group%20by%20country%20order%20by%20group_count_country%20desc This would involve introducing a new precedent of query string arguments that start with an _ having special meanings. While we're at it, could try adding _fields=x,y,z Tasks: - [x] Get initial version working - [ ] Refactor code to not just ""pretend to be a view"" - [ ] Get foreign key relationships expanded",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/44/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 275125805,MDU6SXNzdWUyNzUxMjU4MDU=,124,Option to open readonly but not immutable,9599,simonw,closed,0,,,,,5,2017-11-19T02:11:03Z,2019-06-24T06:43:46Z,2019-06-24T06:43:46Z,OWNER,,Immutable assumes no other process can modify the file. 
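For reference, the distinction at the SQLite level looks roughly like this (a sketch using Python's `sqlite3` URI options, not Datasette's actual connection code; `fixtures.db` is just a placeholder filename):

```python
import sqlite3

# immutable=1: SQLite assumes the file can never change, so it skips
# locking and change detection - unsafe if another process writes to it.
immutable_conn = sqlite3.connect('file:fixtures.db?immutable=1', uri=True)

# mode=ro: this connection is read-only, but SQLite still takes shared
# locks and notices writes made by other processes.
readonly_conn = sqlite3.connect('file:fixtures.db?mode=ro', uri=True)
```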
An option to open readonly instead would enable other processes to update the file in place.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/124/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 267513523,MDU6SXNzdWUyNjc1MTM1MjM=,2,Initial proof-of-concept,9599,simonw,closed,0,,,2857392,Ship first public release,0,2017-10-23T00:45:37Z,2017-10-23T01:26:39Z,2017-10-23T00:45:53Z,OWNER,,Implemented in https://github.com/simonw/stateless-datasets/commit/de04d7a854d71003ffcf98028eab976a936c2dba,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 753876808,MDU6SXNzdWU3NTM4NzY4MDg=,1119,"Include generated columns in fixtures.db, if SQLite version supports it",9599,simonw,closed,0,,,,,1,2020-11-30T23:25:31Z,2020-12-01T00:41:14Z,2020-12-01T00:28:04Z,OWNER,,"In #1117 I added a one-off test that creates a DB with generated columns in it: https://github.com/simonw/datasette/blob/461670a0b87efa953141b449a9a261919864ceb3/tests/test_api.py#L1933-L1964 If this table was conditionally added to `fixtures.db` (if SQLite supports the feature) it would be available in the demo at https://latest.datasette.io/",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1119/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 314471743,MDU6SXNzdWUzMTQ0NzE3NDM=,211,Load plugins from a `--plugins-dir=plugins/` directory,9599,simonw,closed,0,,,,,6,2018-04-16T01:17:43Z,2018-04-16T05:22:02Z,2018-04-16T05:22:02Z,OWNER,,"In #14 and 33c7c53ff87c2 I've added working support for setuptools entry_points plugins. These can be installed from PyPI using `pip install ...`. I imagine some projects will benefit from being able to add plugins without first publishing them to PyPI. Datasette already supports [loading custom templates](http://datasette.readthedocs.io/en/latest/custom_templates.html#custom-templates) like so: datasette serve --template-dir=mytemplates/ mydb.db I propose an additional option, `--plugins-dir=` which specifies a directory full of `blah.py` files which will be loaded into Datasette when the application server starts. datasette serve --plugins-dir=myplugins/ mydb.db This will also need to be supported by `datasette publish` as those Python files should be copied up as part of the deployment.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/211/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1219398983,I_kwDOBm6k_c5Iro1H,1730,SQL tracing should much more closely track the SQL query execution,9599,simonw,open,0,,,,,0,2022-04-28T22:41:04Z,2022-04-28T22:41:10Z,,OWNER,,"In #1727 I realized that the SQL tracing was measuring a whole bunch of stuff outside of the SQL query itself. 
I started experimenting with this fix for that but it didn't work - I got back an empty JSON array of traces for some reason: ```diff diff --git a/datasette/database.py b/datasette/database.py index ba594a8..d7f9172 100644 --- a/datasette/database.py +++ b/datasette/database.py @@ -7,7 +7,7 @@ import sys import threading import uuid -from .tracer import trace +from .tracer import trace, trace_child_tasks from .utils import ( detect_fts, detect_primary_keys, @@ -207,30 +207,31 @@ class Database: time_limit_ms = custom_time_limit with sqlite_timelimit(conn, time_limit_ms): - try: - cursor = conn.cursor() - cursor.execute(sql, params if params is not None else {}) - max_returned_rows = self.ds.max_returned_rows - if max_returned_rows == page_size: - max_returned_rows += 1 - if max_returned_rows and truncate: - rows = cursor.fetchmany(max_returned_rows + 1) - truncated = len(rows) > max_returned_rows - rows = rows[:max_returned_rows] - else: - rows = cursor.fetchall() - truncated = False - except (sqlite3.OperationalError, sqlite3.DatabaseError) as e: - if e.args == (""interrupted"",): - raise QueryInterrupted(e, sql, params) - if log_sql_errors: - sys.stderr.write( - ""ERROR: conn={}, sql = {}, params = {}: {}\n"".format( - conn, repr(sql), params, e + with trace(""sql"", database=self.name, sql=sql.strip(), params=params): + try: + cursor = conn.cursor() + cursor.execute(sql, params if params is not None else {}) + max_returned_rows = self.ds.max_returned_rows + if max_returned_rows == page_size: + max_returned_rows += 1 + if max_returned_rows and truncate: + rows = cursor.fetchmany(max_returned_rows + 1) + truncated = len(rows) > max_returned_rows + rows = rows[:max_returned_rows] + else: + rows = cursor.fetchall() + truncated = False + except (sqlite3.OperationalError, sqlite3.DatabaseError) as e: + if e.args == (""interrupted"",): + raise QueryInterrupted(e, sql, params) + if log_sql_errors: + sys.stderr.write( + ""ERROR: conn={}, sql = {}, params = {}: {}\n"".format( + conn, repr(sql), params, e + ) ) - ) - sys.stderr.flush() - raise + sys.stderr.flush() + raise if truncate: return Results(rows, truncated, cursor.description) @@ -238,9 +239,8 @@ class Database: else: return Results(rows, False, cursor.description) - with trace(""sql"", database=self.name, sql=sql.strip(), params=params): - results = await self.execute_fn(sql_operation_in_thread) - return results + with trace_child_tasks(): + return await self.execute_fn(sql_operation_in_thread) @property def size(self): ``` _Originally posted by @simonw in https://github.com/simonw/datasette/issues/1727#issuecomment-1111602802_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1730/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 453846217,MDU6SXNzdWU0NTM4NDYyMTc=,506,Option to display binary data,9599,simonw,closed,0,,,,,10,2019-06-08T23:44:12Z,2019-06-11T15:48:27Z,2019-06-09T16:07:39Z,OWNER,,"In #442 we suppressed rendering of binary data: It turns out there is one use-case where displaying binary data is useful: when you're poking around looking at random SQLite databases you find in `~/Library` trying to figure out what they are for. 
So, a mechanism for opting in to ugly display of binary data again would be useful.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/506/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 626163974,MDU6SXNzdWU2MjYxNjM5NzQ=,776,register_output_renderer render callback should be optionally awaitable,9599,simonw,closed,0,,,5471110,Datasette 0.43,1,2020-05-28T02:26:29Z,2020-05-28T02:43:36Z,2020-05-28T02:43:36Z,OWNER,,"In #581 I made a bunch of improvements to this, including making `datasette` available to it so it could execute queries. But... it needs to be able to `await` in order to do that. Which means it should be optionally-awaitable. Original idea here: https://github.com/simonw/datasette/issues/645#issuecomment-560036740",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/776/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 570309546,MDU6SXNzdWU1NzAzMDk1NDY=,685,Document (and reconsider design of) Database.execute() and Database.execute_against_connection_in_thread(),9599,simonw,closed,0,,,3268330,Datasette 1.0,15,2020-02-25T04:49:44Z,2020-05-30T13:20:50Z,2020-05-08T17:42:18Z,OWNER,,"In #683 I started a new section of internals documentation covering the `Database` class: https://datasette.readthedocs.io/en/latest/internals.html#database-class I decided not to document `.execute()` and `.execute_against_connection_in_thread()` yet because I'm not 100% happy with their API design yet.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/685/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 712984738,MDU6SXNzdWU3MTI5ODQ3Mzg=,987,Documented HTML hooks for JavaScript plugin authors,9599,simonw,open,0,,,,,7,2020-10-01T16:10:14Z,2021-01-25T04:00:03Z,,OWNER,,In #981 I added `data-column=` attributes to the `` on the table page. 
These should become part of Datasette's documented API so JavaScript plugin authors can use them to derive things about the tables shown on a page (`datasette-cluster-map uses them as-of https://github.com/simonw/datasette-cluster-map/issues/18).,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/987/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 730210880,MDU6SXNzdWU3MzAyMTA4ODA=,1055,query.html and table.html should share the same table implementation,9599,simonw,open,0,,,3268330,Datasette 1.0,0,2020-10-27T07:58:21Z,2020-10-27T07:58:29Z,,OWNER,,In #998 I made a change that affected the table page but didn't affect the query page because I incorrectly assumed they shared rendering logic.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1055/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 957310278,MDU6SXNzdWU5NTczMTAyNzg=,1409,`default_allow_sql` setting (a re-imagining of the old `allow_sql` setting),9599,simonw,closed,0,,,3268330,Datasette 1.0,10,2021-07-31T19:48:56Z,2023-01-07T18:06:01Z,2023-01-05T00:51:31Z,OWNER,,"In 49d6d2f7b0f6cb02e25022e1c9403811f1fa0a7c as part of #813 I removed the `allow_sql` setting - on the basis that users could disable the ability to execute custom SQL queries using the new permission system instead. I don't think this was the right decision. Disabling custom SQL is an important security capability, and explaining how to do it using permissions is significantly more complex than letting people know they can add `--setting allow_sql off`. So I want to bring that setting back - maybe with a different, better name - and have it modify the default for that option if the permissions system doesn't have an opinion. That way people can still use the setting but then use permissions to allow specific signed-in users access to execute SQL.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1409/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 465327844,MDU6SXNzdWU0NjUzMjc4NDQ=,553,Potential improvements to facet-by-date,9599,simonw,open,0,,,,,3,2019-07-08T15:37:53Z,2019-07-08T15:41:55Z,,OWNER,,"In addition to #483 Tobias had some useful suggestions on Twitter: https://twitter.com/rixxtr/status/1148253926476701696 > I think for date facets, it might be more meaningful to order them by date, rather than by size? Or offer both? I'm *definitely* often interested in size-over-time, so https://data.rixx.de/django_tickets/tickets?_facet_date=created#facet-created โ€ฆ isn't all that helpful! Screenshot of that link: ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/553/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1886791100,I_kwDOBm6k_c5wdiW8,2180,Plugin hook: `actors_from_ids()`,9599,simonw,closed,0,,,,,6,2023-09-08T01:16:41Z,2023-09-10T17:44:14Z,2023-09-08T04:28:03Z,OWNER,,"In building Datasette Cloud we realized that a bunch of the features we are building need a way of resolving an actor ID to the actual actor, in order to display something more interesting than just an integer ID. 
Social plugins in particular need this - comments by X, CSV uploaded by X, that kind of thing. I think the solution is a new plugin hook: `actors_from_ids(datasette, ids)` which can return a list of actor dictionaries. The default implementation can return `[{""id"": ""...""}]` for the IDs passed to it. Pluggy has a `firstresult=True` option which is relevant here, since this is the first plugin hook we will have implemented where only one plugin should provide an answer.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2180/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 502993509,MDU6SXNzdWU1MDI5OTM1MDk=,581,Redesign register_output_renderer callback,9599,simonw,closed,0,,,5471110,Datasette 0.43,24,2019-10-05T17:43:23Z,2020-05-28T02:24:14Z,2020-05-28T02:21:50Z,OWNER,,"In building https://github.com/simonw/datasette-atom it became clear that the callback function (which currently accepts just args, data and view_name) would also benefit from access to a mechanism to render templates and a `datasette` instance so it can execute SQL. To maintain backwards compatibility with existing plugins, we can introspect the callback function to see if it wants those new arguments or not. At a minimum I want to make `datasette` and ASGI `scope` available.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/581/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 423316403,MDU6SXNzdWU0MjMzMTY0MDM=,422,Figure out what to do about table counts in a mutable world,9599,simonw,closed,0,,,,,4,2019-03-20T15:27:15Z,2019-05-02T05:43:11Z,2019-05-02T05:43:11Z,OWNER,,"In moving away from the existing static inspect method (see #420 and #419) the biggest thing lost is full table row counts. These can be expensive against large tables, but currently Datasette runs the `count (*) from x` query once at inspection time and then reuses it for every page. We can run those counts with a timelimit, but this means that for larger tables we won't be able to show a count at all, which is disappointing. Is there a way we can find an approximate or lower bound count for a table?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/422/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 321631020,MDU6SXNzdWUzMjE2MzEwMjA=,253,Documentation explaining how to use SQLite FTS with Datasette,9599,simonw,closed,0,,,,,1,2018-05-09T16:02:08Z,2018-05-12T12:09:02Z,2018-05-12T12:06:51Z,OWNER,,"In particular how to work with https://www.sqlite.org/fts3.html#_external_content_fts4_tables_ - which Datasette can automatically detect and use to add a search UI to your page. 
Examples of basic search setup like this: ``` CREATE VIRTUAL TABLE ""interests_fts"" USING FTS4 (name, content=""interests""); INSERT INTO ""interests_fts"" (rowid, name) SELECT rowid, name FROM interests; ``` And complex join-based search setup like this: ``` CREATE VIRTUAL TABLE ""interests_fts"" USING FTS4 (name, category, member, content=""interests""); INSERT INTO ""interests_fts"" (rowid, name, category, member) SELECT interests.rowid, interests.name, interest_categories.name, members.name FROM interests JOIN interest_categories ON interests.category_id = interest_categories.id JOIN members ON interests.member_id = members.id; ``` Also mention how `csvs-to-sqlite` can be used to do this easily. This will benefit from #252 ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/253/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 808771690,MDU6SXNzdWU4MDg3NzE2OTA=,1225,More flexible formatting of records with CSS grid,649467,mhalle,open,0,,,,,0,2021-02-15T19:28:17Z,2021-02-15T19:28:35Z,,NONE,,"In several applications I've been experimenting with alternate formatting of datasette query results. Lately I've found that CSS grids work very well and seem quite general for formatting rows. In CSS I use grid templates to define the layout of each record and the regions for each field, hiding the fields I don't want. It's pretty flexible and looks good. It's also a great basis for highly responsive layout. I initially thought I'd only use this feature for record detail views, but now I use it for index views as well. However, there are some limitations: * With the existing table templates, it seems that you can change the `display` property on the enclosing `table`, `tbody`, and `tr` to make them be grid-like, but that seems hacky (convert `table` and `tbody` to be `display: block` and `tr` to be `display: grid`). * More significantly, it's very nice to have the column name available when rendering each record to display headers/field labels. The existing templates don't do that, so a custom `_table` template is necessary. * I don't know if any plugins are sensitive to whether data is rendered as a table or not since I'm not completely clear how plugins get their data. * Regardless, you need custom CSS to take full advantage of grids. I don't have a proposal on how to integrate them more deeply. It would be helpful to at least have an official example or test that used a grid layout for records to make sure nothing in datasette breaks with it. ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1225/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 377166793,MDU6SXNzdWUzNzcxNjY3OTM=,372,Docker build tools,82988,psychemedia,open,0,,,,,0,2018-11-04T16:02:35Z,2018-11-04T16:02:35Z,,CONTRIBUTOR,,"In terms of small pieces lightly joined, I note that there are several tools starting to appear for building generating Dockerfiles and building Docker containers from simpler components such as `requirements.txt` files. If plugin/extensions builders want to include additional packages, then things like incremental builds of composable builds that add additional items into a base `datasette` container may be required. 
Examples of Dockerfile generators / container builders: - [openshift/source-to-image (s2i)](https://github.com/openshift/source-to-image) - [jupyter/repo2docker](https://github.com/jupyter/repo2docker) - [stencila/dockter](https://github.com/stencila/dockter) Discussions / threads (via Binderhub gitter) on: - [why `repo2docker` not `s2i`](http://words.yuvi.in/post/why-not-s2i/) - [why `dockter` not `repo2docker`](https://twitter.com/choldgraf/status/1058499607309647872) - [composability in `s2i`](https://trello.com/c/AexIVZNf/1008-8-composable-builds-builds-evg) Relates to things like: - https://github.com/simonw/datasette/pull/280",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/372/reactions"", ""total_count"": 2, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 2, ""rocket"": 0, ""eyes"": 0}",, 1822982933,I_kwDOBm6k_c5sqIMV,2117,Figure out what to do about `DatabaseView.name`,9599,simonw,closed,0,,,9700784,Datasette 1.0a3,1,2023-07-26T18:58:06Z,2023-08-08T02:02:07Z,2023-08-08T02:02:07Z,OWNER,,"In the old code: https://github.com/simonw/datasette/blob/08181823990a71ffa5a1b57b37259198eaa43e06/datasette/views/database.py#L34-L35 This `name` class attribute was later used by some of the plugin hooks, passed as `view_name`: https://github.com/simonw/datasette/blob/18dd88ee4d78fe9d760e9da96028ae06d938a85c/datasette/hookspecs.py#L50-L54 Figure out how that should work once I've refactored those classes to view functions instead. Refs: - #2109 ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2117/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1496652622,I_kwDOBm6k_c5ZNRtO,1955,"invoke_startup() is not run in some conditions, e.g. gunicorn/uvicorn workers, breaking lots of things",32839123,Rik-de-Kort,closed,0,,,,,36,2022-12-14T13:39:56Z,2022-12-19T04:34:16Z,2022-12-18T02:45:18Z,NONE,,"In the past (pre-september 14, #1809) I had a running deployment of Datasette on Azure WebApps by emulating the call in cli.py to Gunicorn: `gunicorn -w 2 -k uvicorn.workers.UvicornWorker app:app`. My most recent deployment, however, fails loudly by shouting that `Datasette.invoke_startup()` was not called. It does not seem to be possible to call `invoke_startup` when running using a uvicorn command directly like this (I've reproduced this locally using `uvicorn`). Two candidates that I have tried: * Uvicorn has a `--factory` option, but the app factory has to be synchronous, so no `await invoke_startup` there * `asyncio.get_event_loop().run_until_complete` is also not an option because `uvicorn` already has the event loop running. One additional option is: * Use Gunicorn's [server hooks](https://docs.gunicorn.org/en/stable/settings.html#server-hooks) to call `invoke_startup`. These are also synchronous, but I might be able to get ahead of the event loop starting here. In my current deployment setup, it does not appear to be possible to use `datasette serve` directly, so I'm stuck either * Trying to rework my complete deployment setup, for instance, using Azure functions as described [here](https://github.com/simonw/azure-functions-datasette)) * Or dig into the ASGI spec and write a wrapper for the sole purpose of launching Datasette using a direct Uvicorn invocation. Questions for the maintainers: * Is this intended behaviour/will not support/etc.? 
If so, I'd be happy to add a PR with a couple of lines in the documentation. * If this is not intended behaviour, what is a good way to fix it? I could have a go at the ASGI spec thing (I think the Azure Functions thing is related) and provide a PR with the wrapper here, but I'm all ears! Almost forgot, minimal reproducer: ```python from datasette import Datasette ds = Datasette(files=['./global-power-plants.db']) app = ds.app() ``` Save as app.py in the same folder as global-power-plants.db, and then try running `uvicorn app:app`. Opening the resulting Datasette instance in the browser will show the error message.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1955/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 733485423,MDU6SXNzdWU3MzM0ODU0MjM=,1071,Messages should be displayed full width,9599,simonw,closed,0,,,6026070,0.51,1,2020-10-30T20:11:35Z,2020-10-30T20:20:02Z,2020-10-30T20:13:05Z,OWNER,,"In the pattern portfolio: But they're currently showing like this: ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1071/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 685806511,MDU6SXNzdWU2ODU4MDY1MTE=,950,Private/secret databases: database files that are only visible to plugins,9599,simonw,closed,0,,,,,6,2020-08-25T20:46:17Z,2023-08-24T22:26:09Z,2023-08-24T22:26:08Z,OWNER,,"In thinking about the best way to implement https://github.com/simonw/datasette-auth-passwords/issues/6 (SQL-backed user accounts for `datasette-auth-passwords`) I realized that there are a few different use-cases where a plugin might want to store data that isn't visible to regular Datasette users: - Storing password hashes - Storing API tokens - Storing secrets that are used for data import integrations (secrets for talking to the Twitter API for example) Idea: allow one or more private database files to be attached to Datasette, something like this: datasette github.db linkedin.db -s secrets.db -m metadata.yml The `secrets.db` file would not be visible using any of the Datasette's usual interface or API routes - but plugins would be able to run queries against it. So `datasette-auth-passwords` might then be configured like this: ```yaml plugins: datasette-auth-passwords: database: secrets sql: ""select password_hash from passwords where username = :username"" ``` The plugin could even refuse to operate against a database that hadn't been loaded as a secret database.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/950/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1500636982,I_kwDOBm6k_c5Zcec2,1962,"Alternative, async-friendly pattern for `make_app_client()` and similar - fully retire `TestClient`",9599,simonw,open,0,,,,,1,2022-12-16T17:56:51Z,2022-12-16T21:55:29Z,,OWNER,,"In this issue I replaced a whole bunch of places that used the non-async `app_client` fixture with an async `ds_client` fixture instead: - #1959 But I didn't get everything, and a lot of tests are still using the old `TestClient` mechanism as a result. 
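A minimal sketch of the shape this could take, assuming pytest-asyncio and the documented `datasette.client` interface (the fixture name and settings here are illustrative, not the real fixtures):

```python
import pytest
import pytest_asyncio
from datasette.app import Datasette


@pytest_asyncio.fixture
async def ds_small_pages():
    # Illustrative variant of the ds_client idea: a fresh in-memory
    # Datasette with one setting changed, no TestClient involved.
    ds = Datasette(memory=True, settings={'default_page_size': 10})
    await ds.invoke_startup()
    return ds


@pytest.mark.asyncio
async def test_homepage(ds_small_pages):
    response = await ds_small_pages.client.get('/')
    assert response.status_code == 200
```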
The main work here is replacing all of the `app_client_...` fixtures which use variants on the default client - and changing the tests that call `make_app_client()` to do something else instead. This requires some careful thought. I need to come up with a really nice pattern for creating variants on the `ds_client` default fixture - and do so in a way that minimizes the number of open files, refs: - #1843",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1962/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1434094365,I_kwDOBm6k_c5Veosd,1881,Tool for simulating permission checks against actors,9599,simonw,closed,0,,,,,9,2022-11-03T04:43:20Z,2022-12-09T01:38:21Z,2022-11-04T00:13:05Z,OWNER,,"In working on this issue: - #1855 I realized that if I'm going to make actors more complicated (the proposed `_r` key for additional restricted permissions) I really need an interactive tool for simulating these checks, similar to the https://latest.datasette.io/-/allow-debug tool.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1881/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 272735257,MDU6SXNzdWUyNzI3MzUyNTc=,51,Make a proper README,9599,simonw,closed,0,,,2857392,Ship first public release,1,2017-11-09T21:46:07Z,2017-11-13T18:44:23Z,2017-11-13T18:44:23Z,OWNER,,Include instructions on building a local Docker container - currently detailed here: https://gist.github.com/simonw/0ea5c960608c2d876e4637a5e48aa95d (those instructions don't work now that we have removed the Dockerfile in favour of a template generated by `datasette publish`),107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/51/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 733499930,MDU6SXNzdWU3MzM0OTk5MzA=,1072,load_template hook doesn't work for include/extends,9599,simonw,closed,0,,,6026070,0.51,20,2020-10-30T20:33:44Z,2020-10-31T20:48:18Z,2020-10-30T22:50:57Z,OWNER,,"Includes like this one always go to disk, without hitting the `load_template` plugin hook: ```html+jinja
{% block footer %}{% include ""_footer.html"" %}{% endblock %}
```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1072/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1423336089,I_kwDOBm6k_c5U1mKZ,1855,`datasette create-token` ability to create tokens with a reduced set of permissions,9599,simonw,closed,0,,,8711695, Datasette 1.0a2,19,2022-10-26T02:20:52Z,2022-12-14T01:24:49Z,2022-12-13T05:20:24Z,OWNER,,"Initial design ideas: https://github.com/simonw/datasette/issues/1852#issuecomment-1289733483 > Token design concept: > > ```json > { > ""t"": { > ""a"": [""ir"", ""ur"", ""dr""], > ""d"": { > ""fixtures"": [""ir"", ""ur"", ""dr""] > }, > ""t"": { > ""fixtures"": { > ""searchable"": [""ir""] > } > } > } > } > ``` > > That JSON would be minified and signed. > > Minified version of the above looks like this (101 characters): > > `{""t"":{""a"":[""ir"",""ur"",""dr""],""d"":{""fixtures"":[""ir"",""ur"",""dr""]},""t"":{""fixtures"":{""searchable"":[""ir""]}}}}` > > The `""t""` key shows this is a token that as a default API key. > > `""a""` means ""all"" - these are permissions that have been granted on all tables and databases. > > `""d""` means ""databases"" - this is a way to set permissions for all tables in a specific database. > > `""t""` means ""tables"" - this lets you set permissions at a finely grained table level. > > Then the permissions themselves are two character codes which are shortened versions - so: > > * `ir` = `insert-row` > * `ur` = `update-row` > * `dr` = `delete-row` ## Remaining tasks - [x] Add these options to the `datasette create-token` command - [x] Tests for `datasette create-token` options - [x] Documentation for those options at https://docs.datasette.io/en/latest/authentication.html#datasette-create-token - [x] A way to handle permissions that don't have known abbreviations (permissions added by plugins). 
Probably need to solve the plugin permission registration problem as part of that - [x] Stop hard-coding names of actions in the `permission_allowed_actor_restrictions` function",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1855/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 628572716,MDU6SXNzdWU2Mjg1NzI3MTY=,791,Tutorial: building a something-interesting with writable canned queries,9599,simonw,open,0,,,,,2,2020-06-01T16:32:05Z,2020-10-10T23:34:42Z,,OWNER,,"Initial idea: TODO list, as a tutorial for #698 writable canned queries.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/791/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 273510781,MDU6SXNzdWUyNzM1MTA3ODE=,76,publish should have required argument specifying publisher,9599,simonw,closed,0,,,2857392,Ship first public release,0,2017-11-13T17:21:26Z,2017-11-13T18:41:01Z,2017-11-13T18:41:01Z,OWNER,,Initially the only argument will be โ€œnowโ€ - but โ€œhyperโ€ can be added in the future,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/76/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 327395270,MDU6SXNzdWUzMjczOTUyNzA=,296,Per-database and per-table /-/ URL namespace,9599,simonw,open,0,,,,,3,2018-05-29T16:23:13Z,2019-06-28T16:46:34Z,,OWNER,,"Initially this will be for subsets of `/-/inspect` and `/-/metadata` but it will also give us a URL namespace for future features like `/-/facet` (expanded list of a specific facet, linked to from `...`) and `/-/graph` To start: * `/dbname/-/inspect` * `/dbname/-/metadata` * `/dbname/tablename/-/inspect` * `/dbname/tablename/-/metadata` This means we will no longer allow databases or tables to have the name `""-""` - I think that's OK We will continue to support rows with a primary key of `""-""` at the following URL: * `/dbname/tablename/-`",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/296/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 856895291,MDU6SXNzdWU4NTY4OTUyOTE=,1299,Design better empty states,9599,simonw,open,0,,,,,0,2021-04-13T12:06:12Z,2021-04-13T12:06:12Z,,OWNER,,Inspiration here: https://emptystat.es/,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1299/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 741665726,MDU6SXNzdWU3NDE2NjU3MjY=,1089,Sweep documentation for words that minimize involved difficulty,9599,simonw,closed,0,,,6055094,Datasette 0.52,1,2020-11-12T14:53:05Z,2020-11-28T23:28:43Z,2020-11-12T20:07:26Z,OWNER,,Inspired by https://github.com/django/django/pull/11482,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1089/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 275082158,MDU6SXNzdWUyNzUwODIxNTg=,119,"Build an ""export this data to google sheets"" 
plugin",9599,simonw,closed,0,,,,,1,2017-11-18T14:14:51Z,2020-06-04T18:46:40Z,2020-06-04T18:46:39Z,OWNER,,"Inspired by https://github.com/kren1/tosheets It should be a plug-in because I'd like to keep all interactions with proprietary / non-open-source software encapsulated in plugins rather than shipped as part of core.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/119/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 910092577,MDU6SXNzdWU5MTAwOTI1Nzc=,1356,"Research: syntactic sugar for using --get with SQL queries, maybe ""datasette query""",9599,simonw,open,0,,,,,10,2021-06-03T04:49:42Z,2022-01-20T01:06:37Z,,OWNER,,"Inspired by https://github.com/simonw/sqlite-utils/issues/264 - in particular this example: ``` datasette covid.db --get='/covid.yaml?sql=select * from ny_times_us_counties limit 1' - date: '2020-01-21' county: Snohomish state: Washington fips: 53061 cases: 1 deaths: 0 ``` Having to construct that URL - including potentially URL escaping the SQL query - isn't a great developer experience. Imagine if you could do this instead: datasette covid.db --query ""select * from ny_times_us_counties limit 1"" --format yaml ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1356/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1102966378,I_kwDOBm6k_c5Bve5q,1599,Add architecture documentation,9599,simonw,open,0,,,,,0,2022-01-14T04:55:38Z,2022-01-14T04:56:03Z,,OWNER,,"Inspired by https://matklad.github.io/2021/02/06/ARCHITECTURE.md.html Good example: https://github.com/rust-analyzer/rust-analyzer/blob/d7c99931d05e3723d878bea5dc26766791fa4e69/docs/dev/architecture.md",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1599/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 638104520,MDU6SXNzdWU2MzgxMDQ1MjA=,841,Research feasibility of 100% test coverage,9599,simonw,closed,0,,,,,9,2020-06-13T06:07:01Z,2020-06-13T21:38:46Z,2020-06-13T21:38:46Z,OWNER,,"Inspired by https://twitter.com/mikeal/status/1271473021593636866 > Almost every library Iโ€™ve written in the last 2 years has had 100% coverage and thatโ€™s probably not going to change in the future. 
Itโ€™s not that hard to start at 100% and hold onto it and the workflow it enables is so much nicer.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/841/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 726687572,MDU6SXNzdWU3MjY2ODc1NzI=,1039,Add an animation to the column actions menu,9599,simonw,closed,0,,,6026070,0.51,1,2020-10-21T16:56:28Z,2020-10-23T19:44:07Z,2020-10-21T17:02:32Z,OWNER,,"Inspired by the animation on some of GitHub's dropdown menus: https://github.com/primer/css/blob/da8ee54248e6d76c15c18e53684a15a6516b5b7f/src/utilities/animations.scss#L114-L131 ```css /* Fade in an element and scale it fast */ .anim-scale-in { animation-name: scale-in; animation-duration: 0.15s; animation-timing-function: cubic-bezier(0.2, 0, 0.13, 1.5); } @keyframes scale-in { 0% { opacity: 0; transform: scale(0.5); } 100% { opacity: 1; transform: scale(1); } } ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1039/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1077628073,I_kwDOBm6k_c5AO0yp,1550,Research option for returning all rows from arbitrary query,9599,simonw,open,0,,,,,2,2021-12-11T19:31:11Z,2021-12-11T23:43:24Z,,OWNER,,"Inspired by thinking about #1549 - returning ALL rows from an arbitrary query is a lot easier if you just run that query and keep iterating over the cursor. I've avoided doing that in the past because it could tie up a connection for a long time - but in private instances this wouldn't be such a problem.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1550/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 335200136,MDU6SXNzdWUzMzUyMDAxMzY=,327,Explore if SquashFS can be used to shrink size of packaged Docker containers,9599,simonw,open,0,,,,,4,2018-06-24T18:15:16Z,2022-02-17T23:37:24Z,,OWNER,,"Inspired by this article: https://cldellow.com/2018/06/22/sqlite-parquet-vtable.html#sqlite-database-indexed--squashed https://en.wikipedia.org/wiki/SquashFS is ""a compressed read-only file system for Linux"" - which means it could be a really nice fit for Datasette and its read-only SQLite databases. It would be interesting to explore a Dockerfile recipe that used SquashFS to compress the SQLite database file that was bundled up by `datasette package` and friends.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/327/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 855296937,MDU6SXNzdWU4NTUyOTY5Mzc=,1295,Errors should have links to further information,9599,simonw,open,0,,,,,2,2021-04-11T12:39:12Z,2022-12-14T23:28:49Z,,OWNER,,"Inspired by this tweet: https://twitter.com/willmcgugan/status/1381186384510255104 > While I am thinking about faqs. Iโ€™d also like to add short URLs to Rich exceptions. > > I loath cryptic error messages, and Iโ€™ve created a fair few myself. In Rich Iโ€™ve tried to make them as plain English as possible. But... 
> > would be great if every error message linked to a page that explains the error in detail and offers fixes.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1295/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 947044667,MDU6SXNzdWU5NDcwNDQ2Njc=,1398,Documentation on using Datasette as a library,9599,simonw,open,0,,,,,1,2021-07-18T14:15:27Z,2021-07-30T03:21:49Z,,OWNER,,"Instantiating `Datasette()` directly is an increasingly interesting pattern. I do it in tests all the time, but thanks to `datasette.client` there are plenty of neat things you can do with it in a library context. Maybe support `from datasette import Datasette` for this.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1398/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 415575624,MDU6SXNzdWU0MTU1NzU2MjQ=,414,datasette requires specific version of Click,82988,psychemedia,closed,0,,,,,1,2019-02-28T11:24:59Z,2019-03-15T04:42:13Z,2019-03-15T04:42:13Z,CONTRIBUTOR,,"Is `datasette` beholden to version `click==6.7`? Current release is at 7.0. Can the requirement be liberalised, eg to `>=6.7`?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/414/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 340396247,MDU6SXNzdWUzNDAzOTYyNDc=,339,Expose SANIC_RESPONSE_TIMEOUT config option in a sensible way,12617395,bsilverm,closed,0,,,,,4,2018-07-11T20:38:06Z,2022-03-21T22:22:40Z,2022-03-21T22:22:34Z,NONE,,"Is it possible to configure the sql_time_limit_ms beyond 60 seconds? It seems queries are still timing out at 60 seconds when sql_time_limit_ms is set to 180000. We have a very large data set and often encounter timeouts when testing new queries from the datasette UI. We are optimizing our database as much as we can, but still may require more than 60 seconds for complex queries.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/339/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 802513359,MDU6SXNzdWU4MDI1MTMzNTk=,1217,Possible to deploy as a python app (for Rstudio connect server)?,6165713,plpxsk,open,0,,,,,4,2021-02-05T22:21:24Z,2022-11-04T11:37:52Z,,NONE,,"Is it possible to deploy a `datasette` application as a python web app? In my enterprise, I have the option to deploy python apps via [Rstudio Connect](https://github.com/rstudio/rsconnect-python), and I would like to publish a `datasette` dashboard for sharing. 
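One possible direction, purely as a sketch (it reuses the ASGI `app = ds.app()` pattern that appears elsewhere in these issues; I have not verified that RStudio Connect can host a generic ASGI app):

```python
# app.py - hypothetical entry point for an ASGI-aware Python host
from datasette.app import Datasette

ds = Datasette(files=['my_data.db'])

# Datasette exposes a standard ASGI application object, so any host or
# server that can run ASGI apps (for example `uvicorn app:app`) can serve it.
# Note the invoke_startup() caveat discussed around issue #1955 above for
# newer Datasette versions.
app = ds.app()
```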
I welcome any pointers to converting `datasette serve` into a python app that can be run as something like `python datasette.py --my_data.db`",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1217/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 893890496,MDU6SXNzdWU4OTM4OTA0OTY=,1332,?_facet_size=X to increase number of facets results on the page,192568,mroswell,closed,0,,,,,5,2021-05-18T02:40:16Z,2021-05-27T16:13:07Z,2021-05-23T00:34:37Z,CONTRIBUTOR,,"Is there a way to add a parameter to the URL to modify default_facet_size? LIkewise, a way to produce a link on the three dots to expand to all items (or match previous number of items, or add x more)? ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1332/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 658476055,MDU6SXNzdWU2NTg0NzYwNTU=,896,Use white-space: pre-wrap on ALL table cell contents,9599,simonw,closed,0,,,,,4,2020-07-16T19:05:21Z,2020-07-17T01:26:08Z,2020-07-17T01:26:08Z,OWNER,,"Is there any reason NOT to apply `white-space: pre-wrap` to the contents of all table cells in Datasette? The default display mechanism of HTML (stripping leading/trailing slashes and collapsing all other whitespace) doesn't really make sense for displaying the kind of data that Datasette works with.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/896/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 842862708,MDU6SXNzdWU4NDI4NjI3MDg=,1280,Ability to run CI against multiple SQLite versions,9599,simonw,open,0,,,,,2,2021-03-28T23:54:50Z,2021-05-10T19:07:46Z,,OWNER,,"Issue #1276 happened because I didn't run tests against a SQLite version prior to 3.16.0 (released 2017-01-02). Glitch is a deployment target and runs SQLite 3.11.0 from 2016-02-15. If CI ran against that version of SQLite this bug could have been avoided.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1280/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 826064552,MDU6SXNzdWU4MjYwNjQ1NTI=,1253,"Capture ""Ctrl + Enter"" or ""⌘ + Enter"" to send SQL query?",9308268,rayvoelker,open,0,,,,,1,2021-03-09T15:00:50Z,2021-10-30T16:00:42Z,,NONE,,"It appears as though ""Shift + Enter"" triggers the form submit action to submit SQL, but could that action be bound to the ""Ctrl + Enter"" or ""⌘ + Enter"" action? 
I feel like that pattern already exists in a number of similar tools and could improve usability of the editor.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1253/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 316128955,MDU6SXNzdWUzMTYxMjg5NTU=,230,Setting page size AND max returned rows to 1000 doesn't seem to work,9599,simonw,closed,0,,,,,1,2018-04-20T05:05:11Z,2018-04-26T04:04:25Z,2018-04-26T04:04:25Z,OWNER,,"It appears that if the two settings are the same Datasette fails to return any results, probably because of the trick where we try to fetch 1001 rows so we know if there's a next page.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/230/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 842212586,MDU6SXNzdWU4NDIyMTI1ODY=,1277,Facet by array breaks if table name contains a space,9599,simonw,closed,0,,,,,1,2021-03-26T18:38:19Z,2021-03-27T03:49:38Z,2021-03-27T03:49:34Z,OWNER,,It breaks when you try to select a filtered item.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1277/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 455965174,MDU6SXNzdWU0NTU5NjUxNzQ=,508,Ability to set default sort order for a table or view in metadata.json,9599,simonw,closed,0,9599,simonw,,,1,2019-06-13T21:40:51Z,2020-05-28T18:53:03Z,2020-05-28T18:53:02Z,OWNER,,"It can go here in the documentation: https://datasette.readthedocs.io/en/stable/metadata.html#setting-which-columns-can-be-used-for-sorting Also need to fix this sentence which is no longer true: > By default, database views in Datasette do not support sorting",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/508/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 969548935,MDU6SXNzdWU5Njk1NDg5MzU=,1429,UI for setting `?_size=max` on table page,9599,simonw,open,0,,,,,2,2021-08-12T20:52:09Z,2021-08-13T04:37:41Z,,OWNER,,"It defaults to 100 per page, but you can increase that to 1000 per page using `?_size=max` (or higher if `max_returned_rows` is set higher than that). But... that's only available to people who know how to hack URLs. Solution: add a link that sets that option to the pagination block at the bottom of the table: ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1429/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 505512251,MDU6SXNzdWU1MDU1MTIyNTE=,588,Queries per DB table in metadata.json,12617395,bsilverm,closed,0,,,,,3,2019-10-10T21:08:19Z,2019-10-21T12:58:22Z,2019-10-21T01:48:42Z,NONE,,"It doesn't appear possible to have separate queries defined per database table. 
When I do something like below, my table descriptions show up but not the queries: ` ""databases"": { ""MYDB"": { ""tables"": { ""MYFIRSTTABLE"": { ""source"": ""Test"", ""source_url"": ""https://www.google.com"", ""queries"": { ""Query 1"": { ""sql"": ""select * from MYFIRSTTABLE"", ""title"": ""Query 1"", ""description"": ""This is the first query"" }, } }, ""MYSECONDTABLE"": { ""source"":""Test2"", ""source_url"":""https://www.google.com"", ""queries"": { ""Query 2"" : { ""sql"":""select * from MYSECONDTABLE;"", ""title"": ""Query 2"", ""description"":""This is the second query"" } } } }`",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/588/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 634651079,MDU6SXNzdWU2MzQ2NTEwNzk=,814,Remove --debug option from datasette serve,9599,simonw,closed,0,,,6026070,0.51,1,2020-06-08T14:10:14Z,2020-10-23T19:44:04Z,2020-10-10T23:39:43Z,OWNER,,"It doesn't appear to do anything useful at all: https://github.com/simonw/datasette/blob/f786033a5f0098371cb1df1ce83959b27c588115/datasette/cli.py#L251-L253 https://github.com/simonw/datasette/blob/f786033a5f0098371cb1df1ce83959b27c588115/datasette/cli.py#L365-L367",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/814/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 445858491,MDU6SXNzdWU0NDU4NTg0OTE=,476,"Remove ""datasette skeleton""",9599,simonw,closed,0,,,4305096,0.28,0,2019-05-19T20:04:11Z,2019-05-19T20:06:06Z,2019-05-19T20:06:06Z,OWNER,,"It doesn't work any more, and it's not a particularly useful feature - I've hardly used it since I added it.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/476/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 341229113,MDU6SXNzdWUzNDEyMjkxMTM=,344,datasette publish heroku fails without name provided,45057,russss,closed,0,,,,,1,2018-07-14T11:15:56Z,2018-07-14T13:00:48Z,2018-07-14T13:00:48Z,CONTRIBUTOR,,"It fails with the following JSON traceback if the `-n` option isn't provided, despite the fact that the command line help says that's not needed for heroku publishes.
``` Traceback (most recent call last): File ""/usr/local/bin/datasette"", line 11, in sys.exit(cli()) File ""/usr/local/lib/python3.6/site-packages/click/core.py"", line 722, in __call__ return self.main(*args, **kwargs) File ""/usr/local/lib/python3.6/site-packages/click/core.py"", line 697, in main rv = self.invoke(ctx) File ""/usr/local/lib/python3.6/site-packages/click/core.py"", line 1066, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/usr/local/lib/python3.6/site-packages/click/core.py"", line 895, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/usr/local/lib/python3.6/site-packages/click/core.py"", line 535, in invoke return callback(*args, **kwargs) File ""/usr/local/lib/python3.6/site-packages/datasette/cli.py"", line 265, in publish app_name = json.loads(create_output)[""name""] File ""/usr/local/Cellar/python/3.6.5/Frameworks/Python.framework/Versions/3.6/lib/python3.6/json/__init__.py"", line 354, in loads return _default_decoder.decode(s) File ""/usr/local/Cellar/python/3.6.5/Frameworks/Python.framework/Versions/3.6/lib/python3.6/json/decoder.py"", line 339, in decode obj, end = self.raw_decode(s, idx=_w(s, 0).end()) File ""/usr/local/Cellar/python/3.6.5/Frameworks/Python.framework/Versions/3.6/lib/python3.6/json/decoder.py"", line 357, in raw_decode raise JSONDecodeError(""Expecting value"", s, err.value) from None json.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0) ```
",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/344/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 449886319,MDU6SXNzdWU0NDk4ODYzMTk=,493,Rename metadata.json to config.json,9599,simonw,closed,0,,,3268330,Datasette 1.0,7,2019-05-29T15:48:03Z,2023-08-23T01:29:21Z,2023-08-23T01:29:20Z,OWNER,,"It is increasingly being useful configuration options, when it started out as purely metadata. Could cause confusion with the `--config` mechanism though - maybe that should be called ""settings"" instead?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/493/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 492153532,MDU6SXNzdWU0OTIxNTM1MzI=,573,Exposing Datasette via Jupyter-server-proxy,82988,psychemedia,closed,0,,,,,3,2019-09-11T10:32:36Z,2020-03-26T09:41:30Z,2020-03-26T09:41:30Z,CONTRIBUTOR,,"It is possible to expose a running `datasette` service in a Jupyter environment such as a MyBinder environment using the [`jupyter-server-proxy`](https://github.com/jupyterhub/jupyter-server-proxy). For example, using [this demo Binder](https://mybinder.org/v2/gh/binder-examples/r/master?filepath=index.ipynb) which has the server proxy installed, we can then upload a simple test database from the notebook homepage, from a Jupyter termianl install datasette and set it running against the test db on eg port 8001 and then view it via the path `proxy/8001`. Clicking links results in 404s though because the `datasette` links aren't relative to the current path? ![image](https://user-images.githubusercontent.com/82988/64689964-44b69280-d487-11e9-8f9f-3681422bcc9f.png) ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/573/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 274264175,MDU6SXNzdWUyNzQyNjQxNzU=,102,datasette publish elasticbeanstalk,9599,simonw,closed,0,,,,,1,2017-11-15T18:48:31Z,2021-01-04T20:13:20Z,2021-01-04T20:13:19Z,OWNER,,"It looks like Elastic Beanstalk is the most convenient way to deploy a docker container to AWS without first deploying a cluster. https://aws.amazon.com/blogs/devops/dockerizing-a-python-web-app/ looks helpful. We would need to automate the deployment with Boto: http://boto3.readthedocs.io/en/latest/reference/services/elasticbeanstalk.html",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/102/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 449445715,MDU6SXNzdWU0NDk0NDU3MTU=,491,Figure out how to use Firebase with cloudrun to enable vanity URLs and CDN caching,9599,simonw,open,0,,,,,0,2019-05-28T19:48:06Z,2019-05-28T19:48:35Z,,OWNER,,"It looks like Firebase can solve a couple of problems with the existing `datasette publish cloudrun` hosting mechanism: * The URLs it produces aren't pretty enough. Firebase offers more control over vanity URLs. * CDN caching (as seen in `datasette publish now`) is great for improving performance and saving money on Cloud Run execution time. https://firebase.google.com/docs/hosting/cloud-run looks like it can help with both of these. 
Lots of interesting questions: * Should this be a new `datasette publish firebase` command or should it instead be implemented as additional custom options to `datasette publish cloudrun`? * How much harder does it become to do account setup? * How much will this option cost users?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/491/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 679808124,MDU6SXNzdWU2Nzk4MDgxMjQ=,940,Move CI to GitHub Issues,9599,simonw,closed,0,,,5818042,Datasette 0.49,20,2020-08-16T19:06:08Z,2020-09-14T22:09:35Z,2020-09-14T22:09:35Z,OWNER,,"It looks like the tests take 3m33s to run in GitHub Actions, but they're taking more than 8 minutes in Travis",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/940/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 395236066,MDU6SXNzdWUzOTUyMzYwNjY=,393,"CSV export in ""Advanced export"" pane doesn't respect query",1727065,ltrgoddard,closed,0,,,,,6,2019-01-02T12:39:41Z,2021-06-17T18:14:24Z,2019-01-03T02:44:10Z,NONE,,"It looks like there's an inconsistency when exporting to CSV via the the web interface. Say I'm looking at [songs released in 1989](https://fivethirtyeight.datasettes.com/fivethirtyeight-c300360/classic-rock%2Fclassic-rock-song-list?Release+Year__exact=1989) in the `classic-rock/classic-rock-song-list` table from the Five Thirty Eight data. The JSON and CSV export links at the top of the page both give me filtered data using `Release+Year__exact=1989` in the URL. In the `Advanced export` tab, though, the CSV option gives me the whole data set, while the JSON options preserve the query. It may be that this is intended behaviour related to the streaming CSV stuff [discussed here](https://github.com/simonw/datasette/issues/266), but if that's the case then I think it should be a little clearer.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/393/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 637342551,MDU6SXNzdWU2MzczNDI1NTE=,834,startup() plugin hook,9599,simonw,closed,0,,,5533512,Datasette 0.45,6,2020-06-11T21:48:14Z,2020-06-28T19:38:50Z,2020-06-13T17:56:12Z,OWNER,,"It might be useful to have an `startup` hook which gets passed the `datasette` object as soon as Datasette has finished initializing. My initial use-case for this is configuration verification - checking that the `""plugins""` configuration block for this plugin contains valid details. 
I imagine there are plenty of other potential uses for this as well.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/834/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1343422749,I_kwDOBm6k_c5QEwEd,1787,"Move ""datasette --get"" from Getting Started to CLI Reference",9599,simonw,closed,0,,,,,5,2022-08-18T17:53:39Z,2022-08-18T21:57:09Z,2022-08-18T21:56:21Z,OWNER,,It really shouldn't be here: https://docs.datasette.io/en/0.62/getting_started.html#datasette-get,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1787/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1170497629,I_kwDOBm6k_c5FxGBd,1662,[feature request] Publish to fully static website,32609395,contrun,closed,0,,,,,1,2022-03-16T03:32:28Z,2022-03-19T00:42:23Z,2022-03-19T00:42:23Z,NONE,,"It seems currently all datasette publish requires a real backend server which is able to query the database and send results back to the frontend. There are a few projects to on-demand download a portion of data from the database from a sqlite lite database url, and present it directly to the user. These methods leverages web assembly under the hood. I think datasette is a perfect use case for this technology. Below are a few examples of querying sqlite database from frontend directly. * [Using sqlite3 as a notekeeping document graph with automatic reference indexing](https://epilys.github.io/bibliothecula/notekeeping.html) * [Hosting SQLite databases on Github Pages - (or any static file hoster) - phiresky's blog](https://phiresky.github.io/blog/2021/hosting-sqlite-databases-on-github-pages/) * [Static torrent website with peer-to-peer queries over BitTorrent on 2M records](https://boredcaveman.xyz/post/0x2_static-torrent-website-p2p-queries.html)",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1662/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 832092321,MDU6SXNzdWU4MzIwOTIzMjE=,1261,Some links aren't properly URL encoded.,812795,brimstone,closed,0,,,,,3,2021-03-15T18:43:59Z,2021-03-21T02:06:44Z,2021-03-20T21:36:06Z,NONE,,"It seems like a percent sign in the query causes some links to end invalid. The json and CSV links on this page don't behave like expected: https://honeypot-brimston3.vercel.app/honeypot?sql=select+time%2C+count%28time%29+as+count+from+%28select+strftime%28%22%25Y-%25m-%25d%22%2C+_etime%29+as+time+from+ssh+%29+group+by+time+order+by+time%3B I can take a swing at trying to fix this, but my python isn't strong and I need a pointer at the right approach and files to change. Thanks!",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1261/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 813899472,MDU6SXNzdWU4MTM4OTk0NzI=,1238,Custom pages don't work with base_url setting,79913,tsibley,closed,0,,,,,9,2021-02-22T21:58:58Z,2021-06-05T18:59:55Z,2021-06-05T18:59:55Z,NONE,,"It seems that custom pages aren't routing properly when the `base_url` setting is used. To reproduce, with Datasette 0.55. 
Create a `templates/pages/custom.html` with some text. ``` mkdir -p templates/pages/ echo ""Hello, world!"" > templates/pages/custom.html ``` Start Datasette. ``` datasette --template-dir templates/ ``` Visit http://localhost:8001/custom and see ""Hello, world!"". Start Datasette with a `base_url`. ``` datasette --template-dir templates/ --setting base_url /prefix/ ``` Visit http://localhost:8001/prefix/custom and see a ""Database not found: custom"" 404. Note that like all routes, http://localhost:8001/custom still works when run with `base_url`. ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1238/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1075893249,I_kwDOBm6k_c5AINQB,1545,Custom pages don't work on windows,559711,ryascott,closed,0,,,,,3,2021-12-09T18:53:05Z,2022-02-03T02:08:31Z,2022-02-03T01:58:35Z,NONE,,"It seems that custom pages don't work when put in templates/pages To reproduce on datasette version 0.59.4 using PowerShell on WIndows 10 with Python 3.10.0 mkdir -p templates/pages echo ""hello world"" >> templates/pages/about.html Start datasette datasette --template-dir templates/ Navigate to [http://127.0.0.1:8001/about](url) and receive: Error 404: Database not found: about ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1545/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 314725342,MDU6SXNzdWUzMTQ3MjUzNDI=,217,Plugin support for datasette publish,9599,simonw,closed,0,,,,,1,2018-04-16T16:17:14Z,2018-07-26T05:33:39Z,2018-07-26T05:16:00Z,OWNER,,"It should be possible to support additional deployment options by writing a plugin (see #59). As part of this, rewrite the Heroku and Now publishers to be implemented as plugins (they will still ship with datasette by default). Maybe `datasette package` should be changed to being part of publish instead, `datasette publish docker` perhaps? Refs #14",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/217/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1469043836,I_kwDOBm6k_c5Xj9R8,1917,Don't allow writable API to edit the `_memory` database,9599,simonw,closed,0,,,7867486,Datasette 1.0a1,2,2022-11-30T04:51:59Z,2022-11-30T05:07:56Z,2022-11-30T05:07:55Z,OWNER,,"It shows up on https://latest.datasette.io/-/api (once you are signed in as root) - but there's no point in creating tables in it because they likely won't persist from one request to the next, as it's not a shared named database. 
",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1917/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 464779810,MDU6SXNzdWU0NjQ3Nzk4MTA=,541,Plugin hook for adding extra template context variables,9599,simonw,closed,0,,,,,2,2019-07-05T21:37:05Z,2019-07-06T00:05:59Z,2019-07-06T00:05:59Z,OWNER,,"It turns out I need this for https://github.com/simonw/datasette-auth-github/issues/5 It can be modelled on the `extra_body_script` hook: https://datasette.readthedocs.io/en/stable/plugins.html#extra-body-script-template-database-table-view-name-datasette",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/541/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 337141108,MDU6SXNzdWUzMzcxNDExMDg=,332,Sanely handle Infinity/-Infinity values in JSON using ?_json_infinity=1,9599,simonw,closed,0,,,,,12,2018-06-29T21:21:27Z,2018-07-24T04:53:20Z,2018-07-24T03:08:30Z,OWNER,,"It turns out if you load this CSV using `csvs-to-sqlite` you get an Infinity value in SQLite: ``` name,num sasha,10 terry,Inf cathy,0.5 ``` `csvs-to-sqlite infinity-bug.csv infinity-bug.db` I deployed this using: ``` datasette publish now infinity-bug.db --name=datasette-infinity-bug --install=datasette-vega ``` Datasette outputs that as `Infinity` in the JSON format, which causes JavaScript errors. Demo * https://datasette-infinity-bug.now.sh/infinity-bug-0d0224e/infinity-bug - HTML view works * https://datasette-infinity-bug.now.sh/infinity-bug-0d0224e/infinity-bug.json?_shape=array - this outputs the following: ``` [ { ""rowid"": 1, ""name"": ""sasha"", ""num"": 10.0 }, { ""rowid"": 2, ""name"": ""terry"", ""num"": Infinity }, { ""rowid"": 3, ""name"": ""cathy"", ""num"": 0.5 } ] ``` But... 
in Firefox that gets rendered like this: ![2018-06-29 at 4 20 pm](https://user-images.githubusercontent.com/9599/42115408-5d30f630-7bb8-11e8-8370-c8484801c49b.png) And if you click the ""Show charting options"" button you get this error in the console: ``` SyntaxError: JSON.parse: unexpected character at line 1 column 83 of the JSON data ``` ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/332/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 789336592,MDU6SXNzdWU3ODkzMzY1OTI=,1195,"view_name = ""query"" for the query page",9599,simonw,open,0,,,,,4,2021-01-19T20:21:36Z,2021-01-25T04:40:08Z,,OWNER,,It uses `view_name` of `database` at the moment which isn't as useful.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1195/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 587322443,MDU6SXNzdWU1ODczMjI0NDM=,710,Remove Zeit Now v1 support,9599,simonw,closed,0,,,,,2,2020-03-24T22:39:49Z,2020-04-04T23:05:12Z,2020-04-04T23:05:12Z,OWNER,,It will remain supported as a plugin but since no-one can sign up for Docker hosting any more (for over a year now) there's no point including it in Datasette core.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/710/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 718910318,MDU6SXNzdWU3MTg5MTAzMTg=,1015,Research: could Datasette install its own plugins?,9599,simonw,open,0,,,,,1,2020-10-11T19:33:06Z,2020-10-11T19:35:04Z,,OWNER,,"It would be cool if Datasette could offer a plugin browsing interface where users could install plugins by clicking ""Install"" on them - similar to how VS Code extensions work.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1015/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 267857622,MDU6SXNzdWUyNjc4NTc2MjI=,25,Endpoint that returns SQL ready to be piped into DB,9599,simonw,closed,0,,,,,2,2017-10-24T00:19:26Z,2017-11-15T05:11:12Z,2017-11-15T05:11:11Z,OWNER,,It would be cool if I could figure out a way to generate both the create table statements and the inserts for an individual table or the entire database and then stream them down to the client.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/25/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 418329842,MDU6SXNzdWU0MTgzMjk4NDI=,415,Add query parameter to hide SQL textarea,36796532,ad-si,closed,0,,,,,3,2019-03-07T14:11:30Z,2019-03-15T09:30:57Z,2019-03-15T05:22:43Z,NONE,,It would be cool if there was a query parameter to hide / remove the SQL textarea. 
Then I could simply save a bookmark for a certain query and open it to see the data without having to scroll below the (long) SQL query first.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/415/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 839367451,MDU6SXNzdWU4MzkzNjc0NTE=,1275,Idea: long-running query mode,9599,simonw,open,0,,,,,0,2021-03-24T05:23:20Z,2021-03-24T05:23:20Z,,OWNER,,"It would be cool if you could run Datasette in a long-running query mode, for use with trusted users - something like this: datasette --unlimited my.db This would disable the query limit, but would also enable a feature where if a query takes longer than e.g. 1s to return Datasette returns an HTML page to the browser with a progress indicator and polls the server until the query is complete.... but also provides the user with a ""cancel"" button. This relates to the `.interrupt()` research in #1270.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1275/reactions"", ""total_count"": 2, ""+1"": 2, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 613467382,MDU6SXNzdWU2MTM0NjczODI=,761,Allow-list pragma_table_info(tablename) and similar,9599,simonw,closed,0,,,,,8,2020-05-06T16:54:14Z,2020-05-07T03:09:05Z,2020-05-06T17:18:38Z,OWNER,,"It would be great if `pragma_table_info(tablename)` was allowed to be used in queries. See also https://github.com/simonw/til/blob/master/sqlite/list-all-columns-in-a-database.md > `select * from pragma_table_info(tablename);` is currently disallowed for user-provided queries via a regex restriction - but could help here too. > > https://github.com/simonw/datasette/blob/d349d57cdf3d577afb62bdf784af342a4d5be660/datasette/utils/__init__.py#L174 _Originally posted by @simonw in https://github.com/simonw/datasette/issues/760#issuecomment-624729459_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/761/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 341123355,MDU6SXNzdWUzNDExMjMzNTU=,342,Requesting support for query description,12617395,bsilverm,closed,0,,,,,4,2018-07-13T18:50:16Z,2018-07-24T04:53:21Z,2018-07-16T02:33:54Z,NONE,,"It would be great if the metadata file allowed you to enter a description for the query. We have a lot of pre-defined queries that can only be so descriptive by their name. It would be nice if an optional description could be included underneath the name within the UI, or on hover where it currently shows the SQL.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/342/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 910088936,MDU6SXNzdWU5MTAwODg5MzY=,1355,datasette --get should efficiently handle streaming CSV,9599,simonw,open,0,,,,,2,2021-06-03T04:40:40Z,2022-03-20T22:38:53Z,,OWNER,,"It would be great if you could use `datasette --get` to run queries that return streaming CSV data without running out of RAM. 
Current implementation looks like it loads the entire result into memory first: https://github.com/simonw/datasette/blob/f78ebdc04537a6102316d6dbbf6c887565806078/datasette/cli.py#L546-L552",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1355/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 845794436,MDU6SXNzdWU4NDU3OTQ0MzY=,1284,Feature or Documentation Request: Individual table as home page template,192568,mroswell,open,0,,,,,4,2021-03-31T03:56:17Z,2021-11-04T03:15:01Z,,CONTRIBUTOR,,"It would be great to have a sample showing how to move a single database that has a single table, to the index page. I'm trying it now, and find there is a real depth of Datasette and Python understanding that's required to be successful. I've got all the basic jinja concepts down... variables, template control structures, template inheritance, template overrides, css, html, the --template-dir and --static arguments, etc. But copying the table.html file to index.html doesn't work. There are undocumented functions and filters... I can figure some of them out (yay, url_builder.py and utils/__init__.py!) but it's a slog better handled by a much stronger Python developer. One sample would make a world of difference. The ideal form of this documentation would be a diff between the default table.html and how that would look if essentially moved to index.html. The use case is for everyone who wants to create a public-facing website to explore a single table at the root directory. (Maybe a second bit of documentation for people who have a single database with multiple tables.) (Hmm... might be cool to have a setting for that, where it happens automagically! If only one table, then home page is at the table level. if only one database, then home page is at the database level.... as an option.) I suppose I could ignore this, and somehow do this in the DNS settings once I hook up Vercel to a domain name, maybe.. and remove the breadcrumbs in table.html... but for now, a documentation request in the form of a diff... for viewing a single table (or a single database) at the root. (Actually, there's probably room for a whole expanded section on templates. Noticed some nice table metadata in one of the datasette examples, for instance... Hmm... maybe a whole library of solutions in one place... maybe a documentation hackathon! If that's of interest, of course it's a separate issue. ) ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1284/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 847700726,MDU6SXNzdWU4NDc3MDA3MjY=,1285,Feature Request or Plugin Request: Numeric Range Facets,192568,mroswell,open,0,,,,,0,2021-04-01T01:50:20Z,2021-04-01T02:28:19Z,,CONTRIBUTOR,,"It would be great to offer facets for numeric data ranges. The ranges could pull from typical GIS methods of creating choropleth maps. https://gisgeography.com/choropleth-maps-data-classification/ Of the following, for mapping, I've always preferred a Jenks Natural Breaks, or a cross between Jenks and Pretty breaks. 
- Equal Intervals - Quantile (equal count) - Standard Deviation - Natural Breaks (Jenks) Classification - Pretty Breaks - Some sort of Aggregate Jenks Classification (this isn't standard, but it would be nice to be able to set classification ranges that work across tables.) Here are some links for Natural Breaks, in case this method is unfamiliar. - https://en.wikipedia.org/wiki/Jenks_natural_breaks_optimization - http://wiki.gis.com/wiki/index.php/Jenks_Natural_Breaks_Classification - https://medium.com/analytics-vidhya/jenks-natural-breaks-best-range-finder-algorithm-8d1907192051 Per that last link, there is a Jenks Python module... They also describe it as data-intensive for larger datasets. Maybe this is a good plugin idea. An example of equal Intervals would be 0 – < 10 10 – < 20 20 – < 30 30 – < 40 It's kind of confusing to have that less-than sign in there. it could also be displayed as: 0 – 10 10 – 20 20 – 30 30 – 40 But then it's not completely clear which category 10 is in, for instance. (Best to right-justify.. and use an ""en dash"" between numbers.) ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1285/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 281110295,MDU6SXNzdWUyODExMTAyOTU=,173,I18n and L10n support,50138,janimo,open,0,,,,,2,2017-12-11T17:49:58Z,2021-04-26T12:10:01Z,,NONE,,It would be less geeky and more user friendly if the display strings in the filter menu and possibly other parts could be localized.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/173/reactions"", ""total_count"": 2, ""+1"": 2, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 646737558,MDU6SXNzdWU2NDY3Mzc1NTg=,870,Refactor default views to use register_routes,9599,simonw,open,0,,,,,10,2020-06-27T18:53:12Z,2022-03-15T20:07:18Z,,OWNER,,"It would be much cleaner if Datasette's default views were all registered using the new `register_routes()` plugin hook. Could dramatically reduce the code in `datasette/app.py`. > The ideal fix here would be to rework my `BaseView` subclass mechanism to work with `register_routes()` so that those views don't have any special privileges above plugin-provided views. _Originally posted by @simonw in https://github.com/simonw/datasette/issues/864#issuecomment-648580556_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/870/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 267707940,MDU6SXNzdWUyNjc3MDc5NDA=,14,Datasette Plugins,9599,simonw,closed,0,,,,,22,2017-10-23T15:15:28Z,2019-05-13T18:58:20Z,2019-05-13T18:58:19Z,OWNER,,"It would be neat if additional functionality could be opted-in to the system in the form of easy-to-add plugins, hosted as separate packages. First example: a Google Analytics plugin, which adds GA tracking code with your tracking ID to the web interface for your dataset. 
This may be an opportunity to experiment with entry points: http://amir.rachum.com/blog/2017/07/28/python-entry-points/",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/14/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 278191223,MDU6SXNzdWUyNzgxOTEyMjM=,159,Come up with an elegant mechanism for per-row template customization,9599,simonw,closed,0,,,2949431,Custom templates edition,0,2017-11-30T16:47:26Z,2017-12-07T06:12:27Z,2017-12-07T06:12:26Z,OWNER,,It would be nice if customizing the display of an individual row in a custom table template was as simple as possible - refs #153 ,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/159/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 836123030,MDU6SXNzdWU4MzYxMjMwMzA=,1265,Support for HTTP Basic Authentication,468612,yunzheng,closed,0,,,,,3,2021-03-19T15:31:09Z,2021-03-19T22:05:12Z,2021-03-19T21:03:09Z,NONE,,"It would be nice if datasette could support [HTTP Basic Authentication](https://en.wikipedia.org/wiki/Basic_access_authentication). For now I could ofcourse leverage Nginx for basic authentication, but it would be nice to have support for this in datasette by default or via a plugin like datasette-auth-github. My main usecase is to put the whole datasette instance behind a username/password prompt via Basic Auth and not specific urls.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1265/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 764059235,MDU6SXNzdWU3NjQwNTkyMzU=,1143,"More flexible CORS support in core, to encourage good security practices",114388,yurivish,open,0,,,3268330,Datasette 1.0,6,2020-12-12T17:06:35Z,2022-02-13T17:41:17Z,,NONE,,"It would be nice if the `--cors` option accepted an origin regex to more securely allow secure local development. As an example, Observable notebooks namespace every user's notebooks by their username and user content is served from username.observableusercontent.com, so you would set `--cors-origin username.observableusercontent.com` to restrict access to a local development Datasette instance to only your own notebooks, rather than exposing the data to any website that makes a request. Thank you for all of your work on Datasette!",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1143/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 494685791,MDU6SXNzdWU0OTQ2ODU3OTE=,574,Improve usage description of --host option,132978,terrycojones,closed,0,,,,,2,2019-09-17T15:12:12Z,2019-11-01T21:58:17Z,2019-11-01T21:57:54Z,NONE,,"It would be nice if the `--host` option had a clearer description. I tried to get datasette running on an AWS instance and it took a while to realize it was only listening on localhost. So I wanted to make it listen on an non-localhost interface and tried giving a couple of values to `--host` (a host name, then an interface name), but none of them did. In the end I read the source to see that the option is passed to `uvicorn` and looked at the uvicorn docs, which also didn't help. 
Then I searched the web for ""example running datasette on a host"" which led me to https://github.com/simonw/datasette/issues/514 where I saw someone using `-h 0.0.0.0`. I tried that and it works. That usage could be mentioned somewhere, and might save someone else some time.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/574/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 313837303,MDU6SXNzdWUzMTM4MzczMDM=,203,Support for units,45057,russss,closed,0,,,,,10,2018-04-12T18:24:28Z,2018-04-16T21:59:17Z,2018-04-16T21:59:17Z,CONTRIBUTOR,,"It would be nice to be able to attach a unit to a column in the metadata, and have it rendered with that unit (and SI prefix) when it's displayed. It would also be nice to support entering the prefixes in variables when querying. With my radio licensing app I've put all frequencies in Hz. It's easy enough to special-case the row rendering to add the SI prefixes, but it's pretty unusable when querying by that field.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/203/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 438259941,MDU6SXNzdWU0MzgyNTk5NDE=,440,Plugin hook for additional data export formats,45057,russss,closed,0,,,,,0,2019-04-29T11:01:39Z,2019-05-01T23:01:57Z,2019-05-01T23:01:57Z,CONTRIBUTOR,,"It would be nice to have a simple way for plugins to provide additional data export formats. Might require a bit of work on the internals. I can work around this at a lower level with the `prepare_sanic` hook from #437 in the mean time. I guess plugins should be able to register a function which takes a row or list of rows and returns the rendered data. They'll also need to provide a file extension and probably a Content-Type. Datasette could then automatically include this format in the list of export formats on each page. Looks like this is related to #119.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/440/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 440437037,MDU6SXNzdWU0NDA0MzcwMzc=,454,Plugin for allowing CORS from specified hosts,9599,simonw,closed,0,9599,simonw,,,5,2019-05-05T12:05:02Z,2019-10-03T23:59:57Z,2019-10-03T23:59:56Z,OWNER,,"It would be useful if Datasette could be configured to allow CORS requests from one or more origins, as opposed to only allowing either none or `""*""`. 
This is slightly tricky because the `Access-Control-Allow-Origin: https://foo.example` header is only allowed to return one value per request - and according to https://developer.mozilla.org/en-US/docs/Web/HTTP/CORS ""The Access-Control-Allow-Origin header should contain the value that was sent in the request's Origin header."" This means the application code needs to have a whitelist of allowed hosts and code that dynamically changes the outgoing `Access-Control-Allow-Origin` header based on the `Origin` header from the incoming request.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/454/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 324162476,MDU6SXNzdWUzMjQxNjI0NzY=,271,Mechanism for automatically picking up changes when on-disk .db file changes,9599,simonw,closed,0,,,,,4,2018-05-17T19:53:15Z,2019-01-10T21:35:18Z,2019-01-10T21:35:18Z,OWNER,,"It would be useful if Datasette could spot when a SQLite database file changes on disk and restart itself (hence re-running .inspect() and picking up the new content hash). Ideally this could happen in an atomic way so no requests get dropped during the switch-over. This may not play well with SQLite opening databases in immutable mode. Research required.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/271/reactions"", ""total_count"": 2, ""+1"": 2, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 603295970,MDU6SXNzdWU2MDMyOTU5NzA=,729,Visually distinguish integer and text columns,9599,simonw,closed,0,,,,,8,2020-04-20T14:47:26Z,2020-05-18T17:20:02Z,2020-05-15T18:16:56Z,OWNER,,It would be useful if I could tell from looking at the table page if a column was a integer or a text (or a float I guess?). This is particularly important for knowing if it safe to sort by that column.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/729/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 274374317,MDU6SXNzdWUyNzQzNzQzMTc=,108,"Include version in python code, output in template",9599,simonw,closed,0,,,,,0,2017-11-16T02:32:40Z,2017-11-16T15:30:04Z,2017-11-16T15:30:04Z,OWNER,,It would be useful if I could tell which version of datasette was running on a site. Embed version number and include it in maybe a tooltip on the "powered by datasette" link,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/108/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 273878873,MDU6SXNzdWUyNzM4Nzg4NzM=,91,"Option to serve databases from a different prefix, serve regular content elsewhere",9599,simonw,closed,0,,,,,1,2017-11-14T17:32:46Z,2017-12-10T03:07:58Z,2017-12-10T03:07:53Z,OWNER,,"It would be useful if the databases themselves could be served from a prefix e.g. datasette serve mydb.db --path-prefix=db Now my database is at `http://localhost:8001/db/mydb-23423` This would free up the rest of the URL namespace for other things. Maybe we could have an option to serve static content from a known folder e.g. 
datasette serve mydb.db --path-prefix=db --root-content=~/my-project/static Now a hit to `http://localhost:8001/news/` serves content from `~/my-project/static/news/index.html` This would make it trivial to package up entire HTML/CSS/JS apps with one or more underlying SQLite databases. Running without `--cors` would be fine here because any JS apps would be hosted on the same origin.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/91/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 330826972,MDU6SXNzdWUzMzA4MjY5NzI=,308,"Support extra Heroku apps:create options - region, space, team",78156,annapowellsmith,open,0,,,,,2,2018-06-08T23:08:33Z,2018-09-21T14:09:28Z,,NONE,,"It would be useful to document how to pass Heroku CLI options on `datasette publish`, e.g. `--region eu`.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/308/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 541467590,MDU6SXNzdWU1NDE0Njc1OTA=,654,Template debug mode that outputs template context,9599,simonw,closed,0,,,,,3,2019-12-22T15:51:25Z,2019-12-22T16:13:11Z,2019-12-22T16:04:51Z,OWNER,,It would make writing templates (including custom templates) easier if there was an option to dump out the full template context - maybe `?_context=1`,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/654/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 576711589,MDU6SXNzdWU1NzY3MTE1ODk=,695,Update SQLite bundled with Docker container,9599,simonw,closed,0,,,,,7,2020-03-06T05:42:12Z,2020-03-08T23:33:23Z,2020-03-06T06:15:27Z,OWNER,,"It's 3.26.0 at the moment: https://github.com/simonw/datasette/blob/af9cd4ca64652fae262e6f7b5d201f6e0adc989b/Dockerfile#L9-L11 Most recent release is 3.31.1: https://www.sqlite.org/releaselog/3_31_1.html",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/695/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1781022369,I_kwDOBm6k_c5qKD6h,2091,Drop support for Python 3.7,9599,simonw,closed,0,,,,,3,2023-06-29T15:06:38Z,2023-08-23T18:18:18Z,2023-08-23T18:18:18Z,OWNER,,"It's EOL now, as of 2023-06-27 (two days ago): https://devguide.python.org/versions/ ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2091/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 318692953,MDU6SXNzdWUzMTg2OTI5NTM=,242,Rename ?_sql_time_limit_ms= to ?_timelimit=,9599,simonw,closed,0,,,,,0,2018-04-29T06:11:35Z,2018-05-02T00:20:42Z,2018-05-02T00:20:42Z,OWNER,,It's a bit of a mouthful at the moment.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/242/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 585597329,MDU6SXNzdWU1ODU1OTczMjk=,704,Add datasette-publish-fly to Datasette Publish documentation,9599,simonw,closed,0,,,5234079,Datasette 
0.39,1,2020-03-21T22:25:10Z,2020-03-24T22:39:09Z,2020-03-24T22:39:09Z,OWNER,,It's a cool example of a plugin that provides a new publish provider - worth mentioning on https://datasette.readthedocs.io/en/stable/publish.html,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/704/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 610342575,MDU6SXNzdWU2MTAzNDI1NzU=,748,?_searchmode=raw should be documented on full-text search page,9599,simonw,closed,0,,,,,0,2020-04-30T19:50:06Z,2020-04-30T21:06:12Z,2020-04-30T21:06:12Z,OWNER,,"It's currently documented here: https://datasette.readthedocs.io/en/stable/json_api.html#special-table-arguments But it should also be described here: https://datasette.readthedocs.io/en/stable/full_text_search.html#the-table-view-api",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/748/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1423364990,I_kwDOBm6k_c5U1tN-,1858,`max_signed_tokens_ttl` setting for a maximum duration on API tokens,9599,simonw,closed,0,,,8658075,Datasette 1.0a0,4,2022-10-26T03:05:53Z,2022-11-15T19:58:52Z,2022-10-27T03:15:05Z,OWNER,,"It's currently possible to use `/-/create-token` to create a token that lasts forever. Some administrators may wish to have a maximum expiry instead. I should support that with a setting.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1858/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 323223872,MDU6SXNzdWUzMjMyMjM4NzI=,260,Validate metadata.json on startup,9599,simonw,open,0,,,,,7,2018-05-15T13:42:56Z,2023-06-21T12:51:22Z,,OWNER,,"It's easy to misspell the name of a database or table and then be puzzled when the metadata settings silently fail. To avoid this, let's sanity check the provided metadata.json on startup and quit with a useful error message if we find any obvious mistakes.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/260/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1425011030,I_kwDOBm6k_c5U7_FW,1862,"Create a new table from one or more records, `sqlite-utils` style",9599,simonw,closed,0,,,8658075,Datasette 1.0a0,5,2022-10-27T04:25:02Z,2022-11-15T19:59:47Z,2022-11-15T06:42:09Z,OWNER,,"It's interesting to also think about what the form-based UI for this could look like - since that would involve users creating new columns of different types on the fly. 
Will need the `create-table` permission.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1862/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 443038584,MDU6SXNzdWU0NDMwMzg1ODQ=,465,Decide what to do about /-/inspect,9599,simonw,closed,0,,,,,4,2019-05-11T21:39:46Z,2019-06-28T16:34:33Z,2019-06-28T16:34:33Z,OWNER,,"It's not clear to me what this endpoint should do now as a result of #419 - it's still useful to be able to introspect databases for tools like datasette-registry, but since we aren't pre-calculating introspection data any more I need to rethink the approach. For one thing, this endpoint may need to be paginated. Or maybe it should be split up into separate endpoints for each connected database? Those should probably be paginated too seeing as fivethirtyeight has 400+ tables.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/465/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1109808154,I_kwDOBm6k_c5CJlQa,1608,Documentation should clarify /stable/ vs /latest/,9599,simonw,closed,0,,,,,15,2022-01-20T22:02:59Z,2023-03-26T23:41:12Z,2022-01-20T22:53:17Z,OWNER,,"It's not currently clear what the difference between https://docs.datasette.io/en/latest/ and https://docs.datasette.io/en/stable/ is - I should fix that. On Twitter: https://twitter.com/simonw/status/1484285006243528705",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1608/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1579973223,I_kwDOBm6k_c5eLHpn,2024,Mention WAL mode in documentation,9599,simonw,open,0,,,,,1,2023-02-10T16:11:10Z,2023-02-10T16:11:53Z,,OWNER,,It's not currently obvious from the docs how you can ensure that Datasette runs well in situations where other processes may update the underlying SQLite files.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2024/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 467790646,MDU6SXNzdWU0Njc3OTA2NDY=,560,CodeMirror fails to load on database page,9599,simonw,closed,0,,,,,3,2019-07-14T03:31:00Z,2019-09-03T01:03:02Z,2019-07-14T03:38:59Z,OWNER,,"It's not loading on https://latest.datasette.io/fixtures But it does load on https://latest.datasette.io/fixtures?sql=select+*+from+facetable",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/560/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1170355774,I_kwDOBm6k_c5FwjY-,1661,Remove Hashed URL mode,9599,simonw,closed,0,,,3268330,Datasette 1.0,10,2022-03-15T23:13:56Z,2022-03-19T00:37:37Z,2022-03-19T00:37:36Z,OWNER,,"It's now handled by a plugin instead: - #647 - https://github.com/simonw/datasette-hashed-urls/issues/3 https://github.com/simonw/datasette-hashed-urls Sub-tasks: - [x] Remove hashed URL mode implementation - [x] Update documentation - [x] Ensure `--setting hash_urls 1` shows a useful message",107914493,datasette,issue,,,"{""url"": 
""https://api.github.com/repos/simonw/datasette/issues/1661/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 470542938,MDU6SXNzdWU0NzA1NDI5Mzg=,562,Facet by array shouldn't suggest for arrays that are not arrays-of-strings,9599,simonw,closed,0,,,,,2,2019-07-19T20:51:29Z,2019-11-01T19:42:10Z,2019-11-01T19:37:55Z,OWNER,,"It's triggering for arrays that look like this at the moment: ```json [ { ""type"": ""HKWorkoutEventTypeSegment"", ""date"": ""2019-05-21 09:43:50 -0700"", ""duration"": ""12.2780519704024"", ""durationUnit"": ""min"" }, { ""type"": ""HKWorkoutEventTypeSegment"", ""date"": ""2019-05-21 09:43:50 -0700"", ""duration"": ""19.467273102204"", ""durationUnit"": ""min"" } ] ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/562/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1525815985,I_kwDOBm6k_c5a8hqx,1983,Make CustomJSONEncoder a documented public API,9599,simonw,open,0,,,,,3,2023-01-09T15:27:05Z,2023-01-09T15:35:58Z,,OWNER,,It's used by `datasette-geojson` here: https://github.com/eyeseast/datasette-geojson/commit/902bf135a5a33a0dc8264673d00a59a67cb05152,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1983/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 574035432,MDU6SXNzdWU1NzQwMzU0MzI=,692,is_hidden_table context variable on table.html page,9599,simonw,open,0,,,,,1,2020-03-02T15:03:25Z,2020-03-02T15:03:48Z,,OWNER,,It's useful to know if a table is hidden when rendering that page. `datasette-configure-fts` for example may want to disallow enabling search on hidden tables.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/692/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 273248366,MDU6SXNzdWUyNzMyNDgzNjY=,69,Enforce pagination (or at least limits) for arbitrary custom SQL,9599,simonw,closed,0,,,2857392,Ship first public release,4,2017-11-12T17:21:33Z,2017-11-13T20:32:47Z,2017-11-13T19:35:47Z,OWNER,,"It's way too easy to accidentally trigger a page that returns 100,000 rows at the moment. I need to use the LIMIT clause on views and custom SQL - I can support pagination ""next"" links using offset as well.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/69/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 268469569,MDU6SXNzdWUyNjg0Njk1Njk=,39,Protect against malicious SQL that causes damage even though our DB is immutable,9599,simonw,closed,0,,,2857392,Ship first public release,4,2017-10-25T16:44:27Z,2021-08-17T23:52:07Z,2017-11-05T02:53:47Z,OWNER,,"Iโ€™m currently operating under the assumption that itโ€™s safe to allow arbitrary SQL statements because we are dealing with an immutable database. But this might not be the case - there are some pretty weird SQLite language extensions (ATTACH, PRAGMA etc) and Iโ€™m not certain they cannot be used to break things in a way that would affect future requests to the API. Solution: provide a โ€œsafe modeโ€ option which disables the ?sql= mechanism. 
This still leaves the URL filter lookups, so I need to make sure that those are ""safe"". In the future I may also implement a whitelist option where datasets can be configured to only allow specific filters against specific columns.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/39/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 567902704,MDU6SXNzdWU1Njc5MDI3MDQ=,675,--cp option for datasette publish and datasette package for shipping additional files and directories,141844,aviflax,open,0,,,,,12,2020-02-19T22:55:56Z,2020-12-28T18:49:21Z,,NONE,,"I'm working on integrating Datasette into a documentation-oriented publishing workflow internally in my company, and in order to deploy the Docker image created by `datasette package` I need to add an additional file to the image - in my case, it's a sort of a deployment directive. I've worked out a way to do this after the image has been created, but it's convoluted and brittle. So it'd be excellent if there was an additional option for this command, something like, like, `--copy`. I'd envision it looking something like: ```shell $ datasette package --copy /the/source/path:/the/target/path data.db ``` I'd be happy to help design, specify, implement, and test this feature, if you'd be interested. Thanks for the fantastic tools!",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/675/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 612378203,MDU6SXNzdWU2MTIzNzgyMDM=,757,Question: Any fixed date for the release with the uft8-encoding fix?,2181410,clausjuhl,closed,0,,,,,3,2020-05-05T06:51:20Z,2020-05-06T18:41:29Z,2020-05-06T18:41:29Z,NONE,,Just a little impatient :),107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/757/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 649702801,MDU6SXNzdWU2NDk3MDI4MDE=,888,URLs in release notes point to 127.0.0.1,3243482,abdusco,closed,0,,,,,1,2020-07-02T07:28:04Z,2020-09-15T20:39:50Z,2020-09-15T20:39:49Z,CONTRIBUTOR,,"Just a quick heads up: Release notes for 0.45 include urls that point to localhost.
https://github.com/simonw/datasette/releases/tag/0.45",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/888/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 273127117,MDU6SXNzdWUyNzMxMjcxMTc=,55,Ship first version to PyPI,9599,simonw,closed,0,,,2857392,Ship first public release,2,2017-11-11T07:38:48Z,2017-11-13T21:19:43Z,2017-11-13T21:19:43Z,OWNER,,"Just before doing this, update the Dockerfile template to `pip install datasette` https://github.com/simonw/datasette/blob/65e350ca2a4845c25752a62c16ba58cfe2c14b9b/datasette/utils.py#L125",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/55/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 912419349,MDU6SXNzdWU5MTI0MTkzNDk=,1359,`?_trace=1` should only be available with a new `trace_debug` setting,9599,simonw,closed,0,,,,,0,2021-06-05T19:59:27Z,2021-06-05T20:18:46Z,2021-06-05T20:18:46Z,OWNER,,Just like template debug mode is controlled by this off-by-default setting: https://github.com/simonw/datasette/blob/368aa5f1b16ca35f82d90ff747023b9a2bfa27c1/datasette/app.py#L160-L164,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1359/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1082746149,I_kwDOBm6k_c5AiWUl,1560,"Table page title has ""where where"" in it",9599,simonw,closed,0,,,7571612,Datasette 0.60,0,2021-12-17T00:05:48Z,2022-01-13T22:28:35Z,2022-01-13T22:20:15Z,OWNER,,"Just noticed this while working on #1518. ``` % curl -s 'https://latest.datasette.io/fixtures/facetable?_sort=pk&on_earth__exact=1' | grep -C 1 '' <head> <title>fixtures: facetable: 14 rows where where on_earth = 1 sorted by pk ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1560/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1431786951,I_kwDOBm6k_c5VV1XH,1876,SQL query should wrap on SQL interrupted screen,9599,simonw,closed,0,,,,,2,2022-11-01T17:14:01Z,2022-11-01T17:22:33Z,2022-11-01T17:22:33Z,OWNER,,"Just saw this: ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1876/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 735644513,MDU6SXNzdWU3MzU2NDQ1MTM=,1081,"Fixtures should use FTS4 or FTS5, not FTS3",9599,simonw,closed,0,,,6055094,Datasette 0.52,0,2020-11-03T21:24:13Z,2020-11-12T00:03:00Z,2020-11-12T00:02:59Z,OWNER,,"Just spotted that `fixtures.db` uses FTS3, which is pretty much obsolete these days. 
https://github.com/simonw/datasette/blob/13d1228d80c91d382a05b1a9549ed02c300ef851/tests/fixtures.py#L488-L489",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1081/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 447469253,MDU6SXNzdWU0NDc0NjkyNTM=,485,Improvements to table label detection ,9599,simonw,open,0,9599,simonw,,,10,2019-05-23T06:19:49Z,2022-10-03T00:04:42Z,,OWNER,,"Label detection doesn't work if the primary key is called pk rather than id, so this page doesn't work: https://latest.datasette.io/fixtures/roadside_attraction_characteristics Code is here: https://github.com/simonw/datasette/blob/cccea85be6aaaeadb31f3b588ec7f732628815f5/datasette/app.py#L644-L653",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/485/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1174423568,I_kwDOBm6k_c5GAEgQ,1670,Ship Datasette 0.61,9599,simonw,closed,0,,,,,6,2022-03-20T02:47:54Z,2022-03-23T18:32:32Z,2022-03-23T18:32:03Z,OWNER,,"Let the alpha bake for a while, since #1668 is a big last-minute change. After shipping, release a new `datasette-hashed-urls` that depends on it, also this: - https://github.com/simonw/datasette-hashed-urls/issues/11",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1670/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 273191806,MDU6SXNzdWUyNzMxOTE4MDY=,66,Show table SQL on table page,9599,simonw,closed,0,,,2857392,Ship first public release,1,2017-11-12T01:51:23Z,2017-11-12T21:17:29Z,2017-11-12T21:17:29Z,OWNER,,"Let's do the SQL for the table you are looking at, plus SQL for any indexes that mention that table. The page for a view should show the SQL for that view.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/66/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 275176094,MDU6SXNzdWUyNzUxNzYwOTQ=,134,Filtered table view should show a count,9599,simonw,closed,0,,,2919870,Foreign key edition,1,2017-11-19T17:26:53Z,2017-11-19T18:10:49Z,2017-11-19T18:10:49Z,OWNER,,Let's do the thing where we attempt to show an accurate count if it can be done in less than 50ms,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/134/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 271242824,MDU6SXNzdWUyNzEyNDI4MjQ=,45,Run SQLite operations in a thread pool,9599,simonw,closed,0,,,2857392,Ship first public release,0,2017-11-05T02:27:12Z,2017-11-05T02:27:34Z,2017-11-05T02:27:33Z,OWNER,,"Let's run SQLite operations in threads, so we don't end up blocking our core event loop. 
These articles are helpful: * https://pymotw.com/3/asyncio/executors.html * https://marlinux.wordpress.com/2017/05/19/python-3-6-asyncio-sqlalchemy/ ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/45/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 733829385,MDU6SXNzdWU3MzM4MjkzODU=,1077,database_actions plugin hook,9599,simonw,closed,0,,,6055094,Datasette 0.52,3,2020-10-31T23:48:12Z,2020-11-02T18:43:25Z,2020-11-02T18:29:50Z,OWNER,,Like `column_actions` but adds a cog menu to the database page.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1077/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 750087350,MDU6SXNzdWU3NTAwODczNTA=,1108,Configure /en/stable/config.html redirect when I ship 0.52,9599,simonw,closed,0,,,6055094,Datasette 0.52,1,2020-11-24T21:39:19Z,2020-11-29T02:42:42Z,2020-11-29T02:42:42Z,OWNER,,"Like this: _Originally posted by @simonw in https://github.com/simonw/datasette/issues/1106#issuecomment-733248437_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1108/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 323677499,MDU6SXNzdWUzMjM2Nzc0OTk=,265,Add links to example Datasette instances to appropiate places in docs,9599,simonw,closed,0,,,,,5,2018-05-16T15:40:20Z,2018-06-18T15:52:15Z,2018-06-18T15:52:15Z,OWNER,,"Links to working examples would really help, especially on these pages: * http://datasette.readthedocs.io/en/latest/json_api.html * http://datasette.readthedocs.io/en/latest/sql_queries.html * http://datasette.readthedocs.io/en/latest/facets.html * http://datasette.readthedocs.io/en/latest/full_text_search.html",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/265/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1108300685,I_kwDOBm6k_c5CD1ON,1604,Option to assign a domain/subdomain using `datasette publish cloudrun`,9599,simonw,open,0,,,,,1,2022-01-19T16:21:17Z,2022-01-19T16:23:54Z,,OWNER,,Looks like this API should be able to do that: https://twitter.com/steren/status/1483835859191304192 - https://cloud.google.com/run/docs/reference/rest/v1/namespaces.domainmappings/create,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1604/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 722673818,MDU6SXNzdWU3MjI2NzM4MTg=,1023,Fix issues relating to base_url,9599,simonw,closed,0,,,6026070,0.51,3,2020-10-15T21:02:06Z,2020-11-24T19:51:44Z,2020-10-31T20:51:01Z,OWNER,,Lots of `base_url` bugs that I'd like to solve at once.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1023/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 577578306,MDU6SXNzdWU1Nzc1NzgzMDY=,697,index.html is not reliably loaded from a 
plugin,9599,simonw,closed,0,,,,,7,2020-03-08T22:37:55Z,2020-03-08T23:33:28Z,2020-03-08T23:11:27Z,OWNER,,"Lots of detail in https://github.com/simonw/datasette-search-all/issues/2 - short version is that I have a plugin with its own `index.html` template and Datasette intermittently fails to load it and uses the default `index.html` that ships with Datasette instead. Related: * #689: ""Templates considered"" comment broken in >=0.35 * #693: Variables from extra_template_vars() not exposed in _context=1 (may as well fix this while I'm in there)",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/697/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 899169307,MDU6SXNzdWU4OTkxNjkzMDc=,1338,Fix jinja2 warnings,9599,simonw,closed,0,,,,,0,2021-05-24T01:38:23Z,2021-05-24T01:41:55Z,2021-05-24T01:41:55Z,OWNER,,"Lots of these in the test suite now, after the Jinja upgrade in #1331: ``` tests/test_plugins.py::test_hook_render_cell_link_from_json datasette/tests/plugins/my_plugin_2.py:45: DeprecationWarning: 'jinja2.escape' is deprecated and will be removed in Jinja 3.1. Import 'markupsafe.escape' instead. label=jinja2.escape(data[""label""] or """") or "" "", tests/test_plugins.py::test_hook_render_cell_link_from_json datasette/tests/plugins/my_plugin_2.py:41: DeprecationWarning: 'jinja2.Markup' is deprecated and will be removed in Jinja 3.1. Import 'markupsafe.Markup' instead. return jinja2.Markup( ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1338/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1102612922,I_kwDOBm6k_c5BuIm6,1597,"""datasette inspect"" has no help summary",9599,simonw,closed,0,,,,,1,2022-01-14T00:02:16Z,2022-01-14T00:07:36Z,2022-01-14T00:07:36Z,OWNER,,"Made obvious by the new CLI reference page added in #1594. https://docs.datasette.io/en/latest/cli-reference.html#datasette-inspect-help ``` Commands: serve* Serve up specified SQLite database files with a web UI inspect install Install Python packages - e.g. ``` ``` Usage: datasette inspect [OPTIONS] [FILES]... Options: --inspect-file TEXT --load-extension TEXT Path to a SQLite extension to load --help Show this message and exit. ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1597/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1471969984,I_kwDOBm6k_c5XvHrA,1926,Release notes for 1.0a1 (and release it),9599,simonw,closed,0,,,7867486,Datasette 1.0a1,1,2022-12-01T21:18:12Z,2022-12-01T22:06:13Z,2022-12-01T22:06:12Z,OWNER,,"Mainly CORS support and a few small bug fixes. 
Changes: https://github.com/simonw/datasette/compare/1.0a0...99da46f7258225fc6fd8e94ddc20859ccccc4109",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1926/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 629541395,MDU6SXNzdWU2Mjk1NDEzOTU=,795,response.set_cookie() method,9599,simonw,closed,0,,,5512395,Datasette 0.44,2,2020-06-02T21:57:05Z,2020-06-09T22:33:33Z,2020-06-09T22:19:48Z,OWNER,,"Mainly to clean up this code: https://github.com/simonw/datasette/blob/4fa7cf68536628344356d3ef8c92c25c249067a0/datasette/app.py#L439-L454",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/795/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 629459637,MDU6SXNzdWU2Mjk0NTk2Mzc=,792,"Replace response.body.decode(""utf8"") with response.text in tests",9599,simonw,closed,0,,,,,0,2020-06-02T19:32:24Z,2020-06-02T21:29:58Z,2020-06-02T21:29:58Z,OWNER,,"Make use of the `response.text` property to clean up the tests a tiny bit: https://github.com/simonw/datasette/blob/57cf5139c552cb7feab9947daa949ca434cc0a66/tests/fixtures.py#L26-L38",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/792/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1374626873,I_kwDOBm6k_c5R7yQ5,1810,Featured table(s) on the homepage,9599,simonw,open,0,,,,,4,2022-09-15T14:30:49Z,2022-09-15T15:51:25Z,,OWNER,,"Many Datasette instances mainly exist to serve a single table - for example: - https://global-power-plants.datasettes.com/global-power-plants/global-power-plants - https://laion-aesthetic.datasette.io/laion-aesthetic-6pls/images It would be neat if the / homepage of those instances could be configured to highlight that specific table. Or maybe more than one?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1810/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 963528457,MDU6SXNzdWU5NjM1Mjg0NTc=,1425,render_cell() hook should support returning an awaitable,9599,simonw,closed,0,,,,,11,2021-08-08T22:32:29Z,2021-08-09T07:14:35Z,2021-08-09T03:00:37Z,OWNER,,"Many of the plugin hooks can return an awaitable - e.g. https://docs.datasette.io/en/stable/plugin_hooks.html#plugin-hook-extra-template-vars - but `render_cell()` doesn't support this. I recently found myself wanting to execute an additional SQL query from that hook, but it wasn't possible to do that since I couldn't use `await`.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1425/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 268176505,MDU6SXNzdWUyNjgxNzY1MDU=,34,Support CSV export with a .csv extension,9599,simonw,closed,0,,,,,1,2017-10-24T20:34:43Z,2021-06-17T18:14:48Z,2018-05-28T20:45:34Z,OWNER,,"Maybe do this using streaming with multiple pagination SQL queries so we can support arbitrarily large exports. How would this work against a view which doesn't have an obvious efficient pagination mechanism? Maybe limit views to up to 1000 exported records?
Relates to #5 ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/34/reactions"", ""total_count"": 2, ""+1"": 2, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 268592894,MDU6SXNzdWUyNjg1OTI4OTQ=,43,"While running, server should spot new db files added to its directory ",9599,simonw,closed,0,,,2859414,v1 stretch goals,1,2017-10-26T00:32:37Z,2017-11-14T08:25:53Z,2017-11-14T08:25:37Z,OWNER,,"Maybe in each request it checks the time and if 5s has elapsed since t last scanned the directory it scans it again This would allow people with dedicated hosting to run the app there and just upload new datasets whenever they want. It would also be very convenient for development.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/43/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1808116827,I_kwDOBm6k_c5rxaxb,2103,data attribute on Datasette tables exposing the primary key of the row,9599,simonw,open,0,,,,,0,2023-07-17T16:18:25Z,2023-07-17T16:18:25Z,,OWNER,,Maybe put it on the `` but probably better to go on the `td.type-pk`.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2103/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 315327860,MDU6SXNzdWUzMTUzMjc4NjA=,223,datasette publish --install=name-of-plugin,9599,simonw,closed,0,,,,,3,2018-04-18T04:33:59Z,2018-04-18T14:56:17Z,2018-04-18T14:56:17Z,OWNER,,Mechanism for causing datasette publish and datasette package to install one or more additional plugins using `pip install` - refs #14 ,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/223/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1855885427,I_kwDOBm6k_c5unpBz,2143,De-tangling Metadata before Datasette 1.0,15178711,asg017,open,0,,,,,24,2023-08-18T00:51:50Z,2023-08-24T18:28:27Z,,CONTRIBUTOR,,"Metadata in Datasette is a really powerful feature, but is a bit difficult to work with. It was initially a way to add ""metadata"" about your ""data"" in Datasette instances, like descriptions for databases/tables/columns, titles, source URLs, licenses, etc. But it later became the go-to spot for other Datasette features that have nothing to do with metadata, like permissions/plugins/canned queries. Specifically, I've found the following problems when working with Datasette metadata: 1. Metadata cannot be updated without re-starting the entire Datasette instance. 2. The `metadata.json`/`metadata.yaml` has become a kitchen sink of unrelated (imo) features like plugin config, authentication config, canned queries 3. The Python APIs for defining extra metadata are a bit awkward (the `datasette.metadata()` class, `get_metadata()` hook, etc.) ## Possible solutions Here's a few ideas of Datasette core changes we can make to address these problems. ### Re-vamp the Datasette Python metadata APIs The Datasette object has a single `datasette.metadata()` method that's a bit difficult to work with. There's also no Python API for inserted new metadata, so plugins have to rely on the `get_metadata()` hook. 
The `get_metadata()` hook can also be improved - it doesn't work with async functions yet, so you're quite limited to what you can do. (I'm a bit fuzzy on what to actually do here, but I imagine it'll be very small breaking changes to a few Python methods) ### Add an optional `datasette_metadata` table Datasette should detect and use metadata stored in a new special table called `datasette_metadata`. This would be a regular table that a user can edit on their own, and would serve as a ""live updating"" source of metadata, than can be changed while the Datasette instance is running. Not too sure what the schema would look like, but I'd imagine: ```sql CREATE TABLE datasette_metadata( level text, target any, key text, value any, primary key (level, target) ) ``` Every row in this table would map to a single metadata ""entry"". - `level` would be one of ""datasette"", ""database"", ""table"", ""column"", which is the ""level"" the entry describes. For example, `level=""table""` means it is metadata about a specific table, `level=""database""` for a specific database, or `level=""datasette""` for the entire Datasette instance. - `target` would ""point"" to the specific object the entry metadata is about, and would depend on what `level` is specific. - `level=""database""`: `target` would be the string name of the database that the metadata entry is about. ex `""fixtures""` - `level=""table""`: `target` would be a JSON array of two strings. The first element would be the database name, and the second would be the table name. ex `[""fixtures"", ""students""]` - `level=""column""`: `target` would be a JSON array of 3 strings: The database name, table name, and column name. Ex `[""fixtures"", ""students"", ""student_id""`] - `key` would be the type of metadata entry the row has, similar to the current ""keys"" that exist in `metadata.json`. Ex `""about_url""`, `""source""`, `""description""`, etc - `value` would be the text value of be metadata entry. The literal text value of a description, about_url, column_label, etc A quick sample: level | target | key | value -- | -- | -- | -- datasette | NULL | title | my datasette title... db | fixtures | source | table | [""fixtures"", ""students""] | label_column | student_name column | [""fixtures"", ""students"", ""birthdate""] | description | This `datasette_metadata` would be configured with other tools, and hopefully not manually by end users. Datasette Core could also offer a UI for editing entries in `datasette_metadata`, to update descriptions/columns on the fly. ### Re-vamp `metadata.json` and move non-metadata config to another place The motivation behind this is that it's awkward that `metadata.json` contains config about things that are not strictly metadata, including: - Plugin configuration - [Authentication/permissions](https://docs.datasette.io/en/latest/authentication.html#access-permissions-in-metadata) (ex the `allow` key on datasettes/databases/tables - Canned queries. might be controversial, but in my mind, canned queries are application-specific code and configuration, and don't describe the data that exists in SQLite databases. I think we should move these outside of `metadata.json` and into a different file. The `datasette.json` idea in #2093 may be a good solution here: plugin/permissions/canned queries can be defined in `datasette.json`, while `metadata.json`/`datasette_metadata` will strictly be about documenting databases/tables/columns. 
",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2143/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 448664792,MDU6SXNzdWU0NDg2NjQ3OTI=,487,Refactor database methods off Datasette class,9599,simonw,closed,0,,,,,1,2019-05-27T04:52:41Z,2019-05-27T20:05:34Z,2019-05-27T05:08:01Z,OWNER,,"Methods like this one: https://github.com/simonw/datasette/blob/182a3017c24e3fa3af60e4ac0c91c7e48f8736fd/datasette/app.py#L497-L503 Should live on the `ConnectedDatabase` class instead.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/487/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 314506669,MDU6SXNzdWUzMTQ1MDY2Njk=,215,Allow plugins to define additional URL routes and views,9599,simonw,closed,0,,,5512395,Datasette 0.44,14,2018-04-16T05:31:09Z,2020-06-09T03:14:32Z,2020-06-09T03:12:08Z,OWNER,,"Might be as simple as having plugins get passed the `app` after the other routes have been defined: https://github.com/simonw/datasette/blob/b2955d9065ea019500c7d072bcd9d49d1967f051/datasette/app.py#L1270-L1274 Refs #14",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/215/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 267886330,MDU6SXNzdWUyNjc4ODYzMzA=,27,Ability to plot a simple graph,9599,simonw,closed,0,,,,,3,2017-10-24T03:34:59Z,2018-07-10T17:52:41Z,2018-07-10T17:52:41Z,OWNER,,"Might be as simple as: pick he type of chart (bar, line) and then pick the column for the X axis and the column for the Y axis. Maybe also allow a pie chart. Itโ€™s up to the user to come up with SQL that gets the right values.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/27/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 441858747,MDU6SXNzdWU0NDE4NTg3NDc=,455,Hidden tables shown on the index page,9599,simonw,closed,0,,,4305096,0.28,1,2019-05-08T18:02:13Z,2019-05-14T15:49:29Z,2019-05-14T15:48:08Z,OWNER,,"Minor bug in master right now. https://csvconf.now.sh/ ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/455/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 312620566,MDU6SXNzdWUzMTI2MjA1NjY=,199,Ability to apply sort on mobile in portrait mode,9599,simonw,closed,0,,,,,4,2018-04-09T17:35:04Z,2018-04-10T00:37:53Z,2018-04-10T00:34:38Z,OWNER,,"Missed this in #189... on mobile in portrait mode we hide the column headers, which means you can't click them to sort! You can sort in landscape mode at least. 
Need to come up with an alternative sort UI for portrait on mobile.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/199/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 714377268,MDU6SXNzdWU3MTQzNzcyNjg=,991,Redesign application homepage,9599,simonw,open,0,,,,,7,2020-10-04T18:48:45Z,2021-01-26T19:06:36Z,,OWNER,,"Most Datasette instances only host a single database, but the current homepage design assumes that it should leave plenty of space for multiple databases: Reconsider this design - should the default show more information? The Covid-19 Datasette homepage looks particularly sparse I think: https://covid-19.datasettes.com/ ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/991/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 635519358,MDU6SXNzdWU2MzU1MTkzNTg=,826,Document the ds_actor signed cookie,9599,simonw,closed,0,,,5512395,Datasette 0.44,3,2020-06-09T15:06:52Z,2020-06-09T22:33:12Z,2020-06-09T22:32:31Z,OWNER,,"Most authentication plugins (https://github.com/simonw/datasette-auth-github for example) are likely to work by setting the `ds_actor` signed cookie, which is already magically decoded and supported by default Datasette here: https://github.com/simonw/datasette/blob/4fa7cf68536628344356d3ef8c92c25c249067a0/datasette/actor_auth_cookie.py#L1-L13 I should document this.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/826/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1426253476,I_kwDOBm6k_c5VAuak,1869,Release 0.63,9599,simonw,closed,0,,,,,3,2022-10-27T20:53:01Z,2022-10-27T22:24:38Z,2022-10-27T22:11:33Z,OWNER,,"Most of the release notes are already written: - https://github.com/simonw/datasette/releases/tag/0.63a0 - https://github.com/simonw/datasette/releases/tag/0.63a1",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1869/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 605110015,MDU6SXNzdWU2MDUxMTAwMTU=,731,Option to automatically configure based on directory layout,9599,simonw,closed,0,,,,,9,2020-04-22T22:17:47Z,2020-04-27T16:32:44Z,2020-04-27T16:30:26Z,OWNER,,"My Datasette projects increasingly take on the following structure: - `metadata.json` with the metadata - One or more `something.db` database files - A `templates/` folder with some custom templates - A `plugins/` folder with some custom plugins Then I have to run Datasette like this: datasette *.db -m metadata.json --template-dir=templates --plugins-dir=plugins It would be really interesting if Datasette had a special mode where you could point it at a directory with the above layout and it would automatically configure itself based on the contents. Maybe even allow `datasette serve` to detect if it was passed a single argument that's a directory, not a file, and kick in to ""directory layout configuration mode"" in that case: datasette . 
",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/731/reactions"", ""total_count"": 2, ""+1"": 2, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 316323336,MDU6SXNzdWUzMTYzMjMzMzY=,231,metadata.json support for plugin configuration options,9599,simonw,closed,0,,,,,4,2018-04-20T15:58:47Z,2019-05-13T18:56:21Z,2019-05-13T18:56:21Z,OWNER,,"My [datasette-cluster-map](https://github.com/simonw/datasette-cluster-map) plugin currently works by detecting `latitude` and `longitude` columns. I'd like to be able to configure it to look for different column names. One way to do this could be to support optional plugin configuration as part of `metadata.json`. Something like this: { ""title"": ""Polar Bear Ear Tags, 2009-2011"", ""source"": ""USGS Alaska Science Center, Polar Bear Research Program"", ""source_url"": ""https://alaska.usgs.gov/products/data.php?dataid=130"", ""plugins"": { ""datasette_cluster_map"": { ""latitude_columns"": [ ""latitude"", ""Capture Latitude"" ], ""longitude_columns"": [ ""longitude"", ""Capture Longitude"" ] } } } These settings should be supported at the root level or at the individual database or table level. They could also be exposed in the https://datasette-cluster-map-demo.now.sh/-/plugins debug tool. Refs #14",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/231/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1006781949,I_kwDOBm6k_c48AkX9,1478,Documentation Request: Feature alternative ID instead of default ID,192568,mroswell,open,0,,,,,0,2021-09-24T19:56:13Z,2021-09-25T16:18:54Z,,CONTRIBUTOR,,"My data already has an ID that comes from a federal agency. Would love to have documentation on how to modify the template to: - Remove the generated ID from the table - Link the federal ID to the detail page - and to ensure that the JSON file uses that as the ID. I'd be happy to include the database ID in the export, but not as a key. I don't want to remove the ID from the database, though, because my experience with the federal agency is that data often has anomalies. I don't want all hell to break loose if they end up applying the same ID to multiple rows (which they haven't done yet). I just don't want it to display in the table or the data exports. Perhaps this isn't a template issue, maybe more of a db manipulation... 
Margie",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1478/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1426195437,I_kwDOBm6k_c5VAgPt,1868,Design URLs for the write API,9599,simonw,closed,0,,,8658075,Datasette 1.0a0,5,2022-10-27T19:55:30Z,2022-11-15T19:59:14Z,2022-10-27T20:07:01Z,OWNER,,"My original design for this issue: - #1851 Was `POST /db/table` with JSON of `{""insert"": {...}}`.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1868/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 823035080,MDU6SXNzdWU4MjMwMzUwODA=,1248,duckdb database (very low performance in SQLite),15836677,verajosemanuel,closed,0,,,,,1,2021-03-05T12:20:29Z,2021-03-08T00:25:27Z,2021-03-08T00:25:27Z,NONE,,"My sqlite is getting too big to be processed by datasette (more than 10 minutes waiting to load) so I am working with duckdb and is waaaaay faster. I think the fastest embeddable database actually. https://duckdb.org/ Taking into account DuckDb is SQLite based it would be GREAT to use it with datasette. is that possible? Regards and thanks for a superb job",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1248/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 432870248,MDU6SXNzdWU0MzI4NzAyNDg=,431,Datasette doesn't reload when database file changes,82988,psychemedia,closed,0,,,,,3,2019-04-13T16:50:43Z,2019-05-02T05:13:55Z,2019-05-02T05:13:54Z,CONTRIBUTOR,,"My understanding of the `--reload` option was that if the database file changed `datasette` would automatically reload. I'm running on a Mac and from the `datasette` UI queries don't seem to be picking up data in a newly changed db (I checked the db timestamp - it certainly updated). I was also expecting to see some sort of log statement in the datasette logging to say that it had detected a file change and restarted, but don't see anything there? Will try to check on an Ubuntu box when I get a chance to see if this is a Mac thing.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/431/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 637966833,MDU6SXNzdWU2Mzc5NjY4MzM=,840,Log out mechanism for clearing ds_actor cookie,9599,simonw,closed,0,,,5533512,Datasette 0.45,4,2020-06-12T19:41:51Z,2020-06-29T04:31:43Z,2020-06-29T04:31:43Z,OWNER,,"Need a cookie clearing mechanism and a way to show that you are logged in. `datasette-auth-github` had a solution for this that can be pulled into core.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/840/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 912418094,MDU6SXNzdWU5MTI0MTgwOTQ=,1358,Release Datasette 0.57,9599,simonw,closed,0,,,,,3,2021-06-05T19:56:13Z,2021-06-05T22:20:07Z,2021-06-05T22:20:07Z,OWNER,,"Need release notes. 
Changes are here: https://github.com/simonw/datasette/compare/0.56...368aa5f1b16ca35f82d90ff747023b9a2bfa27c1 Partial release notes already exist for the two alphas, https://github.com/simonw/datasette/releases/tag/0.57a0 and https://github.com/simonw/datasette/releases/tag/0.57a1",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1358/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 273569068,MDU6SXNzdWUyNzM1NjkwNjg=,79,Add more detailed API documentation to the README,9599,simonw,closed,0,,,,,3,2017-11-13T20:36:21Z,2018-05-28T17:24:48Z,2018-05-28T17:24:48Z,OWNER,,"Need to document: - [ ] The ?column__gt=4 style filter arguments for tables - [ ] The ?sql= API, and how named parameters work - [ ] How API pagination works - [ ] How redirects and cache headers work",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/79/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1529452371,I_kwDOBm6k_c5bKZdT,1987,installpython3.com is now a spam website,9599,simonw,closed,0,,,,,4,2023-01-11T17:55:12Z,2023-01-11T18:29:26Z,2023-01-11T18:29:25Z,OWNER,,"Need to stop linking to it from the docs. I'll link to https://www.python.org/about/gettingstarted/ instead.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1987/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 464449570,MDU6SXNzdWU0NjQ0NDk1NzA=,540,Add a universal navigation bar which can be modified by plugins,9599,simonw,closed,0,,,,,8,2019-07-05T03:50:33Z,2019-07-06T23:13:29Z,2019-07-06T23:11:35Z,OWNER,,"Needed by https://github.com/simonw/datasette-auth-github/issues/5 We already have a navigation breadcrumbs header on some pages, I can extend that to be present on every page and make it easy to modify with custom templates. ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/540/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 443020810,MDU6SXNzdWU0NDMwMjA4MTA=,460,Design changes to homepage to support mutable files,9599,simonw,closed,0,,,4305096,0.28,5,2019-05-11T17:58:05Z,2019-05-16T03:34:09Z,2019-05-16T03:24:16Z,OWNER,,"Needed for #419 - since we can now start up Datasette with a whole bunch of large connected databases that are mutable we can no longer guarantee a quick count of rows across all of the tables. New proposed homepage tweaks: ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/460/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 732798913,MDU6SXNzdWU3MzI3OTg5MTM=,1064,Navigation menu plus plugin hook,9599,simonw,closed,0,,,6026070,0.51,10,2020-10-30T00:49:36Z,2020-10-30T03:45:16Z,2020-10-30T03:45:16Z,OWNER,,Needed for #690. 
Prototype in https://github.com/simonw/datasette/commit/0d7ac764861d84be24d661cf4104ce61ea11a82a,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1064/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 642297505,MDU6SXNzdWU2NDIyOTc1MDU=,857,Comprehensive documentation for variables made available to templates,9599,simonw,closed,0,,,3268330,Datasette 1.0,1,2020-06-20T03:19:43Z,2022-10-26T02:58:17Z,2022-10-26T02:58:17Z,OWNER,,"Needed for the Datasette 1.0 release, so template authors can trust that Datasette is unlikely to break their templates.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/857/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 267886865,MDU6SXNzdWUyNjc4ODY4NjU=,28,/database?sql= should redirect correctly,9599,simonw,closed,0,,,2857392,Ship first public release,0,2017-10-24T03:38:44Z,2017-10-24T23:54:30Z,2017-10-24T23:54:30Z,OWNER,,Needs to redirect to the location with the hash while retaining the query string. This should also work with the .json extension.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/28/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 632724154,MDU6SXNzdWU2MzI3MjQxNTQ=,805,Writable canned queries live demo on Glitch,9599,simonw,open,0,,,,,11,2020-06-06T20:52:13Z,2020-07-01T22:44:01Z,,OWNER,,"Needs to run somewhere with a mutable disk drive, so not Cloud Run or Heroku or Vercel. 
I think I'll put it on Glitch.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/805/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1994845152,I_kwDOBm6k_c525uvg,2207,ModuleNotFoundError: No module named 'click_default_group,283441,honzajavorek,open,0,,,,,0,2023-11-15T14:04:32Z,2023-11-15T14:04:32Z,,NONE,,"No matter what I do, I'm getting this error: ``` $ datasette Traceback (most recent call last): File ""/Users/honza/Library/Caches/pypoetry/virtualenvs/juniorguru-Lgaxwd2n-py3.11/bin/datasette"", line 5, in from datasette.cli import cli File ""/Users/honza/Library/Caches/pypoetry/virtualenvs/juniorguru-Lgaxwd2n-py3.11/lib/python3.11/site-packages/datasette/cli.py"", line 6, in from click_default_group import DefaultGroup ModuleNotFoundError: No module named 'click_default_group' ``` I have datasette in my dependencies like this: ```toml [tool.poetry.group.dev.dependencies] datasette = {version = ""1.0a7"", allow-prereleases = true} ``` I had the latest regular version (not pre-release) there originally, but the result was the same: ```toml [tool.poetry.group.dev.dependencies] datasette = ""0.64.5"" ``` Full pyproject.toml is at https://github.com/honzajavorek/junior.guru/ Previously datasette worked for me, but I guess something had to upgrade and now I can't even launch it.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2207/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 633578769,MDU6SXNzdWU2MzM1Nzg3Njk=,811,"Support ""allow"" block on root, databases and tables, not just queries",9599,simonw,closed,0,,,5512395,Datasette 0.44,16,2020-06-07T17:01:09Z,2020-06-08T19:34:00Z,2020-06-08T19:32:36Z,OWNER,,"No reason not to expand the ""allow"" mechanism [described here](https://github.com/simonw/datasette/blob/86dec9e8fffd6c4efec928ae9b5713748dec7e74/docs/authentication.rst#permissions-for-canned-queries) to the root of `metadata.json` plus to databases and tables. Refs #810 and #800. 
```json { ""databases"": { ""mydatabase"": { ""allow"": { ""id"": [""root""] } } } } ``` TODO: - [x] Instance level - [x] Database level - [x] Table level - [x] Query level - [x] Affects list of queries - [x] Affects list of tables on database page - [x] Affects truncated list of tables on index page - [x] Affects list of SQL views on database page - [x] Affects list of databases on index page - [x] Show 🔒 in header on index page for private instances - [x] Show 🔒 in header on private database page - [x] Show 🔒 in header on private table page - [x] Show 🔒 in header on private query page - [x] Move `assert_permissions_checked()` calls from `test_html.py` to `test_permissions.py` - [x] Update documentation",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/811/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 963527045,MDU6SXNzdWU5NjM1MjcwNDU=,1424,Document exceptions that can be raised by db.execute() and friends,9599,simonw,open,0,,,,,4,2021-08-08T22:23:25Z,2021-08-08T22:27:31Z,,OWNER,,Not currently covered here: https://docs.datasette.io/en/stable/internals.html#await-db-execute-sql,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1424/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 636511683,MDU6SXNzdWU2MzY1MTE2ODM=,830,Redesign register_facet_classes plugin hook,9599,simonw,open,0,,,3268330,Datasette 1.0,3,2020-06-10T20:03:27Z,2021-12-16T19:58:22Z,,OWNER,,"Nothing uses this plugin hook yet, so the design is not yet proven. I'm going to build a real plugin against it and use that process to inform any design changes that may need to be made. I'll add a warning about this to the documentation.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/830/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1239080102,I_kwDOBm6k_c5J2tym,1745,Documentation on running cog,9599,simonw,closed,0,,,,,1,2022-05-17T19:41:06Z,2022-05-17T19:45:51Z,2022-05-17T19:43:45Z,OWNER,,Noticed that `cog -r docs/*.rst` isn't documented in https://docs.datasette.io/en/latest/contributing.html#editing-and-building-the-documentation,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1745/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 573583971,MDU6SXNzdWU1NzM1ODM5NzE=,689,"""Templates considered"" comment broken in >=0.35",35075,chrishas35,closed,0,,,,,6,2020-03-01T17:31:21Z,2020-04-05T19:39:44Z,2020-04-05T19:39:44Z,NONE,,"Noticed that the ""Templates Considered"" comment is missing in 0.37. Believe I traced it back to #664 as you can see it in https://v0-34.datasette.io/ but not https://v0-35.datasette.io/. Looking at the template context debug between the two you can see what is missing from 0.35 vs.
0.34: ```diff < ""datasette_version"": ""0.34"", < ""app_css_hash"": ""ffa51a"", < ""select_templates"": [ < ""*index.html"" < ], < ""zip"": """", < ""body_scripts"": [], < ""extra_css_urls"": """", < ""extra_js_urls"": """", < ""format_bytes"": """", < ""database_url"": "">"", < ""database_color"": "">"" --- > ""datasette_version"": ""0.35"", > ""database_url"": "">"", > ""database_color"": "">"" ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/689/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1216479167,I_kwDOBm6k_c5Igf-_,1722,`db.primary_keys()` and `db.table_columns()` don't show up in traces,9599,simonw,open,0,,,,,0,2022-04-26T21:08:36Z,2022-04-26T21:08:36Z,,OWNER,,"Noticed this while working on: - #1715 This code here isn't showing up in traces: https://github.com/simonw/datasette/blob/579f59dcec43a91dd7d404e00b87a00afd8515f2/datasette/views/table.py#L218-L220 Because those functions don't use the regular trace-instrumented `db.execute()` code path - they work directly against a connection instead: https://github.com/simonw/datasette/blob/579f59dcec43a91dd7d404e00b87a00afd8515f2/datasette/utils/__init__.py#L610-L626 ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1722/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 443020048,MDU6SXNzdWU0NDMwMjAwNDg=,459,"Fix the ""datasette now publish ... --alias=x"" option",9599,simonw,closed,0,,,4305096,0.28,3,2019-05-11T17:48:40Z,2019-05-11T20:22:08Z,2019-05-11T20:22:08Z,OWNER,,"Now have deprecated the mechanism we were using for this - running `now alias` without any parameters - in favour of something new: https://zeit.co/blog/automatic-aliasing",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/459/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 648637666,MDU6SXNzdWU2NDg2Mzc2NjY=,880,POST to /db/canned-query that returns JSON should be supported (for API clients),9599,simonw,closed,0,,,5818042,Datasette 0.49,11,2020-07-01T03:14:43Z,2020-09-14T21:28:21Z,2020-09-14T21:25:01Z,OWNER,,Now that CSRF is solved for API requests (#835) it would be good to support API requests to the `.json` extension.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/880/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 636614868,MDU6SXNzdWU2MzY2MTQ4Njg=,831,"It would be more intuitive if ""allow"": none meant ""no-one can do this""",9599,simonw,closed,0,,,5512395,Datasette 0.44,1,2020-06-10T23:43:56Z,2020-06-10T23:57:25Z,2020-06-10T23:50:55Z,OWNER,,"Now that I'm starting to write alternative plugins to control permissions - see #818 - I think I need an easy way to tell Datasette ""no-one has permission to do X unless a plugin says otherwise"". 
One relatively intuitive way to do that could be like this: ```json { ""databases"": { ""fixtures"": { ""allow"": null } } } ``` Right now I think that opens up permissions to everyone, which isn't as obvious.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/831/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1636616315,I_kwDOBm6k_c5hjMh7,2042,Gather feedback on new ?_extra= design,9599,simonw,open,0,,,,,0,2023-03-22T23:07:43Z,2023-03-22T23:08:19Z,,OWNER,,"Now that I've landed: - #1999 See also: - #262 I want to get some feedback from people on the design of the new `?_extra=` feature, before freezing it into Datasette 1.0. The big change is that the default JSON representation is now MUCH slimmer - it only gives you keys for `""next""` and `""rows""`, where rows is a list of JSON objects (not a list of arrays as was previously the default) - for example https://latest.datasette.io/fixtures/sortable.json If you want extra stuff you can ask for it with the new `?_extra=` parameter - e.g. https://latest.datasette.io/fixtures/sortable.json?_extra=columns&_extra=suggested_facets You can use `?_extra=extras` to see a list of available extras: https://latest.datasette.io/fixtures/sortable.json?_extra=extras ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2042/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 908465747,MDU6SXNzdWU5MDg0NjU3NDc=,1354,Update help in tests for latest Click,9599,simonw,closed,0,,,,,1,2021-06-01T16:14:31Z,2021-06-01T16:17:04Z,2021-06-01T16:17:04Z,OWNER,,"Now that Uvicorn 0.14 is out with an unpinned Click dependency - https://github.com/encode/uvicorn/pull/1033 - our test suite runs against Click 8.0 - which subtly changes the output of `--help` causing test failures: https://github.com/simonw/datasette/runs/2720383031?check_suite_focus=true ``` def test_help_includes(name, filename): expected = (docs_path / filename).read_text() runner = CliRunner() result = runner.invoke(cli, name.split() + [""--help""], terminal_width=88) actual = f""$ datasette {name} --help\n\n{result.output}"" # actual has ""Usage: cli package [OPTIONS] FILES"" # because it doesn't know that cli will be aliased to datasette expected = expected.replace(""Usage: datasette"", ""Usage: cli"") > assert expected == actual E AssertionError: assert '$ datasette ...e and exit.\n' == '$ datasette ...e and exit.\n' E Skipping 848 identical leading characters in diff, use -v to show E nt_id xxx E + E --version-note TEXT Additional note to show on /-/versions E --secret TEXT Secret used for signing secure values, such as signed E cookies E + E --title TEXT Title for metadata ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1354/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 573755726,MDU6SXNzdWU1NzM3NTU3MjY=,690,Mechanism for plugins to add action menu items for various things,9599,simonw,closed,0,,,6026070,0.51,11,2020-03-02T06:48:36Z,2020-10-30T05:20:43Z,2020-10-30T05:20:42Z,OWNER,,"Now that we have support for plugins that can write I'm seeing all sorts of places where a plugin might need to add UI to the table page. 
Some examples: - `datasette-configure-fts` needs to add a ""configure search for this table"" link - a plugin that lets you render or delete tables needs to add a link or button somewhere - existing plugins like `datasette-vega` and `datasette-cluster-map` already do this with JavaScript The challenge here is that multiple plugins may want to do this, so simply overriding templates and populating names blocks doesn't entirely work as templates may override each other.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/690/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 521335335,MDU6SXNzdWU1MjEzMzUzMzU=,629,"""datasette publish"" commands should deploy with Python 3.8",9599,simonw,closed,0,,,,,1,2019-11-12T05:22:31Z,2019-11-12T06:03:10Z,2019-11-12T06:03:10Z,OWNER,,Now that we support 3.8 (#627) `datasette publish` should always deploy using Python 3.8.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/629/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 634663505,MDU6SXNzdWU2MzQ2NjM1MDU=,815,Group permission checks by request on /-/permissions debug page,9599,simonw,open,0,,,,,8,2020-06-08T14:25:23Z,2020-12-17T22:06:48Z,,OWNER,,"Now that we're making a LOT more permission checks (on the DB index page we do a check for every listed table for example) the `/-/permissions` page gets filled up pretty quickly. Can make this more readable by grouping permission checks by request. Have most recent request at the top of the page but the permission requests within that page sorted chronologically by most recent last.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/815/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 638212085,MDU6SXNzdWU2MzgyMTIwODU=,842,Magic parameters for canned queries,9599,simonw,closed,0,,,5533512,Datasette 0.45,18,2020-06-13T18:50:08Z,2020-06-28T03:30:31Z,2020-06-28T02:58:18Z,OWNER,,"Now that writable canned queries (#698) have landed, it would be neat if they supported ""magic"" parameters - parameters that are automatically populated with: - the current actor ID / other actor properties - the current date and time - the user's IP or user-agent And maybe other things potentially added by plugins.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/842/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1384273985,I_kwDOBm6k_c5SglhB,1817,Expose `sql` and `params` arguments to various plugin hooks,9599,simonw,open,0,,,,,7,2022-09-23T20:34:45Z,2022-09-27T00:27:53Z,,OWNER,,"On Discord: https://discord.com/channels/823971286308356157/996877076982415491/1022784534363787305 > Hi! I'm attempting to write a plugin that would provide some statistics on text fields (most common words, etc). I would want this information displayed in the table pages, and (ideally) also updated when users make custom queries from the table pages. > > It seems one way to do this would be to use the extra_template_vars hook, and make the appropriate SQL query there. 
So extra_template_vars would create a variable that is a list of most common words, and this is displayed on the page, possibly above the regular table view. > > Is there a way that the plugin code can access the SQL query (or even the data) that was used to produce the table view? I can see that TableView class constructs the SQL query, but I can't seem to find a way to access that information from the objects that are available to extra_template_vars. ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1817/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1579695809,I_kwDOBm6k_c5eKD7B,2023,Error: Invalid setting 'hash_urls' in settings.json in 0.64.1,80409402,mlaparie,closed,0,,,,,2,2023-02-10T13:35:01Z,2023-02-10T15:40:00Z,2023-02-10T15:39:59Z,NONE,,"On a Debian machine, using datasette 0.64.1 installed with `pip3`, I am getting a `datasette[114272]: Error: Invalid setting 'hash_urls' in settings.json` in `journalctl -xe`. The same settings work on 0.54.1 on another Debian server. This is my `settings.json`: ```json { ""default_page_size"": 200, ""max_returned_rows"": 8000, ""num_sql_threads"": 3, ""sql_time_limit_ms"": 1000, ""default_facet_size"": 30, ""facet_time_limit_ms"": 200, ""facet_suggest_time_limit_ms"": 50, ""hash_urls"": false, ""allow_facet"": true, ""allow_download"": true, ""suggest_facets"": true, ""default_cache_ttl"": 5, ""default_cache_ttl_hashed"": 31536000, ""cache_size_kb"": 0, ""allow_csv_stream"": true, ""max_csv_mb"": 100, ""truncate_cells_html"": 2048, ""force_https_urls"": false, ""template_debug"": false, ""base_url"": ""/pclim/db/"" } ``` This looks ok to me. Would you have any ideas?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2023/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 274001453,MDU6SXNzdWUyNzQwMDE0NTM=,96,UI for editing named parameters,9599,simonw,closed,0,,,,,3,2017-11-15T01:19:21Z,2017-11-16T01:45:51Z,2017-11-16T01:33:38Z,OWNER,,"On any page displaying a custom query that includes named parameters, we should show HTML form fields for editing those parameters. 
Eg the breed parameter on https://australian-dogs.now.sh/australian-dogs-3ba9628?sql=select+name%2C+count%28*%29+as+n+from+%28%0D%0A%0D%0Aselect+upper%28%22Animal+name%22%29+as+name+from+%5BAdelaide-City-Council-dog-registrations-2013%5D+where+Breed+like+%3Abreed%0D%0A%0D%0Aunion+all%0D%0A%0D%0Aselect+upper%28Animal_Name%29+as+name+from+%5BAdelaide-City-Council-dog-registrations-2014%5D+where+Breed_Description+like+%3Abreed%0D%0A%0D%0Aunion+all+%0D%0A%0D%0Aselect+upper%28Animal_Name%29+as+name+from+%5BAdelaide-City-Council-dog-registrations-2015%5D+where+Breed_Description+like+%3Abreed%0D%0A%0D%0Aunion+all%0D%0A%0D%0Aselect+upper%28%22AnimalName%22%29+as+name+from+%5BCity-of-Port-Adelaide-Enfield-Dog_Registrations_2016%5D+where+AnimalBreed+like+%3Abreed%0D%0A%0D%0Aunion+all%0D%0A%0D%0Aselect+upper%28%22Animal+Name%22%29+as+name+from+%5BMitcham-dog-registrations-2015%5D+where+Breed+like+%3Abreed%0D%0A%0D%0Aunion+all%0D%0A%0D%0Aselect+upper%28%22DOG_NAME%22%29+as+name+from+%5Bburnside-dog-registrations-2015%5D+where+DOG_BREED+like+%3Abreed%0D%0A%0D%0Aunion+all+%0D%0A%0D%0Aselect+upper%28%22Animal_Name%22%29+as+name+from+%5Bcity-of-playford-2015-dog-registration%5D+where+Breed_Description+like+%3Abreed%0D%0A%0D%0Aunion+all%0D%0A%0D%0Aselect+upper%28%22Animal+Name%22%29+as+name+from+%5Bcity-of-prospect-dog-registration-details-2016%5D+where%22Breed+Description%22+like+%3Abreed%0D%0A%0D%0A%29+group+by+name+order+by+n+desc%3B&breed=pug",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/96/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 309033998,MDU6SXNzdWUzMDkwMzM5OTg=,187,Windows installation error,11855322,robmarkcole,closed,0,,,,,7,2018-03-27T16:04:37Z,2019-06-15T21:44:23Z,2019-06-15T21:44:23Z,NONE,,"On attempting install on a Win 7 PC with py 3.6.2 (Anaconda dist) I get the error: ``` Collecting uvloop>=0.5.3 (from Sanic==0.7.0->datasette) Downloading uvloop-0.9.1.tar.gz (1.8MB) 100% |ยฆยฆยฆยฆยฆยฆยฆยฆยฆยฆยฆยฆยฆยฆยฆยฆยฆยฆยฆยฆยฆยฆยฆยฆยฆยฆยฆยฆยฆยฆยฆยฆ| 1.8MB 12.8MB/s Complete output from command python setup.py egg_info: Traceback (most recent call last): File """", line 1, in File ""C:\Users\RCole\AppData\Local\Temp\pip-build-juakfqt8\uvloop\setup.py "", line 10, in raise RuntimeError('uvloop does not support Windows at the moment') RuntimeError: uvloop does not support Windows at the moment ``` ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/187/reactions"", ""total_count"": 4, ""+1"": 4, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 655465863,MDU6SXNzdWU2NTU0NjU4NjM=,892,"""latest"" in new documentation navbar is invisible",9599,simonw,closed,0,,,,,2,2020-07-12T19:57:21Z,2020-07-12T20:02:35Z,2020-07-12T20:02:17Z,OWNER,,"On https://datasette.readthedocs.io/en/latest/ Compare with https://datasette.readthedocs.io/en/0.45/ Some custom CSS should fix it. 
",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/892/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1243512344,I_kwDOBm6k_c5KHn4Y,1747,Add tutorials to the getting started guide,9599,simonw,closed,0,,,,,1,2022-05-20T19:01:52Z,2022-05-20T19:12:30Z,2022-05-20T19:05:34Z,OWNER,,On https://docs.datasette.io/en/stable/getting_started.html,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1747/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 712839383,MDU6SXNzdWU3MTI4MzkzODM=,985,Column actions should support facet by compound primary keys,9599,simonw,closed,0,,,5971510,Datasette 0.50,1,2020-10-01T13:21:57Z,2020-10-08T23:55:11Z,2020-10-01T16:50:41Z,OWNER,,"On https://latest.datasette.io/fixtures/compound_three_primary_keys the column action menu doesn't display for the pk1, pk2 and pk3 columns (because they are primary keys) even though faceting by them is actually useful. Refs #98",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/985/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 903200328,MDU6SXNzdWU5MDMyMDAzMjg=,1341,"""Show all columns"" cog menu item should show if ?_col= is used",9599,simonw,closed,0,,,,,1,2021-05-27T04:28:17Z,2021-05-27T04:31:16Z,2021-05-27T04:31:16Z,OWNER,,"On https://latest.datasette.io/fixtures/sortable?_col=sortable the ""Show all columns"" item (from #615) is not shown (it should be): ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1341/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1121583414,I_kwDOBm6k_c5C2gE2,1619,JSON link on row page is 404 if base_url setting is used,9599,simonw,open,0,,,,,5,2022-02-02T07:09:53Z,2023-03-24T15:38:04Z,,OWNER,,"On my local environment: datasette fixtures.db -p 3344 --setting base_url /foo/bar/ Then hit http://127.0.0.1:3344/foo/bar/fixtures/table%2Fwith%2Fslashes.csv/3 But... that `json` link goes here, which is a 404: http://127.0.0.1:3344/foo/bar/foo/bar/fixtures/table%2Fwith%2Fslashes.csv/3?_format=json",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1619/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 713304417,MDU6SXNzdWU3MTMzMDQ0MTc=,989,Column action sort descending/ascending links should remove _next= pagination,9599,simonw,closed,0,,,5971510,Datasette 0.50,0,2020-10-02T02:33:48Z,2020-10-08T23:55:15Z,2020-10-04T18:05:28Z,OWNER,,"On page https://latest.datasette.io/fixtures/sortable?_next=15%2Cg%2Cz&_sort=sortable clicking on `sortable_with_nulls > sort_ascending` links to https://latest.datasette.io/fixtures/sortable?_next=15%2Cg%2Cz&_sort_desc=sortable_with_nulls - which doesn't make sense. 
Changing the sort order needs to reset to the first page.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/989/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1578609658,I_kwDOBm6k_c5eF6v6,2022,Error 500 - not clear the cause,1667631,DavidPratten,closed,0,,,,,1,2023-02-09T20:57:17Z,2023-02-09T21:13:50Z,2023-02-09T21:13:50Z,NONE,,"On the database that I have sent via linkedIn, datasette works great, but the following URL gives a 500 error. http://127.0.0.1:8001/literature/authors_papers?authorId=100550354 The cause of the error is not apparent. Is this expected behaviour? David",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2022/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1079149656,I_kwDOBm6k_c5AUoRY,1555,Optimize all those calls to index_list and foreign_key_list,9599,simonw,closed,0,,,7571612,Datasette 0.60,27,2021-12-13T23:50:56Z,2022-01-13T22:27:32Z,2021-12-19T20:55:59Z,OWNER,,"On the first hit to a restarted index I'm seeing this in the SQL traces: https://latest-with-plugins.datasette.io/github/commits?_trace=1 I imagine this could be sped up a lot using tricks like this one from the SQLite documentation: https://sqlite.org/pragma.html#pragfunc ```sql SELECT DISTINCT m.name || '.' || ii.name AS 'indexed-columns' FROM sqlite_schema AS m, pragma_index_list(m.name) AS il, pragma_index_info(il.name) AS ii WHERE m.type='table' ORDER BY 1; ``` https://latest-with-plugins.datasette.io/fixtures?sql=SELECT+DISTINCT+m.name+%7C%7C+%27.%27+%7C%7C+ii.name+AS+%27indexed-columns%27%0D%0A++FROM+sqlite_schema+AS+m%2C%0D%0A+++++++pragma_index_list%28m.name%29+AS+il%2C%0D%0A+++++++pragma_index_info%28il.name%29+AS+ii%0D%0A+WHERE+m.type%3D%27table%27%0D%0A+ORDER+BY+1%3B",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1555/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 733999615,MDU6SXNzdWU3MzM5OTk2MTU=,1079,Handle long breadcrumbs better with new menu,9599,simonw,open,0,,,,,1,2020-11-01T15:57:41Z,2022-01-13T22:21:29Z,,OWNER,,"On this page when signed in as root: https://latest.datasette.io/fixtures/roadside_attraction_characteristics/1 ![EF921CB1-625F-4D04-A850-490B812A72B3](https://user-images.githubusercontent.com/9599/97807807-db0fbf80-1c17-11eb-9c77-ae5169b12c3d.jpeg) ![A49D8B76-5ACF-4F71-A8B4-21A44F5C8D51](https://user-images.githubusercontent.com/9599/97807809-dea34680-1c17-11eb-9511-a49af56a4bd2.jpeg) ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1079/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 642127307,MDU6SXNzdWU2NDIxMjczMDc=,855,Add instructions for using cookiecutter plugin template to plugin docs,9599,simonw,closed,0,,,5533512,Datasette 0.45,2,2020-06-19T17:33:25Z,2020-06-22T02:51:38Z,2020-06-22T02:51:38Z,OWNER,,Once I ship the `datasette-plugin` template: https://github.com/simonw/datasette-plugin/issues/1,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/855/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 
0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 309047460,MDU6SXNzdWUzMDkwNDc0NjA=,188,Ability to bundle metadata and templates inside the SQLite file,9599,simonw,open,0,,,,,4,2018-03-27T16:42:07Z,2020-12-04T17:18:34Z,,OWNER,,"One of the nicest qualities of SQLite as a data format is that you get a single file which you can then backup or share with other people. Datasette breaks this a little once you start including custom metadata.json or template files and CSS. It would be cool if there was an optional mechanism for baking that extra configuration into the SQLite file itself. That way entire datasette mini-applications (including canned queries and custom HTML and CSS) could be constructed as single .db files. Since datasette configuration is all file-based, one way to achieve that would be to support a ""datasette_files"" table which, if present is used to search for file contents by path. This is inline with the philosophy described by https://www.sqlite.org/appfileformat.html ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/188/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 421551434,MDU6SXNzdWU0MjE1NTE0MzQ=,419,"Default to opening files in mutable mode, special option for immutable files",9599,simonw,closed,0,,,4305096,0.28,10,2019-03-15T14:39:27Z,2019-05-16T15:14:32Z,2019-05-16T15:14:31Z,OWNER,,"One of the original ideas behind Datasette was that serving immutable data makes everything way easier. Two examples: You don't have to worry about SQLite concurrency and you can bundle the database inside a Docker container and deploy it to immutable hosting. See [The interesting ideas in Datasette](https://simonwillison.net/2018/Oct/4/datasette-ideas/) for more on this. I'm beginning to see a much stronger case for being able to serve mutable data as well. SQLite is actually perfectly capable of handling reads against a database that is also being written to, even if the writes are coming from another process. https://www.sqlite.org/wal.htm There are all kinds of interesting use-cases which Datasette is currently unsuitable for due to its insistence on immutable databases. Some examples: * Continually run Datasette against a SQLite database updated by another process, e.g. Firefox bookmarks * Projects where a cron runs every X minutes and writes new entries gathered from other sources to SQLite * Tail a log file, write those log updates to a SQLite file, view recent log entries in Datasette This is also relevant to #417, Datasette Library.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/419/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 432636432,MDU6SXNzdWU0MzI2MzY0MzI=,429,?_where=sql-fragment parameter for table views,9599,simonw,closed,0,,,,,7,2019-04-12T15:58:51Z,2019-04-15T10:48:01Z,2019-04-13T01:37:25Z,OWNER,,"Only available if arbitrary SQL is enabled (the default). `?_where=id in (1,2,3)&_where=id in (select tag_id from tags)` Allows any table (or view) page to have arbitrary additional `extra_where` clauses defined using the URL! This would be extremely useful for building JavaScript applications against the Datasette API that only need on extra tiny bit of SQL but still want to benefit from other table view features like faceting. 
Would be nice if this could take `:named` parameters and have them filled in via querystring as well.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/429/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1388631785,I_kwDOBm6k_c5SxNbp,1826,render_cell documentation example doesn't match the method signature,66709385,pjamargh,closed,0,,,,,3,2022-09-28T02:37:59Z,2022-09-28T04:30:28Z,2022-09-28T04:05:16Z,NONE,,"Open Datasette stable doc at https://docs.datasette.io/en/stable/plugin_hooks.html?highlight=render_cell#render-cell-row-value-column-table-database-datasette render_cell plugin hook method signature is `render_cell(row, value, column, table, database, datasette)`, the example shown inline uses `render_cell(value)`. ![image](https://user-images.githubusercontent.com/66709385/192674691-34265b81-6cdd-41d2-8424-aa12f8bc8c94.png) ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1826/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 565518772,MDU6SXNzdWU1NjU1MTg3NzI=,673,Mechanism for checking if a SQLite database file is safe to open,9599,simonw,closed,0,,,,,11,2020-02-14T19:36:04Z,2020-02-14T20:13:59Z,2020-02-14T20:13:59Z,OWNER,,"Opening a SpatiaLite database file without SpatiaLite will result in errors later on. Same for database files which use custom extensions, like the Apple Photos database. I've figured out how to tell if a database is safe to open or not: ```sql select sql from sqlite_master where sql like 'CREATE VIRTUAL TABLE%'; ``` This returns the SQL definitions for virtual tables. The bit after `using` tells you what they need. Run this against a SpatiaLite database and you get the following: ```sql CREATE VIRTUAL TABLE SpatialIndex USING VirtualSpatialIndex() CREATE VIRTUAL TABLE ElementaryGeometries USING VirtualElementary() ``` Run it against an Apple Photos `photos.db` file (found with `find ~/Library | grep photos.db`) and you get this (partial list): ```sql CREATE VIRTUAL TABLE RidList_VirtualReader using RidList_VirtualReaderModule CREATE VIRTUAL TABLE Array_VirtualReader using Array_VirtualReaderModule CREATE VIRTUAL TABLE LiGlobals_VirtualBufferReader using VirtualBufferReaderModule CREATE VIRTUAL TABLE RKPlace_RTree using rtree (modelId,minLongitude,maxLongitude,minLatitude,maxLatitude) ``` For a database with FTS4 you get: ```sql CREATE VIRTUAL TABLE ""docs_fts"" USING FTS4 ( [title], [content], content=""docs"" ) ``` FTS5: ```sql CREATE VIRTUAL TABLE [FARA_All_Registrants_fts] USING FTS5 ( [Name], [Address_1], [Address_2], content=[FARA_All_Registrants] ) ``` So I can use this to figure out all of the `using` pieces and then compare them to a list of known support ones. 
_Originally posted by @simonw in https://github.com/simonw/datasette/pull/672#issuecomment-586441484_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/673/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1366915240,I_kwDOBm6k_c5ReXio,1807,Plugin ecosystem needs to avoid crashes due to no available databases,9599,simonw,open,0,,,,,1,2022-09-08T19:54:34Z,2022-09-08T20:14:05Z,,OWNER,,"Opening this here to track the issue first reported in: - https://github.com/simonw/datasette-upload-dbs/issues/5 Plugins that expect to be able to write to a database need to not crash in situations where no writable database is available.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1807/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 272661336,MDU6SXNzdWUyNzI2NjEzMzY=,49,Pick a name,9599,simonw,closed,0,,,2857392,Ship first public release,4,2017-11-09T17:56:17Z,2017-11-10T18:33:22Z,2017-11-10T18:33:22Z,OWNER,,"Options so far: * immutabase * datasite * sqlstatic * dbserve * sqlserve Terms to play with: * immutable * sqlite * dataset * json * static * serve",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/49/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1065431383,I_kwDOBm6k_c4_gTFX,1533,"Add `Link: rel=""alternate""` header pointing to JSON for a table/query",9599,simonw,closed,0,,,3268330,Datasette 1.0,4,2021-11-28T20:43:25Z,2022-02-02T07:56:51Z,2022-02-02T07:49:33Z,OWNER,,"Originally explored in https://github.com/simonw/datasette-notebook/issues/2#issuecomment-980789406 - I wanted an efficient way to scan a list of URLs and figure out which if any of those corresponded to Datasette tables, canned queries or SQL output that could be represented as a table on a page. 
It looks like a neat way to do that is with ` Link:` header like this: `Link: http://127.0.0.1:8058/fixtures/compound_three_primary_keys.json; rel=""alternate""; type=""application/datasette+json""` I can put a ` That example [was achieved](https://github.com/simonw/russian-ira-facebook-ads-datasette/blob/daf51a8c50a78e8bc7971c211005fd85e66ccf64/russian-ads-metadata.yaml#L72-L77) using a custom SQL query and [datasette-json-html](https://github.com/simonw/datasette-json-html) - but I'd like this to be a built-in feature instead.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/484/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1219385669,I_kwDOBm6k_c5IrllF,1729,Implement ?_extra and new API design for TableView,9599,simonw,open,0,,,8755003,Datasette 1.0a-next,12,2022-04-28T22:28:14Z,2022-12-13T05:29:07Z,,OWNER,,"Part of: - #262 - #1518",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1729/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1408757705,I_kwDOBm6k_c5T9-_J,1843,"Intermittent ""Too many open files"" error running tests",9599,simonw,open,0,,,,,16,2022-10-14T04:45:01Z,2022-12-17T22:02:41Z,,OWNER,,"Partial stack trace from one of them: ``` /Users/simon/.local/share/virtualenvs/datasette-AWNrQs95/lib/python3.10/site-packages/jinja2/loaders.py:200: in get_source f = open_if_exists(filename) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ filename = '/Users/simon/Dropbox/Development/datasette/datasette/templates/error.html', mode = 'rb' def open_if_exists(filename: str, mode: str = ""rb"") -> t.Optional[t.IO]: """"""Returns a file descriptor for the filename if that file exists, otherwise ``None``. """""" if not os.path.isfile(filename): return None > return open(filename, mode) E OSError: [Errno 24] Too many open files: '/Users/simon/Dropbox/Development/datasette/datasette/templates/error.html' ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1843/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,reopened 510076368,MDU6SXNzdWU1MTAwNzYzNjg=,605,Support queries at the table level,12617395,bsilverm,open,0,,,,,2,2019-10-21T15:58:30Z,2019-10-30T18:55:37Z,,NONE,,"Per the issue described in [issue #588](https://github.com/simonw/datasette/issues/588), it was determined queries are not supported at the table level. 
Per my last comment in the issue, I'd like to request support for this as it would help eliminate errors in the event certain tables are not present in the database.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/605/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 318737808,MDU6SXNzdWUzMTg3Mzc4MDg=,243,--spatialite option for datasette publish commands,9599,simonw,closed,0,,,,,2,2018-04-29T18:19:32Z,2018-05-31T14:17:53Z,2018-05-31T14:17:53Z,OWNER,,Performs the necessary incantations to install Spatialite on Zeit Now or Heroku and sets the corresponding environment variable to ensure the module is correctly loaded by datasette serve.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/243/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1175894898,I_kwDOBm6k_c5GFrty,1680,Consider simplifying permissions for 1.0,9599,simonw,open,0,,,3268330,Datasette 1.0,0,2022-03-21T20:17:29Z,2022-03-21T20:17:29Z,,OWNER,,"Permission checks right now can express one of three opinions: - `False` means ""do not grant this permission"" - `True` means ""grant this permission"" - `None` means ""I have no opinion"" But... there's also a concept of a ""default"" for a given permission check, which might be `False` or `True`. I worry this is too complicated. Could this be simplified before 1.0? In particular the default concept. See also: - #1676 ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1680/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 635696400,MDU6SXNzdWU2MzU2OTY0MDA=,827,Document CSRF protection (for plugins),9599,simonw,closed,0,,,5512395,Datasette 0.44,1,2020-06-09T19:19:10Z,2020-06-09T19:38:30Z,2020-06-09T19:35:13Z,OWNER,,"Plugin authors need to know that if they want to POST a form they should include this: ```html+jinja ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/827/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1808215339,I_kwDOBm6k_c5rxy0r,2104,Tables starting with an underscore should be treated as hidden,9599,simonw,open,0,,,,,2,2023-07-17T17:13:53Z,2023-07-18T22:41:37Z,,OWNER,,"Plugins can then take advantage of this pattern, for example: - https://github.com/simonw/datasette-auth-tokens/pull/8",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2104/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 315517578,MDU6SXNzdWUzMTU1MTc1Nzg=,224,Ability for plugins to bundle templates,9599,simonw,closed,0,,,,,1,2018-04-18T14:57:53Z,2018-04-19T05:50:36Z,2018-04-19T05:50:36Z,OWNER,,"Plugins should be able to bundle templates. The Datasette template loader should then consult those plugins first when loading a template.
Jinja2 has a `PackageLoader` class that can help with this: http://jinja.pocoo.org/docs/2.10/api/#jinja2.PackageLoader Refs #14",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/224/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 326783670,MDU6SXNzdWUzMjY3ODM2NzA=,291,Avoid plugins accidentally loading dependencies twice,9599,simonw,closed,0,,,,,3,2018-05-27T03:15:21Z,2020-09-30T20:36:12Z,2018-05-28T20:42:02Z,OWNER,,Plugins that include JavaScript files risk loading the same code twice. In particular: I want to build a second plugin that uses the Leaflet mapping library (the first was [datasette-cluster-map](https://pypi.org/project/datasette-cluster-map/)). But I don't want the two plugins to load duplicate copies of Leaflet.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/291/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1108671952,I_kwDOBm6k_c5CFP3Q,1605,Scripted exports,25778,eyeseast,open,0,,,,,10,2022-01-19T23:45:55Z,2022-11-30T15:06:38Z,,CONTRIBUTOR,,"Posting this while I'm thinking about it: I mentioned at the end of [this thread](https://twitter.com/eyeseast/status/1483893011658551299) that I'm usually doing `datasette --get` to export canned queries. I used to use a tool called [datafreeze](https://github.com/pudo/datafreeze) to do scripted exports, but that project looks dead now. The ergonomics of it are pretty nice, though, and the `Freezefile.yml` structure is actually not too far from Datasette's canned queries. This is related to the idea for `datasette query` (#1356) but I think it's a distinct feature. 
It's most likely a plugin, but I want to raise it here because it's probably something other people have thought about.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1605/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 677326155,MDU6SXNzdWU2NzczMjYxNTU=,930,Datasette sdist is missing templates (hence broken when installing from Homebrew),9599,simonw,closed,0,,,,,6,2020-08-12T02:20:16Z,2020-08-12T03:30:59Z,2020-08-12T03:30:59Z,OWNER,,Pretty nasty bug this: I'm getting 500 errors for all pages that try to render a template after installing the newly released Datasette 0.47 - both from `pip install` and via Homebrew.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/930/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 267759136,MDU6SXNzdWUyNjc3NTkxMzY=,20,Config file with support for defining canned queries,9599,simonw,closed,0,9599,simonw,2949431,Custom templates edition,9,2017-10-23T17:53:06Z,2017-12-05T19:05:35Z,2017-12-05T17:44:09Z,OWNER,,"Probably using YAML because then we get support for multiline strings: bats: db: bats.sqlite3 name: ""Bat sightings"" queries: specific_row: | select * from Bats where a = 1; ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/20/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 377156339,MDU6SXNzdWUzNzcxNTYzMzk=,371,datasette publish digitalocean plugin,82988,psychemedia,closed,0,,,,,3,2018-11-04T14:07:41Z,2021-01-04T20:14:28Z,2021-01-04T20:14:28Z,CONTRIBUTOR,,"Provide support for launching `datasette` on Digital Ocean. Example: [Deploy Docker containers into Digital Ocean](https://blog.machinebox.io/deploy-machine-box-in-digital-ocean-385265fbeafd). Digital Ocean also has a preconfigured VM running Docker that can be launched from the command line via the Digital Ocean API: [Docker One-Click Application](https://www.digitalocean.com/docs/one-clicks/docker/). Related: - Launching containers in Digital Ocean servers running docker: [How To Provision and Manage Remote Docker Hosts with Docker Machine on Ubuntu 16.04](https://www.digitalocean.com/community/tutorials/how-to-provision-and-manage-remote-docker-hosts-with-docker-machine-on-ubuntu-16-04) - [How To Use Doctl, the Official DigitalOcean Command-Line Client](https://www.digitalocean.com/community/tutorials/how-to-use-doctl-the-official-digitalocean-command-line-client)",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/371/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1649793525,I_kwDOBm6k_c5iVdn1,2051,`?_extra=row_urls` for table pages,9599,simonw,open,0,,,,,0,2023-03-31T17:58:36Z,2023-03-31T17:58:36Z,,OWNER,,Provides URLs to the JSON version of those rows. Maybe it persists the `?_shape=` option too? 
Not sure about that.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2051/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 497170355,MDU6SXNzdWU0OTcxNzAzNTU=,576,Documented internals API for use in plugins,9599,simonw,closed,0,,,3268330,Datasette 1.0,10,2019-09-23T15:28:50Z,2021-01-05T23:12:51Z,2021-01-05T23:12:37Z,OWNER,,"Quite a few of the plugin hooks make a `datasette` instance of the Datasette class available to the plugins, so that they can look up configuration settings and execute database queries. This means it should provide a documented, stable API so that plugin authors can rely on it.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/576/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1439009231,I_kwDOBm6k_c5VxYnP,1884,Exclude virtual tables from datasette inspect,25778,eyeseast,open,0,,,,,6,2022-11-07T21:26:01Z,2022-11-21T04:40:56Z,,CONTRIBUTOR,,"Ran `inspect` on a spatialite database and got these warnings: ``` ERROR: conn=, sql = 'select count(*) from [SpatialIndex]', params = None: no such module: VirtualSpatialIndex ERROR: conn=, sql = 'select count(*) from [ElementaryGeometries]', params = None: no such module: VirtualElementary ERROR: conn=, sql = 'select count(*) from [KNN]', params = None: no such module: VirtualKNN ``` It still worked, but probably want to catch this. ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1884/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 913017577,MDU6SXNzdWU5MTMwMTc1Nzc=,1365,pathlib.Path breaks internal schema,25778,eyeseast,closed,0,,,,,1,2021-06-07T01:40:37Z,2021-06-21T15:57:39Z,2021-06-21T15:57:39Z,CONTRIBUTOR,,"Ran into an issue while trying to build a plugin to render GeoJSON. I'm using pytest's `tmp_path` fixture, which is a `pathlib.Path`, to get a temporary database path. I was getting a weird error involving writes, but I was doing reads. Turns out it's the internal database trying to insert a `Path` where it wants a string. My test looked like this: ```python @pytest.mark.asyncio async def test_render_feature_collection(tmp_path): database = tmp_path / ""test.db"" datasette = Datasette([database]) # this will break with a path await datasette.refresh_schemas() # build a url url = datasette.urls.table(database.stem, TABLE_NAME, format=""geojson"") response = await datasette.client.get(url) fc = response.json() assert 200 == response.status_code ``` I only ran into this while running tests, because passing in database paths from the CLI uses strings, but it's a weird error and probably something other people have run into. The fix is easy enough: Convert the path to a string and everything works.
So this: ```python @pytest.mark.asyncio async def test_render_feature_collection(tmp_path): database = tmp_path / ""test.db"" datasette = Datasette([str(database)]) # this is fine now await datasette.refresh_schemas() ``` This could (probably, haven't tested) be fixed [here](https://github.com/simonw/datasette/blob/03ec71193b9545536898a4bc7493274fec48bdd7/datasette/app.py#L357) by calling `str(db.path)` or by doing that conversion earlier.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1365/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 799693777,MDU6SXNzdWU3OTk2OTM3Nzc=,1214,Re-submitting filter form duplicates _x querystring arguments,9599,simonw,closed,0,,,,,3,2021-02-02T21:13:35Z,2021-02-02T21:28:53Z,2021-02-02T21:21:13Z,OWNER,,"Really nasty bug, caused by #1194 fix in 07e163561592c743e4117f72102fcd350a600909 Navigate to this page: https://github-to-sqlite.dogsheep.net/github/labels?_search=help&_sort=id Click ""Apply"" to submit the form and the resulting URL is https://github-to-sqlite.dogsheep.net/github/labels?_search=help&_sort=id&_search=help&_sort=id That's because the (truncated) HTML for the form looks like this: ```html ... ...
... ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1214/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1190828163,I_kwDOBm6k_c5G-piD,1698,Add a warning about bots and Cloud Run,9599,simonw,closed,0,,,,,1,2022-04-03T05:57:17Z,2022-04-03T06:10:24Z,2022-04-03T06:10:24Z,OWNER,,Recommend the https://github.com/simonw/datasette-block-robots plugin if you are going to run a large database in Cloud Run (one with a lot of rows).,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1698/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1049946823,I_kwDOBm6k_c4-lOrH,1502,"Full-text search: No support to unary ""-"" operator",516827,gustavorps,open,0,,,,,0,2021-11-10T15:11:19Z,2021-11-10T15:11:19Z,,NONE,,"Reference: https://www.sqlite.org/fts3.html#set_operations_using_the_standard_query_syntax Test: https://fara.datasettes.com/fara/FARA_All_ShortForms?_search=manafort+-freedman&_sort=rowid",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1502/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 722724086,MDU6SXNzdWU3MjI3MjQwODY=,1025,"Fix last remaining links to ""/"" that do not respect base_url",9599,simonw,closed,0,,,6026070,0.51,7,2020-10-15T22:46:38Z,2020-10-23T19:44:06Z,2020-10-20T05:21:29Z,OWNER,,"Refs #1023 ``` datasette % git grep '""/""' -- '*.html' datasette/templates/error.html: home datasette/templates/patterns.html: home / datasette/templates/query.html: home / ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1025/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed