id,node_id,number,title,user,state,locked,assignee,milestone,comments,created_at,updated_at,closed_at,author_association,pull_request,body,repo,type,active_lock_reason,performed_via_github_app,reactions,draft,state_reason 1479920517,I_kwDOBm6k_c5YNcuF,1934,Return number of ignored/replaced items from /-/insert,9599,open,0,,3268330,0,2022-12-06T19:01:58Z,2022-12-06T19:02:03Z,,OWNER,,"Idea from here: - https://github.com/simonw/sqlite-utils/issues/516",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1934/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1483250004,I_kwDOBm6k_c5YaJlU,1936,Fix /db/table/-/upsert in the API explorer,9599,open,0,,3268330,2,2022-12-08T00:59:34Z,2022-12-08T01:36:02Z,,OWNER,,"Split from: - #1931 - #1878 This is a bit tricky because the code needs to figure out what the primary keys are for an item, and whether or not `rowid` should be included.",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1936/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1486036269,I_kwDOBm6k_c5Ykx0t,1941,Mechanism for supporting key rotation for DATASETTE_SECRET,9599,open,0,,,1,2022-12-09T05:24:53Z,2022-12-09T05:25:20Z,,OWNER,,"Currently if you change `DATASETTE_SECRET` all existing signed tokens - both cookies and API tokens and potentially other things too - will instantly expire. Adding support for key rotation would allow keys to be rotated on a semi-regular basis without logging everyone out / invalidating every API token instantly. Can model this on how Django does it: https://github.com/django/django/commit/0dcd549bbe36c060f536ec270d34d9e7d4b8e6c7",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1941/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1487738738,I_kwDOBm6k_c5YrRdy,1942,Option for plugins to request that JSON be served on the page,9599,open,0,,3268330,1,2022-12-10T01:08:53Z,2022-12-10T01:11:30Z,,OWNER,,"Idea came from a conversation with @hydrosquall - what if a Datasette plugin could say ""I'd like the JSON for a page to be included in a variable on the HTML page""? `datasette-cluster-map` already needs this - the first thing it does when the page loads is `fetch()` a JSON representation of that same data. This idea fits with my overall goals to unify the JSON and HTML context too. Refs: - #1711",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1942/reactions"", ""total_count"": 1, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 1, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1493404423,I_kwDOBm6k_c5ZA4sH,1948,500 error on permission debug page when testing actors with _r,9599,open,0,,,1,2022-12-13T05:22:03Z,2022-12-13T05:22:19Z,,OWNER,," The 500 error is silent unless you are looking at the DevTools network pane. 
",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1948/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1493471221,I_kwDOBm6k_c5ZBI_1,1949,`.json` errors should be returned as JSON,9599,open,0,,8755003,10,2022-12-13T06:14:12Z,2022-12-15T00:46:27Z,,OWNER,,"Eg the error in this issue: - #1945 ",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1949/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1495716243,I_kwDOBm6k_c5ZJtGT,1952,Improvements to /-/create-token restrictions interface,9599,open,0,,8755003,1,2022-12-14T05:22:39Z,2022-12-14T05:23:13Z,,OWNER,,"> It would be neat not to show write permissions against immutable databases too - and not hard from a performance perspective since it doesn't involve hundreds more permission checks. > > That will need permissions to grow a flag for if they need a mutable database though, which is a bigger job. _Originally posted by @simonw in https://github.com/simonw/datasette/issues/1947#issuecomment-1350414402_ Also, DO show the `_memory` database there if Datasette was started in `--crossdb` mode.",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1952/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1497577017,I_kwDOBm6k_c5ZQzY5,1957,Reconsider row value truncation on query page,9599,open,0,,,1,2022-12-14T23:49:47Z,2022-12-14T23:50:50Z,,OWNER,,"Consider this example: https://ripgrep.datasette.io/repos?sql=select+json_group_array%28full_name%29+from+repos ```sql select json_group_array(full_name) from repos ``` ![CleanShot 2022-12-14 at 15 48 32@2x](https://user-images.githubusercontent.com/9599/207739709-8177f683-f938-49a1-8225-42791fad88fe.png) My intention here was to get a string of JSON I can copy and paste elsewhere - see: https://til.simonwillison.net/sqlite/compare-before-after-json The truncation isn't helping here.",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1957/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1500636982,I_kwDOBm6k_c5Zcec2,1962,"Alternative, async-friendly pattern for `make_app_client()` and similar - fully retire `TestClient`",9599,open,0,,,1,2022-12-16T17:56:51Z,2022-12-16T21:55:29Z,,OWNER,,"In this issue I replaced a whole bunch of places that used the non-async `app_client` fixture with an async `ds_client` fixture instead: - #1959 But I didn't get everything, and a lot of tests are still using the old `TestClient` mechanism as a result. The main work here is replacing all of the `app_client_...` fixtures which use variants on the default client - and changing the tests that call `make_app_client()` to do something else instead. This requires some careful thought. 
I need to come up with a really nice pattern for creating variants on the `ds_client` default fixture - and do so in a way that minimizes the number of open files, refs: - #1843",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1962/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1501778647,I_kwDOBm6k_c5Zg1LX,1964,Cog menu is not keyboard accessible (also no ARIA),9599,open,0,,,1,2022-12-18T06:36:28Z,2022-12-18T06:37:28Z,,OWNER,,"This menu here: https://latest.datasette.io/fixtures/attraction_characteristic You can tab to it (see the outline) and hit space or enter to open it, but you can't then navigate the items in the open menu using the keyboard. ![cog-menu](https://user-images.githubusercontent.com/9599/208284973-2a04cdab-ed95-4316-979c-67fe5f7787db.gif) ",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1964/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1524867951,I_kwDOBm6k_c5a46Nv,1980,"""Cannot sort table by id"" when sortable_columns is used",9599,open,0,,,2,2023-01-09T03:21:33Z,2023-01-09T03:23:53Z,,OWNER,,"I had an instance with this in `metadata.yml`: ```yaml databases: timezones: tables: timezones: sortable_columns: - tzid ``` When I clicked on the ""Apply"" button here: It sent me to `/timezones/timezones?_sort=id&id__exact=133` with the error message: > 500: Cannot sort table by id",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1980/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1524983536,I_kwDOBm6k_c5a5Wbw,1981,Canned query field labels truncated,9599,open,0,,,1,2023-01-09T06:04:24Z,2023-01-09T06:05:44Z,,OWNER,,"Eg here on mobile: https://timezones.datasette.io/timezones/by_point?longitude=-0.1406632&latitude=50.8246776 ![107A1894-D1DA-4158-9EA3-40C840DD10E3](https://user-images.githubusercontent.com/9599/211248895-c922ce61-95d3-47ca-9314-dcff7c86afab.jpeg) ",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1981/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1525815985,I_kwDOBm6k_c5a8hqx,1983,Make CustomJSONEncoder a documented public API,9599,open,0,,,3,2023-01-09T15:27:05Z,2023-01-09T15:35:58Z,,OWNER,,It's used by `datasette-geojson` here: https://github.com/eyeseast/datasette-geojson/commit/902bf135a5a33a0dc8264673d00a59a67cb05152,107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1983/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1516815571,I_kwDOBm6k_c5aaMTT,1975,_col=id can cause id column to export twice in CSV export,9599,open,0,,,0,2023-01-03T00:25:15Z,2023-01-03T00:25:21Z,,OWNER,,"https://datasette.simonwillison.net/simonwillisonblog/blog_entry.csv?_col=id&_col=title&_col=body&_labels=on&_size=1 ```csv id,id,title,body 1,1,WaSP Phase II,""

The Web Standards project has launched Phase II.

"" ``` That should not have two `id` columns.",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1975/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1529707837,I_kwDOBm6k_c5bLX09,1988,Reconsider pattern where plugins could break existing template context,9599,open,0,,3268330,4,2023-01-11T21:13:43Z,2023-01-11T21:25:05Z,,OWNER,,"> I hadn't run into an issue with plugins like `datasette-template-sql` interfering with the existing context for other features before! Definitely not a good thing. _Originally posted by @simonw in https://github.com/simonw/datasette-write/issues/6#issuecomment-1379490596_ ",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1988/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1557507274,I_kwDOBm6k_c5c1azK,2005,`extra_template_vars` should be OK to return `None`,9599,open,0,,,1,2023-01-26T01:40:45Z,2023-01-26T01:41:50Z,,OWNER,,"Got this exception and had to make sure it always returned `{}`: ``` File "".../python3.11/site-packages/datasette/app.py"", line 1049, in render_template assert isinstance(extra_vars, dict), ""extra_vars is of type {}"".format( AssertionError: extra_vars is of type ```",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2005/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1558644003,I_kwDOBm6k_c5c5wUj,2006,Teach `datasette publish` to pin to `datasette<1.0` in a 0.x release,9599,open,0,,3268330,2,2023-01-26T19:17:40Z,2023-01-26T19:20:53Z,,OWNER,,"I just realized that when I ship Datasette 1.0 there may be automated deployments out there which could deploy the 1.0 version by accident, potentially breaking any customizations that aren't compatible with the 1.0 changes. 
I can hopefully help avoid that by shipping one last entry in the `0.x` series that ensures `datasette publish` pins to `<1.0` when it installs Datasette itself.",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2006/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1551113681,I_kwDOBm6k_c5cdB3R,1998,`datasette --version` should also show the SQLite version,9599,open,0,,,2,2023-01-20T16:11:30Z,2023-01-20T18:19:06Z,,OWNER,,Idea came up here: https://discord.com/channels/823971286308356157/823971286941302908/1066026473003159783,107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1998/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1554032168,I_kwDOBm6k_c5coKYo,2002,Document how actors are displayed,9599,open,0,,,0,2023-01-24T00:08:49Z,2023-01-24T00:08:49Z,,OWNER,,"https://github.com/simonw/datasette/blob/e4ebef082de90db4e1b8527abc0d582b7ae0bc9d/datasette/utils/__init__.py#L1052-L1056 This logic should be reflected in the documentation on https://docs.datasette.io/en/stable/authentication.html#actors",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2002/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1563264257,I_kwDOBm6k_c5dLYUB,2010,Row page should default to card view,9599,open,0,,3268330,1,2023-01-30T21:49:37Z,2023-01-30T21:52:06Z,,OWNER,,"Datasette currently uses the same table layout on the row pages as it does on the table pages: https://datasette.io/content/pypi_packages?_sort=name&name__exact=datasette-column-inspect https://datasette.io/content/pypi_packages/datasette-column-inspect If you shrink down to mobile width you get this instead, on both of those pages: I think that view, which I think of as the ""card view"", is plain better if you're looking at just a single row - and it (or a variant of it) should be the default presentation on the row page. 
",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2010/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1564769997,I_kwDOBm6k_c5dRH7N,2011,"Applied facet did not result in an ""x"" icon to dismiss it",9599,open,0,,,1,2023-01-31T17:57:44Z,2023-01-31T17:58:54Z,,OWNER,,"![CleanShot 2023-01-31 at 09 55 56@2x](https://user-images.githubusercontent.com/9599/215843684-1761a230-d490-4f87-be6d-186319366794.png) That's against this data https://data.sfgov.org/City-Management-and-Ethics/Supplier-Contracts/cqi5-hm2d imported using https://datasette.io/plugins/datasette-socrata It's for `Contract Type` of `Non-Purchasing Contract (Rents, etc.)` - so possible that some of the spaces or punctuation in either the name of the value tripped up the code that decides if the X icon should be displayed.",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2011/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1564774831,I_kwDOBm6k_c5dRJGv,2012,Missing space in database summary,9599,open,0,,,0,2023-01-31T18:01:13Z,2023-01-31T18:01:13Z,,OWNER,,"Spotted this on an instance index page: ",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2012/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1573424830,I_kwDOBm6k_c5dyI6-,2019,Refactor out the keyset pagination code,9599,open,0,,,14,2023-02-06T23:04:00Z,2023-02-08T01:40:46Z,,OWNER,,"While working on: - #1999 I noticed that some of the most complex code in the existing table view is the code that implements keyset pagination: https://github.com/simonw/datasette/blob/0b4a28691468b5c758df74fa1d72a823813c96bf/datasette/views/table.py#L417-L493 Extracting that into a utility function would simplify that code a lot.",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2019/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1579973223,I_kwDOBm6k_c5eLHpn,2024,Mention WAL mode in documentation,9599,open,0,,,1,2023-02-10T16:11:10Z,2023-02-10T16:11:53Z,,OWNER,,It's not currently obvious from the docs how you can ensure that Datasette runs well in situations where other processes may update the underlying SQLite files.,107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2024/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1615692818,I_kwDOBm6k_c5gTYQS,2035,Potential feature: special support for `?a=1&a=2` on the query page,9599,open,0,,3268330,14,2023-03-08T18:05:03Z,2023-03-31T16:09:08Z,,OWNER,,"From a discussion on Discord: https://discord.com/channels/823971286308356157/996877076982415491/1082789517062320138 The key idea is to make it easier for people to implement `where id in (...)` that's populated from query string arguments. 
What if you could add `?id=11&id=32&id=62` to the URL and have that made available as a list that can be used in the query?",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2035/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1618249044,I_kwDOBm6k_c5gdIVU,2038,Consider a `strict_templates` setting,9599,open,0,,,2,2023-03-10T02:09:13Z,2023-03-10T02:11:06Z,,OWNER,,"A setting which turns on Jinja strict mode, so any templates that access undefined variables raise a hard error. Prototype here: ```diff diff --git a/datasette/app.py b/datasette/app.py index 40416713..1428a3f0 100644 --- a/datasette/app.py +++ b/datasette/app.py @@ -200,6 +200,7 @@ SETTINGS = ( ""Allow display of SQL trace debug information with ?_trace=1"", ), Setting(""base_url"", ""/"", ""Datasette URLs should use this base path""), + Setting(""strict_templates"", False, ""Raise errors for undefined template variables""), ) _HASH_URLS_REMOVED = ""The hash_urls setting has been removed, try the datasette-hashed-urls plugin instead"" OBSOLETE_SETTINGS = { @@ -399,11 +400,14 @@ class Datasette: ), ] ) + env_extras = {} + if self.setting(""strict_templates""): + env_extras[""undefined""] = StrictUndefined self.jinja_env = Environment( loader=template_loader, autoescape=True, enable_async=True, - undefined=StrictUndefined, + **env_extras, ) self.jinja_env.filters[""escape_css_string""] = escape_css_string self.jinja_env.filters[""quote_plus""] = urllib.parse.quote_plus ``` Explored this idea a bit in: - #1999",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2038/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1636616315,I_kwDOBm6k_c5hjMh7,2042,Gather feedback on new ?_extra= design,9599,open,0,,,0,2023-03-22T23:07:43Z,2023-03-22T23:08:19Z,,OWNER,,"Now that I've landed: - #1999 See also: - #262 I want to get some feedback from people on the design of the new `?_extra=` feature, before freezing it into Datasette 1.0. The big change is that the default JSON representation is now MUCH slimmer - it only gives you keys for `""next""` and `""rows""`, where rows is a list of JSON objects (not a list of arrays as was previously the default) - for example https://latest.datasette.io/fixtures/sortable.json If you want extra stuff you can ask for it with the new `?_extra=` parameter - e.g. https://latest.datasette.io/fixtures/sortable.json?_extra=columns&_extra=suggested_facets You can use `?_extra=extras` to see a list of available extras: https://latest.datasette.io/fixtures/sortable.json?_extra=extras ",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2042/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1641013220,I_kwDOBm6k_c5hz9_k,2045,First column on a view page has no facet option in cog menu,9599,open,0,,3268330,0,2023-03-26T18:02:47Z,2023-03-26T18:02:48Z,,OWNER,,"e.g. first column on this page - cog menu has no option to facet. 
https://datasette.io/content/tools ",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2045/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1657861026,I_kwDOBm6k_c5i0POi,2054,"Make detailed notes on how table, query and row views work right now",9599,open,0,,,13,2023-04-06T18:21:09Z,2023-04-07T20:14:38Z,,OWNER,,"Research to help influence the following: - #2049 - #2053 - #2050 - #262 ",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2054/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1646734246,I_kwDOBm6k_c5iJyum,2049,Custom SQL queries should use new JSON ?_extra= format,9599,open,0,,8755003,4,2023-03-30T00:42:53Z,2023-04-05T23:29:27Z,,OWNER,,"Related: - #262 I've made the change to the table view, now I need the new format to work for arbitrary SQL queries too. Note that this incorporates both arbitrary SQL queries and canned queries.",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2049/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1649791661,I_kwDOBm6k_c5iVdKt,2050,Row page JSON should use new ?_extra= format,9599,open,0,,8755003,1,2023-03-31T17:56:53Z,2023-03-31T17:59:49Z,,OWNER,,"https://latest.datasette.io/fixtures/facetable/2.json Related: - #2049 - #1709 ",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2050/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1649793525,I_kwDOBm6k_c5iVdn1,2051,`?_extra=row_urls` for table pages,9599,open,0,,,0,2023-03-31T17:58:36Z,2023-03-31T17:58:36Z,,OWNER,,Provides URLs to the JSON version of those rows. Maybe it persists the `?_shape=` option too? Not sure about that.,107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2051/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1663399821,I_kwDOBm6k_c5jJXeN,2058,"500 ""attempt to write a readonly database"" error caused by ""PRAGMA schema_version""",9599,open,0,,,9,2023-04-11T23:57:50Z,2023-04-13T16:35:21Z,,OWNER,,"I've not been able to replicate this myself yet, but I've seen log files from a user affected by it. 
``` File ""/usr/local/lib/python3.11/site-packages/datasette/views/base.py"", line 89, in dispatch_request await self.ds.refresh_schemas() File ""/usr/local/lib/python3.11/site-packages/datasette/app.py"", line 371, in refresh_schemas await self._refresh_schemas() File ""/usr/local/lib/python3.11/site-packages/datasette/app.py"", line 386, in _refresh_schemas schema_version = (await db.execute(""PRAGMA schema_version"")).first()[0] ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File ""/usr/local/lib/python3.11/site-packages/datasette/database.py"", line 267, in execute results = await self.execute_fn(sql_operation_in_thread) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File ""/usr/local/lib/python3.11/site-packages/datasette/database.py"", line 213, in execute_fn return await asyncio.get_event_loop().run_in_executor( ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File ""/usr/local/lib/python3.11/concurrent/futures/thread.py"", line 58, in run result = self.fn(*self.args, **self.kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File ""/usr/local/lib/python3.11/site-packages/datasette/database.py"", line 211, in in_thread return fn(conn) ^^^^^^^^ File ""/usr/local/lib/python3.11/site-packages/datasette/database.py"", line 237, in sql_operation_in_thread cursor.execute(sql, params if params is not None else {}) sqlite3.OperationalError: attempt to write a readonly database ``` That's running the official Datasette Docker image on https://fly.io/ - it's causing 500 errors on every page of their site.",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2058/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1665510265,I_kwDOBm6k_c5jRat5,2060,Clean up a bunch of warnings from ruff,9599,open,0,,,0,2023-04-13T01:23:02Z,2023-04-13T01:23:02Z,,OWNER,,"See: - #2056 `ruff` spots a bunch of warnings about things like unused variables - would be good to clean up as many of these as possible.",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2060/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1708030220,I_kwDOBm6k_c5lznkM,2073,Faceting doesn't work against integer columns in views,9599,open,0,,,2,2023-05-12T18:20:10Z,2023-05-12T18:24:07Z,,OWNER,,"Spotted this issue here: https://til.simonwillison.net/datasette/baseline I had to do this workaround: ```sql create view baseline as select _key, spec, '' || json_extract(status, '$.is_baseline') as is_baseline, json_extract(status, '$.since') as baseline_since, json_extract(status, '$.support.chrome') as baseline_chrome, json_extract(status, '$.support.edge') as baseline_edge, json_extract(status, '$.support.firefox') as baseline_firefox, json_extract(status, '$.support.safari') as baseline_safari, compat_features, caniuse, usage_stats, status from [index] ``` I think the core issue here is that, against a table, `select * from x where integer_column = '1'` works correctly, due to some kind of column type conversion mechanism... 
but this mechanism doesn't work against views.",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2073/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1764792125,I_kwDOBm6k_c5pMJc9,2086,Show information on startup in directory configuration mode,9599,open,0,,,0,2023-06-20T07:13:33Z,2023-06-20T07:13:33Z,,OWNER,,"https://discord.com/channels/823971286308356157/823971286941302908/1120516587036889098 > One thing that would be helpful would be message at launch indicating a metadata.json is getting picked up. I'm using directory mode and was editing the wrong file for awhile before I realize nothing I was doing was having any effect.",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2086/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1765870617,I_kwDOBm6k_c5pQQwZ,2087,`--settings settings.json` option,9599,open,0,,,2,2023-06-20T17:48:45Z,2023-07-14T17:02:03Z,,OWNER,,"https://discord.com/channels/823971286308356157/823971286941302908/1120705940728066080 > May I add a request to the whole metadata / settings ? Allow to pass `--settings path/to/settings.json` instead of having to rely exclusively on directory mode to centralize settings (this would reflect the behavior of providing metadata)",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2087/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1781005740,I_kwDOBm6k_c5qJ_2s,2090,Adopt ruff for linting,9599,open,0,,,2,2023-06-29T14:56:43Z,2023-06-29T15:05:04Z,,OWNER,,https://beta.ruff.rs/docs/,107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2090/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1803264272,I_kwDOBm6k_c5re6EQ,2101,alter: true support for JSON write API,9599,open,0,,,1,2023-07-13T15:24:11Z,2023-07-13T15:24:18Z,,OWNER,,"Requested here: https://discord.com/channels/823971286308356157/823971286941302908/1129034187073134642 > The former datasette-insert plugin had an option `?alter=1` to auto-add new columns. 
Does the JSON write API also have this?",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2101/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1808116827,I_kwDOBm6k_c5rxaxb,2103,data attribute on Datasette tables exposing the primary key of the row,9599,open,0,,,0,2023-07-17T16:18:25Z,2023-07-17T16:18:25Z,,OWNER,,Maybe put it on the `` but probably better to go on the `td.type-pk`.,107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2103/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1808215339,I_kwDOBm6k_c5rxy0r,2104,Tables starting with an underscore should be treated as hidden,9599,open,0,,,2,2023-07-17T17:13:53Z,2023-07-18T22:41:37Z,,OWNER,,"Plugins can then take advantage of this pattern, for example: - https://github.com/simonw/datasette-auth-tokens/pull/8",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2104/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1822939274,I_kwDOBm6k_c5sp9iK,2113,Implement and document extras for the new query view page,9599,open,0,,8755003,3,2023-07-26T18:24:01Z,2023-08-09T17:35:22Z,,OWNER,,- #2109 ,107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2113/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1823428714,I_kwDOBm6k_c5sr1Bq,2120,Add __all__ to datasette/__init__.py,9599,open,0,,,0,2023-07-27T01:07:10Z,2023-07-27T01:07:10Z,,OWNER,,"Currently looks like this: https://github.com/simonw/datasette/blob/08181823990a71ffa5a1b57b37259198eaa43e06/datasette/__init__.py#L1-L6 Adding `__all__ = [""Permission"", ""Forbidden""...]` would let me get rid of those `# noqa` comments.",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2120/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1843821954,I_kwDOBm6k_c5t5n2C,2137,Redesign row default JSON,9599,open,0,,8755003,1,2023-08-09T18:49:11Z,2023-08-09T19:02:47Z,,OWNER,,"This URL here: https://latest.datasette.io/fixtures/simple_primary_key/1.json?_extras=foreign_key_tables ```json { ""database"": ""fixtures"", ""table"": ""simple_primary_key"", ""rows"": [ { ""id"": ""1"", ""content"": ""hello"" } ], ""columns"": [ ""id"", ""content"" ], ""primary_keys"": [ ""id"" ], ""primary_key_values"": [ ""1"" ], ""units"": {}, ""foreign_key_tables"": [ { ""other_table"": ""foreign_key_references"", ""column"": ""id"", ""other_column"": ""foreign_key_with_blank_label"", ""count"": 0, ""link"": ""/fixtures/foreign_key_references?foreign_key_with_blank_label=1"" }, { ""other_table"": ""foreign_key_references"", ""column"": ""id"", ""other_column"": ""foreign_key_with_label"", ""count"": 1, ""link"": ""/fixtures/foreign_key_references?foreign_key_with_label=1"" }, { ""other_table"": ""complex_foreign_keys"", ""column"": ""id"", ""other_column"": ""f3"", ""count"": 1, ""link"": ""/fixtures/complex_foreign_keys?f3=1"" }, { ""other_table"": ""complex_foreign_keys"", ""column"": ""id"", ""other_column"": ""f2"", ""count"": 0, ""link"": ""/fixtures/complex_foreign_keys?f2=1"" }, { 
""other_table"": ""complex_foreign_keys"", ""column"": ""id"", ""other_column"": ""f1"", ""count"": 1, ""link"": ""/fixtures/complex_foreign_keys?f1=1"" } ], ""query_ms"": 4.226590999678592, ""source"": ""tests/fixtures.py"", ""source_url"": ""https://github.com/simonw/datasette/blob/main/tests/fixtures.py"", ""license"": ""Apache License 2.0"", ""license_url"": ""https://github.com/simonw/datasette/blob/main/LICENSE"", ""ok"": true, ""truncated"": false } ``` That `?_extras=` should be `?_extra=` - plus the row JSON should be redesigned to fit the new default JSON representation.",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2137/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1838469176,I_kwDOBm6k_c5tlNA4,2127,Context base class to support documenting the context,9599,open,0,,3268330,3,2023-08-07T00:01:02Z,2023-08-10T01:30:25Z,,OWNER,,"This idea first came up here: - https://github.com/simonw/datasette/issues/2112#issuecomment-1652751140 If `datasette.render_template(...)` takes an optional `Context` subclass as an alternative to a context dictionary, I could then use dataclasses to define the context made available to specific templates - which then gives me something I can use to help document what they are. Also refs: - https://github.com/simonw/datasette/issues/1510",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2127/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1840324765,I_kwDOBm6k_c5tsSCd,2129,CSV ?sql= should indicate errors,9599,open,0,,3268330,1,2023-08-07T23:13:04Z,2023-08-08T02:02:21Z,,OWNER,,"> https://latest.datasette.io/_memory.csv?sql=select+blah is a blank page right now: ```bash curl -I 'https://latest.datasette.io/_memory.csv?sql=select+blah' ``` ``` HTTP/2 200 access-control-allow-origin: * access-control-allow-headers: Authorization, Content-Type access-control-expose-headers: Link access-control-allow-methods: GET, POST, HEAD, OPTIONS access-control-max-age: 3600 content-type: text/plain; charset=utf-8 x-databases: _memory, _internal, fixtures, fixtures2, extra_database, ephemeral date: Mon, 07 Aug 2023 23:12:15 GMT server: Google Frontend ``` _Originally posted by @simonw in https://github.com/simonw/datasette/issues/2118#issuecomment-1668688947_",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2129/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1840417903,I_kwDOBm6k_c5tsoxv,2131,Refactor code that supports templates_considered comment,9599,open,0,,3268330,1,2023-08-08T01:28:36Z,2023-08-09T15:27:41Z,,OWNER,,"I ended up duplicating it here: https://github.com/simonw/datasette/blob/7532feb424b1dce614351e21b2265c04f9669fe2/datasette/views/database.py#L164-L167 I think it should move to `datasette.render_template()` - and maybe have a renamed template variable too.",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2131/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1857234285,I_kwDOBm6k_c5usyVt,2145,If a row has a primary key of `null` various things break,9599,open,0,,,23,2023-08-18T20:06:28Z,2023-08-21T17:30:01Z,,OWNER,,"Stumbled 
across this while experimenting with `datasette-write-ui`. The error I got was a 500 on the `/db` page: > `'NoneType' object has no attribute 'encode'` Tracked it down to this code, which assembles the URL for a row page: https://github.com/simonw/datasette/blob/943df09dcca93c3b9861b8c96277a01320db8662/datasette/utils/__init__.py#L120-L134 That's because `tilde_encode` can't handle `None`: https://github.com/simonw/datasette/blob/943df09dcca93c3b9861b8c96277a01320db8662/datasette/utils/__init__.py#L1175-L1178 ",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2145/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1876353656,I_kwDOBm6k_c5v1uJ4,2168,Consider a request/response wrapping hook slightly higher level than asgi_wrapper(),9599,open,0,,,6,2023-08-31T21:42:04Z,2023-09-10T17:54:08Z,,OWNER,,"There's a long justification for why this might be needed here: - https://github.com/simonw/datasette-auth-tokens/issues/10#issuecomment-1701820001 Short version: it would be neat if it was possible to stash some data on the `request` object such that a later plugin/middleware-type-thing could use that to influence the final returned response - similar to the kinds of things you can do with Django middleware. The `asgi_wrapper()` mechanism doesn't have access to the request or response objects - it gets `scope` and can mess around with `receive` and `send`, but those are pretty low-level primitives. Since Datasette has well-defined `request` and `response` objects now it might be nice to have a middleware layer that can manipulate those directly.",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2168/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1865649347,I_kwDOBm6k_c5vM4zD,2156,datasette -s/--setting option for setting nested configuration options,9599,open,0,,,4,2023-08-24T18:09:27Z,2023-08-28T19:33:05Z,,OWNER,,"> I've been thinking about what it might look like to allow command-line arguments to be used to define _any_ of the configuration options in `datasette.yml`, as alternative and more convenient syntax. > > Here's what I've come up with: > ``` > datasette \ > -s settings.sql_time_limit_ms 1000 \ > -s plugins.datasette-auth-tokens.manage_tokens true \ > -s plugins.datasette-auth-tokens.manage_tokens_database tokens \ > mydatabase.db tokens.db > ``` > Which would be equivalent to `datasette.yml` containing this: > ```yaml > plugins: > datasette-auth-tokens: > manage_tokens: true > manage_tokens_database: tokens > settings: > sql_time_limit_ms: 1000 > ``` More details in https://github.com/simonw/datasette/issues/2143#issuecomment-1690792514 ",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2156/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1875739055,I_kwDOBm6k_c5vzYGv,2167,Document return type of await ds.permission_allowed(),9599,open,0,,,0,2023-08-31T15:14:23Z,2023-08-31T15:14:23Z,,OWNER,,"The return type isn't documented here: https://github.com/simonw/datasette/blob/4c3ef033110407f3b3dbce501659d523724985e0/docs/internals.rst#L327-L350 On inspecting the code I'm not 100% sure if it's possible for this. method to return `None`, or if it can only return `True` or `False`. Need to confirm that. 
https://github.com/simonw/datasette/blob/4c3ef033110407f3b3dbce501659d523724985e0/datasette/app.py#L822C15-L853",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2167/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1895266807,I_kwDOBm6k_c5w93n3,2184,Design decision - should configuration be exposed at /-/config ?,9599,open,0,,,0,2023-09-13T21:07:08Z,2023-09-13T21:07:38Z,,OWNER,,"> This made me think. That `{""$env"": ""ENV_VAR""}` hack was introduced back here: > > - https://github.com/simonw/datasette/issues/538 > > The problem it was solving was that metadata was visible to everyone with access to the instance at `/-/metadata` but plugins clearly needed a way to set secret settings. > > Now that this stuff is moving to config, we have some decisions to make: > > 1. Add `/-/config` to let people see the configuration of their instance, and keep the `$env` trick for secret settings. > 2. Say all configuration aside from metadata is secret and make `$env` optional or ditch it entirely. > 3. Allow plugins to announce which of their configuration options are secret so we can automatically redact them from `/-/config` > > I've found `/-/metadata` extraordinarily useful as a user of Datasette - it really helps me understand exactly what's going on if I run into any problems with a plugin, if I can quickly check what the settings look like. > > So I'm leaning towards option 1 or 3. _Originally posted by @simonw in https://github.com/simonw/datasette/pull/2183#discussion_r1325076924_ Also refs: - #2093",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2184/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1884408624,I_kwDOBm6k_c5wUcsw,2177,Move schema tables from _internal to _catalog,9599,open,0,,,1,2023-09-06T16:58:33Z,2023-09-06T17:04:30Z,,OWNER,,"This came up in discussion over: - https://github.com/simonw/datasette/pull/2174 ",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2177/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1898927976,I_kwDOBm6k_c5xL1do,2186,Mechanism for register_output_renderer hooks to access full count,9599,open,0,,3268330,2,2023-09-15T18:57:54Z,2023-09-15T19:27:59Z,,OWNER,,"The cause of this bug: - https://github.com/simonw/datasette-export-notebook/issues/17 Is that `datasette-export-notebook` was consulting `data[""filtered_table_rows_count""]` in the render output plugin function in order to show the total number of rows that would be exported. That field is no longer available by default - the `""count""` field is only available if `?_extra=count` was passed. It would be useful if plugins like this could access the total count on demand, should they need to.",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2186/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1907765514,I_kwDOBm6k_c5xtjEK,2195,`datasette publish` needs support for the new config/metadata split,9599,open,0,,,9,2023-09-21T21:08:12Z,2023-09-21T22:57:48Z,,OWNER,,"> ... which raises the challenge that `datasette publish` doesn't yet know what to do with a config file! 
_Originally posted by @simonw in https://github.com/simonw/datasette/issues/2194#issuecomment-1730259871_ ",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2195/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1940346034,I_kwDOBm6k_c5zp1Sy,2199,Detailed upgrade instructions for metadata.yaml -> datasette.yaml,9599,open,0,,3268330,7,2023-10-12T16:21:25Z,2023-10-12T22:08:42Z,,OWNER,,"> `Exception: Datasette no longer accepts plugin configuration in --metadata. Move your ""plugins"" configuration blocks to a separate file - we suggest calling that datasette..json - and start Datasette with datasette -c datasette..json. See https://docs.datasette.io/en/latest/configuration.html for more details.` > > I think we should link directly to documentation that tells people how to perform this upgrade. _Originally posted by @simonw in https://github.com/simonw/datasette/issues/2190#issuecomment-1759947021_ ",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2199/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1888477283,I_kwDOC8SPRc5wj-Bj,38,Run `rebuild_fts` after building the index,9599,open,0,,,0,2023-09-08T23:17:45Z,2023-09-08T23:17:45Z,,MEMBER,,"In: - https://github.com/simonw/datasette.io/issues/152#issuecomment-1712323347 This turned out to be the fix: ```bash dogsheep-beta index dogsheep-index.db templates/dogsheep-beta.yml sqlite-utils rebuild-fts dogsheep-index.db ```",197431109,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-beta/issues/38/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1071531082,I_kwDOCGYnMM4_3kRK,349,A way of creating indexes on newly created tables,9599,open,0,,,3,2021-12-05T18:56:12Z,2021-12-07T01:04:37Z,,OWNER,,"I'm writing code for https://github.com/simonw/git-history/issues/33 that creates a table inside a loop: ```python item_pk = db[item_table].lookup( {""_item_id"": item_id}, item_to_insert, column_order=(""_id"", ""_item_id""), pk=""_id"", ) ``` I need to look things up by `_item_id` on this table, which means I need an index on that column (the table can get very big). But there's no mechanism in SQLite utils to detect if the table was created for the first time and add an index to it. And I don't want to run `CREATE INDEX IF NOT EXISTS` every time through the loop. This should work like the `foreign_keys=` mechanism. 
",140912432,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/349/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1072435124,I_kwDOCGYnMM4_7A-0,350,Optional caching mechanism for table.lookup(),9599,open,0,,,3,2021-12-06T17:54:25Z,2021-12-06T17:56:57Z,,OWNER,,"Inspired by work on `git-history` where I used this pattern: ```python column_name_to_id = {} def column_id(column): if column not in column_name_to_id: id = db[""columns""].lookup( {""namespace"": namespace_id, ""name"": column}, foreign_keys=((""namespace"", ""namespaces"", ""id""),), ) column_name_to_id[column] = id return column_name_to_id[column] ``` If you're going to be doing a large number of `table.lookup(...)` calls and you know that no other script will be modifying the database at the same time you can presumably get a big speedup using a Python in-memory cache - maybe even a LRU one to avoid memory bloat.",140912432,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/350/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1072792507,I_kwDOCGYnMM4_8YO7,352,`sqlite-utils insert --extract colname`,9599,open,0,,,4,2021-12-07T00:55:44Z,2022-02-03T22:59:36Z,,OWNER,,"Is there a reason I've not added `--extract` as an option for `sqlite-utils insert` next? There's a `extracts=` option for the various `table.insert()` etc methods - last line in this code block: https://github.com/simonw/sqlite-utils/blob/213a0ff177f23a35f3b235386366ff132eb879f1/sqlite_utils/db.py#L2483-L2495",140912432,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/352/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1066563554,I_kwDOCGYnMM4_knfi,346,Way to test SQLite 3.37 (and potentially other versions) in CI,9599,open,0,,,5,2021-11-29T22:21:06Z,2021-11-29T23:12:49Z,,OWNER,,"> Need to figure out a good pattern for testing this in CI too - it will currently skip the new tests if it doesn't have SQLite 3.37 or higher. _Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/344#issuecomment-982076924_",140912432,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/346/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1977155641,I_kwDOCGYnMM512QA5,601,Move plugin directory into documentation,9599,open,0,,,0,2023-11-04T04:07:52Z,2023-11-04T04:07:52Z,,OWNER,,"https://github.com/simonw/sqlite-utils-plugins should be in the official documentation. I can use the same pattern as https://llm.datasette.io/en/stable/plugins/directory.html https://til.simonwillison.net/readthedocs/stable-docs",140912432,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/601/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1090798237,I_kwDOCGYnMM5BBEKd,359,Use RETURNING if available to populate last_pk,9599,open,0,,,0,2021-12-29T23:43:23Z,2021-12-29T23:43:23Z,,OWNER,,"Inspired by this: https://news.ycombinator.com/item?id=29729283 > Because SQLite is effectively serializing all the writes for us, we have zero locking in our code. 
We used to have to lock when inserting new items (to get the LastInsertRowId), but the newer version of SQLite supports the RETURNING keyword, so we don't even have to lock on inserts now.",140912432,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/359/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1122446693,I_kwDOCGYnMM5C5y1l,394,Test against Python 3.11-dev,9599,open,0,,,1,2022-02-02T22:21:03Z,2022-02-03T21:06:35Z,,OWNER,,"Same as: - https://github.com/simonw/datasette/issues/1621",140912432,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/394/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1124731464,I_kwDOCGYnMM5DCgpI,399,"Make it easier to insert geometries, with documentation and maybe code",9599,open,0,,,25,2022-02-05T00:11:26Z,2023-05-16T03:11:52Z,,OWNER,,"In playing with the new SpatiaLite helpers from #385 I noticed that actually populating geometry columns is still a little bit tricky. Here's what I ended up doing: ```python import httpx, sqlite_utils db = sqlite_utils.Database(""/tmp/spatial.db"") attractions = httpx.get(""https://latest.datasette.io/fixtures/roadside_attractions.json?_shape=array"").json() db[""attractions""].insert_all(attractions, pk=""pk"") # Schema of that table is now: # CREATE TABLE [attractions] ( # [pk] INTEGER PRIMARY KEY, # [name] TEXT, # [address] TEXT, # [latitude] FLOAT, # [longitude] FLOAT # ) db.init_spatialite() db[""attractions""].add_geometry_column(""point"", ""POINT"") db.execute("""""" update attractions set point = GeomFromText( 'POINT(' || longitude || ' ' || latitude || ')', 4326 ) """""") ``` That last line took some figuring out - especially the need for the SRID of `4326`, without which I got this error: > `IntegrityError: attractions.point violates Geometry constraint [geom-type or SRID not allowed]` It would be good to both document this in more detail, but ideally also to come up with a more obvious pattern for inserting common types of spatial data. Also related: - #398 - #79",140912432,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/399/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1125297737,I_kwDOCGYnMM5DEq5J,402,Advanced class-based `conversions=` mechanism,9599,open,0,,,14,2022-02-06T19:47:41Z,2022-02-16T10:18:55Z,,OWNER,,"The `conversions=` parameter works like this at the moment: https://sqlite-utils.datasette.io/en/3.23/python-api.html#converting-column-values-using-sql-functions ```python db[""places""].insert( {""name"": ""Wales"", ""geometry"": wkt}, conversions={""geometry"": ""GeomFromText(?, 4326)""}, ) ``` This proposal is to support values in that dictionary that are objects, not strings, which can represent more complex conversions - spun out from #399. New proposed mechanism: ```python from sqlite_utils.utils import LongitudeLatitude db[""places""].insert( { ""name"": ""London"", ""point"": (-0.118092, 51.509865) }, conversions={""point"": LongitudeLatitude}, ) ``` Here `LongitudeLatitude` is a magical value which does TWO things: it sets up the `GeomFromText(?, 4326)` SQL function, and it handles converting the `(51.509865, -0.118092)` tuple into a `POINT({} {})` string. 
This would involve a change to the `conversions=` contract - where it usually expects a SQL string fragment, but it can also take an object which combines that SQL string fragment with a Python conversion function. Best of all... this resolves the `lat, lon` v.s. `lon, lat` dilemma because you can use `from sqlite_utils.utils import LongitudeLatitude` OR `from sqlite_utils.utils import LatitudeLongitude` depending on which you prefer! _Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/399#issuecomment-1030739566_",140912432,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/402/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1149661489,I_kwDOCGYnMM5EhnEx,409,`with db:` for transactions,9599,open,0,,,3,2022-02-24T19:22:06Z,2022-10-01T03:42:50Z,,OWNER,,This can be a documented wrapper around `with db.conn:`.,140912432,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/409/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1160182768,I_kwDOCGYnMM5FJvvw,412,Optional Pandas integration,9599,open,0,,,13,2022-03-05T01:49:27Z,2022-06-14T15:36:29Z,,OWNER,,"It would be neat if there was a way to use this more seamlessly with Pandas, in particular Pandas dataframes - but without making Pandas a required dependency.",140912432,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/412/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1181236173,I_kwDOCGYnMM5GaDvN,422,Reconsider not running convert functions against null values,9599,open,0,,,1,2022-03-25T20:22:40Z,2022-03-25T20:23:21Z,,OWNER,,"I just got caught out by the fact that `None` values are not processed by the `.convert()` mechanism https://github.com/simonw/sqlite-utils/blob/0b7b80bd40fe86e4d66a04c9f607d94991c45c0b/sqlite_utils/db.py#L2504-L2510 I had run this code while working on #420 and I wasn't sure why it didn't work: ``` $ sqlite-utils add-column content.db articles score float $ sqlite-utils convert content.db articles score ' import random random.seed(10) def convert(value): global random return random.random() ' ``` The reason it didn't work is that the newly added `score` column was full of `null` values. I fixed it by doing this instead: $ sqlite-utils add-column content.db articles score float --not-null-default 1.0 But this indicates to me that the design of `convert()` here may be incorrect.",140912432,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/422/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1215216249,I_kwDOCGYnMM5Ibrp5,428,Research adding support for savepoints,9599,open,0,,,1,2022-04-26T01:04:01Z,2022-04-26T01:05:29Z,,OWNER,,"https://www.sqlite.org/lang_savepoint.html Savepoints are like regular transactions except they have names and can be nested. Would there be any value in adding support to them to `sqlite-utils`, potentially as some kind of context manager? 
Something like this: ```python with db.savepoint(""name""): # do stuff with db.savepoint(""name2""): # do more stuff raise Release # Rolls back to before ""name2"" savepoint ``` I've never used this feature so I'm not comfortable adding anything like this without a bunch of extra research.",140912432,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/428/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1271426387,I_kwDOCGYnMM5LyG1T,444,CSV `extras_key=` and `ignore_extras=` equivalents for CLI tool,9599,open,0,,,5,2022-06-14T22:22:47Z,2022-07-07T16:39:18Z,,OWNER,,"> I forgot to add equivalents of `extras_key=` and `ignore_extras=` to the CLI tool - will do that in a separate issue. _Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/440#issuecomment-1155767915_",140912432,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/444/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1326349129,I_kwDOCGYnMM5PDntJ,461,Consider including animated SVG console demos,9599,open,0,,,1,2022-08-02T20:10:04Z,2022-08-02T20:12:14Z,,OWNER,,"I recorded this one using https://github.com/nbedos/termtosvg - with `pipx install termtosvg` and then `termtosvg` - execute demo - `exit` to save. ![sqlite-utils-insert-json](https://user-images.githubusercontent.com/9599/182464206-f4976af4-eda8-4020-8257-4ada1867fb44.svg) ```json [ { ""id"": 1, ""name"": ""Catimus"" }, { ""id"": 2, ""name"": ""Feliopia"" } ] ```",140912432,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/461/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1353481513,I_kwDOCGYnMM5QrH0p,478,`sqlite-utils tables data.db table1 table2`,9599,open,0,,,1,2022-08-28T22:05:53Z,2022-08-28T22:22:35Z,,OWNER,,"The `sqlite-utils tables` command currently lists all tables. If you have a huge table in there then running it with `--counts` can get expensive, because of the huge table. Would be useful if it could accept an optional list of tables that it should execute against, as an alternative to the default of all of them. This should be a backwards compatible change. 
Current design is: https://sqlite-utils.datasette.io/en/stable/cli-reference.html#tables ``` Usage: sqlite-utils tables [OPTIONS] PATH List the tables in the database Example: sqlite-utils tables trees.db ```",140912432,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/478/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1373224657,I_kwDOCGYnMM5R2b7R,488,`sqlite-utils transform` should set empty strings to null when converting text columns to integer/float,9599,open,0,,,5,2022-09-14T15:51:30Z,2022-12-23T17:38:55Z,,OWNER,,"``` /tmp % echo ""id,age,weight\n1,3,2.5\n2,,"" | sqlite-utils insert test.db test - --csv /tmp % sqlite-utils schema test.db CREATE TABLE [test] ( [id] TEXT, [age] TEXT, [weight] TEXT ); /tmp % sqlite-utils transform test.db test --type age integer --type weight float /tmp % sqlite-utils schema test.db CREATE TABLE ""test"" ( [id] TEXT, [age] INTEGER, [weight] FLOAT ); /tmp % sqlite-utils rows test.db test [{""id"": ""1"", ""age"": 3, ""weight"": 2.5}, {""id"": ""2"", ""age"": """", ""weight"": """"}] ``` It would be neat if this resulted in the following instead: ``` {""id"": ""2"", ""age"": null, ""weight"": null} ``` Related Discord discussion: https://discord.com/channels/823971286308356157/823971286941302908/1019635490833567794",140912432,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/488/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1374939463,I_kwDOCGYnMM5R8-lH,489,Ability to load JSON records held in a file with a single top level key that is a list of objects,9599,open,0,,,9,2022-09-15T18:46:03Z,2022-09-15T20:56:10Z,,OWNER,,"It's very common for JSON to look like this: ```json { ""Version"": ""5.5.52.6"", ""List"": [ { ""Description"": ""Nonpartisan"", ""Id"": 1, ""ExternalId"": """" }, { ""Description"": ""Undeclared"", ""Id"": 2, ""ExternalId"": """" } ] } ``` This example taken from the records downloaded from https://www.elections.alaska.gov/election-results/e/ Right now you can't import this into `sqlite-utils` - you need to run it through `jq .List` first. 
But since this is so common, it would be neat if `sqlite-utils` could have a rule of thumb that says ""if it's an object, but it has a single key that is a list of objects, use that instead"".",140912432,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/489/reactions"", ""total_count"": 2, ""+1"": 2, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1359604075,I_kwDOCGYnMM5RCelr,481,"Idea: `sqlite-utils create-table tablename --sql ""select ...""`",9599,open,0,,,0,2022-09-02T01:41:24Z,2022-09-02T01:42:08Z,,OWNER,,"Could offer syntactic sugar for: ```sql create table foo as select * from bar ``` ``` sqlite-utils create-table data.db foo --sql ""select * from bar"" ``` https://sqlite-utils.datasette.io/en/stable/cli-reference.html#create-table",140912432,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/481/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1363766973,I_kwDOCGYnMM5RSW69,484,Expose convert recipes to `sqlite-utils --functions`,9599,open,0,,,11,2022-09-06T20:15:08Z,2022-09-07T19:09:52Z,,OWNER,,"`--functions` was added in: - #471 It would be useful if the `r.jsonsplit()` and similar recipes for `sqlite-utils convert` could be used in these blocks of code too: https://sqlite-utils.datasette.io/en/stable/cli.html#sqlite-utils-convert-recipes",140912432,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/484/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1386530156,I_kwDOCGYnMM5SpMVs,492,Idea: ability to pass extra variables to `--convert` scripts,9599,open,0,,,1,2022-09-26T18:30:45Z,2022-09-26T18:33:19Z,,OWNER,,"Got this idea from this example in https://jeqo.github.io/notes/2022-09-24-ingest-logs-sqlite/ ```bash sqlite-utils insert /tmp/kafka-logs.db logs server.log.2022-09-24-21 --text --convert "" import re r = re.compile(r'^\[(?P\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2},\d{3})\] (?P\w+) (?P(.+(\n(?\!\[).+|)+))', re.MULTILINE) def convert(text): rows = [m.groupdict() for m in r.finditer(text)] for row in rows: row.update({'server': 'localhost'}) row.update({'component': 'broker'}) return rows "" ``` And the accompanying note: > The `row.update` allows to label rows as I’m planning to ingest logs from different hosts and potentially different components. This made me think: it might be neat if you could inject additional variable values into that script with extra command-line options, to make this kind of reuse easier. 
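The general mechanism could be as simple as injecting the extra values into the namespace the conversion code runs in - a rough sketch of that idea (this is not how `sqlite-utils` implements `--convert` today, just an illustration; the helper and variable names are made up):

```python
CONVERT_CODE = '''
def convert(text):
    # server and component arrive from the hypothetical --var options
    return [{'line': line, 'server': server, 'component': component}
            for line in text.splitlines()]
'''

def run_convert(code, text, extra_vars=None):
    # Extra variables become globals visible to the user-supplied convert()
    namespace = dict(extra_vars or {})
    exec(code, namespace)
    return namespace['convert'](text)

rows = run_convert(CONVERT_CODE, 'first line\nsecond line',
                   {'server': 'localhost', 'component': 'broker'})
print(rows)
```
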
Something like this: ```bash sqlite-utils insert /tmp/kafka-logs.db logs server.log.2022-09-24-21 --text --convert "" import re r = re.compile(r'^\[(?P\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2},\d{3})\] (?P\w+) (?P(.+(\n(?\!\[).+|)+))', re.MULTILINE) def convert(text): rows = [m.groupdict() for m in r.finditer(text)] for row in rows: row.update({'server': server}) row.update({'component': component}) return rows "" --var server ""localhost"" --var component ""broker"" ```",140912432,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/492/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1386562662,I_kwDOCGYnMM5SpURm,493,Tiny typographical error in install/uninstall docs,9599,open,0,,,3,2022-09-26T19:00:42Z,2022-10-25T21:31:15Z,,OWNER,,"Added in: - #483 I don't know how to fix this in Sphinx: I'm getting this: https://sqlite-utils.datasette.io/en/latest/cli.html#cli-install > The [insert –convert](https://sqlite-utils.datasette.io/en/latest/cli.html#cli-insert-convert) and [query –functions](https://sqlite-utils.datasette.io/en/latest/cli.html#cli-query-functions) options But I want it to display `insert --convert` and not `insert –convert` there. Here's the code: https://github.com/simonw/sqlite-utils/blob/85247038f70d7eb2f3e272cfeaa4c44459cafba8/docs/cli.rst#L2125",140912432,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/493/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1479914599,I_kwDOCGYnMM5YNbRn,516,Feature request: output number of ignored/replaced rows for insert command,9599,open,0,,,4,2022-12-06T18:59:21Z,2022-12-06T19:08:14Z,,OWNER,,"https://hachyderm.io/@briandorsey/109468185742876820 > I'm fiddling with piping json to `insert -ignore` I'd love to see the count of records inserted & ignored, but didn't see a way to do that in the help/docs. > > Example: `xh ""https://hachyderm.io/api/v1/timelines/tag/rust?max_id=109443380308326328"" | sqlite-utils insert aoc.db aoc - --pk=id --ignore`",140912432,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/516/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1700840265,I_kwDOCGYnMM5lYMNJ,541,Get tests to pass with `pytest -Werror`,9599,open,0,,,1,2023-05-08T19:57:23Z,2023-05-08T19:59:35Z,,OWNER,,"Inspired by: - #534",140912432,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/541/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1700936245,I_kwDOCGYnMM5lYjo1,542,Remove `skip_false=True` and `--no-skip-false` in `sqlite-utils` 4.0,9599,open,0,,9374594,1,2023-05-08T21:04:28Z,2023-05-08T21:07:41Z,,OWNER,,"Following: - #527 The only reason I didn't remove this mis-feature entirely is that it represents a backwards incompatible change. 
I'll make that change in 4.0.",140912432,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/542/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1784794489,I_kwDOCGYnMM5qYc15,562,Explore the intersection between sqlite-utils and dataclasses,9599,open,0,,,1,2023-07-02T19:23:08Z,2023-07-02T19:26:39Z,,OWNER,,"> Aside: this makes me think it might be cool if `sqlite-utils` had a way of working with dataclasses rather than just dicts, and knew how to create a SQLite table to match a dataclass and maybe how to code-generate dataclasses for a specific table schema (dynamically or even using code-generation that can be written to disk, for better editor integrations). _Originally posted by @simonw in https://github.com/simonw/llm/issues/65#issuecomment-1616742529_ ",140912432,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/562/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1818838294,I_kwDOCGYnMM5saUUW,578,Plugin hook for adding new output formats,9599,open,0,,,5,2023-07-24T17:29:18Z,2023-08-07T15:41:49Z,,OWNER,,"> What would it take to add a format hook? I'm still thinking about my GIS workflow, and being able to do `sqlite-utils query ... --geojson` would be nice. It's the one place my Datasette workflow is messy, having to do `datasette . --get /path/to/query.geojson --setting max_rows_returned 10000 --load-extension spatialite`. > I know the current pattern is `--csv`, but maybe `--format geojson` is more future-proof. https://discord.com/channels/823971286308356157/997738192360964156/1133076679011602432",140912432,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/578/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1856075668,I_kwDOCGYnMM5uoXeU,586,.transform() fails to drop column if table is part of a view,9599,open,0,,,3,2023-08-18T05:25:22Z,2023-08-18T06:13:47Z,,OWNER,,"I got this error trying to drop a column from a table that was part of a SQL view: > error in view plugins: no such table: main.pypi_releases Upon further investigation I found that this pattern seemed to fix it: ```python def transform_the_table(conn): # Run this in a transaction: with conn: # We have to read all the views first, because we need to drop and recreate them db = sqlite_utils.Database(conn) views = {v.name: v.schema for v in db.views if table.lower() in v.schema.lower()} for view in views.keys(): db[view].drop() db[table].transform( types=types, rename=rename, drop=drop, column_order=[p[0] for p in order_pairs], ) # Now recreate the views for name, schema in views.items(): db.create_view(name, schema) ``` So grab a copy of any view that might reference this table, start a transaction, drop those views, run the transform, recreate the views again. > I wonder if this should become an option in `sqlite-utils`? Maybe a `recreate_views=True` argument for `table.transform(...)`? Should it be opt-in or opt-out? 
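If it did exist, a call might look like this (purely hypothetical - `recreate_views=` is not a real argument today, and the table and column names are made up):

```python
db['my_table'].transform(
    drop={'unwanted_column'},
    recreate_views=True,  # hypothetical flag: drop and recreate referencing views around the transform
)
```
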
_Originally posted by @simonw in https://github.com/simonw/datasette-edit-schema/issues/35#issuecomment-1683370548_ ",140912432,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/586/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1868713944,I_kwDOCGYnMM5vYk_Y,588,`table.get(column=value)` option for retrieving things not by their primary key,9599,open,0,,,1,2023-08-28T00:41:23Z,2023-08-28T00:41:54Z,,OWNER,,"This came up working on this feature: - https://github.com/simonw/llm/pull/186 I have a table with this schema: ```sql CREATE TABLE [collections] ( [id] INTEGER PRIMARY KEY, [name] TEXT, [model] TEXT ); CREATE UNIQUE INDEX [idx_collections_name] ON [collections] ([name]); ``` So the primary key is an integer (because it's going to have a huge number of rows foreign key related to it, and I don't want to store a larger text value thousands of times), but there is a unique constraint on the `name` - that would be the primary key column if not for all of those foreign keys. Problem is, fetching the collection by name is actually pretty inconvenient. Fetch by numeric ID: ```python try: table[""collections""].get(1) except NotFoundError: # It doesn't exist ``` Fetching by name: ```python def get_collection(db, collection): rows = db[""collections""].rows_where(""name = ?"", [collection]) try: return next(rows) except StopIteration: raise NotFoundError(""Collection not found: {}"".format(collection)) ``` It would be neat if, for columns where we know that we should always get 0 or one result, we could do this instead: ```python try: collection = table[""collections""].get(name=""entries"") except NotFoundError: # It doesn't exist ``` The existing `.get()` method doesn't have any non-positional arguments, so using `**kwargs` like that should work: https://github.com/simonw/sqlite-utils/blob/1260bdc7bfe31c36c272572c6389125f8de6ef71/sqlite_utils/db.py#L1495",140912432,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/588/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1879209560,I_kwDOCGYnMM5wAnZY,589,Mechanism for de-registering registered SQL functions,9599,open,0,,,3,2023-09-03T19:32:39Z,2023-09-03T19:36:34Z,,OWNER,,I used a custom SQL function in a migration script and then realized that it should be de-registered before the end of the script to avoid leaking into the calling code.,140912432,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/589/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1879214365,I_kwDOCGYnMM5wAokd,590,Ability to tell if a Database is an in-memory one,9599,open,0,,,1,2023-09-03T19:50:15Z,2023-09-03T19:50:36Z,,OWNER,,"Currently the constructor accepts `memory=True` or `memory_name=...` and uses those to create a connection, but does not record what those values were: https://github.com/simonw/sqlite-utils/blob/1260bdc7bfe31c36c272572c6389125f8de6ef71/sqlite_utils/db.py#L307-L349 This makes it hard to tell if a database object is to an in-memory or a file-based database, which is sometimes useful to know.",140912432,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/590/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, 
""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1891614971,I_kwDOCGYnMM5wv8D7,594,Represent compound foreign keys in table.foreign_keys output,9599,open,0,,,2,2023-09-12T03:48:24Z,2023-09-12T03:51:13Z,,OWNER,,"Given this schema: ```sql CREATE TABLE departments ( campus_name TEXT NOT NULL, dept_code TEXT NOT NULL, dept_name TEXT, PRIMARY KEY (campus_name, dept_code) ); CREATE TABLE courses ( course_code TEXT PRIMARY KEY, course_name TEXT, campus_name TEXT NOT NULL, dept_code TEXT NOT NULL, FOREIGN KEY (campus_name, dept_code) REFERENCES departments(campus_name, dept_code) ); ``` The output of `db[""courses""].foreign_keys` right now is: ``` [ForeignKey(table='courses', column='campus_name', other_table='departments', other_column='campus_name'), ForeignKey(table='courses', column='dept_code', other_table='departments', other_column='dept_code')] ``` Which suggests two normal foreign keys, not one compound foreign key.",140912432,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/594/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1071071397,I_kwDODFdgUs4_10Cl,69,View that combines issues and issue comments,9599,open,0,,,1,2021-12-04T00:34:33Z,2021-12-04T00:34:52Z,,MEMBER,,I want to see a reverse chronologically ordered interface onto both issues and comments - essentially a unified log of comments and issues opened across one or multiple projects.,207052882,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/69/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1570375808,I_kwDODFdgUs5dmgiA,79,Deploy demo job is failing due to rate limit,9599,open,0,,,2,2023-02-03T20:05:01Z,2023-12-08T14:50:15Z,,MEMBER,,https://github.com/dogsheep/github-to-sqlite/actions/runs/4080058087/jobs/7032116511,207052882,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/79/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1616429236,I_kwDOJHON9s5gWMC0,4,Support incremental updates,9599,open,0,,,2,2023-03-09T05:14:00Z,2023-03-09T18:20:56Z,,MEMBER,,"Running this script can take several hours against a large notes database. Would be neat if it could run against just the notes that have been modified since it last ran. Could pull the max `updated` date and then keep on looping until it finds one modified before then. Problem is I don't actually know what order it iterates over the notes in.",611552758,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/4/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1616440856,I_kwDOJHON9s5gWO4Y,5,Configure full text search,9599,open,0,,,0,2023-03-09T05:20:46Z,2023-03-09T05:20:46Z,,MEMBER,,"FTS would be useful. Maybe even extract the plain text from the notes to make that index easier to create, rather than creating it against the HTML. 
Can use the `plaintext` property for that.",611552758,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/5/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1617602868,I_kwDOJHON9s5gaqk0,6,Character encoding problem,9599,open,0,,,2,2023-03-09T16:44:34Z,2023-04-14T15:22:09Z,,MEMBER,,"I ran against a recent note with this in it: > Or just ""Actions ⚙️ "" And got back: > `Actions ‚öôÔ∏è` Pasting that into https://ftfy.vercel.app/?s=Actions+%E2%80%9A%C3%B6%C3%B4%C3%94%E2%88%8F%C3%A8+ gives this: ```python s = 'Actions â\x80\x9aöôÃ\x94â\x88\x8fè' s = s.encode('latin-1') s = s.decode('utf-8') s = s.encode('macroman') s = s.decode('utf-8') print(s) ``` ",611552758,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/6/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1617938730,I_kwDOJHON9s5gb8kq,9,"Default to just storing plaintext, store HTML if `--html` is passed",9599,open,0,,,0,2023-03-09T20:19:06Z,2023-03-09T20:19:06Z,,MEMBER,,"The full `body` version of the notes can get HUGE, due to embedded images. It turns out for my own purposes I'm usually happy with just the `plaintext` version. I'm tempted to say you don't get HTML unless you pass a `--html` option.",611552758,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/9/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1618130434,I_kwDOJHON9s5gcrYC,11,Implement a SQL view to make it easier to query files in a nested folder,9599,open,0,,,3,2023-03-09T23:19:28Z,2023-03-09T23:24:01Z,,MEMBER,,"Working with nested data in SQL is tricky, can I make it easier with a view or canned query?",611552758,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/11/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 464987783,MDExOlB1bGxSZXF1ZXN0Mjk1MTI3MjEz,546,Facet by delimiter,9599,open,0,,,2,2019-07-07T20:06:05Z,2019-11-18T23:46:01Z,,OWNER,simonw/datasette/pulls/546,Refs #510,107914493,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/546/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 473288428,MDExOlB1bGxSZXF1ZXN0MzAxNDgzNjEz,564,First proof-of-concept of Datasette Library,9599,open,0,,,1,2019-07-26T10:22:26Z,2023-02-07T15:14:11Z,,OWNER,simonw/datasette/pulls/564,"Refs #417. 
Run it like this: datasette -d ~/Library Uses a new plugin hook - available_databases() ",107914493,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/564/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1, 501773982,MDExOlB1bGxSZXF1ZXN0MzIzOTgzNzMy,579,New connection pooling,9599,open,0,,,1,2019-10-02T23:22:19Z,2019-11-15T22:57:21Z,,OWNER,simonw/datasette/pulls/579,See #569,107914493,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/579/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 565064079,MDExOlB1bGxSZXF1ZXN0Mzc1MTgwODMy,672,--dirs option for scanning directories for SQLite databases,9599,open,0,,,15,2020-02-14T02:25:52Z,2020-03-27T01:03:53Z,,OWNER,simonw/datasette/pulls/672,Refs #417.,107914493,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/672/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 793002853,MDExOlB1bGxSZXF1ZXN0NTYwNzYwMTQ1,1204,WIP: Plugin includes,9599,open,0,,,3,2021-01-25T03:59:06Z,2021-12-17T07:10:49Z,,OWNER,simonw/datasette/pulls/1204,"Refs #1191 Next steps: - [ ] Get comfortable that this pattern is the right way to go - [ ] Implement it for all of the other pages, not just the table page - [ ] Add a new set of plugin tests that exercise ALL of these new hook locations - [ ] Document, then ship",107914493,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1204/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1,