{"id": 1054246919, "node_id": "I_kwDOBm6k_c4-1ogH", "number": 1511, "title": "Review plugin hooks for Datasette 1.0", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": {"value": 3268330, "label": "Datasette 1.0"}, "comments": 1, "created_at": "2021-11-15T23:26:05Z", "updated_at": "2021-11-16T01:20:14Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "I need to perform a detailed review of the plugin interface - especially the plugin hooks like [register_facet_classes()](https://docs.datasette.io/en/stable/plugin_hooks.html#register-facet-classes) which I don't yet have complete confidence in.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1511/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1050163432, "node_id": "I_kwDOBm6k_c4-mDjo", "number": 1503, "title": "`?_nocol=` removes that column from the filter interface", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2021-11-10T18:22:50Z", "updated_at": "2021-11-14T05:08:27Z", "closed_at": "2021-11-14T04:53:07Z", "author_association": "OWNER", "pull_request": null, "body": "e.g. on https://latest.datasette.io/fixtures/sortable?_nocol=sortable\r\n\r\n\"fixtures__sortable__201_rows\"\r\n\r\nThis causes weird behaviour when you e.g. facet by a hidden column, since selecting facets and then re-submitting the form will clear the selected filter.\r\n\r\n![nocol-bug](https://user-images.githubusercontent.com/9599/141171135-aded71d1-a4cb-4b7f-a4ea-26828fa98906.gif)\r\n", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1503/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 1052247023, "node_id": "I_kwDOBm6k_c4-uAPv", "number": 1505, "title": "Datasette should have an option to output CSV with semicolons", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2021-11-12T18:02:21Z", "updated_at": "2021-11-16T11:40:52Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": null, "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1505/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1052826038, "node_id": "I_kwDOBm6k_c4-wNm2", "number": 1506, "title": "Columns beginning with an underscore do not facet correctly", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2021-11-14T02:20:32Z", "updated_at": "2021-11-14T04:45:21Z", "closed_at": "2021-11-14T04:45:21Z", 
"author_association": "OWNER", "pull_request": null, "body": "Datasette treats columns that start with an underscore as querystring parameters it should ignore!\r\n\r\n\"bchydro__item_versions__99_918_rows\"\r\n\r\nDiscovered in https://github.com/simonw/git-history/issues/14#issuecomment-968192464", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1506/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 1028115674, "node_id": "I_kwDOBm6k_c49R8za", "number": 1493, "title": "`--get '/:memory:.json?sql=select+3*5'` error with datasette 0.59", "user": {"value": 1580956, "label": "chenrui333"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2021-10-16T18:22:22Z", "updated_at": "2021-10-19T04:39:11Z", "closed_at": "2021-10-19T04:39:11Z", "author_association": "NONE", "pull_request": null, "body": "\ud83d\udc4b trying to upgrade the formula to use the latest release, but runs into some regression test issue with `--get` command.\r\n\r\nMy QQ is does this `datasette --get '/:memory:.json?sql=select+3*5'` supposed to return 15? Thanks!\r\n\r\nrelates to https://github.com/Homebrew/homebrew-core/pull/87369", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1493/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 1033864602, "node_id": "I_kwDOBm6k_c49n4Wa", "number": 1496, "title": "Named parameters docs should include an example of a cast", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2021-10-22T18:56:04Z", "updated_at": "2021-10-22T19:38:23Z", "closed_at": "2021-10-22T19:34:27Z", "author_association": "OWNER", "pull_request": null, "body": "https://docs.datasette.io/en/stable/sql_queries.html#named-parameters\r\n\r\nIt's not obvious that the values from parameters are always SQLite strings, which means that you can't do e.g. integer comparisons on them without casting them first. The documentation here should include an example of this.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1496/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 1072106103, "node_id": "I_kwDOBm6k_c4_5wp3", "number": 1542, "title": "feature request: order and dependency of plugins (that use js)", "user": {"value": 33631, "label": "fs111"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2021-12-06T12:40:45Z", "updated_at": "2021-12-15T17:47:08Z", "closed_at": null, "author_association": "NONE", "pull_request": null, "body": "I have been playing with datasette for the last couple of weeks and it is great! 
I am a big fan of `datasette-cluster-map` and wanted to enhance it a bit with a what I would call a sub-plugin. I basically want to add more controls to the map that cluster map provides. I have been looking into its code and how the plugin management works, but it seems what I am trying to do is not doable without hacks in js.\r\n\r\nBasically what would like to have is a way to say load my plugin after the plugins I depend on have been loaded and rendered. There seems to be no prior art where plugins have these dependencies on the js level so I was wondering if that could be added or if it exists how to do it.\r\n\r\nBasically what I want to do is:\r\n\r\nmy-awesome-plugin has a dependency on datastte-cluster-map. Whenever datasette cluster map has finished rendering on page load, call my plugin, but no earlier. To make that work datasette probably needs some total order in which way plugins are loaded intialized.\r\n\r\nSince I am new to datastte, I may be missing something obvious, so please let me know if the above makes no sense.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1542/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1058803238, "node_id": "I_kwDOBm6k_c4_HA4m", "number": 1520, "title": "Pattern for avoiding accidental URL over-rides", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2021-11-19T18:28:05Z", "updated_at": "2021-11-19T18:29:26Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "Following #1517 I'm experimenting with a plugin that does this:\r\n```python\r\n@hookimpl\r\ndef register_routes():\r\n return [\r\n (r\"/(?P[^/]+)/(?P[^/]+?)$\", Table().view),\r\n ]\r\n```\r\nThis is supposed to replace the default table page with new code... but there's a problem: `/-/versions` on that instance now returns 404 `Database '-' does not exist`!\r\n\r\nNeed to figure out a pattern to avoid that happening. 
Plugins get to add their routes before Datasette's default routes, which is why this is happening here.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1520/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1069881276, "node_id": "I_kwDOBm6k_c4_xRe8", "number": 1541, "title": "Different default layout for row page", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2021-12-02T18:56:36Z", "updated_at": "2021-12-02T18:56:54Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "The row page displays as a table even though it only has one table row.\r\n\r\nmaybe default to the same display as the narrow page version, even for wide pages?", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1541/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1994857251, "node_id": "I_kwDOBm6k_c525xsj", "number": 2208, "title": "No suggested facets when a column named 'value' is included", "user": {"value": 198537, "label": "rgieseke"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2023-11-15T14:11:17Z", "updated_at": "2023-11-15T14:18:59Z", "closed_at": null, "author_association": "CONTRIBUTOR", "pull_request": null, "body": "When a column named 'value' is included there are no suggested facets is shown as the query uses an alias of 'value'.\r\n\r\nhttps://github.com/simonw/datasette/blob/452a587e236ef642cbc6ae345b58767ea8420cb5/datasette/facets.py#L168-L174\r\n\r\nCurrently the following is shown (from https://latest.datasette.io/fixtures/facetable)\r\n\r\n![image](https://github.com/simonw/datasette/assets/198537/a919509a-ea88-461b-b25b-8b776720c7c5)\r\n\r\nWhen I add a column named 'value' only the JSON facets are processed.\r\n\r\n![image](https://github.com/simonw/datasette/assets/198537/092bd0b3-4c20-434e-88f8-47e2b8994a1d)\r\n\r\nI think that not using aliases could be a solution (except if someone wants to use a column named `count(*)` though this seems to be unlikely). I'll open a PR with that.\r\n\r\nThere is also a TODO with a similar question in the same file. 
I have not looked into that yet.\r\n\r\nhttps://github.com/simonw/datasette/blob/452a587e236ef642cbc6ae345b58767ea8420cb5/datasette/facets.py#L512", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/2208/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 2028698018, "node_id": "I_kwDOBm6k_c5463mi", "number": 2213, "title": "feature request: gzip compression of database downloads", "user": {"value": 536941, "label": "fgregg"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2023-12-06T14:35:03Z", "updated_at": "2023-12-06T15:05:46Z", "closed_at": null, "author_association": "CONTRIBUTOR", "pull_request": null, "body": "At the bottom of database pages, datasette gives users the opportunity to download the underlying sqlite database. It would be great if that could be served gzip compressed. \r\n\r\nthis is similar to #1213, but for me, i don't need datasette to compress html and json because my CDN layer does it for me, however, cloudflare at least, will not compress a mimetype of \"application\"\r\n\r\n(see list of mimetype: https://developers.cloudflare.com/speed/optimization/content/brotli/content-compression/)", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/2213/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 2029908157, "node_id": "I_kwDOBm6k_c54_fC9", "number": 2214, "title": "CSV export fails for some `text` foreign key references", "user": {"value": 2874, "label": "precipice"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2023-12-07T05:04:34Z", "updated_at": "2023-12-07T07:36:34Z", "closed_at": null, "author_association": "NONE", "pull_request": null, "body": "I'm starting this issue without a clear reproduction in case someone else has seen this behavior, and to use the issue as a notebook for research. \r\n\r\nI'm using Datasette with the [SWITRS](https://iswitrs.chp.ca.gov/) data set, which is a California Highway Patrol collection of traffic incident data from the past decade or so. I receive data from them in CSV and want to work with it in Datasette, then export it to CSV for mapping in Felt.com.\r\n\r\nTheir data makes extensive use of codes for incident column data (`1` for `Monday` and so on), some of it integer codes and some of it letter/text codes. The text codes are sometimes blank or `-`. During import, I'm creating lookup tables for foreign key references to make the Datasette UI presentation of the data easier to read.\r\n\r\nIf I import the data and set up the integer foreign keys, everything works fine, but if I set up the text foreign keys, CSV export starts to fail. \r\n\r\nThe foreign key configuration is as follows:\r\n\r\n```\r\n# Some tables use integer ids, like sensible tables do. 
Let's import them first\r\n# since we favor them.\r\n\r\nfor TABLE in DAY_OF_WEEK CHP_SHIFT POPULATION SPECIAL_COND BEAT_TYPE COLLISION_SEVERITY\r\ndo\r\n\tsqlite-utils create-table records.db $TABLE id integer name text --pk=id\r\n\tsqlite-utils insert records.db $TABLE lookup-tables/$TABLE.csv --csv\r\n\tsqlite-utils add-foreign-key records.db collisions $TABLE $TABLE id\r\n\tsqlite-utils create-index records.db collisions $TABLE\r\ndone\r\n\r\n# *Other* tables use letter keys, like they were raised by WOLVES. Let's put them\r\n# at the end of the import queue.\r\n\r\nfor TABLE in WEATHER_1 WEATHER_2 LOCATION_TYPE RAMP_INTERSECTION SIDE_OF_HWY \\\r\nPRIMARY_COLL_FACTOR PCF_CODE_OF_VIOL PCF_VIOL_CATEGORY TYPE_OF_COLLISION MVIW \\\r\nPED_ACTION ROAD_SURFACE ROAD_COND_1 ROAD_COND_2 LIGHTING CONTROL_DEVICE \\\r\nSTWD_VEHTYPE_AT_FAULT CHP_VEHTYPE_AT_FAULT PRIMARY_RAMP SECONDARY_RAMP\r\ndo\r\n\tsqlite-utils create-table records.db $TABLE key text name text --pk=key\r\n\tsqlite-utils insert records.db $TABLE lookup-tables/$TABLE.csv --csv\r\n\tsqlite-utils add-foreign-key records.db collisions $TABLE $TABLE key\r\n\tsqlite-utils create-index records.db collisions $TABLE\r\ndone\r\n```\r\n\r\nYou can see the full code and import script here: https://github.com/radical-bike-lobby/switrs-db\r\n\r\nIf I run this code and then hit the CSV export link in the Datasette interface (the simple link or the \"advanced\" dialog), export fails after a small number of CSV rows are written. I am not seeing any detailed error messages but this appears in the logging output:\r\n\r\n```\r\nINFO: 127.0.0.1:57885 - \"GET /records/collisions.csv?_facet=PRIMARY_RD&PRIMARY_RD=ASHBY+AV&_labels=on&_size=max HTTP/1.1\" 200 OK\r\nCaught this error: \r\n\r\n```\r\n\r\n(No other output follows `error:` other than a blank line.)\r\n\r\nI've stared at the rows directly after the error occurs and can't yet see what is causing the problem. 
I'm going to set up a development environment and see if I get any more detailed error output, and then stare more at some problematic lines to see if I can get a simple reproduction.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/2214/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1089529555, "node_id": "I_kwDOBm6k_c5A8ObT", "number": 1581, "title": "when hashed urls are turned on, the _memory db has improperly long-lived cache expiry", "user": {"value": 536941, "label": "fgregg"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2021-12-28T00:05:48Z", "updated_at": "2022-03-24T04:08:18Z", "closed_at": "2022-03-24T04:08:18Z", "author_association": "CONTRIBUTOR", "pull_request": null, "body": "if hashed_urls are on, then a -000 suffix is added to the `_memory` database, and the cache settings are set just as if it was a normal hashed database.\r\n\r\nin particular, this header is set:\r\n\r\n`cache-control: max-age=31536000`\r\n\r\nthis is not appropriate because the `_memory-000` database isn't really hashed based on the contents of the databases (see #1561).\r\n\r\nEither the cache-control header should be changed, or the _memory db should have a hash suffix that does depend on the contents of the databases.\r\n\r\n", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1581/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 1076057610, "node_id": "I_kwDOBm6k_c5AI1YK", "number": 1546, "title": "validating the sql", "user": {"value": 50336793, "label": "jadsongmatos"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2021-12-09T21:35:57Z", "updated_at": "2021-12-18T02:05:17Z", "closed_at": "2021-12-18T02:05:16Z", "author_association": "NONE", "pull_request": null, "body": "Could someone tell me that part of the code is responsible for validating the sql that guarantees that only a table can be read", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1546/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 1081318247, "node_id": "I_kwDOBm6k_c5Ac5tn", "number": 1556, "title": "Show count of facet values always, not just for `?_facet_size=max`", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": {"value": 7571612, "label": "Datasette 0.60"}, "comments": 1, "created_at": "2021-12-15T17:49:01Z", "updated_at": "2022-01-13T22:26:07Z", "closed_at": "2021-12-15T17:58:06Z", "author_association": "OWNER", "pull_request": null, "body": "> You've caused me to rethink this feature - I no longer think there's value in only showing these numbers if 
`?_facet_size=max` as opposed to all of the time.\r\n\r\n_Originally posted by @simonw in https://github.com/simonw/datasette/issues/1423#issuecomment-995023410_", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1556/reactions\", \"total_count\": 1, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 1, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 1082564912, "node_id": "I_kwDOBm6k_c5AhqEw", "number": 1557, "title": "`?_nosuggest=1` parameter for disabling facet suggestions on table view", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": {"value": 7571612, "label": "Datasette 0.60"}, "comments": 1, "created_at": "2021-12-16T19:21:42Z", "updated_at": "2022-01-13T22:26:48Z", "closed_at": "2021-12-16T19:24:59Z", "author_association": "OWNER", "pull_request": null, "body": "Found I wanted this while I was debugging #625 just to clean up the debug traces, but it makes sense as a partner to `?_nofacet=1` and `?_nocount=1` from #1350 and #1353.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1557/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 1083581011, "node_id": "I_kwDOBm6k_c5AliJT", "number": 1564, "title": "_prepare_connection not called on write connections", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": {"value": 7571612, "label": "Datasette 0.60"}, "comments": 1, "created_at": "2021-12-17T20:06:47Z", "updated_at": "2022-01-20T21:29:43Z", "closed_at": "2021-12-18T01:58:44Z", "author_association": "OWNER", "pull_request": null, "body": "I was trying to initalize SpatiaLite in a write connection:\r\n```pycon\r\n>>> from datasette.app import Datasette\r\n>>> ds = Datasette(memory=True, files=[], sqlite_extensions=[\"spatialite\"])\r\n>>> db = ds.add_memory_database('geo')\r\n>>> await db.execute_write(\"select InitSpatialMetadata(1)\")\r\nUUID('3f143baa-4e3d-5842-a36f-4fa2f683b72f')\r\nno such function: InitSpatialMetadata\r\n```\r\nIt looks like the code that loads additional modules only works on read-only connections, not on write connections:\r\n\r\nhttps://github.com/simonw/datasette/blob/92a5280d2e75c39424a75ad6226fc74400ae984f/datasette/database.py#L146-L153\r\n\r\nCompared to:\r\n\r\nhttps://github.com/simonw/datasette/blob/92a5280d2e75c39424a75ad6226fc74400ae984f/datasette/database.py#L124-L132", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1564/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 1084185188, "node_id": "I_kwDOBm6k_c5An1pk", "number": 1573, "title": "Make trace() a documented internal API", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 1, 
"created_at": "2021-12-19T20:32:56Z", "updated_at": "2021-12-19T21:13:13Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "This should be documented so plugin authors can use it to add their own custom traces: https://github.com/simonw/datasette/blob/8f311d6c1d9f73f4ec643009767749c17b5ca5dd/datasette/tracer.py#L28-L52\r\n\r\nIncluding the new `kwargs` pattern I added in #1571: https://github.com/simonw/datasette/blob/f65817000fdf87ce8a0c23edc40784ebe33b5842/datasette/database.py#L128-L132", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1573/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1084257842, "node_id": "I_kwDOBm6k_c5AoHYy", "number": 1575, "title": "__call__() got an unexpected keyword argument 'specname'", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2021-12-20T01:24:04Z", "updated_at": "2021-12-20T01:48:03Z", "closed_at": "2021-12-20T01:47:57Z", "author_association": "OWNER", "pull_request": null, "body": "> I've installed the alpha version but get an error when starting up Datasette:\r\n\r\n```\r\nTraceback (most recent call last):\r\n File \"/Users/tim/.pyenv/versions/stock-exchange/bin/datasette\", line 5, in \r\n from datasette.cli import cli\r\n File \"/Users/tim/.pyenv/versions/3.8.5/envs/stock-exchange/lib/python3.8/site-packages/datasette/cli.py\", line 15, in \r\n from .app import Datasette, DEFAULT_SETTINGS, SETTINGS, SQLITE_LIMIT_ATTACHED, pm\r\n File \"/Users/tim/.pyenv/versions/3.8.5/envs/stock-exchange/lib/python3.8/site-packages/datasette/app.py\", line 31, in \r\n from .views.database import DatabaseDownload, DatabaseView\r\n File \"/Users/tim/.pyenv/versions/3.8.5/envs/stock-exchange/lib/python3.8/site-packages/datasette/views/database.py\", line 25, in \r\n from datasette.plugins import pm\r\n File \"/Users/tim/.pyenv/versions/3.8.5/envs/stock-exchange/lib/python3.8/site-packages/datasette/plugins.py\", line 29, in \r\n mod = importlib.import_module(plugin)\r\n File \"/Users/tim/.pyenv/versions/3.8.5/lib/python3.8/importlib/__init__.py\", line 127, in import_module\r\n return _bootstrap._gcd_import(name[level:], package, level)\r\n File \"/Users/tim/.pyenv/versions/3.8.5/envs/stock-exchange/lib/python3.8/site-packages/datasette/filters.py\", line 9, in \r\n @hookimpl(specname=\"filters_from_request\")\r\nTypeError: __call__() got an unexpected keyword argument 'specname'\r\n```\r\n\r\n_Originally posted by @wragge in https://github.com/simonw/datasette/issues/1547#issuecomment-997511968_", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1575/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 1104691662, "node_id": "I_kwDOBm6k_c5B2EHO", "number": 1600, "title": "plugins --all example should use cog", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, 
"created_at": "2022-01-15T11:47:49Z", "updated_at": "2022-01-20T05:06:21Z", "closed_at": "2022-01-20T05:04:16Z", "author_association": "OWNER", "pull_request": null, "body": "The example output for `datasette plugins --all`on this page has got out of date: https://docs.datasette.io/en/stable/plugins.html#seeing-what-plugins-are-installed", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1600/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 1090810196, "node_id": "I_kwDOBm6k_c5BBHFU", "number": 1583, "title": "consider adding deletion step of cloudbuild artifacts to gcloud publish", "user": {"value": 536941, "label": "fgregg"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2021-12-30T00:33:23Z", "updated_at": "2021-12-30T00:34:16Z", "closed_at": null, "author_association": "CONTRIBUTOR", "pull_request": null, "body": "right now, as part of the the publish process images and other artifacts are stored to gcloud's cloud storage before being deployed to cloudrun.\r\n\r\nafter successfully deploying, it would be nice if the the script deleted these artifacts. otherwise, if you have regularly scheduled build process, you can end up paying to store lots of out of date artifacts.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1583/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1091838742, "node_id": "I_kwDOBm6k_c5BFCMW", "number": 1585, "title": "Fire base caching for `publish cloudrun`", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2022-01-01T15:38:15Z", "updated_at": "2022-01-01T15:40:38Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "https://gist.github.com/steren/03d3e58c58c9a53fd49bb78f58541872 has a recipe for this, via https://twitter.com/steren/status/1477038411114446848\r\n\r\nCould this enable easier vanity URLs of the format `https://$project_id.web.app/`? 
How about CDN caching?", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1585/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1096536240, "node_id": "I_kwDOBm6k_c5BW9Cw", "number": 1586, "title": "run analyze on all databases as part of start up or publishing", "user": {"value": 536941, "label": "fgregg"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2022-01-07T17:52:34Z", "updated_at": "2022-02-02T07:13:37Z", "closed_at": null, "author_association": "CONTRIBUTOR", "pull_request": null, "body": "Running `analyze;` lets sqlite's query planner make *much* better use of any indices.\r\n\r\nIt might be nice if the analyze was run as part of the start up of \"serve\" or \"publish\".", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1586/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1100499619, "node_id": "I_kwDOBm6k_c5BmEqj", "number": 1592, "title": "Row pages should show links to foreign keys", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2022-01-12T15:50:20Z", "updated_at": "2022-01-12T15:52:17Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "Refs #1518 refactor.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1592/reactions\", \"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1102612922, "node_id": "I_kwDOBm6k_c5BuIm6", "number": 1597, "title": "\"datasette inspect\" has no help summary", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2022-01-14T00:02:16Z", "updated_at": "2022-01-14T00:07:36Z", "closed_at": "2022-01-14T00:07:36Z", "author_association": "OWNER", "pull_request": null, "body": "Made obvious by the new CLI reference page added in #1594. 
https://docs.datasette.io/en/latest/cli-reference.html#datasette-inspect-help\r\n```\r\nCommands:\r\n serve* Serve up specified SQLite database files with a web UI\r\n inspect\r\n install Install Python packages - e.g.\r\n```\r\n```\r\nUsage: datasette inspect [OPTIONS] [FILES]...\r\n\r\nOptions:\r\n --inspect-file TEXT\r\n --load-extension TEXT Path to a SQLite extension to load\r\n --help Show this message and exit.\r\n```", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1597/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 1102637351, "node_id": "I_kwDOBm6k_c5BuOkn", "number": 1598, "title": "Replace update-docs-help.py script with cog", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2022-01-14T00:33:27Z", "updated_at": "2022-01-14T00:47:57Z", "closed_at": "2022-01-14T00:47:57Z", "author_association": "OWNER", "pull_request": null, "body": "I introduced `cog` in #1594 - I can use this to replace the older `update-docs-help.py` mechanism:\r\n\r\nhttps://github.com/simonw/datasette/blob/76d66d5b2bf10249c0beaac0999b93ac8d757f48/tests/test_docs.py#L36-L53", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1598/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 1108300685, "node_id": "I_kwDOBm6k_c5CD1ON", "number": 1604, "title": "Option to assign a domain/subdomain using `datasette publish cloudrun`", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2022-01-19T16:21:17Z", "updated_at": "2022-01-19T16:23:54Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "Looks like this API should be able to do that: https://twitter.com/steren/status/1483835859191304192 - https://cloud.google.com/run/docs/reference/rest/v1/namespaces.domainmappings/create", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1604/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1113384383, "node_id": "I_kwDOBm6k_c5CXOW_", "number": 1611, "title": "Avoid ever running count(*) against SpatiaLite KNN table", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2022-01-25T03:32:54Z", "updated_at": "2022-02-02T06:45:47Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "Got this in a trace:\r\n\r\n\"image\"\r\n\r\nLooks like running `count(*)` against KNN took 83s! It ignored the time limit. 
And still only returned a count of 0.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1611/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1115435536, "node_id": "I_kwDOBm6k_c5CfDIQ", "number": 1614, "title": "Try again with SQLite codemirror support", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2022-01-26T20:05:20Z", "updated_at": "2022-12-23T21:27:10Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "I tried and failed to implement autocomplete a while ago. Relevant code:\r\n\r\nhttps://github.com/codemirror/legacy-modes/blob/8f36abca5f55024258cd23d9cfb0203d8d244f0d/mode/sql.js#L335\r\n\r\nSounds like upgrading to CodeMirror 6 ASAP would be worthwhile since it has better accessibility and touch screen support: https://codemirror.net/6/", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1614/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1149310456, "node_id": "I_kwDOBm6k_c5EgRX4", "number": 1641, "title": "Tweak mobile keyboard settings", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2022-02-24T13:47:10Z", "updated_at": "2022-02-24T13:49:26Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "https://developer.apple.com/library/archive/documentation/StringsTextFonts/Conceptual/TextAndWebiPhoneOS/KeyboardManagement/KeyboardManagement.html#//apple_ref/doc/uid/TP40009542-CH5-SW12\r\n\r\n`autocorrect=\"off\"` is worth experimenting with.\r\n\r\nTwitter: https://twitter.com/forestgregg/status/1496842959563726852", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1641/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1152072027, "node_id": "I_kwDOBm6k_c5Eqzlb", "number": 1642, "title": "Dependency issue with asgiref and uvicorn", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2022-02-26T18:00:35Z", "updated_at": "2022-03-05T01:11:27Z", "closed_at": "2022-03-05T01:11:17Z", "author_association": "OWNER", "pull_request": null, "body": "```\r\nERROR: After October 2020 you may experience errors when installing or updating packages. 
This is because pip will change the way that it resolves dependency conflicts.\r\n\r\nWe recommend you use --use-feature=2020-resolver to test your packages with the new resolver before it becomes the default.\r\n\r\ndatasette 0.60.2 requires asgiref<3.5.0,>=3.2.10, but you'll have asgiref 3.5.0 which is incompatible.\r\n```\r\nThat's after I forced an upgrade of `uvicorn` due to this warning:\r\n```\r\nERROR: After October 2020 you may experience errors when installing or updating packages. This is because pip will change the way that it resolves dependency conflicts.\r\n\r\nWe recommend you use --use-feature=2020-resolver to test your packages with the new resolver before it becomes the default.\r\n\r\nuvicorn 0.13.1 requires click==7.*, but you'll have click 8.0.4 which is incompatible.\r\n```", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1642/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 1169840669, "node_id": "I_kwDOBm6k_c5Fulod", "number": 1658, "title": "Revert main to version that passes tests", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": {"value": 3268330, "label": "Datasette 1.0"}, "comments": 1, "created_at": "2022-03-15T15:37:02Z", "updated_at": "2022-03-19T04:04:50Z", "closed_at": "2022-03-15T15:42:58Z", "author_association": "OWNER", "pull_request": null, "body": "> I've made a real mess of this. I'm going to revert Datasette`main` back to the last commit that passed the tests and try this again in a branch.\r\n\r\n_Originally posted by @simonw in https://github.com/simonw/datasette/issues/1657#issuecomment-1068125636_", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1658/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 1170497629, "node_id": "I_kwDOBm6k_c5FxGBd", "number": 1662, "title": "[feature request] Publish to fully static website", "user": {"value": 32609395, "label": "contrun"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2022-03-16T03:32:28Z", "updated_at": "2022-03-19T00:42:23Z", "closed_at": "2022-03-19T00:42:23Z", "author_association": "NONE", "pull_request": null, "body": "It seems currently all datasette publish requires a real backend server which is able to query the database and send results back to the frontend. There are a few projects to on-demand download a portion of data from the database from a sqlite lite database url, and present it directly to the user. These methods leverages web assembly under the hood. I think datasette is a perfect use case for this technology. 
Below are a few examples of querying sqlite database from frontend directly.\r\n\r\n* [Using sqlite3 as a notekeeping document graph with automatic reference indexing](https://epilys.github.io/bibliothecula/notekeeping.html)\r\n* [Hosting SQLite databases on Github Pages - (or any static file hoster) - phiresky's blog](https://phiresky.github.io/blog/2021/hosting-sqlite-databases-on-github-pages/)\r\n* [Static torrent website with peer-to-peer queries over BitTorrent on 2M records](https://boredcaveman.xyz/post/0x2_static-torrent-website-p2p-queries.html)", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1662/reactions\", \"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 1190828163, "node_id": "I_kwDOBm6k_c5G-piD", "number": 1698, "title": "Add a warning about bots and Cloud Run", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2022-04-03T05:57:17Z", "updated_at": "2022-04-03T06:10:24Z", "closed_at": "2022-04-03T06:10:24Z", "author_association": "OWNER", "pull_request": null, "body": "Recommend the https://github.com/simonw/datasette-block-robots plugin if you are going to run a large database in Cloud Run (one with a lot of rows).", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1698/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 1189113609, "node_id": "I_kwDOBm6k_c5G4G8J", "number": 1697, "title": "`Request.fake(..., url_vars={})`", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": {"value": 3268330, "label": "Datasette 1.0"}, "comments": 1, "created_at": "2022-04-01T01:48:40Z", "updated_at": "2022-04-01T02:02:18Z", "closed_at": "2022-04-01T02:02:10Z", "author_association": "OWNER", "pull_request": null, "body": "I just created an alternative `.fake()` method because I wanted to fake the `url_vars` captured in the route as well:\r\n```python\r\nfrom datasette.utils.asgi import Request\r\nclass Request(Request):\r\n\r\n @classmethod\r\n def fake(cls, path_with_query_string, method=\"GET\", scheme=\"http\", url_vars=None):\r\n \"\"\"Useful for constructing Request objects for tests\"\"\"\r\n path, _, query_string = path_with_query_string.partition(\"?\")\r\n scope = {\r\n \"http_version\": \"1.1\",\r\n \"method\": method,\r\n \"path\": path,\r\n \"raw_path\": path_with_query_string.encode(\"latin-1\"),\r\n \"query_string\": query_string.encode(\"latin-1\"),\r\n \"scheme\": scheme,\r\n \"type\": \"http\",\r\n }\r\n if url_vars:\r\n scope[\"url_route\"] = {\r\n \"kwargs\": url_vars\r\n }\r\n return cls(scope, None)\r\n```", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1697/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, 
\"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 1174697144, "node_id": "I_kwDOBm6k_c5GBHS4", "number": 1672, "title": "Refactor CSV handling code out of DataView", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": {"value": 3268330, "label": "Datasette 1.0"}, "comments": 1, "created_at": "2022-03-20T21:47:00Z", "updated_at": "2022-03-20T21:52:39Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "> I think the way to get rid of most of the remaining complexity in `DataView` is to refactor how CSV stuff works - pulling it in line with other export factors and extracting the streaming mechanism. Opening a fresh issue for that.\r\n\r\n_Originally posted by @simonw in https://github.com/simonw/datasette/issues/1660#issuecomment-1073355032_", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1672/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1174708375, "node_id": "I_kwDOBm6k_c5GBKCX", "number": 1673, "title": "Streaming CSV spends a lot of time in `table_column_details`", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2022-03-20T22:25:28Z", "updated_at": "2022-03-20T22:34:06Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "At least I think it does. I tried running `py-spy top -p $PID` against a Datasette process that was trying to do:\r\n\r\n datasette covid.db --get '/covid/ny_times_us_counties.csv?_size=10&_stream=on'\r\n\r\nWhile investigating:\r\n- #1355\r\n\r\nAnd spotted this:\r\n```\r\ndatasette covid.db --get /covid/ny_times_us_counties.csv?_size=10&_stream=on' (python v3.10.2)\r\nTotal Samples 5800\r\nGIL: 71.00%, Active: 98.00%, Threads: 4\r\n\r\n %Own %Total OwnTime TotalTime Function (filename:line) \r\n 8.00% 8.00% 4.32s 4.38s sql_operation_in_thread (datasette/database.py:212)\r\n 5.00% 5.00% 3.77s 3.93s table_column_details (datasette/utils/__init__.py:614)\r\n 6.00% 6.00% 3.72s 3.72s _worker (concurrent/futures/thread.py:81)\r\n 7.00% 7.00% 2.98s 2.98s _read_from_self (asyncio/selector_events.py:120)\r\n 5.00% 6.00% 2.35s 2.49s detect_fts (datasette/utils/__init__.py:571)\r\n 4.00% 4.00% 1.34s 1.34s _write_to_self (asyncio/selector_events.py:140)\r\n```\r\nRelevant code: https://github.com/simonw/datasette/blob/798f075ef9b98819fdb564f9f79c78975a0f71e8/datasette/utils/__init__.py#L609-L625\r\n", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1673/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1174717287, "node_id": "I_kwDOBm6k_c5GBMNn", "number": 1674, "title": "Tweak design of /.json", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": {"value": 3268330, "label": "Datasette 1.0"}, "comments": 1, "created_at": "2022-03-20T22:58:01Z", "updated_at": "2022-03-20T22:58:40Z", "closed_at": 
null, "author_association": "OWNER", "pull_request": null, "body": "https://latest.datasette.io/.json\r\n\r\nCurrently:\r\n```json\r\n{\r\n \"_memory\": {\r\n \"name\": \"_memory\",\r\n \"hash\": null,\r\n \"color\": \"a6c7b9\",\r\n \"path\": \"/_memory\",\r\n \"tables_and_views_truncated\": [],\r\n \"tables_and_views_more\": false,\r\n \"tables_count\": 0,\r\n \"table_rows_sum\": 0,\r\n \"show_table_row_counts\": false,\r\n \"hidden_table_rows_sum\": 0,\r\n \"hidden_tables_count\": 0,\r\n \"views_count\": 0,\r\n \"private\": false\r\n },\r\n \"fixtures\": {\r\n \"name\": \"fixtures\",\r\n \"hash\": \"645005884646eb941c89997fbd1c0dd6be517cb1b493df9816ae497c0c5afbaa\",\r\n \"color\": \"645005\",\r\n \"path\": \"/fixtures\",\r\n \"tables_and_views_truncated\": [\r\n {\r\n \"name\": \"compound_three_primary_keys\",\r\n \"columns\": [\r\n \"pk1\",\r\n \"pk2\",\r\n \"pk3\",\r\n \"content\"\r\n ],\r\n \"primary_keys\": [\r\n \"pk1\",\r\n \"pk2\",\r\n \"pk3\"\r\n ],\r\n \"count\": 1001,\r\n \"hidden\": false,\r\n \"fts_table\": null,\r\n \"num_relationships_for_sorting\": 0,\r\n \"private\": false\r\n },\r\n```\r\nAs-of this issue the `\"path\"` key is confusing, it doesn't match what https://latest.datasette.io/-/databases returns:\r\n\r\n- #1668", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1674/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1175694248, "node_id": "I_kwDOBm6k_c5GE6uo", "number": 1677, "title": "Remove `check_permission()` from `BaseView`", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": {"value": 3268330, "label": "Datasette 1.0"}, "comments": 1, "created_at": "2022-03-21T17:18:18Z", "updated_at": "2022-03-21T18:45:04Z", "closed_at": "2022-03-21T18:45:03Z", "author_association": "OWNER", "pull_request": null, "body": "Follow-on from:\r\n- #1675\r\n\r\nRefs:\r\n- #1660", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1677/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 1175715988, "node_id": "I_kwDOBm6k_c5GFACU", "number": 1678, "title": "Make `check_visibility()` a documented API", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": {"value": 3268330, "label": "Datasette 1.0"}, "comments": 1, "created_at": "2022-03-21T17:30:34Z", "updated_at": "2022-03-21T19:04:03Z", "closed_at": "2022-03-21T19:01:46Z", "author_association": "OWNER", "pull_request": null, "body": "Spotted this while working on:\r\n- #1677\r\n\r\nhttps://github.com/simonw/datasette/blob/e627510b760198ccedba9e5af47a771e847785c9/datasette/utils/__init__.py#L1005-L1021", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1678/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, 
\"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 1178521513, "node_id": "I_kwDOBm6k_c5GPs-p", "number": 1682, "title": "SQL queries against databases with different routes are broken", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2022-03-23T18:42:57Z", "updated_at": "2022-03-23T18:48:16Z", "closed_at": "2022-03-23T18:48:16Z", "author_association": "OWNER", "pull_request": null, "body": "500 error on https://datasette-hashed-urls-preview.vercel.app/fixtures-09f8f95?sql=select+*+from+facetable\r\n\r\nHere's the trace:\r\n```\r\n File \"/Users/simon/.local/share/virtualenvs/datasette-hashed-urls-ssI2fO50/lib/python3.10/site-packages/datasette/views/database.py\", line 54, in data\r\n return await QueryView(self.ds).data(\r\n File \"/Users/simon/.local/share/virtualenvs/datasette-hashed-urls-ssI2fO50/lib/python3.10/site-packages/datasette/views/database.py\", line 232, in data\r\n self.ds.get_database(database), sql\r\n File \"/Users/simon/.local/share/virtualenvs/datasette-hashed-urls-ssI2fO50/lib/python3.10/site-packages/datasette/app.py\", line 401, in get_database\r\n return self.databases[name]\r\nKeyError: 'fixtures-aa7318b'\r\n```\r\nIt looks like this is a Datasette bug, which is frustrating because I just shipped Datasette 0.61 five minutes ago!\r\n\r\n_Originally posted by @simonw in https://github.com/simonw/datasette-hashed-urls/issues/13#issuecomment-1076693667_", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1682/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 1179998071, "node_id": "I_kwDOBm6k_c5GVVd3", "number": 1684, "title": "Mechanism for disabling faceting on large tables only", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2022-03-24T20:06:11Z", "updated_at": "2022-03-24T20:13:19Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "Forest turned off faceting on https://labordata.bunkum.us/ because it was causing performance problems on some of the huge tables - but it would be nice if it could still be an option on smaller tables such as https://labordata.bunkum.us/voluntary_recognitions-4421085/voluntary_recognitions\r\n\r\nOne option: a new setting that automatically disables faceting (and facet suggestion) for tables that have either more than X rows or that are so big that the count could not be completed within the time limit.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1684/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1182065616, "node_id": "I_kwDOBm6k_c5GdOPQ", "number": 1689, "title": "datasette.add_message() documentation is incorrect", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2022-03-26T20:49:42Z", 
"updated_at": "2022-03-26T21:35:57Z", "closed_at": "2022-03-26T20:51:21Z", "author_association": "OWNER", "pull_request": null, "body": "https://docs.datasette.io/en/0.61.1/internals.html#add-message-request-message-message-type-datasette-info says:\r\n\r\n`.add_message(request, message, message_type=datasette.INFO)`\r\n\r\nBut in the code it's:\r\n\r\nhttps://github.com/simonw/datasette/blob/6b99e4a66ba0ed8fca8ee41ceb7206928b60d5d1/datasette/app.py#L582", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1689/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 1197926598, "node_id": "I_kwDOBm6k_c5HZujG", "number": 1705, "title": "How to upgrade your plugin for 1.0 documentation", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": {"value": 8755003, "label": "Datasette 1.0a-next"}, "comments": 1, "created_at": "2022-04-08T23:16:47Z", "updated_at": "2022-12-13T05:29:05Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "Among other things, needed by:\r\n- #1704", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1705/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1200224939, "node_id": "I_kwDOBm6k_c5Hifqr", "number": 1707, "title": "[feature] expanded detail page", "user": {"value": 536941, "label": "fgregg"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2022-04-11T16:29:17Z", "updated_at": "2022-04-11T16:33:00Z", "closed_at": null, "author_association": "CONTRIBUTOR", "pull_request": null, "body": "Right now, if click on the detail page for a row you get the info for the row and links to related tables:\r\n![Screenshot 2022-04-11 at 12-27-26 lm20 filing](https://user-images.githubusercontent.com/536941/162786802-90ac1a71-4624-47c4-ae55-b783f4f6c92d.png)\r\n\r\nIt would be very cool if there was an option to expand the rows of the related tables from within this detail view.\r\n\r\nIf you had that then datasette could fulfill a pretty common use case where you want to search for an entity and get a consolidate detail view about what you know about that entity.\r\n\r\n", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1707/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1200649502, "node_id": "I_kwDOBm6k_c5HkHUe", "number": 1709, "title": "Redesigned JSON API with ?_extra= parameters", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": {"value": 8755003, "label": "Datasette 1.0a-next"}, "comments": 1, "created_at": "2022-04-11T22:57:49Z", "updated_at": "2022-12-13T05:29:06Z", "closed_at": null, 
"author_association": "OWNER", "pull_request": null, "body": "This will be the single biggest breaking change for the 1.0 release.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1709/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1200649889, "node_id": "I_kwDOBm6k_c5HkHah", "number": 1710, "title": "Guide for plugin authors to upgrade their plugins for 1.0", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2022-04-11T22:58:25Z", "updated_at": "2022-04-11T23:04:01Z", "closed_at": "2022-04-11T23:03:25Z", "author_association": "OWNER", "pull_request": null, "body": "I'll also encourage testing against both Datasette 0.x and Datasette 1.0 using a GitHub Actions matrix.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1710/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 1200650491, "node_id": "I_kwDOBm6k_c5HkHj7", "number": 1711, "title": "Template context powered entirely by the JSON API format", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": {"value": 8755003, "label": "Datasette 1.0a-next"}, "comments": 1, "created_at": "2022-04-11T22:59:27Z", "updated_at": "2022-12-13T05:29:06Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "Datasette 1.0 will have a stable template context. 
I'm going to achieve this by refactoring the templates to work only with keys returned by the API (or some of its extras) - then the API documentation will double up as template documentation.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1711/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1223263540, "node_id": "I_kwDOBm6k_c5I6YU0", "number": 1735, "title": "Datasette setting to disable threading (for Pyodide)", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2022-05-02T19:31:08Z", "updated_at": "2022-05-02T23:25:49Z", "closed_at": "2022-05-02T20:13:52Z", "author_association": "OWNER", "pull_request": null, "body": "> I'm going to add a Datasette setting to disable threading entirely, designed for usage in this particular case.\r\n>\r\n> I thought about adding a new setting, then I noticed this:\r\n>\r\n> datasette mydatabase.db --setting num_sql_threads 10\r\n>\r\n> I'm going to let users set that to `0` to disable threaded execution of SQL queries.\r\n\r\n_Originally posted by @simonw in https://github.com/simonw/datasette/issues/1733#issuecomment-1115278325_", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1735/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 1212838949, "node_id": "I_kwDOBm6k_c5ISnQl", "number": 1716, "title": "Configure git blame to ignore Black commit", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2022-04-22T21:56:37Z", "updated_at": "2022-04-22T22:02:19Z", "closed_at": "2022-04-22T22:02:19Z", "author_association": "OWNER", "pull_request": null, "body": "GitHub can support this in blame views now too:\r\n\r\nhttps://docs.github.com/en/repositories/working-with-files/using-files/viewing-a-file#ignore-commits-in-the-blame-view", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1716/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 1239080102, "node_id": "I_kwDOBm6k_c5J2tym", "number": 1745, "title": "Documentation on running cog", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2022-05-17T19:41:06Z", "updated_at": "2022-05-17T19:45:51Z", "closed_at": "2022-05-17T19:43:45Z", "author_association": "OWNER", "pull_request": null, "body": "Noticed that `cog -r docs/*.rst` isn't documented in https://docs.datasette.io/en/latest/contributing.html#editing-and-building-the-documentation", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, 
"performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1745/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 1243512344, "node_id": "I_kwDOBm6k_c5KHn4Y", "number": 1747, "title": "Add tutorials to the getting started guide", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2022-05-20T19:01:52Z", "updated_at": "2022-05-20T19:12:30Z", "closed_at": "2022-05-20T19:05:34Z", "author_association": "OWNER", "pull_request": null, "body": "On https://docs.datasette.io/en/stable/getting_started.html", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1747/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 1251710928, "node_id": "I_kwDOBm6k_c5Km5fQ", "number": 1751, "title": "Add scrollbars to table presentation in default layout", "user": {"value": 408765, "label": "knutwannheden"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2022-05-28T19:44:57Z", "updated_at": "2022-05-28T19:52:17Z", "closed_at": "2022-05-28T19:52:17Z", "author_association": "NONE", "pull_request": null, "body": "(As you will be able to tell from the terminology I use, I am not a frontend guy, but I hope you will understand.)\r\n\r\nWhen a table is wide and needs horizontal scrolling to see the columns towards the end, the user needs to scroll horizontally. However, since the container for the HTML table (`div` with class `table-wrapper`) isn't limited by the window size, I first need to vertically scroll near to the bottom of the page in order to scroll horizontally. Then I can scroll back up again. This isn't very user friendly. Instead, I think it would make sense to constrain the table's size (when necessary), so that the vertical and horizontal scrollbars either always are visible or at least not far out of reach.\r\n\r\nI understand that I could provide my own template and / or CSS, but I think it would probably make sense to adjust the default in this regard.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1751/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 1292370469, "node_id": "I_kwDOBm6k_c5NCAIl", "number": 1765, "title": "Document plugins providing new plugin hook-", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2022-07-03T17:05:14Z", "updated_at": "2023-08-31T23:08:24Z", "closed_at": "2023-08-31T23:06:31Z", "author_association": "OWNER", "pull_request": null, "body": "I've used this pattern twice now: https://til.simonwillison.net/datasette/register-new-plugin-hooks - in `datasette-graphql` and `datasette-low-disk-space-hook`. 
I should describe the pattern on https://docs.datasette.io/en/stable/writing_plugins.html", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1765/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 1337541526, "node_id": "I_kwDOBm6k_c5PuUOW", "number": 1780, "title": "`facet_time_limit_ms` and `sql_time_limit_ms` overlap?", "user": {"value": 53165, "label": "davepeck"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2022-08-12T17:55:37Z", "updated_at": "2022-08-15T23:50:08Z", "closed_at": null, "author_association": "NONE", "pull_request": null, "body": "I needed more than the default 200ms to facet a specific column in a database I was working with, so I ran `datasette` with `--setting facet_time_limit_ms 30000` — definitely overkill! \r\n\r\nBut it still didn't work; it took a moment to realize I also needed to up my `sql_time_limit_ms` to something larger too.\r\n\r\nI'm happy to submit a PR that documents this behavior if it's helpful. Or, if there's a code change we'd like to make (like making sure `sql_time_limit_ms` is always set to the larger of itself and `facet_time_limit_ms`), happy to do that too.\r\n\r\nApologies if I missed this somewhere in the docs. And: thanks. I'm really enjoying the simple, effective tooling datasette gives me out of the box for exploring my databases!", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1780/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1338137350, "node_id": "I_kwDOBm6k_c5PwlsG", "number": 1781, "title": "Ensure Datasette Lite is promoted in docs and README", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": {"value": 8303187, "label": "Datasette 0.62"}, "comments": 1, "created_at": "2022-08-14T05:12:35Z", "updated_at": "2022-08-14T15:24:40Z", "closed_at": "2022-08-14T15:24:40Z", "author_association": "OWNER", "pull_request": null, "body": "As of 0.62 https://lite.datasette.io is a supported piece of the overall Datasette ecosystem.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1781/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 1347717749, "node_id": "I_kwDOBm6k_c5QVIp1", "number": 1791, "title": "Updating metadata.json on Datasette for MacOS", "user": {"value": 1780782, "label": "ment4list"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2022-08-23T10:41:16Z", "updated_at": "2022-08-23T13:29:51Z", "closed_at": null, "author_association": "NONE", "pull_request": null, "body": "I've installed Datasette for Mac as per [the 
documentation](https://docs.datasette.io/en/stable/installation.html#datasette-desktop-for-mac) and it's working great!\r\n\r\nHowever, I'm not sure how to go about adding something like \"[Canned Queries](https://docs.datasette.io/en/stable/sql_queries.html#canned-queries)\" or utilising other advanced features or settings by manipulating the `metadata.json` or `settings.json` files.\r\n\r\nI can view these files from the Datasette App from the top right \"burger\" menu but it only shows the contents of the file with no way to edit or change it.\r\n\r\nAm I missing something? Where can I update the `metadata.json` file using the MacOS App?\r\n\r\nPS: This is a fantastic tool! Thanks so much for all the effort and especially adding a bunch of different ways to get started quickly!", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1791/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1366915240, "node_id": "I_kwDOBm6k_c5ReXio", "number": 1807, "title": "Plugin ecosystem needs to avoid crashes due to no available databases", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2022-09-08T19:54:34Z", "updated_at": "2022-09-08T20:14:05Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "Opening this here to track the issue first reported in:\r\n- https://github.com/simonw/datasette-upload-dbs/issues/5\r\n\r\nPlugins that expect to be able to write to a database need to not crash in situations where no writable database is available.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1807/reactions\", \"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1377811868, "node_id": "I_kwDOBm6k_c5SH72c", "number": 1813, "title": "missing next and next_url in JSON responses from an instance deployed on Fly ", "user": {"value": 883348, "label": "adipasquale"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2022-09-19T11:32:34Z", "updated_at": "2022-09-19T11:34:45Z", "closed_at": "2022-09-19T11:34:45Z", "author_association": "CONTRIBUTOR", "pull_request": null, "body": "\ud83d\udc4b thank you for an incredibly useful project! \r\n\r\nI have noticed that my deployed instance on Fly does not include the `next` and `next_url` keys even for a truncated response :\r\n\r\n\"Screenshot\r\n \r\n\r\nThis is publically accessible here: `https://collectif-objets-datasette.fly.dev/collectif-objets.json?sql=select+*+from+mairies`\r\n\r\nHowever when I run the dataset server locally with the same data I get these next keys for the exact same query:\r\n\r\n\"Screenshot\r\n\r\nI am wondering if I've missed some config or something specific to deployments on Fly.io? 
\r\n\r\nI am running datasette v0.62, without any specific config : \r\n\r\n- locally `poetry run datasette data/collectif-objets.sqlite`\r\n- for the deploy : `poetry run datasette publish fly data/collectif-objets.sqlite`\r\n\r\nas visible in [the Makefile](https://github.com/adipasquale/collectif-objets-datasette/blob/main/Makefile). _The very limited codebase is public but the sqlite db is not versioned yet because it is too large._", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1813/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 1386734383, "node_id": "I_kwDOBm6k_c5Sp-Mv", "number": 1821, "title": "Release Datasette 0.63a0", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2022-09-26T21:15:27Z", "updated_at": "2022-09-26T22:06:39Z", "closed_at": "2022-09-26T22:06:39Z", "author_association": "OWNER", "pull_request": null, "body": "> - The [prepare_jinja2_environment(env, datasette)](https://docs.datasette.io/en/latest/plugin_hooks.html#plugin-hook-prepare-jinja2-environment) plugin hook now accepts an optional `datasette` argument. Hook implementations can also now return an `async` function which will be awaited automatically. ([#1809](https://github.com/simonw/datasette/issues/1809))\r\n> - `--load-extension` option now supports entrypoints. Thanks, Alex Garcia. ([#1789](https://github.com/simonw/datasette/pull/1789))\r\n> - New tutorial: [Cleaning data with sqlite-utils and Datasette](https://datasette.io/tutorials/clean-data).\r\n> - Facet size can now be set per-table with the new `facet_size` table metadata option. ([#1804](https://github.com/simonw/datasette/issues/1804))\r\n> - `truncate_cells_html` setting now also affects long URLs in columns. ([#1805](https://github.com/simonw/datasette/issues/1805))\r\n> - `Database(is_mutable=)` now defaults to `True`. ([#1808](https://github.com/simonw/datasette/issues/1808))\r\n> - Non-JavaScript textarea now increases height to fit the SQL query. ([#1786](https://github.com/simonw/datasette/issues/1786))\r\n> - More detailed command descriptions on the [CLI reference](https://docs.datasette.io/en/latest/cli-reference.html#cli-reference) page. ([#1787](https://github.com/simonw/datasette/issues/1787))\r\n> - Datasette no longer enforces upper bounds on its depenedencies. ([#1800](https://github.com/simonw/datasette/issues/1800))\r\n> - Facets are now displayed with better line-breaks in long values. Thanks, Daniel Rech. ([#1794](https://github.com/simonw/datasette/pull/1794))\r\n> - The `settings.json` file used in [Configuration directory mode](https://docs.datasette.io/en/latest/settings.html#config-dir) is now validated on startup. 
([#1816](https://github.com/simonw/datasette/issues/1816))", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1821/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 1387712501, "node_id": "I_kwDOBm6k_c5Sts_1", "number": 1824, "title": "Convert &_hide_sql=1 to #_hide_sql", "user": {"value": 562352, "label": "CharlesNepote"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2022-09-27T12:53:31Z", "updated_at": "2022-10-05T12:56:27Z", "closed_at": null, "author_association": "NONE", "pull_request": null, "body": "Hiding the SQL textarea with `&_hide_sql=1` enforces a page reload, which can take several seconds and use server resource (which is annoying for big database or complex queries).\r\n\r\nIt could probably be done with a few lines of Javascript (I'm going to see if I can do that).", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1824/reactions\", \"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1425682079, "node_id": "I_kwDOBm6k_c5U-i6f", "number": 1865, "title": "Stop syncing main to master", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2022-10-27T13:55:38Z", "updated_at": "2022-10-27T13:58:27Z", "closed_at": "2022-10-27T13:56:13Z", "author_association": "OWNER", "pull_request": null, "body": "I think it's been long enough now that I can drop the code that syncs the main branch to master.\r\n\r\nI originally added this for people who might be using `datasette publish ... 
--branch master` - which might only have been me anyway!", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1865/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 1423347412, "node_id": "I_kwDOBm6k_c5U1o7U", "number": 1857, "title": "Prevent API tokens from using /-/create-token to create more tokens", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": {"value": 8658075, "label": "Datasette 1.0a0"}, "comments": 1, "created_at": "2022-10-26T02:38:09Z", "updated_at": "2022-11-15T19:57:11Z", "closed_at": "2022-10-26T02:57:26Z", "author_association": "OWNER", "pull_request": null, "body": "> It strikes me that users should NOT be able to use a token to create additional tokens.\r\n>\r\n> The current design actually does allow that, since the `dstok_` Bearer token can be used to authenticate calls to the `/-/create-token` page.\r\n>\r\n> So I think I need a mechanism whereby that page can only allow access to users authenticated by cookie.\r\n> \r\n> Not obvious how to do that though, since Datasette's authentication actor system is designed to abstract that detail away!\r\n\r\n_Originally posted by @simonw in https://github.com/simonw/datasette/issues/1850#issuecomment-1291417100_", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1857/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 1426080014, "node_id": "I_kwDOBm6k_c5VAEEO", "number": 1867, "title": "/db/table/-/rename API (also allows atomic replace)", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": {"value": 8755003, "label": "Datasette 1.0a-next"}, "comments": 1, "created_at": "2022-10-27T18:13:23Z", "updated_at": "2023-01-09T15:34:12Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "> There's one catch with batched inserts: if your CLI tool fails half way through you could end up with a partially populated table - since a bunch of batches will have succeeded first.\r\n>\r\n> ...\r\n>\r\n> If people care about that kind of thing they could always push all of their inserts to a table called `_tablename` and then atomically rename that once they've uploaded all of the data (assuming I provide an atomic-rename-this-table mechanism).\r\n\r\n_Originally posted by @simonw in https://github.com/simonw/datasette/issues/1866#issuecomment-1293893789_\r\n ", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1867/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1432012302, "node_id": "I_kwDOBm6k_c5VWsYO", "number": 1877, "title": "Refactor and tidy up final write API code", "user": {"value": 9599, "label": 
"simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2022-11-01T20:00:11Z", "updated_at": "2022-11-29T19:44:16Z", "closed_at": "2022-11-29T19:44:07Z", "author_association": "OWNER", "pull_request": null, "body": "- `views/table.py` has got a bit too big - I think the write classes should be pulled out into a separate module.\r\n- [x] There's duplicate logic for deciding if the table and database exist and checking permissions", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1877/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 1447439985, "node_id": "I_kwDOBm6k_c5WRi5x", "number": 1888, "title": "API explorer should take immutability into account", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": {"value": 8658075, "label": "Datasette 1.0a0"}, "comments": 1, "created_at": "2022-11-14T06:00:14Z", "updated_at": "2022-11-15T19:59:10Z", "closed_at": "2022-11-14T06:04:48Z", "author_association": "OWNER", "pull_request": null, "body": "Refs:\r\n- #1871\r\n\r\nI noticed the API explorer doesn't show any links on https://latest-1-0-dev.datasette.io/-/api because the `fixtures` database is immutable.\r\n\r\nIt should still show read examples there.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1888/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 1466952626, "node_id": "I_kwDOBm6k_c5Xb-uy", "number": 1909, "title": "Option to sort facets alphabetically", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2022-11-28T19:18:14Z", "updated_at": "2022-11-28T19:19:26Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "Suggested here:\r\n- https://github.com/simonw/datasette/discussions/1908", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1909/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1468689139, "node_id": "I_kwDOBm6k_c5Ximrz", "number": 1914, "title": "Finalize design of JSON for Datasette 1.0", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": {"value": 8755003, "label": "Datasette 1.0a-next"}, "comments": 1, "created_at": "2022-11-29T20:59:10Z", "updated_at": "2022-12-13T06:15:54Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "Tracking issue.\r\n\r\n- [ ] #1709\r\n- [ ] #1729\r\n- [ ] #1875", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": 
\"https://api.github.com/repos/simonw/datasette/issues/1914/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1469015001, "node_id": "I_kwDOBm6k_c5Xj2PZ", "number": 1916, "title": "GET requests against POST endpoints should not 500 error", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": {"value": 7867486, "label": "Datasette 1.0a1"}, "comments": 1, "created_at": "2022-11-30T04:04:43Z", "updated_at": "2022-11-30T05:15:19Z", "closed_at": "2022-11-30T05:15:19Z", "author_association": "OWNER", "pull_request": null, "body": "![CF37BA4D-0677-4DDD-A339-EAF163BB63B7](https://user-images.githubusercontent.com/9599/204705025-6f88e9f7-757d-45e8-a89c-ab97e84781e8.png)\r\n", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1916/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 1469044738, "node_id": "I_kwDOBm6k_c5Xj9gC", "number": 1918, "title": "API explorer should list mutable databases first", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": {"value": 7867486, "label": "Datasette 1.0a1"}, "comments": 1, "created_at": "2022-11-30T04:53:33Z", "updated_at": "2022-11-30T05:22:07Z", "closed_at": "2022-11-30T05:07:56Z", "author_association": "OWNER", "pull_request": null, "body": "https://latest.datasette.io/-/api hides `ephemeral` down at the bottom, would be more interesting if it was at the top.\r\n\r\nRelated:\r\n- #1915 ", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1918/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 1469062686, "node_id": "I_kwDOBm6k_c5XkB4e", "number": 1919, "title": "Intermittent `test_delete_row` test failure ", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2022-11-30T05:18:46Z", "updated_at": "2022-11-30T05:20:56Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "https://github.com/simonw/datasette/actions/runs/3580503393/jobs/6022689591\r\n\r\n```\r\n delete_response = await ds_write.client.post(\r\n \"/data/{}/{}/-/delete\".format(table, delete_path),\r\n headers={\r\n \"Authorization\": \"***\".format(write_token(ds_write)),\r\n },\r\n )\r\n> assert delete_response.status_code == 200\r\nE assert 404 == 200\r\nE + where 404 = .status_code\r\n\r\n/home/runner/work/datasette/datasette/tests/test_api_write.py:396: AssertionError\r\n=========================== short test summary info ============================\r\nFAILED tests/test_api_write.py::test_delete_row[compound_pk_table-row_for_create2-pks2-article,k] - assert 404 == 200\r\n + where 404 = .status_code\r\n```\r\nThis passes most of the time, but very occasionally fails - in this case in Python 3.7\r\n\r\nIt seems to only fail for the 
`article,k` compound primary key test.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1919/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1471969984, "node_id": "I_kwDOBm6k_c5XvHrA", "number": 1926, "title": "Release notes for 1.0a1 (and release it)", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": {"value": 7867486, "label": "Datasette 1.0a1"}, "comments": 1, "created_at": "2022-12-01T21:18:12Z", "updated_at": "2022-12-01T22:06:13Z", "closed_at": "2022-12-01T22:06:12Z", "author_association": "OWNER", "pull_request": null, "body": "Mainly CORS support and a few small bug fixes.\r\n\r\nChanges: https://github.com/simonw/datasette/compare/1.0a0...99da46f7258225fc6fd8e94ddc20859ccccc4109", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1926/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 1490576818, "node_id": "I_kwDOBm6k_c5Y2GWy", "number": 1943, "title": "`/-/permissions` should list available permissions", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": {"value": 8755003, "label": "Datasette 1.0a-next"}, "comments": 1, "created_at": "2022-12-11T23:38:03Z", "updated_at": "2022-12-15T00:41:37Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "> Idea: a `/-/permissions` introspection endpoint for listing registered permissions\r\n\r\n_Originally posted by @simonw in https://github.com/simonw/datasette/issues/1939#issuecomment-1345691103_\r\n ", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1943/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1486036269, "node_id": "I_kwDOBm6k_c5Ykx0t", "number": 1941, "title": "Mechanism for supporting key rotation for DATASETTE_SECRET", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2022-12-09T05:24:53Z", "updated_at": "2022-12-09T05:25:20Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "Currently if you change `DATASETTE_SECRET` all existing signed tokens - both cookies and API tokens and potentially other things too - will instantly expire.\r\n\r\nAdding support for key rotation would allow keys to be rotated on a semi-regular basis without logging everyone out / invalidating every API token instantly.\r\n\r\nCan model this on how Django does it: https://github.com/django/django/commit/0dcd549bbe36c060f536ec270d34d9e7d4b8e6c7", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": 
"{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1941/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1487738738, "node_id": "I_kwDOBm6k_c5YrRdy", "number": 1942, "title": "Option for plugins to request that JSON be served on the page", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": {"value": 3268330, "label": "Datasette 1.0"}, "comments": 1, "created_at": "2022-12-10T01:08:53Z", "updated_at": "2022-12-10T01:11:30Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "Idea came from a conversation with @hydrosquall - what if a Datasette plugin could say \"I'd like the JSON for a page to be included in a variable on the HTML page\"?\r\n\r\n`datasette-cluster-map` already needs this - the first thing it does when the page loads is `fetch()` a JSON representation of that same data.\r\n\r\nThis idea fits with my overall goals to unify the JSON and HTML context too.\r\n\r\nRefs:\r\n- #1711", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1942/reactions\", \"total_count\": 1, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 1, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1493404423, "node_id": "I_kwDOBm6k_c5ZA4sH", "number": 1948, "title": "500 error on permission debug page when testing actors with _r", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2022-12-13T05:22:03Z", "updated_at": "2022-12-13T05:22:19Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "\"image\"\r\n\r\nThe 500 error is silent unless you are looking at the DevTools network pane.\r\n", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1948/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1495716243, "node_id": "I_kwDOBm6k_c5ZJtGT", "number": 1952, "title": "Improvements to /-/create-token restrictions interface", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": {"value": 8755003, "label": "Datasette 1.0a-next"}, "comments": 1, "created_at": "2022-12-14T05:22:39Z", "updated_at": "2022-12-14T05:23:13Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "> It would be neat not to show write permissions against immutable databases too - and not hard from a performance perspective since it doesn't involve hundreds more permission checks.\r\n>\r\n> That will need permissions to grow a flag for if they need a mutable database though, which is a bigger job.\r\n\r\n_Originally posted by @simonw in https://github.com/simonw/datasette/issues/1947#issuecomment-1350414402_\r\n\r\nAlso, DO show the `_memory` database there if Datasette was started in `--crossdb` mode.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, 
"performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1952/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1497577017, "node_id": "I_kwDOBm6k_c5ZQzY5", "number": 1957, "title": "Reconsider row value truncation on query page", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2022-12-14T23:49:47Z", "updated_at": "2022-12-14T23:50:50Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "Consider this example: https://ripgrep.datasette.io/repos?sql=select+json_group_array%28full_name%29+from+repos\r\n\r\n```sql\r\nselect json_group_array(full_name) from repos\r\n```\r\n\r\n![CleanShot 2022-12-14 at 15 48 32@2x](https://user-images.githubusercontent.com/9599/207739709-8177f683-f938-49a1-8225-42791fad88fe.png)\r\n\r\nMy intention here was to get a string of JSON I can copy and paste elsewhere - see: https://til.simonwillison.net/sqlite/compare-before-after-json\r\n\r\nThe truncation isn't helping here.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1957/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1500636982, "node_id": "I_kwDOBm6k_c5Zcec2", "number": 1962, "title": "Alternative, async-friendly pattern for `make_app_client()` and similar - fully retire `TestClient`", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2022-12-16T17:56:51Z", "updated_at": "2022-12-16T21:55:29Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "In this issue I replaced a whole bunch of places that used the non-async `app_client` fixture with an async `ds_client` fixture instead:\r\n- #1959\r\n\r\nBut I didn't get everything, and a lot of tests are still using the old `TestClient` mechanism as a result.\r\n\r\nThe main work here is replacing all of the `app_client_...` fixtures which use variants on the default client - and changing the tests that call `make_app_client()` to do something else instead.\r\n\r\nThis requires some careful thought. 
I need to come up with a really nice pattern for creating variants on the `ds_client` default fixture - and do so in a way that minimizes the number of open files, refs:\r\n\r\n- #1843", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1962/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1501778647, "node_id": "I_kwDOBm6k_c5Zg1LX", "number": 1964, "title": "Cog menu is not keyboard accessible (also no ARIA)", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2022-12-18T06:36:28Z", "updated_at": "2022-12-18T06:37:28Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "This menu here: https://latest.datasette.io/fixtures/attraction_characteristic\r\n\r\nYou can tab to it (see the outline) and hit space or enter to open it, but you can't then navigate the items in the open menu using the keyboard.\r\n\r\n![cog-menu](https://user-images.githubusercontent.com/9599/208284973-2a04cdab-ed95-4316-979c-67fe5f7787db.gif)\r\n", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1964/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1501900064, "node_id": "I_kwDOBm6k_c5ZhS0g", "number": 1966, "title": "Broken link to live demo in Getting started docs", "user": {"value": 7551922, "label": "lbellomo"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2022-12-18T13:17:00Z", "updated_at": "2022-12-31T19:15:19Z", "closed_at": "2022-12-31T19:15:10Z", "author_association": "NONE", "pull_request": null, "body": "The link in [Play with a live demo in Getting started](https://github.com/simonw/datasette/blob/main/docs/getting_started.rst#play-with-a-live-demo) to [https://fivethirtyeight.datasettes.com/fivethirtyeight](https://fivethirtyeight.datasettes.com/fivethirtyeight) is broken and the datasette is no longer working (maybe due to the end of the free tier).", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1966/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 1524983536, "node_id": "I_kwDOBm6k_c5a5Wbw", "number": 1981, "title": "Canned query field labels truncated", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2023-01-09T06:04:24Z", "updated_at": "2023-01-09T06:05:44Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "Eg here on mobile: 
https://timezones.datasette.io/timezones/by_point?longitude=-0.1406632&latitude=50.8246776\r\n\r\n![107A1894-D1DA-4158-9EA3-40C840DD10E3](https://user-images.githubusercontent.com/9599/211248895-c922ce61-95d3-47ca-9314-dcff7c86afab.jpeg)\r\n", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1981/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1528448642, "node_id": "I_kwDOBm6k_c5bGkaC", "number": 1985, "title": "Don't let Datasette(path) without a list cause weird errors", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2023-01-11T05:17:44Z", "updated_at": "2023-01-11T18:25:04Z", "closed_at": "2023-01-11T18:25:04Z", "author_association": "OWNER", "pull_request": null, "body": "I got a confusing `sqlite3.OperationalError: disk I/O error` error in my tests, it turned out it was because this:\r\n```python\r\nds = Datasette(path)\r\n```\r\nShould have been this:\r\n```python\r\nds = Datasette([path])\r\n```\r\n\r\n_Originally posted by @simonw in https://github.com/simonw/datasette-faiss/issues/1#issuecomment-1378252673_\r\n ", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1985/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 1536851861, "node_id": "I_kwDOBm6k_c5bmn-V", "number": 1994, "title": "Stuck on loading screen", "user": {"value": 10913053, "label": "jackhagley"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2023-01-17T18:33:49Z", "updated_at": "2023-01-23T08:21:08Z", "closed_at": null, "author_association": "NONE", "pull_request": null, "body": "Can\u2019t actually open it!\r\n\r\nDownloaded today from the releases tab\r\n\r\nRunning macOS13.1 \r\n\r\n```\r\nbin/python3.9 --version\r\nPython 3.9.6\r\nTook 83ms\r\nbin/python3.9 --version\r\nPython 3.9.6\r\nTook 113ms\r\nbin/pip install datasette>=0.59 datasette-app-support>=0.11.6 datasette-vega>=0.6.2 datasette-cluster-map>=0.17.1 datasette-pretty-json>=0.2.1 datasette-edit-schema>=0.4 datasette-configure-fts>=1.1 datasette-leaflet>=0.2.2 --disable-pip-version-check\r\nRequirement already satisfied: datasette>=0.59 in lib/python3.9/site-packages (0.63)\r\nRequirement already satisfied: datasette-app-support>=0.11.6 in lib/python3.9/site-packages (0.11.6)\r\nRequirement already satisfied: datasette-vega>=0.6.2 in lib/python3.9/site-packages (0.6.2)\r\nRequirement already satisfied: datasette-cluster-map>=0.17.1 in lib/python3.9/site-packages (0.17.2)\r\nRequirement already satisfied: datasette-pretty-json>=0.2.1 in lib/python3.9/site-packages (0.2.2)\r\nRequirement already satisfied: datasette-edit-schema>=0.4 in lib/python3.9/site-packages (0.5.1)\r\nRequirement already satisfied: datasette-configure-fts>=1.1 in lib/python3.9/site-packages (1.1)\r\nRequirement already satisfied: datasette-leaflet>=0.2.2 in lib/python3.9/site-packages (0.2.2)\r\nRequirement already satisfied: click>=7.1.1 
in lib/python3.9/site-packages (from datasette>=0.59) (8.1.3)\r\nRequirement already satisfied: hupper>=1.9 in lib/python3.9/site-packages (from datasette>=0.59) (1.10.3)\r\nRequirement already satisfied: pint>=0.9 in lib/python3.9/site-packages (from datasette>=0.59) (0.20.1)\r\nRequirement already satisfied: PyYAML>=5.3 in lib/python3.9/site-packages (from datasette>=0.59) (6.0)\r\nRequirement already satisfied: httpx>=0.20 in lib/python3.9/site-packages (from datasette>=0.59) (0.23.0)\r\nRequirement already satisfied: aiofiles>=0.4 in lib/python3.9/site-packages (from datasette>=0.59) (22.1.0)\r\nRequirement already satisfied: asgi-csrf>=0.9 in lib/python3.9/site-packages (from datasette>=0.59) (0.9)\r\nRequirement already satisfied: asgiref>=3.2.10 in lib/python3.9/site-packages (from datasette>=0.59) (3.5.2)\r\nRequirement already satisfied: uvicorn>=0.11 in lib/python3.9/site-packages (from datasette>=0.59) (0.19.0)\r\nRequirement already satisfied: itsdangerous>=1.1 in lib/python3.9/site-packages (from datasette>=0.59) (2.1.2)\r\nRequirement already satisfied: click-default-group-wheel>=1.2.2 in lib/python3.9/site-packages (from datasette>=0.59) (1.2.2)\r\nRequirement already satisfied: janus>=0.6.2 in lib/python3.9/site-packages (from datasette>=0.59) (1.0.0)\r\nRequirement already satisfied: pluggy>=1.0 in lib/python3.9/site-packages (from datasette>=0.59) (1.0.0)\r\nRequirement already satisfied: Jinja2>=2.10.3 in lib/python3.9/site-packages (from datasette>=0.59) (3.1.2)\r\nRequirement already satisfied: mergedeep>=1.1.1 in lib/python3.9/site-packages (from datasette>=0.59) (1.3.4)\r\nRequirement already satisfied: sqlite-utils in lib/python3.9/site-packages (from datasette-app-support>=0.11.6) (3.30)\r\nRequirement already satisfied: packaging in lib/python3.9/site-packages (from datasette-app-support>=0.11.6) (21.3)\r\nRequirement already satisfied: python-multipart in lib/python3.9/site-packages (from asgi-csrf>=0.9->datasette>=0.59) (0.0.5)\r\nRequirement already satisfied: httpcore<0.16.0,>=0.15.0 in lib/python3.9/site-packages (from httpx>=0.20->datasette>=0.59) (0.15.0)\r\nRequirement already satisfied: certifi in lib/python3.9/site-packages (from httpx>=0.20->datasette>=0.59) (2022.9.24)\r\nRequirement already satisfied: rfc3986[idna2008]<2,>=1.3 in lib/python3.9/site-packages (from httpx>=0.20->datasette>=0.59) (1.5.0)\r\nRequirement already satisfied: sniffio in lib/python3.9/site-packages (from httpx>=0.20->datasette>=0.59) (1.3.0)\r\nRequirement already satisfied: h11<0.13,>=0.11 in lib/python3.9/site-packages (from httpcore<0.16.0,>=0.15.0->httpx>=0.20->datasette>=0.59) (0.12.0)\r\nRequirement already satisfied: anyio==3.* in lib/python3.9/site-packages (from httpcore<0.16.0,>=0.15.0->httpx>=0.20->datasette>=0.59) (3.6.2)\r\nRequirement already satisfied: idna>=2.8 in lib/python3.9/site-packages (from anyio==3.*->httpcore<0.16.0,>=0.15.0->httpx>=0.20->datasette>=0.59) (3.4)\r\nRequirement already satisfied: typing-extensions>=3.7.4.3 in lib/python3.9/site-packages (from janus>=0.6.2->datasette>=0.59) (4.4.0)\r\nRequirement already satisfied: MarkupSafe>=2.0 in lib/python3.9/site-packages (from Jinja2>=2.10.3->datasette>=0.59) (2.1.1)\r\nRequirement already satisfied: tabulate in lib/python3.9/site-packages (from sqlite-utils->datasette-app-support>=0.11.6) (0.9.0)\r\nRequirement already satisfied: python-dateutil in lib/python3.9/site-packages (from sqlite-utils->datasette-app-support>=0.11.6) (2.8.2)\r\nRequirement already satisfied: sqlite-fts4 in 
lib/python3.9/site-packages (from sqlite-utils->datasette-app-support>=0.11.6) (1.0.3)\r\nRequirement already satisfied: pyparsing!=3.0.5,>=2.0.2 in lib/python3.9/site-packages (from packaging->datasette-app-support>=0.11.6) (3.0.9)\r\nRequirement already satisfied: six>=1.5 in lib/python3.9/site-packages (from python-dateutil->sqlite-utils->datasette-app-support>=0.11.6) (1.16.0)\r\nTook 784ms\r\n```\r\nSTUCK", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1994/reactions\", \"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1557507274, "node_id": "I_kwDOBm6k_c5c1azK", "number": 2005, "title": "`extra_template_vars` should be OK to return `None`", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2023-01-26T01:40:45Z", "updated_at": "2023-01-26T01:41:50Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "Got this exception and had to make sure it always returned `{}`:\r\n\r\n```\r\n File \".../python3.11/site-packages/datasette/app.py\", line 1049, in render_template\r\n assert isinstance(extra_vars, dict), \"extra_vars is of type {}\".format(\r\nAssertionError: extra_vars is of type \r\n```", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/2005/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1552368054, "node_id": "I_kwDOBm6k_c5ch0G2", "number": 2000, "title": "rewrite_sql hook", "user": {"value": 193185, "label": "cldellow"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2023-01-23T01:02:52Z", "updated_at": "2023-01-23T06:08:01Z", "closed_at": null, "author_association": "CONTRIBUTOR", "pull_request": null, "body": "I'm not sold that this is a good idea, but thought it'd be worth writing up a ticket. Proposal: add a hook like\r\n\r\n```python\r\ndef rewrite_sql(datasette, database, request, fn, sql, params)\r\n```\r\n\r\nIt would be called from Database.execute, Database.execute_write, Database.execute_write_script, Database.execute_write_many before running the user's SQL. 
`fn` would indicate which method was being used, in case that's relevant for the SQL inspection -- for example `execute` only permits a single statement.\r\n\r\nThe hook could return a SQL statement to be executed instead, or an async function to be awaited on that returned the SQL to be executed.\r\n\r\nPlugins that could be written with this hook:\r\n\r\n- https://github.com/cldellow/datasette-ersatz-table-valued-functions would use this to avoid monkey-patching\r\n- a plugin to inspect and reject unsafe Spatialite function calls (reported by [Simon in Discord](https://discord.com/channels/823971286308356157/823971286941302908/1066438832293159004))\r\n- a plugin to do more general rewrites of queries to enforce table or row-level security, for example, based on the currently logged in actor's ID\r\n- a plugin to maintain audit tables when users write to a table\r\n- a plugin to cache expensive queries (eg the queries that drive facets) - these could allow stale reads if previously cached, then refresh them in an offline queue\r\n\r\nFlaws with this idea:\r\n\r\n`execute_fn` and `execute_write_fn` would not go through this hook, which limits the guarantees you can make about it for security purposes.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/2000/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1560662739, "node_id": "I_kwDOBm6k_c5dBdLT", "number": 2007, "title": "`render_cell()` hook should take an optional `request` argument", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2023-01-28T03:13:00Z", "updated_at": "2023-08-09T17:15:03Z", "closed_at": "2023-01-28T03:34:26Z", "author_association": "OWNER", "pull_request": null, "body": "From Discord: https://discordapp.com/channels/823971286308356157/996877076982415491/1068227071156965486", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/2007/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 1563264257, "node_id": "I_kwDOBm6k_c5dLYUB", "number": 2010, "title": "Row page should default to card view", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": {"value": 3268330, "label": "Datasette 1.0"}, "comments": 1, "created_at": "2023-01-30T21:49:37Z", "updated_at": "2023-01-30T21:52:06Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "Datasette currently uses the same table layout on the row pages as it does on the table pages:\r\n\r\nhttps://datasette.io/content/pypi_packages?_sort=name&name__exact=datasette-column-inspect\r\n\r\n\"image\"\r\n\r\nhttps://datasette.io/content/pypi_packages/datasette-column-inspect\r\n\r\n\"image\"\r\n\r\nIf you shrink down to mobile width you get this instead, on both of those pages:\r\n\r\n\"image\"\r\n\r\nI think that view, which I think of as the \"card view\", is plain better if you're looking at just a single row - and it 
(or a variant of it) should be the default presentation on the row page.\r\n\r\n", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/2010/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1564769997, "node_id": "I_kwDOBm6k_c5dRH7N", "number": 2011, "title": "Applied facet did not result in an \"x\" icon to dismiss it", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2023-01-31T17:57:44Z", "updated_at": "2023-01-31T17:58:54Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "![CleanShot 2023-01-31 at 09 55 56@2x](https://user-images.githubusercontent.com/9599/215843684-1761a230-d490-4f87-be6d-186319366794.png)\r\n\r\nThat's against this data https://data.sfgov.org/City-Management-and-Ethics/Supplier-Contracts/cqi5-hm2d imported using https://datasette.io/plugins/datasette-socrata\r\n\r\nIt's for `Contract Type` of `Non-Purchasing Contract (Rents, etc.)` - so possible that some of the spaces or punctuation in either the name of the value tripped up the code that decides if the X icon should be displayed.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/2011/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1571207083, "node_id": "I_kwDOBm6k_c5dprer", "number": 2016, "title": "Database metadata fields like description are not available in the index page template's context", "user": {"value": 9993, "label": "palewire"}, "state": "open", "locked": 0, "assignee": null, "milestone": {"value": 3268330, "label": "Datasette 1.0"}, "comments": 1, "created_at": "2023-02-05T02:25:53Z", "updated_at": "2023-02-05T22:56:43Z", "closed_at": null, "author_association": "NONE", "pull_request": null, "body": "When looping through `databases` in the index.html template, I'd like to print the description of each database alongside its name. But it appears that isn't passed in from the view, unless I'm missing it. 
It would be great to have that.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/2016/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1578609658, "node_id": "I_kwDOBm6k_c5eF6v6", "number": 2022, "title": "Error 500 - not clear the cause", "user": {"value": 1667631, "label": "DavidPratten"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2023-02-09T20:57:17Z", "updated_at": "2023-02-09T21:13:50Z", "closed_at": "2023-02-09T21:13:50Z", "author_association": "NONE", "pull_request": null, "body": "On the database that I have sent via linkedIn, datasette works great, but the following URL gives a 500 error.\r\n\r\nhttp://127.0.0.1:8001/literature/authors_papers?authorId=100550354\r\n\r\nThe cause of the error is not apparent.\r\n\r\nIs this expected behaviour?\r\n\r\nDavid", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/2022/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 1579973223, "node_id": "I_kwDOBm6k_c5eLHpn", "number": 2024, "title": "Mention WAL mode in documentation", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2023-02-10T16:11:10Z", "updated_at": "2023-02-10T16:11:53Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "It's not currently obvious from the docs how you can ensure that Datasette runs well in situations where other processes may update the underlying SQLite files.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/2024/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1649791661, "node_id": "I_kwDOBm6k_c5iVdKt", "number": 2050, "title": "Row page JSON should use new ?_extra= format", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": {"value": 8755003, "label": "Datasette 1.0a-next"}, "comments": 1, "created_at": "2023-03-31T17:56:53Z", "updated_at": "2023-03-31T17:59:49Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "https://latest.datasette.io/fixtures/facetable/2.json\r\n\r\nRelated:\r\n- #2049\r\n- #1709 ", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/2050/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1665053646, "node_id": "I_kwDOBm6k_c5jPrPO", "number": 2059, "title": "\"Deceptive site ahead\" alert on 
Heroku deployment", "user": {"value": 1186275, "label": "mtdukes"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2023-04-12T18:34:51Z", "updated_at": "2023-04-13T01:13:01Z", "closed_at": null, "author_association": "NONE", "pull_request": null, "body": "I deployed a fairly basic instance of Datasette (`datasette-auth-passwords` is the only plugin) using Heroku. The deployed URL now gives a \"Deceptive site ahead\" warning to users.\r\n\r\nIs there way around this? Maybe a way to add ownership verification [through Google's search console](https://search.google.com/search-console/welcome)? ", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/2059/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1690765434, "node_id": "I_kwDOBm6k_c5kxwh6", "number": 2067, "title": "Litestream-restored db: errors on 3.11 and 3.10.8; but works on py3.10.7 and 3.10.6", "user": {"value": 39538958, "label": "justmars"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2023-05-01T12:42:28Z", "updated_at": "2023-05-03T00:16:03Z", "closed_at": null, "author_association": "NONE", "pull_request": null, "body": "Hi! Wondering if this issue is limited to my local system or if it affects others as well. \r\n\r\nIt seems like 3.11 errors out on a \"litestream-restored\" database. On further investigation, it also appears to conk out on 3.10.8 but works on 3.10.7 and 3.10.6.\r\n\r\nTo demo issue I created a test database, replicated it to an aws s3 bucket, then restored the same under various .pyenv-versioned shells where I test whether I can read the database via the sqlite3 cli.\r\n\r\n```sh\r\n# create new shell with 3.11.3\r\nlitestream restore -o data/db.sqlite s3://mytestbucketxx/db\r\nsqlite3 data/db.sqlite \r\n# SQLite version 3.41.2 2023-03-22 11:56:21\r\n# Enter \".help\" for usage hints.\r\n# sqlite> .tables\r\n# _litestream_lock _litestream_seq movie \r\n# sqlite> \r\n```\r\n\r\nHowever this get me an `OperationalError` when reading via datasette:\r\n\r\n
\r\nError on 3.11.3 and 3.10.8\r\n\r\n```sh\r\ndatasette data/db.sqlite\r\n```\r\n\r\n```console\r\n/tester/.venv/lib/python3.11/site-packages/pkg_resources/__init__.py:121: DeprecationWarning: pkg_resources is deprecated as an API\r\n warnings.warn(\"pkg_resources is deprecated as an API\", DeprecationWarning)\r\nTraceback (most recent call last):\r\n File \"/tester/.venv/bin/datasette\", line 8, in \r\n sys.exit(cli())\r\n ^^^^^\r\n File \"/tester/.venv/lib/python3.11/site-packages/click/core.py\", line 1130, in __call__\r\n return self.main(*args, **kwargs)\r\n ^^^^^^^^^^^^^^^^^^^^^^^^^^\r\n File \"/tester/.venv/lib/python3.11/site-packages/click/core.py\", line 1055, in main\r\n rv = self.invoke(ctx)\r\n ^^^^^^^^^^^^^^^^\r\n File \"/tester/.venv/lib/python3.11/site-packages/click/core.py\", line 1657, in invoke\r\n return _process_result(sub_ctx.command.invoke(sub_ctx))\r\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\r\n File \"/tester/.venv/lib/python3.11/site-packages/click/core.py\", line 1404, in invoke\r\n return ctx.invoke(self.callback, **ctx.params)\r\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\r\n File \"/tester/.venv/lib/python3.11/site-packages/click/core.py\", line 760, in invoke\r\n return __callback(*args, **kwargs)\r\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^\r\n File \"/tester/.venv/lib/python3.11/site-packages/datasette/cli.py\", line 143, in wrapped\r\n return fn(*args, **kwargs)\r\n ^^^^^^^^^^^^^^^^^^^\r\n File \"/tester/.venv/lib/python3.11/site-packages/datasette/cli.py\", line 615, in serve\r\n asyncio.get_event_loop().run_until_complete(check_databases(ds))\r\n File \"/Users/mv/.pyenv/versions/3.11.3/lib/python3.11/asyncio/base_events.py\", line 653, in run_until_complete\r\n return future.result()\r\n ^^^^^^^^^^^^^^^\r\n File \"/tester/.venv/lib/python3.11/site-packages/datasette/cli.py\", line 660, in check_databases\r\n await database.execute_fn(check_connection)\r\n File \"/tester/.venv/lib/python3.11/site-packages/datasette/database.py\", line 213, in execute_fn\r\n return await asyncio.get_event_loop().run_in_executor(\r\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\r\n File \"/Users/mv/.pyenv/versions/3.11.3/lib/python3.11/concurrent/futures/thread.py\", line 58, in run\r\n result = self.fn(*self.args, **self.kwargs)\r\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\r\n File \"/tester/.venv/lib/python3.11/site-packages/datasette/database.py\", line 211, in in_thread\r\n return fn(conn)\r\n ^^^^^^^^\r\n File \"/tester/.venv/lib/python3.11/site-packages/datasette/utils/__init__.py\", line 951, in check_connection\r\n for r in conn.execute(\r\n ^^^^^^^^^^^^^\r\nsqlite3.OperationalError: unable to open database file\r\n```\r\n\r\n
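A minimal way to narrow this down - assuming, as a hedge rather than a confirmed detail, that Datasette's read connections open the file via a `file:` URI with `mode=ro` - is to attempt the same style of open with the plain `sqlite3` module on the affected Python version. The `data/db.sqlite` path below is the hypothetical path from the report above.

```python
import sqlite3

# Hedged diagnostic sketch: open the restored database the way Datasette's
# read connections are assumed to (file: URI with mode=ro) and run a trivial
# query, to see whether the OperationalError reproduces with the sqlite3
# module alone on 3.11.x / 3.10.8.
path = "data/db.sqlite"  # hypothetical path from the report above
conn = sqlite3.connect("file:{}?mode=ro".format(path), uri=True)
print(conn.execute("select count(*) from sqlite_master").fetchone())
conn.close()
```

If this succeeds where Datasette fails, the problem is more likely in how the connection is configured than in the restored file itself; if it fails the same way, the issue sits between the newer Python builds and the litestream-restored database.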
\r\n\r\n\r\n
\r\nWorks on 3.10.7, 3.10.6\r\n\r\n```sh\r\n# create new shell with 3.10.7 / 3.10.6\r\nlitestream restore -o data/db.sqlite s3://mytestbucketxx/db\r\ndatasette data/db.sqlite\r\n# ...\r\n# INFO: Uvicorn running on http://127.0.0.1:8001 (Press CTRL+C to quit)\r\n```\r\n\r\n
\r\n\r\nIn both scenarios, the only dependencies were the pinned python version and the latest Datasette version 0.64.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/2067/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1698865182, "node_id": "I_kwDOBm6k_c5lQqAe", "number": 2069, "title": "[BUG] Cannot insert new data to deployed instance", "user": {"value": 31861128, "label": "yqlbu"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2023-05-07T02:59:42Z", "updated_at": "2023-05-07T03:17:35Z", "closed_at": null, "author_association": "NONE", "pull_request": null, "body": "## Summary\r\n\r\nRecently, I deployed an instance of datasette to Vercel with the following plugins:\r\n\r\n- datasette-auth-tokens\r\n- datasette-insert\r\n\r\nWith the above plugins, I was able to insert new data to local sqlite db. However, when it comes to the deployment on Vercel, things behave differently. I observed some errors from the logs console on Vercel:\r\n\r\n```console\r\nFile \"/var/task/datasette/database.py\", line 179, in _execute_writes\r\nconn = self.connect(write=True)\r\nFile \"/var/task/datasette/database.py\", line 93, in connect\r\nassert not (write and not self.is_mutable)\r\nAssertionError\r\n``` \r\n\r\n\"image\"\r\n\r\nI think it is a potential bug.\r\n\r\n## Reproduce\r\n\r\n
**metadata.json**\r\n
\r\n\r\n```json\r\n{\r\n \"plugins\": {\r\n \"datasette-insert\": {\r\n \"allow\": {\r\n \"id\": \"*\"\r\n }\r\n },\r\n \"datasette-auth-tokens\": {\r\n \"tokens\": [\r\n {\r\n \"token\": {\r\n \"$env\": \"INSERT_TOKEN\"\r\n },\r\n \"actor\": {\r\n \"id\": \"repeater\"\r\n }\r\n }\r\n ],\r\n \"param\": \"_auth_token\"\r\n }\r\n }\r\n}\r\n```\r\n\r\n
\r\n\r\n
**commands**\r\n
\r\n\r\n```bash\r\n# deploy\r\ndatasette publish vercel remote.db \\\r\n --project=repeater-bot-sqlite \\\r\n --metadata metadata.json \\\r\n --install datasette-auth-tokens \\\r\n --install datasette-insert \\\r\n --vercel-json=vercel.json\r\n\r\n# test insert\r\ncat fixtures/dogs.json | curl --request POST -d @- -H \"Authorization: Bearer \" \\\r\n 'https://repeater-bot-sqlite.vercel.app/-/insert/remote/dogs?pk=id'\r\n```\r\n\r\n
\r\n\r\n
**logs**\r\n
\r\n\r\n```console\r\nTraceback (most recent call last):\r\nFile \"/var/task/datasette/app.py\", line 1354, in route_path\r\nresponse = await view(request, send)\r\nFile \"/var/task/datasette/app.py\", line 1500, in async_view_fn\r\nresponse = await async_call_with_supported_arguments(\r\nFile \"/var/task/datasette/utils/__init__.py\", line 1005, in async_call_with_supported_arguments\r\nreturn await fn(*call_with)\r\nFile \"/var/task/datasette_insert/__init__.py\", line 14, in insert_or_upsert\r\nresponse = await insert_or_upsert_implementation(request, datasette)\r\nFile \"/var/task/datasette_insert/__init__.py\", line 91, in insert_or_upsert_implementation\r\ntable_count = await db.execute_write_fn(write_in_thread, block=True)\r\nFile \"/var/task/datasette/database.py\", line 167, in execute_write_fn\r\nraise result\r\nFile \"/var/task/datasette/database.py\", line 179, in _execute_writes\r\nconn = self.connect(write=True)\r\nFile \"/var/task/datasette/database.py\", line 93, in connect\r\nassert not (write and not self.is_mutable)\r\nAssertionError\r\n```\r\n\r\n
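The failing `assert not (write and not self.is_mutable)` suggests the deployed instance treats `remote.db` as immutable, so write requests are refused. A hedged way to check that, assuming the standard `/-/databases.json` introspection endpoint is reachable on the deployment, is to inspect `is_mutable` for each attached database:

```python
import json
import urllib.request

# Hedged sketch: list each database the instance has attached and whether it
# is considered mutable. The AssertionError above fires when a write
# connection is requested for a database whose is_mutable flag is false.
url = "https://repeater-bot-sqlite.vercel.app/-/databases.json"
with urllib.request.urlopen(url) as resp:
    for db in json.load(resp):
        print(db.get("name"), "is_mutable =", db.get("is_mutable"))
```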
", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/2069/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1803264272, "node_id": "I_kwDOBm6k_c5re6EQ", "number": 2101, "title": "alter: true support for JSON write API", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2023-07-13T15:24:11Z", "updated_at": "2023-07-13T15:24:18Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "Requested here: https://discord.com/channels/823971286308356157/823971286941302908/1129034187073134642\r\n\r\n> The former datasette-insert plugin had an option `?alter=1` to auto-add new columns. Does the JSON write API also have this?", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/2101/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1822936521, "node_id": "I_kwDOBm6k_c5sp83J", "number": 2110, "title": "Merge database index page and query view", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": {"value": 9700784, "label": "Datasette 1.0a3"}, "comments": 1, "created_at": "2023-07-26T18:21:57Z", "updated_at": "2023-07-26T19:53:25Z", "closed_at": "2023-07-26T19:53:25Z", "author_association": "OWNER", "pull_request": null, "body": "Refs:\r\n- #2109\r\n\r\nThe idea here is that hitting `/content` without a `?sql=` will show an empty result set AND default to including a bunch of extras about the list of tables in the database.\r\n\r\nThen I won't have to think about `/content` and `/content?sql=` as separate pages any more.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/2110/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 1822982933, "node_id": "I_kwDOBm6k_c5sqIMV", "number": 2117, "title": "Figure out what to do about `DatabaseView.name`", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": {"value": 9700784, "label": "Datasette 1.0a3"}, "comments": 1, "created_at": "2023-07-26T18:58:06Z", "updated_at": "2023-08-08T02:02:07Z", "closed_at": "2023-08-08T02:02:07Z", "author_association": "OWNER", "pull_request": null, "body": "In the old code:\r\n\r\nhttps://github.com/simonw/datasette/blob/08181823990a71ffa5a1b57b37259198eaa43e06/datasette/views/database.py#L34-L35\r\n\r\nThis `name` class attribute was later used by some of the plugin hooks, passed as `view_name`: https://github.com/simonw/datasette/blob/18dd88ee4d78fe9d760e9da96028ae06d938a85c/datasette/hookspecs.py#L50-L54\r\n\r\nFigure out how that should work once I've refactored those classes to view functions 
instead.\r\n\r\nRefs:\r\n- #2109 ", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/2117/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 1843821954, "node_id": "I_kwDOBm6k_c5t5n2C", "number": 2137, "title": "Redesign row default JSON", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": {"value": 8755003, "label": "Datasette 1.0a-next"}, "comments": 1, "created_at": "2023-08-09T18:49:11Z", "updated_at": "2023-08-09T19:02:47Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "This URL here:\r\n\r\nhttps://latest.datasette.io/fixtures/simple_primary_key/1.json?_extras=foreign_key_tables\r\n\r\n```json\r\n{\r\n \"database\": \"fixtures\",\r\n \"table\": \"simple_primary_key\",\r\n \"rows\": [\r\n {\r\n \"id\": \"1\",\r\n \"content\": \"hello\"\r\n }\r\n ],\r\n \"columns\": [\r\n \"id\",\r\n \"content\"\r\n ],\r\n \"primary_keys\": [\r\n \"id\"\r\n ],\r\n \"primary_key_values\": [\r\n \"1\"\r\n ],\r\n \"units\": {},\r\n \"foreign_key_tables\": [\r\n {\r\n \"other_table\": \"foreign_key_references\",\r\n \"column\": \"id\",\r\n \"other_column\": \"foreign_key_with_blank_label\",\r\n \"count\": 0,\r\n \"link\": \"/fixtures/foreign_key_references?foreign_key_with_blank_label=1\"\r\n },\r\n {\r\n \"other_table\": \"foreign_key_references\",\r\n \"column\": \"id\",\r\n \"other_column\": \"foreign_key_with_label\",\r\n \"count\": 1,\r\n \"link\": \"/fixtures/foreign_key_references?foreign_key_with_label=1\"\r\n },\r\n {\r\n \"other_table\": \"complex_foreign_keys\",\r\n \"column\": \"id\",\r\n \"other_column\": \"f3\",\r\n \"count\": 1,\r\n \"link\": \"/fixtures/complex_foreign_keys?f3=1\"\r\n },\r\n {\r\n \"other_table\": \"complex_foreign_keys\",\r\n \"column\": \"id\",\r\n \"other_column\": \"f2\",\r\n \"count\": 0,\r\n \"link\": \"/fixtures/complex_foreign_keys?f2=1\"\r\n },\r\n {\r\n \"other_table\": \"complex_foreign_keys\",\r\n \"column\": \"id\",\r\n \"other_column\": \"f1\",\r\n \"count\": 1,\r\n \"link\": \"/fixtures/complex_foreign_keys?f1=1\"\r\n }\r\n ],\r\n \"query_ms\": 4.226590999678592,\r\n \"source\": \"tests/fixtures.py\",\r\n \"source_url\": \"https://github.com/simonw/datasette/blob/main/tests/fixtures.py\",\r\n \"license\": \"Apache License 2.0\",\r\n \"license_url\": \"https://github.com/simonw/datasette/blob/main/LICENSE\",\r\n \"ok\": true,\r\n \"truncated\": false\r\n}\r\n```\r\n\r\nThat `?_extras=` should be `?_extra=` - plus the row JSON should be redesigned to fit the new default JSON representation.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/2137/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null}
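As a quick way to see the representation under discussion, a hedged sketch that fetches the row JSON cited in the issue above and prints its top-level keys (note the `?_extras=` spelling is the one the issue says should become `?_extra=`):

```python
import json
import urllib.request

# Hedged sketch: fetch the current row JSON from the URL quoted in the issue
# and list its top-level keys - the shape the issue proposes redesigning to
# match the new default JSON representation.
url = (
    "https://latest.datasette.io/fixtures/simple_primary_key/1.json"
    "?_extras=foreign_key_tables"
)
with urllib.request.urlopen(url) as resp:
    print(sorted(json.load(resp).keys()))
```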