{"id": 828858421, "node_id": "MDU6SXNzdWU4Mjg4NTg0MjE=", "number": 1258, "title": "Allow canned query params to specify default values", "user": {"value": 1385831, "label": "wdccdw"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 5, "created_at": "2021-03-11T07:19:02Z", "updated_at": "2023-02-20T23:39:58Z", "closed_at": null, "author_association": "NONE", "pull_request": null, "body": "If I call a canned query that includes named parameters, without passing any parameters, datasette runs the query anyway, resulting in an HTTP status code 400, and a visible error in the browser, with only a link back to home. This means that one of the default links on https://site/database/ will lead to a broken page with no apparent way out.\r\n\r\n![image](https://user-images.githubusercontent.com/1385831/110748683-13e72300-820e-11eb-855c-32e03dfef5bf.png)\r\n\r\nIs there any way to skip performing the query when parameters aren't supplied, but otherwise render the usual canned query page? Alternatively, can I supply default values for my parameters, either when defining my canned queries or when linking to the canned query page from the default database template.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1258/reactions\", \"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1592327343, "node_id": "I_kwDOBm6k_c5e6Pyv", "number": 2029, "title": "Sorry Simon, didn't know how else to contact you", "user": {"value": 5804626, "label": "llchristopherson"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 0, "created_at": "2023-02-20T19:02:53Z", "updated_at": "2023-02-20T19:02:53Z", "closed_at": null, "author_association": "NONE", "pull_request": null, "body": "Hi Simon,\r\n\r\nWould you be willing to chat with me about Datasette? I have some questions. I am working on a project to evaluate data ingestion tools for a research organization and I ran across Datasette. I have looked through a lot of your documentation, but still have some questions, which are very specific. 
If you would be willing to write me back about this, my email is laura@renci.org.\r\n\r\nThanks,\r\nLaura", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/2029/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1586980089, "node_id": "PR_kwDOBm6k_c5KF-by", "number": 2026, "title": "Avoid repeating primary key columns if included in _col args", "user": {"value": 8513, "label": "runderwood"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 0, "created_at": "2023-02-16T04:16:25Z", "updated_at": "2023-02-16T04:16:41Z", "closed_at": null, "author_association": "FIRST_TIME_CONTRIBUTOR", "pull_request": "simonw/datasette/pulls/2026", "body": "...while maintaining given order.\r\n\r\nFixes #1975 (if I'm understanding correctly).\r\n\r\n\r\n----\n:books: Documentation preview :books:: https://datasette--2026.org.readthedocs.build/en/2026/\n\r\n", "repo": {"value": 107914493, "label": "datasette"}, "type": "pull", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/2026/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": 0, "state_reason": null} {"id": 1581218043, "node_id": "PR_kwDOBm6k_c5JyqPy", "number": 2025, "title": "Add database metadata to index.html template context", "user": {"value": 9993, "label": "palewire"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 0, "created_at": "2023-02-12T11:16:58Z", "updated_at": "2023-02-12T11:17:14Z", "closed_at": null, "author_association": "FIRST_TIME_CONTRIBUTOR", "pull_request": "simonw/datasette/pulls/2025", "body": "Fixes #2016 \r\n\r\n\r\n----\n:books: Documentation preview :books:: https://datasette--2025.org.readthedocs.build/en/2025/\n\r\n", "repo": {"value": 107914493, "label": "datasette"}, "type": "pull", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/2025/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": 0, "state_reason": null} {"id": 1323346408, "node_id": "I_kwDOBm6k_c5O4Kno", "number": 1775, "title": "i18n support", "user": {"value": 428820, "label": "johnfelipe"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 9, "created_at": "2022-07-31T02:51:04Z", "updated_at": "2023-02-10T18:04:40Z", "closed_at": null, "author_association": "NONE", "pull_request": null, "body": "I want contribute for translate UI to es, de, de and it if you share strings", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1775/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1579973223, "node_id": "I_kwDOBm6k_c5eLHpn", "number": 2024, "title": "Mention WAL mode in documentation", "user": {"value": 
9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2023-02-10T16:11:10Z", "updated_at": "2023-02-10T16:11:53Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "It's not currently obvious from the docs how you can ensure that Datasette runs well in situations where other processes may update the underlying SQLite files.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/2024/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1577548579, "node_id": "I_kwDOBm6k_c5eB3sj", "number": 2021, "title": "Docker images for 1.0 alphas?", "user": {"value": 1563881, "label": "meowcat"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 0, "created_at": "2023-02-09T09:35:52Z", "updated_at": "2023-02-09T09:35:52Z", "closed_at": null, "author_association": "NONE", "pull_request": null, "body": "Hi,\r\nwould you consider putting 1.0alpha images on Dockerhub? \r\n\r\n(Also, how usable are the alphas?)", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/2021/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1575880841, "node_id": "I_kwDOBm6k_c5d7giJ", "number": 2020, "title": "Documentation refers to \"off\" setting; doesn't seem to work, \"false\" does", "user": {"value": 1350673, "label": "dmick"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 0, "created_at": "2023-02-08T10:38:10Z", "updated_at": "2023-02-08T10:38:10Z", "closed_at": null, "author_association": "NONE", "pull_request": null, "body": "https://docs.datasette.io/en/stable/settings.html#suggest-facets, among others, suggests using \"off\" to disable the setting; however, this doesn't appear to work in the JSON config files, where it apparently needs to be a \"JSON boolean\" and have the values \"true\" or \"false\". 
Perhaps the Python code is more flexible?...but either way, the documentation probably should mention it.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/2020/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1573424830, "node_id": "I_kwDOBm6k_c5dyI6-", "number": 2019, "title": "Refactor out the keyset pagination code", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 14, "created_at": "2023-02-06T23:04:00Z", "updated_at": "2023-02-08T01:40:46Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "While working on:\r\n- #1999\r\n\r\nI noticed that some of the most complex code in the existing table view is the code that implements keyset pagination:\r\n\r\nhttps://github.com/simonw/datasette/blob/0b4a28691468b5c758df74fa1d72a823813c96bf/datasette/views/table.py#L417-L493\r\n\r\nExtracting that into a utility function would simplify that code a lot.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/2019/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 473288428, "node_id": "MDExOlB1bGxSZXF1ZXN0MzAxNDgzNjEz", "number": 564, "title": "First proof-of-concept of Datasette Library", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2019-07-26T10:22:26Z", "updated_at": "2023-02-07T15:14:11Z", "closed_at": null, "author_association": "OWNER", "pull_request": "simonw/datasette/pulls/564", "body": "Refs #417. Run it like this:\r\n\r\n datasette -d ~/Library\r\n\r\nUses a new plugin hook - available_databases()\r\n", "repo": {"value": 107914493, "label": "datasette"}, "type": "pull", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/564/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": 1, "state_reason": null} {"id": 1571711808, "node_id": "I_kwDOBm6k_c5drmtA", "number": 2018, "title": "`check_visibility` gives confusing (wrong?) 
results if permission is `None`", "user": {"value": 193185, "label": "cldellow"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 0, "created_at": "2023-02-06T01:03:08Z", "updated_at": "2023-02-06T01:03:46Z", "closed_at": null, "author_association": "CONTRIBUTOR", "pull_request": null, "body": "I'm trying to gate access to an edit UI on the user having `update-row` on the underlying view or table.\r\n\r\nI expected [datasette.check_visibility](https://docs.datasette.io/en/latest/internals.html#await-check-visibility-actor-action-none-resource-none-permissions-none) to be a good way to do this:\r\n\r\n```python\r\n visible, private = await datasette.check_visibility(\r\n request.actor,\r\n permissions=[\r\n (\"update-row\", (database, table)),\r\n ],\r\n )\r\n\r\n if not visible:\r\n return None\r\n```\r\n\r\nBut `visible` is returning true, even when there is no explicit `update-row` permission. (In this case, `request.actor` is `None`.)\r\n\r\nBased on [the update-row permissions docs](https://docs.datasette.io/en/latest/authentication.html#update-row), I expected this to be default deny, and so no explicit permission would result in false.\r\n\r\nI think the root cause is that `check_visibility` calls `ensure_permissions` and expects it to throw if the permission is not available.\r\n\r\nBut `ensure_permissions` does not throw when `permission_allowed` returns None: https://github.com/simonw/datasette/blob/1.0a2/datasette/app.py#L825-L829", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/2018/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1571207083, "node_id": "I_kwDOBm6k_c5dprer", "number": 2016, "title": "Database metadata fields like description are not available in the index page template's context", "user": {"value": 9993, "label": "palewire"}, "state": "open", "locked": 0, "assignee": null, "milestone": {"value": 3268330, "label": "Datasette 1.0"}, "comments": 1, "created_at": "2023-02-05T02:25:53Z", "updated_at": "2023-02-05T22:56:43Z", "closed_at": null, "author_association": "NONE", "pull_request": null, "body": "When looping through `databases` in the index.html template, I'd like to print the description of each database alongside its name. But it appears that isn't passed in from the view, unless I'm missing it. 
It would be great to have that.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/2016/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1556065335, "node_id": "PR_kwDOBm6k_c5Ie5nA", "number": 2004, "title": "use single quotes for string literals, fixes #2001", "user": {"value": 193185, "label": "cldellow"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2023-01-25T05:08:46Z", "updated_at": "2023-02-01T06:37:18Z", "closed_at": null, "author_association": "CONTRIBUTOR", "pull_request": "simonw/datasette/pulls/2004", "body": "This modernizes some uses of double quotes for string literals to use only single quotes, fixes simonw/datasette#2001\r\n\r\nWhile developing it, I manually enabled the stricter mode by using the code snippet at https://gist.github.com/cldellow/85bba507c314b127f85563869cd94820\r\n\r\nI think that code snippet isn't generally safe/portable, so I haven't tried to automate it in the tests.\r\n\r\n\r\n----\r\n:books: Documentation preview :books:: https://datasette--2004.org.readthedocs.build/en/2004/\r\n\r\n", "repo": {"value": 107914493, "label": "datasette"}, "type": "pull", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/2004/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": 0, "state_reason": null} {"id": 1565179870, "node_id": "I_kwDOBm6k_c5dSr_e", "number": 2013, "title": "Datasette uses non-standard quoting for identifiers", "user": {"value": 193185, "label": "cldellow"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 0, "created_at": "2023-02-01T00:05:39Z", "updated_at": "2023-02-01T00:06:30Z", "closed_at": null, "author_association": "CONTRIBUTOR", "pull_request": null, "body": "Related to #2001, but where #2001 was about literals, this is about identifiers\r\n\r\nFrom https://www.sqlite.org/lang_keywords.html:\r\n\r\n> \"keyword\" A keyword in double-quotes is an identifier.\r\n> [keyword] A keyword enclosed in square brackets is an identifier. This is not standard SQL. 
This quoting mechanism is used by MS Access and SQL Server and is included in SQLite for compatibility.\r\n\r\nDatasette uses this quoting here -- https://github.com/simonw/datasette/blob/0b4a28691468b5c758df74fa1d72a823813c96bf/datasette/utils/__init__.py#L345-L349, in some of the other DB access code, and in some of the test fixtures.\r\n\r\nMigrating to standard double quote identifiers would make it easier to get Datasette working with alternative backends", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/2013/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1564774831, "node_id": "I_kwDOBm6k_c5dRJGv", "number": 2012, "title": "Missing space in database summary", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 0, "created_at": "2023-01-31T18:01:13Z", "updated_at": "2023-01-31T18:01:13Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "Spotted this on an instance index page:\r\n\r\n\"image\"\r\n", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/2012/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1564769997, "node_id": "I_kwDOBm6k_c5dRH7N", "number": 2011, "title": "Applied facet did not result in an \"x\" icon to dismiss it", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2023-01-31T17:57:44Z", "updated_at": "2023-01-31T17:58:54Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "![CleanShot 2023-01-31 at 09 55 56@2x](https://user-images.githubusercontent.com/9599/215843684-1761a230-d490-4f87-be6d-186319366794.png)\r\n\r\nThat's against this data https://data.sfgov.org/City-Management-and-Ethics/Supplier-Contracts/cqi5-hm2d imported using https://datasette.io/plugins/datasette-socrata\r\n\r\nIt's for `Contract Type` of `Non-Purchasing Contract (Rents, etc.)` - so possible that some of the spaces or punctuation in either the name of the value tripped up the code that decides if the X icon should be displayed.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/2011/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1563264257, "node_id": "I_kwDOBm6k_c5dLYUB", "number": 2010, "title": "Row page should default to card view", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": {"value": 3268330, "label": "Datasette 1.0"}, "comments": 1, "created_at": "2023-01-30T21:49:37Z", "updated_at": "2023-01-30T21:52:06Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "Datasette currently uses the 
same table layout on the row pages as it does on the table pages:\r\n\r\nhttps://datasette.io/content/pypi_packages?_sort=name&name__exact=datasette-column-inspect\r\n\r\n\"image\"\r\n\r\nhttps://datasette.io/content/pypi_packages/datasette-column-inspect\r\n\r\n\"image\"\r\n\r\nIf you shrink down to mobile width you get this instead, on both of those pages:\r\n\r\n\"image\"\r\n\r\nI think that view, which I think of as the \"card view\", is plain better if you're looking at just a single row - and it (or a variant of it) should be the default presentation on the row page.\r\n\r\n", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/2010/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1186696202, "node_id": "I_kwDOBm6k_c5Gu4wK", "number": 1696, "title": "Show foreign key label when filtering", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 2, "created_at": "2022-03-30T16:18:54Z", "updated_at": "2023-01-29T20:56:20Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "For example here:\r\n\r\n\"image\"\r\n\r\n3 corresponds to \"Human Related: Other\" - it would be neat to display this in this area of the page somehow.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1696/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1560982210, "node_id": "PR_kwDOBm6k_c5IvYKw", "number": 2008, "title": "array facet: don't materialize unnecessary columns", "user": {"value": 193185, "label": "cldellow"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 8, "created_at": "2023-01-28T19:33:40Z", "updated_at": "2023-01-29T18:17:40Z", "closed_at": null, "author_association": "CONTRIBUTOR", "pull_request": "simonw/datasette/pulls/2008", "body": "The presence of `inner.*` causes SQLite to materialize a row with all the columns. Those columns will be discarded later.\r\n\r\nInstead, we can select only the column we'll use. 
This lets SQLite's optimizer realize that the other columns in the CTE definition aren't needed.\r\n\r\nOn a test table with 278K rows, 98K of which had an array, this speeds up the facet calculation from 4 sec to 1 sec.\r\n\r\n\r\n----\n:books: Documentation preview :books:: https://datasette--2008.org.readthedocs.build/en/2008/\n\r\n", "repo": {"value": 107914493, "label": "datasette"}, "type": "pull", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/2008/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": 0, "state_reason": null} {"id": 1515815014, "node_id": "I_kwDOBm6k_c5aWYBm", "number": 1973, "title": "render_cell plugin hook's row object is not a sqlite.Row", "user": {"value": 193185, "label": "cldellow"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 4, "created_at": "2023-01-01T20:27:46Z", "updated_at": "2023-01-29T00:40:31Z", "closed_at": null, "author_association": "CONTRIBUTOR", "pull_request": null, "body": "From https://docs.datasette.io/en/stable/plugin_hooks.html#render-cell-row-value-column-table-database-datasette:\r\n\r\n> row - sqlite.Row\r\n> The SQLite row object that the value being rendered is part of\r\n\r\nThis appears to actually be a [CustomRow](https://github.com/simonw/datasette/blob/f0fadc28ddb9f82e5cc1ecaa51e8a342eb6dc528/datasette/utils/__init__.py#L773-L789), but I think that's unrelated to my issue.\r\n\r\nI have a table:\r\n\r\n```sql\r\nCREATE TABLE IF NOT EXISTS \"dss_job_stats\"(\r\n job_id integer not null references dss_job(id) on delete cascade,\r\n host text not null,\r\n // other columns elided as irrelevant\r\n primary key (job_id, host)\r\n);\r\n```\r\n\r\nOn datasette 0.63.2, the `render_cell` hook receives a `row` value that looks like:\r\n\r\n```\r\nCustomRow([('job_id', {'value': 2, 'label': '2'}), ('host', 'cldellow.com')])\r\n```\r\n\r\nI expected the `job_id` value to be `2`, but it's actually `{'value': 2, 'label': '2'}`.\r\n\r\nI can work around this, but was wondering if this was intended behaviour?", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1973/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1558644003, "node_id": "I_kwDOBm6k_c5c5wUj", "number": 2006, "title": "Teach `datasette publish` to pin to `datasette<1.0` in a 0.x release", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": {"value": 3268330, "label": "Datasette 1.0"}, "comments": 2, "created_at": "2023-01-26T19:17:40Z", "updated_at": "2023-01-26T19:20:53Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "I just realized that when I ship Datasette 1.0 there may be automated deployments out there which could deploy the 1.0 version by accident, potentially breaking any customizations that aren't compatible with the 1.0 changes.\r\n\r\nI can hopefully help avoid that by shipping one last entry in the `0.x` series that ensures `datasette publish` pins to `<1.0` when it installs Datasette itself.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", 
"active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/2006/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1557507274, "node_id": "I_kwDOBm6k_c5c1azK", "number": 2005, "title": "`extra_template_vars` should be OK to return `None`", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2023-01-26T01:40:45Z", "updated_at": "2023-01-26T01:41:50Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "Got this exception and had to make sure it always returned `{}`:\r\n\r\n```\r\n File \".../python3.11/site-packages/datasette/app.py\", line 1049, in render_template\r\n assert isinstance(extra_vars, dict), \"extra_vars is of type {}\".format(\r\nAssertionError: extra_vars is of type \r\n```", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/2005/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1555701851, "node_id": "PR_kwDOBm6k_c5IdsD7", "number": 2003, "title": "Show referring tables and rows when the referring foreign key is compound", "user": {"value": 536941, "label": "fgregg"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2023-01-24T21:31:31Z", "updated_at": "2023-01-25T18:44:42Z", "closed_at": null, "author_association": "CONTRIBUTOR", "pull_request": "simonw/datasette/pulls/2003", "body": "sqlite foreign keys can be compound, but that is not as well supported by datasette as single column foreign keys.\r\n\r\nin particular, \r\n\r\n1. in a table view, there is not a link from the row to the referenced row if the foreign key is compound\r\n2. in a row view, there is no listing of tables and rows that refer to the focal row if those referencing foreign keys are compound.\r\n\r\nBoth of these issues are discussed in #1099. \r\n\r\nThis PR only fixes the second one, because it's not clear what the right UX is for the first issue.\r\n\r\n![Screenshot 2023-01-24 at 19-47-40 nlrb bargaining_unit](https://user-images.githubusercontent.com/536941/214454749-d53deead-4151-4329-a5d4-8a7a454de7d3.png)\r\n\r\nSome things that might not be desirable about this approach.\r\n\r\n1. it changes the external API, by changing `column` => `columns` and `other_column` => `other_columns` (see inline comment for more discussion.\r\n2. There are various places where the plural foreign keys have to be checked for length and discarded or transformed to singular. 
\r\n", "repo": {"value": 107914493, "label": "datasette"}, "type": "pull", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/2003/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": 0, "state_reason": null} {"id": 1553615704, "node_id": "I_kwDOBm6k_c5cmktY", "number": 2001, "title": "Datasette is not compatible with SQLite's strict quoting compilation option", "user": {"value": 406380, "label": "gwk"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 4, "created_at": "2023-01-23T19:10:07Z", "updated_at": "2023-01-25T04:59:58Z", "closed_at": null, "author_association": "NONE", "pull_request": null, "body": "I have linked Python3.11 on macOS against recent SQLite that was compiled using `-DSQLITE_DQS=0`. This option disables interpretation of double-quoted identifiers as string literals, described in the SQLite docs as a \"MySQL 3.x misfeature\". See https://www.sqlite.org/quirks.html#dblquote for background.\r\n\r\nDatasette uses the double-quote syntax in a number of key places, and is thus completely broken in this environment.\r\n\r\nMy experience was to `pip install datasette`, then run `datasette serve -I my-data.db`. When I visit `http://127.0.0.1:8001` I get a 500 response.\r\n\r\nThe error: `sqlite3.OperationalError: no such column: geometry_columns`\r\n\r\nThe responsible SQL: `'select 1 from sqlite_master where tbl_name = \"geometry_columns\"'`\r\n\r\nI then installed datasette from GitHub master in development mode and changed the offending SQL to use correct quotes: `\"select 1 from sqlite_master where tbl_name = 'geometry_columns'\"`.\r\n\r\nWith this change, I get a little further, but have the same problem with the first table name in my database (in my case, \"Meta\"):\r\n```\r\nOperationalError: no such column: Meta\r\nTraceback (most recent call last):\r\n File \"/Users/gwk/external/datasette/datasette/app.py\", line 1522, in route_path\r\n response = await view(request, send)\r\n ^^^^^^^^^^^^^^^^^^^^^^^^^\r\n File \"/Users/gwk/external/datasette/datasette/views/base.py\", line 151, in view\r\n return await self.dispatch_request(request)\r\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\r\n File \"/Users/gwk/external/datasette/datasette/views/base.py\", line 105, in dispatch_request\r\n response = await handler(request)\r\n ^^^^^^^^^^^^^^^^^^^^^^\r\n File \"/Users/gwk/external/datasette/datasette/views/index.py\", line 70, in get\r\n \"fts_table\": await db.fts_table(table),\r\n ^^^^^^^^^^^^^^^^^^^^^^^^^\r\n File \"/Users/gwk/external/datasette/datasette/database.py\", line 363, in fts_table\r\n return await self.execute_fn(lambda conn: detect_fts(conn, table))\r\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\r\n File \"/Users/gwk/external/datasette/datasette/database.py\", line 213, in execute_fn\r\n return await asyncio.get_event_loop().run_in_executor(\r\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\r\n File \"/usr/local/py/Python.framework/Versions/3.11/lib/python3.11/concurrent/futures/thread.py\", line 58, in run\r\n result = self.fn(*self.args, **self.kwargs)\r\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\r\n File \"/Users/gwk/external/datasette/datasette/database.py\", line 211, in in_thread\r\n return fn(conn)\r\n ^^^^^^^^\r\n File \"/Users/gwk/external/datasette/datasette/database.py\", line 363, in \r\n return await self.execute_fn(lambda 
conn: detect_fts(conn, table))\r\n ^^^^^^^^^^^^^^^^^^^^^^^\r\n File \"/Users/gwk/external/datasette/datasette/utils/__init__.py\", line 588, in detect_fts\r\n rows = conn.execute(detect_fts_sql(table)).fetchall()\r\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\r\nsqlite3.OperationalError: no such column: Meta\r\nINFO: 127.0.0.1:50258 - \"GET / HTTP/1.1\" 500 Internal Server Error\r\n```\r\n\r\nI will try to continue playing with this, but I also hope that the datasette developers will enable this mode in a test environment as I am unlikely to be able to exercise all of the SQL in the codebase, or make a pull request very soon.\r\n\r\nNote that the DQS setting compile-time option can be overridden at runtime with calls to the C API:\r\n```\r\nsqlite3_db_config(db, SQLITE_DBCONFIG_DQS_DDL, 0, (void*)0);\r\nsqlite3_db_config(db, SQLITE_DBCONFIG_DQS_DML, 0, (void*)0);\r\n```\r\n\r\nAs far as I can tell, `sqlite3_db_config` is not exposed in Python, but perhaps we could figure out how to invoke it using `ctypes`.\r\n", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/2001/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 743371103, "node_id": "MDU6SXNzdWU3NDMzNzExMDM=", "number": 1099, "title": "Support linking to compound foreign keys", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 6, "created_at": "2020-11-15T23:23:17Z", "updated_at": "2023-01-25T00:58:26Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "Reported as a bug in #1098 because they caused 500 errors - but it would be even better if Datasette could hyperlink to related rows via compound foreign keys.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1099/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1554032168, "node_id": "I_kwDOBm6k_c5coKYo", "number": 2002, "title": "Document how actors are displayed", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 0, "created_at": "2023-01-24T00:08:49Z", "updated_at": "2023-01-24T00:08:49Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "https://github.com/simonw/datasette/blob/e4ebef082de90db4e1b8527abc0d582b7ae0bc9d/datasette/utils/__init__.py#L1052-L1056\r\n\r\nThis logic should be reflected in the documentation on https://docs.datasette.io/en/stable/authentication.html#actors", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/2002/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1536851861, "node_id": "I_kwDOBm6k_c5bmn-V", "number": 1994, "title": "Stuck on loading screen", "user": {"value": 
10913053, "label": "jackhagley"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2023-01-17T18:33:49Z", "updated_at": "2023-01-23T08:21:08Z", "closed_at": null, "author_association": "NONE", "pull_request": null, "body": "Can\u2019t actually open it!\r\n\r\nDownloaded today from the releases tab\r\n\r\nRunning macOS13.1 \r\n\r\n```\r\nbin/python3.9 --version\r\nPython 3.9.6\r\nTook 83ms\r\nbin/python3.9 --version\r\nPython 3.9.6\r\nTook 113ms\r\nbin/pip install datasette>=0.59 datasette-app-support>=0.11.6 datasette-vega>=0.6.2 datasette-cluster-map>=0.17.1 datasette-pretty-json>=0.2.1 datasette-edit-schema>=0.4 datasette-configure-fts>=1.1 datasette-leaflet>=0.2.2 --disable-pip-version-check\r\nRequirement already satisfied: datasette>=0.59 in lib/python3.9/site-packages (0.63)\r\nRequirement already satisfied: datasette-app-support>=0.11.6 in lib/python3.9/site-packages (0.11.6)\r\nRequirement already satisfied: datasette-vega>=0.6.2 in lib/python3.9/site-packages (0.6.2)\r\nRequirement already satisfied: datasette-cluster-map>=0.17.1 in lib/python3.9/site-packages (0.17.2)\r\nRequirement already satisfied: datasette-pretty-json>=0.2.1 in lib/python3.9/site-packages (0.2.2)\r\nRequirement already satisfied: datasette-edit-schema>=0.4 in lib/python3.9/site-packages (0.5.1)\r\nRequirement already satisfied: datasette-configure-fts>=1.1 in lib/python3.9/site-packages (1.1)\r\nRequirement already satisfied: datasette-leaflet>=0.2.2 in lib/python3.9/site-packages (0.2.2)\r\nRequirement already satisfied: click>=7.1.1 in lib/python3.9/site-packages (from datasette>=0.59) (8.1.3)\r\nRequirement already satisfied: hupper>=1.9 in lib/python3.9/site-packages (from datasette>=0.59) (1.10.3)\r\nRequirement already satisfied: pint>=0.9 in lib/python3.9/site-packages (from datasette>=0.59) (0.20.1)\r\nRequirement already satisfied: PyYAML>=5.3 in lib/python3.9/site-packages (from datasette>=0.59) (6.0)\r\nRequirement already satisfied: httpx>=0.20 in lib/python3.9/site-packages (from datasette>=0.59) (0.23.0)\r\nRequirement already satisfied: aiofiles>=0.4 in lib/python3.9/site-packages (from datasette>=0.59) (22.1.0)\r\nRequirement already satisfied: asgi-csrf>=0.9 in lib/python3.9/site-packages (from datasette>=0.59) (0.9)\r\nRequirement already satisfied: asgiref>=3.2.10 in lib/python3.9/site-packages (from datasette>=0.59) (3.5.2)\r\nRequirement already satisfied: uvicorn>=0.11 in lib/python3.9/site-packages (from datasette>=0.59) (0.19.0)\r\nRequirement already satisfied: itsdangerous>=1.1 in lib/python3.9/site-packages (from datasette>=0.59) (2.1.2)\r\nRequirement already satisfied: click-default-group-wheel>=1.2.2 in lib/python3.9/site-packages (from datasette>=0.59) (1.2.2)\r\nRequirement already satisfied: janus>=0.6.2 in lib/python3.9/site-packages (from datasette>=0.59) (1.0.0)\r\nRequirement already satisfied: pluggy>=1.0 in lib/python3.9/site-packages (from datasette>=0.59) (1.0.0)\r\nRequirement already satisfied: Jinja2>=2.10.3 in lib/python3.9/site-packages (from datasette>=0.59) (3.1.2)\r\nRequirement already satisfied: mergedeep>=1.1.1 in lib/python3.9/site-packages (from datasette>=0.59) (1.3.4)\r\nRequirement already satisfied: sqlite-utils in lib/python3.9/site-packages (from datasette-app-support>=0.11.6) (3.30)\r\nRequirement already satisfied: packaging in lib/python3.9/site-packages (from datasette-app-support>=0.11.6) (21.3)\r\nRequirement already satisfied: python-multipart in lib/python3.9/site-packages (from 
asgi-csrf>=0.9->datasette>=0.59) (0.0.5)\r\nRequirement already satisfied: httpcore<0.16.0,>=0.15.0 in lib/python3.9/site-packages (from httpx>=0.20->datasette>=0.59) (0.15.0)\r\nRequirement already satisfied: certifi in lib/python3.9/site-packages (from httpx>=0.20->datasette>=0.59) (2022.9.24)\r\nRequirement already satisfied: rfc3986[idna2008]<2,>=1.3 in lib/python3.9/site-packages (from httpx>=0.20->datasette>=0.59) (1.5.0)\r\nRequirement already satisfied: sniffio in lib/python3.9/site-packages (from httpx>=0.20->datasette>=0.59) (1.3.0)\r\nRequirement already satisfied: h11<0.13,>=0.11 in lib/python3.9/site-packages (from httpcore<0.16.0,>=0.15.0->httpx>=0.20->datasette>=0.59) (0.12.0)\r\nRequirement already satisfied: anyio==3.* in lib/python3.9/site-packages (from httpcore<0.16.0,>=0.15.0->httpx>=0.20->datasette>=0.59) (3.6.2)\r\nRequirement already satisfied: idna>=2.8 in lib/python3.9/site-packages (from anyio==3.*->httpcore<0.16.0,>=0.15.0->httpx>=0.20->datasette>=0.59) (3.4)\r\nRequirement already satisfied: typing-extensions>=3.7.4.3 in lib/python3.9/site-packages (from janus>=0.6.2->datasette>=0.59) (4.4.0)\r\nRequirement already satisfied: MarkupSafe>=2.0 in lib/python3.9/site-packages (from Jinja2>=2.10.3->datasette>=0.59) (2.1.1)\r\nRequirement already satisfied: tabulate in lib/python3.9/site-packages (from sqlite-utils->datasette-app-support>=0.11.6) (0.9.0)\r\nRequirement already satisfied: python-dateutil in lib/python3.9/site-packages (from sqlite-utils->datasette-app-support>=0.11.6) (2.8.2)\r\nRequirement already satisfied: sqlite-fts4 in lib/python3.9/site-packages (from sqlite-utils->datasette-app-support>=0.11.6) (1.0.3)\r\nRequirement already satisfied: pyparsing!=3.0.5,>=2.0.2 in lib/python3.9/site-packages (from packaging->datasette-app-support>=0.11.6) (3.0.9)\r\nRequirement already satisfied: six>=1.5 in lib/python3.9/site-packages (from python-dateutil->sqlite-utils->datasette-app-support>=0.11.6) (1.16.0)\r\nTook 784ms\r\n```\r\nSTUCK", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1994/reactions\", \"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1552368054, "node_id": "I_kwDOBm6k_c5ch0G2", "number": 2000, "title": "rewrite_sql hook", "user": {"value": 193185, "label": "cldellow"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2023-01-23T01:02:52Z", "updated_at": "2023-01-23T06:08:01Z", "closed_at": null, "author_association": "CONTRIBUTOR", "pull_request": null, "body": "I'm not sold that this is a good idea, but thought it'd be worth writing up a ticket. Proposal: add a hook like\r\n\r\n```python\r\ndef rewrite_sql(datasette, database, request, fn, sql, params)\r\n```\r\n\r\nIt would be called from Database.execute, Database.execute_write, Database.execute_write_script, Database.execute_write_many before running the user's SQL. 
`fn` would indicate which method was being used, in case that's relevant for the SQL inspection -- for example `execute` only permits a single statement.\r\n\r\nThe hook could return a SQL statement to be executed instead, or an async function to be awaited on that returned the SQL to be executed.\r\n\r\nPlugins that could be written with this hook:\r\n\r\n- https://github.com/cldellow/datasette-ersatz-table-valued-functions would use this to avoid monkey-patching\r\n- a plugin to inspect and reject unsafe Spatialite function calls (reported by [Simon in Discord](https://discord.com/channels/823971286308356157/823971286941302908/1066438832293159004))\r\n- a plugin to do more general rewrites of queries to enforce table or row-level security, for example, based on the currently logged in actor's ID\r\n- a plugin to maintain audit tables when users write to a table\r\n- a plugin to cache expensive queries (eg the queries that drive facets) - these could allow stale reads if previously cached, then refresh them in an offline queue\r\n\r\nFlaws with this idea:\r\n\r\n`execute_fn` and `execute_write_fn` would not go through this hook, which limits the guarantees you can make about it for security purposes.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/2000/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 749283032, "node_id": "MDU6SXNzdWU3NDkyODMwMzI=", "number": 1101, "title": "register_output_renderer() should support streaming data", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": {"value": 3268330, "label": "Datasette 1.0"}, "comments": 13, "created_at": "2020-11-24T02:17:09Z", "updated_at": "2023-01-21T22:07:19Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "> I'd like to implement this by first extending the `register_output_renderer()` hook to support streaming huge responses, then switching CSV to use the plugin hook in addition to TSV using it.\r\n\r\n_Originally posted by @simonw in https://github.com/simonw/datasette/issues/1096#issuecomment-732542285_", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1101/reactions\", \"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1551113681, "node_id": "I_kwDOBm6k_c5cdB3R", "number": 1998, "title": "`datasette --version` should also show the SQLite version", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 2, "created_at": "2023-01-20T16:11:30Z", "updated_at": "2023-01-20T18:19:06Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "Idea came up here: https://discord.com/channels/823971286308356157/823971286941302908/1066026473003159783", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1998/reactions\", 
\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1533673397, "node_id": "I_kwDOBm6k_c5baf-1", "number": 1991, "title": "fts5 tables are not auto-detected and hidden", "user": {"value": 83819, "label": "keturn"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 0, "created_at": "2023-01-15T06:00:42Z", "updated_at": "2023-01-20T04:54:24Z", "closed_at": null, "author_association": "NONE", "pull_request": null, "body": "I set up a [Datasette instance](https://huggingface.co/spaces/Sygil/INE-dataset-explorer/tree/main) and was following the docs on full-text search.\r\n\r\nWhen I used fts4, datasette automatically hid the FTS tables and added the FTS search box where appropriate, but when I changed to fts5 it no longer does either.\r\n\r\nIf I [manually set](https://huggingface.co/spaces/keturn/INED-datasette/blob/main/metadata.json#L9) `fts_table` for a view, then search does work as expected.\r\n\r\nMy table and view creation code looks like this:\r\n```py\r\nconnection.execute(\"\"\"CREATE TABLE IF NOT EXISTS\r\n captions(image_key text PRIMARY KEY, caption text NOT NULL)\r\n\"\"\")\r\n\u00a0\r\nconnection.execute(\"\"\"CREATE VIRTUAL TABLE\r\n captions_fts USING\r\n fts5(caption, image_key UNINDEXED, content=captions)\r\n\"\"\")\r\n```", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1991/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1538342965, "node_id": "PR_kwDOBm6k_c5HpNYo", "number": 1996, "title": "Document custom json encoder", "user": {"value": 25778, "label": "eyeseast"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2023-01-18T16:54:14Z", "updated_at": "2023-01-19T12:55:57Z", "closed_at": null, "author_association": "CONTRIBUTOR", "pull_request": "simonw/datasette/pulls/1996", "body": "Closes #1983 \r\n\r\nAll documentation here. 
Edits welcome.\r\n\r\n\r\n\r\n----\n:books: Documentation preview :books:: https://datasette--1996.org.readthedocs.build/en/1996/\n\r\n", "repo": {"value": 107914493, "label": "datasette"}, "type": "pull", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1996/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": 0, "state_reason": null} {"id": 1538197093, "node_id": "I_kwDOBm6k_c5brwZl", "number": 1995, "title": "foreign_keys error 500", "user": {"value": 137183, "label": "jonschoning"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 0, "created_at": "2023-01-18T15:27:36Z", "updated_at": "2023-01-18T16:44:01Z", "closed_at": null, "author_association": "NONE", "pull_request": null, "body": "**Error 500 expected string or bytes-like object**\r\n\r\n[espial-new.sqlite3.zip](https://github.com/simonw/datasette/files/10447965/espial-new.sqlite3.zip)\r\n\r\nrun `datasette espial-new.sqlite3` & click on any table other than `User`\r\n\r\n```\r\n/home/jon/.local/lib/python3.10/site-packages/datasette/app.py:814 in \u2502\r\n\u2502 expand_foreign_keys \u2502\r\n\u2502 \u2502\r\n\u2502 811 \u2502 \u2502 \u2502 from {other_table} \u2502\r\n\u2502 812 \u2502 \u2502 \u2502 where {other_column} in ({placeholders}) \u2502\r\n\u2502 813 \u2502 \u2502 \"\"\".format( \u2502\r\n\u2502 \u2771 814 \u2502 \u2502 \u2502 other_column=escape_sqlite(fk[\"other_column\"]), \u2502\r\n\u2502 815 \u2502 \u2502 \u2502 label_column=escape_sqlite(label_column), \u2502\r\n\u2502 816 \u2502 \u2502 \u2502 other_table=escape_sqlite(fk[\"other_table\"]), \u2502\r\n\u2502 817 \u2502 \u2502 \u2502 placeholders=\", \".join([\"?\"] * len(set(values))), \u2502\r\n\u2502 \u2502\r\n\u2502 \u256d\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500 locals \u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u256e \u2502\r\n\u2502 \u2502 column = 'user_id' \u2502 \u2502\r\n\u2502 \u2502 database = 'espial-new' \u2502 \u2502\r\n\u2502 \u2502 db = \u2502 \u2502\r\n\u2502 \u2502 fk = { \u2502 \u2502\r\n\u2502 \u2502 \u2502 'column': 'user_id', \u2502 \u2502\r\n\u2502 \u2502 \u2502 'other_table': 'user', \u2502 \u2502\r\n\u2502 \u2502 \u2502 'other_column': None \u2502 \u2502\r\n\u2502 \u2502 } \u2502 \u2502\r\n\u2502 \u2502 foreign_keys = [ \u2502 \u2502\r\n\u2502 \u2502 \u2502 { \u2502 \u2502\r\n\u2502 \u2502 \u2502 \u2502 'column': 'user_id', \u2502 \u2502\r\n\u2502 \u2502 \u2502 \u2502 'other_table': 'user', \u2502 \u2502\r\n\u2502 \u2502 \u2502 \u2502 'other_column': None \u2502 \u2502\r\n\u2502 \u2502 \u2502 } \u2502 \u2502\r\n\u2502 \u2502 ] \u2502 \u2502\r\n\u2502 \u2502 label_column = 'name' \u2502 \u2502\r\n\u2502 \u2502 labeled_fks = {} \u2502 \u2502\r\n\u2502 \u2502 self = \u2502 \u2502\r\n\u2502 \u2502 table = 'bookmark' \u2502 \u2502\r\n\u2502 \u2502 values = [] \u2502 \u2502\r\n\u2502 
\u2570\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u256f \u2502\r\n\u2502 \u2502\r\n\u2502 /home/jon/.local/lib/python3.10/site-packages/datasette/utils/__init__.py:346 \u2502\r\n\u2502 in escape_sqlite \u2502\r\n\u2502 \u2502\r\n\u2502 343 \u2502\r\n\u2502 344 \u2502\r\n\u2502 345 def escape_sqlite(s): \u2502\r\n\u2502 \u2771 346 \u2502 if _boring_keyword_re.match(s) and (s.lower() not in reserved_words) \u2502\r\n\u2502 347 \u2502 \u2502 return s \u2502\r\n\u2502 348 \u2502 else: \u2502\r\n\u2502 349 \u2502 \u2502 return f\"[{s}]\" \u2502\r\n\u2502 \u2502\r\n\u2502 \u256d\u2500 locals \u2500\u256e \u2502\r\n\u2502 \u2502 s = None \u2502 \u2502\r\n\u2502 \u2570\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u256f \u2502\r\n\u2570\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u256f\r\nTypeError: expected string or bytes-like object\r\nTraceback (most recent call last):\r\n File \"/home/jon/.local/lib/python3.10/site-packages/datasette/app.py\", line 1354, in route_path\r\n response = await view(request, send)\r\n File \"/home/jon/.local/lib/python3.10/site-packages/datasette/views/base.py\", line 134, in view\r\n return await self.dispatch_request(request)\r\n File \"/home/jon/.local/lib/python3.10/site-packages/datasette/views/base.py\", line 91, in dispatch_request\r\n return await handler(request)\r\n File \"/home/jon/.local/lib/python3.10/site-packages/datasette/views/base.py\", line 361, in get\r\n response_or_template_contexts = await self.data(request, **data_kwargs)\r\n File \"/home/jon/.local/lib/python3.10/site-packages/datasette/views/table.py\", line 158, in data\r\n return await self._data_traced(request, default_labels, _next, _size)\r\n File \"/home/jon/.local/lib/python3.10/site-packages/datasette/views/table.py\", line 603, in _data_traced\r\n await self.ds.expand_foreign_keys(\r\n File \"/home/jon/.local/lib/python3.10/site-packages/datasette/app.py\", line 814, in expand_foreign_keys\r\n other_column=escape_sqlite(fk[\"other_column\"]),\r\n File \"/home/jon/.local/lib/python3.10/site-packages/datasette/utils/__init__.py\", line 346, in escape_sqlite\r\n if _boring_keyword_re.match(s) and (s.lower() not in reserved_words):\r\nTypeError: expected string or bytes-like object\r\nINFO: 127.0.0.1:38574 - \"GET /espial-new/bookmark HTTP/1.1\" 500 Internal Server Error\r\nINFO: 127.0.0.1:38574 - \"GET /-/static/app.css?d59929 HTTP/1.1\" 200 OK\r\n```\r\n\r\nSchema:\r\n```\r\nCREATE TABLE IF NOT EXISTS \"user\"\r\n (\r\n \"id\" INTEGER PRIMARY KEY,\r\n \"name\" VARCHAR NOT NULL,\r\n \"password_hash\" VARCHAR NOT NULL,\r\n \"api_token\" VARCHAR NULL,\r\n \"private_default\" BOOLEAN NOT NULL,\r\n \"archive_default\" BOOLEAN NOT NULL,\r\n \"privacy_lock\" BOOLEAN NOT NULL,\r\n CONSTRAINT 
\"unique_user_name\" UNIQUE (\"name\")\r\n );\r\n\r\nCREATE TABLE IF NOT EXISTS \"bookmark\"\r\n (\r\n \"id\" INTEGER PRIMARY KEY,\r\n \"user_id\" INTEGER NOT NULL REFERENCES \"user\" ON DELETE RESTRICT ON UPDATE RESTRICT,\r\n \"slug\" VARCHAR NOT NULL DEFAULT (Lower(Hex(Randomblob(6)))),\r\n \"href\" VARCHAR NOT NULL,\r\n \"description\" VARCHAR NOT NULL,\r\n \"extended\" VARCHAR NOT NULL,\r\n \"time\" TIMESTAMP NOT NULL,\r\n \"shared\" BOOLEAN NOT NULL,\r\n \"to_read\" BOOLEAN NOT NULL,\r\n \"selected\" BOOLEAN NOT NULL,\r\n \"archive_href\" VARCHAR NULL,\r\n CONSTRAINT \"unique_user_href\" UNIQUE (\"user_id\", \"href\"),\r\n CONSTRAINT \"unique_user_slug\" UNIQUE (\"user_id\", \"slug\")\r\n );\r\n\r\nCREATE TABLE IF NOT EXISTS \"bookmark_tag\"\r\n (\r\n \"id\" INTEGER PRIMARY KEY,\r\n \"user_id\" INTEGER NOT NULL REFERENCES \"user\" ON DELETE RESTRICT ON UPDATE RESTRICT,\r\n \"tag\" VARCHAR NOT NULL,\r\n \"bookmark_id\" INTEGER NOT NULL REFERENCES \"bookmark\" ON DELETE RESTRICT ON UPDATE RESTRICT,\r\n \"seq\" INTEGER NOT NULL,\r\n CONSTRAINT \"unique_user_tag_bookmark_id\" UNIQUE (\"user_id\", \"tag\", \"bookmark_id\"),\r\n CONSTRAINT \"unique_user_bookmark_id_tag_seq\" UNIQUE (\"user_id\", \"bookmark_id\", \"tag\", \"seq\")\r\n );\r\n\r\nCREATE TABLE IF NOT EXISTS \"note\"\r\n (\r\n \"id\" INTEGER PRIMARY KEY,\r\n \"user_id\" INTEGER NOT NULL REFERENCES \"user\" ON DELETE RESTRICT ON UPDATE RESTRICT,\r\n \"slug\" VARCHAR NOT NULL DEFAULT (Lower(Hex(Randomblob(10)))),\r\n \"length\" INTEGER NOT NULL,\r\n \"title\" VARCHAR NOT NULL,\r\n \"text\" VARCHAR NOT NULL,\r\n \"is_markdown\" BOOLEAN NOT NULL,\r\n \"shared\" BOOLEAN NOT NULL DEFAULT false,\r\n \"created\" TIMESTAMP NOT NULL,\r\n \"updated\" TIMESTAMP NOT NULL\r\n );\r\nCREATE INDEX idx_bookmark_time ON bookmark (user_id, time DESC);\r\nCREATE INDEX idx_bookmark_tag_bookmark_id ON bookmark_tag (bookmark_id, id, tag, seq);\r\nCREATE INDEX idx_note_user_created ON note (user_id, created DESC); \r\n```\r\n", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1995/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1532000914, "node_id": "I_kwDOBm6k_c5bUHqS", "number": 1990, "title": "Suggestion: Highlight error messages ('These facets timed out')", "user": {"value": 116795, "label": "pax"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 0, "created_at": "2023-01-13T09:40:58Z", "updated_at": "2023-01-13T09:40:58Z", "closed_at": null, "author_association": "NONE", "pull_request": null, "body": "I had trouble figuring out why faceting didn't work in some instances, it took a while before I noticed the _These facets timed out_ notice. 
\r\n\r\nIt might help if that would be highlighted, or fading out highlight - if one might think it would be too visually disturbing.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1990/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1529707837, "node_id": "I_kwDOBm6k_c5bLX09", "number": 1988, "title": "Reconsider pattern where plugins could break existing template context", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": {"value": 3268330, "label": "Datasette 1.0"}, "comments": 4, "created_at": "2023-01-11T21:13:43Z", "updated_at": "2023-01-11T21:25:05Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "> I hadn't run into an issue with plugins like `datasette-template-sql` interfering with the existing context for other features before! Definitely not a good thing.\r\n\r\n_Originally posted by @simonw in https://github.com/simonw/datasette-write/issues/6#issuecomment-1379490596_\r\n ", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1988/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1509783085, "node_id": "I_kwDOBm6k_c5Z_XYt", "number": 1969, "title": "sql-formatter javascript is not now working with CloudFlare rocketloader", "user": {"value": 536941, "label": "fgregg"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 0, "created_at": "2022-12-23T21:14:06Z", "updated_at": "2023-01-10T01:56:33Z", "closed_at": null, "author_association": "CONTRIBUTOR", "pull_request": null, "body": "This is probably not a bug with datasette, but I thought you might want to know, @simonw.\r\n\r\nI noticed today that my CloudFlare proxied datasette instance lost the \"Format SQL\" option. 
I'm pretty sure it was there last week.\r\n\r\nIn the CloudFlare settings, if I turn off [Rocket Loader](https://developers.cloudflare.com/fundamentals/speed/rocket-loader/), I get the \"Format SQL\" option back.\r\n\r\nRocket Loader works by asynchronously loading the javascript, so maybe there was a recent change that doesn't play well with the asynch loading?\r\n\r\nI'm up to date with https://github.com/simonw/datasette/commit/e03aed00026cc2e59c09ca41f69a247e1a85cc89", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1969/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1525815985, "node_id": "I_kwDOBm6k_c5a8hqx", "number": 1983, "title": "Make CustomJSONEncoder a documented public API", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2023-01-09T15:27:05Z", "updated_at": "2023-01-09T15:35:58Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "It's used by `datasette-geojson` here: https://github.com/eyeseast/datasette-geojson/commit/902bf135a5a33a0dc8264673d00a59a67cb05152", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1983/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1426080014, "node_id": "I_kwDOBm6k_c5VAEEO", "number": 1867, "title": "/db/table/-/rename API (also allows atomic replace)", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": {"value": 8755003, "label": "Datasette 1.0a-next"}, "comments": 1, "created_at": "2022-10-27T18:13:23Z", "updated_at": "2023-01-09T15:34:12Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "> There's one catch with batched inserts: if your CLI tool fails half way through you could end up with a partially populated table - since a bunch of batches will have succeeded first.\r\n>\r\n> ...\r\n>\r\n> If people care about that kind of thing they could always push all of their inserts to a table called `_tablename` and then atomically rename that once they've uploaded all of the data (assuming I provide an atomic-rename-this-table mechanism).\r\n\r\n_Originally posted by @simonw in https://github.com/simonw/datasette/issues/1866#issuecomment-1293893789_\r\n ", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1867/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1082584499, "node_id": "I_kwDOBm6k_c5Ahu2z", "number": 1558, "title": "Redesign `facet_results` JSON structure prior to Datasette 1.0", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": {"value": 3268330, "label": "Datasette 1.0"}, 
"comments": 3, "created_at": "2021-12-16T19:45:10Z", "updated_at": "2023-01-09T15:31:17Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "> Decision: as an initial fix I'm going to de-duplicate those keys by using `tags__array` etc - with a `_2` on the end if that key is already used.\r\n>\r\n> I'll open a separate issue to redesign this better for Datasette 1.0.\r\n\r\n_Originally posted by @simonw in https://github.com/simonw/datasette/issues/625#issuecomment-996130862_", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1558/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1524983536, "node_id": "I_kwDOBm6k_c5a5Wbw", "number": 1981, "title": "Canned query field labels truncated", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2023-01-09T06:04:24Z", "updated_at": "2023-01-09T06:05:44Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "Eg here on mobile: https://timezones.datasette.io/timezones/by_point?longitude=-0.1406632&latitude=50.8246776\r\n\r\n![107A1894-D1DA-4158-9EA3-40C840DD10E3](https://user-images.githubusercontent.com/9599/211248895-c922ce61-95d3-47ca-9314-dcff7c86afab.jpeg)\r\n", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1981/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1524867951, "node_id": "I_kwDOBm6k_c5a46Nv", "number": 1980, "title": "\"Cannot sort table by id\" when sortable_columns is used", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 2, "created_at": "2023-01-09T03:21:33Z", "updated_at": "2023-01-09T03:23:53Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "I had an instance with this in `metadata.yml`:\r\n\r\n```yaml\r\ndatabases:\r\n timezones:\r\n tables:\r\n timezones:\r\n sortable_columns:\r\n - tzid\r\n```\r\nWhen I clicked on the \"Apply\" button here:\r\n\r\n\"image\"\r\n\r\nIt sent me to `/timezones/timezones?_sort=id&id__exact=133` with the error message:\r\n\r\n> 500: Cannot sort table by id", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1980/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1516815571, "node_id": "I_kwDOBm6k_c5aaMTT", "number": 1975, "title": "_col=id can cause id column to export twice in CSV export", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 0, "created_at": "2023-01-03T00:25:15Z", "updated_at": "2023-01-03T00:25:21Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, 
"body": "https://datasette.simonwillison.net/simonwillisonblog/blog_entry.csv?_col=id&_col=title&_col=body&_labels=on&_size=1\r\n\r\n```csv\r\nid,id,title,body\r\n1,1,WaSP Phase II,\"

The Web Standards project has launched Phase II.

\"\r\n```\r\nThat should not have two `id` columns.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1975/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1115435536, "node_id": "I_kwDOBm6k_c5CfDIQ", "number": 1614, "title": "Try again with SQLite codemirror support", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2022-01-26T20:05:20Z", "updated_at": "2022-12-23T21:27:10Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "I tried and failed to implement autocomplete a while ago. Relevant code:\r\n\r\nhttps://github.com/codemirror/legacy-modes/blob/8f36abca5f55024258cd23d9cfb0203d8d244f0d/mode/sql.js#L335\r\n\r\nSounds like upgrading to CodeMirror 6 ASAP would be worthwhile since it has better accessibility and touch screen support: https://codemirror.net/6/", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1614/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1504352503, "node_id": "I_kwDOBm6k_c5Zqpj3", "number": 1968, "title": "Allow to hide some queries in metadata.yml", "user": {"value": 562352, "label": "CharlesNepote"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 0, "created_at": "2022-12-20T10:45:41Z", "updated_at": "2022-12-20T10:45:41Z", "closed_at": null, "author_association": "NONE", "pull_request": null, "body": "By default all queries are displayed.\r\n\r\nBut there are many cases where it would be interesting to hide the queries by default:\r\n* the website is targeting non-tech people\r\n* the query is veeeeeery long ([eg.](https://mirabelle.openfoodfacts.org/products/energy_calculator))\r\n* reading the query is not important for the users, they only want to see the result\r\n\r\nOf course, the user still could have the option to see the query.\r\n\r\nIt could be an option in the metadata file:\r\n```yml\r\ndatabases:\r\n awesome_db:\r\n tables:\r\n products:\r\n hide_sql: true\r\n queries:\r\n great_query:\r\n hide_sql: true\r\n sql: select * from products where code = :barcode\r\n```\r\n\r\nThe priority could be:\r\n* no option in the metadata and nothing in the URL: query displayed\r\n* hide_sql in the metadata and nothing in the URL: query displayed as asked in the metadata\r\n* hide_sql in the metadata and &_hide_sql= in the URL: query as asked in the URL\r\n\r\nSee also: #1824\r\n", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1968/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1447050738, "node_id": "I_kwDOBm6k_c5WQD3y", "number": 1886, "title": "Call for birthday presents: if you're using Datasette, let us know how 
you're using it here", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 13, "created_at": "2022-11-13T19:25:51Z", "updated_at": "2022-12-18T17:34:20Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "Datasette is 5 years old today. To celebrate, I'm asking the community for birthday presents:\r\n\r\nhttps://simonwillison.net/2022/Nov/13/datasette-birthday/\r\n\r\n> To celebrate this open source project\u2019s birthday, I\u2019ve decided to try something new: I\u2019m going to ask for birthday presents.\r\n> \r\n> An aspect of Datastte\u2019s marketing that I\u2019ve so far neglected is social proof. I think it\u2019s time to change that: I know people are using the software to do cool things, but this often happens behind closed doors.\r\n> \r\n> For Datastte\u2019s birthday, I\u2019m looking for endorsements and case studies and just general demonstrations that show how people are using it do so cool stuff.\r\n> \r\n> So: if you\u2019ve used Datasette to solve a problem, and you\u2019re willing to publicize it, please give us the gift of your endorsement!\r\n> \r\n> [...]\r\n> \r\n> Add a comment to [this issue thread](https://github.com/simonw/datasette/issues/1886) describing what you\u2019re doing. Just a few sentences is fine\u2014though a screenshot or even a link to a live instance would be even better", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1886/reactions\", \"total_count\": 2, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 2, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1501778647, "node_id": "I_kwDOBm6k_c5Zg1LX", "number": 1964, "title": "Cog menu is not keyboard accessible (also no ARIA)", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2022-12-18T06:36:28Z", "updated_at": "2022-12-18T06:37:28Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "This menu here: https://latest.datasette.io/fixtures/attraction_characteristic\r\n\r\nYou can tab to it (see the outline) and hit space or enter to open it, but you can't then navigate the items in the open menu using the keyboard.\r\n\r\n![cog-menu](https://user-images.githubusercontent.com/9599/208284973-2a04cdab-ed95-4316-979c-67fe5f7787db.gif)\r\n", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1964/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1408757705, "node_id": "I_kwDOBm6k_c5T9-_J", "number": 1843, "title": "Intermittent \"Too many open files\" error running tests", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 16, "created_at": "2022-10-14T04:45:01Z", "updated_at": "2022-12-17T22:02:41Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "Partial stack trace from one of 
them:\r\n```\r\n/Users/simon/.local/share/virtualenvs/datasette-AWNrQs95/lib/python3.10/site-packages/jinja2/loaders.py:200: in get_source\r\n f = open_if_exists(filename)\r\n_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _\r\n\r\nfilename = '/Users/simon/Dropbox/Development/datasette/datasette/templates/error.html', mode = 'rb'\r\n\r\n def open_if_exists(filename: str, mode: str = \"rb\") -> t.Optional[t.IO]:\r\n \"\"\"Returns a file descriptor for the filename if that file exists,\r\n otherwise ``None``.\r\n \"\"\"\r\n if not os.path.isfile(filename):\r\n return None\r\n \r\n> return open(filename, mode)\r\nE OSError: [Errno 24] Too many open files: '/Users/simon/Dropbox/Development/datasette/datasette/templates/error.html'\r\n```", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1843/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "reopened"} {"id": 1500636982, "node_id": "I_kwDOBm6k_c5Zcec2", "number": 1962, "title": "Alternative, async-friendly pattern for `make_app_client()` and similar - fully retire `TestClient`", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2022-12-16T17:56:51Z", "updated_at": "2022-12-16T21:55:29Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "In this issue I replaced a whole bunch of places that used the non-async `app_client` fixture with an async `ds_client` fixture instead:\r\n- #1959\r\n\r\nBut I didn't get everything, and a lot of tests are still using the old `TestClient` mechanism as a result.\r\n\r\nThe main work here is replacing all of the `app_client_...` fixtures which use variants on the default client - and changing the tests that call `make_app_client()` to do something else instead.\r\n\r\nThis requires some careful thought. 
I need to come up with a really nice pattern for creating variants on the `ds_client` default fixture - and do so in a way that minimizes the number of open files, refs:\r\n\r\n- #1843", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1962/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1493471221, "node_id": "I_kwDOBm6k_c5ZBI_1", "number": 1949, "title": "`.json` errors should be returned as JSON", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": {"value": 8755003, "label": "Datasette 1.0a-next"}, "comments": 10, "created_at": "2022-12-13T06:14:12Z", "updated_at": "2022-12-15T00:46:27Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "Eg the error in this issue:\r\n- #1945 ", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1949/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1490576818, "node_id": "I_kwDOBm6k_c5Y2GWy", "number": 1943, "title": "`/-/permissions` should list available permissions", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": {"value": 8755003, "label": "Datasette 1.0a-next"}, "comments": 1, "created_at": "2022-12-11T23:38:03Z", "updated_at": "2022-12-15T00:41:37Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "> Idea: a `/-/permissions` introspection endpoint for listing registered permissions\r\n\r\n_Originally posted by @simonw in https://github.com/simonw/datasette/issues/1939#issuecomment-1345691103_\r\n ", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1943/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1497577017, "node_id": "I_kwDOBm6k_c5ZQzY5", "number": 1957, "title": "Reconsider row value truncation on query page", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2022-12-14T23:49:47Z", "updated_at": "2022-12-14T23:50:50Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "Consider this example: https://ripgrep.datasette.io/repos?sql=select+json_group_array%28full_name%29+from+repos\r\n\r\n```sql\r\nselect json_group_array(full_name) from repos\r\n```\r\n\r\n![CleanShot 2022-12-14 at 15 48 32@2x](https://user-images.githubusercontent.com/9599/207739709-8177f683-f938-49a1-8225-42791fad88fe.png)\r\n\r\nMy intention here was to get a string of JSON I can copy and paste elsewhere - see: https://til.simonwillison.net/sqlite/compare-before-after-json\r\n\r\nThe truncation isn't helping here.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", 
"active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1957/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 855296937, "node_id": "MDU6SXNzdWU4NTUyOTY5Mzc=", "number": 1295, "title": "Errors should have links to further information", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 2, "created_at": "2021-04-11T12:39:12Z", "updated_at": "2022-12-14T23:28:49Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "Inspired by this tweet:\r\nhttps://twitter.com/willmcgugan/status/1381186384510255104\r\n\r\n> While I am thinking about faqs. I\u2019d also like to add short URLs to Rich exceptions.\r\n>\r\n> I loath cryptic error messages, and I\u2019ve created a fair few myself. In Rich I\u2019ve tried to make them as plain English as possible. But...\r\n>\r\n> would be great if every error message linked to a page that explains the error in detail and offers fixes.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1295/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1495716243, "node_id": "I_kwDOBm6k_c5ZJtGT", "number": 1952, "title": "Improvements to /-/create-token restrictions interface", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": {"value": 8755003, "label": "Datasette 1.0a-next"}, "comments": 1, "created_at": "2022-12-14T05:22:39Z", "updated_at": "2022-12-14T05:23:13Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "> It would be neat not to show write permissions against immutable databases too - and not hard from a performance perspective since it doesn't involve hundreds more permission checks.\r\n>\r\n> That will need permissions to grow a flag for if they need a mutable database though, which is a bigger job.\r\n\r\n_Originally posted by @simonw in https://github.com/simonw/datasette/issues/1947#issuecomment-1350414402_\r\n\r\nAlso, DO show the `_memory` database there if Datasette was started in `--crossdb` mode.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1952/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1468689139, "node_id": "I_kwDOBm6k_c5Ximrz", "number": 1914, "title": "Finalize design of JSON for Datasette 1.0", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": {"value": 8755003, "label": "Datasette 1.0a-next"}, "comments": 1, "created_at": "2022-11-29T20:59:10Z", "updated_at": "2022-12-13T06:15:54Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "Tracking issue.\r\n\r\n- [ ] #1709\r\n- [ ] #1729\r\n- [ ] #1875", "repo": {"value": 107914493, "label": "datasette"}, "type": 
"issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1914/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1468495358, "node_id": "I_kwDOBm6k_c5Xh3X-", "number": 1910, "title": "Check incoming column types on various write APIs", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": {"value": 8755003, "label": "Datasette 1.0a-next"}, "comments": 0, "created_at": "2022-11-29T18:09:10Z", "updated_at": "2022-12-13T05:29:09Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "> I do think this needs type checking - I just tried and you really can send a string to an integer column and have it work, which feels bad.\r\n\r\n_Originally posted by @simonw in https://github.com/simonw/datasette/issues/1863#issuecomment-1331089156_\r\n ", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1910/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1428630253, "node_id": "I_kwDOBm6k_c5VJyrt", "number": 1873, "title": "Ensure insert API has good tests for rowid and compound primark key tables", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": {"value": 8755003, "label": "Datasette 1.0a-next"}, "comments": 11, "created_at": "2022-10-30T06:22:17Z", "updated_at": "2022-12-13T05:29:08Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "Following:\r\n- #1866\r\n\r\nI need to design and implement various edge-cases or primary keys:\r\n\r\n- Table without an auto-incrementing primary key\r\n- Table with compound primary keys\r\n- Table with just a `rowid`", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1873/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "reopened"} {"id": 1430797211, "node_id": "I_kwDOBm6k_c5VSDub", "number": 1875, "title": "Figure out design for JSON errors (consider RFC 7807)", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": {"value": 8755003, "label": "Datasette 1.0a-next"}, "comments": 7, "created_at": "2022-11-01T03:14:15Z", "updated_at": "2022-12-13T05:29:08Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "https://datatracker.ietf.org/doc/draft-ietf-httpapi-rfc7807bis/ is a brand new standard.\r\n\r\nSince I need a neat, predictable format for my JSON errors, maybe I should use this one?", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1875/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, 
\"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1447465004, "node_id": "I_kwDOBm6k_c5WRpAs", "number": 1889, "title": "Ability to create new tokens via the API", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": {"value": 8755003, "label": "Datasette 1.0a-next"}, "comments": 0, "created_at": "2022-11-14T06:21:36Z", "updated_at": "2022-12-13T05:29:08Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "Refs:\r\n- #1850\r\n\r\nInitially I decided that the API shouldn't be able to create new tokens at all - I don't like the idea of an API token holder creating themselves additional tokens.\r\n\r\nThen I realized that two of the API features are specifically more useful if you can generate fresh tokens via the API:\r\n\r\n- Tokes that expire after a time limit are MUCH more useful if they can be automatically generated\r\n- Likewise, tokens that are restricted to a subset of permissions (see #1855) make more sense to be generated like this, especially in conjunction with expiry times", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1889/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1216436131, "node_id": "I_kwDOBm6k_c5IgVej", "number": 1721, "title": "Implement plugin hooks: `register_table_extras`, `register_row_extras`, `register_query_extras`", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": {"value": 8755003, "label": "Datasette 1.0a-next"}, "comments": 0, "created_at": "2022-04-26T20:21:49Z", "updated_at": "2022-12-13T05:29:07Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "Designed in:\r\n- #1720\r\n\r\nPart of:\r\n- #262\r\n- #1709", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1721/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1219385669, "node_id": "I_kwDOBm6k_c5IrllF", "number": 1729, "title": "Implement ?_extra and new API design for TableView", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": {"value": 8755003, "label": "Datasette 1.0a-next"}, "comments": 12, "created_at": "2022-04-28T22:28:14Z", "updated_at": "2022-12-13T05:29:07Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "Part of:\r\n- #262\r\n- #1518", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1729/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1200649124, "node_id": "I_kwDOBm6k_c5HkHOk", "number": 1708, "title": "Datasette 1.0 alpha upcoming release notes", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 
0, "assignee": null, "milestone": {"value": 8755003, "label": "Datasette 1.0a-next"}, "comments": 2, "created_at": "2022-04-11T22:57:12Z", "updated_at": "2022-12-13T05:29:06Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "I'm going to try writing the release notes first, to see if that helps unblock me.\r\n\r\n# \u26a0\ufe0f Any release notes in this issue are a draft, and should not be treated as the real thing \u26a0\ufe0f ", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1708/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1200649502, "node_id": "I_kwDOBm6k_c5HkHUe", "number": 1709, "title": "Redesigned JSON API with ?_extra= parameters", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": {"value": 8755003, "label": "Datasette 1.0a-next"}, "comments": 1, "created_at": "2022-04-11T22:57:49Z", "updated_at": "2022-12-13T05:29:06Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "This will be the single biggest breaking change for the 1.0 release.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1709/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1200650491, "node_id": "I_kwDOBm6k_c5HkHj7", "number": 1711, "title": "Template context powered entirely by the JSON API format", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": {"value": 8755003, "label": "Datasette 1.0a-next"}, "comments": 1, "created_at": "2022-04-11T22:59:27Z", "updated_at": "2022-12-13T05:29:06Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "Datasette 1.0 will have a stable template context. 
I'm going to achieve this by refactoring the templates to work only with keys returned by the API (or some of its extras) - then the API documentation will double up as template documentation.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1711/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1197926598, "node_id": "I_kwDOBm6k_c5HZujG", "number": 1705, "title": "How to upgrade your plugin for 1.0 documentation", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": {"value": 8755003, "label": "Datasette 1.0a-next"}, "comments": 1, "created_at": "2022-04-08T23:16:47Z", "updated_at": "2022-12-13T05:29:05Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "Among other things, needed by:\r\n- #1704", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1705/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1493404423, "node_id": "I_kwDOBm6k_c5ZA4sH", "number": 1948, "title": "500 error on permission debug page when testing actors with _r", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2022-12-13T05:22:03Z", "updated_at": "2022-12-13T05:22:19Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "\"image\"\r\n\r\nThe 500 error is silent unless you are looking at the DevTools network pane.\r\n", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1948/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1384549993, "node_id": "I_kwDOBm6k_c5Sho5p", "number": 1818, "title": "Setting to turn off table row counts entirely", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 4, "created_at": "2022-09-24T06:39:22Z", "updated_at": "2022-12-11T02:03:09Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "There are situations - such as loading SQLite files remotely using HTTP range headers - where counting all of the rows in a table should be avoided entirely.\r\n\r\n> > Also, this chunked inefficiency means that I have to hack the URL to not load tables of a database as it seems to try to load the whole database when I click on a database.\r\n>\r\n> I bet that's because Datasette tries to show a count of all of the rows in each table when it shows the list on that page, which triggers a full table scan.\r\n>\r\n> Would be great to have a setting that turns that feature off, which could then be exposed as a query string option for Datasette Lite.\r\n\r\n_Originally posted by @simonw in 
https://github.com/simonw/datasette-lite/issues/49#issuecomment-1256880715_\r\n ", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1818/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1487738738, "node_id": "I_kwDOBm6k_c5YrRdy", "number": 1942, "title": "Option for plugins to request that JSON be served on the page", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": {"value": 3268330, "label": "Datasette 1.0"}, "comments": 1, "created_at": "2022-12-10T01:08:53Z", "updated_at": "2022-12-10T01:11:30Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "Idea came from a conversation with @hydrosquall - what if a Datasette plugin could say \"I'd like the JSON for a page to be included in a variable on the HTML page\"?\r\n\r\n`datasette-cluster-map` already needs this - the first thing it does when the page loads is `fetch()` a JSON representation of that same data.\r\n\r\nThis idea fits with my overall goals to unify the JSON and HTML context too.\r\n\r\nRefs:\r\n- #1711", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1942/reactions\", \"total_count\": 1, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 1, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1198822563, "node_id": "I_kwDOBm6k_c5HdJSj", "number": 1706, "title": "[feature] immutable mode for a directory, not just individual sqlite file", "user": {"value": 9020979, "label": "hydrosquall"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 4, "created_at": "2022-04-10T00:50:57Z", "updated_at": "2022-12-09T19:11:40Z", "closed_at": null, "author_association": "CONTRIBUTOR", "pull_request": null, "body": "## Motivation\r\n\r\n- I have a directory of sqlite databases\r\n- I'd like to use immutable mode when opening them for better performance [docs](https://docs.datasette.io/en/0.54/performance.html#immutable-mode)\r\n- Currently using this flag throws the following error\r\n\r\n IsADirectoryError: [Errno 21] Is a directory: '/name-of-directory'\r\n\r\n## Proposal\r\n\r\nImmutable flag works for both single files and directories\r\n\r\n datasette -i /folder-of-sqlite-files", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1706/reactions\", \"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1486036269, "node_id": "I_kwDOBm6k_c5Ykx0t", "number": 1941, "title": "Mechanism for supporting key rotation for DATASETTE_SECRET", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2022-12-09T05:24:53Z", "updated_at": "2022-12-09T05:25:20Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "Currently if you change 
`DATASETTE_SECRET` all existing signed tokens - both cookies and API tokens and potentially other things too - will instantly expire.\r\n\r\nAdding support for key rotation would allow keys to be rotated on a semi-regular basis without logging everyone out / invalidating every API token instantly.\r\n\r\nCan model this on how Django does it: https://github.com/django/django/commit/0dcd549bbe36c060f536ec270d34d9e7d4b8e6c7", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1941/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1483250004, "node_id": "I_kwDOBm6k_c5YaJlU", "number": 1936, "title": "Fix /db/table/-/upsert in the API explorer", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": {"value": 3268330, "label": "Datasette 1.0"}, "comments": 2, "created_at": "2022-12-08T00:59:34Z", "updated_at": "2022-12-08T01:36:02Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "Split from:\r\n- #1931\r\n- #1878\r\n\r\nThis is a bit tricky because the code needs to figure out what the primary keys are for an item, and whether or not `rowid` should be included.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1936/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1479920517, "node_id": "I_kwDOBm6k_c5YNcuF", "number": 1934, "title": "Return number of ignored/replaced items from /-/insert", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": {"value": 3268330, "label": "Datasette 1.0"}, "comments": 0, "created_at": "2022-12-06T19:01:58Z", "updated_at": "2022-12-06T19:02:03Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "Idea from here:\r\n- https://github.com/simonw/sqlite-utils/issues/516", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1934/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1175690070, "node_id": "I_kwDOBm6k_c5GE5tW", "number": 1676, "title": "Reconsider ensure_permissions() logic, can it be less confusing?", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": {"value": 3268330, "label": "Datasette 1.0"}, "comments": 3, "created_at": "2022-03-21T17:14:57Z", "updated_at": "2022-12-02T01:23:40Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "> Updated documentation: https://github.com/simonw/datasette/blob/e627510b760198ccedba9e5af47a771e847785c9/docs/internals.rst#await-ensure_permissionsactor-permissions\r\n>\r\n>> This method allows multiple permissions to be checked at onced. 
It raises a `datasette.Forbidden` exception if any of the checks are denied before one of them is explicitly granted.\r\n>> \r\n>> This is useful when you need to check multiple permissions at once. For example, an actor should be able to view a table if either one of the following checks returns `True` or not a single one of them returns `False`:\r\n>\r\n> That's pretty hard to understand! I'm going to open a separate issue to reconsider if this is a useful enough abstraction given how confusing it is.\r\n\r\n_Originally posted by @simonw in https://github.com/simonw/datasette/issues/1675#issuecomment-1074177827_", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1676/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1469821027, "node_id": "I_kwDOBm6k_c5Xm7Bj", "number": 1921, "title": "Document methods to get canned queries", "user": {"value": 25778, "label": "eyeseast"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 0, "created_at": "2022-11-30T15:26:33Z", "updated_at": "2022-11-30T23:34:21Z", "closed_at": null, "author_association": "CONTRIBUTOR", "pull_request": null, "body": "Two methods will get canned queries for a Datasette instance:\r\n\r\n[`Datasette.get_canned_queries`](https://github.com/simonw/datasette/blob/main/datasette/app.py#L575) will return all canned queries for a database that an `actor` can see.\r\n\r\n[`Datasette.get_canned_query`](https://github.com/simonw/datasette/blob/main/datasette/app.py#L592) will return a single canned query by name.\r\n\r\n", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1921/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1469796454, "node_id": "I_kwDOBm6k_c5Xm1Bm", "number": 1920, "title": "Document Datasette.metadata() method", "user": {"value": 25778, "label": "eyeseast"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 0, "created_at": "2022-11-30T15:10:36Z", "updated_at": "2022-11-30T15:10:36Z", "closed_at": null, "author_association": "CONTRIBUTOR", "pull_request": null, "body": "Code is here: https://github.com/simonw/datasette/blob/main/datasette/app.py#L503\r\n\r\nThis will be the official way to access metadata from plugins.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1920/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1108671952, "node_id": "I_kwDOBm6k_c5CFP3Q", "number": 1605, "title": "Scripted exports", "user": {"value": 25778, "label": "eyeseast"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 10, "created_at": "2022-01-19T23:45:55Z", "updated_at": "2022-11-30T15:06:38Z", "closed_at": null, "author_association": "CONTRIBUTOR", 
"pull_request": null, "body": "Posting this while I'm thinking about it: I mentioned at the end of [this thread](https://twitter.com/eyeseast/status/1483893011658551299) that I'm usually doing `datasette --get` to export canned queries.\r\n\r\nI used to use a tool called [datafreeze](https://github.com/pudo/datafreeze) to do scripted exports, but that project looks dead now. The ergonomics of it are pretty nice, though, and the `Freezefile.yml` structure is actually not too far from Datasette's canned queries.\r\n\r\nThis is related to the idea for `datasette query` (#1356) but I think it's a distinct feature. It's most likely a plugin, but I want to raise it here because it's probably something other people have thought about.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1605/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1469062686, "node_id": "I_kwDOBm6k_c5XkB4e", "number": 1919, "title": "Intermittent `test_delete_row` test failure ", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2022-11-30T05:18:46Z", "updated_at": "2022-11-30T05:20:56Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "https://github.com/simonw/datasette/actions/runs/3580503393/jobs/6022689591\r\n\r\n```\r\n delete_response = await ds_write.client.post(\r\n \"/data/{}/{}/-/delete\".format(table, delete_path),\r\n headers={\r\n \"Authorization\": \"***\".format(write_token(ds_write)),\r\n },\r\n )\r\n> assert delete_response.status_code == 200\r\nE assert 404 == 200\r\nE + where 404 = .status_code\r\n\r\n/home/runner/work/datasette/datasette/tests/test_api_write.py:396: AssertionError\r\n=========================== short test summary info ============================\r\nFAILED tests/test_api_write.py::test_delete_row[compound_pk_table-row_for_create2-pks2-article,k] - assert 404 == 200\r\n + where 404 = .status_code\r\n```\r\nThis passes most of the time, but very occasionally fails - in this case in Python 3.7\r\n\r\nIt seems to only fail for the `article,k` compound primary key test.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1919/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1466952626, "node_id": "I_kwDOBm6k_c5Xb-uy", "number": 1909, "title": "Option to sort facets alphabetically", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2022-11-28T19:18:14Z", "updated_at": "2022-11-28T19:19:26Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "Suggested here:\r\n- https://github.com/simonw/datasette/discussions/1908", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1909/reactions\", \"total_count\": 0, 
\"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1439009231, "node_id": "I_kwDOBm6k_c5VxYnP", "number": 1884, "title": "Exclude virtual tables from datasette inspect", "user": {"value": 25778, "label": "eyeseast"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 6, "created_at": "2022-11-07T21:26:01Z", "updated_at": "2022-11-21T04:40:56Z", "closed_at": null, "author_association": "CONTRIBUTOR", "pull_request": null, "body": "Ran `inspect` on a spatialite database and got these warnings:\r\n\r\n```\r\nERROR: conn=, sql = 'select count(*) from [SpatialIndex]', params = None: no such module: VirtualSpatialIndex\r\nERROR: conn=, sql = 'select count(*) from [ElementaryGeometries]', params = None: no such module: VirtualElementary\r\nERROR: conn=, sql = 'select count(*) from [KNN]', params = None: no such module: VirtualKNN\r\n```\r\n\r\nIt still worked, but probably want to catch this.\r\n", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1884/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1455928469, "node_id": "I_kwDOBm6k_c5Wx7SV", "number": 1903, "title": "Refactor all error classes into a datasette.exceptions module", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": {"value": 3268330, "label": "Datasette 1.0"}, "comments": 2, "created_at": "2022-11-18T22:44:45Z", "updated_at": "2022-11-20T22:35:01Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "While working on this issue:\r\n- #1896\r\n\r\nI realized that Datasette has error classes scattered around a fair bit, including some in the `datasette.utils.asgi` module for some reason.\r\n\r\nI should clean these up.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1903/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1456013930, "node_id": "I_kwDOBm6k_c5WyQJq", "number": 1906, "title": "Extract publish Heroku support to a plugin", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": {"value": 3268330, "label": "Datasette 1.0"}, "comments": 0, "created_at": "2022-11-19T00:02:51Z", "updated_at": "2022-11-19T00:03:10Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "> This is a strong argument for extracting the Heroku support out to a plugin - it would allow this to be fixed with a plugin release without needing to push a full release of Datasette itself.\r\n\r\n_Originally posted by @simonw in https://github.com/simonw/datasette/issues/1905#issuecomment-1320678715_\r\n ", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1906/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, 
\"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1452572348, "node_id": "I_kwDOBm6k_c5WlH68", "number": 1900, "title": "datasette package --spatialite throws error during build", "user": {"value": 419145, "label": "rdmurphy"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 11, "created_at": "2022-11-17T02:03:28Z", "updated_at": "2022-11-18T08:00:38Z", "closed_at": null, "author_association": "NONE", "pull_request": null, "body": "Hello! Attempting to use `datasette package` to bundle up a SpatiaLite DB and I'm getting this error during the `docker build`:\r\n\r\n```\r\nsqlite3.OperationalError: /usr/lib/x86_64-linux-gnu/mod_spatialite.so.so: cannot open shared object file: No such file or directory\r\n```\r\n\r\nSeems to be throwing when this step is ran:\r\n\r\n```\r\nERROR [6/6] RUN datasette inspect results.db --inspect-file inspect-data.json\r\n```\r\n\r\nThis is with `v0.63.1`.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1900/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1454532488, "node_id": "I_kwDOBm6k_c5WsmeI", "number": 1902, "title": "Document {% block crumbs %} for plugin authors", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": {"value": 3268330, "label": "Datasette 1.0"}, "comments": 0, "created_at": "2022-11-18T06:16:30Z", "updated_at": "2022-11-18T06:16:39Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "> For `datasette-copyable` I want to show breadcrumbs that take database/instance permissions into account, so I'm removing `{% block nav %}` entirely and replacing it with this:\r\n>\r\n> ```html+jinja\r\n> {% block crumbs %}\r\n> {{ crumbs.nav(request=request, database=database, table=table) }}\r\n> {% endblock %}\r\n> ```\r\n\r\n_Originally posted by @simonw in https://github.com/simonw/datasette/issues/1901#issuecomment-1319588163_\r\n\r\nI should document this.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1902/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1452360613, "node_id": "I_kwDOBm6k_c5WkUOl", "number": 1895, "title": "Avoid using host name when building absolute URLs?", "user": {"value": 14294, "label": "hubgit"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 0, "created_at": "2022-11-16T22:21:27Z", "updated_at": "2022-11-16T22:21:27Z", "closed_at": null, "author_association": "NONE", "pull_request": null, "body": "When deploying Datasette to Cloud Run and rewriting certain routes from a Firebase app to the Cloud Run service, some of the URLs in the page start with `https://[service].run.app` rather than the (custom) domain of the Firebase app. 
\r\n\r\nI guess this is because a) the custom domain of the Firebase app isn't being passed through in the `host` header of the request to the Cloud Run instance and b) the `absolute_url` function in Datasette is using information from the request to build the URL.\r\n\r\nWould it be possible to not use the host name when building the absolute URLs, i.e. only include the path in the URL?", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1895/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1433576351, "node_id": "I_kwDOBm6k_c5VcqOf", "number": 1880, "title": "Datasette with many and large databases > Memory use", "user": {"value": 525934, "label": "amitkoth"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 4, "created_at": "2022-11-02T18:10:27Z", "updated_at": "2022-11-16T17:50:29Z", "closed_at": null, "author_association": "NONE", "pull_request": null, "body": "> Datasette maintains an in-memory SQLite database with details of the databases, tables and columns for all of the attached databases.\r\n\r\nThe above is from the docs ^. There are two problems here - the number of datasette \"instances\" in a single server/VM and the size of the database itself. We want the **opposite** of in-memory, including what happens on SQLite - documented in https://www.sqlite.org/inmemorydb.html\r\n\r\nFrom the context in https://github.com/simonw/datasette/issues/1150 - does it mean datasette is memory-bound to the size of the dataset - which might be a deal-breaker for many large-scale use cases?\r\n\r\nIn an extreme case - let's say a single server had 100 SQLite databases, which would enable 100 \"instances\" of datasette to run, one per client (e.g. in a SaaS multi-tenant environment). How could we achieve all these goals:\r\n\r\n1. Allow any _one_ of these 100 databases to grow to say 2TB in size\r\n2. Have one datasette instance, which connects to 1 of the 100 instances, based on incoming credentials/tenant ID\r\n3. Minimize memory use entirely - both by datasette and SQLite, such that almost all operations are executed in real-time on-disk with little to no memory consumption per-tenant, or per-database.\r\n\r\nAny ideas appreciated - we're looking to use this in a SaaS type of setting - many instances, single server.\r\n\r\n@simonw great work on datasette, in general! 
Possibly related to https://github.com/simonw/datasette/issues/1480 but we don't want to use any kind of serverless infra - this is a long-running VM/server.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1880/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1446657889, "node_id": "I_kwDOBm6k_c5WOj9h", "number": 1885, "title": "Integrate inside GUI app (tkinter)", "user": {"value": 5115787, "label": "dmalves"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 0, "created_at": "2022-11-13T00:10:43Z", "updated_at": "2022-11-13T00:11:09Z", "closed_at": null, "author_association": "NONE", "pull_request": null, "body": "Hi, I'd like to integrate datasette inside a tkinter app. The app should be able to start/stop the datasette server. How could I integrate datasette inside my app, so it can start and stop the datasette server?", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1885/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 802513359, "node_id": "MDU6SXNzdWU4MDI1MTMzNTk=", "number": 1217, "title": "Possible to deploy as a python app (for Rstudio connect server)?", "user": {"value": 6165713, "label": "plpxsk"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 4, "created_at": "2021-02-05T22:21:24Z", "updated_at": "2022-11-04T11:37:52Z", "closed_at": null, "author_association": "NONE", "pull_request": null, "body": "Is it possible to deploy a `datasette` application as a python\u00a0web app?\r\n\r\nIn my enterprise, I have the option to deploy python apps via [Rstudio Connect](https://github.com/rstudio/rsconnect-python), and I would like to publish a `datasette` dashboard for sharing.\r\n\r\nI welcome any pointers to converting `datasette serve` into a python app that can be run as something like `python datasette.py --my_data.db`", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1217/reactions\", \"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1432037325, "node_id": "I_kwDOBm6k_c5VWyfN", "number": 1879, "title": "Make it easier to fix URL proxy problems", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 5, "created_at": "2022-11-01T20:19:23Z", "updated_at": "2022-11-01T20:33:52Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "This came up on Discord again today: figuring out how to run Datasette behind a proxy that might hide the incoming Host: header (and strip HTTPS) is really hard!\r\n\r\nhttps://discord.com/channels/823971286308356157/823971286941302908/1037012475322847263", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", 
"active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1879/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1424980545, "node_id": "I_kwDOBm6k_c5U73pB", "number": 1861, "title": "request.headers.get(\"Content-Type\") fails", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 0, "created_at": "2022-10-27T03:39:12Z", "updated_at": "2022-10-27T03:39:12Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "Turns out this is case-sensitive, needs to be:\r\n\r\n request.headers.get(\"content-type\") != \"application/json\"\r\n\r\nThat's not great usability. It should be case insensitive.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1861/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1410305897, "node_id": "I_kwDOBm6k_c5UD49p", "number": 1845, "title": "Reconsider the Datasette first-run experience", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2022-10-15T22:21:31Z", "updated_at": "2022-10-16T08:54:53Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "Had a really interesting conversation today about how hard it is to get from \"I installed Datasette\" to \"I've done something useful with it\": https://news.ycombinator.com/item?id=33216789#33218590\r\n\r\nSpending some time focusing on that first-run experience feels very worthwhile.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1845/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1406860394, "node_id": "I_kwDOBm6k_c5T2vxq", "number": 1841, "title": "Drop format_bytes for Jinja filesizeformat filter", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 0, "created_at": "2022-10-12T22:06:34Z", "updated_at": "2022-10-12T22:06:34Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "Turns out this isn't necessary:\r\n\r\nhttps://github.com/simonw/datasette/blob/5aa359b86907d11b3ee601510775a85a90224da8/datasette/utils/__init__.py#L849-L858\r\n\r\nI can use this instead: https://jinja.palletsprojects.com/en/3.1.x/templates/#jinja-filters.filesizeformat", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1841/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 912864936, "node_id": "MDU6SXNzdWU5MTI4NjQ5MzY=", 
"number": 1362, "title": "Consider using CSP to protect against future XSS", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 17, "created_at": "2021-06-06T15:32:20Z", "updated_at": "2022-10-08T18:42:09Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "The XSS in #1360 would have been a lot less damaging if Datasette used CSP to protect against such vulnerabilities: https://developer.mozilla.org/en-US/docs/Web/HTTP/CSP", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1362/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1400374908, "node_id": "I_kwDOBm6k_c5TeAZ8", "number": 1836, "title": "docker image is duplicating db files somehow", "user": {"value": 536941, "label": "fgregg"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 13, "created_at": "2022-10-06T22:35:54Z", "updated_at": "2022-10-08T16:56:51Z", "closed_at": null, "author_association": "CONTRIBUTOR", "pull_request": null, "body": "if you look into the docker image created by docker publish, the `datasette inspect` line is duplicating the db files.\r\n\r\nhere's the result of the inspect command:\r\n\r\n\"Screen\r\n\r\n", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1836/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1015646369, "node_id": "I_kwDOBm6k_c48iYih", "number": 1480, "title": "Exceeding Cloud Run memory limits when deploying a 4.8G database", "user": {"value": 110420, "label": "ghing"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 9, "created_at": "2021-10-04T21:20:24Z", "updated_at": "2022-10-07T04:39:10Z", "closed_at": null, "author_association": "CONTRIBUTOR", "pull_request": null, "body": "When I try to deploy a 4.8G SQLite database to Google Cloud Run, I get this error message:\r\n\r\n> Memory limit of 8192M exceeded with 8826M used. 
Consider increasing the memory limit, see https://cloud.google.com/run/docs/configuring/memory-limits\r\n\r\nUnfortunately, the maximum amount of memory that can be allocated to an instance is 8192M.\r\n\r\nNaively profiling the memory usage of running Datasette with this database locally on my MacBook shows the following memory usage (using Activity Monitor) when I just start up Datasette locally:\r\n\r\n- Real Memory Size: 70.6 MB\r\n- Virtual Memory Size: 4.51 GB\r\n- Shared Memory Size: 2.5 MB\r\n- Private Memory Size: 57.4 MB\r\n\r\nI'm trying to understand if there's a query or other operation that gets run during container deployment that causes memory use to be so large and if this can be avoided somehow.\r\n\r\nThis is somewhat related to #1082, but on a different platform, so I decided to open a new issue.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1480/reactions\", \"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 860722711, "node_id": "MDU6SXNzdWU4NjA3MjI3MTE=", "number": 1301, "title": "Publishing to cloudrun with immutable mode?", "user": {"value": 5413548, "label": "louispotok"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2021-04-18T17:51:46Z", "updated_at": "2022-10-07T02:38:04Z", "closed_at": null, "author_association": "CONTRIBUTOR", "pull_request": null, "body": "I'm a bit confused about immutable mode and publishing to cloudrun. (I want to publish with immutable mode so that I can support database downloads.)\r\n\r\nRunning `datasette publish cloudrun --extra-options=\"-i example.db\"` leads to an error:\r\n> Error: Invalid value for '-i' / '--immutable': Path 'example.db' does not exist. \r\n\r\nHowever, running `datasette publish cloudrun example.db` not only works but seems to publish in immutable mode anyway! 
I'm seeing this both with `/-/databases.json` and the fact that downloads are working.\r\n\r\nWhen I just `datasette serve` locally, this succeeds both ways and works as expected.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1301/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1399933513, "node_id": "I_kwDOBm6k_c5TcUpJ", "number": 1833, "title": "Ability to submit long queries by POST", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 0, "created_at": "2022-10-06T16:03:26Z", "updated_at": "2022-10-06T16:18:00Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "Datasette doesn't limit URL lengths but some common web proxies do - the one in front of Google Cloud Run for example limits to 8KB total for incoming request headers: https://cloud.google.com/load-balancing/docs/quotas#https-lb-header-limits\r\n\r\nThis means longer SQL queries can break!\r\n\r\nNeed an optional mechanism for submitting queries by POST instead.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1833/reactions\", \"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1387712501, "node_id": "I_kwDOBm6k_c5Sts_1", "number": 1824, "title": "Convert &_hide_sql=1 to #_hide_sql", "user": {"value": 562352, "label": "CharlesNepote"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2022-09-27T12:53:31Z", "updated_at": "2022-10-05T12:56:27Z", "closed_at": null, "author_association": "NONE", "pull_request": null, "body": "Hiding the SQL textarea with `&_hide_sql=1` enforces a page reload, which can take several seconds and use server resource (which is annoying for big database or complex queries).\r\n\r\nIt could probably be done with a few lines of Javascript (I'm going to see if I can do that).", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1824/reactions\", \"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1386917344, "node_id": "PR_kwDOBm6k_c4_prjN", "number": 1823, "title": "Keyword-only arguments for a bunch of internal methods", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2022-09-27T00:44:59Z", "updated_at": "2022-10-05T04:37:54Z", "closed_at": null, "author_association": "OWNER", "pull_request": "simonw/datasette/pulls/1823", "body": "Refs #1822\r\n\r\n\r\n----\n:books: Documentation preview :books:: https://datasette--1823.org.readthedocs.build/en/1823/\n\r\n", "repo": {"value": 107914493, "label": "datasette"}, "type": "pull", "active_lock_reason": null, "performed_via_github_app": null, "reactions": 
"{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1823/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": 0, "state_reason": null} {"id": 1396977994, "node_id": "I_kwDOBm6k_c5TRDFK", "number": 1830, "title": "Add documentation for writing tests with signed actor cookies", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 0, "created_at": "2022-10-04T23:51:26Z", "updated_at": "2022-10-04T23:51:26Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "I use this pattirn in a lot of plugin tests, e.g. https://github.com/simonw/datasette-edit-templates/blob/087f6a6cabc20020f2b0524f11aa3a7836320848/tests/test_edit_templates.py#L55-L58\r\n```python\r\n actor = ds.sign({\"a\": {\"id\": \"root\"}}, \"actor\")\r\n response1 = await ds.client.get(\r\n \"/-/edit-templates/_footer.html\", cookies={\"ds_actor\": actor}\r\n )\r\n```\r\nI should add this to the documentation on this page: https://docs.datasette.io/en/latest/testing_plugins.html", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1830/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1363552780, "node_id": "I_kwDOBm6k_c5RRioM", "number": 1805, "title": "truncate_cells_html does not work for links?", "user": {"value": 562352, "label": "CharlesNepote"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 7, "created_at": "2022-09-06T16:41:29Z", "updated_at": "2022-10-03T09:18:06Z", "closed_at": null, "author_association": "NONE", "pull_request": null, "body": "We have many links inside our dataset (please don't blame us ;-).\r\n\r\nWhen I use `--settings truncate_cells_html 60` it is not working for the links.\r\n\r\nEg. https://images.openfoodfacts.org/images/products/000/000/000/088/nutrition_fr.5.200.jpg (87 chars) is not truncated:\r\n![image](https://user-images.githubusercontent.com/562352/188689045-1946d776-2305-47cf-bfc5-b5685b9206b7.png)\r\n\r\nIMHO It would make sense that links should be treated as HTML. 
The link should work of course, but Datasette could truncate it:\r\n[https://images.openfoodfacts.org/images/products/00[...].jpg](https://images.openfoodfacts.org/images/products/000/000/000/088/nutrition_fr.5.200.jpg)\r\n\r\n", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1805/reactions\", \"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "reopened"} {"id": 447469253, "node_id": "MDU6SXNzdWU0NDc0NjkyNTM=", "number": 485, "title": "Improvements to table label detection ", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": {"value": 9599, "label": "simonw"}, "milestone": null, "comments": 10, "created_at": "2019-05-23T06:19:49Z", "updated_at": "2022-10-03T00:04:42Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "Label detection doesn't work if the primary key is called pk rather than id, so this page doesn't work: https://latest.datasette.io/fixtures/roadside_attraction_characteristics\r\n\r\nCode is here: \r\n\r\nhttps://github.com/simonw/datasette/blob/cccea85be6aaaeadb31f3b588ec7f732628815f5/datasette/app.py#L644-L653", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/485/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null}