{"id": 268469569, "node_id": "MDU6SXNzdWUyNjg0Njk1Njk=", "number": 39, "title": "Protect against malicious SQL that causes damage even though our DB is immutable", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": {"value": 2857392, "label": "Ship first public release"}, "comments": 4, "created_at": "2017-10-25T16:44:27Z", "updated_at": "2021-08-17T23:52:07Z", "closed_at": "2017-11-05T02:53:47Z", "author_association": "OWNER", "pull_request": null, "body": "I\u2019m currently operating under the assumption that it\u2019s safe to allow arbitrary SQL statements because we are dealing with an immutable database. But this might not be the case - there are some pretty weird SQLite language extensions (ATTACH, PRAGMA etc) and I\u2019m not certain they cannot be used to break things in a way that would affect future requests to the API.\r\n\r\nSolution: provide a \u201csafe mode\u201d option which disables the ?sql= mechanism. This still leaves the URL filter lookups, so I need to make sure that those are \u201csafe\u201d.\r\n\r\nIn the future I may also implement a whitelist option where datasets can be configured to only allow specific filters against specific columns.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/39/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 970320615, "node_id": "MDU6SXNzdWU5NzAzMjA2MTU=", "number": 316, "title": "Fix visible backticks on reference page", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2021-08-13T11:37:46Z", "updated_at": "2021-08-14T05:12:23Z", "closed_at": "2021-08-14T05:10:48Z", "author_association": "OWNER", "pull_request": null, "body": "https://sqlite-utils.datasette.io/en/latest/reference.html\r\n\r\nSearch for backtick to reveal various minor markup bugs.", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/316/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 969840302, "node_id": "MDU6SXNzdWU5Njk4NDAzMDI=", "number": 1431, "title": "`--help-config` should be called `--help-settings`", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2021-08-13T00:46:48Z", "updated_at": "2021-08-13T01:01:58Z", "closed_at": "2021-08-13T01:01:58Z", "author_association": "OWNER", "pull_request": null, "body": "Follow-on from #1105 rebranding exercise.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1431/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 969758038, "node_id": 
"MDExOlB1bGxSZXF1ZXN0NzExNzgzNjE2", "number": 1430, "title": "Column metadata", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2021-08-12T23:34:39Z", "updated_at": "2021-08-12T23:53:23Z", "closed_at": "2021-08-12T23:53:23Z", "author_association": "OWNER", "pull_request": "simonw/datasette/pulls/1430", "body": "Refs #942\r\n\r\nStill needs:\r\n\r\n- [x] Tests\r\n- [x] Documentation", "repo": {"value": 107914493, "label": "datasette"}, "type": "pull", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1430/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": 0, "state_reason": null} {"id": 965102534, "node_id": "MDU6SXNzdWU5NjUxMDI1MzQ=", "number": 311, "title": "Add reference documentation generated from docstrings", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 4, "created_at": "2021-08-10T16:04:00Z", "updated_at": "2021-08-11T12:03:50Z", "closed_at": "2021-08-11T12:03:50Z", "author_association": "OWNER", "pull_request": null, "body": "Using https://www.sphinx-doc.org/en/master/usage/extensions/autodoc.html\r\n\r\nI'm not a big fan of this kind of documentation because it so often comes in place of narrative documentation - but the library has great narrative documentation now, so the reference documentation can link to it in places.\r\n\r\nThis will also encourage me to add good docstrings everywhere, useful for IDEs and suchlike.", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/311/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 965143346, "node_id": "MDExOlB1bGxSZXF1ZXN0NzA3NDkwNzg5", "number": 312, "title": "Add reference page to documentation using Sphinx autodoc", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 10, "created_at": "2021-08-10T16:59:17Z", "updated_at": "2021-08-10T23:09:32Z", "closed_at": "2021-08-10T23:09:28Z", "author_association": "OWNER", "pull_request": "simonw/sqlite-utils/pulls/312", "body": "Refs #311.", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "pull", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/312/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": 0, "state_reason": null} {"id": 965440017, "node_id": "MDU6SXNzdWU5NjU0NDAwMTc=", "number": 315, "title": "`.delete_where()` returns `[]` when it should return self", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2021-08-10T21:54:55Z", "updated_at": "2021-08-10T23:09:29Z", "closed_at": "2021-08-10T23:09:29Z", "author_association": "OWNER", "pull_request": null, "body": "If the table doesn't exist it should still return `self`, not 
`[]`:\r\n\r\nhttps://github.com/simonw/sqlite-utils/blob/ee469e3122d6f5973ec2584c1580d930daca2e7c/sqlite_utils/db.py#L1676-L1683\r\n\r\nSpotted with `mypy` while working on #312.", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/315/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 965166058, "node_id": "MDU6SXNzdWU5NjUxNjYwNTg=", "number": 313, "title": "`.add_foreign_keys()` doesn't reject being called with a View", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 0, "created_at": "2021-08-10T17:22:17Z", "updated_at": "2021-08-10T17:25:34Z", "closed_at": "2021-08-10T17:25:34Z", "author_association": "OWNER", "pull_request": null, "body": "Spotted this bug using `mypy` while working on #311 / #312!\r\n\r\n```\r\n% mypy sqlite_utils\r\nsqlite_utils/db.py:725: error: Item \"View\" of \"Union[Table, View]\" has no attribute \"foreign_keys\"\r\nFound 1 error in 1 file (checked 5 source files)\r\n```\r\nRefers to this code: https://github.com/simonw/sqlite-utils/blob/c11ff89894727270d4a9eb554d3a006f5b0d8d9d/sqlite_utils/db.py#L710-L720\r\n\r\nIt's a bug! We run some checks earlier but none of them ensure that it's a view:\r\n\r\nhttps://github.com/simonw/sqlite-utils/blob/c11ff89894727270d4a9eb554d3a006f5b0d8d9d/sqlite_utils/db.py#L697-L709", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/313/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 963528457, "node_id": "MDU6SXNzdWU5NjM1Mjg0NTc=", "number": 1425, "title": "render_cell() hook should support returning an awaitable", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 11, "created_at": "2021-08-08T22:32:29Z", "updated_at": "2021-08-09T07:14:35Z", "closed_at": "2021-08-09T03:00:37Z", "author_association": "OWNER", "pull_request": null, "body": "Many of the plugin hooks can return an awaitable - e.g. 
https://docs.datasette.io/en/stable/plugin_hooks.html#plugin-hook-extra-template-vars - but `render_cell()` doesn't support this.\r\n\r\nI recently found myself wanting to execute an additional SQL query from that hook, but it wasn't possible to do that since I couldn't use `await`.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1425/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 961367843, "node_id": "MDU6SXNzdWU5NjEzNjc4NDM=", "number": 1422, "title": "Ability to default to hiding the SQL for a canned query", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 4, "created_at": "2021-08-05T02:51:39Z", "updated_at": "2021-08-07T05:32:29Z", "closed_at": "2021-08-07T05:32:29Z", "author_association": "OWNER", "pull_request": null, "body": "I'm working on a project with some HUGE (400+ lines of SQL) canned queries right now.\r\n\r\nAny time you land on the canned query page you have to scroll down a long distance to get to the results!\r\n\r\nWould be useful to be able to default to https://latest.datasette.io/fixtures/magic_parameters?_hide_sql=1 without needing the parameter.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1422/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 959898166, "node_id": "MDU6SXNzdWU5NTk4OTgxNjY=", "number": 1420, "title": "`datasette publish cloudrun --cpu X` option", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 5, "created_at": "2021-08-04T05:04:31Z", "updated_at": "2021-08-05T00:54:59Z", "closed_at": "2021-08-04T05:33:48Z", "author_association": "OWNER", "pull_request": null, "body": "For setting the number of vCPUs - current valid values are 1, 2 or 4: https://cloud.google.com/run/docs/configuring/cpu\r\n\r\nPass that through to `gcloud run deploy --image IMAGE_URL --cpu CPU`", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1420/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 959305209, "node_id": "MDU6SXNzdWU5NTkzMDUyMDk=", "number": 307, "title": "codespell to spell check documentation", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 0, "created_at": "2021-08-03T16:48:19Z", "updated_at": "2021-08-03T16:48:53Z", "closed_at": "2021-08-03T16:48:53Z", "author_association": "OWNER", "pull_request": null, "body": "As seen in https://github.com/simonw/datasette/issues/1417 and https://til.simonwillison.net/python/codespell", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": 
null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/307/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 959278472, "node_id": "MDU6SXNzdWU5NTkyNzg0NzI=", "number": 1417, "title": "Use codespell in CI to spot spelling errors", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2021-08-03T16:14:15Z", "updated_at": "2021-08-03T16:36:40Z", "closed_at": "2021-08-03T16:36:40Z", "author_association": "OWNER", "pull_request": null, "body": "I noticed Rich is using this: https://github.com/willmcgugan/rich/commit/9c12a4537499797c43725fff5276ef0da62423ef#diff-ce84a1b2c9eb4ab3ea22f610cad7111cb9a2f66365c3b24679901376a2a73ab2\r\n\r\nRan it against the Datasette docs and found a bunch of obvious fixes, surprisingly with no false positives.\r\n\r\n```\r\ndatasette % codespell docs/*.rst\r\ndocs/authentication.rst:63: perfom ==> perform\r\ndocs/authentication.rst:76: perfom ==> perform\r\ndocs/changelog.rst:429: repsonse ==> response\r\ndocs/changelog.rst:503: permissons ==> permissions\r\ndocs/changelog.rst:717: compatibilty ==> compatibility\r\ndocs/changelog.rst:1172: browseable ==> browsable\r\ndocs/deploying.rst:191: similiar ==> similar\r\ndocs/internals.rst:434: Respons ==> Response, respond\r\ndocs/internals.rst:440: Respons ==> Response, respond\r\ndocs/internals.rst:717: tha ==> than, that, the\r\ndocs/performance.rst:42: databse ==> database\r\ndocs/plugin_hooks.rst:667: utilites ==> utilities\r\ndocs/publish.rst:168: countainer ==> container\r\ndocs/settings.rst:352: inalid ==> invalid\r\ndocs/sql_queries.rst:406: preceeded ==> preceded, proceeded\r\n```", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1417/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 959284434, "node_id": "MDExOlB1bGxSZXF1ZXN0NzAyNDIyMjYz", "number": 1418, "title": "Spelling corrections plus CI job for codespell", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 2, "created_at": "2021-08-03T16:21:19Z", "updated_at": "2021-08-03T16:36:39Z", "closed_at": "2021-08-03T16:36:38Z", "author_association": "OWNER", "pull_request": "simonw/datasette/pulls/1418", "body": "Refs #1417.", "repo": {"value": 107914493, "label": "datasette"}, "type": "pull", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1418/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": 0, "state_reason": null} {"id": 959276629, "node_id": "MDU6SXNzdWU5NTkyNzY2Mjk=", "number": 1416, "title": "Use rich to render tracebacks on errors, if available", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 0, "created_at": "2021-08-03T16:12:08Z", "updated_at": "2021-08-03T16:12:51Z", "closed_at": "2021-08-03T16:12:51Z", 
"author_association": "OWNER", "pull_request": null, "body": "> Now thinking I should try adding Rich as an optional dependency to Datasette - if it's there, show tracebacks using it. Could be really handy for development\r\n> https://twitter.com/simonw/status/1422576091055616003", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1416/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 841377702, "node_id": "MDU6SXNzdWU4NDEzNzc3MDI=", "number": 251, "title": "\"sqlite-utils convert\" command to replace the separate \"sqlite-transform\" tool", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 15, "created_at": "2021-03-25T22:36:36Z", "updated_at": "2021-08-02T22:39:46Z", "closed_at": "2021-08-02T04:47:40Z", "author_association": "OWNER", "pull_request": null, "body": "See https://github.com/simonw/sqlite-transform/issues/11 - I built a separate `sqlite-transform` tool a while ago that uses the word \"transform\" to means something entirely different from `sqlite-utils transform` - I'd like to resolve this by merging the two tools.", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/251/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 958516743, "node_id": "MDU6SXNzdWU5NTg1MTY3NDM=", "number": 306, "title": "Configure sphinx.ext.extlinks for issues", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 2, "created_at": "2021-08-02T21:19:19Z", "updated_at": "2021-08-02T21:39:34Z", "closed_at": "2021-08-02T21:29:22Z", "author_association": "OWNER", "pull_request": null, "body": "As seen in Datasette: https://github.com/simonw/datasette/issues/1227", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/306/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 810394616, "node_id": "MDU6SXNzdWU4MTAzOTQ2MTY=", "number": 1227, "title": "Configure sphinx.ext.extlinks for issues", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2021-02-17T17:38:02Z", "updated_at": "2021-08-02T21:38:39Z", "closed_at": "2021-02-18T01:20:33Z", "author_association": "OWNER", "pull_request": null, "body": "Spotted this in the aspw documentation: https://github.com/rogerbinns/apsw/blob/3.34.0-r1/doc/conf.py#L29-L36\r\n\r\n```python\r\nextlinks={\r\n 'cvstrac': ('https://sqlite.org/cvstrac/tktview?tn=%s',\r\n 'SQLite ticket #'),\r\n 'sqliteapi': ('https://sqlite.org/c3ref/%s.html', 'XXYouShouldNotSeeThisXX'),\r\n 'issue': ('https://github.com/rogerbinns/apsw/issues/%s',\r\n 'APSW 
issue '),\r\n 'source': ('https://github.com/rogerbinns/apsw/blob/master/%s',\r\n ''),\r\n }\r\n```\r\nWhich lets you link to issues like this:\r\n\r\n :issue:`268`", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1227/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 957383814, "node_id": "MDU6SXNzdWU5NTczODM4MTQ=", "number": 301, "title": "insert-files should get a --silent option", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 0, "created_at": "2021-08-01T04:11:03Z", "updated_at": "2021-08-02T19:12:21Z", "closed_at": "2021-08-02T19:12:21Z", "author_association": "OWNER", "pull_request": null, "body": "The new `sqlite-utils convert` command I'm adding in #251 will have a `--silent` option for turning off the progress bars. The only other command that has progress bars right now is `insert-files` so it should get this option too, for consistency.", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/301/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 957731178, "node_id": "MDU6SXNzdWU5NTc3MzExNzg=", "number": 304, "title": "`table.convert(..., where=)` and `sqlite-utils convert ... --where=`", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2021-08-02T04:27:23Z", "updated_at": "2021-08-02T19:00:00Z", "closed_at": "2021-08-02T18:58:10Z", "author_association": "OWNER", "pull_request": null, "body": "For applying the conversion to a subset of rows selected using the where clause.\r\n\r\nShould also take optional arguments, as seen in `db[\"dogs\"].delete_where(\"age < ?\", [3])`.\r\n\r\nFollows #302 and #251. This was originally https://github.com/simonw/sqlite-transform/issues/9", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/304/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 957741820, "node_id": "MDU6SXNzdWU5NTc3NDE4MjA=", "number": 305, "title": "Python: need a way to execute a count with an extra where clause", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2021-08-02T04:52:02Z", "updated_at": "2021-08-02T05:08:22Z", "closed_at": "2021-08-02T05:08:22Z", "author_association": "OWNER", "pull_request": null, "body": "I need this for #304. 
I'll probably add this to the `.execute_count()` method as `where=` and `where_args=`.", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/305/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 957536983, "node_id": "MDExOlB1bGxSZXF1ZXN0NzAwOTQ0NjQ0", "number": 303, "title": "sqlite-utils convert command and db[table].convert(...) method", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2021-08-01T16:52:42Z", "updated_at": "2021-08-02T04:47:42Z", "closed_at": "2021-08-02T04:47:39Z", "author_association": "OWNER", "pull_request": "simonw/sqlite-utils/pulls/303", "body": "Refs #251, #302.\r\n\r\n- [x] Get recipes working\r\n- [x] Document recipes\r\n- [x] Implement `db[table].convert(...)` method\r\n- [x] Add tests for recipes that use the new Python method\r\n- [x] Implement `db[table].convert(..., multi=True)` mechanism\r\n- [x] Documentation for `db[table].convert(...)`\r\n- [x] Refactor `sqlite-utils convert` to use the new method", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "pull", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/303/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": 0, "state_reason": null} {"id": 957529248, "node_id": "MDU6SXNzdWU5NTc1MjkyNDg=", "number": 302, "title": "Python library version of `sqlite-utils convert`", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": {"value": 9599, "label": "simonw"}, "milestone": null, "comments": 1, "created_at": "2021-08-01T16:11:02Z", "updated_at": "2021-08-02T04:47:40Z", "closed_at": "2021-08-02T04:47:40Z", "author_association": "OWNER", "pull_request": null, "body": "Spin off from #251. 
The ability to execute Python functions to convert and split columns should be part of the library too, not just the CLI.", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/302/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 957345476, "node_id": "MDU6SXNzdWU5NTczNDU0NzY=", "number": 1411, "title": "Canned query ?sql= is pointlessly echoed in query string starting from hidden mode", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2021-08-01T00:17:13Z", "updated_at": "2021-08-01T03:27:30Z", "closed_at": "2021-08-01T00:58:17Z", "author_association": "OWNER", "pull_request": null, "body": "Example: https://latest.datasette.io/fixtures/neighborhood_search?text=cork&_hide_sql=1\r\n\r\nSubmitting that form again results in this:\r\n\r\nhttps://latest.datasette.io/fixtures/neighborhood_search?sql=%0D%0Aselect+neighborhood%2C+facet_cities.name%2C+state%0D%0Afrom+facetable%0D%0A++++join+facet_cities%0D%0A++++++++on+facetable.city_id+%3D+facet_cities.id%0D%0Awhere+neighborhood+like+%27%25%27+%7C%7C+%3Atext+%7C%7C+%27%25%27%0D%0Aorder+by+neighborhood%3B%0D%0A&_hide_sql=1&text=cork\r\n\r\nBecause the HTML on https://latest.datasette.io/fixtures/neighborhood_search?text=cork&_hide_sql=1 includes this:\r\n\r\n```html\r\n
Custom SQL query returning 1 row (show)
\r\n \r\n \r\n \r\n \r\n```", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1411/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 957298475, "node_id": "MDU6SXNzdWU5NTcyOTg0NzU=", "number": 1407, "title": "OSError: AF_UNIX path too long in ds_unix_domain_socket_server", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 2, "created_at": "2021-07-31T18:36:06Z", "updated_at": "2021-07-31T19:03:44Z", "closed_at": "2021-07-31T19:03:44Z", "author_association": "OWNER", "pull_request": null, "body": "Got this exception while working on #1406.\r\n\r\n```\r\n @pytest.fixture(scope=\"session\")\r\n def ds_unix_domain_socket_server(tmp_path_factory):\r\n socket_folder = tmp_path_factory.mktemp(\"uds\")\r\n uds = str(socket_folder / \"datasette.sock\")\r\n ds_proc = subprocess.Popen(\r\n [\"datasette\", \"--memory\", \"--uds\", uds],\r\n stdout=subprocess.PIPE,\r\n stderr=subprocess.STDOUT,\r\n cwd=tempfile.gettempdir(),\r\n )\r\n # Give the server time to start\r\n time.sleep(1.5)\r\n # Check it started successfully\r\n> assert not ds_proc.poll(), ds_proc.stdout.read().decode(\"utf-8\")\r\nE AssertionError: INFO: Started server process [48453]\r\nE INFO: Waiting for application startup.\r\nE INFO: Application startup complete.\r\nE Traceback (most recent call last):\r\nE File \"/Users/simon/.local/share/virtualenvs/datasette-AWNrQs95/bin/datasette\", line 33, in \r\nE sys.exit(load_entry_point('datasette', 'console_scripts', 'datasette')())\r\nE File \"/Users/simon/.local/share/virtualenvs/datasette-AWNrQs95/lib/python3.8/site-packages/click/core.py\", line 1137, in __call__\r\nE return self.main(*args, **kwargs)\r\nE File \"/Users/simon/.local/share/virtualenvs/datasette-AWNrQs95/lib/python3.8/site-packages/click/core.py\", line 1062, in main\r\nE rv = self.invoke(ctx)\r\nE File \"/Users/simon/.local/share/virtualenvs/datasette-AWNrQs95/lib/python3.8/site-packages/click/core.py\", line 1668, in invoke\r\nE return _process_result(sub_ctx.command.invoke(sub_ctx))\r\nE File \"/Users/simon/.local/share/virtualenvs/datasette-AWNrQs95/lib/python3.8/site-packages/click/core.py\", line 1404, in invoke\r\nE return ctx.invoke(self.callback, **ctx.params)\r\nE File \"/Users/simon/.local/share/virtualenvs/datasette-AWNrQs95/lib/python3.8/site-packages/click/core.py\", line 763, in invoke\r\nE return __callback(*args, **kwargs)\r\nE File \"/Users/simon/Dropbox/Development/datasette/datasette/cli.py\", line 583, in serve\r\nE uvicorn.run(ds.app(), **uvicorn_kwargs)\r\nE File \"/Users/simon/.local/share/virtualenvs/datasette-AWNrQs95/lib/python3.8/site-packages/uvicorn/main.py\", line 393, in run\r\nE server.run()\r\nE File \"/Users/simon/.local/share/virtualenvs/datasette-AWNrQs95/lib/python3.8/site-packages/uvicorn/server.py\", line 50, in run\r\nE loop.run_until_complete(self.serve(sockets=sockets))\r\nE File \"/Users/simon/.pyenv/versions/3.8.2/lib/python3.8/asyncio/base_events.py\", line 616, in run_until_complete\r\nE return future.result()\r\nE File \"/Users/simon/.local/share/virtualenvs/datasette-AWNrQs95/lib/python3.8/site-packages/uvicorn/server.py\", line 67, in serve\r\nE await self.startup(sockets=sockets)\r\nE File 
\"/Users/simon/.local/share/virtualenvs/datasette-AWNrQs95/lib/python3.8/site-packages/uvicorn/server.py\", line 133, in startup\r\nE server = await asyncio.start_unix_server(\r\nE File \"/Users/simon/.pyenv/versions/3.8.2/lib/python3.8/asyncio/streams.py\", line 132, in start_unix_server\r\nE return await loop.create_unix_server(factory, path, **kwds)\r\nE File \"/Users/simon/.pyenv/versions/3.8.2/lib/python3.8/asyncio/unix_events.py\", line 296, in create_unix_server\r\nE sock.bind(path)\r\nE OSError: AF_UNIX path too long\r\nE \r\nE assert not 1\r\nE + where 1 = >()\r\nE + where > = .poll\r\n```", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1407/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 956303470, "node_id": "MDU6SXNzdWU5NTYzMDM0NzA=", "number": 1406, "title": "Tests failing with FileNotFoundError in runner.isolated_filesystem", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 8, "created_at": "2021-07-30T00:39:00Z", "updated_at": "2021-07-31T18:56:35Z", "closed_at": "2021-07-31T18:56:35Z", "author_association": "OWNER", "pull_request": null, "body": "e.g. https://github.com/simonw/datasette/runs/3197141955\r\n\r\nI've seen this error before, but I don't yet have a good workaround for it.\r\n\r\n```\r\n @contextlib.contextmanager\r\n def isolated_filesystem(\r\n self, temp_dir: t.Optional[t.Union[str, os.PathLike]] = None\r\n ) -> t.Iterator[str]:\r\n \"\"\"A context manager that creates a temporary directory and\r\n changes the current working directory to it. This isolates tests\r\n that affect the contents of the CWD to prevent them from\r\n interfering with each other.\r\n \r\n :param temp_dir: Create the temporary directory under this\r\n directory. If given, the created directory is not removed\r\n when exiting.\r\n \r\n .. 
versionchanged:: 8.0\r\n Added the ``temp_dir`` parameter.\r\n \"\"\"\r\n> cwd = os.getcwd()\r\nE FileNotFoundError: [Errno 2] No such file or directory\r\n\r\n/opt/hostedtoolcache/Python/3.6.14/x64/lib/python3.6/site-packages/click/testing.py:466: FileNotFoundError\r\n=========================== short test summary info ============================\r\nFAILED tests/test_publish_cloudrun.py::test_publish_cloudrun_apt_get_install\r\nFAILED tests/test_publish_cloudrun.py::test_publish_cloudrun_extra_options[---setting force_https_urls on]\r\nFAILED tests/test_publish_cloudrun.py::test_publish_cloudrun_extra_options[--setting base_url /foo---setting base_url /foo --setting force_https_urls on]\r\nFAILED tests/test_publish_cloudrun.py::test_publish_cloudrun_extra_options[--setting force_https_urls off---setting force_https_urls off]\r\nFAILED tests/test_publish_heroku.py::test_publish_heroku_requires_heroku - Fi...\r\nFAILED tests/test_publish_heroku.py::test_publish_heroku_installs_plugin - Fi...\r\nFAILED tests/test_publish_heroku.py::test_publish_heroku - FileNotFoundError:...\r\nFAILED tests/test_publish_heroku.py::test_publish_heroku_plugin_secrets - Fil...\r\n================== 8 failed, 920 passed in 188.22s (0:03:08) ===================\r\n```", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1406/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 955316250, "node_id": "MDU6SXNzdWU5NTUzMTYyNTA=", "number": 1405, "title": "utils.parse_metadata() should be a documented internal function", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2021-07-28T23:51:39Z", "updated_at": "2021-07-29T23:33:30Z", "closed_at": "2021-07-29T23:30:24Z", "author_association": "OWNER", "pull_request": null, "body": "Because it's used by this plugin: https://github.com/simonw/datasette-remote-metadata", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1405/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 953352015, "node_id": "MDU6SXNzdWU5NTMzNTIwMTU=", "number": 1404, "title": "`register_routes()` hook should take `datasette` argument", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2021-07-26T23:00:33Z", "updated_at": "2021-07-26T23:27:07Z", "closed_at": "2021-07-26T23:26:00Z", "author_association": "OWNER", "pull_request": null, "body": "Currently that plugin hook takes no arguments at all. 
This means it's not possible to conditionally register routes based on Datasette plugin configuration.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1404/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 952154468, "node_id": "MDU6SXNzdWU5NTIxNTQ0Njg=", "number": 299, "title": "Ability to see just specific table schemas with `sqlite-utils schema`", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2021-07-24T22:00:05Z", "updated_at": "2021-07-24T22:12:01Z", "closed_at": "2021-07-24T22:08:46Z", "author_association": "OWNER", "pull_request": null, "body": "It currently accepts no arguments. Allowing for optional arguments specifying tables would be useful:\r\n\r\n sqlite-utils schema fixtures.db facetable searchable\r\n", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/299/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 946553953, "node_id": "MDExOlB1bGxSZXF1ZXN0NjkxNzA3NDA5", "number": 1397, "title": "Fix for race condition in refresh_schemas(), closes #1231", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 0, "created_at": "2021-07-16T19:44:43Z", "updated_at": "2021-07-16T19:45:00Z", "closed_at": "2021-07-16T19:44:58Z", "author_association": "OWNER", "pull_request": "simonw/datasette/pulls/1397", "body": "", "repo": {"value": 107914493, "label": "datasette"}, "type": "pull", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1397/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": 0, "state_reason": null} {"id": 811367257, "node_id": "MDU6SXNzdWU4MTEzNjcyNTc=", "number": 1231, "title": "Race condition errors in new refresh_schemas() mechanism", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 11, "created_at": "2021-02-18T18:49:54Z", "updated_at": "2021-07-16T19:44:59Z", "closed_at": "2021-07-16T19:44:59Z", "author_association": "OWNER", "pull_request": null, "body": "I tried running a Locust load test against Datasette and hit an error message about a failure to create tables because they already existed. 
I think this means there are race conditions in the new `refresh_schemas()` mechanism added in #1150.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1231/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 944870799, "node_id": "MDU6SXNzdWU5NDQ4NzA3OTk=", "number": 1394, "title": "Big performance boost on faceting: skip the inner order by", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 4, "created_at": "2021-07-14T23:32:29Z", "updated_at": "2021-07-16T02:23:32Z", "closed_at": "2021-07-15T00:05:50Z", "author_association": "OWNER", "pull_request": null, "body": "I just noticed something that could make for a huge performance improvement in faceting.\r\n\r\nThe default query used by Datasette when faceting looks like this:\r\n```sql\r\nselect\r\n country_long,\r\n count(*)\r\nfrom (\r\n select * from [global-power-plants] order by rowid\r\n)\r\nwhere\r\n country_long is not null\r\ngroup by\r\n country_long\r\norder by\r\n count(*) desc\r\n```\r\nHere it takes 53ms: https://global-power-plants.datasettes.com/global-power-plants?sql=select%0D%0A++country_long%2C%0D%0A++count%28*%29%0D%0Afrom+%28%0D%0A++select+*+from+%5Bglobal-power-plants%5D+order+by+rowid%0D%0A%29%0D%0Awhere%0D%0A++country_long+is+not+null%0D%0Agroup+by%0D%0A++country_long%0D%0Aorder+by%0D%0A++count%28*%29+desc\r\n\r\nNote that there's a `order by rowid` in there which isn't necessary - the order on that inner query doesn't matter since we're grouping and counting.\r\n\r\nI had assumed SQLite would optimize this away - but it turns out it doesn't! Consider this version of the query, with that pointless order by removed:\r\n```\r\nselect\r\n country_long,\r\n count(*)\r\nfrom (\r\n select * from [global-power-plants]\r\n)\r\nwhere\r\n country_long is not null\r\ngroup by\r\n country_long\r\norder by\r\n count(*) desc\r\n```\r\nhttps://global-power-plants.datasettes.com/global-power-plants?sql=select%0D%0A++country_long%2C%0D%0A++count%28*%29%0D%0Afrom+%28%0D%0A++select+*+from+%5Bglobal-power-plants%5D%0D%0A%29%0D%0Awhere%0D%0A++country_long+is+not+null%0D%0Agroup+by%0D%0A++country_long%0D%0Aorder+by%0D%0A++count%28*%29+desc runs in 7.2ms!\r\n\r\nI tried this optimization on a table with 2.5m rows in it - without the optimization it took 5 seconds, with the optimization it took 450ms. 
So this is a very significant improvement!", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1394/reactions\", \"total_count\": 2, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 1, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 466996584, "node_id": "MDExOlB1bGxSZXF1ZXN0Mjk2NzM1MzIw", "number": 557, "title": "Get tests running on Windows using Travis CI", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 4, "created_at": "2019-07-11T16:36:57Z", "updated_at": "2021-07-10T23:39:48Z", "closed_at": "2021-07-10T23:39:48Z", "author_association": "OWNER", "pull_request": "simonw/datasette/pulls/557", "body": "Refs #511", "repo": {"value": 107914493, "label": "datasette"}, "type": "pull", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/557/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": 0, "state_reason": null} {"id": 941300946, "node_id": "MDU6SXNzdWU5NDEzMDA5NDY=", "number": 1391, "title": "Stop using generated columns in fixtures.db", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 5, "created_at": "2021-07-10T18:26:11Z", "updated_at": "2021-07-10T19:26:58Z", "closed_at": "2021-07-10T19:26:00Z", "author_association": "OWNER", "pull_request": null, "body": "Refs #1376 - but I also keep running into this myself, where I try to run something against `fixtures.db` and get this confusing error:\r\n\r\n sqlite3.DatabaseError: malformed database schema (generated_columns) - near \"AS\": syntax error\r\n\r\nI'm going to stop using generated columns in `fixtures.db` and instead dynamically generate the generated column table for the duration of the relevant test.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1391/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 940077168, "node_id": "MDU6SXNzdWU5NDAwNzcxNjg=", "number": 1389, "title": "\"searchmode\": \"raw\" in table metadata", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 6, "created_at": "2021-07-08T17:32:10Z", "updated_at": "2021-07-10T18:33:13Z", "closed_at": "2021-07-10T18:33:13Z", "author_association": "OWNER", "pull_request": null, "body": "> http://localhost:8001/index/summary?_search=language%3Aeng&_sort=title&_searchmode=raw\r\n>\r\n> But I'm not able to manage it in the metadata file. 
Here is mine (note that the sort column is taken into account)\r\n> Here it is:\r\n>\r\n> ```\r\n> {\r\n> \"databases\": {\r\n> \"index\": {\r\n> \"tables\": {\r\n> \"summary\": {\r\n> \"sort\": \"title\",\r\n> \"searchmode\": \"raw\"\r\n> }\r\n> }\r\n> }\r\n> }\r\n> }\r\n\r\n_Originally posted by @Krazybug in https://github.com/simonw/datasette/issues/759#issuecomment-624860451_", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1389/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 940891698, "node_id": "MDU6SXNzdWU5NDA4OTE2OTg=", "number": 1390, "title": "Mention restarting systemd in documentation", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 2, "created_at": "2021-07-09T16:05:15Z", "updated_at": "2021-07-09T16:32:57Z", "closed_at": "2021-07-09T16:32:33Z", "author_association": "OWNER", "pull_request": null, "body": "https://docs.datasette.io/en/stable/deploying.html#running-datasette-using-systemd\r\n\r\nNeed to clarify that if you add a new database or change metadata you need to restart systemd.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1390/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 935930820, "node_id": "MDU6SXNzdWU5MzU5MzA4MjA=", "number": 1387, "title": "absolute_url() behind a proxy assembles incorrect http://127.0.0.1:8001/ URLs", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 8, "created_at": "2021-07-02T16:58:25Z", "updated_at": "2021-07-02T17:58:23Z", "closed_at": "2021-07-02T17:33:05Z", "author_association": "OWNER", "pull_request": null, "body": "Reported in the wild on https://ilsweb.cincinnatilibrary.org/collection-analysis/current_collection-3d4a4b7/bib?_facet=bib_level_callnumber - the \"next page\" link links to https://127.0.0.1:8010/collection-analysis/current_collection-3d4a4b7/bib?_facet=bib_level_callnumber&_next=100\r\n\r\nThat installation uses `\"base_url\": \"/collection-analysis/\"`\r\n\r\nWeirdly all of the other links on that page - to facet results, sort orders, row permalinks etc - work fine. 
It's JUST the `next_url` one that is broken.\r\n\r\nAlso broken in their JSON: https://ilsweb.cincinnatilibrary.org/collection-analysis/current_collection-3d4a4b7/bib.json?_size=1 returns\r\n\r\n```json\r\n \"suggested_facets\": [],\r\n \"next\": \"1\",\r\n \"next_url\": \"https://127.0.0.1:8010/collection-analysis/current_collection-3d4a4b7/bib.json?_size=1&_next=1\",\r\n \"private\": false,\r\n```", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1387/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 927789811, "node_id": "MDU6SXNzdWU5Mjc3ODk4MTE=", "number": 292, "title": "Add contributing documentation", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 0, "created_at": "2021-06-23T02:13:05Z", "updated_at": "2021-06-25T17:53:51Z", "closed_at": "2021-06-25T17:53:51Z", "author_association": "OWNER", "pull_request": null, "body": "Like https://docs.datasette.io/en/latest/contributing.html (but simpler) - should cover how to run `black` and `flake8` and `mypy` and how to run the tests.", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/292/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 926777310, "node_id": "MDU6SXNzdWU5MjY3NzczMTA=", "number": 290, "title": "`db.query()` method (renamed `db.execute_returning_dicts()`)", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 6, "created_at": "2021-06-22T03:03:54Z", "updated_at": "2021-06-24T23:17:38Z", "closed_at": "2021-06-24T22:54:43Z", "author_association": "OWNER", "pull_request": null, "body": "Most of this library deals with lists of Python dictionaries - `.insert_all()`, `.rows`, `.rows_where()`, `.search()`.\r\n\r\nThe `db.execute()` method is the only thing that returns a `sqlite3` cursor.\r\n\r\nThere is a clumsily named `db.execute_returning_dicts(sql)` method but it's not currently mentioned in the documentation.\r\n\r\nIt needs a better name, and needs to be properly documented.", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/290/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 927766296, "node_id": "MDU6SXNzdWU5Mjc3NjYyOTY=", "number": 291, "title": "Adopt flake8", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 2, "created_at": "2021-06-23T01:19:37Z", "updated_at": "2021-06-24T17:50:27Z", "closed_at": "2021-06-24T17:50:27Z", "author_association": "OWNER", "pull_request": null, "body": "", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": 
null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/291/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 920884085, "node_id": "MDU6SXNzdWU5MjA4ODQwODU=", "number": 1377, "title": "Mechanism for plugins to exclude certain paths from CSRF checks", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2021-06-15T00:48:20Z", "updated_at": "2021-06-23T22:51:33Z", "closed_at": "2021-06-23T22:51:33Z", "author_association": "OWNER", "pull_request": null, "body": "I need this for a plugin I'm building that offers a POST API.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1377/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 925487946, "node_id": "MDU6SXNzdWU5MjU0ODc5NDY=", "number": 286, "title": "Add installation instructions", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2021-06-19T23:55:36Z", "updated_at": "2021-06-20T18:47:13Z", "closed_at": "2021-06-20T18:47:13Z", "author_association": "OWNER", "pull_request": null, "body": "`pip install sqlite-utils`, `pipx install sqlite-utils` and `brew install sqlite-utils`", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/286/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 925544070, "node_id": "MDU6SXNzdWU5MjU1NDQwNzA=", "number": 287, "title": "Update rowid examples in the docs", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 0, "created_at": "2021-06-20T08:03:00Z", "updated_at": "2021-06-20T18:26:21Z", "closed_at": "2021-06-20T18:26:21Z", "author_association": "OWNER", "pull_request": null, "body": "Changed in #284 - a couple of examples need updating on https://github.com/simonw/sqlite-utils/blob/3.10/docs/cli.rst.", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/287/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 925545468, "node_id": "MDU6SXNzdWU5MjU1NDU0Njg=", "number": 288, "title": "sqlite-utils memory blah.json --schema", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 0, "created_at": "2021-06-20T08:10:40Z", "updated_at": "2021-06-20T18:26:21Z", "closed_at": "2021-06-20T18:26:21Z", "author_association": "OWNER", "pull_request": null, "body": "Like `--dump` but only 
outputs the schema - useful for understanding what you are about to run queries against.", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/288/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 921878733, "node_id": "MDU6SXNzdWU5MjE4Nzg3MzM=", "number": 272, "title": "Idea: import CSV to memory, run SQL, export in a single command", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 22, "created_at": "2021-06-15T23:02:48Z", "updated_at": "2021-06-19T23:36:48Z", "closed_at": "2021-06-18T15:05:03Z", "author_association": "OWNER", "pull_request": null, "body": "I quite often load a CSV file into a SQLite DB, then do stuff with it (like export results back out again as a new CSV) without any intention of keeping the CSV file around afterwards.\r\n\r\nWhat if `sqlite-utils` could do this for me? Something like this:\r\n\r\n sqlite-utils --csv blah.csv --csv baz.csv \"select * from blah join baz ...\"\r\n", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/272/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 925320167, "node_id": "MDU6SXNzdWU5MjUzMjAxNjc=", "number": 284, "title": ".transform(types=) turns rowid into a concrete column", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 5, "created_at": "2021-06-19T05:25:27Z", "updated_at": "2021-06-19T15:28:30Z", "closed_at": "2021-06-19T15:28:30Z", "author_association": "OWNER", "pull_request": null, "body": "Noticed this in the tests for `sqlite-utils memory` in #282 - is it possible to fix this?\r\n\r\nhttps://github.com/simonw/sqlite-utils/commit/ec5174ed40fa283cb06f25ee0c0136297ec313ae", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/284/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 925410305, "node_id": "MDU6SXNzdWU5MjU0MTAzMDU=", "number": 285, "title": "Introspection property for telling if a table is a rowid table", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 7, "created_at": "2021-06-19T14:56:16Z", "updated_at": "2021-06-19T15:12:33Z", "closed_at": "2021-06-19T15:12:33Z", "author_association": "OWNER", "pull_request": null, "body": "_Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/284#issuecomment-864416785_", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/285/reactions\", 
\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 925319214, "node_id": "MDU6SXNzdWU5MjUzMTkyMTQ=", "number": 283, "title": "memory: Shouldn't detect types for JSON", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2021-06-19T05:17:35Z", "updated_at": "2021-06-19T14:52:48Z", "closed_at": "2021-06-19T14:52:48Z", "author_association": "OWNER", "pull_request": null, "body": "https://github.com/simonw/sqlite-utils/blob/ec5174ed40fa283cb06f25ee0c0136297ec313ae/sqlite_utils/cli.py#L1244-L1251\r\n\r\nThis runs against JSON as well as CSV/TSV - which isn't necessary and In fact throws errors if there is any nested data.", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/283/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 925305186, "node_id": "MDU6SXNzdWU5MjUzMDUxODY=", "number": 282, "title": "Automatic type detection for CSV data", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 4, "created_at": "2021-06-19T03:33:21Z", "updated_at": "2021-06-19T04:42:03Z", "closed_at": "2021-06-19T04:38:00Z", "author_association": "OWNER", "pull_request": null, "body": "I've touched on this before in #179 - but now that I've added `sqlite-utils memory` this is much more important - because unlike with `sqlite-utils insert` the in-memory command doesn't give you the opportunity to fix any types you imported from CSV, so queries like `select * from stdin where age > 3` are never going to work correctly against these temporary in-memory tables.\r\n\r\nTeaching `sqlite-utils insert` to detect types for columns in a CSV file would be a backwards-compatibility breaking change. Teaching `sqlite-utils memory` that trick would not be, since it hasn't been included in a release yet.\r\n\r\nIt's a little inconsistent, but I'm going to have `sqlite-utils memory` default to detecting types while `sqlite-utils insert` does not. 
In each case this can be controlled by a new command-line option:\r\n\r\n cat file.csv | sqlite-utils memory - --no-detect-types\r\n\r\nTo opt-in for `sqlite-utils insert`:\r\n\r\n cat file.csv | sqlite-utils insert blah.db blah - --detect-types\r\n\r\nI'll have short options for these too: `-n` for `--no-detect-types` and `-d` for `--detect-types`.", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/282/reactions\", \"total_count\": 1, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 1, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 709577625, "node_id": "MDU6SXNzdWU3MDk1Nzc2MjU=", "number": 179, "title": "sqlite-utils transform/insert --detect-types", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 4, "created_at": "2020-09-26T17:28:55Z", "updated_at": "2021-06-19T03:36:16Z", "closed_at": "2021-06-19T03:36:05Z", "author_association": "OWNER", "pull_request": null, "body": "Idea from https://github.com/simonw/datasette-edit-tables/issues/13 - provide Python utility methods and accompanying CLI options for detecting the likely types of TEXT columns.\r\n\r\nSo if you have a text column that actually contained exclusively integer string values, it can let you know and let you run transform against it.", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/179/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 924990677, "node_id": "MDU6SXNzdWU5MjQ5OTA2Nzc=", "number": 279, "title": "sqlite-utils memory should handle TSV and JSON in addition to CSV", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 7, "created_at": "2021-06-18T15:02:54Z", "updated_at": "2021-06-19T03:11:59Z", "closed_at": "2021-06-19T03:11:59Z", "author_association": "OWNER", "pull_request": null, "body": "- Use sniff to detect CSV or TSV (if `:tsv` or `:csv` was not specified) and delimiters\r\n\r\nFollow-on from #272", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/279/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 924992318, "node_id": "MDU6SXNzdWU5MjQ5OTIzMTg=", "number": 281, "title": "Mechanism for explicitly stating CSV or JSON or TSV for sqlite-utils memory", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2021-06-18T15:04:53Z", "updated_at": "2021-06-19T03:11:59Z", "closed_at": "2021-06-19T03:11:59Z", "author_association": "OWNER", "pull_request": null, "body": "- Implement `filename.json:json` and `-:nl` and suchlike options for specifying the format rather than guessing it - see 
https://github.com/simonw/sqlite-utils/issues/272#issuecomment-861985944\r\n\r\nFollows #272", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/281/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 924991194, "node_id": "MDU6SXNzdWU5MjQ5OTExOTQ=", "number": 280, "title": "Add --encoding option to sqlite-utils memory", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 0, "created_at": "2021-06-18T15:03:32Z", "updated_at": "2021-06-18T15:29:46Z", "closed_at": "2021-06-18T15:29:46Z", "author_association": "OWNER", "pull_request": null, "body": "Follow-on from #272 - this will work like `--encoding` on `sqlite-utils insert` and will affect all CSV files processed by `sqlite-utils memory`.", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/280/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 922099793, "node_id": "MDExOlB1bGxSZXF1ZXN0NjcxMDE0NzUx", "number": 273, "title": "sqlite-utils memory command for directly querying CSV/JSON data", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 8, "created_at": "2021-06-16T05:04:58Z", "updated_at": "2021-06-18T15:01:17Z", "closed_at": "2021-06-18T15:00:52Z", "author_association": "OWNER", "pull_request": "simonw/sqlite-utils/pulls/273", "body": "Refs #272. Initial implementation only does CSV data, still needs:\r\n\r\n- [x] Implement `--save`\r\n- [x] Add `--dump` to the documentation\r\n- [x] Add `--attach` example to the documentation\r\n- [x] Replace `:memory:` in documentation", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "pull", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/273/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": 0, "state_reason": null} {"id": 268176505, "node_id": "MDU6SXNzdWUyNjgxNzY1MDU=", "number": 34, "title": "Support CSV export with a .csv extension", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2017-10-24T20:34:43Z", "updated_at": "2021-06-17T18:14:48Z", "closed_at": "2018-05-28T20:45:34Z", "author_association": "OWNER", "pull_request": null, "body": "Maybe do this using streaming with multiple pagination SQL queries so we can support arbitrarily large exports.\r\n\r\nHow would this work against a view which doesn\u2019t have an obvious efficient pagination mechanism? 
Maybe limit views to up to 1000 exported records?\r\n\r\nRelates to #5 ", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/34/reactions\", \"total_count\": 2, \"+1\": 2, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 323681589, "node_id": "MDU6SXNzdWUzMjM2ODE1ODk=", "number": 266, "title": "Export to CSV", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 27, "created_at": "2018-05-16T15:50:24Z", "updated_at": "2021-06-17T18:14:24Z", "closed_at": "2018-06-18T06:05:25Z", "author_association": "OWNER", "pull_request": null, "body": "Datasette needs to be able to export data to CSV.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/266/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 333000163, "node_id": "MDU6SXNzdWUzMzMwMDAxNjM=", "number": 312, "title": "HTML, CSV and JSON views should support ?_col=&_col=", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2018-06-16T16:53:35Z", "updated_at": "2021-06-17T18:14:24Z", "closed_at": "2018-06-16T17:00:12Z", "author_association": "OWNER", "pull_request": null, "body": "To support whitelisting columns to display.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/312/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 335141434, "node_id": "MDU6SXNzdWUzMzUxNDE0MzQ=", "number": 326, "title": "CSV should respect --cors and return cors headers", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2018-06-24T00:44:07Z", "updated_at": "2021-06-17T18:14:24Z", "closed_at": "2018-06-24T00:59:45Z", "author_association": "OWNER", "pull_request": null, "body": "Otherwise tools like Vega can't load data via CSV.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/326/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 725184645, "node_id": "MDU6SXNzdWU3MjUxODQ2NDU=", "number": 1034, "title": "Better way of representing binary data in .csv output", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": {"value": 6026070, "label": "0.51"}, "comments": 19, "created_at": "2020-10-20T04:28:58Z", "updated_at": "2021-06-17T18:13:21Z", "closed_at": "2020-10-29T22:47:46Z", "author_association": 
"OWNER", "pull_request": null, "body": "I just noticed this: https://latest.datasette.io/fixtures/binary_data.csv\r\n\r\n```csv\r\nrowid,data\r\n1,b'\\x15\\x1c\\x02\\xc7\\xad\\x05\\xfe'\r\n2,b'\\x15\\x1c\\x03\\xc7\\xad\\x05\\xfe'\r\n```\r\nThere's no good way to represent binary data in a CSV file, but this seems like one of the more-bad options.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1034/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 503190241, "node_id": "MDU6SXNzdWU1MDMxOTAyNDE=", "number": 584, "title": "Codec error in some CSV exports", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 2, "created_at": "2019-10-07T01:15:34Z", "updated_at": "2021-06-17T18:13:20Z", "closed_at": "2019-10-18T05:23:16Z", "author_association": "OWNER", "pull_request": null, "body": "Got this exploring my Swarm checkins:\r\n\r\n![448DBFC4-71F8-4846-83C0-BEA511B2157A](https://user-images.githubusercontent.com/9599/66279259-3af53480-e865-11e9-9651-04fd2d895392.jpeg)\r\n\r\n`/swarm/stickers.csv?stickerType=messageOnly&_size=max`", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/584/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 516748849, "node_id": "MDU6SXNzdWU1MTY3NDg4NDk=", "number": 612, "title": "CSV export is broken for tables with null foreign keys", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 2, "created_at": "2019-11-02T22:52:47Z", "updated_at": "2021-06-17T18:13:20Z", "closed_at": "2019-11-02T23:12:53Z", "author_association": "OWNER", "pull_request": null, "body": "Following on from #406 - this CSV export appears to be broken:\r\n\r\nhttps://14da705.datasette.io/fixtures/foreign_key_references.csv?_labels=on&_size=max\r\n```csv\r\npk,foreign_key_with_label,foreign_key_with_label_label,foreign_key_with_no_label,foreign_key_with_no_label_label\r\n1,1,hello,1,1\r\n2,,\r\n```\r\nThat second row should have 5 values, but it only has 4.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/612/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 906385991, "node_id": "MDU6SXNzdWU5MDYzODU5OTE=", "number": 1349, "title": "CSV ?_stream=on redundantly calculates facets for every page", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 9, "created_at": "2021-05-29T06:11:23Z", "updated_at": "2021-06-17T18:12:32Z", "closed_at": "2021-06-01T15:52:53Z", "author_association": "OWNER", "pull_request": null, "body": "I'm trying to figure out why a full CSV export from 
https://covid-19.datasettes.com/covid/ny_times_us_counties runs unbearably slowly.\r\n\r\nIt's because the streaming endpoint works by scrolling through every page, and it turns out every page calculates facets and suggested facets!", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1349/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 906993731, "node_id": "MDU6SXNzdWU5MDY5OTM3MzE=", "number": 1351, "title": "Get `?_trace=1` working with CSV and streaming CSVs", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2021-05-31T03:02:15Z", "updated_at": "2021-06-17T18:12:32Z", "closed_at": "2021-06-01T15:50:09Z", "author_association": "OWNER", "pull_request": null, "body": "> I think it's worth getting `?_trace=1` to work with streaming CSV - this would have helped me spot this issue a long time ago.\r\n\r\n_Originally posted by @simonw in https://github.com/simonw/datasette/issues/1349#issuecomment-851133125_", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1351/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 759695780, "node_id": "MDU6SXNzdWU3NTk2OTU3ODA=", "number": 1133, "title": "Option to omit header row in CSV export", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 2, "created_at": "2020-12-08T18:54:46Z", "updated_at": "2021-06-17T18:12:31Z", "closed_at": "2020-12-10T23:28:51Z", "author_association": "OWNER", "pull_request": null, "body": "`?_header=off` - for symmetry with existing option `?_nl=on`.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1133/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 732685643, "node_id": "MDU6SXNzdWU3MzI2ODU2NDM=", "number": 1063, "title": ".csv should link to .blob downloads", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": {"value": 6026070, "label": "0.51"}, "comments": 3, "created_at": "2020-10-29T21:45:58Z", "updated_at": "2021-06-17T18:12:30Z", "closed_at": "2020-10-29T22:47:45Z", "author_association": "OWNER", "pull_request": null, "body": "- [x] Update `.csv` output to link to these things (and get that `xfail` test to pass)\r\n- ~~Add a `.csv?_blob_base64=1` argument that causes them to be output in base64 in the CSV~~\r\n\r\n> Moving the CSV work to a separate ticket.\r\n_Originally posted by @simonw in https://github.com/simonw/datasette/pull/1061#issuecomment-719042601_", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, 
"reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1063/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 922955697, "node_id": "MDU6SXNzdWU5MjI5NTU2OTc=", "number": 275, "title": "Enable code coverage", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2021-06-16T18:33:49Z", "updated_at": "2021-06-17T00:12:12Z", "closed_at": "2021-06-17T00:12:12Z", "author_association": "OWNER", "pull_request": null, "body": "https://app.codecov.io/gh/simonw/sqlite-utils\r\n\r\nSame mechanism as Datasette. Need to copy across the token from that page and add an equivalent of this workflow: https://github.com/simonw/datasette/blob/main/.github/workflows/test-coverage.yml", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/275/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 922832113, "node_id": "MDU6SXNzdWU5MjI4MzIxMTM=", "number": 274, "title": "sqlite-utils dump my.db command", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 0, "created_at": "2021-06-16T16:30:14Z", "updated_at": "2021-06-16T23:51:54Z", "closed_at": "2021-06-16T23:51:54Z", "author_association": "OWNER", "pull_request": null, "body": "Inspired by the `--dump` mechanism I added to `sqlite-utils memory` here: https://github.com/simonw/sqlite-utils/issues/272#issuecomment-862018937\r\n\r\n> Can use `.iterdump()` to implement this: https://docs.python.org/3/library/sqlite3.html#sqlite3.Connection.iterdump\r\n>\r\n> Maybe instead (or as-well-as) offer `--dump` which dumps out the SQL from that.", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/274/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 919733213, "node_id": "MDU6SXNzdWU5MTk3MzMyMTM=", "number": 33, "title": "Searching for whitespace throws an error", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 0, "created_at": "2021-06-13T06:57:57Z", "updated_at": "2021-06-13T14:36:39Z", "closed_at": "2021-06-13T14:36:39Z", "author_association": "MEMBER", "pull_request": null, "body": "https://datasette.io/-/beta?q=+ returns a 500\r\n\r\n> fts5: syntax error near \"\"", "repo": {"value": 197431109, "label": "dogsheep-beta"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/dogsheep/dogsheep-beta/issues/33/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 919702451, "node_id": "MDU6SXNzdWU5MTk3MDI0NTE=", "number": 271, "title": 
"table.upsert_all() fails if input has a single column that should be a primary key", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2021-06-13T02:50:27Z", "updated_at": "2021-06-13T02:57:29Z", "closed_at": "2021-06-13T02:57:29Z", "author_association": "OWNER", "pull_request": null, "body": "This works:\r\n```pycon\r\n>>> db['foo'].insert_all([{\"name\": \"hello\"}], pk=\"name\")\r\n\r\n```\r\nBut this fails:\r\n```\r\n>>> db['foo3'].upsert_all([{\"name\": \"hello\"}], pk=\"name\")\r\nTraceback (most recent call last):\r\n File \"\", line 1, in \r\n File \"/Users/simon/.local/share/virtualenvs/datasette.io-TK86ygSO/lib/python3.9/site-packages/sqlite_utils/db.py\", line 1837, in upsert_all\r\n return self.insert_all(\r\n File \"/Users/simon/.local/share/virtualenvs/datasette.io-TK86ygSO/lib/python3.9/site-packages/sqlite_utils/db.py\", line 1778, in insert_all\r\n self.insert_chunk(\r\n File \"/Users/simon/.local/share/virtualenvs/datasette.io-TK86ygSO/lib/python3.9/site-packages/sqlite_utils/db.py\", line 1588, in insert_chunk\r\n result = self.db.execute(query, params)\r\n File \"/Users/simon/.local/share/virtualenvs/datasette.io-TK86ygSO/lib/python3.9/site-packages/sqlite_utils/db.py\", line 213, in execute\r\n return self.conn.execute(sql, parameters)\r\nsqlite3.OperationalError: near \"WHERE\": syntax error\r\n```\r\nWith the debugger:\r\n```\r\n>>> import pdb; pdb.pm()\r\n> /Users/simon/.local/share/virtualenvs/datasette.io-TK86ygSO/lib/python3.9/site-packages/sqlite_utils/db.py(213)execute()\r\n-> return self.conn.execute(sql, parameters)\r\n(Pdb) print(sql, parameters)\r\nUPDATE [foo3] SET WHERE [name] = ? ['hello']\r\n```", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/271/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 919181559, "node_id": "MDU6SXNzdWU5MTkxODE1NTk=", "number": 268, "title": "db.schema property and sqlite-utils schema command", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 4, "created_at": "2021-06-11T20:25:47Z", "updated_at": "2021-06-11T20:51:56Z", "closed_at": "2021-06-11T20:51:56Z", "author_association": "OWNER", "pull_request": null, "body": "`table.schema` returns the schema for a table. 
`db.schema` should return the schema for the whole database.\r\n\r\nCan do this using `select sql from sqlite_master where sql is not null`:\r\n\r\nhttps://latest.datasette.io/fixtures?sql=select+sql+from+sqlite_master+where+sql+is+not+null", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/268/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 915455228, "node_id": "MDU6SXNzdWU5MTU0NTUyMjg=", "number": 1371, "title": "Menu plugin hooks should include the request", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2021-06-08T20:23:35Z", "updated_at": "2021-06-10T04:46:01Z", "closed_at": "2021-06-10T04:46:01Z", "author_association": "OWNER", "pull_request": null, "body": "https://docs.datasette.io/en/stable/plugin_hooks.html#menu-links-datasette-actor\r\n\r\n- `menu_links(datasette, actor)`\r\n- `table_actions(datasette, actor, database, table)`\r\n- `database_actions(datasette, actor, database)`\r\n\r\nAll three of these should optionally also accept the `request` object. This would allow them to take into account additional cookies, `Authorization` headers or the current request URL (including the domain/subdomain) - or even access `request.scope` for extra context that might have been passed down from ASGI middleware.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1371/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 913823889, "node_id": "MDU6SXNzdWU5MTM4MjM4ODk=", "number": 1367, "title": "Navigation menu display bug", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2021-06-07T18:18:08Z", "updated_at": "2021-06-07T18:24:19Z", "closed_at": "2021-06-07T18:24:19Z", "author_association": "OWNER", "pull_request": null, "body": "With Datasette 0.57 the navigation menu looks like this:\r\n\r\n\"Datasette_Fixtures___memory___internal__fixtures__extra_database\"\r\n", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1367/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 912959264, "node_id": "MDU6SXNzdWU5MTI5NTkyNjQ=", "number": 1364, "title": "Don't truncate columns on the list of databases", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 0, "created_at": "2021-06-06T22:01:56Z", "updated_at": "2021-06-06T22:07:50Z", "closed_at": "2021-06-06T22:07:50Z", "author_association": "OWNER", "pull_request": null, "body": "https://covid-19.datasettes.com/covid currently truncates at 9 database 
columns:\r\n\r\n\"covid\"\r\n\r\nDjango SQL Dashboard showed me that this is a bad idea - having the full list of columns is actually really useful documentation for crafting custom SQL queries.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1364/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 325958506, "node_id": "MDU6SXNzdWUzMjU5NTg1MDY=", "number": 283, "title": "Support cross-database joins", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 26, "created_at": "2018-05-24T04:18:39Z", "updated_at": "2021-06-06T09:40:18Z", "closed_at": "2021-02-18T22:16:46Z", "author_association": "OWNER", "pull_request": null, "body": "SQLite has the ability to attach multiple databases to a single connection and then run joins across multiple databases.\r\n\r\nSince Datasette supports more than one database, this would make a pretty neat feature.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/283/reactions\", \"total_count\": 2, \"+1\": 2, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 912485040, "node_id": "MDU6SXNzdWU5MTI0ODUwNDA=", "number": 1361, "title": "Intermittent CI failure: restore_working_directory FileNotFoundError", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 4, "created_at": "2021-06-05T22:48:13Z", "updated_at": "2021-06-05T23:16:24Z", "closed_at": "2021-06-05T23:16:24Z", "author_association": "OWNER", "pull_request": null, "body": "e.g. 
in https://github.com/simonw/datasette/runs/2754772233 - this is an intermittent error:\r\n```\r\n__________ ERROR at setup of test_hook_register_routes_render_message __________\r\n[gw0] linux -- Python 3.8.10 /opt/hostedtoolcache/Python/3.8.10/x64/bin/python\r\n\r\ntmpdir = local('/tmp/pytest-of-runner/pytest-0/popen-gw0/test_hook_register_routes_rend0')\r\nrequest = >\r\n\r\n @pytest.fixture\r\n def restore_working_directory(tmpdir, request):\r\n> previous_cwd = os.getcwd()\r\nE FileNotFoundError: [Errno 2] No such file or directory\r\n```", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1361/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 912464443, "node_id": "MDU6SXNzdWU5MTI0NjQ0NDM=", "number": 1360, "title": "Security flaw, to be fixed in 0.56.1 and 0.57", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 2, "created_at": "2021-06-05T21:53:51Z", "updated_at": "2021-06-05T22:23:23Z", "closed_at": "2021-06-05T22:22:06Z", "author_association": "OWNER", "pull_request": null, "body": "See security advisory here for details: https://github.com/simonw/datasette/security/advisories/GHSA-xw7c-jx9m-xh5g - the `?_trace=1` debugging option was not correctly escaping its JSON output, resulting in a [reflected cross-site scripting](https://owasp.org/www-community/attacks/xss/#reflected-xss-attacks) vulnerability.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1360/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 912418094, "node_id": "MDU6SXNzdWU5MTI0MTgwOTQ=", "number": 1358, "title": "Release Datasette 0.57", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2021-06-05T19:56:13Z", "updated_at": "2021-06-05T22:20:07Z", "closed_at": "2021-06-05T22:20:07Z", "author_association": "OWNER", "pull_request": null, "body": "Need release notes. 
Changes are here: https://github.com/simonw/datasette/compare/0.56...368aa5f1b16ca35f82d90ff747023b9a2bfa27c1\r\n\r\nPartial release notes already exist for the two alphas, https://github.com/simonw/datasette/releases/tag/0.57a0 and https://github.com/simonw/datasette/releases/tag/0.57a1", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1358/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 912419349, "node_id": "MDU6SXNzdWU5MTI0MTkzNDk=", "number": 1359, "title": "`?_trace=1` should only be available with a new `trace_debug` setting", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 0, "created_at": "2021-06-05T19:59:27Z", "updated_at": "2021-06-05T20:18:46Z", "closed_at": "2021-06-05T20:18:46Z", "author_association": "OWNER", "pull_request": null, "body": "Just like template debug mode is controlled by this off-by-default setting: https://github.com/simonw/datasette/blob/368aa5f1b16ca35f82d90ff747023b9a2bfa27c1/datasette/app.py#L160-L164", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1359/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 912394511, "node_id": "MDExOlB1bGxSZXF1ZXN0NjYyNTU3MjQw", "number": 1357, "title": "Make custom pages compatible with base_url setting", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2021-06-05T18:54:39Z", "updated_at": "2021-06-05T18:59:54Z", "closed_at": "2021-06-05T18:59:54Z", "author_association": "OWNER", "pull_request": "simonw/datasette/pulls/1357", "body": "Refs #1238.", "repo": {"value": 107914493, "label": "datasette"}, "type": "pull", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1357/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": 0, "state_reason": null} {"id": 906356331, "node_id": "MDU6SXNzdWU5MDYzNTYzMzE=", "number": 263, "title": "`sqlite-utils indexes` command", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 6, "created_at": "2021-05-29T04:52:34Z", "updated_at": "2021-06-03T04:34:38Z", "closed_at": "2021-06-03T04:34:38Z", "author_association": "OWNER", "pull_request": null, "body": "While working on #260 I realized there's no command to show indexes in a database, even though there is one for showing tables and one for triggers.\r\n\r\nI should implement #261 first.", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/263/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, 
\"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 906345899, "node_id": "MDU6SXNzdWU5MDYzNDU4OTk=", "number": 261, "title": "`table.xindexes` using `PRAGMA index_xinfo(table)`", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 5, "created_at": "2021-05-29T04:23:48Z", "updated_at": "2021-06-03T03:54:14Z", "closed_at": "2021-06-03T03:51:32Z", "author_association": "OWNER", "pull_request": null, "body": "> `PRAGMA index_xinfo(table)` DOES return that data:\r\n> ```\r\n> (Pdb) [c[0] for c in fresh_db.execute(\"PRAGMA > index_xinfo('idx_dogs_age_name')\").description]\r\n> ['seqno', 'cid', 'name', 'desc', 'coll', 'key']\r\n> (Pdb) fresh_db.execute(\"PRAGMA index_xinfo('idx_dogs_age_name')\").fetchall()\r\n> [(0, 2, 'age', 1, 'BINARY', 1), (1, 0, 'name', 0, 'BINARY', 1), (2, -1, None, 0, 'BINARY', 0)]\r\n> ```\r\n> See https://sqlite.org/pragma.html#pragma_index_xinfo\r\n> \r\n> Example output: https://covid-19.datasettes.com/covid?sql=select+*+from+pragma_index_xinfo%28%27idx_ny_times_us_counties_date%27%29\r\n_Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/260#issuecomment-850766552_", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/261/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 904537568, "node_id": "MDExOlB1bGxSZXF1ZXN0NjU1Njg0NDc3", "number": 1346, "title": "Re-display user's query with an error message if an error occurs", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2021-05-28T02:04:20Z", "updated_at": "2021-06-02T03:46:21Z", "closed_at": "2021-06-02T03:46:21Z", "author_association": "OWNER", "pull_request": "simonw/datasette/pulls/1346", "body": "Refs #619", "repo": {"value": 107914493, "label": "datasette"}, "type": "pull", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1346/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": 0, "state_reason": null} {"id": 828811618, "node_id": "MDU6SXNzdWU4Mjg4MTE2MTg=", "number": 1257, "title": "Table names containing single quotes break things", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 2, "created_at": "2021-03-11T06:29:38Z", "updated_at": "2021-06-02T03:28:29Z", "closed_at": "2021-06-02T03:28:29Z", "author_association": "OWNER", "pull_request": null, "body": "e.g. 
I found a table called `Yesterday's ELRs by County`\r\n\r\nIt threw an error inside the `detect_fts()` function attempting to run this SQL query:\r\n\r\n```sql\r\n select name from sqlite_master\r\n where rootpage = 0\r\n and (\r\n sql like '%VIRTUAL TABLE%USING FTS%content=\"Yesterday's ELRs by County\"%'\r\n or sql like '%VIRTUAL TABLE%USING FTS%content=[Yesterday's ELRs by County]%'\r\n or (\r\n tbl_name = \"Yesterday's ELRs by County\"\r\n and sql like '%VIRTUAL TABLE%USING FTS%'\r\n )\r\n )\r\n```\r\nHere's the code at fault: https://github.com/simonw/datasette/blob/640ac7071b73111ba4423812cd683756e0e1936b/datasette/utils/__init__.py#L534-L548", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1257/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 800669347, "node_id": "MDU6SXNzdWU4MDA2NjkzNDc=", "number": 1216, "title": "/-/databases should reflect connection order, not alphabetical order", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2021-02-03T20:20:23Z", "updated_at": "2021-06-02T03:10:19Z", "closed_at": "2021-06-02T03:10:19Z", "author_association": "OWNER", "pull_request": null, "body": "The order in which databases are attached to Datasette matters - it affects the homepage, and it's beginning to influence how certain plugins work (see https://github.com/simonw/datasette-tiles/issues/8).\r\n\r\nTwo years ago in cccea85be6aaaeadb31f3b588ec7f732628815f5 I made `/-/databases` return things in alphabetical order, to fix a test failure in Python 3.5.\r\n\r\nPython 3.5 is no longer supported, so this is no longer necessary - and this behaviour should now be treated as a bug.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1216/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 323671577, "node_id": "MDU6SXNzdWUzMjM2NzE1Nzc=", "number": 263, "title": "Facets should not execute for ?shape=array|object", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2018-05-16T15:26:13Z", "updated_at": "2021-06-02T02:54:34Z", "closed_at": "2021-06-02T02:54:34Z", "author_association": "OWNER", "pull_request": null, "body": "Split off from #255 - there's no point executing the facet SQL for the `?_shape=array` and `?_shape=object` API responses.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/263/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 906977719, "node_id": "MDU6SXNzdWU5MDY5Nzc3MTk=", "number": 1350, "title": "?_nofacets=1 query string argument for disabling facets and suggested facets", "user": 
{"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 2, "created_at": "2021-05-31T02:22:29Z", "updated_at": "2021-06-01T16:19:38Z", "closed_at": "2021-05-31T02:39:18Z", "author_association": "OWNER", "pull_request": null, "body": "This is needed as an internal option for #1349. `datasette-graphql` can benefit from this too - maybe can even use it so that if you pass `?_shape=array` it gets automatically added, fixing #263.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1350/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 908446997, "node_id": "MDU6SXNzdWU5MDg0NDY5OTc=", "number": 1353, "title": "?_nocount=1 for opting out of table counts", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 2, "created_at": "2021-06-01T15:53:27Z", "updated_at": "2021-06-01T16:18:54Z", "closed_at": "2021-06-01T16:17:04Z", "author_association": "OWNER", "pull_request": null, "body": "Running a trace against a CSV streaming export with the new `_trace=1` feature from #1351 shows that the following code is executing a `select count(*) from table` for every page of results returned: https://github.com/simonw/datasette/blob/d1d06ace49606da790a765689b4fbffa4c6deecb/datasette/views/table.py#L700-L705\r\n\r\nThis is inefficient - a new `?_nocount=1` option would let us disable this count in the same way as #1349: https://github.com/simonw/datasette/blob/d1d06ace49606da790a765689b4fbffa4c6deecb/datasette/views/base.py#L264-L276\r\n\r\n", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1353/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 908465747, "node_id": "MDU6SXNzdWU5MDg0NjU3NDc=", "number": 1354, "title": "Update help in tests for latest Click", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2021-06-01T16:14:31Z", "updated_at": "2021-06-01T16:17:04Z", "closed_at": "2021-06-01T16:17:04Z", "author_association": "OWNER", "pull_request": null, "body": "Now that Uvicorn 0.14 is out with an unpinned Click dependency - https://github.com/encode/uvicorn/pull/1033 - our test suite runs against Click 8.0 - which subtly changes the output of `--help` causing test failures: https://github.com/simonw/datasette/runs/2720383031?check_suite_focus=true\r\n```\r\n def test_help_includes(name, filename):\r\n expected = (docs_path / filename).read_text()\r\n runner = CliRunner()\r\n result = runner.invoke(cli, name.split() + [\"--help\"], terminal_width=88)\r\n actual = f\"$ datasette {name} --help\\n\\n{result.output}\"\r\n # actual has \"Usage: cli package [OPTIONS] FILES\"\r\n # because it doesn't know that cli will be aliased to datasette\r\n expected = expected.replace(\"Usage: datasette\", \"Usage: cli\")\r\n> assert expected == actual\r\nE AssertionError: assert '$ datasette ...e and 
exit.\\n' == '$ datasette ...e and exit.\\n'\r\nE Skipping 848 identical leading characters in diff, use -v to show\r\nE nt_id xxx\r\nE + \r\nE --version-note TEXT Additional note to show on /-/versions\r\nE --secret TEXT Secret used for signing secure values, such as signed\r\nE cookies\r\nE + \r\nE --title TEXT Title for metadata\r\n```", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1354/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 904071938, "node_id": "MDU6SXNzdWU5MDQwNzE5Mzg=", "number": 1345, "title": "?_nocol= does not interact well with default facets", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 7, "created_at": "2021-05-27T18:39:55Z", "updated_at": "2021-05-31T02:40:44Z", "closed_at": "2021-05-31T02:31:21Z", "author_association": "OWNER", "pull_request": null, "body": "Clicking \"Hide this column\" on `fips` on https://covid-19.datasettes.com/covid/ny_times_us_counties shows this error:\r\n\r\nhttps://covid-19.datasettes.com/covid/ny_times_us_counties?_nocol=fips\r\n\r\n> ## Invalid SQL\r\n> no such column: fips\r\n\r\nThe reason is that https://covid-19.datasettes.com/-/metadata sets up the following:\r\n\r\n```json\r\n \"ny_times_us_counties\": {\r\n \"sort_desc\": \"date\",\r\n \"facets\": [\r\n \"state\",\r\n \"county\",\r\n \"fips\"\r\n ],\r\n```\r\nIt's setting `fips` as a default facet, which breaks if you attempt to remove the column using `?_nocol`.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1345/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 838148087, "node_id": "MDU6SXNzdWU4MzgxNDgwODc=", "number": 250, "title": "Handle byte order marks (BOMs) in CSV files", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2021-03-22T22:13:18Z", "updated_at": "2021-05-29T05:34:21Z", "closed_at": "2021-05-29T05:34:21Z", "author_association": "OWNER", "pull_request": null, "body": "I often find `sqlite-utils insert ... --csv` creates a first column with a weird character at the start of it - which it turns out is the UTF-8 BOM. 
Fix that.", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/250/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 906355849, "node_id": "MDExOlB1bGxSZXF1ZXN0NjU3MzczNzI2", "number": 262, "title": "Ability to add descending order indexes", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 0, "created_at": "2021-05-29T04:51:04Z", "updated_at": "2021-05-29T05:01:42Z", "closed_at": "2021-05-29T05:01:39Z", "author_association": "OWNER", "pull_request": "simonw/sqlite-utils/pulls/262", "body": "Refs #260", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "pull", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/262/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": 0, "state_reason": null} {"id": 906330187, "node_id": "MDU6SXNzdWU5MDYzMzAxODc=", "number": 260, "title": "Support creating descending order indexes", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 12, "created_at": "2021-05-29T03:42:59Z", "updated_at": "2021-05-29T05:01:39Z", "closed_at": "2021-05-29T05:01:39Z", "author_association": "OWNER", "pull_request": null, "body": "SQLite lets you create indexes in reverse order, which can have a surprisingly big impact on performance, see https://github.com/simonw/covid-19-datasette/issues/27\r\n\r\nI tried doing this using `sqlite-utils` like so, but it's didn't work:\r\n\r\n```python\r\ndb[\"ny_times_us_counties\"].create_index([\"date desc\"])\r\n```", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/260/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 858501079, "node_id": "MDU6SXNzdWU4NTg1MDEwNzk=", "number": 255, "title": "transform --help should tell you the available types", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 0, "created_at": "2021-04-15T05:24:48Z", "updated_at": "2021-05-29T03:55:52Z", "closed_at": "2021-05-29T03:55:52Z", "author_association": "OWNER", "pull_request": null, "body": "```\r\nUsage: sqlite-utils transform [OPTIONS] PATH TABLE\r\n\r\n Transform a table beyond the capabilities of ALTER TABLE\r\n\r\nOptions:\r\n --type ... 
Change column type to X\r\n```\r\nThis should specify that the possible types are 'INTEGER', 'TEXT', 'FLOAT', 'BLOB'.", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/255/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 903978133, "node_id": "MDU6SXNzdWU5MDM5NzgxMzM=", "number": 1343, "title": "Figure out how to publish alpha/beta releases to Docker Hub", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 4, "created_at": "2021-05-27T16:42:17Z", "updated_at": "2021-05-27T16:46:37Z", "closed_at": "2021-05-27T16:45:41Z", "author_association": "OWNER", "pull_request": null, "body": "> It looks like all I need to do to ship an alpha version to Docker Hub is NOT point the `latest` tag at it after it goes live: https://github.com/simonw/datasette/blob/1a8972f9c012cd22b088c6b70661a9c3d3847853/.github/workflows/publish.yml#L75-L77\r\n\r\n_Originally posted by @simonw in https://github.com/simonw/datasette/issues/1319#issuecomment-849780481_", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1343/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 898904402, "node_id": "MDU6SXNzdWU4OTg5MDQ0MDI=", "number": 1337, "title": "\"More\" link for facets that shows _facet_size=max results", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 7, "created_at": "2021-05-23T00:08:51Z", "updated_at": "2021-05-27T16:14:14Z", "closed_at": "2021-05-27T16:01:03Z", "author_association": "OWNER", "pull_request": null, "body": "_Original title: \"More\" link for facets that shows the full set of results_\r\n\r\nThe simplest way to do this will be to have it link to a generated SQL query.\r\n\r\n_Originally posted by @simonw in https://github.com/simonw/datasette/issues/1332#issuecomment-846479062_", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1337/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 903200328, "node_id": "MDU6SXNzdWU5MDMyMDAzMjg=", "number": 1341, "title": "\"Show all columns\" cog menu item should show if ?_col= is used", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2021-05-27T04:28:17Z", "updated_at": "2021-05-27T04:31:16Z", "closed_at": "2021-05-27T04:31:16Z", "author_association": "OWNER", "pull_request": null, "body": "On https://latest.datasette.io/fixtures/sortable?_col=sortable the \"Show all columns\" item (from #615) is not shown (it should be):\r\n\r\n\"fixtures__sortable__201_rows\"\r\n", "repo": {"value": 107914493, "label": 
"datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1341/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 517451234, "node_id": "MDU6SXNzdWU1MTc0NTEyMzQ=", "number": 615, "title": "?_col= and ?_nocol= support for toggling columns on table view", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 16, "created_at": "2019-11-04T22:55:41Z", "updated_at": "2021-05-27T04:26:10Z", "closed_at": "2021-05-27T04:17:44Z", "author_association": "OWNER", "pull_request": null, "body": "Split off from #292 (I guess this is a re-opening of #312).", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/615/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 326800219, "node_id": "MDU6SXNzdWUzMjY4MDAyMTk=", "number": 292, "title": "Mechanism for customizing the SQL used to select specific columns in the table view", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 15, "created_at": "2018-05-27T09:05:52Z", "updated_at": "2021-05-27T04:25:01Z", "closed_at": "2021-05-27T04:25:01Z", "author_association": "OWNER", "pull_request": null, "body": "Some columns don't make a lot of sense in their default representation - binary blobs such as SpatiaLite geometries for example, or lengthy columns that really should be truncated somehow.\r\n\r\nWe may also find that there are tables where we don't want to show all of the columns - so a mechanism to select a subset of columns would be nice.\r\n\r\nI think there are two features here:\r\n\r\n* the ability to request a subset of columns on the table view\r\n* the ability to override the SQL for a specific column and/or add extra columns - `AsGeoJSON(Geometry)` for example\r\n\r\nBoth features should be available via both querystring arguments and in `metadata.json`\r\n\r\nThe querystring argument for custom SQL should only work if `allow_sql` config is turned on.\r\n\r\nRefs #276", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/292/reactions\", \"total_count\": 2, \"+1\": 2, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 899851083, "node_id": "MDExOlB1bGxSZXF1ZXN0NjUxNDkyODg4", "number": 1339, "title": "?_col=/?_nocol= to show/hide columns on the table page", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2021-05-24T17:15:20Z", "updated_at": "2021-05-27T04:17:44Z", "closed_at": "2021-05-27T04:17:43Z", "author_association": "OWNER", "pull_request": "simonw/datasette/pulls/1339", "body": "See #615. 
Still to do:\r\n\r\n- [x] Allow combination of `?_col=` and `?_nocol=` (`_nocol` wins)\r\n- [x] Deduplicate same column if passed in `?_col=` multiple times\r\n- [x] Validate that user did not try to remove a primary key\r\n- [x] Add tests\r\n- [x] Ensure this works correctly for SQL views\r\n- [x] Add documentation\r\n", "repo": {"value": 107914493, "label": "datasette"}, "type": "pull", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1339/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": 0, "state_reason": null} {"id": 899169307, "node_id": "MDU6SXNzdWU4OTkxNjkzMDc=", "number": 1338, "title": "Fix jinja2 warnings", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 0, "created_at": "2021-05-24T01:38:23Z", "updated_at": "2021-05-24T01:41:55Z", "closed_at": "2021-05-24T01:41:55Z", "author_association": "OWNER", "pull_request": null, "body": "Lots of these in the test suite now, after the Jinja upgrade in #1331:\r\n```\r\ntests/test_plugins.py::test_hook_render_cell_link_from_json\r\n datasette/tests/plugins/my_plugin_2.py:45: DeprecationWarning: 'jinja2.escape' is deprecated and will be removed in Jinja 3.1. Import 'markupsafe.escape' instead.\r\n label=jinja2.escape(data[\"label\"] or \"\") or \" \",\r\n\r\ntests/test_plugins.py::test_hook_render_cell_link_from_json\r\n datasette/tests/plugins/my_plugin_2.py:41: DeprecationWarning: 'jinja2.Markup' is deprecated and will be removed in Jinja 3.1. Import 'markupsafe.Markup' instead.\r\n return jinja2.Markup(\r\n```", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1338/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"}
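The DeprecationWarnings quoted in #1338 above point at a small migration: import `escape` and `Markup` from `markupsafe` rather than from `jinja2`. Below is a minimal sketch of that change under stated assumptions; the `render_cell` hook implementation and the HTML it builds are illustrative only, not the actual test plugin from the datasette repository.

```python
# Sketch of the jinja2 -> markupsafe migration suggested by the warnings in #1338.
# The render_cell function and its HTML are hypothetical; only the imports matter.
from markupsafe import Markup, escape  # previously: jinja2.Markup / jinja2.escape


def render_cell(value):
    # Escape untrusted text first, then mark the assembled HTML as safe.
    label = escape(value or "") or "&nbsp;"
    return Markup('<span class="label">{}</span>'.format(label))
```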