{"id": 628121234, "node_id": "MDU6SXNzdWU2MjgxMjEyMzQ=", "number": 788, "title": " /-/permissions debugging tool", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": {"value": 5512395, "label": "Datasette 0.44"}, "comments": 2, "created_at": "2020-06-01T03:13:47Z", "updated_at": "2020-06-06T00:43:40Z", "closed_at": "2020-06-01T05:01:01Z", "author_association": "OWNER", "pull_request": null, "body": "> Debugging tool idea: `/-/permissions` page which shows you the actor and lets you type in the strings for `action`, `resource_type` and `resource_identifier` - then shows you EVERY plugin hook that would have executed and what it would have said, plus when the chain would have terminated.\r\n>\r\n> Bonus: if you're logged in as the `root` user (or a user that matches some kind of permission check, maybe a check for `permissions_debug`) you get to see a rolling log of the last 30 permission checks and what the results were across the whole of Datasette. This should make figuring out permissions policies a whole lot easier.\r\n\r\n_Originally posted by @simonw in https://github.com/simonw/datasette/issues/699#issuecomment-636576603_", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/788/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 803356942, "node_id": "MDU6SXNzdWU4MDMzNTY5NDI=", "number": 1218, "title": " /usr/local/opt/python3/bin/python3.6: bad interpreter: No such file or directory", "user": {"value": 11855322, "label": "robmarkcole"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2021-02-08T09:07:00Z", "updated_at": "2021-02-23T12:12:17Z", "closed_at": null, "author_association": "NONE", "pull_request": null, "body": "Error as above, however I do have python3.8 and the readme indicates this is supported.\r\n\r\n```\r\n(venv) (base) Robins-MacBook:datasette robin$ ls /usr/local/opt/python3/bin/\r\n\r\n.. pip3 python3 python3.8\r\n```", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1218/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1871935751, "node_id": "I_kwDOD079W85vk3kH", "number": 40, "title": " ImportError: cannot import name 'formatargspec' from 'inspect'", "user": {"value": 36752421, "label": "hosslikw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 0, "created_at": "2023-08-29T15:36:31Z", "updated_at": "2023-08-31T03:18:07Z", "closed_at": "2023-08-31T03:18:06Z", "author_association": "NONE", "pull_request": null, "body": "I get the following error when running \"pip3 install dogsheep-photos\"\r\n\" from inspect import ismethod, isclass, formatargspec\r\n ImportError: cannot import name 'formatargspec' from 'inspect' (/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/inspect.py). 
Did you mean: 'formatargvalues'?\"\r\n \r\nPython 3.12.0rc1\r\nsqlite 3.43.0\r\ndatasette, version 0.64.3", "repo": {"value": 256834907, "label": "dogsheep-photos"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/dogsheep/dogsheep-photos/issues/40/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 1622640374, "node_id": "I_kwDOCGYnMM5gt4b2", "number": 534, "title": " ResourceWarning: unclosed file", "user": {"value": 1244826, "label": "djhenderson"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2023-03-14T03:02:18Z", "updated_at": "2023-05-08T19:56:29Z", "closed_at": "2023-05-08T19:56:29Z", "author_association": "NONE", "pull_request": null, "body": "Issuing either\r\n\r\n```\r\npy -Wdefault -m sqlite_utils insert dogs.db dogs dogs0.csv --csv\r\n [#############-----------------------] 36%\r\n [####################################] 100%C:\\Users\\Doug\\AppData\\Local\\Programs\\Python\\Python311\\Lib\\site-packages\\sqlite_utils\\cli.py:1187: ResourceWarning: unclosed file <_io.TextIOWrapper name='dogs0.csv' encoding='utf-8-sig'>\r\n insert_upsert_implementation(\r\nResourceWarning: Enable tracemalloc to get the object allocation traceback\r\n```\r\nor\r\n```\r\nset pythonwarnings=default\r\nsqlite-utils insert dogs.db dogs dogs0.csv --csv\r\n [#############-----------------------] 36%\r\n [####################################] 100%C:\\Users\\Doug\\AppData\\Local\\Programs\\Python\\Python311\\Lib\\site-packages\\sqlite_utils\\cli.py:1187: ResourceWarning: unclosed file <_io.TextIOWrapper name='dogs0.csv' encoding='utf-8-sig'>\r\n insert_upsert_implementation(\r\nResourceWarning: Enable tracemalloc to get the object allocation traceback\r\n```\r\n\r\nexhibits a ResourceWarning indicating that the CSV file being loaded is not closed.\r\n\r\nsqlite-utils --version\r\nsqlite-utils, version 3.30\r\npy --version\r\nPython 3.11.2\r\nWindows Version 10.0.19045 Build 19045\r\nSQLite version 3.41.0 2023-02-21 18:09:37\r\n", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/534/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 1943259395, "node_id": "I_kwDOEhK-wc5z08kD", "number": 16, "title": " time data '2014-11-21T11:44:12.000Z' does not match format '%Y%m%dT%H%M%SZ'", "user": {"value": 3746270, "label": "linonetwo"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 0, "created_at": "2023-10-14T13:24:39Z", "updated_at": "2023-10-14T13:24:39Z", "closed_at": null, "author_association": "NONE", "pull_request": null, "body": "\r\n```\r\nevernote-to-sqlite enex evernote.db ./\u6211\u7684\u7b14\u8bb0.enex\r\nImporting from ENEX [#####-------------------------------] 14%\r\nTraceback (most recent call last):\r\n File \"/usr/local/bin/evernote-to-sqlite\", line 8, in \r\n sys.exit(cli())\r\n ^^^^^\r\n File \"/usr/local/lib/python3.11/site-packages/click/core.py\", line 1157, in __call__\r\n return self.main(*args, **kwargs)\r\n ^^^^^^^^^^^^^^^^^^^^^^^^^^\r\n File 
\"/usr/local/lib/python3.11/site-packages/click/core.py\", line 1078, in main\r\n rv = self.invoke(ctx)\r\n ^^^^^^^^^^^^^^^^\r\n File \"/usr/local/lib/python3.11/site-packages/click/core.py\", line 1688, in invoke\r\n return _process_result(sub_ctx.command.invoke(sub_ctx))\r\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\r\n File \"/usr/local/lib/python3.11/site-packages/click/core.py\", line 1434, in invoke\r\n return ctx.invoke(self.callback, **ctx.params)\r\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\r\n File \"/usr/local/lib/python3.11/site-packages/click/core.py\", line 783, in invoke\r\n return __callback(*args, **kwargs)\r\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^\r\n File \"/usr/local/lib/python3.11/site-packages/evernote_to_sqlite/cli.py\", line 31, in enex\r\n save_note(db, note)\r\n File \"/usr/local/lib/python3.11/site-packages/evernote_to_sqlite/utils.py\", line 46, in save_note\r\n \"created\": convert_datetime(created),\r\n ^^^^^^^^^^^^^^^^^^^^^^^^^\r\n File \"/usr/local/lib/python3.11/site-packages/evernote_to_sqlite/utils.py\", line 111, in convert_datetime\r\n return datetime.datetime.strptime(s, \"%Y%m%dT%H%M%SZ\").isoformat()\r\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\r\n File \"/usr/local/Cellar/python@3.11/3.11.5/Frameworks/Python.framework/Versions/3.11/lib/python3.11/_strptime.py\", line 568, in _strptime_datetime\r\n tt, fraction, gmtoff_fraction = _strptime(data_string, format)\r\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\r\n File \"/usr/local/Cellar/python@3.11/3.11.5/Frameworks/Python.framework/Versions/3.11/lib/python3.11/_strptime.py\", line 349, in _strptime\r\n raise ValueError(\"time data %r does not match format %r\" %\r\nValueError: time data '2014-11-21T11:44:12.000Z' does not match format '%Y%m%dT%H%M%SZ'\r\n```\r\n\r\nenex is exported by evernote mac client ", "repo": {"value": 303218369, "label": "evernote-to-sqlite"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/dogsheep/evernote-to-sqlite/issues/16/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1524867951, "node_id": "I_kwDOBm6k_c5a46Nv", "number": 1980, "title": "\"Cannot sort table by id\" when sortable_columns is used", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 2, "created_at": "2023-01-09T03:21:33Z", "updated_at": "2023-01-09T03:23:53Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "I had an instance with this in `metadata.yml`:\r\n\r\n```yaml\r\ndatabases:\r\n timezones:\r\n tables:\r\n timezones:\r\n sortable_columns:\r\n - tzid\r\n```\r\nWhen I clicked on the \"Apply\" button here:\r\n\r\n\"image\"\r\n\r\nIt sent me to `/timezones/timezones?_sort=id&id__exact=133` with the error message:\r\n\r\n> 500: Cannot sort table by id", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1980/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1223527226, "node_id": "I_kwDOBm6k_c5I7Ys6", "number": 1738, "title": "\"Cannot use _sort and _sort_desc at the same time\"", "user": {"value": 9599, "label": 
"simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": {"value": 8303187, "label": "Datasette 0.62"}, "comments": 2, "created_at": "2022-05-03T01:06:24Z", "updated_at": "2022-08-14T16:13:55Z", "closed_at": "2022-08-14T16:13:55Z", "author_association": "OWNER", "pull_request": null, "body": "Triggered this error while playing with the sort desc checkbox and the apply button that are only visible on this page at mobile screen width:\r\n\r\nhttps://latest.datasette.io/fixtures/compound_three_primary_keys?_sort_desc=pk1\r\n\r\nNavigate to that page (with the browser narrow enough to show the box), un-check the box and click Apply:\r\n\r\n![sort-bug](https://user-images.githubusercontent.com/9599/166390804-cb289b29-63dc-4986-b7f9-81cf2ae04914.gif)\r\n\r\nAlso notable: I managed to get to a page with `?_sort_desk=pk1` in the URL three times by clicking around with that button.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1738/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 1665053646, "node_id": "I_kwDOBm6k_c5jPrPO", "number": 2059, "title": "\"Deceptive site ahead\" alert on Heroku deployment", "user": {"value": 1186275, "label": "mtdukes"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2023-04-12T18:34:51Z", "updated_at": "2023-04-13T01:13:01Z", "closed_at": null, "author_association": "NONE", "pull_request": null, "body": "I deployed a fairly basic instance of Datasette (`datasette-auth-passwords` is the only plugin) using Heroku. The deployed URL now gives a \"Deceptive site ahead\" warning to users.\r\n\r\nIs there way around this? Maybe a way to add ownership verification [through Google's search console](https://search.google.com/search-console/welcome)? ", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/2059/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 721050815, "node_id": "MDU6SXNzdWU3MjEwNTA4MTU=", "number": 1019, "title": "\"Edit SQL\" button on canned queries", "user": {"value": 639012, "label": "jsfenfen"}, "state": "closed", "locked": 0, "assignee": null, "milestone": {"value": 6026070, "label": "0.51"}, "comments": 7, "created_at": "2020-10-14T00:51:39Z", "updated_at": "2020-10-23T19:44:06Z", "closed_at": "2020-10-14T03:44:23Z", "author_association": "CONTRIBUTOR", "pull_request": null, "body": "Feature request: Would it be possible to add an \"edit this query\" button on canned queries? Clicking it would open the canned query as an editable sql query. I think the intent is to have named parameters to allow this, but sometimes you just gotta rewrite it? 
", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1019/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 1180427792, "node_id": "I_kwDOCGYnMM5GW-YQ", "number": 421, "title": "\"Error: near \"(\": syntax error\" when using sqlite-utils indexes CLI", "user": {"value": 24938923, "label": "learning4life"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 8, "created_at": "2022-03-25T07:12:51Z", "updated_at": "2022-04-13T22:41:59Z", "closed_at": "2022-04-13T22:41:59Z", "author_association": "NONE", "pull_request": null, "body": "This bug relates to https://github.com/simonw/sqlite-utils/issues/408#issuecomment-1066139147\r\n\r\n**New error when using CLI: \"sqlite-utils indexes global.db --table\"**\r\n\r\n```\r\n(app-root) sqlite-utils indexes global.db --table\r\nError: near \"(\": syntax error\r\n(app-root) sqlite-utils --version\r\nsqlite-utils, version 3.25.1\r\n(app-root) sqlite3 --version\r\n3.36.0 2021-06-18 18:36:39\r\n(app-root) python --version\r\nPython 3.8.11\r\n```\r\n\r\n\r\nDockerfile\r\n```\r\nFROM centos/python-38-centos7\r\n\r\nUSER root\r\n\r\nRUN yum update -y\r\nRUN yum upgrade -y\r\n\r\n\r\n# epel\r\nRUN yum -y install epel-release && yum clean all\r\n\r\n# SQLite\r\nRUN yum -y install zlib-devel geos geos-devel proj proj-devel freexl freexl-devel libxml2-devel \r\n\r\nWORKDIR /build/\r\nCOPY sqlite-autoconf-3360000.tar.gz ./\r\nRUN tar -zxf sqlite-autoconf-3360000.tar.gz\r\nWORKDIR /build/sqlite-autoconf-3360000\r\nRUN ./configure\r\nRUN make\r\nRUN make install\r\n\r\n# \r\nRUN /opt/app-root/bin/python3.8 -m pip install --upgrade pip\r\nRUN pip install sqlite-utils\r\n```", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/421/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 520655983, "node_id": "MDU6SXNzdWU1MjA2NTU5ODM=", "number": 619, "title": "\"Invalid SQL\" page should let you edit the SQL", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 14, "created_at": "2019-11-10T20:54:12Z", "updated_at": "2022-01-13T22:21:42Z", "closed_at": "2021-06-02T04:15:54Z", "author_association": "OWNER", "pull_request": null, "body": "https://latest.datasette.io/fixtures?sql=select%0D%0A++*%0D%0Afrom%0D%0A++%5Bfoo%5D\r\n\r\n\r\n\r\nWould be useful if this page showed you the invalid SQL you entered so you can edit it and try again.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/619/reactions\", \"total_count\": 2, \"+1\": 2, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 1059509927, "node_id": "I_kwDOBm6k_c4_Jtan", "number": 1525, "title": "\"Links from other tables\" broken for columns starting with 
underscore", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2021-11-21T22:55:08Z", "updated_at": "2021-11-30T06:39:01Z", "closed_at": "2021-11-30T06:34:35Z", "author_association": "OWNER", "pull_request": null, "body": "Same bug as #1506, this time it's this link or the row page:\r\n\r\n\"image\"\r\n", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1525/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 647103735, "node_id": "MDU6SXNzdWU2NDcxMDM3MzU=", "number": 875, "title": "\"Logged in as: XXX - logout\" navigation item", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": {"value": 5533512, "label": "Datasette 0.45"}, "comments": 3, "created_at": "2020-06-29T04:31:14Z", "updated_at": "2020-07-02T00:13:24Z", "closed_at": "2020-06-29T18:43:50Z", "author_association": "OWNER", "pull_request": null, "body": "_Originally posted by @simonw in https://github.com/simonw/datasette/issues/840#issuecomment-650895874_", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/875/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 898904402, "node_id": "MDU6SXNzdWU4OTg5MDQ0MDI=", "number": 1337, "title": "\"More\" link for facets that shows _facet_size=max results", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 7, "created_at": "2021-05-23T00:08:51Z", "updated_at": "2021-05-27T16:14:14Z", "closed_at": "2021-05-27T16:01:03Z", "author_association": "OWNER", "pull_request": null, "body": "_Original title: \"More\" link for facets that shows the full set of results_\r\n\r\nThe simplest way to do this will be to have it link to a generated SQL query.\r\n\r\n_Originally posted by @simonw in https://github.com/simonw/datasette/issues/1332#issuecomment-846479062_", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1337/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 671056788, "node_id": "MDU6SXNzdWU2NzEwNTY3ODg=", "number": 914, "title": "\"Object of type bytes is not JSON serializable\" for _nl=on", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2020-08-01T17:43:10Z", "updated_at": "2020-08-16T21:10:27Z", "closed_at": "2020-08-16T18:26:59Z", "author_association": "OWNER", "pull_request": null, "body": "https://latest.datasette.io/fixtures/binary_data.json?_sort_desc=data&_shape=array returns this:\r\n```json\r\n[\r\n {\r\n \"rowid\": 1,\r\n \"data\": \"this is binary data\"\r\n 
}\r\n]\r\n```\r\nBut adding `&_nl=on` returns this: https://latest.datasette.io/fixtures/binary_data.json?_sort_desc=data&_shape=array&_nl=on\r\n```json\r\n{\r\n \"ok\": false,\r\n \"error\": \"Object of type bytes is not JSON serializable\",\r\n \"status\": 500,\r\n \"title\": null\r\n}\r\n```\r\nI found this error by running `wget -r 127.0.0.1:8001` against my local `fixtures.db`.\r\n\r\n", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/914/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 761713079, "node_id": "MDU6SXNzdWU3NjE3MTMwNzk=", "number": 1138, "title": "\"Powered by Datasette\" should link to new datasette.io site", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 0, "created_at": "2020-12-10T23:33:41Z", "updated_at": "2020-12-15T02:28:10Z", "closed_at": "2020-12-10T23:37:14Z", "author_association": "OWNER", "pull_request": null, "body": "https://datasette.io/", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1138/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 959999095, "node_id": "MDU6SXNzdWU5NTk5OTkwOTU=", "number": 1421, "title": "\"Query parameters\" form shows wrong input fields if query contains \"03:31\" style times", "user": {"value": 6988, "label": "j4mie"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 11, "created_at": "2021-08-04T07:29:04Z", "updated_at": "2021-08-09T03:41:07Z", "closed_at": "2021-08-09T03:33:02Z", "author_association": "NONE", "pull_request": null, "body": "Datasette version `0.58.1`.\r\n\r\nI'm guessing this is a bug in the code that looks for `:param`-style query parameters..\r\n\r\n\"image\"\r\n", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1421/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 1084007781, "node_id": "I_kwDOBm6k_c5AnKVl", "number": 1572, "title": "\"Query took\" should be \"Queries took\"", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": {"value": 7571612, "label": "Datasette 0.60"}, "comments": 0, "created_at": "2021-12-19T04:03:00Z", "updated_at": "2022-01-13T22:27:43Z", "closed_at": "2021-12-19T04:03:24Z", "author_association": "OWNER", "pull_request": null, "body": "This is misleading, since usually there have been more than one query executed:\r\n\r\n![CleanShot 2021-12-18 at 20 02 35@2x](https://user-images.githubusercontent.com/9599/146663457-9c4c2900-5cc0-4650-a565-bb1ff0b8a725.png)\r\n\r\n", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": 
"{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1572/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 275228834, "node_id": "MDU6SXNzdWUyNzUyMjg4MzQ=", "number": 136, "title": "\"Reformat SQL\" button next to SQL editor textarea", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 0, "created_at": "2017-11-20T03:42:19Z", "updated_at": "2019-10-14T03:46:13Z", "closed_at": "2019-10-14T03:46:13Z", "author_association": "OWNER", "pull_request": null, "body": "Can use this:\r\n\r\nhttps://github.com/zeroturnaround/sql-formatter\r\nhttps://zeroturnaround.github.io/sql-formatter/\r\n", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/136/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 903200328, "node_id": "MDU6SXNzdWU5MDMyMDAzMjg=", "number": 1341, "title": "\"Show all columns\" cog menu item should show if ?_col= is used", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2021-05-27T04:28:17Z", "updated_at": "2021-05-27T04:31:16Z", "closed_at": "2021-05-27T04:31:16Z", "author_association": "OWNER", "pull_request": null, "body": "On https://latest.datasette.io/fixtures/sortable?_col=sortable the \"Show all columns\" item (from #615) is not shown (it should be):\r\n\r\n\"fixtures__sortable__201_rows\"\r\n", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1341/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 784628163, "node_id": "MDU6SXNzdWU3ODQ2MjgxNjM=", "number": 1185, "title": "\"Statement may not contain PRAGMA\" error is not strictly true", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": {"value": 6346396, "label": "Datasette 0.54"}, "comments": 3, "created_at": "2021-01-12T22:07:10Z", "updated_at": "2021-01-24T21:21:37Z", "closed_at": "2021-01-12T22:26:26Z", "author_association": "OWNER", "pull_request": null, "body": "Consider https://latest.datasette.io/fixtures?sql=select+%27select%0D%0A%27+%7C%7C+group_concat%28%27++++case+when+%5B%27+%7C%7C+name+%7C%7C+%27%5D+is+not+null+then+%27+%7C%7C+quote%28name+%7C%7C+%27%2C+%27%29+%7C%7C+%27+else+%27%27%27%27+end%27%2C+%27+%7C%7C%0D%0A%27%29+%7C%7C+%27%0D%0A++as+columns%2C%0D%0A++count%28*%29+as+num_rows%0D%0Afrom%0D%0A++%5B%27+%7C%7C+%3Atable+%7C%7C+%27%5D%0D%0Agroup+by%0D%0A++columns%0D%0Aorder+by%0D%0A++num_rows+desc%27+as+query+from+pragma_ytable_info%28%3Atable%29&table=facetable\r\n\r\nIt says \"Statement may not contain PRAGMA\" - but that's not actually true. 
Datasette has an allow-list of PRAGMA that are OK - in this case there was a typo in `pragma_ytable_info` which caused the error, but pragma_table_info` would have been OK.\r\n\r\nSo the error message is misleading.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1185/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 763361458, "node_id": "MDU6SXNzdWU3NjMzNjE0NTg=", "number": 1142, "title": "\"Stream all rows\" is not at all obvious", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 9, "created_at": "2020-12-12T06:24:57Z", "updated_at": "2021-06-17T18:12:31Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "Got a question about how to download all rows - the current option isn't at all clear.\r\n\r\n\"loans__ppp_loans__9_511_rows_where_where_search_matches__tech__sorted_by_rowid\"\r\n", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1142/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 573583971, "node_id": "MDU6SXNzdWU1NzM1ODM5NzE=", "number": 689, "title": "\"Templates considered\" comment broken in >=0.35", "user": {"value": 35075, "label": "chrishas35"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 6, "created_at": "2020-03-01T17:31:21Z", "updated_at": "2020-04-05T19:39:44Z", "closed_at": "2020-04-05T19:39:44Z", "author_association": "NONE", "pull_request": null, "body": "Noticed that the \"Templates Considered\" comment is missing in 0.37. Believe I traced it back to #664 as you can see it in https://v0-34.datasette.io/ but not https://v0-35.datasette.io/. Looking at the template context debug between the two you can see what is missing from 0.35 vs. 
0.34:\r\n\r\n```diff\r\n< \"datasette_version\": \"0.34\",\r\n< \"app_css_hash\": \"ffa51a\",\r\n< \"select_templates\": [\r\n< \"*index.html\"\r\n< ],\r\n< \"zip\": \"\",\r\n< \"body_scripts\": [],\r\n< \"extra_css_urls\": \"\",\r\n< \"extra_js_urls\": \"\",\r\n< \"format_bytes\": \"\",\r\n< \"database_url\": \">\",\r\n< \"database_color\": \">\"\r\n---\r\n> \"datasette_version\": \"0.35\",\r\n> \"database_url\": \">\",\r\n> \"database_color\": \">\"\r\n```", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/689/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 1907655261, "node_id": "I_kwDOBm6k_c5xtIJd", "number": 2193, "title": "\"Test DATASETTE_LOAD_PLUGINS\" test shows errors but did not fail the CI run", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 6, "created_at": "2023-09-21T19:49:34Z", "updated_at": "2023-09-21T21:56:43Z", "closed_at": "2023-09-21T21:56:43Z", "author_association": "OWNER", "pull_request": null, "body": "> That passed on 3.8 but should have failed: https://github.com/simonw/datasette/actions/runs/6266341481/job/17017099801 - the \"Test DATASETTE_LOAD_PLUGINS\" test shows errors but did not fail the CI run.\r\n\r\n_Originally posted by @simonw in https://github.com/simonw/datasette/issues/2057#issuecomment-1730201226_\r\n ", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/2193/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 473083260, "node_id": "MDU6SXNzdWU0NzMwODMyNjA=", "number": 50, "title": "\"Too many SQL variables\" on large inserts", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 4, "created_at": "2019-07-25T21:43:31Z", "updated_at": "2022-11-04T14:38:36Z", "closed_at": "2019-07-28T11:59:33Z", "author_association": "OWNER", "pull_request": null, "body": "Reported here: https://github.com/dogsheep/healthkit-to-sqlite/issues/9\r\n\r\nIt looks like there's a default limit of 999 variables - we need to be smart about that, maybe dynamically lower the batch size based on the number of columns.", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/50/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 638241779, "node_id": "MDU6SXNzdWU2MzgyNDE3Nzk=", "number": 846, "title": "\"Too many open files\" error running tests", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 6, "created_at": "2020-06-13T22:11:40Z", "updated_at": "2020-06-14T00:26:31Z", "closed_at": "2020-06-14T00:26:31Z", "author_association": "OWNER", "pull_request": 
null, "body": "I got this on my laptop:\r\n```pytest\r\n...\r\n/Users/simon/.local/share/virtualenvs/datasette-AWNrQs95/lib/python3.7/site-packages/jinja2/loaders.py:171: in get_source\r\n f = open_if_exists(filename)\r\n_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ \r\n\r\nfilename = '/Users/simon/Dropbox/Development/datasette/datasette/templates/400.html', mode = 'rb'\r\n\r\n def open_if_exists(filename, mode='rb'):\r\n \"\"\"Returns a file descriptor for the filename if that file exists,\r\n otherwise `None`.\r\n \"\"\"\r\n try:\r\n> return open(filename, mode)\r\nE OSError: [Errno 24] Too many open files: '/Users/simon/Dropbox/Development/datasette/datasette/templates/400.html'\r\n\r\n/Users/simon/.local/share/virtualenvs/datasette-AWNrQs95/lib/python3.7/site-packages/jinja2/utils.py:154: OSError\r\n```\r\nBased on the conversation in https://github.com/pytest-dev/pytest/issues/2970 I'm worried that my tests are opening too many files without closing them.\r\n\r\nIn particular... I call `sqlite3.connect(filepath)` a LOT - and I don't ever call `conn.close()` on those opened connections:\r\n\r\nhttps://github.com/simonw/datasette/blob/cf7a2bdb404734910ec07abc7571351a2d934828/datasette/database.py#L58-L60\r\n\r\nCould this be resulting in my tests eventually opening too many unclosed file handles? How could I confirm this?", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/846/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 734777631, "node_id": "MDU6SXNzdWU3MzQ3Nzc2MzE=", "number": 1080, "title": "\"View all\" option for facets, to provide a (paginated) list of ALL of the facet counts plus a link to view them", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": {"value": 3268330, "label": "Datasette 1.0"}, "comments": 7, "created_at": "2020-11-02T19:55:06Z", "updated_at": "2022-02-04T06:25:18Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "Can use `/database/-/...` namespace from #296", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1080/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 853672224, "node_id": "MDU6SXNzdWU4NTM2NzIyMjQ=", "number": 1294, "title": "\"You can check out any time you like. But you can never leave!\"", "user": {"value": 192568, "label": "mroswell"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 0, "created_at": "2021-04-08T17:02:15Z", "updated_at": "2021-04-08T18:35:50Z", "closed_at": null, "author_association": "CONTRIBUTOR", "pull_request": null, "body": "(Feel free to rename this one.)\r\n\r\n- The column gear lets you \"Show not-blank rows.\" Then it places a parameter in the URL, which a web developer would notice, but a lot of users won't notice, or know to delete it. 
Would be good to toggle \"Show not-blank rows\" with \"Show all rows.\" (Also would be quite helpful to have a \"Show blank rows | Show all rows\" option)\r\n- The column gear lets you \"Sort ascending\" and \"Sort descending\" but then you're stuck with some sort of sorted version thereafter, unless you know to sort the ID column, or to remove the full _sort parameter and its value in the URL. Would be good to offer a \"Remove sort\" option in the gear.\r\n- These requests are in the same camp as: https://github.com/simonw/datasette-vega/issues/36\r\n- I suspect there are other url parameter instances where similar analysis would be helpful, but the three above are the use cases I've run across. \r\n\r\nUPDATE:\r\n- It would be helpful to have a \"Previous page\" available for all but the first table page.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1294/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 760312579, "node_id": "MDU6SXNzdWU3NjAzMTI1Nzk=", "number": 1134, "title": "\"_searchmode=raw\" throws an index out of range error when combined with \"_search_COLUMN\"", "user": {"value": 2181410, "label": "clausjuhl"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 4, "created_at": "2020-12-09T13:05:37Z", "updated_at": "2020-12-10T05:57:17Z", "closed_at": "2020-12-09T19:56:55Z", "author_association": "NONE", "pull_request": null, "body": "Hi Simon!\r\nMaybe it's just me, but when [using _searchmode=raw (trying to enable wildcard-searching) in combination with the \"_search_COLUMN\"-table argument](https://byraadsarkivet.aarhus.dk/db/cases?_searchmode=raw&_search_title=sundhedsfrem*), I get a list index out of range error. [When combining with the simpler \"_search\"-argument everything works, including wildcard-seaches.](https://byraadsarkivet.aarhus.dk/db/cases?_search=sundhedsfrem*&_searchmode=raw). 
Here's the traceback:\r\n\r\n```\r\nTraceback (most recent call last):\r\n File \"/Users/cjk/.local/share/virtualenvs/minutes-jMDZ8Ssk/lib/python3.7/site-packages/datasette/utils/asgi.py\", line 122, in route_path\r\n return await view(new_scope, receive, send)\r\n File \"/Users/cjk/.local/share/virtualenvs/minutes-jMDZ8Ssk/lib/python3.7/site-packages/datasette/utils/asgi.py\", line 196, in view\r\n request, **scope[\"url_route\"][\"kwargs\"]\r\n File \"/Users/cjk/.local/share/virtualenvs/minutes-jMDZ8Ssk/lib/python3.7/site-packages/datasette/views/base.py\", line 204, in get\r\n request, database, hash, correct_hash_provided, **kwargs\r\n File \"/Users/cjk/.local/share/virtualenvs/minutes-jMDZ8Ssk/lib/python3.7/site-packages/datasette/views/base.py\", line 342, in view_get\r\n request, database, hash, **kwargs\r\n File \"/Users/cjk/.local/share/virtualenvs/minutes-jMDZ8Ssk/lib/python3.7/site-packages/datasette/views/table.py\", line 393, in data\r\n search_col = key.split(\"_search_\", 1)[1]\r\nIndexError: list index out of range\r\n\r\n```", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1134/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 457147936, "node_id": "MDU6SXNzdWU0NTcxNDc5MzY=", "number": 512, "title": "\"about\" parameter in metadata does not appear when alone", "user": {"value": 7936571, "label": "chrismp"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2019-06-17T21:04:20Z", "updated_at": "2019-10-11T15:49:13Z", "closed_at": null, "author_association": "NONE", "pull_request": null, "body": "Here's an example of metadata I have for one database on datasette.\r\n\r\n```\r\n\"Records-requests\": {\r\n\t\"tables\": {\r\n\t\t\"Some table\": {\r\n\t\t\t\"about\": \"This table has data.\"\r\n\t\t}\r\n\t}\r\n}\r\n```\r\n\r\nThe text in `about` does not show up when I publish the data. 
But it shows up after I add a `\"source\"` parameter in the metadata.\r\n\r\nIs this intended?", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/512/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 665400224, "node_id": "MDU6SXNzdWU2NjU0MDAyMjQ=", "number": 906, "title": "\"allow\": true for anyone, \"allow\": false for nobody", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": {"value": 5607421, "label": "Datasette 0.46"}, "comments": 3, "created_at": "2020-07-24T20:28:10Z", "updated_at": "2020-07-25T00:07:10Z", "closed_at": "2020-07-25T00:05:04Z", "author_association": "OWNER", "pull_request": null, "body": "The \"allow\" syntax described at https://datasette.readthedocs.io/en/0.45/authentication.html#defining-permissions-with-allow-blocks currently says this:\r\n\r\n> An allow block can specify \"no-one is allowed to do this\" using an empty `{}`:\r\n> \r\n> ```\r\n> {\r\n> \"allow\": {}\r\n> }\r\n> ```\r\n\r\n`\"allow\": null` allows all access, though this isn't documented (it should be though).\r\n\r\nThese are not very intuitive. How about also supporting `\"allow\": true` for \"allow anyone\" and `\"allow\": false` for \"allow nobody\"?", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/906/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 1123849278, "node_id": "I_kwDOCGYnMM5C_JQ-", "number": 395, "title": "\"apt-get: command not found\" error on macOS", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2022-02-04T06:03:42Z", "updated_at": "2022-02-04T06:10:58Z", "closed_at": "2022-02-04T06:10:58Z", "author_association": "OWNER", "pull_request": null, "body": "Yeah, `apt-get` isn't a thing on macOS so 4a2a3e2fd0d5534f446b3f1fee34cb165e4d86d2 (to test #79 against real SpatiaLite) broke.", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/395/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 324720095, "node_id": "MDU6SXNzdWUzMjQ3MjAwOTU=", "number": 275, "title": "\"config\" section in metadata.json (root, database and table level)", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2018-05-20T16:02:28Z", "updated_at": "2023-08-23T01:28:37Z", "closed_at": "2023-08-23T01:28:37Z", "author_association": "OWNER", "pull_request": null, "body": "Split off from #274 \r\n\r\nMetadata should an optional `\"config\"` section at root, table or database level.\r\n\r\nThe TableView and RowView and DatabaseView and BaseView classes could all have a 
`.config(\"key\")` method which knows how to resolve the hierarchy of configs.\r\n\r\nThis will allow individual tables (or databases) to set their own config settings for things like `sql_time_limit_ms`", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/275/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 647095487, "node_id": "MDU6SXNzdWU2NDcwOTU0ODc=", "number": 873, "title": "\"datasette -p 0 --root\" gives the wrong URL", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 14, "created_at": "2020-06-29T04:03:06Z", "updated_at": "2020-08-18T17:26:10Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "```\r\n$ datasette -p 0 --root\r\nhttp://127.0.0.1:0/-/auth-token?token=2d498c...\r\n```\r\nThe port is incorrect.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/873/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 776128565, "node_id": "MDU6SXNzdWU3NzYxMjg1NjU=", "number": 1163, "title": "\"datasette insert data.db url-to-csv\"", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2020-12-29T23:21:21Z", "updated_at": "2021-06-17T18:12:32Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "Refs #1160 - get filesystem imports working first for #1162, then add import-from-URL.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1163/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 775666296, "node_id": "MDU6SXNzdWU3NzU2NjYyOTY=", "number": 1160, "title": "\"datasette insert\" command and plugin hook", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 23, "created_at": "2020-12-29T02:37:03Z", "updated_at": "2021-06-17T18:12:32Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "Tools for loading data into Datasette currently mostly exist as separate utilities - `yaml-to-sqlite` and `csvs-to-sqlite` and suchlike.\r\n\r\nBringing these into Datasette could have some interesting properties:\r\n\r\n- A `datasette insert` command could be extended with plugins to handle more formats\r\n- Any format that can be inserted on the command-line could also be inserted using a web UI or web API - which would benefit from new format plugin hooks\r\n- If Datasette ever grows beyond SQLite (see #670) a built-in import mechanism could work for those other databases as well - without me needing to write `yaml-to-postgresql` and suchlike", "repo": {"value": 107914493, "label": "datasette"}, 
"type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1160/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1102612922, "node_id": "I_kwDOBm6k_c5BuIm6", "number": 1597, "title": "\"datasette inspect\" has no help summary", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2022-01-14T00:02:16Z", "updated_at": "2022-01-14T00:07:36Z", "closed_at": "2022-01-14T00:07:36Z", "author_association": "OWNER", "pull_request": null, "body": "Made obvious by the new CLI reference page added in #1594. https://docs.datasette.io/en/latest/cli-reference.html#datasette-inspect-help\r\n```\r\nCommands:\r\n serve* Serve up specified SQLite database files with a web UI\r\n inspect\r\n install Install Python packages - e.g.\r\n```\r\n```\r\nUsage: datasette inspect [OPTIONS] [FILES]...\r\n\r\nOptions:\r\n --inspect-file TEXT\r\n --load-extension TEXT Path to a SQLite extension to load\r\n --help Show this message and exit.\r\n```", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1597/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 757481949, "node_id": "MDU6SXNzdWU3NTc0ODE5NDk=", "number": 1131, "title": "\"datasette inspect\" outputs invalid JSON if an error is logged", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2020-12-05T00:00:45Z", "updated_at": "2020-12-05T20:48:34Z", "closed_at": "2020-12-05T05:21:19Z", "author_association": "OWNER", "pull_request": null, "body": "See https://github.com/simonw/register-of-members-interests/issues/6:\r\n```\r\n% datasette inspect regmem.db \r\nERROR: conn=, sql = 'select count(*) from [items_fts]', params = None: SQL logic error\r\n{\r\n \"regmem\": {\r\n \"hash\": \"6fde27e3dea80d6b65f2ac7f89cd8448980fee8c91b505ba29c311ba0393317f\",\r\n \"size\": 936198144,\r\n```", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1131/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 677227912, "node_id": "MDU6SXNzdWU2NzcyMjc5MTI=", "number": 925, "title": "\"datasette install\" and \"datasette uninstall\" commands", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2020-08-11T22:04:32Z", "updated_at": "2020-08-11T22:34:37Z", "closed_at": "2020-08-11T22:32:12Z", "author_association": "OWNER", "pull_request": null, "body": "When installing Datasette plugins it's crucial that they end up in the same virtual environment as Datasette itself.\r\n\r\nIt's not necessarily obvious how to do this, especially if you install Datasette via pipx or homebrew.\r\n\r\nSolution: 
`datasette install datasette-vega` and `datasette uninstall datasette-vega` commands that know how to install to the correct place - a very thin wrapper around `pip install`.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/925/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 632056825, "node_id": "MDU6SXNzdWU2MzIwNTY4MjU=", "number": 802, "title": "\"datasette plugins\" command is broken", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2020-06-05T23:33:01Z", "updated_at": "2020-06-05T23:46:43Z", "closed_at": "2020-06-05T23:46:43Z", "author_association": "OWNER", "pull_request": null, "body": "I broke it in https://github.com/simonw/datasette/commit/a7137dfe069e5fceca56f78631baebd4a6a19967 - and it turns out there was no test coverage so I didn't realize it was broken.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/802/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 811458446, "node_id": "MDU6SXNzdWU4MTE0NTg0NDY=", "number": 1233, "title": "\"datasette publish cloudrun\" cannot publish files with spaces in their name", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2021-02-18T21:08:31Z", "updated_at": "2021-02-18T21:10:08Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "Got this error:\r\n```\r\nStep 6/9 : RUN datasette inspect fixtures.db extra database.db --inspect-file inspect-data.json\r\n ---> Running in db9da0068592\r\nUsage: datasette inspect [OPTIONS] [FILES]...\r\nTry 'datasette inspect --help' for help.\r\n\r\nError: Invalid value for '[FILES]...': Path 'extra' does not exist.\r\nThe command '/bin/sh -c datasette inspect fixtures.db extra database.db --inspect-file inspect-data.json' returned a non-zero code: 2\r\nERROR\r\nERROR: build step 0 \"gcr.io/cloud-builders/docker\" failed: step exited with non-zero status: 2\r\n```\r\nWhile working on the demo for #1232, using this deploy command:\r\n```\r\nGITHUB_SHA=crossdb datasette publish cloudrun fixtures.db 'extra database.db' \\\r\n -m fixtures.json \\\r\n --plugins-dir=plugins \\\r\n --branch=$GITHUB_SHA \\\r\n --version-note=$GITHUB_SHA \\\r\n --extra-options=\"--setting template_debug 1 --crossdb\" \\\r\n --install=pysqlite3-binary \\\r\n --service=datasette-latest-crossdb\r\n```", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1233/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 521335335, "node_id": "MDU6SXNzdWU1MjEzMzUzMzU=", "number": 629, "title": "\"datasette publish\" commands should 
deploy with Python 3.8", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2019-11-12T05:22:31Z", "updated_at": "2019-11-12T06:03:10Z", "closed_at": "2019-11-12T06:03:10Z", "author_association": "OWNER", "pull_request": null, "body": "Now that we support 3.8 (#627) `datasette publish` should always deploy using Python 3.8.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/629/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 628089318, "node_id": "MDU6SXNzdWU2MjgwODkzMTg=", "number": 787, "title": "\"datasette publish\" should bake in a random --secret", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": {"value": 5512395, "label": "Datasette 0.44"}, "comments": 1, "created_at": "2020-06-01T01:15:26Z", "updated_at": "2020-06-11T16:02:05Z", "closed_at": "2020-06-11T16:02:05Z", "author_association": "OWNER", "pull_request": null, "body": "To allow signed cookies etc to work reliably (see #785) all of the `datasette publish` commands should generate a random secret on publish and bake it into the configuration - probably by setting the `DATASETTE_SECRET` environment variable.\r\n\r\n- [ ] Cloud Run\r\n- [ ] Heroku\r\n- [ ] https://github.com/simonw/datasette-publish-now\r\n- [ ] https://github.com/simonw/datasette-publish-fly", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/787/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 628499086, "node_id": "MDU6SXNzdWU2Mjg0OTkwODY=", "number": 790, "title": "\"flash messages\" mechanism", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": {"value": 5512395, "label": "Datasette 0.44"}, "comments": 20, "created_at": "2020-06-01T14:55:44Z", "updated_at": "2020-06-08T19:33:59Z", "closed_at": "2020-06-02T21:14:03Z", "author_association": "OWNER", "pull_request": null, "body": "> Passing `?_success` like this isn't necessarily the best approach. 
Potential improvements include:\r\n> \r\n> - Signing this message so it can't be tampered with (I could generate a signing secret on startup)\r\n> - Using a cookie with a temporary flash message in it instead\r\n> - Using HTML5 history API to remove the `?_success=` from the URL bar when the user lands on the page\r\n> \r\n> If I add an option to redirect the user to another page after success I may need a mechanism to show a flash message on that page as well, in which case I'll need a general flash message solution that works for any page.\r\n\r\n_Originally posted by @simonw in https://github.com/simonw/datasette/pull/703_", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/790/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 520508502, "node_id": "MDU6SXNzdWU1MjA1MDg1MDI=", "number": 31, "title": "\"friends\" command (similar to \"followers\")", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 2, "created_at": "2019-11-09T20:20:20Z", "updated_at": "2022-09-20T05:05:03Z", "closed_at": "2020-02-07T07:03:28Z", "author_association": "MEMBER", "pull_request": null, "body": "Current list of commands:\r\n```\r\n followers Save followers for specified user (defaults to...\r\n followers-ids Populate followers table with IDs of account followers\r\n friends-ids Populate followers table with IDs of account friends\r\n```\r\nObvious omission here is `friends`, which would be powered by `https://api.twitter.com/1.1/friends/list.json`: https://developer.twitter.com/en/docs/accounts-and-users/follow-search-get-users/api-reference/get-friends-list", "repo": {"value": 206156866, "label": "twitter-to-sqlite"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/31/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 668308777, "node_id": "MDU6SXNzdWU2NjgzMDg3Nzc=", "number": 129, "title": "\"insert-files --sqlar\" for creating SQLite archives", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 2, "created_at": "2020-07-30T02:28:29Z", "updated_at": "2020-07-30T22:41:01Z", "closed_at": "2020-07-30T22:40:55Z", "author_association": "OWNER", "pull_request": null, "body": "A `--sqlar` option could cause `insert-files` to behave in the same way as SQLite's own sqlar mechanism.\r\n\r\nhttps://www.sqlite.org/sqlar.html and https://sqlite.org/sqlar/doc/trunk/README.md", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/129/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 944903881, "node_id": "MDU6SXNzdWU5NDQ5MDM4ODE=", "number": 1396, "title": "\"invalid reference format\" publishing Docker image", 
"user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 9, "created_at": "2021-07-15T01:02:07Z", "updated_at": "2021-10-19T08:10:26Z", "closed_at": "2021-07-15T19:47:25Z", "author_association": "OWNER", "pull_request": null, "body": "Error ocurred at the end of the publish flow for Datasette 0.58: https://github.com/simonw/datasette/runs/3072216421\r\n```\r\nRemoving intermediate container cf32b9440907\r\n ---> dfd6985b2afc\r\nSuccessfully built dfd6985b2afc\r\nSuccessfully tagged ***/datasette:0.58\r\ninvalid reference format\r\nError: Process completed with exit code 1.\r\n```", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1396/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 655465863, "node_id": "MDU6SXNzdWU2NTU0NjU4NjM=", "number": 892, "title": "\"latest\" in new documentation navbar is invisible", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 2, "created_at": "2020-07-12T19:57:21Z", "updated_at": "2020-07-12T20:02:35Z", "closed_at": "2020-07-12T20:02:17Z", "author_association": "OWNER", "pull_request": null, "body": "On https://datasette.readthedocs.io/en/latest/\r\n\r\n\"Datasette_\u2014_Datasette_documentation\"\r\n\r\nCompare with https://datasette.readthedocs.io/en/0.45/\r\n\r\n\"Datasette_\u2014_Datasette_documentation\"\r\n\r\nSome custom CSS should fix it.\r\n", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/892/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 1138008042, "node_id": "I_kwDOBm6k_c5D1J_q", "number": 1636, "title": "\"permissions\" propery in metadata for configuring arbitrary permissions", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": {"value": 8711695, "label": " Datasette 1.0a2"}, "comments": 14, "created_at": "2022-02-15T00:25:59Z", "updated_at": "2022-12-13T02:40:50Z", "closed_at": "2022-12-13T02:40:50Z", "author_association": "OWNER", "pull_request": null, "body": "The `\"allow\"` block mechanism can already be used to configure various default permissions. 
When adding permissions to `datasette-tiddlywiki` I realized it would be good to be able to configure arbitrary permissions such as `edit-tiddlywiki` there too.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1636/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 940077168, "node_id": "MDU6SXNzdWU5NDAwNzcxNjg=", "number": 1389, "title": "\"searchmode\": \"raw\" in table metadata", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 6, "created_at": "2021-07-08T17:32:10Z", "updated_at": "2021-07-10T18:33:13Z", "closed_at": "2021-07-10T18:33:13Z", "author_association": "OWNER", "pull_request": null, "body": "> http://localhost:8001/index/summary?_search=language%3Aeng&_sort=title&_searchmode=raw\r\n>\r\n> But I'm not able to manage it in the metadata file. Here is mine (note that the sort column is taken into account)\r\n> Here it is:\r\n>\r\n> ```\r\n> {\r\n> \"databases\": {\r\n> \"index\": {\r\n> \"tables\": {\r\n> \"summary\": {\r\n> \"sort\": \"title\",\r\n> \"searchmode\": \"raw\"\r\n> }\r\n> }\r\n> }\r\n> }\r\n> }\r\n\r\n_Originally posted by @Krazybug in https://github.com/simonw/datasette/issues/759#issuecomment-624860451_", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1389/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 841377702, "node_id": "MDU6SXNzdWU4NDEzNzc3MDI=", "number": 251, "title": "\"sqlite-utils convert\" command to replace the separate \"sqlite-transform\" tool", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 15, "created_at": "2021-03-25T22:36:36Z", "updated_at": "2021-08-02T22:39:46Z", "closed_at": "2021-08-02T04:47:40Z", "author_association": "OWNER", "pull_request": null, "body": "See https://github.com/simonw/sqlite-transform/issues/11 - I built a separate `sqlite-transform` tool a while ago that uses the word \"transform\" to means something entirely different from `sqlite-utils transform` - I'd like to resolve this by merging the two tools.", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/251/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 403624090, "node_id": "MDU6SXNzdWU0MDM2MjQwOTA=", "number": 6, "title": "\"sqlite-utils insert\" should support newline-delimited JSON", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2019-01-28T02:00:02Z", "updated_at": "2019-01-28T02:17:45Z", "closed_at": "2019-01-28T02:17:45Z", "author_association": "OWNER", "pull_request": null, "body": "We can already 
export newline delimited JSON. We should learn to import it as well.\r\n\r\nThe neat thing about importing it is that you can import GBs of data without having to read the whole lot into memory in order to decode the wrapping JSON array.\r\n\r\nDatasette can export it now: https://github.com/simonw/datasette/issues/405\r\n\r\nDemo: https://latest.datasette.io/fixtures/facetable.json?_shape=array&_nl=on\r\n\r\nIt should be possible to do this:\r\n\r\n $ curl \"https://latest.datasette.io/fixtures/facetable.json?_shape=array&_nl=on\" \\\r\n | sqlite-utils insert data.db facetable - --nl\r\n", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/6/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 777560474, "node_id": "MDU6SXNzdWU3Nzc1NjA0NzQ=", "number": 218, "title": "\"sqlite-utils triggers\" command", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2021-01-03T02:34:50Z", "updated_at": "2021-01-03T03:49:51Z", "closed_at": "2021-01-03T03:03:35Z", "author_association": "OWNER", "pull_request": null, "body": "A command to list the triggers in the database.\r\n\r\n sqlite-utils triggers my.db\r\n\r\nCan optionally take one or more tables:\r\n\r\n sqlite-utils triggers my.db table1 table2", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/218/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 610853576, "node_id": "MDU6SXNzdWU2MTA4NTM1NzY=", "number": 105, "title": "\"sqlite-utils views\" command", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2020-05-01T16:56:11Z", "updated_at": "2020-05-01T20:40:07Z", "closed_at": "2020-05-01T20:38:36Z", "author_association": "OWNER", "pull_request": null, "body": "Similar to `sqlite-utils tables`. 
See also #104.", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/105/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 602176870, "node_id": "MDU6SXNzdWU2MDIxNzY4NzA=", "number": 43, "title": "\"twitter-to-sqlite lists\" command for retrieving a user's owned lists", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2020-04-17T19:08:59Z", "updated_at": "2020-04-17T23:48:28Z", "closed_at": "2020-04-17T23:30:39Z", "author_association": "MEMBER", "pull_request": null, "body": "https://developer.twitter.com/en/docs/accounts-and-users/create-manage-lists/api-reference/get-lists-ownerships\r\n\r\n`https://api.twitter.com/1.1/lists/ownerships.json `", "repo": {"value": 206156866, "label": "twitter-to-sqlite"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/43/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 488833698, "node_id": "MDU6SXNzdWU0ODg4MzM2OTg=", "number": 2, "title": "\"twitter-to-sqlite user-timeline\" command for pulling tweets by a specific user", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2019-09-03T21:29:12Z", "updated_at": "2019-09-04T20:02:11Z", "closed_at": "2019-09-04T20:02:11Z", "author_association": "MEMBER", "pull_request": null, "body": "Twitter only allows up to 3,200 tweets to be retrieved from https://developer.twitter.com/en/docs/tweets/timelines/api-reference/get-statuses-user_timeline.html\r\n\r\nI'm going to do:\r\n\r\n $ twitter-to-sqlite tweets simonw\r\n\r\n", "repo": {"value": 206156866, "label": "twitter-to-sqlite"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/2/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 706486323, "node_id": "MDU6SXNzdWU3MDY0ODYzMjM=", "number": 973, "title": "'bool' object is not callable error", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": {"value": 5971510, "label": "Datasette 0.50"}, "comments": 2, "created_at": "2020-09-22T15:30:54Z", "updated_at": "2020-10-08T23:54:32Z", "closed_at": "2020-09-22T15:40:35Z", "author_association": "OWNER", "pull_request": null, "body": "I'm getting this when latest is deployed to Cloud Run:\r\n```\r\nTraceback (most recent call last):\r\n File \"/usr/local/bin/datasette\", line 8, in \r\n sys.exit(cli())\r\n File \"/usr/local/lib/python3.8/site-packages/click/core.py\", line 829, in __call__\r\n return self.main(*args, **kwargs)\r\n File \"/usr/local/lib/python3.8/site-packages/click/core.py\", line 782, in main\r\n rv = self.invoke(ctx)\r\n File \"/usr/local/lib/python3.8/site-packages/click/core.py\", line 1259, 
in invoke\r\n return _process_result(sub_ctx.command.invoke(sub_ctx))\r\n File \"/usr/local/lib/python3.8/site-packages/click/core.py\", line 1066, in invoke\r\n return ctx.invoke(self.callback, **ctx.params)\r\n File \"/usr/local/lib/python3.8/site-packages/click/core.py\", line 610, in invoke\r\n return callback(*args, **kwargs)\r\n File \"/usr/local/lib/python3.8/site-packages/datasette/cli.py\", line 406, in serve\r\n inspect_data = json.load(open(inspect_file))\r\nTypeError: 'bool' object is not callable\r\n```\r\nI think I may have broken things in #970 - a980199e61fe7ccf02c2123849d86172d2ae54ff", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/973/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 1303169663, "node_id": "I_kwDOCGYnMM5NrMp_", "number": 453, "title": "'unclosed file' warning when using insert_upsert_implementation from Python", "user": {"value": 311257, "label": "makkus"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2022-07-13T09:34:35Z", "updated_at": "2022-07-15T21:52:25Z", "closed_at": "2022-07-15T21:52:21Z", "author_association": "NONE", "pull_request": null, "body": "I'm using the `[insert_upsert_implementation](https://github.com/simonw/sqlite-utils/blob/main/sqlite_utils/cli.py)` function directly in my Python code to import a csv file with all the bells and whistles `sqlite-utils` provides, but I'm getting a resource warning that a io.TextWrapper object is not closed.\r\n\r\nThe warning goes away when wrapping the code from [this line](https://github.com/simonw/sqlite-utils/blob/42440d6345c242ee39778045e29143fb550bd2c2/sqlite_utils/cli.py#L924) in a try/finally block like:\r\n\r\n```\r\ntry:\r\n ...\r\n ...\r\nfinally:\r\n decoded.close()\r\n```\r\n(might be that `sniff_buffer` must also be closed if non null, but I might be wrong)\r\n\r\nI suspect Python closes the reference automatically when the sqlite-utils cli run is done, but since my code doesn't exit, I'm getting the warning.\r\n\r\nAlternatively, it'd be cool if the 'import csv/tsv' functionality could be added directly to the Database class.", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/453/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 811680502, "node_id": "MDU6SXNzdWU4MTE2ODA1MDI=", "number": 236, "title": "--attach command line option for attaching extra databases", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2021-02-19T04:38:30Z", "updated_at": "2021-02-19T05:10:41Z", "closed_at": "2021-02-19T05:08:43Z", "author_association": "OWNER", "pull_request": null, "body": "This will enable cross-database joins, as seen in https://github.com/simonw/datasette/issues/283\r\n\r\nAlso refs #113", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": 
"{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/236/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 730797787, "node_id": "MDU6SXNzdWU3MzA3OTc3ODc=", "number": 1057, "title": "--cors should enable /fixtures.db CORS access", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": {"value": 6026070, "label": "0.51"}, "comments": 1, "created_at": "2020-10-27T20:38:34Z", "updated_at": "2020-10-27T20:52:05Z", "closed_at": "2020-10-27T20:51:09Z", "author_association": "OWNER", "pull_request": null, "body": "So Datasette can work with `SQL.js` as seen in https://observablehq.com/@mbostock/sqlite", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1057/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 567902704, "node_id": "MDU6SXNzdWU1Njc5MDI3MDQ=", "number": 675, "title": "--cp option for datasette publish and datasette package for shipping additional files and directories", "user": {"value": 141844, "label": "aviflax"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 12, "created_at": "2020-02-19T22:55:56Z", "updated_at": "2020-12-28T18:49:21Z", "closed_at": null, "author_association": "NONE", "pull_request": null, "body": "I\u2019m working on integrating Datasette into a documentation-oriented publishing workflow internally in my company, and in order to deploy the Docker image created by `datasette package` I need to add an additional file to the image \u2014 in my case, it\u2019s a sort of a deployment directive. 
I\u2019ve worked out a way to do this after the image has been created, but it\u2019s convoluted and brittle.\r\n\r\nSo it\u2019d be excellent if there was an additional option for this command, something like, like, `--copy`.\r\n\r\nI\u2019d envision it looking something like:\r\n\r\n```shell\r\n$ datasette package --copy /the/source/path:/the/target/path data.db\r\n```\r\n\r\nI\u2019d be happy to help design, specify, implement, and test this feature, if you\u2019d be interested.\r\n\r\nThanks for the fantastic tools!", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/675/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 788527932, "node_id": "MDU6SXNzdWU3ODg1Mjc5MzI=", "number": 223, "title": "--delimiter option for CSV import", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 2, "created_at": "2021-01-18T20:25:03Z", "updated_at": "2021-02-06T01:39:47Z", "closed_at": "2021-02-06T01:34:54Z", "author_association": "OWNER", "pull_request": null, "body": "https://bruxellesdata.opendatasoft.com/explore/dataset/dog-toilets/export/?location=12,50.85802,4.38054 says:\r\n\r\n> CSV uses semicolon (;) as a separator.\r\n\r\nWould be useful to be able to do this:\r\n\r\n sqlite-utils insert places.db places places.csv --delimiter ';'\r\n\r\n`--delimiter` could imply `--csv`", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/223/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 679637501, "node_id": "MDU6SXNzdWU2Nzk2Mzc1MDE=", "number": 934, "title": "--get doesn't fully invoke the startup routine", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 0, "created_at": "2020-08-15T20:30:25Z", "updated_at": "2020-08-15T20:53:49Z", "closed_at": "2020-08-15T20:53:49Z", "author_association": "OWNER", "pull_request": null, "body": "https://github.com/simonw/datasette/blob/7702ea602188899ee9b0446a874a6a9b546b564d/datasette/cli.py#L417-L433\r\n\r\nSpotted this working on https://github.com/simonw/latest-datasette-with-all-plugins/issues/3 - I'd like to be able to use `datasette --get /` as a sanity checking test, but that doesn't work if the init hooks aren't fully executed.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/934/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 323830051, "node_id": "MDU6SXNzdWUzMjM4MzAwNTE=", "number": 270, "title": "--limit= CLI option for setting limits", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2018-05-17T00:14:24Z", 
"updated_at": "2018-05-18T06:19:31Z", "closed_at": "2018-05-18T06:16:39Z", "author_association": "OWNER", "pull_request": null, "body": "#264 calls for four new datasette limit options, on top of the two existing ones:\r\n\r\n* `--max_returned_rows`\r\n* `--sql_time_limit_ms`\r\n\r\nThese are already clogging up `datasette serve --help` a bit.\r\n\r\nHow about this syntax instead?\r\n\r\n datasette --limit max_returned_rows:100 \\\r\n --limit facet_timeout_ms:500 demo.db\r\n\r\nThen we can add as many new user over-rideable limits as we like without clogging up `--help` too much - though it would be good to have a way of optionally listings their documentation as well.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/270/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 683830416, "node_id": "MDU6SXNzdWU2ODM4MzA0MTY=", "number": 137, "title": "--load-extension for other sqlite-utils commands", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2020-08-21T21:12:56Z", "updated_at": "2020-10-16T19:14:32Z", "closed_at": "2020-10-16T19:14:32Z", "author_association": "OWNER", "pull_request": null, "body": "e.g. for this:\r\n```\r\ncalands-datasette % sqlite-utils tables calands.db --counts\r\n[{\"table\": \"spatial_ref_sys\", \"count\": 4924},\r\n {\"table\": \"spatialite_history\", \"count\": 14},\r\n {\"table\": \"sqlite_sequence\", \"count\": 1},\r\n {\"table\": \"geometry_columns\", \"count\": 2},\r\n {\"table\": \"spatial_ref_sys_aux\", \"count\": 4873},\r\n {\"table\": \"views_geometry_columns\", \"count\": 0},\r\n {\"table\": \"virts_geometry_columns\", \"count\": 0},\r\n {\"table\": \"geometry_columns_statistics\", \"count\": 2},\r\n {\"table\": \"views_geometry_columns_statistics\", \"count\": 0},\r\n {\"table\": \"virts_geometry_columns_statistics\", \"count\": 0},\r\n {\"table\": \"geometry_columns_field_infos\", \"count\": 0},\r\n {\"table\": \"views_geometry_columns_field_infos\", \"count\": 0},\r\n {\"table\": \"virts_geometry_columns_field_infos\", \"count\": 0},\r\n {\"table\": \"geometry_columns_time\", \"count\": 2},\r\n {\"table\": \"geometry_columns_auth\", \"count\": 2},\r\n {\"table\": \"views_geometry_columns_auth\", \"count\": 0},\r\n {\"table\": \"virts_geometry_columns_auth\", \"count\": 0},\r\nTraceback (most recent call last):\r\n File \"/usr/local/bin/sqlite-utils\", line 8, in \r\n sys.exit(cli())\r\n File \"/usr/local/Cellar/sqlite-utils/2.15.1/libexec/lib/python3.8/site-packages/click/core.py\", line 829, in __call__\r\n return self.main(*args, **kwargs)\r\n File \"/usr/local/Cellar/sqlite-utils/2.15.1/libexec/lib/python3.8/site-packages/click/core.py\", line 782, in main\r\n rv = self.invoke(ctx)\r\n File \"/usr/local/Cellar/sqlite-utils/2.15.1/libexec/lib/python3.8/site-packages/click/core.py\", line 1259, in invoke\r\n return _process_result(sub_ctx.command.invoke(sub_ctx))\r\n File \"/usr/local/Cellar/sqlite-utils/2.15.1/libexec/lib/python3.8/site-packages/click/core.py\", line 1066, in invoke\r\n return ctx.invoke(self.callback, **ctx.params)\r\n File \"/usr/local/Cellar/sqlite-utils/2.15.1/libexec/lib/python3.8/site-packages/click/core.py\", line 610, in invoke\r\n 
return callback(*args, **kwargs)\r\n File \"/usr/local/Cellar/sqlite-utils/2.15.1/libexec/lib/python3.8/site-packages/sqlite_utils/cli.py\", line 143, in tables\r\n for line in output_rows(_iter(), headers, nl, arrays, json_cols):\r\n File \"/usr/local/Cellar/sqlite-utils/2.15.1/libexec/lib/python3.8/site-packages/sqlite_utils/cli.py\", line 922, in output_rows\r\n for row, next_row in itertools.zip_longest(current_iter, next_iter):\r\n File \"/usr/local/Cellar/sqlite-utils/2.15.1/libexec/lib/python3.8/site-packages/sqlite_utils/cli.py\", line 123, in _iter\r\n row.append(db[name].count)\r\n File \"/usr/local/Cellar/sqlite-utils/2.15.1/libexec/lib/python3.8/site-packages/sqlite_utils/db.py\", line 458, in count\r\n return self.db.conn.execute(\r\nsqlite3.OperationalError: no such module: VirtualSpatialIndex\r\n```\r\nThe `tables` command could take `--load-extension` too - as could `rows` and other similar commands.\r\n\r\nFollow-on from #134 ", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/137/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 683804172, "node_id": "MDU6SXNzdWU2ODM4MDQxNzI=", "number": 134, "title": "--load-extension option for sqlite-utils query", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 4, "created_at": "2020-08-21T20:12:42Z", "updated_at": "2020-08-21T21:06:26Z", "closed_at": "2020-08-21T20:54:19Z", "author_association": "OWNER", "pull_request": null, "body": "I got this error:\r\n```\r\n% sqlite-utils calands.db 'create table superunits_with_maps_view_concrete as select * from superunits_with_maps_view'\r\nTraceback (most recent call last):\r\n...\r\n cursor = db.conn.execute(sql, dict(param))\r\nsqlite3.OperationalError: no such function: AsGeoJSON\r\n```\r\nA `--load-extension=/usr/local/lib/mod_spatialite.dylib` option (imitating the same option for Datasette) would help.", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/134/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 1931794126, "node_id": "I_kwDOBm6k_c5zJNbO", "number": 2198, "title": "--load-extension=spatialite not working with Windows", "user": {"value": 363004, "label": "hcarter333"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 0, "created_at": "2023-10-08T12:50:22Z", "updated_at": "2023-10-08T12:50:22Z", "closed_at": null, "author_association": "NONE", "pull_request": null, "body": "Using each of\r\n`python -m datasette counties.db -m metadata.yml --load-extension=SpatiaLite`\r\n\r\nand \r\n\r\n`python -m datasette counties.db --load-extension=\"C:\\Windows\\System32\\mod_spatialite.dll\"`\r\n\r\nand\r\n\r\n`python -m datasette counties.db --load-extension=C:\\Windows\\System32\\mod_spatialite.dll`\r\n\r\nI got the error:\r\n\r\n```\r\n File 
\"C:\\Users\\m3n7es\\AppData\\Local\\Packages\\PythonSoftwareFoundation.Python.3.11_qbz5n2kfra8p0\\LocalCache\\local-packages\\Python311\\site-packages\\datasette\\database.py\", line 209, in in_thread\r\n self.ds._prepare_connection(conn, self.name)\r\n File \"C:\\Users\\m3n7es\\AppData\\Local\\Packages\\PythonSoftwareFoundation.Python.3.11_qbz5n2kfra8p0\\LocalCache\\local-packages\\Python311\\site-packages\\datasette\\app.py\", line 596, in _prepare_connection\r\n conn.execute(\"SELECT load_extension(?, ?)\", [path, entrypoint])\r\nsqlite3.OperationalError: The specified module could not be found.\r\n\r\n```\r\n\r\nI finally tried modifying the code in app.py to read:\r\n\r\n```\r\n def _prepare_connection(self, conn, database):\r\n conn.row_factory = sqlite3.Row\r\n conn.text_factory = lambda x: str(x, \"utf-8\", \"replace\")\r\n if self.sqlite_extensions:\r\n conn.enable_load_extension(True)\r\n for extension in self.sqlite_extensions:\r\n # \"extension\" is either a string path to the extension\r\n # or a 2-item tuple that specifies which entrypoint to load.\r\n #if isinstance(extension, tuple):\r\n # path, entrypoint = extension\r\n # conn.execute(\"SELECT load_extension(?, ?)\", [path, entrypoint])\r\n #else:\r\n conn.execute(\"SELECT load_extension('C:\\Windows\\System32\\mod_spatialite.dll')\")\r\n\r\n```\r\nAt which point the counties example worked. \r\n\r\nIs there a correct way to install/use the extension on Windows? My method will cause issues if there's a second extension to be used.\r\n\r\nOn an unrelated note, my next step is to figure out how to write a query across the two loaded databases supplied from the command line:\r\n`python -m datasette rm_toucans_23_10_07.db counties.db -m metadata.yml --load-extension=SpatiaLite`\r\n\r\n\r\n\r\n", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/2198/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 752966476, "node_id": "MDU6SXNzdWU3NTI5NjY0NzY=", "number": 1114, "title": "--load-extension=spatialite not working with datasetteproject/datasette docker image", "user": {"value": 2182, "label": "danp"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 4, "created_at": "2020-11-29T17:35:20Z", "updated_at": "2022-01-20T21:29:42Z", "closed_at": "2020-11-29T17:37:45Z", "author_association": "CONTRIBUTOR", "pull_request": null, "body": "https://github.com/simonw/datasette/commit/6aa5886379dd9017215904fb28567b80018902f9 added the `--load-extension=spatialite` shortcut looking for the extension in these places:\r\n\r\nhttps://github.com/simonw/datasette/blob/12877d7a48e2aa28bb5e780f929a218f7265d849/datasette/utils/__init__.py#L56-L60\r\n\r\nHowever, in the datasetteproject/datasette docker image the file is at `/usr/local/lib/mod_spatialite.so`.\r\n\r\nThis results in the example command [here](https://docs.datasette.io/en/stable/installation.html#loading-spatialite) failing:\r\n\r\n```\r\n% docker run --rm -p 8001:8001 -v `pwd`:/mnt datasetteproject/datasette datasette -p 8001 -h 0.0.0.0 /mnt/data.db --load-extension=spatialite\r\nError: Could not find SpatiaLite extension\r\n```\r\n\r\nBut it does work when given an explicit path:\r\n\r\n```\r\n% docker run --rm -p 8001:8001 -v `pwd`:/mnt 
datasetteproject/datasette datasette -p 8001 -h 0.0.0.0 /mnt/data.db --load-extension=/usr/local/lib/mod_spatialite.so\r\nINFO: Started server process [1]\r\nINFO: Waiting for application startup.\r\nINFO: Application startup complete.\r\nINFO: Uvicorn running on http://0.0.0.0:8001 (Press CTRL+C to quit)\r\n...\r\n```\r\n\r\nPerhaps `SPATIALITE_PATHS` should include `/usr/local/lib/mod_spatialite.so`?", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1114/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 723803777, "node_id": "MDU6SXNzdWU3MjM4MDM3Nzc=", "number": 1028, "title": "--load-extension=spatialite shortcut", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": {"value": 6026070, "label": "0.51"}, "comments": 1, "created_at": "2020-10-17T17:02:08Z", "updated_at": "2022-01-20T21:29:41Z", "closed_at": "2020-10-19T22:37:55Z", "author_association": "OWNER", "pull_request": null, "body": "I added this to `sqlite-utils` in https://github.com/simonw/sqlite-utils/issues/136 and I really like it: pass a special value of `spatialite` and Datasette should attempt to load it from known likely installation locations.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1028/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 683812642, "node_id": "MDU6SXNzdWU2ODM4MTI2NDI=", "number": 136, "title": "--load-extension=spatialite shortcut option", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2020-08-21T20:31:25Z", "updated_at": "2022-02-05T00:04:26Z", "closed_at": "2020-10-16T19:14:32Z", "author_association": "OWNER", "pull_request": null, "body": "In conjunction with #135 - this would do the same thing as `--load-extension=path-to-spatialite` (see #134)", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/136/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 807437089, "node_id": "MDU6SXNzdWU4MDc0MzcwODk=", "number": 228, "title": "--no-headers option for CSV and TSV", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 10, "created_at": "2021-02-12T17:56:51Z", "updated_at": "2021-12-26T07:01:31Z", "closed_at": "2021-02-14T22:25:17Z", "author_association": "OWNER", "pull_request": null, "body": "https://bl.iro.bl.uk/work/ns/3037474a-761c-456d-a00c-9ef3c6773f4c has a fascinating CSV file that doesn't have a header row - it starts like this:\r\n\r\n```csv\r\nComputation and measurement of turbulent flow through idealized turbine blade passages,,\"Loizou, Panos 
A.\",https://isni.org/isni/0000000136122593,,University of Manchester,https://isni.org/isni/0000000121662407,1989,Thesis (Ph.D.),,Physical Sciences,,,https://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.232781,\r\n\"Prolactin and growth hormone secretion in normal, hyperprolactinaemic and acromegalic man\",,\"Prescott, R. W. G.\",https://isni.org/isni/0000000134992122,,University of Newcastle upon Tyne,https://isni.org/isni/0000000104627212,1983,Thesis (Ph.D.),,Biological Sciences,,,https://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.232784,\r\n```\r\n\r\nIt would be useful if `sqlite-utils insert ... --csv` had a mechanism for importing files like this one.", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/228/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 598013965, "node_id": "MDU6SXNzdWU1OTgwMTM5NjU=", "number": 724, "title": "--plugin-secret over-rides existing metadata.json plugin config", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2020-04-10T17:56:30Z", "updated_at": "2020-04-16T04:58:12Z", "closed_at": "2020-04-10T18:34:21Z", "author_association": "OWNER", "pull_request": null, "body": "This means if you use `--plugin-secret` at all (with e.g. `publish cloudrun`) any existing plugin configuration in your `metadata.json` will be ignored.\r\n\r\nhttps://github.com/simonw/datasette/blob/af9cd4ca64652fae262e6f7b5d201f6e0adc989b/datasette/publish/cloudrun.py#L98-L109\r\n\r\n", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/724/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 808843401, "node_id": "MDU6SXNzdWU4MDg4NDM0MDE=", "number": 1226, "title": "--port option should validate port is between 0 and 65535", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 4, "created_at": "2021-02-15T22:01:33Z", "updated_at": "2021-02-18T18:41:27Z", "closed_at": "2021-02-18T18:41:27Z", "author_association": "OWNER", "pull_request": null, "body": "Currently throws an ugly error message:\r\n```\r\n(datasette-graphql) datasette-graphql % datasette fivethirtyeight.db -p 80094\r\nINFO: Started server process [45497]\r\nINFO: Waiting for application startup.\r\nINFO: Application startup complete.\r\nTraceback (most recent call last):\r\n File \"/Users/simon/.local/share/virtualenvs/datasette-graphql-n1OSJCS8/bin/datasette\", line 8, in \r\n sys.exit(cli())\r\n...\r\n server = await loop.create_server(\r\n File \"/Users/simon/.pyenv/versions/3.8.2/lib/python3.8/asyncio/base_events.py\", line 1461, in create_server\r\n sock.bind(sa)\r\nOverflowError: bind(): port must be 0-65535.\r\n```", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1226/reactions\", 
\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 555832585, "node_id": "MDU6SXNzdWU1NTU4MzI1ODU=", "number": 661, "title": "--port option to expose a port other than 8001 in \"datasette package\"", "user": {"value": 134771, "label": "dvhthomas"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2020-01-27T21:05:56Z", "updated_at": "2020-01-30T04:17:52Z", "closed_at": "2020-01-29T22:46:45Z", "author_association": "NONE", "pull_request": null, "body": "I see how to alter the port using `datasette serve -p XXX` per the docs. However, I'm packaging up to server the container on AppEngine flexible, which [requires](https://cloud.google.com/appengine/docs/flexible/custom-runtimes/build#listening_to_port_8080) that the container is serving traffic on port 8080.\r\n\r\nhttps://github.com/simonw/datasette/blob/7950105c278b140e6cb665c68b59df219870f9bc/Dockerfile#L41\r\n\r\nIs there a way to inject a non-default port into the Dockerfile, or should I just do something like `sed` to replace 8001 with 8080 after `dataset package` has done it's thing? Thanks for the advice.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/661/reactions\", \"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 665701216, "node_id": "MDU6SXNzdWU2NjU3MDEyMTY=", "number": 123, "title": "--raw option for outputting binary content", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 0, "created_at": "2020-07-26T03:35:39Z", "updated_at": "2020-07-26T16:44:11Z", "closed_at": "2020-07-26T16:44:11Z", "author_association": "OWNER", "pull_request": null, "body": "Related to the `insert-files` work in #122 - it should be easy to get binary data back out of the database again.\r\n\r\nOne way to do that could be:\r\n\r\n sqlite-utils files.db \"select content from files where key = 'foo.jpg'\" --raw\r\n\r\nThe `--raw` option would cause just the contents of the first column to be output directly to stdout.", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/123/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 758944006, "node_id": "MDU6SXNzdWU3NTg5NDQwMDY=", "number": 57, "title": "--readme throws 404 error if README does not exist in repo", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 0, "created_at": "2020-12-07T23:58:49Z", "updated_at": "2020-12-16T18:17:54Z", "closed_at": "2020-12-16T18:17:54Z", "author_association": "MEMBER", "pull_request": null, "body": "It should fail silently (populate the column with a null) instead.", "repo": {"value": 207052882, "label": "github-to-sqlite"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": 
\"https://api.github.com/repos/dogsheep/github-to-sqlite/issues/57/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 449931899, "node_id": "MDU6SXNzdWU0NDk5MzE4OTk=", "number": 494, "title": "--reload should only trigger for -i databases", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": {"value": 9599, "label": "simonw"}, "milestone": null, "comments": 1, "created_at": "2019-05-29T17:28:43Z", "updated_at": "2020-02-24T19:45:05Z", "closed_at": "2020-02-24T19:45:05Z", "author_association": "OWNER", "pull_request": null, "body": "Right now it's triggering any time a mutable database changes.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/494/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 574021194, "node_id": "MDU6SXNzdWU1NzQwMjExOTQ=", "number": 691, "title": "--reload sould reload server if code in --plugins-dir changes", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2020-03-02T14:42:21Z", "updated_at": "2020-06-14T02:35:17Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/691/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 487598468, "node_id": "MDU6SXNzdWU0ODc1OTg0Njg=", "number": 2, "title": "--save option to dump checkins to a JSON file on disk", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2019-08-30T17:41:06Z", "updated_at": "2019-08-31T02:40:21Z", "closed_at": "2019-08-31T02:40:21Z", "author_association": "MEMBER", "pull_request": null, "body": "This is a complement to the `--load` option - mainly useful for development purposes.\r\n\r\n(I'll rename `--file` to `--load` as part of this issue).", "repo": {"value": 205429375, "label": "swarm-to-sqlite"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/dogsheep/swarm-to-sqlite/issues/2/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 610853393, "node_id": "MDU6SXNzdWU2MTA4NTMzOTM=", "number": 104, "title": "--schema option to \"sqlite-utils tables\"", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 0, "created_at": "2020-05-01T16:55:49Z", "updated_at": "2020-05-01T17:12:37Z", "closed_at": "2020-05-01T17:12:37Z", "author_association": "OWNER", "pull_request": null, "body": "Adds output showing the table schema.", "repo": {"value": 140912432, "label": "sqlite-utils"}, 
"type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/104/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 590666760, "node_id": "MDU6SXNzdWU1OTA2NjY3NjA=", "number": 39, "title": "--since feature can be confused by retweets", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 11, "created_at": "2020-03-30T23:25:33Z", "updated_at": "2020-04-01T03:45:16Z", "closed_at": "2020-04-01T03:45:16Z", "author_association": "MEMBER", "pull_request": null, "body": "If you run `twitter-to-sqlite user-timeline ... --since` it's supposed to fetch Tweets those specific users tweeted since last time the command was run.\r\n\r\nIt does this by seeking out the max ID of their previous tweets:\r\n\r\nhttps://github.com/dogsheep/twitter-to-sqlite/blob/810cb2af5a175837204389fd7f4b5721f8b325ab/twitter_to_sqlite/cli.py#L305-L311\r\n\r\nBUT... this has a nasty flaw: if another account had retweeted one of their recent tweets the retweeted-tweet will have been loaded into the database - so we may treat that as the most recent since ID and miss a bunch of their tweets!", "repo": {"value": 206156866, "label": "twitter-to-sqlite"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/39/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 771324837, "node_id": "MDU6SXNzdWU3NzEzMjQ4Mzc=", "number": 53, "title": "--since support for favorites", "user": {"value": 27, "label": "anotherjesse"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2020-12-19T07:08:23Z", "updated_at": "2020-12-19T07:47:11Z", "closed_at": "2020-12-19T07:47:11Z", "author_association": "NONE", "pull_request": null, "body": "Having support for `--since` for updating your favorites would be ideal as the api is both slow and it only returns ~3k most recent favorites.\r\n\r\nhttps://twittercommunity.com/t/cant-get-all-favorite-tweets-by-rest-api/22007/3\r\n\r\nThe api seems to take an optional `since_id` parameter - https://developer.twitter.com/en/docs/twitter-api/v1/tweets/post-and-engage/api-reference/get-favorites-list", "repo": {"value": 206156866, "label": "twitter-to-sqlite"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/53/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 506268945, "node_id": "MDU6SXNzdWU1MDYyNjg5NDU=", "number": 20, "title": "--since support for various commands for refresh-by-cron", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2019-10-13T03:40:46Z", "updated_at": "2019-10-21T03:32:04Z", "closed_at": "2019-10-16T19:26:11Z", "author_association": "MEMBER", "pull_request": null, "body": "I want to run a cron that updates my Twitter database 
every X minutes.\r\n\r\nIt should be able to retrieve the following without needing to paginate through everything:\r\n\r\n- [x] Tweets I have tweeted\r\n- [x] My home timeline (see #19)\r\n- [x] Tweets I have favourited\r\n\r\nIt would be nice if this could be standardized across all commands as a `--since` option.", "repo": {"value": 206156866, "label": "twitter-to-sqlite"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/20/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 808008305, "node_id": "MDU6SXNzdWU4MDgwMDgzMDU=", "number": 230, "title": "--sniff option for sniffing delimiters", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 8, "created_at": "2021-02-14T17:43:54Z", "updated_at": "2021-02-14T21:15:33Z", "closed_at": "2021-02-14T19:24:32Z", "author_association": "OWNER", "pull_request": null, "body": "> I just spotted that `csv.Sniffer` in the Python standard library has a `.has_header(sample)` method which detects if the first row appears to be a header or not, which is interesting. https://docs.python.org/3/library/csv.html#csv.Sniffer\r\n\r\n_Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/228#issuecomment-778812050_", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/230/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 328155946, "node_id": "MDU6SXNzdWUzMjgxNTU5NDY=", "number": 301, "title": "--spatialite option for \"datasette publish heroku\"", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2018-05-31T14:13:09Z", "updated_at": "2022-01-20T21:28:50Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "Split off from #243. 
Need to figure out how to install and configure SpatiaLite on Heroku.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/301/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 318737808, "node_id": "MDU6SXNzdWUzMTg3Mzc4MDg=", "number": 243, "title": "--spatialite option for datasette publish commands", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 2, "created_at": "2018-04-29T18:19:32Z", "updated_at": "2018-05-31T14:17:53Z", "closed_at": "2018-05-31T14:17:53Z", "author_association": "OWNER", "pull_request": null, "body": "Performs the necessary incantations to install Spatialite on Zeit Now or Heroku and sets the corresponding environment variable to ensure the module is correctly loaded by datasette serve.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/243/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 490803176, "node_id": "MDU6SXNzdWU0OTA4MDMxNzY=", "number": 8, "title": "--sql and --attach options for feeding commands from SQL queries", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 4, "created_at": "2019-09-08T20:35:49Z", "updated_at": "2020-03-20T23:13:01Z", "closed_at": "2020-03-20T23:13:01Z", "author_association": "MEMBER", "pull_request": null, "body": "Say you want to fetch Twitter profiles for a list of accounts that are stored in another database:\r\n\r\n $ twitter-to-sqlite users-lookup users.db --attach attending.db \\\r\n --sql \"select Twitter from attending.attendes where Twitter is not null\"\r\n\r\nThe SQL query you feed in is expected to return a list of screen names suitable for processing further by the command.\r\n\r\nShould be supported by all three of:\r\n\r\n- [x] `twitter-to-sqlite users-lookup`\r\n- [x] `twitter-to-sqlite user-timeline`\r\n- [x] `twitter-to-sqlite followers` and `friends`\r\n\r\nThe `--attach` option allows other SQLite databases to be attached to the connection. 
Without it the SQL query will have to read from the single attached database.", "repo": {"value": 206156866, "label": "twitter-to-sqlite"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/8/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 735648209, "node_id": "MDU6SXNzdWU3MzU2NDgyMDk=", "number": 193, "title": "--tsv output format option", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": {"value": 6079500, "label": "3.0"}, "comments": 0, "created_at": "2020-11-03T21:31:18Z", "updated_at": "2020-11-07T00:09:52Z", "closed_at": "2020-11-07T00:09:52Z", "author_association": "OWNER", "pull_request": null, "body": "We already support `--csv` for output, and the `insert` command accepts `--tsv`. The output format options should accept `--tsv` too.", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/193/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 1345452427, "node_id": "I_kwDODLZ_YM5QMfmL", "number": 11, "title": "-a option is used for \"--auth\" and for \"--all\"", "user": {"value": 2467, "label": "fernand0"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2022-08-21T10:50:48Z", "updated_at": "2022-08-21T21:11:57Z", "closed_at": "2022-08-21T21:11:57Z", "author_association": "NONE", "pull_request": null, "body": "I'm not sure which option is best, instead of -a -all.", "repo": {"value": 213286752, "label": "pocket-to-sqlite"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/dogsheep/pocket-to-sqlite/issues/11/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 470691999, "node_id": "MDU6SXNzdWU0NzA2OTE5OTk=", "number": 43, "title": ".add_column() doesn't match indentation of initial creation", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2019-07-20T16:33:10Z", "updated_at": "2019-07-23T13:09:11Z", "closed_at": "2019-07-23T13:09:05Z", "author_association": "OWNER", "pull_request": null, "body": "I spotted a table which was created once and then had columns added to it and the formatted SQL looks like this:\r\n\r\n```sql\r\nCREATE TABLE [records] (\r\n [type] TEXT,\r\n [sourceName] TEXT,\r\n [sourceVersion] TEXT,\r\n [unit] TEXT,\r\n [creationDate] TEXT,\r\n [startDate] TEXT,\r\n [endDate] TEXT,\r\n [value] TEXT,\r\n [metadata_Health Mate App Version] TEXT,\r\n [metadata_Withings User Identifier] TEXT,\r\n [metadata_Modified Date] TEXT,\r\n [metadata_Withings Link] TEXT,\r\n [metadata_HKWasUserEntered] TEXT\r\n, [device] TEXT, [metadata_HKMetadataKeyHeartRateMotionContext] TEXT, [metadata_HKDeviceManufacturerName] TEXT, [metadata_HKMetadataKeySyncVersion] TEXT, 
[metadata_HKMetadataKeySyncIdentifier] TEXT, [metadata_HKSwimmingStrokeStyle] TEXT, [metadata_HKVO2MaxTestType] TEXT, [metadata_HKTimeZone] TEXT, [metadata_Average HR] TEXT, [metadata_Recharge] TEXT, [metadata_Lights] TEXT, [metadata_Asleep] TEXT, [metadata_Rating] TEXT, [metadata_Energy Threshold] TEXT, [metadata_Deep Sleep] TEXT, [metadata_Nap] TEXT, [metadata_Edit Slots] TEXT, [metadata_Tags] TEXT, [metadata_Daytime HR] TEXT)\r\n```\r\n\r\nIt would be nice if the columns that were added later matched the indentation of the initial columns.", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/43/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 816523763, "node_id": "MDU6SXNzdWU4MTY1MjM3NjM=", "number": 238, "title": ".add_foreign_key() corrupts database if column contains a space", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2021-02-25T15:07:20Z", "updated_at": "2021-02-25T16:54:02Z", "closed_at": "2021-02-25T16:54:02Z", "author_association": "OWNER", "pull_request": null, "body": "I ran this:\r\n\r\n db[\"Reports\"].add_foreign_key(\"Reported by ID\", \"Reporters\", \"id\")\r\n\r\nAnd got this:\r\n\r\n```\r\n~/jupyter-venv/lib/python3.9/site-packages/sqlite_utils/db.py in add_foreign_keys(self, foreign_keys)\r\n 616 # Have to VACUUM outside the transaction to ensure .foreign_keys property\r\n 617 # can see the newly created foreign key.\r\n--> 618 self.vacuum()\r\n 619 \r\n 620 def index_foreign_keys(self):\r\n\r\n~/jupyter-venv/lib/python3.9/site-packages/sqlite_utils/db.py in vacuum(self)\r\n 629 \r\n 630 def vacuum(self):\r\n--> 631 self.execute(\"VACUUM;\")\r\n 632 \r\n 633 \r\n\r\n~/jupyter-venv/lib/python3.9/site-packages/sqlite_utils/db.py in execute(self, sql, parameters)\r\n 234 return self.conn.execute(sql, parameters)\r\n 235 else:\r\n--> 236 return self.conn.execute(sql)\r\n 237 \r\n 238 def executescript(self, sql):\r\n\r\nDatabaseError: database disk image is malformed\r\n```", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/238/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 783910901, "node_id": "MDU6SXNzdWU3ODM5MTA5MDE=", "number": 221, "title": ".add_missing_columns() does not take case insensitivity into account", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 0, "created_at": "2021-01-12T05:01:00Z", "updated_at": "2021-01-12T23:17:33Z", "closed_at": "2021-01-12T23:17:33Z", "author_association": "OWNER", "pull_request": null, "body": "SQLite columns are case insensitive - but the `.add_missing_columns()` method doesn't know that. This means that it can crash if it identifies a column that is a case-insensitive duplicate of an existing column. 
https://github.com/simonw/sqlite-utils/blob/4cc82fd0bccc9d2eeb3510beb4e691d7da099f84/sqlite_utils/db.py#L1974-L1980", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/221/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 581339961, "node_id": "MDU6SXNzdWU1ODEzMzk5NjE=", "number": 92, "title": ".columns_dict doesn't work for all possible column types", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 7, "created_at": "2020-03-14T19:30:35Z", "updated_at": "2020-03-15T18:37:43Z", "closed_at": "2020-03-14T20:04:14Z", "author_association": "OWNER", "pull_request": null, "body": "Got this error:\r\n```\r\n File \".../python3.7/site-packages/sqlite_utils/db.py\", line 462, in \r\n for column in self.columns\r\nKeyError: 'REAL'\r\n```\r\n`.columns_dict` uses `REVERSE_COLUMN_TYPE_MAPPING`:\r\nhttps://github.com/simonw/sqlite-utils/blob/43f1c6ab4e3a6b76531fb6f5447adb83d26f3971/sqlite_utils/db.py#L457-L463\r\n`REVERSE_COLUMN_TYPE_MAPPING` defines `FLOAT` not `REAL`A\r\nhttps://github.com/simonw/sqlite-utils/blob/43f1c6ab4e3a6b76531fb6f5447adb83d26f3971/sqlite_utils/db.py#L68-L74", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/92/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 732685643, "node_id": "MDU6SXNzdWU3MzI2ODU2NDM=", "number": 1063, "title": ".csv should link to .blob downloads", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": {"value": 6026070, "label": "0.51"}, "comments": 3, "created_at": "2020-10-29T21:45:58Z", "updated_at": "2021-06-17T18:12:30Z", "closed_at": "2020-10-29T22:47:45Z", "author_association": "OWNER", "pull_request": null, "body": "- [x] Update `.csv` output to link to these things (and get that `xfail` test to pass)\r\n- ~~Add a `.csv?_blob_base64=1` argument that causes them to be output in base64 in the CSV~~\r\n\r\n> Moving the CSV work to a separate ticket.\r\n_Originally posted by @simonw in https://github.com/simonw/datasette/pull/1061#issuecomment-719042601_", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1063/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 1223699280, "node_id": "I_kwDOBm6k_c5I8CtQ", "number": 1739, "title": ".db downloads should be served with an ETag", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 6, "created_at": "2022-05-03T05:11:21Z", "updated_at": "2022-05-04T18:21:18Z", "closed_at": "2022-05-03T14:59:51Z", "author_association": "OWNER", "pull_request": null, "body": "I noticed that my Pyodide Datasette 
prototype is downloading the same database file every single time rather than browser caching it:\r\n\r\n![image](https://user-images.githubusercontent.com/9599/166407074-dee19587-0667-4424-9e88-d3b5b90fd819.png)\r\n", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1739/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 702386948, "node_id": "MDU6SXNzdWU3MDIzODY5NDg=", "number": 159, "title": ".delete_where() does not auto-commit (unlike .insert() or .upsert())", "user": {"value": 11712349, "label": "spdkils"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 9, "created_at": "2020-09-16T01:55:52Z", "updated_at": "2023-04-01T17:21:05Z", "closed_at": null, "author_association": "NONE", "pull_request": null, "body": "When you use the delete_where() function on a table, it never commits....\r\n\r\nIs that intentional?", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/159/reactions\", \"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1199158210, "node_id": "I_kwDOCGYnMM5HebPC", "number": 423, "title": ".extract() doesn't set foreign key when extracted columns contain NULL value", "user": {"value": 37447552, "label": "jlieth"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2022-04-10T20:05:30Z", "updated_at": "2022-08-27T14:45:04Z", "closed_at": "2022-08-27T14:45:04Z", "author_association": "NONE", "pull_request": null, "body": "I've run into an issue with `extract` and I don't believe this is the intended behaviour.\r\n\r\nI'm working with a database with music listening information. Currently it has one large table `listens` that contains all information. I'm trying to normalize the database by extracting relevant columns to separate tables (`artists`, `tracks`, `albums`). Not every track has an album.\r\n\r\nA simplified demonstration with just `track_title` and `album_title` columns:\r\n```ipython\r\nIn [1]: import sqlite_utils\r\n\r\nIn [2]: db = sqlite_utils.Database(memory=True)\r\n\r\nIn [3]: db[\"listens\"].insert_all([\r\n ...: {\"id\": 1, \"track_title\": \"foo\", \"album_title\": \"bar\"},\r\n ...: {\"id\": 2, \"track_title\": \"baz\", \"album_title\": None}\r\n ...: ], pk=\"id\")\r\nOut[3]: \r\n```\r\n\r\nThe track in the first row has an album, the second track doesn't. Now I extract album information into a separate column:\r\n```ipython\r\nIn [4]: db[\"listens\"].extract(columns=[\"album_title\"], table=\"albums\", fk_column=\"album_id\")\r\nOut[4]:
\r\n\r\nIn [5]: list(db[\"albums\"].rows)\r\nOut[5]: [{'id': 1, 'album_title': 'bar'}, {'id': 2, 'album_title': None}]\r\n\r\nIn [6]: list(db[\"listens\"].rows)\r\nOut[6]: \r\n[{'id': 1, 'track_title': 'foo', 'album_id': 1},\r\n {'id': 2, 'track_title': 'baz', 'album_id': None}]\r\n```\r\n\r\nThis behaves as expected -- the `album` table contains entries for both the existing album and the NULL album. The `listens` table has a foreign key only for the first row (since the album in the second row was empty).\r\n\r\nNow I want to extract the track information as well. Album information belongs to the track so I want to extract both columns to a new table.\r\n```ipython\r\nIn [7]: db[\"listens\"].extract(columns=[\"track_title\", \"album_id\"], table=\"tracks\", fk_column=\"track_id\")\r\nOut[7]:
\r\n\r\nIn [8]: list(db[\"tracks\"].rows)\r\nOut[8]: \r\n[{'id': 1, 'track_title': 'foo', 'album_id': 1},\r\n {'id': 2, 'track_title': 'baz', 'album_id': None}]\r\n\r\nIn [9]: list(db[\"listens\"].rows)\r\nOut[9]: [{'id': 1, 'track_id': 1}, {'id': 2, 'track_id': None}]\r\n```\r\n\r\nExtracting to the `tracks` table worked fine (both tracks are present with correct columns). However, the `listens` table only has a foreign key to the newly created tracks for the first row, the foreign key in the second row is NULL.\r\n\r\nChanging the order of extracts doesn't help.\r\n\r\nI poked around in the source a bit and I believe [this line](https://github.com/simonw/sqlite-utils/blob/433813612ff9b4b501739fd7543bef0040dd51fe/sqlite_utils/db.py#L1737) (essentially comparing `NULL = NULL`) is the problem, but I don't know enough about SQL to create a reliable fix myself.", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/423/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 722816436, "node_id": "MDU6SXNzdWU3MjI4MTY0MzY=", "number": 186, "title": ".extract() shouldn't extract null values", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 7, "created_at": "2020-10-16T02:41:08Z", "updated_at": "2021-08-12T12:32:14Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "This almost works, but it creates a rogue `type` record with a value of None.\r\n```\r\nIn [1]: import sqlite_utils\r\nIn [2]: db = sqlite_utils.Database(memory=True)\r\nIn [5]: db[\"creatures\"].insert_all([\r\n {\"id\": 1, \"name\": \"Simon\", \"type\": None},\r\n {\"id\": 2, \"name\": \"Natalie\", \"type\": None},\r\n {\"id\": 3, \"name\": \"Cleo\", \"type\": \"dog\"}], pk=\"id\")\r\nOut[5]:
\r\nIn [7]: db[\"creatures\"].extract(\"type\")\r\nOut[7]: <Table creatures (id, name, type_id)>
\r\nIn [8]: list(db[\"creatures\"].rows)\r\nOut[8]: \r\n[{'id': 1, 'name': 'Simon', 'type_id': None},\r\n {'id': 2, 'name': 'Natalie', 'type_id': None},\r\n {'id': 3, 'name': 'Cleo', 'type_id': 2}]\r\nIn [9]: db[\"type\"]\r\nOut[9]: <Table type (id, type)>
\r\nIn [10]: list(db[\"type\"].rows)\r\nOut[10]: [{'id': 1, 'type': None}, {'id': 2, 'type': 'dog'}]\r\n```", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/186/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 413871266, "node_id": "MDU6SXNzdWU0MTM4NzEyNjY=", "number": 18, "title": ".insert/.upsert/.insert_all/.upsert_all should add missing columns", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": {"value": 4348046, "label": "1.0"}, "comments": 2, "created_at": "2019-02-24T21:36:11Z", "updated_at": "2019-05-25T00:42:11Z", "closed_at": "2019-05-25T00:42:11Z", "author_association": "OWNER", "pull_request": null, "body": "This is a larger change, but it would be incredibly useful: if you attempt to insert or update a document with a field that does not currently exist in the underlying table, sqlite-utils should add the appropriate column for you.", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/18/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 808046597, "node_id": "MDU6SXNzdWU4MDgwNDY1OTc=", "number": 234, "title": ".insert_all() fails if subsequent chunks contain additional columns", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2021-02-14T21:01:51Z", "updated_at": "2021-02-14T21:03:40Z", "closed_at": "2021-02-14T21:03:40Z", "author_association": "OWNER", "pull_request": null, "body": "Reported by @nieuwenhoven in #225 along with a proposed fix.", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/234/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"}
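Note on the `.extract()` records above (sqlite-utils issues #423 and #186): the NULL handling they describe can be reproduced with the public `sqlite_utils` API. The sketch below is illustrative only and is not code taken from either issue; the in-memory database and the track/album rows simply mirror the transcript in #423, and the commented expectations describe the behaviour as reported there (on the versions those issues were filed against), not necessarily current releases.

```python
# Minimal reproduction sketch for the NULL-extraction behaviour described in
# sqlite-utils issues #423 and #186. Assumes sqlite-utils is installed and
# that the behaviour matches what those issues report.
import sqlite_utils

db = sqlite_utils.Database(memory=True)
db["listens"].insert_all(
    [
        {"id": 1, "track_title": "foo", "album_title": "bar"},
        {"id": 2, "track_title": "baz", "album_title": None},  # no album
    ],
    pk="id",
)

# Pull album_title out into a lookup table, leaving an album_id foreign key
# on the original table (same call as in the #423 transcript).
db["listens"].extract(columns=["album_title"], table="albums", fk_column="album_id")

# As reported: the lookup table gains a row for the NULL value, while the
# listens row whose album_title was NULL gets album_id = None instead of a
# foreign key pointing at that NULL lookup row.
print(list(db["albums"].rows))
print(list(db["listens"].rows))
```

Issue #423 traces this to the extraction SQL effectively comparing `NULL = NULL`, which is never true in SQL; SQLite's `IS` operator, by contrast, does treat two NULLs as equal.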