{"id": 628121234, "node_id": "MDU6SXNzdWU2MjgxMjEyMzQ=", "number": 788, "title": " /-/permissions debugging tool", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": {"value": 5512395, "label": "Datasette 0.44"}, "comments": 2, "created_at": "2020-06-01T03:13:47Z", "updated_at": "2020-06-06T00:43:40Z", "closed_at": "2020-06-01T05:01:01Z", "author_association": "OWNER", "pull_request": null, "body": "> Debugging tool idea: `/-/permissions` page which shows you the actor and lets you type in the strings for `action`, `resource_type` and `resource_identifier` - then shows you EVERY plugin hook that would have executed and what it would have said, plus when the chain would have terminated.\r\n>\r\n> Bonus: if you're logged in as the `root` user (or a user that matches some kind of permission check, maybe a check for `permissions_debug`) you get to see a rolling log of the last 30 permission checks and what the results were across the whole of Datasette. This should make figuring out permissions policies a whole lot easier.\r\n\r\n_Originally posted by @simonw in https://github.com/simonw/datasette/issues/699#issuecomment-636576603_", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/788/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 314455877, "node_id": "MDExOlB1bGxSZXF1ZXN0MTgxNzIzMzAz", "number": 209, "title": " Don't duplicate simple primary keys in the link column", "user": {"value": 45057, "label": "russss"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 6, "created_at": "2018-04-15T21:56:15Z", "updated_at": "2018-04-18T08:40:37Z", "closed_at": "2018-04-18T01:13:04Z", "author_association": "CONTRIBUTOR", "pull_request": "simonw/datasette/pulls/209", "body": "When there's a simple (single-column) primary key, it looks weird to duplicate it in the link column.\r\n\r\nThis change removes the second PK column and treats the link column as if it were the PK column from a header/sorting perspective. \r\n\r\nThis might make it a bit more difficult to tell what the link for the row is, I'm not sure yet. I feel like the alternative is to change the link column to just have the text \"view\" or something, instead of repeating the PK. 
(I doubt it makes much more sense with compound PKs.)\r\n\r\nBonus change in this PR: fix urlencoding of links in the displayed HTML.\r\n\r\nBefore:\r\n![image](https://user-images.githubusercontent.com/45057/38783830-e2ababb4-40ff-11e8-97fb-25e286a8c920.png)\r\n\r\nAfter:\r\n![image](https://user-images.githubusercontent.com/45057/38783835-ebf6b48e-40ff-11e8-8c47-6a864cf21ccc.png)", "repo": {"value": 107914493, "label": "datasette"}, "type": "pull", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/209/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": 0, "state_reason": null} {"id": 351017365, "node_id": "MDExOlB1bGxSZXF1ZXN0MjA4NzE5MDQz", "number": 361, "title": " Import pysqlite3 if available, closes #360 ", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 0, "created_at": "2018-08-16T00:52:21Z", "updated_at": "2018-08-16T00:58:57Z", "closed_at": "2018-08-16T00:58:57Z", "author_association": "OWNER", "pull_request": "simonw/datasette/pulls/361", "body": "", "repo": {"value": 107914493, "label": "datasette"}, "type": "pull", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/361/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": 0, "state_reason": null} {"id": 1871935751, "node_id": "I_kwDOD079W85vk3kH", "number": 40, "title": " ImportError: cannot import name 'formatargspec' from 'inspect'", "user": {"value": 36752421, "label": "hosslikw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 0, "created_at": "2023-08-29T15:36:31Z", "updated_at": "2023-08-31T03:18:07Z", "closed_at": "2023-08-31T03:18:06Z", "author_association": "NONE", "pull_request": null, "body": "I get the following error when running \"pip3 install dogsheep-photos\"\r\n\" from inspect import ismethod, isclass, formatargspec\r\n ImportError: cannot import name 'formatargspec' from 'inspect' (/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/inspect.py). 
Did you mean: 'formatargvalues'?\"\r\n \r\nPython 3.12.0rc1\r\nsqlite 3.43.0\r\ndatasette, version 0.64.3", "repo": {"value": 256834907, "label": "dogsheep-photos"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/dogsheep/dogsheep-photos/issues/40/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 1622640374, "node_id": "I_kwDOCGYnMM5gt4b2", "number": 534, "title": " ResourceWarning: unclosed file", "user": {"value": 1244826, "label": "djhenderson"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2023-03-14T03:02:18Z", "updated_at": "2023-05-08T19:56:29Z", "closed_at": "2023-05-08T19:56:29Z", "author_association": "NONE", "pull_request": null, "body": "Issuing either\r\n\r\n```\r\npy -Wdefault -m sqlite_utils insert dogs.db dogs dogs0.csv --csv\r\n [#############-----------------------] 36%\r\n [####################################] 100%C:\\Users\\Doug\\AppData\\Local\\Programs\\Python\\Python311\\Lib\\site-packages\\sqlite_utils\\cli.py:1187: ResourceWarning: unclosed file <_io.TextIOWrapper name='dogs0.csv' encoding='utf-8-sig'>\r\n insert_upsert_implementation(\r\nResourceWarning: Enable tracemalloc to get the object allocation traceback\r\n```\r\nor\r\n```\r\nset pythonwarnings=default\r\nsqlite-utils insert dogs.db dogs dogs0.csv --csv\r\n [#############-----------------------] 36%\r\n [####################################] 100%C:\\Users\\Doug\\AppData\\Local\\Programs\\Python\\Python311\\Lib\\site-packages\\sqlite_utils\\cli.py:1187: ResourceWarning: unclosed file <_io.TextIOWrapper name='dogs0.csv' encoding='utf-8-sig'>\r\n insert_upsert_implementation(\r\nResourceWarning: Enable tracemalloc to get the object allocation traceback\r\n```\r\n\r\nexhibits a ResourceWarning indicating that the CSV file being loaded is not closed.\r\n\r\nsqlite-utils --version\r\nsqlite-utils, version 3.30\r\npy --version\r\nPython 3.11.2\r\nWindows Version 10.0.19045 Build 19045\r\nSQLite version 3.41.0 2023-02-21 18:09:37\r\n", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/534/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 1223527226, "node_id": "I_kwDOBm6k_c5I7Ys6", "number": 1738, "title": "\"Cannot use _sort and _sort_desc at the same time\"", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": {"value": 8303187, "label": "Datasette 0.62"}, "comments": 2, "created_at": "2022-05-03T01:06:24Z", "updated_at": "2022-08-14T16:13:55Z", "closed_at": "2022-08-14T16:13:55Z", "author_association": "OWNER", "pull_request": null, "body": "Triggered this error while playing with the sort desc checkbox and the apply button that are only visible on this page at mobile screen width:\r\n\r\nhttps://latest.datasette.io/fixtures/compound_three_primary_keys?_sort_desc=pk1\r\n\r\nNavigate to that page (with the browser narrow enough to show the box), un-check the box and click 
Apply:\r\n\r\n![sort-bug](https://user-images.githubusercontent.com/9599/166390804-cb289b29-63dc-4986-b7f9-81cf2ae04914.gif)\r\n\r\nAlso notable: I managed to get to a page with `?_sort_desk=pk1` in the URL three times by clicking around with that button.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1738/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 721050815, "node_id": "MDU6SXNzdWU3MjEwNTA4MTU=", "number": 1019, "title": "\"Edit SQL\" button on canned queries", "user": {"value": 639012, "label": "jsfenfen"}, "state": "closed", "locked": 0, "assignee": null, "milestone": {"value": 6026070, "label": "0.51"}, "comments": 7, "created_at": "2020-10-14T00:51:39Z", "updated_at": "2020-10-23T19:44:06Z", "closed_at": "2020-10-14T03:44:23Z", "author_association": "CONTRIBUTOR", "pull_request": null, "body": "Feature request: Would it be possible to add an \"edit this query\" button on canned queries? Clicking it would open the canned query as an editable sql query. I think the intent is to have named parameters to allow this, but sometimes you just gotta rewrite it? ", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1019/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 1180427792, "node_id": "I_kwDOCGYnMM5GW-YQ", "number": 421, "title": "\"Error: near \"(\": syntax error\" when using sqlite-utils indexes CLI", "user": {"value": 24938923, "label": "learning4life"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 8, "created_at": "2022-03-25T07:12:51Z", "updated_at": "2022-04-13T22:41:59Z", "closed_at": "2022-04-13T22:41:59Z", "author_association": "NONE", "pull_request": null, "body": "This bug relates to https://github.com/simonw/sqlite-utils/issues/408#issuecomment-1066139147\r\n\r\n**New error when using CLI: \"sqlite-utils indexes global.db --table\"**\r\n\r\n```\r\n(app-root) sqlite-utils indexes global.db --table\r\nError: near \"(\": syntax error\r\n(app-root) sqlite-utils --version\r\nsqlite-utils, version 3.25.1\r\n(app-root) sqlite3 --version\r\n3.36.0 2021-06-18 18:36:39\r\n(app-root) python --version\r\nPython 3.8.11\r\n```\r\n\r\n\r\nDockerfile\r\n```\r\nFROM centos/python-38-centos7\r\n\r\nUSER root\r\n\r\nRUN yum update -y\r\nRUN yum upgrade -y\r\n\r\n\r\n# epel\r\nRUN yum -y install epel-release && yum clean all\r\n\r\n# SQLite\r\nRUN yum -y install zlib-devel geos geos-devel proj proj-devel freexl freexl-devel libxml2-devel \r\n\r\nWORKDIR /build/\r\nCOPY sqlite-autoconf-3360000.tar.gz ./\r\nRUN tar -zxf sqlite-autoconf-3360000.tar.gz\r\nWORKDIR /build/sqlite-autoconf-3360000\r\nRUN ./configure\r\nRUN make\r\nRUN make install\r\n\r\n# \r\nRUN /opt/app-root/bin/python3.8 -m pip install --upgrade pip\r\nRUN pip install sqlite-utils\r\n```", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": 
\"https://api.github.com/repos/simonw/sqlite-utils/issues/421/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 520655983, "node_id": "MDU6SXNzdWU1MjA2NTU5ODM=", "number": 619, "title": "\"Invalid SQL\" page should let you edit the SQL", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 14, "created_at": "2019-11-10T20:54:12Z", "updated_at": "2022-01-13T22:21:42Z", "closed_at": "2021-06-02T04:15:54Z", "author_association": "OWNER", "pull_request": null, "body": "https://latest.datasette.io/fixtures?sql=select%0D%0A++*%0D%0Afrom%0D%0A++%5Bfoo%5D\r\n\r\n\r\n\r\nWould be useful if this page showed you the invalid SQL you entered so you can edit it and try again.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/619/reactions\", \"total_count\": 2, \"+1\": 2, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 1059509927, "node_id": "I_kwDOBm6k_c4_Jtan", "number": 1525, "title": "\"Links from other tables\" broken for columns starting with underscore", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2021-11-21T22:55:08Z", "updated_at": "2021-11-30T06:39:01Z", "closed_at": "2021-11-30T06:34:35Z", "author_association": "OWNER", "pull_request": null, "body": "Same bug as #1506, this time it's this link or the row page:\r\n\r\n\"image\"\r\n", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1525/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 647103735, "node_id": "MDU6SXNzdWU2NDcxMDM3MzU=", "number": 875, "title": "\"Logged in as: XXX - logout\" navigation item", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": {"value": 5533512, "label": "Datasette 0.45"}, "comments": 3, "created_at": "2020-06-29T04:31:14Z", "updated_at": "2020-07-02T00:13:24Z", "closed_at": "2020-06-29T18:43:50Z", "author_association": "OWNER", "pull_request": null, "body": "_Originally posted by @simonw in https://github.com/simonw/datasette/issues/840#issuecomment-650895874_", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/875/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 898904402, "node_id": "MDU6SXNzdWU4OTg5MDQ0MDI=", "number": 1337, "title": "\"More\" link for facets that shows _facet_size=max results", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 7, "created_at": "2021-05-23T00:08:51Z", "updated_at": 
"2021-05-27T16:14:14Z", "closed_at": "2021-05-27T16:01:03Z", "author_association": "OWNER", "pull_request": null, "body": "_Original title: \"More\" link for facets that shows the full set of results_\r\n\r\nThe simplest way to do this will be to have it link to a generated SQL query.\r\n\r\n_Originally posted by @simonw in https://github.com/simonw/datasette/issues/1332#issuecomment-846479062_", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1337/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 671056788, "node_id": "MDU6SXNzdWU2NzEwNTY3ODg=", "number": 914, "title": "\"Object of type bytes is not JSON serializable\" for _nl=on", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2020-08-01T17:43:10Z", "updated_at": "2020-08-16T21:10:27Z", "closed_at": "2020-08-16T18:26:59Z", "author_association": "OWNER", "pull_request": null, "body": "https://latest.datasette.io/fixtures/binary_data.json?_sort_desc=data&_shape=array returns this:\r\n```json\r\n[\r\n {\r\n \"rowid\": 1,\r\n \"data\": \"this is binary data\"\r\n }\r\n]\r\n```\r\nBut adding `&_nl=on` returns this: https://latest.datasette.io/fixtures/binary_data.json?_sort_desc=data&_shape=array&_nl=on\r\n```json\r\n{\r\n \"ok\": false,\r\n \"error\": \"Object of type bytes is not JSON serializable\",\r\n \"status\": 500,\r\n \"title\": null\r\n}\r\n```\r\nI found this error by running `wget -r 127.0.0.1:8001` against my local `fixtures.db`.\r\n\r\n", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/914/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 761713079, "node_id": "MDU6SXNzdWU3NjE3MTMwNzk=", "number": 1138, "title": "\"Powered by Datasette\" should link to new datasette.io site", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 0, "created_at": "2020-12-10T23:33:41Z", "updated_at": "2020-12-15T02:28:10Z", "closed_at": "2020-12-10T23:37:14Z", "author_association": "OWNER", "pull_request": null, "body": "https://datasette.io/", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1138/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 959999095, "node_id": "MDU6SXNzdWU5NTk5OTkwOTU=", "number": 1421, "title": "\"Query parameters\" form shows wrong input fields if query contains \"03:31\" style times", "user": {"value": 6988, "label": "j4mie"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 11, "created_at": "2021-08-04T07:29:04Z", "updated_at": "2021-08-09T03:41:07Z", "closed_at": "2021-08-09T03:33:02Z", "author_association": "NONE", 
"pull_request": null, "body": "Datasette version `0.58.1`.\r\n\r\nI'm guessing this is a bug in the code that looks for `:param`-style query parameters..\r\n\r\n\"image\"\r\n", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1421/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 1084007781, "node_id": "I_kwDOBm6k_c5AnKVl", "number": 1572, "title": "\"Query took\" should be \"Queries took\"", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": {"value": 7571612, "label": "Datasette 0.60"}, "comments": 0, "created_at": "2021-12-19T04:03:00Z", "updated_at": "2022-01-13T22:27:43Z", "closed_at": "2021-12-19T04:03:24Z", "author_association": "OWNER", "pull_request": null, "body": "This is misleading, since usually there have been more than one query executed:\r\n\r\n![CleanShot 2021-12-18 at 20 02 35@2x](https://user-images.githubusercontent.com/9599/146663457-9c4c2900-5cc0-4650-a565-bb1ff0b8a725.png)\r\n\r\n", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1572/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 275228834, "node_id": "MDU6SXNzdWUyNzUyMjg4MzQ=", "number": 136, "title": "\"Reformat SQL\" button next to SQL editor textarea", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 0, "created_at": "2017-11-20T03:42:19Z", "updated_at": "2019-10-14T03:46:13Z", "closed_at": "2019-10-14T03:46:13Z", "author_association": "OWNER", "pull_request": null, "body": "Can use this:\r\n\r\nhttps://github.com/zeroturnaround/sql-formatter\r\nhttps://zeroturnaround.github.io/sql-formatter/\r\n", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/136/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 903200328, "node_id": "MDU6SXNzdWU5MDMyMDAzMjg=", "number": 1341, "title": "\"Show all columns\" cog menu item should show if ?_col= is used", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2021-05-27T04:28:17Z", "updated_at": "2021-05-27T04:31:16Z", "closed_at": "2021-05-27T04:31:16Z", "author_association": "OWNER", "pull_request": null, "body": "On https://latest.datasette.io/fixtures/sortable?_col=sortable the \"Show all columns\" item (from #615) is not shown (it should be):\r\n\r\n\"fixtures__sortable__201_rows\"\r\n", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1341/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 
0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 784628163, "node_id": "MDU6SXNzdWU3ODQ2MjgxNjM=", "number": 1185, "title": "\"Statement may not contain PRAGMA\" error is not strictly true", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": {"value": 6346396, "label": "Datasette 0.54"}, "comments": 3, "created_at": "2021-01-12T22:07:10Z", "updated_at": "2021-01-24T21:21:37Z", "closed_at": "2021-01-12T22:26:26Z", "author_association": "OWNER", "pull_request": null, "body": "Consider https://latest.datasette.io/fixtures?sql=select+%27select%0D%0A%27+%7C%7C+group_concat%28%27++++case+when+%5B%27+%7C%7C+name+%7C%7C+%27%5D+is+not+null+then+%27+%7C%7C+quote%28name+%7C%7C+%27%2C+%27%29+%7C%7C+%27+else+%27%27%27%27+end%27%2C+%27+%7C%7C%0D%0A%27%29+%7C%7C+%27%0D%0A++as+columns%2C%0D%0A++count%28*%29+as+num_rows%0D%0Afrom%0D%0A++%5B%27+%7C%7C+%3Atable+%7C%7C+%27%5D%0D%0Agroup+by%0D%0A++columns%0D%0Aorder+by%0D%0A++num_rows+desc%27+as+query+from+pragma_ytable_info%28%3Atable%29&table=facetable\r\n\r\nIt says \"Statement may not contain PRAGMA\" - but that's not actually true. Datasette has an allow-list of PRAGMA that are OK - in this case there was a typo in `pragma_ytable_info` which caused the error, but pragma_table_info` would have been OK.\r\n\r\nSo the error message is misleading.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1185/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 573583971, "node_id": "MDU6SXNzdWU1NzM1ODM5NzE=", "number": 689, "title": "\"Templates considered\" comment broken in >=0.35", "user": {"value": 35075, "label": "chrishas35"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 6, "created_at": "2020-03-01T17:31:21Z", "updated_at": "2020-04-05T19:39:44Z", "closed_at": "2020-04-05T19:39:44Z", "author_association": "NONE", "pull_request": null, "body": "Noticed that the \"Templates Considered\" comment is missing in 0.37. Believe I traced it back to #664 as you can see it in https://v0-34.datasette.io/ but not https://v0-35.datasette.io/. Looking at the template context debug between the two you can see what is missing from 0.35 vs. 
0.34:\r\n\r\n```diff\r\n< \"datasette_version\": \"0.34\",\r\n< \"app_css_hash\": \"ffa51a\",\r\n< \"select_templates\": [\r\n< \"*index.html\"\r\n< ],\r\n< \"zip\": \"\",\r\n< \"body_scripts\": [],\r\n< \"extra_css_urls\": \"\",\r\n< \"extra_js_urls\": \"\",\r\n< \"format_bytes\": \"\",\r\n< \"database_url\": \">\",\r\n< \"database_color\": \">\"\r\n---\r\n> \"datasette_version\": \"0.35\",\r\n> \"database_url\": \">\",\r\n> \"database_color\": \">\"\r\n```", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/689/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 1907655261, "node_id": "I_kwDOBm6k_c5xtIJd", "number": 2193, "title": "\"Test DATASETTE_LOAD_PLUGINS\" test shows errors but did not fail the CI run", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 6, "created_at": "2023-09-21T19:49:34Z", "updated_at": "2023-09-21T21:56:43Z", "closed_at": "2023-09-21T21:56:43Z", "author_association": "OWNER", "pull_request": null, "body": "> That passed on 3.8 but should have failed: https://github.com/simonw/datasette/actions/runs/6266341481/job/17017099801 - the \"Test DATASETTE_LOAD_PLUGINS\" test shows errors but did not fail the CI run.\r\n\r\n_Originally posted by @simonw in https://github.com/simonw/datasette/issues/2057#issuecomment-1730201226_\r\n ", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/2193/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 473083260, "node_id": "MDU6SXNzdWU0NzMwODMyNjA=", "number": 50, "title": "\"Too many SQL variables\" on large inserts", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 4, "created_at": "2019-07-25T21:43:31Z", "updated_at": "2022-11-04T14:38:36Z", "closed_at": "2019-07-28T11:59:33Z", "author_association": "OWNER", "pull_request": null, "body": "Reported here: https://github.com/dogsheep/healthkit-to-sqlite/issues/9\r\n\r\nIt looks like there's a default limit of 999 variables - we need to be smart about that, maybe dynamically lower the batch size based on the number of columns.", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/50/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 638241779, "node_id": "MDU6SXNzdWU2MzgyNDE3Nzk=", "number": 846, "title": "\"Too many open files\" error running tests", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 6, "created_at": "2020-06-13T22:11:40Z", "updated_at": "2020-06-14T00:26:31Z", "closed_at": "2020-06-14T00:26:31Z", "author_association": "OWNER", "pull_request": 
null, "body": "I got this on my laptop:\r\n```pytest\r\n...\r\n/Users/simon/.local/share/virtualenvs/datasette-AWNrQs95/lib/python3.7/site-packages/jinja2/loaders.py:171: in get_source\r\n f = open_if_exists(filename)\r\n_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ \r\n\r\nfilename = '/Users/simon/Dropbox/Development/datasette/datasette/templates/400.html', mode = 'rb'\r\n\r\n def open_if_exists(filename, mode='rb'):\r\n \"\"\"Returns a file descriptor for the filename if that file exists,\r\n otherwise `None`.\r\n \"\"\"\r\n try:\r\n> return open(filename, mode)\r\nE OSError: [Errno 24] Too many open files: '/Users/simon/Dropbox/Development/datasette/datasette/templates/400.html'\r\n\r\n/Users/simon/.local/share/virtualenvs/datasette-AWNrQs95/lib/python3.7/site-packages/jinja2/utils.py:154: OSError\r\n```\r\nBased on the conversation in https://github.com/pytest-dev/pytest/issues/2970 I'm worried that my tests are opening too many files without closing them.\r\n\r\nIn particular... I call `sqlite3.connect(filepath)` a LOT - and I don't ever call `conn.close()` on those opened connections:\r\n\r\nhttps://github.com/simonw/datasette/blob/cf7a2bdb404734910ec07abc7571351a2d934828/datasette/database.py#L58-L60\r\n\r\nCould this be resulting in my tests eventually opening too many unclosed file handles? How could I confirm this?", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/846/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 760312579, "node_id": "MDU6SXNzdWU3NjAzMTI1Nzk=", "number": 1134, "title": "\"_searchmode=raw\" throws an index out of range error when combined with \"_search_COLUMN\"", "user": {"value": 2181410, "label": "clausjuhl"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 4, "created_at": "2020-12-09T13:05:37Z", "updated_at": "2020-12-10T05:57:17Z", "closed_at": "2020-12-09T19:56:55Z", "author_association": "NONE", "pull_request": null, "body": "Hi Simon!\r\nMaybe it's just me, but when [using _searchmode=raw (trying to enable wildcard-searching) in combination with the \"_search_COLUMN\"-table argument](https://byraadsarkivet.aarhus.dk/db/cases?_searchmode=raw&_search_title=sundhedsfrem*), I get a list index out of range error. [When combining with the simpler \"_search\"-argument everything works, including wildcard-seaches.](https://byraadsarkivet.aarhus.dk/db/cases?_search=sundhedsfrem*&_searchmode=raw). 
Here's the traceback:\r\n\r\n```\r\nTraceback (most recent call last):\r\n File \"/Users/cjk/.local/share/virtualenvs/minutes-jMDZ8Ssk/lib/python3.7/site-packages/datasette/utils/asgi.py\", line 122, in route_path\r\n return await view(new_scope, receive, send)\r\n File \"/Users/cjk/.local/share/virtualenvs/minutes-jMDZ8Ssk/lib/python3.7/site-packages/datasette/utils/asgi.py\", line 196, in view\r\n request, **scope[\"url_route\"][\"kwargs\"]\r\n File \"/Users/cjk/.local/share/virtualenvs/minutes-jMDZ8Ssk/lib/python3.7/site-packages/datasette/views/base.py\", line 204, in get\r\n request, database, hash, correct_hash_provided, **kwargs\r\n File \"/Users/cjk/.local/share/virtualenvs/minutes-jMDZ8Ssk/lib/python3.7/site-packages/datasette/views/base.py\", line 342, in view_get\r\n request, database, hash, **kwargs\r\n File \"/Users/cjk/.local/share/virtualenvs/minutes-jMDZ8Ssk/lib/python3.7/site-packages/datasette/views/table.py\", line 393, in data\r\n search_col = key.split(\"_search_\", 1)[1]\r\nIndexError: list index out of range\r\n\r\n```", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1134/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 665400224, "node_id": "MDU6SXNzdWU2NjU0MDAyMjQ=", "number": 906, "title": "\"allow\": true for anyone, \"allow\": false for nobody", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": {"value": 5607421, "label": "Datasette 0.46"}, "comments": 3, "created_at": "2020-07-24T20:28:10Z", "updated_at": "2020-07-25T00:07:10Z", "closed_at": "2020-07-25T00:05:04Z", "author_association": "OWNER", "pull_request": null, "body": "The \"allow\" syntax described at https://datasette.readthedocs.io/en/0.45/authentication.html#defining-permissions-with-allow-blocks currently says this:\r\n\r\n> An allow block can specify \"no-one is allowed to do this\" using an empty `{}`:\r\n> \r\n> ```\r\n> {\r\n> \"allow\": {}\r\n> }\r\n> ```\r\n\r\n`\"allow\": null` allows all access, though this isn't documented (it should be though).\r\n\r\nThese are not very intuitive. 
How about also supporting `\"allow\": true` for \"allow anyone\" and `\"allow\": false` for \"allow nobody\"?", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/906/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 1123849278, "node_id": "I_kwDOCGYnMM5C_JQ-", "number": 395, "title": "\"apt-get: command not found\" error on macOS", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2022-02-04T06:03:42Z", "updated_at": "2022-02-04T06:10:58Z", "closed_at": "2022-02-04T06:10:58Z", "author_association": "OWNER", "pull_request": null, "body": "Yeah, `apt-get` isn't a thing on macOS so 4a2a3e2fd0d5534f446b3f1fee34cb165e4d86d2 (to test #79 against real SpatiaLite) broke.", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/395/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 324720095, "node_id": "MDU6SXNzdWUzMjQ3MjAwOTU=", "number": 275, "title": "\"config\" section in metadata.json (root, database and table level)", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2018-05-20T16:02:28Z", "updated_at": "2023-08-23T01:28:37Z", "closed_at": "2023-08-23T01:28:37Z", "author_association": "OWNER", "pull_request": null, "body": "Split off from #274 \r\n\r\nMetadata should an optional `\"config\"` section at root, table or database level.\r\n\r\nThe TableView and RowView and DatabaseView and BaseView classes could all have a `.config(\"key\")` method which knows how to resolve the hierarchy of configs.\r\n\r\nThis will allow individual tables (or databases) to set their own config settings for things like `sql_time_limit_ms`", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/275/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 1102612922, "node_id": "I_kwDOBm6k_c5BuIm6", "number": 1597, "title": "\"datasette inspect\" has no help summary", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2022-01-14T00:02:16Z", "updated_at": "2022-01-14T00:07:36Z", "closed_at": "2022-01-14T00:07:36Z", "author_association": "OWNER", "pull_request": null, "body": "Made obvious by the new CLI reference page added in #1594. 
https://docs.datasette.io/en/latest/cli-reference.html#datasette-inspect-help\r\n```\r\nCommands:\r\n serve* Serve up specified SQLite database files with a web UI\r\n inspect\r\n install Install Python packages - e.g.\r\n```\r\n```\r\nUsage: datasette inspect [OPTIONS] [FILES]...\r\n\r\nOptions:\r\n --inspect-file TEXT\r\n --load-extension TEXT Path to a SQLite extension to load\r\n --help Show this message and exit.\r\n```", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1597/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 757481949, "node_id": "MDU6SXNzdWU3NTc0ODE5NDk=", "number": 1131, "title": "\"datasette inspect\" outputs invalid JSON if an error is logged", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2020-12-05T00:00:45Z", "updated_at": "2020-12-05T20:48:34Z", "closed_at": "2020-12-05T05:21:19Z", "author_association": "OWNER", "pull_request": null, "body": "See https://github.com/simonw/register-of-members-interests/issues/6:\r\n```\r\n% datasette inspect regmem.db \r\nERROR: conn=, sql = 'select count(*) from [items_fts]', params = None: SQL logic error\r\n{\r\n \"regmem\": {\r\n \"hash\": \"6fde27e3dea80d6b65f2ac7f89cd8448980fee8c91b505ba29c311ba0393317f\",\r\n \"size\": 936198144,\r\n```", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1131/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 677227912, "node_id": "MDU6SXNzdWU2NzcyMjc5MTI=", "number": 925, "title": "\"datasette install\" and \"datasette uninstall\" commands", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2020-08-11T22:04:32Z", "updated_at": "2020-08-11T22:34:37Z", "closed_at": "2020-08-11T22:32:12Z", "author_association": "OWNER", "pull_request": null, "body": "When installing Datasette plugins it's crucial that they end up in the same virtual environment as Datasette itself.\r\n\r\nIt's not necessarily obvious how to do this, especially if you install Datasette via pipx or homebrew.\r\n\r\nSolution: `datasette install datasette-vega` and `datasette uninstall datasette-vega` commands that know how to install to the correct place - a very thin wrapper around `pip install`.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/925/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 632056825, "node_id": "MDU6SXNzdWU2MzIwNTY4MjU=", "number": 802, "title": "\"datasette plugins\" command is broken", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 
1, "created_at": "2020-06-05T23:33:01Z", "updated_at": "2020-06-05T23:46:43Z", "closed_at": "2020-06-05T23:46:43Z", "author_association": "OWNER", "pull_request": null, "body": "I broke it in https://github.com/simonw/datasette/commit/a7137dfe069e5fceca56f78631baebd4a6a19967 - and it turns out there was no test coverage so I didn't realize it was broken.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/802/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 434321685, "node_id": "MDExOlB1bGxSZXF1ZXN0MjcxMzM4NDA1", "number": 434, "title": "\"datasette publish cloudrun\" command to publish to Google Cloud Run", "user": {"value": 10352819, "label": "rprimet"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 8, "created_at": "2019-04-17T14:41:18Z", "updated_at": "2019-05-03T21:50:44Z", "closed_at": "2019-05-03T13:59:02Z", "author_association": "CONTRIBUTOR", "pull_request": "simonw/datasette/pulls/434", "body": "This is a very rough draft to start a discussion on a possible datasette cloud run publish plugin (see issue #400).\r\n\r\nThe main change was to dynamically set the listening port in `make_dockerfile` to satisfy cloud run's [requirements](https://cloud.google.com/run/docs/reference/container-contract).\r\n\r\nThis was done by running `datasette` through `sh` to get environment variable substitution. Not sure if that's the right approach?\r\n", "repo": {"value": 107914493, "label": "datasette"}, "type": "pull", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/434/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": 0, "state_reason": null} {"id": 521335335, "node_id": "MDU6SXNzdWU1MjEzMzUzMzU=", "number": 629, "title": "\"datasette publish\" commands should deploy with Python 3.8", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2019-11-12T05:22:31Z", "updated_at": "2019-11-12T06:03:10Z", "closed_at": "2019-11-12T06:03:10Z", "author_association": "OWNER", "pull_request": null, "body": "Now that we support 3.8 (#627) `datasette publish` should always deploy using Python 3.8.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/629/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 628089318, "node_id": "MDU6SXNzdWU2MjgwODkzMTg=", "number": 787, "title": "\"datasette publish\" should bake in a random --secret", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": {"value": 5512395, "label": "Datasette 0.44"}, "comments": 1, "created_at": "2020-06-01T01:15:26Z", "updated_at": "2020-06-11T16:02:05Z", "closed_at": "2020-06-11T16:02:05Z", "author_association": "OWNER", "pull_request": null, "body": "To allow signed cookies 
etc to work reliably (see #785) all of the `datasette publish` commands should generate a random secret on publish and bake it into the configuration - probably by setting the `DATASETTE_SECRET` environment variable.\r\n\r\n- [ ] Cloud Run\r\n- [ ] Heroku\r\n- [ ] https://github.com/simonw/datasette-publish-now\r\n- [ ] https://github.com/simonw/datasette-publish-fly", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/787/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 628499086, "node_id": "MDU6SXNzdWU2Mjg0OTkwODY=", "number": 790, "title": "\"flash messages\" mechanism", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": {"value": 5512395, "label": "Datasette 0.44"}, "comments": 20, "created_at": "2020-06-01T14:55:44Z", "updated_at": "2020-06-08T19:33:59Z", "closed_at": "2020-06-02T21:14:03Z", "author_association": "OWNER", "pull_request": null, "body": "> Passing `?_success` like this isn't necessarily the best approach. Potential improvements include:\r\n> \r\n> - Signing this message so it can't be tampered with (I could generate a signing secret on startup)\r\n> - Using a cookie with a temporary flash message in it instead\r\n> - Using HTML5 history API to remove the `?_success=` from the URL bar when the user lands on the page\r\n> \r\n> If I add an option to redirect the user to another page after success I may need a mechanism to show a flash message on that page as well, in which case I'll need a general flash message solution that works for any page.\r\n\r\n_Originally posted by @simonw in https://github.com/simonw/datasette/pull/703_", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/790/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 520508502, "node_id": "MDU6SXNzdWU1MjA1MDg1MDI=", "number": 31, "title": "\"friends\" command (similar to \"followers\")", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 2, "created_at": "2019-11-09T20:20:20Z", "updated_at": "2022-09-20T05:05:03Z", "closed_at": "2020-02-07T07:03:28Z", "author_association": "MEMBER", "pull_request": null, "body": "Current list of commands:\r\n```\r\n followers Save followers for specified user (defaults to...\r\n followers-ids Populate followers table with IDs of account followers\r\n friends-ids Populate followers table with IDs of account friends\r\n```\r\nObvious omission here is `friends`, which would be powered by `https://api.twitter.com/1.1/friends/list.json`: https://developer.twitter.com/en/docs/accounts-and-users/follow-search-get-users/api-reference/get-friends-list", "repo": {"value": 206156866, "label": "twitter-to-sqlite"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/31/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, 
\"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 668308777, "node_id": "MDU6SXNzdWU2NjgzMDg3Nzc=", "number": 129, "title": "\"insert-files --sqlar\" for creating SQLite archives", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 2, "created_at": "2020-07-30T02:28:29Z", "updated_at": "2020-07-30T22:41:01Z", "closed_at": "2020-07-30T22:40:55Z", "author_association": "OWNER", "pull_request": null, "body": "A `--sqlar` option could cause `insert-files` to behave in the same way as SQLite's own sqlar mechanism.\r\n\r\nhttps://www.sqlite.org/sqlar.html and https://sqlite.org/sqlar/doc/trunk/README.md", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/129/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 944903881, "node_id": "MDU6SXNzdWU5NDQ5MDM4ODE=", "number": 1396, "title": "\"invalid reference format\" publishing Docker image", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 9, "created_at": "2021-07-15T01:02:07Z", "updated_at": "2021-10-19T08:10:26Z", "closed_at": "2021-07-15T19:47:25Z", "author_association": "OWNER", "pull_request": null, "body": "Error ocurred at the end of the publish flow for Datasette 0.58: https://github.com/simonw/datasette/runs/3072216421\r\n```\r\nRemoving intermediate container cf32b9440907\r\n ---> dfd6985b2afc\r\nSuccessfully built dfd6985b2afc\r\nSuccessfully tagged ***/datasette:0.58\r\ninvalid reference format\r\nError: Process completed with exit code 1.\r\n```", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1396/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 655465863, "node_id": "MDU6SXNzdWU2NTU0NjU4NjM=", "number": 892, "title": "\"latest\" in new documentation navbar is invisible", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 2, "created_at": "2020-07-12T19:57:21Z", "updated_at": "2020-07-12T20:02:35Z", "closed_at": "2020-07-12T20:02:17Z", "author_association": "OWNER", "pull_request": null, "body": "On https://datasette.readthedocs.io/en/latest/\r\n\r\n\"Datasette_\u2014_Datasette_documentation\"\r\n\r\nCompare with https://datasette.readthedocs.io/en/0.45/\r\n\r\n\"Datasette_\u2014_Datasette_documentation\"\r\n\r\nSome custom CSS should fix it.\r\n", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/892/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 1485488236, "node_id": "PR_kwDOBm6k_c5E1iJG", "number": 1938, "title": 
"\"permissions\" blocks in metadata.json/yaml", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": {"value": 8711695, "label": " Datasette 1.0a2"}, "comments": 3, "created_at": "2022-12-08T22:07:36Z", "updated_at": "2022-12-13T05:23:18Z", "closed_at": "2022-12-13T05:23:18Z", "author_association": "OWNER", "pull_request": "simonw/datasette/pulls/1938", "body": "Refs #1636\r\n\r\n- [x] Documentation\r\n- [ ] Implementation\r\n- [ ] Validate metadata to check there are no nonsensical permissions (like `debug-menu` set at the table level)\r\n- [ ] Tests\r\n\r\n\r\n----\r\n:books: Documentation preview :books:: https://datasette--1938.org.readthedocs.build/en/1938/\r\n\r\n", "repo": {"value": 107914493, "label": "datasette"}, "type": "pull", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1938/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": 0, "state_reason": null} {"id": 1138008042, "node_id": "I_kwDOBm6k_c5D1J_q", "number": 1636, "title": "\"permissions\" propery in metadata for configuring arbitrary permissions", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": {"value": 8711695, "label": " Datasette 1.0a2"}, "comments": 14, "created_at": "2022-02-15T00:25:59Z", "updated_at": "2022-12-13T02:40:50Z", "closed_at": "2022-12-13T02:40:50Z", "author_association": "OWNER", "pull_request": null, "body": "The `\"allow\"` block mechanism can already be used to configure various default permissions. When adding permissions to `datasette-tiddlywiki` I realized it would be good to be able to configure arbitrary permissions such as `edit-tiddlywiki` there too.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1636/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 940077168, "node_id": "MDU6SXNzdWU5NDAwNzcxNjg=", "number": 1389, "title": "\"searchmode\": \"raw\" in table metadata", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 6, "created_at": "2021-07-08T17:32:10Z", "updated_at": "2021-07-10T18:33:13Z", "closed_at": "2021-07-10T18:33:13Z", "author_association": "OWNER", "pull_request": null, "body": "> http://localhost:8001/index/summary?_search=language%3Aeng&_sort=title&_searchmode=raw\r\n>\r\n> But I'm not able to manage it in the metadata file. 
Here is mine (note that the sort column is taken into account)\r\n> Here it is:\r\n>\r\n> ```\r\n> {\r\n> \"databases\": {\r\n> \"index\": {\r\n> \"tables\": {\r\n> \"summary\": {\r\n> \"sort\": \"title\",\r\n> \"searchmode\": \"raw\"\r\n> }\r\n> }\r\n> }\r\n> }\r\n> }\r\n\r\n_Originally posted by @Krazybug in https://github.com/simonw/datasette/issues/759#issuecomment-624860451_", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1389/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 841377702, "node_id": "MDU6SXNzdWU4NDEzNzc3MDI=", "number": 251, "title": "\"sqlite-utils convert\" command to replace the separate \"sqlite-transform\" tool", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 15, "created_at": "2021-03-25T22:36:36Z", "updated_at": "2021-08-02T22:39:46Z", "closed_at": "2021-08-02T04:47:40Z", "author_association": "OWNER", "pull_request": null, "body": "See https://github.com/simonw/sqlite-transform/issues/11 - I built a separate `sqlite-transform` tool a while ago that uses the word \"transform\" to means something entirely different from `sqlite-utils transform` - I'd like to resolve this by merging the two tools.", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/251/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 403624090, "node_id": "MDU6SXNzdWU0MDM2MjQwOTA=", "number": 6, "title": "\"sqlite-utils insert\" should support newline-delimited JSON", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2019-01-28T02:00:02Z", "updated_at": "2019-01-28T02:17:45Z", "closed_at": "2019-01-28T02:17:45Z", "author_association": "OWNER", "pull_request": null, "body": "We can already export newline delimited JSON. 
We should learn to import it as well.\r\n\r\nThe neat thing about importing it is that you can import GBs of data without having to read the whole lot into memory in order to decode the wrapping JSON array.\r\n\r\nDatasette can export it now: https://github.com/simonw/datasette/issues/405\r\n\r\nDemo: https://latest.datasette.io/fixtures/facetable.json?_shape=array&_nl=on\r\n\r\nIt should be possible to do this:\r\n\r\n $ curl \"https://latest.datasette.io/fixtures/facetable.json?_shape=array&_nl=on\" \\\r\n | sqlite-utils insert data.db facetable - --nl\r\n", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/6/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 777560474, "node_id": "MDU6SXNzdWU3Nzc1NjA0NzQ=", "number": 218, "title": "\"sqlite-utils triggers\" command", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2021-01-03T02:34:50Z", "updated_at": "2021-01-03T03:49:51Z", "closed_at": "2021-01-03T03:03:35Z", "author_association": "OWNER", "pull_request": null, "body": "A command to list the triggers in the database.\r\n\r\n sqlite-utils triggers my.db\r\n\r\nCan optionally take one or more tables:\r\n\r\n sqlite-utils triggers my.db table1 table2", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/218/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 610853576, "node_id": "MDU6SXNzdWU2MTA4NTM1NzY=", "number": 105, "title": "\"sqlite-utils views\" command", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2020-05-01T16:56:11Z", "updated_at": "2020-05-01T20:40:07Z", "closed_at": "2020-05-01T20:38:36Z", "author_association": "OWNER", "pull_request": null, "body": "Similar to `sqlite-utils tables`. 
See also #104.", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/105/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 602176870, "node_id": "MDU6SXNzdWU2MDIxNzY4NzA=", "number": 43, "title": "\"twitter-to-sqlite lists\" command for retrieving a user's owned lists", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2020-04-17T19:08:59Z", "updated_at": "2020-04-17T23:48:28Z", "closed_at": "2020-04-17T23:30:39Z", "author_association": "MEMBER", "pull_request": null, "body": "https://developer.twitter.com/en/docs/accounts-and-users/create-manage-lists/api-reference/get-lists-ownerships\r\n\r\n`https://api.twitter.com/1.1/lists/ownerships.json `", "repo": {"value": 206156866, "label": "twitter-to-sqlite"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/43/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 488833698, "node_id": "MDU6SXNzdWU0ODg4MzM2OTg=", "number": 2, "title": "\"twitter-to-sqlite user-timeline\" command for pulling tweets by a specific user", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2019-09-03T21:29:12Z", "updated_at": "2019-09-04T20:02:11Z", "closed_at": "2019-09-04T20:02:11Z", "author_association": "MEMBER", "pull_request": null, "body": "Twitter only allows up to 3,200 tweets to be retrieved from https://developer.twitter.com/en/docs/tweets/timelines/api-reference/get-statuses-user_timeline.html\r\n\r\nI'm going to do:\r\n\r\n $ twitter-to-sqlite tweets simonw\r\n\r\n", "repo": {"value": 206156866, "label": "twitter-to-sqlite"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/2/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 706486323, "node_id": "MDU6SXNzdWU3MDY0ODYzMjM=", "number": 973, "title": "'bool' object is not callable error", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": {"value": 5971510, "label": "Datasette 0.50"}, "comments": 2, "created_at": "2020-09-22T15:30:54Z", "updated_at": "2020-10-08T23:54:32Z", "closed_at": "2020-09-22T15:40:35Z", "author_association": "OWNER", "pull_request": null, "body": "I'm getting this when latest is deployed to Cloud Run:\r\n```\r\nTraceback (most recent call last):\r\n File \"/usr/local/bin/datasette\", line 8, in \r\n sys.exit(cli())\r\n File \"/usr/local/lib/python3.8/site-packages/click/core.py\", line 829, in __call__\r\n return self.main(*args, **kwargs)\r\n File \"/usr/local/lib/python3.8/site-packages/click/core.py\", line 782, in main\r\n rv = self.invoke(ctx)\r\n File \"/usr/local/lib/python3.8/site-packages/click/core.py\", line 1259, 
in invoke\r\n return _process_result(sub_ctx.command.invoke(sub_ctx))\r\n File \"/usr/local/lib/python3.8/site-packages/click/core.py\", line 1066, in invoke\r\n return ctx.invoke(self.callback, **ctx.params)\r\n File \"/usr/local/lib/python3.8/site-packages/click/core.py\", line 610, in invoke\r\n return callback(*args, **kwargs)\r\n File \"/usr/local/lib/python3.8/site-packages/datasette/cli.py\", line 406, in serve\r\n inspect_data = json.load(open(inspect_file))\r\nTypeError: 'bool' object is not callable\r\n```\r\nI think I may have broken things in #970 - a980199e61fe7ccf02c2123849d86172d2ae54ff", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/973/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 677265716, "node_id": "MDExOlB1bGxSZXF1ZXN0NDY2NDEwNzU1", "number": 927, "title": "'datasette --get' option, refs #926", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 5, "created_at": "2020-08-11T23:31:52Z", "updated_at": "2020-08-12T00:24:42Z", "closed_at": "2020-08-12T00:24:41Z", "author_association": "OWNER", "pull_request": "simonw/datasette/pulls/927", "body": "Refs #926, #898", "repo": {"value": 107914493, "label": "datasette"}, "type": "pull", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/927/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": 0, "state_reason": null} {"id": 1303169663, "node_id": "I_kwDOCGYnMM5NrMp_", "number": 453, "title": "'unclosed file' warning when using insert_upsert_implementation from Python", "user": {"value": 311257, "label": "makkus"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2022-07-13T09:34:35Z", "updated_at": "2022-07-15T21:52:25Z", "closed_at": "2022-07-15T21:52:21Z", "author_association": "NONE", "pull_request": null, "body": "I'm using the `[insert_upsert_implementation](https://github.com/simonw/sqlite-utils/blob/main/sqlite_utils/cli.py)` function directly in my Python code to import a csv file with all the bells and whistles `sqlite-utils` provides, but I'm getting a resource warning that a io.TextWrapper object is not closed.\r\n\r\nThe warning goes away when wrapping the code from [this line](https://github.com/simonw/sqlite-utils/blob/42440d6345c242ee39778045e29143fb550bd2c2/sqlite_utils/cli.py#L924) in a try/finally block like:\r\n\r\n```\r\ntry:\r\n ...\r\n ...\r\nfinally:\r\n decoded.close()\r\n```\r\n(might be that `sniff_buffer` must also be closed if non null, but I might be wrong)\r\n\r\nI suspect Python closes the reference automatically when the sqlite-utils cli run is done, but since my code doesn't exit, I'm getting the warning.\r\n\r\nAlternatively, it'd be cool if the 'import csv/tsv' functionality could be added directly to the Database class.", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/453/reactions\", \"total_count\": 0, 
\"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 811680502, "node_id": "MDU6SXNzdWU4MTE2ODA1MDI=", "number": 236, "title": "--attach command line option for attaching extra databases", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2021-02-19T04:38:30Z", "updated_at": "2021-02-19T05:10:41Z", "closed_at": "2021-02-19T05:08:43Z", "author_association": "OWNER", "pull_request": null, "body": "This will enable cross-database joins, as seen in https://github.com/simonw/datasette/issues/283\r\n\r\nAlso refs #113", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/236/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 730797787, "node_id": "MDU6SXNzdWU3MzA3OTc3ODc=", "number": 1057, "title": "--cors should enable /fixtures.db CORS access", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": {"value": 6026070, "label": "0.51"}, "comments": 1, "created_at": "2020-10-27T20:38:34Z", "updated_at": "2020-10-27T20:52:05Z", "closed_at": "2020-10-27T20:51:09Z", "author_association": "OWNER", "pull_request": null, "body": "So Datasette can work with `SQL.js` as seen in https://observablehq.com/@mbostock/sqlite", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1057/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 811407131, "node_id": "MDExOlB1bGxSZXF1ZXN0NTc1OTQwMTkz", "number": 1232, "title": "--crossdb option for joining across databases", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 8, "created_at": "2021-02-18T19:48:50Z", "updated_at": "2021-02-18T22:09:13Z", "closed_at": "2021-02-18T22:09:12Z", "author_association": "OWNER", "pull_request": "simonw/datasette/pulls/1232", "body": "Refs #283. 
Still needs:\r\n\r\n- [x] Unit test for --crossdb queries\r\n- [x] Show warning on console if it truncates at ten databases (or on web interface)\r\n- [x] Show connected databases on the `/_memory` database page\r\n- [x] Documentation\r\n- [x] https://latest.datasette.io/ demo should demonstrate this feature", "repo": {"value": 107914493, "label": "datasette"}, "type": "pull", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1232/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": 0, "state_reason": null} {"id": 788527932, "node_id": "MDU6SXNzdWU3ODg1Mjc5MzI=", "number": 223, "title": "--delimiter option for CSV import", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 2, "created_at": "2021-01-18T20:25:03Z", "updated_at": "2021-02-06T01:39:47Z", "closed_at": "2021-02-06T01:34:54Z", "author_association": "OWNER", "pull_request": null, "body": "https://bruxellesdata.opendatasoft.com/explore/dataset/dog-toilets/export/?location=12,50.85802,4.38054 says:\r\n\r\n> CSV uses semicolon (;) as a separator.\r\n\r\nWould be useful to be able to do this:\r\n\r\n sqlite-utils insert places.db places places.csv --delimiter ';'\r\n\r\n`--delimiter` could imply `--csv`", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/223/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 679637501, "node_id": "MDU6SXNzdWU2Nzk2Mzc1MDE=", "number": 934, "title": "--get doesn't fully invoke the startup routine", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 0, "created_at": "2020-08-15T20:30:25Z", "updated_at": "2020-08-15T20:53:49Z", "closed_at": "2020-08-15T20:53:49Z", "author_association": "OWNER", "pull_request": null, "body": "https://github.com/simonw/datasette/blob/7702ea602188899ee9b0446a874a6a9b546b564d/datasette/cli.py#L417-L433\r\n\r\nSpotted this working on https://github.com/simonw/latest-datasette-with-all-plugins/issues/3 - I'd like to be able to use `datasette --get /` as a sanity checking test, but that doesn't work if the init hooks aren't fully executed.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/934/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 323830051, "node_id": "MDU6SXNzdWUzMjM4MzAwNTE=", "number": 270, "title": "--limit= CLI option for setting limits", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2018-05-17T00:14:24Z", "updated_at": "2018-05-18T06:19:31Z", "closed_at": "2018-05-18T06:16:39Z", "author_association": "OWNER", "pull_request": null, "body": "#264 calls for four new datasette limit options, on top of the two existing 
ones:\r\n\r\n* `--max_returned_rows`\r\n* `--sql_time_limit_ms`\r\n\r\nThese are already clogging up `datasette serve --help` a bit.\r\n\r\nHow about this syntax instead?\r\n\r\n datasette --limit max_returned_rows:100 \\\r\n --limit facet_timeout_ms:500 demo.db\r\n\r\nThen we can add as many new user over-rideable limits as we like without clogging up `--help` too much - though it would be good to have a way of optionally listings their documentation as well.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/270/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 1094890366, "node_id": "PR_kwDOCGYnMM4wlm3B", "number": 361, "title": "--lines and --text and --convert and --import", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 15, "created_at": "2022-01-06T01:49:44Z", "updated_at": "2022-01-06T06:37:03Z", "closed_at": "2022-01-06T06:24:54Z", "author_association": "OWNER", "pull_request": "simonw/sqlite-utils/pulls/361", "body": "Refs #356\r\n\r\nStill TODO:\r\n\r\n- [x] Get `--lines` working, with tests\r\n- [x] Get `--text` working, with tests\r\n- [x] Get regular JSON import working with `--convert` with tests\r\n- [x] Get `--lines` working with `--convert` with tests\r\n- [x] Get `--text` working with `--convert` with tests\r\n- [x] Get `--csv` and `--tsv` import working with `--convert` with tests\r\n- [x] Get `--nl` working with `--convert` with tests\r\n- [x] Documentation for all of the above", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "pull", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/361/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": 0, "state_reason": null} {"id": 683830416, "node_id": "MDU6SXNzdWU2ODM4MzA0MTY=", "number": 137, "title": "--load-extension for other sqlite-utils commands", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2020-08-21T21:12:56Z", "updated_at": "2020-10-16T19:14:32Z", "closed_at": "2020-10-16T19:14:32Z", "author_association": "OWNER", "pull_request": null, "body": "e.g. 
for this:\r\n```\r\ncalands-datasette % sqlite-utils tables calands.db --counts\r\n[{\"table\": \"spatial_ref_sys\", \"count\": 4924},\r\n {\"table\": \"spatialite_history\", \"count\": 14},\r\n {\"table\": \"sqlite_sequence\", \"count\": 1},\r\n {\"table\": \"geometry_columns\", \"count\": 2},\r\n {\"table\": \"spatial_ref_sys_aux\", \"count\": 4873},\r\n {\"table\": \"views_geometry_columns\", \"count\": 0},\r\n {\"table\": \"virts_geometry_columns\", \"count\": 0},\r\n {\"table\": \"geometry_columns_statistics\", \"count\": 2},\r\n {\"table\": \"views_geometry_columns_statistics\", \"count\": 0},\r\n {\"table\": \"virts_geometry_columns_statistics\", \"count\": 0},\r\n {\"table\": \"geometry_columns_field_infos\", \"count\": 0},\r\n {\"table\": \"views_geometry_columns_field_infos\", \"count\": 0},\r\n {\"table\": \"virts_geometry_columns_field_infos\", \"count\": 0},\r\n {\"table\": \"geometry_columns_time\", \"count\": 2},\r\n {\"table\": \"geometry_columns_auth\", \"count\": 2},\r\n {\"table\": \"views_geometry_columns_auth\", \"count\": 0},\r\n {\"table\": \"virts_geometry_columns_auth\", \"count\": 0},\r\nTraceback (most recent call last):\r\n File \"/usr/local/bin/sqlite-utils\", line 8, in \r\n sys.exit(cli())\r\n File \"/usr/local/Cellar/sqlite-utils/2.15.1/libexec/lib/python3.8/site-packages/click/core.py\", line 829, in __call__\r\n return self.main(*args, **kwargs)\r\n File \"/usr/local/Cellar/sqlite-utils/2.15.1/libexec/lib/python3.8/site-packages/click/core.py\", line 782, in main\r\n rv = self.invoke(ctx)\r\n File \"/usr/local/Cellar/sqlite-utils/2.15.1/libexec/lib/python3.8/site-packages/click/core.py\", line 1259, in invoke\r\n return _process_result(sub_ctx.command.invoke(sub_ctx))\r\n File \"/usr/local/Cellar/sqlite-utils/2.15.1/libexec/lib/python3.8/site-packages/click/core.py\", line 1066, in invoke\r\n return ctx.invoke(self.callback, **ctx.params)\r\n File \"/usr/local/Cellar/sqlite-utils/2.15.1/libexec/lib/python3.8/site-packages/click/core.py\", line 610, in invoke\r\n return callback(*args, **kwargs)\r\n File \"/usr/local/Cellar/sqlite-utils/2.15.1/libexec/lib/python3.8/site-packages/sqlite_utils/cli.py\", line 143, in tables\r\n for line in output_rows(_iter(), headers, nl, arrays, json_cols):\r\n File \"/usr/local/Cellar/sqlite-utils/2.15.1/libexec/lib/python3.8/site-packages/sqlite_utils/cli.py\", line 922, in output_rows\r\n for row, next_row in itertools.zip_longest(current_iter, next_iter):\r\n File \"/usr/local/Cellar/sqlite-utils/2.15.1/libexec/lib/python3.8/site-packages/sqlite_utils/cli.py\", line 123, in _iter\r\n row.append(db[name].count)\r\n File \"/usr/local/Cellar/sqlite-utils/2.15.1/libexec/lib/python3.8/site-packages/sqlite_utils/db.py\", line 458, in count\r\n return self.db.conn.execute(\r\nsqlite3.OperationalError: no such module: VirtualSpatialIndex\r\n```\r\nThe `tables` command could take `--load-extension` too - as could `rows` and other similar commands.\r\n\r\nFollow-on from #134 ", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/137/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 683804172, "node_id": "MDU6SXNzdWU2ODM4MDQxNzI=", "number": 134, "title": "--load-extension option for sqlite-utils query", "user": {"value": 9599, 
"label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 4, "created_at": "2020-08-21T20:12:42Z", "updated_at": "2020-08-21T21:06:26Z", "closed_at": "2020-08-21T20:54:19Z", "author_association": "OWNER", "pull_request": null, "body": "I got this error:\r\n```\r\n% sqlite-utils calands.db 'create table superunits_with_maps_view_concrete as select * from superunits_with_maps_view'\r\nTraceback (most recent call last):\r\n...\r\n cursor = db.conn.execute(sql, dict(param))\r\nsqlite3.OperationalError: no such function: AsGeoJSON\r\n```\r\nA `--load-extension=/usr/local/lib/mod_spatialite.dylib` option (imitating the same option for Datasette) would help.", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/134/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 752966476, "node_id": "MDU6SXNzdWU3NTI5NjY0NzY=", "number": 1114, "title": "--load-extension=spatialite not working with datasetteproject/datasette docker image", "user": {"value": 2182, "label": "danp"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 4, "created_at": "2020-11-29T17:35:20Z", "updated_at": "2022-01-20T21:29:42Z", "closed_at": "2020-11-29T17:37:45Z", "author_association": "CONTRIBUTOR", "pull_request": null, "body": "https://github.com/simonw/datasette/commit/6aa5886379dd9017215904fb28567b80018902f9 added the `--load-extension=spatialite` shortcut looking for the extension in these places:\r\n\r\nhttps://github.com/simonw/datasette/blob/12877d7a48e2aa28bb5e780f929a218f7265d849/datasette/utils/__init__.py#L56-L60\r\n\r\nHowever, in the datasetteproject/datasette docker image the file is at `/usr/local/lib/mod_spatialite.so`.\r\n\r\nThis results in the example command [here](https://docs.datasette.io/en/stable/installation.html#loading-spatialite) failing:\r\n\r\n```\r\n% docker run --rm -p 8001:8001 -v `pwd`:/mnt datasetteproject/datasette datasette -p 8001 -h 0.0.0.0 /mnt/data.db --load-extension=spatialite\r\nError: Could not find SpatiaLite extension\r\n```\r\n\r\nBut it does work when given an explicit path:\r\n\r\n```\r\n% docker run --rm -p 8001:8001 -v `pwd`:/mnt datasetteproject/datasette datasette -p 8001 -h 0.0.0.0 /mnt/data.db --load-extension=/usr/local/lib/mod_spatialite.so\r\nINFO: Started server process [1]\r\nINFO: Waiting for application startup.\r\nINFO: Application startup complete.\r\nINFO: Uvicorn running on http://0.0.0.0:8001 (Press CTRL+C to quit)\r\n...\r\n```\r\n\r\nPerhaps `SPATIALITE_PATHS` should include `/usr/local/lib/mod_spatialite.so`?", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1114/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 723803777, "node_id": "MDU6SXNzdWU3MjM4MDM3Nzc=", "number": 1028, "title": "--load-extension=spatialite shortcut", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": {"value": 6026070, "label": "0.51"}, "comments": 1, 
"created_at": "2020-10-17T17:02:08Z", "updated_at": "2022-01-20T21:29:41Z", "closed_at": "2020-10-19T22:37:55Z", "author_association": "OWNER", "pull_request": null, "body": "I added this to `sqlite-utils` in https://github.com/simonw/sqlite-utils/issues/136 and I really like it: pass a special value of `spatialite` and Datasette should attempt to load it from known likely installation locations.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1028/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 683812642, "node_id": "MDU6SXNzdWU2ODM4MTI2NDI=", "number": 136, "title": "--load-extension=spatialite shortcut option", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2020-08-21T20:31:25Z", "updated_at": "2022-02-05T00:04:26Z", "closed_at": "2020-10-16T19:14:32Z", "author_association": "OWNER", "pull_request": null, "body": "In conjunction with #135 - this would do the same thing as `--load-extension=path-to-spatialite` (see #134)", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/136/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 592844348, "node_id": "MDExOlB1bGxSZXF1ZXN0Mzk3NzQ5NjUz", "number": 714, "title": "--metadata accepts YAML as well as JSON", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2020-04-02T18:36:02Z", "updated_at": "2020-04-02T19:30:54Z", "closed_at": "2020-04-02T19:30:54Z", "author_association": "OWNER", "pull_request": "simonw/datasette/pulls/714", "body": "Refs #713. 
Still needs tests and documentation.", "repo": {"value": 107914493, "label": "datasette"}, "type": "pull", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/714/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": 0, "state_reason": null} {"id": 807437089, "node_id": "MDU6SXNzdWU4MDc0MzcwODk=", "number": 228, "title": "--no-headers option for CSV and TSV", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 10, "created_at": "2021-02-12T17:56:51Z", "updated_at": "2021-12-26T07:01:31Z", "closed_at": "2021-02-14T22:25:17Z", "author_association": "OWNER", "pull_request": null, "body": "https://bl.iro.bl.uk/work/ns/3037474a-761c-456d-a00c-9ef3c6773f4c has a fascinating CSV file that doesn't have a header row - it starts like this:\r\n\r\n```csv\r\nComputation and measurement of turbulent flow through idealized turbine blade passages,,\"Loizou, Panos A.\",https://isni.org/isni/0000000136122593,,University of Manchester,https://isni.org/isni/0000000121662407,1989,Thesis (Ph.D.),,Physical Sciences,,,https://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.232781,\r\n\"Prolactin and growth hormone secretion in normal, hyperprolactinaemic and acromegalic man\",,\"Prescott, R. W. G.\",https://isni.org/isni/0000000134992122,,University of Newcastle upon Tyne,https://isni.org/isni/0000000104627212,1983,Thesis (Ph.D.),,Biological Sciences,,,https://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.232784,\r\n```\r\n\r\nIt would be useful if `sqlite-utils insert ... --csv` had a mechanism for importing files like this one.", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/228/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 464894812, "node_id": "MDExOlB1bGxSZXF1ZXN0Mjk1MDY1Nzk2", "number": 544, "title": "--plugin-secret option", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": {"value": 4471010, "label": "Datasette 0.29"}, "comments": 1, "created_at": "2019-07-06T22:18:20Z", "updated_at": "2019-07-08T02:06:31Z", "closed_at": "2019-07-08T02:06:31Z", "author_association": "OWNER", "pull_request": "simonw/datasette/pulls/544", "body": "Refs #543\r\n\r\n- [x] Zeit Now v1 support\r\n- [x] Solve escaping of ENV in Dockerfile\r\n- [x] Heroku support\r\n- [x] Unit tests\r\n- [x] Cloud Run support\r\n- [x] Documentation\r\n", "repo": {"value": 107914493, "label": "datasette"}, "type": "pull", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/544/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": 0, "state_reason": null} {"id": 598013965, "node_id": "MDU6SXNzdWU1OTgwMTM5NjU=", "number": 724, "title": "--plugin-secret over-rides existing metadata.json plugin config", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": 
"2020-04-10T17:56:30Z", "updated_at": "2020-04-16T04:58:12Z", "closed_at": "2020-04-10T18:34:21Z", "author_association": "OWNER", "pull_request": null, "body": "This means if you use `--plugin-secret` at all (with e.g. `publish cloudrun`) any existing plugin configuration in your `metadata.json` will be ignored.\r\n\r\nhttps://github.com/simonw/datasette/blob/af9cd4ca64652fae262e6f7b5d201f6e0adc989b/datasette/publish/cloudrun.py#L98-L109\r\n\r\n", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/724/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 808843401, "node_id": "MDU6SXNzdWU4MDg4NDM0MDE=", "number": 1226, "title": "--port option should validate port is between 0 and 65535", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 4, "created_at": "2021-02-15T22:01:33Z", "updated_at": "2021-02-18T18:41:27Z", "closed_at": "2021-02-18T18:41:27Z", "author_association": "OWNER", "pull_request": null, "body": "Currently throws an ugly error message:\r\n```\r\n(datasette-graphql) datasette-graphql % datasette fivethirtyeight.db -p 80094\r\nINFO: Started server process [45497]\r\nINFO: Waiting for application startup.\r\nINFO: Application startup complete.\r\nTraceback (most recent call last):\r\n File \"/Users/simon/.local/share/virtualenvs/datasette-graphql-n1OSJCS8/bin/datasette\", line 8, in \r\n sys.exit(cli())\r\n...\r\n server = await loop.create_server(\r\n File \"/Users/simon/.pyenv/versions/3.8.2/lib/python3.8/asyncio/base_events.py\", line 1461, in create_server\r\n sock.bind(sa)\r\nOverflowError: bind(): port must be 0-65535.\r\n```", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1226/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 555832585, "node_id": "MDU6SXNzdWU1NTU4MzI1ODU=", "number": 661, "title": "--port option to expose a port other than 8001 in \"datasette package\"", "user": {"value": 134771, "label": "dvhthomas"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2020-01-27T21:05:56Z", "updated_at": "2020-01-30T04:17:52Z", "closed_at": "2020-01-29T22:46:45Z", "author_association": "NONE", "pull_request": null, "body": "I see how to alter the port using `datasette serve -p XXX` per the docs. However, I'm packaging up to server the container on AppEngine flexible, which [requires](https://cloud.google.com/appengine/docs/flexible/custom-runtimes/build#listening_to_port_8080) that the container is serving traffic on port 8080.\r\n\r\nhttps://github.com/simonw/datasette/blob/7950105c278b140e6cb665c68b59df219870f9bc/Dockerfile#L41\r\n\r\nIs there a way to inject a non-default port into the Dockerfile, or should I just do something like `sed` to replace 8001 with 8080 after `dataset package` has done it's thing? 
Thanks for the advice.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/661/reactions\", \"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 665701216, "node_id": "MDU6SXNzdWU2NjU3MDEyMTY=", "number": 123, "title": "--raw option for outputting binary content", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 0, "created_at": "2020-07-26T03:35:39Z", "updated_at": "2020-07-26T16:44:11Z", "closed_at": "2020-07-26T16:44:11Z", "author_association": "OWNER", "pull_request": null, "body": "Related to the `insert-files` work in #122 - it should be easy to get binary data back out of the database again.\r\n\r\nOne way to do that could be:\r\n\r\n sqlite-utils files.db \"select content from files where key = 'foo.jpg'\" --raw\r\n\r\nThe `--raw` option would cause just the contents of the first column to be output directly to stdout.", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/123/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 758944006, "node_id": "MDU6SXNzdWU3NTg5NDQwMDY=", "number": 57, "title": "--readme throws 404 error if README does not exist in repo", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 0, "created_at": "2020-12-07T23:58:49Z", "updated_at": "2020-12-16T18:17:54Z", "closed_at": "2020-12-16T18:17:54Z", "author_association": "MEMBER", "pull_request": null, "body": "It should fail silently (populate the column with a null) instead.", "repo": {"value": 207052882, "label": "github-to-sqlite"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/dogsheep/github-to-sqlite/issues/57/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 449931899, "node_id": "MDU6SXNzdWU0NDk5MzE4OTk=", "number": 494, "title": "--reload should only trigger for -i databases", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": {"value": 9599, "label": "simonw"}, "milestone": null, "comments": 1, "created_at": "2019-05-29T17:28:43Z", "updated_at": "2020-02-24T19:45:05Z", "closed_at": "2020-02-24T19:45:05Z", "author_association": "OWNER", "pull_request": null, "body": "Right now it's triggering any time a mutable database changes.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/494/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 487598468, "node_id": "MDU6SXNzdWU0ODc1OTg0Njg=", "number": 2, 
"title": "--save option to dump checkins to a JSON file on disk", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2019-08-30T17:41:06Z", "updated_at": "2019-08-31T02:40:21Z", "closed_at": "2019-08-31T02:40:21Z", "author_association": "MEMBER", "pull_request": null, "body": "This is a complement to the `--load` option - mainly useful for development purposes.\r\n\r\n(I'll rename `--file` to `--load` as part of this issue).", "repo": {"value": 205429375, "label": "swarm-to-sqlite"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/dogsheep/swarm-to-sqlite/issues/2/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 610853393, "node_id": "MDU6SXNzdWU2MTA4NTMzOTM=", "number": 104, "title": "--schema option to \"sqlite-utils tables\"", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 0, "created_at": "2020-05-01T16:55:49Z", "updated_at": "2020-05-01T17:12:37Z", "closed_at": "2020-05-01T17:12:37Z", "author_association": "OWNER", "pull_request": null, "body": "Adds output showing the table schema.", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/104/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 590666760, "node_id": "MDU6SXNzdWU1OTA2NjY3NjA=", "number": 39, "title": "--since feature can be confused by retweets", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 11, "created_at": "2020-03-30T23:25:33Z", "updated_at": "2020-04-01T03:45:16Z", "closed_at": "2020-04-01T03:45:16Z", "author_association": "MEMBER", "pull_request": null, "body": "If you run `twitter-to-sqlite user-timeline ... --since` it's supposed to fetch Tweets those specific users tweeted since last time the command was run.\r\n\r\nIt does this by seeking out the max ID of their previous tweets:\r\n\r\nhttps://github.com/dogsheep/twitter-to-sqlite/blob/810cb2af5a175837204389fd7f4b5721f8b325ab/twitter_to_sqlite/cli.py#L305-L311\r\n\r\nBUT... 
this has a nasty flaw: if another account had retweeted one of their recent tweets the retweeted-tweet will have been loaded into the database - so we may treat that as the most recent since ID and miss a bunch of their tweets!", "repo": {"value": 206156866, "label": "twitter-to-sqlite"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/39/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 771324837, "node_id": "MDU6SXNzdWU3NzEzMjQ4Mzc=", "number": 53, "title": "--since support for favorites", "user": {"value": 27, "label": "anotherjesse"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2020-12-19T07:08:23Z", "updated_at": "2020-12-19T07:47:11Z", "closed_at": "2020-12-19T07:47:11Z", "author_association": "NONE", "pull_request": null, "body": "Having support for `--since` for updating your favorites would be ideal as the api is both slow and it only returns ~3k most recent favorites.\r\n\r\nhttps://twittercommunity.com/t/cant-get-all-favorite-tweets-by-rest-api/22007/3\r\n\r\nThe api seems to take an optional `since_id` parameter - https://developer.twitter.com/en/docs/twitter-api/v1/tweets/post-and-engage/api-reference/get-favorites-list", "repo": {"value": 206156866, "label": "twitter-to-sqlite"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/53/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 506268945, "node_id": "MDU6SXNzdWU1MDYyNjg5NDU=", "number": 20, "title": "--since support for various commands for refresh-by-cron", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2019-10-13T03:40:46Z", "updated_at": "2019-10-21T03:32:04Z", "closed_at": "2019-10-16T19:26:11Z", "author_association": "MEMBER", "pull_request": null, "body": "I want to run a cron that updates my Twitter database every X minutes.\r\n\r\nIt should be able to retrieve the following without needing to paginate through everything:\r\n\r\n- [x] Tweets I have tweeted\r\n- [x] My home timeline (see #19)\r\n- [x] Tweets I have favourited\r\n\r\nIt would be nice if this could be standardized across all commands as a `--since` option.", "repo": {"value": 206156866, "label": "twitter-to-sqlite"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/20/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 808008305, "node_id": "MDU6SXNzdWU4MDgwMDgzMDU=", "number": 230, "title": "--sniff option for sniffing delimiters", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 8, "created_at": "2021-02-14T17:43:54Z", "updated_at": "2021-02-14T21:15:33Z", "closed_at": "2021-02-14T19:24:32Z", "author_association": "OWNER", "pull_request": null, 
"body": "> I just spotted that `csv.Sniffer` in the Python standard library has a `.has_header(sample)` method which detects if the first row appears to be a header or not, which is interesting. https://docs.python.org/3/library/csv.html#csv.Sniffer\r\n\r\n_Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/228#issuecomment-778812050_", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/230/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 318737808, "node_id": "MDU6SXNzdWUzMTg3Mzc4MDg=", "number": 243, "title": "--spatialite option for datasette publish commands", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 2, "created_at": "2018-04-29T18:19:32Z", "updated_at": "2018-05-31T14:17:53Z", "closed_at": "2018-05-31T14:17:53Z", "author_association": "OWNER", "pull_request": null, "body": "Performs the necessary incantations to install Spatialite on Zeit Now or Heroku and sets the corresponding environment variable to ensure the module is correctly loaded by datasette serve.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/243/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 490803176, "node_id": "MDU6SXNzdWU0OTA4MDMxNzY=", "number": 8, "title": "--sql and --attach options for feeding commands from SQL queries", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 4, "created_at": "2019-09-08T20:35:49Z", "updated_at": "2020-03-20T23:13:01Z", "closed_at": "2020-03-20T23:13:01Z", "author_association": "MEMBER", "pull_request": null, "body": "Say you want to fetch Twitter profiles for a list of accounts that are stored in another database:\r\n\r\n $ twitter-to-sqlite users-lookup users.db --attach attending.db \\\r\n --sql \"select Twitter from attending.attendes where Twitter is not null\"\r\n\r\nThe SQL query you feed in is expected to return a list of screen names suitable for processing further by the command.\r\n\r\nShould be supported by all three of:\r\n\r\n- [x] `twitter-to-sqlite users-lookup`\r\n- [x] `twitter-to-sqlite user-timeline`\r\n- [x] `twitter-to-sqlite followers` and `friends`\r\n\r\nThe `--attach` option allows other SQLite databases to be attached to the connection. 
Without it the SQL query will have to read from the single attached database.", "repo": {"value": 206156866, "label": "twitter-to-sqlite"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/8/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 806861312, "node_id": "MDExOlB1bGxSZXF1ZXN0NTcyMjA5MjQz", "number": 1222, "title": "--ssl-keyfile and --ssl-certfile, refs #1221", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 0, "created_at": "2021-02-12T00:45:58Z", "updated_at": "2021-02-12T00:52:18Z", "closed_at": "2021-02-12T00:52:17Z", "author_association": "OWNER", "pull_request": "simonw/datasette/pulls/1222", "body": "", "repo": {"value": 107914493, "label": "datasette"}, "type": "pull", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1222/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": 0, "state_reason": null} {"id": 735648209, "node_id": "MDU6SXNzdWU3MzU2NDgyMDk=", "number": 193, "title": "--tsv output format option", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": {"value": 6079500, "label": "3.0"}, "comments": 0, "created_at": "2020-11-03T21:31:18Z", "updated_at": "2020-11-07T00:09:52Z", "closed_at": "2020-11-07T00:09:52Z", "author_association": "OWNER", "pull_request": null, "body": "We already support `--csv` for output, and the `insert` command accepts `--tsv`. 
The output format options should accept `--tsv` too.", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/193/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 1345452427, "node_id": "I_kwDODLZ_YM5QMfmL", "number": 11, "title": "-a option is used for \"--auth\" and for \"--all\"", "user": {"value": 2467, "label": "fernand0"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2022-08-21T10:50:48Z", "updated_at": "2022-08-21T21:11:57Z", "closed_at": "2022-08-21T21:11:57Z", "author_association": "NONE", "pull_request": null, "body": "I'm not sure which option is best, instead of -a -all.", "repo": {"value": 213286752, "label": "pocket-to-sqlite"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/dogsheep/pocket-to-sqlite/issues/11/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 557077945, "node_id": "MDExOlB1bGxSZXF1ZXN0MzY4NzM0NTAw", "number": 663, "title": "-p argument for datasette package, plus tests - refs #661", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2020-01-29T19:47:50Z", "updated_at": "2020-01-29T22:46:43Z", "closed_at": "2020-01-29T22:46:43Z", "author_association": "OWNER", "pull_request": "simonw/datasette/pulls/663", "body": "", "repo": {"value": 107914493, "label": "datasette"}, "type": "pull", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/663/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": 0, "state_reason": null} {"id": 1870345352, "node_id": "PR_kwDOBm6k_c5Y90K9", "number": 2161, "title": "-s/--setting x y gets merged into datasette.yml, refs #2143, #2156", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2023-08-28T19:30:42Z", "updated_at": "2023-08-28T20:06:15Z", "closed_at": "2023-08-28T20:06:14Z", "author_association": "OWNER", "pull_request": "simonw/datasette/pulls/2161", "body": "This change updates the `-s/--setting` option to `datasette serve` to allow it to be used to set arbitrarily complex nested settings in a way that is compatible with the new `-c datasette.yml` work happening in:\r\n- #2143\r\n\r\nIt will enable things like this:\r\n```\r\ndatasette data.db --setting plugins.datasette-ripgrep.path \"/home/simon/code\"\r\n```\r\nFor the moment though it just affects [settings](https://docs.datasette.io/en/1.0a4/settings.html) - so you can do this:\r\n```\r\ndatasette data.db --setting settings.sql_time_limit_ms 3500\r\n```\r\nI've also implemented a backwards compatibility mechanism, so if you use it this way (the old way):\r\n```\r\ndatasette data.db --setting sql_time_limit_ms 3500\r\n```\r\nIt will notice that the setting you passed is one of Datasette's core settings, and will treat that 
as if you said `settings.sql_time_limit_ms` instead.\r\n\r\n\r\n----\n:books: Documentation preview :books:: https://datasette--2161.org.readthedocs.build/en/2161/\n\r\n", "repo": {"value": 107914493, "label": "datasette"}, "type": "pull", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/2161/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": 0, "state_reason": null} {"id": 470691999, "node_id": "MDU6SXNzdWU0NzA2OTE5OTk=", "number": 43, "title": ".add_column() doesn't match indentation of initial creation", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2019-07-20T16:33:10Z", "updated_at": "2019-07-23T13:09:11Z", "closed_at": "2019-07-23T13:09:05Z", "author_association": "OWNER", "pull_request": null, "body": "I spotted a table which was created once and then had columns added to it and the formatted SQL looks like this:\r\n\r\n```sql\r\nCREATE TABLE [records] (\r\n [type] TEXT,\r\n [sourceName] TEXT,\r\n [sourceVersion] TEXT,\r\n [unit] TEXT,\r\n [creationDate] TEXT,\r\n [startDate] TEXT,\r\n [endDate] TEXT,\r\n [value] TEXT,\r\n [metadata_Health Mate App Version] TEXT,\r\n [metadata_Withings User Identifier] TEXT,\r\n [metadata_Modified Date] TEXT,\r\n [metadata_Withings Link] TEXT,\r\n [metadata_HKWasUserEntered] TEXT\r\n, [device] TEXT, [metadata_HKMetadataKeyHeartRateMotionContext] TEXT, [metadata_HKDeviceManufacturerName] TEXT, [metadata_HKMetadataKeySyncVersion] TEXT, [metadata_HKMetadataKeySyncIdentifier] TEXT, [metadata_HKSwimmingStrokeStyle] TEXT, [metadata_HKVO2MaxTestType] TEXT, [metadata_HKTimeZone] TEXT, [metadata_Average HR] TEXT, [metadata_Recharge] TEXT, [metadata_Lights] TEXT, [metadata_Asleep] TEXT, [metadata_Rating] TEXT, [metadata_Energy Threshold] TEXT, [metadata_Deep Sleep] TEXT, [metadata_Nap] TEXT, [metadata_Edit Slots] TEXT, [metadata_Tags] TEXT, [metadata_Daytime HR] TEXT)\r\n```\r\n\r\nIt would be nice if the columns that were added later matched the indentation of the initial columns.", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/43/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 816523763, "node_id": "MDU6SXNzdWU4MTY1MjM3NjM=", "number": 238, "title": ".add_foreign_key() corrupts database if column contains a space", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2021-02-25T15:07:20Z", "updated_at": "2021-02-25T16:54:02Z", "closed_at": "2021-02-25T16:54:02Z", "author_association": "OWNER", "pull_request": null, "body": "I ran this:\r\n\r\n db[\"Reports\"].add_foreign_key(\"Reported by ID\", \"Reporters\", \"id\")\r\n\r\nAnd got this:\r\n\r\n```\r\n~/jupyter-venv/lib/python3.9/site-packages/sqlite_utils/db.py in add_foreign_keys(self, foreign_keys)\r\n 616 # Have to VACUUM outside the transaction to ensure .foreign_keys property\r\n 617 # can see the newly created foreign key.\r\n--> 618 self.vacuum()\r\n 619 \r\n 620 def 
index_foreign_keys(self):\r\n\r\n~/jupyter-venv/lib/python3.9/site-packages/sqlite_utils/db.py in vacuum(self)\r\n 629 \r\n 630 def vacuum(self):\r\n--> 631 self.execute(\"VACUUM;\")\r\n 632 \r\n 633 \r\n\r\n~/jupyter-venv/lib/python3.9/site-packages/sqlite_utils/db.py in execute(self, sql, parameters)\r\n 234 return self.conn.execute(sql, parameters)\r\n 235 else:\r\n--> 236 return self.conn.execute(sql)\r\n 237 \r\n 238 def executescript(self, sql):\r\n\r\nDatabaseError: database disk image is malformed\r\n```", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/238/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 783910901, "node_id": "MDU6SXNzdWU3ODM5MTA5MDE=", "number": 221, "title": ".add_missing_columns() does not take case insensitivity into account", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 0, "created_at": "2021-01-12T05:01:00Z", "updated_at": "2021-01-12T23:17:33Z", "closed_at": "2021-01-12T23:17:33Z", "author_association": "OWNER", "pull_request": null, "body": "SQLite columns are case insensitive - but the `.add_missing_columns()` method doesn't know that. This means that it can crash if it identifies a column that is a case-insensitive duplicate of an existing column. https://github.com/simonw/sqlite-utils/blob/4cc82fd0bccc9d2eeb3510beb4e691d7da099f84/sqlite_utils/db.py#L1974-L1980", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/221/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 732634375, "node_id": "MDExOlB1bGxSZXF1ZXN0NTEyNTQ1MzY0", "number": 1061, "title": ".blob output renderer", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": {"value": 6026070, "label": "0.51"}, "comments": 4, "created_at": "2020-10-29T20:25:08Z", "updated_at": "2020-10-29T22:01:40Z", "closed_at": "2020-10-29T22:01:39Z", "author_association": "OWNER", "pull_request": "simonw/datasette/pulls/1061", "body": "- [x] Remove the `/-/...blob/...` route I added in #1040 in place of the new `.blob` renderer URLs\r\n- [x] Link to new `.blob` download links on the arbitrary query page (using `_blob_hash=...`) - plus tests for this\r\n\r\nCloses #1050, Closes #1051", "repo": {"value": 107914493, "label": "datasette"}, "type": "pull", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1061/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": 0, "state_reason": null} {"id": 581339961, "node_id": "MDU6SXNzdWU1ODEzMzk5NjE=", "number": 92, "title": ".columns_dict doesn't work for all possible column types", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 7, "created_at": "2020-03-14T19:30:35Z", 
"updated_at": "2020-03-15T18:37:43Z", "closed_at": "2020-03-14T20:04:14Z", "author_association": "OWNER", "pull_request": null, "body": "Got this error:\r\n```\r\n File \".../python3.7/site-packages/sqlite_utils/db.py\", line 462, in \r\n for column in self.columns\r\nKeyError: 'REAL'\r\n```\r\n`.columns_dict` uses `REVERSE_COLUMN_TYPE_MAPPING`:\r\nhttps://github.com/simonw/sqlite-utils/blob/43f1c6ab4e3a6b76531fb6f5447adb83d26f3971/sqlite_utils/db.py#L457-L463\r\n`REVERSE_COLUMN_TYPE_MAPPING` defines `FLOAT` not `REAL`A\r\nhttps://github.com/simonw/sqlite-utils/blob/43f1c6ab4e3a6b76531fb6f5447adb83d26f3971/sqlite_utils/db.py#L68-L74", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/92/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 732685643, "node_id": "MDU6SXNzdWU3MzI2ODU2NDM=", "number": 1063, "title": ".csv should link to .blob downloads", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": {"value": 6026070, "label": "0.51"}, "comments": 3, "created_at": "2020-10-29T21:45:58Z", "updated_at": "2021-06-17T18:12:30Z", "closed_at": "2020-10-29T22:47:45Z", "author_association": "OWNER", "pull_request": null, "body": "- [x] Update `.csv` output to link to these things (and get that `xfail` test to pass)\r\n- ~~Add a `.csv?_blob_base64=1` argument that causes them to be output in base64 in the CSV~~\r\n\r\n> Moving the CSV work to a separate ticket.\r\n_Originally posted by @simonw in https://github.com/simonw/datasette/pull/1061#issuecomment-719042601_", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1063/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 1223699280, "node_id": "I_kwDOBm6k_c5I8CtQ", "number": 1739, "title": ".db downloads should be served with an ETag", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 6, "created_at": "2022-05-03T05:11:21Z", "updated_at": "2022-05-04T18:21:18Z", "closed_at": "2022-05-03T14:59:51Z", "author_association": "OWNER", "pull_request": null, "body": "I noticed that my Pyodide Datasette prototype is downloading the same database file every single time rather than browser caching it:\r\n\r\n![image](https://user-images.githubusercontent.com/9599/166407074-dee19587-0667-4424-9e88-d3b5b90fd819.png)\r\n", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1739/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 570101428, "node_id": "MDExOlB1bGxSZXF1ZXN0Mzc5MTkyMjU4", "number": 683, "title": ".execute_write() and .execute_write_fn() methods on Database", "user": {"value": 9599, "label": "simonw"}, "state": "closed", 
"locked": 0, "assignee": null, "milestone": {"value": 3268330, "label": "Datasette 1.0"}, "comments": 14, "created_at": "2020-02-24T19:51:58Z", "updated_at": "2020-05-30T18:40:20Z", "closed_at": "2020-02-25T04:45:08Z", "author_association": "OWNER", "pull_request": "simonw/datasette/pulls/683", "body": "See #682\r\n\r\n- [x] Come up with design for `.execute_write()` and `.execute_write_fn()`\r\n- [x] Build some quick demo plugins to exercise the design\r\n- [x] Write some unit tests\r\n- [x] Write the documentation", "repo": {"value": 107914493, "label": "datasette"}, "type": "pull", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/683/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": 0, "state_reason": null} {"id": 1199158210, "node_id": "I_kwDOCGYnMM5HebPC", "number": 423, "title": ".extract() doesn't set foreign key when extracted columns contain NULL value", "user": {"value": 37447552, "label": "jlieth"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2022-04-10T20:05:30Z", "updated_at": "2022-08-27T14:45:04Z", "closed_at": "2022-08-27T14:45:04Z", "author_association": "NONE", "pull_request": null, "body": "I've run into an issue with `extract` and I don't believe this is the intended behaviour.\r\n\r\nI'm working with a database with music listening information. Currently it has one large table `listens` that contains all information. I'm trying to normalize the database by extracting relevant columns to separate tables (`artists`, `tracks`, `albums`). Not every track has an album.\r\n\r\nA simplified demonstration with just `track_title` and `album_title` columns:\r\n```ipython\r\nIn [1]: import sqlite_utils\r\n\r\nIn [2]: db = sqlite_utils.Database(memory=True)\r\n\r\nIn [3]: db[\"listens\"].insert_all([\r\n ...: {\"id\": 1, \"track_title\": \"foo\", \"album_title\": \"bar\"},\r\n ...: {\"id\": 2, \"track_title\": \"baz\", \"album_title\": None}\r\n ...: ], pk=\"id\")\r\nOut[3]: \r\n```\r\n\r\nThe track in the first row has an album, the second track doesn't. Now I extract album information into a separate column:\r\n```ipython\r\nIn [4]: db[\"listens\"].extract(columns=[\"album_title\"], table=\"albums\", fk_column=\"album_id\")\r\nOut[4]:
\r\n\r\nIn [5]: list(db[\"albums\"].rows)\r\nOut[5]: [{'id': 1, 'album_title': 'bar'}, {'id': 2, 'album_title': None}]\r\n\r\nIn [6]: list(db[\"listens\"].rows)\r\nOut[6]: \r\n[{'id': 1, 'track_title': 'foo', 'album_id': 1},\r\n {'id': 2, 'track_title': 'baz', 'album_id': None}]\r\n```\r\n\r\nThis behaves as expected -- the `album` table contains entries for both the existing album and the NULL album. The `listens` table has a foreign key only for the first row (since the album in the second row was empty).\r\n\r\nNow I want to extract the track information as well. Album information belongs to the track so I want to extract both columns to a new table.\r\n```ipython\r\nIn [7]: db[\"listens\"].extract(columns=[\"track_title\", \"album_id\"], table=\"tracks\", fk_column=\"track_id\")\r\nOut[7]:
\r\n\r\nIn [8]: list(db[\"tracks\"].rows)\r\nOut[8]: \r\n[{'id': 1, 'track_title': 'foo', 'album_id': 1},\r\n {'id': 2, 'track_title': 'baz', 'album_id': None}]\r\n\r\nIn [9]: list(db[\"listens\"].rows)\r\nOut[9]: [{'id': 1, 'track_id': 1}, {'id': 2, 'track_id': None}]\r\n```\r\n\r\nExtracting to the `tracks` table worked fine (both tracks are present with correct columns). However, the `listens` table only has a foreign key to the newly created tracks for the first row, the foreign key in the second row is NULL.\r\n\r\nChanging the order of extracts doesn't help.\r\n\r\nI poked around in the source a bit and I believe [this line](https://github.com/simonw/sqlite-utils/blob/433813612ff9b4b501739fd7543bef0040dd51fe/sqlite_utils/db.py#L1737) (essentially comparing `NULL = NULL`) is the problem, but I don't know enough about SQL to create a reliable fix myself.", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/423/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 467928674, "node_id": "MDExOlB1bGxSZXF1ZXN0Mjk3NDU5Nzk3", "number": 40, "title": ".get() method plus support for compound primary keys", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2019-07-15T03:43:13Z", "updated_at": "2019-07-15T04:28:57Z", "closed_at": "2019-07-15T04:28:52Z", "author_association": "OWNER", "pull_request": "simonw/sqlite-utils/pulls/40", "body": "- [x] Tests for the `NotFoundError` exception\r\n- [x] Documentation for `.get()` method\r\n- [x] Support `--pk` multiple times to define CLI compound primary keys\r\n- [x] Documentation for compound primary keys", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "pull", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/40/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": 0, "state_reason": null} {"id": 413871266, "node_id": "MDU6SXNzdWU0MTM4NzEyNjY=", "number": 18, "title": ".insert/.upsert/.insert_all/.upsert_all should add missing columns", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": {"value": 4348046, "label": "1.0"}, "comments": 2, "created_at": "2019-02-24T21:36:11Z", "updated_at": "2019-05-25T00:42:11Z", "closed_at": "2019-05-25T00:42:11Z", "author_association": "OWNER", "pull_request": null, "body": "This is a larger change, but it would be incredibly useful: if you attempt to insert or update a document with a field that does not currently exist in the underlying table, sqlite-utils should add the appropriate column for you.", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/18/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 808046597, "node_id": "MDU6SXNzdWU4MDgwNDY1OTc=", 
"number": 234, "title": ".insert_all() fails if subsequent chunks contain additional columns", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2021-02-14T21:01:51Z", "updated_at": "2021-02-14T21:03:40Z", "closed_at": "2021-02-14T21:03:40Z", "author_association": "OWNER", "pull_request": null, "body": "Reported by @nieuwenhoven in #225 along with a proposed fix.", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/234/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 403625674, "node_id": "MDU6SXNzdWU0MDM2MjU2NzQ=", "number": 7, "title": ".insert_all() should accept a generator and process it efficiently", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2019-01-28T02:11:58Z", "updated_at": "2019-01-28T06:26:53Z", "closed_at": "2019-01-28T06:26:53Z", "author_association": "OWNER", "pull_request": null, "body": "Right now you have to load every record into memory before passing the list to `.insert_all()` and friends.\r\n\r\nIf you want to process millions of rows, this is inefficient. Python has generators - we should use them!\r\n\r\nThe only catch here is that part of the magic of `sqlite-utils` is that it guesses the column types and creates the table for you. This code will need to be updated to notice if the table needs creating and, if it does, create it using the first X (where x=1,000 but can be customized) records.\r\n\r\nIf a record outside of those first 1,000 has a rogue column, we can crash with an error.\r\n\r\nThis will free us up to make the `--nl` option added in #6 much more efficient.", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/7/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 742011049, "node_id": "MDU6SXNzdWU3NDIwMTEwNDk=", "number": 1091, "title": ".json and .csv exports fail to apply base_url", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": {"value": 6346396, "label": "Datasette 0.54"}, "comments": 22, "created_at": "2020-11-12T23:45:16Z", "updated_at": "2021-01-24T21:20:24Z", "closed_at": "2021-01-09T22:19:29Z", "author_association": "OWNER", "pull_request": null, "body": "> Just tested with the latest Docker image, and it works pretty much everywhere! THANK YOU!\r\n> \r\n> I did notice that if I try to export json or csv, the base is not applied. 
Not sure if I should reopen this issue or open a new one.\r\n> \r\n> To see this, go here: https://corpora.tika.apache.org/datasette/corpora-metadata/REF_PARSE_EXCEPTION_TYPES\r\n> \r\n> Click/hover over json or CSV and you'll see that the 'datasette' base is not included.\r\n\r\n_Originally posted by @tballison in https://github.com/simonw/datasette/issues/865#issuecomment-726385422_", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1091/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 403617881, "node_id": "MDU6SXNzdWU0MDM2MTc4ODE=", "number": 405, "title": ".json?_nl=on option for exporting newline-delimited JSON", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 2, "created_at": "2019-01-28T01:10:45Z", "updated_at": "2019-01-28T01:49:00Z", "closed_at": "2019-01-28T01:48:37Z", "author_association": "OWNER", "pull_request": null, "body": "The neat thing about newline-delimited JSON is that you don't have to read an entire array (of potentially thousands of objects) into memory in order to parse it - you can parse things a line at a time instead.\r\n\r\nIt will look like this:\r\n\r\n`https://latest.datasette.io/fixtures/facetable.json?_shape=array&_nl=on`\r\n```\r\n{\"pk\": 1, \"planet_int\": 1, \"on_earth\": 1, \"state\": \"CA\", \"city_id\": 1, \"neighborhood\": \"Mission\"}\r\n{\"pk\": 2, \"planet_int\": 1, \"on_earth\": 1, \"state\": \"CA\", \"city_id\": 1, \"neighborhood\": \"Dogpatch\"}\r\n{\"pk\": 3, \"planet_int\": 1, \"on_earth\": 1, \"state\": \"CA\", \"city_id\": 1, \"neighborhood\": \"SOMA\"}\r\n{\"pk\": 4, \"planet_int\": 1, \"on_earth\": 1, \"state\": \"CA\", \"city_id\": 1, \"neighborhood\": \"Tenderloin\"}\r\n{\"pk\": 5, \"planet_int\": 1, \"on_earth\": 1, \"state\": \"CA\", \"city_id\": 1, \"neighborhood\": \"Bernal Heights\"}\r\n```\r\n\r\nI added this as part of the `sqlite-utils json` CLI command is this commit - I think Datasette should offer it as well: https://github.com/simonw/sqlite-utils/commit/5466c9745dfef858286146ea158ffd5a71391d10\r\n\r\nIt can be offered alongside `_stream=on` (which currently only works for CSV, but it could work for JSON as well thanks to this trick).", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/405/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"}
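The final record above (datasette #405) describes why newline-delimited JSON is worth offering: each line is a complete object, so a client can parse one row at a time instead of reading a whole array into memory. A minimal Python sketch of that consumption pattern, assuming the `latest.datasette.io` fixture URL quoted in the issue body is still reachable:

```python
import json
import urllib.request

# URL quoted in the issue body; with _shape=array&_nl=on each response line is one JSON object.
URL = "https://latest.datasette.io/fixtures/facetable.json?_shape=array&_nl=on"

with urllib.request.urlopen(URL) as response:
    for raw_line in response:          # iterate the HTTP response line by line
        raw_line = raw_line.strip()
        if not raw_line:               # ignore any trailing blank line
            continue
        row = json.loads(raw_line)     # parse a single row; the rest stays unbuffered
        print(row["pk"], row["neighborhood"])
```

Each `json.loads()` call only ever sees one line, which is the streaming property the issue contrasts with parsing a single large JSON array.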
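Further up, the `.extract()` record (sqlite-utils #423) traces the missing foreign key to a lookup that effectively compares `NULL = NULL`. In SQLite that comparison evaluates to NULL rather than true, so the row with the NULL album never matches its extracted counterpart; the `IS` operator is the one that treats two NULLs as equal. A small sketch of the difference, independent of sqlite-utils itself:

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# '=' with NULL is never true, so the WHERE clause filters the row out.
print(conn.execute(
    "SELECT COUNT(*) FROM (SELECT NULL AS v) WHERE v = NULL"
).fetchone()[0])   # prints 0

# 'IS' treats two NULLs as equal, so the same row survives the filter.
print(conn.execute(
    "SELECT COUNT(*) FROM (SELECT NULL AS v) WHERE v IS NULL"
).fetchone()[0])   # prints 1
```

A fix in the spirit the issue asks about would match candidate rows with `IS` (NULL-safe equality) rather than `=` on columns that may contain NULL values.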