{"id": 648637666, "node_id": "MDU6SXNzdWU2NDg2Mzc2NjY=", "number": 880, "title": "POST to /db/canned-query that returns JSON should be supported (for API clients)", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": {"value": 5818042, "label": "Datasette 0.49"}, "comments": 11, "created_at": "2020-07-01T03:14:43Z", "updated_at": "2020-09-14T21:28:21Z", "closed_at": "2020-09-14T21:25:01Z", "author_association": "OWNER", "pull_request": null, "body": "Now that CSRF is solved for API requests (#835) it would be good to support API requests to the `.json` extension.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/880/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 701294727, "node_id": "MDU6SXNzdWU3MDEyOTQ3Mjc=", "number": 965, "title": "Documentation for 404.html, 500.html templates", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": {"value": 5818042, "label": "Datasette 0.49"}, "comments": 3, "created_at": "2020-09-14T17:36:59Z", "updated_at": "2020-09-14T18:49:49Z", "closed_at": "2020-09-14T18:47:22Z", "author_association": "OWNER", "pull_request": null, "body": "This mechanism is not documented: https://github.com/simonw/datasette/blob/30b98e4d2955073ca2bca92ca7b3d97fcd0191bf/datasette/app.py#L1119-L1129", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/965/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 700728217, "node_id": "MDU6SXNzdWU3MDA3MjgyMTc=", "number": 964, "title": "raise_404 mechanism for custom templates", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": {"value": 5818042, "label": "Datasette 0.49"}, "comments": 1, "created_at": "2020-09-14T03:22:15Z", "updated_at": "2020-09-14T17:49:44Z", "closed_at": "2020-09-14T17:39:34Z", "author_association": "OWNER", "pull_request": null, "body": "> Having tried this out I think it does need a `raise_404()` mechanism - which needs to be smart enough to trigger the default 404 handler without accidentally going into an infinite loop.\r\n\r\n_Originally posted by @simonw in https://github.com/simonw/datasette/issues/944#issuecomment-691788478_", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/964/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 681516976, "node_id": "MDU6SXNzdWU2ODE1MTY5NzY=", "number": 944, "title": "Path parameters for custom pages", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": {"value": 5818042, "label": "Datasette 0.49"}, "comments": 5, "created_at": "2020-08-19T03:25:17Z", 
"updated_at": "2020-09-14T03:21:45Z", "closed_at": "2020-09-14T02:34:58Z", "author_association": "OWNER", "pull_request": null, "body": "[Custom pages](https://docs.datasette.io/en/stable/custom_templates.html#custom-pages) let you e.g. create a `templates/pages/about.html` page and have it automatically served at `/about`.\r\n\r\nIt would be useful if these pages could capture path patterns. I like the Python format string syntax for this (also used by Starlette): `/foo/bar/{slug}`.\r\n\r\nSo... how about embedding those patterns in the filenames themselves?\r\n\r\n templates/pages/museums/{slug}.html\r\n\r\nWould capture any hits to `/museums/something` and use that page to serve them.\r\n\r\n", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/944/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 698791218, "node_id": "MDU6SXNzdWU2OTg3OTEyMTg=", "number": 50, "title": "favorites --stop_after=N stops after min(N, 200)", "user": {"value": 370930, "label": "mikepqr"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 2, "created_at": "2020-09-11T03:38:14Z", "updated_at": "2020-09-13T05:11:14Z", "closed_at": null, "author_association": "CONTRIBUTOR", "pull_request": null, "body": "For any number greater than 200, `favorites --stop_after` stops after getting 200 tweets, e.g.\r\n```\r\n$ twitter-to-sqlite favorites tweets.db --stop_after=300\r\nImporting favorites [####################################] 199\r\n$\r\n```\r\nI don't _think_ this is a limitation of the API (if you omit `--stop_after` you get some very large number, possibly all of them), so I _think_ this is a bug.", "repo": {"value": 206156866, "label": "twitter-to-sqlite"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/50/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 699947574, "node_id": "MDU6SXNzdWU2OTk5NDc1NzQ=", "number": 963, "title": "Currently selected array facets are not correctly persisted through hidden form fields", "user": {"value": 649467, "label": "mhalle"}, "state": "closed", "locked": 0, "assignee": null, "milestone": {"value": 5818042, "label": "Datasette 0.49"}, "comments": 1, "created_at": "2020-09-12T01:49:17Z", "updated_at": "2020-09-12T21:54:29Z", "closed_at": "2020-09-12T21:54:09Z", "author_association": "NONE", "pull_request": null, "body": "Faceted search uses JSON array elements as facets rather than the arrays. However, if a search is \"Apply\"ed (using the Apply button), the array itself rather than its elements used. \r\n\r\nTo reproduce:\r\nhttps://latest.datasette.io/fixtures/facetable?_sort=pk&_facet=created&_facet=tags&_facet_array=tags\r\n\r\nPress \"Apply\", which might be done when removing a filter. Notice that the \"tags\" facet values are now arrays, not array elements. 
It appears the \"&_facet_array=tags\" element of the query string is dropped.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/963/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 569275763, "node_id": "MDU6SXNzdWU1NjkyNzU3NjM=", "number": 680, "title": "Release automation: automate the bit that posts the GitHub release", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 5, "created_at": "2020-02-22T03:50:40Z", "updated_at": "2020-09-12T18:18:50Z", "closed_at": "2020-09-12T18:18:50Z", "author_association": "OWNER", "pull_request": null, "body": "The most manual part of [the release process](https://datasette.readthedocs.io/en/stable/contributing.html#release-process) right now is having to post a GitHub release that matches the updated changelog.\r\n\r\nThis is particularly annoying because the changelog is in `.rst` while the GitHub release needs markdown - so I currently manually translate between the two.\r\n\r\nHaving the release script automatically post a GitHub release at the end would be much more convenient.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/680/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 649429772, "node_id": "MDU6SXNzdWU2NDk0Mjk3NzI=", "number": 886, "title": "Reconsider how _actor_X magic parameter deals with missing values", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 2, "created_at": "2020-07-02T00:00:38Z", "updated_at": "2020-09-11T21:35:26Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "I had to build a custom `_actorornull` prefix for [datasette-saved-queries](https://github.com/simonw/datasette-saved-queries/blob/37c00e56ac398e1f9aa342d30357de013a9b37b4/datasette_saved_queries/__init__.py):\r\n```python\r\ndef actorornull(key, request):\r\n if request.actor is None:\r\n return None\r\n return request.actor.get(key)\r\n\r\n\r\n@hookimpl\r\ndef register_magic_parameters():\r\n return [\r\n (\"actorornull\", actorornull),\r\n ]\r\n```\r\nMaybe the `actor` magic in Datasette core should do that out of the box?\r\n\r\nhttps://github.com/simonw/datasette/blob/f1f581b7ffcd5d8f3ae6c1c654d813a6641410eb/datasette/default_magic_parameters.py#L14-L17\r\n\r\n", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/886/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 691475400, "node_id": "MDU6SXNzdWU2OTE0NzU0MDA=", "number": 958, "title": "Upgrade to latest Black (20.8b1)", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": 
{"value": 5818042, "label": "Datasette 0.49"}, "comments": 0, "created_at": "2020-09-02T22:24:19Z", "updated_at": "2020-09-11T21:34:24Z", "closed_at": "2020-09-02T22:25:10Z", "author_association": "OWNER", "pull_request": null, "body": "Black has some changes: https://black.readthedocs.io/en/stable/change_log.html#b0 - in particular:\r\n\r\n> - re-implemented support for explicit trailing commas: now it works consistently within any bracket pair, including nested structures (#1288 and duplicates)\r\n> - Black now reindents docstrings when reindenting code around it (#1053)", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/958/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 699622046, "node_id": "MDU6SXNzdWU2OTk2MjIwNDY=", "number": 962, "title": "datasette --pdb option for debugging errors", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": {"value": 5818042, "label": "Datasette 0.49"}, "comments": 1, "created_at": "2020-09-11T18:33:10Z", "updated_at": "2020-09-11T21:34:24Z", "closed_at": "2020-09-11T18:38:01Z", "author_association": "OWNER", "pull_request": null, "body": "I needed to debug an exception from deep inside a Jinja template the other day. I hacked this together and it helped.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/962/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 684111953, "node_id": "MDU6SXNzdWU2ODQxMTE5NTM=", "number": 947, "title": "datasette --get exit code should reflect HTTP errors", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": {"value": 5818042, "label": "Datasette 0.49"}, "comments": 1, "created_at": "2020-08-23T04:17:08Z", "updated_at": "2020-09-11T21:33:15Z", "closed_at": "2020-09-11T21:33:15Z", "author_association": "OWNER", "pull_request": null, "body": "If you run `datasette . 
--get /` and the result is a 500 or 404 error (anything that's not a 200 or a 30x) the exit code from the command should not be 0.\r\n\r\nIt should still output the returned content to stdout.\r\n\r\nThis will help with writing soundness checks, as seen in https://til.simonwillison.net/til/til/github-actions_grep-tests.md", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/947/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 697162939, "node_id": "MDU6SXNzdWU2OTcxNjI5Mzk=", "number": 20, "title": "Add more tags so people can find your project.", "user": {"value": 7902810, "label": "ran88dom99"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 0, "created_at": "2020-09-09T21:14:09Z", "updated_at": "2020-09-09T21:14:09Z", "closed_at": null, "author_association": "NONE", "pull_request": null, "body": "quantified-self habit-tracking google-fit time-tracking wearables quantifiedself \r\nfor example", "repo": {"value": 197431109, "label": "dogsheep-beta"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/dogsheep/dogsheep-beta/issues/20/reactions\", \"total_count\": 1, \"+1\": 0, \"-1\": 1, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 697030843, "node_id": "MDExOlB1bGxSZXF1ZXN0NDgzMDI3NTg3", "number": 156, "title": "Typos in tests", "user": {"value": 96218, "label": "simonwiles"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2020-09-09T18:00:58Z", "updated_at": "2020-09-09T18:24:50Z", "closed_at": "2020-09-09T18:21:23Z", "author_association": "CONTRIBUTOR", "pull_request": "simonw/sqlite-utils/pulls/156", "body": "One of these is my fault, and the other is one I just happened to come across. 
They're harmless, but might as well be fixed.", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "pull", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/156/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": 0, "state_reason": null} {"id": 696908389, "node_id": "MDU6SXNzdWU2OTY5MDgzODk=", "number": 961, "title": "Verification checks for metadata.json on startup", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 2, "created_at": "2020-09-09T15:21:53Z", "updated_at": "2020-09-09T15:24:31Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "I lost a bunch of time yesterday trying to figure out why a Datasette instance wasn't starting up - it turned out it was because I had a `facets:` reference that mentioned a column that did not exist.\r\n\r\nCatching these on startup would be good.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/961/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 694500679, "node_id": "MDU6SXNzdWU2OTQ1MDA2Nzk=", "number": 17, "title": "Rename \"table\" to \"type\"", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 2, "created_at": "2020-09-06T19:34:41Z", "updated_at": "2020-09-09T03:03:22Z", "closed_at": "2020-09-09T03:03:22Z", "author_association": "MEMBER", "pull_request": null, "body": "I think \"table\" is the wrong name for the concept I'm using it for here.\r\n\r\nTwo reasons: firstly, `table` is a reserved word in SQLite. More importantly, it turns out there's not a direct mapping from tables to types of search result. In particular, for GitHub I ended up having two different \"tables\" of repositories - one for repos created by me, another for repos that I have starred.", "repo": {"value": 197431109, "label": "dogsheep-beta"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/dogsheep/dogsheep-beta/issues/17/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 688659182, "node_id": "MDU6SXNzdWU2ODg2NTkxODI=", "number": 145, "title": "Bug when first record contains fewer columns than subsequent records", "user": {"value": 96218, "label": "simonwiles"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 2, "created_at": "2020-08-30T05:44:44Z", "updated_at": "2020-09-08T23:21:23Z", "closed_at": "2020-09-08T23:21:23Z", "author_association": "CONTRIBUTOR", "pull_request": null, "body": "`insert_all()` selects the maximum batch size based on the number of fields in the first record. If the first record has fewer fields than subsequent records (and `alter=True` is passed), this can result in SQL statements with more than the maximum permitted number of host parameters. 
This situation is perhaps unlikely to occur, but could happen if the first record had, say, 10 columns, such that `batch_size` (based on `SQLITE_MAX_VARIABLE_NUMBER = 999`) would be 99. If the next 98 rows had 11 columns, the resulting SQL statement for the first batch would have `10 * 1 + 11 * 98 = 1088` host parameters (and subsequent batches, if the data were consistent from thereon out, would have `99 * 11 = 1089`).\r\n\r\nI suspect that this bug is masked somewhat by the fact that while:\r\n> [`SQLITE_MAX_VARIABLE_NUMBER`](https://www.sqlite.org/limits.html#max_variable_number) ... defaults to 999 for SQLite versions prior to 3.32.0 (2020-05-22) or 32766 for SQLite versions after 3.32.0.\r\n\r\nit is common that it is increased at compile time. Debian-based systems, for example, seem to ship with a version of sqlite compiled with `SQLITE_MAX_VARIABLE_NUMBER` set to 250,000, and I believe this is the case for homebrew installations too.\r\n\r\nA test for this issue might look like this:\r\n```python\r\ndef test_columns_not_in_first_record_should_not_cause_batch_to_be_too_large(fresh_db):\r\n # sqlite on homebrew and Debian/Ubuntu etc. is typically compiled with\r\n # SQLITE_MAX_VARIABLE_NUMBER set to 250,000, so we need to exceed this value to\r\n # trigger the error on these systems.\r\n THRESHOLD = 250000\r\n extra_columns = 1 + (THRESHOLD - 1) // 99\r\n records = [\r\n {\"c0\": \"first record\"}, # one column in first record -> batch_size = 100\r\n # fill out the batch with 99 records with enough columns to exceed THRESHOLD\r\n *[\r\n dict([(\"c{}\".format(i), j) for i in range(extra_columns)])\r\n for j in range(99)\r\n ]\r\n ]\r\n try:\r\n fresh_db[\"too_many_columns\"].insert_all(records, alter=True)\r\n except sqlite3.OperationalError:\r\n raise\r\n```\r\n\r\nThe best solution, I think, is simply to process all the records when determining columns, column types, and the batch size. In my tests this doesn't seem to be particularly costly at all, and cuts out a lot of complications (including obviating my implementation of #139 at #142). I'll raise a PR for your consideration.\r\n\r\n", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/145/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 688668680, "node_id": "MDExOlB1bGxSZXF1ZXN0NDc1ODc0NDkz", "number": 146, "title": "Handle case where subsequent records (after first batch) include extra columns", "user": {"value": 96218, "label": "simonwiles"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 5, "created_at": "2020-08-30T07:13:58Z", "updated_at": "2020-09-08T23:20:37Z", "closed_at": "2020-09-08T23:20:37Z", "author_association": "CONTRIBUTOR", "pull_request": "simonw/sqlite-utils/pulls/146", "body": "Addresses #145.\r\n\r\nI think this should do the job. 
If it meets with your approval I'll update this PR to include an update to the documentation -- I came across this bug while preparing a PR to update the documentation around `batch_size` in any event.", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "pull", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/146/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": 0, "state_reason": null} {"id": 695556681, "node_id": "MDU6SXNzdWU2OTU1NTY2ODE=", "number": 19, "title": "Figure out incremental re-indexing", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 2, "created_at": "2020-09-08T05:23:31Z", "updated_at": "2020-09-08T05:27:07Z", "closed_at": null, "author_association": "MEMBER", "pull_request": null, "body": "As tables get bigger reindexing everything on a schedule (essentially recreating the entire index from scratch) will start to become a performance bottleneck.", "repo": {"value": 197431109, "label": "dogsheep-beta"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/dogsheep/dogsheep-beta/issues/19/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 695553522, "node_id": "MDU6SXNzdWU2OTU1NTM1MjI=", "number": 18, "title": "Deleted records stay in the search index", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 2, "created_at": "2020-09-08T05:14:23Z", "updated_at": "2020-09-08T05:15:51Z", "closed_at": null, "author_association": "MEMBER", "pull_request": null, "body": "Here's why: https://github.com/dogsheep/dogsheep-beta/blob/24f7898d41a39218058f174c75ba62f7c0fcfff6/dogsheep_beta/utils.py#L44-L53\r\n\r\nThat should probably do `DELETE FROM index1.search_index WHERE [table] = ?` first.", "repo": {"value": 197431109, "label": "dogsheep-beta"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/dogsheep/dogsheep-beta/issues/18/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 695441530, "node_id": "MDU6SXNzdWU2OTU0NDE1MzA=", "number": 154, "title": "OperationalError: cannot change into wal mode from within a transaction", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 2, "created_at": "2020-09-07T23:42:44Z", "updated_at": "2020-09-07T23:47:10Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "I'm getting this error when running:\r\n\r\n sqlite-utils enable-wal beta.db\r\n\r\n`OperationalError: cannot change into wal mode from within a transaction`\r\n\r\nI'm worried that maybe that's because of this new code from #152:\r\n\r\nhttps://github.com/simonw/sqlite-utils/blob/deb2eb013ff85bbc828ebc244a9654f0d9c3139e/sqlite_utils/db.py#L128-L129", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": 
"{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/154/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 695359607, "node_id": "MDU6SXNzdWU2OTUzNTk2MDc=", "number": 150, "title": "Feature for tracing SQL queries", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 0, "created_at": "2020-09-07T19:43:08Z", "updated_at": "2020-09-07T21:57:01Z", "closed_at": "2020-09-07T21:57:01Z", "author_association": "OWNER", "pull_request": null, "body": "Debugging `sqlite-utils` when something weird happens (e.g. #149) can be a bit tricky since it runs a bunch of different SQL statements behind the scenes.\r\n\r\nAn optional \"tracing\" mechanism for seeing what SQL is being executed would be useful.", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/150/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 695360889, "node_id": "MDExOlB1bGxSZXF1ZXN0NDgxNjE2NzA0", "number": 151, "title": "Tracer mechanism for seeing underlying SQL", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 0, "created_at": "2020-09-07T19:46:43Z", "updated_at": "2020-09-07T21:57:00Z", "closed_at": "2020-09-07T21:57:00Z", "author_association": "OWNER", "pull_request": "simonw/sqlite-utils/pulls/151", "body": "Refs #150. Needs tests and documentation, including for the new `db.execute()` and `db.executescript()` methods.", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "pull", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/151/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": 0, "state_reason": null} {"id": 695376054, "node_id": "MDU6SXNzdWU2OTUzNzYwNTQ=", "number": 152, "title": "Turn on recursive_triggers by default", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 2, "created_at": "2020-09-07T20:26:36Z", "updated_at": "2020-09-07T21:17:48Z", "closed_at": "2020-09-07T20:45:14Z", "author_association": "OWNER", "pull_request": null, "body": "https://www.sqlite.org/pragma.html#pragma_recursive_triggers says:\r\n\r\n> Prior to SQLite [version 3.6.18](https://www.sqlite.org/releaselog/3_6_18.html) (2009-09-11), recursive triggers were not supported. The behavior of SQLite was always as if this pragma was set to OFF. Support for recursive triggers was added in version 3.6.18 but was initially turned OFF by default, for compatibility. 
Recursive triggers may be turned on by default in future versions of SQLite.\r\n\r\nSo I think the fix for the complex issue in #149 is to turn on `recursive_triggers` globally by default for `sqlite-utils`.\r\n\r\n_Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/149#issuecomment-688499924_", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/152/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 695319258, "node_id": "MDU6SXNzdWU2OTUzMTkyNTg=", "number": 149, "title": "FTS table with 7 rows has _fts_docsize table with 9,141 rows", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 10, "created_at": "2020-09-07T18:06:16Z", "updated_at": "2020-09-07T21:16:34Z", "closed_at": "2020-09-07T21:16:34Z", "author_association": "OWNER", "pull_request": null, "body": "I'm seeing a weird issue with some of the SQLite databases that I am using with the FTS5 module.\r\n\r\nI have a database with a `licenses` table that contains 7 rows: \r\n\r\nThe FTS table also has 7 rows: \r\n\r\nSomehow the accompanying `licenses_fts_docsize` shadow table now has 9,141 rows in it! \r\n\r\nAnd `licenses_fts_data` has 41 rows - should I expect that to have 7 rows? \r\n\r\nI have a hunch that it might be a problem with the triggers. These are the triggers that are updating that FTS table: \r\n\r\n| type | name | tbl_name | rootpage | sql |\r\n| --- | --- | --- | --- | --- |\r\n| trigger | licenses_ai | licenses | 0 | `CREATE TRIGGER [licenses_ai] AFTER INSERT ON [licenses] BEGIN INSERT INTO [licenses_fts] (rowid, [name]) VALUES (new.rowid, new.[name]); END` |\r\n| trigger | licenses_ad | licenses | 0 | `CREATE TRIGGER [licenses_ad] AFTER DELETE ON [licenses] BEGIN INSERT INTO [licenses_fts] ([licenses_fts], rowid, [name]) VALUES('delete', old.rowid, old.[name]); END` |\r\n| trigger | licenses_au | licenses | 0 | `CREATE TRIGGER [licenses_au] AFTER UPDATE ON [licenses] BEGIN INSERT INTO [licenses_fts] ([licenses_fts], rowid, [name]) VALUES('delete', old.rowid, old.[name]); INSERT INTO [licenses_fts] (rowid, [name]) VALUES (new.rowid, new.[name]); END` |", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/149/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 695276328, "node_id": "MDU6SXNzdWU2OTUyNzYzMjg=", "number": 148, "title": "More attractive indentation of created FTS table schema", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2020-09-07T16:49:30Z", "updated_at": "2020-09-07T18:12:50Z", "closed_at": "2020-09-07T18:12:50Z", "author_association": "OWNER", "pull_request": null, "body": "On https://github-to-sqlite.dogsheep.net/github/licenses_fts the create table SQL is displayed as:\r\n```sql\r\nCREATE VIRTUAL TABLE [licenses_fts] USING FTS5 (\r\n [name],\r\n content=[licenses]\r\n );\r\n```\r\nIt would be 
more aesthetically pleasing if it looked like this:\r\n```sql\r\nCREATE VIRTUAL TABLE [licenses_fts] USING FTS5 (\r\n [name],\r\n content=[licenses]\r\n);\r\n```", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/148/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 693318095, "node_id": "MDU6SXNzdWU2OTMzMTgwOTU=", "number": 14, "title": "On FTS exception rerun the query with quoting", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 0, "created_at": "2020-09-04T15:44:18Z", "updated_at": "2020-09-05T16:23:01Z", "closed_at": "2020-09-05T16:23:01Z", "author_association": "MEMBER", "pull_request": null, "body": "Searching for eg `#dogfest` currently throws an FTS exception - but I want to support advanced FTS query tricks as seen in #13.\r\n\r\nhttps://dogsheep.simonwillison.net/-/beta?q=%23dogfest\r\n\r\n> fts5: syntax error near \"#\"\r\n\r\nIdea: catch that error and re-run the query with FTS escaping applied!\r\n", "repo": {"value": 197431109, "label": "dogsheep-beta"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/dogsheep/dogsheep-beta/issues/14/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 692202408, "node_id": "MDU6SXNzdWU2OTIyMDI0MDg=", "number": 12, "title": "Idea: maps and GeoJSON support", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 0, "created_at": "2020-09-03T18:47:10Z", "updated_at": "2020-09-04T01:45:03Z", "closed_at": null, "author_association": "MEMBER", "pull_request": null, "body": "It would be cool if the `display_sql` could return a column populated with GeoJSON which would the automatically be displayed on a map in the results (or maybe default JS would look for a `class=\"geojson\"` element output by the `display` template) - ala https://github.com/simonw/datasette-leaflet-geojson\r\n\r\nThen I could render workout routes on a map, or Swarm checkin points.", "repo": {"value": 197431109, "label": "dogsheep-beta"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/dogsheep/dogsheep-beta/issues/12/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 692386625, "node_id": "MDU6SXNzdWU2OTIzODY2MjU=", "number": 13, "title": "Support advanced FTS queries", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2020-09-03T21:29:56Z", "updated_at": "2020-09-03T21:40:51Z", "closed_at": "2020-09-03T21:40:51Z", "author_association": "MEMBER", "pull_request": null, "body": "`simon willison NOT screenshot` for example.", "repo": {"value": 197431109, "label": "dogsheep-beta"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": 
\"https://api.github.com/repos/dogsheep/dogsheep-beta/issues/13/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 691521965, "node_id": "MDU6SXNzdWU2OTE1MjE5NjU=", "number": 9, "title": "Mechanism for defining custom display of results", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 8, "created_at": "2020-09-03T00:14:07Z", "updated_at": "2020-09-03T21:12:14Z", "closed_at": "2020-09-03T21:09:55Z", "author_association": "MEMBER", "pull_request": null, "body": "Part of #3 - in particular I want to make sure my photos are displayed with a thumbnail.", "repo": {"value": 197431109, "label": "dogsheep-beta"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/dogsheep/dogsheep-beta/issues/9/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 689810340, "node_id": "MDU6SXNzdWU2ODk4MTAzNDA=", "number": 3, "title": "Datasette plugin to provide custom page for running faceted, ranked searches", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2020-09-01T05:00:22Z", "updated_at": "2020-09-03T21:01:41Z", "closed_at": "2020-09-03T21:01:41Z", "author_association": "MEMBER", "pull_request": null, "body": "This will be a page at `/-/beta` which renders using a custom template.\r\n\r\nIt will offer a default timeline view plus search and facet by type/date.", "repo": {"value": 197431109, "label": "dogsheep-beta"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/dogsheep/dogsheep-beta/issues/3/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 689847361, "node_id": "MDU6SXNzdWU2ODk4NDczNjE=", "number": 5, "title": "Add a context column that's not searchable", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2020-09-01T06:13:42Z", "updated_at": "2020-09-03T18:43:50Z", "closed_at": "2020-09-03T18:43:50Z", "author_association": "MEMBER", "pull_request": null, "body": "I sometimes like to configure titles that are things like \"Comment on issue X\" or \"Photo in Golden Gate Park\" - these shouldn't be included in the search index but should be stored so they can be displayed to provide context.\r\n\r\nAdd a column for this - probably called `context` - and make it so it can be populated.", "repo": {"value": 197431109, "label": "dogsheep-beta"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/dogsheep/dogsheep-beta/issues/5/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 691557547, "node_id": "MDU6SXNzdWU2OTE1NTc1NDc=", "number": 10, "title": "Category 3: received", "user": {"value": 9599, "label": "simonw"}, "state": "closed", 
"locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2020-09-03T01:40:36Z", "updated_at": "2020-09-03T17:38:51Z", "closed_at": "2020-09-03T17:38:51Z", "author_association": "MEMBER", "pull_request": null, "body": "A category for things that were sent to me: DMs, emails etc. Follows #7.", "repo": {"value": 197431109, "label": "dogsheep-beta"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/dogsheep/dogsheep-beta/issues/10/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 692125110, "node_id": "MDU6SXNzdWU2OTIxMjUxMTA=", "number": 11, "title": "Public / Private mechanism", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2020-09-03T16:47:03Z", "updated_at": "2020-09-03T17:33:52Z", "closed_at": "2020-09-03T17:33:52Z", "author_association": "MEMBER", "pull_request": null, "body": "Some of the data in Dogsheep is stuff that was written publicly - tweets, blog posts, GitHub commits to public repos.\r\n\r\nSome of it is private data - emails, photos, direct messages, Swarm checkins, commits to private repos.\r\n\r\nBeing able to filter for just one or the other (or both) would be useful. Especially when giving demos!", "repo": {"value": 197431109, "label": "dogsheep-beta"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/dogsheep/dogsheep-beta/issues/11/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 691537426, "node_id": "MDU6SXNzdWU2OTE1Mzc0MjY=", "number": 959, "title": "Internals API idea: results.dicts in addition to results.rows", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 0, "created_at": "2020-09-03T00:50:17Z", "updated_at": "2020-09-03T00:50:17Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "I just wrote this code:\r\n```python\r\n results = await database.execute(SEARCH_SQL, {\"query\": query})\r\n return [dict(r) for r in results.rows]\r\n```\r\nHow about having `results.dicts` as a utility property that does that?", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/959/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 687245650, "node_id": "MDExOlB1bGxSZXF1ZXN0NDc0NzAzMDA3", "number": 952, "title": "Update black requirement from ~=19.10b0 to >=19.10,<21.0", "user": {"value": 27856297, "label": "dependabot-preview[bot]"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2020-08-27T13:31:36Z", "updated_at": "2020-09-02T22:26:17Z", "closed_at": "2020-09-02T22:26:16Z", "author_association": "CONTRIBUTOR", "pull_request": "simonw/datasette/pulls/952", "body": "Updates the requirements on [black](https://github.com/psf/black) to permit the latest version.\n
\nChangelog\n\nSourced from black's changelog.\n\n> ## 20.8b1\n>\n> ### Packaging\n>\n> - explicitly depend on Click 7.1.2 or newer as Black no longer works with versions older than 7.0\n>\n> ## 20.8b0\n>\n> ### Black\n>\n> - re-implemented support for explicit trailing commas: now it works consistently within any bracket pair, including nested structures (#1288 and duplicates)\n> - Black now reindents docstrings when reindenting code around it (#1053)\n> - Black now shows colored diffs (#1266)\n> - Black is now packaged using 'py3' tagged wheels (#1388)\n> - Black now supports Python 3.8 code, e.g. star expressions in return statements (#1121)\n> - Black no longer normalizes capital R-string prefixes as those have a community-accepted meaning (#1244)\n> - Black now uses exit code 2 when specified configuration file doesn't exit (#1361)\n> - Black now works on AWS Lambda (#1141)\n> - added --force-exclude argument (#1032)\n> - removed deprecated --py36 option (#1236)\n> - fixed --diff output when EOF is encountered (#526)\n> - fixed # fmt: off handling around decorators (#560)\n> - fixed unstable formatting with some # type: ignore comments (#1113)\n> - fixed invalid removal on organizing brackets followed by indexing (#1575)\n> - introduced black-primer, a CI tool that allows us to run regression tests against existing open source users of Black (#1402)\n> - introduced property-based fuzzing to our test suite based on Hypothesis and Hypothersmith (#1566)\n> - implemented experimental and disabled by default long string rewrapping (#1132), hidden under a --experimental-string-processing flag while it's being worked on\n\nCommits\n
\n\n\nDependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`.\n\n[//]: # (dependabot-automerge-start)\n[//]: # (dependabot-automerge-end)\n\n---\n\n
\nDependabot commands and options\n
\n\nYou can trigger Dependabot actions by commenting on this PR:\n- `@dependabot rebase` will rebase this PR\n- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it\n- `@dependabot merge` will merge this PR after your CI passes on it\n- `@dependabot squash and merge` will squash and merge this PR after your CI passes on it\n- `@dependabot cancel merge` will cancel a previously requested merge and block automerging\n- `@dependabot reopen` will reopen this PR if it is closed\n- `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually\n- `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)\n- `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)\n- `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)\n- `@dependabot use these labels` will set the current labels as the default for future PRs for this repo and language\n- `@dependabot use these reviewers` will set the current reviewers as the default for future PRs for this repo and language\n- `@dependabot use these assignees` will set the current assignees as the default for future PRs for this repo and language\n- `@dependabot use this milestone` will set the current milestone as the default for future PRs for this repo and language\n- `@dependabot badge me` will comment on this PR with code to add a \"Dependabot enabled\" badge to your readme\n\nAdditionally, you can set the following in your Dependabot [dashboard](https://app.dependabot.com):\n- Update frequency (including time of day and day of week)\n- Pull request limits (per update run and/or open at any time)\n- Out-of-range updates (receive only lockfile updates, if desired)\n- Security updates (receive only security updates, if desired)\n\n\n\n
", "repo": {"value": 107914493, "label": "datasette"}, "type": "pull", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/952/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": 0, "state_reason": null} {"id": 691265198, "node_id": "MDU6SXNzdWU2OTEyNjUxOTg=", "number": 7, "title": "Mechanism for differentiating between \"by me\" and \"liked by me\"", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 6, "created_at": "2020-09-02T17:44:37Z", "updated_at": "2020-09-02T21:06:28Z", "closed_at": "2020-09-02T21:06:28Z", "author_association": "MEMBER", "pull_request": null, "body": "Some of the content I'm indexing is by me - photos I've taken, tweets I wrote, commits, comments I posted.\r\n\r\nSome of it is stuff that I've \"liked\" or \"bookmarked\" in some way - favourited tweets, Pocket articles, starred GitHub repos.\r\n\r\nIt woud be useful to be able to differentiate between the two.", "repo": {"value": 197431109, "label": "dogsheep-beta"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/dogsheep/dogsheep-beta/issues/7/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 691369691, "node_id": "MDU6SXNzdWU2OTEzNjk2OTE=", "number": 8, "title": "Create a view for running faceted searches", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2020-09-02T19:44:07Z", "updated_at": "2020-09-02T19:50:47Z", "closed_at": "2020-09-02T19:50:47Z", "author_association": "MEMBER", "pull_request": null, "body": "```sql\r\nselect\r\n search_index_fts.rank,\r\n search_index.rowid,\r\n search_index.[table],\r\n search_index.key,\r\n search_index.title,\r\n search_index.timestamp,\r\n search_index.search_1\r\nfrom\r\n search_index join search_index_fts on search_index.rowid = search_index_fts.rowid\r\norder by\r\n search_index_fts.rank, search_index.timestamp desc\r\n```", "repo": {"value": 197431109, "label": "dogsheep-beta"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/dogsheep/dogsheep-beta/issues/8/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 689809225, "node_id": "MDU6SXNzdWU2ODk4MDkyMjU=", "number": 2, "title": "Apply porter stemming", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 2, "created_at": "2020-09-01T04:57:55Z", "updated_at": "2020-09-01T20:42:00Z", "closed_at": "2020-09-01T20:40:24Z", "author_association": "MEMBER", "pull_request": null, "body": "This can be on by default. 
You can turn it off for a table in the config file using `stemming: none` - or maybe `tokenize: none` to match the terminology used by SQLite and `sqlite-utils`: https://sqlite-utils.readthedocs.io/en/stable/python-api.html#enabling-full-text-search", "repo": {"value": 197431109, "label": "dogsheep-beta"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/dogsheep/dogsheep-beta/issues/2/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 689850810, "node_id": "MDU6SXNzdWU2ODk4NTA4MTA=", "number": 6, "title": "Set up a demo instance", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 0, "created_at": "2020-09-01T06:20:24Z", "updated_at": "2020-09-01T06:20:24Z", "closed_at": null, "author_association": "MEMBER", "pull_request": null, "body": "Once I've got the Datasette plugin to a state where it's worth building a demo: #3\r\n\r\nI can use data from my public https://github-to-sqlite.dogsheep.net/ demo plus the Pocket data subset I use for the demo in https://github.com/dogsheep/pocket-to-sqlite/issues/5 - I could pull in the https://dogsheep-photos.dogsheep.net/ photos data too.", "repo": {"value": 197431109, "label": "dogsheep-beta"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/dogsheep/dogsheep-beta/issues/6/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 629473827, "node_id": "MDU6SXNzdWU2Mjk0NzM4Mjc=", "number": 5, "title": "Set up a demo", "user": {"value": 26745575, "label": "harryvederci"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2020-06-02T19:56:49Z", "updated_at": "2020-09-01T06:18:43Z", "closed_at": null, "author_association": "NONE", "pull_request": null, "body": "First off, thanks for open sourcing this application! This is a suggestion to increase the amount of people that would make use of it: an example in the readme file would help.\r\n\r\nCurrently, users have to clone the app, install it, authorize through pocket, run a command, an then find out if this application does what they hope it does.\r\n\r\nAnother possibility is to add a file `example-output.db`, containing one (mock) Pocket article.\r\n\r\nKeep up the good work!", "repo": {"value": 213286752, "label": "pocket-to-sqlite"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/dogsheep/pocket-to-sqlite/issues/5/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 503243784, "node_id": "MDU6SXNzdWU1MDMyNDM3ODQ=", "number": 3, "title": "Extract images into separate tables", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2019-10-07T05:43:01Z", "updated_at": "2020-09-01T06:17:45Z", "closed_at": null, "author_association": "MEMBER", "pull_request": null, "body": "As already done with authors. 
Slightly harder because images do not have a universally unique ID. Also need to figure out what to do about there being columns for both `image` and `images`.\r\n\r\n\"memory__items\"\r\n", "repo": {"value": 213286752, "label": "pocket-to-sqlite"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/dogsheep/pocket-to-sqlite/issues/3/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 689848827, "node_id": "MDU6SXNzdWU2ODk4NDg4Mjc=", "number": 6, "title": "ISO timestamps", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 0, "created_at": "2020-09-01T06:16:42Z", "updated_at": "2020-09-01T06:16:42Z", "closed_at": null, "author_association": "MEMBER", "pull_request": null, "body": "The `time_added`, `time_updated` and `time_read` columns currently store data like this:\r\n\r\n September 19, 2019 - 00:30:30 UTC\r\n\r\nShould use ISO instead, e.g. `2020-07-26T01:05:24+00:00`", "repo": {"value": 213286752, "label": "pocket-to-sqlite"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/dogsheep/pocket-to-sqlite/issues/6/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 689839399, "node_id": "MDU6SXNzdWU2ODk4MzkzOTk=", "number": 4, "title": "Optimize the FTS table", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2020-09-01T05:58:17Z", "updated_at": "2020-09-01T06:10:08Z", "closed_at": "2020-09-01T06:10:08Z", "author_association": "MEMBER", "pull_request": null, "body": "", "repo": {"value": 197431109, "label": "dogsheep-beta"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/dogsheep/dogsheep-beta/issues/4/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 689800307, "node_id": "MDU6SXNzdWU2ODk4MDAzMDc=", "number": 1, "title": "Add an index on the timestamp column", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 0, "created_at": "2020-09-01T04:33:37Z", "updated_at": "2020-09-01T04:49:23Z", "closed_at": "2020-09-01T04:49:23Z", "author_association": "MEMBER", "pull_request": null, "body": "Since default view will likely be ordered by timestamp descending.", "repo": {"value": 197431109, "label": "dogsheep-beta"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/dogsheep/dogsheep-beta/issues/1/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 688386219, "node_id": "MDExOlB1bGxSZXF1ZXN0NDc1NjY1OTg0", "number": 142, "title": "insert_all(..., alter=True) should work for new columns introduced after the first 100 records", "user": {"value": 96218, "label": "simonwiles"}, 
"state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2020-08-28T22:22:57Z", "updated_at": "2020-08-30T07:28:23Z", "closed_at": "2020-08-28T22:30:14Z", "author_association": "CONTRIBUTOR", "pull_request": "simonw/sqlite-utils/pulls/142", "body": "Closes #139.", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "pull", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/142/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": 0, "state_reason": null} {"id": 688395275, "node_id": "MDU6SXNzdWU2ODgzOTUyNzU=", "number": 144, "title": "Run some tests against numpy", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 2, "created_at": "2020-08-28T22:53:00Z", "updated_at": "2020-08-28T22:57:05Z", "closed_at": "2020-08-28T22:57:04Z", "author_association": "OWNER", "pull_request": null, "body": "Accidentally removed in #143:\r\n\r\nhttps://github.com/simonw/sqlite-utils/blob/d7d3f962861ef32c5ead8f514c8756f5b6f7c4a0/.travis.yml#L18-L19", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/144/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 686978131, "node_id": "MDU6SXNzdWU2ODY5NzgxMzE=", "number": 139, "title": "insert_all(..., alter=True) should work for new columns introduced after the first 100 records", "user": {"value": 96218, "label": "simonwiles"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 7, "created_at": "2020-08-27T06:25:25Z", "updated_at": "2020-08-28T22:48:51Z", "closed_at": "2020-08-28T22:30:14Z", "author_association": "CONTRIBUTOR", "pull_request": null, "body": "Is there a way to make `.insert_all()` work properly when new columns are introduced outside the first 100 records (with or without the `alter=True` argument)?\r\n\r\nI'm using `.insert_all()` to bulk insert ~3-4k records at a time and it is common for records to need to introduce new columns. However, if new columns are introduced after the first 100 records, `sqlite_utils` doesn't even raise the `OperationalError: table ... has no column named ...` exception; it just silently drops the extra data and moves on.\r\n\r\nIt took me a while to find this little snippet in the [documentation for `.insert_all()`](https://sqlite-utils.readthedocs.io/en/stable/python-api.html#bulk-inserts) (it's not mentioned under [Adding columns automatically on insert/update](https://sqlite-utils.readthedocs.io/en/stable/python-api.html#bulk-inserts)):\r\n\r\n> The column types used in the CREATE TABLE statement are automatically derived from the types of data in that first batch of rows. **_Any additional or missing columns in subsequent batches will be ignored._**\r\n\r\nI tried changing the `batch_size` argument to the total number of records, but it seems only to effect the number of rows that are committed at a time, and has no influence on this problem.\r\n\r\nIs there a way around this that you would suggest? 
It seems like it should raise an exception at least.", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/139/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 688389933, "node_id": "MDU6SXNzdWU2ODgzODk5MzM=", "number": 143, "title": "Move to GitHub Actions CI", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2020-08-28T22:34:11Z", "updated_at": "2020-08-28T22:41:35Z", "closed_at": "2020-08-28T22:41:35Z", "author_association": "OWNER", "pull_request": null, "body": "", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/143/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 687694947, "node_id": "MDU6SXNzdWU2ODc2OTQ5NDc=", "number": 954, "title": "Remove old register_output_renderer dict mechanism in Datasette 1.0", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": {"value": 3268330, "label": "Datasette 1.0"}, "comments": 1, "created_at": "2020-08-28T04:04:23Z", "updated_at": "2020-08-28T04:56:31Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "> Documentation says that the old dictionary mechanism will be deprecated by 1.0:\r\n> \r\n> https://github.com/simonw/datasette/blob/799ecae94824640bdff21f86997f69844048d5c3/docs/plugin_hooks.rst#L460\r\n_Originally posted by @simonw in https://github.com/simonw/datasette/issues/953#issuecomment-682312494_", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/954/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 682005535, "node_id": "MDU6SXNzdWU2ODIwMDU1MzU=", "number": 945, "title": "datasette install -U for upgrading packages", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": {"value": 5818042, "label": "Datasette 0.49"}, "comments": 1, "created_at": "2020-08-19T17:12:04Z", "updated_at": "2020-08-28T04:53:14Z", "closed_at": "2020-08-19T17:20:50Z", "author_association": "OWNER", "pull_request": null, "body": "This will also give Homebrew a way to upgrade Datasette itself without having to wait for the latest packaged version to land in Homebrew core.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/945/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 687681018, "node_id": "MDU6SXNzdWU2ODc2ODEwMTg=", 
"number": 953, "title": "register_output_renderer render function should be able to return a Response", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": {"value": 5818042, "label": "Datasette 0.49"}, "comments": 1, "created_at": "2020-08-28T03:21:21Z", "updated_at": "2020-08-28T04:53:03Z", "closed_at": "2020-08-28T04:03:01Z", "author_association": "OWNER", "pull_request": null, "body": "That plugin hook was designed before Datasette had a documented Response class. It should optionally be allowed to return a Response in addition to the current custom dictionary.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/953/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 683804172, "node_id": "MDU6SXNzdWU2ODM4MDQxNzI=", "number": 134, "title": "--load-extension option for sqlite-utils query", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 4, "created_at": "2020-08-21T20:12:42Z", "updated_at": "2020-08-21T21:06:26Z", "closed_at": "2020-08-21T20:54:19Z", "author_association": "OWNER", "pull_request": null, "body": "I got this error:\r\n```\r\n% sqlite-utils calands.db 'create table superunits_with_maps_view_concrete as select * from superunits_with_maps_view'\r\nTraceback (most recent call last):\r\n...\r\n cursor = db.conn.execute(sql, dict(param))\r\nsqlite3.OperationalError: no such function: AsGeoJSON\r\n```\r\nA `--load-extension=/usr/local/lib/mod_spatialite.dylib` option (imitating the same option for Datasette) would help.", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/134/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 671763164, "node_id": "MDU6SXNzdWU2NzE3NjMxNjQ=", "number": 915, "title": "Refactor TableView class so things like datasette-graphql can reuse the logic", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2020-08-03T03:13:33Z", "updated_at": "2020-08-18T22:28:37Z", "closed_at": "2020-08-18T22:28:37Z", "author_association": "OWNER", "pull_request": null, "body": "_Originally posted by @simonw in https://github.com/simonw/datasette-graphql/issues/2#issuecomment-667780040_", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/915/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 647095487, "node_id": "MDU6SXNzdWU2NDcwOTU0ODc=", "number": 873, "title": "\"datasette -p 0 --root\" gives the wrong URL", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 14, "created_at": 
"2020-06-29T04:03:06Z", "updated_at": "2020-08-18T17:26:10Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "```\r\n$ datasette -p 0 --root\r\nhttp://127.0.0.1:0/-/auth-token?token=2d498c...\r\n```\r\nThe port is incorrect.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/873/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 681086659, "node_id": "MDU6SXNzdWU2ODEwODY2NTk=", "number": 47, "title": "emojis command", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2020-08-18T14:26:26Z", "updated_at": "2020-08-18T14:52:13Z", "closed_at": "2020-08-18T14:52:13Z", "author_association": "MEMBER", "pull_request": null, "body": "For fun - it can import https://api.github.com/emojis - maybe with an option to fetch the binary representations in addition to the URLs.", "repo": {"value": 207052882, "label": "github-to-sqlite"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/dogsheep/github-to-sqlite/issues/47/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 613777056, "node_id": "MDU6SXNzdWU2MTM3NzcwNTY=", "number": 39, "title": "issues foreign key to repo isn't working", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2020-05-07T05:11:48Z", "updated_at": "2020-08-18T14:24:46Z", "closed_at": "2020-08-18T14:23:56Z", "author_association": "MEMBER", "pull_request": null, "body": "https://github-to-sqlite.dogsheep.net/github/issues?_facet=repo\r\n\r\n\"github__issues__2_303_rows_where_sorted_by_updated_at_descending\"\r\n\r\nIf the foreign key was working those would be repository names.\r\n\r\nFrom the schema at the bottom of the page:\r\n```\r\n [repo] TEXT,\r\n```\r\nThat's the wrong type and not a foreign key.", "repo": {"value": 207052882, "label": "github-to-sqlite"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/dogsheep/github-to-sqlite/issues/39/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 679809281, "node_id": "MDExOlB1bGxSZXF1ZXN0NDY4NDg0MDMx", "number": 941, "title": "Run CI on GitHub Actions, not Travis", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2020-08-16T19:13:39Z", "updated_at": "2020-08-18T05:09:36Z", "closed_at": "2020-08-18T05:09:35Z", "author_association": "OWNER", "pull_request": "simonw/datasette/pulls/941", "body": "Refs #940", "repo": {"value": 107914493, "label": "datasette"}, "type": "pull", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/941/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, 
\"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": 0, "state_reason": null} {"id": 671056788, "node_id": "MDU6SXNzdWU2NzEwNTY3ODg=", "number": 914, "title": "\"Object of type bytes is not JSON serializable\" for _nl=on", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2020-08-01T17:43:10Z", "updated_at": "2020-08-16T21:10:27Z", "closed_at": "2020-08-16T18:26:59Z", "author_association": "OWNER", "pull_request": null, "body": "https://latest.datasette.io/fixtures/binary_data.json?_sort_desc=data&_shape=array returns this:\r\n```json\r\n[\r\n {\r\n \"rowid\": 1,\r\n \"data\": \"this is binary data\"\r\n }\r\n]\r\n```\r\nBut adding `&_nl=on` returns this: https://latest.datasette.io/fixtures/binary_data.json?_sort_desc=data&_shape=array&_nl=on\r\n```json\r\n{\r\n \"ok\": false,\r\n \"error\": \"Object of type bytes is not JSON serializable\",\r\n \"status\": 500,\r\n \"title\": null\r\n}\r\n```\r\nI found this error by running `wget -r 127.0.0.1:8001` against my local `fixtures.db`.\r\n\r\n", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/914/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 679779797, "node_id": "MDU6SXNzdWU2Nzk3Nzk3OTc=", "number": 939, "title": "extra_ plugin hooks should take the same arguments", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 6, "created_at": "2020-08-16T16:04:54Z", "updated_at": "2020-08-16T18:25:05Z", "closed_at": "2020-08-16T16:50:29Z", "author_association": "OWNER", "pull_request": null, "body": "- [x] `extra_css_urls(template, database, table, datasette)`\r\n- [x] `extra_js_urls(template, database, table, datasette)`\r\n- [x] `extra_body_script(template, database, table, view_name, datasette)`\r\n- [x] `extra_template_vars(template, database, table, view_name, request, datasette)`\r\n\r\n_Originally posted by @simonw in https://github.com/simonw/datasette/issues/938#issuecomment-674544691_", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/939/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 679660778, "node_id": "MDExOlB1bGxSZXF1ZXN0NDY4Mzc3MjEy", "number": 937, "title": "Docs now live at docs.datasette.io", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 0, "created_at": "2020-08-15T23:53:52Z", "updated_at": "2020-08-15T23:57:06Z", "closed_at": "2020-08-15T23:57:05Z", "author_association": "OWNER", "pull_request": "simonw/datasette/pulls/937", "body": "", "repo": {"value": 107914493, "label": "datasette"}, "type": "pull", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/937/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, 
\"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": 0, "state_reason": null} {"id": 679646710, "node_id": "MDU6SXNzdWU2Nzk2NDY3MTA=", "number": 935, "title": "db.execute_write_fn(create_tables, block=True) hangs a thread if connection fails", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2020-08-15T21:49:17Z", "updated_at": "2020-08-15T22:35:33Z", "closed_at": "2020-08-15T22:35:33Z", "author_association": "OWNER", "pull_request": null, "body": "Discovered in https://github.com/simonw/latest-datasette-with-all-plugins/issues/3#issuecomment-674449757", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/935/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 679650632, "node_id": "MDExOlB1bGxSZXF1ZXN0NDY4MzcwNjU4", "number": 936, "title": "Don't hang in db.execute_write_fn() if connection fails", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 2, "created_at": "2020-08-15T22:20:12Z", "updated_at": "2020-08-15T22:35:33Z", "closed_at": "2020-08-15T22:35:32Z", "author_association": "OWNER", "pull_request": "simonw/datasette/pulls/936", "body": "Refs #935", "repo": {"value": 107914493, "label": "datasette"}, "type": "pull", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/936/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": 0, "state_reason": null} {"id": 679637501, "node_id": "MDU6SXNzdWU2Nzk2Mzc1MDE=", "number": 934, "title": "--get doesn't fully invoke the startup routine", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 0, "created_at": "2020-08-15T20:30:25Z", "updated_at": "2020-08-15T20:53:49Z", "closed_at": "2020-08-15T20:53:49Z", "author_association": "OWNER", "pull_request": null, "body": "https://github.com/simonw/datasette/blob/7702ea602188899ee9b0446a874a6a9b546b564d/datasette/cli.py#L417-L433\r\n\r\nSpotted this working on https://github.com/simonw/latest-datasette-with-all-plugins/issues/3 - I'd like to be able to use `datasette --get /` as a sanity checking test, but that doesn't work if the init hooks aren't fully executed.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/934/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 677926613, "node_id": "MDU6SXNzdWU2Nzc5MjY2MTM=", "number": 931, "title": "Docker container is no longer being pushed (it's stuck on 0.45)", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 7, "created_at": "2020-08-12T19:33:03Z", "updated_at": "2020-08-12T21:36:20Z", "closed_at": "2020-08-12T21:36:20Z", "author_association": "OWNER", 
"pull_request": null, "body": "e.g. https://travis-ci.org/github/simonw/datasette/jobs/717123725\r\n\r\nHere's how it broke:\r\n```\r\n--2020-08-12 03:08:17-- https://www.gaia-gis.it/gaia-sins/freexl-1.0.5.tar.gz\r\nResolving www.gaia-gis.it (www.gaia-gis.it)... 212.83.162.51\r\nConnecting to www.gaia-gis.it (www.gaia-gis.it)|212.83.162.51|:443... connected.\r\nHTTP request sent, awaiting response... 404 Not Found\r\n2020-08-12 03:08:18 ERROR 404: Not Found.\r\nThe command '/bin/sh -c wget \"https://www.gaia-gis.it/gaia-sins/freexl-1.0.5.tar.gz\" && tar zxf freexl-1.0.5.tar.gz && cd freexl-1.0.5 && ./configure && make && make install' returned a non-zero code: 8\r\nThe command \"docker build -f Dockerfile -t $REPO:$TRAVIS_TAG .\" exited with 8.\r\n0.07s$ docker tag $REPO:$TRAVIS_TAG $REPO:latest\r\nError response from daemon: No such image: [secure]/datasette:0.47.1\r\nThe command \"docker tag $REPO:$TRAVIS_TAG $REPO:latest\" exited with 1.\r\n0.08s$ docker push $REPO\r\nThe push refers to repository [docker.io/[secure]/datasette]\r\nAn image does not exist locally with the tag: [secure]/datasette\r\nThe command \"docker push $REPO\" exited with 1.\r\ncache.2\r\nstore build cache\r\n```", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/931/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 677839979, "node_id": "MDU6SXNzdWU2Nzc4Mzk5Nzk=", "number": 133, "title": "Release a sdist to PyPI", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2020-08-12T16:55:09Z", "updated_at": "2020-08-12T17:05:06Z", "closed_at": "2020-08-12T17:05:06Z", "author_association": "OWNER", "pull_request": null, "body": "https://pypi.org/project/sqlite-utils/#files currently just has a wheel. 
I need this to package for homebrew: https://github.com/simonw/homebrew-datasette/issues/10", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/133/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 677326155, "node_id": "MDU6SXNzdWU2NzczMjYxNTU=", "number": 930, "title": "Datasette sdist is missing templates (hence broken when installing from Homebrew)", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 6, "created_at": "2020-08-12T02:20:16Z", "updated_at": "2020-08-12T03:30:59Z", "closed_at": "2020-08-12T03:30:59Z", "author_association": "OWNER", "pull_request": null, "body": "Pretty nasty bug this: I'm getting 500 errors for all pages that try to render a template after installing the newly released Datasette 0.47 - both from `pip install` and via Homebrew.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/930/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 677250834, "node_id": "MDU6SXNzdWU2NzcyNTA4MzQ=", "number": 926, "title": "datasette fixtures.db --get \"/fixtures.json\"", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 2, "created_at": "2020-08-11T22:55:36Z", "updated_at": "2020-08-12T00:26:17Z", "closed_at": "2020-08-12T00:24:42Z", "author_association": "OWNER", "pull_request": null, "body": "I can expose ALL of Datasette's functionality on the command-line (without even running a web server) by adding `--get` and `--post` options to `datasette serve`.\r\n\r\n datasette fixtures.db --get \"/fixtures.json\"\r\n\r\nThis would instantiate the Datasette ASGI app, run a fake request for `/fixtures.json` through it, dump the results out to standard output and quit.\r\n\r\nA `--post` option could do the same for a POST request. 
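A rough sketch of driving the `--get` option described in issue #926 from a Python script, assuming `datasette` is installed on the PATH and a local `fixtures.db` exists; both are assumptions for illustration only.

```python
# Run "datasette fixtures.db --get /fixtures.json" without starting a server,
# then parse the JSON it writes to standard output.
import json
import subprocess

result = subprocess.run(
    ["datasette", "fixtures.db", "--get", "/fixtures.json"],
    capture_output=True,
    text=True,
    check=True,
)
data = json.loads(result.stdout)
print(sorted(data.keys()))
```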
Treating that as a stretch goal for the moment.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/926/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 677265716, "node_id": "MDExOlB1bGxSZXF1ZXN0NDY2NDEwNzU1", "number": 927, "title": "'datasette --get' option, refs #926", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 5, "created_at": "2020-08-11T23:31:52Z", "updated_at": "2020-08-12T00:24:42Z", "closed_at": "2020-08-12T00:24:41Z", "author_association": "OWNER", "pull_request": "simonw/datasette/pulls/927", "body": "Refs #926, #898", "repo": {"value": 107914493, "label": "datasette"}, "type": "pull", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/927/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": 0, "state_reason": null} {"id": 677037043, "node_id": "MDU6SXNzdWU2NzcwMzcwNDM=", "number": 923, "title": "Add homebrew installation to documentation", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 5, "created_at": "2020-08-11T16:54:31Z", "updated_at": "2020-08-11T22:53:07Z", "closed_at": "2020-08-11T22:52:46Z", "author_association": "OWNER", "pull_request": null, "body": "> ```\r\n> $ brew tap simonw/datasette\r\n> $ brew install simonw/datasette/datasette\r\n> $ datasette --version\r\n> datasette, version 0.46\r\n> ```\r\n_Originally posted by @simonw in https://github.com/simonw/datasette/issues/335#issuecomment-672088880_", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/923/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 677227912, "node_id": "MDU6SXNzdWU2NzcyMjc5MTI=", "number": 925, "title": "\"datasette install\" and \"datasette uninstall\" commands", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2020-08-11T22:04:32Z", "updated_at": "2020-08-11T22:34:37Z", "closed_at": "2020-08-11T22:32:12Z", "author_association": "OWNER", "pull_request": null, "body": "When installing Datasette plugins it's crucial that they end up in the same virtual environment as Datasette itself.\r\n\r\nIt's not necessarily obvious how to do this, especially if you install Datasette via pipx or homebrew.\r\n\r\nSolution: `datasette install datasette-vega` and `datasette uninstall datasette-vega` commands that know how to install to the correct place - a very thin wrapper around `pip install`.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/925/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, 
\"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 339505204, "node_id": "MDU6SXNzdWUzMzk1MDUyMDQ=", "number": 335, "title": "Package datasette for installation using homebrew", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 12, "created_at": "2018-07-09T15:45:03Z", "updated_at": "2020-08-11T16:54:06Z", "closed_at": "2020-08-11T16:54:06Z", "author_association": "OWNER", "pull_request": null, "body": "https://docs.brew.sh/Python-for-Formula-Authors describes how.\r\n\r\n> Applications should be installed into a Python virtualenv environment rooted in libexec. This prevents the app\u2019s Python modules from contaminating the system site-packages and vice versa.\r\n\r\nIt recommends using https://github.com/tdsmith/homebrew-pypi-poet", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/335/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 675839512, "node_id": "MDU6SXNzdWU2NzU4Mzk1MTI=", "number": 132, "title": "Features for enabling and disabling WAL mode", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 5, "created_at": "2020-08-10T03:25:44Z", "updated_at": "2020-08-10T18:59:35Z", "closed_at": "2020-08-10T18:59:35Z", "author_association": "OWNER", "pull_request": null, "body": "I finally figured out how to enable WAL - turns out it's a property of the database file itself: https://github.com/simonw/til/blob/master/sqlite/enabling-wal-mode.md", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/132/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 675724951, "node_id": "MDU6SXNzdWU2NzU3MjQ5NTE=", "number": 918, "title": "Security issue: read-only canned queries leak CSRF token in URL", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 4, "created_at": "2020-08-09T16:03:01Z", "updated_at": "2020-08-09T16:56:48Z", "closed_at": "2020-08-09T16:11:59Z", "author_association": "OWNER", "pull_request": null, "body": "The HTML form for a read-only canned query includes the hidden CSRF token field added in #798 for writable canned queries (#698).\r\n\r\nThis means that submitting those read-only forms exposes the CSRF token in the URL - for example on https://latest.datasette.io/fixtures/neighborhood_search submitting the form took me to:\r\n\r\nhttps://latest.datasette.io/fixtures/neighborhood_search?text=down&csrftoken=IlFubnoxVVpLU1NGT3NMVUoi.HbOPd2YH_epQmp8f_aAt0s-MxtU\r\n\r\nThis token could potentially leak to an attacker if the resulting page has a link to an external site on it and the user clicks the link, since the token would be exposed in the referral logs.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, 
"performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/918/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 675727366, "node_id": "MDU6SXNzdWU2NzU3MjczNjY=", "number": 919, "title": "Travis should not build the master branch, only the main branch", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2020-08-09T16:18:25Z", "updated_at": "2020-08-09T16:26:18Z", "closed_at": "2020-08-09T16:19:37Z", "author_association": "OWNER", "pull_request": null, "body": "Caused by #849 - since we are mirroring the two branches (to ensure old links to `master` keep working) Travis is building both.\r\n\r\nThe following in `.travis.yml` should fix that:\r\n```\r\nbranches:\r\n except:\r\n - master\r\n```", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/919/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 675594325, "node_id": "MDU6SXNzdWU2NzU1OTQzMjU=", "number": 917, "title": "Idea: \"datasette publish\" option for \"only if the data has changed", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 0, "created_at": "2020-08-08T21:58:27Z", "updated_at": "2020-08-08T21:58:27Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "This is a pattern I often find myself needing. I usually implement this in GitHub Actions like this:\r\n\r\nhttps://github.com/simonw/covid-19-datasette/blob/efa01c39abc832b8641fc2a92840cc3acae2fb08/.github/workflows/scheduled.yml#L52-L63\r\n\r\n```yaml\r\n - name: Set variables to decide if we should deploy\r\n id: decide_variables\r\n run: |-\r\n echo \"##[set-output name=latest;]$(datasette inspect covid.db | jq '.covid.hash' -r)\"\r\n echo \"##[set-output name=deployed;]$(curl -s https://covid-19.datasettes.com/-/databases.json | jq '.[0].hash' -r)\"\r\n - name: Set up Cloud Run\r\n if: github.event_name == 'workflow_dispatch' || steps.decide_variables.outputs.latest != steps.decide_variables.outputs.deployed\r\n uses: GoogleCloudPlatform/github-actions/setup-gcloud@master\r\n```\r\nThis is pretty fiddly. 
It might be good for `datasette publish` to grow a helper option that does effectively this - hashes the databases (and the `metadata.json`) and compares them to the deployed version.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/917/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 673602857, "node_id": "MDU6SXNzdWU2NzM2MDI4NTc=", "number": 9, "title": "Define a view that displays photos correctly", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 0, "created_at": "2020-08-05T14:53:39Z", "updated_at": "2020-08-05T14:53:39Z", "closed_at": null, "author_association": "MEMBER", "pull_request": null, "body": "The `photos` table stores data like this:\r\n\r\nid | createdAt | source | prefix | suffix | width | height | visibility | created\u00a0\u25b2 | user\r\n-- | -- | -- | -- | -- | -- | -- | -- | -- | --\r\n5e12c9708506bc000840262a | January 06, 2020 - 05:45:20 UTC | Swarm for iOS\u00a01 | https://fastly.4sqi.net/img/general/ | /15889193_AXxGk4I1nbzUZuyYqObgbXdJNyEHiwj6AUDq0tPZWtw.jpg | 1920 | 1440 | public | 2020-01-06T05:45:20 | 15889193\r\n\r\nThe photo URL can be derived from those pieces - define a SQL view which does that (using `datasette-json-html` to display the pictures)", "repo": {"value": 205429375, "label": "swarm-to-sqlite"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/dogsheep/swarm-to-sqlite/issues/9/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 671130371, "node_id": "MDU6SXNzdWU2NzExMzAzNzE=", "number": 130, "title": "Support tokenize option for FTS", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2020-08-01T19:27:22Z", "updated_at": "2020-08-01T20:51:28Z", "closed_at": "2020-08-01T20:51:14Z", "author_association": "OWNER", "pull_request": null, "body": "FTS5 supports things like porter stemming using a `tokenize=` option:\r\n\r\nhttps://www.sqlite.org/fts5.html#tokenizers\r\n\r\nSomething like this in code:\r\n```\r\n CREATE VIRTUAL TABLE [{table}_fts] USING {fts_version} (\r\n {columns},\r\n tokenize='porter',\r\n content=[{table}]\r\n );\r\n```\r\nI tried this out just now and it worked exactly as expected.\r\n\r\nSo... `db[table].enable_fts(...) 
should accept a 'tokenize=` argument, and `sqlite-utils enable-fts ...` should support a `--tokenize` option.", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/130/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 660827546, "node_id": "MDU6SXNzdWU2NjA4Mjc1NDY=", "number": 899, "title": "How to setup a request limit per user", "user": {"value": 133845, "label": "Krazybug"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2020-07-19T13:08:25Z", "updated_at": "2020-07-31T23:54:42Z", "closed_at": "2020-07-31T23:54:42Z", "author_association": "NONE", "pull_request": null, "body": "Hello,\r\n\r\nUntil now I'm using datasette without any authentication system but I would like to setup a configuration or limiting the number of requests per user (eventually by IP or with a cookie mechanism) and eventually allowing me to ban specific users/IPs.\r\n\r\nIs there a plugin available for this use case ? \r\nIf not what are your insights regarding this UC ?\r\n\r\nShould I write a plugin ? Should I deploy datasette behind a reverse proxy to manage this ?\r\n ", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/899/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 662322234, "node_id": "MDExOlB1bGxSZXF1ZXN0NDUzODkwMjky", "number": 901, "title": "Use None as a default arg", "user": {"value": 56323389, "label": "Alyetama"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2020-07-20T22:18:38Z", "updated_at": "2020-07-31T18:42:39Z", "closed_at": "2020-07-31T18:42:39Z", "author_association": "CONTRIBUTOR", "pull_request": "simonw/datasette/pulls/901", "body": "When passing a mutable value as a default argument in a function, the default argument is mutated anytime that value is mutated. This poses a bug risk. Instead, use None as a default and assign the mutable value inside the function.", "repo": {"value": 107914493, "label": "datasette"}, "type": "pull", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/901/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": 0, "state_reason": null} {"id": 668064778, "node_id": "MDU6SXNzdWU2NjgwNjQ3Nzg=", "number": 912, "title": "Add \"publishing to Vercel\" to the publish docs", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 0, "created_at": "2020-07-29T18:50:58Z", "updated_at": "2020-07-31T17:06:35Z", "closed_at": "2020-07-31T17:06:35Z", "author_association": "OWNER", "pull_request": null, "body": "https://datasette.readthedocs.io/en/0.45/publish.html#datasette-publish currently only lists Cloud Run, Heroku and Fly. 
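Returning to sqlite-utils issue #130 quoted just above, here is a minimal sketch of the `tokenize=` argument it proposes (the issue is marked completed, so the argument is assumed to exist as described); the database, table and column names are invented for the example.

```python
# Enable FTS with porter stemming, per the tokenize= option from issue #130.
import sqlite_utils

db = sqlite_utils.Database("docs.db")
db["documents"].insert_all(
    [
        {"id": 1, "title": "Searching", "body": "Porter stemming matches search and searches"},
        {"id": 2, "title": "Tokenizers", "body": "FTS5 supports a tokenize= option"},
    ],
    pk="id",
)

# tokenize="porter" is passed through to the underlying FTS5 table definition.
db["documents"].enable_fts(["title", "body"], tokenize="porter")

# "searched" stems to "search", so row 1 should match.
for row in db["documents"].search("searched"):
    print(row)
```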
It should list Vercel too.\r\n\r\n(I should probably rename `datasette-publish-now` to `datasette-publish-vercel`)", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/912/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 665802405, "node_id": "MDU6SXNzdWU2NjU4MDI0MDU=", "number": 124, "title": "sqlite-utils query should support named parameters", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2020-07-26T15:25:10Z", "updated_at": "2020-07-30T22:57:51Z", "closed_at": "2020-07-27T03:53:58Z", "author_association": "OWNER", "pull_request": null, "body": "To help out with escaping - so you can run this:\r\n\r\n sqlite-utils query \"insert into foo (blah) values (:blah)\" --param blah `something here`", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/124/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 668308777, "node_id": "MDU6SXNzdWU2NjgzMDg3Nzc=", "number": 129, "title": "\"insert-files --sqlar\" for creating SQLite archives", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 2, "created_at": "2020-07-30T02:28:29Z", "updated_at": "2020-07-30T22:41:01Z", "closed_at": "2020-07-30T22:40:55Z", "author_association": "OWNER", "pull_request": null, "body": "A `--sqlar` option could cause `insert-files` to behave in the same way as SQLite's own sqlar mechanism.\r\n\r\nhttps://www.sqlite.org/sqlar.html and https://sqlite.org/sqlar/doc/trunk/README.md", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/129/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 666040390, "node_id": "MDU6SXNzdWU2NjYwNDAzOTA=", "number": 127, "title": "Ability to insert files piped to insert-files stdin", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2020-07-27T07:09:33Z", "updated_at": "2020-07-30T03:08:52Z", "closed_at": "2020-07-30T03:08:18Z", "author_association": "OWNER", "pull_request": null, "body": "> Inserting files by piping them in should work - but since a filename cannot be derived this will need a `--name blah.gif` option.\r\n>\r\n> cat blah.gif | sqlite-utils insert-files files.db files - --name=blah.gif\r\n>\r\n_Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/122#issuecomment-664128071_", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": 
\"https://api.github.com/repos/simonw/sqlite-utils/issues/127/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 666639051, "node_id": "MDU6SXNzdWU2NjY2MzkwNTE=", "number": 128, "title": "Support UUID and memoryview types", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2020-07-27T23:08:34Z", "updated_at": "2020-07-30T01:10:43Z", "closed_at": "2020-07-30T01:10:43Z", "author_association": "OWNER", "pull_request": null, "body": "`psycopg2` can return data from PostgreSQL as `uuid.UUID` or `memoryview` objects. These should to be supported by `sqlite-utils` - mainly for https://github.com/simonw/db-to-sqlite", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/128/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 667467128, "node_id": "MDU6SXNzdWU2Njc0NjcxMjg=", "number": 909, "title": "AsgiFileDownload: filename not correctly passed", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 2, "created_at": "2020-07-29T00:41:43Z", "updated_at": "2020-07-30T00:56:17Z", "closed_at": "2020-07-29T21:34:48Z", "author_association": "OWNER", "pull_request": null, "body": "https://github.com/simonw/datasette/blob/3c33b421320c0be81a625ca7307b2e4416a9ed5b/datasette/utils/asgi.py#L396-L405\r\n`self.filename` should be passed to `asgi_send_file()`", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/909/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 667840539, "node_id": "MDExOlB1bGxSZXF1ZXN0NDU4NDM1NTky", "number": 910, "title": "Update pytest requirement from <5.5.0,>=5.2.2 to >=5.2.2,<6.1.0", "user": {"value": 27856297, "label": "dependabot-preview[bot]"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2020-07-29T13:21:17Z", "updated_at": "2020-07-29T21:26:05Z", "closed_at": "2020-07-29T21:26:04Z", "author_association": "CONTRIBUTOR", "pull_request": "simonw/datasette/pulls/910", "body": "Updates the requirements on [pytest](https://github.com/pytest-dev/pytest) to permit the latest version.\n
Release notes (sourced from pytest's releases):

pytest 6.0.0 (2020-07-28)

(Please see the full set of changes for this release also in the 6.0.0rc1 notes below)

Breaking Changes

- #5584: PytestDeprecationWarning are now errors by default. Following our plan to remove deprecated features with as little disruption as possible, all warnings of type PytestDeprecationWarning now generate errors instead of warning messages. The affected features will be effectively removed in pytest 6.1, so please consult the Deprecations and Removals section in the docs for directions on how to update existing code. In the pytest 6.0.X series, it is possible to change the errors back into warnings as a stopgap measure by adding this to your pytest.ini file:

      [pytest]
      filterwarnings =
          ignore::pytest.PytestDeprecationWarning

  But this will stop working when pytest 6.1 is released. If you have concerns about the removal of a specific feature, please add a comment to #5584.
- #7472: The exec_() and is_true() methods of _pytest._code.Frame have been removed.

Features

- #7464: Added support for NO_COLOR and FORCE_COLOR environment variables to control colored output.

Improvements

- #7467: --log-file CLI option and log_file ini marker now create subdirectories if needed.
- #7489: The pytest.raises function has a clearer error message when match equals the obtained string but is not a regex match. In this case it is suggested to escape the regex.

Bug Fixes

- #7392: Fix the reported location of tests skipped with @pytest.mark.skip when --runxfail is used.

Changelog: sourced from pytest's changelog.

Commits: 41a4539 Add link to 6.0.0rc1 changelog; 45ced1d Update doc/en/announce/release-6.0.0.rst; 1e4b8d4 Prepare release version 6.0.0; 3802982 Support generating major releases using issue comments (#7548); c2c0b7a Merge pull request #7545 from asottile/pylib_in_docs; 9818899 remove usage of pylib in docs; 3a060b7 Revert change to traceback repr (#7535); 7ec6401 Change pytest deprecation warnings into errors for 6.0 release (#7362); a9799f0 Merge pull request #7531 from bluetech/changelog-mypy-version; 102360b Merge pull request #7519 from hroncok/pytest_warning_captured_deprecated; additional commits viewable in compare view.

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`.

Dependabot commands and options - you can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
- `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
- `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
- `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
- `@dependabot use these labels` will set the current labels as the default for future PRs for this repo and language
- `@dependabot use these reviewers` will set the current reviewers as the default for future PRs for this repo and language
- `@dependabot use these assignees` will set the current assignees as the default for future PRs for this repo and language
- `@dependabot use this milestone` will set the current milestone as the default for future PRs for this repo and language
- `@dependabot badge me` will comment on this PR with code to add a "Dependabot enabled" badge to your readme

Additionally, you can set the following in your Dependabot dashboard (https://app.dependabot.com): update frequency (including time of day and day of week), pull request limits (per update run and/or open at any time), out-of-range updates (receive only lockfile updates, if desired) and security updates (receive only security updates, if desired).
", "repo": {"value": 107914493, "label": "datasette"}, "type": "pull", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/910/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": 0, "state_reason": null} {"id": 668064026, "node_id": "MDU6SXNzdWU2NjgwNjQwMjY=", "number": 911, "title": "Rethink the --name option to \"datasette publish\"", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": {"value": 3268330, "label": "Datasette 1.0"}, "comments": 0, "created_at": "2020-07-29T18:49:49Z", "updated_at": "2020-07-29T18:49:49Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "`--name` works inconsistently across the different publish providers - on Cloud Run you should use `--service` instead for example. Need to review it across all of them and either remove it or clarify what it does.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/911/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 665700495, "node_id": "MDU6SXNzdWU2NjU3MDA0OTU=", "number": 122, "title": "CLI utility for inserting binary files into SQLite", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 10, "created_at": "2020-07-26T03:27:39Z", "updated_at": "2020-07-27T07:10:41Z", "closed_at": "2020-07-27T07:09:03Z", "author_association": "OWNER", "pull_request": null, "body": "SQLite BLOB columns can store entire binary files. 
The challenge is inserting them, since they don't neatly fit into JSON objects.\r\n\r\nIt would be great if the `sqlite-utils` CLI had a trick for helping with this.\r\n\r\nInspired by https://github.com/simonw/datasette-media/issues/14", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/122/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 665819048, "node_id": "MDU6SXNzdWU2NjU4MTkwNDg=", "number": 126, "title": "Ability to insert binary data on the CLI using JSON", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 2, "created_at": "2020-07-26T16:54:14Z", "updated_at": "2020-07-27T04:00:33Z", "closed_at": "2020-07-27T03:59:45Z", "author_association": "OWNER", "pull_request": null, "body": "> I could solve round tripping (at least a bit) by allowing insert to be run with a flag that says \"these columns are base64 encoded, store the decoded data in a BLOB\".\r\n>\r\n> That would solve inserting binary data using JSON too.\r\n_Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/125#issuecomment-664012247_", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/126/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 665817570, "node_id": "MDU6SXNzdWU2NjU4MTc1NzA=", "number": 125, "title": "Output binary columns in \"sqlite-utils query\" JSON", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 4, "created_at": "2020-07-26T16:47:02Z", "updated_at": "2020-07-27T00:49:41Z", "closed_at": "2020-07-27T00:48:45Z", "author_association": "OWNER", "pull_request": null, "body": "You get an error if you try to run a query that returns data from a BLOB.", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/125/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 665701216, "node_id": "MDU6SXNzdWU2NjU3MDEyMTY=", "number": 123, "title": "--raw option for outputting binary content", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 0, "created_at": "2020-07-26T03:35:39Z", "updated_at": "2020-07-26T16:44:11Z", "closed_at": "2020-07-26T16:44:11Z", "author_association": "OWNER", "pull_request": null, "body": "Related to the `insert-files` work in #122 - it should be easy to get binary data back out of the database again.\r\n\r\nOne way to do that could be:\r\n\r\n sqlite-utils files.db \"select content from files where key = 'foo.jpg'\" --raw\r\n\r\nThe `--raw` option would cause just the contents of the first column to be output directly to stdout.", 
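Alongside the `insert-files` and `--raw` issues quoted above (#122, #123, #127), here is a minimal sketch of storing a binary file as a SQLite BLOB with the sqlite-utils Python API; the database path and `blah.gif` file name are just examples.

```python
# Insert a file's raw bytes into a BLOB column - sqlite-utils stores Python
# bytes values as SQLite BLOBs.
from pathlib import Path

import sqlite_utils

db = sqlite_utils.Database("files.db")
path = Path("blah.gif")

db["files"].insert(
    {
        "path": str(path),
        "content": path.read_bytes(),   # stored as a BLOB
        "size": path.stat().st_size,
    },
    pk="path",
    replace=True,
)
```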
"repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/123/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 665400224, "node_id": "MDU6SXNzdWU2NjU0MDAyMjQ=", "number": 906, "title": "\"allow\": true for anyone, \"allow\": false for nobody", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": {"value": 5607421, "label": "Datasette 0.46"}, "comments": 3, "created_at": "2020-07-24T20:28:10Z", "updated_at": "2020-07-25T00:07:10Z", "closed_at": "2020-07-25T00:05:04Z", "author_association": "OWNER", "pull_request": null, "body": "The \"allow\" syntax described at https://datasette.readthedocs.io/en/0.45/authentication.html#defining-permissions-with-allow-blocks currently says this:\r\n\r\n> An allow block can specify \"no-one is allowed to do this\" using an empty `{}`:\r\n> \r\n> ```\r\n> {\r\n> \"allow\": {}\r\n> }\r\n> ```\r\n\r\n`\"allow\": null` allows all access, though this isn't documented (it should be though).\r\n\r\nThese are not very intuitive. How about also supporting `\"allow\": true` for \"allow anyone\" and `\"allow\": false` for \"allow nobody\"?", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/906/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 665407663, "node_id": "MDU6SXNzdWU2NjU0MDc2NjM=", "number": 908, "title": "Interactive debugging tool for \"allow\" blocks", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": {"value": 5607421, "label": "Datasette 0.46"}, "comments": 3, "created_at": "2020-07-24T20:43:44Z", "updated_at": "2020-07-25T00:06:15Z", "closed_at": "2020-07-24T22:56:52Z", "author_association": "OWNER", "pull_request": null, "body": "> It might be good to have a little interactive tool which helps debug these things, since there are quite a few edge-cases and the damage caused if people use them incorrectly is substantial.\r\n_Originally posted by @simonw in https://github.com/simonw/datasette/issues/907#issuecomment-663726146_", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/908/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 665403403, "node_id": "MDU6SXNzdWU2NjU0MDM0MDM=", "number": 907, "title": "Allow documentation doesn't explain what happens with multiple allow keys", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": {"value": 5607421, "label": "Datasette 0.46"}, "comments": 2, "created_at": "2020-07-24T20:34:40Z", "updated_at": "2020-07-24T22:53:07Z", "closed_at": "2020-07-24T22:53:07Z", "author_association": "OWNER", "pull_request": null, "body": "Documentation here: 
https://datasette.readthedocs.io/en/0.45/authentication.html#defining-permissions-with-allow-blocks\r\n\r\nDoesn't explain that with the following \"allow\" block:\r\n```json\r\n{\r\n \"allow\": {\r\n \"id\": \"simonw\",\r\n \"role\": \"staff\"\r\n }\r\n}\r\n```\r\nThe rule will match if EITHER the id is simonw OR the role includes staff.\r\n\r\nThe tests are missing this case too: https://github.com/simonw/datasette/blob/028f193dd6233fa116262ab4b07b13df7dcec9be/tests/test_utils.py#L504\r\n\r\nRelated to #906", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/907/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 442327592, "node_id": "MDU6SXNzdWU0NDIzMjc1OTI=", "number": 456, "title": "Installing installs the tests package", "user": {"value": 7725188, "label": "hellerve"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2019-05-09T16:35:16Z", "updated_at": "2020-07-24T20:39:54Z", "closed_at": "2020-07-24T20:39:54Z", "author_association": "CONTRIBUTOR", "pull_request": null, "body": "Because `setup.py` uses `find_packages` and `tests` is on the top-level, `pip install datasette` will install a top-level package called `tests`, which is probably not desired behavior.\r\n\r\nThe offending line is here:\r\nhttps://github.com/simonw/datasette/blob/bfa2ae0d16d39bb82dbe4da4f3fdc3c7f6257418/setup.py#L40\r\n\r\nAnd only `pip uninstall datasette` with a conflicting package would warn you by default; apparently another package had the same problem, which is why I get this message when uninstalling:\r\n\r\n```\r\n$ pip uninstall datasette\r\nUninstalling datasette-0.27:\r\n Would remove:\r\n /usr/local/bin/datasette\r\n /usr/local/lib/python3.7/site-packages/datasette-0.27.dist-info/*\r\n /usr/local/lib/python3.7/site-packages/datasette/*\r\n /usr/local/lib/python3.7/site-packages/tests/*\r\n Would not remove (might be manually added):\r\n [ .. snip .. ]\r\nProceed (y/n)? 
\r\n```\r\n\r\nThis should be a relatively simple fix, and I could drop a PR if desired!\r\n\r\nCheers", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/456/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 662439034, "node_id": "MDExOlB1bGxSZXF1ZXN0NDUzOTk1MTc5", "number": 902, "title": "Don't install tests package", "user": {"value": 32467826, "label": "abeyerpath"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 2, "created_at": "2020-07-21T01:08:50Z", "updated_at": "2020-07-24T20:39:54Z", "closed_at": "2020-07-24T20:39:54Z", "author_association": "CONTRIBUTOR", "pull_request": "simonw/datasette/pulls/902", "body": "The `exclude` argument to `find_packages` needs an iterable of package\r\nnames.\r\n\r\nFixes: #456 ", "repo": {"value": 107914493, "label": "datasette"}, "type": "pull", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/902/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": 0, "state_reason": null} {"id": 664793260, "node_id": "MDU6SXNzdWU2NjQ3OTMyNjA=", "number": 2, "title": "Yak shave", "user": {"value": 145425, "label": "ekg"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 0, "created_at": "2020-07-23T22:04:18Z", "updated_at": "2020-07-23T22:04:18Z", "closed_at": null, "author_association": "NONE", "pull_request": null, "body": "Just a quick note... The 23andme data is not exactly your genome, but a SNP chip of your genome. It's \"some of your genotypes.\" Or about 0.1% of your genome. Nice work in any case! 
It deserves to be liberated!!!!!", "repo": {"value": 209590345, "label": "genome-to-sqlite"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/dogsheep/genome-to-sqlite/issues/2/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 663976976, "node_id": "MDU6SXNzdWU2NjM5NzY5NzY=", "number": 48, "title": "Add a table of contents to the README", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2020-07-22T18:54:33Z", "updated_at": "2020-07-23T17:46:07Z", "closed_at": "2020-07-22T19:03:02Z", "author_association": "MEMBER", "pull_request": null, "body": "Using https://github.com/jonschlinkert/markdown-toc", "repo": {"value": 206156866, "label": "twitter-to-sqlite"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/48/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 663317875, "node_id": "MDU6SXNzdWU2NjMzMTc4NzU=", "number": 905, "title": "/database.db download should include content-length header", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 2, "created_at": "2020-07-21T21:23:48Z", "updated_at": "2020-07-22T04:59:46Z", "closed_at": "2020-07-22T04:52:45Z", "author_association": "OWNER", "pull_request": null, "body": "I can do this by modifying this function: https://github.com/simonw/datasette/blob/02dc6298bdbfb1d63e0d2a39ff597b5fcc60e06b/datasette/utils/asgi.py#L248-L270", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/905/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 442402832, "node_id": "MDExOlB1bGxSZXF1ZXN0Mjc3NTI0MDcy", "number": 458, "title": "setup: add tests to package exclusion", "user": {"value": 7725188, "label": "hellerve"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2019-05-09T19:47:21Z", "updated_at": "2020-07-21T01:14:42Z", "closed_at": "2019-05-10T01:54:51Z", "author_association": "CONTRIBUTOR", "pull_request": "simonw/datasette/pulls/458", "body": "This PR fixes #456 by adding `tests` to the package exclusion list.\r\n\r\nCheers", "repo": {"value": 107914493, "label": "datasette"}, "type": "pull", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/458/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": 0, "state_reason": null}