{"id": 267886330, "node_id": "MDU6SXNzdWUyNjc4ODYzMzA=", "number": 27, "title": "Ability to plot a simple graph", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2017-10-24T03:34:59Z", "updated_at": "2018-07-10T17:52:41Z", "closed_at": "2018-07-10T17:52:41Z", "author_association": "OWNER", "pull_request": null, "body": "Might be as simple as: pick he type of chart (bar, line) and then pick the column for the X axis and the column for the Y axis. Maybe also allow a pie chart. It\u2019s up to the user to come up with SQL that gets the right values.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/27/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 268262480, "node_id": "MDU6SXNzdWUyNjgyNjI0ODA=", "number": 36, "title": "date, year, month and day querystring lookups", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2017-10-25T04:23:45Z", "updated_at": "2018-05-28T17:30:53Z", "closed_at": "2018-05-28T17:30:53Z", "author_association": "OWNER", "pull_request": null, "body": "- [ ] `?timestamp___date=2017-07-17` - return every item where the timestamp falls on that date\r\n- [ ] `?timestamp___year=2017` - return every item where the timestamp falls within 2017\r\n- [ ] `?timestamp___month=1` - return every item where the month component is January\r\n- [ ] `?timestamp___day=10` - return every item where the day-of-the-month component is 10\r\n\r\nFollow on from #23 ", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/36/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 273569068, "node_id": "MDU6SXNzdWUyNzM1NjkwNjg=", "number": 79, "title": "Add more detailed API documentation to the README", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2017-11-13T20:36:21Z", "updated_at": "2018-05-28T17:24:48Z", "closed_at": "2018-05-28T17:24:48Z", "author_association": "OWNER", "pull_request": null, "body": "Need to document:\r\n\r\n- [ ] The ?column__gt=4 style filter arguments for tables\r\n- [ ] The ?sql= API, and how named parameters work\r\n- [ ] How API pagination works\r\n- [ ] How redirects and cache headers work", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/79/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 274001453, "node_id": "MDU6SXNzdWUyNzQwMDE0NTM=", "number": 96, "title": "UI for editing named parameters", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, 
"milestone": null, "comments": 3, "created_at": "2017-11-15T01:19:21Z", "updated_at": "2017-11-16T01:45:51Z", "closed_at": "2017-11-16T01:33:38Z", "author_association": "OWNER", "pull_request": null, "body": "On any page displaying a custom query that includes named parameters, we should show HTML form fields for editing those parameters.\r\n\r\nEg the breed parameter on https://australian-dogs.now.sh/australian-dogs-3ba9628?sql=select+name%2C+count%28*%29+as+n+from+%28%0D%0A%0D%0Aselect+upper%28%22Animal+name%22%29+as+name+from+%5BAdelaide-City-Council-dog-registrations-2013%5D+where+Breed+like+%3Abreed%0D%0A%0D%0Aunion+all%0D%0A%0D%0Aselect+upper%28Animal_Name%29+as+name+from+%5BAdelaide-City-Council-dog-registrations-2014%5D+where+Breed_Description+like+%3Abreed%0D%0A%0D%0Aunion+all+%0D%0A%0D%0Aselect+upper%28Animal_Name%29+as+name+from+%5BAdelaide-City-Council-dog-registrations-2015%5D+where+Breed_Description+like+%3Abreed%0D%0A%0D%0Aunion+all%0D%0A%0D%0Aselect+upper%28%22AnimalName%22%29+as+name+from+%5BCity-of-Port-Adelaide-Enfield-Dog_Registrations_2016%5D+where+AnimalBreed+like+%3Abreed%0D%0A%0D%0Aunion+all%0D%0A%0D%0Aselect+upper%28%22Animal+Name%22%29+as+name+from+%5BMitcham-dog-registrations-2015%5D+where+Breed+like+%3Abreed%0D%0A%0D%0Aunion+all%0D%0A%0D%0Aselect+upper%28%22DOG_NAME%22%29+as+name+from+%5Bburnside-dog-registrations-2015%5D+where+DOG_BREED+like+%3Abreed%0D%0A%0D%0Aunion+all+%0D%0A%0D%0Aselect+upper%28%22Animal_Name%22%29+as+name+from+%5Bcity-of-playford-2015-dog-registration%5D+where+Breed_Description+like+%3Abreed%0D%0A%0D%0Aunion+all%0D%0A%0D%0Aselect+upper%28%22Animal+Name%22%29+as+name+from+%5Bcity-of-prospect-dog-registration-details-2016%5D+where%22Breed+Description%22+like+%3Abreed%0D%0A%0D%0A%29+group+by+name+order+by+n+desc%3B&breed=pug", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/96/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 274022950, "node_id": "MDU6SXNzdWUyNzQwMjI5NTA=", "number": 97, "title": "Link to JSON for the list of tables ", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2017-11-15T03:29:05Z", "updated_at": "2018-05-29T18:51:35Z", "closed_at": "2018-05-28T20:57:21Z", "author_association": "OWNER", "pull_request": null, "body": "https://twitter.com/yschimke/status/930606210855854080", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/97/reactions\", \"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 275166669, "node_id": "MDU6SXNzdWUyNzUxNjY2Njk=", "number": 131, "title": "UI support for running FTS searches", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2017-11-19T15:16:20Z", "updated_at": "2017-11-19T17:18:05Z", "closed_at": "2017-11-19T17:00:12Z", "author_association": "OWNER", "pull_request": null, "body": "Here's an example query that searches all 
FTS indexed columns in a table: https://sf-trees-search.now.sh/sf-trees-search-a899b92?sql=select+*+from+Street_Tree_List+where+rowid+in+%28select+rowid+from+Street_Tree_List_fts+where+Street_Tree_List_fts+match+%27grove+london+dpw%27%29%0D%0A\r\n\r\nAnd here's a query that searches a specific column: https://sf-trees-search.now.sh/sf-trees-search-a899b92?sql=select+*+from+Street_Tree_List+where+rowid+in+%28select+rowid+from+Street_Tree_List_fts+where+qSpecies+match+%27london%27%29%0D%0A\r\n\r\nIf we detect that a table has FTS enabled (which we can do by looking for it as a content table reference in another FTS table's create definition) we should add a search box to the table page which constructs this query - maybe using `?_search=XXX` in the query string?\r\n\r\nTo support search against specified columns, we can do `?_search__qSpecies=London`. - not necessary, see comment below.\r\n\r\n- [x] Detect if a table has a FTS index defined against it as a content= parameter\r\n- [x] Decide what to do if there is more than one FTS index (maybe just pick the first one?)\r\n- [x] Add the `?_search=` query string argument\r\n- [x] Add the UI", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/131/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 275917760, "node_id": "MDU6SXNzdWUyNzU5MTc3NjA=", "number": 142, "title": "Show extra instructions with the interrupted", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2017-11-22T01:44:29Z", "updated_at": "2018-05-28T21:25:06Z", "closed_at": "2018-05-28T21:24:35Z", "author_association": "OWNER", "pull_request": null, "body": "When you are using Datasette locally for ad-hoc analysis it can be frustrating to hit the time limit.\r\n\r\nIf you start it with the correct command line arguments you can disable that time limit. So how about we tell you how to do that anytime you hit the interrupted error, provided you are accessing it from localhost.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/142/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 276091279, "node_id": "MDU6SXNzdWUyNzYwOTEyNzk=", "number": 144, "title": "apsw as alternative sqlite3 binding (for full text search)", "user": {"value": 649467, "label": "mhalle"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2017-11-22T14:40:39Z", "updated_at": "2018-05-28T21:29:42Z", "closed_at": "2018-05-28T21:29:42Z", "author_association": "NONE", "pull_request": null, "body": "Hey there,\r\n\r\nHave you considered providing apsw support as an alternative to stock python sqlite3? I use apsw because it keeps up with sqlite3 and is straightforward to bring in extensions like FTS5. 
FTS really accelerates the kind of searching often done by web clients.\r\n\r\nI may be able to help (it shouldn't be much code), but there are a couple of stylistic questions that come up when supporting an optional package. Also, apsw is tricky in that it doesn't have a pypi package (author says limitations in providing options to setup.py). ", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/144/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 276704327, "node_id": "MDU6SXNzdWUyNzY3MDQzMjc=", "number": 150, "title": "_group_count= feature improvements", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2017-11-24T22:06:18Z", "updated_at": "2018-05-28T16:41:28Z", "closed_at": "2018-05-28T16:41:28Z", "author_association": "OWNER", "pull_request": null, "body": "- [ ] The \"apply filters\" form should keep you on the _group_count= page\r\n- [ ] Foreign key references should be expanded\r\n- [ ] Page title should reflect the view you are on", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/150/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 306811513, "node_id": "MDU6SXNzdWUzMDY4MTE1MTM=", "number": 186, "title": "proposal new option to disable user agents cache", "user": {"value": 47107, "label": "stefanocudini"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2018-03-20T10:42:20Z", "updated_at": "2018-03-21T09:07:22Z", "closed_at": "2018-03-21T01:28:31Z", "author_association": "NONE", "pull_request": null, "body": "I think it would be very useful for debugging to have an option of adding headers to http replies\r\n\r\n```\r\nCache-Control: no-cache \r\n```\r\n\r\nespecially in the html output", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/186/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 310882100, "node_id": "MDU6SXNzdWUzMTA4ODIxMDA=", "number": 193, "title": "Cleaner mechanism for handling custom errors", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2018-04-03T15:19:13Z", "updated_at": "2018-04-13T18:18:59Z", "closed_at": "2018-04-13T18:18:59Z", "author_association": "OWNER", "pull_request": null, "body": "This code is pretty messy:\r\n\r\nhttps://github.com/simonw/datasette/blob/0abd3abacb309a2bd5913a7a2df4e9256585b1bb/datasette/app.py#L245-L265\r\n\r\nInstead, it would be nice if I could raise an exception that would be converted into the appropriate JSON or HTML error message, with a corresponding HTTP code.", "repo": {"value": 107914493, 
"label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/193/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 312313496, "node_id": "MDU6SXNzdWUzMTIzMTM0OTY=", "number": 195, "title": "Run pks_for_table in inspect, executing once at build time rather than constantly", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2018-04-08T15:12:40Z", "updated_at": "2018-04-10T00:54:43Z", "closed_at": "2018-04-10T00:54:43Z", "author_association": "OWNER", "pull_request": null, "body": "Right now several Datasette views call the `await self.pks_for_table(...)` method to figure out what primary keys are set for a specific table. This executes a `PRAGMA table_info` SQL query.\r\n\r\nIt would be faster and more efficient to execute this query for each table as part of the `inspect()` method.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/195/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 313494458, "node_id": "MDExOlB1bGxSZXF1ZXN0MTgxMDMzMDI0", "number": 200, "title": "Hide Spatialite system tables", "user": {"value": 45057, "label": "russss"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2018-04-11T21:26:58Z", "updated_at": "2018-04-12T21:34:48Z", "closed_at": "2018-04-12T21:34:48Z", "author_association": "CONTRIBUTOR", "pull_request": "simonw/datasette/pulls/200", "body": "They were getting on my nerves.", "repo": {"value": 107914493, "label": "datasette"}, "type": "pull", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/200/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": 0, "state_reason": null} {"id": 314319372, "node_id": "MDExOlB1bGxSZXF1ZXN0MTgxNjQyMTE0", "number": 205, "title": "Support filtering with units and more", "user": {"value": 45057, "label": "russss"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2018-04-14T10:47:51Z", "updated_at": "2018-04-14T15:24:04Z", "closed_at": "2018-04-14T15:24:04Z", "author_association": "CONTRIBUTOR", "pull_request": "simonw/datasette/pulls/205", "body": "The first commit:\r\n* Adds units to exported JSON\r\n* Adds units key to metadata skeleton\r\n* Adds some docs for units\r\n\r\nThe second commit adds filtering by units by the first method I mentioned in #203:\r\n![image](https://user-images.githubusercontent.com/45057/38767463-7193be16-3fd9-11e8-8a5f-ac4159415c6d.png)\r\n\r\n[Try it here](https://wtr-api.herokuapp.com/wtr-663ea99/license_frequency?frequency__gt=50GHz&height__lt=50ft). I think it integrates pretty neatly.\r\n\r\nThe third commit adds support for registering custom units with Pint from metadata.json. 
Probably pretty niche, but I need decibels!", "repo": {"value": 107914493, "label": "datasette"}, "type": "pull", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/205/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": 0, "state_reason": null} {"id": 315142414, "node_id": "MDU6SXNzdWUzMTUxNDI0MTQ=", "number": 221, "title": "Allow plugins to add new cli sub commands ", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2018-04-17T16:40:13Z", "updated_at": "2021-01-04T20:12:14Z", "closed_at": "2021-01-04T20:12:14Z", "author_association": "OWNER", "pull_request": null, "body": "I could then test this out by having https://github.com/simonw/csvs-to-sqlite register itself as a plugin", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/221/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 315327860, "node_id": "MDU6SXNzdWUzMTUzMjc4NjA=", "number": 223, "title": "datasette publish --install=name-of-plugin", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2018-04-18T04:33:59Z", "updated_at": "2018-04-18T14:56:17Z", "closed_at": "2018-04-18T14:56:17Z", "author_association": "OWNER", "pull_request": null, "body": "Mechanism for causing datasette publish and datasette package to install one or more additional plugins using `pip install` - refs #14 ", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/223/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 316526433, "node_id": "MDU6SXNzdWUzMTY1MjY0MzM=", "number": 234, "title": "label_column option in metadata.json", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2018-04-21T21:19:08Z", "updated_at": "2018-04-22T20:47:12Z", "closed_at": "2018-04-22T20:47:12Z", "author_association": "OWNER", "pull_request": null, "body": "Currently the column used for displaying a foreign key relationship is automatically detected by `inspect()` by looking for tables that have a primary key column and one other column.\r\n\r\nThis doesn't work for tables with more than two columns.\r\n\r\nLet's allow the table section in `metadata.json` to optionally define a `label_column` which, if present, will be used for those displays.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/234/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, 
"state_reason": "completed"} {"id": 317714268, "node_id": "MDU6SXNzdWUzMTc3MTQyNjg=", "number": 238, "title": "External metadata.json", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2018-04-25T17:02:30Z", "updated_at": "2019-06-24T06:52:55Z", "closed_at": "2019-06-24T06:52:45Z", "author_association": "OWNER", "pull_request": null, "body": "A frustration I'm having with https://register-of-members-interests.datasettes.com/ is that I keep coming up with new canned queries but I don't want to redeploy the whole thing just to add them to `metadata.json`\r\n\r\nMaybe Datasette could optionally take a `--metadata-url` option which causes it to load from a URL instead and occasionally check for updates.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/238/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 322741659, "node_id": "MDExOlB1bGxSZXF1ZXN0MTg3NzcwMzQ1", "number": 258, "title": "Add new metadata key persistent_urls which removes the hash from all database urls", "user": {"value": 247131, "label": "philroche"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2018-05-14T09:39:18Z", "updated_at": "2018-05-21T07:38:15Z", "closed_at": "2018-05-21T07:38:15Z", "author_association": "NONE", "pull_request": "simonw/datasette/pulls/258", "body": "Add new metadata key \"persistent_urls\" which removes the hash from all database urls when set to \"true\"\r\n\r\nThis PR is just to gauge if this, or something like it, is something you would consider merging?\r\n\r\nI understand the reason why the substring of the hash is included in the url but\r\nthere are some use cases where the urls should persist across deployments. For bookmarks\r\nfor example or for scripts that use the JSON API.\r\n\r\nThis is the initial commit for this feature. 
Tests and documentation updates to follow.", "repo": {"value": 107914493, "label": "datasette"}, "type": "pull", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/258/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": 0, "state_reason": null} {"id": 323671577, "node_id": "MDU6SXNzdWUzMjM2NzE1Nzc=", "number": 263, "title": "Facets should not execute for ?shape=array|object", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2018-05-16T15:26:13Z", "updated_at": "2021-06-02T02:54:34Z", "closed_at": "2021-06-02T02:54:34Z", "author_association": "OWNER", "pull_request": null, "body": "Split off from #255 - there's no point executing the facet SQL for the `?_shape=array` and `?_shape=object` API responses.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/263/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 323716411, "node_id": "MDU6SXNzdWUzMjM3MTY0MTE=", "number": 267, "title": "Documentation for URL hashing, redirects and cache policy", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2018-05-16T17:29:01Z", "updated_at": "2019-06-24T06:41:02Z", "closed_at": "2019-06-24T06:41:02Z", "author_association": "OWNER", "pull_request": null, "body": "See my comments on #258 for a starting point", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/267/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 324720095, "node_id": "MDU6SXNzdWUzMjQ3MjAwOTU=", "number": 275, "title": "\"config\" section in metadata.json (root, database and table level)", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2018-05-20T16:02:28Z", "updated_at": "2023-08-23T01:28:37Z", "closed_at": "2023-08-23T01:28:37Z", "author_association": "OWNER", "pull_request": null, "body": "Split off from #274 \r\n\r\nMetadata should have an optional `\"config\"` section at root, table or database level.\r\n\r\nThe TableView and RowView and DatabaseView and BaseView classes could all have a `.config(\"key\")` method which knows how to resolve the hierarchy of configs.\r\n\r\nThis will allow individual tables (or databases) to set their own config settings for things like `sql_time_limit_ms`", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/275/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": 
"completed"} {"id": 325294102, "node_id": "MDU6SXNzdWUzMjUyOTQxMDI=", "number": 278, "title": "Build smallest possible Docker image with Datasette plus recent SQLite (with json1) plus Spatialite 4.4.0", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2018-05-22T13:28:40Z", "updated_at": "2018-05-23T17:43:36Z", "closed_at": "2018-05-23T17:43:36Z", "author_association": "OWNER", "pull_request": null, "body": "A Dockerfile that does the following:\r\n\r\n* Bundles Datasette master\r\n* Python 3.6 most recent version (or 3.7 if it has been released)\r\n* SQLite 3.23.1 (or most recent release) such that \"import sqite3\" in Python gets that version. Ideally with the json1 module baked in by default, but having it loadable as an optional module is fine too\r\n* SpatiaLite 4.4.0-RC0 (or most recent version) such that it can be loaded as an optional module\r\n* Uses multi-stage builds to stay as small as possible\r\n\r\nNote that the current \"release\" of SpatiaLite is 4.3.0 which is missing key features like https://www.gaia-gis.it/fossil/libspatialite/wiki?name=KNN - 4.4.0 probably needs to be compiled from source.\r\n\r\nI don't know the best way to get a current SQLite version bundled for Python 3. Maybe https://github.com/coleifer/pysqlite3 ?", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/278/reactions\", \"total_count\": 2, \"+1\": 2, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 326767626, "node_id": "MDU6SXNzdWUzMjY3Njc2MjY=", "number": 288, "title": "Support multiple filters of the same type", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2018-05-26T21:13:12Z", "updated_at": "2019-04-15T23:45:04Z", "closed_at": "2019-04-15T23:44:26Z", "author_association": "OWNER", "pull_request": null, "body": "This should work for example:\r\n\r\nhttps://fivethirtyeight.datasettes.com/fivethirtyeight-5de27e3/biopics%2Fbiopics?year_release__not=2014&year_release__not=2015", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/288/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 326768188, "node_id": "MDU6SXNzdWUzMjY3NjgxODg=", "number": 289, "title": "?_ttl= parameter to control caching", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2018-05-26T21:22:55Z", "updated_at": "2018-05-26T22:22:47Z", "closed_at": "2018-05-26T22:17:48Z", "author_association": "OWNER", "pull_request": null, "body": "This would allow clients to specify the max-age caching header that should be returned with the query.\r\n\r\nMost important this will allow caching to be completely urned off for specific queries using `?_ttl=0`. 
Sending 0 should cause a `Cache-Control: no-cache` header to be returned.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/289/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 326783670, "node_id": "MDU6SXNzdWUzMjY3ODM2NzA=", "number": 291, "title": "Avoid plugins accidentally loading dependencies twice", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2018-05-27T03:15:21Z", "updated_at": "2020-09-30T20:36:12Z", "closed_at": "2018-05-28T20:42:02Z", "author_association": "OWNER", "pull_request": null, "body": "Plugins that include JavaScript files risk loading the same code twice. In particular: I want to build a second plugin that uses the Leaflet mapping library (the first was [datasette-cluster-map](https://pypi.org/project/datasette-cluster-map/)). But I don't want the two plugins to load duplicate copies of Leaflet.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/291/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 327395270, "node_id": "MDU6SXNzdWUzMjczOTUyNzA=", "number": 296, "title": "Per-database and per-table /-/ URL namespace", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2018-05-29T16:23:13Z", "updated_at": "2019-06-28T16:46:34Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "Initially this will be for subsets of `/-/inspect` and `/-/metadata` but it will also give us a URL namespace for future features like `/-/facet` (expanded list of a specific facet, linked to from `...`) and `/-/graph`\r\n\r\nTo start:\r\n\r\n* `/dbname/-/inspect`\r\n* `/dbname/-/metadata`\r\n* `/dbname/tablename/-/inspect`\r\n* `/dbname/tablename/-/metadata`\r\n\r\nThis means we will no longer allow databases or tables to have the name `\"-\"` - I think that's OK\r\n\r\nWe will continue to support rows with a primary key of `\"-\"` at the following URL:\r\n\r\n* `/dbname/tablename/-`", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/296/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 328229224, "node_id": "MDU6SXNzdWUzMjgyMjkyMjQ=", "number": 304, "title": "Ability to configure SQLite cache_size", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2018-05-31T17:28:07Z", "updated_at": "2018-06-04T16:13:32Z", "closed_at": "2018-06-04T16:03:19Z", "author_association": "OWNER", "pull_request": null, "body": "See https://www.sqlite.org/pragma.html#pragma_cache_size\r\n\r\nLet's call the 
config setting `cache_size_kb` to emphasize that we're using the negative option.\r\n\r\nNote this warning: perhaps we should raise an error if you try to use this setting against a SQLite version prior to 3.7.10\r\n> If the argument N is positive then the suggested cache size is set to N. If the argument N is negative, then the number of cache pages is adjusted to use approximately abs(N*1024) bytes of memory. Backwards compatibility note: The behavior of cache_size with a negative N was different in prior to version 3.7.10 (2012-01-16). In version 3.7.9 and earlier, the number of pages in the cache was set to the absolute value of N.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/304/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 340039409, "node_id": "MDU6SXNzdWUzNDAwMzk0MDk=", "number": 336, "title": "Ensure --help examples in docs are always up to date", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2018-07-10T23:20:01Z", "updated_at": "2018-07-24T16:01:29Z", "closed_at": "2018-07-24T16:01:29Z", "author_association": "OWNER", "pull_request": null, "body": "Ideally I would automatically generate the --help output shown in our docs, but I don't think I can get that working with readthedocs.\r\n\r\nInstead, I'm going to add a unit test that checks that those extracts in the documentation match the current output of the --help command.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/336/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 351017129, "node_id": "MDU6SXNzdWUzNTEwMTcxMjk=", "number": 360, "title": "Use pysqlite3 if available", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2018-08-16T00:50:45Z", "updated_at": "2018-08-16T01:50:42Z", "closed_at": "2018-08-16T00:58:58Z", "author_association": "OWNER", "pull_request": null, "body": "[pysqlite3](https://github.com/coleifer/pysqlite3) is a way to provide access to a more recent version of SQLite than the standard library `sqlite3` module (which tends to use the version provided with the operating system - which on e.g. 
the Travis CI Ubuntu build environment can be as old as 3.8.0).", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/360/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 377156339, "node_id": "MDU6SXNzdWUzNzcxNTYzMzk=", "number": 371, "title": "datasette publish digitalocean plugin", "user": {"value": 82988, "label": "psychemedia"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2018-11-04T14:07:41Z", "updated_at": "2021-01-04T20:14:28Z", "closed_at": "2021-01-04T20:14:28Z", "author_association": "CONTRIBUTOR", "pull_request": null, "body": "Provide support for launching `datasette` on Digital Ocean.\r\n\r\nExample: [Deploy Docker containers into Digital Ocean](https://blog.machinebox.io/deploy-machine-box-in-digital-ocean-385265fbeafd).\r\n\r\nDigital Ocean also has a preconfigured VM running Docker that can be launched from the command line via the Digital Ocean API: [Docker One-Click Application](https://www.digitalocean.com/docs/one-clicks/docker/).\r\n\r\nRelated:\r\n- Launching containers in Digital Ocean servers running docker: [How To Provision and Manage Remote Docker Hosts with Docker Machine on Ubuntu 16.04](https://www.digitalocean.com/community/tutorials/how-to-provision-and-manage-remote-docker-hosts-with-docker-machine-on-ubuntu-16-04)\r\n- [How To Use Doctl, the Official DigitalOcean Command-Line Client](https://www.digitalocean.com/community/tutorials/how-to-use-doctl-the-official-digitalocean-command-line-client)", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/371/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 397129564, "node_id": "MDU6SXNzdWUzOTcxMjk1NjQ=", "number": 397, "title": "Update official datasetteproject/datasette Docker container to SQLite 3.26.0", "user": {"value": 43564, "label": "claes"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2019-01-08T22:51:50Z", "updated_at": "2019-01-11T01:25:33Z", "closed_at": "2019-01-11T00:56:18Z", "author_association": "NONE", "pull_request": null, "body": "I try to start datasette on a database that contains the below view\r\n\r\nIt fails in a way that makes me think it does not support the window functions SQL syntax.\r\n\r\n\r\n```\r\ncreate view general_ledger as\r\nselect transactions.account_number, strftime(\"%Y-%m-%d\", verifications.verification_date) as verification_date, verifications.verification_number, verifications.verification_text, \r\ncase when transactions.centi_amount >= 0 and verifications.verification_number > 0 then printf(\"%.2f\", (transactions.centi_amount/100.0))\r\nend as debit,\r\ncase when transactions.centi_amount <= 0 and verifications.verification_number > 0 then printf(\"%.2f\", (transactions.centi_amount/100.0)) \r\nend as credit,\r\nprintf(\"%.2f\", sum(transactions.centi_amount) over (partition by transactions.account_number \r\n order by verifications.verification_number range 
between unbounded preceding and current row)/100.0)\r\nfrom verifications inner join transactions on transactions.verification_id = verifications.id \r\norder by transactions.account_number, verifications.verification_number;\r\n```\r\n\r\n```\r\ndocker run -p 8001:8001 -v `pwd`:/mnt datasetteproject/datasette datasette -p 8001 -h 0.0.0.0 /mnt/ledger.db\r\nServe! files=('/mnt/ledger.db',) on port 8001\r\nTraceback (most recent call last):\r\n File \"/usr/local/bin/datasette\", line 11, in <module>\r\n sys.exit(cli())\r\n File \"/usr/local/lib/python3.6/site-packages/click/core.py\", line 722, in __call__\r\n return self.main(*args, **kwargs)\r\n File \"/usr/local/lib/python3.6/site-packages/click/core.py\", line 697, in main\r\n rv = self.invoke(ctx)\r\n File \"/usr/local/lib/python3.6/site-packages/click/core.py\", line 1066, in invoke\r\n return _process_result(sub_ctx.command.invoke(sub_ctx))\r\n File \"/usr/local/lib/python3.6/site-packages/click/core.py\", line 895, in invoke\r\n return ctx.invoke(self.callback, **ctx.params)\r\n File \"/usr/local/lib/python3.6/site-packages/click/core.py\", line 535, in invoke\r\n return callback(*args, **kwargs)\r\n File \"/usr/local/lib/python3.6/site-packages/datasette/cli.py\", line 375, in serve\r\n ds.inspect()\r\n File \"/usr/local/lib/python3.6/site-packages/datasette/app.py\", line 308, in inspect\r\n \"views\": inspect_views(conn),\r\n File \"/usr/local/lib/python3.6/site-packages/datasette/inspect.py\", line 30, in inspect_views\r\n return [v[0] for v in conn.execute('select name from sqlite_master where type = \"view\"')]\r\nsqlite3.DatabaseError: malformed database schema (general_ledger) - near \"over\": syntax error\r\n\r\n```", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/397/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 400229984, "node_id": "MDU6SXNzdWU0MDAyMjk5ODQ=", "number": 401, "title": "How to pass configuration to plugins?", "user": {"value": 1055831, "label": "dazzag24"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2019-01-17T11:20:41Z", "updated_at": "2019-01-18T11:48:13Z", "closed_at": "2019-01-18T06:49:07Z", "author_association": "NONE", "pull_request": null, "body": "Hi,\r\nFirstly, thanks for your work on datasette, it is a hugely useful tool!\r\n\r\nI've been working on a fork [https://github.com/dazzag24/datasette-cluster-map] of datasette-cluster-map to allow the tileserver to be easily switched. 
Primarily because the tiles being served in the current version use localised text for labels and I'd like to have English used for these names instead.\r\n\r\nIt uses http://leaflet-extras.github.io/leaflet-providers/preview/ to allow you to simply set the tile provider using a call like so:\r\n``` \r\nlet tiles = L.tileLayer.provider('Esri.WorldTopoMap');\r\n```\r\ninstead of the current:\r\n```\r\nlet tiles = L.tileLayer('https://{s}.tile.openstreetmap.org/{z}/{x}/{y}.png', {\r\n maxZoom: 19,\r\n detectRetina: true,\r\n attribution: '© OpenStreetMap contributors'\r\n }),\r\n ```\r\nHowever I've got stuck in trying to work out how to pass the provider string to the plugin.\r\nIn the documentation: https://datasette.readthedocs.io/en/stable/plugins.html you discuss configuration of plugins and use an example of passing in which latitude and longitude columns should be used. However I cannot seem to see anywhere in the current datasette-cluster-map code where these config params are passed in or used.\r\n\r\nCan you please point me to an example or how to pass configuration from the metadata.json down into a plugin. Once I've overcome this issue I was wondering if you would be interested in taking this change into your version?\r\n\r\nMany thanks\r\nDarren", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/401/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 400340905, "node_id": "MDU6SXNzdWU0MDAzNDA5MDU=", "number": 402, "title": "Use SQLITE_DBCONFIG_DEFENSIVE plus other recommendations from SQLite security docs", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2019-01-17T15:52:28Z", "updated_at": "2019-01-17T16:15:21Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "> Was just having a skim through the datasette source. Given that the vuln impacts shadow tables, wasn't sure whether these are also covered by the immutable flag. Latest release introduced a SQLITE_DBCONFIG_DEFENSIVE flag that they recommend setting: https://sqlite.org/security.html\r\n\r\nhttps://twitter.com/ignoredambience/status/1085926961413869568", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/402/reactions\", \"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 403499298, "node_id": "MDExOlB1bGxSZXF1ZXN0MjQ3OTIzMzQ3", "number": 404, "title": "Experiment: run Jinja in async mode", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2019-01-27T00:28:44Z", "updated_at": "2019-11-12T05:02:18Z", "closed_at": "2019-11-12T05:02:13Z", "author_association": "OWNER", "pull_request": "simonw/datasette/pulls/404", "body": "See http://jinja.pocoo.org/docs/2.10/api/#async-support\r\n\r\nTests all pass. Have not checked performance difference yet.\r\n\r\nCreating pull request to run tests in Travis. 
This is not ready to merge - I'm not yet sure if this is a good idea.", "repo": {"value": 107914493, "label": "datasette"}, "type": "pull", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/404/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": 0, "state_reason": null} {"id": 403625674, "node_id": "MDU6SXNzdWU0MDM2MjU2NzQ=", "number": 7, "title": ".insert_all() should accept a generator and process it efficiently", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2019-01-28T02:11:58Z", "updated_at": "2019-01-28T06:26:53Z", "closed_at": "2019-01-28T06:26:53Z", "author_association": "OWNER", "pull_request": null, "body": "Right now you have to load every record into memory before passing the list to `.insert_all()` and friends.\r\n\r\nIf you want to process millions of rows, this is inefficient. Python has generators - we should use them!\r\n\r\nThe only catch here is that part of the magic of `sqlite-utils` is that it guesses the column types and creates the table for you. This code will need to be updated to notice if the table needs creating and, if it does, create it using the first X (where x=1,000 but can be customized) records.\r\n\r\nIf a record outside of those first 1,000 has a rogue column, we can crash with an error.\r\n\r\nThis will free us up to make the `--nl` option added in #6 much more efficient.", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/7/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 403922644, "node_id": "MDU6SXNzdWU0MDM5MjI2NDQ=", "number": 8, "title": "Problems handling column names containing spaces or - ", "user": {"value": 82988, "label": "psychemedia"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2019-01-28T17:23:28Z", "updated_at": "2019-04-14T15:29:33Z", "closed_at": "2019-02-23T21:09:03Z", "author_association": "NONE", "pull_request": null, "body": "Irrespective of whether using column names containing a space or - character is good practice, SQLite does allow it, but `sqlite-utils` throws an error in the following cases:\r\n\r\n```python\r\nfrom sqlite_utils import Database\r\n\r\ndbname = 'test.db'\r\nDB = Database(sqlite3.connect(dbname))\r\n\r\nimport pandas as pd\r\ndf = pd.DataFrame({'col1':range(3), 'col2':range(3)})\r\n\r\n#Convert pandas dataframe to appropriate list/dict format\r\nDB['test1'].insert_all( df.to_dict(orient='records') )\r\n#Works fine\r\n```\r\n\r\nHowever:\r\n\r\n```python\r\ndf = pd.DataFrame({'col 1':range(3), 'col2':range(3)})\r\nDB['test1'].insert_all(df.to_dict(orient='records'))\r\n```\r\n\r\nthrows:\r\n\r\n```\r\n---------------------------------------------------------------------------\r\nOperationalError Traceback (most recent call last)\r\n in <module>()\r\n 1 import pandas as pd\r\n 2 df = pd.DataFrame({'col 1':range(3), 'col2':range(3)})\r\n----> 3 DB['test1'].insert_all(df.to_dict(orient='records'))\r\n\r\n/usr/local/lib/python3.7/site-packages/sqlite_utils/db.py in 
insert_all(self, records, pk, foreign_keys, upsert, batch_size, column_order)\r\n 327 jsonify_if_needed(record.get(key, None)) for key in all_columns\r\n 328 )\r\n--> 329 result = self.db.conn.execute(sql, values)\r\n 330 self.db.conn.commit()\r\n 331 self.last_id = result.lastrowid\r\n\r\nOperationalError: near \"1\": syntax error\r\n```\r\n\r\nand:\r\n\r\n```python\r\ndf = pd.DataFrame({'col-1':range(3), 'col2':range(3)})\r\nDB['test1'].upsert_all(df.to_dict(orient='records'))\r\n```\r\n\r\nresults in:\r\n\r\n```\r\n---------------------------------------------------------------------------\r\nOperationalError Traceback (most recent call last)\r\n in <module>()\r\n 1 import pandas as pd\r\n 2 df = pd.DataFrame({'col-1':range(3), 'col2':range(3)})\r\n----> 3 DB['test1'].insert_all(df.to_dict(orient='records'))\r\n\r\n/usr/local/lib/python3.7/site-packages/sqlite_utils/db.py in insert_all(self, records, pk, foreign_keys, upsert, batch_size, column_order)\r\n 327 jsonify_if_needed(record.get(key, None)) for key in all_columns\r\n 328 )\r\n--> 329 result = self.db.conn.execute(sql, values)\r\n 330 self.db.conn.commit()\r\n 331 self.last_id = result.lastrowid\r\n\r\nOperationalError: near \"-\": syntax error\r\n```", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/8/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 408376825, "node_id": "MDU6SXNzdWU0MDgzNzY4MjU=", "number": 409, "title": "Zeit API v1 does not work for new users - need to migrate to v2", "user": {"value": 209967, "label": "michaelmcandrew"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2019-02-09T00:50:33Z", "updated_at": "2020-04-06T15:44:46Z", "closed_at": "2020-04-06T15:44:46Z", "author_association": "NONE", "pull_request": null, "body": "Hello there,\r\n\r\nThis looks like a great tool. Thanks. \r\n\r\nUnfortunately, I hit the following error:\r\n\r\n```\r\nmichael@hazel ~/src/cc-datasette/data/out datasette publish now cc-datasette.db\r\n> WARN! You are using an old version of the Now Platform. More: https://zeit.co/docs/v1-upgrade\r\n> Deploying /tmp/tmpjtrxwsyf/datasette under michaelmcandrew\r\n> Using project datasette\r\n> Error! You tried to create a Now 1.0 deployment. 
Please use Now 2.0 instead: https://zeit.co/upgrade\r\n```\r\nI'm guessing you might not hit this because you are not a 'new user' of Zeit (https://github.com/zeit/now-cli/issues/1805#issuecomment-452470953).\r\n\r\nWould it be a lot of work to upgrade to the new Zeit API, do you think?", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/409/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 410384988, "node_id": "MDU6SXNzdWU0MTAzODQ5ODg=", "number": 411, "title": "How to pass named parameter into spatialite MakePoint() function", "user": {"value": 1055831, "label": "dazzag24"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2019-02-14T16:30:22Z", "updated_at": "2023-10-25T13:23:04Z", "closed_at": "2019-05-05T12:25:04Z", "author_association": "NONE", "pull_request": null, "body": "Hi,\r\ndatasette version: \"0.26.2\"\r\nextensions: \r\n spatialite: \"4.4.0-RC0\"\r\nsqlite version: \"3.22.0\"\r\n\r\nI have a table of airports with latitude and longitude columns. I've added spatialite (with KNN support). After creating the db using csvs-to-sqlite, I run these commands to setup the spatialite tables:\r\n\r\n```\r\nconn.execute('SELECT InitSpatialMetadata(1)')\r\n\r\nconn.execute(\"SELECT AddGeometryColumn('airports', 'point_geom', 4326, 'POINT', 2);\")\r\n\r\nconn.execute('''UPDATE airports SET point_geom = GeomFromText('POINT('||\"longitude\"||' '||\"latitude\"||')',4326);''')\r\n\r\nconn.execute(\"SELECT CreateSpatialIndex('airports', 'point_geom');\")\r\n```\r\n\r\nI'm attempting to create a canned query and have this in my metadata.json file:\r\n```\r\n\"find_airports_nearest_to_point\":{\r\n \"sql\":\"SELECT a.pos AS rank, b.id, b.name, b.country, b.latitude AS latitude, b.longitude AS longitude, a.distance / 1000.0 AS dist_km FROM KNN AS a JOIN airports AS b ON (b.rowid = a.fid) WHERE f_table_name = \\\"airports\\\" AND ref_geometry = MakePoint( :Long , :Lat ) AND max_items = 10;\"}\r\n```\r\nwhich doesn't seem to perform the templating of the named parameters correctly and I get no results. \r\n\r\nHave also tried:\r\n```\r\nMakePoint( || :Long || , || :Lat || )\r\n```\r\nwhich returns this error:\r\n```\r\nnear \"||\": syntax error\r\n```\r\n\r\nHowever I cannot seem to find the correct combination of named parameter syntax (:Lat) or sqlite concatenation operator to make it work. 
Any ideas if using named parameters inside functions is supported?\r\n\r\nThanks\r\nDarren", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/411/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 413842611, "node_id": "MDU6SXNzdWU0MTM4NDI2MTE=", "number": 14, "title": "Utilities for adding indexes", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2019-02-24T16:57:28Z", "updated_at": "2019-02-24T19:11:28Z", "closed_at": "2019-02-24T19:11:28Z", "author_association": "OWNER", "pull_request": null, "body": "Both in the Python API and the CLI tool. For the CLI tool this should work:\r\n\r\n $ sqlite-utils create-index mydb.db mytable col1 col2\r\n\r\nThis will create a compound index across col1 and col2. The name of the index will be automatically chosen unless you use the `--name=...` option.\r\n\r\nSupport a `--unique` option too.", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/14/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 418329842, "node_id": "MDU6SXNzdWU0MTgzMjk4NDI=", "number": 415, "title": "Add query parameter to hide SQL textarea", "user": {"value": 36796532, "label": "ad-si"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2019-03-07T14:11:30Z", "updated_at": "2019-03-15T09:30:57Z", "closed_at": "2019-03-15T05:22:43Z", "author_association": "NONE", "pull_request": null, "body": "It would be cool if there was a query parameter to hide / remove the SQL textarea. 
Then I could simply save a bookmark for a certain query and open it to see the data without having to scroll below the (long) SQL query first.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/415/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 432870248, "node_id": "MDU6SXNzdWU0MzI4NzAyNDg=", "number": 431, "title": "Datasette doesn't reload when database file changes", "user": {"value": 82988, "label": "psychemedia"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2019-04-13T16:50:43Z", "updated_at": "2019-05-02T05:13:55Z", "closed_at": "2019-05-02T05:13:54Z", "author_association": "CONTRIBUTOR", "pull_request": null, "body": "My understanding of the `--reload` option was that if the database file changed `datasette` would automatically reload.\r\n\r\nI'm running on a Mac and from the `datasette` UI queries don't seem to be picking up data in a newly changed db (I checked the db timestamp - it certainly updated).\r\n\r\nI was also expecting to see some sort of log statement in the datasette logging to say that it had detected a file change and restarted, but don't see anything there?\r\n\r\nWill try to check on an Ubuntu box when I get a chance to see if this is a Mac thing.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/431/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 442327592, "node_id": "MDU6SXNzdWU0NDIzMjc1OTI=", "number": 456, "title": "Installing installs the tests package", "user": {"value": 7725188, "label": "hellerve"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2019-05-09T16:35:16Z", "updated_at": "2020-07-24T20:39:54Z", "closed_at": "2020-07-24T20:39:54Z", "author_association": "CONTRIBUTOR", "pull_request": null, "body": "Because `setup.py` uses `find_packages` and `tests` is on the top-level, `pip install datasette` will install a top-level package called `tests`, which is probably not desired behavior.\r\n\r\nThe offending line is here:\r\nhttps://github.com/simonw/datasette/blob/bfa2ae0d16d39bb82dbe4da4f3fdc3c7f6257418/setup.py#L40\r\n\r\nAnd only `pip uninstall datasette` with a conflicting package would warn you by default; apparently another package had the same problem, which is why I get this message when uninstalling:\r\n\r\n```\r\n$ pip uninstall datasette\r\nUninstalling datasette-0.27:\r\n Would remove:\r\n /usr/local/bin/datasette\r\n /usr/local/lib/python3.7/site-packages/datasette-0.27.dist-info/*\r\n /usr/local/lib/python3.7/site-packages/datasette/*\r\n /usr/local/lib/python3.7/site-packages/tests/*\r\n Would not remove (might be manually added):\r\n [ .. snip .. ]\r\nProceed (y/n)? 
\r\n```\r\n\r\nThis should be a relatively simple fix, and I could drop a PR if desired!\r\n\r\nCheers", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/456/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 448391492, "node_id": "MDU6SXNzdWU0NDgzOTE0OTI=", "number": 21, "title": "Option to ignore inserts if primary key exists already", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2019-05-25T00:17:12Z", "updated_at": "2019-05-29T05:09:01Z", "closed_at": "2019-05-29T04:18:26Z", "author_association": "OWNER", "pull_request": null, "body": "> I've just noticed that SQLite lets you IGNORE inserts that collide with a pre-existing key. This can be quite handy if you have a dataset that keeps changing in part, and you don't want to upsert and replace pre-existing PK rows but you do want to ignore collisions to existing PK rows.\r\n> \r\n> Do `sqlite_utils` support such (cavalier!) behaviour?\r\n\r\n_Originally posted by @psychemedia in https://github.com/simonw/sqlite-utils/issues/18#issuecomment-480621924_", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/21/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 451585764, "node_id": "MDU6SXNzdWU0NTE1ODU3NjQ=", "number": 499, "title": "Accessibility for non-techie newsies? ", "user": {"value": 7936571, "label": "chrismp"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2019-06-03T16:49:37Z", "updated_at": "2019-06-05T21:22:55Z", "closed_at": null, "author_association": "NONE", "pull_request": null, "body": "Hi again, I'm having fun uploading datasets to Heroku via datasette. I'd like to set up datasette so that it's easy for other newsroom workers, who don't use Linux and aren't programmers, to upload datasets. Does datasette provide this out-of-the-box, or as a plugin? 
", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/499/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 452901999, "node_id": "MDExOlB1bGxSZXF1ZXN0Mjg1Njk4MzEw", "number": 501, "title": "Test against Python 3.8-dev using Travis", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2019-06-06T08:37:53Z", "updated_at": "2019-11-11T03:23:29Z", "closed_at": "2019-11-11T03:23:29Z", "author_association": "OWNER", "pull_request": "simonw/datasette/pulls/501", "body": "", "repo": {"value": 107914493, "label": "datasette"}, "type": "pull", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/501/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": 0, "state_reason": null} {"id": 453131917, "node_id": "MDU6SXNzdWU0NTMxMzE5MTc=", "number": 502, "title": "Exporting sqlite database(s)?", "user": {"value": 7936571, "label": "chrismp"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2019-06-06T16:39:53Z", "updated_at": "2021-04-03T05:16:54Z", "closed_at": "2019-06-11T18:50:42Z", "author_association": "NONE", "pull_request": null, "body": "I'm working on datasette from one computer. But if I want to work on it from another computer and want to copy the SQLite database(s) already on the Heroku datasette instance, how to I copy the database(s) to the second computer so that I can then update it and push to online via datasette's command line code that pushes code to Heroku?", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/502/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 457147936, "node_id": "MDU6SXNzdWU0NTcxNDc5MzY=", "number": 512, "title": "\"about\" parameter in metadata does not appear when alone", "user": {"value": 7936571, "label": "chrismp"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2019-06-17T21:04:20Z", "updated_at": "2019-10-11T15:49:13Z", "closed_at": null, "author_association": "NONE", "pull_request": null, "body": "Here's an example of metadata I have for one database on datasette.\r\n\r\n```\r\n\"Records-requests\": {\r\n\t\"tables\": {\r\n\t\t\"Some table\": {\r\n\t\t\t\"about\": \"This table has data.\"\r\n\t\t}\r\n\t}\r\n}\r\n```\r\n\r\nThe text in `about` does not show up when I publish the data. 
But it shows up after I add a `\"source\"` parameter in the metadata.\r\n\r\nIs this intended?", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/512/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 459598080, "node_id": "MDU6SXNzdWU0NTk1OTgwODA=", "number": 520, "title": "asgi_wrapper plugin hook", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": {"value": 9599, "label": "simonw"}, "milestone": null, "comments": 3, "created_at": "2019-06-23T17:16:45Z", "updated_at": "2019-07-03T04:40:34Z", "closed_at": "2019-07-03T04:06:28Z", "author_association": "OWNER", "pull_request": null, "body": "After #272 we can finally add this hook. It will allow plugins to wrap their own ASGI middleware around Datasette. Potential use-cases include:\r\n\r\n* adding authentication\r\n* custom CORS headers (see #454)\r\n* maybe gzip support?\r\n* possibly defining entirely new routes, though that may be better handled by a separate hook", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/520/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 459936585, "node_id": "MDU6SXNzdWU0NTk5MzY1ODU=", "number": 527, "title": "Unable to use rank when fts-table generated with csvs-to-sqlite", "user": {"value": 2181410, "label": "clausjuhl"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2019-06-24T14:49:48Z", "updated_at": "2019-06-24T15:21:18Z", "closed_at": "2019-06-24T15:09:10Z", "author_association": "NONE", "pull_request": null, "body": "Hi Simon.\r\n\r\nIf I generate a fts-table with the csvs-to-sqlite f-option, I'm unable to use (in datasette's GUI) the internal ranking of the table for sorting or viewing, but if I generate the fts-table with the enable-fts argument from sqlite-utils, everything works ok. 
Eg.:\r\n\r\ndatasette, version 0.28\r\nsqlite-utils, version 1.2.1\r\ncsvs-to-sqlite, version 0.9\r\n\r\nNo column named rank with these commands:\r\n$ csvs-to-sqlite minutes.csv minutes.db -f text_data\r\n$ datasette -i minutes.db\r\nselect rank, * from minutes_fts where minutes_fts match 'dog'\r\n\r\nEverything ok with these commands:\r\n$ csvs-to-sqlite minutes.csv minutes.db\r\n$ sqlite-utils enable-fts minutes.db text_data\r\n$ datasette -i minutes.db\r\nselect rank, * from minutes_fts where minutes_fts match 'dog'\r\n\r\nAm I doing something wrong?\r\n\r\nThank you for a great application!", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/527/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 463915863, "node_id": "MDU6SXNzdWU0NjM5MTU4NjM=", "number": 538, "title": "Mechanism for secrets in plugin configuration", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2019-07-03T19:23:34Z", "updated_at": "2019-07-04T05:47:54Z", "closed_at": "2019-07-04T05:47:54Z", "author_association": "OWNER", "pull_request": null, "body": "See https://github.com/simonw/datasette-auth-github/issues/1\r\n\r\nWe need a mechanism whereby plugins can tap into \"secret\" config options without exposing them in the visible metadata.json (where plugin configs currently live, see https://datasette.readthedocs.io/en/stable/plugins.html#plugin-configuration )", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/538/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 465327844, "node_id": "MDU6SXNzdWU0NjUzMjc4NDQ=", "number": 553, "title": "Potential improvements to facet-by-date", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2019-07-08T15:37:53Z", "updated_at": "2019-07-08T15:41:55Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "In addition to #483, Tobias had some useful suggestions on Twitter:\r\n\r\nhttps://twitter.com/rixxtr/status/1148253926476701696\r\n> I think for date facets, it might be more meaningful to order them by date, rather than by size? Or offer both? 
I'm *definitely* often interested in size-over-time, so https://data.rixx.de/django_tickets/tickets?_facet_date=created#facet-created \u2026 isn't all that helpful!\r\n\r\nScreenshot of that link:\r\n\r\n\"django_tickets__tickets__29_846_rows\"\r\n", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/553/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 467790646, "node_id": "MDU6SXNzdWU0Njc3OTA2NDY=", "number": 560, "title": "CodeMirror fails to load on database page", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2019-07-14T03:31:00Z", "updated_at": "2019-09-03T01:03:02Z", "closed_at": "2019-07-14T03:38:59Z", "author_association": "OWNER", "pull_request": null, "body": "It's not loading on https://latest.datasette.io/fixtures\r\n\r\nBut it does load on https://latest.datasette.io/fixtures?sql=select+*+from+facetable", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/560/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 470691999, "node_id": "MDU6SXNzdWU0NzA2OTE5OTk=", "number": 43, "title": ".add_column() doesn't match indentation of initial creation", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2019-07-20T16:33:10Z", "updated_at": "2019-07-23T13:09:11Z", "closed_at": "2019-07-23T13:09:05Z", "author_association": "OWNER", "pull_request": null, "body": "I spotted a table which was created once and then had columns added to it and the formatted SQL looks like this:\r\n\r\n```sql\r\nCREATE TABLE [records] (\r\n [type] TEXT,\r\n [sourceName] TEXT,\r\n [sourceVersion] TEXT,\r\n [unit] TEXT,\r\n [creationDate] TEXT,\r\n [startDate] TEXT,\r\n [endDate] TEXT,\r\n [value] TEXT,\r\n [metadata_Health Mate App Version] TEXT,\r\n [metadata_Withings User Identifier] TEXT,\r\n [metadata_Modified Date] TEXT,\r\n [metadata_Withings Link] TEXT,\r\n [metadata_HKWasUserEntered] TEXT\r\n, [device] TEXT, [metadata_HKMetadataKeyHeartRateMotionContext] TEXT, [metadata_HKDeviceManufacturerName] TEXT, [metadata_HKMetadataKeySyncVersion] TEXT, [metadata_HKMetadataKeySyncIdentifier] TEXT, [metadata_HKSwimmingStrokeStyle] TEXT, [metadata_HKVO2MaxTestType] TEXT, [metadata_HKTimeZone] TEXT, [metadata_Average HR] TEXT, [metadata_Recharge] TEXT, [metadata_Lights] TEXT, [metadata_Asleep] TEXT, [metadata_Rating] TEXT, [metadata_Energy Threshold] TEXT, [metadata_Deep Sleep] TEXT, [metadata_Nap] TEXT, [metadata_Edit Slots] TEXT, [metadata_Tags] TEXT, [metadata_Daytime HR] TEXT)\r\n```\r\n\r\nIt would be nice if the columns that were added later matched the indentation of the initial columns.", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/43/reactions\", \"total_count\": 
0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 471780443, "node_id": "MDU6SXNzdWU0NzE3ODA0NDM=", "number": 46, "title": "extracts= option for insert/update/etc", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2019-07-23T15:55:46Z", "updated_at": "2020-03-01T16:53:40Z", "closed_at": "2019-07-23T17:00:44Z", "author_association": "OWNER", "pull_request": null, "body": "Relates to #42 and #44. I want the ability to extract values out into lookup tables during bulk insert/upsert operations.\r\n\r\n`db.insert_all(rows, extracts=[\"species\"])`\r\n\r\n- creates species table for values in the species column\r\n\r\n`db.insert_all(rows, extracts={\"species\": \"Species\"})`\r\n\r\n- as above but the new table is called `Species`.", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/46/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 472097220, "node_id": "MDU6SXNzdWU0NzIwOTcyMjA=", "number": 7, "title": "Script uses a lot of RAM", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2019-07-24T06:11:11Z", "updated_at": "2019-07-24T06:35:52Z", "closed_at": "2019-07-24T06:35:52Z", "author_association": "MEMBER", "pull_request": null, "body": "I'm using an XML pull parser which should avoid the need to slurp the whole XML file into memory, but it's not working - the script still uses over 1GB of RAM when it runs according to Activity Monitor.\r\n\r\nI think this is because I'm still causing the full root element to be incrementally loaded into memory just in case I try and access it later.\r\n\r\nhttp://effbot.org/elementtree/iterparse.htm says I should use `elem.clear()` as I go. It also says:\r\n\r\n> The above pattern has one drawback; it does not clear the root element, so you will end up with a single element with lots of empty child elements. If your files are huge, rather than just large, this might be a problem. To work around this, you need to get your hands on the root element.\r\n\r\nSo I will try that recipe and see if it helps.", "repo": {"value": 197882382, "label": "healthkit-to-sqlite"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/7/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 476573875, "node_id": "MDU6SXNzdWU0NzY1NzM4NzU=", "number": 567, "title": "Datasette Edit", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2019-08-04T17:09:28Z", "updated_at": "2020-02-25T03:40:50Z", "closed_at": "2020-02-25T03:40:50Z", "author_association": "OWNER", "pull_request": null, "body": "Datasette started out immutable. Then it gained the ability to run against read-only databases that were being modified by other processes. 
It's time for the next logical progression: the option to allow Datasette (or more likely individual plugins) to write to the database!\r\n\r\nThis is going to require some careful rethinking of how connection management works.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/567/reactions\", \"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 488833698, "node_id": "MDU6SXNzdWU0ODg4MzM2OTg=", "number": 2, "title": "\"twitter-to-sqlite user-timeline\" command for pulling tweets by a specific user", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2019-09-03T21:29:12Z", "updated_at": "2019-09-04T20:02:11Z", "closed_at": "2019-09-04T20:02:11Z", "author_association": "MEMBER", "pull_request": null, "body": "Twitter only allows up to 3,200 tweets to be retrieved from https://developer.twitter.com/en/docs/tweets/timelines/api-reference/get-statuses-user_timeline.html\r\n\r\nI'm going to do:\r\n\r\n $ twitter-to-sqlite tweets simonw\r\n\r\n", "repo": {"value": 206156866, "label": "twitter-to-sqlite"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/2/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 492153532, "node_id": "MDU6SXNzdWU0OTIxNTM1MzI=", "number": 573, "title": "Exposing Datasette via Jupyter-server-proxy", "user": {"value": 82988, "label": "psychemedia"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2019-09-11T10:32:36Z", "updated_at": "2020-03-26T09:41:30Z", "closed_at": "2020-03-26T09:41:30Z", "author_association": "CONTRIBUTOR", "pull_request": null, "body": "It is possible to expose a running `datasette` service in a Jupyter environment such as a MyBinder environment using the [`jupyter-server-proxy`](https://github.com/jupyterhub/jupyter-server-proxy).\r\n\r\nFor example, using [this demo Binder](https://mybinder.org/v2/gh/binder-examples/r/master?filepath=index.ipynb) which has the server proxy installed, we can then upload a simple test database from the notebook homepage, from a Jupyter terminal install datasette and set it running against the test db on eg port 8001 and then view it via the path `proxy/8001`.\r\n\r\nClicking links results in 404s though because the `datasette` links aren't relative to the current path?\r\n\r\n![image](https://user-images.githubusercontent.com/82988/64689964-44b69280-d487-11e9-8f9f-3681422bcc9f.png)\r\n\r\n\r\n", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/573/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 499954048, "node_id": "MDExOlB1bGxSZXF1ZXN0MzIyNTI5Mzgx", "number": 578, "title": "Added support for multi arch builds", "user": 
{"value": 887095, "label": "heussd"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2019-09-29T18:43:03Z", "updated_at": "2019-11-13T19:13:15Z", "closed_at": "2019-11-13T19:13:15Z", "author_association": "NONE", "pull_request": "simonw/datasette/pulls/578", "body": "Minor changes in Dockerfile and new Makefile to support Docker multi architecture builds. `make`will build one image per architecture and push them as one Docker manifest to Docker Hub. Feel free to change `IMAGE_NAME ` to `datasetteproject/datasette` to update your official Docker Hub image(s).", "repo": {"value": 107914493, "label": "datasette"}, "type": "pull", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/578/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": 0, "state_reason": null} {"id": 503053800, "node_id": "MDU6SXNzdWU1MDMwNTM4MDA=", "number": 12, "title": "Extract \"source\" into a separate lookup table", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2019-10-06T05:17:23Z", "updated_at": "2019-10-17T15:49:24Z", "closed_at": "2019-10-17T15:49:24Z", "author_association": "MEMBER", "pull_request": null, "body": "It's pretty bulky and ugly at the moment:\r\n\r\n\"trump__tweets__1_820_rows\"\r\n", "repo": {"value": 206156866, "label": "twitter-to-sqlite"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/12/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 503234169, "node_id": "MDU6SXNzdWU1MDMyMzQxNjk=", "number": 2, "title": "Track and use the 'since' value", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2019-10-07T05:02:59Z", "updated_at": "2020-03-27T22:22:30Z", "closed_at": "2020-03-27T22:22:30Z", "author_association": "MEMBER", "pull_request": null, "body": "Pocket says:\r\n\r\n> Whenever possible, you should use the since parameter, or count and and offset parameters when retrieving a user's list. After retrieving the list, you should store the current time (which is provided along with the list response) and pass that in the next request for the list. 
This way the server only needs to return a small set (changes since that time) instead of the user's entire list every time.\r\n\r\nAt the bottom of https://getpocket.com/developer/docs/v3/retrieve", "repo": {"value": 213286752, "label": "pocket-to-sqlite"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/dogsheep/pocket-to-sqlite/issues/2/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 505512251, "node_id": "MDU6SXNzdWU1MDU1MTIyNTE=", "number": 588, "title": "Queries per DB table in metadata.json", "user": {"value": 12617395, "label": "bsilverm"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2019-10-10T21:08:19Z", "updated_at": "2019-10-21T12:58:22Z", "closed_at": "2019-10-21T01:48:42Z", "author_association": "NONE", "pull_request": null, "body": "It doesn't appear possible to have separate queries defined per database table. When I do something like below, my table descriptions show up but not the queries:\r\n\r\n` \"databases\": {\r\n \"MYDB\": {\r\n \"tables\": {\r\n \"MYFIRSTTABLE\": {\r\n \"source\": \"Test\",\r\n \"source_url\": \"https://www.google.com\",\r\n \"queries\": {\r\n \"Query 1\": {\r\n \"sql\": \"select * from MYFIRSTTABLE\",\r\n \"title\": \"Query 1\",\r\n \"description\": \"This is the first query\"\r\n },\r\n }\r\n },\r\n \"MYSECONDTABLE\": {\r\n \"source\":\"Test2\",\r\n \"source_url\":\"https://www.google.com\",\r\n \"queries\": {\r\n \"Query 2\" : {\r\n \"sql\":\"select * from MYSECONDTABLE;\",\r\n \"title\": \"Query 2\",\r\n \"description\":\"This is the second query\"\r\n }\r\n }\r\n }\r\n }`", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/588/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 505818256, "node_id": "MDExOlB1bGxSZXF1ZXN0MzI3MTcyNTQ1", "number": 590, "title": "Handle spaces in DB names", "user": {"value": 2657547, "label": "rixx"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2019-10-11T12:18:22Z", "updated_at": "2019-11-04T23:16:31Z", "closed_at": "2019-11-04T23:16:30Z", "author_association": "CONTRIBUTOR", "pull_request": "simonw/datasette/pulls/590", "body": "Closes #503", "repo": {"value": 107914493, "label": "datasette"}, "type": "pull", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/590/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": 0, "state_reason": null} {"id": 506087267, "node_id": "MDU6SXNzdWU1MDYwODcyNjc=", "number": 19, "title": "since_id support for home-timeline", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2019-10-11T22:48:24Z", "updated_at": "2019-10-16T19:13:06Z", "closed_at": "2019-10-16T19:12:46Z", "author_association": "MEMBER", "pull_request": null, "body": "Currently every time you run 
`home-timeline` we pull all 800 available tweets. We should offer to support `since_id` (which can be provided or can be pulled directly from the database) in order to work more efficiently if this command is executed e.g. on a cron.", "repo": {"value": 206156866, "label": "twitter-to-sqlite"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/19/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 506183241, "node_id": "MDU6SXNzdWU1MDYxODMyNDE=", "number": 593, "title": "make uvicorn optional dependency (because not ok on windows python yet)", "user": {"value": 4312421, "label": "stonebig"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2019-10-12T12:51:07Z", "updated_at": "2019-10-13T06:22:08Z", "closed_at": "2019-10-13T06:22:07Z", "author_association": "NONE", "pull_request": null, "body": "would it be possible to:\r\n- remove uvicorn as a mandatory dependency?\r\n- eventually make a fallback to hypercorn?\r\n\r\nreason:\r\n- uvloop not yet supported on Windows/Python-3.8 and below, may happen with Python-3.9 only.\r\n- it seems a six-line effort (but I'm no expert)", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/593/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 506268945, "node_id": "MDU6SXNzdWU1MDYyNjg5NDU=", "number": 20, "title": "--since support for various commands for refresh-by-cron", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2019-10-13T03:40:46Z", "updated_at": "2019-10-21T03:32:04Z", "closed_at": "2019-10-16T19:26:11Z", "author_association": "MEMBER", "pull_request": null, "body": "I want to run a cron that updates my Twitter database every X minutes.\r\n\r\nIt should be able to retrieve the following without needing to paginate through everything:\r\n\r\n- [x] Tweets I have tweeted\r\n- [x] My home timeline (see #19)\r\n- [x] Tweets I have favourited\r\n\r\nIt would be nice if this could be standardized across all commands as a `--since` option.", "repo": {"value": 206156866, "label": "twitter-to-sqlite"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/20/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 506297048, "node_id": "MDU6SXNzdWU1MDYyOTcwNDg=", "number": 594, "title": "upgrade to uvicorn-0.9 to be Python-3.8 friendly", "user": {"value": 4312421, "label": "stonebig"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2019-10-13T09:23:43Z", "updated_at": "2019-11-12T04:47:04Z", "closed_at": "2019-11-12T04:47:04Z", "author_association": "NONE", "pull_request": null, "body": "uvicorn-0.8 relies on websockets-0.7 which lacks python-3.8 compatibility", 
"repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/594/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 509535510, "node_id": "MDExOlB1bGxSZXF1ZXN0MzMwMDc2MjYz", "number": 602, "title": "Offer to format readonly SQL", "user": {"value": 2657547, "label": "rixx"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2019-10-20T02:29:32Z", "updated_at": "2019-11-04T07:29:33Z", "closed_at": "2019-11-04T02:39:56Z", "author_association": "CONTRIBUTOR", "pull_request": "simonw/datasette/pulls/602", "body": "Following discussion in #601, this PR adds a \"Format SQL\" button to\r\nread-only SQL (if the SQL actually differs from the formatting result).\r\n\r\nIt also removes a console error on readonly SQL queries.", "repo": {"value": 107914493, "label": "datasette"}, "type": "pull", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/602/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": 0, "state_reason": null} {"id": 509693773, "node_id": "MDU6SXNzdWU1MDk2OTM3NzM=", "number": 604, "title": "_where= parameter is not persisted in hidden form fields", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2019-10-21T02:14:10Z", "updated_at": "2019-10-30T19:12:38Z", "closed_at": "2019-10-30T18:49:44Z", "author_association": "OWNER", "pull_request": null, "body": "e.g. 
on this page: https://v0-30.datasette.io/fixtures/roadside_attractions?_where=name%20like%20%27%museum%%27\r\n\r\nClick the \"Apply\" button and the `_where=` parameter will be dropped.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/604/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 512218858, "node_id": "MDU6SXNzdWU1MTIyMTg4NTg=", "number": 606, "title": "/-/plugins shows incorrect name for plugins", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2019-10-24T22:53:25Z", "updated_at": "2019-11-01T05:41:04Z", "closed_at": "2019-11-01T05:40:07Z", "author_association": "OWNER", "pull_request": null, "body": "https://fivethirtyeight.datasettes.com/-/plugins\r\n\r\n```json\r\n[\r\n {\r\n \"name\": \"datasette_jellyfish\",\r\n \"static\": false,\r\n \"templates\": false,\r\n \"version\": \"0.3\"\r\n },\r\n {\r\n \"name\": \"datasette_vega\",\r\n \"static\": true,\r\n \"templates\": false,\r\n \"version\": \"0.6.2\"\r\n }\r\n]\r\n```\r\n\r\nThese should be shown as `datasette-jellyfish` and `datasette-vega` since those are the names on PyPI.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/606/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 516370822, "node_id": "MDU6SXNzdWU1MTYzNzA4MjI=", "number": 611, "title": "Static assets no longer loading for installed plugins", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2019-11-01T22:07:00Z", "updated_at": "2019-11-01T22:15:55Z", "closed_at": "2019-11-01T22:15:55Z", "author_association": "OWNER", "pull_request": null, "body": "Caused by fix I made in #606\r\n\r\ne.g. 
`/-/static-plugins/datasette_leaflet_geojson/datasette-leaflet-geojson.js` is a 404, but `/-/static-plugins/datasette-leaflet-geojson/datasette-leaflet-geojson.js` works correctly.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/611/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 516967682, "node_id": "MDU6SXNzdWU1MTY5Njc2ODI=", "number": 10, "title": "Add this repos_starred view", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2019-11-04T05:44:38Z", "updated_at": "2020-05-02T16:37:36Z", "closed_at": "2020-05-02T16:37:36Z", "author_association": "MEMBER", "pull_request": null, "body": "```sql\r\ncreate view repos_starred as select\r\n stars.starred_at,\r\n users.login,\r\n repos.*\r\nfrom\r\n repos\r\n join stars on repos.id = stars.repo\r\n join users on repos.owner = users.id\r\norder by\r\n starred_at desc;\r\n```", "repo": {"value": 207052882, "label": "github-to-sqlite"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/dogsheep/github-to-sqlite/issues/10/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 522334771, "node_id": "MDU6SXNzdWU1MjIzMzQ3NzE=", "number": 633, "title": "Publish to Heroku is broken: \"WARNING: You must pass the application as an import string to enable 'reload' or 'workers\"", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2019-11-13T16:32:11Z", "updated_at": "2020-04-28T20:37:50Z", "closed_at": "2019-11-13T16:43:23Z", "author_association": "OWNER", "pull_request": null, "body": "```\r\n2019-11-13T16:27:59.821483+00:00 heroku[web.1]: Starting process with command `datasette serve --host 0.0.0.0 -i fixtures.db --cors --port 36817 --inspect-file inspect-data.json`\r\n2019-11-13T16:28:01.856471+00:00 heroku[web.1]: State changed from starting to crashed\r\n2019-11-13T16:28:01.750253+00:00 app[web.1]: Serve! 
files=() (immutables=('fixtures.db',)) on port 36817\r\n2019-11-13T16:28:01.771524+00:00 app[web.1]: WARNING: You must pass the application as an import string to enable 'reload' or 'workers'.\r\n2019-11-13T16:28:01.837839+00:00 heroku[web.1]: Process exited with status 1\r\n```", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/633/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 525254973, "node_id": "MDU6SXNzdWU1MjUyNTQ5NzM=", "number": 636, "title": "rowid is not included in dropdown filter menus", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2019-11-19T20:43:04Z", "updated_at": "2019-11-19T23:01:17Z", "closed_at": "2019-11-19T23:01:17Z", "author_association": "OWNER", "pull_request": null, "body": "For `rowid` tables the `rowid` column isn't shown in the list of filter options:\r\n\r\n\"md__md__53_805_rows_where_where_rowid___1060124_sorted_by_rowid_descending\"\r\n\r\nThis also means if you link to e.g. `?rowid__gt=1060124` the resulting filter interface will be slightly broken: clicking the \"apply\" button again will lose your filter for example.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/636/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 525993034, "node_id": "MDU6SXNzdWU1MjU5OTMwMzQ=", "number": 637, "title": "Custom queries with 0 results should say \"0 results\"", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2019-11-20T18:28:14Z", "updated_at": "2019-11-23T06:17:23Z", "closed_at": "2019-11-23T06:07:08Z", "author_association": "OWNER", "pull_request": null, "body": "Consider https://latest.datasette.io/fixtures/neighborhood_search?text=foop\r\n\r\n\"fixtures__select_neighborhood__facet_cities_name__state_from_facetable_join_facet_cities_on_facetable_city_id___facet_cities_id_where_neighborhood_like_________text________order_by_neighborhood_\"\r\n\r\nIt's currently not obvious that the query executed and returned 0 results.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/637/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 526913133, "node_id": "MDU6SXNzdWU1MjY5MTMxMzM=", "number": 638, "title": "Don't suggest column for faceting if all values are 1", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2019-11-22T00:14:22Z", "updated_at": "2019-11-22T01:14:59Z", "closed_at": "2019-11-22T00:57:49Z", "author_association": "OWNER", "pull_request": null, "body": 
"https://www.niche-museums.com/museums/museums?_facet=wikipedia_url\r\n\r\n\"museums__museums__42_rows\"\r\n\r\nChallenge is how to do this efficiently, since suggested facet queries need to be lightning fast.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/638/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 530491074, "node_id": "MDU6SXNzdWU1MzA0OTEwNzQ=", "number": 14, "title": "Command for importing events", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2019-11-29T21:28:58Z", "updated_at": "2020-04-14T19:38:34Z", "closed_at": null, "author_association": "MEMBER", "pull_request": null, "body": "Eg from https://api.github.com/users/simonw/events\r\n\r\nDocs here: https://developer.github.com/v3/activity/events/#list-events-performed-by-a-user", "repo": {"value": 207052882, "label": "github-to-sqlite"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/dogsheep/github-to-sqlite/issues/14/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 534507142, "node_id": "MDU6SXNzdWU1MzQ1MDcxNDI=", "number": 69, "title": "Feature request: enable extensions loading", "user": {"value": 30607, "label": "aborruso"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2019-12-08T08:06:25Z", "updated_at": "2022-02-05T00:04:25Z", "closed_at": "2020-10-16T18:42:49Z", "author_association": "NONE", "pull_request": null, "body": "Hi, it would be great to add a parameter that enables the load of a sqlite extension you need.\r\n\r\nSomething like \"-ext modspatialite\".\r\n\r\nIn this way your great tool would be even more comfortable and powerful.\r\n\r\n\r\nThank you very much", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/69/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 534629631, "node_id": "MDU6SXNzdWU1MzQ2Mjk2MzE=", "number": 650, "title": "Add a glossary to the documentation", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2019-12-09T00:23:45Z", "updated_at": "2022-01-13T22:04:56Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "Call it `glossary.rst` - it can use a definition list something like this:\r\n```rst\r\n.. 
_glossary:\r\n\r\nGlossary\r\n========\r\n\r\nTerm\r\n A definition of the term.\r\n\r\nAnother term\r\n Another definition.\r\n```", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/650/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 539590148, "node_id": "MDU6SXNzdWU1Mzk1OTAxNDg=", "number": 651, "title": "fts5 syntax error when using punctuation", "user": {"value": 2181410, "label": "clausjuhl"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2019-12-18T10:25:35Z", "updated_at": "2021-07-14T19:26:06Z", "closed_at": "2019-12-30T06:42:55Z", "author_association": "NONE", "pull_request": null, "body": "Hi Simon\r\n\r\nI get a syntax error when using punctuation or special characters in a fulltext search (using fts5). I created the virtual table using sqlite-utils' \"enable-fts\"-command.\r\n\r\nThe same error appears on Niche Museums [https://www.niche-museums.com/browse/search?q=park.](https://www.niche-museums.com/browse/search?q=park.), but works fine in most of your other datasette-examples, e.g. register-of-members-interests [https://register-of-members-interests.datasettes.com/regmem-98dc8b7/items?_search=mins.](https://register-of-members-interests.datasettes.com/regmem-98dc8b7/items?_search=mins.)\r\n\r\nWhat am I doing wrong? Many thanks!\r\n\r\n", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/651/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 541467590, "node_id": "MDU6SXNzdWU1NDE0Njc1OTA=", "number": 654, "title": "Template debug mode that outputs template context", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2019-12-22T15:51:25Z", "updated_at": "2019-12-22T16:13:11Z", "closed_at": "2019-12-22T16:04:51Z", "author_association": "OWNER", "pull_request": null, "body": "It would make writing templates (including custom templates) easier if there was an option to dump out the full template context - maybe `?_context=1`", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/654/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 542814756, "node_id": "MDU6SXNzdWU1NDI4MTQ3NTY=", "number": 71, "title": "Tests are failing due to missing FTS5", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2019-12-27T09:41:16Z", "updated_at": "2019-12-27T09:49:37Z", "closed_at": "2019-12-27T09:49:37Z", "author_association": "OWNER", "pull_request": null, "body": "https://travis-ci.com/simonw/sqlite-utils/jobs/268436167\r\n\r\nThis is a recent change: 2 months ago 
they worked fine.\r\n\r\nI'm not sure what changed here. Maybe something to do with https://launchpad.net/~jonathonf/+archive/ubuntu/backports ?", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/71/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 546073980, "node_id": "MDU6SXNzdWU1NDYwNzM5ODA=", "number": 74, "title": "Test failures on openSUSE 15.1: AssertionError: Explicit other_table and other_column", "user": {"value": 15092, "label": "jayvdb"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2020-01-07T04:35:50Z", "updated_at": "2020-01-12T07:21:17Z", "closed_at": null, "author_association": "CONTRIBUTOR", "pull_request": null, "body": "openSUSE 15.1 is using python 3.6.5 and click-7.0 , however it has test failures while openSUSE Tumbleweed on py37 passes.\r\n\r\nMost fail on the cli exit code like\r\n```py\r\n[ 74s] =================================== FAILURES ===================================\r\n[ 74s] _________________________________ test_tables __________________________________\r\n[ 74s] \r\n[ 74s] db_path = '/tmp/pytest-of-abuild/pytest-0/test_tables0/test.db'\r\n[ 74s] \r\n[ 74s] def test_tables(db_path):\r\n[ 74s] result = CliRunner().invoke(cli.cli, [\"tables\", db_path])\r\n[ 74s] > assert '[{\"table\": \"Gosh\"},\\n {\"table\": \"Gosh2\"}]' == result.output.strip()\r\n[ 74s] E assert '[{\"table\": \"...e\": \"Gosh2\"}]' == ''\r\n[ 74s] E - [{\"table\": \"Gosh\"},\r\n[ 74s] E - {\"table\": \"Gosh2\"}]\r\n[ 74s] \r\n[ 74s] tests/test_cli.py:28: AssertionError\r\n```\r\n\r\npackaging project at https://build.opensuse.org/package/show/home:jayvdb:py-new/python-sqlite-utils\r\n\r\nI'll keep digging into this after I have github-to-sqlite working on Tumbleweed, as I'll need openSUSE Leap 15.1 working before I can submit this into the main python repo.", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/74/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 555832585, "node_id": "MDU6SXNzdWU1NTU4MzI1ODU=", "number": 661, "title": "--port option to expose a port other than 8001 in \"datasette package\"", "user": {"value": 134771, "label": "dvhthomas"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2020-01-27T21:05:56Z", "updated_at": "2020-01-30T04:17:52Z", "closed_at": "2020-01-29T22:46:45Z", "author_association": "NONE", "pull_request": null, "body": "I see how to alter the port using `datasette serve -p XXX` per the docs. 
However, I'm packaging up to serve the container on AppEngine flexible, which [requires](https://cloud.google.com/appengine/docs/flexible/custom-runtimes/build#listening_to_port_8080) that the container is serving traffic on port 8080.\r\n\r\nhttps://github.com/simonw/datasette/blob/7950105c278b140e6cb665c68b59df219870f9bc/Dockerfile#L41\r\n\r\nIs there a way to inject a non-default port into the Dockerfile, or should I just do something like `sed` to replace 8001 with 8080 after `datasette package` has done its thing? Thanks for the advice.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/661/reactions\", \"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 559197745, "node_id": "MDU6SXNzdWU1NTkxOTc3NDU=", "number": 82, "title": "Tutorial command no longer works", "user": {"value": 10350886, "label": "petey284"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2020-02-03T16:36:11Z", "updated_at": "2020-02-27T04:16:43Z", "closed_at": "2020-02-27T04:16:30Z", "author_association": "NONE", "pull_request": null, "body": "Issue with command on [tutorial](https://simonwillison.net/2019/Feb/25/sqlite-utils/) on Simon's site.\r\n\r\nThe following command no longer works, and breaks with the previous too many variables error: #50\r\n\r\n``` cmd\r\n> curl \"https://data.nasa.gov/resource/y77d-th95.json\" | \\\r\n sqlite-utils insert meteorites.db meteorites - --pk=id\r\n```\r\n\r\nOutput:\r\n``` cmd\r\nTraceback (most recent call last):\r\n File \"continuum\\miniconda3\\envs\\main\\lib\\runpy.py\", line 193, in _run_module_as_main\r\n \"__main__\", mod_spec)\r\n File \"continuum\\miniconda3\\envs\\main\\lib\\runpy.py\", line 85, in _run_code\r\n exec(code, run_globals)\r\n File \"Continuum\\miniconda3\\envs\\main\\Scripts\\sqlite-utils.exe\\__main__.py\", line 9, in \r\n File \"continuum\\miniconda3\\envs\\main\\lib\\site-packages\\click\\core.py\", line 764, in __call__\r\n return self.main(*args, **kwargs)\r\n File \"continuum\\miniconda3\\envs\\main\\lib\\site-packages\\click\\core.py\", line 717, in main\r\n rv = self.invoke(ctx)\r\n File \"continuum\\miniconda3\\envs\\main\\lib\\site-packages\\click\\core.py\", line 1137, in invoke\r\n return _process_result(sub_ctx.command.invoke(sub_ctx))\r\n File \"continuum\\miniconda3\\envs\\main\\lib\\site-packages\\click\\core.py\", line 956, in invoke\r\n return ctx.invoke(self.callback, **ctx.params)\r\n File \"continuum\\miniconda3\\envs\\main\\lib\\site-packages\\click\\core.py\", line 555, in invoke\r\n return callback(*args, **kwargs)\r\n File \"continuum\\miniconda3\\envs\\main\\lib\\site-packages\\sqlite_utils\\cli.py\", line 434, in insert\r\n default=default,\r\n File \"continuum\\miniconda3\\envs\\main\\lib\\site-packages\\sqlite_utils\\cli.py\", line 384, in insert_upsert_implementation\r\n docs, pk=pk, batch_size=batch_size, alter=alter, **extra_kwargs\r\n File \"continuum\\miniconda3\\envs\\main\\lib\\site-packages\\sqlite_utils\\db.py\", line 1081, in insert_all\r\n result = self.db.conn.execute(query, params)\r\nsqlite3.OperationalError: too many SQL variables\r\n```\r\n\r\nMy thought is that maybe the dataset grew over the last few years and so didn't run into this issue before.\r\n\r\nNo error when 
I reduce the count of entries to 83. Once the number of entries hits 84, the command fails.\r\n\r\n// This passes\r\n``` cmd\r\ntype meteorite_83.txt | sqlite-utils insert meteorites.db meteorites - --pk=id\r\n```\r\n\r\n// But this fails\r\n``` cmd\r\ntype meteorite_84.txt | sqlite-utils insert meteorites.db meteorites - --pk=id\r\n```\r\n\r\nA potential fix might be to chunk the incoming data. I can work on a PR if pointed in the right direction.\r\n", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/82/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 562085508, "node_id": "MDExOlB1bGxSZXF1ZXN0MzcyNzYzOTA2", "number": 666, "title": "Use inspect-file, if possible, for total row count", "user": {"value": 13896256, "label": "kevindkeogh"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2020-02-08T22:10:35Z", "updated_at": "2020-03-09T02:47:15Z", "closed_at": "2020-02-25T20:19:29Z", "author_association": "CONTRIBUTOR", "pull_request": "simonw/datasette/pulls/666", "body": "For large tables, counting the number of rows in the table can take a\r\nsignificant amount of time. Instead, where an inspect-file is provided\r\nfor an immutable database, look up the row count from it rather than\r\nrunning a plain count(*).", "repo": {"value": 107914493, "label": "datasette"}, "type": "pull", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/666/reactions\", \"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": 0, "state_reason": null} {"id": 569253072, "node_id": "MDU6SXNzdWU1NjkyNTMwNzI=", "number": 678, "title": "prepare_connection() plugin hook should accept optional datasette argument", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2020-02-22T00:50:26Z", "updated_at": "2020-02-22T03:53:19Z", "closed_at": "2020-02-22T02:28:51Z", "author_association": "OWNER", "pull_request": null, "body": "I want to build a plugin that allows users to configure certain database columns to be \"masked\" - so the `password` column on a users table is never revealed, for example.\r\n\r\nTo do this, I need to use the `conn.set_authorizer()` SQLite mechanism. So the plugin needs to build off the `prepare_connection(conn)` hook, roughly along the lines of the sketch below. 
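A minimal sketch of that masking idea, assuming Python's standard `sqlite3` authorizer API; the `mask_columns` helper and the `users`/`password` names are hypothetical, not from the issue:

```python
import sqlite3

def mask_columns(conn, masked=frozenset({("users", "password")})):
    """Sketch: make every read of a masked (table, column) pair return NULL."""
    def authorizer(action, arg1, arg2, db_name, trigger_or_view):
        # For SQLITE_READ, arg1 is the table name and arg2 the column name.
        if action == sqlite3.SQLITE_READ and (arg1, arg2) in masked:
            return sqlite3.SQLITE_IGNORE  # column value is replaced with NULL
        return sqlite3.SQLITE_OK
    conn.set_authorizer(authorizer)

# Usage sketch:
# conn = sqlite3.connect("example.db")
# mask_columns(conn)
# conn.execute("select password from users").fetchall()  # -> [(None,), ...]
```

In a real plugin this would be registered from the `prepare_connection()` hook, which is exactly why the issue asks for the hook to also receive the `datasette` object: the set of masked columns would come from plugin configuration.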
But that hook doesn't currently get passed `datasette`, so it doesn't have a way of looking up its plugin configuration!", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/678/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 573578548, "node_id": "MDU6SXNzdWU1NzM1Nzg1NDg=", "number": 89, "title": "Ability to customize columns used by extracts= feature", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2020-03-01T16:54:48Z", "updated_at": "2020-10-16T19:17:50Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "@simonw any thoughts on allowing extracts to specify the lookup column name? If I'm understanding the documentation right, `.lookup()` allows you to define the \"value\" column (the documentation uses \"name\"), but when you use the `extracts` keyword as part of `.insert()`, `.upsert()` etc. the lookup must be done against a column named \"value\". I have an existing lookup table that I've populated with columns \"id\" and \"name\" as opposed to \"id\" and \"value\", and it seems I can't use `extracts=`, unless I'm missing something...\r\n\r\nInitial thought on how to do this would be to allow the dictionary value to be a (table name, column name) tuple... so:\r\n```\r\ntable = db.table(\"trees\", extracts={\"species_id\": (\"Species\", \"name\")})\r\n```\r\n\r\nI haven't dug too much into the existing code yet, but does this make sense (see the sketch below for the current behavior)? 
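A short sketch of the asymmetry being described, using the documented `.lookup()` and `extracts=` APIs; the table and column names (including `Species2`) are illustrative, not from the issue:

```python
import sqlite_utils

db = sqlite_utils.Database(memory=True)

# .lookup() lets you choose the lookup column name ("name" here):
species_id = db["Species"].lookup({"name": "Oak"})
print(species_id, db["Species"].columns_dict)  # 1 {'id': int, 'name': str}

# ...but extracts= always performs the lookup against a column named
# "value", so it cannot reuse the id/name table created above:
db["trees"].insert(
    {"id": 1, "species_id": "Oak"},
    pk="id",
    extracts={"species_id": "Species2"},  # Species2 is created with id/value
)
print(db["Species2"].columns_dict)  # {'id': int, 'value': str}
```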
Worth doing?\r\n\r\n_Originally posted by @chrishas35 in https://github.com/simonw/sqlite-utils/issues/46#issuecomment-592999503_", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/89/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 574043218, "node_id": "MDU6SXNzdWU1NzQwNDMyMTg=", "number": 693, "title": "Variables from extra_template_vars() not exposed in _context=1", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2020-03-02T15:14:51Z", "updated_at": "2020-04-05T19:12:48Z", "closed_at": "2020-04-05T19:12:48Z", "author_association": "OWNER", "pull_request": null, "body": "The `_context=1` debugging mode does not show variables that should have been added to the context by the `extra_template_vars()` plugin hook.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/693/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 585353598, "node_id": "MDU6SXNzdWU1ODUzNTM1OTg=", "number": 37, "title": "Handle \"User not found\" error", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2020-03-20T22:14:32Z", "updated_at": "2020-04-17T23:43:46Z", "closed_at": "2020-04-17T23:43:46Z", "author_association": "MEMBER", "pull_request": null, "body": "While running `user-timeline` I got this bug (because a screen name I asked for didn't exist):\r\n```\r\n File \"/Users/simonw/Dropbox/Development/twitter-to-sqlite/twitter_to_sqlite/utils.py\", line 185, in transform_user\r\n user[\"created_at\"] = parser.parse(user[\"created_at\"])\r\nKeyError: 'created_at'\r\n>>> import pdb\r\n>>> pdb.pm()\r\n> /Users/simonw/Dropbox/Development/twitter-to-sqlite/twitter_to_sqlite/utils.py(185)transform_user()\r\n-> user[\"created_at\"] = parser.parse(user[\"created_at\"])\r\n(Pdb) user\r\n{'errors': [{'code': 50, 'message': 'User not found.'}]}\r\n```", "repo": {"value": 206156866, "label": "twitter-to-sqlite"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/37/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 585597133, "node_id": "MDExOlB1bGxSZXF1ZXN0MzkxOTI0NTA5", "number": 703, "title": "WIP implementation of writable canned queries", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2020-03-21T22:23:51Z", "updated_at": "2020-06-03T00:08:14Z", "closed_at": "2020-06-02T23:57:35Z", "author_association": "OWNER", "pull_request": "simonw/datasette/pulls/703", "body": "Refs #698.", "repo": {"value": 107914493, "label": "datasette"}, "type": "pull", "active_lock_reason": null, 
"performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/703/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": 1, "state_reason": null} {"id": 598013965, "node_id": "MDU6SXNzdWU1OTgwMTM5NjU=", "number": 724, "title": "--plugin-secret over-rides existing metadata.json plugin config", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2020-04-10T17:56:30Z", "updated_at": "2020-04-16T04:58:12Z", "closed_at": "2020-04-10T18:34:21Z", "author_association": "OWNER", "pull_request": null, "body": "This means if you use `--plugin-secret` at all (with e.g. `publish cloudrun`) any existing plugin configuration in your `metadata.json` will be ignored.\r\n\r\nhttps://github.com/simonw/datasette/blob/af9cd4ca64652fae262e6f7b5d201f6e0adc989b/datasette/publish/cloudrun.py#L98-L109\r\n\r\n", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/724/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 598891570, "node_id": "MDExOlB1bGxSZXF1ZXN0NDAyNjQ1OTg0", "number": 725, "title": "Update aiofiles requirement from ~=0.4.0 to >=0.4,<0.6", "user": {"value": 27856297, "label": "dependabot-preview[bot]"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2020-04-13T13:32:47Z", "updated_at": "2020-05-04T18:16:54Z", "closed_at": "2020-05-04T16:17:49Z", "author_association": "CONTRIBUTOR", "pull_request": "simonw/datasette/pulls/725", "body": "Updates the requirements on [aiofiles](https://github.com/Tinche/aiofiles) to permit the latest version.\n
\n\n\nDependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`.\n\n[//]: # (dependabot-automerge-start)\n[//]: # (dependabot-automerge-end)\n\n---\n\n
\nDependabot commands and options\n
\n\nYou can trigger Dependabot actions by commenting on this PR:\n- `@dependabot rebase` will rebase this PR\n- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it\n- `@dependabot merge` will merge this PR after your CI passes on it\n- `@dependabot squash and merge` will squash and merge this PR after your CI passes on it\n- `@dependabot cancel merge` will cancel a previously requested merge and block automerging\n- `@dependabot reopen` will reopen this PR if it is closed\n- `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually\n- `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)\n- `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)\n- `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)\n- `@dependabot use these labels` will set the current labels as the default for future PRs for this repo and language\n- `@dependabot use these reviewers` will set the current reviewers as the default for future PRs for this repo and language\n- `@dependabot use these assignees` will set the current assignees as the default for future PRs for this repo and language\n- `@dependabot use this milestone` will set the current milestone as the default for future PRs for this repo and language\n- `@dependabot badge me` will comment on this PR with code to add a \"Dependabot enabled\" badge to your readme\n\nAdditionally, you can set the following in your Dependabot [dashboard](https://app.dependabot.com):\n- Update frequency (including time of day and day of week)\n- Pull request limits (per update run and/or open at any time)\n- Out-of-range updates (receive only lockfile updates, if desired)\n- Security updates (receive only security updates, if desired)\n\n\n\n
", "repo": {"value": 107914493, "label": "datasette"}, "type": "pull", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/725/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": 0, "state_reason": null} {"id": 601333634, "node_id": "MDU6SXNzdWU2MDEzMzM2MzQ=", "number": 28, "title": "Pull repository contributors", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2020-04-16T18:46:40Z", "updated_at": "2020-04-18T15:05:10Z", "closed_at": "2020-04-18T15:05:10Z", "author_association": "MEMBER", "pull_request": null, "body": "https://developer.github.com/v3/repos/#list-contributors\r\n\r\n`GET /repos/:owner/:repo/contributors`\r\n\r\nNot sure if this should be a separate command or should be part of the existing `repos` command. I'm leaning towards a new `contributors` command.", "repo": {"value": 207052882, "label": "github-to-sqlite"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/dogsheep/github-to-sqlite/issues/28/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 601358649, "node_id": "MDU6SXNzdWU2MDEzNTg2NDk=", "number": 100, "title": "Mechanism for forcing column-type, over-riding auto-detection", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2020-04-16T19:12:52Z", "updated_at": "2020-04-17T23:53:32Z", "closed_at": "2020-04-17T23:53:32Z", "author_association": "OWNER", "pull_request": null, "body": "As seen in https://github.com/dogsheep/github-to-sqlite/issues/27#issuecomment-614843406 - there's a problem where you insert a record with a `None` value for a column and that column is created as `TEXT` - but actually you intended it to be an `INT` (as later examples will demonstrate).\r\n\r\nSome kind of mechanism for over-riding the detected types of columns would be useful here.", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/100/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 603624862, "node_id": "MDU6SXNzdWU2MDM2MjQ4NjI=", "number": 31, "title": "Issue and milestone should have foreign key to repo", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2020-04-21T00:46:24Z", "updated_at": "2020-04-22T01:20:19Z", "closed_at": "2020-04-22T01:20:19Z", "author_association": "MEMBER", "pull_request": null, "body": "Currently the `repo` column on those tables is a string `simonw/datasette` rather than an ID referencing a row in `repos`.\r\n\r\n_Originally posted by @simonw in https://github.com/dogsheep/github-to-sqlite/issues/29#issuecomment-616883275_", "repo": {"value": 207052882, "label": "github-to-sqlite"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": 
null, "reactions": "{\"url\": \"https://api.github.com/repos/dogsheep/github-to-sqlite/issues/31/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 604222295, "node_id": "MDU6SXNzdWU2MDQyMjIyOTU=", "number": 32, "title": "Issue comments don't appear to populate issues foreign key", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2020-04-21T19:17:32Z", "updated_at": "2020-04-22T01:17:44Z", "closed_at": "2020-04-22T01:17:44Z", "author_association": "MEMBER", "pull_request": null, "body": "https://github-to-sqlite.dogsheep.net/github?sql=select+html_url%2C+id%2C+issue+from+issue_comments+order+by+updated_at+desc+limit+101\r\n\r\n\"Screen\r\n", "repo": {"value": 207052882, "label": "github-to-sqlite"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/dogsheep/github-to-sqlite/issues/32/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 606720674, "node_id": "MDU6SXNzdWU2MDY3MjA2NzQ=", "number": 736, "title": "strange behavior using accented characters", "user": {"value": 30607, "label": "aborruso"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2020-04-25T08:34:51Z", "updated_at": "2020-04-28T06:09:28Z", "closed_at": "2020-04-27T18:59:16Z", "author_association": "NONE", "pull_request": null, "body": "Hi,\r\nwhen I search `incompatibilit\u00e0` [here](https://my-database.now.sh/commissioniComunePalermo/youtube), using full text search, it becomes `incompatibilit\u00c3\u0083\u00c2\u00a0` and I have no result.\r\n\r\nIf I encode the `\u00e0` char in the URL (`incompatibilit%C3%A0`) I have the right result.\r\n\r\n![image](https://user-images.githubusercontent.com/30607/80275201-00a79380-86e0-11ea-865e-f7e1474e8098.png)\r\n", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/736/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 607107849, "node_id": "MDExOlB1bGxSZXF1ZXN0NDA5MTUzODcw", "number": 739, "title": "Configuration directory mode", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2020-04-26T20:37:46Z", "updated_at": "2020-04-27T16:30:25Z", "closed_at": "2020-04-27T16:30:25Z", "author_association": "OWNER", "pull_request": "simonw/datasette/pulls/739", "body": "Refs #731\r\n\r\nTODO:\r\n\r\n- [x] Decide how to combine explicit command-line options with items detected from the directory structure\r\n- [x] Add unit tests\r\n- [x] Implement `inspect-data.json` mechanism for populating `immutables`\r\n- [x] Add documentation", "repo": {"value": 107914493, "label": "datasette"}, "type": "pull", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/739/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, 
\"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": 0, "state_reason": null}