{"id": 273944952, "node_id": "MDU6SXNzdWUyNzM5NDQ5NTI=", "number": 93, "title": "Package as standalone binary", "user": {"value": 67420, "label": "atomotic"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 18, "created_at": "2017-11-14T21:14:07Z", "updated_at": "2021-11-21T07:00:23Z", "closed_at": "2021-11-21T07:00:23Z", "author_association": "NONE", "pull_request": null, "body": "hint: more than the docker image a standalone and multiplatform binary (containing the app and the database) could be simpler to distribute.\r\n\r\ni would like to investigate the possibility to package everything with [pyinstaller](http://www.pyinstaller.org/) adding the database as a [data file](https://pythonhosted.org/PyInstaller/spec-files.html#adding-data-files)", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/93/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 274160723, "node_id": "MDU6SXNzdWUyNzQxNjA3MjM=", "number": 100, "title": "TemplateAssertionError: no filter named 'tojson'", "user": {"value": 13304454, "label": "coisnepe"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 2, "created_at": "2017-11-15T13:43:41Z", "updated_at": "2017-11-16T09:25:10Z", "closed_at": "2017-11-16T00:14:13Z", "author_association": "NONE", "pull_request": null, "body": "A 500 error is raised upon clicking on the name of a table on the homepage, say _http://0.0.0.0:8001/_ to _http://0.0.0.0:8001/test_check-c1f4771/users_ The API part seems to function as intended, though...\r\n\r\n```\r\n2017-11-15 14:33:57 - (sanic)[ERROR]: Traceback (most recent call last):\r\n File \"/usr/local/lib/python3.5/dist-packages/sanic/app.py\", line 503, in handle_request\r\n response = await response\r\n File \"/usr/local/lib/python3.5/dist-packages/datasette/app.py\", line 155, in get\r\n return await self.view_get(request, name, hash, **kwargs)\r\n File \"/usr/local/lib/python3.5/dist-packages/datasette/app.py\", line 219, in view_get\r\n **context,\r\n File \"/usr/local/lib/python3.5/dist-packages/sanic_jinja2/__init__.py\", line 84, in render\r\n return html(self.render_string(template, request, **context))\r\n File \"/usr/local/lib/python3.5/dist-packages/sanic_jinja2/__init__.py\", line 81, in render_string\r\n return self.env.get_template(template).render(**context)\r\n File \"/usr/lib/python3/dist-packages/jinja2/environment.py\", line 812, in get_template\r\n return self._load_template(name, self.make_globals(globals))\r\n File \"/usr/lib/python3/dist-packages/jinja2/environment.py\", line 786, in _load_template\r\n template = self.loader.load(self, name, globals)\r\n File \"/usr/lib/python3/dist-packages/jinja2/loaders.py\", line 125, in load\r\n code = environment.compile(source, name, filename)\r\n File \"/usr/lib/python3/dist-packages/jinja2/environment.py\", line 565, in compile\r\n self.handle_exception(exc_info, source_hint=source_hint)\r\n File \"/usr/lib/python3/dist-packages/jinja2/environment.py\", line 754, in handle_exception\r\n reraise(exc_type, exc_value, tb)\r\n File \"/usr/lib/python3/dist-packages/jinja2/_compat.py\", line 37, in reraise\r\n raise value.with_traceback(tb)\r\n File 
\"/usr/local/lib/python3.5/dist-packages/datasette/templates/table.html\", line 29, in template\r\n
params = {{ query.params|tojson(4) }}
\r\n File \"/usr/lib/python3/dist-packages/jinja2/environment.py\", line 515, in _generate\r\n return generate(source, self, name, filename, defer_init=defer_init)\r\n File \"/usr/lib/python3/dist-packages/jinja2/compiler.py\", line 62, in generate\r\n generator.visit(node)\r\n File \"/usr/lib/python3/dist-packages/jinja2/visitor.py\", line 38, in visit\r\n return f(node, *args, **kwargs)\r\n File \"/usr/lib/python3/dist-packages/jinja2/compiler.py\", line 849, in visit_Template\r\n self.blockvisit(block.body, block_frame)\r\n File \"/usr/lib/python3/dist-packages/jinja2/compiler.py\", line 492, in blockvisit\r\n self.visit(node, frame)\r\n File \"/usr/lib/python3/dist-packages/jinja2/visitor.py\", line 38, in visit\r\n return f(node, *args, **kwargs)\r\n File \"/usr/lib/python3/dist-packages/jinja2/compiler.py\", line 1172, in visit_If\r\n self.blockvisit(node.body, if_frame)\r\n File \"/usr/lib/python3/dist-packages/jinja2/compiler.py\", line 492, in blockvisit\r\n self.visit(node, frame)\r\n File \"/usr/lib/python3/dist-packages/jinja2/visitor.py\", line 38, in visit\r\n return f(node, *args, **kwargs)\r\n File \"/usr/lib/python3/dist-packages/jinja2/compiler.py\", line 1353, in visit_Output\r\n self.visit(argument, frame)\r\n File \"/usr/lib/python3/dist-packages/jinja2/visitor.py\", line 38, in visit\r\n return f(node, *args, **kwargs)\r\n File \"/usr/lib/python3/dist-packages/jinja2/compiler.py\", line 1565, in visit_Filter\r\n self.fail('no filter named %r' % node.name, node.lineno)\r\n File \"/usr/lib/python3/dist-packages/jinja2/compiler.py\", line 427, in fail\r\n raise TemplateAssertionError(msg, lineno, self.name, self.filename)\r\njinja2.exceptions.TemplateAssertionError: no filter named 'tojson'\r\n\r\n2017-11-15 14:33:57 - (network)[INFO][127.0.0.1:41316]: GET http://0.0.0.0:8001/test_check-c1f4771/users 500 144\r\n2017-11-15 14:33:57 - (network)[INFO][127.0.0.1:41316]: GET http://0.0.0.0:8001/favicon.ico 200 0\r\n```", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/100/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 274161964, "node_id": "MDU6SXNzdWUyNzQxNjE5NjQ=", "number": 101, "title": "TemplateAssertionError: no filter named 'tojson'", "user": {"value": 450244, "label": "eaubin"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2017-11-15T13:47:32Z", "updated_at": "2017-11-15T13:48:55Z", "closed_at": "2017-11-15T13:48:55Z", "author_association": "NONE", "pull_request": null, "body": "I get an exception clicking on the table link: \r\n\r\n```\r\n2017-11-15 08:40:10 - (sanic)[ERROR]: Traceback (most recent call last):\r\n File \"/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/sanic/app.py\", line 503, in handle_request\r\n response = await response\r\n File \"/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/datasette/app.py\", line 155, in get\r\n return await self.view_get(request, name, hash, **kwargs)\r\n File \"/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/datasette/app.py\", line 219, in view_get\r\n **context,\r\n File \"/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/sanic_jinja2/__init__.py\", line 84, in render\r\n return html(self.render_string(template, request, **context))\r\n File 
\"/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/sanic_jinja2/__init__.py\", line 81, in render_string\r\n return self.env.get_template(template).render(**context)\r\n File \"/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/jinja2/environment.py\", line 812, in get_template\r\n return self._load_template(name, self.make_globals(globals))\r\n File \"/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/jinja2/environment.py\", line 786, in _load_template\r\n template = self.loader.load(self, name, globals)\r\n File \"/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/jinja2/loaders.py\", line 125, in load\r\n code = environment.compile(source, name, filename)\r\n File \"/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/jinja2/environment.py\", line 565, in compile\r\n self.handle_exception(exc_info, source_hint=source_hint)\r\n File \"/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/jinja2/environment.py\", line 754, in handle_exception\r\n reraise(exc_type, exc_value, tb)\r\n File \"/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/jinja2/_compat.py\", line 37, in reraise\r\n raise value.with_traceback(tb)\r\n File \"/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/datasette/templates/table.html\", line 29, in template\r\n
params = {{ query.params|tojson(4) }}
\r\n File \"/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/jinja2/environment.py\", line 515, in _generate\r\n return generate(source, self, name, filename, defer_init=defer_init)\r\n File \"/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/jinja2/compiler.py\", line 62, in generate\r\n generator.visit(node)\r\n File \"/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/jinja2/visitor.py\", line 38, in visit\r\n return f(node, *args, **kwargs)\r\n File \"/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/jinja2/compiler.py\", line 849, in visit_Template\r\n self.blockvisit(block.body, block_frame)\r\n File \"/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/jinja2/compiler.py\", line 492, in blockvisit\r\n self.visit(node, frame)\r\n File \"/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/jinja2/visitor.py\", line 38, in visit\r\n return f(node, *args, **kwargs)\r\n File \"/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/jinja2/compiler.py\", line 1172, in visit_If\r\n self.blockvisit(node.body, if_frame)\r\n File \"/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/jinja2/compiler.py\", line 492, in blockvisit\r\n self.visit(node, frame)\r\n File \"/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/jinja2/visitor.py\", line 38, in visit\r\n return f(node, *args, **kwargs)\r\n File \"/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/jinja2/compiler.py\", line 1353, in visit_Output\r\n self.visit(argument, frame)\r\n File \"/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/jinja2/visitor.py\", line 38, in visit\r\n return f(node, *args, **kwargs)\r\n File \"/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/jinja2/compiler.py\", line 1565, in visit_Filter\r\n self.fail('no filter named %r' % node.name, node.lineno)\r\n File \"/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/jinja2/compiler.py\", line 427, in fail\r\n raise TemplateAssertionError(msg, lineno, self.name, self.filename)\r\njinja2.exceptions.TemplateAssertionError: no filter named 'tojson'\r\n```", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/101/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 276091279, "node_id": "MDU6SXNzdWUyNzYwOTEyNzk=", "number": 144, "title": "apsw as alternative sqlite3 binding (for full text search)", "user": {"value": 649467, "label": "mhalle"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2017-11-22T14:40:39Z", "updated_at": "2018-05-28T21:29:42Z", "closed_at": "2018-05-28T21:29:42Z", "author_association": "NONE", "pull_request": null, "body": "Hey there,\r\n\r\nHave you considered providing apsw support as an alternative to stock python sqlite3? I use apsw because it keeps up with sqlite3 and is straightforward to bring in extensions like FTS5. FTS really accelerates the kind of searching often done by web clients.\r\n\r\nI may be able to help (it shouldn't be much code), but there are a couple of stylistic questions that come up when supporting an optional package. Also, apsw is tricky in that it doesn't have a pypi package (author says limitations in providing options to setup.py). 
", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/144/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 276842536, "node_id": "MDU6SXNzdWUyNzY4NDI1MzY=", "number": 153, "title": "Ability to customize presentation of specific columns in HTML view", "user": {"value": 20264, "label": "ftrain"}, "state": "closed", "locked": 0, "assignee": null, "milestone": {"value": 2949431, "label": "Custom templates edition"}, "comments": 14, "created_at": "2017-11-26T17:46:11Z", "updated_at": "2017-12-10T02:08:45Z", "closed_at": "2017-12-07T06:17:33Z", "author_association": "NONE", "pull_request": null, "body": "This ties into https://github.com/simonw/datasette/issues/3 in some ways. It would be great to have some adaptability in the HTML views and to specific some columns as displaying in certain ways.\r\n\r\n- [x] 1. **Auto-parsing URIs into in-browser links.** Why? Lots of public data around cultural commons stuff links to a specific URL. This would be a great utility to turn on at the command line, just parse everything for URLs. Maybe they need to be underlined or represented in a different way than internal URLs.\r\n- [x] 2. **Ability to identify a column as plain/preformatted text.** Why? Was trying to import the Enron emails, the body collapses. Hard to read. These fields also tend to screw up the ability to scan a table view. If you knew it was text the system could set an `overflow` property on the relevant CSS, so you could still scan.\r\n- [x] 3. **Ability to identify a column as HTML.** Why? 
I want to spider some stuff and drop sections into SQLite, and just keep them as HTML.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/153/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 277589569, "node_id": "MDU6SXNzdWUyNzc1ODk1Njk=", "number": 155, "title": "A primary key column that has a foreign key restriction associated won't render the label column", "user": {"value": 388154, "label": "wsxiaoys"}, "state": "closed", "locked": 0, "assignee": null, "milestone": {"value": 2949431, "label": "Custom templates edition"}, "comments": 4, "created_at": "2017-11-29T00:40:02Z", "updated_at": "2017-12-07T05:39:53Z", "closed_at": "2017-12-07T05:39:53Z", "author_association": "NONE", "pull_request": null, "body": "", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/155/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 278814220, "node_id": "MDU6SXNzdWUyNzg4MTQyMjA=", "number": 161, "title": "Support WITH query ", "user": {"value": 388154, "label": "wsxiaoys"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 4, "created_at": "2017-12-03T20:00:40Z", "updated_at": "2017-12-08T06:18:12Z", "closed_at": "2017-12-04T04:52:41Z", "author_association": "NONE", "pull_request": null, "body": "Currently datasette fails with the error message: Statement must begin with SELECT\r\n\r\nExample query\r\n```sql\r\nWITH RECURSIVE\r\n cnt(x) AS (\r\n SELECT 1\r\n UNION ALL\r\n SELECT x+1 FROM cnt\r\n LIMIT 1000000\r\n )\r\nSELECT x FROM cnt;\r\n```", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/161/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 281110295, "node_id": "MDU6SXNzdWUyODExMTAyOTU=", "number": 173, "title": "I18n and L10n support", "user": {"value": 50138, "label": "janimo"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 2, "created_at": "2017-12-11T17:49:58Z", "updated_at": "2021-04-26T12:10:01Z", "closed_at": null, "author_association": "NONE", "pull_request": null, "body": "It would be less geeky and more user-friendly if the display strings in the filter menu and possibly other parts could be localized.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/173/reactions\", \"total_count\": 2, \"+1\": 2, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 282971961, "node_id": "MDU6SXNzdWUyODI5NzE5NjE=", "number": 175, "title": "Add project topic \"automatic-api\"", "user": 
{"value": 3179832, "label": "dbohdan"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2017-12-18T18:09:17Z", "updated_at": "2017-12-21T18:33:55Z", "closed_at": "2017-12-21T18:33:55Z", "author_association": "NONE", "pull_request": null, "body": "Hi there! Could you add the ~~tag~~ topic `automatic-api` to your repository? I am [making a list](https://github.com/dbohdan/automatic-api) of all projects that automatically expose APIs to databases. (Your Show HN made me do it. :-) I knew about PostgREST and PostGraphQL, but it took adding Datasette to sell me on the concept.) They will be easier to discover if there is a standard GitHub tag, and `automatic-api` seems as good a candidate as any. Two projects [already use it](https://github.com/topics/automatic-api).", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/175/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 285168503, "node_id": "MDU6SXNzdWUyODUxNjg1MDM=", "number": 176, "title": "Add GraphQL endpoint", "user": {"value": 173848, "label": "yozlet"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 8, "created_at": "2017-12-29T23:21:01Z", "updated_at": "2020-04-21T14:16:24Z", "closed_at": null, "author_association": "NONE", "pull_request": null, "body": "Would make it much easier to build React & similar frontends. Maybe with https://github.com/graphql-python/sanic-graphql ?", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/176/reactions\", \"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 289425975, "node_id": "MDExOlB1bGxSZXF1ZXN0MTYzNTYxODMw", "number": 181, "title": "add \"format sql\" button to query page, uses sql-formatter", "user": {"value": 1957344, "label": "bsmithgall"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 7, "created_at": "2018-01-17T21:50:04Z", "updated_at": "2019-11-11T03:08:25Z", "closed_at": "2019-11-11T03:08:25Z", "author_association": "NONE", "pull_request": "simonw/datasette/pulls/181", "body": "Cool project!\r\n\r\nThis fixes #136 using the suggested [sql formatter](https://github.com/zeroturnaround/sql-formatter) library. I included the minified version in the bundle and added the relevant scripts to the codemirror includes instead of adding new files, though I could also add new files. 
I wanted to keep it all together, since the result of the format needs access to the editor in order to properly update the codemirror instance.", "repo": {"value": 107914493, "label": "datasette"}, "type": "pull", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/181/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": 0, "state_reason": null} {"id": 292011379, "node_id": "MDU6SXNzdWUyOTIwMTEzNzk=", "number": 184, "title": "500 from missing table name", "user": {"value": 222245, "label": "carlmjohnson"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 4, "created_at": "2018-01-26T19:46:45Z", "updated_at": "2019-05-21T16:17:29Z", "closed_at": "2018-04-13T18:18:59Z", "author_association": "NONE", "pull_request": null, "body": "https://github.com/simonw/datasette/blob/56623e48da5412b25fb39cc26b9c743b684dd968/datasette/app.py#L517-L519 throws an error if it gets an empty list back. Simplest solution is to write a helper func that just says \r\n\r\n```python\r\nresult = list(await self.execute(name, sql, params))\r\nif result:\r\n return result[0][0]\r\n```\r\n\r\nand use it anywhere `[0][0]` is now.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/184/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 299760684, "node_id": "MDU6SXNzdWUyOTk3NjA2ODQ=", "number": 185, "title": "Metadata should be a nested arbitrary KV store", "user": {"value": 222245, "label": "carlmjohnson"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 12, "created_at": "2018-02-23T16:02:07Z", "updated_at": "2019-05-13T18:33:33Z", "closed_at": null, "author_association": "NONE", "pull_request": null, "body": "I started using the metadata feature and was surprised to find that values are not inherited from the root object down to specific databases and tables. This makes metadata much less useful and requires a lot of pointless duplication.\r\n\r\nIdeally, metadata should allow arbitrary key-value pairs, and there should be a way of accessing metadata either in an inherited or non-inherited manner. Something like `metadata.page.key` vs. 
`metadata.this.key` might work as an interface.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/185/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 306811513, "node_id": "MDU6SXNzdWUzMDY4MTE1MTM=", "number": 186, "title": "proposal new option to disable user agents cache", "user": {"value": 47107, "label": "stefanocudini"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2018-03-20T10:42:20Z", "updated_at": "2018-03-21T09:07:22Z", "closed_at": "2018-03-21T01:28:31Z", "author_association": "NONE", "pull_request": null, "body": "I think it would be very useful for debugging an option of adding headers to http replies\r\n\r\n```\r\nCache-Control: no-cache \r\n```\r\n\r\nespecially in the html output", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/186/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 309033998, "node_id": "MDU6SXNzdWUzMDkwMzM5OTg=", "number": 187, "title": "Windows installation error", "user": {"value": 11855322, "label": "robmarkcole"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 7, "created_at": "2018-03-27T16:04:37Z", "updated_at": "2019-06-15T21:44:23Z", "closed_at": "2019-06-15T21:44:23Z", "author_association": "NONE", "pull_request": null, "body": "On attempting install on a Win 7 PC with py 3.6.2 (Anaconda dist) I get the error:\r\n\r\n```\r\nCollecting uvloop>=0.5.3 (from Sanic==0.7.0->datasette)\r\n Downloading uvloop-0.9.1.tar.gz (1.8MB)\r\n 100% |\u00a6\u00a6\u00a6\u00a6\u00a6\u00a6\u00a6\u00a6\u00a6\u00a6\u00a6\u00a6\u00a6\u00a6\u00a6\u00a6\u00a6\u00a6\u00a6\u00a6\u00a6\u00a6\u00a6\u00a6\u00a6\u00a6\u00a6\u00a6\u00a6\u00a6\u00a6\u00a6| 1.8MB 12.8MB/s\r\n Complete output from command python setup.py egg_info:\r\n Traceback (most recent call last):\r\n File \"\", line 1, in \r\n File \"C:\\Users\\RCole\\AppData\\Local\\Temp\\pip-build-juakfqt8\\uvloop\\setup.py\r\n\", line 10, in \r\n raise RuntimeError('uvloop does not support Windows at the moment')\r\n RuntimeError: uvloop does not support Windows at the moment\r\n```\r\n", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/187/reactions\", \"total_count\": 4, \"+1\": 4, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 314665147, "node_id": "MDU6SXNzdWUzMTQ2NjUxNDc=", "number": 216, "title": "Bug: Sort by column with NULL in next_page URL", "user": {"value": 222245, "label": "carlmjohnson"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 15, "created_at": "2018-04-16T14:03:18Z", "updated_at": "2018-04-17T01:45:24Z", "closed_at": "2018-04-17T01:45:24Z", "author_association": "NONE", "pull_request": null, "body": "Copy-pasting from 
https://github.com/simonw/datasette/issues/189#issuecomment-381429213, since that issue is closed:\r\n\r\nI think I found a bug. I tried to sort by middle initial in my salaries set, and many middle initials are null. The `next_url` gets set by Datasette to:\r\n\r\nhttp://localhost:8001/salaries-d3a5631/2017+Maryland+state+salaries?_next=None%2C391&_sort=middle_initial\r\n\r\nBut then None is interpreted literally and it tries to find a name with the middle initial \"None\" and ends up skipping ahead to O on page 2.\r\n", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/216/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 319449852, "node_id": "MDU6SXNzdWUzMTk0NDk4NTI=", "number": 247, "title": "SQLite code decoupled from Datasette", "user": {"value": 11912854, "label": "jsancho-gpl"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2018-05-02T08:03:28Z", "updated_at": "2018-05-21T15:29:31Z", "closed_at": null, "author_association": "NONE", "pull_request": null, "body": "I'm working on the possibility of using Datasette with other file formats that aren't SQLite, like files with [PyTables](https://github.com/PyTables/PyTables) format.\r\n\r\nIn order to accomplish that, I've started [a fork for decoupling the code related to SQLite](https://github.com/jsancho-gpl/datasette/tree/feature/db-type-plugin) and putting it in an external connector to allow future connectors for a lot of file formats.\r\n\r\nIt'd be nice if you could look at it and suggest improvements for a possible PR.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/247/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 322283067, "node_id": "MDU6SXNzdWUzMjIyODMwNjc=", "number": 254, "title": "Escaping named parameters in canned queries", "user": {"value": 247131, "label": "philroche"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 4, "created_at": "2018-05-11T12:43:30Z", "updated_at": "2020-05-10T14:54:14Z", "closed_at": "2020-05-10T14:54:13Z", "author_association": "NONE", "pull_request": null, "body": "Thank you very much for this project.\r\n\r\nI have created some canned queries but some of the filters include a colon e.g. \"com.ubuntu.cloud:server:18.04:amd64\". When saved, these colons are parsed as named parameters. 
\r\n\r\nIs there a way to escape colons in a canned query?", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/254/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 322741659, "node_id": "MDExOlB1bGxSZXF1ZXN0MTg3NzcwMzQ1", "number": 258, "title": "Add new metadata key persistent_urls which removes the hash from all database urls", "user": {"value": 247131, "label": "philroche"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2018-05-14T09:39:18Z", "updated_at": "2018-05-21T07:38:15Z", "closed_at": "2018-05-21T07:38:15Z", "author_association": "NONE", "pull_request": "simonw/datasette/pulls/258", "body": "Add new metadata key \"persistent_urls\" which removes the hash from all database urls when set to \"true\"\r\n\r\nThis PR is just to gauge if this, or something like it, is something you would consider merging?\r\n\r\nI understand the reason why the substring of the hash is included in the url but\r\nthere are some use cases where the urls should persist across deployments. For bookmarks\r\nfor example or for scripts that use the JSON API.\r\n\r\nThis is the initial commit for this feature. Tests and documentation updates to follow.", "repo": {"value": 107914493, "label": "datasette"}, "type": "pull", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/258/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": 0, "state_reason": null} {"id": 325553991, "node_id": "MDExOlB1bGxSZXF1ZXN0MTg5ODYwMDUy", "number": 281, "title": "Reduces image size using Alpine + Multistage (re: #278)", "user": {"value": 487897, "label": "iMerica"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2018-05-23T05:27:05Z", "updated_at": "2018-05-26T02:10:38Z", "closed_at": "2018-05-26T02:10:38Z", "author_association": "NONE", "pull_request": "simonw/datasette/pulls/281", "body": "Hey Simon! \r\n\r\nI got the image size down from 256MB to 110MB. \r\n\r\nSeems to be working okay, but you might want to test it a bit more.\r\n\r\nExample output of `docker run --rm -it datasette`\r\n```\r\nServe! 
files=() on port 8001\r\n[2018-05-23 05:23:08 +0000] [1] [INFO] Goin' Fast @ http://127.0.0.1:8001\r\n[2018-05-23 05:23:08 +0000] [1] [INFO] Starting worker [1]\r\n```\r\n\r\nRelated: https://github.com/simonw/datasette/issues/278\r\n", "repo": {"value": 107914493, "label": "datasette"}, "type": "pull", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/281/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": 0, "state_reason": null} {"id": 330826972, "node_id": "MDU6SXNzdWUzMzA4MjY5NzI=", "number": 308, "title": "Support extra Heroku apps:create options - region, space, team", "user": {"value": 78156, "label": "annapowellsmith"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 2, "created_at": "2018-06-08T23:08:33Z", "updated_at": "2018-09-21T14:09:28Z", "closed_at": null, "author_association": "NONE", "pull_request": null, "body": "It would be useful to document how to pass Heroku CLI options on `datasette publish`, e.g. `--region eu`.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/308/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 333238932, "node_id": "MDU6SXNzdWUzMzMyMzg5MzI=", "number": 316, "title": "datasette inspect takes a very long time on large dbs", "user": {"value": 132230, "label": "gavinband"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 5, "created_at": "2018-06-18T11:56:27Z", "updated_at": "2019-05-11T18:26:25Z", "closed_at": "2019-05-11T18:26:25Z", "author_association": "NONE", "pull_request": null, "body": "Hi,\r\n\r\nI want to expose data in a very large sqlite database (~600Gb) to the web. I have used datasette with success on smaller test databases with the same schema - it works very well (thanks!). However, using the full db, both `datasette inspect` and `datasette serve` seem to hang or pause for a very long time (tens of minutes) on startup. Is this expected behaviour?\r\n\r\n(I noticed that the output of `datasette inspect` includes row counts for each table. 
Simply counting the rows in this db will take a long time (tens of millions of rows across each of ~10 tables), so I wondered if this is the source of the problem.)\r\n\r\nAny help on a workaround would be appreciated.\r\n", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/316/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 334190959, "node_id": "MDU6SXNzdWUzMzQxOTA5NTk=", "number": 321, "title": "Wildcard support in query parameters", "user": {"value": 12617395, "label": "bsilverm"}, "state": "closed", "locked": 0, "assignee": null, "milestone": {"value": 3439337, "label": "0.23.1"}, "comments": 8, "created_at": "2018-06-20T18:03:56Z", "updated_at": "2018-06-21T17:00:10Z", "closed_at": "2018-06-21T04:55:26Z", "author_association": "NONE", "pull_request": null, "body": "I haven't found a way to get the wildcard (%) inserted automatically in to a query parameter. This would be useful for cases the query parameter is followed by a LIKE clause. Wrapping the parameter name using the wildcard character within the metadata file (ie - ...where xyz like %:querystring%) does not seem to work. Can this be made possible? Or if not, can the template be extended to provide a tip to the user that they need to insert the wildcard characters themselves?", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/321/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 334592281, "node_id": "MDExOlB1bGxSZXF1ZXN0MTk2NTI2ODYx", "number": 322, "title": "Feature/in operator", "user": {"value": 2691848, "label": "4e1e0603"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 0, "created_at": "2018-06-21T17:41:51Z", "updated_at": "2018-06-21T17:45:25Z", "closed_at": "2018-06-21T17:45:25Z", "author_association": "NONE", "pull_request": "simonw/datasette/pulls/322", "body": "", "repo": {"value": 107914493, "label": "datasette"}, "type": "pull", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/322/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": 0, "state_reason": null} {"id": 339095976, "node_id": "MDU6SXNzdWUzMzkwOTU5NzY=", "number": 334, "title": "extra_options not passed to heroku publisher", "user": {"value": 719357, "label": "kamicut"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 2, "created_at": "2018-07-06T23:26:12Z", "updated_at": "2018-07-24T04:53:21Z", "closed_at": "2018-07-10T01:46:04Z", "author_association": "NONE", "pull_request": null, "body": "I might be wrong but I was not able to publish to `heroku` with `--extra-options`, I think `extra_options` is not being used in this function [here](https://github.com/simonw/datasette/blob/master/datasette/utils.py#L369). \r\n\r\nAny help appreciated! 
", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/334/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 340396247, "node_id": "MDU6SXNzdWUzNDAzOTYyNDc=", "number": 339, "title": "Expose SANIC_RESPONSE_TIMEOUT config option in a sensible way", "user": {"value": 12617395, "label": "bsilverm"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 4, "created_at": "2018-07-11T20:38:06Z", "updated_at": "2022-03-21T22:22:40Z", "closed_at": "2022-03-21T22:22:34Z", "author_association": "NONE", "pull_request": null, "body": "Is it possible to configure the sql_time_limit_ms beyond 60 seconds? It seems queries are still timing out at 60 seconds when sql_time_limit_ms is set to 180000. We have a very large data set and often encounter timeouts when testing new queries from the datasette UI. We are optimizing our database as much as we can, but still may require more than 60 seconds for complex queries.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/339/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 341123355, "node_id": "MDU6SXNzdWUzNDExMjMzNTU=", "number": 342, "title": "Requesting support for query description", "user": {"value": 12617395, "label": "bsilverm"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 4, "created_at": "2018-07-13T18:50:16Z", "updated_at": "2018-07-24T04:53:21Z", "closed_at": "2018-07-16T02:33:54Z", "author_association": "NONE", "pull_request": null, "body": "It would be great if the metadata file allowed you to enter a description for the query. We have a lot of pre-defined queries that can only be so descriptive by their name. It would be nice if an optional description could be included underneath the name within the UI, or on hover where it currently shows the SQL.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/342/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 343728754, "node_id": "MDU6SXNzdWUzNDM3Mjg3NTQ=", "number": 346, "title": "Logo design for DATASETTE", "user": {"value": 35750428, "label": "ggabogarcia"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 0, "created_at": "2018-07-23T17:40:17Z", "updated_at": "2018-08-02T02:31:59Z", "closed_at": "2018-08-02T02:31:59Z", "author_association": "NONE", "pull_request": null, "body": "Hello :) , I'm a graphic designer, I'm interested in collaborating with open source projects, besides this helps me expand my portfolio. I would like to design a logo for your project. I will be happy to collaborate with you :). 
", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/346/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 352768017, "node_id": "MDU6SXNzdWUzNTI3NjgwMTc=", "number": 362, "title": "Add option to include/exclude columns in search filters", "user": {"value": 78156, "label": "annapowellsmith"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2018-08-22T01:32:08Z", "updated_at": "2020-11-03T19:01:59Z", "closed_at": null, "author_association": "NONE", "pull_request": null, "body": "I have a dataset with many columns, of which only some are likely to be of interest for searching.\r\n\r\nIt would be great for usability if the search filters in the UI could be configured to include/exclude columns.\r\n\r\nSee also: https://github.com/simonw/datasette/issues/292", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/362/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 392610803, "node_id": "MDU6SXNzdWUzOTI2MTA4MDM=", "number": 391, "title": "Google Trends example doesn\u2019t work", "user": {"value": 229881, "label": "styfle"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2018-12-19T13:51:38Z", "updated_at": "2019-01-02T19:45:13Z", "closed_at": "2019-01-02T19:45:12Z", "author_association": "NONE", "pull_request": null, "body": "https://google-trends.datasettes.com/\r\n\r\nI see a cloud flare error. ", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/391/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 395236066, "node_id": "MDU6SXNzdWUzOTUyMzYwNjY=", "number": 393, "title": "CSV export in \"Advanced export\" pane doesn't respect query", "user": {"value": 1727065, "label": "ltrgoddard"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 6, "created_at": "2019-01-02T12:39:41Z", "updated_at": "2021-06-17T18:14:24Z", "closed_at": "2019-01-03T02:44:10Z", "author_association": "NONE", "pull_request": null, "body": "It looks like there's an inconsistency when exporting to CSV via the the web interface. Say I'm looking at [songs released in 1989](https://fivethirtyeight.datasettes.com/fivethirtyeight-c300360/classic-rock%2Fclassic-rock-song-list?Release+Year__exact=1989) in the `classic-rock/classic-rock-song-list` table from the Five Thirty Eight data. The JSON and CSV export links at the top of the page both give me filtered data using `Release+Year__exact=1989` in the URL. 
In the `Advanced export` tab, though, the CSV option gives me the whole data set, while the JSON options preserve the query.\r\n\r\nIt may be that this is intended behaviour related to the streaming CSV stuff [discussed here](https://github.com/simonw/datasette/issues/266), but if that's the case then I think it should be a little clearer.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/393/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 397129564, "node_id": "MDU6SXNzdWUzOTcxMjk1NjQ=", "number": 397, "title": "Update official datasetteproject/datasette Docker container to SQLite 3.26.0", "user": {"value": 43564, "label": "claes"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2019-01-08T22:51:50Z", "updated_at": "2019-01-11T01:25:33Z", "closed_at": "2019-01-11T00:56:18Z", "author_association": "NONE", "pull_request": null, "body": "I try to start datasette on a database that contains the below view\r\n\r\nIt fails in a way that makes me think it does not support the window functions SQL syntax.\r\n\r\n\r\n```\r\ncreate view general_ledger as\r\nselect transactions.account_number, strftime(\"%Y-%m-%d\", verifications.verification_date) as verification_date, verifications.verification_number, verifications.verification_text, \r\ncase when transactions.centi_amount >= 0 and verifications.verification_number > 0 then printf(\"%.2f\", (transactions.centi_amount/100.0))\r\nend as debit,\r\ncase when transactions.centi_amount <= 0 and verifications.verification_number > 0 then printf(\"%.2f\", (transactions.centi_amount/100.0)) \r\nend as credit,\r\nprintf(\"%.2f\", sum(transactions.centi_amount) over (partition by transactions.account_number \r\n order by verifications.verification_number range between unbounded preceding and current row)/100.0)\r\nfrom verifications inner join transactions on transactions.verification_id = verifications.id \r\norder by transactions.account_number, verifications.verification_number;\r\n```\r\n\r\n```\r\ndocker run -p 8001:8001 -v `pwd`:/mnt datasetteproject/datasette datasette -p 8001 -h 0.0.0.0 /mnt/ledger.db\r\nServe! 
files=('/mnt/ledger.db',) on port 8001\r\nTraceback (most recent call last):\r\n File \"/usr/local/bin/datasette\", line 11, in \r\n sys.exit(cli())\r\n File \"/usr/local/lib/python3.6/site-packages/click/core.py\", line 722, in __call__\r\n return self.main(*args, **kwargs)\r\n File \"/usr/local/lib/python3.6/site-packages/click/core.py\", line 697, in main\r\n rv = self.invoke(ctx)\r\n File \"/usr/local/lib/python3.6/site-packages/click/core.py\", line 1066, in invoke\r\n return _process_result(sub_ctx.command.invoke(sub_ctx))\r\n File \"/usr/local/lib/python3.6/site-packages/click/core.py\", line 895, in invoke\r\n return ctx.invoke(self.callback, **ctx.params)\r\n File \"/usr/local/lib/python3.6/site-packages/click/core.py\", line 535, in invoke\r\n return callback(*args, **kwargs)\r\n File \"/usr/local/lib/python3.6/site-packages/datasette/cli.py\", line 375, in serve\r\n ds.inspect()\r\n File \"/usr/local/lib/python3.6/site-packages/datasette/app.py\", line 308, in inspect\r\n \"views\": inspect_views(conn),\r\n File \"/usr/local/lib/python3.6/site-packages/datasette/inspect.py\", line 30, in inspect_views\r\n return [v[0] for v in conn.execute('select name from sqlite_master where type = \"view\"')]\r\nsqlite3.DatabaseError: malformed database schema (general_ledger) - near \"over\": syntax error\r\n\r\n```", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/397/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 400229984, "node_id": "MDU6SXNzdWU0MDAyMjk5ODQ=", "number": 401, "title": "How to pass configuration to plugins?", "user": {"value": 1055831, "label": "dazzag24"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2019-01-17T11:20:41Z", "updated_at": "2019-01-18T11:48:13Z", "closed_at": "2019-01-18T06:49:07Z", "author_association": "NONE", "pull_request": null, "body": "Hi,\r\nFirstly, thanks for your work on datasette, it is a hugely useful tool!\r\n\r\nI've been working on a fork [https://github.com/dazzag24/datasette-cluster-map] of datasette-cluster-map to allow the tileserver to be easily switched. Primarily because the tiles being served in the current version use localised text for labels and I'd like to have English used for these names instead.\r\n\r\nIt uses http://leaflet-extras.github.io/leaflet-providers/preview/ to allow you to simply set the tile provider using a call like so:\r\n``` \r\nlet tiles = L.tileLayer.provider('Esri.WorldTopoMap');\r\n```\r\ninstead of the current:\r\n```\r\nlet tiles = L.tileLayer('https://{s}.tile.openstreetmap.org/{z}/{x}/{y}.png', {\r\n maxZoom: 19,\r\n detectRetina: true,\r\n attribution: '© OpenStreetMap contributors'\r\n }),\r\n ```\r\nHowever I've got stuck in trying to work out how to pass the provider string to the plugin.\r\nIn the documentation: https://datasette.readthedocs.io/en/stable/plugins.html you discuss configuration of plugins and use an example of passing in which latitude and longitude columns should be used. However I cannot seem to see anywhere in the current datasette-cluster-map code where these config params are passed in or used.\r\n\r\nCan you please point me to an example or how to pass configuration from the metadata.json down into a plugin. 
Once I've overcome this issue I was wondering if you would be interested in taking this change into your version?\r\n\r\nMany thanks\r\nDarren", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/401/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 400511206, "node_id": "MDU6SXNzdWU0MDA1MTEyMDY=", "number": 403, "title": "How does persistence work?", "user": {"value": 1794527, "label": "ccorcos"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 2, "created_at": "2019-01-17T23:41:57Z", "updated_at": "2019-01-19T05:47:55Z", "closed_at": "2019-01-18T06:51:14Z", "author_association": "NONE", "pull_request": null, "body": "I was under the impression that now.sh is for stateless microservices. So where are these SQLite databases stored and when do they get created and destroyed?", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/403/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 403922644, "node_id": "MDU6SXNzdWU0MDM5MjI2NDQ=", "number": 8, "title": "Problems handling column names containing spaces or - ", "user": {"value": 82988, "label": "psychemedia"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2019-01-28T17:23:28Z", "updated_at": "2019-04-14T15:29:33Z", "closed_at": "2019-02-23T21:09:03Z", "author_association": "NONE", "pull_request": null, "body": "Irrespective of whether using column names containing a space or - character is good practice, SQLite does allow it, but `sqlite-utils` throws an error in the following cases:\r\n\r\n```python\r\nimport sqlite3\r\nfrom sqlite_utils import Database\r\n\r\ndbname = 'test.db'\r\nDB = Database(sqlite3.connect(dbname))\r\n\r\nimport pandas as pd\r\ndf = pd.DataFrame({'col1':range(3), 'col2':range(3)})\r\n\r\n#Convert pandas dataframe to appropriate list/dict format\r\nDB['test1'].insert_all( df.to_dict(orient='records') )\r\n#Works fine\r\n```\r\n\r\nHowever:\r\n\r\n```python\r\ndf = pd.DataFrame({'col 1':range(3), 'col2':range(3)})\r\nDB['test1'].insert_all(df.to_dict(orient='records'))\r\n```\r\n\r\nthrows:\r\n\r\n```\r\n---------------------------------------------------------------------------\r\nOperationalError Traceback (most recent call last)\r\n in ()\r\n 1 import pandas as pd\r\n 2 df = pd.DataFrame({'col 1':range(3), 'col2':range(3)})\r\n----> 3 DB['test1'].insert_all(df.to_dict(orient='records'))\r\n\r\n/usr/local/lib/python3.7/site-packages/sqlite_utils/db.py in insert_all(self, records, pk, foreign_keys, upsert, batch_size, column_order)\r\n 327 jsonify_if_needed(record.get(key, None)) for key in all_columns\r\n 328 )\r\n--> 329 result = self.db.conn.execute(sql, values)\r\n 330 self.db.conn.commit()\r\n 331 self.last_id = result.lastrowid\r\n\r\nOperationalError: near \"1\": syntax error\r\n```\r\n\r\nand:\r\n\r\n```python\r\ndf = pd.DataFrame({'col-1':range(3), 'col2':range(3)})\r\nDB['test1'].upsert_all(df.to_dict(orient='records'))\r\n```\r\n\r\nresults 
in:\r\n\r\n```\r\n---------------------------------------------------------------------------\r\nOperationalError Traceback (most recent call last)\r\n in ()\r\n 1 import pandas as pd\r\n 2 df = pd.DataFrame({'col-1':range(3), 'col2':range(3)})\r\n----> 3 DB['test1'].insert_all(df.to_dict(orient='records'))\r\n\r\n/usr/local/lib/python3.7/site-packages/sqlite_utils/db.py in insert_all(self, records, pk, foreign_keys, upsert, batch_size, column_order)\r\n 327 jsonify_if_needed(record.get(key, None)) for key in all_columns\r\n 328 )\r\n--> 329 result = self.db.conn.execute(sql, values)\r\n 330 self.db.conn.commit()\r\n 331 self.last_id = result.lastrowid\r\n\r\nOperationalError: near \"-\": syntax error\r\n```", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/8/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 407174173, "node_id": "MDU6SXNzdWU0MDcxNzQxNzM=", "number": 408, "title": "Show metadata info (e.g. license, source) on custom SQL query pages", "user": {"value": 78356, "label": "stefanw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 0, "created_at": "2019-02-06T10:43:34Z", "updated_at": "2019-10-14T03:53:22Z", "closed_at": "2019-10-14T03:53:22Z", "author_association": "NONE", "pull_request": null, "body": "Currently metadata info is not displayed on custom SQL pages.\r\n\r\nE.g. compare the footer of [this normal table page](https://register-of-members-interests.datasettes.com/regmem-98dc8b7/categories) with the footer [this custom SQL page](https://register-of-members-interests.datasettes.com/regmem-98dc8b7?sql=select+*+from+categories).\r\n\r\nThis is important in order to adhere to attribution license requirements.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/408/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 408376825, "node_id": "MDU6SXNzdWU0MDgzNzY4MjU=", "number": 409, "title": "Zeit API v1 does not work for new users - need to migrate to v2", "user": {"value": 209967, "label": "michaelmcandrew"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2019-02-09T00:50:33Z", "updated_at": "2020-04-06T15:44:46Z", "closed_at": "2020-04-06T15:44:46Z", "author_association": "NONE", "pull_request": null, "body": "Hello there,\r\n\r\nThis looks like a great tool. Thanks. \r\n\r\nUnfortunately, I hit the following error:\r\n\r\n```\r\nmichael@hazel ~/src/cc-datasette/data/out datasette publish now cc-datasette.db\r\n> WARN! You are using an old version of the Now Platform. More: https://zeit.co/docs/v1-upgrade\r\n> Deploying /tmp/tmpjtrxwsyf/datasette under michaelmcandrew\r\n> Using project datasette\r\n> Error! You tried to create a Now 1.0 deployment. 
Please use Now 2.0 instead: https://zeit.co/upgrade\r\n```\r\nI'm guessing you might not hit this because you are not a 'new user' of Zeit (https://github.com/zeit/now-cli/issues/1805#issuecomment-452470953).\r\n\r\nWould it be a lot of work to upgrade to the new Zeit API, do you think?", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/409/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 408518024, "node_id": "MDU6SXNzdWU0MDg1MTgwMjQ=", "number": 410, "title": "How to setup a multi database environment?", "user": {"value": 30607, "label": "aborruso"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2019-02-10T09:39:24Z", "updated_at": "2019-04-12T04:42:28Z", "closed_at": "2019-04-12T04:42:27Z", "author_association": "NONE", "pull_request": null, "body": "Hi,\r\nfirst of all I need to write that Simon Willison and datasette are really great.\r\n\r\nI probably have a stupid question, but it seems to me that the answer is not in the documentation.\r\n\r\nI have installed datasette and run it with `datasette mydb.db`, and I can reach it on `http://127.0.0.1:8001`.\r\n\r\nBut how do I work with more than one db? Imagine I have ten sqlite databases, and that I need to explore/query these via datasette; how do I run datasette? Is it possible to create a sort of db index and then run `datasette serve myindex`?\r\n\r\nThank you", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/410/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 410384988, "node_id": "MDU6SXNzdWU0MTAzODQ5ODg=", "number": 411, "title": "How to pass named parameter into spatialite MakePoint() function", "user": {"value": 1055831, "label": "dazzag24"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2019-02-14T16:30:22Z", "updated_at": "2023-10-25T13:23:04Z", "closed_at": "2019-05-05T12:25:04Z", "author_association": "NONE", "pull_request": null, "body": "Hi,\r\ndatasette version: \"0.26.2\"\r\nextensions: \r\n spatialite: \"4.4.0-RC0\"\r\nsqlite version: \"3.22.0\"\r\n\r\nI have a table of airports with latitude and longitude columns. I've added spatialite (with KNN support). 
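A note on the multi-database question in #410 just above: `datasette serve` already accepts any number of database files on the command line, e.g. `datasette one.db two.db three.db`, so no index database is needed.

On the named-parameter question raised in #411 (which continues below): SQLite binds `:Long` and `:Lat` itself, so they can be passed straight into `MakePoint()` with no string concatenation. A minimal sketch from plain Python, assuming the `mod_spatialite` extension is installed and loadable; the coordinates are made up for illustration:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.enable_load_extension(True)       # requires a Python build that allows loadable extensions
conn.load_extension("mod_spatialite")  # extension name/path varies by platform

# Named parameters work inside function calls; no || concatenation needed.
row = conn.execute(
    "SELECT AsText(MakePoint(:Long, :Lat, 4326))",
    {"Long": -0.4614, "Lat": 51.4775},
).fetchone()
print(row[0])  # POINT(-0.4614 51.4775)
```

If that works standalone, the canned query quoted below uses the same `:Long`/`:Lat` syntax, so the problem may lie elsewhere (for instance in how the KNN virtual table handles bound values) rather than in the parameter templating itself.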
After creating the db using csvs-to-sqlite, I run these commands to set up the spatialite tables:\r\n\r\n```\r\nconn.execute('SELECT InitSpatialMetadata(1)')\r\n\r\nconn.execute(\"SELECT AddGeometryColumn('airports', 'point_geom', 4326, 'POINT', 2);\")\r\n\r\nconn.execute('''UPDATE airports SET point_geom = GeomFromText('POINT('||\"longitude\"||' '||\"latitude\"||')',4326);''')\r\n\r\nconn.execute(\"SELECT CreateSpatialIndex('airports', 'point_geom');\")\r\n```\r\n\r\nI'm attempting to create a canned query and have this in my metadata.json file:\r\n```\r\n\"find_airports_nearest_to_point\":{\r\n \"sql\":\"SELECT a.pos AS rank, b.id, b.name, b.country, b.latitude AS latitude, b.longitude AS longitude, a.distance / 1000.0 AS dist_km FROM KNN AS a JOIN airports AS b ON (b.rowid = a.fid) WHERE f_table_name = \\\"airports\\\" AND ref_geometry = MakePoint( :Long , :Lat ) AND max_items = 10;\"}\r\n```\r\nwhich doesn't seem to bind the named parameters correctly, and I get no results. \r\n\r\nHave also tried:\r\n```\r\nMakePoint( || :Long || , || :Lat || )\r\n```\r\nwhich returns this error:\r\n```\r\nnear \"||\": syntax error\r\n```\r\n\r\nHowever I cannot seem to find the correct combination of named parameter syntax (:Lat) or sqlite concatenation operator to make it work. Any ideas if using named parameters inside functions is supported?\r\n\r\nThanks\r\nDarren", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/411/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 411066700, "node_id": "MDU6SXNzdWU0MTEwNjY3MDA=", "number": 10, "title": "Error in upsert if column named 'order'", "user": {"value": 82988, "label": "psychemedia"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2019-02-16T12:05:18Z", "updated_at": "2019-02-24T16:55:38Z", "closed_at": "2019-02-24T16:55:37Z", "author_association": "NONE", "pull_request": null, "body": "The following works fine:\r\n```\r\nimport sqlite3\r\n\r\nimport pandas as pd\r\nfrom sqlite_utils import Database\r\n\r\nconnX = sqlite3.connect('DELME.db', timeout=10)\r\n\r\ndfX=pd.DataFrame({'col1':range(3),'col2':range(3)})\r\nDBX = Database(connX)\r\nDBX['test'].upsert_all(dfX.to_dict(orient='records'))\r\n```\r\n\r\nBut if a column is named `order`:\r\n```\r\nconnX = sqlite3.connect('DELME.db', timeout=10)\r\n\r\ndfX=pd.DataFrame({'order':range(3),'col2':range(3)})\r\nDBX = Database(connX)\r\nDBX['test'].upsert_all(dfX.to_dict(orient='records'))\r\n```\r\n\r\nit throws an error:\r\n\r\n```\r\n---------------------------------------------------------------------------\r\nOperationalError Traceback (most recent call last)\r\n in \r\n 3 dfX=pd.DataFrame({'order':range(3),'col2':range(3)})\r\n 4 DBX = Database(connX)\r\n----> 5 DBX['test'].upsert_all(dfX.to_dict(orient='records'))\r\n\r\n/usr/local/lib/python3.7/site-packages/sqlite_utils/db.py in upsert_all(self, records, pk, foreign_keys, column_order)\r\n 347 foreign_keys=foreign_keys,\r\n 348 upsert=True,\r\n--> 349 column_order=column_order,\r\n 350 )\r\n 351 \r\n\r\n/usr/local/lib/python3.7/site-packages/sqlite_utils/db.py in insert_all(self, records, pk, foreign_keys, upsert, batch_size, column_order)\r\n 327 jsonify_if_needed(record.get(key, None)) for key in all_columns\r\n 328 )\r\n--> 329 result = 
self.db.conn.execute(sql, values)\r\n 330 self.db.conn.commit()\r\n 331 self.last_id = result.lastrowid\r\n\r\nOperationalError: near \"order\": syntax error\r\n```", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/10/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 411257981, "node_id": "MDU6SXNzdWU0MTEyNTc5ODE=", "number": 412, "title": "Linked Data(sette)", "user": {"value": 43340, "label": "sfkeller"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 2, "created_at": "2019-02-18T00:38:14Z", "updated_at": "2019-03-19T10:09:46Z", "closed_at": null, "author_association": "NONE", "pull_request": null, "body": "I've a radical feature idea (possibly first as an extension in order to experiment?): \r\n\r\nI'd like to link to a remote table from a remote database, e.g. with a function \"linked_datasette()\". So one could do the following query:\r\n```\r\nSELECT foo.id, foo.a, remote_party.b\r\nFROM foo\r\nJOIN linked_datasette(\"https://parlgov.datasettes.com/parlgov-b42a2f2\") AS remote_party \r\n ON foo.id=remote_party.id\r\n```\r\nThis is inspired by SPARQL's SERVICE keyword for remote RDF \"endpoints\".\r\n\r\nThere's a foundation in the SQL Standard called SQL/MED (https://rhaas.blogspot.com/2011/01/why-sqlmed-is-cool.html ).\r\n\r\nAnd here's an implementation from me in Postgres FDW to connect another Postgres \"endpoint\": https://pastebin.com/Fz2v64Cz .", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/412/reactions\", \"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 418329842, "node_id": "MDU6SXNzdWU0MTgzMjk4NDI=", "number": 415, "title": "Add query parameter to hide SQL textarea", "user": {"value": 36796532, "label": "ad-si"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2019-03-07T14:11:30Z", "updated_at": "2019-03-15T09:30:57Z", "closed_at": "2019-03-15T05:22:43Z", "author_association": "NONE", "pull_request": null, "body": "It would be cool if there was a query parameter to hide / remove the SQL textarea. 
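For what it's worth on #415: if I recall the resolution correctly, Datasette gained a `_hide_sql=1` query-string parameter for exactly this, so a bookmarkable URL along the lines of `http://127.0.0.1:8001/mydb?sql=select+*+from+mytable&_hide_sql=1` (illustrative URL) shows the results without the textarea.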
Then I could simply save a bookmark for a certain query and open it to see the data without having to scroll below the (long) SQL query first.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/415/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 432727685, "node_id": "MDU6SXNzdWU0MzI3Mjc2ODU=", "number": 20, "title": "JSON column values get extraneously quoted ", "user": {"value": 649467, "label": "mhalle"}, "state": "closed", "locked": 0, "assignee": null, "milestone": {"value": 4348046, "label": "1.0"}, "comments": 1, "created_at": "2019-04-12T20:15:30Z", "updated_at": "2019-05-25T00:57:19Z", "closed_at": "2019-05-25T00:57:19Z", "author_association": "NONE", "pull_request": null, "body": "If the input to `sqlite-utils insert` includes a column that is a JSON array or object, `sqlite-utils query` will introduce an extra level of quoting on output:\r\n\r\n```\r\n# echo '[{\"key\": [\"one\", \"two\", \"three\"]}]' | sqlite-utils insert t.db t -\r\n\r\n# sqlite-utils t.db 'select * from t'\r\n[{\"key\": \"[\\\"one\\\", \\\"two\\\", \\\"three\\\"]\"}]\r\n\r\n# sqlite3 t.db 'select * from t'\r\n[\"one\", \"two\", \"three\"]\r\n```\r\n\r\nThis might require an imperfect solution, since sqlite3 doesn't have a JSON type. Perhaps fields that start with `[\"` or `{\"` and end with `\"]` or `\"}` could be detected, with a flag to turn off that behavior for weird text fields (or vice versa).", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/20/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 435819321, "node_id": "MDU6SXNzdWU0MzU4MTkzMjE=", "number": 436, "title": "400 Error when trying to register new user via https://publish.datasettes.com/", "user": {"value": 317694, "label": "nniiicc"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2019-04-22T17:55:00Z", "updated_at": "2021-01-04T20:15:42Z", "closed_at": "2021-01-04T20:15:41Z", "author_association": "NONE", "pull_request": null, "body": "Behavior: When registering a new user via Zeit - confirmation is sent and screen acknowledges registered user... When clicking grant access the next screen is a white 400 error message. 
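Stepping back to #8 and #10 above: both errors share a root cause. The generated INSERT statement interpolated column names bare, so a space, a hyphen, or a reserved word like `order` produced invalid SQL. The remedy is identifier quoting, which SQLite supports with `"double quotes"` (or `[brackets]`, which I believe is what sqlite-utils itself adopted). A minimal sketch of the idea; `escape_identifier` is a hypothetical helper, not sqlite-utils API:

```python
import sqlite3


def escape_identifier(name):
    # Standard SQL identifier quoting: wrap in double quotes and
    # double any embedded double quotes.
    return '"{}"'.format(name.replace('"', '""'))


conn = sqlite3.connect(":memory:")
columns = ["col 1", "col-1", "order"]
conn.execute(
    "CREATE TABLE test1 ({})".format(
        ", ".join("{} INTEGER".format(escape_identifier(c)) for c in columns)
    )
)
conn.execute("INSERT INTO test1 VALUES (?, ?, ?)", (1, 2, 3))
print(conn.execute('SELECT "col 1", "col-1", "order" FROM test1').fetchall())
# [(1, 2, 3)]
```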
\r\n\r\nReplicated: Chrome and Firefox; 2 different email accounts", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/436/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 448189298, "node_id": "MDU6SXNzdWU0NDgxODkyOTg=", "number": 486, "title": "Ability to add extra routes and related templates", "user": {"value": 2181410, "label": "clausjuhl"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 2, "created_at": "2019-05-24T14:04:25Z", "updated_at": "2019-05-24T14:43:28Z", "closed_at": "2019-05-24T14:43:09Z", "author_association": "NONE", "pull_request": null, "body": "Hi Simon\r\n\r\nThanks for an excellent job! Datasette is such an obviously good idea (once you have that idea!) and so well done. The only thing that I miss is the ability to add extra routes (with associated jinja2-templates). For most of the datasets that I would like to publish, I would also like at least a page that describes the data (semantics, provenance, biases...) and a page explaining our cookie- and privacy-policies (which would allow us to use something like Google Analytics).\r\n", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/486/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 449818897, "node_id": "MDU6SXNzdWU0NDk4MTg4OTc=", "number": 24, "title": "Additional Column Constraints?", "user": {"value": 98555, "label": "IgnoredAmbience"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 6, "created_at": "2019-05-29T13:47:03Z", "updated_at": "2019-06-13T06:47:17Z", "closed_at": "2019-06-13T06:30:26Z", "author_association": "NONE", "pull_request": null, "body": "I'm looking to import data from XML with a pre-defined schema that maps fairly closely to a relational database.\r\nIn particular, it has explicit annotations for when fields are required, optional, or when a default value should be inferred.\r\n\r\nWould there be value in adding the ability to define `NOT NULL` and `DEFAULT` column constraints to sqlite-utils?", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/24/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 450862577, "node_id": "MDU6SXNzdWU0NTA4NjI1Nzc=", "number": 496, "title": "Additional options to gcloud build command in cloudrun - timeout", "user": {"value": 1740337, "label": "costrouc"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2019-05-31T15:43:55Z", "updated_at": "2019-05-31T23:05:05Z", "closed_at": "2019-05-31T23:05:05Z", "author_association": "NONE", "pull_request": null, "body": "I am trying to deploy a 3.1 GB dataset to cloudrun with datasette. 
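On the Cloud Run build timeout described in #496 (its details continue just below): Google Cloud Build's default timeout is ten minutes, and it can be raised without any Datasette change, either per build with `gcloud builds submit --timeout=3600 ...` or globally with `gcloud config set builds/timeout 3600`, so a pass-through `--timeout` option on `datasette publish cloudrun` would essentially wrap those.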
Currently the docker build times out. It would be nice to have a timeout flag or additional gcloud commands that could be specified. \r\n\r\nHere is the line https://github.com/simonw/datasette/blob/f825e2012109247fa246e2b938f8174069e574f1/datasette/publish/cloudrun.py#L78\r\n\r\nI would be happy to submit a PR to allow for a timeout option. What are your thoughts on allowing the user additional build/publish flag options?", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/496/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 451513541, "node_id": "MDU6SXNzdWU0NTE1MTM1NDE=", "number": 498, "title": "Full text search of all tables at once?", "user": {"value": 7936571, "label": "chrismp"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 12, "created_at": "2019-06-03T14:24:43Z", "updated_at": "2020-05-30T17:26:02Z", "closed_at": "2020-05-30T17:26:02Z", "author_association": "NONE", "pull_request": null, "body": "Does datasette have a built-in way, in a browser, to do a full-text search of all columns, in all databases and tables, that have full-text search enabled? Is there a plugin that does this?", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/498/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 451585764, "node_id": "MDU6SXNzdWU0NTE1ODU3NjQ=", "number": 499, "title": "Accessibility for non-techie newsies? ", "user": {"value": 7936571, "label": "chrismp"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2019-06-03T16:49:37Z", "updated_at": "2019-06-05T21:22:55Z", "closed_at": null, "author_association": "NONE", "pull_request": null, "body": "Hi again, I'm having fun uploading datasets to Heroku via datasette. I'd like to set up datasette so that it's easy for other newsroom workers, who don't use Linux and aren't programmers, to upload datasets. Does datasette provide this out-of-the-box, or as a plugin? ", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/499/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 453131917, "node_id": "MDU6SXNzdWU0NTMxMzE5MTc=", "number": 502, "title": "Exporting sqlite database(s)?", "user": {"value": 7936571, "label": "chrismp"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2019-06-06T16:39:53Z", "updated_at": "2021-04-03T05:16:54Z", "closed_at": "2019-06-11T18:50:42Z", "author_association": "NONE", "pull_request": null, "body": "I'm working on datasette from one computer. 
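A note on #502 (which continues below) and on #403's persistence question earlier: `datasette publish heroku` bakes the `.db` files into the deployed slug, and Heroku's filesystem is ephemeral, so the copy on Heroku is not really an editable source of truth. The usual pattern is to keep the canonical database files (or the CSVs they are built from) somewhere shared, such as version control or object storage, edit them locally on whichever machine you are using, and re-run the publish command from there.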
But if I want to work on it from another computer and want to copy the SQLite database(s) already on the Heroku datasette instance, how do I copy the database(s) to the second computer so that I can then update them and push them back to Heroku via datasette's command-line publishing?", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/502/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 453243459, "node_id": "MDU6SXNzdWU0NTMyNDM0NTk=", "number": 503, "title": "Handle SQLite databases with spaces in their names?", "user": {"value": 7936571, "label": "chrismp"}, "state": "closed", "locked": 0, "assignee": {"value": 9599, "label": "simonw"}, "milestone": null, "comments": 1, "created_at": "2019-06-06T21:20:59Z", "updated_at": "2019-11-04T23:16:30Z", "closed_at": "2019-11-04T23:16:30Z", "author_association": "NONE", "pull_request": null, "body": "I named my SQLite database \"Government workers\" and published it to Heroku. When I clicked the \"Government workers\" database online it led to a 404 page: `Database not found: Government%20workers`.\r\n\r\nI believe this is because the database name has a space.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/503/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 457147936, "node_id": "MDU6SXNzdWU0NTcxNDc5MzY=", "number": 512, "title": "\"about\" parameter in metadata does not appear when alone", "user": {"value": 7936571, "label": "chrismp"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2019-06-17T21:04:20Z", "updated_at": "2019-10-11T15:49:13Z", "closed_at": null, "author_association": "NONE", "pull_request": null, "body": "Here's an example of metadata I have for one database on datasette.\r\n\r\n```\r\n\"Records-requests\": {\r\n\t\"tables\": {\r\n\t\t\"Some table\": {\r\n\t\t\t\"about\": \"This table has data.\"\r\n\t\t}\r\n\t}\r\n}\r\n```\r\n\r\nThe text in `about` does not show up when I publish the data. 
But it shows up after I add a `\"source\"` parameter in the metadata.\r\n\r\nIs this intended?", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/512/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 457201907, "node_id": "MDU6SXNzdWU0NTcyMDE5MDc=", "number": 513, "title": "Is it possible to publish to Heroku despite slug size being too large?", "user": {"value": 7936571, "label": "chrismp"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 2, "created_at": "2019-06-18T00:12:02Z", "updated_at": "2019-06-21T22:35:54Z", "closed_at": "2019-06-21T22:35:54Z", "author_association": "NONE", "pull_request": null, "body": "I'm trying to push more than 1.5GB worth of SQLite databases -- 535MB compressed -- to Heroku but I get this error when I run the `datasette publish heroku` command.\r\n\r\n Compiled slug size: 535.5M is too large (max is 500M).\r\n\r\nCan I publish the databases and make datasette work on Heroku despite the large slug size?", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/513/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 459397625, "node_id": "MDU6SXNzdWU0NTkzOTc2MjU=", "number": 514, "title": "Documentation with recommendations on running Datasette in production without using Docker", "user": {"value": 7936571, "label": "chrismp"}, "state": "closed", "locked": 0, "assignee": null, "milestone": {"value": 5971510, "label": "Datasette 0.50"}, "comments": 27, "created_at": "2019-06-21T22:48:12Z", "updated_at": "2020-10-08T23:55:53Z", "closed_at": "2020-10-08T23:33:05Z", "author_association": "NONE", "pull_request": null, "body": "I've got some SQLite databases too big to push to Heroku or the other services with built-in support in datasette. \r\n\r\nSo instead I moved my datasette code and databases to a remote server on Kimsufi. In the folder containing the SQLite databases I run the following code.\r\n\r\n`nohup datasette serve -h 0.0.0.0 *.db --cors --port 8000 --metadata metadata.json > output.log 2>&1 &`.\r\n\r\nWhen I go to `http://my-remote-server.com:8000`, the site loads. But I know this is not a good long-term solution to running datasette on this server. 
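For the `about` question in #512 above, a hypothetical sketch of a fuller metadata.json, written from Python for concreteness; the wrapping `databases` key follows Datasette's documented metadata layout, and the `source` value is a placeholder, not a claim about why `about` alone fails to render:

```python
import json

metadata = {
    "databases": {
        "Records-requests": {
            "tables": {
                "Some table": {
                    "about": "This table has data.",
                    "source": "Example records office",  # placeholder value
                }
            }
        }
    }
}

# Datasette renders table-level keys such as "about" and "source" in the page footer.
with open("metadata.json", "w") as fp:
    json.dump(metadata, fp, indent=4)
```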
\r\n\r\nWhat is the \"correct\" way to have this site run, preferably on server port 80?", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/514/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 459882902, "node_id": "MDU6SXNzdWU0NTk4ODI5MDI=", "number": 526, "title": "Stream all results for arbitrary SQL and canned queries", "user": {"value": 50578294, "label": "matej-fr"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 23, "created_at": "2019-06-24T13:09:45Z", "updated_at": "2022-09-28T04:01:25Z", "closed_at": null, "author_association": "NONE", "pull_request": null, "body": "I think that there is a difficulty with canned queries.\r\n\r\nWhen I want to stream all results of a canned query TwoDays I get only first 1.000 records.\r\n\r\nExample:\r\n`http://myserver/history_sample/two_days.csv?_stream=on`\r\n\r\nreturns only first 1.000 records.\r\n\r\nIf I do the same with the whole database i.e.\r\n`http://myserver/history_sample/database.csv?_stream=on`\r\n\r\nI get correctly all records.\r\n\r\nAny ideas?", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/526/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 459936585, "node_id": "MDU6SXNzdWU0NTk5MzY1ODU=", "number": 527, "title": "Unable to use rank when fts-table generated with csvs-to-sqlite", "user": {"value": 2181410, "label": "clausjuhl"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2019-06-24T14:49:48Z", "updated_at": "2019-06-24T15:21:18Z", "closed_at": "2019-06-24T15:09:10Z", "author_association": "NONE", "pull_request": null, "body": "Hi Simon.\r\n\r\nIf i generate a fts-table with the csvs-to-sqlite f-option, I'm unable to use (in datasette's GUI) the internal ranking of the table for sorting or viewing, but if I generate the fts-table with the enable-fts argument from sqlite-utils, everyrthing works ok. 
Eg.:\r\n\r\ndatasette, version 0.28\r\nsqlite-utils, version 1.2.1\r\ncsvs-to-sqlite, version 0.9\r\n\r\nNo column named rank with these commands:\r\n$ csvs-to-sqlite minutes.csv minutes.db -f text_data\r\n$ datasette -i minutes.db\r\nselect rank, * from minutes_fts where minutes_fts match 'dog'\r\n\r\nEverything ok with these commands:\r\n$ csvs-to-sqlite minutes.csv minutes.db\r\n$ sqlite-utils enable-fts minutes.db text_data\r\n$ datasette -i minutes.db\r\nselect rank, * from minutes_fts where minutes_fts match 'dog'\r\n\r\nAm I doing something wrong?\r\n\r\nThank you for a great application!", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/527/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 460396952, "node_id": "MDExOlB1bGxSZXF1ZXN0MjkxNTM0NTk2", "number": 529, "title": "Use keyed rows - fixes #521", "user": {"value": 1383872, "label": "nathancahill"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2019-06-25T12:33:48Z", "updated_at": "2019-06-25T12:35:07Z", "closed_at": "2019-06-25T12:35:07Z", "author_association": "NONE", "pull_request": "simonw/datasette/pulls/529", "body": "Supports template syntax like this:\r\n\r\n```\r\n{% for row in display_rows %}\r\n
{{ row[\"First_Name\"] }} {{ row[\"Last_Name\"] }}
\r\n ...\r\n```", "repo": {"value": 107914493, "label": "datasette"}, "type": "pull", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/529/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": 0, "state_reason": null} {"id": 467218270, "node_id": "MDU6SXNzdWU0NjcyMTgyNzA=", "number": 558, "title": "Support unicode in url", "user": {"value": 380586, "label": "0x1997"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 4, "created_at": "2019-07-12T04:43:24Z", "updated_at": "2019-07-15T01:29:30Z", "closed_at": "2019-07-14T02:49:33Z", "author_association": "NONE", "pull_request": null, "body": "Hi, I defined some custom queries in my `metadata.json`. There are Chinese characters in the names of the queries. So the urls are like `http://127.0.0.1:8001/mydb/\u6d4b\u8bd5\u67e5\u8be2`.\r\nWhen opening such urls, datasette will throw an exception.\r\n```\r\nTraceback (most recent call last):\r\n File \"/home/zhe/miniconda3/lib/python3.7/site-packages/datasette/utils/asgi.py\", line 100, in __call__\r\n return await view(new_scope, receive, send)\r\n File \"/home/zhe/miniconda3/lib/python3.7/site-packages/datasette/utils/asgi.py\", line 172, in view\r\n request, **scope[\"url_route\"][\"kwargs\"]\r\n File \"/home/zhe/miniconda3/lib/python3.7/site-packages/datasette/views/base.py\", line 267, in get\r\n request, database, hash, correct_hash_provided, **kwargs\r\n File \"/home/zhe/miniconda3/lib/python3.7/site-packages/datasette/views/base.py\", line 471, in view_get\r\n for key in self.ds.renderers.keys()\r\n File \"/home/zhe/miniconda3/lib/python3.7/site-packages/datasette/views/base.py\", line 471, in \r\n for key in self.ds.renderers.keys()\r\n File \"/home/zhe/miniconda3/lib/python3.7/site-packages/datasette/utils/__init__.py\", line 655, in path_with_format\r\n path = request.path\r\n File \"/home/zhe/miniconda3/lib/python3.7/site-packages/datasette/utils/asgi.py\", line 49, in path\r\n self.scope.get(\"raw_path\", self.scope[\"path\"].encode(\"latin-1\"))\r\nUnicodeEncodeError: 'latin-1' codec can't encode characters in position 9-11: ordinal not in range(256)\r\n```\r\nThis used to work when datasette was based on sanic.\r\n\r\nBtw, thanks for the great work!", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/558/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 472429048, "node_id": "MDU6SXNzdWU0NzI0MjkwNDg=", "number": 9, "title": "Too many SQL variables", "user": {"value": 166463, "label": "tholo"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 4, "created_at": "2019-07-24T18:24:17Z", "updated_at": "2019-07-26T10:01:05Z", "closed_at": "2019-07-26T10:01:05Z", "author_association": "NONE", "pull_request": null, "body": "Decided to try importing my data, and ran into this:\r\n\r\n```\r\nTraceback (most recent call last):\r\n File \"/Users/tholo/Source/health/bin/healthkit-to-sqlite\", line 10, in \r\n sys.exit(cli())\r\n File \"/Users/tholo/Source/health/lib/python3.7/site-packages/click/core.py\", line 764, in __call__\r\n return 
self.main(*args, **kwargs)\r\n File \"/Users/tholo/Source/health/lib/python3.7/site-packages/click/core.py\", line 717, in main\r\n rv = self.invoke(ctx)\r\n File \"/Users/tholo/Source/health/lib/python3.7/site-packages/click/core.py\", line 956, in invoke\r\n return ctx.invoke(self.callback, **ctx.params)\r\n File \"/Users/tholo/Source/health/lib/python3.7/site-packages/click/core.py\", line 555, in invoke\r\n return callback(*args, **kwargs)\r\n File \"/Users/tholo/Source/health/lib/python3.7/site-packages/healthkit_to_sqlite/cli.py\", line 50, in cli\r\n convert_xml_to_sqlite(fp, db, progress_callback=bar.update)\r\n File \"/Users/tholo/Source/health/lib/python3.7/site-packages/healthkit_to_sqlite/utils.py\", line 41, in convert_xml_to_sqlite\r\n write_records(records, db)\r\n File \"/Users/tholo/Source/health/lib/python3.7/site-packages/healthkit_to_sqlite/utils.py\", line 80, in write_records\r\n column_order=[\"startDate\", \"endDate\", \"value\", \"unit\"],\r\n File \"/Users/tholo/Source/health/lib/python3.7/site-packages/sqlite_utils/db.py\", line 911, in insert_all\r\n result = self.db.conn.execute(sql, values)\r\nsqlite3.OperationalError: too many SQL variables\r\n```\r\n\r\nAdded some debug output in sqlite_utils/db.py, which resulted in:\r\n\r\n```\r\n INSERT INTO [rBodyMassIndex] ([creationDate], [endDate], [metadata_HKWasUserEntered], [metadata_Health Mate App Version], [metadata_Modified Date], [metadata_Withings Link], [metadata_Withings User Identifier], [sourceName], [sourceVersion], [startDate], [unit], [value]) VALUES\r\n (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)\r\n ,\r\n (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)\r\n ,\r\n (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)\r\n ,\r\n (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)\r\n ,\r\n (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)\r\n ,\r\n (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)\r\n ,\r\n (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)\r\n ,\r\n (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)\r\n ,\r\n (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)\r\n ,\r\n (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)\r\n ,\r\n (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)\r\n ,\r\n (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)\r\n ,\r\n (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)\r\n ,\r\n (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)\r\n ,\r\n (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)\r\n ,\r\n (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)\r\n ,\r\n (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)\r\n ,\r\n (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)\r\n ,\r\n (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)\r\n ,\r\n (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)\r\n ,\r\n (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)\r\n ,\r\n (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)\r\n ,\r\n (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)\r\n ,\r\n (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)\r\n ,\r\n (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)\r\n ,\r\n (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)\r\n ,\r\n (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)\r\n ,\r\n (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)\r\n ,\r\n (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)\r\n ,\r\n (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)\r\n ,\r\n (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)\r\n ,\r\n (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)\r\n ,\r\n (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)\r\n ,\r\n (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)\r\n ,\r\n (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)\r\n ,\r\n (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)\r\n ,\r\n (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)\r\n ,\r\n (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)\r\n ,\r\n (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)\r\n ,\r\n (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)\r\n ,\r\n (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)\r\n ,\r\n (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)\r\n ,\r\n (?, ?, ?, ?, 
?, ?, ?, ?, ?, ?, ?, ?)\r\n ,\r\n (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)\r\n ,\r\n (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)\r\n ,\r\n (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)\r\n ,\r\n (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)\r\n ,\r\n (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)\r\n ,\r\n (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)\r\n ,\r\n (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)\r\n ,\r\n (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)\r\n ,\r\n (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)\r\n ,\r\n (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)\r\n ,\r\n (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)\r\n ,\r\n (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)\r\n ,\r\n (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)\r\n ,\r\n (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)\r\n ,\r\n (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)\r\n ,\r\n (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)\r\n ,\r\n (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)\r\n ,\r\n (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)\r\n ,\r\n (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)\r\n ,\r\n (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)\r\n ,\r\n (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)\r\n ,\r\n (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)\r\n ,\r\n (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)\r\n ,\r\n (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)\r\n ,\r\n (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)\r\n ,\r\n (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)\r\n ,\r\n (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)\r\n ,\r\n (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)\r\n ,\r\n (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)\r\n ,\r\n (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)\r\n ,\r\n (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)\r\n ,\r\n (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)\r\n ,\r\n (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)\r\n ,\r\n (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)\r\n ,\r\n (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)\r\n ,\r\n (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)\r\n ,\r\n (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)\r\n ,\r\n (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)\r\n ,\r\n (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)\r\n ,\r\n (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)\r\n ,\r\n (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)\r\n ,\r\n (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)\r\n ,\r\n (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)\r\n ,\r\n (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)\r\n ,\r\n (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)\r\n ,\r\n (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)\r\n ,\r\n (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)\r\n ,\r\n (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)\r\n ,\r\n (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)\r\n ,\r\n (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)\r\n ,\r\n (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)\r\n ,\r\n (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)\r\n ,\r\n (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)\r\n ,\r\n (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)\r\n ,\r\n (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)\r\n ,\r\n (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)\r\n ,\r\n (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)\r\n ;\r\n```\r\n\r\nwith the attached data:\r\n\r\n```\r\n['2019-06-27 22:55:10 -0700', '2011-06-22 21:05:53 -0700', '0', '4.4.2', '2011-06-23 04:05:53 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1308801953&type=1', '301293', 'Health Mate', '4040200', '2011-06-22 21:05:53 -0700', 'count', '30.0926', '2019-06-27 22:55:10 -0700', '2011-06-23 09:36:27 -0700', '0', '4.4.2', '2011-06-23 16:36:59 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1308846987&type=1', '301293', 'Health Mate', '4040200', '2011-06-23 09:36:27 -0700', 'count', '30.0926', '2019-06-27 22:55:10 -0700', '2011-06-23 23:54:07 -0700', '0', '4.4.2', '2011-06-24 06:55:19 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1308898447&type=1', '301293', 'Health Mate', '4040200', '2011-06-23 23:54:07 -0700', 'count', '30.679', '2019-06-27 
22:55:10 -0700', '2011-06-24 09:13:40 -0700', '0', '4.4.2', '2011-06-24 16:14:35 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1308932020&type=1', '301293', 'Health Mate', '4040200', '2011-06-24 09:13:40 -0700', 'count', '30.3549', '2019-06-27 22:55:10 -0700', '2011-06-25 08:30:08 -0700', '0', '4.4.2', '2011-06-25 15:30:49 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1309015808&type=1', '301293', 'Health Mate', '4040200', '2011-06-25 08:30:08 -0700', 'count', '30.3395', '2019-06-27 22:55:10 -0700', '2011-06-26 07:47:51 -0700', '0', '4.4.2', '2011-06-26 14:48:27 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1309099671&type=1', '301293', 'Health Mate', '4040200', '2011-06-26 07:47:51 -0700', 'count', '30.2315', '2019-06-27 22:55:10 -0700', '2011-06-28 08:48:26 -0700', '0', '4.4.2', '2011-06-28 15:49:13 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1309276106&type=1', '301293', 'Health Mate', '4040200', '2011-06-28 08:48:26 -0700', 'count', '30.0617', '2019-06-27 22:55:10 -0700', '2011-06-29 09:21:16 -0700', '0', '4.4.2', '2011-06-29 16:21:59 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1309364476&type=1', '301293', 'Health Mate', '4040200', '2011-06-29 09:21:16 -0700', 'count', '29.9537', '2019-06-27 22:55:10 -0700', '2011-06-30 08:41:46 -0700', '0', '4.4.2', '2011-06-30 15:42:30 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1309448506&type=1', '301293', 'Health Mate', '4040200', '2011-06-30 08:41:46 -0700', 'count', '29.8302', '2019-06-27 22:55:10 -0700', '2011-07-01 09:05:28 -0700', '0', '4.4.2', '2011-07-01 16:06:24 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1309536328&type=1', '301293', 'Health Mate', '4040200', '2011-07-01 09:05:28 -0700', 'count', '29.8611', '2019-06-27 22:55:10 -0700', '2011-07-02 08:58:50 -0700', '0', '4.4.2', '2011-07-02 15:59:40 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1309622330&type=1', '301293', 'Health Mate', '4040200', '2011-07-02 08:58:50 -0700', 'count', '29.8765', '2019-06-27 22:55:10 -0700', '2011-07-04 09:33:43 -0700', '0', '4.4.2', '2011-07-04 16:34:19 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1309797223&type=1', '301293', 'Health Mate', '4040200', '2011-07-04 09:33:43 -0700', 'count', '30.0309', '2019-06-27 22:55:10 -0700', '2011-07-06 09:40:23 -0700', '0', '4.4.2', '2011-07-06 16:41:02 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1309970423&type=1', '301293', 'Health Mate', '4040200', '2011-07-06 09:40:23 -0700', 'count', '30.1852', '2019-06-27 22:55:10 -0700', '2011-07-08 08:08:48 -0700', '0', '4.4.2', '2011-07-08 15:09:51 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1310137728&type=1', '301293', 'Health Mate', '4040200', '2011-07-08 08:08:48 -0700', 'count', '30.0309', '2019-06-27 22:55:10 -0700', '2011-07-09 08:31:05 -0700', '0', '4.4.2', '2011-07-09 15:31:48 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1310225465&type=1', '301293', 'Health Mate', '4040200', '2011-07-09 08:31:05 -0700', 'count', '29.9537', '2019-06-27 22:55:10 -0700', '2011-07-10 08:14:36 -0700', '0', '4.4.2', '2011-07-10 15:15:12 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1310310876&type=1', '301293', 'Health Mate', '4040200', '2011-07-10 08:14:36 -0700', 'count', '30.0926', '2019-06-27 22:55:10 -0700', '2011-07-12 07:55:21 -0700', '0', '4.4.2', '2011-07-12 14:55:59 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1310482521&type=1', '301293', 'Health Mate', '4040200', 
'2011-07-12 07:55:21 -0700', 'count', '30.108', '2019-06-27 22:55:10 -0700', '2011-07-13 08:48:05 -0700', '0', '4.4.2', '2011-07-13 15:48:42 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1310572085&type=1', '301293', 'Health Mate', '4040200', '2011-07-13 08:48:05 -0700', 'count', '30', '2019-06-27 22:55:10 -0700', '2011-07-14 09:05:16 -0700', '0', '4.4.2', '2011-07-14 16:05:57 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1310659516&type=1', '301293', 'Health Mate', '4040200', '2011-07-14 09:05:16 -0700', 'count', '29.9074', '2019-06-27 22:55:10 -0700', '2011-07-15 07:09:56 -0700', '0', '4.4.2', '2011-07-15 14:10:35 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1310738996&type=1', '301293', 'Health Mate', '4040200', '2011-07-15 07:09:56 -0700', 'count', '29.9537', '2019-06-27 22:55:10 -0700', '2011-07-16 09:26:04 -0700', '0', '4.4.2', '2011-07-16 16:26:44 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1310833564&type=1', '301293', 'Health Mate', '4040200', '2011-07-16 09:26:04 -0700', 'count', '29.7531', '2019-06-27 22:55:10 -0700', '2011-07-17 09:52:59 -0700', '0', '4.4.2', '2011-07-17 16:53:38 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1310921579&type=1', '301293', 'Health Mate', '4040200', '2011-07-17 09:52:59 -0700', 'count', '29.8765', '2019-06-27 22:55:10 -0700', '2011-07-19 08:56:16 -0700', '0', '4.4.2', '2011-07-19 15:57:03 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1311090976&type=1', '301293', 'Health Mate', '4040200', '2011-07-19 08:56:16 -0700', 'count', '29.7685', '2019-06-27 22:55:10 -0700', '2011-07-21 08:21:20 -0700', '0', '4.4.2', '2011-07-21 15:22:02 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1311261680&type=1', '301293', 'Health Mate', '4040200', '2011-07-21 08:21:20 -0700', 'count', '29.7685', '2019-06-27 22:55:10 -0700', '2011-07-23 08:49:56 -0700', '0', '4.4.2', '2011-07-23 15:50:40 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1311436196&type=1', '301293', 'Health Mate', '4040200', '2011-07-23 08:49:56 -0700', 'count', '29.7222', '2019-06-27 22:55:10 -0700', '2011-07-24 09:17:35 -0700', '0', '4.4.2', '2011-07-24 16:18:14 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1311524255&type=1', '301293', 'Health Mate', '4040200', '2011-07-24 09:17:35 -0700', 'count', '29.5833', '2019-06-27 22:55:10 -0700', '2011-07-25 07:51:55 -0700', '0', '4.4.2', '2011-07-25 14:52:48 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1311605515&type=1', '301293', 'Health Mate', '4040200', '2011-07-25 07:51:55 -0700', 'count', '29.5525', '2019-06-27 22:55:10 -0700', '2011-08-06 10:04:05 -0700', '0', '4.4.2', '2011-08-06 17:04:47 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1312650245&type=1', '301293', 'Health Mate', '4040200', '2011-08-06 10:04:05 -0700', 'count', '29.7377', '2019-06-27 22:55:10 -0700', '2011-08-08 07:52:22 -0700', '0', '4.4.2', '2011-08-08 14:53:03 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1312815142&type=1', '301293', 'Health Mate', '4040200', '2011-08-08 07:52:22 -0700', 'count', '29.6605', '2019-06-27 22:55:10 -0700', '2011-08-10 07:57:30 -0700', '0', '4.4.2', '2011-08-10 14:58:12 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1312988250&type=1', '301293', 'Health Mate', '4040200', '2011-08-10 07:57:30 -0700', 'count', '29.7531', '2019-06-27 22:55:10 -0700', '2011-08-12 07:51:14 -0700', '0', '4.4.2', '2011-08-12 14:51:59 +0000', 
'withings-bd2://timeline/measure?userid=301293&date=1313160674&type=1', '301293', 'Health Mate', '4040200', '2011-08-12 07:51:14 -0700', 'count', '29.6914', '2019-06-27 22:55:10 -0700', '2011-08-13 07:45:28 -0700', '0', '4.4.2', '2011-08-13 14:46:08 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1313246728&type=1', '301293', 'Health Mate', '4040200', '2011-08-13 07:45:28 -0700', 'count', '29.5833', '2019-06-27 22:55:10 -0700', '2011-08-17 09:06:20 -0700', '0', '4.4.2', '2011-08-17 16:07:02 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1313597180&type=1', '301293', 'Health Mate', '4040200', '2011-08-17 09:06:20 -0700', 'count', '29.5679', '2019-06-27 22:55:10 -0700', '2011-08-22 08:28:08 -0700', '0', '4.4.2', '2011-08-22 15:28:57 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1314026888&type=1', '301293', 'Health Mate', '4040200', '2011-08-22 08:28:08 -0700', 'count', '29.9846', '2019-06-27 22:55:10 -0700', '2011-08-25 08:59:30 -0700', '0', '4.4.2', '2011-08-25 16:00:15 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1314287970&type=1', '301293', 'Health Mate', '4040200', '2011-08-25 08:59:30 -0700', 'count', '29.9691', '2019-06-27 22:55:10 -0700', '2011-08-30 08:13:59 -0700', '0', '4.4.2', '2011-08-30 15:46:08 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1314717239&type=1', '301293', 'Health Mate', '4040200', '2011-08-30 08:13:59 -0700', 'count', '29.784', '2019-06-27 22:55:10 -0700', '2011-09-12 08:47:51 -0700', '0', '4.4.2', '2011-09-12 15:48:59 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1315842471&type=1', '301293', 'Health Mate', '4040200', '2011-09-12 08:47:51 -0700', 'count', '29.7377', '2019-06-27 22:55:10 -0700', '2011-09-13 09:17:27 -0700', '0', '4.4.2', '2011-09-13 16:48:30 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1315930647&type=1', '301293', 'Health Mate', '4040200', '2011-09-13 09:17:27 -0700', 'count', '29.7531', '2019-06-27 22:55:10 -0700', '2011-10-01 09:12:20 -0700', '0', '4.4.2', '2011-10-01 16:13:00 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1317485540&type=1', '301293', 'Health Mate', '4040200', '2011-10-01 09:12:20 -0700', 'count', '29.8148', '2019-06-27 22:55:10 -0700', '2011-10-11 11:14:11 -0700', '0', '4.4.2', '2011-10-11 18:15:14 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1318356851&type=1', '301293', 'Health Mate', '4040200', '2011-10-11 11:14:11 -0700', 'count', '29.7377', '2019-06-27 22:55:10 -0700', '2011-10-16 09:29:47 -0700', '0', '4.4.2', '2011-10-16 16:30:39 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1318782587&type=1', '301293', 'Health Mate', '4040200', '2011-10-16 09:29:47 -0700', 'count', '29.6914', '2019-06-27 22:55:10 -0700', '2011-10-19 09:21:44 -0700', '0', '4.4.2', '2011-10-19 16:22:25 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1319041304&type=1', '301293', 'Health Mate', '4040200', '2011-10-19 09:21:44 -0700', 'count', '29.7685', '2019-06-27 22:55:10 -0700', '2011-10-24 07:04:22 -0700', '0', '4.4.2', '2011-10-24 14:05:03 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1319465062&type=1', '301293', 'Health Mate', '4040200', '2011-10-24 07:04:22 -0700', 'count', '29.5988', '2019-06-27 22:55:10 -0700', '2011-11-07 09:33:17 -0700', '0', '4.4.2', '2011-11-07 16:33:58 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1320683597&type=1', '301293', 'Health Mate', '4040200', '2011-11-07 09:33:17 -0700', 'count', '29.8611', '2019-06-27 22:55:10 -0700', '2011-11-10 
07:59:03 -0700', '0', '4.4.2', '2011-11-10 14:59:48 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1320937143&type=1', '301293', 'Health Mate', '4040200', '2011-11-10 07:59:03 -0700', 'count', '29.9383', '2019-06-27 22:55:10 -0700', '2011-11-13 09:28:31 -0700', '0', '4.4.2', '2011-11-13 16:29:20 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1321201711&type=1', '301293', 'Health Mate', '4040200', '2011-11-13 09:28:31 -0700', 'count', '29.7531', '2019-06-27 22:55:10 -0700', '2011-11-21 08:45:06 -0700', '0', '4.4.2', '2011-11-21 15:46:04 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1321890306&type=1', '301293', 'Health Mate', '4040200', '2011-11-21 08:45:06 -0700', 'count', '29.9691', '2019-06-27 22:55:10 -0700', '2011-11-23 09:55:44 -0700', '0', '4.4.2', '2011-11-23 16:56:18 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1322067344&type=1', '301293', 'Health Mate', '4040200', '2011-11-23 09:55:44 -0700', 'count', '29.8302', '2019-06-27 22:55:10 -0700', '2011-11-29 09:50:44 -0700', '0', '4.4.2', '2011-11-29 16:51:31 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1322585444&type=1', '301293', 'Health Mate', '4040200', '2011-11-29 09:50:44 -0700', 'count', '30.1698', '2019-06-27 22:55:10 -0700', '2011-11-30 11:13:21 -0700', '0', '4.4.2', '2011-11-30 18:14:14 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1322676801&type=1', '301293', 'Health Mate', '4040200', '2011-11-30 11:13:21 -0700', 'count', '30.0617', '2019-06-27 22:55:10 -0700', '2011-12-04 10:24:36 -0700', '0', '4.4.2', '2011-12-04 17:25:24 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1323019476&type=1', '301293', 'Health Mate', '4040200', '2011-12-04 10:24:36 -0700', 'count', '29.9691', '2019-06-27 22:55:10 -0700', '2011-12-10 09:22:18 -0700', '0', '4.4.2', '2011-12-10 16:23:07 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1323534138&type=1', '301293', 'Health Mate', '4040200', '2011-12-10 09:22:18 -0700', 'count', '29.9537', '2019-06-27 22:55:10 -0700', '2011-12-26 10:36:42 -0700', '0', '4.4.2', '2011-12-26 17:37:31 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1324921002&type=1', '301293', 'Health Mate', '4040200', '2011-12-26 10:36:42 -0700', 'count', '30.0926', '2019-06-27 22:55:10 -0700', '2012-01-11 11:24:13 -0700', '0', '4.4.2', '2012-01-11 18:25:04 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1326306253&type=1', '301293', 'Health Mate', '4040200', '2012-01-11 11:24:13 -0700', 'count', '29.8302', '2019-06-27 22:55:10 -0700', '2012-01-15 10:17:09 -0700', '0', '4.4.2', '2012-01-15 17:17:51 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1326647829&type=1', '301293', 'Health Mate', '4040200', '2012-01-15 10:17:09 -0700', 'count', '29.8302', '2019-06-27 22:55:10 -0700', '2012-01-19 09:24:32 -0700', '0', '4.4.2', '2012-01-19 16:25:21 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1326990272&type=1', '301293', 'Health Mate', '4040200', '2012-01-19 09:24:32 -0700', 'count', '29.7994', '2019-06-27 22:55:10 -0700', '2012-01-29 10:26:13 -0700', '0', '4.4.2', '2012-01-29 17:26:52 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1327857973&type=1', '301293', 'Health Mate', '4040200', '2012-01-29 10:26:13 -0700', 'count', '30.0154', '2019-06-27 22:55:10 -0700', '2012-02-03 10:13:28 -0700', '0', '4.4.2', '2012-02-03 17:15:01 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1328289208&type=1', '301293', 'Health Mate', '4040200', '2012-02-03 10:13:28 -0700', 
'count', '29.8457', '2019-06-27 22:55:10 -0700', '2012-02-12 09:23:01 -0700', '0', '4.4.2', '2012-02-12 16:23:53 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1329063781&type=1', '301293', 'Health Mate', '4040200', '2012-02-12 09:23:01 -0700', [... dozens of near-identical 12-field rows elided: each repeats 'count', a weight value, the '2019-06-27 22:55:10 -0700' export timestamp, a measurement timestamp, '0', '4.4.2', the matching UTC timestamp, a withings-bd2://timeline/measure URL, '301293', 'Health Mate', '4040200' and the measurement timestamp again, for dates from 2012-02-12 through 2013-10-23 ...] 'count', '31.2037', '2019-06-27 22:55:10 -0700',
'2013-10-23 09:31:25 -0700', '0', '4.4.2', '2013-10-23 16:32:13 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1382545885&type=1', '301293', 'Health Mate', '4040200', '2013-10-23 09:31:25 -0700', 'count', '31.8056']\r\n```", "repo": {"value": 197882382, "label": "healthkit-to-sqlite"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/9/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 473307794, "node_id": "MDU6SXNzdWU0NzMzMDc3OTQ=", "number": 565, "title": "Conflict between datasette and uvicorn click versions", "user": {"value": 440503, "label": "jonheslop"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2019-07-26T11:13:40Z", "updated_at": "2020-10-02T00:09:55Z", "closed_at": "2020-10-02T00:09:55Z", "author_association": "NONE", "pull_request": null, "body": "Hello! Datasette is awesome, thanks so much!\r\n\r\nI'm not very familiar with Python, but I think there is a problem with the datasette Docker builds.\r\n\r\nI keep getting this error:\r\n\r\n```\r\nERROR: uvicorn 0.8.4 has requirement click==7.*, but you'll have click 6.0 which is incompatible.\r\nERROR: datasette 0.29.2 has requirement click~=7.0, but you'll have click 6.0 which is incompatible.\r\n```\r\n\r\nThe full log from the docker build is here - https://gist.github.com/jonheslop/e01cd322e761cfaf34f0cb83f86411b0\r\n\r\nJust in case it\u2019s helpful, this is my setup - https://github.com/dotwatcher/dotwatcher-data", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/565/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 476437213, "node_id": "MDU6SXNzdWU0NzY0MzcyMTM=", "number": 566, "title": "Unexpected keyword argument 'hidden'", "user": {"value": 8330931, "label": "dvot197007"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2019-08-03T10:07:57Z", "updated_at": "2019-08-03T16:13:36Z", "closed_at": "2019-08-03T16:13:36Z", "author_association": "NONE", "pull_request": null, "body": "I couldn't get a test example running. I am running Python 3.6.8 and tried both Windows and Windows Subsystem for Linux, getting the same error. My test.db was created by converting a five-line CSV file with csvs-to-sqlite.
The csv file is:\r\ncol1, col2, col3\r\n1,2,3\r\n4,5,6\r\n7,8,9\r\n10,11,12\r\nHere is the error message:\r\n(myvenv) davido@DESKTOP-L29G79U:~/dot/datasette-eg$ datasette test.db Traceback (most recent call last): File \"/home/davido/dot/datasette-eg/myvenv/bin/datasette\", line 7, in from datasette.cli import cli File \"/home/davido/dot/datasette-eg/myvenv/lib/python3.6/site-packages/datasette/cli.py\", line 2, in import uvicorn File \"/home/davido/dot/datasette-eg/myvenv/lib/python3.6/site-packages/uvicorn/__init__.py\", line 2, in from uvicorn.main import Server, main, run File \"/home/davido/dot/datasette-eg/myvenv/lib/python3.6/site-packages/uvicorn/main.py\", line 224, in headers: typing.List[str], File \"/home/davido/dot/datasette-eg/myvenv/lib/python3.6/site-packages/click/decorators.py\", line 170, in decorator _param_memo(f, OptionClass(param_decls, **attrs)) File \"/home/davido/dot/datasette-eg/myvenv/lib/python3.6/site-packages/click/core.py\", line 1430, in __init__ Parameter.__init__(self, param_decls, type=type, **attrs) TypeError: __init__() got an unexpected keyword argument 'hidden' \r\n\r\nThanks.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/566/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 476852861, "node_id": "MDU6SXNzdWU0NzY4NTI4NjE=", "number": 568, "title": "Add database_color as a configurable option", "user": {"value": 50906992, "label": "LBHELewis"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2019-08-05T13:14:45Z", "updated_at": "2023-08-11T05:19:42Z", "closed_at": null, "author_association": "NONE", "pull_request": null, "body": "This would be really useful as it would allow us to tie in with colour schemes.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/568/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 480961330, "node_id": "MDU6SXNzdWU0ODA5NjEzMzA=", "number": 54, "title": "Ability to list views, and to access db[\"view_name\"].rows / rows_where / etc", "user": {"value": 20264, "label": "ftrain"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 5, "created_at": "2019-08-15T02:00:28Z", "updated_at": "2019-08-23T12:41:09Z", "closed_at": "2019-08-23T12:20:15Z", "author_association": "NONE", "pull_request": null, "body": "The docs show me how to create a view via `db.create_view()` but I can't seem to get back to that view post-creation; if I query it as a table it returns `None`, and it doesn't appear in the table listing, even though querying the view works fine from inside the sqlite3 command-line.\r\n\r\nIt'd be great to have the view as a pseudo-table, or if the python/sqlite3 module makes that hard to pull off (I couldn't figure it out), to have that edge-case documented next to the `db.create_view()` docs.", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": 
"{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/54/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 491219910, "node_id": "MDU6SXNzdWU0OTEyMTk5MTA=", "number": 61, "title": "importing CSV to SQLite as library", "user": {"value": 17739, "label": "witeshadow"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 2, "created_at": "2019-09-09T17:12:40Z", "updated_at": "2019-11-04T16:25:01Z", "closed_at": "2019-11-04T16:25:01Z", "author_association": "NONE", "pull_request": null, "body": "CSV can be imported to SQLite when used CLI, but I don't see documentation for when using as library. ", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/61/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 494685791, "node_id": "MDU6SXNzdWU0OTQ2ODU3OTE=", "number": 574, "title": "Improve usage description of --host option", "user": {"value": 132978, "label": "terrycojones"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 2, "created_at": "2019-09-17T15:12:12Z", "updated_at": "2019-11-01T21:58:17Z", "closed_at": "2019-11-01T21:57:54Z", "author_association": "NONE", "pull_request": null, "body": "It would be nice if the `--host` option had a clearer description. I tried to get datasette running on an AWS instance and it took a while to realize it was only listening on localhost. So I wanted to make it listen on an non-localhost interface and tried giving a couple of values to `--host` (a host name, then an interface name), but none of them did. In the end I read the source to see that the option is passed to `uvicorn` and looked at the uvicorn docs, which also didn't help. Then I searched the web for \"example running datasette on a host\" which led me to https://github.com/simonw/datasette/issues/514 where I saw someone using `-h 0.0.0.0`. I tried that and it works. That usage could be mentioned somewhere, and might save someone else some time.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/574/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 499954048, "node_id": "MDExOlB1bGxSZXF1ZXN0MzIyNTI5Mzgx", "number": 578, "title": "Added support for multi arch builds", "user": {"value": 887095, "label": "heussd"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2019-09-29T18:43:03Z", "updated_at": "2019-11-13T19:13:15Z", "closed_at": "2019-11-13T19:13:15Z", "author_association": "NONE", "pull_request": "simonw/datasette/pulls/578", "body": "Minor changes in Dockerfile and new Makefile to support Docker multi architecture builds. `make`will build one image per architecture and push them as one Docker manifest to Docker Hub. 
Feel free to change `IMAGE_NAME` to `datasetteproject/datasette` to update your official Docker Hub image(s).", "repo": {"value": 107914493, "label": "datasette"}, "type": "pull", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/578/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": 0, "state_reason": null} {"id": 500783373, "node_id": "MDU6SXNzdWU1MDA3ODMzNzM=", "number": 62, "title": "[enhancement] Method to delete a row in Python", "user": {"value": 4454869, "label": "Sergeileduc"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 5, "created_at": "2019-10-01T09:45:47Z", "updated_at": "2019-11-04T16:30:34Z", "closed_at": "2019-11-04T16:18:18Z", "author_association": "NONE", "pull_request": null, "body": "Hi!\r\nThanks for the lib!\r\n\r\nObviously, not every possible SQL query can have a dedicated method.\r\n\r\nBut I was thinking: a method to delete a row (I'm terrible with names, maybe `delete_where()` or something) would be useful.\r\n\r\nI have a Database with a primary key.\r\n\r\nFor the moment, I use:\r\n\r\n```Python3\r\ndb.conn.execute(f\"DELETE FROM table WHERE key = {key_id}\")\r\ndb.conn.commit()\r\n```\r\nto delete a row I don't need anymore, given its primary key.\r\n\r\nWorks like a charm.\r\n\r\nJust an idea:\r\n\r\n```Python3\r\ntable.delete_where_pkey({'key': key_id})\r\n```\r\nor something (I know, I'm terrible at naming methods...).\r\n\r\nPros: no need to write an SQL query by hand.\r\n\r\nCons: WHERE normally allows many more things (operators =, <>, >, <, BETWEEN), not to mention AND, OR, etc., so the method is maybe too specific, and/or a pain to make more flexible.\r\n\r\nAgain, just a thought. 
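For illustration, a minimal sketch of what such a helper might look like, written here against only the standard library's sqlite3 module rather than any actual sqlite-utils API (the `mytable` table and its `key` column are made-up examples):

```python
import sqlite3

def delete_where(conn, table, where, args=()):
    # Table names cannot be bound as SQL parameters, so `table` must be
    # a trusted identifier; the WHERE values go through placeholders.
    with conn:  # behaves as a transaction: commits on success
        conn.execute(f"DELETE FROM [{table}] WHERE {where}", args)

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE mytable (key INTEGER PRIMARY KEY, value TEXT)")
conn.executemany("INSERT INTO mytable VALUES (?, ?)", [(1, "a"), (2, "b")])

delete_where(conn, "mytable", "key = ?", (1,))
print(conn.execute("SELECT * FROM mytable").fetchall())  # -> [(2, 'b')]
```

Passing the key through a `?` placeholder also avoids the SQL-injection risk of interpolating `key_id` directly into the statement, as the f-string above does.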
Writing his own sql works too, so...\r\n\r\nThanks again.\r\nSee yah.", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/62/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 504238461, "node_id": "MDU6SXNzdWU1MDQyMzg0NjE=", "number": 6, "title": "sqlite3.OperationalError: table users has no column named bio", "user": {"value": 1055831, "label": "dazzag24"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 2, "created_at": "2019-10-08T19:39:52Z", "updated_at": "2019-10-13T05:31:28Z", "closed_at": "2019-10-13T05:30:19Z", "author_association": "NONE", "pull_request": null, "body": "```\r\n$ github-to-sqlite repos github.db\r\n$ github-to-sqlite starred github.db dazzag24\r\n\r\nTraceback (most recent call last):\r\n File \"/home/darreng/.virtualenvs/dogsheep-d2PjdrD7/bin/github-to-sqlite\", line 10, in \r\n sys.exit(cli())\r\n File \"/home/darreng/.virtualenvs/dogsheep-d2PjdrD7/lib/python3.6/site-packages/click/core.py\", line 764, in __call__\r\n return self.main(*args, **kwargs)\r\n File \"/home/darreng/.virtualenvs/dogsheep-d2PjdrD7/lib/python3.6/site-packages/click/core.py\", line 717, in main\r\n rv = self.invoke(ctx)\r\n File \"/home/darreng/.virtualenvs/dogsheep-d2PjdrD7/lib/python3.6/site-packages/click/core.py\", line 1137, in invoke\r\n return _process_result(sub_ctx.command.invoke(sub_ctx))\r\n File \"/home/darreng/.virtualenvs/dogsheep-d2PjdrD7/lib/python3.6/site-packages/click/core.py\", line 956, in invoke\r\n return ctx.invoke(self.callback, **ctx.params)\r\n File \"/home/darreng/.virtualenvs/dogsheep-d2PjdrD7/lib/python3.6/site-packages/click/core.py\", line 555, in invoke\r\n return callback(*args, **kwargs)\r\n File \"/home/darreng/.virtualenvs/dogsheep-d2PjdrD7/lib/python3.6/site-packages/github_to_sqlite/cli.py\", line 106, in starred\r\n utils.save_stars(db, user, stars)\r\n File \"/home/darreng/.virtualenvs/dogsheep-d2PjdrD7/lib/python3.6/site-packages/github_to_sqlite/utils.py\", line 177, in save_stars\r\n user_id = save_user(db, user)\r\n File \"/home/darreng/.virtualenvs/dogsheep-d2PjdrD7/lib/python3.6/site-packages/github_to_sqlite/utils.py\", line 61, in save_user\r\n return db[\"users\"].upsert(to_save, pk=\"id\").last_pk\r\n File \"/home/darreng/.virtualenvs/dogsheep-d2PjdrD7/lib/python3.6/site-packages/sqlite_utils/db.py\", line 1067, in upsert\r\n extracts=extracts,\r\n File \"/home/darreng/.virtualenvs/dogsheep-d2PjdrD7/lib/python3.6/site-packages/sqlite_utils/db.py\", line 916, in insert\r\n extracts=extracts,\r\n File \"/home/darreng/.virtualenvs/dogsheep-d2PjdrD7/lib/python3.6/site-packages/sqlite_utils/db.py\", line 1024, in insert_all\r\n result = self.db.conn.execute(sql, values)\r\nsqlite3.OperationalError: table users has no column named bio\r\n\r\n```\r\n\r\n```\r\n$ pipenv graph\r\ngithub-to-sqlite==0.4\r\n - requests [required: Any, installed: 2.22.0]\r\n - certifi [required: >=2017.4.17, installed: 2019.9.11]\r\n - chardet [required: >=3.0.2,<3.1.0, installed: 3.0.4]\r\n - idna [required: >=2.5,<2.9, installed: 2.8]\r\n - urllib3 [required: >=1.21.1,<1.26,!=1.25.1,!=1.25.0, installed: 1.25.6]\r\n - sqlite-utils [required: ~=1.11, installed: 1.11]\r\n - click [required: Any, installed: 7.0]\r\n - 
click-default-group [required: Any, installed: 1.2.2]\r\n - click [required: Any, installed: 7.0]\r\n - tabulate [required: Any, installed: 0.8.5]\r\n\r\nPython 3.6.8\r\n```", "repo": {"value": 207052882, "label": "github-to-sqlite"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/dogsheep/github-to-sqlite/issues/6/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 504720731, "node_id": "MDU6SXNzdWU1MDQ3MjA3MzE=", "number": 1, "title": "Add more details on how to request data from google takeout correctly.", "user": {"value": 1055831, "label": "dazzag24"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 0, "created_at": "2019-10-09T15:17:34Z", "updated_at": "2019-10-09T15:17:34Z", "closed_at": null, "author_association": "NONE", "pull_request": null, "body": "The default is to download everything. This can result in an enormous amount of data when you only really need 2 types of data for now:\r\n\r\n- My Activity\r\n- Location History\r\n\r\nIn addition unless you specify that \"My Activity\" is downloaded in JSON format the default is HTML. This then causes the \r\n\r\n`google-takeout-to-sqlite my-activity takeout.db takeout.zip`\r\n\r\ncommand to fail as it only contains html files not json files.\r\n\r\nThanks", "repo": {"value": 206649770, "label": "google-takeout-to-sqlite"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/dogsheep/google-takeout-to-sqlite/issues/1/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 505512251, "node_id": "MDU6SXNzdWU1MDU1MTIyNTE=", "number": 588, "title": "Queries per DB table in metadata.json", "user": {"value": 12617395, "label": "bsilverm"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2019-10-10T21:08:19Z", "updated_at": "2019-10-21T12:58:22Z", "closed_at": "2019-10-21T01:48:42Z", "author_association": "NONE", "pull_request": null, "body": "It doesn't appear possible to have separate queries defined per database table. 
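For contrast, a minimal sketch of the placement that was documented at the time, with canned queries attached at the database level of metadata.json rather than under an individual table (the `MYDB` and `MYFIRSTTABLE` names are the issue author's placeholders, and exact supported keys may vary by Datasette version):

```json
{
  "databases": {
    "MYDB": {
      "queries": {
        "query_1": {
          "sql": "select * from MYFIRSTTABLE",
          "title": "Query 1",
          "description": "This is the first query"
        }
      }
    }
  }
}
```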
When I do something like the following, my table descriptions show up but not the queries:\r\n\r\n` \"databases\": {\r\n \"MYDB\": {\r\n \"tables\": {\r\n \"MYFIRSTTABLE\": {\r\n \"source\": \"Test\",\r\n \"source_url\": \"https://www.google.com\",\r\n \"queries\": {\r\n \"Query 1\": {\r\n \"sql\": \"select * from MYFIRSTTABLE\",\r\n \"title\": \"Query 1\",\r\n \"description\": \"This is the first query\"\r\n }\r\n }\r\n },\r\n \"MYSECONDTABLE\": {\r\n \"source\": \"Test2\",\r\n \"source_url\": \"https://www.google.com\",\r\n \"queries\": {\r\n \"Query 2\": {\r\n \"sql\": \"select * from MYSECONDTABLE;\",\r\n \"title\": \"Query 2\",\r\n \"description\": \"This is the second query\"\r\n }\r\n }\r\n }\r\n }\r\n }\r\n}`", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/588/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 506183241, "node_id": "MDU6SXNzdWU1MDYxODMyNDE=", "number": 593, "title": "make uvicorn an optional dependency (because it is not OK on Windows Python yet)", "user": {"value": 4312421, "label": "stonebig"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2019-10-12T12:51:07Z", "updated_at": "2019-10-13T06:22:08Z", "closed_at": "2019-10-13T06:22:07Z", "author_association": "NONE", "pull_request": null, "body": "Would it be possible to:\r\n- remove the mandatory uvicorn dependency?\r\n- eventually fall back to hypercorn?\r\n\r\nReason:\r\n- uvloop is not yet supported on Windows/Python-3.8 and below; it may happen with Python-3.9 only.\r\n- it seems a six-line effort (but I'm no expert)", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/593/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 506297048, "node_id": "MDU6SXNzdWU1MDYyOTcwNDg=", "number": 594, "title": "upgrade to uvicorn-0.9 to be Python-3.8 friendly", "user": {"value": 4312421, "label": "stonebig"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2019-10-13T09:23:43Z", "updated_at": "2019-11-12T04:47:04Z", "closed_at": "2019-11-12T04:47:04Z", "author_association": "NONE", "pull_request": null, "body": "uvicorn-0.8 relies on websockets-0.7, which lacks Python-3.8 compatibility", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/594/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 506300941, "node_id": "MDExOlB1bGxSZXF1ZXN0MzI3NTQxMDQ2", "number": 595, "title": "bump uvicorn to 0.9.0 to be Python-3.8 friendly", "user": {"value": 4312421, "label": "stonebig"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 9, "created_at": "2019-10-13T10:00:04Z", "updated_at": "2019-11-12T04:46:48Z", "closed_at": 
"2019-11-12T04:46:48Z", "author_association": "NONE", "pull_request": "simonw/datasette/pulls/595", "body": "as uvicorn-0.9 is needed to get websockets-8.0.2, which is needed to have Python-3.8 compatibility", "repo": {"value": 107914493, "label": "datasette"}, "type": "pull", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/595/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": 0, "state_reason": null} {"id": 508100844, "node_id": "MDU6SXNzdWU1MDgxMDA4NDQ=", "number": 598, "title": "Character encoding bug with CSV export", "user": {"value": 46313, "label": "JoeGermuska"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2019-10-16T21:09:30Z", "updated_at": "2021-06-17T18:13:20Z", "closed_at": "2019-10-18T22:52:21Z", "author_association": "NONE", "pull_request": null, "body": "I was just poking around, and at [this URL](https://sql-murder-mystery.datasette.io/sql-murder-mystery/crime_scene_report.csv?_stream=on&type=arson&_size=max), I encountered this error:\r\n\r\n```\r\n'latin-1' codec can't encode character '\\u2019' in position 27: ordinal not in range(256)\r\n```\r\n\r\n", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/598/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 510076368, "node_id": "MDU6SXNzdWU1MTAwNzYzNjg=", "number": 605, "title": "Support queries at the table level", "user": {"value": 12617395, "label": "bsilverm"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 2, "created_at": "2019-10-21T15:58:30Z", "updated_at": "2019-10-30T18:55:37Z", "closed_at": null, "author_association": "NONE", "pull_request": null, "body": "Per the issue described in [issue #588](https://github.com/simonw/datasette/issues/588), it was determined queries are not supported at the table level. Per my last comment in the issue, I'd like to request support for this as it would help eliminate errors in the event certain tables are not present in the database.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/605/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 512996469, "node_id": "MDU6SXNzdWU1MTI5OTY0Njk=", "number": 607, "title": "Ways to improve fuzzy search speed on larger data sets?", "user": {"value": 8431341, "label": "zeluspudding"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 6, "created_at": "2019-10-27T17:31:37Z", "updated_at": "2019-11-07T03:38:10Z", "closed_at": "2019-11-07T03:38:10Z", "author_association": "NONE", "pull_request": null, "body": "I have an sqlite table with 16 million rows in it. 
Having read @simonw's article \"[Fast Autocomplete Search for Your Website](https://24ways.org/2018/fast-autocomplete-search-for-your-website/)\" I was curious to try datasette to see what kind of query performance I could get out of it. In truth I don't need to do full-text search, since all I would like to do is give my users a way to search for the names of investors such as \"Warren Buffet\" or \"Tim Cook\" (whose names are in a single column).\r\n\r\nOn the first search, Datasette takes over 20 seconds to return all records associated with `elon musk`:\r\n\r\n> ![image](https://user-images.githubusercontent.com/8431341/67638889-a86e1100-f8b7-11e9-9f7e-a9d13a42e988.png)\r\n\r\n> ![image](https://user-images.githubusercontent.com/8431341/67638825-ed457800-f8b6-11e9-94d1-b44f1a40ee8c.png)\r\n\r\nIf I rerun the same search, it then takes almost 9 seconds:\r\n> ![image](https://user-images.githubusercontent.com/8431341/67638908-e4a17180-f8b7-11e9-9d00-748c80ef1f21.png)\r\n\r\nThat's far too slow to implement an autocomplete feature. I could reduce the latency by making a special table of only unique investor names, thereby reducing the search space to less than a million rows (then I'd need to implement a way to add only new investor names to the table as I receive new data... about 4,000 rows a day). If I did that, I'm still concerned the new table wouldn't be lean enough to look up investor names quickly. Plus, even if I can implement the autocomplete feature, I would still have to look up records for those investors, which would take between 8 and 20 seconds. \r\n\r\nAre there any tricks for speeding this up?\r\n\r\nHere's my hardware:\r\n> ![image](https://user-images.githubusercontent.com/8431341/67638861-55945980-f8b7-11e9-96a8-ca76c7c68c5d.png)\r\n", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/607/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 518506242, "node_id": "MDU6SXNzdWU1MTg1MDYyNDI=", "number": 616, "title": "Datasette FTS detection bug", "user": {"value": 49656826, "label": "null92"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 2, "created_at": "2019-11-06T14:25:47Z", "updated_at": "2019-11-08T15:31:33Z", "closed_at": "2019-11-08T02:06:56Z", "author_association": "NONE", "pull_request": null, "body": "I'm having trouble with datasette. I deployed EXACTLY the same project on two different apps on Heroku. Both have databases (not all of them) with FTS activated, but only one detects it and works fine. 
You can take a look here:\r\nWith search: http://teste-templates.herokuapp.com/amazonia_protege/car\r\nWithout search: http://bases.vortex.media/amazonia_protege/car\r\n![teste](https://user-images.githubusercontent.com/49656826/68306310-11a80e00-0088-11ea-8d1c-db3bd3375518.jpg)\r\n", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/616/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 527670799, "node_id": "MDU6SXNzdWU1Mjc2NzA3OTk=", "number": 639, "title": "updating metadata.json without recreating the app", "user": {"value": 172847, "label": "pkoppstein"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 6, "created_at": "2019-11-24T09:19:53Z", "updated_at": "2019-11-30T06:08:50Z", "closed_at": null, "author_association": "NONE", "pull_request": null, "body": "I've successfully \"uploaded\" an SQLite database (with a metadata.json file) to heroku using:\r\n\r\n $ datasette publish heroku so-sales.db -m metadata.json -n so-sales\r\n\r\nThe question is: how can I modify the (small) metadata.json file without having to upload the (large) SQLite database?\r\n\r\nThe directions on heroku indicate I should run:\r\n\r\n heroku git:clone -a so-sales\r\n\r\nBut this just results in an empty directory with a warning:\r\nwarning: You appear to have cloned an empty repository.\r\n\r\nI've been able to \"clone\" the heroku \"app\" using the command:\r\n\r\n $ heroku slugs:download -a so-sales\r\n\r\nbut this is not a git repository....\r\n\r\nIdeally, it seems to me, there'd be an option in the `datasette` CLI to allow a file\r\nto be updated, or there'd be some way to create a local git \"clone\" of the app\r\nso that the heroku instructions for \"Deploying with git\" would apply.\r\n\r\n(p.s. I ran `datasette publish heroku -m metadata.json -n so-sales`\r\nin the hope that that would not cause the .db file to be wiped, but of course\r\nit was.)\r\n\r\n(p.p.s. Thanks for Datasette!)", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/639/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 531502365, "node_id": "MDU6SXNzdWU1MzE1MDIzNjU=", "number": 646, "title": "Make database level information from metadata.json available in the index.html template", "user": {"value": 18017473, "label": "lagolucas"}, "state": "open", "locked": 0, "assignee": null, "milestone": {"value": 3268330, "label": "Datasette 1.0"}, "comments": 3, "created_at": "2019-12-02T19:55:10Z", "updated_at": "2022-03-15T20:50:34Z", "closed_at": null, "author_association": "NONE", "pull_request": null, "body": "Did a search on the issues here and didn't find anything related to what I want.\r\n\r\nI want to have information that is at the database level of the metadata JSON, like title, source and source_url, and use it on the index page.\r\n\r\nI tried some small tweaks to the Python and HTML files, but failed to get that result.\r\n\r\nIs there a way? 
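One plausible route, sketched here on the assumption that Datasette's `extra_template_vars` plugin hook and its `datasette.metadata()` helper behave as their documentation suggests (untested, and `database_metadata` is a made-up variable name):

```python
from datasette import hookimpl

@hookimpl
def extra_template_vars(datasette, database):
    # Expose the database-level metadata.json block (title, source,
    # source_url, ...) to every template; on pages without a current
    # database, `database` is None and a top-level lookup happens instead.
    return {"database_metadata": datasette.metadata(database=database) or {}}
```

A customized index.html could then refer to something like `{{ database_metadata.source_url }}`.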
Thanks!", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/646/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 534507142, "node_id": "MDU6SXNzdWU1MzQ1MDcxNDI=", "number": 69, "title": "Feature request: enable extensions loading", "user": {"value": 30607, "label": "aborruso"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2019-12-08T08:06:25Z", "updated_at": "2022-02-05T00:04:25Z", "closed_at": "2020-10-16T18:42:49Z", "author_association": "NONE", "pull_request": null, "body": "Hi, it would be great to add a parameter that enables the load of a sqlite extension you need.\r\n\r\nSomething like \"-ext modspatialite\".\r\n\r\nIn this way your great tool would be even more comfortable and powerful.\r\n\r\n\r\nThank you very much", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/69/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 539204432, "node_id": "MDU6SXNzdWU1MzkyMDQ0MzI=", "number": 70, "title": "Implement ON DELETE and ON UPDATE actions for foreign keys", "user": {"value": 26292069, "label": "LucasElArruda"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 2, "created_at": "2019-12-17T17:19:10Z", "updated_at": "2020-02-27T04:18:53Z", "closed_at": null, "author_association": "NONE", "pull_request": null, "body": "Hi! I did not find any mention on the library about ON DELETE and ON UPDATE actions for foreign keys. Are those expected to be implemented? If not, it would be a nice thing to include!", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/70/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 539590148, "node_id": "MDU6SXNzdWU1Mzk1OTAxNDg=", "number": 651, "title": "fts5 syntax error when using punctuation", "user": {"value": 2181410, "label": "clausjuhl"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2019-12-18T10:25:35Z", "updated_at": "2021-07-14T19:26:06Z", "closed_at": "2019-12-30T06:42:55Z", "author_association": "NONE", "pull_request": null, "body": "Hi Simon\r\n\r\nI get a syntax error when using punctuation or special characters in a fulltext search (using fts5). I created the virtual table using sqlite-utils' \"enable-fts\"-command.\r\n\r\nThe same error appears on Niche Museums [https://www.niche-museums.com/browse/search?q=park.](https://www.niche-museums.com/browse/search?q=park.), but works fine in most of your other datasette-examples, e.g. 
register-of-members-interests [https://register-of-members-interests.datasettes.com/regmem-98dc8b7/items?_search=mins.](https://register-of-members-interests.datasettes.com/regmem-98dc8b7/items?_search=mins.)\r\n\r\nWhat am I doing wrong? Many thanks!\r\n\r\n", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/651/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 539985017, "node_id": "MDExOlB1bGxSZXF1ZXN0MzU0ODY5Mzkx", "number": 652, "title": "Quick (and uninformed and perhaps misguided) attempt to add a url for hosting datasette at a particular host/URI", "user": {"value": 132978, "label": "terrycojones"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2019-12-18T23:37:16Z", "updated_at": "2020-03-24T22:14:50Z", "closed_at": "2020-03-24T22:14:50Z", "author_association": "NONE", "pull_request": "simonw/datasette/pulls/652", "body": "As usual, I don't really know what I'm doing... so this is just a suggested approach. I've not written tests, I've not run the tests, I don't know if I've missed some absolute URLs that would need to have the leading slash dropped.\r\n\r\nBUT, I tested it with `--config base_url:http://127.0.0.1:8001/` on the command line and from what little I know about datasette it's at least working in some obvious cases.\r\n\r\nMy changes are based on what I saw in https://github.com/simonw/datasette/commit/8da2db4b71096b19e7a9ef1929369b8483d448bf (thanks!)\r\n\r\nI'm happy to be more thorough on this if you think it's worth pursuing.\r\n\r\nFixes #394 (he said, optimistically).", "repo": {"value": 107914493, "label": "datasette"}, "type": "pull", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/652/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": 0, "state_reason": null} {"id": 541274681, "node_id": "MDU6SXNzdWU1NDEyNzQ2ODE=", "number": 2, "title": "Add linkedin-to-sqlite", "user": {"value": 881925, "label": "mnp"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 0, "created_at": "2019-12-21T03:13:40Z", "updated_at": "2019-12-21T03:13:40Z", "closed_at": null, "author_association": "NONE", "pull_request": null, "body": "There is an API available. 
https://developer.linkedin.com/docs/rest-api#\r\n\r\nAt the minimum, I would think contact list and messages would be of interest.", "repo": {"value": 214746582, "label": "dogsheep.github.io"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/dogsheep/dogsheep.github.io/issues/2/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 544571092, "node_id": "MDU6SXNzdWU1NDQ1NzEwOTI=", "number": 15, "title": "Assets table with downloads", "user": {"value": 2029, "label": "garethr"}, "state": "closed", "locked": 0, "assignee": null, "milestone": {"value": 5225818, "label": "1.0"}, "comments": 4, "created_at": "2020-01-02T13:05:28Z", "updated_at": "2020-03-28T12:17:01Z", "closed_at": "2020-03-23T19:17:32Z", "author_association": "NONE", "pull_request": null, "body": "The `releases` command extracts the releases table, but data about the individual assets are locked up in the JSON document in the `assets` field. My main interest is in individual and aggregate download counts. I was wondering if creating a new table with a record per asset may be useful?\r\nIf so I'm happy to send a PR when I get a moment. Do you have opinions about that simply being part of the `releases` command or would you prefer a separate command as well?", "repo": {"value": 207052882, "label": "github-to-sqlite"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/dogsheep/github-to-sqlite/issues/15/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 545407916, "node_id": "MDU6SXNzdWU1NDU0MDc5MTY=", "number": 73, "title": "upsert_all() throws issue when upserting to empty table", "user": {"value": 82988, "label": "psychemedia"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 6, "created_at": "2020-01-05T11:58:57Z", "updated_at": "2020-01-31T14:21:09Z", "closed_at": "2020-01-05T17:20:18Z", "author_association": "NONE", "pull_request": null, "body": "If I try to add a list of `dict`s to an empty table using `upsert_all`, I get an error:\r\n\r\n```python\r\nimport sqlite3\r\nfrom sqlite_utils import Database\r\nimport pandas as pd\r\n\r\nconx = sqlite3.connect(':memory')\r\ncx = conx.cursor()\r\ncx.executescript('CREATE TABLE \"test\" (\"Col1\" TEXT);')\r\n\r\nq=\"SELECT * FROM test;\"\r\npd.read_sql(q, conx) #shows empty table\r\n\r\ndb = Database(conx)\r\ndb['test'].upsert_all([{'Col1':'a'},{'Col1':'b'}])\r\n\r\n---------------------------------------------------------------------------\r\nTypeError Traceback (most recent call last)\r\n in \r\n 1 db = Database(conx)\r\n----> 2 db['test'].upsert_all([{'Col1':'a'},{'Col1':'b'}])\r\n\r\n/usr/local/lib/python3.7/site-packages/sqlite_utils/db.py in upsert_all(self, records, pk, foreign_keys, column_order, not_null, defaults, batch_size, hash_id, alter, extracts)\r\n 1157 alter=alter,\r\n 1158 extracts=extracts,\r\n-> 1159 upsert=True,\r\n 1160 )\r\n 1161 \r\n\r\n/usr/local/lib/python3.7/site-packages/sqlite_utils/db.py in insert_all(self, records, pk, foreign_keys, column_order, not_null, defaults, batch_size, hash_id, alter, ignore, replace, extracts, upsert)\r\n 1040 sql = \"INSERT OR IGNORE INTO 
[{table}]({pks}) VALUES({pk_placeholders});\".format(\r\n 1041 table=self.name,\r\n-> 1042 pks=\", \".join([\"[{}]\".format(p) for p in pks]),\r\n 1043 pk_placeholders=\", \".join([\"?\" for p in pks]),\r\n 1044 )\r\n\r\nTypeError: 'NoneType' object is not iterable\r\n\r\n```\r\n\r\nA hacky workaround in use is:\r\n\r\n```python\r\ntry:\r\n db['test'].upsert_all([{'Col1':'a'},{'Col1':'b'}])\r\nexcept:\r\n db['test'].insert_all([{'Col1':'a'},{'Col1':'b'}])\r\n```", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/73/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 546051181, "node_id": "MDU6SXNzdWU1NDYwNTExODE=", "number": 16, "title": "Exception running first command: IndexError: list index out of range", "user": {"value": 15092, "label": "jayvdb"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 4, "created_at": "2020-01-07T03:01:58Z", "updated_at": "2020-04-14T18:37:21Z", "closed_at": "2020-04-14T18:37:21Z", "author_association": "NONE", "pull_request": null, "body": "Exception running first command without an existing db or auth.\r\n\r\n```py\r\n> mkdir ~/.github/coala\r\n> /usr/bin/github-to-sqlite repos ~/.github/coala coala\r\nTraceback (most recent call last):\r\n File \"/usr/bin/github-to-sqlite\", line 11, in \r\n load_entry_point('github-to-sqlite==0.6', 'console_scripts', 'github-to-sqlite')()\r\n File \"/usr/lib/python3.7/site-packages/click/core.py\", line 764, in __call__\r\n return self.main(*args, **kwargs)\r\n File \"/usr/lib/python3.7/site-packages/click/core.py\", line 717, in main\r\n rv = self.invoke(ctx)\r\n File \"/usr/lib/python3.7/site-packages/click/core.py\", line 1137, in invoke\r\n return _process_result(sub_ctx.command.invoke(sub_ctx))\r\n File \"/usr/lib/python3.7/site-packages/click/core.py\", line 956, in invoke\r\n return ctx.invoke(self.callback, **ctx.params)\r\n File \"/usr/lib/python3.7/site-packages/click/core.py\", line 555, in invoke\r\n return callback(*args, **kwargs)\r\n File \"/usr/lib/python3.7/site-packages/github_to_sqlite/cli.py\", line 163, in repos\r\n utils.save_repo(db, repo)\r\n File \"/usr/lib/python3.7/site-packages/github_to_sqlite/utils.py\", line 120, in save_repo\r\n to_save[\"owner\"] = save_user(db, to_save[\"owner\"])\r\n File \"/usr/lib/python3.7/site-packages/github_to_sqlite/utils.py\", line 61, in save_user\r\n return db[\"users\"].upsert(to_save, pk=\"id\", alter=True).last_pk\r\n File \"/usr/lib/python3.7/site-packages/sqlite_utils/db.py\", line 1135, in upsert\r\n extracts=extracts,\r\n File \"/usr/lib/python3.7/site-packages/sqlite_utils/db.py\", line 1162, in upsert_all\r\n upsert=True,\r\n File \"/usr/lib/python3.7/site-packages/sqlite_utils/db.py\", line 1105, in insert_all\r\n row = list(self.rows_where(\"rowid = ?\", [self.last_rowid]))[0]\r\nIndexError: list index out of range\r\n```", "repo": {"value": 207052882, "label": "github-to-sqlite"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/dogsheep/github-to-sqlite/issues/16/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, 
"state_reason": "completed"} {"id": 548591089, "node_id": "MDU6SXNzdWU1NDg1OTEwODk=", "number": 657, "title": "Allow creation of virtual tables at startup", "user": {"value": 1055831, "label": "dazzag24"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 4, "created_at": "2020-01-12T16:10:55Z", "updated_at": "2021-01-15T20:24:35Z", "closed_at": null, "author_association": "NONE", "pull_request": null, "body": "Hi,\r\n \r\nI've been experimenting with SQLite reading from huge datasets using this excellent Parquet extension from @cldellow.\r\nhttps://cldellow.com/2018/06/22/sqlite-parquet-vtable.html\r\nhttps://github.com/cldellow/sqlite-parquet-vtable\r\n\r\nThis works really well, but I was keen to see if I could combine datasette with this. Having previously experimented with the spatialite extension I knew that datasette supports loading extensions in the underlying sqlite instance. However I hit a blocker as the current design only allows SELECT statements to be executed and so I am unable to execute the crucial \r\n\r\nCREATE VIRTUAL TABLE .........\r\n\r\ncommand that is required to load the data from the parquet file into the table.\r\n\r\nIt seems like this would be a simple-ish change, but I don't know enough about the architecture of datasette to start implementing this myself? Could this be done as a datasette plugin? or would this require more fundamental changes at initialisation time?\r\n\r\nMy thoughts are that something at init time could detect that the user was loading a *.parquet file and then switch to a mode were it loads that via the \"CREATE VIRTUAL TABLE...\" rather than loading the *.db file in the default case??\r\n\r\nI'm happy to contribute code and testing, I just need some pointers on the best approach.\r\n\r\nThanks\r\nDarren", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/657/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 549287310, "node_id": "MDU6SXNzdWU1NDkyODczMTA=", "number": 76, "title": "order_by mechanism", "user": {"value": 10501166, "label": "metab0t"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 4, "created_at": "2020-01-14T02:06:03Z", "updated_at": "2020-04-16T06:23:29Z", "closed_at": "2020-04-16T03:13:06Z", "author_association": "NONE", "pull_request": null, "body": "In some cases, I want to iterate rows in a table with `ORDER BY` clause. 
It would be nice to have a `rows_order_by` function similar to `rows_where`.\r\nIn a more general case, a `rows_filter` function might be added to allow more customized filtering when iterating rows.", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/76/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 550293770, "node_id": "MDU6SXNzdWU1NTAyOTM3NzA=", "number": 658, "title": "How do I use the app.css as style sheet?", "user": {"value": 49656826, "label": "null92"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 2, "created_at": "2020-01-15T16:27:57Z", "updated_at": "2020-02-07T00:29:50Z", "closed_at": null, "author_association": "NONE", "pull_request": null, "body": "Simon,\r\n\r\nI'm trying to use the app.css (in the static folder) as a style sheet, but datasette on Heroku simply ignores it! I read everything about customization here and on readthedocs but still can't make it work.\r\n\r\nIs this possible?\r\n\r\nThanks!", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/658/reactions\", \"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 551834842, "node_id": "MDU6SXNzdWU1NTE4MzQ4NDI=", "number": 659, "title": "README information is obscured by feature history", "user": {"value": 55480210, "label": "labstersteve"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2020-01-18T22:34:51Z", "updated_at": "2020-12-10T23:28:51Z", "closed_at": "2020-12-10T23:28:51Z", "author_association": "NONE", "pull_request": null, "body": "While it's sometimes valuable to know how a project has developed, there is usually little justification for including this information in the README, and certainly not immediately after other key information such as \"what does this package do, and who might want to use it?\"\r\n\r\nMight I recommend that the feature history be migrated to an appendix in the documentation?", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/659/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 555832585, "node_id": "MDU6SXNzdWU1NTU4MzI1ODU=", "number": 661, "title": "--port option to expose a port other than 8001 in \"datasette package\"", "user": {"value": 134771, "label": "dvhthomas"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2020-01-27T21:05:56Z", "updated_at": "2020-01-30T04:17:52Z", "closed_at": "2020-01-29T22:46:45Z", "author_association": "NONE", "pull_request": null, "body": "I see how to alter the port using `datasette serve -p XXX` per the docs. 
However, I'm packaging it up to serve the container on App Engine flexible, which [requires](https://cloud.google.com/appengine/docs/flexible/custom-runtimes/build#listening_to_port_8080) that the container serves traffic on port 8080.\r\n\r\nhttps://github.com/simonw/datasette/blob/7950105c278b140e6cb665c68b59df219870f9bc/Dockerfile#L41\r\n\r\nIs there a way to inject a non-default port into the Dockerfile, or should I just do something like `sed` to replace 8001 with 8080 after `datasette package` has done its thing? Thanks for the advice.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/661/reactions\", \"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 556814876, "node_id": "MDU6SXNzdWU1NTY4MTQ4NzY=", "number": 662, "title": "Escape_fts5_query hook implementation does not work with queries to standard tables", "user": {"value": 2181410, "label": "clausjuhl"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 5, "created_at": "2020-01-29T11:56:03Z", "updated_at": "2020-01-30T00:30:20Z", "closed_at": "2020-01-30T00:30:19Z", "author_association": "NONE", "pull_request": null, "body": "Hi Simon\r\n\r\nThank you for adding the escape function, but it does not work on my datasette installation (0.33). I've added the following file to my datasette dir: /plugins/sql_functions.py:\r\n\r\n`from datasette import hookimpl\r\n\r\ndef escape_fts_query(query):\r\n bits = query.split()\r\n return ' '.join('\"{}\"'.format(bit.replace('\"', '')) for bit in bits)\r\n\r\n@hookimpl\r\ndef prepare_connection(conn):\r\n conn.create_function(\"escape_fts_query\", 1, escape_fts_query)`\r\n\r\nIt has no effect on the standard queries to the tables though, as they still produce errors when including any characters like '-', '/', '+' or '?'.\r\n\r\nDoes the function only work when using custom queries, where I can include the escape_fts_query function explicitly in the SQL query?\r\n\r\nPS. I'm calling datasette with --plugins=plugins, and my other plugins work just fine.\r\nPPS. 
The fts5 virtual table is created with 'sqlite3' like so:\r\n\r\n`CREATE VIRTUAL TABLE \"cases_fts\" USING FTS5(\r\n title,\r\n subtitle,\r\n resume,\r\n suggestion,\r\n presentation,\r\n detail = full,\r\n content_rowid = 'id',\r\n content = 'cases',\r\n tokenize='unicode61', 'remove_diacritics 2', 'tokenchars \"-_\"'\r\n);`\r\n\r\nThanks!\r\n\r\n_Originally posted by @clausjuhl in https://github.com/simonw/datasette/issues/651#issuecomment-579675357_", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/662/reactions\", \"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 559197745, "node_id": "MDU6SXNzdWU1NTkxOTc3NDU=", "number": 82, "title": "Tutorial command no longer works", "user": {"value": 10350886, "label": "petey284"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2020-02-03T16:36:11Z", "updated_at": "2020-02-27T04:16:43Z", "closed_at": "2020-02-27T04:16:30Z", "author_association": "NONE", "pull_request": null, "body": "Issue with command on [tutorial](https://simonwillison.net/2019/Feb/25/sqlite-utils/) on Simon's site.\r\n\r\nThe following command no longer works, and breaks with the previous too many variables error: #50\r\n\r\n``` cmd\r\n> curl \"https://data.nasa.gov/resource/y77d-th95.json\" | \\\r\n sqlite-utils insert meteorites.db meteorites - --pk=id\r\n```\r\n\r\nOutput:\r\n``` cmd\r\nTraceback (most recent call last):\r\n File \"continuum\\miniconda3\\envs\\main\\lib\\runpy.py\", line 193, in _run_module_as_main\r\n \"__main__\", mod_spec)\r\n File \"continuum\\miniconda3\\envs\\main\\lib\\runpy.py\", line 85, in _run_code\r\n exec(code, run_globals)\r\n File \"Continuum\\miniconda3\\envs\\main\\Scripts\\sqlite-utils.exe\\__main__.py\", line 9, in \r\n File \"continuum\\miniconda3\\envs\\main\\lib\\site-packages\\click\\core.py\", line 764, in __call__\r\n return self.main(*args, **kwargs)\r\n File \"continuum\\miniconda3\\envs\\main\\lib\\site-packages\\click\\core.py\", line 717, in main\r\n rv = self.invoke(ctx)\r\n File \"continuum\\miniconda3\\envs\\main\\lib\\site-packages\\click\\core.py\", line 1137, in invoke\r\n return _process_result(sub_ctx.command.invoke(sub_ctx))\r\n File \"continuum\\miniconda3\\envs\\main\\lib\\site-packages\\click\\core.py\", line 956, in invoke\r\n return ctx.invoke(self.callback, **ctx.params)\r\n File \"continuum\\miniconda3\\envs\\main\\lib\\site-packages\\click\\core.py\", line 555, in invoke\r\n return callback(*args, **kwargs)\r\n File \"continuum\\miniconda3\\envs\\main\\lib\\site-packages\\sqlite_utils\\cli.py\", line 434, in insert\r\n default=default,\r\n File \"continuum\\miniconda3\\envs\\main\\lib\\site-packages\\sqlite_utils\\cli.py\", line 384, in insert_upsert_implementation\r\n docs, pk=pk, batch_size=batch_size, alter=alter, **extra_kwargs\r\n File \"continuum\\miniconda3\\envs\\main\\lib\\site-packages\\sqlite_utils\\db.py\", line 1081, in insert_all\r\n result = self.db.conn.execute(query, params)\r\nsqlite3.OperationalError: too many SQL variables\r\n```\r\n\r\nMy thought is that maybe the dataset grew over the last few years and so didn't run into this issue before.\r\n\r\nNo error when I reduce the count of entries to 83. 
\r\n\r\nThis passes:\r\n``` cmd\r\ntype meteorite_83.txt | sqlite-utils insert meteorites.db meteorites - --pk=id\r\n```\r\n\r\nBut this fails:\r\n``` cmd\r\ntype meteorite_84.txt | sqlite-utils insert meteorites.db meteorites - --pk=id\r\n```\r\n\r\nA potential fix might be to chunk the incoming data. I can work on a PR if pointed in the right direction.\r\n", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/82/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 562787785, "node_id": "MDU6SXNzdWU1NjI3ODc3ODU=", "number": 667, "title": "Allow injecting configuration data from plugins", "user": {"value": 870184, "label": "xrotwang"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 2, "created_at": "2020-02-10T19:50:15Z", "updated_at": "2020-02-12T16:18:22Z", "closed_at": "2020-02-12T09:21:22Z", "author_association": "NONE", "pull_request": null, "body": "I'm trying to customize datasette as an explorer for [CLDF](https://cldf.clld.org) datasets. Such datasets can be converted automatically to SQLite, which can then be fed to datasette (e.g. https://github.com/cldf/cookbook/blob/master/recipes/datasette/README.md).\r\n\r\nPart of this customization would be support for the \"special\" data types described in the [CLDF ontology](https://cldf.clld.org/v1.0/terms.rdf). But while rendering of the values can be customized via the `render_cell` hook in a plugin, other things (e.g. custom labels for foreign keys) must be specified through the config file.\r\n\r\nIt would be nice to be able to programmatically inject config data from plugins as well.
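\r\n\r\nPurely as illustration, a sketch of what that might look like; note that `extra_metadata` is a made-up hook name, not part of datasette's actual plugin API:\r\n\r\n```python\r\nfrom datasette import hookimpl\r\n\r\n# hypothetical hook: datasette would merge the returned dict into the\r\n# configuration it otherwise reads from metadata.json\r\n@hookimpl\r\ndef extra_metadata():\r\n    return {\r\n        \"databases\": {\r\n            \"cldf\": {  # hypothetical database name\r\n                \"tables\": {\r\n                    \"values\": {\"label_column\": \"name\"}\r\n                }\r\n            }\r\n        }\r\n    }\r\n```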
\"/app/.local/lib/python3.7/site-packages/click/core.py\", line 956, in invoke\r\n\r\n return ctx.invoke(self.callback, **ctx.params)\r\n\r\n File \"/app/.local/lib/python3.7/site-packages/click/core.py\", line 555, in invoke\r\n\r\n return callback(*args, **kwargs)\r\n\r\n File \"/app/.local/lib/python3.7/site-packages/sqlite_utils/cli.py\", line 434, in insert\r\n\r\n default=default,\r\n\r\n File \"/app/.local/lib/python3.7/site-packages/sqlite_utils/cli.py\", line 384, in insert_upsert_implementation\r\n\r\n docs, pk=pk, batch_size=batch_size, alter=alter, **extra_kwargs\r\n\r\n File \"/app/.local/lib/python3.7/site-packages/sqlite_utils/db.py\", line 997, in insert_all\r\n\r\n extracts=extracts,\r\n\r\n File \"/app/.local/lib/python3.7/site-packages/sqlite_utils/db.py\", line 618, in create\r\n\r\n extracts=extracts,\r\n\r\n File \"/app/.local/lib/python3.7/site-packages/sqlite_utils/db.py\", line 310, in create_table\r\n\r\n self.conn.execute(sql)\r\n\r\nsqlite3.OperationalError: unrecognized token: \"]\"\r\n\r\nentsoe_2016.csv\r\n\r\nrenamed to txt for uploading compatibility\r\n\r\n[entsoe_2016.txt](https://github.com/simonw/sqlite-utils/files/4197688/entsoe_2016.txt)\r\n\r\ncode is remixed directly from your https://glitch.com/edit/#!/datasette-csvs repo\r\n", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/86/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 567902704, "node_id": "MDU6SXNzdWU1Njc5MDI3MDQ=", "number": 675, "title": "--cp option for datasette publish and datasette package for shipping additional files and directories", "user": {"value": 141844, "label": "aviflax"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 12, "created_at": "2020-02-19T22:55:56Z", "updated_at": "2020-12-28T18:49:21Z", "closed_at": null, "author_association": "NONE", "pull_request": null, "body": "I\u2019m working on integrating Datasette into a documentation-oriented publishing workflow internally in my company, and in order to deploy the Docker image created by `datasette package` I need to add an additional file to the image \u2014 in my case, it\u2019s a sort of a deployment directive. 
\r\n", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/86/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 567902704, "node_id": "MDU6SXNzdWU1Njc5MDI3MDQ=", "number": 675, "title": "--cp option for datasette publish and datasette package for shipping additional files and directories", "user": {"value": 141844, "label": "aviflax"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 12, "created_at": "2020-02-19T22:55:56Z", "updated_at": "2020-12-28T18:49:21Z", "closed_at": null, "author_association": "NONE", "pull_request": null, "body": "I\u2019m working on integrating Datasette into a documentation-oriented publishing workflow internally in my company, and in order to deploy the Docker image created by `datasette package` I need to add an additional file to the image \u2014 in my case, it\u2019s a sort of deployment directive. I\u2019ve worked out a way to do this after the image has been created, but it\u2019s convoluted and brittle.\r\n\r\nSo it\u2019d be excellent if there were an additional option for this command, something like `--copy`.\r\n\r\nI\u2019d envision it looking something like:\r\n\r\n```shell\r\n$ datasette package --copy /the/source/path:/the/target/path data.db\r\n```\r\n\r\nI\u2019d be happy to help design, specify, implement, and test this feature, if you\u2019d be interested.\r\n\r\nThanks for the fantastic tools!", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/675/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 568091133, "node_id": "MDU6SXNzdWU1NjgwOTExMzM=", "number": 676, "title": "?_searchmode=raw option for running FTS searches without escaping characters", "user": {"value": 58088336, "label": "tunguyenatwork"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 9, "created_at": "2020-02-20T06:56:57Z", "updated_at": "2020-02-25T05:57:24Z", "closed_at": "2020-02-25T05:56:04Z", "author_association": "NONE", "pull_request": null, "body": "After version 0.34, I am not able to use wildcards in the _search option (or the full-text search). It will not return any results unless I specify the whole word for the text search.\r\n\r\nIf I use `match :search || \"*\"` in the SQL statement then it works as expected.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/676/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 569317377, "node_id": "MDU6SXNzdWU1NjkzMTczNzc=", "number": 681, "title": "Cache header missing in HTTP response", "user": {"value": 2181410, "label": "clausjuhl"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 4, "created_at": "2020-02-22T10:50:45Z", "updated_at": "2020-02-24T20:53:57Z", "closed_at": "2020-02-24T20:53:56Z", "author_association": "NONE", "pull_request": null, "body": "Hi Simon. I need some help with both understanding and adding HTTP headers. If I call datasette on localhost with --config default_cache_ttl:120 and --cors, I only get the following response headers:\r\n\r\n```\r\naccess-control-allow-origin: *\r\ncontent-type: text/html; charset=utf-8\r\ndate: Sat, 22 Feb 2020 10:32:15 GMT\r\nreferrer-policy: no-referrer\r\nserver: uvicorn\r\ntransfer-encoding: chunked\r\n```\r\n\r\nCORS works, but no caching header is set. Same thing happens if I use the command in a Dockerfile and run datasette with docker.\r\n\r\nSecond, how can one add headers to uvicorn? I've tried to add uvicorn commands to the Dockerfile, before the final datasette command, but it doesn't work. Is there any way to add headers to the uvicorn.run() command in datasette?
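\r\n\r\nFor reference, one generic approach is to wrap the ASGI app and append headers to every response. A minimal sketch, assuming datasette is started programmatically rather than via the CLI (the database path and header values are placeholders):\r\n\r\n```python\r\nimport uvicorn\r\nfrom datasette.app import Datasette\r\n\r\nEXTRA_HEADERS = [\r\n    (b\"x-frame-options\", b\"DENY\"),\r\n    (b\"x-content-type-options\", b\"nosniff\"),\r\n]\r\n\r\ndef with_extra_headers(app):\r\n    # ASGI middleware: intercept http.response.start and append headers\r\n    async def wrapped(scope, receive, send):\r\n        async def send_wrapper(event):\r\n            if event[\"type\"] == \"http.response.start\":\r\n                event = {**event, \"headers\": list(event.get(\"headers\", [])) + EXTRA_HEADERS}\r\n            await send(event)\r\n        await app(scope, receive, send_wrapper)\r\n    return wrapped\r\n\r\nds = Datasette([\"my-database.db\"])  # placeholder path\r\nuvicorn.run(with_extra_headers(ds.app()), host=\"127.0.0.1\", port=8001)\r\n```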
\r\n\r\nIn particular, I would like to add some of the missing security headers:\r\n\r\n[screenshot of the missing security headers omitted]\r\n\r\nThank you for a great product!", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/681/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 573583971, "node_id": "MDU6SXNzdWU1NzM1ODM5NzE=", "number": 689, "title": "\"Templates considered\" comment broken in >=0.35", "user": {"value": 35075, "label": "chrishas35"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 6, "created_at": "2020-03-01T17:31:21Z", "updated_at": "2020-04-05T19:39:44Z", "closed_at": "2020-04-05T19:39:44Z", "author_association": "NONE", "pull_request": null, "body": "I noticed that the \"Templates Considered\" comment is missing in 0.37. I believe I traced it back to #664, as you can see it in https://v0-34.datasette.io/ but not https://v0-35.datasette.io/. Looking at the template context debug between the two, you can see what is missing from 0.35 vs. 0.34:\r\n\r\n```diff\r\n< \"datasette_version\": \"0.34\",\r\n< \"app_css_hash\": \"ffa51a\",\r\n< \"select_templates\": [\r\n< \"*index.html\"\r\n< ],\r\n< \"zip\": \"\",\r\n< \"body_scripts\": [],\r\n< \"extra_css_urls\": \"\",\r\n< \"extra_js_urls\": \"\",\r\n< \"format_bytes\": \"\",\r\n< \"database_url\": \">\",\r\n< \"database_color\": \">\"\r\n---\r\n> \"datasette_version\": \"0.35\",\r\n> \"database_url\": \">\",\r\n> \"database_color\": \">\"\r\n```", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/689/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"}