{"html_url": "https://github.com/simonw/datasette/issues/1179#issuecomment-755492945", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1179", "id": 755492945, "node_id": "MDEyOklzc3VlQ29tbWVudDc1NTQ5Mjk0NQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-06T18:37:39Z", "updated_at": "2021-01-06T18:37:39Z", "author_association": "OWNER", "body": "I think I'll call this `full_path` for consistency with Django.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 780278550, "label": "Make original path available to render hooks"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1179#issuecomment-755489974", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1179", "id": 755489974, "node_id": "MDEyOklzc3VlQ29tbWVudDc1NTQ4OTk3NA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-06T18:35:24Z", "updated_at": "2021-01-06T18:35:24Z", "author_association": "OWNER", "body": "Django calls this ` HttpRequest.get_full_path()` - for the path plus the querystring.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 780278550, "label": "Make original path available to render hooks"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1179#issuecomment-755486103", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1179", "id": 755486103, "node_id": "MDEyOklzc3VlQ29tbWVudDc1NTQ4NjEwMw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-06T18:32:41Z", "updated_at": "2021-01-06T18:34:11Z", "author_association": "OWNER", "body": "This parameter will return the URL path, with querystring arguments, to the HTML version of the page - e.g. `/github/issue_comments` or `/github/issue_comments?_sort_desc=created_at`\r\n\r\nOpen questions:\r\n\r\n- What should it be called? `path` could be misleading since it also includes the querystring.\r\n- Should I provide a `url` or `full_url` version which includes `https://blah.com/...`?", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 780278550, "label": "Make original path available to render hooks"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/782#issuecomment-755484384", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/782", "id": 755484384, "node_id": "MDEyOklzc3VlQ29tbWVudDc1NTQ4NDM4NA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-06T18:31:14Z", "updated_at": "2021-01-06T18:31:57Z", "author_association": "OWNER", "body": "In building https://latest-with-plugins.datasette.io/github/issue_comments.Notebook?_labels=on I discovered the following patterns for importing data into both Pandas and Observable/d3:\r\n```python\r\nimport pandas\r\ndf = pandas.read_json(\r\n \"https://latest-with-plugins.datasette.io/github/issue_comments.json?_shape=array\"\r\n)\r\n```\r\nAnd:\r\n```javascript\r\nd3 = require(\"d3@5\")\r\nrows = d3.json(\r\n \"https://latest-with-plugins.datasette.io/github/issue_comments.json?_shape=array\"\r\n)\r\n```\r\nOnce again I find myself torn on the best possible default. 
A list of JSON objects is instantly compatible with both `pandas.read_json()` and `d3.json()` - but it leaves nowhere to put the extra information like pagination and suchlike!\r\n\r\nEven given this I still think the correct default is an object with `\"rows\"`, `\"total\"` and `\"next_url\"` keys. I should commit to that and implement it - this thought exercise has been running for far too long.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 627794879, "label": "Redesign default .json format"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1178#issuecomment-755476820", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1178", "id": 755476820, "node_id": "MDEyOklzc3VlQ29tbWVudDc1NTQ3NjgyMA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-06T18:24:47Z", "updated_at": "2021-01-06T18:24:47Z", "author_association": "OWNER", "body": "Issue fixed - https://latest-with-plugins.datasette.io/github/issue_comments.Notebook?_labels=on displays the correct schemes now.\r\n\r\nI can't think of a reason anyone on Cloud Run would ever NOT want the `force_https_urls` option, but just in case I've made it so if you pass `--extra-options --setting force_https_urls off` to `publish cloudrun` your setting will be respected.\r\n\r\nhttps://github.com/simonw/datasette/blob/97fb10c17dd007a275ab743742e93e932335ad67/datasette/publish/cloudrun.py#L105-L110", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 780267857, "label": "Use force_https_urls on when deploying with Cloud Run"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1178#issuecomment-755468795", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1178", "id": 755468795, "node_id": "MDEyOklzc3VlQ29tbWVudDc1NTQ2ODc5NQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-06T18:14:35Z", "updated_at": "2021-01-06T18:14:35Z", "author_association": "OWNER", "body": "Deploying that change now to test it.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 780267857, "label": "Use force_https_urls on when deploying with Cloud Run"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1178#issuecomment-755163886", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1178", "id": 755163886, "node_id": "MDEyOklzc3VlQ29tbWVudDc1NTE2Mzg4Ng==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-06T08:37:51Z", "updated_at": "2021-01-06T08:37:51Z", "author_association": "OWNER", "body": "Easiest fix would be for `publish cloudrun` to set `force_https_urls`:\r\n\r\n`datasette publish now` used to do this: https://github.com/simonw/datasette/blob/07e208cc6d9e901b87552c1be2854c220b3f9b6d/datasette/publish/now.py#L59-L63", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 780267857, "label": "Use force_https_urls on when deploying with Cloud Run"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1179#issuecomment-755161574", "issue_url": 
"https://api.github.com/repos/simonw/datasette/issues/1179", "id": 755161574, "node_id": "MDEyOklzc3VlQ29tbWVudDc1NTE2MTU3NA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-06T08:32:31Z", "updated_at": "2021-01-06T08:32:31Z", "author_association": "OWNER", "body": "An optional `path` argument to https://docs.datasette.io/en/stable/plugin_hooks.html#register-output-renderer-datasette which shows the path WITHOUT the `.Notebook` extension would be useful here.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 780278550, "label": "Make original path available to render hooks"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1178#issuecomment-755160187", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1178", "id": 755160187, "node_id": "MDEyOklzc3VlQ29tbWVudDc1NTE2MDE4Nw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-06T08:29:35Z", "updated_at": "2021-01-06T08:29:35Z", "author_association": "OWNER", "body": "https://latest-with-plugins.datasette.io/-/asgi-scope\r\n\r\n```\r\n{'asgi': {'spec_version': '2.1', 'version': '3.0'},\r\n 'client': ('169.254.8.129', 54971),\r\n 'headers': [(b'host', b'latest-with-plugins.datasette.io'),\r\n (b'user-agent',\r\n b'Mozilla/5.0 (Macintosh; Intel Mac OS X 10.15; rv:84.0) Gecko'\r\n b'/20100101 Firefox/84.0'),\r\n (b'accept',\r\n b'text/html,application/xhtml+xml,application/xml;q=0.9,image/'\r\n b'webp,*/*;q=0.8'),\r\n (b'accept-language', b'en-US,en;q=0.5'),\r\n (b'dnt', b'1'),\r\n (b'cookie',\r\n b'_ga_LL6M7BK6D4=GS1.1.1609886546.49.1.1609886923.0; _ga=GA1.1'\r\n b'.894633707.1607575712'),\r\n (b'upgrade-insecure-requests', b'1'),\r\n (b'x-client-data', b'CgSL6ZsV'),\r\n (b'x-cloud-trace-context',\r\n b'e776af843c657d2a3da28a73b726e6fe/14187666787557102189;o=1'),\r\n (b'x-forwarded-for', b'148.64.98.14'),\r\n (b'x-forwarded-proto', b'https'),\r\n (b'forwarded', b'for=\"148.64.98.14\";proto=https'),\r\n (b'accept-encoding', b'gzip, deflate, br'),\r\n (b'content-length', b'0')],\r\n 'http_version': '1.1',\r\n 'method': 'GET',\r\n 'path': '/-/asgi-scope',\r\n 'query_string': b'',\r\n 'raw_path': b'/-/asgi-scope',\r\n 'root_path': '',\r\n 'scheme': 'http',\r\n 'server': ('169.254.8.130', 8080),\r\n 'type': 'http'}\r\n```\r\nNote the `'scheme': 'http'` but also the `(b'x-forwarded-proto', b'https')`.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 780267857, "label": "Use force_https_urls on when deploying with Cloud Run"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1176#issuecomment-755159583", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1176", "id": 755159583, "node_id": "MDEyOklzc3VlQ29tbWVudDc1NTE1OTU4Mw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-06T08:28:20Z", "updated_at": "2021-01-06T08:28:20Z", "author_association": "OWNER", "body": "I used `from datasette.utils import path_with_format` in https://github.com/simonw/datasette-export-notebook/blob/0.1/datasette_export_notebook/__init__.py just now.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 779691739, "label": "Policy on documenting \"public\" 
datasette.utils functions"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1178#issuecomment-755158310", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1178", "id": 755158310, "node_id": "MDEyOklzc3VlQ29tbWVudDc1NTE1ODMxMA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-06T08:25:31Z", "updated_at": "2021-01-06T08:25:31Z", "author_association": "OWNER", "body": "Moving this to the Datasette repo.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 780267857, "label": "Use force_https_urls on when deploying with Cloud Run"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1178#issuecomment-755157732", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1178", "id": 755157732, "node_id": "MDEyOklzc3VlQ29tbWVudDc1NTE1NzczMg==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-06T08:24:12Z", "updated_at": "2021-01-06T08:24:12Z", "author_association": "OWNER", "body": "https://latest-with-plugins.datasette.io/fixtures/sortable.json has the bug too - the `next_url` is `http://` when it should be `https://`.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 780267857, "label": "Use force_https_urls on when deploying with Cloud Run"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1178#issuecomment-755157281", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1178", "id": 755157281, "node_id": "MDEyOklzc3VlQ29tbWVudDc1NTE1NzI4MQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-06T08:23:14Z", "updated_at": "2021-01-06T08:23:14Z", "author_association": "OWNER", "body": "https://latest-with-plugins.datasette.io/-/settings says `\"force_https_urls\": false`", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 780267857, "label": "Use force_https_urls on when deploying with Cloud Run"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1178#issuecomment-755157066", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1178", "id": 755157066, "node_id": "MDEyOklzc3VlQ29tbWVudDc1NTE1NzA2Ng==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-06T08:22:47Z", "updated_at": "2021-01-06T08:22:47Z", "author_association": "OWNER", "body": "Weird... https://github.com/simonw/datasette/blob/a882d679626438ba0d809944f06f239bcba8ee96/datasette/app.py#L609-L613\r\n\r\n```python\r\n def absolute_url(self, request, path):\r\n url = urllib.parse.urljoin(request.url, path)\r\n if url.startswith(\"http://\") and self.setting(\"force_https_urls\"):\r\n url = \"https://\" + url[len(\"http://\") :]\r\n return url\r\n```\r\nThat looks like it should work. 
Needs more digging.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 780267857, "label": "Use force_https_urls on when deploying with Cloud Run"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1178#issuecomment-755156606", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1178", "id": 755156606, "node_id": "MDEyOklzc3VlQ29tbWVudDc1NTE1NjYwNg==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-06T08:21:49Z", "updated_at": "2021-01-06T08:21:49Z", "author_association": "OWNER", "body": "https://github.com/simonw/datasette-export-notebook/blob/aec398eab4f34791d240d7bc47b6eec575b357be/datasette_export_notebook/__init__.py#L18-L23\r\n\r\nMaybe this is a bug in `datasette.absolute_url`? Perhaps it doesn't take the scheme into account.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 780267857, "label": "Use force_https_urls on when deploying with Cloud Run"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1101#issuecomment-755134771", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1101", "id": 755134771, "node_id": "MDEyOklzc3VlQ29tbWVudDc1NTEzNDc3MQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-06T07:28:01Z", "updated_at": "2021-01-06T07:28:01Z", "author_association": "OWNER", "body": "With this structure it will become possible to stream non-newline-delimited JSON array-of-objects too - the `stream_rows()` method could output `[` first, then each row followed by a comma, then `]` after the very last row.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 749283032, "label": "register_output_renderer() should support streaming data"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1101#issuecomment-755133937", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1101", "id": 755133937, "node_id": "MDEyOklzc3VlQ29tbWVudDc1NTEzMzkzNw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-06T07:25:48Z", "updated_at": "2021-01-06T07:26:43Z", "author_association": "OWNER", "body": "Idea: instead of returning a dictionary, `register_output_renderer` could return an object. 
The object could have the following properties:\r\n\r\n- `.extension` - the extension to use\r\n- `.can_render(...)` - says if it can render this\r\n- `.can_stream(...)` - says if streaming is supported\r\n- `async .stream_rows(rows_iterator, send)` - method that loops through all rows and uses `send` to send them to the response in the correct format\r\n\r\nI can then deprecate the existing `dict` return type for 1.0.", "reactions": "{\"total_count\": 2, \"+1\": 2, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 749283032, "label": "register_output_renderer() should support streaming data"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1101#issuecomment-755128038", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1101", "id": 755128038, "node_id": "MDEyOklzc3VlQ29tbWVudDc1NTEyODAzOA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-06T07:10:22Z", "updated_at": "2021-01-06T07:10:22Z", "author_association": "OWNER", "body": "Yet another use-case for this: I want to be able to stream newline-delimited JSON in order to better import into Pandas:\r\n\r\n pandas.read_json(\"https://latest.datasette.io/fixtures/compound_three_primary_keys.json?_shape=array&_nl=on\", lines=True)", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 749283032, "label": "register_output_renderer() should support streaming data"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1171#issuecomment-754958998", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1171", "id": 754958998, "node_id": "MDEyOklzc3VlQ29tbWVudDc1NDk1ODk5OA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-05T23:16:33Z", "updated_at": "2021-01-05T23:16:33Z", "author_association": "OWNER", "body": "That's really useful, thanks @rcoup ", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 778450486, "label": "GitHub Actions workflow to build and sign macOS binary executables"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/782#issuecomment-754958610", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/782", "id": 754958610, "node_id": "MDEyOklzc3VlQ29tbWVudDc1NDk1ODYxMA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-05T23:15:24Z", "updated_at": "2021-01-05T23:15:24Z", "author_association": "OWNER", "body": "https://latest-with-plugins.datasette.io/fixtures/roadside_attraction_characteristics/1.json-preview returns a 500 error at the moment - a KeyError on 'filtered_table_rows_count'.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 627794879, "label": "Redesign default .json format"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/576#issuecomment-754957658", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/576", "id": 754957658, "node_id": "MDEyOklzc3VlQ29tbWVudDc1NDk1NzY1OA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-05T23:12:50Z", "updated_at": "2021-01-05T23:12:50Z", "author_association": "OWNER", 
"body": "See https://docs.datasette.io/en/stable/internals.html", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 497170355, "label": "Documented internals API for use in plugins"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/576#issuecomment-754957563", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/576", "id": 754957563, "node_id": "MDEyOklzc3VlQ29tbWVudDc1NDk1NzU2Mw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-05T23:12:37Z", "updated_at": "2021-01-05T23:12:37Z", "author_association": "OWNER", "body": "I'm happy with how this has evolved, so I'm closing the issue.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 497170355, "label": "Documented internals API for use in plugins"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1176#issuecomment-754957378", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1176", "id": 754957378, "node_id": "MDEyOklzc3VlQ29tbWVudDc1NDk1NzM3OA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-05T23:12:03Z", "updated_at": "2021-01-05T23:12:03Z", "author_association": "OWNER", "body": "This needs to be done for Datasette 1.0. At the very least I need to ensure it's clear that `datasette.utils` is not part of the public API unless explicitly marked as such.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 779691739, "label": "Policy on documenting \"public\" datasette.utils functions"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1176#issuecomment-754952146", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1176", "id": 754952146, "node_id": "MDEyOklzc3VlQ29tbWVudDc1NDk1MjE0Ng==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-05T22:57:26Z", "updated_at": "2021-01-05T22:57:26Z", "author_association": "OWNER", "body": "Known public APIs might be worth adding type annotations to as well.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 779691739, "label": "Policy on documenting \"public\" datasette.utils functions"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1176#issuecomment-754952040", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1176", "id": 754952040, "node_id": "MDEyOklzc3VlQ29tbWVudDc1NDk1MjA0MA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-05T22:57:09Z", "updated_at": "2021-01-05T22:57:09Z", "author_association": "OWNER", "body": "It might be neater to move all of the non-public functions into a separate module - `datasette.utils.internal` perhaps.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 779691739, "label": "Policy on documenting \"public\" datasette.utils functions"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1176#issuecomment-754951786", "issue_url": 
"https://api.github.com/repos/simonw/datasette/issues/1176", "id": 754951786, "node_id": "MDEyOklzc3VlQ29tbWVudDc1NDk1MTc4Ng==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-05T22:56:27Z", "updated_at": "2021-01-05T22:56:43Z", "author_association": "OWNER", "body": "Idea: introduce a `@documented` decorator which marks specific functions as part of the public, documented API. The unit tests can then confirm that anything with that decorator is both documented and tested.\r\n```python\r\n@documented\r\ndef escape_css_string(s):\r\n return _css_re.sub(\r\n lambda m: \"\\\\\" + (f\"{ord(m.group()):X}\".zfill(6)),\r\n s.replace(\"\\r\\n\", \"\\n\"),\r\n )\r\n```\r\nOr maybe `@public`?", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 779691739, "label": "Policy on documenting \"public\" datasette.utils functions"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1175#issuecomment-754696725", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1175", "id": 754696725, "node_id": "MDEyOklzc3VlQ29tbWVudDc1NDY5NjcyNQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-05T15:12:30Z", "updated_at": "2021-01-05T15:12:30Z", "author_association": "OWNER", "body": "Some tips here: https://github.com/tiangolo/fastapi/issues/78", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 779156520, "label": "Use structlog for logging"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1173#issuecomment-754463845", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1173", "id": 754463845, "node_id": "MDEyOklzc3VlQ29tbWVudDc1NDQ2Mzg0NQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-05T07:41:43Z", "updated_at": "2021-01-05T07:41:43Z", "author_association": "OWNER", "body": "https://github.com/oleksis/pyinstaller-manylinux looks useful, via https://twitter.com/oleksis/status/1346341987876823040", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 778682317, "label": "GitHub Actions workflow to build manylinux binary"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1171#issuecomment-754296761", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1171", "id": 754296761, "node_id": "MDEyOklzc3VlQ29tbWVudDc1NDI5Njc2MQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-04T23:55:44Z", "updated_at": "2021-01-04T23:55:44Z", "author_association": "OWNER", "body": "Bit uncomfortable that it looks like you need to include your Apple ID username and password in the CI configuration to do this. 
I'll use GitHub Secrets for this but I don't like it - I'll definitely setup a dedicated code signing account that's not my access-to-everything AppleID for this.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 778450486, "label": "GitHub Actions workflow to build and sign macOS binary executables"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1171#issuecomment-754295380", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1171", "id": 754295380, "node_id": "MDEyOklzc3VlQ29tbWVudDc1NDI5NTM4MA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-04T23:54:32Z", "updated_at": "2021-01-04T23:54:32Z", "author_association": "OWNER", "body": "https://github.com/search?l=YAML&q=gon+json&type=Code reveals some examples of people using `gon` in workflows.\r\n\r\nThese look useful:\r\n\r\n* https://github.com/coherence/hub-server/blob/3b7e9c7c5bce9e244b14b854f1f89d66f53a5a39/.github/workflows/release_build.yml\r\n* https://github.com/simoncozens/pilcrow/blob/5abc145e7fb9577086afe47b48fd730cb8195386/.github/workflows/buildapp.yaml", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 778450486, "label": "GitHub Actions workflow to build and sign macOS binary executables"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1171#issuecomment-754287882", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1171", "id": 754287882, "node_id": "MDEyOklzc3VlQ29tbWVudDc1NDI4Nzg4Mg==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-04T23:40:10Z", "updated_at": "2021-01-04T23:42:32Z", "author_association": "OWNER", "body": "This looks VERY useful: https://github.com/mitchellh/gon - \" Sign, notarize, and package macOS CLI tools and applications written in any language. Available as both a CLI and a Go library.\"\r\n\r\nAnd it installs like this:\r\n\r\n brew install mitchellh/gon/gon", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 778450486, "label": "GitHub Actions workflow to build and sign macOS binary executables"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1171#issuecomment-754286783", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1171", "id": 754286783, "node_id": "MDEyOklzc3VlQ29tbWVudDc1NDI4Njc4Mw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-04T23:38:18Z", "updated_at": "2021-01-04T23:38:18Z", "author_association": "OWNER", "body": "Oh wow maybe I need to Notarize it too? 
https://developer.apple.com/documentation/xcode/notarizing_macos_software_before_distribution", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 778450486, "label": "GitHub Actions workflow to build and sign macOS binary executables"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1171#issuecomment-754286618", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1171", "id": 754286618, "node_id": "MDEyOklzc3VlQ29tbWVudDc1NDI4NjYxOA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-04T23:37:45Z", "updated_at": "2021-01-04T23:37:45Z", "author_association": "OWNER", "body": "https://github.com/actions/virtual-environments/issues/1820#issuecomment-719549887 looks useful - not sure if those notes are for iOS or macOS though.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 778450486, "label": "GitHub Actions workflow to build and sign macOS binary executables"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/93#issuecomment-754285795", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/93", "id": 754285795, "node_id": "MDEyOklzc3VlQ29tbWVudDc1NDI4NTc5NQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-04T23:35:13Z", "updated_at": "2021-01-04T23:35:13Z", "author_association": "OWNER", "body": "Next step is to automate this all!", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 273944952, "label": "Package as standalone binary"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1152#issuecomment-754285588", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1152", "id": 754285588, "node_id": "MDEyOklzc3VlQ29tbWVudDc1NDI4NTU4OA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-04T23:34:30Z", "updated_at": "2021-01-04T23:34:30Z", "author_association": "OWNER", "body": "I think the way to do this is to have a new plugin hook that returns two SQL where clauses: one returning a list of resources that the user should be able to access (the allow-list) and one returning a list of resources they are explicitly forbidden from accessing (the deny-list). Either of these can be blank.\r\n\r\nDatasette can then combine those into a full SQL query and use it to answer the question \"show me a list of resources that the user is allowed to perform action X on\". 
It can also answer the existing question, \"is user X allowed to perform action Y on resource Z\"?", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 770598024, "label": "Efficiently calculate list of databases/tables a user can view"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/93#issuecomment-754233960", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/93", "id": 754233960, "node_id": "MDEyOklzc3VlQ29tbWVudDc1NDIzMzk2MA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-04T21:35:37Z", "updated_at": "2021-01-04T21:35:37Z", "author_association": "OWNER", "body": "I tested it by running a `tmate` session on the GitHub macOS machines, and it worked!\r\n```\r\nMac-1609795972770:tmp runner$ wget 'https://github.com/simonw/datasette/releases/download/0.53/datasette-0.53-macos-binary.zip'\r\n--2021-01-04 21:34:10-- https://github.com/simonw/datasette/releases/download/0.53/datasette-0.53-macos-binary.zip\r\nResolving github.com (github.com)... 140.82.114.4\r\nConnecting to github.com (github.com)|140.82.114.4|:443... connected.\r\nHTTP request sent, awaiting response... 302 Found\r\nLocation: https://github-production-release-asset-2e65be.s3.amazonaws.com/107914493/74658700-4e90-11eb-8f3b-ee77e6dfad90?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=AKIAIWNJYAX4CSVEH53A%2F20210104%2Fus-east-1%2Fs3%2Faws4_request&X-Amz-Date=20210104T213414Z&X-Amz-Expires=300&X-Amz-Signature=6f3c54211077092553590b33a7c36cd052895c9d4619607ad1df094782f64acf&X-Amz-SignedHeaders=host&actor_id=0&key_id=0&repo_id=107914493&response-content-disposition=attachment%3B%20filename%3Ddatasette-0.53-macos-binary.zip&response-content-type=application%2Foctet-stream [following]\r\n--2021-01-04 21:34:14-- https://github-production-release-asset-2e65be.s3.amazonaws.com/107914493/74658700-4e90-11eb-8f3b-ee77e6dfad90?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=AKIAIWNJYAX4CSVEH53A%2F20210104%2Fus-east-1%2Fs3%2Faws4_request&X-Amz-Date=20210104T213414Z&X-Amz-Expires=300&X-Amz-Signature=6f3c54211077092553590b33a7c36cd052895c9d4619607ad1df094782f64acf&X-Amz-SignedHeaders=host&actor_id=0&key_id=0&repo_id=107914493&response-content-disposition=attachment%3B%20filename%3Ddatasette-0.53-macos-binary.zip&response-content-type=application%2Foctet-stream\r\nResolving github-production-release-asset-2e65be.s3.amazonaws.com (github-production-release-asset-2e65be.s3.amazonaws.com)... 52.217.43.164\r\nConnecting to github-production-release-asset-2e65be.s3.amazonaws.com (github-production-release-asset-2e65be.s3.amazonaws.com)|52.217.43.164|:443... connected.\r\nHTTP request sent, awaiting response... 
200 OK\r\nLength: 8297283 (7.9M) [application/octet-stream]\r\nSaving to: \u2018datasette-0.53-macos-binary.zip\u2019\r\n\r\ndatasette-0.53-maco 100%[===================>] 7.91M --.-KB/s in 0.1s \r\n\r\n2021-01-04 21:34:14 (73.4 MB/s) - \u2018datasette-0.53-macos-binary.zip\u2019 saved [8297283/8297283]\r\n\r\nMac-1609795972770:tmp runner$ unzip datasette-0.53-macos-binary.zip \r\nArchive: datasette-0.53-macos-binary.zip\r\n creating: datasette-0.53-macos-binary/\r\n inflating: datasette-0.53-macos-binary/datasette \r\nMac-1609795972770:tmp runner$ datasette-0.53-macos-binary/datasette --help\r\nUsage: datasette [OPTIONS] COMMAND [ARGS]...\r\n\r\n Datasette!\r\n\r\nOptions:\r\n --version Show the version and exit.\r\n --help Show this message and exit.\r\n\r\nCommands:\r\n serve* Serve up specified SQLite database files with a web UI\r\n inspect\r\n install Install Python packages - e.g.\r\n package Package specified SQLite files into a new datasette Docker...\r\n plugins List currently available plugins\r\n publish Publish specified SQLite database files to the internet along...\r\n uninstall Uninstall Python packages (e.g.\r\nMac-1609795972770:tmp runner$ datasette-0.53-macos-binary/datasette --get /-/versions.json\r\n{\"python\": {\"version\": \"3.9.1\", \"full\": \"3.9.1 (default, Dec 10 2020, 10:36:35) \\n[Clang 12.0.0 (clang-1200.0.32.27)]\"}, \"datasette\": {\"version\": \"0.53\"}, \"asgi\": \"3.0\", \"uvicorn\": \"0.13.3\", \"sqlite\": {\"version\": \"3.34.0\", \"fts_versions\": [\"FTS5\", \"FTS4\", \"FTS3\"], \"extensions\": {\"json1\": null}, \"compile_options\": [\"COMPILER=clang-12.0.0\", \"ENABLE_COLUMN_METADATA\", \"ENABLE_FTS3\", \"ENABLE_FTS3_PARENTHESIS\", \"ENABLE_FTS4\", \"ENABLE_FTS5\", \"ENABLE_GEOPOLY\", \"ENABLE_JSON1\", \"ENABLE_PREUPDATE_HOOK\", \"ENABLE_RTREE\", \"ENABLE_SESSION\", \"MAX_VARIABLE_NUMBER=250000\", \"THREADSAFE=1\"]}}\r\nMac-1609795972770:tmp runner$ \r\n```", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 273944952, "label": "Package as standalone binary"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/93#issuecomment-754229977", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/93", "id": 754229977, "node_id": "MDEyOklzc3VlQ29tbWVudDc1NDIyOTk3Nw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-04T21:28:01Z", "updated_at": "2021-01-04T21:28:01Z", "author_association": "OWNER", "body": "As an experiment, I put the macOS one in a zip file and attached it to the latest release:\r\n\r\n```\r\nmkdir datasette-0.53-macos-binary\r\ncp dist/datasette datasette-0.53-macos-binary\r\nzip -r datasette-0.53-macos-binary.zip datasette-0.53-macos-binary\r\n```\r\n\r\nIt's available here: https://github.com/simonw/datasette/releases/tag/0.53 - download URL is https://github.com/simonw/datasette/releases/download/0.53/datasette-0.53-macos-binary.zip", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 273944952, "label": "Package as standalone binary"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/93#issuecomment-754227543", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/93", "id": 754227543, "node_id": "MDEyOklzc3VlQ29tbWVudDc1NDIyNzU0Mw==", "user": {"value": 9599, "label": "simonw"}, 
"created_at": "2021-01-04T21:23:13Z", "updated_at": "2021-01-04T21:23:13Z", "author_association": "OWNER", "body": "```\r\n(pyinstaller-venv) root@dogsheep:/tmp/pyinstaller-venv# dist/datasette --get /-/databases.json\r\n[{\"name\": \":memory:\", \"path\": null, \"size\": 0, \"is_mutable\": true, \"is_memory\": true, \"hash\": null}]\r\n(pyinstaller-venv) root@dogsheep:/tmp/pyinstaller-venv# ls -lah dist/datasette \r\n-rwxr-xr-x 1 root root 8.9M Jan 4 21:05 dist/datasette\r\n```", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 273944952, "label": "Package as standalone binary"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/93#issuecomment-754219002", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/93", "id": 754219002, "node_id": "MDEyOklzc3VlQ29tbWVudDc1NDIxOTAwMg==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-04T21:06:49Z", "updated_at": "2021-01-04T21:22:27Z", "author_association": "OWNER", "body": "Works on Linux/Ubuntu too, except I had to do `export BASE=` on a separate line. I also did this:\r\n```\r\napt-get install python3 python3-venv\r\npython3 -m venv pyinstaller-venv\r\nsource pyinstaller-venv/bin/activate\r\npip install wheel\r\npip install datasette pyinstaller\r\n\r\nexport DATASETTE_BASE=$(python -c 'import os; print(os.path.dirname(__import__(\"datasette\").__file__))')\r\n\r\npyinstaller -F \\\r\n --add-data \"$DATASETTE_BASE/templates:datasette/templates\" \\\r\n --add-data \"$DATASETTE_BASE/static:datasette/static\" \\\r\n --hidden-import datasette.publish \\\r\n --hidden-import datasette.publish.heroku \\\r\n --hidden-import datasette.publish.cloudrun \\\r\n --hidden-import datasette.facets \\\r\n --hidden-import datasette.sql_functions \\\r\n --hidden-import datasette.actor_auth_cookie \\\r\n --hidden-import datasette.default_permissions \\\r\n --hidden-import datasette.default_magic_parameters \\\r\n --hidden-import datasette.blob_renderer \\\r\n --hidden-import datasette.default_menu_links \\\r\n --hidden-import uvicorn \\\r\n --hidden-import uvicorn.logging \\\r\n --hidden-import uvicorn.loops \\\r\n --hidden-import uvicorn.loops.auto \\\r\n --hidden-import uvicorn.protocols \\\r\n --hidden-import uvicorn.protocols.http \\\r\n --hidden-import uvicorn.protocols.http.auto \\\r\n --hidden-import uvicorn.protocols.websockets \\\r\n --hidden-import uvicorn.protocols.websockets.auto \\\r\n --hidden-import uvicorn.lifespan \\\r\n --hidden-import uvicorn.lifespan.on \\\r\n $(which datasette)\r\n```\r\n\r\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 273944952, "label": "Package as standalone binary"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/93#issuecomment-754218545", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/93", "id": 754218545, "node_id": "MDEyOklzc3VlQ29tbWVudDc1NDIxODU0NQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-04T21:05:57Z", "updated_at": "2021-01-04T21:05:57Z", "author_association": "OWNER", "body": "That BASE= trick seems to work with `zsh` but not with `bash`.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 
273944952, "label": "Package as standalone binary"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/93#issuecomment-754215392", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/93", "id": 754215392, "node_id": "MDEyOklzc3VlQ29tbWVudDc1NDIxNTM5Mg==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-04T20:59:20Z", "updated_at": "2021-01-04T21:03:14Z", "author_association": "OWNER", "body": "Updated `pyinstaller` recipe - lots of hidden imports needed now:\r\n```\r\npip install wheel\r\npip install datasette pyinstaller\r\n\r\nBASE=$(python -c 'import os; print(os.path.dirname(__import__(\"datasette\").__file__))') \\\r\n pyinstaller -F \\\r\n --add-data \"$BASE/templates:datasette/templates\" \\\r\n --add-data \"$BASE/static:datasette/static\" \\\r\n --hidden-import datasette.publish \\\r\n --hidden-import datasette.publish.heroku \\\r\n --hidden-import datasette.publish.cloudrun \\\r\n --hidden-import datasette.facets \\\r\n --hidden-import datasette.sql_functions \\\r\n --hidden-import datasette.actor_auth_cookie \\\r\n --hidden-import datasette.default_permissions \\\r\n --hidden-import datasette.default_magic_parameters \\\r\n --hidden-import datasette.blob_renderer \\\r\n --hidden-import datasette.default_menu_links \\\r\n --hidden-import uvicorn \\\r\n --hidden-import uvicorn.logging \\\r\n --hidden-import uvicorn.loops \\\r\n --hidden-import uvicorn.loops.auto \\\r\n --hidden-import uvicorn.protocols \\\r\n --hidden-import uvicorn.protocols.http \\\r\n --hidden-import uvicorn.protocols.http.auto \\\r\n --hidden-import uvicorn.protocols.websockets \\\r\n --hidden-import uvicorn.protocols.websockets.auto \\\r\n --hidden-import uvicorn.lifespan \\\r\n --hidden-import uvicorn.lifespan.on \\\r\n $(which datasette)\r\n```", "reactions": "{\"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 273944952, "label": "Package as standalone binary"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/93#issuecomment-754215793", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/93", "id": 754215793, "node_id": "MDEyOklzc3VlQ29tbWVudDc1NDIxNTc5Mw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-04T21:00:14Z", "updated_at": "2021-01-04T21:00:14Z", "author_association": "OWNER", "body": "```\r\n(pyinstaller-datasette) pyinstaller-datasette % file dist/datasette\r\ndist/datasette: Mach-O 64-bit executable x86_64\r\n(pyinstaller-datasette) pyinstaller-datasette % ls -lah dist/datasette\r\n-rwxr-xr-x 1 simon wheel 8.0M Jan 4 12:58 dist/datasette\r\n```\r\nI'm surprised it's only 8MB!", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 273944952, "label": "Package as standalone binary"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/668#issuecomment-754194996", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/668", "id": 754194996, "node_id": "MDEyOklzc3VlQ29tbWVudDc1NDE5NDk5Ng==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-04T20:18:39Z", "updated_at": "2021-01-04T20:18:39Z", "author_association": "OWNER", "body": "I fixed this in #1115 - you can run `--load-extension=spatialite` now and it will look for the extension in common places.", "reactions": 
"{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 563347679, "label": "Make it easier to load SpatiaLite"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/436#issuecomment-754193501", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/436", "id": 754193501, "node_id": "MDEyOklzc3VlQ29tbWVudDc1NDE5MzUwMQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-04T20:15:41Z", "updated_at": "2021-01-04T20:15:41Z", "author_association": "OWNER", "body": "Sadly `publish.datasettes.com` was broken by changes to Zeit, and I don't think I'll be bringing it back.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 435819321, "label": "400 Error when trying to register new user via https://publish.datasettes.com/"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/371#issuecomment-754192873", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/371", "id": 754192873, "node_id": "MDEyOklzc3VlQ29tbWVudDc1NDE5Mjg3Mw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-04T20:14:28Z", "updated_at": "2021-01-04T20:14:28Z", "author_association": "OWNER", "body": "Now that Digital Ocean has App Platform this is less necessary, especially since the documentation covers how to use App Platform here: https://docs.datasette.io/en/stable/deploying.html#deploying-using-buildpacks", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 377156339, "label": "datasette publish digitalocean plugin"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/102#issuecomment-754192267", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/102", "id": 754192267, "node_id": "MDEyOklzc3VlQ29tbWVudDc1NDE5MjI2Nw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-04T20:13:19Z", "updated_at": "2021-01-04T20:13:19Z", "author_association": "OWNER", "body": "I'm more likely to do Lambda than Elastic Beanstalk, especially now the size limit for Lambdas has been increased as part of their support for Docker.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 274264175, "label": "datasette publish elasticbeanstalk"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/221#issuecomment-754191699", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/221", "id": 754191699, "node_id": "MDEyOklzc3VlQ29tbWVudDc1NDE5MTY5OQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-04T20:12:14Z", "updated_at": "2021-01-04T20:12:14Z", "author_association": "OWNER", "body": "I'm going to close this. 
Plugins can register their own CLI tools (see https://github.com/simonw/click-app) if they need to.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 315142414, "label": "Allow plugins to add new cli sub commands "}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/221#issuecomment-754190952", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/221", "id": 754190952, "node_id": "MDEyOklzc3VlQ29tbWVudDc1NDE5MDk1Mg==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-04T20:10:51Z", "updated_at": "2021-01-04T20:10:51Z", "author_association": "OWNER", "body": "Is this still a good idea? I don't have any pressing need for it at the moment.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 315142414, "label": "Allow plugins to add new cli sub commands "}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/221#issuecomment-754190814", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/221", "id": 754190814, "node_id": "MDEyOklzc3VlQ29tbWVudDc1NDE5MDgxNA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-04T20:10:34Z", "updated_at": "2021-01-04T20:10:34Z", "author_association": "OWNER", "body": "For the `csvs-to-sqlite` case I'm going with `datasette insert` instead, see #1160.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 315142414, "label": "Allow plugins to add new cli sub commands "}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/18#issuecomment-754188383", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/18", "id": 754188383, "node_id": "MDEyOklzc3VlQ29tbWVudDc1NDE4ODM4Mw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-04T20:05:48Z", "updated_at": "2021-01-04T20:05:48Z", "author_association": "OWNER", "body": "I'm not using Sanic any more, but this is still very feasible. 
If I ever do it I'll write a plugin.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 267739593, "label": "See if I can get a websockets interface working"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/103#issuecomment-754188099", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/103", "id": 754188099, "node_id": "MDEyOklzc3VlQ29tbWVudDc1NDE4ODA5OQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-04T20:05:14Z", "updated_at": "2021-01-04T20:05:14Z", "author_association": "OWNER", "body": "Wontfix, Cloud Run is already implemented and is a better fit for Datasette.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 274265878, "label": "datasette publish appengine"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/913#issuecomment-754187520", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/913", "id": 754187520, "node_id": "MDEyOklzc3VlQ29tbWVudDc1NDE4NzUyMA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-04T20:04:10Z", "updated_at": "2021-01-04T20:04:10Z", "author_association": "OWNER", "body": "That's pretty elegant: each plugin gets its own namespace and can register new settings.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 670209331, "label": "Mechanism for passing additional options to `datasette my.db` that affect plugins"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/913#issuecomment-754187326", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/913", "id": 754187326, "node_id": "MDEyOklzc3VlQ29tbWVudDc1NDE4NzMyNg==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-04T20:03:50Z", "updated_at": "2021-01-04T20:03:50Z", "author_association": "OWNER", "body": "I renamed `--config` to `--setting` and changed it to work like this:\r\n\r\n datasette --setting sql_time_limit_ms 1000\r\n\r\nNote the lack of colons.\r\n\r\nThis actually makes colons cleaner to use for plugins - I could support this:\r\n\r\n datasette --setting datasette-insert:unsafe 1", "reactions": "{\"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 670209331, "label": "Mechanism for passing additional options to `datasette my.db` that affect plugins"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1111#issuecomment-754184287", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1111", "id": 754184287, "node_id": "MDEyOklzc3VlQ29tbWVudDc1NDE4NDI4Nw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-04T19:57:53Z", "updated_at": "2021-01-04T19:57:53Z", "author_association": "OWNER", "body": "Relevant new feature in sqlite-utils: the ability to use triggers to maintain fast counts. This optimization could help a lot here. 
https://github.com/simonw/sqlite-utils/issues/212", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 751195017, "label": "Accessing a database's `.json` is slow for very large SQLite files"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1164#issuecomment-754182058", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1164", "id": 754182058, "node_id": "MDEyOklzc3VlQ29tbWVudDc1NDE4MjA1OA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-04T19:53:31Z", "updated_at": "2021-01-04T19:53:31Z", "author_association": "OWNER", "body": "This will be helped by the new `package.json` added in #1170.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 776634318, "label": "Mechanism for minifying JavaScript that ships with Datasette"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/1170#issuecomment-754181646", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1170", "id": 754181646, "node_id": "MDEyOklzc3VlQ29tbWVudDc1NDE4MTY0Ng==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-04T19:52:40Z", "updated_at": "2021-01-04T19:52:40Z", "author_association": "OWNER", "body": "Thank you very much!", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 778126516, "label": "Install Prettier via package.json"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/983#issuecomment-753690280", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/983", "id": 753690280, "node_id": "MDEyOklzc3VlQ29tbWVudDc1MzY5MDI4MA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-03T23:13:30Z", "updated_at": "2021-01-03T23:13:30Z", "author_association": "OWNER", "body": "Oh that's interesting, I hadn't thought about plugins firing events - just responding to events fired by the rest of the application.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 712260429, "label": "JavaScript plugin hooks mechanism similar to pluggy"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/219#issuecomment-753671902", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/219", "id": 753671902, "node_id": "MDEyOklzc3VlQ29tbWVudDc1MzY3MTkwMg==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-03T20:31:04Z", "updated_at": "2021-01-03T20:32:13Z", "author_association": "OWNER", "body": "A `table.has_count_triggers` property.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 777707544, "label": "reset_counts() method and command"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/219#issuecomment-753671235", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/219", "id": 753671235, "node_id": "MDEyOklzc3VlQ29tbWVudDc1MzY3MTIzNQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-03T20:25:10Z", 
"updated_at": "2021-01-03T20:25:10Z", "author_association": "OWNER", "body": "To detect tables, look at the names of the triggers - `{table}{counts_table}_insert` and `{table}{counts_table}_delete`.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 777707544, "label": "reset_counts() method and command"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/219#issuecomment-753671009", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/219", "id": 753671009, "node_id": "MDEyOklzc3VlQ29tbWVudDc1MzY3MTAwOQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-03T20:22:53Z", "updated_at": "2021-01-03T20:22:53Z", "author_association": "OWNER", "body": "I think this should be accompanied by a `sqlite-utils reset-counts` command.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 777707544, "label": "reset_counts() method and command"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/219#issuecomment-753670833", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/219", "id": 753670833, "node_id": "MDEyOklzc3VlQ29tbWVudDc1MzY3MDgzMw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-03T20:20:54Z", "updated_at": "2021-01-03T20:20:54Z", "author_association": "OWNER", "body": "This is a little tricky. We should assume that the existing values in the `_counts` table cannot be trusted at all when this method is called - so we should probably clear that table entirely and then re-populate it.\r\n\r\nBut that means we need to figure out which tables in the database have the counts triggers defined.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 777707544, "label": "reset_counts() method and command"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/215#issuecomment-753668099", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/215", "id": 753668099, "node_id": "MDEyOklzc3VlQ29tbWVudDc1MzY2ODA5OQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-03T19:55:53Z", "updated_at": "2021-01-03T19:55:53Z", "author_association": "OWNER", "body": "So if you instantiate the `Database()` constructor with `use_counts_table=True` any access to the `.count` properties will go through this table - otherwise regular `count(*)` queries will be executed.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 777535402, "label": "Use _counts to speed up counts"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/215#issuecomment-753665521", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/215", "id": 753665521, "node_id": "MDEyOklzc3VlQ29tbWVudDc1MzY2NTUyMQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-03T19:31:33Z", "updated_at": "2021-01-03T19:31:33Z", "author_association": "OWNER", "body": "I'm having second thoughts about this being the default behaviour. It's pretty weird. 
I feel like HUGE databases that need this are rare, so having it on by default doesn't make sense.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 777535402, "label": "Use _counts to speed up counts"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/215#issuecomment-753662490", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/215", "id": 753662490, "node_id": "MDEyOklzc3VlQ29tbWVudDc1MzY2MjQ5MA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-03T19:05:53Z", "updated_at": "2021-01-03T19:05:53Z", "author_association": "OWNER", "body": "Idea: a `.execute_count()` method that never uses the cache.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 777535402, "label": "Use _counts to speed up counts"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/215#issuecomment-753661292", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/215", "id": 753661292, "node_id": "MDEyOklzc3VlQ29tbWVudDc1MzY2MTI5Mg==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-03T18:56:06Z", "updated_at": "2021-01-03T18:56:23Z", "author_association": "OWNER", "body": "Another option: on creation of the `Database()` object, check to see if the `_counts` table exists and use that as the default for a `use_counts_table` property. Also flip that property to `True` if the user calls `.enable_counts()` at any time.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 777535402, "label": "Use _counts to speed up counts"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/215#issuecomment-753661158", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/215", "id": 753661158, "node_id": "MDEyOklzc3VlQ29tbWVudDc1MzY2MTE1OA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-03T18:55:16Z", "updated_at": "2021-01-03T18:55:16Z", "author_association": "OWNER", "body": "Alternative implementation: provided `db.should_trust_counts` is `True`, try running the query:\r\n```sql\r\nselect count from _counts where [table] = ?\r\n```\r\nIf the query fails to return a result OR throws an error because the table doesn't exist, run the `count(*)` query.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 777535402, "label": "Use _counts to speed up counts"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/215#issuecomment-753660814", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/215", "id": 753660814, "node_id": "MDEyOklzc3VlQ29tbWVudDc1MzY2MDgxNA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-03T18:53:05Z", "updated_at": "2021-01-03T18:53:05Z", "author_association": "OWNER", "body": "Here's the current `.count` property: https://github.com/simonw/sqlite-utils/blob/036ec6d32313487527c66dea613a3e7118b97459/sqlite_utils/db.py#L597-L609\r\n\r\nIt's implemented on `Queryable` which means it's available on both `Table` and `View` - the optimization 
doesn't make sense for views.\r\n\r\nI'm a bit cautious about making that property so much more complex. In order to decide if it should try the `_counts` table first it needs to know:\r\n\r\n- Should it be trusting the counts? I'm thinking a `.should_trust_counts` property on `Database` which defaults to `True` would be good - then advanced users can turn that off if they know the counts should not be trusted.\r\n- Does the `_counts` table exist?\r\n- Are the triggers defined?\r\n\r\nThen it can do the query, and if the query fails it can fall back on the `count(*)`. That's quite a lot of extra activity though.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 777535402, "label": "Use _counts to speed up counts"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/215#issuecomment-753660379", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/215", "id": 753660379, "node_id": "MDEyOklzc3VlQ29tbWVudDc1MzY2MDM3OQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-03T18:50:15Z", "updated_at": "2021-01-03T18:50:15Z", "author_association": "OWNER", "body": "```python\r\n def cached_counts(self, tables=None):\r\n sql = \"select [table], count from {}\".format(self._counts_table_name)\r\n if tables:\r\n sql += \" where [table] in ({})\".format(\", \".join(\"?\" for table in tables))\r\n return {r[0]: r[1] for r in self.execute(sql, tables).fetchall()}\r\n```", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 777535402, "label": "Use _counts to speed up counts"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/206#issuecomment-753659260", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/206", "id": 753659260, "node_id": "MDEyOklzc3VlQ29tbWVudDc1MzY1OTI2MA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-03T18:42:01Z", "updated_at": "2021-01-03T18:42:01Z", "author_association": "OWNER", "body": "```\r\n% sqlite-utils insert blah.db blah global_power_plant_database.csv\r\nError: Invalid JSON - use --csv for CSV or --tsv for TSV files\r\n```", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 761915790, "label": "sqlite-utils should suggest --csv if JSON parsing fails"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1169#issuecomment-753657180", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1169", "id": 753657180, "node_id": "MDEyOklzc3VlQ29tbWVudDc1MzY1NzE4MA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-03T18:23:30Z", "updated_at": "2021-01-03T18:23:30Z", "author_association": "OWNER", "body": "Also welcome in that PR would be a bit of documentation for contributors, see #1167 - but no problem if you leave that out, I'm happy to add it later.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 777677671, "label": "Prettier package not actually being cached"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1169#issuecomment-753653260", 
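A minimal sketch of the `_counts` fallback logic discussed in the comments above, assuming a plain `sqlite3` connection; the `_counts` table name and the idea of a trust flag come from the thread, while the function name and signature here are invented for illustration and are not the actual sqlite-utils implementation:

```python
import sqlite3


def fast_count(conn, table, trust_counts=True):
    """Return a row count, preferring the cached value in _counts.

    Falls back to a full count(*) if the cache is missing, empty,
    or explicitly not trusted.
    """
    if trust_counts:
        try:
            row = conn.execute(
                "select count from _counts where [table] = ?", [table]
            ).fetchone()
            if row is not None:
                return row[0]
        except sqlite3.OperationalError:
            # The _counts table does not exist in this database
            pass
    return conn.execute("select count(*) from [{}]".format(table)).fetchone()[0]
```

Callers could flip `trust_counts` off whenever the cached values are suspected to be stale, mirroring the `should_trust_counts` property idea floated above.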
"issue_url": "https://api.github.com/repos/simonw/datasette/issues/1169", "id": 753653260, "node_id": "MDEyOklzc3VlQ29tbWVudDc1MzY1MzI2MA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-03T17:54:40Z", "updated_at": "2021-01-03T17:54:40Z", "author_association": "OWNER", "body": "And @benpickles yes I would land that pull request straight away as-is. Thanks!", "reactions": "{\"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 777677671, "label": "Prettier package not actually being cached"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1169#issuecomment-753653033", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1169", "id": 753653033, "node_id": "MDEyOklzc3VlQ29tbWVudDc1MzY1MzAzMw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-03T17:52:53Z", "updated_at": "2021-01-03T17:52:53Z", "author_association": "OWNER", "body": "Oh that's so frustrating! I was worried about that - I spotted a few runs that seemed faster and hoped that it meant that the package was coming out of the `~/.npm` cache, but evidently that's not the case.\r\n\r\nYou've convinced me that Datasette itself should have a `package.json` - the Dependabot argument is a really good one.\r\n\r\nBut... I'd really love to figure out a general pattern for using `npx` scripts in GitHub Actions workflows in a cache-friendly way. I have plenty of other projects that I'd love to run Prettier or Uglify or `puppeteer-cli` in without adding a `package.json` to them.\r\n\r\nAny ideas? The best I can think of is for the workflow itself to write out a `package.json` file (using `echo '{ ... }' > package.json`) as part of the run - that way the cache should work (I think) but I don't get a misleading `package.json` file sitting in the repo.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 777677671, "label": "Prettier package not actually being cached"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/983#issuecomment-753570710", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/983", "id": 753570710, "node_id": "MDEyOklzc3VlQ29tbWVudDc1MzU3MDcxMA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-03T05:29:56Z", "updated_at": "2021-01-03T05:29:56Z", "author_association": "OWNER", "body": "I thought about using browser events, but they don't quite match the API that I'm looking to provide. In particular, the great thing about Pluggy is that if you have multiple handlers registered for a specific plugin hook each of those handlers can return a value, and Pluggy will combine those values into a list of replies.\r\n\r\nThis is great for things like plugin hooks that add extra menu items - each plugin can return a menu item (maybe as a label/URL/click-callback object) and the calling code can then add all of those items to the menu. See https://docs.datasette.io/en/stable/plugin_hooks.html#table-actions-datasette-actor-database-table for a Python example.\r\n\r\nI'm on the fence about relying on JavaScript modules. 
I need to think about browser compatibility for them - but I'm already committed to requiring support for `() => {}` arrow functions so maybe I'm committed to module support too already?", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 712260429, "label": "JavaScript plugin hooks mechanism similar to pluggy"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1160#issuecomment-753568428", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1160", "id": 753568428, "node_id": "MDEyOklzc3VlQ29tbWVudDc1MzU2ODQyOA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-03T05:02:32Z", "updated_at": "2021-01-03T05:02:32Z", "author_association": "OWNER", "body": "Should this command include a `--fts` option for configuring full-text search on one-or-more columns?\r\n\r\nI thought about doing that for `sqlite-utils insert` in https://github.com/simonw/sqlite-utils/issues/202 and decided not to because of the need to include extra options covering the FTS version, porter stemming options and whether or not to create triggers.\r\n\r\nBut maybe I can set sensible defaults for that with `datasette insert ... -f title -f body`? Worth thinking about a bit more.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 775666296, "label": "\"datasette insert\" command and plugin hook"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/202#issuecomment-753568264", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/202", "id": 753568264, "node_id": "MDEyOklzc3VlQ29tbWVudDc1MzU2ODI2NA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-03T05:00:24Z", "updated_at": "2021-01-03T05:00:24Z", "author_association": "OWNER", "body": "I'm not going to implement this, because it actually needs several additional options that already exist on `sqlite-utils enable-fts`:\r\n```\r\n --fts4 Use FTS4\r\n --fts5 Use FTS5\r\n --tokenize TEXT Tokenizer to use, e.g. 
porter\r\n --create-triggers Create triggers to update the FTS tables when the\r\n parent table changes.\r\n```\r\nI'd rather not add all four of those options to `sqlite-utils insert` just to support this shortcut.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 738514367, "label": "sqlite-utils insert -f colname - for configuring full-text search"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/202#issuecomment-753567969", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/202", "id": 753567969, "node_id": "MDEyOklzc3VlQ29tbWVudDc1MzU2Nzk2OQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-03T04:55:17Z", "updated_at": "2021-01-03T04:55:43Z", "author_association": "OWNER", "body": "The long version of this can be `--fts`, same as in `csvs-to-sqlite`.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 738514367, "label": "sqlite-utils insert -f colname - for configuring full-text search"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/pull/203#issuecomment-753567932", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/203", "id": 753567932, "node_id": "MDEyOklzc3VlQ29tbWVudDc1MzU2NzkzMg==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-03T04:54:43Z", "updated_at": "2021-01-03T04:54:43Z", "author_association": "OWNER", "body": "Another option: expand the `ForeignKey` object to have `.columns` and `.other_columns` properties in addition to the existing `.column` and `.other_column` properties. These new plural properties would always return a tuple, which would be a one-item tuple for a non-compound-foreign-key.\r\n\r\nThe question then is what should `.column` and `.other_column` return for compound foreign keys?\r\n\r\nI'd be inclined to say they should return `None` - which would trigger errors in code that encounters a compound foreign key for the first time, but those errors would at least be a strong indicator as to what had gone wrong.\r\n\r\nWe can label `.column` and `.other_column` as deprecated and then remove them in `sqlite-utils 4.0`.\r\n\r\nSince this would still be a breaking change in some minor edge-cases I'm thinking maybe 4.0 needs to happen in order to land this feature. 
I'm not opposed to doing that, I was just hoping it might be avoidable.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 743384829, "label": "changes to allow for compound foreign keys"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/pull/203#issuecomment-753567744", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/203", "id": 753567744, "node_id": "MDEyOklzc3VlQ29tbWVudDc1MzU2Nzc0NA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-03T04:51:44Z", "updated_at": "2021-01-03T04:51:44Z", "author_association": "OWNER", "body": "One way that this could avoid a breaking change would be to have `fk.column` and `fk.other_column` remain as strings for non-compound-foreign-keys, but turn into tuples for a compound foreign key.\r\n\r\nThis is a bit of an ugly API design, and it could still break existing code that encounters a compound foreign key for the first time - but it would leave code working for the more common case of a non-compound-foreign-key.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 743384829, "label": "changes to allow for compound foreign keys"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/pull/203#issuecomment-753567508", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/203", "id": 753567508, "node_id": "MDEyOklzc3VlQ29tbWVudDc1MzU2NzUwOA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-03T04:48:17Z", "updated_at": "2021-01-03T04:48:17Z", "author_association": "OWNER", "body": "Sorry for taking so long to review this!\r\n\r\nThis approach looks great to me - being able to optionally pass a tuple anywhere the API currently expects a column is smart, and it's consistent with how the `pk=` parameter works elsewhere.\r\n\r\nThere's just one problem I can see with this: the way it changes the `ForeignKey(...)` interface to always return a tuple for `.column` and `.other_column`, even if that tuple only contains a single item.\r\n\r\nThis represents a breaking change to the existing API - any code that expects `ForeignKey.column` to be a single string (which is any code that has been written against that) will break.\r\n\r\nAs such, I'd have to bump the major version of `sqlite-utils` to `4.0` in order to ship this.\r\n\r\nIdeally I'd like to make this change in a way that doesn't represent an API compatibility break. 
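A rough sketch of the plural `.columns` / `.other_columns` idea proposed a couple of comments up, purely to illustrate how the singular accessors could stay backwards-compatible for single-column keys; the class layout is an assumption for this example, not the actual sqlite-utils `ForeignKey`:

```python
from dataclasses import dataclass
from typing import Optional, Tuple


@dataclass(frozen=True)
class ForeignKey:
    table: str
    columns: Tuple[str, ...]
    other_table: str
    other_columns: Tuple[str, ...]

    @property
    def column(self) -> Optional[str]:
        # Deprecated single-column accessor: None for compound keys,
        # so code that assumes a string fails loudly rather than silently.
        return self.columns[0] if len(self.columns) == 1 else None

    @property
    def other_column(self) -> Optional[str]:
        return self.other_columns[0] if len(self.other_columns) == 1 else None


fk = ForeignKey("dogs", ("owner_id",), "owners", ("id",))
assert fk.column == "owner_id" and fk.columns == ("owner_id",)
```

Returning `None` from the singular accessors for compound keys matches the "fail loudly" option discussed above.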
I need to think a bit harder about how that might be achieved.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 743384829, "label": "changes to allow for compound foreign keys"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/217#issuecomment-753566184", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/217", "id": 753566184, "node_id": "MDEyOklzc3VlQ29tbWVudDc1MzU2NjE4NA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-03T04:27:38Z", "updated_at": "2021-01-03T04:27:38Z", "author_association": "OWNER", "body": "Documented here: https://sqlite-utils.datasette.io/en/latest/python-api.html#quoting-strings-for-use-in-sql", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 777543336, "label": "Rename .escape() to .quote()"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/216#issuecomment-753566156", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/216", "id": 753566156, "node_id": "MDEyOklzc3VlQ29tbWVudDc1MzU2NjE1Ng==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-03T04:27:14Z", "updated_at": "2021-01-03T04:27:14Z", "author_association": "OWNER", "body": "Documented here: https://sqlite-utils.datasette.io/en/latest/python-api.html#introspection", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 777540352, "label": "database.triggers_dict introspection property"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/218#issuecomment-753563757", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/218", "id": 753563757, "node_id": "MDEyOklzc3VlQ29tbWVudDc1MzU2Mzc1Nw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-03T03:49:51Z", "updated_at": "2021-01-03T03:49:51Z", "author_association": "OWNER", "body": "Documentation: https://sqlite-utils.datasette.io/en/latest/cli.html#listing-triggers", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 777560474, "label": "\"sqlite-utils triggers\" command"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/215#issuecomment-753545757", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/215", "id": 753545757, "node_id": "MDEyOklzc3VlQ29tbWVudDc1MzU0NTc1Nw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-02T23:58:07Z", "updated_at": "2021-01-02T23:58:07Z", "author_association": "OWNER", "body": "Thought: maybe there should be a `.reset_counts()` method too, for if the table gets out of date with the triggers.\r\n\r\nOne way that could happen is if a table is dropped and recreated - the counts in the `_counts` table would likely no longer match the number of rows in that table.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 777535402, "label": "Use _counts to speed up counts"}, "performed_via_github_app": null} {"html_url": 
"https://github.com/simonw/sqlite-utils/issues/215#issuecomment-753545381", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/215", "id": 753545381, "node_id": "MDEyOklzc3VlQ29tbWVudDc1MzU0NTM4MQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-02T23:52:52Z", "updated_at": "2021-01-02T23:52:52Z", "author_association": "OWNER", "body": "Idea: a `db.cached_counts()` method that returns a dictionary of data from the `_counts` table. Call it with a list of tables to get back the counts for just those tables.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 777535402, "label": "Use _counts to speed up counts"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/217#issuecomment-753544914", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/217", "id": 753544914, "node_id": "MDEyOklzc3VlQ29tbWVudDc1MzU0NDkxNA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-02T23:47:42Z", "updated_at": "2021-01-02T23:47:42Z", "author_association": "OWNER", "body": "https://github.com/simonw/sqlite-utils/blob/9a5c92b63e7917c93cc502478493c51c781b2ecc/sqlite_utils/db.py#L231-L239", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 777543336, "label": "Rename .escape() to .quote()"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/213#issuecomment-753535488", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/213", "id": 753535488, "node_id": "MDEyOklzc3VlQ29tbWVudDc1MzUzNTQ4OA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-02T22:03:48Z", "updated_at": "2021-01-02T22:03:48Z", "author_association": "OWNER", "body": "I got this error while prototyping this:\r\n\r\n too many levels of trigger recursion\r\n\r\nIt looks like that's because SQLite doesn't like triggers on a table that themselves then update that table - so I'm going to exclude the `_counts` table from this mechanism.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 777529979, "label": "db.enable_counts() method"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/213#issuecomment-753533775", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/213", "id": 753533775, "node_id": "MDEyOklzc3VlQ29tbWVudDc1MzUzMzc3NQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-02T21:47:10Z", "updated_at": "2021-01-02T21:47:10Z", "author_association": "OWNER", "body": "I'm going to skip virtual tables, which I can identify using this property: https://github.com/simonw/sqlite-utils/blob/1cad7fad3e7a5b734088f5cc545b69a055e636da/sqlite_utils/db.py#L720-L726", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 777529979, "label": "db.enable_counts() method"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1168#issuecomment-753524779", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1168", "id": 753524779, "node_id": "MDEyOklzc3VlQ29tbWVudDc1MzUyNDc3OQ==", 
"user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-02T20:19:26Z", "updated_at": "2021-01-02T20:19:26Z", "author_association": "OWNER", "body": "Idea: version the metadata scheme. If the table is called `_metadata_v1` it gives me a clear path to designing a new scheme in the future.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 777333388, "label": "Mechanism for storing metadata in _metadata tables"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/212#issuecomment-753422324", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/212", "id": 753422324, "node_id": "MDEyOklzc3VlQ29tbWVudDc1MzQyMjMyNA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-02T03:00:34Z", "updated_at": "2021-01-02T03:00:34Z", "author_association": "OWNER", "body": "Here's a prototype:\r\n```python\r\nwith db.conn:\r\n db.conn.executescript(\"\"\"\r\nCREATE TABLE IF NOT EXISTS [_counts] ([table] TEXT PRIMARY KEY, [count] INTEGER DEFAULT 0);\r\nCREATE TRIGGER IF NOT EXISTS [Street_Tree_List_counts_ai] AFTER INSERT ON [Street_Tree_List] BEGIN\r\n INSERT OR REPLACE INTO _counts\r\n VALUES ('Street_Tree_List', COALESCE(\r\n (SELECT count FROM _counts\r\n WHERE [table]='Street_Tree_List'),\r\n 0) + 1);\r\nEND;\r\nCREATE TRIGGER IF NOT EXISTS [Street_Tree_List_counts_ad] AFTER DELETE ON [Street_Tree_List] BEGIN\r\n INSERT OR REPLACE INTO _counts\r\n VALUES ('Street_Tree_List', COALESCE(\r\n (SELECT count FROM _counts\r\n WHERE [table]='Street_Tree_List'),\r\n 0) - 1);\r\nEND;\r\nINSERT OR REPLACE INTO _counts VALUES ('Street_Tree_List', (select count(*) from [Street_Tree_List]));\r\n\"\"\")\r\n```", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 777392020, "label": "Mechanism for maintaining cache of table counts using triggers"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/210#issuecomment-753406744", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/210", "id": 753406744, "node_id": "MDEyOklzc3VlQ29tbWVudDc1MzQwNjc0NA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-02T00:02:39Z", "updated_at": "2021-01-02T00:02:39Z", "author_association": "OWNER", "body": "It looks like https://github.com/ofajardo/pyreadr is a good library for this.\r\n\r\nI won't add this to `sqlite-utils` because it's quite a bulky dependency for a relatively small feature.\r\n\r\nNormally I'd write a `rdata-to-sqlite` tool similar to https://pypi.org/project/dbf-to-sqlite/ - but I'm actually working on a new plugin hook for Datasette that might be an even better fit for this. 
The idea is to allow Datasette plugins to define input formats - such as RData - which would then result in being able to import them on the command-line with `datasette insert my.db file.rdata` or by uploading a file through the Datasette web interface.\r\n\r\nThat work is happening over here: https://github.com/simonw/datasette/issues/1160 - I'll close this issue in favour of a sometime-in-the-future `datasette-import-rdata` plugin.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 767685961, "label": "Support of RData files"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/209#issuecomment-753405835", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/209", "id": 753405835, "node_id": "MDEyOklzc3VlQ29tbWVudDc1MzQwNTgzNQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-01T23:52:06Z", "updated_at": "2021-01-01T23:52:06Z", "author_association": "OWNER", "body": "I just hit this one too. Such a weird bug!", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 766156875, "label": "Test failure with sqlite 3.34 in test_cli.py::test_optimize"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1168#issuecomment-753402423", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1168", "id": 753402423, "node_id": "MDEyOklzc3VlQ29tbWVudDc1MzQwMjQyMw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-01T23:16:05Z", "updated_at": "2021-01-01T23:16:05Z", "author_association": "OWNER", "body": "One catch: solving the \"show me all metadata for everything in this Datasette instance\" problem.\r\n\r\nIdeally there would be a SQLite table that can be queried for this. But the need to resolve the potentially complex set of precedence rules means that table would be difficult if not impossible to provide at run-time.\r\n\r\nIdeally a denormalized table would be available that featured the results of running those precedence rule calculations. But how to handle keeping this up-to-date? It would need to be recalculated any time a `_metadata` table in any of the attached databases had an update.\r\n\r\nThis is a much larger problem - but one potential fix would be to use triggers to maintain a \"version number\" for the `_metadata` table - similar to SQLite's own built-in `schema_version` mechanism. Triggers could increment a counter any time a record in that table was added, deleted or updated.\r\n\r\nSuch a mechanism would have applications outside of just this `_metadata` system. 
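The version-number mechanism described above could look something like the following; this is only an illustration of the trigger pattern, with invented table and trigger names and a deliberately simplified `_metadata` schema:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript(
    """
CREATE TABLE _metadata (key TEXT PRIMARY KEY, value TEXT);
CREATE TABLE _table_versions ([table] TEXT PRIMARY KEY, version INTEGER NOT NULL DEFAULT 0);
INSERT INTO _table_versions VALUES ('_metadata', 0);
CREATE TRIGGER _metadata_version_ai AFTER INSERT ON _metadata BEGIN
  UPDATE _table_versions SET version = version + 1 WHERE [table] = '_metadata';
END;
CREATE TRIGGER _metadata_version_au AFTER UPDATE ON _metadata BEGIN
  UPDATE _table_versions SET version = version + 1 WHERE [table] = '_metadata';
END;
CREATE TRIGGER _metadata_version_ad AFTER DELETE ON _metadata BEGIN
  UPDATE _table_versions SET version = version + 1 WHERE [table] = '_metadata';
END;
"""
)
conn.execute("INSERT INTO _metadata VALUES ('title', 'My database')")
# A caller can cheaply poll the counter, much like SQLite's own schema_version
print(conn.execute("SELECT version FROM _table_versions").fetchone()[0])  # -> 1
```

A cache layer could then compare the stored version against the last value it saw to decide whether cached metadata (or cached counts) need refreshing.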
The ability to attach a version number to any table and have it automatically incremented when that table changes (via triggers) could help with all kinds of other Datasette-at-scale problems, including things like cached table counts.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 777333388, "label": "Mechanism for storing metadata in _metadata tables"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1168#issuecomment-753401001", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1168", "id": 753401001, "node_id": "MDEyOklzc3VlQ29tbWVudDc1MzQwMTAwMQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-01T23:01:45Z", "updated_at": "2021-01-01T23:01:45Z", "author_association": "OWNER", "body": "I need to prototype this. Could I do that as a plugin? I think so - I could try out the algorithm for loading metadata and display it on pages using some custom templates.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 777333388, "label": "Mechanism for storing metadata in _metadata tables"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1168#issuecomment-753400420", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1168", "id": 753400420, "node_id": "MDEyOklzc3VlQ29tbWVudDc1MzQwMDQyMA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-01T22:53:58Z", "updated_at": "2021-01-01T22:53:58Z", "author_association": "OWNER", "body": "Precedence idea:\r\n- First priority is non-_internal metadata from other databases - if those conflict then the alphabetically-ordered-first database name wins\r\n- Next priority: `_internal` metadata, which should have been loaded from `metadata.json`\r\n- Last priority: the `_metadata` table from that database itself, i.e. 
the default \"baked in\" metadata", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 777333388, "label": "Mechanism for storing metadata in _metadata tables"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1168#issuecomment-753400306", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1168", "id": 753400306, "node_id": "MDEyOklzc3VlQ29tbWVudDc1MzQwMDMwNg==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-01T22:52:44Z", "updated_at": "2021-01-01T22:52:44Z", "author_association": "OWNER", "body": "Also: probably load column metadata as part of the table metadata rather than loading column metadata individually, since it's going to be rare to want the metadata for a single column rather than for an entire table full of columns.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 777333388, "label": "Mechanism for storing metadata in _metadata tables"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1168#issuecomment-753400265", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1168", "id": 753400265, "node_id": "MDEyOklzc3VlQ29tbWVudDc1MzQwMDI2NQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-01T22:52:09Z", "updated_at": "2021-01-01T22:52:09Z", "author_association": "OWNER", "body": "From an implementation perspective, I think the way this works is SQL queries read the relevant metadata from ALL available metadata tables, then Python code solves the precedence rules to produce the final, combined metadata for a database/table/column.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 777333388, "label": "Mechanism for storing metadata in _metadata tables"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1168#issuecomment-753399635", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1168", "id": 753399635, "node_id": "MDEyOklzc3VlQ29tbWVudDc1MzM5OTYzNQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-01T22:45:21Z", "updated_at": "2021-01-01T22:50:21Z", "author_association": "OWNER", "body": "Would also need to figure out the precedence rules:\r\n\r\n- What happens if the database has a `_metadata` table with data that conflicts with a remote metadata record from another database? I think the other database should win, because that allows plugins to over-ride the default metadata for something.\r\n- Do JSON values get merged together? So if one table provides a description and another provides a title do both values get returned?\r\n- If a database has a `license`, does that \"cascade\" down to the tables? What about `source` and `about`?\r\n- What if there are two databases (or more) that provide conflicting metadata for a table in some other database? 
Also, `_internal` may have loaded data from `metadata.json` that conflicts with some other remote table metadata definition.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 777333388, "label": "Mechanism for storing metadata in _metadata tables"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1168#issuecomment-753399428", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1168", "id": 753399428, "node_id": "MDEyOklzc3VlQ29tbWVudDc1MzM5OTQyOA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-01T22:43:14Z", "updated_at": "2021-01-01T22:43:22Z", "author_association": "OWNER", "body": "Could this use a compound primary key on `database, table, column`? Does that work with null values?", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 777333388, "label": "Mechanism for storing metadata in _metadata tables"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1168#issuecomment-753399366", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1168", "id": 753399366, "node_id": "MDEyOklzc3VlQ29tbWVudDc1MzM5OTM2Ng==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-01T22:42:37Z", "updated_at": "2021-01-01T22:42:37Z", "author_association": "OWNER", "body": "So what would the database schema for this look like?\r\n\r\nI'm leaning towards a single table called `_metadata`, because that's a neater fit for baking the metadata into the database file along with the data that it is describing. Alternatively I could have multiple tables sharing that prefix - `_metadata_database` and `_metadata_tables` and `_metadata_columns` perhaps.\r\n\r\nIf it's just a single `_metadata` table, the schema could look like this:\r\n\r\n| database | table | column | metadata |\r\n| --- | --- | --- | --- |\r\n| | mytable | | {\"title\": \"My Table\" } |\r\n| | mytable | mycolumn | {\"description\": \"Column description\" } |\r\n| otherdb | othertable | | {\"description\": \"Table in another DB\" } |\r\n\r\nIf the `database` column is `null` it means \"this is describing a table in the same database file as this `_metadata` table\".\r\n\r\nThe alternative to the `metadata` JSON column would be separate columns for each potential metadata value - `license`, `source`, `about`, `about_url` etc. But that makes it harder for people to create custom metadata fields.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 777333388, "label": "Mechanism for storing metadata in _metadata tables"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1168#issuecomment-753398542", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1168", "id": 753398542, "node_id": "MDEyOklzc3VlQ29tbWVudDc1MzM5ODU0Mg==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-01T22:37:24Z", "updated_at": "2021-01-01T22:37:24Z", "author_association": "OWNER", "body": "The direction I'm leaning in now is the following:\r\n\r\n- Metadata always lives in SQLite tables\r\n- These tables can be co-located with the database they describe (same DB file)\r\n- ... 
or they can be in a different DB file and reference the other database that they are describing\r\n- Metadata provided on startup in a `metadata.json` file is loaded into an in-memory metadata table using that same mechanism\r\n\r\nPlugins that want to provide metadata can do so by populating a table. They could even maintain their own in-memory database for this, or they could write to the `_internal` in-memory database, or they could write to a table in a database on disk.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 777333388, "label": "Mechanism for storing metadata in _metadata tables"}, "performed_via_github_app": null}
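To make the precedence rules above concrete, here is a small sketch that resolves metadata for one table from rows gathered out of every available `_metadata` table. The row shape, function name and the choice to merge JSON values key-by-key are all assumptions made for illustration; they follow the ordering in the "Precedence idea" comment (other databases first, alphabetically; then `_internal`; then the database's own table) but are not Datasette's implementation:

```python
import json


def resolve_table_metadata(rows, database, table):
    """rows: (source_db, database, table, column, metadata_json) tuples
    collected from every attached database's _metadata table."""
    # A null database column means "the same database file as this _metadata table"
    matches = [
        r for r in rows
        if (r[1] or r[0]) == database and r[2] == table and r[3] is None
    ]

    def priority(row):
        source_db = row[0]
        if source_db == database:
            return (2, "")          # lowest: the database's own "baked in" metadata
        if source_db == "_internal":
            return (1, "")          # middle: metadata.json loaded into _internal
        return (0, source_db)       # highest: other databases, alphabetically-first wins

    combined = {}
    # Apply lowest-priority rows first so higher-priority values overwrite them
    for row in sorted(matches, key=priority, reverse=True):
        combined.update(json.loads(row[4]))
    return combined
```

Merging with `dict.update` is one possible answer to the open question above about whether conflicting JSON values should be combined or replaced wholesale.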