{"html_url": "https://github.com/simonw/datasette/issues/1576#issuecomment-999874886", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1576", "id": 999874886, "node_id": "IC_kwDOBm6k_c47mOFG", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-12-22T20:55:42Z", "updated_at": "2021-12-22T20:57:28Z", "author_association": "OWNER", "body": "One way to solve this would be to introduce a `set_task_id()` method, which sets an ID which will be returned by `get_task_id()` instead of using `id(current_task(loop=loop))`.\r\n\r\nIt would be really nice if I could solve this using `with` syntax somehow. Something like:\r\n```python\r\nwith trace_child_tasks():\r\n (\r\n suggested_facets,\r\n (facet_results, facets_timed_out),\r\n ) = await asyncio.gather(\r\n execute_suggested_facets(),\r\n execute_facets(),\r\n )\r\n```", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1087181951, "label": "Traces should include SQL executed by subtasks created with `asyncio.gather`"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1576#issuecomment-999874484", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1576", "id": 999874484, "node_id": "IC_kwDOBm6k_c47mN-0", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-12-22T20:54:52Z", "updated_at": "2021-12-22T20:54:52Z", "author_association": "OWNER", "body": "Here's the full current relevant code from `tracer.py`: https://github.com/simonw/datasette/blob/ace86566b28280091b3844cf5fbecd20158e9004/datasette/tracer.py#L8-L64\r\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1087181951, "label": "Traces should include SQL executed by subtasks created with `asyncio.gather`"}, "performed_via_github_app": 
null} {"html_url": "https://github.com/simonw/datasette/issues/1518#issuecomment-999870993", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1518", "id": 999870993, "node_id": "IC_kwDOBm6k_c47mNIR", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-12-22T20:47:18Z", "updated_at": "2021-12-22T20:50:24Z", "author_association": "OWNER", "body": "The reason they aren't showing up in the traces is that traces are stored just for the currently executing `asyncio` task ID: https://github.com/simonw/datasette/blob/ace86566b28280091b3844cf5fbecd20158e9004/datasette/tracer.py#L13-L25\r\n\r\nThis is so traces for other incoming requests don't end up mixed together. But there's no current mechanism to track async tasks that are effectively \"child tasks\" of the current request, and hence should be tracked the same.\r\n\r\nhttps://stackoverflow.com/a/69349501/6083 suggests that you pass the task ID as an argument to the child tasks that are executed using `asyncio.gather()` to work around this kind of problem.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1058072543, "label": "Complete refactor of TableView and table.html template"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1518#issuecomment-999870282", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1518", "id": 999870282, "node_id": "IC_kwDOBm6k_c47mM9K", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-12-22T20:45:56Z", "updated_at": "2021-12-22T20:46:08Z", "author_association": "OWNER", "body": "> New short-term goal: get facets and suggested facets to execute in parallel with the main query. 
Generate a trace graph that proves that is happening using `datasette-pretty-traces`.\r\n\r\nI wrote code to execute those in parallel using `asyncio.gather()` - which seems to work but causes the SQL run inside the parallel `async def` functions not to show up in the trace graph at all.\r\n\r\n```diff\r\ndiff --git a/datasette/views/table.py b/datasette/views/table.py\r\nindex 9808fd2..ec9db64 100644\r\n--- a/datasette/views/table.py\r\n+++ b/datasette/views/table.py\r\n@@ -1,3 +1,4 @@\r\n+import asyncio\r\n import urllib\r\n import itertools\r\n import json\r\n@@ -615,44 +616,37 @@ class TableView(RowTableShared):\r\n if request.args.get(\"_timelimit\"):\r\n extra_args[\"custom_time_limit\"] = int(request.args.get(\"_timelimit\"))\r\n \r\n- # Execute the main query!\r\n- results = await db.execute(sql, params, truncate=True, **extra_args)\r\n-\r\n- # Calculate the total count for this query\r\n- filtered_table_rows_count = None\r\n- if (\r\n- not db.is_mutable\r\n- and self.ds.inspect_data\r\n- and count_sql == f\"select count(*) from {table} \"\r\n- ):\r\n- # We can use a previously cached table row count\r\n- try:\r\n- filtered_table_rows_count = self.ds.inspect_data[database][\"tables\"][\r\n- table\r\n- ][\"count\"]\r\n- except KeyError:\r\n- pass\r\n-\r\n- # Otherwise run a select count(*) ...\r\n- if count_sql and filtered_table_rows_count is None and not nocount:\r\n- try:\r\n- count_rows = list(await db.execute(count_sql, from_sql_params))\r\n- filtered_table_rows_count = count_rows[0][0]\r\n- except QueryInterrupted:\r\n- pass\r\n-\r\n- # Faceting\r\n- if not self.ds.setting(\"allow_facet\") and any(\r\n- arg.startswith(\"_facet\") for arg in request.args\r\n- ):\r\n- raise BadRequest(\"_facet= is not allowed\")\r\n+ async def execute_count():\r\n+ # Calculate the total count for this query\r\n+ filtered_table_rows_count = None\r\n+ if (\r\n+ not db.is_mutable\r\n+ and self.ds.inspect_data\r\n+ and count_sql == f\"select count(*) from {table} \"\r\n+ 
):\r\n+ # We can use a previously cached table row count\r\n+ try:\r\n+ filtered_table_rows_count = self.ds.inspect_data[database][\r\n+ \"tables\"\r\n+ ][table][\"count\"]\r\n+ except KeyError:\r\n+ pass\r\n+\r\n+ if count_sql and filtered_table_rows_count is None and not nocount:\r\n+ try:\r\n+ count_rows = list(await db.execute(count_sql, from_sql_params))\r\n+ filtered_table_rows_count = count_rows[0][0]\r\n+ except QueryInterrupted:\r\n+ pass\r\n+\r\n+ return filtered_table_rows_count\r\n+\r\n+ filtered_table_rows_count = await execute_count()\r\n \r\n # pylint: disable=no-member\r\n facet_classes = list(\r\n itertools.chain.from_iterable(pm.hook.register_facet_classes())\r\n )\r\n- facet_results = {}\r\n- facets_timed_out = []\r\n facet_instances = []\r\n for klass in facet_classes:\r\n facet_instances.append(\r\n@@ -668,33 +662,58 @@ class TableView(RowTableShared):\r\n )\r\n )\r\n \r\n- if not nofacet:\r\n- for facet in facet_instances:\r\n- (\r\n- instance_facet_results,\r\n- instance_facets_timed_out,\r\n- ) = await facet.facet_results()\r\n- for facet_info in instance_facet_results:\r\n- base_key = facet_info[\"name\"]\r\n- key = base_key\r\n- i = 1\r\n- while key in facet_results:\r\n- i += 1\r\n- key = f\"{base_key}_{i}\"\r\n- facet_results[key] = facet_info\r\n- facets_timed_out.extend(instance_facets_timed_out)\r\n-\r\n- # Calculate suggested facets\r\n- suggested_facets = []\r\n- if (\r\n- self.ds.setting(\"suggest_facets\")\r\n- and self.ds.setting(\"allow_facet\")\r\n- and not _next\r\n- and not nofacet\r\n- and not nosuggest\r\n- ):\r\n- for facet in facet_instances:\r\n- suggested_facets.extend(await facet.suggest())\r\n+ async def execute_suggested_facets():\r\n+ # Calculate suggested facets\r\n+ suggested_facets = []\r\n+ if (\r\n+ self.ds.setting(\"suggest_facets\")\r\n+ and self.ds.setting(\"allow_facet\")\r\n+ and not _next\r\n+ and not nofacet\r\n+ and not nosuggest\r\n+ ):\r\n+ for facet in facet_instances:\r\n+ 
suggested_facets.extend(await facet.suggest())\r\n+ return suggested_facets\r\n+\r\n+ async def execute_facets():\r\n+ facet_results = {}\r\n+ facets_timed_out = []\r\n+ if not self.ds.setting(\"allow_facet\") and any(\r\n+ arg.startswith(\"_facet\") for arg in request.args\r\n+ ):\r\n+ raise BadRequest(\"_facet= is not allowed\")\r\n+\r\n+ if not nofacet:\r\n+ for facet in facet_instances:\r\n+ (\r\n+ instance_facet_results,\r\n+ instance_facets_timed_out,\r\n+ ) = await facet.facet_results()\r\n+ for facet_info in instance_facet_results:\r\n+ base_key = facet_info[\"name\"]\r\n+ key = base_key\r\n+ i = 1\r\n+ while key in facet_results:\r\n+ i += 1\r\n+ key = f\"{base_key}_{i}\"\r\n+ facet_results[key] = facet_info\r\n+ facets_timed_out.extend(instance_facets_timed_out)\r\n+\r\n+ return facet_results, facets_timed_out\r\n+\r\n+ # Execute the main query, facets and facet suggestions in parallel:\r\n+ (\r\n+ results,\r\n+ suggested_facets,\r\n+ (facet_results, facets_timed_out),\r\n+ ) = await asyncio.gather(\r\n+ db.execute(sql, params, truncate=True, **extra_args),\r\n+ execute_suggested_facets(),\r\n+ execute_facets(),\r\n+ )\r\n+\r\n+ results = await db.execute(sql, params, truncate=True, **extra_args)\r\n \r\n # Figure out columns and rows for the query\r\n columns = [r[0] for r in results.description]\r\n```\r\nHere's the trace for `http://127.0.0.1:4422/fixtures/compound_three_primary_keys?_trace=1&_facet=pk1&_facet=pk2` with the missing facet and facet suggestion queries:\r\n\r\n\"image\"\r\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1058072543, "label": "Complete refactor of TableView and table.html template"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1518#issuecomment-999863269", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1518", "id": 999863269, "node_id": 
"IC_kwDOBm6k_c47mLPl", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-12-22T20:35:41Z", "updated_at": "2021-12-22T20:37:13Z", "author_association": "OWNER", "body": "It looks like the count has to be executed before facets can be, because the facet_class constructor needs that total count figure: https://github.com/simonw/datasette/blob/6b1384b2f529134998fb507e63307609a5b7f5c0/datasette/views/table.py#L660-L671\r\n\r\nIt's used in facet suggestion logic here: https://github.com/simonw/datasette/blob/ace86566b28280091b3844cf5fbecd20158e9004/datasette/facets.py#L172-L178", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1058072543, "label": "Complete refactor of TableView and table.html template"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1518#issuecomment-999850191", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1518", "id": 999850191, "node_id": "IC_kwDOBm6k_c47mIDP", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-12-22T20:29:38Z", "updated_at": "2021-12-22T20:29:38Z", "author_association": "OWNER", "body": "New short-term goal: get facets and suggested facets to execute in parallel with the main query. 
Generate a trace graph that proves that is happening using `datasette-pretty-traces`.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1058072543, "label": "Complete refactor of TableView and table.html template"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1518#issuecomment-999837569", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1518", "id": 999837569, "node_id": "IC_kwDOBm6k_c47mE-B", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-12-22T20:15:45Z", "updated_at": "2021-12-22T20:15:45Z", "author_association": "OWNER", "body": "Also the whole `special_args` v.s. `request.args` thing is pretty confusing, I think that might be an older code pattern back from when I was using Sanic.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1058072543, "label": "Complete refactor of TableView and table.html template"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1518#issuecomment-999837220", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1518", "id": 999837220, "node_id": "IC_kwDOBm6k_c47mE4k", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-12-22T20:15:04Z", "updated_at": "2021-12-22T20:15:04Z", "author_association": "OWNER", "body": "I think I can move this much higher up in the method, it's a bit confusing having it half way through: https://github.com/simonw/datasette/blob/6b1384b2f529134998fb507e63307609a5b7f5c0/datasette/views/table.py#L414-L436", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1058072543, "label": "Complete refactor of TableView and 
table.html template"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1518#issuecomment-999831967", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1518", "id": 999831967, "node_id": "IC_kwDOBm6k_c47mDmf", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-12-22T20:04:47Z", "updated_at": "2021-12-22T20:10:11Z", "author_association": "OWNER", "body": "I think I might be able to clean up a lot of the stuff in here using the `render_cell` plugin hook: https://github.com/simonw/datasette/blob/6b1384b2f529134998fb507e63307609a5b7f5c0/datasette/views/table.py#L87-L89\r\n\r\nThe catch with that hook - https://docs.datasette.io/en/stable/plugin_hooks.html#render-cell-value-column-table-database-datasette - is that it gets called for every single cell. I don't want the overhead of looking up the foreign key relationships etc once for every value in a specific column.\r\n\r\nBut maybe I could extend the hook to include a shared cache that gets used for all of the cells in a specific table? Something like this:\r\n```python\r\nrender_cell(value, column, table, database, datasette, cache)\r\n```\r\n`cache` is a dictionary - and the same dictionary is passed to every call to that hook while rendering a specific page.\r\n\r\nIt's a bit of a gross hack though, and would it ever be useful for plugins outside of the default plugin in Datasette which does the foreign key stuff?\r\n\r\nIf I can think of one other potential application for this `cache` then I might implement it.\r\n\r\nNo, this optimization doesn't make sense: the most complex cell enrichment logic is the stuff that does a `select * from categories where id in (2, 5, 6)` query, using just the distinct set of IDs that are rendered on the current page. 
That's not going to fit in the `render_cell` hook no matter how hard I try to warp it into the right shape, because it needs full visibility of all of the results that are being rendered in order to collect those unique ID values.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1058072543, "label": "Complete refactor of TableView and table.html template"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1181#issuecomment-998999230", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1181", "id": 998999230, "node_id": "IC_kwDOBm6k_c47i4S-", "user": {"value": 9308268, "label": "rayvoelker"}, "created_at": "2021-12-21T18:25:15Z", "updated_at": "2021-12-21T18:25:15Z", "author_association": "NONE", "body": "I wonder if I'm encountering the same bug (or something related). I had previously been using the .csv feature to run queries and then fetch results for the pandas `read_csv()` function, but it seems to have stopped working recently.\r\n\r\nhttps://ilsweb.cincinnatilibrary.org/collection-analysis/collection-analysis/current_collection-3d56dbf.csv?sql=select%0D%0A++*%0D%0Afrom%0D%0A++bib%0D%0Alimit%0D%0A++100&_size=max\r\n\r\nDatasette v0.59.4\r\n![image](https://user-images.githubusercontent.com/9308268/146979957-66911877-2cd9-4022-bc76-fd54e4a3a6f7.png)\r\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 781262510, "label": "Certain database names results in 404: \"Database not found: None\""}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/1554#issuecomment-998354538", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1554", "id": 998354538, "node_id": "IC_kwDOBm6k_c47ga5q", "user": {"value": 9599, "label": 
"simonw"}, "created_at": "2021-12-20T23:52:04Z", "updated_at": "2021-12-20T23:52:04Z", "author_association": "OWNER", "body": "Abandoning this since it didn't work how I wanted.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1079129258, "label": "TableView refactor"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1547#issuecomment-997519202", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1547", "id": 997519202, "node_id": "IC_kwDOBm6k_c47dO9i", "user": {"value": 127565, "label": "wragge"}, "created_at": "2021-12-20T01:36:58Z", "updated_at": "2021-12-20T01:36:58Z", "author_association": "CONTRIBUTOR", "body": "Yep, that works -- thanks!", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1076388044, "label": "Writable canned queries fail to load custom templates"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1547#issuecomment-997514220", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1547", "id": 997514220, "node_id": "IC_kwDOBm6k_c47dNvs", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-12-20T01:26:25Z", "updated_at": "2021-12-20T01:26:25Z", "author_association": "OWNER", "body": "OK, this should hopefully fix that for you:\r\n\r\n pip install https://github.com/simonw/datasette/archive/f36e010b3b69ada104b79d83c7685caf9359049e.zip", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1076388044, "label": "Writable canned queries fail to load custom templates"}, "performed_via_github_app": null} {"html_url": 
"https://github.com/simonw/datasette/issues/1547#issuecomment-997513369", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1547", "id": 997513369, "node_id": "IC_kwDOBm6k_c47dNiZ", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-12-20T01:24:43Z", "updated_at": "2021-12-20T01:24:43Z", "author_association": "OWNER", "body": "@wragge thanks, that's a bug! Working on that in #1575.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1076388044, "label": "Writable canned queries fail to load custom templates"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1575#issuecomment-997513177", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1575", "id": 997513177, "node_id": "IC_kwDOBm6k_c47dNfZ", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-12-20T01:24:25Z", "updated_at": "2021-12-20T01:24:25Z", "author_association": "OWNER", "body": "Looks like `specname` is new in Pluggy 1.0: https://github.com/pytest-dev/pluggy/blob/main/CHANGELOG.rst#pluggy-100-2021-08-25", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1084257842, "label": "__call__() got an unexpected keyword argument 'specname'"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1547#issuecomment-997511968", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1547", "id": 997511968, "node_id": "IC_kwDOBm6k_c47dNMg", "user": {"value": 127565, "label": "wragge"}, "created_at": "2021-12-20T01:21:59Z", "updated_at": "2021-12-20T01:21:59Z", "author_association": "CONTRIBUTOR", "body": "I've installed the alpha version but get an error when starting up Datasette:\r\n\r\n```\r\nTraceback (most recent call last):\r\n File 
\"/Users/tim/.pyenv/versions/stock-exchange/bin/datasette\", line 5, in <module>\r\n from datasette.cli import cli\r\n File \"/Users/tim/.pyenv/versions/3.8.5/envs/stock-exchange/lib/python3.8/site-packages/datasette/cli.py\", line 15, in <module>\r\n from .app import Datasette, DEFAULT_SETTINGS, SETTINGS, SQLITE_LIMIT_ATTACHED, pm\r\n File \"/Users/tim/.pyenv/versions/3.8.5/envs/stock-exchange/lib/python3.8/site-packages/datasette/app.py\", line 31, in <module>\r\n from .views.database import DatabaseDownload, DatabaseView\r\n File \"/Users/tim/.pyenv/versions/3.8.5/envs/stock-exchange/lib/python3.8/site-packages/datasette/views/database.py\", line 25, in <module>\r\n from datasette.plugins import pm\r\n File \"/Users/tim/.pyenv/versions/3.8.5/envs/stock-exchange/lib/python3.8/site-packages/datasette/plugins.py\", line 29, in <module>\r\n mod = importlib.import_module(plugin)\r\n File \"/Users/tim/.pyenv/versions/3.8.5/lib/python3.8/importlib/__init__.py\", line 127, in import_module\r\n return _bootstrap._gcd_import(name[level:], package, level)\r\n File \"/Users/tim/.pyenv/versions/3.8.5/envs/stock-exchange/lib/python3.8/site-packages/datasette/filters.py\", line 9, in <module>\r\n @hookimpl(specname=\"filters_from_request\")\r\nTypeError: __call__() got an unexpected keyword argument 'specname'\r\n```", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1076388044, "label": "Writable canned queries fail to load custom templates"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/356#issuecomment-997507074", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/356", "id": 997507074, "node_id": "IC_kwDOCGYnMM47dMAC", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-12-20T01:10:06Z", "updated_at": "2021-12-20T01:16:11Z", "author_association": "OWNER", "body": "Work-in-progress improved help:\r\n```\r\nUsage: sqlite-utils 
insert [OPTIONS] PATH TABLE FILE\r\n\r\n Insert records from FILE into a table, creating the table if it does not\r\n already exist.\r\n\r\n By default the input is expected to be a JSON array of objects. Or:\r\n\r\n - Use --nl for newline-delimited JSON objects\r\n - Use --csv or --tsv for comma-separated or tab-separated input\r\n - Use --lines to write each incoming line to a column called \"line\"\r\n - Use --all to write the entire input to a column called \"all\"\r\n\r\n You can also use --convert to pass a fragment of Python code that will be\r\n used to convert each input.\r\n\r\n Your Python code will be passed a \"row\" variable representing the imported\r\n row, and can return a modified row.\r\n\r\n If you are using --lines your code will be passed a \"line\" variable, and for\r\n --all an \"all\" variable.\r\n\r\nOptions:\r\n --pk TEXT Columns to use as the primary key, e.g. id\r\n --flatten Flatten nested JSON objects, so {\"a\": {\"b\": 1}}\r\n becomes {\"a_b\": 1}\r\n --nl Expect newline-delimited JSON\r\n -c, --csv Expect CSV input\r\n --tsv Expect TSV input\r\n --lines Treat each line as a single value called 'line'\r\n --all Treat input as a single value called 'all'\r\n --convert TEXT Python code to convert each item\r\n --import TEXT Python modules to import\r\n --delimiter TEXT Delimiter to use for CSV files\r\n --quotechar TEXT Quote character to use for CSV/TSV\r\n --sniff Detect delimiter and quote character\r\n --no-headers CSV file has no header row\r\n --batch-size INTEGER Commit every X records\r\n --alter Alter existing table to add any missing columns\r\n --not-null TEXT Columns that should be created as NOT NULL\r\n --default ... 
Default value that should be set for a column\r\n --encoding TEXT Character encoding for input, defaults to utf-8\r\n -d, --detect-types Detect types for columns in CSV/TSV data\r\n --load-extension TEXT SQLite extensions to load\r\n --silent Do not show progress bar\r\n --ignore Ignore records if pk already exists\r\n --replace Replace records if pk already exists\r\n --truncate Truncate table before inserting records, if table\r\n already exists\r\n -h, --help Show this message and exit.\r\n```", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1077431957, "label": "`sqlite-utils insert --convert` option"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/356#issuecomment-997508728", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/356", "id": 997508728, "node_id": "IC_kwDOCGYnMM47dMZ4", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-12-20T01:14:43Z", "updated_at": "2021-12-20T01:14:43Z", "author_association": "OWNER", "body": "(This makes me want `--extract` from #352 even more.)", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1077431957, "label": "`sqlite-utils insert --convert` option"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/163#issuecomment-997502242", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/163", "id": 997502242, "node_id": "IC_kwDOCGYnMM47dK0i", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-12-20T00:56:45Z", "updated_at": "2021-12-20T00:56:52Z", "author_association": "OWNER", "body": "> Maybe `sqlite-utils` should absorb all of the functionality from `sqlite-transform` - having two separate tools doesn't necessarily make sense.\r\n\r\nI 
implemented that in:\r\n- #251", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 706001517, "label": "Idea: conversions= could take Python functions"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/356#issuecomment-997497262", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/356", "id": 997497262, "node_id": "IC_kwDOCGYnMM47dJmu", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-12-20T00:40:15Z", "updated_at": "2021-12-20T00:40:15Z", "author_association": "OWNER", "body": "`--flatten` could do with a better description too.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1077431957, "label": "`sqlite-utils insert --convert` option"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/356#issuecomment-997496931", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/356", "id": 997496931, "node_id": "IC_kwDOCGYnMM47dJhj", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-12-20T00:39:14Z", "updated_at": "2021-12-20T00:39:52Z", "author_association": "OWNER", "body": "```\r\n% sqlite-utils insert --help\r\nUsage: sqlite-utils insert [OPTIONS] PATH TABLE JSON_FILE\r\n\r\n Insert records from JSON file into a table, creating the table if it does\r\n not already exist.\r\n\r\n Input should be a JSON array of objects, unless --nl or --csv is used.\r\n\r\nOptions:\r\n --pk TEXT Columns to use as the primary key, e.g. 
id\r\n --nl Expect newline-delimited JSON\r\n --flatten Flatten nested JSON objects\r\n -c, --csv Expect CSV\r\n --tsv Expect TSV\r\n --convert TEXT Python code to convert each item\r\n --import TEXT Python modules to import\r\n --delimiter TEXT Delimiter to use for CSV files\r\n --quotechar TEXT Quote character to use for CSV/TSV\r\n --sniff Detect delimiter and quote character\r\n --no-headers CSV file has no header row\r\n --batch-size INTEGER Commit every X records\r\n --alter Alter existing table to add any missing columns\r\n --not-null TEXT Columns that should be created as NOT NULL\r\n --default ... Default value that should be set for a column\r\n --encoding TEXT Character encoding for input, defaults to utf-8\r\n -d, --detect-types Detect types for columns in CSV/TSV data\r\n --load-extension TEXT SQLite extensions to load\r\n --silent Do not show progress bar\r\n --ignore Ignore records if pk already exists\r\n --replace Replace records if pk already exists\r\n --truncate Truncate table before inserting records, if table\r\n already exists\r\n -h, --help Show this message and exit.\r\n```\r\nI can add a bunch of extra help at the top there to explain all of this stuff. 
That \"Input should be a JSON array of objects\" bit could be expanded to several paragraphs.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1077431957, "label": "`sqlite-utils insert --convert` option"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/356#issuecomment-997492872", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/356", "id": 997492872, "node_id": "IC_kwDOCGYnMM47dIiI", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-12-20T00:23:31Z", "updated_at": "2021-12-20T00:23:31Z", "author_association": "OWNER", "body": "I think this should work on JSON, or CSV, or individual lines, or the entire content at once.\r\n\r\nSo I'll require `--lines --convert ...` to import individual lines, or `--all --convert` to run the conversion against the entire input at once.\r\n\r\nWhat would `--lines` or `--all` do without `--convert`? 
Maybe insert records as `{\"line\": \"line of text\"}` or `{\"all\": \"whole input\"}`.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1077431957, "label": "`sqlite-utils insert --convert` option"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/356#issuecomment-997486156", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/356", "id": 997486156, "node_id": "IC_kwDOCGYnMM47dG5M", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-12-19T23:51:02Z", "updated_at": "2021-12-19T23:51:02Z", "author_association": "OWNER", "body": "This is going to need a `--import` multi option too.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1077431957, "label": "`sqlite-utils insert --convert` option"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/356#issuecomment-997485361", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/356", "id": 997485361, "node_id": "IC_kwDOCGYnMM47dGsx", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-12-19T23:45:30Z", "updated_at": "2021-12-19T23:45:30Z", "author_association": "OWNER", "body": "Really interesting example input for this: https://blog.timac.org/2021/1219-state-of-swift-and-swiftui-ios15/iOS13.txt - see https://blog.timac.org/2021/1219-state-of-swift-and-swiftui-ios15/", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1077431957, "label": "`sqlite-utils insert --convert` option"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1565#issuecomment-997474022", "issue_url": 
"https://api.github.com/repos/simonw/datasette/issues/1565", "id": 997474022, "node_id": "IC_kwDOBm6k_c47dD7m", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-12-19T22:36:49Z", "updated_at": "2021-12-19T22:37:29Z", "author_association": "OWNER", "body": "No way with a tagged template literal to pass an extra database name argument, so instead I need a method that returns a callable that can be used for the tagged template literal for a specific database - or the default database.\r\n\r\nThis could work (bit weird looking though):\r\n```javascript\r\nvar rows = await datasette.query(\"fixtures\")`select * from foo`;\r\n```", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1083657868, "label": "Documented JavaScript variables on different templates made available for plugins"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1565#issuecomment-997473856", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1565", "id": 997473856, "node_id": "IC_kwDOBm6k_c47dD5A", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-12-19T22:35:20Z", "updated_at": "2021-12-19T22:35:20Z", "author_association": "OWNER", "body": "Quick prototype of that tagged template `query` function:\r\n\r\n```javascript\r\nfunction query(pieces, ...parameters) {\r\n var qs = new URLSearchParams();\r\n var sql = pieces[0];\r\n parameters.forEach((param, i) => {\r\n sql += `:p${i}${pieces[i + 1]}`;\r\n qs.append(`p${i}`, param);\r\n });\r\n qs.append(\"sql\", sql);\r\n return qs.toString();\r\n}\r\n\r\nvar id = 4;\r\nconsole.log(query`select * from ids where id > ${id}`);\r\n```\r\nOutputs:\r\n```\r\np0=4&sql=select+*+from+ids+where+id+%3E+%3Ap0\r\n```", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", 
"issue": {"value": 1083657868, "label": "Documented JavaScript variables on different templates made available for plugins"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1565#issuecomment-997472639", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1565", "id": 997472639, "node_id": "IC_kwDOBm6k_c47dDl_", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-12-19T22:25:50Z", "updated_at": "2021-12-19T22:25:50Z", "author_association": "OWNER", "body": "Or...\r\n```javascript\r\nrows = await datasette.query`select * from searchable where id > ${id}`;\r\n```\r\nAnd it knows how to turn that into a parameterized call using tagged template literals.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1083657868, "label": "Documented JavaScript variables on different templates made available for plugins"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1565#issuecomment-997472509", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1565", "id": 997472509, "node_id": "IC_kwDOBm6k_c47dDj9", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-12-19T22:24:50Z", "updated_at": "2021-12-19T22:24:50Z", "author_association": "OWNER", "body": "... 
huh, it could even expose a JavaScript function that can be called to execute a SQL query.\r\n\r\n```javascript\r\ndatasette.query(\"select * from blah\").then(...)\r\n```\r\nMaybe it takes an optional second argument that specifies the database - defaulting to the one for the current page.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1083657868, "label": "Documented JavaScript variables on different templates made available for plugins"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1565#issuecomment-997472370", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1565", "id": 997472370, "node_id": "IC_kwDOBm6k_c47dDhy", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-12-19T22:23:36Z", "updated_at": "2021-12-19T22:23:36Z", "author_association": "OWNER", "body": "This should also expose the JSON API endpoints used to execute SQL against this database.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1083657868, "label": "Documented JavaScript variables on different templates made available for plugins"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1518#issuecomment-997472214", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1518", "id": 997472214, "node_id": "IC_kwDOBm6k_c47dDfW", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-12-19T22:22:08Z", "updated_at": "2021-12-19T22:22:08Z", "author_association": "OWNER", "body": "I sketched out a chained SQL builder pattern that might be useful for further tidying up this code - though with the new plugin hook I'm less excited about it than I was:\r\n\r\n```python\r\nclass TableQuery:\r\n def __init__(self, table, columns, pks, 
is_view=False, prev=None):\r\n self.table = table\r\n self.columns = columns\r\n self.pks = pks\r\n self.is_view = is_view\r\n self.prev = prev\r\n \r\n # These can be changed for different instances in the chain:\r\n self._where_clauses = None\r\n self._order_by = None\r\n self._page_size = None\r\n self._offset = None\r\n self._select_columns = None\r\n\r\n self.select_all_columns = '*'\r\n self.select_specified_columns = '*'\r\n\r\n @property\r\n def where_clauses(self):\r\n wheres = []\r\n current = self\r\n while current:\r\n if current._where_clauses is not None:\r\n wheres.extend(current._where_clauses)\r\n current = current.prev\r\n return list(reversed(wheres))\r\n\r\n def where(self, where):\r\n new_cls = TableQuery(self.table, self.columns, self.pks, self.is_view, self)\r\n new_cls._where_clauses = [where]\r\n return new_cls\r\n \r\n @classmethod\r\n async def introspect(cls, db, table):\r\n return cls(\r\n table,\r\n columns = await db.table_columns(table),\r\n pks = await db.primary_keys(table),\r\n is_view = bool(await db.get_view_definition(table))\r\n )\r\n \r\n @property\r\n def sql_from(self):\r\n return f\"from {self.table}{self.sql_where}\"\r\n\r\n @property\r\n def sql_where(self):\r\n if not self.where_clauses:\r\n return \"\"\r\n else:\r\n return f\" where {' and '.join(self.where_clauses)}\"\r\n\r\n @property\r\n def sql_no_order_no_limit(self):\r\n return f\"select {self.select_all_columns} from {self.table}{self.sql_where}\"\r\n\r\n @property\r\n def sql(self):\r\n return f\"select {self.select_specified_columns} from {self.table} {self.sql_where}{self._order_by} limit {self._page_size}{self._offset}\"\r\n\r\n @property\r\n def sql_count(self):\r\n return f\"select count(*) {self.sql_from}\"\r\n\r\n\r\n def __repr__(self):\r\n return f\"\"\r\n```\r\nUsage:\r\n```python\r\nfrom datasette.app import Datasette\r\nds = Datasette(memory=True, files=[\"/Users/simon/Dropbox/Development/datasette/fixtures.db\"])\r\ndb = 
ds.get_database(\"fixtures\")\r\nquery = await TableQuery.introspect(db, \"facetable\")\r\nprint(query.where(\"foo = bar\").where(\"baz = 1\").sql_count)\r\n# 'select count(*) from facetable where foo = bar and baz = 1'\r\n```", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1058072543, "label": "Complete refactor of TableView and table.html template"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1547#issuecomment-997471672", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1547", "id": 997471672, "node_id": "IC_kwDOBm6k_c47dDW4", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-12-19T22:18:26Z", "updated_at": "2021-12-19T22:18:26Z", "author_association": "OWNER", "body": "I released this [in an alpha](https://github.com/simonw/datasette/releases/tag/0.60a1), so you can try out this fix using:\r\n\r\n pip install datasette==0.60a1", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1076388044, "label": "Writable canned queries fail to load custom templates"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1566#issuecomment-997470633", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1566", "id": 997470633, "node_id": "IC_kwDOBm6k_c47dDGp", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-12-19T22:12:00Z", "updated_at": "2021-12-19T22:12:00Z", "author_association": "OWNER", "body": "Released another alpha, 0.60a1: https://github.com/simonw/datasette/releases/tag/0.60a1", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1083669410, "label": "Release Datasette 0.60"}, 
"performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1545#issuecomment-997462604", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1545", "id": 997462604, "node_id": "IC_kwDOBm6k_c47dBJM", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-12-19T21:17:08Z", "updated_at": "2021-12-19T21:17:08Z", "author_association": "OWNER", "body": "Here's the relevant code: https://github.com/simonw/datasette/blob/4094741c2881c2ada3f3f878b532fdaec7914953/datasette/app.py#L1204-L1219\r\n\r\nIt's using `route_path.split(\"/\")` which should be OK because that's the incoming `request.path` path - which I would expect to use `/` even on Windows. Then it uses `os.path.join` which should do the right thing.\r\n\r\nI need to get myself a proper Windows development environment setup to investigate this one.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1075893249, "label": "Custom pages don't work on windows"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1573#issuecomment-997462117", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1573", "id": 997462117, "node_id": "IC_kwDOBm6k_c47dBBl", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-12-19T21:13:13Z", "updated_at": "2021-12-19T21:13:13Z", "author_association": "OWNER", "body": "This might also be the impetus I need to bring the https://datasette.io/plugins/datasette-pretty-traces plugin into Datasette core itself.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1084185188, "label": "Make trace() a documented internal API"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1547#issuecomment-997460731", 
"issue_url": "https://api.github.com/repos/simonw/datasette/issues/1547", "id": 997460731, "node_id": "IC_kwDOBm6k_c47dAr7", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-12-19T21:02:15Z", "updated_at": "2021-12-19T21:02:15Z", "author_association": "OWNER", "body": "Yes, this is a bug. It looks like the problem is with the `if write:` branch in this code here: https://github.com/simonw/datasette/blob/5fac26aa221a111d7633f2dd92014641f7c0ade9/datasette/views/database.py#L252-L327\r\n\r\nIs missing this bit of code:\r\n\r\nhttps://github.com/simonw/datasette/blob/5fac26aa221a111d7633f2dd92014641f7c0ade9/datasette/views/database.py#L343-L347", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1076388044, "label": "Writable canned queries fail to load custom templates"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1570#issuecomment-997460061", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1570", "id": 997460061, "node_id": "IC_kwDOBm6k_c47dAhd", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-12-19T20:56:54Z", "updated_at": "2021-12-19T20:56:54Z", "author_association": "OWNER", "body": "Documentation: https://docs.datasette.io/en/latest/internals.html#await-db-execute-write-sql-params-none-block-false", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1083921371, "label": "Separate db.execute_write() into three methods"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1555#issuecomment-997459958", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1555", "id": 997459958, "node_id": "IC_kwDOBm6k_c47dAf2", "user": {"value": 9599, "label": "simonw"}, "created_at": 
"2021-12-19T20:55:59Z", "updated_at": "2021-12-19T20:55:59Z", "author_association": "OWNER", "body": "Closing this issue because I've optimized this a whole bunch, and it's definitely good enough for the moment.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1079149656, "label": "Optimize all those calls to index_list and foreign_key_list"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1555#issuecomment-997325189", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1555", "id": 997325189, "node_id": "IC_kwDOBm6k_c47cfmF", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-12-19T03:55:01Z", "updated_at": "2021-12-19T20:54:51Z", "author_association": "OWNER", "body": "It's a bit annoying that the queries no longer show up in the trace at all now, thanks to running in `.execute_fn()`. I wonder if there's something smart I can do about that - maybe have `trace()` record that function with a traceback even though it doesn't have the executed SQL string?\r\n\r\n5fac26aa221a111d7633f2dd92014641f7c0ade9 has the same problem.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1079149656, "label": "Optimize all those calls to index_list and foreign_key_list"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1555#issuecomment-997459637", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1555", "id": 997459637, "node_id": "IC_kwDOBm6k_c47dAa1", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-12-19T20:53:46Z", "updated_at": "2021-12-19T20:53:46Z", "author_association": "OWNER", "body": "Using #1571 showed me that the `DELETE FROM columns/foreign_keys/indexes WHERE database_name = ? 
and table_name = ?` queries were running way more times than I expected. I came up with a new optimization that just does `DELETE FROM columns/foreign_keys/indexes WHERE database_name = ?` instead.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1079149656, "label": "Optimize all those calls to index_list and foreign_key_list"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1566#issuecomment-997457790", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1566", "id": 997457790, "node_id": "IC_kwDOBm6k_c47c_9-", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-12-19T20:40:50Z", "updated_at": "2021-12-19T20:40:57Z", "author_association": "OWNER", "body": "Also release new version of `datasette-pretty-traces` with this feature:\r\n- https://github.com/simonw/datasette-pretty-traces/issues/7", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1083669410, "label": "Release Datasette 0.60"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1555#issuecomment-997342494", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1555", "id": 997342494, "node_id": "IC_kwDOBm6k_c47cj0e", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-12-19T07:22:04Z", "updated_at": "2021-12-19T07:22:04Z", "author_association": "OWNER", "body": "Another option would be to provide an abstraction that makes it easier to run a group of SQL queries in the same thread at the same time, and have them traced correctly.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1079149656, "label": "Optimize all those 
calls to index_list and foreign_key_list"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1555#issuecomment-997324666", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1555", "id": 997324666, "node_id": "IC_kwDOBm6k_c47cfd6", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-12-19T03:47:51Z", "updated_at": "2021-12-19T03:48:09Z", "author_association": "OWNER", "body": "Here's a hacked together prototype of running all of that stuff inside a single function passed to `.execute_fn()`:\r\n\r\n```diff\r\ndiff --git a/datasette/utils/internal_db.py b/datasette/utils/internal_db.py\r\nindex 95055d8..58f9982 100644\r\n--- a/datasette/utils/internal_db.py\r\n+++ b/datasette/utils/internal_db.py\r\n@@ -1,4 +1,5 @@\r\n import textwrap\r\n+from datasette.utils import table_column_details\r\n \r\n \r\n async def init_internal_db(db):\r\n@@ -70,49 +71,70 @@ async def populate_schema_tables(internal_db, db):\r\n \"DELETE FROM tables WHERE database_name = ?\", [database_name], block=True\r\n )\r\n tables = (await db.execute(\"select * from sqlite_master WHERE type = 'table'\")).rows\r\n- tables_to_insert = []\r\n- columns_to_delete = []\r\n- columns_to_insert = []\r\n- foreign_keys_to_delete = []\r\n- foreign_keys_to_insert = []\r\n- indexes_to_delete = []\r\n- indexes_to_insert = []\r\n \r\n- for table in tables:\r\n- table_name = table[\"name\"]\r\n- tables_to_insert.append(\r\n- (database_name, table_name, table[\"rootpage\"], table[\"sql\"])\r\n- )\r\n- columns_to_delete.append((database_name, table_name))\r\n- columns = await db.table_column_details(table_name)\r\n- columns_to_insert.extend(\r\n- {\r\n- **{\"database_name\": database_name, \"table_name\": table_name},\r\n- **column._asdict(),\r\n- }\r\n- for column in columns\r\n- )\r\n- foreign_keys_to_delete.append((database_name, table_name))\r\n- foreign_keys = (\r\n- await db.execute(f\"PRAGMA foreign_key_list([{table_name}])\")\r\n- 
).rows\r\n- foreign_keys_to_insert.extend(\r\n- {\r\n- **{\"database_name\": database_name, \"table_name\": table_name},\r\n- **dict(foreign_key),\r\n- }\r\n- for foreign_key in foreign_keys\r\n- )\r\n- indexes_to_delete.append((database_name, table_name))\r\n- indexes = (await db.execute(f\"PRAGMA index_list([{table_name}])\")).rows\r\n- indexes_to_insert.extend(\r\n- {\r\n- **{\"database_name\": database_name, \"table_name\": table_name},\r\n- **dict(index),\r\n- }\r\n- for index in indexes\r\n+ def collect_info(conn):\r\n+ tables_to_insert = []\r\n+ columns_to_delete = []\r\n+ columns_to_insert = []\r\n+ foreign_keys_to_delete = []\r\n+ foreign_keys_to_insert = []\r\n+ indexes_to_delete = []\r\n+ indexes_to_insert = []\r\n+\r\n+ for table in tables:\r\n+ table_name = table[\"name\"]\r\n+ tables_to_insert.append(\r\n+ (database_name, table_name, table[\"rootpage\"], table[\"sql\"])\r\n+ )\r\n+ columns_to_delete.append((database_name, table_name))\r\n+ columns = table_column_details(conn, table_name)\r\n+ columns_to_insert.extend(\r\n+ {\r\n+ **{\"database_name\": database_name, \"table_name\": table_name},\r\n+ **column._asdict(),\r\n+ }\r\n+ for column in columns\r\n+ )\r\n+ foreign_keys_to_delete.append((database_name, table_name))\r\n+ foreign_keys = conn.execute(\r\n+ f\"PRAGMA foreign_key_list([{table_name}])\"\r\n+ ).fetchall()\r\n+ foreign_keys_to_insert.extend(\r\n+ {\r\n+ **{\"database_name\": database_name, \"table_name\": table_name},\r\n+ **dict(foreign_key),\r\n+ }\r\n+ for foreign_key in foreign_keys\r\n+ )\r\n+ indexes_to_delete.append((database_name, table_name))\r\n+ indexes = conn.execute(f\"PRAGMA index_list([{table_name}])\").fetchall()\r\n+ indexes_to_insert.extend(\r\n+ {\r\n+ **{\"database_name\": database_name, \"table_name\": table_name},\r\n+ **dict(index),\r\n+ }\r\n+ for index in indexes\r\n+ )\r\n+ return (\r\n+ tables_to_insert,\r\n+ columns_to_delete,\r\n+ columns_to_insert,\r\n+ foreign_keys_to_delete,\r\n+ 
foreign_keys_to_insert,\r\n+ indexes_to_delete,\r\n+ indexes_to_insert,\r\n )\r\n \r\n+ (\r\n+ tables_to_insert,\r\n+ columns_to_delete,\r\n+ columns_to_insert,\r\n+ foreign_keys_to_delete,\r\n+ foreign_keys_to_insert,\r\n+ indexes_to_delete,\r\n+ indexes_to_insert,\r\n+ ) = await db.execute_fn(collect_info)\r\n+\r\n await internal_db.execute_write_many(\r\n \"\"\"\r\n INSERT INTO tables (database_name, table_name, rootpage, sql)\r\n```\r\nFirst impressions: it looks like this helps **a lot** - as far as I can tell this is now taking around 21ms to get to the point at which all of those internal databases have been populated, where previously it took more than 180ms.\r\n\r\n![CleanShot 2021-12-18 at 19 47 22@2x](https://user-images.githubusercontent.com/9599/146663192-bba098d5-e7bd-4e2e-b525-2270867888a0.png)\r\n\r\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1079149656, "label": "Optimize all those calls to index_list and foreign_key_list"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1555#issuecomment-997324156", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1555", "id": 997324156, "node_id": "IC_kwDOBm6k_c47cfV8", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-12-19T03:40:05Z", "updated_at": "2021-12-19T03:40:05Z", "author_association": "OWNER", "body": "Using the prototype of this:\r\n- https://github.com/simonw/datasette-pretty-traces/issues/5\r\n\r\nI'm seeing about 180ms spent running all of these queries on startup!\r\n\r\n![CleanShot 2021-12-18 at 19 38 37@2x](https://user-images.githubusercontent.com/9599/146663045-46bda669-90de-474f-8870-345182725dc1.png)\r\n\r\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1079149656, "label": 
"Optimize all those calls to index_list and foreign_key_list"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1555#issuecomment-997321767", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1555", "id": 997321767, "node_id": "IC_kwDOBm6k_c47cewn", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-12-19T03:10:58Z", "updated_at": "2021-12-19T03:10:58Z", "author_association": "OWNER", "body": "I wonder how much overhead there is switching between the `async` event loop main code and the thread that runs the SQL queries.\r\n\r\nWould there be a performance boost if I gathered all of the column/index information in a single function run on the thread using `db.execute_fn()` I wonder? It would eliminate a bunch of switching between threads.\r\n\r\nWould be great to understand how much of an impact that would have.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1079149656, "label": "Optimize all those calls to index_list and foreign_key_list"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1555#issuecomment-997321653", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1555", "id": 997321653, "node_id": "IC_kwDOBm6k_c47ceu1", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-12-19T03:09:43Z", "updated_at": "2021-12-19T03:09:43Z", "author_association": "OWNER", "body": "On that same documentation page I just spotted this:\r\n\r\n> This feature is experimental and is subject to change. Further documentation will become available if and when the table-valued functions for PRAGMAs feature becomes officially supported. 
\r\n\r\nThis makes me nervous to rely on pragma function optimizations in Datasette itself.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1079149656, "label": "Optimize all those calls to index_list and foreign_key_list"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1555#issuecomment-997321477", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1555", "id": 997321477, "node_id": "IC_kwDOBm6k_c47cesF", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-12-19T03:07:33Z", "updated_at": "2021-12-19T03:07:33Z", "author_association": "OWNER", "body": "If I want to continue supporting SQLite prior to 3.16.0 (2017-01-02) I'll need this optimization to only kick in with versions that support table-valued PRAGMA functions, while keeping the old `PRAGMA foreign_key_list(table)` stuff working for those older versions.\r\n\r\nThat's feasible, but it's a bit more work - and I need to make sure I have robust testing in place for SQLite 3.15.0.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1079149656, "label": "Optimize all those calls to index_list and foreign_key_list"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1555#issuecomment-997321327", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1555", "id": 997321327, "node_id": "IC_kwDOBm6k_c47cepv", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-12-19T03:05:39Z", "updated_at": "2021-12-19T03:05:44Z", "author_association": "OWNER", "body": "This caught me out once before in:\r\n- https://github.com/simonw/datasette/issues/1276\r\n\r\nTurns out Glitch was running SQLite 3.11.0 from 2016-02-15.", "reactions": "{\"total_count\": 0, 
\"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1079149656, "label": "Optimize all those calls to index_list and foreign_key_list"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1555#issuecomment-997321217", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1555", "id": 997321217, "node_id": "IC_kwDOBm6k_c47ceoB", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-12-19T03:04:16Z", "updated_at": "2021-12-19T03:04:16Z", "author_association": "OWNER", "body": "One thing to watch out for though, from https://sqlite.org/pragma.html#pragfunc\r\n\r\n> The table-valued functions for PRAGMA feature was added in SQLite version 3.16.0 (2017-01-02). Prior versions of SQLite cannot use this feature. ", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1079149656, "label": "Optimize all those calls to index_list and foreign_key_list"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1555#issuecomment-997321115", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1555", "id": 997321115, "node_id": "IC_kwDOBm6k_c47cemb", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-12-19T03:03:12Z", "updated_at": "2021-12-19T03:03:12Z", "author_association": "OWNER", "body": "Table columns is a bit harder, because `table_xinfo` is only in SQLite 3.26.0 or higher: https://github.com/simonw/datasette/blob/d637ed46762fdbbd8e32b86f258cd9a53c1cfdc7/datasette/utils/__init__.py#L565-L581\r\n\r\nSo if that function is available: 
https://latest.datasette.io/fixtures?sql=SELECT%0D%0A++sqlite_master.name%2C%0D%0A++table_xinfo.*%0D%0AFROM%0D%0A++sqlite_master%2C%0D%0A++pragma_table_xinfo%28sqlite_master.name%29+AS+table_xinfo%0D%0AWHERE%0D%0A++sqlite_master.type+%3D+%27table%27\r\n\r\n```sql\r\nSELECT\r\n sqlite_master.name,\r\n table_xinfo.*\r\nFROM\r\n sqlite_master,\r\n pragma_table_xinfo(sqlite_master.name) AS table_xinfo\r\nWHERE\r\n sqlite_master.type = 'table'\r\n```\r\nAnd otherwise, using `table_info`: https://latest.datasette.io/fixtures?sql=SELECT%0D%0A++sqlite_master.name%2C%0D%0A++table_info.*%2C%0D%0A++0+as+hidden%0D%0AFROM%0D%0A++sqlite_master%2C%0D%0A++pragma_table_info%28sqlite_master.name%29+AS+table_info%0D%0AWHERE%0D%0A++sqlite_master.type+%3D+%27table%27\r\n\r\n```sql\r\nSELECT\r\n sqlite_master.name,\r\n table_info.*,\r\n 0 as hidden\r\nFROM\r\n sqlite_master,\r\n pragma_table_info(sqlite_master.name) AS table_info\r\nWHERE\r\n sqlite_master.type = 'table'\r\n```", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1079149656, "label": "Optimize all those calls to index_list and foreign_key_list"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1555#issuecomment-997320824", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1555", "id": 997320824, "node_id": "IC_kwDOBm6k_c47ceh4", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-12-19T02:59:57Z", "updated_at": "2021-12-19T03:00:44Z", "author_association": "OWNER", "body": "To list all indexes: https://latest.datasette.io/fixtures?sql=SELECT%0D%0A++sqlite_master.name%2C%0D%0A++index_list.*%0D%0AFROM%0D%0A++sqlite_master%2C%0D%0A++pragma_index_list%28sqlite_master.name%29+AS+index_list%0D%0AWHERE%0D%0A++sqlite_master.type+%3D+%27table%27\r\n\r\n```sql\r\nSELECT\r\n sqlite_master.name,\r\n index_list.*\r\nFROM\r\n sqlite_master,\r\n 
pragma_index_list(sqlite_master.name) AS index_list\r\nWHERE\r\n sqlite_master.type = 'table'\r\n```\r\n\r\nForeign keys: https://latest.datasette.io/fixtures?sql=SELECT%0D%0A++sqlite_master.name%2C%0D%0A++foreign_key_list.*%0D%0AFROM%0D%0A++sqlite_master%2C%0D%0A++pragma_foreign_key_list%28sqlite_master.name%29+AS+foreign_key_list%0D%0AWHERE%0D%0A++sqlite_master.type+%3D+%27table%27\r\n\r\n```sql\r\nSELECT\r\n sqlite_master.name,\r\n foreign_key_list.*\r\nFROM\r\n sqlite_master,\r\n pragma_foreign_key_list(sqlite_master.name) AS foreign_key_list\r\nWHERE\r\n sqlite_master.type = 'table'\r\n```", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1079149656, "label": "Optimize all those calls to index_list and foreign_key_list"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1566#issuecomment-997272328", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1566", "id": 997272328, "node_id": "IC_kwDOBm6k_c47cSsI", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-12-18T19:18:01Z", "updated_at": "2021-12-18T19:18:01Z", "author_association": "OWNER", "body": "Added some useful new documented internal methods in:\r\n- #1570", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1083669410, "label": "Release Datasette 0.60"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1555#issuecomment-997272223", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1555", "id": 997272223, "node_id": "IC_kwDOBm6k_c47cSqf", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-12-18T19:17:13Z", "updated_at": "2021-12-18T19:17:13Z", "author_association": "OWNER", "body": "That's a good optimization. 
Still need to deal with the huge flurry of `PRAGMA` queries though before I can consider this done.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1079149656, "label": "Optimize all those calls to index_list and foreign_key_list"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1570#issuecomment-997267583", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1570", "id": 997267583, "node_id": "IC_kwDOBm6k_c47cRh_", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-12-18T18:46:05Z", "updated_at": "2021-12-18T18:46:12Z", "author_association": "OWNER", "body": "This will replace the work done in #1569.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1083921371, "label": "Separate db.execute_write() into three methods"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1555#issuecomment-997267416", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1555", "id": 997267416, "node_id": "IC_kwDOBm6k_c47cRfY", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-12-18T18:44:53Z", "updated_at": "2021-12-18T18:45:28Z", "author_association": "OWNER", "body": "Rather than adding a `executemany=True` parameter, I'm now thinking a better design might be to have three methods:\r\n\r\n- `db.execute_write(sql, params=None, block=False)`\r\n- `db.execute_writescript(sql, block=False)`\r\n- `db.execute_writemany(sql, params_seq, block=False)`", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1079149656, "label": "Optimize all those calls to index_list and foreign_key_list"}, 
"performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1569#issuecomment-997266687", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1569", "id": 997266687, "node_id": "IC_kwDOBm6k_c47cRT_", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-12-18T18:41:40Z", "updated_at": "2021-12-18T18:41:40Z", "author_association": "OWNER", "body": "Updated documentation: https://docs.datasette.io/en/latest/internals.html#await-db-execute-write-sql-params-none-executescript-false-block-false", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1083895395, "label": "db.execute_write(..., executescript=True) parameter"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1555#issuecomment-997266100", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1555", "id": 997266100, "node_id": "IC_kwDOBm6k_c47cRK0", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-12-18T18:40:02Z", "updated_at": "2021-12-18T18:40:02Z", "author_association": "OWNER", "body": "The implementation of `cursor.executemany()` looks very efficient - it turns into a call to this C function with `multiple` set to `1`: https://github.com/python/cpython/blob/e002bbc6cce637171fb2b1391ffeca8643a13843/Modules/_sqlite/cursor.c#L468-L469", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1079149656, "label": "Optimize all those calls to index_list and foreign_key_list"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1555#issuecomment-997262475", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1555", "id": 997262475, "node_id": "IC_kwDOBm6k_c47cQSL", "user": {"value": 9599, "label": "simonw"}, 
"created_at": "2021-12-18T18:34:18Z", "updated_at": "2021-12-18T18:34:18Z", "author_association": "OWNER", "body": "\"image\"\r\n\r\nUsing `executescript=True` that call now takes 1.89ms to create all of those tables.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1079149656, "label": "Optimize all those calls to index_list and foreign_key_list"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1569#issuecomment-997249563", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1569", "id": 997249563, "node_id": "IC_kwDOBm6k_c47cNIb", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-12-18T18:21:23Z", "updated_at": "2021-12-18T18:21:23Z", "author_association": "OWNER", "body": "Goal here is to gain the ability to use `conn.executescript()` and still have it show up in the tracer.\r\n\r\nhttps://docs.python.org/3/library/sqlite3.html#sqlite3.Cursor.executescript", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1083895395, "label": "db.execute_write(..., executescript=True) parameter"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1555#issuecomment-997248364", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1555", "id": 997248364, "node_id": "IC_kwDOBm6k_c47cM1s", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-12-18T18:20:10Z", "updated_at": "2021-12-18T18:20:10Z", "author_association": "OWNER", "body": "Idea: teach `execute_write` to accept an optional `executescript=True` parameter, like this:\r\n```diff\r\ndiff --git a/datasette/database.py b/datasette/database.py\r\nindex 468e936..1a424f5 100644\r\n--- a/datasette/database.py\r\n+++ b/datasette/database.py\r\n@@ -94,10 +94,14 @@ 
class Database:\r\n f\"file:{self.path}{qs}\", uri=True, check_same_thread=False\r\n )\r\n \r\n- async def execute_write(self, sql, params=None, block=False):\r\n+ async def execute_write(self, sql, params=None, executescript=False, block=False):\r\n+ assert not executescript and params, \"Cannot use params with executescript=True\"\r\n def _inner(conn):\r\n with conn:\r\n- return conn.execute(sql, params or [])\r\n+ if executescript:\r\n+ return conn.executescript(sql)\r\n+ else:\r\n+ return conn.execute(sql, params or [])\r\n \r\n with trace(\"sql\", database=self.name, sql=sql.strip(), params=params):\r\n results = await self.execute_write_fn(_inner, block=block)\r\n```", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1079149656, "label": "Optimize all those calls to index_list and foreign_key_list"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1555#issuecomment-997245301", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1555", "id": 997245301, "node_id": "IC_kwDOBm6k_c47cMF1", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-12-18T18:17:04Z", "updated_at": "2021-12-18T18:17:04Z", "author_association": "OWNER", "body": "One downside of `conn.executescript()` is that it won't be picked up by the tracing mechanism - in fact nothing that uses `await db.execute_write_fn(fn, block=True)` or `await db.execute_fn(fn, block=True)` gets picked up by tracing.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1079149656, "label": "Optimize all those calls to index_list and foreign_key_list"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1555#issuecomment-997241969", "issue_url": 
"https://api.github.com/repos/simonw/datasette/issues/1555", "id": 997241969, "node_id": "IC_kwDOBm6k_c47cLRx", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-12-18T18:13:04Z", "updated_at": "2021-12-18T18:13:04Z", "author_association": "OWNER", "body": "Also: running all of those `CREATE TABLE IF NOT EXISTS` in a single call to `conn.executescript()` rather than as separate queries may speed things up too.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1079149656, "label": "Optimize all those calls to index_list and foreign_key_list"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1555#issuecomment-997241645", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1555", "id": 997241645, "node_id": "IC_kwDOBm6k_c47cLMt", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-12-18T18:12:26Z", "updated_at": "2021-12-18T18:12:26Z", "author_association": "OWNER", "body": "A simpler optimization would be just to turn all of those column and index reads into a single efficient UNION query against each database, then figure out the most efficient pattern to send them all as writes in one go as opposed to calling `.execute_write()` in a loop.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1079149656, "label": "Optimize all those calls to index_list and foreign_key_list"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1566#issuecomment-997235388", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1566", "id": 997235388, "node_id": "IC_kwDOBm6k_c47cJq8", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-12-18T17:32:07Z", "updated_at": "2021-12-18T17:32:07Z", "author_association": 
"OWNER", "body": "I can release a new version of `datasette-leaflet-freedraw` as soon as this is out.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1083669410, "label": "Release Datasette 0.60"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1555#issuecomment-997235086", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1555", "id": 997235086, "node_id": "IC_kwDOBm6k_c47cJmO", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-12-18T17:30:13Z", "updated_at": "2021-12-18T17:30:13Z", "author_association": "OWNER", "body": "Now that trace sees write queries (#1568) it's clear that there is a whole lot more DB activity then I had realized:\r\n\r\n\"image\"\r\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1079149656, "label": "Optimize all those calls to index_list and foreign_key_list"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1555#issuecomment-997234858", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1555", "id": 997234858, "node_id": "IC_kwDOBm6k_c47cJiq", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-12-18T17:28:44Z", "updated_at": "2021-12-18T17:28:44Z", "author_association": "OWNER", "body": "Maybe it would be worth exploring attaching each DB in turn to the _internal connection in order to perform these queries faster.\r\n\r\nI'm a bit worried about leaks though: the internal database isn't meant to be visible, even temporarily attaching another DB to it could cause SQL queries against that DB to be able to access the internal data.\r\n\r\nSo maybe instead the _internal connection gets to connect to the other DBs? 
There's a maximum of ten there I think, which is good for most but not all cases. But the cases with the most connected databases will see the worst performance!", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1079149656, "label": "Optimize all those calls to index_list and foreign_key_list"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1568#issuecomment-997153253", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1568", "id": 997153253, "node_id": "IC_kwDOBm6k_c47b1nl", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-12-18T06:20:23Z", "updated_at": "2021-12-18T06:20:23Z", "author_association": "OWNER", "body": "Now running at https://latest-with-plugins.datasette.io/github/commits?_trace=1", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1083726550, "label": "Trace should show queries on the write connection too"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1568#issuecomment-997128950", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1568", "id": 997128950, "node_id": "IC_kwDOBm6k_c47bvr2", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-12-18T02:38:01Z", "updated_at": "2021-12-18T02:38:01Z", "author_association": "OWNER", "body": "Prototype:\r\n\r\n```diff\r\ndiff --git a/datasette/database.py b/datasette/database.py\r\nindex 0a0c104..468e936 100644\r\n--- a/datasette/database.py\r\n+++ b/datasette/database.py\r\n@@ -99,7 +99,9 @@ class Database:\r\n with conn:\r\n return conn.execute(sql, params or [])\r\n \r\n- return await self.execute_write_fn(_inner, block=block)\r\n+ with trace(\"sql\", database=self.name, sql=sql.strip(), params=params):\r\n+ results = await 
self.execute_write_fn(_inner, block=block)\r\n+ return results\r\n \r\n async def execute_write_fn(self, fn, block=False):\r\n task_id = uuid.uuid5(uuid.NAMESPACE_DNS, \"datasette.io\")\r\n```\r\n\r\n\"image\"\r\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1083726550, "label": "Trace should show queries on the write connection too"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1561#issuecomment-997128712", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1561", "id": 997128712, "node_id": "IC_kwDOBm6k_c47bvoI", "user": {"value": 536941, "label": "fgregg"}, "created_at": "2021-12-18T02:35:48Z", "updated_at": "2021-12-18T02:35:48Z", "author_association": "CONTRIBUTOR", "body": "interesting! i love this feature. this + full caching with cloudflare is really super!", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1082765654, "label": "add hash id to \"_memory\" url if hashed url mode is turned on and crossdb is also turned on"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1555#issuecomment-997128508", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1555", "id": 997128508, "node_id": "IC_kwDOBm6k_c47bvk8", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-12-18T02:33:57Z", "updated_at": "2021-12-18T02:33:57Z", "author_association": "OWNER", "body": "Here's why - `trace` only applies to read, not write SQL operations: https://github.com/simonw/datasette/blob/7c8f8aa209e4ba7bf83976f8495d67c28fbfca24/datasette/database.py#L209-L211", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": 
{"value": 1079149656, "label": "Optimize all those calls to index_list and foreign_key_list"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1555#issuecomment-997128368", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1555", "id": 997128368, "node_id": "IC_kwDOBm6k_c47bviw", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-12-18T02:32:43Z", "updated_at": "2021-12-18T02:32:43Z", "author_association": "OWNER", "body": "I wonder why the `INSERT INTO` queries don't show up in that `?trace=1` view?", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1079149656, "label": "Optimize all those calls to index_list and foreign_key_list"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1555#issuecomment-997128251", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1555", "id": 997128251, "node_id": "IC_kwDOBm6k_c47bvg7", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-12-18T02:31:51Z", "updated_at": "2021-12-18T02:31:51Z", "author_association": "OWNER", "body": "I was thinking it might even be possible to convert this into a `insert into tables select from ...` query:\r\n\r\nhttps://github.com/simonw/datasette/blob/c00f29affcafce8314366852ba1a0f5a7dd25690/datasette/utils/internal_db.py#L102-L112\r\n\r\n But the `SELECT` runs against a separate database from the `INSERT INTO`, so I would have to setup a cross-database connection for this which feels a little too complicated.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1079149656, "label": "Optimize all those calls to index_list and foreign_key_list"}, "performed_via_github_app": null} {"html_url": 
"https://github.com/simonw/datasette/issues/1555#issuecomment-997128080", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1555", "id": 997128080, "node_id": "IC_kwDOBm6k_c47bveQ", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-12-18T02:30:19Z", "updated_at": "2021-12-18T02:30:19Z", "author_association": "OWNER", "body": "I think all of these queries happen in one place - in the `populate_schema_tables()` function - so optimizing them might be localized to just that area of the code, which would be nice:\r\n\r\nhttps://github.com/simonw/datasette/blob/c00f29affcafce8314366852ba1a0f5a7dd25690/datasette/utils/internal_db.py#L97-L183", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1079149656, "label": "Optimize all those calls to index_list and foreign_key_list"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1561#issuecomment-997127784", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1561", "id": 997127784, "node_id": "IC_kwDOBm6k_c47bvZo", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-12-18T02:27:56Z", "updated_at": "2021-12-18T02:27:56Z", "author_association": "OWNER", "body": "Oh that's an interesting solution, combining the hashes of all of the individual databases.\r\n\r\nI'm actually not a big fan of `hashed_url` mode - I implemented it right at the start of the project because it felt like a clever hack, and then ended up making it not-the-default a few years ago:\r\n\r\n- #418\r\n- #419\r\n- #421\r\n\r\nI've since not found myself wanting to use it at all for any of my projects - which makes me nervous, because it means there's a pretty complex feature that I'm not using at all, so it's only really protected by the existing unit tests for it.\r\n\r\nWhat I'd really like to do is figure out how to have hashed URL mode work 
entirely as a plugin - then I could extract it from Datasette core entirely (which would simplify a bunch of stuff) but people who find the optimization useful would be able to access it.\r\n\r\nI'm not sure that the existing plugin hooks are robust enough to do that yet though.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1082765654, "label": "add hash id to \"_memory\" url if hashed url mode is turned on and crossdb is also turned on"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1563#issuecomment-997127084", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1563", "id": 997127084, "node_id": "IC_kwDOBm6k_c47bvOs", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-12-18T02:22:30Z", "updated_at": "2021-12-18T02:22:30Z", "author_association": "OWNER", "body": "Docs here: https://docs.datasette.io/en/latest/internals.html#datasette-class", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1083573206, "label": "Datasette(... 
files=) should not be a required argument"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1563#issuecomment-997125191", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1563", "id": 997125191, "node_id": "IC_kwDOBm6k_c47buxH", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-12-18T02:10:20Z", "updated_at": "2021-12-18T02:10:20Z", "author_association": "OWNER", "body": "I should document the usage of this constructor in https://docs.datasette.io/en/stable/internals.html#datasette-class", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1083573206, "label": "Datasette(... files=) should not be a required argument"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1546#issuecomment-997124280", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1546", "id": 997124280, "node_id": "IC_kwDOBm6k_c47bui4", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-12-18T02:05:16Z", "updated_at": "2021-12-18T02:05:16Z", "author_association": "OWNER", "body": "Sure - there are actually several levels to this.\r\n\r\nThe code that creates connections to the database is this: https://github.com/simonw/datasette/blob/83bacfa9452babe7bd66e3579e23af988d00f6ac/datasette/database.py#L72-L95\r\n\r\nFor files on disk, it does this:\r\n```python\r\n# For read-only connections\r\nconn = sqlite3.connect( \"file:my.db?mode=ro\", uri=True, check_same_thread=False)\r\n# For connections that should be treated as immutable:\r\nconn = sqlite3.connect( \"file:my.db?immutable=1\", uri=True, check_same_thread=False)\r\n```\r\nFor in-memory databases it runs this after the connection has been created:\r\n```python\r\nconn.execute(\"PRAGMA query_only=1\")\r\n```\r\nSQLite `PRAGMA` queries are treated as dangerous: someone could run `PRAGMA 
query_only=0` to turn that previous option off for example.\r\n\r\nSo this function runs against any incoming SQL to verify that it looks like a `SELECT ...` and doesn't have anything like that in it.\r\n\r\nhttps://github.com/simonw/datasette/blob/83bacfa9452babe7bd66e3579e23af988d00f6ac/datasette/utils/__init__.py#L195-L204\r\n\r\nYou can see the tests for that here: https://github.com/simonw/datasette/blob/b1fed48a95516ae84c0f020582303ab50ab817e2/tests/test_utils.py#L136-L170", "reactions": "{\"total_count\": 1, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 1, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1076057610, "label": "validating the sql"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1564#issuecomment-997122938", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1564", "id": 997122938, "node_id": "IC_kwDOBm6k_c47buN6", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-12-18T01:55:25Z", "updated_at": "2021-12-18T01:55:46Z", "author_association": "OWNER", "body": "Made this change while working on this issue:\r\n- #1567\r\n\r\nI'm going to write a test for this that uses that `sleep()` SQL function from c35b84a2aabe2f14aeacf6cda4110ae1e94d6059.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1083581011, "label": "_prepare_connection not called on write connections"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1565#issuecomment-997121215", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1565", "id": 997121215, "node_id": "IC_kwDOBm6k_c47bty_", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-12-18T01:45:44Z", "updated_at": "2021-12-18T01:45:44Z", "author_association": "OWNER", "body": "I want to get this into Datasette 0.60 - #1566 - it's a small 
change that can unlock a lot of potential.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1083657868, "label": "Documented JavaScript variables on different templates made available for plugins"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/621#issuecomment-997120723", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/621", "id": 997120723, "node_id": "IC_kwDOBm6k_c47btrT", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-12-18T01:42:33Z", "updated_at": "2021-12-18T01:42:33Z", "author_association": "OWNER", "body": "I refactored this code out into the `filters.py` module in aa7f0037a46eb76ae6fe9bf2a1f616c58738ecdf", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 520681725, "label": "Syntax for ?_through= that works as a form field"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/617#issuecomment-552253893", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/617", "id": 552253893, "node_id": "MDEyOklzc3VlQ29tbWVudDU1MjI1Mzg5Mw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2019-11-11T00:46:42Z", "updated_at": "2021-12-18T01:41:47Z", "author_association": "OWNER", "body": "As noted in https://github.com/simonw/datasette/issues/621#issuecomment-552253208 a common pattern in this method is blocks of code that append new items to the `where_clauses`, `params` and `extra_human_descriptions` arrays. 
This is a useful refactoring opportunity.\r\n\r\nCode that fits this pattern:\r\n\r\n* The code that builds based on the filters: `where_clauses, params = filters.build_where_clauses(table)` and `human_description_en = filters.human_description_en(extra=extra_human_descriptions)`\r\n* Code that handles `?_where=`: `where_clauses.extend(request.args[\"_where\"])` - though note that this also appends to a `extra_wheres_for_ui` array which nothing else uses\r\n* The `_through=` code, see #621 for details\r\n* The code that deals with `?_search=` FTS\r\n\r\nThe keyset pagination code modifies `where_clauses` and `params` too, but I don't think it's quite going to work with the same abstraction that would cover the above examples.\r\n\r\n[UPDATE December 2021 - this comment became the basis for a new `filters_from_request` plugin hook, see also #473]", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 519613116, "label": "Refactor TableView.data() method"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1518#issuecomment-981153060", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1518", "id": 981153060, "node_id": "IC_kwDOBm6k_c46ezUk", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-11-28T21:13:09Z", "updated_at": "2021-12-17T23:37:08Z", "author_association": "OWNER", "body": "Two new requirements inspired by work on the `datasette-table` (and `datasette-notebook`) projects:\r\n\r\n- #1533\r\n- #1534", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1058072543, "label": "Complete refactor of TableView and table.html template"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/358#issuecomment-996482595", "issue_url": 
"https://api.github.com/repos/simonw/sqlite-utils/issues/358", "id": 996482595, "node_id": "IC_kwDOCGYnMM47ZR4j", "user": {"value": 11597658, "label": "luxint"}, "created_at": "2021-12-17T06:57:51Z", "updated_at": "2021-12-17T23:24:16Z", "author_association": "NONE", "body": "> This goes beyond the `transform()` method - the curious methods that create new SQL tables could benefit from the ability to add `CHECK` constraints too.\r\n> \r\n> I haven't used these myself, do you have any `CREATE TABLE` examples that use them that you can share?\r\n\r\nI'm using them myself for the first time as well, this is a tutorial of how to use (and change) them in sqlite: https://www.sqlitetutorial.net/sqlite-check-constraint/", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1082651698, "label": "Support for CHECK constraints"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1518#issuecomment-997082845", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1518", "id": 997082845, "node_id": "IC_kwDOBm6k_c47bkbd", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-12-17T23:10:09Z", "updated_at": "2021-12-17T23:10:17Z", "author_association": "OWNER", "body": "These changes so far are now in the 0.60a0 alpha: https://github.com/simonw/datasette/releases/tag/0.60a0", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1058072543, "label": "Complete refactor of TableView and table.html template"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/1559#issuecomment-997082676", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1559", "id": 997082676, "node_id": "IC_kwDOBm6k_c47bkY0", "user": {"value": 9599, "label": "simonw"}, 
"created_at": "2021-12-17T23:09:41Z", "updated_at": "2021-12-17T23:09:41Z", "author_association": "OWNER", "body": "This is now available to try out in Datasette 0.60a0: https://github.com/simonw/datasette/releases/tag/0.60a0", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1082743068, "label": "filters_from_request plugin hook, now used in TableView"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/1562#issuecomment-997082189", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1562", "id": 997082189, "node_id": "IC_kwDOBm6k_c47bkRN", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-12-17T23:08:14Z", "updated_at": "2021-12-17T23:08:14Z", "author_association": "OWNER", "body": "Oh that makes sense: In Python 3.6 this happens:\r\n```\r\nCollecting janus<1.1,>=0.6.2\r\n Using cached janus-0.7.0-py3-none-any.whl (6.9 kB)\r\n```\r\nWhile in Python 3.7 or higher this happens:\r\n```\r\nCollecting janus<1.1,>=0.6.2\r\n Downloading janus-1.0.0-py3-none-any.whl (6.9 kB)\r\n```\r\nSo this is safe to apply because `pip` is smart enough to pick the version of Janus that works for that Python version.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1083246400, "label": "Update janus requirement from <0.8,>=0.6.2 to >=0.6.2,<1.1"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/1562#issuecomment-997081673", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1562", "id": 997081673, "node_id": "IC_kwDOBm6k_c47bkJJ", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-12-17T23:06:38Z", "updated_at": "2021-12-17T23:06:38Z", "author_association": "OWNER", "body": "From this diff between `0.7.0` and `1.0`: 
https://github.com/aio-libs/janus/compare/v0.7.0...v1.0.0\r\n\r\nIt looks like the only change relevant to compatibility is that it now calls `loop = asyncio.get_running_loop()` directly instead of falling back to `asyncio.get_event_loop()` if `get_running_loop` isn't available.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1083246400, "label": "Update janus requirement from <0.8,>=0.6.2 to >=0.6.2,<1.1"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/1562#issuecomment-997080352", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1562", "id": 997080352, "node_id": "IC_kwDOBm6k_c47bj0g", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-12-17T23:03:08Z", "updated_at": "2021-12-17T23:03:08Z", "author_association": "OWNER", "body": "They say they've dropped 3.6 support, but Datasette's tests against 3.6 are still passing.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1083246400, "label": "Update janus requirement from <0.8,>=0.6.2 to >=0.6.2,<1.1"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1566#issuecomment-997078812", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1566", "id": 997078812, "node_id": "IC_kwDOBm6k_c47bjcc", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-12-17T22:58:55Z", "updated_at": "2021-12-17T22:58:55Z", "author_association": "OWNER", "body": "The release notes for the 0.60a0 alpha will be useful here: https://github.com/simonw/datasette/releases/tag/0.60a0", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1083669410, "label": "Release Datasette 0.60"},
"performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1565#issuecomment-997077410", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1565", "id": 997077410, "node_id": "IC_kwDOBm6k_c47bjGi", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-12-17T22:54:45Z", "updated_at": "2021-12-17T22:54:45Z", "author_association": "OWNER", "body": "The table page should expose the query both with and without the `limit` clause. The above gave me back:\r\n\r\n```sql\r\nselect id, ACCESS_TYP, UNIT_ID, UNIT_NAME, SUID_NMA, AGNCY_ID, AGNCY_NAME, AGNCY_LEV,\r\n AGNCY_TYP, AGNCY_WEB, LAYER, MNG_AG_ID, MNG_AGENCY, MNG_AG_LEV, MNG_AG_TYP,\r\n PARK_URL, COUNTY, ACRES, LABEL_NAME, YR_EST, DES_TP, GAP_STS, geometry\r\nfrom CPAD_2020a_Units where \"AGNCY_LEV\" = :p0 order by id limit 101\r\n```\r\nBut I actually wanted to run a `fetch()` against a version of that without the `order by id limit 101` bit (I wanted to figure out the `Extent()` of the `geometry` column) - so I need something like `datasette.table_sql_no_order_no_limit`.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1083657868, "label": "Documented JavaScript variables on different templates made available for plugins"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1565#issuecomment-997069128", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1565", "id": 997069128, "node_id": "IC_kwDOBm6k_c47bhFI", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-12-17T22:31:18Z", "updated_at": "2021-12-17T22:31:18Z", "author_association": "OWNER", "body": "This should aim to be as consistent as possible with the various arguments to hooks on https://docs.datasette.io/en/stable/plugin_hooks.html", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 
0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1083657868, "label": "Documented JavaScript variables on different templates made available for plugins"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/1559#issuecomment-996961196", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1559", "id": 996961196, "node_id": "IC_kwDOBm6k_c47bGus", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-12-17T19:00:53Z", "updated_at": "2021-12-17T19:00:53Z", "author_association": "OWNER", "body": "I'm going to merge this to `main` now. I can continue the refactoring there, but having it in `main` means I can put out an alpha release with the new hook which will unblock me from running tests against it in this repo: https://github.com/simonw/datasette-leaflet-freedraw/pull/8", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1082743068, "label": "filters_from_request plugin hook, now used in TableView"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/1559#issuecomment-996959325", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1559", "id": 996959325, "node_id": "IC_kwDOBm6k_c47bGRd", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-12-17T18:59:54Z", "updated_at": "2021-12-17T18:59:54Z", "author_association": "OWNER", "body": "I've convinced myself that this plugin hook design is good through this `datasette-leaflet-freedraw` prototype: https://github.com/simonw/datasette-leaflet-freedraw/blob/e8a16a0fe90656b8d655c02881d23a2b9833281d/datasette_leaflet_freedraw/__init__.py", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1082743068, "label": "filters_from_request plugin 
hook, now used in TableView"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/473#issuecomment-996958442", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/473", "id": 996958442, "node_id": "IC_kwDOBm6k_c47bGDq", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-12-17T18:59:27Z", "updated_at": "2021-12-17T18:59:27Z", "author_association": "OWNER", "body": "I'm happy with how the prototype that used this plugin in `datasette-leaflet-freedraw` turned out: https://github.com/simonw/datasette-leaflet-freedraw/blob/e8a16a0fe90656b8d655c02881d23a2b9833281d/datasette_leaflet_freedraw/__init__.py", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 445850934, "label": "Plugin hook: filters_from_request"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/473#issuecomment-996345233", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/473", "id": 996345233, "node_id": "IC_kwDOBm6k_c47YwWR", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-12-17T01:20:31Z", "updated_at": "2021-12-17T18:13:01Z", "author_association": "OWNER", "body": "I could use this hook to add table filtering on a map to the existing `datasette-leaflet-freedraw` plugin.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 445850934, "label": "Plugin hook: filters_from_request"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/1559#issuecomment-996289541", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1559", "id": 996289541, "node_id": "IC_kwDOBm6k_c47YiwF", "user": {"value": 22429695, "label": "codecov[bot]"}, "created_at": "2021-12-17T00:07:42Z", "updated_at": 
"2021-12-17T17:28:54Z", "author_association": "NONE", "body": "# [Codecov](https://codecov.io/gh/simonw/datasette/pull/1559?src=pr&el=h1&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison) Report\n> Merging [#1559](https://codecov.io/gh/simonw/datasette/pull/1559?src=pr&el=desc&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison) (71af58d) into [main](https://codecov.io/gh/simonw/datasette/commit/0663d5525cc41e9260ac7d1f6386d3a6eb5ad2a9?el=desc&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison) (0663d55) will **increase** coverage by `0.09%`.\n> The diff coverage is `97.97%`.\n\n[![Impacted file tree graph](https://codecov.io/gh/simonw/datasette/pull/1559/graphs/tree.svg?width=650&height=150&src=pr&token=eSahVY7kw1&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison)](https://codecov.io/gh/simonw/datasette/pull/1559?src=pr&el=tree&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison)\n\n```diff\n@@ Coverage Diff @@\n## main #1559 +/- ##\n==========================================\n+ Coverage 91.96% 92.05% +0.09% \n==========================================\n Files 34 34 \n Lines 4442 4493 +51 \n==========================================\n+ Hits 4085 4136 +51 \n Misses 357 357 \n```\n\n\n| [Impacted Files](https://codecov.io/gh/simonw/datasette/pull/1559?src=pr&el=tree&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison) | Coverage \u0394 | |\n|---|---|---|\n| [datasette/plugins.py](https://codecov.io/gh/simonw/datasette/pull/1559/diff?src=pr&el=tree&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison#diff-ZGF0YXNldHRlL3BsdWdpbnMucHk=) | `82.35% <\u00f8> (\u00f8)` | |\n| 
[datasette/filters.py](https://codecov.io/gh/simonw/datasette/pull/1559/diff?src=pr&el=tree&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison#diff-ZGF0YXNldHRlL2ZpbHRlcnMucHk=) | `95.69% <97.67%> (+1.33%)` | :arrow_up: |\n| [datasette/hookspecs.py](https://codecov.io/gh/simonw/datasette/pull/1559/diff?src=pr&el=tree&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison#diff-ZGF0YXNldHRlL2hvb2tzcGVjcy5weQ==) | `100.00% <100.00%> (\u00f8)` | |\n| [datasette/views/table.py](https://codecov.io/gh/simonw/datasette/pull/1559/diff?src=pr&el=tree&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison#diff-ZGF0YXNldHRlL3ZpZXdzL3RhYmxlLnB5) | `96.21% <100.00%> (+0.13%)` | :arrow_up: |\n\n------\n\n[Continue to review full report at Codecov](https://codecov.io/gh/simonw/datasette/pull/1559?src=pr&el=continue&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison).\n> **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta?utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison)\n> `\u0394 = absolute (impact)`, `\u00f8 = not affected`, `? = missing data`\n> Powered by [Codecov](https://codecov.io/gh/simonw/datasette/pull/1559?src=pr&el=footer&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison). Last update [0663d55...71af58d](https://codecov.io/gh/simonw/datasette/pull/1559?src=pr&el=lastupdated&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison). 
Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments?utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison).\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1082743068, "label": "filters_from_request plugin hook, now used in TableView"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/1559#issuecomment-996895423", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1559", "id": 996895423, "node_id": "IC_kwDOBm6k_c47a2q_", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-12-17T17:28:44Z", "updated_at": "2021-12-17T17:28:44Z", "author_association": "OWNER", "body": "Before I land this I'm going to build one prototype plugin against it to confirm that the new hook is useful in its current shape.\r\n\r\nI'll add support for filtering a table by drawing on a map to https://datasette.io/plugins/datasette-leaflet-freedraw", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1082743068, "label": "filters_from_request plugin hook, now used in TableView"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/1562#issuecomment-996716158", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1562", "id": 996716158, "node_id": "IC_kwDOBm6k_c47aK5-", "user": {"value": 22429695, "label": "codecov[bot]"}, "created_at": "2021-12-17T13:18:49Z", "updated_at": "2021-12-17T13:18:49Z", "author_association": "NONE", "body": "# [Codecov](https://codecov.io/gh/simonw/datasette/pull/1562?src=pr&el=h1&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison) Report\n> Merging 
[#1562](https://codecov.io/gh/simonw/datasette/pull/1562?src=pr&el=desc&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison) (2f008e8) into [main](https://codecov.io/gh/simonw/datasette/commit/0663d5525cc41e9260ac7d1f6386d3a6eb5ad2a9?el=desc&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison) (0663d55) will **not change** coverage.\n> The diff coverage is `n/a`.\n\n[![Impacted file tree graph](https://codecov.io/gh/simonw/datasette/pull/1562/graphs/tree.svg?width=650&height=150&src=pr&token=eSahVY7kw1&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison)](https://codecov.io/gh/simonw/datasette/pull/1562?src=pr&el=tree&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison)\n\n```diff\n@@ Coverage Diff @@\n## main #1562 +/- ##\n=======================================\n Coverage 91.96% 91.96% \n=======================================\n Files 34 34 \n Lines 4442 4442 \n=======================================\n Hits 4085 4085 \n Misses 357 357 \n```\n\n\n\n------\n\n[Continue to review full report at Codecov](https://codecov.io/gh/simonw/datasette/pull/1562?src=pr&el=continue&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison).\n> **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta?utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison)\n> `\u0394 = absolute (impact)`, `\u00f8 = not affected`, `? = missing data`\n> Powered by [Codecov](https://codecov.io/gh/simonw/datasette/pull/1562?src=pr&el=footer&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison). 
Last update [0663d55...2f008e8](https://codecov.io/gh/simonw/datasette/pull/1562?src=pr&el=lastupdated&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments?utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison).\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1083246400, "label": "Update janus requirement from <0.8,>=0.6.2 to >=0.6.2,<1.1"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/1204#issuecomment-996488925", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1204", "id": 996488925, "node_id": "IC_kwDOBm6k_c47ZTbd", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-12-17T07:10:48Z", "updated_at": "2021-12-17T07:10:48Z", "author_association": "OWNER", "body": "I think this is missing the `_macro.html` template file but I have that in my Dropbox.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 793002853, "label": "WIP: Plugin includes"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/473#issuecomment-996484551", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/473", "id": 996484551, "node_id": "IC_kwDOBm6k_c47ZSXH", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-12-17T07:02:21Z", "updated_at": "2021-12-17T07:04:23Z", "author_association": "OWNER", "body": "The one slightly weird thing about this hook is how it adds `extra_context` without an obvious way for plugins to add extra HTML to the templates based on that context.\r\n\r\nMaybe I need the proposed mechanism from\r\n- #1191\r\n\r\nWhich 
has an in-progress PR:\r\n- #1204", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 445850934, "label": "Plugin hook: filters_from_request"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1191#issuecomment-761104933", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1191", "id": 761104933, "node_id": "MDEyOklzc3VlQ29tbWVudDc2MTEwNDkzMw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-15T18:21:26Z", "updated_at": "2021-12-17T07:03:02Z", "author_association": "OWNER", "body": "Also related: #857 (comprehensive documentation of variables available to templates) - since then the plugin hook could be fed the full template context and use that to do its thing.\r\n\r\nOr maybe the plugin hooks gets to return the name of a template that should be `{% include %}` into the page at that point? But the plugin may want to add extra context that is available to that template include.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 787098345, "label": "Ability for plugins to collaborate when adding extra HTML to blocks in default templates"}, "performed_via_github_app": null}