html_url,issue_url,id,node_id,user,created_at,updated_at,author_association,body,reactions,issue,performed_via_github_app
https://github.com/simonw/datasette/issues/1723#issuecomment-1110330554,https://api.github.com/repos/simonw/datasette/issues/1723,1110330554,IC_kwDOBm6k_c5CLky6,9599,2022-04-26T23:06:20Z,2022-04-26T23:06:20Z,OWNER,Deployed here: https://latest-with-plugins.datasette.io/github/commits?_facet=repo&_trace=1&_facet=committer,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1216508080,
https://github.com/simonw/datasette/issues/1723#issuecomment-1110305790,https://api.github.com/repos/simonw/datasette/issues/1723,1110305790,IC_kwDOBm6k_c5CLev-,9599,2022-04-26T22:19:04Z,2022-04-26T22:19:04Z,OWNER,"I realized that seeing the total time spent in queries wasn't enough to understand this, because whether the queries were executed in serial or in parallel they should still sum to roughly the same total SQL time.
Instead I need to know how long the page took to render. But that's hard to display on the page since you can't measure it until rendering has finished!
So I built an ASGI plugin to handle that measurement: https://github.com/simonw/datasette-total-page-time
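The underlying idea (a minimal sketch, not the actual plugin code - the class and attribute names here are invented for illustration) is an ASGI middleware that records the time from when the request arrives until the final response body message has been sent:

```python
import time


class TotalPageTime:
    # Wraps any ASGI app and measures wall-clock time from request arrival
    # until the last http.response.body message has gone out.
    def __init__(self, app):
        self.app = app
        self.last_total_ms = None

    async def __call__(self, scope, receive, send):
        if scope['type'] != 'http':
            return await self.app(scope, receive, send)
        start = time.monotonic()

        async def timed_send(event):
            await send(event)
            # The final body message has more_body absent or False
            if event['type'] == 'http.response.body' and not event.get('more_body'):
                self.last_total_ms = (time.monotonic() - start) * 1000

        await self.app(scope, receive, timed_send)
```

The real plugin presumably works along these lines, then goes further and injects the measured time into the rendered HTML page.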
And with that plugin installed, `http://127.0.0.1:8001/global-power-plants/global-power-plants?_facet=primary_fuel&_facet=other_fuel2&_facet=other_fuel1&_parallel=1` (the parallel version) takes 377ms:
While `http://127.0.0.1:8001/global-power-plants/global-power-plants?_facet=primary_fuel&_facet=other_fuel2&_facet=other_fuel1` (the serial version) takes 762ms:
","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1216508080,
https://github.com/simonw/datasette/issues/1723#issuecomment-1110279869,https://api.github.com/repos/simonw/datasette/issues/1723,1110279869,IC_kwDOBm6k_c5CLYa9,9599,2022-04-26T21:45:39Z,2022-04-26T21:45:39Z,OWNER,"Getting some nice traces out of this:
","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1216508080,
https://github.com/simonw/datasette/issues/1723#issuecomment-1110278577,https://api.github.com/repos/simonw/datasette/issues/1723,1110278577,IC_kwDOBm6k_c5CLYGx,9599,2022-04-26T21:44:04Z,2022-04-26T21:44:04Z,OWNER,"And some simple benchmarks with `ab` - using the `?_parallel=1` hack to try it with and without a parallel `asyncio.gather()`:
```
~ % ab -n 100 'http://127.0.0.1:8001/global-power-plants/global-power-plants?_facet=primary_fuel&_facet=other_fuel1&_facet=other_fuel3&_facet=other_fuel2'
This is ApacheBench, Version 2.3 <$Revision: 1879490 $>
Copyright 1996 Adam Twiss, Zeus Technology Ltd, http://www.zeustech.net/
Licensed to The Apache Software Foundation, http://www.apache.org/
Benchmarking 127.0.0.1 (be patient).....done
Server Software: uvicorn
Server Hostname: 127.0.0.1
Server Port: 8001
Document Path: /global-power-plants/global-power-plants?_facet=primary_fuel&_facet=other_fuel1&_facet=other_fuel3&_facet=other_fuel2
Document Length: 314187 bytes
Concurrency Level: 1
Time taken for tests: 68.279 seconds
Complete requests: 100
Failed requests: 13
(Connect: 0, Receive: 0, Length: 13, Exceptions: 0)
Total transferred: 31454937 bytes
HTML transferred: 31418437 bytes
Requests per second: 1.46 [#/sec] (mean)
Time per request: 682.787 [ms] (mean)
Time per request: 682.787 [ms] (mean, across all concurrent requests)
Transfer rate: 449.89 [Kbytes/sec] received
Connection Times (ms)
min mean[+/-sd] median max
Connect: 0 0 0.0 0 0
Processing: 621 683 68.0 658 993
Waiting: 620 682 68.0 657 992
Total: 621 683 68.0 658 993
Percentage of the requests served within a certain time (ms)
50% 658
66% 678
75% 687
80% 711
90% 763
95% 879
98% 926
99% 993
100% 993 (longest request)
----
In parallel:
~ % ab -n 100 'http://127.0.0.1:8001/global-power-plants/global-power-plants?_facet=primary_fuel&_facet=other_fuel1&_facet=other_fuel3&_facet=other_fuel2&_parallel=1'
This is ApacheBench, Version 2.3 <$Revision: 1879490 $>
Copyright 1996 Adam Twiss, Zeus Technology Ltd, http://www.zeustech.net/
Licensed to The Apache Software Foundation, http://www.apache.org/
Benchmarking 127.0.0.1 (be patient).....done
Server Software: uvicorn
Server Hostname: 127.0.0.1
Server Port: 8001
Document Path: /global-power-plants/global-power-plants?_facet=primary_fuel&_facet=other_fuel1&_facet=other_fuel3&_facet=other_fuel2&_parallel=1
Document Length: 315703 bytes
Concurrency Level: 1
Time taken for tests: 34.763 seconds
Complete requests: 100
Failed requests: 11
(Connect: 0, Receive: 0, Length: 11, Exceptions: 0)
Total transferred: 31607988 bytes
HTML transferred: 31570288 bytes
Requests per second: 2.88 [#/sec] (mean)
Time per request: 347.632 [ms] (mean)
Time per request: 347.632 [ms] (mean, across all concurrent requests)
Transfer rate: 887.93 [Kbytes/sec] received
Connection Times (ms)
min mean[+/-sd] median max
Connect: 0 0 0.0 0 0
Processing: 311 347 28.0 338 450
Waiting: 311 347 28.0 338 450
Total: 312 348 28.0 338 451
Percentage of the requests served within a certain time (ms)
50% 338
66% 348
75% 361
80% 367
90% 396
95% 408
98% 436
99% 451
100% 451 (longest request)
----
With concurrency 10, not parallel:
~ % ab -c 10 -n 100 'http://127.0.0.1:8001/global-power-plants/global-power-plants?_facet=primary_fuel&_facet=other_fuel1&_facet=other_fuel3&_facet=other_fuel2&_parallel='
This is ApacheBench, Version 2.3 <$Revision: 1879490 $>
Copyright 1996 Adam Twiss, Zeus Technology Ltd, http://www.zeustech.net/
Licensed to The Apache Software Foundation, http://www.apache.org/
Benchmarking 127.0.0.1 (be patient).....done
Server Software: uvicorn
Server Hostname: 127.0.0.1
Server Port: 8001
Document Path: /global-power-plants/global-power-plants?_facet=primary_fuel&_facet=other_fuel1&_facet=other_fuel3&_facet=other_fuel2&_parallel=
Document Length: 314346 bytes
Concurrency Level: 10
Time taken for tests: 38.408 seconds
Complete requests: 100
Failed requests: 93
(Connect: 0, Receive: 0, Length: 93, Exceptions: 0)
Total transferred: 31471333 bytes
HTML transferred: 31433733 bytes
Requests per second: 2.60 [#/sec] (mean)
Time per request: 3840.829 [ms] (mean)
Time per request: 384.083 [ms] (mean, across all concurrent requests)
Transfer rate: 800.18 [Kbytes/sec] received
Connection Times (ms)
min mean[+/-sd] median max
Connect: 0 0 0.1 0 1
Processing: 685 3719 354.0 3774 4096
Waiting: 684 3707 353.7 3750 4095
Total: 685 3719 354.0 3774 4096
Percentage of the requests served within a certain time (ms)
50% 3774
66% 3832
75% 3855
80% 3878
90% 3944
95% 4006
98% 4057
99% 4096
100% 4096 (longest request)
----
Concurrency 10 parallel:
~ % ab -c 10 -n 100 'http://127.0.0.1:8001/global-power-plants/global-power-plants?_facet=primary_fuel&_facet=other_fuel1&_facet=other_fuel3&_facet=other_fuel2&_parallel=1'
This is ApacheBench, Version 2.3 <$Revision: 1879490 $>
Copyright 1996 Adam Twiss, Zeus Technology Ltd, http://www.zeustech.net/
Licensed to The Apache Software Foundation, http://www.apache.org/
Benchmarking 127.0.0.1 (be patient).....done
Server Software: uvicorn
Server Hostname: 127.0.0.1
Server Port: 8001
Document Path: /global-power-plants/global-power-plants?_facet=primary_fuel&_facet=other_fuel1&_facet=other_fuel3&_facet=other_fuel2&_parallel=1
Document Length: 315703 bytes
Concurrency Level: 10
Time taken for tests: 36.762 seconds
Complete requests: 100
Failed requests: 89
(Connect: 0, Receive: 0, Length: 89, Exceptions: 0)
Total transferred: 31606516 bytes
HTML transferred: 31568816 bytes
Requests per second: 2.72 [#/sec] (mean)
Time per request: 3676.182 [ms] (mean)
Time per request: 367.618 [ms] (mean, across all concurrent requests)
Transfer rate: 839.61 [Kbytes/sec] received
Connection Times (ms)
min mean[+/-sd] median max
Connect: 0 0 0.1 0 0
Processing: 381 3602 419.6 3609 4458
Waiting: 381 3586 418.7 3607 4457
Total: 381 3603 419.6 3609 4458
Percentage of the requests served within a certain time (ms)
50% 3609
66% 3741
75% 3791
80% 3821
90% 3972
95% 4074
98% 4386
99% 4458
100% 4458 (longest request)
Trying -c 3 instead. Non parallel:
~ % ab -c 3 -n 100 'http://127.0.0.1:8001/global-power-plants/global-power-plants?_facet=primary_fuel&_facet=other_fuel1&_facet=other_fuel3&_facet=other_fuel2&_parallel='
This is ApacheBench, Version 2.3 <$Revision: 1879490 $>
Copyright 1996 Adam Twiss, Zeus Technology Ltd, http://www.zeustech.net/
Licensed to The Apache Software Foundation, http://www.apache.org/
Benchmarking 127.0.0.1 (be patient).....done
Server Software: uvicorn
Server Hostname: 127.0.0.1
Server Port: 8001
Document Path: /global-power-plants/global-power-plants?_facet=primary_fuel&_facet=other_fuel1&_facet=other_fuel3&_facet=other_fuel2&_parallel=
Document Length: 314346 bytes
Concurrency Level: 3
Time taken for tests: 39.365 seconds
Complete requests: 100
Failed requests: 83
(Connect: 0, Receive: 0, Length: 83, Exceptions: 0)
Total transferred: 31470808 bytes
HTML transferred: 31433208 bytes
Requests per second: 2.54 [#/sec] (mean)
Time per request: 1180.955 [ms] (mean)
Time per request: 393.652 [ms] (mean, across all concurrent requests)
Transfer rate: 780.72 [Kbytes/sec] received
Connection Times (ms)
min mean[+/-sd] median max
Connect: 0 0 0.0 0 0
Processing: 731 1153 126.2 1189 1359
Waiting: 730 1151 125.9 1188 1358
Total: 731 1153 126.2 1189 1359
Percentage of the requests served within a certain time (ms)
50% 1189
66% 1221
75% 1234
80% 1247
90% 1296
95% 1309
98% 1343
99% 1359
100% 1359 (longest request)
----
Parallel:
~ % ab -c 3 -n 100 'http://127.0.0.1:8001/global-power-plants/global-power-plants?_facet=primary_fuel&_facet=other_fuel1&_facet=other_fuel3&_facet=other_fuel2&_parallel=1'
This is ApacheBench, Version 2.3 <$Revision: 1879490 $>
Copyright 1996 Adam Twiss, Zeus Technology Ltd, http://www.zeustech.net/
Licensed to The Apache Software Foundation, http://www.apache.org/
Benchmarking 127.0.0.1 (be patient).....done
Server Software: uvicorn
Server Hostname: 127.0.0.1
Server Port: 8001
Document Path: /global-power-plants/global-power-plants?_facet=primary_fuel&_facet=other_fuel1&_facet=other_fuel3&_facet=other_fuel2&_parallel=1
Document Length: 315703 bytes
Concurrency Level: 3
Time taken for tests: 34.530 seconds
Complete requests: 100
Failed requests: 18
(Connect: 0, Receive: 0, Length: 18, Exceptions: 0)
Total transferred: 31606179 bytes
HTML transferred: 31568479 bytes
Requests per second: 2.90 [#/sec] (mean)
Time per request: 1035.902 [ms] (mean)
Time per request: 345.301 [ms] (mean, across all concurrent requests)
Transfer rate: 893.87 [Kbytes/sec] received
Connection Times (ms)
min mean[+/-sd] median max
Connect: 0 0 0.0 0 0
Processing: 412 1020 104.4 1018 1280
Waiting: 411 1018 104.1 1014 1275
Total: 412 1021 104.4 1018 1280
Percentage of the requests served within a certain time (ms)
50% 1018
66% 1041
75% 1061
80% 1079
90% 1136
95% 1176
98% 1251
99% 1280
100% 1280 (longest request)
```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1216508080,
https://github.com/simonw/datasette/issues/1723#issuecomment-1110278182,https://api.github.com/repos/simonw/datasette/issues/1723,1110278182,IC_kwDOBm6k_c5CLYAm,9599,2022-04-26T21:43:34Z,2022-04-26T21:43:34Z,OWNER,"Here's the diff I'm using:
```diff
diff --git a/datasette/views/table.py b/datasette/views/table.py
index d66adb8..f15ef1e 100644
--- a/datasette/views/table.py
+++ b/datasette/views/table.py
@@ -1,3 +1,4 @@
+import asyncio
 import itertools
 import json
@@ -5,6 +6,7 @@ import markupsafe
 from datasette.plugins import pm
 from datasette.database import QueryInterrupted
+from datasette import tracer
 from datasette.utils import (
     await_me_maybe,
     CustomRow,
@@ -150,6 +152,16 @@ class TableView(DataView):
         default_labels=False,
         _next=None,
         _size=None,
+    ):
+        with tracer.trace_child_tasks():
+            return await self._data_traced(request, default_labels, _next, _size)
+
+    async def _data_traced(
+        self,
+        request,
+        default_labels=False,
+        _next=None,
+        _size=None,
     ):
         database_route = tilde_decode(request.url_vars[""database""])
         table_name = tilde_decode(request.url_vars[""table""])
@@ -159,6 +171,20 @@ class TableView(DataView):
             raise NotFound(""Database not found: {}"".format(database_route))
         database_name = db.name
+        # For performance profiling purposes, ?_parallel=1 turns on asyncio.gather
+        async def _gather_parallel(*args):
+            return await asyncio.gather(*args)
+
+        async def _gather_sequential(*args):
+            results = []
+            for fn in args:
+                results.append(await fn)
+            return results
+
+        gather = (
+            _gather_parallel if request.args.get(""_parallel"") else _gather_sequential
+        )
+
         # If this is a canned query, not a table, then dispatch to QueryView instead
         canned_query = await self.ds.get_canned_query(
             database_name, table_name, request.actor
@@ -174,8 +200,12 @@ class TableView(DataView):
             write=bool(canned_query.get(""write"")),
         )
-        is_view = bool(await db.get_view_definition(table_name))
-        table_exists = bool(await db.table_exists(table_name))
+        is_view, table_exists = map(
+            bool,
+            await gather(
+                db.get_view_definition(table_name), db.table_exists(table_name)
+            ),
+        )
         # If table or view not found, return 404
         if not is_view and not table_exists:
@@ -497,33 +527,44 @@ class TableView(DataView):
             )
         )
-        if not nofacet:
-            for facet in facet_instances:
-                (
+        async def execute_facets():
+            if not nofacet:
+                # Run them in parallel
+                facet_awaitables = [facet.facet_results() for facet in facet_instances]
+                facet_awaitable_results = await gather(*facet_awaitables)
+                for (
                     instance_facet_results,
                     instance_facets_timed_out,
-                ) = await facet.facet_results()
-                for facet_info in instance_facet_results:
-                    base_key = facet_info[""name""]
-                    key = base_key
-                    i = 1
-                    while key in facet_results:
-                        i += 1
-                        key = f""{base_key}_{i}""
-                    facet_results[key] = facet_info
-                facets_timed_out.extend(instance_facets_timed_out)
-
-        # Calculate suggested facets
+                ) in facet_awaitable_results:
+                    for facet_info in instance_facet_results:
+                        base_key = facet_info[""name""]
+                        key = base_key
+                        i = 1
+                        while key in facet_results:
+                            i += 1
+                            key = f""{base_key}_{i}""
+                        facet_results[key] = facet_info
+                    facets_timed_out.extend(instance_facets_timed_out)
+
         suggested_facets = []
-        if (
-            self.ds.setting(""suggest_facets"")
-            and self.ds.setting(""allow_facet"")
-            and not _next
-            and not nofacet
-            and not nosuggest
-        ):
-            for facet in facet_instances:
-                suggested_facets.extend(await facet.suggest())
+
+        async def execute_suggested_facets():
+            # Calculate suggested facets
+            if (
+                self.ds.setting(""suggest_facets"")
+                and self.ds.setting(""allow_facet"")
+                and not _next
+                and not nofacet
+                and not nosuggest
+            ):
+                # Run them in parallel
+                facet_suggest_awaitables = [
+                    facet.suggest() for facet in facet_instances
+                ]
+                for suggest_result in await gather(*facet_suggest_awaitables):
+                    suggested_facets.extend(suggest_result)
+
+        await gather(execute_facets(), execute_suggested_facets())
         # Figure out columns and rows for the query
         columns = [r[0] for r in results.description]
```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1216508080,
https://github.com/simonw/datasette/issues/1715#issuecomment-1110265087,https://api.github.com/repos/simonw/datasette/issues/1715,1110265087,IC_kwDOBm6k_c5CLUz_,9599,2022-04-26T21:26:17Z,2022-04-26T21:26:17Z,OWNER,"Running facets and facet suggestions in parallel using `asyncio.gather()` turns out to be a lot less hassle than I had thought - maybe I don't need `asyncinject` for this at all?
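The speedup is easy to demonstrate in isolation - a standalone toy (not Datasette code) where each fake facet just sleeps:

```python
import asyncio
import time


async def fake_facet(delay):
    # Stand-in for facet.facet_results(): pretend the SQL takes `delay` seconds
    await asyncio.sleep(delay)
    return delay


async def compare():
    delays = [0.05, 0.05, 0.05]
    start = time.monotonic()
    for d in delays:  # one at a time
        await fake_facet(d)
    sequential = time.monotonic() - start
    start = time.monotonic()
    await asyncio.gather(*[fake_facet(d) for d in delays])  # all at once
    parallel = time.monotonic() - start
    return sequential, parallel


# sequential takes roughly sum(delays); gather takes roughly max(delays)
sequential, parallel = asyncio.run(compare())
```

This is exactly the shape of the change below: build the list of awaitables first, then `await asyncio.gather(*awaitables)` instead of awaiting them in a loop.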
```diff
         if not nofacet:
-            for facet in facet_instances:
-                (
-                    instance_facet_results,
-                    instance_facets_timed_out,
-                ) = await facet.facet_results()
+            # Run them in parallel
+            facet_awaitables = [facet.facet_results() for facet in facet_instances]
+            facet_awaitable_results = await asyncio.gather(*facet_awaitables)
+            for (
+                instance_facet_results,
+                instance_facets_timed_out,
+            ) in facet_awaitable_results:
                 for facet_info in instance_facet_results:
                     base_key = facet_info[""name""]
                     key = base_key
@@ -522,8 +540,10 @@ class TableView(DataView):
             and not nofacet
             and not nosuggest
         ):
-            for facet in facet_instances:
-                suggested_facets.extend(await facet.suggest())
+            # Run them in parallel
+            facet_suggest_awaitables = [facet.suggest() for facet in facet_instances]
+            for suggest_result in await asyncio.gather(*facet_suggest_awaitables):
+                suggested_facets.extend(suggest_result)
```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1212823665,
https://github.com/simonw/datasette/issues/1715#issuecomment-1110246593,https://api.github.com/repos/simonw/datasette/issues/1715,1110246593,IC_kwDOBm6k_c5CLQTB,9599,2022-04-26T21:03:56Z,2022-04-26T21:03:56Z,OWNER,"Well this is fun... I applied this change:
```diff
diff --git a/datasette/views/table.py b/datasette/views/table.py
index d66adb8..85f9e44 100644
--- a/datasette/views/table.py
+++ b/datasette/views/table.py
@@ -1,3 +1,4 @@
+import asyncio
 import itertools
 import json
@@ -5,6 +6,7 @@ import markupsafe
 from datasette.plugins import pm
 from datasette.database import QueryInterrupted
+from datasette import tracer
 from datasette.utils import (
     await_me_maybe,
     CustomRow,
@@ -174,8 +176,11 @@ class TableView(DataView):
             write=bool(canned_query.get(""write"")),
         )
-        is_view = bool(await db.get_view_definition(table_name))
-        table_exists = bool(await db.table_exists(table_name))
+        with tracer.trace_child_tasks():
+            is_view, table_exists = map(bool, await asyncio.gather(
+                db.get_view_definition(table_name),
+                db.table_exists(table_name)
+            ))
         # If table or view not found, return 404
         if not is_view and not table_exists:
```
And now using https://datasette.io/plugins/datasette-pretty-traces I get this:
![CleanShot 2022-04-26 at 14 03 33@2x](https://user-images.githubusercontent.com/9599/165392009-84c4399d-3e94-46d4-ba7b-a64a116cac5c.png)
","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1212823665,
https://github.com/simonw/datasette/issues/1715#issuecomment-1110219185,https://api.github.com/repos/simonw/datasette/issues/1715,1110219185,IC_kwDOBm6k_c5CLJmx,9599,2022-04-26T20:28:40Z,2022-04-26T20:56:48Z,OWNER,"The refactor I did in #1719 pretty much clashes with all of the changes in https://github.com/simonw/datasette/commit/5053f1ea83194ecb0a5693ad5dada5b25bf0f7e6 so I'll probably need to start my `api-extras` branch again from scratch.
Using a new `tableview-asyncinject` branch.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1212823665,
https://github.com/simonw/datasette/issues/1715#issuecomment-1110239536,https://api.github.com/repos/simonw/datasette/issues/1715,1110239536,IC_kwDOBm6k_c5CLOkw,9599,2022-04-26T20:54:53Z,2022-04-26T20:54:53Z,OWNER,`pytest tests/test_table_*` runs the tests quickly.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1212823665,
https://github.com/simonw/datasette/issues/1715#issuecomment-1110238896,https://api.github.com/repos/simonw/datasette/issues/1715,1110238896,IC_kwDOBm6k_c5CLOaw,9599,2022-04-26T20:53:59Z,2022-04-26T20:53:59Z,OWNER,I'm going to rename `database` to `database_name` and `table` to `table_name` to avoid confusion with the `Database` object as opposed to the string name for the database.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1212823665,
https://github.com/simonw/datasette/issues/1715#issuecomment-1110229319,https://api.github.com/repos/simonw/datasette/issues/1715,1110229319,IC_kwDOBm6k_c5CLMFH,9599,2022-04-26T20:41:32Z,2022-04-26T20:44:38Z,OWNER,"This time I'm not going to bother with the `filter_args` thing - I'm going to just try to use `asyncinject` to execute some big high level things in parallel - facets, suggested facets, counts, the query - and then combine it with the `extras` mechanism I'm trying to introduce too.
Most importantly: I want that `extra_template()` function that adds more template context for the HTML to be executed as part of an `asyncinject` flow!","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1212823665,
https://github.com/simonw/datasette/issues/1720#issuecomment-1110212021,https://api.github.com/repos/simonw/datasette/issues/1720,1110212021,IC_kwDOBm6k_c5CLH21,9599,2022-04-26T20:20:27Z,2022-04-26T20:20:27Z,OWNER,Closing this because I have a good enough idea of the design for now - the details of the parameters can be figured out when I implement this.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1215174094,
https://github.com/simonw/datasette/issues/1720#issuecomment-1109309683,https://api.github.com/repos/simonw/datasette/issues/1720,1109309683,IC_kwDOBm6k_c5CHrjz,9599,2022-04-26T04:12:39Z,2022-04-26T04:12:39Z,OWNER,"I think the rough shape of the three plugin hooks is right. The detailed decisions that are needed concern what the parameters should be, which I think will mainly happen as part of:
- #1715","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1215174094,
https://github.com/simonw/datasette/issues/1720#issuecomment-1109306070,https://api.github.com/repos/simonw/datasette/issues/1720,1109306070,IC_kwDOBm6k_c5CHqrW,9599,2022-04-26T04:05:20Z,2022-04-26T04:05:20Z,OWNER,"The proposed plugin for annotations - allowing users to attach comments to database tables, columns and rows - would be a great application for all three of those `?_extra=` plugin hooks.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1215174094,
https://github.com/simonw/datasette/issues/1720#issuecomment-1109305184,https://api.github.com/repos/simonw/datasette/issues/1720,1109305184,IC_kwDOBm6k_c5CHqdg,9599,2022-04-26T04:03:35Z,2022-04-26T04:03:35Z,OWNER,I bet there's all kinds of interesting potential extras that could be calculated by loading the results of the query into a Pandas DataFrame.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1215174094,
https://github.com/simonw/datasette/issues/1720#issuecomment-1109200774,https://api.github.com/repos/simonw/datasette/issues/1720,1109200774,IC_kwDOBm6k_c5CHQ-G,9599,2022-04-26T01:25:43Z,2022-04-26T01:26:15Z,OWNER,"Had a thought: if a custom HTML template is going to make use of stuff generated using these extras, it will need a way to tell Datasette to execute those extras even in the absence of the `?_extra=...` URL parameters.
Is that necessary? Or should those kinds of plugins use the existing `extra_template_vars` hook instead?
Or maybe the `extra_template_vars` hook gets redesigned so it can depend on other `extras` in some way?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1215174094,
https://github.com/simonw/datasette/issues/1720#issuecomment-1109200335,https://api.github.com/repos/simonw/datasette/issues/1720,1109200335,IC_kwDOBm6k_c5CHQ3P,9599,2022-04-26T01:24:47Z,2022-04-26T01:24:47Z,OWNER,"Sketching out a `?_extra=statistics` table plugin:
```python
from datasette import hookimpl


@hookimpl
def register_table_extras(datasette):
    return [statistics]


async def statistics(datasette, query, columns, sql):
    # ... need to figure out which columns are integer/floats
    # then build and execute a SQL query that calculates sum/avg/etc for each column
```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1215174094,
https://github.com/simonw/sqlite-utils/issues/428#issuecomment-1109190401,https://api.github.com/repos/simonw/sqlite-utils/issues/428,1109190401,IC_kwDOCGYnMM5CHOcB,9599,2022-04-26T01:05:29Z,2022-04-26T01:05:29Z,OWNER,Django makes extensive use of savepoints for nested transactions: https://docs.djangoproject.com/en/4.0/topics/db/transactions/#savepoints,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1215216249,
https://github.com/simonw/datasette/issues/1720#issuecomment-1109174715,https://api.github.com/repos/simonw/datasette/issues/1720,1109174715,IC_kwDOBm6k_c5CHKm7,9599,2022-04-26T00:40:13Z,2022-04-26T00:43:33Z,OWNER,"Some of the things I'd like to use `?_extra=` for, which may or may not make sense as plugins:
- Performance breakdown information, maybe including explain output for a query/table
- Information about the tables that were consulted in a query - imagine pulling in additional table metadata
- Statistical aggregates against the full set of results. This may well be a Datasette core feature at some point in the future, but being able to provide it early as a plugin would be really cool.
- For tables, what are the other tables they can join against?
- Suggested facets
- Facet results themselves
- New custom facets I haven't thought of - though the `register_facet_classes` hook covers that already
- Table schema
- Table metadata
- Analytics - how many times has this table been queried? Would be a plugin thing
- For geospatial data, how about a GeoJSON polygon that represents the bounding box for all returned results? Effectively this is an extra aggregation.
Looking at https://github-to-sqlite.dogsheep.net/github/commits.json?_labels=on&_shape=objects for inspiration.
I think there's a separate potential mechanism in the future that lets you add custom columns to a table. This would affect `.csv` and the HTML presentation too, which makes it a different concept from the `?_extra=` hook that affects the JSON export (and the context that is fed to the HTML templates).","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1215174094,
https://github.com/simonw/datasette/issues/1720#issuecomment-1109171871,https://api.github.com/repos/simonw/datasette/issues/1720,1109171871,IC_kwDOBm6k_c5CHJ6f,9599,2022-04-26T00:34:48Z,2022-04-26T00:34:48Z,OWNER,"Let's try sketching out a `register_table_extras` plugin for something new.
The first idea I came up with suggests adding new fields to the individual row records that come back - my mental model for extras so far has been that they add new keys to the root object.
So if a table result looked like this:
```json
{
    ""rows"": [
        {""id"": 1, ""name"": ""Cleo""},
        {""id"": 2, ""name"": ""Suna""}
    ],
    ""next_url"": null
}
```
I was initially thinking that `?_extra=facets` would add a `""facets"": {...}` key to that root object.
Here's a plugin idea I came up with that would probably justify adding to the individual row objects instead:
- `?_extra=check404s` - does an async `HEAD` request against every column value that looks like a URL and checks if it returns a 404
This could also work by adding a `""check404s"": {""url-here"": 200}` key to the root object though.
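A hedged sketch of that extra - the `head()` callable is injected here so the sketch stays self-contained and testable; a real implementation might use an `httpx.AsyncClient` for the HEAD requests:

```python
import asyncio


def looks_like_url(value):
    # Very rough heuristic for URL-shaped column values
    return isinstance(value, str) and value.startswith(('http://', 'https://'))


async def check404s(rows, head):
    # Collect every URL-shaped column value, then issue the status checks
    # concurrently; returns a mapping of {url: status_code}.
    urls = sorted({v for row in rows for v in row.values() if looks_like_url(v)})
    statuses = await asyncio.gather(*(head(url) for url in urls))
    return dict(zip(urls, statuses))
```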
I think I need some better plugin concepts before committing to this new hook. There's overlap between this and how I want the enrichments mechanism ([see here](https://simonwillison.net/2021/Jan/17/weeknotes-still-pretty-distracted/)) to work.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1215174094,
https://github.com/simonw/datasette/issues/1720#issuecomment-1109165411,https://api.github.com/repos/simonw/datasette/issues/1720,1109165411,IC_kwDOBm6k_c5CHIVj,9599,2022-04-26T00:22:42Z,2022-04-26T00:22:42Z,OWNER,Passing `pk_values` to the plugin hook feels odd. I think I'd pass a `row` object instead and let the code look up the primary key values on that row (by introspecting the primary keys for the table).,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1215174094,
https://github.com/simonw/datasette/issues/1720#issuecomment-1109164803,https://api.github.com/repos/simonw/datasette/issues/1720,1109164803,IC_kwDOBm6k_c5CHIMD,9599,2022-04-26T00:21:40Z,2022-04-26T00:21:40Z,OWNER,"What would the existing https://latest.datasette.io/fixtures/simple_primary_key/1.json?_extras=foreign_key_tables feature look like if it was re-imagined as a `register_row_extras()` plugin?
Rough sketch, copying most of the code from https://github.com/simonw/datasette/blob/579f59dcec43a91dd7d404e00b87a00afd8515f2/datasette/views/row.py#L98
```python
from datasette import hookimpl
from datasette.database import QueryInterrupted
from datasette.utils import escape_sqlite


@hookimpl
def register_row_extras(datasette):
    return [foreign_key_tables]


async def foreign_key_tables(datasette, database, table, pk_values):
    if len(pk_values) != 1:
        return []
    db = datasette.get_database(database)
    all_foreign_keys = await db.get_all_foreign_keys()
    foreign_keys = all_foreign_keys[table][""incoming""]
    if len(foreign_keys) == 0:
        return []
    sql = ""select "" + "", "".join(
        [
            ""(select count(*) from {table} where {column}=:id)"".format(
                table=escape_sqlite(fk[""other_table""]),
                column=escape_sqlite(fk[""other_column""]),
            )
            for fk in foreign_keys
        ]
    )
    try:
        rows = list(await db.execute(sql, {""id"": pk_values[0]}))
    except QueryInterrupted:
        # Almost certainly hit the timeout
        return []
    foreign_table_counts = dict(
        zip(
            [(fk[""other_table""], fk[""other_column""]) for fk in foreign_keys],
            list(rows[0]),
        )
    )
    foreign_key_tables = []
    for fk in foreign_keys:
        count = (
            foreign_table_counts.get((fk[""other_table""], fk[""other_column""])) or 0
        )
        key = fk[""other_column""]
        if key.startswith(""_""):
            key += ""__exact""
        link = ""{}?{}={}"".format(
            datasette.urls.table(database, fk[""other_table""]),
            key,
            "","".join(pk_values),
        )
        foreign_key_tables.append({**fk, **{""count"": count, ""link"": link}})
    return foreign_key_tables
```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1215174094,
https://github.com/simonw/datasette/issues/1720#issuecomment-1109162123,https://api.github.com/repos/simonw/datasette/issues/1720,1109162123,IC_kwDOBm6k_c5CHHiL,9599,2022-04-26T00:16:42Z,2022-04-26T00:16:51Z,OWNER,"Actually I'm going to imitate the existing `register_*` hooks:
- `def register_output_renderer(datasette)`
- `def register_facet_classes()`
- `def register_routes(datasette)`
- `def register_commands(cli)`
- `def register_magic_parameters(datasette)`
So I'm going to call the new hooks:
- `register_table_extras(datasette)`
- `register_row_extras(datasette)`
- `register_query_extras(datasette)`
They'll return a list of `async def` functions. The names of those functions will become the names of the extras.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1215174094,
https://github.com/simonw/datasette/issues/1720#issuecomment-1109160226,https://api.github.com/repos/simonw/datasette/issues/1720,1109160226,IC_kwDOBm6k_c5CHHEi,9599,2022-04-26T00:14:11Z,2022-04-26T00:14:11Z,OWNER,"There are four existing plugin hooks that include the word ""extra"" but use it to mean something else - to mean additional CSS/JS/variables to be injected into the page:
- `def extra_css_urls(...)`
- `def extra_js_urls(...)`
- `def extra_body_script(...)`
- `def extra_template_vars(...)`
I think `extra_*` and `*_extras` are different enough that they won't be confused with each other.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1215174094,
https://github.com/simonw/datasette/issues/1720#issuecomment-1109159307,https://api.github.com/repos/simonw/datasette/issues/1720,1109159307,IC_kwDOBm6k_c5CHG2L,9599,2022-04-26T00:12:28Z,2022-04-26T00:12:28Z,OWNER,"I'm going to keep table and row separate. So I think I need to add three new plugin hooks:
- `table_extras()`
- `row_extras()`
- `query_extras()`","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1215174094,
https://github.com/simonw/datasette/issues/1720#issuecomment-1109158903,https://api.github.com/repos/simonw/datasette/issues/1720,1109158903,IC_kwDOBm6k_c5CHGv3,9599,2022-04-26T00:11:42Z,2022-04-26T00:11:42Z,OWNER,"Places this plugin hook (or hooks?) should be able to affect:
- JSON for a table/view
- JSON for a row
- JSON for a canned query
- JSON for a custom arbitrary query
I'm going to combine those last two, which means there are three places. But maybe I can combine the table one and the row one as well?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1215174094,