{"html_url": "https://github.com/simonw/datasette/issues/1555#issuecomment-997459958", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1555", "id": 997459958, "node_id": "IC_kwDOBm6k_c47dAf2", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-12-19T20:55:59Z", "updated_at": "2021-12-19T20:55:59Z", "author_association": "OWNER", "body": "Closing this issue because I've optimized this a whole bunch, and it's definitely good enough for the moment.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1079149656, "label": "Optimize all those calls to index_list and foreign_key_list"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1555#issuecomment-997325189", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1555", "id": 997325189, "node_id": "IC_kwDOBm6k_c47cfmF", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-12-19T03:55:01Z", "updated_at": "2021-12-19T20:54:51Z", "author_association": "OWNER", "body": "It's a bit annoying that the queries no longer show up in the trace at all now, thanks to running in `.execute_fn()`. 
I wonder if there's something smart I can do about that - maybe have `trace()` record that function with a traceback even though it doesn't have the executed SQL string?\r\n\r\n5fac26aa221a111d7633f2dd92014641f7c0ade9 has the same problem.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1079149656, "label": "Optimize all those calls to index_list and foreign_key_list"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1555#issuecomment-997459637", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1555", "id": 997459637, "node_id": "IC_kwDOBm6k_c47dAa1", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-12-19T20:53:46Z", "updated_at": "2021-12-19T20:53:46Z", "author_association": "OWNER", "body": "Using #1571 showed me that the `DELETE FROM columns/foreign_keys/indexes WHERE database_name = ? and table_name = ?` queries were running way more times than I expected. 
I came up with a new optimization that just does `DELETE FROM columns/foreign_keys/indexes WHERE database_name = ?` instead.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1079149656, "label": "Optimize all those calls to index_list and foreign_key_list"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1555#issuecomment-997342494", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1555", "id": 997342494, "node_id": "IC_kwDOBm6k_c47cj0e", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-12-19T07:22:04Z", "updated_at": "2021-12-19T07:22:04Z", "author_association": "OWNER", "body": "Another option would be to provide an abstraction that makes it easier to run a group of SQL queries in the same thread at the same time, and have them traced correctly.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1079149656, "label": "Optimize all those calls to index_list and foreign_key_list"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1555#issuecomment-997324666", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1555", "id": 997324666, "node_id": "IC_kwDOBm6k_c47cfd6", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-12-19T03:47:51Z", "updated_at": "2021-12-19T03:48:09Z", "author_association": "OWNER", "body": "Here's a hacked together prototype of running all of that stuff inside a single function passed to `.execute_fn()`:\r\n\r\n```diff\r\ndiff --git a/datasette/utils/internal_db.py b/datasette/utils/internal_db.py\r\nindex 95055d8..58f9982 100644\r\n--- a/datasette/utils/internal_db.py\r\n+++ b/datasette/utils/internal_db.py\r\n@@ -1,4 +1,5 @@\r\n import textwrap\r\n+from 
datasette.utils import table_column_details\r\n \r\n \r\n async def init_internal_db(db):\r\n@@ -70,49 +71,70 @@ async def populate_schema_tables(internal_db, db):\r\n \"DELETE FROM tables WHERE database_name = ?\", [database_name], block=True\r\n )\r\n tables = (await db.execute(\"select * from sqlite_master WHERE type = 'table'\")).rows\r\n- tables_to_insert = []\r\n- columns_to_delete = []\r\n- columns_to_insert = []\r\n- foreign_keys_to_delete = []\r\n- foreign_keys_to_insert = []\r\n- indexes_to_delete = []\r\n- indexes_to_insert = []\r\n \r\n- for table in tables:\r\n- table_name = table[\"name\"]\r\n- tables_to_insert.append(\r\n- (database_name, table_name, table[\"rootpage\"], table[\"sql\"])\r\n- )\r\n- columns_to_delete.append((database_name, table_name))\r\n- columns = await db.table_column_details(table_name)\r\n- columns_to_insert.extend(\r\n- {\r\n- **{\"database_name\": database_name, \"table_name\": table_name},\r\n- **column._asdict(),\r\n- }\r\n- for column in columns\r\n- )\r\n- foreign_keys_to_delete.append((database_name, table_name))\r\n- foreign_keys = (\r\n- await db.execute(f\"PRAGMA foreign_key_list([{table_name}])\")\r\n- ).rows\r\n- foreign_keys_to_insert.extend(\r\n- {\r\n- **{\"database_name\": database_name, \"table_name\": table_name},\r\n- **dict(foreign_key),\r\n- }\r\n- for foreign_key in foreign_keys\r\n- )\r\n- indexes_to_delete.append((database_name, table_name))\r\n- indexes = (await db.execute(f\"PRAGMA index_list([{table_name}])\")).rows\r\n- indexes_to_insert.extend(\r\n- {\r\n- **{\"database_name\": database_name, \"table_name\": table_name},\r\n- **dict(index),\r\n- }\r\n- for index in indexes\r\n+ def collect_info(conn):\r\n+ tables_to_insert = []\r\n+ columns_to_delete = []\r\n+ columns_to_insert = []\r\n+ foreign_keys_to_delete = []\r\n+ foreign_keys_to_insert = []\r\n+ indexes_to_delete = []\r\n+ indexes_to_insert = []\r\n+\r\n+ for table in tables:\r\n+ table_name = table[\"name\"]\r\n+ tables_to_insert.append(\r\n+ 
(database_name, table_name, table[\"rootpage\"], table[\"sql\"])\r\n+ )\r\n+ columns_to_delete.append((database_name, table_name))\r\n+ columns = table_column_details(conn, table_name)\r\n+ columns_to_insert.extend(\r\n+ {\r\n+ **{\"database_name\": database_name, \"table_name\": table_name},\r\n+ **column._asdict(),\r\n+ }\r\n+ for column in columns\r\n+ )\r\n+ foreign_keys_to_delete.append((database_name, table_name))\r\n+ foreign_keys = conn.execute(\r\n+ f\"PRAGMA foreign_key_list([{table_name}])\"\r\n+ ).fetchall()\r\n+ foreign_keys_to_insert.extend(\r\n+ {\r\n+ **{\"database_name\": database_name, \"table_name\": table_name},\r\n+ **dict(foreign_key),\r\n+ }\r\n+ for foreign_key in foreign_keys\r\n+ )\r\n+ indexes_to_delete.append((database_name, table_name))\r\n+ indexes = conn.execute(f\"PRAGMA index_list([{table_name}])\").fetchall()\r\n+ indexes_to_insert.extend(\r\n+ {\r\n+ **{\"database_name\": database_name, \"table_name\": table_name},\r\n+ **dict(index),\r\n+ }\r\n+ for index in indexes\r\n+ )\r\n+ return (\r\n+ tables_to_insert,\r\n+ columns_to_delete,\r\n+ columns_to_insert,\r\n+ foreign_keys_to_delete,\r\n+ foreign_keys_to_insert,\r\n+ indexes_to_delete,\r\n+ indexes_to_insert,\r\n )\r\n \r\n+ (\r\n+ tables_to_insert,\r\n+ columns_to_delete,\r\n+ columns_to_insert,\r\n+ foreign_keys_to_delete,\r\n+ foreign_keys_to_insert,\r\n+ indexes_to_delete,\r\n+ indexes_to_insert,\r\n+ ) = await db.execute_fn(collect_info)\r\n+\r\n await internal_db.execute_write_many(\r\n \"\"\"\r\n INSERT INTO tables (database_name, table_name, rootpage, sql)\r\n```\r\nFirst impressions: it looks like this helps **a lot** - as far as I can tell this is now taking around 21ms to get to the point at which all of those internal databases have been populated, where previously it took more than 180ms.\r\n\r\n![CleanShot 2021-12-18 at 19 47 22@2x](https://user-images.githubusercontent.com/9599/146663192-bba098d5-e7bd-4e2e-b525-2270867888a0.png)\r\n\r\n", "reactions": 
"{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1079149656, "label": "Optimize all those calls to index_list and foreign_key_list"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1555#issuecomment-997324156", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1555", "id": 997324156, "node_id": "IC_kwDOBm6k_c47cfV8", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-12-19T03:40:05Z", "updated_at": "2021-12-19T03:40:05Z", "author_association": "OWNER", "body": "Using the prototype of this:\r\n- https://github.com/simonw/datasette-pretty-traces/issues/5\r\n\r\nI'm seeing about 180ms spent running all of these queries on startup!\r\n\r\n![CleanShot 2021-12-18 at 19 38 37@2x](https://user-images.githubusercontent.com/9599/146663045-46bda669-90de-474f-8870-345182725dc1.png)\r\n\r\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1079149656, "label": "Optimize all those calls to index_list and foreign_key_list"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1555#issuecomment-997321767", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1555", "id": 997321767, "node_id": "IC_kwDOBm6k_c47cewn", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-12-19T03:10:58Z", "updated_at": "2021-12-19T03:10:58Z", "author_association": "OWNER", "body": "I wonder how much overhead there is switching between the `async` event loop main code and the thread that runs the SQL queries.\r\n\r\nWould there be a performance boost if I gathered all of the column/index information in a single function run on the thread using `db.execute_fn()` I wonder? 
It would eliminate a bunch of switching between threads.\r\n\r\nWould be great to understand how much of an impact that would have.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1079149656, "label": "Optimize all those calls to index_list and foreign_key_list"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1555#issuecomment-997321653", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1555", "id": 997321653, "node_id": "IC_kwDOBm6k_c47ceu1", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-12-19T03:09:43Z", "updated_at": "2021-12-19T03:09:43Z", "author_association": "OWNER", "body": "On that same documentation page I just spotted this:\r\n\r\n> This feature is experimental and is subject to change. Further documentation will become available if and when the table-valued functions for PRAGMAs feature becomes officially supported. 
\r\n\r\nThis makes me nervous to rely on pragma function optimizations in Datasette itself.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1079149656, "label": "Optimize all those calls to index_list and foreign_key_list"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1555#issuecomment-997321477", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1555", "id": 997321477, "node_id": "IC_kwDOBm6k_c47cesF", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-12-19T03:07:33Z", "updated_at": "2021-12-19T03:07:33Z", "author_association": "OWNER", "body": "If I want to continue supporting SQLite prior to 3.16.0 (2017-01-02) I'll need this optimization to only kick in with versions that support table-valued PRAGMA functions, while keeping the old `PRAGMA foreign_key_list(table)` stuff working for those older versions.\r\n\r\nThat's feasible, but it's a bit more work - and I need to make sure I have robust testing in place for SQLite 3.15.0.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1079149656, "label": "Optimize all those calls to index_list and foreign_key_list"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1555#issuecomment-997321327", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1555", "id": 997321327, "node_id": "IC_kwDOBm6k_c47cepv", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-12-19T03:05:39Z", "updated_at": "2021-12-19T03:05:44Z", "author_association": "OWNER", "body": "This caught me out once before in:\r\n- https://github.com/simonw/datasette/issues/1276\r\n\r\nTurns out Glitch was running SQLite 3.11.0 from 2016-02-15.", "reactions": "{\"total_count\": 0, 
\"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1079149656, "label": "Optimize all those calls to index_list and foreign_key_list"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1555#issuecomment-997321217", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1555", "id": 997321217, "node_id": "IC_kwDOBm6k_c47ceoB", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-12-19T03:04:16Z", "updated_at": "2021-12-19T03:04:16Z", "author_association": "OWNER", "body": "One thing to watch out for though, from https://sqlite.org/pragma.html#pragfunc\r\n\r\n> The table-valued functions for PRAGMA feature was added in SQLite version 3.16.0 (2017-01-02). Prior versions of SQLite cannot use this feature. ", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1079149656, "label": "Optimize all those calls to index_list and foreign_key_list"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1555#issuecomment-997321115", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1555", "id": 997321115, "node_id": "IC_kwDOBm6k_c47cemb", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-12-19T03:03:12Z", "updated_at": "2021-12-19T03:03:12Z", "author_association": "OWNER", "body": "Table columns is a bit harder, because `table_xinfo` is only in SQLite 3.26.0 or higher: https://github.com/simonw/datasette/blob/d637ed46762fdbbd8e32b86f258cd9a53c1cfdc7/datasette/utils/__init__.py#L565-L581\r\n\r\nSo if that function is available: 
https://latest.datasette.io/fixtures?sql=SELECT%0D%0A++sqlite_master.name%2C%0D%0A++table_xinfo.*%0D%0AFROM%0D%0A++sqlite_master%2C%0D%0A++pragma_table_xinfo%28sqlite_master.name%29+AS+table_xinfo%0D%0AWHERE%0D%0A++sqlite_master.type+%3D+%27table%27\r\n\r\n```sql\r\nSELECT\r\n sqlite_master.name,\r\n table_xinfo.*\r\nFROM\r\n sqlite_master,\r\n pragma_table_xinfo(sqlite_master.name) AS table_xinfo\r\nWHERE\r\n sqlite_master.type = 'table'\r\n```\r\nAnd otherwise, using `table_info`: https://latest.datasette.io/fixtures?sql=SELECT%0D%0A++sqlite_master.name%2C%0D%0A++table_info.*%2C%0D%0A++0+as+hidden%0D%0AFROM%0D%0A++sqlite_master%2C%0D%0A++pragma_table_info%28sqlite_master.name%29+AS+table_info%0D%0AWHERE%0D%0A++sqlite_master.type+%3D+%27table%27\r\n\r\n```sql\r\nSELECT\r\n sqlite_master.name,\r\n table_info.*,\r\n 0 as hidden\r\nFROM\r\n sqlite_master,\r\n pragma_table_info(sqlite_master.name) AS table_info\r\nWHERE\r\n sqlite_master.type = 'table'\r\n```", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1079149656, "label": "Optimize all those calls to index_list and foreign_key_list"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1555#issuecomment-997320824", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1555", "id": 997320824, "node_id": "IC_kwDOBm6k_c47ceh4", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-12-19T02:59:57Z", "updated_at": "2021-12-19T03:00:44Z", "author_association": "OWNER", "body": "To list all indexes: https://latest.datasette.io/fixtures?sql=SELECT%0D%0A++sqlite_master.name%2C%0D%0A++index_list.*%0D%0AFROM%0D%0A++sqlite_master%2C%0D%0A++pragma_index_list%28sqlite_master.name%29+AS+index_list%0D%0AWHERE%0D%0A++sqlite_master.type+%3D+%27table%27\r\n\r\n```sql\r\nSELECT\r\n sqlite_master.name,\r\n index_list.*\r\nFROM\r\n sqlite_master,\r\n 
pragma_index_list(sqlite_master.name) AS index_list\r\nWHERE\r\n sqlite_master.type = 'table'\r\n```\r\n\r\nForeign keys: https://latest.datasette.io/fixtures?sql=SELECT%0D%0A++sqlite_master.name%2C%0D%0A++foreign_key_list.*%0D%0AFROM%0D%0A++sqlite_master%2C%0D%0A++pragma_foreign_key_list%28sqlite_master.name%29+AS+foreign_key_list%0D%0AWHERE%0D%0A++sqlite_master.type+%3D+%27table%27\r\n\r\n```sql\r\nSELECT\r\n sqlite_master.name,\r\n foreign_key_list.*\r\nFROM\r\n sqlite_master,\r\n pragma_foreign_key_list(sqlite_master.name) AS foreign_key_list\r\nWHERE\r\n sqlite_master.type = 'table'\r\n```", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1079149656, "label": "Optimize all those calls to index_list and foreign_key_list"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1555#issuecomment-997272223", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1555", "id": 997272223, "node_id": "IC_kwDOBm6k_c47cSqf", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-12-18T19:17:13Z", "updated_at": "2021-12-18T19:17:13Z", "author_association": "OWNER", "body": "That's a good optimization. 
Still need to deal with the huge flurry of `PRAGMA` queries though before I can consider this done.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1079149656, "label": "Optimize all those calls to index_list and foreign_key_list"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1555#issuecomment-997267416", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1555", "id": 997267416, "node_id": "IC_kwDOBm6k_c47cRfY", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-12-18T18:44:53Z", "updated_at": "2021-12-18T18:45:28Z", "author_association": "OWNER", "body": "Rather than adding an `executemany=True` parameter, I'm now thinking a better design might be to have three methods:\r\n\r\n- `db.execute_write(sql, params=None, block=False)`\r\n- `db.execute_writescript(sql, block=False)`\r\n- `db.execute_writemany(sql, params_seq, block=False)`", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1079149656, "label": "Optimize all those calls to index_list and foreign_key_list"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1555#issuecomment-997266100", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1555", "id": 997266100, "node_id": "IC_kwDOBm6k_c47cRK0", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-12-18T18:40:02Z", "updated_at": "2021-12-18T18:40:02Z", "author_association": "OWNER", "body": "The implementation of `cursor.executemany()` looks very efficient - it turns into a call to this C function with `multiple` set to `1`: https://github.com/python/cpython/blob/e002bbc6cce637171fb2b1391ffeca8643a13843/Modules/_sqlite/cursor.c#L468-L469", "reactions": "{\"total_count\": 0, \"+1\": 0, 
\"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1079149656, "label": "Optimize all those calls to index_list and foreign_key_list"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1555#issuecomment-997262475", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1555", "id": 997262475, "node_id": "IC_kwDOBm6k_c47cQSL", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-12-18T18:34:18Z", "updated_at": "2021-12-18T18:34:18Z", "author_association": "OWNER", "body": "\"image\"\r\n\r\nUsing `executescript=True` that call now takes 1.89ms to create all of those tables.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1079149656, "label": "Optimize all those calls to index_list and foreign_key_list"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1555#issuecomment-997248364", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1555", "id": 997248364, "node_id": "IC_kwDOBm6k_c47cM1s", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-12-18T18:20:10Z", "updated_at": "2021-12-18T18:20:10Z", "author_association": "OWNER", "body": "Idea: teach `execute_write` to accept an optional `executescript=True` parameter, like this:\r\n```diff\r\ndiff --git a/datasette/database.py b/datasette/database.py\r\nindex 468e936..1a424f5 100644\r\n--- a/datasette/database.py\r\n+++ b/datasette/database.py\r\n@@ -94,10 +94,14 @@ class Database:\r\n f\"file:{self.path}{qs}\", uri=True, check_same_thread=False\r\n )\r\n \r\n- async def execute_write(self, sql, params=None, block=False):\r\n+ async def execute_write(self, sql, params=None, executescript=False, block=False):\r\n+ assert not (executescript and params), \"Cannot use params with executescript=True\"\r\n def 
_inner(conn):\r\n with conn:\r\n- return conn.execute(sql, params or [])\r\n+ if executescript:\r\n+ return conn.executescript(sql)\r\n+ else:\r\n+ return conn.execute(sql, params or [])\r\n \r\n with trace(\"sql\", database=self.name, sql=sql.strip(), params=params):\r\n results = await self.execute_write_fn(_inner, block=block)\r\n```", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1079149656, "label": "Optimize all those calls to index_list and foreign_key_list"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1555#issuecomment-997245301", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1555", "id": 997245301, "node_id": "IC_kwDOBm6k_c47cMF1", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-12-18T18:17:04Z", "updated_at": "2021-12-18T18:17:04Z", "author_association": "OWNER", "body": "One downside of `conn.executescript()` is that it won't be picked up by the tracing mechanism - in fact nothing that uses `await db.execute_write_fn(fn, block=True)` or `await db.execute_fn(fn, block=True)` gets picked up by tracing.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1079149656, "label": "Optimize all those calls to index_list and foreign_key_list"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1555#issuecomment-997241969", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1555", "id": 997241969, "node_id": "IC_kwDOBm6k_c47cLRx", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-12-18T18:13:04Z", "updated_at": "2021-12-18T18:13:04Z", "author_association": "OWNER", "body": "Also: running all of those `CREATE TABLE IF NOT EXISTS` in a single call to `conn.executescript()` rather 
than as separate queries may speed things up too.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1079149656, "label": "Optimize all those calls to index_list and foreign_key_list"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1555#issuecomment-997241645", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1555", "id": 997241645, "node_id": "IC_kwDOBm6k_c47cLMt", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-12-18T18:12:26Z", "updated_at": "2021-12-18T18:12:26Z", "author_association": "OWNER", "body": "A simpler optimization would be just to turn all of those column and index reads into a single efficient UNION query against each database, then figure out the most efficient pattern to send them all as writes in one go as opposed to calling `.execute_write()` in a loop.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1079149656, "label": "Optimize all those calls to index_list and foreign_key_list"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1555#issuecomment-997235086", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1555", "id": 997235086, "node_id": "IC_kwDOBm6k_c47cJmO", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-12-18T17:30:13Z", "updated_at": "2021-12-18T17:30:13Z", "author_association": "OWNER", "body": "Now that trace sees write queries (#1568) it's clear that there is a whole lot more DB activity than I had realized:\r\n\r\n\"image\"\r\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1079149656, "label": "Optimize all those calls to 
index_list and foreign_key_list"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1555#issuecomment-997234858", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1555", "id": 997234858, "node_id": "IC_kwDOBm6k_c47cJiq", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-12-18T17:28:44Z", "updated_at": "2021-12-18T17:28:44Z", "author_association": "OWNER", "body": "Maybe it would be worth exploring attaching each DB in turn to the _internal connection in order to perform these queries faster.\r\n\r\nI'm a bit worried about leaks though: the internal database isn't meant to be visible, even temporarily attaching another DB to it could cause SQL queries against that DB to be able to access the internal data.\r\n\r\nSo maybe instead the _internal connection gets to connect to the other DBs? There's a maximum of ten there I think, which is good for most but not all cases. But the cases with the most connected databases will see the worst performance!", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1079149656, "label": "Optimize all those calls to index_list and foreign_key_list"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1555#issuecomment-997128508", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1555", "id": 997128508, "node_id": "IC_kwDOBm6k_c47bvk8", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-12-18T02:33:57Z", "updated_at": "2021-12-18T02:33:57Z", "author_association": "OWNER", "body": "Here's why - `trace` only applies to read, not write SQL operations: https://github.com/simonw/datasette/blob/7c8f8aa209e4ba7bf83976f8495d67c28fbfca24/datasette/database.py#L209-L211", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, 
\"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1079149656, "label": "Optimize all those calls to index_list and foreign_key_list"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1555#issuecomment-997128368", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1555", "id": 997128368, "node_id": "IC_kwDOBm6k_c47bviw", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-12-18T02:32:43Z", "updated_at": "2021-12-18T02:32:43Z", "author_association": "OWNER", "body": "I wonder why the `INSERT INTO` queries don't show up in that `?trace=1` view?", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1079149656, "label": "Optimize all those calls to index_list and foreign_key_list"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1555#issuecomment-997128251", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1555", "id": 997128251, "node_id": "IC_kwDOBm6k_c47bvg7", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-12-18T02:31:51Z", "updated_at": "2021-12-18T02:31:51Z", "author_association": "OWNER", "body": "I was thinking it might even be possible to convert this into a `insert into tables select from ...` query:\r\n\r\nhttps://github.com/simonw/datasette/blob/c00f29affcafce8314366852ba1a0f5a7dd25690/datasette/utils/internal_db.py#L102-L112\r\n\r\n But the `SELECT` runs against a separate database from the `INSERT INTO`, so I would have to setup a cross-database connection for this which feels a little too complicated.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1079149656, "label": "Optimize all those calls to index_list and foreign_key_list"}, "performed_via_github_app": null} {"html_url": 
"https://github.com/simonw/datasette/issues/1555#issuecomment-997128080", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1555", "id": 997128080, "node_id": "IC_kwDOBm6k_c47bveQ", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-12-18T02:30:19Z", "updated_at": "2021-12-18T02:30:19Z", "author_association": "OWNER", "body": "I think all of these queries happen in one place - in the `populate_schema_tables()` function - so optimizing them might be localized to just that area of the code, which would be nice:\r\n\r\nhttps://github.com/simonw/datasette/blob/c00f29affcafce8314366852ba1a0f5a7dd25690/datasette/utils/internal_db.py#L97-L183", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1079149656, "label": "Optimize all those calls to index_list and foreign_key_list"}, "performed_via_github_app": null}