{"html_url": "https://github.com/simonw/datasette/issues/1293#issuecomment-899915829", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1293", "id": 899915829, "node_id": "IC_kwDOBm6k_c41o6A1", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-08-17T01:02:35Z", "updated_at": "2021-08-17T01:02:35Z", "author_association": "OWNER", "body": "New approach: this time I'm building a simplified executor for the bytecode operations themselves.\r\n```python\r\ndef execute_operations(operations, max_iterations = 100, trace=None):\r\n trace = trace or (lambda *args: None)\r\n registers: Dict[int, Any] = {}\r\n cursors: Dict[int, Tuple[str, Dict]] = {}\r\n instruction_pointer = 0\r\n iterations = 0\r\n result_row = None\r\n while True:\r\n iterations += 1\r\n if iterations > max_iterations:\r\n break\r\n operation = operations[instruction_pointer]\r\n trace(instruction_pointer, dict(operation))\r\n opcode = operation[\"opcode\"]\r\n if opcode == \"Init\":\r\n if operation[\"p2\"] != 0:\r\n instruction_pointer = operation[\"p2\"]\r\n continue\r\n else:\r\n instruction_pointer += 1\r\n continue\r\n elif opcode == \"Goto\":\r\n instruction_pointer = operation[\"p2\"]\r\n continue\r\n elif opcode == \"Halt\":\r\n break\r\n elif opcode == \"OpenRead\":\r\n cursors[operation[\"p1\"]] = (\"database_table\", {\r\n \"rootpage\": operation[\"p2\"],\r\n \"connection\": operation[\"p3\"],\r\n })\r\n elif opcode == \"OpenEphemeral\":\r\n cursors[operation[\"p1\"]] = (\"ephemeral\", {\r\n \"num_columns\": operation[\"p2\"],\r\n \"index_keys\": [],\r\n })\r\n elif opcode == \"MakeRecord\":\r\n registers[operation[\"p3\"]] = (\"MakeRecord\", {\r\n \"registers\": list(range(operation[\"p1\"] + operation[\"p2\"]))\r\n })\r\n elif opcode == \"IdxInsert\":\r\n record = registers[operation[\"p2\"]]\r\n cursors[operation[\"p1\"]][1][\"index_keys\"].append(record)\r\n elif opcode == \"Rowid\":\r\n registers[operation[\"p2\"]] = (\"rowid\", {\r\n \"table\": operation[\"p1\"]\r\n })\r\n elif opcode == \"Sequence\":\r\n registers[operation[\"p2\"]] = (\"sequence\", {\r\n \"next_from_cursor\": operation[\"p1\"]\r\n })\r\n elif opcode == \"Column\":\r\n registers[operation[\"p3\"]] = (\"column\", {\r\n \"cursor\": operation[\"p1\"],\r\n \"column_offset\": operation[\"p2\"]\r\n })\r\n elif opcode == \"ResultRow\":\r\n p1 = operation[\"p1\"]\r\n p2 = operation[\"p2\"]\r\n trace(\"ResultRow: \", list(range(p1, p1 + p2)), registers)\r\n result_row = [registers.get(i) for i in range(p1, p1 + p2)]\r\n elif opcode == \"Integer\":\r\n registers[operation[\"p2\"]] = (\"Integer\", operation[\"p1\"])\r\n elif opcode == \"String8\":\r\n registers[operation[\"p2\"]] = (\"String\", operation[\"p4\"])\r\n instruction_pointer += 1\r\n return {\"registers\": registers, \"cursors\": cursors, \"result_row\": result_row}\r\n```\r\nResults are promising!\r\n```\r\nexecute_operations(db.execute(\"explain select 'hello', 55, rowid, * from searchable\").fetchall())\r\n\r\n{'registers': {1: ('String', 'hello'),\r\n 2: ('Integer', 55),\r\n 3: ('rowid', {'table': 0}),\r\n 4: ('rowid', {'table': 0}),\r\n 5: ('column', {'cursor': 0, 'column_offset': 1}),\r\n 6: ('column', {'cursor': 0, 'column_offset': 2}),\r\n 7: ('column', {'cursor': 0, 'column_offset': 3})},\r\n 'cursors': {0: ('database_table', {'rootpage': 32, 'connection': 0})},\r\n 'result_row': [('String', 'hello'),\r\n ('Integer', 55),\r\n ('rowid', {'table': 0}),\r\n ('rowid', {'table': 0}),\r\n ('column', {'cursor': 0, 'column_offset': 1}),\r\n ('column', {'cursor': 
0, 'column_offset': 2}),\r\n ('column', {'cursor': 0, 'column_offset': 3})]}\r\n```\r\nHere's what happens with a union across three tables:\r\n```\r\nexecute_operations(db.execute(f\"\"\"\r\nexplain select data as content from binary_data\r\nunion\r\nselect pk as content from complex_foreign_keys\r\nunion\r\nselect name as content from facet_cities\r\n\"\"\").fetchall())\r\n\r\n{'registers': {1: ('column', {'cursor': 4, 'column_offset': 0}),\r\n 2: ('MakeRecord', {'registers': [0, 1, 2, 3]}),\r\n 3: ('column', {'cursor': 0, 'column_offset': 1}),\r\n 4: ('column', {'cursor': 3, 'column_offset': 0})},\r\n 'cursors': {3: ('ephemeral',\r\n {'num_columns': 1,\r\n 'index_keys': [('MakeRecord', {'registers': [0, 1]}),\r\n ('MakeRecord', {'registers': [0, 1]}),\r\n ('MakeRecord', {'registers': [0, 1, 2, 3]})]}),\r\n 2: ('database_table', {'rootpage': 44, 'connection': 0}),\r\n 4: ('database_table', {'rootpage': 24, 'connection': 0}),\r\n 0: ('database_table', {'rootpage': 42, 'connection': 0})},\r\n 'result_row': [('column', {'cursor': 3, 'column_offset': 0})]}\r\n```\r\nNote how the result_row refers to cursor 3, which is an ephemeral table which had three different sets of `MakeRecord` index keys assigned to it - indicating that the output column is NOT from the same underlying table source.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 849978964, "label": "Show column metadata plus links for foreign keys on arbitrary query results"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1423#issuecomment-899749881", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1423", "id": 899749881, "node_id": "IC_kwDOBm6k_c41oRf5", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-08-16T19:07:02Z", "updated_at": "2021-08-16T19:07:02Z", "author_association": "OWNER", "body": "Demo: https://latest.datasette.io/fixtures/compound_three_primary_keys?_facet=content&_facet_size=max&_facet=pk1&_facet=pk2\r\n\r\n\"fixtures__compound_three_primary_keys__1_001_rows\"\r\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 962391325, "label": "Show count of facet values if ?_facet_size=max"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1423#issuecomment-899744109", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1423", "id": 899744109, "node_id": "IC_kwDOBm6k_c41oQFt", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-08-16T18:58:29Z", "updated_at": "2021-08-16T18:58:29Z", "author_association": "OWNER", "body": "I didn't bother with the tooltip, just the visible display if `?_facet_size=max`.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 962391325, "label": "Show count of facet values if ?_facet_size=max"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1293#issuecomment-898961535", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1293", "id": 898961535, "node_id": "IC_kwDOBm6k_c41lRB_", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-08-14T21:37:24Z", "updated_at": "2021-08-14T21:37:24Z", "author_association": "OWNER", "body": "Did some 
more research into building SQLite custom versions via `pysqlite3` - here's what I figured out for macOS (which should hopefully work for Linux too): https://til.simonwillison.net/sqlite/build-specific-sqlite-pysqlite-macos", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 849978964, "label": "Show column metadata plus links for foreign keys on arbitrary query results"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1293#issuecomment-898936068", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1293", "id": 898936068, "node_id": "IC_kwDOBm6k_c41lK0E", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-08-14T17:44:54Z", "updated_at": "2021-08-14T17:44:54Z", "author_association": "OWNER", "body": "Another interesting query to consider: https://latest.datasette.io/fixtures?sql=explain+select+*+from++pragma_table_info%28+%27123_starts_with_digits%27%29\r\n\r\nThat one shows `VColumn` instead of `Column`.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 849978964, "label": "Show column metadata plus links for foreign keys on arbitrary query results"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1293#issuecomment-898933865", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1293", "id": 898933865, "node_id": "IC_kwDOBm6k_c41lKRp", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-08-14T17:27:16Z", "updated_at": "2021-08-14T17:28:29Z", "author_association": "OWNER", "body": "Maybe I split this out into a separate Python library that gets tested against *every* SQLite release I can possibly try it against, and then bakes out the supported release versions into the library code itself?\r\n\r\nDatasette could depend on that library. 
The library could be released independently of Datasette any time a new SQLite version comes out.\r\n\r\nI could even run a separate git scraper repo that checks for new SQLite releases and submits PRs against the library when a new release comes out.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 849978964, "label": "Show column metadata plus links for foreign keys on arbitrary query results"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1293#issuecomment-898913629", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1293", "id": 898913629, "node_id": "IC_kwDOBm6k_c41lFVd", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-08-14T16:14:12Z", "updated_at": "2021-08-14T16:14:12Z", "author_association": "OWNER", "body": "I would feel a lot more comfortable about all of this if I had a robust mechanism for running the Datasette test suite against multiple versions of SQLite itself.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 849978964, "label": "Show column metadata plus links for foreign keys on arbitrary query results"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1293#issuecomment-898913554", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1293", "id": 898913554, "node_id": "IC_kwDOBm6k_c41lFUS", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-08-14T16:13:40Z", "updated_at": "2021-08-14T16:13:40Z", "author_association": "OWNER", "body": "I think I need to care about the following:\r\n\r\n- `ResultRow` and `Column` for the final result\r\n- `OpenRead` for opening tables\r\n- `OpenEphemeral` then `MakeRecord` and `IdxInsert` for writing records into ephemeral tables\r\n\r\n`Column` may reference either a table (from `OpenRead`) or an ephemeral table (from `OpenEphemeral`).\r\n\r\nThat *might* be enough.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 849978964, "label": "Show column metadata plus links for foreign keys on arbitrary query results"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/316#issuecomment-898824020", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/316", "id": 898824020, "node_id": "IC_kwDOCGYnMM41kvdU", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-08-14T05:12:23Z", "updated_at": "2021-08-14T05:12:23Z", "author_association": "OWNER", "body": "No visible backticks on https://sqlite-utils.datasette.io/en/latest/reference.html any more.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 970320615, "label": "Fix visible backticks on reference page"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1293#issuecomment-898788262", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1293", "id": 898788262, "node_id": "IC_kwDOBm6k_c41kmum", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-08-14T01:22:26Z", "updated_at": "2021-08-14T01:51:08Z", "author_association": "OWNER", "body": "Tried a 
more complicated query:\r\n```sql\r\nexplain select pk, text1, text2, [name with . and spaces] from searchable where rowid in (select rowid from searchable_fts where searchable_fts match escape_fts(:search)) order by text1 desc limit 101\r\n```\r\nHere's the explain:\r\n```\r\nsqlite> explain select pk, text1, text2, [name with . and spaces] from searchable where rowid in (select rowid from searchable_fts where searchable_fts match escape_fts(:search)) order by text1 desc limit 101\r\n ...> ;\r\naddr opcode p1 p2 p3 p4 p5 comment \r\n---- ------------- ---- ---- ---- ------------- -- -------------\r\n0 Init 0 41 0 00 Start at 41 \r\n1 OpenEphemeral 2 6 0 k(1,-B) 00 nColumn=6 \r\n2 Integer 101 1 0 00 r[1]=101; LIMIT counter\r\n3 OpenRead 0 32 0 4 00 root=32 iDb=0; searchable\r\n4 Integer 16 3 0 00 r[3]=16; return address\r\n5 Once 0 16 0 00 \r\n6 OpenEphemeral 3 1 0 k(1,) 00 nColumn=1; Result of SELECT 1\r\n7 VOpen 1 0 0 vtab:7FCBCA72BE80 00 \r\n8 Function0 1 7 6 unknown(-1) 01 r[6]=func(r[7])\r\n9 Integer 5 4 0 00 r[4]=5 \r\n10 Integer 1 5 0 00 r[5]=1 \r\n11 VFilter 1 16 4 00 iplan=r[4] zplan=''\r\n12 Rowid 1 8 0 00 r[8]=rowid \r\n13 MakeRecord 8 1 9 C 00 r[9]=mkrec(r[8])\r\n14 IdxInsert 3 9 8 1 00 key=r[9] \r\n15 VNext 1 12 0 00 \r\n16 Return 3 0 0 00 \r\n17 Rewind 3 33 0 00 \r\n18 Column 3 0 2 00 r[2]= \r\n19 IsNull 2 32 0 00 if r[2]==NULL goto 32\r\n20 SeekRowid 0 32 2 00 intkey=r[2] \r\n21 Column 0 1 10 00 r[10]=searchable.text1\r\n22 Sequence 2 11 0 00 r[11]=cursor[2].ctr++\r\n23 IfNotZero 1 27 0 00 if r[1]!=0 then r[1]--, goto 27\r\n24 Last 2 0 0 00 \r\n25 IdxLE 2 32 10 1 00 key=r[10] \r\n26 Delete 2 0 0 00 \r\n27 Rowid 0 12 0 00 r[12]=rowid \r\n28 Column 0 2 13 00 r[13]=searchable.text2\r\n29 Column 0 3 14 00 r[14]=searchable.name with . and spaces\r\n30 MakeRecord 10 5 16 00 r[16]=mkrec(r[10..14])\r\n31 IdxInsert 2 16 10 5 00 key=r[16] \r\n32 Next 3 18 0 00 \r\n33 Sort 2 40 0 00 \r\n34 Column 2 4 15 00 r[15]=[name with . and spaces]\r\n35 Column 2 3 14 00 r[14]=text2 \r\n36 Column 2 0 13 00 r[13]=text1 \r\n37 Column 2 2 12 00 r[12]=pk \r\n38 ResultRow 12 4 0 00 output=r[12..15]\r\n39 Next 2 34 0 00 \r\n40 Halt 0 0 0 00 \r\n41 Transaction 0 0 35 0 01 usesStmtJournal=0\r\n42 Variable 1 7 0 :search 00 r[7]=parameter(1,:search)\r\n43 Goto 0 1 0 00 \r\n```\r\nHere the `ResultRow` is for registers `12..15` - but those all refer to `Column` records in `2` - where `2` is the first `OpenEphemeral` declared right at the start. I'm having enormous trouble figuring out how that ephemeral table gets populated by the other operations in a way that would let me derive which columns end up in the `ResultRow`.\r\n\r\nFrustratingly SQLite seems to be able to figure that out just fine, see the column of comments on the right hand side - but I only get those in the `sqlite3` CLI shell, they're not available to me with SQLite when called as a library from Python.\r\n\r\nMaybe the key to that is this section:\r\n```\r\n27 Rowid 0 12 0 00 r[12]=rowid \r\n28 Column 0 2 13 00 r[13]=searchable.text2\r\n29 Column 0 3 14 00 r[14]=searchable.name with . and spaces\r\n30 MakeRecord 10 5 16 00 r[16]=mkrec(r[10..14])\r\n31 IdxInsert 2 16 10 5 00 key=r[16] \r\n```\r\nMakeRecord:\r\n\r\n> Convert P2 registers beginning with P1 into the record format use as a data record in a database table or as a key in an index. The Column opcode can decode the record later.\r\n> \r\n> P4 may be a string that is P2 characters long. 
The N-th character of the string indicates the column affinity that should be used for the N-th field of the index key.\r\n> \r\n> The mapping from character to affinity is given by the SQLITE_AFF_ macros defined in sqliteInt.h.\r\n> \r\n> If P4 is NULL then all index fields have the affinity BLOB.\r\n> \r\n> The meaning of P5 depends on whether or not the SQLITE_ENABLE_NULL_TRIM compile-time option is enabled:\r\n> \r\n> * If SQLITE_ENABLE_NULL_TRIM is enabled, then the P5 is the index of the right-most table that can be null-trimmed.\r\n> \r\n> * If SQLITE_ENABLE_NULL_TRIM is omitted, then P5 has the value OPFLAG_NOCHNG_MAGIC if the MakeRecord opcode is allowed to accept no-change records with serial_type 10. This value is only used inside an assert() and does not affect the end result.\r\n\r\nIdxInsert:\r\n> Register P2 holds an SQL index key made using the MakeRecord instructions. This opcode writes that key into the index P1. Data for the entry is nil.\r\n> \r\n> If P4 is not zero, then it is the number of values in the unpacked key of reg(P2). In that case, P3 is the index of the first register for the unpacked key. The availability of the unpacked key can sometimes be an optimization.\r\n> \r\n> If P5 has the OPFLAG_APPEND bit set, that is a hint to the b-tree layer that this insert is likely to be an append.\r\n> \r\n> If P5 has the OPFLAG_NCHANGE bit set, then the change counter is incremented by this instruction. If the OPFLAG_NCHANGE bit is clear, then the change counter is unchanged.\r\n> \r\n> If the OPFLAG_USESEEKRESULT flag of P5 is set, the implementation might run faster by avoiding an unnecessary seek on cursor P1. However, the OPFLAG_USESEEKRESULT flag must only be set if there have been no prior seeks on the cursor or if the most recent seek used a key equivalent to P2.\r\n>\r\n> This instruction only works for indices. The equivalent instruction for tables is Insert.\r\n\r\nIdxLE:\r\n> The P4 register values beginning with P3 form an unpacked index key that omits the PRIMARY KEY or ROWID. Compare this key value against the index that P1 is currently pointing to, ignoring the PRIMARY KEY or ROWID on the P1 index.\r\n>\r\n> If the P1 index entry is less than or equal to the key value then jump to P2. Otherwise fall through to the next instruction.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 849978964, "label": "Show column metadata plus links for foreign keys on arbitrary query results"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1293#issuecomment-898760808", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1293", "id": 898760808, "node_id": "IC_kwDOBm6k_c41kgBo", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-08-13T23:03:01Z", "updated_at": "2021-08-13T23:03:01Z", "author_association": "OWNER", "body": "Another idea: strip out any `order by` clause to try and keep this simpler. 
I doubt that's going to cope with complex nested queries though.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 849978964, "label": "Show column metadata plus links for foreign keys on arbitrary query results"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1293#issuecomment-898760020", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1293", "id": 898760020, "node_id": "IC_kwDOBm6k_c41kf1U", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-08-13T23:00:28Z", "updated_at": "2021-08-13T23:01:27Z", "author_association": "OWNER", "body": "New theory: this is all about `SorterOpen` and `SorterInsert`. Consider the following with extra annotations at the end of the lines after the `--`:\r\n```\r\naddr opcode p1 p2 p3 p4 p5 comment \r\n---- ------------- ---- ---- ---- ------------- -- -------------\r\n0 Init 0 25 0 00 Start at 25 \r\n1 SorterOpen 2 5 0 k(1,B) 00 -- New SORTER in r2 with 5 slots\r\n2 OpenRead 0 43 0 7 00 root=43 iDb=0; facetable\r\n3 OpenRead 1 42 0 2 00 root=42 iDb=0; facet_cities\r\n4 Rewind 0 16 0 00 \r\n5 Column 0 6 3 00 r[3]=facetable.neighborhood\r\n6 Function0 1 2 1 like(2) 02 r[1]=func(r[2..3])\r\n7 IfNot 1 15 1 00 \r\n8 Column 0 5 4 00 r[4]=facetable.city_id\r\n9 SeekRowid 1 15 4 00 intkey=r[4] \r\n10 Column 1 1 6 00 r[6]=facet_cities.name\r\n11 Column 0 4 7 00 r[7]=facetable.state\r\n12 Column 0 6 5 00 r[5]=facetable.neighborhood\r\n13 MakeRecord 5 3 9 00 r[9]=mkrec(r[5..7])\r\n14 SorterInsert 2 9 5 3 00 key=r[9]-- WRITES record from r9 (line above) into sorter in r2\r\n15 Next 0 5 0 01 \r\n16 OpenPseudo 3 10 5 00 5 columns in r[10]\r\n17 SorterSort 2 24 0 00 -- runs the sort, not relevant to my goal\r\n18 SorterData 2 10 3 00 r[10]=data -- \"Write into register P2 (r10) the current sorter data for sorter cursor P1 (sorter 2)\"\r\n19 Column 3 2 8 00 r[8]=state \r\n20 Column 3 1 7 00 r[7]=facet_cities.name\r\n21 Column 3 0 6 00 r[6]=neighborhood\r\n22 ResultRow 6 3 0 00 output=r[6..8]\r\n23 SorterNext 2 18 0 00 \r\n24 Halt 0 0 0 00 \r\n25 Transaction 0 0 35 0 01 usesStmtJournal=0\r\n26 String8 0 2 0 %bob% 00 r[2]='%bob%' \r\n27 Goto 0 1 0 00 \r\n```", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 849978964, "label": "Show column metadata plus links for foreign keys on arbitrary query results"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1293#issuecomment-898576097", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1293", "id": 898576097, "node_id": "IC_kwDOBm6k_c41jy7h", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-08-13T16:19:57Z", "updated_at": "2021-08-13T16:19:57Z", "author_association": "OWNER", "body": "I think I need to look out for `OpenPseudo` and, when that occurs, take a look at the most recent `SorterInsert` and use that to find the `MakeRecord` and then use the `MakeRecord` to figure out the columns that went into it.\r\n\r\nAfter all of that I'll be able to resolve that \"table 3\" reference.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 849978964, "label": "Show column metadata plus links for foreign keys on arbitrary query results"}, 
"performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1293#issuecomment-898572065", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1293", "id": 898572065, "node_id": "IC_kwDOBm6k_c41jx8h", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-08-13T16:13:16Z", "updated_at": "2021-08-13T16:13:16Z", "author_association": "OWNER", "body": "Aha! That `MakeRecord` line says `r[5..7]` - and r5 = neighborhood, r6 = facet_cities.name, r7 = facetable.state\r\n\r\nSo if the `MakeRecord` defines what goes into that pseudo-table column 2 of that pseudo-table would be `state` - which is what we want.\r\n\r\nThis is really convoluted. I'm no longer confident I can get this to work in a sensible way, especially since I've not started exploring what complex nested tables with CTEs and sub-selects do yet.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 849978964, "label": "Show column metadata plus links for foreign keys on arbitrary query results"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1293#issuecomment-898569319", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1293", "id": 898569319, "node_id": "IC_kwDOBm6k_c41jxRn", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-08-13T16:09:01Z", "updated_at": "2021-08-13T16:10:48Z", "author_association": "OWNER", "body": "Need to figure out what column 2 of that pseudo-table is.\r\n\r\nI think the answer is here:\r\n\r\n```\r\n4 Rewind 0 16 0 00 \r\n5 Column 0 6 3 00 r[3]=facetable.neighborhood\r\n6 Function0 1 2 1 like(2) 02 r[1]=func(r[2..3])\r\n7 IfNot 1 15 1 00 \r\n8 Column 0 5 4 00 r[4]=facetable.city_id\r\n9 SeekRowid 1 15 4 00 intkey=r[4] \r\n10 Column 1 1 6 00 r[6]=facet_cities.name\r\n11 Column 0 4 7 00 r[7]=facetable.state\r\n12 Column 0 6 5 00 r[5]=facetable.neighborhood\r\n13 MakeRecord 5 3 9 00 r[9]=mkrec(r[5..7])\r\n14 SorterInsert 2 9 5 3 00 key=r[9] \r\n15 Next 0 5 0 01 \r\n16 OpenPseudo 3 10 5 00 5 columns in r[10]\r\n```\r\nI think the `OpenPseduo` line puts five columns in `r[10]` - and those five columns are the five from the previous block - maybe the five leading up to the `MakeRecord` call on line 13.\r\n\r\nIn which case column 2 would be `facet_cities.name` - assuming we start counting from 0.\r\n\r\nBut the debug code said \"r[8]=state\".", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 849978964, "label": "Show column metadata plus links for foreign keys on arbitrary query results"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1293#issuecomment-898567974", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1293", "id": 898567974, "node_id": "IC_kwDOBm6k_c41jw8m", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-08-13T16:07:00Z", "updated_at": "2021-08-13T16:07:00Z", "author_association": "OWNER", "body": "So this line:\r\n```\r\n19 Column 3 2 8 00 r[8]=state\r\n```\r\nMeans \"Take column 2 of table 3 (the pseudo-table) and store it in register 8\"", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 849978964, "label": "Show column metadata plus links for foreign 
keys on arbitrary query results"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1293#issuecomment-898564705", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1293", "id": 898564705, "node_id": "IC_kwDOBm6k_c41jwJh", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-08-13T16:02:12Z", "updated_at": "2021-08-13T16:04:06Z", "author_association": "OWNER", "body": "More debug output:\r\n```\r\ntable_rootpage_by_register={0: 43, 1: 42}\r\nnames_and_types_by_rootpage={42: ('facet_cities', 'table'), 43: ('facetable', 'table')}\r\ntable_id=0 cid=6 column_register=3\r\ntable_id=0 cid=5 column_register=4\r\ntable_id=1 cid=1 column_register=6\r\ntable_id=0 cid=4 column_register=7\r\ntable_id=0 cid=6 column_register=5\r\ntable_id=3 cid=2 column_register=8\r\ntable_id=3 cid=2 column_register=8\r\n KeyError\r\n 3\r\n table = names_and_types_by_rootpage[table_rootpage_by_register[table_id]][0]\r\n names_and_types_by_rootpage={42: ('facet_cities', 'table'), 43: ('facetable', 'table')} table_rootpage_by_register={0: 43, 1: 42} table_id=3\r\n columns_by_column_register[column_register] = (table, cid)\r\n column_register=8 = (table='facetable', cid=2)\r\ntable_id=3 cid=1 column_register=7\r\n KeyError\r\n 3\r\n table = names_and_types_by_rootpage[table_rootpage_by_register[table_id]][0]\r\n names_and_types_by_rootpage={42: ('facet_cities', 'table'), 43: ('facetable', 'table')} table_rootpage_by_register={0: 43, 1: 42} table_id=3\r\n columns_by_column_register[column_register] = (table, cid)\r\n column_register=7 = (table='facetable', cid=1)\r\ntable_id=3 cid=0 column_register=6\r\n KeyError\r\n 3\r\n table = names_and_types_by_rootpage[table_rootpage_by_register[table_id]][0]\r\n names_and_types_by_rootpage={42: ('facet_cities', 'table'), 43: ('facetable', 'table')} table_rootpage_by_register={0: 43, 1: 42} table_id=3\r\n columns_by_column_register[column_register] = (table, cid)\r\n column_register=6 = (table='facetable', cid=0)\r\nresult_registers=[6, 7, 8]\r\ncolumns_by_column_register={3: ('facetable', 6), 4: ('facetable', 5), 6: ('facet_cities', 1), 7: ('facetable', 4), 5: ('facetable', 6)}\r\nall_column_names={('facet_cities', 0): 'id', ('facet_cities', 1): 'name', ('facetable', 0): 'pk', ('facetable', 1): 'created', ('facetable', 2): 'planet_int', ('facetable', 3): 'on_earth', ('facetable', 4): 'state', ('facetable', 5): 'city_id', ('facetable', 6): 'neighborhood', ('facetable', 7): 'tags', ('facetable', 8): 'complex_array', ('facetable', 9): 'distinct_some_null'}\r\n```\r\nThose `KeyError` are happening here because of a lookup in `table_rootpage_by_register` for `table_id=3` - but `table_rootpage_by_register` only has keys 0 and 1.\r\n\r\nIt looks like that `3` actually corresponds to the `OpenPseudo` table from here:\r\n\r\n```\r\n16 OpenPseudo 3 10 5 00 5 columns in r[10]\r\n17 SorterSort 2 24 0 00 \r\n18 SorterData 2 10 3 00 r[10]=data \r\n19 Column 3 2 8 00 r[8]=state \r\n20 Column 3 1 7 00 r[7]=facet_cities.name\r\n21 Column 3 0 6 00 r[6]=neighborhood\r\n22 ResultRow 6 3 0 00 output=r[6..8]\r\n```\r\n\r\nPython code:\r\n\r\n```python\r\ndef columns_for_query(conn, sql, params=None):\r\n \"\"\"\r\n Given a SQLite connection ``conn`` and a SQL query ``sql``, returns a list of\r\n ``(table_name, column_name)`` pairs corresponding to the columns that would be\r\n returned by that SQL query.\r\n\r\n Each pair indicates the source table and column for the returned column, or\r\n ``(None, None)`` if no table and column could 
be derived (e.g. for \"select 1\")\r\n \"\"\"\r\n if sql.lower().strip().startswith(\"explain\"):\r\n return []\r\n opcodes = conn.execute(\"explain \" + sql, params).fetchall()\r\n table_rootpage_by_register = {\r\n r[\"p1\"]: r[\"p2\"] for r in opcodes if r[\"opcode\"] == \"OpenRead\"\r\n }\r\n print(f\"{table_rootpage_by_register=}\")\r\n names_and_types_by_rootpage = dict(\r\n [(r[0], (r[1], r[2])) for r in conn.execute(\r\n \"select rootpage, name, type from sqlite_master where rootpage in ({})\".format(\r\n \", \".join(map(str, table_rootpage_by_register.values()))\r\n )\r\n )]\r\n )\r\n print(f\"{names_and_types_by_rootpage=}\")\r\n columns_by_column_register = {}\r\n for opcode_row in opcodes:\r\n if opcode_row[\"opcode\"] in (\"Rowid\", \"Column\"):\r\n addr, opcode, table_id, cid, column_register, p4, p5, comment = opcode_row\r\n print(f\"{table_id=} {cid=} {column_register=}\")\r\n try:\r\n table = names_and_types_by_rootpage[table_rootpage_by_register[table_id]][0]\r\n columns_by_column_register[column_register] = (table, cid)\r\n except KeyError as e:\r\n print(\" KeyError\")\r\n print(\" \", e)\r\n print(\" table = names_and_types_by_rootpage[table_rootpage_by_register[table_id]][0]\")\r\n print(f\" {names_and_types_by_rootpage=} {table_rootpage_by_register=} {table_id=}\")\r\n print(\" columns_by_column_register[column_register] = (table, cid)\")\r\n print(f\" {column_register=} = ({table=}, {cid=})\")\r\n pass\r\n result_row = [dict(r) for r in opcodes if r[\"opcode\"] == \"ResultRow\"][0]\r\n result_registers = list(range(result_row[\"p1\"], result_row[\"p1\"] + result_row[\"p2\"]))\r\n print(f\"{result_registers=}\")\r\n print(f\"{columns_by_column_register=}\")\r\n all_column_names = {}\r\n for (table, _) in names_and_types_by_rootpage.values():\r\n table_xinfo = conn.execute(\"pragma table_xinfo({})\".format(table)).fetchall()\r\n for column_info in table_xinfo:\r\n all_column_names[(table, column_info[\"cid\"])] = column_info[\"name\"]\r\n print(f\"{all_column_names=}\")\r\n final_output = []\r\n for register in result_registers:\r\n try:\r\n table, cid = columns_by_column_register[register]\r\n final_output.append((table, all_column_names[table, cid]))\r\n except KeyError:\r\n final_output.append((None, None))\r\n return final_output\r\n\r\n```", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 849978964, "label": "Show column metadata plus links for foreign keys on arbitrary query results"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1293#issuecomment-898554859", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1293", "id": 898554859, "node_id": "IC_kwDOBm6k_c41jtvr", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-08-13T15:46:18Z", "updated_at": "2021-08-13T15:46:18Z", "author_association": "OWNER", "body": "So it looks like the bug is in the code that populates `columns_by_column_register`.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 849978964, "label": "Show column metadata plus links for foreign keys on arbitrary query results"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1293#issuecomment-898554427", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1293", "id": 898554427, 
"node_id": "IC_kwDOBm6k_c41jto7", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-08-13T15:45:32Z", "updated_at": "2021-08-13T15:45:32Z", "author_association": "OWNER", "body": "Some useful debug output:\r\n```\r\ntable_rootpage_by_register={0: 43, 1: 42}\r\nnames_and_types_by_rootpage={42: ('facet_cities', 'table'), 43: ('facetable', 'table')}\r\nresult_registers=[6, 7, 8]\r\ncolumns_by_column_register={3: ('facetable', 6), 4: ('facetable', 5), 6: ('facet_cities', 1), 7: ('facetable', 4), 5: ('facetable', 6)}\r\nall_column_names={('facet_cities', 0): 'id', ('facet_cities', 1): 'name', ('facetable', 0): 'pk', ('facetable', 1): 'created', ('facetable', 2): 'planet_int', ('facetable', 3): 'on_earth', ('facetable', 4): 'state', ('facetable', 5): 'city_id', ('facetable', 6): 'neighborhood', ('facetable', 7): 'tags', ('facetable', 8): 'complex_array', ('facetable', 9): 'distinct_some_null'}\r\n```\r\nThe `result_registers` should each correspond to the correct entry in `columns_by_column_register` but they do not.\r\n\r\nPython code:\r\n```python\r\ndef columns_for_query(conn, sql, params=None):\r\n \"\"\"\r\n Given a SQLite connection ``conn`` and a SQL query ``sql``, returns a list of\r\n ``(table_name, column_name)`` pairs corresponding to the columns that would be\r\n returned by that SQL query.\r\n\r\n Each pair indicates the source table and column for the returned column, or\r\n ``(None, None)`` if no table and column could be derived (e.g. for \"select 1\")\r\n \"\"\"\r\n if sql.lower().strip().startswith(\"explain\"):\r\n return []\r\n opcodes = conn.execute(\"explain \" + sql, params).fetchall()\r\n table_rootpage_by_register = {\r\n r[\"p1\"]: r[\"p2\"] for r in opcodes if r[\"opcode\"] == \"OpenRead\"\r\n }\r\n print(f\"{table_rootpage_by_register=}\")\r\n names_and_types_by_rootpage = dict(\r\n [(r[0], (r[1], r[2])) for r in conn.execute(\r\n \"select rootpage, name, type from sqlite_master where rootpage in ({})\".format(\r\n \", \".join(map(str, table_rootpage_by_register.values()))\r\n )\r\n )]\r\n )\r\n print(f\"{names_and_types_by_rootpage=}\")\r\n columns_by_column_register = {}\r\n for opcode in opcodes:\r\n if opcode[\"opcode\"] in (\"Rowid\", \"Column\"):\r\n addr, opcode, table_id, cid, column_register, p4, p5, comment = opcode\r\n try:\r\n table = names_and_types_by_rootpage[table_rootpage_by_register[table_id]][0]\r\n columns_by_column_register[column_register] = (table, cid)\r\n except KeyError:\r\n pass\r\n result_row = [dict(r) for r in opcodes if r[\"opcode\"] == \"ResultRow\"][0]\r\n result_registers = list(range(result_row[\"p1\"], result_row[\"p1\"] + result_row[\"p2\"]))\r\n print(f\"{result_registers=}\")\r\n print(f\"{columns_by_column_register=}\")\r\n all_column_names = {}\r\n for (table, _) in names_and_types_by_rootpage.values():\r\n table_xinfo = conn.execute(\"pragma table_xinfo({})\".format(table)).fetchall()\r\n for column_info in table_xinfo:\r\n all_column_names[(table, column_info[\"cid\"])] = column_info[\"name\"]\r\n print(f\"{all_column_names=}\")\r\n final_output = []\r\n for register in result_registers:\r\n try:\r\n table, cid = columns_by_column_register[register]\r\n final_output.append((table, all_column_names[table, cid]))\r\n except KeyError:\r\n final_output.append((None, None))\r\n return final_output\r\n```", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 849978964, "label": "Show column metadata plus 
links for foreign keys on arbitrary query results"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1293#issuecomment-898545815", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1293", "id": 898545815, "node_id": "IC_kwDOBm6k_c41jriX", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-08-13T15:31:53Z", "updated_at": "2021-08-13T15:31:53Z", "author_association": "OWNER", "body": "My hunch here is that registers or columns are being reused in a way that makes my code break - my code is pretty dumb, there are places in it where maybe the first mention of a register wins instead of the last one?", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 849978964, "label": "Show column metadata plus links for foreign keys on arbitrary query results"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1293#issuecomment-898541972", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1293", "id": 898541972, "node_id": "IC_kwDOBm6k_c41jqmU", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-08-13T15:26:06Z", "updated_at": "2021-08-13T15:29:06Z", "author_association": "OWNER", "body": "ResultRow:\r\n> The registers P1 through P1+P2-1 contain a single row of results. This opcode causes the sqlite3_step() call to terminate with an SQLITE_ROW return code and it sets up the sqlite3_stmt structure to provide access to the r(P1)..r(P1+P2-1) values as the result row.\r\n\r\nColumn:\r\n> Interpret the data that cursor P1 points to as a structure built using the MakeRecord instruction. (See the MakeRecord opcode for additional information about the format of the data.) Extract the P2-th column from this record. 
If there are less that (P2+1) values in the record, extract a NULL.\r\n>\r\n> The value extracted is stored in register P3.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 849978964, "label": "Show column metadata plus links for foreign keys on arbitrary query results"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1293#issuecomment-898541543", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1293", "id": 898541543, "node_id": "IC_kwDOBm6k_c41jqfn", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-08-13T15:25:26Z", "updated_at": "2021-08-13T15:25:26Z", "author_association": "OWNER", "body": "But the debug output here seems to be saying what we want it to say:\r\n```\r\n17 SorterSort 2 24 0 00 \r\n18 SorterData 2 10 3 00 r[10]=data \r\n19 Column 3 2 8 00 r[8]=state \r\n20 Column 3 1 7 00 r[7]=facet_cities.name\r\n21 Column 3 0 6 00 r[6]=neighborhood\r\n22 ResultRow 6 3 0 00 output=r[6..8]\r\n```\r\nWe want to get back `neighborhood`, `facet_cities.name`, `state`.\r\n\r\nWhy then are we seeing `[('facet_cities', 'name'), ('facetable', 'state'), (None, None)]`?", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 849978964, "label": "Show column metadata plus links for foreign keys on arbitrary query results"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1293#issuecomment-898540260", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1293", "id": 898540260, "node_id": "IC_kwDOBm6k_c41jqLk", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-08-13T15:23:28Z", "updated_at": "2021-08-13T15:23:28Z", "author_association": "OWNER", "body": "SorterInsert:\r\n> Register P2 holds an SQL index key made using the MakeRecord instructions. This opcode writes that key into the sorter P1. Data for the entry is nil.\r\n\r\nSorterData:\r\n> Write into register P2 the current sorter data for sorter cursor P1. Then clear the column header cache on cursor P3.\r\n>\r\n> This opcode is normally use to move a record out of the sorter and into a register that is the source for a pseudo-table cursor created using OpenPseudo. That pseudo-table cursor is the one that is identified by parameter P3. Clearing the P3 column cache as part of this opcode saves us from having to issue a separate NullRow instruction to clear that cache.\r\n\r\nOpenPseudo:\r\n> Open a new cursor that points to a fake table that contains a single row of data. The content of that one row is the content of memory register P2. In other words, cursor P1 becomes an alias for the MEM_Blob content contained in register P2.\r\n>\r\n> A pseudo-table created by this opcode is used to hold a single row output from the sorter so that the row can be decomposed into individual columns using the Column opcode. 
The Column opcode is the only cursor opcode that works with a pseudo-table.\r\n>\r\n> P3 is the number of fields in the records that will be stored by the pseudo-table.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 849978964, "label": "Show column metadata plus links for foreign keys on arbitrary query results"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1293#issuecomment-898536181", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1293", "id": 898536181, "node_id": "IC_kwDOBm6k_c41jpL1", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-08-13T15:17:20Z", "updated_at": "2021-08-13T15:20:33Z", "author_association": "OWNER", "body": "Documentation for `MakeRecord`: https://www.sqlite.org/opcode.html#MakeRecord\r\n\r\nRunning `explain` inside `sqlite3` provides extra comments and indentation which make it easier to understand:\r\n```\r\nsqlite> explain select neighborhood, facet_cities.name, state\r\n ...> from facetable\r\n ...> join facet_cities\r\n ...> on facetable.city_id = facet_cities.id\r\n ...> where neighborhood like '%bob%';\r\naddr opcode p1 p2 p3 p4 p5 comment \r\n---- ------------- ---- ---- ---- ------------- -- -------------\r\n0 Init 0 15 0 00 Start at 15 \r\n1 OpenRead 0 43 0 7 00 root=43 iDb=0; facetable\r\n2 OpenRead 1 42 0 2 00 root=42 iDb=0; facet_cities\r\n3 Rewind 0 14 0 00 \r\n4 Column 0 6 3 00 r[3]=facetable.neighborhood\r\n5 Function0 1 2 1 like(2) 02 r[1]=func(r[2..3])\r\n6 IfNot 1 13 1 00 \r\n7 Column 0 5 4 00 r[4]=facetable.city_id\r\n8 SeekRowid 1 13 4 00 intkey=r[4] \r\n9 Column 0 6 5 00 r[5]=facetable.neighborhood\r\n10 Column 1 1 6 00 r[6]=facet_cities.name\r\n11 Column 0 4 7 00 r[7]=facetable.state\r\n12 ResultRow 5 3 0 00 output=r[5..7]\r\n13 Next 0 4 0 01 \r\n14 Halt 0 0 0 00 \r\n15 Transaction 0 0 35 0 01 usesStmtJournal=0\r\n16 String8 0 2 0 %bob% 00 r[2]='%bob%' \r\n17 Goto 0 1 0 00 \r\n```\r\nCompared with:\r\n```\r\nsqlite> explain select neighborhood, facet_cities.name, state\r\n ...> from facetable\r\n ...> join facet_cities\r\n ...> on facetable.city_id = facet_cities.id\r\n ...> where neighborhood like '%bob%' order by neighborhood\r\n ...> ;\r\naddr opcode p1 p2 p3 p4 p5 comment \r\n---- ------------- ---- ---- ---- ------------- -- -------------\r\n0 Init 0 25 0 00 Start at 25 \r\n1 SorterOpen 2 5 0 k(1,B) 00 \r\n2 OpenRead 0 43 0 7 00 root=43 iDb=0; facetable\r\n3 OpenRead 1 42 0 2 00 root=42 iDb=0; facet_cities\r\n4 Rewind 0 16 0 00 \r\n5 Column 0 6 3 00 r[3]=facetable.neighborhood\r\n6 Function0 1 2 1 like(2) 02 r[1]=func(r[2..3])\r\n7 IfNot 1 15 1 00 \r\n8 Column 0 5 4 00 r[4]=facetable.city_id\r\n9 SeekRowid 1 15 4 00 intkey=r[4] \r\n10 Column 1 1 6 00 r[6]=facet_cities.name\r\n11 Column 0 4 7 00 r[7]=facetable.state\r\n12 Column 0 6 5 00 r[5]=facetable.neighborhood\r\n13 MakeRecord 5 3 9 00 r[9]=mkrec(r[5..7])\r\n14 SorterInsert 2 9 5 3 00 key=r[9] \r\n15 Next 0 5 0 01 \r\n16 OpenPseudo 3 10 5 00 5 columns in r[10]\r\n17 SorterSort 2 24 0 00 \r\n18 SorterData 2 10 3 00 r[10]=data \r\n19 Column 3 2 8 00 r[8]=state \r\n20 Column 3 1 7 00 r[7]=facet_cities.name\r\n21 Column 3 0 6 00 r[6]=neighborhood\r\n22 ResultRow 6 3 0 00 output=r[6..8]\r\n23 SorterNext 2 18 0 00 \r\n24 Halt 0 0 0 00 \r\n25 Transaction 0 0 35 0 01 usesStmtJournal=0\r\n26 String8 0 2 0 %bob% 00 r[2]='%bob%' \r\n27 Goto 0 1 0 00 \r\n```\r\nSo actually it looks like the 
`SorterSort` may be key to understanding this.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 849978964, "label": "Show column metadata plus links for foreign keys on arbitrary query results"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1293#issuecomment-898527525", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1293", "id": 898527525, "node_id": "IC_kwDOBm6k_c41jnEl", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-08-13T15:08:03Z", "updated_at": "2021-08-13T15:08:03Z", "author_association": "OWNER", "body": "Am I going to need to look at the `ResultRow` and its columns but then wind back to that earlier `MakeRecord` and its columns?", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 849978964, "label": "Show column metadata plus links for foreign keys on arbitrary query results"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1293#issuecomment-898524057", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1293", "id": 898524057, "node_id": "IC_kwDOBm6k_c41jmOZ", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-08-13T15:06:37Z", "updated_at": "2021-08-13T15:06:37Z", "author_association": "OWNER", "body": "Comparing the `explain` for the two versions of that query - one with the order by and one without:\r\n\r\n\"fixtures__explain_select_neighborhood__facet_cities_name__state_from_facetable_join_facet_cities_on_facetable_city_id___facet_cities_id_where_neighborhood_like_________text________order_by_neighborhood_and_fixtures__explain_select_neighborh\"\r\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 849978964, "label": "Show column metadata plus links for foreign keys on arbitrary query results"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1293#issuecomment-898519924", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1293", "id": 898519924, "node_id": "IC_kwDOBm6k_c41jlN0", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-08-13T15:03:36Z", "updated_at": "2021-08-13T15:03:36Z", "author_association": "OWNER", "body": "Weird edge-case: adding an `order by` changes the order of the columns with respect to the information I am deriving about them.\r\n\r\nWithout order by this gets it right:\r\n\r\n\"fixtures__select_neighborhood__facet_cities_name__state_from_facetable_join_facet_cities_on_facetable_city_id___facet_cities_id_where_neighborhood_like_________text________\"\r\n\r\nWith order by:\r\n\r\n\"fixtures__select_neighborhood__facet_cities_name__state_from_facetable_join_facet_cities_on_facetable_city_id___facet_cities_id_where_neighborhood_like_________text________order_by_neighborhood\"\r\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 849978964, "label": "Show column metadata plus links for foreign keys on arbitrary query results"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1293#issuecomment-898517872", "issue_url": 
"https://api.github.com/repos/simonw/datasette/issues/1293", "id": 898517872, "node_id": "IC_kwDOBm6k_c41jktw", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-08-13T15:00:50Z", "updated_at": "2021-08-13T15:00:50Z", "author_association": "OWNER", "body": "The primary key column (or `rowid`) often resolves to an `index` record in the `sqlite_master` table, e.g. the second row in this: \r\n\r\ntype | name | tbl_name | rootpage | sql\r\n-- | -- | -- | -- | --\r\ntable | simple_primary_key | simple_primary_key | 2 | CREATE TABLE simple_primary_key ( id varchar(30) primary key, content text )\r\nindex | sqlite_autoindex_simple_primary_key_1 | simple_primary_key | 3 | \u00a0\r\n\r\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 849978964, "label": "Show column metadata plus links for foreign keys on arbitrary query results"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1293#issuecomment-898506647", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1293", "id": 898506647, "node_id": "IC_kwDOBm6k_c41jh-X", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-08-13T14:43:19Z", "updated_at": "2021-08-13T14:43:19Z", "author_association": "OWNER", "body": "Work will continue in PR #1434.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 849978964, "label": "Show column metadata plus links for foreign keys on arbitrary query results"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/1433#issuecomment-898450402", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1433", "id": 898450402, "node_id": "IC_kwDOBm6k_c41jUPi", "user": {"value": 22429695, "label": "codecov[bot]"}, "created_at": "2021-08-13T13:15:55Z", "updated_at": "2021-08-13T13:15:55Z", "author_association": "NONE", "body": "# [Codecov](https://codecov.io/gh/simonw/datasette/pull/1433?src=pr&el=h1&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison) Report\n> Merging [#1433](https://codecov.io/gh/simonw/datasette/pull/1433?src=pr&el=desc&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison) (ddba6cc) into [main](https://codecov.io/gh/simonw/datasette/commit/2883098770fc66e50183b2b231edbde20848d4d6?el=desc&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison) (2883098) will **not change** coverage.\n> The diff coverage is `n/a`.\n\n[![Impacted file tree graph](https://codecov.io/gh/simonw/datasette/pull/1433/graphs/tree.svg?width=650&height=150&src=pr&token=eSahVY7kw1&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison)](https://codecov.io/gh/simonw/datasette/pull/1433?src=pr&el=tree&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison)\n\n```diff\n@@ Coverage Diff @@\n## main #1433 +/- ##\n=======================================\n Coverage 91.82% 91.82% \n=======================================\n Files 34 34 \n Lines 4418 4418 \n=======================================\n Hits 4057 4057 \n Misses 361 361 \n```\n\n\n\n------\n\n[Continue to review full report at 
Codecov](https://codecov.io/gh/simonw/datasette/pull/1433?src=pr&el=continue&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison).\n> **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta?utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison)\n> `\u0394 = absolute (impact)`, `\u00f8 = not affected`, `? = missing data`\n> Powered by [Codecov](https://codecov.io/gh/simonw/datasette/pull/1433?src=pr&el=footer&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison). Last update [2883098...ddba6cc](https://codecov.io/gh/simonw/datasette/pull/1433?src=pr&el=lastupdated&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments?utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison).\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 970386262, "label": "Update trustme requirement from <0.9,>=0.7 to >=0.7,<0.10"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1429#issuecomment-898185944", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1429", "id": 898185944, "node_id": "IC_kwDOBm6k_c41iTrY", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-08-13T04:37:41Z", "updated_at": "2021-08-13T04:37:41Z", "author_association": "OWNER", "body": "If a count is available and the count is less than 1,000 it could say \"Show all\" instead.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 969548935, "label": "UI for setting `?_size=max` on table page"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1432#issuecomment-898084675", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1432", "id": 898084675, "node_id": "IC_kwDOBm6k_c41h69D", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-08-13T01:11:30Z", "updated_at": "2021-08-13T01:11:30Z", "author_association": "OWNER", "body": "It's only `datasette-publish-vercel` that will break the actual functionality - the others will have broken tests.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 969855774, "label": "Rename Datasette.__init__(config=) parameter to settings="}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1432#issuecomment-898079507", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1432", "id": 898079507, "node_id": "IC_kwDOBm6k_c41h5sT", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-08-13T01:08:42Z", "updated_at": "2021-08-13T01:09:41Z", "author_association": "OWNER", "body": "This is going to break some plugins: https://ripgrep.datasette.io/-/ripgrep?pattern=config%3D&literal=on&glob=%21datasette%2F**\r\n\r\n> ### datasette-cluster-map/tests/test_cluster_map.py\r\n> \r\n> @pytest.mark.asyncio\r\n> \r\n> async def test_respects_base_url():\r\n> ds = Datasette([], memory=True, config={\"base_url\": 
\"/foo/\"})\r\n> response = await ds.client.get(\"/:memory:?sql=select+1+as+latitude,+2+as+longitude\")\r\n> assert (\r\n> \r\n> ### datasette-export-notebook/tests/test_export_notebook.py\r\n> \r\n> @pytest.mark.asyncio\r\n> \r\n> async def test_notebook_no_csv(db_path):\r\n> datasette = Datasette([db_path], config={\"allow_csv_stream\": False})\r\n> response = await datasette.client.get(\"/db/big.Notebook\")\r\n> assert \".csv\" not in response.text\r\n> \r\n> ### datasette-publish-vercel/tests/test_publish_vercel.py\r\n> metadata=metadata,\r\n> cors=True,\r\n> config={\"default_page_size\": 10, \"sql_time_limit_ms\": 2000}\r\n> ).app()\r\n> \"\"\"\r\n> \r\n> ### datasette-publish-vercel/datasette_publish_vercel/__init__.py\r\n> metadata=metadata{extras},\r\n> cors=True,\r\n> config={settings}\r\n> \r\n> ).app()\r\n> \r\n> \"\"\".strip()\r\n> \r\n> ### datasette-search-all/tests/test_search_all.py\r\n> \r\n> async def test_base_url(db_path, path):\r\n> sqlite_utils.Database(db_path)[\"creatures\"].enable_fts([\"name\", \"description\"])\r\n> datasette = Datasette([db_path], config={\"base_url\": \"/foo/\"})\r\n> response = await datasette.client.get(path)\r\n> assert response.status_code == 200\r\n\r\nI should fix those as soon as this goes out in a release. I won't close this issue until then.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 969855774, "label": "Rename Datasette.__init__(config=) parameter to settings="}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1432#issuecomment-898074849", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1432", "id": 898074849, "node_id": "IC_kwDOBm6k_c41h4jh", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-08-13T01:03:40Z", "updated_at": "2021-08-13T01:03:40Z", "author_association": "OWNER", "body": "Also this method: https://github.com/simonw/datasette/blob/77f46297a88ac7e49dad2139410b01ee56d5f99c/datasette/app.py#L422-L424\r\n\r\nAnd the places that use it:\r\n\r\nhttps://github.com/simonw/datasette/blob/fc4846850fffd54561bc125332dfe97bb41ff42e/datasette/views/base.py#L617\r\n\r\nhttps://github.com/simonw/datasette/blob/fc4846850fffd54561bc125332dfe97bb41ff42e/datasette/views/database.py#L459\r\n\r\nWhich is used in this template: https://github.com/simonw/datasette/blob/77f46297a88ac7e49dad2139410b01ee56d5f99c/datasette/templates/table.html#L204\r\n\r\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 969855774, "label": "Rename Datasette.__init__(config=) parameter to settings="}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1431#issuecomment-898072940", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1431", "id": 898072940, "node_id": "IC_kwDOBm6k_c41h4Fs", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-08-13T00:58:40Z", "updated_at": "2021-08-13T00:58:40Z", "author_association": "OWNER", "body": "While I'm doing this I should rename this internal variable to avoid confusion in the future:\r\n\r\nhttps://github.com/simonw/datasette/blob/e837095ef35ae155b4c78cc9a8b7133a48c94f03/datasette/app.py#L203", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", 
"issue": {"value": 969840302, "label": "`--help-config` should be called `--help-settings`"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1293#issuecomment-813134386", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1293", "id": 813134386, "node_id": "MDEyOklzc3VlQ29tbWVudDgxMzEzNDM4Ng==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-04-05T01:20:28Z", "updated_at": "2021-08-13T00:42:30Z", "author_association": "OWNER", "body": "... that output might also provide a better way to extract variables than the current mechanism using a regular expression, by looking for the `Variable` opcodes.\r\n\r\n[UPDATE: it did indeed do that, see #1421]", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 849978964, "label": "Show column metadata plus links for foreign keys on arbitrary query results"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1293#issuecomment-898066466", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1293", "id": 898066466, "node_id": "IC_kwDOBm6k_c41h2gi", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-08-13T00:40:24Z", "updated_at": "2021-08-13T00:40:24Z", "author_association": "OWNER", "body": "It figures out renamed columns too:\r\n\r\n\"fixtures__select_created__state_as_the_state_from_facetable\"\r\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 849978964, "label": "Show column metadata plus links for foreign keys on arbitrary query results"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1293#issuecomment-898065948", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1293", "id": 898065948, "node_id": "IC_kwDOBm6k_c41h2Yc", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-08-13T00:38:58Z", "updated_at": "2021-08-13T00:38:58Z", "author_association": "OWNER", "body": "Trying to run `explain select * from facetable` fails with an error in my prototype, because it tries to execute `explain explain select * from facetable` - so I need to spot that error and ignore it.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 849978964, "label": "Show column metadata plus links for foreign keys on arbitrary query results"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1293#issuecomment-898065011", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1293", "id": 898065011, "node_id": "IC_kwDOBm6k_c41h2Jz", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-08-13T00:36:30Z", "updated_at": "2021-08-13T00:36:30Z", "author_association": "OWNER", "body": "> https://latest.datasette.io/fixtures?sql=explain+select+*+from+paginated_view will be an interesting test query - because `paginated_view` is defined like this:\r\n> \r\n> ```sql\r\n> CREATE VIEW paginated_view AS\r\n> SELECT\r\n> content,\r\n> '- ' || content || ' -' AS content_extra\r\n> FROM no_primary_key;\r\n> ```\r\n> \r\n> So this will help test that the mechanism isn't confused by output columns that are created through a concatenation expression.\r\n\r\nHere's what it does 
for that:\r\n\r\n\"fixtures__select___from_paginated_view\"\r\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 849978964, "label": "Show column metadata plus links for foreign keys on arbitrary query results"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1293#issuecomment-898063815", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1293", "id": 898063815, "node_id": "IC_kwDOBm6k_c41h13H", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-08-13T00:33:17Z", "updated_at": "2021-08-13T00:33:17Z", "author_association": "OWNER", "body": "Improved version of that function:\r\n```python\r\ndef columns_for_query(conn, sql):\r\n \"\"\"\r\n Given a SQLite connection ``conn`` and a SQL query ``sql``,\r\n returns a list of ``(table_name, column_name)`` pairs, one\r\n per returned column. ``(None, None)`` if no table and column\r\n could be derived.\r\n \"\"\"\r\n rows = conn.execute('explain ' + sql).fetchall()\r\n table_rootpage_by_register = {r['p1']: r['p2'] for r in rows if r['opcode'] == 'OpenRead'}\r\n names_by_rootpage = dict(\r\n conn.execute(\r\n 'select rootpage, name from sqlite_master where rootpage in ({})'.format(\r\n ', '.join(map(str, table_rootpage_by_register.values()))\r\n )\r\n )\r\n )\r\n columns_by_column_register = {}\r\n for row in rows:\r\n if row['opcode'] in ('Rowid', 'Column'):\r\n addr, opcode, table_id, cid, column_register, p4, p5, comment = row\r\n table = names_by_rootpage[table_rootpage_by_register[table_id]]\r\n columns_by_column_register[column_register] = (table, cid)\r\n result_row = [dict(r) for r in rows if r['opcode'] == 'ResultRow'][0]\r\n registers = list(range(result_row[\"p1\"], result_row[\"p1\"] + result_row[\"p2\"]))\r\n all_column_names = {}\r\n for table in names_by_rootpage.values():\r\n table_xinfo = conn.execute('pragma table_xinfo({})'.format(table)).fetchall()\r\n for row in table_xinfo:\r\n all_column_names[(table, row[\"cid\"])] = row[\"name\"]\r\n final_output = []\r\n for r in registers:\r\n try:\r\n table, cid = columns_by_column_register[r]\r\n final_output.append((table, all_column_names[table, cid]))\r\n except KeyError:\r\n final_output.append((None, None))\r\n return final_output\r\n```\r\nIt works!\r\n\r\n\"Banners_and_Alerts_and_fixtures__select_attraction_id__roadside_attractions_name__characteristic_id__attraction_characteristic_name_as_characteristic_from_roadside_attraction_characteristics_join_roadside_attractions_on_roadside_attractions\"\r\n\r\n```diff\r\ndiff --git a/datasette/templates/query.html b/datasette/templates/query.html\r\nindex 75f7f1b..9fe1d4f 100644\r\n--- a/datasette/templates/query.html\r\n+++ b/datasette/templates/query.html\r\n@@ -67,6 +67,8 @@\r\n

\r\n \r\n \r\n+extra_column_info: {{ extra_column_info }}\r\n+\r\n {% if display_rows %}\r\n

This data as {% for name, url in renderers.items() %}{{ name }}{{ \", \" if not loop.last }}{% endfor %}, CSV

\r\n
\r\ndiff --git a/datasette/views/database.py b/datasette/views/database.py\r\nindex 7c36034..02f8039 100644\r\n--- a/datasette/views/database.py\r\n+++ b/datasette/views/database.py\r\n@@ -10,6 +10,7 @@ import markupsafe\r\n from datasette.utils import (\r\n await_me_maybe,\r\n check_visibility,\r\n+ columns_for_query,\r\n derive_named_parameters,\r\n to_css_class,\r\n validate_sql_select,\r\n@@ -248,6 +249,8 @@ class QueryView(DataView):\r\n \r\n query_error = None\r\n \r\n+ extra_column_info = None\r\n+\r\n # Execute query - as write or as read\r\n if write:\r\n if request.method == \"POST\":\r\n@@ -334,6 +337,10 @@ class QueryView(DataView):\r\n database, sql, params_for_query, truncate=True, **extra_args\r\n )\r\n columns = [r[0] for r in results.description]\r\n+\r\n+ # Try to figure out extra column information\r\n+ db = self.ds.get_database(database)\r\n+ extra_column_info = await db.execute_fn(lambda conn: columns_for_query(conn, sql))\r\n except sqlite3.DatabaseError as e:\r\n query_error = e\r\n results = None\r\n@@ -462,6 +469,7 @@ class QueryView(DataView):\r\n \"show_hide_text\": show_hide_text,\r\n \"show_hide_hidden\": markupsafe.Markup(show_hide_hidden),\r\n \"hide_sql\": hide_sql,\r\n+ \"extra_column_info\": extra_column_info,\r\n }\r\n \r\n return (\r\n```", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 849978964, "label": "Show column metadata plus links for foreign keys on arbitrary query results"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1293#issuecomment-898056013", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1293", "id": 898056013, "node_id": "IC_kwDOBm6k_c41hz9N", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-08-13T00:12:09Z", "updated_at": "2021-08-13T00:12:09Z", "author_association": "OWNER", "body": "Having added column metadata in #1430 (ref #942) I could also include a definition list at the top of the query results page exposing the column descriptions for any columns, using the same EXPLAIN mechanism.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 849978964, "label": "Show column metadata plus links for foreign keys on arbitrary query results"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/942#issuecomment-898051645", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/942", "id": 898051645, "node_id": "IC_kwDOBm6k_c41hy49", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-08-13T00:02:25Z", "updated_at": "2021-08-13T00:02:25Z", "author_association": "OWNER", "body": "And on mobile:\r\n\r\n![5FAF8D73-7199-4BB7-A5B8-9E46DCB4A985](https://user-images.githubusercontent.com/9599/129284817-dc13cbf4-144e-4f4c-8fb7-470602e2eea0.jpeg)\r\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 681334912, "label": "Support column descriptions in metadata.json"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/942#issuecomment-898050457", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/942", "id": 898050457, "node_id": "IC_kwDOBm6k_c41hymZ", "user": {"value": 9599, "label": "simonw"}, "created_at": 
"2021-08-12T23:59:53Z", "updated_at": "2021-08-12T23:59:53Z", "author_association": "OWNER", "body": "Documentation: https://docs.datasette.io/en/latest/metadata.html#column-descriptions\r\n\r\nLive demo: https://latest.datasette.io/fixtures/roadside_attractions\r\n\r\n\"fixtures__roadside_attractions__4_rows\"\r\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 681334912, "label": "Support column descriptions in metadata.json"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/1430#issuecomment-898043575", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1430", "id": 898043575, "node_id": "IC_kwDOBm6k_c41hw63", "user": {"value": 22429695, "label": "codecov[bot]"}, "created_at": "2021-08-12T23:39:36Z", "updated_at": "2021-08-12T23:49:51Z", "author_association": "NONE", "body": "# [Codecov](https://codecov.io/gh/simonw/datasette/pull/1430?src=pr&el=h1&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison) Report\n> Merging [#1430](https://codecov.io/gh/simonw/datasette/pull/1430?src=pr&el=desc&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison) (9419947) into [main](https://codecov.io/gh/simonw/datasette/commit/b1fed48a95516ae84c0f020582303ab50ab817e2?el=desc&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison) (b1fed48) will **increase** coverage by `0.00%`.\n> The diff coverage is `100.00%`.\n\n[![Impacted file tree graph](https://codecov.io/gh/simonw/datasette/pull/1430/graphs/tree.svg?width=650&height=150&src=pr&token=eSahVY7kw1&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison)](https://codecov.io/gh/simonw/datasette/pull/1430?src=pr&el=tree&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison)\n\n```diff\n@@ Coverage Diff @@\n## main #1430 +/- ##\n=======================================\n Coverage 91.71% 91.71% \n=======================================\n Files 34 34 \n Lines 4417 4418 +1 \n=======================================\n+ Hits 4051 4052 +1 \n Misses 366 366 \n```\n\n\n| [Impacted Files](https://codecov.io/gh/simonw/datasette/pull/1430?src=pr&el=tree&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison) | Coverage \u0394 | |\n|---|---|---|\n| [datasette/views/table.py](https://codecov.io/gh/simonw/datasette/pull/1430/diff?src=pr&el=tree&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison#diff-ZGF0YXNldHRlL3ZpZXdzL3RhYmxlLnB5) | `96.00% <100.00%> (+<0.01%)` | :arrow_up: |\n\n------\n\n[Continue to review full report at Codecov](https://codecov.io/gh/simonw/datasette/pull/1430?src=pr&el=continue&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison).\n> **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta?utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison)\n> `\u0394 = absolute (impact)`, `\u00f8 = not affected`, `? 
= missing data`\n> Powered by [Codecov](https://codecov.io/gh/simonw/datasette/pull/1430?src=pr&el=footer&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison). Last update [b1fed48...9419947](https://codecov.io/gh/simonw/datasette/pull/1430?src=pr&el=lastupdated&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments?utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison).\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 969758038, "label": "Column metadata"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/942#issuecomment-898037650", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/942", "id": 898037650, "node_id": "IC_kwDOBm6k_c41hveS", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-08-12T23:23:54Z", "updated_at": "2021-08-12T23:23:54Z", "author_association": "OWNER", "body": "I like this enough that I'm going to ship it as an alpha and try it out on a couple of live projects.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 681334912, "label": "Support column descriptions in metadata.json"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/942#issuecomment-898037456", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/942", "id": 898037456, "node_id": "IC_kwDOBm6k_c41hvbQ", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-08-12T23:23:34Z", "updated_at": "2021-08-12T23:23:34Z", "author_association": "OWNER", "body": "Prototype with a `
`:\r\n\r\n\"fixtures__sortable__201_rows\"\r\n\r\n```diff\r\ndiff --git a/datasette/static/app.css b/datasette/static/app.css\r\nindex c6be1e9..bf068fd 100644\r\n--- a/datasette/static/app.css\r\n+++ b/datasette/static/app.css\r\n@@ -836,6 +841,16 @@ svg.dropdown-menu-icon {\r\n background-repeat: no-repeat;\r\n }\r\n \r\n+dl.column-descriptions dt {\r\n+ font-weight: bold;\r\n+}\r\n+dl.column-descriptions dd {\r\n+ padding-left: 1.5em;\r\n+ white-space: pre-wrap;\r\n+ line-height: 1.1em;\r\n+ color: #666;\r\n+}\r\n+\r\n .anim-scale-in {\r\n animation-name: scale-in;\r\n animation-duration: 0.15s;\r\ndiff --git a/datasette/templates/table.html b/datasette/templates/table.html\r\nindex 211352b..466e8a4 100644\r\n--- a/datasette/templates/table.html\r\n+++ b/datasette/templates/table.html\r\n@@ -51,6 +51,14 @@\r\n \r\n {% block description_source_license %}{% include \"_description_source_license.html\" %}{% endblock %}\r\n \r\n+{% if metadata.columns %}\r\n+
<dl class=\"column-descriptions\">\r\n+    {% for column_name, column_description in metadata.columns.items() %}\r\n+      <dt>{{ column_name }}</dt> <dd>{{ column_description }}</dd>\r\n+    {% endfor %}\r\n+  </dl>\r\n+{% endif %}\r\n+\r\n {% if filtered_table_rows_count or human_description_en %}\r\n

{% if filtered_table_rows_count or filtered_table_rows_count == 0 %}{{ \"{:,}\".format(filtered_table_rows_count) }} row{% if filtered_table_rows_count == 1 %}{% else %}s{% endif %}{% endif %}\r\n {% if human_description_en %}{{ human_description_en }}{% endif %}\r\n ```\r\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 681334912, "label": "Support column descriptions in metadata.json"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/942#issuecomment-898032118", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/942", "id": 898032118, "node_id": "IC_kwDOBm6k_c41huH2", "user": {"value": 596279, "label": "zaneselvans"}, "created_at": "2021-08-12T23:12:00Z", "updated_at": "2021-08-12T23:12:00Z", "author_association": "NONE", "body": "This looks awesome. We'll definitely make extensive use of this feature!\n\nOn Thu, Aug 12, 2021 at 5:52 PM Simon Willison ***@***.***>\nwrote:\n\n> I like this. Need to solve for mobile though where the cog menu isn't\n> visible - I think I'll do that with a definition list at the top of the\n> page.\n>\n> \u2014\n> You are receiving this because you are subscribed to this thread.\n> Reply to this email directly, view it on GitHub\n> ,\n> or unsubscribe\n> \n> .\n> Triage notifications on the go with GitHub Mobile for iOS\n> \n> or Android\n> \n> .\n>\n\n\n-- \nZane A. Selvans, PhD\nChief Data Wrangler\nCatalyst Cooperative\nhttps://catalyst.coop\n***@***.***\nSignal/WhatsApp/SMS: +1 720 443 1363\nTwitter: @ZaneSelvans \nPGP : 0x64F7B56F3A127B04\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 681334912, "label": "Support column descriptions in metadata.json"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/942#issuecomment-898022235", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/942", "id": 898022235, "node_id": "IC_kwDOBm6k_c41hrtb", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-08-12T22:52:23Z", "updated_at": "2021-08-12T22:52:23Z", "author_association": "OWNER", "body": "I like this. 
Need to solve for mobile though where the cog menu isn't visible - I think I'll do that with a definition list at the top of the page.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 681334912, "label": "Support column descriptions in metadata.json"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/942#issuecomment-898021895", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/942", "id": 898021895, "node_id": "IC_kwDOBm6k_c41hroH", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-08-12T22:51:36Z", "updated_at": "2021-08-12T22:51:36Z", "author_association": "OWNER", "body": "Prototype:\r\n\r\n\"fixtures__sortable__201_rows\"\r\n\r\n```diff\r\ndiff --git a/datasette/static/app.css b/datasette/static/app.css\r\nindex c6be1e9..5ca64cb 100644\r\n--- a/datasette/static/app.css\r\n+++ b/datasette/static/app.css\r\n@@ -784,9 +784,14 @@ svg.dropdown-menu-icon {\r\n font-size: 0.7em;\r\n color: #666;\r\n margin: 0;\r\n- padding: 0;\r\n padding: 4px 8px 4px 8px;\r\n }\r\n+.dropdown-menu .dropdown-column-description {\r\n+ margin: 0;\r\n+ color: #666;\r\n+ padding: 4px 8px 4px 8px;\r\n+ max-width: 20em;\r\n+}\r\n .dropdown-menu li {\r\n border-bottom: 1px solid #ccc;\r\n }\r\ndiff --git a/datasette/static/table.js b/datasette/static/table.js\r\nindex 991346d..a903112 100644\r\n--- a/datasette/static/table.js\r\n+++ b/datasette/static/table.js\r\n@@ -9,6 +9,7 @@ var DROPDOWN_HTML = `
\r\n
  • Show not-blank rows
  • \r\n \r\n

    \r\n+

    \r\n
    `;\r\n \r\n var DROPDOWN_ICON_SVG = `\r\n@@ -166,6 +167,14 @@ var DROPDOWN_ICON_SVG = `\r\n
    \r\n {% for column in display_columns %}\r\n- \r\n+ \r\n {% if not column.sortable %}\r\n {{ column.name }}\r\n {% else %}\r\ndiff --git a/datasette/views/table.py b/datasette/views/table.py\r\nindex 456d806..486a613 100644\r\n--- a/datasette/views/table.py\r\n+++ b/datasette/views/table.py\r\n@@ -125,6 +125,7 @@ class RowTableShared(DataView):\r\n \"\"\"Returns columns, rows for specified table - including fancy foreign key treatment\"\"\"\r\n db = self.ds.databases[database]\r\n table_metadata = self.ds.table_metadata(database, table)\r\n+ column_descriptions = table_metadata.get(\"columns\") or {}\r\n column_details = {col.name: col for col in await db.table_column_details(table)}\r\n sortable_columns = await self.sortable_columns_for_table(database, table, True)\r\n pks = await db.primary_keys(table)\r\n@@ -147,6 +148,7 @@ class RowTableShared(DataView):\r\n \"is_pk\": r[0] in pks_for_display,\r\n \"type\": type_,\r\n \"notnull\": notnull,\r\n+ \"description\": column_descriptions.get(r[0]),\r\n }\r\n )\r\n\r\n```", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 681334912, "label": "Support column descriptions in metadata.json"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/942#issuecomment-897996296", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/942", "id": 897996296, "node_id": "IC_kwDOBm6k_c41hlYI", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-08-12T22:01:36Z", "updated_at": "2021-08-12T22:01:36Z", "author_association": "OWNER", "body": "I'm going with `\"columns\": {\"name-of-column\": \"description-of-column\"}`.\r\n\r\nIf I decide to make `\"col\"` and `\"nocol\"` available in metadata I'll use those as the keys in the metadata, for consistency with the existing query string parameters.\r\n\r\nI'm OK with having both `\"columns\": ...` and `\"col\": ...` keys in the metadata, even though they could be a tiny bit confusing without the documentation.", "reactions": "{\"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 681334912, "label": "Support column descriptions in metadata.json"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1429#issuecomment-897960049", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1429", "id": 897960049, "node_id": "IC_kwDOBm6k_c41hchx", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-08-12T20:53:04Z", "updated_at": "2021-08-12T20:53:04Z", "author_association": "OWNER", "body": "Maybe something like this:\r\n\r\n> [Next page](#) - 100 per page ([show 1,000 per page](#))", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 969548935, "label": "UI for setting `?_size=max` on table page"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/186#issuecomment-897600677", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/186", "id": 897600677, "node_id": "IC_kwDOCGYnMM41gEyl", "user": {"value": 9308268, "label": "rayvoelker"}, "created_at": "2021-08-12T12:32:14Z", "updated_at": "2021-08-12T12:32:14Z", "author_association": "NONE", "body": "Actually, I forgot to include the `bib_pub_year` in the extract ... 
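An aside on the `.extract()` and `NULL` behaviour discussed in this thread: a minimal `sqlite3` sketch (plain SQL semantics, nothing specific to `sqlite-utils`) of why an equality-based lookup can never match rows on columns that hold `NULL`. This would also explain why the empty-string retry described next succeeds, while the original `NULL` version quoted further down leaves `bib_info_id` as `NULL`.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("create table bib_info (id integer primary key, bib_creator text)")
conn.execute("insert into bib_info (bib_creator) values (null)")

# An = comparison against NULL is never true, so this lookup finds no row:
print(conn.execute(
    "select id from bib_info where bib_creator = ?", [None]
).fetchone())  # None

# IS treats two NULLs as a match:
print(conn.execute(
    "select id from bib_info where bib_creator is ?", [None]
).fetchone())  # (1,)
```

Empty strings, by contrast, compare equal with `=`, so a lookup keyed on them can find the freshly inserted lookup row and fill in the foreign key.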
\r\n\r\nBut also, I tried again with empty string values instead of `NULL` values and it seems to place the foreign key properly / correctly... \r\n\r\n```python3\r\nsql = \"\"\"\\\r\nINSERT INTO \"circulation_info\" (\"item_id\", \"bib_title\", \"bib_creator\", \"bib_format\", \"bib_pub_year\", \"checkout_date\")\r\nVALUES\r\n(1, \"title one\", \"creator one\", \"Book\", 2018, \"2021-08-12 00:01\"),\r\n(2, \"title two\", \"creator one\", \"Book\", 2019, \"2021-08-12 00:02\"),\r\n(3, \"title three\", \"\", \"DVD\", 2020, \"2021-08-12 00:03\"),\r\n(4, \"title four\", \"\", \"DVD\", \"\", \"2021-08-12 00:04\"),\r\n(5, \"title five\", \"\", \"DVD\", \"\", \"2021-08-12 00:05\")\r\n\"\"\"\r\n\r\nwith sqlite3.connect('test_bib_2.db') as con:\r\n con.execute(sql)\r\n```\r\n\r\n```python3\r\ndb[\"circulation_info\"].extract(\r\n [\r\n \"bib_title\",\r\n \"bib_creator\",\r\n \"bib_format\",\r\n \"bib_pub_year\"\r\n ],\r\n table=\"bib_info\", \r\n fk_column=\"bib_info_id\"\r\n)\r\n```\r\n\r\n```\r\n{'id': 1, 'item_id': 1, 'bib_info_id': 1, 'bib_pub_year': 2018, 'checkout_date': '2021-08-12 00:01'}\r\n{'id': 2, 'item_id': 2, 'bib_info_id': 2, 'bib_pub_year': 2019, 'checkout_date': '2021-08-12 00:02'}\r\n{'id': 3, 'item_id': 3, 'bib_info_id': 3, 'bib_pub_year': 2020, 'checkout_date': '2021-08-12 00:03'}\r\n{'id': 4, 'item_id': 4, 'bib_info_id': 4, 'bib_pub_year': '', 'checkout_date': '2021-08-12 00:04'}\r\n{'id': 5, 'item_id': 5, 'bib_info_id': 5, 'bib_pub_year': '', 'checkout_date': '2021-08-12 00:05'}\r\n\r\n---\r\n\r\n{'id': 1, 'bib_title': 'title one', 'bib_creator': 'creator one', 'bib_format': 'Book'}\r\n{'id': 2, 'bib_title': 'title two', 'bib_creator': 'creator one', 'bib_format': 'Book'}\r\n{'id': 3, 'bib_title': 'title three', 'bib_creator': '', 'bib_format': 'DVD'}\r\n{'id': 4, 'bib_title': 'title four', 'bib_creator': '', 'bib_format': 'DVD'}\r\n{'id': 5, 'bib_title': 'title five', 'bib_creator': '', 'bib_format': 'DVD'}\r\n```\r\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 722816436, "label": ".extract() shouldn't extract null values"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/186#issuecomment-897588624", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/186", "id": 897588624, "node_id": "IC_kwDOCGYnMM41gB2Q", "user": {"value": 9308268, "label": "rayvoelker"}, "created_at": "2021-08-12T12:13:25Z", "updated_at": "2021-08-12T12:13:25Z", "author_association": "NONE", "body": "I think I ran into an issue that's perhaps related with `extract()`\r\n\r\nI have a case where I want to create a lookup table for all the related title data where there are possibly multiple null values in the related columns ....\r\n```python3\r\nsql = \"\"\"\\\r\nINSERT INTO \"circulation_info\" (\"item_id\", \"bib_title\", \"bib_creator\", \"bib_format\", \"bib_pub_year\", \"checkout_date\")\r\nVALUES\r\n(1, \"title one\", \"creator one\", \"Book\", 2018, \"2021-08-12 00:01\"),\r\n(2, \"title two\", \"creator one\", \"Book\", 2019, \"2021-08-12 00:02\"),\r\n(3, \"title three\", NULL, \"DVD\", 2020, \"2021-08-12 00:03\"),\r\n(4, \"title four\", NULL, \"DVD\", NULL, \"2021-08-12 00:04\"),\r\n(5, \"title five\", NULL, \"DVD\", NULL, \"2021-08-12 00:05\")\r\n\"\"\"\r\n\r\nwith sqlite3.connect('test_bib.db') as con:\r\n con.execute(sql)\r\n```\r\n\r\nwhen I run the `extract()` method ... 
\r\n\r\n```python3\r\ndb[\"circulation_info\"].extract(\r\n [\r\n \"bib_title\",\r\n \"bib_creator\",\r\n \"bib_format\" \r\n ],\r\n table=\"bib_info\", \r\n fk_column=\"bib_info_id\"\r\n)\r\n\r\ndb = sqlite_utils.Database(\"test_bib.db\")\r\n\r\nfor row in db[\"circulation_info\"].rows:\r\n print(row)\r\n\r\nprint(\"\\n---\\n\")\r\n\r\nfor row in db[\"bib_info\"].rows:\r\n print(row)\r\n```\r\n\r\nresults in this .. \r\n```\r\n{'id': 1, 'item_id': 1, 'bib_info_id': 1, 'bib_pub_year': 2018, 'checkout_date': '2021-08-12 00:01'}\r\n{'id': 2, 'item_id': 2, 'bib_info_id': 2, 'bib_pub_year': 2019, 'checkout_date': '2021-08-12 00:02'}\r\n{'id': 3, 'item_id': 3, 'bib_info_id': None, 'bib_pub_year': 2020, 'checkout_date': '2021-08-12 00:03'}\r\n{'id': 4, 'item_id': 4, 'bib_info_id': None, 'bib_pub_year': None, 'checkout_date': '2021-08-12 00:04'}\r\n{'id': 5, 'item_id': 5, 'bib_info_id': None, 'bib_pub_year': None, 'checkout_date': '2021-08-12 00:05'}\r\n\r\n---\r\n\r\n{'id': 1, 'bib_title': 'title one', 'bib_creator': 'creator one', 'bib_format': 'Book'}\r\n{'id': 2, 'bib_title': 'title two', 'bib_creator': 'creator one', 'bib_format': 'Book'}\r\n{'id': 3, 'bib_title': 'title three', 'bib_creator': None, 'bib_format': 'DVD'}\r\n{'id': 4, 'bib_title': 'title four', 'bib_creator': None, 'bib_format': 'DVD'}\r\n{'id': 5, 'bib_title': 'title five', 'bib_creator': None, 'bib_format': 'DVD'}\r\n```\r\n\r\nSeems like it's correctly generating the row data for those lookups, but it's not correctly updating the foreign key back to the primary table? Looks like it just results in a `NULL` value in that original table.\r\n\r\nAny ideas on why? Thanks again!", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 722816436, "label": ".extract() shouldn't extract null values"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/311#issuecomment-896381184", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/311", "id": 896381184, "node_id": "IC_kwDOCGYnMM41bbEA", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-08-10T23:33:33Z", "updated_at": "2021-08-10T23:33:33Z", "author_association": "OWNER", "body": "Now live at https://sqlite-utils.datasette.io/en/latest/reference.html\r\n\r\nTIL from what I learned today here: https://til.simonwillison.net/sphinx/sphinx-autodoc", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 965102534, "label": "Add reference documentation generated from docstrings"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/google-takeout-to-sqlite/pull/8#issuecomment-896378525", "issue_url": "https://api.github.com/repos/dogsheep/google-takeout-to-sqlite/issues/8", "id": 896378525, "node_id": "IC_kwDODFE5qs41baad", "user": {"value": 28565, "label": "maxhawkins"}, "created_at": "2021-08-10T23:28:45Z", "updated_at": "2021-08-10T23:28:45Z", "author_association": "NONE", "body": "I added parsing of text/html emails using BeautifulSoup.\r\n\r\nAround half of the emails in my archive don't include a text/plain payload so adding html parsing makes a good chunk of them searchable.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 954546309, "label": "Add 
Gmail takeout mbox import (v2)"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/pull/312#issuecomment-896162082", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/312", "id": 896162082, "node_id": "IC_kwDOCGYnMM41alki", "user": {"value": 22429695, "label": "codecov[bot]"}, "created_at": "2021-08-10T17:10:39Z", "updated_at": "2021-08-10T23:07:35Z", "author_association": "NONE", "body": "# [Codecov](https://codecov.io/gh/simonw/sqlite-utils/pull/312?src=pr&el=h1&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison) Report\n> Merging [#312](https://codecov.io/gh/simonw/sqlite-utils/pull/312?src=pr&el=desc&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison) (43bc064) into [main](https://codecov.io/gh/simonw/sqlite-utils/commit/ee469e3122d6f5973ec2584c1580d930daca2e7c?el=desc&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison) (ee469e3) will **decrease** coverage by `0.02%`.\n> The diff coverage is `96.84%`.\n\n[![Impacted file tree graph](https://codecov.io/gh/simonw/sqlite-utils/pull/312/graphs/tree.svg?width=650&height=150&src=pr&token=O0X3703L9P&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison)](https://codecov.io/gh/simonw/sqlite-utils/pull/312?src=pr&el=tree&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison)\n\n```diff\n@@ Coverage Diff @@\n## main #312 +/- ##\n==========================================\n- Coverage 96.30% 96.28% -0.03% \n==========================================\n Files 5 5 \n Lines 2168 2179 +11 \n==========================================\n+ Hits 2088 2098 +10 \n- Misses 80 81 +1 \n```\n\n\n| [Impacted Files](https://codecov.io/gh/simonw/sqlite-utils/pull/312?src=pr&el=tree&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison) | Coverage \u0394 | |\n|---|---|---|\n| [sqlite\\_utils/db.py](https://codecov.io/gh/simonw/sqlite-utils/pull/312/diff?src=pr&el=tree&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison#diff-c3FsaXRlX3V0aWxzL2RiLnB5) | `97.91% <96.84%> (-0.08%)` | :arrow_down: |\n\n------\n\n[Continue to review full report at Codecov](https://codecov.io/gh/simonw/sqlite-utils/pull/312?src=pr&el=continue&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison).\n> **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta?utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison)\n> `\u0394 = absolute (impact)`, `\u00f8 = not affected`, `? = missing data`\n> Powered by [Codecov](https://codecov.io/gh/simonw/sqlite-utils/pull/312?src=pr&el=footer&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison). Last update [ee469e3...43bc064](https://codecov.io/gh/simonw/sqlite-utils/pull/312?src=pr&el=lastupdated&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison). 
Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments?utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison).\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 965143346, "label": "Add reference page to documentation using Sphinx autodoc"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/314#issuecomment-896369551", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/314", "id": 896369551, "node_id": "IC_kwDOCGYnMM41bYOP", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-08-10T23:06:41Z", "updated_at": "2021-08-10T23:06:41Z", "author_association": "OWNER", "body": "I took a big bite out of this when I annotated the ``.insert()`` method - but there are a bunch of other places that still need doing: https://github.com/simonw/sqlite-utils/blob/43bc06481783c3cfcee70c0cb541a686e8894adb/sqlite_utils/db.py#L2382-L2397\r\n\r\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 965210966, "label": "Type signatures for `.create_table()` and `.create_table_sql()` and `.create()` and `Table.__init__`"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/314#issuecomment-896344833", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/314", "id": 896344833, "node_id": "IC_kwDOCGYnMM41bSMB", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-08-10T22:07:34Z", "updated_at": "2021-08-10T22:07:34Z", "author_association": "OWNER", "body": "Also the `.insert()` family of methods - they look pretty ugly in Sphinx right now:\r\n\r\n
    \"API_Reference_\u2014_sqlite-utils_3_15-16-gf51b712_documentation\"\r\n\r\nI should probably define reusable types for things like `pk=`, which have complex type signatures (a string or a list/tuple of strings) and show up in multiple places.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 965210966, "label": "Type signatures for `.create_table()` and `.create_table_sql()` and `.create()` and `Table.__init__`"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/315#issuecomment-896339144", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/315", "id": 896339144, "node_id": "IC_kwDOCGYnMM41bQzI", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-08-10T21:55:41Z", "updated_at": "2021-08-10T21:55:41Z", "author_association": "OWNER", "body": "Or should we raise an error if you attempt to call `.delete_where()` on a table that doesn't exist?", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 965440017, "label": "`.delete_where()` returns `[]` when it should return self"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/pull/312#issuecomment-896284722", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/312", "id": 896284722, "node_id": "IC_kwDOCGYnMM41bDgy", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-08-10T20:08:03Z", "updated_at": "2021-08-10T20:08:21Z", "author_association": "OWNER", "body": "Spotted a rogue backtick:\r\n\r\n![A0147E27-7506-49B0-BEFB-20D99BBFEBAD](https://user-images.githubusercontent.com/9599/128927930-b3333dee-a385-409b-a945-f108e6ea40df.jpeg)\r\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 965143346, "label": "Add reference page to documentation using Sphinx autodoc"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/pull/312#issuecomment-896200682", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/312", "id": 896200682, "node_id": "IC_kwDOCGYnMM41au_q", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-08-10T18:03:40Z", "updated_at": "2021-08-10T18:03:40Z", "author_association": "OWNER", "body": "Adding type signatures to `create_table()` and `.create_table_sql()` is a bit too involved, I'll do that in a separate issue.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 965143346, "label": "Add reference page to documentation using Sphinx autodoc"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/pull/312#issuecomment-896186025", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/312", "id": 896186025, "node_id": "IC_kwDOCGYnMM41arap", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-08-10T17:42:51Z", "updated_at": "2021-08-10T17:42:51Z", "author_association": "OWNER", "body": "That worked! 
https://sqlite-utils.datasette.io/en/autodoc/reference.html", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 965143346, "label": "Add reference page to documentation using Sphinx autodoc"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/pull/312#issuecomment-896182934", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/312", "id": 896182934, "node_id": "IC_kwDOCGYnMM41aqqW", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-08-10T17:38:44Z", "updated_at": "2021-08-10T17:38:44Z", "author_association": "OWNER", "body": "From https://docs.readthedocs.io/en/stable/config-file/v2.html#packages it looks like I can tell Read The Docs to run `pip install -e .` using a `.readthedocs.yaml` configuration:\r\n\r\n```yaml\r\nversion: 2\r\n\r\nsphinx:\r\n configuration: docs/conf.py\r\n\r\npython:\r\n version: \"3.9\"\r\n install:\r\n - method: pip\r\n path: .\r\n extra_requirements:\r\n - docs\r\n```", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 965143346, "label": "Add reference page to documentation using Sphinx autodoc"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/pull/312#issuecomment-896180956", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/312", "id": 896180956, "node_id": "IC_kwDOCGYnMM41aqLc", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-08-10T17:35:51Z", "updated_at": "2021-08-10T17:35:51Z", "author_association": "OWNER", "body": "Reading the rest of https://sphinx-rtd-tutorial.readthedocs.io/en/latest/sphinx-config.html#autodoc-configuration it suggests using a `requirements.txt` file to install dependencies - but I use `setup.py` for that so I need to figure out a different pattern here.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 965143346, "label": "Add reference page to documentation using Sphinx autodoc"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/pull/312#issuecomment-896175438", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/312", "id": 896175438, "node_id": "IC_kwDOCGYnMM41ao1O", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-08-10T17:28:19Z", "updated_at": "2021-08-10T17:28:19Z", "author_association": "OWNER", "body": "https://sphinx-rtd-tutorial.readthedocs.io/en/latest/sphinx-config.html#autodoc-configuration says do something like this at the top of `conf.py`:\r\n\r\n```python\r\nimport os\r\nimport sys\r\nsys.path.insert(0, os.path.abspath('../../simpleble/'))\r\n```", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 965143346, "label": "Add reference page to documentation using Sphinx autodoc"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/pull/312#issuecomment-896174456", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/312", "id": 896174456, "node_id": "IC_kwDOCGYnMM41aol4", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-08-10T17:27:01Z", "updated_at": "2021-08-10T17:27:01Z", 
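For comparison, the `conf.py` route suggested by that tutorial snippet would presumably look something like this for a project with a top-level `docs/` directory; the relative path and the extension list here are assumptions rather than anything copied from sqlite-utils, which ended up using the `pip install -e .` approach via `.readthedocs.yaml` instead.

```python
# docs/conf.py (sketch): make the package importable and turn on autodoc.
# The ".." assumes docs/ sits next to the sqlite_utils/ package directory;
# adjust the path if the layout differs.
import os
import sys

sys.path.insert(0, os.path.abspath(".."))

extensions = [
    "sphinx.ext.autodoc",
]
```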
"author_association": "OWNER", "body": "Docs are now building at https://sqlite-utils.datasette.io/en/autodoc/reference.html\r\n\r\nBut there's a problem! The page is semi-blank:\r\n\r\n\"Reference_\u2014_sqlite-utils_3_15-6-gc11ff89_documentation\"\r\n\r\nI need to teach Read The Docs how to ensure `sqlite_utils` is available for introspection.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 965143346, "label": "Add reference page to documentation using Sphinx autodoc"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/pull/312#issuecomment-896156971", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/312", "id": 896156971, "node_id": "IC_kwDOCGYnMM41akUr", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-08-10T17:04:22Z", "updated_at": "2021-08-10T17:05:59Z", "author_association": "OWNER", "body": "I'm going to get Read The Docs to build the docs for this branch too - on https://readthedocs.org/projects/sqlite-utils/versions/ I am clicking this button:\r\n\r\n\"Versions___Read_the_Docs\"\r\n\r\nI then set it to \"active\" (so pushes to the branch will build it) and \"hidden\" (so it wouldn't show up in search or in the navigation menu). https://docs.readthedocs.io/en/stable/versions.html#version-states\r\n\r\n\"autodoc_-_sqlite-utils___Read_the_Docs\"\r\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 965143346, "label": "Add reference page to documentation using Sphinx autodoc"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/pull/312#issuecomment-896154028", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/312", "id": 896154028, "node_id": "IC_kwDOCGYnMM41ajms", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-08-10T17:01:06Z", "updated_at": "2021-08-10T17:01:06Z", "author_association": "OWNER", "body": "On Python 3.6:\r\n\r\n```\r\nsqlite_utils/db.py:366: in Database\r\n def tables(self) -> List[Table]:\r\nE NameError: name 'Table' is not defined\r\n```\r\nPython 3.7 can fix this with `from __future__ import annotations` but since we still support 3.6 I'll have to use a string instead.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 965143346, "label": "Add reference page to documentation using Sphinx autodoc"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/311#issuecomment-896152812", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/311", "id": 896152812, "node_id": "IC_kwDOCGYnMM41ajTs", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-08-10T16:59:34Z", "updated_at": "2021-08-10T16:59:34Z", "author_association": "OWNER", "body": "Work will continue in PR #312.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 965102534, "label": "Add reference documentation generated from docstrings"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/311#issuecomment-896149590", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/311", 
"id": 896149590, "node_id": "IC_kwDOCGYnMM41aihW", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-08-10T16:55:36Z", "updated_at": "2021-08-10T16:55:36Z", "author_association": "OWNER", "body": "I'm going to use this as an excuse to add a bunch more type signatures too, refs #266.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 965102534, "label": "Add reference documentation generated from docstrings"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/311#issuecomment-896131902", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/311", "id": 896131902, "node_id": "IC_kwDOCGYnMM41aeM-", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-08-10T16:31:51Z", "updated_at": "2021-08-10T16:31:51Z", "author_association": "OWNER", "body": "`make livehtml` wasn't picking up changes I made to the docstrings `.py` files.\r\n\r\nFix was to change it to this:\r\n```\r\nsphinx-autobuild -a -b html \"$(SOURCEDIR)\" \"$(BUILDDIR)\" $(SPHINXOPTS) $(0) --watch ../sqlite_utils\r\n```\r\nSee https://github.com/executablebooks/sphinx-autobuild#relevant-sphinx-bugs - though that suggested `-a` but didn't suggest `--watch`, which is a tip I got from https://github.com/executablebooks/sphinx-autobuild#working-on-a-sphinx-html-theme\r\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 965102534, "label": "Add reference documentation generated from docstrings"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/309#issuecomment-895622908", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/309", "id": 895622908, "node_id": "IC_kwDOCGYnMM41Yh78", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-08-09T23:40:29Z", "updated_at": "2021-08-09T23:40:29Z", "author_association": "OWNER", "body": "TIL about how the stack inspection works: https://til.simonwillison.net/python/find-local-variables-in-exception-traceback", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 963897111, "label": "sqlite-utils insert errors should show SQL and parameters, if possible"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/309#issuecomment-895581038", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/309", "id": 895581038, "node_id": "IC_kwDOCGYnMM41YXtu", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-08-09T22:03:54Z", "updated_at": "2021-08-09T23:39:53Z", "author_association": "OWNER", "body": "Steps to reproduce:\r\n\r\n echo '{\"v\": 34223049823094832094802398430298048240}' | sqlite-utils insert /tmp/blah.db row -", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 963897111, "label": "sqlite-utils insert errors should show SQL and parameters, if possible"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/309#issuecomment-895592507", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/309", "id": 895592507, "node_id": "IC_kwDOCGYnMM41Yag7", "user": {"value": 
9599, "label": "simonw"}, "created_at": "2021-08-09T22:26:28Z", "updated_at": "2021-08-09T22:33:48Z", "author_association": "OWNER", "body": "Demo:\r\n```\r\n$ echo '{\"v\": 34223049823094832094802398430298048240}' | sqlite-utils insert /tmp/blah.db row - \r\nError: Python int too large to convert to SQLite INTEGER\r\n\r\nsql = INSERT INTO [row] ([v]) VALUES (?);\r\nparameters = [34223049823094832094802398430298048240]\r\n```", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 963897111, "label": "sqlite-utils insert errors should show SQL and parameters, if possible"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/309#issuecomment-895587441", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/309", "id": 895587441, "node_id": "IC_kwDOCGYnMM41YZRx", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-08-09T22:15:45Z", "updated_at": "2021-08-09T22:15:45Z", "author_association": "OWNER", "body": "```\r\nOverflowError: Python int too large to convert to SQLite INTEGER\r\n>>> import sys\r\n>>> def find_variables(tb, vars):\r\n to_find = list(vars)\r\n found = {}\r\n for var in to_find:\r\n if var in tb.tb_frame.f_locals:\r\n vars.remove(var)\r\n found[var] = tb.tb_frame.f_locals[var]\r\n if vars and tb.tb_next:\r\n found.update(find_variables(tb.tb_next, vars))\r\n return found\r\n... \r\n>>> find_variables(sys.last_traceback, [\"sql\", \"params\"])\r\n{'params': [34223049823094832094802398430298048240], 'sql': 'INSERT INTO [row] ([v]) VALUES (?);'}\r\n```", "reactions": "{\"total_count\": 1, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 1, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 963897111, "label": "sqlite-utils insert errors should show SQL and parameters, if possible"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/309#issuecomment-895587282", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/309", "id": 895587282, "node_id": "IC_kwDOCGYnMM41YZPS", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-08-09T22:15:25Z", "updated_at": "2021-08-09T22:15:25Z", "author_association": "OWNER", "body": "I'm going to use a bit of a dirty trick for this one: I'm going to recursively inspect the stack on an error and try to find the `sql` and `params` variables.\r\n\r\nThat way I can handle this all at the CLI layer without changing the exceptions that are being raised by the Python library.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 963897111, "label": "sqlite-utils insert errors should show SQL and parameters, if possible"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/309#issuecomment-895577012", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/309", "id": 895577012, "node_id": "IC_kwDOCGYnMM41YWu0", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-08-09T21:55:52Z", "updated_at": "2021-08-09T21:59:03Z", "author_association": "OWNER", "body": "Yeah this error message could certainly be more helpful.\r\n\r\nI thought `OverflowError` might be one of the SQLite exceptions: https://docs.python.org/3/library/sqlite3.html#exceptions - but it turns out it's actually reusing the Python 
built-in `OverflowError` class:\r\n```python\r\nimport sqlite3\r\ndb = sqlite3.connect(\":memory:\")\r\ncaught = []\r\ntry:\r\n db.execute(\"create table foo (number integer)\")\r\n db.execute(\"insert into foo (number) values (?)\", [34223049823094832094802398430298048240])\r\nexcept Exception as e:\r\n print(e)\r\n caught.append(e)\r\nisinstance(caught[0], OverflowError)\r\n```\r\nHere's where that happens in the Python `sqlite3` module code: https://github.com/python/cpython/blob/058fb35b57ca8c5063d16ec818e668b3babfea65/Modules/_sqlite/util.c#L123-L124", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 963897111, "label": "sqlite-utils insert errors should show SQL and parameters, if possible"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/310#issuecomment-895572309", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/310", "id": 895572309, "node_id": "IC_kwDOCGYnMM41YVlV", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-08-09T21:46:15Z", "updated_at": "2021-08-09T21:46:15Z", "author_association": "OWNER", "body": "Documentation: https://sqlite-utils.datasette.io/en/latest/cli.html#flattening-nested-json-objects", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 964400482, "label": "`sqlite-utils insert --flatten` option to flatten nested JSON"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/310#issuecomment-895571420", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/310", "id": 895571420, "node_id": "IC_kwDOCGYnMM41YVXc", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-08-09T21:44:38Z", "updated_at": "2021-08-09T21:44:38Z", "author_association": "OWNER", "body": "When I ship this I should update the TILs at https://til.simonwillison.net/cloudrun/tailing-cloud-run-request-logs and https://til.simonwillison.net/jq/flatten-nested-json-objects-jq to reference it.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 964400482, "label": "`sqlite-utils insert --flatten` option to flatten nested JSON"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1426#issuecomment-895522818", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1426", "id": 895522818, "node_id": "IC_kwDOBm6k_c41YJgC", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-08-09T20:34:10Z", "updated_at": "2021-08-09T20:34:10Z", "author_association": "OWNER", "body": "At the very least Datasette should serve a blank `/robots.txt` by default - I'm seeing a ton of 404s for it in the logs.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 964322136, "label": "Manage /robots.txt in Datasette core, block robots by default"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1426#issuecomment-895510773", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1426", "id": 895510773, "node_id": "IC_kwDOBm6k_c41YGj1", "user": {"value": 9599, "label": "simonw"}, "created_at": 
"2021-08-09T20:14:50Z", "updated_at": "2021-08-09T20:19:22Z", "author_association": "OWNER", "body": "https://twitter.com/mal/status/1424825895139876870\r\n\r\n> True pinging google should be part of the build process on a static site :)\r\n\r\nThat's another aspect of this: if you DO want your site crawled, teaching the `datasette publish` command how to ping Google when a deploy has gone out could be a nice improvement.\r\n\r\nAnnoyingly it looks like you need to configure an auth token of some sort in order to use their API though, which is likely too much hassle to be worth building into Datasette itself: https://developers.google.com/search/apis/indexing-api/v3/using-api\r\n\r\n```\r\ncurl -X POST https://indexing.googleapis.com/v3/urlNotifications:publish -d '{\r\n \"url\": \"https://careers.google.com/jobs/google/technical-writer\",\r\n \"type\": \"URL_UPDATED\"\r\n}' -H \"Content-Type: application/json\"\r\n\r\n{\r\n \"error\": {\r\n \"code\": 401,\r\n \"message\": \"Request is missing required authentication credential. Expected OAuth 2 access token, login cookie or other valid authentication credential. See https://developers.google.com/identity/sign-in/web/devconsole-project.\",\r\n \"status\": \"UNAUTHENTICATED\"\r\n }\r\n}\r\n```", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 964322136, "label": "Manage /robots.txt in Datasette core, block robots by default"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1426#issuecomment-895509536", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1426", "id": 895509536, "node_id": "IC_kwDOBm6k_c41YGQg", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-08-09T20:12:57Z", "updated_at": "2021-08-09T20:12:57Z", "author_association": "OWNER", "body": "I could try out the `X-Robots` HTTP header too: https://developers.google.com/search/docs/advanced/robots/robots_meta_tag#xrobotstag", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 964322136, "label": "Manage /robots.txt in Datasette core, block robots by default"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1426#issuecomment-895500565", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1426", "id": 895500565, "node_id": "IC_kwDOBm6k_c41YEEV", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-08-09T20:00:04Z", "updated_at": "2021-08-09T20:00:04Z", "author_association": "OWNER", "body": "A few options for how this would work:\r\n\r\n- `datasette ... --robots allow`\r\n- `datasette ... --setting robots allow`\r\n\r\nOptions could be:\r\n\r\n- `allow` - allow all crawling\r\n- `deny` - deny all crawling\r\n- `limited` - allow access to the homepage and the index pages for each database and each table, but disallow crawling any further than that\r\n\r\nThe \"limited\" mode is particularly interesting. Could even make it the default, but I think that may be a bit too confusing. 
Idea would be to get the key pages indexed but use `nofollow` to discourage crawlers from indexing individual row pages or deep pages like `https://datasette.io/content/repos?_facet=owner&_facet=language&_facet_array=topics&topics__arraycontains=sqlite#facet-owner`.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 964322136, "label": "Manage /robots.txt in Datasette core, block robots by default"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1425#issuecomment-895003796", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1425", "id": 895003796, "node_id": "IC_kwDOBm6k_c41WKyU", "user": {"value": 3243482, "label": "abdusco"}, "created_at": "2021-08-09T07:14:35Z", "updated_at": "2021-08-09T07:14:35Z", "author_association": "CONTRIBUTOR", "body": "I believe this also provides a workaround for the problem I face in https://github.com/simonw/datasette/issues/1300. \r\n\r\nNow I should be able to get table PKs and generate a row URL. I'll test this out and report my findings.\r\n\r\n\r\n```py\r\nfrom datasette.utils import path_from_row_pks\r\n\r\npks = await db.primary_keys(table)\r\nurl = self.ds.urls.row_blob(\r\n database,\r\n table,\r\n path_from_row_pks(row, pks, not pks),\r\n column,\r\n)\r\n```", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 963528457, "label": "render_cell() hook should support returning an awaitable"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1421#issuecomment-894930013", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1421", "id": 894930013, "node_id": "IC_kwDOBm6k_c41V4xd", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-08-09T03:38:06Z", "updated_at": "2021-08-09T03:38:06Z", "author_association": "OWNER", "body": "Amusing edge-case: if you run this against a `explain ...` query it falls back to using regular expressions, because `explain explain select ...` is invalid SQL. https://latest.datasette.io/fixtures?sql=explain+select+*+from+facetable%0D%0Awhere+state+%3D+%3Astate%0D%0Aand+on_earth+%3D+%3Aon_earth%0D%0Aand+neighborhood+not+like+%2700%3A04%27&state=&on_earth=", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 959999095, "label": "\"Query parameters\" form shows wrong input fields if query contains \"03:31\" style times"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1421#issuecomment-894929769", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1421", "id": 894929769, "node_id": "IC_kwDOBm6k_c41V4tp", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-08-09T03:36:49Z", "updated_at": "2021-08-09T03:36:49Z", "author_association": "OWNER", "body": "SQLite carries a warning about using `EXPLAIN` like this: https://www.sqlite.org/lang_explain.html\r\n\r\n> The output from EXPLAIN and EXPLAIN QUERY PLAN is intended for interactive analysis and troubleshooting only. The details of the output format are subject to change from one release of SQLite to the next. 
Applications should not use EXPLAIN or EXPLAIN QUERY PLAN since their exact behavior is variable and only partially documented.\r\n\r\nI think that's OK here, because of the regular expression fallback. If the format changes in the future in a way that breaks the query the error should be caught and the regex-captured parameters should be returned instead.\r\n\r\nHmmm... actually that's not entirely true:\r\n\r\nhttps://github.com/simonw/datasette/blob/b1fed48a95516ae84c0f020582303ab50ab817e2/datasette/utils/__init__.py#L1084-L1091\r\n\r\nIf the format changes such that the same columns are returned but the `[row[\"p4\"].lstrip(\":\") for row in results if row[\"opcode\"] == \"Variable\"]` list comprehension returns an empty array it will break Datasette!\r\n\r\nI'm going to take that risk for the moment, but I'll actively watch out for problems in the future. If this does turn out to be bad I can always go back to the pure regular expression mechanism.\r\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 959999095, "label": "\"Query parameters\" form shows wrong input fields if query contains \"03:31\" style times"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1421#issuecomment-894929080", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1421", "id": 894929080, "node_id": "IC_kwDOBm6k_c41V4i4", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-08-09T03:33:02Z", "updated_at": "2021-08-09T03:33:02Z", "author_association": "OWNER", "body": "Fixed! Fantastic, this one has been bothering me for *years*.\r\n\r\nhttps://latest.datasette.io/fixtures?sql=select+*+from+facetable%0D%0Awhere+state+%3D+%3Astate%0D%0Aand+on_earth+%3D+%3Aon_earth%0D%0Aand+neighborhood+not+like+%2700%3A04%27\r\n\r\n\"fixtures__select___from_facetable_where_state____state_and_on_earth____on_earth_and_neighborhood_not_like__00_04__and_pyinfra_pip_py_at_current_\u00b7_Fizzadar_pyinfra\"\r\n\r\n\r\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 959999095, "label": "\"Query parameters\" form shows wrong input fields if query contains \"03:31\" style times"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1421#issuecomment-894927185", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1421", "id": 894927185, "node_id": "IC_kwDOBm6k_c41V4FR", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-08-09T03:25:01Z", "updated_at": "2021-08-09T03:25:01Z", "author_association": "OWNER", "body": "One catch with this approach: if the SQL query is invalid, the parameters will not be extracted and shown as form fields.\r\n\r\nMaybe that's completely fine? Why display a form if it's going to break when the user actually runs the query?\r\n\r\nBut it does bother me. 
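A self-contained regression test along these lines (a sketch, not something from the Datasette test suite) would turn the silent empty-list case discussed above into a loud failure if a future SQLite release stops emitting `Variable` rows with the parameter name in `p4`:

```python
# Sketch of a regression test (assumed, not from Datasette's test suite):
# extract named parameters from EXPLAIN output the same way Datasette does
# and assert they are found, so a change in SQLite's bytecode listing shows
# up as a test failure rather than an empty parameter list.
import re
import sqlite3

_re_named_parameter = re.compile(r":([a-zA-Z0-9_]+)")


def explain_parameters(conn, sql):
    # Bind None for every regex-detected candidate so EXPLAIN can run,
    # then keep only the parameters SQLite itself reports as Variable.
    candidates = _re_named_parameter.findall(sql)
    rows = conn.execute("explain " + sql, {p: None for p in candidates}).fetchall()
    return [row["p4"].lstrip(":") for row in rows if row["opcode"] == "Variable"]


def test_explain_still_reports_variables():
    conn = sqlite3.connect(":memory:")
    conn.row_factory = sqlite3.Row
    params = explain_parameters(
        conn, "select :state as state, :on_earth as on_earth"
    )
    assert set(params) == {"state", "on_earth"}
```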
I worry that someone who is still iterating on and editing their query before actually starting to use it might find the behaviour confusing.\r\n\r\nSo maybe if the query raises an exception it could fall back on the regular expression results?", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 959999095, "label": "\"Query parameters\" form shows wrong input fields if query contains \"03:31\" style times"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1421#issuecomment-894925914", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1421", "id": 894925914, "node_id": "IC_kwDOBm6k_c41V3xa", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-08-09T03:20:42Z", "updated_at": "2021-08-09T03:20:42Z", "author_association": "OWNER", "body": "I think this works!\r\n\r\n```python\r\n_re_named_parameter = re.compile(\":([a-zA-Z0-9_]+)\")\r\n\r\nasync def derive_named_parameters(db, sql):\r\n explain = 'explain {}'.format(sql.strip().rstrip(\";\"))\r\n possible_params = _re_named_parameter.findall(sql)\r\n try:\r\n results = await db.execute(explain, {p: None for p in possible_params})\r\n return [row[\"p4\"].lstrip(\":\") for row in results if row[\"opcode\"] == \"Variable\"]\r\n except sqlite3.DatabaseError:\r\n return []\r\n```", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 959999095, "label": "\"Query parameters\" form shows wrong input fields if query contains \"03:31\" style times"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1421#issuecomment-894925437", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1421", "id": 894925437, "node_id": "IC_kwDOBm6k_c41V3p9", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-08-09T03:19:00Z", "updated_at": "2021-08-09T03:19:00Z", "author_association": "OWNER", "body": "This may not work:\r\n\r\n> `ERROR: sql = 'explain select 1 + :one + :two', params = None: You did not supply a value for binding 1.`\r\n\r\nThe `explain` queries themselves want me to pass them parameters.\r\n\r\nI could try using the regex to pull out candidates and passing `None` for each of those, including incorrect ones like `:31`.\r\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 959999095, "label": "\"Query parameters\" form shows wrong input fields if query contains \"03:31\" style times"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1421#issuecomment-894922703", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1421", "id": 894922703, "node_id": "IC_kwDOBm6k_c41V2_P", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-08-09T03:09:29Z", "updated_at": "2021-08-09T03:09:29Z", "author_association": "OWNER", "body": "Relevant code: https://github.com/simonw/datasette/blob/ad90a72afa21b737b162e2bbdddc301a97d575cd/datasette/views/database.py#L225-L231", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 959999095, "label": "\"Query parameters\" form shows wrong input fields if query contains \"03:31\" 
style times"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1421#issuecomment-894922145", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1421", "id": 894922145, "node_id": "IC_kwDOBm6k_c41V22h", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-08-09T03:07:38Z", "updated_at": "2021-08-09T03:07:38Z", "author_association": "OWNER", "body": "I hoped this would work:\r\n```sql\r\nwith foo as (\r\n explain select * from facetable\r\n where state = :state\r\n and on_earth = :on_earth\r\n and neighborhood not like '00:04'\r\n)\r\nselect p4 from foo where opcode = 'Variable'\r\n```\r\n But sadly [it returns an error](https://latest.datasette.io/fixtures?sql=with+foo+as+%28%0D%0A++explain+select+*+from+facetable%0D%0A++where+state+%3D+%3Astate%0D%0A++and+on_earth+%3D+%3Aon_earth%0D%0A++and+neighborhood+not+like+%2700%3A04%27%0D%0A%29%0D%0Aselect+p4+from+foo+where+opcode+%3D+%27Variable%27&state=&on_earth=&04=):\r\n\r\n> near \"explain\": syntax error", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 959999095, "label": "\"Query parameters\" form shows wrong input fields if query contains \"03:31\" style times"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1421#issuecomment-894921512", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1421", "id": 894921512, "node_id": "IC_kwDOBm6k_c41V2so", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-08-09T03:05:26Z", "updated_at": "2021-08-09T03:05:26Z", "author_association": "OWNER", "body": "I may have a way to work around this, using `explain`. Consider this query:\r\n\r\n```sql\r\nselect * from facetable\r\nwhere state = :state\r\nand on_earth = :on_earth\r\nand neighborhood not like '00:04'\r\n```\r\nDatasette currently gets confused and shows three form fields: https://latest.datasette.io/fixtures?sql=select+*+from+facetable%0D%0Awhere+state+%3D+%3Astate%0D%0Aand+on_earth+%3D+%3Aon_earth%0D%0Aand+neighborhood+not+like+%2700%3A04%27&state=&on_earth=&04=\r\n\r\n\"fixtures__select___from_facetable_where_state____state_and_on_earth____on_earth_and_neighborhood_not_like__00_04__and_pyinfra_pip_py_at_current_\u00b7_Fizzadar_pyinfra\"\r\n\r\nBut... 
if I run `explain` [against that](https://latest.datasette.io/fixtures?sql=explain+select+*+from+facetable%0D%0Awhere+state+%3D+%3Astate%0D%0Aand+on_earth+%3D+%3Aon_earth%0D%0Aand+neighborhood+not+like+%2700%3A04%27&state=&on_earth=&04=) I get this (truncated):\r\n\r\naddr | opcode | p1 | p2 | p3 | p4 | p5 | comment\r\n-- | -- | -- | -- | -- | -- | -- | --\r\n20 | ResultRow | 6 | 10 | 0 | \u00a0 | 0 | \u00a0\r\n21 | Next | 0 | 3 | 0 | \u00a0 | 1 | \u00a0\r\n22 | Halt | 0 | 0 | 0 | \u00a0 | 0 | \u00a0\r\n23 | Transaction | 0 | 0 | 35 | 0 | 1 | \u00a0\r\n24 | Variable | 1 | 2 | 0 | :state | 0 | \u00a0\r\n25 | Variable | 2 | 3 | 0 | :on_earth | 0 | \u00a0\r\n26 | String8 | 0 | 4 | 0 | 00:04 | 0 | \u00a0\r\n27 | Goto | 0 | 1 | 0 | \u00a0 | 0 | \u00a0\r\n\r\nCould it be as simple as pulling out those `Variable` rows to figure out the names of the variables in the query?", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 959999095, "label": "\"Query parameters\" form shows wrong input fields if query contains \"03:31\" style times"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1425#issuecomment-894900267", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1425", "id": 894900267, "node_id": "IC_kwDOBm6k_c41Vxgr", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-08-09T01:31:22Z", "updated_at": "2021-08-09T01:31:22Z", "author_association": "OWNER", "body": "I used this to build a new plugin: https://github.com/simonw/datasette-query-links\r\n\r\nDemo here: https://latest-with-plugins.datasette.io/fixtures?sql=select%0D%0A++%27select+*+from+[facetable]%27+as+query%0D%0Aunion%0D%0Aselect%0D%0A++%27select+sqlite_version()%27%0D%0Aunion%0D%0Aselect%0D%0A++%27select+this+is+invalid+SQL+so+will+not+be+linked%27", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 963528457, "label": "render_cell() hook should support returning an awaitable"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1425#issuecomment-894893319", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1425", "id": 894893319, "node_id": "IC_kwDOBm6k_c41Vv0H", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-08-09T01:08:56Z", "updated_at": "2021-08-09T01:09:12Z", "author_association": "OWNER", "body": "Demo: https://latest.datasette.io/fixtures/simple_primary_key shows `RENDER_CELL_ASYNC_RESULT` where the CSV version shows `RENDER_CELL_ASYNC`: https://latest.datasette.io/fixtures/simple_primary_key.csv - because of this test plugin code: https://github.com/simonw/datasette/blob/a390bdf9cef01d8723d025fc3348e81345ff4856/tests/plugins/my_plugin.py#L98-L122", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 963528457, "label": "render_cell() hook should support returning an awaitable"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1425#issuecomment-894884874", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1425", "id": 894884874, "node_id": "IC_kwDOBm6k_c41VtwK", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-08-09T00:38:20Z", "updated_at": "2021-08-09T00:38:20Z", 
"author_association": "OWNER", "body": "I'm trying the version where I remove `firstresult=True`.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 963528457, "label": "render_cell() hook should support returning an awaitable"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1425#issuecomment-894883664", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1425", "id": 894883664, "node_id": "IC_kwDOBm6k_c41VtdQ", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-08-09T00:33:56Z", "updated_at": "2021-08-09T00:33:56Z", "author_association": "OWNER", "body": "I could extract that code out and write my own function which implements the equivalent of calling `pm.hook.render_cell(...)` but runs `await_me_maybe()` before checking if `res is not None`.\r\n\r\nThat's pretty nasty.\r\n\r\nCould I instead call the plugin hook normally, but then have additional logic which says \"if I await it and it returns `None` then try calling the hook again but skip this one\" - not sure if there's a way to do that either.\r\n\r\nI could remove the `firstresult=True` from the hookspec - which would cause it to call and return ALL hooks - but then in my own code use only the first one. This is slightly less efficient (since it calls all the hooks and then discards all-but-one value) but it's the least unpleasant in terms of the code I would have to write - plus I don't think it's going to be THAT common for someone to have multiple expensive `render_cell()` hooks installed at once (they are usually pretty cheap).", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 963528457, "label": "render_cell() hook should support returning an awaitable"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1425#issuecomment-894882642", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1425", "id": 894882642, "node_id": "IC_kwDOBm6k_c41VtNS", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-08-09T00:29:57Z", "updated_at": "2021-08-09T00:29:57Z", "author_association": "OWNER", "body": "Here's the code in `pluggy` that implements this: https://github.com/pytest-dev/pluggy/blob/0a064fe275060dbdb1fe6e10c888e72bc400fb33/src/pluggy/callers.py#L31-L43\r\n\r\n```python\r\n if hook_impl.hookwrapper:\r\n try:\r\n gen = hook_impl.function(*args)\r\n next(gen) # first yield\r\n teardowns.append(gen)\r\n except StopIteration:\r\n _raise_wrapfail(gen, \"did not yield\")\r\n else:\r\n res = hook_impl.function(*args)\r\n if res is not None:\r\n results.append(res)\r\n if firstresult: # halt further impl calls\r\n break\r\n```", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 963528457, "label": "render_cell() hook should support returning an awaitable"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1425#issuecomment-894882123", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1425", "id": 894882123, "node_id": "IC_kwDOBm6k_c41VtFL", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-08-09T00:27:43Z", "updated_at": "2021-08-09T00:27:43Z", "author_association": "OWNER", "body": 
"Good news: `render_cell()` is the only hook to use `firstresult=True`:\r\n\r\nhttps://github.com/simonw/datasette/blob/f3c9edb376a13c09b5ecf97c7390f4e49efaadf2/datasette/hookspecs.py#L62-L64\r\n\r\nhttps://pluggy.readthedocs.io/en/latest/#first-result-only", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 963528457, "label": "render_cell() hook should support returning an awaitable"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1425#issuecomment-894881448", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1425", "id": 894881448, "node_id": "IC_kwDOBm6k_c41Vs6o", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-08-09T00:24:25Z", "updated_at": "2021-08-09T00:24:39Z", "author_association": "OWNER", "body": "My hunch is that the \"skip this `render_cell()` result if it returns `None`\" logic isn't working correctly, ever since I added the `await_me_maybe` line.\r\n\r\nCould that be because Pluggy handles the \"do the next if `None` is returned\" logic itself, but I'm no-longer returning `None`, I'm returning an awaitable which when awaited returns `None`.\r\n\r\nThis would suggest that all of the `await_me_maybe()` plugin hooks have the same bug. That's definitely possible - it may well be that no-one has yet stumbled across a bug caused by a plugin returning an awaitable and hence not being skipped, because plugin hooks that return awaitable are rare enough that no-one has tried two plugins which both use that trick.\r\n\r\nStill don't see why it would pass on my laptop but fail in CI though.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 963528457, "label": "render_cell() hook should support returning an awaitable"}, "performed_via_github_app": null}