{"html_url": "https://github.com/simonw/datasette/issues/370#issuecomment-1261930179", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/370", "id": 1261930179, "node_id": "IC_kwDOBm6k_c5LN4bD", "user": {"value": 72577720, "label": "MichaelTiemannOSC"}, "created_at": "2022-09-29T08:17:46Z", "updated_at": "2022-09-29T08:17:46Z", "author_association": "CONTRIBUTOR", "body": "Just watched this video which demonstrates the integration of *any* webapp into JupyterLab: https://youtu.be/FH1dKKmvFtc\r\n\r\nMaybe this is the answer?", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 377155320, "label": "Integration with JupyterLab"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1396#issuecomment-946467547", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1396", "id": 946467547, "node_id": "IC_kwDOBm6k_c44afLb", "user": {"value": 72577720, "label": "MichaelTiemannOSC"}, "created_at": "2021-10-19T08:10:26Z", "updated_at": "2021-10-19T08:10:26Z", "author_association": "CONTRIBUTOR", "body": "Now that 0.59 has excellent annotated release notes, you can re-confirm this is fixed by updating the published Docker image and checking that these fixes still work ;-)", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 944903881, "label": "\"invalid reference format\" publishing Docker image"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/2202#issuecomment-1801876943", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/2202", "id": 1801876943, "node_id": "IC_kwDOBm6k_c5rZnXP", "user": {"value": 49699333, "label": "dependabot[bot]"}, "created_at": "2023-11-08T13:19:00Z", "updated_at": "2023-11-08T13:19:00Z", "author_association": "CONTRIBUTOR", "body": "Superseded by #2206.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1959278971, "label": "Bump the python-packages group with 1 update"}, "performed_via_github_app": "{\"id\": 29110, \"slug\": \"dependabot\", \"node_id\": \"MDM6QXBwMjkxMTA=\", \"owner\": {\"login\": \"github\", \"id\": 9919, \"node_id\": \"MDEyOk9yZ2FuaXphdGlvbjk5MTk=\", \"avatar_url\": \"https://avatars.githubusercontent.com/u/9919?v=4\", \"gravatar_id\": \"\", \"url\": \"https://api.github.com/users/github\", \"html_url\": \"https://github.com/github\", \"followers_url\": \"https://api.github.com/users/github/followers\", \"following_url\": \"https://api.github.com/users/github/following{/other_user}\", \"gists_url\": \"https://api.github.com/users/github/gists{/gist_id}\", \"starred_url\": \"https://api.github.com/users/github/starred{/owner}{/repo}\", \"subscriptions_url\": \"https://api.github.com/users/github/subscriptions\", \"organizations_url\": \"https://api.github.com/users/github/orgs\", \"repos_url\": \"https://api.github.com/users/github/repos\", \"events_url\": \"https://api.github.com/users/github/events{/privacy}\", \"received_events_url\": \"https://api.github.com/users/github/received_events\", \"type\": \"Organization\", \"site_admin\": false}, \"name\": \"Dependabot\", \"description\": \"\", \"external_url\": \"https://dependabot-api.githubapp.com\", \"html_url\": 
\"https://github.com/apps/dependabot\", \"created_at\": \"2019-04-16T22:34:25Z\", \"updated_at\": \"2023-10-12T13:35:09Z\", \"permissions\": {\"checks\": \"write\", \"contents\": \"write\", \"issues\": \"write\", \"members\": \"read\", \"metadata\": \"read\", \"pull_requests\": \"write\", \"statuses\": \"read\", \"vulnerability_alerts\": \"read\", \"workflows\": \"write\"}, \"events\": [\"check_suite\", \"issues\", \"issue_comment\", \"label\", \"pull_request\", \"pull_request_review\", \"pull_request_review_comment\", \"repository\"]}"} {"html_url": "https://github.com/simonw/datasette/pull/2200#issuecomment-1777228352", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/2200", "id": 1777228352, "node_id": "IC_kwDOBm6k_c5p7lpA", "user": {"value": 49699333, "label": "dependabot[bot]"}, "created_at": "2023-10-24T13:40:25Z", "updated_at": "2023-10-24T13:40:25Z", "author_association": "CONTRIBUTOR", "body": "Superseded by #2202.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1949756141, "label": "Bump the python-packages group with 1 update"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/2182#issuecomment-1719451803", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/2182", "id": 1719451803, "node_id": "IC_kwDOBm6k_c5mfMCb", "user": {"value": 49699333, "label": "dependabot[bot]"}, "created_at": "2023-09-14T13:27:26Z", "updated_at": "2023-09-14T13:27:26Z", "author_association": "CONTRIBUTOR", "body": "Looks like these dependencies are updatable in another way, so this is no longer needed.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1890593563, "label": "Bump the python-packages group with 2 updates"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/2148#issuecomment-1696591957", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/2148", "id": 1696591957, "node_id": "IC_kwDOBm6k_c5lH_BV", "user": {"value": 49699333, "label": "dependabot[bot]"}, "created_at": "2023-08-29T00:15:29Z", "updated_at": "2023-08-29T00:15:29Z", "author_association": "CONTRIBUTOR", "body": "This pull request was built based on a group rule. 
Closing it will not ignore any of these versions in future pull requests.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1859415334, "label": "Bump sphinx, furo, blacken-docs dependencies"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/2152#issuecomment-1695736691", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/2152", "id": 1695736691, "node_id": "IC_kwDOBm6k_c5lEuNz", "user": {"value": 49699333, "label": "dependabot[bot]"}, "created_at": "2023-08-28T13:49:35Z", "updated_at": "2023-08-28T13:49:35Z", "author_association": "CONTRIBUTOR", "body": "Superseded by #2160.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1865174661, "label": "Bump the python-packages group with 3 updates"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/2148#issuecomment-1689198413", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/2148", "id": 1689198413, "node_id": "IC_kwDOBm6k_c5krx9N", "user": {"value": 49699333, "label": "dependabot[bot]"}, "created_at": "2023-08-23T02:57:55Z", "updated_at": "2023-08-23T02:57:55Z", "author_association": "CONTRIBUTOR", "body": "Looks like this PR has been edited by someone other than Dependabot. That means Dependabot can't rebase it - sorry!\n\nIf you're happy for Dependabot to recreate it from scratch, overwriting any edits, you can request `@dependabot recreate`.\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1859415334, "label": "Bump sphinx, furo, blacken-docs dependencies"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/2144#issuecomment-1686366557", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/2144", "id": 1686366557, "node_id": "IC_kwDOBm6k_c5kg-ld", "user": {"value": 49699333, "label": "dependabot[bot]"}, "created_at": "2023-08-21T13:48:15Z", "updated_at": "2023-08-21T13:48:15Z", "author_association": "CONTRIBUTOR", "body": "Superseded by #2148.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1856760386, "label": "Bump the python-packages group with 3 updates"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/2142#issuecomment-1683950031", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/2142", "id": 1683950031, "node_id": "IC_kwDOBm6k_c5kXwnP", "user": {"value": 49699333, "label": "dependabot[bot]"}, "created_at": "2023-08-18T13:49:24Z", "updated_at": "2023-08-18T13:49:24Z", "author_association": "CONTRIBUTOR", "body": "Looks like these dependencies are updatable in another way, so this is no longer needed.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1854970601, "label": "Bump the python-packages group with 2 updates"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/2141#issuecomment-1682256251", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/2141", "id": 1682256251, 
"node_id": "IC_kwDOBm6k_c5kRTF7", "user": {"value": 49699333, "label": "dependabot[bot]"}, "created_at": "2023-08-17T13:07:43Z", "updated_at": "2023-08-17T13:07:43Z", "author_association": "CONTRIBUTOR", "body": "Looks like blacken-docs is updatable in another way, so this is no longer needed.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1853289039, "label": "Bump the python-packages group with 1 update"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/2125#issuecomment-1668187546", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/2125", "id": 1668187546, "node_id": "IC_kwDOBm6k_c5jboWa", "user": {"value": 49699333, "label": "dependabot[bot]"}, "created_at": "2023-08-07T16:20:26Z", "updated_at": "2023-08-07T16:20:26Z", "author_association": "CONTRIBUTOR", "body": "Looks like sphinx is up-to-date now, so this is no longer needed.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1833193570, "label": "Bump sphinx from 6.1.3 to 7.1.2"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/2121#issuecomment-1668186872", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/2121", "id": 1668186872, "node_id": "IC_kwDOBm6k_c5jboL4", "user": {"value": 49699333, "label": "dependabot[bot]"}, "created_at": "2023-08-07T16:20:19Z", "updated_at": "2023-08-07T16:20:19Z", "author_association": "CONTRIBUTOR", "body": "Looks like furo is up-to-date now, so this is no longer needed.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1824399610, "label": "Bump furo from 2023.3.27 to 2023.7.26"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/2098#issuecomment-1668186815", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/2098", "id": 1668186815, "node_id": "IC_kwDOBm6k_c5jboK_", "user": {"value": 49699333, "label": "dependabot[bot]"}, "created_at": "2023-08-07T16:20:18Z", "updated_at": "2023-08-07T16:20:18Z", "author_association": "CONTRIBUTOR", "body": "Looks like blacken-docs is up-to-date now, so this is no longer needed.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1796830110, "label": "Bump blacken-docs from 1.14.0 to 1.15.0"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/2124#issuecomment-1662215579", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/2124", "id": 1662215579, "node_id": "IC_kwDOBm6k_c5jE2Wb", "user": {"value": 49699333, "label": "dependabot[bot]"}, "created_at": "2023-08-02T13:28:43Z", "updated_at": "2023-08-02T13:28:43Z", "author_association": "CONTRIBUTOR", "body": "Superseded by #2125.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1826424151, "label": "Bump sphinx from 6.1.3 to 7.1.1"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/2107#issuecomment-1655678215", "issue_url": 
"https://api.github.com/repos/simonw/datasette/issues/2107", "id": 1655678215, "node_id": "IC_kwDOBm6k_c5ir6UH", "user": {"value": 49699333, "label": "dependabot[bot]"}, "created_at": "2023-07-28T13:23:16Z", "updated_at": "2023-07-28T13:23:16Z", "author_association": "CONTRIBUTOR", "body": "Superseded by #2124.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1820346348, "label": "Bump sphinx from 6.1.3 to 7.1.0"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/2077#issuecomment-1653652665", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/2077", "id": 1653652665, "node_id": "IC_kwDOBm6k_c5ikLy5", "user": {"value": 49699333, "label": "dependabot[bot]"}, "created_at": "2023-07-27T13:40:52Z", "updated_at": "2023-07-27T13:40:52Z", "author_association": "CONTRIBUTOR", "body": "Superseded by #2121.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1719759468, "label": "Bump furo from 2023.3.27 to 2023.5.20"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/2075#issuecomment-1649849249", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/2075", "id": 1649849249, "node_id": "IC_kwDOBm6k_c5iVrOh", "user": {"value": 49699333, "label": "dependabot[bot]"}, "created_at": "2023-07-25T13:28:35Z", "updated_at": "2023-07-25T13:28:35Z", "author_association": "CONTRIBUTOR", "body": "Superseded by #2107.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1710164693, "label": "Bump sphinx from 6.1.3 to 7.0.1"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/2068#issuecomment-1547911570", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/2068", "id": 1547911570, "node_id": "IC_kwDOBm6k_c5cQ0GS", "user": {"value": 49699333, "label": "dependabot[bot]"}, "created_at": "2023-05-15T13:59:35Z", "updated_at": "2023-05-15T13:59:35Z", "author_association": "CONTRIBUTOR", "body": "Superseded by #2075.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1690842199, "label": "Bump sphinx from 6.1.3 to 7.0.0"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/2064#issuecomment-1529737426", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/2064", "id": 1529737426, "node_id": "IC_kwDOBm6k_c5bLfDS", "user": {"value": 49699333, "label": "dependabot[bot]"}, "created_at": "2023-05-01T13:58:50Z", "updated_at": "2023-05-01T13:58:50Z", "author_association": "CONTRIBUTOR", "body": "Superseded by #2068.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1683229834, "label": "Bump sphinx from 6.1.3 to 6.2.1"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/2063#issuecomment-1521837780", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/2063", "id": 1521837780, "node_id": "IC_kwDOBm6k_c5atWbU", "user": {"value": 49699333, "label": "dependabot[bot]"}, "created_at": 
"2023-04-25T13:57:52Z", "updated_at": "2023-04-25T13:57:52Z", "author_association": "CONTRIBUTOR", "body": "Superseded by #2064.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1681339696, "label": "Bump sphinx from 6.1.3 to 6.2.0"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/2014#issuecomment-1487999503", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/2014", "id": 1487999503, "node_id": "IC_kwDOBm6k_c5YsRIP", "user": {"value": 49699333, "label": "dependabot[bot]"}, "created_at": "2023-03-29T06:09:11Z", "updated_at": "2023-03-29T06:09:11Z", "author_association": "CONTRIBUTOR", "body": "Superseded by #2047.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1566081801, "label": "Bump black from 22.12.0 to 23.1.0"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/2043#issuecomment-1486944644", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/2043", "id": 1486944644, "node_id": "IC_kwDOBm6k_c5YoPmE", "user": {"value": 49699333, "label": "dependabot[bot]"}, "created_at": "2023-03-28T13:58:20Z", "updated_at": "2023-03-28T13:58:20Z", "author_association": "CONTRIBUTOR", "body": "Superseded by #2046.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1639446870, "label": "Bump furo from 2022.12.7 to 2023.3.23"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/1982#issuecomment-1376620851", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1982", "id": 1376620851, "node_id": "IC_kwDOBm6k_c5SDZEz", "user": {"value": 49699333, "label": "dependabot[bot]"}, "created_at": "2023-01-10T02:03:18Z", "updated_at": "2023-01-10T02:03:18Z", "author_association": "CONTRIBUTOR", "body": "Looks like sphinx is up-to-date now, so this is no longer needed.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1525560504, "label": "Bump sphinx from 5.3.0 to 6.1.2"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/1977#issuecomment-1375596856", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1977", "id": 1375596856, "node_id": "IC_kwDOBm6k_c5R_fE4", "user": {"value": 49699333, "label": "dependabot[bot]"}, "created_at": "2023-01-09T13:06:14Z", "updated_at": "2023-01-09T13:06:14Z", "author_association": "CONTRIBUTOR", "body": "Superseded by #1982.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1522552817, "label": "Bump sphinx from 5.3.0 to 6.1.1"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/1976#issuecomment-1373592231", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1976", "id": 1373592231, "node_id": "IC_kwDOBm6k_c5R31qn", "user": {"value": 49699333, "label": "dependabot[bot]"}, "created_at": "2023-01-06T13:02:15Z", "updated_at": "2023-01-06T13:02:15Z", "author_association": "CONTRIBUTOR", "body": "Superseded by #1977.", "reactions": 
"{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1520712722, "label": "Bump sphinx from 5.3.0 to 6.1.0"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/1974#issuecomment-1372188571", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1974", "id": 1372188571, "node_id": "IC_kwDOBm6k_c5Rye-b", "user": {"value": 49699333, "label": "dependabot[bot]"}, "created_at": "2023-01-05T13:02:40Z", "updated_at": "2023-01-05T13:02:40Z", "author_association": "CONTRIBUTOR", "body": "Superseded by #1976.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1516376583, "label": "Bump sphinx from 5.3.0 to 6.0.0"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/1685#issuecomment-1237381620", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1685", "id": 1237381620, "node_id": "IC_kwDOBm6k_c5JwPH0", "user": {"value": 49699333, "label": "dependabot[bot]"}, "created_at": "2022-09-05T18:36:47Z", "updated_at": "2022-09-05T18:36:47Z", "author_association": "CONTRIBUTOR", "body": "Looks like jinja2 is no longer updatable, so this is no longer needed.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1180778860, "label": "Update jinja2 requirement from <3.1.0,>=2.10.3 to >=2.10.3,<3.2.0"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/1799#issuecomment-1237381569", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1799", "id": 1237381569, "node_id": "IC_kwDOBm6k_c5JwPHB", "user": {"value": 49699333, "label": "dependabot[bot]"}, "created_at": "2022-09-05T18:36:42Z", "updated_at": "2022-09-05T18:36:42Z", "author_association": "CONTRIBUTOR", "body": "Looks like aiofiles is no longer updatable, so this is no longer needed.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1362242558, "label": "Update aiofiles requirement from <0.9,>=0.4 to >=0.4,<22.2"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/1693#issuecomment-1168704157", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1693", "id": 1168704157, "node_id": "IC_kwDOBm6k_c5FqQKd", "user": {"value": 49699333, "label": "dependabot[bot]"}, "created_at": "2022-06-28T13:11:36Z", "updated_at": "2022-06-28T13:11:36Z", "author_association": "CONTRIBUTOR", "body": "Superseded by #1763.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1184850337, "label": "Bump black from 22.1.0 to 22.3.0"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/1753#issuecomment-1163091750", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1753", "id": 1163091750, "node_id": "IC_kwDOBm6k_c5FU18m", "user": {"value": 49699333, "label": "dependabot[bot]"}, "created_at": "2022-06-22T13:22:34Z", "updated_at": "2022-06-22T13:22:34Z", "author_association": "CONTRIBUTOR", "body": "Superseded by #1760.", "reactions": "{\"total_count\": 0, \"+1\": 0, 
\"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1261826957, "label": "Bump furo from 2022.4.7 to 2022.6.4.1"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/1593#issuecomment-1031455498", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1593", "id": 1031455498, "node_id": "IC_kwDOBm6k_c49esMK", "user": {"value": 49699333, "label": "dependabot[bot]"}, "created_at": "2022-02-07T13:13:22Z", "updated_at": "2022-02-07T13:13:22Z", "author_association": "CONTRIBUTOR", "body": "Superseded by #1631.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1101705012, "label": "Update pytest-asyncio requirement from <0.17,>=0.10 to >=0.10,<0.18"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/1514#issuecomment-972852184", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1514", "id": 972852184, "node_id": "IC_kwDOBm6k_c45_IvY", "user": {"value": 49699333, "label": "dependabot[bot]"}, "created_at": "2021-11-18T13:11:15Z", "updated_at": "2021-11-18T13:11:15Z", "author_association": "CONTRIBUTOR", "body": "Superseded by #1516.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1056117435, "label": "Bump black from 21.9b0 to 21.11b0"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/1500#issuecomment-971568829", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1500", "id": 971568829, "node_id": "IC_kwDOBm6k_c456Pa9", "user": {"value": 49699333, "label": "dependabot[bot]"}, "created_at": "2021-11-17T13:13:58Z", "updated_at": "2021-11-17T13:13:58Z", "author_association": "CONTRIBUTOR", "body": "Superseded by #1514.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1041158024, "label": "Bump black from 21.9b0 to 21.10b0"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/1489#issuecomment-943594738", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1489", "id": 943594738, "node_id": "IC_kwDOBm6k_c44Phzy", "user": {"value": 49699333, "label": "dependabot[bot]"}, "created_at": "2021-10-14T18:04:13Z", "updated_at": "2021-10-14T18:04:13Z", "author_association": "CONTRIBUTOR", "body": "OK, I won't notify you again about this release, but will get in touch when a new version is available. If you'd rather skip all updates until the next major or minor version, let me know by commenting `@dependabot ignore this major version` or `@dependabot ignore this minor version`. 
You can also ignore all major, minor, or patch releases for a dependency by adding an [`ignore` condition](https://docs.github.com/en/code-security/supply-chain-security/configuration-options-for-dependency-updates#ignore) with the desired `update_types` to your config file.\n\nIf you change your mind, just re-open this PR and I'll resolve any conflicts on it.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1026379132, "label": "Update pyyaml requirement from ~=5.3 to >=5.3,<7.0"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/1489#issuecomment-943594735", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1489", "id": 943594735, "node_id": "IC_kwDOBm6k_c44Phzv", "user": {"value": 49699333, "label": "dependabot[bot]"}, "created_at": "2021-10-14T18:04:12Z", "updated_at": "2021-10-14T18:04:12Z", "author_association": "CONTRIBUTOR", "body": "Looks like this PR is closed. If you re-open it I'll rebase it as long as no-one else has edited it (you can use `@dependabot reopen` if the branch has been deleted).", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1026379132, "label": "Update pyyaml requirement from ~=5.3 to >=5.3,<7.0"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/1453#issuecomment-919135732", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1453", "id": 919135732, "node_id": "IC_kwDOBm6k_c42yOX0", "user": {"value": 49699333, "label": "dependabot[bot]"}, "created_at": "2021-09-14T13:10:38Z", "updated_at": "2021-09-14T13:10:38Z", "author_association": "CONTRIBUTOR", "body": "Superseded by #1471.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 982780906, "label": "Bump black from 21.7b0 to 21.8b0"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/1318#issuecomment-838449572", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1318", "id": 838449572, "node_id": "MDEyOklzc3VlQ29tbWVudDgzODQ0OTU3Mg==", "user": {"value": 49699333, "label": "dependabot[bot]"}, "created_at": "2021-05-11T13:12:30Z", "updated_at": "2021-05-11T13:12:30Z", "author_association": "CONTRIBUTOR", "body": "Superseded by #1321.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 876431852, "label": "Bump black from 21.4b2 to 21.5b0"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/dogsheep-photos/issues/3#issuecomment-934372104", "issue_url": "https://api.github.com/repos/dogsheep/dogsheep-photos/issues/3", "id": 934372104, "node_id": "IC_kwDOD079W843sWMI", "user": {"value": 41546558, "label": "RhetTbull"}, "created_at": "2021-10-05T12:38:24Z", "updated_at": "2021-10-05T12:38:24Z", "author_association": "CONTRIBUTOR", "body": "As dogsheep-photos already uses [osxphotos](https://github.com/RhetTbull/osxphotos) to load photos you can access the EXIF data via osxphotos. 
Apple Photos imports a small subset of EXIF data at the time the photo is imported and osxphotos provides this via the [exif_info](https://github.com/RhetTbull/osxphotos#exifinfo) property. If you want the full EXIF data, osxphotos also provides a wrapper around [exiftool](https://github.com/RhetTbull/osxphotos#exiftool).", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 602533481, "label": "Import EXIF data into SQLite - lens used, ISO, aperture etc"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/dogsheep-photos/issues/33#issuecomment-778246347", "issue_url": "https://api.github.com/repos/dogsheep/dogsheep-photos/issues/33", "id": 778246347, "node_id": "MDEyOklzc3VlQ29tbWVudDc3ODI0NjM0Nw==", "user": {"value": 41546558, "label": "RhetTbull"}, "created_at": "2021-02-12T15:00:43Z", "updated_at": "2021-02-12T15:00:43Z", "author_association": "CONTRIBUTOR", "body": "Yes, Big Sur Photos database doesn't have `ZGENERICASSET` table. PR #31 will fix this.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 803338729, "label": "photo-to-sqlite: command not found"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/dogsheep-photos/pull/31#issuecomment-748562330", "issue_url": "https://api.github.com/repos/dogsheep/dogsheep-photos/issues/31", "id": 748562330, "node_id": "MDEyOklzc3VlQ29tbWVudDc0ODU2MjMzMA==", "user": {"value": 41546558, "label": "RhetTbull"}, "created_at": "2020-12-20T04:45:08Z", "updated_at": "2020-12-20T04:45:08Z", "author_association": "CONTRIBUTOR", "body": "Fixes the issue mentioned here: https://github.com/dogsheep/dogsheep-photos/issues/15#issuecomment-748436115", "reactions": "{\"total_count\": 1, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 1, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 771511344, "label": "Update for Big Sur"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/dogsheep-photos/issues/15#issuecomment-748562288", "issue_url": "https://api.github.com/repos/dogsheep/dogsheep-photos/issues/15", "id": 748562288, "node_id": "MDEyOklzc3VlQ29tbWVudDc0ODU2MjI4OA==", "user": {"value": 41546558, "label": "RhetTbull"}, "created_at": "2020-12-20T04:44:22Z", "updated_at": "2020-12-20T04:44:22Z", "author_association": "CONTRIBUTOR", "body": "@nickvazz @simonw I opened a [PR](https://github.com/dogsheep/dogsheep-photos/pull/31) that replaces the SQL for `ZCOMPUTEDASSETATTRIBUTES` to use osxphotos which now exposes all this data and has been updated for Big Sur. I did regression tests to confirm the extracted data is identical, with one exception which should not affect operation: the old code pulled data from `ZCOMPUTEDASSETATTRIBUTES` for missing photos while the main loop ignores missing photos and does not add them to `apple_photos`. 
The new code does not add rows to the `apple_photos_scores` table for missing photos.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 612151767, "label": "Expose scores from ZCOMPUTEDASSETATTRIBUTES"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/dogsheep-photos/issues/15#issuecomment-748436779", "issue_url": "https://api.github.com/repos/dogsheep/dogsheep-photos/issues/15", "id": 748436779, "node_id": "MDEyOklzc3VlQ29tbWVudDc0ODQzNjc3OQ==", "user": {"value": 41546558, "label": "RhetTbull"}, "created_at": "2020-12-19T07:49:00Z", "updated_at": "2020-12-19T07:49:00Z", "author_association": "CONTRIBUTOR", "body": "@nickvazz ZGENERICASSET changed to ZASSET in Big Sur. Here's a list of other changes to the schema in Big Sur: https://github.com/RhetTbull/osxphotos/wiki/Changes-in-Photos-6---Big-Sur", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 612151767, "label": "Expose scores from ZCOMPUTEDASSETATTRIBUTES"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/dogsheep-photos/issues/22#issuecomment-628405453", "issue_url": "https://api.github.com/repos/dogsheep/dogsheep-photos/issues/22", "id": 628405453, "node_id": "MDEyOklzc3VlQ29tbWVudDYyODQwNTQ1Mw==", "user": {"value": 41546558, "label": "RhetTbull"}, "created_at": "2020-05-14T05:59:53Z", "updated_at": "2020-05-14T05:59:53Z", "author_association": "CONTRIBUTOR", "body": "I've added support for the above exif data to [v0.28.17](https://github.com/RhetTbull/osxphotos/releases/tag/v0.28.17) of osxphotos. `PhotoInfo.exif_info` will return an `ExifInfo` [dataclass](https://docs.python.org/3/library/dataclasses.html) object with the following properties:\r\n\r\n```python\r\n flash_fired: bool\r\n iso: int\r\n metering_mode: int\r\n sample_rate: int\r\n track_format: int\r\n white_balance: int\r\n aperture: float\r\n bit_rate: float\r\n duration: float\r\n exposure_bias: float\r\n focal_length: float\r\n fps: float\r\n latitude: float\r\n longitude: float\r\n shutter_speed: float\r\n camera_make: str\r\n camera_model: str\r\n codec: str\r\n lens_model: str\r\n```\r\n\r\nIt's not all the EXIF data available in most files but is the data Photos deems important to save. Of course, you can get all the exif_data\r\n\r\nNote: this only works in Photos 5. As best as I can tell, EXIF data is not stored in the database for earlier versions. ", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 615626118, "label": "Try out ExifReader"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/dogsheep-photos/issues/22#issuecomment-627007458", "issue_url": "https://api.github.com/repos/dogsheep/dogsheep-photos/issues/22", "id": 627007458, "node_id": "MDEyOklzc3VlQ29tbWVudDYyNzAwNzQ1OA==", "user": {"value": 41546558, "label": "RhetTbull"}, "created_at": "2020-05-11T22:51:52Z", "updated_at": "2020-05-11T22:52:26Z", "author_association": "CONTRIBUTOR", "body": "I'm not familiar with `ExifReader`. I wrote my own wrapper around `exiftool` because I wanted a simple way to write EXIF data when exporting photos (e.g. 
writing out to PersonInImage and keywords to IPTC:Keywords) and the existing python packages like [pyexiftool](https://github.com/smarnach/pyexiftool) didn't do quite what I wanted. If all you're after is the camera and shot info, that's available in `ZEXTENDEDATTRIBUTES` table. I've got an open issue [#11](https://github.com/RhetTbull/osxphotos/issues/11) to add this to osxphotos but it hasn't bubbled to the top of my backlog yet. \r\n\r\nosxphotos will give you the location info: `PhotoInfo.location` returns a tuple of (lat, lon) though this info is in ZEXTENDEDATTRIBUTES too (though it might not be correct as I believe Photos creates this table at import and the user might have changed the location of a photo, e.g. if camera didn't have GPS).\r\n\r\n```sql\r\nCREATE TABLE ZEXTENDEDATTRIBUTES (\r\n Z_PK INTEGER PRIMARY KEY, Z_ENT INTEGER, \r\n Z_OPT INTEGER, ZFLASHFIRED INTEGER, \r\n ZISO INTEGER, ZMETERINGMODE INTEGER, \r\n ZSAMPLERATE INTEGER, ZTRACKFORMAT INTEGER, \r\n ZWHITEBALANCE INTEGER, ZASSET INTEGER, \r\n ZAPERTURE FLOAT, ZBITRATE FLOAT, ZDURATION FLOAT, \r\n ZEXPOSUREBIAS FLOAT, ZFOCALLENGTH FLOAT, \r\n ZFPS FLOAT, ZLATITUDE FLOAT, ZLONGITUDE FLOAT, \r\n ZSHUTTERSPEED FLOAT, ZCAMERAMAKE VARCHAR, \r\n ZCAMERAMODEL VARCHAR, ZCODEC VARCHAR, \r\n ZLENSMODEL VARCHAR\r\n);\r\n```", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 615626118, "label": "Try out ExifReader"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/dogsheep-photos/issues/22#issuecomment-626667235", "issue_url": "https://api.github.com/repos/dogsheep/dogsheep-photos/issues/22", "id": 626667235, "node_id": "MDEyOklzc3VlQ29tbWVudDYyNjY2NzIzNQ==", "user": {"value": 41546558, "label": "RhetTbull"}, "created_at": "2020-05-11T12:20:34Z", "updated_at": "2020-05-11T12:20:34Z", "author_association": "CONTRIBUTOR", "body": "@simonw FYI, osxphotos includes a built in ExifTool class that uses [exiftool](https://exiftool.org/) to read and write exif data. It's not exposed yet in the docs because I really only use it right now in the osphotos command line interface to write tags when exporting. In v0.28.16 (just pushed) I added an ExifTool.as_dict() method which will give you a dict with all the exif tags in a file. For example:\r\n\r\n```python\r\nimport osxphotos\r\nphotos = osxphotos.PhotosDB().photos()\r\nexiftool = osxphotos.exiftool.ExifTool(photos[0].path)\r\nexifdata = exiftool.as_dict()\r\ntags = exifdata[\"IPTC:Keywords\"]\r\n```\r\n\r\nNot as elegant perhaps as a python only implementation because ExifTool has to make subprocess calls to an external tool but exiftool is by far the best tool available for reading and writing EXIF data and it does support HEIC.\r\n\r\nAs for implementation, ExifTool uses a singleton pattern so the first time you instantiate it, it spawns an IPC to exiftool but then keeps it open and uses the same process for any subsequent calls (even on different files). 
", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 615626118, "label": "Try out ExifReader"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/dogsheep-photos/issues/21#issuecomment-626396379", "issue_url": "https://api.github.com/repos/dogsheep/dogsheep-photos/issues/21", "id": 626396379, "node_id": "MDEyOklzc3VlQ29tbWVudDYyNjM5NjM3OQ==", "user": {"value": 41546558, "label": "RhetTbull"}, "created_at": "2020-05-10T22:01:48Z", "updated_at": "2020-05-10T22:01:48Z", "author_association": "CONTRIBUTOR", "body": "Frustrates me when package authors create a \"drop in\" replacement with the same import name...this kind of thing has bitten me more than once! Would've been nicer I think for bpylist2 to do \"import bpylist2 as bpylist\"", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 615474990, "label": "bpylist.archiver.CircularReference: archive has a cycle with uid(13)"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/dogsheep-photos/issues/21#issuecomment-626395641", "issue_url": "https://api.github.com/repos/dogsheep/dogsheep-photos/issues/21", "id": 626395641, "node_id": "MDEyOklzc3VlQ29tbWVudDYyNjM5NTY0MQ==", "user": {"value": 41546558, "label": "RhetTbull"}, "created_at": "2020-05-10T21:55:54Z", "updated_at": "2020-05-10T21:55:54Z", "author_association": "CONTRIBUTOR", "body": "Did removing old bpylist solve the original problem or do you still have a photo that throws circular reference?", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 615474990, "label": "bpylist.archiver.CircularReference: archive has a cycle with uid(13)"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/dogsheep-photos/issues/21#issuecomment-626395507", "issue_url": "https://api.github.com/repos/dogsheep/dogsheep-photos/issues/21", "id": 626395507, "node_id": "MDEyOklzc3VlQ29tbWVudDYyNjM5NTUwNw==", "user": {"value": 41546558, "label": "RhetTbull"}, "created_at": "2020-05-10T21:54:45Z", "updated_at": "2020-05-10T21:54:45Z", "author_association": "CONTRIBUTOR", "body": "@simonw does Photos show valid reverse geolocation info? Are you sure you're using [bpylist2](https://github.com/xa4a/bpylist2) and not bpylist? They're both unfortunately imported as \"bpylist\" so if you somehow got the wrong (original bpylist) version installed, it could be the issue. ", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 615474990, "label": "bpylist.archiver.CircularReference: archive has a cycle with uid(13)"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/dogsheep-photos/issues/21#issuecomment-626390317", "issue_url": "https://api.github.com/repos/dogsheep/dogsheep-photos/issues/21", "id": 626390317, "node_id": "MDEyOklzc3VlQ29tbWVudDYyNjM5MDMxNw==", "user": {"value": 41546558, "label": "RhetTbull"}, "created_at": "2020-05-10T21:11:24Z", "updated_at": "2020-05-10T21:50:58Z", "author_association": "CONTRIBUTOR", "body": "Ugh....Yeah, I think easiest is to catch the exception and return no place as you suggest. 
This particular bit of code involves un-archiving a serialized NSKeyedArchiver which uses an object table and it is certainly possible to create a circular reference that way. Because this is happening in the decode, the circular reference must be in the original data. Does Photos show valid reverse geolocation info for the photo in question? If so, Photos may be doing something beyond a simple decode of the binary plist. For now, I'll push a patch to catch the exception.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 615474990, "label": "bpylist.archiver.CircularReference: archive has a cycle with uid(13)"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/dogsheep-photos/issues/17#issuecomment-624284539", "issue_url": "https://api.github.com/repos/dogsheep/dogsheep-photos/issues/17", "id": 624284539, "node_id": "MDEyOklzc3VlQ29tbWVudDYyNDI4NDUzOQ==", "user": {"value": 41546558, "label": "RhetTbull"}, "created_at": "2020-05-05T20:20:05Z", "updated_at": "2020-05-05T20:20:05Z", "author_association": "CONTRIBUTOR", "body": "FYI, I've got an [issue](https://github.com/RhetTbull/osxphotos/issues/25) to make osxphotos cross-platform but it's low on my priority list. About 90% of the functionality could be done cross-platform but right now the MacOS specific stuff is embedded throughout and would take some work. Though I try to minimize it, there's sprinklings of ObjC & Applescript throughout osxphotos.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 612860531, "label": "Only install osxphotos if running on macOS"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/dogsheep-photos/issues/16#issuecomment-623845014", "issue_url": "https://api.github.com/repos/dogsheep/dogsheep-photos/issues/16", "id": 623845014, "node_id": "MDEyOklzc3VlQ29tbWVudDYyMzg0NTAxNA==", "user": {"value": 41546558, "label": "RhetTbull"}, "created_at": "2020-05-05T03:55:14Z", "updated_at": "2020-05-05T03:56:24Z", "author_association": "CONTRIBUTOR", "body": "I'm traveling w/o access to my Mac so can't help with any code right now. I suspected ZSCENEIDENTIFIER was a foreign key into one of these psi.sqlite tables. But looks like you're on to something connecting groups to assets. As for the UUID, I think there's two ints because each is 64-bits but UUIDs are 128-bits. Thus they need to be combined to get the 128 bit UUID. You might be able to use Apple's [NSUUID](https://developer.apple.com/documentation/foundation/nsuuid?language=objc), for example, by wrapping with pyObjC. Here's one [example](https://github.com/ronaldoussoren/pyobjc/blob/881c82a7ba90f193934b52b44143360c80dce5e5/pyobjc-framework-Cocoa/PyObjCTest/test_nsuuid.py) of using this in PyObjC's test suite. Interesting it's stored this way instead of a UUIDString as in Photos.sqlite. 
Perhaps it for faster indexing.\r\n\r\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 612287234, "label": "Import machine-learning detected labels (dog, llama etc) from Apple Photos"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/103#issuecomment-622599528", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/103", "id": 622599528, "node_id": "MDEyOklzc3VlQ29tbWVudDYyMjU5OTUyOA==", "user": {"value": 32605365, "label": "b0b5h4rp13"}, "created_at": "2020-05-01T22:49:12Z", "updated_at": "2020-05-02T11:15:44Z", "author_association": "CONTRIBUTOR", "body": "With SQLITE_MAX_VARS = 999, or even 899, This hits the problem with the batch rows causing a overflow (works fine if SQLITE_MAX_VARS = 799).\r\n\r\np.s. I have tried a few list of dicts to sqlite modules and this was the easiest to use/understand\r\n\r\n------------- file begins ------------------\r\nimport sqlite_utils as su\r\n\r\n\r\ndata = [\r\n{'tickerId': 913324382, 'exchangeId': 11, 'type': 2, 'secType': 61, 'regionId': 6, 'regionCode': 'US', 'currencyId': 247, 'name': 'CONSTELLATION B', 'symbol': 'STZ B', 'disSymbol': 'STZ-B', 'disExchangeCode': 'NYSE', 'exchangeCode': 'NYSE', 'listStatus': 1, 'template': 'stock', 'status': 'D', 'close': '163.13', 'change': '6.46', 'changeRatio': '0.0412', 'marketValue': '31180699895.63', 'volume': '417', 'turnoverRate': '0.0000'},\r\n{'tickerId': 913323791, 'exchangeId': 11, 'type': 2, 'secType': 61, 'regionId': 6, 'regionCode': 'US', 'currencyId': 247, 'name': 'Molina Health', 'symbol': 'MOH', 'disSymbol': 'MOH', 'disExchangeCode': 'NYSE', 'exchangeCode': 'NYSE', 'listStatus': 1, 'template': 'stock', 'derivativeSupport': 1, 'status': 'D', 'close': '173.25', 'change': '9.28', 'changeRatio': '0.0566', 'pPrice': '173.25', 'pChange': '0.0000', 'pChRatio': '0.0000', 'marketValue': '10520341695.50', 'volume': '1281557', 'turnoverRate': '0.0202'},\r\n{'tickerId': 913257501, 'exchangeId': 96, 'type': 2, 'secType': 61, 'regionId': 6, 'regionCode': 'US', 'currencyId': 247, 'name': 'Seattle Genetics', 'symbol': 'SGEN', 'disSymbol': 'SGEN', 'disExchangeCode': 'NASDAQ', 'exchangeCode': 'NSQ', 'listStatus': 1, 'template': 'stock', 'derivativeSupport': 1, 'status': 'A', 'close': '145.64', 'change': '8.41', 'changeRatio': '0.0613', 'pPrice': '146.45', 'pChange': '0.8100', 'pChRatio': '0.0056', 'marketValue': '25117961347.60', 'volume': '2791411', 'turnoverRate': '0.0162'},\r\n{'tickerId': 925381971, 'exchangeId': 96, 'type': 2, 'secType': 61, 'regionId': 6, 'regionCode': 'US', 'currencyId': 247, 'name': 'Bandwidth', 'symbol': 'BAND', 'disSymbol': 'BAND', 'disExchangeCode': 'NASDAQ', 'exchangeCode': 'NSQ', 'listStatus': 1, 'template': 'stock', 'derivativeSupport': 1, 'status': 'D', 'close': '89.22', 'change': '7.66', 'changeRatio': '0.0939', 'pPrice': '89.00', 'pChange': '-0.2200', 'pChRatio': '-0.0025', 'marketValue': '2100025474.98', 'volume': '1508629', 'turnoverRate': '0.0641'},\r\n{'tickerId': 913323935, 'exchangeId': 96, 'type': 2, 'secType': 61, 'regionId': 6, 'regionCode': 'US', 'currencyId': 247, 'name': 'Magellan Health', 'symbol': 'MGLN', 'disSymbol': 'MGLN', 'disExchangeCode': 'NASDAQ', 'exchangeCode': 'NSQ', 'listStatus': 1, 'template': 'stock', 'derivativeSupport': 1, 'status': 'A', 'close': '68.00', 'change': '7.27', 'changeRatio': '0.1197', 'pPrice': '68.00', 'pChange': '0.0000', 'pChRatio': '0.0000', 
'marketValue': '1697894040.00', 'volume': '448919', 'turnoverRate': '0.0180'},\r\n{'tickerId': 913254854, 'exchangeId': 11, 'type': 2, 'secType': 61, 'regionId': 6, 'regionCode': 'US', 'currencyId': 247, 'name': 'On Assignment', 'symbol': 'ASGN', 'disSymbol': 'ASGN', 'disExchangeCode': 'NYSE', 'exchangeCode': 'NYSE', 'listStatus': 1, 'template': 'stock', 'derivativeSupport': 1, 'status': 'A', 'close': '53.04', 'change': '6.59', 'changeRatio': '0.1419', 'pPrice': '53.04', 'pChange': '0.0000', 'pChRatio': '0.0000', 'marketValue': '2811120000.00', 'volume': '1339771', 'turnoverRate': '0.0253'},\r\n{'tickerId': 913255732, 'exchangeId': 95, 'type': 2, 'secType': 61, 'regionId': 6, 'regionCode': 'US', 'currencyId': 247, 'name': 'Arcturus', 'symbol': 'ARCT', 'disSymbol': 'ARCT', 'disExchangeCode': 'NASDAQ', 'exchangeCode': 'NMS', 'listStatus': 1, 'template': 'stock', 'derivativeSupport': 1, 'status': 'A', 'close': '40.86', 'change': '6.36', 'changeRatio': '0.1843', 'pPrice': '42.60', 'pChange': '1.740', 'pChRatio': '0.0426', 'marketValue': '812021444.46', 'volume': '1577508', 'turnoverRate': '0.0794'},\r\n{'tickerId': 913256616, 'exchangeId': 96, 'type': 2, 'secType': 61, 'regionId': 6, 'regionCode': 'US', 'currencyId': 247, 'name': 'DexCom', 'symbol': 'DXCM', 'disSymbol': 'DXCM', 'disExchangeCode': 'NASDAQ', 'exchangeCode': 'NSQ', 'listStatus': 1, 'template': 'stock', 'derivativeSupport': 1, 'status': 'A', 'close': '341.52', 'change': '6.32', 'changeRatio': '0.0189', 'pPrice': '340.00', 'pChange': '-1.5200', 'pChRatio': '-0.0045', 'marketValue': '31522296000.00', 'volume': '1008849', 'turnoverRate': '0.0109'},\r\n{'tickerId': 913255108, 'exchangeId': 11, 'type': 2, 'secType': 61, 'regionId': 6, 'regionCode': 'US', 'currencyId': 247, 'name': 'Clorox', 'symbol': 'CLX', 'disSymbol': 'CLX', 'disExchangeCode': 'NYSE', 'exchangeCode': 'NYSE', 'listStatus': 1, 'template': 'stock', 'derivativeSupport': 1, 'status': 'A', 'close': '192.71', 'change': '6.27', 'changeRatio': '0.0336', 'pPrice': '192.95', 'pChange': '0.2400', 'pChRatio': '0.0012', 'marketValue': '24185773318.28', 'volume': '4996414', 'turnoverRate': '0.0398'},\r\n{'tickerId': 925314627, 'exchangeId': 11, 'type': 2, 'secType': 61, 'regionId': 6, 'regionCode': 'US', 'currencyId': 247, 'name': 'FRANCO NEVADA', 'symbol': 'FNV', 'disSymbol': 'FNV', 'disExchangeCode': 'NYSE', 'exchangeCode': 'NYSE', 'listStatus': 1, 'template': 'stock', 'derivativeSupport': 1, 'status': 'A', 'close': '137.85', 'change': '5.64', 'changeRatio': '0.0427', 'pPrice': '138.50', 'pChange': '0.6500', 'pChRatio': '0.0047', 'marketValue': '26110405326.30', 'volume': '1047688', 'turnoverRate': '0.0055'},\r\n{'tickerId': 913254955, 'exchangeId': 11, 'type': 2, 'secType': 61, 'regionId': 6, 'regionCode': 'US', 'currencyId': 247, 'name': 'Aon Plc', 'symbol': 'AON', 'disSymbol': 'AON', 'disExchangeCode': 'NYSE', 'exchangeCode': 'NYSE', 'listStatus': 1, 'template': 'stock', 'derivativeSupport': 1, 'status': 'A', 'close': '178.21', 'change': '5.54', 'changeRatio': '0.0321', 'pPrice': '178.21', 'pChange': '0.0000', 'pChRatio': '0.0000', 'marketValue': '41181209117.22', 'volume': '2026234', 'turnoverRate': '0.0088'},\r\n{'tickerId': 913324105, 'exchangeId': 96, 'type': 2, 'secType': 61, 'regionId': 6, 'regionCode': 'US', 'currencyId': 247, 'name': 'Willis Towers', 'symbol': 'WLTW', 'disSymbol': 'WLTW', 'disExchangeCode': 'NASDAQ', 'exchangeCode': 'NSQ', 'listStatus': 1, 'template': 'stock', 'derivativeSupport': 1, 'status': 'D', 'close': '183.34', 'change': '5.05', 'changeRatio': 
'0.0283', 'pPrice': '183.34', 'pChange': '0.0000', 'pChRatio': '0.0000', 'marketValue': '23597461124.96', 'volume': '968943', 'turnoverRate': '0.0075'},\r\n{'tickerId': 913254759, 'exchangeId': 11, 'type': 2, 'secType': 61, 'regionId': 6, 'regionCode': 'US', 'currencyId': 247, 'name': 'TELADOC HEALTH', 'symbol': 'TDOC', 'disSymbol': 'TDOC', 'disExchangeCode': 'NYSE', 'exchangeCode': 'NYSE', 'listStatus': 1, 'template': 'stock', 'derivativeSupport': 1, 'status': 'A', 'close': '169.43', 'change': '4.84', 'changeRatio': '0.0294', 'pPrice': '168.88', 'pChange': '-0.5500', 'pChRatio': '-0.0032', 'marketValue': '12614616858.38', 'volume': '2628946', 'turnoverRate': '0.0353'},\r\n{'tickerId': 913255222, 'exchangeId': 11, 'type': 2, 'secType': 61, 'regionId': 6, 'regionCode': 'US', 'currencyId': 247, 'name': 'Emergent Bio', 'symbol': 'EBS', 'disSymbol': 'EBS', 'disExchangeCode': 'NYSE', 'exchangeCode': 'NYSE', 'listStatus': 1, 'template': 'stock', 'derivativeSupport': 1, 'status': 'D', 'close': '78.70', 'change': '4.75', 'changeRatio': '0.0642', 'pPrice': '78.40', 'pChange': '-0.3000', 'pChRatio': '-0.0038', 'marketValue': '4113368277.10', 'volume': '783804', 'turnoverRate': '0.0150'},\r\n{'tickerId': 913323443, 'exchangeId': 96, 'type': 2, 'secType': 61, 'regionId': 6, 'regionCode': 'US', 'currencyId': 247, 'name': 'Pool', 'symbol': 'POOL', 'disSymbol': 'POOL', 'disExchangeCode': 'NASDAQ', 'exchangeCode': 'NSQ', 'listStatus': 1, 'template': 'stock', 'derivativeSupport': 1, 'status': 'A', 'close': '216.02', 'change': '4.36', 'changeRatio': '0.0206', 'pPrice': '216.02', 'pChange': '0.0000', 'pChRatio': '0.0000', 'marketValue': '8696077573.82', 'volume': '310837', 'turnoverRate': '0.0077'},\r\n{'tickerId': 913257075, 'exchangeId': 96, 'type': 2, 'secType': 61, 'regionId': 6, 'regionCode': 'US', 'currencyId': 247, 'name': 'Masimo', 'symbol': 'MASI', 'disSymbol': 'MASI', 'disExchangeCode': 'NASDAQ', 'exchangeCode': 'NSQ', 'listStatus': 1, 'template': 'stock', 'derivativeSupport': 1, 'status': 'A', 'close': '218.00', 'change': '4.09', 'changeRatio': '0.0191', 'pPrice': '217.00', 'pChange': '-1.0000', 'pChRatio': '-0.0046', 'marketValue': '11797070000.00', 'volume': '542131', 'turnoverRate': '0.0100'},\r\n{'tickerId': 913253761, 'exchangeId': 10, 'type': 2, 'secType': [62], 'regionId': 6, 'regionCode': 'US', 'currencyId': 247, 'name': 'Pope Resources', 'symbol': 'POPE', 'disSymbol': 'POPE', 'disExchangeCode': 'NASDAQ', 'exchangeCode': 'NAS', 'listStatus': 1, 'template': 'stock', 'status': 'D', 'close': '101.05', 'change': '3.95', 'changeRatio': '0.0407', 'pPrice': '99.90', 'pChange': '2.800', 'pChRatio': '0.0288', 'marketValue': '447370075.75', 'volume': '33138', 'turnoverRate': '0.0075'},\r\n{'tickerId': 913323560, 'exchangeId': 96, 'type': 2, 'secType': 61, 'regionId': 6, 'regionCode': 'US', 'currencyId': 247, 'name': 'Seneca Foods', 'symbol': 'SENEB', 'disSymbol': 'SENEB', 'disExchangeCode': 'NASDAQ', 'exchangeCode': 'NSQ', 'listStatus': 1, 'template': 'stock', 'status': 'D', 'close': '40.04', 'change': '3.84', 'changeRatio': '0.1061', 'marketValue': '347950039.71', 'volume': '501'},\r\n{'tickerId': 913324274, 'exchangeId': 11, 'type': 2, 'secType': 61, 'regionId': 6, 'regionCode': 'US', 'currencyId': 247, 'name': 'Resmed', 'symbol': 'RMD', 'disSymbol': 'RMD', 'disExchangeCode': 'NYSE', 'exchangeCode': 'NYSE', 'listStatus': 1, 'template': 'stock', 'derivativeSupport': 1, 'status': 'A', 'close': '159.07', 'change': '3.75', 'changeRatio': '0.0241', 'pPrice': '159.07', 'pChange': '0.0000', 'pChRatio': 
'0.0000', 'marketValue': '23004217759.29', 'volume': '1267075', 'turnoverRate': '0.0088'},\r\n{'tickerId': 913323736, 'exchangeId': 96, 'type': 2, 'secType': 61, 'regionId': 6, 'regionCode': 'US', 'currencyId': 247, 'name': 'Vertex Pharms', 'symbol': 'VRTX', 'disSymbol': 'VRTX', 'disExchangeCode': 'NASDAQ', 'exchangeCode': 'NSQ', 'listStatus': 1, 'template': 'stock', 'derivativeSupport': 1, 'status': 'A', 'close': '254.90', 'change': '3.70', 'changeRatio': '0.0147', 'pPrice': '255.00', 'pChange': '0.1000', 'pChRatio': '0.0004', 'marketValue': '66062980780.10', 'volume': '1939843', 'turnoverRate': '0.0075'},\r\n{'tickerId': 913323767, 'exchangeId': 11, 'type': 2, 'secType': 61, 'regionId': 6, 'regionCode': 'US', 'currencyId': 247, 'name': 'MCCORMICK VTG', 'symbol': 'MKC V', 'disSymbol': 'MKC-V', 'disExchangeCode': 'NYSE', 'exchangeCode': 'NYSE', 'listStatus': 1, 'template': 'stock', 'status': 'D', 'close': '159.99', 'change': '3.42', 'changeRatio': '0.0218', 'marketValue': '21262671000.00', 'volume': '432', 'turnoverRate': '0.0000'},\r\n{'tickerId': 950118595, 'exchangeId': 96, 'type': 2, 'secType': 61, 'regionId': 6, 'regionCode': 'US', 'currencyId': 247, 'name': 'ZOOM VIDEO', 'symbol': 'ZM', 'disSymbol': 'ZM', 'disExchangeCode': 'NASDAQ', 'exchangeCode': 'NSQ', 'listStatus': 1, 'template': 'stock', 'derivativeSupport': 1, 'status': 'A', 'close': '138.56', 'change': '3.39', 'changeRatio': '0.0251', 'pPrice': '138.99', 'pChange': '0.4300', 'pChRatio': '0.0031', 'marketValue': '38620532420.16', 'volume': '13786017', 'turnoverRate': '0.0495'},\r\n{'tickerId': 916040738, 'exchangeId': 11, 'type': 2, 'secType': 61, 'regionId': 6, 'regionCode': 'US', 'currencyId': 247, 'name': 'WHEATON PRECIOUS', 'symbol': 'WPM', 'disSymbol': 'WPM', 'disExchangeCode': 'NYSE', 'exchangeCode': 'NYSE', 'listStatus': 1, 'template': 'stock', 'derivativeSupport': 1, 'status': 'A', 'close': '41.10', 'change': '3.34', 'changeRatio': '0.0885', 'pPrice': '41.09', 'pChange': '-0.0100', 'pChRatio': '-0.0002', 'marketValue': '18404536146.30', 'volume': '5019137', 'turnoverRate': '0.0112'},\r\n{'tickerId': 913257174, 'exchangeId': 96, 'type': 2, 'secType': 61, 'regionId': 6, 'regionCode': 'US', 'currencyId': 247, 'name': 'Royal Gold', 'symbol': 'RGLD', 'disSymbol': 'RGLD', 'disExchangeCode': 'NASDAQ', 'exchangeCode': 'NSQ', 'listStatus': 1, 'template': 'stock', 'derivativeSupport': 1, 'status': 'A', 'close': '125.86', 'change': '3.33', 'changeRatio': '0.0272', 'pPrice': '125.86', 'pChange': '0.0000', 'pChRatio': '0.0000', 'marketValue': '8253015011.08', 'volume': '853473', 'turnoverRate': '0.0130'},\r\n{'tickerId': 913254394, 'exchangeId': 11, 'type': 2, 'secType': 61, 'regionId': 6, 'regionCode': 'US', 'currencyId': 247, 'name': 'Fortune Brand', 'symbol': 'FBHS', 'disSymbol': 'FBHS', 'disExchangeCode': 'NYSE', 'exchangeCode': 'NYSE', 'listStatus': 1, 'template': 'stock', 'derivativeSupport': 1, 'status': 'D', 'close': '51.50', 'change': '3.30', 'changeRatio': '0.0685', 'pPrice': '51.50', 'pChange': '0.0000', 'pChRatio': '0.0000', 'marketValue': '7194870278.50', 'volume': '3004021', 'turnoverRate': '0.0214'},\r\n{'tickerId': 913323312, 'exchangeId': 96, 'type': 2, 'secType': 61, 'regionId': 6, 'regionCode': 'US', 'currencyId': 247, 'name': 'Liberty Global', 'symbol': 'LBTYK', 'disSymbol': 'LBTYK', 'disExchangeCode': 'NASDAQ', 'exchangeCode': 'NSQ', 'listStatus': 1, 'template': 'stock', 'derivativeSupport': 1, 'status': 'A', 'close': '21.49', 'change': '3.18', 'changeRatio': '0.1737', 'pPrice': '21.48', 'pChange': '-0.0100', 
'pChRatio': '-0.0005', 'marketValue': '13594662302.41', 'volume': '19980228', 'turnoverRate': '0.0315'},\r\n{'tickerId': 913323882, 'exchangeId': 96, 'type': 2, 'secType': 61, 'regionId': 6, 'regionCode': 'US', 'currencyId': 247, 'name': 'Preformed Line', 'symbol': 'PLPC', 'disSymbol': 'PLPC', 'disExchangeCode': 'NASDAQ', 'exchangeCode': 'NSQ', 'listStatus': 1, 'template': 'stock', 'status': 'D', 'close': '52.82', 'change': '3.14', 'changeRatio': '0.0632', 'pPrice': '52.10', 'pChange': '-0.7200', 'pChRatio': '-0.0136', 'marketValue': '264979981.20', 'volume': '9305', 'turnoverRate': '0.0018'},\r\n{'tickerId': 913323248, 'exchangeId': 96, 'type': 2, 'secType': 61, 'regionId': 6, 'regionCode': 'US', 'currencyId': 247, 'name': 'Discovery', 'symbol': 'DISCB', 'disSymbol': 'DISCB', 'disExchangeCode': 'NASDAQ', 'exchangeCode': 'NSQ', 'listStatus': 1, 'template': 'stock', 'status': 'A', 'close': '57.95', 'change': '23.63', 'changeRatio': '0.6884', 'pPrice': '54.26', 'pChange': '-3.6900', 'pChRatio': '-0.0637', 'marketValue': '29362894177.95', 'volume': '218305', 'turnoverRate': '0.0004'},\r\n{'tickerId': 913323930, 'exchangeId': 96, 'type': 2, 'secType': 61, 'regionId': 6, 'regionCode': 'US', 'currencyId': 247, 'name': 'MercadoLibre', 'symbol': 'MELI', 'disSymbol': 'MELI', 'disExchangeCode': 'NASDAQ', 'exchangeCode': 'NSQ', 'listStatus': 1, 'template': 'stock', 'derivativeSupport': 1, 'status': 'A', 'close': '605.52', 'change': '22.01', 'changeRatio': '0.0377', 'pPrice': '603.69', 'pChange': '-1.8300', 'pChRatio': '-0.0030', 'marketValue': '30226598045.28', 'volume': '699008', 'turnoverRate': '0.0140'},\r\n{'tickerId': 913257170, 'exchangeId': 96, 'type': 2, 'secType': 61, 'regionId': 6, 'regionCode': 'US', 'currencyId': 247, 'name': 'Liberty Global', 'symbol': 'LBTYA', 'disSymbol': 'LBTYA', 'disExchangeCode': 'NASDAQ', 'exchangeCode': 'NSQ', 'listStatus': 1, 'template': 'stock', 'derivativeSupport': 1, 'status': 'A', 'close': '22.28', 'change': '2.86', 'changeRatio': '0.1473', 'pPrice': '22.29', 'pChange': '0.0100', 'pChRatio': '0.0004', 'marketValue': '14094419548.52', 'volume': '10534672', 'turnoverRate': '0.0167'},\r\n{'tickerId': 913303991, 'exchangeId': 96, 'type': 2, 'secType': 61, 'regionId': 6, 'regionCode': 'US', 'currencyId': 247, 'name': 'Liberty Brodband', 'symbol': 'LBRDK', 'disSymbol': 'LBRDK', 'disExchangeCode': 'NASDAQ', 'exchangeCode': 'NSQ', 'listStatus': 1, 'template': 'stock', 'derivativeSupport': 1, 'status': 'D', 'close': '125.44', 'change': '2.76', 'changeRatio': '0.0225', 'pPrice': '125.44', 'pChange': '0.0000', 'pChRatio': '0.0000', 'marketValue': '22817900904.96', 'volume': '926177', 'turnoverRate': '0.0042'},\r\n{'tickerId': 913257082, 'exchangeId': 96, 'type': 2, 'secType': 61, 'regionId': 6, 'regionCode': 'US', 'currencyId': 247, 'name': 'Helen of Troy', 'symbol': 'HELE', 'disSymbol': 'HELE', 'disExchangeCode': 'NASDAQ', 'exchangeCode': 'NSQ', 'listStatus': 1, 'template': 'stock', 'derivativeSupport': 1, 'status': 'D', 'close': '167.04', 'change': '2.76', 'changeRatio': '0.0168', 'pPrice': '167.04', 'pChange': '0.0000', 'pChRatio': '0.0000', 'marketValue': '4216707982.08', 'volume': '341465', 'turnoverRate': '0.0135'},\r\n{'tickerId': 913256458, 'exchangeId': 96, 'type': 2, 'secType': 61, 'regionId': 6, 'regionCode': 'US', 'currencyId': 247, 'name': 'Forrester', 'symbol': 'FORR', 'disSymbol': 'FORR', 'disExchangeCode': 'NASDAQ', 'exchangeCode': 'NSQ', 'listStatus': 1, 'template': 'stock', 'derivativeSupport': 1, 'status': 'D', 'close': '33.88', 'change': '2.58', 
'changeRatio': '0.0824', 'marketValue': '635419400.00', 'volume': '85115', 'turnoverRate': '0.0045'},\r\n{'tickerId': 950158952, 'exchangeId': 95, 'type': 2, 'regionId': 6, 'regionCode': 'US', 'currencyId': 247, 'name': 'LYRA THERAPEUTICS, INC.', 'symbol': 'LYRA', 'disSymbol': 'LYRA', 'disExchangeCode': 'NASDAQ', 'exchangeCode': 'NMS', 'listStatus': 1, 'template': 'ipo', 'status': 'A', 'close': '18.56', 'change': '2.56', 'changeRatio': '0.1600', 'pPrice': '18.96', 'pChange': '0.4000', 'pChRatio': '0.0216', 'marketValue': '229705575.68', 'volume': '1738472', 'turnoverRate': '0.1405'},\r\n{'tickerId': 913257570, 'exchangeId': 96, 'type': 2, 'secType': 61, 'regionId': 6, 'regionCode': 'US', 'currencyId': 247, 'name': 'Bio-Techne', 'symbol': 'TECH', 'disSymbol': 'TECH', 'disExchangeCode': 'NASDAQ', 'exchangeCode': 'NSQ', 'listStatus': 1, 'template': 'stock', 'derivativeSupport': 1, 'status': 'A', 'close': '227.54', 'change': '2.54', 'changeRatio': '0.0113', 'pPrice': '227.54', 'pChange': '0.0000', 'pChRatio': '0.0000', 'marketValue': '8726538309.18', 'volume': '497006', 'turnoverRate': '0.0130'},\r\n{'tickerId': 913323246, 'exchangeId': 96, 'type': 2, 'secType': 61, 'regionId': 6, 'regionCode': 'US', 'currencyId': 247, 'name': 'Bel Fuse', 'symbol': 'BELFB', 'disSymbol': 'BELFB', 'disExchangeCode': 'NASDAQ', 'exchangeCode': 'NSQ', 'listStatus': 1, 'template': 'stock', 'derivativeSupport': 1, 'status': 'D', 'close': '9.99', 'change': '2.53', 'changeRatio': '0.3391', 'pPrice': '9.75', 'pChange': '-0.2400', 'pChRatio': '-0.0240', 'marketValue': '122562454.86', 'volume': '177634', 'turnoverRate': '0.0145'},\r\n{'tickerId': 916040647, 'exchangeId': 11, 'type': 2, 'secType': 61, 'regionId': 6, 'regionCode': 'US', 'currencyId': 247, 'name': 'Agnico Eagle', 'symbol': 'AEM', 'disSymbol': 'AEM', 'disExchangeCode': 'NYSE', 'exchangeCode': 'NYSE', 'listStatus': 1, 'template': 'stock', 'derivativeSupport': 1, 'status': 'A', 'close': '61.20', 'change': '2.52', 'changeRatio': '0.0429', 'pPrice': '61.10', 'pChange': '-0.1000', 'pChRatio': '-0.0016', 'marketValue': '14739911553.60', 'volume': '2820765', 'turnoverRate': '0.0117'},\r\n{'tickerId': 913303768, 'exchangeId': 12, 'type': 2, 'secType': 61, 'regionId': 6, 'regionCode': 'US', 'currencyId': 247, 'name': 'CHASE CORP', 'symbol': 'CCF', 'disSymbol': 'CCF', 'disExchangeCode': 'AMEX', 'exchangeCode': 'ASE', 'listStatus': 1, 'template': 'stock', 'status': 'D', 'close': '96.71', 'change': '2.45', 'changeRatio': '0.0260', 'marketValue': '916799598.60', 'volume': '29229', 'turnoverRate': '0.0031'},\r\n{'tickerId': 913324557, 'exchangeId': 11, 'type': 2, 'secType': 61, 'regionId': 6, 'regionCode': 'US', 'currencyId': 247, 'name': 'Allergan', 'symbol': 'AGN', 'disSymbol': 'AGN', 'disExchangeCode': 'NYSE', 'exchangeCode': 'NYSE', 'listStatus': 1, 'template': 'stock', 'derivativeSupport': 1, 'status': 'A', 'close': '189.74', 'change': '2.40', 'changeRatio': '0.0128', 'pPrice': '189.76', 'pChange': '0.0200', 'pChRatio': '0.0001', 'marketValue': '62424842326.10', 'volume': '5787032', 'turnoverRate': '0.0176'},\r\n{'tickerId': 913324566, 'exchangeId': 11, 'type': 2, 'secType': 61, 'regionId': 6, 'regionCode': 'US', 'currencyId': 247, 'name': 'West Pharm Svc', 'symbol': 'WST', 'disSymbol': 'WST', 'disExchangeCode': 'NYSE', 'exchangeCode': 'NYSE', 'listStatus': 1, 'template': 'stock', 'derivativeSupport': 1, 'status': 'D', 'close': '191.64', 'change': '2.38', 'changeRatio': '0.0126', 'pPrice': '191.64', 'pChange': '0.0000', 'pChRatio': '0.0000', 'marketValue': 
'14078267117.08', 'volume': '352460', 'turnoverRate': '0.0042'}\r\n]\r\n\r\ndb = su.Database(f\"overnight hold.db\" )\r\ndb['active'].insert_all(data)\r\n\r\n--------------- file ends ----------------------", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 610517472, "label": "sqlite3.OperationalError: too many SQL variables in insert_all when using rows with varying numbers of columns"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/456#issuecomment-661524006", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/456", "id": 661524006, "node_id": "MDEyOklzc3VlQ29tbWVudDY2MTUyNDAwNg==", "user": {"value": 32467826, "label": "abeyerpath"}, "created_at": "2020-07-21T01:15:07Z", "updated_at": "2020-07-21T01:15:07Z", "author_association": "CONTRIBUTOR", "body": "Bumping this, as the previous fix is passing the wrong type, and not actually addressing the issue...\r\n\r\nThe `exclude` argument needs an iterable of packages instead of a single string (but since `str` is iterable, it's currently excluding packages `t`, `e`, and `s`.)", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 442327592, "label": "Installing installs the tests package"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/swarm-to-sqlite/pull/10#issuecomment-707326192", "issue_url": "https://api.github.com/repos/dogsheep/swarm-to-sqlite/issues/10", "id": 707326192, "node_id": "MDEyOklzc3VlQ29tbWVudDcwNzMyNjE5Mg==", "user": {"value": 29426418, "label": "mattiaborsoi"}, "created_at": "2020-10-12T20:20:02Z", "updated_at": "2020-10-12T20:20:02Z", "author_association": "CONTRIBUTOR", "body": "This closes issue #8 ", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 719637258, "label": "Update utils.py to fix sqlite3.OperationalError"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/1313#issuecomment-829352402", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1313", "id": 829352402, "node_id": "MDEyOklzc3VlQ29tbWVudDgyOTM1MjQwMg==", "user": {"value": 27856297, "label": "dependabot-preview[bot]"}, "created_at": "2021-04-29T15:47:23Z", "updated_at": "2021-04-29T15:47:23Z", "author_association": "CONTRIBUTOR", "body": "This pull request will no longer be automatically closed when a new version is found as this pull request was created by Dependabot Preview and this repo is using a `version: 2` config file. 
You can close this pull request and let Dependabot re-create it the next time it checks for updates.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 871046111, "label": "Bump black from 20.8b1 to 21.4b2"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/1311#issuecomment-829260725", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1311", "id": 829260725, "node_id": "MDEyOklzc3VlQ29tbWVudDgyOTI2MDcyNQ==", "user": {"value": 27856297, "label": "dependabot-preview[bot]"}, "created_at": "2021-04-29T13:58:08Z", "updated_at": "2021-04-29T13:58:08Z", "author_association": "CONTRIBUTOR", "body": "Superseded by #1313.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 870227815, "label": "Bump black from 20.8b1 to 21.4b1"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/1309#issuecomment-828679943", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1309", "id": 828679943, "node_id": "MDEyOklzc3VlQ29tbWVudDgyODY3OTk0Mw==", "user": {"value": 27856297, "label": "dependabot-preview[bot]"}, "created_at": "2021-04-28T18:26:03Z", "updated_at": "2021-04-28T18:26:03Z", "author_association": "CONTRIBUTOR", "body": "Superseded by #1311.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 869237023, "label": "Bump black from 20.8b1 to 21.4b0"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/952#issuecomment-686061028", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/952", "id": 686061028, "node_id": "MDEyOklzc3VlQ29tbWVudDY4NjA2MTAyOA==", "user": {"value": 27856297, "label": "dependabot-preview[bot]"}, "created_at": "2020-09-02T22:26:14Z", "updated_at": "2020-09-02T22:26:14Z", "author_association": "CONTRIBUTOR", "body": "Looks like black is up-to-date now, so this is no longer needed.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 687245650, "label": "Update black requirement from ~=19.10b0 to >=19.10,<21.0"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/730#issuecomment-623463200", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/730", "id": 623463200, "node_id": "MDEyOklzc3VlQ29tbWVudDYyMzQ2MzIwMA==", "user": {"value": 27856297, "label": "dependabot-preview[bot]"}, "created_at": "2020-05-04T13:27:22Z", "updated_at": "2020-05-04T13:27:22Z", "author_association": "CONTRIBUTOR", "body": "Superseded by #753.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 604001627, "label": "Update pytest-asyncio requirement from ~=0.10.0 to >=0.10,<0.12"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/github-to-sqlite/issues/51#issuecomment-770150526", "issue_url": "https://api.github.com/repos/dogsheep/github-to-sqlite/issues/51", "id": 770150526, "node_id": "MDEyOklzc3VlQ29tbWVudDc3MDE1MDUyNg==", "user": {"value": 22578954, "label": "daniel-butler"}, "created_at": 
"2021-01-30T03:44:19Z", "updated_at": "2021-01-30T03:47:24Z", "author_association": "CONTRIBUTOR", "body": "I don't have much experience with github's rate limiting. In my day job we use the [tenacity library](https://github.com/jd/tenacity) to handle http errors we get.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 703246031, "label": "github-to-sqlite should handle rate limits better"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/github-to-sqlite/issues/60#issuecomment-770112248", "issue_url": "https://api.github.com/repos/dogsheep/github-to-sqlite/issues/60", "id": 770112248, "node_id": "MDEyOklzc3VlQ29tbWVudDc3MDExMjI0OA==", "user": {"value": 22578954, "label": "daniel-butler"}, "created_at": "2021-01-30T00:01:03Z", "updated_at": "2021-01-30T01:14:42Z", "author_association": "CONTRIBUTOR", "body": "Yes that would be cool! I wouldn't mind helping. Is this the meat of it? https://github.com/dogsheep/twitter-to-sqlite/blob/21fc1cad6dd6348c67acff90a785b458d3a81275/twitter_to_sqlite/utils.py#L512\r\n\r\nIt looks like the cli option is added with this decorator : https://github.com/dogsheep/twitter-to-sqlite/blob/21fc1cad6dd6348c67acff90a785b458d3a81275/twitter_to_sqlite/cli.py#L14\r\n\r\nI looked a bit at utils.py in the GitHub repository. I was surprised at the amount of manual mapping of the API response you had to do to get this to work.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 797097140, "label": "Use Data from SQLite in other commands"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/github-to-sqlite/issues/60#issuecomment-770069864", "issue_url": "https://api.github.com/repos/dogsheep/github-to-sqlite/issues/60", "id": 770069864, "node_id": "MDEyOklzc3VlQ29tbWVudDc3MDA2OTg2NA==", "user": {"value": 22578954, "label": "daniel-butler"}, "created_at": "2021-01-29T21:52:05Z", "updated_at": "2021-02-12T18:29:43Z", "author_association": "CONTRIBUTOR", "body": "For the purposes below I am assuming the organization I would get all the repositories and their related commits from is called `gh-organization`. The github's owner id of gh-orgnization is `123456789`.\r\n\r\n```bash\r\ngithub-to-sqlite repos github.db gh-organization\r\n```\r\n\r\nI'm on a windows computer running git bash to be able to use the `|` command. 
This works for me\r\n```bash\r\nsqlite3 github.db \"SELECT full_name FROM repos WHERE owner = '123456789';\" | tr '\\n\\r' ' ' | xargs | { read repos; github-to-sqlite commits github.db $repos; }\r\n```\r\n\r\nOn a pure linux system I think this would work because the new line character is normally `\\n`\r\n```bash\r\nsqlite3 github.db \"SELECT full_name FROM repos WHERE owner = '123456789';\" | tr '\\n' ' ' | xargs | { read repos; github-to-sqlite commits github.db $repos; }`\r\n```\r\n\r\nAs expected I ran into rate limit issues #51 \r\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 797097140, "label": "Use Data from SQLite in other commands"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/pull/604#issuecomment-1843975536", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/604", "id": 1843975536, "node_id": "IC_kwDOCGYnMM5t6NVw", "user": {"value": 16437338, "label": "tkhattra"}, "created_at": "2023-12-07T01:17:05Z", "updated_at": "2023-12-07T01:17:05Z", "author_association": "CONTRIBUTOR", "body": "Apologies - I pushed a fix that addresses the mypy failures.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 2001006157, "label": "Add more STRICT table support"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/344#issuecomment-1815825863", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/344", "id": 1815825863, "node_id": "IC_kwDOCGYnMM5sO03H", "user": {"value": 16437338, "label": "tkhattra"}, "created_at": "2023-11-17T06:44:49Z", "updated_at": "2023-11-17T06:44:49Z", "author_association": "CONTRIBUTOR", "body": "hello Simon,\r\n\r\nI've added more STRICT table support per https://github.com/simonw/sqlite-utils/issues/344#issuecomment-982014776 in changeset https://github.com/simonw/sqlite-utils/commit/e4b9b582cdb4e48430865f8739f341bc8017c1e4.\r\nIt also fixes table.transform() to preserve STRICT mode.\r\nPlease pull if you deem appropriate. Thanks!", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1066474200, "label": "Support STRICT tables"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/2199#issuecomment-1760401731", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/2199", "id": 1760401731, "node_id": "IC_kwDOBm6k_c5o7ZlD", "user": {"value": 15178711, "label": "asg017"}, "created_at": "2023-10-12T21:41:42Z", "updated_at": "2023-10-12T21:41:42Z", "author_association": "CONTRIBUTOR", "body": "I dig it - I was thinking an Observable notebook where you paste your `metadata.json`/`metadata.yaml` and it would generate the new metadata + datasette.yaml files, but an extensible `datasette upgrade` plugin would be nice for future plugins.\r\n\r\nOne thing to think about: If someone has comments in their original `metadata.yaml`, could we preserve them in the new files? 
tbh maybe not too important bc if people cared that much they could just copy + paste, and it might be too distracting \r\n\r\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1940346034, "label": "Detailed upgrade instructions for metadata.yaml -> datasette.yaml"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/2190#issuecomment-1729961503", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/2190", "id": 1729961503, "node_id": "IC_kwDOBm6k_c5nHR4f", "user": {"value": 15178711, "label": "asg017"}, "created_at": "2023-09-21T16:56:57Z", "updated_at": "2023-09-21T16:56:57Z", "author_association": "CONTRIBUTOR", "body": "TODO: add similar checks for permissions/allow/canned queries", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1901483874, "label": "Raise an exception if a \"plugins\" block exists in metadata.json"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/2188#issuecomment-1722848454", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/2188", "id": 1722848454, "node_id": "IC_kwDOBm6k_c5msJTG", "user": {"value": 15178711, "label": "asg017"}, "created_at": "2023-09-18T06:58:53Z", "updated_at": "2023-09-18T06:58:53Z", "author_association": "CONTRIBUTOR", "body": "Thinking about this more, here's a list of things I imagine a \"compile-to-sql\" plugin would want to do:\r\n\r\n1. Attach itself to the SQL code editor (switch from SQL -> PRQL/Logica, additional syntax highlighting)\r\n2. Add \"Query using PRQL\" buttons in various parts of Datasette's UI, like `/dbname` page\r\n3. Use `$LANGUAGE=` instead of `sql=` in the JSON API and the SQL results pages\r\n4. Have their own dedicated code editor page\r\n\r\n\r\n1) and 2) would be difficult to do with current plugin hooks, unless we add the concept of \"slots\" and get the JS plugin support in. 3) could maybe be done with the [`asgi_wrapper(datasette)`](https://docs.datasette.io/en/stable/plugin_hooks.html#asgi-wrapper-datasette) hook? And 4) can be done easily with the `register_routes()` hooks. \r\n\r\nSo it really only sounds like extending the SQL editor will be the hard part. In #2094 I want to add JavaScript plugin hooks for extending the SQL editor, which may work here. \r\n\r\nIf I get the time/motivation, I might try out a `datasette-prql` extension, just because I like playing with it. 
It'd be really cool if I can get the `asgi_wrapper()` hook to work right there...", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1900026059, "label": "Plugin Hooks for \"compile to SQL\" languages"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1191#issuecomment-1722845490", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1191", "id": 1722845490, "node_id": "IC_kwDOBm6k_c5msIky", "user": {"value": 15178711, "label": "asg017"}, "created_at": "2023-09-18T06:55:52Z", "updated_at": "2023-09-18T06:55:52Z", "author_association": "CONTRIBUTOR", "body": "One note here: this feature could be called \"slots\", similar to [Layout Slots](https://vitepress.dev/guide/extending-default-theme#layout-slots) in Vitepress.\r\n\r\nIn Vitepress, you can add custom components/widget/gadgets into determined named \"slots\", like so:\r\n\r\n```\r\ndoc-top\r\ndoc-bottom\r\ndoc-footer-before\r\ndoc-before\r\ndoc-after\r\n...\r\n```\r\n\r\nWould be great to do in both Python and Javascript, with the upcoming JavaScript API #2052. In `datasette-write-ui`, all we do is add a few \"Insert row\" and \"edit this row\" buttons and that required completely capturing the `table.html` template, which isn't great for other plugins. But having \"slots\" like `table-footer-before` or `table-row-id` or something would be great to work with.\r\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 787098345, "label": "Ability for plugins to collaborate when adding extra HTML to blocks in default templates"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/2183#issuecomment-1716801971", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/2183", "id": 1716801971, "node_id": "IC_kwDOBm6k_c5mVFGz", "user": {"value": 15178711, "label": "asg017"}, "created_at": "2023-09-13T01:34:01Z", "updated_at": "2023-09-13T01:34:01Z", "author_association": "CONTRIBUTOR", "body": "@simonw docs are finished, this is ready for review!\r\n\r\nOne thing: I added \"Configuration\" as a top-level item in the documentation site, at the very bottom. Not sure if this is the best, maybe it can be named \"datasette.yaml Configuration\" or something similar?\r\n\r\nMostly because \"Configuration\" by itself can mean many things, but adding \"datasette.yaml\" would make it pretty clear it's about that specific file, and is easier to scan. I'd also be fine with using \"datasette.yaml\" instead of \"datasette.json\", since writing in YAML is much more forgiving (and advanced users will know JSON is also supported)\r\n\r\nAlso, maybe this is a chance to consolidate the docs a bit? I think \"Settings\", \"Configuration\", \"Metadata\", and \"Authentication and permissions\" should possibly be under the same section. 
Maybe even consolidate the different Plugin pages that exist?\r\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1891212159, "label": "`datasette.yaml` plugin support"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/2157#issuecomment-1700291967", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/2157", "id": 1700291967, "node_id": "IC_kwDOBm6k_c5lWGV_", "user": {"value": 15178711, "label": "asg017"}, "created_at": "2023-08-31T02:45:56Z", "updated_at": "2023-08-31T02:45:56Z", "author_association": "CONTRIBUTOR", "body": "@simonw what do you think about adding a `DATASETTE_INTERNAL_DB_PATH` env variable which, when defined, is the default location of the internal DB? This means when the `--internal` flag is NOT provided, Datasette would check to see if `DATASETTE_INTERNAL_DB_PATH` exists, and if so, use that as the internal database (otherwise it would fall back to an ephemeral memory database)\r\n\r\nMy rationale: some plugins may require, or strongly encourage, a persistent internal database (`datasette-comments`, `datasette-bookmarks`, `datasette-link-shortener`, etc.). However, for users that have a global installation of Datasette (say from `brew install` or a global `pip install`), it would be annoying having to specify `--internal` every time. So instead, they can just add `export DATASETTE_INTERNAL_DB_PATH=\"/path/to/internal.db\"` to their bashrc/zshrc/wherever to not have to worry about `--internal`", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1865869205, "label": "Proposal: Make the `_internal` database persistent, customizable, and hidden"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/2093#issuecomment-1688532012", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/2093", "id": 1688532012, "node_id": "IC_kwDOBm6k_c5kpPQs", "user": {"value": 15178711, "label": "asg017"}, "created_at": "2023-08-22T16:21:40Z", "updated_at": "2023-08-22T16:21:40Z", "author_association": "CONTRIBUTOR", "body": "OK Here's the gameplan for this, which is closely tied to #2143:\r\n\r\n- We will add a new `datasette.json`/`datasette.yaml` configuration file to datasette, which combines settings/plugin config/permissions/canned queries into a new file format\r\n- Metadata will NOT be a part of this file\r\n- TOML support is not planned, but maybe we can create a separate issue for supporting TOML alongside JSON/YAML\r\n- The `settings.json` file will be deprecated, and the `--config` arg will be brought back.\r\n- Command line arguments can still be used to overwrite values (ex `--setting` will overwrite settings in `datasette.yaml`)\r\n\r\nThe format of `datasette.json` will follow what Simon listed here: https://github.com/simonw/datasette/issues/2143#issuecomment-1684484426\r\n\r\nHere's the current implementation plan:\r\n\r\n1. Add a new `--config` flag and port over `\"settings\"` into a new datasette.json config file, remove settings.json\r\n2. Add top-level plugin config support to `datasette.json`\r\n3. Figure out database/table structure of config `datasette.json`\r\n4. Port over database/table level plugin config support to `datasette.json`\r\n5. Port over permissions/auth settings to `datasette.json`\r\n6. 
Deprecate non-metadata values in `metadata.json`", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1781530343, "label": "Proposal: Combine settings, metadata, static, etc. into a single `datasette.yaml` File"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/2145#issuecomment-1686745094", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/2145", "id": 1686745094, "node_id": "IC_kwDOBm6k_c5kibAG", "user": {"value": 15178711, "label": "asg017"}, "created_at": "2023-08-21T17:30:01Z", "updated_at": "2023-08-21T17:30:01Z", "author_association": "CONTRIBUTOR", "body": "Another point: The new Datasette write API should refuse to insert a row with a NULL primary key. That will likely decrease the likelihood someone find themselves with NULLs in their primary keys, at least with Datasette users. Especially buggy code that uses the write API, like our `datasette-write-ui` bug that led to this issue.\r\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1857234285, "label": "If a row has a primary key of `null` various things break"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/2143#issuecomment-1684496274", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/2143", "id": 1684496274, "node_id": "IC_kwDOBm6k_c5kZ1-S", "user": {"value": 15178711, "label": "asg017"}, "created_at": "2023-08-18T22:30:45Z", "updated_at": "2023-08-18T22:30:45Z", "author_association": "CONTRIBUTOR", "body": "> That said, I do really like a bias towards settings that can be changed at runtime\r\n\r\nDoes this include things like `--settings` values or plugin config? I can totally see being able to update metadata without restarting, but not sure if that would work well with `--setting`, plugin config, or auth/permissions stuff. \r\n\r\nWell it could work with `--setting` and auth/permissions, with a lot of core changes. But changing plugin config on the fly could be challenging, for plugin authors. ", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1855885427, "label": "De-tangling Metadata before Datasette 1.0"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/2143#issuecomment-1684205563", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/2143", "id": 1684205563, "node_id": "IC_kwDOBm6k_c5kYu_7", "user": {"value": 15178711, "label": "asg017"}, "created_at": "2023-08-18T17:12:54Z", "updated_at": "2023-08-18T17:12:54Z", "author_association": "CONTRIBUTOR", "body": "Another option would be, instead of flat `datasette.json`/`datasette.yaml` files, we could instead use a Python file, like `datasette_config.py`. That way one could dynamically generate config (ex dev vs prod, auto-discover credentials, etc.). 
Kinda like Django settings.\r\n\r\nThough I imagine Python imports might make this complex to do, and json/yaml is already supported and pretty easy to write\r\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1855885427, "label": "De-tangling Metadata before Datasette 1.0"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/2143#issuecomment-1684202932", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/2143", "id": 1684202932, "node_id": "IC_kwDOBm6k_c5kYuW0", "user": {"value": 15178711, "label": "asg017"}, "created_at": "2023-08-18T17:10:21Z", "updated_at": "2023-08-18T17:10:21Z", "author_association": "CONTRIBUTOR", "body": "I agree with all your points!\r\n\r\nI think the best solution would be having a `datasette.json` config file, where you \"configure\" your datasette instances, with settings, permissions/auth, plugin configuration, and table settings (sortable column, label columns, etc.). Which #2093 would do.\r\n\r\nThen optionally, you have a `metadata.json`, or use `datasette_metadata`, or some other plugin to define metadata (ex the future [sqlite-docs](https://github.com/asg017/sqlite-docs) plugin).\r\n\r\nEverything in `datasette.json` could also be overwritten by CLI flags, like `--setting key value`, `--plugin xxxx key value`.\r\n\r\nWe could even completely remove `settings.json` in favor or just `datasette.json`. Mostly because I think the less files the better, especially if they have generic names like `settings.json` or `config.json`. ", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1855885427, "label": "De-tangling Metadata before Datasette 1.0"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/2104#issuecomment-1641082395", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/2104", "id": 1641082395, "node_id": "IC_kwDOBm6k_c5h0O4b", "user": {"value": 15178711, "label": "asg017"}, "created_at": "2023-07-18T22:41:37Z", "updated_at": "2023-07-18T22:41:37Z", "author_association": "CONTRIBUTOR", "body": "For filtering virtual table's \"shadow tables\" (ex the FTS5 _content and most the spatialite tables), you can use `pragma_table_list` (first appeared in SQLite 3.37 (2021-11-27), which has a `type` column that calls out `type=\"shadow\"` tables https://www.sqlite.org/pragma.html#pragma_table_list", "reactions": "{\"total_count\": 1, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 1}", "issue": {"value": 1808215339, "label": "Tables starting with an underscore should be treated as hidden"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/567#issuecomment-1638910473", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/567", "id": 1638910473, "node_id": "IC_kwDOCGYnMM5hr8oJ", "user": {"value": 15178711, "label": "asg017"}, "created_at": "2023-07-17T21:27:41Z", "updated_at": "2023-07-17T21:27:41Z", "author_association": "CONTRIBUTOR", "body": "Another use-case: I want to make a `sqlite-utils` plugin that'll help me insert data into Datasette. 
\r\n\r\n```bash\r\nsqlite-utils insert-datasette \\\r\n --token $DATASETTE_API_KEY \\\r\n https://latest.datasette.io/fixtures/my-table \\\r\n 'select ...'\r\n```\r\n\r\n\r\nThis could also be a datasette plugin (ex `datasette upload-data ...`, but you can also think of `sqlite-utils` plugins that upload to S3, a postgres table, other DBMS's, etc.)", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1801394744, "label": "Plugin system"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/2087#issuecomment-1616853644", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/2087", "id": 1616853644, "node_id": "IC_kwDOBm6k_c5gXzqM", "user": {"value": 15178711, "label": "asg017"}, "created_at": "2023-07-02T22:00:48Z", "updated_at": "2023-07-02T22:00:48Z", "author_association": "CONTRIBUTOR", "body": "I just saw in the docs that Datasette auto-detects `settings.json`:\r\n\r\n> settings.json - settings that would normally be passed using --setting - here they should be stored as a JSON object of key/value pairs\r\n> [*Source*](https://docs.datasette.io/en/stable/settings.html#:~:text=settings.json%20%2D%20settings%20that%20would%20normally%20be%20passed%20using%20%2D%2Dsetting%20%2D%20here%20they%20should%20be%20stored%20as%20a%20JSON%20object%20of%20key/value%20pairs)", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1765870617, "label": "`--settings settings.json` option"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/2093#issuecomment-1616286848", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/2093", "id": 1616286848, "node_id": "IC_kwDOBm6k_c5gVpSA", "user": {"value": 15178711, "label": "asg017"}, "created_at": "2023-07-02T02:17:46Z", "updated_at": "2023-07-02T02:17:46Z", "author_association": "CONTRIBUTOR", "body": "Storing metadata in the database won't be required. I imagine there'll be many different ways to store metadata, including any possible `datasette_metadata` or sqlite-docs, or the older metadata.json way. \r\n\r\nThe next question will be how precedence should work - I'd imagine metadata.json > plugins > datasette_metadata > sqlite-docs", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1781530343, "label": "Proposal: Combine settings, metadata, static, etc. into a single `datasette.yaml` File"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/2052#issuecomment-1616095810", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/2052", "id": 1616095810, "node_id": "IC_kwDOBm6k_c5gU6pC", "user": {"value": 15178711, "label": "asg017"}, "created_at": "2023-07-01T20:31:31Z", "updated_at": "2023-07-01T20:31:31Z", "author_association": "CONTRIBUTOR", "body": "> Just curious, is there a query that can be used to compile this programmatically, or did you identify these through memory?\r\n\r\nI just did a github search for `user:simonw \"def extra_js_urls(\"` ! 
Though I'm sure other plugins made by people other than Simon also exist out there https://github.com/search?q=user%3Asimonw+%22def+extra_js_urls%28%22&type=code", "reactions": "{\"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1651082214, "label": "feat: Javascript Plugin API (Custom panels, column menu items with JS actions)"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/2093#issuecomment-1613896210", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/2093", "id": 1613896210, "node_id": "IC_kwDOBm6k_c5gMhoS", "user": {"value": 15178711, "label": "asg017"}, "created_at": "2023-06-29T22:53:33Z", "updated_at": "2023-06-29T22:53:33Z", "author_association": "CONTRIBUTOR", "body": "Maybe we can have a separate issue for revamping `metadata.json`? A `datasette_metadata` table or the `sqlite-docs` extension seem like two reasonable additions that we can work through. Storing metadata inside a SQLite database makes sense, but I don't think storing `datasette.*` style config (ex ports, settings, etc.) inside a SQLite DB makes sense, since it's very environment-dependent", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1781530343, "label": "Proposal: Combine settings, metadata, static, etc. into a single `datasette.yaml` File"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/2093#issuecomment-1613895188", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/2093", "id": 1613895188, "node_id": "IC_kwDOBm6k_c5gMhYU", "user": {"value": 15178711, "label": "asg017"}, "created_at": "2023-06-29T22:51:53Z", "updated_at": "2023-06-29T22:51:53Z", "author_association": "CONTRIBUTOR", "body": "I agree with not liking `metadata.json` stuff in a `datasette.*` config file. Editing description of a table/column in a file like `datasette.*` seems odd to me. \r\n\r\nThough since plugin configuration currently lives in `metadata.json`, I think it should be removed from there and placed in `datasette.*`, at least for top-level config like `datasette-auth-github`'s config. Keeping `metadata.json` strictly for documentation/licensing/column units makes sense to me, but anything plugin related should be in some config file, like `datasette.*`.\r\n\r\nAnd ya, supporting both `datasette.*` and CLI flags makes a lot of sense to me. Any `--setting` flag should override anything in `datasette.*` for easier debugging, with possibly a warning message so people don't get confused. Same with `--port` and a port defined in `datasette.*`", "reactions": "{\"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1781530343, "label": "Proposal: Combine settings, metadata, static, etc. 
into a single `datasette.yaml` File"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/2052#issuecomment-1613778296", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/2052", "id": 1613778296, "node_id": "IC_kwDOBm6k_c5gME14", "user": {"value": 15178711, "label": "asg017"}, "created_at": "2023-06-29T20:36:09Z", "updated_at": "2023-06-29T20:36:09Z", "author_association": "CONTRIBUTOR", "body": "Ok @hydrosquall a couple things before this PR should be good to go:\r\n\r\n- Can we move `datasette/static/table-example-plugins.js` into `demos/plugins/static`?\r\n- For `datasetteManager.VERSION`, can we fill that in or just comment it out for now? Not sure how difficult it'll be to inject it server-side. I imagine we could also have a small build process with esbuild/rollup that just injects a version string into `manager.js` directly, so we don't have to worry about server-rendering (but that can be a future PR)\r\n\r\nIn terms of how to integrate this into Datasette, a few options I can see working:\r\n\r\n- Push this as-is and figure it out before the next release\r\n- Hide this feature behind a settings flag (`--setting unstable-js-plugins on`) and use that setting to hide/show `` in `base.html`\r\n\r\nI'll let @simonw decide which one to work with. I kind of like the idea of having an \"unstable\" opt-in process to enable JS plugins, to give us time to try it out with a wide variety of plugins until we feel it's ready.\r\n\r\nI'm also curious to see how \"plugins for a plugin\" would work, like #1542. For example, if the leaflet plugin showed default markers, but also included its own hook for other plugins to add more markers/styling. I imagine that the individual plugin would re-create their own plugin system compared to this, since handling \"plugins of plugins\" at the top with Datasette seems really convoluted. 
\r\n\r\nAlso for posterity, here's a list of Simon's Datasette plugins that use \"extra_js_urls()\", which probably means they can be ported/re-written to use this new plugin system:\r\n\r\n- [`datasette-vega`](https://github.com/simonw/datasette-vega/blob/00de059ab1ef77394ba9f9547abfacf966c479c4/datasette_vega/__init__.py#L25)\r\n- [`datasette-cluster-map`](https://github.com/simonw/datasette-cluster-map/blob/795d25ad9ff6cba0307191f44fecc8f8070bef5c/datasette_cluster_map/__init__.py#L14)\r\n- [`datasette-leaflet-geojson`](https://github.com/simonw/datasette-leaflet-geojson/blob/64713aa497750400b9ac2c12e8bb6ffab8eb77f3/datasette_leaflet_geojson/__init__.py#L47)\r\n- [`datasette-pretty-traces`](https://github.com/simonw/datasette-pretty-traces/blob/5219d65eca3d7d7a73bb9d3120df42fe046a1315/datasette_pretty_traces/__init__.py#L5)\r\n- [`datasette-youtube-embed`](https://github.com/simonw/datasette-youtube-embed/blob/4b4a0d7e58ebe15f47e9baf68beb9908c1d899da/datasette_youtube_embed/__init__.py#L55)\r\n- [`datasette-leaflet-freedraw`](https://github.com/simonw/datasette-leaflet-freedraw/blob/8f28c2c2080ec9d29f18386cc6a2573a1c8fbde7/datasette_leaflet_freedraw/__init__.py#L66)\r\n- [`datasette-hovercards`](https://github.com/simonw/datasette-hovercards/blob/9439ba46b7140fb03223faff0d21aeba5615a287/datasette_hovercards/__init__.py#L5)\r\n- [`datasette-mp3-audio`](https://github.com/simonw/datasette-mp3-audio/blob/4402168792f452a46ab7b488e40ec49cd4b12185/datasette_mp3_audio/__init__.py#L6)\r\n- [`datasette-geojson-map`](https://github.com/simonw/datasette-geojson-map/blob/32af5f1fd1a07278bbf8071fbb20a61e0f613246/datasette_geojson_map/__init__.py#L30)", "reactions": "{\"total_count\": 1, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 1}", "issue": {"value": 1651082214, "label": "feat: Javascript Plugin API (Custom panels, column menu items with JS actions)"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/2052#issuecomment-1606352600", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/2052", "id": 1606352600, "node_id": "IC_kwDOBm6k_c5fvv7Y", "user": {"value": 15178711, "label": "asg017"}, "created_at": "2023-06-26T00:17:04Z", "updated_at": "2023-06-26T00:17:04Z", "author_association": "CONTRIBUTOR", "body": ":wave: would love to see this get merged soon! I want to make a javascript plugin on top of the code-mirror editor to make a few things nicer (function auto-complete, table/column descriptions, etc.), and this would help out a bunch", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1651082214, "label": "feat: Javascript Plugin API (Custom panels, column menu items with JS actions)"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1884#issuecomment-1321460293", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1884", "id": 1321460293, "node_id": "IC_kwDOBm6k_c5Ow-JF", "user": {"value": 15178711, "label": "asg017"}, "created_at": "2022-11-21T04:40:55Z", "updated_at": "2022-11-21T04:40:55Z", "author_association": "CONTRIBUTOR", "body": "Counting any virtual tables can be pretty tricky. On one hand, counting a [CSV virtual table](https://www.sqlite.org/csv.html) would return the number of rows in the CSV, which is helpful (but can be I/O intensive). 
Counting a [FTS5 virtual table](https://www.sqlite.org/fts5.html) would return the number of entries in the FTS index, which is kind of helpful, but can be misleading in some cases.\r\n\r\nOn the other hand, arbitrarily running `COUNT(*)` on some virtual tables can be incredibly expensive. SQLite offers no shortcuts/pushdowns on `COUNT(*)` queries for virtual tables, and instead calls the underlying vtab implementation and iterates through all rows in the table without discretion. For example, a virtual table that's backed by a Postgres table would call `select * from pg_table`, which would use up a lot of network and CPU calls. Or a virtual table backed by a [google sheet](https://github.com/0x6b/libgsqlite) would make network/API requests to get all the rows from the sheet just to make a count.\r\n\r\nThe [`pragma_table_list`](https://www.sqlite.org/pragma.html#pragma_table_list) pragma tells you when a table is a regular table or virtual (in the `type` column), but was only added in version 3.37.0 (2021-11-27). \r\n\r\n\r\nPersonally, I wouldn't try to `COUNT(*)` virtual tables - it depends on how the virtual table is implemented, it requires that the connection has the proper extensions loaded, and it may accidentally cause perf issues for new-age extensions. A few extensions that I'm writing have virtual tables that wouldn't benefit much from `COUNT(*)`, and the fact that SQLite iterates through all rows in a table to count just makes things worse. ", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1439009231, "label": "Exclude virtual tables from datasette inspect"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1851#issuecomment-1292519956", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1851", "id": 1292519956, "node_id": "IC_kwDOBm6k_c5NCkoU", "user": {"value": 15178711, "label": "asg017"}, "created_at": "2022-10-26T19:20:33Z", "updated_at": "2022-10-26T19:20:33Z", "author_association": "CONTRIBUTOR", "body": "> This could use a new plugin hook, too. I don't want to complicate your life too much, but for things like GIS, I'd want a way to turn regular JSON into SpatiaLite geometries or combine X/Y coordinates into point geometries and such. Happy to help however I can.\r\n\r\n @eyeseast Maybe you could do this with triggers? Like you can insert JSON-friendly data into a \"raw\" table, and create a trigger that transforms that inserted data into the proper table\r\n\r\nHere's an example:\r\n\r\n```sql\r\n-- meant to be updated from a Datasette insert\r\ncreate table points_raw(longitude int, latitude int);\r\n\r\n-- the target table with proper SpatiaLite geometries\r\ncreate table points(point geometry);\r\n\r\nCREATE TRIGGER insert_points_raw INSERT ON points_raw \r\n BEGIN\r\n insert into points(point) values (makepoint(new.longitude, new.latitude));\r\n END;\r\n```\r\n\r\nYou could then POST a new row to `points_raw` like this:\r\n```\r\nPOST /db/points_raw\r\nAuthorization: Bearer xxx\r\nContent-Type: application/json\r\n{\r\n \"row\": {\r\n \"longitude\": 27.64356,\r\n \"latitude\": -47.29384\r\n }\r\n}\r\n```\r\n\r\nThen SQLite will run the trigger and insert a new row in `points` with the correct geometry point. 
Downside is you'd have duplicated data with `points_raw`, but maybe it could be a `TEMP` table (or have a cron that deletes all rows from that table every so often?)", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1421544654, "label": "API to insert a single record into an existing table"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/1789#issuecomment-1223347322", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1789", "id": 1223347322, "node_id": "IC_kwDOBm6k_c5I6sx6", "user": {"value": 15178711, "label": "asg017"}, "created_at": "2022-08-23T00:03:20Z", "updated_at": "2022-08-23T00:03:20Z", "author_association": "CONTRIBUTOR", "body": "@simonw to build the extension on Ubuntu, you can run:\r\n\r\n```\r\napt-get update && apt-get install libsqlite3-dev gcc\r\ngcc ext.c -fPIC -shared -o ext.so\r\n```\r\n\r\nI'm not the best with Actions, but if you set the cache key to `ext.c`, run those two commands to download dependencies + compile to `ext.so`, then the unit test should pick it up and run it correctly. Let me know if you want me to update the PR with that added", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1344823170, "label": "Add new entrypoint option to `--load-extension`"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/1789#issuecomment-1221576460", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1789", "id": 1221576460, "node_id": "IC_kwDOBm6k_c5Iz8cM", "user": {"value": 15178711, "label": "asg017"}, "created_at": "2022-08-21T16:16:42Z", "updated_at": "2022-08-21T16:16:42Z", "author_association": "CONTRIBUTOR", "body": "Rebased, the Read the Docs failure should now be fixed\r\n\r\nRe docs - ya that's a pretty ambitious page, I'm still not 100% sure what the best practices are/should be... Would be happy to make that page in a future PR", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1344823170, "label": "Add new entrypoint option to `--load-extension`"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1528#issuecomment-975955589", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1528", "id": 975955589, "node_id": "IC_kwDOBm6k_c46K-aF", "user": {"value": 15178711, "label": "asg017"}, "created_at": "2021-11-22T22:00:30Z", "updated_at": "2021-11-22T22:00:30Z", "author_association": "CONTRIBUTOR", "body": "Oh, another thing to consider: I believe this would be the first `\"_file\"` key in datasette's metadata, compared to other `\"_url\"` keys like `\"license_url\"` or `\"about_url\"`. 
Not too sure what considerations to include with this (ex should missing files cause Datasette to stop before starting, should build scripts bundle these sql files somewhere during `datasette package`, etc.)", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1060631257, "label": "Add new `\"sql_file\"` key to Canned Queries in metadata?"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/666#issuecomment-590022164", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/666", "id": 590022164, "node_id": "MDEyOklzc3VlQ29tbWVudDU5MDAyMjE2NA==", "user": {"value": 13896256, "label": "kevindkeogh"}, "created_at": "2020-02-23T03:26:00Z", "updated_at": "2020-02-23T03:26:00Z", "author_association": "CONTRIBUTOR", "body": "It was very helpful for me, using it for a 15M row table. Added a test, happy to amend though!", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 562085508, "label": "Use inspect-file, if possible, for total row count"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/394#issuecomment-499923145", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/394", "id": 499923145, "node_id": "MDEyOklzc3VlQ29tbWVudDQ5OTkyMzE0NQ==", "user": {"value": 13896256, "label": "kevindkeogh"}, "created_at": "2019-06-07T15:10:57Z", "updated_at": "2019-06-07T15:11:07Z", "author_association": "CONTRIBUTOR", "body": "Putting this here in case anyone else encounters the same issue with nginx, I was able to resolve it by passing the header in the nginx proxy config (i.e., `proxy_set_header Host $host`).", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 396212021, "label": "base_url configuration setting"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/394#issuecomment-499320973", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/394", "id": 499320973, "node_id": "MDEyOklzc3VlQ29tbWVudDQ5OTMyMDk3Mw==", "user": {"value": 13896256, "label": "kevindkeogh"}, "created_at": "2019-06-06T02:07:59Z", "updated_at": "2019-06-06T02:07:59Z", "author_association": "CONTRIBUTOR", "body": "Hey was this ever merged? Trying to run this behind nginx, and encountering this issue.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 396212021, "label": "base_url configuration setting"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/1348#issuecomment-850077261", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1348", "id": 850077261, "node_id": "MDEyOklzc3VlQ29tbWVudDg1MDA3NzI2MQ==", "user": {"value": 10801138, "label": "blairdrummond"}, "created_at": "2021-05-28T03:05:38Z", "updated_at": "2021-05-28T03:05:38Z", "author_association": "CONTRIBUTOR", "body": "Note, the CVEs are probably resolvable with this https://github.com/simonw/datasette/pull/1296 . My experience is that Ubuntu seems to manage these better? 
Though that is surprising :/ ", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 904598267, "label": "DRAFT: add test and scan for docker images"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1280#issuecomment-837166862", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1280", "id": 837166862, "node_id": "MDEyOklzc3VlQ29tbWVudDgzNzE2Njg2Mg==", "user": {"value": 10801138, "label": "blairdrummond"}, "created_at": "2021-05-10T19:07:46Z", "updated_at": "2021-05-10T19:07:46Z", "author_association": "CONTRIBUTOR", "body": "Do you have a list of sqlite versions you want to test against?\r\n\r\nOne cool thing I saw recently (that we started using) was using `import docker` within python, and then writing pytest functions which executed against the container\r\n\r\n[setup](https://github.com/StatCan/kubeflow-containers/blob/3c7dcfb5e7188982fb8ebcded82e84292720f720/conftest.py#L85)\r\n\r\n[example](https://github.com/StatCan/kubeflow-containers/blob/master/tests/jupyterlab-cpu/test_julia.py#L8-L18)\r\n\r\nThe inspiration for this came from the [jupyter docker-stacks](https://github.com/jupyter/docker-stacks/blob/09fb66007615ea68d9bce8f8e1a2cf9402f1e432/test/test_packages.py#L107)\r\n\r\nSo off the top of my head, could look at building the container with different sqlite versions as a build-arg, then run tests against the containers. Just brainstorming though", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 842862708, "label": "Ability to run CI against multiple SQLite versions"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/1296#issuecomment-835491318", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1296", "id": 835491318, "node_id": "MDEyOklzc3VlQ29tbWVudDgzNTQ5MTMxOA==", "user": {"value": 10801138, "label": "blairdrummond"}, "created_at": "2021-05-08T19:59:01Z", "updated_at": "2021-05-08T19:59:01Z", "author_association": "CONTRIBUTOR", "body": "I have also found that ubuntu has fewer vulnerabilities than the buster based images.\r\n\r\n```\r\n\u279c ~ docker pull python:3-buster\r\n\u279c ~ trivy image python:3-buster | head \r\n2021-04-28T17:14:29.313-0400 INFO Detecting Debian vulnerabilities...\r\n2021-04-28T17:14:29.393-0400 INFO Trivy skips scanning programming language libraries because no supported file was detected\r\npython:3-buster (debian 10.9)\r\n=============================\r\nTotal: 1621 (UNKNOWN: 13, LOW: 1106, MEDIUM: 343, HIGH: 145, CRITICAL: 14)\r\n+------------------------------+---------------------+----------+------------------------------+---------------+--------------------------------------------------------------+\r\n| LIBRARY | VULNERABILITY ID | SEVERITY | INSTALLED VERSION | FIXED VERSION | TITLE |\r\n+------------------------------+---------------------+----------+------------------------------+---------------+--------------------------------------------------------------+\r\n```", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 855446829, "label": "Dockerfile: use Ubuntu 20.10 as base"}, "performed_via_github_app": null} {"html_url": 
"https://github.com/simonw/datasette/pull/434#issuecomment-489163939", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/434", "id": 489163939, "node_id": "MDEyOklzc3VlQ29tbWVudDQ4OTE2MzkzOQ==", "user": {"value": 10352819, "label": "rprimet"}, "created_at": "2019-05-03T16:49:45Z", "updated_at": "2019-05-03T16:50:03Z", "author_association": "CONTRIBUTOR", "body": "> The second time I ran the command I got an error:\r\n\r\n> \r\n> ERROR: (gcloud.beta.run.deploy) Deployment endpoint was not found. Perhaps the\r\n> provided region was invalid. Set the `run/region` property to a valid region and\r\n> retry. Ex: `gcloud config set run/region us-central1`\r\n> \r\n\r\nYes, I was able to reproduce this; I used to get prompted for a run region interactively by the `gcloud` tool before, but maybe this is changing? (the [documentation](https://cloud.google.com/run/docs/deploying) now assumes `run/region` is set).\r\n\r\nNot sure which course of action is best: making `datasette` ensure that `run/region` is set beforehand or wait a bit until the gcloud CLI stabilizes?", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 434321685, "label": "\"datasette publish cloudrun\" command to publish to Google Cloud Run"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/2052#issuecomment-1630776144", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/2052", "id": 1630776144, "node_id": "IC_kwDOBm6k_c5hM6tQ", "user": {"value": 9020979, "label": "hydrosquall"}, "created_at": "2023-07-11T12:54:03Z", "updated_at": "2023-07-11T12:54:03Z", "author_association": "CONTRIBUTOR", "body": "Thanks for the review and the code pointers @simonw - I've made the suggested edits, fixed the renamed variable, and confirmed that the panels still render on the `table` and `database` views. ", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1651082214, "label": "feat: Javascript Plugin API (Custom panels, column menu items with JS actions)"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/2052#issuecomment-1615997736", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/2052", "id": 1615997736, "node_id": "IC_kwDOBm6k_c5gUiso", "user": {"value": 9020979, "label": "hydrosquall"}, "created_at": "2023-07-01T16:55:24Z", "updated_at": "2023-07-01T16:55:24Z", "author_association": "CONTRIBUTOR", "body": "> Ok @hydrosquall a couple things before this PR should be good to go:\r\n\r\nThank you @asg017 ! I've pushed both suggested changes onto this branch.\r\n\r\n> Not sure how difficult it'll be to inject it server-side\r\n\r\nIf we are OK with having a build system, it would free me up to do do many things! We could make datasette-manager.js a server-side rendered file as a \"template\" instead of having it as a static JS file, but I'm not sure it's worth the extra jump in complexity / loss of syntax highlighting in the JS file.\r\n\r\nIn the short-term, I could see an intermediary solution where a unit test in the preferred language was able to read both `version.py` and `datasette-manager.js`, and make sure that the strings versions are in sync. (This assumes that we want the manager and datasette's versions to be synced, and not decoupled). 
Since the version is not changing very often, a \"manual sync\" might be good enough. \r\n\r\n> In terms of how to integrate this into Datasette, a few options I can see working:\r\n\r\nThis sounds good to me. I'm not sure how to add a settings flag, but will be interested to see the PR that adds support for it.\r\n\r\n> I'm also curious to see how \"plugins for a plugin' would work\r\n\r\nI'm comfortable to wait until we have a realistic usecase for this. In the short term, I think we could give plugins a way to grant access to a \"public API of other plugins\", and also ask to be notified when plugins with other names have loaded, but don't picture the datasette manager getting more involved than that. \r\n\r\n> here's a list of Simon's Datasette plugins that use \"extra_js_urls()\"\r\n\r\nNeat, thanks for compiling this list! Just curious, is there a query that can be used to compile this programmatically, or did you identify these through memory?\r\n\r\n> I want to make a javascript plugin on top of the code-mirror editor to make a few things nicer (function auto-complete, table/column descriptions, etc.)\r\n\r\nI look forward to trying this out \ud83d\udc4d \r\n\r\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1651082214, "label": "feat: Javascript Plugin API (Custom panels, column menu items with JS actions)"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/2052#issuecomment-1585149909", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/2052", "id": 1585149909, "node_id": "IC_kwDOBm6k_c5ee3fV", "user": {"value": 9020979, "label": "hydrosquall"}, "created_at": "2023-06-09T21:35:00Z", "updated_at": "2023-06-09T21:35:00Z", "author_association": "CONTRIBUTOR", "body": "Thanks @cldellow for the thoughtful comments! These are all things that I'll keep in mind as we figure out how/if this API is actually used by plugin authors once it's actually out in the world.\r\n\r\n> Yes, this would work - but it requires me to continue to communicate the column names out of band (in order to fetch the facet data per-column before registering my plugin), vs being able to re-use them from the plugin implementation.\r\n\r\nAh, I understand now! Thanks for explaining. ", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1651082214, "label": "feat: Javascript Plugin API (Custom panels, column menu items with JS actions)"}, "performed_via_github_app": null}
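The multi-SQLite-version CI idea floated in blairdrummond's comment above (build the Docker image with the SQLite version passed as a build-arg, then run pytest against the resulting containers via the `docker` Python SDK, as in the StatCan/kubeflow-containers and jupyter/docker-stacks links) could look roughly like the sketch below. This is an illustrative sketch only, not part of Datasette's actual CI: the `SQLITE_VERSION` build-arg name, the version list, and the assumption that the Dockerfile honours that arg are all hypothetical.

```python
# Minimal sketch: build one image per SQLite version (hypothetical SQLITE_VERSION
# build-arg) and run a check inside each container with the docker Python SDK.
import docker
import pytest

SQLITE_VERSIONS = ["3.26.0", "3.31.1", "3.36.0"]  # hypothetical test matrix

client = docker.from_env()


@pytest.fixture(scope="session", params=SQLITE_VERSIONS)
def datasette_image(request):
    # Assumes a Dockerfile in the current directory that uses ARG SQLITE_VERSION
    # to install/compile that SQLite release.
    tag = f"datasette-test:sqlite-{request.param}"
    client.images.build(
        path=".",
        buildargs={"SQLITE_VERSION": request.param},
        tag=tag,
    )
    return tag


def test_sqlite_version_matches(datasette_image):
    # Run a one-off container and print the SQLite version Python actually sees.
    output = client.containers.run(
        datasette_image,
        command=["python", "-c", "import sqlite3; print(sqlite3.sqlite_version)"],
        remove=True,
    )
    assert output.decode().strip() in datasette_image
```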
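Similarly, the "manual sync" unit test hydrosquall describes above (read the version string from `version.py` and from `datasette-manager.js` and assert they match) might look like the sketch below. The file paths and the `DATASETTE_MANAGER_VERSION` constant name are assumptions made for illustration, not Datasette's confirmed layout.

```python
# Minimal sketch: keep the Python package version and the static JS manager's
# version string in sync by comparing them in a unit test.
import re
from pathlib import Path


def read_python_version(path=Path("datasette/version.py")):
    # Assumes version.py contains a line like: __version__ = "1.0a0"
    match = re.search(r'__version__\s*=\s*"([^"]+)"', path.read_text())
    assert match, "no __version__ found in version.py"
    return match.group(1)


def read_js_manager_version(path=Path("datasette/static/datasette-manager.js")):
    # Hypothetical: assumes the JS file declares something like
    #   const DATASETTE_MANAGER_VERSION = "1.0a0";
    match = re.search(r'DATASETTE_MANAGER_VERSION\s*=\s*"([^"]+)"', path.read_text())
    assert match, "no version constant found in datasette-manager.js"
    return match.group(1)


def test_manager_version_in_sync():
    assert read_python_version() == read_js_manager_version()
```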