{"html_url": "https://github.com/simonw/datasette/issues/842#issuecomment-646271834", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/842", "id": 646271834, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NjI3MTgzNA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-18T19:49:41Z", "updated_at": "2020-06-24T18:49:22Z", "author_association": "OWNER", "body": "But then what kind of magic parameters might plugins want to add?\r\n\r\nHere's a crazy idea: `_scrapedcontent_url` - it would look for the `url` column on the data being inserted, scrape the content from it and insert that. This does suggest that the magic resolving function `scrapedcontent()` would need to optionally be sent the full row dictionary being inserted too.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 638212085, "label": "Magic parameters for canned queries"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/842#issuecomment-646270702", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/842", "id": 646270702, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NjI3MDcwMg==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-18T19:47:19Z", "updated_at": "2020-06-24T18:48:48Z", "author_association": "OWNER", "body": "Brainstorming more potential magic parameters:\r\n\r\n* `_actor_id`\r\n* `_actor_name`\r\n* `_request_ip`\r\n* `_request_user_agent`\r\n* `_cookie_cookiename`\r\n* `_signedcookie_cookiename` - reading signed cookies would be cool, not sure how to specify namespace though, maybe always use the same one? Or have the namespace come last, `_signedcookie_cookiename_mynamespace`. 
Might not need special signed cookie support since `actor` is already usually from a signed cookie.\r\n* `_timestamp_unix` (not happy with these names yet)\r\n* `_timestamp_localtime`\r\n* `_timestamp_datetime`\r\n* `_timestamp_utc`", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 638212085, "label": "Magic parameters for canned queries"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/842#issuecomment-649000075", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/842", "id": 649000075, "node_id": "MDEyOklzc3VlQ29tbWVudDY0OTAwMDA3NQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-24T18:46:36Z", "updated_at": "2020-06-24T18:47:37Z", "author_association": "OWNER", "body": "Another magic parameter that would be useful would be `_random`. Consider https://github.com/simonw/datasette-auth-tokens/issues/1 for example - I'd like to be able to provide a writable canned query which can create new authentication tokens in the database, but ideally it would automatically populate a secure random secret for each one.\r\n\r\nMaybe `_random_chars_128` to create a 128 character long random string (using `os.urandom(64).hex()`).\r\n\r\nThis would be the first example of a magic parameter where part of the parameter name is used to configure the resulting value. Maybe neater to separate that with a different character? 
Unfortunately `_random_chars:128` wouldn't work because these parameters are used in a SQLite query where `:` has special meaning: `insert into blah (secret) values (:_random_chars:128)` wouldn't make sense.\r\n\r\nActually this is already supported by the proposed design - `_random_chars_128` would become `random(\"chars_128\")` so the `random()` function could split off the 128 itself.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 638212085, "label": "Magic parameters for canned queries"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/865#issuecomment-648998264", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/865", "id": 648998264, "node_id": "MDEyOklzc3VlQ29tbWVudDY0ODk5ODI2NA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-24T18:43:02Z", "updated_at": "2020-06-24T18:43:02Z", "author_association": "OWNER", "body": "Thanks for the bug report. Yes I think #838 may be the same issue. Will investigate.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 644582921, "label": "base_url doesn't seem to work when adding criteria and clicking \"apply\""}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/858#issuecomment-648997857", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/858", "id": 648997857, "node_id": "MDEyOklzc3VlQ29tbWVudDY0ODk5Nzg1Nw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-24T18:42:10Z", "updated_at": "2020-06-24T18:42:10Z", "author_association": "OWNER", "body": "I really need to get myself a Windows 10 development environment working so I can dig into this kind of bug properly. 
I have a gaming PC lying around that I could re-task for that.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 642388564, "label": "publish heroku does not work on Windows 10"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/866#issuecomment-648818707", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/866", "id": 648818707, "node_id": "MDEyOklzc3VlQ29tbWVudDY0ODgxODcwNw==", "user": {"value": 22429695, "label": "codecov[bot]"}, "created_at": "2020-06-24T13:26:14Z", "updated_at": "2020-06-24T13:26:14Z", "author_association": "NONE", "body": "# [Codecov](https://codecov.io/gh/simonw/datasette/pull/866?src=pr&el=h1) Report\n> Merging [#866](https://codecov.io/gh/simonw/datasette/pull/866?src=pr&el=desc) into [master](https://codecov.io/gh/simonw/datasette/commit/1a5b7d318fa923edfcefd3df8f64dae2e9c49d3f&el=desc) will **not change** coverage.\n> The diff coverage is `n/a`.\n\n[![Impacted file tree graph](https://codecov.io/gh/simonw/datasette/pull/866/graphs/tree.svg?width=650&height=150&src=pr&token=eSahVY7kw1)](https://codecov.io/gh/simonw/datasette/pull/866?src=pr&el=tree)\n\n```diff\n@@ Coverage Diff @@\n## master #866 +/- ##\n=======================================\n Coverage 82.99% 82.99% \n=======================================\n Files 26 26 \n Lines 3547 3547 \n=======================================\n Hits 2944 2944 \n Misses 603 603 \n```\n\n\n\n------\n\n[Continue to review full report at Codecov](https://codecov.io/gh/simonw/datasette/pull/866?src=pr&el=continue).\n> **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta)\n> `\u0394 = absolute (impact)`, `\u00f8 = not affected`, `? = missing data`\n> Powered by [Codecov](https://codecov.io/gh/simonw/datasette/pull/866?src=pr&el=footer). 
Last update [1a5b7d3...fb64dda](https://codecov.io/gh/simonw/datasette/pull/866?src=pr&el=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 644610729, "label": "Update pytest-asyncio requirement from <0.13,>=0.10 to >=0.10,<0.15"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/838#issuecomment-648800356", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/838", "id": 648800356, "node_id": "MDEyOklzc3VlQ29tbWVudDY0ODgwMDM1Ng==", "user": {"value": 6739646, "label": "tballison"}, "created_at": "2020-06-24T12:51:48Z", "updated_at": "2020-06-24T12:51:48Z", "author_association": "NONE", "body": ">But also want to say thanks for a great tool\r\n\r\n+1!", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 637395097, "label": "Incorrect URLs when served behind a proxy with base_url set"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/865#issuecomment-648799963", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/865", "id": 648799963, "node_id": "MDEyOklzc3VlQ29tbWVudDY0ODc5OTk2Mw==", "user": {"value": 6739646, "label": "tballison"}, "created_at": "2020-06-24T12:51:01Z", "updated_at": "2020-06-24T12:51:01Z", "author_association": "NONE", "body": "This seems to be a duplicate of: https://github.com/simonw/datasette/issues/838", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 644582921, "label": "base_url doesn't seem to work when adding criteria and clicking \"apply\""}, "performed_via_github_app": null} {"html_url": 
"https://github.com/simonw/datasette/issues/859#issuecomment-648669523", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/859", "id": 648669523, "node_id": "MDEyOklzc3VlQ29tbWVudDY0ODY2OTUyMw==", "user": {"value": 3243482, "label": "abdusco"}, "created_at": "2020-06-24T08:13:23Z", "updated_at": "2020-06-24T10:30:36Z", "author_association": "CONTRIBUTOR", "body": "I tried setting `cache_size_kb=0` then `cache_size_kb=100000`, still getting this behavior. I even changed `Database::table_counts` and lowered time limit to 1\r\n\r\n```py\r\ntable_count = (\r\n await self.execute(\r\n \"select count(*) from [{}]\".format(table),\r\n custom_time_limit=1,\r\n )\r\n).rows[0][0]\r\ncounts[table] = table_count\r\n```\r\n\r\nI feel like 10 seconds is a magic number, like a processing timeout and datasette gives up and returns the page. \r\nIndex page loads instantly, table page, query page, as well. But when I return to database page after some time, it loads in 10s.\r\n\r\nEDIT:\r\n\r\nIt's always like 10 + 0.3s, like 10s wait and timeout then 300ms to render the page", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 642572841, "label": "Database page loads too slowly with many large tables (due to table counts)"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/864#issuecomment-648580556", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/864", "id": 648580556, "node_id": "MDEyOklzc3VlQ29tbWVudDY0ODU4MDU1Ng==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-24T04:40:49Z", "updated_at": "2020-06-24T04:40:49Z", "author_association": "OWNER", "body": "The ideal fix here would be to rework my `BaseView` subclass mechanism to work with `register_routes()` so that those views don't have any special privileges above plugin-provided views.", "reactions": 
"{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 644309017, "label": "datasette.add_message() doesn't work inside plugins"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/864#issuecomment-648580236", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/864", "id": 648580236, "node_id": "MDEyOklzc3VlQ29tbWVudDY0ODU4MDIzNg==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-24T04:39:39Z", "updated_at": "2020-06-24T04:39:39Z", "author_association": "OWNER", "body": "Urgh, fixing this is going to be a bit of a pain.\r\n\r\nHere's where I added that custom `dispatch_request()` method - it was to implement flash messaging in #790: https://github.com/simonw/datasette/blame/1a5b7d318fa923edfcefd3df8f64dae2e9c49d3f/datasette/views/base.py#L85\r\n\r\nIf I want this to be made available to `register_routes()` views as well, I'm going to have to move the logic somewhere else. 
In particular I need to make sure that the `request` object is created once and used throughout the whole request cycle.\r\n\r\nCurrently `register_routes()` view functions get their own separate request object which is created here:\r\n\r\nhttps://github.com/simonw/datasette/blob/1a5b7d318fa923edfcefd3df8f64dae2e9c49d3f/datasette/app.py#L1057-L1068\r\n\r\nSo I'm going to have to refactor this quite a bit to get that shared request object which can be passed both to `register_routes` views and to my various `BaseView` subclasses.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 644309017, "label": "datasette.add_message() doesn't work inside plugins"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/117#issuecomment-648442511", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/117", "id": 648442511, "node_id": "MDEyOklzc3VlQ29tbWVudDY0ODQ0MjUxMQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-23T21:39:41Z", "updated_at": "2020-06-23T21:39:41Z", "author_association": "OWNER", "body": "So there are two sides to supporting this:\r\n\r\n- Being able to sensibly introspect composite foreign keys\r\n- Being able to define composite foreign keys when creating a table", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 644161221, "label": "Support for compound (composite) foreign keys"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/117#issuecomment-648440634", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/117", "id": 648440634, "node_id": "MDEyOklzc3VlQ29tbWVudDY0ODQ0MDYzNA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-23T21:35:16Z", "updated_at": 
"2020-06-23T21:35:16Z", "author_association": "OWNER", "body": "Relevant discussion: https://github.com/simonw/sqlite-generate/issues/8#issuecomment-648438056", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 644161221, "label": "Support for compound (composite) foreign keys"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/117#issuecomment-648440525", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/117", "id": 648440525, "node_id": "MDEyOklzc3VlQ29tbWVudDY0ODQ0MDUyNQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-23T21:35:01Z", "updated_at": "2020-06-23T21:35:01Z", "author_association": "OWNER", "body": "Here's what's missing:\r\n```\r\nIn [11]: db.conn.execute('PRAGMA foreign_key_list(song)').fetchall() \r\nOut[11]: \r\n[(0,\r\n 0,\r\n 'album',\r\n 'songartist',\r\n 'albumartist',\r\n 'NO ACTION',\r\n 'NO ACTION',\r\n 'NONE'),\r\n (0, 1, 'album', 'songalbum', 'albumname', 'NO ACTION', 'NO ACTION', 'NONE')]\r\n```\r\nCompare with this code here:\r\nhttps://github.com/simonw/sqlite-utils/blob/d0cdaaaf00249230e847be3a3b393ee2689fbfe4/sqlite_utils/db.py#L563-L579\r\n\r\nThe first two columns returned by `PRAGMA foreign_key_list(table)` are `id` and `seq` - these show when two foreign key records are part of the same compound foreign key. 
`sqlite-utils` entirely ignores those at the moment.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 644161221, "label": "Support for compound (composite) foreign keys"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/116#issuecomment-648434885", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/116", "id": 648434885, "node_id": "MDEyOklzc3VlQ29tbWVudDY0ODQzNDg4NQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-23T21:21:33Z", "updated_at": "2020-06-23T21:21:33Z", "author_association": "OWNER", "body": "New docs: https://github.com/simonw/sqlite-utils/blob/2.10.1/docs/python-api.rst#introspection", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 644122661, "label": "Documentation for table.pks introspection property"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/116#issuecomment-648403834", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/116", "id": 648403834, "node_id": "MDEyOklzc3VlQ29tbWVudDY0ODQwMzgzNA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-23T20:36:29Z", "updated_at": "2020-06-23T20:36:29Z", "author_association": "OWNER", "body": "Should go in this section https://sqlite-utils.readthedocs.io/en/stable/python-api.html#introspection - under `.columns_dict` and before `.foreign_keys`.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 644122661, "label": "Documentation for table.pks introspection property"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/694#issuecomment-648296323", 
"issue_url": "https://api.github.com/repos/simonw/datasette/issues/694", "id": 648296323, "node_id": "MDEyOklzc3VlQ29tbWVudDY0ODI5NjMyMw==", "user": {"value": 3903726, "label": "kwladyka"}, "created_at": "2020-06-23T17:10:51Z", "updated_at": "2020-06-23T17:10:51Z", "author_association": "NONE", "body": "@simonw \r\n\r\nDid you find the reason? I had similar situation and I check this on millions ways. I am sure app doesn't consume such memory.\r\n\r\nI was trying the app with:\r\n`docker run --rm -it -p 80:80 -m 128M foo`\r\n\r\nI was watching app with `docker stats`. Even limited memory by `CMD [\"java\", \"-Xms60M\", \"-Xmx60M\", \"-jar\", \"api.jar\"]`.\r\nChecked memory usage by app in code and print bash commands. The app definitely doesn't use this memory. Also doesn't write files.\r\n\r\nOnly one solution is to change memory to 512M.\r\n\r\nIt is definitely something wrong with `cloud run`.\r\n\r\nI even did special app for testing this. It looks like when I cross very small amount of code / memory / app size in random when, then memory needs grow +hundreds. Nothing make sense here. 
Especially since it works everywhere except cloud run.\r\n\r\nPlease let me know if you discover anything more.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 576582604, "label": "datasette publish cloudrun --memory option"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/859#issuecomment-648234787", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/859", "id": 648234787, "node_id": "MDEyOklzc3VlQ29tbWVudDY0ODIzNDc4Nw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-23T15:22:51Z", "updated_at": "2020-06-23T15:22:51Z", "author_association": "OWNER", "body": "I wonder if this is a SQLite caching issue then?\n\nDatasette has a configuration option for this but I haven't spent much time experimenting with it, so I don't know how much of an impact it can have: https://datasette.readthedocs.io/en/stable/config.html#cache-size-kb", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 642572841, "label": "Database page loads too slowly with many large tables (due to table counts)"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/859#issuecomment-648232645", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/859", "id": 648232645, "node_id": "MDEyOklzc3VlQ29tbWVudDY0ODIzMjY0NQ==", "user": {"value": 3243482, "label": "abdusco"}, "created_at": "2020-06-23T15:19:53Z", "updated_at": "2020-06-23T15:19:53Z", "author_association": "CONTRIBUTOR", "body": "The issue seems to appear sporadically, like when I return to the database page after a while, during which some records have been added to the database.\r\n\r\nI've just visited the database page: the first visit took ~10s, consecutive visits took 0.3s.", "reactions": 
"{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 642572841, "label": "Database page loads too slowly with many large tables (due to table counts)"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/859#issuecomment-648163272", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/859", "id": 648163272, "node_id": "MDEyOklzc3VlQ29tbWVudDY0ODE2MzI3Mg==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-23T13:52:23Z", "updated_at": "2020-06-23T13:52:23Z", "author_association": "OWNER", "body": "I'm chunking inserts at 100 at a time right now: https://github.com/simonw/sqlite-utils/blob/4d9a3204361d956440307a57bd18c829a15861db/sqlite_utils/db.py#L1030\r\n\r\nI think the performance is more down to using Faker to create the test data - generating millions of entirely fake, randomized records takes a fair bit of time.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 642572841, "label": "Database page loads too slowly with many large tables (due to table counts)"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/859#issuecomment-647925594", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/859", "id": 647925594, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NzkyNTU5NA==", "user": {"value": 3243482, "label": "abdusco"}, "created_at": "2020-06-23T05:55:21Z", "updated_at": "2020-06-23T06:28:29Z", "author_association": "CONTRIBUTOR", "body": "Hmm, not seeing the problem now. \r\nI've removed the commented out sections in `database.py` and restarted the process. Database page now loads in <250ms.\r\n\r\nI have couple of workers that check some pages regularly and scrape new content and save to the DB. 
Could it be that datasette tries to recount tables every time database size changes? Normally it keeps a count cache, but as DB gets updated so often (new content every 5 min or so) it's practically recounting every time I go to the database page?\r\n\r\nEDIT: \r\nIt turns out it doesn't hold cache with mutable databases.\r\n\r\nI'll update the issue with more findings and a better way to reproduce the problem if I encounter it again.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 642572841, "label": "Database page loads too slowly with many large tables (due to table counts)"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/859#issuecomment-647936117", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/859", "id": 647936117, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NzkzNjExNw==", "user": {"value": 3243482, "label": "abdusco"}, "created_at": "2020-06-23T06:25:17Z", "updated_at": "2020-06-23T06:25:17Z", "author_association": "CONTRIBUTOR", "body": "> \r\n> \r\n> ```\r\n> sqlite-generate many-cols.db --tables 2 --rows 200000 --columns 50\r\n> ```\r\n> \r\n> Looks like that will take 35 minutes to run (it's not a particularly fast tool).\r\n\r\nTry chunking write operations into batches every 1000 records or so.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 642572841, "label": "Database page loads too slowly with many large tables (due to table counts)"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/859#issuecomment-647935300", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/859", "id": 647935300, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NzkzNTMwMA==", "user": {"value": 3243482, "label": "abdusco"}, "created_at": 
"2020-06-23T06:23:01Z", "updated_at": "2020-06-23T06:23:01Z", "author_association": "CONTRIBUTOR", "body": "> You said \"200k+, 50+ rows in a couple of tables\" - does that mean 50+ columns? I'll try with larger numbers of columns and see what difference that makes.\r\n\r\nAh that was a typo, I meant 50k.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 642572841, "label": "Database page loads too slowly with many large tables (due to table counts)"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/859#issuecomment-647923666", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/859", "id": 647923666, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NzkyMzY2Ng==", "user": {"value": 3243482, "label": "abdusco"}, "created_at": "2020-06-23T05:49:31Z", "updated_at": "2020-06-23T05:49:31Z", "author_association": "CONTRIBUTOR", "body": "I think I should mention that having FTS on all tables mean I have 5 visible, 25 hidden (FTS) tables displayed on database page.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 642572841, "label": "Database page loads too slowly with many large tables (due to table counts)"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/859#issuecomment-647894903", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/859", "id": 647894903, "node_id": "MDEyOklzc3VlQ29tbWVudDY0Nzg5NDkwMw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-23T04:07:59Z", "updated_at": "2020-06-23T04:07:59Z", "author_association": "OWNER", "body": "Just to check: are you seeing the problem on this page: https://latest.datasette.io/fixtures (the database page) - or this page (the table page): 
https://latest.datasette.io/fixtures/compound_three_primary_keys\r\n\r\nIf it's the table page then the problem may well be #862.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 642572841, "label": "Database page loads too slowly with many large tables (due to table counts)"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/596#issuecomment-647893140", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/596", "id": 647893140, "node_id": "MDEyOklzc3VlQ29tbWVudDY0Nzg5MzE0MA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-23T03:59:51Z", "updated_at": "2020-06-23T03:59:51Z", "author_association": "OWNER", "body": "Related: #862 - a time limit on the total time spent considering suggested facets for a table.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 507454958, "label": "Handle really wide tables better"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/862#issuecomment-647892930", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/862", "id": 647892930, "node_id": "MDEyOklzc3VlQ29tbWVudDY0Nzg5MjkzMA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-23T03:58:48Z", "updated_at": "2020-06-23T03:58:48Z", "author_association": "OWNER", "body": "Should this be controlled be a separate configuration setting? 
I'm inclined to say no - I think instead I'll set the limit to be 10 * whatever `facet_suggest_time_limit_ms` is.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 643510821, "label": "Set an upper limit on total facet suggestion time for a page"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/859#issuecomment-647890619", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/859", "id": 647890619, "node_id": "MDEyOklzc3VlQ29tbWVudDY0Nzg5MDYxOQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-23T03:48:21Z", "updated_at": "2020-06-23T03:48:21Z", "author_association": "OWNER", "body": "    sqlite-generate many-cols.db --tables 2 --rows 200000 --columns 50\r\n\r\nLooks like that will take 35 minutes to run (it's not a particularly fast tool).\r\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 642572841, "label": "Database page loads too slowly with many large tables (due to table counts)"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/859#issuecomment-647890378", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/859", "id": 647890378, "node_id": "MDEyOklzc3VlQ29tbWVudDY0Nzg5MDM3OA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-23T03:47:19Z", "updated_at": "2020-06-23T03:47:19Z", "author_association": "OWNER", "body": "I generated a 600MB database using [sqlite-generate](https://github.com/simonw/sqlite-generate) just now - with 100 tables at 100,000 rows and 3 tables at 1,000,000 rows - and performance of the database page was fine, 250ms.\r\n\r\nThose tables only had 4 columns each though.\r\n\r\nYou said \"200k+, 50+ rows in a couple of tables\" - does that mean 50+ 
columns? I'll try with larger numbers of columns and see what difference that makes.\r\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 642572841, "label": "Database page loads too slowly with many large tables (due to table counts)"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/861#issuecomment-647889674", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/861", "id": 647889674, "node_id": "MDEyOklzc3VlQ29tbWVudDY0Nzg4OTY3NA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-23T03:44:17Z", "updated_at": "2020-06-23T03:44:17Z", "author_association": "OWNER", "body": "https://github.com/simonw/sqlite-generate is now ready to be used - see also https://pypi.org/project/sqlite-generate/", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 642652808, "label": "Script to generate larger SQLite test files"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/861#issuecomment-647822757", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/861", "id": 647822757, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NzgyMjc1Nw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-22T23:40:43Z", "updated_at": "2020-06-22T23:40:43Z", "author_association": "OWNER", "body": "I started building that tool here: https://github.com/simonw/sqlite-generate\r\n\r\n(I built a new cookiecutter template for that too, https://github.com/simonw/click-app)", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 642652808, "label": "Script to generate larger SQLite test files"}, "performed_via_github_app": null} 
{"html_url": "https://github.com/simonw/datasette/issues/838#issuecomment-647803394", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/838", "id": 647803394, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NzgwMzM5NA==", "user": {"value": 6289012, "label": "ChristopherWilks"}, "created_at": "2020-06-22T22:36:34Z", "updated_at": "2020-06-22T22:36:34Z", "author_association": "NONE", "body": "I also am seeing the same issue with an Apache setup (same even w/o `ProxyPassReverse`, though I typically use it as @tsibley stated).\r\n\r\nBut also want to say thanks for a great tool (this issue notwithstanding)!", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 637395097, "label": "Incorrect URLs when served behind a proxy with base_url set"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/861#issuecomment-647266979", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/861", "id": 647266979, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NzI2Njk3OQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-22T04:26:25Z", "updated_at": "2020-06-22T04:26:25Z", "author_association": "OWNER", "body": "I think this is a separate Click utility. 
I'm going to call it `sqlite-generate`.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 642652808, "label": "Script to generate larger SQLite test files"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/687#issuecomment-647258199", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/687", "id": 647258199, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NzI1ODE5OQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-22T03:55:20Z", "updated_at": "2020-06-22T03:55:20Z", "author_association": "OWNER", "body": "https://datasette.readthedocs.io/en/latest/testing_plugins.html", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 572896293, "label": "Expand plugins documentation to multiple pages"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/687#issuecomment-647237091", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/687", "id": 647237091, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NzIzNzA5MQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-22T02:44:10Z", "updated_at": "2020-06-22T02:44:10Z", "author_association": "OWNER", "body": "Now split into four pages:\r\n\r\n- https://datasette.readthedocs.io/en/latest/plugins.html\r\n- https://datasette.readthedocs.io/en/latest/writing_plugins.html\r\n- https://datasette.readthedocs.io/en/latest/plugin_hooks.html\r\n- https://datasette.readthedocs.io/en/latest/internals.html\r\n\r\nStill need to add the \"Testing plugins\" page, then I can close this issue.\r\n\r\nI should also do #855, documenting the new `datasette-plugin` cookiecutter template. 
That can go in `writing_plugins.rst`.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 572896293, "label": "Expand plugins documentation to multiple pages"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/687#issuecomment-647203845", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/687", "id": 647203845, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NzIwMzg0NQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-22T00:32:42Z", "updated_at": "2020-06-22T00:32:42Z", "author_association": "OWNER", "body": "Maybe add this to the plugins.rst page near the top:\r\n```\r\n\r\n.. toctree::\r\n :caption: See also\r\n :maxdepth: 1\r\n\r\n plugin_hooks\r\n internals\r\n```", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 572896293, "label": "Expand plugins documentation to multiple pages"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/859#issuecomment-647194131", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/859", "id": 647194131, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NzE5NDEzMQ==", "user": {"value": 3243482, "label": "abdusco"}, "created_at": "2020-06-21T23:15:54Z", "updated_at": "2020-06-21T23:26:09Z", "author_association": "CONTRIBUTOR", "body": "I'm not sure if table counts are to blame. 
There shouldn't be a ~3 orders of magnitude difference.\r\n\r\n```fish\r\nuser@klein /a/w/scrapyard (master)> set sql \"select count(*) from table_1; select count(*) from table_2; select count(*) from table_3;\"\r\nuser@klein /a/w/scrapyard (master)> time sqlite3 scrapyard.db \"$sql\"\r\n187489\r\n46492\r\n2229\r\n\r\n________________________________________________________\r\nExecuted in 25.57 millis fish external\r\n usr time 3.55 millis 0.00 micros 3.55 millis\r\n sys time 22.42 millis 1123.00 micros 21.30 millis\r\n```\r\n\r\nbut not letting datasette count the tables definitely helps.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 642572841, "label": "Database page loads too slowly with many large tables (due to table counts)"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/687#issuecomment-647190177", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/687", "id": 647190177, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NzE5MDE3Nw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-21T22:32:36Z", "updated_at": "2020-06-21T22:32:36Z", "author_association": "OWNER", "body": "I'm going to break out the plugin hooks first in a single commit to make for a cleaner commit history (since that way git can hopefully detect that the content moved).", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 572896293, "label": "Expand plugins documentation to multiple pages"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/687#issuecomment-647190144", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/687", "id": 647190144, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NzE5MDE0NA==", "user": {"value": 9599, "label": "simonw"}, 
"created_at": "2020-06-21T22:32:13Z", "updated_at": "2020-06-21T22:32:13Z", "author_association": "OWNER", "body": "So the new plan is NOT to have a `plugins/` folder, but instead have several top-level pages:\r\n\r\n- Plugins (exists)\r\n- Writing plugins\r\n- Plugin hooks\r\n- Testing plugins\r\n- Internals for plugins (exists)", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 572896293, "label": "Expand plugins documentation to multiple pages"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/859#issuecomment-647189948", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/859", "id": 647189948, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NzE4OTk0OA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-21T22:30:12Z", "updated_at": "2020-06-21T22:30:43Z", "author_association": "OWNER", "body": "I'll write a little script which generates a 300MB SQLite file with a bunch of tables with lots of randomly generated rows in to help test this.\r\n\r\nHaving a tool like that which can generate larger databases with different gnarly performance characteristics will be useful for other performance work too.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 642572841, "label": "Database page loads too slowly with many large tables (due to table counts)"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/859#issuecomment-647189666", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/859", "id": 647189666, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NzE4OTY2Ng==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-21T22:26:55Z", "updated_at": "2020-06-21T22:26:55Z", "author_association": "OWNER", "body": "This 
makes a lot of sense. I implemented the mechanism for the index page because I have my own instance of Datasette that was running slow, but it had a dozen database files attached to it. I've not run into this with a single giant database file but it absolutely makes sense that the same optimization would be necessary for the database page there too.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 642572841, "label": "Database page loads too slowly with many large tables (due to table counts)"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/860#issuecomment-647189535", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/860", "id": 647189535, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NzE4OTUzNQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-21T22:25:16Z", "updated_at": "2020-06-21T22:25:27Z", "author_association": "OWNER", "body": "This is also relevant to #639, and may mean I can close that ticket in place of this one. I'm going to get this at least to a proof-of-concept stage first though.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 642651572, "label": "Plugin hook for instance/database/table metadata"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/357#issuecomment-647189045", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/357", "id": 647189045, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NzE4OTA0NQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-21T22:19:58Z", "updated_at": "2020-06-21T22:19:58Z", "author_association": "OWNER", "body": "I'm going to take this in a different direction.\r\n\r\nI'm not happy with how `metadata.(json|yaml)` keeps growing new features. 
Rather than having a single plugin hook for all of `metadata.json` I'm going to split out the feature that shows actual real metadata for tables and databases - `source`, `license` etc - into its own plugin-powered mechanism.\r\n\r\nSo I'm going to close this ticket and spin up a new one for that.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 348043884, "label": "Plugin hook for loading metadata.json"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/859#issuecomment-647135713", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/859", "id": 647135713, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NzEzNTcxMw==", "user": {"value": 3243482, "label": "abdusco"}, "created_at": "2020-06-21T14:30:02Z", "updated_at": "2020-06-21T14:30:02Z", "author_association": "CONTRIBUTOR", "body": "Oops, the same method is called from both index and database pages. But removing select count queries speed up the page load quite a bit.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 642572841, "label": "Database page loads too slowly with many large tables (due to table counts)"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/687#issuecomment-646938984", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/687", "id": 646938984, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NjkzODk4NA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-20T04:22:25Z", "updated_at": "2020-06-20T04:23:02Z", "author_association": "OWNER", "body": "I think I want the \"Plugin hooks\" page to be top-level, parallel to \"Plugins\" and \"Internals for Plugins\". 
It's the page of documentation I refer to most often so I don't want to have to click down a hierarchy from the side navigation to find it.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 572896293, "label": "Expand plugins documentation to multiple pages"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/687#issuecomment-646930455", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/687", "id": 646930455, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NjkzMDQ1NQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-20T03:22:21Z", "updated_at": "2020-06-20T03:22:21Z", "author_association": "OWNER", "body": "The tutorial can start by showing how to use the new cookiecutter template from #642.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 572896293, "label": "Expand plugins documentation to multiple pages"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/855#issuecomment-646930365", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/855", "id": 646930365, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NjkzMDM2NQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-20T03:21:48Z", "updated_at": "2020-06-20T03:21:48Z", "author_association": "OWNER", "body": "Maybe I should also refactor the plugin documentation, as contemplated in #687.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 642127307, "label": "Add instructions for using cookiecutter plugin template to plugin docs"}, "performed_via_github_app": null} {"html_url": 
"https://github.com/simonw/datasette/issues/642#issuecomment-646930160", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/642", "id": 646930160, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NjkzMDE2MA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-20T03:20:25Z", "updated_at": "2020-06-20T03:20:25Z", "author_association": "OWNER", "body": "Shipped this today! https://github.com/simonw/datasette-plugin is a cookiecutter template for creating new plugins.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 529429214, "label": "Provide a cookiecutter template for creating new plugins"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/642#issuecomment-646930059", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/642", "id": 646930059, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NjkzMDA1OQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-20T03:19:57Z", "updated_at": "2020-06-20T03:19:57Z", "author_association": "OWNER", "body": "@psychemedia sorry I missed your comment before.\r\n\r\nNiche Museums is definitely the best example of custom templates at the moment: https://github.com/simonw/museums/tree/master/templates\r\n\r\nI want to comprehensively document the variables made available to custom templates before shipping Datasette 1.0 - just filed that as #857.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 529429214, "label": "Provide a cookiecutter template for creating new plugins"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/855#issuecomment-646928638", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/855", "id": 646928638, "node_id": 
"MDEyOklzc3VlQ29tbWVudDY0NjkyODYzOA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-20T03:09:41Z", "updated_at": "2020-06-20T03:09:41Z", "author_association": "OWNER", "body": "I've shipped the cookiecutter template and used it to build https://github.com/simonw/datasette-saved-queries - it's ready to add to the official documentation.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 642127307, "label": "Add instructions for using cookiecutter plugin template to plugin docs"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/852#issuecomment-646905073", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/852", "id": 646905073, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NjkwNTA3Mw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-20T00:21:34Z", "updated_at": "2020-06-20T00:22:28Z", "author_association": "OWNER", "body": "New repo: https://github.com/simonw/datasette-saved-queries - which I created using the new cookiecutter template at https://github.com/simonw/datasette-plugin", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 640917326, "label": "canned_queries() plugin hook"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/852#issuecomment-646760805", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/852", "id": 646760805, "node_id": "MDEyOklzc3VlQ29tbWVudDY0Njc2MDgwNQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-19T17:07:45Z", "updated_at": "2020-06-19T17:07:45Z", "author_association": "OWNER", "body": "Plugin idea: `datasette-saved-queries` - it uses the `startup` hook to initialize a `saved_queries` table, then uses the `canned_queries` hook to add a 
writable canned query for saving records to that table.\r\n\r\nThen it returns any queries from that table as additional canned queries.\r\n\r\nBonus idea: it could write the user's actor_id to a column if they are signed in, and provide a link to see \"just my saved queries\" in that case.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 640917326, "label": "canned_queries() plugin hook"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/849#issuecomment-646686493", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/849", "id": 646686493, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NjY4NjQ5Mw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-19T15:04:51Z", "updated_at": "2020-06-19T15:04:51Z", "author_association": "OWNER", "body": "https://twitter.com/jaffathecake/status/1273983493006077952 concerns what happens to open pull requests - they will automatically close when you remove `master` unless you repoint them to `main` first.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 639072811, "label": "Rename master branch to main"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/852#issuecomment-646396772", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/852", "id": 646396772, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NjM5Njc3Mg==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-19T02:16:47Z", "updated_at": "2020-06-19T02:16:47Z", "author_association": "OWNER", "body": "I'll close this once I've built a plugin against it.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 
640917326, "label": "canned_queries() plugin hook"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/852#issuecomment-646396690", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/852", "id": 646396690, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NjM5NjY5MA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-19T02:16:24Z", "updated_at": "2020-06-19T02:16:24Z", "author_association": "OWNER", "body": "Documentation: https://datasette.readthedocs.io/en/latest/plugins.html#canned-queries-datasette-database-actor", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 640917326, "label": "canned_queries() plugin hook"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/852#issuecomment-646396499", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/852", "id": 646396499, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NjM5NjQ5OQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-19T02:15:49Z", "updated_at": "2020-06-19T02:15:58Z", "author_association": "OWNER", "body": "Released an alpha preview in https://github.com/simonw/datasette/releases/tag/0.45a1\r\n\r\nWrote about this here: https://simonwillison.net/2020/Jun/19/datasette-alphas/", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 640917326, "label": "canned_queries() plugin hook"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/852#issuecomment-646350530", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/852", "id": 646350530, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NjM1MDUzMA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-18T23:13:57Z", "updated_at": "2020-06-18T23:14:11Z", 
"author_association": "OWNER", "body": "```python\r\n@hookspec\r\ndef canned_queries(datasette, database, actor):\r\n    \"Return a dictionary of canned query definitions or an awaitable function that returns them\"\r\n```", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 640917326, "label": "canned_queries() plugin hook"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/852#issuecomment-646329456", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/852", "id": 646329456, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NjMyOTQ1Ng==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-18T22:07:09Z", "updated_at": "2020-06-18T22:07:37Z", "author_association": "OWNER", "body": "It would be neat if the queries returned by this hook could be restricted to specific users. I think I can do that by returning an \"allow\" block as part of the query.\r\n\r\nBut... 
what if we allow users to save private queries and we might have thousands of users each with hundreds of saved queries?\r\n\r\nFor that case it would be good if the plugin hook could take an optional `actor` parameter.\r\n\r\nThis would also allow us to dynamically generate a canned query for \"return the bookmarks belonging to this actor\" or similar!", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 640917326, "label": "canned_queries() plugin hook"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/807#issuecomment-646320237", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/807", "id": 646320237, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NjMyMDIzNw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-18T21:41:16Z", "updated_at": "2020-06-18T21:41:16Z", "author_association": "OWNER", "body": "https://pypi.org/project/datasette/0.45a0/ is the release on PyPI.\r\n\r\nAnd in a fresh virtual environment:\r\n\r\n```\r\n$ pip install datasette==0.45a0\r\n...\r\n$ datasette --version\r\ndatasette, version 0.45a0\r\n```\r\nBut running `pip install datasette` still gets 0.44.\r\n\r\n\r\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 632843030, "label": "Ability to ship alpha and beta releases"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/807#issuecomment-646319315", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/807", "id": 646319315, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NjMxOTMxNQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-18T21:38:56Z", "updated_at": "2020-06-18T21:38:56Z", "author_association": "OWNER", "body": "This 
worked!\r\n\r\nhttps://pypi.org/project/datasette/#history\r\n\r\n\"Banners_and_Alerts_and_datasette_\u00b7_PyPI\"\r\n\r\nhttps://github.com/simonw/datasette/releases/tag/0.45a0 is my manually created GitHub prerelease.\r\n\r\nhttps://datasette.readthedocs.io/en/latest/changelog.html#a0-2020-06-18 has the release notes.\r\n\r\nA shame Read The Docs doesn't seem to build the docs for these releases - it's not showing the tag in the releases pane here:\r\n\r\n\"Changelog_\u2014_Datasette_documentation\"\r\n\r\nAlso the new tag isn't an option in the Build menu on https://readthedocs.org/projects/datasette/builds/\r\n\r\nNot a big problem though since the \"latest\" tag on Read The Docs will still carry the in-development documentation.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 632843030, "label": "Ability to ship alpha and beta releases"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/835#issuecomment-646308467", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/835", "id": 646308467, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NjMwODQ2Nw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-18T21:12:50Z", "updated_at": "2020-06-18T21:12:50Z", "author_association": "OWNER", "body": "Problem there is Login CSRF attacks: https://cheatsheetseries.owasp.org/cheatsheets/Cross-Site_Request_Forgery_Prevention_Cheat_Sheet.html#login-csrf - I still want to perform CSRF checks on login forms, even though the user may not yet have any cookies.\r\n\r\nMaybe I can turn off CSRF checks for cookie-free requests but allow login forms to specifically opt back in to CSRF protection?", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 637363686, "label": "Mechanism for skipping 
CSRF checks on API posts"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/835#issuecomment-646307083", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/835", "id": 646307083, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NjMwNzA4Mw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-18T21:09:35Z", "updated_at": "2020-06-18T21:09:35Z", "author_association": "OWNER", "body": "So maybe one really easy fix here is to disable CSRF checks entirely for any request that doesn't have any cookies? Also suggested here: https://twitter.com/mrkurt/status/1273682965168603137", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 637363686, "label": "Mechanism for skipping CSRF checks on API posts"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/807#issuecomment-646303240", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/807", "id": 646303240, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NjMwMzI0MA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-18T21:00:41Z", "updated_at": "2020-06-18T21:00:41Z", "author_association": "OWNER", "body": "New documentation about the alpha/beta releases: https://datasette.readthedocs.io/en/latest/contributing.html#contributing-alpha-beta", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 632843030, "label": "Ability to ship alpha and beta releases"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/807#issuecomment-646302909", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/807", "id": 646302909, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NjMwMjkwOQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": 
"2020-06-18T21:00:02Z", "updated_at": "2020-06-18T21:00:02Z", "author_association": "OWNER", "body": "Alpha release is running through Travis now: https://travis-ci.org/github/simonw/datasette/builds/699864168", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 632843030, "label": "Ability to ship alpha and beta releases"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/807#issuecomment-646293670", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/807", "id": 646293670, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NjI5MzY3MA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-18T20:38:50Z", "updated_at": "2020-06-18T20:38:50Z", "author_association": "OWNER", "body": "https://pypi.org/project/datasette-render-images/#history worked:\r\n\r\n\"Banners_and_Alerts_and_datasette-render-images_\u00b7_PyPI\"\r\n\r\nI'm now confident enough that I'll make these changes and ship an alpha of Datasette itself.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 632843030, "label": "Ability to ship alpha and beta releases"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/807#issuecomment-646293029", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/807", "id": 646293029, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NjI5MzAyOQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-18T20:37:28Z", "updated_at": "2020-06-18T20:37:46Z", "author_association": "OWNER", "body": "Here's the Read The Docs documentation on versioned releases: https://docs.readthedocs.io/en/stable/versions.html\r\n\r\nIt looks like they do the right thing:\r\n\r\n> We in fact are parsing your tag names against the rules given by PEP 440. 
This spec allows \u201cnormal\u201d version numbers like 1.4.2 as well as pre-releases. An alpha version or a release candidate are examples of pre-releases and they look like this: 2.0a1.\r\n> \r\n> We only consider non pre-releases for the stable version of your documentation.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 632843030, "label": "Ability to ship alpha and beta releases"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/807#issuecomment-646292578", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/807", "id": 646292578, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NjI5MjU3OA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-18T20:36:22Z", "updated_at": "2020-06-18T20:36:22Z", "author_association": "OWNER", "body": "https://travis-ci.com/github/simonw/datasette-render-images/builds/172118541 demonstrates that the alpha/beta conditional is working as intended:\r\n\r\n\"Banners_and_Alerts_and_Build__13_-_simonw_datasette-render-images_-_Travis_CI\"", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 632843030, "label": "Ability to ship alpha and beta releases"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/807#issuecomment-646291309", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/807", "id": 646291309, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NjI5MTMwOQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-18T20:33:31Z", "updated_at": "2020-06-18T20:33:31Z", "author_association": "OWNER", "body": "One more experiment: I'm going to ship `datasette-render-images` 0.2 and see if that works correctly - including printing out the new debug section I put in the Travis config 
here: https://github.com/simonw/datasette-render-images/blob/6b5f22dab75ca364f671f5597556d2665a251bd8/.travis.yml#L35-L39 - which should demonstrate if my conditional for pushing to Docker Hub will work or not.\r\n\r\nIn the alpha releasing run on Travis that echo statement did NOT execute: https://travis-ci.com/github/simonw/datasette-render-images/builds/172116625", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 632843030, "label": "Ability to ship alpha and beta releases"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/807#issuecomment-646290171", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/807", "id": 646290171, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NjI5MDE3MQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-18T20:30:48Z", "updated_at": "2020-06-18T20:30:48Z", "author_association": "OWNER", "body": "OK, I just shipped 0.2a0 of `datasette-render-images` - https://pypi.org/project/datasette-render-images/ has no indication of that:\r\n\r\n\"Banners_and_Alerts_and_datasette-render-images_\u00b7_PyPI\"\r\n\r\nBut this page does: https://pypi.org/project/datasette-render-images/#history\r\n\r\n\"Banners_and_Alerts_and_datasette-render-images_\u00b7_PyPI\"\r\n\r\nAnd https://pypi.org/project/datasette-render-images/0.2a0/ exists.\r\n\r\nIn a fresh virtual environment `pip install datasette-render-images` gets 0.1.\r\n\r\n`pip install datasette-render-images==0.2a0` gets 0.2a0.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 632843030, "label": "Ability to ship alpha and beta releases"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/835#issuecomment-646288146", "issue_url": 
"https://api.github.com/repos/simonw/datasette/issues/835", "id": 646288146, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NjI4ODE0Ng==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-18T20:26:22Z", "updated_at": "2020-06-18T20:26:31Z", "author_association": "OWNER", "body": "Useful tip from Carlton Gibson: https://twitter.com/carltongibson/status/1273680590672453632\r\n\r\n> DRF makes ALL views CSRF exempt and then enforces CSRF if you're using Session auth only. \r\n>\r\n> View: https://github.com/encode/django-rest-framework/blob/e18e40d6ae42457f60ca9c68054ad40d15ba8433/rest_framework/views.py#L144\r\n> Auth: https://github.com/encode/django-rest-framework/blob/e18e40d6ae42457f60ca9c68054ad40d15ba8433/rest_framework/authentication.py#L130", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 637363686, "label": "Mechanism for skipping CSRF checks on API posts"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/807#issuecomment-646280134", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/807", "id": 646280134, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NjI4MDEzNA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-18T20:08:15Z", "updated_at": "2020-06-18T20:08:15Z", "author_association": "OWNER", "body": "https://github.com/simonw/datasette-render-images uses Travis and is low-risk for trying this out.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 632843030, "label": "Ability to ship alpha and beta releases"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/807#issuecomment-646279428", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/807", "id": 646279428, "node_id": 
"MDEyOklzc3VlQ29tbWVudDY0NjI3OTQyOA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-18T20:06:43Z", "updated_at": "2020-06-18T20:06:43Z", "author_association": "OWNER", "body": "I'm going to try this on a separate repository so I don't accidentally publish a Datasette release I didn't mean to publish!", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 632843030, "label": "Ability to ship alpha and beta releases"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/807#issuecomment-646279280", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/807", "id": 646279280, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NjI3OTI4MA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-18T20:06:24Z", "updated_at": "2020-06-18T20:06:24Z", "author_association": "OWNER", "body": "So maybe this condition is right?\r\n\r\n if: (tag IS present) AND NOT (tag =~ [ab])", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 632843030, "label": "Ability to ship alpha and beta releases"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/807#issuecomment-646278801", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/807", "id": 646278801, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NjI3ODgwMQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-18T20:05:18Z", "updated_at": "2020-06-18T20:05:18Z", "author_association": "OWNER", "body": "Travis conditions documentation: https://docs.travis-ci.com/user/conditions-v1\r\n\r\nThese look useful:\r\n```\r\nbranch =~ /^(one|two)-three$/\r\n(tag =~ ^v) AND (branch = master)\r\n```\r\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, 
\"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 632843030, "label": "Ability to ship alpha and beta releases"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/807#issuecomment-646277680", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/807", "id": 646277680, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NjI3NzY4MA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-18T20:02:42Z", "updated_at": "2020-06-18T20:02:42Z", "author_association": "OWNER", "body": "So I think if I push a tag of `0.45a0` everything might just work - Travis will build it, push the build to PyPI, PyPI won't treat it as a stable release.\r\n\r\nExcept... I don't want to push alphas as Docker images - so I need to fix this code:\r\n\r\nhttps://github.com/simonw/datasette/blob/6151c25a5a8d566c109af296244b9267c536bd9a/.travis.yml#L34-L43", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 632843030, "label": "Ability to ship alpha and beta releases"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/807#issuecomment-646277155", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/807", "id": 646277155, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NjI3NzE1NQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-18T20:01:31Z", "updated_at": "2020-06-18T20:01:31Z", "author_association": "OWNER", "body": "I thought I might have to update a regex (my CircleCI configs won't match on `a0`, [example](https://github.com/simonw/datasette-publish-now/blob/420f349b278857f62183d8e9835d64f116758be7/.circleci/config.yml#L22)) but it turns out Travis is currently configured to treat ALL tags as potential releases:\r\n\r\nhttps://github.com/simonw/datasette/blob/6151c25a5a8d566c109af296244b9267c536bd9a/.travis.yml#L21-L35", 
"reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 632843030, "label": "Ability to ship alpha and beta releases"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/807#issuecomment-646276150", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/807", "id": 646276150, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NjI3NjE1MA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-18T19:59:17Z", "updated_at": "2020-06-18T19:59:17Z", "author_association": "OWNER", "body": "Relevant PEP: https://www.python.org/dev/peps/pep-0440/\r\n\r\nDjango's implementation dates back 8 years: https://github.com/django/django/commit/40f0ecc56a23d35c2849f8e79276f6d8931412d1\r\n\r\nFrom the PEP:\r\n\r\n> Implicit pre-release number\r\n>\r\n> Pre releases allow omitting the numeral in which case it is implicitly assumed to be 0. The normal form for this is to include the 0 explicitly. 
This allows versions such as 1.2a which is normalized to 1.2a0.\r\n\r\nI'm going to habitually include the 0.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 632843030, "label": "Ability to ship alpha and beta releases"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/807#issuecomment-646273035", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/807", "id": 646273035, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NjI3MzAzNQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-18T19:52:28Z", "updated_at": "2020-06-18T19:52:28Z", "author_association": "OWNER", "body": "I'd like this soon, because I want to start experimenting with things like #852 and #842 without shipping those plugin hooks in a full stable release.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 632843030, "label": "Ability to ship alpha and beta releases"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/842#issuecomment-646272627", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/842", "id": 646272627, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NjI3MjYyNw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-18T19:51:32Z", "updated_at": "2020-06-18T19:51:32Z", "author_association": "OWNER", "body": "I'd be OK with the first version of this not including a plugin hook.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 638212085, "label": "Magic parameters for canned queries"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/842#issuecomment-646264051", "issue_url": 
"https://api.github.com/repos/simonw/datasette/issues/842", "id": 646264051, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NjI2NDA1MQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-18T19:32:13Z", "updated_at": "2020-06-18T19:32:37Z", "author_association": "OWNER", "body": "If every magic parameter has a prefix and suffix, like `_request_ip` and `_actor_id`, then plugins could register a function for a prefix. Register a function to `_actor` and `actor(\"id\")`will be called for `_actor_id`.\r\n\r\nBut does it make sense for every magic parameter to be of form `_a_b`? I think so.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 638212085, "label": "Magic parameters for canned queries"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/842#issuecomment-646246062", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/842", "id": 646246062, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NjI0NjA2Mg==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-18T18:54:41Z", "updated_at": "2020-06-18T18:54:41Z", "author_association": "OWNER", "body": "The `_actor_id` param makes this a bit trickier, because we can't just say \"if you see an unknown parameter called X call this function\" - our magic parameter logic isn't adding single parameters, it might add a whole family of them.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 638212085, "label": "Magic parameters for canned queries"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/842#issuecomment-646242172", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/842", "id": 646242172, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NjI0MjE3Mg==", "user": {"value": 9599, 
"label": "simonw"}, "created_at": "2020-06-18T18:46:06Z", "updated_at": "2020-06-18T18:53:31Z", "author_association": "OWNER", "body": "Yes that can work - and using `__missing__` (new in Python 3) is nicer because then the regular dictionary gets checked first:\r\n```python\r\nimport sqlite3\r\n\r\nconn = sqlite3.connect(\":memory:\")\r\n\r\n\r\nclass Magic(dict):\r\n def __missing__(self, key):\r\n return key.upper()\r\n\r\n\r\nconn.execute(\"select :name\", Magic()).fetchall()\r\n```\r\nOutputs:\r\n```\r\n[('NAME',)]\r\n```", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 638212085, "label": "Magic parameters for canned queries"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/842#issuecomment-646238702", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/842", "id": 646238702, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NjIzODcwMg==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-18T18:39:07Z", "updated_at": "2020-06-18T18:39:07Z", "author_association": "OWNER", "body": "It would be nice if Datasette didn't have to do any additional work to find e.g. 
`_request_ip` if that parameter turned out not to be used by the query.\r\n\r\nCould I do this with a custom class that implements `__getitem__()` and then gets passed as SQLite arguments?", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 638212085, "label": "Magic parameters for canned queries"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/820#issuecomment-646218809", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/820", "id": 646218809, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NjIxODgwOQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-18T17:58:02Z", "updated_at": "2020-06-18T17:58:02Z", "author_association": "OWNER", "body": "I had the same idea again ten days later: #852.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 635049296, "label": "Idea: Plugin hook for registering canned queries"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/835#issuecomment-646217766", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/835", "id": 646217766, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NjIxNzc2Ng==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-18T17:55:54Z", "updated_at": "2020-06-18T17:56:04Z", "author_association": "OWNER", "body": "Idea: a mechanism where the `asgi_csrf()` can take an optional `should_protect()` callback function which gets called with the `scope` and decides if the current request should be protected or not. It can then look at headers and paths and suchlike and make its own decisions. 
Datasette could then provide a `should_protect()` callback which can interact with plugins.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 637363686, "label": "Mechanism for skipping CSRF checks on API posts"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/835#issuecomment-646216934", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/835", "id": 646216934, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NjIxNjkzNA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-18T17:54:14Z", "updated_at": "2020-06-18T17:54:14Z", "author_association": "OWNER", "body": "> if you did Origin based CSRF checks, then could the absence of an Origin header be used?\r\nhttps://twitter.com/cnorthwood/status/1273674392757829632", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 637363686, "label": "Mechanism for skipping CSRF checks on API posts"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/835#issuecomment-646214158", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/835", "id": 646214158, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NjIxNDE1OA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-18T17:48:45Z", "updated_at": "2020-06-18T17:48:45Z", "author_association": "OWNER", "body": "I wonder if it's safe to generically say \"Don't do CSRF protection on any request that includes a `Authorization: Bearer...` header - because it's not possible for a regular browser to send that header since the format is different from the header used in browser-based HTTP basic auth?", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 
0}", "issue": {"value": 637363686, "label": "Mechanism for skipping CSRF checks on API posts"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/835#issuecomment-646209520", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/835", "id": 646209520, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NjIwOTUyMA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-18T17:39:30Z", "updated_at": "2020-06-18T17:40:53Z", "author_association": "OWNER", "body": "`datasette-auth-tokens` could switch to using `asgi_wrapper` instead of `actor_from_request` - then it could add a `scope[\"skip_csrf\"] = True` scope property to indicate that CSRF should not be protected.\r\n\r\nSince `asgi_wrapper` wraps the CSRF protection middleware changes made to the `scope` by an `asgi_wrapper` will be visible to the CSRF middleware:\r\n\r\nhttps://github.com/simonw/datasette/blob/d2aef9f7ef30fa20b1450cd181cf803f44fb4e21/datasette/app.py#L877-L888", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 637363686, "label": "Mechanism for skipping CSRF checks on API posts"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/835#issuecomment-646204308", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/835", "id": 646204308, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NjIwNDMwOA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-18T17:32:41Z", "updated_at": "2020-06-18T17:32:41Z", "author_association": "OWNER", "body": "The only way I can think of for a view to opt-out of CSRF protection is for them to be able to reconfigure the `asgi-csrf` middleware to skip specific URL patterns.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 637363686, 
"label": "Mechanism for skipping CSRF checks on API posts"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/835#issuecomment-646175055", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/835", "id": 646175055, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NjE3NTA1NQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-18T17:00:45Z", "updated_at": "2020-06-18T17:00:45Z", "author_association": "OWNER", "body": "Here's the Rails pattern for this: https://gist.github.com/maxivak/a25957942b6c21a41acd", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 637363686, "label": "Mechanism for skipping CSRF checks on API posts"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/835#issuecomment-646172200", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/835", "id": 646172200, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NjE3MjIwMA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-18T16:57:45Z", "updated_at": "2020-06-18T16:57:45Z", "author_association": "OWNER", "body": "I think there are a couple of steps to this one.\r\n\r\nThe nature of CSRF is that it's about hijacking existing authentication credentials. If your Datasette site runs without any authentication plugins at all CSRF protection isn't actually useful.\r\n\r\nSome POST endpoints should be able to opt-out of CSRF protection entirely. A writable canned query that accepts anonymous poll submissions for example might determine that CSRF is not needed.\r\n\r\nIf a plugin adds `Authorization: Bearer xxx` token support that plugin should also be able to specify that CSRF protection can be skipped. 
https://github.com/simonw/datasette-auth-tokens could do this.\r\n\r\nThis means I need two new mechanisms:\r\n\r\n- A way for wrapped views to indicate \"actually don't CSRF protect me\". I'm not sure how feasible this is without a major redesign, since the decision to return a 403 forbidden status is made before the wrapped function has even been called.\r\n- A way for authentication plugins like `datasette-auth-tokens` to say \"CSRF protection is not needed for this request\". This is a bit tricky too, since right now the `actor_from_request` hook doesn't have a channel for information other than returning the actor dictionary.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 637363686, "label": "Mechanism for skipping CSRF checks on API posts"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/835#issuecomment-646151706", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/835", "id": 646151706, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NjE1MTcwNg==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-18T16:36:23Z", "updated_at": "2020-06-18T16:36:23Z", "author_association": "OWNER", "body": "Tweeted about this here: https://twitter.com/simonw/status/1273655053170077701", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 637363686, "label": "Mechanism for skipping CSRF checks on API posts"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/853#issuecomment-646140022", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/853", "id": 646140022, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NjE0MDAyMg==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-18T16:21:53Z", "updated_at": "2020-06-18T16:21:53Z", 
"author_association": "OWNER", "body": "I have a test that demonstrates this working, but also demonstrates that the CSRF protection from #798 makes this really tricky to work with. I'd like to improve that.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 640943441, "label": "Ensure register_routes() works for POST"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/852#issuecomment-645785830", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/852", "id": 645785830, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NTc4NTgzMA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-18T05:37:00Z", "updated_at": "2020-06-18T05:37:00Z", "author_association": "OWNER", "body": "The easiest way to do this would be with a new plugin hook:\r\n\r\n    def canned_queries(datasette, database):\r\n        \"\"\"Return a list of canned query definitions\r\n        or an awaitable function that returns them\"\"\"\r\n\r\nAnother approach would be to make the whole of `metadata.json` customizable by plugins.\r\n\r\nI think I like the dedicated `canned_queries` option better. 
I'm not happy with the way metadata keeps growing - see #493 - so adding a dedicated hook would be more future proof against other changes I might make to the metadata mechanism.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 640917326, "label": "canned_queries() plugin hook"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/852#issuecomment-645781482", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/852", "id": 645781482, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NTc4MTQ4Mg==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-18T05:24:55Z", "updated_at": "2020-06-18T05:25:00Z", "author_association": "OWNER", "body": "Question about this on Twitter: https://twitter.com/amjithr/status/1273440766862352384", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 640917326, "label": "canned_queries() plugin hook"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/twitter-to-sqlite/issues/47#issuecomment-645599881", "issue_url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/47", "id": 645599881, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NTU5OTg4MQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-17T20:13:48Z", "updated_at": "2020-06-17T20:13:48Z", "author_association": "MEMBER", "body": "I've now figured out how to compile specific SQLite versions to help replicate this problem: https://github.com/simonw/til/blob/master/sqlite/ld-preload.md\r\n\r\nNext step: replicate the problem!", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 639542974, "label": "Fall back to FTS4 if FTS5 is not 
available"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/twitter-to-sqlite/issues/47#issuecomment-645515103", "issue_url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/47", "id": 645515103, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NTUxNTEwMw==", "user": {"value": 73579, "label": "hpk42"}, "created_at": "2020-06-17T17:30:01Z", "updated_at": "2020-06-17T17:30:01Z", "author_association": "NONE", "body": "It's the one with python3.7::\n\n >>> sqlite3.sqlite_version\n '3.11.0'\n\n \nOn Wed, Jun 17, 2020 at 10:24 -0700, Simon Willison wrote:\n\n> That means your version of SQLite is old enough that it doesn't support the FTS5 extension.\n> \n> Could you share what operating system you're running, and what the output is that you get from running this?\n> \n> python -c 'import sqlite3; print(sqlite3.connect(\":memory:\").execute(\"select sqlite_version()\").fetchone()[0])'\n> \n> I can teach this tool to fall back on FTS4 if FTS5 isn't available.\n> \n> -- \n> You are receiving this because you authored the thread.\n> Reply to this email directly or view it on GitHub:\n> https://github.com/dogsheep/twitter-to-sqlite/issues/47#issuecomment-645512127\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 639542974, "label": "Fall back to FTS4 if FTS5 is not available"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/twitter-to-sqlite/issues/47#issuecomment-645512127", "issue_url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/47", "id": 645512127, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NTUxMjEyNw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-17T17:24:22Z", "updated_at": "2020-06-17T17:24:22Z", "author_association": "MEMBER", "body": "That means your version of SQLite is old enough that it doesn't support the FTS5 extension.\r\n\r\nCould you share 
what operating system you're running, and what the output is that you get from running this?\r\n\r\n python -c 'import sqlite3; print(sqlite3.connect(\":memory:\").execute(\"select sqlite_version()\").fetchone()[0])'\r\n\r\nI can teach this tool to fall back on FTS4 if FTS5 isn't available.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 639542974, "label": "Fall back to FTS4 if FTS5 is not available"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/851#issuecomment-645293374", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/851", "id": 645293374, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NTI5MzM3NA==", "user": {"value": 3243482, "label": "abdusco"}, "created_at": "2020-06-17T10:32:02Z", "updated_at": "2020-06-17T10:32:28Z", "author_association": "CONTRIBUTOR", "body": "Welp, I'm an idiot.\r\n\r\nTurns out I had a sneaky comma `,` after `sql` key:\r\n```\r\n... (:name, :url),\r\n```\r\nwhich tells sqlite to expect another `values(...)` list.\r\n\r\nCorrecting the SQL solved the issue. 
\r\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 640330278, "label": "Having trouble getting writable canned queries to work"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/850#issuecomment-645068128", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/850", "id": 645068128, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NTA2ODEyOA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-16T23:52:16Z", "updated_at": "2020-06-16T23:52:16Z", "author_association": "OWNER", "body": "https://aws.amazon.com/blogs/compute/announcing-http-apis-for-amazon-api-gateway/ looks very important here: AWS HTTP APIs were introduced in December 2019 and appear to be a third of the price of API Gateway.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 639993467, "label": "Proof of concept for Datasette on AWS Lambda with EFS"}, "performed_via_github_app": null}