{"html_url": "https://github.com/simonw/datasette/issues/687#issuecomment-646938984", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/687", "id": 646938984, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NjkzODk4NA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-20T04:22:25Z", "updated_at": "2020-06-20T04:23:02Z", "author_association": "OWNER", "body": "I think I want the \"Plugin hooks\" page to be top-level, parallel to \"Plugins\" and \"Internals for Plugins\". It's the page of documentation I refer to most often so I don't want to have to click down a hierarchy from the side navigation to find it.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 572896293, "label": "Expand plugins documentation to multiple pages"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/687#issuecomment-646930455", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/687", "id": 646930455, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NjkzMDQ1NQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-20T03:22:21Z", "updated_at": "2020-06-20T03:22:21Z", "author_association": "OWNER", "body": "The tutorial can start by showing how to use the new cookiecutter template from #642.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 572896293, "label": "Expand plugins documentation to multiple pages"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/855#issuecomment-646930365", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/855", "id": 646930365, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NjkzMDM2NQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": 
"2020-06-20T03:21:48Z", "author_association": "OWNER", "body": "Maybe I should also refactor the plugin documentation, as contemplated in #687.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 642127307, "label": "Add instructions for using cookiecutter plugin template to plugin docs"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/642#issuecomment-646930160", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/642", "id": 646930160, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NjkzMDE2MA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-20T03:20:25Z", "updated_at": "2020-06-20T03:20:25Z", "author_association": "OWNER", "body": "Shipped this today! https://github.com/simonw/datasette-plugin is a cookiecutter template for creating new plugins.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 529429214, "label": "Provide a cookiecutter template for creating new plugins"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/642#issuecomment-646930059", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/642", "id": 646930059, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NjkzMDA1OQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-20T03:19:57Z", "updated_at": "2020-06-20T03:19:57Z", "author_association": "OWNER", "body": "@psychemedia sorry I missed your comment before.\r\n\r\nNiche Museums is definitely the best example of custom templates at the moment: https://github.com/simonw/museums/tree/master/templates\r\n\r\nI want to comprehensively document the variables made available to custom templates before shipping Datasette 1.0 - just filed that as #857.", "reactions": "{\"total_count\": 0, \"+1\": 
0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 529429214, "label": "Provide a cookiecutter template for creating new plugins"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/855#issuecomment-646928638", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/855", "id": 646928638, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NjkyODYzOA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-20T03:09:41Z", "updated_at": "2020-06-20T03:09:41Z", "author_association": "OWNER", "body": "I've shipped the cookiecutter template and used it to build https://github.com/simonw/datasette-saved-queries - it's ready to add to the official documentation.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 642127307, "label": "Add instructions for using cookiecutter plugin template to plugin docs"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/852#issuecomment-646905073", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/852", "id": 646905073, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NjkwNTA3Mw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-20T00:21:34Z", "updated_at": "2020-06-20T00:22:28Z", "author_association": "OWNER", "body": "New repo: https://github.com/simonw/datasette-saved-queries - which I created using the new cookiecutter template at https://github.com/simonw/datasette-plugin", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 640917326, "label": "canned_queries() plugin hook"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/852#issuecomment-646760805", "issue_url": 
"https://api.github.com/repos/simonw/datasette/issues/852", "id": 646760805, "node_id": "MDEyOklzc3VlQ29tbWVudDY0Njc2MDgwNQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-19T17:07:45Z", "updated_at": "2020-06-19T17:07:45Z", "author_association": "OWNER", "body": "Plugin idea: `datasette-saved-queries` - it uses the `startup` hook to initialize a `saved_queries` table, then uses the `canned_queries` hook to add a writable canned query for saving records to that table.\r\n\r\nThen it returns any queries from that table as additional canned queries.\r\n\r\nBonus idea: it could write the user's actor_id to a column if they are signed in, and provide a link to see \"just my saved queries\" in that case.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 640917326, "label": "canned_queries() plugin hook"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/849#issuecomment-646686493", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/849", "id": 646686493, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NjY4NjQ5Mw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-19T15:04:51Z", "updated_at": "2020-06-19T15:04:51Z", "author_association": "OWNER", "body": "https://twitter.com/jaffathecake/status/1273983493006077952 concerns what happens to open pull requests - they will automatically close when you remove `master` unless you repoint them to `main` first.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 639072811, "label": "Rename master branch to main"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/852#issuecomment-646396772", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/852", "id": 
646396772, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NjM5Njc3Mg==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-19T02:16:47Z", "updated_at": "2020-06-19T02:16:47Z", "author_association": "OWNER", "body": "I'll close this once I've built a plugin against it.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 640917326, "label": "canned_queries() plugin hook"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/852#issuecomment-646396690", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/852", "id": 646396690, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NjM5NjY5MA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-19T02:16:24Z", "updated_at": "2020-06-19T02:16:24Z", "author_association": "OWNER", "body": "Documentation: https://datasette.readthedocs.io/en/latest/plugins.html#canned-queries-datasette-database-actor", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 640917326, "label": "canned_queries() plugin hook"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/852#issuecomment-646396499", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/852", "id": 646396499, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NjM5NjQ5OQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-19T02:15:49Z", "updated_at": "2020-06-19T02:15:58Z", "author_association": "OWNER", "body": "Released an alpha preview in https://github.com/simonw/datasette/releases/tag/0.45a1\r\n\r\nWrote about this here: https://simonwillison.net/2020/Jun/19/datasette-alphas/", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": 
{"value": 640917326, "label": "canned_queries() plugin hook"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/852#issuecomment-646350530", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/852", "id": 646350530, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NjM1MDUzMA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-18T23:13:57Z", "updated_at": "2020-06-18T23:14:11Z", "author_association": "OWNER", "body": "```python\r\n@hookspec\r\ndef canned_queries(datasette, database, actor):\r\n \"Return a dictionary of canned query definitions or an awaitable function that returns them\"\r\n```", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 640917326, "label": "canned_queries() plugin hook"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/852#issuecomment-646329456", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/852", "id": 646329456, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NjMyOTQ1Ng==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-18T22:07:09Z", "updated_at": "2020-06-18T22:07:37Z", "author_association": "OWNER", "body": "It would be neat if the queries returned by this hook could be restricted to specific users. I think I can do that by returning an \"allow\" block as part of the query.\r\n\r\nBut... 
what if we allow users to save private queries and we might have thousands of users each with hundreds of saved queries?\r\n\r\nFor that case it would be good if the plugin hook could take an optional `actor` parameter.\r\n\r\nThis would also allow us to dynamically generate a canned query for \"return the bookmarks belonging to this actor\" or similar!", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 640917326, "label": "canned_queries() plugin hook"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/807#issuecomment-646320237", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/807", "id": 646320237, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NjMyMDIzNw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-18T21:41:16Z", "updated_at": "2020-06-18T21:41:16Z", "author_association": "OWNER", "body": "https://pypi.org/project/datasette/0.45a0/ is the release on PyPI.\r\n\r\nAnd in a fresh virtual environment:\r\n\r\n```\r\n$ pip install datasette==0.45a0\r\n...\r\n$ datasette --version\r\ndatasette, version 0.45a0\r\n```\r\nBut running `pip install datasette` still gets 0.44.\r\n\r\n\r\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 632843030, "label": "Ability to ship alpha and beta releases"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/807#issuecomment-646319315", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/807", "id": 646319315, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NjMxOTMxNQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-18T21:38:56Z", "updated_at": "2020-06-18T21:38:56Z", "author_association": "OWNER", "body": "This 
worked!\r\n\r\nhttps://pypi.org/project/datasette/#history\r\n\r\n\"Banners_and_Alerts_and_datasette_\u00b7_PyPI\"\r\n\r\nhttps://github.com/simonw/datasette/releases/tag/0.45a0 is my manually created GitHub prerelease.\r\n\r\nhttps://datasette.readthedocs.io/en/latest/changelog.html#a0-2020-06-18 has the release notes.\r\n\r\nA shame Read The Docs doesn't seem to build the docs for these releases -it's not showing the tag in the releases pane here:\r\n\r\n\"Changelog_\u2014_Datasette_documentation\"\r\n\r\nAlso the new tag isn't an option in the Build menu on https://readthedocs.org/projects/datasette/builds/\r\n\r\nNot a big problem though since the \"latest\" tag on Read The Docs will still carry the in-development documentation.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 632843030, "label": "Ability to ship alpha and beta releases"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/835#issuecomment-646308467", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/835", "id": 646308467, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NjMwODQ2Nw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-18T21:12:50Z", "updated_at": "2020-06-18T21:12:50Z", "author_association": "OWNER", "body": "Problem there is Login CSRF attacks: https://cheatsheetseries.owasp.org/cheatsheets/Cross-Site_Request_Forgery_Prevention_Cheat_Sheet.html#login-csrf - I still want to perform CSRF checks on login forms, even though the user may not yet have any cookies.\r\n\r\nMaybe I can turn off CSRF checks for cookie-free requests but allow login forms to specifically opt back in to CSRF protection?", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 637363686, "label": "Mechanism for skipping 
CSRF checks on API posts"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/835#issuecomment-646307083", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/835", "id": 646307083, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NjMwNzA4Mw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-18T21:09:35Z", "updated_at": "2020-06-18T21:09:35Z", "author_association": "OWNER", "body": "So maybe one really easy fix here is to disable CSRF checks entirely for any request that doesn't have any cookies? Also suggested here: https://twitter.com/mrkurt/status/1273682965168603137", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 637363686, "label": "Mechanism for skipping CSRF checks on API posts"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/807#issuecomment-646303240", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/807", "id": 646303240, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NjMwMzI0MA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-18T21:00:41Z", "updated_at": "2020-06-18T21:00:41Z", "author_association": "OWNER", "body": "New documentation about the alpha/beta releases: https://datasette.readthedocs.io/en/latest/contributing.html#contributing-alpha-beta", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 632843030, "label": "Ability to ship alpha and beta releases"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/807#issuecomment-646302909", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/807", "id": 646302909, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NjMwMjkwOQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": 
"2020-06-18T21:00:02Z", "updated_at": "2020-06-18T21:00:02Z", "author_association": "OWNER", "body": "Alpha release is running through Travis now: https://travis-ci.org/github/simonw/datasette/builds/699864168", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 632843030, "label": "Ability to ship alpha and beta releases"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/807#issuecomment-646293670", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/807", "id": 646293670, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NjI5MzY3MA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-18T20:38:50Z", "updated_at": "2020-06-18T20:38:50Z", "author_association": "OWNER", "body": "https://pypi.org/project/datasette-render-images/#history worked:\r\n\r\n\"Banners_and_Alerts_and_datasette-render-images_\u00b7_PyPI\"\r\n\r\nI'm now confident enough that I'll make these changes and ship an alpha of Datasette itself.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 632843030, "label": "Ability to ship alpha and beta releases"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/807#issuecomment-646293029", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/807", "id": 646293029, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NjI5MzAyOQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-18T20:37:28Z", "updated_at": "2020-06-18T20:37:46Z", "author_association": "OWNER", "body": "Here's the Read The Docs documentation on versioned releases: https://docs.readthedocs.io/en/stable/versions.html\r\n\r\nIt looks like they do the right thing:\r\n\r\n> We in fact are parsing your tag names against the rules given by PEP 440. 
This spec allows \u201cnormal\u201d version numbers like 1.4.2 as well as pre-releases. An alpha version or a release candidate are examples of pre-releases and they look like this: 2.0a1.\r\n> \r\n> We only consider non pre-releases for the stable version of your documentation.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 632843030, "label": "Ability to ship alpha and beta releases"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/807#issuecomment-646292578", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/807", "id": 646292578, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NjI5MjU3OA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-18T20:36:22Z", "updated_at": "2020-06-18T20:36:22Z", "author_association": "OWNER", "body": "https://travis-ci.com/github/simonw/datasette-render-images/builds/172118541 demonstrates that the alpha/beta conditional is working as intended:\r\n\r\n\"Banners_and_Alerts_and_Build__13_-_simonw_datasette-render-images_-_Travis_CI\"", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 632843030, "label": "Ability to ship alpha and beta releases"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/807#issuecomment-646291309", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/807", "id": 646291309, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NjI5MTMwOQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-18T20:33:31Z", "updated_at": "2020-06-18T20:33:31Z", "author_association": "OWNER", "body": "One more experiment: I'm going to ship `datasette-render-images` 0.2 and see if that works correctly - including printing out the new debug section I put in the Travis config 
here: https://github.com/simonw/datasette-render-images/blob/6b5f22dab75ca364f671f5597556d2665a251bd8/.travis.yml#L35-L39 - which should demonstrate if my conditional for pushing to Docker Hub will work or not.\r\n\r\nIn the alpha releasing run on Travis that echo statement did NOT execute: https://travis-ci.com/github/simonw/datasette-render-images/builds/172116625", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 632843030, "label": "Ability to ship alpha and beta releases"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/807#issuecomment-646290171", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/807", "id": 646290171, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NjI5MDE3MQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-18T20:30:48Z", "updated_at": "2020-06-18T20:30:48Z", "author_association": "OWNER", "body": "OK, I just shipped 0.2a0 of `datasette-render-images` - https://pypi.org/project/datasette-render-images/ has no indication of that:\r\n\r\n\"Banners_and_Alerts_and_datasette-render-images_\u00b7_PyPI\"\r\n\r\nBut this page does: https://pypi.org/project/datasette-render-images/#history\r\n\r\n\"Banners_and_Alerts_and_datasette-render-images_\u00b7_PyPI\"\r\n\r\nAnd https://pypi.org/project/datasette-render-images/0.2a0/ exists.\r\n\r\nIn a fresh virtual environment `pip install datasette-render-images` gets 0.1.\r\n\r\n`pip install datasette-render-images==0.2a0` gets 0.2a0.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 632843030, "label": "Ability to ship alpha and beta releases"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/835#issuecomment-646288146", "issue_url": 
"https://api.github.com/repos/simonw/datasette/issues/835", "id": 646288146, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NjI4ODE0Ng==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-18T20:26:22Z", "updated_at": "2020-06-18T20:26:31Z", "author_association": "OWNER", "body": "Useful tip from Carlton Gibson: https://twitter.com/carltongibson/status/1273680590672453632\r\n\r\n> DRF makes ALL views CSRF exempt and then enforces CSRF if you're using Session auth only. \r\n>\r\n> View: https://github.com/encode/django-rest-framework/blob/e18e40d6ae42457f60ca9c68054ad40d15ba8433/rest_framework/views.py#L144\r\n> Auth: https://github.com/encode/django-rest-framework/blob/e18e40d6ae42457f60ca9c68054ad40d15ba8433/rest_framework/authentication.py#L130", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 637363686, "label": "Mechanism for skipping CSRF checks on API posts"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/807#issuecomment-646280134", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/807", "id": 646280134, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NjI4MDEzNA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-18T20:08:15Z", "updated_at": "2020-06-18T20:08:15Z", "author_association": "OWNER", "body": "https://github.com/simonw/datasette-render-images uses Travis and is low-risk for trying this out.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 632843030, "label": "Ability to ship alpha and beta releases"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/807#issuecomment-646279428", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/807", "id": 646279428, "node_id": 
"MDEyOklzc3VlQ29tbWVudDY0NjI3OTQyOA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-18T20:06:43Z", "updated_at": "2020-06-18T20:06:43Z", "author_association": "OWNER", "body": "I'm going to try this on a separate repository so I don't accidentally publish a Datasette release I didn't mean to publish!", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 632843030, "label": "Ability to ship alpha and beta releases"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/807#issuecomment-646279280", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/807", "id": 646279280, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NjI3OTI4MA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-18T20:06:24Z", "updated_at": "2020-06-18T20:06:24Z", "author_association": "OWNER", "body": "So maybe this condition is right?\r\n\r\n if: (tag IS present) AND NOT (tag =~ [ab])", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 632843030, "label": "Ability to ship alpha and beta releases"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/807#issuecomment-646278801", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/807", "id": 646278801, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NjI3ODgwMQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-18T20:05:18Z", "updated_at": "2020-06-18T20:05:18Z", "author_association": "OWNER", "body": "Travis conditions documentation: https://docs.travis-ci.com/user/conditions-v1\r\n\r\nThese look useful:\r\n```\r\nbranch =~ /^(one|two)-three$/\r\n(tag =~ ^v) AND (branch = master)\r\n```\r\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, 
\"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 632843030, "label": "Ability to ship alpha and beta releases"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/807#issuecomment-646277680", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/807", "id": 646277680, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NjI3NzY4MA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-18T20:02:42Z", "updated_at": "2020-06-18T20:02:42Z", "author_association": "OWNER", "body": "So I think if I push a tag of `0.45a0` everything might just work - Travis will build it, push the build to PyPI, PyPI won't treat it as a stable release.\r\n\r\nExcept... I don't want to push alphas as Docker images - so I need to fix this code:\r\n\r\nhttps://github.com/simonw/datasette/blob/6151c25a5a8d566c109af296244b9267c536bd9a/.travis.yml#L34-L43", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 632843030, "label": "Ability to ship alpha and beta releases"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/807#issuecomment-646277155", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/807", "id": 646277155, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NjI3NzE1NQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-18T20:01:31Z", "updated_at": "2020-06-18T20:01:31Z", "author_association": "OWNER", "body": "I thought I might have to update a regex (my CircleCI configs won't match on `a0`, [example](https://github.com/simonw/datasette-publish-now/blob/420f349b278857f62183d8e9835d64f116758be7/.circleci/config.yml#L22)) but it turns out Travis is currently configured to treat ALL tags as potential releases:\r\n\r\nhttps://github.com/simonw/datasette/blob/6151c25a5a8d566c109af296244b9267c536bd9a/.travis.yml#L21-L35", 
"reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 632843030, "label": "Ability to ship alpha and beta releases"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/807#issuecomment-646276150", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/807", "id": 646276150, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NjI3NjE1MA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-18T19:59:17Z", "updated_at": "2020-06-18T19:59:17Z", "author_association": "OWNER", "body": "Relevant PEP: https://www.python.org/dev/peps/pep-0440/\r\n\r\nDjango's implementation dates back 8 years: https://github.com/django/django/commit/40f0ecc56a23d35c2849f8e79276f6d8931412d1\r\n\r\nFrom the PEP:\r\n\r\n> Implicit pre-release number\r\n>\r\n> Pre releases allow omitting the numeral in which case it is implicitly assumed to be 0. The normal form for this is to include the 0 explicitly. 
This allows versions such as 1.2a which is normalized to 1.2a0.\r\n\r\nI'm going to habitually include the 0.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 632843030, "label": "Ability to ship alpha and beta releases"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/807#issuecomment-646273035", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/807", "id": 646273035, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NjI3MzAzNQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-18T19:52:28Z", "updated_at": "2020-06-18T19:52:28Z", "author_association": "OWNER", "body": "I'd like this soon, because I want to start experimenting with things like #852 and #842 without shipping those plugin hooks in a full stable release.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 632843030, "label": "Ability to ship alpha and beta releases"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/842#issuecomment-646272627", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/842", "id": 646272627, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NjI3MjYyNw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-18T19:51:32Z", "updated_at": "2020-06-18T19:51:32Z", "author_association": "OWNER", "body": "I'd be OK with the first version of this not including a plugin hook.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 638212085, "label": "Magic parameters for canned queries"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/842#issuecomment-646264051", "issue_url": 
"https://api.github.com/repos/simonw/datasette/issues/842", "id": 646264051, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NjI2NDA1MQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-18T19:32:13Z", "updated_at": "2020-06-18T19:32:37Z", "author_association": "OWNER", "body": "If every magic parameter has a prefix and suffix, like `_request_ip` and `_actor_id`, then plugins could register a function for a prefix. Register a function to `_actor` and `actor(\"id\")` will be called for `_actor_id`.\r\n\r\nBut does it make sense for every magic parameter to be of form `_a_b`? I think so.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 638212085, "label": "Magic parameters for canned queries"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/842#issuecomment-646246062", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/842", "id": 646246062, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NjI0NjA2Mg==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-18T18:54:41Z", "updated_at": "2020-06-18T18:54:41Z", "author_association": "OWNER", "body": "The `_actor_id` param makes this a bit trickier, because we can't just say \"if you see an unknown parameter called X call this function\" - our magic parameter logic isn't adding single parameters, it might add a whole family of them.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 638212085, "label": "Magic parameters for canned queries"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/842#issuecomment-646242172", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/842", "id": 646242172, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NjI0MjE3Mg==", "user": {"value": 9599, 
"label": "simonw"}, "created_at": "2020-06-18T18:46:06Z", "updated_at": "2020-06-18T18:53:31Z", "author_association": "OWNER", "body": "Yes that can work - and using `__missing__` (new in Python 3) is nicer because then the regular dictionary gets checked first:\r\n```python\r\nimport sqlite3\r\n\r\nconn = sqlite3.connect(\":memory:\")\r\n\r\n\r\nclass Magic(dict):\r\n    def __missing__(self, key):\r\n        return key.upper()\r\n\r\n\r\nconn.execute(\"select :name\", Magic()).fetchall()\r\n```\r\nOutputs:\r\n```\r\n[('NAME',)]\r\n```", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 638212085, "label": "Magic parameters for canned queries"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/842#issuecomment-646238702", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/842", "id": 646238702, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NjIzODcwMg==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-18T18:39:07Z", "updated_at": "2020-06-18T18:39:07Z", "author_association": "OWNER", "body": "It would be nice if Datasette didn't have to do any additional work to find e.g. 
`_request_ip` if that parameter turned out not to be used by the query.\r\n\r\nCould I do this with a custom class that implements `__getitem__()` and then gets passed as SQLite arguments?", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 638212085, "label": "Magic parameters for canned queries"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/820#issuecomment-646218809", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/820", "id": 646218809, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NjIxODgwOQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-18T17:58:02Z", "updated_at": "2020-06-18T17:58:02Z", "author_association": "OWNER", "body": "I had the same idea again ten days later: #852.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 635049296, "label": "Idea: Plugin hook for registering canned queries"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/835#issuecomment-646217766", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/835", "id": 646217766, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NjIxNzc2Ng==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-18T17:55:54Z", "updated_at": "2020-06-18T17:56:04Z", "author_association": "OWNER", "body": "Idea: a mechanism where the `asgi_csrf()` can take an optional `should_protect()` callback function which gets called with the `scope` and decides if the current request should be protected or not. It can then look at headers and paths and suchlike and make its own decisions. 
Datasette could then provide a `should_protect()` callback which can interact with plugins.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 637363686, "label": "Mechanism for skipping CSRF checks on API posts"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/835#issuecomment-646216934", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/835", "id": 646216934, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NjIxNjkzNA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-18T17:54:14Z", "updated_at": "2020-06-18T17:54:14Z", "author_association": "OWNER", "body": "> if you did Origin based CSRF checks, then could the absence of an Origin header be used?\r\nhttps://twitter.com/cnorthwood/status/1273674392757829632", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 637363686, "label": "Mechanism for skipping CSRF checks on API posts"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/835#issuecomment-646214158", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/835", "id": 646214158, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NjIxNDE1OA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-18T17:48:45Z", "updated_at": "2020-06-18T17:48:45Z", "author_association": "OWNER", "body": "I wonder if it's safe to generically say \"Don't do CSRF protection on any request that includes a `Authorization: Bearer...` header - because it's not possible for a regular browser to send that header since the format is different from the header used in browser-based HTTP basic auth?", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 
0}", "issue": {"value": 637363686, "label": "Mechanism for skipping CSRF checks on API posts"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/835#issuecomment-646209520", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/835", "id": 646209520, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NjIwOTUyMA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-18T17:39:30Z", "updated_at": "2020-06-18T17:40:53Z", "author_association": "OWNER", "body": "`datasette-auth-tokens` could switch to using `asgi_wrapper` instead of `actor_from_request` - then it could add a `scope[\"skip_csrf\"] = True` scope property to indicate that CSRF should not be protected.\r\n\r\nSince `asgi_wrapper` wraps the CSRF protection middleware changes made to the `scope` by an `asgi_wrapper` will be visible to the CSRF middleware:\r\n\r\nhttps://github.com/simonw/datasette/blob/d2aef9f7ef30fa20b1450cd181cf803f44fb4e21/datasette/app.py#L877-L888", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 637363686, "label": "Mechanism for skipping CSRF checks on API posts"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/835#issuecomment-646204308", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/835", "id": 646204308, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NjIwNDMwOA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-18T17:32:41Z", "updated_at": "2020-06-18T17:32:41Z", "author_association": "OWNER", "body": "The only way I can think of for a view to opt-out of CSRF protection is for them to be able to reconfigure the `asgi-csrf` middleware to skip specific URL patterns.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 637363686, 
"label": "Mechanism for skipping CSRF checks on API posts"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/835#issuecomment-646175055", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/835", "id": 646175055, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NjE3NTA1NQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-18T17:00:45Z", "updated_at": "2020-06-18T17:00:45Z", "author_association": "OWNER", "body": "Here's the Rails pattern for this: https://gist.github.com/maxivak/a25957942b6c21a41acd", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 637363686, "label": "Mechanism for skipping CSRF checks on API posts"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/835#issuecomment-646172200", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/835", "id": 646172200, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NjE3MjIwMA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-18T16:57:45Z", "updated_at": "2020-06-18T16:57:45Z", "author_association": "OWNER", "body": "I think there are a couple of steps to this one.\r\n\r\nThe nature of CSRF is that it's about hijacking existing authentication credentials. If your Datasette site runs without any authentication plugins at all CSRF protection isn't actually useful.\r\n\r\nSome POST endpoints should be able to opt-out of CSRF protection entirely. A writable canned query that accepts anonymous poll submissions for example might determine that CSRF is not needed.\r\n\r\nIf a plugin adds `Authorization: Bearer xxx` token support that plugin should also be able to specify that CSRF protection can be skipped. 
https://github.com/simonw/datasette-auth-tokens could do this.\r\n\r\nThis means I need two new mechanisms:\r\n\r\n- A way for wrapped views to indicate \"actually don't CSRF protect me\". I'm not sure how feasible this is without a major redesign, since the decision to return a 403 forbidden status is made before the wrapped function has even been called.\r\n- A way for authentication plugins like `datasette-auth-tokens` to say \"CSRF protection is not needed for this request\". This is a bit tricky too, since right now the `actor_from_request` hook doesn't have a channel for information other than returning the actor dictionary.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 637363686, "label": "Mechanism for skipping CSRF checks on API posts"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/835#issuecomment-646151706", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/835", "id": 646151706, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NjE1MTcwNg==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-18T16:36:23Z", "updated_at": "2020-06-18T16:36:23Z", "author_association": "OWNER", "body": "Tweeted about this here: https://twitter.com/simonw/status/1273655053170077701", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 637363686, "label": "Mechanism for skipping CSRF checks on API posts"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/853#issuecomment-646140022", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/853", "id": 646140022, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NjE0MDAyMg==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-18T16:21:53Z", "updated_at": "2020-06-18T16:21:53Z", 
"author_association": "OWNER", "body": "I have a test that demonstrates this working, but also demonstrates that the CSRF protection from #798 makes this really tricky to work with. I'd like to improve that.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 640943441, "label": "Ensure register_routes() works for POST"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/852#issuecomment-645785830", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/852", "id": 645785830, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NTc4NTgzMA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-18T05:37:00Z", "updated_at": "2020-06-18T05:37:00Z", "author_association": "OWNER", "body": "The easiest way to do this would be with a new plugin hook:\r\n\r\n    def canned_queries(datasette, database):\r\n        \"\"\"Return a list of canned query definitions\r\n        or an awaitable function that returns them\"\"\"\r\n\r\nAnother approach would be to make the whole of `metadata.json` customizable by plugins.\r\n\r\nI think I like the dedicated `canned_queries` option better. 
I'm not happy with the way metadata keeps growing - see #493 - so adding a dedicated hook would be more future proof against other changes I might make to the metadata mechanism.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 640917326, "label": "canned_queries() plugin hook"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/852#issuecomment-645781482", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/852", "id": 645781482, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NTc4MTQ4Mg==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-18T05:24:55Z", "updated_at": "2020-06-18T05:25:00Z", "author_association": "OWNER", "body": "Question about this on Twitter: https://twitter.com/amjithr/status/1273440766862352384", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 640917326, "label": "canned_queries() plugin hook"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/twitter-to-sqlite/issues/47#issuecomment-645599881", "issue_url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/47", "id": 645599881, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NTU5OTg4MQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-17T20:13:48Z", "updated_at": "2020-06-17T20:13:48Z", "author_association": "MEMBER", "body": "I've now figured out how to compile specific SQLite versions to help replicate this problem: https://github.com/simonw/til/blob/master/sqlite/ld-preload.md\r\n\r\nNext step: replicate the problem!", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 639542974, "label": "Fall back to FTS4 if FTS5 is not 
available"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/twitter-to-sqlite/issues/47#issuecomment-645515103", "issue_url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/47", "id": 645515103, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NTUxNTEwMw==", "user": {"value": 73579, "label": "hpk42"}, "created_at": "2020-06-17T17:30:01Z", "updated_at": "2020-06-17T17:30:01Z", "author_association": "NONE", "body": "It's the one with python3.7::\n\n >>> sqlite3.sqlite_version\n '3.11.0'\n\n \nOn Wed, Jun 17, 2020 at 10:24 -0700, Simon Willison wrote:\n\n> That means your version of SQLite is old enough that it doesn't support the FTS5 extension.\n> \n> Could you share what operating system you're running, and what the output is that you get from running this?\n> \n> python -c 'import sqlite3; print(sqlite3.connect(\":memory:\").execute(\"select sqlite_version()\").fetchone()[0])'\n> \n> I can teach this tool to fall back on FTS4 if FTS5 isn't available.\n> \n> -- \n> You are receiving this because you authored the thread.\n> Reply to this email directly or view it on GitHub:\n> https://github.com/dogsheep/twitter-to-sqlite/issues/47#issuecomment-645512127\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 639542974, "label": "Fall back to FTS4 if FTS5 is not available"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/twitter-to-sqlite/issues/47#issuecomment-645512127", "issue_url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/47", "id": 645512127, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NTUxMjEyNw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-17T17:24:22Z", "updated_at": "2020-06-17T17:24:22Z", "author_association": "MEMBER", "body": "That means your version of SQLite is old enough that it doesn't support the FTS5 extension.\r\n\r\nCould you share 
what operating system you're running, and what the output is that you get from running this?\r\n\r\n python -c 'import sqlite3; print(sqlite3.connect(\":memory:\").execute(\"select sqlite_version()\").fetchone()[0])'\r\n\r\nI can teach this tool to fall back on FTS4 if FTS5 isn't available.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 639542974, "label": "Fall back to FTS4 if FTS5 is not available"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/851#issuecomment-645293374", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/851", "id": 645293374, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NTI5MzM3NA==", "user": {"value": 3243482, "label": "abdusco"}, "created_at": "2020-06-17T10:32:02Z", "updated_at": "2020-06-17T10:32:28Z", "author_association": "CONTRIBUTOR", "body": "Welp, I'm an idiot.\r\n\r\nTurns out I had a sneaky comma `,` after `sql` key:\r\n```\r\n... (:name, :url),\r\n```\r\nwhich tells sqlite to expect another `values(...)` list.\r\n\r\nCorrecting the SQL solved the issue. 
\r\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 640330278, "label": "Having trouble getting writable canned queries to work"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/850#issuecomment-645068128", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/850", "id": 645068128, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NTA2ODEyOA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-16T23:52:16Z", "updated_at": "2020-06-16T23:52:16Z", "author_association": "OWNER", "body": "https://aws.amazon.com/blogs/compute/announcing-http-apis-for-amazon-api-gateway/ looks very important here: AWS HTTP APIs were introduced in December 2019 and appear to be a third of the price of API Gateway.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 639993467, "label": "Proof of concept for Datasette on AWS Lambda with EFS"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/236#issuecomment-645067611", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/236", "id": 645067611, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NTA2NzYxMQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-16T23:50:12Z", "updated_at": "2020-06-16T23:50:59Z", "author_association": "OWNER", "body": "As for your other questions:\r\n\r\n> 1. I assume the goal is to have a CORS-friendly HTTPS endpoint that hosts the datasette service + user's db.\r\n\r\nYes, exactly. 
I know this will limit the size of database that can be deployed (since Lambda has a 50MB total package limit as far as I can tell) but there are plenty of interesting databases that are small enough to fit there.\r\n\r\nThe new EFS support for Lambda means that theoretically the size of database is now unlimited, which is really interesting. That's what got me inspired to take a look at a proof of concept in #850.\r\n\r\n> 2. If that's the goal, I think Lambda alone is insufficient. Lambda provides the compute fabric, but not the HTTP routing. You'd also need to add Application Load Balancer or API Gateway to provide an HTTP endpoint that routes to the lambda function.\r\n> \r\n> Do you have a preference between ALB or API GW? ALB has better economics at scale, but has a minimum monthly cost. API GW has worse per-request economics, but scales to zero when no requests are happening.\r\n\r\nI personally like scale-to-zero because many of my projects are likely to receive very little traffic. So API GW first, and maybe ALB as an option later on for people operating at scale?\r\n\r\n> 3. Does Datasette have any native components, or is it all pure python? If it has native bits, they'll likely need to be recompiled to work on Amazon Linux 2.\r\n\r\nAs you've found, the only native component is uvloop which is only needed if uvicorn is being used to serve requests.\r\n\r\n> 4. There are a few disparate services that need to be wired together to expose a Python service securely to the web. If I was doing this outside of the datasette publish system, I'd use an AWS CloudFormation template. Even within datasette, I think it still makes sense to use a CloudFormation template and just have the publish plugin invoke it (via the standard `aws` cli) with user-specified parameters. Does that sound reasonable to you?\r\n\r\nFor the eventual \"datasette publish lambda\" command I want whatever results in the smallest amount of inconvenience for users. 
I've been trying out Amazon SAM in #850 and it requires users to run Docker on their machines, which is a pretty huge barrier to entry! I don't have much experience with CloudFormation but it's probably a better bet, especially if you can \"pip install\" the dependencies needed to deploy with it.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 317001500, "label": "datasette publish lambda plugin"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/236#issuecomment-645066486", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/236", "id": 645066486, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NTA2NjQ4Ng==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-16T23:45:45Z", "updated_at": "2020-06-16T23:45:45Z", "author_association": "OWNER", "body": "Hi Colin,\r\n\r\nSorry I didn't see this sooner! I've just started digging into this myself, to try and play with the new EFS Lambda support: #850.\r\n\r\nYes, uvloop is only needed because of uvicorn. 
I have a branch here that removes that dependency just for trying out Lambda: https://github.com/simonw/datasette/tree/no-uvicorn - so you can run `pip install https://github.com/simonw/datasette/archive/no-uvicorn.zip` to get that.\r\n\r\nI'm going to try out your `datasette-lambda` project next - really excited to see how far you've got with it.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 317001500, "label": "datasette publish lambda plugin"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/850#issuecomment-645064332", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/850", "id": 645064332, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NTA2NDMzMg==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-16T23:37:34Z", "updated_at": "2020-06-16T23:37:34Z", "author_association": "OWNER", "body": "Just realized Colin Dellow reported an issue with Datasette and Mangum back in April - #719 - and has in fact been working on https://github.com/code402/datasette-lambda for a while!", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 639993467, "label": "Proof of concept for Datasette on AWS Lambda with EFS"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/850#issuecomment-645063386", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/850", "id": 645063386, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NTA2MzM4Ng==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-16T23:34:07Z", "updated_at": "2020-06-16T23:34:07Z", "author_association": "OWNER", "body": "Tried `sam local invoke`:\r\n```\r\nsimon@Simons-MacBook-Pro datasette-proof-of-concept % sam local invoke\r\nInvoking app.lambda_handler 
(python3.8)\r\n\r\nFetching lambci/lambda:python3.8 Docker container image......\r\nMounting /private/tmp/datasette-proof-of-concept/.aws-sam/build/HelloWorldFunction as /var/task:ro,delegated inside runtime container\r\nSTART RequestId: 7c04480b-5d42-168e-dec0-4e8bf34fa596 Version: $LATEST\r\n[INFO]\t2020-06-16T23:33:27.24Z\t7c04480b-5d42-168e-dec0-4e8bf34fa596\tWaiting for application startup.\r\n[INFO]\t2020-06-16T23:33:27.24Z\t7c04480b-5d42-168e-dec0-4e8bf34fa596\tLifespanCycleState.STARTUP: 'lifespan.startup.complete' event received from application.\r\n[INFO]\t2020-06-16T23:33:27.24Z\t7c04480b-5d42-168e-dec0-4e8bf34fa596\tApplication startup complete.\r\n[INFO]\t2020-06-16T23:33:27.24Z\t7c04480b-5d42-168e-dec0-4e8bf34fa596\tWaiting for application shutdown.\r\n[INFO]\t2020-06-16T23:33:27.24Z\t7c04480b-5d42-168e-dec0-4e8bf34fa596\tLifespanCycleState.SHUTDOWN: 'lifespan.shutdown.complete' event received from application.\r\n[ERROR] KeyError: 'requestContext'\r\nTraceback (most recent call last):\r\n\u00a0\u00a0File \"/var/task/mangum/adapter.py\", line 110, in __call__\r\n\u00a0\u00a0\u00a0\u00a0return self.handler(event, context)\r\n\u00a0\u00a0File \"/var/task/mangum/adapter.py\", line 130, in handler\r\n\u00a0\u00a0\u00a0\u00a0if \"eventType\" in event[\"requestContext\"]:\r\nEND RequestId: 7c04480b-5d42-168e-dec0-4e8bf34fa596\r\nREPORT RequestId: 7c04480b-5d42-168e-dec0-4e8bf34fa596\tInit Duration: 1120.76 ms\tDuration: 7.08 ms\tBilled Duration: 100 ms\tMemory Size: 128 MBMax Memory Used: 47 MB\t\r\n\r\n{\"errorType\":\"KeyError\",\"errorMessage\":\"'requestContext'\",\"stackTrace\":[\" File \\\"/var/task/mangum/adapter.py\\\", line 110, in __call__\\n return self.handler(event, context)\\n\",\" File \\\"/var/task/mangum/adapter.py\\\", line 130, in handler\\n if \\\"eventType\\\" in event[\\\"requestContext\\\"]:\\n\"]}\r\n```", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 
0, \"eyes\": 0}", "issue": {"value": 639993467, "label": "Proof of concept for Datasette on AWS Lambda with EFS"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/850#issuecomment-645062266", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/850", "id": 645062266, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NTA2MjI2Ng==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-16T23:30:12Z", "updated_at": "2020-06-16T23:33:12Z", "author_association": "OWNER", "body": "OK, changed `requirements.txt` to this:\r\n```\r\nhttps://github.com/simonw/datasette/archive/no-uvicorn.zip\r\nmangum\r\n```\r\nNow `sam build --use-container` runs without errors. Ran `sam deploy` too.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 639993467, "label": "Proof of concept for Datasette on AWS Lambda with EFS"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/850#issuecomment-645063058", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/850", "id": 645063058, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NTA2MzA1OA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-16T23:32:57Z", "updated_at": "2020-06-16T23:32:57Z", "author_association": "OWNER", "body": "https://q7lymja3sj.execute-api.us-east-1.amazonaws.com/Prod/hello/ is now giving me a 500 internal server error.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 639993467, "label": "Proof of concept for Datasette on AWS Lambda with EFS"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/850#issuecomment-645061088", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/850", "id": 645061088, "node_id": 
"MDEyOklzc3VlQ29tbWVudDY0NTA2MTA4OA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-16T23:25:41Z", "updated_at": "2020-06-16T23:25:41Z", "author_association": "OWNER", "body": "Someone else ran into this problem: https://github.com/iwpnd/fastapi-aws-lambda-example/issues/1\r\n\r\nSo I need to be able to pip install MOST of Datasette, but skip `uvicorn`. Tricky. I'll try installing a custom fork?", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 639993467, "label": "Proof of concept for Datasette on AWS Lambda with EFS"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/850#issuecomment-645060598", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/850", "id": 645060598, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NTA2MDU5OA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-16T23:24:01Z", "updated_at": "2020-06-16T23:24:01Z", "author_association": "OWNER", "body": "I changed `requirements.txt` to this:\r\n```\r\ndatasette\r\nmangum\r\n```\r\nAnd `app.py` to this:\r\n```python\r\nfrom datasette.app import Datasette\r\nfrom mangum import Mangum\r\n\r\n\r\ndatasette = Datasette([], memory=True)\r\nlambda_handler = Mangum(datasette.app())\r\n```\r\nBut then when I ran `sam build --use-container` I got this:\r\n```\r\nsimon@Simons-MacBook-Pro datasette-proof-of-concept % sam build --use-container\r\nStarting Build inside a container\r\nBuilding function 'HelloWorldFunction'\r\n\r\nFetching lambci/lambda:build-python3.8 Docker container image......\r\nMounting /private/tmp/datasette-proof-of-concept/hello_world as /tmp/samcli/source:ro,delegated inside runtime container\r\n\r\nBuild Failed\r\nRunning PythonPipBuilder:ResolveDependencies\r\nError: PythonPipBuilder:ResolveDependencies - {uvloop==0.14.0(wheel)}\r\n```\r\n`uvloop` isn't actually necessary for this 
project, since it's used by `uvicorn` which isn't needed if Lambda is serving ASGI traffic directly.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 639993467, "label": "Proof of concept for Datasette on AWS Lambda with EFS"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/850#issuecomment-645059663", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/850", "id": 645059663, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NTA1OTY2Mw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-16T23:20:46Z", "updated_at": "2020-06-16T23:20:46Z", "author_association": "OWNER", "body": "I added an exclamation mark to hello world and ran `sam deploy` again. https://q7lymja3sj.execute-api.us-east-1.amazonaws.com/Prod/hello/ still shows the old message.\r\n\r\nRunning `sam build --use-container` first and then `sam deploy` did the right thing.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 639993467, "label": "Proof of concept for Datasette on AWS Lambda with EFS"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/850#issuecomment-645058947", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/850", "id": 645058947, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NTA1ODk0Nw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-16T23:18:18Z", "updated_at": "2020-06-16T23:18:18Z", "author_association": "OWNER", "body": "https://q7lymja3sj.execute-api.us-east-1.amazonaws.com/Prod/hello/\r\n\r\nThat's a pretty ugly URL. I'm not sure how to get rid of the `/Prod/` prefix on it. 
Might have to use the `base_url` setting to get something working: https://datasette.readthedocs.io/en/stable/config.html#base-url ", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 639993467, "label": "Proof of concept for Datasette on AWS Lambda with EFS"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/850#issuecomment-645058617", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/850", "id": 645058617, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NTA1ODYxNw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-16T23:17:09Z", "updated_at": "2020-06-16T23:17:09Z", "author_association": "OWNER", "body": "OK, `sam deploy --guided` now works!\r\n```\r\nsimon@Simons-MacBook-Pro datasette-proof-of-concept % sam deploy --guided\r\n\r\nConfiguring SAM deploy\r\n======================\r\n\r\n\tLooking for samconfig.toml : Not found\r\n\r\n\tSetting default arguments for 'sam deploy'\r\n\t=========================================\r\n\tStack Name [sam-app]: datasette-proof-of-concept\r\n\tAWS Region [us-east-1]: \r\n\t#Shows you resources changes to be deployed and require a 'Y' to initiate deploy\r\n\tConfirm changes before deploy [y/N]: \r\n\t#SAM needs permission to be able to create roles to connect to the resources in your template\r\n\tAllow SAM CLI IAM role creation [Y/n]: \r\n\tHelloWorldFunction may not have authorization defined, Is this okay? 
[y/N]: y\r\n\tSave arguments to samconfig.toml [Y/n]: \r\n\r\n\tLooking for resources needed for deployment: Not found.\r\n\tCreating the required resources...\r\n\tSuccessfully created!\r\n\r\n\t\tManaged S3 bucket: aws-sam-cli-managed-default-samclisourcebucket-1ksajo4h62s07\r\n\t\tA different default S3 bucket can be set in samconfig.toml\r\n\r\n\tSaved arguments to config file\r\n\tRunning 'sam deploy' for future deployments will use the parameters saved above.\r\n\tThe above parameters can be changed by modifying samconfig.toml\r\n\tLearn more about samconfig.toml syntax at \r\n\thttps://docs.aws.amazon.com/serverless-application-model/latest/developerguide/serverless-sam-cli-config.html\r\n\r\n\tDeploying with following values\r\n\t===============================\r\n\tStack name : datasette-proof-of-concept\r\n\tRegion : us-east-1\r\n\tConfirm changeset : False\r\n\tDeployment s3 bucket : aws-sam-cli-managed-default-samclisourcebucket-1ksajo4h62s07\r\n\tCapabilities : [\"CAPABILITY_IAM\"]\r\n\tParameter overrides : {}\r\n\r\nInitiating deployment\r\n=====================\r\nUploading to datasette-proof-of-concept/0c208b5656a7aeb6186d49bebc595237 535344 / 535344.0 (100.00%)\r\nHelloWorldFunction may not have authorization defined.\r\nUploading to datasette-proof-of-concept/14bd9ce3e21f9c88634d13c0c9b377e4.template 1147 / 1147.0 (100.00%)\r\n\r\nWaiting for changeset to be created..\r\n\r\nCloudFormation stack changeset\r\n---------------------------------------------------------------------------------------------------------------------------------------------------------\r\nOperation LogicalResourceId ResourceType \r\n---------------------------------------------------------------------------------------------------------------------------------------------------------\r\n+ Add HelloWorldFunctionHelloWorldPermissionProd AWS::Lambda::Permission \r\n+ Add HelloWorldFunctionRole AWS::IAM::Role \r\n+ Add HelloWorldFunction AWS::Lambda::Function \r\n+ Add 
ServerlessRestApiDeployment47fc2d5f9d AWS::ApiGateway::Deployment \r\n+ Add ServerlessRestApiProdStage AWS::ApiGateway::Stage \r\n+ Add ServerlessRestApi AWS::ApiGateway::RestApi \r\n---------------------------------------------------------------------------------------------------------------------------------------------------------\r\n\r\nChangeset created successfully. arn:aws:cloudformation:us-east-1:462092780466:changeSet/samcli-deploy1592349262/d685f2de-87c1-4b8e-b13a-67b94f8fc928\r\n\r\n\r\n2020-06-16 16:14:29 - Waiting for stack create/update to complete\r\n\r\nCloudFormation events from changeset\r\n---------------------------------------------------------------------------------------------------------------------------------------------------------\r\nResourceStatus ResourceType LogicalResourceId ResourceStatusReason \r\n---------------------------------------------------------------------------------------------------------------------------------------------------------\r\nCREATE_IN_PROGRESS AWS::IAM::Role HelloWorldFunctionRole - \r\nCREATE_IN_PROGRESS AWS::IAM::Role HelloWorldFunctionRole Resource creation Initiated \r\nCREATE_COMPLETE AWS::IAM::Role HelloWorldFunctionRole - \r\nCREATE_IN_PROGRESS AWS::Lambda::Function HelloWorldFunction Resource creation Initiated \r\nCREATE_IN_PROGRESS AWS::Lambda::Function HelloWorldFunction - \r\nCREATE_COMPLETE AWS::Lambda::Function HelloWorldFunction - \r\nCREATE_IN_PROGRESS AWS::ApiGateway::RestApi ServerlessRestApi Resource creation Initiated \r\nCREATE_IN_PROGRESS AWS::ApiGateway::RestApi ServerlessRestApi - \r\nCREATE_COMPLETE AWS::ApiGateway::RestApi ServerlessRestApi - \r\nCREATE_IN_PROGRESS AWS::Lambda::Permission HelloWorldFunctionHelloWorldPermissi - \r\n onProd \r\nCREATE_IN_PROGRESS AWS::ApiGateway::Deployment ServerlessRestApiDeployment47fc2d5f9 - \r\n d \r\nCREATE_COMPLETE AWS::ApiGateway::Deployment ServerlessRestApiDeployment47fc2d5f9 - \r\n d \r\nCREATE_IN_PROGRESS AWS::ApiGateway::Deployment 
ServerlessRestApiDeployment47fc2d5f9 Resource creation Initiated \r\n d \r\nCREATE_IN_PROGRESS AWS::Lambda::Permission HelloWorldFunctionHelloWorldPermissi Resource creation Initiated \r\n onProd \r\nCREATE_IN_PROGRESS AWS::ApiGateway::Stage ServerlessRestApiProdStage - \r\nCREATE_COMPLETE AWS::ApiGateway::Stage ServerlessRestApiProdStage - \r\nCREATE_IN_PROGRESS AWS::ApiGateway::Stage ServerlessRestApiProdStage Resource creation Initiated \r\nCREATE_COMPLETE AWS::Lambda::Permission HelloWorldFunctionHelloWorldPermissi - \r\n onProd \r\nCREATE_COMPLETE AWS::CloudFormation::Stack datasette-proof-of-concept - \r\n---------------------------------------------------------------------------------------------------------------------------------------------------------\r\n\r\nCloudFormation outputs from deployed stack\r\n---------------------------------------------------------------------------------------------------------------------------------------------------------\r\nOutputs \r\n---------------------------------------------------------------------------------------------------------------------------------------------------------\r\nKey HelloWorldFunctionIamRole \r\nDescription Implicit IAM Role created for Hello World function \r\nValue arn:aws:iam::462092780466:role/datasette-proof-of-concept-HelloWorldFunctionRole-8MIDNIV5ECA6 \r\n\r\nKey HelloWorldApi \r\nDescription API Gateway endpoint URL for Prod stage for Hello World function \r\nValue https://q7lymja3sj.execute-api.us-east-1.amazonaws.com/Prod/hello/ \r\n\r\nKey HelloWorldFunction \r\nDescription Hello World Lambda Function ARN \r\nValue arn:aws:lambda:us-east-1:462092780466:function:datasette-proof-of-concept-HelloWorldFunction-QTF78ZEUDCB \r\n---------------------------------------------------------------------------------------------------------------------------------------------------------\r\n\r\nSuccessfully created/updated stack - datasette-proof-of-concept in us-east-1\r\n```", "reactions": 
"{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 639993467, "label": "Proof of concept for Datasette on AWS Lambda with EFS"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/850#issuecomment-645056636", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/850", "id": 645056636, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NTA1NjYzNg==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-16T23:10:22Z", "updated_at": "2020-06-16T23:10:22Z", "author_association": "OWNER", "body": "Clicking that button generated me an access key ID / access key secret pair. Dropping those into `~/.aws/credentials` using this format:\r\n```\r\n[default]\r\naws_access_key_id = your_access_key_id\r\naws_secret_access_key = your_secret_access_key\r\n```", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 639993467, "label": "Proof of concept for Datasette on AWS Lambda with EFS"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/850#issuecomment-645055200", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/850", "id": 645055200, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NTA1NTIwMA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-16T23:05:48Z", "updated_at": "2020-06-16T23:05:48Z", "author_association": "OWNER", "body": "Logged in as `simon-administrator` I'm using https://console.aws.amazon.com/iam/home?region=us-east-2#/security_credentials to create credentials:\r\n\r\n\"Banners_and_Alerts_and_IAM_Management_Console\"\r\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 639993467, "label": "Proof of concept for 
Datasette on AWS Lambda with EFS"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/850#issuecomment-645054206", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/850", "id": 645054206, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NTA1NDIwNg==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-16T23:02:54Z", "updated_at": "2020-06-16T23:04:59Z", "author_association": "OWNER", "body": "I think I need to sign in to the AWS console with this new `simon-administrator` account and create IAM credentials for it.\r\n\r\n... for which I needed my root \"account ID\" - a 12 digit number - to use on the IAM login form.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 639993467, "label": "Proof of concept for Datasette on AWS Lambda with EFS"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/850#issuecomment-645053923", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/850", "id": 645053923, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NTA1MzkyMw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-16T23:01:49Z", "updated_at": "2020-06-16T23:01:49Z", "author_association": "OWNER", "body": "I used https://console.aws.amazon.com/billing/home?#/account and activated \"IAM user/role access to billing information\" - what a puzzling first step!\r\n\r\nI created a new user with AWS console access (which means access to the web UI) called `simon-administrator` and set a password. 
I created an `Administrators` group with `AdministratorAccess`.\r\n\r\n\"Banners_and_Alerts_and_IAM_Management_Console\"\r\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 639993467, "label": "Proof of concept for Datasette on AWS Lambda with EFS"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/850#issuecomment-645051972", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/850", "id": 645051972, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NTA1MTk3Mg==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-16T22:55:04Z", "updated_at": "2020-06-16T22:55:04Z", "author_association": "OWNER", "body": "```\r\nsimon@Simons-MacBook-Pro datasette-proof-of-concept % sam deploy --guided\r\n\r\nConfiguring SAM deploy\r\n======================\r\n\r\n\tLooking for samconfig.toml : Not found\r\n\r\n\tSetting default arguments for 'sam deploy'\r\n\t=========================================\r\n\tStack Name [sam-app]: datasette-proof-of-concept\r\n\tAWS Region [us-east-1]: \r\n\t#Shows you resources changes to be deployed and require a 'Y' to initiate deploy\r\n\tConfirm changes before deploy [y/N]: y\r\n\t#SAM needs permission to be able to create roles to connect to the resources in your template\r\n\tAllow SAM CLI IAM role creation [Y/n]: y\r\n\tHelloWorldFunction may not have authorization defined, Is this okay? [y/N]: y\r\n\tSave arguments to samconfig.toml [Y/n]: y\r\nError: Failed to create managed resources: Unable to locate credentials\r\n```\r\nI need to get my AWS credentials sorted. 
I'm going to follow https://docs.aws.amazon.com/IAM/latest/UserGuide/getting-started_create-admin-group.html and https://docs.aws.amazon.com/serverless-application-model/latest/developerguide/serverless-getting-started-set-up-credentials.html", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 639993467, "label": "Proof of concept for Datasette on AWS Lambda with EFS"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/850#issuecomment-645051370", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/850", "id": 645051370, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NTA1MTM3MA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-16T22:53:05Z", "updated_at": "2020-06-16T22:53:05Z", "author_association": "OWNER", "body": "```\r\nsimon@Simons-MacBook-Pro datasette-proof-of-concept % sam local invoke\r\nInvoking app.lambda_handler (python3.8)\r\n\r\nFetching lambci/lambda:python3.8 Docker container image....................................................................................................................................................................................................................................\r\nMounting /private/tmp/datasette-proof-of-concept/.aws-sam/build/HelloWorldFunction as /var/task:ro,delegated inside runtime container\r\nSTART RequestId: 4616ab43-6882-1627-e5e3-5a29730d52f9 Version: $LATEST\r\nEND RequestId: 4616ab43-6882-1627-e5e3-5a29730d52f9\r\nREPORT RequestId: 4616ab43-6882-1627-e5e3-5a29730d52f9\tInit Duration: 140.84 ms\tDuration: 2.49 ms\tBilled Duration: 100 ms\tMemory Size: 128 MBMax Memory Used: 25 MB\t\r\n\r\n{\"statusCode\":200,\"body\":\"{\\\"message\\\": \\\"hello world\\\"}\"}\r\nsimon@Simons-MacBook-Pro datasette-proof-of-concept % sam local invoke\r\nInvoking app.lambda_handler (python3.8)\r\n\r\nFetching lambci/lambda:python3.8 
Docker container image......\r\nMounting /private/tmp/datasette-proof-of-concept/.aws-sam/build/HelloWorldFunction as /var/task:ro,delegated inside runtime container\r\nSTART RequestId: 3189df2f-e9c0-1be4-b9ac-f329c5fcd067 Version: $LATEST\r\nEND RequestId: 3189df2f-e9c0-1be4-b9ac-f329c5fcd067\r\nREPORT RequestId: 3189df2f-e9c0-1be4-b9ac-f329c5fcd067\tInit Duration: 87.22 ms\tDuration: 2.34 ms\tBilled Duration: 100 ms\tMemory Size: 128 MB\tMax Memory Used: 25 MB\t\r\n\r\n{\"statusCode\":200,\"body\":\"{\\\"message\\\": \\\"hello world\\\"}\"}\r\n```", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 639993467, "label": "Proof of concept for Datasette on AWS Lambda with EFS"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/850#issuecomment-645050948", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/850", "id": 645050948, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NTA1MDk0OA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-16T22:51:30Z", "updated_at": "2020-06-16T22:52:30Z", "author_association": "OWNER", "body": "```\r\nsimon@Simons-MacBook-Pro datasette-proof-of-concept % sam build --use-container\r\nStarting Build inside a container\r\nBuilding function 'HelloWorldFunction'\r\n\r\nFetching lambci/lambda:build-python3.8 Docker container 
image..........................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................\r\nMounting /private/tmp/datasette-proof-of-concept/hello_world as /tmp/samcli/source:ro,delegated inside runtime container\r\n\r\nBuild Succeeded\r\n\r\nBuilt Artifacts : .aws-sam/build\r\nBuilt Template : .aws-sam/build/template.yaml\r\n\r\nCommands you can use next\r\n=========================\r\n[*] Invoke Function: sam local invoke\r\n[*] Deploy: sam deploy --guided\r\n \r\nRunning PythonPipBuilder:ResolveDependencies\r\nRunning PythonPipBuilder:CopySource\r\n```", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 639993467, "label": "Proof of 
concept for Datasette on AWS Lambda with EFS"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/850#issuecomment-645048062", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/850", "id": 645048062, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NTA0ODA2Mg==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-16T22:41:33Z", "updated_at": "2020-06-16T22:41:33Z", "author_association": "OWNER", "body": "```\r\nsimon@Simons-MacBook-Pro /tmp % sam init\r\n\r\n\tSAM CLI now collects telemetry to better understand customer needs.\r\n\r\n\tYou can OPT OUT and disable telemetry collection by setting the\r\n\tenvironment variable SAM_CLI_TELEMETRY=0 in your shell.\r\n\tThanks for your help!\r\n\r\n\tLearn More: https://docs.aws.amazon.com/serverless-application-model/latest/developerguide/serverless-sam-telemetry.html\r\n\r\nWhich template source would you like to use?\r\n\t1 - AWS Quick Start Templates\r\n\t2 - Custom Template Location\r\nChoice: 1\r\n\r\nWhich runtime would you like to use?\r\n\t1 - nodejs12.x\r\n\t2 - python3.8\r\n\t3 - ruby2.7\r\n\t4 - go1.x\r\n\t5 - java11\r\n\t6 - dotnetcore3.1\r\n\t7 - nodejs10.x\r\n\t8 - python3.7\r\n\t9 - python3.6\r\n\t10 - python2.7\r\n\t11 - ruby2.5\r\n\t12 - java8\r\n\t13 - dotnetcore2.1\r\nRuntime: 2\r\n\r\nProject name [sam-app]: datasette-proof-of-concept\r\n\r\nCloning app templates from https://github.com/awslabs/aws-sam-cli-app-templates.git\r\n\r\nAWS quick start application templates:\r\n\t1 - Hello World Example\r\n\t2 - EventBridge Hello World\r\n\t3 - EventBridge App from scratch (100+ Event Schemas)\r\n\t4 - Step Functions Sample App (Stock Trader)\r\nTemplate selection: 1\r\n\r\n-----------------------\r\nGenerating application:\r\n-----------------------\r\nName: datasette-proof-of-concept\r\nRuntime: python3.8\r\nDependency Manager: pip\r\nApplication Template: hello-world\r\nOutput Directory: .\r\n\r\nNext steps can be found in the README file at 
./datasette-proof-of-concept/README.md\r\n```", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 639993467, "label": "Proof of concept for Datasette on AWS Lambda with EFS"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/850#issuecomment-645047703", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/850", "id": 645047703, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NTA0NzcwMw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-16T22:40:19Z", "updated_at": "2020-06-16T22:40:19Z", "author_association": "OWNER", "body": "Installed SAM:\r\n```\r\nbrew tap aws/tap\r\nbrew install aws-sam-cli\r\nsam --version\r\nSAM CLI, version 0.52.0\r\n```", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 639993467, "label": "Proof of concept for Datasette on AWS Lambda with EFS"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/850#issuecomment-645045055", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/850", "id": 645045055, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NTA0NTA1NQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-16T22:31:49Z", "updated_at": "2020-06-16T22:31:49Z", "author_association": "OWNER", "body": "It looks like SAM - AWS Serverless Application Model - is the currently recommended way to deploy Python apps to Lambda from the command-line: https://docs.aws.amazon.com/serverless-application-model/latest/developerguide/serverless-getting-started-hello-world.html", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 639993467, "label": "Proof of concept for Datasette on AWS 
Lambda with EFS"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/850#issuecomment-645042625", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/850", "id": 645042625, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NTA0MjYyNQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-16T22:24:26Z", "updated_at": "2020-06-16T22:24:26Z", "author_association": "OWNER", "body": "From https://mangum.io/adapter/\r\n\r\n> The AWS Lambda handler `event` and `context` arguments are made available to an ASGI application in the ASGI connection scope.\r\n> \r\n> ```\r\n> scope['aws.event']\r\n> scope['aws.context']\r\n> ```\r\nI can use https://github.com/simonw/datasette-debug-asgi to see that.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 639993467, "label": "Proof of concept for Datasette on AWS Lambda with EFS"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/850#issuecomment-645041663", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/850", "id": 645041663, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NTA0MTY2Mw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-16T22:21:44Z", "updated_at": "2020-06-16T22:21:44Z", "author_association": "OWNER", "body": "https://github.com/jordaneremieff/mangum looks like the best way to run an ASGI app on Lambda at the moment.\r\n\r\n```python\r\nfrom mangum import Mangum\r\n\r\nasync def app(scope, receive, send):\r\n await send(\r\n {\r\n \"type\": \"http.response.start\",\r\n \"status\": 200,\r\n \"headers\": [[b\"content-type\", b\"text/plain; charset=utf-8\"]],\r\n }\r\n )\r\n await send({\"type\": \"http.response.body\", \"body\": b\"Hello, world!\"})\r\n\r\n\r\nhandler = Mangum(app)\r\n``` ", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, 
\"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 639993467, "label": "Proof of concept for Datasette on AWS Lambda with EFS"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/850#issuecomment-645032643", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/850", "id": 645032643, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NTAzMjY0Mw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-16T21:57:10Z", "updated_at": "2020-06-16T21:57:10Z", "author_association": "OWNER", "body": "https://docs.aws.amazon.com/efs/latest/ug/wt1-getting-started.html is an EFS walk-through using the AWS CLI tool instead of clicking around in their web interface.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 639993467, "label": "Proof of concept for Datasette on AWS Lambda with EFS"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/850#issuecomment-645031225", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/850", "id": 645031225, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NTAzMTIyNQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-16T21:53:25Z", "updated_at": "2020-06-16T21:53:25Z", "author_association": "OWNER", "body": "Easier solution to this might be to have two functions - a \"read-only\" one which is allowed to scale as much as it likes, and a \"write-only\" one which can write to the database files but is limited to running a maximum of one Lambda instance. 
https://docs.aws.amazon.com/lambda/latest/dg/invocation-scaling.html", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 639993467, "label": "Proof of concept for Datasette on AWS Lambda with EFS"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/850#issuecomment-645030262", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/850", "id": 645030262, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NTAzMDI2Mg==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-16T21:51:01Z", "updated_at": "2020-06-16T21:51:39Z", "author_association": "OWNER", "body": "File locking is interesting here. https://docs.aws.amazon.com/lambda/latest/dg/services-efs.html\r\n\r\n> Amazon EFS supports [file locking](https://docs.aws.amazon.com/efs/latest/ug/how-it-works.html#consistency) to prevent corruption if multiple functions try to write to the same file system at the same time. Locking in Amazon EFS follows the NFS v4.1 protocol for advisory locking, and enables your applications to use both whole file and byte range locks. \r\n\r\nSQLite can apparently work on NFS v4.1. I think I'd rather set things up so there's only ever one writer - so a Datasette instance could scale reads by running lots more lambda functions but only one function ever writes to a file at a time. 
Not sure if that's feasible with Lambda though - maybe by adding some additional shared state mechanism like Redis?", "reactions": "{\"total_count\": 1, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 1, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 639993467, "label": "Proof of concept for Datasette on AWS Lambda with EFS"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/690#issuecomment-644987083", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/690", "id": 644987083, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NDk4NzA4Mw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-16T20:11:35Z", "updated_at": "2020-06-16T20:11:35Z", "author_association": "OWNER", "body": "Twitter conversation about drop-down menu solutions that are accessible, fast loading and use minimal JavaScript: https://twitter.com/simonw/status/1272974294545395712\r\n\r\nI _really_ like the approach taken by GitHub Primer, which builds on top of HTML
`<details>`/`<summary>` tags: https://primer.style/css/components/dropdown", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 573755726, "label": "Mechanism for plugins to add action menu items for various things"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/849#issuecomment-644584075", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/849", "id": 644584075, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NDU4NDA3NQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-16T07:24:08Z", "updated_at": "2020-06-16T07:24:08Z", "author_association": "OWNER", "body": "This guide is fantastic - I'll be following it closely: https://github.com/chancancode/branch-rename/blob/main/README.md - in particular the Action to mirror master and main for a while.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 639072811, "label": "Rename master branch to main"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/849#issuecomment-644384787", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/849", "id": 644384787, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NDM4NDc4Nw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-15T20:56:07Z", "updated_at": "2020-06-15T20:56:19Z", "author_association": "OWNER", "body": "The big question is how this impacts existing CI configuration. 
`datasette-psutil` is configured to use Circle CI, what happens if I push a new commit?", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 639072811, "label": "Rename master branch to main"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/849#issuecomment-644384417", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/849", "id": 644384417, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NDM4NDQxNw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-15T20:55:23Z", "updated_at": "2020-06-15T20:55:23Z", "author_association": "OWNER", "body": "I'm doing https://github.com/simonw/datasette-psutil first.\r\n\r\nIn my local checkout:\r\n```\r\ngit branch -m master main\r\ngit push -u origin main\r\n```\r\n(Thanks, https://www.hanselman.com/blog/EasilyRenameYourGitDefaultBranchFromMasterToMain.aspx)\r\n\r\nThen in https://github.com/simonw/datasette-psutil/settings/branches I changed the default branch to `main`.\r\n\r\n\"Branches\"\r\n\r\nLinks to these docs: https://help.github.com/en/github/administering-a-repository/setting-the-default-branch\r\n\r\nThat worked! https://github.com/simonw/datasette-psutil\r\n\r\nOne catch, which I think will impact my most widely used repos the most (like datasette) - linking to a specific file now looks like this:\r\n\r\nhttps://github.com/simonw/datasette-psutil/blob/main/datasette_psutil/__init__.py\r\n\r\nThe old https://github.com/simonw/datasette-psutil/blob/master/datasette_psutil/__init__.py link is presumably frozen in time?\r\n\r\nI've definitely got links spread around the web to my \"most recent version of this code\" that would use the `master` reference, which would need to be updated to `main` instead. 
Most of those are probably in the Datasette docs and on my blog though.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 639072811, "label": "Rename master branch to main"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/849#issuecomment-644322234", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/849", "id": 644322234, "node_id": "MDEyOklzc3VlQ29tbWVudDY0NDMyMjIzNA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-15T19:06:16Z", "updated_at": "2020-06-15T19:06:16Z", "author_association": "OWNER", "body": "I'll make this change on a few of my other repos first to make sure I haven't missed any tricky edge-cases.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 639072811, "label": "Rename master branch to main"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/691#issuecomment-643709037", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/691", "id": 643709037, "node_id": "MDEyOklzc3VlQ29tbWVudDY0MzcwOTAzNw==", "user": {"value": 49260, "label": "amjith"}, "created_at": "2020-06-14T02:35:16Z", "updated_at": "2020-06-14T02:35:16Z", "author_association": "CONTRIBUTOR", "body": "The server should reload in the `config_dir` mode. 
\r\n\r\nRef: #848", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 574021194, "label": "--reload should reload server if code in --plugins-dir changes"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/847#issuecomment-643704730", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/847", "id": 643704730, "node_id": "MDEyOklzc3VlQ29tbWVudDY0MzcwNDczMA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-14T01:28:34Z", "updated_at": "2020-06-14T01:28:34Z", "author_association": "OWNER", "body": "Here's the plugin that adds those custom SQLite functions:\r\n```python\r\nfrom datasette import hookimpl\r\nfrom coverage.numbits import register_sqlite_functions\r\n\r\n\r\n@hookimpl\r\ndef prepare_connection(conn):\r\n    register_sqlite_functions(conn)\r\n```", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 638259643, "label": "Take advantage of .coverage being a SQLite database"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/847#issuecomment-643704565", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/847", "id": 643704565, "node_id": "MDEyOklzc3VlQ29tbWVudDY0MzcwNDU2NQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-14T01:26:56Z", "updated_at": "2020-06-14T01:26:56Z", "author_association": "OWNER", "body": "On closer inspection, I don't know if there's that much useful stuff you can do with the data from `.coverage` on its own.\r\n\r\nConsider the following query against a `.coverage` run against Datasette itself:\r\n\r\n```sql\r\nselect file_id, context_id, numbits_to_nums(numbits) from 
line_bits\r\n```\r\n[screenshot: query results from line_bits]\r\n\r\nIt looks like this tells me which lines of which files were executed during the test run. But... without the actual source code, I don't think I can calculate the coverage percentage for each file. I don't want to count comment lines or whitespace as untested, for example, and I don't know how many lines were in the file.\r\n\r\nIf I'm right that it's not possible to calculate percentage coverage from just the `.coverage` data then I'll need to do something a bit more involved - maybe parsing the `coverage.xml` report and loading that into my own schema?", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 638259643, "label": "Take advantage of .coverage being a SQLite database"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/847#issuecomment-643702715", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/847", "id": 643702715, "node_id": "MDEyOklzc3VlQ29tbWVudDY0MzcwMjcxNQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-14T01:03:30Z", "updated_at": "2020-06-14T01:03:40Z", "author_association": "OWNER", "body": "Filed a related issue with some ideas against `coveragepy` here: https://github.com/nedbat/coveragepy/issues/999", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 638259643, "label": "Take advantage of .coverage being a SQLite database"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/846#issuecomment-643699583", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/846", "id": 643699583, "node_id": "MDEyOklzc3VlQ29tbWVudDY0MzY5OTU4Mw==", "user": {"value": 9599, "label": "simonw"}, 
"created_at": "2020-06-14T00:26:31Z", "updated_at": "2020-06-14T00:26:31Z", "author_association": "OWNER", "body": "That seems to have fixed the problem, at least for the moment.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 638241779, "label": "\"Too many open files\" error running tests"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/846#issuecomment-643699063", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/846", "id": 643699063, "node_id": "MDEyOklzc3VlQ29tbWVudDY0MzY5OTA2Mw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-14T00:22:32Z", "updated_at": "2020-06-14T00:22:32Z", "author_association": "OWNER", "body": "Idea: `num_sql_threads` (described as \"Number of threads in the thread pool for executing SQLite queries\") defaults to 3 - can I knock that down to 1 in the tests and open less connections as a result?", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 638241779, "label": "\"Too many open files\" error running tests"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/846#issuecomment-643698790", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/846", "id": 643698790, "node_id": "MDEyOklzc3VlQ29tbWVudDY0MzY5ODc5MA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-14T00:20:42Z", "updated_at": "2020-06-14T00:20:42Z", "author_association": "OWNER", "body": "Released a new plugin, `datasette-psutil`, as a side-effect of this investigation: https://github.com/simonw/datasette-psutil", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 
638241779, "label": "\"Too many open files\" error running tests"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/846#issuecomment-643685669", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/846", "id": 643685669, "node_id": "MDEyOklzc3VlQ29tbWVudDY0MzY4NTY2OQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-13T22:24:22Z", "updated_at": "2020-06-13T22:24:22Z", "author_association": "OWNER", "body": "I tried this experiment:\r\n```python\r\nimport sqlite3, psutil\r\ndef show_things():\r\n conn = sqlite3.connect(\"fixtures.db\")\r\n tables = [r[0] for r in conn.execute(\"select * from sqlite_master\").fetchall()]\r\n return tables\r\nprint(psutil.Process().open_files())\r\nprint(show_things())\r\nprint(psutil.Process().open_files())\r\n```\r\nTo see if the connection would be automatically released when the `conn` variable was garbage collected at the end of the function... and it was correctly released - the two calls to `open_files()` showed that the file did not remain open.\r\n\r\nLikewise:\r\n```\r\nIn [11]: conn = sqlite3.connect(\"fixtures.db\") \r\n\r\nIn [12]: psutil.Process().open_files() \r\nOut[12]: \r\n[popenfile(path='/Users/simon/.ipython/profile_default/history.sqlite', fd=4),\r\n popenfile(path='/Users/simon/.ipython/profile_default/history.sqlite', fd=5),\r\n popenfile(path='/Users/simon/Dropbox/Development/datasette/fixtures.db', fd=12)]\r\n\r\nIn [13]: del conn \r\n\r\nIn [14]: psutil.Process().open_files() \r\nOut[14]: \r\n[popenfile(path='/Users/simon/.ipython/profile_default/history.sqlite', fd=4),\r\n popenfile(path='/Users/simon/.ipython/profile_default/history.sqlite', fd=5)]\r\n```\r\nSo presumably there's something about the way my pytest fixtures work that's causing the many different `Datasette()` instances and their underlying SQLite connections that I create not to be cleaned up later.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, 
\"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 638241779, "label": "\"Too many open files\" error running tests"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/846#issuecomment-643685333", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/846", "id": 643685333, "node_id": "MDEyOklzc3VlQ29tbWVudDY0MzY4NTMzMw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-13T22:19:38Z", "updated_at": "2020-06-13T22:19:38Z", "author_association": "OWNER", "body": "That's 91 open files but only 29 unique filenames.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 638241779, "label": "\"Too many open files\" error running tests"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/846#issuecomment-643685207", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/846", "id": 643685207, "node_id": "MDEyOklzc3VlQ29tbWVudDY0MzY4NTIwNw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-13T22:18:01Z", "updated_at": "2020-06-13T22:18:01Z", "author_association": "OWNER", "body": "This shows currently open files (after `pip install psutil`):\r\n```\r\nimport psutil\r\npsutil.Process().open_files()\r\n```\r\n\r\nI ran it inside `pytest -x --pdb` and got this:\r\n```\r\n> /Users/simon/.local/share/virtualenvs/datasette-AWNrQs95/lib/python3.7/site-packages/jinja2/utils.py(154)open_if_exists()\r\n-> return open(filename, mode)\r\n(Pdb) import psutil\r\n(Pdb) psutil.Process().open_files()\r\npopenfile(path='/private/var/folders/wr/hn3206rs1yzgq3r49bz8nvnh0000gn/T/tmp9uhx5d8x/fixtures.db', fd=10),\r\npopenfile(path='/private/var/folders/wr/hn3206rs1yzgq3r49bz8nvnh0000gn/T/tmpyfw44ica/fixtures.dot.db', 
fd=11),\r\npopenfile(path='/private/var/folders/wr/hn3206rs1yzgq3r49bz8nvnh0000gn/T/tmpyrg6g48b/fixtures.db', fd=12),\r\npopenfile(path='/private/var/folders/wr/hn3206rs1yzgq3r49bz8nvnh0000gn/T/tmp33kkg62s/fixtures.db', fd=13),\r\npopenfile(path='/private/var/folders/wr/hn3206rs1yzgq3r49bz8nvnh0000gn/T/tmp33kkg62s/fixtures.db', fd=14),\r\npopenfile(path='/private/var/folders/wr/hn3206rs1yzgq3r49bz8nvnh0000gn/T/tmp33kkg62s/fixtures.db', fd=15),\r\npopenfile(path='/private/var/folders/wr/hn3206rs1yzgq3r49bz8nvnh0000gn/T/tmp33kkg62s/fixtures.db', fd=16),\r\npopenfile(path='/private/var/folders/wr/hn3206rs1yzgq3r49bz8nvnh0000gn/T/tmp33kkg62s/fixtures.db', fd=17),\r\npopenfile(path='/private/var/folders/wr/hn3206rs1yzgq3r49bz8nvnh0000gn/T/tmp33kkg62s/fixtures.db', fd=18),\r\npopenfile(path='/private/var/folders/wr/hn3206rs1yzgq3r49bz8nvnh0000gn/T/tmp33kkg62s/fixtures.db', fd=19),\r\npopenfile(path='/private/var/folders/wr/hn3206rs1yzgq3r49bz8nvnh0000gn/T/tmpng4lg84_/fixtures.db', fd=20),\r\npopenfile(path='/private/var/folders/wr/hn3206rs1yzgq3r49bz8nvnh0000gn/T/tmp9uhx5d8x/fixtures.db', fd=21),\r\npopenfile(path='/private/var/folders/wr/hn3206rs1yzgq3r49bz8nvnh0000gn/T/tmp9uhx5d8x/fixtures.db', fd=22),\r\npopenfile(path='/private/var/folders/wr/hn3206rs1yzgq3r49bz8nvnh0000gn/T/tmp9uhx5d8x/fixtures.db', fd=23),\r\npopenfile(path='/private/var/folders/wr/hn3206rs1yzgq3r49bz8nvnh0000gn/T/tmph11oalw_/fixtures.db', fd=24),\r\npopenfile(path='/private/var/folders/wr/hn3206rs1yzgq3r49bz8nvnh0000gn/T/tmpyfw44ica/fixtures.dot.db', fd=25),\r\npopenfile(path='/private/var/folders/wr/hn3206rs1yzgq3r49bz8nvnh0000gn/T/tmpyfw44ica/fixtures.dot.db', fd=26),\r\npopenfile(path='/private/var/folders/wr/hn3206rs1yzgq3r49bz8nvnh0000gn/T/tmpyfw44ica/fixtures.dot.db', fd=27),\r\npopenfile(path='/private/var/folders/wr/hn3206rs1yzgq3r49bz8nvnh0000gn/T/tmpiorb2bo9/fixtures.db', fd=28),\r\npopenfile(path='/private/var/folders/wr/hn3206rs1yzgq3r49bz8nvnh0000gn/T/tmpyrg6g48b/fixtures.db', 
fd=29),\r\npopenfile(path='/private/var/folders/wr/hn3206rs1yzgq3r49bz8nvnh0000gn/T/tmpyrg6g48b/fixtures.db', fd=30),\r\npopenfile(path='/private/var/folders/wr/hn3206rs1yzgq3r49bz8nvnh0000gn/T/tmpyrg6g48b/fixtures.db', fd=31),\r\npopenfile(path='/private/var/folders/wr/hn3206rs1yzgq3r49bz8nvnh0000gn/T/tmprvyj5udv/fixtures.db', fd=32),\r\npopenfile(path='/private/var/folders/wr/hn3206rs1yzgq3r49bz8nvnh0000gn/T/tmpng4lg84_/fixtures.db', fd=33),\r\npopenfile(path='/private/var/folders/wr/hn3206rs1yzgq3r49bz8nvnh0000gn/T/tmpng4lg84_/fixtures.db', fd=34),\r\npopenfile(path='/private/var/folders/wr/hn3206rs1yzgq3r49bz8nvnh0000gn/T/tmpng4lg84_/fixtures.db', fd=35),\r\npopenfile(path='/private/var/folders/wr/hn3206rs1yzgq3r49bz8nvnh0000gn/T/tmpb_l6gmq0/fixtures.db', fd=36),\r\npopenfile(path='/private/var/folders/wr/hn3206rs1yzgq3r49bz8nvnh0000gn/T/tmph11oalw_/extra database.db', fd=40),\r\npopenfile(path='/private/var/folders/wr/hn3206rs1yzgq3r49bz8nvnh0000gn/T/tmpf0py4thp/fixtures.db', fd=41),\r\npopenfile(path='/private/var/folders/wr/hn3206rs1yzgq3r49bz8nvnh0000gn/T/tmpiorb2bo9/fixtures.db', fd=42),\r\npopenfile(path='/private/var/folders/wr/hn3206rs1yzgq3r49bz8nvnh0000gn/T/tmpiorb2bo9/fixtures.db', fd=43),\r\npopenfile(path='/private/var/folders/wr/hn3206rs1yzgq3r49bz8nvnh0000gn/T/tmpiorb2bo9/fixtures.db', fd=44),\r\npopenfile(path='/private/var/folders/wr/hn3206rs1yzgq3r49bz8nvnh0000gn/T/tmph11oalw_/fixtures.db', fd=45),\r\npopenfile(path='/private/var/folders/wr/hn3206rs1yzgq3r49bz8nvnh0000gn/T/tmph11oalw_/fixtures.db', fd=52),\r\npopenfile(path='/private/var/folders/wr/hn3206rs1yzgq3r49bz8nvnh0000gn/T/tmpwgcnmg4b/fixtures.db', fd=53),\r\npopenfile(path='/private/var/folders/wr/hn3206rs1yzgq3r49bz8nvnh0000gn/T/tmprvyj5udv/fixtures.db', fd=54),\r\npopenfile(path='/private/var/folders/wr/hn3206rs1yzgq3r49bz8nvnh0000gn/T/tmprvyj5udv/fixtures.db', fd=55),\r\npopenfile(path='/private/var/folders/wr/hn3206rs1yzgq3r49bz8nvnh0000gn/T/tmprvyj5udv/fixtures.db', 
fd=56),\r\npopenfile(path='/private/var/folders/wr/hn3206rs1yzgq3r49bz8nvnh0000gn/T/tmpoveuwqn6/fixtures.db', fd=57),\r\npopenfile(path='/private/var/folders/wr/hn3206rs1yzgq3r49bz8nvnh0000gn/T/tmpb_l6gmq0/fixtures.db', fd=61),\r\npopenfile(path='/private/var/folders/wr/hn3206rs1yzgq3r49bz8nvnh0000gn/T/tmpb_l6gmq0/fixtures.db', fd=62),\r\npopenfile(path='/private/var/folders/wr/hn3206rs1yzgq3r49bz8nvnh0000gn/T/tmpb_l6gmq0/fixtures.db', fd=63),\r\npopenfile(path='/private/var/folders/wr/hn3206rs1yzgq3r49bz8nvnh0000gn/T/tmp_j4h9mrn/fixtures.db', fd=64),\r\npopenfile(path='/private/var/folders/wr/hn3206rs1yzgq3r49bz8nvnh0000gn/T/tmpf0py4thp/fixtures.db', fd=65),\r\npopenfile(path='/private/var/folders/wr/hn3206rs1yzgq3r49bz8nvnh0000gn/T/tmpf0py4thp/fixtures.db', fd=66),\r\npopenfile(path='/private/var/folders/wr/hn3206rs1yzgq3r49bz8nvnh0000gn/T/tmpf0py4thp/extra database.db', fd=67),\r\npopenfile(path='/private/var/folders/wr/hn3206rs1yzgq3r49bz8nvnh0000gn/T/tmpf0py4thp/extra database.db', fd=68),\r\npopenfile(path='/private/var/folders/wr/hn3206rs1yzgq3r49bz8nvnh0000gn/T/tmpf0py4thp/fixtures.db', fd=69),\r\npopenfile(path='/private/var/folders/wr/hn3206rs1yzgq3r49bz8nvnh0000gn/T/tmpf0py4thp/extra database.db', fd=70)\r\n popenfile(path='/private/var/folders/wr/hn3206rs1yzgq3r49bz8nvnh0000gn/T/tmpub3eodj1/fixtures.db', fd=71)\r\n popenfile(path='/private/var/folders/wr/hn3206rs1yzgq3r49bz8nvnh0000gn/T/tmpwgcnmg4b/fixtures.db', fd=72)\r\n popenfile(path='/private/var/folders/wr/hn3206rs1yzgq3r49bz8nvnh0000gn/T/tmpwgcnmg4b/foo.db', fd=73)\r\n popenfile(path='/private/var/folders/wr/hn3206rs1yzgq3r49bz8nvnh0000gn/T/tmpwgcnmg4b/foo.db', fd=74)\r\n popenfile(path='/private/var/folders/wr/hn3206rs1yzgq3r49bz8nvnh0000gn/T/tmpwgcnmg4b/fixtures.db', fd=75)\r\n popenfile(path='/private/var/folders/wr/hn3206rs1yzgq3r49bz8nvnh0000gn/T/tmpwgcnmg4b/fixtures.db', fd=76)\r\n popenfile(path='/private/var/folders/wr/hn3206rs1yzgq3r49bz8nvnh0000gn/T/tmpwgcnmg4b/foo.db', fd=77)\r\n 
popenfile(path='/private/var/folders/wr/hn3206rs1yzgq3r49bz8nvnh0000gn/T/tmpwgcnmg4b/foo-bar.db', fd=78)\r\n popenfile(path='/private/var/folders/wr/hn3206rs1yzgq3r49bz8nvnh0000gn/T/tmpwgcnmg4b/foo-bar.db', fd=79)\r\n popenfile(path='/private/var/folders/wr/hn3206rs1yzgq3r49bz8nvnh0000gn/T/tmpwgcnmg4b/foo-bar.db', fd=80)\r\n popenfile(path='/private/var/folders/wr/hn3206rs1yzgq3r49bz8nvnh0000gn/T/pytest-of-simon/pytest-4/config-dir0/immutable.db', fd=81),\r\npopenfile(path='/private/var/folders/wr/hn3206rs1yzgq3r49bz8nvnh0000gn/T/tmpoveuwqn6/fixtures.db', fd=82),\r\npopenfile(path='/private/var/folders/wr/hn3206rs1yzgq3r49bz8nvnh0000gn/T/tmpoveuwqn6/fixtures.db', fd=83),\r\npopenfile(path='/private/var/folders/wr/hn3206rs1yzgq3r49bz8nvnh0000gn/T/tmpoveuwqn6/fixtures.db', fd=84),\r\npopenfile(path='/private/var/folders/wr/hn3206rs1yzgq3r49bz8nvnh0000gn/T/tmp44w5d5wo/fixtures.db', fd=85),\r\npopenfile(path='/private/var/folders/wr/hn3206rs1yzgq3r49bz8nvnh0000gn/T/tmp_j4h9mrn/fixtures.db', fd=86),\r\npopenfile(path='/private/var/folders/wr/hn3206rs1yzgq3r49bz8nvnh0000gn/T/tmp_j4h9mrn/fixtures.db', fd=87),\r\npopenfile(path='/private/var/folders/wr/hn3206rs1yzgq3r49bz8nvnh0000gn/T/tmp_j4h9mrn/fixtures.db', fd=88),\r\npopenfile(path='/private/var/folders/wr/hn3206rs1yzgq3r49bz8nvnh0000gn/T/tmpvu7h14uy/fixtures.db', fd=89),\r\npopenfile(path='/private/var/folders/wr/hn3206rs1yzgq3r49bz8nvnh0000gn/T/pytest-of-simon/pytest-4/config-dir0/demo.db', fd=119),\r\npopenfile(path='/private/var/folders/wr/hn3206rs1yzgq3r49bz8nvnh0000gn/T/pytest-of-simon/pytest-4/config-dir0/demo.db', fd=120),\r\npopenfile(path='/private/var/folders/wr/hn3206rs1yzgq3r49bz8nvnh0000gn/T/pytest-of-simon/pytest-4/config-dir0/demo.db', fd=121),\r\npopenfile(path='/private/var/folders/wr/hn3206rs1yzgq3r49bz8nvnh0000gn/T/tmp0xcnrjag/fixtures.db', fd=122),\r\npopenfile(path='/private/var/folders/wr/hn3206rs1yzgq3r49bz8nvnh0000gn/T/tmpub3eodj1/fixtures.db', 
fd=123),\r\npopenfile(path='/private/var/folders/wr/hn3206rs1yzgq3r49bz8nvnh0000gn/T/tmpub3eodj1/fixtures.db', fd=124),\r\npopenfile(path='/private/var/folders/wr/hn3206rs1yzgq3r49bz8nvnh0000gn/T/tmpub3eodj1/fixtures.db', fd=125),\r\npopenfile(path='/private/var/folders/wr/hn3206rs1yzgq3r49bz8nvnh0000gn/T/tmpfz8go8rk/fixtures.db', fd=126),\r\npopenfile(path='/private/var/folders/wr/hn3206rs1yzgq3r49bz8nvnh0000gn/T/tmp44w5d5wo/fixtures.db', fd=127),\r\npopenfile(path='/private/var/folders/wr/hn3206rs1yzgq3r49bz8nvnh0000gn/T/tmp44w5d5wo/fixtures.db', fd=128),\r\npopenfile(path='/private/var/folders/wr/hn3206rs1yzgq3r49bz8nvnh0000gn/T/tmp44w5d5wo/fixtures.db', fd=129),\r\npopenfile(path='/private/var/folders/wr/hn3206rs1yzgq3r49bz8nvnh0000gn/T/tmp5j3k1ep_/fixtures.db', fd=130)\r\n popenfile(path='/private/var/folders/wr/hn3206rs1yzgq3r49bz8nvnh0000gn/T/tmpvu7h14uy/fixtures.db', fd=131),\r\npopenfile(path='/private/var/folders/wr/hn3206rs1yzgq3r49bz8nvnh0000gn/T/tmpvu7h14uy/fixtures.db', fd=132),\r\npopenfile(path='/private/var/folders/wr/hn3206rs1yzgq3r49bz8nvnh0000gn/T/tmpvo3cobk9/fixtures.db', fd=133),\r\npopenfile(path='/private/var/folders/wr/hn3206rs1yzgq3r49bz8nvnh0000gn/T/tmp2t9txyir/fixtures.db', fd=134),\r\npopenfile(path='/private/var/folders/wr/hn3206rs1yzgq3r49bz8nvnh0000gn/T/tmpfz8go8rk/fixtures.db', fd=135),\r\npopenfile(path='/private/var/folders/wr/hn3206rs1yzgq3r49bz8nvnh0000gn/T/tmpfz8go8rk/fixtures.db', fd=136),\r\npopenfile(path='/private/var/folders/wr/hn3206rs1yzgq3r49bz8nvnh0000gn/T/tmp7h3skv8b/fixtures.db', fd=137),\r\npopenfile(path='/private/var/folders/wr/hn3206rs1yzgq3r49bz8nvnh0000gn/T/tmp5j3k1ep_/fixtures.db', fd=138),\r\npopenfile(path='/private/var/folders/wr/hn3206rs1yzgq3r49bz8nvnh0000gn/T/tmp5j3k1ep_/fixtures.db', fd=139),\r\npopenfile(path='/private/var/folders/wr/hn3206rs1yzgq3r49bz8nvnh0000gn/T/tmp5j3k1ep_/fixtures.db', fd=140),\r\npopenfile(path='/private/var/folders/wr/hn3206rs1yzgq3r49bz8nvnh0000gn/T/tmp5j3k1ep_/extra 
database.db', fd=141),\r\n```\r\nSo yeah, that's too many open files!\r\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 638241779, "label": "\"Too many open files\" error running tests"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/841#issuecomment-643681747", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/841", "id": 643681747, "node_id": "MDEyOklzc3VlQ29tbWVudDY0MzY4MTc0Nw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-13T21:38:46Z", "updated_at": "2020-06-13T21:38:46Z", "author_association": "OWNER", "body": "Closing this because I've researched feasibility. I may start a milestone in the future to help me get to 100%.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 638104520, "label": "Research feasibility of 100% test coverage"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/844#issuecomment-643681517", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/844", "id": 643681517, "node_id": "MDEyOklzc3VlQ29tbWVudDY0MzY4MTUxNw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-13T21:36:15Z", "updated_at": "2020-06-13T21:36:15Z", "author_association": "OWNER", "body": "OK, this works now: https://codecov.io/gh/simonw/datasette/tree/1210d9f41841bdca450f85a2342cdb0ff339c1b4", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 638230433, "label": "Action to run tests and upload coverage report"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/843#issuecomment-643676314", "issue_url": 
"https://api.github.com/repos/simonw/datasette/issues/843", "id": 643676314, "node_id": "MDEyOklzc3VlQ29tbWVudDY0MzY3NjMxNA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-13T20:47:37Z", "updated_at": "2020-06-13T20:47:37Z", "author_association": "OWNER", "body": "I can use this action: https://github.com/codecov/codecov-action", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 638229448, "label": "Configure codecov.io"}, "performed_via_github_app": null}