html_url,issue_url,id,node_id,user,created_at,updated_at,author_association,body,reactions,issue,performed_via_github_app https://github.com/simonw/datasette/issues/967#issuecomment-692938935,https://api.github.com/repos/simonw/datasette/issues/967,692938935,MDEyOklzc3VlQ29tbWVudDY5MjkzODkzNQ==,9599,2020-09-15T19:44:21Z,2020-09-15T19:44:41Z,OWNER,"While I'm running the above test, in the rounds that work the `receive()` awaitable returns `{'type': 'http.request', 'body': b'csrftoken=IlpwUGlSMFVVa3Z3ZlVoamQi.uY2U1tF4i0M-5M6x34vnBCmJgr0'}` In the rounds that fail it returns `{'type': 'http.request'}` So it looks like the `csrftoken_from=True` parameter may be helping just by ensuring the `body` key is present and not missing. I wonder if it would work if a body of `b''` was present there?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",702069429, https://github.com/simonw/datasette/issues/967#issuecomment-692937150,https://api.github.com/repos/simonw/datasette/issues/967,692937150,MDEyOklzc3VlQ29tbWVudDY5MjkzNzE1MA==,9599,2020-09-15T19:42:57Z,2020-09-15T19:42:57Z,OWNER,"New (failing) test:
```python
@pytest.mark.parametrize(""use_csrf"", [True, False])
@pytest.mark.parametrize(""return_json"", [True, False])
def test_magic_parameters_csrf_json(magic_parameters_client, use_csrf, return_json):
    magic_parameters_client.ds._metadata[""databases""][""data""][""queries""][""runme_post""][
        ""sql""
    ] = ""insert into logs (line) values (:_header_host)""
    qs = """"
    if return_json:
        qs = ""?_json=1""
    response = magic_parameters_client.post(
        ""/data/runme_post{}"".format(qs),
        {},
        csrftoken_from=use_csrf or None,
        allow_redirects=False,
    )
    if return_json:
        assert response.status == 200
        assert response.json[""ok""], response.json
    else:
        assert response.status == 302
        messages = magic_parameters_client.ds.unsign(
            response.cookies[""ds_messages""], ""messages""
        )
        assert [[""Query executed, 1 row affected"", 1]] == messages
    post_actual = magic_parameters_client.get(
        ""/data/logs.json?_sort_desc=rowid&_shape=array""
    ).json[0][""line""]
    assert post_actual == ""localhost""
```
It passes twice, fails twice - failures are for the ones where `use_csrf` is `False`.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",702069429, https://github.com/simonw/datasette/issues/967#issuecomment-692927867,https://api.github.com/repos/simonw/datasette/issues/967,692927867,MDEyOklzc3VlQ29tbWVudDY5MjkyNzg2Nw==,9599,2020-09-15T19:25:23Z,2020-09-15T19:25:23Z,OWNER,Hunch: I think the `asgi-csrf` middleware may be consuming the request body and failing to restore it.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",702069429, https://github.com/simonw/datasette/issues/967#issuecomment-692835066,https://api.github.com/repos/simonw/datasette/issues/967,692835066,MDEyOklzc3VlQ29tbWVudDY5MjgzNTA2Ng==,9599,2020-09-15T16:40:12Z,2020-09-15T16:40:12Z,OWNER,Is the bug here that magic parameters are incompatible with CSRF-exempt requests (e.g. 
request with no cookies)?,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",702069429, https://github.com/simonw/datasette/issues/967#issuecomment-692834670,https://api.github.com/repos/simonw/datasette/issues/967,692834670,MDEyOklzc3VlQ29tbWVudDY5MjgzNDY3MA==,9599,2020-09-15T16:39:29Z,2020-09-15T16:39:29Z,OWNER,"Relevant code: https://github.com/simonw/datasette/blob/853c5fc37011a7bc09ca3a1af287102f00827c82/datasette/views/database.py#L222-L236 This issue may not be about `_json=1` interacting with magic parameters after all.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",702069429, https://github.com/simonw/datasette/issues/967#issuecomment-692834064,https://api.github.com/repos/simonw/datasette/issues/967,692834064,MDEyOklzc3VlQ29tbWVudDY5MjgzNDA2NA==,9599,2020-09-15T16:38:21Z,2020-09-15T16:38:21Z,OWNER,So the mystery here is why does omitting `csrftoken_from=True` break the `MagicParameters` mechanism?,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",702069429, https://github.com/simonw/datasette/issues/967#issuecomment-692832113,https://api.github.com/repos/simonw/datasette/issues/967,692832113,MDEyOklzc3VlQ29tbWVudDY5MjgzMjExMw==,9599,2020-09-15T16:34:53Z,2020-09-15T16:37:43Z,OWNER,"This is so weird. In the test I wrote for this the following passed: response = magic_parameters_client.post(""/data/runme_post?_json=1"", {}, csrftoken_from=True) But without the `csrftoken_from=True` parameter it failed with the bindings error: response = magic_parameters_client.post(""/data/runme_post?_json=1"", {}) Here's the test I wrote: ```python def test_magic_parameters_json_body(magic_parameters_client): magic_parameters_client.ds._metadata[""databases""][""data""][""queries""][""runme_post""][ ""sql"" ] = ""insert into logs (line) values (:_header_host)"" response = magic_parameters_client.post(""/data/runme_post?_json=1"", {}, csrftoken_from=True) assert response.status == 200 assert response.json[""ok""], response.json post_actual = magic_parameters_client.get( ""/data/logs.json?_sort_desc=rowid&_shape=array"" ).json[0][""line""] ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",702069429, https://github.com/simonw/datasette/issues/940#issuecomment-692340275,https://api.github.com/repos/simonw/datasette/issues/940,692340275,MDEyOklzc3VlQ29tbWVudDY5MjM0MDI3NQ==,9599,2020-09-14T22:09:35Z,2020-09-14T22:09:35Z,OWNER,I'm going to cross my fingers and hope that this works - I don't want to leave this issue open until Datasette 0.50.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",679808124, https://github.com/simonw/datasette/issues/940#issuecomment-692339645,https://api.github.com/repos/simonw/datasette/issues/940,692339645,MDEyOklzc3VlQ29tbWVudDY5MjMzOTY0NQ==,9599,2020-09-14T22:07:58Z,2020-09-14T22:07:58Z,OWNER,"I shipped the Docker build manually by running the following in a tmate session: docker login # Typed my username and password interactively export REPO=datasetteproject/datasette docker build -f Dockerfile -t $REPO:0.49 . 
docker tag $REPO:0.49 $REPO:latest
docker push $REPO ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",679808124, https://github.com/simonw/datasette/issues/940#issuecomment-692337397,https://api.github.com/repos/simonw/datasette/issues/940,692337397,MDEyOklzc3VlQ29tbWVudDY5MjMzNzM5Nw==,9599,2020-09-14T22:01:56Z,2020-09-14T22:01:56Z,OWNER,"I'm going to switch to using this logic to decide if I should ship to Docker: https://github.community/t/release-prerelease-action-triggers/17275/2
if: ""!github.event.release.prerelease""","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",679808124, https://github.com/simonw/datasette/issues/940#issuecomment-692336564,https://api.github.com/repos/simonw/datasette/issues/940,692336564,MDEyOklzc3VlQ29tbWVudDY5MjMzNjU2NA==,9599,2020-09-14T21:59:40Z,2020-09-14T21:59:40Z,OWNER,Using https://github.com/marketplace/actions/debugging-with-tmate to manually submit a new build from within an interactive GitHub Actions session.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",679808124, https://github.com/simonw/datasette/issues/940#issuecomment-692332430,https://api.github.com/repos/simonw/datasette/issues/940,692332430,MDEyOklzc3VlQ29tbWVudDY5MjMzMjQzMA==,9599,2020-09-14T21:48:59Z,2020-09-14T21:48:59Z,OWNER,"So now I've released Datasette 0.49 but failed to push a new Docker image. This is bad, and I need to fix it. I'd like to push to Docker from GitHub Actions, so I think I'm going to create a one-off workflow task for doing that.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",679808124, https://github.com/simonw/datasette/issues/940#issuecomment-692331919,https://api.github.com/repos/simonw/datasette/issues/940,692331919,MDEyOklzc3VlQ29tbWVudDY5MjMzMTkxOQ==,9599,2020-09-14T21:47:39Z,2020-09-14T21:47:39Z,OWNER,"I bet that's because the `github.ref` actually looks like this: `${GITHUB_REF#refs/tags/}` And the `refs/tags/` part has an `a` in it.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",679808124, https://github.com/simonw/datasette/issues/940#issuecomment-692331349,https://api.github.com/repos/simonw/datasette/issues/940,692331349,MDEyOklzc3VlQ29tbWVudDY5MjMzMTM0OQ==,9599,2020-09-14T21:46:11Z,2020-09-14T21:46:11Z,OWNER,"Just released Datasette 0.49 - which shipped to PyPI just fine but skipped the Docker step for some reason! 
https://github.com/simonw/datasette/runs/1114585275?check_suite_focus=true ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",679808124, https://github.com/simonw/datasette/issues/880#issuecomment-692324230,https://api.github.com/repos/simonw/datasette/issues/880,692324230,MDEyOklzc3VlQ29tbWVudDY5MjMyNDIzMA==,9599,2020-09-14T21:28:15Z,2020-09-14T21:28:21Z,OWNER,Documentation here: https://docs.datasette.io/en/latest/sql_queries.html#json-api-for-writable-canned-queries,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",648637666, https://github.com/simonw/datasette/issues/880#issuecomment-692299770,https://api.github.com/repos/simonw/datasette/issues/880,692299770,MDEyOklzc3VlQ29tbWVudDY5MjI5OTc3MA==,9599,2020-09-14T20:36:40Z,2020-09-14T20:36:40Z,OWNER,"The JSON response will look like this: ```json { ""ok"": true, ""message"": ""A message"", ""redirect"": ""/blah"" } ``` `""ok""` will be `true` if everything went right and `false` if there was an error. The `""message""` and `""redirect""` will be whatever was configured using the on_success_message - the message shown `on_success_message`, `on_success_redirect`, `on_error_message` and `on_error_redirect` settings, see https://docs.datasette.io/en/stable/sql_queries.html#writable-canned-queries","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",648637666, https://github.com/simonw/datasette/issues/880#issuecomment-692298011,https://api.github.com/repos/simonw/datasette/issues/880,692298011,MDEyOklzc3VlQ29tbWVudDY5MjI5ODAxMQ==,9599,2020-09-14T20:33:13Z,2020-09-14T20:33:13Z,OWNER,"I'm going to support several ways of indicating that you would like a JSON response instead of getting a HTTP redirect from your writable canned query submission: - Use the `Accept: application/json` request header - Include `?_json=1` in the request query string - Include `""_json"": 1` in the form submission (or the JSON body submission)","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",648637666, https://github.com/simonw/datasette/issues/880#issuecomment-692272860,https://api.github.com/repos/simonw/datasette/issues/880,692272860,MDEyOklzc3VlQ29tbWVudDY5MjI3Mjg2MA==,9599,2020-09-14T19:43:47Z,2020-09-14T19:43:47Z,OWNER,"I'm going to add support for POST content that is sent as a JSON document, in addition to the existing support for key=value encoded POST bodies.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",648637666, https://github.com/simonw/datasette/issues/880#issuecomment-692271804,https://api.github.com/repos/simonw/datasette/issues/880,692271804,MDEyOklzc3VlQ29tbWVudDY5MjI3MTgwNA==,9599,2020-09-14T19:41:37Z,2020-09-14T19:41:37Z,OWNER,Relevant code section: https://github.com/simonw/datasette/blob/1552ac931e4d2cf516caac3ceeab4fd24da1510a/datasette/views/database.py#L209-L232,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",648637666, 
https://github.com/simonw/datasette/issues/965#issuecomment-692244252,https://api.github.com/repos/simonw/datasette/issues/965,692244252,MDEyOklzc3VlQ29tbWVudDY5MjI0NDI1Mg==,9599,2020-09-14T18:49:48Z,2020-09-14T18:49:48Z,OWNER,Documented here: https://docs.datasette.io/en/latest/custom_templates.html#custom-error-pages,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",701294727, https://github.com/simonw/datasette/issues/965#issuecomment-692231257,https://api.github.com/repos/simonw/datasette/issues/965,692231257,MDEyOklzc3VlQ29tbWVudDY5MjIzMTI1Nw==,9599,2020-09-14T18:25:04Z,2020-09-14T18:25:04Z,OWNER,In documenting this I realized that it's confusing that the default `500.html` template is often used for non-500 errors (404 for example). I think I'll rename that default template to `error.html` instead.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",701294727, https://github.com/simonw/datasette/issues/964#issuecomment-692212641,https://api.github.com/repos/simonw/datasette/issues/964,692212641,MDEyOklzc3VlQ29tbWVudDY5MjIxMjY0MQ==,9599,2020-09-14T17:49:44Z,2020-09-14T17:49:44Z,OWNER,Documentation: https://docs.datasette.io/en/latest/custom_templates.html#returning-404s,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",700728217, https://github.com/simonw/datasette/issues/965#issuecomment-692207341,https://api.github.com/repos/simonw/datasette/issues/965,692207341,MDEyOklzc3VlQ29tbWVudDY5MjIwNzM0MQ==,9599,2020-09-14T17:40:05Z,2020-09-14T17:40:05Z,OWNER,Also link to these from the docs added in #964.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",701294727, https://github.com/simonw/datasette/issues/944#issuecomment-691788478,https://api.github.com/repos/simonw/datasette/issues/944,691788478,MDEyOklzc3VlQ29tbWVudDY5MTc4ODQ3OA==,9599,2020-09-14T03:21:45Z,2020-09-14T03:21:45Z,OWNER,Having tried this out I think it does need a `raise_404()` mechanism - which needs to be smart enough to trigger the default 404 handler without accidentally going into an infinite loop.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",681516976, https://github.com/simonw/datasette/issues/880#issuecomment-691785692,https://api.github.com/repos/simonw/datasette/issues/880,691785692,MDEyOklzc3VlQ29tbWVudDY5MTc4NTY5Mg==,9599,2020-09-14T03:10:11Z,2020-09-14T03:10:11Z,OWNER,"Answer: no, it's [not safe](https://twitter.com/glenathan/status/1305081266065244162) to skip CSRF if there's an `Accept: application/json` header because of a nasty old `crossdomain.xml` Flash vulnerability: https://blog.appsecco.com/exploiting-csrf-on-json-endpoints-with-flash-and-redirects-681d4ad6b31b?gi=a5ee3d7a8235","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",648637666, https://github.com/simonw/datasette/issues/940#issuecomment-691781345,https://api.github.com/repos/simonw/datasette/issues/940,691781345,MDEyOklzc3VlQ29tbWVudDY5MTc4MTM0NQ==,9599,2020-09-14T02:53:25Z,2020-09-14T02:53:49Z,OWNER,"That worked: https://github.com/simonw/datasette/runs/1110040212?check_suite_focus=true ran and deployed https://pypi.org/project/datasette/0.49a1/ to 
PyPI but it skipped the push to Docker step because there was an ""a"" in the tag.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",679808124, https://github.com/simonw/datasette/issues/940#issuecomment-691779693,https://api.github.com/repos/simonw/datasette/issues/940,691779693,MDEyOklzc3VlQ29tbWVudDY5MTc3OTY5Mw==,9599,2020-09-14T02:46:39Z,2020-09-14T02:46:39Z,OWNER,I think those should be single quoted.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",679808124, https://github.com/simonw/datasette/issues/940#issuecomment-691779510,https://api.github.com/repos/simonw/datasette/issues/940,691779510,MDEyOklzc3VlQ29tbWVudDY5MTc3OTUxMA==,9599,2020-09-14T02:45:53Z,2020-09-14T02:45:53Z,OWNER,This bit here: https://github.com/simonw/datasette/blob/c18117cf08ad67c704dab29e3cb3b88f1de4026b/.github/workflows/publish.yml#L58-L62,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",679808124, https://github.com/simonw/datasette/issues/940#issuecomment-691779361,https://api.github.com/repos/simonw/datasette/issues/940,691779361,MDEyOklzc3VlQ29tbWVudDY5MTc3OTM2MQ==,9599,2020-09-14T02:45:04Z,2020-09-14T02:45:04Z,OWNER,"Package deploys are still broken, just got this error trying to ship 0.49a1: https://github.com/simonw/datasette/actions/runs/253099665 > The workflow is not valid. .github/workflows/publish.yml (Line: 61, Col: 9): Unexpected symbol: '""a""'. Located at position 24 within expression: !(contains(github.ref, ""a"") || contains(github.ref, ""b"")) ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",679808124, https://github.com/simonw/datasette/issues/944#issuecomment-691774262,https://api.github.com/repos/simonw/datasette/issues/944,691774262,MDEyOklzc3VlQ29tbWVudDY5MTc3NDI2Mg==,9599,2020-09-14T02:24:08Z,2020-09-14T02:24:08Z,OWNER,"Actually don't need `{{ raise_404(""Museum not found"") }}` because we already have `{{ custom_status(404) }}`.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",681516976, https://github.com/simonw/datasette/issues/944#issuecomment-691769222,https://api.github.com/repos/simonw/datasette/issues/944,691769222,MDEyOklzc3VlQ29tbWVudDY5MTc2OTIyMg==,9599,2020-09-14T02:01:33Z,2020-09-14T02:01:33Z,OWNER,I'm going to cache the `list_templates()` result in memory. If you want to add a new template-defined route you will need to restart the server. I think that's acceptable.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",681516976, https://github.com/simonw/datasette/issues/519#issuecomment-691566247,https://api.github.com/repos/simonw/datasette/issues/519,691566247,MDEyOklzc3VlQ29tbWVudDY5MTU2NjI0Nw==,9599,2020-09-12T22:48:53Z,2020-09-12T22:48:53Z,OWNER,"I think I've figured out what to do about stability of the HTML and the default templates with respect to semantic versioning. I'm going to announce that the JSON API - including the variables made available to templates - should be considered stable according to semver. I will only break backwards compatibility at that level in a major version release. 
The template HTML (and default CSS) will not be considered a stable interface. They won't change on bug fix releases but they may change (albeit described in the release notes) on minor version bumps. Since the template inputs are stable, you can run your own copy of the previous version's templates if something breaks. This means users (and plugin authors) who make changes to the default Datasette UI will have to test their changes against every minor release. I think that's OK. If you write plugins that don't affect the Datasette HTML UI you will be able to expect stability across minor version releases.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",459590021, https://github.com/simonw/datasette/issues/880#issuecomment-691558387,https://api.github.com/repos/simonw/datasette/issues/880,691558387,MDEyOklzc3VlQ29tbWVudDY5MTU1ODM4Nw==,9599,2020-09-12T22:04:48Z,2020-09-12T22:04:48Z,OWNER,"Is it safe to skip CSRF checks if the incoming request has `Accept: application/json` on it? I'm not sure that matters since `asgi-csrf` already won't reject requests that either have no cookies or are using a `Authorization: Bearer ...` header.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",648637666, https://github.com/simonw/datasette/issues/880#issuecomment-691557675,https://api.github.com/repos/simonw/datasette/issues/880,691557675,MDEyOklzc3VlQ29tbWVudDY5MTU1NzY3NQ==,9599,2020-09-12T22:01:02Z,2020-09-12T22:01:11Z,OWNER,"Maybe POST to `.json` doesn't actually make sense. I could instead support `POST /db/queryname` with an optional mechanism for requesting that the response to that POST be in a JSON format. Could be a `Accept: application/json` header with an option of including `""_accept"": ""json""` as a POST parameter instead.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",648637666, https://github.com/simonw/datasette/issues/880#issuecomment-691557429,https://api.github.com/repos/simonw/datasette/issues/880,691557429,MDEyOklzc3VlQ29tbWVudDY5MTU1NzQyOQ==,9599,2020-09-12T21:59:39Z,2020-09-12T21:59:39Z,OWNER,"What should happen when something does a POST to an extension that was registered by a plugin, e.g. `POST /db/table.atom` ?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",648637666, https://github.com/simonw/datasette/issues/782#issuecomment-691554088,https://api.github.com/repos/simonw/datasette/issues/782,691554088,MDEyOklzc3VlQ29tbWVudDY5MTU1NDA4OA==,9599,2020-09-12T21:39:03Z,2020-09-12T21:39:03Z,OWNER,"Plan: release a new release of Datasette (probably 0.49) with the new JSON API design, but provide a plugin called something like `datasette-api-0-48` which runs as ASGI wrapping middleware and internally rewrites incoming requests to e.g. `/db/table.json` to behave if they have the `?_extra=` params on them necessary to produce the 0.48 version of the JSON. 
Anyone who has built applications against 0.48 can install that plugin.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",627794879, https://github.com/simonw/datasette/issues/262#issuecomment-691526975,https://api.github.com/repos/simonw/datasette/issues/262,691526975,MDEyOklzc3VlQ29tbWVudDY5MTUyNjk3NQ==,9599,2020-09-12T18:22:44Z,2020-09-12T18:22:44Z,OWNER,Are there any interesting use-cases for a plugin hook that allows plugins to define their own `?_extra=` blocks?,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323658641, https://github.com/simonw/datasette/issues/782#issuecomment-691526878,https://api.github.com/repos/simonw/datasette/issues/782,691526878,MDEyOklzc3VlQ29tbWVudDY5MTUyNjg3OA==,9599,2020-09-12T18:21:41Z,2020-09-12T18:22:20Z,OWNER,"Would it be so bad if the default format had a `""rows""` key containing the array of rows? Maybe it wouldn't. The reason I always use `?_shape=array` is because I want an array of objects, rather than an array of arrays that I have to match up again with their columns. A default format that's an object rather than array also gives something for the `?_extra=` parameter to add its extras to.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",627794879, https://github.com/simonw/datasette/issues/782#issuecomment-691526762,https://api.github.com/repos/simonw/datasette/issues/782,691526762,MDEyOklzc3VlQ29tbWVudDY5MTUyNjc2Mg==,9599,2020-09-12T18:20:19Z,2020-09-12T18:20:19Z,OWNER,"I'd like to revisit the idea of using `?_extra=x` to opt-in to extra blocks of JSON, from #262","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",627794879, https://github.com/simonw/datasette/issues/262#issuecomment-691526719,https://api.github.com/repos/simonw/datasette/issues/262,691526719,MDEyOklzc3VlQ29tbWVudDY5MTUyNjcxOQ==,9599,2020-09-12T18:19:50Z,2020-09-12T18:19:50Z,OWNER,"> Idea: `?_extra=sqllog` could output a log of every individual SQL statement that was executed in order to generate the page - useful for seeing how foreign key expansion and faceting actually works. I built a version of that a while ago as the `?_trace=1` argument.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323658641, https://github.com/simonw/datasette/issues/262#issuecomment-389702480,https://api.github.com/repos/simonw/datasette/issues/262,389702480,MDEyOklzc3VlQ29tbWVudDM4OTcwMjQ4MA==,9599,2018-05-17T00:00:39Z,2020-09-12T18:19:30Z,OWNER,Idea: `?_extra=sqllog` could output a log of every individual SQL statement that was executed in order to generate the page - useful for seeing how foreign key expansion and faceting actually works.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323658641, https://github.com/simonw/datasette/issues/680#issuecomment-691526635,https://api.github.com/repos/simonw/datasette/issues/680,691526635,MDEyOklzc3VlQ29tbWVudDY5MTUyNjYzNQ==,9599,2020-09-12T18:18:50Z,2020-09-12T18:18:50Z,OWNER,"I'm happy with the not-quite-automated way I'm doing this, so I'm going to close this issue. 
That's documented here https://docs.datasette.io/en/0.48/contributing.html#release-process - I use https://euangoddard.github.io/clipboard2markdown/ to create the GitHub releases markdown version.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",569275763, https://github.com/simonw/datasette/issues/782#issuecomment-691526489,https://api.github.com/repos/simonw/datasette/issues/782,691526489,MDEyOklzc3VlQ29tbWVudDY5MTUyNjQ4OQ==,9599,2020-09-12T18:17:16Z,2020-09-12T18:17:16Z,OWNER,(I think I may have been over-thinking the details of this is for a couple of years now.),"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",627794879, https://github.com/simonw/datasette/issues/782#issuecomment-691526416,https://api.github.com/repos/simonw/datasette/issues/782,691526416,MDEyOklzc3VlQ29tbWVudDY5MTUyNjQxNg==,9599,2020-09-12T18:16:36Z,2020-09-12T18:16:36Z,OWNER,I'm going to hack together a preview of this in a branch and deploy it somewhere so people can see what I've got planned. Much easier to evaluate a working prototype than static examples.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",627794879, https://github.com/dogsheep/twitter-to-sqlite/issues/50#issuecomment-691501132,https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/50,691501132,MDEyOklzc3VlQ29tbWVudDY5MTUwMTEzMg==,706257,2020-09-12T14:48:10Z,2020-09-12T14:48:10Z,NONE,"This seems to be an issue even with larger values of `--stop_after`: ``` $ twitter-to-sqlite favorites twitter.db --stop_after=2000 Importing favorites [####################################] 198 $ ```","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",698791218, https://github.com/simonw/datasette/issues/963#issuecomment-691379980,https://api.github.com/repos/simonw/datasette/issues/963,691379980,MDEyOklzc3VlQ29tbWVudDY5MTM3OTk4MA==,9599,2020-09-12T01:50:56Z,2020-09-12T01:50:56Z,OWNER,Good bug - looks like a problem with the hidden form fields.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",699947574, https://github.com/simonw/datasette/issues/782#issuecomment-691323302,https://api.github.com/repos/simonw/datasette/issues/782,691323302,MDEyOklzc3VlQ29tbWVudDY5MTMyMzMwMg==,9599,2020-09-11T21:38:27Z,2020-09-11T21:40:04Z,OWNER,"Another idea: the default output could be the list of dicts: ```json [ { ""pk1"": ""a"", ""pk2"": ""a"", ""pk3"": ""a"", ""content"": ""a-a-a"" }, ... ] ``` BUT... 
I could include pagination information in the HTTP headers - as seen in the WordPress REST API or the GitHub API: ``` ~ % curl -s -i 'https://api.github.com/repos/simonw/datasette/commits' | head -n 40 HTTP/1.1 200 OK server: GitHub.com date: Fri, 11 Sep 2020 21:37:46 GMT content-type: application/json; charset=utf-8 status: 200 OK cache-control: public, max-age=60, s-maxage=60 vary: Accept, Accept-Encoding, Accept, X-Requested-With etag: W/""71c99379743513394e880c6306b66bf9"" last-modified: Fri, 11 Sep 2020 21:32:54 GMT x-github-media-type: github.v3; format=json link: ; rel=""next"", ; rel=""last"" access-control-expose-headers: ETag, Link, Location, Retry-After, X-GitHub-OTP, X-RateLimit-Limit, X-RateLimit-Remaining, X-RateLimit-Used, X-RateLimit-Reset, X-OAuth-Scopes, X-Accepted-OAuth-Scopes, X-Poll-Interval, X-GitHub-Media-Type, Deprecation, Sunset access-control-allow-origin: * strict-transport-security: max-age=31536000; includeSubdomains; preload x-frame-options: deny x-content-type-options: nosniff x-xss-protection: 1; mode=block referrer-policy: origin-when-cross-origin, strict-origin-when-cross-origin content-security-policy: default-src 'none' X-Ratelimit-Limit: 60 X-Ratelimit-Remaining: 55 X-Ratelimit-Reset: 1599863850 X-Ratelimit-Used: 5 Accept-Ranges: bytes Content-Length: 118240 X-GitHub-Request-Id: EC76:0EAD:313F40:5291A4:5F5BEE37 [ { ""sha"": ""d02f6151dae073135a22d0123e8abdc6cbef7c50"", ""node_id"": ""MDY6Q29tbWl0MTA3OTE0NDkzOmQwMmY2MTUxZGFlMDczMTM1YTIyZDAxMjNlOGFiZGM2Y2JlZjdjNTA="", ""commit"": { ``` Alternative shapes would provide the pagination information (and other extensions) in the JSON, e.g.: `/squirrels/squirrels.json?_shape=paginated` ```json { ""rows"": [ { ""pk1"": ""a"", ""pk2"": ""a"", ""pk3"": ""a"", ""content"": ""a-a-a"" } ], ""pagination"": { ""next"": ""234"", ""count"": 442 } } ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",627794879, https://github.com/simonw/datasette/issues/947#issuecomment-691318133,https://api.github.com/repos/simonw/datasette/issues/947,691318133,MDEyOklzc3VlQ29tbWVudDY5MTMxODEzMw==,9599,2020-09-11T21:23:40Z,2020-09-11T21:23:40Z,OWNER,"I'm going to use exit code 1 for any errors, be they 500 or 404.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",684111953, https://github.com/simonw/datasette/issues/962#issuecomment-691250299,https://api.github.com/repos/simonw/datasette/issues/962,691250299,MDEyOklzc3VlQ29tbWVudDY5MTI1MDI5OQ==,9599,2020-09-11T18:33:50Z,2020-09-11T18:33:50Z,OWNER,Since this is purely a debugging option I'm going to allow myself not to write a unit test for it!,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",699622046, https://github.com/dogsheep/twitter-to-sqlite/issues/50#issuecomment-690860653,https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/50,690860653,MDEyOklzc3VlQ29tbWVudDY5MDg2MDY1Mw==,370930,2020-09-11T04:04:08Z,2020-09-11T04:04:08Z,CONTRIBUTOR,"There's probably a nicer way of doing (hence this is a comment rather than a PR), but this appears to fix it: ```diff --- a/twitter_to_sqlite/utils.py +++ b/twitter_to_sqlite/utils.py @@ -181,6 +181,7 @@ def fetch_timeline( args[""tweet_mode""] = ""extended"" min_seen_id = None num_rate_limit_errors = 0 + seen_count = 0 while True: if min_seen_id is not None: args[""max_id""] = min_seen_id - 
1 @@ -208,6 +209,7 @@ def fetch_timeline( yield tweet min_seen_id = min(t[""id""] for t in tweets) max_seen_id = max(t[""id""] for t in tweets) + seen_count += len(tweets) if last_since_id is not None: max_seen_id = max((last_since_id, max_seen_id)) last_since_id = max_seen_id @@ -217,7 +219,9 @@ def fetch_timeline( replace=True, ) if stop_after is not None: - break + if seen_count >= stop_after: + break + args[""count""] = min(args[""count""], stop_after - seen_count) time.sleep(sleep) ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",698791218, https://github.com/simonw/sqlite-utils/issues/157#issuecomment-689850509,https://api.github.com/repos/simonw/sqlite-utils/issues/157,689850509,MDEyOklzc3VlQ29tbWVudDY4OTg1MDUwOQ==,9599,2020-09-09T22:14:49Z,2020-09-09T22:14:49Z,OWNER,It will call this method: https://github.com/simonw/sqlite-utils/blob/367082e787101fb90901ef3214804ab23a92ce46/sqlite_utils/db.py#L405-L411,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",697179806, https://github.com/simonw/sqlite-utils/issues/157#issuecomment-689850289,https://api.github.com/repos/simonw/sqlite-utils/issues/157,689850289,MDEyOklzc3VlQ29tbWVudDY4OTg1MDI4OQ==,9599,2020-09-09T22:14:19Z,2020-09-09T22:14:19Z,OWNER,"This can accept four arguments: table, column, other_table, other_column: ``` sqlite-utils add-foreign-keys calands.db \ units_with_maps ACCESS_TYP ACCESS_TYP id \ units_with_maps AGNCY_NAME AGNCY_NAME id \ units_with_maps AGNCY_LEV AGNCY_LEV id ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",697179806, https://github.com/simonw/sqlite-utils/pull/156#issuecomment-689735140,https://api.github.com/repos/simonw/sqlite-utils/issues/156,689735140,MDEyOklzc3VlQ29tbWVudDY4OTczNTE0MA==,9599,2020-09-09T18:21:06Z,2020-09-09T18:21:06Z,OWNER,"Good spot, thanks.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",697030843, https://github.com/simonw/datasette/issues/961#issuecomment-689635754,https://api.github.com/repos/simonw/datasette/issues/961,689635754,MDEyOklzc3VlQ29tbWVudDY4OTYzNTc1NA==,9599,2020-09-09T15:24:31Z,2020-09-09T15:24:31Z,OWNER,"I thought about checking that every database in the `databases:` section exists and ditto for `tables:` - but actually I think it's useful to be able to keep a `metadata.yml` around with configuration for databases or tables that aren't currently attached to Datasette. 
I could treat those as warnings and output a warning to standard out when the server starts instead.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",696908389, https://github.com/simonw/datasette/issues/961#issuecomment-689635094,https://api.github.com/repos/simonw/datasette/issues/961,689635094,MDEyOklzc3VlQ29tbWVudDY4OTYzNTA5NA==,9599,2020-09-09T15:23:24Z,2020-09-09T15:23:24Z,OWNER,"Checks can include: - `facets:` lists columns that exist - `sort:` and `sort_desc:` columns - `fts_table` and `fts_pk` are valid","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",696908389, https://github.com/dogsheep/dogsheep-beta/issues/17#issuecomment-689226390,https://api.github.com/repos/dogsheep/dogsheep-beta/issues/17,689226390,MDEyOklzc3VlQ29tbWVudDY4OTIyNjM5MA==,9599,2020-09-09T00:36:07Z,2020-09-09T00:36:07Z,MEMBER,"Alternative names: - type - record_type - doctype I think `type` is right. It matches what Elasticsearch used to call their equivalent of this (before they removed the feature!). https://www.elastic.co/guide/en/elasticsearch/reference/current/removal-of-types.html","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",694500679, https://github.com/simonw/sqlite-utils/issues/145#issuecomment-689186423,https://api.github.com/repos/simonw/sqlite-utils/issues/145,689186423,MDEyOklzc3VlQ29tbWVudDY4OTE4NjQyMw==,9599,2020-09-08T23:21:23Z,2020-09-08T23:21:23Z,OWNER,Fixed in PR #146.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",688659182, https://github.com/simonw/sqlite-utils/pull/146#issuecomment-689185393,https://api.github.com/repos/simonw/sqlite-utils/issues/146,689185393,MDEyOklzc3VlQ29tbWVudDY4OTE4NTM5Mw==,9599,2020-09-08T23:17:42Z,2020-09-08T23:17:42Z,OWNER,"That seems like a reasonable approach to me, especially since this is going to be a pretty rare edge-case.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",688668680, https://github.com/simonw/sqlite-utils/issues/155#issuecomment-689166404,https://api.github.com/repos/simonw/sqlite-utils/issues/155,689166404,MDEyOklzc3VlQ29tbWVudDY4OTE2NjQwNA==,9599,2020-09-08T22:20:03Z,2020-09-08T22:20:03Z,OWNER,"I'm going to update `sqlite-utils optimize` to also take an optional list of tables, for consistency.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",696045581, https://github.com/simonw/sqlite-utils/issues/153#issuecomment-689165985,https://api.github.com/repos/simonw/sqlite-utils/issues/153,689165985,MDEyOklzc3VlQ29tbWVudDY4OTE2NTk4NQ==,9599,2020-09-08T22:18:52Z,2020-09-08T22:18:52Z,OWNER,"I've reverted this change again, because it turns out using the `rebuild` FTS mechanism is a better way of repairing this issue - see #155.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",695377804, https://github.com/simonw/sqlite-utils/issues/155#issuecomment-689163158,https://api.github.com/repos/simonw/sqlite-utils/issues/155,689163158,MDEyOklzc3VlQ29tbWVudDY4OTE2MzE1OA==,9599,2020-09-08T22:10:27Z,2020-09-08T22:10:27Z,OWNER,"For the command version: sqlite-utils 
rebuild-fts mydb.db

This will rebuild all detected FTS tables. You can also specify one or more explicit tables:

sqlite-utils rebuild-fts mydb.db dogs ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",696045581, https://github.com/dogsheep/dogsheep-beta/issues/19#issuecomment-688626037,https://api.github.com/repos/dogsheep/dogsheep-beta/issues/19,688626037,MDEyOklzc3VlQ29tbWVudDY4ODYyNjAzNw==,9599,2020-09-08T05:27:07Z,2020-09-08T05:27:07Z,MEMBER,"A really clever way to do this would be with triggers. The indexer script would add triggers to each of the database tables that it is indexing - each in their own database. Those triggers would then maintain a `_index_queue_` table. This table would record the primary key of rows that are added, modified or deleted. The indexer could then work by reading through the `_index_queue_` table, re-indexing (or deleting) just the primary keys listed there, and then emptying the queue once it has finished. This would add a small amount of overhead to insert/update/delete queries run against the table. My hunch is that the overhead would be miniscule, but I could still allow people to opt-out for tables that are so high traffic that this would matter.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",695556681, https://github.com/dogsheep/dogsheep-beta/issues/19#issuecomment-688625430,https://api.github.com/repos/dogsheep/dogsheep-beta/issues/19,688625430,MDEyOklzc3VlQ29tbWVudDY4ODYyNTQzMA==,9599,2020-09-08T05:24:50Z,2020-09-08T05:24:50Z,MEMBER,"I thought about allowing tables to define an incremental indexing SQL query - maybe something that can return just records touched in the past hour, or records since a recorded ""last indexed record"" value. The problem with this is deletes - if you delete a record, how does the indexer know to remove it? See #18 - that's already caused problems.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",695556681, https://github.com/dogsheep/dogsheep-beta/issues/18#issuecomment-688623097,https://api.github.com/repos/dogsheep/dogsheep-beta/issues/18,688623097,MDEyOklzc3VlQ29tbWVudDY4ODYyMzA5Nw==,9599,2020-09-08T05:15:51Z,2020-09-08T05:15:51Z,MEMBER,"I'm inclined to go with the first, simpler option. I have longer term plans for efficient incremental index updates based on clever trickery with triggers.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",695553522, https://github.com/dogsheep/dogsheep-beta/issues/18#issuecomment-688622995,https://api.github.com/repos/dogsheep/dogsheep-beta/issues/18,688622995,MDEyOklzc3VlQ29tbWVudDY4ODYyMjk5NQ==,9599,2020-09-08T05:15:21Z,2020-09-08T05:15:21Z,MEMBER,"Alternatively it could run as it does now but add a `DELETE FROM index1.search_index WHERE key not in (select key from ...)`. 
I'm not sure which would be more efficient.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",695553522, https://github.com/simonw/sqlite-utils/pull/146#issuecomment-688573964,https://api.github.com/repos/simonw/sqlite-utils/issues/146,688573964,MDEyOklzc3VlQ29tbWVudDY4ODU3Mzk2NA==,96218,2020-09-08T01:55:07Z,2020-09-08T01:55:07Z,CONTRIBUTOR,"Okay, I've rewritten this PR to preserve the batching behaviour but still fix #145, and rebased the branch to account for the `db.execute()` api change. It's not terribly sophisticated -- if it attempts to insert a batch which has too many variables, the exception is caught, the batch is split in two and each half is inserted separately, and then it carries on as before with the same `batch_size`. In the edge case where this gets triggered, subsequent batches will all be inserted in two groups too if they continue to have the same number of columns (which is presumably reasonably likely). Do you reckon this is acceptable when set against the awkwardness of recalculating the `batch_size` on the fly?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",688668680, https://github.com/simonw/sqlite-utils/issues/154#issuecomment-688544156,https://api.github.com/repos/simonw/sqlite-utils/issues/154,688544156,MDEyOklzc3VlQ29tbWVudDY4ODU0NDE1Ng==,9599,2020-09-07T23:47:10Z,2020-09-07T23:47:10Z,OWNER,This is already covered in the tests though: https://github.com/simonw/sqlite-utils/blob/deb2eb013ff85bbc828ebc244a9654f0d9c3139e/tests/test_cli.py#L1300-L1328,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",695441530, https://github.com/simonw/sqlite-utils/issues/154#issuecomment-688543128,https://api.github.com/repos/simonw/sqlite-utils/issues/154,688543128,MDEyOklzc3VlQ29tbWVudDY4ODU0MzEyOA==,9599,2020-09-07T23:43:10Z,2020-09-07T23:43:10Z,OWNER,"Running this against the same file works: ``` $ sqlite3 beta.db SQLite version 3.31.1 2020-01-27 19:55:54 Enter "".help"" for usage hints. 
sqlite> PRAGMA journal_mode=wal;
wal
```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",695441530, https://github.com/simonw/sqlite-utils/issues/152#issuecomment-688500704,https://api.github.com/repos/simonw/sqlite-utils/issues/152,688500704,MDEyOklzc3VlQ29tbWVudDY4ODUwMDcwNA==,9599,2020-09-07T20:28:45Z,2020-09-07T21:17:48Z,OWNER,"The principal reason to turn these on - at least so far - is that without it weird things happen where FTS tables (in particular `*_fts_docsize`) grow without limit over time, because calls to `INSERT OR REPLACE` against the parent table cause additional rows to be inserted into `*_fts_docsize` even if the row was replaced rather than being inserted.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",695376054, https://github.com/simonw/sqlite-utils/issues/153#issuecomment-688511161,https://api.github.com/repos/simonw/sqlite-utils/issues/153,688511161,MDEyOklzc3VlQ29tbWVudDY4ODUxMTE2MQ==,9599,2020-09-07T21:07:20Z,2020-09-07T21:07:29Z,OWNER,"FTS4 uses a different column name here: https://datasette-sqlite-fts4.datasette.io/24ways-fts4/articles_fts_docsize
```
CREATE TABLE 'articles_fts_docsize'(docid INTEGER PRIMARY KEY, size BLOB);
```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",695377804, https://github.com/simonw/sqlite-utils/pull/146#issuecomment-688508510,https://api.github.com/repos/simonw/sqlite-utils/issues/146,688508510,MDEyOklzc3VlQ29tbWVudDY4ODUwODUxMA==,9599,2020-09-07T20:56:03Z,2020-09-07T20:56:24Z,OWNER,"The problem with this approach is that it requires us to consume the entire iterator before we can start inserting rows into the table - here on line 1052: https://github.com/simonw/sqlite-utils/blob/bb131793feac16bc7181ab997568f941b0220ef2/sqlite_utils/db.py#L1047-L1054 I designed the `.insert_all()` to avoid doing this, because I want to be able to pass it an iterator (or more likely a generator) that could produce potentially millions of records. Doing things one batch of 100 records at a time means that the Python process doesn't need to pull millions of records into memory at once. `db-to-sqlite` is one example of a tool that uses that characteristic, in https://github.com/simonw/db-to-sqlite/blob/63e4ee972f292de13bb11767c0fb64b35339d954/db_to_sqlite/cli.py#L94-L106 So we need to solve this issue without consuming the entire iterator with a `records = list(records)` call. I think one way to do this is to execute each chunk one at a time and watch out for an exception that indicates that we sent too many parameters - then adjust the chunk size down and try again.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",688668680, https://github.com/simonw/sqlite-utils/issues/153#issuecomment-688506015,https://api.github.com/repos/simonw/sqlite-utils/issues/153,688506015,MDEyOklzc3VlQ29tbWVudDY4ODUwNjAxNQ==,9599,2020-09-07T20:46:58Z,2020-09-07T20:46:58Z,OWNER,Writing a test for this will be a tiny bit tricky. 
I think I'll use a test that replicates the bug in #149.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",695377804, https://github.com/simonw/sqlite-utils/issues/149#issuecomment-688501064,https://api.github.com/repos/simonw/sqlite-utils/issues/149,688501064,MDEyOklzc3VlQ29tbWVudDY4ODUwMTA2NA==,9599,2020-09-07T20:30:15Z,2020-09-07T20:30:38Z,OWNER,"The second challenge here is cleaning up all of those junk rows in existing `*_fts_docsize` tables. Doing that just to the demo database from https://github-to-sqlite.dogsheep.net/github.db dropped its size from 22MB to 16MB! Here's the SQL: ```sql DELETE FROM [licenses_fts_docsize] WHERE id NOT IN ( SELECT rowid FROM [licenses_fts]); ``` I can do that as part of the existing `table.optimize()` method, which optimizes FTS tables.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",695319258, https://github.com/simonw/sqlite-utils/issues/152#issuecomment-688500294,https://api.github.com/repos/simonw/sqlite-utils/issues/152,688500294,MDEyOklzc3VlQ29tbWVudDY4ODUwMDI5NA==,9599,2020-09-07T20:27:07Z,2020-09-07T20:27:07Z,OWNER,I'm going to make this an argument to the `Database()` class constructor which defaults to `True`.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",695376054, https://github.com/simonw/sqlite-utils/issues/149#issuecomment-688499924,https://api.github.com/repos/simonw/sqlite-utils/issues/149,688499924,MDEyOklzc3VlQ29tbWVudDY4ODQ5OTkyNA==,9599,2020-09-07T20:25:40Z,2020-09-07T20:25:50Z,OWNER,"https://www.sqlite.org/pragma.html#pragma_recursive_triggers says: > Prior to SQLite [version 3.6.18](https://www.sqlite.org/releaselog/3_6_18.html) (2009-09-11), recursive triggers were not supported. The behavior of SQLite was always as if this pragma was set to OFF. Support for recursive triggers was added in version 3.6.18 but was initially turned OFF by default, for compatibility. Recursive triggers may be turned on by default in future versions of SQLite. 
So I think the fix is to turn on `recursive_triggers` globally by default for `sqlite-utils`.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",695319258, https://github.com/simonw/sqlite-utils/issues/149#issuecomment-688499650,https://api.github.com/repos/simonw/sqlite-utils/issues/149,688499650,MDEyOklzc3VlQ29tbWVudDY4ODQ5OTY1MA==,9599,2020-09-07T20:24:35Z,2020-09-07T20:24:35Z,OWNER,"This replicates the problem: ``` (github-to-sqlite) /tmp % sqlite-utils tables --counts github.db | grep licenses {""table"": ""licenses"", ""count"": 7}, {""table"": ""licenses_fts_data"", ""count"": 35}, {""table"": ""licenses_fts_idx"", ""count"": 16}, {""table"": ""licenses_fts_docsize"", ""count"": 9151}, {""table"": ""licenses_fts_config"", ""count"": 1}, {""table"": ""licenses_fts"", ""count"": 7}, (github-to-sqlite) /tmp % github-to-sqlite repos github.db dogsheep (github-to-sqlite) /tmp % sqlite-utils tables --counts github.db | grep licenses {""table"": ""licenses"", ""count"": 7}, {""table"": ""licenses_fts_data"", ""count"": 45}, {""table"": ""licenses_fts_idx"", ""count"": 26}, {""table"": ""licenses_fts_docsize"", ""count"": 9161}, {""table"": ""licenses_fts_config"", ""count"": 1}, {""table"": ""licenses_fts"", ""count"": 7}, ``` Note how the number of rows in `licenses_fts_docsize` goes from 9151 to 9161. The number went up by ten. I used tracing from #151 to show that the following SQL executed ten times: ``` INSERT OR REPLACE INTO [licenses] ([key], [name], [node_id], [spdx_id], [url]) VALUES (?, ?, ?, ?, ?); ``` Then I tried executing `PRAGMA recursive_triggers=on;` at the start of the script. This fixed the problem - running the script did not increase the number of rows in `licenses_fts_docsize`.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",695319258, https://github.com/simonw/sqlite-utils/issues/149#issuecomment-688482355,https://api.github.com/repos/simonw/sqlite-utils/issues/149,688482355,MDEyOklzc3VlQ29tbWVudDY4ODQ4MjM1NQ==,9599,2020-09-07T19:22:51Z,2020-09-07T19:22:51Z,OWNER,"And the SQLite documentation says: > When the REPLACE conflict resolution strategy deletes rows in order to satisfy a constraint, [delete triggers](https://www.sqlite.org/lang_createtrigger.html) fire if and only if [recursive triggers](https://www.sqlite.org/pragma.html#pragma_recursive_triggers) are enabled.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",695319258, https://github.com/simonw/sqlite-utils/issues/149#issuecomment-688482055,https://api.github.com/repos/simonw/sqlite-utils/issues/149,688482055,MDEyOklzc3VlQ29tbWVudDY4ODQ4MjA1NQ==,9599,2020-09-07T19:21:42Z,2020-09-07T19:21:42Z,OWNER,"Using `replace=True` there executes `INSERT OR REPLACE` - and Dan Kennedy (SQLite maintainer) on the SQLite forums said this: > Are you using ""REPLACE INTO"", or ""UPDATE OR REPLACE"" on the ""licenses"" table without having first executed ""PRAGMA recursive_triggers = 1""? The docs note that delete triggers will not be fired in this case, which would explain things. 
Second paragraph under ""REPLACE"" here: > > https://www.sqlite.org/lang_conflict.html","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",695319258, https://github.com/simonw/sqlite-utils/issues/149#issuecomment-688481374,https://api.github.com/repos/simonw/sqlite-utils/issues/149,688481374,MDEyOklzc3VlQ29tbWVudDY4ODQ4MTM3NA==,9599,2020-09-07T19:19:08Z,2020-09-07T19:19:08Z,OWNER,"reading through the code for `github-to-sqlite repos` - one of the things it does is calls `save_license` for each repo: https://github.com/dogsheep/github-to-sqlite/blob/39b2234253096bd579feed4e25104698b8ccd2ba/github_to_sqlite/utils.py#L259-L262 ```python def save_license(db, license): if license is None: return None return db[""licenses""].insert(license, pk=""key"", replace=True).last_pk ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",695319258, https://github.com/simonw/sqlite-utils/pull/146#issuecomment-688481317,https://api.github.com/repos/simonw/sqlite-utils/issues/146,688481317,MDEyOklzc3VlQ29tbWVudDY4ODQ4MTMxNw==,96218,2020-09-07T19:18:55Z,2020-09-07T19:18:55Z,CONTRIBUTOR,"Just force-pushed to update d042f9c with more formatting changes to satisfy `black==20.8b1` and pass the GitHub Actions ""Test"" workflow.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",688668680, https://github.com/simonw/sqlite-utils/issues/149#issuecomment-688480665,https://api.github.com/repos/simonw/sqlite-utils/issues/149,688480665,MDEyOklzc3VlQ29tbWVudDY4ODQ4MDY2NQ==,9599,2020-09-07T19:16:20Z,2020-09-07T19:16:20Z,OWNER,"Aha! I have managed to replicate the bug: ``` (github-to-sqlite) /tmp % sqlite-utils tables --counts github.db | grep licenses {""table"": ""licenses"", ""count"": 7}, {""table"": ""licenses_fts_data"", ""count"": 35}, {""table"": ""licenses_fts_idx"", ""count"": 16}, {""table"": ""licenses_fts_docsize"", ""count"": 9151}, {""table"": ""licenses_fts_config"", ""count"": 1}, {""table"": ""licenses_fts"", ""count"": 7}, (github-to-sqlite) /tmp % github-to-sqlite repos github.db dogsheep (github-to-sqlite) /tmp % sqlite-utils tables --counts github.db | grep licenses {""table"": ""licenses"", ""count"": 7}, {""table"": ""licenses_fts_data"", ""count"": 45}, {""table"": ""licenses_fts_idx"", ""count"": 26}, {""table"": ""licenses_fts_docsize"", ""count"": 9161}, {""table"": ""licenses_fts_config"", ""count"": 1}, {""table"": ""licenses_fts"", ""count"": 7}, ``` Note that the number of records in `licenses_fts_docsize` went from 9151 to 9161.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",695319258, https://github.com/simonw/sqlite-utils/pull/146#issuecomment-688479163,https://api.github.com/repos/simonw/sqlite-utils/issues/146,688479163,MDEyOklzc3VlQ29tbWVudDY4ODQ3OTE2Mw==,96218,2020-09-07T19:10:33Z,2020-09-07T19:11:57Z,CONTRIBUTOR,"@simonw -- I've gone ahead updated the documentation to reflect the changes introduced in this PR. IMO it's ready to merge now. In writing the documentation changes, I begin to wonder about the value and role of `batch_size` at all, tbh. May I assume it was originally intended to prevent using the entire row set to determine columns and column types, and that this was a performance consideration? If so, this PR entirely undermines its purpose. 
I've been passing in excess of 500,000 rows at a time to `insert_all()` with these changes and although I'm sure the performance difference is measurable it's not really noticeable; given #145, I don't know that any performance advantages outweigh the problems doing it this way removes. What do you think about just dropping the argument and defaulting to the maximum `batch_size` permissible given `SQLITE_MAX_VARS`? Are there other reasons one might want to restrict `batch_size` that I've overlooked? I could open a new issue to discuss/implement this. Of course the documentation will need to change again too if/when something is done about #147.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",688668680, https://github.com/simonw/sqlite-utils/issues/149#issuecomment-688464181,https://api.github.com/repos/simonw/sqlite-utils/issues/149,688464181,MDEyOklzc3VlQ29tbWVudDY4ODQ2NDE4MQ==,9599,2020-09-07T18:19:54Z,2020-09-07T18:19:54Z,OWNER,"Even though that table doesn't declare an integer primary key it does have a `rowid` column: https://github-to-sqlite.dogsheep.net/github?sql=select+rowid%2C+%5Bkey%5D%2C+name%2C+spdx_id%2C+url%2C+node_id+from+licenses+order+by+%5Bkey%5D+limit+101 | rowid | key | name | spdx_id | url | node_id | | --- | --- | --- | --- | --- | --- | | 9150 | apache-2.0 | Apache License 2.0 | Apache-2.0 | | MDc6TGljZW5zZTI= | | 112 | bsd-3-clause | BSD 3-Clause ""New"" or ""Revised"" License | BSD-3-Clause | | MDc6TGljZW5zZTU= | https://www.sqlite.org/rowidtable.html explains has this clue: > If the rowid is not aliased by INTEGER PRIMARY KEY then it is not persistent and might change. In particular the VACUUM command will change rowids for tables that do not declare an INTEGER PRIMARY KEY. Therefore, applications should not normally access the rowid directly, but instead use an INTEGER PRIMARY KEY. 
","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",695319258, https://github.com/simonw/sqlite-utils/issues/149#issuecomment-688460865,https://api.github.com/repos/simonw/sqlite-utils/issues/149,688460865,MDEyOklzc3VlQ29tbWVudDY4ODQ2MDg2NQ==,9599,2020-09-07T18:07:14Z,2020-09-07T18:07:14Z,OWNER,"Another likely culprit: `licenses` has a text primary key, so it's not using `rowid`: ```sql CREATE TABLE [licenses] ( [key] TEXT PRIMARY KEY, [name] TEXT, [spdx_id] TEXT, [url] TEXT, [node_id] TEXT ); ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",695319258, https://github.com/simonw/sqlite-utils/issues/149#issuecomment-688460729,https://api.github.com/repos/simonw/sqlite-utils/issues/149,688460729,MDEyOklzc3VlQ29tbWVudDY4ODQ2MDcyOQ==,9599,2020-09-07T18:06:44Z,2020-09-07T18:06:44Z,OWNER,First posted on SQLite forum here but I'm pretty sure this is a bug in how `sqlite-utils` created those tables: https://sqlite.org/forum/forumpost/51aada1b45,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",695319258, https://github.com/simonw/sqlite-utils/issues/148#issuecomment-688434226,https://api.github.com/repos/simonw/sqlite-utils/issues/148,688434226,MDEyOklzc3VlQ29tbWVudDY4ODQzNDIyNg==,9599,2020-09-07T16:50:33Z,2020-09-07T16:50:33Z,OWNER,"This may be as easy as applying `textwrap.dedent()` to this: https://github.com/simonw/sqlite-utils/blob/0e62744da9a429093e3409575c1f881376b0361f/sqlite_utils/db.py#L778-L787 I could apply that to a few other queries in that code as well.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",695276328, https://github.com/dogsheep/dogsheep-beta/issues/17#issuecomment-687880459,https://api.github.com/repos/dogsheep/dogsheep-beta/issues/17,687880459,MDEyOklzc3VlQ29tbWVudDY4Nzg4MDQ1OQ==,9599,2020-09-06T19:36:32Z,2020-09-06T19:36:32Z,MEMBER,At some point I may even want to support search types which are indexed from (and inflated from) more than one database file. I'm going to ignore that for the moment though.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",694500679, https://github.com/dogsheep/dogsheep-beta/issues/13#issuecomment-686774592,https://api.github.com/repos/dogsheep/dogsheep-beta/issues/13,686774592,MDEyOklzc3VlQ29tbWVudDY4Njc3NDU5Mg==,9599,2020-09-03T21:30:21Z,2020-09-03T21:30:21Z,MEMBER,"This is partially supported: the custom search SQL we run doesn't escape them, but the `?_search` used to calculate facet counts does. 
So this is a bug.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",692386625,
https://github.com/dogsheep/dogsheep-beta/issues/9#issuecomment-686767208,https://api.github.com/repos/dogsheep/dogsheep-beta/issues/9,686767208,MDEyOklzc3VlQ29tbWVudDY4Njc2NzIwOA==,9599,2020-09-03T21:12:14Z,2020-09-03T21:12:14Z,MEMBER,Documentation: https://github.com/dogsheep/dogsheep-beta/blob/0.4/README.md#custom-results-display,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",691521965,
https://github.com/dogsheep/dogsheep-beta/issues/3#issuecomment-686689612,https://api.github.com/repos/dogsheep/dogsheep-beta/issues/3,686689612,MDEyOklzc3VlQ29tbWVudDY4NjY4OTYxMg==,9599,2020-09-03T18:44:20Z,2020-09-03T18:44:20Z,MEMBER,Facets are now displayed but selecting them doesn't work yet.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",689810340,
https://github.com/dogsheep/dogsheep-beta/issues/5#issuecomment-686689366,https://api.github.com/repos/dogsheep/dogsheep-beta/issues/5,686689366,MDEyOklzc3VlQ29tbWVudDY4NjY4OTM2Ng==,9599,2020-09-03T18:43:50Z,2020-09-03T18:43:50Z,MEMBER,No longer needed thanks to #9,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",689847361,
https://github.com/dogsheep/dogsheep-beta/issues/9#issuecomment-686689122,https://api.github.com/repos/dogsheep/dogsheep-beta/issues/9,686689122,MDEyOklzc3VlQ29tbWVudDY4NjY4OTEyMg==,9599,2020-09-03T18:43:20Z,2020-09-03T18:43:20Z,MEMBER,Needs documentation.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",691521965,
https://github.com/dogsheep/dogsheep-beta/issues/9#issuecomment-686688963,https://api.github.com/repos/dogsheep/dogsheep-beta/issues/9,686688963,MDEyOklzc3VlQ29tbWVudDY4NjY4ODk2Mw==,9599,2020-09-03T18:42:59Z,2020-09-03T18:42:59Z,MEMBER,I'm pleased with how this works now.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",691521965,
https://github.com/dogsheep/dogsheep-beta/issues/11#issuecomment-686618669,https://api.github.com/repos/dogsheep/dogsheep-beta/issues/11,686618669,MDEyOklzc3VlQ29tbWVudDY4NjYxODY2OQ==,9599,2020-09-03T16:47:34Z,2020-09-03T16:53:25Z,MEMBER,I think an `is_public` integer column which defaults to 0 would be good here.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",692125110,
https://github.com/dogsheep/dogsheep-beta/issues/10#issuecomment-686238498,https://api.github.com/repos/dogsheep/dogsheep-beta/issues/10,686238498,MDEyOklzc3VlQ29tbWVudDY4NjIzODQ5OA==,9599,2020-09-03T04:05:05Z,2020-09-03T04:05:05Z,MEMBER,"Since the first two categories are `created` and `saved`, this one should be called `received`.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",691557547,
https://github.com/dogsheep/dogsheep-beta/issues/9#issuecomment-686163754,https://api.github.com/repos/dogsheep/dogsheep-beta/issues/9,686163754,MDEyOklzc3VlQ29tbWVudDY4NjE2Mzc1NA==,9599,2020-09-03T00:46:21Z,2020-09-03T00:46:21Z,MEMBER,"Challenge: the `dogsheep-beta.yml` configuration file that is passed to the `dogsheep-beta index` command also needs to be made available to Datasette itself, so that it can read the configuration. Let's say it can either be duplicated in the `plugins` configuration block of the `metadata.yml`, OR you can do this in `metadata.yml`:

```yaml
plugins:
  dogsheep-beta:
    config_file: dogsheep-beta.yml
```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",691521965,
https://github.com/dogsheep/dogsheep-beta/issues/9#issuecomment-686158454,https://api.github.com/repos/dogsheep/dogsheep-beta/issues/9,686158454,MDEyOklzc3VlQ29tbWVudDY4NjE1ODQ1NA==,9599,2020-09-03T00:32:42Z,2020-09-03T00:32:42Z,MEMBER,"If this turns out to be too inefficient I could add a `display` text column to the `search_index` table, designed to be populated with arbitrary JSON by the indexing query; that JSON could then be used to render the template fragment.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",691521965,
https://github.com/dogsheep/dogsheep-beta/issues/9#issuecomment-686154627,https://api.github.com/repos/dogsheep/dogsheep-beta/issues/9,686154627,MDEyOklzc3VlQ29tbWVudDY4NjE1NDYyNw==,9599,2020-09-03T00:19:22Z,2020-09-03T00:19:22Z,MEMBER,If this performs well enough (100 displayed items will be 100 extra `display_sql` calls) then I'll go with this as the design for the feature.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",691521965,
https://github.com/dogsheep/dogsheep-beta/issues/9#issuecomment-686154486,https://api.github.com/repos/dogsheep/dogsheep-beta/issues/9,686154486,MDEyOklzc3VlQ29tbWVudDY4NjE1NDQ4Ng==,9599,2020-09-03T00:18:54Z,2020-09-03T00:18:54Z,MEMBER,"`display_sql` could be optional. If it's not defined, a `row` object is passed to the template; this is the row that's stored in `search_index`. If `display_sql` IS defined then it's executed and the result is made available as a `display` object in addition to the `row` object.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",691521965,
https://github.com/dogsheep/dogsheep-beta/issues/9#issuecomment-686153967,https://api.github.com/repos/dogsheep/dogsheep-beta/issues/9,686153967,MDEyOklzc3VlQ29tbWVudDY4NjE1Mzk2Nw==,9599,2020-09-03T00:17:16Z,2020-09-03T00:17:55Z,MEMBER,"Maybe I can take advantage of https://sqlite.org/np1queryprob.html here - I could define a SQL query for fetching the ""display"" version of each item, and include a Jinja template fragment in the configuration as well. Maybe something like this:

```yaml
photos.db:
  photos_with_apple_metadata:
    sql: |-
      select
        sha256 as key,
        'Photo in ' || coalesce(place_city, 'unknown') as title,
        (
          select group_concat(normalized_string, ' ')
          from labels
          where labels.uuid = photos_with_apple_metadata.uuid
        ) as search_1,
        date as timestamp,
        1 as category
      from photos_with_apple_metadata
    display_sql: |-
      select sha256, place_city, date
      from photos_with_apple_metadata
      where sha256 = :key
    display: |-
      Taken in {{ display.place_city }} on {{ display.date }}
```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",691521965,
https://github.com/simonw/datasette/pull/952#issuecomment-686061028,https://api.github.com/repos/simonw/datasette/issues/952,686061028,MDEyOklzc3VlQ29tbWVudDY4NjA2MTAyOA==,27856297,2020-09-02T22:26:14Z,2020-09-02T22:26:14Z,CONTRIBUTOR,"Looks like black is up-to-date now, so this is no longer needed.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",687245650,