html_url,issue_url,id,node_id,user,created_at,updated_at,author_association,body,reactions,issue,performed_via_github_app
https://github.com/simonw/datasette/pull/518#issuecomment-504798977,https://api.github.com/repos/simonw/datasette/issues/518,504798977,MDEyOklzc3VlQ29tbWVudDUwNDc5ODk3Nw==,9599,2019-06-23T23:52:38Z,2019-06-23T23:52:38Z,OWNER,"Last thing is to replace `sanic.response`:
* `response.text("""")`
* `response.html()`
* `response.redirect(path)`
* `response.HTTPResponse`
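Minimal stand-ins for those could look something like this (a hypothetical sketch - not Sanic's code and not necessarily the final classes):

```python
# Hypothetical sketch of replacement response helpers.
# Names and signatures are illustrative only.
class Response:
    def __init__(self, body='', status=200, headers=None,
                 content_type='text/plain'):
        self.body = body
        self.status = status
        self.headers = headers or {}
        self.content_type = content_type

    @classmethod
    def text(cls, body='', status=200, headers=None):
        return cls(body, status, headers, 'text/plain; charset=utf-8')

    @classmethod
    def html(cls, body='', status=200, headers=None):
        return cls(body, status, headers, 'text/html; charset=utf-8')

    @classmethod
    def redirect(cls, path):
        return cls('', 302, {'Location': path})
```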
Implementations here: https://github.com/huge-success/sanic/blob/0.7.0/sanic/response.py#L175-L285","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",459587155,
https://github.com/simonw/datasette/pull/518#issuecomment-504795648,https://api.github.com/repos/simonw/datasette/issues/518,504795648,MDEyOklzc3VlQ29tbWVudDUwNDc5NTY0OA==,9599,2019-06-23T23:07:06Z,2019-06-23T23:07:06Z,OWNER,"For the request object... which of its fields do I actually use?
* `request.url`
* `request.query_string`
* `request.path`
* `request.method`
* `request.args`
* `request.raw_args`
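A sketch of what that scope wrapper could look like (hypothetical and simplified - e.g. real header handling needs more care):

```python
from urllib.parse import parse_qs

# Hypothetical sketch of a Request that is just a wrapper around
# an ASGI scope. Simplified - not the final class.
class Request:
    def __init__(self, scope):
        self.scope = scope

    @property
    def method(self):
        return self.scope['method']

    @property
    def path(self):
        return self.scope['path']

    @property
    def query_string(self):
        return self.scope['query_string'].decode('latin-1')

    @property
    def args(self):
        return parse_qs(self.query_string)

    @property
    def raw_args(self):
        # Sanic's raw_args keeps only the first value for each key
        return {key: values[0] for key, values in self.args.items()}

    @property
    def url(self):
        host = dict(self.scope['headers']).get(b'host', b'').decode('latin-1')
        base = self.scope['scheme'] + '://' + host + self.path
        return base + ('?' + self.query_string if self.query_string else '')
```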
ALL of those are things that can be derived from the `scope` - so I think my new `Request` class (in `utils/asgi.py`) is just going to be a wrapper around a `scope`.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",459587155,
https://github.com/simonw/datasette/issues/498#issuecomment-504793988,https://api.github.com/repos/simonw/datasette/issues/498,504793988,MDEyOklzc3VlQ29tbWVudDUwNDc5Mzk4OA==,9599,2019-06-23T22:40:25Z,2019-06-23T22:41:25Z,OWNER,"You can use a database query against `sqlite_master` to find all the FTS tables:
http://sf-trees.datasettes.com/trees-b64f0cb?sql=select+name+from+sqlite_master+where+name+like+%22%25_fts%22+and+type+%3D+%22table%22
`select name from sqlite_master where name like ""%_fts"" and type = ""table""`
You could then construct a crafty UNION query to get results back from all of those tables at once:
http://sf-trees.datasettes.com/trees-b64f0cb?sql=select+%27PlantType_value_fts%27%2C+rowid%2C+value+from+PlantType_value_fts+where+PlantType_value_fts+match+%3Asearch%0D%0A++union%0D%0Aselect+%27qCareAssistant_value_fts%27%2C+rowid%2C+value+from+qCareAssistant_value_fts+where+qCareAssistant_value_fts+match+%3Asearch%0D%0A++union%0D%0Aselect+%27qCaretaker_value_fts%27%2C+rowid%2C+value+from+qCaretaker_value_fts+where+qCaretaker_value_fts+match+%3Asearch%0D%0A++union%0D%0Aselect+%27qLegalStatus_value_fts%27%2C+rowid%2C+value+from+qLegalStatus_value_fts+where+qLegalStatus_value_fts+match+%3Asearch%0D%0A++union%0D%0Aselect+%27qSiteInfo_value_fts%27%2C+rowid%2C+value+from+qSiteInfo_value_fts+where+qSiteInfo_value_fts+match+%3Asearch%0D%0A++union%0D%0Aselect+%27qSpecies_value_fts%27%2C+rowid%2C+value+from+qSpecies_value_fts+where+qSpecies_value_fts+match+%3Asearch&search=fi%2A
(I'm searching for `fi*` here to demonstrate wildcards)
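A sketch of generating that UNION automatically from `sqlite_master` rather than writing it by hand (hypothetical helper):

```python
import sqlite3

# Hypothetical helper: build one UNION query across every *_fts table.
# Assumes each FTS table has a `value` column, as in the example above.
def build_fts_union_sql(conn):
    tables = [row[0] for row in conn.execute(
        'select name from sqlite_master '
        'where name like :pattern and type = :type',
        {'pattern': '%_fts', 'type': 'table'})]
    return '\n  union\n'.join(
        f'select {t!r}, rowid, value from {t} where {t} match :search'
        for t in tables)
```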
The problem, as discussed earlier, is relevance: there's no way to compare the scores you're getting across different tables, so you won't be able to order by anything.
","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",451513541,
https://github.com/simonw/datasette/issues/514#issuecomment-504793379,https://api.github.com/repos/simonw/datasette/issues/514,504793379,MDEyOklzc3VlQ29tbWVudDUwNDc5MzM3OQ==,9599,2019-06-23T22:31:29Z,2019-06-23T22:31:48Z,OWNER,"I suggest trying a full path in `ExecStart` like this:
`ExecStart=/home/chris/Env/datasette/bin/datasette serve -h 0.0.0.0 /home/chris/digital-library/databases/*.db --cors --metadata /home/chris/digital-library/metadata.json`
That should eliminate the chance of some kind of path confusion.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",459397625,
https://github.com/simonw/datasette/pull/518#issuecomment-504791053,https://api.github.com/repos/simonw/datasette/issues/518,504791053,MDEyOklzc3VlQ29tbWVudDUwNDc5MTA1Mw==,9599,2019-06-23T22:00:56Z,2019-06-23T22:00:56Z,OWNER,"The `InvalidUsage` exception is thrown by Sanic when it gets an unhandled request - usually a HEAD. It was added in efbb4e83374a2c795e436c72fa79f70da72309b8
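At the ASGI layer, one way to handle HEAD is to run the request as if it were a GET and drop the body - a hypothetical sketch, not the actual patch:

```python
# Hypothetical sketch: serve HEAD by delegating to the wrapped app
# as a GET and suppressing the response body.
def handle_head(app):
    async def wrapped(scope, receive, send):
        if scope['type'] == 'http' and scope['method'] == 'HEAD':
            async def send_without_body(message):
                if message['type'] == 'http.response.body':
                    message = dict(message, body=b'')
                await send(message)
            await app(dict(scope, method='GET'), receive, send_without_body)
        else:
            await app(scope, receive, send)
    return wrapped
```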
I'm going to replace it with specific handling for HEAD requests plus a unit test.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",459587155,
https://github.com/simonw/datasette/pull/518#issuecomment-504790825,https://api.github.com/repos/simonw/datasette/issues/518,504790825,MDEyOklzc3VlQ29tbWVudDUwNDc5MDgyNQ==,9599,2019-06-23T21:57:58Z,2019-06-23T21:57:58Z,OWNER,"The big one: **Replace Sanic request and response objects with my own classes, so I can remove Sanic dependency**","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",459587155,
https://github.com/simonw/datasette/issues/514#issuecomment-504789231,https://api.github.com/repos/simonw/datasette/issues/514,504789231,MDEyOklzc3VlQ29tbWVudDUwNDc4OTIzMQ==,7936571,2019-06-23T21:35:33Z,2019-06-23T21:35:33Z,NONE,"@russss
Thanks, just one more thing.
I edited `datasette.service`:
```
[Unit]
Description=Datasette
After=network.target
[Service]
Type=simple
User=chris
WorkingDirectory=/home/chris/digital-library
ExecStart=/home/chris/Env/datasette/bin/datasette serve -h 0.0.0.0 databases/*.db --cors --metadata metadata.json
Restart=on-failure
[Install]
WantedBy=multi-user.target
```
Then ran:
```
$ sudo systemctl daemon-reload
$ sudo systemctl enable datasette
$ sudo systemctl start datasette
```
But the logs from `journalctl` show this datasette error:
```
Jun 23 23:31:41 ns331247 datasette[1771]: Error: Invalid value for ""[FILES]..."": Path ""databases/*.db"" does not exist.
Jun 23 23:31:44 ns331247 datasette[1778]: Usage: datasette serve [OPTIONS] [FILES]...
Jun 23 23:31:44 ns331247 datasette[1778]: Try ""datasette serve --help"" for help.
```
But the `databases` directory does exist in the directory specified by `WorkingDirectory`. Is this a datasette problem or did I write something incorrectly in the `.service` file?
","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",459397625,
https://github.com/simonw/datasette/issues/514#issuecomment-504686266,https://api.github.com/repos/simonw/datasette/issues/514,504686266,MDEyOklzc3VlQ29tbWVudDUwNDY4NjI2Ng==,7936571,2019-06-22T17:58:50Z,2019-06-23T21:21:57Z,NONE,"@russss
Actually, here's what I've got in `/etc/systemd/system/datasette.service`
```
[Unit]
Description=Datasette
After=network.target
[Service]
Type=simple
User=chris
WorkingDirectory=/home/chris/digital-library
ExecStart=/home/chris/Env/datasette/lib/python3.7/site-packages/datasette serve -h 0.0.0.0 databases/*.db --cors --metadata metadata.json
Restart=on-failure
[Install]
WantedBy=multi-user.target
```
I ran:
```
$ sudo systemctl daemon-reload
$ sudo systemctl enable datasette
$ sudo systemctl start datasette
```
Then I ran:
`$ journalctl -u datasette -f`
I got this message:
```
Hint: You are currently not seeing messages from other users and the system.
Users in groups 'adm', 'systemd-journal', 'wheel' can see all messages.
Pass -q to turn off this notice.
-- Logs begin at Thu 2019-06-20 00:05:23 CEST. --
Jun 22 19:55:57 ns331247 systemd[16176]: datasette.service: Failed to execute command: Permission denied
Jun 22 19:55:57 ns331247 systemd[16176]: datasette.service: Failed at step EXEC spawning /home/chris/Env/datasette/lib/python3.7/site-packages/datasette: Permission denied
Jun 22 19:55:57 ns331247 systemd[16184]: datasette.service: Failed to execute command: Permission denied
Jun 22 19:55:57 ns331247 systemd[16184]: datasette.service: Failed at step EXEC spawning /home/chris/Env/datasette/lib/python3.7/site-packages/datasette: Permission denied
Jun 22 19:55:58 ns331247 systemd[16186]: datasette.service: Failed to execute command: Permission denied
Jun 22 19:55:58 ns331247 systemd[16186]: datasette.service: Failed at step EXEC spawning /home/chris/Env/datasette/lib/python3.7/site-packages/datasette: Permission denied
Jun 22 19:55:58 ns331247 systemd[16190]: datasette.service: Failed to execute command: Permission denied
Jun 22 19:55:58 ns331247 systemd[16190]: datasette.service: Failed at step EXEC spawning /home/chris/Env/datasette/lib/python3.7/site-packages/datasette: Permission denied
Jun 22 19:55:58 ns331247 systemd[16191]: datasette.service: Failed to execute command: Permission denied
Jun 22 19:55:58 ns331247 systemd[16191]: datasette.service: Failed at step EXEC spawning /home/chris/Env/datasette/lib/python3.7/site-packages/datasette: Permission denied
```
When I go to the address for my server, I am met with the standard ""Welcome to nginx"" message:
```
Welcome to nginx!
If you see this page, the nginx web server is successfully installed and working. Further configuration is required.
For online documentation and support please refer to nginx.org.
Commercial support is available at nginx.com.
Thank you for using nginx.
```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",459397625,
https://github.com/simonw/datasette/issues/498#issuecomment-504785662,https://api.github.com/repos/simonw/datasette/issues/498,504785662,MDEyOklzc3VlQ29tbWVudDUwNDc4NTY2Mg==,7936571,2019-06-23T20:47:37Z,2019-06-23T20:47:37Z,NONE,"Very cool, thank you.
Using http://search-24ways.herokuapp.com as an example, let's say I want to search all FTS columns in all tables in all databases for the word ""web.""
[Here's a link](http://search-24ways.herokuapp.com/24ways-f8f455f?sql=select+count%28*%29from+articles+where+rowid+in+%28select+rowid+from+articles_fts+where+articles_fts+match+%3Asearch%29&search=web) to the query I'd need to run to search ""web"" on FTS columns in the `articles` table of the `24ways` database.
And [here's a link](http://search-24ways.herokuapp.com/24ways-f8f455f.json?sql=select+count%28*%29from+articles+where+rowid+in+%28select+rowid+from+articles_fts+where+articles_fts+match+%3Asearch%29&search=web) to the JSON version of the above result. I'd like to get the JSON result of that query for each FTS table of each database in my datasette project.
Is it possible in JavaScript to automate the construction of query URLs like the one I linked, but for every FTS table in my datasette project?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",451513541,
https://github.com/simonw/datasette/pull/518#issuecomment-504782618,https://api.github.com/repos/simonw/datasette/issues/518,504782618,MDEyOklzc3VlQ29tbWVudDUwNDc4MjYxOA==,9599,2019-06-23T20:05:44Z,2019-06-23T20:05:59Z,OWNER,"**Replacement for @app.listener(""before_server_start"")** - this is what the [ASGI lifespan protocol](https://asgi.readthedocs.io/en/latest/specs/lifespan.html) is for.
I know Uvicorn supports this because it keeps saying `ASGI 'lifespan' protocol appears unsupported` on the console.
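Translating the spec's example into such a wrapper might look like this (a sketch - `AsgiLifespan` and the hook names are hypothetical):

```python
# Hypothetical sketch of an ASGI lifespan wrapper, modeled on the
# example in the ASGI lifespan spec. Not Datasette's actual class.
class AsgiLifespan:
    def __init__(self, app, on_startup=None, on_shutdown=None):
        self.app = app
        self.on_startup = on_startup or []
        self.on_shutdown = on_shutdown or []

    async def __call__(self, scope, receive, send):
        if scope['type'] != 'lifespan':
            return await self.app(scope, receive, send)
        while True:
            message = await receive()
            if message['type'] == 'lifespan.startup':
                for fn in self.on_startup:
                    await fn()
                await send({'type': 'lifespan.startup.complete'})
            elif message['type'] == 'lifespan.shutdown':
                for fn in self.on_shutdown:
                    await fn()
                await send({'type': 'lifespan.shutdown.complete'})
                return
```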
I think the solution here will be to introduce another ASGI wrapper class similar to `AsgiTracer`. I'll model this on the example in the ASGI lifespan spec.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",459587155,
https://github.com/simonw/datasette/issues/520#issuecomment-504772599,https://api.github.com/repos/simonw/datasette/issues/520,504772599,MDEyOklzc3VlQ29tbWVudDUwNDc3MjU5OQ==,9599,2019-06-23T17:44:36Z,2019-06-23T17:44:36Z,OWNER,These plugins will need access to configuration and the ability to execute SQL - which means we need to make the `datasette` instance available to them.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",459598080,
https://github.com/simonw/datasette/issues/516#issuecomment-504768147,https://api.github.com/repos/simonw/datasette/issues/516,504768147,MDEyOklzc3VlQ29tbWVudDUwNDc2ODE0Nw==,9599,2019-06-23T16:43:23Z,2019-06-23T16:43:23Z,OWNER,"The Starlette lint and test scripts do this, and also apply autoflake to remove any unnecessary imports: https://github.com/encode/starlette/tree/8c8cc2ec0a5cb834a9a15b871ae8b480503abb67/scripts","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",459509126,
https://github.com/simonw/datasette/pull/518#issuecomment-504765738,https://api.github.com/repos/simonw/datasette/issues/518,504765738,MDEyOklzc3VlQ29tbWVudDUwNDc2NTczOA==,9599,2019-06-23T16:11:49Z,2019-06-23T16:20:44Z,OWNER,"OK, for **Get ?_trace=1 working again**. The old code lives in two places:
https://github.com/simonw/datasette/blob/35429f90894321eda7f2db31b9ea7976f31f73ac/datasette/app.py#L546-L560
And then:
https://github.com/simonw/datasette/blob/35429f90894321eda7f2db31b9ea7976f31f73ac/datasette/app.py#L653-L672
So it's stashing something on the request to tell the rest of the code it should be tracing, then reading that collected data back off the request to add information to the final body.
One possible shape for the replacement is a new ASGI middleware that wraps everything else. We don't have a mutable request object here though, so we will need to untangle this entirely from the request object.
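One hypothetical shape, using a `contextvars` variable as the trace collector instead of the request object (illustrative only, not the eventual implementation):

```python
import contextvars

# Hypothetical sketch: collect trace entries in a context variable
# rather than stashing them on a (no longer mutable) request object.
trace_collector = contextvars.ContextVar('trace_collector', default=None)

def trace(trace_type, **info):
    collector = trace_collector.get()
    if collector is not None:
        collector.append({'type': trace_type, **info})

class AsgiTraceMiddleware:
    def __init__(self, app):
        self.app = app

    async def __call__(self, scope, receive, send):
        query_string = scope.get('query_string', b'')
        if scope['type'] == 'http' and b'_trace=1' in query_string:
            token = trace_collector.set([])
            try:
                await self.app(scope, receive, send)
                # ...merge trace_collector.get() into the response body here
            finally:
                trace_collector.reset(token)
        else:
            await self.app(scope, receive, send)
```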
Also tricky is that in ASGI land we handle streams - we don't usually wait around for the entire response body to be compiled for us. This means the code that modifies the response (adding to the JSON or appending inside the `