issue_comments

30 rows where "updated_at" is on date 2019-06-23 sorted by updated_at descending



issue 8

  • Port Datasette to ASGI 10
  • Port Datasette from Sanic to ASGI + Uvicorn 10
  • Full text search of all tables at once? 3
  • Documentation with recommendations on running Datasette in production without using Docker 3
  • Unit tests for installable plugins 1
  • Enforce import sort order with isort 1
  • Add unit test for "static" mechanism in plugins 1
  • asgi_wrapper plugin hook 1

user 2

  • simonw 27
  • chrismp 3

author_association 2

  • OWNER 27
  • NONE 3
id html_url issue_url node_id user created_at updated_at ▲ author_association body reactions issue performed_via_github_app
504798977 https://github.com/simonw/datasette/pull/518#issuecomment-504798977 https://api.github.com/repos/simonw/datasette/issues/518 MDEyOklzc3VlQ29tbWVudDUwNDc5ODk3Nw== simonw 9599 2019-06-23T23:52:38Z 2019-06-23T23:52:38Z OWNER

Last thing is to replace sanic.response:

  • response.text("")
  • response.html()
  • response.redirect(path)
  • response.HTTPResponse

Implementations here: https://github.com/huge-success/sanic/blob/0.7.0/sanic/response.py#L175-L285

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Port Datasette from Sanic to ASGI + Uvicorn 459587155  
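As a rough illustration of the replacement described in the comment above (not Datasette's actual utils/asgi.py code; the class and method names here are hypothetical), a minimal response class covering those four sanic.response call sites might look like this:

```python
# Hypothetical sketch only - a tiny response class covering text(), html(),
# redirect() and a generic response, sent over ASGI.
class Response:
    def __init__(self, body="", status=200, headers=None, content_type="text/plain; charset=utf-8"):
        self.body = body
        self.status = status
        self.headers = headers or {}
        self.content_type = content_type

    @classmethod
    def text(cls, body, status=200):
        return cls(body, status=status, content_type="text/plain; charset=utf-8")

    @classmethod
    def html(cls, body, status=200):
        return cls(body, status=status, content_type="text/html; charset=utf-8")

    @classmethod
    def redirect(cls, path, status=302):
        return cls("", status=status, headers={"location": path})

    async def asgi_send(self, send):
        # Translate this object into the two ASGI response messages
        headers = [(k.encode("latin-1"), v.encode("latin-1")) for k, v in self.headers.items()]
        headers.append((b"content-type", self.content_type.encode("latin-1")))
        await send({"type": "http.response.start", "status": self.status, "headers": headers})
        await send({"type": "http.response.body", "body": self.body.encode("utf-8")})
```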
504795648 https://github.com/simonw/datasette/pull/518#issuecomment-504795648 https://api.github.com/repos/simonw/datasette/issues/518 MDEyOklzc3VlQ29tbWVudDUwNDc5NTY0OA== simonw 9599 2019-06-23T23:07:06Z 2019-06-23T23:07:06Z OWNER

For the request object.... what are the fields of it I actually use?

  • request.url
  • request.query_string
  • request.path
  • request.method
  • request.args
  • request.raw_args

ALL of those are things that can be derived from the scope - so I think my new Request class (in utils/asgi.py) is just going to be a wrapper around a scope.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Port Datasette from Sanic to ASGI + Uvicorn 459587155  
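A minimal sketch of that "wrapper around a scope" idea, assuming the standard ASGI HTTP scope; the property names mirror the Sanic attributes listed in the comment above rather than Datasette's eventual implementation:

```python
from urllib.parse import parse_qs

class Request:
    """Hedged sketch: a request object that is just a wrapper around an ASGI scope."""

    def __init__(self, scope):
        self.scope = scope

    @property
    def method(self):
        return self.scope["method"]

    @property
    def path(self):
        return self.scope["path"]

    @property
    def query_string(self):
        return self.scope["query_string"].decode("latin-1")

    @property
    def url(self):
        host = dict(self.scope["headers"]).get(b"host", b"").decode("latin-1")
        qs = "?" + self.query_string if self.query_string else ""
        return "{}://{}{}{}".format(self.scope["scheme"], host, self.path, qs)

    @property
    def args(self):
        # Multi-value query arguments, e.g. {"q": ["cats", "dogs"]}
        return parse_qs(self.query_string)

    @property
    def raw_args(self):
        # Single value per key, mimicking Sanic's request.raw_args
        return {key: values[0] for key, values in self.args.items()}
```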
504793988 https://github.com/simonw/datasette/issues/498#issuecomment-504793988 https://api.github.com/repos/simonw/datasette/issues/498 MDEyOklzc3VlQ29tbWVudDUwNDc5Mzk4OA== simonw 9599 2019-06-23T22:40:25Z 2019-06-23T22:41:25Z OWNER

You can use a database query against sqlite_master to find all the FTS tables:

http://sf-trees.datasettes.com/trees-b64f0cb?sql=select+name+from+sqlite_master+where+name+like+%22%25_fts%22+and+type+%3D+%22table%22

select name from sqlite_master where name like "%_fts" and type = "table"

You could then construct a crafty UNION query to get results back from all of those tables at once:

http://sf-trees.datasettes.com/trees-b64f0cb?sql=select+%27PlantType_value_fts%27%2C+rowid%2C+value+from+PlantType_value_fts+where+PlantType_value_fts+match+%3Asearch%0D%0A++union%0D%0Aselect+%27qCareAssistant_value_fts%27%2C+rowid%2C+value+from+qCareAssistant_value_fts+where+qCareAssistant_value_fts+match+%3Asearch%0D%0A++union%0D%0Aselect+%27qCaretaker_value_fts%27%2C+rowid%2C+value+from+qCaretaker_value_fts+where+qCaretaker_value_fts+match+%3Asearch%0D%0A++union%0D%0Aselect+%27qLegalStatus_value_fts%27%2C+rowid%2C+value+from+qLegalStatus_value_fts+where+qLegalStatus_value_fts+match+%3Asearch%0D%0A++union%0D%0Aselect+%27qSiteInfo_value_fts%27%2C+rowid%2C+value+from+qSiteInfo_value_fts+where+qSiteInfo_value_fts+match+%3Asearch%0D%0A++union%0D%0Aselect+%27qSpecies_value_fts%27%2C+rowid%2C+value+from+qSpecies_value_fts+where+qSpecies_value_fts+match+%3Asearch&search=fi%2A

(I'm searching for fi* here to demonstrate wildcards)

The problem, as discussed earlier, is relevance: there's no way to compare the scores you're getting across different tables, so you won't be able to order by anything.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Full text search of all tables at once? 451513541  
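The two queries in the comment above can also be combined programmatically. A short sketch using Python's sqlite3 module — the database filename and the `value` column come from the sf-trees example and would differ for other databases:

```python
import sqlite3

conn = sqlite3.connect("trees.db")  # placeholder filename

# Step 1: find every FTS table via sqlite_master
fts_tables = [
    row[0]
    for row in conn.execute(
        "select name from sqlite_master where name like '%_fts' and type = 'table'"
    )
]

# Step 2: build the UNION query; table names come from sqlite_master, while the
# search term itself is still passed as a bound :search parameter
union_sql = "\nunion\n".join(
    "select '{t}' as table_name, rowid, value from {t} where {t} match :search".format(t=t)
    for t in fts_tables
)

if fts_tables:
    for table_name, rowid, value in conn.execute(union_sql, {"search": "fi*"}):
        print(table_name, rowid, value)
```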
504793379 https://github.com/simonw/datasette/issues/514#issuecomment-504793379 https://api.github.com/repos/simonw/datasette/issues/514 MDEyOklzc3VlQ29tbWVudDUwNDc5MzM3OQ== simonw 9599 2019-06-23T22:31:29Z 2019-06-23T22:31:48Z OWNER

I suggest trying a full path in ExecStart like this:

ExecStart=/home/chris/Env/datasette/bin/datasette serve -h 0.0.0.0 /home/chris/digital-library/databases/*.db --cors --metadata /home/chris/digital-library/metadata.json

That should eliminate the chance of some kind of path confusion.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Documentation with recommendations on running Datasette in production without using Docker 459397625  
504791053 https://github.com/simonw/datasette/pull/518#issuecomment-504791053 https://api.github.com/repos/simonw/datasette/issues/518 MDEyOklzc3VlQ29tbWVudDUwNDc5MTA1Mw== simonw 9599 2019-06-23T22:00:56Z 2019-06-23T22:00:56Z OWNER

The InvalidUsage exception is thrown by Sanic when it gets an unhandled request - usually a HEAD. It was added in efbb4e83374a2c795e436c72fa79f70da72309b8

I'm going to replace it with specific handling for HEAD requests plus a unit test.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Port Datasette from Sanic to ASGI + Uvicorn 459587155  
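One possible shape for that "specific handling for HEAD requests" (a guess at the approach, not the code that landed): run the normal GET code path but strip the body before it is sent.

```python
# Hypothetical sketch: serve a HEAD request by delegating to a GET handler
# while emptying the response body messages.
async def handle_head(scope, receive, send, get_handler):
    async def send_without_body(message):
        if message["type"] == "http.response.body":
            message = {
                "type": "http.response.body",
                "body": b"",
                "more_body": message.get("more_body", False),
            }
        await send(message)

    await get_handler(dict(scope, method="GET"), receive, send_without_body)
```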
504790825 https://github.com/simonw/datasette/pull/518#issuecomment-504790825 https://api.github.com/repos/simonw/datasette/issues/518 MDEyOklzc3VlQ29tbWVudDUwNDc5MDgyNQ== simonw 9599 2019-06-23T21:57:58Z 2019-06-23T21:57:58Z OWNER

The big one: Replace Sanic request and response objects with my own classes, so I can remove Sanic dependency

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Port Datasette from Sanic to ASGI + Uvicorn 459587155  
504789231 https://github.com/simonw/datasette/issues/514#issuecomment-504789231 https://api.github.com/repos/simonw/datasette/issues/514 MDEyOklzc3VlQ29tbWVudDUwNDc4OTIzMQ== chrismp 7936571 2019-06-23T21:35:33Z 2019-06-23T21:35:33Z NONE

@russss

Thanks, just one more thing.

I edited datasette.service:

```
[Unit]
Description=Datasette
After=network.target

[Service]
Type=simple
User=chris
WorkingDirectory=/home/chris/digital-library
ExecStart=/home/chris/Env/datasette/bin/datasette serve -h 0.0.0.0 databases/*.db --cors --metadata metadata.json
Restart=on-failure

[Install]
WantedBy=multi-user.target
```

Then ran:

```
$ sudo systemctl daemon-reload
$ sudo systemctl enable datasette
$ sudo systemctl start datasette
```

But the logs from journalctl show this datasette error:

```
Jun 23 23:31:41 ns331247 datasette[1771]: Error: Invalid value for "[FILES]...": Path "databases/*.db" does not exist.
Jun 23 23:31:44 ns331247 datasette[1778]: Usage: datasette serve [OPTIONS] [FILES]...
Jun 23 23:31:44 ns331247 datasette[1778]: Try "datasette serve --help" for help.
```

But the databases directory does exist in the directory specified by WorkingDirectory. Is this a datasette problem or did I write something incorrectly in the .service file?

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Documentation with recommendations on running Datasette in production without using Docker 459397625  
504686266 https://github.com/simonw/datasette/issues/514#issuecomment-504686266 https://api.github.com/repos/simonw/datasette/issues/514 MDEyOklzc3VlQ29tbWVudDUwNDY4NjI2Ng== chrismp 7936571 2019-06-22T17:58:50Z 2019-06-23T21:21:57Z NONE

@russss

Actually, here's what I've got in /etc/systemd/system/datasette.service

```
[Unit]
Description=Datasette
After=network.target

[Service]
Type=simple
User=chris
WorkingDirectory=/home/chris/digital-library
ExecStart=/home/chris/Env/datasette/lib/python3.7/site-packages/datasette serve -h 0.0.0.0 databases/*.db --cors --metadata metadata.json
Restart=on-failure

[Install]
WantedBy=multi-user.target
```

I ran:

```
$ sudo systemctl daemon-reload
$ sudo systemctl enable datasette
$ sudo systemctl start datasette
```

Then I ran:

```
$ journalctl -u datasette -f
```

Got this message.

```
Hint: You are currently not seeing messages from other users and the system. Users in groups 'adm', 'systemd-journal', 'wheel' can see all messages. Pass -q to turn off this notice.
-- Logs begin at Thu 2019-06-20 00:05:23 CEST. --
Jun 22 19:55:57 ns331247 systemd[16176]: datasette.service: Failed to execute command: Permission denied
Jun 22 19:55:57 ns331247 systemd[16176]: datasette.service: Failed at step EXEC spawning /home/chris/Env/datasette/lib/python3.7/site-packages/datasette: Permission denied
Jun 22 19:55:57 ns331247 systemd[16184]: datasette.service: Failed to execute command: Permission denied
Jun 22 19:55:57 ns331247 systemd[16184]: datasette.service: Failed at step EXEC spawning /home/chris/Env/datasette/lib/python3.7/site-packages/datasette: Permission denied
Jun 22 19:55:58 ns331247 systemd[16186]: datasette.service: Failed to execute command: Permission denied
Jun 22 19:55:58 ns331247 systemd[16186]: datasette.service: Failed at step EXEC spawning /home/chris/Env/datasette/lib/python3.7/site-packages/datasette: Permission denied
Jun 22 19:55:58 ns331247 systemd[16190]: datasette.service: Failed to execute command: Permission denied
Jun 22 19:55:58 ns331247 systemd[16190]: datasette.service: Failed at step EXEC spawning /home/chris/Env/datasette/lib/python3.7/site-packages/datasette: Permission denied
Jun 22 19:55:58 ns331247 systemd[16191]: datasette.service: Failed to execute command: Permission denied
Jun 22 19:55:58 ns331247 systemd[16191]: datasette.service: Failed at step EXEC spawning /home/chris/Env/datasette/lib/python3.7/site-packages/datasette: Permission denied
```

When I go to the address for my server, I am met with the standard "Welcome to nginx" message:

```
Welcome to nginx!

If you see this page, the nginx web server is successfully installed and working. Further configuration is required.

For online documentation and support please refer to nginx.org.
Commercial support is available at nginx.com.

Thank you for using nginx.
```

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Documentation with recommendations on running Datasette in production without using Docker 459397625  
504785662 https://github.com/simonw/datasette/issues/498#issuecomment-504785662 https://api.github.com/repos/simonw/datasette/issues/498 MDEyOklzc3VlQ29tbWVudDUwNDc4NTY2Mg== chrismp 7936571 2019-06-23T20:47:37Z 2019-06-23T20:47:37Z NONE

Very cool, thank you.

Using http://search-24ways.herokuapp.com as an example, let's say I want to search all FTS columns in all tables in all databases for the word "web."

Here's a link to the query I'd need to run to search "web" on FTS columns in articles table of the 24ways database.

And here's a link to the JSON version of the above result. I'd like to get the JSON result of that query for each FTS table of each database in my datasette project.

Is it possible in Javascript to automate the construction of query URLs like the one I linked, but for every FTS table in my datasette project?

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Full text search of all tables at once? 451513541  
504782618 https://github.com/simonw/datasette/pull/518#issuecomment-504782618 https://api.github.com/repos/simonw/datasette/issues/518 MDEyOklzc3VlQ29tbWVudDUwNDc4MjYxOA== simonw 9599 2019-06-23T20:05:44Z 2019-06-23T20:05:59Z OWNER

Replacement for @app.listener("before_server_start") - this is what the ASGI lifespan protocol is for.

I know Uvicorn supports this because it keeps saying "ASGI 'lifespan' protocol appears unsupported" on the console.

I think the solution here will be to introduce another ASGI wrapper class similar to AsgiTracer. I'll model this on the example in the ASGI lifespan spec.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Port Datasette from Sanic to ASGI + Uvicorn 459587155  
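A minimal sketch of such a wrapper, following the message flow in the ASGI lifespan spec (ASGI 3 single-callable style; the class and hook names are illustrative, not Datasette's):

```python
class AsgiLifespan:
    """Hedged sketch: run startup/shutdown callables around the wrapped app."""

    def __init__(self, app, on_startup=None, on_shutdown=None):
        self.app = app
        self.on_startup = on_startup or []
        self.on_shutdown = on_shutdown or []

    async def __call__(self, scope, receive, send):
        if scope["type"] != "lifespan":
            return await self.app(scope, receive, send)
        while True:
            message = await receive()
            if message["type"] == "lifespan.startup":
                for fn in self.on_startup:
                    await fn()
                await send({"type": "lifespan.startup.complete"})
            elif message["type"] == "lifespan.shutdown":
                for fn in self.on_shutdown:
                    await fn()
                await send({"type": "lifespan.shutdown.complete"})
                return
```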
504772599 https://github.com/simonw/datasette/issues/520#issuecomment-504772599 https://api.github.com/repos/simonw/datasette/issues/520 MDEyOklzc3VlQ29tbWVudDUwNDc3MjU5OQ== simonw 9599 2019-06-23T17:44:36Z 2019-06-23T17:44:36Z OWNER

These plugins will need access to configuration and the ability to execute SQL - which means we need to make the datasette instance available to them.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
asgi_wrapper plugin hook 459598080  
504768147 https://github.com/simonw/datasette/issues/516#issuecomment-504768147 https://api.github.com/repos/simonw/datasette/issues/516 MDEyOklzc3VlQ29tbWVudDUwNDc2ODE0Nw== simonw 9599 2019-06-23T16:43:23Z 2019-06-23T16:43:23Z OWNER

The Starlette lint and test scripts do this, and also apply autoflake to remove any unnecessary imports: https://github.com/encode/starlette/tree/8c8cc2ec0a5cb834a9a15b871ae8b480503abb67/scripts

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Enforce import sort order with isort 459509126  
504765738 https://github.com/simonw/datasette/pull/518#issuecomment-504765738 https://api.github.com/repos/simonw/datasette/issues/518 MDEyOklzc3VlQ29tbWVudDUwNDc2NTczOA== simonw 9599 2019-06-23T16:11:49Z 2019-06-23T16:20:44Z OWNER

OK, for "Get ?_trace=1 working again". The old code lives in two places:

https://github.com/simonw/datasette/blob/35429f90894321eda7f2db31b9ea7976f31f73ac/datasette/app.py#L546-L560

And then:

https://github.com/simonw/datasette/blob/35429f90894321eda7f2db31b9ea7976f31f73ac/datasette/app.py#L653-L672

So it's stashing something on the request to tell the rest of the code it should be tracing, then using that collected data from the request to add information to the final body.

One possible shape for the replacement is a new ASGI middleware that wraps everything else. We don't have a mutable request object here though, so we will need to untangle this entirely from the request object.

Also tricky is that in ASGI land we handle streams - we don't usually wait around for the entire response body to be compiled for us. This means the code that modifies the response (adding to the JSON or appending inside the </body>) needs to be reconsidered.

As usual, Starlette seems to have figured this out: https://github.com/encode/starlette/blob/8c8cc2ec0a5cb834a9a15b871ae8b480503abb67/starlette/middleware/gzip.py

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Port Datasette from Sanic to ASGI + Uvicorn 459587155  
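To make the streaming trade-off concrete, here is a hedged sketch of a wrapping middleware in that spirit (not Datasette's AsgiTracer): it wraps send, buffers the body chunks, and patches trace data in before </body> — exactly the "wait for the whole body" compromise the comment is wrestling with.

```python
class TraceBodyPatcher:
    """Illustrative only: buffer the response so trace info can be appended."""

    def __init__(self, app):
        self.app = app

    async def __call__(self, scope, receive, send):
        if scope["type"] != "http":
            return await self.app(scope, receive, send)

        start_message = None
        body = b""

        async def wrapped_send(message):
            nonlocal start_message, body
            if message["type"] == "http.response.start":
                start_message = message  # hold back until the body is complete
            elif message["type"] == "http.response.body":
                body += message.get("body", b"")
                if not message.get("more_body"):
                    patched = body.replace(b"</body>", b"<!-- trace data --></body>")
                    # content-length no longer matches the patched body, so drop it
                    headers = [
                        (name, value)
                        for name, value in start_message["headers"]
                        if name.lower() != b"content-length"
                    ]
                    await send(dict(start_message, headers=headers))
                    await send({"type": "http.response.body", "body": patched})

        await self.app(scope, receive, wrapped_send)
```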
504765145 https://github.com/simonw/datasette/pull/518#issuecomment-504765145 https://api.github.com/repos/simonw/datasette/issues/518 MDEyOklzc3VlQ29tbWVudDUwNDc2NTE0NQ== simonw 9599 2019-06-23T16:04:37Z 2019-06-23T16:04:37Z OWNER

Another bug: JSON is being served without a content-type header:

```
~ $ curl -i 'http://127.0.0.1:8001/fivethirtyeight/ahca-polls%2Fahca_polls.json'
HTTP/1.1 200 OK
date: Sun, 23 Jun 2019 16:04:01 GMT
server: uvicorn
referrer-policy: no-referrer
transfer-encoding: chunked

{"database": "fivethirtyeight", "table": "ahca-polls/ahca_polls", ...
```

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Port Datasette from Sanic to ASGI + Uvicorn 459587155  
504765018 https://github.com/simonw/datasette/pull/518#issuecomment-504765018 https://api.github.com/repos/simonw/datasette/issues/518 MDEyOklzc3VlQ29tbWVudDUwNDc2NTAxOA== simonw 9599 2019-06-23T16:03:20Z 2019-06-23T16:03:20Z OWNER

Weird new bug: http://127.0.0.1:8001/fixtures/table%2Fwith%2Fslashes.csv?_format=json is downloading CSV for me now.

https://latest.datasette.io/fixtures/table%2Fwith%2Fslashes.csv?_format=json does the right thing.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Port Datasette from Sanic to ASGI + Uvicorn 459587155  
504763919 https://github.com/simonw/datasette/issues/498#issuecomment-504763919 https://api.github.com/repos/simonw/datasette/issues/498 MDEyOklzc3VlQ29tbWVudDUwNDc2MzkxOQ== simonw 9599 2019-06-23T15:50:49Z 2019-06-23T15:50:49Z OWNER

One interesting way to approach this could be to do it entirely in JavaScript. I've had a lot of success building small apps on top of Datasette's JavaScript API - I wrote up one example here: https://24ways.org/2018/fast-autocomplete-search-for-your-website/

Once #272 is done I'll be adding a plugin hook that allows plugins to define entirely new pages within the Datasette application, which may also be a good way to work on this.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Full text search of all tables at once? 451513541  
504762887 https://github.com/simonw/datasette/pull/518#issuecomment-504762887 https://api.github.com/repos/simonw/datasette/issues/518 MDEyOklzc3VlQ29tbWVudDUwNDc2Mjg4Nw== simonw 9599 2019-06-23T15:38:58Z 2019-06-23T15:38:58Z OWNER

Mystery solved: that's because I'm constructing my own scope object and testing via ApplicationCommunicator rather than exercising Uvicorn directly.

https://github.com/simonw/datasette/blob/d60fbfcae2658e71cab6d7b3b9f53f8d895064ef/tests/fixtures.py#L42-L57

I don't want to introduce the complexity of launching a real Uvicorn as part of the tests, so I guess I'll have to carefully update my ApplicationCommunicator test harness to more correctly emulate real life.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Port Datasette from Sanic to ASGI + Uvicorn 459587155  
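For reference, driving an ASGI app with asgiref's ApplicationCommunicator and a hand-built scope looks roughly like this. The helper name and the scope below are deliberately minimal and hypothetical — a hand-rolled scope can omit keys (for example raw_path) that a real Uvicorn server would supply, which is the gap the comment above describes.

```python
from asgiref.testing import ApplicationCommunicator

# Hypothetical helper: fetch a path from an ASGI app using a hand-rolled scope.
async def get_path(app, path):
    scope = {
        "type": "http",
        "http_version": "1.1",
        "method": "GET",
        "scheme": "http",
        "path": path,
        "raw_path": path.encode("utf-8"),
        "query_string": b"",
        "headers": [(b"host", b"localhost")],
    }
    communicator = ApplicationCommunicator(app, scope)
    await communicator.send_input({"type": "http.request", "body": b""})
    start = await communicator.receive_output(2)   # http.response.start
    body = await communicator.receive_output(2)    # http.response.body
    return start["status"], body["body"]

# e.g. asyncio.run(get_path(app, "/fixtures/table%2Fwith%2Fslashes.csv"))
```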
504762769 https://github.com/simonw/datasette/pull/518#issuecomment-504762769 https://api.github.com/repos/simonw/datasette/issues/518 MDEyOklzc3VlQ29tbWVudDUwNDc2Mjc2OQ== simonw 9599 2019-06-23T15:37:26Z 2019-06-23T15:37:26Z OWNER

This is strange: on my local machine http://127.0.0.1:8001/fixtures/table%2Fwith%2Fslashes.csv is returning a 404 BUT there's a test for that which is passing under pytest:

https://github.com/simonw/datasette/blob/d60fbfcae2658e71cab6d7b3b9f53f8d895064ef/tests/test_api.py#L721-L727

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Port Datasette from Sanic to ASGI + Uvicorn 459587155  
504761039 https://github.com/simonw/datasette/issues/272#issuecomment-504761039 https://api.github.com/repos/simonw/datasette/issues/272 MDEyOklzc3VlQ29tbWVudDUwNDc2MTAzOQ== simonw 9599 2019-06-23T15:15:41Z 2019-06-23T15:18:36Z OWNER

And now the tests are all passing!

Still to do:

  • Use raw_path so table names containing / can work correctly
  • Get ?_trace=1 working again
  • Replacement for @app.listener("before_server_start")
  • Replace Sanic request object with my own request class, so I can remove Sanic dependency
{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Port Datasette to ASGI 324188953  
504761165 https://github.com/simonw/datasette/issues/272#issuecomment-504761165 https://api.github.com/repos/simonw/datasette/issues/272 MDEyOklzc3VlQ29tbWVudDUwNDc2MTE2NQ== simonw 9599 2019-06-23T15:17:07Z 2019-06-23T15:17:07Z OWNER

I'm going to move the remaining work into a pull request.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Port Datasette to ASGI 324188953  
504716988 https://github.com/simonw/datasette/issues/272#issuecomment-504716988 https://api.github.com/repos/simonw/datasette/issues/272 MDEyOklzc3VlQ29tbWVudDUwNDcxNjk4OA== simonw 9599 2019-06-23T03:43:46Z 2019-06-23T15:15:26Z OWNER

OK, it's beginning to shape up now. Next steps:

  • [x] Static file support (including for plugins) - plus tests
  • [x] Streaming support so the CSV tests will pass
  • [x] Ability to download the database file
  • [x] Implement missing-slash redirects
{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Port Datasette to ASGI 324188953  
504760061 https://github.com/simonw/datasette/issues/272#issuecomment-504760061 https://api.github.com/repos/simonw/datasette/issues/272 MDEyOklzc3VlQ29tbWVudDUwNDc2MDA2MQ== simonw 9599 2019-06-23T15:02:52Z 2019-06-23T15:02:52Z OWNER

Tests are failing on Python 3.5: https://travis-ci.org/simonw/datasette/jobs/549380098 - error is TypeError: the JSON object must be str, not 'bytes'

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Port Datasette to ASGI 324188953  
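That error is consistent with json.loads() only accepting str before Python 3.6; decoding the bytes first works on both versions. A sketch of the compatibility shim, not the actual fix that was committed:

```python
import json

def loads_compat(data):
    # json.loads() gained bytes support in Python 3.6; decode first for 3.5
    if isinstance(data, bytes):
        data = data.decode("utf-8")
    return json.loads(data)
```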
504759842 https://github.com/simonw/datasette/issues/272#issuecomment-504759842 https://api.github.com/repos/simonw/datasette/issues/272 MDEyOklzc3VlQ29tbWVudDUwNDc1OTg0Mg== simonw 9599 2019-06-23T15:00:06Z 2019-06-23T15:00:06Z OWNER

I also need to actually take advantage of raw_path such that pages like https://fivethirtyeight.datasettes.com/fivethirtyeight/twitter-ratio%2Fsenators can be correctly served.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Port Datasette to ASGI 324188953  
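A sketch of the raw_path idea (the helper name is hypothetical): take the undecoded bytes from the ASGI scope, so an encoded slash in a table name survives routing instead of being collapsed by the already-decoded scope["path"].

```python
from urllib.parse import unquote

def database_and_table(scope):
    # Prefer the undecoded raw_path if the server provides it
    raw_path = scope.get("raw_path", scope["path"].encode("utf-8")).decode("utf-8")
    # e.g. "/fivethirtyeight/twitter-ratio%2Fsenators"
    database, _, encoded_table = raw_path.lstrip("/").partition("/")
    return database, unquote(encoded_table)  # -> ("fivethirtyeight", "twitter-ratio/senators")
```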
504759683 https://github.com/simonw/datasette/issues/272#issuecomment-504759683 https://api.github.com/repos/simonw/datasette/issues/272 MDEyOklzc3VlQ29tbWVudDUwNDc1OTY4Mw== simonw 9599 2019-06-23T14:57:50Z 2019-06-23T14:57:50Z OWNER

All of the tests are now passing!

I still need a solution for this:

https://github.com/simonw/datasette/blob/5bd510b01adae3f719e4426b9bfbc346a946ba5c/datasette/app.py#L706-L714

I think the answer is ASGI lifespan, which is supported by Uvicorn. https://asgi.readthedocs.io/en/latest/specs/lifespan.html#startup

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Port Datasette to ASGI 324188953  
504754552 https://github.com/simonw/datasette/issues/272#issuecomment-504754552 https://api.github.com/repos/simonw/datasette/issues/272 MDEyOklzc3VlQ29tbWVudDUwNDc1NDU1Mg== simonw 9599 2019-06-23T13:53:39Z 2019-06-23T13:53:39Z OWNER

Next test to fix (because my new test harness doesn't actually obey the allow_redirects= parameter):

```
_____ test_database_page_redirects_with_url_hash _____

app_client_with_hash = <tests.fixtures.TestClient object at 0x10981f240>

    def test_database_page_redirects_with_url_hash(app_client_with_hash):
        response = app_client_with_hash.get("/fixtures", allow_redirects=False)
        assert response.status == 302
        response = app_client_with_hash.get("/fixtures")
>       assert "fixtures" in response.text
E       AssertionError: assert 'fixtures' in ''
E        +  where '' = <tests.fixtures.TestResponse object at 0x10981f550>.text
```

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Port Datasette to ASGI 324188953  
504754433 https://github.com/simonw/datasette/issues/272#issuecomment-504754433 https://api.github.com/repos/simonw/datasette/issues/272 MDEyOklzc3VlQ29tbWVudDUwNDc1NDQzMw== simonw 9599 2019-06-23T13:51:53Z 2019-06-23T13:51:53Z OWNER

CSV tests all pass as of https://github.com/simonw/datasette/commit/ff9efa668ebc33f17ef9b30139960e29906a18fb

This code could be a lot neater though. At the very least I'm going to refactor datasette/utils.py into a datasette/utils package and put all of my new ASGI utilities in datasette/utils/asgi.py

The way I implemented streaming on top of a writer object (inspired by Sanic) is a bit of a weird hack. I think I'd rather use an abstraction where my view functions can yield chunks of body data.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Port Datasette to ASGI 324188953  
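A sketch of that "yield chunks of body data" abstraction on top of ASGI (function names are illustrative, not the refactor that was shipped): the view is an async generator, and each chunk is forwarded as a body message with more_body=True until the generator is exhausted.

```python
async def send_streamed(send, chunks, content_type=b"text/csv; charset=utf-8"):
    # Forward an async iterable of byte chunks as a streamed ASGI response
    await send({
        "type": "http.response.start",
        "status": 200,
        "headers": [(b"content-type", content_type)],
    })
    async for chunk in chunks:
        await send({"type": "http.response.body", "body": chunk, "more_body": True})
    await send({"type": "http.response.body", "body": b""})

async def csv_rows():
    # Example view: yield CSV rows as they are produced
    yield b"id,name\r\n"
    yield b"1,Alice\r\n"
    yield b"2,Bob\r\n"
```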
504720379 https://github.com/simonw/datasette/issues/226#issuecomment-504720379 https://api.github.com/repos/simonw/datasette/issues/226 MDEyOklzc3VlQ29tbWVudDUwNDcyMDM3OQ== simonw 9599 2019-06-23T05:05:32Z 2019-06-23T05:05:32Z OWNER

The mechanism I described here - having a tests/example_plugin folder - is probably the right solution for #517

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Unit tests for installable plugins 315738696  
504720326 https://github.com/simonw/datasette/issues/517#issuecomment-504720326 https://api.github.com/repos/simonw/datasette/issues/517 MDEyOklzc3VlQ29tbWVudDUwNDcyMDMyNg== simonw 9599 2019-06-23T05:04:26Z 2019-06-23T05:04:42Z OWNER

See also #226 - "Unit tests for installable plugins"

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Add unit test for "static" mechanism in plugins 459537047  
504711468 https://github.com/simonw/datasette/issues/272#issuecomment-504711468 https://api.github.com/repos/simonw/datasette/issues/272 MDEyOklzc3VlQ29tbWVudDUwNDcxMTQ2OA== simonw 9599 2019-06-23T01:36:33Z 2019-06-23T01:36:33Z OWNER

Published an in-progress demo:

datasette publish now fixtures.db -n datasette-asgi-early-demo --branch=asgi

Here it is: https://datasette-asgi-early-demo-qahhxctqpw.now.sh/

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Port Datasette to ASGI 324188953  
504710331 https://github.com/simonw/datasette/issues/272#issuecomment-504710331 https://api.github.com/repos/simonw/datasette/issues/272 MDEyOklzc3VlQ29tbWVudDUwNDcxMDMzMQ== simonw 9599 2019-06-23T01:08:45Z 2019-06-23T01:08:45Z OWNER

Lots still to do:

  • Static files are not being served
  • Streaming CSV files don't work
  • Tests all fail
  • Some URLs (e.g. the 'next' link on tables) are incorrect

I'm going to work on getting the unit test framework to be ASGI-compatible next.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Port Datasette to ASGI 324188953  

CREATE TABLE [issue_comments] (
   [html_url] TEXT,
   [issue_url] TEXT,
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [created_at] TEXT,
   [updated_at] TEXT,
   [author_association] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [issue] INTEGER REFERENCES [issues]([id])
, [performed_via_github_app] TEXT);
CREATE INDEX [idx_issue_comments_issue]
                ON [issue_comments] ([issue]);
CREATE INDEX [idx_issue_comments_user]
                ON [issue_comments] ([user]);