issues

40 rows where milestone = 3268330 sorted by updated_at descending

id node_id number title user state locked assignee milestone comments created_at updated_at ▲ closed_at author_association pull_request body repo type active_lock_reason performed_via_github_app
639072811 MDU6SXNzdWU2MzkwNzI4MTE= 849 Rename master branch to main simonw 9599 open 0   Datasette 1.0 3268330 9 2020-06-15T19:05:54Z 2020-07-31T23:23:24Z   OWNER  

I was waiting for consensus to form around this (and kind-of hoping for trunk since I like the tree metaphor) and it looks like main is it.

I've seen convincing arguments against trunk too - it indicates that the branch has some special significance like in Subversion (where all branches come from trunk) when it doesn't. So main is better anyway.

datasette 107914493 issue    
668064026 MDU6SXNzdWU2NjgwNjQwMjY= 911 Rethink the --name option to "datasette publish" simonw 9599 open 0   Datasette 1.0 3268330 0 2020-07-29T18:49:49Z 2020-07-29T18:49:49Z   OWNER  

--name works inconsistently across the different publish providers - on Cloud Run you should use --service instead for example. Need to review it across all of them and either remove it or clarify what it does.

datasette 107914493 issue    
646737558 MDU6SXNzdWU2NDY3Mzc1NTg= 870 Refactor default views to use register_routes simonw 9599 open 0   Datasette 1.0 3268330 10 2020-06-27T18:53:12Z 2020-06-30T19:26:35Z   OWNER  

It would be much cleaner if Datasette's default views were all registered using the new register_routes() plugin hook. Could dramatically reduce the code in datasette/app.py.

The ideal fix here would be to rework my BaseView subclass mechanism to work with register_routes() so that those views don't have any special privileges above plugin-provided views.
_Originally posted by @simonw in https://github.com/simonw/datasette/issues/864#issuecomment-648580556_
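As a minimal sketch of the routing shape involved here - not Datasette's actual implementation - a register_routes()-style hook returns (regex, view) pairs, and dispatch walks them in order. The route and view names below are hypothetical:

```python
import re

# Toy router mirroring the (regex pattern, view function) pairs that a
# register_routes() hook implementation returns. Route and view are
# hypothetical examples, not real Datasette routes.
def hello_view(request_path):
    return "hello from {}".format(request_path)

ROUTES = [
    (re.compile(r"^/-/hello$"), hello_view),
]

def dispatch(path):
    # First matching pattern wins
    for pattern, view in ROUTES:
        if pattern.match(path):
            return view(path)
    return None

print(dispatch("/-/hello"))
print(dispatch("/missing"))
```

Default views registered this way would go through the same dispatch path as plugin-provided views, which is the point of the refactor.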

datasette 107914493 issue    
648435885 MDU6SXNzdWU2NDg0MzU4ODU= 878 BaseView should be a documented API for plugins to use simonw 9599 open 0   Datasette 1.0 3268330 0 2020-06-30T19:26:13Z 2020-06-30T19:26:26Z   OWNER  

Can be part of #870 - refactoring existing views to use register_routes().

I'm going to put the new check_permissions() method on BaseView as well. If I want that method to be available to plugins I can do so by turning that BaseView class into a documented API that plugins are encouraged to use themselves.
_Originally posted by @simonw in https://github.com/simonw/datasette/issues/832#issuecomment-651995453_

datasette 107914493 issue    
529429214 MDU6SXNzdWU1Mjk0MjkyMTQ= 642 Provide a cookiecutter template for creating new plugins simonw 9599 closed 0   Datasette 1.0 3268330 6 2019-11-27T15:46:36Z 2020-06-20T03:20:33Z 2020-06-20T03:20:25Z OWNER  

See this conversation: https://twitter.com/psychemedia/status/1199707352540368896

datasette 107914493 issue    
642297505 MDU6SXNzdWU2NDIyOTc1MDU= 857 Comprehensive documentation for variables made available to templates simonw 9599 open 0   Datasette 1.0 3268330 0 2020-06-20T03:19:43Z 2020-06-20T03:19:44Z   OWNER  

Needed for the Datasette 1.0 release, so template authors can trust that Datasette is unlikely to break their templates.

datasette 107914493 issue    
631932926 MDU6SXNzdWU2MzE5MzI5MjY= 801 allow_by_query setting for configuring permissions with a SQL statement simonw 9599 closed 0   Datasette 1.0 3268330 6 2020-06-05T20:30:19Z 2020-06-11T18:58:56Z 2020-06-11T18:58:49Z OWNER  

Idea: an "allow_sql" key with a SQL query that gets passed the actor JSON as :actor and can extract the relevant keys from it and return 1 or 0.

_Originally posted by @simonw in https://github.com/simonw/datasette/issues/698#issuecomment-639787304_

See also #800
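A runnable sketch of the idea, assuming the actor dictionary is JSON-encoded and bound to :actor as described above. The policy query here is a made-up example, not part of Datasette:

```python
import json
import sqlite3

# Sketch of the proposed "allow_sql" check: the actor is passed to an
# operator-supplied SQL query as :actor, which returns 1 (allowed) or
# 0 (denied). This example policy allows only the "root" actor.
ALLOW_SQL = "select json_extract(:actor, '$.id') = 'root'"

def actor_allowed(actor):
    conn = sqlite3.connect(":memory:")
    row = conn.execute(ALLOW_SQL, {"actor": json.dumps(actor)}).fetchone()
    return bool(row[0])

print(actor_allowed({"id": "root"}))   # True
print(actor_allowed({"id": "guest"}))  # False
```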

datasette 107914493 issue    
636511683 MDU6SXNzdWU2MzY1MTE2ODM= 830 Redesign register_facet_classes plugin hook simonw 9599 open 0   Datasette 1.0 3268330 0 2020-06-10T20:03:27Z 2020-06-10T20:03:27Z   OWNER  

Nothing uses this plugin hook yet, so the design is not yet proven.

I'm going to build a real plugin against it and use that process to inform any design changes that may need to be made.

I'll add a warning about this to the documentation.

datasette 107914493 issue    
634651079 MDU6SXNzdWU2MzQ2NTEwNzk= 814 Remove --debug option from datasette serve simonw 9599 open 0   Datasette 1.0 3268330 1 2020-06-08T14:10:14Z 2020-06-08T22:42:17Z   OWNER  

It doesn't appear to do anything useful at all:

https://github.com/simonw/datasette/blob/f786033a5f0098371cb1df1ce83959b27c588115/datasette/cli.py#L251-L253

https://github.com/simonw/datasette/blob/f786033a5f0098371cb1df1ce83959b27c588115/datasette/cli.py#L365-L367

datasette 107914493 issue    
449886319 MDU6SXNzdWU0NDk4ODYzMTk= 493 Rename metadata.json to config.json simonw 9599 open 0   Datasette 1.0 3268330 3 2019-05-29T15:48:03Z 2020-06-08T22:40:01Z   OWNER  

It is increasingly being used for configuration options, when it started out as purely metadata.

Could cause confusion with the --config mechanism though - maybe that should be called "settings" instead?

datasette 107914493 issue    
634663505 MDU6SXNzdWU2MzQ2NjM1MDU= 815 Group permission checks by request on /-/permissions debug page simonw 9599 open 0   Datasette 1.0 3268330 6 2020-06-08T14:25:23Z 2020-06-08T14:42:56Z   OWNER  

Now that we're making a LOT more permission checks (on the DB index page we do a check for every listed table for example) the /-/permissions page gets filled up pretty quickly.

Can make this more readable by grouping permission checks by request. Have most recent request at the top of the page but the permission requests within that page sorted chronologically by most recent last.
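The proposed regrouping can be sketched like this, using made-up check records (the real page would group by an actual request identifier):

```python
from itertools import groupby

# Checks arrive most-recent-request-first; group by request id, keep
# requests in that order, but sort checks inside each request so the
# most recent check comes last. Sample data is hypothetical.
checks = [
    {"request": "req-2", "when": 5, "action": "view-table", "resource": "b"},
    {"request": "req-2", "when": 4, "action": "view-table", "resource": "a"},
    {"request": "req-1", "when": 2, "action": "view-database", "resource": "db"},
    {"request": "req-1", "when": 1, "action": "view-instance", "resource": None},
]

def group_by_request(checks):
    grouped = []
    for request, items in groupby(checks, key=lambda c: c["request"]):
        grouped.append((request, sorted(items, key=lambda c: c["when"])))
    return grouped

for request, items in group_by_request(checks):
    print(request, [c["when"] for c in items])
```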

datasette 107914493 issue    
628572716 MDU6SXNzdWU2Mjg1NzI3MTY= 791 Tutorial: building a something-interesting with writable canned queries simonw 9599 open 0   Datasette 1.0 3268330 2 2020-06-01T16:32:05Z 2020-06-06T20:51:07Z   OWNER  

Initial idea: TODO list, as a tutorial for #698 writable canned queries.

datasette 107914493 issue    
610829227 MDU6SXNzdWU2MTA4MjkyMjc= 749 Respect Cloud Run max response size of 32MB simonw 9599 open 0   Datasette 1.0 3268330 1 2020-05-01T16:06:46Z 2020-06-06T20:01:54Z   OWNER  

https://cloud.google.com/run/quotas lists the maximum response size as 32MB.

I spotted a bug where attempting to download a database file larger than that from a Cloud Run deployment (in this case it was https://github-to-sqlite.dogsheep.net/github.db after I accidentally increased the size of that database) returned a 500 error because of this.

datasette 107914493 issue    
449854604 MDU6SXNzdWU0NDk4NTQ2MDQ= 492 Facets not correctly persisted in hidden form fields simonw 9599 open 0   Datasette 1.0 3268330 3 2019-05-29T14:49:39Z 2020-06-06T20:01:53Z   OWNER  

Steps to reproduce: visit https://2a4b892.datasette.io/fixtures/roadside_attractions?_facet_m2m=attraction_characteristic and click "Apply"

Result is a 500: no such column: attraction_characteristic

The error occurs because of this hidden HTML input:

<input type="hidden" name="_facet" value="attraction_characteristic">

This should be:

<input type="hidden" name="_facet_m2m" value="attraction_characteristic">
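One way to guarantee the full parameter name round-trips is to derive the hidden fields from the actual querystring keys rather than hard-coding _facet. A sketch, not the actual template fix:

```python
from html import escape
from urllib.parse import parse_qsl, urlparse

# Build hidden form inputs from the querystring itself, so keys like
# _facet_m2m are preserved verbatim instead of collapsing to _facet.
def hidden_inputs(url):
    pairs = parse_qsl(urlparse(url).query)
    return [
        '<input type="hidden" name="{}" value="{}">'.format(escape(k), escape(v))
        for k, v in pairs
    ]

url = "/fixtures/roadside_attractions?_facet_m2m=attraction_characteristic"
print(hidden_inputs(url))
```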
datasette 107914493 issue    
450032134 MDU6SXNzdWU0NTAwMzIxMzQ= 495 facet_m2m gets confused by multiple relationships simonw 9599 open 0   Datasette 1.0 3268330 2 2019-05-29T21:37:28Z 2020-06-06T20:01:53Z   OWNER  

I got this for a database I was playing with:

https://user-images.githubusercontent.com/9599/58593179-12a65b00-821f-11e9-8621-ec0a62d41b46.png

I think this is because of these three tables:

https://user-images.githubusercontent.com/9599/58593271-471a1700-821f-11e9-8d23-b27bfc3dec73.png

datasette 107914493 issue    
463492815 MDU6SXNzdWU0NjM0OTI4MTU= 534 500 error on m2m facet detection simonw 9599 open 0   Datasette 1.0 3268330 1 2019-07-03T00:42:42Z 2020-06-06T20:01:53Z   OWNER  

This may help debug:

diff --git a/datasette/facets.py b/datasette/facets.py
index 76d73e5..07a4034 100644
--- a/datasette/facets.py
+++ b/datasette/facets.py
@@ -499,11 +499,14 @@ class ManyToManyFacet(Facet):
                 "outgoing"
             ]
             if len(other_table_outgoing_foreign_keys) == 2:
-                destination_table = [
-                    t
-                    for t in other_table_outgoing_foreign_keys
-                    if t["other_table"] != self.table
-                ][0]["other_table"]
+                try:
+                    destination_table = [
+                        t
+                        for t in other_table_outgoing_foreign_keys
+                        if t["other_table"] != self.table
+                    ][0]["other_table"]
+                except IndexError:
+                    import pdb; pdb.pm()
                 # Only suggest if it's not selected already
                 if ("_facet_m2m", destination_table) in args:
                     continue
datasette 107914493 issue    
520740741 MDU6SXNzdWU1MjA3NDA3NDE= 625 If you apply ?_facet_array=tags then &_facet=tags does nothing simonw 9599 open 0   Datasette 1.0 3268330 0 2019-11-11T04:59:29Z 2020-06-06T20:01:53Z   OWNER  

Start here: https://v0-30-2.datasette.io/fixtures/facetable?_facet_array=tags

https://user-images.githubusercontent.com/9599/68562011-c0d44480-03fc-11ea-8354-def0ba35d365.png

Note that tags is offered as a suggested facet. But if you click that you get this:

https://v0-30-2.datasette.io/fixtures/facetable?_facet_array=tags&_facet=tags

The _facet=tags is added to the URL and it's removed from the list of suggested tags... but the facet itself is not displayed:

https://user-images.githubusercontent.com/9599/68562039-df3a4000-03fc-11ea-9778-81f7a931d608.png

The _facet=tags facet should look like this:

https://user-images.githubusercontent.com/9599/68562067-f8db8780-03fc-11ea-8211-26281b7df9f9.png

datasette 107914493 issue    
542553350 MDU6SXNzdWU1NDI1NTMzNTA= 655 Copy and paste doesn't work reliably on iPhone for SQL editor simonw 9599 open 0   Datasette 1.0 3268330 2 2019-12-26T13:15:10Z 2020-06-06T20:01:53Z   OWNER  

I'm having a lot of trouble copying and pasting from the codemirror editor on my iPhone.

datasette 107914493 issue    
576722115 MDU6SXNzdWU1NzY3MjIxMTU= 696 Single failing unit test when run inside the Docker image simonw 9599 open 0   Datasette 1.0 3268330 1 2020-03-06T06:16:36Z 2020-06-06T20:01:53Z   OWNER  
docker run -it -v `pwd`:/mnt datasetteproject/datasette:latest /bin/bash
root@0e1928cfdf79:/# cd /mnt
root@0e1928cfdf79:/mnt# pip install -e .[test]
root@0e1928cfdf79:/mnt# pytest

I get one failure!

It was for test_searchable[/fixtures/searchable.json?_search=te*+AND+do*&_searchmode=raw-expected_rows3]

    def test_searchable(app_client, path, expected_rows):
        response = app_client.get(path)
>       assert expected_rows == response.json["rows"]
E       AssertionError: assert [[1, 'barry c...sel', 'puma']] == []
E         Left contains 2 more items, first extra item: [1, 'barry cat', 'terry dog', 'panther']
E         Full diff:
E         + []
E         - [[1, 'barry cat', 'terry dog', 'panther'],
E         -  [2, 'terry dog', 'sara weasel', 'puma']]

_Originally posted by @simonw in https://github.com/simonw/datasette/issues/695#issuecomment-595614469_

datasette 107914493 issue    
398011658 MDU6SXNzdWUzOTgwMTE2NTg= 398 Ensure downloading a 100+MB SQLite database file works simonw 9599 open 0   Datasette 1.0 3268330 2 2019-01-10T20:57:52Z 2020-06-06T20:01:52Z   OWNER  

I've seen attempted downloads of large files fail after about ten seconds.

datasette 107914493 issue    
440222719 MDU6SXNzdWU0NDAyMjI3MTk= 448 _facet_array should work against views simonw 9599 open 0   Datasette 1.0 3268330 1 2019-05-03T21:08:04Z 2020-06-06T20:01:52Z   OWNER  

I created this view: https://json-view-facet-bug-demo-j7hipcg4aq-uc.a.run.app/russian-ads-8dbda00/ads_with_targets

CREATE VIEW ads_with_targets as select ads.*, json_group_array(targets.name) as target_names from ads
  join ad_targets on ad_targets.ad_id = ads.id
  join targets on ad_targets.target_id = targets.id
  group by ad_targets.ad_id

When I try to apply faceting by array it appears to work at first: https://json-view-facet-bug-demo-j7hipcg4aq-uc.a.run.app/russian-ads/ads_with_targets?_facet_array=target_names

But actually it's doing the wrong thing - the SQL for the facets uses rowid, but rowid is not present on views at all! These results are incorrect, and clicking to select a facet will fail to produce any rows: https://json-view-facet-bug-demo-j7hipcg4aq-uc.a.run.app/russian-ads/ads_with_targets?_facet_array=target_names&target_names__arraycontains=people_who_match%3Ainterests%3AAfrican-American+Civil+Rights+Movement+%281954%E2%80%9468%29

Here's the SQL it should be using when you select a facet (note that it does not use a rowid):

https://json-view-facet-bug-demo-j7hipcg4aq-uc.a.run.app/russian-ads?sql=select+*+from+ads_with_targets+where+id+in+%28%0D%0A++++++++++++select+ads_with_targets.id+from+ads_with_targets%2C+json_each%28ads_with_targets.target_names%29+j%0D%0A++++++++++++where+j.value+%3D+%3Ap0%0D%0A++++++++%29+limit+101&p0=people_who_match%3Ainterests%3ABlack+%28Color%29

So we need to do something a lot smarter here. I'm not sure what the fix will look like, or even if it's feasible given that views don't have a rowid to hook into so the JSON faceting SQL may have to be completely rewritten.

datasette publish cloudrun \
    russian-ads.db \
    --name json-view-facet-bug-demo \
    --branch master \
    --extra-options "--config sql_time_limit_ms:5000 --config facet_time_limit_ms:5000"
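The id-based SQL suggested above can be reduced to a runnable demo against an in-memory database - a simplified stand-in for the russian-ads schema, showing that json_each() filtering works on a view without touching rowid:

```python
import sqlite3

# Minimal reproduction: a view built with json_group_array has no
# rowid, so facet selection must filter by a real column (id here).
conn = sqlite3.connect(":memory:")
conn.executescript("""
create table ads (id integer primary key, text text);
create table targets (id integer primary key, name text);
create table ad_targets (ad_id integer, target_id integer);
insert into ads values (1, 'ad one'), (2, 'ad two');
insert into targets values (1, 'cats'), (2, 'dogs');
insert into ad_targets values (1, 1), (1, 2), (2, 2);
create view ads_with_targets as
  select ads.*, json_group_array(targets.name) as target_names
  from ads
  join ad_targets on ad_targets.ad_id = ads.id
  join targets on ad_targets.target_id = targets.id
  group by ad_targets.ad_id;
""")
# The facet-selection query from the issue, using id instead of rowid:
rows = conn.execute("""
  select id, text from ads_with_targets where id in (
    select ads_with_targets.id
    from ads_with_targets, json_each(ads_with_targets.target_names) j
    where j.value = :p0
  )
""", {"p0": "cats"}).fetchall()
print(rows)
```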
datasette 107914493 issue    
570301333 MDU6SXNzdWU1NzAzMDEzMzM= 684 Add documentation on Database introspection methods to internals.rst simonw 9599 closed 0   Datasette 1.0 3268330 4 2020-02-25T04:20:24Z 2020-06-04T18:56:15Z 2020-05-30T18:40:39Z OWNER  

internals.rst will be landing as part of #683

datasette 107914493 issue    
629595228 MDExOlB1bGxSZXF1ZXN0NDI2ODkxNDcx 796 New WIP writable canned queries simonw 9599 closed 0   Datasette 1.0 3268330 9 2020-06-03T00:08:00Z 2020-06-03T15:16:52Z 2020-06-03T15:16:50Z OWNER simonw/datasette/pulls/796

Refs #698. Replaces #703

Still todo:

  • Unit tests
  • Figure out .json mode (done)
  • Flash message solution
  • CSRF protection (done)
  • Better error message display on errors
  • Documentation
  • Maybe widgets? (deferred) I'll do these later
datasette 107914493 pull    
629535669 MDU6SXNzdWU2Mjk1MzU2Njk= 794 Show hooks implemented by each plugin on /-/plugins simonw 9599 closed 0   Datasette 1.0 3268330 2 2020-06-02T21:44:38Z 2020-06-02T22:30:17Z 2020-06-02T21:50:10Z OWNER  

e.g.

    {
        "name": "qs_actor.py",
        "static": false,
        "templates": false,
        "version": null,
        "hooks": [
            "actor_from_request"
        ]
    }
datasette 107914493 issue    
626593402 MDU6SXNzdWU2MjY1OTM0MDI= 780 Internals documentation for datasette.metadata() method simonw 9599 open 0   Datasette 1.0 3268330 2 2020-05-28T15:14:22Z 2020-06-02T22:13:12Z   OWNER  

https://github.com/simonw/datasette/blob/40885ef24e32d91502b6b8bbad1c7376f50f2830/datasette/app.py#L297-L328

datasette 107914493 issue    
497170355 MDU6SXNzdWU0OTcxNzAzNTU= 576 Documented internals API for use in plugins simonw 9599 open 0   Datasette 1.0 3268330 8 2019-09-23T15:28:50Z 2020-06-02T22:13:09Z   OWNER  

Quite a few of the plugin hooks make a "datasette" instance of the Datasette class available to the plugins, so that they can look up configuration settings and execute database queries.

This means it should provide a documented, stable API so that plugin authors can rely on it.

datasette 107914493 issue    
440134714 MDU6SXNzdWU0NDAxMzQ3MTQ= 446 Define mechanism for plugins to return structured data simonw 9599 open 0   Datasette 1.0 3268330 6 2019-05-03T17:00:16Z 2020-06-02T22:12:15Z   OWNER  

Several plugin hooks now expect plugins to return data in a specific shape - notably the new output format hook and the custom facet hook.

These use Python dictionaries right now but that's quite error prone: it would be good to have a mechanism that supported a more structured format.

Full list of current hooks is here: https://datasette.readthedocs.io/en/latest/plugins.html#plugin-hooks

datasette 107914493 issue    
459590021 MDU6SXNzdWU0NTk1OTAwMjE= 519 Decide what goes into Datasette 1.0 simonw 9599 open 0   Datasette 1.0 3268330 2 2019-06-23T15:47:41Z 2020-05-30T18:55:24Z   OWNER  

Datasette ASGI #272 is a big part of it... but 1.0 will generally be an indicator that Datasette is a stable platform for developers to write plugins and custom templates against. So lots to think about.

datasette 107914493 issue    
570101428 MDExOlB1bGxSZXF1ZXN0Mzc5MTkyMjU4 683 .execute_write() and .execute_write_fn() methods on Database simonw 9599 closed 0   Datasette 1.0 3268330 14 2020-02-24T19:51:58Z 2020-05-30T18:40:20Z 2020-02-25T04:45:08Z OWNER simonw/datasette/pulls/683

See #682

  • Come up with design for .execute_write() and .execute_write_fn()
  • Build some quick demo plugins to exercise the design
  • Write some unit tests
  • Write the documentation
datasette 107914493 pull    
374953006 MDU6SXNzdWUzNzQ5NTMwMDY= 369 Interface should show same JSON shape options for custom SQL queries slygent 416374 open 0   Datasette 1.0 3268330 2 2018-10-29T10:39:15Z 2020-05-30T17:24:06Z   NONE  

At the moment the page returning a custom SQL query shows the JSON and CSV APIs, but not the multiple JSON shapes. However, adding the _shape parameter to the JSON API URL manually still works, so perhaps there should be consistency in the interface by having the same "Advanced Export" box for custom SQL queries.

datasette 107914493 issue    
459397625 MDU6SXNzdWU0NTkzOTc2MjU= 514 Documentation with recommendations on running Datasette in production without using Docker chrismp 7936571 open 0   Datasette 1.0 3268330 26 2019-06-21T22:48:12Z 2020-05-30T17:22:56Z   NONE  

I've got some SQLite databases too big to push to Heroku or the other services with built-in support in datasette.

So instead I moved my datasette code and databases to a remote server on Kimsufi. In the folder containing the SQLite databases I run the following code.

nohup datasette serve -h 0.0.0.0 *.db --cors --port 8000 --metadata metadata.json > output.log 2>&1 &

When I go to http://my-remote-server.com:8000, the site loads. But I know this is not a good long-term solution to running datasette on this server.

What is the "correct" way to have this site run, preferably on server port 80?
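A hedged sketch of one common answer, not official Datasette guidance: run Datasette as a systemd service on a high port and put nginx (or another reverse proxy) on port 80 in front of it. Paths, user, and database name below are hypothetical:

```ini
# /etc/systemd/system/datasette.service - illustrative only
[Unit]
Description=Datasette
After=network.target

[Service]
User=datasette
WorkingDirectory=/home/datasette/data
# systemd does not expand shell globs, so list databases explicitly
ExecStart=/usr/local/bin/datasette serve -h 127.0.0.1 --port 8000 --cors --metadata metadata.json mydatabase.db
Restart=on-failure

[Install]
WantedBy=multi-user.target
```

This replaces the nohup invocation with supervised restarts, and keeps Datasette off the privileged port 80, which the proxy handles.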

datasette 107914493 issue    
520667773 MDU6SXNzdWU1MjA2Njc3NzM= 620 Mechanism for indicating foreign key relationships in the table and query page URLs simonw 9599 open 0   Datasette 1.0 3268330 5 2019-11-10T22:26:27Z 2020-05-30T17:22:56Z   OWNER  

Datasette currently only inflates foreign keys (into named hyperlinks) if it detects them as foreign key constraints in the underlying database.

It would be useful if you could specify additional "foreign keys" using both metadata.json and the querystring - similar to how you can pass ?_fts_table=x - see https://datasette.readthedocs.io/en/stable/full_text_search.html#configuring-full-text-search-for-a-table-or-view

datasette 107914493 issue    
520681725 MDU6SXNzdWU1MjA2ODE3MjU= 621 Syntax for ?_through= that works as a form field simonw 9599 open 0   Datasette 1.0 3268330 3 2019-11-11T00:19:03Z 2020-05-30T17:22:56Z   OWNER  

The current syntax for ?_through= uses JSON to avoid any risk of confusion with table or column names that contain special characters.

This means you can't target a form field at it.

We should be able to support both - ?x.y.z=value for tables and columns with "regular" names, falling back to the current JSON syntax for columns or tables that won't work with the key/value syntax.
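A sketch of that dual parsing, assuming a table.column=value shape for the simple case (the exact key syntax is still undecided in the issue above):

```python
import json

# Accept both the existing JSON syntax for ?_through= and a hypothetical
# form-friendly table.column=value syntax, normalizing to one dict shape.
def parse_through(key, value):
    if key == "_through":
        # Current JSON syntax: {"table": ..., "column": ..., "value": ...}
        return json.loads(value)
    if "." in key:
        # Proposed key/value syntax for "regular" table and column names
        table, column = key.split(".", 1)
        return {"table": table, "column": column, "value": value}
    raise ValueError("not a through filter: {}".format(key))

print(parse_through("_through", '{"table": "ad_targets", "column": "target_id", "value": "1"}'))
print(parse_through("ad_targets.target_id", "1"))
```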

datasette 107914493 issue    
531502365 MDU6SXNzdWU1MzE1MDIzNjU= 646 Make database level information from metadata.json available in the index.html template lagolucas 18017473 open 0   Datasette 1.0 3268330 3 2019-12-02T19:55:10Z 2020-05-30T17:22:56Z   NONE  

Did a search on the issues here and didn't find anything related to what I want.

I want to have information that is on the database level of the JSON like title, source and source_url, and use it on the index page.

I tried some small tweaks on the python and html files, but failed to get that result.

Is there a way? Thanks!

datasette 107914493 issue    
570309546 MDU6SXNzdWU1NzAzMDk1NDY= 685 Document (and reconsider design of) Database.execute() and Database.execute_against_connection_in_thread() simonw 9599 closed 0   Datasette 1.0 3268330 15 2020-02-25T04:49:44Z 2020-05-30T13:20:50Z 2020-05-08T17:42:18Z OWNER  

In #683 I started a new section of internals documentation covering the Database class: https://datasette.readthedocs.io/en/latest/internals.html#database-class

I decided not to document .execute() and .execute_against_connection_in_thread() yet because I'm not 100% happy with their API design yet.

datasette 107914493 issue    
585633142 MDU6SXNzdWU1ODU2MzMxNDI= 706 Documentation for the "request" object simonw 9599 closed 0   Datasette 1.0 3268330 6 2020-03-22T02:55:50Z 2020-05-30T13:20:00Z 2020-05-27T22:31:22Z OWNER  

Since that object is passed to the extra_template_vars hooks AND the classes registered by register_facet_classes it should be part of the documented interface on https://datasette.readthedocs.io/en/stable/internals.html

I could also start passing it to the register_output_renderer callback.

datasette 107914493 issue    
626078521 MDU6SXNzdWU2MjYwNzg1MjE= 774 Consolidate request.raw_args and request.args simonw 9599 closed 0   Datasette 1.0 3268330 8 2020-05-27T22:30:59Z 2020-05-29T23:27:35Z 2020-05-29T23:22:38Z OWNER  

request.raw_args is not documented, and I'd like to remove it entirely.
_Originally posted by @simonw in https://github.com/simonw/datasette/issues/706#issuecomment-634975252_

I use it in a few places in other projects though, so I'll have to fix those first: https://github.com/search?q=user%3Asimonw+raw_args&type=Code

datasette 107914493 issue    
497171390 MDU6SXNzdWU0OTcxNzEzOTA= 577 Utility mechanism for plugins to render templates simonw 9599 closed 0   Datasette 1.0 3268330 7 2019-09-23T15:30:36Z 2020-02-04T20:26:20Z 2020-02-04T20:26:19Z OWNER  

Sometimes a plugin will need to render a template for some custom UI. We need a documented API for doing this, which ensures that everything will work correctly if you extend base.html etc.

See also #576. This could be a .render() method on the Datasette class, but that feels a bit weird - should that class also take responsibility for rendering?

datasette 107914493 issue    
459587155 MDExOlB1bGxSZXF1ZXN0MjkwODk3MTA0 518 Port Datasette from Sanic to ASGI + Uvicorn simonw 9599 closed 0 simonw 9599 Datasette 1.0 3268330 12 2019-06-23T15:18:42Z 2019-06-24T13:42:50Z 2019-06-24T03:13:09Z OWNER simonw/datasette/pulls/518

Most of the code here was fleshed out in comments on #272 (Port Datasette to ASGI) - this pull request will track the final pieces:

  • Update test harness to more correctly simulate the raw_path issue
  • Use raw_path so table names containing / can work correctly
  • Bug: JSON not served with correct content-type
  • Get ?_trace=1 working again
  • Replacement for @app.listener("before_server_start")
  • Bug: /fixtures/table%2Fwith%2Fslashes.csv?_format=json downloads as CSV
  • Replace Sanic request and response objects with my own classes, so I can remove Sanic dependency
  • Final code tidy-up before merging to master
datasette 107914493 pull    
324188953 MDU6SXNzdWUzMjQxODg5NTM= 272 Port Datasette to ASGI simonw 9599 closed 0 simonw 9599 Datasette 1.0 3268330 42 2018-05-17T21:16:32Z 2019-06-24T04:54:15Z 2019-06-24T03:33:06Z OWNER  

Datasette doesn't take much advantage of Sanic, and I'm increasingly having to work around parts of it because of idiosyncrasies that are specific to Datasette - caring about the exact order of querystring arguments for example.

Since Datasette is GET-only our needs from a web framework are actually pretty slim.

This becomes more important as I expand the plugins #14 framework. Am I sure I want the plugin ecosystem to depend on Sanic if I might move away from it in the future?

If Datasette wasn't all about async/await I would use WSGI, but today it makes more sense to use ASGI. I'd like to be confident that switching to ASGI would still give me the excellent performance that Sanic provides.

https://github.com/django/asgiref/blob/master/specs/asgi.rst

datasette 107914493 issue    


CREATE TABLE [issues] (
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [number] INTEGER,
   [title] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [state] TEXT,
   [locked] INTEGER,
   [assignee] INTEGER REFERENCES [users]([id]),
   [milestone] INTEGER REFERENCES [milestones]([id]),
   [comments] INTEGER,
   [created_at] TEXT,
   [updated_at] TEXT,
   [closed_at] TEXT,
   [author_association] TEXT,
   [pull_request] TEXT,
   [body] TEXT,
   [repo] INTEGER REFERENCES [repos]([id]),
   [type] TEXT
, [active_lock_reason] TEXT, [performed_via_github_app] TEXT);
CREATE INDEX [idx_issues_repo]
                ON [issues] ([repo]);
CREATE INDEX [idx_issues_milestone]
                ON [issues] ([milestone]);
CREATE INDEX [idx_issues_assignee]
                ON [issues] ([assignee]);
CREATE INDEX [idx_issues_user]
                ON [issues] ([user]);
Powered by Datasette · Query took 47.287ms · About: github-to-sqlite