

issue_comments


10,427 rows sorted by updated_at descending




user >30

  • simonw 8,845
  • codecov[bot] 235
  • fgregg 79
  • eyeseast 74
  • russss 39
  • psychemedia 35
  • dependabot[bot] 34
  • abdusco 26
  • bgrins 24
  • cldellow 24
  • asg017 24
  • mroswell 22
  • chapmanjacobd 22
  • aborruso 19
  • chrismp 18
  • brandonrobertz 15
  • hydrosquall 15
  • RhetTbull 15
  • jacobian 14
  • carlmjohnson 14
  • tballison 13
  • wragge 12
  • tsibley 11
  • rixx 11
  • stonebig 11
  • frafra 10
  • maxhawkins 10
  • terrycojones 10
  • dracos 10
  • rayvoelker 10
  • …

issue >30

  • Redesign default .json format 55
  • Show column metadata plus links for foreign keys on arbitrary query results 51
  • ?_extra= support (draft) 49
  • Rethink how .ext formats (v.s. ?_format=) works before 1.0 48
  • Upgrade to CodeMirror 6, add SQL autocomplete 48
  • JavaScript plugin hooks mechanism similar to pluggy 47
  • Updated Dockerfile with SpatiaLite version 5.0 45
  • Complete refactor of TableView and table.html template 45
  • Port Datasette to ASGI 42
  • Authentication (and permissions) as a core concept 40
  • invoke_startup() is not run in some conditions, e.g. gunicorn/uvicorn workers, breaking lots of things 36
  • Deploy a live instance of demos/apache-proxy 34
  • await datasette.client.get(path) mechanism for executing internal requests 33
  • Maintain an in-memory SQLite table of connected databases and their tables 32
  • Research: demonstrate if parallel SQL queries are worthwhile 32
  • Ability to sort (and paginate) by column 31
  • Server hang on parallel execution of queries to named in-memory databases 31
  • Default API token authentication mechanism 30
  • Port as many tests as possible to async def tests against ds_client 29
  • link_or_copy_directory() error - Invalid cross-device link 28
  • Add ?_extra= mechanism for requesting extra properties in JSON 27
  • Export to CSV 27
  • base_url configuration setting 27
  • Documentation with recommendations on running Datasette in production without using Docker 27
  • Optimize all those calls to index_list and foreign_key_list 27
  • Support cross-database joins 26
  • Ability for a canned query to write to the database 26
  • table.transform() method for advanced alter table 26
  • New pattern for views that return either JSON or HTML, available for plugins 26
  • Proof of concept for Datasette on AWS Lambda with EFS 25
  • …

reactions 23

  • {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} 10,067
  • {"total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} 208
  • {"total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 1, "rocket": 0, "eyes": 0} 39
  • {"total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 1, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} 37
  • {"total_count": 2, "+1": 2, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} 19
  • {"total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 1, "eyes": 0} 13
  • {"total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 1} 9
  • {"total_count": 1, "+1": 0, "-1": 0, "laugh": 1, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} 7
  • {"total_count": 3, "+1": 3, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} 6
  • {"total_count": 2, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 2, "rocket": 0, "eyes": 0} 3
  • {"total_count": 2, "+1": 0, "-1": 0, "laugh": 0, "hooray": 2, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} 3
  • {"total_count": 2, "+1": 1, "-1": 0, "laugh": 0, "hooray": 1, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} 3
  • {"total_count": 2, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 1, "rocket": 0, "eyes": 0} 2
  • {"total_count": 3, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 3, "rocket": 0, "eyes": 0} 2
  • {"total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 1, "heart": 0, "rocket": 0, "eyes": 0} 1
  • {"total_count": 15, "+1": 7, "-1": 0, "laugh": 1, "hooray": 1, "confused": 0, "heart": 5, "rocket": 1, "eyes": 0} 1
  • {"total_count": 2, "+1": 0, "-1": 0, "laugh": 1, "hooray": 0, "confused": 0, "heart": 0, "rocket": 1, "eyes": 0} 1
  • {"total_count": 2, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 1} 1
  • {"total_count": 2, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 1, "eyes": 0} 1
  • {"total_count": 3, "+1": 0, "-1": 0, "laugh": 0, "hooray": 3, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} 1
  • {"total_count": 4, "+1": 4, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} 1
  • {"total_count": 5, "+1": 2, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 3, "eyes": 0} 1
  • {"total_count": 5, "+1": 5, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} 1

author_association 4

  • OWNER 8,320
  • NONE 990
  • CONTRIBUTOR 592
  • MEMBER 525
id html_url issue_url node_id user created_at updated_at ▲ author_association body reactions issue performed_via_github_app
1733312349 https://github.com/simonw/sqlite-utils/issues/595#issuecomment-1733312349 https://api.github.com/repos/simonw/sqlite-utils/issues/595 IC_kwDOCGYnMM5nUD9d cycle-data 123451970 2023-09-25T09:38:13Z 2023-09-25T09:38:57Z NONE

Never mind

When I created the connection using sqlite_utils.Database(path)

I just needed to add the following statement right after and it did the trick

self.db.conn.execute("PRAGMA foreign_keys = ON")

Hope this helps people in the future 👍
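The comment above is the whole fix: SQLite ships with foreign key enforcement switched off for each new connection, so cascading deletes silently do nothing until the PRAGMA is issued. A minimal standalone sketch using the stdlib `sqlite3` module (standing in for the `sqlite_utils.Database` wrapper mentioned above):

```python
import sqlite3

# Foreign key enforcement is off by default in SQLite and must be
# enabled per connection, exactly as the comment describes.
conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")
conn.executescript("""
CREATE TABLE parent (id INTEGER PRIMARY KEY);
CREATE TABLE child (
    id INTEGER PRIMARY KEY,
    parent_id INTEGER REFERENCES parent(id) ON DELETE CASCADE
);
INSERT INTO parent (id) VALUES (1);
INSERT INTO child (id, parent_id) VALUES (10, 1);
""")
conn.execute("DELETE FROM parent WHERE id = 1")
# With the PRAGMA on, the delete cascades and removes the child row too:
remaining = conn.execute("SELECT count(*) FROM child").fetchone()[0]
print(remaining)  # 0
```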

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Cascading DELETE not working with Table.delete(pk) 1907281675  
1732018273 https://github.com/simonw/sqlite-utils/issues/297#issuecomment-1732018273 https://api.github.com/repos/simonw/sqlite-utils/issues/297 IC_kwDOCGYnMM5nPIBh radusuciu 1108600 2023-09-22T20:49:51Z 2023-09-22T20:49:51Z NONE

This would be awesome to have for multi-gig tsv and csv files! I'm currently looking at a 10 hour countdown for one such import. Not a problem, because I'm happy to let it run and check on it tomorrow.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Option for importing CSV data using the SQLite .import mechanism 944846776  
1730458954 https://github.com/simonw/datasette/issues/2195#issuecomment-1730458954 https://api.github.com/repos/simonw/datasette/issues/2195 IC_kwDOBm6k_c5nJLVK simonw 9599 2023-09-21T22:57:39Z 2023-09-21T22:57:48Z OWNER

Worth noting that it already sets --cors automatically without you needing to specify it:

https://github.com/simonw/datasette/blob/d97e82df3c8a3f2e97038d7080167be9bb74a68d/datasette/utils/__init__.py#L374-L374

I wonder if that's actually surprising behaviour that we should change before 1.0.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
`datasette publish` needs support for the new config/metadata split 1907765514  
1730457374 https://github.com/simonw/datasette/issues/2195#issuecomment-1730457374 https://api.github.com/repos/simonw/datasette/issues/2195 IC_kwDOBm6k_c5nJK8e simonw 9599 2023-09-21T22:56:18Z 2023-09-21T22:56:18Z OWNER

Maybe I should add --cors and --crossdb to datasette publish cloudrun as well?

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
`datasette publish` needs support for the new config/metadata split 1907765514  
1730446937 https://github.com/simonw/datasette/issues/2195#issuecomment-1730446937 https://api.github.com/repos/simonw/datasette/issues/2195 IC_kwDOBm6k_c5nJIZZ simonw 9599 2023-09-21T22:46:42Z 2023-09-21T22:46:52Z OWNER

Found more when I searched for YAML.

Here's the most interesting: https://github.com/labordata/warehouse/blob/0029a72fc1ceae9091932da6566f891167179012/.github/workflows/build.yml#L59

--extra-options="--crossdb --setting sql_time_limit_ms 100000 --cors --setting facet_time_limit_ms 500 --setting allow_facet off --setting trace_debug 1"

Uses both --cors and --crossdb.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
`datasette publish` needs support for the new config/metadata split 1907765514  
1730441613 https://github.com/simonw/datasette/issues/2195#issuecomment-1730441613 https://api.github.com/repos/simonw/datasette/issues/2195 IC_kwDOBm6k_c5nJHGN simonw 9599 2023-09-21T22:42:12Z 2023-09-21T22:42:12Z OWNER

https://github.com/search?q=datasette+publish+extra-options+language%3AShell&type=code&l=Shell shows 17 matches, I'll copy in illustrative examples here:

--extra-options="--setting sql_time_limit_ms 5000"
--extra-options="--config default_cache_ttl:3600 --config hash_urls:1"
--extra-options "--setting sql_time_limit_ms 3500 --setting default_page_size 20 --setting trace_debug 1"
--extra-options="--config default_page_size:50 --config sql_time_limit_ms:30000 --config facet_time_limit_ms:10000"
--extra-options="--setting sql_time_limit_ms 5000"
--extra-options "--setting suggest_facets off --setting allow_download on --setting truncate_cells_html 0 --setting max_csv_mb 0 --setting sql_time_limit_ms 2000"

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
`datasette publish` needs support for the new config/metadata split 1907765514  
1730438503 https://github.com/simonw/datasette/issues/2195#issuecomment-1730438503 https://api.github.com/repos/simonw/datasette/issues/2195 IC_kwDOBm6k_c5nJGVn simonw 9599 2023-09-21T22:38:10Z 2023-09-21T22:38:10Z OWNER

I'd really like to remove --extra-options. I think the new design makes that completely obsolete?

Maybe it doesn't. You still need --extra-options for the --crossdb option for example.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
`datasette publish` needs support for the new config/metadata split 1907765514  
1730437934 https://github.com/simonw/datasette/issues/2195#issuecomment-1730437934 https://api.github.com/repos/simonw/datasette/issues/2195 IC_kwDOBm6k_c5nJGMu simonw 9599 2023-09-21T22:37:22Z 2023-09-21T22:37:22Z OWNER

Here's the full help for Cloud Run at the moment:

```bash
datasette publish cloudrun --help
```

```
Usage: datasette publish cloudrun [OPTIONS] [FILES]...

  Publish databases to Datasette running on Cloud Run

Options:
  -m, --metadata FILENAME    Path to JSON/YAML file containing metadata to publish
  --extra-options TEXT       Extra options to pass to datasette serve
  --branch TEXT              Install datasette from a GitHub branch e.g. main
  --template-dir DIRECTORY   Path to directory containing custom templates
  --plugins-dir DIRECTORY    Path to directory containing custom plugins
  --static MOUNT:DIRECTORY   Serve static files from this directory at /MOUNT/...
  --install TEXT             Additional packages (e.g. plugins) to install
  --plugin-secret <TEXT TEXT TEXT>...
                             Secrets to pass to plugins, e.g. --plugin-secret
                             datasette-auth-github client_id xxx
  --version-note TEXT        Additional note to show on /-/versions
  --secret TEXT              Secret used for signing secure values, such as
                             signed cookies
  --title TEXT               Title for metadata
  --license TEXT             License label for metadata
  --license_url TEXT         License URL for metadata
  --source TEXT              Source label for metadata
  --source_url TEXT          Source URL for metadata
  --about TEXT               About label for metadata
  --about_url TEXT           About URL for metadata
  -n, --name TEXT            Application name to use when building
  --service TEXT             Cloud Run service to deploy (or over-write)
  --spatialite               Enable SpatialLite extension
  --show-files               Output the generated Dockerfile and metadata.json
  --memory TEXT              Memory to allocate in Cloud Run, e.g. 1Gi
  --cpu [1|2|4]              Number of vCPUs to allocate in Cloud Run
  --timeout INTEGER          Build timeout in seconds
  --apt-get-install TEXT     Additional packages to apt-get install
  --max-instances INTEGER    Maximum Cloud Run instances
  --min-instances INTEGER    Minimum Cloud Run instances
  --help                     Show this message and exit.
```

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
`datasette publish` needs support for the new config/metadata split 1907765514  
1730437237 https://github.com/simonw/datasette/issues/2195#issuecomment-1730437237 https://api.github.com/repos/simonw/datasette/issues/2195 IC_kwDOBm6k_c5nJGB1 simonw 9599 2023-09-21T22:36:22Z 2023-09-21T22:36:22Z OWNER

I think the actual design of this is pretty simple. Current help starts like this:

```
Usage: datasette publish cloudrun [OPTIONS] [FILES]...

  Publish databases to Datasette running on Cloud Run

Options:
  -m, --metadata FILENAME  Path to JSON/YAML file containing metadata to publish
  --extra-options TEXT     Extra options to pass to datasette serve
```

The `-s` and `-c` short options are not being used.

So I think -c/--config can point to a JSON or YAML datasette.yaml file, and -s/--setting key value can mirror the new -s/--setting option in datasette serve itself (a shortcut for populating the config file directly from the CLI).

Here's the relevant help section from `datasette serve`:

```
  -m, --metadata FILENAME   Path to JSON/YAML file containing license/source metadata
  -c, --config FILENAME     Path to JSON/YAML Datasette configuration file
  -s, --setting SETTING...  nested.key, value setting to use in Datasette configuration
```
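The proposed `-s key value` shortcut amounts to folding dotted keys into the nested config structure. A hypothetical sketch of that folding (the helper name and behaviour are illustrative, not Datasette's actual implementation):

```python
def settings_to_config(pairs):
    """Fold ('nested.key', value) pairs into a nested dict.

    Illustrative only - not the code datasette serve ships with.
    """
    config = {}
    for dotted_key, value in pairs:
        node = config
        *parents, leaf = dotted_key.split(".")
        for part in parents:
            # Create intermediate dicts as needed for each dotted segment
            node = node.setdefault(part, {})
        node[leaf] = value
    return config

print(settings_to_config([
    ("settings.sql_time_limit_ms", "4000"),
    ("settings.default_page_size", "50"),
]))
# {'settings': {'sql_time_limit_ms': '4000', 'default_page_size': '50'}}
```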

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
`datasette publish` needs support for the new config/metadata split 1907765514  
1730388418 https://github.com/simonw/datasette/issues/2189#issuecomment-1730388418 https://api.github.com/repos/simonw/datasette/issues/2189 IC_kwDOBm6k_c5nI6HC simonw 9599 2023-09-21T22:26:19Z 2023-09-21T22:26:19Z OWNER

1.0a7 is out with this fix as well now: https://docs.datasette.io/en/1.0a7/changelog.html#a7-2023-09-21

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Server hang on parallel execution of queries to named in-memory databases 1901416155  
1730363182 https://github.com/simonw/datasette/issues/2057#issuecomment-1730363182 https://api.github.com/repos/simonw/datasette/issues/2057 IC_kwDOBm6k_c5nIz8u simonw 9599 2023-09-21T22:09:10Z 2023-09-21T22:09:10Z OWNER

Tests all pass now.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
DeprecationWarning: pkg_resources is deprecated as an API 1662951875  
1730362441 https://github.com/simonw/datasette/issues/2194#issuecomment-1730362441 https://api.github.com/repos/simonw/datasette/issues/2194 IC_kwDOBm6k_c5nIzxJ simonw 9599 2023-09-21T22:08:19Z 2023-09-21T22:08:19Z OWNER

That worked

https://github.com/simonw/datasette/commit/e4f868801a6633400045f59584cfe650961c3fa6 is the latest commit right now and https://latest.datasette.io/-/versions shows that as the deployed version.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Deploy failing with "plugins/alternative_route.py: Not a directory" 1907695234  
1730356422 https://github.com/simonw/datasette/issues/2057#issuecomment-1730356422 https://api.github.com/repos/simonw/datasette/issues/2057 IC_kwDOBm6k_c5nIyTG simonw 9599 2023-09-21T22:01:00Z 2023-09-21T22:01:00Z OWNER

Tested that locally with Python 3.9 from pyenv and it worked.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
DeprecationWarning: pkg_resources is deprecated as an API 1662951875  
1730353462 https://github.com/simonw/datasette/issues/2057#issuecomment-1730353462 https://api.github.com/repos/simonw/datasette/issues/2057 IC_kwDOBm6k_c5nIxk2 simonw 9599 2023-09-21T21:57:17Z 2023-09-21T21:57:17Z OWNER

Still fails in Python 3.9: https://github.com/simonw/datasette/actions/runs/6266752548/job/17018363302

```
plugin_info["name"] = distinfo.name or distinfo.project_name
AttributeError: 'PathDistribution' object has no attribute 'name'
Test failed: datasette-json-html should not have been loaded
```
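The traceback is Python version skew: `importlib.metadata` distribution objects only grew a `.name` property in Python 3.10, while the `Name` metadata field is readable on every supported version. A sketch of a version-tolerant lookup (an assumption about one possible fix, not the patch Datasette shipped):

```python
import importlib.metadata

# Distribution.name exists on Python >= 3.10; on 3.8/3.9 only the
# metadata mapping is available, so fall back to metadata["Name"].
dist = next(iter(importlib.metadata.distributions()))
name = getattr(dist, "name", None) or dist.metadata["Name"]
print(name)
```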

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
DeprecationWarning: pkg_resources is deprecated as an API 1662951875  
1730353006 https://github.com/simonw/datasette/issues/2193#issuecomment-1730353006 https://api.github.com/repos/simonw/datasette/issues/2193 IC_kwDOBm6k_c5nIxdu simonw 9599 2023-09-21T21:56:43Z 2023-09-21T21:56:43Z OWNER

The test fails as expected now. Closing this issue, will solve the remaining problems in: - #2057

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
"Test DATASETTE_LOAD_PLUGINS" test shows errors but did not fail the CI run 1907655261  
1730352111 https://github.com/simonw/datasette/issues/2193#issuecomment-1730352111 https://api.github.com/repos/simonw/datasette/issues/2193 IC_kwDOBm6k_c5nIxPv simonw 9599 2023-09-21T21:55:41Z 2023-09-21T21:55:41Z OWNER

https://github.com/simonw/datasette/actions/runs/6267146158/job/17019594849 failed on 3.9 this time.

```
plugin_info["name"] = distinfo.name or distinfo.project_name
AttributeError: 'PathDistribution' object has no attribute 'name'
Test failed: datasette-json-html should not have been loaded
```

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
"Test DATASETTE_LOAD_PLUGINS" test shows errors but did not fail the CI run 1907655261  
1730313565 https://github.com/simonw/datasette/issues/2195#issuecomment-1730313565 https://api.github.com/repos/simonw/datasette/issues/2195 IC_kwDOBm6k_c5nIn1d simonw 9599 2023-09-21T21:16:31Z 2023-09-21T21:16:31Z OWNER

The @add_common_publish_arguments_and_options decorator described here is bad. If I update it to support a new config option all plugins that use it will break.

https://github.com/simonw/datasette/blob/f130c7c0a88e50cea4121ea18d1f6db2431b6fab/docs/plugin_hooks.rst#L347-L355

I want to deprecate it and switch to a different, better design to address the same problem.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
`datasette publish` needs support for the new config/metadata split 1907765514  
1730312128 https://github.com/simonw/datasette/issues/2195#issuecomment-1730312128 https://api.github.com/repos/simonw/datasette/issues/2195 IC_kwDOBm6k_c5nInfA simonw 9599 2023-09-21T21:15:11Z 2023-09-21T21:15:11Z OWNER

As soon as datasette publish cloudrun has this I can re-enable this bit of the demo deploy:

https://github.com/simonw/datasette/blob/2da1a6acec915b81a16127008fd739c7d6075681/.github/workflows/deploy-latest.yml#L91-L97

Which should fix this broken demo from https://simonwillison.net/2022/Dec/2/datasette-write-api/

https://todomvc.datasette.io/

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
`datasette publish` needs support for the new config/metadata split 1907765514  
1730305920 https://github.com/simonw/datasette/issues/2194#issuecomment-1730305920 https://api.github.com/repos/simonw/datasette/issues/2194 IC_kwDOBm6k_c5nIl-A simonw 9599 2023-09-21T21:09:21Z 2023-09-21T21:09:21Z OWNER

I'm going to disable this bit of the deploy for the moment, which will break the demo linked to from https://simonwillison.net/2022/Dec/2/datasette-write-api/

https://github.com/simonw/datasette/blob/2da1a6acec915b81a16127008fd739c7d6075681/.github/workflows/deploy-latest.yml#L91-L97

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Deploy failing with "plugins/alternative_route.py: Not a directory" 1907695234  
1730259871 https://github.com/simonw/datasette/issues/2194#issuecomment-1730259871 https://api.github.com/repos/simonw/datasette/issues/2194 IC_kwDOBm6k_c5nIauf simonw 9599 2023-09-21T20:34:09Z 2023-09-21T20:34:09Z OWNER

... which raises the challenge that datasette publish doesn't yet know what to do with a config file!

https://github.com/simonw/datasette/blob/2da1a6acec915b81a16127008fd739c7d6075681/.github/workflows/deploy-latest.yml#L114-L122

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Deploy failing with "plugins/alternative_route.py: Not a directory" 1907695234  
1730258302 https://github.com/simonw/datasette/issues/2194#issuecomment-1730258302 https://api.github.com/repos/simonw/datasette/issues/2194 IC_kwDOBm6k_c5nIaV- simonw 9599 2023-09-21T20:32:53Z 2023-09-21T20:33:02Z OWNER

Correct usage is now:

```bash
python tests/fixtures.py fixtures.db fixtures-config.json fixtures-metadata.json \
  plugins --extra-db-filename extra_database.db
```

```
Test tables written to fixtures.db
- metadata written to fixtures-metadata.json
- config written to fixtures-config.json
Wrote plugin: plugins/register_output_renderer.py
Wrote plugin: plugins/view_name.py
Wrote plugin: plugins/my_plugin.py
Wrote plugin: plugins/messages_output_renderer.py
Wrote plugin: plugins/sleep_sql_function.py
Wrote plugin: plugins/my_plugin_2.py
Test tables written to extra_database.db
```

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Deploy failing with "plugins/alternative_route.py: Not a directory" 1907695234  
1730256435 https://github.com/simonw/datasette/issues/2194#issuecomment-1730256435 https://api.github.com/repos/simonw/datasette/issues/2194 IC_kwDOBm6k_c5nIZ4z simonw 9599 2023-09-21T20:31:22Z 2023-09-21T20:31:31Z OWNER

New error: "Error: Metadata should end with .json"

https://github.com/simonw/datasette/actions/runs/6266720924/job/17018265851

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Deploy failing with "plugins/alternative_route.py: Not a directory" 1907695234  
1730250337 https://github.com/simonw/datasette/issues/2057#issuecomment-1730250337 https://api.github.com/repos/simonw/datasette/issues/2057 IC_kwDOBm6k_c5nIYZh simonw 9599 2023-09-21T20:26:12Z 2023-09-21T20:26:12Z OWNER

That does seem to fix the problem!

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
DeprecationWarning: pkg_resources is deprecated as an API 1662951875  
1730247545 https://github.com/simonw/datasette/issues/2057#issuecomment-1730247545 https://api.github.com/repos/simonw/datasette/issues/2057 IC_kwDOBm6k_c5nIXt5 simonw 9599 2023-09-21T20:23:47Z 2023-09-21T20:23:47Z OWNER

Hunch: https://pypi.org/project/importlib-metadata/ may help here.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
DeprecationWarning: pkg_resources is deprecated as an API 1662951875  
1730245204 https://github.com/simonw/datasette/issues/2194#issuecomment-1730245204 https://api.github.com/repos/simonw/datasette/issues/2194 IC_kwDOBm6k_c5nIXJU simonw 9599 2023-09-21T20:21:42Z 2023-09-21T20:21:42Z OWNER

I think I see the problem - it's from here: https://github.com/simonw/datasette/commit/b2ec8717c3619260a1b535eea20e618bf95aa30b#diff-5dbc88d6e5c3615caf10e32a9d6fc6ff683f5b5814948928cb84c3ab91c038b6L770

The config and metadata Click options are the wrong way round:

https://github.com/simonw/datasette/blob/80a9cd9620fddf2695d12d8386a91e7c6b145ef2/tests/fixtures.py#L785-L786

https://github.com/simonw/datasette/blob/80a9cd9620fddf2695d12d8386a91e7c6b145ef2/tests/fixtures.py#L801

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Deploy failing with "plugins/alternative_route.py: Not a directory" 1907695234  
1730242734 https://github.com/simonw/datasette/issues/2194#issuecomment-1730242734 https://api.github.com/repos/simonw/datasette/issues/2194 IC_kwDOBm6k_c5nIWiu simonw 9599 2023-09-21T20:19:29Z 2023-09-21T20:19:29Z OWNER

Maybe plugins/ does not exist? It should have been created by this line:

https://github.com/simonw/datasette/blob/80a9cd9620fddf2695d12d8386a91e7c6b145ef2/.github/workflows/deploy-latest.yml#L41-L42

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Deploy failing with "plugins/alternative_route.py: Not a directory" 1907695234  
1730241813 https://github.com/simonw/datasette/issues/2194#issuecomment-1730241813 https://api.github.com/repos/simonw/datasette/issues/2194 IC_kwDOBm6k_c5nIWUV simonw 9599 2023-09-21T20:18:40Z 2023-09-21T20:18:40Z OWNER

This looks to be the step that is failing:

https://github.com/simonw/datasette/blob/80a9cd9620fddf2695d12d8386a91e7c6b145ef2/.github/workflows/deploy-latest.yml#L50-L60

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Deploy failing with "plugins/alternative_route.py: Not a directory" 1907695234  
1730232308 https://github.com/simonw/datasette/issues/2189#issuecomment-1730232308 https://api.github.com/repos/simonw/datasette/issues/2189 IC_kwDOBm6k_c5nIT_0 simonw 9599 2023-09-21T20:11:16Z 2023-09-21T20:11:16Z OWNER

We're planning a breaking change in 1.0a7: - #2191

Since that's a breaking change I'm going to ship 1.0a7 right now with this fix, then ship that breaking change as 1.0a8 instead.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Server hang on parallel execution of queries to named in-memory databases 1901416155  
1730231404 https://github.com/simonw/datasette/issues/2189#issuecomment-1730231404 https://api.github.com/repos/simonw/datasette/issues/2189 IC_kwDOBm6k_c5nITxs simonw 9599 2023-09-21T20:10:28Z 2023-09-21T20:10:28Z OWNER

Release 0.64.4: https://docs.datasette.io/en/stable/changelog.html#v0-64-4

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Server hang on parallel execution of queries to named in-memory databases 1901416155  
1730226107 https://github.com/simonw/datasette/issues/2057#issuecomment-1730226107 https://api.github.com/repos/simonw/datasette/issues/2057 IC_kwDOBm6k_c5nISe7 simonw 9599 2023-09-21T20:06:19Z 2023-09-21T20:06:19Z OWNER

No that's not it actually, it's something else.

Got to this point:

```bash
DATASETTE_LOAD_PLUGINS=datasette-init python -i $(which datasette) plugins
```

That fails and drops me into a debugger:

```
  File "/Users/simon/Dropbox/Development/datasette/datasette/cli.py", line 186, in plugins
    app = Datasette([], plugins_dir=plugins_dir)
  File "/Users/simon/Dropbox/Development/datasette/datasette/app.py", line 405, in __init__
    for plugin in get_plugins()
  File "/Users/simon/Dropbox/Development/datasette/datasette/plugins.py", line 89, in get_plugins
    plugin_info["name"] = distinfo.name or distinfo.project_name
AttributeError: 'PathDistribution' object has no attribute 'name'
```

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
DeprecationWarning: pkg_resources is deprecated as an API 1662951875  
1730219703 https://github.com/simonw/datasette/issues/2057#issuecomment-1730219703 https://api.github.com/repos/simonw/datasette/issues/2057 IC_kwDOBm6k_c5nIQ63 simonw 9599 2023-09-21T20:01:54Z 2023-09-21T20:01:54Z OWNER

The problem is here:

```
 86         distinfo = plugin_to_distinfo.get(plugin)
 87         if distinfo is None:
 88             breakpoint()
 89  ->         assert False
 90         if distinfo.name is None:
 91             breakpoint()
 92             assert False
 93         if distinfo:
 94             plugin_info["version"] = distinfo.version
(Pdb) distinfo
(Pdb) plugin
<module 'datasette.sql_functions' from '/Users/simon/Dropbox/Development/datasette/datasette/sql_functions.py'>
```

That `plugin_to_distinfo` is missing some stuff.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
DeprecationWarning: pkg_resources is deprecated as an API 1662951875  
1730214654 https://github.com/simonw/datasette/issues/2057#issuecomment-1730214654 https://api.github.com/repos/simonw/datasette/issues/2057 IC_kwDOBm6k_c5nIPr- simonw 9599 2023-09-21T19:59:51Z 2023-09-21T19:59:51Z OWNER

So the problem is the get_plugins() function returning plugins with None for their name:

https://github.com/simonw/datasette/blob/80a9cd9620fddf2695d12d8386a91e7c6b145ef2/datasette/plugins.py#L61-L91

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
DeprecationWarning: pkg_resources is deprecated as an API 1662951875  
1730212597 https://github.com/simonw/datasette/issues/2057#issuecomment-1730212597 https://api.github.com/repos/simonw/datasette/issues/2057 IC_kwDOBm6k_c5nIPL1 simonw 9599 2023-09-21T19:58:38Z 2023-09-21T19:58:38Z OWNER

Relevant code: https://github.com/simonw/datasette/blob/80a9cd9620fddf2695d12d8386a91e7c6b145ef2/datasette/app.py#L1127-L1146

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
DeprecationWarning: pkg_resources is deprecated as an API 1662951875  
1730211445 https://github.com/simonw/datasette/issues/2057#issuecomment-1730211445 https://api.github.com/repos/simonw/datasette/issues/2057 IC_kwDOBm6k_c5nIO51 simonw 9599 2023-09-21T19:57:44Z 2023-09-21T19:57:44Z OWNER

In the debugger:

```
>>> import pdb
>>> pdb.pm()
> /Users/simon/Dropbox/Development/datasette/datasette/app.py(1136)_plugins()
-> ps.sort(key=lambda p: p["name"])
(Pdb) ps
[{'name': None, 'static_path': None, 'templates_path': None, 'hooks': ['prepare_connection', 'render_cell'], 'version': '1.0.1'}, {'name': None, 'static_path': None, 'templates_path': None, 'hooks': ['startup'], 'version': '0.2'}]
```

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
DeprecationWarning: pkg_resources is deprecated as an API 1662951875  
1730210728 https://github.com/simonw/datasette/issues/2057#issuecomment-1730210728 https://api.github.com/repos/simonw/datasette/issues/2057 IC_kwDOBm6k_c5nIOuo simonw 9599 2023-09-21T19:57:08Z 2023-09-21T19:57:08Z OWNER

In my Python 3.8 environment I ran:

```bash
datasette install datasette-init datasette-json-html
```

And now `datasette plugins` produces this error:

```
  File "/Users/simon/Dropbox/Development/datasette/datasette/cli.py", line 192, in plugins
    click.echo(json.dumps(app._plugins(all=all), indent=4))
  File "/Users/simon/Dropbox/Development/datasette/datasette/app.py", line 1136, in _plugins
    ps.sort(key=lambda p: p["name"])
TypeError: '<' not supported between instances of 'NoneType' and 'NoneType'
```
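A None-safe sort key sidesteps that `TypeError`; a minimal sketch (the plugin dicts are trimmed copies of the debugger output, and this particular fix is an assumption, not necessarily what landed in Datasette):

```python
# Plugin introspection can yield name=None on Python 3.8, and
# ps.sort(key=lambda p: p["name"]) fails because None is unorderable.
ps = [
    {"name": None, "version": "1.0.1"},
    {"name": "datasette-init", "version": "0.2"},
]

# A tuple key sorts named plugins alphabetically and unnamed ones last:
ps.sort(key=lambda p: (p["name"] is None, p["name"] or ""))
print([p["name"] for p in ps])  # ['datasette-init', None]
```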

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
DeprecationWarning: pkg_resources is deprecated as an API 1662951875  
1730208566 https://github.com/simonw/datasette/issues/2193#issuecomment-1730208566 https://api.github.com/repos/simonw/datasette/issues/2193 IC_kwDOBm6k_c5nIOM2 simonw 9599 2023-09-21T19:55:19Z 2023-09-21T19:55:19Z OWNER

Yes, the new script seems to work. On Python 3.11:

```
$ tests/test-datasette-load-plugins.sh
$ echo $?
0
```

On Python 3.8:

```
$ tests/test-datasette-load-plugins.sh
Test failed: datasette-json-html not found
$ echo $?
1
```

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
"Test DATASETTE_LOAD_PLUGINS" test shows errors but did not fail the CI run 1907655261  
1730206629 https://github.com/simonw/datasette/issues/2193#issuecomment-1730206629 https://api.github.com/repos/simonw/datasette/issues/2193 IC_kwDOBm6k_c5nINul simonw 9599 2023-09-21T19:53:39Z 2023-09-21T19:53:39Z OWNER

GPT-4 says:

In the script, you're using a subshell ( ... ) to group commands. If you exit 1 within the subshell, it will only exit the subshell and not the main script. This is why GitHub Actions does not see it as a failure.
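The subshell behaviour is easy to demonstrate in isolation (a standalone sketch, not the project's actual test script):

```shell
#!/bin/bash
# exit 1 inside ( ... ) terminates only the subshell, not this script
( echo "in subshell"; exit 1 )
echo "script is still running; subshell exit status was $?"
# The script itself still exits 0 unless that status is propagated explicitly.
```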

It suggested doing this instead:

```bash
#!/bin/bash

PLUGINS=$(datasette plugins)
if ! echo "$PLUGINS" | jq 'any(.[]; .name == "datasette-json-html")' | grep -q true; then
  echo "Test failed: datasette-json-html not found"
  exit 1
fi

PLUGINS2=$(DATASETTE_LOAD_PLUGINS=datasette-init datasette plugins)
if ! echo "$PLUGINS2" | jq 'any(.[]; .name == "datasette-json-html")' | grep -q false; then
  echo "Test failed: datasette-json-html should not have been loaded"
  exit 1
fi

if ! echo "$PLUGINS2" | jq 'any(.[]; .name == "datasette-init")' | grep -q true; then
  echo "Test failed: datasette-init should have been loaded"
  exit 1
fi

PLUGINS3=$(DATASETTE_LOAD_PLUGINS='' datasette plugins)
if ! echo "$PLUGINS3" | grep -q '\[\]'; then
  echo "Test failed: datasette plugins should have returned []"
  exit 1
fi
```

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
"Test DATASETTE_LOAD_PLUGINS" test shows errors but did not fail the CI run 1907655261  
1730203356 https://github.com/simonw/datasette/issues/2193#issuecomment-1730203356 https://api.github.com/repos/simonw/datasette/issues/2193 IC_kwDOBm6k_c5nIM7c simonw 9599 2023-09-21T19:51:04Z 2023-09-21T19:51:04Z OWNER

The script:

https://github.com/simonw/datasette/blob/b0d0a0e5de8bb5b9b6c253e8af451a532266bcf1/tests/test-datasette-load-plugins.sh#L1-L29

I'm not sure why those exit 1 lines did not cause a CI failure.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
"Test DATASETTE_LOAD_PLUGINS" test shows errors but did not fail the CI run 1907655261  
1730202533 https://github.com/simonw/datasette/issues/2193#issuecomment-1730202533 https://api.github.com/repos/simonw/datasette/issues/2193 IC_kwDOBm6k_c5nIMul simonw 9599 2023-09-21T19:50:22Z 2023-09-21T19:50:22Z OWNER

Here's the failure in CI, which did not cause the workflow to fail even though it should have:

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
"Test DATASETTE_LOAD_PLUGINS" test shows errors but did not fail the CI run 1907655261  
1730201226 https://github.com/simonw/datasette/issues/2057#issuecomment-1730201226 https://api.github.com/repos/simonw/datasette/issues/2057 IC_kwDOBm6k_c5nIMaK simonw 9599 2023-09-21T19:49:20Z 2023-09-21T19:49:20Z OWNER

That passed on 3.8 but should have failed: https://github.com/simonw/datasette/actions/runs/6266341481/job/17017099801 - the "Test DATASETTE_LOAD_PLUGINS" test shows errors but did not fail the CI run.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
DeprecationWarning: pkg_resources is deprecated as an API 1662951875  
1730188367 https://github.com/simonw/datasette/issues/2057#issuecomment-1730188367 https://api.github.com/repos/simonw/datasette/issues/2057 IC_kwDOBm6k_c5nIJRP simonw 9599 2023-09-21T19:38:28Z 2023-09-21T19:40:38Z OWNER

I'll imitate certbot:

https://github.com/certbot/certbot/blob/694c758db7fcd8410b5dadcd136c61b3eb028fdc/certbot-ci/setup.py#L9

```python
'importlib_resources>=1.3.1; python_version < "3.9"',
```

Looks like 1.3 is the minimum version needed for compatibility with the 3.9 standard library, according to https://github.com/python/importlib_resources/blob/main/README.rst#compatibility

https://github.com/certbot/certbot/blob/694c758db7fcd8410b5dadcd136c61b3eb028fdc/certbot/certbot/_internal/constants.py#L13C29-L16

```python
if sys.version_info >= (3, 9):  # pragma: no cover
    import importlib.resources as importlib_resources
else:  # pragma: no cover
    import importlib_resources
```

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
DeprecationWarning: pkg_resources is deprecated as an API 1662951875  
1730185322 https://github.com/simonw/datasette/issues/2057#issuecomment-1730185322 https://api.github.com/repos/simonw/datasette/issues/2057 IC_kwDOBm6k_c5nIIhq simonw 9599 2023-09-21T19:35:49Z 2023-09-21T19:35:49Z OWNER

I think I can fix this using https://importlib-resources.readthedocs.io/en/latest/using.html - maybe as a dependency only installed if the Python version is less than 3.9.
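The version-gated import being proposed might look like this (a sketch; `importlib_resources` is the PyPI backport name, only needed below Python 3.9, and the `"email"` package is just a stdlib stand-in for a plugin module):

```python
import sys

if sys.version_info >= (3, 9):
    # files() landed in the standard library in Python 3.9
    import importlib.resources as resources
else:  # pragma: no cover - needs the importlib_resources backport
    import importlib_resources as resources

# Either way, files() returns a Traversable for a package, which is how
# Datasette-style code can check whether a plugin ships a static/ directory:
print((resources.files("email") / "static").is_dir())  # False for stdlib email
```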

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
DeprecationWarning: pkg_resources is deprecated as an API 1662951875  
1730183405 https://github.com/simonw/datasette/issues/2057#issuecomment-1730183405 https://api.github.com/repos/simonw/datasette/issues/2057 IC_kwDOBm6k_c5nIIDt simonw 9599 2023-09-21T19:34:09Z 2023-09-21T19:34:09Z OWNER

Confirmed: https://docs.python.org/3/library/importlib.resources.html#importlib.resources.files

> `importlib.resources.files(package)` [...] New in version 3.9.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
DeprecationWarning: pkg_resources is deprecated as an API 1662951875  
1730171241 https://github.com/simonw/datasette/issues/2057#issuecomment-1730171241 https://api.github.com/repos/simonw/datasette/issues/2057 IC_kwDOBm6k_c5nIFFp simonw 9599 2023-09-21T19:27:25Z 2023-09-21T19:27:25Z OWNER

This broke in Python 3.8:

```
    if plugin.__name__ not in DEFAULT_PLUGINS:
        try:
            if (importlib.resources.files(plugin.__name__) / "static").is_dir():
E   AttributeError: module 'importlib.resources' has no attribute 'files'
```

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
DeprecationWarning: pkg_resources is deprecated as an API 1662951875  
1730162283 https://github.com/simonw/datasette/issues/2189#issuecomment-1730162283 https://api.github.com/repos/simonw/datasette/issues/2189 IC_kwDOBm6k_c5nIC5r simonw 9599 2023-09-21T19:19:47Z 2023-09-21T19:19:47Z OWNER

I'm going to release this in 1.0a7, and I'll backport it to a 0.64.4 release too.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Server hang on parallel execution of queries to named in-memory databases 1901416155  
1729961503 https://github.com/simonw/datasette/pull/2190#issuecomment-1729961503 https://api.github.com/repos/simonw/datasette/issues/2190 IC_kwDOBm6k_c5nHR4f asg017 15178711 2023-09-21T16:56:57Z 2023-09-21T16:56:57Z CONTRIBUTOR

TODO: add similar checks for permissions/allow/canned queries

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Raise an exception if a "plugins" block exists in metadata.json 1901483874  
1728504633 https://github.com/simonw/datasette/pull/2191#issuecomment-1728504633 https://api.github.com/repos/simonw/datasette/issues/2191 IC_kwDOBm6k_c5nBuM5 simonw 9599 2023-09-20T22:24:51Z 2023-09-20T22:25:16Z OWNER

The `{"units": {"distance": "m", "frequency": "Hz"}}` bit is for the units feature which I've half-disabled already and would like to remove before 1.0, though ideally turning that functionality into a plugin instead (if I can figure out how to do that).

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
DRAFT: Move `permissions` and `allow` blocks out of `metadata.yaml` and into `datasette.yaml` 1901768721  
1728503623 https://github.com/simonw/datasette/pull/2191#issuecomment-1728503623 https://api.github.com/repos/simonw/datasette/issues/2191 IC_kwDOBm6k_c5nBt9H simonw 9599 2023-09-20T22:23:33Z 2023-09-20T22:24:10Z OWNER

This is one of the most interesting illustrative examples in the new code:

https://github.com/simonw/datasette/blob/f7bdedff779606466b580d8528e5a44509291002/tests/fixtures.py#L301-L349

Interesting to note that it now has canned queries in it, which include this bit:

https://github.com/simonw/datasette/blob/f7bdedff779606466b580d8528e5a44509291002/tests/fixtures.py#L341-L342

It looks like metadata, but in this case it's configuration. That blur between metadata and configuration at the canned query level still feels a little bit odd to me, but I still think we're going in the right direction with it.

Also interesting, from that same file:

https://github.com/simonw/datasette/blob/f7bdedff779606466b580d8528e5a44509291002/tests/fixtures.py#L351-L399

There are a few things in that metadata block that are arguably configuration, not metadata - for example:

https://github.com/simonw/datasette/blob/f7bdedff779606466b580d8528e5a44509291002/tests/fixtures.py#L360

I think `extra_css_urls` is definitely configuration, not metadata.

https://github.com/simonw/datasette/blob/f7bdedff779606466b580d8528e5a44509291002/tests/fixtures.py#L369-L395

Most of that stuff is arguably configuration too, with the exception of the `roadside_attractions.columns` bit which is metadata about those columns.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
DRAFT: Move `permissions` and `allow` blocks out of `metadata.yaml` and into `datasette.yaml` 1901768721  
1728498221 https://github.com/simonw/datasette/pull/2191#issuecomment-1728498221 https://api.github.com/repos/simonw/datasette/issues/2191 IC_kwDOBm6k_c5nBsot simonw 9599 2023-09-20T22:17:26Z 2023-09-20T22:17:26Z OWNER

I tested this locally for permissions like this. `datasette.yml`:

```yaml
databases:
  content:
    allow:
      id: root
```

Started Datasette like this:

```bash
datasette --root content.db pottery2.db -c datasette.yml
```

As root I could see this (note the padlock):

http://127.0.0.1:8001/-/metadata returned `{}` showing that the permissions must have come from the config file instead.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
DRAFT: Move `permissions` and `allow` blocks out of `metadata.yaml` and into `datasette.yaml` 1901768721  
1728192688 https://github.com/simonw/datasette/issues/2189#issuecomment-1728192688 https://api.github.com/repos/simonw/datasette/issues/2189 IC_kwDOBm6k_c5nAiCw yozlet 173848 2023-09-20T17:53:31Z 2023-09-20T17:53:31Z NONE

/me munches popcorn at a furious rate, utterly enthralled

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Server hang on parallel execution of queries to named in-memory databases 1901416155  
1726754119 https://github.com/simonw/datasette/pull/2192#issuecomment-1726754119 https://api.github.com/repos/simonw/datasette/issues/2192 IC_kwDOBm6k_c5m7C1H codecov[bot] 22429695 2023-09-20T01:35:45Z 2023-09-20T01:35:45Z NONE

Codecov Report

Patch coverage: 100.00% and project coverage change: +0.02% :tada:

Comparison is base (6ed7908) 92.69% compared to head (4e6a341) 92.72%.

Additional details and impacted files

```diff
@@            Coverage Diff             @@
##             main    #2192      +/-   ##
==========================================
+ Coverage   92.69%   92.72%   +0.02%
==========================================
  Files          40       40
  Lines        6039     6036       -3
==========================================
- Hits         5598     5597       -1
+ Misses        441      439       -2
```

| [Files Changed](https://app.codecov.io/gh/simonw/datasette/pull/2192?src=pr&el=tree&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison) | Coverage Δ | |
|---|---|---|
| [datasette/views/table.py](https://app.codecov.io/gh/simonw/datasette/pull/2192?src=pr&el=tree&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison#diff-ZGF0YXNldHRlL3ZpZXdzL3RhYmxlLnB5) | `96.31% <100.00%> (+0.51%)` | :arrow_up: |

... and [1 file with indirect coverage changes](https://app.codecov.io/gh/simonw/datasette/pull/2192/indirect-changes?src=pr&el=tree-more&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison)

:umbrella: View full report in Codecov by Sentry.
:loudspeaker: Have feedback on the report? Share it here.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Stop using parallel SQL queries for tables 1903932086  
1726749355 https://github.com/simonw/datasette/issues/2189#issuecomment-1726749355 https://api.github.com/repos/simonw/datasette/issues/2189 IC_kwDOBm6k_c5m7Bqr simonw 9599 2023-09-20T01:28:16Z 2023-09-20T01:28:16Z OWNER

Added a note to that example in the documentation: https://github.com/simonw/datasette/blob/4e6a34179eaedec44c1263275d7592fd83d7e2ac/docs/internals.rst?plain=1#L1320

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Server hang on parallel execution of queries to named in-memory databases 1901416155  
1719468727 https://github.com/simonw/datasette/pull/2185#issuecomment-1719468727 https://api.github.com/repos/simonw/datasette/issues/2185 IC_kwDOBm6k_c5mfQK3 codecov[bot] 22429695 2023-09-14T13:36:07Z 2023-09-19T13:42:35Z NONE

Codecov Report

Patch coverage has no change and project coverage change: -0.04% :warning:

Comparison is base (6ed7908) 92.69% compared to head (fe5f881) 92.66%.

Additional details and impacted files

```diff
@@            Coverage Diff             @@
##             main    #2185      +/-   ##
==========================================
- Coverage   92.69%   92.66%   -0.04%
==========================================
  Files          40       40
  Lines        6039     6039
==========================================
- Hits         5598     5596       -2
- Misses        441      443       +2
```

[see 1 file with indirect coverage changes](https://app.codecov.io/gh/simonw/datasette/pull/2185/indirect-changes?src=pr&el=tree-more&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison)

:umbrella: View full report in Codecov by Sentry.
:loudspeaker: Have feedback on the report? Share it here.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Bump the python-packages group with 3 updates 1896578249  
1724480716 https://github.com/simonw/datasette/pull/2191#issuecomment-1724480716 https://api.github.com/repos/simonw/datasette/issues/2191 IC_kwDOBm6k_c5myXzM codecov[bot] 22429695 2023-09-18T21:28:36Z 2023-09-18T21:28:36Z NONE

Codecov Report

Patch coverage: 100.00% and project coverage change: -0.05% :warning:

Comparison is base (6ed7908) 92.69% compared to head (f7bdedf) 92.64%.

Additional details and impacted files

```diff
@@            Coverage Diff             @@
##             main    #2191      +/-   ##
==========================================
- Coverage   92.69%   92.64%   -0.05%
==========================================
  Files          40       40
  Lines        6039     6040       +1
==========================================
- Hits         5598     5596       -2
- Misses        441      444       +3
```

| [Files Changed](https://app.codecov.io/gh/simonw/datasette/pull/2191?src=pr&el=tree&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison) | Coverage Δ | |
|---|---|---|
| [datasette/app.py](https://app.codecov.io/gh/simonw/datasette/pull/2191?src=pr&el=tree&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison#diff-ZGF0YXNldHRlL2FwcC5weQ==) | `94.09% <100.00%> (-0.11%)` | :arrow_down: |
| [datasette/default\_permissions.py](https://app.codecov.io/gh/simonw/datasette/pull/2191?src=pr&el=tree&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison#diff-ZGF0YXNldHRlL2RlZmF1bHRfcGVybWlzc2lvbnMucHk=) | `97.36% <100.00%> (+0.01%)` | :arrow_up: |

... and [1 file with indirect coverage changes](https://app.codecov.io/gh/simonw/datasette/pull/2191/indirect-changes?src=pr&el=tree-more&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison)

:umbrella: View full report in Codecov by Sentry.
:loudspeaker: Have feedback on the report? Share it here.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
DRAFT: Move `permissions` and `allow` blocks out of `metadata.yaml` and into `datasette.yaml` 1901768721  
1724325068 https://github.com/simonw/datasette/issues/2189#issuecomment-1724325068 https://api.github.com/repos/simonw/datasette/issues/2189 IC_kwDOBm6k_c5mxxzM simonw 9599 2023-09-18T20:29:41Z 2023-09-18T20:29:41Z OWNER

The one other thing affected by this change is this documentation, which suggests a not-actually-safe pattern: https://github.com/simonw/datasette/blob/6ed7908580fa2ba9297c3225d85c56f8b08b9937/docs/internals.rst#L1292-L1321

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Server hang on parallel execution of queries to named in-memory databases 1901416155  
1724317367 https://github.com/simonw/datasette/issues/2189#issuecomment-1724317367 https://api.github.com/repos/simonw/datasette/issues/2189 IC_kwDOBm6k_c5mxv63 simonw 9599 2023-09-18T20:25:44Z 2023-09-18T20:25:44Z OWNER

My current hunch is that SQLite gets unhappy if multiple threads access the same underlying C object - which sometimes happens with in-memory connections and Datasette presumably because they are faster than file-backed databases.

I'm going to remove the asyncio.gather() code from the table view. I'll ship a 0.x release with that fix too.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Server hang on parallel execution of queries to named in-memory databases 1901416155  
1724315591 https://github.com/simonw/datasette/issues/2189#issuecomment-1724315591 https://api.github.com/repos/simonw/datasette/issues/2189 IC_kwDOBm6k_c5mxvfH simonw 9599 2023-09-18T20:24:30Z 2023-09-18T20:24:30Z OWNER

Using SQLite In Multi-Threaded Applications

That indicates that there's a SQLite option for "Serialized" mode where it's safe to access anything SQLite provides from multiple threads, but as far as I can tell Python doesn't give you an option to turn that mode on or off for a connection - you can read sqlite3.threadsafety to see if that mode was compiled in or not, but not actually change it.

On my Mac sqlite3.threadsafety returns 1 which means https://docs.python.org/3/library/sqlite3.html#sqlite3.threadsafety "Multi-thread: In this mode, SQLite can be safely used by multiple threads provided that no single database connection is used simultaneously in two or more threads." - it would need to return 3 for that serialized mode.
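Checking the mode from Python is a one-liner (note: before Python 3.11 `sqlite3.threadsafety` was hardcoded to 1 rather than read from the SQLite build, so the value can be misleading on older interpreters):

```python
import sqlite3

# 0 = single-thread, 1 = multi-thread, 3 = serialized (per PEP 249)
print(sqlite3.threadsafety)

if sqlite3.threadsafety == 3:
    print("Serialized: a single connection may be shared across threads")
else:
    print("Connections must not be used by two threads simultaneously")
```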

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Server hang on parallel execution of queries to named in-memory databases 1901416155  
1724305169 https://github.com/simonw/datasette/issues/2189#issuecomment-1724305169 https://api.github.com/repos/simonw/datasette/issues/2189 IC_kwDOBm6k_c5mxs8R simonw 9599 2023-09-18T20:16:22Z 2023-09-18T20:16:36Z OWNER

Looking again at this code:

https://github.com/simonw/datasette/blob/6ed7908580fa2ba9297c3225d85c56f8b08b9937/datasette/database.py#L87-L117

`check_same_thread=False` really stands out here.

Python docs at https://docs.python.org/3/library/sqlite3.html

> *check_same_thread* (bool) -- If True (default), `ProgrammingError` will be raised if the database connection is used by a thread other than the one that created it. If False, the connection may be accessed in multiple threads; write operations may need to be serialized by the user to avoid data corruption. See threadsafety for more information.

I think I'm playing with fire by allowing multiple threads to access the same connection without doing my own serialization of those requests.

I do do that for the write connection - and in this particular case the bug isn't coming from write queries, it's coming from read queries - but perhaps SQLite has issues with threading for reads, too.
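Serializing access by hand - the responsibility `check_same_thread=False` shifts onto the caller - can be as simple as a lock around every use of the shared connection. A minimal sketch (not Datasette's actual implementation, which routes queries through dedicated threads):

```python
import sqlite3
import threading


class SerializedConnection:
    """Share one SQLite connection across threads, one query at a time."""

    def __init__(self, path):
        # check_same_thread=False disables sqlite3's own thread check,
        # so the lock below has to provide the serialization instead.
        self._conn = sqlite3.connect(path, check_same_thread=False)
        self._lock = threading.Lock()

    def execute(self, sql, params=()):
        with self._lock:
            return self._conn.execute(sql, params).fetchall()


conn = SerializedConnection(":memory:")
conn.execute("CREATE TABLE t (n INTEGER)")
conn.execute("INSERT INTO t VALUES (1), (2)")
print(conn.execute("SELECT count(*) FROM t"))  # [(2,)]
```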

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Server hang on parallel execution of queries to named in-memory databases 1901416155  
1724298817 https://github.com/simonw/datasette/issues/2189#issuecomment-1724298817 https://api.github.com/repos/simonw/datasette/issues/2189 IC_kwDOBm6k_c5mxrZB simonw 9599 2023-09-18T20:11:26Z 2023-09-18T20:11:26Z OWNER

Now that I've confirmed that parallel query execution of the kind introduced in https://github.com/simonw/datasette/commit/942411ef946e9a34a2094944d3423cddad27efd3 can cause hangs (presumably some kind of locking issue) against in-memory databases, some options:

  1. Disable parallel execution entirely and rip out related code.
  2. Disable parallel execution entirely by leaving that code but having it always behave as if _noparallel=1
  3. Continue digging and try and find some way to avoid this problem

The parallel execution work is something I was playing with last year in the hope of speeding up Datasette pages like the table page which need to execute a bunch of queries - one for each facet, plus one for each column to see if it should be suggested as a facet.

I wrote about this at the time here: https://simonwillison.net/2022/May/6/weeknotes/

My hope was that despite Python's GIL this optimization would still help, because the SQLite C module releases the GIL once it gets to SQLite.

But... that didn't hold up. It looked like enough work was happening in Python land with the GIL that the optimization didn't improve things.

Running the nogil fork of Python DID improve things though! I left the code in partly in the hope that the nogil fork would be accepted into Python core.

... which it now has! But it will still be a year or two before it fully lands: https://discuss.python.org/t/a-steering-council-notice-about-pep-703-making-the-global-interpreter-lock-optional-in-cpython/30474

So I'm not particularly concerned about dropping the parallel execution. If I do drop it though do I leave the potentially complex code in that relates to it?
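The optimization under discussion - gathering the per-facet and per-column queries concurrently - follows this rough shape (a sketch with made-up queries; unlike the shared-connection setup that triggers the hang, it opens a fresh connection per thread):

```python
import asyncio
import sqlite3


async def run_query(db_path: str, sql: str):
    # The sqlite3 C module releases the GIL while SQLite executes,
    # which is why running queries in threads looked worthwhile.
    def _inner():
        conn = sqlite3.connect(db_path)
        try:
            return conn.execute(sql).fetchall()
        finally:
            conn.close()

    return await asyncio.to_thread(_inner)


async def main():
    # One query per facet/column, fired in parallel (hypothetical SQL):
    return await asyncio.gather(
        run_query(":memory:", "SELECT 1"),
        run_query(":memory:", "SELECT 2"),
    )


print(asyncio.run(main()))  # [[(1,)], [(2,)]]
```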

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Server hang on parallel execution of queries to named in-memory databases 1901416155  
1724281824 https://github.com/simonw/datasette/issues/2189#issuecomment-1724281824 https://api.github.com/repos/simonw/datasette/issues/2189 IC_kwDOBm6k_c5mxnPg simonw 9599 2023-09-18T19:58:06Z 2023-09-18T19:58:06Z OWNER

I also confirmed that http://127.0.0.1:8064/airtable_refs/airtable_refs?_noparallel=1 does not trigger the bug but http://127.0.0.1:8064/airtable_refs/airtable_refs does.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Server hang on parallel execution of queries to named in-memory databases 1901416155  
1724278386 https://github.com/simonw/datasette/issues/2189#issuecomment-1724278386 https://api.github.com/repos/simonw/datasette/issues/2189 IC_kwDOBm6k_c5mxmZy simonw 9599 2023-09-18T19:55:32Z 2023-09-18T19:55:32Z OWNER

OK it looks like it found it!

OK it looks like it found it!

```
942411ef946e9a34a2094944d3423cddad27efd3 is the first bad commit
commit 942411ef946e9a34a2094944d3423cddad27efd3
Author: Simon Willison <swillison@gmail.com>
Date:   Tue Apr 26 15:48:56 2022 -0700

    Execute some TableView queries in parallel

    Use ?_noparallel=1 to opt out (undocumented, useful for benchmark comparisons)

    Refs #1723, #1715

 datasette/views/table.py | 93 ++++++++++++++++++++++++++++++++++--------------
 1 file changed, 67 insertions(+), 26 deletions(-)
bisect found first bad commit
```

https://github.com/simonw/datasette/commit/942411ef946e9a34a2094944d3423cddad27efd3 does look like the cause of this problem.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Server hang on parallel execution of queries to named in-memory databases 1901416155  
1724276917 https://github.com/simonw/datasette/issues/2189#issuecomment-1724276917 https://api.github.com/repos/simonw/datasette/issues/2189 IC_kwDOBm6k_c5mxmC1 simonw 9599 2023-09-18T19:54:23Z 2023-09-18T19:54:23Z OWNER

Turned out I wasn't running the datasette from the current directory, so it was not testing what I intended.

Fixed that with `pip install -e .` in the datasette/ directory.

Now I'm seeing some passes, which look like this:

```
running '../airtable-export/testit.sh'
INFO:     Started server process [77810]
INFO:     Waiting for application startup.
INFO:     Application startup complete.
INFO:     Uvicorn running on http://127.0.0.1:8064 (Press CTRL+C to quit)
  Running curls
INFO:     127.0.0.1:59439 - "GET /airtable_refs/airtable_refs HTTP/1.1" 200 OK
INFO:     127.0.0.1:59440 - "GET /airtable_refs/airtable_refs HTTP/1.1" 200 OK
INFO:     127.0.0.1:59441 - "GET /airtable_refs/airtable_refs HTTP/1.1" 200 OK
  All curl succeeded
Killing datasette server with PID 77810
../airtable-export/testit.sh: line 54: 77810 Killed: 9   datasette pottery2.db -p $port
All three curls succeeded.
Bisecting: 4 revisions left to test after this (roughly 2 steps)
[7463b051cf8d7f856df5eba9f7aa944183ebabe5] Cosmetic tweaks after blacken-docs, refs #1718
running '../airtable-export/testit.sh'
INFO:     Started server process [77826]
INFO:     Waiting for application startup.
INFO:     Application startup complete.
INFO:     Uvicorn running on http://127.0.0.1:8064 (Press CTRL+C to quit)
  Running curls
```

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Server hang on parallel execution of queries to named in-memory databases 1901416155  
1724257290 https://github.com/simonw/datasette/issues/2189#issuecomment-1724257290 https://api.github.com/repos/simonw/datasette/issues/2189 IC_kwDOBm6k_c5mxhQK simonw 9599 2023-09-18T19:39:27Z 2023-09-18T19:44:26Z OWNER

I'm now trying this test script:

```bash
#!/bin/bash

port=8064

# Start datasette server in the background and get its PID
datasette pottery2.db -p $port &
server_pid=$!

# Wait for a moment to ensure the server has time to start up
sleep 2

# Initialize counters and parameters
retry_count=0
max_retries=3
success_count=0
path="/airtable_refs/airtable_refs"

# Function to run curl with a timeout
function test_curl {
    # Run the curl command with a timeout of 3 seconds
    timeout 3s curl -s "http://localhost:${port}${path}" > /dev/null
    if [ $? -eq 0 ]; then
        # Curl was successful
        ((success_count++))
    fi
}

# Try three parallel curl requests
while [[ $retry_count -lt $max_retries ]]; do
    # Reset the success counter
    success_count=0

    # Run the curls in parallel
    echo "  Running curls"
    test_curl
    test_curl
    test_curl #  & test_curl & test_curl &

    # Wait for all curls to finish
    #wait

    # Check the success count
    if [[ $success_count -eq 3 ]]; then
        # All curls succeeded, break out of the loop
        echo "  All curl succeeded"
        break
    fi

    ((retry_count++))
done

# Kill the datasette server
echo "Killing datasette server with PID $server_pid"
kill -9 $server_pid
sleep 2

# Print result
if [[ $success_count -eq 3 ]]; then
    echo "All three curls succeeded."
    exit 0
else
    echo "Error: Not all curls succeeded after $retry_count attempts."
    exit 1
fi
```

I run it like this:

```bash
git bisect reset
git bisect start
git bisect good 0.59.4
git bisect bad 1.0a6
git bisect run ../airtable-export/testit.sh
```

But... it's not having the desired result, I think because the bug is intermittent so each time I run it the bisect spits out a different commit as the one that is to blame.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Server hang on parallel execution of queries to named in-memory databases 1901416155  
1724263390 https://github.com/simonw/datasette/issues/2189#issuecomment-1724263390 https://api.github.com/repos/simonw/datasette/issues/2189 IC_kwDOBm6k_c5mxive simonw 9599 2023-09-18T19:44:03Z 2023-09-18T19:44:03Z OWNER

I knocked it down to 1 retry just to see what happened.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Server hang on parallel execution of queries to named in-memory databases 1901416155  
1724259229 https://github.com/simonw/datasette/issues/2189#issuecomment-1724259229 https://api.github.com/repos/simonw/datasette/issues/2189 IC_kwDOBm6k_c5mxhud simonw 9599 2023-09-18T19:40:56Z 2023-09-18T19:40:56Z OWNER

I tried it with a path of `/` and everything passed - so it's definitely the path of `/airtable_refs/airtable_refs` (an in-memory database created by an experimental branch of https://github.com/simonw/airtable-export) that triggers the problem.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Server hang on parallel execution of queries to named in-memory databases 1901416155  
1724258279 https://github.com/simonw/datasette/issues/2189#issuecomment-1724258279 https://api.github.com/repos/simonw/datasette/issues/2189 IC_kwDOBm6k_c5mxhfn simonw 9599 2023-09-18T19:40:13Z 2023-09-18T19:40:13Z OWNER

Output while it is running looks like this:

```
running '../airtable-export/testit.sh'
INFO:     Started server process [75649]
INFO:     Waiting for application startup.
INFO:     Application startup complete.
INFO:     Uvicorn running on http://127.0.0.1:8064 (Press CTRL+C to quit)
  Running curls
  Running curls
  Running curls
Killing datasette server with PID 75649
../airtable-export/testit.sh: line 54: 75649 Killed: 9   datasette pottery2.db -p $port
Error: Not all curls succeeded after 3 attempts.
Bisecting: 155 revisions left to test after this (roughly 7 steps)
[247e460e08bf823142f7b84058fe44e43626787f] Update beautifulsoup4 requirement (#1703)
running '../airtable-export/testit.sh'
INFO:     Started server process [75722]
INFO:     Waiting for application startup.
INFO:     Application startup complete.
INFO:     Uvicorn running on http://127.0.0.1:8064 (Press CTRL+C to quit)
  Running curls
  Running curls
  Running curls
Killing datasette server with PID 75722
../airtable-export/testit.sh: line 54: 75722 Killed: 9   datasette pottery2.db -p $port
Error: Not all curls succeeded after 3 attempts.
Bisecting: 77 revisions left to test after this (roughly 6 steps)
[3ef47a0896c7e63404a34e465b7160c80eaa571d] Link rel=alternate header for tables and rows
running '../airtable-export/testit.sh'
INFO:     Started server process [75818]
INFO:     Waiting for application startup.
INFO:     Application startup complete.
INFO:     Uvicorn running on http://127.0.0.1:8064 (Press CTRL+C to quit)
  Running curls
```

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Server hang on parallel execution of queries to named in-memory databases 1901416155  
1724169693 https://github.com/simonw/datasette/pull/2190#issuecomment-1724169693 https://api.github.com/repos/simonw/datasette/issues/2190 IC_kwDOBm6k_c5mxL3d codecov[bot] 22429695 2023-09-18T18:39:19Z 2023-09-18T18:39:19Z NONE

Codecov Report

Patch coverage: 100.00% and project coverage change: -0.03% :warning:

Comparison is base (6ed7908) 92.69% compared to head (fc7dbe0) 92.67%.

Additional details and impacted files

```diff
@@            Coverage Diff             @@
##             main    #2190      +/-   ##
==========================================
- Coverage   92.69%   92.67%   -0.03%
==========================================
  Files          40       40
  Lines        6039     6044       +5
==========================================
+ Hits         5598     5601       +3
- Misses        441      443       +2
```

| [Files Changed](https://app.codecov.io/gh/simonw/datasette/pull/2190?src=pr&el=tree&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison) | Coverage Δ | |
|---|---|---|
| [datasette/app.py](https://app.codecov.io/gh/simonw/datasette/pull/2190?src=pr&el=tree&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison#diff-ZGF0YXNldHRlL2FwcC5weQ==) | `94.19% <100.00%> (ø)` | |
| [datasette/cli.py](https://app.codecov.io/gh/simonw/datasette/pull/2190?src=pr&el=tree&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison#diff-ZGF0YXNldHRlL2NsaS5weQ==) | `82.53% <100.00%> (ø)` | |
| [datasette/utils/\_\_init\_\_.py](https://app.codecov.io/gh/simonw/datasette/pull/2190?src=pr&el=tree&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison#diff-ZGF0YXNldHRlL3V0aWxzL19faW5pdF9fLnB5) | `94.83% <100.00%> (+0.03%)` | :arrow_up: |

... and [1 file with indirect coverage changes](https://app.codecov.io/gh/simonw/datasette/pull/2190/indirect-changes?src=pr&el=tree-more&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison)

:umbrella: View full report in Codecov by Sentry.
:loudspeaker: Have feedback on the report? Share it here.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Raise an exception if a "plugins" block exists in metadata.json 1901483874  
1724159882 https://github.com/simonw/datasette/issues/2189#issuecomment-1724159882 https://api.github.com/repos/simonw/datasette/issues/2189 IC_kwDOBm6k_c5mxJeK simonw 9599 2023-09-18T18:32:29Z 2023-09-18T18:32:29Z OWNER

This worked, including on macOS even though GPT-4 thought timeout would not work there: https://chat.openai.com/share/cc4628e9-5240-4f35-b640-16a9c178b315

```bash
#!/bin/bash

# Run the command with a timeout of 5 seconds
timeout 5s datasette pottery2.db -p 8045 --get /airtable_refs/airtable_refs

# Check the exit code from timeout
if [ $? -eq 124 ]; then
    echo "Error: Command timed out after 5 seconds."
    exit 1
fi
```
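The same deadline check can be sketched cross-platform with Python's stdlib instead of the `timeout` binary; the command below is a harmless placeholder standing in for the `datasette pottery2.db ... --get ...` invocation:

```python
import subprocess
import sys

# Sketch: run a command and treat exceeding the deadline as a failure,
# mirroring the `timeout 5s ...; [ $? -eq 124 ]` pattern in the bash script.
# subprocess.run() kills the child process when the timeout expires.
def run_with_deadline(argv, seconds):
    try:
        subprocess.run(argv, timeout=seconds)
        return True
    except subprocess.TimeoutExpired:
        print(f"Error: Command timed out after {seconds} seconds.", file=sys.stderr)
        return False

# Placeholder command; a real repro would pass the datasette argv here.
ok = run_with_deadline([sys.executable, "-c", "print('done')"], 5)
```

This avoids relying on GNU coreutils `timeout` being present on macOS.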

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Server hang on parallel execution of queries to named in-memory databases 1901416155  
1724157182 https://github.com/simonw/datasette/issues/2189#issuecomment-1724157182 https://api.github.com/repos/simonw/datasette/issues/2189 IC_kwDOBm6k_c5mxIz- simonw 9599 2023-09-18T18:30:30Z 2023-09-18T18:30:30Z OWNER

OK, I can trigger the bug like this:

```bash
datasette pottery2.db -p 8045 --get /airtable_refs/airtable_refs
```

Can I write a bash script that fails (and terminates the process) if it takes longer than X seconds?

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Server hang on parallel execution of queries to named in-memory databases 1901416155  
1724089666 https://github.com/simonw/datasette/issues/2189#issuecomment-1724089666 https://api.github.com/repos/simonw/datasette/issues/2189 IC_kwDOBm6k_c5mw4VC simonw 9599 2023-09-18T17:49:24Z 2023-09-18T17:49:24Z OWNER

I switched that particular implementation to using an on-disk database instead of an in-memory database and could no longer recreate the bug.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Server hang on parallel execution of queries to named in-memory databases 1901416155  
1724084199 https://github.com/simonw/datasette/issues/2189#issuecomment-1724084199 https://api.github.com/repos/simonw/datasette/issues/2189 IC_kwDOBm6k_c5mw2_n simonw 9599 2023-09-18T17:47:01Z 2023-09-18T17:47:01Z OWNER

I managed to trigger it by loading http://127.0.0.1:8045/airtable_refs/airtable_refs - which worked - and then hitting refresh on that page a bunch of times until it hung.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Server hang on parallel execution of queries to named in-memory databases 1901416155  
1724083324 https://github.com/simonw/datasette/issues/2189#issuecomment-1724083324 https://api.github.com/repos/simonw/datasette/issues/2189 IC_kwDOBm6k_c5mw2x8 simonw 9599 2023-09-18T17:46:21Z 2023-09-18T17:46:21Z OWNER

Sometimes it takes a few clicks for the bug to occur, but it does seem to always be within the in-memory database.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Server hang on parallel execution of queries to named in-memory databases 1901416155  
1724081909 https://github.com/simonw/datasette/issues/2189#issuecomment-1724081909 https://api.github.com/repos/simonw/datasette/issues/2189 IC_kwDOBm6k_c5mw2b1 simonw 9599 2023-09-18T17:45:27Z 2023-09-18T17:45:27Z OWNER

Maybe it's not related to faceting - I just got it on a hit to http://127.0.0.1:8045/airtable_refs/airtable_refs instead.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Server hang on parallel execution of queries to named in-memory databases 1901416155  
1724072390 https://github.com/simonw/datasette/issues/2189#issuecomment-1724072390 https://api.github.com/repos/simonw/datasette/issues/2189 IC_kwDOBm6k_c5mw0HG simonw 9599 2023-09-18T17:39:06Z 2023-09-18T17:39:06Z OWNER

Landing a version of that test anyway.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Server hang on parallel execution of queries to named in-memory databases 1901416155  
1724064440 https://github.com/simonw/datasette/issues/2189#issuecomment-1724064440 https://api.github.com/repos/simonw/datasette/issues/2189 IC_kwDOBm6k_c5mwyK4 simonw 9599 2023-09-18T17:36:00Z 2023-09-18T17:36:00Z OWNER

I wrote this test, but it passes:

```python
@pytest.mark.asyncio
async def test_facet_against_in_memory_database():
    ds = Datasette()
    db = ds.add_memory_database("mem")
    await db.execute_write("create table t (id integer primary key, name text)")
    await db.execute_write_many(
        "insert into t (name) values (?)", [["one"], ["one"], ["two"]]
    )
    response1 = await ds.client.get("/mem/t.json")
    assert response1.status_code == 200
    response2 = await ds.client.get("/mem/t.json?_facet=name")
    assert response2.status_code == 200
    assert response2.json() == {
        "ok": True,
        "next": None,
        "facet_results": {
            "results": {
                "name": {
                    "name": "name",
                    "type": "column",
                    "hideable": True,
                    "toggle_url": "/mem/t.json",
                    "results": [
                        {
                            "value": "one",
                            "label": "one",
                            "count": 2,
                            "toggle_url": "http://localhost/mem/t.json?_facet=name&name=one",
                            "selected": False,
                        },
                        {
                            "value": "two",
                            "label": "two",
                            "count": 1,
                            "toggle_url": "http://localhost/mem/t.json?_facet=name&name=two",
                            "selected": False,
                        },
                    ],
                    "truncated": False,
                }
            },
            "timed_out": [],
        },
        "rows": [
            {"id": 1, "name": "one"},
            {"id": 2, "name": "one"},
            {"id": 3, "name": "two"},
        ],
        "truncated": False,
    }
```

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Server hang on parallel execution of queries to named in-memory databases 1901416155  
1724055823 https://github.com/simonw/datasette/issues/2189#issuecomment-1724055823 https://api.github.com/repos/simonw/datasette/issues/2189 IC_kwDOBm6k_c5mwwEP simonw 9599 2023-09-18T17:31:10Z 2023-09-18T17:31:10Z OWNER

That line was added in https://github.com/simonw/datasette/commit/942411ef946e9a34a2094944d3423cddad27efd3 which first shipped in 0.62a0.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Server hang on parallel execution of queries to named in-memory databases 1901416155  
1724051886 https://github.com/simonw/datasette/issues/2189#issuecomment-1724051886 https://api.github.com/repos/simonw/datasette/issues/2189 IC_kwDOBm6k_c5mwvGu simonw 9599 2023-09-18T17:28:20Z 2023-09-18T17:30:30Z OWNER

The bug exhibits when I try to add a facet. I think it's caused by the parallel query execution I added to facets at some point.

  • http://127.0.0.1:8045/airtable_refs/airtable_refs - no error
  • http://127.0.0.1:8045/airtable_refs/airtable_refs?_facet=table_name#facet-table_name - hangs the server

Crucial line in the traceback:

```
await gather(execute_facets(), execute_suggested_facets())
```

From here: https://github.com/simonw/datasette/blob/917272c864ad7b8a00c48c77f5c2944093babb4e/datasette/views/table.py#L568
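The pattern in question, two queries gathered in parallel against a named in-memory database, can be sketched with the stdlib alone (this is an illustrative stand-in, not Datasette's actual code; `demo_mem` and the helper names are made up):

```python
import asyncio
import sqlite3

# A named in-memory SQLite database shared between connections via a
# shared-cache URI, roughly how add_memory_database() works under the hood.
URI = "file:demo_mem?mode=memory&cache=shared"

# Keep one connection open so the in-memory database stays alive.
keeper = sqlite3.connect(URI, uri=True)
keeper.execute("create table t (id integer primary key, name text)")
keeper.executemany("insert into t (name) values (?)", [("one",), ("two",)])
keeper.commit()

async def run_query(sql):
    # Each query opens its own connection and runs in a worker thread,
    # keeping the event loop free.
    def _inner():
        conn = sqlite3.connect(URI, uri=True)
        try:
            return conn.execute(sql).fetchall()
        finally:
            conn.close()
    return await asyncio.to_thread(_inner)

async def main():
    # Two queries against the same named in-memory database, in parallel,
    # mirroring gather(execute_facets(), execute_suggested_facets()).
    return await asyncio.gather(
        run_query("select name from t order by id"),
        run_query("select count(*) from t"),
    )

rows, count = asyncio.run(main())
print(rows, count)  # [('one',), ('two',)] [(2,)]
```

The hang in the issue involves Datasette's own connection/threading layer on top of this pattern, which this sketch deliberately omits.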

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Server hang on parallel execution of queries to named in-memory databases 1901416155  
1724049538 https://github.com/simonw/datasette/issues/2189#issuecomment-1724049538 https://api.github.com/repos/simonw/datasette/issues/2189 IC_kwDOBm6k_c5mwuiC simonw 9599 2023-09-18T17:26:44Z 2023-09-18T17:26:44Z OWNER

Just managed to get this exception trace:

```
    return await self.route_path(scope, receive, send, path)
  File "/Users/simon/.local/share/virtualenvs/airtable-export-Ca4U-3qk/lib/python3.8/site-packages/datasette/app.py", line 1354, in route_path
    response = await view(request, send)
  File "/Users/simon/.local/share/virtualenvs/airtable-export-Ca4U-3qk/lib/python3.8/site-packages/datasette/views/base.py", line 134, in view
    return await self.dispatch_request(request)
  File "/Users/simon/.local/share/virtualenvs/airtable-export-Ca4U-3qk/lib/python3.8/site-packages/datasette/views/base.py", line 91, in dispatch_request
    return await handler(request)
  File "/Users/simon/.local/share/virtualenvs/airtable-export-Ca4U-3qk/lib/python3.8/site-packages/datasette/views/base.py", line 361, in get
    response_or_template_contexts = await self.data(request, **data_kwargs)
  File "/Users/simon/.local/share/virtualenvs/airtable-export-Ca4U-3qk/lib/python3.8/site-packages/datasette/views/table.py", line 158, in data
    return await self._data_traced(request, default_labels, _next, _size)
  File "/Users/simon/.local/share/virtualenvs/airtable-export-Ca4U-3qk/lib/python3.8/site-packages/datasette/views/table.py", line 568, in _data_traced
    await gather(execute_facets(), execute_suggested_facets())
  File "/Users/simon/.local/share/virtualenvs/airtable-export-Ca4U-3qk/lib/python3.8/site-packages/datasette/views/table.py", line 177, in _gather_parallel
    return await asyncio.gather(*args)
asyncio.exceptions.CancelledError
INFO:     127.0.0.1:64109 - "GET /airtable_refs/airtable_refs?_facet=table_name&table_name=Sessions HTTP/1.1" 500 Internal Server Error
^CError in atexit._run_exitfuncs:
Traceback (most recent call last):
  File "/Users/simon/.pyenv/versions/3.8.17/lib/python3.8/concurrent/futures/thread.py", line 40, in _python_exit
    t.join()
  File "/Users/simon/.pyenv/versions/3.8.17/lib/python3.8/threading.py", line 1011, in join
    self._wait_for_tstate_lock()
  File "/Users/simon/.pyenv/versions/3.8.17/lib/python3.8/threading.py", line 1027, in _wait_for_tstate_lock
    elif lock.acquire(block, timeout):
KeyboardInterrupt
```

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Server hang on parallel execution of queries to named in-memory databases 1901416155  
1724048314 https://github.com/simonw/datasette/issues/2189#issuecomment-1724048314 https://api.github.com/repos/simonw/datasette/issues/2189 IC_kwDOBm6k_c5mwuO6 simonw 9599 2023-09-18T17:25:55Z 2023-09-18T17:25:55Z OWNER

The good news is that this bug is currently unlikely to affect most users, since named in-memory databases (created using `datasette.add_memory_database("airtable_refs")`) are a pretty obscure feature, only available to plugins.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Server hang on parallel execution of queries to named in-memory databases 1901416155  
1724045748 https://github.com/simonw/datasette/issues/2189#issuecomment-1724045748 https://api.github.com/repos/simonw/datasette/issues/2189 IC_kwDOBm6k_c5mwtm0 simonw 9599 2023-09-18T17:24:07Z 2023-09-18T17:24:07Z OWNER

I need reliable steps to reproduce, then I can bisect and figure out which exact version of Datasette introduced the problem.

I have a hunch that it relates to changes made to the datasette/database.py module, maybe one of these changes here: https://github.com/simonw/datasette/compare/0.61...0.63.1#diff-4e20309c969326a0008dc9237f6807f48d55783315fbfc1e7dfa480b550e16f9

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Server hang on parallel execution of queries to named in-memory databases 1901416155  
1723362847 https://github.com/simonw/datasette/issues/2123#issuecomment-1723362847 https://api.github.com/repos/simonw/datasette/issues/2123 IC_kwDOBm6k_c5muG4f hueyy 6523121 2023-09-18T13:02:46Z 2023-09-18T13:02:46Z NONE

Can confirm that this bug can be reproduced as follows:

```
docker run datasetteproject/datasette datasette serve --reload
```

which produces the following output:

```
Starting monitor for PID 10.
Error: Invalid value for '[FILES]...': Path 'serve' does not exist.
```

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
datasette serve when invoked with --reload interprets the serve command as a file 1825007061  
1722943484 https://github.com/simonw/datasette/pull/2052#issuecomment-1722943484 https://api.github.com/repos/simonw/datasette/issues/2052 IC_kwDOBm6k_c5msgf8 20after4 30934 2023-09-18T08:14:47Z 2023-09-18T08:14:47Z NONE

This is such a well thought out contribution. I don't think I've seen such a thoroughly considered PR on any project in recent memory.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
feat: Javascript Plugin API (Custom panels, column menu items with JS actions) 1651082214  
1722848454 https://github.com/simonw/datasette/issues/2188#issuecomment-1722848454 https://api.github.com/repos/simonw/datasette/issues/2188 IC_kwDOBm6k_c5msJTG asg017 15178711 2023-09-18T06:58:53Z 2023-09-18T06:58:53Z CONTRIBUTOR

Thinking about this more, here a list of things I imagine a "compile-to-sql" plugin would want to do:

  1. Attach itself to the SQL code editor (switch from SQL -> PRQL/Logica, additional syntax highlighting)
  2. Add "Query using PRQL" buttons in various parts of Datasette's UI, like /dbname page
  3. Use $LANGUAGE= instead of sql= in the JSON API and the SQL results pages
  4. Have their own dedicated code editor page

1) and 2) would be difficult to do with current plugin hooks, unless we add the concept of "slots" and get the JS plugin support in. 3) could maybe be done with the asgi_wrapper(datasette) hook? And 4) can be done easily with the register_routes() hooks.

So it really only sounds like extending the SQL editor will be the hard part. In #2094 I want to add JavaScript plugin hooks for extending the SQL editor, which may work here.

If I get the time/motivation, I might try out a datasette-prql extension, just because I like playing with it. It'd be really cool if I can get the asgi_wrapper() hook to work right there...

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Plugin Hooks for "compile to SQL" languages 1900026059  
1722845490 https://github.com/simonw/datasette/issues/1191#issuecomment-1722845490 https://api.github.com/repos/simonw/datasette/issues/1191 IC_kwDOBm6k_c5msIky asg017 15178711 2023-09-18T06:55:52Z 2023-09-18T06:55:52Z CONTRIBUTOR

One note here: this feature could be called "slots", similar to Layout Slots in Vitepress.

In Vitepress, you can add custom components/widget/gadgets into determined named "slots", like so:

  • doc-top
  • doc-bottom
  • doc-footer-before
  • doc-before
  • doc-after
  • ...

Would be great to do in both Python and Javascript, with the upcoming JavaScript API #2052. In datasette-write-ui, all we do is add a few "Insert row" and "edit this row" buttons and that required completely capturing the table.html template, which isn't great for other plugins. But having "slots" like table-footer-before or table-row-id or something would be great to work with.
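The "slots" idea described here could be sketched as a small registry mapping slot names to render callables contributed by multiple plugins; everything below (function names, slot names) is hypothetical illustration, not a Datasette API:

```python
from collections import defaultdict

# Hypothetical sketch of named "slots" that several plugins can extend,
# instead of one plugin capturing a whole template like table.html.
_slots = defaultdict(list)

def register_slot(name, render):
    # Multiple plugins can contribute to the same slot.
    _slots[name].append(render)

def render_slot(name, **context):
    # Concatenate every plugin's contribution, in registration order.
    return "".join(render(**context) for render in _slots[name])

# Two "plugins" contributing to the same table-footer-before slot:
register_slot("table-footer-before",
              lambda **ctx: f"<button>Insert row into {ctx['table']}</button>")
register_slot("table-footer-before",
              lambda **ctx: "<button>Edit this row</button>")

html = render_slot("table-footer-before", table="t")
```

The point is that plugins collaborate additively on a slot rather than overriding the whole template.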

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Ability for plugins to collaborate when adding extra HTML to blocks in default templates 787098345  
1722662413 https://github.com/simonw/datasette/issues/2188#issuecomment-1722662413 https://api.github.com/repos/simonw/datasette/issues/2188 IC_kwDOBm6k_c5mrb4N simonw 9599 2023-09-18T02:01:34Z 2023-09-18T02:01:34Z OWNER

I'm not interested in these in Datasette core itself, but I think they have a ton of potential for plugins.

I wonder what the best way to handle that would be?

Right now it's possible to write a plugin that adds extra routes, so someone could build a /dbname/-/prql?query=xxx endpoint.

If this could return JSON, they could add JavaScript to the /dbname page that provided a UI for kicking off one of those queries.

Something that could make that more ergonomic might be the plugin hook that allows plugins to add extra HTML to different core database pages - e.g. adding a "Query this database using PRQL" button or link or even a full form at the top of that database page. That's this issue here:

  • #1191

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Plugin Hooks for "compile to SQL" languages 1900026059  
1722323967 https://github.com/simonw/datasette/issues/2057#issuecomment-1722323967 https://api.github.com/repos/simonw/datasette/issues/2057 IC_kwDOBm6k_c5mqJP_ simonw 9599 2023-09-16T21:54:33Z 2023-09-16T21:54:33Z OWNER

Just found this migration guide: https://importlib-metadata.readthedocs.io/en/latest/migration.html

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
DeprecationWarning: pkg_resources is deprecated as an API 1662951875  
1722266942 https://github.com/simonw/datasette/issues/2057#issuecomment-1722266942 https://api.github.com/repos/simonw/datasette/issues/2057 IC_kwDOBm6k_c5mp7U- simonw 9599 2023-09-16T16:38:27Z 2023-09-16T16:38:27Z OWNER

The importlib.metadata.entry_points() function is pretty interesting: ```pycon

import importlib.metadata from pprint import pprint pprint(importlib.metadata.entry_points()) {'babel.checkers': [EntryPoint(name='num_plurals', value='babel.messages.checkers:num_plurals', group='babel.checkers'), EntryPoint(name='python_format', value='babel.messages.checkers:python_format', group='babel.checkers')], 'babel.extractors': [EntryPoint(name='jinja2', value='jinja2.ext:babel_extract[i18n]', group='babel.extractors'), EntryPoint(name='ignore', value='babel.messages.extract:extract_nothing', group='babel.extractors'), EntryPoint(name='javascript', value='babel.messages.extract:extract_javascript', group='babel.extractors'), EntryPoint(name='python', value='babel.messages.extract:extract_python', group='babel.extractors')], 'console_scripts': [EntryPoint(name='datasette', value='datasette.cli:cli', group='console_scripts'), EntryPoint(name='normalizer', value='charset_normalizer.cli.normalizer:cli_detect', group='console_scripts'), EntryPoint(name='pypprint', value='pprintpp:console', group='console_scripts'), EntryPoint(name='cog', value='cogapp:main', group='console_scripts'), EntryPoint(name='icdiff', value='icdiff:start', group='console_scripts'), EntryPoint(name='pycodestyle', value='pycodestyle:_main', group='console_scripts'), EntryPoint(name='sphinx-autobuild', value='sphinx_autobuild.main:main', group='console_scripts'), EntryPoint(name='sphinx-apidoc', value='sphinx.ext.apidoc:main', group='console_scripts'), EntryPoint(name='sphinx-autogen', value='sphinx.ext.autosummary.generate:main', group='console_scripts'), EntryPoint(name='sphinx-build', value='sphinx.cmd.build:main', group='console_scripts'), EntryPoint(name='sphinx-quickstart', value='sphinx.cmd.quickstart:main', group='console_scripts'), EntryPoint(name='sphinx-to-sqlite', value='sphinx_to_sqlite.cli:cli', group='console_scripts'), EntryPoint(name='pybabel', value='babel.messages.frontend:main', group='console_scripts'), EntryPoint(name='docutils', value='docutils.main:main', 
group='console_scripts'), EntryPoint(name='isort', value='isort.main:main', group='console_scripts'), EntryPoint(name='isort-identify-imports', value='isort.main:identify_imports_main', group='console_scripts'), EntryPoint(name='hupper', value='hupper.cli:main', group='console_scripts'), EntryPoint(name='sqlite-utils', value='sqlite_utils.cli:cli', group='console_scripts'), EntryPoint(name='py.test', value='pytest:console_main', group='console_scripts'), EntryPoint(name='pytest', value='pytest:console_main', group='console_scripts'), EntryPoint(name='pyflakes', value='pyflakes.api:main', group='console_scripts'), EntryPoint(name='livereload', value='livereload.cli:main', group='console_scripts'), EntryPoint(name='uvicorn', value='uvicorn.main:main', group='console_scripts'), EntryPoint(name='httpx', value='httpx:main', group='console_scripts'), EntryPoint(name='flake8', value='flake8.main.cli:main', group='console_scripts'), EntryPoint(name='blacken-docs', value='blacken_docs:main', group='console_scripts'), EntryPoint(name='pip', value='pip._internal.cli.main:main', group='console_scripts'), EntryPoint(name='pip3', value='pip._internal.cli.main:main', group='console_scripts'), EntryPoint(name='pip3.10', value='pip._internal.cli.main:main', group='console_scripts'), EntryPoint(name='wheel', value='wheel.cli:main', group='console_scripts'), EntryPoint(name='pygmentize', value='pygments.cmdline:main', group='console_scripts'), EntryPoint(name='black', value='black:patched_main', group='console_scripts'), EntryPoint(name='blackd', value='blackd:patched_main [d]', group='console_scripts'), EntryPoint(name='codespell', value='codespell_lib:_script_main', group='console_scripts'), EntryPoint(name='tabulate', value='tabulate:_main', group='console_scripts')], 'datasette': [EntryPoint(name='debug_permissions', value='datasette_debug_permissions', group='datasette'), EntryPoint(name='codespaces', value='datasette_codespaces', group='datasette'), EntryPoint(name='vega', 
value='datasette_vega', group='datasette'), EntryPoint(name='x_forwarded_host', value='datasette_x_forwarded_host', group='datasette'), EntryPoint(name='json_html', value='datasette_json_html', group='datasette'), EntryPoint(name='datasette_write_ui', value='datasette_write_ui', group='datasette'), EntryPoint(name='pretty_json', value='datasette_pretty_json', group='datasette'), EntryPoint(name='graphql', value='datasette_graphql', group='datasette')], 'distutils.commands': [EntryPoint(name='compile_catalog', value='babel.messages.frontend:compile_catalog', group='distutils.commands'), EntryPoint(name='extract_messages', value='babel.messages.frontend:extract_messages', group='distutils.commands'), EntryPoint(name='init_catalog', value='babel.messages.frontend:init_catalog', group='distutils.commands'), EntryPoint(name='update_catalog', value='babel.messages.frontend:update_catalog', group='distutils.commands'), EntryPoint(name='isort', value='isort.setuptools_commands:ISortCommand', group='distutils.commands'), EntryPoint(name='alias', value='setuptools.command.alias:alias', group='distutils.commands'), EntryPoint(name='bdist_egg', value='setuptools.command.bdist_egg:bdist_egg', group='distutils.commands'), EntryPoint(name='bdist_rpm', value='setuptools.command.bdist_rpm:bdist_rpm', group='distutils.commands'), EntryPoint(name='build', value='setuptools.command.build:build', group='distutils.commands'), EntryPoint(name='build_clib', value='setuptools.command.build_clib:build_clib', group='distutils.commands'), EntryPoint(name='build_ext', value='setuptools.command.build_ext:build_ext', group='distutils.commands'), EntryPoint(name='build_py', value='setuptools.command.build_py:build_py', group='distutils.commands'), EntryPoint(name='develop', value='setuptools.command.develop:develop', group='distutils.commands'), EntryPoint(name='dist_info', value='setuptools.command.dist_info:dist_info', group='distutils.commands'), EntryPoint(name='easy_install', 
value='setuptools.command.easy_install:easy_install', group='distutils.commands'), EntryPoint(name='editable_wheel', value='setuptools.command.editable_wheel:editable_wheel', group='distutils.commands'), EntryPoint(name='egg_info', value='setuptools.command.egg_info:egg_info', group='distutils.commands'), EntryPoint(name='install', value='setuptools.command.install:install', group='distutils.commands'), EntryPoint(name='install_egg_info', value='setuptools.command.install_egg_info:install_egg_info', group='distutils.commands'), EntryPoint(name='install_lib', value='setuptools.command.install_lib:install_lib', group='distutils.commands'), EntryPoint(name='install_scripts', value='setuptools.command.install_scripts:install_scripts', group='distutils.commands'), EntryPoint(name='rotate', value='setuptools.command.rotate:rotate', group='distutils.commands'), EntryPoint(name='saveopts', value='setuptools.command.saveopts:saveopts', group='distutils.commands'), EntryPoint(name='sdist', value='setuptools.command.sdist:sdist', group='distutils.commands'), EntryPoint(name='setopt', value='setuptools.command.setopt:setopt', group='distutils.commands'), EntryPoint(name='test', value='setuptools.command.test:test', group='distutils.commands'), EntryPoint(name='upload_docs', value='setuptools.command.upload_docs:upload_docs', group='distutils.commands'), EntryPoint(name='bdist_wheel', value='wheel.bdist_wheel:bdist_wheel', group='distutils.commands')], 'distutils.setup_keywords': [EntryPoint(name='message_extractors', value='babel.messages.frontend:check_message_extractors', group='distutils.setup_keywords'), EntryPoint(name='cffi_modules', value='cffi.setuptools_ext:cffi_modules', group='distutils.setup_keywords'), EntryPoint(name='dependency_links', value='setuptools.dist:assert_string_list', group='distutils.setup_keywords'), EntryPoint(name='eager_resources', value='setuptools.dist:assert_string_list', group='distutils.setup_keywords'), EntryPoint(name='entry_points', 
value='setuptools.dist:check_entry_points', group='distutils.setup_keywords'), EntryPoint(name='exclude_package_data', value='setuptools.dist:check_package_data', group='distutils.setup_keywords'), EntryPoint(name='extras_require', value='setuptools.dist:check_extras', group='distutils.setup_keywords'), EntryPoint(name='include_package_data', value='setuptools.dist:assert_bool', group='distutils.setup_keywords'), EntryPoint(name='install_requires', value='setuptools.dist:check_requirements', group='distutils.setup_keywords'), EntryPoint(name='namespace_packages', value='setuptools.dist:check_nsp', group='distutils.setup_keywords'), EntryPoint(name='package_data', value='setuptools.dist:check_package_data', group='distutils.setup_keywords'), EntryPoint(name='packages', value='setuptools.dist:check_packages', group='distutils.setup_keywords'), EntryPoint(name='python_requires', value='setuptools.dist:check_specifier', group='distutils.setup_keywords'), EntryPoint(name='setup_requires', value='setuptools.dist:check_requirements', group='distutils.setup_keywords'), EntryPoint(name='test_loader', value='setuptools.dist:check_importable', group='distutils.setup_keywords'), EntryPoint(name='test_runner', value='setuptools.dist:check_importable', group='distutils.setup_keywords'), EntryPoint(name='test_suite', value='setuptools.dist:check_test_suite', group='distutils.setup_keywords'), EntryPoint(name='tests_require', value='setuptools.dist:check_requirements', group='distutils.setup_keywords'), EntryPoint(name='use_2to3', value='setuptools.dist:invalid_unless_false', group='distutils.setup_keywords'), EntryPoint(name='zip_safe', value='setuptools.dist:assert_bool', group='distutils.setup_keywords')], 'egg_info.writers': [EntryPoint(name='PKG-INFO', value='setuptools.command.egg_info:write_pkg_info', group='egg_info.writers'), EntryPoint(name='dependency_links.txt', value='setuptools.command.egg_info:overwrite_arg', group='egg_info.writers'), EntryPoint(name='depends.txt', 
value='setuptools.command.egg_info:warn_depends_obsolete', group='egg_info.writers'), EntryPoint(name='eager_resources.txt', value='setuptools.command.egg_info:overwrite_arg', group='egg_info.writers'), EntryPoint(name='entry_points.txt', value='setuptools.command.egg_info:write_entries', group='egg_info.writers'), EntryPoint(name='namespace_packages.txt', value='setuptools.command.egg_info:overwrite_arg', group='egg_info.writers'), EntryPoint(name='requires.txt', value='setuptools.command.egg_info:write_requirements', group='egg_info.writers'), EntryPoint(name='top_level.txt', value='setuptools.command.egg_info:write_toplevel_names', group='egg_info.writers')], 'flake8.extension': [EntryPoint(name='C90', value='mccabe:McCabeChecker', group='flake8.extension'), EntryPoint(name='E', value='flake8.plugins.pycodestyle:pycodestyle_logical', group='flake8.extension'), EntryPoint(name='F', value='flake8.plugins.pyflakes:FlakesChecker', group='flake8.extension'), EntryPoint(name='W', value='flake8.plugins.pycodestyle:pycodestyle_physical', group='flake8.extension')], 'flake8.report': [EntryPoint(name='default', value='flake8.formatting.default:Default', group='flake8.report'), EntryPoint(name='pylint', value='flake8.formatting.default:Pylint', group='flake8.report'), EntryPoint(name='quiet-filename', value='flake8.formatting.default:FilenameOnly', group='flake8.report'), EntryPoint(name='quiet-nothing', value='flake8.formatting.default:Nothing', group='flake8.report')], 'pylama.linter': [EntryPoint(name='isort', value='isort.pylama_isort:Linter', group='pylama.linter')], 'pytest11': [EntryPoint(name='icdiff', value='pytest_icdiff', group='pytest11'), EntryPoint(name='asyncio', value='pytest_asyncio.plugin', group='pytest11'), EntryPoint(name='xdist', value='xdist.plugin', group='pytest11'), EntryPoint(name='xdist.looponfail', value='xdist.looponfail', group='pytest11'), EntryPoint(name='timeout', value='pytest_timeout', group='pytest11'), EntryPoint(name='anyio', 
value='anyio.pytest_plugin', group='pytest11')], 'setuptools.finalize_distribution_options': [EntryPoint(name='keywords', value='setuptools.dist:Distribution._finalize_setup_keywords', group='setuptools.finalize_distribution_options'), EntryPoint(name='parent_finalize', value='setuptools.dist:_Distribution.finalize_options', group='setuptools.finalize_distribution_options')], 'sphinx.html_themes': [EntryPoint(name='alabaster', value='alabaster', group='sphinx.html_themes'), EntryPoint(name='basic-ng', value='sphinx_basic_ng', group='sphinx.html_themes'), EntryPoint(name='furo', value='furo', group='sphinx.html_themes')], 'sqlite_utils': [EntryPoint(name='hello_world', value='sqlite_utils_hello_world', group='sqlite_utils')]} ```

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
DeprecationWarning: pkg_resources is deprecated as an API 1662951875  
1722266513 https://github.com/simonw/datasette/issues/2057#issuecomment-1722266513 https://api.github.com/repos/simonw/datasette/issues/2057 IC_kwDOBm6k_c5mp7OR simonw 9599 2023-09-16T16:36:09Z 2023-09-16T16:36:09Z OWNER

Now I need to switch out pkg_resources in plugins.py:

https://github.com/simonw/datasette/blob/852f5014853943fa27f43ddaa2d442545b3259fb/datasette/plugins.py#L33-L74
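
The stdlib replacement for pkg_resources entry point discovery is `importlib.metadata.entry_points()`. A minimal sketch of the swap (the `group` default and the function itself are illustrative, not Datasette's actual `plugins.py` code):

```python
import importlib.metadata

def plugin_names(group="datasette"):
    # List entry point names registered under a group, using the stdlib
    # importlib.metadata instead of the deprecated pkg_resources API.
    eps = importlib.metadata.entry_points()
    if hasattr(eps, "select"):  # Python 3.10+: EntryPoints.select()
        selected = eps.select(group=group)
    else:  # Python 3.8/3.9: dict-like mapping of group -> list
        selected = eps.get(group, [])
    return sorted(ep.name for ep in selected)
```

The `select()`/`get()` branch papers over the API change between Python 3.9 and 3.10, which is one reason this migration is fiddlier than it looks.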

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
DeprecationWarning: pkg_resources is deprecated as an API 1662951875  
1722265848 https://github.com/simonw/datasette/issues/2057#issuecomment-1722265848 https://api.github.com/repos/simonw/datasette/issues/2057 IC_kwDOBm6k_c5mp7D4 simonw 9599 2023-09-16T16:32:42Z 2023-09-16T16:32:42Z OWNER

Here's the exception it uses:

```pycon
>>> importlib.metadata.version("datasette")
'1.0a6'
>>> importlib.metadata.version("datasette2")
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/opt/homebrew/Caskroom/miniconda/base/lib/python3.10/importlib/metadata/__init__.py", line 996, in version
    return distribution(distribution_name).version
  File "/opt/homebrew/Caskroom/miniconda/base/lib/python3.10/importlib/metadata/__init__.py", line 969, in distribution
    return Distribution.from_name(distribution_name)
  File "/opt/homebrew/Caskroom/miniconda/base/lib/python3.10/importlib/metadata/__init__.py", line 548, in from_name
    raise PackageNotFoundError(name)
importlib.metadata.PackageNotFoundError: No package metadata was found for datasette2
```
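
Since `importlib.metadata` raises instead of returning a sentinel, callers porting off pkg_resources can guard with `PackageNotFoundError` — a small sketch (helper name is mine):

```python
import importlib.metadata

def safe_version(name):
    # Return the installed version string, or None if the distribution
    # is not installed (importlib.metadata raises rather than returning None).
    try:
        return importlib.metadata.version(name)
    except importlib.metadata.PackageNotFoundError:
        return None
```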

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
DeprecationWarning: pkg_resources is deprecated as an API 1662951875  
1722258980 https://github.com/simonw/datasette/issues/2057#issuecomment-1722258980 https://api.github.com/repos/simonw/datasette/issues/2057 IC_kwDOBm6k_c5mp5Yk simonw 9599 2023-09-16T15:56:45Z 2023-09-16T15:56:45Z OWNER

Weird, I still can't get the warning to show even with this:

```python
@pytest.mark.asyncio
async def test_plugin_is_installed():
    datasette = Datasette(memory=True)

    class DummyPlugin:
        __name__ = "DummyPlugin"

        @hookimpl
        def actors_from_ids(self, datasette, actor_ids):
            return {}

    try:
        pm.register(DummyPlugin(), name="DummyPlugin")
        response = await datasette.client.get("/-/plugins.json")
        assert response.status_code == 200
        installed_plugins = {p["name"] for p in response.json()}
        assert "DummyPlugin" in installed_plugins
    finally:
        pm.unregister(name="DummyPlugin")
```

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
DeprecationWarning: pkg_resources is deprecated as an API 1662951875  
1722257328 https://github.com/simonw/datasette/issues/2057#issuecomment-1722257328 https://api.github.com/repos/simonw/datasette/issues/2057 IC_kwDOBm6k_c5mp4-w simonw 9599 2023-09-16T15:47:32Z 2023-09-16T15:47:32Z OWNER

Frustrating that this warning doesn't show up in the Datasette test suite itself. It shows up in plugin test suites that run this test:

```python
@pytest.mark.asyncio
async def test_plugin_is_installed():
    datasette = Datasette(memory=True)
    response = await datasette.client.get("/-/plugins.json")
    assert response.status_code == 200
    installed_plugins = {p["name"] for p in response.json()}
    assert "datasette-chronicle" in installed_plugins
```

If you run that test inside Datasette core, `installed_plugins` is an empty set, which presumably is why the warning doesn't get triggered there.
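
One way to assert deterministically that a code path emits (or doesn't emit) the warning is to record warnings with the stdlib `warnings` module — a generic sketch, not Datasette's test code:

```python
import warnings

def might_warn():
    # Stand-in for code that touches a deprecated API
    warnings.warn("pkg_resources is deprecated as an API", DeprecationWarning)

with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")  # don't let default filters swallow it
    might_warn()

deprecations = [w for w in caught if issubclass(w.category, DeprecationWarning)]
assert len(deprecations) == 1
```

The `simplefilter("always")` matters: DeprecationWarning is filtered out by default outside `__main__`, which can also explain a warning that fires in one test suite but not another.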

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
DeprecationWarning: pkg_resources is deprecated as an API 1662951875  
1721742055 https://github.com/simonw/datasette/issues/2186#issuecomment-1721742055 https://api.github.com/repos/simonw/datasette/issues/2186 IC_kwDOBm6k_c5mn7Ln simonw 9599 2023-09-15T19:27:59Z 2023-09-15T19:27:59Z OWNER

This feels like it might be quite a nice pattern generally - providing optional arguments to plugins and views that can be called as `await get_x()` to run an extra calculation.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Mechanism for register_output_renderer hooks to access full count 1898927976  
1721740872 https://github.com/simonw/datasette/issues/2186#issuecomment-1721740872 https://api.github.com/repos/simonw/datasette/issues/2186 IC_kwDOBm6k_c5mn65I simonw 9599 2023-09-15T19:26:51Z 2023-09-15T19:27:19Z OWNER

Here's where it's called at the moment: https://github.com/simonw/datasette/blob/16f0b6d8222d06682a31b904d0a402c391ae1c1c/datasette/views/base.py#L297-L313

And the docs: https://github.com/simonw/datasette/blob/1.0a6/docs/plugin_hooks.rst#register-output-renderer-datasette

I'm tempted to add a `get_count` argument which, when called and awaited, returns the full count. Then plugins could do this:

```python
async def render_notebook(datasette, request, get_count, rows):
    count = await get_count()
    # ...
```
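
The lazy-count shape can be demonstrated with a plain async callable. Everything below is an illustrative sketch of the proposed pattern, not the actual hook signature:

```python
import asyncio

async def get_count():
    # Stand-in for an expensive "select count(*)" that only runs on demand
    await asyncio.sleep(0)
    return 42

async def render_notebook(get_count, rows):
    # The renderer pays for the count only if it awaits the callable
    count = await get_count()
    return {"count": count, "rows": len(rows)}

result = asyncio.run(render_notebook(get_count, rows=["a", "b", "c"]))
```

A renderer that never awaits `get_count()` never triggers the query, which is the whole point of passing a callable rather than a precomputed value.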

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Mechanism for register_output_renderer hooks to access full count 1898927976  
1719451803 https://github.com/simonw/datasette/pull/2182#issuecomment-1719451803 https://api.github.com/repos/simonw/datasette/issues/2182 IC_kwDOBm6k_c5mfMCb dependabot[bot] 49699333 2023-09-14T13:27:26Z 2023-09-14T13:27:26Z CONTRIBUTOR

Looks like these dependencies are updatable in another way, so this is no longer needed.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Bump the python-packages group with 2 updates 1890593563  
1718316733 https://github.com/simonw/datasette/pull/2183#issuecomment-1718316733 https://api.github.com/repos/simonw/datasette/issues/2183 IC_kwDOBm6k_c5ma269 simonw 9599 2023-09-13T21:05:36Z 2023-09-13T21:05:36Z OWNER

I'm going to land this and make any further documentation tweaks on main.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
`datasette.yaml` plugin support 1891212159  
1714544153 https://github.com/simonw/datasette/pull/2183#issuecomment-1714544153 https://api.github.com/repos/simonw/datasette/issues/2183 IC_kwDOBm6k_c5mMd4Z codecov[bot] 22429695 2023-09-11T20:37:52Z 2023-09-13T20:58:51Z NONE

Codecov Report

Patch coverage: 95.00% and project coverage change: -0.04% :warning:

Comparison is base (a4c96d0) 92.69% compared to head (659dcbd) 92.66%.

:exclamation: Current head 659dcbd differs from pull request most recent head acca338. Consider uploading reports for the commit acca338 to get more accurate results

Additional details and impacted files

```diff
@@            Coverage Diff             @@
##             main    #2183      +/-   ##
==========================================
- Coverage   92.69%   92.66%   -0.04%
==========================================
  Files          40       40
  Lines        6025     6039      +14
==========================================
+ Hits         5585     5596      +11
- Misses        440      443       +3
```

| [Files Changed](https://app.codecov.io/gh/simonw/datasette/pull/2183?src=pr&el=tree&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison) | Coverage Δ | |
|---|---|---|
| [datasette/app.py](https://app.codecov.io/gh/simonw/datasette/pull/2183?src=pr&el=tree&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison#diff-ZGF0YXNldHRlL2FwcC5weQ==) | `94.19% <95.00%> (-0.24%)` | :arrow_down: |

:umbrella: View full report in Codecov by Sentry.
:loudspeaker: Have feedback on the report? Share it here.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
`datasette.yaml` plugin support 1891212159  
1716801971 https://github.com/simonw/datasette/pull/2183#issuecomment-1716801971 https://api.github.com/repos/simonw/datasette/issues/2183 IC_kwDOBm6k_c5mVFGz asg017 15178711 2023-09-13T01:34:01Z 2023-09-13T01:34:01Z CONTRIBUTOR

@simonw docs are finished, this is ready for review!

One thing: I added "Configuration" as a top-level item in the documentation site, at the very bottom. Not sure if this is the best, maybe it can be named "datasette.yaml Configuration" or something similar?

Mostly because "Configuration" by itself can mean many things, but adding "datasette.yaml" would make it pretty clear it's about that specific file, and is easier to scan. I'd also be fine with using "datasette.yaml" instead of "datasette.json", since writing in YAML is much more forgiving (and advanced users will know JSON is also supported)

Also, maybe this is a chance to consolidate the docs a bit? I think "Settings", "Configuration", "Metadata", and "Authentication and permissions" should possibly be under the same section. Maybe even consolidate the different Plugin pages that exist?

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
`datasette.yaml` plugin support 1891212159  
1714920708 https://github.com/simonw/sqlite-utils/issues/594#issuecomment-1714920708 https://api.github.com/repos/simonw/sqlite-utils/issues/594 IC_kwDOCGYnMM5mN50E simonw 9599 2023-09-12T03:51:13Z 2023-09-12T03:51:13Z OWNER

Changing this without breaking backwards compatibility (and forcing a 4.0 release) will be tricky, because `ForeignKey()` is a namedtuple:

https://github.com/simonw/sqlite-utils/blob/622c3a5a7dd53a09c029e2af40c2643fe7579340/sqlite_utils/db.py#L148-L150

I could swap it out for a dataclass and add those extra columns, but I need to make sure that code like this still works:

```python
for table, column, other_table, other_column in table.foreign_keys:
    # ...
```
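
One way a dataclass can keep the old four-field unpacking working is to implement `__iter__` over the original fields — a sketch only, with an invented `seq` field standing in for whatever compound-key columns get added:

```python
from dataclasses import dataclass

@dataclass
class ForeignKey:
    table: str
    column: str
    other_table: str
    other_column: str
    seq: int = 0  # hypothetical extra field for compound keys

    def __iter__(self):
        # Preserve namedtuple-style unpacking over the original four fields
        return iter((self.table, self.column, self.other_table, self.other_column))

fk = ForeignKey("courses", "dept_code", "departments", "dept_code", seq=1)
table, column, other_table, other_column = fk  # old unpacking still works
```

Code that unpacked the namedtuple positionally keeps working; only code that relied on indexing (`fk[0]`) or `_replace()` would break.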

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Represent compound foreign keys in table.foreign_keys output 1891614971  
1714919806 https://github.com/simonw/sqlite-utils/issues/594#issuecomment-1714919806 https://api.github.com/repos/simonw/sqlite-utils/issues/594 IC_kwDOCGYnMM5mN5l- simonw 9599 2023-09-12T03:49:41Z 2023-09-12T03:49:41Z OWNER

Digging in a bit more:

```pycon
>>> pprint(list(db.query('PRAGMA foreign_key_list(courses)')))
[{'from': 'campus_name',
  'id': 0,
  'match': 'NONE',
  'on_delete': 'NO ACTION',
  'on_update': 'NO ACTION',
  'seq': 0,
  'table': 'departments',
  'to': 'campus_name'},
 {'from': 'dept_code',
  'id': 0,
  'match': 'NONE',
  'on_delete': 'NO ACTION',
  'on_update': 'NO ACTION',
  'seq': 1,
  'table': 'departments',
  'to': 'dept_code'}]
```

I think the way you tell it's a compound foreign key is that both of those have the same `id` value - of `0` - but they then have two different `seq` values of `0` and `1`.

Right now I ignore those columns entirely: https://github.com/simonw/sqlite-utils/blob/622c3a5a7dd53a09c029e2af40c2643fe7579340/sqlite_utils/db.py#L1523-L1540
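
Grouping the PRAGMA rows by `id` is enough to reconstruct compound keys. A self-contained sketch against a throwaway schema (table and column names mirror the example above; the grouping code is illustrative, not sqlite-utils' implementation):

```python
import sqlite3

# Detect compound foreign keys by grouping PRAGMA foreign_key_list rows
# on their `id` column - one id with several `seq` rows = one compound key.
db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE departments (
    campus_name TEXT, dept_code TEXT,
    PRIMARY KEY (campus_name, dept_code));
CREATE TABLE courses (
    campus_name TEXT, dept_code TEXT,
    FOREIGN KEY (campus_name, dept_code)
        REFERENCES departments (campus_name, dept_code));
""")
compound = {}
# Row columns: id, seq, table, from, to, on_update, on_delete, match
for fk_id, seq, table, from_col, to_col, *_ in db.execute(
    "PRAGMA foreign_key_list(courses)"
):
    compound.setdefault(fk_id, []).append((from_col, table, to_col))
```

Here `compound` ends up with a single key `0` holding both column pairs, matching the pprint output above.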

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Represent compound foreign keys in table.foreign_keys output 1891614971  
1714699724 https://github.com/simonw/datasette/pull/2183#issuecomment-1714699724 https://api.github.com/repos/simonw/datasette/issues/2183 IC_kwDOBm6k_c5mND3M simonw 9599 2023-09-11T23:01:36Z 2023-09-11T23:02:30Z OWNER

On thinking about this further, I'm fine releasing it as another alpha provided it causes Datasette to error loudly with an explanatory message if you point `-m metadata.json` at a metadata file that includes `"plugins"` configuration.

Something like this:

```bash
datasette -m metadata.json
```

Outputs:

> Datasette no longer accepts plugin configuration in `--metadata`. Move your `"plugins"` configuration blocks to a separate file - we suggest calling that `datasette.yaml` - and start Datasette with `datasette -c datasette.yaml`.

For added usability points, let's have it suggest `datasette.json` if they used `metadata.json` and `datasette.yaml` if they tried to use `metadata.yaml`.
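
A hedged sketch of that check — the function name, message wording, and extension-matching logic are all illustrative, not Datasette's actual code:

```python
from pathlib import Path

def reject_plugins_in_metadata(path, parsed):
    # Fail loudly if a --metadata file still carries a "plugins" block,
    # suggesting a config filename that matches the metadata file's format.
    if "plugins" in parsed:
        suggested = (
            "datasette.yaml" if Path(path).suffix in (".yaml", ".yml")
            else "datasette.json"
        )
        raise SystemExit(
            "Datasette no longer accepts plugin configuration in --metadata. "
            'Move your "plugins" configuration blocks to a separate file - '
            f"we suggest calling that {suggested} - and start Datasette with "
            f"datasette -c {suggested}."
        )
```

So `metadata.yaml` with a `"plugins"` block gets told to use `datasette.yaml`, while `metadata.json` gets `datasette.json`.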

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
`datasette.yaml` plugin support 1891212159  


CREATE TABLE [issue_comments] (
   [html_url] TEXT,
   [issue_url] TEXT,
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [created_at] TEXT,
   [updated_at] TEXT,
   [author_association] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [issue] INTEGER REFERENCES [issues]([id])
, [performed_via_github_app] TEXT);
CREATE INDEX [idx_issue_comments_issue]
                ON [issue_comments] ([issue]);
CREATE INDEX [idx_issue_comments_user]
                ON [issue_comments] ([user]);
Powered by Datasette · Queries took 354.023ms · About: github-to-sqlite