issues

120 rows where comments = 1, repo = 107914493 and state = "open" sorted by updated_at descending


Suggested facets: author_association, draft, created_at (date), updated_at (date)

type 2

  • issue 109
  • pull 11

state 1

  • open · 120

repo 1

  • datasette · 120
id node_id number title user state locked assignee milestone comments created_at updated_at ▲ closed_at author_association pull_request body repo type active_lock_reason performed_via_github_app reactions draft state_reason
1983600865 PR_kwDOBm6k_c5e7WH7 2206 Bump the python-packages group with 1 update dependabot[bot] 49699333 open 0     1 2023-11-08T13:18:56Z 2023-12-08T13:46:24Z   CONTRIBUTOR simonw/datasette/pulls/2206

Bumps the python-packages group with 1 update: black.

Release notes

Sourced from black's releases.

23.11.0

Highlights

  • Support formatting ranges of lines with the new --line-ranges command-line option (#4020)

Stable style

  • Fix crash on formatting bytes strings that look like docstrings (#4003)
  • Fix crash when whitespace followed a backslash before newline in a docstring (#4008)
  • Fix standalone comments inside complex blocks crashing Black (#4016)
  • Fix crash on formatting code like await (a ** b) (#3994)
  • No longer treat leading f-strings as docstrings. This matches Python's behaviour and fixes a crash (#4019)

Preview style

  • Multiline dicts and lists that are the sole argument to a function are now indented less (#3964)
  • Multiline unpacked dicts and lists as the sole argument to a function are now also indented less (#3992)
  • In f-string debug expressions, quote types that are visible in the final string are now preserved (#4005)
  • Fix a bug where long case blocks were not split into multiple lines. Also enable general trailing comma rules on case blocks (#4024)
  • Keep requiring two empty lines between module-level docstring and first function or class definition (#4028)
  • Add support for single-line format skip with other comments on the same line (#3959)

Configuration

  • Consistently apply force exclusion logic before resolving symlinks (#4015)
  • Fix a bug in the matching of absolute path names in --include (#3976)

Performance

  • Fix mypyc builds on arm64 on macOS (#4017)

Integrations

  • Black's pre-commit integration will now run only on git hooks appropriate for a code formatter (#3940)

23.10.1

Highlights

  • Maintenance release to get a fix out for GitHub Action edge case (#3957)

Preview style

... (truncated)

Changelog

Sourced from black's changelog.


... (truncated)

Commits
  • 2a1c67e Prepare release 23.11.0 (#4032)
  • 72e7a2e Remove redundant condition from has_magic_trailing_comma (#4023)
  • 1a7d9c2 Preserve visible quote types for f-string debug expressions (#4005)
  • f4c7be5 docs: fix minor typo (#4030)
  • 2e4fac9 Apply force exclude logic before symlink resolution (#4015)
  • 66008fd [563] Fix standalone comments inside complex blocks crashing Black (#4016)
  • 50ed622 Fix long case blocks not split into multiple lines (#4024)
  • 46be1f8 Support formatting specified lines (#4020)
  • ecbd9e8 Fix crash with f-string docstrings (#4019)
  • e808e61 Preview: Keep requiring two empty lines between module-level docstring and fi...
  • Additional commits viewable in compare view


Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


Dependabot commands and options

You can trigger Dependabot actions by commenting on this PR:

  • `@dependabot rebase` will rebase this PR
  • `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it
  • `@dependabot merge` will merge this PR after your CI passes on it
  • `@dependabot squash and merge` will squash and merge this PR after your CI passes on it
  • `@dependabot cancel merge` will cancel a previously requested merge and block automerging
  • `@dependabot reopen` will reopen this PR if it is closed
  • `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
  • `@dependabot show <dependency name> ignore conditions` will show all of the ignore conditions of the specified dependency
  • `@dependabot ignore <dependency name> major version` will close this group update PR and stop Dependabot creating any more for the specific dependency's major version (unless you unignore this specific dependency's major version or upgrade to it yourself)
  • `@dependabot ignore <dependency name> minor version` will close this group update PR and stop Dependabot creating any more for the specific dependency's minor version (unless you unignore this specific dependency's minor version or upgrade to it yourself)
  • `@dependabot ignore <dependency name>` will close this group update PR and stop Dependabot creating any more for the specific dependency (unless you unignore this specific dependency or upgrade to it yourself)
  • `@dependabot unignore <dependency name>` will remove all of the ignore conditions of the specified dependency
  • `@dependabot unignore <dependency name> <ignore condition>` will remove the ignore condition of the specified dependency and ignore conditions

:books: Documentation preview :books:: https://datasette--2206.org.readthedocs.build/en/2206/

datasette 107914493 pull    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2206/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
0  
2029908157 I_kwDOBm6k_c54_fC9 2214 CSV export fails for some `text` foreign key references precipice 2874 open 0     1 2023-12-07T05:04:34Z 2023-12-07T07:36:34Z   NONE  

I'm starting this issue without a clear reproduction in case someone else has seen this behavior, and to use the issue as a notebook for research.

I'm using Datasette with the SWITRS data set, which is a California Highway Patrol collection of traffic incident data from the past decade or so. I receive data from them in CSV and want to work with it in Datasette, then export it to CSV for mapping in Felt.com.

Their data makes extensive use of codes for incident column data (1 for Monday and so on), some of it integer codes and some of it letter/text codes. The text codes are sometimes blank or -. During import, I'm creating lookup tables for foreign key references to make the Datasette UI presentation of the data easier to read.

If I import the data and set up the integer foreign keys, everything works fine, but if I set up the text foreign keys, CSV export starts to fail.

The foreign key configuration is as follows:

```
# Some tables use integer ids, like sensible tables do. Let's import them first
# since we favor them.
for TABLE in DAY_OF_WEEK CHP_SHIFT POPULATION SPECIAL_COND BEAT_TYPE COLLISION_SEVERITY
do
  sqlite-utils create-table records.db $TABLE id integer name text --pk=id
  sqlite-utils insert records.db $TABLE lookup-tables/$TABLE.csv --csv
  sqlite-utils add-foreign-key records.db collisions $TABLE $TABLE id
  sqlite-utils create-index records.db collisions $TABLE
done

# Other tables use letter keys, like they were raised by WOLVES. Let's put them
# at the end of the import queue.
for TABLE in WEATHER_1 WEATHER_2 LOCATION_TYPE RAMP_INTERSECTION SIDE_OF_HWY \
  PRIMARY_COLL_FACTOR PCF_CODE_OF_VIOL PCF_VIOL_CATEGORY TYPE_OF_COLLISION MVIW \
  PED_ACTION ROAD_SURFACE ROAD_COND_1 ROAD_COND_2 LIGHTING CONTROL_DEVICE \
  STWD_VEHTYPE_AT_FAULT CHP_VEHTYPE_AT_FAULT PRIMARY_RAMP SECONDARY_RAMP
do
  sqlite-utils create-table records.db $TABLE key text name text --pk=key
  sqlite-utils insert records.db $TABLE lookup-tables/$TABLE.csv --csv
  sqlite-utils add-foreign-key records.db collisions $TABLE $TABLE key
  sqlite-utils create-index records.db collisions $TABLE
done
```

You can see the full code and import script here: https://github.com/radical-bike-lobby/switrs-db

If I run this code and then hit the CSV export link in the Datasette interface (the simple link or the "advanced" dialog), export fails after a small number of CSV rows are written. I am not seeing any detailed error messages but this appears in the logging output:

```
INFO: 127.0.0.1:57885 - "GET /records/collisions.csv?_facet=PRIMARY_RD&PRIMARY_RD=ASHBY+AV&_labels=on&_size=max HTTP/1.1" 200 OK
Caught this error:
```

(No other output follows `Caught this error:` other than a blank line.)

I've stared at the rows directly after the error occurs and can't yet see what is causing the problem. I'm going to set up a development environment and see if I get any more detailed error output, and then stare more at some problematic lines to see if I can get a simple reproduction.
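While waiting on a reproduction, the schema shape described above can be sketched with nothing but the `sqlite3` stdlib module. This is a made-up miniature of the real tables (names mirror the script, data is invented) showing the text-keyed lookup table plus the label join that `_labels=on` export performs:

```python
import sqlite3

# Miniature of the import script above: WEATHER_1 is a lookup table with a
# TEXT primary key, and collisions references it. Data here is invented.
conn = sqlite3.connect(":memory:")
conn.executescript(
    """
    CREATE TABLE WEATHER_1 (key TEXT PRIMARY KEY, name TEXT);
    CREATE TABLE collisions (
        case_id INTEGER PRIMARY KEY,
        WEATHER_1 TEXT REFERENCES WEATHER_1(key)
    );
    INSERT INTO WEATHER_1 (key, name) VALUES ('A', 'Clear'), ('-', 'Unknown');
    INSERT INTO collisions (case_id, WEATHER_1) VALUES (1, 'A'), (2, '-');
    """
)
# The label lookup a ?_labels=on export needs looks roughly like this join:
rows = conn.execute(
    """
    SELECT collisions.case_id, WEATHER_1.name
    FROM collisions LEFT JOIN WEATHER_1
      ON collisions.WEATHER_1 = WEATHER_1.key
    ORDER BY collisions.case_id
    """
).fetchall()
print(rows)  # [(1, 'Clear'), (2, 'Unknown')]
```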

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2214/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
2028698018 I_kwDOBm6k_c5463mi 2213 feature request: gzip compression of database downloads fgregg 536941 open 0     1 2023-12-06T14:35:03Z 2023-12-06T15:05:46Z   CONTRIBUTOR  

At the bottom of database pages, datasette gives users the opportunity to download the underlying sqlite database. It would be great if that could be served gzip compressed.

This is similar to #1213, but in my case I don't need Datasette to compress HTML and JSON because my CDN layer does that for me. However, Cloudflare at least will not compress an "application" mimetype.

(see list of mimetype: https://developers.cloudflare.com/speed/optimization/content/brotli/content-compression/)
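As a stdlib-only sketch of what the feature would do (paths here are temporary stand-ins, not anything Datasette ships): pre-compressing the SQLite file shows how well database pages full of repetitive content gzip down, whether the compression happens in a download endpoint or a build step feeding the CDN.

```python
import gzip
import os
import shutil
import sqlite3
import tempfile

# Build a throwaway database file with repetitive content.
db_path = os.path.join(tempfile.mkdtemp(), "demo.db")
conn = sqlite3.connect(db_path)
conn.execute("CREATE TABLE t (id INTEGER PRIMARY KEY, body TEXT)")
conn.executemany("INSERT INTO t (body) VALUES (?)", [("x" * 100,)] * 500)
conn.commit()
conn.close()

# Compress it the way a gzip-aware download path might serve it.
gz_path = db_path + ".gz"
with open(db_path, "rb") as src, gzip.open(gz_path, "wb") as dst:
    shutil.copyfileobj(src, dst)

print(os.path.getsize(db_path), "->", os.path.getsize(gz_path))
```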

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2213/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1994857251 I_kwDOBm6k_c525xsj 2208 No suggested facets when a column named 'value' is included rgieseke 198537 open 0     1 2023-11-15T14:11:17Z 2023-11-15T14:18:59Z   CONTRIBUTOR  

When a column named 'value' is included, no suggested facets are shown, because the facet-suggestion query uses an alias of 'value'.

https://github.com/simonw/datasette/blob/452a587e236ef642cbc6ae345b58767ea8420cb5/datasette/facets.py#L168-L174

Currently the following is shown (from https://latest.datasette.io/fixtures/facetable)

When I add a column named 'value' only the JSON facets are processed.

I think not using aliases could be a solution (unless someone wants a column named `count(*)`, though that seems unlikely). I'll open a PR with that.

There is also a TODO with a similar question in the same file. I have not looked into that yet.

https://github.com/simonw/datasette/blob/452a587e236ef642cbc6ae345b58767ea8420cb5/datasette/facets.py#L512
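The collision can be demonstrated with plain `sqlite3` (table and data below are invented stand-ins for the fixtures, and the SQL is a simplified version of the suggestion query, not the exact code from facets.py): SQLite lets a WHERE clause reference a result alias, but a real column of the same name takes precedence, so a table column literally named `value` shadows the `as value` alias and filters out every row.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE facetable (state TEXT, value TEXT)")
conn.executemany(
    "INSERT INTO facetable VALUES (?, ?)",
    [("CA", None), ("CA", None), ("MI", None)],
)

# The facet-suggestion SQL aliases the faceted column as "value". When the
# table also has a real column named "value", the bare "value" in the WHERE
# clause resolves to that column (all NULL here), so every row is filtered
# out and no facet is suggested for "state".
aliased = conn.execute(
    "SELECT state AS value, COUNT(*) FROM facetable "
    "WHERE value IS NOT NULL GROUP BY state ORDER BY state"
).fetchall()
print(aliased)  # []

# Qualifying the column name (i.e. not relying on the alias) avoids it:
qualified = conn.execute(
    "SELECT state, COUNT(*) FROM facetable "
    "WHERE facetable.state IS NOT NULL GROUP BY state ORDER BY state"
).fetchall()
print(qualified)  # [('CA', 2), ('MI', 1)]
```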

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2208/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1884408624 I_kwDOBm6k_c5wUcsw 2177 Move schema tables from _internal to _catalog simonw 9599 open 0     1 2023-09-06T16:58:33Z 2023-09-06T17:04:30Z   OWNER  

This came up in discussion over: - https://github.com/simonw/datasette/pull/2174

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2177/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1864112887 PR_kwDOBm6k_c5Yo7bk 2151 Test Datasette on multiple SQLite versions asg017 15178711 open 0     1 2023-08-23T22:42:51Z 2023-08-23T22:58:13Z   CONTRIBUTOR simonw/datasette/pulls/2151

still testing, hope it works!


:books: Documentation preview :books:: https://datasette--2151.org.readthedocs.build/en/2151/

datasette 107914493 pull    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2151/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
1  
476852861 MDU6SXNzdWU0NzY4NTI4NjE= 568 Add database_color as a configurable option LBHELewis 50906992 open 0     1 2019-08-05T13:14:45Z 2023-08-11T05:19:42Z   NONE  

This would be really useful as it would allow us to tie in with colour schemes.

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/568/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1843821954 I_kwDOBm6k_c5t5n2C 2137 Redesign row default JSON simonw 9599 open 0   Datasette 1.0a-next 8755003 1 2023-08-09T18:49:11Z 2023-08-09T19:02:47Z   OWNER  

This URL here:

https://latest.datasette.io/fixtures/simple_primary_key/1.json?_extras=foreign_key_tables

```json
{
    "database": "fixtures",
    "table": "simple_primary_key",
    "rows": [
        {
            "id": "1",
            "content": "hello"
        }
    ],
    "columns": ["id", "content"],
    "primary_keys": ["id"],
    "primary_key_values": ["1"],
    "units": {},
    "foreign_key_tables": [
        {"other_table": "foreign_key_references", "column": "id", "other_column": "foreign_key_with_blank_label", "count": 0, "link": "/fixtures/foreign_key_references?foreign_key_with_blank_label=1"},
        {"other_table": "foreign_key_references", "column": "id", "other_column": "foreign_key_with_label", "count": 1, "link": "/fixtures/foreign_key_references?foreign_key_with_label=1"},
        {"other_table": "complex_foreign_keys", "column": "id", "other_column": "f3", "count": 1, "link": "/fixtures/complex_foreign_keys?f3=1"},
        {"other_table": "complex_foreign_keys", "column": "id", "other_column": "f2", "count": 0, "link": "/fixtures/complex_foreign_keys?f2=1"},
        {"other_table": "complex_foreign_keys", "column": "id", "other_column": "f1", "count": 1, "link": "/fixtures/complex_foreign_keys?f1=1"}
    ],
    "query_ms": 4.226590999678592,
    "source": "tests/fixtures.py",
    "source_url": "https://github.com/simonw/datasette/blob/main/tests/fixtures.py",
    "license": "Apache License 2.0",
    "license_url": "https://github.com/simonw/datasette/blob/main/LICENSE",
    "ok": true,
    "truncated": false
}
```

That ?_extras= should be ?_extra= - plus the row JSON should be redesigned to fit the new default JSON representation.

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2137/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1840417903 I_kwDOBm6k_c5tsoxv 2131 Refactor code that supports templates_considered comment simonw 9599 open 0   Datasette 1.0 3268330 1 2023-08-08T01:28:36Z 2023-08-09T15:27:41Z   OWNER  

I ended up duplicating it here: https://github.com/simonw/datasette/blob/7532feb424b1dce614351e21b2265c04f9669fe2/datasette/views/database.py#L164-L167

I think it should move to datasette.render_template() - and maybe have a renamed template variable too.

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2131/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1840324765 I_kwDOBm6k_c5tsSCd 2129 CSV ?sql= should indicate errors simonw 9599 open 0   Datasette 1.0 3268330 1 2023-08-07T23:13:04Z 2023-08-08T02:02:21Z   OWNER  

https://latest.datasette.io/_memory.csv?sql=select+blah is a blank page right now:

```bash
curl -I 'https://latest.datasette.io/_memory.csv?sql=select+blah'
HTTP/2 200
access-control-allow-origin: *
access-control-allow-headers: Authorization, Content-Type
access-control-expose-headers: Link
access-control-allow-methods: GET, POST, HEAD, OPTIONS
access-control-max-age: 3600
content-type: text/plain; charset=utf-8
x-databases: _memory, _internal, fixtures, fixtures2, extra_database, ephemeral
date: Mon, 07 Aug 2023 23:12:15 GMT
server: Google Frontend
```

Originally posted by @simonw in https://github.com/simonw/datasette/issues/2118#issuecomment-1668688947
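One way to surface the error before any 200 status line goes out (an assumed approach for illustration, not Datasette's actual fix) is to compile the SQL up front: `EXPLAIN` makes SQLite prepare the statement without running the underlying query, so "no such column" errors appear before the first CSV byte is streamed.

```python
import sqlite3

def validate_sql(conn, sql):
    """Return an error message for bad SQL, or None if it compiles.

    EXPLAIN compiles the statement without executing the underlying
    query, so errors surface before any response bytes are sent.
    """
    try:
        conn.execute("EXPLAIN " + sql)
    except sqlite3.Error as e:
        return str(e)
    return None

conn = sqlite3.connect(":memory:")
print(validate_sql(conn, "select 1"))     # None
print(validate_sql(conn, "select blah"))  # no such column: blah
```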

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2129/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1803264272 I_kwDOBm6k_c5re6EQ 2101 alter: true support for JSON write API simonw 9599 open 0     1 2023-07-13T15:24:11Z 2023-07-13T15:24:18Z   OWNER  

Requested here: https://discord.com/channels/823971286308356157/823971286941302908/1129034187073134642

The former datasette-insert plugin had an option ?alter=1 to auto-add new columns. Does the JSON write API also have this?
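The mechanics of an `?alter=1`-style option can be sketched with the stdlib alone (the helper below is illustrative, not Datasette's or datasette-insert's code, and TEXT affinity for new columns is a simplification): inspect `PRAGMA table_info`, `ALTER TABLE ... ADD COLUMN` for anything missing, then insert.

```python
import sqlite3

def insert_with_alter(conn, table, row):
    """Insert a dict row, auto-adding any missing columns first.
    Rough sketch of an ?alter=1 option; real code would infer types."""
    existing = {r[1] for r in conn.execute(f"PRAGMA table_info({table})")}
    for col in row:
        if col not in existing:
            conn.execute(f"ALTER TABLE {table} ADD COLUMN {col} TEXT")
    cols = ", ".join(row)
    params = ", ".join("?" for _ in row)
    conn.execute(
        f"INSERT INTO {table} ({cols}) VALUES ({params})", list(row.values())
    )

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE dogs (id INTEGER PRIMARY KEY, name TEXT)")
# "breed" does not exist yet -- it gets added on the fly:
insert_with_alter(conn, "dogs", {"id": 1, "name": "Cleo", "breed": "mutt"})
result = conn.execute("SELECT id, name, breed FROM dogs").fetchall()
print(result)  # [(1, 'Cleo', 'mutt')]
```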

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2101/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1698865182 I_kwDOBm6k_c5lQqAe 2069 [BUG] Cannot insert new data to deployed instance yqlbu 31861128 open 0     1 2023-05-07T02:59:42Z 2023-05-07T03:17:35Z   NONE  

Summary

Recently, I deployed an instance of datasette to Vercel with the following plugins:

  • datasette-auth-tokens
  • datasette-insert

With the above plugins, I was able to insert new data to local sqlite db. However, when it comes to the deployment on Vercel, things behave differently. I observed some errors from the logs console on Vercel:

```console
File "/var/task/datasette/database.py", line 179, in _execute_writes
    conn = self.connect(write=True)
File "/var/task/datasette/database.py", line 93, in connect
    assert not (write and not self.is_mutable)
AssertionError
```

I think it is a potential bug.

Reproduce

metadata.json

```json
{
    "plugins": {
        "datasette-insert": {
            "allow": {
                "id": "*"
            }
        },
        "datasette-auth-tokens": {
            "tokens": [
                {
                    "token": {
                        "$env": "INSERT_TOKEN"
                    },
                    "actor": {
                        "id": "repeater"
                    }
                }
            ],
            "param": "_auth_token"
        }
    }
}
```

commands

```bash
# deploy
datasette publish vercel remote.db \
  --project=repeater-bot-sqlite \
  --metadata metadata.json \
  --install datasette-auth-tokens \
  --install datasette-insert \
  --vercel-json=vercel.json

# test insert
cat fixtures/dogs.json | curl --request POST -d @- -H "Authorization: Bearer <token>" \
  'https://repeater-bot-sqlite.vercel.app/-/insert/remote/dogs?pk=id'
```

logs

```console
Traceback (most recent call last):
  File "/var/task/datasette/app.py", line 1354, in route_path
    response = await view(request, send)
  File "/var/task/datasette/app.py", line 1500, in async_view_fn
    response = await async_call_with_supported_arguments(
  File "/var/task/datasette/utils/__init__.py", line 1005, in async_call_with_supported_arguments
    return await fn(*call_with)
  File "/var/task/datasette_insert/__init__.py", line 14, in insert_or_upsert
    response = await insert_or_upsert_implementation(request, datasette)
  File "/var/task/datasette_insert/__init__.py", line 91, in insert_or_upsert_implementation
    table_count = await db.execute_write_fn(write_in_thread, block=True)
  File "/var/task/datasette/database.py", line 167, in execute_write_fn
    raise result
  File "/var/task/datasette/database.py", line 179, in _execute_writes
    conn = self.connect(write=True)
  File "/var/task/datasette/database.py", line 93, in connect
    assert not (write and not self.is_mutable)
AssertionError
```
datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2069/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1690765434 I_kwDOBm6k_c5kxwh6 2067 Litestream-restored db: errors on 3.11 and 3.10.8; but works on py3.10.7 and 3.10.6 justmars 39538958 open 0     1 2023-05-01T12:42:28Z 2023-05-03T00:16:03Z   NONE  

Hi! Wondering if this issue is limited to my local system or if it affects others as well.

It seems like 3.11 errors out on a "litestream-restored" database. On further investigation, it also appears to conk out on 3.10.8 but works on 3.10.7 and 3.10.6.

To demo issue I created a test database, replicated it to an aws s3 bucket, then restored the same under various .pyenv-versioned shells where I test whether I can read the database via the sqlite3 cli.

```sh
# create new shell with 3.11.3
litestream restore -o data/db.sqlite s3://mytestbucketxx/db
sqlite3 data/db.sqlite
# SQLite version 3.41.2 2023-03-22 11:56:21
# Enter ".help" for usage hints.
sqlite> .tables
_litestream_lock  _litestream_seq   movie
sqlite>
```

However, this gets me an OperationalError when reading via Datasette:

Error on 3.11.3 and 3.10.8

```sh
datasette data/db.sqlite
```

```console
/tester/.venv/lib/python3.11/site-packages/pkg_resources/__init__.py:121: DeprecationWarning: pkg_resources is deprecated as an API
  warnings.warn("pkg_resources is deprecated as an API", DeprecationWarning)
Traceback (most recent call last):
  File "/tester/.venv/bin/datasette", line 8, in <module>
    sys.exit(cli())
  File "/tester/.venv/lib/python3.11/site-packages/click/core.py", line 1130, in __call__
    return self.main(*args, **kwargs)
  File "/tester/.venv/lib/python3.11/site-packages/click/core.py", line 1055, in main
    rv = self.invoke(ctx)
  File "/tester/.venv/lib/python3.11/site-packages/click/core.py", line 1657, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/tester/.venv/lib/python3.11/site-packages/click/core.py", line 1404, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/tester/.venv/lib/python3.11/site-packages/click/core.py", line 760, in invoke
    return __callback(*args, **kwargs)
  File "/tester/.venv/lib/python3.11/site-packages/datasette/cli.py", line 143, in wrapped
    return fn(*args, **kwargs)
  File "/tester/.venv/lib/python3.11/site-packages/datasette/cli.py", line 615, in serve
    asyncio.get_event_loop().run_until_complete(check_databases(ds))
  File "/Users/mv/.pyenv/versions/3.11.3/lib/python3.11/asyncio/base_events.py", line 653, in run_until_complete
    return future.result()
  File "/tester/.venv/lib/python3.11/site-packages/datasette/cli.py", line 660, in check_databases
    await database.execute_fn(check_connection)
  File "/tester/.venv/lib/python3.11/site-packages/datasette/database.py", line 213, in execute_fn
    return await asyncio.get_event_loop().run_in_executor(
  File "/Users/mv/.pyenv/versions/3.11.3/lib/python3.11/concurrent/futures/thread.py", line 58, in run
    result = self.fn(*self.args, **self.kwargs)
  File "/tester/.venv/lib/python3.11/site-packages/datasette/database.py", line 211, in in_thread
    return fn(conn)
  File "/tester/.venv/lib/python3.11/site-packages/datasette/utils/__init__.py", line 951, in check_connection
    for r in conn.execute(
sqlite3.OperationalError: unable to open database file
```

Works on 3.10.7, 3.10.6

```sh
# create new shell with 3.10.7 / 3.10.6
litestream restore -o data/db.sqlite s3://mytestbucketxx/db
datasette data/db.sqlite
# ...
# INFO: Uvicorn running on http://127.0.0.1:8001 (Press CTRL+C to quit)
```

In both scenarios, the only dependencies were the pinned python version and the latest Datasette version 0.64.

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2067/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1665053646 I_kwDOBm6k_c5jPrPO 2059 "Deceptive site ahead" alert on Heroku deployment mtdukes 1186275 open 0     1 2023-04-12T18:34:51Z 2023-04-13T01:13:01Z   NONE  

I deployed a fairly basic instance of Datasette (datasette-auth-passwords is the only plugin) using Heroku. The deployed URL now gives a "Deceptive site ahead" warning to users.

Is there a way around this? Maybe a way to add ownership verification through Google's search console?

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2059/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1649791661 I_kwDOBm6k_c5iVdKt 2050 Row page JSON should use new ?_extra= format simonw 9599 open 0   Datasette 1.0a-next 8755003 1 2023-03-31T17:56:53Z 2023-03-31T17:59:49Z   OWNER  

https://latest.datasette.io/fixtures/facetable/2.json

Related: - #2049 - #1709

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2050/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1579973223 I_kwDOBm6k_c5eLHpn 2024 Mention WAL mode in documentation simonw 9599 open 0     1 2023-02-10T16:11:10Z 2023-02-10T16:11:53Z   OWNER  

It's not currently obvious from the docs how you can ensure that Datasette runs well in situations where other processes may update the underlying SQLite files.
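For reference, the docs would essentially need to cover one pragma: switching the file to WAL journal mode lets a writer coexist with concurrent readers, and the setting persists in the database file itself. A minimal stdlib sketch (temporary path is illustrative):

```python
import os
import sqlite3
import tempfile

# WAL mode is a property of the database *file*, not the connection.
path = os.path.join(tempfile.mkdtemp(), "data.db")
conn = sqlite3.connect(path)
mode = conn.execute("PRAGMA journal_mode=WAL").fetchone()[0]
print(mode)  # wal
conn.close()

# A brand-new connection sees the same mode, because it persisted:
mode2 = sqlite3.connect(path).execute("PRAGMA journal_mode").fetchone()[0]
print(mode2)  # wal
```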

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2024/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
473288428 MDExOlB1bGxSZXF1ZXN0MzAxNDgzNjEz 564 First proof-of-concept of Datasette Library simonw 9599 open 0     1 2019-07-26T10:22:26Z 2023-02-07T15:14:11Z   OWNER simonw/datasette/pulls/564

Refs #417. Run it like this:

 datasette -d ~/Library

Uses a new plugin hook - available_databases()

datasette 107914493 pull    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/564/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
1  
1571207083 I_kwDOBm6k_c5dprer 2016 Database metadata fields like description are not available in the index page template's context palewire 9993 open 0   Datasette 1.0 3268330 1 2023-02-05T02:25:53Z 2023-02-05T22:56:43Z   NONE  

When looping through databases in the index.html template, I'd like to print the description of each database alongside its name. But it appears that isn't passed in from the view, unless I'm missing it. It would be great to have that.

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2016/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1556065335 PR_kwDOBm6k_c5Ie5nA 2004 use single quotes for string literals, fixes #2001 cldellow 193185 open 0     1 2023-01-25T05:08:46Z 2023-02-01T06:37:18Z   CONTRIBUTOR simonw/datasette/pulls/2004

This modernizes some uses of double quotes for string literals to use only single quotes, fixes simonw/datasette#2001

While developing it, I manually enabled the stricter mode by using the code snippet at https://gist.github.com/cldellow/85bba507c314b127f85563869cd94820

I think that code snippet isn't generally safe/portable, so I haven't tried to automate it in the tests.
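The hazard the PR guards against is easy to show with the stdlib (table and data below are invented): in SQLite's default configuration, a double-quoted identifier that matches no column silently falls back to being a string literal, so a typo returns wrong results instead of an error.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (name TEXT)")
conn.execute("INSERT INTO t (name) VALUES ('datasette')")

# "name" IS a column, so this compares the column to itself: 1 row.
ok_count = conn.execute(
    'SELECT count(*) FROM t WHERE name = "name"'
).fetchone()[0]
print(ok_count)  # 1

# "nmae" is a typo; with double-quote fallback it quietly becomes the
# string 'nmae' instead of raising "no such column": 0 rows, no error.
typo_count = conn.execute(
    'SELECT count(*) FROM t WHERE name = "nmae"'
).fetchone()[0]
print(typo_count)  # 0
```

Single-quoting string literals means a mistyped identifier fails loudly rather than silently matching nothing.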


:books: Documentation preview :books:: https://datasette--2004.org.readthedocs.build/en/2004/

datasette 107914493 pull    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2004/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
0  
1564769997 I_kwDOBm6k_c5dRH7N 2011 Applied facet did not result in an "x" icon to dismiss it simonw 9599 open 0     1 2023-01-31T17:57:44Z 2023-01-31T17:58:54Z   OWNER  

That's against this data https://data.sfgov.org/City-Management-and-Ethics/Supplier-Contracts/cqi5-hm2d imported using https://datasette.io/plugins/datasette-socrata

It's for a Contract Type of Non-Purchasing Contract (Rents, etc.) - so it's possible that some of the spaces or punctuation in either the name or the value tripped up the code that decides if the X icon should be displayed.

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2011/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1563264257 I_kwDOBm6k_c5dLYUB 2010 Row page should default to card view simonw 9599 open 0   Datasette 1.0 3268330 1 2023-01-30T21:49:37Z 2023-01-30T21:52:06Z   OWNER  

Datasette currently uses the same table layout on the row pages as it does on the table pages:

https://datasette.io/content/pypi_packages?_sort=name&name__exact=datasette-column-inspect

https://datasette.io/content/pypi_packages/datasette-column-inspect

If you shrink down to mobile width you get this instead, on both of those pages:

I think that view, which I think of as the "card view", is plain better if you're looking at just a single row - and it (or a variant of it) should be the default presentation on the row page.

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2010/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1557507274 I_kwDOBm6k_c5c1azK 2005 `extra_template_vars` should be OK to return `None` simonw 9599 open 0     1 2023-01-26T01:40:45Z 2023-01-26T01:41:50Z   OWNER  

Got this exception and had to make sure it always returned {}:

```
File ".../python3.11/site-packages/datasette/app.py", line 1049, in render_template
    assert isinstance(extra_vars, dict), "extra_vars is of type {}".format(
AssertionError: extra_vars is of type <class 'NoneType'>
```
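A minimal sketch of the defensive fix being requested (function and variable names here are illustrative, not Datasette's internals): treat a hook that returns `None` as contributing nothing, instead of asserting on its type.

```python
def gather_extra_vars(hook_results):
    """Merge extra_template_vars-style hook results into one dict,
    tolerating hooks that return None or a callable."""
    extra_vars = {}
    for result in hook_results:
        if callable(result):   # hooks may return a callable...
            result = result()
        if result is None:     # ...or None, which should mean "nothing"
            continue
        assert isinstance(result, dict), (
            "extra_vars is of type {}".format(type(result))
        )
        extra_vars.update(result)
    return extra_vars

merged = gather_extra_vars([{"a": 1}, None, lambda: {"b": 2}])
print(merged)  # {'a': 1, 'b': 2}
```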

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2005/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1536851861 I_kwDOBm6k_c5bmn-V 1994 Stuck on loading screen jackhagley 10913053 open 0     1 2023-01-17T18:33:49Z 2023-01-23T08:21:08Z   NONE  

Can’t actually open it!

Downloaded today from the releases tab

Running macOS13.1

```
bin/python3.9 --version
Python 3.9.6
Took 83ms
bin/python3.9 --version
Python 3.9.6
Took 113ms
bin/pip install datasette>=0.59 datasette-app-support>=0.11.6 datasette-vega>=0.6.2 datasette-cluster-map>=0.17.1 datasette-pretty-json>=0.2.1 datasette-edit-schema>=0.4 datasette-configure-fts>=1.1 datasette-leaflet>=0.2.2 --disable-pip-version-check
Requirement already satisfied: datasette>=0.59 in lib/python3.9/site-packages (0.63)
Requirement already satisfied: datasette-app-support>=0.11.6 in lib/python3.9/site-packages (0.11.6)
Requirement already satisfied: datasette-vega>=0.6.2 in lib/python3.9/site-packages (0.6.2)
Requirement already satisfied: datasette-cluster-map>=0.17.1 in lib/python3.9/site-packages (0.17.2)
Requirement already satisfied: datasette-pretty-json>=0.2.1 in lib/python3.9/site-packages (0.2.2)
Requirement already satisfied: datasette-edit-schema>=0.4 in lib/python3.9/site-packages (0.5.1)
Requirement already satisfied: datasette-configure-fts>=1.1 in lib/python3.9/site-packages (1.1)
Requirement already satisfied: datasette-leaflet>=0.2.2 in lib/python3.9/site-packages (0.2.2)
Requirement already satisfied: click>=7.1.1 in lib/python3.9/site-packages (from datasette>=0.59) (8.1.3)
Requirement already satisfied: hupper>=1.9 in lib/python3.9/site-packages (from datasette>=0.59) (1.10.3)
Requirement already satisfied: pint>=0.9 in lib/python3.9/site-packages (from datasette>=0.59) (0.20.1)
Requirement already satisfied: PyYAML>=5.3 in lib/python3.9/site-packages (from datasette>=0.59) (6.0)
Requirement already satisfied: httpx>=0.20 in lib/python3.9/site-packages (from datasette>=0.59) (0.23.0)
Requirement already satisfied: aiofiles>=0.4 in lib/python3.9/site-packages (from datasette>=0.59) (22.1.0)
Requirement already satisfied: asgi-csrf>=0.9 in lib/python3.9/site-packages (from datasette>=0.59) (0.9)
Requirement already satisfied: asgiref>=3.2.10 in lib/python3.9/site-packages (from datasette>=0.59) (3.5.2)
Requirement already satisfied: uvicorn>=0.11 in lib/python3.9/site-packages (from datasette>=0.59) (0.19.0)
Requirement already satisfied: itsdangerous>=1.1 in lib/python3.9/site-packages (from datasette>=0.59) (2.1.2)
Requirement already satisfied: click-default-group-wheel>=1.2.2 in lib/python3.9/site-packages (from datasette>=0.59) (1.2.2)
Requirement already satisfied: janus>=0.6.2 in lib/python3.9/site-packages (from datasette>=0.59) (1.0.0)
Requirement already satisfied: pluggy>=1.0 in lib/python3.9/site-packages (from datasette>=0.59) (1.0.0)
Requirement already satisfied: Jinja2>=2.10.3 in lib/python3.9/site-packages (from datasette>=0.59) (3.1.2)
Requirement already satisfied: mergedeep>=1.1.1 in lib/python3.9/site-packages (from datasette>=0.59) (1.3.4)
Requirement already satisfied: sqlite-utils in lib/python3.9/site-packages (from datasette-app-support>=0.11.6) (3.30)
Requirement already satisfied: packaging in lib/python3.9/site-packages (from datasette-app-support>=0.11.6) (21.3)
Requirement already satisfied: python-multipart in lib/python3.9/site-packages (from asgi-csrf>=0.9->datasette>=0.59) (0.0.5)
Requirement already satisfied: httpcore<0.16.0,>=0.15.0 in lib/python3.9/site-packages (from httpx>=0.20->datasette>=0.59) (0.15.0)
Requirement already satisfied: certifi in lib/python3.9/site-packages (from httpx>=0.20->datasette>=0.59) (2022.9.24)
Requirement already satisfied: rfc3986[idna2008]<2,>=1.3 in lib/python3.9/site-packages (from httpx>=0.20->datasette>=0.59) (1.5.0)
Requirement already satisfied: sniffio in lib/python3.9/site-packages (from httpx>=0.20->datasette>=0.59) (1.3.0)
Requirement already satisfied: h11<0.13,>=0.11 in lib/python3.9/site-packages (from httpcore<0.16.0,>=0.15.0->httpx>=0.20->datasette>=0.59) (0.12.0)
Requirement already satisfied: anyio==3.* in lib/python3.9/site-packages (from httpcore<0.16.0,>=0.15.0->httpx>=0.20->datasette>=0.59) (3.6.2)
Requirement already satisfied: idna>=2.8 in lib/python3.9/site-packages (from anyio==3.*->httpcore<0.16.0,>=0.15.0->httpx>=0.20->datasette>=0.59) (3.4)
Requirement already satisfied: typing-extensions>=3.7.4.3 in lib/python3.9/site-packages (from janus>=0.6.2->datasette>=0.59) (4.4.0)
Requirement already satisfied: MarkupSafe>=2.0 in lib/python3.9/site-packages (from Jinja2>=2.10.3->datasette>=0.59) (2.1.1)
Requirement already satisfied: tabulate in lib/python3.9/site-packages (from sqlite-utils->datasette-app-support>=0.11.6) (0.9.0)
Requirement already satisfied: python-dateutil in lib/python3.9/site-packages (from sqlite-utils->datasette-app-support>=0.11.6) (2.8.2)
Requirement already satisfied: sqlite-fts4 in lib/python3.9/site-packages (from sqlite-utils->datasette-app-support>=0.11.6) (1.0.3)
Requirement already satisfied: pyparsing!=3.0.5,>=2.0.2 in lib/python3.9/site-packages (from packaging->datasette-app-support>=0.11.6) (3.0.9)
Requirement already satisfied: six>=1.5 in lib/python3.9/site-packages (from python-dateutil->sqlite-utils->datasette-app-support>=0.11.6) (1.16.0)
Took 784ms
STUCK
```

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1994/reactions",
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1552368054 I_kwDOBm6k_c5ch0G2 2000 rewrite_sql hook cldellow 193185 open 0     1 2023-01-23T01:02:52Z 2023-01-23T06:08:01Z   CONTRIBUTOR  

I'm not sold that this is a good idea, but thought it'd be worth writing up a ticket. Proposal: add a hook like

```python
def rewrite_sql(datasette, database, request, fn, sql, params)
```

It would be called from Database.execute, Database.execute_write, Database.execute_write_script, Database.execute_write_many before running the user's SQL. fn would indicate which method was being used, in case that's relevant for the SQL inspection -- for example execute only permits a single statement.

The hook could return a SQL statement to be executed instead, or an async function to be awaited on that returned the SQL to be executed.

Plugins that could be written with this hook:

  • https://github.com/cldellow/datasette-ersatz-table-valued-functions would use this to avoid monkey-patching
  • a plugin to inspect and reject unsafe Spatialite function calls (reported by Simon in Discord)
  • a plugin to do more general rewrites of queries to enforce table or row-level security, for example, based on the currently logged in actor's ID
  • a plugin to maintain audit tables when users write to a table
  • a plugin to cache expensive queries (eg the queries that drive facets) - these could allow stale reads if previously cached, then refresh them in an offline queue

Flaws with this idea:

execute_fn and execute_write_fn would not go through this hook, which limits the guarantees you can make about it for security purposes.
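One of the plugin ideas above, rejecting unsafe function calls, could be sketched against the proposed hook like this. The hook does not exist in Datasette; BLOCKED_FUNCTIONS and the matching logic are purely illustrative.

```python
# Hypothetical plugin body for the proposed rewrite_sql hook: block
# queries that call functions considered unsafe, pass everything else
# through unchanged.

BLOCKED_FUNCTIONS = ("ImportDBF", "ExportDBF")  # illustrative names

def rewrite_sql(datasette, database, request, fn, sql, params):
    lowered = sql.lower()
    for func in BLOCKED_FUNCTIONS:
        if func.lower() + "(" in lowered:
            raise ValueError("Function {} is not allowed".format(func))
    # Returning the SQL unchanged leaves the query as-is
    return sql
```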

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2000/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1538342965 PR_kwDOBm6k_c5HpNYo 1996 Document custom json encoder eyeseast 25778 open 0     1 2023-01-18T16:54:14Z 2023-01-19T12:55:57Z   CONTRIBUTOR simonw/datasette/pulls/1996

Closes #1983

All documentation here. Edits welcome.


:books: Documentation preview :books:: https://datasette--1996.org.readthedocs.build/en/1996/

datasette 107914493 pull    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1996/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
0  
1426080014 I_kwDOBm6k_c5VAEEO 1867 /db/table/-/rename API (also allows atomic replace) simonw 9599 open 0   Datasette 1.0a-next 8755003 1 2022-10-27T18:13:23Z 2023-01-09T15:34:12Z   OWNER  

There's one catch with batched inserts: if your CLI tool fails half way through you could end up with a partially populated table - since a bunch of batches will have succeeded first.

...

If people care about that kind of thing they could always push all of their inserts to a table called _tablename and then atomically rename that once they've uploaded all of the data (assuming I provide an atomic-rename-this-table mechanism).

Originally posted by @simonw in https://github.com/simonw/datasette/issues/1866#issuecomment-1293893789
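The staging-table pattern described above can be sketched with plain sqlite3. Table names here are illustrative, and this is a sketch of the idea rather than the proposed Datasette API.

```python
import sqlite3

# Push all inserts into a staging table named _tablename, then swap it
# into place in a single transaction so readers never see a partial table.

def atomic_replace(conn, table):
    staging = "_{}".format(table)
    with conn:  # one transaction: both statements apply, or neither
        conn.execute("DROP TABLE IF EXISTS [{}]".format(table))
        conn.execute("ALTER TABLE [{}] RENAME TO [{}]".format(staging, table))
```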

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1867/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1524983536 I_kwDOBm6k_c5a5Wbw 1981 Canned query field labels truncated simonw 9599 open 0     1 2023-01-09T06:04:24Z 2023-01-09T06:05:44Z   OWNER  

Eg here on mobile: https://timezones.datasette.io/timezones/by_point?longitude=-0.1406632&latitude=50.8246776

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1981/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1115435536 I_kwDOBm6k_c5CfDIQ 1614 Try again with SQLite codemirror support simonw 9599 open 0     1 2022-01-26T20:05:20Z 2022-12-23T21:27:10Z   OWNER  

I tried and failed to implement autocomplete a while ago. Relevant code:

https://github.com/codemirror/legacy-modes/blob/8f36abca5f55024258cd23d9cfb0203d8d244f0d/mode/sql.js#L335

Sounds like upgrading to CodeMirror 6 ASAP would be worthwhile since it has better accessibility and touch screen support: https://codemirror.net/6/

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1614/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1501778647 I_kwDOBm6k_c5Zg1LX 1964 Cog menu is not keyboard accessible (also no ARIA) simonw 9599 open 0     1 2022-12-18T06:36:28Z 2022-12-18T06:37:28Z   OWNER  

This menu here: https://latest.datasette.io/fixtures/attraction_characteristic

You can tab to it (see the outline) and hit space or enter to open it, but you can't then navigate the items in the open menu using the keyboard.

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1964/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1500636982 I_kwDOBm6k_c5Zcec2 1962 Alternative, async-friendly pattern for `make_app_client()` and similar - fully retire `TestClient` simonw 9599 open 0     1 2022-12-16T17:56:51Z 2022-12-16T21:55:29Z   OWNER  

In this issue I replaced a whole bunch of places that used the non-async app_client fixture with an async ds_client fixture instead: - #1959

But I didn't get everything, and a lot of tests are still using the old TestClient mechanism as a result.

The main work here is replacing all of the app_client_... fixtures which use variants on the default client - and changing the tests that call make_app_client() to do something else instead.

This requires some careful thought. I need to come up with a really nice pattern for creating variants on the ds_client default fixture - and do so in a way that minimizes the number of open files, refs:

  • 1843

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1962/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1490576818 I_kwDOBm6k_c5Y2GWy 1943 `/-/permissions` should list available permissions simonw 9599 open 0   Datasette 1.0a-next 8755003 1 2022-12-11T23:38:03Z 2022-12-15T00:41:37Z   OWNER  

Idea: a /-/permissions introspection endpoint for listing registered permissions

Originally posted by @simonw in https://github.com/simonw/datasette/issues/1939#issuecomment-1345691103

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1943/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1497577017 I_kwDOBm6k_c5ZQzY5 1957 Reconsider row value truncation on query page simonw 9599 open 0     1 2022-12-14T23:49:47Z 2022-12-14T23:50:50Z   OWNER  

Consider this example: https://ripgrep.datasette.io/repos?sql=select+json_group_array%28full_name%29+from+repos

```sql
select json_group_array(full_name) from repos
```

My intention here was to get a string of JSON I can copy and paste elsewhere - see: https://til.simonwillison.net/sqlite/compare-before-after-json

The truncation isn't helping here.

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1957/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1495716243 I_kwDOBm6k_c5ZJtGT 1952 Improvements to /-/create-token restrictions interface simonw 9599 open 0   Datasette 1.0a-next 8755003 1 2022-12-14T05:22:39Z 2022-12-14T05:23:13Z   OWNER  

It would be neat not to show write permissions against immutable databases too - and not hard from a performance perspective since it doesn't involve hundreds more permission checks.

That will need permissions to grow a flag for if they need a mutable database though, which is a bigger job.

Originally posted by @simonw in https://github.com/simonw/datasette/issues/1947#issuecomment-1350414402

Also, DO show the _memory database there if Datasette was started in --crossdb mode.

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1952/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1468689139 I_kwDOBm6k_c5Ximrz 1914 Finalize design of JSON for Datasette 1.0 simonw 9599 open 0   Datasette 1.0a-next 8755003 1 2022-11-29T20:59:10Z 2022-12-13T06:15:54Z   OWNER  

Tracking issue.

  • [ ] #1709
  • [ ] #1729
  • [ ] #1875
datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1914/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1200649502 I_kwDOBm6k_c5HkHUe 1709 Redesigned JSON API with ?_extra= parameters simonw 9599 open 0   Datasette 1.0a-next 8755003 1 2022-04-11T22:57:49Z 2022-12-13T05:29:06Z   OWNER  

This will be the single biggest breaking change for the 1.0 release.

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1709/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1200650491 I_kwDOBm6k_c5HkHj7 1711 Template context powered entirely by the JSON API format simonw 9599 open 0   Datasette 1.0a-next 8755003 1 2022-04-11T22:59:27Z 2022-12-13T05:29:06Z   OWNER  

Datasette 1.0 will have a stable template context. I'm going to achieve this by refactoring the templates to work only with keys returned by the API (or some of its extras) - then the API documentation will double up as template documentation.

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1711/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1197926598 I_kwDOBm6k_c5HZujG 1705 How to upgrade your plugin for 1.0 documentation simonw 9599 open 0   Datasette 1.0a-next 8755003 1 2022-04-08T23:16:47Z 2022-12-13T05:29:05Z   OWNER  

Among other things, needed by: - #1704

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1705/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1493404423 I_kwDOBm6k_c5ZA4sH 1948 500 error on permission debug page when testing actors with _r simonw 9599 open 0     1 2022-12-13T05:22:03Z 2022-12-13T05:22:19Z   OWNER  

The 500 error is silent unless you are looking at the DevTools network pane.

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1948/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1487738738 I_kwDOBm6k_c5YrRdy 1942 Option for plugins to request that JSON be served on the page simonw 9599 open 0   Datasette 1.0 3268330 1 2022-12-10T01:08:53Z 2022-12-10T01:11:30Z   OWNER  

Idea came from a conversation with @hydrosquall - what if a Datasette plugin could say "I'd like the JSON for a page to be included in a variable on the HTML page"?

datasette-cluster-map already needs this - the first thing it does when the page loads is fetch() a JSON representation of that same data.

This idea fits with my overall goals to unify the JSON and HTML context too.

Refs: - #1711

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1942/reactions",
    "total_count": 1,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 1,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1486036269 I_kwDOBm6k_c5Ykx0t 1941 Mechanism for supporting key rotation for DATASETTE_SECRET simonw 9599 open 0     1 2022-12-09T05:24:53Z 2022-12-09T05:25:20Z   OWNER  

Currently if you change DATASETTE_SECRET all existing signed tokens - both cookies and API tokens and potentially other things too - will instantly expire.

Adding support for key rotation would allow keys to be rotated on a semi-regular basis without logging everyone out / invalidating every API token instantly.

Can model this on how Django does it: https://github.com/django/django/commit/0dcd549bbe36c060f536ec270d34d9e7d4b8e6c7
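An illustrative stdlib-only sketch of the rotation pattern, not Datasette's or Django's actual implementation: sign with the newest key, but accept signatures produced by any key still in the rotation list.

```python
import hashlib
import hmac

# keys is ordered oldest to newest; only the newest key signs new values.

def sign(value, keys):
    return hmac.new(keys[-1].encode(), value.encode(), hashlib.sha256).hexdigest()

# Verification tries every key, so tokens signed before a rotation remain
# valid until their key is dropped from the list.

def verify(value, signature, keys):
    return any(
        hmac.compare_digest(
            hmac.new(key.encode(), value.encode(), hashlib.sha256).hexdigest(),
            signature,
        )
        for key in keys
    )
```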

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1941/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1469062686 I_kwDOBm6k_c5XkB4e 1919 Intermittent `test_delete_row` test failure simonw 9599 open 0     1 2022-11-30T05:18:46Z 2022-11-30T05:20:56Z   OWNER  

https://github.com/simonw/datasette/actions/runs/3580503393/jobs/6022689591

```
delete_response = await ds_write.client.post(
    "/data/{}/{}/-/delete".format(table, delete_path),
    headers={
        "Authorization": "***".format(write_token(ds_write)),
    },
)

assert delete_response.status_code == 200

E   assert 404 == 200
E    +  where 404 = <Response [404 Not Found]>.status_code

/home/runner/work/datasette/datasette/tests/test_api_write.py:396: AssertionError
=========================== short test summary info ============================
FAILED tests/test_api_write.py::test_delete_row[compound_pk_table-row_for_create2-pks2-article,k] - assert 404 == 200
 +  where 404 = <Response [404 Not Found]>.status_code
```

This passes most of the time, but very occasionally fails - in this case in Python 3.7.

It seems to only fail for the article,k compound primary key test.

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1919/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1466952626 I_kwDOBm6k_c5Xb-uy 1909 Option to sort facets alphabetically simonw 9599 open 0     1 2022-11-28T19:18:14Z 2022-11-28T19:19:26Z   OWNER  

Suggested here: - https://github.com/simonw/datasette/discussions/1908

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1909/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
860722711 MDU6SXNzdWU4NjA3MjI3MTE= 1301 Publishing to cloudrun with immutable mode? louispotok 5413548 open 0     1 2021-04-18T17:51:46Z 2022-10-07T02:38:04Z   CONTRIBUTOR  

I'm a bit confused about immutable mode and publishing to cloudrun. (I want to publish with immutable mode so that I can support database downloads.)

Running datasette publish cloudrun --extra-options="-i example.db" leads to an error:

Error: Invalid value for '-i' / '--immutable': Path 'example.db' does not exist.

However, running datasette publish cloudrun example.db not only works but seems to publish in immutable mode anyway! I'm seeing this both with /-/databases.json and the fact that downloads are working.

When I just datasette serve locally, this succeeds both ways and works as expected.

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1301/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1387712501 I_kwDOBm6k_c5Sts_1 1824 Convert &_hide_sql=1 to #_hide_sql CharlesNepote 562352 open 0     1 2022-09-27T12:53:31Z 2022-10-05T12:56:27Z   NONE  

Hiding the SQL textarea with &_hide_sql=1 forces a page reload, which can take several seconds and use server resources (which is annoying for a big database or complex queries).

It could probably be done with a few lines of Javascript (I'm going to see if I can do that).

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1824/reactions",
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1366915240 I_kwDOBm6k_c5ReXio 1807 Plugin ecosystem needs to avoid crashes due to no available databases simonw 9599 open 0     1 2022-09-08T19:54:34Z 2022-09-08T20:14:05Z   OWNER  

Opening this here to track the issue first reported in: - https://github.com/simonw/datasette-upload-dbs/issues/5

Plugins that expect to be able to write to a database need to not crash in situations where no writable database is available.

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1807/reactions",
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1268121674 PR_kwDOBm6k_c45fz-O 1757 feat: add a wildcard for _json columns ytjohn 163156 open 0     1 2022-06-11T01:01:17Z 2022-09-06T00:51:21Z   FIRST_TIME_CONTRIBUTOR simonw/datasette/pulls/1757

This allows _json to accept a wildcard for when there are many JSON columns that the user wants to convert. I hope this is useful. I've tested it on our datasette and haven't run into any issues. I imagine on a large set of results, there could be some performance issues, but it will probably be negligible for most use cases.

On a side note, I ran into an issue where I had to upgrade black on my system beyond the pinned version in setup.py. Here is the upstream issue: https://github.com/psf/black/issues/2964. I didn't include this in the PR yet since I didn't look into the issue too far, but I can if you would like.

datasette 107914493 pull    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1757/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
0  
1347717749 I_kwDOBm6k_c5QVIp1 1791 Updating metadata.json on Datasette for MacOS ment4list 1780782 open 0     1 2022-08-23T10:41:16Z 2022-08-23T13:29:51Z   NONE  

I've installed Datasette for Mac as per the documentation and it's working great!

However, I'm not sure how to go about adding something like "Canned Queries" or utilising other advanced features or settings by manipulating the metadata.json or settings.json files.

I can view these files from the Datasette App from the top right "burger" menu but it only shows the contents of the file with no way to edit or change it.

Am I missing something? Where can I update the metadata.json file using the MacOS App?

PS: This is a fantastic tool! Thanks so much for all the effort and especially adding a bunch of different ways to get started quickly!

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1791/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1337541526 I_kwDOBm6k_c5PuUOW 1780 `facet_time_limit_ms` and `sql_time_limit_ms` overlap? davepeck 53165 open 0     1 2022-08-12T17:55:37Z 2022-08-15T23:50:08Z   NONE  

I needed more than the default 200ms to facet a specific column in a database I was working with, so I ran datasette with --setting facet_time_limit_ms 30000 — definitely overkill!

But it still didn't work; it took a moment to realize I also needed to up my sql_time_limit_ms to something larger too.

I'm happy to submit a PR that documents this behavior if it's helpful. Or, if there's a code change we'd like to make (like making sure sql_time_limit_ms is always set to the larger of itself and facet_time_limit_ms), happy to do that too.
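The code change floated above could be as small as the following. reconcile_time_limits is a hypothetical helper, not Datasette's internals, sketching "sql_time_limit_ms is always set to the larger of itself and facet_time_limit_ms".

```python
# Hypothetical helper: raising facet_time_limit_ms alone would then be
# enough, because the SQL limit is bumped to match.

def reconcile_time_limits(sql_time_limit_ms, facet_time_limit_ms):
    return max(sql_time_limit_ms, facet_time_limit_ms)
```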

Apologies if I missed this somewhere in the docs. And: thanks. I'm really enjoying the simple, effective tooling datasette gives me out of the box for exploring my databases!

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1780/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1200224939 I_kwDOBm6k_c5Hifqr 1707 [feature] expanded detail page fgregg 536941 open 0     1 2022-04-11T16:29:17Z 2022-04-11T16:33:00Z   CONTRIBUTOR  

Right now, if click on the detail page for a row you get the info for the row and links to related tables:

It would be very cool if there was an option to expand the rows of the related tables from within this detail view.

If you had that then datasette could fulfill a pretty common use case where you want to search for an entity and get a consolidated detail view of what you know about that entity.

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1707/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1179998071 I_kwDOBm6k_c5GVVd3 1684 Mechanism for disabling faceting on large tables only simonw 9599 open 0     1 2022-03-24T20:06:11Z 2022-03-24T20:13:19Z   OWNER  

Forest turned off faceting on https://labordata.bunkum.us/ because it was causing performance problems on some of the huge tables - but it would be nice if it could still be an option on smaller tables such as https://labordata.bunkum.us/voluntary_recognitions-4421085/voluntary_recognitions

One option: a new setting that automatically disables faceting (and facet suggestion) for tables that have either more than X rows or that are so big that the count could not be completed within the time limit.

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1684/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
780153562 MDU6SXNzdWU3ODAxNTM1NjI= 1177 Ability to stream all rows as newline-delimited JSON simonw 9599 open 0   Datasette 1.0 3268330 1 2021-01-06T07:10:48Z 2022-03-21T15:08:52Z   OWNER  

Yet another use-case for this: I want to be able to stream newline-delimited JSON in order to better import into Pandas:

pandas.read_json("https://latest.datasette.io/fixtures/compound_three_primary_keys.json?_shape=array&_nl=on", lines=True)

Originally posted by @simonw in https://github.com/simonw/datasette/issues/1101#issuecomment-755128038
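The pandas call above consumes newline-delimited JSON; producing it is a one-object-per-line generator. A minimal sketch of the producing side, not Datasette's actual streaming code:

```python
import json

# Stream one JSON object per line instead of buffering the whole result
# set in memory.

def stream_ndjson(rows):
    for row in rows:
        yield json.dumps(row) + "\n"
```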

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1177/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1174717287 I_kwDOBm6k_c5GBMNn 1674 Tweak design of /.json simonw 9599 open 0   Datasette 1.0 3268330 1 2022-03-20T22:58:01Z 2022-03-20T22:58:40Z   OWNER  

https://latest.datasette.io/.json

Currently:

```json
{
    "_memory": {
        "name": "_memory",
        "hash": null,
        "color": "a6c7b9",
        "path": "/_memory",
        "tables_and_views_truncated": [],
        "tables_and_views_more": false,
        "tables_count": 0,
        "table_rows_sum": 0,
        "show_table_row_counts": false,
        "hidden_table_rows_sum": 0,
        "hidden_tables_count": 0,
        "views_count": 0,
        "private": false
    },
    "fixtures": {
        "name": "fixtures",
        "hash": "645005884646eb941c89997fbd1c0dd6be517cb1b493df9816ae497c0c5afbaa",
        "color": "645005",
        "path": "/fixtures",
        "tables_and_views_truncated": [
            {
                "name": "compound_three_primary_keys",
                "columns": ["pk1", "pk2", "pk3", "content"],
                "primary_keys": ["pk1", "pk2", "pk3"],
                "count": 1001,
                "hidden": false,
                "fts_table": null,
                "num_relationships_for_sorting": 0,
                "private": false
            },
```

As of this issue the "path" key is confusing: it doesn't match what https://latest.datasette.io/-/databases returns:

  • 1668

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1674/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1174708375 I_kwDOBm6k_c5GBKCX 1673 Streaming CSV spends a lot of time in `table_column_details` simonw 9599 open 0     1 2022-03-20T22:25:28Z 2022-03-20T22:34:06Z   OWNER  

At least I think it does. I tried running py-spy top -p $PID against a Datasette process that was trying to do:

datasette covid.db --get '/covid/ny_times_us_counties.csv?_size=10&_stream=on'

While investigating: - #1355

And spotted this:

```
datasette covid.db --get '/covid/ny_times_us_counties.csv?_size=10&_stream=on' (python v3.10.2)
Total Samples 5800
GIL: 71.00%, Active: 98.00%, Threads: 4

  %Own   %Total   OwnTime  TotalTime  Function (filename:line)
  8.00%   8.00%    4.32s     4.38s   sql_operation_in_thread (datasette/database.py:212)
  5.00%   5.00%    3.77s     3.93s   table_column_details (datasette/utils/__init__.py:614)
  6.00%   6.00%    3.72s     3.72s   _worker (concurrent/futures/thread.py:81)
  7.00%   7.00%    2.98s     2.98s   _read_from_self (asyncio/selector_events.py:120)
  5.00%   6.00%    2.35s     2.49s   detect_fts (datasette/utils/__init__.py:571)
  4.00%   4.00%    1.34s     1.34s   _write_to_self (asyncio/selector_events.py:140)
```

Relevant code: https://github.com/simonw/datasette/blob/798f075ef9b98819fdb564f9f79c78975a0f71e8/datasette/utils/__init__.py#L609-L625

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1673/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1174697144 I_kwDOBm6k_c5GBHS4 1672 Refactor CSV handling code out of DataView simonw 9599 open 0   Datasette 1.0 3268330 1 2022-03-20T21:47:00Z 2022-03-20T21:52:39Z   OWNER  

I think the way to get rid of most of the remaining complexity in DataView is to refactor how CSV stuff works - pulling it in line with other export factors and extracting the streaming mechanism. Opening a fresh issue for that.

Originally posted by @simonw in https://github.com/simonw/datasette/issues/1660#issuecomment-1073355032

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1672/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1149310456 I_kwDOBm6k_c5EgRX4 1641 Tweak mobile keyboard settings simonw 9599 open 0     1 2022-02-24T13:47:10Z 2022-02-24T13:49:26Z   OWNER  

https://developer.apple.com/library/archive/documentation/StringsTextFonts/Conceptual/TextAndWebiPhoneOS/KeyboardManagement/KeyboardManagement.html#//apple_ref/doc/uid/TP40009542-CH5-SW12

autocorrect="off" is worth experimenting with.

Twitter: https://twitter.com/forestgregg/status/1496842959563726852

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1641/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1096536240 I_kwDOBm6k_c5BW9Cw 1586 run analyze on all databases as part of start up or publishing fgregg 536941 open 0     1 2022-01-07T17:52:34Z 2022-02-02T07:13:37Z   CONTRIBUTOR  

Running ANALYZE lets SQLite's query planner make much better use of any indices.

It might be nice if ANALYZE were run as part of the start-up of "serve" or "publish".
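
A minimal sketch (plain sqlite3, not Datasette code) of what such a start-up step could do; the `analyze_database()` helper name is invented for illustration:

```python
import os
import sqlite3
import tempfile

def analyze_database(path):
    # ANALYZE writes per-index statistics into sqlite_stat1; the query
    # planner consults that table when choosing between indexes.
    conn = sqlite3.connect(path)
    try:
        conn.execute("ANALYZE")
        conn.commit()
    finally:
        conn.close()

# Throwaway database with one index, standing in for a served .db file
path = os.path.join(tempfile.mkdtemp(), "demo.db")
conn = sqlite3.connect(path)
conn.execute("CREATE TABLE t (id INTEGER PRIMARY KEY, v TEXT)")
conn.execute("CREATE INDEX idx_v ON t (v)")
conn.executemany("INSERT INTO t (v) VALUES (?)", [(str(i),) for i in range(100)])
conn.commit()
conn.close()

analyze_database(path)

conn = sqlite3.connect(path)
stats = conn.execute("SELECT tbl, idx FROM sqlite_stat1").fetchall()
conn.close()
print(stats)
```

After ANALYZE, sqlite_stat1 contains a row for the index, which is what the planner uses.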

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1586/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1113384383 I_kwDOBm6k_c5CXOW_ 1611 Avoid ever running count(*) against SpatiaLite KNN table simonw 9599 open 0     1 2022-01-25T03:32:54Z 2022-02-02T06:45:47Z   OWNER  

Got this in a trace:

Looks like running count(*) against KNN took 83s! It ignored the time limit. And still only returned a count of 0.

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1611/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
328155946 MDU6SXNzdWUzMjgxNTU5NDY= 301 --spatialite option for "datasette publish heroku" simonw 9599 open 0     1 2018-05-31T14:13:09Z 2022-01-20T21:28:50Z   OWNER  

Split off from #243. Need to figure out how to install and configure SpatiaLite on Heroku.

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/301/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
849975810 MDU6SXNzdWU4NDk5NzU4MTA= 1292 Research ctypes.util.find_library('spatialite') simonw 9599 open 0     1 2021-04-04T22:36:59Z 2022-01-20T21:28:50Z   OWNER  

Spotted this in the Django SpatiaLite backend: https://github.com/django/django/blob/8f6a7a0e9e7c5404af6520ae606927e32415eb00/django/contrib/gis/db/backends/spatialite/base.py#L24-L36

```python
ctypes.util.find_library('spatialite')
```
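
Roughly what that probe looks like when run: find_library asks the platform linker for the library and returns a loadable name, or None when SpatiaLite is not installed (which is likely on most machines), so callers must handle both outcomes:

```python
import ctypes.util

# Mirrors the Django probe: ask the platform linker where a SpatiaLite
# shared library lives. The two candidate names are the common module
# names; which (if either) resolves depends on the host system.
for name in ("spatialite", "mod_spatialite"):
    location = ctypes.util.find_library(name)
    print(name, "->", location)
```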

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1292/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1108300685 I_kwDOBm6k_c5CD1ON 1604 Option to assign a domain/subdomain using `datasette publish cloudrun` simonw 9599 open 0     1 2022-01-19T16:21:17Z 2022-01-19T16:23:54Z   OWNER  

Looks like this API should be able to do that: https://twitter.com/steren/status/1483835859191304192 - https://cloud.google.com/run/docs/reference/rest/v1/namespaces.domainmappings/create

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1604/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
733999615 MDU6SXNzdWU3MzM5OTk2MTU= 1079 Handle long breadcrumbs better with new menu simonw 9599 open 0     1 2020-11-01T15:57:41Z 2022-01-13T22:21:29Z   OWNER  

On this page when signed in as root: https://latest.datasette.io/fixtures/roadside_attraction_characteristics/1

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1079/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
707849175 MDU6SXNzdWU3MDc4NDkxNzU= 974 static assets and favicon aren't cached by the browser obra 45416 open 0     1 2020-09-24T04:44:55Z 2022-01-13T22:21:28Z   NONE  

Using datasette to solve some frustrating problems with our fulfillment provider today, I was surprised to see repeated requests for assets under /-/static and the favicon. While it won't likely be a huge performance bottleneck, I bet datasette would feel a bit zippier if you had Uvicorn serving up some caching-related headers telling the browser it was safe to cache static assets.
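
A hypothetical sketch of the kind of header policy being suggested here; this is not Datasette's actual behavior, and the paths and max-age values are illustrative assumptions only:

```python
# Hypothetical helper - decides a Cache-Control header per request path.
def cache_headers(path):
    if path.startswith("/-/static/") or path.startswith("/-/static-plugins/"):
        # Aggressive caching is safe if asset URLs change between releases
        return {"cache-control": "public, max-age=31536000, immutable"}
    if path == "/favicon.ico":
        # Shorter lifetime, since the favicon URL never changes
        return {"cache-control": "public, max-age=3600"}
    return {}  # data pages: leave caching decisions alone

print(cache_headers("/-/static/app.css"))
print(cache_headers("/favicon.ico"))
print(cache_headers("/fixtures"))
```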

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/974/reactions",
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
895686039 MDU6SXNzdWU4OTU2ODYwMzk= 1336 Document turning on WAL for live served SQLite databases simonw 9599 open 0     1 2021-05-19T17:08:58Z 2022-01-13T21:55:59Z   OWNER  

Datasette docs don't talk about WAL yet, which allows you to safely serve reads from a database file while it is accepting writes.
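
As a quick illustration with plain sqlite3: switching a database file to WAL is a one-line pragma, and the mode persists in the file itself rather than on the connection:

```python
import os
import sqlite3
import tempfile

path = os.path.join(tempfile.mkdtemp(), "live.db")
conn = sqlite3.connect(path)
# journal_mode is a property of the database file, not the connection:
# once switched to WAL, readers are no longer blocked by a writer.
mode = conn.execute("PRAGMA journal_mode=WAL").fetchone()[0]
conn.close()
print(mode)  # -> wal
```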

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1336/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1100499619 I_kwDOBm6k_c5BmEqj 1592 Row pages should show links to foreign keys simonw 9599 open 0     1 2022-01-12T15:50:20Z 2022-01-12T15:52:17Z   OWNER  

Refs #1518 refactor.

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1592/reactions",
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1091838742 I_kwDOBm6k_c5BFCMW 1585 Fire base caching for `publish cloudrun` simonw 9599 open 0     1 2022-01-01T15:38:15Z 2022-01-01T15:40:38Z   OWNER  

https://gist.github.com/steren/03d3e58c58c9a53fd49bb78f58541872 has a recipe for this, via https://twitter.com/steren/status/1477038411114446848

Could this enable easier vanity URLs of the format https://$project_id.web.app/? How about CDN caching?

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1585/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1090810196 I_kwDOBm6k_c5BBHFU 1583 consider adding deletion step of cloudbuild artifacts to gcloud publish fgregg 536941 open 0     1 2021-12-30T00:33:23Z 2021-12-30T00:34:16Z   CONTRIBUTOR  

right now, as part of the publish process images and other artifacts are stored to gcloud's cloud storage before being deployed to cloudrun.

after successfully deploying, it would be nice if the script deleted these artifacts. otherwise, if you have a regularly scheduled build process, you can end up paying to store lots of out-of-date artifacts.

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1583/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1084185188 I_kwDOBm6k_c5An1pk 1573 Make trace() a documented internal API simonw 9599 open 0     1 2021-12-19T20:32:56Z 2021-12-19T21:13:13Z   OWNER  

This should be documented so plugin authors can use it to add their own custom traces: https://github.com/simonw/datasette/blob/8f311d6c1d9f73f4ec643009767749c17b5ca5dd/datasette/tracer.py#L28-L52

Including the new kwargs pattern I added in #1571: https://github.com/simonw/datasette/blob/f65817000fdf87ce8a0c23edc40784ebe33b5842/datasette/database.py#L128-L132

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1573/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
705840673 MDU6SXNzdWU3MDU4NDA2NzM= 972 Support faceting against arbitrary SQL queries simonw 9599 open 0     1 2020-09-21T19:00:43Z 2021-12-15T18:02:20Z   OWNER  

... support for running facets against arbitrary custom SQL queries is half-done in that facets now execute against wrapped subqueries as of ea66c45df96479ef66a89caa71fff1a97a862646

https://github.com/simonw/datasette/blob/ea66c45df96479ef66a89caa71fff1a97a862646/datasette/facets.py#L192-L200 Originally posted by @simonw in https://github.com/simonw/datasette/issues/971#issuecomment-696307922

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/972/reactions",
    "total_count": 3,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 3,
    "rocket": 0,
    "eyes": 0
}
   
1072106103 I_kwDOBm6k_c4_5wp3 1542 feature request: order and dependency of plugins (that use js) fs111 33631 open 0     1 2021-12-06T12:40:45Z 2021-12-15T17:47:08Z   NONE  

I have been playing with datasette for the last couple of weeks and it is great! I am a big fan of datasette-cluster-map and wanted to enhance it a bit with what I would call a sub-plugin. I basically want to add more controls to the map that cluster map provides. I have been looking into its code and how the plugin management works, but it seems what I am trying to do is not doable without hacks in js.

Basically what would like to have is a way to say load my plugin after the plugins I depend on have been loaded and rendered. There seems to be no prior art where plugins have these dependencies on the js level so I was wondering if that could be added or if it exists how to do it.

Basically what I want to do is:

my-awesome-plugin has a dependency on datasette-cluster-map. Whenever datasette-cluster-map has finished rendering on page load, call my plugin, but no earlier. To make that work datasette probably needs some total order in which plugins are loaded and initialized.

Since I am new to datasette, I may be missing something obvious, so please let me know if the above makes no sense.

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1542/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1069881276 I_kwDOBm6k_c4_xRe8 1541 Different default layout for row page simonw 9599 open 0     1 2021-12-02T18:56:36Z 2021-12-02T18:56:54Z   OWNER  

The row page displays as a table even though it only has one table row.

Maybe default to the same display as the narrow page version, even for wide pages?

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1541/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1058803238 I_kwDOBm6k_c4_HA4m 1520 Pattern for avoiding accidental URL over-rides simonw 9599 open 0     1 2021-11-19T18:28:05Z 2021-11-19T18:29:26Z   OWNER  

Following #1517 I'm experimenting with a plugin that does this:

```python
@hookimpl
def register_routes():
    return [
        (r"/(?P<db_name>[^/]+)/(?P<table_and_format>[^/]+?)$", Table().view),
    ]
```

This is supposed to replace the default table page with new code... but there's a problem: /-/versions on that instance now returns 404 Database '-' does not exist!

Need to figure out a pattern to avoid that happening. Plugins get to add their routes before Datasette's default routes, which is why this is happening here.
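
The clash is easy to reproduce with just the regex: the plugin's catch-all pattern happily matches Datasette's own /-/versions path, capturing - as a database name:

```python
import re

# The catch-all table route from the plugin above. "-" is a perfectly
# valid [^/]+ capture, so /-/versions looks like database "-", table
# "versions" - hence the 404 about database '-' not existing.
pattern = re.compile(r"/(?P<db_name>[^/]+)/(?P<table_and_format>[^/]+?)$")
m = pattern.match("/-/versions")
print(m.groupdict())  # -> {'db_name': '-', 'table_and_format': 'versions'}
```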

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1520/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
812704869 MDU6SXNzdWU4MTI3MDQ4Njk= 1237 ?_pretty=1 option for pretty-printing JSON output simonw 9599 open 0   Datasette 1.0 3268330 1 2021-02-20T20:54:40Z 2021-11-16T18:28:33Z   OWNER  

Suggested by @frankieroberto in https://github.com/simonw/datasette/issues/782#issuecomment-782746755

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1237/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1052247023 I_kwDOBm6k_c4-uAPv 1505 Datasette should have an option to output CSV with semicolons simonw 9599 open 0     1 2021-11-12T18:02:21Z 2021-11-16T11:40:52Z   OWNER     datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1505/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1054246919 I_kwDOBm6k_c4-1ogH 1511 Review plugin hooks for Datasette 1.0 simonw 9599 open 0   Datasette 1.0 3268330 1 2021-11-15T23:26:05Z 2021-11-16T01:20:14Z   OWNER  

I need to perform a detailed review of the plugin interface - especially the plugin hooks like register_facet_classes() which I don't yet have complete confidence in.

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1511/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
826064552 MDU6SXNzdWU4MjYwNjQ1NTI= 1253 Capture "Ctrl + Enter" or "⌘ + Enter" to send SQL query? rayvoelker 9308268 open 0     1 2021-03-09T15:00:50Z 2021-10-30T16:00:42Z   NONE  

It appears as though "Shift + Enter" triggers the form submit action to submit SQL, but could that action be bound to the "Ctrl + Enter" or "⌘ + Enter" action?

I feel like that pattern already exists in a number of similar tools and could improve usability of the editor.

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1253/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
990367646 MDU6SXNzdWU5OTAzNjc2NDY= 1462 Separate out "debug" options from "root" options simonw 9599 open 0     1 2021-09-07T21:27:34Z 2021-09-07T21:34:33Z   OWNER  

I ditched "root" for "admin" because root by default gives you a whole bunch of stuff which I think could be confusing:

Maybe the real problem here is that I'm conflating "root" permissions with "debug" options. Perhaps there should be an extra Datasette mode that unlocks debug tools for the root user?

Originally posted by @simonw in https://github.com/simonw/datasette-app-support/issues/8#issuecomment-914638998

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1462/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
988556488 MDU6SXNzdWU5ODg1NTY0ODg= 1459 suggestion: allow `datasette --open` to take a relative URL ctb 51016 open 0     1 2021-09-05T17:17:07Z 2021-09-05T19:59:15Z   CONTRIBUTOR  

(soft suggestion because I'm not sure I'm using datasette right yet)

Over at https://github.com/ctb/2021-sourmash-datasette, I'm playing around with datasette, and I'm creating some static pages to send people to the right facets. There may well be better ways of achieving this end goal, and I will find out if so, I'm sure!

But regardless I think it might be neat to support an option to allow -o/--open to take a relative URL, that then gets appended to the hostname and port. This would let me improve my documentation. I don't see any downsides, either, but 🤷 there may well be some :)

Happy to dig in and provide a PR if it's of interest. I'm not sure off the top of my head how to support an optional value to a parameter in argparse - the current -o behavior is kinda nice so it'd be suboptimal to require a url for -o. Maybe --open-url= or something would work?

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1459/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
970463436 MDExOlB1bGxSZXF1ZXN0NzEyNDEyODgz 1434 Enrich arbitrary query results with foreign key links and column descriptions simonw 9599 open 0     1 2021-08-13T14:43:01Z 2021-08-19T21:18:58Z   OWNER simonw/datasette/pulls/1434

Refs #1293, follows #942.

datasette 107914493 pull    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1434/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
0  
947044667 MDU6SXNzdWU5NDcwNDQ2Njc= 1398 Documentation on using Datasette as a library simonw 9599 open 0     1 2021-07-18T14:15:27Z 2021-07-30T03:21:49Z   OWNER  

Instantiating Datasette() directly is an increasingly interesting pattern. I do it in tests all the time, but thanks to datasette.client there are plenty of neat things you can do with it in a library context.

Maybe support from datasette import Datasette for this.

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1398/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
776128565 MDU6SXNzdWU3NzYxMjg1NjU= 1163 "datasette insert data.db url-to-csv" simonw 9599 open 0     1 2020-12-29T23:21:21Z 2021-06-17T18:12:32Z   OWNER  

Refs #1160 - get filesystem imports working first for #1162, then add import-from-URL.

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1163/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
743359646 MDU6SXNzdWU3NDMzNTk2NDY= 1096 TSV should be a default export option simonw 9599 open 0     1 2020-11-15T22:24:02Z 2021-06-17T18:12:31Z   OWNER  

Refs #1095

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1096/reactions",
    "total_count": 3,
    "+1": 3,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
913900374 MDU6SXNzdWU5MTM5MDAzNzQ= 1369 Don't show foreign key IDs twice if no label simonw 9599 open 0     1 2021-06-07T19:47:02Z 2021-06-07T19:47:24Z   OWNER  

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1369/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
901009787 MDU6SXNzdWU5MDEwMDk3ODc= 1340 Research: Cell action menu (like column action but for individual cells) simonw 9599 open 0     1 2021-05-25T15:49:16Z 2021-05-26T18:59:58Z   OWNER  

Had an idea today that it might be useful to select an individual cell and say things like "show me all other rows with the same value" - maybe even a set of other menu options against cells as well.

Mocked up a show-on-hover ellipses demo using the CSS inspector:

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1340/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
817544251 MDU6SXNzdWU4MTc1NDQyNTE= 1245 Sticky table column headers would be useful, especially on the query page simonw 9599 open 0     1 2021-02-26T17:42:51Z 2021-04-02T20:53:35Z   OWNER  

Suggestion from office hours.

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1245/reactions",
    "total_count": 2,
    "+1": 2,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
787173276 MDU6SXNzdWU3ODcxNzMyNzY= 1193 Research plugin hook for alternative database backends simonw 9599 open 0     1 2021-01-15T20:27:50Z 2021-03-12T01:01:54Z   OWNER  

I started exploring what Datasette would like running against PostgreSQL in #670 and @dazzag24 did some work on Parquet described in #657.

I had initially thought this was WAY too much additional complexity, but I'm beginning to think that the Database class may be small enough that having it abstract away the details of running queries against alternative database backends could be feasible.

A bigger issue is SQL generation, but I realized that most of Datasette's SQL generation code exists just in the TableView class that runs the table page. If this was abstracted into some kind of SQL builder that could be then customized per-database it might be reasonable to get it working.

Very unlikely for this to make it into Datasette 1.0, but maybe this would be the defining feature of Datasette 2.0?

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1193/reactions",
    "total_count": 3,
    "+1": 3,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
803356942 MDU6SXNzdWU4MDMzNTY5NDI= 1218 /usr/local/opt/python3/bin/python3.6: bad interpreter: No such file or directory robmarkcole 11855322 open 0     1 2021-02-08T09:07:00Z 2021-02-23T12:12:17Z   NONE  

Error as above, however I do have python3.8 and the readme indicates this is supported.

```
(venv) (base) Robins-MacBook:datasette robin$ ls /usr/local/opt/python3/bin/
..         pip3       python3    python3.8
```

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1218/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
811505638 MDU6SXNzdWU4MTE1MDU2Mzg= 1234 Runtime support for ATTACHing multiple databases simonw 9599 open 0     1 2021-02-18T22:06:47Z 2021-02-22T21:06:28Z   OWNER  

The implementation in #1232 is ready to land. It's the simplest-thing-that-could-possibly-work: you can run datasette one.db two.db three.db --crossdb and then use the /_memory page to run joins across tables from multiple databases.

It only works on the first 10 databases that were passed to the command-line. This means that if you have a Datasette instance with hundreds of attached databases (see Datasette Library) this won't be particularly useful for you.

So... a better, future version of this feature would be one that lets you join across databases on command - maybe by hitting /_memory?attach=db1&attach=db2 to get a special connection.

Originally posted by @simonw in https://github.com/simonw/datasette/issues/283#issuecomment-781665560
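
The SQL-level mechanism behind this is plain ATTACH DATABASE; a minimal sqlite3 sketch (the table and column names are invented for the demo):

```python
import sqlite3

# Two separate databases attached to one connection, then joined across
# them - the mechanism behind the --crossdb /_memory page.
conn = sqlite3.connect(":memory:")
conn.execute("ATTACH DATABASE ':memory:' AS one")
conn.execute("ATTACH DATABASE ':memory:' AS two")
conn.execute("CREATE TABLE one.people (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("CREATE TABLE two.pets (owner_id INTEGER, pet TEXT)")
conn.execute("INSERT INTO one.people VALUES (1, 'Natalie')")
conn.execute("INSERT INTO two.pets VALUES (1, 'Cleo')")
rows = conn.execute(
    "SELECT people.name, pets.pet FROM one.people "
    "JOIN two.pets ON pets.owner_id = people.id"
).fetchall()
print(rows)
```

SQLite's default limit of ten attached databases is also where the "first 10 databases" restriction mentioned above comes from.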

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1234/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
811458446 MDU6SXNzdWU4MTE0NTg0NDY= 1233 "datasette publish cloudrun" cannot publish files with spaces in their name simonw 9599 open 0     1 2021-02-18T21:08:31Z 2021-02-18T21:10:08Z   OWNER  

Got this error:

```
Step 6/9 : RUN datasette inspect fixtures.db extra database.db --inspect-file inspect-data.json
 ---> Running in db9da0068592
Usage: datasette inspect [OPTIONS] [FILES]...
Try 'datasette inspect --help' for help.

Error: Invalid value for '[FILES]...': Path 'extra' does not exist.
The command '/bin/sh -c datasette inspect fixtures.db extra database.db --inspect-file inspect-data.json' returned a non-zero code: 2
ERROR
ERROR: build step 0 "gcr.io/cloud-builders/docker" failed: step exited with non-zero status: 2
```

While working on the demo for #1232, using this deploy command:

```
GITHUB_SHA=crossdb datasette publish cloudrun fixtures.db 'extra database.db' \
  -m fixtures.json \
  --plugins-dir=plugins \
  --branch=$GITHUB_SHA \
  --version-note=$GITHUB_SHA \
  --extra-options="--setting template_debug 1 --crossdb" \
  --install=pysqlite3-binary \
  --service=datasette-latest-crossdb
```

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1233/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
811054000 MDU6SXNzdWU4MTEwNTQwMDA= 1230 Vega charts are plotted only for rows on the visible page, cluster maps only for rows in the remaining pages Kabouik 7107523 open 0     1 2021-02-18T12:27:02Z 2021-02-18T15:22:15Z   NONE  

I filtered a data set on some criteria and obtained 265 results, split over three pages (100, 100, 65), and realized that Vega plots are only applied to the results displayed on the current page, instead of the whole filtered data, e.g., 100 on page 1, 100 on page 2, 65 on page 3. Is there a way to force the graphs to consider all results instead of just the page, considering that pages rarely represent sensible information?

Likewise, while the cluster map does show all results on the first page, if you go to next pages, it will show all remaining results except the previous page(s), e.g., 265 on page 1, 165 on page 2, 65 on page 3.

In both cases, I don't see many situations where one would like to represent the data this way, and it might even lead to interpretation errors when viewing the data. Am I missing some cases where this would be best? Perhaps a clickable option to subset visual representations according to visible pages vs. all search results would do?

[Edit] Oh, I just saw the "Load all" button under the cluster map as well as the setting to alter the max number of results. So I guess this issue is only about the Vega charts.

datasette 107914493 issue          
795367402 MDU6SXNzdWU3OTUzNjc0MDI= 1209 v0.54 500 error from sql query in custom template; code worked in v0.53; found a workaround jrdmb 11788561 open 0     1 2021-01-27T19:08:13Z 2021-01-28T23:00:27Z   NONE  

v0.54 500 error in sql query template; code worked in v0.53; found a workaround

schema:

```sql
CREATE TABLE "talks" ("talk" TEXT, "series" INTEGER, "talkdate" TEXT);
CREATE TABLE "series" ("id" INTEGER PRIMARY KEY, "series" TEXT, talks_list TEXT default '', website TEXT default '');
```

Live example of correctly rendered template in v.053: https://cosmotalks-cy6xkkbezq-uw.a.run.app/cosmotalks/talks/1

Description of problem: I needed 'sql select' code in a custom row-mydatabase-mytable.html template to lookup the series name for a foreign key integer value in the talks table. So metadata.json specifies the datasette-template-sql plugin.

The code below worked perfectly in v0.53 (just the relevant sql statement part is shown; full code is here):

```
{# custom addition #}
{% for row in display_rows %}
  ...
  {% set sname = sql("select series from series where id = ?", [row.series]) %}
  <strong>Series name: {{ sname[0].series }}</strong>
  ...
{% endfor %}
{# End of custom addition #}
```

In v0.54, that code resulted in a 500 error with a 'no such table series' message. A second query in that template also did not work but the above is fully illustrative of the problem.

All templates were up-to-date along with datasette v0.54.

Workaround: After fiddling around with trying different things, what worked was the syntax from Querying a different database from the datasette-template-sql github repo to add the database name to the sql statement:

{% set sname = sql("select series from series where id = ?", [row.series], database="cosmotalks") %}

Though this was found to work, it should not be necessary to add database="cosmotalks" since per the datasette-template-sql README, it's only needed when querying a different database, but here it's a table within the same database.

datasette 107914493 issue          
778682317 MDU6SXNzdWU3Nzg2ODIzMTc= 1173 GitHub Actions workflow to build manylinux binary simonw 9599 open 0     1 2021-01-05T07:41:11Z 2021-01-05T07:41:43Z   OWNER  

Refs #1171 and #93

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1173/reactions",
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
776101101 MDU6SXNzdWU3NzYxMDExMDE= 1161 Update a whole bunch of links to datasette.io instead of datasette.readthedocs.io simonw 9599 open 0     1 2020-12-29T21:47:31Z 2020-12-29T21:49:57Z   OWNER  

https://ripgrep.datasette.io/-/ripgrep?pattern=%28datasette%7Csqlite-utils%29%5C.readthedocs%5C.io

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1161/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
463492815 MDU6SXNzdWU0NjM0OTI4MTU= 534 500 error on m2m facet detection simonw 9599 open 0     1 2019-07-03T00:42:42Z 2020-12-17T05:08:22Z   OWNER  

This may help debug:

```diff
diff --git a/datasette/facets.py b/datasette/facets.py
index 76d73e5..07a4034 100644
--- a/datasette/facets.py
+++ b/datasette/facets.py
@@ -499,11 +499,14 @@ class ManyToManyFacet(Facet):
                 "outgoing"
             ]
             if len(other_table_outgoing_foreign_keys) == 2:
-                destination_table = [
-                    t
-                    for t in other_table_outgoing_foreign_keys
-                    if t["other_table"] != self.table
-                ][0]["other_table"]
+                try:
+                    destination_table = [
+                        t
+                        for t in other_table_outgoing_foreign_keys
+                        if t["other_table"] != self.table
+                    ][0]["other_table"]
+                except IndexError:
+                    import pdb; pdb.pm()
                 # Only suggest if it's not selected already
                 if ("_facet_m2m", destination_table) in args:
                     continue
```

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/534/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
718395987 MDExOlB1bGxSZXF1ZXN0NTAwNzk4MDkx 1008 Add json_loads and json_dumps jinja2 filters mhalle 649467 open 0     1 2020-10-09T20:11:34Z 2020-12-15T02:30:28Z   FIRST_TIME_CONTRIBUTOR simonw/datasette/pulls/1008
datasette 107914493 pull    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1008/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
0  
765637324 MDU6SXNzdWU3NjU2MzczMjQ= 1144 JavaScript to help plugins interact with the fragment part of the URL simonw 9599 open 0     1 2020-12-13T20:36:06Z 2020-12-14T14:47:11Z   OWNER  

Suggested by Markus Holtermann on Twitter, who is building https://github.com/MarkusH/datasette-chartjs

I've been looking at datasette-vega for how you persist chart settings between form submissions. I've adopted that for datasette-chartjs. Any thoughts on adding a public JS API to #datasette itself, that plugins can rely on?

I'm talking about functions like onFragmentChange, serialize, unserialize, ... That turn an object into a URL encoded string and put it into the location's hash. And also updating all links/forms automatically.

Essentially, a plugin could do something like document.datasette.setConfigValue('prefix', 'foo', 'bar') and .getConfigValue('prefix', 'foo'). And the functions would take care of updating document.location.hash, all (necessary) a.href and form.action

https://twitter.com/m_holtermann/status/1338183973311295492
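The namespacing idea behind setConfigValue/getConfigValue can be sketched, here in Python for the encoding logic rather than browser JavaScript (function names are hypothetical): each plugin prefixes its keys so several plugins can share a single location.hash fragment without collisions.

```python
from urllib.parse import parse_qsl, urlencode

def serialize(prefix, settings):
    # Namespace every key with the plugin prefix, then URL-encode;
    # the result is what would be stored after the "#" in the URL.
    return urlencode({f"{prefix}.{k}": v for k, v in settings.items()})

def unserialize(prefix, fragment):
    # Recover only this plugin's keys, stripping the prefix back off.
    return {
        k[len(prefix) + 1:]: v
        for k, v in parse_qsl(fragment)
        if k.startswith(prefix + ".")
    }
```

In the browser the same serialize/unserialize pair would sit behind a hashchange listener that updates links and forms.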

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1144/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
352768017 MDU6SXNzdWUzNTI3NjgwMTc= 362 Add option to include/exclude columns in search filters annapowellsmith 78156 open 0     1 2018-08-22T01:32:08Z 2020-11-03T19:01:59Z   NONE  

I have a dataset with many columns, of which only some are likely to be of interest for searching.

It would be great for usability if the search filters in the UI could be configured to include/exclude columns.

See also: https://github.com/simonw/datasette/issues/292

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/362/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
723982480 MDExOlB1bGxSZXF1ZXN0NTA1NDUzOTAw 1030 Make `package` command deal with a configuration directory argument frankier 299380 open 0     1 2020-10-18T11:07:02Z 2020-10-19T08:01:51Z   FIRST_TIME_CONTRIBUTOR simonw/datasette/pulls/1030

Currently if we run datasette package on a configuration directory we'll get an exception when we try to hard link to the directory. This PR copies the tree and makes the Dockerfile run inspect on all *.db files.
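The fallback described above can be sketched as follows (a hypothetical helper, not datasette's actual code): os.link() raises for a directory, so a configuration directory is copied wholesale and every *.db inside it is then collected for inspection.

```python
import glob
import os
import shutil

def stage_files(source, dest):
    # Hard links only work for regular files; for a configuration
    # directory, copy the whole tree instead, then glob for the
    # database files that `datasette inspect` should run against.
    if os.path.isdir(source):
        shutil.copytree(source, dest)
        return sorted(
            glob.glob(os.path.join(dest, "**", "*.db"), recursive=True)
        )
    os.link(source, dest)
    return [dest] if dest.endswith(".db") else []
```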

datasette 107914493 pull    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1030/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
0  
718910318 MDU6SXNzdWU3MTg5MTAzMTg= 1015 Research: could Datasette install its own plugins? simonw 9599 open 0     1 2020-10-11T19:33:06Z 2020-10-11T19:35:04Z   OWNER  

It would be cool if Datasette could offer a plugin browsing interface where users could install plugins by clicking "Install" on them - similar to how VS Code extensions work.

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1015/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
718272593 MDU6SXNzdWU3MTgyNzI1OTM= 1007 set-env and add-path commands have been deprecated simonw 9599 open 0     1 2020-10-09T16:21:18Z 2020-10-09T16:23:51Z   OWNER  

https://github.blog/changelog/2020-10-01-github-actions-deprecating-set-env-and-add-path-commands/

Starting today runner version 2.273.5 will begin to warn you if you use the add-path or set-env commands. We are monitoring telemetry for the usage of these commands and plan to fully disable them in the future.
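Per the linked changelog, the replacement for the deprecated `::set-env::` / `::add-path::` workflow commands is appending lines to the files named by the GITHUB_ENV and GITHUB_PATH environment variables. A sketch of that mechanism (helper names are hypothetical):

```python
import os

def set_env(name, value):
    # Replacement for `::set-env name=NAME::value`: append a
    # NAME=value line to the file the runner exposes as $GITHUB_ENV.
    with open(os.environ["GITHUB_ENV"], "a") as f:
        f.write(f"{name}={value}\n")

def add_path(directory):
    # Replacement for `::add-path::`: append one directory per line
    # to the file exposed as $GITHUB_PATH.
    with open(os.environ["GITHUB_PATH"], "a") as f:
        f.write(f"{directory}\n")
```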

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1007/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
626211658 MDU6SXNzdWU2MjYyMTE2NTg= 778 Ability to configure keyset pagination for views and queries simonw 9599 open 0     1 2020-05-28T04:48:56Z 2020-10-02T02:26:25Z   OWNER  

Currently views offer pagination, but it uses offset/limit - e.g. https://latest.datasette.io/fixtures/paginated_view?_next=100

This means pagination will perform poorly on deeper pages.

If a view is based on a table that has a primary key it should be possible to configure efficient keyset pagination that works the same way that table pagination works.

This may be as simple as configuring a column that can be treated as a "primary key" for the purpose of pagination using metadata.json - or with a ?_view_pk=colname querystring argument.
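The offset-vs-keyset difference can be sketched against an in-memory SQLite table (hypothetical helper; Datasette's real _next tokens are more involved): instead of OFFSET, each page filters on "pk greater than the last value seen", which stays an indexed range scan however deep you paginate.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (pk INTEGER PRIMARY KEY, name TEXT)")
conn.executemany(
    "INSERT INTO t VALUES (?, ?)", [(i, f"row{i}") for i in range(1, 8)]
)

def keyset_page(conn, after_pk=None, page_size=3):
    sql = "SELECT pk, name FROM t"
    params = []
    if after_pk is not None:
        sql += " WHERE pk > ?"
        params.append(after_pk)
    sql += " ORDER BY pk LIMIT ?"
    params.append(page_size)
    rows = conn.execute(sql, params).fetchall()
    # A full page means there may be more rows; its last pk becomes
    # the ?_next= token for the following request.
    next_token = rows[-1][0] if len(rows) == page_size else None
    return rows, next_token
```

For a view, the "pk" column here would be whichever column metadata.json (or a ?_view_pk= argument) nominated as the pagination key.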

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/778/reactions",
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   

CREATE TABLE [issues] (
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [number] INTEGER,
   [title] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [state] TEXT,
   [locked] INTEGER,
   [assignee] INTEGER REFERENCES [users]([id]),
   [milestone] INTEGER REFERENCES [milestones]([id]),
   [comments] INTEGER,
   [created_at] TEXT,
   [updated_at] TEXT,
   [closed_at] TEXT,
   [author_association] TEXT,
   [pull_request] TEXT,
   [body] TEXT,
   [repo] INTEGER REFERENCES [repos]([id]),
   [type] TEXT
, [active_lock_reason] TEXT, [performed_via_github_app] TEXT, [reactions] TEXT, [draft] INTEGER, [state_reason] TEXT);
CREATE INDEX [idx_issues_repo]
                ON [issues] ([repo]);
CREATE INDEX [idx_issues_milestone]
                ON [issues] ([milestone]);
CREATE INDEX [idx_issues_assignee]
                ON [issues] ([assignee]);
CREATE INDEX [idx_issues_user]
                ON [issues] ([user]);
Powered by Datasette · Queries took 645.638ms · About: github-to-sqlite