issues


245 rows where comments = 2, repo = 107914493 and type = "issue" sorted by updated_at descending


state

  • closed 182
  • open 63

id node_id number title user state locked assignee milestone comments created_at updated_at ▲ closed_at author_association pull_request body repo type active_lock_reason performed_via_github_app reactions draft state_reason
1910269679 I_kwDOBm6k_c5x3Gbv 2196 Discord invite link returns 401 Olshansk 1892194 closed 0     2 2023-09-24T15:16:54Z 2023-10-13T00:07:08Z 2023-10-12T21:54:54Z NONE  

I found the link to the datasette discord channel via this query.

The following video should be self-explanatory:

https://github.com/simonw/datasette/assets/1892194/8cd33e88-bcaa-41f3-9818-ab4d589c3f02

Link for reference: https://discord.com/invite/ktd74dm5mw

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2196/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed
1825007061 I_kwDOBm6k_c5sx2XV 2123 datasette serve when invoked with --reload interprets the serve command as a file cadeef 79087 open 0     2 2023-07-27T19:07:22Z 2023-09-18T13:02:46Z   NONE  

When running datasette serve with the --reload flag, the serve command is picked up as a file argument:

```
$ datasette serve --reload test_db
Starting monitor for PID 13574.
Error: Invalid value for '[FILES]...': Path 'serve' does not exist.
Press ENTER or change a file to reload.
```

If a 'serve' file is created it launches properly (albeit with an empty database called serve):

```
$ touch serve; datasette serve --reload test_db
Starting monitor for PID 13628.
INFO:     Started server process [13628]
INFO:     Waiting for application startup.
INFO:     Application startup complete.
INFO:     Uvicorn running on http://127.0.0.1:8001 (Press CTRL+C to quit)
```

Version (running from HEAD on main):

```
$ datasette --version
datasette, version 1.0a2
```

This issue appears to have existed for a while, as https://github.com/simonw/datasette/issues/1380#issuecomment-953366110 mentions the error in a different context.

I'm happy to debug and land a patch if it's welcome.

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2123/reactions",
    "total_count": 2,
    "+1": 2,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1900026059 I_kwDOBm6k_c5xQBjL 2188 Plugin Hooks for "compile to SQL" languages asg017 15178711 open 0     2 2023-09-18T01:37:15Z 2023-09-18T06:58:53Z   CONTRIBUTOR  

There's a ton of tools/languages that compile to SQL, which may be nice in Datasette. Some examples:

  • Logica https://logica.dev
  • PRQL https://prql-lang.org
  • Malloy, but not sure if it works with SQLite? https://github.com/malloydata/malloy

It would be cool if plugins could extend Datasette to use these languages, in both the code editor and API usage.

A few things I'd imagine a datasette-prql or datasette-logica plugin would do:

  • prql= instead of sql=
  • Code editor support (syntax highlighting, autocomplete)
  • Hide/show SQL
datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2188/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1898927976 I_kwDOBm6k_c5xL1do 2186 Mechanism for register_output_renderer hooks to access full count simonw 9599 open 0   Datasette 1.0 3268330 2 2023-09-15T18:57:54Z 2023-09-15T19:27:59Z   OWNER  

The cause of this bug: https://github.com/simonw/datasette-export-notebook/issues/17

Is that datasette-export-notebook was consulting data["filtered_table_rows_count"] in the render output plugin function in order to show the total number of rows that would be exported.

That field is no longer available by default - the "count" field is only available if ?_extra=count was passed.

It would be useful if plugins like this could access the total count on demand, should they need to.

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2186/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1846076261 I_kwDOBm6k_c5uCONl 2139 border-color: ##ff0000 bug - two hashes simonw 9599 closed 0   Datasette 1.0a-next 8755003 2 2023-08-11T01:22:58Z 2023-08-11T05:16:24Z 2023-08-11T05:16:24Z OWNER  

Spotted this on https://latest.datasette.io/extra_database

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2139/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed
1844213115 I_kwDOBm6k_c5t7HV7 2138 on_success_message_sql option for writable canned queries simonw 9599 closed 0   Datasette 1.0a-next 8755003 2 2023-08-10T00:20:14Z 2023-08-10T00:39:40Z 2023-08-10T00:34:26Z OWNER  

Or... how about if the on_success_message option could define a SQL query to be executed to generate that message? Maybe on_success_message_sql.

  • https://github.com/simonw/datasette/issues/2134
datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2138/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed
1840329615 I_kwDOBm6k_c5tsTOP 2130 Render plugin mechanism needs `error` and `truncated` fields simonw 9599 closed 0   Datasette 1.0a3 9700784 2 2023-08-07T23:19:19Z 2023-08-08T01:51:54Z 2023-08-08T01:47:42Z OWNER  

While working on: https://github.com/simonw/datasette/pull/2118

It became clear that the render callback function documented here: https://docs.datasette.io/en/0.64.3/plugin_hooks.html#register-output-renderer-datasette

Needs to grow the ability to be told if an error occurred (an error string) and if the results were truncated (a truncated boolean).

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2130/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed
1808215339 I_kwDOBm6k_c5rxy0r 2104 Tables starting with an underscore should be treated as hidden simonw 9599 open 0     2 2023-07-17T17:13:53Z 2023-07-18T22:41:37Z   OWNER  

Plugins can then take advantage of this pattern, for example: - https://github.com/simonw/datasette-auth-tokens/pull/8

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2104/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1765870617 I_kwDOBm6k_c5pQQwZ 2087 `--settings settings.json` option simonw 9599 open 0     2 2023-06-20T17:48:45Z 2023-07-14T17:02:03Z   OWNER  

https://discord.com/channels/823971286308356157/823971286941302908/1120705940728066080

May I add a request to the whole metadata / settings discussion? Allow passing --settings path/to/settings.json instead of having to rely exclusively on directory mode to centralize settings (this would reflect the behavior of providing metadata).

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2087/reactions",
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1781047747 I_kwDOBm6k_c5qKKHD 2092 test_homepage intermittent failure simonw 9599 closed 0     2 2023-06-29T15:20:37Z 2023-06-29T15:26:28Z 2023-06-29T15:24:13Z OWNER  

e.g. in https://github.com/simonw/datasette/actions/runs/5413590227/jobs/9839373852

```
=================================== FAILURES ===================================
________________________________ test_homepage _________________________________
[gw0] linux -- Python 3.7.17 /opt/hostedtoolcache/Python/3.7.17/x64/bin/python

ds_client = <datasette.app.DatasetteClient object at 0x7f85d271ef50>

    @pytest.mark.asyncio
    async def test_homepage(ds_client):
        response = await ds_client.get("/.json")
        assert response.status_code == 200
        assert "application/json; charset=utf-8" == response.headers["content-type"]
        data = response.json()
        assert data.keys() == {"fixtures": 0}.keys()
        d = data["fixtures"]
        assert d["name"] == "fixtures"
        assert d["tables_count"] == 24
        assert len(d["tables_and_views_truncated"]) == 5
        assert d["tables_and_views_more"] is True
        # 4 hidden FTS tables + no_primary_key (hidden in metadata)
        assert d["hidden_tables_count"] == 6
        # 201 in no_primary_key, plus 6 in other hidden tables:
>       assert d["hidden_table_rows_sum"] == 207, data
E       AssertionError: {'fixtures': {'color': '9403e5', 'hash': None, 'hidden_table_rows_sum': 0, 'hidden_tables_count': 6, ...}}
E       assert 0 == 207
```

My guess is that this is a timing error, where very occasionally the "count rows but stop counting if it exceeds a time limit" thing fails.

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2092/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed
1781005740 I_kwDOBm6k_c5qJ_2s 2090 Adopt ruff for linting simonw 9599 open 0     2 2023-06-29T14:56:43Z 2023-06-29T15:05:04Z   OWNER  

https://beta.ruff.rs/docs/

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2090/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1708030220 I_kwDOBm6k_c5lznkM 2073 Faceting doesn't work against integer columns in views simonw 9599 open 0     2 2023-05-12T18:20:10Z 2023-05-12T18:24:07Z   OWNER  

Spotted this issue here: https://til.simonwillison.net/datasette/baseline

I had to do this workaround:

```sql
create view baseline as select
  _key,
  spec,
  '' || json_extract(status, '$.is_baseline') as is_baseline,
  json_extract(status, '$.since') as baseline_since,
  json_extract(status, '$.support.chrome') as baseline_chrome,
  json_extract(status, '$.support.edge') as baseline_edge,
  json_extract(status, '$.support.firefox') as baseline_firefox,
  json_extract(status, '$.support.safari') as baseline_safari,
  compat_features,
  caniuse,
  usage_stats,
  status
from [index]
```

I think the core issue here is that, against a table, select * from x where integer_column = '1' works correctly, due to some kind of column type conversion mechanism... but this mechanism doesn't work against views.
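A quick way to reproduce the described behaviour from Python's sqlite3 module (the table and column names here are illustrative, and this assumes a SQLite build with the JSON functions available):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("create table t (status text)")
conn.execute("""insert into t values ('{"is_baseline": 1}')""")
conn.execute(
    "create view v as select json_extract(status, '$.is_baseline') as is_baseline from t"
)

# The view column carries no type affinity, so the integer returned by
# json_extract() is never equal to the text value '1':
print(conn.execute("select count(*) from v where is_baseline = '1'").fetchone())  # (0,)
# Comparing against an integer works as expected:
print(conn.execute("select count(*) from v where is_baseline = 1").fetchone())    # (1,)
```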

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2073/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1702354223 I_kwDOBm6k_c5ld90v 2070 Mechanism for deploying a preview of a branch using Vercel simonw 9599 closed 0     2 2023-05-09T16:21:45Z 2023-05-09T16:25:00Z 2023-05-09T16:24:31Z OWNER  

I prototyped that here: https://github.com/simonw/one-off-actions/blob/main/.github/workflows/deploy-datasette-branch-preview.yml

It deployed the json-extras-query branch here: https://datasette-preview-json-extras-query.vercel.app/

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2070/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed
1633077183 I_kwDOBm6k_c5hVse_ 2041 Remove obsolete table POST code simonw 9599 closed 0   Datasette 1.0a-next 8755003 2 2023-03-21T01:01:40Z 2023-03-21T01:17:44Z 2023-03-21T01:17:43Z OWNER  

Spotted this in: #1999

POST /db/table currently executes obsolete code for inserting a row - I replaced that with /db/table/-/insert in https://github.com/simonw/datasette/commit/6e788b49edf4f842c0817f006eb9d865778eea5e but forgot to remove the old code.

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2041/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed
1618249044 I_kwDOBm6k_c5gdIVU 2038 Consider a `strict_templates` setting simonw 9599 open 0     2 2023-03-10T02:09:13Z 2023-03-10T02:11:06Z   OWNER  

A setting which turns on Jinja strict mode, so any templates that access undefined variables raise a hard error.

Prototype here:

```diff
diff --git a/datasette/app.py b/datasette/app.py
index 40416713..1428a3f0 100644
--- a/datasette/app.py
+++ b/datasette/app.py
@@ -200,6 +200,7 @@ SETTINGS = (
         "Allow display of SQL trace debug information with ?_trace=1",
     ),
     Setting("base_url", "/", "Datasette URLs should use this base path"),
+    Setting("strict_templates", False, "Raise errors for undefined template variables"),
 )
 _HASH_URLS_REMOVED = "The hash_urls setting has been removed, try the datasette-hashed-urls plugin instead"
 OBSOLETE_SETTINGS = {
@@ -399,11 +400,14 @@ class Datasette:
                 ),
             ]
         )
+        env_extras = {}
+        if self.setting("strict_templates"):
+            env_extras["undefined"] = StrictUndefined
         self.jinja_env = Environment(
             loader=template_loader,
             autoescape=True,
             enable_async=True,
-            undefined=StrictUndefined,
+            **env_extras,
         )
         self.jinja_env.filters["escape_css_string"] = escape_css_string
         self.jinja_env.filters["quote_plus"] = urllib.parse.quote_plus
```

Explored this idea a bit in: #1999
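For context, the behaviour the prototype toggles comes straight from Jinja's StrictUndefined; a minimal standalone illustration:

```python
from jinja2 import Environment, StrictUndefined, UndefinedError

# Default behaviour: undefined template variables silently render as empty strings
print(Environment().from_string("Hello {{ nobody }}!").render())  # "Hello !"

# With StrictUndefined (what a strict_templates setting would opt into),
# the same template raises a hard error instead:
try:
    Environment(undefined=StrictUndefined).from_string("Hello {{ nobody }}!").render()
except UndefinedError as ex:
    print(ex)  # 'nobody' is undefined
```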

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2038/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1612296210 I_kwDOBm6k_c5gGbAS 2033 `datasette install -r requirements.txt` simonw 9599 closed 0     2 2023-03-06T22:17:17Z 2023-03-06T22:54:52Z 2023-03-06T22:27:34Z OWNER  

Would be useful for cases where you want to install a whole set of plugins in one go, e.g. when running tutorials in GitHub Codespaces.

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2033/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed
1594383280 I_kwDOBm6k_c5fCFuw 2030 How to use Datasette with apache webserver on GCP? gk7279 19700859 closed 0     2 2023-02-22T03:08:49Z 2023-02-22T21:54:39Z 2023-02-22T21:54:39Z NONE  

Hi Simon and Datasette team-

I have installed apache2 webserver inside GCP VM using apt.

I can see my "Hello World" index.html if I use the external IP of this GCP in a browser.

However, when I try to run datasette with different combinations of -h and -p, I am still unable to access the webpage.

I cannot install Docker on this VM.

Any pointers on using datasette with an already existing apache2 webserver on GCP are appreciated.

Thanks.

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2030/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed
1579695809 I_kwDOBm6k_c5eKD7B 2023 Error: Invalid setting 'hash_urls' in settings.json in 0.64.1 mlaparie 80409402 closed 0     2 2023-02-10T13:35:01Z 2023-02-10T15:40:00Z 2023-02-10T15:39:59Z NONE  

On a Debian machine, using datasette 0.64.1 installed with pip3, I am getting a datasette[114272]: Error: Invalid setting 'hash_urls' in settings.json in journalctl -xe. The same settings work on 0.54.1 on another Debian server.

This is my settings.json:

```json
{
    "default_page_size": 200,
    "max_returned_rows": 8000,
    "num_sql_threads": 3,
    "sql_time_limit_ms": 1000,
    "default_facet_size": 30,
    "facet_time_limit_ms": 200,
    "facet_suggest_time_limit_ms": 50,
    "hash_urls": false,
    "allow_facet": true,
    "allow_download": true,
    "suggest_facets": true,
    "default_cache_ttl": 5,
    "default_cache_ttl_hashed": 31536000,
    "cache_size_kb": 0,
    "allow_csv_stream": true,
    "max_csv_mb": 100,
    "truncate_cells_html": 2048,
    "force_https_urls": false,
    "template_debug": false,
    "base_url": "/pclim/db/"
}
```

This looks ok to me. Would you have any ideas?

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2023/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed
1186696202 I_kwDOBm6k_c5Gu4wK 1696 Show foreign key label when filtering simonw 9599 open 0     2 2022-03-30T16:18:54Z 2023-01-29T20:56:20Z   OWNER  

For example here:

3 corresponds to "Human Related: Other" - it would be neat to display this in this area of the page somehow.

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1696/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1558644003 I_kwDOBm6k_c5c5wUj 2006 Teach `datasette publish` to pin to `datasette<1.0` in a 0.x release simonw 9599 open 0   Datasette 1.0 3268330 2 2023-01-26T19:17:40Z 2023-01-26T19:20:53Z   OWNER  

I just realized that when I ship Datasette 1.0 there may be automated deployments out there which could deploy the 1.0 version by accident, potentially breaking any customizations that aren't compatible with the 1.0 changes.

I can hopefully help avoid that by shipping one last entry in the 0.x series that ensures datasette publish pins to <1.0 when it installs Datasette itself.

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2006/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1551113681 I_kwDOBm6k_c5cdB3R 1998 `datasette --version` should also show the SQLite version simonw 9599 open 0     2 2023-01-20T16:11:30Z 2023-01-20T18:19:06Z   OWNER  

Idea came up here: https://discord.com/channels/823971286308356157/823971286941302908/1066026473003159783
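For reference, the SQLite version is already exposed by the Python standard library, so surfacing it would mostly be a question of formatting the --version output:

```python
import sqlite3

# Version of the SQLite library that Python's sqlite3 module is linked against
print(sqlite3.sqlite_version)       # e.g. "3.39.4"
# Same thing as a tuple, convenient for comparisons
print(sqlite3.sqlite_version_info)  # e.g. (3, 39, 4)
```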

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1998/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1522778923 I_kwDOBm6k_c5aw8Mr 1978 Document datasette.urls.row and row_blob eyeseast 25778 closed 0     2 2023-01-06T15:45:51Z 2023-01-09T14:30:00Z 2023-01-09T14:30:00Z CONTRIBUTOR  

These are in the codebase but not in documentation. I think everything else in this class is documented.

```python
class Urls:
    ...

    def row(self, database, table, row_path, format=None):
        path = f"{self.table(database, table)}/{row_path}"
        if format is not None:
            path = path_with_format(path=path, format=format)
        return PrefixedUrlString(path)

    def row_blob(self, database, table, row_path, column):
        return self.table(database, table) + "/{}.blob?_blob_column={}".format(
            row_path, urllib.parse.quote_plus(column)
        )
```
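Based on the code above, usage would presumably look something like this (database and table names are just examples, and the paths assume the default base_url):

```python
def example_links(datasette):
    # datasette is an instance of the Datasette class
    print(datasette.urls.row("fixtures", "facetable", "1"))
    # /fixtures/facetable/1
    print(datasette.urls.row("fixtures", "facetable", "1", format="json"))
    # /fixtures/facetable/1.json
    print(datasette.urls.row_blob("fixtures", "facetable", "1", "data"))
    # /fixtures/facetable/1.blob?_blob_column=data
```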

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1978/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  not_planned
1524867951 I_kwDOBm6k_c5a46Nv 1980 "Cannot sort table by id" when sortable_columns is used simonw 9599 open 0     2 2023-01-09T03:21:33Z 2023-01-09T03:23:53Z   OWNER  

I had an instance with this in metadata.yml:

```yaml
databases:
  timezones:
    tables:
      timezones:
        sortable_columns:
        - tzid
```

When I clicked on the "Apply" button here:

It sent me to /timezones/timezones?_sort=id&id__exact=133 with the error message:

500: Cannot sort table by id

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1980/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1515182998 I_kwDOBm6k_c5aT9uW 1970 Path "None" in _internal database table simonw 9599 closed 0     2 2022-12-31T18:51:05Z 2022-12-31T19:22:58Z 2022-12-31T18:52:49Z OWNER  

See https://latest.datasette.io/_internal/databases (after https://latest.datasette.io/login-as-root)

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1970/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed
1501713288 I_kwDOBm6k_c5ZglOI 1963 0.63.3 bugfix release simonw 9599 closed 0     2 2022-12-18T02:48:15Z 2022-12-18T03:26:55Z 2022-12-18T03:26:55Z OWNER  

I'm going to ship a release which back-ports these two fixes:

  • https://github.com/simonw/datasette/issues/1958
  • https://github.com/simonw/datasette/issues/1955
datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1963/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed
1495821607 I_kwDOBm6k_c5ZKG0n 1953 Release notes for Datasette 1.0a2 simonw 9599 closed 0   Datasette 1.0a2 8711695 2 2022-12-14T06:26:40Z 2022-12-15T02:02:15Z 2022-12-15T02:01:08Z OWNER  

https://github.com/simonw/datasette/milestone/27?closed=1

https://github.com/simonw/datasette/compare/1.0a1...9ad76d279e2c3874ca5070626a25458ce129f126

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1953/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed
855296937 MDU6SXNzdWU4NTUyOTY5Mzc= 1295 Errors should have links to further information simonw 9599 open 0     2 2021-04-11T12:39:12Z 2022-12-14T23:28:49Z   OWNER  

Inspired by this tweet: https://twitter.com/willmcgugan/status/1381186384510255104

While I am thinking about faqs, I'd also like to add short URLs to Rich exceptions.

I loathe cryptic error messages, and I've created a fair few myself. In Rich I've tried to make them as plain English as possible. But...

it would be great if every error message linked to a page that explains the error in detail and offers fixes.

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1295/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1483320357 I_kwDOBm6k_c5Yaawl 1937 /db/-/create API should require insert-rows permission to use row: or rows: option simonw 9599 closed 0   Datasette 1.0a2 8711695 2 2022-12-08T01:33:09Z 2022-12-14T20:21:26Z 2022-12-14T20:21:26Z OWNER  

Otherwise someone with create-table but not insert-rows permission could abuse it to insert data.

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1937/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed
1497288666 I_kwDOBm6k_c5ZPs_a 1956 Handle abbreviations properly in permission_allowed_actor_restrictions simonw 9599 closed 0   Datasette 1.0a2 8711695 2 2022-12-14T19:54:21Z 2022-12-14T20:04:29Z 2022-12-14T20:04:28Z OWNER  

This code currently assumes abbreviations are:

```python
action_initials = "".join([word[0] for word in action.split("-")])
```

https://github.com/simonw/datasette/blob/1a3dcf494376e32f7cff110c86a88e5b0a3f3924/datasette/default_permissions.py#L182-L208

That's no longer correct: they are now registered by the new plugin hook from #1939.
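For reference, the quoted check derives abbreviations like this:

```python
def abbreviation(action):
    # First letter of each hyphen-separated word, as in the code linked above
    return "".join(word[0] for word in action.split("-"))

print(abbreviation("view-table"))  # "vt"
print(abbreviation("insert-row"))  # "ir"
```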

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1956/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed
1495241162 I_kwDOBm6k_c5ZH5HK 1950 Bad ?_sort returns a 500 error, should be a 400 simonw 9599 closed 0     2 2022-12-13T22:08:16Z 2022-12-13T22:23:22Z 2022-12-13T22:23:22Z OWNER  

https://latest.datasette.io/fixtures/facetable?_sort=bad

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1950/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed
1200649124 I_kwDOBm6k_c5HkHOk 1708 Datasette 1.0 alpha upcoming release notes simonw 9599 open 0   Datasette 1.0a-next 8755003 2 2022-04-11T22:57:12Z 2022-12-13T05:29:06Z   OWNER  

I'm going to try writing the release notes first, to see if that helps unblock me.

⚠️ Any release notes in this issue are a draft, and should not be treated as the real thing ⚠️

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1708/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1493339206 I_kwDOBm6k_c5ZAoxG 1946 `datasette --get` mechanism for sending tokens simonw 9599 closed 0   Datasette 1.0a2 8711695 2 2022-12-13T04:25:05Z 2022-12-13T04:36:57Z 2022-12-13T04:36:57Z OWNER  

For the tests for datasette create-token it would be useful if datasette --get had a mechanism for sending an Authorization: Bearer X header.

Originally posted by @simonw in https://github.com/simonw/datasette/issues/1855#issuecomment-1347731288

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1946/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed
1483250004 I_kwDOBm6k_c5YaJlU 1936 Fix /db/table/-/upsert in the API explorer simonw 9599 open 0   Datasette 1.0 3268330 2 2022-12-08T00:59:34Z 2022-12-08T01:36:02Z   OWNER  

Split from:

  • #1931
  • #1878

This is a bit tricky because the code needs to figure out what the primary keys are for an item, and whether or not rowid should be included.

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1936/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1469043836 I_kwDOBm6k_c5Xj9R8 1917 Don't allow writable API to edit the `_memory` database simonw 9599 closed 0   Datasette 1.0a1 7867486 2 2022-11-30T04:51:59Z 2022-11-30T05:07:56Z 2022-11-30T05:07:55Z OWNER  

It shows up on https://latest.datasette.io/-/api (once you are signed in as root) - but there's no point in creating tables in it because they likely won't persist from one request to the next, as it's not a shared named database.

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1917/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed
1468519699 I_kwDOBm6k_c5Xh9UT 1911 `/db/-/create` should support creating tables with compound primary keys simonw 9599 closed 0   Datasette 1.0a0 8658075 2 2022-11-29T18:30:47Z 2022-11-29T18:50:58Z 2022-11-29T18:48:05Z OWNER  

Found myself needing this to write the tests for: #1864

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1911/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed
1455928469 I_kwDOBm6k_c5Wx7SV 1903 Refactor all error classes into a datasette.exceptions module simonw 9599 open 0   Datasette 1.0 3268330 2 2022-11-18T22:44:45Z 2022-11-20T22:35:01Z   OWNER  

While working on this issue: #1896

I realized that Datasette has error classes scattered around a fair bit, including some in the datasette.utils.asgi module for some reason.

I should clean these up.

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1903/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1447388809 I_kwDOBm6k_c5WRWaJ 1887 Add a confirm step to the drop table API simonw 9599 closed 0   Datasette 1.0a0 8658075 2 2022-11-14T04:59:53Z 2022-11-15T19:59:59Z 2022-11-14T05:18:51Z OWNER  

In playing with the API explorer just now I realized it's way too easy to accidentally drop a table using it.

Originally posted by @simonw in https://github.com/simonw/datasette/issues/1871#issuecomment-1313097057

Added drop table API in: #1874

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1887/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed
1431786951 I_kwDOBm6k_c5VV1XH 1876 SQL query should wrap on SQL interrupted screen simonw 9599 closed 0     2 2022-11-01T17:14:01Z 2022-11-01T17:22:33Z 2022-11-01T17:22:33Z OWNER  

Just saw this:

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1876/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed
1420055377 I_kwDOBm6k_c5UpFNR 1847 Both _local_metadata and _metadata_local? simonw 9599 closed 0     2 2022-10-24T01:43:08Z 2022-10-24T01:53:13Z 2022-10-24T01:53:13Z OWNER  

Spotted this in the debugger against the datasette object while running tests (pytest -k test_permissions_cascade to be exact):

```
(Pdb) [p for p in dir(self) if p.startswith('_') and '__' not in p]
['_actor', '_asset_urls', '_connected_databases', '_crumb_items', '_local_metadata', '_metadata', '_metadata_local', '_metadata_recursive_update', '_permission_checks', '_plugins', '_prepare_connection', '_refresh_schemas', '_refresh_schemas_lock', '_register_custom_units', '_register_renderers', '_root_token', '_routes', '_secret', '_settings', '_show_messages', '_startup_hook_calculation', '_startup_hook_fired', '_startup_invoked', '_threads', '_versions', '_write_messages_to_response']
```

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1847/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed
1397193691 I_kwDOBm6k_c5TR3vb 1832 __bool__ method on Results simonw 9599 closed 0     2 2022-10-05T04:18:12Z 2022-10-05T04:32:33Z 2022-10-05T04:32:33Z OWNER  

Wrote this code today: https://github.com/simonw/datasette-public/blob/1401bfae50e71c1dfd2bfb6954f2e86d5a7ab21b/datasette_public/__init__.py#L41

```python
results = await db.execute(
    "select 1 from _public_tables where table_name = ?", [table_name]
)
if len(results):
    return True
```

Would be nice if I could use `if results` there instead.
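A minimal sketch of the requested addition, assuming the Results class keeps its fetched rows in a rows attribute (the attribute name is an assumption here):

```python
class Results:
    def __init__(self, rows):
        self.rows = rows

    def __len__(self):
        return len(self.rows)

    def __bool__(self):
        # Lets callers write `if results:` instead of `if len(results):`
        return bool(self.rows)


print(bool(Results([])))      # False
print(bool(Results([(1,)])))  # True
```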

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1832/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed
1393903845 I_kwDOBm6k_c5TFUjl 1828 word-wrap: anywhere resulting in weird display simonw 9599 closed 0     2 2022-10-02T21:25:03Z 2022-10-02T23:01:17Z 2022-10-02T23:01:17Z OWNER  

e.g. on https://github-to-sqlite.dogsheep.net/github/commits

This is from a change introduced here: https://github.com/simonw/datasette/commit/bf8d84af5422606597be893cedd375020cb2b369 in #1805

https://github.com/simonw/datasette/blob/bf8d84af5422606597be893cedd375020cb2b369/datasette/static/app.css#L447-L450

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1828/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed
1122427321 I_kwDOBm6k_c5C5uG5 1624 Index page `/` has no CORS headers simonw 9599 open 0     2 2022-02-02T21:56:10Z 2022-09-28T16:54:22Z   OWNER  

Compare the following:

```
% curl -I 'https://latest.datasette.io/fixtures'
HTTP/1.1 200 OK
link: https://latest.datasette.io/fixtures.json; rel="alternate"; type="application/json+datasette"
cache-control: max-age=5
referrer-policy: no-referrer
access-control-allow-origin: *
access-control-allow-headers: Authorization
access-control-expose-headers: Link
content-type: text/html; charset=utf-8
x-databases: _memory, _internal, fixtures, extra_database
Date: Wed, 02 Feb 2022 21:55:49 GMT
Server: Google Frontend
Transfer-Encoding: chunked

% curl -I 'https://latest.datasette.io/'
HTTP/1.1 200 OK
link: https://latest.datasette.io/.json; rel="alternate"; type="application/json+datasette"
content-type: text/html; charset=utf-8
x-databases: _memory, _internal, fixtures, extra_database
Date: Wed, 02 Feb 2022 21:55:52 GMT
Server: Google Frontend
Transfer-Encoding: chunked
```

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1624/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1378640768 I_kwDOBm6k_c5SLGOA 1816 Validate settings.json on startup in configuration directory mode simonw 9599 closed 0     2 2022-09-19T23:35:18Z 2022-09-20T01:15:48Z 2022-09-20T01:15:48Z OWNER  

It might have been useful for Datasette to show an error when started against a settings.json file that contains an invalid setting though.

Originally posted by @simonw in https://github.com/simonw/datasette/issues/1814#issuecomment-1251677554

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1816/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed
1378495690 I_kwDOBm6k_c5SKizK 1814 Static files not served frafra 4068 closed 0     2 2022-09-19T20:38:17Z 2022-09-19T23:35:06Z 2022-09-19T23:34:30Z NONE  

Folder structure:

```
bibliography/
bibliography/static-files
bibliography/static-files/styles.css
bibliography/bibliography.db
bibliography/metadata.json
bibliography/settings.json
```

```
$ cat bibliography/settings.json
{
    "suggest_facets": false,
    "truncate_cells_html": 1000,
    "static": "assets:static-files/"
}
```

File /assets/styles.css is not found (HTTP 404, Database not found: assets).

Using datasette revision d0737e4de51ce178e556fc011ccb8cc46bbb6359.

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1814/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed
1339663518 I_kwDOBm6k_c5P2aSe 1784 Include "entrypoint" option on `--load-extension`? asg017 15178711 closed 0     2 2022-08-16T00:22:57Z 2022-08-23T18:34:31Z 2022-08-23T18:34:31Z CONTRIBUTOR  

Problem

SQLite extensions have the option to define multiple "entrypoints" in each loadable extension. For example, the upcoming version of sqlite-lines will have 2 entrypoints: the default sqlite3_lines_init (which SQLite will automatically guess for) and sqlite3_lines_noread_init. The sqlite3_lines_noread_init version omits functions that read from the filesystem, which is necessary for security purposes when running untrusted SQL (which Datasette does).

(Similar multiple entrypoints will also be added for sqlite-http).

The --load-extension flag, however, doesn't give the option to specify a different entrypoint, so the default one is always used.

Proposal

I want there to be a new command line option of the --load-extension flag to specify a custom entrypoint like so:

```
datasette my.db \
  --load-extension ./lines0 sqlite3_lines0_noread_init
```

Then, under the hood, this line of code:

https://github.com/simonw/datasette/blob/7af67b54b7d9bca43e948510fc62f6db2b748fa8/datasette/app.py#L562

Would look something like this:

```python
conn.execute("SELECT load_extension(?, ?)", [extension, entrypoint])
```

One potential problem: For backward compatibility, I'm not sure if Click allows cli flags to have variable number of options ("arity"). So I guess it could also use a : delimiter like --static:

```
datasette my.db \
  --load-extension ./lines0:sqlite3_lines0_noread_init
```

Or maybe even a new flag name?

```
datasette my.db \
  --load-extension-entrypoint ./lines0 sqlite3_lines0_noread_init
```

Personally I prefer the : option... and maybe even --load-extension -> --load? Definitely out of scope for this issue tho

```
datasette my.db \
  --load ./lines0:sqlite3_lines0_noread_init
```
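A rough sketch of how the :-delimited form could be parsed and applied; the function name and the None default for the entrypoint are assumptions:

```python
def parse_load_extension(value):
    # "./lines0:sqlite3_lines0_noread_init" -> ("./lines0", "sqlite3_lines0_noread_init")
    # "./lines0"                            -> ("./lines0", None)
    if ":" in value:
        path, entrypoint = value.split(":", 1)
        return path, entrypoint
    return value, None


path, entrypoint = parse_load_extension("./lines0:sqlite3_lines0_noread_init")
# conn.enable_load_extension(True)
# conn.execute("SELECT load_extension(?, ?)", [path, entrypoint])
```

One wrinkle with the : scheme: a bare Windows path such as C:\lines0.dll would need special-casing.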

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1784/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed
1339444565 I_kwDOBm6k_c5P1k1V 1783 Better guidance as to what to do after you've installed Datasette simonw 9599 open 0     2 2022-08-15T20:11:06Z 2022-08-15T20:14:01Z   OWNER  

Feedback from Discord:

hello, love the project and came for help and to point out a possible gap in the docs. starting with "getting started" and "installation" everything looks great, but then there's a giant leap after you have it installed and running. from the user perspective of "i have a csv or set of csvs that i want to turn into a table(s), what do i do next?" --- so something like maybe a page for creating your first project should go after "installation".

  • https://docs.datasette.io/en/0.62/getting_started.html
  • https://docs.datasette.io/en/0.62/installation.html
datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1783/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1338278056 I_kwDOBm6k_c5PxICo 1782 Release notes for Datasette 0.62 simonw 9599 closed 0   Datasette 0.62 8303187 2 2022-08-14T15:26:45Z 2022-08-14T17:40:45Z 2022-08-14T17:32:54Z OWNER  

I've written a lot of these already for the alphas:

  • https://github.com/simonw/datasette/releases/tag/0.62a0
  • https://github.com/simonw/datasette/releases/tag/0.62a1
datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1782/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed
1223527226 I_kwDOBm6k_c5I7Ys6 1738 "Cannot use _sort and _sort_desc at the same time" simonw 9599 closed 0   Datasette 0.62 8303187 2 2022-05-03T01:06:24Z 2022-08-14T16:13:55Z 2022-08-14T16:13:55Z OWNER  

Triggered this error while playing with the sort desc checkbox and the apply button that are only visible on this page at mobile screen width:

https://latest.datasette.io/fixtures/compound_three_primary_keys?_sort_desc=pk1

Navigate to that page (with the browser narrow enough to show the box), un-check the box and click Apply:

Also notable: I managed to get to a page with ?_sort_desk=pk1 in the URL three times by clicking around with that button.

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1738/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed
1243517592 I_kwDOBm6k_c5KHpKY 1748 Add copy buttons next to code examples in the documentation simonw 9599 closed 0     2 2022-05-20T19:09:00Z 2022-05-20T19:15:00Z 2022-05-20T19:11:32Z OWNER  

Similar to the ones in datasette-copyable which are implemented here: https://github.com/executablebooks/sphinx-copybutton/tree/f84c001a0507f8ec46779d0701b079a265564583

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1748/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed
1221849746 I_kwDOBm6k_c5I0_KS 1732 Custom page variables aren't decoded tannewt 52649 open 0     2 2022-04-30T14:55:46Z 2022-05-03T01:50:45Z   NONE  

I have a page templates/filer/{filer_id}.html. It uses filer_id in a sql() call to fetch data. With 0.61.1 this no longer works because the spaces in IDs aren't preserved. Instead, the escaped version is passed into the template and the id isn't present in my db.

Datasette should unescape the url component before passing it into the template.

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1732/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1129052172 I_kwDOBm6k_c5DS_gM 1633 base_url or prefix does not work with _exact match henrikek 6613091 open 0     2 2022-02-09T21:45:07Z 2022-04-28T09:12:56Z   NONE  

When I hit the "Apply" button to search with the "_exact" column syntax, the URL prefix is removed from the URL.

And the result is:

If I add the marked row to url_builder.py it seems to work:

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1633/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1182227211 I_kwDOBm6k_c5Gd1sL 1692 [plugins][feature request]: Support additional script tag attributes when loading custom JS hydrosquall 9020979 open 0     2 2022-03-27T01:16:03Z 2022-03-30T06:14:51Z   CONTRIBUTOR  

Motivation

  • The build system for my new plugin has two output JS files, one for browsers that support ES modules, one for browsers that don't. At present, I'm only passing one of them into Datasette.
  • I'd like to specify the non-es-module script as a fallback for older browsers. I don't want to load it by default, because browsers will only need one, and it's heavy, so for now I'm only supporting modern browsers.

To be able to support legacy browsers without slowing down users with modern browsers, I would like to be able to set two additional HTML attributes on the fallback script tag: nomodule and defer. My injected scripts should look something like this:

```html
<script type="module" src="/index.my-es-module-bundle.js"></script>
<script src="/index.my-legacy-fallback-bundle.js" nomodule="" defer></script>
```

Proposal

To achieve this, I propose additional optional properties to the API accepted by the extra_js_urls hook and custom JS field the metadata.json described here.

Under this API, I'd write something like this to get the above HTML rendered in Datasette.

```json
{
    "extra_js_urls": [
        {
            "url": "/index.my-es-module-bundle.js",
            "module": true
        },
        {
            "url": "/index.my-legacy-fallback-bundle.js",
            "nomodule": "",
            "defer": true
        }
    ]
}
```

Resources

  • MDN on the script tag
  • There may be other properties that could be added that are potentially valuable, like async or referrerpolicy, but I don't have an immediate need for those.
datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1692/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1185868354 I_kwDOBm6k_c5GrupC 1695 Option to un-filter facet not shown for `?col__exact=value` simonw 9599 open 0     2 2022-03-30T04:44:02Z 2022-03-30T04:46:18Z   OWNER  

Spotted this on a page with COUNTY__exact=Lee in the URL:

With COUNTY=Lee you get this instead:

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1695/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1182141761 I_kwDOBm6k_c5Gdg1B 1690 Idea: `datasette.set_actor_cookie(response, actor)` simonw 9599 open 0     2 2022-03-26T22:41:52Z 2022-03-26T22:43:00Z   OWNER  

I just wrote this code in a plugin and it felt like it could benefit from an abstraction: https://github.com/simonw/datasette-auth0/blob/152e6eb21e96e9b73bd9c205f9749a1297d0ef0b/datasette_auth0/__init__.py#L79-L92

```python
redirect_response = Response.redirect("/")
expires_at = int(time.time()) + (24 * 60 * 60)
redirect_response.set_cookie(
    "ds_actor",
    datasette.sign(
        {
            "a": profile_response.json(),
            "e": baseconv.base62.encode(expires_at),
        },
        "actor",
    ),
)
return redirect_response
```
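A possible shape for that abstraction, mirroring the plugin code above; the helper name, the expire_after parameter and the import location for baseconv are assumptions:

```python
import time

from datasette.utils import baseconv  # assumed location of the base62 helper


def set_actor_cookie(datasette, response, actor, expire_after=None):
    # Sign the actor and optionally attach an "e" expiry field, as in the code above
    payload = {"a": actor}
    if expire_after is not None:
        payload["e"] = baseconv.base62.encode(int(time.time()) + expire_after)
    response.set_cookie("ds_actor", datasette.sign(payload, "actor"))
```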

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1690/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1177101697 I_kwDOBm6k_c5GKSWB 1681 Potential bug in numeric handling where_clause for filters simonw 9599 open 0     2 2022-03-22T17:43:50Z 2022-03-22T17:49:09Z   OWNER  

Note that Datasette does already have special logic to convert parameters to integers for numeric comparisons like >:

https://github.com/simonw/datasette/blob/c4c9dbd0386e46d2bf199f0ed34e4895c98cb78c/datasette/filters.py#L203-L212

Though... it looks like there's a bug in that? It doesn't account for float values - "3.5".isdigit() returns False - probably for the best, because int(3.5) would break that value anyway.

Originally posted by @simonw in https://github.com/simonw/datasette/issues/1671#issuecomment-1075432283
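A hypothetical replacement for the isdigit() check that also handles floats and negative numbers, falling back to the original string otherwise:

```python
def to_sql_number(value):
    # "3" -> 3, "3.5" -> 3.5, "-2" -> -2, "abc" -> "abc"
    try:
        return int(value)
    except ValueError:
        try:
            return float(value)
        except ValueError:
            return value


print(to_sql_number("3"), to_sql_number("3.5"), to_sql_number("abc"))  # 3 3.5 abc
```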

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1681/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
910088936 MDU6SXNzdWU5MTAwODg5MzY= 1355 datasette --get should efficiently handle streaming CSV simonw 9599 open 0     2 2021-06-03T04:40:40Z 2022-03-20T22:38:53Z   OWNER  

It would be great if you could use datasette --get to run queries that return streaming CSV data without running out of RAM.

Current implementation looks like it loads the entire result into memory first: https://github.com/simonw/datasette/blob/f78ebdc04537a6102316d6dbbf6c887565806078/datasette/cli.py#L546-L552

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1355/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1174404647 I_kwDOBm6k_c5F__4n 1669 Release 0.61 alpha simonw 9599 closed 0     2 2022-03-20T00:35:35Z 2022-03-20T01:24:36Z 2022-03-20T01:24:36Z OWNER  

I'm going to release this as a 0.61 alpha so I can more easily depend on it from datasette-hashed-urls.

Originally posted by @simonw in https://github.com/simonw/datasette/issues/1668#issuecomment-1073136896

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1669/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed
1170554975 I_kwDOBm6k_c5FxUBf 1663 Document the internals that were used in datasette-hashed-urls simonw 9599 closed 0   Datasette 1.0 3268330 2 2022-03-16T05:17:08Z 2022-03-19T04:04:50Z 2022-03-17T21:32:38Z OWNER  

The https://github.com/simonw/datasette-hashed-urls plugin used a couple of currently undocumented features:

  • db.hash
  • Datasette(..., immutables=[...])

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1663/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed
1122416919 I_kwDOBm6k_c5C5rkX 1623 /-/patterns returns link: alternate JSON header to 404 simonw 9599 closed 0   Datasette 1.0 3268330 2 2022-02-02T21:42:49Z 2022-03-19T04:04:49Z 2022-02-02T21:48:56Z OWNER  

Bug from: #1620

```
% curl -s -I 'https://latest.datasette.io/-/patterns' | grep link
link: https://latest.datasette.io/-/patterns.json; rel="alternate"; type="application/json+datasette"
```

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1623/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed
626593402 MDU6SXNzdWU2MjY1OTM0MDI= 780 Internals documentation for datasette.metadata() method simonw 9599 open 0   Datasette 1.0 3268330 2 2020-05-28T15:14:22Z 2022-03-15T20:50:34Z   OWNER  

https://github.com/simonw/datasette/blob/40885ef24e32d91502b6b8bbad1c7376f50f2830/datasette/app.py#L297-L328

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/780/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1161937073 I_kwDOBm6k_c5FQcCx 1653 Mechanism to default a table to sorting by multiple columns simonw 9599 open 0     2 2022-03-07T21:20:11Z 2022-03-07T21:23:39Z   OWNER  

Discussed in https://github.com/simonw/datasette/discussions/1652

Originally posted by zaneselvans March 7, 2022

It's easy to tell datasette to sort tables using a single column, as [described in the docs](https://docs.datasette.io/en/stable/metadata.html#setting-a-default-sort-order):

```yaml
databases:
  ferc1:
    tables:
      f1_edcfu_epda:
        sort: created_time
```

But is there some way to tell it to sort using a composite key, like you would in an `ORDER BY` clause instead? For example, the way it's being done [in this query](https://data.catalyst.coop/ferc1?sql=select%0D%0A++rowid%2C%0D%0A++respondent_id%2C%0D%0A++report_year%2C%0D%0A++spplmnt_num%2C%0D%0A++row_number%2C%0D%0A++row_seq%2C%0D%0A++row_prvlg%2C%0D%0A++acct_num%2C%0D%0A++depr_plnt_base%2C%0D%0A++est_avg_srvce_lf%2C%0D%0A++net_salvage%2C%0D%0A++apply_depr_rate%2C%0D%0A++mrtlty_crv_typ%2C%0D%0A++avg_remaining_lf%2C%0D%0A++report_prd%0D%0Afrom%0D%0A++f1_edcfu_epda%0D%0Awhere%0D%0A++respondent_id+%3D+210%0D%0A++AND+report_year+%3D+2020%0D%0Aorder+by%0D%0A++report_year%2C+report_prd%2C+respondent_id%2C+spplmnt_num%2C+row_number%0D%0Alimit%0D%0A++1000) on our Datasette?

```sql
SELECT
  respondent_id,
  report_year,
  spplmnt_num,
  row_number,
  row_seq,
  row_prvlg,
  acct_num,
  depr_plnt_base,
  est_avg_srvce_lf,
  net_salvage,
  apply_depr_rate,
  mrtlty_crv_typ,
  avg_remaining_lf,
  report_prd
FROM f1_edcfu_epda
WHERE
  respondent_id = 210
  AND report_year = 2020
ORDER BY
  report_year, report_prd, respondent_id, spplmnt_num, row_number
LIMIT 1000
```

The problem here is that by default it's using `rowid` (the SQLite assigned autoincrementing integer key) to order the records, but the table **should** have a natural composite primary key, but the original database that this data is being migrated from doesn't enforce unique primary keys, so there are dupes, and we don't want to drop those rows, and the records are somehow getting jumbled in the database (the `rowid` ordering isn't lined up with the expected ordering based on the composite primary key, though it's close) and this jumbling is confusing to users that expect to see the data ordered based on the natural primary key.

I've tried setting the `sort` metadata parameter to a list of column names, a tuple of column names, a quoted string of comma-separated column names, a quoted string of a tuple of column names...

```yaml
databases:
  ferc1:
    tables:
      f1_edcfu_epda:
        sort: "(report_year, report_prd, respondent_id, spplmnt_num, row_number)"
```

and they all give me server errors like:

```
Cannot sort table by (report_year, report_prd, respondent_id, spplmnt_num, row_number)
```
datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1653/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1148725876 I_kwDOBm6k_c5EeCp0 1640 Support static assets where file length may change, e.g. logs broccolihighkicks 57859326 open 0     2 2022-02-24T00:34:42Z 2022-03-05T01:19:25Z   NONE  

This is a bit of an oxymoron.

I am serving a log.txt file for a background process using the Datasette --static CLI. This is useful as I can observe a background process from the web UI to see any errors that occur (instead of spelunking the logs via docker exec/ssh etc).

I get this error, which I think is because Datasette assumes that the size of the content does not change (but appending new log lines means the content length changes).

```python
Traceback (most recent call last):
  File "/usr/local/lib/python3.9/site-packages/datasette/app.py", line 1181, in route_path
    response = await view(request, send)
  File "/usr/local/lib/python3.9/site-packages/datasette/utils/asgi.py", line 305, in inner_static
    await asgi_send_file(send, full_path, chunk_size=chunk_size)
  File "/usr/local/lib/python3.9/site-packages/datasette/utils/asgi.py", line 280, in asgi_send_file
    await send(
  File "/usr/local/lib/python3.9/site-packages/asgi_csrf.py", line 104, in wrapped_send
    await send(event)
  File "/usr/local/lib/python3.9/site-packages/uvicorn/protocols/http/h11_impl.py", line 460, in send
    output = self.conn.send(event)
  File "/usr/local/lib/python3.9/site-packages/h11/_connection.py", line 468, in send
    data_list = self.send_with_data_passthrough(event)
  File "/usr/local/lib/python3.9/site-packages/h11/_connection.py", line 501, in send_with_data_passthrough
    writer(event, data_list.append)
  File "/usr/local/lib/python3.9/site-packages/h11/_writers.py", line 58, in __call__
    self.send_data(event.data, write)
  File "/usr/local/lib/python3.9/site-packages/h11/_writers.py", line 78, in send_data
    raise LocalProtocolError("Too much data for declared Content-Length")
h11._util.LocalProtocolError: Too much data for declared Content-Length
ERROR:    Exception in ASGI application
Traceback (most recent call last):
  File "/usr/local/lib/python3.9/site-packages/datasette/app.py", line 1181, in route_path
    response = await view(request, send)
  File "/usr/local/lib/python3.9/site-packages/datasette/utils/asgi.py", line 305, in inner_static
    await asgi_send_file(send, full_path, chunk_size=chunk_size)
  File "/usr/local/lib/python3.9/site-packages/datasette/utils/asgi.py", line 280, in asgi_send_file
    await send(
  File "/usr/local/lib/python3.9/site-packages/asgi_csrf.py", line 104, in wrapped_send
    await send(event)
  File "/usr/local/lib/python3.9/site-packages/uvicorn/protocols/http/h11_impl.py", line 460, in send
    output = self.conn.send(event)
  File "/usr/local/lib/python3.9/site-packages/h11/_connection.py", line 468, in send
    data_list = self.send_with_data_passthrough(event)
  File "/usr/local/lib/python3.9/site-packages/h11/_connection.py", line 501, in send_with_data_passthrough
    writer(event, data_list.append)
  File "/usr/local/lib/python3.9/site-packages/h11/_writers.py", line 58, in __call__
    self.send_data(event.data, write)
  File "/usr/local/lib/python3.9/site-packages/h11/_writers.py", line 78, in send_data
    raise LocalProtocolError("Too much data for declared Content-Length")
h11._util.LocalProtocolError: Too much data for declared Content-Length
```

Thanks, I am finding Datasette very useful.
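Not part of the original report, but one possible workaround sketch while this is open: serve the log through a tiny one-off plugin instead of `--static`, reading the whole file on every request so the declared Content-Length always matches what is sent. The `/-/logs` URL and `LOG_PATH` below are made up:

```python
from pathlib import Path

from datasette import hookimpl
from datasette.utils.asgi import Response

# Hypothetical location of the growing log file.
LOG_PATH = Path("/var/log/background-process/log.txt")


@hookimpl
def register_routes():
    # Expose the log at a made-up URL.
    return [(r"^/-/logs$", serve_log)]


async def serve_log(request):
    # Reading the file in full per request means the response body and the
    # declared Content-Length are always consistent, even if the file grows
    # between requests.
    return Response.text(LOG_PATH.read_text(errors="replace"))
```

The trade-off is that each request re-reads the whole log, which is fine for small files but not for very large ones.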

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1640/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1109783030 I_kwDOBm6k_c5CJfH2 1607 More detailed information about installed SpatiaLite version simonw 9599 closed 0   Datasette 1.0 3268330 2 2022-01-20T21:28:03Z 2022-02-09T06:42:02Z 2022-02-09T06:32:28Z OWNER  

https://www.gaia-gis.it/gaia-sins/spatialite-sql-5.0.0.html#version has a whole bunch of interesting functions for things like freexl_version() and geos_version() and HasMathSQL() and suchlike.

These could be shown on the /-/versions page.
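A sketch (not from the issue) of how those values can be pulled out with plain sqlite3 - the extension path is an assumption and varies by platform:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.enable_load_extension(True)
# Example path (macOS/Homebrew); on Debian/Ubuntu it is usually
# /usr/lib/x86_64-linux-gnu/mod_spatialite.so
conn.load_extension("/usr/local/lib/mod_spatialite.dylib")

for fn in ("spatialite_version()", "freexl_version()", "geos_version()", "HasMathSQL()"):
    try:
        print(fn, "=", conn.execute("select {}".format(fn)).fetchone()[0])
    except sqlite3.OperationalError as ex:
        # Not every SpatiaLite build exposes every function.
        print(fn, "->", ex)
```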

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1607/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed
1117132741 I_kwDOBm6k_c5ClhfF 1615 Potential simplified publishing mechanism aidansteele 369053 closed 0     2 2022-01-28T08:34:50Z 2022-02-02T07:34:21Z 2022-02-02T07:34:17Z NONE  

Hi,

Forewarning: this idea is one I've only been thinking about for a while and it's not fully fleshed-out yet.

I love Datasette and what it stands for. I was thinking about how we could make it accessible to more people, especially those without access to credit cards required for a lot of hosting options. Or they might not feel comfortable signing up for said services.

So I was thinking I might create a service that hosts Datasette instances for folks. I'd probably stick it on AWS Lambda and limit requests to something like n/month to avoid bankrupting myself. If I did build such a hypothetical service, I was thinking I would rely on GitHub Actions to do the heavy lifting.

E.g. user johndoe creates a repo my-animals with a couple of files: dogs.csv, cats.csv and the following GitHub Actions workflow:

```yaml
# .github/workflows/push.yml
on: push

# this allows the publish action to use OIDC to authenticate johndoe/my-animals
permissions:
  id-token: write
  contents: read

jobs:
  publish:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/setup-python@v2

      - run: pip install sqlite-utils

      - uses: actions/checkout@v2

      - run: |
          set -eux

          sqlite-utils create-database animals.db
          sqlite-utils insert animals.db dogs dogs.csv --csv
          sqlite-utils insert animals.db cats cats.csv --csv

      - uses: datasette-hub/publish@v1
        with:
          db: animals.db
          metadata: meta.yml

        # this step is helpful for debugging why the
        # generated sqlite db was rejected
      - uses: actions/upload-artifact@v2
        if: failure()
        with:
          path: animals.db
          retention-days: 1
```

This would then cause a Datasette instance to be available at https://johndoe-my-animals.datasette-hub.test/. It feels like this could significantly reduce the friction to someone being able to go from data set to Datasette.

What do you think? Does this address a real need? Or am I perhaps misunderstanding the main friction points? As a bonus: it feels like this would pair well with git scraping.

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1615/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed
842416110 MDU6SXNzdWU4NDI0MTYxMTA= 1278 SpatiaLite timezones demo is broken simonw 9599 closed 0     2 2021-03-27T04:45:27Z 2022-01-20T21:29:43Z 2021-03-27T16:17:13Z OWNER  

https://github.com/simonw/datasette/blob/5fd02890650db790b2ffdb90eb9f78f8e0639c37/docs/spatialite.rst#L96

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1278/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed
443040665 MDU6SXNzdWU0NDMwNDA2NjU= 466 Move "no such module: VirtualSpatialIndex" code elsewhere simonw 9599 closed 0   0.28 4305096 2 2019-05-11T22:09:00Z 2022-01-20T21:29:41Z 2019-05-11T22:57:22Z OWNER  

We currently show a useful warning (from #331) when the user tries to open a spatialite database without first loading the module:

https://github.com/simonw/datasette/blob/c692cd291111050483a32bea1ee08e994a0b781b/datasette/app.py#L547-L554

This code is part of .inspect() which is going away - see #462 - so I need to find somewhere else for it to live.

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/466/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed
563347679 MDU6SXNzdWU1NjMzNDc2Nzk= 668 Make it easier to load SpatiaLite simonw 9599 closed 0     2 2020-02-11T17:03:43Z 2022-01-20T21:29:41Z 2021-01-04T20:18:39Z OWNER  

```
$ datasette spatial.db
Serve! files=('spatial.db',) (immutables=()) on port 8001
ERROR: conn=<sqlite3.Connection object at 0x11e388f10>, sql = 'PRAGMA table_info(SpatialIndex);', params = None: no such module: VirtualSpatialIndex
Usage: datasette serve [OPTIONS] [FILES]...

Error: It looks like you're trying to load a SpatiaLite database without first loading the SpatiaLite module.

Read more: https://datasette.readthedocs.io/en/latest/spatialite.html
```

This error message could sniff around in the common locations for the SpatiaLite module and output the CLI command you should use to enable it:

```
datasette spatial.db --load-extension=/usr/local/lib/mod_spatialite.dylib
```

Even better: if Datasette had a `--spatialite` option which automatically loads the extension from common locations, if it can find it.
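A rough sketch (not from the issue) of what that sniffing could look like - the candidate paths are examples only and differ between platforms and package managers:

```python
from pathlib import Path

# Example locations only.
CANDIDATE_PATHS = [
    "/usr/local/lib/mod_spatialite.dylib",          # macOS (Homebrew)
    "/usr/lib/x86_64-linux-gnu/mod_spatialite.so",  # Debian/Ubuntu
    "/usr/local/lib/mod_spatialite.so",
]


def suggest_spatialite_command(db_path):
    # Return the CLI invocation to suggest in the error message, if a
    # SpatiaLite module can be found in one of the usual places.
    for candidate in CANDIDATE_PATHS:
        if Path(candidate).exists():
            return "datasette {} --load-extension={}".format(db_path, candidate)
    return None


print(suggest_spatialite_command("spatial.db"))
```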

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/668/reactions",
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed
336936010 MDU6SXNzdWUzMzY5MzYwMTA= 331 Datasette throws error when loading spatialite db without extension loaded psychemedia 82988 closed 0     2 2018-06-29T09:51:14Z 2022-01-20T21:29:40Z 2018-07-10T15:13:36Z CONTRIBUTOR  

When starting datasette on a SpatialLite database without loading the SpatiaLite extension (using eg --load-extension=/usr/local/lib/mod_spatialite.dylib) an error is thrown and the server fails to start:

```
datasette -p 8003 adminboundaries.db
Serve! files=('adminboundaries.db',) on port 8003
Traceback (most recent call last):
  File "/Users/ajh59/anaconda3/bin/datasette", line 11, in <module>
    sys.exit(cli())
  File "/Users/ajh59/anaconda3/lib/python3.6/site-packages/click/core.py", line 722, in __call__
    return self.main(*args, **kwargs)
  File "/Users/ajh59/anaconda3/lib/python3.6/site-packages/click/core.py", line 697, in main
    rv = self.invoke(ctx)
  File "/Users/ajh59/anaconda3/lib/python3.6/site-packages/click/core.py", line 1066, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/Users/ajh59/anaconda3/lib/python3.6/site-packages/click/core.py", line 895, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/Users/ajh59/anaconda3/lib/python3.6/site-packages/click/core.py", line 535, in invoke
    return callback(*args, **kwargs)
  File "/Users/ajh59/anaconda3/lib/python3.6/site-packages/datasette/cli.py", line 552, in serve
    ds.inspect()
  File "/Users/ajh59/anaconda3/lib/python3.6/site-packages/datasette/app.py", line 273, in inspect
    "tables": inspect_tables(conn, self.metadata.get("databases", {}).get(name, {}))
  File "/Users/ajh59/anaconda3/lib/python3.6/site-packages/datasette/inspect.py", line 79, in inspect_tables
    "PRAGMA table_info({});".format(escape_sqlite(table))
sqlite3.OperationalError: no such module: VirtualSpatialIndex
```

It would be nice to trap this and return a message saying something like:

```
It looks like you're trying to load a SpatiaLite database? Make sure you load in the SpatiaLite extension when starting datasette.

Read more: https://datasette.readthedocs.io/en/latest/spatialite.html
```

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/331/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed
1097040427 I_kwDOBm6k_c5BY4Ir 1587 Add `sqlite_stat1`(-4) tables to hidden table list simonw 9599 closed 0     2 2022-01-08T21:28:20Z 2022-01-20T04:12:59Z 2022-01-20T04:12:59Z OWNER  

Running ANALYZE creates a new visible table called sqlite_stat1: https://www.sqlite.org/fileformat.html#the_sqlite_stat1_table

This should be added to the default list of hidden tables in Datasette.
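A quick way to see the table appear (a sketch, not taken from the issue):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript(
    """
    CREATE TABLE plants (id INTEGER PRIMARY KEY, name TEXT);
    CREATE INDEX idx_plants_name ON plants(name);
    INSERT INTO plants (name) VALUES ('fern'), ('cactus');
    ANALYZE;
    """
)
# sqlite_stat1 now shows up alongside ordinary tables, which is why it needs
# to join the hidden table list.
print(conn.execute("select name from sqlite_master where type = 'table'").fetchall())
```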

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1587/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed
991467558 MDU6SXNzdWU5OTE0Njc1NTg= 1466 Add Datasette Desktop to installation documentation simonw 9599 closed 0   Datasette 0.60 7571612 2 2021-09-08T19:41:27Z 2022-01-13T22:28:28Z 2022-01-13T21:55:18Z OWNER  

See https://datasette.io/desktop and https://simonwillison.net/2021/Sep/8/datasette-desktop/

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1466/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed
1083921371 I_kwDOBm6k_c5Am1Pb 1570 Separate db.execute_write() into three methods simonw 9599 closed 0   Datasette 0.60 7571612 2 2021-12-18T18:45:54Z 2022-01-13T22:27:38Z 2021-12-18T18:57:25Z OWNER  

Rather than adding a executemany=True parameter, I'm now thinking a better design might be to have three methods:

  • db.execute_write(sql, params=None, block=False)
  • db.execute_write_script(sql, block=False)
  • db.execute_write_many(sql, params_seq, block=False)

Originally posted by @simonw in https://github.com/simonw/datasette/issues/1555#issuecomment-997267416
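A sketch of how the three methods might be called, using the signatures listed above (the `logs` table and the `db` object are placeholders, not a confirmed API):

```python
# `db` is assumed to be a datasette.database.Database instance.
async def write_examples(db):
    # Single statement with parameters - maps to conn.execute()
    await db.execute_write(
        "insert into logs (message) values (?)", ["hello"], block=True
    )
    # Several statements in one string - maps to conn.executescript()
    await db.execute_write_script(
        "create table if not exists logs (message text); delete from logs;",
        block=True,
    )
    # One statement, many parameter rows - maps to conn.executemany()
    await db.execute_write_many(
        "insert into logs (message) values (?)", [["a"], ["b"], ["c"]], block=True
    )
```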

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1570/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed
1083895395 I_kwDOBm6k_c5Amu5j 1569 db.execute_write(..., executescript=True) parameter simonw 9599 closed 0   Datasette 0.60 7571612 2 2021-12-18T18:20:47Z 2022-01-13T22:27:27Z 2021-12-18T18:34:18Z OWNER  

Idea: teach execute_write to accept an optional `executescript=True` parameter, like this:

```diff
diff --git a/datasette/database.py b/datasette/database.py
index 468e936..1a424f5 100644
--- a/datasette/database.py
+++ b/datasette/database.py
@@ -94,10 +94,14 @@ class Database:
             f"file:{self.path}{qs}", uri=True, check_same_thread=False
         )
 
-    async def execute_write(self, sql, params=None, block=False):
+    async def execute_write(self, sql, params=None, executescript=False, block=False):
+        assert not executescript and params, "Cannot use params with executescript=True"
         def _inner(conn):
             with conn:
-                return conn.execute(sql, params or [])
+                if executescript:
+                    return conn.executescript(sql)
+                else:
+                    return conn.execute(sql, params or [])
 
         with trace("sql", database=self.name, sql=sql.strip(), params=params):
             results = await self.execute_write_fn(_inner, block=block)
```

Originally posted by @simonw in https://github.com/simonw/datasette/issues/1555#issuecomment-997248364

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1569/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed
1083726550 I_kwDOBm6k_c5AmFrW 1568 Trace should show queries on the write connection too simonw 9599 closed 0   Datasette 0.60 7571612 2 2021-12-18T02:34:12Z 2022-01-13T22:27:23Z 2021-12-18T02:42:34Z OWNER  

Here's why - trace only applies to read, not write SQL operations: https://github.com/simonw/datasette/blob/7c8f8aa209e4ba7bf83976f8495d67c28fbfca24/datasette/database.py#L209-L211

Originally posted by @simonw in https://github.com/simonw/datasette/issues/1555#issuecomment-997128508

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1568/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed
1083573206 I_kwDOBm6k_c5AlgPW 1563 Datasette(... files=) should not be a required argument simonw 9599 closed 0   Datasette 0.60 7571612 2 2021-12-17T19:54:18Z 2022-01-13T22:27:18Z 2021-12-18T02:19:40Z OWNER  

```pycon
>>> ds = Datasette(memory=True)
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
TypeError: __init__() missing 1 required positional argument: 'files'
>>> ds = Datasette(memory=True, files=[])
```

I wanted to create an in-memory Datasette for running some tests, no point in forcing me to pass `files=[]` to do that.

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1563/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed
712202333 MDU6SXNzdWU3MTIyMDIzMzM= 982 SQL editor should allow execution of write queries, if you have permission simonw 9599 open 0     2 2020-09-30T19:04:35Z 2022-01-13T22:21:29Z   OWNER  

The datasette-write plugin provides this at the moment https://github.com/simonw/datasette-write - but it feels like it should be a built-in capability, protected by a default permission.

UI concept: if you have write permission then the existing SQL editor gets an "execute write" checkbox underneath it.

JavaScript can spot if you appear to be trying to execute an UPDATE or INSERT or DELETE query and check that checkbox for you.

If you link to a query page with a non-SELECT then that query will be displayed in the box ready for you to POST submit it. The page will also then get "cannot be embedded" headers to protect against clickjacking.
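The detection itself would live in JavaScript on the query page, but the idea is simple enough to sketch (shown in Python here for consistency with the rest of the examples; the keyword list is illustrative, not exhaustive):

```python
import re

WRITE_KEYWORDS = {"insert", "update", "delete", "replace", "create", "drop", "alter"}


def looks_like_write_query(sql):
    # Strip leading whitespace and `--` line comments, then check the first keyword.
    stripped = re.sub(r"^\s*(--[^\n]*\n\s*)*", "", sql)
    words = stripped.split(None, 1)
    return bool(words) and words[0].lower() in WRITE_KEYWORDS


print(looks_like_write_query("UPDATE docs SET title = 'x'"))  # True
print(looks_like_write_query("select * from docs"))           # False
```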

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/982/reactions",
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1077628073 I_kwDOBm6k_c5AO0yp 1550 Research option for returning all rows from arbitrary query simonw 9599 open 0     2 2021-12-11T19:31:11Z 2021-12-11T23:43:24Z   OWNER  

Inspired by thinking about #1549 - returning ALL rows from an arbitrary query is a lot easier if you just run that query and keep iterating over the cursor.

I've avoided doing that in the past because it could tie up a connection for a long time - but in private instances this wouldn't be such a problem.
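The core of the idea, sketched with plain sqlite3 (database and table names are made up): keep pulling rows from the cursor in chunks instead of materializing everything up front, accepting that the connection stays busy until the iteration finishes.

```python
import sqlite3


def stream_all_rows(db_path, sql, params=None, chunk_size=1000):
    # Read-only connection; it is tied up for the duration of the iteration,
    # which is the trade-off discussed above.
    conn = sqlite3.connect("file:{}?mode=ro".format(db_path), uri=True)
    try:
        cursor = conn.execute(sql, params or [])
        while True:
            rows = cursor.fetchmany(chunk_size)
            if not rows:
                break
            yield from rows
    finally:
        conn.close()


# Example usage (hypothetical database file):
# for row in stream_all_rows("fixtures.db", "select * from compound_three_primary_keys"):
#     ...  # e.g. write each row out as a CSV line
```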

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1550/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1073712378 I_kwDOBm6k_c4__4z6 1544 Code that detects the label column for a table is case-sensitive simonw 9599 closed 0     2 2021-12-07T20:01:25Z 2021-12-07T20:03:43Z 2021-12-07T20:03:43Z OWNER  

I just noticed that a column called Name is not being picked up as the label column for a table.

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1544/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed
1059549523 I_kwDOBm6k_c4_J3FT 1526 Add to vercel.json, rather than overwriting it. mroswell 192568 closed 0     2 2021-11-22T00:47:12Z 2021-11-22T04:49:45Z 2021-11-22T04:13:47Z CONTRIBUTOR  

I'd like to be able to add to vercel.json. But Datasette overwrites whatever I put in that file. I originally reported this here: https://github.com/simonw/datasette-publish-vercel/issues/51

In that case, I wanted to do a rewrite... and now I need to do 301 redirects (because we had to rename our site).

Can this be addressed?

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1526/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed
1057996111 I_kwDOBm6k_c4_D71P 1517 Let `register_routes()` over-ride default routes within Datasette simonw 9599 closed 0   Datasette 1.0 3268330 2 2021-11-19T00:22:15Z 2021-11-19T03:20:00Z 2021-11-19T03:07:27Z OWNER  

See https://github.com/simonw/datasette/issues/878#issuecomment-973554024 - right now register_routes() can't replace default Datasette routes.

It would be neat if plugins could do this - especially if there was a neat documented way for them to then re-dispatch to the original route code after making some kind of modification.

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1517/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed
1021550542 I_kwDOBm6k_c4845_O 1482 Support Python 3.10 simonw 9599 closed 0     2 2021-10-09T00:30:52Z 2021-10-24T22:21:40Z 2021-10-24T22:19:55Z OWNER  

I started work on this in #1481 where I found a Python 3.10 bug that needs a workaround in Janus, see:

  • https://github.com/aio-libs/janus/issues/358

This is a tracking issue for anything else that shows up.

This is also needed for the Homebrew package to upgrade to 3.10:

  • https://github.com/Homebrew/homebrew-core/pull/86932
datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1482/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed
950664971 MDU6SXNzdWU5NTA2NjQ5NzE= 1401 unordered list is not rendering bullet points in description_html on database page fgregg 536941 open 0     2 2021-07-22T13:24:18Z 2021-10-23T13:09:10Z   CONTRIBUTOR  

Thanks for this tremendous package, @simonw!

In the description_html for a database, I have an unordered list.

However, on the database page on the deployed site, it is not rendering this as a bulleted list.

Page here: https://labordata-warehouse.herokuapp.com/nlrb-9da4ae5

The documentation gives an example of using an unordered list in a description_html, so I expected this to work.
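For reference, a sketch (not from the issue) of metadata with an unordered list in `description_html`, written out from Python - the database name and list items are placeholders:

```python
import json

metadata = {
    "databases": {
        "nlrb": {  # placeholder database name
            "description_html": (
                "<p>This warehouse includes:</p>"
                "<ul>"
                "<li>union election petitions</li>"
                "<li>unfair labor practice charges</li>"
                "</ul>"
            )
        }
    }
}

with open("metadata.json", "w") as fp:
    json.dump(metadata, fp, indent=2)
```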

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1401/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
994450961 MDU6SXNzdWU5OTQ0NTA5NjE= 1469 Column cog shows "facet by this" when already default faceted simonw 9599 closed 0     2 2021-09-13T04:51:26Z 2021-10-13T21:20:07Z 2021-10-13T21:20:07Z OWNER  

e.g. on https://covid-19.datasettes.com/covid/economist_excess_deaths

But if you add ?_facet=country to the URL that goes away: https://covid-19.datasettes.com/covid/economist_excess_deaths?_facet_size=5&_facet=country

The logic that decides if the "Facet by this" item is shown does not take default metadata.json facets into account.

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1469/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed
999902754 I_kwDOBm6k_c47mU4i 1473 base logo link visits `undefined` rather than href url mroswell 192568 open 0     2 2021-09-18T04:17:04Z 2021-09-19T00:45:32Z   CONTRIBUTOR  

I have two connected sites: http://www.SaferOrToxic.org (a Hugo website) and: http://disinfectants.SaferOrToxic.org/disinfectants/listN (a datasette table page)

The latter is linked as "The List" in the former's menu. (I'd love a prettier URL, but that's what I've got.)

On: http://disinfectants.SaferOrToxic.org/disinfectants/listN ... all the other menu links should point back to: https://www.SaferOrToxic.org And they do!

But the logo, for some reason--though it has an href pointing to: https://www.SaferOrToxic.org Keeps going to this instead: https://disinfectants.saferortoxic.org/disinfectants/undefined

What is causing that? How can I fix it?

In #1284 back in March, I was doing battle with the index.html template, in a still unresolved issue. (I wanted only a single table page at the root.)

But I thought, well, if I can't resolve that, at least I could just point the main website to the datasette page ("The List,") and then have the List point back to the home website.

The menu hrefs to https://www.SaferOrToxic.org work just fine, exactly as they should, from the datasette page. Even the Home link works properly.

But the logo link keeps rewriting to: https://disinfectants.saferortoxic.org/disinfectants/undefined

This is the HTML:

```html
<a class="text-3xl font-bold leading-none" href="https://www.saferortoxic.org"><img src="https://www.saferortoxic.org/images/logo_hu26e4dce8d5931af1ea33526b28fc8383_9734_c52a4f1635ef88bda858373270551ed2.webp" class="custom-logo" alt="Logo: Safer or Toxic?" width="300px"></a>
```

Is this somehow related to cloudflare? Or something in the datasette code?

I'm starting to think it's a cloudflare issue.

Can I at least rule out it being a datasette issue?

My repository is here: https://github.com/mroswell/list-N

(BTW, I couldn't figure out how to reference a local image, either, on the datasette side, which is why I'm using the image from the www home page.)

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1473/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
994390593 MDU6SXNzdWU5OTQzOTA1OTM= 1468 Faceting for custom SQL queries MichaelTiemannOSC 72577720 closed 0     2 2021-09-13T02:52:16Z 2021-09-13T04:54:22Z 2021-09-13T04:54:17Z CONTRIBUTOR  

Facets are awesome. But not when I need to join two tidy tables together. Or even just when explicitly running the default SQL query that simply lists all the rows and columns of a table (up to SIZE). That is to say, when I browse a table, I see facets:

https://latest.datasette.io/fixtures/compound_three_primary_keys

But when I run a custom query, I don't:

https://latest.datasette.io/fixtures?sql=select+pk1%2C+pk2%2C+pk3%2C+content+from+compound_three_primary_keys+order+by+pk1%2C+pk2%2C+pk3+limit+101

Is there an idiom to cause custom SQL to come back with facet suggestions?

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1468/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed
969548935 MDU6SXNzdWU5Njk1NDg5MzU= 1429 UI for setting `?_size=max` on table page simonw 9599 open 0     2 2021-08-12T20:52:09Z 2021-08-13T04:37:41Z   OWNER  

It defaults to 100 per page, but you can increase that to 1000 per page using ?_size=max (or higher if max_returned_rows is set higher than that).

But... that's only available to people who know how to hack URLs.

Solution: add a link that sets that option to the pagination block at the bottom of the table:

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1429/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
957298475 MDU6SXNzdWU5NTcyOTg0NzU= 1407 OSError: AF_UNIX path too long in ds_unix_domain_socket_server simonw 9599 closed 0     2 2021-07-31T18:36:06Z 2021-07-31T19:03:44Z 2021-07-31T19:03:44Z OWNER  

Got this exception while working on #1406.

```
@pytest.fixture(scope="session")
def ds_unix_domain_socket_server(tmp_path_factory):
    socket_folder = tmp_path_factory.mktemp("uds")
    uds = str(socket_folder / "datasette.sock")
    ds_proc = subprocess.Popen(
        ["datasette", "--memory", "--uds", uds],
        stdout=subprocess.PIPE,
        stderr=subprocess.STDOUT,
        cwd=tempfile.gettempdir(),
    )
    # Give the server time to start
    time.sleep(1.5)
    # Check it started successfully
    assert not ds_proc.poll(), ds_proc.stdout.read().decode("utf-8")

E   AssertionError: INFO: Started server process [48453]
E   INFO: Waiting for application startup.
E   INFO: Application startup complete.
E   Traceback (most recent call last):
E     File "/Users/simon/.local/share/virtualenvs/datasette-AWNrQs95/bin/datasette", line 33, in <module>
E       sys.exit(load_entry_point('datasette', 'console_scripts', 'datasette')())
E     File "/Users/simon/.local/share/virtualenvs/datasette-AWNrQs95/lib/python3.8/site-packages/click/core.py", line 1137, in __call__
E       return self.main(*args, **kwargs)
E     File "/Users/simon/.local/share/virtualenvs/datasette-AWNrQs95/lib/python3.8/site-packages/click/core.py", line 1062, in main
E       rv = self.invoke(ctx)
E     File "/Users/simon/.local/share/virtualenvs/datasette-AWNrQs95/lib/python3.8/site-packages/click/core.py", line 1668, in invoke
E       return _process_result(sub_ctx.command.invoke(sub_ctx))
E     File "/Users/simon/.local/share/virtualenvs/datasette-AWNrQs95/lib/python3.8/site-packages/click/core.py", line 1404, in invoke
E       return ctx.invoke(self.callback, **ctx.params)
E     File "/Users/simon/.local/share/virtualenvs/datasette-AWNrQs95/lib/python3.8/site-packages/click/core.py", line 763, in invoke
E       return __callback(*args, **kwargs)
E     File "/Users/simon/Dropbox/Development/datasette/datasette/cli.py", line 583, in serve
E       uvicorn.run(ds.app(), **uvicorn_kwargs)
E     File "/Users/simon/.local/share/virtualenvs/datasette-AWNrQs95/lib/python3.8/site-packages/uvicorn/main.py", line 393, in run
E       server.run()
E     File "/Users/simon/.local/share/virtualenvs/datasette-AWNrQs95/lib/python3.8/site-packages/uvicorn/server.py", line 50, in run
E       loop.run_until_complete(self.serve(sockets=sockets))
E     File "/Users/simon/.pyenv/versions/3.8.2/lib/python3.8/asyncio/base_events.py", line 616, in run_until_complete
E       return future.result()
E     File "/Users/simon/.local/share/virtualenvs/datasette-AWNrQs95/lib/python3.8/site-packages/uvicorn/server.py", line 67, in serve
E       await self.startup(sockets=sockets)
E     File "/Users/simon/.local/share/virtualenvs/datasette-AWNrQs95/lib/python3.8/site-packages/uvicorn/server.py", line 133, in startup
E       server = await asyncio.start_unix_server(
E     File "/Users/simon/.pyenv/versions/3.8.2/lib/python3.8/asyncio/streams.py", line 132, in start_unix_server
E       return await loop.create_unix_server(factory, path, **kwds)
E     File "/Users/simon/.pyenv/versions/3.8.2/lib/python3.8/asyncio/unix_events.py", line 296, in create_unix_server
E       sock.bind(path)
E     OSError: AF_UNIX path too long
E
E   assert not 1
E    +  where 1 = <bound method Popen.poll of <subprocess.Popen object at 0x106924af0>>()
E    +    where <bound method Popen.poll of <subprocess.Popen object at 0x106924af0>> = <subprocess.Popen object at 0x106924af0>.poll
```

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1407/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed
957302085 MDU6SXNzdWU5NTczMDIwODU= 1408 Review places in codebase that use os.chdir(), in particularly relating to tests simonw 9599 open 0     2 2021-07-31T18:57:06Z 2021-07-31T19:00:32Z   OWNER  

To clarify: the core problem here is that an error is thrown any time you call os.getcwd() but the directory you are currently in has been deleted.

runner.isolated_filesystem() assumes that the current directory in has not been deleted. But the various temporary directory utilities in pytest work by creating directories and then deleting them.

Maybe there's a larger problem here that I play a bit fast and loose with os.chdir() in both the test suite and in various lines of code in Datasette itself (in particular in the publish commands)?

Originally posted by @simonw in https://github.com/simonw/datasette/issues/1406#issuecomment-890390198

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1408/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
951185411 MDU6SXNzdWU5NTExODU0MTE= 1402 feature request: social meta tags fgregg 536941 open 0     2 2021-07-23T01:57:23Z 2021-07-26T19:31:41Z   CONTRIBUTOR  

It would be very nice if Twitter, Slack, and other social media could make rich cards when people post a link to a Datasette instance.

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1402/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
792652391 MDU6SXNzdWU3OTI2NTIzOTE= 1199 Experiment with PRAGMA mmap_size=N simonw 9599 open 0     2 2021-01-23T21:24:09Z 2021-07-17T17:39:17Z   OWNER  

https://sqlite.org/mmap.html - SQLite supports memory-mapped I/O but it's disabled by default. The PRAGMA mmap_size=N option can be used to enable it.

It would be very interesting to understand the impact this could have on Datasette performance for various different shapes of data.

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1199/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
940891698 MDU6SXNzdWU5NDA4OTE2OTg= 1390 Mention restarting systemd in documentation simonw 9599 closed 0     2 2021-07-09T16:05:15Z 2021-07-09T16:32:57Z 2021-07-09T16:32:33Z OWNER  

https://docs.datasette.io/en/stable/deploying.html#running-datasette-using-systemd

Need to clarify that if you add a new database or change metadata you need to restart systemd.

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1390/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed
503190241 MDU6SXNzdWU1MDMxOTAyNDE= 584 Codec error in some CSV exports simonw 9599 closed 0     2 2019-10-07T01:15:34Z 2021-06-17T18:13:20Z 2019-10-18T05:23:16Z OWNER  

Got this exploring my Swarm checkins:

/swarm/stickers.csv?stickerType=messageOnly&_size=max

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/584/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed
516748849 MDU6SXNzdWU1MTY3NDg4NDk= 612 CSV export is broken for tables with null foreign keys simonw 9599 closed 0     2 2019-11-02T22:52:47Z 2021-06-17T18:13:20Z 2019-11-02T23:12:53Z OWNER  

Following on from #406 - this CSV export appears to be broken:

https://14da705.datasette.io/fixtures/foreign_key_references.csv?_labels=on&_size=max

```csv
pk,foreign_key_with_label,foreign_key_with_label_label,foreign_key_with_no_label,foreign_key_with_no_label_label
1,1,hello,1,1
2,,
```

That second row should have 5 values, but it only has 4.

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/612/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed
736365306 MDU6SXNzdWU3MzYzNjUzMDY= 1083 Advanced CSV export for arbitrary queries simonw 9599 open 0     2 2020-11-04T19:23:05Z 2021-06-17T18:12:31Z   OWNER  

There's no link to download the CSV file - the table page has that as an advanced export option, but this is missing from the query page.

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1083/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
759695780 MDU6SXNzdWU3NTk2OTU3ODA= 1133 Option to omit header row in CSV export simonw 9599 closed 0     2 2020-12-08T18:54:46Z 2021-06-17T18:12:31Z 2020-12-10T23:28:51Z OWNER  

?_header=off - for symmetry with existing option ?_nl=on.

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1133/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed
919508498 MDU6SXNzdWU5MTk1MDg0OTg= 1375 JSON export dumps JSON fields as TEXT frafra 4068 closed 0     2 2021-06-12T09:45:08Z 2021-06-14T09:41:59Z 2021-06-13T15:37:58Z NONE  

Hi! When a user tries to export data as JSON, I would expect to see the value of JSON columns represented as JSON instead of being rendered as a string. What do you think?

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1375/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed
913809802 MDU6SXNzdWU5MTM4MDk4MDI= 1366 Get rid of this `restore_working_directory` hack entirely simonw 9599 open 0     2 2021-06-07T18:01:21Z 2021-06-07T18:03:03Z   OWNER  

That seems to have fixed it. I'd love to get rid of this restore_working_directory hack entirely.

Originally posted by @simonw in https://github.com/simonw/datasette/issues/1361#issuecomment-855308811

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1366/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
912464443 MDU6SXNzdWU5MTI0NjQ0NDM= 1360 Security flaw, to be fixed in 0.56.1 and 0.57 simonw 9599 closed 0     2 2021-06-05T21:53:51Z 2021-06-05T22:23:23Z 2021-06-05T22:22:06Z OWNER  

See security advisory here for details: https://github.com/simonw/datasette/security/advisories/GHSA-xw7c-jx9m-xh5g - the ?_trace=1 debugging option was not correctly escaping its JSON output, resulting in a reflected cross-site scripting vulnerability.

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1360/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed
828811618 MDU6SXNzdWU4Mjg4MTE2MTg= 1257 Table names containing single quotes break things simonw 9599 closed 0     2 2021-03-11T06:29:38Z 2021-06-02T03:28:29Z 2021-06-02T03:28:29Z OWNER  

e.g. I found a table called Yesterday's ELRs by County

It threw an error inside the detect_fts() function attempting to run this SQL query:

```sql
select name from sqlite_master
where rootpage = 0
and (
    sql like '%VIRTUAL TABLE%USING FTS%content="Yesterday's ELRs by County"%'
    or sql like '%VIRTUAL TABLE%USING FTS%content=[Yesterday's ELRs by County]%'
    or (
        tbl_name = "Yesterday's ELRs by County"
        and sql like '%VIRTUAL TABLE%USING FTS%'
    )
)
```

Here's the code at fault: https://github.com/simonw/datasette/blob/640ac7071b73111ba4423812cd683756e0e1936b/datasette/utils/__init__.py#L534-L548
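Not from the issue itself, but the general shape of the fix: pass the table name as a bound parameter (or double any embedded single quotes) rather than interpolating it into the SQL string. A minimal sketch for the `tbl_name` comparison:

```python
import sqlite3

table_name = "Yesterday's ELRs by County"

conn = sqlite3.connect(":memory:")
# A bound parameter sidesteps the quoting problem for the equality check;
# LIKE patterns can be assembled the same way.
rows = conn.execute(
    "select name from sqlite_master where tbl_name = ? "
    "and sql like '%VIRTUAL TABLE%USING FTS%'",
    [table_name],
).fetchall()
print(rows)
```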

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1257/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed
906977719 MDU6SXNzdWU5MDY5Nzc3MTk= 1350 ?_nofacets=1 query string argument for disabling facets and suggested facets simonw 9599 closed 0     2 2021-05-31T02:22:29Z 2021-06-01T16:19:38Z 2021-05-31T02:39:18Z OWNER  

This is needed as an internal option for #1349. datasette-graphql can benefit from this too - maybe can even use it so that if you pass ?_shape=array it gets automatically added, fixing #263.

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1350/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed
908446997 MDU6SXNzdWU5MDg0NDY5OTc= 1353 ?_nocount=1 for opting out of table counts simonw 9599 closed 0     2 2021-06-01T15:53:27Z 2021-06-01T16:18:54Z 2021-06-01T16:17:04Z OWNER  

Running a trace against a CSV streaming export with the new _trace=1 feature from #1351 shows that the following code is executing a select count(*) from table for every page of results returned: https://github.com/simonw/datasette/blob/d1d06ace49606da790a765689b4fbffa4c6deecb/datasette/views/table.py#L700-L705

This is inefficient - a new ?_nocount=1 option would let us disable this count in the same way as #1349: https://github.com/simonw/datasette/blob/d1d06ace49606da790a765689b4fbffa4c6deecb/datasette/views/base.py#L264-L276

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1353/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed

CREATE TABLE [issues] (
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [number] INTEGER,
   [title] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [state] TEXT,
   [locked] INTEGER,
   [assignee] INTEGER REFERENCES [users]([id]),
   [milestone] INTEGER REFERENCES [milestones]([id]),
   [comments] INTEGER,
   [created_at] TEXT,
   [updated_at] TEXT,
   [closed_at] TEXT,
   [author_association] TEXT,
   [pull_request] TEXT,
   [body] TEXT,
   [repo] INTEGER REFERENCES [repos]([id]),
   [type] TEXT
, [active_lock_reason] TEXT, [performed_via_github_app] TEXT, [reactions] TEXT, [draft] INTEGER, [state_reason] TEXT);
CREATE INDEX [idx_issues_repo]
                ON [issues] ([repo]);
CREATE INDEX [idx_issues_milestone]
                ON [issues] ([milestone]);
CREATE INDEX [idx_issues_assignee]
                ON [issues] ([assignee]);
CREATE INDEX [idx_issues_user]
                ON [issues] ([user]);