
issues


252 rows where comments = 0 and state = "open" sorted by updated_at descending



repo (16)

  • datasette 165
  • sqlite-utils 17
  • twitter-to-sqlite 13
  • github-to-sqlite 10
  • dogsheep-beta 9
  • google-takeout-to-sqlite 7
  • healthkit-to-sqlite 5
  • dogsheep-photos 5
  • apple-notes-to-sqlite 5
  • dogsheep.github.io 4
  • evernote-to-sqlite 3
  • swarm-to-sqlite 2
  • inaturalist-to-sqlite 2
  • pocket-to-sqlite 2
  • hacker-news-to-sqlite 2
  • genome-to-sqlite 1

type (2)

  • issue 205
  • pull 47

state (1)

  • open · 252
id node_id number title user state locked assignee milestone comments created_at updated_at ▲ closed_at author_association pull_request body repo type active_lock_reason performed_via_github_app reactions draft state_reason
2023057255 I_kwDOBm6k_c54lWdn 2212 Can't filter with numbers fzakaria 605070 open 0     0 2023-12-04T05:26:29Z 2023-12-04T05:26:29Z   NONE  

I have a schema that uses numbers for a column (actually it's a boolean, 1 or 0, but SQLite doesn't have a Boolean type). I can't seem to get faceting, or even filtering, to work on this column.

My guess is that Datasette is "stringifying" the number and it's not matching? Example: https://debian-sqlelf.fly.dev/debian/elf_symbols?_sort_desc=name&_facet=exported&exported=0

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2212/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
2019811176 I_kwDOBm6k_c54Y99o 2211 Unreachable exception handlers for `sqlite3.OperationalError` mattparmett 1214074 open 0     0 2023-12-01T00:50:22Z 2023-12-01T00:50:22Z   NONE  

There are several places where sqlite3.OperationalError is caught as part of an exception handler which catches multiple exceptions, but is then caught again immediately afterwards by a dedicated exception handler.

Because the exception will be caught by the first handler, the logic in the second handler is unreachable and will never be executed. If this is intended behavior, the second handler can be removed. If this is not intended, and the second handler should be the one that catches this exception, then sqlite3.OperationalError should be removed from the tuple of exceptions in the first handler.

This issue was found via a CodeQL query on the repository, and I've listed the occurrences found by the query below. There may be other instances of this issue in the code that were not surfaced by the query. I'd be happy to share the query if others would like to view or run it.

One example:

https://github.com/simonw/datasette/blob/452a587e236ef642cbc6ae345b58767ea8420cb5/datasette/views/database.py#L534-L537

Other instances:

https://github.com/simonw/datasette/blob/main/datasette/views/base.py#L266-L270 https://github.com/simonw/datasette/blob/main/datasette/views/base.py#L452-L456
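A minimal, self-contained illustration of the pattern being flagged (hypothetical handlers, not the actual Datasette code):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
try:
    conn.execute("select * from missing_table")
except (sqlite3.OperationalError, sqlite3.DatabaseError):
    # sqlite3.OperationalError is matched by this first handler...
    print("caught by the multi-exception handler")
except sqlite3.OperationalError:
    # ...so this dedicated handler is unreachable dead code.
    print("never reached")
```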

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2211/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1994845152 I_kwDOBm6k_c525uvg 2207 ModuleNotFoundError: No module named 'click_default_group honzajavorek 283441 open 0     0 2023-11-15T14:04:32Z 2023-11-15T14:04:32Z   NONE  

No matter what I do, I'm getting this error:

```
$ datasette
Traceback (most recent call last):
  File "/Users/honza/Library/Caches/pypoetry/virtualenvs/juniorguru-Lgaxwd2n-py3.11/bin/datasette", line 5, in <module>
    from datasette.cli import cli
  File "/Users/honza/Library/Caches/pypoetry/virtualenvs/juniorguru-Lgaxwd2n-py3.11/lib/python3.11/site-packages/datasette/cli.py", line 6, in <module>
    from click_default_group import DefaultGroup
ModuleNotFoundError: No module named 'click_default_group'
```

I have datasette in my dependencies like this:

```toml
[tool.poetry.group.dev.dependencies]
datasette = {version = "1.0a7", allow-prereleases = true}
```

I had the latest regular version (not pre-release) there originally, but the result was the same:

```toml
[tool.poetry.group.dev.dependencies]
datasette = "0.64.5"
```

Full pyproject.toml is at https://github.com/honzajavorek/junior.guru/ Previously datasette worked for me, but I guess something got upgraded and now I can't even launch it.

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2207/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1978603203 I_kwDOCGYnMM517xbD 602 `sqlite-utils transform` removes the `AUTOINCREMENT` keyword ArsTapatun 4472046 open 0     0 2023-11-06T08:48:43Z 2023-11-06T08:48:43Z   NONE  

Context

We ran into this bug randomly, noticing that deleted ROWIDs would get reused after migrating the DB. Using transform to change any column in the table will also unexpectedly strip the AUTOINCREMENT keyword from the primary key definition, even if that column was not the transformation target.

Reproducible example

Original database

```sql
$ sqlite3 test.db << EOF
CREATE TABLE mytable (
    col1 INTEGER PRIMARY KEY AUTOINCREMENT,
    col2 TEXT NOT NULL
)
EOF

$ sqlite3 test.db ".schema mytable"
CREATE TABLE mytable (
    col1 INTEGER PRIMARY KEY AUTOINCREMENT,
    col2 TEXT NOT NULL
);
```

Modified database after sqlite-utils

```sql
$ sqlite-utils transform test.db mytable --rename col2 renamedcol2

$ sqlite3 test.db "SELECT sql FROM sqlite_master WHERE name = 'mytable';"
CREATE TABLE IF NOT EXISTS "mytable" (
    [col1] INTEGER PRIMARY KEY,
    [renamedcol2] TEXT NOT NULL
);
```
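A quick way to check whether a table's definition still carries the keyword, as a hedged sketch using only the standard library (the query mirrors the sqlite_master lookup above):

```python
import sqlite3

def has_autoincrement(db_path: str, table: str) -> bool:
    # AUTOINCREMENT only survives in the raw CREATE TABLE text,
    # so inspect the schema stored in sqlite_master.
    conn = sqlite3.connect(db_path)
    (sql,) = conn.execute(
        "SELECT sql FROM sqlite_master WHERE type = 'table' AND name = ?",
        [table],
    ).fetchone()
    return "AUTOINCREMENT" in sql.upper()

print(has_autoincrement("test.db", "mytable"))  # False after the transform above
```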

sqlite-utils 140912432 issue    
{
    "url": "https://api.github.com/repos/simonw/sqlite-utils/issues/602/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1978022687 I_kwDOBm6k_c515jsf 2204 request.post_body() can only be called once simonw 9599 open 0     0 2023-11-05T23:22:03Z 2023-11-05T23:23:23Z   OWNER  

This code here:

https://github.com/simonw/datasette/blob/452a587e236ef642cbc6ae345b58767ea8420cb5/datasette/utils/asgi.py#L127-L135

It consumes the messages, which means if you try to call it a second time you won't be able to get at the body.

This is efficient - we don't end up with a request object property with potentially megabytes of content that we never look at again - but it's inconvenient for cases like middleware or functions where we don't know if the body has been consumed yet or not.

Potential solution: set request._body the first time it is called, and return that on subsequent calls.

Potential optimization: only do this for bodies that are shorter than a certain threshold - maybe 1MB - and raise an exception if you attempt to call post_body() multiple times against one of those larger bodies.

I'm a bit nervous about that option though, since it could result in errors that don't show up in testing but do show up in production.
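A sketch of the memoization idea from the potential solution above; the attribute and method names are assumed for illustration, not Datasette's actual implementation:

```python
class Request:
    def __init__(self, scope, receive):
        self.scope = scope
        self.receive = receive  # the ASGI receive callable

    async def post_body(self) -> bytes:
        # Consume the ASGI messages only on the first call, then
        # serve the cached bytes on every subsequent call.
        if not hasattr(self, "_body"):
            body = b""
            more_body = True
            while more_body:
                message = await self.receive()
                body += message.get("body", b"")
                more_body = message.get("more_body", False)
            self._body = body
        return self._body
```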

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2204/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1977726056 I_kwDOBm6k_c514bRo 2203 custom plugin not seen as sql function LyzardKing 7113541 open 0     0 2023-11-05T10:30:19Z 2023-11-05T10:30:19Z   NONE  

Hi, I'm not sure if this is the right repo for this issue.

I'm using datasette with the parquet (to read a duckdb), and jellyfish plugins. Both work perfectly.

Now I need to create a simple plugin that uses the python rouge package and returns a similarity score (similarly to how the jellyfish plugin works). If I create a custom plugin, even the example hello_world one, copied directly from the tutorial, I get the following error:

```
duckdb.duckdb.CatalogException: Catalog Error: Scalar Function with name hello_world does not exist!
```

Since the jellyfish plugin doesn't do anything more complex, I'm wondering if there is some other kind of issue with my setup.
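For reference, the tutorial's hello_world example registers its SQL function via the prepare_connection hook, which receives a sqlite3 connection; roughly (a sketch of the documented example, not the user's exact code):

```python
from datasette import hookimpl

@hookimpl
def prepare_connection(conn):
    # conn is a sqlite3 connection - a DuckDB connection opened by
    # another plugin would never see functions registered here.
    conn.create_function("hello_world", 0, lambda: "Hello world!")
```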

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2203/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1977155641 I_kwDOCGYnMM512QA5 601 Move plugin directory into documentation simonw 9599 open 0     0 2023-11-04T04:07:52Z 2023-11-04T04:07:52Z   OWNER  

https://github.com/simonw/sqlite-utils-plugins should be in the official documentation.

I can use the same pattern as https://llm.datasette.io/en/stable/plugins/directory.html

https://til.simonwillison.net/readthedocs/stable-docs

sqlite-utils 140912432 issue    
{
    "url": "https://api.github.com/repos/simonw/sqlite-utils/issues/601/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1955676270 I_kwDOBm6k_c50kUBu 2201 Discord invite link is invalid andrewsanchez 11708906 open 0     0 2023-10-21T21:50:05Z 2023-10-21T21:50:05Z   NONE  

https://datasette.io/discord leads to https://discord.com/invite/ktd74dm5mw and returns the following:

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2201/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1943259395 I_kwDOEhK-wc5z08kD 16 time data '2014-11-21T11:44:12.000Z' does not match format '%Y%m%dT%H%M%SZ' linonetwo 3746270 open 0     0 2023-10-14T13:24:39Z 2023-10-14T13:24:39Z   NONE  

```
evernote-to-sqlite enex evernote.db ./我的笔记.enex
Importing from ENEX  [#####-------------------------------]   14%
Traceback (most recent call last):
  File "/usr/local/bin/evernote-to-sqlite", line 8, in <module>
    sys.exit(cli())
  File "/usr/local/lib/python3.11/site-packages/click/core.py", line 1157, in __call__
    return self.main(*args, **kwargs)
  File "/usr/local/lib/python3.11/site-packages/click/core.py", line 1078, in main
    rv = self.invoke(ctx)
  File "/usr/local/lib/python3.11/site-packages/click/core.py", line 1688, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/usr/local/lib/python3.11/site-packages/click/core.py", line 1434, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/usr/local/lib/python3.11/site-packages/click/core.py", line 783, in invoke
    return __callback(*args, **kwargs)
  File "/usr/local/lib/python3.11/site-packages/evernote_to_sqlite/cli.py", line 31, in enex
    save_note(db, note)
  File "/usr/local/lib/python3.11/site-packages/evernote_to_sqlite/utils.py", line 46, in save_note
    "created": convert_datetime(created),
  File "/usr/local/lib/python3.11/site-packages/evernote_to_sqlite/utils.py", line 111, in convert_datetime
    return datetime.datetime.strptime(s, "%Y%m%dT%H%M%SZ").isoformat()
  File "/usr/local/Cellar/python@3.11/3.11.5/Frameworks/Python.framework/Versions/3.11/lib/python3.11/_strptime.py", line 568, in _strptime_datetime
    tt, fraction, gmtoff_fraction = _strptime(data_string, format)
  File "/usr/local/Cellar/python@3.11/3.11.5/Frameworks/Python.framework/Versions/3.11/lib/python3.11/_strptime.py", line 349, in _strptime
    raise ValueError("time data %r does not match format %r" %
ValueError: time data '2014-11-21T11:44:12.000Z' does not match format '%Y%m%dT%H%M%SZ'
```

The .enex file was exported by the Evernote Mac client.
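A hedged sketch of a convert_datetime that would accept both the compact format the code expects and the ISO 8601 form in this export (the second format string is inferred from the error message):

```python
import datetime

def convert_datetime(s: str) -> str:
    # Try the compact ENEX format first, then the ISO 8601
    # variant with milliseconds seen in this export.
    for fmt in ("%Y%m%dT%H%M%SZ", "%Y-%m-%dT%H:%M:%S.%fZ"):
        try:
            return datetime.datetime.strptime(s, fmt).isoformat()
        except ValueError:
            continue
    raise ValueError(f"unrecognized timestamp: {s!r}")

print(convert_datetime("2014-11-21T11:44:12.000Z"))  # 2014-11-21T11:44:12
```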

evernote-to-sqlite 303218369 issue    
{
    "url": "https://api.github.com/repos/dogsheep/evernote-to-sqlite/issues/16/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1931794126 I_kwDOBm6k_c5zJNbO 2198 --load-extension=spatialite not working with Windows hcarter333 363004 open 0     0 2023-10-08T12:50:22Z 2023-10-08T12:50:22Z   NONE  

Using each of

```
python -m datasette counties.db -m metadata.yml --load-extension=SpatiaLite
```

and

```
python -m datasette counties.db --load-extension="C:\Windows\System32\mod_spatialite.dll"
```

and

```
python -m datasette counties.db --load-extension=C:\Windows\System32\mod_spatialite.dll
```

I got the error:

```
  File "C:\Users\m3n7es\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.11_qbz5n2kfra8p0\LocalCache\local-packages\Python311\site-packages\datasette\database.py", line 209, in in_thread
    self.ds._prepare_connection(conn, self.name)
  File "C:\Users\m3n7es\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.11_qbz5n2kfra8p0\LocalCache\local-packages\Python311\site-packages\datasette\app.py", line 596, in _prepare_connection
    conn.execute("SELECT load_extension(?, ?)", [path, entrypoint])
sqlite3.OperationalError: The specified module could not be found.
```

I finally tried modifying the code in app.py to read:

```
def _prepare_connection(self, conn, database):
    conn.row_factory = sqlite3.Row
    conn.text_factory = lambda x: str(x, "utf-8", "replace")
    if self.sqlite_extensions:
        conn.enable_load_extension(True)
        for extension in self.sqlite_extensions:
            # "extension" is either a string path to the extension
            # or a 2-item tuple that specifies which entrypoint to load.
            #if isinstance(extension, tuple):
            #    path, entrypoint = extension
            #    conn.execute("SELECT load_extension(?, ?)", [path, entrypoint])
            #else:
            conn.execute("SELECT load_extension('C:\Windows\System32\mod_spatialite.dll')")
```

At which point the counties example worked.

Is there a correct way to install/use the extension on Windows? My method will cause issues if there's a second extension to be used.

On an unrelated note, my next step is to figure out how to write a query across the two loaded databases supplied from the command line:

```
python -m datasette rm_toucans_23_10_07.db counties.db -m metadata.yml --load-extension=SpatiaLite
```

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2198/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1920416843 I_kwDOCGYnMM5ydzxL 597 sqlite-utils insert-files should be able to convert fields grimnight 1737541 open 0     0 2023-09-30T22:20:47Z 2023-09-30T22:20:47Z   NONE  

Currently using both insert-files and convert is needed in order to create sqlar files, it would be more convenient if it could be done with just one command.

```shell
~ ❯ cat test.py
import os

class Example:
    def __init__(self, arg1, arg2):
        self.arg1 = arg1

~ ❯ sqlite-utils insert-files test.sqlar sqlar test.py -c name:name -c data:content -c mode:mode -c mtime:mtime -c sz:size --pk=name
  [####################################]  100%

~ ❯ sqlite-utils convert test.sqlar sqlar data "zlib.compress(value)" --import=zlib --where "name = 'test.py'"
  [####################################]  100%

~ ❯ cat test.py | sqlite-utils convert test.sqlar sqlar data "zlib.compress(sys.stdin.buffer.read())" --import=zlib --import=sys --where "name = 'test.py'" # Alternative way
  [####################################]  100%

~ ❯ sqlite3 test.sqlar "SELECT hex(data) FROM sqlar WHERE name = 'test.py';" | python3 -c "import sys, zlib; sys.stdout.buffer.write(zlib.decompress(bytes.fromhex(sys.stdin.read())))"
import os

class Example:
    def __init__(self, arg1, arg2):
        self.arg1 = arg1

~ ❯ rm test.py

~ ❯ sqlar -l test.sqlar
test.py

~ ❯ sqlar -x test.sqlar

~ ❯ cat test.py
import os

class Example:
    def __init__(self, arg1, arg2):
        self.arg1 = arg1
```

sqlite-utils 140912432 issue    
{
    "url": "https://api.github.com/repos/simonw/sqlite-utils/issues/597/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1899310542 I_kwDOBm6k_c5xNS3O 2187 Datasette for serving JSON only geofinder 19705106 open 0     0 2023-09-16T05:48:29Z 2023-09-16T05:48:29Z   NONE  

Hi, is there any way to use Datasette for serving JSON only, without displaying the web pages? I've tried searching the documentation for this but didn't find any information.

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2187/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1895266807 I_kwDOBm6k_c5w93n3 2184 Design decision - should configuration be exposed at /-/config ? simonw 9599 open 0     0 2023-09-13T21:07:08Z 2023-09-13T21:07:38Z   OWNER  

This made me think. That {"$env": "ENV_VAR"} hack was introduced back here:

  • https://github.com/simonw/datasette/issues/538

The problem it was solving was that metadata was visible to everyone with access to the instance at /-/metadata but plugins clearly needed a way to set secret settings.
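For context, the $env trick lets metadata reference an environment variable instead of embedding the secret itself; a representative snippet (the plugin name here is illustrative):

```json
{
    "plugins": {
        "datasette-auth-github": {
            "client_secret": {
                "$env": "GITHUB_CLIENT_SECRET"
            }
        }
    }
}
```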

Now that this stuff is moving to config, we have some decisions to make:

  1. Add /-/config to let people see the configuration of their instance, and keep the $env trick for secret settings.
  2. Say all configuration aside from metadata is secret and make $env optional or ditch it entirely.
  3. Allow plugins to announce which of their configuration options are secret so we can automatically redact them from /-/config

I've found /-/metadata extraordinarily useful as a user of Datasette - it really helps me understand exactly what's going on if I run into any problems with a plugin, if I can quickly check what the settings look like.

So I'm leaning towards option 1 or 3.

Originally posted by @simonw in https://github.com/simonw/datasette/pull/2183#discussion_r1325076924

Also refs: - #2093

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2184/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1888477283 I_kwDOC8SPRc5wj-Bj 38 Run `rebuild_fts` after building the index simonw 9599 open 0     0 2023-09-08T23:17:45Z 2023-09-08T23:17:45Z   MEMBER  

In: - https://github.com/simonw/datasette.io/issues/152#issuecomment-1712323347

This turned out to be the fix:

```bash
dogsheep-beta index dogsheep-index.db templates/dogsheep-beta.yml
sqlite-utils rebuild-fts dogsheep-index.db
```

dogsheep-beta 197431109 issue    
{
    "url": "https://api.github.com/repos/dogsheep/dogsheep-beta/issues/38/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1884499674 PR_kwDODFE5qs5ZtYMc 13 use poetry for packages, asdf for versioning, and gh actions for ci iloveitaly 150855 open 0     0 2023-09-06T17:59:16Z 2023-09-06T17:59:16Z   FIRST_TIME_CONTRIBUTOR dogsheep/google-takeout-to-sqlite/pulls/13
  • build: use poetry for package management, asdf for python version
  • build: cleanup poetry config, add keywords, ignore dist
  • ci: migrate circleci to gh actions
  • fix: dup method definition
google-takeout-to-sqlite 206649770 pull    
{
    "url": "https://api.github.com/repos/dogsheep/google-takeout-to-sqlite/issues/13/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
0  
1880968405 PR_kwDOJHON9s5ZhYny 14 fix: fix the problem of Chinese character garbling barretlee 2698003 open 0     0 2023-09-04T23:48:28Z 2023-09-04T23:48:28Z   FIRST_TIME_CONTRIBUTOR dogsheep/apple-notes-to-sqlite/pulls/14
  1. The code uses two different ways of writing encoding formats, mac_roman and macroman. It is uncertain whether there are any typo errors.
  2. When there are Chinese characters in the content, exporting it results in garbled code. Changing it to utf8 can fix the issue.
apple-notes-to-sqlite 611552758 pull    
{
    "url": "https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/14/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
0  
1875739055 I_kwDOBm6k_c5vzYGv 2167 Document return type of await ds.permission_allowed() simonw 9599 open 0     0 2023-08-31T15:14:23Z 2023-08-31T15:14:23Z   OWNER  

The return type isn't documented here: https://github.com/simonw/datasette/blob/4c3ef033110407f3b3dbce501659d523724985e0/docs/internals.rst#L327-L350

On inspecting the code I'm not 100% sure if it's possible for this method to return None, or if it can only return True or False. Need to confirm that.

https://github.com/simonw/datasette/blob/4c3ef033110407f3b3dbce501659d523724985e0/datasette/app.py#L822C15-L853

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2167/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
594237015 MDU6SXNzdWU1OTQyMzcwMTU= 718 Plugin idea: datasette-redirects simonw 9599 open 0     0 2020-04-05T03:41:38Z 2023-08-30T22:17:31Z   OWNER  

I just had to write a one-off custom plugin to redirect niche-musems.com to www.niche-museums.com (https://github.com/simonw/museums/issues/21) - it would be great if this kind of thing could be handled by a configurable plugin.

https://github.com/simonw/museums/blob/6b1faf00c463b2228860d4d62d104b11935e01b1/plugins/redirect_www.py

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/718/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  reopened
1866815458 PR_kwDOBm6k_c5YyF-C 2159 Implement Dark Mode colour scheme jamietanna 3315059 open 0     0 2023-08-25T10:46:23Z 2023-08-25T10:46:35Z   FIRST_TIME_CONTRIBUTOR simonw/datasette/pulls/2159

Closes #2095.


:books: Documentation preview :books:: https://datasette--2159.org.readthedocs.build/en/2159/

datasette 107914493 pull    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2159/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
1  
1865983069 PR_kwDOBm6k_c5YvQSi 2158 add brand option to metadata.json. publicmatt 52261150 open 0     0 2023-08-24T22:37:41Z 2023-08-24T22:37:57Z   FIRST_TIME_CONTRIBUTOR simonw/datasette/pulls/2158

This adds a brand link to the top navbar if the 'brand' key is populated in metadata.json. The link will either be '#' or use the contents of 'brand_url' in metadata.json for its href.

I was able to get this done on my own site by replacing templates/_crumbs.html with a custom version, but I thought it would be nice to incorporate this in the tool directly.


:books: Documentation preview :books:: https://datasette--2158.org.readthedocs.build/en/2158/

datasette 107914493 pull    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2158/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
0  
1754174496 I_kwDOCGYnMM5ojpQg 558 Ability to define unique columns when creating a table aguinane 1910303 open 0     0 2023-06-13T06:56:19Z 2023-08-18T01:06:03Z   NONE  

When creating a new table, it would be good to have an option to set unique columns similar to how not_null is set.

```python
from sqlite_utils import Database

columns = {"mRID": str, "name": str}
db = Database("example.db")
db["ExampleTable"].create(columns, pk="mRID", not_null=["mRID"], if_not_exists=True)
db["ExampleTable"].create_index(["mRID"], unique=True, if_not_exists=True)
```

So something like this would add the UNIQUE flag to the table definition.

```python
db["ExampleTable"].create(columns, pk="mRID", not_null=["mRID"], unique=["mRID"], if_not_exists=True)
```

```sql
CREATE TABLE ExampleTable (
    mRID TEXT PRIMARY KEY NOT NULL UNIQUE,
    name TEXT
);
```

sqlite-utils 140912432 issue    
{
    "url": "https://api.github.com/repos/simonw/sqlite-utils/issues/558/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1802613340 PR_kwDOBm6k_c5VZhfw 2100 Make primary key view accessible to render_cell hook meowcat 1563881 open 0     0 2023-07-13T09:30:36Z 2023-08-10T13:15:41Z   FIRST_TIME_CONTRIBUTOR simonw/datasette/pulls/2100

:books: Documentation preview :books:: https://datasette--2100.org.readthedocs.build/en/2100/

datasette 107914493 pull    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2100/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
0  
1839344979 I_kwDOCGYnMM5toi1T 582 Handling CSV/file input that contains NUL bytes betatim 1448859 open 0     0 2023-08-07T12:24:14Z 2023-08-07T12:24:14Z   NONE  

I was using sqlite-utils to create a DB from a CSV and it turns out the CSV contains a NUL byte.

When the processing reaches the line that contains the NUL an exception is raised.

I'm wondering if there is something that can be done in sqlite-utils to say "skip lines with encoding errors" or some such. I think it isn't super straightforward though as the exception comes from inside the csv module that does all the parsing.

Concretely the file is the KernelVersions.csv from https://www.kaggle.com/datasets/kaggle/meta-kaggle

This is the command and output:

```
$ sqlite-utils insert --csv kaggle.db kaggle KernelVersions.csv
  [------------------------------------]    0%
  [#####################---------------]   60%  00:04:24
Traceback (most recent call last):
  File "/home/foobar/miniconda/envs/meta-kaggle/bin/sqlite-utils", line 10, in <module>
    sys.exit(cli())
  File "/home/foobar/miniconda/envs/meta-kaggle/lib/python3.10/site-packages/click/core.py", line 1128, in __call__
    return self.main(*args, **kwargs)
  File "/home/foobar/miniconda/envs/meta-kaggle/lib/python3.10/site-packages/click/core.py", line 1053, in main
    rv = self.invoke(ctx)
  File "/home/foobar/miniconda/envs/meta-kaggle/lib/python3.10/site-packages/click/core.py", line 1659, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/home/foobar/miniconda/envs/meta-kaggle/lib/python3.10/site-packages/click/core.py", line 1395, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/home/foobar/miniconda/envs/meta-kaggle/lib/python3.10/site-packages/click/core.py", line 754, in invoke
    return __callback(*args, **kwargs)
  File "/home/foobar/miniconda/envs/meta-kaggle/lib/python3.10/site-packages/sqlite_utils/cli.py", line 1223, in insert
    insert_upsert_implementation(
  File "/home/foobar/miniconda/envs/meta-kaggle/lib/python3.10/site-packages/sqlite_utils/cli.py", line 1085, in insert_upsert_implementation
    db[table].insert_all(
  File "/home/foobar/miniconda/envs/meta-kaggle/lib/python3.10/site-packages/sqlite_utils/db.py", line 3198, in insert_all
    chunk = list(chunk)
  File "/home/foobar/miniconda/envs/meta-kaggle/lib/python3.10/site-packages/sqlite_utils/db.py", line 3742, in fix_square_braces
    for record in records:
  File "/home/foobar/miniconda/envs/meta-kaggle/lib/python3.10/site-packages/sqlite_utils/cli.py", line 1071, in <genexpr>
    docs = (decode_base64_values(doc) for doc in docs)
  File "/home/foobar/miniconda/envs/meta-kaggle/lib/python3.10/site-packages/sqlite_utils/cli.py", line 1068, in <genexpr>
    docs = (verify_is_dict(doc) for doc in docs)
  File "/home/foobar/miniconda/envs/meta-kaggle/lib/python3.10/site-packages/sqlite_utils/cli.py", line 1003, in <genexpr>
    docs = (dict(zip(headers, row)) for row in reader)
_csv.Error: line contains NUL
```
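One hedged workaround sketch: strip NUL bytes from each line before the csv module sees them, then insert through the Python API (database, table and file names taken from the command above; all values arrive as strings):

```python
import csv
import sqlite_utils

db = sqlite_utils.Database("kaggle.db")
with open("KernelVersions.csv", encoding="utf-8") as f:
    # csv raises "line contains NUL" on \0 bytes, so filter them out first.
    reader = csv.DictReader(line.replace("\0", "") for line in f)
    db["kaggle"].insert_all(reader)
```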

sqlite-utils 140912432 issue    
{
    "url": "https://api.github.com/repos/simonw/sqlite-utils/issues/582/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1827436260 PR_kwDOD079W85WtVyk 39 Missing option in datasette instructions coldclimate 319473 open 0     0 2023-07-29T10:34:48Z 2023-07-29T10:34:48Z   FIRST_TIME_CONTRIBUTOR dogsheep/dogsheep-photos/pulls/39

Gotta tell it where to look

dogsheep-photos 256834907 pull    
{
    "url": "https://api.github.com/repos/dogsheep/dogsheep-photos/issues/39/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
0  
1824457306 I_kwDOBm6k_c5svwJa 2122 Parameters on canned queries: fixed or query-generated list? meowcat 1563881 open 0     0 2023-07-27T14:07:07Z 2023-07-27T14:07:07Z   NONE  

Hi,

currently parameters in canned queries are just text fields. It would be cool to have one of the options below. Would you accept a PR doing something in this direction? (Possibly this could even work as a plugin.)

  • adding facets, which would work like facets on tables or views, giving a list of selectable options (and leaving parameters as is)
  • making it possible to provide a query which returns selectable values for a parameter, e.g.

```yaml
calendar_entries_current_instrument:
  sql: |
    select * from calendar_entries
    where
      DTEND_UNIX > UNIXEPOCH() and
      DTSTART_UNIX < UNIXEPOCH() + :days * 24 * 60 * 60 and
      current = 1 and
      MACHINE = :instrument
    order by DTSTART_UNIX
  params:
    days:
      sql: "SELECT VALUE FROM generate_series(1, 30, 1)"  # this obviously requires the corresponding sqlite extension
    instrument:
      sql: "SELECT DISTINCT MACHINE FROM calendar_entries"
```

  • making it possible to provide a fixed list of parameters

```yaml
calendar_entries_current_instrument:
  sql: |
    select * from calendar_entries
    where
      DTEND_UNIX > UNIXEPOCH() and
      DTSTART_UNIX < UNIXEPOCH() + :days * 24 * 60 * 60 and
      current = 1 and
      MACHINE = :instrument
    order by DTSTART_UNIX
  params:
    days:
      values: [1, 2, 3, 5, 10, 20, 30]
    instrument:
      values: [supermachine, crappymachine, boringmachine]
```
datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2122/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1823428714 I_kwDOBm6k_c5sr1Bq 2120 Add __all__ to datasette/__init__.py simonw 9599 open 0     0 2023-07-27T01:07:10Z 2023-07-27T01:07:10Z   OWNER  

Currently looks like this: https://github.com/simonw/datasette/blob/08181823990a71ffa5a1b57b37259198eaa43e06/datasette/__init__.py#L1-L6

Adding __all__ = ["Permission", "Forbidden"...] would let me get rid of those # noqa comments.

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2120/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1822918995 I_kwDOCGYnMM5sp4lT 580 Add way to export to a csv file using the Python library kevinlinxc 44324811 open 0     0 2023-07-26T18:09:26Z 2023-07-26T18:09:26Z   NONE  

According to the documentation, we can make a csv output using the CLI tool, but not the Python library. Could we have the latter?
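Until then, a hedged workaround sketch using db.query() plus the standard library's csv module (database, table and output names are illustrative; assumes the table is non-empty):

```python
import csv
import sqlite_utils

db = sqlite_utils.Database("example.db")
rows = db.query("select * from mytable")  # yields one dict per row
first = next(rows)
with open("mytable.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=list(first.keys()))
    writer.writeheader()
    writer.writerow(first)
    writer.writerows(rows)
```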

sqlite-utils 140912432 issue    
{
    "url": "https://api.github.com/repos/simonw/sqlite-utils/issues/580/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1822813627 I_kwDOBm6k_c5spe27 2108 some (many?) SQL syntax errors are not throwing errors with a .csv endpoint fgregg 536941 open 0     0 2023-07-26T16:57:45Z 2023-07-26T16:58:07Z   CONTRIBUTOR  

here's a CTE query that should always fail with a syntax error:

```sql
with foo as (nonsense) select * from foo;
```

when we make this query against the default endpoint, we do indeed get a 400 status code and the problem is returned to the user: https://global-power-plants.datasettes.com/global-power-plants?sql=with+foo+as+%28nonsense%29+select+*+from+foo%3B

but, if we use the csv endpoint, we get a 200 status code and no indication of a problem: https://global-power-plants.datasettes.com/global-power-plants.csv?sql=with+foo+as+%28nonsense%29+select+*+from+foo%3B

same with this bad sql

```sql
select a, from foo;
```

https://global-power-plants.datasettes.com/global-power-plants?sql=select%0D%0A++a%2C%0D%0Afrom%0D%0A++foo%3B

vs

https://global-power-plants.datasettes.com/global-power-plants.csv?sql=select%0D%0A++a%2C%0D%0Afrom%0D%0A++foo%3B

but, datasette catches this bad sql at both endpoints:

```sql
slect a from foo;
```

https://global-power-plants.datasettes.com/global-power-plants?sql=slect%0D%0A++a%0D%0Afrom%0D%0A++foo%3B https://global-power-plants.datasettes.com/global-power-plants.csv?sql=slect%0D%0A++a%0D%0Afrom%0D%0A++foo%3B

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2108/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1821108702 I_kwDOCGYnMM5si-ne 579 Special handling for SQLite column of type `JSON` asg017 15178711 open 0     0 2023-07-25T20:37:23Z 2023-07-25T20:37:23Z   CONTRIBUTOR  

sqlite-utils should detect and have special handling for columns with a declared JSON type. For example:

```sql
CREATE TABLE "dogs" (
    id INTEGER PRIMARY KEY,
    name TEXT,
    friends JSON
);
```

Automatic Nesting

According to "Nested JSON Values", sqlite-utils will only expand JSON if the --json-cols flag is passed. It looks like it'll try to json.loads every text column to test if it's JSON, which can get expensive on non-JSON columns.

Instead, sqlite-utils should by default (i.e. without the --json-cols flag) apply the maybe_json() operation on columns with a declared JSON type. So the above table would expand the "friends" column as expected, without the --json-cols flag:

```bash
sqlite-utils dogs.db "select * from dogs" | python -mjson.tool
```

```json
[
    {
        "id": 1,
        "name": "Cleo",
        "friends": [
            {"name": "Pancakes"},
            {"name": "Bailey"}
        ]
    }
]
```


I'm sure there's other ways sqlite-utils can specially handle JSON columns, so keeping this open while I think of more
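A hedged sketch of the proposed default: decode only columns whose declared type is JSON, using the column metadata sqlite-utils already exposes (maybe_json here is a stand-in, not an existing sqlite-utils API):

```python
import json
import sqlite_utils

def maybe_json(value):
    # Decode if possible, otherwise return the raw value unchanged.
    try:
        return json.loads(value)
    except (TypeError, ValueError):
        return value

db = sqlite_utils.Database("dogs.db")
json_columns = {
    column.name
    for column in db["dogs"].columns
    if column.type.upper() == "JSON"
}
for row in db.query("select * from dogs"):
    print({
        key: maybe_json(value) if key in json_columns else value
        for key, value in row.items()
    })
```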

sqlite-utils 140912432 issue    
{
    "url": "https://api.github.com/repos/simonw/sqlite-utils/issues/579/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1816830546 I_kwDODEm0Qs5sSqJS 73 Twitter v1 API shutdown david-perez 6341745 open 0     0 2023-07-22T16:57:41Z 2023-07-22T16:57:41Z   NONE  

I've been using this project reliably over the past two years to periodically download my liked tweets, but unfortunately since 19th July I get:

```
[2023-07-19 21:00:04.937536]   File "/home/pi/code/liked-tweets/lib/python3.7/site-packages/twitter_to_sqlite/utils.py", line 202, in fetch_timeline
[2023-07-19 21:00:04.937606]     raise Exception(str(tweets["errors"]))
[2023-07-19 21:00:04.937678] Exception: [{'message': 'You currently have access to a subset of Twitter API v2 endpoints and limited v1.1 endpoints (e.g. media post, oauth) only. If you need access to this endpoint, you may need a different access level. You can learn more here: https://developer.twitter.com/en/portal/product', 'code': 453}]
```

It appears like Twitter has now shut down their v1 endpoints, which is rather gracious of them, considering they announced they'd be deprecated on 29th April.

Unfortunately retrieving likes using the v2 API is not part of their free plan. In fact, with the free plan one can only post and delete tweets and retrieve information about oneself.

So I'm afraid this is the end of this very nice project. It was very useful, thank you!

twitter-to-sqlite 206156866 issue    
{
    "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/73/reactions",
    "total_count": 1,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 1
}
   
1808116827 I_kwDOBm6k_c5rxaxb 2103 data attribute on Datasette tables exposing the primary key of the row simonw 9599 open 0     0 2023-07-17T16:18:25Z 2023-07-17T16:18:25Z   OWNER  

Maybe put it on the <tr> but probably better to go on the td.type-pk.

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2103/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1794604602 PR_kwDOBm6k_c5U-akg 2096 Clarify docs for descriptions in metadata garthk 15906 open 0     0 2023-07-08T01:57:58Z 2023-07-08T01:58:13Z   FIRST_TIME_CONTRIBUTOR simonw/datasette/pulls/2096

G'day! I got confused while debugging earlier today. That's on me, but it does strike me that a little repetition in the metadata documentation might help those flicking around it rather than reading it from top to bottom. No worries if you think otherwise.


:books: Documentation preview :books:: https://datasette--2096.org.readthedocs.build/en/2096/

datasette 107914493 pull    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2096/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
0  
1794097871 I_kwDOBm6k_c5q78LP 2095 Introduce "dark mode" CSS jamietanna 3315059 open 0     0 2023-07-07T19:15:58Z 2023-07-07T19:15:58Z   NONE  

Using the CSS media query prefers-color-scheme we can provide a dark-mode version of Datasette

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2095/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1783304750 I_kwDOBm6k_c5qSxIu 2094 JS Plugin Hooks for the Code Editor asg017 15178711 open 0     0 2023-07-01T00:51:57Z 2023-07-01T00:51:57Z   CONTRIBUTOR  

When #2052 merges, I'd like to add support to add extensions/functions to the Datasette code editor.

I'd eventually like to build a JS plugin for sqlite-docs, to add things like:

  • Inline documentation for tables/columns on hover
  • Inline docs for custom functions that are loaded in
  • More detailed autocomplete for tables/columns/functions

I did some hacking to see what this would look like, see here:

There can be a new hook that allows JS plugins to add new "extensions" to the CodeMirror EditorView here:

https://github.com/simonw/datasette/blob/8cd60fd1d899952f1153460469b3175465f33f80/datasette/static/cm-editor-6.0.1.js#L25

Will need some more planning. For example, the CodeMirror bundle in Datasette has functions that we could re-export for plugins to use (so we don't load two versions of "@codemirror/autocomplete", for example).

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2094/reactions",
    "total_count": 1,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 1,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1764792125 I_kwDOBm6k_c5pMJc9 2086 Show information on startup in directory configuration mode simonw 9599 open 0     0 2023-06-20T07:13:33Z 2023-06-20T07:13:33Z   OWNER  

https://discord.com/channels/823971286308356157/823971286941302908/1120516587036889098

One thing that would be helpful would be a message at launch indicating a metadata.json is getting picked up. I'm using directory mode and was editing the wrong file for a while before I realized nothing I was doing was having any effect.

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2086/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1762180409 I_kwDOBm6k_c5pCL05 2085 Interactive row selection in Datasette learning4life 24938923 open 0     0 2023-06-18T08:29:45Z 2023-06-18T08:31:23Z   NONE  

Simon did an excellent prototype of interactive row selection in Datasette.

I hope this functionality can be turned into a Datasette plugin.

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2085/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1761613778 I_kwDOBm6k_c5pABfS 2084 Support facets for columns that contain timestamps devxpy 19492893 open 0     0 2023-06-17T03:33:54Z 2023-06-17T03:33:54Z   NONE  

Django has this very nice filter for datetime fields.

It would be nice to have something similar for faceting on a field that contains a timestamp in Datasette too, which doesn't seem to do anything special with timestamps right now...

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2084/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1751214236 I_kwDOC8SPRc5oYWic 36 Getting sqlite_master may not be modified when creating dogsheep index khushmeeet 8711912 open 0     0 2023-06-11T03:21:53Z 2023-06-11T03:21:53Z   NONE  

When creating a dogsheep index from a config.yml file on pocket.db (created using pocket-to-sqlite), I am getting this error:

```
Traceback (most recent call last):
  File "/Users/khushmeeet/.pyenv/versions/3.11.2/bin/dogsheep-beta", line 8, in <module>
    sys.exit(cli())
  File "/Users/khushmeeet/.pyenv/versions/3.11.2/lib/python3.11/site-packages/click/core.py", line 1130, in __call__
    return self.main(*args, **kwargs)
  File "/Users/khushmeeet/.pyenv/versions/3.11.2/lib/python3.11/site-packages/click/core.py", line 1055, in main
    rv = self.invoke(ctx)
  File "/Users/khushmeeet/.pyenv/versions/3.11.2/lib/python3.11/site-packages/click/core.py", line 1657, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/Users/khushmeeet/.pyenv/versions/3.11.2/lib/python3.11/site-packages/click/core.py", line 1404, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/Users/khushmeeet/.pyenv/versions/3.11.2/lib/python3.11/site-packages/click/core.py", line 760, in invoke
    return __callback(*args, **kwargs)
  File "/Users/khushmeeet/.pyenv/versions/3.11.2/lib/python3.11/site-packages/dogsheep_beta/cli.py", line 36, in index
    run_indexer(
  File "/Users/khushmeeet/.pyenv/versions/3.11.2/lib/python3.11/site-packages/dogsheep_beta/utils.py", line 32, in run_indexer
    ensure_table_and_indexes(db, tokenize)
  File "/Users/khushmeeet/.pyenv/versions/3.11.2/lib/python3.11/site-packages/dogsheep_beta/utils.py", line 91, in ensure_table_and_indexes
    table.add_foreign_key(*fk)
  File "/Users/khushmeeet/.pyenv/versions/3.11.2/lib/python3.11/site-packages/sqlite_utils/db.py", line 2155, in add_foreign_key
    self.db.add_foreign_keys([(self.name, column, other_table, other_column)])
  File "/Users/khushmeeet/.pyenv/versions/3.11.2/lib/python3.11/site-packages/sqlite_utils/db.py", line 1116, in add_foreign_keys
    cursor.execute(
sqlite3.OperationalError: table sqlite_master may not be modified
```

Command I ran to get this error:

```
dogsheep-beta index pocket.db config.yml
```

Dogsheep version: dogsheep-beta, version 0.10.2

Python version: Python 3.11.2

dogsheep-beta 197431109 issue    
{
    "url": "https://api.github.com/repos/dogsheep/dogsheep-beta/issues/36/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1734786661 PR_kwDOBm6k_c5R0fcK 2082 Catch query interrupted on facet suggest row count redraw 10843208 open 0     0 2023-05-31T18:42:46Z 2023-05-31T18:45:26Z   FIRST_TIME_CONTRIBUTOR simonw/datasette/pulls/2082

Just like facet's suggest() is trapping QueryInterrupted for facet columns, we also need to trap get_row_count(), which can reach timeout if database tables are big enough.

I've included get_columns() inside the block as that's just another query, even though it's a really cheap one and might never raise the exception.


:books: Documentation preview :books:: https://datasette--2082.org.readthedocs.build/en/2082/

datasette 107914493 pull    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2082/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
0  
1727478903 I_kwDOBm6k_c5m9zx3 2081 Update Endpoints defined in metadata throws 403 Forbidden after a while cutmasta-kun 15085007 open 0     0 2023-05-26T11:52:30Z 2023-05-26T11:52:30Z   NONE  

Hello. I expose an endpoint to update tasks:

```json
{
    "title": "My Datasette Instance",
    "databases": {
        "tasks": {
            "queries": {
                "update_task": {
                    "sql": "UPDATE tasks SET status = :status, result = :result, systemMessage = :systemMessage WHERE queueID = :queueID",
                    "write": true,
                    "on_success_message": "Task updated",
                    "on_success_redirect": "/tasks/tasks.json",
                    "on_error_message": "Task update failed",
                    "on_error_redirect": "/tasks.json",
                    "params": ["queueID", "taskData", "status", "result", "systemMessage"]
                }
            }
        }
    }
}
```

This works really well! But after a while, the Datasette instance answers with 403 Forbidden. I have to delete the database and recreate it in order for it to work again.

Any help here? (´。_。`)

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2081/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1715468032 PR_kwDOBm6k_c5QzEAM 2076 Datsette gpt plugin StudioCordillera 130708713 open 0     0 2023-05-18T11:22:30Z 2023-05-18T11:22:45Z   FIRST_TIME_CONTRIBUTOR simonw/datasette/pulls/2076

:books: Documentation preview :books:: https://datasette--2076.org.readthedocs.build/en/2076/

datasette 107914493 pull    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2076/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
0  
1708981860 PR_kwDOBm6k_c5QdMea 2074 sort files by mtime abbbi 3919561 open 0     0 2023-05-14T15:25:15Z 2023-05-14T15:25:29Z   FIRST_TIME_CONTRIBUTOR simonw/datasette/pulls/2074

Serving multiple database files and getting tired of the default sort, I changed the sort order so that the most recently changed databases are at the top of the list and I don't have to scroll down. Lazy as I am ;)


:books: Documentation preview :books:: https://datasette--2074.org.readthedocs.build/en/2074/

datasette 107914493 pull    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2074/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
0  
1674322631 PR_kwDOBm6k_c5OpEz_ 2061 Add "Packaging a plugin using Poetry" section in docs rclement 1238873 open 0     0 2023-04-19T07:23:28Z 2023-04-19T07:27:18Z   FIRST_TIME_CONTRIBUTOR simonw/datasette/pulls/2061

This PR adds a new section about packaging a plugin using poetry within the "Writing plugins" page of the documentation.


:books: Documentation preview :books:: https://datasette--2061.org.readthedocs.build/en/2061/

datasette 107914493 pull    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2061/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
0  
1665510265 I_kwDOBm6k_c5jRat5 2060 Clean up a bunch of warnings from ruff simonw 9599 open 0     0 2023-04-13T01:23:02Z 2023-04-13T01:23:02Z   OWNER  

See: - #2056

ruff spots a bunch of warnings about things like unused variables - would be good to clean up as many of these as possible.

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2060/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1650984552 PR_kwDOJHON9s5NbyYN 13 use universal command amlestin 14314871 open 0     0 2023-04-02T15:10:54Z 2023-04-02T15:37:34Z   FIRST_TIME_CONTRIBUTOR dogsheep/apple-notes-to-sqlite/pulls/13   apple-notes-to-sqlite 611552758 pull    
{
    "url": "https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/13/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
0  
1650981564 I_kwDOJHON9s5iZ_q8 12 Error running pytest amlestin 14314871 open 0     0 2023-04-02T15:02:36Z 2023-04-02T15:07:10Z   NONE  

```
_______________ ERROR collecting tests/test_apple_notes_to_sqlite.py _______________
ImportError while importing test module '/Users/lol/development/apple-notes-to-sqlite/tests/test_apple_notes_to_sqlite.py'.
Hint: make sure your test modules/packages have valid Python names.
Traceback:
/opt/homebrew/Cellar/python@3.9/3.9.16/Frameworks/Python.framework/Versions/3.9/lib/python3.9/importlib/__init__.py:127: in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
tests/test_apple_notes_to_sqlite.py:2: in <module>
    from apple_notes_to_sqlite.cli import cli, COUNT_SCRIPT, FOLDERS_SCRIPT
E   ModuleNotFoundError: No module named 'apple_notes_to_sqlite'
```

Solution: This is likely a PYTHONPATH issue due to having pytest installed both globally and in the venv. We can guarantee the tests run by adding the current directory to sys.path automatically using

```
python -m pytest
```

The alternative is to activate the venv, install pytest, deactivate, then activate the venv again (https://stackoverflow.com/questions/35045038/how-do-i-use-pytest-with-virtualenv)

apple-notes-to-sqlite 611552758 issue    
{
    "url": "https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/12/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1649793525 I_kwDOBm6k_c5iVdn1 2051 `?_extra=row_urls` for table pages simonw 9599 open 0     0 2023-03-31T17:58:36Z 2023-03-31T17:58:36Z   OWNER  

Provides URLs to the JSON version of those rows. Maybe it persists the ?_shape= option too? Not sure about that.

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2051/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1646068413 I_kwDOBm6k_c5iHQK9 2048 Test failures encountered while packaging for GNU Guix Apteryks 8332263 open 0     0 2023-03-29T15:36:54Z 2023-03-29T15:36:54Z   NONE  

Hello,

While reviewing a package submitted to Guix to add datasette, the test suite produces the following errors:

```
=================================== FAILURES ===================================
_________________________ test_row_strange_table_name __________________________
[gw21] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python

app_client = <datasette.utils.testing.TestClient object at 0x7fffef099be0>

def test_row_strange_table_name(app_client):
    response = app_client.get(
        "/fixtures/table~2Fwith~2Fslashes~2Ecsv/3.json?_shape=objects"
    )
  assert response.status == 200

E       assert 400 == 200
E        +  where 400 = <datasette.utils.testing.TestResponse object at 0x7fffef236a30>.status

/tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_api.py:701: AssertionError
----------------------------- Captured stderr call -----------------------------
ERROR: conn=<sqlite3.Connection object at 0x7fffeedfe5d0>, sql = 'select rowid, * from [table%7E2Fwith%7E2Fslashes%7E2Ecsv] where "rowid"=:p0', params = {'p0': '3'}: no such table: table%7E2Fwith%7E2Fslashes%7E2Ecsv
______________ test_database_page_for_database_with_dot_in_name _______________
[gw15] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python

app_client_with_dot = <datasette.utils.testing.TestClient object at 0x7fffef3416a0>

def test_database_page_for_database_with_dot_in_name(app_client_with_dot):
    response = app_client_with_dot.get("/fixtures~2Edot.json")
  assert response.status == 200

E       assert 302 == 200
E        +  where 302 = <datasette.utils.testing.TestResponse object at 0x7fffef089fa0>.status

/tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_api.py:633: AssertionError
__________________ test_tilde_encoded_database_names[fo%o] ___________________
[gw6] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python

db_name = 'fo%o'

@pytest.mark.asyncio
@pytest.mark.parametrize("db_name", ("foo", r"fo%o", "f~/c.d"))
async def test_tilde_encoded_database_names(db_name):
    ds = Datasette()
    ds.add_memory_database(db_name)
    response = await ds.client.get("/.json")
    assert db_name in response.json().keys()
    path = response.json()[db_name]["path"]
    # And the JSON for that database
    response2 = await ds.client.get(path + ".json")
  assert response2.status_code == 200

E       assert 302 == 200
E        +  where 302 = <Response [302 Found]>.status_code

/tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_api.py:983: AssertionError
__________________ test_tilde_encoded_database_names[f~/c.d] ___________________
[gw7] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python

db_name = 'f~/c.d'

@pytest.mark.asyncio
@pytest.mark.parametrize("db_name", ("foo", r"fo%o", "f~/c.d"))
async def test_tilde_encoded_database_names(db_name):
    ds = Datasette()
    ds.add_memory_database(db_name)
    response = await ds.client.get("/.json")
    assert db_name in response.json().keys()
    path = response.json()[db_name]["path"]
    # And the JSON for that database
    response2 = await ds.client.get(path + ".json")
  assert response2.status_code == 200

E       assert 302 == 200
E        +  where 302 = <Response [302 Found]>.status_code

/tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_api.py:983: AssertionError
________________ test_database_with_space_in_name[/searchable.json] _________________
[gw21] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python

app_client_two_attached_databases = <datasette.utils.testing.TestClient object at 0x7fffef11d730> path = '/searchable.json'

@pytest.mark.parametrize(
    "path",
    (
        "/",
        ".json",
        "/searchable",
        "/searchable.json",
        "/searchable_view",
        "/searchable_view.json",
    ),
)
def test_database_with_space_in_name(app_client_two_attached_databases, path):
  response = app_client_two_attached_databases.get(
        "/extra~20database" + path, follow_redirects=True
    )

/tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_api.py:920:


/gnu/store/mcclmphjgbrgpa0v037a4nlq336482g8-python-asgiref-3.4.1/lib/python3.9/site-packages/asgiref/sync.py:223: in __call__
    return call_result.result()
/gnu/store/65i3nhcwmz0p8rqbg48gaavyky4g4hwk-python-3.9.9/lib/python3.9/concurrent/futures/_base.py:438: in result
    return self.__get_result()
/gnu/store/65i3nhcwmz0p8rqbg48gaavyky4g4hwk-python-3.9.9/lib/python3.9/concurrent/futures/_base.py:390: in __get_result
    raise self._exception
/gnu/store/mcclmphjgbrgpa0v037a4nlq336482g8-python-asgiref-3.4.1/lib/python3.9/site-packages/asgiref/sync.py:292: in main_wrap
    result = await self.awaitable(*args, **kwargs)
/tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/utils/testing.py:66: in get
    return await self._request(
/tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/utils/testing.py:156: in _request
    httpx_response = await self.ds.client.request(
/tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/app.py:1602: in request
    return await client.request(
/gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1527: in request
    return await self.send(request, auth=auth, follow_redirects=follow_redirects)
/gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1614: in send
    response = await self._send_handling_auth(
/gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1642: in _send_handling_auth
    response = await self._send_handling_redirects(


self = <httpx.AsyncClient object at 0x7fffef11d940> request = <Request('GET', 'http://localhost/extra%7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E20database/searchable%7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E2Ejson')> follow_redirects = True history = [<Response [302 Found]>, <Response [302 Found]>, <Response [302 Found]>, <Response [302 Found]>, <Response [302 Found]>, <Response [302 Found]>, ...]

async def _send_handling_redirects(
    self,
    request: Request,
    follow_redirects: bool,
    history: typing.List[Response],
) -> Response:
    while True:
        if len(history) > self.max_redirects:
          raise TooManyRedirects(
                "Exceeded maximum allowed redirects.", request=request
            )

E httpx.TooManyRedirects: Exceeded maximum allowed redirects.

/gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1672: TooManyRedirects
_____ test_database_with_space_in_name[.json] _____
[gw19] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python

app_client_two_attached_databases = <datasette.utils.testing.TestClient object at 0x7fffef085a90>
path = '.json'

@pytest.mark.parametrize(
    "path",
    (
        "/",
        ".json",
        "/searchable",
        "/searchable.json",
        "/searchable_view",
        "/searchable_view.json",
    ),
)
def test_database_with_space_in_name(app_client_two_attached_databases, path):
  response = app_client_two_attached_databases.get(
        "/extra~20database" + path, follow_redirects=True
    )

/tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_api.py:920:


/gnu/store/mcclmphjgbrgpa0v037a4nlq336482g8-python-asgiref-3.4.1/lib/python3.9/site-packages/asgiref/sync.py:223: in __call__
    return call_result.result()
/gnu/store/65i3nhcwmz0p8rqbg48gaavyky4g4hwk-python-3.9.9/lib/python3.9/concurrent/futures/_base.py:438: in result
    return self.__get_result()
/gnu/store/65i3nhcwmz0p8rqbg48gaavyky4g4hwk-python-3.9.9/lib/python3.9/concurrent/futures/_base.py:390: in __get_result
    raise self._exception
/gnu/store/mcclmphjgbrgpa0v037a4nlq336482g8-python-asgiref-3.4.1/lib/python3.9/site-packages/asgiref/sync.py:292: in main_wrap
    result = await self.awaitable(*args, **kwargs)
/tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/utils/testing.py:66: in get
    return await self._request(
/tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/utils/testing.py:156: in _request
    httpx_response = await self.ds.client.request(
/tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/app.py:1602: in request
    return await client.request(
/gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1527: in request
    return await self.send(request, auth=auth, follow_redirects=follow_redirects)
/gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1614: in send
    response = await self._send_handling_auth(
/gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1642: in _send_handling_auth
    response = await self._send_handling_redirects(

self = <httpx.AsyncClient object at 0x7fffecd99ca0>
request = <Request('GET', 'http://localhost/extra%7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E20database%7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E2Ejson')>
follow_redirects = True
history = [<Response [302 Found]>, <Response [302 Found]>, <Response [302 Found]>, <Response [302 Found]>, <Response [302 Found]>, <Response [302 Found]>, ...]

async def _send_handling_redirects(
    self,
    request: Request,
    follow_redirects: bool,
    history: typing.List[Response],
) -> Response:
    while True:
        if len(history) > self.max_redirects:
          raise TooManyRedirects(
                "Exceeded maximum allowed redirects.", request=request
            )

E httpx.TooManyRedirects: Exceeded maximum allowed redirects.

/gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1672: TooManyRedirects
_____ test_database_with_space_in_name[/searchable_view] _____
[gw22] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python

app_client_two_attached_databases = <datasette.utils.testing.TestClient object at 0x7fffeeab4c70>
path = '/searchable_view'

@pytest.mark.parametrize(
    "path",
    (
        "/",
        ".json",
        "/searchable",
        "/searchable.json",
        "/searchable_view",
        "/searchable_view.json",
    ),
)
def test_database_with_space_in_name(app_client_two_attached_databases, path):
  response = app_client_two_attached_databases.get(
        "/extra~20database" + path, follow_redirects=True
    )

/tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_api.py:920:


/gnu/store/mcclmphjgbrgpa0v037a4nlq336482g8-python-asgiref-3.4.1/lib/python3.9/site-packages/asgiref/sync.py:223: in __call__
    return call_result.result()
/gnu/store/65i3nhcwmz0p8rqbg48gaavyky4g4hwk-python-3.9.9/lib/python3.9/concurrent/futures/_base.py:438: in result
    return self.__get_result()
/gnu/store/65i3nhcwmz0p8rqbg48gaavyky4g4hwk-python-3.9.9/lib/python3.9/concurrent/futures/_base.py:390: in __get_result
    raise self._exception
/gnu/store/mcclmphjgbrgpa0v037a4nlq336482g8-python-asgiref-3.4.1/lib/python3.9/site-packages/asgiref/sync.py:292: in main_wrap
    result = await self.awaitable(*args, **kwargs)
/tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/utils/testing.py:66: in get
    return await self._request(
/tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/utils/testing.py:156: in _request
    httpx_response = await self.ds.client.request(
/tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/app.py:1602: in request
    return await client.request(
/gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1527: in request
    return await self.send(request, auth=auth, follow_redirects=follow_redirects)
/gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1614: in send
    response = await self._send_handling_auth(
/gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1642: in _send_handling_auth
    response = await self._send_handling_redirects(

self = <httpx.AsyncClient object at 0x7fffec5b3580>
request = <Request('GET', 'http://localhost/extra%7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E20database/searchable_view')>
follow_redirects = True
history = [<Response [302 Found]>, <Response [302 Found]>, <Response [302 Found]>, <Response [302 Found]>, <Response [302 Found]>, <Response [302 Found]>, ...]

async def _send_handling_redirects(
    self,
    request: Request,
    follow_redirects: bool,
    history: typing.List[Response],
) -> Response:
    while True:
        if len(history) > self.max_redirects:
          raise TooManyRedirects(
                "Exceeded maximum allowed redirects.", request=request
            )

E httpx.TooManyRedirects: Exceeded maximum allowed redirects.

/gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1672: TooManyRedirects
_____ test_database_with_space_in_name[/] _____
[gw18] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python

app_client_two_attached_databases = <datasette.utils.testing.TestClient object at 0x7fffef085be0>
path = '/'

@pytest.mark.parametrize(
    "path",
    (
        "/",
        ".json",
        "/searchable",
        "/searchable.json",
        "/searchable_view",
        "/searchable_view.json",
    ),
)
def test_database_with_space_in_name(app_client_two_attached_databases, path):
  response = app_client_two_attached_databases.get(
        "/extra~20database" + path, follow_redirects=True
    )

/tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_api.py:920:


/gnu/store/mcclmphjgbrgpa0v037a4nlq336482g8-python-asgiref-3.4.1/lib/python3.9/site-packages/asgiref/sync.py:223: in __call__
    return call_result.result()
/gnu/store/65i3nhcwmz0p8rqbg48gaavyky4g4hwk-python-3.9.9/lib/python3.9/concurrent/futures/_base.py:438: in result
    return self.__get_result()
/gnu/store/65i3nhcwmz0p8rqbg48gaavyky4g4hwk-python-3.9.9/lib/python3.9/concurrent/futures/_base.py:390: in __get_result
    raise self._exception
/gnu/store/mcclmphjgbrgpa0v037a4nlq336482g8-python-asgiref-3.4.1/lib/python3.9/site-packages/asgiref/sync.py:292: in main_wrap
    result = await self.awaitable(*args, **kwargs)
/tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/utils/testing.py:66: in get
    return await self._request(
/tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/utils/testing.py:156: in _request
    httpx_response = await self.ds.client.request(
/tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/app.py:1602: in request
    return await client.request(
/gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1527: in request
    return await self.send(request, auth=auth, follow_redirects=follow_redirects)
/gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1614: in send
    response = await self._send_handling_auth(
/gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1642: in _send_handling_auth
    response = await self._send_handling_redirects(

self = <httpx.AsyncClient object at 0x7fffec5f1370>
request = <Request('GET', 'http://localhost/extra%7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E20database/')>
follow_redirects = True
history = [<Response [302 Found]>, <Response [302 Found]>, <Response [302 Found]>, <Response [302 Found]>, <Response [302 Found]>, <Response [302 Found]>, ...]

async def _send_handling_redirects(
    self,
    request: Request,
    follow_redirects: bool,
    history: typing.List[Response],
) -> Response:
    while True:
        if len(history) > self.max_redirects:
          raise TooManyRedirects(
                "Exceeded maximum allowed redirects.", request=request
            )

E httpx.TooManyRedirects: Exceeded maximum allowed redirects.

/gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1672: TooManyRedirects
_____ test_database_with_space_in_name[/searchable] _____
[gw20] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python

app_client_two_attached_databases = <datasette.utils.testing.TestClient object at 0x7fffef099f10>
path = '/searchable'

@pytest.mark.parametrize(
    "path",
    (
        "/",
        ".json",
        "/searchable",
        "/searchable.json",
        "/searchable_view",
        "/searchable_view.json",
    ),
)
def test_database_with_space_in_name(app_client_two_attached_databases, path):
  response = app_client_two_attached_databases.get(
        "/extra~20database" + path, follow_redirects=True
    )

/tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_api.py:920:


/gnu/store/mcclmphjgbrgpa0v037a4nlq336482g8-python-asgiref-3.4.1/lib/python3.9/site-packages/asgiref/sync.py:223: in __call__
    return call_result.result()
/gnu/store/65i3nhcwmz0p8rqbg48gaavyky4g4hwk-python-3.9.9/lib/python3.9/concurrent/futures/_base.py:438: in result
    return self.__get_result()
/gnu/store/65i3nhcwmz0p8rqbg48gaavyky4g4hwk-python-3.9.9/lib/python3.9/concurrent/futures/_base.py:390: in __get_result
    raise self._exception
/gnu/store/mcclmphjgbrgpa0v037a4nlq336482g8-python-asgiref-3.4.1/lib/python3.9/site-packages/asgiref/sync.py:292: in main_wrap
    result = await self.awaitable(*args, **kwargs)
/tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/utils/testing.py:66: in get
    return await self._request(
/tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/utils/testing.py:156: in _request
    httpx_response = await self.ds.client.request(
/tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/app.py:1602: in request
    return await client.request(
/gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1527: in request
    return await self.send(request, auth=auth, follow_redirects=follow_redirects)
/gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1614: in send
    response = await self._send_handling_auth(
/gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1642: in _send_handling_auth
    response = await self._send_handling_redirects(

self = <httpx.AsyncClient object at 0x7fffecd8c790>
request = <Request('GET', 'http://localhost/extra%7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E20database/searchable')>
follow_redirects = True
history = [<Response [302 Found]>, <Response [302 Found]>, <Response [302 Found]>, <Response [302 Found]>, <Response [302 Found]>, <Response [302 Found]>, ...]

async def _send_handling_redirects(
    self,
    request: Request,
    follow_redirects: bool,
    history: typing.List[Response],
) -> Response:
    while True:
        if len(history) > self.max_redirects:
          raise TooManyRedirects(
                "Exceeded maximum allowed redirects.", request=request
            )

E httpx.TooManyRedirects: Exceeded maximum allowed redirects.

/gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1672: TooManyRedirects
_____ test_database_with_space_in_name[/searchable_view.json] _____
[gw23] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python

app_client_two_attached_databases = <datasette.utils.testing.TestClient object at 0x7fffef341520>
path = '/searchable_view.json'

@pytest.mark.parametrize(
    "path",
    (
        "/",
        ".json",
        "/searchable",
        "/searchable.json",
        "/searchable_view",
        "/searchable_view.json",
    ),
)
def test_database_with_space_in_name(app_client_two_attached_databases, path):
  response = app_client_two_attached_databases.get(
        "/extra~20database" + path, follow_redirects=True
    )

/tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_api.py:920:


/gnu/store/mcclmphjgbrgpa0v037a4nlq336482g8-python-asgiref-3.4.1/lib/python3.9/site-packages/asgiref/sync.py:223: in __call__
    return call_result.result()
/gnu/store/65i3nhcwmz0p8rqbg48gaavyky4g4hwk-python-3.9.9/lib/python3.9/concurrent/futures/_base.py:438: in result
    return self.__get_result()
/gnu/store/65i3nhcwmz0p8rqbg48gaavyky4g4hwk-python-3.9.9/lib/python3.9/concurrent/futures/_base.py:390: in __get_result
    raise self._exception
/gnu/store/mcclmphjgbrgpa0v037a4nlq336482g8-python-asgiref-3.4.1/lib/python3.9/site-packages/asgiref/sync.py:292: in main_wrap
    result = await self.awaitable(*args, **kwargs)
/tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/utils/testing.py:66: in get
    return await self._request(
/tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/utils/testing.py:156: in _request
    httpx_response = await self.ds.client.request(
/tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/app.py:1602: in request
    return await client.request(
/gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1527: in request
    return await self.send(request, auth=auth, follow_redirects=follow_redirects)
/gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1614: in send
    response = await self._send_handling_auth(
/gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1642: in _send_handling_auth
    response = await self._send_handling_redirects(

self = <httpx.AsyncClient object at 0x7fffef085460>
request = <Request('GET', 'http://localhost/extra%7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E20database/searchable_view%7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E2Ejson')>
follow_redirects = True
history = [<Response [302 Found]>, <Response [302 Found]>, <Response [302 Found]>, <Response [302 Found]>, <Response [302 Found]>, <Response [302 Found]>, ...]

async def _send_handling_redirects(
    self,
    request: Request,
    follow_redirects: bool,
    history: typing.List[Response],
) -> Response:
    while True:
        if len(history) > self.max_redirects:
          raise TooManyRedirects(
                "Exceeded maximum allowed redirects.", request=request
            )

E httpx.TooManyRedirects: Exceeded maximum allowed redirects.

/gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1672: TooManyRedirects
_____ test_weird_database_names[database (1).sqlite] _____
[gw7] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python

tmpdir = local('/tmp/guix-build-datasette-0.64.2.drv-0/pytest-of-nixbld/pytest-0/popen-gw7/test_weird_database_names_data0')
filename = 'database (1).sqlite'

@pytest.mark.parametrize(
    "filename", ["test-database (1).sqlite", "database (1).sqlite"]
)
def test_weird_database_names(tmpdir, filename):
    # https://github.com/simonw/datasette/issues/1181
    runner = CliRunner()
    db_path = str(tmpdir / filename)
    sqlite3.connect(db_path).execute("vacuum")
    result1 = runner.invoke(cli, [db_path, "--get", "/"])
    assert result1.exit_code == 0, result1.output
    filename_no_stem = filename.rsplit(".", 1)[0]
    expected_link = '<a href="/{}">{}</a>'.format(
        tilde_encode(filename_no_stem), filename_no_stem
    )
    assert expected_link in result1.output
    # Now try hitting that database page
    result2 = runner.invoke(
        cli, [db_path, "--get", "/{}".format(tilde_encode(filename_no_stem))]
    )
  assert result2.exit_code == 0, result2.output

E       AssertionError:
E       assert 1 == 0
E        +  where 1 = <Result SystemExit(1)>.exit_code

/tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_cli.py:321: AssertionError
_____ test_weird_database_names[test-database (1).sqlite] _____
[gw6] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python

tmpdir = local('/tmp/guix-build-datasette-0.64.2.drv-0/pytest-of-nixbld/pytest-0/popen-gw6/test_weird_database_names_test0')
filename = 'test-database (1).sqlite'

@pytest.mark.parametrize(
    "filename", ["test-database (1).sqlite", "database (1).sqlite"]
)
def test_weird_database_names(tmpdir, filename):
    # https://github.com/simonw/datasette/issues/1181
    runner = CliRunner()
    db_path = str(tmpdir / filename)
    sqlite3.connect(db_path).execute("vacuum")
    result1 = runner.invoke(cli, [db_path, "--get", "/"])
    assert result1.exit_code == 0, result1.output
    filename_no_stem = filename.rsplit(".", 1)[0]
    expected_link = '<a href="/{}">{}</a>'.format(
        tilde_encode(filename_no_stem), filename_no_stem
    )
    assert expected_link in result1.output
    # Now try hitting that database page
    result2 = runner.invoke(
        cli, [db_path, "--get", "/{}".format(tilde_encode(filename_no_stem))]
    )
  assert result2.exit_code == 0, result2.output

E       AssertionError:
E       assert 1 == 0
E        +  where 1 = <Result SystemExit(1)>.exit_code

/tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_cli.py:321: AssertionError
_____ test_row_html_compound_primary_key[/fixtures/compound_primary_key/a~2Fb,~2Ec~2Dd-expected1] _____
[gw11] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python

app_client = <datasette.utils.testing.TestClient object at 0x7fffec2d37f0>
path = '/fixtures/compound_primary_key/a~2Fb,~2Ec~2Dd'
expected = [['<td class="col-pk1 type-str">a/b</td>', '<td class="col-pk2 type-str">.c-d</td>', '<td class="col-content type-str">c</td>']]

@pytest.mark.parametrize(
    "path,expected",
    (
        (
            "/fixtures/compound_primary_key/a,b",
            [
                [
                    '<td class="col-pk1 type-str">a</td>',
                    '<td class="col-pk2 type-str">b</td>',
                    '<td class="col-content type-str">c</td>',
                ]
            ],
        ),
        (
            "/fixtures/compound_primary_key/a~2Fb,~2Ec~2Dd",
            [
                [
                    '<td class="col-pk1 type-str">a/b</td>',
                    '<td class="col-pk2 type-str">.c-d</td>',
                    '<td class="col-content type-str">c</td>',
                ]
            ],
        ),
    ),
)
def test_row_html_compound_primary_key(app_client, path, expected):
    response = app_client.get(path)
  assert response.status == 200

E       assert 302 == 200
E        +  where 302 = <datasette.utils.testing.TestResponse object at 0x7fffd47c23a0>.status

/tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_html.py:370: AssertionError
_____ test_css_classes_on_body[/fixtures/table~2Fwith~2Fslashes~2Ecsv-expected_classes5] _____
[gw3] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python

app_client = <datasette.utils.testing.TestClient object at 0x7fffd4743fa0>
path = '/fixtures/table~2Fwith~2Fslashes~2Ecsv'
expected_classes = ['table', 'db-fixtures', 'table-tablewithslashescsv-fa7563']

@pytest.mark.parametrize(
    "path,expected_classes",
    [
        ("/", ["index"]),
        ("/fixtures", ["db", "db-fixtures"]),
        ("/fixtures?sql=select+1", ["query", "db-fixtures"]),
        (
            "/fixtures/simple_primary_key",
            ["table", "db-fixtures", "table-simple_primary_key"],
        ),
        (
            "/fixtures/neighborhood_search",
            ["query", "db-fixtures", "query-neighborhood_search"],
        ),
        (
            "/fixtures/table~2Fwith~2Fslashes~2Ecsv",
            ["table", "db-fixtures", "table-tablewithslashescsv-fa7563"],
        ),
        (
            "/fixtures/simple_primary_key/1",
            ["row", "db-fixtures", "table-simple_primary_key"],
        ),
    ],
)
def test_css_classes_on_body(app_client, path, expected_classes):
    response = app_client.get(path)
  assert response.status == 200

E       assert 302 == 200
E        +  where 302 = <datasette.utils.testing.TestResponse object at 0x7fffd4628dc0>.status

/tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_html.py:238: AssertionError
_____ test_templates_considered[/fixtures/table~2Fwith~2Fslashes~2Ecsv-table-fixtures-tablewithslashescsv-fa7563.html, *table.html] _____
[gw3] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python

app_client = <datasette.utils.testing.TestClient object at 0x7fffd4743fa0>
path = '/fixtures/table~2Fwith~2Fslashes~2Ecsv'
expected_considered = 'table-fixtures-tablewithslashescsv-fa7563.html, *table.html'

@pytest.mark.parametrize(
    "path,expected_considered",
    [
        ("/", "*index.html"),
        ("/fixtures", "database-fixtures.html, *database.html"),
        (
            "/fixtures/simple_primary_key",
            "table-fixtures-simple_primary_key.html, *table.html",
        ),
        (
            "/fixtures/table~2Fwith~2Fslashes~2Ecsv",
            "table-fixtures-tablewithslashescsv-fa7563.html, *table.html",
        ),
        (
            "/fixtures/simple_primary_key/1",
            "row-fixtures-simple_primary_key.html, *row.html",
        ),
    ],
)
def test_templates_considered(app_client, path, expected_considered):
    response = app_client.get(path)
  assert response.status == 200

E       assert 302 == 200
E        +  where 302 = <datasette.utils.testing.TestResponse object at 0x7fffd44f2d60>.status

/tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_html.py:264: AssertionError
_____ test_alternate_url_json[/fixtures/table~2Fwith~2Fslashes~2Ecsv-http://localhost/fixtures/table~2Fwith~2Fslashes~2Ecsv.json] _____
[gw21] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python

app_client = <datasette.utils.testing.TestClient object at 0x7fffecd9fac0>
path = '/fixtures/table~2Fwith~2Fslashes~2Ecsv'
expected = 'http://localhost/fixtures/table~2Fwith~2Fslashes~2Ecsv.json'

@pytest.mark.parametrize(
    "path,expected",
    (
        # Instance index page
        ("/", "http://localhost/.json"),
        # Table page
        ("/fixtures/facetable", "http://localhost/fixtures/facetable.json"),
        (
            "/fixtures/table~2Fwith~2Fslashes~2Ecsv",
            "http://localhost/fixtures/table~2Fwith~2Fslashes~2Ecsv.json",
        ),
        # Row page
        (
            "/fixtures/no_primary_key/1",
            "http://localhost/fixtures/no_primary_key/1.json",
        ),
        # Database index page
        (
            "/fixtures",
            "http://localhost/fixtures.json",
        ),
        # Custom query page
        (
            "/fixtures?sql=select+*+from+facetable",
            "http://localhost/fixtures.json?sql=select+*+from+facetable",
        ),
        # Canned query page
        (
            "/fixtures/neighborhood_search?text=town",
            "http://localhost/fixtures/neighborhood_search.json?text=town",
        ),
        # /-/ pages
        (
            "/-/plugins",
            "http://localhost/-/plugins.json",
        ),
    ),
)
def test_alternate_url_json(app_client, path, expected):
    response = app_client.get(path)
  assert response.status == 200

E       assert 302 == 200
E        +  where 302 = <datasette.utils.testing.TestResponse object at 0x7fffd4650b80>.status

/tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_html.py:948: AssertionError
_____ test_edit_sql_link_on_canned_queries[/fixtures/~F0~9D~90~9C~F0~9D~90~A2~F0~9D~90~AD~F0~9D~90~A2~F0~9D~90~9E~F0~9D~90~AC-/fixtures?sql=select+id%2C+name+from+facet_cities+order+by+id+limit+1%3B] _____
[gw18] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python

app_client = <datasette.utils.testing.TestClient object at 0x7fffec5952e0>
path = '/fixtures/~F0~9D~90~9C~F0~9D~90~A2~F0~9D~90~AD~F0~9D~90~A2~F0~9D~90~9E~F0~9D~90~AC'
expected = '/fixtures?sql=select+id%2C+name+from+facet_cities+order+by+id+limit+1%3B'

@pytest.mark.parametrize(
    "path,expected",
    [
        (
            "/fixtures/neighborhood_search",
            "/fixtures?sql=%0Aselect+_neighborhood%2C+facet_cities.name%2C+state%0Afrom+facetable%0A++++join+facet_cities%0A++++++++on+facetable._city_id+%3D+facet_cities.id%0Awhere+_neighborhood+like+%27%25%27+%7C%7C+%3Atext+%7C%7C+%27%25%27%0Aorder+by+_neighborhood%3B%0A&amp;text=",
        ),
        (
            "/fixtures/neighborhood_search?text=ber",
            "/fixtures?sql=%0Aselect+_neighborhood%2C+facet_cities.name%2C+state%0Afrom+facetable%0A++++join+facet_cities%0A++++++++on+facetable._city_id+%3D+facet_cities.id%0Awhere+_neighborhood+like+%27%25%27+%7C%7C+%3Atext+%7C%7C+%27%25%27%0Aorder+by+_neighborhood%3B%0A&amp;text=ber",
        ),
        ("/fixtures/pragma_cache_size", None),
        (
            # /fixtures/𝐜𝐢𝐭𝐢𝐞𝐬
            "/fixtures/~F0~9D~90~9C~F0~9D~90~A2~F0~9D~90~AD~F0~9D~90~A2~F0~9D~90~9E~F0~9D~90~AC",
            "/fixtures?sql=select+id%2C+name+from+facet_cities+order+by+id+limit+1%3B",
        ),
        ("/fixtures/magic_parameters", None),
    ],
)
def test_edit_sql_link_on_canned_queries(app_client, path, expected):
    response = app_client.get(path)
  assert response.status == 200

E       assert 302 == 200
E        +  where 302 = <datasette.utils.testing.TestResponse object at 0x7fffec4a6f70>.status

/tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_html.py:841: AssertionError
_____ test_table_with_slashes_in_name _____
[gw9] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python

app_client = <datasette.utils.testing.TestClient object at 0x7fffec0860a0>

def test_table_with_slashes_in_name(app_client):
    response = app_client.get(
        "/fixtures/table~2Fwith~2Fslashes~2Ecsv.json?_shape=objects"
    )
  assert response.status == 200

E       assert 302 == 200
E        +  where 302 = <datasette.utils.testing.TestResponse object at 0x7fffec086fa0>.status

/tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_table_api.py:141: AssertionError
_____ test_custom_query_with_unicode_characters _____
[gw8] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python

app_client = <datasette.utils.testing.TestClient object at 0x7fffec1cda90>

def test_custom_query_with_unicode_characters(app_client):
    # /fixtures/𝐜𝐢𝐭𝐢𝐞𝐬.json
    response = app_client.get(
        "/fixtures/~F0~9D~90~9C~F0~9D~90~A2~F0~9D~90~AD~F0~9D~90~A2~F0~9D~90~9E~F0~9D~90~AC.json?_shape=array"
    )
  assert [{"id": 1, "name": "San Francisco"}] == response.json

/tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_table_api.py:1042:


/tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/utils/testing.py:40: in json
    return json.loads(self.text)
/gnu/store/65i3nhcwmz0p8rqbg48gaavyky4g4hwk-python-3.9.9/lib/python3.9/json/__init__.py:346: in loads
    return _default_decoder.decode(s)
/gnu/store/65i3nhcwmz0p8rqbg48gaavyky4g4hwk-python-3.9.9/lib/python3.9/json/decoder.py:337: in decode
    obj, end = self.raw_decode(s, idx=_w(s, 0).end())


self = <json.decoder.JSONDecoder object at 0x7ffff7479760>, s = '', idx = 0

def raw_decode(self, s, idx=0):
    """Decode a JSON document from ``s`` (a ``str`` beginning with
    a JSON document) and return a 2-tuple of the Python
    representation and the index in ``s`` where the document ended.

    This can be used to decode a JSON document from a string that may
    have extraneous data at the end.

    """
    try:
        obj, end = self.scan_once(s, idx)
    except StopIteration as err:
      raise JSONDecodeError("Expecting value", s, err.value) from None

E json.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0)

/gnu/store/65i3nhcwmz0p8rqbg48gaavyky4g4hwk-python-3.9.9/lib/python3.9/json/decoder.py:355: JSONDecodeError
_____ test_searchable[/fixtures/searchable.json?_search=te*+AND+do*&_searchmode=raw-expected_rows3] _____
[gw13] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python

app_client = <datasette.utils.testing.TestClient object at 0x7fffd470bf10>
path = '/fixtures/searchable.json?_search=te*+AND+do*&_searchmode=raw'
expected_rows = [[1, 'barry cat', 'terry dog', 'panther'], [2, 'terry dog', 'sara weasel', 'puma']]

@pytest.mark.parametrize(
    "path,expected_rows",
    [
        (
            "/fixtures/searchable.json?_search=dog",
            [
                [1, "barry cat", "terry dog", "panther"],
                [2, "terry dog", "sara weasel", "puma"],
            ],
        ),
        (
            # Special keyword shouldn't break FTS query
            "/fixtures/searchable.json?_search=AND",
            [],
        ),
        (
            # Without _searchmode=raw this should return no results
            "/fixtures/searchable.json?_search=te*+AND+do*",
            [],
        ),
        (
            # _searchmode=raw
            "/fixtures/searchable.json?_search=te*+AND+do*&_searchmode=raw",
            [
                [1, "barry cat", "terry dog", "panther"],
                [2, "terry dog", "sara weasel", "puma"],
            ],
        ),
        (
            # _searchmode=raw combined with _search_COLUMN
            "/fixtures/searchable.json?_search_text2=te*&_searchmode=raw",
            [
                [1, "barry cat", "terry dog", "panther"],
            ],
        ),
        (
            "/fixtures/searchable.json?_search=weasel",
            [[2, "terry dog", "sara weasel", "puma"]],
        ),
        (
            "/fixtures/searchable.json?_search_text2=dog",
            [[1, "barry cat", "terry dog", "panther"]],
        ),
        (
            "/fixtures/searchable.json?_search_name%20with%20.%20and%20spaces=panther",
            [[1, "barry cat", "terry dog", "panther"]],
        ),
    ],
)
def test_searchable(app_client, path, expected_rows):
    response = app_client.get(path)
  assert expected_rows == response.json["rows"]

E       AssertionError: assert [[1, 'barry cat', 'terry dog', 'panther'],\n [2, 'terry dog', 'sara weasel', 'puma']] == []
E         Left contains 2 more items, first extra item: [1, 'barry cat', 'terry dog', 'panther']
E         Full diff:
E           [
E         - ,
E         + [1,
E         +  'barry cat',
E         +  'terry dog',
E         +  'panther'],
E         + [2,
E         +  'terry dog',
E         +  'sara weasel',
E         +  'puma'],
E           ]

/tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_table_api.py:402: AssertionError
_____ test_searchmode[table_metadata1-_search=te*+AND+do*-expected_rows1] _____
[gw20] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python

table_metadata = {'searchmode': 'raw'}
querystring = '_search=te*+AND+do*'
expected_rows = [[1, 'barry cat', 'terry dog', 'panther'], [2, 'terry dog', 'sara weasel', 'puma']]

@pytest.mark.parametrize(
    "table_metadata,querystring,expected_rows",
    [
        (
            {},
            "_search=te*+AND+do*",
            [],
        ),
        (
            {"searchmode": "raw"},
            "_search=te*+AND+do*",
            _SEARCHMODE_RAW_RESULTS,
        ),
        (
            {},
            "_search=te*+AND+do*&_searchmode=raw",
            _SEARCHMODE_RAW_RESULTS,
        ),
        # Can be over-ridden with _searchmode=escaped
        (
            {"searchmode": "raw"},
            "_search=te*+AND+do*&_searchmode=escaped",
            [],
        ),
    ],
)
def test_searchmode(table_metadata, querystring, expected_rows):
    with make_app_client(
        metadata={"databases": {"fixtures": {"tables": {"searchable": table_metadata}}}}
    ) as client:
        response = client.get("/fixtures/searchable.json?" + querystring)
      assert expected_rows == response.json["rows"]

E       AssertionError: assert [[1, 'barry cat', 'terry dog', 'panther'],\n [2, 'terry dog', 'sara weasel', 'puma']] == []
E         Left contains 2 more items, first extra item: [1, 'barry cat', 'terry dog', 'panther']
E         Full diff:
E           [
E         - ,
E         + [1,
E         +  'barry cat',
E         +  'terry dog',
E         +  'panther'],
E         + [2,
E         +  'terry dog',
E         +  'sara weasel',
E         +  'puma'],
E           ]

/tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_table_api.py:442: AssertionError
_____ test_searchmode[table_metadata2-_search=te*+AND+do*&_searchmode=raw-expected_rows2] _____
[gw20] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python

table_metadata = {}
querystring = '_search=te*+AND+do*&_searchmode=raw'
expected_rows = [[1, 'barry cat', 'terry dog', 'panther'], [2, 'terry dog', 'sara weasel', 'puma']]

@pytest.mark.parametrize(
    "table_metadata,querystring,expected_rows",
    [
        (
            {},
            "_search=te*+AND+do*",
            [],
        ),
        (
            {"searchmode": "raw"},
            "_search=te*+AND+do*",
            _SEARCHMODE_RAW_RESULTS,
        ),
        (
            {},
            "_search=te*+AND+do*&_searchmode=raw",
            _SEARCHMODE_RAW_RESULTS,
        ),
        # Can be over-ridden with _searchmode=escaped
        (
            {"searchmode": "raw"},
            "_search=te*+AND+do*&_searchmode=escaped",
            [],
        ),
    ],
)
def test_searchmode(table_metadata, querystring, expected_rows):
    with make_app_client(
        metadata={"databases": {"fixtures": {"tables": {"searchable": table_metadata}}}}
    ) as client:
        response = client.get("/fixtures/searchable.json?" + querystring)
      assert expected_rows == response.json["rows"]

E       AssertionError: assert [[1, 'barry cat', 'terry dog', 'panther'],\n [2, 'terry dog', 'sara weasel', 'puma']] == []
E         Left contains 2 more items, first extra item: [1, 'barry cat', 'terry dog', 'panther']
E         Full diff:
E           [
E         - ,
E         + [1,
E         +  'barry cat',
E         +  'terry dog',
E         +  'panther'],
E         + [2,
E         +  'terry dog',
E         +  'sara weasel',
E         +  'puma'],
E           ]

/tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_table_api.py:442: AssertionError
=========================== short test summary info ============================
FAILED tests/test_api.py::test_row_strange_table_name - assert 400 == 200
FAILED tests/test_api.py::test_database_page_for_database_with_dot_in_name - ...
FAILED tests/test_api.py::test_tilde_encoded_database_names[fo%o] - assert 30...
FAILED tests/test_api.py::test_tilde_encoded_database_names[f~/c.d] - assert ...
FAILED tests/test_api.py::test_database_with_space_in_name[/searchable.json]
FAILED tests/test_api.py::test_database_with_space_in_name[.json] - httpx.Too...
FAILED tests/test_api.py::test_database_with_space_in_name[/searchable_view]
FAILED tests/test_api.py::test_database_with_space_in_name[/] - httpx.TooMany...
FAILED tests/test_api.py::test_database_with_space_in_name[/searchable] - htt...
FAILED tests/test_api.py::test_database_with_space_in_name[/searchable_view.json]
FAILED tests/test_cli.py::test_weird_database_names[database (1).sqlite] - As...
FAILED tests/test_cli.py::test_weird_database_names[test-database (1).sqlite]
FAILED tests/test_html.py::test_row_html_compound_primary_key[/fixtures/compound_primary_key/a~2Fb,~2Ec~2Dd-expected1]
FAILED tests/test_html.py::test_css_classes_on_body[/fixtures/table~2Fwith~2Fslashes~2Ecsv-expected_classes5]
FAILED tests/test_html.py::test_templates_considered[/fixtures/table~2Fwith~2Fslashes~2Ecsv-table-fixtures-tablewithslashescsv-fa7563.html, *table.html]
FAILED tests/test_html.py::test_alternate_url_json[/fixtures/table~2Fwith~2Fslashes~2Ecsv-http://localhost/fixtures/table~2Fwith~2Fslashes~2Ecsv.json]
FAILED tests/test_html.py::test_edit_sql_link_on_canned_queries[/fixtures/~F0~9D~90~9C~F0~9D~90~A2~F0~9D~90~AD~F0~9D~90~A2~F0~9D~90~9E~F0~9D~90~AC-/fixtures?sql=select+id%2C+name+from+facet_cities+order+by+id+limit+1%3B]
FAILED tests/test_table_api.py::test_table_with_slashes_in_name - assert 302 ...
FAILED tests/test_table_api.py::test_custom_query_with_unicode_characters - j...
FAILED tests/test_table_api.py::test_searchable[/fixtures/searchable.json?_search=te*+AND+do*&_searchmode=raw-expected_rows3]
FAILED tests/test_table_api.py::test_searchmode[table_metadata1-_search=te*+AND+do*-expected_rows1]
FAILED tests/test_table_api.py::test_searchmode[table_metadata2-_search=te*+AND+do*&_searchmode=raw-expected_rows2]
=========== 22 failed, 1049 passed, 3 skipped in 1522.28s (0:25:22) ============
error: in phase 'check': uncaught exception:
%exception #<&invoke-error program: "/gnu/store/ziqwkzz6znb5d3c245xn0cq5ra2ly0w3-python-pytest-7.1.3/bin/pytest" arguments: ("-vv" "-n" "24" "-m" "not serial") exit-status: 1 term-signal: #f stop-signal: #f>
phase `check' failed after 1523.3 seconds

The tests run in a private namespace without internet connectivity, and the Python dependencies are at:

```
python-aiofiles@0.6.0 python-asgi-csrf@0.9 python-asgiref@3.4.1
python-beautifulsoup4@4.11.1 python-black@22.3.0 python-click-default-group@1.2.2 python-click@8.1.3
python-cogapp@3.3.0 python-httpx@0.23.0 python-hupper@1.10.3 python-itsdangerous@2.0.1
python-janus@1.0.0 python-jinja2@3.1.1 python-mergedeep@1.3.4 python-pint@0.20.1 python-pluggy@1.0.0
python-pytest-asyncio@0.17.2 python-pytest-runner@5.2 python-pytest-timeout@2.0.2
python-pytest-xdist@2.5.0 python-pytest@7.1.3 python-pyyaml@6.0 python-setuptools@64.0.3
python-trustme@0.9.0 python-uvicorn@0.17.6
```

With Python 3.9.9.

Thank you!

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2048/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1641013220 I_kwDOBm6k_c5hz9_k 2045 First column on a view page has no facet option in cog menu simonw 9599 open 0   Datasette 1.0 3268330 0 2023-03-26T18:02:47Z 2023-03-26T18:02:48Z   OWNER  

e.g. first column on this page - cog menu has no option to facet.

https://datasette.io/content/tools

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2045/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1639873822 PR_kwDOBm6k_c5M29tt 2044 Expand labels in row view as well (patch for 0.64.x branch) tmcl-it 82332573 open 0     0 2023-03-24T18:44:44Z 2023-03-24T18:44:57Z   FIRST_TIME_CONTRIBUTOR simonw/datasette/pulls/2044

This is a version of #2031 for the 0.64.x branch.


:books: Documentation preview :books:: https://datasette--2044.org.readthedocs.build/en/2044/

datasette 107914493 pull    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2044/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
0  
1636616315 I_kwDOBm6k_c5hjMh7 2042 Gather feedback on new ?_extra= design simonw 9599 open 0     0 2023-03-22T23:07:43Z 2023-03-22T23:08:19Z   OWNER  

Now that I've landed:

- #1999

See also:

- #262

I want to get some feedback from people on the design of the new ?_extra= feature, before freezing it into Datasette 1.0.

The big change is that the default JSON representation is now MUCH slimmer - it only gives you keys for "next" and "rows", where rows is a list of JSON objects (not a list of arrays as was previously the default) - for example https://latest.datasette.io/fixtures/sortable.json

If you want extra stuff you can ask for it with the new ?_extra= parameter - e.g. https://latest.datasette.io/fixtures/sortable.json?_extra=columns&_extra=suggested_facets

You can use ?_extra=extras to see a list of available extras: https://latest.datasette.io/fixtures/sortable.json?_extra=extras

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2042/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1620515757 I_kwDOBm6k_c5glxut 2039 Subtle bug with `--load-extension` and `--static` flags with absolute Windows paths with`C:\` asg017 15178711 open 0     0 2023-03-12T21:18:52Z 2023-03-12T21:18:52Z   CONTRIBUTOR  

From the Datasette Discord: a user tried running the following command on Windows:

    datasette --load-extension="C:\spatialite\mod_spatialite-5.0.1-win-x86\mod_spatialite.dll"

This failed with "The specified module could not be found", because the entrypoint option introduced in #1789 splits the input differently. Instead of loading the extension found at "C:\spatialite\mod_spatialite-5.0.1-win-x86\mod_spatialite.dll", it instead tried to load the extension at "C" with entrypoint "\spatialite\mod_spatialite-5.0.1-win-x86\mod_spatialite.dll".

This is hard because most absolute Windows paths have a colon in them, like C:\foo.txt or D:\bar.txt. I'd imagine the --static flag is also vulnerable to this type of bug.

The "solution" is to use a relative path instead, but that doesn't feel that great.

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2039/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1617938730 I_kwDOJHON9s5gb8kq 9 Default to just storing plaintext, store HTML if `--html` is passed simonw 9599 open 0     0 2023-03-09T20:19:06Z 2023-03-09T20:19:06Z   MEMBER  

The full body version of the notes can get HUGE, due to embedded images. It turns out for my own purposes I'm usually happy with just the plaintext version.

I'm tempted to say you don't get HTML unless you pass a --html option.
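A sketch of how that flag could look, click-style (the command name and structure here are illustrative, not the tool's current CLI):

```python
import click


@click.command()
@click.option("--html", is_flag=True, help="Also store the HTML body of each note")
def notes(html):
    # Plaintext is always stored; the large HTML body only when asked for.
    record = {"plaintext": "note text ..."}
    if html:
        record["body"] = "<div>note text ...</div>"
    click.echo(record)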

apple-notes-to-sqlite 611552758 issue    
{
    "url": "https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/9/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1616440856 I_kwDOJHON9s5gWO4Y 5 Configure full text search simonw 9599 open 0     0 2023-03-09T05:20:46Z 2023-03-09T05:20:46Z   MEMBER  

FTS would be useful.

Maybe even extract the plain text from the notes to make that index easier to create, rather than creating it against the HTML. Can use the plaintext property for that.
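Configuring that with sqlite-utils would be a one-liner; a minimal sketch (the table and column names are assumptions about the eventual schema):

```python
import sqlite_utils

db = sqlite_utils.Database("notes.db")
# Index the extracted plain text, not the HTML body; triggers keep
# the FTS index in sync as notes are inserted or updated.
db["notes"].enable_fts(["plaintext"], create_triggers=True)
```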

apple-notes-to-sqlite 611552758 issue    
{
    "url": "https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/5/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1605959201 I_kwDOBm6k_c5fuP4h 2032 datasette errors when foreign key integrity is enabled cldellow 193185 open 0     0 2023-03-02T01:27:51Z 2023-03-02T01:31:58Z   CONTRIBUTOR  

By default, SQLite does not enforce foreign key constraints. I typically enable these checks by running:

```sql
PRAGMA foreign_keys = ON;
```

inside of a prepare_connection hook.

If a plugin causes the schema to change (eg datasette-scraper creating a new table, or datasette-edit-schema changing a column), then https://github.com/simonw/datasette/blob/0b4a28691468b5c758df74fa1d72a823813c96bf/datasette/utils/internal_db.py#L71-L77 will fail with:

FOREIGN KEY constraint failed

This could be resolved by either:

- deleting from the tables column last
- changing the schema so that the foreign keys have ON DELETE CASCADE

Let me know if you'd be open to a PR that addresses this -- since foreign key constraints aren't enabled by default, I guess it's questionable whether this is a bug. I think I can workaround this by inspecting the database parameter in prepare_connection and trying not to enable fkey checks on the _internal database.
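A minimal sketch of that workaround, using Datasette's documented prepare_connection hook signature (skipping the _internal database is the assumption being proposed, not current behavior):

```python
from datasette import hookimpl


@hookimpl
def prepare_connection(conn, database, datasette):
    # Leave Datasette's internal schema-tracking database alone, since
    # its delete order would trip FOREIGN KEY constraint checks.
    if database == "_internal":
        return
    conn.execute("PRAGMA foreign_keys = ON;")
```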

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2032/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1592327343 I_kwDOBm6k_c5e6Pyv 2029 Sorry Simon, didn't know how else to contact you llchristopherson 5804626 open 0     0 2023-02-20T19:02:53Z 2023-02-20T19:02:53Z   NONE  

Hi Simon,

Would you be willing to chat with me about Datasette? I have some questions. I am working on a project to evaluate data ingestion tools for a research organization and I ran across Datasette. I have looked through a lot of your documentation, but still have some questions, which are very specific. If you would be willing to write me back about this, my email is laura@renci.org.

Thanks, Laura

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2029/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1586980089 PR_kwDOBm6k_c5KF-by 2026 Avoid repeating primary key columns if included in _col args runderwood 8513 open 0     0 2023-02-16T04:16:25Z 2023-02-16T04:16:41Z   FIRST_TIME_CONTRIBUTOR simonw/datasette/pulls/2026

...while maintaining given order.

Fixes #1975 (if I'm understanding correctly).


:books: Documentation preview :books:: https://datasette--2026.org.readthedocs.build/en/2026/

datasette 107914493 pull    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2026/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
0  
1581218043 PR_kwDOBm6k_c5JyqPy 2025 Add database metadata to index.html template context palewire 9993 open 0     0 2023-02-12T11:16:58Z 2023-02-12T11:17:14Z   FIRST_TIME_CONTRIBUTOR simonw/datasette/pulls/2025

Fixes #2016


:books: Documentation preview :books:: https://datasette--2025.org.readthedocs.build/en/2025/

datasette 107914493 pull    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2025/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
0  
1577548579 I_kwDOBm6k_c5eB3sj 2021 Docker images for 1.0 alphas? meowcat 1563881 open 0     0 2023-02-09T09:35:52Z 2023-02-09T09:35:52Z   NONE  

Hi, would you consider putting 1.0alpha images on Dockerhub?

(Also, how usable are the alphas?)

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2021/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1575880841 I_kwDOBm6k_c5d7giJ 2020 Documentation refers to "off" setting; doesn't seem to work, "false" does dmick 1350673 open 0     0 2023-02-08T10:38:10Z 2023-02-08T10:38:10Z   NONE  

https://docs.datasette.io/en/stable/settings.html#suggest-facets, among others, suggests using "off" to disable the setting; however, this doesn't appear to work in the JSON config files, where it apparently needs to be a JSON boolean with the value true or false. Perhaps the Python code is more flexible? Either way, the documentation should probably mention it.
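For reference, a boolean form that the JSON config does accept, as a minimal sketch (assuming a settings.json read from a --config-dir directory):

```json
{
    "suggest_facets": false
}
```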

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2020/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1571711808 I_kwDOBm6k_c5drmtA 2018 `check_visibility` gives confusing (wrong?) results if permission is `None` cldellow 193185 open 0     0 2023-02-06T01:03:08Z 2023-02-06T01:03:46Z   CONTRIBUTOR  

I'm trying to gate access to an edit UI on the user having update-row on the underlying view or table.

I expected datasette.check_visibility to be a good way to do this:

```python
visible, private = await datasette.check_visibility(
    request.actor,
    permissions=[
        ("update-row", (database, table)),
    ],
)

if not visible:
    return None
```

But visible is returning true, even when there is no explicit update-row permission. (In this case, request.actor is None.)

Based on the update-row permissions docs, I expected this to be default deny, and so no explicit permission would result in false.

I think the root cause is that check_visibility calls ensure_permissions and expects it to throw if the permission is not available.

But ensure_permissions does not throw when permission_allowed returns None: https://github.com/simonw/datasette/blob/1.0a2/datasette/app.py#L825-L829
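Until that changes, a default-deny check can be written directly against permission_allowed; a minimal sketch that reuses the request/database/table names from the snippet above:

```python
allowed = await datasette.permission_allowed(
    request.actor,
    "update-row",
    resource=(database, table),
    default=False,  # treat "no opinion" (None) as deny
)
if not allowed:
    return None
```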

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2018/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1565179870 I_kwDOBm6k_c5dSr_e 2013 Datasette uses non-standard quoting for identifiers cldellow 193185 open 0     0 2023-02-01T00:05:39Z 2023-02-01T00:06:30Z   CONTRIBUTOR  

Related to #2001, but where #2001 was about literals, this is about identifiers

From https://www.sqlite.org/lang_keywords.html:

"keyword" A keyword in double-quotes is an identifier. [keyword] A keyword enclosed in square brackets is an identifier. This is not standard SQL. This quoting mechanism is used by MS Access and SQL Server and is included in SQLite for compatibility.

Datasette uses this quoting here -- https://github.com/simonw/datasette/blob/0b4a28691468b5c758df74fa1d72a823813c96bf/datasette/utils/__init__.py#L345-L349 -- in some of the other DB access code, and in some of the test fixtures.

Migrating to standard double quote identifiers would make it easier to get Datasette working with alternative backends.
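For reference, standard double-quote escaping is a one-liner; this sketch shows the portable alternative, not Datasette's current escape_sqlite():

```python
def escape_identifier(name):
    # Wrap in double quotes and double any embedded double quote,
    # per the SQL standard (works in SQLite, PostgreSQL and others).
    return '"{}"'.format(name.replace('"', '""'))


print(escape_identifier("table"))         # "table"
print(escape_identifier('weird "name"'))  # "weird ""name"""
```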

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2013/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1564774831 I_kwDOBm6k_c5dRJGv 2012 Missing space in database summary simonw 9599 open 0     0 2023-01-31T18:01:13Z 2023-01-31T18:01:13Z   OWNER  

Spotted this on an instance index page:

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2012/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1557599877 I_kwDODFE5qs5c1xaF 12 location history changes gerardrbentley 14809320 open 0     0 2023-01-26T03:57:25Z 2023-01-26T03:57:25Z   NONE  

Not sure if each download is unique, but I had to change some things to work with the Takeout zip I made on 2023-01-25.

filename changed from "Location History.json" to "Records.json"

"timestampMs" is not present, "timestamp" is roughly iso timestamp

```py
def get_timestamp_ms(raw_timestamp):
    try:
        return datetime.datetime.strptime(raw_timestamp, "%Y-%m-%dT%H:%M:%SZ").timestamp()
    except ValueError:
        return datetime.datetime.strptime(raw_timestamp, "%Y-%m-%dT%H:%M:%S.%fZ").timestamp()


def save_location_history(db, zf):
    location_history = json.load(
        zf.open("Takeout/Location History/Records.json")
    )
    db["location_history"].upsert_all(
        (
            {
                "id": id_for_location_history(row),
                "latitude": row["latitudeE7"] / 1e7,
                "longitude": row["longitudeE7"] / 1e7,
                "accuracy": row["accuracy"],
                "timestampMs": get_timestamp_ms(row["timestamp"]),
                "when": row["timestamp"],
            }
            for row in location_history["locations"]
        ),
        pk="id",
    )


def id_for_location_history(row):
    # We want an ID that is unique but can be sorted by in
    # date order - so we use the isoformat date + the first
    # 6 characters of a hash of the JSON
    first_six = hashlib.sha1(
        json.dumps(row, separators=(",", ":"), sort_keys=True).encode("utf8")
    ).hexdigest()[:6]
    return "{}-{}".format(
        row['timestamp'],
        first_six,
    )
```

example locations from mine

json { "latitudeE7": 427220206, "longitudeE7": -923423972, "accuracy": 10, "deviceTag": -1312429967, "deviceDesignation": "PRIMARY", "timestamp": "2019-01-08T23:31:50.867Z" }

json { "latitudeE7": 427011317, "longitudeE7": -923448300, "accuracy": 5, "deviceTag": -1312429967, "deviceDesignation": "PRIMARY", "timestamp": "2019-01-08T23:33:53Z" },

google-takeout-to-sqlite 206649770 issue    
{
    "url": "https://api.github.com/repos/dogsheep/google-takeout-to-sqlite/issues/12/reactions",
    "total_count": 2,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 2
}
   
1554032168 I_kwDOBm6k_c5coKYo 2002 Document how actors are displayed simonw 9599 open 0     0 2023-01-24T00:08:49Z 2023-01-24T00:08:49Z   OWNER  

https://github.com/simonw/datasette/blob/e4ebef082de90db4e1b8527abc0d582b7ae0bc9d/datasette/utils/__init__.py#L1052-L1056

This logic should be reflected in the documentation on https://docs.datasette.io/en/stable/authentication.html#actors

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2002/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1550536442 I_kwDOCGYnMM5ca076 521 Custom JSON encoder janrito 31504 open 0     0 2023-01-20T09:19:40Z 2023-01-20T09:19:40Z   NONE  

It would be nice if we could specify a custom encoder (and decoder) for types that will need extra deserialisation – e.g., sets, enums or sparse matrices – or even project-specific types.
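As a sketch of the kind of hook being asked for (standard library only; the encoder class name is illustrative):

```python
import json
from enum import Enum


class ExtendedEncoder(json.JSONEncoder):
    def default(self, obj):
        # Serialize types the stdlib encoder rejects.
        if isinstance(obj, set):
            return sorted(obj)
        if isinstance(obj, Enum):
            return obj.value
        return super().default(obj)


print(json.dumps({"tags": {"b", "a"}}, cls=ExtendedEncoder))
# {"tags": ["a", "b"]}
```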

sqlite-utils 140912432 issue    
{
    "url": "https://api.github.com/repos/simonw/sqlite-utils/issues/521/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1533673397 I_kwDOBm6k_c5baf-1 1991 fts5 tables are not auto-detected and hidden keturn 83819 open 0     0 2023-01-15T06:00:42Z 2023-01-20T04:54:24Z   NONE  

I set up a Datasette instance and was following the docs on full-text search.

When I used fts4, datasette automatically hid the FTS tables and added the FTS search box where appropriate, but when I changed to fts5 it no longer does either.

If I manually set fts_table for a view, then search does work as expected.

My table and view creation code looks like this:

```py
connection.execute("""CREATE TABLE IF NOT EXISTS captions(image_key text PRIMARY KEY, caption text NOT NULL)
""")
connection.execute("""CREATE VIRTUAL TABLE captions_fts USING fts5(caption, image_key UNINDEXED, content=captions)
""")
```
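For anyone hitting the same thing, search can be switched on manually with metadata; a sketch using the documented fts_table/fts_pk keys (the database name "my-database" is an assumption):

```json
{
    "databases": {
        "my-database": {
            "tables": {
                "captions": {
                    "fts_table": "captions_fts",
                    "fts_pk": "image_key"
                }
            }
        }
    }
}
```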

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1991/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1538197093 I_kwDOBm6k_c5brwZl 1995 foreign_keys error 500 jonschoning 137183 open 0     0 2023-01-18T15:27:36Z 2023-01-18T16:44:01Z   NONE  

Error 500: expected string or bytes-like object

espial-new.sqlite3.zip

Run datasette espial-new.sqlite3 and click on any table other than User.

```
/home/jon/.local/lib/python3.10/site-packages/datasette/app.py:814 in expand_foreign_keys

    811 │             from {other_table}
    812 │             where {other_column} in ({placeholders})
    813 │         """.format(
  ❱ 814 │             other_column=escape_sqlite(fk["other_column"]),
    815 │             label_column=escape_sqlite(label_column),
    816 │             other_table=escape_sqlite(fk["other_table"]),
    817 │             placeholders=", ".join(["?"] * len(set(values))),

    locals:
        column       = 'user_id'
        database     = 'espial-new'
        db           = <Database: espial-new (mutable, size=53248)>
        fk           = {'column': 'user_id', 'other_table': 'user', 'other_column': None}
        foreign_keys = [{'column': 'user_id', 'other_table': 'user', 'other_column': None}]
        label_column = 'name'
        labeled_fks  = {}
        self         = <datasette.app.Datasette object at 0x7f0f2e77e980>
        table        = 'bookmark'
        values       = []

/home/jon/.local/lib/python3.10/site-packages/datasette/utils/__init__.py:346 in escape_sqlite

    345 def escape_sqlite(s):
  ❱ 346 │   if _boring_keyword_re.match(s) and (s.lower() not in reserved_words):
    347 │       return s
    348 │   else:
    349 │       return f"[{s}]"

    locals:
        s = None

TypeError: expected string or bytes-like object

Traceback (most recent call last):
  File "/home/jon/.local/lib/python3.10/site-packages/datasette/app.py", line 1354, in route_path
    response = await view(request, send)
  File "/home/jon/.local/lib/python3.10/site-packages/datasette/views/base.py", line 134, in view
    return await self.dispatch_request(request)
  File "/home/jon/.local/lib/python3.10/site-packages/datasette/views/base.py", line 91, in dispatch_request
    return await handler(request)
  File "/home/jon/.local/lib/python3.10/site-packages/datasette/views/base.py", line 361, in get
    response_or_template_contexts = await self.data(request, **data_kwargs)
  File "/home/jon/.local/lib/python3.10/site-packages/datasette/views/table.py", line 158, in data
    return await self._data_traced(request, default_labels, _next, _size)
  File "/home/jon/.local/lib/python3.10/site-packages/datasette/views/table.py", line 603, in _data_traced
    await self.ds.expand_foreign_keys(
  File "/home/jon/.local/lib/python3.10/site-packages/datasette/app.py", line 814, in expand_foreign_keys
    other_column=escape_sqlite(fk["other_column"]),
  File "/home/jon/.local/lib/python3.10/site-packages/datasette/utils/__init__.py", line 346, in escape_sqlite
    if _boring_keyword_re.match(s) and (s.lower() not in reserved_words):
TypeError: expected string or bytes-like object
INFO:     127.0.0.1:38574 - "GET /espial-new/bookmark HTTP/1.1" 500 Internal Server Error
INFO:     127.0.0.1:38574 - "GET /-/static/app.css?d59929 HTTP/1.1" 200 OK
```

Schema:

```sql
CREATE TABLE IF NOT EXISTS "user" (
    "id" INTEGER PRIMARY KEY,
    "name" VARCHAR NOT NULL,
    "password_hash" VARCHAR NOT NULL,
    "api_token" VARCHAR NULL,
    "private_default" BOOLEAN NOT NULL,
    "archive_default" BOOLEAN NOT NULL,
    "privacy_lock" BOOLEAN NOT NULL,
    CONSTRAINT "unique_user_name" UNIQUE ("name")
);

CREATE TABLE IF NOT EXISTS "bookmark" (
    "id" INTEGER PRIMARY KEY,
    "user_id" INTEGER NOT NULL REFERENCES "user" ON DELETE RESTRICT ON UPDATE RESTRICT,
    "slug" VARCHAR NOT NULL DEFAULT (Lower(Hex(Randomblob(6)))),
    "href" VARCHAR NOT NULL,
    "description" VARCHAR NOT NULL,
    "extended" VARCHAR NOT NULL,
    "time" TIMESTAMP NOT NULL,
    "shared" BOOLEAN NOT NULL,
    "to_read" BOOLEAN NOT NULL,
    "selected" BOOLEAN NOT NULL,
    "archive_href" VARCHAR NULL,
    CONSTRAINT "unique_user_href" UNIQUE ("user_id", "href"),
    CONSTRAINT "unique_user_slug" UNIQUE ("user_id", "slug")
);

CREATE TABLE IF NOT EXISTS "bookmark_tag" (
    "id" INTEGER PRIMARY KEY,
    "user_id" INTEGER NOT NULL REFERENCES "user" ON DELETE RESTRICT ON UPDATE RESTRICT,
    "tag" VARCHAR NOT NULL,
    "bookmark_id" INTEGER NOT NULL REFERENCES "bookmark" ON DELETE RESTRICT ON UPDATE RESTRICT,
    "seq" INTEGER NOT NULL,
    CONSTRAINT "unique_user_tag_bookmark_id" UNIQUE ("user_id", "tag", "bookmark_id"),
    CONSTRAINT "unique_user_bookmark_id_tag_seq" UNIQUE ("user_id", "bookmark_id", "tag", "seq")
);

CREATE TABLE IF NOT EXISTS "note" (
    "id" INTEGER PRIMARY KEY,
    "user_id" INTEGER NOT NULL REFERENCES "user" ON DELETE RESTRICT ON UPDATE RESTRICT,
    "slug" VARCHAR NOT NULL DEFAULT (Lower(Hex(Randomblob(10)))),
    "length" INTEGER NOT NULL,
    "title" VARCHAR NOT NULL,
    "text" VARCHAR NOT NULL,
    "is_markdown" BOOLEAN NOT NULL,
    "shared" BOOLEAN NOT NULL DEFAULT false,
    "created" TIMESTAMP NOT NULL,
    "updated" TIMESTAMP NOT NULL
);

CREATE INDEX idx_bookmark_time ON bookmark (user_id, time DESC);
CREATE INDEX idx_bookmark_tag_bookmark_id ON bookmark_tag (bookmark_id, id, tag, seq);
CREATE INDEX idx_note_user_created ON note (user_id, created DESC);
```

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1995/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1532000914 I_kwDOBm6k_c5bUHqS 1990 Suggestion: Highlight error messages ('These facets timed out') pax 116795 open 0     0 2023-01-13T09:40:58Z 2023-01-13T09:40:58Z   NONE  

I had trouble figuring out why faceting didn't work in some instances; it took a while before I noticed the "These facets timed out" notice.

It might help if that notice were highlighted, perhaps with a fading highlight if a permanent one would be too visually distracting.

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1990/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1509783085 I_kwDOBm6k_c5Z_XYt 1969 sql-formatter javascript is not now working with CloudFlare rocketloader fgregg 536941 open 0     0 2022-12-23T21:14:06Z 2023-01-10T01:56:33Z   CONTRIBUTOR  

This is probably not a bug with datasette, but I thought you might want to know, @simonw.

I noticed today that my CloudFlare proxied datasette instance lost the "Format SQL" option. I'm pretty sure it was there last week.

In the CloudFlare settings, if I turn off Rocket Loader, I get the "Format SQL" option back.

Rocket Loader works by asynchronously loading the javascript, so maybe there was a recent change that doesn't play well with the async loading?

I'm up to date with https://github.com/simonw/datasette/commit/e03aed00026cc2e59c09ca41f69a247e1a85cc89
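For reference, Cloudflare documents a per-script opt-out from Rocket Loader via the data-cfasync attribute, so a possible fix would be to serve the formatter script like this (the src path here is illustrative):

```html
<!-- data-cfasync="false" tells Rocket Loader to leave this script alone -->
<script src="/-/static/sql-formatter.min.js" data-cfasync="false"></script>
```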

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1969/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1524431805 I_kwDODEm0Qs5a3Pu9 72 Import thread, including self- and others' replies mcint 601708 open 0     0 2023-01-08T09:51:06Z 2023-01-08T09:51:06Z   NONE  

statuses-lookup, home-timeline, mentions (only for auth'ed user) don't cover this.

twitter-to-sqlite fetch-thread tw-group1.db 1234123412341234

twitter-to-sqlite focuses on archiving users, but does not easily support archiving conversations or community activity.

For reference, this is implemented in twarc, using a search, optionally recursively.

Other research suggests that this formerly, or currently, requires a search query, use of the undocumented related_results API, or requesting inclusion of the newer conversation_id field in a subsequent query.

twitter-to-sqlite 206156866 issue    
{
    "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/72/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1516815571 I_kwDOBm6k_c5aaMTT 1975 _col=id can cause id column to export twice in CSV export simonw 9599 open 0     0 2023-01-03T00:25:15Z 2023-01-03T00:25:21Z   OWNER  

https://datasette.simonwillison.net/simonwillisonblog/blog_entry.csv?_col=id&_col=title&_col=body&_labels=on&_size=1

```csv
id,id,title,body
1,1,WaSP Phase II,"<p>The <a href=""http://www.webstandards.org/"">Web Standards</a> project has launched Phase II.</p>"
```

That should not have two id columns.
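A fix would presumably deduplicate the requested ?_col= values against the columns that are always included, preserving order; a minimal sketch of that logic (not Datasette's actual code):

```python
def dedupe_columns(always_included, requested):
    # Keep order, drop anything already included (e.g. the primary key)
    seen = set(always_included)
    out = list(always_included)
    for col in requested:
        if col not in seen:
            seen.add(col)
            out.append(col)
    return out

# dedupe_columns(["id"], ["id", "title", "body"]) -> ["id", "title", "body"]
```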

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1975/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1515717718 PR_kwDOC8tyDs5Gc-VH 23 Include workout statistics badboy 2129 open 0     0 2023-01-01T17:29:57Z 2023-01-01T17:29:57Z   FIRST_TIME_CONTRIBUTOR dogsheep/healthkit-to-sqlite/pulls/23

Not sure when this changed (iOS 16 maybe?), but the WorkoutStatistics now has a whole bunch of information about workouts, e.g. for runs it contains the distance (as a <WorkoutStatistics type="HKQuantityTypeIdentifierDistanceWalkingRunning ...> element).

Adding it as another column at least allows me to pull these out (using SQLite's JSON support). I'm running with this patch on my own data now.
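For example, with the raw statistics landed in a workout_statistics column (the column name and JSON shape here are assumptions), SQLite's JSON1 functions can reach into it:

```sql
-- Illustrative only: adjust the JSON path to the stored structure
select
  id,
  json_extract(workout_statistics, '$[0].sum') as distance_sum
from workouts
limit 10;
```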

healthkit-to-sqlite 197882382 pull    
{
    "url": "https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/23/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
0  
1513238455 PR_kwDODEm0Qs5GUoPm 71 Archive: Fix "ni devices" typo in importer sometimes-i-send-pull-requests 26161409 open 0     0 2022-12-28T23:33:31Z 2022-12-28T23:33:31Z   FIRST_TIME_CONTRIBUTOR dogsheep/twitter-to-sqlite/pulls/71   twitter-to-sqlite 206156866 pull    
{
    "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/71/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
0  
1513238314 PR_kwDODEm0Qs5GUoN6 70 Archive: Import Twitter Circle data sometimes-i-send-pull-requests 26161409 open 0     0 2022-12-28T23:33:09Z 2022-12-28T23:33:09Z   FIRST_TIME_CONTRIBUTOR dogsheep/twitter-to-sqlite/pulls/70   twitter-to-sqlite 206156866 pull    
{
    "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/70/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
0  
1513238152 PR_kwDODEm0Qs5GUoMM 69 Archive: Import new tweets table name sometimes-i-send-pull-requests 26161409 open 0     0 2022-12-28T23:32:44Z 2022-12-28T23:32:44Z   FIRST_TIME_CONTRIBUTOR dogsheep/twitter-to-sqlite/pulls/69

Given the code here, it seems like in the past this file was named "tweet.js". In recent exports, it's named "tweets.js". The archive importer needs to be modified to take this into account. Existing logic is reused for importing this table. (However, the resulting table name will be different, matching the different file name -- archive_tweets, rather than archive_tweet).
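A sketch of how the importer could try both names (illustrative, not the actual diff in this PR):

```python
def open_tweets_file(zf):
    # Newer archives use tweets.js; older ones used tweet.js.
    # The data/ prefix is an assumption about the archive layout.
    for name in ("data/tweets.js", "data/tweet.js"):
        try:
            return name, zf.open(name)
        except KeyError:
            continue
    raise FileNotFoundError("no tweets.js or tweet.js in archive")
```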

twitter-to-sqlite 206156866 pull    
{
    "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/69/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
0  
1513237982 PR_kwDODEm0Qs5GUoKL 68 Archive: Import mute table sometimes-i-send-pull-requests 26161409 open 0     0 2022-12-28T23:32:06Z 2022-12-28T23:32:06Z   FIRST_TIME_CONTRIBUTOR dogsheep/twitter-to-sqlite/pulls/68   twitter-to-sqlite 206156866 pull    
{
    "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/68/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
0  
1513237712 PR_kwDODEm0Qs5GUoG_ 67 Add support for app-only bearer tokens sometimes-i-send-pull-requests 26161409 open 0     0 2022-12-28T23:31:20Z 2022-12-28T23:31:20Z   FIRST_TIME_CONTRIBUTOR dogsheep/twitter-to-sqlite/pulls/67

Previously, twitter-to-sqlite only supported OAuth1 authentication, and the token must be on behalf of a user. However, Twitter also supports application-only bearer tokens, documented here: https://developer.twitter.com/en/docs/authentication/oauth-2-0/bearer-tokens This PR adds support to twitter-to-sqlite for using application-only bearer tokens. To use, the auth.json file just needs to contain a "bearer_token" key instead of "api_key", "api_secret_key", etc.
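With this change, an auth.json for app-only authentication would presumably be as simple as:

```json
{
    "bearer_token": "AAAA...your-app-only-bearer-token..."
}
```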

twitter-to-sqlite 206156866 pull    
{
    "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/67/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
0  
1505411725 I_kwDODFdgUs5ZusKN 78 self-hosted or corp github enterprise ebdavison 549431 open 0     0 2022-12-20T22:51:45Z 2022-12-20T22:51:45Z   NONE  

We use github enterprise at work and I would like to use this tool to pull info from that site rather than the public github.com instance. Is there an option for this? If not, can one be added for a custom repo URL?

github-to-sqlite 207052882 issue    
{
    "url": "https://api.github.com/repos/dogsheep/github-to-sqlite/issues/78/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1504352503 I_kwDOBm6k_c5Zqpj3 1968 Allow to hide some queries in metadata.yml CharlesNepote 562352 open 0     0 2022-12-20T10:45:41Z 2022-12-20T10:45:41Z   NONE  

By default all queries are displayed.

But there are many cases where it would be interesting to hide the queries by default:

  • the website is targeting non-tech people
  • the query is very long (e.g.)
  • reading the query is not important for the users, they only want to see the result

Of course, the user still could have the option to see the query.

It could be an option in the metadata file:

```yml
databases:
  awesome_db:
    tables:
      products:
        hide_sql: true
    queries:
      great_query:
        hide_sql: true
        sql: select * from products where code = :barcode
```

The priority could be:

  • no option in the metadata and nothing in the URL: query displayed
  • hide_sql in the metadata and nothing in the URL: query hidden, as asked in the metadata
  • hide_sql in the metadata and &_hide_sql= in the URL: query shown or hidden as asked in the URL

See also: #1824

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1968/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1468495358 I_kwDOBm6k_c5Xh3X- 1910 Check incoming column types on various write APIs simonw 9599 open 0   Datasette 1.0a-next 8755003 0 2022-11-29T18:09:10Z 2022-12-13T05:29:09Z   OWNER  

I do think this needs type checking - I just tried and you really can send a string to an integer column and have it work, which feels bad.

Originally posted by @simonw in https://github.com/simonw/datasette/issues/1863#issuecomment-1331089156

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1910/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1447465004 I_kwDOBm6k_c5WRpAs 1889 Ability to create new tokens via the API simonw 9599 open 0   Datasette 1.0a-next 8755003 0 2022-11-14T06:21:36Z 2022-12-13T05:29:08Z   OWNER  

Refs: - #1850

Initially I decided that the API shouldn't be able to create new tokens at all - I don't like the idea of an API token holder creating themselves additional tokens.

Then I realized that two of the API features are specifically more useful if you can generate fresh tokens via the API:

  • Tokens that expire after a time limit are MUCH more useful if they can be automatically generated
  • Likewise, tokens that are restricted to a subset of permissions (see #1855) make more sense to be generated like this, especially in conjunction with expiry times
datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1889/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1216436131 I_kwDOBm6k_c5IgVej 1721 Implement plugin hooks: `register_table_extras`, `register_row_extras`, `register_query_extras` simonw 9599 open 0   Datasette 1.0a-next 8755003 0 2022-04-26T20:21:49Z 2022-12-13T05:29:07Z   OWNER  

Designed in: - #1720

Part of: - #262 - #1709

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1721/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1485017981 I_kwDODEpn8M5Yg5N9 2 table identifications has no column named previous_observation_taxon heaversm 520541 open 0     0 2022-12-08T16:47:17Z 2022-12-08T16:47:17Z   NONE  

Installed successfully with pip and ran inaturalist-to-sqlite inaturalist.db simonw and got the error:

sqlite3.OperationalError: table identifications has no column named previous_observation_taxon
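A possible workaround until the tool catches up with the newer API payload (untested, but add-column is a standard sqlite-utils command) is to add the missing column by hand and re-run the import:

```bash
sqlite-utils add-column inaturalist.db identifications previous_observation_taxon text
```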

inaturalist-to-sqlite 206202864 issue    
{
    "url": "https://api.github.com/repos/dogsheep/inaturalist-to-sqlite/issues/2/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1479920517 I_kwDOBm6k_c5YNcuF 1934 Return number of ignored/replaced items from /-/insert simonw 9599 open 0   Datasette 1.0 3268330 0 2022-12-06T19:01:58Z 2022-12-06T19:02:03Z   OWNER  

Idea from here: - https://github.com/simonw/sqlite-utils/issues/516

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1934/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1469821027 I_kwDOBm6k_c5Xm7Bj 1921 Document methods to get canned queries eyeseast 25778 open 0     0 2022-11-30T15:26:33Z 2022-11-30T23:34:21Z   CONTRIBUTOR  

Two methods will get canned queries for a Datasette instance:

Datasette.get_canned_queries will return all canned queries for a database that an actor can see.

Datasette.get_canned_query will return a single canned query by name.
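Usage from a plugin might look like this, assuming both methods are async and take a database name plus an actor (database and query names here are illustrative):

```python
# Inside an async plugin hook that receives `datasette` and `request`
queries = await datasette.get_canned_queries("fixtures", actor=request.actor)
query = await datasette.get_canned_query(
    "fixtures", "neighborhood_search", actor=request.actor
)
```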

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1921/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1469796454 I_kwDOBm6k_c5Xm1Bm 1920 Document Datasette.metadata() method eyeseast 25778 open 0     0 2022-11-30T15:10:36Z 2022-11-30T15:10:36Z   CONTRIBUTOR  

Code is here: https://github.com/simonw/datasette/blob/main/datasette/app.py#L503

This will be the official way to access metadata from plugins.
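Usage might look like this; the keyword arguments are inferred from the linked code rather than from published docs:

```python
# Top-level metadata key
title = datasette.metadata("title")
# Scoped to a specific database and table
source = datasette.metadata(
    "source", database="fixtures", table="roadside_attractions"
)
```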

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1920/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1456013930 I_kwDOBm6k_c5WyQJq 1906 Extract publish Heroku support to a plugin simonw 9599 open 0   Datasette 1.0 3268330 0 2022-11-19T00:02:51Z 2022-11-19T00:03:10Z   OWNER  

This is a strong argument for extracting the Heroku support out to a plugin - it would allow this to be fixed with a plugin release without needing to push a full release of Datasette itself.

Originally posted by @simonw in https://github.com/simonw/datasette/issues/1905#issuecomment-1320678715

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1906/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1393330070 PR_kwDODD6af84__DNJ 14 Photo links redmanmale 6782721 open 0     0 2022-10-01T09:44:15Z 2022-11-18T17:10:49Z   FIRST_TIME_CONTRIBUTOR dogsheep/swarm-to-sqlite/pulls/14
  • add to checkin_details view new column for a calculated photo links
  • supported multiple links split by newline
  • create events table if there's no events in the history to avoid SQL errors

Fixes #9.

swarm-to-sqlite 205429375 pull    
{
    "url": "https://api.github.com/repos/dogsheep/swarm-to-sqlite/issues/14/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
0  
1454532488 I_kwDOBm6k_c5WsmeI 1902 Document {% block crumbs %} for plugin authors simonw 9599 open 0   Datasette 1.0 3268330 0 2022-11-18T06:16:30Z 2022-11-18T06:16:39Z   OWNER  

For datasette-copyable I want to show breadcrumbs that take database/instance permissions into account, so I'm removing {% block nav %} entirely and replacing it with this:

```html+jinja
{% block crumbs %}
{{ crumbs.nav(request=request, database=database, table=table) }}
{% endblock %}
```

Originally posted by @simonw in https://github.com/simonw/datasette/issues/1901#issuecomment-1319588163

I should document this.

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1902/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1453134846 I_kwDOCGYnMM5WnRP- 513 Add or document streamlined workflow for importing Datasette csv / json exports henry501 19328961 open 0     0 2022-11-17T10:54:47Z 2022-11-17T10:54:47Z   NONE  

I'm working on some small front-end enhancements to the laion-aesthetic-datasette project, and I wanted to partially populate a database directly using exports from the existing Datasette instance instead of downloading the parquet files and creating my own multi-GB database.

There have been a number of small issues that are certainly related to my relative lack of familiarity with the toolkit, but that are still surprising.

For example: a CSV export of the images table (http://laion-aesthetic.datasette.io/laion-aesthetic-6pls.csv?sql=select+rowid%2C+url%2C+text%2C+domain_id%2C+width%2C+height%2C+similarity%2C+punsafe%2C+pwatermark%2C+aesthetic%2C+hash%2C+index_level_0+from+images+order+by+random%28%29+limit+100) has nested single quotes, double quotes, and commas that aren't handled by rows_from_file. Similarly, the json output has to be manually transformed to add the column names and remove extraneous information before sqlite_utils can import it.

I was able to work through these issues, but as an enhancement it would be really helpful to create or document a clear workflow that avoids the friction of this data transformation.
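For the CSV side, one possible streamlined path is the existing sqlite-utils CLI, which applies standard CSV quoting rules on import (file and table names here are illustrative):

```bash
# Export a sample from the hosted instance, then import it locally
curl -o images.csv "http://laion-aesthetic.datasette.io/laion-aesthetic-6pls.csv?sql=select+*+from+images+limit+100"
sqlite-utils insert images.db images images.csv --csv
```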

sqlite-utils 140912432 issue    
{
    "url": "https://api.github.com/repos/simonw/sqlite-utils/issues/513/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1452360613 I_kwDOBm6k_c5WkUOl 1895 Avoid using host name when building absolute URLs? hubgit 14294 open 0     0 2022-11-16T22:21:27Z 2022-11-16T22:21:27Z   NONE  

When deploying Datasette to Cloud Run and rewriting certain routes from a Firebase app to the Cloud Run service, some of the URLs in the page start with https://[service].run.app rather than the (custom) domain of the Firebase app.

I guess this is because a) the custom domain of the Firebase app isn't being passed through in the host header of the request to the Cloud Run instance and b) the absolute_url function in Datasette is using information from the request to build the URL.

Would it be possible to not use the host name when building the absolute URLs, i.e. only include the path in the URL?

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1895/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1446657889 I_kwDOBm6k_c5WOj9h 1885 Integrate inside GUI app (tkinter) dmalves 5115787 open 0     0 2022-11-13T00:10:43Z 2022-11-13T00:11:09Z   NONE  

Hi, I'd like to integrate Datasette inside a tkinter app, which should be able to start and stop the Datasette server. How could I do that?
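One possible approach, sketched under the assumption that running Datasette's ASGI app in a background uvicorn server is acceptable (this is not an officially documented embedding pattern):

```python
import threading
import tkinter as tk

import uvicorn
from datasette.app import Datasette

ds = Datasette(["mydata.db"])  # illustrative database file
config = uvicorn.Config(ds.app(), host="127.0.0.1", port=8001, log_level="warning")
server = uvicorn.Server(config)  # recent uvicorn skips signal handlers off the main thread

root = tk.Tk()
tk.Button(
    root,
    text="Start Datasette",
    command=lambda: threading.Thread(target=server.run, daemon=True).start(),
).pack()
tk.Button(
    root,
    text="Stop Datasette",
    command=lambda: setattr(server, "should_exit", True),  # asks uvicorn to shut down
).pack()
root.mainloop()
```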

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1885/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1424980545 I_kwDOBm6k_c5U73pB 1861 request.headers.get("Content-Type") fails simonw 9599 open 0     0 2022-10-27T03:39:12Z 2022-10-27T03:39:12Z   OWNER  

Turns out this is case-sensitive, needs to be:

request.headers.get("content-type") != "application/json"

That's not great usability. It should be case insensitive.
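HTTP header names are case-insensitive per the spec, so the usual fix is to normalise on lookup; a minimal sketch:

```python
class CaseInsensitiveHeaders:
    def __init__(self, raw):
        # Normalise header names to lower case once, at construction
        self._headers = {k.lower(): v for k, v in raw.items()}

    def get(self, name, default=None):
        return self._headers.get(name.lower(), default)

headers = CaseInsensitiveHeaders({"Content-Type": "application/json"})
assert headers.get("content-type") == "application/json"
assert headers.get("CONTENT-TYPE") == "application/json"
```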

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1861/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1410548368 I_kwDODFdgUs5UE0KQ 77 Feature: Support GitHub discussions frosencrantz 631242 open 0     0 2022-10-16T16:53:38Z 2022-10-16T16:53:38Z   CONTRIBUTOR  

Hi @simonw I've been a happy user of this tool. Thank you for writing it and sharing it.

I wanted to suggest a feature request to support Discussions. For example the VisiData project has discussions https://github.com/saulpw/visidata/discussions , and it would be useful if there was a way to pull that data into the database.

However, I'm not offering a pull request.
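For anyone picking this up: the data is exposed through GitHub's GraphQL API, and a query along these lines (field selection illustrative) returns a repo's discussions:

```graphql
query {
  repository(owner: "saulpw", name: "visidata") {
    discussions(first: 20) {
      nodes {
        number
        title
        body
        createdAt
        author { login }
      }
    }
  }
}
```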

github-to-sqlite 207052882 issue    
{
    "url": "https://api.github.com/repos/dogsheep/github-to-sqlite/issues/77/reactions",
    "total_count": 2,
    "+1": 2,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1406860394 I_kwDOBm6k_c5T2vxq 1841 Drop format_bytes for Jinja filesizeformat filter simonw 9599 open 0     0 2022-10-12T22:06:34Z 2022-10-12T22:06:34Z   OWNER  

Turns out this isn't necessary:

https://github.com/simonw/datasette/blob/5aa359b86907d11b3ee601510775a85a90224da8/datasette/utils/__init__.py#L849-L858

I can use this instead: https://jinja.palletsprojects.com/en/3.1.x/templates/#jinja-filters.filesizeformat
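In a template that is just the built-in filter (the variable name here is illustrative):

```html+jinja
{{ size_in_bytes | filesizeformat }}
{# e.g. 110000 renders as "110.0 kB" #}
```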

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1841/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1399933513 I_kwDOBm6k_c5TcUpJ 1833 Ability to submit long queries by POST simonw 9599 open 0     0 2022-10-06T16:03:26Z 2022-10-06T16:18:00Z   OWNER  

Datasette doesn't limit URL lengths but some common web proxies do - the one in front of Google Cloud Run for example limits to 8KB total for incoming request headers: https://cloud.google.com/load-balancing/docs/quotas#https-lb-header-limits

This means longer SQL queries can break!

Need an optional mechanism for submitting queries by POST instead.

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1833/reactions",
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1396977994 I_kwDOBm6k_c5TRDFK 1830 Add documentation for writing tests with signed actor cookies simonw 9599 open 0     0 2022-10-04T23:51:26Z 2022-10-04T23:51:26Z   OWNER  

I use this pattern in a lot of plugin tests, e.g. https://github.com/simonw/datasette-edit-templates/blob/087f6a6cabc20020f2b0524f11aa3a7836320848/tests/test_edit_templates.py#L55-L58

```python
actor = ds.sign({"a": {"id": "root"}}, "actor")
response1 = await ds.client.get(
    "/-/edit-templates/_footer.html", cookies={"ds_actor": actor}
)
```

I should add this to the documentation on this page: https://docs.datasette.io/en/latest/testing_plugins.html

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1830/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1378636455 I_kwDOBm6k_c5SLFKn 1815 `datasette publish provider .` to publish whole directory, similar to configuration directory mode simonw 9599 open 0     0 2022-09-19T23:28:59Z 2022-09-19T23:29:11Z   OWNER  

I haven't done this with any of my other datasette publish tools, but I do think it's a good idea. Being able to publish the entire directory - with templates and plugins and metadata - does seem very useful to me.

Originally posted by @simonw in https://github.com/simonw/datasette-publish-fly/issues/23#issuecomment-1251673489

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1815/reactions",
    "total_count": 2,
    "+1": 2,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1375792876 I_kwDOBm6k_c5SAO7s 1811 Drop-down menu with "REGEXP" choice CharlesNepote 562352 open 0     0 2022-09-16T11:06:18Z 2022-09-16T15:30:31Z   NONE  

The drop-down menu below could add a "REGEXP" choice when the REGEXP SQLite extension is installed and usable.

Not sure. Close the issue if you don't find it relevant.
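For context, once a regexp() function is registered (for example via the datasette-rure plugin), the operator reads like this:

```sql
select * from products
where name regexp '^organic .*juice$';
```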

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1811/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   


CREATE TABLE [issues] (
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [number] INTEGER,
   [title] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [state] TEXT,
   [locked] INTEGER,
   [assignee] INTEGER REFERENCES [users]([id]),
   [milestone] INTEGER REFERENCES [milestones]([id]),
   [comments] INTEGER,
   [created_at] TEXT,
   [updated_at] TEXT,
   [closed_at] TEXT,
   [author_association] TEXT,
   [pull_request] TEXT,
   [body] TEXT,
   [repo] INTEGER REFERENCES [repos]([id]),
   [type] TEXT
, [active_lock_reason] TEXT, [performed_via_github_app] TEXT, [reactions] TEXT, [draft] INTEGER, [state_reason] TEXT);
CREATE INDEX [idx_issues_repo]
                ON [issues] ([repo]);
CREATE INDEX [idx_issues_milestone]
                ON [issues] ([milestone]);
CREATE INDEX [idx_issues_assignee]
                ON [issues] ([assignee]);
CREATE INDEX [idx_issues_user]
                ON [issues] ([user]);