
issues


1,936 rows where repo = 107914493 sorted by updated_at descending




type 2

  • issue 1,558
  • pull 378

state 2

  • closed 1,441
  • open 495

repo 1

  • datasette · 1,936
id node_id number title user state locked assignee milestone comments created_at updated_at ▲ closed_at author_association pull_request body repo type active_lock_reason performed_via_github_app reactions draft state_reason
1633077183 I_kwDOBm6k_c5hVse_ 2041 Remove obsolete table POST code simonw 9599 open 0   Datasette 1.0a3 8755003 1 2023-03-21T01:01:40Z 2023-03-21T01:02:13Z   OWNER  

Spotted this in: - #1999

POST /db/table currently executes obsolete code for inserting a row - I replaced that with /db/table/-/insert in https://github.com/simonw/datasette/commit/6e788b49edf4f842c0817f006eb9d865778eea5e but forgot to remove the old code.

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2041/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1551694938 PR_kwDOBm6k_c5IQeKz 1999 ?_extra= support (draft) simonw 9599 open 0     42 2023-01-21T04:55:18Z 2023-03-21T00:07:53Z   OWNER simonw/datasette/pulls/1999

Refs: - #262


:books: Documentation preview :books:: https://datasette--1999.org.readthedocs.build/en/1999/

datasette 107914493 pull    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1999/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
1  
1620515757 I_kwDOBm6k_c5glxut 2039 Subtle bug with `--load-extension` and `--static` flags with absolute Windows paths with `C:\` asg017 15178711 open 0     0 2023-03-12T21:18:52Z 2023-03-12T21:18:52Z   CONTRIBUTOR  

From the Datasette Discord: a user tried running the following command on Windows:

datasette --load-extension="C:\spatialite\mod_spatialite-5.0.1-win-x86\mod_spatialite.dll"

This failed with "The specified module could not be found", because the entrypoint option introduced in #1789 splits the input on the colon. Instead of loading the extension found at "C:\spatialite\mod_spatialite-5.0.1-win-x86\mod_spatialite.dll", it instead tried to load the extension at "C" with entrypoint "\spatialite\mod_spatialite-5.0.1-win-x86\mod_spatialite.dll".

This is hard because most absolute Windows paths have a colon in them, like C:\foo.txt or D:\bar.txt. I'd imagine the --static flag is also vulnerable to this type of bug.

The "solution" is to use a relative path instead, but that doesn't feel that great.

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2039/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
317001500 MDU6SXNzdWUzMTcwMDE1MDA= 236 datasette publish lambda plugin simonw 9599 open 0     11 2018-04-23T22:10:30Z 2023-03-12T14:04:15Z   OWNER  

Refs #217 - create a publish plugin that can deploy to AWS Lambda.

https://docs.aws.amazon.com/lambda/latest/dg/limits.html says lambda packages can be up to 50 MB, so this would only work with smaller databases (the command can check the filesize before attempting to package and deploy it).

Lambdas do get a 512 MB /tmp directory too, so for larger databases the function could start up and then download up to 512 MB from an S3 bucket. The plugin could take an optional S3 bucket to write to, upload the .db file there, and have the Lambda download it on startup.
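
A minimal sketch of the file-size check mentioned above (hypothetical function and constant names):

```python
import os

import click

LAMBDA_ZIP_LIMIT_MB = 50  # package size limit from the AWS docs linked above


def check_database_fits(filepath):
    # Sketch of the pre-deploy check: refuse to package a database that
    # can't fit inside a Lambda deployment bundle.
    size_mb = os.path.getsize(filepath) / (1024 * 1024)
    if size_mb > LAMBDA_ZIP_LIMIT_MB:
        raise click.ClickException(
            f"{filepath} is {size_mb:.1f} MB, over the {LAMBDA_ZIP_LIMIT_MB} MB Lambda limit"
        )
```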

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/236/reactions",
    "total_count": 2,
    "+1": 2,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1618249044 I_kwDOBm6k_c5gdIVU 2038 Consider a `strict_templates` setting simonw 9599 open 0     2 2023-03-10T02:09:13Z 2023-03-10T02:11:06Z   OWNER  

A setting which turns on Jinja strict mode, so any templates that access undefined variables raise a hard error.

Prototype here:

```diff
diff --git a/datasette/app.py b/datasette/app.py
index 40416713..1428a3f0 100644
--- a/datasette/app.py
+++ b/datasette/app.py
@@ -200,6 +200,7 @@ SETTINGS = (
         "Allow display of SQL trace debug information with ?_trace=1",
     ),
     Setting("base_url", "/", "Datasette URLs should use this base path"),
+    Setting("strict_templates", False, "Raise errors for undefined template variables"),
 )
 _HASH_URLS_REMOVED = "The hash_urls setting has been removed, try the datasette-hashed-urls plugin instead"
 OBSOLETE_SETTINGS = {
@@ -399,11 +400,14 @@ class Datasette:
                 ),
             ]
         )
+        env_extras = {}
+        if self.setting("strict_templates"):
+            env_extras["undefined"] = StrictUndefined
         self.jinja_env = Environment(
             loader=template_loader,
             autoescape=True,
             enable_async=True,
-            undefined=StrictUndefined,
+            **env_extras,
         )
         self.jinja_env.filters["escape_css_string"] = escape_css_string
         self.jinja_env.filters["quote_plus"] = urllib.parse.quote_plus
```

Explored this idea a bit in: - #1999

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2038/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1605481359 PR_kwDOBm6k_c5LDwrF 2031 Expand foreign key references in row view as well tmcl-it 82332573 open 0     4 2023-03-01T18:43:09Z 2023-03-09T22:35:30Z   FIRST_TIME_CONTRIBUTOR simonw/datasette/pulls/2031

Unlike the table view, the single row view does not resolve foreign key references into labels. This patch extracts the foreign key reference expansion code from TableView.data() into a standalone function that is then called by both TableView.data() and RowView.data().


:books: Documentation preview :books:: https://datasette--2031.org.readthedocs.build/en/2031/

datasette 107914493 pull    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2031/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
0  
1613974869 PR_kwDOBm6k_c5LgPS- 2034 remove an unused `app` var in cli.py wenhoujx 4370201 open 0     1 2023-03-07T18:19:05Z 2023-03-09T22:34:29Z   FIRST_TIME_CONTRIBUTOR simonw/datasette/pulls/2034

This app var isn't actually used? Unless initializing it has some side effect outside of the event loop, I don't think it's necessary.

Feel free to ignore this PR if the deleted line actually does something.


:books: Documentation preview :books:: https://datasette--2034.org.readthedocs.build/en/2034/

datasette 107914493 pull    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2034/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
0  
1615891776 I_kwDOBm6k_c5gUI1A 2037 Test failure: FAILED tests/test_cli.py::test_install_requirements - FileNotFoundError simonw 9599 closed 0     3 2023-03-08T20:30:06Z 2023-03-09T22:33:39Z 2023-03-09T22:33:39Z OWNER  

FAILED tests/test_cli.py::test_install_requirements - FileNotFoundError: [Errno 2] No such file or directory

From https://github.com/simonw/datasette/actions/runs/4348548218/jobs/7597208191

```
=================================== FAILURES ===================================
__________________________ test_install_requirements __________________________

run_module = <MagicMock name='run_module' id='139768358191936'>

    @mock.patch("datasette.cli.run_module")
    def test_install_requirements(run_module):
        runner = CliRunner()
        with runner.isolated_filesystem():

/home/runner/work/datasette/datasette/tests/test_cli.py:184:

/opt/hostedtoolcache/Python/3.9.16/x64/lib/python3.9/contextlib.py:119: in __enter__
    return next(self.gen)

self = <click.testing.CliRunner object at 0x7f1e5bfb9490>, temp_dir = None

    @contextlib.contextmanager
    def isolated_filesystem(
        self, temp_dir: t.Optional[t.Union[str, os.PathLike]] = None
    ) -> t.Iterator[str]:
        """A context manager that creates a temporary directory and
        changes the current working directory to it. This isolates tests
        that affect the contents of the CWD to prevent them from
        interfering with each other.

        :param temp_dir: Create the temporary directory under this
            directory. If given, the created directory is not removed
            when exiting.

        .. versionchanged:: 8.0
            Added the ``temp_dir`` parameter.
        """
        cwd = os.getcwd()

E       FileNotFoundError: [Errno 2] No such file or directory

/opt/hostedtoolcache/Python/3.9.16/x64/lib/python3.9/site-packages/click/testing.py:466: FileNotFoundError
```

Not sure why it only affected the "Calculate test coverage" one.

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2037/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed
1615862295 I_kwDOBm6k_c5gUBoX 2036 `publish cloudrun` reuses image tags, which can lead to very surprising deploy problems simonw 9599 closed 0     6 2023-03-08T20:11:44Z 2023-03-08T20:57:34Z 2023-03-08T20:57:34Z OWNER  

See this issue: - https://github.com/simonw/datasette.io/issues/141

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2036/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed
1615692818 I_kwDOBm6k_c5gTYQS 2035 Potential feature: special support for `?a=1&a=2` on the query page simonw 9599 open 0   Datasette 1.0 3268330 13 2023-03-08T18:05:03Z 2023-03-08T20:14:47Z   OWNER  

From a discussion on Discord: https://discord.com/channels/823971286308356157/996877076982415491/1082789517062320138

The key idea is to make it easier for people to implement where id in (...) queries that are populated from query string arguments.

What if you could add ?id=11&id=32&id=62 to the URL and have that made available as a list that can be used in the query?
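
A rough sketch of what that could enable, assuming a request object with a getlist()-style accessor and a db.execute() call (hypothetical table and helper names):

```python
async def rows_for_ids(db, request):
    # Expand the repeated ?id= values into named parameters for an
    # "id in (...)" clause instead of interpolating them into the SQL.
    ids = request.args.getlist("id")  # e.g. ["11", "32", "62"]
    params = {"p{}".format(i): value for i, value in enumerate(ids)}
    placeholders = ", ".join(":p{}".format(i) for i in range(len(ids)))
    sql = "select * from mytable where id in ({})".format(placeholders)
    return await db.execute(sql, params)
```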

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2035/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1590183272 I_kwDOBm6k_c5eyEVo 2027 How to redirect from "/" to a specific db/table dmick 1350673 open 0     4 2023-02-18T03:14:01Z 2023-03-08T04:42:22Z   NONE  

Using nginx to redirect public IP to the local uvicorn server as 'normal'. I can't figure out how to redirect such that '/' results in accessing the one db/table I want to serve; redirecting / to /db/table breaks some of the CSS; fooling with base_url doesn't seem to help. Can someone explain this, if it's possible?
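
One possible workaround, as an untested sketch: a tiny one-off plugin that uses the register_routes() hook to redirect / at the Datasette layer instead of in nginx (the target path /db/table is a placeholder):

```python
from datasette import hookimpl
from datasette.utils.asgi import Response


@hookimpl
def register_routes():
    # Redirect the instance root to the one database/table being served.
    async def redirect_to_table(request):
        return Response.redirect("/db/table")

    return [(r"^/$", redirect_to_table)]
```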

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2027/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1612296210 I_kwDOBm6k_c5gGbAS 2033 `datasette install -r requirements.txt` simonw 9599 closed 0     2 2023-03-06T22:17:17Z 2023-03-06T22:54:52Z 2023-03-06T22:27:34Z OWNER  

Would be useful for cases where you want to install a whole set of plugins in one go, e.g. when running tutorials in GitHub Codespaces.

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2033/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed
1590839187 PR_kwDOBm6k_c5KSs9T 2028 add Python 3.11 classifier dtrodrigues 614233 closed 0     2 2023-02-19T20:16:03Z 2023-03-06T21:01:20Z 2023-03-06T21:01:19Z CONTRIBUTOR simonw/datasette/pulls/2028

Python 3.11 is tested in CI and is used in the docker image, so add the Python 3.11 Trove classifier.


:books: Documentation preview :books:: https://datasette--2028.org.readthedocs.build/en/2028/

datasette 107914493 pull    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2028/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
0  
1121583414 I_kwDOBm6k_c5C2gE2 1619 JSON link on row page is 404 if base_url setting is used simonw 9599 open 0     4 2022-02-02T07:09:53Z 2023-03-05T20:30:14Z   OWNER  

On my local environment:

datasette fixtures.db -p 3344 --setting base_url /foo/bar/

Then hit http://127.0.0.1:3344/foo/bar/fixtures/table%2Fwith%2Fslashes.csv/3

But... that json link goes here, which is a 404:

http://127.0.0.1:3344/foo/bar/foo/bar/fixtures/table%2Fwith%2Fslashes.csv/3?_format=json

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1619/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1605959201 I_kwDOBm6k_c5fuP4h 2032 datasette errors when foreign key integrity is enabled cldellow 193185 open 0     0 2023-03-02T01:27:51Z 2023-03-02T01:31:58Z   CONTRIBUTOR  

By default, SQLite does not enforce foreign key constraints. I typically enable these checks by running:

```sql
PRAGMA foreign_keys = ON;
```

inside of a prepare_connection hook.

If a plugin causes the schema to change (eg datasette-scraper creating a new table, or datasette-edit-schema changing a column), then https://github.com/simonw/datasette/blob/0b4a28691468b5c758df74fa1d72a823813c96bf/datasette/utils/internal_db.py#L71-L77 will fail with:

FOREIGN KEY constraint failed

This could be resolved by either:

  • deleting from the tables column last
  • changing the schema so that the foreign keys have ON DELETE CASCADE

Let me know if you'd be open to a PR that addresses this -- since foreign key constraints aren't enabled by default, I guess it's questionable whether this is a bug. I think I can workaround this by inspecting the database parameter in prepare_connection and trying not to enable fkey checks on the _internal database.
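
A minimal sketch of the workaround described above, using the prepare_connection hook and skipping the _internal database:

```python
from datasette import hookimpl


@hookimpl
def prepare_connection(conn, database):
    # Skip Datasette's internal database so its schema bookkeeping
    # isn't subject to foreign key enforcement.
    if database == "_internal":
        return
    conn.execute("PRAGMA foreign_keys = ON")
```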

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2032/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1594383280 I_kwDOBm6k_c5fCFuw 2030 How to use Datasette with apache webserver on GCP? gk7279 19700859 closed 0     2 2023-02-22T03:08:49Z 2023-02-22T21:54:39Z 2023-02-22T21:54:39Z NONE  

Hi Simon and Datasette team-

I have installed apache2 webserver inside GCP VM using apt.

I can see my "Hello World" index.html if I use the external IP of this GCP in a browser.

However, when I try to run datasette with different combinations of -h and -p, I am still unable to access the webpage.

I cannot invest Docker on this VM.

Any pointers on using datasette with the already existing apache2 webserver on GCP are appreciated.

Thanks.

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2030/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed
828858421 MDU6SXNzdWU4Mjg4NTg0MjE= 1258 Allow canned query params to specify default values wdccdw 1385831 open 0     5 2021-03-11T07:19:02Z 2023-02-20T23:39:58Z   NONE  

If I call a canned query that includes named parameters, without passing any parameters, datasette runs the query anyway, resulting in an HTTP status code 400, and a visible error in the browser, with only a link back to home. This means that one of the default links on https://site/database/ will lead to a broken page with no apparent way out.

Is there any way to skip performing the query when parameters aren't supplied, but otherwise render the usual canned query page? Alternatively, can I supply default values for my parameters, either when defining my canned queries or when linking to the canned query page from the default database template?

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1258/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1592327343 I_kwDOBm6k_c5e6Pyv 2029 Sorry Simon, didn't know how else to contact you llchristopherson 5804626 open 0     0 2023-02-20T19:02:53Z 2023-02-20T19:02:53Z   NONE  

Hi Simon,

Would you be willing to chat with me about Datasette? I have some questions. I am working on a project to evaluate data ingestion tools for a research organization and I ran across Datasette. I have looked through a lot of your documentation, but still have some questions, which are very specific. If you would be willing to write me back about this, my email is laura@renci.org.

Thanks, Laura

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2029/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1586980089 PR_kwDOBm6k_c5KF-by 2026 Avoid repeating primary key columns if included in _col args runderwood 8513 open 0     0 2023-02-16T04:16:25Z 2023-02-16T04:16:41Z   FIRST_TIME_CONTRIBUTOR simonw/datasette/pulls/2026

...while maintaining given order.

Fixes #1975 (if I'm understanding correctly).


:books: Documentation preview :books:: https://datasette--2026.org.readthedocs.build/en/2026/

datasette 107914493 pull    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2026/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
0  
1581218043 PR_kwDOBm6k_c5JyqPy 2025 Add database metadata to index.html template context palewire 9993 open 0     0 2023-02-12T11:16:58Z 2023-02-12T11:17:14Z   FIRST_TIME_CONTRIBUTOR simonw/datasette/pulls/2025

Fixes #2016


:books: Documentation preview :books:: https://datasette--2025.org.readthedocs.build/en/2025/

datasette 107914493 pull    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2025/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
0  
1323346408 I_kwDOBm6k_c5O4Kno 1775 i18n support johnfelipe 428820 open 0     9 2022-07-31T02:51:04Z 2023-02-10T18:04:40Z   NONE  

I want to contribute to translating the UI to es, de and it if you share the strings.

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1775/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1579973223 I_kwDOBm6k_c5eLHpn 2024 Mention WAL mode in documentation simonw 9599 open 0     1 2023-02-10T16:11:10Z 2023-02-10T16:11:53Z   OWNER  

It's not currently obvious from the docs how you can ensure that Datasette runs well in situations where other processes may update the underlying SQLite files.
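
For reference, a minimal sketch of switching a file to WAL mode, which is one way to address the multi-process scenario (data.db is a placeholder):

```python
import sqlite3

# Put a database file into WAL mode so a concurrent writer doesn't block
# Datasette's readers. The mode is persistent, so this only needs to be
# run once per file.
conn = sqlite3.connect("data.db")
conn.execute("PRAGMA journal_mode=WAL;")
conn.close()
```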

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2024/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1579695809 I_kwDOBm6k_c5eKD7B 2023 Error: Invalid setting 'hash_urls' in settings.json in 0.64.1 mlaparie 80409402 closed 0     2 2023-02-10T13:35:01Z 2023-02-10T15:40:00Z 2023-02-10T15:39:59Z NONE  

On a Debian machine, using datasette 0.64.1 installed with pip3, I am getting a datasette[114272]: Error: Invalid setting 'hash_urls' in settings.json in journalctl -xe. The same settings work on 0.54.1 on another Debian server.

This is my settings.json:

```json
{
    "default_page_size": 200,
    "max_returned_rows": 8000,
    "num_sql_threads": 3,
    "sql_time_limit_ms": 1000,
    "default_facet_size": 30,
    "facet_time_limit_ms": 200,
    "facet_suggest_time_limit_ms": 50,
    "hash_urls": false,
    "allow_facet": true,
    "allow_download": true,
    "suggest_facets": true,
    "default_cache_ttl": 5,
    "default_cache_ttl_hashed": 31536000,
    "cache_size_kb": 0,
    "allow_csv_stream": true,
    "max_csv_mb": 100,
    "truncate_cells_html": 2048,
    "force_https_urls": false,
    "template_debug": false,
    "base_url": "/pclim/db/"
}
```

This looks ok to me. Would you have any ideas?

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2023/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed
1578609658 I_kwDOBm6k_c5eF6v6 2022 Error 500 - not clear the cause DavidPratten 1667631 closed 0     1 2023-02-09T20:57:17Z 2023-02-09T21:13:50Z 2023-02-09T21:13:50Z NONE  

On the database that I have sent via linkedIn, datasette works great, but the following URL gives a 500 error.

http://127.0.0.1:8001/literature/authors_papers?authorId=100550354

The cause of the error is not apparent.

Is this expected behaviour?

David

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2022/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed
1577548579 I_kwDOBm6k_c5eB3sj 2021 Docker images for 1.0 alphas? meowcat 1563881 open 0     0 2023-02-09T09:35:52Z 2023-02-09T09:35:52Z   NONE  

Hi, would you consider putting 1.0alpha images on Dockerhub?

(Also, how usable are the alphas?)

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2021/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
323658641 MDU6SXNzdWUzMjM2NTg2NDE= 262 Add ?_extra= mechanism for requesting extra properties in JSON simonw 9599 open 0   Datasette 1.0 3268330 25 2018-05-16T14:55:42Z 2023-02-08T18:36:48Z   OWNER  

Datasette views currently work by creating a set of data that should be returned as JSON, then defining an additional, optional template_data() function which is called if the view is being rendered as HTML.

This template_data() function calculates extra template context variables which are necessary for the HTML view but should not be included in the JSON.

Example of how that is used today: https://github.com/simonw/datasette/blob/2b79f2bdeb1efa86e0756e741292d625f91cb93d/datasette/views/table.py#L672-L704

With features like Facets in #255 I'm beginning to want to move more items into the template_data() - in the case of facets it's the suggested_facets array. This saves that feature from being calculated (involving several SQL queries) for the JSON case where it is unlikely to be used.

But... as an API user, I want to still optionally be able to access that information.

Solution: Add a ?_extra=suggested_facets&_extra=table_metadata argument which can be used to optionally request additional blocks to be added to the JSON API.

Then redefine as many of the current template_data() features as extra arguments instead, and teach Datasette to return certain extras by default when rendering templates.

This could allow the JSON representation to be slimmed down further (removing e.g. the table_definition and view_definition keys) while still making that information available to API users who need it.

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/262/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1575880841 I_kwDOBm6k_c5d7giJ 2020 Documentation refers to "off" setting; doesn't seem to work, "false" does dmick 1350673 open 0     0 2023-02-08T10:38:10Z 2023-02-08T10:38:10Z   NONE  

https://docs.datasette.io/en/stable/settings.html#suggest-facets, among others, suggests using "off" to disable the setting; however, this doesn't appear to work in the JSON config files, where it apparently needs to be a "JSON boolean" and have the values "true" or "false". Perhaps the Python code is more flexible?...but either way, the documentation probably should mention it.

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2020/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1573424830 I_kwDOBm6k_c5dyI6- 2019 Refactor out the keyset pagination code simonw 9599 open 0     14 2023-02-06T23:04:00Z 2023-02-08T01:40:46Z   OWNER  

While working on: - #1999

I noticed that some of the most complex code in the existing table view is the code that implements keyset pagination:

https://github.com/simonw/datasette/blob/0b4a28691468b5c758df74fa1d72a823813c96bf/datasette/views/table.py#L417-L493

Extracting that into a utility function would simplify that code a lot.
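
For context, a conceptual sketch of what keyset pagination boils down to (hypothetical table and column names, not the code being extracted):

```python
# Fetch rows strictly after the last (sort value, primary key) pair seen,
# so each page picks up exactly where the previous one left off.
# Uses SQLite's row-value comparison, available in SQLite 3.15+.
sql = """
    select * from mytable
    where (sort_col, id) > (:last_sort, :last_id)
    order by sort_col, id
    limit :page_size
"""
params = {"last_sort": "2023-01-01", "last_id": 42, "page_size": 100}
```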

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2019/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
473288428 MDExOlB1bGxSZXF1ZXN0MzAxNDgzNjEz 564 First proof-of-concept of Datasette Library simonw 9599 open 0     1 2019-07-26T10:22:26Z 2023-02-07T15:14:11Z   OWNER simonw/datasette/pulls/564

Refs #417. Run it like this:

 datasette -d ~/Library

Uses a new plugin hook - available_databases()

datasette 107914493 pull    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/564/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
1  
1571711808 I_kwDOBm6k_c5drmtA 2018 `check_visibility` gives confusing (wrong?) results if permission is `None` cldellow 193185 open 0     0 2023-02-06T01:03:08Z 2023-02-06T01:03:46Z   CONTRIBUTOR  

I'm trying to gate access to an edit UI on the user having update-row on the underlying view or table.

I expected datasette.check_visibility to be a good way to do this:

```python
visible, private = await datasette.check_visibility(
    request.actor,
    permissions=[
        ("update-row", (database, table)),
    ],
)

if not visible:
    return None
```

But visible is returning true, even when there is no explicit update-row permission. (In this case, request.actor is None.)

Based on the update-row permissions docs, I expected this to be default deny, and so no explicit permission would result in false.

I think the root cause is that check_visibility calls ensure_permissions and expects it to throw if the permission is not available.

But ensure_permissions does not throw when permission_allowed returns None: https://github.com/simonw/datasette/blob/1.0a2/datasette/app.py#L825-L829

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2018/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1571207083 I_kwDOBm6k_c5dprer 2016 Database metadata fields like description are not available in the index page template's context palewire 9993 open 0   Datasette 1.0 3268330 1 2023-02-05T02:25:53Z 2023-02-05T22:56:43Z   NONE  

When looping through databases in the index.html template, I'd like to print the description of each database alongside its name. But it appears that isn't passed in from the view, unless I'm missing it. It would be great to have that.

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2016/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1566081801 PR_kwDOBm6k_c5JAcGy 2014 Bump black from 22.12.0 to 23.1.0 dependabot[bot] 49699333 open 0     0 2023-02-01T13:06:16Z 2023-02-01T13:06:35Z   CONTRIBUTOR simonw/datasette/pulls/2014

Bumps black from 22.12.0 to 23.1.0.

Release notes

Sourced from black's releases.

23.1.0

Highlights

This is the first release of 2023, and following our stability policy, it comes with a number of improvements to our stable style, notably improvements to empty line handling and the removal of redundant parentheses in several contexts.

There are also many changes to the preview style; try out black --preview and give us feedback to help us set the stable style for next year.

In addition to style changes, Black now automatically infers the supported Python versions from your pyproject.toml file, removing the need to set Black's target versions separately.

Stable style

  • Introduce the 2023 stable style, which incorporates most aspects of last year's preview style (#3418). Specific changes:
    • Enforce empty lines before classes and functions with sticky leading comments (#3302) (22.12.0)
    • Reformat empty and whitespace-only files as either an empty file (if no newline is present) or as a single newline character (if a newline is present) (#3348) (22.12.0)
    • Correctly handle trailing commas that are inside a line's leading non-nested parens (#3370) (22.12.0)
    • --skip-string-normalization / -S now prevents docstring prefixes from being normalized as expected (#3168) (since 22.8.0)
    • When using --skip-magic-trailing-comma or -C, trailing commas are stripped from subscript expressions with more than 1 element (#3209) (22.8.0)
    • Fix a string merging/split issue when a comment is present in the middle of implicitly concatenated strings on its own line (#3227) (22.8.0)
    • Docstring quotes are no longer moved if it would violate the line length limit (#3044, #3430) (22.6.0)
    • Parentheses around return annotations are now managed (#2990) (22.6.0)
    • Remove unnecessary parentheses around awaited objects (#2991) (22.6.0)
    • Remove unnecessary parentheses in with statements (#2926) (22.6.0)
    • Remove trailing newlines after code block open (#3035) (22.6.0)
    • Code cell separators #%% are now standardised to # %% (#2919) (22.3.0)
    • Remove unnecessary parentheses from except statements (#2939) (22.3.0)
    • Remove unnecessary parentheses from tuple unpacking in for loops (#2945) (22.3.0)
    • Avoid magic-trailing-comma in single-element subscripts (#2942) (22.3.0)
  • Fix a crash when a colon line is marked between # fmt: off and # fmt: on (#3439)

Preview style

  • Format hex codes in unicode escape sequences in string literals (#2916)
  • Add parentheses around if-else expressions (#2278)
  • Improve performance on large expressions that contain many strings (#3467)
  • Fix a crash in preview style with assert + parenthesized string (#3415)
  • Fix crashes in preview style with walrus operators used in function return annotations and except clauses (#3423)
  • Fix a crash in preview advanced string processing where mixed implicitly concatenated regular and f-strings start with an empty span (#3463)
  • Fix a crash in preview advanced string processing where a standalone comment is placed before a dict's value (#3469)
  • Fix an issue where extra empty lines are added when a decorator has # fmt: skip applied or there is a standalone comment between decorators (#3470)
  • Do not put the closing quotes in a docstring on a separate line, even if the line is too long (#3430)
  • Long values in dict literals are now wrapped in parentheses; correspondingly unnecessary parentheses around short values in dict literals are now removed; long string lambda values are now wrapped in parentheses (#3440)
  • Fix two crashes in preview style involving edge cases with docstrings (#3451)
  • Exclude string type annotations from improved string processing; fix crash when the return type annotation is stringified and spans across multiple lines (#3462)
  • Wrap multiple context managers in parentheses when targeting Python 3.9+ (#3489)
  • Fix several crashes in preview style with walrus operators used in with statements or tuples (#3473)
  • Fix an invalid quote escaping bug in f-string expressions where it produced invalid code. Implicitly concatenated f-strings with different quotes can now be merged or quote-normalized by changing the quotes used in expressions. (#3509)

... (truncated)

Changelog

Sourced from black's changelog.

23.1.0

Highlights

This is the first release of 2023, and following our stability policy, it comes with a number of improvements to our stable style, including improvements to empty line handling, removal of redundant parentheses in several contexts, and output that highlights implicitly concatenated strings better.

There are also many changes to the preview style; try out black --preview and give us feedback to help us set the stable style for next year.

In addition to style changes, Black now automatically infers the supported Python versions from your pyproject.toml file, removing the need to set Black's target versions separately.

Stable style

  • Introduce the 2023 stable style, which incorporates most aspects of last year's preview style (#3418). Specific changes:
    • Enforce empty lines before classes and functions with sticky leading comments (#3302) (22.12.0)
    • Reformat empty and whitespace-only files as either an empty file (if no newline is present) or as a single newline character (if a newline is present) (#3348) (22.12.0)
    • Implicitly concatenated strings used as function args are now wrapped inside parentheses (#3307) (22.12.0)
    • Correctly handle trailing commas that are inside a line's leading non-nested parens (#3370) (22.12.0)
    • --skip-string-normalization / -S now prevents docstring prefixes from being normalized as expected (#3168) (since 22.8.0)
    • When using --skip-magic-trailing-comma or -C, trailing commas are stripped from subscript expressions with more than 1 element (#3209) (22.8.0)
    • Implicitly concatenated strings inside a list, set, or tuple are now wrapped inside parentheses (#3162) (22.8.0)
    • Fix a string merging/split issue when a comment is present in the middle of implicitly concatenated strings on its own line (#3227) (22.8.0)
    • Docstring quotes are no longer moved if it would violate the line length limit (#3044, #3430) (22.6.0)
    • Parentheses around return annotations are now managed (#2990) (22.6.0)
    • Remove unnecessary parentheses around awaited objects (#2991) (22.6.0)
    • Remove unnecessary parentheses in with statements (#2926) (22.6.0)
    • Remove trailing newlines after code block open (#3035) (22.6.0)
    • Code cell separators #%% are now standardised to # %% (#2919) (22.3.0)
    • Remove unnecessary parentheses from except statements (#2939) (22.3.0)
    • Remove unnecessary parentheses from tuple unpacking in for loops (#2945) (22.3.0)
    • Avoid magic-trailing-comma in single-element subscripts (#2942) (22.3.0)

... (truncated)

Commits
  • b0d1fba Prepare release 23.1.0 (#3536)
  • 69ca0a4 Infer target version based on project metadata (#3219)
  • c4bd2e3 Draft for Black 2023 stable style (#3418)
  • 226cbf0 Fix unsafe cast in linegen.py w/ await yield handling (#3533)
  • f4ebc68 Upgrade isort (#3534)
  • 6407ebb Remove Python version in the_basics.md (#3528)
  • 196b1f3 Fix black --help output for --python-cell-magics option to be reproducibl...
  • d950f15 Update document now that paren wrapping CMs on Python 3.9+ is implemented (#3...
  • a36878e Fix an invalid quote escaping bug in f-string expressions (#3509)
  • eabff67 Format hex code in unicode escape sequences in string literals (#2916)
  • Additional commits viewable in compare view


Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR:

  • `@dependabot rebase` will rebase this PR
  • `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it
  • `@dependabot merge` will merge this PR after your CI passes on it
  • `@dependabot squash and merge` will squash and merge this PR after your CI passes on it
  • `@dependabot cancel merge` will cancel a previously requested merge and block automerging
  • `@dependabot reopen` will reopen this PR if it is closed
  • `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
  • `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
  • `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
  • `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)

:books: Documentation preview :books:: https://datasette--2014.org.readthedocs.build/en/2014/

datasette 107914493 pull    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2014/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
0  
1556065335 PR_kwDOBm6k_c5Ie5nA 2004 use single quotes for string literals, fixes #2001 cldellow 193185 open 0     1 2023-01-25T05:08:46Z 2023-02-01T06:37:18Z   CONTRIBUTOR simonw/datasette/pulls/2004

This modernizes some uses of double quotes for string literals to use only single quotes, fixes simonw/datasette#2001

While developing it, I manually enabled the stricter mode by using the code snippet at https://gist.github.com/cldellow/85bba507c314b127f85563869cd94820

I think that code snippet isn't generally safe/portable, so I haven't tried to automate it in the tests.


:books: Documentation preview :books:: https://datasette--2004.org.readthedocs.build/en/2004/

datasette 107914493 pull    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2004/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
0  
1565179870 I_kwDOBm6k_c5dSr_e 2013 Datasette uses non-standard quoting for identifiers cldellow 193185 open 0     0 2023-02-01T00:05:39Z 2023-02-01T00:06:30Z   CONTRIBUTOR  

Related to #2001, but where #2001 was about literals, this is about identifiers

From https://www.sqlite.org/lang_keywords.html:

"keyword" A keyword in double-quotes is an identifier. [keyword] A keyword enclosed in square brackets is an identifier. This is not standard SQL. This quoting mechanism is used by MS Access and SQL Server and is included in SQLite for compatibility.

Datasette uses this quoting here -- https://github.com/simonw/datasette/blob/0b4a28691468b5c758df74fa1d72a823813c96bf/datasette/utils/__init__.py#L345-L349, in some of the other DB access code, and in some of the test fixtures.

Migrating to standard double quote identifiers would make it easier to get Datasette working with alternative backends.
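
A minimal sketch of the standard double-quote escaping this would move towards (hypothetical helper, not Datasette's current utils):

```python
def escape_identifier(name):
    # Standard SQL style: wrap the identifier in double quotes and
    # double any embedded quote characters.
    return '"{}"'.format(name.replace('"', '""'))


escape_identifier('weird "table" name')  # -> '"weird ""table"" name"'
```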

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2013/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1564774831 I_kwDOBm6k_c5dRJGv 2012 Missing space in database summary simonw 9599 open 0     0 2023-01-31T18:01:13Z 2023-01-31T18:01:13Z   OWNER  

Spotted this on an instance index page:

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2012/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1564769997 I_kwDOBm6k_c5dRH7N 2011 Applied facet did not result in an "x" icon to dismiss it simonw 9599 open 0     1 2023-01-31T17:57:44Z 2023-01-31T17:58:54Z   OWNER  

That's against this data https://data.sfgov.org/City-Management-and-Ethics/Supplier-Contracts/cqi5-hm2d imported using https://datasette.io/plugins/datasette-socrata

It's for Contract Type of Non-Purchasing Contract (Rents, etc.) - so it's possible that some of the spaces or punctuation in either the name or the value tripped up the code that decides if the X icon should be displayed.

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2011/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1563264257 I_kwDOBm6k_c5dLYUB 2010 Row page should default to card view simonw 9599 open 0   Datasette 1.0 3268330 1 2023-01-30T21:49:37Z 2023-01-30T21:52:06Z   OWNER  

Datasette currently uses the same table layout on the row pages as it does on the table pages:

https://datasette.io/content/pypi_packages?_sort=name&name__exact=datasette-column-inspect

https://datasette.io/content/pypi_packages/datasette-column-inspect

If you shrink down to mobile width you get this instead, on both of those pages:

I think that view, which I think of as the "card view", is plain better if you're looking at just a single row - and it (or a variant of it) should be the default presentation on the row page.

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2010/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1186696202 I_kwDOBm6k_c5Gu4wK 1696 Show foreign key label when filtering simonw 9599 open 0     2 2022-03-30T16:18:54Z 2023-01-29T20:56:20Z   OWNER  

For example here:

3 corresponds to "Human Related: Other" - it would be neat to display this in this area of the page somehow.

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1696/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1560982210 PR_kwDOBm6k_c5IvYKw 2008 array facet: don't materialize unnecessary columns cldellow 193185 open 0     8 2023-01-28T19:33:40Z 2023-01-29T18:17:40Z   CONTRIBUTOR simonw/datasette/pulls/2008

The presence of inner.* causes SQLite to materialize a row with all the columns. Those columns will be discarded later.

Instead, we can select only the column we'll use. This lets SQLite's optimizer realize that the other columns in the CTE definition aren't needed.

On a test table with 278K rows, 98K of which had an array, this speeds up the facet calculation from 4 sec to 1 sec.
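
A schematic before/after of the idea (hypothetical table and column names, not the exact SQL Datasette generates):

```python
# Selecting only the faceted column in the CTE lets SQLite avoid
# materializing every column of every row.
before = """
    with source as (select * from mytable)
    select j.value as value, count(*) as n
    from source, json_each(source.tags) as j
    group by j.value
"""
after = """
    with source as (select tags from mytable)
    select j.value as value, count(*) as n
    from source, json_each(source.tags) as j
    group by j.value
"""
```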


:books: Documentation preview :books:: https://datasette--2008.org.readthedocs.build/en/2008/

datasette 107914493 pull    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2008/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
0  
1515815014 I_kwDOBm6k_c5aWYBm 1973 render_cell plugin hook's row object is not a sqlite.Row cldellow 193185 open 0     4 2023-01-01T20:27:46Z 2023-01-29T00:40:31Z   CONTRIBUTOR  

From https://docs.datasette.io/en/stable/plugin_hooks.html#render-cell-row-value-column-table-database-datasette:

row - sqlite.Row The SQLite row object that the value being rendered is part of

This appears to actually be a CustomRow, but I think that's unrelated to my issue.

I have a table:

```sql
CREATE TABLE IF NOT EXISTS "dss_job_stats"(
  job_id integer not null references dss_job(id) on delete cascade,
  host text not null,
  // other columns elided as irrelevant
  primary key (job_id, host)
);
```

On datasette 0.63.2, the render_cell hook receives a row value that looks like:

CustomRow([('job_id', {'value': 2, 'label': '2'}), ('host', 'cldellow.com')])

I expected the job_id value to be 2, but it's actually {'value': 2, 'label': '2'}.

I can work around this, but was wondering if this was intended behaviour?
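
A sketch of the workaround, assuming the dict-shaped value described above (hypothetical column name):

```python
from datasette import hookimpl


@hookimpl
def render_cell(row, value, column):
    # Foreign key cells arrive as {"value": 2, "label": "2"} rather than
    # a bare 2, so unwrap before using them.
    job_id = row["job_id"]
    if isinstance(job_id, dict):
        job_id = job_id["value"]
    # ...use job_id as needed; returning None falls back to the default
    # rendering for this cell.
    return None
```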

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1973/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1560662739 I_kwDOBm6k_c5dBdLT 2007 `render_cell()` hook should take an optional `request` argument simonw 9599 closed 0     0 2023-01-28T03:13:00Z 2023-01-28T03:34:26Z 2023-01-28T03:34:26Z OWNER  

From Discord: https://discordapp.com/channels/823971286308356157/996877076982415491/1068227071156965486

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2007/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed
1558644003 I_kwDOBm6k_c5c5wUj 2006 Teach `datasette publish` to pin to `datasette<1.0` in a 0.x release simonw 9599 open 0   Datasette 1.0 3268330 2 2023-01-26T19:17:40Z 2023-01-26T19:20:53Z   OWNER  

I just realized that when I ship Datasette 1.0 there may be automated deployments out there which could deploy the 1.0 version by accident, potentially breaking any customizations that aren't compatible with the 1.0 changes.

I can hopefully help avoid that by shipping one last entry in the 0.x series that ensures datasette publish pins to <1.0 when it installs Datasette itself.

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2006/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1557507274 I_kwDOBm6k_c5c1azK 2005 `extra_template_vars` should be OK to return `None` simonw 9599 open 0     1 2023-01-26T01:40:45Z 2023-01-26T01:41:50Z   OWNER  

Got this exception and had to make sure it always returned {}:

File ".../python3.11/site-packages/datasette/app.py", line 1049, in render_template assert isinstance(extra_vars, dict), "extra_vars is of type {}".format( AssertionError: extra_vars is of type <class 'NoneType'>

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2005/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1555701851 PR_kwDOBm6k_c5IdsD7 2003 Show referring tables and rows when the referring foreign key is compound fgregg 536941 open 0     3 2023-01-24T21:31:31Z 2023-01-25T18:44:42Z   CONTRIBUTOR simonw/datasette/pulls/2003

sqlite foreign keys can be compound, but that is not as well supported by datasette as single column foreign keys.

in particular,

  1. in a table view, there is not a link from the row to the referenced row if the foreign key is compound
  2. in a row view, there is no listing of tables and rows that refer to the focal row if those referencing foreign keys are compound.

Both of these issues are discussed in #1099.

This PR only fixes the second one, because it's not clear what the right UX is for the first issue.

Some things that might not be desirable about this approach.

  1. it changes the external API, by changing column => columns and other_column => other_columns (see inline comment for more discussion).
  2. There are various places where the plural foreign keys have to be checked for length and discarded or transformed to singular.
datasette 107914493 pull    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2003/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
0  
1553615704 I_kwDOBm6k_c5cmktY 2001 Datasette is not compatible with SQLite's strict quoting compilation option gwk 406380 open 0     4 2023-01-23T19:10:07Z 2023-01-25T04:59:58Z   NONE  

I have linked Python3.11 on macOS against recent SQLite that was compiled using -DSQLITE_DQS=0. This option disables interpretation of double-quoted identifiers as string literals, described in the SQLite docs as a "MySQL 3.x misfeature". See https://www.sqlite.org/quirks.html#dblquote for background.

Datasette uses the double-quote syntax in a number of key places, and is thus completely broken in this environment.

My experience was to pip install datasette, then run datasette serve -I my-data.db. When I visit http://127.0.0.1:8001 I get a 500 response.

The error: sqlite3.OperationalError: no such column: geometry_columns

The responsible SQL: 'select 1 from sqlite_master where tbl_name = "geometry_columns"'

I then installed datasette from GitHub master in development mode and changed the offending SQL to use correct quotes: "select 1 from sqlite_master where tbl_name = 'geometry_columns'".

With this change, I get a little further, but have the same problem with the first table name in my database (in my case, "Meta"):

```
OperationalError: no such column: Meta
Traceback (most recent call last):
  File "/Users/gwk/external/datasette/datasette/app.py", line 1522, in route_path
    response = await view(request, send)
               ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/gwk/external/datasette/datasette/views/base.py", line 151, in view
    return await self.dispatch_request(request)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/gwk/external/datasette/datasette/views/base.py", line 105, in dispatch_request
    response = await handler(request)
               ^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/gwk/external/datasette/datasette/views/index.py", line 70, in get
    "fts_table": await db.fts_table(table),
                 ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/gwk/external/datasette/datasette/database.py", line 363, in fts_table
    return await self.execute_fn(lambda conn: detect_fts(conn, table))
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/gwk/external/datasette/datasette/database.py", line 213, in execute_fn
    return await asyncio.get_event_loop().run_in_executor(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/py/Python.framework/Versions/3.11/lib/python3.11/concurrent/futures/thread.py", line 58, in run
    result = self.fn(*self.args, **self.kwargs)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/gwk/external/datasette/datasette/database.py", line 211, in in_thread
    return fn(conn)
           ^^^^^^^^
  File "/Users/gwk/external/datasette/datasette/database.py", line 363, in <lambda>
    return await self.execute_fn(lambda conn: detect_fts(conn, table))
                                              ^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/gwk/external/datasette/datasette/utils/__init__.py", line 588, in detect_fts
    rows = conn.execute(detect_fts_sql(table)).fetchall()
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
sqlite3.OperationalError: no such column: Meta
INFO:     127.0.0.1:50258 - "GET / HTTP/1.1" 500 Internal Server Error
```

I will try to continue playing with this, but I also hope that the datasette developers will enable this mode in a test environment as I am unlikely to be able to exercise all of the SQL in the codebase, or make a pull request very soon.

Note that the DQS setting compile-time option can be overridden at runtime with calls to the C API:

```
sqlite3_db_config(db, SQLITE_DBCONFIG_DQS_DDL, 0, (void*)0);
sqlite3_db_config(db, SQLITE_DBCONFIG_DQS_DML, 0, (void*)0);
```

As far as I can tell, sqlite3_db_config is not exposed in Python, but perhaps we could figure out how to invoke it using ctypes.

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2001/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
743371103 MDU6SXNzdWU3NDMzNzExMDM= 1099 Support linking to compound foreign keys simonw 9599 open 0     6 2020-11-15T23:23:17Z 2023-01-25T00:58:26Z   OWNER  

Reported as a bug in #1098 because they caused 500 errors - but it would be even better if Datasette could hyperlink to related rows via compound foreign keys.

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1099/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1531991339 I_kwDOBm6k_c5bUFUr 1989 Suggestion: Hiding columns pax 116795 open 0     2 2023-01-13T09:33:32Z 2023-01-24T17:48:59Z   NONE  

As there's already the possibility of hiding tables, I've run into the need to hide specific columns: data that's either not relevant for the public or can't be shown due to privacy reasons.

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1989/reactions",
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1554032168 I_kwDOBm6k_c5coKYo 2002 Document how actors are displayed simonw 9599 open 0     0 2023-01-24T00:08:49Z 2023-01-24T00:08:49Z   OWNER  

https://github.com/simonw/datasette/blob/e4ebef082de90db4e1b8527abc0d582b7ae0bc9d/datasette/utils/__init__.py#L1052-L1056

This logic should be reflected in the documentation on https://docs.datasette.io/en/stable/authentication.html#actors

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2002/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1536851861 I_kwDOBm6k_c5bmn-V 1994 Stuck on loading screen jackhagley 10913053 open 0     1 2023-01-17T18:33:49Z 2023-01-23T08:21:08Z   NONE  

Can’t actually open it!

Downloaded today from the releases tab

Running macOS 13.1

```
bin/python3.9 --version
Python 3.9.6
Took 83ms
bin/python3.9 --version
Python 3.9.6
Took 113ms
bin/pip install datasette>=0.59 datasette-app-support>=0.11.6 datasette-vega>=0.6.2 datasette-cluster-map>=0.17.1 datasette-pretty-json>=0.2.1 datasette-edit-schema>=0.4 datasette-configure-fts>=1.1 datasette-leaflet>=0.2.2 --disable-pip-version-check
Requirement already satisfied: datasette>=0.59 in lib/python3.9/site-packages (0.63)
Requirement already satisfied: datasette-app-support>=0.11.6 in lib/python3.9/site-packages (0.11.6)
Requirement already satisfied: datasette-vega>=0.6.2 in lib/python3.9/site-packages (0.6.2)
Requirement already satisfied: datasette-cluster-map>=0.17.1 in lib/python3.9/site-packages (0.17.2)
Requirement already satisfied: datasette-pretty-json>=0.2.1 in lib/python3.9/site-packages (0.2.2)
Requirement already satisfied: datasette-edit-schema>=0.4 in lib/python3.9/site-packages (0.5.1)
Requirement already satisfied: datasette-configure-fts>=1.1 in lib/python3.9/site-packages (1.1)
Requirement already satisfied: datasette-leaflet>=0.2.2 in lib/python3.9/site-packages (0.2.2)
Requirement already satisfied: click>=7.1.1 in lib/python3.9/site-packages (from datasette>=0.59) (8.1.3)
Requirement already satisfied: hupper>=1.9 in lib/python3.9/site-packages (from datasette>=0.59) (1.10.3)
Requirement already satisfied: pint>=0.9 in lib/python3.9/site-packages (from datasette>=0.59) (0.20.1)
Requirement already satisfied: PyYAML>=5.3 in lib/python3.9/site-packages (from datasette>=0.59) (6.0)
Requirement already satisfied: httpx>=0.20 in lib/python3.9/site-packages (from datasette>=0.59) (0.23.0)
Requirement already satisfied: aiofiles>=0.4 in lib/python3.9/site-packages (from datasette>=0.59) (22.1.0)
Requirement already satisfied: asgi-csrf>=0.9 in lib/python3.9/site-packages (from datasette>=0.59) (0.9)
Requirement already satisfied: asgiref>=3.2.10 in lib/python3.9/site-packages (from datasette>=0.59) (3.5.2)
Requirement already satisfied: uvicorn>=0.11 in lib/python3.9/site-packages (from datasette>=0.59) (0.19.0)
Requirement already satisfied: itsdangerous>=1.1 in lib/python3.9/site-packages (from datasette>=0.59) (2.1.2)
Requirement already satisfied: click-default-group-wheel>=1.2.2 in lib/python3.9/site-packages (from datasette>=0.59) (1.2.2)
Requirement already satisfied: janus>=0.6.2 in lib/python3.9/site-packages (from datasette>=0.59) (1.0.0)
Requirement already satisfied: pluggy>=1.0 in lib/python3.9/site-packages (from datasette>=0.59) (1.0.0)
Requirement already satisfied: Jinja2>=2.10.3 in lib/python3.9/site-packages (from datasette>=0.59) (3.1.2)
Requirement already satisfied: mergedeep>=1.1.1 in lib/python3.9/site-packages (from datasette>=0.59) (1.3.4)
Requirement already satisfied: sqlite-utils in lib/python3.9/site-packages (from datasette-app-support>=0.11.6) (3.30)
Requirement already satisfied: packaging in lib/python3.9/site-packages (from datasette-app-support>=0.11.6) (21.3)
Requirement already satisfied: python-multipart in lib/python3.9/site-packages (from asgi-csrf>=0.9->datasette>=0.59) (0.0.5)
Requirement already satisfied: httpcore<0.16.0,>=0.15.0 in lib/python3.9/site-packages (from httpx>=0.20->datasette>=0.59) (0.15.0)
Requirement already satisfied: certifi in lib/python3.9/site-packages (from httpx>=0.20->datasette>=0.59) (2022.9.24)
Requirement already satisfied: rfc3986[idna2008]<2,>=1.3 in lib/python3.9/site-packages (from httpx>=0.20->datasette>=0.59) (1.5.0)
Requirement already satisfied: sniffio in lib/python3.9/site-packages (from httpx>=0.20->datasette>=0.59) (1.3.0)
Requirement already satisfied: h11<0.13,>=0.11 in lib/python3.9/site-packages (from httpcore<0.16.0,>=0.15.0->httpx>=0.20->datasette>=0.59) (0.12.0)
Requirement already satisfied: anyio==3.* in lib/python3.9/site-packages (from httpcore<0.16.0,>=0.15.0->httpx>=0.20->datasette>=0.59) (3.6.2)
Requirement already satisfied: idna>=2.8 in lib/python3.9/site-packages (from anyio==3.*->httpcore<0.16.0,>=0.15.0->httpx>=0.20->datasette>=0.59) (3.4)
Requirement already satisfied: typing-extensions>=3.7.4.3 in lib/python3.9/site-packages (from janus>=0.6.2->datasette>=0.59) (4.4.0)
Requirement already satisfied: MarkupSafe>=2.0 in lib/python3.9/site-packages (from Jinja2>=2.10.3->datasette>=0.59) (2.1.1)
Requirement already satisfied: tabulate in lib/python3.9/site-packages (from sqlite-utils->datasette-app-support>=0.11.6) (0.9.0)
Requirement already satisfied: python-dateutil in lib/python3.9/site-packages (from sqlite-utils->datasette-app-support>=0.11.6) (2.8.2)
Requirement already satisfied: sqlite-fts4 in lib/python3.9/site-packages (from sqlite-utils->datasette-app-support>=0.11.6) (1.0.3)
Requirement already satisfied: pyparsing!=3.0.5,>=2.0.2 in lib/python3.9/site-packages (from packaging->datasette-app-support>=0.11.6) (3.0.9)
Requirement already satisfied: six>=1.5 in lib/python3.9/site-packages (from python-dateutil->sqlite-utils->datasette-app-support>=0.11.6) (1.16.0)
Took 784ms
STUCK
```

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1994/reactions",
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1552368054 I_kwDOBm6k_c5ch0G2 2000 rewrite_sql hook cldellow 193185 open 0     1 2023-01-23T01:02:52Z 2023-01-23T06:08:01Z   CONTRIBUTOR  

I'm not sold that this is a good idea, but thought it'd be worth writing up a ticket. Proposal: add a hook like

    def rewrite_sql(datasette, database, request, fn, sql, params)

It would be called from Database.execute, Database.execute_write, Database.execute_write_script, Database.execute_write_many before running the user's SQL. fn would indicate which method was being used, in case that's relevant for the SQL inspection -- for example execute only permits a single statement.

The hook could return a SQL statement to be executed instead, or an async function that, when awaited, returns the SQL to be executed.

Plugins that could be written with this hook:

  • https://github.com/cldellow/datasette-ersatz-table-valued-functions would use this to avoid monkey-patching
  • a plugin to inspect and reject unsafe Spatialite function calls (reported by Simon in Discord)
  • a plugin to do more general rewrites of queries to enforce table or row-level security, for example, based on the currently logged in actor's ID
  • a plugin to maintain audit tables when users write to a table
  • a plugin to cache expensive queries (eg the queries that drive facets) - these could allow stale reads if previously cached, then refresh them in an offline queue

Flaws with this idea:

execute_fn and execute_write_fn would not go through this hook, which limits the guarantees you can make about it for security purposes.
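
For illustration, here is a rough sketch of how the SpatiaLite-guard idea from the list above might look if this proposed hook existed. Everything below is hypothetical: the hook name, its arguments and the "return None to leave the SQL unchanged" convention come from this proposal, not from any shipped Datasette API, and the blocked function names are placeholders.

```python
from datasette import hookimpl

BLOCKED_FUNCTIONS = ("somedangerousfunction", "anotherone")  # placeholders

@hookimpl
def rewrite_sql(datasette, database, request, fn, sql, params):
    lowered = sql.lower()
    if any(name in lowered for name in BLOCKED_FUNCTIONS):
        # Swap in a harmless statement; a real plugin might instead raise an
        # exception that Datasette could translate into a 403 response.
        return "select 'blocked' as error"
    # Returning None would mean: leave the SQL untouched.
    return None
```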

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2000/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
774332247 MDExOlB1bGxSZXF1ZXN0NTQ1MjY0NDM2 1159 Improve the display of facets information lovasoa 552629 open 0     8 2020-12-24T11:01:47Z 2023-01-22T20:58:11Z   FIRST_TIME_CONTRIBUTOR simonw/datasette/pulls/1159

This PR changes the display of facets to hopefully make them more readable.

Before / After comparison screenshots (not shown).

datasette 107914493 pull    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1159/reactions",
    "total_count": 4,
    "+1": 4,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
0  
749283032 MDU6SXNzdWU3NDkyODMwMzI= 1101 register_output_renderer() should support streaming data simonw 9599 open 0   Datasette 1.0 3268330 13 2020-11-24T02:17:09Z 2023-01-21T22:07:19Z   OWNER  

I'd like to implement this by first extending the register_output_renderer() hook to support streaming huge responses, then switching CSV to use the plugin hook in addition to TSV using it.

Originally posted by @simonw in https://github.com/simonw/datasette/issues/1096#issuecomment-732542285
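
For context, this is roughly what a small non-streaming renderer looks like with the hook as it exists today (a sketch based on the documented hook, with an invented txt extension); the limitation this issue is about is that the render callback has to build the entire response body before returning it.

```python
from datasette import hookimpl
from datasette.utils.asgi import Response

@hookimpl
def register_output_renderer(datasette):
    return {
        "extension": "txt",
        "render": render_txt,
        "can_render": lambda columns: True,
    }

async def render_txt(rows, columns):
    # Materializes everything in memory - exactly what streaming would avoid.
    body = "\n".join("\t".join(str(row[c]) for c in columns) for row in rows)
    return Response.text(body)
```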

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1101/reactions",
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1551113681 I_kwDOBm6k_c5cdB3R 1998 `datasette --version` should also show the SQLite version simonw 9599 open 0     2 2023-01-20T16:11:30Z 2023-01-20T18:19:06Z   OWNER  

Idea came up here: https://discord.com/channels/823971286308356157/823971286941302908/1066026473003159783
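
For reference, the underlying lookup is a one-liner against the standard library (not necessarily how the --version flag would implement it, just where the number comes from when Datasette is using the stdlib sqlite3 driver):

```python
import sqlite3

# Version of the SQLite library that the sqlite3 module is linked against -
# the number that would be useful to show alongside Datasette's own version.
print(sqlite3.sqlite_version)        # e.g. "3.39.4"
print(sqlite3.sqlite_version_info)   # e.g. (3, 39, 4)
```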

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1998/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1533673397 I_kwDOBm6k_c5baf-1 1991 fts5 tables are not auto-detected and hidden keturn 83819 open 0     0 2023-01-15T06:00:42Z 2023-01-20T04:54:24Z   NONE  

I set up a Datasette instance and was following the docs on full-text search.

When I used fts4, Datasette automatically hid the FTS tables and added the FTS search box where appropriate, but when I changed to fts5 it no longer did either.

If I manually set fts_table for a view, then search does work as expected.

My table and view creation code looks like this:

    connection.execute("""CREATE TABLE IF NOT EXISTS captions(image_key text PRIMARY KEY, caption text NOT NULL) """)
    connection.execute("""CREATE VIRTUAL TABLE captions_fts USING fts5(caption, image_key UNINDEXED, content=captions) """)
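
A possible workaround, shown here as the equivalent Python metadata dict rather than metadata.yml (the captions.db filename and database key are assumptions; fts_table and fts_pk are the documented table metadata keys):

```python
from datasette.app import Datasette

# Point the table at its fts5 index explicitly instead of relying on
# auto-detection.
ds = Datasette(
    ["captions.db"],
    metadata={
        "databases": {
            "captions": {
                "tables": {
                    "captions": {
                        "fts_table": "captions_fts",
                        "fts_pk": "image_key",
                    }
                }
            }
        }
    },
)
```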

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1991/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1538342965 PR_kwDOBm6k_c5HpNYo 1996 Document custom json encoder eyeseast 25778 open 0     1 2023-01-18T16:54:14Z 2023-01-19T12:55:57Z   CONTRIBUTOR simonw/datasette/pulls/1996

Closes #1983

All documentation here. Edits welcome.


:books: Documentation preview :books:: https://datasette--1996.org.readthedocs.build/en/1996/

datasette 107914493 pull    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1996/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
0  
1034535001 I_kwDOBm6k_c49qcBZ 1497 Publish to Docker Hub failing with "libcrypt.so.1: cannot open shared object file" simonw 9599 closed 0     18 2021-10-24T22:57:07Z 2023-01-18T17:13:45Z 2021-10-24T23:36:55Z OWNER  

This means the Datasette 0.59.1 release has not been published to Docker Hub.

Here's where that failed: https://github.com/simonw/datasette/runs/3991043374?check_suite_focus=true

Preparing to unpack .../libc6_2.32-4_amd64.deb ... debconf: unable to initialize frontend: Dialog debconf: (TERM is not set, so the dialog frontend is not usable.) debconf: falling back to frontend: Readline debconf: unable to initialize frontend: Readline debconf: (Can't locate Term/ReadLine.pm in @INC (you may need to install the Term::ReadLine module) (@INC contains: /etc/perl /usr/local/lib/x86_64-linux-gnu/perl/5.28.1 /usr/local/share/perl/5.28.1 /usr/lib/x86_64-linux-gnu/perl5/5.28 /usr/share/perl5 /usr/lib/x86_64-linux-gnu/perl/5.28 /usr/share/perl/5.28 /usr/local/lib/site_perl /usr/lib/x86_64-linux-gnu/perl-base) at /usr/share/perl5/Debconf/FrontEnd/Readline.pm line 7.) debconf: falling back to frontend: Teletype Checking for services that may need to be restarted... Checking init scripts... Unpacking libc6:amd64 (2.32-4) over (2.28-10) ... Setting up libc6:amd64 (2.32-4) ... /usr/bin/perl: error while loading shared libraries: libcrypt.so.1: cannot open shared object file: No such file or directory dpkg: error processing package libc6:amd64 (--configure): installed libc6:amd64 package post-installation script subprocess returned error exit status 127 Errors were encountered while processing: libc6:amd64 E: Sub-process /usr/bin/dpkg returned an error code (1) The command '/bin/sh -c apt-get update && apt-get -y --no-install-recommends install software-properties-common && add-apt-repository "deb http://httpredir.debian.org/debian sid main" && apt-get update && apt-get -t sid install -y --no-install-recommends libsqlite3-mod-spatialite && apt-get remove -y software-properties-common && apt clean && rm -rf /var/lib/apt && rm -rf /var/lib/dpkg/info/*' returned a non-zero code: 100 Same problem when I attempted to publish using the "Push specific Docker tag" workflow: https://github.com/simonw/datasette/runs/3991059912?check_suite_focus=true

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1497/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed
1538197093 I_kwDOBm6k_c5brwZl 1995 foreign_keys error 500 jonschoning 137183 open 0     0 2023-01-18T15:27:36Z 2023-01-18T16:44:01Z   NONE  

Error 500: expected string or bytes-like object

espial-new.sqlite3.zip

Run datasette espial-new.sqlite3 and click on any table other than User.

/home/jon/.local/lib/python3.10/site-packages/datasette/app.py:814 in │ │ expand_foreign_keys │ │ │ │ 811 │ │ │ from {other_table} │ │ 812 │ │ │ where {other_column} in ({placeholders}) │ │ 813 │ │ """.format( │ │ ❱ 814 │ │ │ other_column=escape_sqlite(fk["other_column"]), │ │ 815 │ │ │ label_column=escape_sqlite(label_column), │ │ 816 │ │ │ other_table=escape_sqlite(fk["other_table"]), │ │ 817 │ │ │ placeholders=", ".join(["?"] * len(set(values))), │ │ │ │ ╭───────────────────────────── locals ──────────────────────────────╮ │ │ │ column = 'user_id' │ │ │ │ database = 'espial-new' │ │ │ │ db = <Database: espial-new (mutable, size=53248)> │ │ │ │ fk = { │ │ │ │ │ 'column': 'user_id', │ │ │ │ │ 'other_table': 'user', │ │ │ │ │ 'other_column': None │ │ │ │ } │ │ │ │ foreign_keys = [ │ │ │ │ │ { │ │ │ │ │ │ 'column': 'user_id', │ │ │ │ │ │ 'other_table': 'user', │ │ │ │ │ │ 'other_column': None │ │ │ │ │ } │ │ │ │ ] │ │ │ │ label_column = 'name' │ │ │ │ labeled_fks = {} │ │ │ │ self = <datasette.app.Datasette object at 0x7f0f2e77e980> │ │ │ │ table = 'bookmark' │ │ │ │ values = [] │ │ │ ╰───────────────────────────────────────────────────────────────────╯ │ │ │ │ /home/jon/.local/lib/python3.10/site-packages/datasette/utils/__init__.py:346 │ │ in escape_sqlite │ │ │ │ 343 │ │ 344 │ │ 345 def escape_sqlite(s): │ │ ❱ 346 │ if _boring_keyword_re.match(s) and (s.lower() not in reserved_words) │ │ 347 │ │ return s │ │ 348 │ else: │ │ 349 │ │ return f"[{s}]" │ │ │ │ ╭─ locals ─╮ │ │ │ s = None │ │ │ ╰──────────╯ │ ╰─────────────────────────────────────────────────────────────────────────────────╯ TypeError: expected string or bytes-like object Traceback (most recent call last): File "/home/jon/.local/lib/python3.10/site-packages/datasette/app.py", line 1354, in route_path response = await view(request, send) File "/home/jon/.local/lib/python3.10/site-packages/datasette/views/base.py", line 134, in view return await self.dispatch_request(request) File "/home/jon/.local/lib/python3.10/site-packages/datasette/views/base.py", line 91, in dispatch_request return await handler(request) File "/home/jon/.local/lib/python3.10/site-packages/datasette/views/base.py", line 361, in get response_or_template_contexts = await self.data(request, **data_kwargs) File "/home/jon/.local/lib/python3.10/site-packages/datasette/views/table.py", line 158, in data return await self._data_traced(request, default_labels, _next, _size) File "/home/jon/.local/lib/python3.10/site-packages/datasette/views/table.py", line 603, in _data_traced await self.ds.expand_foreign_keys( File "/home/jon/.local/lib/python3.10/site-packages/datasette/app.py", line 814, in expand_foreign_keys other_column=escape_sqlite(fk["other_column"]), File "/home/jon/.local/lib/python3.10/site-packages/datasette/utils/__init__.py", line 346, in escape_sqlite if _boring_keyword_re.match(s) and (s.lower() not in reserved_words): TypeError: expected string or bytes-like object INFO: 127.0.0.1:38574 - "GET /espial-new/bookmark HTTP/1.1" 500 Internal Server Error INFO: 127.0.0.1:38574 - "GET /-/static/app.css?d59929 HTTP/1.1" 200 OK

Schema: ``` CREATE TABLE IF NOT EXISTS "user" ( "id" INTEGER PRIMARY KEY, "name" VARCHAR NOT NULL, "password_hash" VARCHAR NOT NULL, "api_token" VARCHAR NULL, "private_default" BOOLEAN NOT NULL, "archive_default" BOOLEAN NOT NULL, "privacy_lock" BOOLEAN NOT NULL, CONSTRAINT "unique_user_name" UNIQUE ("name") );

CREATE TABLE IF NOT EXISTS "bookmark" ( "id" INTEGER PRIMARY KEY, "user_id" INTEGER NOT NULL REFERENCES "user" ON DELETE RESTRICT ON UPDATE RESTRICT, "slug" VARCHAR NOT NULL DEFAULT (Lower(Hex(Randomblob(6)))), "href" VARCHAR NOT NULL, "description" VARCHAR NOT NULL, "extended" VARCHAR NOT NULL, "time" TIMESTAMP NOT NULL, "shared" BOOLEAN NOT NULL, "to_read" BOOLEAN NOT NULL, "selected" BOOLEAN NOT NULL, "archive_href" VARCHAR NULL, CONSTRAINT "unique_user_href" UNIQUE ("user_id", "href"), CONSTRAINT "unique_user_slug" UNIQUE ("user_id", "slug") );

CREATE TABLE IF NOT EXISTS "bookmark_tag" ( "id" INTEGER PRIMARY KEY, "user_id" INTEGER NOT NULL REFERENCES "user" ON DELETE RESTRICT ON UPDATE RESTRICT, "tag" VARCHAR NOT NULL, "bookmark_id" INTEGER NOT NULL REFERENCES "bookmark" ON DELETE RESTRICT ON UPDATE RESTRICT, "seq" INTEGER NOT NULL, CONSTRAINT "unique_user_tag_bookmark_id" UNIQUE ("user_id", "tag", "bookmark_id"), CONSTRAINT "unique_user_bookmark_id_tag_seq" UNIQUE ("user_id", "bookmark_id", "tag", "seq") );

CREATE TABLE IF NOT EXISTS "note" ( "id" INTEGER PRIMARY KEY, "user_id" INTEGER NOT NULL REFERENCES "user" ON DELETE RESTRICT ON UPDATE RESTRICT, "slug" VARCHAR NOT NULL DEFAULT (Lower(Hex(Randomblob(10)))), "length" INTEGER NOT NULL, "title" VARCHAR NOT NULL, "text" VARCHAR NOT NULL, "is_markdown" BOOLEAN NOT NULL, "shared" BOOLEAN NOT NULL DEFAULT false, "created" TIMESTAMP NOT NULL, "updated" TIMESTAMP NOT NULL ); CREATE INDEX idx_bookmark_time ON bookmark (user_id, time DESC); CREATE INDEX idx_bookmark_tag_bookmark_id ON bookmark_tag (bookmark_id, id, tag, seq); CREATE INDEX idx_note_user_created ON note (user_id, created DESC); ```

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1995/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
627794879 MDU6SXNzdWU2Mjc3OTQ4Nzk= 782 Redesign default .json format simonw 9599 open 0   Datasette 1.0a3 8755003 54 2020-05-30T18:47:07Z 2023-01-17T02:05:45Z   OWNER  

The default JSON just isn't right. I find myself using ?_shape=array for almost everything I build against the API.

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/782/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1534904478 PR_kwDOBm6k_c5HdwRg 1992 Bump blacken-docs from 1.12.1 to 1.13.0 dependabot[bot] 49699333 open 0     1 2023-01-16T13:05:05Z 2023-01-16T13:12:32Z   CONTRIBUTOR simonw/datasette/pulls/1992

Bumps blacken-docs from 1.12.1 to 1.13.0.

Changelog

Sourced from blacken-docs's changelog.

1.13.0 (2023-01-16)

  • Note Adam Johnson is new maintainer.

  • Require Black 22.1.0+.

  • Add --rst-literal-blocks option, to also format text in reStructuredText literal blocks, starting with ::. Sphinx highlights these with the project’s default language, which defaults to Python.

Commits
  • 1238e1d Version 1.13.0
  • 4e6dc07 Fix setup.cfg long_description_content_type
  • 579a71a Standardize setup.cfg (#212)
  • a6b2ba0 Changelog entry about change in maintenance
  • 3cf8b9a Standard pre-commit config (#211)
  • bcd3669 Standardize test file name (#210)
  • 6d1771d Remove setup.py (#209)
  • 4e5ab6e Improve README (#208)
  • c19c57f Add support for reStructuredText literal blocks (#196)
  • 6af8099 Move from tmpdir pytest fixture to tmp_path (#206)
  • Additional commits viewable in compare view


Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR: - `@dependabot rebase` will rebase this PR - `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it - `@dependabot merge` will merge this PR after your CI passes on it - `@dependabot squash and merge` will squash and merge this PR after your CI passes on it - `@dependabot cancel merge` will cancel a previously requested merge and block automerging - `@dependabot reopen` will reopen this PR if it is closed - `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually - `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)

:books: Documentation preview :books:: https://datasette--1992.org.readthedocs.build/en/1992/

datasette 107914493 pull    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1992/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
0  
1532000914 I_kwDOBm6k_c5bUHqS 1990 Suggestion: Highlight error messages ('These facets timed out') pax 116795 open 0     0 2023-01-13T09:40:58Z 2023-01-13T09:40:58Z   NONE  

I had trouble figuring out why faceting didn't work in some instances; it took a while before I noticed the "These facets timed out" notice.

It might help if that notice were highlighted, or given a fading highlight if a permanent one would be too visually distracting.

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1990/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1529707837 I_kwDOBm6k_c5bLX09 1988 Reconsider pattern where plugins could break existing template context simonw 9599 open 0   Datasette 1.0 3268330 4 2023-01-11T21:13:43Z 2023-01-11T21:25:05Z   OWNER  

I hadn't run into an issue with plugins like datasette-template-sql interfering with the existing context for other features before! Definitely not a good thing.

Originally posted by @simonw in https://github.com/simonw/datasette-write/issues/6#issuecomment-1379490596

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1988/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1529452371 I_kwDOBm6k_c5bKZdT 1987 installpython3.com is now a spam website simonw 9599 closed 0     4 2023-01-11T17:55:12Z 2023-01-11T18:29:26Z 2023-01-11T18:29:25Z OWNER  

Need to stop linking to it from the docs.

I'll link to https://www.python.org/about/gettingstarted/ instead.

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1987/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed
1528448642 I_kwDOBm6k_c5bGkaC 1985 Don't let Datasette(path) without a list cause weird errors simonw 9599 closed 0     1 2023-01-11T05:17:44Z 2023-01-11T18:25:04Z 2023-01-11T18:25:04Z OWNER  

I got a confusing `sqlite3.OperationalError: disk I/O error` error in my tests. It turned out it was because this: `ds = Datasette(path)` should have been this: `ds = Datasette([path])`

Originally posted by @simonw in https://github.com/simonw/datasette-faiss/issues/1#issuecomment-1378252673
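
A sketch of the kind of early check that would turn this into a clear error (the function name and message wording are illustrative, not necessarily what Datasette ended up shipping):

```python
from pathlib import Path

def check_files_argument(files):
    # Catch the single-string (or single-Path) mistake up front instead of
    # letting SQLite fail later with a misleading disk I/O error.
    if isinstance(files, (str, Path)):
        raise ValueError(
            f"files= should be a list of paths, not a single path - "
            f"did you mean [{files!r}]?"
        )
```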

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1985/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed
1528995601 PR_kwDOBm6k_c5HJ55o 1986 Bump sphinx from 6.1.2 to 6.1.3 dependabot[bot] 49699333 open 0     0 2023-01-11T13:02:36Z 2023-01-11T13:02:58Z   CONTRIBUTOR simonw/datasette/pulls/1986

Bumps sphinx from 6.1.2 to 6.1.3.

Release notes

Sourced from sphinx's releases.

v6.1.3

Changelog: https://www.sphinx-doc.org/en/master/changes.html

Changelog

Sourced from sphinx's changelog.

Release 6.1.3 (released Jan 10, 2023)

Bugs fixed

  • #11116: Reverted to previous Sphinx 5 node copying method
  • #11117: Reverted changes to parallel image processing from Sphinx 6.1.0
  • #11119: Supress ValueError in the linkcheck builder
Commits
  • 776d01e Bump to 6.1.3 final
  • a2e922a CHANGES for Sphinx 6.1.3
  • 31162a9 Handle exceptions for get_node_source and get_node_line
  • dcb4429 Restore Sphinx 5 nodes.Element copying behaviour
  • 2a7c40d Undo parallel image changes
  • 7841d3d Ignore more checks in Ruff 0.0.214
  • ddbc5b5 Bump version
  • See full diff in compare view


Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR: - `@dependabot rebase` will rebase this PR - `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it - `@dependabot merge` will merge this PR after your CI passes on it - `@dependabot squash and merge` will squash and merge this PR after your CI passes on it - `@dependabot cancel merge` will cancel a previously requested merge and block automerging - `@dependabot reopen` will reopen this PR if it is closed - `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually - `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)

:books: Documentation preview :books:: https://datasette--1986.org.readthedocs.build/en/1986/

datasette 107914493 pull    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1986/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
0  
1525560504 PR_kwDOBm6k_c5G-ZsQ 1982 Bump sphinx from 5.3.0 to 6.1.2 dependabot[bot] 49699333 closed 0     1 2023-01-09T13:06:11Z 2023-01-10T02:03:21Z 2023-01-10T02:03:19Z CONTRIBUTOR simonw/datasette/pulls/1982

Bumps sphinx from 5.3.0 to 6.1.2.

Release notes

Sourced from sphinx's releases.

v6.1.2

Changelog: https://www.sphinx-doc.org/en/master/changes.html

v6.1.1

Changelog: https://www.sphinx-doc.org/en/master/changes.html

v6.1.0

Changelog: https://www.sphinx-doc.org/en/master/changes.html

v6.0.1

Changelog: https://www.sphinx-doc.org/en/master/changes.html

v6.0.0

Changelog: https://www.sphinx-doc.org/en/master/changes.html

v6.0.0b2

Changelog: https://www.sphinx-doc.org/en/master/changes.html

v6.0.0b1

Changelog: https://www.sphinx-doc.org/en/master/changes.html

Changelog

Sourced from sphinx's changelog.

Release 6.1.2 (released Jan 07, 2023)

Bugs fixed

  • #11101: LaTeX: div.topic_padding key of sphinxsetup documented at 5.1.0 was implemented with name topic_padding

  • #11099: LaTeX: shadowrule key of sphinxsetup causes PDF build to crash since Sphinx 5.1.0

  • #11096: LaTeX: shadowsize key of sphinxsetup causes PDF build to crash since Sphinx 5.1.0

  • #11095: LaTeX: shadow of :dudir:topic and contents_ boxes not in page margin since Sphinx 5.1.0

    .. _contents: https://docutils.sourceforge.io/docs/ref/rst/directives.html#table-of-contents

  • #11100: Fix copying images when running under parallel mode.

Release 6.1.1 (released Jan 05, 2023)

Bugs fixed

  • #11091: Fix util.nodes.apply_source_workaround for literal_block nodes with no source information in the node or the node's parents.

Release 6.1.0 (released Jan 05, 2023)

Dependencies

  • Adopted the Ruff_ code linter.

    .. _Ruff: https://github.com/charliermarsh/ruff

Incompatible changes

  • #10979: gettext: Removed support for pluralisation in get_translation. This was unused and complicated other changes to sphinx.locale.

Deprecated

  • sphinx.util functions:

    • Renamed sphinx.util.typing.stringify() to sphinx.util.typing.stringify_annotation()

... (truncated)

Commits
  • 393b408 Bump to 6.1.2 final
  • d8a5dd8 Add note to CHANGES for PR 11100
  • a1cd19e Fix copying images under parallel execution (#11100)
  • 5008291 Ignore more checks in Ruff 0.0.213
  • 6259c2b Markup typo in docs
  • 7945aeb LaTeX: fix 5.1.0 bugs related to topic and contents boxes (#11102)
  • 77aaa86 Bump to 6.1.1 final
  • 476c115 Suppress ValueError in apply_source_workaround (#11092)
  • c80d656 Bump version
  • 4e1004a Bump to 6.1.0 final
  • Additional commits viewable in compare view


Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR: - `@dependabot rebase` will rebase this PR - `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it - `@dependabot merge` will merge this PR after your CI passes on it - `@dependabot squash and merge` will squash and merge this PR after your CI passes on it - `@dependabot cancel merge` will cancel a previously requested merge and block automerging - `@dependabot reopen` will reopen this PR if it is closed - `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually - `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)

:books: Documentation preview :books:: https://datasette--1982.org.readthedocs.build/en/1982/

datasette 107914493 pull    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1982/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
0  
1515185383 I_kwDOBm6k_c5aT-Tn 1971 Upgrade for Sphinx 6.0 (once Furo has support for it) simonw 9599 closed 0     3 2022-12-31T19:04:35Z 2023-01-10T02:02:34Z 2023-01-10T02:02:34Z OWNER  

A deployment of #1967 to ReadTheDocs just failed like this: https://readthedocs.org/projects/datasette/builds/19045460/

``` Running Sphinx v6.0.0 making output directory... done building [mo]: targets for 0 po files that are out of date building [html]: targets for 28 source files that are out of date updating environment: [new config] 28 added, 0 changed, 0 removed reading sources... [ 3%] authentication reading sources... [ 7%] binary_data reading sources... [ 10%] changelog

Traceback (most recent call last): File "/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/statemachine.py", line 299, in next_line self.line = self.input_lines[self.line_offset] File "/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/statemachine.py", line 1136, in getitem return self.data[i] IndexError: list index out of range

During handling of the above exception, another exception occurred:

Traceback (most recent call last): File "/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/statemachine.py", line 226, in run self.next_line() File "/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/statemachine.py", line 302, in next_line raise EOFError EOFError

During handling of the above exception, another exception occurred:

Traceback (most recent call last): File "/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/sphinx/cmd/build.py", line 281, in build_main app.build(args.force_all, args.filenames) File "/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/sphinx/application.py", line 344, in build self.builder.build_update() File "/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/sphinx/builders/init.py", line 310, in build_update self.build(to_build, File "/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/sphinx/builders/init.py", line 326, in build updated_docnames = set(self.read()) File "/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/sphinx/builders/init.py", line 433, in read self._read_serial(docnames) File "/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/sphinx/builders/init.py", line 454, in _read_serial self.read_doc(docname) File "/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/sphinx/builders/init.py", line 510, in read_doc publisher.publish() File "/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/core.py", line 224, in publish self.document = self.reader.read(self.source, self.parser, File "/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/sphinx/io.py", line 103, in read self.parse() File "/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/readers/init.py", line 76, in parse self.parser.parse(self.input, document) File "/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/sphinx/parsers.py", line 78, in parse self.statemachine.run(inputlines, document, inliner=self.inliner) File "/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py", line 169, in run results = StateMachineWS.run(self, input_lines, input_offset, File "/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/statemachine.py", line 233, in run context, next_state, result = self.check_line( File "/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/statemachine.py", line 445, in check_line return method(match, context, next_state) File "/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py", line 3024, in text self.section(title.lstrip(), source, style, lineno + 1, messages) File "/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py", line 325, in section self.new_subsection(title, lineno, messages) File "/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py", line 391, in new_subsection newabsoffset = self.nested_parse( File "/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py", line 279, in nested_parse state_machine.run(block, input_offset, memo=self.memo, File 
"/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py", line 195, in run results = StateMachineWS.run(self, input_lines, input_offset) File "/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/statemachine.py", line 233, in run context, next_state, result = self.check_line( File "/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/statemachine.py", line 445, in check_line return method(match, context, next_state) File "/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py", line 2785, in underline self.section(title, source, style, lineno - 1, messages) File "/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py", line 325, in section self.new_subsection(title, lineno, messages) File "/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py", line 391, in new_subsection newabsoffset = self.nested_parse( File "/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py", line 279, in nested_parse state_machine.run(block, input_offset, memo=self.memo, File "/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py", line 195, in run results = StateMachineWS.run(self, input_lines, input_offset) File "/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/statemachine.py", line 233, in run context, next_state, result = self.check_line( File "/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/statemachine.py", line 445, in check_line return method(match, context, next_state) File "/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py", line 1273, in bullet i, blank_finish = self.list_item(match.end()) File "/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py", line 1295, in list_item self.nested_parse(indented, input_offset=line_offset, File "/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py", line 279, in nested_parse state_machine.run(block, input_offset, memo=self.memo, File "/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py", line 195, in run results = StateMachineWS.run(self, input_lines, input_offset) File "/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/statemachine.py", line 239, in run result = state.eof(context) File "/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py", line 2725, in eof self.blank(None, context, None) File "/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py", line 2716, in blank paragraph, literalnext = self.paragraph( File 
"/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py", line 416, in paragraph textnodes, messages = self.inline_text(text, lineno) File "/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py", line 425, in inline_text nodes, messages = self.inliner.parse(text, lineno, File "/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py", line 649, in parse before, inlines, remaining, sysmessages = method(self, match, File "/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py", line 792, in interpreted_or_phrase_ref nodelist, messages = self.interpreted(rawsource, escaped, role, File "/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py", line 889, in interpreted nodes, messages2 = role_fn(role, rawsource, text, lineno, self) File "/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/sphinx/ext/extlinks.py", line 101, in role title = caption % part TypeError: not all arguments converted during string formatting

Exception occurred: File "/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/sphinx/ext/extlinks.py", line 101, in role title = caption % part TypeError: not all arguments converted during string formatting The full traceback has been saved in /tmp/sphinx-err-kq7ylgqo.log, if you want to report the issue to the developers. Please also report this if it was a user error, so that a better error message can be provided next time. A bug report can be filed in the tracker at https://github.com/sphinx-doc/sphinx/issues. Thanks! ```

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1971/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed
1526635374 PR_kwDOBm6k_c5HCCY2 1984 Upgrade Sphinx simonw 9599 closed 0     1 2023-01-10T02:00:40Z 2023-01-10T02:02:33Z 2023-01-10T02:02:33Z OWNER simonw/datasette/pulls/1984

Refs #1971


:books: Documentation preview :books:: https://datasette--1984.org.readthedocs.build/en/1984/

datasette 107914493 pull    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1984/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
0  
1509783085 I_kwDOBm6k_c5Z_XYt 1969 sql-formatter javascript is not now working with CloudFlare rocketloader fgregg 536941 open 0     0 2022-12-23T21:14:06Z 2023-01-10T01:56:33Z   CONTRIBUTOR  

This is probably not a bug with datasette, but I thought you might want to know, @simonw.

I noticed today that my CloudFlare-proxied Datasette instance lost the "Format SQL" option. I'm pretty sure it was there last week.

In the CloudFlare settings, if I turn off Rocket Loader, I get the "Format SQL" option back.

Rocket Loader works by asynchronously loading the JavaScript, so maybe there was a recent change that doesn't play well with the async loading?

I'm up to date with https://github.com/simonw/datasette/commit/e03aed00026cc2e59c09ca41f69a247e1a85cc89

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1969/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1525815985 I_kwDOBm6k_c5a8hqx 1983 Make CustomJSONEncoder a documented public API simonw 9599 open 0     3 2023-01-09T15:27:05Z 2023-01-09T15:35:58Z   OWNER  

It's used by datasette-geojson here: https://github.com/eyeseast/datasette-geojson/commit/902bf135a5a33a0dc8264673d00a59a67cb05152
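
The underlying pattern is the standard json.JSONEncoder subclass hook; a generic sketch of that pattern (this is not Datasette's CustomJSONEncoder itself, just the shape a documented public API would describe):

```python
import json
from datetime import date

class RowEncoder(json.JSONEncoder):
    def default(self, obj):
        # Handle value types that the stock encoder refuses to serialize.
        if isinstance(obj, bytes):
            return obj.decode("utf-8", "replace")
        if isinstance(obj, date):
            return obj.isoformat()
        return super().default(obj)

print(json.dumps({"when": date(2023, 1, 9), "blob": b"abc"}, cls=RowEncoder))
```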

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1983/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1426080014 I_kwDOBm6k_c5VAEEO 1867 /db/table/-/rename API (also allows atomic replace) simonw 9599 open 0   Datasette 1.0a3 8755003 1 2022-10-27T18:13:23Z 2023-01-09T15:34:12Z   OWNER  

There's one catch with batched inserts: if your CLI tool fails halfway through, you could end up with a partially populated table, since a bunch of batches will have succeeded first.

...

If people care about that kind of thing they could always push all of their inserts to a table called _tablename and then atomically rename that once they've uploaded all of the data (assuming I provide an atomic-rename-this-table mechanism).

Originally posted by @simonw in https://github.com/simonw/datasette/issues/1866#issuecomment-1293893789
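
A sketch of the "upload to _tablename then swap" idea from the quoted comment, using plain sqlite3 in a single explicit transaction (file and table names are made up; this is not an API Datasette provides yet):

```python
import sqlite3

conn = sqlite3.connect("data.db")
conn.isolation_level = None          # manage the transaction explicitly
conn.execute("BEGIN")
try:
    conn.execute("DROP TABLE IF EXISTS tablename")
    conn.execute("ALTER TABLE _tablename RENAME TO tablename")
    conn.execute("COMMIT")           # readers see either the old or the new table
except Exception:
    conn.execute("ROLLBACK")
    raise
```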

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1867/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
710650633 MDU6SXNzdWU3MTA2NTA2MzM= 979 Default table view JSON should include CREATE TABLE simonw 9599 closed 0     3 2020-09-28T23:54:58Z 2023-01-09T15:32:39Z 2023-01-09T15:32:22Z OWNER  

https://latest.datasette.io/fixtures/facetable.json doesn't currently include the CREATE TABLE statement for the page, even though it's available on the HTML version at https://latest.datasette.io/fixtures/facetable
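
For reference, the statement shown on the HTML page is available straight from sqlite_master; a sketch of the lookup (assuming a local copy of fixtures.db):

```python
import sqlite3

conn = sqlite3.connect("fixtures.db")
row = conn.execute(
    "select sql from sqlite_master where type = 'table' and name = ?",
    ["facetable"],
).fetchone()
create_table_sql = row[0] if row else None
print(create_table_sql)
```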

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/979/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed
1082584499 I_kwDOBm6k_c5Ahu2z 1558 Redesign `facet_results` JSON structure prior to Datasette 1.0 simonw 9599 open 0   Datasette 1.0 3268330 3 2021-12-16T19:45:10Z 2023-01-09T15:31:17Z   OWNER  

Decision: as an initial fix I'm going to de-duplicate those keys by using tags__array etc - with a _2 on the end if that key is already used.

I'll open a separate issue to redesign this better for Datasette 1.0.

Originally posted by @simonw in https://github.com/simonw/datasette/issues/625#issuecomment-996130862

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1558/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1522778923 I_kwDOBm6k_c5aw8Mr 1978 Document datasette.urls.row and row_blob eyeseast 25778 closed 0     2 2023-01-06T15:45:51Z 2023-01-09T14:30:00Z 2023-01-09T14:30:00Z CONTRIBUTOR  

These are in the codebase but not in documentation. I think everything else in this class is documented.

```python
class Urls:
    ...

    def row(self, database, table, row_path, format=None):
        path = f"{self.table(database, table)}/{row_path}"
        if format is not None:
            path = path_with_format(path=path, format=format)
        return PrefixedUrlString(path)

    def row_blob(self, database, table, row_path, column):
        return self.table(database, table) + "/{}.blob?_blob_column={}".format(
            row_path, urllib.parse.quote_plus(column)
        )

```
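
A hypothetical usage example based on the path templates visible above (the database, table, row and column names are invented):

```python
from datasette.app import Datasette

ds = Datasette(memory=True)
print(ds.urls.row("fixtures", "facetable", "1"))
# -> /fixtures/facetable/1
print(ds.urls.row_blob("fixtures", "binary", "1", "data"))
# -> /fixtures/binary/1.blob?_blob_column=data
```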

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1978/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  not_planned
1522552817 PR_kwDOBm6k_c5G0XxH 1977 Bump sphinx from 5.3.0 to 6.1.1 dependabot[bot] 49699333 closed 0     2 2023-01-06T13:02:12Z 2023-01-09T13:06:17Z 2023-01-09T13:06:14Z CONTRIBUTOR simonw/datasette/pulls/1977

Bumps sphinx from 5.3.0 to 6.1.1.

Release notes

Sourced from sphinx's releases.

v6.1.1

Changelog: https://www.sphinx-doc.org/en/master/changes.html

v6.1.0

Changelog: https://www.sphinx-doc.org/en/master/changes.html

v6.0.1

Changelog: https://www.sphinx-doc.org/en/master/changes.html

v6.0.0

Changelog: https://www.sphinx-doc.org/en/master/changes.html

v6.0.0b2

Changelog: https://www.sphinx-doc.org/en/master/changes.html

v6.0.0b1

Changelog: https://www.sphinx-doc.org/en/master/changes.html

Changelog

Sourced from sphinx's changelog.

Release 6.1.1 (released Jan 05, 2023)

Bugs fixed

  • #11091: Fix util.nodes.apply_source_workaround for literal_block nodes with no source information in the node or the node's parents.

Release 6.1.0 (released Jan 05, 2023)

Dependencies

  • Adopted the Ruff_ code linter.

    .. _Ruff: https://github.com/charliermarsh/ruff

Incompatible changes

  • #10979: gettext: Removed support for pluralisation in get_translation. This was unused and complicated other changes to sphinx.locale.

Deprecated

  • sphinx.util functions:

    • Renamed sphinx.util.typing.stringify() to sphinx.util.typing.stringify_annotation()
    • Moved sphinx.util.xmlname_checker() to sphinx.builders.epub3._XML_NAME_PATTERN

    Moved to sphinx.util.display:

    • sphinx.util.status_iterator
    • sphinx.util.display_chunk
    • sphinx.util.SkipProgressMessage
    • sphinx.util.progress_message

    Moved to sphinx.util.http_date:

    • sphinx.util.epoch_to_rfc1123
    • sphinx.util.rfc1123_to_epoch

    Moved to sphinx.util.exceptions:

    • sphinx.util.save_traceback

... (truncated)

Commits
  • 77aaa86 Bump to 6.1.1 final
  • 476c115 Suppress ValueError in apply_source_workaround (#11092)
  • c80d656 Bump version
  • 4e1004a Bump to 6.1.0 final
  • a2176d4 Fix deprecation warnings
  • 2c104e9 Merge branch '6.0.x'
  • a27d262 Bump to 6.0.1 final
  • 821569e Add note for Pygments
  • 222d366 imgmath: Fix relative file path (#10965)
  • c499f66 Add SIM113 lint (#11057)
  • Additional commits viewable in compare view


Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR: - `@dependabot rebase` will rebase this PR - `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it - `@dependabot merge` will merge this PR after your CI passes on it - `@dependabot squash and merge` will squash and merge this PR after your CI passes on it - `@dependabot cancel merge` will cancel a previously requested merge and block automerging - `@dependabot reopen` will reopen this PR if it is closed - `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually - `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)

:books: Documentation preview :books:: https://datasette--1977.org.readthedocs.build/en/1977/

datasette 107914493 pull    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1977/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
0  
1524983536 I_kwDOBm6k_c5a5Wbw 1981 Canned query field labels truncated simonw 9599 open 0     1 2023-01-09T06:04:24Z 2023-01-09T06:05:44Z   OWNER  

E.g. here on mobile: https://timezones.datasette.io/timezones/by_point?longitude=-0.1406632&latitude=50.8246776

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1981/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1524867951 I_kwDOBm6k_c5a46Nv 1980 "Cannot sort table by id" when sortable_columns is used simonw 9599 open 0     2 2023-01-09T03:21:33Z 2023-01-09T03:23:53Z   OWNER  

I had an instance with this in metadata.yml:

    databases:
      timezones:
        tables:
          timezones:
            sortable_columns:
            - tzid

When I clicked on the "Apply" button:

It sent me to /timezones/timezones?_sort=id&id__exact=133 with the error message:

500: Cannot sort table by id
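
A workaround consistent with the current behaviour would be to list the primary key in sortable_columns as well, shown here as the equivalent Python metadata dict (whether Datasette should instead always allow sorting by the primary key is the open question in this issue):

```python
metadata = {
    "databases": {
        "timezones": {
            "tables": {
                "timezones": {
                    # Explicitly allow the ?_sort=id link that the filter
                    # form's "Apply" button generates, alongside tzid.
                    "sortable_columns": ["tzid", "id"],
                }
            }
        }
    }
}
```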

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1980/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1524076587 I_kwDOBm6k_c5a15Ar 1979 More useful error message if enable_load_extension is not available simonw 9599 closed 0     5 2023-01-07T19:13:19Z 2023-01-08T00:21:23Z 2023-01-08T00:21:23Z OWNER  

I get this from:

datasette --load-extension spatialite --get /-/versions.json

File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/site-packages/datasette/app.py", line 614, in _prepare_connection conn.enable_load_extension(True) AttributeError: 'sqlite3.Connection' object has no attribute 'enable_load_extension' It would be useful if Datasette caught this error and output something more friendly.

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1979/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed
957310278 MDU6SXNzdWU5NTczMTAyNzg= 1409 `default_allow_sql` setting (a re-imagining of the old `allow_sql` setting) simonw 9599 closed 0   Datasette 1.0 3268330 10 2021-07-31T19:48:56Z 2023-01-07T18:06:01Z 2023-01-05T00:51:31Z OWNER  

In 49d6d2f7b0f6cb02e25022e1c9403811f1fa0a7c as part of #813 I removed the allow_sql setting - on the basis that users could disable the ability to execute custom SQL queries using the new permission system instead.

I don't think this was the right decision. Disabling custom SQL is an important security capability, and explaining how to do it using permissions is significantly more complex than letting people know they can add --setting allow_sql off.

So I want to bring that setting back - maybe with a different, better name - and have it modify the default for that option if the permissions system doesn't have an opinion.

That way people can still use the setting but then use permissions to allow specific signed-in users access to execute SQL.
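
For comparison, the permissions-based route being described would look something like this hypothetical one-off plugin ("simon" is a placeholder actor id), which is exactly the kind of ceremony a setting such as the proposed default_allow_sql would avoid for the simple "turn it off" case:

```python
from datasette import hookimpl

@hookimpl
def permission_allowed(actor, action):
    # Deny execute-sql to everyone except one signed-in actor.
    if action == "execute-sql":
        return bool(actor and actor.get("id") == "simon")
```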

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1409/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed
1520712722 PR_kwDOBm6k_c5GuDBN 1976 Bump sphinx from 5.3.0 to 6.1.0 dependabot[bot] 49699333 closed 0     2 2023-01-05T13:02:37Z 2023-01-06T13:02:17Z 2023-01-06T13:02:15Z CONTRIBUTOR simonw/datasette/pulls/1976

Bumps sphinx from 5.3.0 to 6.1.0.

Release notes

Sourced from sphinx's releases.

v6.1.0

Changelog: https://www.sphinx-doc.org/en/master/changes.html

v6.0.1

Changelog: https://www.sphinx-doc.org/en/master/changes.html

v6.0.0

Changelog: https://www.sphinx-doc.org/en/master/changes.html

v6.0.0b2

Changelog: https://www.sphinx-doc.org/en/master/changes.html

v6.0.0b1

Changelog: https://www.sphinx-doc.org/en/master/changes.html

Changelog

Sourced from sphinx's changelog.

Release 6.1.0 (released Jan 05, 2023)

Dependencies

  • Adopted the Ruff_ code linter.

    .. _Ruff: https://github.com/charliermarsh/ruff

Incompatible changes

  • #10979: gettext: Removed support for pluralisation in get_translation. This was unused and complicated other changes to sphinx.locale.

Deprecated

  • sphinx.util functions:

    • Renamed sphinx.util.typing.stringify() to sphinx.util.typing.stringify_annotation()
    • Moved sphinx.util.xmlname_checker() to sphinx.builders.epub3._XML_NAME_PATTERN

    Moved to sphinx.util.display:

    • sphinx.util.status_iterator
    • sphinx.util.display_chunk
    • sphinx.util.SkipProgressMessage
    • sphinx.util.progress_message

    Moved to sphinx.util.http_date:

    • sphinx.util.epoch_to_rfc1123
    • sphinx.util.rfc1123_to_epoch

    Moved to sphinx.util.exceptions:

    • sphinx.util.save_traceback
    • sphinx.util.format_exception_cut_frames

Features added

  • Cache doctrees in the build environment during the writing phase.
  • Make all writing phase tasks support parallel execution.
  • #11072: Use PEP 604 (X | Y) display conventions for typing.Optional and typing.Optional types within the Python domain and autodoc.

... (truncated)

Commits
  • 4e1004a Bump to 6.1.0 final
  • a2176d4 Fix deprecation warnings
  • 2c104e9 Merge branch '6.0.x'
  • a27d262 Bump to 6.0.1 final
  • 821569e Add note for Pygments
  • 222d366 imgmath: Fix relative file path (#10965)
  • c499f66 Add SIM113 lint (#11057)
  • 0fbd8af Add missing default arguments in sphinx-apidoc.rst (#11084)
  • f89f943 Remove flake8 plugins in favour of Ruff (#11085)
  • 0479115 Suppress lint failures from Ruff 0.0.211 (#11086)
  • Additional commits viewable in compare view


Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR: - `@dependabot rebase` will rebase this PR - `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it - `@dependabot merge` will merge this PR after your CI passes on it - `@dependabot squash and merge` will squash and merge this PR after your CI passes on it - `@dependabot cancel merge` will cancel a previously requested merge and block automerging - `@dependabot reopen` will reopen this PR if it is closed - `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually - `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)

:books: Documentation preview :books:: https://datasette--1976.org.readthedocs.build/en/1976/

datasette 107914493 pull    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1976/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
0  
1516376583 PR_kwDOBm6k_c5GfPJL 1974 Bump sphinx from 5.3.0 to 6.0.0 dependabot[bot] 49699333 closed 0     2 2023-01-02T13:04:26Z 2023-01-05T13:02:42Z 2023-01-05T13:02:40Z CONTRIBUTOR simonw/datasette/pulls/1974

Bumps sphinx from 5.3.0 to 6.0.0.

Release notes

Sourced from sphinx's releases.

v6.0.0

Changelog: https://www.sphinx-doc.org/en/master/changes.html

v6.0.0b2

Changelog: https://www.sphinx-doc.org/en/master/changes.html

v6.0.0b1

Changelog: https://www.sphinx-doc.org/en/master/changes.html

Changelog

Sourced from sphinx's changelog.

Release 6.0.0 (released Dec 29, 2022)

Dependencies

  • #10468: Drop Python 3.6 support
  • #10470: Drop Python 3.7, Docutils 0.14, Docutils 0.15, Docutils 0.16, and Docutils 0.17 support. Patch by Adam Turner

Incompatible changes

  • #7405: Removed the jQuery and underscore.js JavaScript frameworks.

    These frameworks are no longer automatically injected into themes from Sphinx 6.0. If you develop a theme or extension that uses the jQuery, $, or $u global objects, you need to update your JavaScript to modern standards, or use the mitigation below.

    The first option is to use the sphinxcontrib.jquery_ extension, which has been developed by the Sphinx team and contributors. To use this, add sphinxcontrib.jquery to the extensions list in conf.py, or call app.setup_extension("sphinxcontrib.jquery") if you develop a Sphinx theme or extension (a minimal conf.py sketch follows this list of changes).

    The second option is to manually ensure that the frameworks are present. To re-add jQuery and underscore.js, you will need to copy jquery.js and underscore.js from the Sphinx repository_ to your static directory, and add the following to your layout.html:

    .. code-block:: html+jinja

    {%- block scripts %} {{ super() }} {%- endblock %}

    .. _sphinxcontrib.jquery: https://github.com/sphinx-contrib/jquery/

    Patch by Adam Turner.

  • #10471, #10565: Removed deprecated APIs scheduled for removal in Sphinx 6.0. See :ref:dev-deprecated-apis for details. Patch by Adam Turner.

  • #10901: C Domain: Remove support for parsing pre-v3 style type directives and roles. Also remove associated configuration variables c_allow_pre_v3 and c_warn_on_allowed_pre_v3. Patch by Adam Turner.
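A minimal, hedged conf.py sketch of the first mitigation described above, adding sphinxcontrib.jquery to a project's extensions list. The extension name comes from the release notes; the surrounding configuration is illustrative only:

```python
# conf.py -- sketch of re-enabling jQuery under Sphinx 6.0 via sphinxcontrib-jquery
extensions = [
    "sphinx.ext.extlinks",   # whatever extensions the project already uses stay as-is
    "sphinxcontrib.jquery",  # restores the jQuery / $ globals removed in Sphinx 6.0
]
```

A theme or extension can instead call app.setup_extension("sphinxcontrib.jquery") from its setup() hook, as the notes above suggest.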

Features added

... (truncated)

Commits
  • 5b56a23 Bump to 6.0.0 final
  • f1d1e9c Update coverage workflow for Tox 4
  • 66a738c Update coverage workflow for new configuration location
  • 041e5f8 Add test coverage for 'today_fmt' reference substitution (#10980)
  • da25145 Remove unnecessary conditional import in sphinx.ext.napoleon (#11043)
  • 45a0ea9 Migrate coveragepy config into pyproject.toml (#11025)
  • 3ec54f1 Create a pydata_sphinx_theme section in usage examples (#11046)
  • 32bce8f Copy edit the tutorial (#11049)
  • 9844162 Fix example using add_config_value (#10937)
  • bf4a626 RTD builder: add graphviz depedendency (#11040)
  • Additional commits viewable in compare view


Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR:

  • `@dependabot rebase` will rebase this PR
  • `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it
  • `@dependabot merge` will merge this PR after your CI passes on it
  • `@dependabot squash and merge` will squash and merge this PR after your CI passes on it
  • `@dependabot cancel merge` will cancel a previously requested merge and block automerging
  • `@dependabot reopen` will reopen this PR if it is closed
  • `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
  • `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
  • `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
  • `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)

:books: Documentation preview :books:: https://datasette--1974.org.readthedocs.build/en/1974/

datasette 107914493 pull    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1974/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
0  
1516815571 I_kwDOBm6k_c5aaMTT 1975 _col=id can cause id column to export twice in CSV export simonw 9599 open 0     0 2023-01-03T00:25:15Z 2023-01-03T00:25:21Z   OWNER  

https://datasette.simonwillison.net/simonwillisonblog/blog_entry.csv?_col=id&_col=title&_col=body&_labels=on&_size=1

```csv
id,id,title,body
1,1,WaSP Phase II,"<p>The <a href=""http://www.webstandards.org/"">Web Standards</a> project has launched Phase II.</p>"
```

That should not have two id columns.
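Presumably the primary key column is always included in the CSV export and then added a second time when it is also requested explicitly via ?_col=. A purely hypothetical sketch of the kind of de-duplication that would avoid this (the helper name and signature are made up for illustration, not taken from the Datasette codebase):

```python
# Hypothetical helper: merge primary-key columns with ?_col= selections without duplicates
def merge_columns(pk_columns, requested_columns):
    seen = set()
    merged = []
    for column in list(pk_columns) + list(requested_columns):
        if column not in seen:
            seen.add(column)
            merged.append(column)
    return merged

# merge_columns(["id"], ["id", "title", "body"]) -> ["id", "title", "body"]
```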

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1975/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1515182998 I_kwDOBm6k_c5aT9uW 1970 Path "None" in _internal database table simonw 9599 closed 0     2 2022-12-31T18:51:05Z 2022-12-31T19:22:58Z 2022-12-31T18:52:49Z OWNER  

See https://latest.datasette.io/_internal/databases (after https://latest.datasette.io/login-as-root)

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1970/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed
1503010009 PR_kwDOBm6k_c5FyT3c 1967 Add favicon to documentation choldgraf 1839645 closed 0     2 2022-12-19T14:01:04Z 2022-12-31T19:15:51Z 2022-12-31T19:00:31Z CONTRIBUTOR simonw/datasette/pulls/1967

I've been browsing the datasette documentation and found it hard to quickly locate tabs with many of them open, because it does not ship a favicon. So this PR:

  • Grabs the favicon .png from datasette itself[^1]
  • Adds it to the _static/ folder
  • Sets html_favicon to load it in the docs

[^1]: I also learned that Chrome can fetch favicons as an internal service! See chrome://favicon/https://datasette.io/tools/github-to-sqlite.
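For reference, the Sphinx side of this is a one-line setting; a minimal sketch, assuming the icon is saved as docs/_static/favicon.png (the exact filename used in the PR may differ):

```python
# conf.py -- serve a favicon with the documentation
html_static_path = ["_static"]
html_favicon = "_static/favicon.png"  # assumed filename; the PR grabs the .png from Datasette itself
```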

datasette 107914493 pull    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1967/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
0  
1501900064 I_kwDOBm6k_c5ZhS0g 1966 Broken link to live demo in Getting started docs lbellomo 7551922 closed 0     1 2022-12-18T13:17:00Z 2022-12-31T19:15:19Z 2022-12-31T19:15:10Z NONE  

The link in Play with a live demo in Getting started to https://fivethirtyeight.datasettes.com/fivethirtyeight is broken and that Datasette instance is no longer working (maybe due to the end of the free tier).

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1966/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed
1515186569 I_kwDOBm6k_c5aT-mJ 1972 Fix Sphinx warning about extlink extension simonw 9599 closed 0     0 2022-12-31T19:12:04Z 2022-12-31T19:13:26Z 2022-12-31T19:13:26Z OWNER  

[sphinx-autobuild] > sphinx-build -b html /Users/simon/Dropbox/Development/datasette/docs /Users/simon/Dropbox/Development/datasette/docs/_build
Running Sphinx v5.3.0
loading pickled environment... done
WARNING: extlinks: Sphinx-6.0 will require a caption string to contain exactly one '%s' and all other '%' need to be escaped as '%%'.

Originally posted by @simonw in https://github.com/simonw/datasette/issues/1971#issuecomment-1368266904
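The warning points at the extlinks caption format: from Sphinx 6.0 each caption must contain exactly one '%s'. A minimal sketch of a compliant configuration (the role name and link target are illustrative, not necessarily the ones used in Datasette's docs):

```python
# conf.py -- extlinks captions must contain exactly one %s from Sphinx 6.0 onwards
extensions = ["sphinx.ext.extlinks"]

extlinks = {
    # :issue:`1972` renders as "#1972" and links to the GitHub issue
    "issue": ("https://github.com/simonw/datasette/issues/%s", "#%s"),
}
```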

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1972/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed
1115435536 I_kwDOBm6k_c5CfDIQ 1614 Try again with SQLite codemirror support simonw 9599 open 0     1 2022-01-26T20:05:20Z 2022-12-23T21:27:10Z   OWNER  

I tried and failed to implement autocomplete a while ago. Relevant code:

https://github.com/codemirror/legacy-modes/blob/8f36abca5f55024258cd23d9cfb0203d8d244f0d/mode/sql.js#L335

Sounds like upgrading to CodeMirror 6 ASAP would be worthwhile since it has better accessibility and touch screen support: https://codemirror.net/6/

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1614/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1355148385 I_kwDOBm6k_c5Qxexh 1796 Research an upgrade to CodeMirror 6 simonw 9599 open 0     4 2022-08-30T04:27:46Z 2022-12-23T21:27:03Z   OWNER  

There are still a bunch of bugs in CodeMirror 5 that affect various mobile browsers - see Datasette Discord report here: https://discord.com/channels/823971286308356157/823971286941302908/1013878624992108645

https://user-images.githubusercontent.com/9599/187349269-7b7c0c8c-3894-4810-82f0-de7c1eb940b3.mp4

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1796/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1501843596 PR_kwDOBm6k_c5FuaJm 1965 Detect server start/stop more reliably. janl 11321 closed 0     2 2022-12-18T10:03:42Z 2022-12-20T19:08:26Z 2022-12-18T16:01:51Z CONTRIBUTOR simonw/datasette/pulls/1965

This is useful, especially in testing, since your test hosts might not reliably start the server within two seconds, so we do a definite check before progressing.

By the same token, after `kill $server_pid`, wait for the pid to be gone from the process list.

Since now the script can end prematurely, I also added a cleanup function to make sure the temporary certs are removed in any case.

n.b. this could also be done with the use of trap 'fn' ERR but that felt like a bit too much magic for this short a script.


:books: Documentation preview :books:: https://datasette--1965.org.readthedocs.build/en/1965/

datasette 107914493 pull    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1965/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
0  
1504352503 I_kwDOBm6k_c5Zqpj3 1968 Allow to hide some queries in metadata.yml CharlesNepote 562352 open 0     0 2022-12-20T10:45:41Z 2022-12-20T10:45:41Z   NONE  

By default all queries are displayed.

But there are many cases where it would be interesting to hide the queries by default:

  • the website is targeting non-tech people
  • the query is veeeeeery long (eg.)
  • reading the query is not important for the users, they only want to see the result

Of course, the user still could have the option to see the query.

It could be an option in the metadata file:

```yaml
databases:
  awesome_db:
    tables:
      products:
        hide_sql: true
    queries:
      great_query:
        hide_sql: true
        sql: select * from products where code = :barcode
```

The priority could be:

  • no option in the metadata and nothing in the URL: query displayed
  • hide_sql in the metadata and nothing in the URL: query displayed as asked in the metadata
  • hide_sql in the metadata and &_hide_sql= in the URL: query as asked in the URL

See also: #1824

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1968/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1496652622 I_kwDOBm6k_c5ZNRtO 1955 invoke_startup() is not run in some conditions, e.g. gunicorn/uvicorn workers, breaking lots of things Rik-de-Kort 32839123 closed 0     36 2022-12-14T13:39:56Z 2022-12-19T04:34:16Z 2022-12-18T02:45:18Z NONE  

In the past (pre-September 14, #1809) I had a running deployment of Datasette on Azure WebApps by emulating the call in cli.py to Gunicorn: gunicorn -w 2 -k uvicorn.workers.UvicornWorker app:app.

My most recent deployment, however, fails loudly by shouting that Datasette.invoke_startup() was not called. It does not seem to be possible to call invoke_startup when running using a uvicorn command directly like this (I've reproduced this locally using uvicorn). Two candidates that I have tried:

  • Uvicorn has a --factory option, but the app factory has to be synchronous, so no await invoke_startup there
  • asyncio.get_event_loop().run_until_complete is also not an option because uvicorn already has the event loop running.

One additional option is:

  • Use Gunicorn's server hooks to call invoke_startup. These are also synchronous, but I might be able to get ahead of the event loop starting here.

In my current deployment setup, it does not appear to be possible to use datasette serve directly, so I'm stuck either:

  • trying to rework my complete deployment setup, for instance, using Azure functions as described here
  • or digging into the ASGI spec and writing a wrapper for the sole purpose of launching Datasette using a direct Uvicorn invocation.

Questions for the maintainers:

  • Is this intended behaviour/will not support/etc.? If so, I'd be happy to add a PR with a couple lines in the documentation.
  • If this is not intended behaviour, what is a good way to fix it? I could have a go at the ASGI spec thing (I think the Azure Functions thing is related) and provide a PR with the wrapper here, but I'm all ears!

Almost forgot, minimal reproducer:

```python
from datasette import Datasette

ds = Datasette(files=['./global-power-plants.db'])
app = ds.app()
```

Save as app.py in the same folder as global-power-plants.db, and then try running uvicorn app:app.

Opening the resulting Datasette instance in the browser will show the error message.
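One workaround sketch (not from the issue thread, and making no claims about what Datasette eventually shipped for this): wrap the ASGI app so that invoke_startup() is awaited once before the first event is handled, which should work under a plain `uvicorn app:app` invocation:

```python
# app.py -- hedged sketch: ensure invoke_startup() runs when serving via `uvicorn app:app`
from datasette.app import Datasette

ds = Datasette(files=["./global-power-plants.db"])
_inner = ds.app()
_started = False


async def app(scope, receive, send):
    global _started
    if not _started:
        # invoke_startup() is a coroutine, so it can be awaited here,
        # inside the event loop that uvicorn has already started
        await ds.invoke_startup()
        _started = True
    await _inner(scope, receive, send)
```

This is only a sketch; it does not guard against two concurrent first requests racing to run startup.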

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1955/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed
1447050738 I_kwDOBm6k_c5WQD3y 1886 Call for birthday presents: if you're using Datasette, let us know how you're using it here simonw 9599 open 0     13 2022-11-13T19:25:51Z 2022-12-18T17:34:20Z   OWNER  

Datasette is 5 years old today. To celebrate, I'm asking the community for birthday presents:

https://simonwillison.net/2022/Nov/13/datasette-birthday/

To celebrate this open source project’s birthday, I’ve decided to try something new: I’m going to ask for birthday presents.

An aspect of Datasette’s marketing that I’ve so far neglected is social proof. I think it’s time to change that: I know people are using the software to do cool things, but this often happens behind closed doors.

For Datasette’s birthday, I’m looking for endorsements and case studies and just general demonstrations that show how people are using it to do cool stuff.

So: if you’ve used Datasette to solve a problem, and you’re willing to publicize it, please give us the gift of your endorsement!

[...]

Add a comment to this issue thread describing what you’re doing. Just a few sentences is fine—though a screenshot or even a link to a live instance would be even better

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1886/reactions",
    "total_count": 2,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 2,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1501778647 I_kwDOBm6k_c5Zg1LX 1964 Cog menu is not keyboard accessible (also no ARIA) simonw 9599 open 0     1 2022-12-18T06:36:28Z 2022-12-18T06:37:28Z   OWNER  

This menu here: https://latest.datasette.io/fixtures/attraction_characteristic

You can tab to it (see the outline) and hit space or enter to open it, but you can't then navigate the items in the open menu using the keyboard.

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1964/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1306984363 I_kwDOBm6k_c5N5v-r 1771 minor a11y: <select> has no visual indicator when tabbed to mustafa0x 1473102 closed 0     5 2022-07-17T04:30:14Z 2022-12-18T06:34:20Z 2022-12-18T06:28:12Z NONE     datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1771/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed
1501713288 I_kwDOBm6k_c5ZglOI 1963 0.63.3 bugfix release simonw 9599 closed 0     2 2022-12-18T02:48:15Z 2022-12-18T03:26:55Z 2022-12-18T03:26:55Z OWNER  

I'm going to ship a release which back-ports these two fixes:

  • https://github.com/simonw/datasette/issues/1958
  • https://github.com/simonw/datasette/issues/1955
datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1963/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed
806849424 MDU6SXNzdWU4MDY4NDk0MjQ= 1221 Support SSL/TLS directly simonw 9599 closed 0     4 2021-02-12T00:18:29Z 2022-12-18T02:39:04Z 2021-02-12T00:52:18Z OWNER  

This should be pretty easy because Uvicorn supports them already. Need a good mechanism for testing it - https://pypi.org/project/trustme/ looks ideal.
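trustme makes it easy to mint a throwaway CA and certificate for exactly this kind of test; a minimal sketch follows (the file names, and the CLI flag names shown in the comment, are assumptions rather than confirmed options):

```python
# Hedged sketch: generate a throwaway localhost certificate with trustme for TLS testing
import trustme

ca = trustme.CA()
server_cert = ca.issue_cert("localhost", "127.0.0.1")

ca.cert_pem.write_to_path("client-ca.pem")               # clients trust this CA
server_cert.private_key_pem.write_to_path("server.key")
server_cert.cert_chain_pems[0].write_to_path("server.pem")

# The server could then be started with something like (flag names assumed):
#   datasette fixtures.db --ssl-keyfile=server.key --ssl-certfile=server.pem
```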

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1221/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed
1408757705 I_kwDOBm6k_c5T9-_J 1843 Intermittent "Too many open files" error running tests simonw 9599 open 0     16 2022-10-14T04:45:01Z 2022-12-17T22:02:41Z   OWNER  

Partial stack trace from one of them:

```
/Users/simon/.local/share/virtualenvs/datasette-AWNrQs95/lib/python3.10/site-packages/jinja2/loaders.py:200: in get_source
    f = open_if_exists(filename)


filename = '/Users/simon/Dropbox/Development/datasette/datasette/templates/error.html', mode = 'rb'

def open_if_exists(filename: str, mode: str = "rb") -> t.Optional[t.IO]:
    """Returns a file descriptor for the filename if that file exists,
    otherwise ``None``.
    """
    if not os.path.isfile(filename):
        return None
    return open(filename, mode)

E OSError: [Errno 24] Too many open files: '/Users/simon/Dropbox/Development/datasette/datasette/templates/error.html'
```
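Until the underlying leak is tracked down, one common mitigation (not something this issue itself proposes) is to raise the per-process file descriptor limit for the test run, for example from a conftest.py:

```python
# conftest.py -- hedged workaround sketch: raise RLIMIT_NOFILE for the test session (POSIX only)
import resource


def pytest_configure(config):
    soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
    desired = 4096  # macOS defaults to a low soft limit (often 256)
    if soft < desired:
        # never request more than the hard limit allows
        new_soft = desired if hard == resource.RLIM_INFINITY else min(desired, hard)
        resource.setrlimit(resource.RLIMIT_NOFILE, (new_soft, hard))
```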

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1843/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  reopened
1499081664 I_kwDOBm6k_c5ZWivA 1959 Refactor test suite to use mostly `async def` tests simonw 9599 closed 0     9 2022-12-15T21:02:54Z 2022-12-17T21:49:37Z 2022-12-17T21:49:36Z OWNER  

I got blocked working on this issue due to weird and hard-to-debug test suite problems:

  • 1955

The test suite has needed a major upgrade for several years now. It has a LOT of def test_... synchronous functions that could be upgraded to async def for better performance and less test complexity - I've used the new async def pattern in plugins and new tests for a couple of years now.

Hopefully I can get more of the tests to use in-memory named databases too, ideally so I can fix this consistent problem:

  • 1843
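A hedged before/after sketch of the style change described here; the fixture names and response attributes follow the common TestClient-versus-httpx pattern rather than being copied from Datasette's test suite, and pytest-asyncio is assumed:

```python
import pytest


# Before: a synchronous test driving a TestClient-style fixture
def test_homepage_sync(app_client):
    response = app_client.get("/")
    assert response.status == 200


# After: the same check as an async test awaiting an httpx-style ds_client fixture
@pytest.mark.asyncio
async def test_homepage_async(ds_client):
    response = await ds_client.get("/")
    assert response.status_code == 200
```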

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1959/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed
1499150951 PR_kwDOBm6k_c5FlZmG 1960 Port as many tests as possible to async def tests against ds_client simonw 9599 closed 0     29 2022-12-15T21:45:53Z 2022-12-17T21:47:56Z 2022-12-17T21:47:55Z OWNER simonw/datasette/pulls/1960

Refs: - #1959


:books: Documentation preview :books:: https://datasette--1960.org.readthedocs.build/en/1960/

datasette 107914493 pull    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1960/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
0  
1500636982 I_kwDOBm6k_c5Zcec2 1962 Alternative, async-friendly pattern for `make_app_client()` and similar - fully retire `TestClient` simonw 9599 open 0     1 2022-12-16T17:56:51Z 2022-12-16T21:55:29Z   OWNER  

In this issue I replaced a whole bunch of places that used the non-async app_client fixture with an async ds_client fixture instead: - #1959

But I didn't get everything, and a lot of tests are still using the old TestClient mechanism as a result.

The main work here is replacing all of the app_client_... fixtures which use variants on the default client - and changing the tests that call make_app_client() to do something else instead.

This requires some careful thought. I need to come up with a really nice pattern for creating variants on the ds_client default fixture - and do so in a way that minimizes the number of open files, refs:

  • 1843
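One possible shape for such variant fixtures, sketched with an in-memory named database to keep the number of open files down. The fixture name, metadata values, and test are illustrative assumptions, not taken from the project:

```python
# Hedged sketch: an async fixture that builds a Datasette variant with extra
# configuration, backed by an in-memory named database instead of a temp file.
import pytest
import pytest_asyncio
from datasette.app import Datasette


@pytest_asyncio.fixture
async def ds_client_with_metadata():
    ds = Datasette(
        memory=True,
        metadata={"databases": {"test_db": {"allow": True}}},  # illustrative metadata
    )
    ds.add_memory_database("test_db")
    await ds.invoke_startup()
    yield ds.client


@pytest.mark.asyncio
async def test_database_page(ds_client_with_metadata):
    response = await ds_client_with_metadata.get("/test_db.json")
    assert response.status_code == 200
```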

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1962/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1497909798 I_kwDOBm6k_c5ZSEom 1958 datasette --root running in Docker doesn't reliably show the magic URL davidhaley 11729897 closed 0     11 2022-12-13T16:29:13Z 2022-12-16T00:59:12Z 2022-12-16T00:55:19Z NONE  

I followed these steps:

docker run datasetteproject/datasette pip install datasette-upload-csvs

docker commit $(docker ps -lq) datasette-with-plugins

docker run -p 8001:8001 -v $(pwd):/mnt datasette-with-plugins datasette --root -p 8001 -h 0.0.0.0

Visited: http://127.0.0.1:8001/-/plugins

Visited: http://localhost:8001/-/upload-csvs

I may have missed a step?

Thank you.


Ubuntu 22.04.1 LTS

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1958/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed

CREATE TABLE [issues] (
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [number] INTEGER,
   [title] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [state] TEXT,
   [locked] INTEGER,
   [assignee] INTEGER REFERENCES [users]([id]),
   [milestone] INTEGER REFERENCES [milestones]([id]),
   [comments] INTEGER,
   [created_at] TEXT,
   [updated_at] TEXT,
   [closed_at] TEXT,
   [author_association] TEXT,
   [pull_request] TEXT,
   [body] TEXT,
   [repo] INTEGER REFERENCES [repos]([id]),
   [type] TEXT
, [active_lock_reason] TEXT, [performed_via_github_app] TEXT, [reactions] TEXT, [draft] INTEGER, [state_reason] TEXT);
CREATE INDEX [idx_issues_repo]
                ON [issues] ([repo]);
CREATE INDEX [idx_issues_milestone]
                ON [issues] ([milestone]);
CREATE INDEX [idx_issues_assignee]
                ON [issues] ([assignee]);
CREATE INDEX [idx_issues_user]
                ON [issues] ([user]);