id,node_id,number,title,user,user_label,state,locked,assignee,assignee_label,milestone,milestone_label,comments,created_at,updated_at,closed_at,author_association,pull_request,body,repo,repo_label,type,active_lock_reason,performed_via_github_app,reactions,draft,state_reason 1994861266,PR_kwDOBm6k_c5fhgOS,2209,Fix query for suggested facets with column named value,198537,rgieseke,open,0,,,,,3,2023-11-15T14:13:30Z,2023-11-15T15:31:12Z,,CONTRIBUTOR,simonw/datasette/pulls/2209,"See discussion in https://github.com/simonw/datasette/issues/2208 ---- :books: Documentation preview :books:: https://datasette--2209.org.readthedocs.build/en/2209/ ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2209/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1978023780,I_kwDOBm6k_c515j9k,2205,request.post_vars() method obliterates form keys with multiple values,9599,simonw,open,0,,,8755003,Datasette 1.0a-next,3,2023-11-05T23:25:08Z,2023-11-06T04:10:34Z,,OWNER,,"https://github.com/simonw/datasette/blob/452a587e236ef642cbc6ae345b58767ea8420cb5/datasette/utils/asgi.py#L137-L139 In GET requests you can do `?foo=1&foo=2` - you can do the same in POST requests, but the `dict()` call here eliminates those duplicates. You can't even try calling `post_body()` and implement your own custom parsing because of: - #2204",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2205/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1884335789,PR_kwDOCGYnMM5Zs0KB,591,Test against Python 3.12 preview,9599,simonw,closed,0,,,,,3,2023-09-06T16:10:00Z,2023-11-04T00:58:03Z,2023-11-04T00:58:02Z,OWNER,simonw/sqlite-utils/pulls/591,"https://dev.to/hugovk/help-test-python-312-beta-1508/ ---- :books: Documentation preview :books:: https://sqlite-utils--591.org.readthedocs.build/en/591/ ",140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/591/reactions"", ""total_count"": 1, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 1, ""eyes"": 0}",0, 410384988,MDU6SXNzdWU0MTAzODQ5ODg=,411,How to pass named parameter into spatialite MakePoint() function,1055831,dazzag24,closed,0,,,,,3,2019-02-14T16:30:22Z,2023-10-25T13:23:04Z,2019-05-05T12:25:04Z,NONE,,"Hi, datasette version: ""0.26.2"" extensions: spatialite: ""4.4.0-RC0"" sqlite version: ""3.22.0"" I have a table of airports with latitude and longitude columns. I've added spatialite (with KNN support). 
After creating the db using csvs-to-sqlit, I run these commands to setup the spatialite tables: ``` conn.execute('SELECT InitSpatialMetadata(1)') conn.execute(""SELECT AddGeometryColumn('airports', 'point_geom', 4326, 'POINT', 2);"") conn.execute('''UPDATE airports SET point_geom = GeomFromText('POINT('||""longitude""||' '||""latitude""||')',4326);''') conn.execute(""SELECT CreateSpatialIndex('airports', 'point_geom');"") ``` I'm attempting to create a canned query and have this in my metadata.json file: ``` ""find_airports_nearest_to_point"":{ ""sql"":""SELECT a.pos AS rank, b.id, b.name, b.country, b.latitude AS latitude, b.longitude AS longitude, a.distance / 1000.0 AS dist_km FROM KNN AS a JOIN airports AS b ON (b.rowid = a.fid) WHERE f_table_name = \""airports\"" AND ref_geometry = MakePoint( :Long , :Lat ) AND max_items = 10;""} ``` which doesn't seem to perform the templating of the name parameters correctly and I get no results. Have also tired: ``` MakePoint( || :Long || , || :Lat || ) ``` which returns this error: ``` near ""||"": syntax error ``` However I cannot seem to find the correct combination of named parameter syntax (:Lat) or sqlite concatenation operator to make it work. Any ideas if using named parameters inside functions is supported? Thanks Darren",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/411/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1930008379,I_kwDOBm6k_c5zCZc7,2197,click-default-group-wheel dependency conflict,1176293,ar-jan,closed,0,,,,,3,2023-10-06T11:49:20Z,2023-10-12T21:53:17Z,2023-10-12T21:53:17Z,NONE,,"I upgraded my dependencies, then ran into this problem running `datasette inspect`: > env/lib/python3.9/site-packages/datasette/cli.py"", line 6, in > from click_default_group import DefaultGroup > ModuleNotFoundError: No module named 'click_default_group' Turns out the released version of datasette still depends on `click-default-group-wheel`, so `click-default-group` doesn't get installed/recognized: ``` $ virtualenv venv $ source venv/bin/activate $ pip install datasette $ pip list | grep click-default-group click-default-group 1.2.4 click-default-group-wheel 1.2.3 $ python -c ""from click_default_group import DefaultGroup"" Traceback (most recent call last): File """", line 1, in ModuleNotFoundError: No module named 'click_default_group' $ pip install --force-reinstall click-default-group ... ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts. datasette 0.64.4 requires click-default-group-wheel>=1.2.2, which is not installed. Successfully installed click-8.1.7 click-default-group-1.2.4 ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2197/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1886812002,PR_kwDOBm6k_c5Z1N2L,2181,actors_from_ids plugin hook and datasette.actors_from_ids() method,9599,simonw,closed,0,,,,,3,2023-09-08T01:51:07Z,2023-09-08T04:24:00Z,2023-09-08T04:23:59Z,OWNER,simonw/datasette/pulls/2181,"Refs: - #2180 This plugin hook is feature complete - including documentation and tests. 
I'm not going to land it in Datasette `main` until we've used it at least once though, which should happen promptly in development for [Datasette Cloud](https://www.datasette.cloud/). ---- :books: Documentation preview :books:: https://datasette--2181.org.readthedocs.build/en/2181/ ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2181/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1884330740,PR_kwDOBm6k_c5ZszDF,2174,Use $DATASETTE_INTERNAL in absence of --internal,15178711,asg017,open,0,,,,,3,2023-09-06T16:07:15Z,2023-09-08T00:46:13Z,,CONTRIBUTOR,simonw/datasette/pulls/2174,"#refs 2157, specifically [this comment](https://github.com/simonw/datasette/issues/2157#issuecomment-1700291967) Passing in `--internal my_internal.db` over and over again can get repetitive. This PR adds a new configurable env variable `DATASETTE_INTERNAL_DB_PATH`. If it's defined, then it takes place as the path to the internal database. Users can still overwrite this behavior by passing in their own `--internal internal.db` flag. In draft mode for now, needs tests and documentation. Side note: Maybe we can have a sections in the docs that lists all the ""configuration environment variables"" that Datasette respects? I did a quick grep and found: - `DATASETTE_LOAD_PLUGINS` - `DATASETTE_SECRETS` ---- :books: Documentation preview :books:: https://datasette--2174.org.readthedocs.build/en/2174/ ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2174/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1883055640,PR_kwDOBm6k_c5ZociX,2173,click-default-group>=1.2.3,9599,simonw,closed,0,,,,,3,2023-09-06T02:33:28Z,2023-09-06T02:50:10Z,2023-09-06T02:50:10Z,OWNER,simonw/datasette/pulls/2173,"Now available as a wheel: - https://github.com/click-contrib/click-default-group/issues/21 ---- :books: Documentation preview :books:: https://datasette--2173.org.readthedocs.build/en/2173/ ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2173/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1879209560,I_kwDOCGYnMM5wAnZY,589,Mechanism for de-registering registered SQL functions,9599,simonw,open,0,,,,,3,2023-09-03T19:32:39Z,2023-09-03T19:36:34Z,,OWNER,,I used a custom SQL function in a migration script and then realized that it should be de-registered before the end of the script to avoid leaking into the calling code.,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/589/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1865869205,I_kwDOBm6k_c5vNueV,2157,"Proposal: Make the `_internal` database persistent, customizable, and hidden",15178711,asg017,open,0,,,,,3,2023-08-24T20:54:29Z,2023-08-31T02:45:56Z,,CONTRIBUTOR,,"The current `_internal` database is used by Datasette core to cache info about databases/tables/columns/foreign keys of databases in a Datasette instance. It's a temporary database created at startup, that can only be seen by the root user. 
See an [example `_internal` DB here](https://latest.datasette.io/_internal), after [logging in as root](https://latest.datasette.io/login-as-root). The current `_internal` database has a few rough edges: - It's part of `datasette.databases`, so many plugins have to specifically exclude `_internal` from their queries [examples here](https://github.com/search?q=datasette+hookimpl+%22_internal%22+language%3APython+-path%3Adatasette%2F&ref=opensearch&type=code) - It's only used by Datasette core and can't be used by plugins or 3rd parties - It's created from scratch at startup and stored in memory. Why is fine, the performance is great, but persistent storage would be nice. Additionally, it would be really nice if plugins could use this `_internal` database to store their own configuration, secrets, and settings. For example: - `datasette-auth-tokens` [creates a `_datasette_auth_tokens` table](https://github.com/simonw/datasette-auth-tokens/blob/main/datasette_auth_tokens/__init__.py#L15) to store auth token metadata. This could be moved into the `_internal` database to avoid writing to the gues database - `datasette-socrata` [creates a `socrata_imports`](https://github.com/simonw/datasette-socrata/blob/1409aa9b4d2fc3aff286b52e73af33b5786d56d0/datasette_socrata/__init__.py#L190-L198) table, which also can be in `_internal` - `datasette-upload-csvs` [creates a `_csv_progress_`](https://github.com/simonw/datasette-upload-csvs/blob/main/datasette_upload_csvs/__init__.py#L154) table, which can be in `_internal` - `datasette-write-ui` wants to have the ability for users to toggle whether a table appears editable, which can be either in `datasette.yaml` or on-the-fly by storing config in `_internal` In general, these are specific features that Datasette plugins would have access to if there was a central internal database they could read/write to: - **Dynamic configuration**. Changing the `datasette.yaml` file works, but can be tedious to restart the server every time. Plugins can define their own configuration table in `_internal`, and could read/write to it to store configuration based on user actions (cell menu click, API access, etc.) - **Caching**. If a plugin or Datasette Core needs to cache some expensive computation, they can store it inside `_internal` (possibly as a temporary table) instead of managing their own caching solution. - **Audit logs**. If a plugin performs some sensitive operations, they can log usage info to `_internal` for others to audit later. - **Long running process status**. Many plugins (`datasette-upload-csvs`, `datasette-litestream`, `datasette-socrata`) perform tasks that run for a really long time, and want to give continue status updates to the user. They can store this info inside` _internal` - **Safer authentication**. Passwords and authentication plugins usually store credentials/hashed secrets in configuration files or environment variables, which can be difficult to handle. Now, they can store them in `_internal` ## Proposal - We remove `_internal` from [`datasette.databases`](https://docs.datasette.io/en/latest/internals.html#databases) property. - We add new `datasette.get_internal_db()` method that returns the `_internal` database, for plugins to use - We add a new `--internal internal.db` flag. 
If provided, then the `_internal` DB will be sourced from that file, and further updates will be persisted to that file (instead of an in-memory database) - When creating internal.db, create a new `_datasette_internal` table to mark it as a ""datasette internal database"" - In `datasette serve`, we check for the existence of the `_datasette_internal` table. If it exists, we assume the user provided that file in error and raise an error. This is to limit the chance that someone accidentally publishes their internal database to the internet. We could optionally add a `--unsafe-allow-internal` flag (or database plugin) that allows someone to do this if they really want to. ## New features unlocked with this These features don't really need a standardized `_internal` table per se (plugins could currently configure their own long-term storage features if they really wanted to), but it would make it much simpler to create these kinds of features with a persistent application database. - **`datasette-comments`** : A plugin for commenting on rows or specific values in a database. Comment contents + threads + email notification info can be stored in `_internal` - **Bookmarks**: ""Bookmarking"" an SQL query could be stored in `_internal`, or a URL link shortener - **Webhooks**: If a plugin wants to either consume a webhook or create a new one, they can store hashed credentials/API endpoints in `_internal`",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2157/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 787098146,MDU6SXNzdWU3ODcwOTgxNDY=,1190,`datasette publish upload` mechanism for uploading databases to an existing Datasette instance,1024355,tomershvueli,closed,0,,,,,3,2021-01-15T18:18:42Z,2023-08-30T22:16:39Z,2023-08-30T22:16:38Z,NONE,,"If I have a self-hosted instance of Datasette up and running, I'd like to be able to use the CLI to publish databases to that instance, not only Google or Heroku. Ideally there'd be a `url` parameter or something similar to which one could point the publish command to their instance. ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1190/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1865174661,PR_kwDOBm6k_c5YsfZ7,2152,Bump the python-packages group with 3 updates,49699333,dependabot[bot],closed,0,,,,,3,2023-08-24T13:34:44Z,2023-08-28T13:49:39Z,2023-08-28T13:49:37Z,CONTRIBUTOR,simonw/datasette/pulls/2152,"Bumps the python-packages group with 3 updates: [sphinx](https://github.com/sphinx-doc/sphinx), [furo](https://github.com/pradyunsg/furo) and [blacken-docs](https://github.com/asottile/blacken-docs). Updates `sphinx` from 7.1.2 to 7.2.3
Release notes

Sourced from sphinx's releases.

Sphinx 7.2.3

Changelog: https://www.sphinx-doc.org/en/master/changes.html

Sphinx 7.2.2

Changelog: https://www.sphinx-doc.org/en/master/changes.html

Sphinx 7.2.1

Changelog: https://www.sphinx-doc.org/en/master/changes.html

Sphinx 7.2.0

Changelog: https://www.sphinx-doc.org/en/master/changes.html

Changelog

Sourced from sphinx's changelog.

Release 7.2.3 (released Aug 23, 2023)

Dependencies

  • #11576: Require sphinxcontrib-serializinghtml 1.1.9.

Bugs fixed

  • Fix regression in autodoc.Documenter.parse_name().
  • Fix regression in JSON serialisation.
  • #11543: autodoc: Support positional-only parameters in classmethod methods when autodoc_preserve_defaults is True.
  • Restore support for string methods on path objects. This is deprecated and will be removed in Sphinx 8. Use os.fspath to convert pathlib.Path objects to strings, or pathlib.Path's methods to work with path objects.

Release 7.2.2 (released Aug 17, 2023)

Bugs fixed

  • Fix the signature of the StateMachine.insert_input() patch, for when calling with keyword arguments.
  • Fixed membership testing (in) for the str interface of the asset classes (_CascadingStyleSheet and _JavaScript), which several extensions relied upon.
  • Fixed a type error in SingleFileHTMLBuilder._get_local_toctree, includehidden may be passed as a string or a boolean.
  • Fix :noindex: for PyModule and JSModule.

Release 7.2.1 (released Aug 17, 2023)

Bugs fixed

  • Restored the str interface of the asset classes (_CascadingStyleSheet and _JavaScript), which several extensions relied upon. This will be removed in Sphinx 9.
  • Restored calls to Builder.add_{css,js}_file(), which several extensions relied upon.
  • Restored the private API TocTree.get_toctree_ancestors(), which several extensions relied upon.

Release 7.2.0 (released Aug 17, 2023)

... (truncated)

Commits
  • 2f6ea14 Bump to 7.2.3 final
  • 511e407 Implement bool() for string paths
  • 494de73 Implement hash() for string paths
  • 2986aa1 Override special methods for string paths
  • 07b87e9 Update CHANGES for 7.2.3
  • 6b17dd1 Support string methods on path objects (#11619)
  • a73fb59 Support positional-only parameters in classmethods (#11635)
  • 02cb02c Fix invocation of python -m sphinx build
  • 6183b6a Require sphinxcontrib-serializinghtml 1.1.9 or later
  • 1e16f21 Fix regression in autodoc.Documenter.parse_name (#11613)
  • Additional commits viewable in compare view

Updates `furo` from 2023.7.26 to 2023.8.19
Changelog

Sourced from furo's changelog.

Changelog

2023.08.19 -- Xenolithic Xanadu

  • Fix missing search context with Sphinx 7.2, for dirhtml builds.
  • Drop support for Python 3.7.
  • Present configuration errors in a better format -- thanks @AA-Turner!
  • Bump require_sphinx() to Sphinx 6.0, in line with dependency changes in Unassuming Ultramarine.

2023.08.17 -- Wonderous White

  • Fix compatibility with Sphinx 7.2.0 and 7.2.1.

2023.07.26 -- Vigilant Volt

  • Fix compatibility with Sphinx 7.1.
  • Improve how content overflow is handled.
  • Improve how literal blocks containing inline code are handled.

2023.05.20 -- Unassuming Ultramarine

  • ✨ Add support for Sphinx 7.
  • Drop support for Sphinx 5.
  • Improve the screen-reader label for sidebar collapse.
  • Make it easier to create derived themes from Furo.
  • Bump all JS dependencies (NodeJS and npm packages).

2023.03.27 -- Tasty Tangerine

  • Regenerate with newer version of sphinx-theme-builder, to fix RECORD hashes.
  • Add missing class to Font Awesome examples

2023.03.23 -- Sassy Saffron

  • Update Python version classifiers.
  • Increase the icon size in mobile header.
  • Increase admonition title bg opacity.
  • Change the default API background to transparent.
  • Transition the API background change.

... (truncated)

Commits

Updates `blacken-docs` from 1.15.0 to 1.16.0
Changelog

Sourced from blacken-docs's changelog.

1.16.0 (2023-08-16)

  • Allow Markdown fence options.

    Thanks to initial work from Matthew Anderson in [PR #246](https://github.com/adamchainz/blacken-docs/pull/246).

  • Expand Markdown detection to all Python language names from Pygments: py, sage, python3, py3, and numpy.

  • Preserve leading whitespace lines in reStructuredText code blocks.

    Thanks to Julianus Pfeuffer for the report in [Issue #217](https://github.com/adamchainz/blacken-docs/issues/217).

  • Use exit code 2 to indicate errors from Black, whilst exit code 1 remains for “files have been formatted”.

    Thanks to Julianus Pfeuffer for the report in [Issue #218](https://github.com/adamchainz/blacken-docs/issues/218).

  • Support passing the --preview option through to Black, to select the future style.

  • Remove language_version from .pre-commit-hooks.yaml. This change allows default_language_version in .pre-commit-config.yaml to take precedence.

    Thanks to Aneesh Agrawal in [PR #258](https://github.com/adamchainz/blacken-docs/pull/258).

Commits

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`. [//]: # (dependabot-automerge-start) [//]: # (dependabot-automerge-end) ---
Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR: - `@dependabot rebase` will rebase this PR - `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it - `@dependabot merge` will merge this PR after your CI passes on it - `@dependabot squash and merge` will squash and merge this PR after your CI passes on it - `@dependabot cancel merge` will cancel a previously requested merge and block automerging - `@dependabot reopen` will reopen this PR if it is closed - `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually - `@dependabot show <dependency name> ignore conditions` will show all of the ignore conditions of the specified dependency - `@dependabot ignore <dependency name> major version` will close this group update PR and stop Dependabot creating any more for the specific dependency's major version (unless you unignore this specific dependency's major version or upgrade to it yourself) - `@dependabot ignore <dependency name> minor version` will close this group update PR and stop Dependabot creating any more for the specific dependency's minor version (unless you unignore this specific dependency's minor version or upgrade to it yourself) - `@dependabot ignore <dependency name>` will close this group update PR and stop Dependabot creating any more for the specific dependency (unless you unignore this specific dependency or upgrade to it yourself) - `@dependabot unignore <dependency name>` will remove all of the ignore conditions of the specified dependency - `@dependabot unignore <dependency name> <ignore condition>` will remove the ignore condition of the specified dependency and ignore condition
---- :books: Documentation preview :books:: https://datasette--2152.org.readthedocs.build/en/2152/ ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2152/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1781022369,I_kwDOBm6k_c5qKD6h,2091,Drop support for Python 3.7,9599,simonw,closed,0,,,,,3,2023-06-29T15:06:38Z,2023-08-23T18:18:18Z,2023-08-23T18:18:18Z,OWNER,,"It's EOL now, as of 2023-06-27 (two days ago): https://devguide.python.org/versions/ ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2091/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 324720095,MDU6SXNzdWUzMjQ3MjAwOTU=,275,"""config"" section in metadata.json (root, database and table level)",9599,simonw,closed,0,,,,,3,2018-05-20T16:02:28Z,2023-08-23T01:28:37Z,2023-08-23T01:28:37Z,OWNER,,"Split off from #274 Metadata should an optional `""config""` section at root, table or database level. The TableView and RowView and DatabaseView and BaseView classes could all have a `.config(""key"")` method which knows how to resolve the hierarchy of configs. This will allow individual tables (or databases) to set their own config settings for things like `sql_time_limit_ms`",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/275/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1857851384,I_kwDOCGYnMM5uvI_4,587,New .add_foreign_key() can break if PRAGMA legacy_alter_table=ON and there's an invalid foreign key reference,9599,simonw,closed,0,,,,,3,2023-08-19T20:01:26Z,2023-08-19T20:04:33Z,2023-08-19T20:04:32Z,OWNER,,"Extremely detailed story of how I got to this point: - https://github.com/simonw/llm/issues/162 Steps to reproduce (only if that pragma is on though): ```bash python -c ' import sqlite_utils db = sqlite_utils.Database(memory=True) db.execute("""""" CREATE TABLE ""logs"" ( [id] INTEGER PRIMARY KEY, [model] TEXT, [prompt] TEXT, [system] TEXT, [prompt_json] TEXT, [options_json] TEXT, [response] TEXT, [response_json] TEXT, [reply_to_id] INTEGER, [chat_id] INTEGER REFERENCES [log]([id]), [duration_ms] INTEGER, [datetime_utc] TEXT ); """""") db[""logs""].add_foreign_key(""reply_to_id"", ""logs"", ""id"") ' ``` This succeeds in some environments, fails in others.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/587/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1856075668,I_kwDOCGYnMM5uoXeU,586,.transform() fails to drop column if table is part of a view,9599,simonw,open,0,,,,,3,2023-08-18T05:25:22Z,2023-08-18T06:13:47Z,,OWNER,,"I got this error trying to drop a column from a table that was part of a SQL view: > error in view plugins: no such table: main.pypi_releases Upon further investigation I found that this pattern seemed to fix it: ```python def transform_the_table(conn): # Run this in a transaction: with conn: # We have to read all the views first, because we need to drop and recreate them db = sqlite_utils.Database(conn) views = {v.name: v.schema for v in db.views if table.lower() in v.schema.lower()} for view in views.keys(): 
db[view].drop() db[table].transform( types=types, rename=rename, drop=drop, column_order=[p[0] for p in order_pairs], ) # Now recreate the views for name, schema in views.items(): db.create_view(name, schema) ``` So grab a copy of any view that might reference this table, start a transaction, drop those views, run the transform, recreate the views again. > I wonder if this should become an option in `sqlite-utils`? Maybe a `recreate_views=True` argument for `table.tranform(...)`? Should it be opt-in or opt-out? _Originally posted by @simonw in https://github.com/simonw/datasette-edit-schema/issues/35#issuecomment-1683370548_ ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/586/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1838266862,I_kwDOBm6k_c5tkbnu,2126,Permissions in metadata.yml / metadata.json,36199671,ctsrc,closed,0,,,,,3,2023-08-06T16:24:10Z,2023-08-11T05:52:30Z,2023-08-11T05:52:29Z,NONE,,"https://docs.datasette.io/en/latest/authentication.html#other-permissions-in-metadata says the following: > For all other permissions, you can use one or more ""permissions"" blocks in your metadata. > To grant access to the permissions debug tool to all signed in users you can grant permissions-debug to any actor with an id matching the wildcard * by adding this a the root of your metadata: ```yaml permissions: debug-menu: id: '*' ``` I tried this. My `metadata.yml` file looks like: ```yaml permissions: debug-menu: id: '*' permissions-debug: id: '*' plugins: datasette-auth-passwords: myuser_password_hash: $env: ""PASSWORD_HASH_MYUSER"" ``` And then I run ```zsh datasette -m metadata.yml tiddlywiki.db --root ``` And I open a session for the ""root"" user of datasette with the link given. I open a private browser session and log in as ""myuser"" from http://127.0.0.1:8001/-/login Then I check http://127.0.0.1:8001/-/actor which confirms that I am logged in as the ""myuser"" actor ```json { ""actor"": { ""id"": ""myuser"" } } ``` In the session where I am logged in as ""myuser"" I then try to go to http://127.0.0.1:8001/-/permissions But all I get there as the logged in user ""myuser"" is > Forbidden > > Permission denied And then if I check the http://127.0.0.1:8001/-/permissions as the datasette ""root"" user from another browser session, I see: > permissions-debug checked at 2023-08-06T16:22:58.997841 ✗ (used default) > > Actor: {""id"": ""myuser""} It seems that in spite of having tried to give the `permissions-debug` permission to the ""myuser"" user in my `metadata.yml` file, datasette does not agree that ""myuser"" has permission `permissions-debug`.. What do I need to do differently so that my ""myuser"" user is able to access http://127.0.0.1:8001/-/permissions ?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2126/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1823393475,I_kwDOBm6k_c5srsbD,2119,"database color shows only on index page, not other pages",9599,simonw,closed,0,,,3268330,Datasette 1.0,3,2023-07-27T00:19:39Z,2023-08-11T05:25:45Z,2023-08-11T05:16:24Z,OWNER,,"I think this has been a bug for a long time. https://latest.datasette.io/ currently shows: Those colors are based on a hash of the database name. 
But when you click through to https://latest.datasette.io/fixtures It's red on all sub-pages too.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2119/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1838469176,I_kwDOBm6k_c5tlNA4,2127,Context base class to support documenting the context,9599,simonw,open,0,,,3268330,Datasette 1.0,3,2023-08-07T00:01:02Z,2023-08-10T01:30:25Z,,OWNER,,"This idea first came up here: - https://github.com/simonw/datasette/issues/2112#issuecomment-1652751140 If `datasette.render_template(...)` takes an optional `Context` subclass as an alternative to a context dictionary, I could then use dataclasses to define the context made available to specific templates - which then gives me something I can use to help document what they are. Also refs: - https://github.com/simonw/datasette/issues/1510",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2127/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1843600087,I_kwDOBm6k_c5t4xrX,2135,Release notes for 1.0a3,9599,simonw,closed,0,,,9700784,Datasette 1.0a3,3,2023-08-09T16:09:26Z,2023-08-09T19:17:07Z,2023-08-09T19:17:06Z,OWNER,,118 commits! https://github.com/simonw/datasette/compare/1.0a2...26be9f0445b753fb84c802c356b0791a72269f25,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2135/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1822939274,I_kwDOBm6k_c5sp9iK,2113,Implement and document extras for the new query view page,9599,simonw,open,0,,,8755003,Datasette 1.0a-next,3,2023-07-26T18:24:01Z,2023-08-09T17:35:22Z,,OWNER,,- #2109 ,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2113/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1822940263,I_kwDOBm6k_c5sp9xn,2114,Implement canned queries against new query JSON work,9599,simonw,closed,0,,,9700784,Datasette 1.0a3,3,2023-07-26T18:24:50Z,2023-08-09T15:26:58Z,2023-08-09T15:26:57Z,OWNER,,- #2109 ,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2114/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1719759468,PR_kwDOBm6k_c5RBXH_,2077,Bump furo from 2023.3.27 to 2023.5.20,49699333,dependabot[bot],closed,0,,,,,3,2023-05-22T13:58:16Z,2023-07-27T13:40:55Z,2023-07-27T13:40:53Z,CONTRIBUTOR,simonw/datasette/pulls/2077,"Bumps [furo](https://github.com/pradyunsg/furo) from 2023.3.27 to 2023.5.20.
Changelog

Sourced from furo's changelog.

Changelog

2023.05.20 -- Unassuming Ultramarine

  • ✨ Add support for Sphinx 7.
  • Drop support for Sphinx 5.
  • Improve the screen-reader label for sidebar collapse.
  • Make it easier to create derived themes from Furo.
  • Bump all JS dependencies (NodeJS and npm packages).

2023.03.27 -- Tasty Tangerine

  • Regenerate with newer version of sphinx-theme-builder, to fix RECORD hashes.
  • Add missing class to Font Awesome examples

2023.03.23 -- Sassy Saffron

  • Update Python version classifiers.
  • Increase the icon size in mobile header.
  • Increase admonition title bg opacity.
  • Change the default API background to transparent.
  • Transition the API background change.
  • Remove the "indent" of API entries which have a background.
  • Break long inline code literals.

2022.12.07 -- Reverent Raspberry

  • ✨ Add support for Sphinx 6.
  • ✨ Improve footnote presentation with docutils 0.18+.
  • Drop support for Sphinx 4.
  • Improve documentation about what the edit button does.
  • Improve handling of empty-flexboxes for better print experience on Chrome.
  • Improve styling for inline signatures.
  • Replace the meta generator tag with a comment.
  • Tweak labels with icons to prevent users selecting icons as text on touch.

2022.09.29 -- Quaint Quartz

  • Add ability to set arbitrary URLs for edit button.

... (truncated)

Commits
  • d2c9ca8 Prepare release: 2023.05.20
  • 662d21b Update changelog
  • 591780b Bump compatible Sphinx version
  • c2e7837 Bump NodeJS and package versions
  • dd85574 Use the reference HtmlFormatter class defined on PygmentsBridge. (#657)
  • 6bff419 Fix broken link (#654)
  • e7f732e Improve the screen-reader label for sidebar collapse
  • 48c0bf2 Drop the check for the theme name
  • 1b17d81 [pre-commit.ci] pre-commit autoupdate (#646)
  • 4904fd5 Remove Python 3.8 constraint from Black pre-commit config (#647)
  • Additional commits viewable in compare view

[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=furo&package-manager=pip&previous-version=2023.3.27&new-version=2023.5.20)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores) You can trigger a rebase of this PR by commenting `@dependabot rebase`. [//]: # (dependabot-automerge-start) [//]: # (dependabot-automerge-end) ---
Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR: - `@dependabot rebase` will rebase this PR - `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it - `@dependabot merge` will merge this PR after your CI passes on it - `@dependabot squash and merge` will squash and merge this PR after your CI passes on it - `@dependabot cancel merge` will cancel a previously requested merge and block automerging - `@dependabot reopen` will reopen this PR if it is closed - `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually - `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
---- :books: Documentation preview :books:: https://datasette--2077.org.readthedocs.build/en/2077/ > **Note** > Automatic rebases have been disabled on this pull request as it has been open for over 30 days.",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2077/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1822949756,I_kwDOBm6k_c5sqAF8,2116,Turn DatabaseDownload into an async view function,9599,simonw,closed,0,,,9700784,Datasette 1.0a3,3,2023-07-26T18:31:59Z,2023-07-26T18:44:00Z,2023-07-26T18:44:00Z,OWNER,,"A minor refactor, but it is a good starting point for this new branch. Refs: - #2109",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2116/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1816857442,I_kwDOBm6k_c5sSwti,2106,`datasette install -e` option,9599,simonw,closed,0,,,,,3,2023-07-22T18:33:42Z,2023-07-26T18:28:33Z,2023-07-22T18:42:54Z,OWNER,,"As seen in LLM and now in `sqlite-utils` too: - https://github.com/simonw/sqlite-utils/issues/570 Useful for developing plugins, see tutorial at https://llm.datasette.io/en/stable/plugins/tutorial-model-plugin.html",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2106/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1816918185,I_kwDOCGYnMM5sS_ip,574,`prepare_connection()` plugin hook,9599,simonw,closed,0,,,,,3,2023-07-22T22:52:47Z,2023-07-22T23:13:14Z,2023-07-22T22:59:10Z,OWNER,,"> Splitting off an issue for `prepare_connection()` since Alex got the PR in seconds before I shipped 3.34! _Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/567#issuecomment-1646686424_ ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/574/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1816852402,I_kwDOCGYnMM5sSvey,569,register_command plugin hook,9599,simonw,closed,0,,,,,3,2023-07-22T18:17:27Z,2023-07-22T19:19:35Z,2023-07-22T19:19:35Z,OWNER,,"> I'm going to start by adding the `register_command` hook using the exact same pattern as Datasette and LLM. 
_Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/567#issuecomment-1646643450_ ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/569/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1811824307,I_kwDOBm6k_c5r_j6z,2105,When reverse proxying datasette with nginx an URL element gets erronously added,2235371,aki-k,open,0,,,,,3,2023-07-19T12:16:53Z,2023-07-21T21:17:09Z,,NONE,,"I use this nginx config: ``` location /datasette-llm { return 302 /datasette-llm/; } location /datasette-llm/ { proxy_set_header Upgrade $http_upgrade; proxy_set_header Connection ""Upgrade""; proxy_http_version 1.1; proxy_set_header X-Real-IP $remote_addr; proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for; proxy_set_header X-Forwarded-Proto https; proxy_set_header X-Forwarded-Host $http_host; proxy_set_header Host $host; proxy_max_temp_file_size 0; proxy_pass http://127.0.0.1:8001/datasette-llm/; proxy_redirect http:// https://; proxy_buffering off; proxy_request_buffering off; proxy_set_header Origin ''; client_max_body_size 0; auth_basic ""datasette-llm""; auth_basic_user_file /etc/nginx/custom-userdb; } ``` Then I start datasette with this command: ``` datasette serve --setting base_url /datasette-llm/ $(llm logs path) ``` Everything else works right, except the links in ""This data as json, CSV"". They get an extra URL element ""datasette-llm"" like this: https://192.168.1.3:5432/datasette-llm/datasette-llm/logs.json?sql=select+*+from+_llm_migrations https://192.168.1.3:5432/datasette-llm/datasette-llm/logs.csv?sql=select+*+from+_llm_migrations&_size=max When I remove that extra ""datasette-llm"" from the URL, those links work too.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2105/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1054244712,I_kwDOBm6k_c4-1n9o,1510,Datasette 1.0 documented template context (maybe via API docs),9599,simonw,open,0,,,3268330,Datasette 1.0,3,2021-11-15T23:23:58Z,2023-06-28T02:05:21Z,,OWNER,,Documented context plus protective unit tests. 
Goal is that custom templates built for 1.x will not break without a 2.x release.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1510/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1726603778,PR_kwDOBm6k_c5RYvTU,2080,New View base class,9599,simonw,closed,0,,,,,3,2023-05-25T23:22:55Z,2023-05-26T00:18:45Z,2023-05-26T00:18:44Z,OWNER,simonw/datasette/pulls/2080,"Refs: - #2078 TODO: - [x] Teach router layer how to handle this - [x] Use it for something ---- :books: Documentation preview :books:: https://datasette--2080.org.readthedocs.build/en/2080/ ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2080/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1718595700,I_kwDOCGYnMM5mb7B0,550,AttributeError: 'EntryPoints' object has no attribute 'get' for flake8 on Python 3.7,9599,simonw,closed,0,,,,,3,2023-05-21T18:24:39Z,2023-05-21T18:42:25Z,2023-05-21T18:41:58Z,OWNER,,"https://github.com/simonw/sqlite-utils/actions/runs/5039064797/jobs/9036965488 ``` Traceback (most recent call last): File ""/opt/hostedtoolcache/Python/3.7.16/x64/bin/flake8"", line 8, in sys.exit(main()) File ""/opt/hostedtoolcache/Python/3.7.16/x64/lib/python3.7/site-packages/flake8/main/cli.py"", line 22, in main app.run(argv) File ""/opt/hostedtoolcache/Python/3.7.16/x64/lib/python3.7/site-packages/flake8/main/application.py"", line 363, in run self._run(argv) File ""/opt/hostedtoolcache/Python/3.7.16/x64/lib/python3.7/site-packages/flake8/main/application.py"", line 350, in _run self.initialize(argv) File ""/opt/hostedtoolcache/Python/3.7.16/x64/lib/python3.7/site-packages/flake8/main/application.py"", line 330, in initialize self.find_plugins(config_finder) File ""/opt/hostedtoolcache/Python/3.7.16/x64/lib/python3.7/site-packages/flake8/main/application.py"", line 153, in find_plugins self.check_plugins = plugin_manager.Checkers(local_plugins.extension) File ""/opt/hostedtoolcache/Python/3.7.16/x64/lib/python3.7/site-packages/flake8/plugins/manager.py"", line 357, in __init__ self.namespace, local_plugins=local_plugins File ""/opt/hostedtoolcache/Python/3.7.16/x64/lib/python3.7/site-packages/flake8/plugins/manager.py"", line 238, in __init__ self._load_entrypoint_plugins() File ""/opt/hostedtoolcache/Python/3.7.16/x64/lib/python3.7/site-packages/flake8/plugins/manager.py"", line 254, in _load_entrypoint_plugins eps = importlib_metadata.entry_points().get(self.namespace, ()) AttributeError: 'EntryPoints' object has no attribute 'get' Error: Process completed with exit code 1. 
```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/550/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1718586377,PR_kwDOCGYnMM5Q9cAv,549,TUI powered by Trogon,9599,simonw,closed,0,,,,,3,2023-05-21T17:55:42Z,2023-05-21T18:42:00Z,2023-05-21T18:41:56Z,OWNER,simonw/sqlite-utils/pulls/549,"Refs: - #545 ---- :books: Documentation preview :books:: https://sqlite-utils--549.org.readthedocs.build/en/549/ ",140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/549/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1665200812,PR_kwDOCGYnMM5OKveS,537,Support self-referencing FKs in `Table.create`,544011,numist,closed,0,,,,,3,2023-04-12T20:26:59Z,2023-05-08T22:45:33Z,2023-05-08T21:10:01Z,CONTRIBUTOR,simonw/sqlite-utils/pulls/537," ---- :books: Documentation preview :books:: https://sqlite-utils--537.org.readthedocs.build/en/537/ ",140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/537/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1516644980,I_kwDOCGYnMM5aZip0,520,rows_from_file() raises confusing error if file-like object is not in binary mode,9599,simonw,closed,0,,,,,3,2023-01-02T19:00:14Z,2023-05-08T22:08:07Z,2023-05-08T22:08:07Z,OWNER,,"I got this error: ``` File ""/Users/simon/Dropbox/Development/openai-to-sqlite/openai_to_sqlite/cli.py"", line 27, in embeddings rows, _ = rows_from_file(input) ^^^^^^^^^^^^^^^^^^^^^ File ""/Users/simon/.local/share/virtualenvs/openai-to-sqlite-jt4obeb2/lib/python3.11/site-packages/sqlite_utils/utils.py"", line 305, in rows_from_file first_bytes = buffered.peek(2048).strip() ^^^^^^^^^^^^^^^^^^^ ``` From this code: ```python @cli.command() @click.argument( ""db_path"", type=click.Path(file_okay=True, dir_okay=False, allow_dash=False), ) @click.option( ""-i"", ""--input"", type=click.File(""r""), default=""-"", ) def embeddings(db_path, input): ""Store embeddings for one or more text documents"" click.echo(""Here is some output"") db = sqlite_utils.Database(db_path) rows, _ = rows_from_file(input) print(list(rows)) ``` The error went away when I changed it to `type=click.File(""rb"")`. This should either be called out in the documentation or `rows_from_file()` should be fixed to handle text-mode files in addition to binary files.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/520/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1531991339,I_kwDOBm6k_c5bUFUr,1989,Suggestion: Hiding columns,116795,pax,open,0,,,,,3,2023-01-13T09:33:32Z,2023-03-31T06:18:05Z,,NONE,,As there's the possibility of [hiding tables](https://docs.datasette.io/en/stable/metadata.html#hiding-tables) - I've run into the **need of hiding specific columns** - data that's either not relevant for public or can't be shown due to privacy reasons. 
,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1989/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1618130434,I_kwDOJHON9s5gcrYC,11,Implement a SQL view to make it easier to query files in a nested folder,9599,simonw,open,0,,,,,3,2023-03-09T23:19:28Z,2023-03-09T23:24:01Z,,MEMBER,,"Working with nested data in SQL is tricky, can I make it easier with a view or canned query?",611552758,apple-notes-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/11/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1615891776,I_kwDOBm6k_c5gUI1A,2037,Test failure: FAILED tests/test_cli.py::test_install_requirements - FileNotFoundError,9599,simonw,closed,0,,,,,3,2023-03-08T20:30:06Z,2023-03-09T22:33:39Z,2023-03-09T22:33:39Z,OWNER,,"> FAILED tests/test_cli.py::test_install_requirements - FileNotFoundError: [Errno 2] No such file or directory From https://github.com/simonw/datasette/actions/runs/4348548218/jobs/7597208191 ``` =================================== FAILURES =================================== __________________________ test_install_requirements ___________________________ run_module = @mock.patch(""datasette.cli.run_module"") def test_install_requirements(run_module): runner = CliRunner() > with runner.isolated_filesystem(): /home/runner/work/datasette/datasette/tests/test_cli.py:184: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /opt/hostedtoolcache/Python/3.9.16/x64/lib/python3.9/contextlib.py:119: in __enter__ return next(self.gen) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , temp_dir = None @contextlib.contextmanager def isolated_filesystem( self, temp_dir: t.Optional[t.Union[str, os.PathLike]] = None ) -> t.Iterator[str]: """"""A context manager that creates a temporary directory and changes the current working directory to it. This isolates tests that affect the contents of the CWD to prevent them from interfering with each other. :param temp_dir: Create the temporary directory under this directory. If given, the created directory is not removed when exiting. .. versionchanged:: 8.0 Added the ``temp_dir`` parameter. 
"""""" > cwd = os.getcwd() E FileNotFoundError: [Errno 2] No such file or directory /opt/hostedtoolcache/Python/3.9.16/x64/lib/python3.9/site-packages/click/testing.py:466: FileNotFoundError ``` Not sure why it only affected the ""[Calculate test coverage](https://github.com/simonw/datasette/actions/workflows/test-coverage.yml)"" one.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2037/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1616347574,I_kwDOJHON9s5gV4G2,1,Initial proof of concept with ChatGPT,9599,simonw,closed,0,,,,,3,2023-03-09T03:44:39Z,2023-03-09T03:51:55Z,2023-03-09T03:51:55Z,MEMBER,,I'm using ChatGPT to figure out enough AppleScript to get at my notes data.,611552758,apple-notes-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/1/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1555701851,PR_kwDOBm6k_c5IdsD7,2003,Show referring tables and rows when the referring foreign key is compound,536941,fgregg,open,0,,,,,3,2023-01-24T21:31:31Z,2023-01-25T18:44:42Z,,CONTRIBUTOR,simonw/datasette/pulls/2003,"sqlite foreign keys can be compound, but that is not as well supported by datasette as single column foreign keys. in particular, 1. in a table view, there is not a link from the row to the referenced row if the foreign key is compound 2. in a row view, there is no listing of tables and rows that refer to the focal row if those referencing foreign keys are compound. Both of these issues are discussed in #1099. This PR only fixes the second one, because it's not clear what the right UX is for the first issue. ![Screenshot 2023-01-24 at 19-47-40 nlrb bargaining_unit](https://user-images.githubusercontent.com/536941/214454749-d53deead-4151-4329-a5d4-8a7a454de7d3.png) Some things that might not be desirable about this approach. 1. it changes the external API, by changing `column` => `columns` and `other_column` => `other_columns` (see inline comment for more discussion. 2. There are various places where the plural foreign keys have to be checked for length and discarded or transformed to singular. ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2003/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1515185383,I_kwDOBm6k_c5aT-Tn,1971,Upgrade for Sphinx 6.0 (once Furo has support for it),9599,simonw,closed,0,,,,,3,2022-12-31T19:04:35Z,2023-01-10T02:02:34Z,2023-01-10T02:02:34Z,OWNER,,"A deployment of #1967 to ReadTheDocs just failed like this: https://readthedocs.org/projects/datasette/builds/19045460/ ``` Running Sphinx v6.0.0 making output directory... done building [mo]: targets for 0 po files that are out of date building [html]: targets for 28 source files that are out of date updating environment: [new config] 28 added, 0 changed, 0 removed reading sources... [ 3%] authentication reading sources... [ 7%] binary_data reading sources... 
[ 10%] changelog Traceback (most recent call last): File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/statemachine.py"", line 299, in next_line self.line = self.input_lines[self.line_offset] File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/statemachine.py"", line 1136, in __getitem__ return self.data[i] IndexError: list index out of range During handling of the above exception, another exception occurred: Traceback (most recent call last): File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/statemachine.py"", line 226, in run self.next_line() File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/statemachine.py"", line 302, in next_line raise EOFError EOFError During handling of the above exception, another exception occurred: Traceback (most recent call last): File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/sphinx/cmd/build.py"", line 281, in build_main app.build(args.force_all, args.filenames) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/sphinx/application.py"", line 344, in build self.builder.build_update() File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/sphinx/builders/__init__.py"", line 310, in build_update self.build(to_build, File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/sphinx/builders/__init__.py"", line 326, in build updated_docnames = set(self.read()) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/sphinx/builders/__init__.py"", line 433, in read self._read_serial(docnames) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/sphinx/builders/__init__.py"", line 454, in _read_serial self.read_doc(docname) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/sphinx/builders/__init__.py"", line 510, in read_doc publisher.publish() File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/core.py"", line 224, in publish self.document = self.reader.read(self.source, self.parser, File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/sphinx/io.py"", line 103, in read self.parse() File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/readers/__init__.py"", line 76, in parse self.parser.parse(self.input, document) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/sphinx/parsers.py"", line 78, in parse self.statemachine.run(inputlines, document, inliner=self.inliner) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 169, in run results = StateMachineWS.run(self, input_lines, input_offset, File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/statemachine.py"", line 233, in run context, next_state, result = self.check_line( File 
""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/statemachine.py"", line 445, in check_line return method(match, context, next_state) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 3024, in text self.section(title.lstrip(), source, style, lineno + 1, messages) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 325, in section self.new_subsection(title, lineno, messages) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 391, in new_subsection newabsoffset = self.nested_parse( File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 279, in nested_parse state_machine.run(block, input_offset, memo=self.memo, File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 195, in run results = StateMachineWS.run(self, input_lines, input_offset) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/statemachine.py"", line 233, in run context, next_state, result = self.check_line( File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/statemachine.py"", line 445, in check_line return method(match, context, next_state) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 2785, in underline self.section(title, source, style, lineno - 1, messages) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 325, in section self.new_subsection(title, lineno, messages) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 391, in new_subsection newabsoffset = self.nested_parse( File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 279, in nested_parse state_machine.run(block, input_offset, memo=self.memo, File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 195, in run results = StateMachineWS.run(self, input_lines, input_offset) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/statemachine.py"", line 233, in run context, next_state, result = self.check_line( File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/statemachine.py"", line 445, in check_line return method(match, context, next_state) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 1273, in bullet i, blank_finish = self.list_item(match.end()) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 1295, in list_item self.nested_parse(indented, 
input_offset=line_offset, File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 279, in nested_parse state_machine.run(block, input_offset, memo=self.memo, File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 195, in run results = StateMachineWS.run(self, input_lines, input_offset) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/statemachine.py"", line 239, in run result = state.eof(context) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 2725, in eof self.blank(None, context, None) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 2716, in blank paragraph, literalnext = self.paragraph( File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 416, in paragraph textnodes, messages = self.inline_text(text, lineno) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 425, in inline_text nodes, messages = self.inliner.parse(text, lineno, File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 649, in parse before, inlines, remaining, sysmessages = method(self, match, File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 792, in interpreted_or_phrase_ref nodelist, messages = self.interpreted(rawsource, escaped, role, File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 889, in interpreted nodes, messages2 = role_fn(role, rawsource, text, lineno, self) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/sphinx/ext/extlinks.py"", line 101, in role title = caption % part TypeError: not all arguments converted during string formatting Exception occurred: File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/sphinx/ext/extlinks.py"", line 101, in role title = caption % part TypeError: not all arguments converted during string formatting The full traceback has been saved in /tmp/sphinx-err-kq7ylgqo.log, if you want to report the issue to the developers. Please also report this if it was a user error, so that a better error message can be provided next time. A bug report can be filed in the tracker at . Thanks! 
```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1971/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1525815985,I_kwDOBm6k_c5a8hqx,1983,Make CustomJSONEncoder a documented public API,9599,simonw,open,0,,,,,3,2023-01-09T15:27:05Z,2023-01-09T15:35:58Z,,OWNER,,It's used by `datasette-geojson` here: https://github.com/eyeseast/datasette-geojson/commit/902bf135a5a33a0dc8264673d00a59a67cb05152,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1983/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 710650633,MDU6SXNzdWU3MTA2NTA2MzM=,979,Default table view JSON should include CREATE TABLE,9599,simonw,closed,0,,,,,3,2020-09-28T23:54:58Z,2023-01-09T15:32:39Z,2023-01-09T15:32:22Z,OWNER,,"https://latest.datasette.io/fixtures/facetable.json doesn't currently include the CREATE TABLE statement for the page, even though it's available on the HTML version at https://latest.datasette.io/fixtures/facetable",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/979/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1082584499,I_kwDOBm6k_c5Ahu2z,1558,Redesign `facet_results` JSON structure prior to Datasette 1.0,9599,simonw,open,0,,,3268330,Datasette 1.0,3,2021-12-16T19:45:10Z,2023-01-09T15:31:17Z,,OWNER,,"> Decision: as an initial fix I'm going to de-duplicate those keys by using `tags__array` etc - with a `_2` on the end if that key is already used. > > I'll open a separate issue to redesign this better for Datasette 1.0. _Originally posted by @simonw in https://github.com/simonw/datasette/issues/625#issuecomment-996130862_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1558/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1485488236,PR_kwDOBm6k_c5E1iJG,1938,"""permissions"" blocks in metadata.json/yaml",9599,simonw,closed,0,,,8711695, Datasette 1.0a2,3,2022-12-08T22:07:36Z,2022-12-13T05:23:18Z,2022-12-13T05:23:18Z,OWNER,simonw/datasette/pulls/1938,"Refs #1636 - [x] Documentation - [ ] Implementation - [ ] Validate metadata to check there are no nonsensical permissions (like `debug-menu` set at the table level) - [ ] Tests ---- :books: Documentation preview :books:: https://datasette--1938.org.readthedocs.build/en/1938/ ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1938/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1175690070,I_kwDOBm6k_c5GE5tW,1676,"Reconsider ensure_permissions() logic, can it be less confusing?",9599,simonw,open,0,,,3268330,Datasette 1.0,3,2022-03-21T17:14:57Z,2022-12-02T01:23:40Z,,OWNER,,"> Updated documentation: https://github.com/simonw/datasette/blob/e627510b760198ccedba9e5af47a771e847785c9/docs/internals.rst#await-ensure_permissionsactor-permissions > >> This method allows multiple permissions to be checked at onced. It raises a `datasette.Forbidden` exception if any of the checks are denied before one of them is explicitly granted. 
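The `TypeError` at the bottom of the readthedocs traceback above comes from `sphinx.ext.extlinks` evaluating `caption % part`. As a hedged illustration (this is not Datasette's actual `conf.py`), a caption missing its `%s` placeholder reproduces the failure, and adding the placeholder fixes it:

```python
# Illustrative conf.py fragment, not Datasette's real configuration.
extensions = ["sphinx.ext.extlinks"]

extlinks = {
    # Broken: extlinks computes `caption % part`, and a caption without a
    # %s placeholder leaves the argument unconsumed, raising
    # "TypeError: not all arguments converted during string formatting".
    # "issue": ("https://github.com/simonw/datasette/issues/%s", "issue "),
    #
    # Working: the matched part is interpolated into the caption.
    "issue": ("https://github.com/simonw/datasette/issues/%s", "issue %s"),
}
```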
>> >> This is useful when you need to check multiple permissions at once. For example, an actor should be able to view a table if either one of the following checks returns `True` or not a single one of them returns `False`: > > That's pretty hard to understand! I'm going to open a separate issue to reconsider if this is a useful enough abstraction given how confusing it is. _Originally posted by @simonw in https://github.com/simonw/datasette/issues/1675#issuecomment-1074177827_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1676/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1214859703,I_kwDOBm6k_c5IaUm3,1719,Refactor `RowView` and remove `RowTableShared`,9599,simonw,closed,0,,,,,3,2022-04-25T18:06:24Z,2022-12-01T21:15:19Z,2022-04-25T18:33:44Z,OWNER,,"> The `RowTableShared` class is making this a whole lot more complicated. > > I'm going to split the `RowView` view out into an entirely separate `views/row.py` module. _Originally posted by @simonw in https://github.com/simonw/datasette/issues/1715#issuecomment-1108875068_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1719/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1470320227,I_kwDOBm6k_c5Xo05j,1923,latest.datasette.io Cloud Run deploys failing,9599,simonw,closed,0,,,,,3,2022-11-30T22:49:34Z,2022-11-30T23:04:56Z,2022-11-30T23:04:56Z,OWNER,,"https://github.com/simonw/datasette/actions/runs/3587402085/jobs/6038106719v ``` Warning: ""service_account_key"" has been deprecated. Please switch to using google-github-actions/auth which supports both Workload Identity Federation and Service Account Key JSON authentication. For more details, see https://github.com/google-github-actions/setup-gcloud#authorization Error: google-github-actions/setup-gcloud failed with: failed to execute command `gcloud --quiet auth activate-service-account *** --key-file -`: /opt/hostedtoolcache/gcloud/275.0.0/x64/lib/googlecloudsdk/core/console/console_io.py:544: SyntaxWarning: ""is"" with a literal. Did you mean ""==""? if answer is None or (answer is '' and default is not None): ERROR: gcloud failed to load: module 'collections' has no attribute 'MutableMapping' ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1923/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1450312343,I_kwDOBm6k_c5WcgKX,1892,Merge 1.0-dev branch back to main,9599,simonw,closed,0,,,8658075,Datasette 1.0a0,3,2022-11-15T20:04:25Z,2022-11-29T19:40:23Z,2022-11-29T19:40:23Z,OWNER,,"I'm committed enough to the 1.0 work now that I'm ready for the `main` branch to reflect that instead. 
If I need to make any dot-releases against 0.63 I can do those from a branch.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1892/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1434911255,I_kwDOCGYnMM5VhwIX,510,Cannot enable FTS5 despite it being available,1176293,ar-jan,closed,0,,,,,3,2022-11-03T16:03:49Z,2022-11-18T18:37:52Z,2022-11-17T10:36:28Z,NONE,,"When I do `sqlite-utils enable-fts my.db table_name column_name` (with or without `--fts5`), I get an FTS4 virtual table instead of the expected FTS5. FTS5 is however available and Python/SQLite versions do not seem to be the issue. I can manually create the FTS5 virtual table, and then Datasette also works with it from this same Python environment. `>>> sqlite3.version` `2.6.0` `>>> sqlite3.sqlite_version` `3.39.4` `PRAGMA compile_options;` includes `ENABLE_FTS5`. `sqlite-utils, version 3.30`. Any ideas what's happening and how to fix?",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/510/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1450952393,I_kwDOCGYnMM5We8bJ,512,mypy failures in CI,9599,simonw,closed,0,,,,,3,2022-11-16T06:22:48Z,2022-11-16T07:49:51Z,2022-11-16T07:49:50Z,OWNER,,"https://github.com/simonw/sqlite-utils/actions/runs/3472012235 failed on Python 3.11: Truncated output: ``` sqlite_utils/db.py:2467: note: PEP 484 prohibits implicit Optional. Accordingly, mypy has changed its default to no_implicit_optional=True sqlite_utils/db.py:2467: note: Use https://github.com/hauntsaninja/no_implicit_optional to automatically upgrade your codebase sqlite_utils/db.py:2530: error: Incompatible default for argument ""where"" (default has type ""None"", argument has type ""str"") [assignment] sqlite_utils/db.py:2530: note: PEP 484 prohibits implicit Optional. 
Accordingly, mypy has changed its default to no_implicit_optional=True sqlite_utils/db.py:2530: note: Use https://github.com/hauntsaninja/no_implicit_optional to automatically upgrade your codebase sqlite_utils/db.py:2658: error: Argument 1 to ""count_where"" of ""Queryable"" has incompatible type ""Optional[str]""; expected ""str"" [arg-type] Found 23 errors in 1 file (checked 51 source files) ``` Best look at https://github.com/hauntsaninja/no_implicit_optional",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/512/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1423369494,I_kwDOBm6k_c5U1uUW,1859,datasette create-token CLI command,9599,simonw,closed,0,,,8658075,Datasette 1.0a0,3,2022-10-26T03:12:59Z,2022-11-15T19:59:00Z,2022-10-26T04:31:39Z,OWNER,,The CLI equivalent of the `/-/create-token` page.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1859/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1423336122,I_kwDOBm6k_c5U1mK6,1856,allow_signed_tokens setting for disabling API signed token mechanism,9599,simonw,closed,0,,,8658075,Datasette 1.0a0,3,2022-10-26T02:20:55Z,2022-11-15T19:57:05Z,2022-10-26T02:58:35Z,OWNER,,"Had some design thoughts here: https://github.com/simonw/datasette/issues/1852#issuecomment-1291272280 I liked this option the most: --setting allow_create_tokens off",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1856/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1429029604,I_kwDOCGYnMM5VLULk,506,Make `cursor.rowcount` accessible (wontfix),9599,simonw,closed,0,,,,,3,2022-10-30T21:51:55Z,2022-11-01T17:37:47Z,2022-11-01T17:37:13Z,OWNER,,"In building this Datasette feature on top of `sqlite-utils` I thought it might be useful to expose the number of rows that had been affected by a bulk insert or update - the `cursor.rowcount`: - https://github.com/simonw/datasette/issues/1866 This isn't currently exposed by `sqlite-utils`.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/506/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1428560020,I_kwDOBm6k_c5VJhiU,1872,"SITE-BUSTING ERROR: ""render_template() called before await ds.invoke_startup()""",192568,mroswell,closed,0,,,,,3,2022-10-30T02:28:39Z,2022-10-30T06:26:01Z,2022-10-30T06:26:01Z,CONTRIBUTOR,,"1. My https://list.saferdisinfectants.org/disinfectants/listN page (linked from https://SaferDisinfectants.org ) has been running beautifully for a year and a half, including a GitHub Actions workflow that's been routinely updating the database. 2. I received a recent report that the list page is down. I don't know when it went down, but the content is replaced with: ""render_template() called before await ds.invoke_startup()"" 3. The local datasette repo runs without incident. 4. The site is hosted on vercel, linked to my github repo. Perhaps some vercel changes were made, but not by anyone on our side. 
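Back on the FTS5 report (sqlite-utils #510) above: a minimal sketch of the same operation through the Python API, where the FTS version can be pinned explicitly. The database, table and column names are the issue's placeholders.

```python
import sqlite_utils

db = sqlite_utils.Database("my.db")
# Ask for FTS5 explicitly rather than relying on auto-detection.
db["table_name"].enable_fts(["column_name"], fts_version="FTS5")
# The shadow table's schema should now read "USING FTS5".
print(db["table_name_fts"].schema)
```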
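The mypy failures in sqlite-utils #512 above all share one pattern. A minimal sketch (the function name is illustrative, not the exact `sqlite_utils/db.py` code):

```python
from typing import Optional

# Implicit Optional - PEP 484 prohibits it, and mypy now defaults to
# no_implicit_optional=True, so this form is reported as an error:
# def count_where(where: str = None) -> int: ...

# Explicit Optional passes the check:
def count_where(where: Optional[str] = None) -> int:
    return 0
```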
Here is a screenshot of the current project settings: Here is a screenshot of the latest deployment status: This is my repository: https://github.com/mroswell/list-N (I notice: datasette==0.59 in my requirements.txt file) Because it's been a long while since I actively worked on this or any other datasette project, I forget a lot of what I knew at one point. Perhaps some configuration file could be missing? Or perhaps I just need to know the right incantation to add to that vercel settings page. Help is welcome as the nonprofit org is soon hosting its annual conference, and we'd love to have the page working again. ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1872/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1426253476,I_kwDOBm6k_c5VAuak,1869,Release 0.63,9599,simonw,closed,0,,,,,3,2022-10-27T20:53:01Z,2022-10-27T22:24:38Z,2022-10-27T22:11:33Z,OWNER,,"Most of the release notes are already written: - https://github.com/simonw/datasette/releases/tag/0.63a0 - https://github.com/simonw/datasette/releases/tag/0.63a1",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1869/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1400121355,PR_kwDOBm6k_c5AVujU,1835,use inspect data for hash and file size,536941,fgregg,closed,0,,,,,3,2022-10-06T18:25:24Z,2022-10-27T20:51:30Z,2022-10-06T20:06:07Z,CONTRIBUTOR,simonw/datasette/pulls/1835,"`inspect_data` should already include the hash and the db file size, so this PR takes advantage of using those instead of always recalculating. should help a lot on startup with large DBs. closes #1834 ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1835/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1386562662,I_kwDOCGYnMM5SpURm,493,Tiny typographical error in install/uninstall docs,9599,simonw,open,0,,,,,3,2022-09-26T19:00:42Z,2022-10-25T21:31:15Z,,OWNER,,"Added in: - #483 I don't know how to fix this in Sphinx: I'm getting this: https://sqlite-utils.datasette.io/en/latest/cli.html#cli-install > The [insert –convert](https://sqlite-utils.datasette.io/en/latest/cli.html#cli-insert-convert) and [query –functions](https://sqlite-utils.datasette.io/en/latest/cli.html#cli-query-functions) options But I want it to display `insert --convert` and not `insert –convert` there. Here's the code: https://github.com/simonw/sqlite-utils/blob/85247038f70d7eb2f3e272cfeaa4c44459cafba8/docs/cli.rst#L2125",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/493/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1392690202,I_kwDOCGYnMM5TAsQa,495,Support JSON values returned from .convert() functions,649467,mhalle,closed,0,,,,,3,2022-09-30T16:33:49Z,2022-10-25T21:23:37Z,2022-10-25T21:23:28Z,NONE,,"When using the convert function on a JSON column, the result of the conversion function must be a string. If the return value is either a dict (object) or a list (array), the convert call will error out with an unhelpful user defined function exception. 
It makes sense that since the original column value was a string and required conversion to data structures, the result should be converted back into a JSON string as well. However, other functions auto-convert to JSON string representation, so the fact that convert doesn't could be surprising. At least the documentation should note this requirement, because the sqlite error messages won't readily reveal the issue. If only sqlite's JSON column type meant something :)",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/495/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1422973111,I_kwDOBm6k_c5U0Ni3,1854,Flaky test: test_serve_localhost_http,9599,simonw,closed,0,,,,,3,2022-10-25T19:37:35Z,2022-10-25T19:53:02Z,2022-10-25T19:53:02Z,OWNER,,Failing on Python 3.10 at the moment: https://github.com/simonw/datasette/actions/runs/3323629947/jobs/5494340302,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1854/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1420090659,I_kwDOBm6k_c5UpN0j,1848,Private database page should show padlock on every table,9599,simonw,closed,0,,,,,3,2022-10-24T02:28:38Z,2022-10-24T02:50:29Z,2022-10-24T02:42:34Z,OWNER,,"Following: - #1829 https://latest.datasette.io/_internal looks like this: But those queries and tables are private too, and should also show the padlock icon.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1848/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1408561039,PR_kwDOBm6k_c5Axrpb,1842,check_visibility can now take multiple permissions into account,9599,simonw,closed,0,,,,,3,2022-10-14T00:06:04Z,2022-10-24T02:11:36Z,2022-10-24T02:11:36Z,OWNER,simonw/datasette/pulls/1842,"Refs #1829 - [x] Fix table page - [x] Fix database page - [x] Fix query page - [x] Fix row page - [x] Tests - [x] Updated documentation for `check_visibility` method, to cover the new `permissions=` keyword argument Also this fix is currently only applied on the table page - needs to be applied on database, row and query pages too. 
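Based on the documentation notes in PR #1842 above, a sketch of the new `permissions=` keyword argument; the database and table names are placeholders, and `datasette` / `request` are the usual objects available to plugin hooks and views:

```python
async def table_is_visible(datasette, request):
    # Returns (visible, private); raises nothing itself - it just reports
    # whether the actor can see the table and whether that view is private.
    visible, private = await datasette.check_visibility(
        request.actor,
        permissions=[
            ("view-table", ("mydb", "mytable")),
            ("view-database", "mydb"),
            "view-instance",
        ],
    )
    return visible
```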
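On the `insert –convert` rendering problem in sqlite-utils #493 above: the en dash is produced by docutils' SmartQuotes transform, which Sphinx enables by default. A hedged `conf.py` sketch of two possible fixes (untested against that repository):

```python
# Option 1: disable smart punctuation entirely.
smartquotes = False

# Option 2: keep educated quotes and ellipses but skip dash conversion,
# so a literal -- in body text survives ("D" is the dashes action).
# smartquotes_action = "qe"
```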
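And for the `.convert()` issue in sqlite-utils #495 above, a minimal sketch of the workaround the report implies: serialize dict/list results back to a JSON string inside the conversion function. The `mytable` / `tags` names are placeholders.

```python
import json

import sqlite_utils

db = sqlite_utils.Database("my.db")
# The conversion function receives the stored string and must return a
# value SQLite can store, so round-trip through json.dumps() yourself.
db["mytable"].convert(
    "tags",
    lambda value: json.dumps(sorted(json.loads(value))),
)
```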
---- :books: Documentation preview :books:: https://datasette--1842.org.readthedocs.build/en/1842/ ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1842/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1410305897,I_kwDOBm6k_c5UD49p,1845,Reconsider the Datasette first-run experience,9599,simonw,open,0,,,,,3,2022-10-15T22:21:31Z,2022-10-16T08:54:53Z,,OWNER,,"Had a really interesting conversation today about how hard it is to get from ""I installed Datasette"" to ""I've done something useful with it"": https://news.ycombinator.com/item?id=33216789#33218590 Spending some time focusing on that first-run experience feels very worthwhile.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1845/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1157182254,I_kwDOBm6k_c5E-TMu,1646,Configuration directory mode does not pick up other file extensions than .db,15640196,dnsos,closed,0,,,,,3,2022-03-02T13:15:23Z,2022-10-07T23:06:17Z,2022-10-07T23:03:35Z,NONE,,"Hello, I've been trying to run Datasette with the [configuration directory mode](https://docs.datasette.io/en/stable/settings.html#configuration-directory-mode) with a structure such as this one: ```plain some-directory/ example.sqlite3 another-example.db one-more.custom [...] ``` (In my scenario I can't just change the filename extension without other problems arising) Now databases with the `.sqlite3` or the custom filename extension are ignored by Datasette in this case. I'm aware that the docs state that a `.db` extension is required, but I was wondering if there is a reason for restricting this or any workaround available? When I run `datasette example.sqlite3` or `datasette one-more.custom` the databases are served by Datasette without a problem. ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1646/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1400494162,PR_kwDOBm6k_c5AW_kl,1838,Open Datasette link in new tab,4399499,ocdtrekkie,closed,0,,,,,3,2022-10-07T01:12:20Z,2022-10-07T16:28:41Z,2022-10-07T02:01:07Z,NONE,simonw/datasette/pulls/1838,"This is technically a Sandstorm-specific fix (as external links do not work inside the grain frame), however, I think it is an improvement to the upstream project, so I wanted to propose it here rather than patching it in our package. There's much opinions on the Internet about whether external links should open in a new tab by default or not, but I'd argue very few people who might click a ""powered by"" link intend to complete their interaction with the source page (a Datasette). And furthermore, users may be working within various queries or loading visualizations (navigating away when trying to plot a million GPS coordinates pretty much just resets your progress!), so linking away within the tab might be a frustrating or destructive act to one's work, even inadvertently. 
original report: https://github.com/ocdtrekkie/datasette-sandstorm/issues/1 ---- :books: Documentation preview :books:: https://datasette--1838.org.readthedocs.build/en/1838/ ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1838/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1386917344,PR_kwDOBm6k_c4_prjN,1823,Keyword-only arguments for a bunch of internal methods,9599,simonw,open,0,,,,,3,2022-09-27T00:44:59Z,2022-10-05T04:37:54Z,,OWNER,simonw/datasette/pulls/1823,"Refs #1822 ---- :books: Documentation preview :books:: https://datasette--1823.org.readthedocs.build/en/1823/ ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1823/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1149661489,I_kwDOCGYnMM5EhnEx,409,`with db:` for transactions,9599,simonw,open,0,,,,,3,2022-02-24T19:22:06Z,2022-10-01T03:42:50Z,,OWNER,,This can be a documented wrapper around `with db.conn:`.,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/409/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1388631785,I_kwDOBm6k_c5SxNbp,1826,render_cell documentation example doesn't match the method signature,66709385,pjamargh,closed,0,,,,,3,2022-09-28T02:37:59Z,2022-09-28T04:30:28Z,2022-09-28T04:05:16Z,NONE,,"Open Datasette stable doc at https://docs.datasette.io/en/stable/plugin_hooks.html?highlight=render_cell#render-cell-row-value-column-table-database-datasette render_cell plugin hook method signature is `render_cell(row, value, column, table, database, datasette)`, the example shown inline uses `render_cell(value)`. ![image](https://user-images.githubusercontent.com/66709385/192674691-34265b81-6cdd-41d2-8424-aa12f8bc8c94.png) ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1826/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1386854246,I_kwDOBm6k_c5Sqbdm,1822,Switch to keyword-only arguments for a bunch of internal methods,9599,simonw,open,0,,,3268330,Datasette 1.0,3,2022-09-26T23:20:38Z,2022-09-27T00:44:04Z,,OWNER,,"This is a good idea, and one that needs to happen before Datasette 1.0: > While you are adding features, would you be future-proofing your APIs if you switched over some arguments over to keyword-only arguments or would that be too disruptive? > > Thinking out loud: > > ``` > async def render_template( > self, templates, *, context=None, plugin_context=None, request=None, view_name=None > ): > ``` _Originally posted by @jefftriplett in https://github.com/simonw/datasette/issues/1817#issuecomment-1256781274_ ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1822/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1385026210,I_kwDOBm6k_c5SjdKi,1819,Preserve query on timeout,2182,danp,closed,0,,,,,3,2022-09-25T13:32:31Z,2022-09-26T23:16:15Z,2022-09-26T23:06:06Z,CONTRIBUTOR,,"If a query hits the timeout it shows a message like: > SQL query took too long. 
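For the `with db:` proposal in sqlite-utils #409 above, a minimal sketch of the behaviour it would wrap: `sqlite3`'s connection context manager commits on success and rolls back if the block raises. The table and rows are illustrative.

```python
import sqlite_utils

db = sqlite_utils.Database("my.db")
with db.conn:
    # Both writes commit together, or neither does.
    db["counts"].insert({"id": 1, "n": 0}, pk="id", replace=True)
    db["counts"].insert({"id": 2, "n": 0}, pk="id", replace=True)
```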
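And for #1826 above, a sketch of a hook written against the full documented signature rather than the single-argument form shown inline in the docs; the column check and star rendering are illustrative only:

```python
from datasette import hookimpl


@hookimpl
def render_cell(row, value, column, table, database, datasette):
    if column == "stars" and value is not None:
        return "★" * int(value)
    return None  # None falls back to Datasette's default rendering
```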
The time limit is controlled by the [sql_time_limit_ms](https://docs.datasette.io/en/stable/settings.html#sql-time-limit-ms) configuration option. But the query is lost. Hitting the browser back button shows the query _before_ the one that errored. It would be nice if the query that errored was preserved for more tweaking. This would make it similar to how ""invalid syntax"" works since #1346 / #619.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1819/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1375930971,PR_kwDOBm6k_c4_GVBS,1812,Bump furo from 2022.6.21 to 2022.9.15,49699333,dependabot[bot],closed,0,,,,,3,2022-09-16T13:10:45Z,2022-09-16T19:50:53Z,2022-09-16T19:50:52Z,CONTRIBUTOR,simonw/datasette/pulls/1812,"Bumps [furo](https://github.com/pradyunsg/furo) from 2022.6.21 to 2022.9.15.
Changelog

Sourced from furo's changelog.

Changelog

2022.09.15 -- Pragmatic Pistachio

  • Add a minimum version constraint on pygments.
  • Add an explicit dependency on sass.
  • Change right sidebar title from "Contents" to "On this page".
  • Correctly position sidebars on small screens.
  • Correctly select only Furo's own svg in related pages nav.
  • Make numpy-style documentation headers consistent.
  • Retitle the reference section.
  • Update npm dependencies.

2022.06.21 -- Opulent Opal

  • Fix docutils <= 0.17.x compatibility.
  • Bump to the latest Node.js LTS.

2022.06.04.1 -- Naughty Nickel bugfix

  • Fix the URL used in the "Edit this page" for Read the Docs builds.

2022.06.04 -- Naughty Nickel

  • ✨ Advertise Sphinx 5 compatibility.
  • ✨ Change to basic-ng as the base theme (from {pypi}sphinx-basic-ng).
  • Document site-wide announcement banners.
  • Drop the pin on pygments.
  • Improve edit button, using basic-ng's edit-this-page component.
  • Tweak headings to better match what users expect.
  • Tweak how Sphinx's default HTML is rendered, using docutils post-transforms (this replaces parsing+modifying it with BeautifulSoup).
  • When built with docutils 0.18, footnotes are rendered differently and stylised differently in Furo.

2022.04.07 -- Magical Mauve

  • ✨ Make sphinx-copybutton look better.
  • Add margin to indentations in line blocks.
  • Add styling for non-arabic list styles
  • Add support for html_baseurl.

... (truncated)

Commits
  • 08e6b38 Prepare release: 2022.09.15
  • 9de7613 Update changelog
  • a064929 Tweak changelog content style
  • 46f4adc Revert "Add initial theme.conf content for eventual ablog support"
  • 45b839b Set a minimum constraint on pygments
  • a4af988 [pre-commit.ci] pre-commit autoupdate (#518)
  • a72186f [pre-commit.ci] pre-commit autoupdate (#504)
  • 9f41ee6 Add initial theme.conf content for eventual ablog support
  • 75e0361 Make numpy-style documentation headers consistent
  • 9d280e6 [pre-commit.ci] pre-commit autoupdate (#487)
  • Additional commits viewable in compare view

[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=furo&package-manager=pip&previous-version=2022.6.21&new-version=2022.9.15)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores) Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`. [//]: # (dependabot-automerge-start) [//]: # (dependabot-automerge-end) ---
Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR: - `@dependabot rebase` will rebase this PR - `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it - `@dependabot merge` will merge this PR after your CI passes on it - `@dependabot squash and merge` will squash and merge this PR after your CI passes on it - `@dependabot cancel merge` will cancel a previously requested merge and block automerging - `@dependabot reopen` will reopen this PR if it is closed - `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually - `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
---- :books: Documentation preview :books:: https://datasette--1812.org.readthedocs.build/en/1812/ ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1812/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1246826792,I_kwDODLZ_YM5KUREo,10,"When running `auth` command, don't overwrite an existing auth.json file",11887,ashanan,closed,0,,,,,3,2022-05-24T16:42:20Z,2022-09-07T15:07:38Z,2022-08-22T16:17:19Z,NONE,,"Ran the `auth` command in the same directory I'd previously set up an auth.json file for `twitter-to-sqlite` and it was completely overwritten. Not the biggest issue, but still unexpected. Ideally, for me, the keys would just be added to the existing file, but getting a warning and a chance to back out would be a good solution as well.",213286752,pocket-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/pocket-to-sqlite/issues/10/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1362363685,I_kwDOBm6k_c5RNAUl,1800,Remove upper bound dependencies as a default policy,9599,simonw,closed,0,,,,,3,2022-09-05T18:23:45Z,2022-09-05T18:39:52Z,2022-09-05T18:35:41Z,OWNER,,"https://iscinumpy.dev/post/bound-version-constraints/ has convinced me not to use upper bound dependencies unless I'm certain they are needed. Relevant PR: - https://github.com/simonw/datasette/pull/1799 Also: https://github.com/simonw/datasette/blob/ba35105eee2d3ba620e4f230028a02b2e2571df2/setup.py#L45-L46 https://github.com/simonw/datasette/blob/ba35105eee2d3ba620e4f230028a02b2e2571df2/setup.py#L48-L49 https://github.com/simonw/datasette/blob/ba35105eee2d3ba620e4f230028a02b2e2571df2/setup.py#L51-L55 https://github.com/simonw/datasette/blob/ba35105eee2d3ba620e4f230028a02b2e2571df2/setup.py#L57-L59 https://github.com/simonw/datasette/blob/ba35105eee2d3ba620e4f230028a02b2e2571df2/setup.py#L75-L78 https://github.com/simonw/datasette/blob/ba35105eee2d3ba620e4f230028a02b2e2571df2/setup.py#L81-L82 ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1800/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1180778860,PR_kwDOBm6k_c41BFWj,1685,"Update jinja2 requirement from <3.1.0,>=2.10.3 to >=2.10.3,<3.2.0",49699333,dependabot[bot],closed,0,,,,,3,2022-03-25T13:12:13Z,2022-09-05T18:36:49Z,2022-09-05T18:36:48Z,CONTRIBUTOR,simonw/datasette/pulls/1685,"Updates the requirements on [jinja2](https://github.com/pallets/jinja) to permit the latest version.
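Returning to the `pocket-to-sqlite` auth issue above, a hedged sketch of the merge behaviour it asks for; the key names here are assumptions, not necessarily the tool's exact ones:

```python
import json
import pathlib

path = pathlib.Path("auth.json")
# Merge new credentials into any existing file instead of replacing it.
existing = json.loads(path.read_text()) if path.exists() else {}
existing.update(
    {"pocket_consumer_key": "...", "pocket_access_token": "..."}
)
path.write_text(json.dumps(existing, indent=4))
```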
Release notes

Sourced from jinja2's releases.

3.1.0

This is a feature release, which includes new features and removes previously deprecated features. The 3.1.x branch is now the supported bugfix branch, the 3.0.x branch has become a tag marking the end of support for that branch. We encourage everyone to upgrade, and to use a tool such as pip-tools to pin all dependencies and control upgrades. We also encourage upgrading to MarkupSafe 2.1.1, the latest version at this time.

Changelog

Sourced from jinja2's changelog.

Version 3.1.0

Released 2022-03-24

  • Drop support for Python 3.6. :pr:1534

  • Remove previously deprecated code. :pr:1544

    • WithExtension and AutoEscapeExtension are built-in now.
    • contextfilter and contextfunction are replaced by pass_context. evalcontextfilter and evalcontextfunction are replaced by pass_eval_context. environmentfilter and environmentfunction are replaced by pass_environment.
    • Markup and escape should be imported from MarkupSafe.
    • Compiled templates from very old Jinja versions may need to be recompiled.
    • Legacy resolve mode for Context subclasses is no longer supported. Override resolve_or_missing instead of resolve.
    • unicode_urlencode is renamed to url_quote.
  • Add support for native types in macros. :issue:1510

  • The {% trans %} tag can use pgettext and npgettext by passing a context string as the first token in the tag, like {% trans "title" %}. :issue:1430

  • Update valid identifier characters from Python 3.6 to 3.7. :pr:1571

  • Filters and tests decorated with @async_variant are pickleable. :pr:1612

  • Add items filter. :issue:1561

  • Subscriptions ([0], etc.) can be used after filters, tests, and calls when the environment is in async mode. :issue:1573

  • The groupby filter is case-insensitive by default, matching other comparison filters. Added the case_sensitive parameter to control this. :issue:1463

  • Windows drive-relative path segments in template names will not result in FileSystemLoader and PackageLoader loading from drive-relative paths. :pr:1621

Version 3.0.3

Released 2021-11-09

  • Fix traceback rewriting internals for Python 3.10 and 3.11. :issue:1535
  • Fix how the native environment treats leading and trailing spaces when parsing values on Python 3.10. :pr:1537

... (truncated)

Commits
  • 84c0e2c Merge pull request #1625 from pallets/release-3.1.0
  • 7b0c47f release version 3.1.0
  • ede0f98 Merge pull request #1621 from pallets/template-safe-path
  • 040088a use posixpath.join when loading template names
  • a292075 Merge pull request #1620 from janfilips/patch-1
  • 6e4df02 Fix formatting in tricks.rst
  • 3a050b1 Merge pull request #1617 from pallets/docs-prose
  • 4b63cd8 rewrite include statement section
  • a98d482 clean up faq, move technical discussions
  • 9de99f8 clean up engine comparisons
  • Additional commits viewable in compare view

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`. [//]: # (dependabot-automerge-start) [//]: # (dependabot-automerge-end) ---
Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR: - `@dependabot rebase` will rebase this PR - `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it - `@dependabot merge` will merge this PR after your CI passes on it - `@dependabot squash and merge` will squash and merge this PR after your CI passes on it - `@dependabot cancel merge` will cancel a previously requested merge and block automerging - `@dependabot reopen` will reopen this PR if it is closed - `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually - `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1685/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1353074021,I_kwDOCGYnMM5QpkVl,474,Add an option for specifying column names when inserting CSV data,14294,hubgit,open,0,,,,,3,2022-08-27T15:29:59Z,2022-08-31T03:42:36Z,,NONE,,"https://sqlite-utils.datasette.io/en/stable/cli.html#csv-files-without-a-header-row > The first row of any CSV or TSV file is expected to contain the names of the columns in that file. > If your file does not include this row, you can use the `--no-headers` option to specify that the tool should not use that fist row as headers. > If you do this, the table will be created with column names called `untitled_1` and `untitled_2` and so on. You can then rename them using the `sqlite-utils transform ... --rename` command. It would be nice to be able to specify the column names when importing CSV/TSV without a header row, via an extra command line option. (renaming a column of a large table can take a long time, which makes it an inconvenient workaround)",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/474/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1309542173,PR_kwDOCGYnMM47pwAb,455,"in extract code, check equality with IS instead of = for nulls",536941,fgregg,closed,0,,,,,3,2022-07-19T13:40:25Z,2022-08-27T14:45:03Z,2022-08-27T14:45:03Z,CONTRIBUTOR,simonw/sqlite-utils/pulls/455,"sqlite ""IS"" is equivalent to SQL ""IS NOT DISTINCT FROM"" closes #423",140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/455/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1348394901,PR_kwDOBm6k_c49qmC2,1792,Test `--load-extension` in GitHub Actions,9599,simonw,closed,0,,,,,3,2022-08-23T18:43:29Z,2022-08-24T00:11:46Z,2022-08-24T00:11:45Z,OWNER,simonw/datasette/pulls/1792,"Refs: - #1789 ---- :books: Documentation preview :books:: https://datasette--1792.org.readthedocs.build/en/1792/ ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1792/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1345452427,I_kwDODLZ_YM5QMfmL,11,"-a option is used for ""--auth"" and for ""--all""",2467,fernand0,closed,0,,,,,3,2022-08-21T10:50:48Z,2022-08-21T21:11:57Z,2022-08-21T21:11:57Z,NONE,,"I'm not sure which option is best, instead of -a -all.",213286752,pocket-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/pocket-to-sqlite/issues/11/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 838245338,MDU6SXNzdWU4MzgyNDUzMzg=,1272,Unit tests for the Dockerfile,9599,simonw,open,0,,,,,3,2021-03-23T01:36:29Z,2022-07-29T10:22:59Z,,OWNER,,"Working on the Dockerfile in #1249 made me wish for automated tests - to confirm that it boots up correctly, can run SpatiaLite and doesn't have weird bugs like the `/db` page hanging. 
These could run in CI too, but maybe only if the `Dockerfile` is updated.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1272/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1184850337,PR_kwDOBm6k_c41OrSL,1693,Bump black from 22.1.0 to 22.3.0,49699333,dependabot[bot],closed,0,,,,,3,2022-03-29T13:11:09Z,2022-06-28T13:11:38Z,2022-06-28T13:11:36Z,CONTRIBUTOR,simonw/datasette/pulls/1693,"Bumps [black](https://github.com/psf/black) from 22.1.0 to 22.3.0.
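On the sqlite-utils CSV issue (#474) above, a sketch of the existing workaround via the Python API: insert a headerless CSV under placeholder column names, then rename with `transform()`. The file, table and column names are illustrative.

```python
import csv

import sqlite_utils

db = sqlite_utils.Database("my.db")
with open("data.csv", newline="") as f:
    rows = ({"untitled_1": a, "untitled_2": b} for a, b in csv.reader(f))
    db["rows"].insert_all(rows)
# Renaming via transform() copies the whole table, which is the slow step
# the issue wants to avoid by naming columns at insert time instead.
db["rows"].transform(rename={"untitled_1": "name", "untitled_2": "score"})
```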
Release notes

Sourced from black's releases.

22.3.0

Preview style

  • Code cell separators #%% are now standardised to # %% (#2919)
  • Remove unnecessary parentheses from except statements (#2939)
  • Remove unnecessary parentheses from tuple unpacking in for loops (#2945)
  • Avoid magic-trailing-comma in single-element subscripts (#2942)

Configuration

  • Do not format __pypackages__ directories by default (#2836)
  • Add support for specifying stable version with --required-version (#2832).
  • Avoid crashing when the user has no homedir (#2814)
  • Avoid crashing when md5 is not available (#2905)
  • Fix handling of directory junctions on Windows (#2904)

Documentation

  • Update pylint config documentation (#2931)

Integrations

  • Move test to disable plugin in Vim/Neovim, which speeds up loading (#2896)

Output

  • In verbose mode, log when Black is using user-level config (#2861)

Packaging

  • Fix Black to work with Click 8.1.0 (#2966)
  • On Python 3.11 and newer, use the standard library's tomllib instead of tomli (#2903)
  • black-primer, the deprecated internal devtool, has been removed and copied to a separate repository (#2924)

Parser

  • Black can now parse starred expressions in the target of for and async for statements, e.g. for item in *items_1, *items_2: pass (#2879).
Changelog

Sourced from black's changelog.

22.3.0

Preview style

  • Code cell separators #%% are now standardised to # %% (#2919)
  • Remove unnecessary parentheses from except statements (#2939)
  • Remove unnecessary parentheses from tuple unpacking in for loops (#2945)
  • Avoid magic-trailing-comma in single-element subscripts (#2942)

Configuration

  • Do not format __pypackages__ directories by default (#2836)
  • Add support for specifying stable version with --required-version (#2832).
  • Avoid crashing when the user has no homedir (#2814)
  • Avoid crashing when md5 is not available (#2905)
  • Fix handling of directory junctions on Windows (#2904)

Documentation

  • Update pylint config documentation (#2931)

Integrations

  • Move test to disable plugin in Vim/Neovim, which speeds up loading (#2896)

Output

  • In verbose mode, log when Black is using user-level config (#2861)

Packaging

  • Fix Black to work with Click 8.1.0 (#2966)
  • On Python 3.11 and newer, use the standard library's tomllib instead of tomli (#2903)
  • black-primer, the deprecated internal devtool, has been removed and copied to a separate repository (#2924)

Parser

  • Black can now parse starred expressions in the target of for and async for statements, e.g. for item in *items_1, *items_2: pass (#2879).
Commits

[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=black&package-manager=pip&previous-version=22.1.0&new-version=22.3.0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores) Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`. [//]: # (dependabot-automerge-start) [//]: # (dependabot-automerge-end) ---
Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR: - `@dependabot rebase` will rebase this PR - `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it - `@dependabot merge` will merge this PR after your CI passes on it - `@dependabot squash and merge` will squash and merge this PR after your CI passes on it - `@dependabot cancel merge` will cancel a previously requested merge and block automerging - `@dependabot reopen` will reopen this PR if it is closed - `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually - `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1693/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1227571375,I_kwDOCGYnMM5JK0Cv,431,Allow making m2m relation of a table to itself,738408,rafguns,open,0,,,,,3,2022-05-06T08:30:43Z,2022-06-23T14:12:51Z,,NONE,,"I am building a database, in which one of the tables has a many-to-many relationship to itself. As far as I can see, this is not (yet) possible using `.m2m()` in sqlite-utils. This may be a bit of a niche use case, so feel free to close this issue if you feel it would introduce too much complexity compared to the benefits. Example: suppose I have a table of people, and I want to store the information that John and Mary have two children, Michael and Suzy. It would be neat if I could do something like this: ```python from sqlite_utils import Database db = Database(memory=True) db[""people""].insert({""name"": ""John""}, pk=""name"").m2m( ""people"", [{""name"": ""Michael""}, {""name"": ""Suzy""}], m2m_table=""parent_child"", pk=""name"" ) db[""people""].insert({""name"": ""Mary""}, pk=""name"").m2m( ""people"", [{""name"": ""Michael""}, {""name"": ""Suzy""}], m2m_table=""parent_child"", pk=""name"" ) ``` But if I do that, the many-to-many table `parent_child` has only one column: ``` CREATE TABLE [parent_child] ( [people_id] TEXT REFERENCES [people]([name]), PRIMARY KEY ([people_id], [people_id]) ) ``` This could be solved by adding one or two keyword_arguments to `.m2m()`, e.g. `.m2m(..., left_name=None, right_name=None)` or `.m2m(..., names=(None, None))`.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/431/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1278571700,I_kwDOCGYnMM5MNXS0,447,Incorrect syntax highlighting in docs CLI reference,9599,simonw,closed,0,,,,,3,2022-06-21T14:53:10Z,2022-06-21T18:48:47Z,2022-06-21T18:48:46Z,OWNER,,"https://sqlite-utils.datasette.io/en/stable/cli-reference.html#insert ![CE020DDA-27FB-49C3-9EA6-37457DC4C321](https://user-images.githubusercontent.com/9599/174830380-06530537-b870-41c0-a8af-03c7fa720c6f.jpeg) It looks like Python keywords are being incorrectly highlighted here.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/447/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1277295119,I_kwDOCGYnMM5MIfoP,445,`sqlite_utils.utils.TypeTracker` should be a documented API,9599,simonw,closed,0,,,,,3,2022-06-20T19:08:28Z,2022-06-20T19:49:02Z,2022-06-20T19:46:58Z,OWNER,,"I've used it in a couple of external places now: - https://github.com/simonw/datasette-socrata/blob/32fb256a461bf0e790eca10bdc7dd9d96c20f7c4/datasette_socrata/__init__.py#L264-L280 - https://github.com/simonw/datasette-lite/blob/caa8eade10f0321c64f9f65c4561186f02d57c5b/webworker.js#L55-L64 Refs: - https://github.com/simonw/datasette-lite/issues/32",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/445/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1275523220,PR_kwDOBm6k_c454SlE,1759,Extract facet portions of table.html out into included 
templates,19872,nsmgr8,closed,0,,,,,3,2022-06-17T22:04:04Z,2022-06-20T18:05:45Z,2022-06-20T18:05:45Z,CONTRIBUTOR,simonw/datasette/pulls/1759,"To allow users to customise the facet content as they would prefer, such as sorting of facet results, ordering of suggested facets, etc.",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1759/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1243151184,I_kwDOCGYnMM5KGPtQ,434,`detect_fts()` identifies the wrong table if tables have names that are subsets of each other,559711,ryascott,closed,0,,,,,3,2022-05-20T13:28:31Z,2022-06-14T23:24:09Z,2022-06-14T23:24:09Z,NONE,,"Windows 10 Python 3.9.6 When I was running a full text search through the Python library, I noticed that the query was being run on a different full text search table than the one I was trying to search. I took a look at the following function https://github.com/simonw/sqlite-utils/blob/841ad44bacaff05ec79ef78166d12e80c82ba6d7/sqlite_utils/db.py#L2213 and noticed: ```python sql LIKE '%VIRTUAL TABLE%USING FTS%content=%{table}%' ``` My database contains tables with similar names and %{table}% was matching another table that ended differently in its name. I have included a sample test that shows this occurring: I search for Marsupials in db[""books""] and The Clue of the Broken Blade is returned. This occurs since the search for Marsupials was ""successfully"" done against db[""booksb""] and rowid 1 is returned. ""The Clue of the Broken Blade"" has a rowid of 1 in db[""books""] and this is what is returned from the search. ```python def test_fts_search_with_similar_table_names(fresh_db): db = Database(memory=True) db[""books""].insert_all( [ { ""title"": ""The Clue of the Broken Blade"", ""author"": ""Franklin W. Dixon"", }, { ""title"": ""Habits of Australian Marsupials"", ""author"": ""Marlee Hawkins"", }, ] ) db[""booksb""].insert( { ""title"": ""Habits of Australian Marsupials"", ""author"": ""Marlee Hawkins"", } ) db[""booksb""].enable_fts([""title"", ""author""]) db[""books""].enable_fts([""title"", ""author""]) query = ""Marsupials"" assert [ { ""rowid"": 1, ""title"": ""Habits of Australian Marsupials"", ""author"": ""Marlee Hawkins"", }, ] == list(db[""books""].search(query)) ``` ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/434/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1060631257,I_kwDOBm6k_c4_N_LZ,1528,"Add new `""sql_file""` key to Canned Queries in metadata?",15178711,asg017,open,0,,,,,3,2021-11-22T21:58:01Z,2022-06-10T03:23:08Z,,CONTRIBUTOR,,"Currently for canned queries, you have to inline SQL in your `metadata.yaml` like so: ```yaml databases: fixtures: queries: neighborhood_search: sql: |- select neighborhood, facet_cities.name, state from facetable join facet_cities on facetable.city_id = facet_cities.id where neighborhood like '%' || :text || '%' order by neighborhood title: Search neighborhoods ``` This works fine, but for a few reasons, I usually have my canned queries already written in separate `.sql` files. I'd like to re-use those instead of re-writing them. 
So, I'd like to see a new `""sql_file""` key that works like so: `metadata.yaml`: ```yaml databases: fixtures: queries: neighborhood_search: sql_file: neighborhood_search.sql title: Search neighborhoods ``` `neighborhood_search.sql`: ```sql select neighborhood, facet_cities.name, state from facetable join facet_cities on facetable.city_id = facet_cities.id where neighborhood like '%' || :text || '%' order by neighborhood ``` Both of these would work in the exact same way, where Datasette would instead open + include `neighborhood_search.sql` on startup. A few reasons why I'd like to keep my canned queries SQL separate from metadata.yaml: - Keeping SQL in standalone SQL files means syntax highlighting and other text editor integrations in my code - Multiline strings in yaml, while functional, are a tad cumbersome and are hard to edit - Works well with other tools (can pipe `.sql` files into the `sqlite3` CLI, or use with other SQLite clients easier) - Typically my canned queries are quite long compared to everything else in my metadata.yaml, so I'd love to separate it where possible Let me know if this is a feature you'd like to see, I can try to send up a PR if this sounds right!",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1528/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1223241647,I_kwDOBm6k_c5I6S-v,1734,Remove python-baseconv dependency,9599,simonw,closed,0,,,,,3,2022-05-02T19:08:37Z,2022-05-02T23:25:49Z,2022-05-02T19:39:20Z,OWNER,,"> I was going to vendor `baseconv.py`, but then I reconsidered - what if there are plugins out there that expect `import baseconv` to work because they have depended on Datasette? > > I used https://cs.github.com/ and as far as I can tell there aren't any! > > So I'm going to remove that dependency and work out a smarter way to do this - probably by providing a utility function within Datasette itself. _Originally posted by @simonw in https://github.com/simonw/datasette/issues/1733#issuecomment-1115258737_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1734/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1216619276,I_kwDOBm6k_c5IhCMM,1724,?_trace=1 doesn't work on Global Power Plants demo,9599,simonw,closed,0,,,,,3,2022-04-27T00:15:02Z,2022-04-27T06:15:14Z,2022-04-27T00:18:30Z,OWNER,,"https://global-power-plants.datasettes.com/global-power-plants/global-power-plants?_trace=1 is not showing the trace JSON at the bottom of the page. 
Confirmed that `trace_debug` is `true` on https://global-power-plants.datasettes.com/-/settings Possibly related: - https://github.com/simonw/datasette-total-page-time/issues/1",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1724/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 810507413,MDExOlB1bGxSZXF1ZXN0NTc1MTg3NDU3,1229,ensure immutable databases when starting in configuration directory mode with,295329,camallen,closed,0,,,,,3,2021-02-17T20:18:26Z,2022-04-22T13:16:36Z,2021-03-29T00:17:32Z,CONTRIBUTOR,simonw/datasette/pulls/1229,"fixes #1224 This PR ensures all databases found in a configuration directory that match the files in `inspect-data.json` will be set to `immutable` as outlined in https://docs.datasette.io/en/latest/settings.html#configuration-directory-mode specifically on building the `datasette` instance it checks: - if `immutables` is an empty tuple - as passed by the cli code - if `immutables` is the default function value `None` - when it's not explicitly set And correctly builds the immutable database list from the `inspect-data[file]` keys. Note for this to work the `inspect-data.json` file must contain `file` paths which are relative to the configuration directory otherwise the file paths won't match and the dbs won't be set to immutable. I couldn't find an easy way to test this due to the way `make_app_client` works, happy to take directions on adding a test for this. I've updated the relevant docs as well, i.e. use the `inspect` cli cmd from the config directory path to create the relevant file ``` cd $config_dir datasette inspect *.db --inspect-file=inspect-data.json ``` https://docs.datasette.io/en/latest/performance.html#using-datasette-inspect",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1229/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1202227104,I_kwDOBm6k_c5HqIeg,1712,"Make """" easier to read",9599,simonw,closed,0,,,,,3,2022-04-12T18:17:07Z,2022-04-12T19:12:22Z,2022-04-12T18:44:20Z,OWNER,,"`Binary: 2,427,344 bytes` would be nicer - even better, include a tooltip showing that size translated using this function: https://github.com/simonw/datasette/blob/138e4d9a53e3982137294ba383303c3a848cfca4/datasette/utils/__init__.py#L837-L846 ![CleanShot 2022-04-12 at 11 15 04@2x](https://user-images.githubusercontent.com/9599/163027324-b0b6092e-6e11-438b-8077-789025d0bb37.png) ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1712/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1181432624,I_kwDOBm6k_c5Gazsw,1688,[plugins][documentation] Is it possible to serve per-plugin static folders when writing one-off (single file) plugins?,9020979,hydrosquall,closed,0,,,,,3,2022-03-26T01:17:44Z,2022-03-27T01:01:14Z,2022-03-26T21:34:47Z,CONTRIBUTOR,,"I'm trying to make a small plugin that depends on static assets, by following the guide [here](https://docs.datasette.io/en/stable/writing_plugins.html#writing-one-off-plugins). I made a `plugins/` directory with `datasette_nteract_data_explorer.py`. I am trying to follow the example of `datasette_vega`, and serve static assets. 
I created a `statics/` directory within `plugins/` to serve my JS and CSS. https://github.com/simonw/datasette-vega/blob/00de059ab1ef77394ba9f9547abfacf966c479c4/datasette_vega/__init__.py#L13 Unfortunately, datasette doesn't seem to be able to find my assets. Input: ```bash datasette ~/Library/Safari/History.db --plugins-dir=plugins/ ``` ![Image 2022-03-25 at 9 18 17 PM](https://user-images.githubusercontent.com/9020979/160218979-a3ff474b-5255-4a76-85d1-6f90ab2e3b44.jpg) Output: ![Image 2022-03-25 at 9 11 00 PM](https://user-images.githubusercontent.com/9020979/160218733-ca5144cf-f23f-43d8-a8d3-e3a871e57f3a.jpg) I suspect this issue might go away if I move away from ""one-off"" plugin mode, but it's been a while since I created a new python package so I'm not sure how much work there is to go between ""one off"" and ""packaged for PyPI"". I'd like to try to avoid needing to repackage a new `tar.gz` file and or reinstall my library repeatedly when developing new python code. 1. Is there a way to serve a static assets when using the `plugins/` directory method instead of installing plugins as a new python package? 2. If not, is there a way I can work on developing a plugin without creating and repackaging tar.gz files after every change, or is that the recommended path? Thanks for your help! ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1688/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1175744654,I_kwDOCGYnMM5GFHCO,417,insert fails on JSONL with whitespace,9954,blaine,closed,0,,,,,3,2022-03-21T17:58:14Z,2022-03-25T21:19:06Z,2022-03-25T21:17:13Z,NONE,,"Any JSON that is newline-delimited and has whitespace (newlines) between the start of a JSON object and an attribute fails due to a parse error. e.g. given the valid JSONL: ```{ ""attribute"": ""value"" } { ""attribute"": ""value2"" } ``` I would expect that `sqlite-utils insert --nl my.db mytable file.jsonl` would properly import the data into `mytable`. However, the following error is thrown instead: `json.decoder.JSONDecodeError: Expecting property name enclosed in double quotes: line 2 column 1 (char 2)` It makes sense that since the file is intended to be newline separated, the thing being parsed is ""{"" (which obviously fails), however the default newline-separated output of `jq` isn't compact. Using `jq -c` avoids this problem, but the fix is unintuitive and undocumented. Proposed solutions: 1. Default to a ""loose"" newline-separated parse; this could be implemented internally as [the equivalent of] a `jq -c` filter ahead of the insert step. 2. Catch the JSONDecodeError (or pre-empt it in the case of a record === ""{\n"") and give the user a ""it looks like your json isn't _actually_ newline-delimited; try running it through `jq -c` instead"" error message. 
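For option 1, a minimal sketch of what a ""loose"" parse could look like (hypothetical helper, not part of sqlite-utils - it behaves like piping the input through `jq -c` first):

```python
import json

def loose_jsonl(fp):
    # Objects may span multiple lines and be separated by arbitrary
    # whitespace, so repeatedly raw_decode() one buffer instead of
    # parsing line by line. For brevity this reads the whole input.
    decoder = json.JSONDecoder()
    buffer = fp.read()
    pos = 0
    while pos < len(buffer):
        if buffer[pos].isspace():
            pos += 1
            continue
        obj, pos = decoder.raw_decode(buffer, pos)
        yield obj
```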
It might just have been too early in the morning when I was playing with this, but running pipes of data through sqlite-utils without the 'knack' of it led to some false starts.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/417/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1174302994,I_kwDOBm6k_c5F_nES,1667,Make route matched pattern groups more consistent,9599,simonw,closed,0,,,3268330,Datasette 1.0,3,2022-03-19T16:32:35Z,2022-03-19T20:37:42Z,2022-03-19T20:37:41Z,OWNER,,"> ... highlights how inconsistent the way the capturing works is. Especially `as_format` which can be `None` or `""""` or `.json` or `json` or not used at all in the case of `TableView`. https://github.com/simonw/datasette/blob/764738dfcb16cd98b0987d443f59d5baa9d3c332/tests/test_routes.py#L12-L36 _Originally posted by @simonw in https://github.com/simonw/datasette/issues/1666#issuecomment-1073039670_ Part of: - #1660",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1667/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1174162781,I_kwDOBm6k_c5F_E1d,1666,Refactor URL routing to enable testing,9599,simonw,closed,0,,,3268330,Datasette 1.0,3,2022-03-19T03:52:29Z,2022-03-19T16:32:03Z,2022-03-19T16:32:03Z,OWNER,,"I ran into some bugs earlier with URL routing - having more robust testing around this (especially since they are defined using regular expressions) would be really useful. - A utility function that resolves a path against a list of regexes and returns the match - Make the routes and regular expressions available from a private Datasette method - Add tests that exercise them Related: - #1660",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1666/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1082765654,I_kwDOBm6k_c5AibFW,1561,"add hash id to ""_memory"" url if hashed url mode is turned on and crossdb is also turned on",536941,fgregg,closed,0,,,,,3,2021-12-17T00:45:12Z,2022-03-19T04:45:40Z,2022-03-19T04:45:40Z,CONTRIBUTOR,,"If hashed_url mode is turned on and crossdb is also turned on, then queries to _memory should have a hash_id. One way that it could work is to have the _memory hash be a hash of all the individual databases. Otherwise, crossdb queries can get quite out of date if using aggressive caching. ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1561/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 531502365,MDU6SXNzdWU1MzE1MDIzNjU=,646,Make database level information from metadata.json available in the index.html template,18017473,lagolucas,open,0,,,3268330,Datasette 1.0,3,2019-12-02T19:55:10Z,2022-03-15T20:50:34Z,,NONE,,"Did a search on the issues here and didn't find anything related to what I want. I want to have information that is on the database level of the JSON like title, source and source_url, and use it on the index page. I tried some small tweaks on the python and html files, but failed to get that result. Is there a way? 
Thanks!",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/646/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1054243511,I_kwDOBm6k_c4-1nq3,1509,Datasette 1.0 JSON API (and documentation),9599,simonw,open,0,,,3268330,Datasette 1.0,3,2021-11-15T23:22:45Z,2022-03-15T20:38:56Z,,OWNER,,"The new JSON API in a stable, documented form.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1509/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1123903919,I_kwDOCGYnMM5C_Wmv,397,Support IF NOT EXISTS for table creation,738408,rafguns,closed,0,,,,,3,2022-02-04T07:41:15Z,2022-02-06T01:30:46Z,2022-02-06T01:29:01Z,NONE,,"Currently, I have a bunch of code that looks like this: ```python subjects = db[""subjects""] if db[""subjects""].exists() else db[""subjects""].create({ ... }) ``` It would be neat if sqlite-utils could simplify that by supporting `CREATE TABLE IF NOT EXISTS`, so that I'd be able to write, e.g. ```python subjects = db[""subjects""].create({...}, if_not_exists=True) ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/397/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 683805434,MDU6SXNzdWU2ODM4MDU0MzQ=,135,Code for finding SpatiaLite in the usual locations,9599,simonw,closed,0,,,,,3,2020-08-21T20:15:34Z,2022-02-05T00:04:26Z,2020-08-21T20:30:13Z,OWNER,,"I built this for `shapefile-to-sqlite` but it would be useful in `sqlite-utils` too: https://github.com/simonw/shapefile-to-sqlite/blob/e754d0747ca2facf9a7433e2d5d15a6a37a9cf6e/shapefile_to_sqlite/utils.py#L16-L19 ```python SPATIALITE_PATHS = ( ""/usr/lib/x86_64-linux-gnu/mod_spatialite.so"", ""/usr/local/lib/mod_spatialite.dylib"", ) ``` https://github.com/simonw/shapefile-to-sqlite/blob/e754d0747ca2facf9a7433e2d5d15a6a37a9cf6e/shapefile_to_sqlite/utils.py#L105-L109 ```python def find_spatialite(): for path in SPATIALITE_PATHS: if os.path.exists(path): return path return None ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/135/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 683812642,MDU6SXNzdWU2ODM4MTI2NDI=,136,--load-extension=spatialite shortcut option,9599,simonw,closed,0,,,,,3,2020-08-21T20:31:25Z,2022-02-05T00:04:26Z,2020-10-16T19:14:32Z,OWNER,,In conjunction with #135 - this would do the same thing as `--load-extension=path-to-spatialite` (see #134),140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/136/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 534507142,MDU6SXNzdWU1MzQ1MDcxNDI=,69,Feature request: enable extensions loading,30607,aborruso,closed,0,,,,,3,2019-12-08T08:06:25Z,2022-02-05T00:04:25Z,2020-10-16T18:42:49Z,NONE,,"Hi, it would be great to add a parameter that enables the load of a sqlite extension you need. Something like ""-ext modspatialite"". In this way your great tool would be even more comfortable and powerful. 
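For example, something like `sqlite-utils places.db 'select spatialite_version()' -ext mod_spatialite` (invocation purely illustrative - the flag doesn't exist yet).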
Thank you very much",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/69/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1075893249,I_kwDOBm6k_c5AINQB,1545,Custom pages don't work on Windows,559711,ryascott,closed,0,,,,,3,2021-12-09T18:53:05Z,2022-02-03T02:08:31Z,2022-02-03T01:58:35Z,NONE,,"It seems that custom pages don't work when put in templates/pages To reproduce on datasette version 0.59.4 using PowerShell on Windows 10 with Python 3.10.0 mkdir -p templates/pages echo ""hello world"" >> templates/pages/about.html Start datasette datasette --template-dir templates/ Navigate to [http://127.0.0.1:8001/about](url) and receive: Error 404: Database not found: about ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1545/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1120990806,PR_kwDOBm6k_c4x6zZ5,1617,"Ensure template_path always uses ""/"" to match jinja",3526913,cb160,closed,0,,,,,3,2022-02-01T17:20:30Z,2022-02-03T01:58:35Z,2022-02-03T01:58:35Z,CONTRIBUTOR,simonw/datasette/pulls/1617,"This PR should fix #1545 The existing code substituted / for \, assuming this was the right behaviour for Windows. But on Windows, Jinja still uses / for the template list - See https://github.com/pallets/jinja/blob/896a62135bcc151f2997e028c5125bec2cb2431f/src/jinja2/loaders.py#L225",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1617/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1122451096,PR_kwDOBm6k_c4x_mXy,1626,Try test suite against macOS and Windows,9599,simonw,open,0,,,,,3,2022-02-02T22:26:51Z,2022-02-03T01:22:44Z,,OWNER,simonw/datasette/pulls/1626,Refs #1625,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1626/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1121618041,I_kwDOBm6k_c5C2oh5,1620,"Link: rel=""alternate"" to JSON for queries too",9599,simonw,closed,0,,,3268330,Datasette 1.0,3,2022-02-02T08:02:42Z,2022-02-02T21:53:02Z,2022-02-02T21:33:00Z,OWNER,,"Following: - #1533 I implemented it for tables and rows but I should have done queries as well.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1620/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1108846067,I_kwDOBm6k_c5CF6Xz,1606,Tests failing against Python 3.6,9599,simonw,closed,0,,,,,3,2022-01-20T04:22:44Z,2022-01-20T04:36:42Z,2022-01-20T04:36:42Z,OWNER,,"https://github.com/simonw/datasette/runs/4877484366 ``` E File ""/opt/hostedtoolcache/Python/3.6.15/x64/lib/python3.6/site-packages/uvicorn/server.py"", line 67, in run E return asyncio.run(self.serve(sockets=sockets)) E AttributeError: module 'asyncio' has no attribute 'run' ``` I think this may mean `uvicorn` has dropped support for Python 3.6.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1606/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, 
""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1102359726,I_kwDOBm6k_c5BtKyu,1594,"Add a CLI reference page to the docs, inspired by sqlite-utils",9599,simonw,closed,0,,,7571612,Datasette 0.60,3,2022-01-13T20:55:08Z,2022-01-13T22:28:22Z,2022-01-13T21:38:48Z,OWNER,,"Thought of this while posting this comment: https://github.com/simonw/datasette/issues/1591#issuecomment-1012506595 I added https://sqlite-utils.datasette.io/en/stable/cli-reference.html to `sqlite-utils` in https://github.com/simonw/sqlite-utils/issues/383 and I _really_ like it - it's a page showing the `--help` output of every CLI command for that tool. It's maintained using `cog`. One of the benefits is that I get a free commit history of changes to `--help` at https://github.com/simonw/sqlite-utils/commits/main/docs/cli-reference.rst",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1594/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1097101917,I_kwDOBm6k_c5BZHJd,1588,`explain query plan select` is too strict about whitespace,9599,simonw,closed,0,,,7571612,Datasette 0.60,3,2022-01-09T04:22:42Z,2022-01-13T22:28:19Z,2022-01-13T20:35:05Z,OWNER,,"`explain query plan select * from facetable` is allowed: https://latest.datasette.io/fixtures?sql=explain+query+plan+select+*+from+facetable But... `explain query plan select * from facetable` (with two spaces before the `select`) returns a ""Statement must be a SELECT"" error: https://latest.datasette.io/fixtures?sql=explain+query+plan++select+*+from+facetable",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1588/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1077893013,I_kwDOBm6k_c5AP1eV,1551,`keep_blank_values=True` when parsing `request.args`,9599,simonw,closed,0,,,7571612,Datasette 0.60,3,2021-12-12T19:53:07Z,2022-01-13T22:26:04Z,2021-12-12T20:02:01Z,OWNER,,"This code in `TableView` wouldn't be necessary: https://github.com/simonw/datasette/blob/492f9835aa7e90540dd0c6324282b109f73df71b/datasette/views/table.py#L396-L399 If that happened here instead: https://github.com/simonw/datasette/blob/492f9835aa7e90540dd0c6324282b109f73df71b/datasette/utils/asgi.py#L98-L100 _Originally posted by @simonw in https://github.com/simonw/datasette/issues/1518#issuecomment-991827468_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1551/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 534629631,MDU6SXNzdWU1MzQ2Mjk2MzE=,650,Add a glossary to the documentation,9599,simonw,open,0,,,,,3,2019-12-09T00:23:45Z,2022-01-13T22:04:56Z,,OWNER,,"Call it `glossary.rst` - it can use a definition list something like this: ```rst .. _glossary: Glossary ======== Term A definition of the term. Another term Another definition. 
```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/650/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1097251014,I_kwDOCGYnMM5BZrjG,375,`sqlite-utils bulk` command,9599,simonw,closed,0,,,7558727,3.21,3,2022-01-09T17:12:38Z,2022-01-11T02:12:58Z,2022-01-11T02:10:55Z,OWNER,,"The `.executemany()` method is a very efficient way to execute the same SQL query against a huge list of parameters. `sqlite-utils insert` supports a bunch of ways of loading a list of dictionaries - from CSV, TSV, JSON, newline JSON and more thanks to: - #361 What if you could load a list of dictionaries and provide a SQL query with `:named` parameters that correspond to keys in those dictionaries instead? This would need to be a new command - I thought about adding a `--sql` option to `insert` but that doesn't make sense as that command already requires a table name.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/375/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1097477582,PR_kwDOCGYnMM4wtl17,377,`sqlite-utils bulk` command,9599,simonw,closed,0,,,7558727,3.21,3,2022-01-10T05:34:24Z,2022-01-11T02:10:57Z,2022-01-11T02:10:54Z,OWNER,simonw/sqlite-utils/pulls/377,"Refs #375 Still needs: - [x] Refactor `@insert_upsert_options` so that it doesn't duplicate `@import_options` - [x] Tests - [x] Documentation - [x] Try it against a really big file",140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/377/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1097135732,I_kwDOCGYnMM5BZPZ0,373,List `--fmt` options in the docs ,9599,simonw,closed,0,,,7558727,3.21,3,2022-01-09T08:22:11Z,2022-01-10T19:27:24Z,2022-01-09T17:49:00Z,OWNER,,https://sqlite-utils.datasette.io/en/stable/cli.html#table-formatted-output currently cheats and tells the user to run `--help` - can fix this using `cog`. 
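For reference, a rough sketch of the kind of `cog` block that could render that list in the docs (assuming the format names can be pulled from `tabulate.tabulate_formats`):

```rst
.. [[[cog
    import tabulate
    for fmt in tabulate.tabulate_formats:
        cog.out('- ``{}``\n'.format(fmt))
.. ]]]
.. [[[end]]]
```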
,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/373/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1097087280,I_kwDOCGYnMM5BZDkw,368,Offer `python -m sqlite_utils` as an alternative to `sqlite-utils`,9599,simonw,closed,0,,,7558727,3.21,3,2022-01-09T02:29:30Z,2022-01-10T19:27:20Z,2022-01-09T02:40:50Z,OWNER,,"> Add this to `sqlite_utils/cli.py`: > > ```python > if __name__ == ""__main__"": > cli() > ``` > Now the tool can be run using `python -m sqlite_utils.cli --help` _Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/364#issuecomment-1008214998_",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/368/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1079129258,PR_kwDOBm6k_c4vynly,1554,TableView refactor,9599,simonw,closed,0,,,,,3,2021-12-13T23:16:04Z,2021-12-20T23:52:11Z,2021-12-20T23:52:04Z,OWNER,simonw/datasette/pulls/1554,"I'm starting a PR with almost nothing in it so I can use the GitHub code commenting feature to add a bunch of comments to the code I intend to refactor. Related issues: - #617 - #715 - #870 - #1518",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1554/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 793002853,MDExOlB1bGxSZXF1ZXN0NTYwNzYwMTQ1,1204,WIP: Plugin includes,9599,simonw,open,0,,,,,3,2021-01-25T03:59:06Z,2021-12-17T07:10:49Z,,OWNER,simonw/datasette/pulls/1204,"Refs #1191 Next steps: - [ ] Get comfortable that this pattern is the right way to go - [ ] Implement it for all of the other pages, not just the table page - [ ] Add a new set of plugin tests that exercise ALL of these new hook locations - [ ] Document, then ship",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1204/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1, 636511683,MDU6SXNzdWU2MzY1MTE2ODM=,830,Redesign register_facet_classes plugin hook,9599,simonw,open,0,,,3268330,Datasette 1.0,3,2020-06-10T20:03:27Z,2021-12-16T19:58:22Z,,OWNER,,"Nothing uses this plugin hook yet, so the design is not yet proven. I'm going to build a real plugin against it and use that process to inform any design changes that may need to be made. I'll add a warning about this to the documentation.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/830/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1079111498,I_kwDOBm6k_c5AUe9K,1553,if csv export is truncated in non streaming mode set informative response header,536941,fgregg,open,0,,,,,3,2021-12-13T22:50:44Z,2021-12-16T19:17:28Z,,CONTRIBUTOR,,"streaming mode is currently not enabled for custom queries, so the queries will be truncated at the max row limit. it would be great if, when a response is truncated, a header signalling that were set on the response. 
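for illustration, that could be something like an `X-Datasette-Truncated: 1` response header (header name hypothetical, just to sketch the idea). 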
i need to write some pagination code for getting full results back for a custom query and it would make the code much better if i could reliably know when there is nothing more to limit/offset ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1553/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1071531082,I_kwDOCGYnMM4_3kRK,349,A way of creating indexes on newly created tables,9599,simonw,open,0,,,,,3,2021-12-05T18:56:12Z,2021-12-07T01:04:37Z,,OWNER,,"I'm writing code for https://github.com/simonw/git-history/issues/33 that creates a table inside a loop: ```python item_pk = db[item_table].lookup( {""_item_id"": item_id}, item_to_insert, column_order=(""_id"", ""_item_id""), pk=""_id"", ) ``` I need to look things up by `_item_id` on this table, which means I need an index on that column (the table can get very big). But there's no mechanism in SQLite utils to detect if the table was created for the first time and add an index to it. And I don't want to run `CREATE INDEX IF NOT EXISTS` every time through the loop. This should work like the `foreign_keys=` mechanism. ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/349/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1072435124,I_kwDOCGYnMM4_7A-0,350,Optional caching mechanism for table.lookup(),9599,simonw,open,0,,,,,3,2021-12-06T17:54:25Z,2021-12-06T17:56:57Z,,OWNER,,"Inspired by work on `git-history` where I used this pattern: ```python column_name_to_id = {} def column_id(column): if column not in column_name_to_id: id = db[""columns""].lookup( {""namespace"": namespace_id, ""name"": column}, foreign_keys=((""namespace"", ""namespaces"", ""id""),), ) column_name_to_id[column] = id return column_name_to_id[column] ``` If you're going to be doing a large number of `table.lookup(...)` calls and you know that no other script will be modifying the database at the same time you can presumably get a big speedup using a Python in-memory cache - maybe even an LRU one to avoid memory bloat.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/350/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1059509927,I_kwDOBm6k_c4_Jtan,1525,"""Links from other tables"" broken for columns starting with underscore",9599,simonw,closed,0,,,,,3,2021-11-21T22:55:08Z,2021-11-30T06:39:01Z,2021-11-30T06:34:35Z,OWNER,,"Same bug as #1506, this time it's this link or the row page: ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1525/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1066288689,I_kwDOBm6k_c4_jkYx,1538,Research pattern for re-registering existing Click tools with register_commands,9599,simonw,closed,0,,,,,3,2021-11-29T17:09:47Z,2021-11-29T17:32:44Z,2021-11-29T17:27:16Z,OWNER,,"Building a Datasette plugin that imports an existing Click CLI tool and re-registers it is proving hard - Click doesn't really want you to do that. 
I tried this: ```python from datasette import hookimpl from git_history.cli import file as git_history_file @hookimpl def register_commands(cli): cli.command(name=""git-history"")(git_history_file.callback) ``` But when I run this: ``` % datasette git-history --help Usage: datasette git-history [OPTIONS] Analyze the history of a specific file and write it to SQLite Options: --help Show this message and exit. ``` The options are all missing - which means that the command doesn't actually work. Will need to research this pattern separately. _Originally posted by @simonw in https://github.com/simonw/git-history/issues/21#issuecomment-981835305_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1538/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1039037439,PR_kwDOCGYnMM4t0uaI,333,Add functionality to read Parquet files.,2118708,Florents-Tselai,closed,0,,,,,3,2021-10-28T23:43:19Z,2021-11-25T19:47:35Z,2021-11-25T19:47:35Z,NONE,simonw/sqlite-utils/pulls/333,"I needed this for a project of mine, and I thought it'd be useful to have it in sqlite-utils (It's also mentioned in #248 ). The current implementation works (data is read & data types are inferred correctly). I've added a single straightforward test case, but @simonw please let me know if there are any non-obvious flags/combinations I should test too.",140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/333/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 991575770,MDExOlB1bGxSZXF1ZXN0NzMwMDIwODY3,1467,Add Authorization header when CORS flag is set,3058200,jameslittle230,closed,0,,,,,3,2021-09-08T22:14:41Z,2021-10-17T02:29:07Z,2021-10-14T18:54:18Z,NONE,simonw/datasette/pulls/1467,"This PR adds the [`Access-Control-Allow-Headers`](https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/Access-Control-Allow-Headers) flag when CORS mode is enabled. This would fix https://github.com/simonw/datasette-auth-tokens/issues/4. When making cross-origin requests, the server must respond with all allowable HTTP headers. A Datasette instance using auth tokens must accept the `Authorization` HTTP header in order for cross-origin authenticated requests to take place. Please let me know if there's a better way of doing this! I couldn't figure out a way to change the app's response from the plugin itself, so I'm starting here. 
If you'd rather this logic live in the plugin, I'd love any guidance you're able to give.",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1467/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 964400482,MDU6SXNzdWU5NjQ0MDA0ODI=,310,`sqlite-utils insert --flatten` option to flatten nested JSON,9599,simonw,closed,0,,,,,3,2021-08-09T21:23:08Z,2021-10-16T13:54:56Z,2021-08-09T21:44:06Z,OWNER,,"I had to do this with a `jq` recipe today: https://til.simonwillison.net/cloudrun/tailing-cloud-run-request-logs ``` cat log.json | jq -c '[leaf_paths as $path | { ""key"": $path | join(""_""), ""value"": getpath($path) }] | from_entries' \ | sqlite-utils insert /tmp/logs.db logs - --nl --alter --batch-size 1 ``` That was to turn something like this: ```json { ""httpRequest"": { ""latency"": ""0.112114537s"", ""requestMethod"": ""GET"", ""requestSize"": ""534"", ""status"": 200, }, ""insertId"": ""6111722f000b5b4c4d4071e2"", ""labels"": { ""service"": ""datasette-io"" } } ``` Into this instead: ```json { ""httpRequest_latency"": ""0.112114537s"", ""httpRequest_requestMethod"": ""GET"", ""httpRequest_requestSize"": ""534"", ""httpRequest_status"": 200, ""insertId"": ""6111722f000b5b4c4d4071e2"", ""labels_service"": ""datasette-io"" } ``` I have to do this often enough that I think it should be an option, `--flatten` - so I can do this instead: ``` cat log.json | sqlite-utils insert /tmp/logs.db logs - --flatten ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/310/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1026379132,PR_kwDOBm6k_c4tM0JV,1489,"Update pyyaml requirement from ~=5.3 to >=5.3,<7.0",49699333,dependabot[bot],closed,0,,,,,3,2021-10-14T13:09:33Z,2021-10-14T18:10:43Z,2021-10-14T18:10:42Z,CONTRIBUTOR,simonw/datasette/pulls/1489,"Updates the requirements on [pyyaml](https://github.com/yaml/pyyaml) to permit the latest version.
Changelog

Sourced from pyyaml's changelog.

6.0 (2021-10-13)

5.4.1 (2021-01-20)

  • yaml/pyyaml#480 -- Fix stub compat with older pyyaml versions that may unwittingly load it

5.4 (2021-01-19)

5.3.1 (2020-03-18)

  • yaml/pyyaml#386 -- Prevents arbitrary code execution during python/object/new constructor

5.3 (2020-01-06)

5.2 (2019-12-02)

... (truncated)

Commits
  • 8cdff2c 6.0 release
  • a4fb55e Update Python 3.10 versions for Windows build
  • e45b964 Add Python 3.10 to tox.ini
  • 4808fba 6.0b1 release
  • d5aba40 Omnibus CI/artifact build update
  • a6d384c Various setup fixes
  • 8f3f979 No longer using appveyor
  • c274365 The yaml.load{,_all} functions require Loader= now
  • 2f87ac4 Add a basic test file for yaml.load and yaml.dump
  • 7bd92df Makefile tweaks
  • Additional commits viewable in compare view

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`.
",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1489/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 929748885,MDExOlB1bGxSZXF1ZXN0Njc3NTU0OTI5,293,Test against Python 3.10-dev,9599,simonw,closed,0,,,,,3,2021-06-25T01:40:39Z,2021-10-13T21:49:33Z,2021-10-13T21:49:33Z,OWNER,simonw/sqlite-utils/pulls/293,,140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/293/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1004613267,I_kwDOCGYnMM474S6T,328,Invalid JSON output when no rows,12752,gravis,closed,0,,,,,3,2021-09-22T18:37:26Z,2021-09-22T20:21:34Z,2021-09-22T20:20:18Z,NONE,,"`sqlite-utils query` generates a JSON output with the result from the query: ```json [{...},{...}] ``` If no rows are returned by the query, I'm expecting an empty JSON array: ```json [] ``` But actually I'm getting an empty string. To be consistent, the output should be `[]` when the request succeeds (return code == `0`).",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/328/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 989986586,MDU6SXNzdWU5ODk5ODY1ODY=,1461,Try blacken-docs,9599,simonw,closed,0,,,,,3,2021-09-07T13:28:50Z,2021-09-07T16:13:59Z,2021-09-07T16:13:59Z,OWNER,,https://github.com/asottile/blacken-docs,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1461/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 988325628,MDExOlB1bGxSZXF1ZXN0NzI3MjY1MDI1,1455,Add scientists to target groups,198537,rgieseke,closed,0,,,,,3,2021-09-04T16:28:58Z,2021-09-04T16:32:21Z,2021-09-04T16:31:38Z,CONTRIBUTOR,simonw/datasette/pulls/1455,"Not sure if you want them mentioned explicitly (it's already a long list), but following up on https://twitter.com/simonw/status/1434176989565382656",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1455/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 978357984,MDU6SXNzdWU5NzgzNTc5ODQ=,1446,Modify base.html template to support optional sticky footer,9599,simonw,closed,0,,,,,3,2021-08-24T18:11:12Z,2021-08-31T01:54:59Z,2021-08-24T20:32:47Z,OWNER,,"The neatest way to have the footer stick to the bottom of the browser window that I've found is to use the flexbox pattern from https://css-tricks.com/couple-takes-sticky-footer/ ```html
<div class=""content"">
  content
</div>
<footer class=""footer"">
  footer
</footer>
``` ```css html, body { height: 100%; } body { display: flex; flex-direction: column; } .content { flex: 1 0 auto; } .footer { flex-shrink: 0; } ``` I tried this in a custom plugin but it ended up having to duplicate the entire `base.html` template just to get a wrapper around the not-footer content. I think Datasette's own `base.html` template should have this wrapper element instead.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1446/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 913135723,MDU6SXNzdWU5MTMxMzU3MjM=,266,"Add some types, enforce with mypy",9599,simonw,closed,0,,,,,3,2021-06-07T06:05:56Z,2021-08-18T22:25:38Z,2021-08-18T22:25:38Z,OWNER,,"A good starting point would be adding type information to the members of these named tuples and the introspection methods that return them: https://github.com/simonw/sqlite-utils/blob/9dff7a38831d471b1dff16d40d89eb5c3b4e84d6/sqlite_utils/db.py#L51-L75",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/266/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 959710008,MDU6SXNzdWU5NTk3MTAwMDg=,1419,`publish cloudrun` should deploy a more recent SQLite version,536941,fgregg,open,0,,,,,3,2021-08-04T00:45:55Z,2021-08-05T03:23:24Z,,CONTRIBUTOR,,"I recently changed from deploying a datasette using `datasette publish heroku` to `datasette publish cloudrun`. [A query that ran on the heroku site](https://odpr.bunkum.us/odpr-6c2f4fc?sql=with+pivot_members+as+%28%0D%0A++select%0D%0A++++f_num%2C%0D%0A++++max%28union_name%29+as+union_name%2C%0D%0A++++max%28aff_abbr%29+as+abbreviation%2C%0D%0A++++sum%28number%29+filter+%28%0D%0A++++++where%0D%0A++++++++yr_covered+%3D+2007%0D%0A++++%29+as+%222007%22%2C%0D%0A++++sum%28number%29+filter+%28%0D%0A++++++where%0D%0A++++++++yr_covered+%3D+2008%0D%0A++++%29+as+%222008%22%2C%0D%0A++++sum%28number%29+filter+%28%0D%0A++++++where%0D%0A++++++++yr_covered+%3D+2009%0D%0A++++%29+as+%222009%22%2C%0D%0A++++sum%28number%29+filter+%28%0D%0A++++++where%0D%0A++++++++yr_covered+%3D+2010%0D%0A++++%29+as+%222010%22%2C%0D%0A++++sum%28number%29+filter+%28%0D%0A++++++where%0D%0A++++++++yr_covered+%3D+2011%0D%0A++++%29+as+%222011%22%2C%0D%0A++++sum%28number%29+filter+%28%0D%0A++++++where%0D%0A++++++++yr_covered+%3D+2012%0D%0A++++%29+as+%222012%22%2C%0D%0A++++sum%28number%29+filter+%28%0D%0A++++++where%0D%0A++++++++yr_covered+%3D+2013%0D%0A++++%29+as+%222013%22%2C%0D%0A++++sum%28number%29+filter+%28%0D%0A++++++where%0D%0A++++++++yr_covered+%3D+2014%0D%0A++++%29+as+%222014%22%2C%0D%0A++++sum%28number%29+filter+%28%0D%0A++++++where%0D%0A++++++++yr_covered+%3D+2015%0D%0A++++%29+as+%222015%22%2C%0D%0A++++sum%28number%29+filter+%28%0D%0A++++++where%0D%0A++++++++yr_covered+%3D+2016%0D%0A++++%29+as+%222016%22%2C%0D%0A++++sum%28number%29+filter+%28%0D%0A++++++where%0D%0A++++++++yr_covered+%3D+2017%0D%0A++++%29+as+%222017%22%2C%0D%0A++++sum%28number%29+filter+%28%0D%0A++++++where%0D%0A++++++++yr_covered+%3D+2018%0D%0A++++%29+as+%222018%22%2C%0D%0A++++sum%28number%29+filter+%28%0D%0A++++++where%0D%0A++++++++yr_covered+%3D+2019%0D%0A++++%29+as+%222019%22%0D%0A++from%0D%0A++++lm_data%0D%0A++++left+join+ar_membership+using+%28rpt_id%29%0D%0A++where%0D%0A++++members+%21%3D+%27%27%0D%0A++++and+trim%28desig_name%29+%3D+%27NHQ%27%0D%0A++++and+union_nam
e+not+in+%28%0D%0A++++++%27AFL-CIO%27%2C%0D%0A++++++%27CHANGE+TO+WIN%27%2C%0D%0A++++++%27FOOD+ALLIED+SVC+TRADES+DEPT+AFL-CIO%27%0D%0A++++%29%0D%0A++++and+f_num+not+in+%28387%2C+296%2C+123%2C+347%2C+531897%2C+30410%2C+49%29%0D%0A++++and+lower%28ar_membership.category%29+IN+%28%0D%0A++++++%27active+members%27%2C%0D%0A++++++%27active+professional%27%2C%0D%0A++++++%27regular+members%27%2C%0D%0A++++++%27full+time+member%27%2C%0D%0A++++++%27full+time+members%27%2C%0D%0A++++++%27active+education+support+professional%27%2C%0D%0A++++++%27active%27%2C%0D%0A++++++%27active+member%27%2C%0D%0A++++++%27members%27%2C%0D%0A++++++%27see+item+69%27%2C%0D%0A++++++%27regular%27%2C%0D%0A++++++%27dues+paying+members%27%2C%0D%0A++++++%27building+trades+journeyman%27%2C%0D%0A++++++%27full+per+capita+tax+payers%27%2C%0D%0A++++++%27active+postal%27%2C%0D%0A++++++%27active+membership%27%2C%0D%0A++++++%27full-time+member%27%2C%0D%0A++++++%27regular+active+member%27%2C%0D%0A++++++%27journeyman%27%2C%0D%0A++++++%27member%27%2C%0D%0A++++++%27journeymen%27%2C%0D%0A++++++%27one+half+per+capita+tax+payers%27%2C%0D%0A++++++%27schedule+1%27%2C%0D%0A++++++%27active+memebers%27%2C%0D%0A++++++%27active+members+-+us%27%2C%0D%0A++++++%27dues+paying+membership%27%0D%0A++++%29%0D%0A++GROUP+by%0D%0A++++f_num%0D%0A%29%0D%0Aselect%0D%0A++*%0D%0Afrom%0D%0A++pivot_members%0D%0Aorder+by%0D%0A++%222019%22+desc%3B), now throws a syntax error on the [cloudrun site](https://labordata.bunkum.us/odpr-6c2f4fc?sql=with+pivot_members+as+%28%0D%0A++select%0D%0A++++f_num%2C%0D%0A++++max%28union_name%29+as+union_name%2C%0D%0A++++max%28aff_abbr%29+as+abbreviation%2C%0D%0A++++sum%28number%29+filter+%28%0D%0A++++++where%0D%0A++++++++yr_covered+%3D+2007%0D%0A++++%29+as+%222007%22%2C%0D%0A++++sum%28number%29+filter+%28%0D%0A++++++where%0D%0A++++++++yr_covered+%3D+2008%0D%0A++++%29+as+%222008%22%2C%0D%0A++++sum%28number%29+filter+%28%0D%0A++++++where%0D%0A++++++++yr_covered+%3D+2009%0D%0A++++%29+as+%222009%22%2C%0D%0A++++sum%28number%29+filter+%28%0D%0A++++++where%0D%0A++++++++yr_covered+%3D+2010%0D%0A++++%29+as+%222010%22%2C%0D%0A++++sum%28number%29+filter+%28%0D%0A++++++where%0D%0A++++++++yr_covered+%3D+2011%0D%0A++++%29+as+%222011%22%2C%0D%0A++++sum%28number%29+filter+%28%0D%0A++++++where%0D%0A++++++++yr_covered+%3D+2012%0D%0A++++%29+as+%222012%22%2C%0D%0A++++sum%28number%29+filter+%28%0D%0A++++++where%0D%0A++++++++yr_covered+%3D+2013%0D%0A++++%29+as+%222013%22%2C%0D%0A++++sum%28number%29+filter+%28%0D%0A++++++where%0D%0A++++++++yr_covered+%3D+2014%0D%0A++++%29+as+%222014%22%2C%0D%0A++++sum%28number%29+filter+%28%0D%0A++++++where%0D%0A++++++++yr_covered+%3D+2015%0D%0A++++%29+as+%222015%22%2C%0D%0A++++sum%28number%29+filter+%28%0D%0A++++++where%0D%0A++++++++yr_covered+%3D+2016%0D%0A++++%29+as+%222016%22%2C%0D%0A++++sum%28number%29+filter+%28%0D%0A++++++where%0D%0A++++++++yr_covered+%3D+2017%0D%0A++++%29+as+%222017%22%2C%0D%0A++++sum%28number%29+filter+%28%0D%0A++++++where%0D%0A++++++++yr_covered+%3D+2018%0D%0A++++%29+as+%222018%22%2C%0D%0A++++sum%28number%29+filter+%28%0D%0A++++++where%0D%0A++++++++yr_covered+%3D+2019%0D%0A++++%29+as+%222019%22%0D%0A++from%0D%0A++++lm_data%0D%0A++++left+join+ar_membership+using+%28rpt_id%29%0D%0A++where%0D%0A++++members+%21%3D+%27%27%0D%0A++++and+trim%28desig_name%29+%3D+%27NHQ%27%0D%0A++++and+union_name+not+in+%28%0D%0A++++++%27AFL-CIO%27%2C%0D%0A++++++%27CHANGE+TO+WIN%27%2C%0D%0A++++++%27FOOD+ALLIED+SVC+TRADES+DEPT+AFL-CIO%27%0D%0A++++%29%0D%0A++++and+f_num+not+in+%28387%2C+296%2C+123%2C+347%2C+531897%2C+30410%2C+49%29
%0D%0A++++and+lower%28ar_membership.category%29+IN+%28%0D%0A++++++%27active+members%27%2C%0D%0A++++++%27active+professional%27%2C%0D%0A++++++%27regular+members%27%2C%0D%0A++++++%27full+time+member%27%2C%0D%0A++++++%27full+time+members%27%2C%0D%0A++++++%27active+education+support+professional%27%2C%0D%0A++++++%27active%27%2C%0D%0A++++++%27active+member%27%2C%0D%0A++++++%27members%27%2C%0D%0A++++++%27see+item+69%27%2C%0D%0A++++++%27regular%27%2C%0D%0A++++++%27dues+paying+members%27%2C%0D%0A++++++%27building+trades+journeyman%27%2C%0D%0A++++++%27full+per+capita+tax+payers%27%2C%0D%0A++++++%27active+postal%27%2C%0D%0A++++++%27active+membership%27%2C%0D%0A++++++%27full-time+member%27%2C%0D%0A++++++%27regular+active+member%27%2C%0D%0A++++++%27journeyman%27%2C%0D%0A++++++%27member%27%2C%0D%0A++++++%27journeymen%27%2C%0D%0A++++++%27one+half+per+capita+tax+payers%27%2C%0D%0A++++++%27schedule+1%27%2C%0D%0A++++++%27active+memebers%27%2C%0D%0A++++++%27active+members+-+us%27%2C%0D%0A++++++%27dues+paying+membership%27%0D%0A++++%29%0D%0A++GROUP+by%0D%0A++++f_num%0D%0A%29%0D%0Aselect%0D%0A++*%0D%0Afrom%0D%0A++pivot_members%0D%0Aorder+by%0D%0A++%222019%22+desc%3B). I suspect this is because they are running different versions of sqlite3. - Heroku: sqlite3 3.31.1 ([-/versions](https://odpr.bunkum.us/-/versions)) - Cloudrun: sqlite3 3.27.2 ([-/versions](https://labordata.bunkum.us/-/versions)) If so, it would be great to 1. harmonize the sqlite3 versions across platforms 2. update the docker files so as to update the sqlite3 version for cloudrun",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1419/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 957731178,MDU6SXNzdWU5NTc3MzExNzg=,304,"`table.convert(..., where=)` and `sqlite-utils convert ... --where=`",9599,simonw,closed,0,,,,,3,2021-08-02T04:27:23Z,2021-08-02T19:00:00Z,2021-08-02T18:58:10Z,OWNER,,"For applying the conversion to a subset of rows selected using the where clause. Should also take optional arguments, as seen in `db[""dogs""].delete_where(""age < ?"", [3])`. Follows #302 and #251. This was originally https://github.com/simonw/sqlite-transform/issues/9",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/304/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 955316250,MDU6SXNzdWU5NTUzMTYyNTA=,1405,utils.parse_metadata() should be a documented internal function,9599,simonw,closed,0,,,,,3,2021-07-28T23:51:39Z,2021-07-29T23:33:30Z,2021-07-29T23:30:24Z,OWNER,,Because it's used by this plugin: https://github.com/simonw/datasette-remote-metadata,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1405/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 952189173,MDU6SXNzdWU5NTIxODkxNzM=,3,Use HN algolia endpoint to retrieve trees,9599,simonw,open,0,,,,,3,2021-07-25T03:35:27Z,2021-07-25T18:41:17Z,,MEMBER,,"The `trees` command currently has to make a request for every single comment. Algolia have an endpoint that bundles the entire thread together into a single request. 
`https://hn.algolia.com/api/v1/items/ID` Here's an example that loads quickly, with about 50 comments: https://hn.algolia.com/api/v1/items/27941108 It doesn't appear to use pagination at all - if a thread is big then the response is big. I ran this search to find some stories with more than 1000 comments: https://hn.algolia.com/api/v1/search?tags=story&numericFilters=num_comments%3E=1000 Here's one: https://news.ycombinator.com/item?id=25015967 with 4759 comments. Hitting the API takes 41s and returns 3.7 MB of JSON! ``` wget 'https://hn.algolia.com/api/v1/items/25015967' 0.03s user 0.04s system 0% cpu 41.368 total /tmp % ls -lah 25015967 -rw-r--r-- 1 simon wheel 3.7M Jul 24 20:31 25015967 ```",248903544,hacker-news-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/hacker-news-to-sqlite/issues/3/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 803333769,MDU6SXNzdWU4MDMzMzM3Njk=,32,KeyError: 'Contents' on running upload,11855322,robmarkcole,open,0,,,,,3,2021-02-08T08:36:37Z,2021-07-22T06:40:25Z,,NONE,,"Following the readme, on big sur, and having entered my auth creds via `dogsheep-photos s3-auth`: ``` (venv) (base) Robins-MacBook:datasette robin$ dogsheep-photos upload photos.db ~/Pictures/Photos\ /Users/robin/Pictures/Library.photoslibrary --dry-run Fetching existing keys from S3... Traceback (most recent call last): File ""/Users/robin/datasette/venv/bin/dogsheep-photos"", line 8, in sys.exit(cli()) File ""/Users/robin/datasette/venv/lib/python3.8/site-packages/click/core.py"", line 829, in __call__ return self.main(*args, **kwargs) File ""/Users/robin/datasette/venv/lib/python3.8/site-packages/click/core.py"", line 782, in main rv = self.invoke(ctx) File ""/Users/robin/datasette/venv/lib/python3.8/site-packages/click/core.py"", line 1259, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/Users/robin/datasette/venv/lib/python3.8/site-packages/click/core.py"", line 1066, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/Users/robin/datasette/venv/lib/python3.8/site-packages/click/core.py"", line 610, in invoke return callback(*args, **kwargs) File ""/Users/robin/datasette/venv/lib/python3.8/site-packages/dogsheep_photos/cli.py"", line 96, in upload key.split(""."")[0] for key in get_all_keys(client, creds[""photos_s3_bucket""]) File ""/Users/robin/datasette/venv/lib/python3.8/site-packages/dogsheep_photos/utils.py"", line 46, in get_all_keys for row in page[""Contents""]: KeyError: 'Contents' ``` Possibly since the bucket is in `EU (London) eu-west-2` and this info is not requested?",256834907,dogsheep-photos,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-photos/issues/32/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 612673948,MDU6SXNzdWU2MTI2NzM5NDg=,759,fts search on a column doesn't work anymore due to escape_fts,133845,Krazybug,closed,0,,,,,3,2020-05-05T15:03:44Z,2021-07-16T02:11:54Z,2020-05-06T17:50:57Z,NONE,,"Hi and first, thank you for this awesome work you make with this project. On a db indexed in full text search, I can't query on an indexed column anymore. This request ""cauvin language:ita"": is running smoothly on an old version of datasette but not on the current version. 
Compare the current version query `select uuid, title, authors, year, series, language, formats, publisher, tags, identifiers from summary where rowid in (select rowid from summary_fts where summary_fts match escape_fts(:search)) order by uuid limit 101` To an older version: `select title, authors, series, uuid, language, identifiers, tags, publisher, formats, year, links from summary where rowid in (select rowid from summary_fts where summary_fts match :search) order by uuid limit 101` _language_ is a searchable column but now the search string ""cauvin language:ita"" is treated literally as a single search term. Columns are not parsed. ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/759/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 539590148,MDU6SXNzdWU1Mzk1OTAxNDg=,651,fts5 syntax error when using punctuation,2181410,clausjuhl,closed,0,,,,,3,2019-12-18T10:25:35Z,2021-07-14T19:26:06Z,2019-12-30T06:42:55Z,NONE,,"Hi Simon I get a syntax error when using punctuation or special characters in a fulltext search (using fts5). I created the virtual table using sqlite-utils' ""enable-fts""-command. The same error appears on Niche Museums [https://www.niche-museums.com/browse/search?q=park.](https://www.niche-museums.com/browse/search?q=park.), but works fine in most of your other datasette-examples, e.g. register-of-members-interests [https://register-of-members-interests.datasettes.com/regmem-98dc8b7/items?_search=mins.](https://register-of-members-interests.datasettes.com/regmem-98dc8b7/items?_search=mins.) What am I doing wrong? Many thanks! ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/651/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 646448486,MDExOlB1bGxSZXF1ZXN0NDQwNzM1ODE0,868,initial windows ci setup,702729,joshmgrant,open,0,,,,,3,2020-06-26T18:49:13Z,2021-07-10T23:41:43Z,,FIRST_TIME_CONTRIBUTOR,simonw/datasette/pulls/868,Picking up the work done on #557 with a new PR. Seeing if I can get this working.,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/868/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 920884085,MDU6SXNzdWU5MjA4ODQwODU=,1377,Mechanism for plugins to exclude certain paths from CSRF checks,9599,simonw,closed,0,,,,,3,2021-06-15T00:48:20Z,2021-06-23T22:51:33Z,2021-06-23T22:51:33Z,OWNER,,I need this for a plugin I'm building that offers a POST API.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1377/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 925677191,MDU6SXNzdWU5MjU2NzcxOTE=,289,Mypy fixes for rows_from_file(),857609,adamchainz,closed,0,,,,,3,2021-06-20T20:34:59Z,2021-06-22T18:44:36Z,2021-06-22T18:13:26Z,NONE,,"Following https://github.com/simonw/sqlite-utils/issues/279#issuecomment-864328927 You had two mypy errors. 
The first: > sqlite_utils/utils.py:157: error: Argument 1 to ""BufferedReader"" has incompatible type ""BinaryIO""; expected ""RawIOBase"" Looking at the `BufferedReader` docs, it seems to expect a `RawIOBase`, and this [has been copied into typeshed](https://github.com/python/typeshed/blob/9ec2f8712480c57353cea097a65d75a2c4ec1846/stdlib/io.pyi#L100). There may be scope to change how `BufferedReader` is documented and typed upstream, but for now it wouldn't be too bad to use a `typing.cast()`: ``` # Detect the format, then call this recursively buffered = io.BufferedReader( cast(io.RawIOBase, fp), # Undocumented BufferedReader support for BinaryIO buffer_size=4096, ) ``` The second error seems to be flagging a legitimate bug in your code: > sqlite_utils/utils.py:163: error: Argument 1 to ""decode"" of ""bytes"" has incompatible type ""Optional[str]""; expected ""str"" From your type hints, `encoding` may be `None`. In the CSV format block, you use `encoding or ""utf-8-sig""` to set a default, maybe that's desirable in this case too? ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/289/reactions"", ""total_count"": 1, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 1, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 923697888,MDU6SXNzdWU5MjM2OTc4ODg=,278,"Support db as first parameter before subcommand, or as environment variable",601708,mcint,closed,0,,,,,3,2021-06-17T09:26:29Z,2021-06-20T22:39:57Z,2021-06-18T15:43:19Z,CONTRIBUTOR,,,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/278/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 732685643,MDU6SXNzdWU3MzI2ODU2NDM=,1063,.csv should link to .blob downloads,9599,simonw,closed,0,,,6026070,0.51,3,2020-10-29T21:45:58Z,2021-06-17T18:12:30Z,2020-10-29T22:47:45Z,OWNER,,"- [x] Update `.csv` output to link to these things (and get that `xfail` test to pass) - ~~Add a `.csv?_blob_base64=1` argument that causes them to be output in base64 in the CSV~~ > Moving the CSV work to a separate ticket. _Originally posted by @simonw in https://github.com/simonw/datasette/pull/1061#issuecomment-719042601_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1063/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 919250621,MDU6SXNzdWU5MTkyNTA2MjE=,269,bool type not supported,4068,frafra,closed,0,,,,,3,2021-06-11T22:00:36Z,2021-06-15T01:34:10Z,2021-06-15T01:34:10Z,NONE,,"Hi! Thank you for sharing this very nice tool :) It would be nice to have support for more types, like `bool`: it is not possible to convert to boolean at the moment. 
My suggestion, sketched above, would be to handle it as `bool(int(value))`, like csvkit does.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/269/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 919822817,MDU6SXNzdWU5MTk4MjI4MTc=,1376,Official Datasette Docker image should use SQLite >= 3.31.0 (for generated columns),1726460,jcgregorio,open,0,,,,,3,2021-06-13T15:25:51Z,2021-06-13T15:39:37Z,,NONE,,"Trying to run datasette via the Docker container doesn't seem to work:

```
$ docker run -p 8001:8001 -v `pwd`:/mnt datasetteproject/datasette datasette -p 8001 -h 0.0.0.0 /mnt/fixtures.db
Traceback (most recent call last):
  File ""/usr/local/bin/datasette"", line 8, in <module>
    sys.exit(cli())
  File ""/usr/local/lib/python3.9/site-packages/click/core.py"", line 829, in __call__
    return self.main(*args, **kwargs)
  File ""/usr/local/lib/python3.9/site-packages/click/core.py"", line 782, in main
    rv = self.invoke(ctx)
  File ""/usr/local/lib/python3.9/site-packages/click/core.py"", line 1259, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File ""/usr/local/lib/python3.9/site-packages/click/core.py"", line 1066, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File ""/usr/local/lib/python3.9/site-packages/click/core.py"", line 610, in invoke
    return callback(*args, **kwargs)
  File ""/usr/local/lib/python3.9/site-packages/datasette/cli.py"", line 544, in serve
    asyncio.get_event_loop().run_until_complete(check_databases(ds))
  File ""/usr/local/lib/python3.9/asyncio/base_events.py"", line 642, in run_until_complete
    return future.result()
  File ""/usr/local/lib/python3.9/site-packages/datasette/cli.py"", line 584, in check_databases
    await database.execute_fn(check_connection)
  File ""/usr/local/lib/python3.9/site-packages/datasette/database.py"", line 155, in execute_fn
    return await asyncio.get_event_loop().run_in_executor(
  File ""/usr/local/lib/python3.9/concurrent/futures/thread.py"", line 52, in run
    result = self.fn(*self.args, **self.kwargs)
  File ""/usr/local/lib/python3.9/site-packages/datasette/database.py"", line 153, in in_thread
    return fn(conn)
  File ""/usr/local/lib/python3.9/site-packages/datasette/utils/__init__.py"", line 892, in check_connection
    for r in conn.execute(
sqlite3.DatabaseError: malformed database schema (generated_columns) - near ""AS"": syntax error
```

I have confirmed that the downloaded `fixtures.db` database is fine:

```
[skia-public] jcgregorio@jcgregorio840 ~/Downloads
$ sqlite3 fixtures.db
SQLite version 3.34.1 2021-01-20 14:10:07
Enter "".help"" for usage hints.
sqlite> pragma integrity_check;
ok
sqlite>
```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1376/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 912418094,MDU6SXNzdWU5MTI0MTgwOTQ=,1358,Release Datasette 0.57,9599,simonw,closed,0,,,,,3,2021-06-05T19:56:13Z,2021-06-05T22:20:07Z,2021-06-05T22:20:07Z,OWNER,,"Need release notes.
Changes are here: https://github.com/simonw/datasette/compare/0.56...368aa5f1b16ca35f82d90ff747023b9a2bfa27c1 Partial release notes already exist for the two alphas, https://github.com/simonw/datasette/releases/tag/0.57a0 and https://github.com/simonw/datasette/releases/tag/0.57a1",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1358/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 907642546,MDU6SXNzdWU5MDc2NDI1NDY=,264,"Supporting additional output formats, like GeoJSON",25778,eyeseast,closed,0,,,,,3,2021-05-31T18:03:32Z,2021-06-03T05:12:21Z,2021-06-03T05:12:21Z,CONTRIBUTOR,,"I have a project going where it would be useful to do some spatial processing in SQLite (instead of large files) and then output GeoJSON. So my workflow would be something like this:

1. Read Shapefiles, GeoJSON, CSVs into a SQLite database
2. Join, filter, prune as needed
3. Export GeoJSON for just the stuff I need at that moment, while still having a database of things that will be useful later

I'm wondering if this is worth adding to SQLite-utils itself (GeoJSON, at least), or if it's better to make a counterpart to the ecosystem of `*-to-sqlite` tools, say a suite of `sqlite-to-*` things. Or would it be crazy to have a plugin system?",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/264/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 904537568,MDExOlB1bGxSZXF1ZXN0NjU1Njg0NDc3,1346,Re-display user's query with an error message if an error occurs,9599,simonw,closed,0,,,,,3,2021-05-28T02:04:20Z,2021-06-02T03:46:21Z,2021-06-02T03:46:21Z,OWNER,simonw/datasette/pulls/1346,Refs #619,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1346/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 323671577,MDU6SXNzdWUzMjM2NzE1Nzc=,263,Facets should not execute for ?shape=array|object,9599,simonw,closed,0,,,,,3,2018-05-16T15:26:13Z,2021-06-02T02:54:34Z,2021-06-02T02:54:34Z,OWNER,,Split off from #255 - there's no point executing the facet SQL for the `?_shape=array` and `?_shape=object` API responses.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/263/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 838148087,MDU6SXNzdWU4MzgxNDgwODc=,250,Handle byte order marks (BOMs) in CSV files,9599,simonw,closed,0,,,,,3,2021-03-22T22:13:18Z,2021-05-29T05:34:21Z,2021-05-29T05:34:21Z,OWNER,,"I often find `sqlite-utils insert ... --csv` creates a first column with a weird character at the start of it - which it turns out is the UTF-8 BOM.
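For illustration, a minimal sketch of one likely fix: Python's built-in `utf-8-sig` codec consumes a leading BOM if present and behaves like plain `utf-8` otherwise. (This is just a demonstration with a hypothetical `read_csv_without_bom()` helper, not necessarily how sqlite-utils will implement it.)

```
import csv

def read_csv_without_bom(path):
    # Hypothetical helper for illustration only.
    # 'utf-8-sig' strips a leading UTF-8 BOM, so the first header no
    # longer arrives with \ufeff glued onto its name.
    with open(path, encoding='utf-8-sig', newline='') as fp:
        return list(csv.reader(fp))
```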
Fix that.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/250/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 903902495,MDU6SXNzdWU5MDM5MDI0OTU=,1342,Improve `path_with_replaced_args()` and friends and document them,9599,simonw,open,0,,,,,3,2021-05-27T15:18:28Z,2021-05-27T15:23:02Z,,OWNER,,"> In order to cleanly implement this I need to expose the `path_with_replaced_args` utility function to Datasette's template engine. This is the first time this will become an exposed (and hence should-be-documented) API and I don't like its shape much. _Originally posted by @simonw in https://github.com/simonw/datasette/issues/1337#issuecomment-849721280_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1342/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 771872303,MDExOlB1bGxSZXF1ZXN0NTQzMjQ2NTM1,59,Remove unneeded exists=True for -a/--auth flag.,631242,frosencrantz,closed,0,,,,,3,2020-12-21T06:03:55Z,2021-05-22T14:06:19Z,2021-05-19T16:08:12Z,CONTRIBUTOR,dogsheep/github-to-sqlite/pulls/59,The file does not need to exist when using an environment variable.,207052882,github-to-sqlite,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/59/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 642296989,MDU6SXNzdWU2NDIyOTY5ODk=,856,Consider pagination of canned queries,9599,simonw,open,0,,,,,3,2020-06-20T03:15:59Z,2021-05-21T14:22:41Z,,OWNER,,The new `canned_queries()` plugin hook from #852 combined with plugins like https://github.com/simonw/datasette-saved-queries could mean that some installations end up with hundreds or even thousands of canned queries. I should consider pagination or some other way of ensuring that this doesn't cause performance problems for Datasette.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/856/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 866668415,MDU6SXNzdWU4NjY2Njg0MTU=,1308,"Columns named ""link"" display in bold",9599,simonw,closed,0,,,,,3,2021-04-24T05:58:11Z,2021-04-24T06:07:49Z,2021-04-24T06:07:49Z,OWNER,,Reported in office hours today.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1308/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 453131917,MDU6SXNzdWU0NTMxMzE5MTc=,502,Exporting sqlite database(s)?,7936571,chrismp,closed,0,,,,,3,2019-06-06T16:39:53Z,2021-04-03T05:16:54Z,2019-06-11T18:50:42Z,NONE,,"I'm working on datasette from one computer.
But if I want to work on it from another computer and want to copy the SQLite database(s) already on the Heroku datasette instance, how do I copy the database(s) to the second computer so that I can then update it and push it back online via datasette's command-line code that pushes code to Heroku?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/502/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 849543502,MDU6SXNzdWU4NDk1NDM1MDI=,1289,Speed up tests with pytest-xdist,9599,simonw,closed,0,,,,,3,2021-04-03T00:47:39Z,2021-04-03T03:42:28Z,2021-04-03T03:42:28Z,OWNER,,"I think I can get this working for almost every test, then use the pattern in https://github.com/pytest-dev/pytest-xdist/issues/385#issuecomment-444545641 to opt specific tests out of being run in parallel.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1289/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 741862364,MDU6SXNzdWU3NDE4NjIzNjQ=,1090,Custom widgets for canned query forms,9599,simonw,open,0,,,,,3,2020-11-12T19:21:07Z,2021-03-27T16:25:25Z,,OWNER,,"This is an idea that was cut from the first version of writable canned queries: > I really want the option to use a `