id,node_id,number,title,user,user_label,state,locked,assignee,assignee_label,milestone,milestone_label,comments,created_at,updated_at,closed_at,author_association,pull_request,body,repo,repo_label,type,active_lock_reason,performed_via_github_app,reactions,draft,state_reason 1066474200,I_kwDOCGYnMM4_kRrY,344,Support STRICT tables,9599,simonw,closed,0,,,,,14,2021-11-29T20:32:23Z,2023-12-08T05:22:39Z,2023-12-08T05:22:39Z,OWNER,,"New in SQLite 3.37.0, released a few days ago: https://www.sqlite.org/stricttables.html",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/344/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 2001006157,PR_kwDOCGYnMM5f2OZC,604,Add more STRICT table support,16437338,tkhattra,closed,0,,,,,4,2023-11-19T19:38:53Z,2023-12-08T05:17:20Z,2023-12-08T05:05:27Z,CONTRIBUTOR,simonw/sqlite-utils/pulls/604,"- https://github.com/simonw/sqlite-utils/issues/344#issuecomment-982014776 Make `table.transform()` preserve STRICT mode. ---- :books: Documentation preview :books:: https://sqlite-utils--604.org.readthedocs.build/en/604/ ",140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/604/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 2007893839,I_kwDOCGYnMM53rgdP,605,Insert fails with `Error: Python int too large to convert to SQLite INTEGER`; can we use `NUMERIC` here?,12229877,Zac-HD,closed,0,,,,,1,2023-11-23T10:19:46Z,2023-12-08T05:07:54Z,2023-12-08T05:07:54Z,NONE,,"I'm currently working on a new feature for Hypothesis, where we can dump a tidy jsonlines table of all the test cases we tried - including arguments, outcomes, timings, coverage, etc. Exploring this seems like a perfect cases for `sqlite-utils` and `datasette`, but I pretty quickly ran into an integer overflow problem and don't want to recommend that experience to my users. I originally went to report this as a bug... and then found https://github.com/simonw/sqlite-utils/issues/309#issuecomment-895581038 almost exactly matched my repro 😅 https://github.com/simonw/sqlite-utils/issues/110#issuecomment-626391063 suggests that using `NUMERIC` would avoid this overflow error, although ""If the TEXT value is a well-formed integer literal that is too large to fit in a 64-bit signed integer, it is converted to REAL."" suggests that this would come at the cost of rounding to the nearest float value. Maybe I should just convert large integers to float before writing out my json? 
After a bit more hacking, ""manually cast large integers to float"" seems like a decent solution for my particular case, but having written it up I thought I might as well post this issue anyway - I hope it's useful feedback, and won't mind at all if you close as wontfix if it's not.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/605/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 2029161033,I_kwDOCGYnMM548opJ,606,str and int as aliases for text and integer,9599,simonw,closed,0,,,,,2,2023-12-06T18:35:49Z,2023-12-06T19:44:04Z,2023-12-06T18:49:32Z,OWNER,,"I keep making this mistake: ```bash sqlite-utils add-column content.db assets _since int ``` ``` Usage: sqlite-utils add-column [OPTIONS] PATH TABLE COL_NAME [[integer|float|b lob|text|INTEGER|FLOAT|BLOB|TEXT]] Try 'sqlite-utils add-column -h' for help. Error: Invalid value for '[[integer|float|blob|text|INTEGER|FLOAT|BLOB|TEXT]]': 'int' is not one of 'integer', 'float', 'blob', 'text', 'INTEGER', 'FLOAT', 'BLOB', 'TEXT'. ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/606/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1959278971,PR_kwDOBm6k_c5dpF-F,2202,Bump the python-packages group with 1 update,49699333,dependabot[bot],closed,0,,,,,2,2023-10-24T13:40:21Z,2023-11-08T13:19:03Z,2023-11-08T13:19:01Z,CONTRIBUTOR,simonw/datasette/pulls/2202,"Bumps the python-packages group with 1 update: [black](https://github.com/psf/black).
Release notes

Sourced from black's releases.

23.10.1

Highlights

Preview style

Packaging

Integrations

Documentation

23.10.0

Stable style

Preview style

Configuration

Parser

... (truncated)

[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=black&package-manager=pip&previous-version=23.9.1&new-version=23.10.1)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores) Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`. [//]: # (dependabot-automerge-start) [//]: # (dependabot-automerge-end) ---
Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR: - `@dependabot rebase` will rebase this PR - `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it - `@dependabot merge` will merge this PR after your CI passes on it - `@dependabot squash and merge` will squash and merge this PR after your CI passes on it - `@dependabot cancel merge` will cancel a previously requested merge and block automerging - `@dependabot reopen` will reopen this PR if it is closed - `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually - `@dependabot show ignore conditions` will show all of the ignore conditions of the specified dependency - `@dependabot ignore major version` will close this group update PR and stop Dependabot creating any more for the specific dependency's major version (unless you unignore this specific dependency's major version or upgrade to it yourself) - `@dependabot ignore minor version` will close this group update PR and stop Dependabot creating any more for the specific dependency's minor version (unless you unignore this specific dependency's minor version or upgrade to it yourself) - `@dependabot ignore ` will close this group update PR and stop Dependabot creating any more for the specific dependency (unless you unignore this specific dependency or upgrade to it yourself) - `@dependabot unignore ` will remove all of the ignore conditions of the specified dependency - `@dependabot unignore ` will remove the ignore condition of the specified dependency and ignore conditions
---- :books: Documentation preview :books:: https://datasette--2202.org.readthedocs.build/en/2202/ ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2202/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1976986318,I_kwDOCGYnMM511mrO,599,Cannot find spatialite on arm64 linux,37802088,MikeCoats,closed,0,,,,,1,2023-11-03T22:05:51Z,2023-11-04T01:06:31Z,2023-11-04T00:33:28Z,CONTRIBUTOR,,"Initially, I found an issue in `datasette` where it wouldn’t find `spatialite` when running on my Radxa Rock 5B - an RK3588 powered SBC, running the arm64 build of Debian Bullseye. I confirmed the same behaviour on my Raspberry Pi 4 - a BCM2711 powered SBC, running the arm64 build of Debian Bookworm. ``` $ datasette --load-extension=spatialite example.db Error: Could not find SpatiaLite extension ``` I did some digging and realised the issue originates in this project. Even with the `libsqlite3-mod-spatialite` package installed, `pytest` skips all of the GIS tests in the project. ``` $ apt list --installed | grep spatial […] libsqlite3-mod-spatialite/stable,now 5.0.1-3 arm64 [installed] $ ls -l /usr/lib/*/*spatial* lrwxrwxrwx 1 root root 23 Dec 1 2022 /usr/lib/aarch64-linux-gnu/mod_spatialite.so -> mod_spatialite.so.7.1.0 lrwxrwxrwx 1 root root 23 Dec 1 2022 /usr/lib/aarch64-linux-gnu/mod_spatialite.so.7 -> mod_spatialite.so.7.1.0 -rw-r--r-- 1 root root 7348584 Dec 1 2022 /usr/lib/aarch64-linux-gnu/mod_spatialite.so.7.1.0 ``` ``` $ pytest tests/test_get.py ...... [ 73%] tests/test_gis.py ssssssssssss [ 75%] tests/test_hypothesis.py .... [ 75%] ``` I tracked the issue down to the [`find_sqlite()` function in the `utils.py`](https://github.com/simonw/sqlite-utils/blob/622c3a5a7dd53a09c029e2af40c2643fe7579340/sqlite_utils/utils.py#L60) file. The [`SPATIALITE_PATHS`](https://github.com/simonw/sqlite-utils/blob/main/sqlite_utils/utils.py#L34-L39) array doesn’t have an entry for the location of this module on arm64 linux. 
",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/599/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1884335789,PR_kwDOCGYnMM5Zs0KB,591,Test against Python 3.12 preview,9599,simonw,closed,0,,,,,3,2023-09-06T16:10:00Z,2023-11-04T00:58:03Z,2023-11-04T00:58:02Z,OWNER,simonw/sqlite-utils/pulls/591,"https://dev.to/hugovk/help-test-python-312-beta-1508/ ---- :books: Documentation preview :books:: https://sqlite-utils--591.org.readthedocs.build/en/591/ ",140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/591/reactions"", ""total_count"": 1, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 1, ""eyes"": 0}",0, 1553425465,I_kwDOCGYnMM5cl2Q5,522,Add COLUMN_TYPE_MAPPING for timedelta,81377,maport,closed,0,,,,,0,2023-01-23T16:49:54Z,2023-11-04T00:49:51Z,2023-11-04T00:49:51Z,NONE,,"Currently trying to create a column with Python type `datetime.timedelta` results in an error: ``` >>> from sqlite_utils import Database >>> db = Database(""test.db"") >>> test_tbl = db['test'] >>> test_tbl.insert({'col1': datetime.timedelta()}) Traceback (most recent call last): File """", line 1, in File ""/usr/local/lib/python3.10/dist-packages/sqlite_utils/db.py"", line 2979, in insert return self.insert_all( File ""/usr/local/lib/python3.10/dist-packages/sqlite_utils/db.py"", line 3082, in insert_all self.create( File ""/usr/local/lib/python3.10/dist-packages/sqlite_utils/db.py"", line 1574, in create self.db.create_table( File ""/usr/local/lib/python3.10/dist-packages/sqlite_utils/db.py"", line 961, in create_table sql = self.create_table_sql( File ""/usr/local/lib/python3.10/dist-packages/sqlite_utils/db.py"", line 852, in create_table_sql column_type=COLUMN_TYPE_MAPPING[column_type], KeyError: ``` The reason this would be useful is that `MySQLdb` uses `timedelta` for MySQL `TIME` columns: ``` >>> import MySQLdb >>> conn = MySQLdb.connect(host='database', user='user', passwd='pw') >>> csr = conn.cursor() >>> csr.execute(""SELECT CAST('11:20' AS TIME)"") >>> tuple(csr) ((datetime.timedelta(seconds=40800),),) ``` So currently any attempt to convert a MySQL DB with a `TIME` column using `db-to-sqlite` will result in the above error. I was rather surprised that `MySQLdb` uses `timedelta` for `TIME` columns but I see that [this column type](https://dev.mysql.com/doc/refman/8.0/en/time.html) is intended for time intervals as well as the time of day so it makes sense. 
",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/522/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1919296686,PR_kwDOCGYnMM5bifPC,596,"Fixes mapping for time fields related to mysql, closes #522",4420927,nezhar,closed,0,,,,,1,2023-09-29T13:41:48Z,2023-11-04T00:49:50Z,2023-11-04T00:49:50Z,CONTRIBUTOR,simonw/sqlite-utils/pulls/596,"Adds `COLUMN_TYPE_MAPPING` for `TIME` fields that are mapped as `datetime.timedelta` for MySQL and json represantation for `datetime.timedelta` in order to fix #522 ---- :books: Documentation preview :books:: https://sqlite-utils--596.org.readthedocs.build/en/596/ ",140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/596/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1926729132,PR_kwDOCGYnMM5b7Z_y,598,Fixed issue #433 - CLI eats cursor,62745,spookylukey,closed,0,,,,,2,2023-10-04T18:06:58Z,2023-11-04T00:46:55Z,2023-11-04T00:40:30Z,CONTRIBUTOR,simonw/sqlite-utils/pulls/598,"The issue is that underlying iterator is not fully consumed within the body of the `with file_progress()` block. Instead, that block creates generator expressions like `docs = (dict(zip(headers, row)) for row in reader)` These iterables are consumed later, outside the `with file_progress()` block, which consumes the underlying iterator, and in turn updates the progress bar. This means that the `ProgressBar.__exit__` method gets called before the last time the `ProgressBar.update` method gets called. The result is that the code to make the cursor invisible (inside the `update()` method) is called after the cleanup code to make it visible (in the `__exit__` method). The fix is to move consumption of the `docs` iterators within the progress bar block. ( (An additional fix, to make ProgressBar more robust against this kind of misuse, would to make it refusing to update after its `__exit__` method had been called, just like files cannot be `read()` after they are closed. That requires a in the click library). Note that Github diff obscures the simplicity of this diff, it's just indenting a block of code. ---- :books: Documentation preview :books:: https://sqlite-utils--598.org.readthedocs.build/en/598/ ",140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/598/reactions"", ""total_count"": 1, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 1, ""eyes"": 0}",0, 1239034903,I_kwDOCGYnMM5J2iwX,433,CLI eats my cursor,7908073,chapmanjacobd,closed,0,,,,,10,2022-05-17T18:52:52Z,2023-11-04T00:46:30Z,2023-11-04T00:46:30Z,CONTRIBUTOR,,"I'm not sure why this happens but `sqlite-utils` makes my terminal cursor disappear after running commands like `sqlite-utils insert`. 
I've only noticed this behavior in `sqlite-utils`, not in any other CLI tools I can still type commands after it runs but the text cursor is invisible",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/433/reactions"", ""total_count"": 5, ""+1"": 5, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1977004379,PR_kwDOCGYnMM5elFZf,600,Add spatialite arm64 linux path,37802088,MikeCoats,closed,0,,,,,5,2023-11-03T22:23:26Z,2023-11-04T00:34:33Z,2023-11-04T00:31:49Z,CONTRIBUTOR,simonw/sqlite-utils/pulls/600,"According to both [Debian](https://packages.debian.org/bookworm/arm64/libsqlite3-mod-spatialite/filelist) and [Ubuntu](https://packages.ubuntu.com/mantic/arm64/libsqlite3-mod-spatialite/filelist), the correct “target triple” for arm64 is `aarch64-linux-gnu`, so we should be looking in `/usr/lib/aarch64-linux-gnu` for `mod_spatialite.so`. I can confirm that on both of my Debian arm64 SBCs, `libsqlite3-mod-spatialite` installs to that path. ``` $ ls -l /usr/lib/*/*spatial* lrwxrwxrwx 1 root root 23 Dec 1 2022 /usr/lib/aarch64-linux-gnu/mod_spatialite.so -> mod_spatialite.so.7.1.0 lrwxrwxrwx 1 root root 23 Dec 1 2022 /usr/lib/aarch64-linux-gnu/mod_spatialite.so.7 -> mod_spatialite.so.7.1.0 -rw-r--r-- 1 root root 7348584 Dec 1 2022 /usr/lib/aarch64-linux-gnu/mod_spatialite.so.7.1.0 ``` This is a set of before and after snippets of pytest’s output for this PR. ### Before ``` $ pytest tests/test_get.py ...... [ 73%] tests/test_gis.py ssssssssssss [ 75%] tests/test_hypothesis.py .... [ 75%] ``` ### After ``` $ pytest tests/test_get.py ...... [ 73%] tests/test_gis.py ............ [ 75%] tests/test_hypothesis.py .... [ 75%] ``` Issue: #599 ---- :books: Documentation preview :books:: https://sqlite-utils--600.org.readthedocs.build/en/600/ ",140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/600/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 684961449,MDU6SXNzdWU2ODQ5NjE0NDk=,949,Try out CodeMirror SQL hints,9599,simonw,closed,0,,,,,5,2020-08-24T20:58:21Z,2023-11-03T05:28:58Z,2020-11-01T03:29:48Z,OWNER,,"> It would also be interesting to try out the SQL hint mode, which can autocomplete against tables and columns. This demo shows how to configure that: https://codemirror.net/mode/sql/ > > Some missing documentation: https://stackoverflow.com/questions/20023381/codemirror-how-add-tables-to-sql-hint _Originally posted by @simonw in https://github.com/simonw/datasette/issues/948#issuecomment-679355426_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/949/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 410384988,MDU6SXNzdWU0MTAzODQ5ODg=,411,How to pass named parameter into spatialite MakePoint() function,1055831,dazzag24,closed,0,,,,,3,2019-02-14T16:30:22Z,2023-10-25T13:23:04Z,2019-05-05T12:25:04Z,NONE,,"Hi, datasette version: ""0.26.2"" extensions: spatialite: ""4.4.0-RC0"" sqlite version: ""3.22.0"" I have a table of airports with latitude and longitude columns. I've added spatialite (with KNN support). 
After creating the db using csvs-to-sqlit, I run these commands to setup the spatialite tables: ``` conn.execute('SELECT InitSpatialMetadata(1)') conn.execute(""SELECT AddGeometryColumn('airports', 'point_geom', 4326, 'POINT', 2);"") conn.execute('''UPDATE airports SET point_geom = GeomFromText('POINT('||""longitude""||' '||""latitude""||')',4326);''') conn.execute(""SELECT CreateSpatialIndex('airports', 'point_geom');"") ``` I'm attempting to create a canned query and have this in my metadata.json file: ``` ""find_airports_nearest_to_point"":{ ""sql"":""SELECT a.pos AS rank, b.id, b.name, b.country, b.latitude AS latitude, b.longitude AS longitude, a.distance / 1000.0 AS dist_km FROM KNN AS a JOIN airports AS b ON (b.rowid = a.fid) WHERE f_table_name = \""airports\"" AND ref_geometry = MakePoint( :Long , :Lat ) AND max_items = 10;""} ``` which doesn't seem to perform the templating of the name parameters correctly and I get no results. Have also tired: ``` MakePoint( || :Long || , || :Lat || ) ``` which returns this error: ``` near ""||"": syntax error ``` However I cannot seem to find the correct combination of named parameter syntax (:Lat) or sqlite concatenation operator to make it work. Any ideas if using named parameters inside functions is supported? Thanks Darren",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/411/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1949756141,PR_kwDOBm6k_c5dJF8z,2200,Bump the python-packages group with 1 update,49699333,dependabot[bot],closed,0,,,,,1,2023-10-18T13:25:55Z,2023-10-24T13:40:29Z,2023-10-24T13:40:26Z,CONTRIBUTOR,simonw/datasette/pulls/2200,"Bumps the python-packages group with 1 update: [black](https://github.com/psf/black).
Release notes

Sourced from black's releases.

23.10.0

Stable style

  • Fix comments getting removed from inside parenthesized strings (#3909)

Preview style

  • Fix long lines with power operators getting split before the line length (#3942)
  • Long type hints are now wrapped in parentheses and properly indented when split across multiple lines (#3899)
  • Magic trailing commas are now respected in return types. (#3916)
  • Require one empty line after module-level docstrings. (#3932)
  • Treat raw triple-quoted strings as docstrings (#3947)

Configuration

  • Fix cache versioning logic when BLACK_CACHE_DIR is set (#3937)

Parser

  • Fix bug where attributes named type were not accepted inside match statements (#3950)
  • Add support for PEP 695 type aliases containing lambdas and other unusual expressions (#3949)

Output

  • Black no longer attempts to provide special errors for attempting to format Python 2 code (#3933)
  • Black will more consistently print stacktraces on internal errors in verbose mode (#3938)

Integrations

  • The action output displayed in the job summary is now wrapped in Markdown (#3914)
Commits
  • 9edba85 Prepare release 23.10.0 (#3951)
  • bb58807 Fix parser bug where "type" was misinterpreted as a keyword inside a match (#...
  • 722735d Fix grammar for type alias support (#3949)
  • abe57e3 Treat raw strings like other docstrings (#3947)
  • 1648ac5 Fix long lines with power operator(s) getting splitted before line length (#3...
  • 6f84f65 Migrate mypy config to pyproject.toml (#3936)
  • 3bb9214 CI Test: Deprecating 'Healthcheck.all()' from Hypothesis in fuzz.py (#3945)
  • 935f303 Fix test that was not being run (#3939)
  • b7717c3 Standardise newlines after module-level docstrings (#3932)
  • 7aa37ea Report all stacktraces in verbose mode (#3938)
  • Additional commits viewable in compare view

[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=black&package-manager=pip&previous-version=23.9.1&new-version=23.10.0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores) Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`. [//]: # (dependabot-automerge-start) [//]: # (dependabot-automerge-end) ---
Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR: - `@dependabot rebase` will rebase this PR - `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it - `@dependabot merge` will merge this PR after your CI passes on it - `@dependabot squash and merge` will squash and merge this PR after your CI passes on it - `@dependabot cancel merge` will cancel a previously requested merge and block automerging - `@dependabot reopen` will reopen this PR if it is closed - `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually - `@dependabot show ignore conditions` will show all of the ignore conditions of the specified dependency - `@dependabot ignore major version` will close this group update PR and stop Dependabot creating any more for the specific dependency's major version (unless you unignore this specific dependency's major version or upgrade to it yourself) - `@dependabot ignore minor version` will close this group update PR and stop Dependabot creating any more for the specific dependency's minor version (unless you unignore this specific dependency's minor version or upgrade to it yourself) - `@dependabot ignore ` will close this group update PR and stop Dependabot creating any more for the specific dependency (unless you unignore this specific dependency or upgrade to it yourself) - `@dependabot unignore ` will remove all of the ignore conditions of the specified dependency - `@dependabot unignore ` will remove the ignore condition of the specified dependency and ignore conditions
---- :books: Documentation preview :books:: https://datasette--2200.org.readthedocs.build/en/2200/ ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2200/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1651082214,PR_kwDOBm6k_c5NcFCf,2052,"feat: Javascript Plugin API (Custom panels, column menu items with JS actions)",9020979,hydrosquall,closed,0,9599,simonw,,,20,2023-04-02T20:23:44Z,2023-10-14T17:49:03Z,2023-10-13T00:00:27Z,CONTRIBUTOR,simonw/datasette/pulls/2052,"## Motivation - Allow plugins that add data visualizations [`datasette-vega`](https://github.com/simonw/datasette-vega), [`datasette-leaflet`](https://github.com/simonw/datasette-leaflet), and [`datasette-nteract-data-explorer`](https://github.com/hydrosquall/datasette-nteract-data-explorer) to co-exist safely - Standardize APIs / hooks to ease development for new JS plugin developers (better compat with datasette-lite) through standardized DOM selectors, methods for extending the existing Table UI. This has come up as a feature request several times (see research notes for examples) - Discussion w/ @simonw about a general-purpose Datasette JS API ## Changes Summary: Provide 2 new surface areas for Datasette JS plugin developers. See alpha [documentation](https://github.com/simonw/datasette/pull/2052#issuecomment-1510423051) 1. Custom column header items: 2. Basic ""panels"" controlled by buttons: ### User Facing Changes - Allow creating menu items under table header that triggers JS (instead of opening hrefs per the existing [menu_link](https://docs.datasette.io/en/stable/plugin_hooks.html#menu-links-datasette-actor-request) hook). Items can respond to any column metadata provided by the column header (e.g. label). The proof of concept plugins log data to the console, or copy the column name to clipboard. - Allow plugins to register UI elements in a panel controller. The parent component handles switching the visibility of active plugins. - Because native button elements are used, the panel is keyboard-accessible - use tab / shift-tab to cycle through tab options, and `enter` to select. - There's room to improve the styling, but the focus of this PR is on the API rather than the UX. ### (plugin) Developer Facing Changes - Dispatch a `datasette_init` [CustomEvent](https://developer.mozilla.org/en-US/docs/Web/API/CustomEvent/CustomEvent) when the `datasetteManager` is finished loading. - Provide `manager.registerPlugin` API for adding new functionality that coordinates with Datasette lifecycle events. - Provide a `manager.selectors` map of DOM elements that users may want to hook into. - Updated `table.js` to use refer to these to motivating keeping things in sync - Allow plugins to register themselves with 2 hooks: - `makeColumnActions`: Add items to menu in table column headers. Users can provide a `label`, and either `href` or `onClick` with full access to the metadata for the clicked column (name, type, misc. booleans) - `makeAboveTablePanelConfigs`: Add items to the panel. Each panel has a unique ID (namespaced within that plugin), a render function, and a string label. See [this file](https://github.com/simonw/datasette/blob/2d92b9328022d86505261bcdac419b6ed9cb2236/datasette/static/table-example-plugins.js) for example plugin usage. ### Core Developer Facing Changes - Modified `table.js` to make use of the `datasetteManager` API. 
- Added example plugins to the `demos/plugins` folder, and stored the test js in the `statics/` folder ## Testing For Datasette plugin developers, please see the [alpha-level documentation](https://github.com/simonw/datasette/pull/2052#issuecomment-1510423051) . To run the examples: ```bash datasette serve fixtures.db --plugins-dir=demos/plugins/ ``` Open local server: `http://127.0.0.1:8001/fixtures/facetable` Open to all feedback on this PR, from API design to variable naming, to what additional hooks might be useful for the future. My focus was more on the general shape of the API for developers, rather than on the UX of the test plugins. ## Design notes - The manager tab panel could be a separate plugin if the implementation is too custom. - The `makeColumnHeaderItems` benefits from hooking into the logic of `table.js` - I wanted to offer this to the Datasette core, since the `datasette-manager` would be more powerful if it were connected to lifecycle and JS events that are part of the existing table.js. - Non-goals: - Dependency management (for now) - there's no ""build"" step, we don't know when new plugins will be added. While there are some valid use cases (for example, allow multiple plugins to wait for a global leaflet object to be loaded), I don't see enough use-cases to justify doing this yet. - Enabling single-page-app features - for now, most datasette actions lead to a new page being loaded. SPA development offers some benefits (no page jumping after clicking on a link), but also complexity that doesn't need to be in the core Datasette project. ## Research Notes - Relocated to a [comment](https://github.com/simonw/datasette/pull/2052#issuecomment-1510423215), as this isn't required to review when evaluating the plugin. Including it just for those who are curious. ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2052/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1910269679,I_kwDOBm6k_c5x3Gbv,2196,Discord invite link returns 401,1892194,Olshansk,closed,0,,,,,2,2023-09-24T15:16:54Z,2023-10-13T00:07:08Z,2023-10-12T21:54:54Z,NONE,,"I found the link to the datasette discord channel via [this query](https://github.com/search?q=repo%3Asimonw%2Fdatasette%20discord&type=code). 
The following video should be self explanatory: https://github.com/simonw/datasette/assets/1892194/8cd33e88-bcaa-41f3-9818-ab4d589c3f02 Link for reference: https://discord.com/invite/ktd74dm5mw",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2196/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1930008379,I_kwDOBm6k_c5zCZc7,2197,click-default-group-wheel dependency conflict,1176293,ar-jan,closed,0,,,,,3,2023-10-06T11:49:20Z,2023-10-12T21:53:17Z,2023-10-12T21:53:17Z,NONE,,"I upgraded my dependencies, then ran into this problem running `datasette inspect`: > env/lib/python3.9/site-packages/datasette/cli.py"", line 6, in > from click_default_group import DefaultGroup > ModuleNotFoundError: No module named 'click_default_group' Turns out the released version of datasette still depends on `click-default-group-wheel`, so `click-default-group` doesn't get installed/recognized: ``` $ virtualenv venv $ source venv/bin/activate $ pip install datasette $ pip list | grep click-default-group click-default-group 1.2.4 click-default-group-wheel 1.2.3 $ python -c ""from click_default_group import DefaultGroup"" Traceback (most recent call last): File """", line 1, in ModuleNotFoundError: No module named 'click_default_group' $ pip install --force-reinstall click-default-group ... ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts. datasette 0.64.4 requires click-default-group-wheel>=1.2.2, which is not installed. Successfully installed click-8.1.7 click-default-group-1.2.4 ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2197/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1901483874,PR_kwDOBm6k_c5amULw,2190,"Raise an exception if a ""plugins"" block exists in metadata.json",15178711,asg017,closed,0,,,,,5,2023-09-18T18:08:56Z,2023-10-12T16:20:51Z,2023-10-12T16:20:51Z,CONTRIBUTOR,simonw/datasette/pulls/2190,"refs #2183 #2093 From [this comment](https://github.com/simonw/datasette/pull/2183#issuecomment-1714699724) in #2183: If a `""plugins""` block appears in `metadata.json`, it means that a user hasn't migrated over their plugin configuration from `metadata.json` to `datasette.yaml`, which is a breaking change in Datasette 1.0. This PR will ensure that an error is raised whenever that happens. ---- :books: Documentation preview :books:: https://datasette--2190.org.readthedocs.build/en/2190/ ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2190/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1901768721,PR_kwDOBm6k_c5anSg5,2191,"Move `permissions`, `allow` blocks, canned queries and more out of `metadata.yaml` and into `datasette.yaml`",15178711,asg017,closed,0,,,,,4,2023-09-18T21:21:16Z,2023-10-12T16:16:38Z,2023-10-12T16:16:38Z,CONTRIBUTOR,simonw/datasette/pulls/2191,"The PR moves the following fields from `metadata.yaml` to `datasette.yaml`: ``` permissions allow allow_sql queries extra_css_urls extra_js_urls ``` This is a significant breaking change that users will need to upgrade their `metadata.yaml` files for. 
But the format/locations are similar to the previous version, so it shouldn't be too difficult to upgrade. One note: I'm still working on the Configuration docs, specifically the ""reference"" section. Though it's pretty small, the rest of read to review",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2191/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1907281675,I_kwDOCGYnMM5xrs8L,595,Cascading DELETE not working with Table.delete(pk),123451970,cycle-data,closed,0,,,,,1,2023-09-21T15:46:41Z,2023-09-25T09:38:57Z,2023-09-25T09:38:13Z,NONE,,"Hi ! I noticed that when I am trying to use the delete method of the Table object, the record get properly deleted from the table, but the cascading delete triggers on foreign keys do not activate. `self.db[""contact""].delete(contact_id)` I tried querying the database directly via DB Browser and the triggers work without any issue. Looked up the source code and behind the scene this method is just querying the database normally so I'm not exactly sure where this behavior comes from. Thank you in advance for your time ! ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/595/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1901416155,I_kwDOBm6k_c5xVU7b,2189,Server hang on parallel execution of queries to named in-memory databases,9599,simonw,closed,0,,,,,31,2023-09-18T17:23:18Z,2023-09-21T22:26:21Z,2023-09-21T22:26:21Z,OWNER,,"I've started to encounter a bug where queries to tables inside named in-memory databases sometimes trigger server hangs. I'm still trying to figure out what's going on here - on one occasion I managed to Ctrl+C the server and saw an exception that mentioned a thread lock, but usually hitting Ctrl+C does nothing and I have to `kill -9` the PID instead. This is all running on my M2 Mac. I've seen the bug in the Datasette 1.0 alphas and in Datasette 0.64.3 - but reverting to 0.61 appeared to fix it.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2189/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1662951875,I_kwDOBm6k_c5jHqHD,2057,DeprecationWarning: pkg_resources is deprecated as an API,9599,simonw,closed,0,,,,,25,2023-04-11T17:41:20Z,2023-09-21T22:09:10Z,2023-09-21T22:09:10Z,OWNER,,"Got this running tests against Python 3.11. ``` ../../../.local/share/virtualenvs/datasette-big-local-6Yn-280V/lib/python3.11/site-packages/datasette/app.py:14: in import pkg_resources ../../../.local/share/virtualenvs/datasette-big-local-6Yn-280V/lib/python3.11/site-packages/pkg_resources/__init__.py:121: in warnings.warn(""pkg_resources is deprecated as an API"", DeprecationWarning) E DeprecationWarning: pkg_resources is deprecated as an API ``` I ran with `pytest -Werror --pdb -x` to get the debugger for that warning, but it turned out searching the code worked better. 
It's used in these two places: https://github.com/simonw/datasette/blob/5890a20c374fb0812d88c9b0ef26a838bfa06c76/datasette/plugins.py#L43-L50 https://github.com/simonw/datasette/blob/5890a20c374fb0812d88c9b0ef26a838bfa06c76/datasette/app.py#L1037",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2057/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1907695234,I_kwDOBm6k_c5xtR6C,2194,"Deploy failing with ""plugins/alternative_route.py: Not a directory""",9599,simonw,closed,0,,,,,8,2023-09-21T20:17:49Z,2023-09-21T22:08:19Z,2023-09-21T22:08:19Z,OWNER,,"https://github.com/simonw/datasette/actions/runs/6266449018/job/17017460074 This is a bit of a mystery, I don't think I've changed anything recently that could have broken this.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2194/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1907655261,I_kwDOBm6k_c5xtIJd,2193,"""Test DATASETTE_LOAD_PLUGINS"" test shows errors but did not fail the CI run",9599,simonw,closed,0,,,,,6,2023-09-21T19:49:34Z,2023-09-21T21:56:43Z,2023-09-21T21:56:43Z,OWNER,,"> That passed on 3.8 but should have failed: https://github.com/simonw/datasette/actions/runs/6266341481/job/17017099801 - the ""Test DATASETTE_LOAD_PLUGINS"" test shows errors but did not fail the CI run. _Originally posted by @simonw in https://github.com/simonw/datasette/issues/2057#issuecomment-1730201226_ ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2193/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1896578249,PR_kwDOBm6k_c5aWACP,2185,Bump the python-packages group with 3 updates,49699333,dependabot[bot],closed,0,,,,,1,2023-09-14T13:27:40Z,2023-09-20T22:11:25Z,2023-09-20T22:11:24Z,CONTRIBUTOR,simonw/datasette/pulls/2185,"Bumps the python-packages group with 3 updates: [sphinx](https://github.com/sphinx-doc/sphinx), [furo](https://github.com/pradyunsg/furo) and [black](https://github.com/psf/black). Updates `sphinx` from 7.2.5 to 7.2.6
Release notes

Sourced from sphinx's releases.

Sphinx 7.2.6

Changelog: https://www.sphinx-doc.org/en/master/changes.html

Changelog

Sourced from sphinx's changelog.

Release 7.2.6 (released Sep 13, 2023)

Bugs fixed

  • #11679: Add the :envvar:!SPHINX_AUTODOC_RELOAD_MODULES environment variable, which if set reloads modules when using autodoc with TYPE_CHECKING = True. Patch by Matt Wozniski and Adam Turner.
  • #11679: Use :py:func:importlib.reload to reload modules in autodoc. Patch by Matt Wozniski and Adam Turner.

Updates `furo` from 2023.8.19 to 2023.9.10
Changelog

Sourced from furo's changelog.

Changelog

2023.09.10 -- Zesty Zaffre

  • Make asset hash injection idempotent, fixing Sphinx 6 compatibility.
  • Fix the check for HTML builders, fixing non-HTML Read the Docs builds.

2023.08.19 -- Xenolithic Xanadu

  • Fix missing search context with Sphinx 7.2, for dirhtml builds.
  • Drop support for Python 3.7.
  • Present configuration errors in a better format -- thanks @​AA-Turner!
  • Bump require_sphinx() to Sphinx 6.0, in line with dependency changes in Unassuming Ultramarine.

2023.08.17 -- Wonderous White

  • Fix compatibility with Sphinx 7.2.0 and 7.2.1.

2023.07.26 -- Vigilant Volt

  • Fix compatibility with Sphinx 7.1.
  • Improve how content overflow is handled.
  • Improve how literal blocks containing inline code are handled.

2023.05.20 -- Unassuming Ultramarine

  • ✨ Add support for Sphinx 7.
  • Drop support for Sphinx 5.
  • Improve the screen-reader label for sidebar collapse.
  • Make it easier to create derived themes from Furo.
  • Bump all JS dependencies (NodeJS and npm packages).

2023.03.27 -- Tasty Tangerine

  • Regenerate with newer version of sphinx-theme-builder, to fix RECORD hashes.
  • Add missing class to Font Awesome examples

2023.03.23 -- Sassy Saffron

... (truncated)

Commits
  • 2718ca4 Prepare release: 2023.09.10
  • c22c99d Update changelog
  • c37e849 Quote a not-runtime-generic type annotation
  • 9cfdf44 Rework infrastructure for linting
  • 5abeb9f Fix the check for HTML builders
  • ee2ab54 Tweak how tests are run with nox
  • cdae236 Test against Sphinx minor versions in CI
  • 9e40071 Make asset hash injection idempotent
  • aab86f4 Revert "Exclude incompatible Sphinx releases (#711)"
  • 4dd6eec Exclude incompatible Sphinx releases (#711)
  • Additional commits viewable in compare view

Updates `black` from 23.7.0 to 23.9.1
Release notes

Sourced from black's releases.

23.9.1

Due to various issues, the previous release (23.9.0) did not include compiled mypyc wheels, which make Black significantly faster. These issues have now been fixed, and this release should come with compiled wheels once again.

There will be no wheels for Python 3.12 due to a bug in mypyc. We will provide 3.12 wheels in a future release as soon as the mypyc bug is fixed.

Packaging

  • Upgrade to mypy 1.5.1 (#3864)

Performance

  • Store raw tuples instead of NamedTuples in Black's cache, improving performance and decreasing the size of the cache (#3877)

23.9.0

Preview style

  • More concise formatting for dummy implementations (#3796)
  • In stub files, add a blank line between a statement with a body (e.g an if sys.version_info > (3, x):) and a function definition on the same level (#3862)
  • Fix a bug whereby spaces were removed from walrus operators within subscript(#3823)

Configuration

  • Black now applies exclusion and ignore logic before resolving symlinks (#3846)

Performance

  • Avoid importing IPython if notebook cells do not contain magics (#3782)
  • Improve caching by comparing file hashes as fallback for mtime and size (#3821)

Blackd

  • Fix an issue in blackd with single character input (#3558)

Integrations

  • Black now has an official pre-commit mirror. Swapping https://github.com/psf/black to https://github.com/psf/black-pre-commit-mirror in your .pre-commit-config.yaml will make Black about 2x faster (#3828)
  • The .black.env folder specified by ENV_PATH will now be removed on the completion of the GitHub Action (#3759)
Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`. [//]: # (dependabot-automerge-start) [//]: # (dependabot-automerge-end) ---
Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR: - `@dependabot rebase` will rebase this PR - `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it - `@dependabot merge` will merge this PR after your CI passes on it - `@dependabot squash and merge` will squash and merge this PR after your CI passes on it - `@dependabot cancel merge` will cancel a previously requested merge and block automerging - `@dependabot reopen` will reopen this PR if it is closed - `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually - `@dependabot show ignore conditions` will show all of the ignore conditions of the specified dependency - `@dependabot ignore major version` will close this group update PR and stop Dependabot creating any more for the specific dependency's major version (unless you unignore this specific dependency's major version or upgrade to it yourself) - `@dependabot ignore minor version` will close this group update PR and stop Dependabot creating any more for the specific dependency's minor version (unless you unignore this specific dependency's minor version or upgrade to it yourself) - `@dependabot ignore ` will close this group update PR and stop Dependabot creating any more for the specific dependency (unless you unignore this specific dependency or upgrade to it yourself) - `@dependabot unignore ` will remove all of the ignore conditions of the specified dependency - `@dependabot unignore ` will remove the ignore condition of the specified dependency and ignore conditions
---- :books: Documentation preview :books:: https://datasette--2185.org.readthedocs.build/en/2185/ ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2185/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1903932086,PR_kwDOBm6k_c5aumyn,2192,Stop using parallel SQL queries for tables,9599,simonw,closed,0,,,,,1,2023-09-20T01:28:43Z,2023-09-20T22:10:56Z,2023-09-20T22:10:55Z,OWNER,simonw/datasette/pulls/2192,"Refs: - #2189 ---- :books: Documentation preview :books:: https://datasette--2192.org.readthedocs.build/en/2192/ ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2192/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1890593563,PR_kwDOBm6k_c5aBx3g,2182,Bump the python-packages group with 2 updates,49699333,dependabot[bot],closed,0,,,,,1,2023-09-11T14:01:25Z,2023-09-14T13:27:30Z,2023-09-14T13:27:28Z,CONTRIBUTOR,simonw/datasette/pulls/2182,"Bumps the python-packages group with 2 updates: [furo](https://github.com/pradyunsg/furo) and [black](https://github.com/psf/black). Updates `furo` from 2023.8.19 to 2023.9.10
Changelog

Sourced from furo's changelog.

Changelog

2023.09.10 -- Zesty Zaffre

  • Make asset hash injection idempotent, fixing Sphinx 6 compatibility.
  • Fix the check for HTML builders, fixing non-HTML Read the Docs builds.

2023.08.19 -- Xenolithic Xanadu

  • Fix missing search context with Sphinx 7.2, for dirhtml builds.
  • Drop support for Python 3.7.
  • Present configuration errors in a better format -- thanks @​AA-Turner!
  • Bump require_sphinx() to Sphinx 6.0, in line with dependency changes in Unassuming Ultramarine.

2023.08.17 -- Wonderous White

  • Fix compatibility with Sphinx 7.2.0 and 7.2.1.

2023.07.26 -- Vigilant Volt

  • Fix compatibility with Sphinx 7.1.
  • Improve how content overflow is handled.
  • Improve how literal blocks containing inline code are handled.

2023.05.20 -- Unassuming Ultramarine

  • ✨ Add support for Sphinx 7.
  • Drop support for Sphinx 5.
  • Improve the screen-reader label for sidebar collapse.
  • Make it easier to create derived themes from Furo.
  • Bump all JS dependencies (NodeJS and npm packages).

2023.03.27 -- Tasty Tangerine

  • Regenerate with newer version of sphinx-theme-builder, to fix RECORD hashes.
  • Add missing class to Font Awesome examples

2023.03.23 -- Sassy Saffron

... (truncated)

Commits
  • 2718ca4 Prepare release: 2023.09.10
  • c22c99d Update changelog
  • c37e849 Quote a not-runtime-generic type annotation
  • 9cfdf44 Rework infrastructure for linting
  • 5abeb9f Fix the check for HTML builders
  • ee2ab54 Tweak how tests are run with nox
  • cdae236 Test against Sphinx minor versions in CI
  • 9e40071 Make asset hash injection idempotent
  • aab86f4 Revert "Exclude incompatible Sphinx releases (#711)"
  • 4dd6eec Exclude incompatible Sphinx releases (#711)
  • Additional commits viewable in compare view

Updates `black` from 23.7.0 to 23.9.1
Release notes

Sourced from black's releases.

23.9.1

Due to various issues, the previous release (23.9.0) did not include compiled mypyc wheels, which make Black significantly faster. These issues have now been fixed, and this release should come with compiled wheels once again.

There will be no wheels for Python 3.12 due to a bug in mypyc. We will provide 3.12 wheels in a future release as soon as the mypyc bug is fixed.

Packaging

  • Upgrade to mypy 1.5.1 (#3864)

Performance

  • Store raw tuples instead of NamedTuples in Black's cache, improving performance and decreasing the size of the cache (#3877)

23.9.0

Preview style

  • More concise formatting for dummy implementations (#3796)
  • In stub files, add a blank line between a statement with a body (e.g an if sys.version_info > (3, x):) and a function definition on the same level (#3862)
  • Fix a bug whereby spaces were removed from walrus operators within subscript(#3823)

Configuration

  • Black now applies exclusion and ignore logic before resolving symlinks (#3846)

Performance

  • Avoid importing IPython if notebook cells do not contain magics (#3782)
  • Improve caching by comparing file hashes as fallback for mtime and size (#3821)

Blackd

  • Fix an issue in blackd with single character input (#3558)

Integrations

  • Black now has an official pre-commit mirror. Swapping https://github.com/psf/black to https://github.com/psf/black-pre-commit-mirror in your .pre-commit-config.yaml will make Black about 2x faster (#3828)
  • The .black.env folder specified by ENV_PATH will now be removed on the completion of the GitHub Action (#3759)
Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`. [//]: # (dependabot-automerge-start) [//]: # (dependabot-automerge-end) ---
Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR: - `@dependabot rebase` will rebase this PR - `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it - `@dependabot merge` will merge this PR after your CI passes on it - `@dependabot squash and merge` will squash and merge this PR after your CI passes on it - `@dependabot cancel merge` will cancel a previously requested merge and block automerging - `@dependabot reopen` will reopen this PR if it is closed - `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually - `@dependabot show ignore conditions` will show all of the ignore conditions of the specified dependency - `@dependabot ignore major version` will close this group update PR and stop Dependabot creating any more for the specific dependency's major version (unless you unignore this specific dependency's major version or upgrade to it yourself) - `@dependabot ignore minor version` will close this group update PR and stop Dependabot creating any more for the specific dependency's minor version (unless you unignore this specific dependency's minor version or upgrade to it yourself) - `@dependabot ignore ` will close this group update PR and stop Dependabot creating any more for the specific dependency (unless you unignore this specific dependency or upgrade to it yourself) - `@dependabot unignore ` will remove all of the ignore conditions of the specified dependency - `@dependabot unignore ` will remove the ignore condition of the specified dependency and ignore conditions
---- :books: Documentation preview :books:: https://datasette--2182.org.readthedocs.build/en/2182/ ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2182/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1891212159,PR_kwDOBm6k_c5aD33C,2183,`datasette.yaml` plugin support,15178711,asg017,closed,0,,,,,4,2023-09-11T20:26:04Z,2023-09-13T21:06:25Z,2023-09-13T21:06:25Z,CONTRIBUTOR,simonw/datasette/pulls/2183,"Part of #2093 In #2149 , we ported over `""settings.json""` into the new `datasette.yaml` config file, with a top-level `""settings""` key. This PR ports over plugin configuration into top-level `""plugins""` key, as well as nested database/table plugin config. From now on, no plugin-related configuration is allowed in `metadata.yaml`, and must be in `datasette.yaml` in this new format. This is a pretty significant breaking change. Thankfully, you should be able to copy-paste your legacy plugin key/values into the new `datasette.yaml` format. An example of what `datasette.yaml` would look like with this new plugin config: ```yaml plugins: datasette-my-plugin: config_key: value databases: fixtures: plugins: datasette-my-plugin: config_key: fixtures-db-value tables: students: plugins: datasette-my-plugin: config_key: fixtures-students-table-value ``` As an additional benefit, this now works with the new `-s` flag: ```bash datasette --memory -s 'plugins.datasette-my-plugin.config_key' new_value ``` Marked as a ""Draft"" right now until I add better documentation. We also should have a plan for the next alpha release to document and publicize this change, especially for plugin authors (since their docs will have to change to say `datasette.yaml` instead of `metadata.yaml` ---- :books: Documentation preview :books:: https://datasette--2183.org.readthedocs.build/en/2183/ ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2183/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1886771493,I_kwDOCGYnMM5wddkl,592,`table.transform()` should preserve `rowid` values,9599,simonw,closed,0,,,,,6,2023-09-08T00:42:38Z,2023-09-10T17:46:41Z,2023-09-09T00:45:32Z,OWNER,,"I just spotted a bug when using https://datasette.io/plugins/datasette-configure-fts and https://datasette.io/plugins/datasette-edit-schema at the same time. Steps to reproduce: - Configure FTS for a table, then run a test search - Edit the schema for that table and change the order of columns - Run the test search again I got the wrong search results, which I think is because the `_fts` table pointed to the first table by `rowid` but those `rowid` values were entirely rewritten as a consequence of running `table.transform()` on the table. Reconfiguring FTS on the table fixed the problem. 
I think `table.transform()` should be able to preserve `rowid` values.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/592/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1886783150,PR_kwDOCGYnMM5Z1H1d,593,".transform() now preserves rowid values, refs #592",9599,simonw,closed,0,,,,,1,2023-09-08T01:02:28Z,2023-09-10T17:44:59Z,2023-09-09T00:45:30Z,OWNER,simonw/sqlite-utils/pulls/593,"Refs: - #592 - [x] Tests against weird shaped tables I need to test that this works against: - `rowid` tables - Tables that have a column called `rowid` even though they are not rowid tables ---- :books: Documentation preview :books:: https://sqlite-utils--593.org.readthedocs.build/en/593/ ",140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/593/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1886791100,I_kwDOBm6k_c5wdiW8,2180,Plugin hook: `actors_from_ids()`,9599,simonw,closed,0,,,,,6,2023-09-08T01:16:41Z,2023-09-10T17:44:14Z,2023-09-08T04:28:03Z,OWNER,,"In building Datasette Cloud we realized that a bunch of the features we are building need a way of resolving an actor ID to the actual actor, in order to display something more interesting than just an integer ID. Social plugins in particular need this - comments by X, CSV uploaded by X, that kind of thing. I think the solution is a new plugin hook: `actors_from_ids(datasette, ids)` which can return a list of actor dictionaries. The default implementation can return `[{""id"": ""...""}]` for the IDs passed to it. Pluggy has a `firstresult=True` option which is relevant here, since this is the first plugin hook we will have implemented where only one plugin should provide an answer.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2180/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1874255116,I_kwDOBm6k_c5vtt0M,2164,Ability to only load a specific list of plugins,9599,simonw,closed,0,,,,,1,2023-08-30T19:33:41Z,2023-09-08T04:35:46Z,2023-08-30T22:12:27Z,OWNER,,"I'm going to try and get this working through an environment variable, so that you can start Datasette and it will only load a subset of plugins including those that use the `register_commands()` hook. Initial research on this: - https://github.com/pytest-dev/pluggy/issues/422",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2164/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1886812002,PR_kwDOBm6k_c5Z1N2L,2181,actors_from_ids plugin hook and datasette.actors_from_ids() method,9599,simonw,closed,0,,,,,3,2023-09-08T01:51:07Z,2023-09-08T04:24:00Z,2023-09-08T04:23:59Z,OWNER,simonw/datasette/pulls/2181,"Refs: - #2180 This plugin hook is feature complete - including documentation and tests. I'm not going to land it in Datasette `main` until we've used it at least once though, which should happen promptly in development for [Datasette Cloud](https://www.datasette.cloud/). 
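As a rough illustration of the hook described in #2180 above (this is not the code in this PR), a plugin implementation could look something like the following minimal sketch; the only behaviour taken from the issue is the default of echoing each ID back as a bare actor dictionary, everything else is an assumption:

```python
from datasette import hookimpl


# Minimal sketch of the actors_from_ids(datasette, ids) hook proposed in
# #2180: echo each ID back as a bare actor dictionary, which is the default
# behaviour the issue describes. A real plugin would instead look the IDs up
# in its own storage (a users table, an external API, etc).
@hookimpl
def actors_from_ids(datasette, ids):
    return [{'id': actor_id} for actor_id in ids]
```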
---- :books: Documentation preview :books:: https://datasette--2181.org.readthedocs.build/en/2181/ ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2181/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1886350562,I_kwDOBm6k_c5wb2zi,2178,Don't show foreign key links to tables the user cannot access,9599,simonw,closed,0,,,,,5,2023-09-07T17:56:41Z,2023-09-07T23:28:27Z,2023-09-07T23:28:27Z,OWNER,,"Spotted this problem while working on this plugin: - https://github.com/simonw/datasette-public It's possible to make a table public to any users - but then you may end up with situations like this: That table is public, but the foreign key links go to tables that are NOT public. We're also leaking the names of the values in those private tables here, which we shouldn't do. So this is a tiny bit of an information leak. Since this only affects people who have configured a table to be public that has foreign keys to a table that is private I don't think this is worth issuing a vulnerability report about - I very much doubt anyone is running Datasette configured in a way that could result in problems because of this.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2178/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1886649402,I_kwDOBm6k_c5wc_w6,2179,Flaky test: test_hidden_sqlite_stat1_table,9599,simonw,closed,0,,,,,0,2023-09-07T22:48:43Z,2023-09-07T22:51:19Z,2023-09-07T22:51:19Z,OWNER,,"This test here: https://github.com/simonw/datasette/blob/fbcb103c0cb6668018ace539a01a6a1f156e8d6a/tests/test_api.py#L1011-L1020 It failed for me like this: `E AssertionError: assert [('normal', False), ('sqlite_stat1', True), ('sqlite_stat4', True)] in ([('normal', False), ('sqlite_stat1', True)],)` Looks like some builds of SQLite include a `sqlite_stat4` table.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2179/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1875519316,PR_kwDOBm6k_c5ZPO5y,2166,Bump the python-packages group with 1 update,49699333,dependabot[bot],closed,0,,,,,1,2023-08-31T13:19:57Z,2023-09-06T16:34:32Z,2023-09-06T16:34:31Z,CONTRIBUTOR,simonw/datasette/pulls/2166,"Bumps the python-packages group with 1 update: [sphinx](https://github.com/sphinx-doc/sphinx).
Release notes

Sourced from sphinx's releases.

Sphinx 7.2.5

Changelog: https://www.sphinx-doc.org/en/master/changes.html

Changelog

Sourced from sphinx's changelog.

Release 7.2.5 (released Aug 30, 2023)

Bugs fixed

  • #11645: Fix a regression preventing autodoc from importing modules within packages that make use of if typing.TYPE_CHECKING: to guard circular imports needed by type checkers. Patch by Matt Wozniski.
  • #11634: Fixed inheritance diagram relative link resolution for sibling files in a subdirectory. Patch by Albert Shih.
  • #11659: Allow ?config=... in :confval:mathjax_path.
  • #11654: autodoc: Fail with a more descriptive error message when an object claims to be an instance of type, but is not a class. Patch by James Braza.
  • #11620: Cease emitting :event:source-read events for files read via the :dudir:include directive.
  • #11620: Add a new :event:include-read for observing and transforming the content of included files via the :dudir:include directive.
  • #11627: Restore support for copyright lines of the form YYYY when SOURCE_DATE_EPOCH is set.
Commits

[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=sphinx&package-manager=pip&previous-version=7.2.4&new-version=7.2.5)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores) Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`. [//]: # (dependabot-automerge-start) [//]: # (dependabot-automerge-end) ---
Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR: - `@dependabot rebase` will rebase this PR - `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it - `@dependabot merge` will merge this PR after your CI passes on it - `@dependabot squash and merge` will squash and merge this PR after your CI passes on it - `@dependabot cancel merge` will cancel a previously requested merge and block automerging - `@dependabot reopen` will reopen this PR if it is closed - `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually - `@dependabot show ignore conditions` will show all of the ignore conditions of the specified dependency - `@dependabot ignore major version` will close this group update PR and stop Dependabot creating any more for the specific dependency's major version (unless you unignore this specific dependency's major version or upgrade to it yourself) - `@dependabot ignore minor version` will close this group update PR and stop Dependabot creating any more for the specific dependency's minor version (unless you unignore this specific dependency's minor version or upgrade to it yourself) - `@dependabot ignore ` will close this group update PR and stop Dependabot creating any more for the specific dependency (unless you unignore this specific dependency or upgrade to it yourself) - `@dependabot unignore ` will remove all of the ignore conditions of the specified dependency - `@dependabot unignore ` will remove the ignore condition of the specified dependency and ignore conditions
---- :books: Documentation preview :books:: https://datasette--2166.org.readthedocs.build/en/2166/ ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2166/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1884333600,PR_kwDOBm6k_c5Zszqk,2175,Test against Python 3.12 preview,9599,simonw,closed,0,,,,,0,2023-09-06T16:09:05Z,2023-09-06T16:16:28Z,2023-09-06T16:16:27Z,OWNER,simonw/datasette/pulls/2175,"https://dev.to/hugovk/help-test-python-312-beta-1508/ ---- :books: Documentation preview :books:: https://datasette--2175.org.readthedocs.build/en/2175/ ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2175/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1883055640,PR_kwDOBm6k_c5ZociX,2173,click-default-group>=1.2.3,9599,simonw,closed,0,,,,,3,2023-09-06T02:33:28Z,2023-09-06T02:50:10Z,2023-09-06T02:50:10Z,OWNER,simonw/datasette/pulls/2173,"Now available as a wheel: - https://github.com/click-contrib/click-default-group/issues/21 ---- :books: Documentation preview :books:: https://datasette--2173.org.readthedocs.build/en/2173/ ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2173/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 336464733,MDU6SXNzdWUzMzY0NjQ3MzM=,328,"Installation instructions, including how to use the docker image",9599,simonw,closed,0,,,,,4,2018-06-28T03:59:33Z,2023-09-05T14:10:39Z,2018-06-28T04:02:10Z,OWNER,,,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/328/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1292370469,I_kwDOBm6k_c5NCAIl,1765,Document plugins providing new plugin hook-,9599,simonw,closed,0,,,,,1,2022-07-03T17:05:14Z,2023-08-31T23:08:24Z,2023-08-31T23:06:31Z,OWNER,,I've used this pattern twice now: https://til.simonwillison.net/datasette/register-new-plugin-hooks - in `datasette-graphql` and `datasette-low-disk-space-hook`. 
I should describe the pattern on https://docs.datasette.io/en/stable/writing_plugins.html,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1765/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1876407598,I_kwDOBm6k_c5v17Uu,2169,execute-sql on a database should imply view-database/view-permission,9599,simonw,closed,0,,,,,0,2023-08-31T22:45:56Z,2023-08-31T22:46:28Z,2023-08-31T22:46:28Z,OWNER,,"I noticed that a token with `execute-sql` permission alone did not work, because it was not allowed to view the instance of the database.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2169/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1871935751,I_kwDOD079W85vk3kH,40, ImportError: cannot import name 'formatargspec' from 'inspect',36752421,hosslikw,closed,0,,,,,0,2023-08-29T15:36:31Z,2023-08-31T03:18:07Z,2023-08-31T03:18:06Z,NONE,,"I get the following error when running ""pip3 install dogsheep-photos"" "" from inspect import ismethod, isclass, formatargspec ImportError: cannot import name 'formatargspec' from 'inspect' (/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/inspect.py). Did you mean: 'formatargvalues'?"" Python 3.12.0rc1 sqlite 3.43.0 datasette, version 0.64.3",256834907,dogsheep-photos,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-photos/issues/40/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 742041667,MDU6SXNzdWU3NDIwNDE2Njc=,1092,Make cascading permission checks available to plugins,9599,simonw,closed,0,,,,,1,2020-11-13T01:02:55Z,2023-08-30T22:17:42Z,2023-08-30T22:17:41Z,OWNER,,"The `BaseView` class has a method for cascading permission checks, but it's not easily accessible to plugins. https://github.com/simonw/datasette/blob/5eb8e9bf250b26e30b017d39a392c33973997656/datasette/views/base.py#L75-L99 This leaves plugins like `datasette-graphql` having to implement their own versions of this logic, which is bad: https://github.com/simonw/datasette-graphql/issues/65 > First check `view-database` - if that says `False` then disallow access, if it says `True` then allow access. If it says `None` check `view-instance`. This should become a supported API that plugins are encouraged to use.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1092/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 787098146,MDU6SXNzdWU3ODcwOTgxNDY=,1190,`datasette publish upload` mechanism for uploading databases to an existing Datasette instance,1024355,tomershvueli,closed,0,,,,,3,2021-01-15T18:18:42Z,2023-08-30T22:16:39Z,2023-08-30T22:16:38Z,NONE,,"If I have a self-hosted instance of Datasette up and running, I'd like to be able to the use the CLI to publish databases to that instance, not only Google or Heroku. Ideally there'd be a `url` parameter or something similar to which one could point the publish command to their instance. 
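To make the cascading rule quoted in #1092 above concrete, here is a standalone sketch of that logic; `check_permission` is a stand-in assumption for however a single permission check gets resolved, and this is not Datasette's actual `BaseView` code:

```python
# Sketch of the cascade quoted in #1092: each check can answer True (allow),
# False (deny) or None (no opinion, so fall through to the next check).
async def cascading_allowed(check_permission, actor):
    for action in ('view-database', 'view-instance'):
        verdict = await check_permission(actor, action)
        if verdict is True:
            return True
        if verdict is False:
            return False
        # verdict is None: no opinion, try the next, broader check
    return False  # nothing expressed an opinion, deny by default
```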
",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1190/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1874327336,PR_kwDOBm6k_c5ZLMSe,2165,DATASETTE_LOAD_PLUGINS environment variable for loading specific plugins,9599,simonw,closed,0,,,,,6,2023-08-30T20:33:30Z,2023-08-30T22:12:25Z,2023-08-30T22:12:25Z,OWNER,simonw/datasette/pulls/2165,"- #2164 TODO: - [x] Automated tests - [ ] Documentation - [x] Make sure `DATASETTE_LOAD_PLUGINS=''` works for loading zero plugins",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2165/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1872043170,I_kwDOBm6k_c5vlRyi,2163,Rename core_X to catalog_X in the internals,9599,simonw,closed,0,,,,,1,2023-08-29T16:45:00Z,2023-08-29T17:01:31Z,2023-08-29T17:01:31Z,OWNER,,"Discussed with Alex this morning. We think the American spelling is fine here (it's shorter than `catalogue`) and that it's a slightly less lazy name than `core_`. Follows: - https://github.com/simonw/datasette/issues/2157",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2163/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1805076818,I_kwDOBm6k_c5rl0lS,2102,API tokens with view-table but not view-database/view-instance cannot access the table,9599,simonw,closed,0,9599,simonw,,,20,2023-07-14T15:34:27Z,2023-08-29T16:32:36Z,2023-08-29T16:32:35Z,OWNER,,"> Spotted a problem while working on this: if you grant a token access to view table for a specific table but don't also grant view database and view instance permissions, that token is useless. > > This was a deliberate design decision in Datasette - it's documented on https://docs.datasette.io/en/1.0a2/authentication.html#access-permissions-in-metadata > >> If a user cannot access a specific database, they will not be able to access tables, views or queries within that database. If a user cannot access the instance they will not be able to access any of the databases, tables, views or queries. > > I'm now second-guessing if this was a good decision. 
_Originally posted by @simonw in https://github.com/simonw/datasette-auth-tokens/issues/7#issuecomment-1636031702_ ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2102/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1865281760,PR_kwDOBm6k_c5Ys3C5,2154,Cascade for restricted token view-table/view-database/view-instance operations,9599,simonw,closed,0,,,,,8,2023-08-24T14:24:23Z,2023-08-29T16:32:35Z,2023-08-29T16:32:34Z,OWNER,simonw/datasette/pulls/2154,"Refs: - #2102 Also includes a prototype implementation of `--actor option` which I'm using for testing this, from: - #2153 ---- :books: Documentation preview :books:: https://datasette--2154.org.readthedocs.build/en/2154/ ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2154/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1870672704,PR_kwDOBm6k_c5Y-7Em,2162,"Add new `--internal internal.db` option, deprecate legacy `_internal` database",15178711,asg017,closed,0,,,,,4,2023-08-29T00:05:07Z,2023-08-29T03:24:23Z,2023-08-29T03:24:23Z,CONTRIBUTOR,simonw/datasette/pulls/2162,"refs #2157 This PR adds a new `--internal` option to datasette serve. If provided, it is the path to a persistent internal database that Datasette core and Datasette plugins can use to store data, as discussed in the proposal issue. This PR also removes and deprecates the previous in-memory `_internal` database. Those tables now appear in the `internal` database, with `core_` prefixes (ex `tables` in `_internal` is now `core_tables` in `internal`). ## A note on the new `core_` tables However, one important notes about those new `core_` tables: If a `--internal` DB is passed in, that means those `core_` tables will persist across multiple Datasette instances. This wasn't the case before, since `_internal` was always an in-memory database created from scratch. I tried to put those `core_` tables as `TEMP` tables - after all, there's always one 1 `internal` DB connection at a time, so I figured it would work. But, since we use the `Database()` wrapper for the internal DB, it has two separate connections: a default read-only connection and a write connection that is created when a write operation occurs. Which meant the `TEMP` tables would be created by the write connection, but not available in the read-only connection. So I had a brillant idea: Attach an in-memory named database with `cache=shared`, and create those tables there! ```sql ATTACH DATABASE 'file:datasette_internal_core?mode=memory&cache=shared' AS core; ``` We'd run this on both the read-only connection and the write-only connection. That way, those tables would stay in memory, they'd communicate with the `cache=shared` feature, and we'd be good to go. However, I couldn't find an easy way to run a `ATTACH DATABASE` command on the read-only query. Using `Database()` as a wrapper for the internal DB is pretty limiting - it's meant for Datasette ""data"" databases, where we want multiple readers and possibly 1 write connection at a time. But the internal database doesn't really require that kind of support - I think we could get away with a single read/write connection, but it seemed like too big of a rabbithole to go through now. 
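For anyone who wants to try the `cache=shared` idea above outside of Datasette, here is a rough standalone sketch using plain `sqlite3` (no `Database()` wrapper); the attached database name follows the SQL shown in this description, the rest is illustrative:

```python
import sqlite3

# Two independent connections attach the same named in-memory database.
# Because of cache=shared they point at the same data, so tables created
# through the write connection are visible to the read connection.
SHARED = 'file:datasette_internal_core?mode=memory&cache=shared'

write_conn = sqlite3.connect(':memory:', uri=True)
read_conn = sqlite3.connect(':memory:', uri=True)

for conn in (write_conn, read_conn):
    conn.execute('ATTACH DATABASE ? AS core', (SHARED,))

write_conn.execute('CREATE TABLE core.core_tables (name TEXT)')
write_conn.execute('INSERT INTO core.core_tables VALUES (?)', ('example',))
write_conn.commit()

print(read_conn.execute('SELECT name FROM core.core_tables').fetchall())
# -> [('example',)]
```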
---- :books: Documentation preview :books:: https://datasette--2162.org.readthedocs.build/en/2162/ ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2162/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1869807874,PR_kwDOBm6k_c5Y8AN0,2160,"Bump sphinx, furo, blacken-docs dependencies",49699333,dependabot[bot],closed,0,,,,,5,2023-08-28T13:49:31Z,2023-08-29T00:38:33Z,2023-08-29T00:38:32Z,CONTRIBUTOR,simonw/datasette/pulls/2160,"Bumps the python-packages group with 3 updates: [sphinx](https://github.com/sphinx-doc/sphinx), [furo](https://github.com/pradyunsg/furo) and [blacken-docs](https://github.com/asottile/blacken-docs). Updates `sphinx` from 7.1.2 to 7.2.4
Release notes

Sourced from sphinx's releases.

Sphinx 7.2.4

Changelog: https://www.sphinx-doc.org/en/master/changes.html

Sphinx 7.2.3

Changelog: https://www.sphinx-doc.org/en/master/changes.html

Sphinx 7.2.2

Changelog: https://www.sphinx-doc.org/en/master/changes.html

Sphinx 7.2.1

Changelog: https://www.sphinx-doc.org/en/master/changes.html

Sphinx 7.2.0

Changelog: https://www.sphinx-doc.org/en/master/changes.html

Changelog

Sourced from sphinx's changelog.

Release 7.2.4 (released Aug 28, 2023)

Bugs fixed

  • #11618: Fix a regression in the MoveModuleTargets transform, introduced in #10478 (#9662).
  • #11649: linkcheck: Resolve hanging tests for timezones west of London and incorrect conversion from UTC to offsets from the UNIX epoch. Patch by Dmitry Shachnev and Adam Turner.

Release 7.2.3 (released Aug 23, 2023)

Dependencies

  • #11576: Require sphinxcontrib-serializinghtml 1.1.9.

Bugs fixed

  • Fix regression in autodoc.Documenter.parse_name().
  • Fix regression in JSON serialisation.
  • #11543: autodoc: Support positional-only parameters in classmethod methods when autodoc_preserve_defaults is True.
  • Restore support string methods on path objects. This is deprecated and will be removed in Sphinx 8. Use :py:func:os.fspath to convert :py:class:~pathlib.Path objects to strings, or :py:class:~pathlib.Path's methods to work with path objects.

Release 7.2.2 (released Aug 17, 2023)

Bugs fixed

  • Fix the signature of the StateMachine.insert_input() patch, for when calling with keyword arguments.
  • Fixed membership testing (in) for the :py:class:str interface of the asset classes (_CascadingStyleSheet and _JavaScript), which several extensions relied upon.
  • Fixed a type error in SingleFileHTMLBuilder._get_local_toctree, includehidden may be passed as a string or a boolean.
  • Fix :noindex: for PyModule and JSModule.

Release 7.2.1 (released Aug 17, 2023)

... (truncated)

Commits

Updates `furo` from 2023.7.26 to 2023.8.19
Changelog

Sourced from furo's changelog.

Changelog

2023.08.19 -- Xenolithic Xanadu

  • Fix missing search context with Sphinx 7.2, for dirhtml builds.
  • Drop support for Python 3.7.
  • Present configuration errors in a better format -- thanks @​AA-Turner!
  • Bump require_sphinx() to Sphinx 6.0, in line with dependency changes in Unassuming Ultramarine.

2023.08.17 -- Wonderous White

  • Fix compatibility with Sphinx 7.2.0 and 7.2.1.

2023.07.26 -- Vigilant Volt

  • Fix compatibility with Sphinx 7.1.
  • Improve how content overflow is handled.
  • Improve how literal blocks containing inline code are handled.

2023.05.20 -- Unassuming Ultramarine

  • ✨ Add support for Sphinx 7.
  • Drop support for Sphinx 5.
  • Improve the screen-reader label for sidebar collapse.
  • Make it easier to create derived themes from Furo.
  • Bump all JS dependencies (NodeJS and npm packages).

2023.03.27 -- Tasty Tangerine

  • Regenerate with newer version of sphinx-theme-builder, to fix RECORD hashes.
  • Add missing class to Font Awesome examples

2023.03.23 -- Sassy Saffron

  • Update Python version classifiers.
  • Increase the icon size in mobile header.
  • Increase admonition title bg opacity.
  • Change the default API background to transparent.
  • Transition the API background change.

... (truncated)

Commits

Updates `blacken-docs` from 1.15.0 to 1.16.0
Changelog

Sourced from blacken-docs's changelog.

1.16.0 (2023-08-16)

  • Allow Markdown fence options.

    Thanks to initial work from Matthew Anderson in PR [#246](https://github.com/asottile/blacken-docs/issues/246) <https://github.com/adamchainz/blacken-docs/pull/246>__.

  • Expand Markdown detection to all Python language names from Pygments: py, sage, python3, py3, and numpy.

  • Preserve leading whitespace lines in reStructuredText code blocks.

    Thanks to Julianus Pfeuffer for the report in Issue [#217](https://github.com/asottile/blacken-docs/issues/217) <https://github.com/adamchainz/blacken-docs/issues/217>__.

  • Use exit code 2 to indicate errors from Black, whilst exit code 1 remains for “files have been formatted”.

    Thanks to Julianus Pfeuffer for the report in Issue [#218](https://github.com/asottile/blacken-docs/issues/218) <https://github.com/adamchainz/blacken-docs/issues/218>__.

  • Support passing the --preview option through to Black, to select the future style.

  • Remove language_version from .pre-commit-hooks.yaml. This change allows default_language_version in .pre-commit-config.yaml to take precedence.

    Thanks to Aneesh Agrawal in PR [#258](https://github.com/asottile/blacken-docs/issues/258) <https://github.com/adamchainz/blacken-docs/pull/258>__.

Commits

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`. [//]: # (dependabot-automerge-start) [//]: # (dependabot-automerge-end) ---
Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR: - `@dependabot rebase` will rebase this PR - `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it - `@dependabot merge` will merge this PR after your CI passes on it - `@dependabot squash and merge` will squash and merge this PR after your CI passes on it - `@dependabot cancel merge` will cancel a previously requested merge and block automerging - `@dependabot reopen` will reopen this PR if it is closed - `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually - `@dependabot show ignore conditions` will show all of the ignore conditions of the specified dependency - `@dependabot ignore major version` will close this group update PR and stop Dependabot creating any more for the specific dependency's major version (unless you unignore this specific dependency's major version or upgrade to it yourself) - `@dependabot ignore minor version` will close this group update PR and stop Dependabot creating any more for the specific dependency's minor version (unless you unignore this specific dependency's minor version or upgrade to it yourself) - `@dependabot ignore ` will close this group update PR and stop Dependabot creating any more for the specific dependency (unless you unignore this specific dependency or upgrade to it yourself) - `@dependabot unignore ` will remove all of the ignore conditions of the specified dependency - `@dependabot unignore ` will remove the ignore condition of the specified dependency and ignore conditions
---- :books: Documentation preview :books:: https://datasette--2160.org.readthedocs.build/en/2160/ ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2160/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1859415334,PR_kwDOBm6k_c5YY5Ea,2148,"Bump sphinx, furo, blacken-docs dependencies",49699333,dependabot[bot],closed,0,,,,,9,2023-08-21T13:48:11Z,2023-08-29T00:15:31Z,2023-08-29T00:15:27Z,CONTRIBUTOR,simonw/datasette/pulls/2148,"Bumps the python-packages group with 3 updates: [sphinx](https://github.com/sphinx-doc/sphinx), [furo](https://github.com/pradyunsg/furo) and [blacken-docs](https://github.com/asottile/blacken-docs). Updates `sphinx` from 7.1.2 to 7.2.2
Release notes

Sourced from sphinx's releases.

Sphinx 7.2.2

Changelog: https://www.sphinx-doc.org/en/master/changes.html

Sphinx 7.2.1

Changelog: https://www.sphinx-doc.org/en/master/changes.html

Sphinx 7.2.0

Changelog: https://www.sphinx-doc.org/en/master/changes.html

Changelog

Sourced from sphinx's changelog.

Release 7.2.2 (released Aug 17, 2023)

Bugs fixed

  • Fix the signature of the StateMachine.insert_input() patch, for when calling with keyword arguments.
  • Fixed membership testing (in) for the :py:class:str interface of the asset classes (_CascadingStyleSheet and _JavaScript), which several extensions relied upon.
  • Fixed a type error in SingleFileHTMLBuilder._get_local_toctree, includehidden may be passed as a string or a boolean.
  • Fix :noindex: for PyModule and JSModule.

Release 7.2.1 (released Aug 17, 2023)

Bugs fixed

  • Restored the :py:class:str interface of the asset classes (_CascadingStyleSheet and _JavaScript), which several extensions relied upon. This will be removed in Sphinx 9.
  • Restored calls to Builder.add_{css,js}_file(), which several extensions relied upon.
  • Restored the private API TocTree.get_toctree_ancestors(), which several extensions relied upon.

Release 7.2.0 (released Aug 17, 2023)

Dependencies

  • #11511: Drop Python 3.8 support.
  • #11576: Require Pygments 2.14 or later.

Deprecated

  • #11512: Deprecate sphinx.util.md5 and sphinx.util.sha1. Use hashlib instead.
  • #11526: Deprecate sphinx.testing.path. Use os.path or pathlib instead.
  • #11528: Deprecate sphinx.util.split_index_msg and sphinx.util.split_into. Use sphinx.util.index_entries.split_index_msg instead.
  • Deprecate sphinx.builders.html.Stylesheet and sphinx.builders.html.Javascript. Use sphinx.application.Sphinx.add_css_file()

... (truncated)

Commits
  • ed84d63 Bump to 7.2.2 final
  • ea4a73e [bot]: Update message catalogues (#11612)
  • e47846a Fix :noindex: for PyModule and JSModule
  • b2fc47f Add CHANGES entry for renaming the StateMachine.insert_input() parameter
  • 0835c3e Fix regression in SingleFileHTMLBuilder._get_local_toctree
  • 49dc0dd Fix asset class string interface membership testing
  • 8512855 Fix signature of docutils include_source monkeypatch (#11610)
  • e1d9068 Bump version
  • 441a9e4 Bump to 7.2.1 final
  • ec31853 Restore TocTree.get_toctree_ancestors()
  • Additional commits viewable in compare view

Updates `furo` from 2023.7.26 to 2023.8.19
Changelog

Sourced from furo's changelog.

Changelog

2023.08.19 -- Xenolithic Xanadu

  • Fix missing search context with Sphinx 7.2, for dirhtml builds.
  • Drop support for Python 3.7.
  • Present configuration errors in a better format -- thanks @​AA-Turner!
  • Bump require_sphinx() to Sphinx 6.0, in line with dependency changes in Unassuming Ultramarine.

2023.08.17 -- Wonderous White

  • Fix compatibility with Sphinx 7.2.0 and 7.2.1.

2023.07.26 -- Vigilant Volt

  • Fix compatibility with Sphinx 7.1.
  • Improve how content overflow is handled.
  • Improve how literal blocks containing inline code are handled.

2023.05.20 -- Unassuming Ultramarine

  • ✨ Add support for Sphinx 7.
  • Drop support for Sphinx 5.
  • Improve the screen-reader label for sidebar collapse.
  • Make it easier to create derived themes from Furo.
  • Bump all JS dependencies (NodeJS and npm packages).

2023.03.27 -- Tasty Tangerine

  • Regenerate with newer version of sphinx-theme-builder, to fix RECORD hashes.
  • Add missing class to Font Awesome examples

2023.03.23 -- Sassy Saffron

  • Update Python version classifiers.
  • Increase the icon size in mobile header.
  • Increase admonition title bg opacity.
  • Change the default API background to transparent.
  • Transition the API background change.

... (truncated)

Commits

Updates `blacken-docs` from 1.15.0 to 1.16.0
Changelog

Sourced from blacken-docs's changelog.

1.16.0 (2023-08-16)

  • Allow Markdown fence options.

    Thanks to initial work from Matthew Anderson in PR [#246](https://github.com/asottile/blacken-docs/issues/246) <https://github.com/adamchainz/blacken-docs/pull/246>__.

  • Expand Markdown detection to all Python language names from Pygments: py, sage, python3, py3, and numpy.

  • Preserve leading whitespace lines in reStructuredText code blocks.

    Thanks to Julianus Pfeuffer for the report in Issue [#217](https://github.com/asottile/blacken-docs/issues/217) <https://github.com/adamchainz/blacken-docs/issues/217>__.

  • Use exit code 2 to indicate errors from Black, whilst exit code 1 remains for “files have been formatted”.

    Thanks to Julianus Pfeuffer for the report in Issue [#218](https://github.com/asottile/blacken-docs/issues/218) <https://github.com/adamchainz/blacken-docs/issues/218>__.

  • Support passing the --preview option through to Black, to select the future style.

  • Remove language_version from .pre-commit-hooks.yaml. This change allows default_language_version in .pre-commit-config.yaml to take precedence.

    Thanks to Aneesh Agrawal in PR [#258](https://github.com/asottile/blacken-docs/issues/258) <https://github.com/adamchainz/blacken-docs/pull/258>__.

Commits

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`. [//]: # (dependabot-automerge-start) [//]: # (dependabot-automerge-end) ---
Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR: - `@dependabot rebase` will rebase this PR - `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it - `@dependabot merge` will merge this PR after your CI passes on it - `@dependabot squash and merge` will squash and merge this PR after your CI passes on it - `@dependabot cancel merge` will cancel a previously requested merge and block automerging - `@dependabot reopen` will reopen this PR if it is closed - `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually - `@dependabot show ignore conditions` will show all of the ignore conditions of the specified dependency - `@dependabot ignore major version` will close this group update PR and stop Dependabot creating any more for the specific dependency's major version (unless you unignore this specific dependency's major version or upgrade to it yourself) - `@dependabot ignore minor version` will close this group update PR and stop Dependabot creating any more for the specific dependency's minor version (unless you unignore this specific dependency's minor version or upgrade to it yourself) - `@dependabot ignore dependency` will close this group update PR and stop Dependabot creating any more for the specific dependency (unless you unignore this specific dependency or upgrade to it yourself) - `@dependabot unignore dependency` will remove all of the ignore conditions of the specified dependency - `@dependabot unignore ` will remove the ignore condition of the specified dependency and ignore conditions
---- :books: Documentation preview :books:: https://datasette--2148.org.readthedocs.build/en/2148/ ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2148/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1865232341,I_kwDOBm6k_c5vLS_V,2153,Datasette --get --actor option,9599,simonw,closed,0,,,,,5,2023-08-24T14:00:03Z,2023-08-28T20:19:15Z,2023-08-28T20:15:53Z,OWNER,,"I experimented with a prototype of this here: - https://github.com/simonw/datasette/issues/2102#issuecomment-1691037971_ Which lets me run requests as if they belonged to a specific actor like this: ```bash datasette fixtures.db --get '/fixtures/facetable.json' --actor '{ ""_r"": { ""r"": { ""fixtures"": { ""facetable"": [ ""vt"" ] } } }, ""a"": ""user"" }' ``` Really useful for testing actors an `_r` options. Is this worth adding as a feature?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2153/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1870345352,PR_kwDOBm6k_c5Y90K9,2161,"-s/--setting x y gets merged into datasette.yml, refs #2143, #2156",9599,simonw,closed,0,,,,,1,2023-08-28T19:30:42Z,2023-08-28T20:06:15Z,2023-08-28T20:06:14Z,OWNER,simonw/datasette/pulls/2161,"This change updates the `-s/--setting` option to `datasette serve` to allow it to be used to set arbitrarily complex nested settings in a way that is compatible with the new `-c datasette.yml` work happening in: - #2143 It will enable things like this: ``` datasette data.db --setting plugins.datasette-ripgrep.path ""/home/simon/code"" ``` For the moment though it just affects [settings](https://docs.datasette.io/en/1.0a4/settings.html) - so you can do this: ``` datasette data.db --setting settings.sql_time_limit_ms 3500 ``` I've also implemented a backwards compatibility mechanism, so if you use it this way (the old way): ``` datasette data.db --setting sql_time_limit_ms 3500 ``` It will notice that the setting you passed is one of Datasette's core settings, and will treat that as if you said `settings.sql_time_limit_ms` instead. ---- :books: Documentation preview :books:: https://datasette--2161.org.readthedocs.build/en/2161/ ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2161/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1865174661,PR_kwDOBm6k_c5YsfZ7,2152,Bump the python-packages group with 3 updates,49699333,dependabot[bot],closed,0,,,,,3,2023-08-24T13:34:44Z,2023-08-28T13:49:39Z,2023-08-28T13:49:37Z,CONTRIBUTOR,simonw/datasette/pulls/2152,"Bumps the python-packages group with 3 updates: [sphinx](https://github.com/sphinx-doc/sphinx), [furo](https://github.com/pradyunsg/furo) and [blacken-docs](https://github.com/asottile/blacken-docs). Updates `sphinx` from 7.1.2 to 7.2.3
Release notes

Sourced from sphinx's releases.

Sphinx 7.2.3

Changelog: https://www.sphinx-doc.org/en/master/changes.html

Sphinx 7.2.2

Changelog: https://www.sphinx-doc.org/en/master/changes.html

Sphinx 7.2.1

Changelog: https://www.sphinx-doc.org/en/master/changes.html

Sphinx 7.2.0

Changelog: https://www.sphinx-doc.org/en/master/changes.html

Changelog

Sourced from sphinx's changelog.

Release 7.2.3 (released Aug 23, 2023)

Dependencies

  • #11576: Require sphinxcontrib-serializinghtml 1.1.9.

Bugs fixed

  • Fix regression in autodoc.Documenter.parse_name().
  • Fix regression in JSON serialisation.
  • #11543: autodoc: Support positional-only parameters in classmethod methods when autodoc_preserve_defaults is True.
  • Restore support string methods on path objects. This is deprecated and will be removed in Sphinx 8. Use :py:func:os.fspath to convert :py:class:~pathlib.Path objects to strings, or :py:class:~pathlib.Path's methods to work with path objects.

Release 7.2.2 (released Aug 17, 2023)

Bugs fixed

  • Fix the signature of the StateMachine.insert_input() patch, for when calling with keyword arguments.
  • Fixed membership testing (in) for the :py:class:str interface of the asset classes (_CascadingStyleSheet and _JavaScript), which several extensions relied upon.
  • Fixed a type error in SingleFileHTMLBuilder._get_local_toctree, includehidden may be passed as a string or a boolean.
  • Fix :noindex: for PyModule and JSModule.

Release 7.2.1 (released Aug 17, 2023)

Bugs fixed

  • Restored the :py:class:str interface of the asset classes (_CascadingStyleSheet and _JavaScript), which several extensions relied upon. This will be removed in Sphinx 9.
  • Restored calls to Builder.add_{css,js}_file(), which several extensions relied upon.
  • Restored the private API TocTree.get_toctree_ancestors(), which several extensions relied upon.

Release 7.2.0 (released Aug 17, 2023)

... (truncated)

Commits
  • 2f6ea14 Bump to 7.2.3 final
  • 511e407 Implement bool() for string paths
  • 494de73 Implement hash() for string paths
  • 2986aa1 Override special methods for string paths
  • 07b87e9 Update CHANGES for 7.2.3
  • 6b17dd1 Support string methods on path objects (#11619)
  • a73fb59 Support positional-only parameters in classmethods (#11635)
  • 02cb02c Fix invocation of python -m sphinx build
  • 6183b6a Require sphinxcontrib-serializinghtml 1.1.9 or later
  • 1e16f21 Fix regression in autodoc.Documenter.parse_name (#11613)
  • Additional commits viewable in compare view

Updates `furo` from 2023.7.26 to 2023.8.19
Changelog

Sourced from furo's changelog.

Changelog

2023.08.19 -- Xenolithic Xanadu

  • Fix missing search context with Sphinx 7.2, for dirhtml builds.
  • Drop support for Python 3.7.
  • Present configuration errors in a better format -- thanks @​AA-Turner!
  • Bump require_sphinx() to Sphinx 6.0, in line with dependency changes in Unassuming Ultramarine.

2023.08.17 -- Wonderous White

  • Fix compatibility with Sphinx 7.2.0 and 7.2.1.

2023.07.26 -- Vigilant Volt

  • Fix compatibility with Sphinx 7.1.
  • Improve how content overflow is handled.
  • Improve how literal blocks containing inline code are handled.

2023.05.20 -- Unassuming Ultramarine

  • ✨ Add support for Sphinx 7.
  • Drop support for Sphinx 5.
  • Improve the screen-reader label for sidebar collapse.
  • Make it easier to create derived themes from Furo.
  • Bump all JS dependencies (NodeJS and npm packages).

2023.03.27 -- Tasty Tangerine

  • Regenerate with newer version of sphinx-theme-builder, to fix RECORD hashes.
  • Add missing class to Font Awesome examples

2023.03.23 -- Sassy Saffron

  • Update Python version classifiers.
  • Increase the icon size in mobile header.
  • Increase admonition title bg opacity.
  • Change the default API background to transparent.
  • Transition the API background change.

... (truncated)

Commits

Updates `blacken-docs` from 1.15.0 to 1.16.0
Changelog

Sourced from blacken-docs's changelog.

1.16.0 (2023-08-16)

  • Allow Markdown fence options.

    Thanks to initial work from Matthew Anderson in PR [#246](https://github.com/asottile/blacken-docs/issues/246) <https://github.com/adamchainz/blacken-docs/pull/246>__.

  • Expand Markdown detection to all Python language names from Pygments: py, sage, python3, py3, and numpy.

  • Preserve leading whitespace lines in reStructuredText code blocks.

    Thanks to Julianus Pfeuffer for the report in Issue [#217](https://github.com/asottile/blacken-docs/issues/217) <https://github.com/adamchainz/blacken-docs/issues/217>__.

  • Use exit code 2 to indicate errors from Black, whilst exit code 1 remains for “files have been formatted”.

    Thanks to Julianus Pfeuffer for the report in Issue [#218](https://github.com/asottile/blacken-docs/issues/218) <https://github.com/adamchainz/blacken-docs/issues/218>__.

  • Support passing the --preview option through to Black, to select the future style.

  • Remove language_version from .pre-commit-hooks.yaml. This change allows default_language_version in .pre-commit-config.yaml to take precedence.

    Thanks to Aneesh Agrawal in PR [#258](https://github.com/asottile/blacken-docs/issues/258) <https://github.com/adamchainz/blacken-docs/pull/258>__.

Commits

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`. [//]: # (dependabot-automerge-start) [//]: # (dependabot-automerge-end) ---
Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR: - `@dependabot rebase` will rebase this PR - `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it - `@dependabot merge` will merge this PR after your CI passes on it - `@dependabot squash and merge` will squash and merge this PR after your CI passes on it - `@dependabot cancel merge` will cancel a previously requested merge and block automerging - `@dependabot reopen` will reopen this PR if it is closed - `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually - `@dependabot show ignore conditions` will show all of the ignore conditions of the specified dependency - `@dependabot ignore major version` will close this group update PR and stop Dependabot creating any more for the specific dependency's major version (unless you unignore this specific dependency's major version or upgrade to it yourself) - `@dependabot ignore minor version` will close this group update PR and stop Dependabot creating any more for the specific dependency's minor version (unless you unignore this specific dependency's minor version or upgrade to it yourself) - `@dependabot ignore ` will close this group update PR and stop Dependabot creating any more for the specific dependency (unless you unignore this specific dependency or upgrade to it yourself) - `@dependabot unignore ` will remove all of the ignore conditions of the specified dependency - `@dependabot unignore ` will remove the ignore condition of the specified dependency and ignore conditions
---- :books: Documentation preview :books:: https://datasette--2152.org.readthedocs.build/en/2152/ ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2152/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 685806511,MDU6SXNzdWU2ODU4MDY1MTE=,950,Private/secret databases: database files that are only visible to plugins,9599,simonw,closed,0,,,,,6,2020-08-25T20:46:17Z,2023-08-24T22:26:09Z,2023-08-24T22:26:08Z,OWNER,,"In thinking about the best way to implement https://github.com/simonw/datasette-auth-passwords/issues/6 (SQL-backed user accounts for `datasette-auth-passwords`) I realized that there are a few different use-cases where a plugin might want to store data that isn't visible to regular Datasette users: - Storing password hashes - Storing API tokens - Storing secrets that are used for data import integrations (secrets for talking to the Twitter API for example) Idea: allow one or more private database files to be attached to Datasette, something like this: datasette github.db linkedin.db -s secrets.db -m metadata.yml The `secrets.db` file would not be visible using any of the Datasette's usual interface or API routes - but plugins would be able to run queries against it. So `datasette-auth-passwords` might then be configured like this: ```yaml plugins: datasette-auth-passwords: database: secrets sql: ""select password_hash from passwords where username = :username"" ``` The plugin could even refuse to operate against a database that hadn't been loaded as a secret database.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/950/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1863810783,I_kwDOBm6k_c5vF37f,2150,form label { width: 15% } is a bad default,9599,simonw,closed,0,,,,,4,2023-08-23T18:22:27Z,2023-08-23T18:37:18Z,2023-08-23T18:35:48Z,OWNER,,"See: - https://github.com/simonw/datasette-configure-fts/issues/14 - https://github.com/simonw/datasette-auth-tokens/issues/12",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2150/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1781022369,I_kwDOBm6k_c5qKD6h,2091,Drop support for Python 3.7,9599,simonw,closed,0,,,,,3,2023-06-29T15:06:38Z,2023-08-23T18:18:18Z,2023-08-23T18:18:18Z,OWNER,,"It's EOL now, as of 2023-06-27 (two days ago): https://devguide.python.org/versions/ ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2091/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1795051447,I_kwDOBm6k_c5q_k-3,2097,Drop Python 3.7,9599,simonw,closed,0,,,,,0,2023-07-08T18:39:44Z,2023-08-23T18:18:00Z,2023-08-23T18:18:00Z,OWNER,,"> I'm going to drop Python 3.7. 
_Originally posted by @simonw in https://github.com/simonw/datasette/issues/1153#issuecomment-1627455892_ It's not supported any more: https://devguide.python.org/versions/",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2097/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 459689615,MDExOlB1bGxSZXF1ZXN0MjkwOTcxMjk1,524,"Sort commits using isort, refs #516",9599,simonw,closed,0,,,,,1,2019-06-24T05:04:48Z,2023-08-23T01:31:08Z,2023-08-23T01:31:08Z,OWNER,simonw/datasette/pulls/524,Also added a lint unit test to ensure they stay sorted. #516,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/524/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 449886319,MDU6SXNzdWU0NDk4ODYzMTk=,493,Rename metadata.json to config.json,9599,simonw,closed,0,,,3268330,Datasette 1.0,7,2019-05-29T15:48:03Z,2023-08-23T01:29:21Z,2023-08-23T01:29:20Z,OWNER,,"It is increasingly being useful configuration options, when it started out as purely metadata. Could cause confusion with the `--config` mechanism though - maybe that should be called ""settings"" instead?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/493/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 324720095,MDU6SXNzdWUzMjQ3MjAwOTU=,275,"""config"" section in metadata.json (root, database and table level)",9599,simonw,closed,0,,,,,3,2018-05-20T16:02:28Z,2023-08-23T01:28:37Z,2023-08-23T01:28:37Z,OWNER,,"Split off from #274 Metadata should an optional `""config""` section at root, table or database level. The TableView and RowView and DatabaseView and BaseView classes could all have a `.config(""key"")` method which knows how to resolve the hierarchy of configs. This will allow individual tables (or databases) to set their own config settings for things like `sql_time_limit_ms`",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/275/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1861812208,PR_kwDOBm6k_c5YhH-W,2149,"Start a new `datasette.yaml` configuration file, with settings support",15178711,asg017,closed,0,,,,,2,2023-08-22T16:24:16Z,2023-08-23T01:26:11Z,2023-08-23T01:26:11Z,CONTRIBUTOR,simonw/datasette/pulls/2149,"refs #2093 #2143 This is the first step to implementing the new `datasette.yaml`/`datasette.json` configuration file. - The old `--config` argument is now back, and is the path to a `datasette.yaml` file. Acts like the `--metadata` flag. - The old `settings.json` behavior has been removed. - The `""settings""` key inside `datasette.yaml` defines the same `--settings` flags - Values passed in `--settings` will over-write values in `datasette.yaml` Docs for the Config file is pretty light, not much to add until we add more config to the file. 
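As a rough sketch of the override behaviour described above (command-line settings winning over values loaded from `datasette.yaml`), merging one dotted setting path into the loaded config could look like this; `apply_setting` is a made-up helper for illustration, not code from this PR:

```python
import copy


# Sketch only: merge one '-s/--setting dotted.key value' pair into a config
# dict loaded from datasette.yaml, creating intermediate keys as needed so
# the command-line value wins over whatever the file supplied.
def apply_setting(config, dotted_key, value):
    updated = copy.deepcopy(config)
    node = updated
    *parents, leaf = dotted_key.split('.')
    for key in parents:
        node = node.setdefault(key, {})
    node[leaf] = value
    return updated


config = {'settings': {'sql_time_limit_ms': 1000, 'default_page_size': 50}}
print(apply_setting(config, 'settings.sql_time_limit_ms', 3500))
# {'settings': {'sql_time_limit_ms': 3500, 'default_page_size': 50}}
```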
---- :books: Documentation preview :books:: https://datasette--2149.org.readthedocs.build/en/2149/ ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2149/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1856760386,PR_kwDOBm6k_c5YQGcc,2144,Bump the python-packages group with 3 updates,49699333,dependabot[bot],closed,0,,,,,2,2023-08-18T13:49:37Z,2023-08-21T13:48:18Z,2023-08-21T13:48:16Z,CONTRIBUTOR,simonw/datasette/pulls/2144,"Bumps the python-packages group with 3 updates: [sphinx](https://github.com/sphinx-doc/sphinx), [furo](https://github.com/pradyunsg/furo) and [blacken-docs](https://github.com/asottile/blacken-docs). Updates `sphinx` from 7.1.2 to 7.2.2
Release notes

Sourced from sphinx's releases.

Sphinx 7.2.2

Changelog: https://www.sphinx-doc.org/en/master/changes.html

Sphinx 7.2.1

Changelog: https://www.sphinx-doc.org/en/master/changes.html

Sphinx 7.2.0

Changelog: https://www.sphinx-doc.org/en/master/changes.html

Changelog

Sourced from sphinx's changelog.

Release 7.2.2 (released Aug 17, 2023)

Bugs fixed

  • Fix the signature of the StateMachine.insert_input() patch, for when calling with keyword arguments.
  • Fixed membership testing (in) for the :py:class:str interface of the asset classes (_CascadingStyleSheet and _JavaScript), which several extensions relied upon.
  • Fixed a type error in SingleFileHTMLBuilder._get_local_toctree, includehidden may be passed as a string or a boolean.
  • Fix :noindex: for PyModule and JSModule.

Release 7.2.1 (released Aug 17, 2023)

Bugs fixed

  • Restored the :py:class:str interface of the asset classes (_CascadingStyleSheet and _JavaScript), which several extensions relied upon. This will be removed in Sphinx 9.
  • Restored calls to Builder.add_{css,js}_file(), which several extensions relied upon.
  • Restored the private API TocTree.get_toctree_ancestors(), which several extensions relied upon.

Release 7.2.0 (released Aug 17, 2023)

Dependencies

  • #11511: Drop Python 3.8 support.
  • #11576: Require Pygments 2.14 or later.

Deprecated

  • #11512: Deprecate sphinx.util.md5 and sphinx.util.sha1. Use hashlib instead.
  • #11526: Deprecate sphinx.testing.path. Use os.path or pathlib instead.
  • #11528: Deprecate sphinx.util.split_index_msg and sphinx.util.split_into. Use sphinx.util.index_entries.split_index_msg instead.
  • Deprecate sphinx.builders.html.Stylesheet and sphinx.builders.html.Javascript. Use sphinx.application.Sphinx.add_css_file()

... (truncated)

Commits
  • ed84d63 Bump to 7.2.2 final
  • ea4a73e [bot]: Update message catalogues (#11612)
  • e47846a Fix :noindex: for PyModule and JSModule
  • b2fc47f Add CHANGES entry for renaming the StateMachine.insert_input() parameter
  • 0835c3e Fix regression in SingleFileHTMLBuilder._get_local_toctree
  • 49dc0dd Fix asset class string interface membership testing
  • 8512855 Fix signature of docutils include_source monkeypatch (#11610)
  • e1d9068 Bump version
  • 441a9e4 Bump to 7.2.1 final
  • ec31853 Restore TocTree.get_toctree_ancestors()
  • Additional commits viewable in compare view

Updates `furo` from 2023.7.26 to 2023.8.17
Changelog

Sourced from furo's changelog.

Changelog

2023.08.17 -- Wonderous White

  • Fix compatibility with Sphinx 7.2.0 and 7.2.1.

2023.07.26 -- Vigilant Volt

  • Fix compatibility with Sphinx 7.1.
  • Improve how content overflow is handled.
  • Improve how literal blocks containing inline code are handled.

2023.05.20 -- Unassuming Ultramarine

  • ✨ Add support for Sphinx 7.
  • Drop support for Sphinx 5.
  • Improve the screen-reader label for sidebar collapse.
  • Make it easier to create derived themes from Furo.
  • Bump all JS dependencies (NodeJS and npm packages).

2023.03.27 -- Tasty Tangerine

  • Regenerate with newer version of sphinx-theme-builder, to fix RECORD hashes.
  • Add missing class to Font Awesome examples

2023.03.23 -- Sassy Saffron

  • Update Python version classifiers.
  • Increase the icon size in mobile header.
  • Increase admonition title bg opacity.
  • Change the default API background to transparent.
  • Transition the API background change.
  • Remove the "indent" of API entries which have a background.
  • Break long inline code literals.

2022.12.07 -- Reverent Raspberry

  • ✨ Add support for Sphinx 6.
  • ✨ Improve footnote presentation with docutils 0.18+.

... (truncated)

Commits

Updates `blacken-docs` from 1.15.0 to 1.16.0
Changelog

Sourced from blacken-docs's changelog.

1.16.0 (2023-08-16)

  • Allow Markdown fence options.

    Thanks to initial work from Matthew Anderson in PR [#246](https://github.com/asottile/blacken-docs/issues/246) <https://github.com/adamchainz/blacken-docs/pull/246>__.

  • Expand Markdown detection to all Python language names from Pygments: py, sage, python3, py3, and numpy.

  • Preserve leading whitespace lines in reStructuredText code blocks.

    Thanks to Julianus Pfeuffer for the report in Issue [#217](https://github.com/asottile/blacken-docs/issues/217) <https://github.com/adamchainz/blacken-docs/issues/217>__.

  • Use exit code 2 to indicate errors from Black, whilst exit code 1 remains for “files have been formatted”.

    Thanks to Julianus Pfeuffer for the report in Issue [#218](https://github.com/asottile/blacken-docs/issues/218) <https://github.com/adamchainz/blacken-docs/issues/218>__.

  • Support passing the --preview option through to Black, to select the future style.

  • Remove language_version from .pre-commit-hooks.yaml. This change allows default_language_version in `.pre-commit-config.yaml` to take precedence.

    Thanks to Aneesh Agrawal in PR [#258](https://github.com/asottile/blacken-docs/issues/258) <https://github.com/adamchainz/blacken-docs/pull/258>__.

Commits

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`. [//]: # (dependabot-automerge-start) [//]: # (dependabot-automerge-end) ---
Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR: - `@dependabot rebase` will rebase this PR - `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it - `@dependabot merge` will merge this PR after your CI passes on it - `@dependabot squash and merge` will squash and merge this PR after your CI passes on it - `@dependabot cancel merge` will cancel a previously requested merge and block automerging - `@dependabot reopen` will reopen this PR if it is closed - `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually - `@dependabot show ignore conditions` will show all of the ignore conditions of the specified dependency - `@dependabot ignore major version` will close this group update PR and stop Dependabot creating any more for the specific dependency's major version (unless you unignore this specific dependency's major version or upgrade to it yourself) - `@dependabot ignore minor version` will close this group update PR and stop Dependabot creating any more for the specific dependency's minor version (unless you unignore this specific dependency's minor version or upgrade to it yourself) - `@dependabot ignore dependency` will close this group update PR and stop Dependabot creating any more for the specific dependency (unless you unignore this specific dependency or upgrade to it yourself) - `@dependabot unignore dependency` will remove all of the ignore conditions of the specified dependency - `@dependabot unignore ` will remove the ignore condition of the specified dependency and ignore conditions
---- :books: Documentation preview :books:: https://datasette--2144.org.readthedocs.build/en/2144/ ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2144/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1857851384,I_kwDOCGYnMM5uvI_4,587,New .add_foreign_key() can break if PRAGMA legacy_alter_table=ON and there's an invalid foreign key reference,9599,simonw,closed,0,,,,,3,2023-08-19T20:01:26Z,2023-08-19T20:04:33Z,2023-08-19T20:04:32Z,OWNER,,"Extremely detailed story of how I got to this point: - https://github.com/simonw/llm/issues/162 Steps to reproduce (only if that pragma is on though): ```bash python -c ' import sqlite_utils db = sqlite_utils.Database(memory=True) db.execute("""""" CREATE TABLE ""logs"" ( [id] INTEGER PRIMARY KEY, [model] TEXT, [prompt] TEXT, [system] TEXT, [prompt_json] TEXT, [options_json] TEXT, [response] TEXT, [response_json] TEXT, [reply_to_id] INTEGER, [chat_id] INTEGER REFERENCES [log]([id]), [duration_ms] INTEGER, [datetime_utc] TEXT ); """""") db[""logs""].add_foreign_key(""reply_to_id"", ""logs"", ""id"") ' ``` This succeeds in some environments, fails in others.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/587/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1817289521,I_kwDOCGYnMM5sUaMx,577,Get `add_foreign_keys()` to work without modifying `sqlite_master`,9599,simonw,closed,0,,,,,9,2023-07-23T20:40:18Z,2023-08-18T17:43:11Z,2023-08-18T00:48:10Z,OWNER,,"https://github.com/simonw/sqlite-utils/blob/13ebcc575d2547c45e8d31288b71a3242c16b886/sqlite_utils/db.py#L1165-L1174 This is the only place in the code that attempts to modify `sqlite_master` directly, which fails on some Python installations. Could this use the `.transform()` trick instead? Or automatically switch to that trick if it hits an error?",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/577/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1854970601,PR_kwDOBm6k_c5YKAZ4,2142,Bump the python-packages group with 2 updates,49699333,dependabot[bot],closed,0,,,,,2,2023-08-17T13:07:53Z,2023-08-18T13:49:29Z,2023-08-18T13:49:26Z,CONTRIBUTOR,simonw/datasette/pulls/2142,"Bumps the python-packages group with 2 updates: [sphinx](https://github.com/sphinx-doc/sphinx) and [blacken-docs](https://github.com/asottile/blacken-docs). Updates `sphinx` from 7.1.2 to 7.2.0
Release notes

Sourced from sphinx's releases.

Sphinx 7.2.0

Changelog: https://www.sphinx-doc.org/en/master/changes.html

Changelog

Sourced from sphinx's changelog.

Release 7.2.0 (released Aug 17, 2023)

Dependencies

  • #11511: Drop Python 3.8 support.
  • #11576: Require Pygments 2.14 or later.

Deprecated

  • #11512: Deprecate sphinx.util.md5 and sphinx.util.sha1. Use hashlib instead.
  • #11526: Deprecate sphinx.testing.path. Use os.path or pathlib instead.
  • #11528: Deprecate sphinx.util.split_index_msg and sphinx.util.split_into. Use sphinx.util.index_entries.split_index_msg instead.
  • Deprecate sphinx.builders.html.Stylesheet and sphinx.builders.html.Javascript. Use sphinx.application.Sphinx.add_css_file() and sphinx.application.Sphinx.add_js_file() instead.
  • #11582: Deprecate sphinx.builders.html.StandaloneHTMLBuilder.css_files and sphinx.builders.html.StandaloneHTMLBuilder.script_files. Use sphinx.application.Sphinx.add_css_file() and sphinx.application.Sphinx.add_js_file() instead.
  • #11459: Deprecate sphinx.ext.autodoc.preserve_defaults.get_function_def(). Patch by Bénédikt Tran.

Features added

  • #11526: Support os.PathLike types and pathlib.Path objects in many more places.
  • #5474: coverage: Print summary statistics tables. Patch by Jorge Leitao.
  • #6319: viewcode: Add :confval:viewcode_line_numbers to control whether line numbers are added to rendered source code. Patch by Ben Krikler.
  • #9662: Add the :no-typesetting: option to suppress textual output and only create a linkable anchor. Patch by Latosha Maltba.
  • #11221: C++: Support domain objects in the table of contents. Patch by Rouslan Korneychuk.
  • #10938: doctest: Add :confval:doctest_show_successes option. Patch by Trey Hunner.
  • #11533: Add :no-index:, :no-index-entry:, and :no-contents-entry:.
  • #11572: Improve debug logging of reasons why files are detected as out of date. Patch by Eric Larson.

... (truncated)

Commits

Updates `blacken-docs` from 1.15.0 to 1.16.0
Changelog

Sourced from blacken-docs's changelog.

1.16.0 (2023-08-16)

  • Allow Markdown fence options.

    Thanks to initial work from Matthew Anderson in PR [#246](https://github.com/asottile/blacken-docs/issues/246) <https://github.com/adamchainz/blacken-docs/pull/246>__.

  • Expand Markdown detection to all Python language names from Pygments: py, sage, python3, py3, and numpy.

  • Preserve leading whitespace lines in reStructuredText code blocks.

    Thanks to Julianus Pfeuffer for the report in Issue [#217](https://github.com/asottile/blacken-docs/issues/217) <https://github.com/adamchainz/blacken-docs/issues/217>__.

  • Use exit code 2 to indicate errors from Black, whilst exit code 1 remains for “files have been formatted”.

    Thanks to Julianus Pfeuffer for the report in Issue [#218](https://github.com/asottile/blacken-docs/issues/218) <https://github.com/adamchainz/blacken-docs/issues/218>__.

  • Support passing the --preview option through to Black, to select the future style.

  • Remove language_version from .pre-commit-hooks.yaml. This change allows default_language_version in `.pre-commit-config.yaml` to take precedence.

    Thanks to Aneesh Agrawal in PR [#258](https://github.com/asottile/blacken-docs/issues/258) <https://github.com/adamchainz/blacken-docs/pull/258>__.

Commits

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`. [//]: # (dependabot-automerge-start) [//]: # (dependabot-automerge-end) ---
Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR: - `@dependabot rebase` will rebase this PR - `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it - `@dependabot merge` will merge this PR after your CI passes on it - `@dependabot squash and merge` will squash and merge this PR after your CI passes on it - `@dependabot cancel merge` will cancel a previously requested merge and block automerging - `@dependabot reopen` will reopen this PR if it is closed - `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually - `@dependabot show ignore conditions` will show all of the ignore conditions of the specified dependency - `@dependabot ignore major version` will close this group update PR and stop Dependabot creating any more for the specific dependency's major version (unless you unignore this specific dependency's major version or upgrade to it yourself) - `@dependabot ignore minor version` will close this group update PR and stop Dependabot creating any more for the specific dependency's minor version (unless you unignore this specific dependency's minor version or upgrade to it yourself) - `@dependabot ignore dependency` will close this group update PR and stop Dependabot creating any more for the specific dependency (unless you unignore this specific dependency or upgrade to it yourself) - `@dependabot unignore dependency` will remove all of the ignore conditions of the specified dependency - `@dependabot unignore ` will remove the ignore condition of the specified dependency and ignore conditions
---- :books: Documentation preview :books:: https://datasette--2142.org.readthedocs.build/en/2142/ ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2142/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1855894222,I_kwDOCGYnMM5unrLO,585,CLI equivalents to `transform(add_foreign_keys=)`,9599,simonw,closed,0,,,,,7,2023-08-18T01:07:15Z,2023-08-18T01:51:16Z,2023-08-18T01:51:15Z,OWNER,,"The new options added in: - #577 Deserve consideration in the CLI as well. https://github.com/simonw/sqlite-utils/blob/d2bcdc00c6ecc01a6e8135e775ffdb87572b802b/sqlite_utils/db.py#L1706-L1708",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/585/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1855836914,I_kwDOCGYnMM5undLy,583,Get rid of test.utils.collapse_whitespace,9599,simonw,closed,0,,,,,1,2023-08-17T23:31:09Z,2023-08-18T00:59:19Z,2023-08-18T00:59:19Z,OWNER,,"I have a neater pattern for this now - instead of: https://github.com/simonw/sqlite-utils/blob/1dc6b5aa644a92d3654f7068110ed7930989ce71/tests/test_create.py#L472-L475 I now prefer: https://github.com/simonw/sqlite-utils/blob/1dc6b5aa644a92d3654f7068110ed7930989ce71/tests/test_create.py#L1163-L1171",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/583/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1855838223,PR_kwDOCGYnMM5YM-I3,584,.transform() instead of modifying sqlite_master for add_foreign_keys,9599,simonw,closed,0,,,,,13,2023-08-17T23:32:45Z,2023-08-18T00:48:13Z,2023-08-18T00:48:08Z,OWNER,simonw/sqlite-utils/pulls/584,"Refs: - #577 ---- :books: Documentation preview :books:: https://sqlite-utils--584.org.readthedocs.build/en/584/ ",140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/584/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1853289039,PR_kwDOBm6k_c5YEUBK,2141,Bump the python-packages group with 1 update,49699333,dependabot[bot],closed,0,,,,,2,2023-08-16T13:47:35Z,2023-08-17T13:07:48Z,2023-08-17T13:07:45Z,CONTRIBUTOR,simonw/datasette/pulls/2141,"Bumps the python-packages group with 1 update: [blacken-docs](https://github.com/asottile/blacken-docs).
Changelog

Sourced from blacken-docs's changelog.

1.16.0 (2023-08-16)

  • Allow Markdown fence options.

    Thanks to initial work from Matthew Anderson in PR [#246](https://github.com/asottile/blacken-docs/issues/246) <https://github.com/adamchainz/blacken-docs/pull/246>__.

  • Expand Markdown detection to all Python language names from Pygments: py, sage, python3, py3, and numpy.

  • Preserve leading whitespace lines in reStructuredText code blocks.

    Thanks to Julianus Pfeuffer for the report in Issue [#217](https://github.com/asottile/blacken-docs/issues/217) <https://github.com/adamchainz/blacken-docs/issues/217>__.

  • Use exit code 2 to indicate errors from Black, whilst exit code 1 remains for “files have been formatted”.

    Thanks to Julianus Pfeuffer for the report in Issue [#218](https://github.com/asottile/blacken-docs/issues/218) <https://github.com/adamchainz/blacken-docs/issues/218>__.

  • Support passing the --preview option through to Black, to select the future style.

  • Remove language_version from .pre-commit-hooks.yaml. This change allows default_language_version in `.pre-commit-config.yaml` to take precedence.

    Thanks to Aneesh Agrawal in PR [#258](https://github.com/asottile/blacken-docs/issues/258) <https://github.com/adamchainz/blacken-docs/pull/258>__.

Commits

[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=blacken-docs&package-manager=pip&previous-version=1.15.0&new-version=1.16.0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores) Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`. [//]: # (dependabot-automerge-start) [//]: # (dependabot-automerge-end) ---
Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR: - `@dependabot rebase` will rebase this PR - `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it - `@dependabot merge` will merge this PR after your CI passes on it - `@dependabot squash and merge` will squash and merge this PR after your CI passes on it - `@dependabot cancel merge` will cancel a previously requested merge and block automerging - `@dependabot reopen` will reopen this PR if it is closed - `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually - `@dependabot show ignore conditions` will show all of the ignore conditions of the specified dependency - `@dependabot ignore major version` will close this group update PR and stop Dependabot creating any more for the specific dependency's major version (unless you unignore this specific dependency's major version or upgrade to it yourself) - `@dependabot ignore minor version` will close this group update PR and stop Dependabot creating any more for the specific dependency's minor version (unless you unignore this specific dependency's minor version or upgrade to it yourself) - `@dependabot ignore dependency` will close this group update PR and stop Dependabot creating any more for the specific dependency (unless you unignore this specific dependency or upgrade to it yourself) - `@dependabot unignore dependency` will remove all of the ignore conditions of the specified dependency - `@dependabot unignore ` will remove the ignore condition of the specified dependency and ignore conditions
---- :books: Documentation preview :books:: https://datasette--2141.org.readthedocs.build/en/2141/ ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2141/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1847201263,I_kwDOBm6k_c5uGg3v,2140,Remove all remaining documentation instances of '$ ',9599,simonw,closed,0,,,,,1,2023-08-11T17:42:13Z,2023-08-11T17:52:25Z,2023-08-11T17:45:00Z,OWNER,,"For example this: https://github.com/simonw/datasette/blob/4535568f2ce907af646304d0ebce2500ebd55677/docs/authentication.rst?plain=1#L33-L35 The problem with that `$ ` prefix is that it prevents users from copying and pasting the raw command. https://docs.datasette.io/en/stable/authentication.html#using-the-root-actor",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2140/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1838266862,I_kwDOBm6k_c5tkbnu,2126,Permissions in metadata.yml / metadata.json,36199671,ctsrc,closed,0,,,,,3,2023-08-06T16:24:10Z,2023-08-11T05:52:30Z,2023-08-11T05:52:29Z,NONE,,"https://docs.datasette.io/en/latest/authentication.html#other-permissions-in-metadata says the following: > For all other permissions, you can use one or more ""permissions"" blocks in your metadata. > To grant access to the permissions debug tool to all signed in users you can grant permissions-debug to any actor with an id matching the wildcard * by adding this a the root of your metadata: ```yaml permissions: debug-menu: id: '*' ``` I tried this. My `metadata.yml` file looks like: ```yaml permissions: debug-menu: id: '*' permissions-debug: id: '*' plugins: datasette-auth-passwords: myuser_password_hash: $env: ""PASSWORD_HASH_MYUSER"" ``` And then I run ```zsh datasette -m metadata.yml tiddlywiki.db --root ``` And I open a session for the ""root"" user of datasette with the link given. I open a private browser session and log in as ""myuser"" from http://127.0.0.1:8001/-/login Then I check http://127.0.0.1:8001/-/actor which confirms that I am logged in as the ""myuser"" actor ```json { ""actor"": { ""id"": ""myuser"" } } ``` In the session where I am logged in as ""myuser"" I then try to go to http://127.0.0.1:8001/-/permissions But all I get there as the logged in user ""myuser"" is > Forbidden > > Permission denied And then if I check the http://127.0.0.1:8001/-/permissions as the datasette ""root"" user from another browser session, I see: > permissions-debug checked at 2023-08-06T16:22:58.997841 ✗ (used default) > > Actor: {""id"": ""myuser""} It seems that in spite of having tried to give the `permissions-debug` permission to the ""myuser"" user in my `metadata.yml` file, datasette does not agree that ""myuser"" has permission `permissions-debug`.. 
What do I need to do differently so that my ""myuser"" user is able to access http://127.0.0.1:8001/-/permissions ?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2126/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1823393475,I_kwDOBm6k_c5srsbD,2119,"database color shows only on index page, not other pages",9599,simonw,closed,0,,,3268330,Datasette 1.0,3,2023-07-27T00:19:39Z,2023-08-11T05:25:45Z,2023-08-11T05:16:24Z,OWNER,,"I think this has been a bug for a long time. https://latest.datasette.io/ currently shows: Those colors are based on a hash of the database name. But when you click through to https://latest.datasette.io/fixtures It's red on all sub-pages too.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2119/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1846076261,I_kwDOBm6k_c5uCONl,2139,border-color: ##ff0000 bug - two hashes,9599,simonw,closed,0,,,8755003,Datasette 1.0a-next,2,2023-08-11T01:22:58Z,2023-08-11T05:16:24Z,2023-08-11T05:16:24Z,OWNER,,"Spotted this on https://latest.datasette.io/extra_database ```html
```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2139/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1843391585,I_kwDOBm6k_c5t3-xh,2134,Add writable canned query demo to latest.datasette.io,9599,simonw,closed,0,,,,,5,2023-08-09T14:31:30Z,2023-08-10T01:22:46Z,2023-08-10T01:05:56Z,OWNER,,"This would be useful while working on: - #2114",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2134/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1844213115,I_kwDOBm6k_c5t7HV7,2138,on_success_message_sql option for writable canned queries,9599,simonw,closed,0,,,8755003,Datasette 1.0a-next,2,2023-08-10T00:20:14Z,2023-08-10T00:39:40Z,2023-08-10T00:34:26Z,OWNER,,"> Or... how about if the `on_success_message` option could define a SQL query to be executed to generate that message? Maybe `on_success_message_sql`. - https://github.com/simonw/datasette/issues/2134",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2138/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1841501975,I_kwDOBm6k_c5twxcX,2133,[feature request]`datasette install plugins.json` options,54462,HaveF,closed,0,,,,,9,2023-08-08T15:06:50Z,2023-08-10T00:31:24Z,2023-08-09T22:04:46Z,NONE,,"Hi, simon ❤️ `datasette plugins --all > plugins.json` could generate all plugins info. On another machine, it would be great to install all plugins just by `datasette install plugins.json`",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2133/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 627794879,MDU6SXNzdWU2Mjc3OTQ4Nzk=,782,Redesign default .json format,9599,simonw,closed,0,,,8755003,Datasette 1.0a-next,55,2020-05-30T18:47:07Z,2023-08-10T00:07:17Z,2023-08-10T00:07:17Z,OWNER,,The default JSON just isn't right. I find myself using `?_shape=array` for almost everything I build against the API.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/782/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1843600087,I_kwDOBm6k_c5t4xrX,2135,Release notes for 1.0a3,9599,simonw,closed,0,,,9700784,Datasette 1.0a3,3,2023-08-09T16:09:26Z,2023-08-09T19:17:07Z,2023-08-09T19:17:06Z,OWNER,,118 commits! 
https://github.com/simonw/datasette/compare/1.0a2...26be9f0445b753fb84c802c356b0791a72269f25,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2135/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1843710170,I_kwDOBm6k_c5t5Mja,2136,Query view shouldn't return `columns`,9599,simonw,closed,0,,,9700784,Datasette 1.0a3,4,2023-08-09T17:23:57Z,2023-08-09T19:03:04Z,2023-08-09T19:03:04Z,OWNER,,"I just noticed that https://latest.datasette.io/fixtures/roadside_attraction_characteristics.json?_labels=on&_size=1 returns: ```json { ""ok"": true, ""next"": ""1"", ""rows"": [ { ""rowid"": 1, ""attraction_id"": { ""value"": 1, ""label"": ""The Mystery Spot"" }, ""characteristic_id"": { ""value"": 2, ""label"": ""Paranormal"" } } ], ""truncated"": false } ``` But https://latest.datasette.io/fixtures.json?sql=select+rowid%2C+attraction_id%2C+characteristic_id+from+roadside_attraction_characteristics+order+by+rowid+limit+1 returns: ```json { ""rows"": [ { ""rowid"": 1, ""attraction_id"": 1, ""characteristic_id"": 2 } ], ""columns"": [ ""rowid"", ""attraction_id"", ""characteristic_id"" ], ""ok"": true, ""truncated"": false } ``` The `columns` key in the query response is inconsistent with the table response.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2136/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1560662739,I_kwDOBm6k_c5dBdLT,2007,`render_cell()` hook should take an optional `request` argument,9599,simonw,closed,0,,,,,1,2023-01-28T03:13:00Z,2023-08-09T17:15:03Z,2023-01-28T03:34:26Z,OWNER,,From Discord: https://discordapp.com/channels/823971286308356157/996877076982415491/1068227071156965486,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2007/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1822940263,I_kwDOBm6k_c5sp9xn,2114,Implement canned queries against new query JSON work,9599,simonw,closed,0,,,9700784,Datasette 1.0a3,3,2023-07-26T18:24:50Z,2023-08-09T15:26:58Z,2023-08-09T15:26:57Z,OWNER,,- #2109 ,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2114/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1841343173,I_kwDOBm6k_c5twKrF,2132,Get form fields on query page working again ,9599,simonw,closed,0,,,9700784,Datasette 1.0a3,1,2023-08-08T13:39:05Z,2023-08-08T13:45:10Z,2023-08-08T13:45:09Z,OWNER,,"Caused by: - #2112 https://latest.datasette.io/fixtures?sql=select+pk1%2C+pk2%2C+pk3%2C+content+from+compound_three_primary_keys+where+%22pk1%22+%3D+%3Ap0+order+by+pk1%2C+pk2%2C+pk3+limit+101&p0=b The `:p0` form field is missing. 
Submitting the form results in this error: ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2132/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1822982933,I_kwDOBm6k_c5sqIMV,2117,Figure out what to do about `DatabaseView.name`,9599,simonw,closed,0,,,9700784,Datasette 1.0a3,1,2023-07-26T18:58:06Z,2023-08-08T02:02:07Z,2023-08-08T02:02:07Z,OWNER,,"In the old code: https://github.com/simonw/datasette/blob/08181823990a71ffa5a1b57b37259198eaa43e06/datasette/views/database.py#L34-L35 This `name` class attribute was later used by some of the plugin hooks, passed as `view_name`: https://github.com/simonw/datasette/blob/18dd88ee4d78fe9d760e9da96028ae06d938a85c/datasette/hookspecs.py#L50-L54 Figure out how that should work once I've refactored those classes to view functions instead. Refs: - #2109 ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2117/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1822940964,I_kwDOBm6k_c5sp98k,2115,Ensure all tests pass against new query view JSON,9599,simonw,closed,0,,,9700784,Datasette 1.0a3,0,2023-07-26T18:25:20Z,2023-08-08T02:01:39Z,2023-08-08T02:01:38Z,OWNER,,- #2109 ,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2115/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1822938661,I_kwDOBm6k_c5sp9Yl,2112,Build HTML version of /content?sql=...,9599,simonw,closed,0,,,9700784,Datasette 1.0a3,5,2023-07-26T18:23:34Z,2023-08-08T02:01:09Z,2023-08-08T02:01:01Z,OWNER,,"This will help make the hook as robust as possible. - #2109 ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2112/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1822937426,I_kwDOBm6k_c5sp9FS,2111,Implement new /content.json?sql=...,9599,simonw,closed,0,,,9700784,Datasette 1.0a3,4,2023-07-26T18:22:39Z,2023-08-08T02:00:37Z,2023-08-08T02:00:22Z,OWNER,,"This will be the base that the remaining work builds on top of. 
Refs: - #2109 ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2111/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1840329615,I_kwDOBm6k_c5tsTOP,2130,Render plugin mechanism needs `error` and `truncated` fields,9599,simonw,closed,0,,,9700784,Datasette 1.0a3,2,2023-08-07T23:19:19Z,2023-08-08T01:51:54Z,2023-08-08T01:47:42Z,OWNER,,"While working on: - https://github.com/simonw/datasette/pull/2118 It became clear that the `render` callback function documented here: https://docs.datasette.io/en/0.64.3/plugin_hooks.html#register-output-renderer-datasette Needs to grow the ability to be told if an error occurred (an `error` string) and if the results were truncated (a `truncated` boolean).",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2130/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1823352380,PR_kwDOBm6k_c5Wfgd9,2118,New JSON design for query views,9599,simonw,closed,0,,,9700784,Datasette 1.0a3,11,2023-07-26T23:29:21Z,2023-08-08T01:47:40Z,2023-08-08T01:47:39Z,OWNER,simonw/datasette/pulls/2118,"WIP. Refs: - #2109 ---- :books: Documentation preview :books:: https://datasette--2118.org.readthedocs.build/en/2118/ ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2118/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1833193570,PR_kwDOBm6k_c5XArm3,2125,Bump sphinx from 6.1.3 to 7.1.2,49699333,dependabot[bot],closed,0,,,,,2,2023-08-02T13:28:39Z,2023-08-07T16:20:30Z,2023-08-07T16:20:27Z,CONTRIBUTOR,simonw/datasette/pulls/2125,"Bumps [sphinx](https://github.com/sphinx-doc/sphinx) from 6.1.3 to 7.1.2.
Release notes

Sourced from sphinx's releases.

Sphinx 7.1.2

Changelog: https://www.sphinx-doc.org/en/master/changes.html

Sphinx 7.1.1

Changelog: https://www.sphinx-doc.org/en/master/changes.html

Sphinx 7.1.0

Changelog: https://www.sphinx-doc.org/en/master/changes.html

v7.0.1

Changelog: https://www.sphinx-doc.org/en/master/changes.html

v7.0.0

Changelog: https://www.sphinx-doc.org/en/master/changes.html

v7.0.0rc1

Changelog: https://www.sphinx-doc.org/en/master/changes.html

v6.2.1

Changelog: https://www.sphinx-doc.org/en/master/changes.html

v6.2.0

Changelog: https://www.sphinx-doc.org/en/master/changes.html

Changelog

Sourced from sphinx's changelog.

Release 7.1.2 (released Aug 02, 2023)

Bugs fixed

  • #11542: linkcheck: Properly respect :confval:linkcheck_anchors and do not spuriously report failures to validate anchors. Patch by James Addison.

Release 7.1.1 (released Jul 27, 2023)

Bugs fixed

  • #11514: Fix SOURCE_DATE_EPOCH in multi-line copyright footer. Patch by Bénédikt Tran.

Release 7.1.0 (released Jul 24, 2023)

Incompatible changes

Deprecated

  • #11412: Emit warnings on using a deprecated Python-specific index entry type (namely, module, keyword, operator, object, exception, statement, and builtin) in the :rst:dir:index directive, and set the removal version to Sphinx 9. Patch by Adam Turner.

Features added

  • #11415: Add a checksum to JavaScript and CSS asset URIs included within generated HTML, using the CRC32 algorithm.
  • :meth:~sphinx.application.Sphinx.require_sphinx now allows the version requirement to be specified as (major, minor).
  • #11011: Allow configuring a line-length limit for object signatures, via :confval:maximum_signature_line_length and the domain-specific variants. If the length of the signature (in characters) is greater than the configured limit, each parameter in the signature will be split to its own logical line. This behaviour may also be controlled by options on object description directives, for example :rst:dir:py:function:single-line-parameter-list.

... (truncated)

Commits

[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=sphinx&package-manager=pip&previous-version=6.1.3&new-version=7.1.2)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores) Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`. [//]: # (dependabot-automerge-start) [//]: # (dependabot-automerge-end) ---
Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR: - `@dependabot rebase` will rebase this PR - `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it - `@dependabot merge` will merge this PR after your CI passes on it - `@dependabot squash and merge` will squash and merge this PR after your CI passes on it - `@dependabot cancel merge` will cancel a previously requested merge and block automerging - `@dependabot reopen` will reopen this PR if it is closed - `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually - `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
---- :books: Documentation preview :books:: https://datasette--2125.org.readthedocs.build/en/2125/ ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2125/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1824399610,PR_kwDOBm6k_c5WjCS8,2121,Bump furo from 2023.3.27 to 2023.7.26,49699333,dependabot[bot],closed,0,,,,,2,2023-07-27T13:40:48Z,2023-08-07T16:20:23Z,2023-08-07T16:20:20Z,CONTRIBUTOR,simonw/datasette/pulls/2121,"Bumps [furo](https://github.com/pradyunsg/furo) from 2023.3.27 to 2023.7.26.
Changelog

Sourced from furo's changelog.

Changelog

2023.07.26 -- Vigilant Volt

  • Fix compatibility with Sphinx 7.1.
  • Improve how content overflow is handled.
  • Improve how literal blocks containing inline code are handled.

2023.05.20 -- Unassuming Ultramarine

  • ✨ Add support for Sphinx 7.
  • Drop support for Sphinx 5.
  • Improve the screen-reader label for sidebar collapse.
  • Make it easier to create derived themes from Furo.
  • Bump all JS dependencies (NodeJS and npm packages).

2023.03.27 -- Tasty Tangerine

  • Regenerate with newer version of sphinx-theme-builder, to fix RECORD hashes.
  • Add missing class to Font Awesome examples

2023.03.23 -- Sassy Saffron

  • Update Python version classifiers.
  • Increase the icon size in mobile header.
  • Increase admonition title bg opacity.
  • Change the default API background to transparent.
  • Transition the API background change.
  • Remove the "indent" of API entries which have a background.
  • Break long inline code literals.

2022.12.07 -- Reverent Raspberry

  • ✨ Add support for Sphinx 6.
  • ✨ Improve footnote presentation with docutils 0.18+.
  • Drop support for Sphinx 4.
  • Improve documentation about what the edit button does.
  • Improve handling of empty-flexboxes for better print experience on Chrome.
  • Improve styling for inline signatures.

... (truncated)

Commits
  • 35f5307 Prepare release: 2023.07.26
  • 0a8bedc Update changelog
  • a92dd0c Make _add_asset_hashes a no-op with Sphinx 7.1
  • f8db95b Improve literals with inline code are handled
  • 1680dbe Document the use of figclass with figure directive
  • beebd7e Increase the specificity of the admonition title selector
  • 834e951 Setup uploads to Percy
  • 27bf2c0 [pre-commit.ci] pre-commit autoupdate (#672)
  • c8b51d0 Fix how content overflow is handled
  • 80afa27 [pre-commit.ci] pre-commit autoupdate (#652)
  • Additional commits viewable in compare view

[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=furo&package-manager=pip&previous-version=2023.3.27&new-version=2023.7.26)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores) Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`. [//]: # (dependabot-automerge-start) [//]: # (dependabot-automerge-end) ---
Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR: - `@dependabot rebase` will rebase this PR - `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it - `@dependabot merge` will merge this PR after your CI passes on it - `@dependabot squash and merge` will squash and merge this PR after your CI passes on it - `@dependabot cancel merge` will cancel a previously requested merge and block automerging - `@dependabot reopen` will reopen this PR if it is closed - `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually - `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
---- :books: Documentation preview :books:: https://datasette--2121.org.readthedocs.build/en/2121/ ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2121/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1796830110,PR_kwDOBm6k_c5VFw3j,2098,Bump blacken-docs from 1.14.0 to 1.15.0,49699333,dependabot[bot],closed,0,,,,,2,2023-07-10T13:49:12Z,2023-08-07T16:20:22Z,2023-08-07T16:20:20Z,CONTRIBUTOR,simonw/datasette/pulls/2098,"Bumps [blacken-docs](https://github.com/asottile/blacken-docs) from 1.14.0 to 1.15.0.
Changelog

Sourced from blacken-docs's changelog.

1.15.0 (2023-07-09)

  • Drop Python 3.7 support.
Commits

[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=blacken-docs&package-manager=pip&previous-version=1.14.0&new-version=1.15.0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores) Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`. [//]: # (dependabot-automerge-start) [//]: # (dependabot-automerge-end) ---
Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR: - `@dependabot rebase` will rebase this PR - `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it - `@dependabot merge` will merge this PR after your CI passes on it - `@dependabot squash and merge` will squash and merge this PR after your CI passes on it - `@dependabot cancel merge` will cancel a previously requested merge and block automerging - `@dependabot reopen` will reopen this PR if it is closed - `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually - `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
---- :books: Documentation preview :books:: https://datasette--2098.org.readthedocs.build/en/2098/ ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2098/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1839766197,PR_kwDOBm6k_c5XWhWF,2128,"Bump blacken-docs, furo, blacken-docs",49699333,dependabot[bot],closed,0,,,,,1,2023-08-07T15:50:40Z,2023-08-07T16:19:25Z,2023-08-07T16:19:24Z,CONTRIBUTOR,simonw/datasette/pulls/2128,"Bumps the python-packages group with 3 updates: [sphinx](https://github.com/sphinx-doc/sphinx), [furo](https://github.com/pradyunsg/furo) and [blacken-docs](https://github.com/asottile/blacken-docs). Updates `sphinx` from 6.1.3 to 7.1.2
Release notes

Sourced from sphinx's releases.

Sphinx 7.1.2

Changelog: https://www.sphinx-doc.org/en/master/changes.html

Sphinx 7.1.1

Changelog: https://www.sphinx-doc.org/en/master/changes.html

Sphinx 7.1.0

Changelog: https://www.sphinx-doc.org/en/master/changes.html

v7.0.1

Changelog: https://www.sphinx-doc.org/en/master/changes.html

v7.0.0

Changelog: https://www.sphinx-doc.org/en/master/changes.html

v7.0.0rc1

Changelog: https://www.sphinx-doc.org/en/master/changes.html

v6.2.1

Changelog: https://www.sphinx-doc.org/en/master/changes.html

v6.2.0

Changelog: https://www.sphinx-doc.org/en/master/changes.html

Changelog

Sourced from sphinx's changelog.

Release 7.1.2 (released Aug 02, 2023)

Bugs fixed

  • #11542: linkcheck: Properly respect :confval:linkcheck_anchors and do not spuriously report failures to validate anchors. Patch by James Addison.

Release 7.1.1 (released Jul 27, 2023)

Bugs fixed

  • #11514: Fix SOURCE_DATE_EPOCH in multi-line copyright footer. Patch by Bénédikt Tran.

Release 7.1.0 (released Jul 24, 2023)

Incompatible changes

Deprecated

  • #11412: Emit warnings on using a deprecated Python-specific index entry type (namely, module, keyword, operator, object, exception, statement, and builtin) in the :rst:dir:index directive, and set the removal version to Sphinx 9. Patch by Adam Turner.

Features added

  • #11415: Add a checksum to JavaScript and CSS asset URIs included within generated HTML, using the CRC32 algorithm.
  • :meth:~sphinx.application.Sphinx.require_sphinx now allows the version requirement to be specified as (major, minor).
  • #11011: Allow configuring a line-length limit for object signatures, via :confval:maximum_signature_line_length and the domain-specific variants. If the length of the signature (in characters) is greater than the configured limit, each parameter in the signature will be split to its own logical line. This behaviour may also be controlled by options on object description directives, for example :rst:dir:py:function:single-line-parameter-list.

... (truncated)

Commits

Updates `furo` from 2023.3.27 to 2023.7.26
Changelog

Sourced from furo's changelog.

Changelog

2023.07.26 -- Vigilant Volt

  • Fix compatibility with Sphinx 7.1.
  • Improve how content overflow is handled.
  • Improve how literal blocks containing inline code are handled.

2023.05.20 -- Unassuming Ultramarine

  • ✨ Add support for Sphinx 7.
  • Drop support for Sphinx 5.
  • Improve the screen-reader label for sidebar collapse.
  • Make it easier to create derived themes from Furo.
  • Bump all JS dependencies (NodeJS and npm packages).

2023.03.27 -- Tasty Tangerine

  • Regenerate with newer version of sphinx-theme-builder, to fix RECORD hashes.
  • Add missing class to Font Awesome examples

2023.03.23 -- Sassy Saffron

  • Update Python version classifiers.
  • Increase the icon size in mobile header.
  • Increase admonition title bg opacity.
  • Change the default API background to transparent.
  • Transition the API background change.
  • Remove the "indent" of API entries which have a background.
  • Break long inline code literals.

2022.12.07 -- Reverent Raspberry

  • ✨ Add support for Sphinx 6.
  • ✨ Improve footnote presentation with docutils 0.18+.
  • Drop support for Sphinx 4.
  • Improve documentation about what the edit button does.
  • Improve handling of empty-flexboxes for better print experience on Chrome.
  • Improve styling for inline signatures.

... (truncated)

Commits
  • 35f5307 Prepare release: 2023.07.26
  • 0a8bedc Update changelog
  • a92dd0c Make _add_asset_hashes a no-op with Sphinx 7.1
  • f8db95b Improve literals with inline code are handled
  • 1680dbe Document the use of figclass with figure directive
  • beebd7e Increase the specificity of the admonition title selector
  • 834e951 Setup uploads to Percy
  • 27bf2c0 [pre-commit.ci] pre-commit autoupdate (#672)
  • c8b51d0 Fix how content overflow is handled
  • 80afa27 [pre-commit.ci] pre-commit autoupdate (#652)
  • Additional commits viewable in compare view

Updates `blacken-docs` from 1.14.0 to 1.15.0
Changelog

Sourced from blacken-docs's changelog.

1.15.0 (2023-07-09)

  • Drop Python 3.7 support.
Commits

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`. [//]: # (dependabot-automerge-start) [//]: # (dependabot-automerge-end) ---
Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR: - `@dependabot rebase` will rebase this PR - `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it - `@dependabot merge` will merge this PR after your CI passes on it - `@dependabot squash and merge` will squash and merge this PR after your CI passes on it - `@dependabot cancel merge` will cancel a previously requested merge and block automerging - `@dependabot reopen` will reopen this PR if it is closed - `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
---- :books: Documentation preview :books:: https://datasette--2128.org.readthedocs.build/en/2128/ ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2128/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1826424151,PR_kwDOBm6k_c5Wp6Hs,2124,Bump sphinx from 6.1.3 to 7.1.1,49699333,dependabot[bot],closed,0,,,,,2,2023-07-28T13:23:11Z,2023-08-02T13:28:47Z,2023-08-02T13:28:44Z,CONTRIBUTOR,simonw/datasette/pulls/2124,"Bumps [sphinx](https://github.com/sphinx-doc/sphinx) from 6.1.3 to 7.1.1.
Release notes

Sourced from sphinx's releases.

Sphinx 7.1.1

Changelog: https://www.sphinx-doc.org/en/master/changes.html

Sphinx 7.1.0

Changelog: https://www.sphinx-doc.org/en/master/changes.html

v7.0.1

Changelog: https://www.sphinx-doc.org/en/master/changes.html

v7.0.0

Changelog: https://www.sphinx-doc.org/en/master/changes.html

v7.0.0rc1

Changelog: https://www.sphinx-doc.org/en/master/changes.html

v6.2.1

Changelog: https://www.sphinx-doc.org/en/master/changes.html

v6.2.0

Changelog: https://www.sphinx-doc.org/en/master/changes.html

Changelog

Sourced from sphinx's changelog.

Release 7.1.1 (released Jul 27, 2023)

Bugs fixed

  • #11514: Fix SOURCE_DATE_EPOCH in multi-line copyright footer. Patch by Bénédikt Tran.

Release 7.1.0 (released Jul 24, 2023)

Incompatible changes

Deprecated

  • #11412: Emit warnings on using a deprecated Python-specific index entry type (namely, module, keyword, operator, object, exception, statement, and builtin) in the :rst:dir:index directive, and set the removal version to Sphinx 9. Patch by Adam Turner.

Features added

  • #11415: Add a checksum to JavaScript and CSS asset URIs included within generated HTML, using the CRC32 algorithm.
  • :meth:~sphinx.application.Sphinx.require_sphinx now allows the version requirement to be specified as (major, minor).
  • #11011: Allow configuring a line-length limit for object signatures, via :confval:maximum_signature_line_length and the domain-specific variants. If the length of the signature (in characters) is greater than the configured limit, each parameter in the signature will be split to its own logical line. This behaviour may also be controlled by options on object description directives, for example :rst:dir:py:function:single-line-parameter-list. Patch by Thomas Louf, Adam Turner, and Jean-François B.
  • #10983: Support for multiline copyright statements in the footer block. Patch by Stefanie Molin
  • sphinx.util.display.status_iterator now clears the current line with ANSI control codes, rather than overprinting with space characters.
  • #11431: linkcheck: Treat SSL failures as broken links. Patch by Bénédikt Tran
  • #11157: Keep the translated attribute on translated nodes.
  • #11451: Improve the traceback displayed when using :option:sphinx-build -T in parallel builds. Patch by Bénédikt Tran

... (truncated)

Commits

[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=sphinx&package-manager=pip&previous-version=6.1.3&new-version=7.1.1)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores) Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`. [//]: # (dependabot-automerge-start) [//]: # (dependabot-automerge-end) ---
Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR: - `@dependabot rebase` will rebase this PR - `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it - `@dependabot merge` will merge this PR after your CI passes on it - `@dependabot squash and merge` will squash and merge this PR after your CI passes on it - `@dependabot cancel merge` will cancel a previously requested merge and block automerging - `@dependabot reopen` will reopen this PR if it is closed - `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually - `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
---- :books: Documentation preview :books:: https://datasette--2124.org.readthedocs.build/en/2124/ ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2124/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1827427757,PR_kwDOD079W85WtUKG,38,photos-to-sql not found?,319473,coldclimate,closed,0,,,,,2,2023-07-29T09:59:42Z,2023-07-29T10:01:27Z,2023-07-29T10:01:23Z,FIRST_TIME_CONTRIBUTOR,dogsheep/dogsheep-photos/pulls/38,"I wonder if `photos-to-sql` is an old name for `dogsheep-photos`, because I can't find it anywhere. I can't actually get this command to work (`sqlite3.OperationalError: no such table: attached.ZGENERICASSET` thrown) but I don't think that's related",256834907,dogsheep-photos,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-photos/issues/38/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1820346348,PR_kwDOBm6k_c5WVYor,2107,Bump sphinx from 6.1.3 to 7.1.0,49699333,dependabot[bot],closed,0,,,,,2,2023-07-25T13:28:30Z,2023-07-28T13:23:19Z,2023-07-28T13:23:17Z,CONTRIBUTOR,simonw/datasette/pulls/2107,"Bumps [sphinx](https://github.com/sphinx-doc/sphinx) from 6.1.3 to 7.1.0.
Release notes

Sourced from sphinx's releases.

Sphinx 7.1.0

Changelog: https://www.sphinx-doc.org/en/master/changes.html

v7.0.1

Changelog: https://www.sphinx-doc.org/en/master/changes.html

v7.0.0

Changelog: https://www.sphinx-doc.org/en/master/changes.html

v7.0.0rc1

Changelog: https://www.sphinx-doc.org/en/master/changes.html

v6.2.1

Changelog: https://www.sphinx-doc.org/en/master/changes.html

v6.2.0

Changelog: https://www.sphinx-doc.org/en/master/changes.html

Changelog

Sourced from sphinx's changelog.

Release 7.1.0 (released Jul 24, 2023)

Incompatible changes

Deprecated

  • #11412: Emit warnings on using a deprecated Python-specific index entry type (namely, module, keyword, operator, object, exception, statement, and builtin) in the :rst:dir:index directive, and set the removal version to Sphinx 9. Patch by Adam Turner.

Features added

  • #11415: Add a checksum to JavaScript and CSS asset URIs included within generated HTML, using the CRC32 algorithm.
  • :meth:~sphinx.application.Sphinx.require_sphinx now allows the version requirement to be specified as (major, minor).
  • #11011: Allow configuring a line-length limit for object signatures, via :confval:maximum_signature_line_length and the domain-specific variants. If the length of the signature (in characters) is greater than the configured limit, each parameter in the signature will be split to its own logical line. This behaviour may also be controlled by options on object description directives, for example :rst:dir:py:function:single-line-parameter-list. Patch by Thomas Louf, Adam Turner, and Jean-François B.
  • #10983: Support for multiline copyright statements in the footer block. Patch by Stefanie Molin
  • sphinx.util.display.status_iterator now clears the current line with ANSI control codes, rather than overprinting with space characters.
  • #11431: linkcheck: Treat SSL failures as broken links. Patch by Bénédikt Tran
  • #11157: Keep the translated attribute on translated nodes.
  • #11451: Improve the traceback displayed when using :option:sphinx-build -T in parallel builds. Patch by Bénédikt Tran
  • #11324: linkcheck: Use session-based HTTP requests.
  • #11438: Add support for the :rst:dir:py:class and :rst:dir:py:function directives for PEP 695 (generic classes and functions declarations) and PEP 696 (default type parameters). Multi-line support (#11011) is enabled for type parameters list and can be locally controlled on object description directives, e.g., :rst:dir:py:function:single-line-type-parameter-list. Patch by Bénédikt Tran.
  • #11484: linkcheck: Allow HTML anchors to be ignored on a per-URL basis via :confval:linkcheck_anchors_ignore_for_url while

... (truncated)

Commits
  • e560f63 Bump to 7.1.0 final
  • 066e0fa Add translation progress information (#11509)
  • 0882914 Target PyPI in create-release.yml
  • 21fbee5 Fix OIDC token payload
  • 1a403e4 Add informational log messaging
  • 258b0ea Revert ""Switch to using github.request""
  • f9c89e5 Switch to using github.request
  • 52c7f66 Use the correct token minting URL for TestPyPI
  • 6079f28 Install twine in PyPI publish workflow
  • 3d43b9e Fix github-script syntax in create-release.yml
  • Additional commits viewable in compare view

[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=sphinx&package-manager=pip&previous-version=6.1.3&new-version=7.1.0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores) Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`. [//]: # (dependabot-automerge-start) [//]: # (dependabot-automerge-end) ---
Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR: - `@dependabot rebase` will rebase this PR - `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it - `@dependabot merge` will merge this PR after your CI passes on it - `@dependabot squash and merge` will squash and merge this PR after your CI passes on it - `@dependabot cancel merge` will cancel a previously requested merge and block automerging - `@dependabot reopen` will reopen this PR if it is closed - `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually - `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
---- :books: Documentation preview :books:: https://datasette--2107.org.readthedocs.build/en/2107/ ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2107/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1719759468,PR_kwDOBm6k_c5RBXH_,2077,Bump furo from 2023.3.27 to 2023.5.20,49699333,dependabot[bot],closed,0,,,,,3,2023-05-22T13:58:16Z,2023-07-27T13:40:55Z,2023-07-27T13:40:53Z,CONTRIBUTOR,simonw/datasette/pulls/2077,"Bumps [furo](https://github.com/pradyunsg/furo) from 2023.3.27 to 2023.5.20.
Changelog

Sourced from furo's changelog.

Changelog

2023.05.20 -- Unassuming Ultramarine

  • ✨ Add support for Sphinx 7.
  • Drop support for Sphinx 5.
  • Improve the screen-reader label for sidebar collapse.
  • Make it easier to create derived themes from Furo.
  • Bump all JS dependencies (NodeJS and npm packages).

2023.03.27 -- Tasty Tangerine

  • Regenerate with newer version of sphinx-theme-builder, to fix RECORD hashes.
  • Add missing class to Font Awesome examples

2023.03.23 -- Sassy Saffron

  • Update Python version classifiers.
  • Increase the icon size in mobile header.
  • Increase admonition title bg opacity.
  • Change the default API background to transparent.
  • Transition the API background change.
  • Remove the ""indent"" of API entries which have a background.
  • Break long inline code literals.

2022.12.07 -- Reverent Raspberry

  • ✨ Add support for Sphinx 6.
  • ✨ Improve footnote presentation with docutils 0.18+.
  • Drop support for Sphinx 4.
  • Improve documentation about what the edit button does.
  • Improve handling of empty-flexboxes for better print experience on Chrome.
  • Improve styling for inline signatures.
  • Replace the meta generator tag with a comment.
  • Tweak labels with icons to prevent users selecting icons as text on touch.

2022.09.29 -- Quaint Quartz

  • Add ability to set arbitrary URLs for edit button.

... (truncated)

Commits
  • d2c9ca8 Prepare release: 2023.05.20
  • 662d21b Update changelog
  • 591780b Bump compatible Sphinx version
  • c2e7837 Bump NodeJS and package versions
  • dd85574 Use the reference HtmlFormatter class defined on PygmentsBridge. (#657)
  • 6bff419 Fix broken link (#654)
  • e7f732e Improve the screen-reader label for sidebar collapse
  • 48c0bf2 Drop the check for the theme name
  • 1b17d81 [pre-commit.ci] pre-commit autoupdate (#646)
  • 4904fd5 Remove Python 3.8 constraint from Black pre-commit config (#647)
  • Additional commits viewable in compare view

[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=furo&package-manager=pip&previous-version=2023.3.27&new-version=2023.5.20)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores) You can trigger a rebase of this PR by commenting `@dependabot rebase`. [//]: # (dependabot-automerge-start) [//]: # (dependabot-automerge-end) ---
Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR: - `@dependabot rebase` will rebase this PR - `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it - `@dependabot merge` will merge this PR after your CI passes on it - `@dependabot squash and merge` will squash and merge this PR after your CI passes on it - `@dependabot cancel merge` will cancel a previously requested merge and block automerging - `@dependabot reopen` will reopen this PR if it is closed - `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually - `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
---- :books: Documentation preview :books:: https://datasette--2077.org.readthedocs.build/en/2077/ > **Note** > Automatic rebases have been disabled on this pull request as it has been open for over 30 days.",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2077/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1822934563,I_kwDOBm6k_c5sp8Yj,2109,Plan for getting the new JSON format query views working,9599,simonw,closed,0,,,9700784,Datasette 1.0a3,5,2023-07-26T18:20:18Z,2023-07-27T00:24:47Z,2023-07-26T18:25:34Z,OWNER,,"I've been stuck on this for too long. I'm breaking it down into a full milestone: https://github.com/simonw/datasette/milestone/29",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2109/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1823160748,I_kwDOCGYnMM5sqzms,581,`sqlite-utils convert --pdb` option,9599,simonw,closed,0,,,,,1,2023-07-26T21:02:50Z,2023-07-26T21:07:45Z,2023-07-26T21:06:10Z,OWNER,,While using `sqlite-utils convert` I realized it would be handy if you could pass `--pdb` to have it open the debugger at the first instance of a failed conversion.,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/581/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1822936521,I_kwDOBm6k_c5sp83J,2110,Merge database index page and query view,9599,simonw,closed,0,,,9700784,Datasette 1.0a3,1,2023-07-26T18:21:57Z,2023-07-26T19:53:25Z,2023-07-26T19:53:25Z,OWNER,,"Refs: - #2109 The idea here is that hitting `/content` without a `?sql=` will show an empty result set AND default to including a bunch of extras about the list of tables in the database. Then I won't have to think about `/content` and `/content?sql=` as separate pages any more.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2110/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1822949756,I_kwDOBm6k_c5sqAF8,2116,Turn DatabaseDownload into an async view function,9599,simonw,closed,0,,,9700784,Datasette 1.0a3,3,2023-07-26T18:31:59Z,2023-07-26T18:44:00Z,2023-07-26T18:44:00Z,OWNER,,"A minor refactor, but it is a good starting point for this new branch. 
Refs: - #2109",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2116/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1656432059,PR_kwDOBm6k_c5NuBNG,2053,WIP new JSON for queries,9599,simonw,closed,0,,,,,12,2023-04-05T23:26:15Z,2023-07-26T18:28:59Z,2023-07-26T18:26:45Z,OWNER,simonw/datasette/pulls/2053,"Refs: - #2049 TODO: - [x] Read queries JSON - Implement error display with `""ok"": false` and an errors key - Read queries HTML - Read queries other formats (plugins) - Canned read queries (dispatched to from table) - Write queries (a canned query thing) - Implement different shapes, refactoring to share code with table - Implement a sensible subset of extras, also refactoring to share code with table - Get all tests passing ---- :books: Documentation preview :books:: https://datasette--2053.org.readthedocs.build/en/2053/ ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2053/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1, 1816857442,I_kwDOBm6k_c5sSwti,2106,`datasette install -e` option,9599,simonw,closed,0,,,,,3,2023-07-22T18:33:42Z,2023-07-26T18:28:33Z,2023-07-22T18:42:54Z,OWNER,,"As seen in LLM and now in `sqlite-utils` too: - https://github.com/simonw/sqlite-utils/issues/570 Useful for developing plugins, see tutorial at https://llm.datasette.io/en/stable/plugins/tutorial-model-plugin.html",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2106/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1710164693,PR_kwDOBm6k_c5QhIL2,2075,Bump sphinx from 6.1.3 to 7.0.1,49699333,dependabot[bot],closed,0,,,,,2,2023-05-15T13:59:31Z,2023-07-25T13:28:39Z,2023-07-25T13:28:36Z,CONTRIBUTOR,simonw/datasette/pulls/2075,"Bumps [sphinx](https://github.com/sphinx-doc/sphinx) from 6.1.3 to 7.0.1.
Release notes

Sourced from sphinx's releases.

v7.0.1

Changelog: https://www.sphinx-doc.org/en/master/changes.html

v7.0.0

Changelog: https://www.sphinx-doc.org/en/master/changes.html

v7.0.0rc1

Changelog: https://www.sphinx-doc.org/en/master/changes.html

v6.2.1

Changelog: https://www.sphinx-doc.org/en/master/changes.html

v6.2.0

Changelog: https://www.sphinx-doc.org/en/master/changes.html

Changelog

Sourced from sphinx's changelog.

Release 7.0.1 (released May 12, 2023)

Dependencies

  • #11411: Support Docutils 0.20_. Patch by Adam Turner.

.. _Docutils 0.20: https://docutils.sourceforge.io/RELEASE-NOTES.html#release-0-20-2023-05-04

Bugs fixed

  • #11418: Clean up remaining references to sphinx.setup_command following the removal of support for setuptools. Patch by Willem Mulder.

Release 7.0.0 (released Apr 29, 2023)

Incompatible changes

  • #11359: Remove long-deprecated aliases for MecabSplitter and DefaultSplitter in sphinx.search.ja.
  • #11360: Remove deprecated make_old_id functions in domain object description classes.
  • #11363: Remove the Setuptools integration (build_sphinx hook in setup.py).
  • #11364: Remove deprecated sphinx.ext.napoleon.iterators module.
  • #11365: Remove support for the jsdump format in sphinx.search.
  • #11366: Make locale a required argument to sphinx.util.i18n.format_date().
  • #11370: Remove deprecated sphinx.util.stemmer module.
  • #11371: Remove deprecated sphinx.pycode.ast.parse() function.
  • #11372: Remove deprecated sphinx.io.read_doc() function.
  • #11373: Removed deprecated sphinx.util.get_matching_files() function.
  • #11378: Remove deprecated sphinx.util.docutils.is_html5_writer_available() function.
  • #11379: Make the env argument to Builder subclasses required.
  • #11380: autosummary: Always emit grouped import exceptions.
  • #11381: Remove deprecated style key for HTML templates.
  • #11382: Remove deprecated sphinx.writers.latex.LaTeXTranslator.docclasses attribute.
  • #11383: Remove deprecated sphinx.builders.html.html5_ready and sphinx.builders.html.HTMLTranslator attributes.
  • #11385: Remove support for HTML 4 output.

Release 6.2.1 (released Apr 25, 2023)

... (truncated)

Commits

[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=sphinx&package-manager=pip&previous-version=6.1.3&new-version=7.0.1)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores) You can trigger a rebase of this PR by commenting `@dependabot rebase`. [//]: # (dependabot-automerge-start) [//]: # (dependabot-automerge-end) ---
Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR: - `@dependabot rebase` will rebase this PR - `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it - `@dependabot merge` will merge this PR after your CI passes on it - `@dependabot squash and merge` will squash and merge this PR after your CI passes on it - `@dependabot cancel merge` will cancel a previously requested merge and block automerging - `@dependabot reopen` will reopen this PR if it is closed - `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually - `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
---- :books: Documentation preview :books:: https://datasette--2075.org.readthedocs.build/en/2075/ > **Note** > Automatic rebases have been disabled on this pull request as it has been open for over 30 days. ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2075/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1817281557,I_kwDOC8SPRc5sUYQV,37,cannot use jinja filters in display?,10352819,rprimet,closed,0,,,,,1,2023-07-23T20:09:54Z,2023-07-23T20:18:27Z,2023-07-23T20:18:26Z,NONE,,"Hi, I'm trying to have a display function in Dogsheep's `config.yml` that includes something like this: ```

{{ display.title }} (source)

{{ display.snippet|safe }}

``` Unfortunately, rendering fails with a message 'urls is undefined'. The same happens if I'm trying to build a row URL manually, using filters like `quote_plus` (as my keys are URLs). Any hints? Thanks!",197431109,dogsheep-beta,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-beta/issues/37/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1816997390,I_kwDOCGYnMM5sTS4O,576,Backfill the release notes prior to 0.4,9599,simonw,closed,0,,,,,2,2023-07-23T05:41:42Z,2023-07-23T05:49:51Z,2023-07-23T05:48:21Z,OWNER,,"Currently the changelog starts at 0.4: https://sqlite-utils.datasette.io/en/3.34/changelog.html#id115 I want the other releases - according to https://pypi.org/project/sqlite-utils/#history there are three missing: ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/576/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1816919568,I_kwDOCGYnMM5sS_4Q,575,Python API ability to opt-out of connection plugins,9599,simonw,closed,0,,,,,2,2023-07-22T23:01:13Z,2023-07-22T23:17:22Z,2023-07-22T23:08:22Z,OWNER,,"Plugins affecting the CLI by default makes sense to me. I'm less confident about them _always_ affecting users of the Python API. I'm going to have them apply by default, but I'm going to add a mechanism to opt-out on an individual database basis. Basically this: ```python from sqlite_utils import Database db = Database(memory=True, execute_plugins=False) # Anything using db from here on will not execute plugins ``` cc @asg017 Refs: - #567 - #574 ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/575/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1816918185,I_kwDOCGYnMM5sS_ip,574,`prepare_connection()` plugin hook,9599,simonw,closed,0,,,,,3,2023-07-22T22:52:47Z,2023-07-22T23:13:14Z,2023-07-22T22:59:10Z,OWNER,,"> Splitting off an issue for `prepare_connection()` since Alex got the PR in seconds before I shipped 3.34! _Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/567#issuecomment-1646686424_ ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/574/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1801394744,I_kwDOCGYnMM5rXxo4,567,Plugin system,15178711,asg017,closed,0,,,,,9,2023-07-12T17:02:14Z,2023-07-22T22:59:37Z,2023-07-22T22:59:36Z,CONTRIBUTOR,,"I'd like there to be a plugin system for sqlite-utils, similar to the datasette/llm plugins. I'd like to make plugins that would do things like: - Register SQLite extensions for more SQL functions + virtual tables - Register new subcommands - Different input file formats for `sqlite-utils memory` - Different output file formats (in addition to `--csv` `--tsv` `--nl` etc. A few real-world use-cases of plugins I'd like to see in sqlite-utils: - Register many of my sqlite extensions in sqlite-utils (`sqlite-http`, `sqlite-lines`, `sqlite-regex`, etc.) 
- New subcommands to work with `sqlite-vss` vector tables - Input/ouput Parquet/Avro/Arrow IPC files with `sqlite-arrow`",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/567/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1816917522,PR_kwDOCGYnMM5WJ6Jm,573,feat: Implement a prepare_connection plugin hook,15178711,asg017,closed,0,,,,,4,2023-07-22T22:48:44Z,2023-07-22T22:59:09Z,2023-07-22T22:59:09Z,CONTRIBUTOR,simonw/sqlite-utils/pulls/573,"Just like the [Datasette prepare_connection hook](https://docs.datasette.io/en/stable/plugin_hooks.html#prepare-connection-conn-database-datasette), this PR adds a similar hook for the `sqlite-utils` plugin system. The sole argument is `conn`, since I don't believe a `database` or `datasette` argument would be relevant here. I want to do this so I can release `sqlite-utils` plugins for my [SQLite extensions](https://github.com/asg017/sqlite-ecosystem), similar to the Datasette plugins I've release for them. An example plugin: https://gist.github.com/asg017/d7cdf0d56e2be87efda28cebee27fa3c ```bash $ sqlite-utils install https://gist.github.com/asg017/d7cdf0d56e2be87efda28cebee27fa3c/archive/5f5ad549a40860787629c69ca120a08c32519e99.zip $ sqlite-utils memory 'select hello(""alex"") as response' [{""response"": ""Hello, alex!""}] ``` Refs: - #574 ---- :books: Documentation preview :books:: https://sqlite-utils--573.org.readthedocs.build/en/573/ ",140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/573/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1816876211,I_kwDOCGYnMM5sS1Sz,571,`.transform(keep_table=...)` option,9599,simonw,closed,0,,,,,1,2023-07-22T19:49:29Z,2023-07-22T22:32:18Z,2023-07-22T22:32:18Z,OWNER,,">> Also need a design for an option for the `.transform()` method to indicate that the new table should be created with a new name without dropping the old one. > > I think `keep_table=""name_of_table""` is good for this. _Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/565#issuecomment-1646657324_ ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/571/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1816877910,I_kwDOCGYnMM5sS1tW,572,Don't test Python 3.7 against textual,9599,simonw,closed,0,,,,,2,2023-07-22T19:57:03Z,2023-07-22T22:16:50Z,2023-07-22T22:16:50Z,OWNER,,"Spotted this in the GitHub Actions logs: ![IMG_5046](https://github.com/simonw/sqlite-utils/assets/9599/81fb1093-cd8a-4019-a612-2e49b500c933) ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/572/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1786243905,I_kwDOCGYnMM5qd-tB,564,Document that running `db.transform()` tidies up the schema indentation,9599,simonw,closed,0,,,,,0,2023-07-03T13:59:28Z,2023-07-22T22:15:34Z,2023-07-22T22:15:34Z,OWNER,,"> ... and it turns out running `.transform()` with no arguments still fixes the format of the schema! ```pycon >>> db[""log""].add_column(""foo"", str) >>> db[""log""].add_column(""bar"", str)
>>> db[""log""].add_column(""baz"", str)
>>> print(db[""log""].schema) CREATE TABLE ""log"" ( [id] INTEGER PRIMARY KEY, [name2] TEXT, [age] INTEGER, [weight] FLOAT , [foo] TEXT, [bar] TEXT, [baz] TEXT) >>> db[""log""].transform()
>>> print(db[""log""].schema) CREATE TABLE ""log"" ( [id] INTEGER PRIMARY KEY, [name2] TEXT, [age] INTEGER, [weight] FLOAT, [foo] TEXT, [bar] TEXT, [baz] TEXT ) ``` _Originally posted by @simonw in https://github.com/simonw/llm/issues/65#issuecomment-1618347727_ ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/564/reactions"", ""total_count"": 1, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 1}",,completed 1205687423,I_kwDOCGYnMM5H3VR_,426,CLI docs should link to Python docs and vice versa,9599,simonw,closed,0,9599,simonw,,,1,2022-04-15T16:05:15Z,2023-07-22T22:13:22Z,2023-07-22T22:13:22Z,OWNER,,"For every command/API method there should be a link to the equivalent in the other form factor. Maybe also link to the API and CLI reference pages too.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/426/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1786258502,I_kwDOCGYnMM5qeCRG,565,Table renaming: db.rename_table() and sqlite-utils rename-table,9599,simonw,closed,0,,,,,6,2023-07-03T14:07:42Z,2023-07-22T22:12:40Z,2023-07-22T22:12:40Z,OWNER,,"> I find myself wanting two new features in `sqlite-utils`: > - The ability to have the new transformed table set to a specific name, while keeping the old table around > - The ability to rename a table (`sqlite-utils` doesn't have a table rename function at all right now) _Originally posted by @simonw in https://github.com/simonw/llm/issues/65#issuecomment-1618375042_ ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/565/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1816851056,I_kwDOCGYnMM5sSvJw,568,"table.create(..., replace=True)",9599,simonw,closed,0,,,,,7,2023-07-22T18:12:22Z,2023-07-22T19:25:35Z,2023-07-22T19:15:44Z,OWNER,,"Found myself using this pattern to quickly prototype a schema: ```python import sqlite_utils db = sqlite_utils.Database(memory=True) print(db[""answers_chunks""].create({ ""id"": int, ""content"": str, ""embedding_type_id"": int, ""embedding"": bytes, ""embedding_content_md5"": str, ""source"": str, }, pk=""id"", transform=True).schema) ``` Using `replace=True` to drop and then recreate the table would be neat here, and would be consistent with other places that use `replace=True`.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/568/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1816852402,I_kwDOCGYnMM5sSvey,569,register_command plugin hook,9599,simonw,closed,0,,,,,3,2023-07-22T18:17:27Z,2023-07-22T19:19:35Z,2023-07-22T19:19:35Z,OWNER,,"> I'm going to start by adding the `register_command` hook using the exact same pattern as Datasette and LLM. 
_Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/567#issuecomment-1646643450_ ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/569/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1816857105,I_kwDOCGYnMM5sSwoR,570,`sqlite-utils install -e` option,9599,simonw,closed,0,,,,,0,2023-07-22T18:32:23Z,2023-07-22T18:55:59Z,2023-07-22T18:32:56Z,OWNER,,"As seen in LLM. Needed while working on: - #567",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/570/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1798901709,PR_kwDOBm6k_c5VM2MK,2099,Bump black from 23.3.0 to 23.7.0,49699333,dependabot[bot],closed,0,,,,,0,2023-07-11T13:05:53Z,2023-07-21T21:19:25Z,2023-07-21T21:19:24Z,CONTRIBUTOR,simonw/datasette/pulls/2099,"Bumps [black](https://github.com/psf/black) from 23.3.0 to 23.7.0.
Release notes

Sourced from black's releases.

23.7.0

Highlights

  • Runtime support for Python 3.7 has been removed. Formatting 3.7 code will still be supported until further notice (#3765)

Stable style

  • Fix a bug where an illegal trailing comma was added to return type annotations using PEP 604 unions (#3735)
  • Fix several bugs and crashes where comments in stub files were removed or mishandled under some circumstances (#3745)
  • Fix a crash with multi-line magic comments like type: ignore within parentheses (#3740)
  • Fix error in AST validation when Black removes trailing whitespace in a type comment (#3773)

Preview style

  • Implicitly concatenated strings used as function args are no longer wrapped inside parentheses (#3640)
  • Remove blank lines between a class definition and its docstring (#3692)

Configuration

  • The --workers argument to Black can now be specified via the BLACK_NUM_WORKERS environment variable (#3743)
  • .pytest_cache, .ruff_cache and .vscode are now excluded by default (#3691)
  • Fix Black not honouring pyproject.toml settings when running --stdin-filename and the pyproject.toml found isn't in the current working directory (#3719)
  • Black will now error if exclude and extend-exclude have invalid data types in pyproject.toml, instead of silently doing the wrong thing (#3764)

Packaging

  • Upgrade mypyc from 0.991 to 1.3 (#3697)
  • Remove patching of Click that mitigated errors on Python 3.6 with LANG=C (#3768)

Parser

  • Add support for the new PEP 695 syntax in Python 3.12 (#3703)

Performance

  • Speed up Black significantly when the cache is full (#3751)
  • Avoid importing IPython in a case where we wouldn't need it (#3748)

Output

... (truncated)

Changelog

Sourced from black's changelog.

23.7.0

Highlights

  • Runtime support for Python 3.7 has been removed. Formatting 3.7 code will still be supported until further notice (#3765)

Stable style

  • Fix a bug where an illegal trailing comma was added to return type annotations using PEP 604 unions (#3735)
  • Fix several bugs and crashes where comments in stub files were removed or mishandled under some circumstances (#3745)
  • Fix a crash with multi-line magic comments like type: ignore within parentheses (#3740)
  • Fix error in AST validation when Black removes trailing whitespace in a type comment (#3773)

Preview style

  • Implicitly concatenated strings used as function args are no longer wrapped inside parentheses (#3640)
  • Remove blank lines between a class definition and its docstring (#3692)

Configuration

  • The --workers argument to Black can now be specified via the BLACK_NUM_WORKERS environment variable (#3743)
  • .pytest_cache, .ruff_cache and .vscode are now excluded by default (#3691)
  • Fix Black not honouring pyproject.toml settings when running --stdin-filename and the pyproject.toml found isn't in the current working directory (#3719)
  • Black will now error if exclude and extend-exclude have invalid data types in pyproject.toml, instead of silently doing the wrong thing (#3764)

Packaging

  • Upgrade mypyc from 0.991 to 1.3 (#3697)
  • Remove patching of Click that mitigated errors on Python 3.6 with LANG=C (#3768)

Parser

  • Add support for the new PEP 695 syntax in Python 3.12 (#3703)

Performance

  • Speed up Black significantly when the cache is full (#3751)
  • Avoid importing IPython in a case where we wouldn't need it (#3748)

Output

... (truncated)

Commits

[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=black&package-manager=pip&previous-version=23.3.0&new-version=23.7.0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores) Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`. [//]: # (dependabot-automerge-start) [//]: # (dependabot-automerge-end) ---
Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR: - `@dependabot rebase` will rebase this PR - `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it - `@dependabot merge` will merge this PR after your CI passes on it - `@dependabot squash and merge` will squash and merge this PR after your CI passes on it - `@dependabot cancel merge` will cancel a previously requested merge and block automerging - `@dependabot reopen` will reopen this PR if it is closed - `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually - `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
---- :books: Documentation preview :books:: https://datasette--2099.org.readthedocs.build/en/2099/ ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2099/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1795187493,I_kwDODLZ_YM5rAGMl,12,Switch to pyproject.toml,9599,simonw,closed,0,,,,,2,2023-07-09T01:06:56Z,2023-07-09T01:19:43Z,2023-07-09T01:19:42Z,MEMBER,,First of my CLI tools to use https://til.simonwillison.net/python/pyproject,213286752,pocket-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/pocket-to-sqlite/issues/12/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 771202454,MDU6SXNzdWU3NzEyMDI0NTQ=,1153,"Use YAML examples in documentation by default, not JSON",9599,simonw,closed,0,,,,,22,2020-12-18T22:20:15Z,2023-07-08T20:09:48Z,2023-07-08T20:08:13Z,OWNER,,"YAML configuration is much better for multi-line strings, and I'm increasingly adding configuration options to Datasette that benefit from that - fragments of HTML in `description_html` or SQL queries used to configure things like https://github.com/simonw/datasette-atom for example. Rather than confusing things by showing both in the documentation, I should switch all of the default examples to use YAML instead.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1153/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1785360409,I_kwDOCGYnMM5qanAZ,563,`--empty-null` option when importing CSV,9599,simonw,closed,0,,,,,1,2023-07-03T05:23:36Z,2023-07-03T05:44:43Z,2023-07-03T05:42:30Z,OWNER,,"CSV files with empty cells in (which come through as the empty string) are common and a bit gross. Having an option that means ""and if it's an empty string store `null` instead) would be cool. I brainstormed name options here https://chat.openai.com/share/c947b738-ee7d-419c-af90-bc84e90987da",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/563/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1355148385,I_kwDOBm6k_c5Qxexh,1796,Research an upgrade to CodeMirror 6,9599,simonw,closed,0,,,,,4,2022-08-30T04:27:46Z,2023-07-03T04:58:21Z,2023-07-03T04:58:21Z,OWNER,,"There are still a bunch of bugs in CodeMirror 5 that affect various mobile browsers - see Datasette Discord report here: https://discord.com/channels/823971286308356157/823971286941302908/1013878624992108645 https://user-images.githubusercontent.com/9599/187349269-7b7c0c8c-3894-4810-82f0-de7c1eb940b3.mp4 ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1796/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1781047747,I_kwDOBm6k_c5qKKHD,2092,test_homepage intermittent failure,9599,simonw,closed,0,,,,,2,2023-06-29T15:20:37Z,2023-06-29T15:26:28Z,2023-06-29T15:24:13Z,OWNER,,"e.g. 
in https://github.com/simonw/datasette/actions/runs/5413590227/jobs/9839373852 ``` =================================== FAILURES =================================== ________________________________ test_homepage _________________________________ [gw0] linux -- Python 3.7.17 /opt/hostedtoolcache/Python/3.7.17/x64/bin/python ds_client = @pytest.mark.asyncio async def test_homepage(ds_client): response = await ds_client.get(""/.json"") assert response.status_code == 200 assert ""application/json; charset=utf-8"" == response.headers[""content-type""] data = response.json() assert data.keys() == {""fixtures"": 0}.keys() d = data[""fixtures""] assert d[""name""] == ""fixtures"" assert d[""tables_count""] == 24 assert len(d[""tables_and_views_truncated""]) == 5 assert d[""tables_and_views_more""] is True # 4 hidden FTS tables + no_primary_key (hidden in metadata) assert d[""hidden_tables_count""] == 6 # 201 in no_primary_key, plus 6 in other hidden tables: > assert d[""hidden_table_rows_sum""] == 207, data E AssertionError: {'fixtures': {'color': '9403e5', 'hash': None, 'hidden_table_rows_sum': 0, 'hidden_tables_count': 6, ...}} E assert 0 == 207 ``` My guess is that this is a timing error, where very occasionally the ""count rows but stop counting if it exceeds a time limit"" thing fails.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2092/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1780973290,I_kwDOBm6k_c5qJ37q,2089,codespell test failure,9599,simonw,closed,0,,,,,5,2023-06-29T14:40:10Z,2023-06-29T14:48:11Z,2023-06-29T14:48:10Z,OWNER,,"https://github.com/simonw/datasette/actions/runs/5413443676/jobs/9838999356 ``` codespell docs/*.rst --ignore-words docs/codespell-ignore-words.txt codespell datasette -S datasette/static --ignore-words docs/codespell-ignore-words.txt shell: /usr/bin/bash -e {0} env: pythonLocation: /opt/hostedtoolcache/Python/3.9.17/x64 LD_LIBRARY_PATH: /opt/hostedtoolcache/Python/3.9.17/x64/lib docs/metadata.rst:192: displaing ==> displaying ``` This failure is legit, it found a spelling mistake: https://github.com/simonw/datasette/blob/ede62036180993dbd9d4e5d280fc21c183cda1c3/docs/metadata.rst#L192",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2089/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1756975532,PR_kwDOBm6k_c5S_5Jl,2083,Bump blacken-docs from 1.13.0 to 1.14.0,49699333,dependabot[bot],closed,0,,,,,0,2023-06-14T13:57:52Z,2023-06-29T14:31:55Z,2023-06-29T14:31:54Z,CONTRIBUTOR,simonw/datasette/pulls/2083,"Bumps [blacken-docs](https://github.com/asottile/blacken-docs) from 1.13.0 to 1.14.0.
Changelog

Sourced from blacken-docs's changelog.

1.14.0 (2023-06-13)

  • Support Python 3.12.
Commits

[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=blacken-docs&package-manager=pip&previous-version=1.13.0&new-version=1.14.0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores) Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`. [//]: # (dependabot-automerge-start) [//]: # (dependabot-automerge-end) ---
Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR: - `@dependabot rebase` will rebase this PR - `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it - `@dependabot merge` will merge this PR after your CI passes on it - `@dependabot squash and merge` will squash and merge this PR after your CI passes on it - `@dependabot cancel merge` will cancel a previously requested merge and block automerging - `@dependabot reopen` will reopen this PR if it is closed - `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually - `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
---- :books: Documentation preview :books:: https://datasette--2083.org.readthedocs.build/en/2083/ ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2083/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1777548699,I_kwDOCGYnMM5p8z2b,561,`--stop-after` option for `insert` and `upsert` commands,9599,simonw,closed,0,,,,,1,2023-06-27T18:44:15Z,2023-06-27T18:50:09Z,2023-06-27T18:50:08Z,OWNER,,I found myself wanting to insert rows from a 849MB CSV file without processing the whole thing: https://huggingface.co/datasets/jerpint-org/HackAPrompt-Playground-Submissions/tree/main,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/561/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1773458985,PR_kwDOCGYnMM5T2mMb,560,Use sqlean if available in environment,9599,simonw,closed,0,,,,,10,2023-06-25T19:48:48Z,2023-06-26T08:21:00Z,2023-06-25T23:25:51Z,OWNER,simonw/sqlite-utils/pulls/560,"Refs: - #559 ---- :books: Documentation preview :books:: https://sqlite-utils--560.org.readthedocs.build/en/560/ ",140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/560/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 810618495,MDU6SXNzdWU4MTA2MTg0OTU=,235,Extract columns cannot create foreign key relation: sqlite3.OperationalError: table sqlite_master may not be modified,6913891,kristomi,closed,0,,,,,18,2021-02-17T23:33:23Z,2023-06-26T01:47:01Z,2023-06-25T23:25:53Z,NONE,,"Thanks for what seems like a truly great suite of libraries. I wanted to try out Datasette, but never got more than half way through your YouTube video with the SF tree dataset. Whenever I try to extract a column, I get a `sqlite3.OperationalError: table sqlite_master may not be modified` error from Python. This snippet reproduces the error on my system, Python 3.9.1 and sqlite-utils 3.5 on an M1 Macbook Pro running in rosetta mode: ``` curl ""https://data.nasa.gov/resource/y77d-th95.json"" | \ sqlite-utils insert meteorites.db meteorites - --pk=id sqlite-utils extract meteorites.db meteorites recclass ``` I have tried googling the problem, but all I've found is that this *might* be a problem with the sqlite3 database running in defensive mode, but I definitely can't know for sure. Does the problem seem familiar to you? ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/235/reactions"", ""total_count"": 3, ""+1"": 3, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1773450152,I_kwDOCGYnMM5ptLOo,559,sqlean support,9599,simonw,closed,0,,,,,0,2023-06-25T19:27:26Z,2023-06-25T23:25:53Z,2023-06-25T23:25:53Z,OWNER,,"If sqlean is available, use that. 
Refs: - https://github.com/nalgeon/sqlean.py/issues/1#issuecomment-1605707788 This will provide a good workaround for: - #235 ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/559/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1655860104,I_kwDOCGYnMM5ismuI,535,rows: --transpose or psql extended view-like functionality,7908073,chapmanjacobd,closed,0,,,,,2,2023-04-05T15:37:33Z,2023-06-15T08:39:49Z,2023-06-14T22:05:28Z,CONTRIBUTOR,,"It would be nice if the rows subcommand had a flag, perhaps called `--transpose` which would print in long form instead of wide. Similar to extended display mode in psql (`\x`) In other words instead of this: ``` sqlite-utils rows --limit 5 --fmt github track_metadata.db songs ``` | track_id | title | song_id | release | artist_id | artist_mbid | artist_name | duration | artist_familiarity | artist_hotttnesss | year | track_7digitalid | shs_perf | shs_work | |--------------------|-------------------|--------------------|--------------------------------------|--------------------|--------------------------------------|------------------|------------|----------------------|---------------------|--------|--------------------|------------|------------| | TRMMMYQ128F932D901 | Silent Night | SOQMMHC12AB0180CB8 | Monster Ballads X-Mas | ARYZTJS1187B98C555 | 357ff05d-848a-44cf-b608-cb34b5701ae5 | Faster Pussy cat | 252.055 | 0.649822 | 0.394032 | 2003 | 7032331 | -1 | 0 | | TRMMMKD128F425225D | Tanssi vaan | SOVFVAK12A8C1350D9 | Karkuteillä | ARMVN3U1187FB3A1EB | 8d7ef530-a6fd-4f8f-b2e2-74aec765e0f9 | Karkkiautomaatti | 156.551 | 0.439604 | 0.356992 | 1995 | 1514808 | -1 | 0 | | TRMMMRX128F93187D9 | No One Could Ever | SOGTUKN12AB017F4F1 | Butter | ARGEKB01187FB50750 | 3d403d44-36ce-465c-ad43-ae877e65adc4 | Hudson Mohawke | 138.971 | 0.643681 | 0.437504 | 2006 | 6945353 | -1 | 0 | | TRMMMCH128F425532C | Si Vos Querés | SOBNYVR12A8C13558C | De Culo | ARNWYLR1187B9B2F9C | 12be7648-7094-495f-90e6-df4189d68615 | Yerba Brava | 145.058 | 0.448501 | 0.372349 | 2003 | 2168257 | -1 | 0 | | TRMMMWA128F426B589 | Tangle Of Aspens | SOHSBXH12A8C13B0DF | Rene Ablaze Presents Winter Sessions | AREQDTE1269FB37231 | | Der Mystic | 514.298 | 0 | 0 | 0 | 2264873 | -1 | 0 | The output would look something like this: ``` $ for col in (sqlite-columns track_metadata.db songs) sqlite-utils --fmt github track_metadata.db ""select $col from songs order by rowid desc limit 5"" end ``` | track_id | |--------------------| | TRYYYVU12903CD01E3 | | TRYYYDJ128F9310A21 | | TRYYYMG128F4260ECA | | TRYYYJO128F426DA37 | | TRYYYUS12903CD2DF0 | | title | |-------------------------------------| | Fernweh feat. Sektion Kuchikäschtli | | Faraday | | Novemba | | Jago Chhadeo | | O Samba Da Vida | | song_id | |--------------------| | SOWXJXQ12AB0189F43 | | SOLXGOR12A81C21EB7 | | SOHODZI12A8C137BB3 | | SOXQYIQ12A8C137FBB | | SOTXAME12AB018F136 | | release | |---------------------------------| | So Oder So | | The Trance Collection Vol. 2 | | Dub_Connected: electronic music | | Naale Baba Lassi Pee Gya | | Pacha V.I.P. 
| | artist_id | |--------------------| | AR7PLM21187B990D08 | | ARCMCOK1187B9B1073 | | ARZ3R6M1187B9AF750 | | ART5FZD1187B9A7FCF | | AR7Z4J81187FB3FC59 | | artist_mbid | |--------------------------------------| | 3af2b07e-c91c-4160-9bda-f0b9e3144ed3 | | 4ac5f3de-c5ad-475e-ad50-41f1ef9dba20 | | 8b97e9c8-61f5-4615-9a96-276f24204e34 | | 2357c400-9109-42b6-b3fe-9e2d9f8e3872 | | 9d50cb20-7e42-45cc-b0dd-154c3e92a577 | | artist_name | |----------------| | Texta | | Elude | | Gabriel Le Mar | | Kuldeep Manak | | Kiko Navarro | | duration | |------------| | 295.079 | | 484.519 | | 553.038 | | 244.166 | | 217.443 | | artist_familiarity | |----------------------| | 0.552977 | | 0.403668 | | 0.556918 | | 0.4015 | | 0.528617 | | artist_hotttnesss | |---------------------| | 0.454869 | | 0.256935 | | 0.336914 | | 0.374866 | | 0.411595 | | year | |--------| | 2004 | | 0 | | 0 | | 0 | | 0 | | track_7digitalid | |--------------------| | 8486723 | | 5472456 | | 2219291 | | 1632096 | | 7522478 | | shs_perf | |------------| | -1 | | -1 | | -1 | | -1 | | -1 | | shs_work | |------------| | 0 | | 0 | | 0 | | 0 | | 0 | ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/535/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1581090327,I_kwDOCGYnMM5ePYYX,529,Microsoft line endings,7908073,chapmanjacobd,closed,0,,,,,1,2023-02-12T02:20:48Z,2023-06-14T23:12:12Z,2023-06-14T23:11:47Z,CONTRIBUTOR,,"sqlite-utils prints `\r\n` but [it should probably](https://devblogs.microsoft.com/commandline/extended-eol-in-notepad/) print `\n` (unless the platform is detected as Windows?) It has tripped me up a few times when piping the output of sqlite-utils to other programs: ``` $ sqlite-utils --no-headers --csv ~/lb/fs/d.db 'select path from media limit 1' | cat -A /mnt/d7/file^M$ $ sqlite-utils --no-headers --csv ~/lb/fs/d.db 'select path from media limit 1' | tr -d '\r' | cat -A /mnt/d7/file$ ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/529/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1740150327,I_kwDOCGYnMM5nuJY3,557,Aliased ROWID option for tables created from alter=True commands,7908073,chapmanjacobd,closed,0,,,,,2,2023-06-04T05:29:28Z,2023-06-14T06:09:21Z,2023-06-05T19:26:26Z,CONTRIBUTOR,,"> If you use INTEGER PRIMARY KEY column, the VACUUM does not change the values of that column. However, if you use unaliased rowid, the VACUUM command will reset the rowid values. ROWID should never be used with foreign keys but the simple act of aliasing rowid to id (which is what happens when one does `id integer primary key` DDL) makes it OK. It would be convenient if there were more options to use a string column (eg. filepath) as the PK, and be able to use it during upserts, but when creating a foreign key, to create an integer column which aliases rowid I made an attempt to switch to integer primary keys here but it is not going well... In my usecase the path column is a business key. 
Yes, it should be as simple as including the `id` column in any select statement where I plan on using `upsert` but it would be nice if this could be abstracted away somehow https://github.com/chapmanjacobd/library/commit/788cd125be01d76f0fe2153335d9f6b21db1343c https://github.com/chapmanjacobd/library/actions/runs/5173602136/jobs/9319024777",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/557/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1726236847,I_kwDOBm6k_c5m5Eiv,2078,Resolve the difference between `wrap_view()` and `BaseView`,9599,simonw,closed,0,,,,,16,2023-05-25T17:44:32Z,2023-05-26T00:18:46Z,2023-05-26T00:18:46Z,OWNER,,"There are two patterns for implementing views in Datasette at the moment. I want to combine those. Part of: - #2053",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2078/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1726603778,PR_kwDOBm6k_c5RYvTU,2080,New View base class,9599,simonw,closed,0,,,,,3,2023-05-25T23:22:55Z,2023-05-26T00:18:45Z,2023-05-26T00:18:44Z,OWNER,simonw/datasette/pulls/2080,"Refs: - #2078 TODO: - [x] Teach router layer how to handle this - [x] Use it for something ---- :books: Documentation preview :books:: https://datasette--2080.org.readthedocs.build/en/2080/ ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2080/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1726531350,I_kwDOBm6k_c5m6McW,2079,Datasette should serve Access-Control-Max-Age,9599,simonw,closed,0,,,,,8,2023-05-25T21:50:50Z,2023-05-25T22:56:28Z,2023-05-25T22:08:35Z,OWNER,,"Currently the CORS headers served are: https://github.com/simonw/datasette/blob/9584879534ff0556e04e4c420262972884cac87b/datasette/utils/__init__.py#L1139-L1143 Serving `Access-Control-Max-Age: 600` would allow browsers to cache that for 10 minutes, avoiding additional CORS pre-flight OPTIONS requests during that time.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2079/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1718612569,I_kwDOCGYnMM5mb_JZ,552,Document how to setup shell auto-completion,9599,simonw,closed,0,,,,,1,2023-05-21T19:20:41Z,2023-05-21T21:05:16Z,2023-05-21T21:03:40Z,OWNER,,"https://click.palletsprojects.com/en/8.1.x/shell-completion/ This works for `zsh`: eval ""$(_SQLITE_UTILS_COMPLETE=zsh_source sqlite-utils)"" This will probably work for `bash`: eval ""$(_SQLITE_UTILS_COMPLETE=bash_source sqlite-utils)"" Need to add this to the installation docs here: https://sqlite-utils.datasette.io/en/stable/installation.html - along with the pattern for adding that to `.zshrc` or whatever.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/552/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1718607907,I_kwDOCGYnMM5mb-Aj,551,Make as many examples in the CLI docs as possible 
copy-and-pastable,9599,simonw,closed,0,,,,,6,2023-05-21T19:04:10Z,2023-05-21T21:04:04Z,2023-05-21T20:57:24Z,OWNER,,"e.g. in this section: https://sqlite-utils.datasette.io/en/stable/cli.html#running-queries-directly-against-csv-or-json The little copy button will also copy the `$ ` which breaks the examples when copied.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/551/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1718635018,PR_kwDOCGYnMM5Q9lY4,553,Reformatted CLI examples in docs,9599,simonw,closed,0,,,,,2,2023-05-21T20:44:34Z,2023-05-21T20:57:27Z,2023-05-21T20:57:23Z,OWNER,simonw/sqlite-utils/pulls/553,"Refs: - #551 ---- :books: Documentation preview :books:: https://sqlite-utils--553.org.readthedocs.build/en/553/ ",140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/553/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1718517882,I_kwDOCGYnMM5mboB6,545,Try out Trogon for a tui interface,9599,simonw,closed,0,,,,,6,2023-05-21T14:08:25Z,2023-05-21T19:33:13Z,2023-05-21T18:41:58Z,OWNER,,https://github.com/Textualize/trogon,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/545/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1718595700,I_kwDOCGYnMM5mb7B0,550,AttributeError: 'EntryPoints' object has no attribute 'get' for flake8 on Python 3.7,9599,simonw,closed,0,,,,,3,2023-05-21T18:24:39Z,2023-05-21T18:42:25Z,2023-05-21T18:41:58Z,OWNER,,"https://github.com/simonw/sqlite-utils/actions/runs/5039064797/jobs/9036965488 ``` Traceback (most recent call last): File ""/opt/hostedtoolcache/Python/3.7.16/x64/bin/flake8"", line 8, in sys.exit(main()) File ""/opt/hostedtoolcache/Python/3.7.16/x64/lib/python3.7/site-packages/flake8/main/cli.py"", line 22, in main app.run(argv) File ""/opt/hostedtoolcache/Python/3.7.16/x64/lib/python3.7/site-packages/flake8/main/application.py"", line 363, in run self._run(argv) File ""/opt/hostedtoolcache/Python/3.7.16/x64/lib/python3.7/site-packages/flake8/main/application.py"", line 350, in _run self.initialize(argv) File ""/opt/hostedtoolcache/Python/3.7.16/x64/lib/python3.7/site-packages/flake8/main/application.py"", line 330, in initialize self.find_plugins(config_finder) File ""/opt/hostedtoolcache/Python/3.7.16/x64/lib/python3.7/site-packages/flake8/main/application.py"", line 153, in find_plugins self.check_plugins = plugin_manager.Checkers(local_plugins.extension) File ""/opt/hostedtoolcache/Python/3.7.16/x64/lib/python3.7/site-packages/flake8/plugins/manager.py"", line 357, in __init__ self.namespace, local_plugins=local_plugins File ""/opt/hostedtoolcache/Python/3.7.16/x64/lib/python3.7/site-packages/flake8/plugins/manager.py"", line 238, in __init__ self._load_entrypoint_plugins() File ""/opt/hostedtoolcache/Python/3.7.16/x64/lib/python3.7/site-packages/flake8/plugins/manager.py"", line 254, in _load_entrypoint_plugins eps = importlib_metadata.entry_points().get(self.namespace, ()) AttributeError: 'EntryPoints' object has no attribute 'get' Error: Process completed with exit code 1. 
```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/550/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1718586377,PR_kwDOCGYnMM5Q9cAv,549,TUI powered by Trogon,9599,simonw,closed,0,,,,,3,2023-05-21T17:55:42Z,2023-05-21T18:42:00Z,2023-05-21T18:41:56Z,OWNER,simonw/sqlite-utils/pulls/549,"Refs: - #545 ---- :books: Documentation preview :books:: https://sqlite-utils--549.org.readthedocs.build/en/549/ ",140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/549/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1718576761,I_kwDOCGYnMM5mb2Z5,548,analyze-tables should validate provide --column names,9599,simonw,closed,0,,,,,1,2023-05-21T17:20:24Z,2023-05-21T17:35:52Z,2023-05-21T17:35:52Z,OWNER,,"Noticed this while testing: - #547 If you pass a non-existent column to `-c/--column` you don't get an error message.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/548/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1718572201,I_kwDOCGYnMM5mb1Sp,547,No need to show common values if everything is null,9599,simonw,closed,0,,,,,1,2023-05-21T17:05:07Z,2023-05-21T17:19:21Z,2023-05-21T17:19:21Z,OWNER,,"Noticed this: ``` % sqlite-utils analyze-tables content.db repos -c delete_branch_on_merge --common-limit 20 --no-least repos.delete_branch_on_merge: (1/1) Total rows: 158 Null rows: 158 Blank rows: 0 Distinct values: 0 Most common: 158: None ``` The `158: None` there is duplicate information considering we already know there are 158/158 null rows.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/547/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1718515590,I_kwDOCGYnMM5mbneG,544,New options for analyze-tables --common-limit --no-most and --no-least,9599,simonw,closed,0,,,,,2,2023-05-21T14:03:19Z,2023-05-21T17:03:06Z,2023-05-21T16:19:31Z,OWNER,,"The ""least common"" section is frequently uninteresting, especially for huge tables with a large number of repeated-once values. 
sqlite-utils analyze-tables content.db repos --common-limit 20 --no-least",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/544/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1718550688,PR_kwDOCGYnMM5Q9VH0,546,"Analyze tables options: --common-limit, --no-most, --no-least",9599,simonw,closed,0,,,,,2,2023-05-21T15:54:39Z,2023-05-21T16:19:30Z,2023-05-21T16:19:30Z,OWNER,simonw/sqlite-utils/pulls/546,"Refs #544 - [x] Documentation for CLI options - [x] Documentation for new Python API parameters: `most_common: bool` and `least_common: bool` - [x] Tests for CLI - [x] Tests for Python API",140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/546/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1690842199,PR_kwDOBm6k_c5PgNaA,2068,Bump sphinx from 6.1.3 to 7.0.0,49699333,dependabot[bot],closed,0,,,,,1,2023-05-01T13:58:46Z,2023-05-15T13:59:38Z,2023-05-15T13:59:36Z,CONTRIBUTOR,simonw/datasette/pulls/2068,"Bumps [sphinx](https://github.com/sphinx-doc/sphinx) from 6.1.3 to 7.0.0.
Release notes

Sourced from sphinx's releases.

v7.0.0

Changelog: https://www.sphinx-doc.org/en/master/changes.html

v7.0.0rc1

Changelog: https://www.sphinx-doc.org/en/master/changes.html

v6.2.1

Changelog: https://www.sphinx-doc.org/en/master/changes.html

v6.2.0

Changelog: https://www.sphinx-doc.org/en/master/changes.html

Changelog

Sourced from sphinx's changelog.

Release 7.0.0 (released Apr 29, 2023)

Incompatible changes

  • #11359: Remove long-deprecated aliases for MecabSplitter and DefaultSplitter in sphinx.search.ja.
  • #11360: Remove deprecated make_old_id functions in domain object description classes.
  • #11363: Remove the Setuptools integration (build_sphinx hook in setup.py).
  • #11364: Remove deprecated sphinx.ext.napoleon.iterators module.
  • #11365: Remove support for the jsdump format in sphinx.search.
  • #11366: Make locale a required argument to sphinx.util.i18n.format_date().
  • #11370: Remove deprecated sphinx.util.stemmer module.
  • #11371: Remove deprecated sphinx.pycode.ast.parse() function.
  • #11372: Remove deprecated sphinx.io.read_doc() function.
  • #11373: Removed deprecated sphinx.util.get_matching_files() function.
  • #11378: Remove deprecated sphinx.util.docutils.is_html5_writer_available() function.
  • #11379: Make the env argument to Builder subclasses required.
  • #11380: autosummary: Always emit grouped import exceptions.
  • #11381: Remove deprecated style key for HTML templates.
  • #11382: Remove deprecated sphinx.writers.latex.LaTeXTranslator.docclasses attribute.
  • #11383: Remove deprecated sphinx.builders.html.html5_ready and sphinx.builders.html.HTMLTranslator attributes.
  • #11385: Remove support for HTML 4 output.

Release 6.2.1 (released Apr 25, 2023)

Bugs fixed

  • #11355: Revert the default type of :confval:nitpick_ignore and :confval:nitpick_ignore_regex to list.

Release 6.2.0 (released Apr 23, 2023)

Dependencies

  • Require Docutils 0.18.1 or greater.

Incompatible changes

... (truncated)

Commits
  • d568b2f Bump to 7.0.0 final
  • ff79edf Remove jsdump references post removal
  • 1a5133a Bump to 7.0.0rc1 final
  • 5795fc7 Update sphinx.deprecation for Sphinx 7.0 (#11386)
  • 6202087 Add a note to CHANGES for PR 11385
  • ad47373 Remove HTML 4 support (#11385)
  • 3e3251d Remove HTMLTranslator and html5_ready from sphinx.builders.html (...
  • 77fd819 Remove deprecated LaTeXTranslator.docclasses attribute (#11382)
  • 4be56f3 Remove deprecated style key for HTML templates (#11381)
  • 49027a9 Autosummary: Always emit grouped ImportError exceptions (#11380)
  • Additional commits viewable in compare view

[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=sphinx&package-manager=pip&previous-version=6.1.3&new-version=7.0.0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores) Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`. [//]: # (dependabot-automerge-start) [//]: # (dependabot-automerge-end) ---
Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR: - `@dependabot rebase` will rebase this PR - `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it - `@dependabot merge` will merge this PR after your CI passes on it - `@dependabot squash and merge` will squash and merge this PR after your CI passes on it - `@dependabot cancel merge` will cancel a previously requested merge and block automerging - `@dependabot reopen` will reopen this PR if it is closed - `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually - `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
---- :books: Documentation preview :books:: https://datasette--2068.org.readthedocs.build/en/2068/ ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2068/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1578790070,I_kwDOCGYnMM5eGmy2,527,`Table.convert()` skips falsey values,167893,mcarpenter,closed,0,,,,,5,2023-02-10T00:00:52Z,2023-05-09T21:15:05Z,2023-05-08T21:03:24Z,CONTRIBUTOR,,"# Summary By design, `Table.convert()` does [not attempt](https://github.com/simonw/sqlite-utils/blob/fc221f9b62ed8624b1d2098e564f525c84497969/sqlite_utils/db.py#L2663) conversion of falsey values (`None`, `""""`, `0`, ...). This is surprising (directly contradicts the docstring) and `convert()` may quietly skip cells where the user assumed a conversion would take place. # Example Increment a column of integers by one ``` python from sqlite_utils import Database db = Database(memory=True) table = db['table'] col = 'x' table.insert_all([{col: 0}, {col:1}]) print(table.get(1)) # 0 print(table.get(2)) # 1 print() table.convert(col, lambda x: x+1) print(table.get(1)) # got 0, expected 1 ⚠⚠⚠ print(table.get(2)) # got 2, expected 2 ``` Another example might be, say, transforming cells containing empty string to `NULL`. # Discussion This was, I think, a pragmatic choice so that consumers can skip writing guard clauses for these falsey values (particularly from the CLI). But this surprising undocumented behavior can lead to incorrect data. I don't think this is a good trade-off between convenience and correctness. In the absence of this convenience users will either have to write guard clauses into their conversion expressions (or adapt the called function to do the same), so: ``` python fn(value) if value else value ``` instead of: ``` python fn(value) ``` This is more typing and sometimes I will forget, and there will be errors. (But they will be noisy errors, which is a good thing). Such a change will certainly inconvenience some existing consumers; there will be some breakage. But I think this is worth it to avoid quietly not converting some values by default, which can lead to quietly bad data. 
I have a PR that I will attach, please take a look and see what you think.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/527/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1702354223,I_kwDOBm6k_c5ld90v,2070,Mechanism for deploying a preview of a branch using Vercel,9599,simonw,closed,0,,,,,2,2023-05-09T16:21:45Z,2023-05-09T16:25:00Z,2023-05-09T16:24:31Z,OWNER,,"I prototyped that here: https://github.com/simonw/one-off-actions/blob/main/.github/workflows/deploy-datasette-branch-preview.yml It deployed the `json-extras-query` branch here: https://datasette-preview-json-extras-query.vercel.app/",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2070/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1665200812,PR_kwDOCGYnMM5OKveS,537,Support self-referencing FKs in `Table.create`,544011,numist,closed,0,,,,,3,2023-04-12T20:26:59Z,2023-05-08T22:45:33Z,2023-05-08T21:10:01Z,CONTRIBUTOR,simonw/sqlite-utils/pulls/537," ---- :books: Documentation preview :books:: https://sqlite-utils--537.org.readthedocs.build/en/537/ ",140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/537/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1701018909,I_kwDOCGYnMM5lY30d,543,Tests broken on Windows due to new convert() lambda names,9599,simonw,closed,0,,,,,0,2023-05-08T22:11:29Z,2023-05-08T22:19:04Z,2023-05-08T22:19:04Z,OWNER,,"https://github.com/simonw/sqlite-utils/actions/runs/4920084038/jobs/8788501314 ```python sql = 'update [example] set [dt] = lambda_-9223371942137158589([dt]);' ``` From: - #526",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/543/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1516644980,I_kwDOCGYnMM5aZip0,520,rows_from_file() raises confusing error if file-like object is not in binary mode,9599,simonw,closed,0,,,,,3,2023-01-02T19:00:14Z,2023-05-08T22:08:07Z,2023-05-08T22:08:07Z,OWNER,,"I got this error: ``` File ""/Users/simon/Dropbox/Development/openai-to-sqlite/openai_to_sqlite/cli.py"", line 27, in embeddings rows, _ = rows_from_file(input) ^^^^^^^^^^^^^^^^^^^^^ File ""/Users/simon/.local/share/virtualenvs/openai-to-sqlite-jt4obeb2/lib/python3.11/site-packages/sqlite_utils/utils.py"", line 305, in rows_from_file first_bytes = buffered.peek(2048).strip() ^^^^^^^^^^^^^^^^^^^ ``` From this code: ```python @cli.command() @click.argument( ""db_path"", type=click.Path(file_okay=True, dir_okay=False, allow_dash=False), ) @click.option( ""-i"", ""--input"", type=click.File(""r""), default=""-"", ) def embeddings(db_path, input): ""Store embeddings for one or more text documents"" click.echo(""Here is some output"") db = sqlite_utils.Database(db_path) rows, _ = rows_from_file(input) print(list(rows)) ``` The error went away when I changed it to `type=click.File(""rb"")`. 
This should either be called out in the documentation or `rows_from_file()` should be fixed to handle text-mode files in addition to binary files.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/520/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1279144769,I_kwDOCGYnMM5MPjNB,448,Reading rows from a file => AttributeError: '_io.StringIO' object has no attribute 'readinto',236907,mungewell,closed,0,,,,,5,2022-06-21T21:48:27Z,2023-05-08T22:01:00Z,2023-05-08T22:01:00Z,NONE,,"Attempting to run the example given here (without extra bracket ;-): https://sqlite-utils.datasette.io/en/stable/python-api.html#reading-rows-from-a-file ``` from sqlite_utils.utils import rows_from_file import io rows, format = rows_from_file(io.StringIO(""id,name\n1,Cleo"")) print(list(rows), format) # Outputs [{'id': '1', 'name': 'Cleo'}] Format.CSV ``` Gives error ``` >""c:\Program Files\Python37\python.exe"" test2.py Traceback (most recent call last): File ""test2.py"", line 4, in rows, format = rows_from_file(io.StringIO(""id,name\n1,Cleo"")) File ""C:\Users\swood\Downloads\sqlite-utils-main-20220621\sqlite-utils-main\sqlite_utils\utils.py"", line 300, in rows_from_file first_bytes = buffered.peek(2048).strip() AttributeError: '_io.StringIO' object has no attribute 'readinto' ``` I am running Python on Windows. ``` >""c:\Program Files\Python37\python.exe"" Python 3.7.4 (tags/v3.7.4:e09359112e, Jul 8 2019, 20:34:20) [MSC v.1916 64 bit (AMD64)] on win32 Type ""help"", ""copyright"", ""credits"" or ""license"" for more information. ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/448/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1575131737,I_kwDOCGYnMM5d4ppZ,525,Repeated calls to `Table.convert()` fail,167893,mcarpenter,closed,0,,,,,4,2023-02-07T22:40:47Z,2023-05-08T21:59:41Z,2023-05-08T21:54:02Z,CONTRIBUTOR,,"## Summary When using the API, repeated calls to `Table.convert()` do not work correctly since all conversions quietly use the callable (function, lambda) from the first call to `convert()` only. Subsequent invocations with different callables use the callable from the first invocation only. ## Example ```python from sqlite_utils import Database db = Database(memory=True) table = db['table'] col = 'x' table.insert_all([{col: 1}]) print(table.get(1)) table.convert(col, lambda x: x*2) print(table.get(1)) def zeroize(x): return 0 #zeroize = lambda x: 0 #zeroize.__name__ = 'zeroize' table.convert(col, zeroize) print(table.get(1)) ``` Output: ``` {'x': 1} {'x': 2} {'x': 4} ``` Expected: ``` {'x': 1} {'x': 2} {'x': 0} ``` ## Explanation This is some relevant [documentation](https://github.com/simonw/sqlite-utils/blob/1491b66dd7439dd87cd5cd4c4684f46eb3c5751b/docs/python-api.rst#registering-custom-sql-functions:~:text=By%20default%20registering%20a%20function%20with%20the%20same%20name%20and%20number%20of%20arguments%20will%20have%20no%20effect). 
* `Table.convert()` takes a `Callable` to perform data conversion on a column * The `Callable` is passed to `Database.register_function()` * `Database.register_function()` uses the callable's `__name__` attribute for registration * (Aside: all lambdas have a `__name__` of ``: I thought this was the problem, and it was close, but not quite) * However `convert()` first wraps the callable by local function [`convert_value()`](https://github.com/simonw/sqlite-utils/blob/fc221f9b62ed8624b1d2098e564f525c84497969/sqlite_utils/db.py#L2661) * Consequently `register_function()` sees name `convert_value` for all invocations from `convert()` * `register_function()` silently ignores registrations using the same name, retaining only the first such registration There's a mismatch between the comments and the code: https://github.com/simonw/sqlite-utils/blob/fc221f9b62ed8624b1d2098e564f525c84497969/sqlite_utils/db.py#L404 but actually the existing function is returned/used instead (as the ""registering custom sql functions"" doc I linked above says too). Seems like this can be rectified to match the comment? ## Suggested fix I think there are four things: 1. The call to `register_function()` from `convert()`should have an explicit `name=` parameter (to continue using `convert_value()` and the progress bar). 2. For functions, this name can be the real function name. (I understand the sqlite api needs a name, and it's nice if those are recognizable names where possible). For lambdas would `'lambda-{uuid}'` or similar be acceptable? 3. `register_function()` really should throw an error on repeated attempts to register a duplicate (function, arity)-pair. 4. A test? I haven't looked at the test framework here but seems this should be testable. ## See also - #458 ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/525/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1576990618,PR_kwDOCGYnMM5JkkED,526,Fix repeated calls to `Table.convert()`,167893,mcarpenter,closed,0,,,,,0,2023-02-09T00:14:49Z,2023-05-08T21:56:05Z,2023-05-08T21:53:58Z,CONTRIBUTOR,simonw/sqlite-utils/pulls/526,"Fixes #525. All tests pass. There's perhaps a better way to name lambdas? There could be a collision if a caller passes a function with name like `lambda_123456`. SQLite [documentation](https://www.sqlite.org/appfunc.html) is a little, ah, lite on function name specs. If there is a character that can be used in place of underscore in a SQLite function name that is not permitted in a Python function identifier then that could be a good way to prevent accidental collisions. (I tried dash, colon, dot, no joy). Otherwise, there is little chance of this happening and if it should happen the risk is mitigated by now throwing an exception in the case of a (name, arity) collision without `replace=True`. 
---- :books: Documentation preview :books:: https://sqlite-utils--526.org.readthedocs.build/en/526/ ",140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/526/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1465194249,I_kwDOCGYnMM5XVRcJ,514,upsert of new row with check constraints fails,193185,cldellow,closed,0,,,,,5,2022-11-26T16:12:23Z,2023-05-08T21:50:52Z,2023-05-08T21:50:51Z,NONE,,"(I originally opened this in https://github.com/simonw/datasette-insert/issues/20, but I see that that library depends on sqlite-utils) In the case of a new row, upsert first adds the row, specifying only its pkeys: https://github.com/simonw/sqlite-utils/blob/965ca0d5f5bffe06cc02cd7741344d1ddddf9d56/sqlite_utils/db.py#L2783-L2787 This means that a table with NON NULL (or other constraint) columns that aren't part of the pkey can't have new rows upserted.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/514/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1465194930,PR_kwDOCGYnMM5DvZxa,515,"upsert new rows with constraints, fixes #514",193185,cldellow,closed,0,,,,,1,2022-11-26T16:15:21Z,2023-05-08T21:27:11Z,2023-05-08T21:27:10Z,NONE,simonw/sqlite-utils/pulls/515,"This fixes #514 by making the initial insert for upserts include all columns, so that new rows can be added to tables with non-pkey columns that have constraints. (aside: I'm not a python programmer. `pip`? `pipenv`? `venv`? These are mystical incantations to me. The process to set up this repo for local development and testing was _so easy_. Thank you for the excellent contributing documentation!) ---- :books: Documentation preview :books:: https://sqlite-utils--515.org.readthedocs.build/en/515/ ",140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/515/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1044267332,I_kwDOCGYnMM4-PkFE,336,"sqlite-util tranform --column-order mangles columns of type ""timestamp""",536941,fgregg,closed,0,,,,,1,2021-11-04T01:15:38Z,2023-05-08T21:13:38Z,2023-05-08T21:13:38Z,CONTRIBUTOR,,"Reproducible code below: ```bash > echo 'create table bar (baz text, created_at timestamp default CURRENT_TIMESTAMP)' | sqlite3 foo.db > sqlite3 foo.db SQLite version 3.36.0 2021-06-18 18:36:39 Enter "".help"" for usage hints. sqlite> .schema bar CREATE TABLE bar (baz text, created_at timestamp default CURRENT_TIMESTAMP); sqlite> .exit > sqlite-utils transform foo.db bar --column-order baz sqlite3 foo.db SQLite version 3.36.0 2021-06-18 18:36:39 Enter "".help"" for usage hints. sqlite> .schema bar CREATE TABLE IF NOT EXISTS ""bar"" ( [baz] TEXT, [created_at] FLOAT DEFAULT 'CURRENT_TIMESTAMP' ); sqlite> .exit > sqlite-utils transform foo.db bar --column-order baz > sqlite3 foo.db SQLite version 3.36.0 2021-06-18 18:36:39 Enter "".help"" for usage hints. 
sqlite> .schema bar CREATE TABLE IF NOT EXISTS ""bar"" ( [baz] TEXT, [created_at] FLOAT DEFAULT '''CURRENT_TIMESTAMP''' ); ``` ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/336/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1432377191,I_kwDOCGYnMM5VYFdn,509,`sqlite-utils transform` breaks DEFAULT string values and STRFTIME(),2199875,kennysong,closed,0,,,,,0,2022-11-02T02:32:23Z,2023-05-08T21:13:38Z,2023-05-08T21:13:38Z,NONE,,"Very nice library! Our team found sqlite-utils through @simonw's [comment on the ""Simple declarative schema migration for SQLite"" article](https://news.ycombinator.com/item?id=31249823), and we were excited to use it, but unfortunately `sqlite-utils transform` seems to break our DB. Running `sqlite-utils transform` to modify a column mangles their DEFAULT values: - Default string values are wrapped in extra single quotes - Function expressions such as [`STRFTIME()`](https://www.sqlite.org/lang_datefunc.html) are turned into strings! ------ Here are steps to reproduce: **Original database** ``` $ sqlite3 test.db << EOF CREATE TABLE mytable ( col1 TEXT DEFAULT 'foo', col2 TEXT DEFAULT (STRFTIME('%Y-%m-%d %H:%M:%f', 'NOW')) ) EOF $ sqlite3 test.db ""SELECT sql FROM sqlite_master WHERE name = 'mytable';"" CREATE TABLE mytable ( col1 TEXT DEFAULT 'foo', col2 TEXT DEFAULT (STRFTIME('%Y-%m-%d %H:%M:%f', 'NOW')) ) ``` **Modified database after sqlite-utils** ``` $ sqlite3 test.db ""INSERT INTO mytable DEFAULT VALUES; SELECT * FROM mytable;"" foo|2022-11-02 02:26:58.038 $ sqlite-utils transform test.db mytable --rename col1 renamedcol1 $ sqlite3 test.db ""SELECT sql FROM sqlite_master WHERE name = 'mytable';"" CREATE TABLE ""mytable"" ( [renamedcol1] TEXT DEFAULT '''foo''', [col2] TEXT DEFAULT 'STRFTIME(''%Y-%m-%d %H:%M:%f'', ''NOW'')' ) $ sqlite3 test.db ""INSERT INTO mytable DEFAULT VALUES; SELECT * FROM mytable;"" foo|2022-11-02 02:26:58.038 'foo'|STRFTIME('%Y-%m-%d %H:%M:%f', 'NOW') ``` (Related: #336)",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/509/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1505568103,PR_kwDOCGYnMM5F609a,519,Fixes breaking DEFAULT values,13819005,rhoboro,closed,0,,,,,1,2022-12-21T01:27:52Z,2023-05-08T21:13:37Z,2023-05-08T21:13:37Z,CONTRIBUTOR,simonw/sqlite-utils/pulls/519,"Fixes #509, Fixes #336 Thanks for the great library! I fixed a bug that `sqlite-utils transform` breaks DEFAULT values. All tests already present passed with no changes, and I added some tests for this PR. In #509 case, fixed here. 
```shell $ sqlite3 test.db << EOF CREATE TABLE mytable ( col1 TEXT DEFAULT 'foo', col2 TEXT DEFAULT (STRFTIME('%Y-%m-%d %H:%M:%f', 'NOW')) ) EOF $ sqlite3 test.db ""SELECT sql FROM sqlite_master WHERE name = 'mytable';"" CREATE TABLE mytable ( col1 TEXT DEFAULT 'foo', col2 TEXT DEFAULT (STRFTIME('%Y-%m-%d %H:%M:%f', 'NOW')) ) $ sqlite3 test.db ""INSERT INTO mytable DEFAULT VALUES; SELECT * FROM mytable;"" foo|2022-12-21 01:15:39.669 $ sqlite-utils transform test.db mytable --rename col1 renamedcol1 $ sqlite3 test.db ""SELECT sql FROM sqlite_master WHERE name = 'mytable';"" CREATE TABLE ""mytable"" ( [renamedcol1] TEXT DEFAULT 'foo', [col2] TEXT DEFAULT (STRFTIME('%Y-%m-%d %H:%M:%f', 'NOW')) # ← Non-String Value ) $ sqlite3 test.db ""INSERT INTO mytable DEFAULT VALUES; SELECT * FROM mytable;"" foo|2022-12-21 01:15:39.669 foo|2022-12-21 01:15:56.432 ``` And #336 case also fixed. Special values are described [here](https://www.sqlite.org/lang_createtable.html). > 3.2. The DEFAULT clause > ... A default value may also be one of the special case-independent keywords CURRENT_TIME, CURRENT_DATE or CURRENT_TIMESTAMP. ```shell $ echo 'create table bar (baz text, created_at timestamp default CURRENT_TIMESTAMP)' | sqlite3 foo.db $ sqlite3 foo.db SQLite version 3.39.5 2022-10-14 20:58:05 Enter "".help"" for usage hints. sqlite> .schema bar CREATE TABLE bar (baz text, created_at timestamp default CURRENT_TIMESTAMP); sqlite> .exit $ sqlite-utils transform foo.db bar --column-order baz $ sqlite3 foo.db SQLite version 3.39.5 2022-10-14 20:58:05 Enter "".help"" for usage hints. sqlite> .schema bar CREATE TABLE IF NOT EXISTS ""bar"" ( [baz] TEXT, [created_at] FLOAT DEFAULT CURRENT_TIMESTAMP ); sqlite> .exit $ sqlite-utils transform foo.db bar --column-order baz $ sqlite3 foo.db SQLite version 3.39.5 2022-10-14 20:58:05 Enter "".help"" for usage hints. 
sqlite> .schema bar CREATE TABLE IF NOT EXISTS ""bar"" ( [baz] TEXT, [created_at] FLOAT DEFAULT CURRENT_TIMESTAMP # ← Non-String Value ); ``` ---- :books: Documentation preview :books:: https://sqlite-utils--519.org.readthedocs.build/en/519/ ",140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/519/reactions"", ""total_count"": 3, ""+1"": 3, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1578793661,PR_kwDOCGYnMM5Jqn1u,528,Enable `Table.convert()` on falsey values,167893,mcarpenter,closed,0,,,,,1,2023-02-10T00:04:09Z,2023-05-08T21:08:23Z,2023-05-08T21:08:23Z,CONTRIBUTOR,simonw/sqlite-utils/pulls/528,"Fixes #527 ---- :books: Documentation preview :books:: https://sqlite-utils--528.org.readthedocs.build/en/528/ ",140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/528/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1620254998,I_kwDOCGYnMM5gkyEW,532,Show more information when JSON can't be imported with sqlite-utils insert,83080728,voltagex,closed,0,,,,,2,2023-03-12T06:41:44Z,2023-05-08T20:32:16Z,2023-05-08T20:32:02Z,NONE,,"I am currently trying to import the [JSON export of my data from Discord](https://support.discord.com/hc/en-us/articles/360004027692-Requesting-a-Copy-of-your-Data), specifically `activity/reporting/events-*.json` ``` sqlite-utils.exe insert test.db reporting events-2023-00000-of-00001.json [###################################-] 99% 00:00:00 Error: Invalid JSON - use --csv for CSV or --tsv for TSV files ``` Please show more information as to *why* this is invalid, if possible. I am using version 3.30 with Python 3.10 on Windows 11.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/532/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1695428235,I_kwDOCGYnMM5lDi6L,538,`table.upsert_all` fails to write rows when `not_null` is present,1231935,xavdid,closed,0,,,,,9,2023-05-04T07:30:38Z,2023-05-08T20:06:35Z,2023-05-08T19:27:02Z,NONE,,"I found an odd bug today, where calls to `table.upsert_all` don't write rows if you include the `not_null` kwarg. ## Repro Example ```py from sqlite_utils import Database db = Database(""upsert-test.db"") db[""comments""].upsert_all( [{""id"": 1, ""name"": ""david""}], pk=""id"", not_null=[""name""], ) assert list(db[""comments""].rows) # err! ``` The schema is correctly created: ```sql CREATE TABLE [comments] ( [id] INTEGER PRIMARY KEY, [name] TEXT NOT NULL ) ``` But no rows are created. Removing either the `not_null` kwargs works as expected, as does an `insert_all` call. 
## Version Info - Python: `3.11.0` - sqlite-utils: `3.30` - sqlite: `3.39.5 2022-10-14`",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/538/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1622640374,I_kwDOCGYnMM5gt4b2,534, ResourceWarning: unclosed file,1244826,djhenderson,closed,0,,,,,1,2023-03-14T03:02:18Z,2023-05-08T19:56:29Z,2023-05-08T19:56:29Z,NONE,,"Issuing either ``` py -Wdefault -m sqlite_utils insert dogs.db dogs dogs0.csv --csv [#############-----------------------] 36% [####################################] 100%C:\Users\Doug\AppData\Local\Programs\Python\Python311\Lib\site-packages\sqlite_utils\cli.py:1187: ResourceWarning: unclosed file <_io.TextIOWrapper name='dogs0.csv' encoding='utf-8-sig'> insert_upsert_implementation( ResourceWarning: Enable tracemalloc to get the object allocation traceback ``` or ``` set pythonwarnings=default sqlite-utils insert dogs.db dogs dogs0.csv --csv [#############-----------------------] 36% [####################################] 100%C:\Users\Doug\AppData\Local\Programs\Python\Python311\Lib\site-packages\sqlite_utils\cli.py:1187: ResourceWarning: unclosed file <_io.TextIOWrapper name='dogs0.csv' encoding='utf-8-sig'> insert_upsert_implementation( ResourceWarning: Enable tracemalloc to get the object allocation traceback ``` exhibits a ResourceWarning indicating that the CSV file being loaded is not closed. sqlite-utils --version sqlite-utils, version 3.30 py --version Python 3.11.2 Windows Version 10.0.19045 Build 19045 SQLite version 3.41.0 2023-02-21 18:09:37 ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/534/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1699184583,I_kwDOCGYnMM5lR3_H,540,sphinx.builders.linkcheck build error,9599,simonw,closed,0,,,,,4,2023-05-07T18:37:09Z,2023-05-08T04:56:13Z,2023-05-07T18:42:36Z,OWNER,,"https://readthedocs.org/projects/sqlite-utils/builds/20512693/ ``` Running Sphinx v6.2.1 Traceback (most recent call last): File ""/home/docs/checkouts/readthedocs.org/user_builds/sqlite-utils/envs/latest/lib/python3.8/site-packages/sphinx/registry.py"", line 442, in load_extension mod = import_module(extname) File ""/home/docs/checkouts/readthedocs.org/user_builds/sqlite-utils/envs/latest/lib/python3.8/importlib/__init__.py"", line 127, in import_module return _bootstrap._gcd_import(name[level:], package, level) File """", line 1014, in _gcd_import File """", line 991, in _find_and_load File """", line 975, in _find_and_load_unlocked File """", line 671, in _load_unlocked File """", line 783, in exec_module File """", line 219, in _call_with_frames_removed File ""/home/docs/checkouts/readthedocs.org/user_builds/sqlite-utils/envs/latest/lib/python3.8/site-packages/sphinx/builders/linkcheck.py"", line 20, in from requests import Response File ""/home/docs/checkouts/readthedocs.org/user_builds/sqlite-utils/envs/latest/lib/python3.8/site-packages/requests/__init__.py"", line 43, in import urllib3 File ""/home/docs/checkouts/readthedocs.org/user_builds/sqlite-utils/envs/latest/lib/python3.8/site-packages/urllib3/__init__.py"", line 38, in raise ImportError( ImportError: urllib3 v2.0 only supports OpenSSL 1.1.1+, currently the 'ssl' module is compiled with OpenSSL 1.0.2n 7 Dec 2017. 
See: https://github.com/urllib3/urllib3/issues/2168 The above exception was the direct cause of the following exception: Traceback (most recent call last): File ""/home/docs/checkouts/readthedocs.org/user_builds/sqlite-utils/envs/latest/lib/python3.8/site-packages/sphinx/cmd/build.py"", line 280, in build_main app = Sphinx(args.sourcedir, args.confdir, args.outputdir, File ""/home/docs/checkouts/readthedocs.org/user_builds/sqlite-utils/envs/latest/lib/python3.8/site-packages/sphinx/application.py"", line 225, in __init__ self.setup_extension(extension) File ""/home/docs/checkouts/readthedocs.org/user_builds/sqlite-utils/envs/latest/lib/python3.8/site-packages/sphinx/application.py"", line 404, in setup_extension self.registry.load_extension(self, extname) File ""/home/docs/checkouts/readthedocs.org/user_builds/sqlite-utils/envs/latest/lib/python3.8/site-packages/sphinx/registry.py"", line 445, in load_extension raise ExtensionError(__('Could not import extension %s') % extname, sphinx.errors.ExtensionError: Could not import extension sphinx.builders.linkcheck (exception: urllib3 v2.0 only supports OpenSSL 1.1.1+, currently the 'ssl' module is compiled with OpenSSL 1.0.2n 7 Dec 2017. See: https://github.com/urllib3/urllib3/issues/2168) Extension error: Could not import extension sphinx.builders.linkcheck (exception: urllib3 v2.0 only supports OpenSSL 1.1.1+, currently the 'ssl' module is compiled with OpenSSL 1.0.2n 7 Dec 2017. See: https://github.com/urllib3/urllib3/issues/2168) ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/540/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1699174055,I_kwDOCGYnMM5lR1an,539,"`--raw-lines` option, like `--raw` for multiple lines",9599,simonw,closed,0,,,,,4,2023-05-07T18:07:46Z,2023-05-07T18:43:24Z,2023-05-07T18:26:18Z,OWNER,,I wanted to output newline-separated output of the first column of every row in the results - like `--row` but for more than one line.,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/539/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1683229834,PR_kwDOBm6k_c5PG0wF,2064,Bump sphinx from 6.1.3 to 6.2.1,49699333,dependabot[bot],closed,0,,,,,1,2023-04-25T13:57:49Z,2023-05-01T13:58:53Z,2023-05-01T13:58:52Z,CONTRIBUTOR,simonw/datasette/pulls/2064,"Bumps [sphinx](https://github.com/sphinx-doc/sphinx) from 6.1.3 to 6.2.1.
Release notes

Sourced from sphinx's releases.

v6.2.1

Changelog: https://www.sphinx-doc.org/en/master/changes.html

v6.2.0

Changelog: https://www.sphinx-doc.org/en/master/changes.html

Changelog

Sourced from sphinx's changelog.

Release 6.2.1 (released Apr 25, 2023)

Bugs fixed

  • #11355: Revert the default type of :confval:nitpick_ignore and :confval:nitpick_ignore_regex to list.

Release 6.2.0 (released Apr 23, 2023)

Dependencies

  • Require Docutils 0.18.1 or greater.

Incompatible changes

  • LaTeX: removal of some internal TeX \dimen registers (not previously publicly documented) as per 5.1.0 code comments in sphinx.sty: \sphinxverbatimsep, \sphinxverbatimborder, \sphinxshadowsep, \sphinxshadowsize, and \sphinxshadowrule. (refs: #11105)
  • Remove .egg support from pycode ModuleAnalyser; Python eggs are a now-obsolete binary distribution format
  • #11089: Remove deprecated code in sphinx.builders.linkcheck. Patch by Daniel Eades
  • Remove internal-only sphinx.locale.setlocale

Deprecated

  • #11247: Deprecate the legacy intersphinx_mapping format
  • sphinx.util.osutil.cd is deprecated in favour of contextlib.chdir.

Features added

  • #11277: :rst:dir:autoproperty allows the return type to be specified as a type comment (e.g., # type: () -> int). Patch by Bénédikt Tran
  • #10811: Autosummary: extend __all__ to imported members for template rendering when option autosummary_ignore_module_all is set to False. Patch by Clement Pinard
  • #11147: Add a content_offset parameter to nested_parse_with_titles(), allowing for correct line numbers during nested parsing. Patch by Jeremy Maitin-Shepard
  • Update to Unicode CLDR 42
  • Add a --jobs synonym for -j. Patch by Hugo van Kemenade
  • LaTeX: a command \sphinxbox for styling text elements with a (possibly

... (truncated)

Commits
  • ec993dd Bump to 6.2.1 final
  • d2aa91f Revert the default type of nitpick_ignore[_regex] to list
  • 60d8fa1 Bump version
  • 70102ac Bump to 6.2.0 final
  • 4e27a5f Remove unneeded JavaScript from sphinx13 theme
  • bffb547 Note correct deprecation version for sphinx.util.osutil.cd
  • 59de8d5 Revert "Support and prefer .jinja to _t for static templates (#11165)...
  • aee3c0a Partially revert "Disable localisation when SOURCE_DATE_EPOCH is set (#10949)...
  • 186d596 Use overwrite_file context manager in test_ext_autodoc_configs (#11320)
  • 77483f2 Add missing test decorator for test_util_inspect (#11321)
  • Additional commits viewable in compare view

[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=sphinx&package-manager=pip&previous-version=6.1.3&new-version=6.2.1)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores) Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`. [//]: # (dependabot-automerge-start) [//]: # (dependabot-automerge-end) ---
---- :books: Documentation preview :books:: https://datasette--2064.org.readthedocs.build/en/2064/ ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2064/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1686033652,I_kwDOBm6k_c5kftT0,2065,Datasette cannot be installed with Rye,9599,simonw,closed,0,,,,,4,2023-04-27T03:35:42Z,2023-04-27T05:09:36Z,2023-04-27T05:09:36Z,OWNER,,"https://github.com/mitsuhiko/rye I tried this: rye install datasette But now: ``` % ~/.rye/shims/datasette Traceback (most recent call last): File ""/Users/simon/.rye/shims/datasette"", line 5, in from datasette.cli import cli File ""/Users/simon/.rye/tools/datasette/lib/python3.11/site-packages/datasette/cli.py"", line 17, in from .app import ( File ""/Users/simon/.rye/tools/datasette/lib/python3.11/site-packages/datasette/app.py"", line 14, in import pkg_resources ModuleNotFoundError: No module named 'pkg_resources' ``` I think that's because `setuptools` is not included in Rye.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2065/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1686042269,I_kwDOBm6k_c5kfvad,2066,Failing test: httpx.InvalidURL: URL too long,9599,simonw,closed,0,,,,,10,2023-04-27T03:48:47Z,2023-04-27T04:27:50Z,2023-04-27T04:27:50Z,OWNER,,"https://github.com/simonw/datasette/actions/runs/4815723640/jobs/8574667731 ``` def urlparse(url: str = """", **kwargs: typing.Optional[str]) -> ParseResult: # Initial basic checks on allowable URLs. # --------------------------------------- # Hard limit the maximum allowable URL length. if len(url) > MAX_URL_LENGTH: > raise InvalidURL(""URL too long"") E httpx.InvalidURL: URL too long /opt/hostedtoolcache/Python/3.7.16/x64/lib/python3.7/site-packages/httpx/_urlparse.py:155: InvalidURL =========================== short test summary info ============================ FAILED tests/test_csv.py::test_max_csv_mb - httpx.InvalidURL: URL too long ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2066/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1681339696,PR_kwDOBm6k_c5PAcGt,2063,Bump sphinx from 6.1.3 to 6.2.0,49699333,dependabot[bot],closed,0,,,,,1,2023-04-24T13:58:21Z,2023-04-25T13:57:55Z,2023-04-25T13:57:53Z,CONTRIBUTOR,simonw/datasette/pulls/2063,"Bumps [sphinx](https://github.com/sphinx-doc/sphinx) from 6.1.3 to 6.2.0.
Release notes

Sourced from sphinx's releases.

v6.2.0

Changelog: https://www.sphinx-doc.org/en/master/changes.html

Changelog

Sourced from sphinx's changelog.

Release 6.2.0 (released Apr 23, 2023)

Dependencies

  • Require Docutils 0.18.1 or greater.

Incompatible changes

  • LaTeX: removal of some internal TeX \dimen registers (not previously publicly documented) as per 5.1.0 code comments in sphinx.sty: \sphinxverbatimsep, \sphinxverbatimborder, \sphinxshadowsep, \sphinxshadowsize, and \sphinxshadowrule. (refs: #11105)
  • Remove .egg support from pycode ModuleAnalyser; Python eggs are a now-obsolete binary distribution format
  • #11089: Remove deprecated code in sphinx.builders.linkcheck. Patch by Daniel Eades
  • Remove internal-only sphinx.locale.setlocale

Deprecated

  • #11247: Deprecate the legacy intersphinx_mapping format
  • sphinx.util.osutil.cd is deprecated in favour of contextlib.chdir.

Features added

  • #11277: :rst:dir:autoproperty allows the return type to be specified as a type comment (e.g., # type: () -> int). Patch by Bénédikt Tran
  • #10811: Autosummary: extend __all__ to imported members for template rendering when option autosummary_ignore_module_all is set to False. Patch by Clement Pinard
  • #11147: Add a content_offset parameter to nested_parse_with_titles(), allowing for correct line numbers during nested parsing. Patch by Jeremy Maitin-Shepard
  • Update to Unicode CLDR 42
  • Add a --jobs synonym for -j. Patch by Hugo van Kemenade
  • LaTeX: a command \sphinxbox for styling text elements with a (possibly rounded) box, optional background color and shadow, has been added. See :ref:sphinxbox. (refs: #11224)
  • LaTeX: add \sphinxstylenotetitle, ..., \sphinxstylewarningtitle, ..., for an extra layer of mark-up freeing up \sphinxstrong for other uses. See :ref:latex-macros. (refs: #11267)
  • LaTeX: :dudir:note, :dudir:hint, :dudir:important and :dudir:tip can now each be styled as the other admonitions, i.e. possibly with a background color, individual border widths and paddings, possibly rounded corners, and optional shadow. See :ref:additionalcss. (refs: #11234)

... (truncated)

Commits
  • e7d4c36 Bump to 6.2.0 final
  • 4e27a5f Remove unneeded JavaScript from sphinx13 theme
  • bffb547 Note correct deprecation version for sphinx.util.osutil.cd
  • 59de8d5 Revert "Support and prefer .jinja to _t for static templates (#11165)...
  • aee3c0a Partially revert "Disable localisation when SOURCE_DATE_EPOCH is set (#10949)...
  • 186d596 Use overwrite_file context manager in test_ext_autodoc_configs (#11320)
  • 77483f2 Add missing test decorator for test_util_inspect (#11321)
  • d8f15c7 Increase timeout threshold for linkcheck tests (#11326)
  • b430e05 Create a 'search field' component for themes (#11045)
  • e2f66ce Update CHANGES for PR #11333
  • Additional commits viewable in compare view

[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=sphinx&package-manager=pip&previous-version=6.1.3&new-version=6.2.0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores) Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`. [//]: # (dependabot-automerge-start) [//]: # (dependabot-automerge-end) ---
---- :books: Documentation preview :books:: https://datasette--2063.org.readthedocs.build/en/2063/ ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2063/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1203842656,I_kwDOCGYnMM5HwS5g,425,`sqlite3.NotSupportedError`: deterministic=True requires SQLite 3.8.3 or higher,9599,simonw,closed,0,,,,,5,2022-04-13T22:16:53Z,2023-04-15T20:14:58Z,2022-04-13T22:48:57Z,OWNER,,"Got this error while investigating: - #421 Even though I was using the `LD_PRELOAD` trick from https://til.simonwillison.net/sqlite/ld-preload to use a newer version of SQLite. _Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/421#issuecomment-1098531354_",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/425/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1659525418,PR_kwDOCGYnMM5N35VZ,536,Add paths for homebrew on Apple silicon,25778,eyeseast,closed,0,,,,,1,2023-04-08T13:34:21Z,2023-04-13T01:44:43Z,2023-04-13T01:44:43Z,CONTRIBUTOR,simonw/sqlite-utils/pulls/536,"Does what it says and nothing else. This is the same set of paths as Datasette uses. ---- :books: Documentation preview :books:: https://sqlite-utils--536.org.readthedocs.build/en/536/ ",140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/536/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1661617056,I_kwDODD6af85jCkOg,15,ambiguous column name: createdAt - on checkin_details view,9599,simonw,closed,0,,,,,0,2023-04-11T01:07:47Z,2023-04-11T03:16:37Z,2023-04-11T03:16:37Z,MEMBER,,"It looks like Swarm changed their schema and now both `venues` and `checkins` have `createdAt` fields. Which breaks this view: https://github.com/dogsheep/swarm-to-sqlite/blob/719b6e96a016d0ca8b316d3bed9c2a7a0cb499ee/swarm_to_sqlite/utils.py#L171-L188",205429375,swarm-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/swarm-to-sqlite/issues/15/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1620164673,PR_kwDOCGYnMM5L08O8,531,Add paths for homebrew on Apple silicon,25778,eyeseast,closed,0,,,,,4,2023-03-11T22:27:52Z,2023-04-09T01:49:44Z,2023-04-09T01:49:43Z,CONTRIBUTOR,simonw/sqlite-utils/pulls/531,"This also passes in the extension path when specified in GIS methods. Wherever we know an extension path, we use `db.init_spatialite(find_spatialite() or load_extension)`. ---- :books: Documentation preview :books:: https://sqlite-utils--531.org.readthedocs.build/en/531/ ",140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/531/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1645098678,PR_kwDOBm6k_c5NIQri,2047,Bump black from 22.12.0 to 23.3.0,49699333,dependabot[bot],closed,0,,,,,0,2023-03-29T06:09:06Z,2023-03-29T06:12:21Z,2023-03-29T06:12:05Z,CONTRIBUTOR,simonw/datasette/pulls/2047,"Bumps [black](https://github.com/psf/black) from 22.12.0 to 23.3.0.
Release notes

Sourced from black's releases.

23.3.0

Highlights

This release fixes a longstanding confusing behavior in Black's GitHub action, where the version of the action did not determine the version of Black being run (issue #3382). In addition, there is a small bug fix around imports and a number of improvements to the preview style.

Please try out the preview style with black --preview and tell us your feedback. All changes in the preview style are expected to become part of Black's stable style in January 2024.

Stable style

  • Import lines with # fmt: skip and # fmt: off no longer have an extra blank line added when they are right after another import line (#3610)

Preview style

  • Add trailing commas to collection literals even if there's a comment after the last entry (#3393)
  • async def, async for, and async with statements are now formatted consistently compared to their non-async version. (#3609)
  • with statements that contain two context managers will be consistently wrapped in parentheses (#3589)
  • Let string splitters respect East Asian Width (#3445)
  • Now long string literals can be split after East Asian commas and periods ( U+3001 IDEOGRAPHIC COMMA, U+3002 IDEOGRAPHIC FULL STOP, & U+FF0C FULLWIDTH COMMA) besides before spaces (#3445)
  • For stubs, enforce one blank line after a nested class with a body other than just ... (#3564)
  • Improve handling of multiline strings by changing line split behavior (#1879)

Parser

  • Added support for formatting files with invalid type comments (#3594)

Integrations

  • Update GitHub Action to use the version of Black equivalent to action's version if version input is not specified (#3543)
  • Fix missing Python binary path in autoload script for vim (#3508)

Documentation

  • Document that only the most recent release is supported for security issues; vulnerabilities should be reported through Tidelift (#3612)

... (truncated)

Changelog

Sourced from black's changelog.

23.3.0

Highlights

This release fixes a longstanding confusing behavior in Black's GitHub action, where the version of the action did not determine the version of Black being run (issue #3382). In addition, there is a small bug fix around imports and a number of improvements to the preview style.

Please try out the preview style with black --preview and tell us your feedback. All changes in the preview style are expected to become part of Black's stable style in January 2024.

Stable style

  • Import lines with # fmt: skip and # fmt: off no longer have an extra blank line added when they are right after another import line (#3610)

Preview style

  • Add trailing commas to collection literals even if there's a comment after the last entry (#3393)
  • async def, async for, and async with statements are now formatted consistently compared to their non-async version. (#3609)
  • with statements that contain two context managers will be consistently wrapped in parentheses (#3589)
  • Let string splitters respect East Asian Width (#3445)
  • Now long string literals can be split after East Asian commas and periods ( U+3001 IDEOGRAPHIC COMMA, U+3002 IDEOGRAPHIC FULL STOP, & U+FF0C FULLWIDTH COMMA) besides before spaces (#3445)
  • For stubs, enforce one blank line after a nested class with a body other than just ... (#3564)
  • Improve handling of multiline strings by changing line split behavior (#1879)

Parser

  • Added support for formatting files with invalid type comments (#3594)

Integrations

  • Update GitHub Action to use the version of Black equivalent to action's version if version input is not specified (#3543)
  • Fix missing Python binary path in autoload script for vim (#3508)

Documentation

  • Document that only the most recent release is supported for security issues; vulnerabilities should be reported through Tidelift (#3612)

... (truncated)

Commits

[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=black&package-manager=pip&previous-version=22.12.0&new-version=23.3.0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores) Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`. [//]: # (dependabot-automerge-start) [//]: # (dependabot-automerge-end) ---
---- :books: Documentation preview :books:: https://datasette--2047.org.readthedocs.build/en/2047/ ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2047/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1534904478,PR_kwDOBm6k_c5HdwRg,1992,Bump blacken-docs from 1.12.1 to 1.13.0,49699333,dependabot[bot],closed,0,,,,,1,2023-01-16T13:05:05Z,2023-03-29T06:11:35Z,2023-03-29T06:11:34Z,CONTRIBUTOR,simonw/datasette/pulls/1992,"Bumps [blacken-docs](https://github.com/asottile/blacken-docs) from 1.12.1 to 1.13.0.
Changelog

Sourced from blacken-docs's changelog.

1.13.0 (2023-01-16)

  • Note Adam Johnson is new maintainer.

  • Require Black 22.1.0+.

  • Add --rst-literal-blocks option, to also format text in reStructuredText literal blocks, starting with ::. Sphinx highlights these with the project’s default language, which defaults to Python.

Commits

[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=blacken-docs&package-manager=pip&previous-version=1.12.1&new-version=1.13.0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores) Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`. [//]: # (dependabot-automerge-start) [//]: # (dependabot-automerge-end) ---
Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR: - `@dependabot rebase` will rebase this PR - `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it - `@dependabot merge` will merge this PR after your CI passes on it - `@dependabot squash and merge` will squash and merge this PR after your CI passes on it - `@dependabot cancel merge` will cancel a previously requested merge and block automerging - `@dependabot reopen` will reopen this PR if it is closed - `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually - `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
---- :books: Documentation preview :books:: https://datasette--1992.org.readthedocs.build/en/1992/ ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1992/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1528995601,PR_kwDOBm6k_c5HJ55o,1986,Bump sphinx from 6.1.2 to 6.1.3,49699333,dependabot[bot],closed,0,,,,,0,2023-01-11T13:02:36Z,2023-03-29T06:09:50Z,2023-03-29T06:09:49Z,CONTRIBUTOR,simonw/datasette/pulls/1986,"Bumps [sphinx](https://github.com/sphinx-doc/sphinx) from 6.1.2 to 6.1.3.
Release notes

Sourced from sphinx's releases.

v6.1.3

Changelog: https://www.sphinx-doc.org/en/master/changes.html

Changelog

Sourced from sphinx's changelog.

Release 6.1.3 (released Jan 10, 2023)

Bugs fixed

  • #11116: Reverted to previous Sphinx 5 node copying method
  • #11117: Reverted changes to parallel image processing from Sphinx 6.1.0
  • #11119: Suppress ValueError in the linkcheck builder
Commits
  • 776d01e Bump to 6.1.3 final
  • a2e922a CHANGES for Sphinx 6.1.3
  • 31162a9 Handle exceptions for get_node_source and get_node_line
  • dcb4429 Restore Sphinx 5 nodes.Element copying behaviour
  • 2a7c40d Undo parallel image changes
  • 7841d3d Ignore more checks in Ruff 0.0.214
  • ddbc5b5 Bump version
  • See full diff in compare view

[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=sphinx&package-manager=pip&previous-version=6.1.2&new-version=6.1.3)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores) Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`. [//]: # (dependabot-automerge-start) [//]: # (dependabot-automerge-end) ---
Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR: - `@dependabot rebase` will rebase this PR - `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it - `@dependabot merge` will merge this PR after your CI passes on it - `@dependabot squash and merge` will squash and merge this PR after your CI passes on it - `@dependabot cancel merge` will cancel a previously requested merge and block automerging - `@dependabot reopen` will reopen this PR if it is closed - `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually - `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
---- :books: Documentation preview :books:: https://datasette--1986.org.readthedocs.build/en/1986/ ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1986/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1566081801,PR_kwDOBm6k_c5JAcGy,2014,Bump black from 22.12.0 to 23.1.0,49699333,dependabot[bot],closed,0,,,,,2,2023-02-01T13:06:16Z,2023-03-29T06:09:14Z,2023-03-29T06:09:12Z,CONTRIBUTOR,simonw/datasette/pulls/2014,"Bumps [black](https://github.com/psf/black) from 22.12.0 to 23.1.0.
Release notes

Sourced from black's releases.

23.1.0

Highlights

This is the first release of 2023, and following our stability policy, it comes with a number of improvements to our stable style, notably improvements to empty line handling and the removal of redundant parentheses in several contexts.

There are also many changes to the preview style; try out black --preview and give us feedback to help us set the stable style for next year.

In addition to style changes, Black now automatically infers the supported Python versions from your pyproject.toml file, removing the need to set Black's target versions separately.

Stable style

  • Introduce the 2023 stable style, which incorporates most aspects of last year's preview style (#3418). Specific changes:
    • Enforce empty lines before classes and functions with sticky leading comments (#3302) (22.12.0)
    • Reformat empty and whitespace-only files as either an empty file (if no newline is present) or as a single newline character (if a newline is present) (#3348) (22.12.0)
    • Correctly handle trailing commas that are inside a line's leading non-nested parens (#3370) (22.12.0)
    • --skip-string-normalization / -S now prevents docstring prefixes from being normalized as expected (#3168) (since 22.8.0)
    • When using --skip-magic-trailing-comma or -C, trailing commas are stripped from subscript expressions with more than 1 element (#3209) (22.8.0)
    • Fix a string merging/split issue when a comment is present in the middle of implicitly concatenated strings on its own line (#3227) (22.8.0)
    • Docstring quotes are no longer moved if it would violate the line length limit (#3044, #3430) (22.6.0)
    • Parentheses around return annotations are now managed (#2990) (22.6.0)
    • Remove unnecessary parentheses around awaited objects (#2991) (22.6.0)
    • Remove unnecessary parentheses in with statements (#2926) (22.6.0)
    • Remove trailing newlines after code block open (#3035) (22.6.0)
    • Code cell separators #%% are now standardised to # %% (#2919) (22.3.0)
    • Remove unnecessary parentheses from except statements (#2939) (22.3.0)
    • Remove unnecessary parentheses from tuple unpacking in for loops (#2945) (22.3.0)
    • Avoid magic-trailing-comma in single-element subscripts (#2942) (22.3.0)
  • Fix a crash when a colon line is marked between # fmt: off and # fmt: on (#3439)
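
A couple of the 2023 stable-style items above (#2990 and #2991) are easier to see with a small sketch; the coroutine below is invented for illustration:

```python
import asyncio


# Hypothetical input, as an author might write it:
#
#     async def fetch_count() -> (int):
#         return await (asyncio.sleep(0, result=1))
#
# After formatting with the 23.1.0 stable style, the redundant parentheses around
# the return annotation (#2990) and the awaited object (#2991) are removed:
async def fetch_count() -> int:
    return await asyncio.sleep(0, result=1)


print(asyncio.run(fetch_count()))
```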

Preview style

  • Format hex codes in unicode escape sequences in string literals (#2916)
  • Add parentheses around if-else expressions (#2278)
  • Improve performance on large expressions that contain many strings (#3467)
  • Fix a crash in preview style with assert + parenthesized string (#3415)
  • Fix crashes in preview style with walrus operators used in function return annotations and except clauses (#3423)
  • Fix a crash in preview advanced string processing where mixed implicitly concatenated regular and f-strings start with an empty span (#3463)
  • Fix a crash in preview advanced string processing where a standalone comment is placed before a dict's value (#3469)
  • Fix an issue where extra empty lines are added when a decorator has # fmt: skip applied or there is a standalone comment between decorators (#3470)
  • Do not put the closing quotes in a docstring on a separate line, even if the line is too long (#3430)
  • Long values in dict literals are now wrapped in parentheses; correspondingly unnecessary parentheses around short values in dict literals are now removed; long string lambda values are now wrapped in parentheses (#3440)
  • Fix two crashes in preview style involving edge cases with docstrings (#3451)
  • Exclude string type annotations from improved string processing; fix crash when the return type annotation is stringified and spans across multiple lines (#3462)
    • Wrap multiple context managers in parentheses when targeting Python 3.9+ (#3489; see the sketch after this list)
  • Fix several crashes in preview style with walrus operators used in with statements or tuples (#3473)
  • Fix an invalid quote escaping bug in f-string expressions where it produced invalid code. Implicitly concatenated f-strings with different quotes can now be merged or quote-normalized by changing the quotes used in expressions. (#3509)
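
The parenthesised context-manager item above (#3489) looks like this in practice; this is a sketch using throwaway temporary files, and it assumes a Python 3.9+ target (for example `black --preview --target-version py39`):

```python
import tempfile

# Hypothetical input: two context managers on one over-long line, which older
# styles could only break with backslashes or by nesting the with statements.
#
#     with tempfile.TemporaryFile() as source, tempfile.TemporaryFile() as destination:
#         destination.write(source.read())
#
# With the preview style and a py39+ target, Black wraps them in parentheses instead:
with (
    tempfile.TemporaryFile() as source,
    tempfile.TemporaryFile() as destination,
):
    destination.write(source.read())
```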

... (truncated)

Changelog

Sourced from black's changelog.

23.1.0

Highlights

This is the first release of 2023, and following our stability policy, it comes with a number of improvements to our stable style, including improvements to empty line handling, removal of redundant parentheses in several contexts, and output that highlights implicitly concatenated strings better.

There are also many changes to the preview style; try out black --preview and give us feedback to help us set the stable style for next year.

In addition to style changes, Black now automatically infers the supported Python versions from your pyproject.toml file, removing the need to set Black's target versions separately.

Stable style

  • Introduce the 2023 stable style, which incorporates most aspects of last year's preview style (#3418). Specific changes:
    • Enforce empty lines before classes and functions with sticky leading comments (#3302) (22.12.0)
    • Reformat empty and whitespace-only files as either an empty file (if no newline is present) or as a single newline character (if a newline is present) (#3348) (22.12.0)
    • Implicitly concatenated strings used as function args are now wrapped inside parentheses (#3307) (22.12.0)
    • Correctly handle trailing commas that are inside a line's leading non-nested parens (#3370) (22.12.0)
    • --skip-string-normalization / -S now prevents docstring prefixes from being normalized as expected (#3168) (since 22.8.0)
    • When using --skip-magic-trailing-comma or -C, trailing commas are stripped from subscript expressions with more than 1 element (#3209) (22.8.0)
    • Implicitly concatenated strings inside a list, set, or tuple are now wrapped inside parentheses (#3162) (22.8.0)
    • Fix a string merging/split issue when a comment is present in the middle of implicitly concatenated strings on its own line (#3227) (22.8.0)
    • Docstring quotes are no longer moved if it would violate the line length limit (#3044, #3430) (22.6.0)
    • Parentheses around return annotations are now managed (#2990) (22.6.0)
    • Remove unnecessary parentheses around awaited objects (#2991) (22.6.0)
    • Remove unnecessary parentheses in with statements (#2926) (22.6.0)
    • Remove trailing newlines after code block open (#3035) (22.6.0)
    • Code cell separators #%% are now standardised to # %% (#2919) (22.3.0)
    • Remove unnecessary parentheses from except statements (#2939) (22.3.0)
    • Remove unnecessary parentheses from tuple unpacking in for loops (#2945) (22.3.0)
    • Avoid magic-trailing-comma in single-element subscripts (#2942) (22.3.0)

... (truncated)

Commits
  • b0d1fba Prepare release 23.1.0 (#3536)
  • 69ca0a4 Infer target version based on project metadata (#3219)
  • c4bd2e3 Draft for Black 2023 stable style (#3418)
  • 226cbf0 Fix unsafe cast in linegen.py w/ await yield handling (#3533)
  • f4ebc68 Upgrade isort (#3534)
  • 6407ebb Remove Python version in the_basics.md (#3528)
  • 196b1f3 Fix black --help output for --python-cell-magics option to be reproducibl...
  • d950f15 Update document now that paren wrapping CMs on Python 3.9+ is implemented (#3...
  • a36878e Fix an invalid quote escaping bug in f-string expressions (#3509)
  • eabff67 Format hex code in unicode escape sequences in string literals (#2916)
  • Additional commits viewable in compare view

[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=black&package-manager=pip&previous-version=22.12.0&new-version=23.1.0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores) Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`. [//]: # (dependabot-automerge-start) [//]: # (dependabot-automerge-end) ---
Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR: - `@dependabot rebase` will rebase this PR - `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it - `@dependabot merge` will merge this PR after your CI passes on it - `@dependabot squash and merge` will squash and merge this PR after your CI passes on it - `@dependabot cancel merge` will cancel a previously requested merge and block automerging - `@dependabot reopen` will reopen this PR if it is closed - `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually - `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
---- :books: Documentation preview :books:: https://datasette--2014.org.readthedocs.build/en/2014/ ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2014/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1644018605,PR_kwDOBm6k_c5NEqBO,2046,Bump furo from 2022.12.7 to 2023.3.27,49699333,dependabot[bot],closed,0,,,,,0,2023-03-28T13:58:14Z,2023-03-29T06:08:02Z,2023-03-29T06:08:01Z,CONTRIBUTOR,simonw/datasette/pulls/2046,"Bumps [furo](https://github.com/pradyunsg/furo) from 2022.12.7 to 2023.3.27.
Changelog

Sourced from furo's changelog.

Changelog

2023.03.27 -- Tasty Tangerine

  • Regenerate with newer version of sphinx-theme-builder, to fix RECORD hashes.
  • Add missing class to Font Awesome examples

2023.03.23 -- Sassy Saffron

  • Update Python version classifiers.
  • Increase the icon size in mobile header.
  • Increase admonition title bg opacity.
  • Change the default API background to transparent.
  • Transition the API background change.
  • Remove the "indent" of API entries which have a background.
  • Break long inline code literals.

2022.12.07 -- Reverent Raspberry

  • ✨ Add support for Sphinx 6.
  • ✨ Improve footnote presentation with docutils 0.18+.
  • Drop support for Sphinx 4.
  • Improve documentation about what the edit button does.
  • Improve handling of empty-flexboxes for better print experience on Chrome.
  • Improve styling for inline signatures.
  • Replace the meta generator tag with a comment.
  • Tweak labels with icons to prevent users selecting icons as text on touch.

2022.09.29 -- Quaint Quartz

  • Add ability to set arbitrary URLs for edit button.
  • Add support for aligning text in MyST-parser generated tables.

2022.09.15 -- Pragmatic Pistachio

  • Add a minimum version constraint on pygments.
  • Add an explicit dependency on sass.
  • Change right sidebar title from "Contents" to "On this page".
  • Correctly position sidebars on small screens.

... (truncated)

Commits

[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=furo&package-manager=pip&previous-version=2022.12.7&new-version=2023.3.27)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores) Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`. [//]: # (dependabot-automerge-start) [//]: # (dependabot-automerge-end) ---
Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR: - `@dependabot rebase` will rebase this PR - `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it - `@dependabot merge` will merge this PR after your CI passes on it - `@dependabot squash and merge` will squash and merge this PR after your CI passes on it - `@dependabot cancel merge` will cancel a previously requested merge and block automerging - `@dependabot reopen` will reopen this PR if it is closed - `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually - `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
---- :books: Documentation preview :books:: https://datasette--2046.org.readthedocs.build/en/2046/ ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2046/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1639446870,PR_kwDOBm6k_c5M1izI,2043,Bump furo from 2022.12.7 to 2023.3.23,49699333,dependabot[bot],closed,0,,,,,2,2023-03-24T13:58:08Z,2023-03-28T13:58:24Z,2023-03-28T13:58:21Z,CONTRIBUTOR,simonw/datasette/pulls/2043,"Bumps [furo](https://github.com/pradyunsg/furo) from 2022.12.7 to 2023.3.23.
Changelog

Sourced from furo's changelog.

Changelog

2023.03.23 -- Sassy Saffron

  • Regenerate with newer version of sphinx-theme-builder, to fix RECORD hashes.
  • Update Python version classifiers.
  • Increase the icon size in mobile header.
  • Increase admonition title bg opacity.
  • Change the default API background to transparent.
  • Transition the API background change.
  • Remove the "indent" of API entries which have a background.
  • Break long inline code literals.

2022.12.07 -- Reverent Raspberry

  • ✨ Add support for Sphinx 6.
  • ✨ Improve footnote presentation with docutils 0.18+.
  • Drop support for Sphinx 4.
  • Improve documentation about what the edit button does.
  • Improve handling of empty-flexboxes for better print experience on Chrome.
  • Improve styling for inline signatures.
  • Replace the meta generator tag with a comment.
  • Tweak labels with icons to prevent users selecting icons as text on touch.

2022.09.29 -- Quaint Quartz

  • Add ability to set arbitrary URLs for edit button.
  • Add support for aligning text in MyST-parser generated tables.

2022.09.15 -- Pragmatic Pistachio

  • Add a minimum version constraint on pygments.
  • Add an explicit dependency on sass.
  • Change right sidebar title from "Contents" to "On this page".
  • Correctly position sidebars on small screens.
  • Correctly select only Furo's own svg in related pages nav.
  • Make numpy-style documentation headers consistent.
  • Retitle the reference section.
  • Update npm dependencies.

... (truncated)

Commits

[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=furo&package-manager=pip&previous-version=2022.12.7&new-version=2023.3.23)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores) Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`. [//]: # (dependabot-automerge-start) [//]: # (dependabot-automerge-end) ---
Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR: - `@dependabot rebase` will rebase this PR - `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it - `@dependabot merge` will merge this PR after your CI passes on it - `@dependabot squash and merge` will squash and merge this PR after your CI passes on it - `@dependabot cancel merge` will cancel a previously requested merge and block automerging - `@dependabot reopen` will reopen this PR if it is closed - `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually - `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
---- :books: Documentation preview :books:: https://datasette--2043.org.readthedocs.build/en/2043/ ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2043/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1114543475,I_kwDOCGYnMM5CbpVz,388,Link to stable docs from older versions,9599,simonw,closed,0,,,,,7,2022-01-26T01:55:46Z,2023-03-26T23:43:12Z,2022-01-26T02:00:22Z,OWNER,,"https://sqlite-utils.datasette.io/en/2.14.1/ isn't showing a link to the stable release right now. I should also apply the same fix I used for Datasette in: - https://github.com/simonw/datasette/issues/1608 TIL: https://til.simonwillison.net/readthedocs/link-from-latest-to-stable",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/388/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1109808154,I_kwDOBm6k_c5CJlQa,1608,Documentation should clarify /stable/ vs /latest/,9599,simonw,closed,0,,,,,15,2022-01-20T22:02:59Z,2023-03-26T23:41:12Z,2022-01-20T22:53:17Z,OWNER,,"It's not currently clear what the difference between https://docs.datasette.io/en/latest/ and https://docs.datasette.io/en/stable/ is - I should fix that. On Twitter: https://twitter.com/simonw/status/1484285006243528705",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1608/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1551694938,PR_kwDOBm6k_c5IQeKz,1999,?_extra= support (draft),9599,simonw,closed,0,,,,,49,2023-01-21T04:55:18Z,2023-03-22T22:49:41Z,2023-03-22T22:49:40Z,OWNER,simonw/datasette/pulls/1999,"Refs: - #262 ---- :books: Documentation preview :books:: https://datasette--1999.org.readthedocs.build/en/1999/ ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1999/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1633077183,I_kwDOBm6k_c5hVse_,2041,Remove obsolete table POST code,9599,simonw,closed,0,,,8755003,Datasette 1.0a-next,2,2023-03-21T01:01:40Z,2023-03-21T01:17:44Z,2023-03-21T01:17:43Z,OWNER,,"Spotted this in: - #1999 `POST /db/table` currently executes obsolete code for inserting a row - I replaced that with `/db/table/-/insert` in https://github.com/simonw/datasette/commit/6e788b49edf4f842c0817f006eb9d865778eea5e but forgot to remove the old code.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2041/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1617823309,I_kwDOJHON9s5gbgZN,8,Increase performance using macnotesapp,41546558,RhetTbull,closed,0,,,,,1,2023-03-09T18:51:05Z,2023-03-14T22:00:22Z,2023-03-14T22:00:21Z,NONE,,"Neat project! You can probably increase performance using my python interface to Notes, [macnotesapp](https://github.com/RhetTbull/macnotesapp), which uses Scripting Bridge and bulk queries for much better performance than AppleScript. 
Another related project is [PyXA](https://github.com/SKaplanOfficial/PyXA) which uses Scripting Bridge to access Notes (and many other apps) and can return all the notes at once as opposed to calling AppleScript for each note. macnotesapp allows you to access multiple accounts and folders as well. ```python from macnotesapp import NotesApp # NotesApp() provides interface to Notes.app notesapp = NotesApp() # Get list of notes (Note objects for each note) notes = notesapp.notes() note = notes[0] print( note.id, note.account, note.folder, note.name, note.body, note.plaintext, note.password_protected, ) print(note.asdict()) ```",611552758,apple-notes-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/8/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1620516340,I_kwDOCGYnMM5glx30,533,ReadTheDocs error: not all arguments converted during string formatting,9599,simonw,closed,0,,,,,2,2023-03-12T21:21:05Z,2023-03-12T21:25:33Z,2023-03-12T21:25:33Z,OWNER,,"This came up as a failure running tests for: - #531 Traceback on https://readthedocs.org/projects/sqlite-utils/builds/19749348/ ``` File ""/home/docs/checkouts/readthedocs.org/user_builds/sqlite-utils/envs/531/lib/python3.8/site-packages/docutils/parsers/rst/states.py"", line 889, in interpreted nodes, messages2 = role_fn(role, rawsource, text, lineno, self) File ""/home/docs/checkouts/readthedocs.org/user_builds/sqlite-utils/envs/531/lib/python3.8/site-packages/sphinx/ext/extlinks.py"", line 103, in role title = caption % part TypeError: not all arguments converted during string formatting Exception occurred: File ""/home/docs/checkouts/readthedocs.org/user_builds/sqlite-utils/envs/531/lib/python3.8/site-packages/sphinx/ext/extlinks.py"", line 103, in role title = caption % part TypeError: not all arguments converted during string formatting ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/533/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1615891776,I_kwDOBm6k_c5gUI1A,2037,Test failure: FAILED tests/test_cli.py::test_install_requirements - FileNotFoundError,9599,simonw,closed,0,,,,,3,2023-03-08T20:30:06Z,2023-03-09T22:33:39Z,2023-03-09T22:33:39Z,OWNER,,"> FAILED tests/test_cli.py::test_install_requirements - FileNotFoundError: [Errno 2] No such file or directory From https://github.com/simonw/datasette/actions/runs/4348548218/jobs/7597208191 ``` =================================== FAILURES =================================== __________________________ test_install_requirements ___________________________ run_module = @mock.patch(""datasette.cli.run_module"") def test_install_requirements(run_module): runner = CliRunner() > with runner.isolated_filesystem(): /home/runner/work/datasette/datasette/tests/test_cli.py:184: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /opt/hostedtoolcache/Python/3.9.16/x64/lib/python3.9/contextlib.py:119: in __enter__ return next(self.gen) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , temp_dir = None @contextlib.contextmanager def isolated_filesystem( self, temp_dir: t.Optional[t.Union[str, os.PathLike]] = None ) -> t.Iterator[str]: """"""A context manager that creates a temporary directory and changes the current working directory 
to it. This isolates tests that affect the contents of the CWD to prevent them from interfering with each other. :param temp_dir: Create the temporary directory under this directory. If given, the created directory is not removed when exiting. .. versionchanged:: 8.0 Added the ``temp_dir`` parameter. """""" > cwd = os.getcwd() E FileNotFoundError: [Errno 2] No such file or directory /opt/hostedtoolcache/Python/3.9.16/x64/lib/python3.9/site-packages/click/testing.py:466: FileNotFoundError ``` Not sure why it only affected the ""[Calculate test coverage](https://github.com/simonw/datasette/actions/workflows/test-coverage.yml)"" one.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2037/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1617769847,I_kwDOJHON9s5gbTV3,7,Folder support,9599,simonw,closed,0,,,,,6,2023-03-09T18:21:33Z,2023-03-09T20:48:18Z,2023-03-09T20:48:18Z,MEMBER,,Notes can live in folders. These relationships should be exported too.,611552758,apple-notes-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/7/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1617962395,I_kwDOJHON9s5gcCWb,10,Include schema in README,9599,simonw,closed,0,,,,,0,2023-03-09T20:38:59Z,2023-03-09T20:48:18Z,2023-03-09T20:48:18Z,MEMBER,,As seen in other tools like https://github.com/simonw/git-history,611552758,apple-notes-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/10/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1616354999,I_kwDOJHON9s5gV563,2,First working version,9599,simonw,closed,0,,,,,7,2023-03-09T03:53:00Z,2023-03-09T05:10:22Z,2023-03-09T05:10:22Z,MEMBER,,"It's going to shell out to `osascript` as seen in: - #1 I'm going with that option because https://appscript.sourceforge.io/status.html warns against the other potential methods: > Apple eliminated its Mac Automation department in 2016. The future of AppleScript and its related technologies is unclear. Caveat emptor. 
But `osascript` looks pretty stable to me.",611552758,apple-notes-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/2/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1616422013,I_kwDOJHON9s5gWKR9,3,`apple-notes-to-sqlite --dump` option,9599,simonw,closed,0,,,,,0,2023-03-09T05:05:49Z,2023-03-09T05:06:14Z,2023-03-09T05:06:14Z,MEMBER,,"Option that doesn't write to the database at all, it just outputs all the notes to stdout as newline-delimited JSON.",611552758,apple-notes-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/3/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1616347574,I_kwDOJHON9s5gV4G2,1,Initial proof of concept with ChatGPT,9599,simonw,closed,0,,,,,3,2023-03-09T03:44:39Z,2023-03-09T03:51:55Z,2023-03-09T03:51:55Z,MEMBER,,I'm using ChatGPT to figure out enough AppleScript to get at my notes data.,611552758,apple-notes-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/1/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1615862295,I_kwDOBm6k_c5gUBoX,2036,"`publish cloudrun` reuses image tags, which can lead to very surprising deploy problems",9599,simonw,closed,0,,,,,6,2023-03-08T20:11:44Z,2023-03-08T20:57:34Z,2023-03-08T20:57:34Z,OWNER,,"See this issue: - https://github.com/simonw/datasette.io/issues/141",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2036/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1612296210,I_kwDOBm6k_c5gGbAS,2033,`datasette install -r requirements.txt`,9599,simonw,closed,0,,,,,2,2023-03-06T22:17:17Z,2023-03-06T22:54:52Z,2023-03-06T22:27:34Z,OWNER,,"Would be useful for cases where you want to install a whole set of plugins in one go, e.g. when running tutorials in GitHub Codespaces.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2033/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1590839187,PR_kwDOBm6k_c5KSs9T,2028,add Python 3.11 classifier,614233,dtrodrigues,closed,0,,,,,2,2023-02-19T20:16:03Z,2023-03-06T21:01:20Z,2023-03-06T21:01:19Z,CONTRIBUTOR,simonw/datasette/pulls/2028,"Python 3.11 is tested in CI and is used in the docker image, so add the Python 3.11 Trove classifier. ---- :books: Documentation preview :books:: https://datasette--2028.org.readthedocs.build/en/2028/ ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2028/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1594383280,I_kwDOBm6k_c5fCFuw,2030,How to use Datasette with apache webserver on GCP?,19700859,gk7279,closed,0,,,,,2,2023-02-22T03:08:49Z,2023-02-22T21:54:39Z,2023-02-22T21:54:39Z,NONE,,"Hi Simon and Datasette team- I have installed apache2 webserver inside GCP VM using apt. I can see my ""Hello World"" index.html if I use the external IP of this GCP in a browser. 
However, when I try to run datasette with different combinations of -h and -p, I am still unable to access the webpage. I cannot invest Docker on this VM. Any pointers to use datasette with already existing apache2 webserver on GCP is appreciated. Thanks.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2030/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1572766460,I_kwDOCGYnMM5dvoL8,524,Transformation type `--type DATETIME`,21095447,4l1fe,closed,0,,,,,15,2023-02-06T15:18:42Z,2023-02-15T12:10:54Z,2023-02-15T12:10:54Z,NONE,,"Hey. Currently i do transformation with the type `--type TEXT`, but i noticed using the sqlalchemy based library [dataset](https://github.com/pudo/dataset) that is reading and writing differ depending on the column types `TEXT`, `DATETIME`. Is it possible to alter a column type to `DATETIME` somehow using Sqlite-Utils?",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/524/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1579695809,I_kwDOBm6k_c5eKD7B,2023,Error: Invalid setting 'hash_urls' in settings.json in 0.64.1,80409402,mlaparie,closed,0,,,,,2,2023-02-10T13:35:01Z,2023-02-10T15:40:00Z,2023-02-10T15:39:59Z,NONE,,"On a Debian machine, using datasette 0.64.1 installed with `pip3`, I am getting a `datasette[114272]: Error: Invalid setting 'hash_urls' in settings.json` in `journalctl -xe`. The same settings work on 0.54.1 on another Debian server. This is my `settings.json`: ```json { ""default_page_size"": 200, ""max_returned_rows"": 8000, ""num_sql_threads"": 3, ""sql_time_limit_ms"": 1000, ""default_facet_size"": 30, ""facet_time_limit_ms"": 200, ""facet_suggest_time_limit_ms"": 50, ""hash_urls"": false, ""allow_facet"": true, ""allow_download"": true, ""suggest_facets"": true, ""default_cache_ttl"": 5, ""default_cache_ttl_hashed"": 31536000, ""cache_size_kb"": 0, ""allow_csv_stream"": true, ""max_csv_mb"": 100, ""truncate_cells_html"": 2048, ""force_https_urls"": false, ""template_debug"": false, ""base_url"": ""/pclim/db/"" } ``` This looks ok to me. Would you have any ideas?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2023/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1578609658,I_kwDOBm6k_c5eF6v6,2022,Error 500 - not clear the cause,1667631,DavidPratten,closed,0,,,,,1,2023-02-09T20:57:17Z,2023-02-09T21:13:50Z,2023-02-09T21:13:50Z,NONE,,"On the database that I have sent via linkedIn, datasette works great, but the following URL gives a 500 error. http://127.0.0.1:8001/literature/authors_papers?authorId=100550354 The cause of the error is not apparent. Is this expected behaviour? 
David",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2022/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1034535001,I_kwDOBm6k_c49qcBZ,1497,"Publish to Docker Hub failing with ""libcrypt.so.1: cannot open shared object file""",9599,simonw,closed,0,,,,,18,2021-10-24T22:57:07Z,2023-01-18T17:13:45Z,2021-10-24T23:36:55Z,OWNER,,"This means the Datasette 0.59.1 release has not been published to Docker Hub. Here's where that failed: https://github.com/simonw/datasette/runs/3991043374?check_suite_focus=true ``` Preparing to unpack .../libc6_2.32-4_amd64.deb ... debconf: unable to initialize frontend: Dialog debconf: (TERM is not set, so the dialog frontend is not usable.) debconf: falling back to frontend: Readline debconf: unable to initialize frontend: Readline debconf: (Can't locate Term/ReadLine.pm in @INC (you may need to install the Term::ReadLine module) (@INC contains: /etc/perl /usr/local/lib/x86_64-linux-gnu/perl/5.28.1 /usr/local/share/perl/5.28.1 /usr/lib/x86_64-linux-gnu/perl5/5.28 /usr/share/perl5 /usr/lib/x86_64-linux-gnu/perl/5.28 /usr/share/perl/5.28 /usr/local/lib/site_perl /usr/lib/x86_64-linux-gnu/perl-base) at /usr/share/perl5/Debconf/FrontEnd/Readline.pm line 7.) debconf: falling back to frontend: Teletype Checking for services that may need to be restarted... Checking init scripts... Unpacking libc6:amd64 (2.32-4) over (2.28-10) ... Setting up libc6:amd64 (2.32-4) ... /usr/bin/perl: error while loading shared libraries: libcrypt.so.1: cannot open shared object file: No such file or directory dpkg: error processing package libc6:amd64 (--configure): installed libc6:amd64 package post-installation script subprocess returned error exit status 127 Errors were encountered while processing: libc6:amd64 E: Sub-process /usr/bin/dpkg returned an error code (1) The command '/bin/sh -c apt-get update && apt-get -y --no-install-recommends install software-properties-common && add-apt-repository ""deb http://httpredir.debian.org/debian sid main"" && apt-get update && apt-get -t sid install -y --no-install-recommends libsqlite3-mod-spatialite && apt-get remove -y software-properties-common && apt clean && rm -rf /var/lib/apt && rm -rf /var/lib/dpkg/info/*' returned a non-zero code: 100 ``` Same problem when I attempted to publish using the ""Push specific Docker tag"" workflow: https://github.com/simonw/datasette/runs/3991059912?check_suite_focus=true",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1497/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1529452371,I_kwDOBm6k_c5bKZdT,1987,installpython3.com is now a spam website,9599,simonw,closed,0,,,,,4,2023-01-11T17:55:12Z,2023-01-11T18:29:26Z,2023-01-11T18:29:25Z,OWNER,,"Need to stop linking to it from the docs. 
I'll link to https://www.python.org/about/gettingstarted/ instead.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1987/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1528448642,I_kwDOBm6k_c5bGkaC,1985,Don't let Datasette(path) without a list cause weird errors,9599,simonw,closed,0,,,,,1,2023-01-11T05:17:44Z,2023-01-11T18:25:04Z,2023-01-11T18:25:04Z,OWNER,,"I got a confusing `sqlite3.OperationalError: disk I/O error` error in my tests, it turned out it was because this: ```python ds = Datasette(path) ``` Should have been this: ```python ds = Datasette([path]) ``` _Originally posted by @simonw in https://github.com/simonw/datasette-faiss/issues/1#issuecomment-1378252673_ ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1985/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1525560504,PR_kwDOBm6k_c5G-ZsQ,1982,Bump sphinx from 5.3.0 to 6.1.2,49699333,dependabot[bot],closed,0,,,,,1,2023-01-09T13:06:11Z,2023-01-10T02:03:21Z,2023-01-10T02:03:19Z,CONTRIBUTOR,simonw/datasette/pulls/1982,"Bumps [sphinx](https://github.com/sphinx-doc/sphinx) from 5.3.0 to 6.1.2.
Release notes

Sourced from sphinx's releases.

v6.1.2

Changelog: https://www.sphinx-doc.org/en/master/changes.html

v6.1.1

Changelog: https://www.sphinx-doc.org/en/master/changes.html

v6.1.0

Changelog: https://www.sphinx-doc.org/en/master/changes.html

v6.0.1

Changelog: https://www.sphinx-doc.org/en/master/changes.html

v6.0.0

Changelog: https://www.sphinx-doc.org/en/master/changes.html

v6.0.0b2

Changelog: https://www.sphinx-doc.org/en/master/changes.html

v6.0.0b1

Changelog: https://www.sphinx-doc.org/en/master/changes.html

Changelog

Sourced from sphinx's changelog.

Release 6.1.2 (released Jan 07, 2023)

Bugs fixed

  • #11101: LaTeX: div.topic_padding key of sphinxsetup documented at 5.1.0 was implemented with name topic_padding

  • #11099: LaTeX: shadowrule key of sphinxsetup causes PDF build to crash since Sphinx 5.1.0

  • #11096: LaTeX: shadowsize key of sphinxsetup causes PDF build to crash since Sphinx 5.1.0

  • #11095: LaTeX: shadow of :dudir:topic and contents_ boxes not in page margin since Sphinx 5.1.0

    .. _contents: https://docutils.sourceforge.io/docs/ref/rst/directives.html#table-of-contents

  • #11100: Fix copying images when running under parallel mode.

Release 6.1.1 (released Jan 05, 2023)

Bugs fixed

  • #11091: Fix util.nodes.apply_source_workaround for literal_block nodes with no source information in the node or the node's parents.

Release 6.1.0 (released Jan 05, 2023)

Dependencies

Incompatible changes

  • #10979: gettext: Removed support for pluralisation in get_translation. This was unused and complicated other changes to sphinx.locale.

Deprecated

  • sphinx.util functions:

    • Renamed sphinx.util.typing.stringify() to sphinx.util.typing.stringify_annotation()

... (truncated)

Commits

[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=sphinx&package-manager=pip&previous-version=5.3.0&new-version=6.1.2)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores) Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`. [//]: # (dependabot-automerge-start) [//]: # (dependabot-automerge-end) ---
Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR: - `@dependabot rebase` will rebase this PR - `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it - `@dependabot merge` will merge this PR after your CI passes on it - `@dependabot squash and merge` will squash and merge this PR after your CI passes on it - `@dependabot cancel merge` will cancel a previously requested merge and block automerging - `@dependabot reopen` will reopen this PR if it is closed - `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually - `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
---- :books: Documentation preview :books:: https://datasette--1982.org.readthedocs.build/en/1982/ ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1982/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1515185383,I_kwDOBm6k_c5aT-Tn,1971,Upgrade for Sphinx 6.0 (once Furo has support for it),9599,simonw,closed,0,,,,,3,2022-12-31T19:04:35Z,2023-01-10T02:02:34Z,2023-01-10T02:02:34Z,OWNER,,"A deployment of #1967 to ReadTheDocs just failed like this: https://readthedocs.org/projects/datasette/builds/19045460/ ``` Running Sphinx v6.0.0 making output directory... done building [mo]: targets for 0 po files that are out of date building [html]: targets for 28 source files that are out of date updating environment: [new config] 28 added, 0 changed, 0 removed reading sources... [ 3%] authentication reading sources... [ 7%] binary_data reading sources... [ 10%] changelog Traceback (most recent call last): File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/statemachine.py"", line 299, in next_line self.line = self.input_lines[self.line_offset] File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/statemachine.py"", line 1136, in __getitem__ return self.data[i] IndexError: list index out of range During handling of the above exception, another exception occurred: Traceback (most recent call last): File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/statemachine.py"", line 226, in run self.next_line() File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/statemachine.py"", line 302, in next_line raise EOFError EOFError During handling of the above exception, another exception occurred: Traceback (most recent call last): File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/sphinx/cmd/build.py"", line 281, in build_main app.build(args.force_all, args.filenames) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/sphinx/application.py"", line 344, in build self.builder.build_update() File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/sphinx/builders/__init__.py"", line 310, in build_update self.build(to_build, File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/sphinx/builders/__init__.py"", line 326, in build updated_docnames = set(self.read()) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/sphinx/builders/__init__.py"", line 433, in read self._read_serial(docnames) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/sphinx/builders/__init__.py"", line 454, in _read_serial self.read_doc(docname) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/sphinx/builders/__init__.py"", line 510, in read_doc publisher.publish() File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/core.py"", line 224, in publish self.document = self.reader.read(self.source, self.parser, File 
""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/sphinx/io.py"", line 103, in read self.parse() File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/readers/__init__.py"", line 76, in parse self.parser.parse(self.input, document) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/sphinx/parsers.py"", line 78, in parse self.statemachine.run(inputlines, document, inliner=self.inliner) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 169, in run results = StateMachineWS.run(self, input_lines, input_offset, File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/statemachine.py"", line 233, in run context, next_state, result = self.check_line( File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/statemachine.py"", line 445, in check_line return method(match, context, next_state) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 3024, in text self.section(title.lstrip(), source, style, lineno + 1, messages) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 325, in section self.new_subsection(title, lineno, messages) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 391, in new_subsection newabsoffset = self.nested_parse( File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 279, in nested_parse state_machine.run(block, input_offset, memo=self.memo, File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 195, in run results = StateMachineWS.run(self, input_lines, input_offset) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/statemachine.py"", line 233, in run context, next_state, result = self.check_line( File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/statemachine.py"", line 445, in check_line return method(match, context, next_state) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 2785, in underline self.section(title, source, style, lineno - 1, messages) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 325, in section self.new_subsection(title, lineno, messages) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 391, in new_subsection newabsoffset = self.nested_parse( File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 279, in nested_parse state_machine.run(block, input_offset, memo=self.memo, File 
""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 195, in run results = StateMachineWS.run(self, input_lines, input_offset) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/statemachine.py"", line 233, in run context, next_state, result = self.check_line( File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/statemachine.py"", line 445, in check_line return method(match, context, next_state) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 1273, in bullet i, blank_finish = self.list_item(match.end()) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 1295, in list_item self.nested_parse(indented, input_offset=line_offset, File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 279, in nested_parse state_machine.run(block, input_offset, memo=self.memo, File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 195, in run results = StateMachineWS.run(self, input_lines, input_offset) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/statemachine.py"", line 239, in run result = state.eof(context) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 2725, in eof self.blank(None, context, None) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 2716, in blank paragraph, literalnext = self.paragraph( File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 416, in paragraph textnodes, messages = self.inline_text(text, lineno) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 425, in inline_text nodes, messages = self.inliner.parse(text, lineno, File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 649, in parse before, inlines, remaining, sysmessages = method(self, match, File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 792, in interpreted_or_phrase_ref nodelist, messages = self.interpreted(rawsource, escaped, role, File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 889, in interpreted nodes, messages2 = role_fn(role, rawsource, text, lineno, self) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/sphinx/ext/extlinks.py"", line 101, in role title = caption % part TypeError: not all arguments converted during string formatting Exception occurred: File 
""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/sphinx/ext/extlinks.py"", line 101, in role title = caption % part TypeError: not all arguments converted during string formatting The full traceback has been saved in /tmp/sphinx-err-kq7ylgqo.log, if you want to report the issue to the developers. Please also report this if it was a user error, so that a better error message can be provided next time. A bug report can be filed in the tracker at . Thanks! ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1971/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1526635374,PR_kwDOBm6k_c5HCCY2,1984,Upgrade Sphinx,9599,simonw,closed,0,,,,,1,2023-01-10T02:00:40Z,2023-01-10T02:02:33Z,2023-01-10T02:02:33Z,OWNER,simonw/datasette/pulls/1984,"Refs #1971 ---- :books: Documentation preview :books:: https://datasette--1984.org.readthedocs.build/en/1984/ ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1984/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 710650633,MDU6SXNzdWU3MTA2NTA2MzM=,979,Default table view JSON should include CREATE TABLE,9599,simonw,closed,0,,,,,3,2020-09-28T23:54:58Z,2023-01-09T15:32:39Z,2023-01-09T15:32:22Z,OWNER,,"https://latest.datasette.io/fixtures/facetable.json doesn't currently include the CREATE TABLE statement for the page, even though it's available on the HTML version at https://latest.datasette.io/fixtures/facetable",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/979/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1522778923,I_kwDOBm6k_c5aw8Mr,1978,Document datasette.urls.row and row_blob,25778,eyeseast,closed,0,,,,,2,2023-01-06T15:45:51Z,2023-01-09T14:30:00Z,2023-01-09T14:30:00Z,CONTRIBUTOR,,"These are in the codebase but not in documentation. I think everything else in this class is documented. ```python class Urls: ... def row(self, database, table, row_path, format=None): path = f""{self.table(database, table)}/{row_path}"" if format is not None: path = path_with_format(path=path, format=format) return PrefixedUrlString(path) def row_blob(self, database, table, row_path, column): return self.table(database, table) + ""/{}.blob?_blob_column={}"".format( row_path, urllib.parse.quote_plus(column) ) ``` ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1978/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,not_planned 1522552817,PR_kwDOBm6k_c5G0XxH,1977,Bump sphinx from 5.3.0 to 6.1.1,49699333,dependabot[bot],closed,0,,,,,2,2023-01-06T13:02:12Z,2023-01-09T13:06:17Z,2023-01-09T13:06:14Z,CONTRIBUTOR,simonw/datasette/pulls/1977,"Bumps [sphinx](https://github.com/sphinx-doc/sphinx) from 5.3.0 to 6.1.1.
Release notes

Sourced from sphinx's releases.

v6.1.1

Changelog: https://www.sphinx-doc.org/en/master/changes.html

v6.1.0

Changelog: https://www.sphinx-doc.org/en/master/changes.html

v6.0.1

Changelog: https://www.sphinx-doc.org/en/master/changes.html

v6.0.0

Changelog: https://www.sphinx-doc.org/en/master/changes.html

v6.0.0b2

Changelog: https://www.sphinx-doc.org/en/master/changes.html

v6.0.0b1

Changelog: https://www.sphinx-doc.org/en/master/changes.html

Changelog

Sourced from sphinx's changelog.

Release 6.1.1 (released Jan 05, 2023)

Bugs fixed

  • #11091: Fix util.nodes.apply_source_workaround for literal_block nodes with no source information in the node or the node's parents.

Release 6.1.0 (released Jan 05, 2023)

Dependencies

Incompatible changes

  • #10979: gettext: Removed support for pluralisation in get_translation. This was unused and complicated other changes to sphinx.locale.

Deprecated

  • sphinx.util functions:

    • Renamed sphinx.util.typing.stringify() to sphinx.util.typing.stringify_annotation()
    • Moved sphinx.util.xmlname_checker() to sphinx.builders.epub3._XML_NAME_PATTERN

    Moved to sphinx.util.display:

    • sphinx.util.status_iterator
    • sphinx.util.display_chunk
    • sphinx.util.SkipProgressMessage
    • sphinx.util.progress_message

    Moved to sphinx.util.http_date:

    • sphinx.util.epoch_to_rfc1123
    • sphinx.util.rfc1123_to_epoch

    Moved to sphinx.util.exceptions:

    • sphinx.util.save_traceback

... (truncated)

Commits

[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=sphinx&package-manager=pip&previous-version=5.3.0&new-version=6.1.1)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores) Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`. [//]: # (dependabot-automerge-start) [//]: # (dependabot-automerge-end) ---
Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR: - `@dependabot rebase` will rebase this PR - `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it - `@dependabot merge` will merge this PR after your CI passes on it - `@dependabot squash and merge` will squash and merge this PR after your CI passes on it - `@dependabot cancel merge` will cancel a previously requested merge and block automerging - `@dependabot reopen` will reopen this PR if it is closed - `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually - `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
---- :books: Documentation preview :books:: https://datasette--1977.org.readthedocs.build/en/1977/ ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1977/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1524076587,I_kwDOBm6k_c5a15Ar,1979,More useful error message if enable_load_extension is not available,9599,simonw,closed,0,,,,,5,2023-01-07T19:13:19Z,2023-01-08T00:21:23Z,2023-01-08T00:21:23Z,OWNER,,"I get this from: datasette --load-extension spatialite --get /-/versions.json ``` File ""/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/site-packages/datasette/app.py"", line 614, in _prepare_connection conn.enable_load_extension(True) AttributeError: 'sqlite3.Connection' object has no attribute 'enable_load_extension' ``` It would be useful if Datasette caught this error and output something more friendly.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1979/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 957310278,MDU6SXNzdWU5NTczMTAyNzg=,1409,`default_allow_sql` setting (a re-imagining of the old `allow_sql` setting),9599,simonw,closed,0,,,3268330,Datasette 1.0,10,2021-07-31T19:48:56Z,2023-01-07T18:06:01Z,2023-01-05T00:51:31Z,OWNER,,"In 49d6d2f7b0f6cb02e25022e1c9403811f1fa0a7c as part of #813 I removed the `allow_sql` setting - on the basis that users could disable the ability to execute custom SQL queries using the new permission system instead. I don't think this was the right decision. Disabling custom SQL is an important security capability, and explaining how to do it using permissions is significantly more complex than letting people know they can add `--setting allow_sql off`. So I want to bring that setting back - maybe with a different, better name - and have it modify the default for that option if the permissions system doesn't have an opinion. That way people can still use the setting but then use permissions to allow specific signed-in users access to execute SQL.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1409/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1520712722,PR_kwDOBm6k_c5GuDBN,1976,Bump sphinx from 5.3.0 to 6.1.0,49699333,dependabot[bot],closed,0,,,,,2,2023-01-05T13:02:37Z,2023-01-06T13:02:17Z,2023-01-06T13:02:15Z,CONTRIBUTOR,simonw/datasette/pulls/1976,"Bumps [sphinx](https://github.com/sphinx-doc/sphinx) from 5.3.0 to 6.1.0.
Release notes

Sourced from sphinx's releases.

v6.1.0

Changelog: https://www.sphinx-doc.org/en/master/changes.html

v6.0.1

Changelog: https://www.sphinx-doc.org/en/master/changes.html

v6.0.0

Changelog: https://www.sphinx-doc.org/en/master/changes.html

v6.0.0b2

Changelog: https://www.sphinx-doc.org/en/master/changes.html

v6.0.0b1

Changelog: https://www.sphinx-doc.org/en/master/changes.html

Changelog

Sourced from sphinx's changelog.

Release 6.1.0 (released Jan 05, 2023)

Dependencies

Incompatible changes

  • #10979: gettext: Removed support for pluralisation in get_translation. This was unused and complicated other changes to sphinx.locale.

Deprecated

  • sphinx.util functions:

    • Renamed sphinx.util.typing.stringify() to sphinx.util.typing.stringify_annotation()
    • Moved sphinx.util.xmlname_checker() to sphinx.builders.epub3._XML_NAME_PATTERN

    Moved to sphinx.util.display:

    • sphinx.util.status_iterator
    • sphinx.util.display_chunk
    • sphinx.util.SkipProgressMessage
    • sphinx.util.progress_message

    Moved to sphinx.util.http_date:

    • sphinx.util.epoch_to_rfc1123
    • sphinx.util.rfc1123_to_epoch

    Moved to sphinx.util.exceptions:

    • sphinx.util.save_traceback
    • sphinx.util.format_exception_cut_frames

Features added

  • Cache doctrees in the build environment during the writing phase.
  • Make all writing phase tasks support parallel execution.
  • #11072: Use PEP 604 (X | Y) display conventions for typing.Optional and typing.Union types within the Python domain and autodoc.

... (truncated)

Commits

[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=sphinx&package-manager=pip&previous-version=5.3.0&new-version=6.1.0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores) Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`. [//]: # (dependabot-automerge-start) [//]: # (dependabot-automerge-end) ---
Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR: - `@dependabot rebase` will rebase this PR - `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it - `@dependabot merge` will merge this PR after your CI passes on it - `@dependabot squash and merge` will squash and merge this PR after your CI passes on it - `@dependabot cancel merge` will cancel a previously requested merge and block automerging - `@dependabot reopen` will reopen this PR if it is closed - `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually - `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
---- :books: Documentation preview :books:: https://datasette--1976.org.readthedocs.build/en/1976/ ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1976/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1516376583,PR_kwDOBm6k_c5GfPJL,1974,Bump sphinx from 5.3.0 to 6.0.0,49699333,dependabot[bot],closed,0,,,,,2,2023-01-02T13:04:26Z,2023-01-05T13:02:42Z,2023-01-05T13:02:40Z,CONTRIBUTOR,simonw/datasette/pulls/1974,"Bumps [sphinx](https://github.com/sphinx-doc/sphinx) from 5.3.0 to 6.0.0.
Release notes

Sourced from sphinx's releases.

v6.0.0

Changelog: https://www.sphinx-doc.org/en/master/changes.html

v6.0.0b2

Changelog: https://www.sphinx-doc.org/en/master/changes.html

v6.0.0b1

Changelog: https://www.sphinx-doc.org/en/master/changes.html

Changelog

Sourced from sphinx's changelog.

Release 6.0.0 (released Dec 29, 2022)

Dependencies

  • #10468: Drop Python 3.6 support
  • #10470: Drop Python 3.7, Docutils 0.14, Docutils 0.15, Docutils 0.16, and Docutils 0.17 support. Patch by Adam Turner

Incompatible changes

  • #7405: Removed the jQuery and underscore.js JavaScript frameworks.

    These frameworks are no longer automatically injected into themes from Sphinx 6.0. If you develop a theme or extension that uses the jQuery, $, or $u global objects, you need to update your JavaScript to modern standards, or use the mitigation below.

    The first option is to use the sphinxcontrib.jquery_ extension, which has been developed by the Sphinx team and contributors. To use this, add sphinxcontrib.jquery to the extensions list in conf.py, or call app.setup_extension("sphinxcontrib.jquery") if you develop a Sphinx theme or extension.

    The second option is to manually ensure that the frameworks are present. To re-add jQuery and underscore.js, you will need to copy jquery.js and underscore.js from the Sphinx repository_ to your static directory, and add the following to your layout.html:

    .. code-block:: html+jinja

    {%- block scripts %} {{ super() }} {%- endblock %}

    .. _sphinxcontrib.jquery: https://github.com/sphinx-contrib/jquery/

    Patch by Adam Turner.

  • #10471, #10565: Removed deprecated APIs scheduled for removal in Sphinx 6.0. See :ref:dev-deprecated-apis for details. Patch by Adam Turner.

  • #10901: C Domain: Remove support for parsing pre-v3 style type directives and roles. Also remove associated configuration variables c_allow_pre_v3 and c_warn_on_allowed_pre_v3. Patch by Adam Turner.

Features added

... (truncated)
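As a minimal sketch of the conf.py change described under the Incompatible changes section above, assuming only the `sphinxcontrib.jquery` extension named there (the project value is a placeholder):

```python
# conf.py sketch: keep jQuery available after the Sphinx 6.0 removal by
# enabling the sphinxcontrib.jquery extension named in the changelog above.
project = 'example-project'  # placeholder

extensions = [
    'sphinxcontrib.jquery',  # re-adds jquery.js to built pages
]

# A theme or extension can instead require it programmatically,
# as the changelog describes:
def setup(app):
    app.setup_extension('sphinxcontrib.jquery')
```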

Commits

[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=sphinx&package-manager=pip&previous-version=5.3.0&new-version=6.0.0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores) Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`. [//]: # (dependabot-automerge-start) [//]: # (dependabot-automerge-end) ---
Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR: - `@dependabot rebase` will rebase this PR - `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it - `@dependabot merge` will merge this PR after your CI passes on it - `@dependabot squash and merge` will squash and merge this PR after your CI passes on it - `@dependabot cancel merge` will cancel a previously requested merge and block automerging - `@dependabot reopen` will reopen this PR if it is closed - `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually - `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
---- :books: Documentation preview :books:: https://datasette--1974.org.readthedocs.build/en/1974/ ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1974/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1515182998,I_kwDOBm6k_c5aT9uW,1970,"Path ""None"" in _internal database table",9599,simonw,closed,0,,,,,2,2022-12-31T18:51:05Z,2022-12-31T19:22:58Z,2022-12-31T18:52:49Z,OWNER,,"See https://latest.datasette.io/_internal/databases (after https://latest.datasette.io/login-as-root) ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1970/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1503010009,PR_kwDOBm6k_c5FyT3c,1967,Add favicon to documentation,1839645,choldgraf,closed,0,,,,,2,2022-12-19T14:01:04Z,2022-12-31T19:15:51Z,2022-12-31T19:00:31Z,CONTRIBUTOR,simonw/datasette/pulls/1967,"I've been browsing the datasette documentation and found it hard to quickly locate tabs with many of them open, because it does not ship a favicon. So this PR: - Grabs the favicon `.png` from datasette itself[^1] - Adds it to the `_static/` folder - Sets `html_favicon` to load it in the docs [^1]: I also learned that Chrome can fetch favicons as an internal service! See `chrome://favicon/https://datasette.io/tools/github-to-sqlite`. ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1967/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1501900064,I_kwDOBm6k_c5ZhS0g,1966,Broken link to live demo in Getting started docs,7551922,lbellomo,closed,0,,,,,1,2022-12-18T13:17:00Z,2022-12-31T19:15:19Z,2022-12-31T19:15:10Z,NONE,,The link in [Play with a live demo in Getting started](https://github.com/simonw/datasette/blob/main/docs/getting_started.rst#play-with-a-live-demo) to [https://fivethirtyeight.datasettes.com/fivethirtyeight](https://fivethirtyeight.datasettes.com/fivethirtyeight) is broken and the datasette is no longer working (maybe due to the end of the free tier).,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1966/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1515186569,I_kwDOBm6k_c5aT-mJ,1972,Fix Sphinx warning about extlink extension,9599,simonw,closed,0,,,,,0,2022-12-31T19:12:04Z,2022-12-31T19:13:26Z,2022-12-31T19:13:26Z,OWNER,,"``` [sphinx-autobuild] > sphinx-build -b html /Users/simon/Dropbox/Development/datasette/docs /Users/simon/Dropbox/Development/datasette/docs/_build Running Sphinx v5.3.0 loading pickled environment... done WARNING: extlinks: Sphinx-6.0 will require a caption string to contain exactly one '%s' and all other '%' need to be escaped as '%%'. 
``` _Originally posted by @simonw in https://github.com/simonw/datasette/issues/1971#issuecomment-1368266904_ ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1972/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1000275035,PR_kwDOCGYnMM4r7n-9,327,Extract expand: Support JSON Arrays,101753,phaer,closed,0,,,,,0,2021-09-19T10:34:30Z,2022-12-29T09:05:36Z,2022-12-29T09:05:36Z,NONE,simonw/sqlite-utils/pulls/327,"Hi, I needed to extract data in JSON Arrays to normalize data imports. I've quickly hacked the following together based on #241 which refers to #239 where you, @simonw, wrote: > Could this handle lists of objects too? That would be pretty amazing - if the column has a [{...}, {...}] list in it could turn that into a many-to-many. They way this works in my work is that many-to-many relationships are created for anything that maps to an dictionary in a list, and many-to-one relations for everything else (assumed to be scalar values). Not sure what the best approach here would be? Are many-to-one relationships are at all useful here? What do you think about this approach? I could try to add it to the cli interface and documentation if wanted. Thanks for this awesome piece of software in any case! :sun_with_face: ",140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/327/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1501843596,PR_kwDOBm6k_c5FuaJm,1965,Detect server start/stop more reliably.,11321,janl,closed,0,,,,,2,2022-12-18T10:03:42Z,2022-12-20T19:08:26Z,2022-12-18T16:01:51Z,CONTRIBUTOR,simonw/datasette/pulls/1965,"This is useful, especially in testing, since your test hosts might not reliabliy start the server within two seconds, so we do a definite check before progressing. By the same token, after `kill $server_pid` wait for the pid to be gone from the process list. Since now the script can end prematurely, I also added a cleanup function to make sure the temporary certs are removed in any case. n.b. this could also be done with the use of `trap 'fn' ERR` but that felt like a bit too much magic for this short a script. ---- :books: Documentation preview :books:: https://datasette--1965.org.readthedocs.build/en/1965/ ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1965/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1496652622,I_kwDOBm6k_c5ZNRtO,1955,"invoke_startup() is not run in some conditions, e.g. gunicorn/uvicorn workers, breaking lots of things",32839123,Rik-de-Kort,closed,0,,,,,36,2022-12-14T13:39:56Z,2022-12-19T04:34:16Z,2022-12-18T02:45:18Z,NONE,,"In the past (pre-september 14, #1809) I had a running deployment of Datasette on Azure WebApps by emulating the call in cli.py to Gunicorn: `gunicorn -w 2 -k uvicorn.workers.UvicornWorker app:app`. My most recent deployment, however, fails loudly by shouting that `Datasette.invoke_startup()` was not called. It does not seem to be possible to call `invoke_startup` when running using a uvicorn command directly like this (I've reproduced this locally using `uvicorn`). 
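A minimal sketch of that kind of wrapper, assuming a single worker and that awaiting `invoke_startup()` once before the first request is enough (the approaches actually tried are listed below):

```python
# app.py sketch: wrap the Datasette ASGI app so invoke_startup() is awaited
# once before the first request when launching with `uvicorn app:app`.
# Not concurrency-safe for the very first requests; illustration only.
from datasette import Datasette

ds = Datasette(files=['./global-power-plants.db'])
inner = ds.app()
started = False


async def app(scope, receive, send):
    global started
    if not started:
        await ds.invoke_startup()
        started = True
    await inner(scope, receive, send)
```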
Two candidates that I have tried: * Uvicorn has a `--factory` option, but the app factory has to be synchronous, so no `await invoke_startup` there * `asyncio.get_event_loop().run_until_complete` is also not an option because `uvicorn` already has the event loop running. One additional option is: * Use Gunicorn's [server hooks](https://docs.gunicorn.org/en/stable/settings.html#server-hooks) to call `invoke_startup`. These are also synchronous, but I might be able to get ahead of the event loop starting here. In my current deployment setup, it does not appear to be possible to use `datasette serve` directly, so I'm stuck either * Trying to rework my complete deployment setup, for instance, using Azure functions as described [here](https://github.com/simonw/azure-functions-datasette)) * Or dig into the ASGI spec and write a wrapper for the sole purpose of launching Datasette using a direct Uvicorn invocation. Questions for the maintainers: * Is this intended behaviour/will not support/etc.? If so, I'd be happy to add a PR with a couple lines in the documentation. * if this is not intended behaviour, what is a good way to fix it? I could have a go at the ASGI spec thing (I think the Azure Functions thing is related) and provide a PR with the wrapper here, but I'm all ears! Almost forgot, minimal reproducer: ```python from datasette import Datasette ds = Datasette(files=['./global-power-plants.db'])] app = ds.app() ``` Save as app.py in the same folder as global-power-plants.db, and then try running `uvicorn app:app`. Opening the resulting Datasette instance in the browser will show the error message.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1955/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1306984363,I_kwDOBm6k_c5N5v-r,1771,minor a11y: `",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1939/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1486011362,PR_kwDOBm6k_c5E3XqB,1940,register_permissions() plugin hook,9599,simonw,closed,0,,,8711695, Datasette 1.0a2,6,2022-12-09T05:09:28Z,2022-12-13T02:05:55Z,2022-12-13T02:05:54Z,OWNER,simonw/datasette/pulls/1940,"Refs #1939 From this comment: https://github.com/simonw/datasette/issues/1939#issuecomment-1343872168 - [x] Unit test for the registration plugin hook itself - [x] Use them in `check_permission_actions_are_documented` test in `conftest.py` - [x] Add description field to `Permissions` (and update tests and docs) - [x] Documentation for `datasette.permissions` dictionary - [x] If no `default=` provided in call to `permission_allowed()` then use default from `datasette.permissions` list - [x] Remove `default=` from a bunch of places - [x] Throw an error if two permissions are registered with the same name or abbreviation (but other attributes differ) - [x] Update authentication and permissions documentation to explain that permissions are now registered and have a registered default ---- :books: Documentation preview :books:: https://datasette--1940.org.readthedocs.build/en/1940/ ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1940/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 
1487764628,I_kwDOCGYnMM5YrXyU,518,flake8 ValueError: Error code '#' supplied to 'extend-ignore' option...,9599,simonw,closed,0,,,,,0,2022-12-10T01:30:24Z,2022-12-10T01:36:46Z,2022-12-10T01:36:46Z,OWNER,,"> `Error code '#' supplied to 'extend-ignore' option does not match '^[A-Z]{1,3}[0-9]{0,3}$'` https://github.com/simonw/sqlite-utils/actions/runs/3662011265/jobs/6190770361 I think from this: https://github.com/simonw/sqlite-utils/blob/e660635cea6c32f4022818380b1e1ee88e7c93a6/setup.cfg#L1-L3 ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/518/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1487757143,I_kwDOCGYnMM5YrV9X,517,Drop support for Python 3.6,9599,simonw,closed,0,,,,,1,2022-12-10T01:23:31Z,2022-12-10T01:36:36Z,2022-12-10T01:36:36Z,OWNER,,"CI has started failing for Python 3.6: https://github.com/simonw/sqlite-utils/actions/runs/3576322798 It's fixable by swiching away from `ubuntu-latest` according to: - https://github.com/actions/setup-python/issues/355#issuecomment-1335042510 But https://endoflife.date/python says that 3.6 end of life was almost 6 years ago, and end of security support nearly 1 year ago. So I'm OK dropping support entirely - Python 3.6 users will still be able to install version 3.30, just not any releases that come next.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/517/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1434094365,I_kwDOBm6k_c5Veosd,1881,Tool for simulating permission checks against actors,9599,simonw,closed,0,,,,,9,2022-11-03T04:43:20Z,2022-12-09T01:38:21Z,2022-11-04T00:13:05Z,OWNER,,"In working on this issue: - #1855 I realized that if I'm going to make actors more complicated (the proposed `_r` key for additional restricted permissions) I really need an interactive tool for simulating these checks, similar to the https://latest.datasette.io/-/allow-debug tool.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1881/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1473664029,PR_kwDOBm6k_c5ELz0u,1930,Typo in JSON API `Updating a row` documentation,3556,davidbgk,closed,0,,,,,2,2022-12-03T02:22:31Z,2022-12-08T21:12:35Z,2022-12-08T21:12:35Z,CONTRIBUTOR,simonw/datasette/pulls/1930," ---- :books: Documentation preview :books:: https://datasette--1930.org.readthedocs.build/en/1930/ ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1930/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1481875485,PR_kwDOBm6k_c5EouZs,1935,Bump furo from 2022.9.29 to 2022.12.7,49699333,dependabot[bot],closed,0,,,,,1,2022-12-07T13:02:57Z,2022-12-08T21:12:08Z,2022-12-08T21:12:07Z,CONTRIBUTOR,simonw/datasette/pulls/1935,"Bumps [furo](https://github.com/pradyunsg/furo) from 2022.9.29 to 2022.12.7.
Changelog

Sourced from furo's changelog.

Changelog

2022.12.07 -- Reverent Raspberry

  • ✨ Add support for Sphinx 6.
  • ✨ Improve footnote presentation with docutils 0.18+.
  • Drop support for Sphinx 4.
  • Improve documentation about what the edit button does.
  • Improve handling of empty-flexboxes for better print experience on Chrome.
  • Improve styling for inline signatures.
  • Replace the meta generator tag with a comment.
  • Tweak labels with icons to prevent users selecting icons as text on touch.

2022.09.29 -- Quaint Quartz

  • Add ability to set arbitrary URLs for edit button.
  • Add support for aligning text in MyST-parser generated tables.

2022.09.15 -- Pragmatic Pistachio

  • Add a minimum version constraint on pygments.
  • Add an explicit dependency on sass.
  • Change right sidebar title from "Contents" to "On this page".
  • Correctly position sidebars on small screens.
  • Correctly select only Furo's own svg in related pages nav.
  • Make numpy-style documentation headers consistent.
  • Retitle the reference section.
  • Update npm dependencies.

2022.06.21 -- Opulent Opal

  • Fix docutils <= 0.17.x compatibility.
  • Bump to the latest Node.js LTS.

2022.06.04.1 -- Naughty Nickel bugfix

  • Fix the URL used in the "Edit this page" for Read the Docs builds.

2022.06.04 -- Naughty Nickel

... (truncated)

Commits

[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=furo&package-manager=pip&previous-version=2022.9.29&new-version=2022.12.7)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores) Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`. [//]: # (dependabot-automerge-start) [//]: # (dependabot-automerge-end) ---
Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR: - `@dependabot rebase` will rebase this PR - `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it - `@dependabot merge` will merge this PR after your CI passes on it - `@dependabot squash and merge` will squash and merge this PR after your CI passes on it - `@dependabot cancel merge` will cancel a previously requested merge and block automerging - `@dependabot reopen` will reopen this PR if it is closed - `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually - `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
---- :books: Documentation preview :books:: https://datasette--1935.org.readthedocs.build/en/1935/ ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1935/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1432013704,I_kwDOBm6k_c5VWsuI,1878,/db/table/-/upsert API,9599,simonw,closed,0,,,8711695, Datasette 1.0a2,8,2022-11-01T20:01:18Z,2022-12-08T01:12:18Z,2022-12-08T01:12:17Z,OWNER,,Equivalent to `sqlite-utils upsert`: https://sqlite-utils.datasette.io/en/stable/python-api.html#upserting-data,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1878/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1473814539,PR_kwDOBm6k_c5EMVug,1931,/db/table/-/upsert,9599,simonw,closed,0,,,8711695, Datasette 1.0a2,8,2022-12-03T07:01:44Z,2022-12-08T01:12:17Z,2022-12-08T01:12:16Z,OWNER,simonw/datasette/pulls/1931,"Refs #1878 Still todo: - [x] Support `""return"": true` properly for upserts (with tests) - [x] Require both `insert-row` and `update-row` permissions - [x] Tests are going to need to cover both rowid-only and compound primary key tables, including all of the error states - [x] Documentation ---- :books: Documentation preview :books:: https://datasette--1931.org.readthedocs.build/en/1931/ ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1931/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1473659191,I_kwDOBm6k_c5X1kE3,1929,Incorrect link from the API explorer to the JSON API documentation,3556,davidbgk,closed,0,,,,,4,2022-12-03T02:08:58Z,2022-12-06T19:36:23Z,2022-12-06T19:34:20Z,CONTRIBUTOR,,"I installed `datasette==1.0a1`. When I go to http://127.0.0.1:8001/-/api I have a link: `Use this tool to try out the [Datasette API](https://docs.datasette.io/en/1.0a1/json_api.html).` but that documentation page does not exist. I'm not sure where it has to be fixed, should it link to the stable page https://docs.datasette.io/en/stable/json_api.html , the latest one https://docs.datasette.io/en/latest/json_api.html#the-json-write-api or would it be more appropriated to deploy documentation for the `1.0a1` version?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1929/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1473481262,I_kwDOBm6k_c5X04ou,1928,Hacker News Datasette write demo,9599,simonw,closed,0,,,,,7,2022-12-02T21:17:41Z,2022-12-02T23:47:11Z,2022-12-02T21:43:19Z,OWNER,,"Idea is to have my existing scraper at https://github.com/simonw/scrape-hacker-news-by-domain also write to my private Datasette Cloud account, then create an atom feed from it. 
",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1928/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1471969984,I_kwDOBm6k_c5XvHrA,1926,Release notes for 1.0a1 (and release it),9599,simonw,closed,0,,,7867486,Datasette 1.0a1,1,2022-12-01T21:18:12Z,2022-12-01T22:06:13Z,2022-12-01T22:06:12Z,OWNER,,"Mainly CORS support and a few small bug fixes. Changes: https://github.com/simonw/datasette/compare/1.0a0...99da46f7258225fc6fd8e94ddc20859ccccc4109",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1926/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1214859703,I_kwDOBm6k_c5IaUm3,1719,Refactor `RowView` and remove `RowTableShared`,9599,simonw,closed,0,,,,,3,2022-04-25T18:06:24Z,2022-12-01T21:15:19Z,2022-04-25T18:33:44Z,OWNER,,"> The `RowTableShared` class is making this a whole lot more complicated. > > I'm going to split the `RowView` view out into an entirely separate `views/row.py` module. _Originally posted by @simonw in https://github.com/simonw/datasette/issues/1715#issuecomment-1108875068_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1719/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1215174094,I_kwDOBm6k_c5IbhXO,1720,Design plugin hook for extras,9599,simonw,closed,0,,,,,14,2022-04-26T00:08:10Z,2022-12-01T21:15:19Z,2022-04-26T20:20:27Z,OWNER,,"Refs: - #262 - #1709 I realized that this is a really natural plugin hook - and if I design it as a hook I can implement Datasette's core extras as default plugins.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1720/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1212823665,I_kwDOBm6k_c5ISjhx,1715,Refactor TableView to use asyncinject,9599,simonw,closed,0,,,,,13,2022-04-22T21:43:39Z,2022-12-01T21:15:18Z,2022-04-28T22:26:56Z,OWNER,,"I've been working on a dependency injection mechanism in a separate library: - https://github.com/simonw/asyncinject I think it's ready to try out with Datasette to see if it's a pattern that will work here. I'm going to attempt to refactor `TableView` to use it. There are two overall goals here: - Use `asyncinject` to add parallel execution of some aspects of the table page - most notably I want to be able to execute the `count(*)` query, the `select ...` query, the various faceting queries and the facet suggestion queries in parallel - and measure if doing so is good for performance. - Use it to execute different output formats (possibly with some changes to the existing `register_output_renderer()` plugin hook). I want CSV and JSON to use the same mechanism that plugins use. 
Stretch goal is to get this working with streaming data too, see: - #1101",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1715/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1469973742,I_kwDOBm6k_c5XngTu,1922,Make sure CORS works for write APIs,9599,simonw,closed,0,,,7867486,Datasette 1.0a1,13,2022-11-30T17:15:55Z,2022-12-01T18:47:00Z,2022-12-01T18:47:00Z,OWNER,,"Split from: - #1850",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1922/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1470509936,I_kwDOBm6k_c5XpjNw,1924,Docs for replace:true and ignore:true options for insert API,9599,simonw,closed,0,,,7867486,Datasette 1.0a1,4,2022-12-01T01:33:25Z,2022-12-01T18:15:15Z,2022-12-01T02:08:02Z,OWNER,,Equivalent to https://sqlite-utils.datasette.io/en/stable/cli.html#insert-replacing-data,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1924/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1470320227,I_kwDOBm6k_c5Xo05j,1923,latest.datasette.io Cloud Run deploys failing,9599,simonw,closed,0,,,,,3,2022-11-30T22:49:34Z,2022-11-30T23:04:56Z,2022-11-30T23:04:56Z,OWNER,,"https://github.com/simonw/datasette/actions/runs/3587402085/jobs/6038106719v ``` Warning: ""service_account_key"" has been deprecated. Please switch to using google-github-actions/auth which supports both Workload Identity Federation and Service Account Key JSON authentication. For more details, see https://github.com/google-github-actions/setup-gcloud#authorization Error: google-github-actions/setup-gcloud failed with: failed to execute command `gcloud --quiet auth activate-service-account *** --key-file -`: /opt/hostedtoolcache/gcloud/275.0.0/x64/lib/googlecloudsdk/core/console/console_io.py:544: SyntaxWarning: ""is"" with a literal. Did you mean ""==""? if answer is None or (answer is '' and default is not None): ERROR: gcloud failed to load: module 'collections' has no attribute 'MutableMapping' ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1923/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1469044738,I_kwDOBm6k_c5Xj9gC,1918,API explorer should list mutable databases first,9599,simonw,closed,0,,,7867486,Datasette 1.0a1,1,2022-11-30T04:53:33Z,2022-11-30T05:22:07Z,2022-11-30T05:07:56Z,OWNER,,"https://latest.datasette.io/-/api hides `ephemeral` down at the bottom, would be more interesting if it was at the top. 
Related: - #1915 ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1918/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1469015001,I_kwDOBm6k_c5Xj2PZ,1916,GET requests against POST endpoints should not 500 error,9599,simonw,closed,0,,,7867486,Datasette 1.0a1,1,2022-11-30T04:04:43Z,2022-11-30T05:15:19Z,2022-11-30T05:15:19Z,OWNER,,"![CF37BA4D-0677-4DDD-A339-EAF163BB63B7](https://user-images.githubusercontent.com/9599/204705025-6f88e9f7-757d-45e8-a89c-ab97e84781e8.png) ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1916/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1469043836,I_kwDOBm6k_c5Xj9R8,1917,Don't allow writable API to edit the `_memory` database,9599,simonw,closed,0,,,7867486,Datasette 1.0a1,2,2022-11-30T04:51:59Z,2022-11-30T05:07:56Z,2022-11-30T05:07:55Z,OWNER,,"It shows up on https://latest.datasette.io/-/api (once you are signed in as root) - but there's no point in creating tables in it because they likely won't persist from one request to the next, as it's not a shared named database. ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1917/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1468709531,I_kwDOBm6k_c5Xirqb,1915,Interactive demo of Datasette 1.0 write APIs,9599,simonw,closed,0,,,,,6,2022-11-29T21:16:03Z,2022-11-30T04:05:46Z,2022-11-30T04:05:46Z,OWNER,,"I'm going to try to get this working on https://latest.datasette.io/ - it already has a way for people to sign in as root, but none of the databases there are writable. So I'm going to build a plugin which adds a writable named in-memory database. And some kind of mechanism for clearing out that database on a regular basis - maybe tables in that database get deleted automatically an hour after they are created? (Would be neat to display their time-left-until-deleted too)",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1915/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1421529723,I_kwDOBm6k_c5UutJ7,1850,Write API in Datasette core,9599,simonw,closed,0,,,8658075,Datasette 1.0a0,13,2022-10-24T22:13:24Z,2022-11-29T20:11:20Z,2022-11-29T20:11:20Z,OWNER,,"I need this for Datasette Cloud, and in thinking it through I realized that it's really time Datasette grew a default write API as well. I'm going to mostly model this off `sqlite-utils`, since I've spent a bunch of time iterating on a pseudo-JSON API for that over the past few years (piping JSON to stdin etc). I want this for Datasette 1.0. I'm going to be building it in the new [1.0-dev](https://github.com/simonw/datasette/tree/1.0-dev) branch, which is automatically deployed to https://latest-1-0-dev.datasette.io/ running on Cloud Run. 
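As context, a minimal sketch of what a write call to this API could look like, assuming the `POST /db/table/-/insert` shape quoted in #1863 below (the base URL, database/table names and token are placeholders):

```python
# Sketch of a client call against the write API, following the
# POST /db/table/-/insert design described in #1863. Placeholders throughout.
import httpx

response = httpx.post(
    'https://latest-1-0-dev.datasette.io/data/docs/-/insert',
    json={'row': {'id': 1, 'title': 'Example'}},
    headers={'Authorization': 'Bearer xxx'},
)
print(response.status_code, response.json())
```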
API features to build: - [x] #1852 - [x] #1856 - [x] #1857 - [x] #1858 - [x] #1859 - [x] #1871 - [x] #1888 - [x] #1868 - [x] #1851 - [x] #1863 - [x] #1864 - [x] #1866 - [x] https://github.com/simonw/datasette/issues/1882 - [x] #1862 - [x] #1874 - [x] https://github.com/simonw/datasette/issues/1887 - [x] #1877 Bumped to later on: - #1855 - #1878 - #1873 - #1875 - Make sure CORS works - https://github.com/simonw/datasette/issues/1889 - Alter a table - `sqlite-utils transform` style (more powerful than straight ALTER) - Execute SQL against a write connection - Maybe even multiple write SQL statements bundled in a single transaction - https://github.com/simonw/datasette/issues/1867 ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1850/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1468603401,I_kwDOBm6k_c5XiRwJ,1913,Release Datasette 1.0a0,9599,simonw,closed,0,,,8658075,Datasette 1.0a0,9,2022-11-29T19:41:42Z,2022-11-29T20:10:35Z,2022-11-29T20:10:35Z,OWNER,,"I attempted the release just now - https://github.com/simonw/datasette/releases/tag/1.0a0 - and got an unexpected test failure: https://github.com/simonw/datasette/actions/runs/3577355358/attempts/1 ``` > assert delete_response.status_code == 200 E assert 404 == 200 E + where 404 = .status_code /home/runner/work/datasette/datasette/tests/test_api_write.py:396: AssertionError =========================== short test summary info ============================ FAILED tests/test_api_write.py::test_delete_row[compound_pk_table-row_for_create2-pks2-article,k] - assert 404 == 200 + where 404 = .status_code ``` I hit ""retry"" on that test but I expect it to fail again. ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1913/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1432012302,I_kwDOBm6k_c5VWsYO,1877,Refactor and tidy up final write API code,9599,simonw,closed,0,,,,,1,2022-11-01T20:00:11Z,2022-11-29T19:44:16Z,2022-11-29T19:44:07Z,OWNER,,"- `views/table.py` has got a bit too big - I think the write classes should be pulled out into a separate module. - [x] There's duplicate logic for deciding if the table and database exist and checking permissions",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1877/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1450312343,I_kwDOBm6k_c5WcgKX,1892,Merge 1.0-dev branch back to main,9599,simonw,closed,0,,,8658075,Datasette 1.0a0,3,2022-11-15T20:04:25Z,2022-11-29T19:40:23Z,2022-11-29T19:40:23Z,OWNER,,"I'm committed enough to the 1.0 work now that I'm ready for the `main` branch to reflect that instead. 
If I need to make any dot-releases against 0.63 I can do those from a branch.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1892/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1468592292,PR_kwDOBm6k_c5D6nzE,1912,Merge 1.0-dev (with initial write API) back into main,9599,simonw,closed,0,,,,,1,2022-11-29T19:31:21Z,2022-11-29T19:39:37Z,2022-11-29T19:39:36Z,OWNER,simonw/datasette/pulls/1912,"See: - #1892 ---- :books: Documentation preview :books:: https://datasette--1912.org.readthedocs.build/en/1912/ ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1912/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1450303205,I_kwDOBm6k_c5Wcd7l,1891,1.0a0 release notes,9599,simonw,closed,0,,,8658075,Datasette 1.0a0,4,2022-11-15T19:58:20Z,2022-11-29T19:23:41Z,2022-11-29T19:23:41Z,OWNER,,"This release will mainly help preview the new Datasette write API: - #1850",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1891/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1425029275,I_kwDOBm6k_c5U8Dib,1864,Delete a single record from an existing table,9599,simonw,closed,0,,,8658075,Datasette 1.0a0,4,2022-10-27T04:53:22Z,2022-11-29T18:54:04Z,2022-11-29T18:54:04Z,OWNER,,"API design: ``` POST /db/table/row-pks/-/delete Or... DELETE /db/table/row-pks/-/delete ``` I'm just going to do `POST` for the moment, like I did here: - #1874 Permission: `delete-row` Still needed: - [ ] Tests for rowid tables - [ ] Tests for compound primary keys",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1864/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1468519699,I_kwDOBm6k_c5Xh9UT,1911,`/db/-/create` should support creating tables with compound primary keys,9599,simonw,closed,0,,,8658075,Datasette 1.0a0,2,2022-11-29T18:30:47Z,2022-11-29T18:50:58Z,2022-11-29T18:48:05Z,OWNER,,"Found myself needing this to write the tests for: - #1864 ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1911/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1425029242,I_kwDOBm6k_c5U8Dh6,1863,Update a single record in an existing table,9599,simonw,closed,0,,,8658075,Datasette 1.0a0,16,2022-10-27T04:53:17Z,2022-11-29T18:08:53Z,2022-11-29T18:06:37Z,OWNER,,"API design: ``` POST /db/table/row-pks/-/update { ""field"": ""updated_value"" } ``` Only the fields that you pass will be updated. Maybe this is the wrong design though? The design for insert currently looks like this: - https://github.com/simonw/datasette/issues/1851#issuecomment-1294224185 ``` POST /db/table/-/insert Authorization: Bearer xxx Content-Type: application/json { ""row"": { ""id"": 1, ""name"": ""New name"" } } ``` I could use the same format for `/-/update`, but in this case the API doesn't require you to pass every field so `""row""` doesn't seem like the right key. 
I think I'll go with this: ``` POST /db/table/1/-/update Authorization: Bearer xxx Content-Type: application/json { ""update"": { ""name"": ""New name"" } } ``` The benefit of having an `""update""` key is that it allows me to use other keys in the future. Maybe a `""alter"": true` key to indicate that new columns should be added if they are missing.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1863/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1456012874,I_kwDOBm6k_c5WyP5K,1905,`publish heroku` failing due to old Python version,9599,simonw,closed,0,,,,,4,2022-11-19T00:01:45Z,2022-11-19T01:12:05Z,2022-11-19T00:52:29Z,OWNER,,"Reported on Discord: https://discord.com/channels/823971286308356157/823971286941302908/1042814317118115901 ``` -----> Building on the Heroku-22 stack -----> Determining which buildpack to use for this app -----> Python app detected -----> Using Python version specified in runtime.txt ! Requested runtime 'python-3.8.10' is not available for this stack (heroku-22). ! For supported versions, see: https://devcenter.heroku.com/articles/python-support ! Push rejected, failed to compile Python app. ! Push failed ▸ Build failed ``` ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1905/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1455932972,I_kwDOBm6k_c5Wx8Ys,1904,Datasette Lite tests failing due to httpx upgrade,9599,simonw,closed,0,,,8658075,Datasette 1.0a0,0,2022-11-18T22:49:31Z,2022-11-18T22:57:48Z,2022-11-18T22:52:22Z,OWNER,,"Same problem as this one: - https://github.com/simonw/datasette-lite/issues/56 Caused this failure: https://github.com/simonw/datasette/actions/runs/3500765964",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1904/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1452364777,I_kwDOBm6k_c5WkVPp,1896,Extract logic for resolving a URL to a database / table / row,9599,simonw,closed,0,,,8658075,Datasette 1.0a0,4,2022-11-16T22:25:20Z,2022-11-18T22:57:47Z,2022-11-18T22:56:55Z,OWNER,,"> In trying to write this I realize that there's a lot of duplicated code with delete row, specifically around resolving the incoming URL into a row (or a database or a table). > > Since this is so common, I think it's worth extracting the logic out first. _Originally posted by @simonw in https://github.com/simonw/datasette/issues/1863#issuecomment-1317755263_ ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1896/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1434911255,I_kwDOCGYnMM5VhwIX,510,Cannot enable FTS5 despite it being available,1176293,ar-jan,closed,0,,,,,3,2022-11-03T16:03:49Z,2022-11-18T18:37:52Z,2022-11-17T10:36:28Z,NONE,,"When I do `sqlite-utils enable-fts my.db table_name column_name` (with or without `--fts5`), I get an FTS4 virtual table instead of the expected FTS5. FTS5 is however available and Python/SQLite versions do not seem to be the issue. 
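A minimal sketch of creating the FTS5 table by hand (as described next), assuming the placeholder `my.db` / `table_name` / `column_name` names from the command above:

```python
# Sketch: create an FTS5 virtual table by hand and populate it from the
# existing table, using the placeholder names from the command above.
import sqlite3

conn = sqlite3.connect('my.db')
conn.executescript(
    '''
    CREATE VIRTUAL TABLE table_name_fts USING fts5(
        column_name,
        content='table_name',
        content_rowid='rowid'
    );
    INSERT INTO table_name_fts (rowid, column_name)
        SELECT rowid, column_name FROM table_name;
    '''
)
conn.commit()
conn.close()
```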
I can manually create the FTS5 virtual table, and then Datasette also works with it from this same Python environment. `>>> sqlite3.version` `2.6.0` `>>> sqlite3.sqlite_version` `3.39.4` `PRAGMA compile_options;` includes `ENABLE_FTS5`. `sqlite-utils, version 3.30`. Any ideas what's happening and how to fix?",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/510/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1450796965,I_kwDOBm6k_c5WeWel,1894,Initialize CodeMirror during DOMContentLoaded instead of onload,95570,bgrins,closed,0,,,,,0,2022-11-16T03:52:19Z,2022-11-18T07:29:02Z,2022-11-18T07:29:02Z,CONTRIBUTOR,,As per https://github.com/simonw/datasette/pull/1893/files#r1023248927 this should prevent a flash between the textarea being replaced by CodeMirror.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1894/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1452485922,PR_kwDOBm6k_c5DEh-E,1898,Use DOMContentLoaded instead of load event for CodeMirror initialization,95570,bgrins,closed,0,,,,,2,2022-11-17T00:19:21Z,2022-11-18T07:29:01Z,2022-11-18T07:29:01Z,CONTRIBUTOR,simonw/datasette/pulls/1898," Closes #1894 ---- :books: Documentation preview :books:: https://datasette--1898.org.readthedocs.build/en/1898/ ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1898/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1452495049,I_kwDOBm6k_c5Wk1DJ,1899,Clicking within the CodeMirror area below the SQL (i.e. when there's only a single line) doesn't cause the editor to get focused ,95570,bgrins,closed,0,,,,,4,2022-11-17T00:29:52Z,2022-11-18T07:28:28Z,2022-11-18T07:20:53Z,CONTRIBUTOR,,"After the upgrade to 6 (#1893) I noticed this. I think it's because we're doing overflow:hidden to accomplish the CSS resizer. When there's a single line of SQL there's a gap below that line where clicking doesn't do anything. 
It should focus at the end of the line.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1899/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1453813400,I_kwDOBm6k_c5Wp26Y,1901,"Some plugins show ""home"" breadcrumbs twice in the top left",95570,bgrins,closed,0,,,,,8,2022-11-17T18:44:58Z,2022-11-18T07:22:37Z,2022-11-18T07:02:56Z,CONTRIBUTOR,," ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1901/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1452457263,I_kwDOBm6k_c5Wkr0v,1897,Serve schema JSON to the SQL editor to enable autocomplete,9599,simonw,closed,0,,,,,9,2022-11-16T23:47:45Z,2022-11-18T05:33:20Z,2022-11-18T02:54:43Z,OWNER,,"See: - https://github.com/simonw/datasette/issues/1893#issuecomment-1317831555 ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1897/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1430563092,PR_kwDOCGYnMM5B6_6K,508,Allow surrogates in parameters,7908073,chapmanjacobd,closed,0,,,,,2,2022-10-31T22:11:49Z,2022-11-17T15:11:16Z,2022-10-31T22:55:36Z,CONTRIBUTOR,simonw/sqlite-utils/pulls/508,"closes #507 https://dwheeler.com/essays/fixing-unix-linux-filenames.html ---- :books: Documentation preview :books:: https://sqlite-utils--508.org.readthedocs.build/en/508/ ",140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/508/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1448143294,I_kwDOBm6k_c5WUOm-,1890,Autocomplete text entry for filter values that correspond to facets,536941,fgregg,closed,0,,,,,16,2022-11-14T14:11:31Z,2022-11-17T00:47:36Z,2022-11-16T03:23:01Z,CONTRIBUTOR,,"datasette allows users to enter in the value for named parameters into a free-text form field. I think it would add a lot of usability, if the form field could be a drop down of options when query value is already a faceted column.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1890/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1450363982,PR_kwDOBm6k_c5C9ZuP,1893,"Upgrade to CodeMirror 6, add SQL autocomplete",95570,bgrins,closed,0,,,,,48,2022-11-15T20:52:35Z,2022-11-16T23:54:02Z,2022-11-16T23:49:06Z,CONTRIBUTOR,simonw/datasette/pulls/1893,"In an effort to get closer to table / column autocomplete I took a shot at https://github.com/simonw/datasette/issues/1796. I haven't done a lot of testing but would be curious if this fixes some of the concerns raised in https://github.com/simonw/datasette/issues/1796#issue-1355148385 for example. 
Done: * Changed to bundling using rollup as per https://codemirror.net/examples/bundle/ * Restored a fromTextArea-like function from https://codemirror.net/docs/migration/ * Removed old JS and CSS files (no external CSS needed anymore as per https://codemirror.net/examples/styling/) * Updated instructions for building the bundle Not done: * cmResize had an error, so commented out the resize handle * Add extraKeys option for shift+enter and tab ---- :books: Documentation preview :books:: https://datasette--1893.org.readthedocs.build/en/1893/ ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1893/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1450952393,I_kwDOCGYnMM5We8bJ,512,mypy failures in CI,9599,simonw,closed,0,,,,,3,2022-11-16T06:22:48Z,2022-11-16T07:49:51Z,2022-11-16T07:49:50Z,OWNER,,"https://github.com/simonw/sqlite-utils/actions/runs/3472012235 failed on Python 3.11: Truncated output: ``` sqlite_utils/db.py:2467: note: PEP 484 prohibits implicit Optional. Accordingly, mypy has changed its default to no_implicit_optional=True sqlite_utils/db.py:2467: note: Use https://github.com/hauntsaninja/no_implicit_optional to automatically upgrade your codebase sqlite_utils/db.py:2530: error: Incompatible default for argument ""where"" (default has type ""None"", argument has type ""str"") [assignment] sqlite_utils/db.py:2530: note: PEP 484 prohibits implicit Optional. Accordingly, mypy has changed its default to no_implicit_optional=True sqlite_utils/db.py:2530: note: Use https://github.com/hauntsaninja/no_implicit_optional to automatically upgrade your codebase sqlite_utils/db.py:2658: error: Argument 1 to ""count_where"" of ""Queryable"" has incompatible type ""Optional[str]""; expected ""str"" [arg-type] Found 23 errors in 1 file (checked 51 source files) ``` Best look at https://github.com/hauntsaninja/no_implicit_optional",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/512/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1447388809,I_kwDOBm6k_c5WRWaJ,1887,Add a confirm step to the drop table API,9599,simonw,closed,0,,,8658075,Datasette 1.0a0,2,2022-11-14T04:59:53Z,2022-11-15T19:59:59Z,2022-11-14T05:18:51Z,OWNER,,"> In playing with the API explorer just now I realized it's way too easy to accidentally drop a table using it. 
_Originally posted by @simonw in https://github.com/simonw/datasette/issues/1871#issuecomment-1313097057_ Added drop table API in: - #1874",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1887/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1429030341,I_kwDOBm6k_c5VLUXF,1874,API to drop a table,9599,simonw,closed,0,,,8658075,Datasette 1.0a0,4,2022-10-30T21:55:11Z,2022-11-15T19:59:53Z,2022-11-14T05:45:06Z,OWNER,,"`POST /db/table/-/drop` Require `drop-table` permission.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1874/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1425011030,I_kwDOBm6k_c5U7_FW,1862,"Create a new table from one or more records, `sqlite-utils` style",9599,simonw,closed,0,,,8658075,Datasette 1.0a0,5,2022-10-27T04:25:02Z,2022-11-15T19:59:47Z,2022-11-15T06:42:09Z,OWNER,,"It's interesting to also think about what the form-based UI for this could look like - since that would involve users creating new columns of different types on the fly. Will need the `create-table` permission.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1862/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1435294468,I_kwDOBm6k_c5VjNsE,1882,`/db/-/create` API for creating tables,9599,simonw,closed,0,,,8658075,Datasette 1.0a0,12,2022-11-03T21:44:32Z,2022-11-15T19:59:43Z,2022-11-15T06:00:41Z,OWNER,,"> It really feels like this should be accompanied by a `/db/-/create` API for creating tables. 
I had to add that to `sqlite-utils` eventually (initially it only supported creating by passing in an example document): > > https://sqlite-utils.datasette.io/en/stable/cli.html#cli-create-table _Originally posted by @simonw in https://github.com/simonw/datasette/issues/1862#issuecomment-1299073433_ ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1882/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1426001541,I_kwDOBm6k_c5U_w6F,1866,API for bulk inserting records into a table,9599,simonw,closed,0,,,8658075,Datasette 1.0a0,12,2022-10-27T17:19:25Z,2022-11-15T19:59:34Z,2022-10-30T06:04:07Z,OWNER,,"Similar to https://github.com/simonw/datasette-insert/blob/0.8/README.md#inserting-data-and-creating-tables I expect this to become by far the most common way that data gets into a Datasette instance - more so than the individual row API in: - #1851 ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1866/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1421544654,I_kwDOBm6k_c5UuwzO,1851,API to insert a single record into an existing table,9599,simonw,closed,0,,,8658075,Datasette 1.0a0,22,2022-10-24T22:24:21Z,2022-11-15T19:59:18Z,2022-10-28T00:59:25Z,OWNER,,Controlled by a new `insert-row` permission.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1851/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1426195437,I_kwDOBm6k_c5VAgPt,1868,Design URLs for the write API,9599,simonw,closed,0,,,8658075,Datasette 1.0a0,5,2022-10-27T19:55:30Z,2022-11-15T19:59:14Z,2022-10-27T20:07:01Z,OWNER,,"My original design for this issue: - #1851 Was `POST /db/table` with JSON of `{""insert"": {...}}`.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1868/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1447439985,I_kwDOBm6k_c5WRi5x,1888,API explorer should take immutability into account,9599,simonw,closed,0,,,8658075,Datasette 1.0a0,1,2022-11-14T06:00:14Z,2022-11-15T19:59:10Z,2022-11-14T06:04:48Z,OWNER,,"Refs: - #1871 I noticed the API explorer doesn't show any links on https://latest-1-0-dev.datasette.io/-/api because the `fixtures` database is immutable. 
It should still show read examples there.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1888/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1427293909,I_kwDOBm6k_c5VEsbV,1871,API explorer tool,9599,simonw,closed,0,,,8658075,Datasette 1.0a0,24,2022-10-28T13:49:11Z,2022-11-15T19:59:05Z,2022-11-14T04:59:59Z,OWNER,,The API will be much easier to develop if there's a page that helps you execute JSON POSTs against it.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1871/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1423369494,I_kwDOBm6k_c5U1uUW,1859,datasette create-token CLI command,9599,simonw,closed,0,,,8658075,Datasette 1.0a0,3,2022-10-26T03:12:59Z,2022-11-15T19:59:00Z,2022-10-26T04:31:39Z,OWNER,,The CLI equivalent of the `/-/create-token` page.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1859/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1423364990,I_kwDOBm6k_c5U1tN-,1858,`max_signed_tokens_ttl` setting for a maximum duration on API tokens,9599,simonw,closed,0,,,8658075,Datasette 1.0a0,4,2022-10-26T03:05:53Z,2022-11-15T19:58:52Z,2022-10-27T03:15:05Z,OWNER,,"It's currently possible to use `/-/create-token` to create a token that lasts forever. Some administrators may wish to have a maximum expiry instead. I should support that with a setting.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1858/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1423347412,I_kwDOBm6k_c5U1o7U,1857,Prevent API tokens from using /-/create-token to create more tokens,9599,simonw,closed,0,,,8658075,Datasette 1.0a0,1,2022-10-26T02:38:09Z,2022-11-15T19:57:11Z,2022-10-26T02:57:26Z,OWNER,,"> It strikes me that users should NOT be able to use a token to create additional tokens. > > The current design actually does allow that, since the `dstok_` Bearer token can be used to authenticate calls to the `/-/create-token` page. > > So I think I need a mechanism whereby that page can only allow access to users authenticated by cookie. > > Not obvious how to do that though, since Datasette's authentication actor system is designed to abstract that detail away! 
_Originally posted by @simonw in https://github.com/simonw/datasette/issues/1850#issuecomment-1291417100_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1857/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1423336122,I_kwDOBm6k_c5U1mK6,1856,allow_signed_tokens setting for disabling API signed token mechanism,9599,simonw,closed,0,,,8658075,Datasette 1.0a0,3,2022-10-26T02:20:55Z,2022-11-15T19:57:05Z,2022-10-26T02:58:35Z,OWNER,,"Had some design thoughts here: https://github.com/simonw/datasette/issues/1852#issuecomment-1291272280 I liked this option the most: --setting allow_create_tokens off",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1856/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1421552095,I_kwDOBm6k_c5Uuynf,1852,Default API token authentication mechanism,9599,simonw,closed,0,,,8658075,Datasette 1.0a0,30,2022-10-24T22:31:07Z,2022-11-15T19:57:00Z,2022-10-26T02:19:54Z,OWNER,,"API authentication will be via `Authorization: Bearer XXX` request headers. I'm inclined to add a default token mechanism to Datasette based on tokens that are signed with the `DATASETTE_SECRET`. Maybe the root user can access `/-/create-token` which provides a UI for generating a time-limited signed token? Could also have a `datasette token` command for creating such tokens at the command-line. Plugins can then define alternative ways of creating tokens, such as the existing https://datasette.io/plugins/datasette-auth-tokens plugin. _Originally posted by @simonw in https://github.com/simonw/datasette/issues/1850#issuecomment-1289706439_ ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1852/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1435917503,I_kwDOBm6k_c5Vlly_,1883,Errors when using table filters behind a proxy,31312775,mattmalcher,closed,0,,,,,13,2022-11-04T11:18:47Z,2022-11-11T09:20:22Z,2022-11-11T06:54:58Z,NONE,,"Using datasette==0.63 table filters do not respect the `base_url` setting as described [here](https://docs.datasette.io/en/stable/deploying.html#running-datasette-behind-a-proxy) To reproduce, go to: https://datasette-apache-proxy-demo.datasette.io/prefix/fixtures/binary_data Then use the table filter buttons. The `/prefix/` is dropped, resulting in URL not found: https://datasette-apache-proxy-demo.datasette.io/fixtures/binary_data?_sort=rowid&rowid__exact=1 ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1883/reactions"", ""total_count"": 1, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 1}",,completed 1436539554,I_kwDOCGYnMM5Vn9qi,511,"[insert_all, upsert_all] IntegrityError: constraint failed",7908073,chapmanjacobd,closed,0,,,,,2,2022-11-04T19:21:48Z,2022-11-04T22:59:54Z,2022-11-04T22:54:09Z,CONTRIBUTOR,,"My understand is that `INSERT OR IGNORE` will ignore when inserts would cause duplicate keys so I'm not sure exactly why the error is raised from `sqlite3`. 
``` import argparse from pathlib import Path from xklb import db, utils from xklb.utils import log def parse_args() -> argparse.Namespace: parser = argparse.ArgumentParser() parser.add_argument(""database"") parser.add_argument(""dbs"", nargs=""*"") parser.add_argument(""--upsert"") parser.add_argument(""--db"", ""-db"", help=argparse.SUPPRESS) parser.add_argument(""--verbose"", ""-v"", action=""count"", default=0) args = parser.parse_args() if args.db: args.database = args.db Path(args.database).touch() args.db = db.connect(args) log.info(utils.dict_filter_bool(args.__dict__)) return args def merge_db(args, source_db): source_db = str(Path(source_db).resolve()) s_db = db.connect(argparse.Namespace(database=source_db, verbose=args.verbose)) for table in [s for s in s_db.table_names() if not ""_fts"" in s and not s.startswith(""sqlite_"")]: log.info(""[%s]: %s"", source_db, table) with s_db.conn: data = s_db[table].rows with args.db.conn: if args.upsert: args.db[table].upsert_all(data, pk=args.upsert.split("",""), alter=True) else: args.db[table].insert_all(data, alter=True, replace=True) def merge_dbs(): args = parse_args() for s_db in args.dbs: merge_db(args, s_db) if __name__ == ""__main__"": merge_dbs() ``` ``` $ lb-dev merge video.db tube_71.db --upsert path -vv SQL: INSERT OR IGNORE INTO [media]([path]) VALUES(?); - params: ['https://archive.org/details/088ghostofachanceroygetssackedrevengeofthelivinglunchdvdripxvidphz'] ... File ~/.local/lib/python3.10/site-packages/sqlite_utils/db.py:3122, in Table.insert_all(self, records, pk, foreign_keys, column_order, not_null, defaults, batch_size, hash_id, hash_id_columns, alter, ignore, replace, truncate, extracts, conversions, columns, upsert, analyze) 3116 all_columns += [ 3117 column for column in record if column not in all_columns 3118 ] 3120 first = False -> 3122 self.insert_chunk( 3123 alter, 3124 extracts, 3125 chunk, 3126 all_columns, 3127 hash_id, 3128 hash_id_columns, 3129 upsert, 3130 pk, 3131 conversions, 3132 num_records_processed, 3133 replace, 3134 ignore, 3135 ) 3137 if analyze: 3138 self.analyze() File ~/.local/lib/python3.10/site-packages/sqlite_utils/db.py:2887, in Table.insert_chunk(self, alter, extracts, chunk, all_columns, hash_id, hash_id_columns, upsert, pk, conversions, num_records_processed, replace, ignore) 2885 for query, params in queries_and_params: 2886 try: -> 2887 result = self.db.execute(query, params) 2888 except OperationalError as e: 2889 if alter and ("" column"" in e.args[0]): 2890 # Attempt to add any missing columns, then try again File ~/.local/lib/python3.10/site-packages/sqlite_utils/db.py:484, in Database.execute(self, sql, parameters) 482 self._tracer(sql, parameters) 483 if parameters is not None: --> 484 return self.conn.execute(sql, parameters) 485 else: 486 return self.conn.execute(sql) IntegrityError: constraint failed > /home/xk/.local/lib/python3.10/site-packages/sqlite_utils/db.py(484)execute() 482 self._tracer(sql, parameters) 483 if parameters is not None: --> 484 return self.conn.execute(sql, parameters) 485 else: 486 return self.conn.execute(sql) ``` ``` sqlite3 --version 3.36.0 2021-06-18 18:36:39 ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/511/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 473083260,MDU6SXNzdWU0NzMwODMyNjA=,50,"""Too many SQL variables"" on large 
inserts",9599,simonw,closed,0,,,,,4,2019-07-25T21:43:31Z,2022-11-04T14:38:36Z,2019-07-28T11:59:33Z,OWNER,,"Reported here: https://github.com/dogsheep/healthkit-to-sqlite/issues/9 It looks like there's a default limit of 999 variables - we need to be smart about that, maybe dynamically lower the batch size based on the number of columns.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/50/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1429029604,I_kwDOCGYnMM5VLULk,506,Make `cursor.rowcount` accessible (wontfix),9599,simonw,closed,0,,,,,3,2022-10-30T21:51:55Z,2022-11-01T17:37:47Z,2022-11-01T17:37:13Z,OWNER,,"In building this Datasette feature on top of `sqlite-utils` I thought it might be useful to expose the number of rows that had been affected by a bulk insert or update - the `cursor.rowcount`: - https://github.com/simonw/datasette/issues/1866 This isn't currently exposed by `sqlite-utils`.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/506/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1431786951,I_kwDOBm6k_c5VV1XH,1876,SQL query should wrap on SQL interrupted screen,9599,simonw,closed,0,,,,,2,2022-11-01T17:14:01Z,2022-11-01T17:22:33Z,2022-11-01T17:22:33Z,OWNER,,"Just saw this: ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1876/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1430325103,I_kwDOCGYnMM5VQQdv,507,conn.execute: UnicodeEncodeError: 'utf-8' codec can't encode character,7908073,chapmanjacobd,closed,0,,,,,1,2022-10-31T18:49:51Z,2022-11-01T00:40:17Z,2022-11-01T00:40:16Z,CONTRIBUTOR,,"I'm not really sure what caused this and it happened in the middle of my program (after running for 35775 seconds). ``` Extracting metadata 49.9% (chunk 9893 of 19831) ... File ""/home/xk/.local/lib/python3.10/site-packages/xklb/fs_extract.py"", line 90, in extract_chunk args.db[""media""].insert_all(utils.list_dict_filter_bool(media), pk=""path"", alter=True, replace=True) File ""/home/xk/.local/lib/python3.10/site-packages/sqlite_utils/db.py"", line 3107, in insert_all self.insert_chunk( File ""/home/xk/.local/lib/python3.10/site-packages/sqlite_utils/db.py"", line 2872, in insert_chunk result = self.db.execute(query, params) File ""/home/xk/.local/lib/python3.10/site-packages/sqlite_utils/db.py"", line 483, in execute return self.conn.execute(sql, parameters) UnicodeEncodeError: 'utf-8' codec can't encode character '\udcc3' in position 62: surrogates not allowed ``` This might be relevant: https://stackoverflow.com/questions/31898353/python-cant-encode-with-surrogateescape I'm going to try re-running with ```py def execute( self, sql: str, parameters: Optional[Union[Iterable, dict]] = None ) -> sqlite3.Cursor: """""" Execute SQL query and return a ``sqlite3.Cursor``. 
:param sql: SQL query to execute :param parameters: Parameters to use in that query - an iterable for ``where id = ?`` parameters, or a dictionary for ``where id = :id`` """""" try: if self._tracer: self._tracer(sql, parameters) if parameters is not None: return self.conn.execute(sql, parameters) else: return self.conn.execute(sql) except UnicodeEncodeError: sql = sql.encode('utf-8', 'surrogatepass').decode('utf-8') if parameters is not None: parameters = parameters.encode('utf-8', 'surrogatepass').decode('utf-8') return self.execute(sql, parameters) ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/507/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1428560020,I_kwDOBm6k_c5VJhiU,1872,"SITE-BUSTING ERROR: ""render_template() called before await ds.invoke_startup()""",192568,mroswell,closed,0,,,,,3,2022-10-30T02:28:39Z,2022-10-30T06:26:01Z,2022-10-30T06:26:01Z,CONTRIBUTOR,,"1. My https://list.saferdisinfectants.org/disinfectants/listN page (linked from https://SaferDisinfectants.org ) has been running beautifully for a year and a half, including a GitHub Actions workflow that's been routinely updating the database. 2. I received a recent report that the list page is down. I don't know when it went down, but the content is replaced with: ""render_template() called before await ds.invoke_startup()"" 3. The local datasette repo runs without incident. 4. The site is hosted on vercel, linked to my github repo. Perhaps some vercel changes were made, but not by anyone on our side. Here is a screenshot of the current project settings: Here a screenshot of the latest deployment status: This is my repository: https://github.com/mroswell/list-N (I notice: datasette==0.59 in my requirements.txt file) Because it's been long while since I actively worked on this or any other datasette project, I forget a lot of what I knew at one point. Perhaps some configuration file could be missing? Or perhaps I just need to know the right incantation to add to that vercel settings page. Help is welcome as the nonprofit org is soon hosting its annual conference, and we'd love to have the page working again. 
",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1872/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1426253476,I_kwDOBm6k_c5VAuak,1869,Release 0.63,9599,simonw,closed,0,,,,,3,2022-10-27T20:53:01Z,2022-10-27T22:24:38Z,2022-10-27T22:11:33Z,OWNER,,"Most of the release notes are already written: - https://github.com/simonw/datasette/releases/tag/0.63a0 - https://github.com/simonw/datasette/releases/tag/0.63a1",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1869/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1342430983,I_kwDOBm6k_c5QA98H,1786,Adjust height of textarea for no JS case,9599,simonw,closed,0,,,,,4,2022-08-18T01:15:15Z,2022-10-27T21:50:12Z,2022-08-18T16:06:09Z,OWNER,,"Datasette Lite: https://lite.datasette.io/?sql=https://gist.githubusercontent.com/simonw/1f8a91123ccefd8844187225b1832d7a/raw/5069075b86aa79358fbab3d4482d1d269077d632/recipes.sql#/data?sql=select+id%2C+name%2C+ingredients%2C+%28%0A++select+json_group_array%28value%29+from+json_each%28ingredients%29%0A++where+value+in+%28select+value+from+json_each%28%3Ap0%29%29%0A%29+as+matching_ingredients%0Afrom+recipes%0Awhere+json_array_length%28matching_ingredients%29+%3E+0%0Aorder+by+json_array_length%28matching_ingredients%29+desc&p0=%5B%22sugar%22%2C+%22cheese%22%5D ![46F8101E-8CE3-4F61-B200-F865E6B5DBCC](https://user-images.githubusercontent.com/9599/185270723-f55513b0-b561-434d-9d7c-4fe5be9756e0.jpeg) ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1786/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1401155623,PR_kwDOBm6k_c5AZLzm,1839,Bump black from 22.8.0 to 22.10.0,49699333,dependabot[bot],closed,0,,,,,1,2022-10-07T13:13:41Z,2022-10-27T20:51:46Z,2022-10-27T20:51:45Z,CONTRIBUTOR,simonw/datasette/pulls/1839,"Bumps [black](https://github.com/psf/black) from 22.8.0 to 22.10.0.
Release notes

Sourced from black's releases.

22.10.0

Highlights

  • Runtime support for Python 3.6 has been removed. Formatting 3.6 code will still be supported until further notice.

Stable style

  • Fix a crash when # fmt: on is used on a different block level than # fmt: off (#3281)

Preview style

  • Fix a crash when formatting some dicts with parenthesis-wrapped long string keys (#3262)

Configuration

  • .ipynb_checkpoints directories are now excluded by default (#3293)
  • Add --skip-source-first-line / -x option to ignore the first line of source code while formatting (#3299)

Packaging

  • Executables made with PyInstaller will no longer crash when formatting several files at once on macOS. Native x86-64 executables for macOS are available once again. (#3275)
  • Hatchling is now used as the build backend. This will not have any effect for users who install Black with its wheels from PyPI. (#3233)
  • Faster compiled wheels are now available for CPython 3.11 (#3276)

Blackd

  • Windows style (CRLF) newlines will be preserved (#3257).

Integrations

  • Vim plugin: add flag (g:black_preview) to enable/disable the preview style (#3246)
  • Update GitHub Action to support formatting of Jupyter Notebook files via a jupyter option (#3282)
  • Update GitHub Action to support use of version specifiers (e.g. <23) for Black version (#3265)
Commits

[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=black&package-manager=pip&previous-version=22.8.0&new-version=22.10.0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores) Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`. [//]: # (dependabot-automerge-start) [//]: # (dependabot-automerge-end) ---
Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR: - `@dependabot rebase` will rebase this PR - `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it - `@dependabot merge` will merge this PR after your CI passes on it - `@dependabot squash and merge` will squash and merge this PR after your CI passes on it - `@dependabot cancel merge` will cancel a previously requested merge and block automerging - `@dependabot reopen` will reopen this PR if it is closed - `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually - `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
---- :books: Documentation preview :books:: https://datasette--1839.org.readthedocs.build/en/1839/ ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1839/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1400121355,PR_kwDOBm6k_c5AVujU,1835,use inspect data for hash and file size,536941,fgregg,closed,0,,,,,3,2022-10-06T18:25:24Z,2022-10-27T20:51:30Z,2022-10-06T20:06:07Z,CONTRIBUTOR,simonw/datasette/pulls/1835,"`inspect_data` should already include the hash and the db file size, so this PR takes advantage of using those instead of always recalculating. should help a lot on startup with large DBs. closes #1834 ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1835/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1400431789,PR_kwDOBm6k_c5AWyQK,1837,Make hash and size a lazy property,536941,fgregg,closed,0,,,,,2,2022-10-06T23:51:22Z,2022-10-27T20:51:21Z,2022-10-27T20:51:20Z,CONTRIBUTOR,simonw/datasette/pulls/1837,"Many apologies, @simonw. My previous PR #1835 did not really solve the problem because the name of the database is often not known to database object in the init method. I took a cue from how you dealt with this issue and made hash a lazy property and did something similar with size. ---- :books: Documentation preview :books:: https://datasette--1837.org.readthedocs.build/en/1837/ ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1837/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1424378012,I_kwDOBm6k_c5U5kic,1860,SQL query field can't begin by a comment,562352,CharlesNepote,closed,0,,,,,12,2022-10-26T16:55:31Z,2022-10-27T18:57:37Z,2022-10-27T04:21:40Z,NONE,,"![image](https://user-images.githubusercontent.com/562352/198085197-f26fcd61-4dac-4ca4-a346-e70f88a30ecc.png) SQL comments are **very** useful to explain the meaning of the query. It's currently impossible to put it at the beginning of the field as seen on the screen capture: it leads to an error: `Statement must be a SELECT`. It would be great to make it possible because: * as the request is the title of the page: * it eases the search with search engines * it eases the search in the browsers' url field * it acts as a kind of title: the global meaning of the query is immediately understandable * some tools, such as Slack, are shortening long URLs and displaying the beginning of the URLs (eg. `https://example.org/products?sql=select+%28length%28data_quality_errors_ta[...]+%21%3D+%22%22+group+by+NB_of_issues+order+by+NB_of_issues+desc+limit+200`) Beginning a query with a comment is possible with SQLite. ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1860/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1425682079,I_kwDOBm6k_c5U-i6f,1865,Stop syncing main to master,9599,simonw,closed,0,,,,,1,2022-10-27T13:55:38Z,2022-10-27T13:58:27Z,2022-10-27T13:56:13Z,OWNER,,"I think it's been long enough now that I can drop the code that syncs the main branch to master. I originally added this for people who might be using `datasette publish ... 
--branch master` - which might only have been me anyway!",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1865/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 639072811,MDU6SXNzdWU2MzkwNzI4MTE=,849,Rename master branch to main,9599,simonw,closed,0,,,3268330,Datasette 1.0,10,2020-06-15T19:05:54Z,2022-10-27T13:57:08Z,2020-09-15T20:37:14Z,OWNER,,"I was waiting for consensus to form around this (and kind-of hoping for `trunk` since I like the tree metaphor) and it looks like `main` is it. I've seen convincing arguments against `trunk` too - it indicates that the branch has some special significance like in Subversion (where all branches come from trunk) when it doesn't. So `main` is better anyway.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/849/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1420174670,I_kwDOBm6k_c5UpiVO,1849,NoneType' object has no attribute 'actor',9599,simonw,closed,0,,,,,5,2022-10-24T04:02:15Z,2022-10-26T21:13:40Z,2022-10-26T21:13:40Z,OWNER,,"``` File ""/usr/local/lib/python3.10/site-packages/datasette/templates/_crumbs.html"", line 3, in template {% set items=crumb_items(request=request, database=database, table=table) %} File ""jinja2/async_utils.py"", line 65, in auto_await return await t.cast(""t.Awaitable[V]"", value) File ""datasette/app.py"", line 638, in _crumb_items actor=request.actor, action=""view-instance"", default=True ``` From Sentry.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1849/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 642297505,MDU6SXNzdWU2NDIyOTc1MDU=,857,Comprehensive documentation for variables made available to templates,9599,simonw,closed,0,,,3268330,Datasette 1.0,1,2020-06-20T03:19:43Z,2022-10-26T02:58:17Z,2022-10-26T02:58:17Z,OWNER,,"Needed for the Datasette 1.0 release, so template authors can trust that Datasette is unlikely to break their templates.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/857/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1423182778,I_kwDOCGYnMM5U1Au6,505,Release sqlite-utils 3.30,9599,simonw,closed,0,,,,,2,2022-10-25T22:20:05Z,2022-10-25T22:41:26Z,2022-10-25T22:41:16Z,OWNER,,https://github.com/simonw/sqlite-utils/compare/3.29...defa2974c6d3abc19be28d6b319649b8028dc966,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/505/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1392690202,I_kwDOCGYnMM5TAsQa,495,Support JSON values returned from .convert() functions,649467,mhalle,closed,0,,,,,3,2022-09-30T16:33:49Z,2022-10-25T21:23:37Z,2022-10-25T21:23:28Z,NONE,,"When using the convert function on a JSON column, the result of the conversion function must be a string. If the return value is either a dict (object) or a list (array), the convert call will error out with an unhelpful user defined function exception. 
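As a workaround, returning a JSON string from the conversion function avoids the error entirely. A rough sketch, using a hypothetical `items` table whose `tags` column holds JSON arrays:

```py
import json
from sqlite_utils import Database

db = Database('example.db')

# Hypothetical table with a text column 'tags' containing JSON arrays
db['items'].convert(
    'tags',
    # returning a str (rather than a list) keeps the user-defined function happy
    lambda value: json.dumps(sorted(json.loads(value))),
)
```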
It makes sense that since the original column value was a string and required conversion to data structures, the result should be converted back into a JSON string as well. However, other functions auto-convert to JSON string representation, so the fact that convert doesn't could be surprising. At least the documentation should note this requirement, because the sqlite error messages won't readily reveal the issue. Jf only sqlite's JSON column type meant something :)",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/495/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1393212964,I_kwDOCGYnMM5TCr4k,497,column_names,7908073,chapmanjacobd,closed,0,,,,,1,2022-10-01T03:34:21Z,2022-10-25T21:09:28Z,2022-10-25T21:09:28Z,CONTRIBUTOR,,"It would be nice to have a `column_names`. Similar to `table_names`. Or if you could get one or all of the following syntax to work for both Database and Table that might be even better: Style 1 - `if 'table1' in db` - `if 'col1' in db['table1']` Style 2 - `if 'table1' in db.tables` - `if 'col1' in db['table1'].columns` maybe the table ones actually work but I'm too lazy to check. I just know that I have to do: `[c.name for c in db['table1'].columns]` Edit: This is possible with `columns_dict`. I have actually used that before but I forgot about it. Feel free to close, but I do think accessing this data could be more consistent and intuitive.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/497/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1423069384,I_kwDOCGYnMM5U0lDI,504,"db.close() method, calling db.conn.close()",9599,simonw,closed,0,,,,,1,2022-10-25T20:50:50Z,2022-10-25T21:00:29Z,2022-10-25T20:57:47Z,OWNER,,"I ended up needing to use `db.conn.close()` to fix this issue: - #503 I think `.close()` should be a method on `Database` itself.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/504/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1423000702,I_kwDOCGYnMM5U0UR-,503,test_recreate failing on Windows Python 3.11,9599,simonw,closed,0,,,,,10,2022-10-25T20:01:41Z,2022-10-25T20:47:34Z,2022-10-25T20:45:43Z,OWNER,,"https://github.com/simonw/sqlite-utils/actions/runs/3323672128/jobs/5494726927 Related: - #502 ``` FAILED tests/test_recreate.py::test_recreate[True-True] - PermissionError: [WinError 32] The process cannot access the file because it is being used by another process: 'C:\\Users\\runneradmin\\AppData\\Local\\Temp\\pytest-of-runneradmin\\pytest-0\\test_recreate_True_True_0\\data.db' FAILED tests/test_recreate.py::test_recreate[False-True] - PermissionError: [WinError 32] The process cannot access the file because it is being used by another process: 'C:\\Users\\runneradmin\\AppData\\Local\\Temp\\pytest-of-runneradmin\\pytest-0\\test_recreate_False_True_0\\data.db' ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/503/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1422973111,I_kwDOBm6k_c5U0Ni3,1854,Flaky test: 
test_serve_localhost_http,9599,simonw,closed,0,,,,,3,2022-10-25T19:37:35Z,2022-10-25T19:53:02Z,2022-10-25T19:53:02Z,OWNER,,Failing on Python 3.10 at the moment: https://github.com/simonw/datasette/actions/runs/3323629947/jobs/5494340302,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1854/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1422915587,I_kwDOBm6k_c5Uz_gD,1853,Upgrade Datasette Docker to Python 3.11,9599,simonw,closed,0,,,,,7,2022-10-25T18:44:31Z,2022-10-25T19:28:56Z,2022-10-25T19:05:16Z,OWNER,,"Related: - #1768 I think this base image looks right: [3.11.0-slim-bullseye](https://hub.docker.com/layers/library/python/3.11.0-slim-bullseye/images/sha256-244c0b0e6e7608a16f87382fc8a5ef3c330d042113a9a7b6fc15a95360181651?context=explore)",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1853/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1422954582,I_kwDOCGYnMM5U0JBW,502,Fix tests for Python 3.11,9599,simonw,closed,0,,,,,1,2022-10-25T19:20:31Z,2022-10-25T19:23:47Z,2022-10-25T19:23:47Z,OWNER,,"The way errors are represented has changed: https://github.com/simonw/sqlite-utils/actions/runs/3323588047/jobs/5494127154 ``` _________________________ test_query_invalid_function __________________________ db_path = '/tmp/pytest-of-runner/pytest-0/test_query_invalid_function0/test.db' def test_query_invalid_function(db_path): result = CliRunner().invoke( cli.cli, [db_path, ""select bad()"", ""--functions"", ""def invalid_python""] ) assert result.exit_code == 1 > assert ( result.output.strip() == ""Error: Error in functions definition: invalid syntax (, line 1)"" ) E AssertionError: assert 'Error: Error...ing>, line 1)' == 'Error: Error...ing>, line 1)' E - Error: Error in functions definition: invalid syntax (, line 1) E ? ^^^^^^ ^^^^^^ E + Error: Error in functions definition: expected '(' (, line 1) E ? ^^^^^^^ ^^^ ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/502/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1420090659,I_kwDOBm6k_c5UpN0j,1848,Private database page should show padlock on every table,9599,simonw,closed,0,,,,,3,2022-10-24T02:28:38Z,2022-10-24T02:50:29Z,2022-10-24T02:42:34Z,OWNER,,"Following: - #1829 https://latest.datasette.io/_internal looks like this: But those queries and tables are private too, and should also show the padlock icon.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1848/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1396948693,I_kwDOBm6k_c5TQ77V,1829,Table/database that is private due to inherited permissions does not show padlock,9599,simonw,closed,0,,,,,8,2022-10-04T23:14:16Z,2022-10-24T02:23:46Z,2022-10-24T02:11:37Z,OWNER,,"I noticed that a table page that is private because the database or instance is private, e.g. this one: Is not displaying the padlock icon that indicates the table is not visible to the public. 
Same issue for the database page too, which in this case is private due to `view-instance`.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1829/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1408561039,PR_kwDOBm6k_c5Axrpb,1842,check_visibility can now take multiple permissions into account,9599,simonw,closed,0,,,,,3,2022-10-14T00:06:04Z,2022-10-24T02:11:36Z,2022-10-24T02:11:36Z,OWNER,simonw/datasette/pulls/1842,"Refs #1829 - [x] Fix table page - [x] Fix database page - [x] Fix query page - [x] Fix row page - [x] Tests - [x] Updated documentation for `check_visibility` method, to cover the new `permissions=` keyword argument Also this fix is currently only applied on the table page - needs to be applied on database, row and query pages too. ---- :books: Documentation preview :books:: https://datasette--1842.org.readthedocs.build/en/1842/ ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1842/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1420055377,I_kwDOBm6k_c5UpFNR,1847,Both _local_metadata and _metadata_local?,9599,simonw,closed,0,,,,,2,2022-10-24T01:43:08Z,2022-10-24T01:53:13Z,2022-10-24T01:53:13Z,OWNER,,"Spotted this in the debugger against the `datasette` object while running tests (`pytest -k test_permissions_cascade` to be exact): ``` (Pdb) [p for p in dir(self) if p.startswith('_') and '__' not in p] ['_actor', '_asset_urls', '_connected_databases', '_crumb_items', '_local_metadata', '_metadata', '_metadata_local', '_metadata_recursive_update', '_permission_checks', '_plugins', '_prepare_connection', '_refresh_schemas', '_refresh_schemas_lock', '_register_custom_units', '_register_renderers', '_root_token', '_routes', '_secret', '_settings', '_show_messages', '_startup_hook_calculation', '_startup_hook_fired', '_startup_invoked', '_threads', '_versions', '_write_messages_to_response'] ``` ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1847/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1413641049,I_kwDOCGYnMM5UQnNZ,501,Tests failing due to updated tabulate library,9599,simonw,closed,0,,,,,4,2022-10-18T18:07:52Z,2022-10-18T18:23:40Z,2022-10-18T18:23:40Z,OWNER,,"Failure here: https://github.com/simonw/sqlite-utils/actions/runs/3275786702/jobs/5391063221 I figured out the problem: ```diff diff --git a/docs/cli-reference.rst b/docs/cli-reference.rst index b88e38a..82b4b6c 100644 --- a/docs/cli-reference.rst +++ b/docs/cli-reference.rst @@ -112,11 +112,15 @@ See :ref:`cli_query`. 
--tsv Output TSV --no-headers Omit CSV headers -t, --table Output as a formatted table - --fmt TEXT Table format - one of fancy_grid, fancy_outline, - github, grid, html, jira, latex, latex_booktabs, - latex_longtable, latex_raw, mediawiki, moinmoin, - orgtbl, pipe, plain, presto, pretty, psql, rst, - simple, textile, tsv, unsafehtml, youtrack + --fmt TEXT Table format - one of asciidoc, double_grid, + double_outline, fancy_grid, fancy_outline, github, + grid, heavy_grid, heavy_outline, html, jira, + latex, latex_booktabs, latex_longtable, latex_raw, + mediawiki, mixed_grid, mixed_outline, moinmoin, + orgtbl, outline, pipe, plain, presto, pretty, + psql, rounded_grid, rounded_outline, rst, simple, + simple_grid, simple_outline, textile, tsv, + unsafehtml, youtrack --json-cols Detect JSON cols and output them as JSON, not escaped strings -r, --raw Raw output, first column of first row @@ -176,11 +180,15 @@ See :ref:`cli_memory`. --tsv Output TSV --no-headers Omit CSV headers -t, --table Output as a formatted table - --fmt TEXT Table format - one of fancy_grid, fancy_outline, - github, grid, html, jira, latex, latex_booktabs, - latex_longtable, latex_raw, mediawiki, moinmoin, - orgtbl, pipe, plain, presto, pretty, psql, rst, - simple, textile, tsv, unsafehtml, youtrack + --fmt TEXT Table format - one of asciidoc, double_grid, + double_outline, fancy_grid, fancy_outline, github, + grid, heavy_grid, heavy_outline, html, jira, + latex, latex_booktabs, latex_longtable, latex_raw, + mediawiki, mixed_grid, mixed_outline, moinmoin, + orgtbl, outline, pipe, plain, presto, pretty, + psql, rounded_grid, rounded_outline, rst, simple, + simple_grid, simple_outline, textile, tsv, + unsafehtml, youtrack --json-cols Detect JSON cols and output them as JSON, not escaped strings -r, --raw Raw output, first column of first row @@ -401,11 +409,14 @@ See :ref:`cli_search`. --tsv Output TSV --no-headers Omit CSV headers -t, --table Output as a formatted table - --fmt TEXT Table format - one of fancy_grid, fancy_outline, - github, grid, html, jira, latex, latex_booktabs, - latex_longtable, latex_raw, mediawiki, moinmoin, - orgtbl, pipe, plain, presto, pretty, psql, rst, simple, - textile, tsv, unsafehtml, youtrack + --fmt TEXT Table format - one of asciidoc, double_grid, + double_outline, fancy_grid, fancy_outline, github, + grid, heavy_grid, heavy_outline, html, jira, latex, + latex_booktabs, latex_longtable, latex_raw, mediawiki, + mixed_grid, mixed_outline, moinmoin, orgtbl, outline, + pipe, plain, presto, pretty, psql, rounded_grid, + rounded_outline, rst, simple, simple_grid, + simple_outline, textile, tsv, unsafehtml, youtrack --json-cols Detect JSON cols and output them as JSON, not escaped strings --load-extension TEXT Path to SQLite extension, with optional :entrypoint @@ -651,11 +662,14 @@ See :ref:`cli_tables`. 
--tsv Output TSV --no-headers Omit CSV headers -t, --table Output as a formatted table - --fmt TEXT Table format - one of fancy_grid, fancy_outline, - github, grid, html, jira, latex, latex_booktabs, - latex_longtable, latex_raw, mediawiki, moinmoin, - orgtbl, pipe, plain, presto, pretty, psql, rst, simple, - textile, tsv, unsafehtml, youtrack + --fmt TEXT Table format - one of asciidoc, double_grid, + double_outline, fancy_grid, fancy_outline, github, + grid, heavy_grid, heavy_outline, html, jira, latex, + latex_booktabs, latex_longtable, latex_raw, mediawiki, + mixed_grid, mixed_outline, moinmoin, orgtbl, outline, + pipe, plain, presto, pretty, psql, rounded_grid, + rounded_outline, rst, simple, simple_grid, + simple_outline, textile, tsv, unsafehtml, youtrack --json-cols Detect JSON cols and output them as JSON, not escaped strings --columns Include list of columns for each table @@ -689,11 +703,14 @@ See :ref:`cli_views`. --tsv Output TSV --no-headers Omit CSV headers -t, --table Output as a formatted table - --fmt TEXT Table format - one of fancy_grid, fancy_outline, - github, grid, html, jira, latex, latex_booktabs, - latex_longtable, latex_raw, mediawiki, moinmoin, - orgtbl, pipe, plain, presto, pretty, psql, rst, simple, - textile, tsv, unsafehtml, youtrack + --fmt TEXT Table format - one of asciidoc, double_grid, + double_outline, fancy_grid, fancy_outline, github, + grid, heavy_grid, heavy_outline, html, jira, latex, + latex_booktabs, latex_longtable, latex_raw, mediawiki, + mixed_grid, mixed_outline, moinmoin, orgtbl, outline, + pipe, plain, presto, pretty, psql, rounded_grid, + rounded_outline, rst, simple, simple_grid, + simple_outline, textile, tsv, unsafehtml, youtrack --json-cols Detect JSON cols and output them as JSON, not escaped strings --columns Include list of columns for each view @@ -732,11 +749,15 @@ See :ref:`cli_rows`. --tsv Output TSV --no-headers Omit CSV headers -t, --table Output as a formatted table - --fmt TEXT Table format - one of fancy_grid, fancy_outline, - github, grid, html, jira, latex, latex_booktabs, - latex_longtable, latex_raw, mediawiki, moinmoin, - orgtbl, pipe, plain, presto, pretty, psql, rst, - simple, textile, tsv, unsafehtml, youtrack + --fmt TEXT Table format - one of asciidoc, double_grid, + double_outline, fancy_grid, fancy_outline, github, + grid, heavy_grid, heavy_outline, html, jira, + latex, latex_booktabs, latex_longtable, latex_raw, + mediawiki, mixed_grid, mixed_outline, moinmoin, + orgtbl, outline, pipe, plain, presto, pretty, + psql, rounded_grid, rounded_outline, rst, simple, + simple_grid, simple_outline, textile, tsv, + unsafehtml, youtrack --json-cols Detect JSON cols and output them as JSON, not escaped strings --load-extension TEXT Path to SQLite extension, with optional @@ -768,11 +789,14 @@ See :ref:`cli_triggers`. 
--tsv Output TSV --no-headers Omit CSV headers -t, --table Output as a formatted table - --fmt TEXT Table format - one of fancy_grid, fancy_outline, - github, grid, html, jira, latex, latex_booktabs, - latex_longtable, latex_raw, mediawiki, moinmoin, - orgtbl, pipe, plain, presto, pretty, psql, rst, simple, - textile, tsv, unsafehtml, youtrack + --fmt TEXT Table format - one of asciidoc, double_grid, + double_outline, fancy_grid, fancy_outline, github, + grid, heavy_grid, heavy_outline, html, jira, latex, + latex_booktabs, latex_longtable, latex_raw, mediawiki, + mixed_grid, mixed_outline, moinmoin, orgtbl, outline, + pipe, plain, presto, pretty, psql, rounded_grid, + rounded_outline, rst, simple, simple_grid, + simple_outline, textile, tsv, unsafehtml, youtrack --json-cols Detect JSON cols and output them as JSON, not escaped strings --load-extension TEXT Path to SQLite extension, with optional :entrypoint @@ -804,11 +828,14 @@ See :ref:`cli_indexes`. --tsv Output TSV --no-headers Omit CSV headers -t, --table Output as a formatted table - --fmt TEXT Table format - one of fancy_grid, fancy_outline, - github, grid, html, jira, latex, latex_booktabs, - latex_longtable, latex_raw, mediawiki, moinmoin, - orgtbl, pipe, plain, presto, pretty, psql, rst, simple, - textile, tsv, unsafehtml, youtrack + --fmt TEXT Table format - one of asciidoc, double_grid, + double_outline, fancy_grid, fancy_outline, github, + grid, heavy_grid, heavy_outline, html, jira, latex, + latex_booktabs, latex_longtable, latex_raw, mediawiki, + mixed_grid, mixed_outline, moinmoin, orgtbl, outline, + pipe, plain, presto, pretty, psql, rounded_grid, + rounded_outline, rst, simple, simple_grid, + simple_outline, textile, tsv, unsafehtml, youtrack --json-cols Detect JSON cols and output them as JSON, not escaped strings --load-extension TEXT Path to SQLite extension, with optional :entrypoint diff --git a/docs/cli.rst b/docs/cli.rst index 8bc4176..1d67e88 100644 --- a/docs/cli.rst +++ b/docs/cli.rst @@ -187,10 +187,15 @@ Available ``--fmt`` options are: cog.out(""\n"" + ""\n"".join('- ``{}``'.format(t) for t in tabulate.tabulate_formats) + ""\n\n"") .. ]]] +- ``asciidoc`` +- ``double_grid`` +- ``double_outline`` - ``fancy_grid`` - ``fancy_outline`` - ``github`` - ``grid`` +- ``heavy_grid`` +- ``heavy_outline`` - ``html`` - ``jira`` - ``latex`` @@ -198,15 +203,22 @@ Available ``--fmt`` options are: - ``latex_longtable`` - ``latex_raw`` - ``mediawiki`` +- ``mixed_grid`` +- ``mixed_outline`` - ``moinmoin`` - ``orgtbl`` +- ``outline`` - ``pipe`` - ``plain`` - ``presto`` - ``pretty`` - ``psql`` +- ``rounded_grid`` +- ``rounded_outline`` - ``rst`` - ``simple`` +- ``simple_grid`` +- ``simple_outline`` - ``textile`` - ``tsv`` - ``unsafehtml`` ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/501/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1413610718,I_kwDOCGYnMM5UQfze,500,Turn --flatten into a documented utility function,9599,simonw,closed,0,,,,,4,2022-10-18T17:43:36Z,2022-10-18T18:02:10Z,2022-10-18T18:00:40Z,OWNER,,The `--flatten` implementation isn't currently available to Python code - people have to roll their own implementation. 
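The roll-your-own version usually looks something like this (an illustrative sketch of the underscore-separated key behaviour, not the library code itself):

```py
def flatten(row, prefix=None):
    # Recursively flatten nested dictionaries into underscore-separated keys,
    # e.g. {'a': {'b': 1}} becomes {'a_b': 1}
    flat = {}
    for key, value in row.items():
        new_key = key if prefix is None else prefix + '_' + key
        if isinstance(value, dict):
            flat.update(flatten(value, prefix=new_key))
        else:
            flat[new_key] = value
    return flat

print(flatten({'user': {'name': 'cleo', 'score': 5}, 'ok': True}))
# {'user_name': 'cleo', 'user_score': 5, 'ok': True}
```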
Feedback from a conversation at DjangoCon.,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/500/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1404013495,PR_kwDOCGYnMM5AicIh,498,fix: enable-fts permanently save triggers,7908073,chapmanjacobd,closed,0,,,,,2,2022-10-11T05:10:51Z,2022-10-15T04:33:08Z,2022-10-11T06:34:31Z,CONTRIBUTOR,simonw/sqlite-utils/pulls/498,"I was wondering why my all my databases were giving wild search results. Turns out create_trigger was not sticking! Running `sqlite-utils triggers x.db` shows `[]` after running `enable-fts` using the python api. Looking at the counts trigger it seems that is the right way to save triggers. triggers show up now ---- :books: Documentation preview :books:: https://sqlite-utils--498.org.readthedocs.build/en/498/ ",140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/498/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1409679008,I_kwDOBm6k_c5UBf6g,1844,Update screenshots in documentation to match latest designs,9599,simonw,closed,0,,,,,18,2022-10-14T18:01:18Z,2022-10-14T23:51:46Z,2022-10-14T19:57:17Z,OWNER,,"https://docs.datasette.io/en/0.62/full_text_search.html has this out-of-date screenshot: ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1844/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1397084281,I_kwDOBm6k_c5TRdB5,1831,If user can see table but NOT database/instance nav links should not display,9599,simonw,closed,0,,,,,10,2022-10-05T02:16:31Z,2022-10-13T21:52:04Z,2022-10-13T21:52:04Z,OWNER,,"Spotted this bug while building this plugin: - https://github.com/simonw/datasette-public This is a public table, but the two links in the nav go to forbidden pages: Those nav links shouldn't be shown at all.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1831/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1361355564,I_kwDOCGYnMM5RJKMs,482,balanced table default column_order,7908073,chapmanjacobd,closed,0,,,,,1,2022-09-05T03:00:18Z,2022-10-10T17:43:02Z,2022-09-06T20:17:27Z,CONTRIBUTOR,,"Is there any performance or size difference with column order in SQLITE ? similar to this https://www.cybertec-postgresql.com/en/column-order-in-postgresql-does-matter/ It might be interesting to have an option to create with an optimized column order. I'm assuming this would look something like INTEGER columns, REAL columns, BLOB columns, TEXT columns, NULL columns. 
NULL columns at the end because they are more likely to be TEXT and it is impossible to know if they will become INTEGER (Of course, any schema evolution would reduce optimization but maybe column order could also be re-evaluated when schema changes) edit: this is easy to accomplish with the existing `transform` method: ``` int_columns = [k for k, v in table_columns.items() if v == int] db[table].transform(column_order=[*int_columns]) ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/482/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1402608214,PR_kwDOBm6k_c5AdyZ4,1840,test commit,102635518,7lingyuan,closed,0,,,,,0,2022-10-10T05:15:26Z,2022-10-10T09:11:50Z,2022-10-10T09:11:50Z,FIRST_TIMER,simonw/datasette/pulls/1840,"lalalalalalala ---- :books: Documentation preview :books:: https://datasette--1840.org.readthedocs.build/en/1840/ ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1840/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1157182254,I_kwDOBm6k_c5E-TMu,1646,Configuration directory mode does not pick up other file extensions than .db,15640196,dnsos,closed,0,,,,,3,2022-03-02T13:15:23Z,2022-10-07T23:06:17Z,2022-10-07T23:03:35Z,NONE,,"Hello, I've been trying to run Datasette with the [configuration directory mode](https://docs.datasette.io/en/stable/settings.html#configuration-directory-mode) with a structure such as this one: ```plain some-directory/ example.sqlite3 another-example.db one-more.custom [...] ``` (In my scenario I can't just change the filename extension without other problems arising) Now databases with the `.sqlite3` or the custom filename extension are ignored by Datasette in this case. I'm aware that the docs state that a `.db` extension is required, but I was wondering if there is a reason for restricting this or any workaround available? When I run `datasette example.sqlite3` or `datasette one-more.custom` the databases are served by Datasette without a problem. ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1646/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1400494162,PR_kwDOBm6k_c5AW_kl,1838,Open Datasette link in new tab,4399499,ocdtrekkie,closed,0,,,,,3,2022-10-07T01:12:20Z,2022-10-07T16:28:41Z,2022-10-07T02:01:07Z,NONE,simonw/datasette/pulls/1838,"This is technically a Sandstorm-specific fix (as external links do not work inside the grain frame), however, I think it is an improvement to the upstream project, so I wanted to propose it here rather than patching it in our package. There's much opinions on the Internet about whether external links should open in a new tab by default or not, but I'd argue very few people who might click a ""powered by"" link intend to complete their interaction with the source page (a Datasette). And furthermore, users may be working within various queries or loading visualizations (navigating away when trying to plot a million GPS coordinates pretty much just resets your progress!), so linking away within the tab might be a frustrating or destructive act to one's work, even inadvertently. 
original report: https://github.com/ocdtrekkie/datasette-sandstorm/issues/1 ---- :books: Documentation preview :books:: https://datasette--1838.org.readthedocs.build/en/1838/ ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1838/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1386456717,PR_kwDOBm6k_c4_oHI4,1820,[SPIKE] Don't truncate query CSVs,536941,fgregg,closed,0,,,,,2,2022-09-26T17:27:01Z,2022-10-07T16:12:17Z,2022-10-07T16:12:17Z,CONTRIBUTOR,simonw/datasette/pulls/1820,"Relates to #526 This is a minimal set of changes needed for having *query* CSVs attempt to download all the rows. What's good about it is the minimalism. What's bad about it: 1. We are abusing the `_size` argument to indicate we don't want truncation, which isn't the most obvious thing. Additionally, there are various checks that make sure the ""_size"" URL parameter is a positive integer, which we are relying on to prevent overloading. 2. The default CSV on a table page will use the max_returned_rows argument. Changing this could be a breaking change, since that's currently a place that has some facilities for pagination. Additionally, i think there's a limit under the hood somewhere which if we removed could lead to sql timeouts 3. There are similar reasons for leaving the current streaming method alone, as the current methods could allow for downloading very large files that could have a sql timeout if we tried to get them in one go. ---- :books: Documentation preview :books:: https://datasette--1820.org.readthedocs.build/en/1820/ ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1820/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1, 1400083043,I_kwDOBm6k_c5Tc5Jj,1834,inspect data is not used for caching database hash,536941,fgregg,closed,0,,,,,0,2022-10-06T17:52:01Z,2022-10-06T20:06:21Z,2022-10-06T20:06:08Z,CONTRIBUTOR,,"When databases are loaded, https://github.com/simonw/datasette/blob/cb1e093fd361b758120aefc1a444df02462389a3/datasette/app.py#L257-L260 there is nothing preventing the rehashing of the database for immutable databases. https://github.com/simonw/datasette/blob/cb1e093fd361b758120aefc1a444df02462389a3/datasette/database.py#L50-L53 what i might expect is that relevant values of `inspect_data` get passed to the `Database` class to prevent re-hashing? With data that is many gigs large, this is a significant start up time. 
",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1834/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1397193691,I_kwDOBm6k_c5TR3vb,1832,__bool__ method on Results,9599,simonw,closed,0,,,,,2,2022-10-05T04:18:12Z,2022-10-05T04:32:33Z,2022-10-05T04:32:33Z,OWNER,,"Wrote this code today: https://github.com/simonw/datasette-public/blob/1401bfae50e71c1dfd2bfb6954f2e86d5a7ab21b/datasette_public/__init__.py#L41 ```python results = await db.execute( ""select 1 from _public_tables where table_name = ?"", [table_name] ) if len(results): return True ``` Would be nice if I could use `if results` there instead.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1832/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1393903845,I_kwDOBm6k_c5TFUjl,1828,word-wrap: anywhere resulting in weird display,9599,simonw,closed,0,,,,,2,2022-10-02T21:25:03Z,2022-10-02T23:01:17Z,2022-10-02T23:01:17Z,OWNER,,"e.g. on https://github-to-sqlite.dogsheep.net/github/commits This is from a change introduced here: https://github.com/simonw/datasette/commit/bf8d84af5422606597be893cedd375020cb2b369 in #1805 https://github.com/simonw/datasette/blob/bf8d84af5422606597be893cedd375020cb2b369/datasette/static/app.css#L447-L450",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1828/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1392426838,PR_kwDOBm6k_c4_8BMC,1827,Bump furo from 2022.9.15 to 2022.9.29,49699333,dependabot[bot],closed,0,,,,,1,2022-09-30T13:15:35Z,2022-09-30T17:55:42Z,2022-09-30T17:55:41Z,CONTRIBUTOR,simonw/datasette/pulls/1827,"Bumps [furo](https://github.com/pradyunsg/furo) from 2022.9.15 to 2022.9.29.
Changelog

Sourced from furo's changelog.

Changelog

2022.09.29 -- Quaint Quartz

  • Add ability to set arbitrary URLs for edit button.
  • Add support for aligning text in MyST-parser generated tables.
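(Editorial illustration, not part of the upstream changelog: the arbitrary-URL edit button item appears to correspond to the `source_edit_link` theme parameter named in the commit list below; the exact option name and `{filename}` placeholder are assumptions based on that commit message.)

```python
# docs/conf.py -- hypothetical sketch of pointing the Furo edit button at a custom URL
html_theme = 'furo'
html_theme_options = {
    # '{filename}' is assumed to be replaced with the page's source path
    'source_edit_link': 'https://example.com/myrepo/edit/main/docs/{filename}',
}
```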

2022.09.15 -- Pragmatic Pistachio

  • Add a minimum version constraint on pygments.
  • Add an explicit dependency on sass.
  • Change right sidebar title from "Contents" to "On this page".
  • Correctly position sidebars on small screens.
  • Correctly select only Furo's own svg in related pages nav.
  • Make numpy-style documentation headers consistent.
  • Retitle the reference section.
  • Update npm dependencies.

2022.06.21 -- Opulent Opal

  • Fix docutils <= 0.17.x compatibility.
  • Bump to the latest Node.js LTS.

2022.06.04.1 -- Naughty Nickel bugfix

  • Fix the URL used in the "Edit this page" for Read the Docs builds.

2022.06.04 -- Naughty Nickel

  • ✨ Advertise Sphinx 5 compatibility.
  • ✨ Change to basic-ng as the base theme (from {pypi}sphinx-basic-ng).
  • Document site-wide announcement banners.
  • Drop the pin on pygments.
  • Improve edit button, using basic-ng's edit-this-page component.
  • Tweak headings to better match what users expect.
  • Tweak how Sphinx's default HTML is rendered, using docutils post-transforms (this replaces parsing+modifying it with BeautifulSoup).
  • When built with docutils 0.18, footnotes are rendered differently and stylised differently in Furo.

2022.04.07 -- Magical Mauve

... (truncated)

Commits
  • 1375f9d Prepare release: 2022.09.29
  • af43607 Update changelog
  • bc0fe52 Update user-facing documentation for edit button
  • 509c558 Modernise the edit-this-page.html template
  • 5a0ceca Add source_edit_link as a theme configuration parameter (#510)
  • 52fc32f Build documentation in pull requests
  • 149f77b Fix stylesheet for MyST tables
  • 9af2e44 Support MyST table column alignment (#531)
  • 82dd61c Back to development
  • See full diff in compare view

[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=furo&package-manager=pip&previous-version=2022.9.15&new-version=2022.9.29)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores) Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`. [//]: # (dependabot-automerge-start) [//]: # (dependabot-automerge-end) ---
Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR: - `@dependabot rebase` will rebase this PR - `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it - `@dependabot merge` will merge this PR after your CI passes on it - `@dependabot squash and merge` will squash and merge this PR after your CI passes on it - `@dependabot cancel merge` will cancel a previously requested merge and block automerging - `@dependabot reopen` will reopen this PR if it is closed - `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually - `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
---- :books: Documentation preview :books:: https://datasette--1827.org.readthedocs.build/en/1827/ ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1827/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1388631785,I_kwDOBm6k_c5SxNbp,1826,render_cell documentation example doesn't match the method signature,66709385,pjamargh,closed,0,,,,,3,2022-09-28T02:37:59Z,2022-09-28T04:30:28Z,2022-09-28T04:05:16Z,NONE,,"Open Datasette stable doc at https://docs.datasette.io/en/stable/plugin_hooks.html?highlight=render_cell#render-cell-row-value-column-table-database-datasette render_cell plugin hook method signature is `render_cell(row, value, column, table, database, datasette)`, the example shown inline uses `render_cell(value)`. ![image](https://user-images.githubusercontent.com/66709385/192674691-34265b81-6cdd-41d2-8424-aa12f8bc8c94.png) ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1826/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1388227245,PR_kwDOBm6k_c4_uCkO,1825,Add documentation for serving via OpenRC,1048831,asimpson,closed,0,,,,,2,2022-09-27T19:00:56Z,2022-09-28T04:21:37Z,2022-09-28T04:21:37Z,CONTRIBUTOR,simonw/datasette/pulls/1825,"I also removed a few lines which felt redundant given the following section dedicated to running behind a nginx proxy. ---- :books: Documentation preview :books:: https://datasette--1825.org.readthedocs.build/en/1825/ ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1825/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1385026210,I_kwDOBm6k_c5SjdKi,1819,Preserve query on timeout,2182,danp,closed,0,,,,,3,2022-09-25T13:32:31Z,2022-09-26T23:16:15Z,2022-09-26T23:06:06Z,CONTRIBUTOR,,"If a query hits the timeout it shows a message like: > SQL query took too long. The time limit is controlled by the [sql_time_limit_ms](https://docs.datasette.io/en/stable/settings.html#sql-time-limit-ms) configuration option. But the query is lost. Hitting the browser back button shows the query _before_ the one that errored. It would be nice if the query that errored was preserved for more tweaking. This would make it similar to how ""invalid syntax"" works since #1346 / #619.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1819/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1386734383,I_kwDOBm6k_c5Sp-Mv,1821,Release Datasette 0.63a0,9599,simonw,closed,0,,,,,1,2022-09-26T21:15:27Z,2022-09-26T22:06:39Z,2022-09-26T22:06:39Z,OWNER,,"> - The [prepare_jinja2_environment(env, datasette)](https://docs.datasette.io/en/latest/plugin_hooks.html#plugin-hook-prepare-jinja2-environment) plugin hook now accepts an optional `datasette` argument. Hook implementations can also now return an `async` function which will be awaited automatically. ([#1809](https://github.com/simonw/datasette/issues/1809)) > - `--load-extension` option now supports entrypoints. Thanks, Alex Garcia. 
([#1789](https://github.com/simonw/datasette/pull/1789)) > - New tutorial: [Cleaning data with sqlite-utils and Datasette](https://datasette.io/tutorials/clean-data). > - Facet size can now be set per-table with the new `facet_size` table metadata option. ([#1804](https://github.com/simonw/datasette/issues/1804)) > - `truncate_cells_html` setting now also affects long URLs in columns. ([#1805](https://github.com/simonw/datasette/issues/1805)) > - `Database(is_mutable=)` now defaults to `True`. ([#1808](https://github.com/simonw/datasette/issues/1808)) > - Non-JavaScript textarea now increases height to fit the SQL query. ([#1786](https://github.com/simonw/datasette/issues/1786)) > - More detailed command descriptions on the [CLI reference](https://docs.datasette.io/en/latest/cli-reference.html#cli-reference) page. ([#1787](https://github.com/simonw/datasette/issues/1787)) > - Datasette no longer enforces upper bounds on its depenedencies. ([#1800](https://github.com/simonw/datasette/issues/1800)) > - Facets are now displayed with better line-breaks in long values. Thanks, Daniel Rech. ([#1794](https://github.com/simonw/datasette/pull/1794)) > - The `settings.json` file used in [Configuration directory mode](https://docs.datasette.io/en/latest/settings.html#config-dir) is now validated on startup. ([#1816](https://github.com/simonw/datasette/issues/1816))",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1821/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1386593843,I_kwDOCGYnMM5Spb4z,494,Document how to use Just,9599,simonw,closed,0,,,,,2,2022-09-26T19:25:12Z,2022-09-26T19:32:36Z,2022-09-26T19:26:39Z,OWNER,,"I'm using `just` a lot know, based on this file - I should add that to https://sqlite-utils.datasette.io/en/latest/contributing.html https://github.com/simonw/sqlite-utils/blob/afbd2b2cba45cccb305c3d4638d18db4dd3d4bbd/Justfile#L1-L24",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/494/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1363765916,I_kwDOCGYnMM5RSWqc,483,`sqlite-utils install` command,9599,simonw,closed,0,,,,,2,2022-09-06T20:13:55Z,2022-09-26T19:04:43Z,2022-09-26T18:57:15Z,OWNER,,"With the addition of `--functions` in: - #471 In addition to the existing `convert` command, there are now very good reasons to want to install additional packages into the same virtual environment as `sqlite-utils` itself, to allow them to be used with those features. This isn't easy if you installed the tool with `pipx` or `brew install sqlite-utils`. Datasette solved this problem with the `datasette install` command: - https://github.com/simonw/datasette/issues/925 `sqlite-utils` could benefit from the same idea.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/483/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1382457780,I_kwDOCGYnMM5SZqG0,490,Ability to insert multi-line files,6180701,jeqo,closed,0,,,,,4,2022-09-22T13:29:22Z,2022-09-26T18:24:44Z,2022-09-23T16:37:58Z,NONE,,"I was looking into how to parse application log files that contain multiline text (e.g. Java stack traces) into sqlite. 
I can see that at the moment `--lines` helps, but falls short when processing multi-line texts. I wonder if this functionality would be useful for sqlite-utils. A similar approach to Elastic logstash/filebeat can be adopted: https://www.elastic.co/guide/en/beats/filebeat/current/multiline-examples.html Potential changes: - add a `--multiline` option - additional properties for - multiline-pattern (regex expression) - multiline-negate: true/false - multiline-what: previous or next Or if this is achievable in a different way, please share. Thanks!",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/490/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 520508502,MDU6SXNzdWU1MjA1MDg1MDI=,31,"""friends"" command (similar to ""followers"")",9599,simonw,closed,0,,,,,2,2019-11-09T20:20:20Z,2022-09-20T05:05:03Z,2020-02-07T07:03:28Z,MEMBER,,"Current list of commands: ``` followers Save followers for specified user (defaults to... followers-ids Populate followers table with IDs of account followers friends-ids Populate followers table with IDs of account friends ``` Obvious omission here is `friends`, which would be powered by `https://api.twitter.com/1.1/friends/list.json`: https://developer.twitter.com/en/docs/accounts-and-users/follow-search-get-users/api-reference/get-friends-list",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/31/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1378640768,I_kwDOBm6k_c5SLGOA,1816,Validate settings.json on startup in configuration directory mode,9599,simonw,closed,0,,,,,2,2022-09-19T23:35:18Z,2022-09-20T01:15:48Z,2022-09-20T01:15:48Z,OWNER,,"> It might have been useful for Datasette to show an error when started against a `settings.json` file that contains an invalid setting though. _Originally posted by @simonw in https://github.com/simonw/datasette/issues/1814#issuecomment-1251677554_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1816/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1378495690,I_kwDOBm6k_c5SKizK,1814,Static files not served,4068,frafra,closed,0,,,,,2,2022-09-19T20:38:17Z,2022-09-19T23:35:06Z,2022-09-19T23:34:30Z,NONE,,"Folder structure: ``` bibliography/ bibliography/static-files bibliography/static-files/styles.css bibliography/bibliography.db bibliography/metadata.json bibliography/settings.json ``` ``` $ cat bibliography/settings.json { ""suggest_facets"": false, ""truncate_cells_html"": 1000, ""static"": ""assets:static-files/"" } ``` File `/assets/styles.css` is not found (HTTP 404, `Database not found: assets`). 
Using datasette revision d0737e4de51ce178e556fc011ccb8cc46bbb6359.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1814/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1377811868,I_kwDOBm6k_c5SH72c,1813,missing next and next_url in JSON responses from an instance deployed on Fly ,883348,adipasquale,closed,0,,,,,1,2022-09-19T11:32:34Z,2022-09-19T11:34:45Z,2022-09-19T11:34:45Z,CONTRIBUTOR,,"👋 thank you for an incredibly useful project! I have noticed that my deployed instance on Fly does not include the `next` and `next_url` keys even for a truncated response : This is publically accessible here: `https://collectif-objets-datasette.fly.dev/collectif-objets.json?sql=select+*+from+mairies` However when I run the dataset server locally with the same data I get these next keys for the exact same query: I am wondering if I've missed some config or something specific to deployments on Fly.io? I am running datasette v0.62, without any specific config : - locally `poetry run datasette data/collectif-objets.sqlite` - for the deploy : `poetry run datasette publish fly data/collectif-objets.sqlite` as visible in [the Makefile](https://github.com/adipasquale/collectif-objets-datasette/blob/main/Makefile). _The very limited codebase is public but the sqlite db is not versioned yet because it is too large._",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1813/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1373595927,I_kwDOBm6k_c5R32kX,1809,`prepare_jinja2_environment()` hook should take `datasette` argument,9599,simonw,closed,0,,,,,11,2022-09-14T21:15:46Z,2022-09-17T03:39:05Z,2022-09-17T03:38:33Z,OWNER,,"That plugin hook's current signature is: https://github.com/simonw/datasette/blob/610425460b519e9c16d386cb81aa081c9d730ef0/datasette/hookspecs.py#L28-L30 As a result in the first alpha release of `datasette-edit-templates` I had to include this horrific hack: https://github.com/simonw/datasette-edit-templates/blob/087f6a6cabc20020f2b0524f11aa3a7836320848/datasette_edit_templates/__init__.py#L72-L75 ```python @hookimpl def prepare_jinja2_environment(env): # TODO: This should ideally take datasette, but that's not an argument yet datasette = inspect.currentframe().f_back.f_back.f_back.f_back.f_locals[""self""] ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1809/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1375930971,PR_kwDOBm6k_c4_GVBS,1812,Bump furo from 2022.6.21 to 2022.9.15,49699333,dependabot[bot],closed,0,,,,,3,2022-09-16T13:10:45Z,2022-09-16T19:50:53Z,2022-09-16T19:50:52Z,CONTRIBUTOR,simonw/datasette/pulls/1812,"Bumps [furo](https://github.com/pradyunsg/furo) from 2022.6.21 to 2022.9.15.
Changelog

Sourced from furo's changelog.

Changelog

2022.09.15 -- Pragmatic Pistachio

  • Add a minimum version constraint on pygments.
  • Add an explicit dependency on sass.
  • Change right sidebar title from "Contents" to "On this page".
  • Correctly position sidebars on small screens.
  • Correctly select only Furo's own svg in related pages nav.
  • Make numpy-style documentation headers consistent.
  • Retitle the reference section.
  • Update npm dependencies.

2022.06.21 -- Opulent Opal

  • Fix docutils <= 0.17.x compatibility.
  • Bump to the latest Node.js LTS.

2022.06.04.1 -- Naughty Nickel bugfix

  • Fix the URL used in the "Edit this page" for Read the Docs builds.

2022.06.04 -- Naughty Nickel

  • ✨ Advertise Sphinx 5 compatibility.
  • ✨ Change to basic-ng as the base theme (from {pypi}sphinx-basic-ng).
  • Document site-wide announcement banners.
  • Drop the pin on pygments.
  • Improve edit button, using basic-ng's edit-this-page component.
  • Tweak headings to better match what users expect.
  • Tweak how Sphinx's default HTML is rendered, using docutils post-transforms (this replaces parsing+modifying it with BeautifulSoup).
  • When built with docutils 0.18, footnotes are rendered differently and stylised differently in Furo.

2022.04.07 -- Magical Mauve

  • ✨ Make sphinx-copybutton look better.
  • Add margin to indentations in line blocks.
  • Add styling for non-arabic list styles
  • Add support for html_baseurl.

... (truncated)

Commits
  • 08e6b38 Prepare release: 2022.09.15
  • 9de7613 Update changelog
  • a064929 Tweak changelog content style
  • 46f4adc Revert "Add initial theme.conf content for eventual ablog support"
  • 45b839b Set a minimum constraint on pygments
  • a4af988 [pre-commit.ci] pre-commit autoupdate (#518)
  • a72186f [pre-commit.ci] pre-commit autoupdate (#504)
  • 9f41ee6 Add initial theme.conf content for eventual ablog support
  • 75e0361 Make numpy-style documentation headers consistent
  • 9d280e6 [pre-commit.ci] pre-commit autoupdate (#487)
  • Additional commits viewable in compare view

[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=furo&package-manager=pip&previous-version=2022.6.21&new-version=2022.9.15)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores) Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`. [//]: # (dependabot-automerge-start) [//]: # (dependabot-automerge-end) ---
Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR: - `@dependabot rebase` will rebase this PR - `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it - `@dependabot merge` will merge this PR after your CI passes on it - `@dependabot squash and merge` will squash and merge this PR after your CI passes on it - `@dependabot cancel merge` will cancel a previously requested merge and block automerging - `@dependabot reopen` will reopen this PR if it is closed - `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually - `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
---- :books: Documentation preview :books:: https://datasette--1812.org.readthedocs.build/en/1812/ ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1812/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1366512990,PR_kwDOCGYnMM4-nBs9,486,"progressbar for inserts/upserts of all fileformats, closes #485",99098079,MischaU8,closed,0,,,,,7,2022-09-08T14:58:02Z,2022-09-15T20:40:03Z,2022-09-15T20:37:51Z,CONTRIBUTOR,simonw/sqlite-utils/pulls/486," ---- :books: Documentation preview :books:: https://sqlite-utils--486.org.readthedocs.build/en/486/ ",140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/486/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1366423176,I_kwDOCGYnMM5RcfaI,485,Progressbar not shown when inserting/upserting jsonlines file,99098079,MischaU8,closed,0,,,,,1,2022-09-08T14:13:18Z,2022-09-15T20:39:52Z,2022-09-15T20:37:52Z,CONTRIBUTOR,,"When inserting or upserting a jsonlines file, no progressbar is shown. Expected behavior is that, just like with .csv/.tsv files, also for a jsonlines file (--nl), unless --silent is provided, a progressbar is shown. ```bash sql-utils upsert mydb.db posts posts.jl --nl --pk post_id (silence) ``` Currently `file_progress` is only called within the tsv/csv logic, however I think it can be safely wrapped around all the all the input formats that use `decoded`: https://github.com/simonw/sqlite-utils/blob/main/sqlite_utils/cli.py#L963",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/485/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1367835380,I_kwDOCGYnMM5Rh4L0,487,Specify foreign key against compound key in other table,540968,ryanfox,closed,0,,,,,2,2022-09-09T13:32:09Z,2022-09-11T04:00:44Z,2022-09-11T04:00:44Z,NONE,,"When inserting rows via the library, is it possible to specify a foreign key to a compound primary key? For example, suppose I create a table: ``` db = Database('events.db') db['events'].insert_all([ {'venue': 'Times Square', 'date': '2022-12-31', 'title': 'Rockin New Year Eve'}, {'venue': 'Wembley Stadium', 'date': '2022-06-05', 'title': 'FA Cup'}, {'venue': 'Times Square', 'date': '2021-12-31', 'title': 'Rockin New Year Eve'}, ], pk=('date', 'venue')) ``` And I want to add related data in another table: ``` act = {'name': 'Rick Astley', 'venue': 'Times Square', 'date': '2021-12-31' } db['performers'].insert(act, pk=) ``` Is it possible to specify a value for `pk` that will point to the compound primary key in `events`? SQLite does support it: https://www.sqlite.org/foreignkeys.html#fk_composite",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/487/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1368030952,I_kwDOBm6k_c5Rin7o,1808,Database() constructor currently defaults is_mutable to False,9599,simonw,closed,0,,,,,5,2022-09-09T16:02:41Z,2022-09-09T16:37:57Z,2022-09-09T16:19:25Z,OWNER,,"This is surprising. It caused a bug in `datasette-upload-dbs` because I didn't expect it to do that. 
> I think this is an API design flaw in Datasette itself, but I can fix it here first. _Originally posted by @simonw in https://github.com/simonw/datasette-upload-dbs/issues/6#issuecomment-1242150394_ Code in question: https://github.com/simonw/datasette/blob/bf8d84af5422606597be893cedd375020cb2b369/datasette/database.py#L29-L32",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1808/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1246826792,I_kwDODLZ_YM5KUREo,10,"When running `auth` command, don't overwrite an existing auth.json file",11887,ashanan,closed,0,,,,,3,2022-05-24T16:42:20Z,2022-09-07T15:07:38Z,2022-08-22T16:17:19Z,NONE,,"Ran the `auth` command in the same directory I'd previously set up an auth.json file for `twitter-to-sqlite` and it was completely overwritten. Not the biggest issue, but still unexpected. Ideally, for me, the keys would just be added to the existing file, but getting a warning and a chance to back out would be a good solution as well.",213286752,pocket-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/pocket-to-sqlite/issues/10/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1353441389,I_kwDOCGYnMM5Qq-Bt,477,Conda Forge,49702524,thewchan,closed,0,,,,,2,2022-08-28T19:03:08Z,2022-09-07T03:46:55Z,2022-09-07T03:46:55Z,NONE,,"Hello! I have successfully put this package on to Conda Forge, and I have extending the invitation for the owner/maintainers of this package to be maintainers on Conda Forge as well. Let me know if you are interested! Thanks. https://github.com/conda-forge/sqlite-utils-feedstock",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/477/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1352932716,I_kwDOCGYnMM5QpB1s,471,sqlite-utils query --functions mechanism for registering extra functions,9599,simonw,closed,0,,,8355157,3.29,12,2022-08-27T03:57:53Z,2022-09-07T03:46:26Z,2022-08-27T05:10:57Z,OWNER,,"It would be really cool if you could register additional custom SQL functions for use with the `sqlite-utils query` command - something like this: ``` sqlite-utils data.db 'update images set domain = extract_domain(url)' --functions ' from urllib.parse import urlparse def extract_domain(url): return urlparse(url).netloc ' ``` Every function defined in that code block would be registered with the connection, unless the name began with an underscore.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/471/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1363440999,I_kwDOBm6k_c5RRHVn,1804,Ability to set a custom facet_size per table,9599,simonw,closed,0,,,,,6,2022-09-06T15:11:40Z,2022-09-07T00:21:56Z,2022-09-06T18:06:53Z,OWNER,,"Suggestion from Discord: https://discord.com/channels/823971286308356157/823971286941302908/1016725586351247430 > Is it possible to limit the facet size per database or even per table? 
This is a really good idea, it could be done in `metadata.yml`.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1804/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1352931076,PR_kwDOBm6k_c495vvy,1794,fix word break in facets by adding ul.tight-bullets li word-break: break-all,128286,dmr,closed,0,,,,,1,2022-08-27T03:47:25Z,2022-09-06T00:45:41Z,2022-09-06T00:45:41Z,CONTRIBUTOR,simonw/datasette/pulls/1794,"I noticed that long words break the layout of facets: ![image](https://user-images.githubusercontent.com/128286/187013146-fb2bbb60-a225-441b-ba8e-b9e74fb04f93.png) So I added CSS to add a line break. This is how the result looks now: ![image](https://user-images.githubusercontent.com/128286/187013175-a706fc72-9e69-4a75-9bdf-bdaa34a0cf51.png) I don't know enough about facet edge cases to decide if this change might break other things but it looks better for me so maybe this is helpful. ---- :books: Documentation preview :books:: https://datasette--1794.org.readthedocs.build/en/1794/ ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1794/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1351949898,PR_kwDOBm6k_c492dPw,1793,Added a useful resource,111973926,MobiWancode,closed,0,,,,,1,2022-08-26T08:41:26Z,2022-09-06T00:41:25Z,2022-09-06T00:41:24Z,NONE,simonw/datasette/pulls/1793,"Have added a useful resource about the types of databases in SQL i.e SQLite, PostgreSQL, MySQL &, etc from the scaler topics. ---- :books: Documentation preview :books:: https://datasette--1793.org.readthedocs.build/en/1793/ ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1793/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1362402998,I_kwDOBm6k_c5RNJ62,1802,Tests reliably failing on Python 3.7,9599,simonw,closed,0,,,,,15,2022-09-05T19:21:16Z,2022-09-06T00:40:20Z,2022-09-06T00:40:20Z,OWNER,," https://github.com/simonw/datasette/runs/8194907739?check_suite_focus=true I thought this might be an intermittent failure but attempts to re-run the tests have not made it pass. End of that trace is: ``` /home/runner/work/datasette/datasette/datasette/app.py:234: in __init__ self._refresh_schemas_lock = asyncio.Lock() /opt/hostedtoolcache/Python/3.7.13/x64/lib/python3.7/asyncio/locks.py:161: in __init__ self._loop = events.get_event_loop() _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def get_event_loop(self): """"""Get the event loop for the current context. Returns an instance of EventLoop or raises an exception. """""" if (self._local._loop is None and not self._local._set_called and isinstance(threading.current_thread(), threading._MainThread)): self.set_event_loop(self.new_event_loop()) if self._local._loop is None: raise RuntimeError('There is no current event loop in thread %r.' > % threading.current_thread().name) E RuntimeError: There is no current event loop in thread 'MainThread'. 
```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1802/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1362567197,PR_kwDOBm6k_c4-ZxWD,1803,Workaround for test failure: RuntimeError: There is no current event loop,9599,simonw,closed,0,,,,,1,2022-09-06T00:31:06Z,2022-09-06T00:40:19Z,2022-09-06T00:40:19Z,OWNER,simonw/datasette/pulls/1803,"Closes #1802 ---- :books: Documentation preview :books:: https://datasette--1803.org.readthedocs.build/en/1803/ ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1803/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1358848933,PR_kwDOBm6k_c4-NhzQ,1797,Bump black from 22.6.0 to 22.8.0,49699333,dependabot[bot],closed,0,,,,,0,2022-09-01T13:25:14Z,2022-09-05T18:51:52Z,2022-09-05T18:51:52Z,CONTRIBUTOR,simonw/datasette/pulls/1797,"Bumps [black](https://github.com/psf/black) from 22.6.0 to 22.8.0.
Release notes

Sourced from black's releases.

22.8.0

Highlights

  • Python 3.11 is now supported, except for blackd as aiohttp does not support 3.11 as of publishing (#3234)
  • This is the last release that supports running Black on Python 3.6 (formatting 3.6 code will continue to be supported until further notice)
  • Reword the stability policy to say that we may, in rare cases, make changes that affect code that was not previously formatted by Black (#3155)

Stable style

  • Fix an infinite loop when using # fmt: on/off in the middle of an expression or code block (#3158)
  • Fix incorrect handling of # fmt: skip on colon (:) lines (#3148)
  • Comments are no longer deleted when a line had spaces removed around power operators (#2874)
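(Editorial illustration of the `# fmt:` pragmas mentioned in the first two items above; this snippet is not part of Black's changelog.)

```python
# Black leaves the region between these pragmas untouched
# fmt: off
matrix = [
    1, 0,
    0, 1,
]
# fmt: on

# A single statement can also opt out of reformatting
total = sum( matrix )  # fmt: skip

print(total)
```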

Preview style

  • Single-character closing docstring quotes are no longer moved to their own line as this is invalid. This was a bug introduced in version 22.6.0. (#3166)
  • --skip-string-normalization / -S now prevents docstring prefixes from being normalized as expected (#3168)
  • When using --skip-magic-trailing-comma or -C, trailing commas are stripped from subscript expressions with more than 1 element (#3209)
  • Implicitly concatenated strings inside a list, set, or tuple are now wrapped inside parentheses (#3162)
  • Fix a string merging/split issue when a comment is present in the middle of implicitly concatenated strings on its own line (#3227)

Blackd

  • blackd now supports enabling the preview style via the X-Preview header (#3217)

Configuration

  • Black now uses the presence of debug f-strings to detect target version (#3215)
  • Fix misdetection of project root and verbose logging of sources in cases involving --stdin-filename (#3216)
  • Immediate .gitignore files in source directories given on the command line are now also respected, previously only .gitignore files in the project root and automatically discovered directories were respected (#3237)

Documentation

  • Recommend using BlackConnect in IntelliJ IDEs (#3150)

Integrations

  • Vim plugin: prefix messages with Black: so it's clear they come from Black (#3194)
  • Docker: changed to a /opt/venv installation + added to PATH to be available to non-root users (#3202)

Output

  • Change from deprecated asyncio.get_event_loop() to create our event loop which removes DeprecationWarning (#3164)
  • Remove logging from internal blib2to3 library since it regularly emits error logs about failed caching that can and should be ignored (#3193)

Parser

  • Type comments are now included in the AST equivalence check consistently so accidental deletion raises an error. Though type comments can't be tracked when running on PyPy 3.7 due to standard library limitations. (#2874)

Performance

... (truncated)

Changelog

Sourced from black's changelog.

22.8.0

Highlights

  • Python 3.11 is now supported, except for blackd as aiohttp does not support 3.11 as of publishing (#3234)
  • This is the last release that supports running Black on Python 3.6 (formatting 3.6 code will continue to be supported until further notice)
  • Reword the stability policy to say that we may, in rare cases, make changes that affect code that was not previously formatted by Black (#3155)

Stable style

  • Fix an infinite loop when using # fmt: on/off in the middle of an expression or code block (#3158)
  • Fix incorrect handling of # fmt: skip on colon (:) lines (#3148)
  • Comments are no longer deleted when a line had spaces removed around power operators (#2874)

Preview style

  • Single-character closing docstring quotes are no longer moved to their own line as this is invalid. This was a bug introduced in version 22.6.0. (#3166)
  • --skip-string-normalization / -S now prevents docstring prefixes from being normalized as expected (#3168)
  • When using --skip-magic-trailing-comma or -C, trailing commas are stripped from subscript expressions with more than 1 element (#3209)
  • Implicitly concatenated strings inside a list, set, or tuple are now wrapped inside parentheses (#3162)
  • Fix a string merging/split issue when a comment is present in the middle of implicitly concatenated strings on its own line (#3227)

Blackd

  • blackd now supports enabling the preview style via the X-Preview header (#3217)

Configuration

  • Black now uses the presence of debug f-strings to detect target version (#3215)
  • Fix misdetection of project root and verbose logging of sources in cases involving --stdin-filename (#3216)
  • Immediate .gitignore files in source directories given on the command line are now also respected, previously only .gitignore files in the project root and automatically discovered directories were respected (#3237)

Documentation

  • Recommend using BlackConnect in IntelliJ IDEs (#3150)

Integrations

... (truncated)

Commits
  • 2018e66 Prepare docs for release 22.8.0 (#3248)
  • 0019261 Update stable branch after publishing to PyPI (#3223)
  • 7757078 Improve & update release process to reflect recent changes (#3242)
  • 767604e Use .gitignore files in the initial source directories (#3237)
  • 2c90480 Use strict mypy checking (#3222)
  • ba618a3 Add parens around implicit string concatenations where it increases readabili...
  • c0cc19b Delay worker count determination
  • afed2c0 Load .gitignore and exclude regex at time of use
  • e269f44 Lazily import parallelized format modules
  • c47b91f Fix misdetection of project root with --stdin-filename (#3216)
  • Additional commits viewable in compare view

[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=black&package-manager=pip&previous-version=22.6.0&new-version=22.8.0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores) Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`. [//]: # (dependabot-automerge-start) [//]: # (dependabot-automerge-end) ---
Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR: - `@dependabot rebase` will rebase this PR - `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it - `@dependabot merge` will merge this PR after your CI passes on it - `@dependabot squash and merge` will squash and merge this PR after your CI passes on it - `@dependabot cancel merge` will cancel a previously requested merge and block automerging - `@dependabot reopen` will reopen this PR if it is closed - `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually - `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
---- :books: Documentation preview :books:: https://datasette--1797.org.readthedocs.build/en/1797/ ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1797/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1362363685,I_kwDOBm6k_c5RNAUl,1800,Remove upper bound dependencies as a default policy,9599,simonw,closed,0,,,,,3,2022-09-05T18:23:45Z,2022-09-05T18:39:52Z,2022-09-05T18:35:41Z,OWNER,,"https://iscinumpy.dev/post/bound-version-constraints/ has convinced me not to use upper bound dependencies unless I'm certain they are needed. Relevant PR: - https://github.com/simonw/datasette/pull/1799 Also: https://github.com/simonw/datasette/blob/ba35105eee2d3ba620e4f230028a02b2e2571df2/setup.py#L45-L46 https://github.com/simonw/datasette/blob/ba35105eee2d3ba620e4f230028a02b2e2571df2/setup.py#L48-L49 https://github.com/simonw/datasette/blob/ba35105eee2d3ba620e4f230028a02b2e2571df2/setup.py#L51-L55 https://github.com/simonw/datasette/blob/ba35105eee2d3ba620e4f230028a02b2e2571df2/setup.py#L57-L59 https://github.com/simonw/datasette/blob/ba35105eee2d3ba620e4f230028a02b2e2571df2/setup.py#L75-L78 https://github.com/simonw/datasette/blob/ba35105eee2d3ba620e4f230028a02b2e2571df2/setup.py#L81-L82 ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1800/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1180778860,PR_kwDOBm6k_c41BFWj,1685,"Update jinja2 requirement from <3.1.0,>=2.10.3 to >=2.10.3,<3.2.0",49699333,dependabot[bot],closed,0,,,,,3,2022-03-25T13:12:13Z,2022-09-05T18:36:49Z,2022-09-05T18:36:48Z,CONTRIBUTOR,simonw/datasette/pulls/1685,"Updates the requirements on [jinja2](https://github.com/pallets/jinja) to permit the latest version.
Release notes

Sourced from jinja2's releases.

3.1.0

This is a feature release, which includes new features and removes previously deprecated features. The 3.1.x branch is now the supported bugfix branch; the 3.0.x branch has become a tag marking the end of support for that branch. We encourage everyone to upgrade, and to use a tool such as pip-tools to pin all dependencies and control upgrades. We also encourage upgrading to MarkupSafe 2.1.1, the latest version at this time.

Changelog

Sourced from jinja2's changelog.

Version 3.1.0

Released 2022-03-24

  • Drop support for Python 3.6. :pr:1534

  • Remove previously deprecated code. :pr:1544

    • WithExtension and AutoEscapeExtension are built-in now.
    • contextfilter and contextfunction are replaced by pass_context. evalcontextfilter and evalcontextfunction are replaced by pass_eval_context. environmentfilter and environmentfunction are replaced by pass_environment.
    • Markup and escape should be imported from MarkupSafe.
    • Compiled templates from very old Jinja versions may need to be recompiled.
    • Legacy resolve mode for Context subclasses is no longer supported. Override resolve_or_missing instead of resolve.
    • unicode_urlencode is renamed to url_quote.
  • Add support for native types in macros. :issue:1510

  • The {% trans %} tag can use pgettext and npgettext by passing a context string as the first token in the tag, like {% trans "title" %}. :issue:1430

  • Update valid identifier characters from Python 3.6 to 3.7. :pr:1571

  • Filters and tests decorated with @async_variant are pickleable. :pr:1612

  • Add items filter. :issue:1561

  • Subscriptions ([0], etc.) can be used after filters, tests, and calls when the environment is in async mode. :issue:1573

  • The groupby filter is case-insensitive by default, matching other comparison filters. Added the case_sensitive parameter to control this. :issue:1463

  • Windows drive-relative path segments in template names will not result in FileSystemLoader and PackageLoader loading from drive-relative paths. :pr:1621
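(Editorial illustration of the decorator renames listed under the "Remove previously deprecated code" item above; this snippet is not part of the upstream changelog and assumes Jinja2 >= 3.0, where `pass_context` was introduced.)

```python
from jinja2 import Environment, pass_context

env = Environment()

# Before 3.1 this filter would have used the removed @contextfilter decorator;
# @pass_context is the replacement and still receives the template context first.
@pass_context
def resolve_title(context, value):
    return value or context.get('default_title', 'Untitled')

env.filters['resolve_title'] = resolve_title
print(env.from_string('{{ none|resolve_title }}').render(default_title='Hello'))
# -> Hello
```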

Version 3.0.3

Released 2021-11-09

  • Fix traceback rewriting internals for Python 3.10 and 3.11. :issue:1535
  • Fix how the native environment treats leading and trailing spaces when parsing values on Python 3.10. :pr:1537

... (truncated)

Commits
  • 84c0e2c Merge pull request #1625 from pallets/release-3.1.0
  • 7b0c47f release version 3.1.0
  • ede0f98 Merge pull request #1621 from pallets/template-safe-path
  • 040088a use posixpath.join when loading template names
  • a292075 Merge pull request #1620 from janfilips/patch-1
  • 6e4df02 Fix formatting in tricks.rst
  • 3a050b1 Merge pull request #1617 from pallets/docs-prose
  • 4b63cd8 rewrite include statement section
  • a98d482 clean up faq, move technical discussions
  • 9de99f8 clean up engine comparisons
  • Additional commits viewable in compare view

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`. [//]: # (dependabot-automerge-start) [//]: # (dependabot-automerge-end) ---
Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR: - `@dependabot rebase` will rebase this PR - `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it - `@dependabot merge` will merge this PR after your CI passes on it - `@dependabot squash and merge` will squash and merge this PR after your CI passes on it - `@dependabot cancel merge` will cancel a previously requested merge and block automerging - `@dependabot reopen` will reopen this PR if it is closed - `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually - `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1685/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1362242558,PR_kwDOBm6k_c4-Yqgo,1799,"Update aiofiles requirement from <0.9,>=0.4 to >=0.4,<22.2",49699333,dependabot[bot],closed,0,,,,,1,2022-09-05T16:13:48Z,2022-09-05T18:36:44Z,2022-09-05T18:36:43Z,CONTRIBUTOR,simonw/datasette/pulls/1799,"Updates the requirements on [aiofiles](https://github.com/Tinche/aiofiles) to permit the latest version.
---- :books: Documentation preview :books:: https://datasette--1799.org.readthedocs.build/en/1799/ ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1799/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1362367821,PR_kwDOBm6k_c4-ZGW6,1801,"Don't use upper bound dependencies, refs #1800",9599,simonw,closed,0,,,,,1,2022-09-05T18:29:28Z,2022-09-05T18:35:41Z,2022-09-05T18:35:41Z,OWNER,simonw/datasette/pulls/1801,"See https://iscinumpy.dev/post/bound-version-constraints/ ---- :books: Documentation preview :books:: https://datasette--1801.org.readthedocs.build/en/1801/ ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1801/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1359557737,I_kwDOBm6k_c5RCTRp,1798,"Parts of YAML file do not work when db name is ""off""",562352,CharlesNepote,closed,0,,,,,4,2022-09-01T22:10:57Z,2022-09-02T00:02:53Z,2022-09-01T23:56:33Z,NONE,,"I guess this issue is not very important and probably rare. To reproduce: * create and populate a db named `off.db` * in the yaml file, add any kind of information below `databases:\n off:` * the data are not taken into account (because ""off"" is interpreted as ""false"") YAML file: ```yaml title: Some title description_html: |-

This is an experiment.

databases: off: tables: products_from_owners: title: products_from_owners* description_html: |-

Description

``` The result for http://xxxx.xxx/-/metadata gives: ```json { ""title"": ""Some title"", ""description_html"": ""

This is an experiment.

"", ""databases"": { ""false"": { ""tables"": { ""products_from_owners"": { ""title"": ""products_from_owners*"", ""description_html"": ""

Description

"" } } } } } ``` => see the `""false""` instead of `""off""`.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1798/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1355433619,PR_kwDOCGYnMM4-B7Mc,480,search_sql add include_rank option,7908073,chapmanjacobd,closed,0,,,,,4,2022-08-30T09:10:29Z,2022-08-31T03:40:35Z,2022-08-31T03:40:35Z,CONTRIBUTOR,simonw/sqlite-utils/pulls/480,"I haven't tested this yet but wanted to get a heads-up whether this kind of change would be useful or if I should just duplicate the function and tweak it within my code ---- :books: Documentation preview :books:: https://sqlite-utils--480.org.readthedocs.build/en/480/ ",140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/480/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1178546862,I_kwDOCGYnMM5GPzKu,420,Document how to use a `--convert` function that runs initialization code first,770231,strada,closed,0,,,,,12,2022-03-23T19:07:36Z,2022-08-28T11:34:37Z,2022-03-25T20:07:33Z,NONE,,"When I have an insert command with transform like this: ``` cat items.json | jq '.data' | sqlite-utils insert listings.db listings - --convert ' d = enchant.Dict(""en_US"") row[""is_dictionary_word""] = d.check(row[""name""]) ' --import=enchant --ignore ``` I noticed as the number of rows increases the operation becomes quite slow, likely due to the creation of the `d = enchant.Dict(""en_US"")` object for each row. Is there a way to share that instance `d` between transform function calls, like a shared context?",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/420/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1353196970,I_kwDOCGYnMM5QqCWq,476,Release notes for 3.29,9599,simonw,closed,0,,,8355157,3.29,2,2022-08-27T23:21:21Z,2022-08-28T04:07:15Z,2022-08-28T04:07:03Z,OWNER,,https://github.com/simonw/sqlite-utils/compare/3.28...104f37fa4d2e7e5999c1d829267b62c737f74d3e,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/476/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1348169997,I_kwDOCGYnMM5QW3EN,467,Mechanism for ensuring a table has all the columns,9599,simonw,closed,0,,,8355157,3.29,13,2022-08-23T15:50:23Z,2022-08-27T23:19:41Z,2022-08-27T23:17:56Z,OWNER,,Suggested by @jefftriplett on Discord: https://discord.com/channels/823971286308356157/997738192360964156/1011655389063958600,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/467/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1348294436,PR_kwDOCGYnMM49qP2V,468,"db[table].create(..., transform=True) and create-table --transform",9599,simonw,closed,0,,,8355157,3.29,6,2022-08-23T17:27:58Z,2022-08-27T23:17:55Z,2022-08-27T23:17:55Z,OWNER,simonw/sqlite-utils/pulls/468,"Work in progress. Still needs documentation and tests (and to cover more cases of things that might have changed). 
Refs: - #467 ---- :books: Documentation preview :books:: https://sqlite-utils--468.org.readthedocs.build/en/468/ ",140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/468/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1353189941,I_kwDOCGYnMM5QqAo1,475,table.default_values introspection property,9599,simonw,closed,0,,,8355157,3.29,1,2022-08-27T22:33:31Z,2022-08-27T22:44:46Z,2022-08-27T22:43:02Z,OWNER,,"> Interesting challenge with `default_value`: I need to be able to tell if the default values passed to `.create()` differ from those in the database already. > > Introspecting that is a bit tricky: > > ```pycon > >>> import sqlite_utils > >>> db = sqlite_utils.Database(memory=True) > >>> db[""blah""].create({""id"": int, ""name"": str}, not_null=(""name"",), defaults={""name"": ""bob""}) >
> >>> db[""blah""].columns > [Column(cid=0, name='id', type='INTEGER', notnull=0, default_value=None, is_pk=0), Column(cid=1, name='name', type='TEXT', notnull=1, default_value=""'bob'"", is_pk=0)] > ``` > Note how a default value of the Python string `bob` is represented in the results of `PRAGMA table_info()` as `default_value=""'bob'""` - it's got single quotes added to it! > > So comparing default values from introspecting the database needs me to first parse that syntax. This may require a new table introspection method. _Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/468#issuecomment-1229279539_",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/475/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1199158210,I_kwDOCGYnMM5HebPC,423,.extract() doesn't set foreign key when extracted columns contain NULL value,37447552,jlieth,closed,0,,,,,1,2022-04-10T20:05:30Z,2022-08-27T14:45:04Z,2022-08-27T14:45:04Z,NONE,,"I've run into an issue with `extract` and I don't believe this is the intended behaviour. I'm working with a database with music listening information. Currently it has one large table `listens` that contains all information. I'm trying to normalize the database by extracting relevant columns to separate tables (`artists`, `tracks`, `albums`). Not every track has an album. A simplified demonstration with just `track_title` and `album_title` columns: ```ipython In [1]: import sqlite_utils In [2]: db = sqlite_utils.Database(memory=True) In [3]: db[""listens""].insert_all([ ...: {""id"": 1, ""track_title"": ""foo"", ""album_title"": ""bar""}, ...: {""id"": 2, ""track_title"": ""baz"", ""album_title"": None} ...: ], pk=""id"") Out[3]:
``` The track in the first row has an album, the second track doesn't. Now I extract album information into a separate column: ```ipython In [4]: db[""listens""].extract(columns=[""album_title""], table=""albums"", fk_column=""album_id"") Out[4]:
In [5]: list(db[""albums""].rows) Out[5]: [{'id': 1, 'album_title': 'bar'}, {'id': 2, 'album_title': None}] In [6]: list(db[""listens""].rows) Out[6]: [{'id': 1, 'track_title': 'foo', 'album_id': 1}, {'id': 2, 'track_title': 'baz', 'album_id': None}] ``` This behaves as expected -- the `album` table contains entries for both the existing album and the NULL album. The `listens` table has a foreign key only for the first row (since the album in the second row was empty). Now I want to extract the track information as well. Album information belongs to the track so I want to extract both columns to a new table. ```ipython In [7]: db[""listens""].extract(columns=[""track_title"", ""album_id""], table=""tracks"", fk_column=""track_id"") Out[7]:
In [8]: list(db[""tracks""].rows) Out[8]: [{'id': 1, 'track_title': 'foo', 'album_id': 1}, {'id': 2, 'track_title': 'baz', 'album_id': None}] In [9]: list(db[""listens""].rows) Out[9]: [{'id': 1, 'track_id': 1}, {'id': 2, 'track_id': None}] ``` Extracting to the `tracks` table worked fine (both tracks are present with correct columns). However, the `listens` table only has a foreign key to the newly created tracks for the first row, the foreign key in the second row is NULL. Changing the order of extracts doesn't help. I poked around in the source a bit and I believe [this line](https://github.com/simonw/sqlite-utils/blob/433813612ff9b4b501739fd7543bef0040dd51fe/sqlite_utils/db.py#L1737) (essentially comparing `NULL = NULL`) is the problem, but I don't know enough about SQL to create a reliable fix myself.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/423/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1309542173,PR_kwDOCGYnMM47pwAb,455,"in extract code, check equality with IS instead of = for nulls",536941,fgregg,closed,0,,,,,3,2022-07-19T13:40:25Z,2022-08-27T14:45:03Z,2022-08-27T14:45:03Z,CONTRIBUTOR,simonw/sqlite-utils/pulls/455,"sqlite ""IS"" is equivalent to SQL ""IS NOT DISTINCT FROM"" closes #423",140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/455/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1352953535,PR_kwDOCGYnMM4950Az,473,Support entrypoints for `--load-extension`,9599,simonw,closed,0,,,,,1,2022-08-27T05:53:59Z,2022-08-27T05:55:52Z,2022-08-27T05:55:47Z,OWNER,simonw/sqlite-utils/pulls/473,"Refs #470 ---- :books: Documentation preview :books:: https://sqlite-utils--473.org.readthedocs.build/en/473/ ",140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/473/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1352932038,I_kwDOCGYnMM5QpBrG,470,Upgrade `--load-extension` to accept entrypoints like Datasette,9599,simonw,closed,0,,,8355157,3.29,6,2022-08-27T03:53:20Z,2022-08-27T05:55:49Z,2022-08-27T05:55:48Z,OWNER,,"Imitate: - https://github.com/simonw/datasette/pull/1789 ``` # would load default entrypoint like before datasette data.db --load-extension ext # loads the extensions with the ""sqlite3_foo_init"" entrpoint datasette data.db --load-extension ext:sqlite3_foo_init # loads the extensions with the ""sqlite3_bar_init"" entrpoint datasette data.db --load-extension ext:sqlite3_bar_init ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/470/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1352946135,I_kwDOCGYnMM5QpFHX,472,Reuse the locals/globals fix from --functions for other code accepting options,9599,simonw,closed,0,,,8355157,3.29,2,2022-08-27T05:12:05Z,2022-08-27T05:20:12Z,2022-08-27T05:20:12Z,OWNER,,"I figured out a workaround for the ugly `global x` hack here: - https://github.com/simonw/sqlite-utils/issues/471#issuecomment-1229120653",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/472/reactions"", ""total_count"": 0, 
""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1352931464,I_kwDOCGYnMM5QpBiI,469,sqlite-utils rows --order option,9599,simonw,closed,0,,,8355157,3.29,1,2022-08-27T03:49:51Z,2022-08-27T04:30:49Z,2022-08-27T04:10:32Z,OWNER,,"For consistency with `search`: https://sqlite-utils.datasette.io/en/stable/cli-reference.html#search ``` -o, --order TEXT Order by ('column' or 'column desc') ``` I wanted to run `sqlite-utils rows db.db mytable --order 'rowid desc'` to see the most recently imported rows.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/469/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1320243134,I_kwDOCGYnMM5OsU--,458,Support custom names for registered functions,9599,simonw,closed,0,,,8355157,3.29,1,2022-07-28T00:13:00Z,2022-08-27T03:56:01Z,2022-07-28T00:13:57Z,OWNER,,"In this example: ```python @db.register_function def reverse_string(s): return """".join(reversed(list(s))) print(db.execute('select reverse_string(""hello"")').fetchone()[0]) ``` There's currently no way to over-ride the automatically selected name for the SQL function.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/458/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1319881016,PR_kwDOCGYnMM48Mmde,457,Link to installation instructions,9599,simonw,closed,0,,,8355157,3.29,2,2022-07-27T17:38:36Z,2022-08-27T03:55:52Z,2022-07-27T17:57:50Z,OWNER,simonw/sqlite-utils/pulls/457,Also testing https://docs.readthedocs.io/en/stable/pull-requests.html,140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/457/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1348394901,PR_kwDOBm6k_c49qmC2,1792,Test `--load-extension` in GitHub Actions,9599,simonw,closed,0,,,,,3,2022-08-23T18:43:29Z,2022-08-24T00:11:46Z,2022-08-24T00:11:45Z,OWNER,simonw/datasette/pulls/1792,"Refs: - #1789 ---- :books: Documentation preview :books:: https://datasette--1792.org.readthedocs.build/en/1792/ ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1792/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1344823170,PR_kwDOBm6k_c49e3_k,1789,Add new entrypoint option to `--load-extension`,15178711,asg017,closed,0,,,,,9,2022-08-19T19:27:47Z,2022-08-23T18:42:52Z,2022-08-23T18:34:30Z,CONTRIBUTOR,simonw/datasette/pulls/1789,"Closes #1784 The `--load-extension` flag can now accept an optional ""entrypoint"" value, to specify which entrypoint SQLite should load from the given extension. ```bash # would load default entrypoint like before datasette data.db --load-extension ext # loads the extensions with the ""sqlite3_foo_init"" entrpoint datasette data.db --load-extension ext:sqlite3_foo_init # loads the extensions with the ""sqlite3_bar_init"" entrpoint datasette data.db --load-extension ext:sqlite3_bar_init ``` For testing, I added a small SQLite extension in C at `tests/ext.c`. 
If compiled, then pytest will run the unit tests in `test_load_extensions.py`to verify that Datasette loads in extensions correctly (and loads the correct entrypoints). Compiling the extension requires a C compiler, I compiled it on my Mac with: ``` gcc ext.c -I path/to/sqlite -fPIC -shared -o ext.dylib ``` Where `path/to/sqlite` is a directory that contains the SQLite amalgamation header files. Re documentation: I added a bit to the help text for `--load-extension` (which I believe should auto-add to documentation?), and the existing extension documentation is spatialite specific. Let me know if a new extensions documentation page would be helpful!",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1789/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1339663518,I_kwDOBm6k_c5P2aSe,1784,"Include ""entrypoint"" option on `--load-extension`?",15178711,asg017,closed,0,,,,,2,2022-08-16T00:22:57Z,2022-08-23T18:34:31Z,2022-08-23T18:34:31Z,CONTRIBUTOR,,"## Problem SQLite extensions have the option to define multiple ""entrypoints"" in each loadable extension. For example, the upcoming version of `sqlite-lines` will have 2 entrypoints: the default `sqlite3_lines_init` (which SQLite will automatically guess for) and `sqlite3_lines_noread_init`. The `sqlite3_lines_noread_init` version omits functions that read from the filesystem, which is necessary for security purposes when running untrusted SQL (which Datasette does). (Similar multiple entrypoints will also be added for sqlite-http). The `--load-extension` flag, however, doesn't give the option to specify a different entrypoint, so the default one is always used. ## Proposal I want there to be a new command line option of the `--load-extension` flag to specify a custom entrypoint like so: ``` datasette my.db \ --load-extension ./lines0 sqlite3_lines0_noread_init ``` Then, under the hood, this line of code: https://github.com/simonw/datasette/blob/7af67b54b7d9bca43e948510fc62f6db2b748fa8/datasette/app.py#L562 Would look something like this: ```python conn.execute(""SELECT load_extension(?, ?)"", [extension, entrypoint]) ``` One potential problem: For backward compatibility, I'm not sure if Click allows cli flags to have variable number of options (""arity""). So I guess it could also use a `:` delimiter like `--static`: ``` datasette my.db \ --load-extension ./lines0:sqlite3_lines0_noread_init ``` Or maybe even a new flag name? ``` datasette my.db \ --load-extension-entrypoint ./lines0 sqlite3_lines0_noread_init ``` Personally I prefer the `:` option... and maybe even `--load-extension` -> `--load`? 
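A hedged sketch (not Datasette's actual implementation, and the helper name is hypothetical) of how a path:entrypoint value like the one proposed above could be split and handed to SQLite's two-argument load_extension() SQL function, using only the standard sqlite3 module:

```python
# Illustrative only; load_sqlite_extension is a hypothetical helper, not a Datasette API.
import sqlite3


def load_sqlite_extension(conn: sqlite3.Connection, value: str) -> None:
    path, sep, entrypoint = value.partition(":")
    conn.enable_load_extension(True)
    try:
        if sep:
            # SQLite's load_extension(X, Y) treats Y as the entry point symbol
            conn.execute("SELECT load_extension(?, ?)", [path, entrypoint])
        else:
            conn.execute("SELECT load_extension(?)", [path])
    finally:
        conn.enable_load_extension(False)


# e.g. load_sqlite_extension(conn, "./lines0:sqlite3_lines0_noread_init")
```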
Definitely out of scope for this issue tho ``` datasette my.db \ --load./lines0:sqlite3_lines0_noread_init ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1784/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1345452427,I_kwDODLZ_YM5QMfmL,11,"-a option is used for ""--auth"" and for ""--all""",2467,fernand0,closed,0,,,,,3,2022-08-21T10:50:48Z,2022-08-21T21:11:57Z,2022-08-21T21:11:57Z,NONE,,"I'm not sure which option is best, instead of -a -all.",213286752,pocket-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/pocket-to-sqlite/issues/11/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 750141615,MDExOlB1bGxSZXF1ZXN0NTI2ODQ3ODIz,7,Fixed conflicting CLI flags,8944,tlockney,closed,0,,,,,1,2020-11-24T23:25:12Z,2022-08-21T21:11:56Z,2022-08-21T21:11:56Z,CONTRIBUTOR,dogsheep/pocket-to-sqlite/pulls/7,"The `-a` used for the auth credentials and the shortened form of the `--all` flags were in conflict on the `fetch` command. To be consistent with other `-to-sqlite` libraries in the Dogsheep ecosystem, I removed the shortened form of the `--all` flag.",213286752,pocket-to-sqlite,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/pocket-to-sqlite/issues/7/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1334415381,PR_kwDOBm6k_c488nq6,1778,Use Read the Docs action v1,244656,humitos,closed,0,,,,,0,2022-08-10T10:30:50Z,2022-08-20T00:04:17Z,2022-08-20T00:04:17Z,CONTRIBUTOR,simonw/datasette/pulls/1778,"Read the Docs repository was renamed from `readthedocs/readthedocs-preview` to `readthedocs/actions/`. Now, the `preview` action is under `readthedocs/actions/preview` and is tagged as `v1` ---- :books: Documentation preview :books:: https://datasette--1778.org.readthedocs.build/en/1778/ ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1778/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1343732788,I_kwDOBm6k_c5QF7w0,1788,Make it more obvious that Datasette publish can publish multiple databases,9599,simonw,closed,0,,,,,0,2022-08-18T22:57:51Z,2022-08-18T23:06:16Z,2022-08-18T23:06:16Z,OWNER,,Feedback initially for `datasette-publish-fly` but it applies to the others too.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1788/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1326087800,PR_kwDOCGYnMM48hI-_,460,Cross-link CLI to Python docs,9599,simonw,closed,0,,,,,4,2022-08-02T16:18:28Z,2022-08-18T21:58:10Z,2022-08-18T21:58:07Z,OWNER,simonw/sqlite-utils/pulls/460,"Work in progress, partly to test the ReadTheDocs preview link action. 
Refs: - #426 ---- :books: Documentation preview :books:: https://readthedocs-preview--460.org.readthedocs.build/en/460/ ",140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/460/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1343422749,I_kwDOBm6k_c5QEwEd,1787,"Move ""datasette --get"" from Getting Started to CLI Reference",9599,simonw,closed,0,,,,,5,2022-08-18T17:53:39Z,2022-08-18T21:57:09Z,2022-08-18T21:56:21Z,OWNER,,It really shouldn't be here: https://docs.datasette.io/en/0.62/getting_started.html#datasette-get,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1787/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1334416486,PR_kwDOCGYnMM488n6D,463,Use Read the Docs action v1,244656,humitos,closed,0,,,,,1,2022-08-10T10:31:47Z,2022-08-18T08:30:14Z,2022-08-17T23:11:16Z,CONTRIBUTOR,simonw/sqlite-utils/pulls/463,"Read the Docs repository was renamed from `readthedocs/readthedocs-preview` to `readthedocs/actions/`. Now, the `preview` action is under `readthedocs/actions/preview` and is tagged as `v1` ---- :books: Documentation preview :books:: https://sqlite-utils--463.org.readthedocs.build/en/463/ ",140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/463/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1342357149,PR_kwDOCGYnMM49Wsnq,465,beanbag-docutils>=2.0,9599,simonw,closed,0,,,,,2,2022-08-17T22:41:39Z,2022-08-17T23:38:07Z,2022-08-17T23:38:02Z,OWNER,simonw/sqlite-utils/pulls/465,Refs #464,140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/465/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1338001039,I_kwDOCGYnMM5PwEaP,464,Link from documentation to source code,9599,simonw,closed,0,,,,,5,2022-08-13T16:19:57Z,2022-08-17T23:38:03Z,2022-08-17T23:38:03Z,OWNER,,Twitter conversation asking for ways to automate this here: https://twitter.com/simonw/status/1558260492015046656,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/464/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1342374388,PR_kwDOCGYnMM49Wv9T,466,Use Read the Docs action v1 (#463),9599,simonw,closed,0,,,,,0,2022-08-17T23:11:50Z,2022-08-17T23:11:54Z,2022-08-17T23:11:54Z,OWNER,simonw/sqlite-utils/pulls/466,"Read the Docs repository was renamed from `readthedocs/readthedocs-preview` to `readthedocs/actions/`. 
Now, the `preview` action is under `readthedocs/actions/preview` and is tagged as `v1`",140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/466/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1084193403,PR_kwDOBm6k_c4wDKmb,1574,introduce new option for datasette package to use a slim base image,33631,fs111,closed,0,,,,,6,2021-12-19T21:18:19Z,2022-08-15T08:49:31Z,2022-08-15T08:49:31Z,NONE,simonw/datasette/pulls/1574,"The official python images on docker hub come with a slim variant that is significantly smaller than the default. The diff does not change the default, but allows to switch to the `slim` variant with commandline switch (`--slim-base-image`) Size comparison: ``` $ datasette package some.db -t fat --install ""datasette-basemap datasette-cluster-map"" $ datasette package some.db -t slim --slim-base-image --install ""datasette-basemap datasette-cluster-map"" $ docker images REPOSITORY TAG IMAGE ID CREATED SIZE fat latest 807b393ace0d 9 seconds ago 978MB slim latest 31bc5e63505c 8 minutes ago 191MB ```",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1574/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1334628400,I_kwDOBm6k_c5PjNAw,1779,google cloudrun updated their limits on maxscale based on memory and cpu count,536941,fgregg,closed,0,,,8303187,Datasette 0.62,13,2022-08-10T13:27:21Z,2022-08-14T19:42:59Z,2022-08-14T17:07:34Z,CONTRIBUTOR,,"if you don't set an explicit limit on container scaling, then [google defaults to 100](https://cloud.google.com/run/docs/configuring/max-instances#limits) google recently updated the [limits on container scaling](https://cloud.google.com/run/docs/configuring/max-instances#limits), such that if you set up datasette to use more memory or cpu, then you need to set the maxScale argument much smaller than 100. would be nice if `datasette publish` could do this math for you and set the right maxScale. [Log of an failing publish run](https://github.com/labordata/warehouse/runs/7764725972?check_suite_focus=true#step:8:332). ``` ERROR: (gcloud.run.deploy) spec.template.spec.containers[0].resources.limits.cpu: Invalid value specified for cpu. For the specified value, maxScale may not exceed 15. Consider running your workload in a region with greater capacity, decreasing your requested cpu-per-instance, or requesting an increase in quota for this region if you are seeing sustained usage near this limit, see https://cloud.google.com/run/quotas. Your project may gain access to further scaling by adding billing information to your account. 
Traceback (most recent call last): File ""/home/runner/.local/bin/datasette"", line 8, in sys.exit(cli()) File ""/home/runner/.local/lib/python3.8/site-packages/click/core.py"", line 1128, in __call__ return self.main(*args, **kwargs) File ""/home/runner/.local/lib/python3.8/site-packages/click/core.py"", line 1053, in main rv = self.invoke(ctx) File ""/home/runner/.local/lib/python3.8/site-packages/click/core.py"", line 1659, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/home/runner/.local/lib/python3.8/site-packages/click/core.py"", line 1659, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/home/runner/.local/lib/python3.8/site-packages/click/core.py"", line 1395, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/home/runner/.local/lib/python3.8/site-packages/click/core.py"", line 754, in invoke return __callback(*args, **kwargs) File ""/home/runner/.local/lib/python3.8/site-packages/datasette/publish/cloudrun.py"", line 160, in cloudrun check_call( File ""/usr/lib/python3.8/subprocess.py"", line 364, in check_call raise CalledProcessError(retcode, cmd) subprocess.CalledProcessError: Command 'gcloud run deploy --allow-unauthenticated --platform=managed --image gcr.io/labordata/datasette warehouse --memory 8Gi --cpu 2' returned non-zero exit status 1. ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1779/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1338278056,I_kwDOBm6k_c5PxICo,1782,Release notes for Datasette 0.62,9599,simonw,closed,0,,,8303187,Datasette 0.62,2,2022-08-14T15:26:45Z,2022-08-14T17:40:45Z,2022-08-14T17:32:54Z,OWNER,,"I've written a lot of these already for the alphas: - https://github.com/simonw/datasette/releases/tag/0.62a0 - https://github.com/simonw/datasette/releases/tag/0.62a1",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1782/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1218133366,I_kwDOBm6k_c5Imz12,1728,Writable canned queries fail with useless non-error against immutable databases,127565,wragge,closed,0,,,8303187,Datasette 0.62,13,2022-04-28T03:10:34Z,2022-08-14T16:34:40Z,2022-08-14T16:34:40Z,CONTRIBUTOR,,"I've been banging my head against a wall for a while and would appreciate any pointers... - I have a writeable canned query to update rows in the db. - I'm using the github-oauth plugin for authentication. - I have `allow` set on the query to accept my GitHub id and a GH organisation. - Authentication seems to work as expected both locally and on Cloudrun -- viewing `/-/actor` gives the same result in both environments - I can access the 'padlocked' canned query in both environments. Everything seems to be the same, but the canned query works perfectly when run locally, and fails when I try it on Cloudrun. I'm redirected back to the canned query page and the db is not changed. There's nothing in the Cloudstor logs to indicate an error. 
Any clues as to where I should be looking for the problem?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1728/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1223527226,I_kwDOBm6k_c5I7Ys6,1738,"""Cannot use _sort and _sort_desc at the same time""",9599,simonw,closed,0,,,8303187,Datasette 0.62,2,2022-05-03T01:06:24Z,2022-08-14T16:13:55Z,2022-08-14T16:13:55Z,OWNER,,"Triggered this error while playing with the sort desc checkbox and the apply button that are only visible on this page at mobile screen width: https://latest.datasette.io/fixtures/compound_three_primary_keys?_sort_desc=pk1 Navigate to that page (with the browser narrow enough to show the box), un-check the box and click Apply: ![sort-bug](https://user-images.githubusercontent.com/9599/166390804-cb289b29-63dc-4986-b7f9-81cf2ae04914.gif) Also notable: I managed to get to a page with `?_sort_desk=pk1` in the URL three times by clicking around with that button.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1738/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1318907685,I_kwDOBm6k_c5OnO8l,1773,500 error if sorted by a column not in the ?_col= list,9599,simonw,closed,0,,,8303187,Datasette 0.62,4,2022-07-27T01:20:27Z,2022-08-14T16:06:25Z,2022-08-14T15:44:05Z,OWNER,,"For example: https://latest.datasette.io/fixtures/sortable?_sort_desc=sortable&_col=sortable_with_nulls That's `?_sort_desc=sortable&_col=sortable_with_nulls` ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1773/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1296222572,I_kwDOBm6k_c5NQsls,1768,Upgrade to 3.10.6-slim-bullseye Docker base image,9599,simonw,closed,0,,,8303187,Datasette 0.62,5,2022-07-06T18:37:49Z,2022-08-14T15:54:36Z,2022-08-14T15:54:11Z,OWNER,,For the package published to Docker Hub and also the containers used by `datasette package` and `datasette publish cloudrun`.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1768/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1306492437,I_kwDOBm6k_c5N334V,1770,`handle_exception` plugin hook for custom error handling,9599,simonw,closed,0,,,8303187,Datasette 0.62,14,2022-07-15T20:52:49Z,2022-08-14T15:25:51Z,2022-08-14T15:25:51Z,OWNER,,"I need this for a couple of plugins, both of which are broken at the moment: - https://github.com/simonw/datasette-sentry/issues/1 - https://github.com/simonw/datasette-show-errors/issues/2",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1770/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1338137350,I_kwDOBm6k_c5PwlsG,1781,Ensure Datasette Lite is promoted in docs and README,9599,simonw,closed,0,,,8303187,Datasette 0.62,1,2022-08-14T05:12:35Z,2022-08-14T15:24:40Z,2022-08-14T15:24:40Z,OWNER,,As of 0.62 https://lite.datasette.io is a supported piece of the overall Datasette ecosystem.,107914493,datasette,issue,,,"{""url"": 
""https://api.github.com/repos/simonw/datasette/issues/1781/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1326391841,PR_kwDOCGYnMM48iLGF,462,Discord badge,9599,simonw,closed,0,,,,,2,2022-08-02T20:56:04Z,2022-08-02T21:15:57Z,2022-08-02T21:15:52Z,OWNER,simonw/sqlite-utils/pulls/462,"Also testing fix for: - https://github.com/readthedocs/readthedocs-preview/issues/10 ---- :books: Documentation preview :books:: https://sqlite-utils--462.org.readthedocs.build/en/462/ ",140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/462/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 663145122,MDU6SXNzdWU2NjMxNDUxMjI=,903,Add temporary plugin testing pattern to the testing docs,9599,simonw,closed,0,,,,,1,2020-07-21T16:22:34Z,2022-07-18T21:34:33Z,2022-07-18T21:31:22Z,OWNER,,"https://til.simonwillison.net/pytest/registering-plugins-in-tests Would be useful to include this pattern on https://datasette.readthedocs.io/en/stable/testing_plugins.html",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/903/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1308461063,I_kwDODFdgUs5N_YgH,74,500 error in github-to-sqlite demo,9599,simonw,closed,0,,,,,5,2022-07-18T19:39:32Z,2022-07-18T21:16:18Z,2022-07-18T21:14:22Z,MEMBER,,"https://github-to-sqlite.dogsheep.net/github/issue_comments throws a 500: > `cannot import name 'etree' from 'markdown.util' (/usr/local/lib/python3.8/site-packages/markdown/util.py)` https://console.cloud.google.com/run/detail/us-central1/github-to-sqlite/metrics?project=datasette-222320 suggests this started happening 3 days ago.",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/74/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1261884917,PR_kwDODFdgUs45K1L3,73,Fixing 'NoneType' object has no attribute 'items',1224205,empjustine,closed,0,,,,,1,2022-06-06T13:58:11Z,2022-07-18T19:40:12Z,2022-07-18T19:40:12Z,CONTRIBUTOR,dogsheep/github-to-sqlite/pulls/73,"Under some conditions, GitHub caches removed starred repositories and ends up leaving dangling `None` user references. 
Traceback (most recent call last): File ""/home/dogsheep/dogsheep/github-to-sqlite/bin/github-to-sqlite"", line 8, in sys.exit(cli()) File ""/home/dogsheep/dogsheep/github-to-sqlite/lib64/python3.10/site-packages/click/core.py"", line 1130, in __call__ return self.main(*args, **kwargs) File ""/home/dogsheep/dogsheep/github-to-sqlite/lib64/python3.10/site-packages/click/core.py"", line 1055, in main rv = self.invoke(ctx) File ""/home/dogsheep/dogsheep/github-to-sqlite/lib64/python3.10/site-packages/click/core.py"", line 1657, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/home/dogsheep/dogsheep/github-to-sqlite/lib64/python3.10/site-packages/click/core.py"", line 1404, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/home/dogsheep/dogsheep/github-to-sqlite/lib64/python3.10/site-packages/click/core.py"", line 760, in invoke return __callback(*args, **kwargs) File ""/home/dogsheep/dogsheep/github-to-sqlite/lib64/python3.10/site-packages/github_to_sqlite/cli.py"", line 181, in starred utils.save_stars(db, user, stars) File ""/home/dogsheep/dogsheep/github-to-sqlite/lib64/python3.10/site-packages/github_to_sqlite/utils.py"", line 494, in save_stars repo_id = save_repo(db, repo) File ""/home/dogsheep/dogsheep/github-to-sqlite/lib64/python3.10/site-packages/github_to_sqlite/utils.py"", line 308, in save_repo to_save[""owner""] = save_user(db, to_save[""owner""]) File ""/home/dogsheep/dogsheep/github-to-sqlite/lib64/python3.10/site-packages/github_to_sqlite/utils.py"", line 229, in save_user for key, value in user.items() AttributeError: 'NoneType' object has no attribute 'items'",207052882,github-to-sqlite,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/73/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1292368833,I_kwDOBm6k_c5NB_vB,1764,Keep track of config_dir in directory mode (for plugins),25778,eyeseast,closed,0,,,,,0,2022-07-03T16:57:49Z,2022-07-18T01:12:45Z,2022-07-18T01:12:45Z,CONTRIBUTOR,,"I started working on using `config_dir` with my [datasette-query-files plugin](https://github.com/eyeseast/datasette-query-files) and realized Datasette doesn't actually hold onto the `config_dir` argument. It gets used in `__init__` but then forgotten. It would be nice to be able to use it in plugins, though. Here's the reference issue: https://github.com/eyeseast/datasette-query-files/issues/4 ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1764/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1292377561,PR_kwDOBm6k_c46wdOW,1766,Keep track of config_dir,25778,eyeseast,closed,0,,,,,2,2022-07-03T17:37:02Z,2022-07-18T01:12:45Z,2022-07-18T01:12:45Z,CONTRIBUTOR,simonw/datasette/pulls/1766,"Closes #1764 Small change that adds `self.config_dir = config_dir` to `Datasette.__init__`. 
This will let plugins also use `config_dir`, if available.",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1766/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1306020162,PR_kwDOBm6k_c47eFtx,1769,"Update pytest-asyncio requirement from <0.19,>=0.17 to >=0.17,<0.20",49699333,dependabot[bot],closed,0,,,,,1,2022-07-15T13:10:15Z,2022-07-18T01:06:38Z,2022-07-18T01:06:38Z,CONTRIBUTOR,simonw/datasette/pulls/1769,"Updates the requirements on [pytest-asyncio](https://github.com/pytest-dev/pytest-asyncio) to permit the latest version.
Release notes

Sourced from pytest-asyncio's releases.

pytest-asyncio 0.19.0


title: 'pytest-asyncio: pytest support for asyncio'

pytest-asyncio is an Apache2 licensed library, written in Python, for testing asyncio code with pytest.

asyncio code is usually written in the form of coroutines, which makes it slightly more difficult to test using normal testing tools. pytest-asyncio provides useful fixtures and markers to make testing easier.

@pytest.mark.asyncio
async def test_some_asyncio_code():
    res = await library.do_something()
    assert b"expected result" == res

pytest-asyncio has been strongly influenced by pytest-tornado.

Features

  • fixtures for creating and injecting versions of the asyncio event loop
  • fixtures for injecting unused tcp/udp ports
  • pytest markers for treating tests as asyncio coroutines
  • easy testing with non-default event loops
  • support for [async def]{.title-ref} fixtures and async generator fixtures
  • support auto mode to handle all async fixtures and tests automatically by asyncio; provide strict mode if a test suite should work with different async frameworks simultaneously, e.g. asyncio and trio.

Installation

... (truncated)

Changelog

Sourced from pytest-asyncio's changelog.

0.19.0 (22-07-13)

  • BREAKING: The default asyncio_mode is now strict. [#293](https://github.com/pytest-dev/pytest-asyncio/issues/293) <https://github.com/pytest-dev/pytest-asyncio/issues/293>_
  • Removes setup.py since all relevant configuration is present in setup.cfg. Users requiring an editable installation of pytest-asyncio need to use pip v21.1 or newer. [#283](https://github.com/pytest-dev/pytest-asyncio/issues/283) <https://github.com/pytest-dev/pytest-asyncio/issues/283>_
  • Declare support for Python 3.11.
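In practice the new strict default means an async test must carry an explicit marker (or the project must opt into auto mode via the asyncio_mode setting in its pytest configuration); a minimal sketch, not taken from the changelog:

```python
# Requires pytest and pytest-asyncio to be installed.
import asyncio

import pytest


@pytest.mark.asyncio  # required under the strict default; auto mode applies it automatically
async def test_sleep_returns_none():
    assert await asyncio.sleep(0) is None
```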

0.18.3 (22-03-25)

  • Adds pytest-trio <https://pypi.org/project/pytest-trio/>_ to the test dependencies
  • Fixes a bug that caused pytest-asyncio to try to set up async pytest_trio fixtures in strict mode. [#298](https://github.com/pytest-dev/pytest-asyncio/issues/298) <https://github.com/pytest-dev/pytest-asyncio/issues/298>_

0.18.2 (22-03-03)

  • Fix asyncio auto mode not marking static methods. [#295](https://github.com/pytest-dev/pytest-asyncio/issues/295) <https://github.com/pytest-dev/pytest-asyncio/issues/295>_
  • Fix a compatibility issue with Hypothesis 6.39.0. [#302](https://github.com/pytest-dev/pytest-asyncio/issues/302) <https://github.com/pytest-dev/pytest-asyncio/issues/302>_

0.18.1 (22-02-10)

  • Fixes a regression that prevented async fixtures from working in synchronous tests. [#286](https://github.com/pytest-dev/pytest-asyncio/issues/286) <https://github.com/pytest-dev/pytest-asyncio/issues/286>_

0.18.0 (22-02-07)

  • Raise a warning if @​pytest.mark.asyncio is applied to non-async function. [#275](https://github.com/pytest-dev/pytest-asyncio/issues/275) <https://github.com/pytest-dev/pytest-asyncio/issues/275>_
  • Support parametrized event_loop fixture. [#278](https://github.com/pytest-dev/pytest-asyncio/issues/278) <https://github.com/pytest-dev/pytest-asyncio/issues/278>_

0.17.2 (22-01-17)

  • Require typing-extensions on Python < 3.8.
  • Fix a regression in tests collection introduced by 0.17.1, the plugin works fine with non-python tests again. [#267](https://github.com/pytest-dev/pytest-asyncio/issues/267) <https://github.com/pytest-dev/pytest-asyncio/issues/267>_

0.17.1 (22-01-16)

  • Fixes a bug that prevents async Hypothesis tests from working without explicit asyncio marker when --asyncio-mode=auto is set. [#258](https://github.com/pytest-dev/pytest-asyncio/issues/258) <https://github.com/pytest-dev/pytest-asyncio/issues/258>_
  • Fixed a bug that closes the default event loop if the loop doesn't exist [#257](https://github.com/pytest-dev/pytest-asyncio/issues/257) <https://github.com/pytest-dev/pytest-asyncio/issues/257>_
  • Added type annotations. [#198](https://github.com/pytest-dev/pytest-asyncio/issues/198) <https://github.com/pytest-dev/pytest-asyncio/issues/198>_
  • Show asyncio mode in pytest report headers. [#266](https://github.com/pytest-dev/pytest-asyncio/issues/266) <https://github.com/pytest-dev/pytest-asyncio/issues/266>_
  • Relax asyncio_mode type definition; it allows to support pytest 6.1+. [#262](https://github.com/pytest-dev/pytest-asyncio/issues/262) <https://github.com/pytest-dev/pytest-asyncio/issues/262>_

0.17.0 (22-01-13)

  • pytest-asyncio no longer alters existing event loop policies. [#168](https://github.com/pytest-dev/pytest-asyncio/issues/168) <https://github.com/pytest-dev/pytest-asyncio/issues/168>, [#188](https://github.com/pytest-dev/pytest-asyncio/issues/188) <https://github.com/pytest-dev/pytest-asyncio/issues/168>
  • Drop support for Python 3.6
  • Fixed an issue when pytest-asyncio was used in combination with flaky or inherited asynchronous Hypothesis tests. [#178](https://github.com/pytest-dev/pytest-asyncio/issues/178) <https://github.com/pytest-dev/pytest-asyncio/issues/178>_ [#231](https://github.com/pytest-dev/pytest-asyncio/issues/231) <https://github.com/pytest-dev/pytest-asyncio/issues/231>_
  • Added flaky <https://pypi.org/project/flaky/>_ to test dependencies
  • Added unused_udp_port and unused_udp_port_factory fixtures (similar to unused_tcp_port and unused_tcp_port_factory counterparts. [#99](https://github.com/pytest-dev/pytest-asyncio/issues/99) <https://github.com/pytest-dev/pytest-asyncio/issues/99>_
  • Added the plugin modes: strict, auto, and legacy. See documentation <https://github.com/pytest-dev/pytest-asyncio#modes>_ for details. [#125](https://github.com/pytest-dev/pytest-asyncio/issues/125) <https://github.com/pytest-dev/pytest-asyncio/issues/125>_
  • Correctly process KeyboardInterrupt during async fixture setup phase [#219](https://github.com/pytest-dev/pytest-asyncio/issues/219) <https://github.com/pytest-dev/pytest-asyncio/issues/219>_

... (truncated)

Commits
  • 2da33c4 docs: Prepare v0.19.0 release. (#385)
  • 07beb80 opt into strict mode by default (#380)
  • 25c54a5 Clarify documentation of event_loop fixture (#375)
  • 49f07a4 Bump typing-extensions from 4.2.0 to 4.3.0 in /dependencies/default (#382)
  • 739198b Bump hypothesis from 6.48.0 to 6.48.3 in /dependencies/default (#381)
  • db72f25 Bump importlib-metadata from 4.11.4 to 4.12.0 in /dependencies/default (#378)
  • 4cf16cf Bump hypothesis from 6.47.3 to 6.48.0 in /dependencies/default (#377)
  • f13c85f docs: Fix typo in README.
  • b463f72 Python 3.11 support (#370)
  • 860ff51 Bump hypothesis from 6.47.2 to 6.47.3 in /dependencies/default (#373)
  • Additional commits viewable in compare view

",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1769/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1280136357,PR_kwDOBm6k_c46Hsvj,1760,Bump furo from 2022.4.7 to 2022.6.21,49699333,dependabot[bot],closed,0,,,,,1,2022-06-22T13:22:31Z,2022-07-18T01:06:27Z,2022-07-18T01:06:27Z,CONTRIBUTOR,simonw/datasette/pulls/1760,"Bumps [furo](https://github.com/pradyunsg/furo) from 2022.4.7 to 2022.6.21.
Changelog

Sourced from furo's changelog.

Changelog

2022.06.21 -- Opulent Opal

  • Fix docutils <= 0.17.x compatibility
  • Bump to the latest Node.js LTS

2022.06.04.1 -- Naughty Nickel bugfix

  • Fix the URL used in the "Edit this page" for Read the Docs builds.

2022.06.04 -- Naughty Nickel

  • ✨ Advertise Sphinx 5 compatibility.
  • ✨ Change to basic-ng as the base theme (from {pypi}sphinx-basic-ng).
  • Document site-wide announcement banners.
  • Drop the pin on pygments.
  • Improve edit button, using basic-ng's edit-this-page component.
  • Tweak headings to better match what users expect.
  • Tweak how Sphinx's default HTML is rendered, using docutils post-transforms (this replaces parsing+modifying it with BeautifulSoup).
  • When built with docutils 0.18, footnotes are rendered differently and stylised differently in Furo.

2022.04.07 -- Magical Mauve

  • ✨ Make sphinx-copybutton look better.
  • Add margin to indentations in line blocks.
  • Add styling for non-arabic list styles
  • Add support for html_baseurl.
  • Improve "Edit this page" icon to be more accessible.
  • Improve html_sidebars example.
  • Tweak positioning of back to top on desktop.
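As an aside, the html_baseurl support mentioned above hooks into the standard Sphinx setting of the same name; a hypothetical conf.py fragment (project name and URL are placeholders, not from the furo repository):

```python
# Hypothetical Sphinx conf.py fragment.
project = "example-docs"
html_theme = "furo"
# Sphinx uses html_baseurl for absolute/canonical URLs; furo now respects it too.
html_baseurl = "https://example.readthedocs.io/en/latest/"
```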

2022.03.04 -- Lucent Lilac

  • Improve support for print media.
  • Reduce heading sizes for h3 and below.
  • Don't allow selecting headerlink content.
  • Improve how overflow wrapping is handled.
  • Add a reference from the configuration variables to the color customisation page.

2022.02.23 -- Keen Kobi

... (truncated)

[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=furo&package-manager=pip&previous-version=2022.4.7&new-version=2022.6.21)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)
",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1760/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1292060682,I_kwDOCGYnMM5NA0gK,450,Add --ignore option to more commands,9599,simonw,closed,0,,,,,9,2022-07-02T13:52:02Z,2022-07-15T22:39:09Z,2022-07-15T22:37:45Z,OWNER,,"As seen in https://sqlite-utils.datasette.io/en/stable/cli-reference.html#add-foreign-key Could make this TIL trick unnecessary: https://til.simonwillison.net/bash/ignore-errors",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/450/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1298531653,I_kwDOCGYnMM5NZgVF,451,Make sqlite_utils.utils.chunks a documented function,9599,simonw,closed,0,,,,,2,2022-07-08T06:01:04Z,2022-07-15T22:09:34Z,2022-07-15T21:59:33Z,OWNER,,I want to use it in another project: https://github.com/simonw/sqlite-utils/blob/8a9fe6498faf783a1fdeb1793e661ad194a05267/sqlite_utils/utils.py#L471-L474,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/451/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1303169663,I_kwDOCGYnMM5NrMp_,453,'unclosed file' warning when using insert_upsert_implementation from Python,311257,makkus,closed,0,,,,,1,2022-07-13T09:34:35Z,2022-07-15T21:52:25Z,2022-07-15T21:52:21Z,NONE,,"I'm using the `[insert_upsert_implementation](https://github.com/simonw/sqlite-utils/blob/main/sqlite_utils/cli.py)` function directly in my Python code to import a csv file with all the bells and whistles `sqlite-utils` provides, but I'm getting a resource warning that a io.TextWrapper object is not closed. The warning goes away when wrapping the code from [this line](https://github.com/simonw/sqlite-utils/blob/42440d6345c242ee39778045e29143fb550bd2c2/sqlite_utils/cli.py#L924) in a try/finally block like: ``` try: ... ... finally: decoded.close() ``` (might be that `sniff_buffer` must also be closed if non null, but I might be wrong) I suspect Python closes the reference automatically when the sqlite-utils cli run is done, but since my code doesn't exit, I'm getting the warning. Alternatively, it'd be cool if the 'import csv/tsv' functionality could be added directly to the Database class.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/453/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1306548397,I_kwDOCGYnMM5N4Fit,454,CLI command for duplicating tables,9599,simonw,closed,0,,,,,1,2022-07-15T21:31:27Z,2022-07-15T21:48:23Z,2022-07-15T21:45:51Z,OWNER,,"CLI equivalent of: - #449",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/454/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1279863844,I_kwDOCGYnMM5MSSwk,449,Utilities for duplicating tables and creating a table with the results of a query,1690072,davidleejy,closed,0,,,,,4,2022-06-22T09:41:43Z,2022-07-15T21:46:13Z,2022-07-15T21:21:36Z,CONTRIBUTOR,,"is there a duplicate table functionality? 
Otherwise, I'd be happy to submit a PR. In sqlite3 it would look like: ```python import sqlite3 as sl con = sl.connect('prompt-tune.db') def db_duplicate_table(table_name, table_name_new, con=con): # Duplicates table `table_name` to a new table `table_name_new`. try: cur = con.cursor() cur.execute(f""""""CREATE TABLE {table_name_new} AS SELECT * FROM {table_name}"""""") except Exception as e: print(e) finally: cur.close() db_duplicate_table('orig_table', 'new_table') ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/449/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1299760627,PR_kwDOCGYnMM47JUun,452,Add duplicate table feature,1690072,davidleejy,closed,0,,,,,1,2022-07-09T20:24:31Z,2022-07-15T21:21:37Z,2022-07-15T21:21:36Z,CONTRIBUTOR,simonw/sqlite-utils/pulls/452,"This PR addresses a feature request raised in issue #449. Specifically this PR adds a functionality that lets users duplicate a table via: ```python table_new = db[""my_table""].duplicate(""new_table"") ``` Test added in file `tests/test_duplicate.py`. Happy to make changes to meet maintainers' feedback, if any. ",140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/452/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 860625833,MDU6SXNzdWU4NjA2MjU4MzM=,1300,Make row available to `render_cell` plugin hook,3243482,abdusco,closed,0,,,,,5,2021-04-18T10:14:37Z,2022-07-07T16:34:05Z,2022-07-07T16:31:22Z,CONTRIBUTOR,,"*Original title: **Generating URL for a row inside `render_cell` hook*** Hey, I am using Datasette to view a database that contains video metadata. It has BLOB columns that contain video thumbnails in JPG format (around 100-500KB per row). I've registered an output formatter that extends `datasette.blob_renderer.render_blob` function and serves the column with `image/jpeg` content type. ```python from datasette.blob_renderer import render_blob async def render_jpg(datasette, database, rows, columns, request, table, view_name): response = await render_blob(datasette, database, rows, columns, request, table, view_name) response.content_type = ""image/jpeg"" response.headers[""Content-Disposition""] = f'inline; filename=""image.jpg""' return response @hookimpl def register_output_renderer(): return { ""extension"": ""jpg"", ""render"": render_jpg, ""can_render"": lambda: True, } ``` This works well. I can visit `http://localhost:8001/mydb/videos/1.jpg?_blob_column=thumbnail` and view the image. I want to display the image directly with an `` tag (lazy-loaded of course). So, I need a URL, because embedding base64 would increase the page size too much (each image > 100KB). Datasette generates a link with `.blob` extension for blob columns. It does this by calling `datasette.urls.row_blob` https://github.com/simonw/datasette/blob/7a2ed9f8a119e220b66d67c7b9e07cbab47b1196/datasette/views/table.py#L169-L179 But I have no way of getting the row inside the `render_cell` hook. 
```python @hookimpl def render_cell(value, column, table, database, datasette): if isinstance(value, bytes) and imghdr.what(None, value): # generate url return '$renderedLink' ``` Any pointers?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1300/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1212701569,I_kwDOCGYnMM5ISFuB,427,"sqlite-utils convert date parsing recipe complains about trying to parse ""*""",1385831,wdccdw,closed,0,,,,,1,2022-04-22T19:27:10Z,2022-07-02T13:59:59Z,2022-07-02T13:59:32Z,NONE,,"Missing values in my dataset are denoted by a single asterisk. I am trying to parse string dates into dates. This works fine for columns without missing values, but, when the column contains ""*"", I get the following: ``` $ sqlite-utils convert ${dbfile} details dob 'r.parsedate(value)' [------------------------------------] 0%Traceback (most recent call last): File ""/usr/local/Cellar/sqlite-utils/3.25.1/libexec/lib/python3.9/site-packages/sqlite_utils/db.py"", line 2508, in convert_value return fn(v) File """", line 2, in fn File ""/usr/local/Cellar/sqlite-utils/3.25.1/libexec/lib/python3.9/site-packages/sqlite_utils/recipes.py"", line 8, in parsedate parser.parse(value, dayfirst=dayfirst, yearfirst=yearfirst).date().isoformat() File ""/usr/local/Cellar/sqlite-utils/3.25.1/libexec/lib/python3.9/site-packages/dateutil/parser/_parser.py"", line 1368, in parse return DEFAULTPARSER.parse(timestr, **kwargs) File ""/usr/local/Cellar/sqlite-utils/3.25.1/libexec/lib/python3.9/site-packages/dateutil/parser/_parser.py"", line 643, in parse raise ParserError(""Unknown string format: %s"", timestr) dateutil.parser._parser.ParserError: Unknown string format: * Traceback (most recent call last): File ""/usr/local/bin/sqlite-utils"", line 33, in sys.exit(load_entry_point('sqlite-utils==3.25.1', 'console_scripts', 'sqlite-utils')()) File ""/usr/local/Cellar/sqlite-utils/3.25.1/libexec/lib/python3.9/site-packages/click/core.py"", line 1128, in __call__ return self.main(*args, **kwargs) File ""/usr/local/Cellar/sqlite-utils/3.25.1/libexec/lib/python3.9/site-packages/click/core.py"", line 1053, in main rv = self.invoke(ctx) File ""/usr/local/Cellar/sqlite-utils/3.25.1/libexec/lib/python3.9/site-packages/click/core.py"", line 1659, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/usr/local/Cellar/sqlite-utils/3.25.1/libexec/lib/python3.9/site-packages/click/core.py"", line 1395, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/usr/local/Cellar/sqlite-utils/3.25.1/libexec/lib/python3.9/site-packages/click/core.py"", line 754, in invoke return __callback(*args, **kwargs) File ""/usr/local/Cellar/sqlite-utils/3.25.1/libexec/lib/python3.9/site-packages/sqlite_utils/cli.py"", line 2698, in convert db[table].convert( File ""/usr/local/Cellar/sqlite-utils/3.25.1/libexec/lib/python3.9/site-packages/sqlite_utils/db.py"", line 2524, in convert self.db.execute(sql, where_args or []) File ""/usr/local/Cellar/sqlite-utils/3.25.1/libexec/lib/python3.9/site-packages/sqlite_utils/db.py"", line 458, in execute return self.conn.execute(sql, parameters) sqlite3.OperationalError: user-defined function raised exception ``` ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/427/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, 
""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1287325944,PR_kwDOBm6k_c46ftHo,1763,Bump black from 22.1.0 to 22.6.0,49699333,dependabot[bot],closed,0,,,,,1,2022-06-28T13:11:32Z,2022-06-28T17:40:25Z,2022-06-28T17:40:25Z,CONTRIBUTOR,simonw/datasette/pulls/1763,"Bumps [black](https://github.com/psf/black) from 22.1.0 to 22.6.0.
Release notes

Sourced from black's releases.

22.6.0

Style

  • Fix unstable formatting involving #fmt: skip and # fmt:skip comments (notice the lack of spaces) (#2970)

Preview style

  • Docstring quotes are no longer moved if it would violate the line length limit (#3044)
  • Parentheses around return annotations are now managed (#2990)
  • Remove unnecessary parentheses around awaited objects (#2991)
  • Remove unnecessary parentheses in with statements (#2926)
  • Remove trailing newlines after code block open (#3035)

Integrations

  • Add scripts/migrate-black.py script to ease introduction of Black to a Git project (#3038)

Output

  • Output Python version and implementation as part of --version flag (#2997)

Packaging

  • Use tomli instead of tomllib on Python 3.11 builds where tomllib is not available (#2987)

Parser

  • PEP 654 syntax (for example, except *ExceptionGroup:) is now supported (#3016)
  • PEP 646 syntax (for example, Array[Batch, *Shape] or def fn(*args: *T) -> None) is now supported (#3071)

Vim Plugin

  • Fix strtobool function. It didn't parse true/on/false/off. (#3025)

Full Changelog: https://github.com/psf/black/compare/22.3.0...22.6.0


Thank you!

  • @​jpy-git for improving our parentheses formatting significantly
  • @​siuryan for fixing a fmt: skip bug, making it a little less annoying to use :)
  • @​isidentical for implementing support for PEP 654 and 646 syntax
  • @​defntvdm for fixing our vim plugin, especially as we (the maintainers) don't really know vim script sadly
  • @​idorrington92 for fixing the docstring bug where Black would move the closing quotes causing it to violate the line length limit (whoops!)
  • @​hbrunn for contributing the migrate-black script
  • @​saroad2 for improving newline handling after code blocks and test infrastructure improvements

... and everyone else who contributed documentation, tests, or other improvements to the Black project!

... (truncated)

Changelog

Sourced from black's changelog.

22.3.0

Preview style

  • Code cell separators #%% are now standardised to # %% (#2919)
  • Remove unnecessary parentheses from except statements (#2939)
  • Remove unnecessary parentheses from tuple unpacking in for loops (#2945)
  • Avoid magic-trailing-comma in single-element subscripts (#2942)

Configuration

... (truncated)

Commits

[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=black&package-manager=pip&previous-version=22.1.0&new-version=22.6.0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores) Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`. [//]: # (dependabot-automerge-start) [//]: # (dependabot-automerge-end) ---
",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1763/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1184850337,PR_kwDOBm6k_c41OrSL,1693,Bump black from 22.1.0 to 22.3.0,49699333,dependabot[bot],closed,0,,,,,3,2022-03-29T13:11:09Z,2022-06-28T13:11:38Z,2022-06-28T13:11:36Z,CONTRIBUTOR,simonw/datasette/pulls/1693,"Bumps [black](https://github.com/psf/black) from 22.1.0 to 22.3.0.
Release notes

Sourced from black's releases.

22.3.0

Preview style

  • Code cell separators #%% are now standardised to # %% (#2919)
  • Remove unnecessary parentheses from except statements (#2939)
  • Remove unnecessary parentheses from tuple unpacking in for loops (#2945)
  • Avoid magic-trailing-comma in single-element subscripts (#2942)

Configuration

  • Do not format __pypackages__ directories by default (#2836)
  • Add support for specifying stable version with --required-version (#2832).
  • Avoid crashing when the user has no homedir (#2814)
  • Avoid crashing when md5 is not available (#2905)
  • Fix handling of directory junctions on Windows (#2904)

Documentation

  • Update pylint config documentation (#2931)

Integrations

  • Move test to disable plugin in Vim/Neovim, which speeds up loading (#2896)

Output

  • In verbose mode, log when Black is using user-level config (#2861)

Packaging

  • Fix Black to work with Click 8.1.0 (#2966)
  • On Python 3.11 and newer, use the standard library's tomllib instead of tomli (#2903)
  • black-primer, the deprecated internal devtool, has been removed and copied to a separate repository (#2924)

Parser

  • Black can now parse starred expressions in the target of for and async for statements, e.g. for item in *items_1, *items_2: pass (#2879).
Commits

[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=black&package-manager=pip&previous-version=22.1.0&new-version=22.3.0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores) Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`. [//]: # (dependabot-automerge-start) [//]: # (dependabot-automerge-end) ---
",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1693/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1261826957,PR_kwDOBm6k_c45Kojn,1753,Bump furo from 2022.4.7 to 2022.6.4.1,49699333,dependabot[bot],closed,0,,,,,2,2022-06-06T13:10:22Z,2022-06-22T13:22:37Z,2022-06-22T13:22:35Z,CONTRIBUTOR,simonw/datasette/pulls/1753,"Bumps [furo](https://github.com/pradyunsg/furo) from 2022.4.7 to 2022.6.4.1.
Changelog

Sourced from furo's changelog.

Changelog

2022.06.04.1 -- Naughty Nickel bugfix

  • Fix the URL used in the "Edit this page" for Read the Docs builds.

2022.06.04 -- Naughty Nickel

  • ✨ Advertise Sphinx 5 compatibility.
  • ✨ Change to basic-ng as the base theme (from the sphinx-basic-ng PyPI package).
  • Document site-wide announcement banners.
  • Drop the pin on pygments.
  • Improve edit button, using basic-ng's edit-this-page component.
  • Tweak headings to better match what users expect.
  • Tweak how Sphinx's default HTML is rendered, using docutils post-transforms (this replaces parsing+modifying it with BeautifulSoup).
  • When built with docutils 0.18, footnotes are rendered differently and stylised differently in Furo.

2022.04.07 -- Magical Mauve

  • ✨ Make sphinx-copybutton look better.
  • Add margin to indentations in line blocks.
  • Add styling for non-arabic list styles
  • Add support for html_baseurl.
  • Improve "Edit this page" icon to be more accessible.
  • Improve html_sidebars example.
  • Tweak positioning of back to top on desktop.

2022.03.04 -- Lucent Lilac

  • Improve support for print media.
  • Reduce heading sizes for h3 and below.
  • Don't allow selecting headerlink content.
  • Improve how overflow wrapping is handled.
  • Add a reference from the configuration variables to the color customisation page.

2022.02.23 -- Keen Kobi

  • ✨ Add a "Back to Top" button that shows up when scrolling up.
  • Add a URL to GitHub in Project-URLs.
  • Break long words in the prev/next buttons.
  • Fix includes in Kitchen sink.

... (truncated)

Commits
  • 1142fad Prepare release: 2022.06.04.1
  • 211abb4 Update changelog
  • 06cdba6 Fix the edit this page URL
  • 43ce491 Back to development
  • fb6e486 Prepare release: 2022.06.04
  • 090b02e Update changelog
  • 098d51d Fix the Just the Docs link
  • 7fa8d08 Change to a post-transform for wrapping math blocks and table
  • 51f1e52 Speed up determining if there's multiple toc entries
  • 99a6ff8 Update caniuse-lite NPM package
  • Additional commits viewable in compare view

[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=furo&package-manager=pip&previous-version=2022.4.7&new-version=2022.6.4.1)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores) Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`. [//]: # (dependabot-automerge-start) [//]: # (dependabot-automerge-end) ---
",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1753/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1277328147,I_kwDOCGYnMM5MInsT,446,Use Just to automate running tests and linters locally,9599,simonw,closed,0,,,,,2,2022-06-20T19:51:09Z,2022-06-21T19:28:35Z,2022-06-20T19:54:50Z,OWNER,,I keep committing code that fails additional tests like `mypy` and `flake8` and `black`. Automate those using Just.,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/446/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1278571700,I_kwDOCGYnMM5MNXS0,447,Incorrect syntax highlighting in docs CLI reference,9599,simonw,closed,0,,,,,3,2022-06-21T14:53:10Z,2022-06-21T18:48:47Z,2022-06-21T18:48:46Z,OWNER,,"https://sqlite-utils.datasette.io/en/stable/cli-reference.html#insert ![CE020DDA-27FB-49C3-9EA6-37457DC4C321](https://user-images.githubusercontent.com/9599/174830380-06530537-b870-41c0-a8af-03c7fa720c6f.jpeg) It looks like Python keywords are being incorrectly highlighted here.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/447/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1269998342,I_kwDOCGYnMM5LsqMG,443,Make `utils.rows_from_file()` a documented API,9599,simonw,closed,0,,,,,2,2022-06-13T21:53:24Z,2022-06-20T19:49:37Z,2022-06-14T20:12:46Z,OWNER,,"> `rows_from_file()` isn't part of the documented API but maybe it should be! _Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/440#issuecomment-1154385916_",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/443/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1277295119,I_kwDOCGYnMM5MIfoP,445,`sqlite_utils.utils.TypeTracker` should be a documented API,9599,simonw,closed,0,,,,,3,2022-06-20T19:08:28Z,2022-06-20T19:49:02Z,2022-06-20T19:46:58Z,OWNER,,"I've used it in a couple of external places now: - https://github.com/simonw/datasette-socrata/blob/32fb256a461bf0e790eca10bdc7dd9d96c20f7c4/datasette_socrata/__init__.py#L264-L280 - https://github.com/simonw/datasette-lite/blob/caa8eade10f0321c64f9f65c4561186f02d57c5b/webworker.js#L55-L64 Refs: - https://github.com/simonw/datasette-lite/issues/32",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/445/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1275523220,PR_kwDOBm6k_c454SlE,1759,Extract facet portions of table.html out into included templates,19872,nsmgr8,closed,0,,,,,3,2022-06-17T22:04:04Z,2022-06-20T18:05:45Z,2022-06-20T18:05:45Z,CONTRIBUTOR,simonw/datasette/pulls/1759,To allow users customise the facet content as they would prefer such as sorting of facet results. 
ordering of suggested facets etc.,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1759/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1243151184,I_kwDOCGYnMM5KGPtQ,434,`detect_fts()` identifies the wrong table if tables have names that are subsets of each other,559711,ryascott,closed,0,,,,,3,2022-05-20T13:28:31Z,2022-06-14T23:24:09Z,2022-06-14T23:24:09Z,NONE,,"Windows 10 Python 3.9.6 When I was running a full text search through the Python library, I noticed that the query was being run on a different full text search table than the one I was trying to search. I took a look at the following function https://github.com/simonw/sqlite-utils/blob/841ad44bacaff05ec79ef78166d12e80c82ba6d7/sqlite_utils/db.py#L2213 and noticed: ```python sql LIKE '%VIRTUAL TABLE%USING FTS%content=%{table}%' ``` My database contains tables with similar names and %{table}% was matching another table that ended differently in its name. I have included a sample test that shows this occurring: I search for Marsupials in db[""books""] and The Clue of the Broken Blade is returned. This occurs since the search for Marsupials was ""successfully"" done against db[""booksb""] and rowid 1 is returned. ""The Clue of the Broken Blade"" has a rowid of 1 in db[""books""] and this is what is returned from the search. ```python def test_fts_search_with_similar_table_names(fresh_db): db = Database(memory=True) db[""books""].insert_all( [ { ""title"": ""The Clue of the Broken Blade"", ""author"": ""Franklin W. Dixon"", }, { ""title"": ""Habits of Australian Marsupials"", ""author"": ""Marlee Hawkins"", }, ] ) db[""booksb""].insert( { ""title"": ""Habits of Australian Marsupials"", ""author"": ""Marlee Hawkins"", } ) db[""booksb""].enable_fts([""title"", ""author""]) db[""books""].enable_fts([""title"", ""author""]) query = ""Marsupials"" assert [ { ""rowid"": 1, ""title"": ""Habits of Australian Marsupials"", ""author"": ""Marlee Hawkins"", }, ] == list(db[""books""].search(query)) ``` ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/434/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1250629388,I_kwDOCGYnMM5KixcM,440,CSV files with too many values in a row cause errors,4068,frafra,closed,0,,,,,20,2022-05-27T10:54:44Z,2022-06-14T22:23:01Z,2022-06-14T20:12:46Z,NONE,,"*Original title: csv.DictReader can have None as key* In some cases, `csv.DictReader` can have `None` as key for unnamed columns, and a list of values as value. 
`sqlite_utils.utils.rows_from_file` cannot handle that: ```python url=""https://artsdatabanken.no/Fab2018/api/export/csv"" db = sqlite_utils.Database("":memory"") with urlopen(url) as fab: reader, _ = sqlite_utils.utils.rows_from_file(fab, encoding=""utf-16le"") db[""fab2018""].insert_all(reader, pk=""Id"") ``` Result: ``` Traceback (most recent call last): File """", line 3, in File ""/home/user/.local/pipx/venvs/sqlite-utils/lib/python3.8/site-packages/sqlite_utils/db.py"", line 2924, in insert_all chunk = list(chunk) File ""/home/user/.local/pipx/venvs/sqlite-utils/lib/python3.8/site-packages/sqlite_utils/db.py"", line 3454, in fix_square_braces if any(""["" in key or ""]"" in key for key in record.keys()): File ""/home/user/.local/pipx/venvs/sqlite-utils/lib/python3.8/site-packages/sqlite_utils/db.py"", line 3454, in if any(""["" in key or ""]"" in key for key in record.keys()): TypeError: argument of type 'NoneType' is not iterable ``` Code: https://github.com/simonw/sqlite-utils/blob/59be60c471fd7a2c4be7f75e8911163e618ff5ca/sqlite_utils/db.py#L3454 `sqlite-utils insert` from command line is not affected by this issue.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/440/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1257724585,I_kwDOCGYnMM5K91qp,441,Combining `rows_where()` and `search()` to limit which rows are searched,1448859,betatim,closed,0,,,,,4,2022-06-02T06:01:55Z,2022-06-14T21:57:57Z,2022-06-14T21:54:38Z,NONE,,"What is the right way to limit a full text search query to some rows of a table? For example, I have a table that contains the following columns: `title`, `content`, `owner` (each row represents a document). The `owner` column is a username. It feels right to store all documents in one table, instead of having one table per owner. In particular because I'd like to full text search all documents, only documents owned by one user and documents owned by a set of users. I tried to combine `.rows_where(""owner = ?"", ""1234"")` and `.search()` from the `Table` class but I don't think that is meant to work. I discovered `.search_sql()` as a way to generate the FTS SQL statement. By hand I can edit it to add a `AND [original].[owner] = :owner` to the `where` clause. This seems to do what I want. My two questions: 1. is adding a `AND ...` to the `where` clause actually the right thing to do or should I be doing something else (my SQL skills are low)? 2. is there a built-in to sqlite-utils way to achieve this? Right now I am thinking I will make my own version of `search_sql()` that generates a query that contains an additional `owner = :owner` for my particular use-case. 
Bonus question: is this generally useful/something to add to sqlite-utils or too niche?",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/441/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1269886084,I_kwDOCGYnMM5LsOyE,442,`maximize_csv_field_size_limit()` utility function,9599,simonw,closed,0,,,,,2,2022-06-13T19:54:54Z,2022-06-14T21:55:15Z,2022-06-14T21:31:49Z,OWNER,,"This code here runs only if `cli.py` is imported: https://github.com/simonw/sqlite-utils/blob/7ddf5300886a32d6daf60cf1d71efe492b65c87e/sqlite_utils/cli.py#L50-L59 I found myself needing the same fix in another library: - https://github.com/simonw/datasette-socrata/issues/13 It should be a documented utility function.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/442/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1226106354,PR_kwDOBm6k_c43U1z7,1740,chore: Set permissions for GitHub actions,172697,naveensrinivasan,closed,0,,,,,1,2022-05-05T01:03:08Z,2022-05-31T19:28:41Z,2022-05-31T19:28:40Z,CONTRIBUTOR,simonw/datasette/pulls/1740," Restrict the GitHub token permissions only to the required ones; this way, even if the attackers will succeed in compromising your workflow, they won’t be able to do much. - Included permissions for the action. https://github.com/ossf/scorecard/blob/main/docs/checks.md#token-permissions https://docs.github.com/en/actions/using-workflows/workflow-syntax-for-github-actions#permissions https://docs.github.com/en/actions/using-jobs/assigning-permissions-to-jobs [Keeping your GitHub Actions and workflows secure Part 1: Preventing pwn requests](https://securitylab.github.com/research/github-actions-preventing-pwn-requests/) Signed-off-by: naveen <172697+naveensrinivasan@users.noreply.github.com> ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1740/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1244294227,PR_kwDOCGYnMM44P4GG,437,docs to dogs,114388,yurivish,closed,0,,,,,1,2022-05-22T15:50:33Z,2022-05-30T21:32:41Z,2022-05-30T21:32:41Z,CONTRIBUTOR,simonw/sqlite-utils/pulls/437,Fixes a typo.,140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/437/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1251710928,I_kwDOBm6k_c5Km5fQ,1751,Add scrollbars to table presentation in default layout,408765,knutwannheden,closed,0,,,,,1,2022-05-28T19:44:57Z,2022-05-28T19:52:17Z,2022-05-28T19:52:17Z,NONE,,"(As you will be able to tell from the terminology I use, I am not a frontend guy, but I hope you will understand.) When a table is wide and needs horizontal scrolling to see the columns towards the end, the user needs to scroll horizontally. However, since the container for the HTML table (`div` with class `table-wrapper`) isn't limited by the window size, I first need to vertically scroll near to the bottom of the page in order to scroll horizontally. Then I can scroll back up again. This isn't very user friendly. 
Instead, I think it would make sense to constrain the table's size (when necessary), so that the vertical and horizontal scrollbars either always are visible or at least not far out of reach. I understand that I could provide my own template and / or CSS, but I think it would probably make sense to adjust the default in this regard.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1751/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1250161887,I_kwDOCGYnMM5Kg_Tf,438,illegal UTF-16 surrogate,4068,frafra,closed,0,,,,,2,2022-05-26T22:49:52Z,2022-05-27T08:21:53Z,2022-05-27T08:21:53Z,NONE,,"I am trying to insert `https://artsdatabanken.no/Fab2018/api/export/csv` into a SQLite database, but I have an error when using `sqlite-utils`: ``` sqlite-utils insert --csv --delimiter "";"" --encoding=""utf-16-le"" --pk ""Id"" csv fremmedart test.db [------------------------------------] 0% Error: 'utf-16-le' codec can't decode bytes in position 98-99: illegal UTF-16 surrogate The input you provided uses a character encoding other than utf-8. You can fix this by passing the --encoding= option with the encoding of the file. If you do not know the encoding, running 'file filename.csv' may tell you. It's often worth trying: --encoding=latin-1 ``` I tried to convert the file using `iconv -f ""utf-16le"" -t ""utf-8""`, but I still get a similar error (slightly different position): ``` sqlite-utils insert --csv --delimiter "";"" --encoding=utf-8 --pk ""Id"" csv_utf8 fremmedart test.db [------------------------------------] 0% Error: 'utf-8' codec can't decode byte 0xd9 in position 99: invalid continuation byte The input you provided uses a character encoding other than utf-8. You can fix this by passing the --encoding= option with the encoding of the file. If you do not know the encoding, running 'file filename.csv' may tell you. 
It's often worth trying: --encoding=latin-1 ``` I have no issues reading such file using this Python code: ```python content = open('csv', encoding='utf-16-le').read()) ``` `in2csv` works too.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/438/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1243715381,I_kwDOCGYnMM5KIZc1,436,"Add ""copy to clipboard"" button to code examples in documentation",9599,simonw,closed,0,,,,,0,2022-05-20T21:53:23Z,2022-05-20T21:57:53Z,2022-05-20T21:57:53Z,OWNER,,"Follows: - #435 Imitates: - https://github.com/simonw/datasette/issues/1748 I'll use https://github.com/executablebooks/sphinx-copybutton - here's the Datasette commit: https://github.com/simonw/datasette/commit/1465fea4798599eccfe7e8f012bd8d9adfac3039",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/436/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1243704847,I_kwDOCGYnMM5KIW4P,435,Switch to Furo documentation theme,9599,simonw,closed,0,,,,,2,2022-05-20T21:46:39Z,2022-05-20T21:56:10Z,2022-05-20T21:54:43Z,OWNER,,"As seen in: - https://github.com/simonw/datasette/issues/1746 - https://github.com/simonw/shot-scraper/issues/77",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/435/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1243498298,I_kwDOBm6k_c5KHkc6,1746,Switch documentation theme to Furo,9599,simonw,closed,0,,,,,21,2022-05-20T18:42:17Z,2022-05-20T21:28:29Z,2022-05-20T21:28:29Z,OWNER,,"https://github.com/pradyunsg/furo I just did this for `shot-scraper` and I really like it: https://shot-scraper.datasette.io/en/latest/ - https://github.com/simonw/shot-scraper/issues/77",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1746/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1243517592,I_kwDOBm6k_c5KHpKY,1748,Add copy buttons next to code examples in the documentation,9599,simonw,closed,0,,,,,2,2022-05-20T19:09:00Z,2022-05-20T19:15:00Z,2022-05-20T19:11:32Z,OWNER,,Similar to the ones in `datasette-copyable` which are implemented here: https://github.com/executablebooks/sphinx-copybutton/tree/f84c001a0507f8ec46779d0701b079a265564583,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1748/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1243512344,I_kwDOBm6k_c5KHn4Y,1747,Add tutorials to the getting started guide,9599,simonw,closed,0,,,,,1,2022-05-20T19:01:52Z,2022-05-20T19:12:30Z,2022-05-20T19:05:34Z,OWNER,,On https://docs.datasette.io/en/stable/getting_started.html,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1747/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1239008850,I_kwDOBm6k_c5J2cZS,1744,`--nolock` feature for opening locked 
databases,9599,simonw,closed,0,,,,,7,2022-05-17T18:25:16Z,2022-05-17T19:46:38Z,2022-05-17T19:40:30Z,OWNER,,"The getting started docs currently suggest you try this to browse your Chrome history: datasette ~/Library/Application\ Support/Google/Chrome/Default/History But if Chrome is running you will likely get this error: sqlite3.OperationalError: database is locked Turns out there's a workaround for this which I just spotted [on the SQLite forum](https://sqlite.org/forum/forumpost/86a67f6995): > You can do this using a [URI filename](https://sqlite.org/uri.html): > ``` > sqlite3 'file:places.sqlite?mode=ro&nolock=1' > ``` > That opens the file `places.sqlite` in read-only mode with locking disabled. This isn't safe, in that changes to the database made by other corrections are likely to cause this connection to return incorrect results or crash. Read-only mode should at least mean that you don't corrupt the database in the process.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1744/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1239080102,I_kwDOBm6k_c5J2tym,1745,Documentation on running cog,9599,simonw,closed,0,,,,,1,2022-05-17T19:41:06Z,2022-05-17T19:45:51Z,2022-05-17T19:43:45Z,OWNER,,Noticed that `cog -r docs/*.rst` isn't documented in https://docs.datasette.io/en/latest/contributing.html#editing-and-building-the-documentation,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1745/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1223699280,I_kwDOBm6k_c5I8CtQ,1739,.db downloads should be served with an ETag,9599,simonw,closed,0,,,,,6,2022-05-03T05:11:21Z,2022-05-04T18:21:18Z,2022-05-03T14:59:51Z,OWNER,,"I noticed that my Pyodide Datasette prototype is downloading the same database file every single time rather than browser caching it: ![image](https://user-images.githubusercontent.com/9599/166407074-dee19587-0667-4424-9e88-d3b5b90fd819.png) ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1739/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1223234932,I_kwDOBm6k_c5I6RV0,1733,Get Datasette compatible with Pyodide,9599,simonw,closed,0,,,,,9,2022-05-02T19:01:58Z,2022-05-04T15:14:01Z,2022-05-02T20:15:27Z,OWNER,,"I've already got this working as a prototype. 
Here are the changes I had to make: - Replace the two dependencies that don't publish pure Python wheels to PyPI: `click-default-group` and `python-baseconv` - Get Datasette to work without threading - which it turns out is exclusively used for database connections - Make the `uvicorn` dependency optional (only needed when Datasette runs in the CLI) TODO: - [x] Switch to `click-default-group-wheel` - [x] https://github.com/simonw/datasette/issues/1734 - [x] Work around `uvicorn` import error - [x] https://github.com/simonw/datasette/issues/1735 - [x] #1737 Goal is to be able to do the following directly in https://pyodide.org/en/stable/console.html ```python import micropip await micropip.install(""datasette"") from datasette.app import Datasette ds = Datasette() await ds.client.get(""/.json"") ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1733/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1173023272,I_kwDOCGYnMM5F6uoo,416,Options for how `r.parsedate()` should handle invalid dates,638427,mattkiefer,closed,0,,,,,11,2022-03-17T23:29:55Z,2022-05-03T21:36:49Z,2022-03-21T04:01:39Z,NONE,,"Exceptions are normal expected behavior when typecasting an invalid format. However, r.parsedate() is really just re-formatting strings and keeping the type as text. So it may be better to print-and-pass on exception so the user can see a complete list of invalid values -- while also allowing for the parser to reformat the remaining valid values. ``` sqlite-utils convert idfpr.db license ""Expiration Date"" ""r.parsedate(value)"" [#######-----------------------------] 21% 00:01:57Traceback (most recent call last): File ""/usr/local/lib/python3.9/dist-packages/sqlite_utils/db.py"", line 2336, in convert_value return fn(v) File """", line 2, in fn File ""/usr/local/lib/python3.9/dist-packages/sqlite_utils/recipes.py"", line 8, in parsedate parser.parse(value, dayfirst=dayfirst, yearfirst=yearfirst).date().isoformat() File ""/usr/lib/python3/dist-packages/dateutil/parser/_parser.py"", line 1374, in parse return DEFAULTPARSER.parse(timestr, **kwargs) File ""/usr/lib/python3/dist-packages/dateutil/parser/_parser.py"", line 652, in parse raise ParserError(""String does not contain a date: %s"", timestr) dateutil.parser._parser.ParserError: String does not contain a date: / / ``` In this case, I had just one variation of an invalid date: ' / / '. But theoretically there could be many values that would have to be fixed one at a time with the current exception handling. ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/416/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1223459734,I_kwDOBm6k_c5I7IOW,1737,Automated test for Pyodide compatibility,9599,simonw,closed,0,,,,,4,2022-05-02T23:24:25Z,2022-05-02T23:40:50Z,2022-05-02T23:40:50Z,OWNER,,"Refs: - #1733 Need something in the test suite such that if Datasette breaks against Pyodide in the future we hear about it. 
I'm thinking this is an opportunity to use [shot-scraper javascript](https://github.com/simonw/shot-scraper#scraping-pages-using-javascript).",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1737/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1223241647,I_kwDOBm6k_c5I6S-v,1734,Remove python-baseconv dependency,9599,simonw,closed,0,,,,,3,2022-05-02T19:08:37Z,2022-05-02T23:25:49Z,2022-05-02T19:39:20Z,OWNER,,"> I was going to vendor `baseconv.py`, but then I reconsidered - what if there are plugins out there that expect `import baseconv` to work because they have depended on Datasette? > > I used https://cs.github.com/ and as far as I can tell there aren't any! > > So I'm going to remove that dependency and work out a smarter way to do this - probably by providing a utility function within Datasette itself. _Originally posted by @simonw in https://github.com/simonw/datasette/issues/1733#issuecomment-1115258737_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1734/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1223263540,I_kwDOBm6k_c5I6YU0,1735,Datasette setting to disable threading (for Pyodide),9599,simonw,closed,0,,,,,1,2022-05-02T19:31:08Z,2022-05-02T23:25:49Z,2022-05-02T20:13:52Z,OWNER,,"> I'm going to add a Datasette setting to disable threading entirely, designed for usage in this particular case. > > I thought about adding a new setting, then I noticed this: > > datasette mydatabase.db --setting num_sql_threads 10 > > I'm going to let users set that to `0` to disable threaded execution of SQL queries. _Originally posted by @simonw in https://github.com/simonw/datasette/issues/1733#issuecomment-1115278325_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1735/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1223312279,PR_kwDOBm6k_c43MIU0,1736,Clean up compatibility with Pyodide,9599,simonw,closed,0,,,,,0,2022-05-02T20:14:38Z,2022-05-02T20:15:28Z,2022-05-02T20:15:27Z,OWNER,simonw/datasette/pulls/1736,"Closes #1735, closes #1733",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1736/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1223177069,PR_kwDOCGYnMM43LrKB,429,Depend on click-default-group-wheel,9599,simonw,closed,0,,,,,2,2022-05-02T18:03:10Z,2022-05-02T18:52:42Z,2022-05-02T18:05:00Z,OWNER,simonw/sqlite-utils/pulls/429,"Trying to get this to work with Pyodide. Refs: https://github.com/simonw/click-default-group-wheel/issues/3",140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/429/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1065432388,I_kwDOBm6k_c4_gTVE,1534,Maybe return JSON from HTML pages if `Accept: application/json` is sent,9599,simonw,closed,0,,,,,4,2021-11-28T20:48:09Z,2022-04-27T21:59:34Z,2022-02-02T23:39:33Z,OWNER,,"Relates to #1533 - and to the work I've been doing on the https://github.com/simonw/datasette-table Web Component. 
It would be useful to support users pasting in a URL to a Datasette table or query without first having to add the `.json` extension themselves - since then other systems could hit that URL with `Accept: application/json` to get back the JSON representation without first needing to read the `Link: ` header from #1533 to figure out what the URL to that JSON is. (There is weird logic deep in Datasette that says that you add `.json` to the path UNLESS the table name itself ends with `.json`, in which case you add `?_format=json` - this is super-confusing). [Update: I removed that confusing feature here: [https://simonwillison.net/2022/Mar/19/weeknotes/](https://simonwillison.net/2022/Mar/19/weeknotes/)]",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1534/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1216508080,I_kwDOBm6k_c5IgnCw,1723,Research running SQL in table view in parallel using `asyncio.gather()`,9599,simonw,closed,0,,,,,5,2022-04-26T21:42:48Z,2022-04-27T18:53:44Z,2022-04-26T22:19:09Z,OWNER,,"Spun off from: - #1715",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1723/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1216619276,I_kwDOBm6k_c5IhCMM,1724,?_trace=1 doesn't work on Global Power Plants demo,9599,simonw,closed,0,,,,,3,2022-04-27T00:15:02Z,2022-04-27T06:15:14Z,2022-04-27T00:18:30Z,OWNER,,"https://global-power-plants.datasettes.com/global-power-plants/global-power-plants?_trace=1 is not showing the trace JSON at the bottom of the page. Confirmed that `trace_debug` is `true` on https://global-power-plants.datasettes.com/-/settings Possibly related: - https://github.com/simonw/datasette-total-page-time/issues/1",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1724/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1213683988,I_kwDOBm6k_c5IV1kU,1718,Code examples in the documentation should be formatted with Black,9599,simonw,closed,0,,,,,12,2022-04-24T15:22:50Z,2022-04-24T16:24:14Z,2022-04-24T16:18:03Z,OWNER,,"For example on this page: https://docs.datasette.io/en/stable/writing_plugins.html#packaging-a-plugin I wonder if there's an easy way for me to enforce this for Sphinx documentation?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1718/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1213281044,PR_kwDOBm6k_c42qyUI,1717,Add timeout option to Cloudrun build,127565,wragge,closed,0,,,,,2,2022-04-23T11:51:21Z,2022-04-24T14:03:08Z,2022-04-24T14:03:08Z,CONTRIBUTOR,simonw/datasette/pulls/1717,I've found that the Cloudrun build phase often hits a timeout limit with large databases. I believe the default timeout is 10 minutes. 
This pull request just adds a `--timeout` option to the cloudrun `publish` command and passes the value on to the build step.,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1717/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1212838949,I_kwDOBm6k_c5ISnQl,1716,Configure git blame to ignore Black commit,9599,simonw,closed,0,,,,,1,2022-04-22T21:56:37Z,2022-04-22T22:02:19Z,2022-04-22T22:02:19Z,OWNER,,"GitHub can support this in blame views now too: https://docs.github.com/en/repositories/working-with-files/using-files/viewing-a-file#ignore-commits-in-the-blame-view",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1716/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 810507413,MDExOlB1bGxSZXF1ZXN0NTc1MTg3NDU3,1229,ensure immutable databses when starting in configuration directory mode with,295329,camallen,closed,0,,,,,3,2021-02-17T20:18:26Z,2022-04-22T13:16:36Z,2021-03-29T00:17:32Z,CONTRIBUTOR,simonw/datasette/pulls/1229,"fixes #1224 This PR ensures all databases found in a configuration directory that match the files in `inspect-data.json` will be set to `immutable` as outlined in https://docs.datasette.io/en/latest/settings.html#configuration-directory-mode specifically on building the `datasette` instance it checks: - if `immutables` is an empty tuple - as passed by the cli code - if `immutables` is the default function value `None` - when it's not explicitly set And correctly builds the immutable database list from the `inspect-data[file]` keys. Note for this to work the `inspect-data.json` file must contain `file` paths which are relative to the configuration directory otherwise the file paths won't match and the dbs won't be set to immutable. I couldn't find an easy way to test this due to the way `make_app_client` works, happy to take directions on adding a test for this. I've updated the relevant docs as well, i.e. use the `inspect` cli cmd from the config directory path to create the relevant file ``` cd $config_dir datasette inspect *.db --inspect-file=inspect-data.json ``` https://docs.datasette.io/en/latest/performance.html#using-datasette-inspect",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1229/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1177059481,I_kwDODFdgUs5GKICZ,71,Store commit parents,64686,carltongibson,closed,0,,,,,0,2022-03-22T17:06:48Z,2022-04-22T12:44:04Z,2022-04-22T12:44:04Z,NONE,,"Hi @simonw 👋 Currently, stored commit data doesn't quite give me the information I'm needing... Committer date and author date are not 100% reliable for dividing a commit history up by release or branch. A PR created before a release but merged after can have earlier dates… — this can be quite frustrating if you're trying to pin down commits for a release: _It should be there!_, but then isn't. (This gets worse using release branches.) Would you be open to adding the `sha` of a `parent` of a commit to the commit table? (As an FK? 🤔 — likely not feasible.) 
It's part of the [response body](https://docs.github.com/en/rest/reference/commits#get-a-commit): ``` ""parents"": [ { ""url"": ""https://api.github.com/repos/octocat/Hello-World/commits/6dcb09b5b57875f334f61aebed695e2e4193db5e"", ""sha"": ""6dcb09b5b57875f334f61aebed695e2e4193db5e"" } ], ``` I think this list should only have a single entry. (🤔 — not sure why it's a list then...) With this it would be possible to build/reconstruct a chain of commits from the history, that I don't **think** is available as yet (unless you know a better way). It is certainly possible to get sequential lists of commits out of git directly, so the same would be possible combining tools, but wondering if a single tool could do it. What do you think? Thanks! 🏅 ",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/71/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1180427792,I_kwDOCGYnMM5GW-YQ,421,"""Error: near ""("": syntax error"" when using sqlite-utils indexes CLI",24938923,learning4life,closed,0,,,,,8,2022-03-25T07:12:51Z,2022-04-13T22:41:59Z,2022-04-13T22:41:59Z,NONE,,"This bug relates to https://github.com/simonw/sqlite-utils/issues/408#issuecomment-1066139147 **New error when using CLI: ""sqlite-utils indexes global.db --table""** ``` (app-root) sqlite-utils indexes global.db --table Error: near ""("": syntax error (app-root) sqlite-utils --version sqlite-utils, version 3.25.1 (app-root) sqlite3 --version 3.36.0 2021-06-18 18:36:39 (app-root) python --version Python 3.8.11 ``` Dockerfile ``` FROM centos/python-38-centos7 USER root RUN yum update -y RUN yum upgrade -y # epel RUN yum -y install epel-release && yum clean all # SQLite RUN yum -y install zlib-devel geos geos-devel proj proj-devel freexl freexl-devel libxml2-devel WORKDIR /build/ COPY sqlite-autoconf-3360000.tar.gz ./ RUN tar -zxf sqlite-autoconf-3360000.tar.gz WORKDIR /build/sqlite-autoconf-3360000 RUN ./configure RUN make RUN make install # RUN /opt/app-root/bin/python3.8 -m pip install --upgrade pip RUN pip install sqlite-utils ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/421/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1200866134,I_kwDOCGYnMM5Hk8NW,424,Better error message if you try to create a table with no columns,9599,simonw,closed,0,,,,,1,2022-04-12T02:43:20Z,2022-04-13T22:40:15Z,2022-04-13T22:40:10Z,OWNER,,"Seen here: - https://github.com/simonw/geojson-to-sqlite/issues/30 Attempting to create a table with no columns produced this confusing error: ``` File ""/Users/simon/.local/pipx/venvs/geojson-to-sqlite/lib/python3.9/site-packages/geojson_to_sqlite/utils.py"", line 69, in import_features db[table].create(column_types, pk=pk) File ""/Users/simon/.local/pipx/venvs/geojson-to-sqlite/lib/python3.9/site-packages/sqlite_utils/db.py"", line 863, in create self.db.create_table( File ""/Users/simon/.local/pipx/venvs/geojson-to-sqlite/lib/python3.9/site-packages/sqlite_utils/db.py"", line 517, in create_table self.execute(sql) File ""/Users/simon/.local/pipx/venvs/geojson-to-sqlite/lib/python3.9/site-packages/sqlite_utils/db.py"", line 236, in execute return self.conn.execute(sql) sqlite3.OperationalError: near "")"": syntax error ```",140912432,sqlite-utils,issue,,,"{""url"": 
""https://api.github.com/repos/simonw/sqlite-utils/issues/424/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1202227104,I_kwDOBm6k_c5HqIeg,1712,"Make """" easier to read",9599,simonw,closed,0,,,,,3,2022-04-12T18:17:07Z,2022-04-12T19:12:22Z,2022-04-12T18:44:20Z,OWNER,,"`Binary: 2,427,344 bytes` would be nicer - even better, include a tooltip showing that size translated using this function: https://github.com/simonw/datasette/blob/138e4d9a53e3982137294ba383303c3a848cfca4/datasette/utils/__init__.py#L837-L846 ![CleanShot 2022-04-12 at 11 15 04@2x](https://user-images.githubusercontent.com/9599/163027324-b0b6092e-6e11-438b-8077-789025d0bb37.png) ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1712/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1200649889,I_kwDOBm6k_c5HkHah,1710,Guide for plugin authors to upgrade their plugins for 1.0,9599,simonw,closed,0,,,,,1,2022-04-11T22:58:25Z,2022-04-11T23:04:01Z,2022-04-11T23:03:25Z,OWNER,,I'll also encourage testing against both Datasette 0.x and Datasette 1.0 using a GitHub Actions matrix.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1710/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1184850675,PR_kwDOBm6k_c41OrWq,1694,"Update click requirement from <8.1.0,>=7.1.1 to >=7.1.1,<8.2.0",49699333,dependabot[bot],closed,0,,,,,1,2022-03-29T13:11:23Z,2022-04-08T23:05:10Z,2022-04-08T23:05:09Z,CONTRIBUTOR,simonw/datasette/pulls/1694,"Updates the requirements on [click](https://github.com/pallets/click) to permit the latest version.
Release notes

Sourced from click's releases.

8.1.0

This is a feature release, which includes new features and removes previously deprecated features. The 8.1.x branch is now the supported bugfix branch, the 8.0.x branch will become a tag marking the end of support for that branch. We encourage everyone to upgrade, and to use a tool such as pip-tools to pin all dependencies and control upgrades.

Changelog

Sourced from click's changelog.

Version 8.1.0

Released 2022-03-28

  • Drop support for Python 3.6. :pr:2129

  • Remove previously deprecated code. :pr:2130

    • Group.resultcallback is renamed to result_callback.
    • autocompletion parameter to Command is renamed to shell_complete.
    • get_terminal_size is removed, use shutil.get_terminal_size instead.
    • get_os_args is removed, use sys.argv[1:] instead.
  • Rely on :pep:538 and :pep:540 to handle selecting UTF-8 encoding instead of ASCII. Click's locale encoding detection is removed. :issue:2198

  • Single options boolean flags with show_default=True only show the default if it is True. :issue:1971

  • The command and group decorators can be applied with or without parentheses. :issue:1359

  • The Path type can check whether the target is executable. :issue:1961

  • Command.show_default overrides Context.show_default, instead of the other way around. :issue:1963

  • Parameter decorators and @group handles cls=None the same as not passing cls. @option handles help=None the same as not passing help. :issue:[#1959](https://github.com/pallets/click/issues/1959)

  • A flag option with required=True requires that the flag is passed instead of choosing the implicit default value. :issue:1978

  • Indentation in help text passed to Option and Command is cleaned the same as using the @option and @command decorators does. A command's epilog and short_help are also processed. :issue:1985

  • Store unprocessed Command.help, epilog and short_help strings. Processing is only done when formatting help text for output. :issue:2149

  • Allow empty str input for prompt() when confirmation_prompt=True and default="". :issue:2157

  • Windows glob pattern expansion doesn't fail if a value is an invalid pattern. :issue:2195

  • It's possible to pass a list of params to @command. Any params defined with decorators are appended to the passed params. :issue:2131.

  • @command decorator is annotated as returning the correct type if a cls argument is used. :issue:2211

  • A Group with invoke_without_command=True and chain=False will invoke its result callback with the group function's return value. :issue:2124

... (truncated)

Commits
  • e4aceee Merge pull request #2224 from pallets/release-8.1.0
  • f8d811e release version 8.1.0
  • 20c88f0 Merge pull request #2223 from pallets/env-var
  • 8d7f03d treat empty auto_envvar as None
  • ef11be6 Merge pull request #2041 from spanglerco/shell-completion-option-values
  • f2e579a shell completion prioritizes option values over new options
  • d251cb0 Merge pull request #2219 from pallets/paramtype-name
  • e003331 fix ParamType.to_info_dict() with no name
  • 19be092 Merge pull request #2217 from pallets/group-return
  • 7d3a871 group without command passes return value to result callback
  • Additional commits viewable in compare view

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`. [//]: # (dependabot-automerge-start) [//]: # (dependabot-automerge-end) ---
Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR: - `@dependabot rebase` will rebase this PR - `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it - `@dependabot merge` will merge this PR after your CI passes on it - `@dependabot squash and merge` will squash and merge this PR after your CI passes on it - `@dependabot cancel merge` will cancel a previously requested merge and block automerging - `@dependabot reopen` will reopen this PR if it is closed - `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually - `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1694/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1197298420,PR_kwDOBm6k_c4132NJ,1703,"Update beautifulsoup4 requirement from <4.11.0,>=4.8.1 to >=4.8.1,<4.12.0",49699333,dependabot[bot],closed,0,,,,,1,2022-04-08T13:08:53Z,2022-04-08T22:51:05Z,2022-04-08T22:51:05Z,CONTRIBUTOR,simonw/datasette/pulls/1703,"Updates the requirements on [beautifulsoup4](https://www.crummy.com/software/BeautifulSoup/bs4/) to permit the latest version. Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`. [//]: # (dependabot-automerge-start) [//]: # (dependabot-automerge-end) ---
Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR: - `@dependabot rebase` will rebase this PR - `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it - `@dependabot merge` will merge this PR after your CI passes on it - `@dependabot squash and merge` will squash and merge this PR after your CI passes on it - `@dependabot cancel merge` will cancel a previously requested merge and block automerging - `@dependabot reopen` will reopen this PR if it is closed - `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually - `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1703/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1194790504,I_kwDOBm6k_c5HNw5o,1701,Use + for spaces instead of ~20,9599,simonw,closed,0,,,3268330,Datasette 1.0,0,2022-04-06T15:40:48Z,2022-04-06T15:55:10Z,2022-04-06T15:55:05Z,OWNER,,"Tilde encoding introduced in #1657 means that database files with spaces in the name - e.g. the Apple Mail `Envelope Index` database - end up with URLs like this: http://127.0.0.1:8001/Envelope~20Index I think this would be prettier: http://127.0.0.1:9933/Envelope+Index",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1701/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1190828163,I_kwDOBm6k_c5G-piD,1698,Add a warning about bots and Cloud Run,9599,simonw,closed,0,,,,,1,2022-04-03T05:57:17Z,2022-04-03T06:10:24Z,2022-04-03T06:10:24Z,OWNER,,Recommend the https://github.com/simonw/datasette-block-robots plugin if you are going to run a large database in Cloud Run (one with a lot of rows).,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1698/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1189113609,I_kwDOBm6k_c5G4G8J,1697,"`Request.fake(..., url_vars={})`",9599,simonw,closed,0,,,3268330,Datasette 1.0,1,2022-04-01T01:48:40Z,2022-04-01T02:02:18Z,2022-04-01T02:02:10Z,OWNER,,"I just created an alternative `.fake()` method because I wanted to fake the `url_vars` captured in the route as well: ```python from datasette.utils.asgi import Request class Request(Request): @classmethod def fake(cls, path_with_query_string, method=""GET"", scheme=""http"", url_vars=None): """"""Useful for constructing Request objects for tests"""""" path, _, query_string = path_with_query_string.partition(""?"") scope = { ""http_version"": ""1.1"", ""method"": method, ""path"": path, ""raw_path"": path_with_query_string.encode(""latin-1""), ""query_string"": query_string.encode(""latin-1""), ""scheme"": scheme, ""type"": ""http"", } if url_vars: scope[""url_route""] = { ""kwargs"": url_vars } return cls(scope, None) ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1697/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1181432624,I_kwDOBm6k_c5Gazsw,1688,[plugins][documentation] Is it possible to serve per-plugin static folders when writing one-off (single file) plugins?,9020979,hydrosquall,closed,0,,,,,3,2022-03-26T01:17:44Z,2022-03-27T01:01:14Z,2022-03-26T21:34:47Z,CONTRIBUTOR,,"I'm trying to make a small plugin that depends on static assets, by following the guide [here](https://docs.datasette.io/en/stable/writing_plugins.html#writing-one-off-plugins). I made a `plugins/` directory with `datasette_nteract_data_explorer.py`. I am trying to follow the example of `datasette_vega`, and serving static assets. I created a `statics/` directory within `plugins/` to serve my JS and CSS. 
https://github.com/simonw/datasette-vega/blob/00de059ab1ef77394ba9f9547abfacf966c479c4/datasette_vega/__init__.py#L13 Unfortunately, datasette doesn't seem to be able to find my assets. Input: ```bash datasette ~/Library/Safari/History.db --plugins-dir=plugins/ ``` ![Image 2022-03-25 at 9 18 17 PM](https://user-images.githubusercontent.com/9020979/160218979-a3ff474b-5255-4a76-85d1-6f90ab2e3b44.jpg) Output: ![Image 2022-03-25 at 9 11 00 PM](https://user-images.githubusercontent.com/9020979/160218733-ca5144cf-f23f-43d8-a8d3-e3a871e57f3a.jpg) I suspect this issue might go away if I move away from ""one-off"" plugin mode, but it's been a while since I created a new python package so I'm not sure how much work there is to go between ""one off"" and ""packaged for PyPI"". I'd like to try to avoid needing to repackage a new `tar.gz` file and or reinstall my library repeatedly when developing new python code. 1. Is there a way to serve a static assets when using the `plugins/` directory method instead of installing plugins as a new python package? 2. If not, is there a way I can work on developing a plugin without creating and repackaging tar.gz files after every change, or is that the recommended path? Thanks for your help! ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1688/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1182143895,I_kwDOBm6k_c5GdhWX,1691,Bug in pytest-httpx example,9599,simonw,closed,0,,,,,0,2022-03-26T22:45:30Z,2022-03-26T22:46:09Z,2022-03-26T22:46:09Z,OWNER,,"https://docs.datasette.io/en/0.61.1/testing_plugins.html#testing-outbound-http-calls-with-pytest-httpx says: ```python async def test_outbound_http_call(httpx_mock): httpx_mock.add_response( url='https://www.example.com/', data='Hello world', ) ``` That's wrong - `data=` should be `text=`. https://github.com/Colin-b/pytest_httpx/blob/v0.20.0/README.md#reply-with-custom-body",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1691/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1182065616,I_kwDOBm6k_c5GdOPQ,1689,datasette.add_message() documentation is incorrect,9599,simonw,closed,0,,,,,1,2022-03-26T20:49:42Z,2022-03-26T21:35:57Z,2022-03-26T20:51:21Z,OWNER,,"https://docs.datasette.io/en/0.61.1/internals.html#add-message-request-message-message-type-datasette-info says: `.add_message(request, message, message_type=datasette.INFO)` But in the code it's: https://github.com/simonw/datasette/blob/6b99e4a66ba0ed8fca8ee41ceb7206928b60d5d1/datasette/app.py#L582",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1689/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1175744654,I_kwDOCGYnMM5GFHCO,417,insert fails on JSONL with whitespace,9954,blaine,closed,0,,,,,3,2022-03-21T17:58:14Z,2022-03-25T21:19:06Z,2022-03-25T21:17:13Z,NONE,,"Any JSON that is newline-delimited and has whitespace (newlines) between the start of a JSON object and an attribute fails due to a parse error. e.g. given the valid JSONL: ```{ ""attribute"": ""value"" } { ""attribute"": ""value2"" } ``` I would expect that `sqlite-utils insert --nl my.db mytable file.jsonl` would properly import the data into `mytable`. 
However, the following error is thrown instead: `json.decoder.JSONDecodeError: Expecting property name enclosed in double quotes: line 2 column 1 (char 2)` It makes sense that since the file is intended to be newline separated, the thing being parsed is ""{"" (which obviously fails), however the default newline-separated output of `jq` isn't compact. Using `jq -c` avoids this problem, but the fix is unintuitive and undocumented. Proposed solutions: 1. Default to a ""loose"" newline-separated parse; this could be implemented internally as [the equivalent of] a `jq -c` filter ahead of the insert step. 2. Catch the JSONDecodeError (or pre-empt it in the case of a record === ""{\n"") and give the user a ""it looks like your json isn't _actually_ newline-delimited; try running it through `jq -c` instead"" error message. It might just have been too early in the morning when I was playing with this, but running pipes of data through sqlite-utils without the 'knack' of it led to some false starts.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/417/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1178456794,I_kwDOCGYnMM5GPdLa,418,Add generated files to .gitignore,25778,eyeseast,closed,0,,,,,0,2022-03-23T17:48:12Z,2022-03-24T21:01:44Z,2022-03-24T21:01:44Z,CONTRIBUTOR,,"I end up with these in my local directory: .hypothesis/ Pipfile Pipfile.lock pyproject.toml Might as well gitignore them.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/418/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1178484369,PR_kwDOCGYnMM405rPe,419,Ignore common generated files,25778,eyeseast,closed,0,,,,,1,2022-03-23T18:06:22Z,2022-03-24T21:01:44Z,2022-03-24T21:01:44Z,CONTRIBUTOR,simonw/sqlite-utils/pulls/419,"Closes #418 This adds four files to `.gitignore`: .hypothesis/ Pipfile Pipfile.lock pyproject.toml Those are all generated in the course of development and testing.",140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/419/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1179928510,I_kwDOBm6k_c5GVEe-,1683,allow_facet: False should be respected by column cog menu,9599,simonw,closed,0,,,,,0,2022-03-24T19:05:06Z,2022-03-24T19:16:36Z,2022-03-24T19:16:36Z,OWNER,,"The column cog menu currently shows ""Facet by this"" even if faceting is disabled for the Datasette instance.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1683/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1089529555,I_kwDOBm6k_c5A8ObT,1581,"when hashed urls are turned on, the _memory db has improperly long-lived cache expiry",536941,fgregg,closed,0,,,,,1,2021-12-28T00:05:48Z,2022-03-24T04:08:18Z,2022-03-24T04:08:18Z,CONTRIBUTOR,,"if hashed_urls are on, then a -000 suffix is added to the `_memory` database, and the cache settings are set just as if it was a normal hashed database. 
in particular, this header is set: `cache-control: max-age=31536000` this is not appropriate because the `_memory-000` database isn't really hashed based on the contents of the databases (see #1561). Either the cache-control header should be changed, or the _memory db should have a hash suffix that does depend on the contents of the databases. ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1581/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1090055810,PR_kwDOBm6k_c4wWDxH,1582,don't set far expiry if hash is '000',536941,fgregg,closed,0,,,,,1,2021-12-28T18:16:13Z,2022-03-24T04:07:58Z,2022-03-24T04:07:58Z,CONTRIBUTOR,simonw/datasette/pulls/1582,"This will close #1581. I couldn't find any unit tests related to the testing hashed urls, and I know that you want to break that code out of the core application (#1561), so I'm not quite sure what you would like me to for testing.",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1582/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1173828092,PR_kwDOBm6k_c40q1eP,1665,Pin setup-gcloud to v0 instead of master,408570,sethvargo,closed,0,,,,,1,2022-03-18T17:17:22Z,2022-03-23T19:31:10Z,2022-03-23T17:55:39Z,NONE,simonw/datasette/pulls/1665,"setup-gcloud will be updating the branch name from master to main in a future release. Even though GitHub will establish redirects, this will break any GitHub Actions workflows that pin to master. This PR updates your GitHub Actions workflows to pin to v0, which is the recommended best practice.",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1665/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1178521513,I_kwDOBm6k_c5GPs-p,1682,SQL queries against databases with different routes are broken,9599,simonw,closed,0,,,,,1,2022-03-23T18:42:57Z,2022-03-23T18:48:16Z,2022-03-23T18:48:16Z,OWNER,,"500 error on https://datasette-hashed-urls-preview.vercel.app/fixtures-09f8f95?sql=select+*+from+facetable Here's the trace: ``` File ""/Users/simon/.local/share/virtualenvs/datasette-hashed-urls-ssI2fO50/lib/python3.10/site-packages/datasette/views/database.py"", line 54, in data return await QueryView(self.ds).data( File ""/Users/simon/.local/share/virtualenvs/datasette-hashed-urls-ssI2fO50/lib/python3.10/site-packages/datasette/views/database.py"", line 232, in data self.ds.get_database(database), sql File ""/Users/simon/.local/share/virtualenvs/datasette-hashed-urls-ssI2fO50/lib/python3.10/site-packages/datasette/app.py"", line 401, in get_database return self.databases[name] KeyError: 'fixtures-aa7318b' ``` It looks like this is a Datasette bug, which is frustrating because I just shipped Datasette 0.61 five minutes ago! 
_Originally posted by @simonw in https://github.com/simonw/datasette-hashed-urls/issues/13#issuecomment-1076693667_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1682/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1174423568,I_kwDOBm6k_c5GAEgQ,1670,Ship Datasette 0.61,9599,simonw,closed,0,,,,,6,2022-03-20T02:47:54Z,2022-03-23T18:32:32Z,2022-03-23T18:32:03Z,OWNER,,"Let the alpha bake for a while, since #1668 is a big last-minute change. After shipping, release a new `datasette-hashed-urls` that depends on it, also this: - https://github.com/simonw/datasette-hashed-urls/issues/11",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1670/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 340396247,MDU6SXNzdWUzNDAzOTYyNDc=,339,Expose SANIC_RESPONSE_TIMEOUT config option in a sensible way,12617395,bsilverm,closed,0,,,,,4,2018-07-11T20:38:06Z,2022-03-21T22:22:40Z,2022-03-21T22:22:34Z,NONE,,"Is it possible to configure the sql_time_limit_ms beyond 60 seconds? It seems queries are still timing out at 60 seconds when sql_time_limit_ms is set to 180000. We have a very large data set and often encounter timeouts when testing new queries from the datasette UI. We are optimizing our database as much as we can, but still may require more than 60 seconds for complex queries.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/339/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 324835838,MDU6SXNzdWUzMjQ4MzU4Mzg=,276,Handle spatialite geometry columns better,45057,russss,closed,0,,,,,21,2018-05-21T08:46:55Z,2022-03-21T22:22:20Z,2022-03-21T22:22:20Z,CONTRIBUTOR,,"I'd like to see spatialite geometry columns rendered more sensibly - at the moment they come through as well-known-binary unless you use custom SQL, and WKB isn't of much use to anyone on the web. In HTML: they should be shown either as simple lat/long (if it's just a point, for example), or as a sensible placeholder if they're more complex geometries. In JSON: they should be GeoJSON geometries, (which means they can be automatically fed into a leaflet map with no further messing around). In CSV: they should be WKT. I briefly wondered if this should go into a plugin, but I suspect it needs hooking in at a deeper level than the plugin architecture will support any time soon.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/276/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1175854982,I_kwDOBm6k_c5GFh-G,1679,Research: how much overhead does the n=1 time limit have?,9599,simonw,closed,0,,,3268330,Datasette 1.0,11,2022-03-21T19:27:46Z,2022-03-21T21:55:57Z,2022-03-21T21:55:56Z,OWNER,,"https://github.com/simonw/datasette/blob/1a7750eb29fd15dd2eea3b9f6e33028ce441b143/datasette/utils/__init__.py#L181-L200 ```python @contextmanager def sqlite_timelimit(conn, ms): deadline = time.perf_counter() + (ms / 1000) # n is the number of SQLite virtual machine instructions that will be # executed between each check. It's hard to know what to pick here. 
# After some experimentation, I've decided to go with 1000 by default and # 1 for time limits that are less than 50ms n = 1000 if ms < 50: n = 1 def handler(): if time.perf_counter() >= deadline: return 1 conn.set_progress_handler(handler, n) try: yield finally: conn.set_progress_handler(None, n) ``` How often do I set a time limit of 50 or less? How much slower does it go thanks to this code?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1679/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1170144879,I_kwDOBm6k_c5Fvv5v,1660,Refactor and simplify Datasette routing and views,9599,simonw,closed,0,,,3268330,Datasette 1.0,8,2022-03-15T19:56:56Z,2022-03-21T19:19:12Z,2022-03-21T19:19:01Z,OWNER,,"While working on: - https://github.com/simonw/datasette/issues/1657 - https://github.com/simonw/datasette/issues/1439 It became very clear that the least maintainable part of Datasette at the moment is the way routing to the database, table and row views work - in particular the subclassing mechanism with BaseView and DataView, but also the complex variety of ways in which the URL routes capture different named regular expression groups.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1660/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1175715988,I_kwDOBm6k_c5GFACU,1678,Make `check_visibility()` a documented API,9599,simonw,closed,0,,,3268330,Datasette 1.0,1,2022-03-21T17:30:34Z,2022-03-21T19:04:03Z,2022-03-21T19:01:46Z,OWNER,,"Spotted this while working on: - #1677 https://github.com/simonw/datasette/blob/e627510b760198ccedba9e5af47a771e847785c9/datasette/utils/__init__.py#L1005-L1021",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1678/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1175694248,I_kwDOBm6k_c5GE6uo,1677,Remove `check_permission()` from `BaseView`,9599,simonw,closed,0,,,3268330,Datasette 1.0,1,2022-03-21T17:18:18Z,2022-03-21T18:45:04Z,2022-03-21T18:45:03Z,OWNER,,"Follow-on from: - #1675 Refs: - #1660",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1677/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1175648453,I_kwDOBm6k_c5GEvjF,1675,Extract out `check_permissions()` from `BaseView,9599,simonw,closed,0,,,,,7,2022-03-21T16:39:46Z,2022-03-21T17:14:31Z,2022-03-21T17:13:21Z,OWNER,,"> I'm going to refactor this stuff out and document it so it can be easily used by plugins: https://github.com/simonw/datasette/blob/4a4164b81191dec35e423486a208b05a9edc65e4/datasette/views/base.py#L69-L103 _Originally posted by @simonw in https://github.com/simonw/datasette/issues/1660#issuecomment-1074136176_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1675/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1091819089,I_kwDOCGYnMM5BE9ZR,360,MemoryError,559453,nzaar9,closed,0,,,,,1,2022-01-01T13:39:17Z,2022-03-21T04:22:46Z,2022-03-21T04:22:46Z,NONE,,"HI, 
when dealing with large json file (~170GB) i got the following error ``` Traceback (most recent call last): File ""/usr/local/bin/sqlite-utils"", line 8, in sys.exit(cli()) File ""/usr/lib/python3/dist-packages/click/core.py"", line 1126, in __call__ return self.main(*args, **kwargs) File ""/usr/lib/python3/dist-packages/click/core.py"", line 1051, in main rv = self.invoke(ctx) File ""/usr/lib/python3/dist-packages/click/core.py"", line 1657, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/usr/lib/python3/dist-packages/click/core.py"", line 1393, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/usr/lib/python3/dist-packages/click/core.py"", line 752, in invoke return __callback(*args, **kwargs) File ""/usr/local/lib/python3.9/dist-packages/sqlite_utils/cli.py"", line 1300, in memory rows, format_used = rows_from_file(csv_fp, format=format, encoding=encoding) File ""/usr/local/lib/python3.9/dist-packages/sqlite_utils/utils.py"", line 185, in rows_from_file return rows_from_file(buffered, format=Format.JSON) File ""/usr/local/lib/python3.9/dist-packages/sqlite_utils/utils.py"", line 156, in rows_from_file decoded = json.load(fp) File ""/usr/lib/python3.9/json/__init__.py"", line 293, in load return loads(fp.read(), MemoryError ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/360/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1171599874,I_kwDOCGYnMM5F1TIC,415,Convert with `--multi` and `--dry-run` flag does not work,3976183,dotcs,closed,0,,,,,2,2022-03-16T21:59:46Z,2022-03-21T04:18:24Z,2022-03-21T04:18:24Z,NONE,,"It's not possible to combine `--multi` and `--dry-run` flag in the `convert` command. 
Let's first create a simple database from JSON string ```console $ echo '[{""foo"": ""abc""}]' | sqlite-utils insert demo.db demo - $ sqlite-utils query demo.db ""SELECT * FROM demo"" [{""foo"": ""abc""}] ``` and then try to convert the ""foo"" column with a static value ""bar"" (see docs [Converting a column into multiple columns](https://sqlite-utils.datasette.io/en/stable/cli.html#converting-a-column-into-multiple-columns)) ```console $ sqlite-utils convert demo.db demo foo '{""foo"": ""bar""}' --multi --dry-run Traceback (most recent call last): File ""/home/dotcs/anaconda3/envs/tools/bin/sqlite-utils"", line 8, in sys.exit(cli()) File ""/home/dotcs/anaconda3/envs/tools/lib/python3.9/site-packages/click/core.py"", line 1128, in __call__ return self.main(*args, **kwargs) File ""/home/dotcs/anaconda3/envs/tools/lib/python3.9/site-packages/click/core.py"", line 1053, in main rv = self.invoke(ctx) File ""/home/dotcs/anaconda3/envs/tools/lib/python3.9/site-packages/click/core.py"", line 1659, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/home/dotcs/anaconda3/envs/tools/lib/python3.9/site-packages/click/core.py"", line 1395, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/home/dotcs/anaconda3/envs/tools/lib/python3.9/site-packages/click/core.py"", line 754, in invoke return __callback(*args, **kwargs) File ""/home/dotcs/anaconda3/envs/tools/lib/python3.9/site-packages/sqlite_utils/cli.py"", line 2686, in convert for row in db.conn.execute(sql, where_args).fetchall(): sqlite3.OperationalError: user-defined function raised exception ``` But without the `--dry-run` flag it does work as expected: ```console $ sqlite-utils convert demo.db demo foo '{""foo"": ""bar""}' --multi $ sqlite-utils query demo.db ""SELECT * FROM demo"" [{""foo"": ""bar""}] ``` ```console $ sqlite-utils --version sqlite-utils, version 3.25.1 ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/415/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1174306154,I_kwDOBm6k_c5F_n1q,1668,"Introduce concept of a database `route`, separate from its name",9599,simonw,closed,0,,,3268330,Datasette 1.0,20,2022-03-19T16:48:28Z,2022-03-20T16:43:16Z,2022-03-20T16:43:16Z,OWNER,,"Some issues came up in the new `datasette-hashed-urls` plugin relating to the way it renames databases on startup to achieve unique URLs that depend on the database SHA-256 content: - https://github.com/simonw/datasette-hashed-urls/issues/10 - https://github.com/simonw/datasette-hashed-urls/issues/9 - https://github.com/simonw/datasette-hashed-urls/issues/8 All three of these could be addressed by making the ""path"" concept for a database (the `/foo` bit where it is served) work independently of the database's name, which would be used for default display and also as the alias when configuring cross-database aliases.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1668/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1174404647,I_kwDOBm6k_c5F__4n,1669,Release 0.61 alpha,9599,simonw,closed,0,,,,,2,2022-03-20T00:35:35Z,2022-03-20T01:24:36Z,2022-03-20T01:24:36Z,OWNER,,"> I'm going to release this as a 0.61 alpha so I can more easily depend on it from `datasette-hashed-urls`. 
_Originally posted by @simonw in https://github.com/simonw/datasette/issues/1668#issuecomment-1073136896_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1669/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1174302994,I_kwDOBm6k_c5F_nES,1667,Make route matched pattern groups more consistent,9599,simonw,closed,0,,,3268330,Datasette 1.0,3,2022-03-19T16:32:35Z,2022-03-19T20:37:42Z,2022-03-19T20:37:41Z,OWNER,,"> ... highlights how inconsistent the way the capturing works is. Especially `as_format` which can be `None` or `""""` or `.json` or `json` or not used at all in the case of `TableView`. https://github.com/simonw/datasette/blob/764738dfcb16cd98b0987d443f59d5baa9d3c332/tests/test_routes.py#L12-L36 _Originally posted by @simonw in https://github.com/simonw/datasette/issues/1666#issuecomment-1073039670_ Part of: - #1660",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1667/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1174162781,I_kwDOBm6k_c5F_E1d,1666,Refactor URL routing to enable testing,9599,simonw,closed,0,,,3268330,Datasette 1.0,3,2022-03-19T03:52:29Z,2022-03-19T16:32:03Z,2022-03-19T16:32:03Z,OWNER,,"I ran into some bugs earlier with URL routing - having more robust testing around this (especially since they are defined using regular expressions) would be really useful. - A utility function that resolves a path against a list of reflexes and returns the match - Make the routes and regular expressions available from a private Datasette method - Add tests that exercise them Related: - #1660",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1666/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 531755959,MDU6SXNzdWU1MzE3NTU5NTk=,647,Move hashed URL mode out to a plugin,9599,simonw,closed,0,,,3268330,Datasette 1.0,9,2019-12-03T06:29:03Z,2022-03-19T11:56:05Z,2022-03-15T23:13:06Z,OWNER,,"They used to be the default until #418. Since making them optional I haven't felt the need to use them even once. That suggests to me that they should be removed. I think their effect could be entirely handled by an ASGI wrapping plugin. https://datasette.readthedocs.io/en/0.32/performance.html#hashed-url-mode",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/647/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 810397025,MDU6SXNzdWU4MTAzOTcwMjU=,1228,500 error caused by faceting if a column called `n` exists,7107523,Kabouik,closed,0,,,,,5,2021-02-17T17:41:20Z,2022-03-19T06:44:40Z,2022-03-19T01:38:04Z,NONE,,"I recently discovered `datasette` thanks to your great talk at FOSDEM and would like to use it for some projects. However, when trying to use it on databases created from some csv ot tsv files, I am sometimes getting this issue when going to http://127.0.0.1:8001/databasetest/databasetest and I don't exactly understand what it refers to. 
So far, I couldn't find anything relevant when reviewing the raw text files that could explain this issue, nor could I find something obvious between the files that generate this issue and those that don't. Does the error ring a bell and, if so, could you please point me to the right direction? ``` $ datasette databasetest.db INFO: Started server process [1408482] INFO: Waiting for application startup. INFO: Application startup complete. INFO: Uvicorn running on http://127.0.0.1:8001 (Press CTRL+C to quit) INFO: 127.0.0.1:56394 - ""GET / HTTP/1.1"" 200 OK INFO: 127.0.0.1:56394 - ""GET /-/static/app.css?4e362c HTTP/1.1"" 200 OK INFO: 127.0.0.1:56396 - ""GET /-/static-plugins/datasette_vega/main.2acbb312.css HTTP/1.1"" 200 OK INFO: 127.0.0.1:56398 - ""GET /-/static-plugins/datasette_vega/main.08f5d3d8.js HTTP/1.1"" 200 OK Traceback (most recent call last): File ""/home/kabouik/.local/lib/python3.7/site-packages/datasette/app.py"", line 1099, in route_path response = await view(request, send) File ""/home/kabouik/.local/lib/python3.7/site-packages/datasette/views/base.py"", line 147, in view request, **request.scope[""url_route""][""kwargs""] File ""/home/kabouik/.local/lib/python3.7/site-packages/datasette/views/base.py"", line 121, in dispatch_request return await handler(request, *args, **kwargs) File ""/home/kabouik/.local/lib/python3.7/site-packages/datasette/views/base.py"", line 260, in get request, database, hash, correct_hash_provided, **kwargs File ""/home/kabouik/.local/lib/python3.7/site-packages/datasette/views/base.py"", line 434, in view_get request, database, hash, **kwargs File ""/home/kabouik/.local/lib/python3.7/site-packages/datasette/views/table.py"", line 782, in data suggested_facets.extend(await facet.suggest()) File ""/home/kabouik/.local/lib/python3.7/site-packages/datasette/facets.py"", line 168, in suggest and any(r[""n""] > 1 for r in distinct_values) File ""/home/kabouik/.local/lib/python3.7/site-packages/datasette/facets.py"", line 168, in and any(r[""n""] > 1 for r in distinct_values) TypeError: '>' not supported between instances of 'str' and 'int' INFO: 127.0.0.1:56402 - ""GET /databasetest/databasetest HTTP/1.1"" 500 Internal Server Error INFO: 127.0.0.1:56402 - ""GET /-/static/app.css?4e362c HTTP/1.1"" 200 OK INFO: 127.0.0.1:56404 - ""GET / HTTP/1.1"" 200 OK INFO: 127.0.0.1:56404 - ""GET /-/static/app.css?4e362c HTTP/1.1"" 200 OK INFO: 127.0.0.1:56406 - ""GET /-/static-plugins/datasette_vega/main.2acbb312.css HTTP/1.1"" 200 OK INFO: 127.0.0.1:56408 - ""GET /-/static-plugins/datasette_vega/main.08f5d3d8.js HTTP/1.1"" 200 OK INFO: 127.0.0.1:56408 - ""GET /databasetest HTTP/1.1"" 200 OK INFO: 127.0.0.1:56408 - ""GET /-/static/app.css?4e362c HTTP/1.1"" 200 OK INFO: 127.0.0.1:56404 - ""GET /-/static-plugins/datasette_vega/main.2acbb312.css HTTP/1.1"" 200 OK INFO: 127.0.0.1:56406 - ""GET /-/static/codemirror-5.57.0.min.css HTTP/1.1"" 200 OK INFO: 127.0.0.1:56410 - ""GET /-/static-plugins/datasette_vega/main.08f5d3d8.js HTTP/1.1"" 200 OK INFO: 127.0.0.1:56414 - ""GET /-/static/codemirror-5.57.0-sql.min.js HTTP/1.1"" 200 OK INFO: 127.0.0.1:56412 - ""GET /-/static/codemirror-5.57.0.min.js HTTP/1.1"" 200 OK INFO: 127.0.0.1:56408 - ""GET /-/static/sql-formatter-2.3.3.min.js HTTP/1.1"" 200 OK INFO: 127.0.0.1:56408 - ""GET /databasetest?sql=select+*+from+databasetest HTTP/1.1"" 200 OK INFO: 127.0.0.1:56410 - ""GET /-/static/app.css?4e362c HTTP/1.1"" 200 OK INFO: 127.0.0.1:56408 - ""GET /-/static-plugins/datasette_vega/main.2acbb312.css HTTP/1.1"" 200 OK INFO: 
127.0.0.1:56412 - ""GET /-/static/codemirror-5.57.0.min.css HTTP/1.1"" 200 OK INFO: 127.0.0.1:56404 - ""GET /-/static/sql-formatter-2.3.3.min.js HTTP/1.1"" 200 OK INFO: 127.0.0.1:56406 - ""GET /-/static/codemirror-5.57.0.min.js HTTP/1.1"" 200 OK INFO: 127.0.0.1:56414 - ""GET /-/static-plugins/datasette_vega/main.08f5d3d8.js HTTP/1.1"" 200 OK INFO: 127.0.0.1:56408 - ""GET /-/static/codemirror-5.57.0-sql.min.js HTTP/1.1"" 200 OK INFO: 127.0.0.1:56410 - ""GET /databasetest.json?sql=select+*+from+databasetest&_shape=array&_shape=array HTTP/1.1"" 200 OK ^CINFO: Shutting down INFO: Waiting for application shutdown. INFO: Application shutdown complete. INFO: Finished server process [1408482] ``` Note that there is no error if I go to http://127.0.0.1:8001/databasetest and then click on `Run SQL`.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1228/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1082765654,I_kwDOBm6k_c5AibFW,1561,"add hash id to ""_memory"" url if hashed url mode is turned on and crossdb is also turned on",536941,fgregg,closed,0,,,,,3,2021-12-17T00:45:12Z,2022-03-19T04:45:40Z,2022-03-19T04:45:40Z,CONTRIBUTOR,,"If hashed_url mode is turned on and crossdb is also turned on, then queries to _memory should have a hash_id. One way that it could work is to have the _memory hash be a hash of all the individual databases. Otherwise, crossdb queries can get quit out of data if using aggressive caching. ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1561/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1161584460,I_kwDOBm6k_c5FPF9M,1651,Get rid of the no-longer necessary ?_format=json hack for tables called x.json,9599,simonw,closed,0,,,3268330,Datasette 1.0,8,2022-03-07T15:40:42Z,2022-03-19T04:04:50Z,2022-03-15T18:25:42Z,OWNER,,"Tidy up from: - #1439",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1651/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1169840669,I_kwDOBm6k_c5Fulod,1658,Revert main to version that passes tests,9599,simonw,closed,0,,,3268330,Datasette 1.0,1,2022-03-15T15:37:02Z,2022-03-19T04:04:50Z,2022-03-15T15:42:58Z,OWNER,,"> I've made a real mess of this. I'm going to revert Datasette`main` back to the last commit that passed the tests and try this again in a branch. 
_Originally posted by @simonw in https://github.com/simonw/datasette/issues/1657#issuecomment-1068125636_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1658/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1170554975,I_kwDOBm6k_c5FxUBf,1663,Document the internals that were used in datasette-hashed-urls,9599,simonw,closed,0,,,3268330,Datasette 1.0,2,2022-03-16T05:17:08Z,2022-03-19T04:04:50Z,2022-03-17T21:32:38Z,OWNER,,"The https://github.com/simonw/datasette-hashed-urls used a couple of currently undocumented features: - `db.hash` - `Datasette(..., immutables=[...])`",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1663/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1108235694,I_kwDOBm6k_c5CDlWu,1603,A proper favicon,9599,simonw,closed,0,,,3268330,Datasette 1.0,19,2022-01-19T15:24:55Z,2022-03-19T04:04:49Z,2022-01-20T06:07:31Z,OWNER,,"Tips here: https://adamj.eu/tech/2022/01/18/how-to-add-a-favicon-to-your-django-site/ - I think a PNG served at `/favicon.ico` is the best option, since safari doesn't support SVG yet. Relevant code: https://github.com/simonw/datasette/blob/cb29119db9115b1f40de2fb45263ed77e3bfbb3e/datasette/app.py#L182-L183 I can reuse the icon for https://datasette.io/desktop",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1603/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1114147905,I_kwDOBm6k_c5CaIxB,1612,Move canned queries closer to the SQL input area,639012,jsfenfen,closed,0,,,3268330,Datasette 1.0,5,2022-01-25T17:06:39Z,2022-03-19T04:04:49Z,2022-01-25T18:34:21Z,CONTRIBUTOR,,"*Original title: Consider placing example queries above the sql input?* Hi! Have been enjoying deploying ad hoc datasettes for collaborators to pick over! I keep finding myself manually ""fixing"" the database.html template so that the ""example queries"" (canned queries) appear directly *over* the sql box? So they are sorta more a suggestion for collaborators who aren't inclined to write their own queries? My sense is any time I go to the trouble of writing canned queries my users should see 'em? (( I have also considered a client-side reactive-ish option where selecting a query just places the raw SQL in the box and doesn't execute it, but this seems to end up being an inconvenience, rather than a teaching tool. 
)) ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1612/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1122413719,I_kwDOBm6k_c5C5qyX,1621,Test against Python 3.11 dev version,9599,simonw,closed,0,,,3268330,Datasette 1.0,0,2022-02-02T21:38:57Z,2022-03-19T04:04:49Z,2022-02-02T21:58:54Z,OWNER,,"To avoid another surprise like we got with 3.10: https://simonwillison.net/2021/Oct/9/finding-and-reporting-a-bug/ From a quick GitHub code search it looks like `3.11-dev` should work: https://cs.github.com/urllib3/urllib3/blob/7bec77e81aa0a194c98381053225813f5347c9d2/.github/workflows/ci.yml#L60",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1621/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1122416919,I_kwDOBm6k_c5C5rkX,1623,/-/patterns returns link: alternate JSON header to 404,9599,simonw,closed,0,,,3268330,Datasette 1.0,2,2022-02-02T21:42:49Z,2022-03-19T04:04:49Z,2022-02-02T21:48:56Z,OWNER,,"Bug from: - #1620 ``` % curl -s -I 'https://latest.datasette.io/-/patterns' | grep link link: https://latest.datasette.io/-/patterns.json; rel=""alternate""; type=""application/json+datasette"" ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1623/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1126604194,I_kwDOBm6k_c5DJp2i,1632,"datasette one.db one.db opens database twice, as one and one_2",9599,simonw,closed,0,,,3268330,Datasette 1.0,6,2022-02-07T23:14:47Z,2022-03-19T04:04:49Z,2022-02-07T23:50:01Z,OWNER,,"> ``` > % mkdir /tmp/data > % cp ~/Dropbox/Development/datasette/fixtures.db /tmp/data > % datasette /tmp/data/*.db /tmp/data/created.db --create -p 8852 > ... > INFO: Uvicorn running on http://127.0.0.1:8852 (Press CTRL+C to quit) > ^CINFO: Shutting down > % datasette /tmp/data/*.db /tmp/data/created.db --create -p 8852 > ... > INFO: 127.0.0.1:49533 - ""GET / HTTP/1.1"" 200 OK > ``` > The first time I ran Datasette I got two databases - `fixtures` and `created` > > BUT... when I ran Datasette the second time it looked like this: > > > > This is the same result you get if you run: > > datasette /tmp/data/fixtures.db /tmp/data/created.db /tmp/data/created.db > > This is caused by this Datasette issue: > - https://github.com/simonw/datasette/issues/509 > > So... 
either I teach Datasette to de-duplicate multiple identical file paths passed to the command, or I can't use `/data/*.db` in the `Dockerfile` here and I need to go back to other solutions for the challenge described in this comment: https://github.com/simonw/datasette-publish-fly/pull/12#issuecomment-1031971831 _Originally posted by @simonw in https://github.com/simonw/datasette-publish-fly/pull/12#issuecomment-1032029874_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1632/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1170497629,I_kwDOBm6k_c5FxGBd,1662,[feature request] Publish to fully static website,32609395,contrun,closed,0,,,,,1,2022-03-16T03:32:28Z,2022-03-19T00:42:23Z,2022-03-19T00:42:23Z,NONE,,"It seems currently all datasette publish requires a real backend server which is able to query the database and send results back to the frontend. There are a few projects to on-demand download a portion of data from the database from a sqlite lite database url, and present it directly to the user. These methods leverages web assembly under the hood. I think datasette is a perfect use case for this technology. Below are a few examples of querying sqlite database from frontend directly. * [Using sqlite3 as a notekeeping document graph with automatic reference indexing](https://epilys.github.io/bibliothecula/notekeeping.html) * [Hosting SQLite databases on Github Pages - (or any static file hoster) - phiresky's blog](https://phiresky.github.io/blog/2021/hosting-sqlite-databases-on-github-pages/) * [Static torrent website with peer-to-peer queries over BitTorrent on 2M records](https://boredcaveman.xyz/post/0x2_static-torrent-website-p2p-queries.html)",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1662/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1170355774,I_kwDOBm6k_c5FwjY-,1661,Remove Hashed URL mode,9599,simonw,closed,0,,,3268330,Datasette 1.0,10,2022-03-15T23:13:56Z,2022-03-19T00:37:37Z,2022-03-19T00:37:36Z,OWNER,,"It's now handled by a plugin instead: - #647 - https://github.com/simonw/datasette-hashed-urls/issues/3 https://github.com/simonw/datasette-hashed-urls Sub-tasks: - [x] Remove hashed URL mode implementation - [x] Update documentation - [x] Ensure `--setting hash_urls 1` shows a useful message",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1661/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1173017980,PR_kwDOBm6k_c40oRq-,1664,Remove hashed URL mode,9599,simonw,closed,0,,,,,7,2022-03-17T23:19:10Z,2022-03-19T00:12:04Z,2022-03-19T00:12:04Z,OWNER,simonw/datasette/pulls/1664,Refs #1661.,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1664/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 973139047,MDU6SXNzdWU5NzMxMzkwNDc=,1439,Rethink how .ext formats (v.s. 
?_format=) works before 1.0,9599,simonw,closed,0,,,3268330,Datasette 1.0,48,2021-08-17T23:32:51Z,2022-03-15T20:51:26Z,2022-03-15T20:51:26Z,OWNER,,"Datasette currently has surprising special behaviour for if a table name ends in `.csv` - which can happen when a tool like `csvs-to-sqlite` creates tables that match the filename that they were imported from. https://latest.datasette.io/fixtures/table%2Fwith%2Fslashes.csv illustrates this behaviour: it links to `.csv` and `.json` that look like this: - https://latest.datasette.io/fixtures/table%2Fwith%2Fslashes.csv?_format=json - https://latest.datasette.io/fixtures/table%2Fwith%2Fslashes.csv?_format=csv&_size=max Where normally Datasette would add the `.csv` or `.json` extension to the path component of the URL (as seen on other pages such as https://latest.datasette.io/fixtures/facet_cities) here the [path_with_format() function](https://github.com/simonw/datasette/blob/adb5b70de5cec3c3dd37184defe606a082c232cf/datasette/utils/__init__.py#L710) notices that there is already a `.` in the path and instead adds `?_format=csv` to the query string instead. The problem with this mechanism is that it's pretty surprising. Anyone writing external code to Datasette who wants to get back the `.csv` or `.json` version giving the URL to a table page will need to know about and implement this behaviour themselves. That's likely to cause all kinds of bugs in the future.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1439/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1168995756,I_kwDOBm6k_c5FrXWs,1657,Tilde encoding: use ~ instead of - for dash-encoding,9599,simonw,closed,0,,,3268330,Datasette 1.0,12,2022-03-14T22:55:17Z,2022-03-15T18:25:11Z,2022-03-15T18:01:58Z,OWNER,,Refs #1439,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1657/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1168357113,PR_kwDOBm6k_c40ZRDA,1656,"Update pytest requirement from <7.1.0,>=5.2.2 to >=5.2.2,<7.2.0",49699333,dependabot[bot],closed,0,,,,,1,2022-03-14T13:11:53Z,2022-03-15T18:03:03Z,2022-03-15T18:03:02Z,CONTRIBUTOR,simonw/datasette/pulls/1656,"Updates the requirements on [pytest](https://github.com/pytest-dev/pytest) to permit the latest version.
Release notes

Sourced from pytest's releases.

7.1.0

pytest 7.1.0 (2022-03-13)

Breaking Changes

  • #8838: As per our policy, the following features have been deprecated in the 6.X series and are now removed:

    • pytest._fillfuncargs function.
    • pytest_warning_captured hook - use pytest_warning_recorded instead.
    • -k -foobar syntax - use -k 'not foobar' instead.
    • -k foobar: syntax.
    • pytest.collect module - import from pytest directly.

    For more information consult Deprecations and Removals in the docs.

  • #9437: Dropped support for Python 3.6, which reached end-of-life at 2021-12-23.

Improvements

  • #5192: Fixed test output for some data types where -v would show less information.

    Also, when showing diffs for sequences, -q would produce full diffs instead of the expected diff.

  • #9362: pytest now avoids specialized assert formatting when it is detected that the default __eq__ is overridden in attrs or dataclasses.

  • #9536: When -vv is given on command line, show skipping and xfail reasons in full instead of truncating them to fit the terminal width.

  • #9644: More information about the location of resources that led Python to raise ResourceWarning{.interpreted-text role="class"} can now be obtained by enabling tracemalloc{.interpreted-text role="mod"}.

    See resource-warnings{.interpreted-text role="ref"} for more information.

  • #9678: More types are now accepted in the ids argument to @pytest.mark.parametrize. Previously only [str]{.title-ref}, [float]{.title-ref}, [int]{.title-ref} and [bool]{.title-ref} were accepted; now [bytes]{.title-ref}, [complex]{.title-ref}, [re.Pattern]{.title-ref}, [Enum]{.title-ref} and anything with a [__name__]{.title-ref} are also accepted.

  • #9692: pytest.approx now raises a TypeError when given an unordered sequence (such as set).

    Note that this implies that custom classes which only implement __iter__ and __len__ are no longer supported as they don't guarantee order.

Bug Fixes

  • #8242: The deprecation of raising unittest.SkipTest to skip collection of tests during the pytest collection phase is reverted - this is now a supported feature again.
  • #9493: Symbolic link components are no longer resolved in conftest paths. This means that if a conftest appears twice in the collection tree, using symlinks, it will be executed twice.

... (truncated)

Commits
  • 1dbffcc [pre-commit.ci] auto fixes from pre-commit.com hooks
  • d53a5fb Prepare release version 7.1.0
  • d306ec0 Update upcoming trainings (#9744)
  • 3e4c14b Merge pull request #9751 from fabianegli/main
  • 7f924b1 Fix typo in deprecation documentation
  • 4a8f8ad build(deps): Bump django from 4.0.2 to 4.0.3 in /testing/plugins_integration ...
  • c0fd2d8 build(deps): Bump pytest-asyncio from 0.18.1 to 0.18.2 in /testing/plugins_in...
  • 843e018 Merge pull request #9732 from nicoddemus/9730-toml-failure
  • bc43d66 [automated] Update plugin list (#9733)
  • e38d1ca Improve error message for malformed pyproject.toml files
  • Additional commits viewable in compare view

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`. ---
Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR: - `@dependabot rebase` will rebase this PR - `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it - `@dependabot merge` will merge this PR after your CI passes on it - `@dependabot squash and merge` will squash and merge this PR after your CI passes on it - `@dependabot cancel merge` will cancel a previously requested merge and block automerging - `@dependabot reopen` will reopen this PR if it is closed - `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually - `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1656/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1169895600,PR_kwDOBm6k_c40eW7C,1659,Tilde encoding,9599,simonw,closed,0,,,,,1,2022-03-15T16:19:07Z,2022-03-15T18:01:58Z,2022-03-15T18:01:57Z,OWNER,simonw/datasette/pulls/1659,Refs #1657,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1659/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1145882578,I_kwDOCGYnMM5ETMfS,408,`deterministic=True` fails on versions of SQLite prior to 3.8.3,24938923,learning4life,closed,0,,,,,6,2022-02-21T14:36:43Z,2022-03-13T16:54:09Z,2022-03-02T00:38:11Z,NONE,,"Hi, love your work. I am unable to lookup indexes in a database using sqlite-utils: ` sqlite-utils indexes city_spec.db --table` or `sqlite-utils indexes city_spec.db MyTable ` **Software** sqlite-utils, version 3.24 sqlite3 --version: 3.36.0 **Output:** Traceback (most recent call last): File ""/opt/app-root/bin/sqlite-utils"", line 8, in sys.exit(cli()) File ""/opt/app-root/lib64/python3.8/site-packages/click/core.py"", line 1128, in __call__ return self.main(*args, **kwargs) File ""/opt/app-root/lib64/python3.8/site-packages/click/core.py"", line 1053, in main rv = self.invoke(ctx) File ""/opt/app-root/lib64/python3.8/site-packages/click/core.py"", line 1659, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/opt/app-root/lib64/python3.8/site-packages/click/core.py"", line 1395, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/opt/app-root/lib64/python3.8/site-packages/click/core.py"", line 754, in invoke return __callback(*args, **kwargs) File ""/opt/app-root/lib64/python3.8/site-packages/click/decorators.py"", line 26, in new_func return f(get_current_context(), *args, **kwargs) File ""/opt/app-root/lib64/python3.8/site-packages/sqlite_utils/cli.py"", line 2123, in indexes ctx.invoke( File ""/opt/app-root/lib64/python3.8/site-packages/click/core.py"", line 754, in invoke return __callback(*args, **kwargs) File ""/opt/app-root/lib64/python3.8/site-packages/sqlite_utils/cli.py"", line 1624, in query db.register_fts4_bm25() File ""/opt/app-root/lib64/python3.8/site-packages/sqlite_utils/db.py"", line 403, in register_fts4_bm25 self.register_function(rank_bm25, deterministic=True) File ""/opt/app-root/lib64/python3.8/site-packages/sqlite_utils/db.py"", line 399, in register_function register(fn) File ""/opt/app-root/lib64/python3.8/site-packages/sqlite_utils/db.py"", line 392, in register self.conn.create_function(name, arity, fn, **kwargs) sqlite3.NotSupportedError: deterministic=True requires SQLite 3.8.3 or higher ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/408/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1088816961,I_kwDODEm0Qs5A5gdB,62,KeyError: 'created_at' for private accounts?,6764957,swyxio,closed,0,,,,,2,2021-12-26T17:51:51Z,2022-03-12T02:36:32Z,2022-02-24T18:10:18Z,NONE,,"hey Simon! i was running `twitter-to-sqlite user-timeline twitter.db` for [my private alt](https://twitter.com/swyxio) and ran into this error:
![image](https://user-images.githubusercontent.com/6764957/147416165-46b69c30-100a-406f-8534-8612b75547ae.png) ```bash Traceback (most recent call last): File ""/Users/swyx/Work/datasette/env/bin/twitter-to-sqlite"", line 8, in sys.exit(cli()) File ""/Users/swyx/Work/datasette/env/lib/python3.9/site-packages/click/core.py"", line 1128, in __call__ return self.main(*args, **kwargs) File ""/Users/swyx/Work/datasette/env/lib/python3.9/site-packages/click/core.py"", line 1053, in main rv = self.invoke(ctx) File ""/Users/swyx/Work/datasette/env/lib/python3.9/site-packages/click/core.py"", line 1659, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/Users/swyx/Work/datasette/env/lib/python3.9/site-packages/click/core.py"", line 1395, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/Users/swyx/Work/datasette/env/lib/python3.9/site-packages/click/core.py"", line 754, in invoke return __callback(*args, **kwargs) File ""/Users/swyx/Work/datasette/env/lib/python3.9/site-packages/twitter_to_sqlite/cli.py"", line 291, in user_timeline profile = utils.get_profile(db, session, **kwargs) File ""/Users/swyx/Work/datasette/env/lib/python3.9/site-packages/twitter_to_sqlite/utils.py"", line 133, in get_profile save_users(db, [profile]) File ""/Users/swyx/Work/datasette/env/lib/python3.9/site-packages/twitter_to_sqlite/utils.py"", line 453, in save_users transform_user(user) File ""/Users/swyx/Work/datasette/env/lib/python3.9/site-packages/twitter_to_sqlite/utils.py"", line 285, in transform_user user[""created_at""] = parser.parse(user[""created_at""]) KeyError: 'created_at' ```
this looks awfully like #37 but it can't be, because i'm authed into my account and obviously i have perms to read my own account. wonder if there's any diagnostic methods i should apply here? just filing an issue for others to find while i investigate.",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/62/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1166731361,I_kwDOCGYnMM5Fiuhh,414,I forgot to include the changelog in the 3.25.1 release,9599,simonw,closed,0,,,,,7,2022-03-11T18:32:36Z,2022-03-11T18:40:39Z,2022-03-11T18:40:39Z,OWNER,,"I pushed a release for https://github.com/simonw/sqlite-utils/releases/tag/3.25.1 but forgot to include the release notes in `docs/changelog.rst` This means https://sqlite-utils.datasette.io/en/stable/changelog.html isn't showing them.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/414/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1166587040,I_kwDOCGYnMM5FiLSg,413,Display autodoc type information more legibly,9599,simonw,closed,0,,,,,5,2022-03-11T15:58:20Z,2022-03-11T18:07:10Z,2022-03-11T18:07:10Z,OWNER,,"https://sqlite-utils.datasette.io/en/3.25/reference.html#sqlite_utils.db.Table.insert looks like this at the moment: ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/413/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1161969891,I_kwDOBm6k_c5FQkDj,1654,Adopt a code of conduct,9599,simonw,closed,0,,,,,5,2022-03-07T22:00:24Z,2022-03-07T22:19:35Z,2022-03-07T22:19:35Z,OWNER,,"This is long overdue, especially given the size of the project now.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1654/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1160750713,I_kwDOBm6k_c5FL6Z5,1650,Implement redirects from old % encoding to new dash encoding,9599,simonw,closed,0,,,3268330,Datasette 1.0,5,2022-03-06T23:40:02Z,2022-03-07T19:26:15Z,2022-03-07T19:26:14Z,OWNER,,"> One big advantage to this scheme is that redirecting old links to `%2F` pages (e.g. https://fivethirtyeight.datasettes.com/fivethirtyeight/twitter-ratio%2Fsenators) is easy - if you see a `%` in the `raw_path`, redirect to that page with the `%` replaced by `-`. _Originally posted by @simonw in https://github.com/simonw/datasette/issues/1439#issuecomment-1060044007_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1650/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1160432941,PR_kwDOBm6k_c4z_p6S,1648,Use dash encoding for table names and row primary keys in URLs,9599,simonw,closed,0,,,,,7,2022-03-05T19:50:45Z,2022-03-07T15:38:30Z,2022-03-07T15:38:30Z,OWNER,simonw/datasette/pulls/1648,"Refs #1439. 
- [x] Build `dash_encode` / `dash_decode` functions - [x] Use dash encoding for row primary keys - [x] Use dash encoding for `?_next=` pagination tokens - [x] Use dash encoding for table names in URLs - [x] Use dash encoding for database name - ~~Implement redirects from previous `%` URLs that replace those with `-`~~ - separate issue: #1650",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1648/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1160677684,PR_kwDOBm6k_c40AW_v,1649,Add /opt/homebrew to where spatialite extension can be found,2182,danp,closed,0,,,,,1,2022-03-06T18:09:35Z,2022-03-06T22:46:00Z,2022-03-06T19:39:15Z,CONTRIBUTOR,simonw/datasette/pulls/1649,"Helps homebrew on Apple Silicon setups find spatialite without needing a full path. Similar to #1114",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1649/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1149729902,PR_kwDOCGYnMM4zbaJy,410,Correct spelling mistakes (found with codespell),3818,EdwardBetts,closed,0,,,,,1,2022-02-24T20:44:18Z,2022-03-06T08:48:29Z,2022-03-01T21:05:29Z,CONTRIBUTOR,simonw/sqlite-utils/pulls/410,,140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/410/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1098275181,PR_kwDOBm6k_c4wwNCl,1589,Typo in docs about default redirect status code,3556,davidbgk,closed,0,,,,,1,2022-01-10T19:14:36Z,2022-03-06T02:27:49Z,2022-03-06T01:58:32Z,CONTRIBUTOR,simonw/datasette/pulls/1589,,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1589/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1108084641,PR_kwDOBm6k_c4xQ0uZ,1602,"Update pytest-timeout requirement from <2.1,>=1.4.2 to >=1.4.2,<2.2",49699333,dependabot[bot],closed,0,,,,,1,2022-01-19T13:11:50Z,2022-03-06T01:41:50Z,2022-03-06T01:41:49Z,CONTRIBUTOR,simonw/datasette/pulls/1602,"Updates the requirements on [pytest-timeout](https://github.com/pytest-dev/pytest-timeout) to permit the latest version.
Commits
  • 8e4800e Fixup readme
  • dc1efca 2.1.0 release
  • dd9d608 Add custom hooks specifications for overriding setup_timeout and teardown_tim...
  • ed8ecd6 module names, they're difficult
  • 3ab4319 Add changelog
  • 4f7ebae Replace deprecated py.io.get_terminal_width() with shutil.get_terminal_size()...
  • b8a2fa6 Prep release
  • 951972d Update changelog
  • 748a9c3 Making detection of whether a debugger is currently attached more flexible. (...
  • f8a46a1 Github removed the git protocol (#112)
  • Additional commits viewable in compare view

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`.
",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1602/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1112633417,PR_kwDOBm6k_c4xfryi,1610,"Update asgiref requirement from <3.5.0,>=3.2.10 to >=3.2.10,<3.6.0",49699333,dependabot[bot],closed,0,,,,,0,2022-01-24T13:14:18Z,2022-03-06T01:30:27Z,2022-03-06T01:30:27Z,CONTRIBUTOR,simonw/datasette/pulls/1610,"Updates the requirements on [asgiref](https://github.com/django/asgiref) to permit the latest version.
Changelog

Sourced from asgiref's changelog.

3.5.0 (2022-01-22)

  • Python 3.6 is no longer supported, and asyncio calls have been changed to use only the modern versions of the APIs as a result

  • Several causes of RuntimeErrors in cases where an event loop was assigned to a thread but not running

  • Speed improvements in the Local class

3.4.1 (2021-07-01)

  • Fixed an issue with the deadlock detection where it had false positives during exception handling.

3.4.0 (2021-06-27)

  • Calling sync_to_async directly from inside itself (which causes a deadlock when in the default, thread-sensitive mode) now has deadlock detection.

  • asyncio usage has been updated to use the new versions of get_event_loop, ensure_future, wait and gather, avoiding deprecation warnings in Python 3.10. Python 3.6 installs continue to use the old versions; this is only for 3.7+

  • sync_to_async and async_to_sync now have improved type hints that pass through the underlying function type correctly.

  • All Websocket* types are now spelled WebSocket, to match our specs and the official spelling. The old names will work until release 3.5.0, but will raise deprecation warnings.

  • The typing for WebSocketScope and HTTPScope's extensions key has been fixed.

3.3.4 (2021-04-06)

  • The async_to_sync type error is now a warning due to the high false negative rate when trying to detect coroutine-returning callables in Python.

3.3.3 (2021-04-06)

... (truncated)

Commits
  • 8b61513 Releasing 3.5.0
  • b2e1c9d Fixed pytest_asyncio deprecation warning.
  • 2eda551 Added testing for Python 3.10.
  • 02fecb6 Drop Python 3.6 (#307)
  • 6689c0a Added stacklevel to warning in AsyncToSync.
  • 4364f9b Changed how StatelessServer handles event loops
  • 7bc055c Update implementations.rst (#295)
  • c758984 Move current_task import choice to module definition time
  • dfe87b2 Fixed #292: Use get_event_loop in class-level code
  • b3a65e3 Removed class variable which has been unused since a0bbe90
  • Additional commits viewable in compare view

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`.
",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1610/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1124191982,PR_kwDOBm6k_c4yFTCp,1629,"Update pytest requirement from <6.3.0,>=5.2.2 to >=5.2.2,<7.1.0",49699333,dependabot[bot],closed,0,,,,,1,2022-02-04T13:14:10Z,2022-03-06T01:30:06Z,2022-03-06T01:30:06Z,CONTRIBUTOR,simonw/datasette/pulls/1629,"Updates the requirements on [pytest](https://github.com/pytest-dev/pytest) to permit the latest version.
Release notes

Sourced from pytest's releases.

7.0.0

pytest 7.0.0 (2022-02-03)

(Please see the full set of changes for this release also in the 7.0.0rc1 notes below)

Deprecations

  • #9488: If custom subclasses of nodes like pytest.Item override the __init__ method, they should take **kwargs. See the uncooperative-constructors-deprecated section of the docs for details.

    Note that a deprecation warning is only emitted when there is a conflict in the arguments pytest expected to pass. This deprecation was already part of pytest 7.0.0rc1 but wasn't documented.

Bug Fixes

  • #9355: Fixed error message prints function decorators when using assert in Python 3.8 and above.
  • #9396: Ensure pytest.Config.inifile is available during the pytest_cmdline_main hook (regression during 7.0.0rc1).

Improved Documentation

  • #9404: Added extra documentation on alternatives to common misuses of pytest.warns(None) ahead of its deprecation.
  • #9505: Clarify where the configuration files are located. To avoid confusion, the documentation mentions that the configuration file is located in the root of the repository.

Trivial/Internal Changes

  • #9521: Add test coverage to assertion rewrite path.

pytest 7.0.0rc1 (2021-12-06)

Breaking Changes

  • #7259: The Node.reportinfo() function first return value type has been expanded from py.path.local | str to os.PathLike[str] | str.

    Most plugins which refer to reportinfo() only define it as part of a custom pytest.Item implementation. Since py.path.local is an os.PathLike[str], these plugins are unaffected.

    Plugins and users which call reportinfo(), use the first return value and interact with it as a py.path.local, would need to adjust by calling py.path.local(fspath). Although preferably, avoid the legacy py.path.local and use pathlib.Path, or use item.location or item.path, instead.

    Note: pytest was not able to provide a deprecation period for this change.

... (truncated)

Commits
  • 3554b83 Add note to changelog
  • 6ea7f99 Prepare release version 7.0.0
  • 737b220 [7.0.x] releasing: Add template for major releases (#9597)
  • 7fa3972 [7.0.x] releasing: Always set doc_version (#9590)
  • b304499 [7.0.x] Make 'warnings' and 'deselected' in assert_outcomes optional (#9566)
  • f17525d [7.0.x] doc: Add ellipsis to warning usecase list (#9562)
  • 0a7be97 ci: Bump up timeout (#9565)
  • c17908c [7.0.x] doc: Recategorize 7.0.0 changelog items (#9564)
  • ab549bb [7.0.x] Add missing cooperative constructor changelog (#9563)
  • 4b1707f [7.0.x] Autouse linearization graph (#9557)
  • Additional commits viewable in compare view

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`.
",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1629/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1125973221,PR_kwDOBm6k_c4yK44E,1631,"Update pytest-asyncio requirement from <0.17,>=0.10 to >=0.10,<0.19",49699333,dependabot[bot],closed,0,,,,,1,2022-02-07T13:13:19Z,2022-03-06T01:29:54Z,2022-03-06T01:29:53Z,CONTRIBUTOR,simonw/datasette/pulls/1631,"Updates the requirements on [pytest-asyncio](https://github.com/pytest-dev/pytest-asyncio) to permit the latest version.
Release notes

Sourced from pytest-asyncio's releases.

pytest-asyncio 0.18.0


title: 'pytest-asyncio: pytest support for asyncio'

pytest-asyncio is an Apache2 licensed library, written in Python, for testing asyncio code with pytest.

asyncio code is usually written in the form of coroutines, which makes it slightly more difficult to test using normal testing tools. pytest-asyncio provides useful fixtures and markers to make testing easier.

@pytest.mark.asyncio
async def test_some_asyncio_code():
    res = await library.do_something()
    assert b"expected result" == res

pytest-asyncio has been strongly influenced by pytest-tornado.

Features

  • fixtures for creating and injecting versions of the asyncio event loop
  • fixtures for injecting unused tcp/udp ports
  • pytest markers for treating tests as asyncio coroutines
  • easy testing with non-default event loops
  • support for async def fixtures and async generator fixtures
  • support auto mode to handle all async fixtures and tests automatically by asyncio; provide strict mode if a test suite should work with different async frameworks simultaneously, e.g. asyncio and trio.

Installation

... (truncated)

Commits

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`.
",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1631/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1160407071,I_kwDOBm6k_c5FKmgf,1647,Test failures with SQLite 3.37.0+ due to column affinity case,9599,simonw,closed,0,,,,,5,2022-03-05T17:37:46Z,2022-03-05T19:56:28Z,2022-03-05T19:47:04Z,OWNER,,"These three tests are failing on my local machine: ``` FAILED tests/test_internals_database.py::test_table_column_details[facetable-expected0] - AssertionError: assert [Column(cid=0, name='pk', type='INTEGER', no... FAILED tests/test_internals_database.py::test_table_column_details[sortable-expected1] - AssertionError: assert [Column(cid=0, name='pk1', type='varchar(30)'... FAILED tests/test_table_html.py::test_sort_links - AssertionError: assert [{'a_href': None,\n 'attrs': {'class': ['col-Link'],\n 'data-column': '... ``` I ran `pytest --lf -vv` and the output had things like this in it: ``` E - Column(cid=1, name='created', type='text', notnull=0, default_value=None, is_pk=0, hidden=0), E ? ^^^^ E + Column(cid=1, name='created', type='TEXT', notnull=0, default_value=None, is_pk=0, hidden=0), ... E {'a_href': '/fixtures/sortable?_sort=sortable_with_nulls_2', E 'attrs': {'class': ['col-sortable_with_nulls_2'], E 'data-column': 'sortable_with_nulls_2', E 'data-column-not-null': '0', E - 'data-column-type': 'real', E ? ^^^^ E + 'data-column-type': 'REAL', E ? ^^^^ ``` Something is causing column types to come back in uppercase where previously they were lowercase.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1647/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1152072027,I_kwDOBm6k_c5Eqzlb,1642,Dependency issue with asgiref and uvicorn,9599,simonw,closed,0,,,,,1,2022-02-26T18:00:35Z,2022-03-05T01:11:27Z,2022-03-05T01:11:17Z,OWNER,,"``` ERROR: After October 2020 you may experience errors when installing or updating packages. This is because pip will change the way that it resolves dependency conflicts. We recommend you use --use-feature=2020-resolver to test your packages with the new resolver before it becomes the default. datasette 0.60.2 requires asgiref<3.5.0,>=3.2.10, but you'll have asgiref 3.5.0 which is incompatible. ``` That's after I forced an upgrade of `uvicorn` due to this warning: ``` ERROR: After October 2020 you may experience errors when installing or updating packages. This is because pip will change the way that it resolves dependency conflicts. We recommend you use --use-feature=2020-resolver to test your packages with the new resolver before it becomes the default. uvicorn 0.13.1 requires click==7.*, but you'll have click 8.0.4 which is incompatible. 
```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1642/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1063388037,I_kwDOCGYnMM4_YgOF,343,Provide function to generate hash_id from specified columns,82988,psychemedia,closed,0,,,,,4,2021-11-25T10:12:12Z,2022-03-02T04:25:25Z,2022-03-02T04:25:25Z,NONE,,"Hi I note that you define `_hash()` to create a `hash_id` from non-id column values in a table [here](https://github.com/simonw/sqlite-utils/blob/8f386a0d300d1b1c76132bb75972b755049fb742/sqlite_utils/db.py#L2996). It would be useful to be able to call a complementary function to generate a corresponding `_id` from a subset of specified columns when adding items to another table, eg to support the creation of foreign keys. Or is there a better pattern for doing that?",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/343/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 677272618,MDU6SXNzdWU2NzcyNzI2MTg=,928,Test failures caused by failed attempts to mock pip,9599,simonw,closed,0,,,,,4,2020-08-11T23:53:18Z,2022-02-23T16:19:47Z,2020-08-12T00:07:49Z,OWNER,,"Errors like this one: https://github.com/simonw/datasette/pull/927/checks?check_run_id=973559696 ``` 2020-08-11T23:36:39.8801334Z =================================== FAILURES =================================== 2020-08-11T23:36:39.8802411Z _________________________________ test_install _________________________________ 2020-08-11T23:36:39.8803242Z 2020-08-11T23:36:39.8804935Z thing = 2020-08-11T23:36:39.8806663Z comp = 'main', import_path = 'pip._internal.cli.main' 2020-08-11T23:36:39.8807696Z 2020-08-11T23:36:39.8808728Z def _dot_lookup(thing, comp, import_path): 2020-08-11T23:36:39.8810573Z try: 2020-08-11T23:36:39.8812262Z > return getattr(thing, comp) 2020-08-11T23:36:39.8817136Z E AttributeError: module 'pip._internal.cli' has no attribute 'main' 2020-08-11T23:36:39.8843043Z 2020-08-11T23:36:39.8855951Z /opt/hostedtoolcache/Python/3.8.5/x64/lib/python3.8/unittest/mock.py:1215: AttributeError 2020-08-11T23:36:39.8873372Z 2020-08-11T23:36:39.8877803Z During handling of the above exception, another exception occurred: 2020-08-11T23:36:39.8906532Z 2020-08-11T23:36:39.8925767Z def get_src_prefix(): 2020-08-11T23:36:39.8928277Z # type: () -> str 2020-08-11T23:36:39.8930068Z if running_under_virtualenv(): 2020-08-11T23:36:39.8949721Z src_prefix = os.path.join(sys.prefix, 'src') 2020-08-11T23:36:39.8951813Z else: 2020-08-11T23:36:39.8969014Z # FIXME: keep src in cwd for now (it is not a temporary folder) 2020-08-11T23:36:39.9012110Z try: 2020-08-11T23:36:39.9013489Z > src_prefix = os.path.join(os.getcwd(), 'src') 2020-08-11T23:36:39.9014538Z E FileNotFoundError: [Errno 2] No such file or directory 2020-08-11T23:36:39.9016122Z 2020-08-11T23:36:39.9017617Z /opt/hostedtoolcache/Python/3.8.5/x64/lib/python3.8/site-packages/pip/_internal/locations.py:50: FileNotFoundError 2020-08-11T23:36:39.9018802Z 2020-08-11T23:36:39.9020070Z During handling of the above exception, another exception occurred: 2020-08-11T23:36:39.9020930Z 2020-08-11T23:36:39.9022275Z args = (), keywargs = {} 2020-08-11T23:36:39.9023183Z 2020-08-11T23:36:39.9024077Z @wraps(func) 2020-08-11T23:36:39.9024984Z def patched(*args, **keywargs): 2020-08-11T23:36:39.9028770Z > with 
self.decoration_helper(patched, 2020-08-11T23:36:39.9031861Z args, 2020-08-11T23:36:39.9038358Z keywargs) as (newargs, newkeywargs): 2020-08-11T23:36:39.9039654Z 2020-08-11T23:36:39.9040566Z /opt/hostedtoolcache/Python/3.8.5/x64/lib/python3.8/unittest/mock.py:1322: 2020-08-11T23:36:39.9041492Z _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/928/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 842765105,MDExOlB1bGxSZXF1ZXN0NjAyMjYxMDky,6,Add testres-db tool,1151557,ligurio,closed,0,,,,,1,2021-03-28T15:43:23Z,2022-02-16T05:12:05Z,2022-02-16T05:12:05Z,NONE,dogsheep/dogsheep.github.io/pulls/6,,214746582,dogsheep.github.io,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep.github.io/issues/6/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1138948786,PR_kwDOCGYnMM4y3yW0,407,Add SpatiaLite helpers to CLI,25778,eyeseast,closed,0,,,,,7,2022-02-15T16:50:17Z,2022-02-16T01:49:40Z,2022-02-16T00:58:08Z,CONTRIBUTOR,simonw/sqlite-utils/pulls/407,"Closes #398 This adds SpatiaLite helpers to the CLI. ```sh # init spatialite when creating a database sqlite-utils create database.db --enable-wal --init-spatialite # add geometry columns # needs a database, table, geometry column name, type, with optional SRID and not-null # this will throw an error if the table doesn't already exist sqlite-utils add-geometry-column database.db table-name geometry --srid 4326 --not-null # spatial index an existing table/column # this will throw an error it the table and column don't exist sqlite-utils create-spatial-index database.db table-name geometry ``` Docs and tests are included. ",140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/407/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1128139375,I_kwDOCGYnMM5DPgpv,405,"`Database(memory_name=""name"")` constructor argument",9599,simonw,closed,0,,,,,2,2022-02-09T07:15:03Z,2022-02-16T01:23:16Z,2022-02-16T01:23:16Z,OWNER,,"SQLite in-memory databases can be named, in which case multiple connections can be opened to a shared in-memory database running within the same process. Datasette supports this - SQLite could support it too. https://docs.datasette.io/en/0.60.2/internals.html#database-ds-path-none-is-mutable-false-is-memory-false-memory-name-none",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/405/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1124237013,I_kwDOCGYnMM5DAn7V,398,Add SpatiaLite helpers to CLI,25778,eyeseast,closed,0,,,,,9,2022-02-04T14:01:28Z,2022-02-16T01:02:29Z,2022-02-16T00:58:07Z,CONTRIBUTOR,,"Now that #385 is merged, add CLI versions of those methods. 
```sh # init spatialite sqlite-utils init-spatialite database.db # or maybe/also sqlite-utils create database.db --enable-wal --spatialite # add geometry columns # needs a database, table, geometry column name, type, with optional SRID and not-null # this needs to create a table if it doesn't already exist sqlite-utils add-geometry-column database.db table-name geometry --srid 4326 --not-null # spatial index an existing table/column sqlite-utils create-spatial-index database.db table-name geometry ``` Should be mostly straightforward. The one thing worth highlighting in docs is that geometry columns can only be added to existing tables. Trying to add a geometry column to a table that doesn't exist yet might mean you have a schema like `{""rowid"": int, ""geometry"": bytes}`. Might be worth nudging people to explicitly create a table first, then add geometry columns. ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/398/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1128120451,I_kwDOCGYnMM5DPcCD,404,Add example of `--convert` to the help for `sqlite-utils insert`,9599,simonw,closed,0,,,,,2,2022-02-09T06:49:09Z,2022-02-09T06:56:35Z,2022-02-09T06:55:16Z,OWNER,,"https://sqlite-utils.datasette.io/en/3.23/cli-reference.html#insert would be more useful if it included an example of `--convert` in action. I can maybe use an example from https://simonwillison.net/2022/Jan/11/sqlite-utils/",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/404/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1109783030,I_kwDOBm6k_c5CJfH2,1607,More detailed information about installed SpatiaLite version,9599,simonw,closed,0,,,3268330,Datasette 1.0,2,2022-01-20T21:28:03Z,2022-02-09T06:42:02Z,2022-02-09T06:32:28Z,OWNER,,"https://www.gaia-gis.it/gaia-sins/spatialite-sql-5.0.0.html#version has a whole bunch of interesting functions for things like `freexl_version()` and `geos_version()` and `HasMathSQL()` and suchlike. These could be shown on the `/-/versions` page.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1607/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1126692066,I_kwDOCGYnMM5DJ_Ti,403,Document how to add a primary key to a rowid table using `sqlite-utils transform --pk`,536941,fgregg,closed,0,,,,,4,2022-02-08T01:39:40Z,2022-02-09T04:22:43Z,2022-02-08T19:33:59Z,CONTRIBUTOR,,"*Original title: Add option for adding a new, serial, primary key* sometimes we have tables that don't have primary keys, but ought to have them. we *can* use rowid for that, but it would often be nicer to have an explicit primary key. 
using the current value of rowid would be fine.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/403/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1101705012,PR_kwDOBm6k_c4w7eqc,1593,"Update pytest-asyncio requirement from <0.17,>=0.10 to >=0.10,<0.18",49699333,dependabot[bot],closed,0,,,,,2,2022-01-13T13:11:50Z,2022-02-07T13:13:24Z,2022-02-07T13:13:23Z,CONTRIBUTOR,simonw/datasette/pulls/1593,"Updates the requirements on [pytest-asyncio](https://github.com/pytest-dev/pytest-asyncio) to permit the latest version.
Release notes

Sourced from pytest-asyncio's releases.

pytest-asyncio 0.17.0


title: 'pytest-asyncio: pytest support for asyncio'

pytest-asyncio is an Apache2 licensed library, written in Python, for testing asyncio code with pytest.

asyncio code is usually written in the form of coroutines, which makes it slightly more difficult to test using normal testing tools. pytest-asyncio provides useful fixtures and markers to make testing easier.

@pytest.mark.asyncio
async def test_some_asyncio_code():
    res = await library.do_something()
    assert b"expected result" == res

pytest-asyncio has been strongly influenced by pytest-tornado.

Features

  • fixtures for creating and injecting versions of the asyncio event loop
  • fixtures for injecting unused tcp/udp ports
  • pytest markers for treating tests as asyncio coroutines
  • easy testing with non-default event loops
  • support for async def fixtures and async generator fixtures
  • support auto mode to handle all async fixtures and tests automatically by asyncio; provide strict mode if a test suite should work with different async frameworks simultaneously, e.g. asyncio and trio.

Installation

... (truncated)

Commits
  • 2e2d5d2 Bump to 0.17 release
  • 90436c9 Fix pandoc installation procedure
  • d291c66 Convert README.rst to Markdown for making githun release
  • 141937b Fix release artifacts
  • 696cf7d Fix trove classifier for asyncio
  • 8ccb306 Build on tag
  • cd84987 Release process automation (#252)
  • 2eb12a7 Setup GitHub Workflows linter and yaml-reformatter (#253)
  • d28b826 Bump codecov/codecov-action from 1 to 2.1.0 (#251)
  • 2f523ba Configure dependabot version updater (#250)
  • Additional commits viewable in compare view

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`.
",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1593/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 779691739,MDU6SXNzdWU3Nzk2OTE3Mzk=,1176,"Policy on documenting ""public"" datasette.utils functions",9599,simonw,closed,0,,,3268330,Datasette 1.0,13,2021-01-05T22:55:25Z,2022-02-07T06:43:32Z,2022-02-07T06:42:58Z,OWNER,,"https://github.com/simonw/datasette-css-properties starts [like this](https://github.com/simonw/datasette-css-properties/blob/0.1/datasette_css_properties/__init__.py#L1-L3): ```python from datasette import hookimpl from datasette.utils.asgi import Response from datasette.utils import escape_css_string, to_css_class ``` `escape_css_string` and `to_css_class` are not documented, which means relying on them is risky since there's no promise that they won't change. Would be good to figure out a policy on this, and maybe promote some of them to ""documented"" status.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1176/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 688622148,MDU6SXNzdWU2ODg2MjIxNDg=,957,Simplify imports of common classes,9599,simonw,closed,0,,,3268330,Datasette 1.0,7,2020-08-29T23:44:04Z,2022-02-06T06:36:41Z,2022-02-06T06:34:37Z,OWNER,,"There are only a few classes that plugins need to import. It would be nice if these imports were as short and memorable as possible. For example: ```python from datasette.app import Datasette from datasette.utils.asgi import Response ``` Could both become: ```python from datasette import Datasette from datasette import Response ``` ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/957/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1125081640,I_kwDOCGYnMM5DD2Io,401,Update SpatiaLite example in the documentation,9599,simonw,closed,0,,,,,2,2022-02-06T02:02:07Z,2022-02-06T02:05:03Z,2022-02-06T02:03:24Z,OWNER,,"This one here: https://sqlite-utils.datasette.io/en/3.23/python-api.html#converting-column-values-using-sql-functions It should take advantage of the new methods from: - #79",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/401/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1125077063,I_kwDOCGYnMM5DD1BH,400,`sqlite-utils create-table` ... 
`--if-not-exists`,9599,simonw,closed,0,,,,,1,2022-02-06T01:32:53Z,2022-02-06T01:34:53Z,2022-02-06T01:34:46Z,OWNER,,"Inspired by: - #397 To match the option on `create-index`: https://sqlite-utils.datasette.io/en/stable/cli-reference.html#create-index ``` --if-not-exists Ignore if index already exists ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/400/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1123903919,I_kwDOCGYnMM5C_Wmv,397,Support IF NOT EXISTS for table creation,738408,rafguns,closed,0,,,,,3,2022-02-04T07:41:15Z,2022-02-06T01:30:46Z,2022-02-06T01:29:01Z,NONE,,"Currently, I have a bunch of code that looks like this: ```python subjects = db[""subjects""] if db[""subjects""].exists() else db[""subjects""].create({ ... }) ``` It would be neat if sqlite-utils could simplify that by supporting `CREATE TABLE IF NOT EXISTS`, so that I'd be able to write, e.g. ```python subjects = db[""subjects""].create({...}, if_not_exists=True) ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/397/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1087181951,I_kwDOBm6k_c5AzRR_,1576,Traces should include SQL executed by subtasks created with `asyncio.gather`,9599,simonw,closed,0,,,3268330,Datasette 1.0,12,2021-12-22T20:52:02Z,2022-02-05T05:21:35Z,2022-02-05T05:19:53Z,OWNER,,"I tried running some parallel SQL queries using `asyncio.gather()` but the SQL that was executed didn't show up in the trace rendered by https://datasette.io/plugins/datasette-pretty-traces I realized that was because traces are keyed against the current task ID, which changes when a sub-task is run using `asyncio.gather` or similar. The faceting and suggest faceting queries are missing from this trace: ![image](https://user-images.githubusercontent.com/9599/147153855-2d611f07-922a-4d18-9e6e-4be89e010dc4.png) > The reason they aren't showing up in the traces is that traces are stored just for the currently executing `asyncio` task ID: https://github.com/simonw/datasette/blob/ace86566b28280091b3844cf5fbecd20158e9004/datasette/tracer.py#L13-L25 > > This is so traces for other incoming requests don't end up mixed together. But there's no current mechanism to track async tasks that are effectively ""child tasks"" of the current request, and hence should be tracked the same. > > https://stackoverflow.com/a/69349501/6083 suggests that you pass the task ID as an argument to the child tasks that are executed using `asyncio.gather()` to work around this kind of problem. 
_Originally posted by @simonw in https://github.com/simonw/datasette/issues/1518#issuecomment-999870993_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1576/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 683805434,MDU6SXNzdWU2ODM4MDU0MzQ=,135,Code for finding SpatiaLite in the usual locations,9599,simonw,closed,0,,,,,3,2020-08-21T20:15:34Z,2022-02-05T00:04:26Z,2020-08-21T20:30:13Z,OWNER,,"I built this for `shapefile-to-sqlite` but it would be useful in `sqlite-utils` too: https://github.com/simonw/shapefile-to-sqlite/blob/e754d0747ca2facf9a7433e2d5d15a6a37a9cf6e/shapefile_to_sqlite/utils.py#L16-L19 ```python SPATIALITE_PATHS = ( ""/usr/lib/x86_64-linux-gnu/mod_spatialite.so"", ""/usr/local/lib/mod_spatialite.dylib"", ) ``` https://github.com/simonw/shapefile-to-sqlite/blob/e754d0747ca2facf9a7433e2d5d15a6a37a9cf6e/shapefile_to_sqlite/utils.py#L105-L109 ```python def find_spatialite(): for path in SPATIALITE_PATHS: if os.path.exists(path): return path return None ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/135/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 683812642,MDU6SXNzdWU2ODM4MTI2NDI=,136,--load-extension=spatialite shortcut option,9599,simonw,closed,0,,,,,3,2020-08-21T20:31:25Z,2022-02-05T00:04:26Z,2020-10-16T19:14:32Z,OWNER,,In conjunction with #135 - this would do the same thing as `--load-extension=path-to-spatialite` (see #134),140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/136/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 723460107,MDU6SXNzdWU3MjM0NjAxMDc=,187,Maybe: Utility method / CLI tool for initializing SpatiaLite,9599,simonw,closed,0,,,,,2,2020-10-16T19:04:03Z,2022-02-05T00:04:26Z,2020-10-16T19:15:13Z,OWNER,,"> I think this should initialize SpatiaLite against the current database if it has not been initialized already. > > Relevant code: https://github.com/simonw/shapefile-to-sqlite/blob/e754d0747ca2facf9a7433e2d5d15a6a37a9cf6e/shapefile_to_sqlite/utils.py#L112-L126",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/187/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 723708310,MDU6SXNzdWU3MjM3MDgzMTA=,188,About loading spatialite,30607,aborruso,closed,0,,,,,1,2020-10-17T08:47:02Z,2022-02-05T00:04:26Z,2020-10-17T08:52:58Z,NONE,,"Hi @simonw , If I run ``` sqlite3 .load /usr/local/lib/mod_spatialite.so select spatialite_version(); ``` I have `5.0.0`. 
![image](https://user-images.githubusercontent.com/30607/96332706-d8cd3300-1065-11eb-906b-daf99963198e.png) If I run ``` sqlite-utils :memory: ""select spatialite_version()"" --load-extension=spatialite ``` I have ``` Traceback (most recent call last): File ""/home/aborruso/.local/bin/sqlite-utils"", line 8, in sys.exit(cli()) File ""/home/aborruso/.local/lib/python3.8/site-packages/click/core.py"", line 829, in __call__ return self.main(*args, **kwargs) File ""/home/aborruso/.local/lib/python3.8/site-packages/click/core.py"", line 782, in main rv = self.invoke(ctx) File ""/home/aborruso/.local/lib/python3.8/site-packages/click/core.py"", line 1259, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/home/aborruso/.local/lib/python3.8/site-packages/click/core.py"", line 1066, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/home/aborruso/.local/lib/python3.8/site-packages/click/core.py"", line 610, in invoke return callback(*args, **kwargs) File ""/home/aborruso/.local/lib/python3.8/site-packages/sqlite_utils/cli.py"", line 936, in query _load_extensions(db, load_extension) File ""/home/aborruso/.local/lib/python3.8/site-packages/sqlite_utils/cli.py"", line 1326, in _load_extensions db.conn.load_extension(ext) TypeError: argument 1 must be str, not None ``` How to load properly spatialite extension in sqlite-utils? Thank you very muc",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/188/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1102899312,PR_kwDOCGYnMM4w_p22,385,Add new spatialite helper methods,25778,eyeseast,closed,0,,,,,16,2022-01-14T03:57:30Z,2022-02-05T00:04:26Z,2022-02-04T05:55:10Z,CONTRIBUTOR,simonw/sqlite-utils/pulls/385,"Refs #79 This PR adds three new Spatialite-related methods to Database and Table: - `Database.init_spatialite` loads the Spatialite extension and initializes it - `Table.add_geometry_column` adds a geometry column - `Table.create_spatial_index` creates a spatial index Has tests and documentation. Feedback very welcome.",140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/385/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 534507142,MDU6SXNzdWU1MzQ1MDcxNDI=,69,Feature request: enable extensions loading,30607,aborruso,closed,0,,,,,3,2019-12-08T08:06:25Z,2022-02-05T00:04:25Z,2020-10-16T18:42:49Z,NONE,,"Hi, it would be great to add a parameter that enables the load of a sqlite extension you need. Something like ""-ext modspatialite"". In this way your great tool would be even more comfortable and powerful. 
Thank you very much",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/69/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 557825032,MDU6SXNzdWU1NTc4MjUwMzI=,77,Ability to insert data that is transformed by a SQL function,9599,simonw,closed,0,,,,,2,2020-01-30T23:45:55Z,2022-02-05T00:04:25Z,2020-01-31T00:24:32Z,OWNER,,"I want to be able to run the equivalent of this SQL insert: ```python # Convert to ""Well Known Text"" format wkt = shape(geojson['geometry']).wkt # Insert and commit the record conn.execute(""INSERT INTO places (id, name, geom) VALUES(null, ?, GeomFromText(?, 4326))"", ( ""Wales"", wkt )) conn.commit() ``` From the Datasette SpatiaLite docs: https://datasette.readthedocs.io/en/stable/spatialite.html To do this, I need a way of telling `sqlite-utils` that a specific column should be wrapped in `GeomFromText(?, 4326)`.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/77/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 557842245,MDU6SXNzdWU1NTc4NDIyNDU=,79,Helper methods for working with SpatiaLite,9599,simonw,closed,0,,,,,8,2020-01-31T00:39:19Z,2022-02-05T00:04:25Z,2022-02-04T05:55:11Z,OWNER,,"As demonstrated by this piece of documentation, using SpatiaLite with sqlite-utils requires a fair bit of boilerplate: https://github.com/simonw/sqlite-utils/blob/f7289174e66ae4d91d57de94bbd9d09fabf7aff4/docs/python-api.rst#L880-L909",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/79/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1123849278,I_kwDOCGYnMM5C_JQ-,395,"""apt-get: command not found"" error on macOS",9599,simonw,closed,0,,,,,1,2022-02-04T06:03:42Z,2022-02-04T06:10:58Z,2022-02-04T06:10:58Z,OWNER,,"Yeah, `apt-get` isn't a thing on macOS so 4a2a3e2fd0d5534f446b3f1fee34cb165e4d86d2 (to test #79 against real SpatiaLite) broke.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/395/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1123851690,I_kwDOCGYnMM5C_J2q,396,"mypy failure, sqlite_utils/utils.py:56",9599,simonw,closed,0,,,,,0,2022-02-04T06:08:09Z,2022-02-04T06:10:33Z,2022-02-04T06:10:33Z,OWNER,,"https://github.com/simonw/sqlite-utils/runs/5062725880?check_suite_focus=true > `sqlite_utils/utils.py:56: error: Incompatible return value type (got ""None"", expected ""str"")`",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/396/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1094981339,I_kwDOCGYnMM5BRBbb,363,Better error message if `--convert` code fails to return a dict,9599,simonw,closed,0,,,,,4,2022-01-06T05:26:28Z,2022-02-03T22:52:30Z,2022-02-03T22:51:30Z,OWNER,,"Here's the traceback if your `--convert` function doesn't return a dict right now: ``` % sqlite-utils insert /tmp/all.db blah /tmp/log.log --convert 'all.upper()' --all Traceback (most recent call last): File 
""/Users/simon/.local/share/virtualenvs/sqlite-utils-C4Ilevlm/bin/sqlite-utils"", line 33, in sys.exit(load_entry_point('sqlite-utils', 'console_scripts', 'sqlite-utils')()) File ""/Users/simon/.local/share/virtualenvs/sqlite-utils-C4Ilevlm/lib/python3.8/site-packages/click/core.py"", line 1137, in __call__ return self.main(*args, **kwargs) File ""/Users/simon/.local/share/virtualenvs/sqlite-utils-C4Ilevlm/lib/python3.8/site-packages/click/core.py"", line 1062, in main rv = self.invoke(ctx) File ""/Users/simon/.local/share/virtualenvs/sqlite-utils-C4Ilevlm/lib/python3.8/site-packages/click/core.py"", line 1668, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/Users/simon/.local/share/virtualenvs/sqlite-utils-C4Ilevlm/lib/python3.8/site-packages/click/core.py"", line 1404, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/Users/simon/.local/share/virtualenvs/sqlite-utils-C4Ilevlm/lib/python3.8/site-packages/click/core.py"", line 763, in invoke return __callback(*args, **kwargs) File ""/Users/simon/Dropbox/Development/sqlite-utils/sqlite_utils/cli.py"", line 949, in insert insert_upsert_implementation( File ""/Users/simon/Dropbox/Development/sqlite-utils/sqlite_utils/cli.py"", line 834, in insert_upsert_implementation db[table].insert_all( File ""/Users/simon/Dropbox/Development/sqlite-utils/sqlite_utils/db.py"", line 2602, in insert_all first_record = next(records) File ""/Users/simon/Dropbox/Development/sqlite-utils/sqlite_utils/db.py"", line 3044, in fix_square_braces for record in records: File ""/Users/simon/Dropbox/Development/sqlite-utils/sqlite_utils/cli.py"", line 831, in docs = (decode_base64_values(doc) for doc in docs) File ""/Users/simon/Dropbox/Development/sqlite-utils/sqlite_utils/utils.py"", line 86, in decode_base64_values to_fix = [ File ""/Users/simon/Dropbox/Development/sqlite-utils/sqlite_utils/utils.py"", line 89, in if isinstance(doc[k], dict) TypeError: string indices must be integers ``` It would be nicer if that returned a more useful error message. _Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/361#issuecomment-1006295276_",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/363/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1118585417,I_kwDOCGYnMM5CrEJJ,393,Better documentation for insert-replace,9599,simonw,closed,0,,,,,1,2022-01-30T15:40:23Z,2022-02-03T22:13:24Z,2022-02-03T22:13:24Z,OWNER,,"Currently: https://sqlite-utils.datasette.io/en/stable/python-api.html#insert-replacing-data > If you want to insert a record or replace an existing record with the same primary key, using the replace=True argument to .insert() or .insert_all(): Should describe the exception you get first, then how to use replace to avoid it.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/393/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1097091527,I_kwDOCGYnMM5BZEnH,369,Research how much of a difference analyze / sqlite_stat1 makes,9599,simonw,closed,0,,,,,11,2022-01-09T03:03:36Z,2022-02-03T21:07:41Z,2022-02-03T21:07:35Z,OWNER,,"> Is there a downside to having a `sqlite_stat1` table if it has wildly incorrect statistics in it? 
_Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/365#issuecomment-1008163050_ More generally: how much of a difference does the `sqlite_stat1` table created by `ANALYZE` make to queries? I'm particularly interested in `group by` / `count *` queries since Datasette uses those for faceting.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/369/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1075893249,I_kwDOBm6k_c5AINQB,1545,Custom pages don't work on windows,559711,ryascott,closed,0,,,,,3,2021-12-09T18:53:05Z,2022-02-03T02:08:31Z,2022-02-03T01:58:35Z,NONE,,"It seems that custom pages don't work when put in templates/pages To reproduce on datasette version 0.59.4 using PowerShell on WIndows 10 with Python 3.10.0 mkdir -p templates/pages echo ""hello world"" >> templates/pages/about.html Start datasette datasette --template-dir templates/ Navigate to [http://127.0.0.1:8001/about](url) and receive: Error 404: Database not found: about ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1545/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1120990806,PR_kwDOBm6k_c4x6zZ5,1617,"Ensure template_path always uses ""/"" to match jinja",3526913,cb160,closed,0,,,,,3,2022-02-01T17:20:30Z,2022-02-03T01:58:35Z,2022-02-03T01:58:35Z,CONTRIBUTOR,simonw/datasette/pulls/1617,"This PR shoudl fix #1545 The existing code substituted / for \, assuming this was the right behaviour for windows. But on Windows, Jinja still uses / for the template list - See https://github.com/pallets/jinja/blob/896a62135bcc151f2997e028c5125bec2cb2431f/src/jinja2/loaders.py#L225",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1617/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1119413338,PR_kwDOBm6k_c4x1kCu,1616,Bump black from 21.12b0 to 22.1.0,49699333,dependabot[bot],closed,0,,,,,2,2022-01-31T13:13:46Z,2022-02-02T22:23:52Z,2022-02-02T22:23:51Z,CONTRIBUTOR,simonw/datasette/pulls/1616,"Bumps [black](https://github.com/psf/black) from 21.12b0 to 22.1.0.
Release notes

Sourced from black's releases.

22.1.0

At long last, Black is no longer a beta product! This is the first non-beta release and the first release covered by our new stability policy.

Highlights

  • Remove Python 2 support (#2740)
  • Introduce the --preview flag (#2752)

Style

  • Deprecate --experimental-string-processing and move the functionality under --preview (#2789)
  • For stubs, one blank line between class attributes and methods is now kept if there's at least one pre-existing blank line (#2736)
  • Black now normalizes string prefix order (#2297)
  • Remove spaces around power operators if both operands are simple (#2726)
  • Work around bug that causes unstable formatting in some cases in the presence of the magic trailing comma (#2807)
  • Use parentheses for attribute access on decimal float and int literals (#2799)
  • Don't add whitespace for attribute access on hexadecimal, binary, octal, and complex literals (#2799)
  • Treat blank lines in stubs the same inside top-level if statements (#2820)
  • Fix unstable formatting with semicolons and arithmetic expressions (#2817)
  • Fix unstable formatting around magic trailing comma (#2572)

Parser

  • Fix mapping cases that contain as-expressions, like case {"key": 1 | 2 as password} (#2686)
  • Fix cases that contain multiple top-level as-expressions, like case 1 as a, 2 as b (#2716)
  • Fix call patterns that contain as-expressions with keyword arguments, like case Foo(bar=baz as quux) (#2749)
  • Tuple unpacking on return and yield constructs now implies 3.8+ (#2700)
  • Unparenthesized tuples on annotated assignments (e.g values: Tuple[int, ...] = 1, 2, 3) now implies 3.8+ (#2708)
  • Fix handling of standalone match() or case() when there is a trailing newline or a comment inside of the parentheses. (#2760)
  • from __future__ import annotations statement now implies Python 3.7+ (#2690)

Performance

  • Speed-up the new backtracking parser about 4X in general (enabled when --target-version is set to 3.10 and higher). (#2728)
  • Black is now compiled with mypyc for an overall 2x speed-up. 64-bit Windows, MacOS, and Linux (not including musl) are supported. (#1009, #2431)

Configuration

  • Do not accept bare carriage return line endings in pyproject.toml (#2408)
  • Add configuration option (python-cell-magics) to format cells with custom magics in Jupyter Notebooks (#2744)
  • Allow setting custom cache directory on all platforms with environment variable BLACK_CACHE_DIR (#2739).
  • Enable Python 3.10+ by default, without any extra need to specify --target-version=py310. (#2758)
  • Make passing SRC or --code mandatory and mutually exclusive (#2804)

Output

  • Improve error message for invalid regular expression (#2678)
  • Improve error message when parsing fails during AST safety check by embedding the underlying SyntaxError (#2693)
  • No longer color diff headers white as it's unreadable in light themed terminals (#2691)
  • Text coloring added in the final statistics (#2712)
  • Verbose mode also now describes how a project root was discovered and which paths will be formatted. (#2526)

Packaging

  • All upper version bounds on dependencies have been removed (#2718)
  • typing-extensions is no longer a required dependency in Python 3.10+ (#2772)
  • Set click lower bound to 8.0.0 as Black crashes on 7.1.2 (#2791)

... (truncated)

Changelog

Sourced from black's changelog.

22.1.0

At long last, Black is no longer a beta product! This is the first non-beta release and the first release covered by our new stability policy.

Highlights

  • Remove Python 2 support (#2740)
  • Introduce the --preview flag (#2752)

Style

  • Deprecate --experimental-string-processing and move the functionality under --preview (#2789)
  • For stubs, one blank line between class attributes and methods is now kept if there's at least one pre-existing blank line (#2736)
  • Black now normalizes string prefix order (#2297)
  • Remove spaces around power operators if both operands are simple (#2726)
  • Work around bug that causes unstable formatting in some cases in the presence of the magic trailing comma (#2807)
  • Use parentheses for attribute access on decimal float and int literals (#2799)
  • Don't add whitespace for attribute access on hexadecimal, binary, octal, and complex literals (#2799)
  • Treat blank lines in stubs the same inside top-level if statements (#2820)
  • Fix unstable formatting with semicolons and arithmetic expressions (#2817)
  • Fix unstable formatting around magic trailing comma (#2572)

Parser

  • Fix mapping cases that contain as-expressions, like case {"key": 1 | 2 as password} (#2686)
  • Fix cases that contain multiple top-level as-expressions, like case 1 as a, 2 as b (#2716)
  • Fix call patterns that contain as-expressions with keyword arguments, like case Foo(bar=baz as quux) (#2749)
  • Tuple unpacking on return and yield constructs now implies 3.8+ (#2700)
  • Unparenthesized tuples on annotated assignments (e.g values: Tuple[int, ...] = 1, 2, 3) now implies 3.8+ (#2708)
  • Fix handling of standalone match() or case() when there is a trailing newline or a comment inside of the parentheses. (#2760)
  • from __future__ import annotations statement now implies Python 3.7+ (#2690)

Performance

  • Speed-up the new backtracking parser about 4X in general (enabled when --target-version is set to 3.10 and higher). (#2728)
  • Black is now compiled with mypyc for an overall 2x speed-up. 64-bit Windows, MacOS, and Linux (not including musl) are supported. (#1009, #2431)

... (truncated)

Commits

[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=black&package-manager=pip&previous-version=21.12b0&new-version=22.1.0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores) Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`. [//]: # (dependabot-automerge-start) [//]: # (dependabot-automerge-end) ---
Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR: - `@dependabot rebase` will rebase this PR - `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it - `@dependabot merge` will merge this PR after your CI passes on it - `@dependabot squash and merge` will squash and merge this PR after your CI passes on it - `@dependabot cancel merge` will cancel a previously requested merge and block automerging - `@dependabot reopen` will reopen this PR if it is closed - `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually - `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1616/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1122414274,PR_kwDOBm6k_c4x_evE,1622,Test against Python 3.11-dev,9599,simonw,closed,0,,,,,1,2022-02-02T21:39:38Z,2022-02-02T21:58:53Z,2022-02-02T21:58:53Z,OWNER,simonw/datasette/pulls/1622,Refs #1621,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1622/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1121618041,I_kwDOBm6k_c5C2oh5,1620,"Link: rel=""alternate"" to JSON for queries too",9599,simonw,closed,0,,,3268330,Datasette 1.0,3,2022-02-02T08:02:42Z,2022-02-02T21:53:02Z,2022-02-02T21:33:00Z,OWNER,,"Following: - #1533 I implemented it for tables and rows but I should have done queries as well.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1620/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1065431383,I_kwDOBm6k_c4_gTFX,1533,"Add `Link: rel=""alternate""` header pointing to JSON for a table/query",9599,simonw,closed,0,,,3268330,Datasette 1.0,4,2021-11-28T20:43:25Z,2022-02-02T07:56:51Z,2022-02-02T07:49:33Z,OWNER,,"Originally explored in https://github.com/simonw/datasette-notebook/issues/2#issuecomment-980789406 - I wanted an efficient way to scan a list of URLs and figure out which if any of those corresponded to Datasette tables, canned queries or SQL output that could be represented as a table on a page. It looks like a neat way to do that is with ` Link:` header like this: `Link: http://127.0.0.1:8058/fixtures/compound_three_primary_keys.json; rel=""alternate""; type=""application/datasette+json""` I can put a ` Could add support for `--batch-size` as seen in `insert`/`upsert` too - causing it to break the list up into batches and commit for each one. 
_Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/391#issuecomment-1021876055_",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/392/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1114638930,I_kwDOCGYnMM5CcApS,391,`sqlite-utils bulk` progress bar,9599,simonw,closed,0,,,,,2,2022-01-26T05:14:49Z,2022-01-26T05:17:20Z,2022-01-26T05:16:51Z,OWNER,,"It can easily have a progress bar because it works by looping through an iterator: https://github.com/simonw/sqlite-utils/blob/a9fca7efa4184fbb2a65ca1275c326950ed9d3c1/sqlite_utils/cli.py#L1014-L1018 Should also support the `--silent` option if I add this.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/391/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1114557284,I_kwDOCGYnMM5Cbstk,390,`sqlite-utils upsert` should require `--pk` more elegantly,9599,simonw,closed,0,,,,,1,2022-01-26T02:20:31Z,2022-01-26T03:20:25Z,2022-01-26T03:19:43Z,OWNER,,"Currently throws an ugly traceback: ``` % echo '[ {""id"": 1, ""name"": ""Lila""}, {""id"": 1, ""name"": ""Lila""} ]' | sqlite-utils upsert data.db chickens - Traceback (most recent call last): File ""/Users/simon/.local/share/virtualenvs/sqlite-utils-C4Ilevlm/bin/sqlite-utils"", line 33, in sys.exit(load_entry_point('sqlite-utils', 'console_scripts', 'sqlite-utils')()) File ""/Users/simon/.local/share/virtualenvs/sqlite-utils-C4Ilevlm/lib/python3.8/site-packages/click/core.py"", line 1137, in __call__ return self.main(*args, **kwargs) File ""/Users/simon/.local/share/virtualenvs/sqlite-utils-C4Ilevlm/lib/python3.8/site-packages/click/core.py"", line 1062, in main rv = self.invoke(ctx) File ""/Users/simon/.local/share/virtualenvs/sqlite-utils-C4Ilevlm/lib/python3.8/site-packages/click/core.py"", line 1668, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/Users/simon/.local/share/virtualenvs/sqlite-utils-C4Ilevlm/lib/python3.8/site-packages/click/core.py"", line 1404, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/Users/simon/.local/share/virtualenvs/sqlite-utils-C4Ilevlm/lib/python3.8/site-packages/click/core.py"", line 763, in invoke return __callback(*args, **kwargs) File ""/Users/simon/Dropbox/Development/sqlite-utils/sqlite_utils/cli.py"", line 1104, in upsert insert_upsert_implementation( File ""/Users/simon/Dropbox/Development/sqlite-utils/sqlite_utils/cli.py"", line 906, in insert_upsert_implementation db[table].insert_all( File ""/Users/simon/Dropbox/Development/sqlite-utils/sqlite_utils/db.py"", line 2615, in insert_all raise PrimaryKeyRequired(""upsert() requires a pk"") sqlite_utils.db.PrimaryKeyRequired: upsert() requires a pk ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/390/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1099897648,I_kwDOCGYnMM5Bjxsw,384,Add examples to every `--help`,9599,simonw,closed,0,,,,,0,2022-01-12T05:31:25Z,2022-01-26T03:15:02Z,2022-01-26T03:15:02Z,OWNER,,Everything on https://sqlite-utils.datasette.io/en/stable/cli-reference.html would benefit from an example.,140912432,sqlite-utils,issue,,,"{""url"": 
""https://api.github.com/repos/simonw/sqlite-utils/issues/384/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 471818939,MDU6SXNzdWU0NzE4MTg5Mzk=,48,"Jupyter notebook demo of the library, launchable on Binder",9599,simonw,closed,0,,,,,2,2019-07-23T17:05:05Z,2022-01-26T02:08:46Z,2022-01-26T02:08:39Z,OWNER,,,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/48/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1114544727,I_kwDOCGYnMM5CbppX,389,Plausible analytics for documentation,9599,simonw,closed,0,,,,,2,2022-01-26T01:58:35Z,2022-01-26T02:07:41Z,2022-01-26T02:07:41Z,OWNER,,"```html ``` _Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/388#issuecomment-1021785268_",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/389/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1111293050,I_kwDOCGYnMM5CPPx6,387,Python library docs should start with a self contained example,9599,simonw,closed,0,,,,,1,2022-01-22T06:23:56Z,2022-01-26T01:37:17Z,2022-01-26T01:35:30Z,OWNER,,You have to read a lot of stuff in a lot of different places to get started with the Python library. Add a getting started introduction to https://sqlite-utils.datasette.io/en/stable/python-api.html,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/387/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1087913724,I_kwDOBm6k_c5A2D78,1577,Drop support for Python 3.6,9599,simonw,closed,0,,,3268330,Datasette 1.0,6,2021-12-23T18:17:03Z,2022-01-25T23:30:03Z,2022-01-20T04:31:41Z,OWNER,,"*Original title: Decide when to drop support for Python 3.6* > `context_vars` can solve this but they were introduced in Python 3.7: https://www.python.org/dev/peps/pep-0567/ > > Python 3.6 support ends in a few days time, and it looks like Glitch has updated to 3.7 now - so maybe I can get away with Datasette needing 3.7 these days? > > Tweeted about that here: https://twitter.com/simonw/status/1473761478155010048 _Originally posted by @simonw in https://github.com/simonw/datasette/issues/1576#issuecomment-999878907_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1577/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1109884720,I_kwDOBm6k_c5CJ38w,1609,"Ensure ""pip install datasette"" still works with Python 3.6",9599,simonw,closed,0,,,,,12,2022-01-21T00:08:10Z,2022-01-24T19:20:09Z,2022-01-21T02:24:13Z,OWNER,,"## Original title: Can I keep ""pip install datasette"" working on Python 3.6? I dropped support for 3.6 in: - #1577 I'm getting reports that `pip3 install datasette` throws an error on that Python, even though I haven't made that new release yet - presumably due to lack of pinning of Uvicorn: https://twitter.com/ldodds/status/1484289475195080706 Is it possible to get `pip` on that version of Python to install the highest possible version of the packages that are still known to support Python 3.6? 
If so, how?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1609/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1105916061,I_kwDOBm6k_c5B6vCd,1601,Add KNN and data_licenses to hidden tables list,25778,eyeseast,closed,0,,,,,5,2022-01-17T14:19:57Z,2022-01-20T21:29:44Z,2022-01-20T04:38:54Z,CONTRIBUTOR,,"They're generated by Spatialite and not very interesting in most cases. ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1601/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 824064069,MDU6SXNzdWU4MjQwNjQwNjk=,1249,Updated Dockerfile with SpatiaLite version 5.0,9599,simonw,closed,0,,,,,45,2021-03-08T00:17:36Z,2022-01-20T21:29:43Z,2021-03-29T00:57:13Z,OWNER,,"The version bundled in Datasette's Docker image right now is 4.4.0-RC0 https://github.com/simonw/datasette/blob/d0fd833b8cdd97e1b91d0f97a69b494895d82bee/Dockerfile#L16-L17 5 has been out for a couple of months and has a bunch of big improvements, most notable stable KNN support.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1249/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 837308703,MDU6SXNzdWU4MzczMDg3MDM=,1268,Figure out why SpatiaLite 5.0 hangs the database page on Linux,9599,simonw,closed,0,,,,,18,2021-03-22T04:44:16Z,2022-01-20T21:29:43Z,2021-03-22T17:41:12Z,OWNER,,"See detailed notes in https://github.com/simonw/datasette/issues/1249 - for some reason SpatiaLite 5.0 hangs the `/dbname` page on Linux (inside Docker containers, both with a custom compiled SpatiaLite and one installed from the Ubuntu 20.10 package repository). 
This doesn't happen on macOS with SpatiaLite 5 installed using Homebrew.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1268/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 842416110,MDU6SXNzdWU4NDI0MTYxMTA=,1278,SpatiaLite timezones demo is broken,9599,simonw,closed,0,,,,,2,2021-03-27T04:45:27Z,2022-01-20T21:29:43Z,2021-03-27T16:17:13Z,OWNER,,https://github.com/simonw/datasette/blob/5fd02890650db790b2ffdb90eb9f78f8e0639c37/docs/spatialite.rst#L96,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1278/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1083581011,I_kwDOBm6k_c5AliJT,1564,_prepare_connection not called on write connections,9599,simonw,closed,0,,,7571612,Datasette 0.60,1,2021-12-17T20:06:47Z,2022-01-20T21:29:43Z,2021-12-18T01:58:44Z,OWNER,,"I was trying to initalize SpatiaLite in a write connection: ```pycon >>> from datasette.app import Datasette >>> ds = Datasette(memory=True, files=[], sqlite_extensions=[""spatialite""]) >>> db = ds.add_memory_database('geo') >>> await db.execute_write(""select InitSpatialMetadata(1)"") UUID('3f143baa-4e3d-5842-a36f-4fa2f683b72f') no such function: InitSpatialMetadata ``` It looks like the code that loads additional modules only works on read-only connections, not on write connections: https://github.com/simonw/datasette/blob/92a5280d2e75c39424a75ad6226fc74400ae984f/datasette/database.py#L146-L153 Compared to: https://github.com/simonw/datasette/blob/92a5280d2e75c39424a75ad6226fc74400ae984f/datasette/database.py#L124-L132",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1564/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 752966476,MDU6SXNzdWU3NTI5NjY0NzY=,1114,--load-extension=spatialite not working with datasetteproject/datasette docker image,2182,danp,closed,0,,,,,4,2020-11-29T17:35:20Z,2022-01-20T21:29:42Z,2020-11-29T17:37:45Z,CONTRIBUTOR,,"https://github.com/simonw/datasette/commit/6aa5886379dd9017215904fb28567b80018902f9 added the `--load-extension=spatialite` shortcut looking for the extension in these places: https://github.com/simonw/datasette/blob/12877d7a48e2aa28bb5e780f929a218f7265d849/datasette/utils/__init__.py#L56-L60 However, in the datasetteproject/datasette docker image the file is at `/usr/local/lib/mod_spatialite.so`. This results in the example command [here](https://docs.datasette.io/en/stable/installation.html#loading-spatialite) failing: ``` % docker run --rm -p 8001:8001 -v `pwd`:/mnt datasetteproject/datasette datasette -p 8001 -h 0.0.0.0 /mnt/data.db --load-extension=spatialite Error: Could not find SpatiaLite extension ``` But it does work when given an explicit path: ``` % docker run --rm -p 8001:8001 -v `pwd`:/mnt datasetteproject/datasette datasette -p 8001 -h 0.0.0.0 /mnt/data.db --load-extension=/usr/local/lib/mod_spatialite.so INFO: Started server process [1] INFO: Waiting for application startup. INFO: Application startup complete. INFO: Uvicorn running on http://0.0.0.0:8001 (Press CTRL+C to quit) ... 
``` Perhaps `SPATIALITE_PATHS` should include `/usr/local/lib/mod_spatialite.so`?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1114/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 752995227,MDU6SXNzdWU3NTI5OTUyMjc=,1115,SpatiaLite error could suggest --load-extension=spatialite,9599,simonw,closed,0,,,,,1,2020-11-29T20:05:07Z,2022-01-20T21:29:42Z,2020-11-29T20:13:22Z,OWNER,,"https://github.com/simonw/datasette/blob/242bc89fdf2e775e340d69a4e851b3a9accb31c6/datasette/cli.py#L533-L548 This could use the `find_spatialite()` function and, if it finds something, suggest the user use `--load-extension=spatialite` https://github.com/simonw/datasette/blob/242bc89fdf2e775e340d69a4e851b3a9accb31c6/datasette/utils/__init__.py#L1015-L1019",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1115/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 443040665,MDU6SXNzdWU0NDMwNDA2NjU=,466,"Move ""no such module: VirtualSpatialIndex"" code elsewhere",9599,simonw,closed,0,,,4305096,0.28,2,2019-05-11T22:09:00Z,2022-01-20T21:29:41Z,2019-05-11T22:57:22Z,OWNER,,"We currently show a useful warning (from #331) when the user tries to open a spatialite database without first loading the module: https://github.com/simonw/datasette/blob/c692cd291111050483a32bea1ee08e994a0b781b/datasette/app.py#L547-L554 This code is part of `.inspect()` which is going away - see #462 - so I need to find somewhere else for it to live.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/466/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 563347679,MDU6SXNzdWU1NjMzNDc2Nzk=,668,Make it easier to load SpatiaLite,9599,simonw,closed,0,,,,,2,2020-02-11T17:03:43Z,2022-01-20T21:29:41Z,2021-01-04T20:18:39Z,OWNER,,"``` $ datasette spatial.db Serve! files=('spatial.db',) (immutables=()) on port 8001 ERROR: conn=, sql = 'PRAGMA table_info(SpatialIndex);', params = None: no such module: VirtualSpatialIndex Usage: datasette serve [OPTIONS] [FILES]... Error: It looks like you're trying to load a SpatiaLite database without first loading the SpatiaLite module. 
Read more: https://datasette.readthedocs.io/en/latest/spatialite.html ``` This error message could sniff around in the common locations for the SpatiaLite module and output the CLI command you should use to enable it: ``` datasette spatial.db --load-extension=/usr/local/lib/mod_spatialite.dylib ``` Even better: if Datasette had a `--spatialite` option which automatically loads the extension from common locations, if it can find it.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/668/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 723803777,MDU6SXNzdWU3MjM4MDM3Nzc=,1028,--load-extension=spatialite shortcut,9599,simonw,closed,0,,,6026070,0.51,1,2020-10-17T17:02:08Z,2022-01-20T21:29:41Z,2020-10-19T22:37:55Z,OWNER,,I added this to `sqlite-utils` in https://github.com/simonw/sqlite-utils/issues/136 and I really like it: pass a special value of `spatialite` and Datasette should attempt to load it from known likely installation locations.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1028/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 336936010,MDU6SXNzdWUzMzY5MzYwMTA=,331,Datasette throws error when loading spatialite db without extension loaded,82988,psychemedia,closed,0,,,,,2,2018-06-29T09:51:14Z,2022-01-20T21:29:40Z,2018-07-10T15:13:36Z,CONTRIBUTOR,,"When starting datasette on a SpatialLite database *without* loading the SpatiaLite extension (using eg `--load-extension=/usr/local/lib/mod_spatialite.dylib`) an error is thrown and the server fails to start: ``` datasette -p 8003 adminboundaries.db Serve! files=('adminboundaries.db',) on port 8003 Traceback (most recent call last): File ""/Users/ajh59/anaconda3/bin/datasette"", line 11, in sys.exit(cli()) File ""/Users/ajh59/anaconda3/lib/python3.6/site-packages/click/core.py"", line 722, in __call__ return self.main(*args, **kwargs) File ""/Users/ajh59/anaconda3/lib/python3.6/site-packages/click/core.py"", line 697, in main rv = self.invoke(ctx) File ""/Users/ajh59/anaconda3/lib/python3.6/site-packages/click/core.py"", line 1066, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/Users/ajh59/anaconda3/lib/python3.6/site-packages/click/core.py"", line 895, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/Users/ajh59/anaconda3/lib/python3.6/site-packages/click/core.py"", line 535, in invoke return callback(*args, **kwargs) File ""/Users/ajh59/anaconda3/lib/python3.6/site-packages/datasette/cli.py"", line 552, in serve ds.inspect() File ""/Users/ajh59/anaconda3/lib/python3.6/site-packages/datasette/app.py"", line 273, in inspect ""tables"": inspect_tables(conn, self.metadata.get(""databases"", {}).get(name, {})) File ""/Users/ajh59/anaconda3/lib/python3.6/site-packages/datasette/inspect.py"", line 79, in inspect_tables ""PRAGMA table_info({});"".format(escape_sqlite(table)) sqlite3.OperationalError: no such module: VirtualSpatialIndex ``` It would be nice to trap this and return a message saying something like: ``` It looks like you're trying to load a SpatiaLite database? Make sure you load in the SpatiaLite extension when starting datasette. 
Read more: https://datasette.readthedocs.io/en/latest/spatialite.html ``` ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/331/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1104691662,I_kwDOBm6k_c5B2EHO,1600,plugins --all example should use cog,9599,simonw,closed,0,,,,,1,2022-01-15T11:47:49Z,2022-01-20T05:06:21Z,2022-01-20T05:04:16Z,OWNER,,The example output for `datasette plugins --all`on this page has got out of date: https://docs.datasette.io/en/stable/plugins.html#seeing-what-plugins-are-installed,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1600/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1108846067,I_kwDOBm6k_c5CF6Xz,1606,Tests failing against Python 3.6,9599,simonw,closed,0,,,,,3,2022-01-20T04:22:44Z,2022-01-20T04:36:42Z,2022-01-20T04:36:42Z,OWNER,,"https://github.com/simonw/datasette/runs/4877484366 ``` E File ""/opt/hostedtoolcache/Python/3.6.15/x64/lib/python3.6/site-packages/uvicorn/server.py"", line 67, in run E return asyncio.run(self.serve(sockets=sockets)) E AttributeError: module 'asyncio' has no attribute 'run' ``` I think this may mean `uvicorn` has dropped support for Python 3.6.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1606/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1097040427,I_kwDOBm6k_c5BY4Ir,1587,Add `sqlite_stat1`(-4) tables to hidden table list,9599,simonw,closed,0,,,,,2,2022-01-08T21:28:20Z,2022-01-20T04:12:59Z,2022-01-20T04:12:59Z,OWNER,,"> Running `ANALYZE` creates a new visible table called `sqlite_stat1`: https://www.sqlite.org/fileformat.html#the_sqlite_stat1_table > > This should be added to the default list of hidden tables in Datasette.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1587/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1107557831,I_kwDOCGYnMM5CA_3H,386,"Better ""contributing"" documentation",9599,simonw,closed,0,,,,,0,2022-01-19T02:11:48Z,2022-01-19T02:15:21Z,2022-01-19T02:15:21Z,OWNER,,"This page jumps straight into running the tests: https://sqlite-utils.datasette.io/en/latest/contributing.html It should add a little more about expected collaboration styles - opening an issue before filing a pull request - and probably link to https://simonwillison.net/2022/Jan/12/how-i-build-a-feature/",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/386/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1083669410,I_kwDOBm6k_c5Al3ui,1566,Release Datasette 0.60,9599,simonw,closed,0,,,7571612,Datasette 0.60,6,2021-12-17T22:58:12Z,2022-01-14T01:59:55Z,2022-01-14T01:59:55Z,OWNER,,Using this as a tracking issue. 
I'm hoping to get the bulk of the JSON redesign work from the refactor in #1554 in for this release.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1566/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1102484126,I_kwDOBm6k_c5BtpKe,1595,Release notes for 0.60,9599,simonw,closed,0,,,7571612,Datasette 0.60,4,2022-01-13T22:23:14Z,2022-01-14T01:37:39Z,2022-01-14T01:37:39Z,OWNER,,,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1595/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1099723916,I_kwDOBm6k_c5BjHSM,1590,Table+query JSON and CSV links broken when using `base_url` setting,1001306,eelkevdbos,closed,0,,,7571612,Datasette 0.60,11,2022-01-11T23:46:39Z,2022-01-14T01:16:34Z,2022-01-14T01:16:08Z,NONE,,"Datasette appends the prefix found in the `base_url` setting twice if a `base_url` is set. In the follow asgi example, I'm hosting a custom Datasette instance: ```python # asgi.py import pathlib from asgi_cors import asgi_cors from channels.routing import URLRouter from django.urls import re_path from datasette.app import Datasette datasette_ = Datasette( files=[], settings={ ""base_url"": ""/datasettes/"", ""plugins"": {} }, config_dir=pathlib.Path('.'), ) application = URLRouter([ re_path(r""^datasettes/.*"", asgi_cors(datasette_.app(), allow_all=True)), ]) ``` Running it with: ```shell $ daphne -p 8002 asgi:application ``` Using a simple query on the `_memory` table: ```sql select sqlite_version() ``` http://localhost:8002/datasettes/_memory?sql=select+sqlite_version%28%29 It renders the following upon inspection: ![image](https://user-images.githubusercontent.com/1001306/149038851-aa842950-126a-467c-9a86-fae13bce6221.png) I am using datasette version `0.59.4`",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1590/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1059555791,I_kwDOBm6k_c4_J4nP,1527,Columns starting with an underscore behave poorly in filters,9599,simonw,closed,0,,,7571612,Datasette 0.60,7,2021-11-22T01:01:36Z,2022-01-14T00:57:08Z,2022-01-14T00:57:08Z,OWNER,,"Similar bug to #1525 (and #1506 before it). Start on https://latest.datasette.io/fixtures/facetable?_facet=_neighborhood - then select a neighborhood - then try to remove that filter using the little ""x"" and submitting the form again. 
![filter-bug](https://user-images.githubusercontent.com/9599/142786754-31d265a2-944d-4ea2-af6f-305d445a2ccb.gif) ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1527/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1102637351,I_kwDOBm6k_c5BuOkn,1598,Replace update-docs-help.py script with cog,9599,simonw,closed,0,,,,,1,2022-01-14T00:33:27Z,2022-01-14T00:47:57Z,2022-01-14T00:47:57Z,OWNER,,"I introduced `cog` in #1594 - I can use this to replace the older `update-docs-help.py` mechanism: https://github.com/simonw/datasette/blob/76d66d5b2bf10249c0beaac0999b93ac8d757f48/tests/test_docs.py#L36-L53",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1598/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1102612922,I_kwDOBm6k_c5BuIm6,1597,"""datasette inspect"" has no help summary",9599,simonw,closed,0,,,,,1,2022-01-14T00:02:16Z,2022-01-14T00:07:36Z,2022-01-14T00:07:36Z,OWNER,,"Made obvious by the new CLI reference page added in #1594. https://docs.datasette.io/en/latest/cli-reference.html#datasette-inspect-help ``` Commands: serve* Serve up specified SQLite database files with a web UI inspect install Install Python packages - e.g. ``` ``` Usage: datasette inspect [OPTIONS] [FILES]... Options: --inspect-file TEXT --load-extension TEXT Path to a SQLite extension to load --help Show this message and exit. ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1597/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1082746149,I_kwDOBm6k_c5AiWUl,1560,"Table page title has ""where where"" in it",9599,simonw,closed,0,,,7571612,Datasette 0.60,0,2021-12-17T00:05:48Z,2022-01-13T22:28:35Z,2022-01-13T22:20:15Z,OWNER,,"Just noticed this while working on #1518. 
``` % curl -s 'https://latest.datasette.io/fixtures/facetable?_sort=pk&on_earth__exact=1' | grep -C 1 '' <head> <title>fixtures: facetable: 14 rows where where on_earth = 1 sorted by pk ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1560/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 991467558,MDU6SXNzdWU5OTE0Njc1NTg=,1466,Add Datasette Desktop to installation documentation,9599,simonw,closed,0,,,7571612,Datasette 0.60,2,2021-09-08T19:41:27Z,2022-01-13T22:28:28Z,2022-01-13T21:55:18Z,OWNER,,See https://datasette.io/desktop and https://simonwillison.net/2021/Sep/8/datasette-desktop/,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1466/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1102359726,I_kwDOBm6k_c5BtKyu,1594,"Add a CLI reference page to the docs, inspired by sqlite-utils",9599,simonw,closed,0,,,7571612,Datasette 0.60,3,2022-01-13T20:55:08Z,2022-01-13T22:28:22Z,2022-01-13T21:38:48Z,OWNER,,"Thought of this while posting this comment: https://github.com/simonw/datasette/issues/1591#issuecomment-1012506595 I added https://sqlite-utils.datasette.io/en/stable/cli-reference.html to `sqlite-utils` in https://github.com/simonw/sqlite-utils/issues/383 and I _really_ like it - it's a page showing the `--help` output of every CLI command for that tool. It's maintained using `cog`. One of the benefits is that I get a free commit history of changes to `--help` at https://github.com/simonw/sqlite-utils/commits/main/docs/cli-reference.rst",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1594/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1097101917,I_kwDOBm6k_c5BZHJd,1588,`explain query plan select` is too strict about whitespace,9599,simonw,closed,0,,,7571612,Datasette 0.60,3,2022-01-09T04:22:42Z,2022-01-13T22:28:19Z,2022-01-13T20:35:05Z,OWNER,,"`explain query plan select * from facetable` is allowed: https://latest.datasette.io/fixtures?sql=explain+query+plan+select+*+from+facetable But... `explain query plan select * from facetable` (with two spaces before the `select`) returns a ""Statement must be a SELECT"" error: https://latest.datasette.io/fixtures?sql=explain+query+plan++select+*+from+facetable",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1588/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1087931918,I_kwDOBm6k_c5A2IYO,1579,`.execute_write(... block=True)` should be the default behaviour,9599,simonw,closed,0,,,7571612,Datasette 0.60,7,2021-12-23T18:54:28Z,2022-01-13T22:28:08Z,2021-12-23T19:18:26Z,OWNER,,"Every single piece of code I've written against the write APIs has used the `block=True` option to wait for the result. Without that, it instead fires the write into the queue but then continues even before it has finished executing. 
`block=True` should clearly be the default behaviour here!",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1579/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1076388044,I_kwDOBm6k_c5AKGDM,1547,Writable canned queries fail to load custom templates,127565,wragge,closed,0,,,7571612,Datasette 0.60,6,2021-12-10T03:31:48Z,2022-01-13T22:27:59Z,2021-12-19T21:12:00Z,CONTRIBUTOR,,"I've created a canned query with `""write"": true` set. I've also created a custom template for it, but the template doesn't seem to be found. If I look in the HTML I see (`stock_exchange` is the db name): `` My non-writeable canned queries pick up custom templates as expected, and if I look at their HTML I see the canned query name added to the templates considered (the canned query here is `date_search`): `` So it seems like the writeable canned query is behaving differently for some reason. Is it an authentication thing? I'm using the built in `--root` authentication. Thanks! ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1547/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1083927147,I_kwDOBm6k_c5Am2pr,1571,Track number of executions for execute_write_many() in traces,9599,simonw,closed,0,,,7571612,Datasette 0.60,0,2021-12-18T19:16:17Z,2022-01-13T22:27:49Z,2021-12-19T20:30:40Z,OWNER,,"Spotted while working on #1555 There's no indication there of how many times `execute_write_many()` executed the SQL. Solving this is a tiny bit tricky because `params_seq` is an iterator that we don't want to exhaust before passing it to `conn.executemany()` - so we need to instead wrap it in something that counts how many times it was called. But then we need a way to attach that to the trace here: https://github.com/simonw/datasette/blob/d637ed46762fdbbd8e32b86f258cd9a53c1cfdc7/datasette/database.py#L115-L122 So probably need to redesign the `trace()` decorator to allow extra pairs to be attached to it within the `with` statement. 
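One possible approach (a minimal sketch, not taken from this issue - the `CountingIterator` name and usage are hypothetical): wrap `params_seq` in an iterator that counts how many items `conn.executemany()` pulls from it, then attach that count to the trace afterwards.

```python
# Hypothetical sketch: count how many parameter tuples executemany() consumes,
# without exhausting the iterator before handing it to the database.
class CountingIterator:
    def __init__(self, inner):
        self._inner = iter(inner)
        self.count = 0

    def __iter__(self):
        return self

    def __next__(self):
        item = next(self._inner)  # StopIteration propagates when the source runs out
        self.count += 1
        return item

# Possible usage inside execute_write_many():
#     counter = CountingIterator(params_seq)
#     conn.executemany(sql, counter)
#     # ...then record counter.count as the number of executions in the trace.
```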
",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1571/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1084007781,I_kwDOBm6k_c5AnKVl,1572,"""Query took"" should be ""Queries took""",9599,simonw,closed,0,,,7571612,Datasette 0.60,0,2021-12-19T04:03:00Z,2022-01-13T22:27:43Z,2021-12-19T04:03:24Z,OWNER,,"This is misleading, since usually there have been more than one query executed: ![CleanShot 2021-12-18 at 20 02 35@2x](https://user-images.githubusercontent.com/9599/146663457-9c4c2900-5cc0-4650-a565-bb1ff0b8a725.png) ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1572/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1083921371,I_kwDOBm6k_c5Am1Pb,1570,Separate db.execute_write() into three methods,9599,simonw,closed,0,,,7571612,Datasette 0.60,2,2021-12-18T18:45:54Z,2022-01-13T22:27:38Z,2021-12-18T18:57:25Z,OWNER,,"> Rather than adding a `executemany=True` parameter, I'm now thinking a better design might be to have three methods: > > - `db.execute_write(sql, params=None, block=False)` > - `db.execute_write_script(sql, block=False)` > - `db.execute_write_many(sql, params_seq, block=False)` _Originally posted by @simonw in https://github.com/simonw/datasette/issues/1555#issuecomment-997267416_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1570/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1079149656,I_kwDOBm6k_c5AUoRY,1555,Optimize all those calls to index_list and foreign_key_list,9599,simonw,closed,0,,,7571612,Datasette 0.60,27,2021-12-13T23:50:56Z,2022-01-13T22:27:32Z,2021-12-19T20:55:59Z,OWNER,,"On the first hit to a restarted index I'm seeing this in the SQL traces: https://latest-with-plugins.datasette.io/github/commits?_trace=1 I imagine this could be sped up a lot using tricks like this one from the SQLite documentation: https://sqlite.org/pragma.html#pragfunc ```sql SELECT DISTINCT m.name || '.' 
|| ii.name AS 'indexed-columns' FROM sqlite_schema AS m, pragma_index_list(m.name) AS il, pragma_index_info(il.name) AS ii WHERE m.type='table' ORDER BY 1; ``` https://latest-with-plugins.datasette.io/fixtures?sql=SELECT+DISTINCT+m.name+%7C%7C+%27.%27+%7C%7C+ii.name+AS+%27indexed-columns%27%0D%0A++FROM+sqlite_schema+AS+m%2C%0D%0A+++++++pragma_index_list%28m.name%29+AS+il%2C%0D%0A+++++++pragma_index_info%28il.name%29+AS+ii%0D%0A+WHERE+m.type%3D%27table%27%0D%0A+ORDER+BY+1%3B",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1555/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1083895395,I_kwDOBm6k_c5Amu5j,1569,"db.execute_write(..., executescript=True) parameter",9599,simonw,closed,0,,,7571612,Datasette 0.60,2,2021-12-18T18:20:47Z,2022-01-13T22:27:27Z,2021-12-18T18:34:18Z,OWNER,,"> Idea: teach `execute_write` to accept an optional `executescript=True` parameter, like this: ```diff diff --git a/datasette/database.py b/datasette/database.py index 468e936..1a424f5 100644 --- a/datasette/database.py +++ b/datasette/database.py @@ -94,10 +94,14 @@ class Database: f""file:{self.path}{qs}"", uri=True, check_same_thread=False ) - async def execute_write(self, sql, params=None, block=False): + async def execute_write(self, sql, params=None, executescript=False, block=False): + assert not executescript and params, ""Cannot use params with executescript=True"" def _inner(conn): with conn: - return conn.execute(sql, params or []) + if executescript: + return conn.executescript(sql) + else: + return conn.execute(sql, params or []) with trace(""sql"", database=self.name, sql=sql.strip(), params=params): results = await self.execute_write_fn(_inner, block=block) ``` _Originally posted by @simonw in https://github.com/simonw/datasette/issues/1555#issuecomment-997248364_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1569/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1083726550,I_kwDOBm6k_c5AmFrW,1568,Trace should show queries on the write connection too,9599,simonw,closed,0,,,7571612,Datasette 0.60,2,2021-12-18T02:34:12Z,2022-01-13T22:27:23Z,2021-12-18T02:42:34Z,OWNER,,"> Here's why - `trace` only applies to read, not write SQL operations: https://github.com/simonw/datasette/blob/7c8f8aa209e4ba7bf83976f8495d67c28fbfca24/datasette/database.py#L209-L211 _Originally posted by @simonw in https://github.com/simonw/datasette/issues/1555#issuecomment-997128508_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1568/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1083573206,I_kwDOBm6k_c5AlgPW,1563,Datasette(... 
files=) should not be a required argument,9599,simonw,closed,0,,,7571612,Datasette 0.60,2,2021-12-17T19:54:18Z,2022-01-13T22:27:18Z,2021-12-18T02:19:40Z,OWNER,,"```pycon >>> ds = Datasette(memory=True) Traceback (most recent call last): File """", line 1, in TypeError: __init__() missing 1 required positional argument: 'files' >>> ds = Datasette(memory=True, files=[]) ``` I wanted to create an in-memory Datasette for running some tests, no point in forcing me to pass `files=[]` to do that.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1563/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1083718998,I_kwDOBm6k_c5AmD1W,1567,Remove undocumented sqlite_functions mechanism,9599,simonw,closed,0,,,7571612,Datasette 0.60,0,2021-12-18T01:51:10Z,2022-01-13T22:27:04Z,2021-12-18T01:54:46Z,OWNER,,"I added this in 0b8c1b0a6da9cb8ac0d28cc90dd783de87554036 but it's never been documented and the same thing can now be achieved using the `prepare_connection` plugin hook. https://github.com/simonw/datasette/blob/0c91e59d2bbfc08884cfcf5d1b902a2f4968b7ff/datasette/app.py#L262 https://github.com/simonw/datasette/blob/0c91e59d2bbfc08884cfcf5d1b902a2f4968b7ff/datasette/app.py#L551-L552 It's used here in the tests: https://github.com/simonw/datasette/blob/69244a617b1118dcbd04a8f102173f04680cf08c/tests/fixtures.py#L156",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1567/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 520740741,MDU6SXNzdWU1MjA3NDA3NDE=,625,If you apply ?_facet_array=tags then &_facet=tags does nothing,9599,simonw,closed,0,,,7571612,Datasette 0.60,13,2019-11-11T04:59:29Z,2022-01-13T22:26:58Z,2021-12-16T20:12:22Z,OWNER,,"Start here: https://v0-30-2.datasette.io/fixtures/facetable?_facet_array=tags Note that `tags` is offered as a suggested facet. But if you click that you get this: https://v0-30-2.datasette.io/fixtures/facetable?_facet_array=tags&_facet=tags The `_facet=tags` is added to the URL and it's removed from the list of suggested tags... 
but the facet itself is not displayed: The `_facet=tags` facet should look like this: ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/625/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1082564912,I_kwDOBm6k_c5AhqEw,1557,`?_nosuggest=1` parameter for disabling facet suggestions on table view,9599,simonw,closed,0,,,7571612,Datasette 0.60,1,2021-12-16T19:21:42Z,2022-01-13T22:26:48Z,2021-12-16T19:24:59Z,OWNER,,"Found I wanted this while I was debugging #625 just to clean up the debug traces, but it makes sense as a partner to `?_nofacet=1` and `?_nocount=1` from #1350 and #1353.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1557/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1078702875,I_kwDOBm6k_c5AS7Mb,1552,Allow to set `facets_array` in metadata (like current `facets`),3556,davidbgk,closed,0,,,7571612,Datasette 0.60,9,2021-12-13T16:00:44Z,2022-01-13T22:26:15Z,2021-12-16T18:47:48Z,CONTRIBUTOR,,"For now, you can set a `facets` value (array) in your metadata file but I couldn't find a way to set a `facets_array` in order to provide default facets for arrays (like tags). My use-case is to access to [that kind of view](https://latest.datasette.io/fixtures/facetable?_facet_array=tags) by default without URL's parameters as with other default facets. _I'm new to datasette, and I'm willing to help with a PR if that is not already implemented and I missed it!_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1552/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1081318247,I_kwDOBm6k_c5Ac5tn,1556,"Show count of facet values always, not just for `?_facet_size=max`",9599,simonw,closed,0,,,7571612,Datasette 0.60,1,2021-12-15T17:49:01Z,2022-01-13T22:26:07Z,2021-12-15T17:58:06Z,OWNER,,"> You've caused me to rethink this feature - I no longer think there's value in only showing these numbers if `?_facet_size=max` as opposed to all of the time. 
_Originally posted by @simonw in https://github.com/simonw/datasette/issues/1423#issuecomment-995023410_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1556/reactions"", ""total_count"": 1, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 1, ""rocket"": 0, ""eyes"": 0}",,completed 1077893013,I_kwDOBm6k_c5AP1eV,1551,`keep_blank_values=True` when parsing `request.args`,9599,simonw,closed,0,,,7571612,Datasette 0.60,3,2021-12-12T19:53:07Z,2022-01-13T22:26:04Z,2021-12-12T20:02:01Z,OWNER,,"This code in `TableView` wouldn't be necessary: https://github.com/simonw/datasette/blob/492f9835aa7e90540dd0c6324282b109f73df71b/datasette/views/table.py#L396-L399 If that happened here instead: https://github.com/simonw/datasette/blob/492f9835aa7e90540dd0c6324282b109f73df71b/datasette/utils/asgi.py#L98-L100 _Originally posted by @simonw in https://github.com/simonw/datasette/issues/1518#issuecomment-991827468_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1551/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 520655983,MDU6SXNzdWU1MjA2NTU5ODM=,619,"""Invalid SQL"" page should let you edit the SQL",9599,simonw,closed,0,,,,,14,2019-11-10T20:54:12Z,2022-01-13T22:21:42Z,2021-06-02T04:15:54Z,OWNER,,"https://latest.datasette.io/fixtures?sql=select%0D%0A++*%0D%0Afrom%0D%0A++%5Bfoo%5D Would be useful if this page showed you the invalid SQL you entered so you can edit it and try again.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/619/reactions"", ""total_count"": 2, ""+1"": 2, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 642651572,MDU6SXNzdWU2NDI2NTE1NzI=,860,Plugin hook for instance/database/table metadata,9599,simonw,closed,0,,,,,10,2020-06-21T22:20:25Z,2022-01-13T22:21:42Z,2021-06-26T22:56:28Z,OWNER,,"I'm not happy with how `metadata.(json|yaml)` keeps growing new features. Rather than having a single plugin hook for all of `metadata.json` I'm going to split out the feature that shows actual real metadata for tables and databases - `source`, `license` etc - into its own plugin-powered mechanism. _Originally posted by @simonw in https://github.com/simonw/datasette/issues/357#issuecomment-647189045_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/860/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 681334912,MDU6SXNzdWU2ODEzMzQ5MTI=,942,Support column descriptions in metadata.json,9599,simonw,closed,0,,,,,18,2020-08-18T20:52:00Z,2022-01-13T22:21:42Z,2021-08-12T23:53:24Z,OWNER,,"Could look something like this: ```json { ""title"": ""Five Thirty Eight"", ""license"": ""CC Attribution 4.0 License"", ""license_url"": ""https://creativecommons.org/licenses/by/4.0/"", ""source"": ""fivethirtyeight/data on GitHub"", ""source_url"": ""https://github.com/fivethirtyeight/data"", ""databases"": { ""fivethirtyeight"": { ""tables"": { ""mueller-polls/mueller-approval-polls"": { ""description_html"": ""

....

"", ""columns"": { ""name_of_column"": ""column_description goes here"" } ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/942/reactions"", ""total_count"": 4, ""+1"": 4, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1067771698,I_kwDOCGYnMM4_pOcy,348,Command for creating an empty database,9599,simonw,closed,0,,,7558727,3.21,6,2021-11-30T23:24:27Z,2022-01-13T07:06:59Z,2022-01-09T20:33:20Z,OWNER,,"I sometimes find the need to create an empty SQLite database file - for example if I want to enable WAL on it before using it with another script. I currently do that like this: sqlite3 my.db vacuum sqlite-utils enable-wal my.db It would be nice if `sqlite-utils` had a convenience command for doing this.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/348/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1099584685,I_kwDOCGYnMM5BilSt,381,`sqlite-utils rows` options `--limit` and `--offset`,9599,simonw,closed,0,,,,,2,2022-01-11T20:23:12Z,2022-01-11T23:33:37Z,2022-01-11T23:19:36Z,OWNER,,Because I often want to use it just to preview a few rows from the database. Piping through `| head -n 20` works for JSON and CSV (they stream) but not for `--table`.,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/381/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1099585611,I_kwDOCGYnMM5BilhL,382,`--where` option for `sqlite-rows`,9599,simonw,closed,0,,,,,1,2022-01-11T20:24:23Z,2022-01-11T23:33:14Z,2022-01-11T23:32:47Z,OWNER,,CLI equivalent of `table.rows_where()` - should accept parameters too. Work on this at the same time as #381.,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/382/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1099586786,I_kwDOCGYnMM5Bilzi,383,Add documentation page with the output of `--help`,9599,simonw,closed,0,,,,,4,2022-01-11T20:25:58Z,2022-01-11T22:55:05Z,2022-01-11T21:44:05Z,OWNER,,"Can be maintained using `cog` from #373. Similar in purpose to the API reference page, but this is for the CLI.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/383/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1096558279,I_kwDOCGYnMM5BXCbH,365,create-index should run analyze after creating index,536941,fgregg,closed,0,,,7558727,3.21,16,2022-01-07T18:21:25Z,2022-01-11T02:43:34Z,2022-01-11T01:36:48Z,CONTRIBUTOR,,"sqlite's query planner depends upon analyze to make good use of indices. It would be nice if analyze was run as part of the create-index command. If data is inserted later, things can get out date, but it would still probably be a net win. 
",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/365/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1098574572,I_kwDOCGYnMM5Beurs,380,Release notes for 3.21,9599,simonw,closed,0,,,7558727,3.21,1,2022-01-11T02:12:30Z,2022-01-11T02:34:26Z,2022-01-11T02:34:26Z,OWNER,,For these commits: https://github.com/simonw/sqlite-utils/compare/3.20...129141572f249ea290e2a075437e2ebaad215859,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/380/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1097251014,I_kwDOCGYnMM5BZrjG,375,`sqlite-utils bulk` command,9599,simonw,closed,0,,,7558727,3.21,3,2022-01-09T17:12:38Z,2022-01-11T02:12:58Z,2022-01-11T02:10:55Z,OWNER,,"The `.executemany()` method is a very efficient way to execute the same SQL query against a huge list of parameters. `sqlite-utils insert` supports a bunch of ways of loading a list of dictionaries - from CSV, TSV, JSON, newline JSON and more thanks to: - #361 What if you could load a list of dictionaries and provide a SQL query with `:named` parameters that correspond to keys in those dictionaries instead? This would need to be a new command - I thought about adding a `--sql` option to `insert` but that doesn't make sense as that command already requires a table name.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/375/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1097477582,PR_kwDOCGYnMM4wtl17,377,`sqlite-utils bulk` command,9599,simonw,closed,0,,,7558727,3.21,3,2022-01-10T05:34:24Z,2022-01-11T02:10:57Z,2022-01-11T02:10:54Z,OWNER,simonw/sqlite-utils/pulls/377,"Refs #375 Still needs: - [x] Refactor `@insert_upsert_options` so that it doesn't duplicate `@import_options` - [x] Tests - [x] Documentation - [x] Try it against a really big file",140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/377/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1098544628,I_kwDOCGYnMM5BenX0,379,CLI options for running ANALYZE,9599,simonw,closed,0,,,7558727,3.21,0,2022-01-11T01:09:16Z,2022-01-11T01:38:01Z,2022-01-11T01:36:48Z,OWNER,,"> The Python methods are all done now, next step is the CLI options. I'll do those in a separate issue. 
_Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/366#issuecomment-1009508865_ - [x] `sqlite-utils analyze` command - [x] `sqlite-utils create-index --analyze` option (see #365) - [x] `sqlite-utils insert --analyze` option - [x] `sqlite-utils upsert --analyze` option In #378 I also added `.delete_where(..., analyze=True)` but there isn't currently a `sqlite-utils delete-where` CLI command - deletions via CLI are expected to be handled using SQL queries.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/379/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1096563265,I_kwDOCGYnMM5BXDpB,366,Python library methods for calling ANALYZE,9599,simonw,closed,0,,,7558727,3.21,10,2022-01-07T18:28:01Z,2022-01-11T01:09:33Z,2022-01-11T01:09:33Z,OWNER,,"> Relevant documentation: https://www.sqlite.org/lang_analyze.html _Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/365#issuecomment-1007633376_",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/366/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1098309897,I_kwDOCGYnMM5BduEJ,378,analyze=True parameter for some methods,9599,simonw,closed,0,,,7558727,3.21,0,2022-01-10T19:54:52Z,2022-01-11T01:08:11Z,2022-01-11T01:08:09Z,OWNER,,"This would cause `ANALYZE` to be run against the relevant table at the end of executing the method. > Having browsed the API reference I think the methods that would benefit from an `analyze=True` parameter are: - [x] `table.create_index` - [x] `table.insert_all` - [x] `table.upsert_all` - [x] `table.delete_where` _Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/366#issuecomment-1009288898_",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/378/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1097041471,PR_kwDOCGYnMM4wsVM6,367,Initial prototype of .analyze() methods,9599,simonw,closed,0,,,7558727,3.21,2,2022-01-08T21:35:12Z,2022-01-10T19:31:08Z,2022-01-10T19:31:08Z,OWNER,simonw/sqlite-utils/pulls/367,Refs #366,140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/367/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1, 1097436959,I_kwDOCGYnMM5BaY8f,376,`--nl` mode should ignore blank lines,9599,simonw,closed,0,,,7558727,3.21,0,2022-01-10T04:10:54Z,2022-01-10T19:27:41Z,2022-01-10T04:12:46Z,OWNER,,Spotted this while manually testing #364 - there's no reason `--nl` should crash if you feed it an empty line in between JSON objects.,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/376/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1097129710,I_kwDOCGYnMM5BZN7u,372,Idea: `suffix` and `stem` file columns,9599,simonw,closed,0,,,7558727,3.21,1,2022-01-09T07:48:53Z,2022-01-10T19:27:34Z,2022-01-09T20:17:00Z,OWNER,,"For https://sqlite-utils.datasette.io/en/stable/cli.html#inserting-data-from-files Given a 
file called `dogs.jpg` stem would be `dogs` and ext would be `jpg`. Need to decide what happens for `dogs.and.cats.jpg.gz`.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/372/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1097128334,I_kwDOCGYnMM5BZNmO,371,Support mutating row in `--convert` without returning it,9599,simonw,closed,0,,,7558727,3.21,6,2022-01-09T07:38:44Z,2022-01-10T19:27:30Z,2022-01-09T20:06:15Z,OWNER,,"Currently you have to do this: ``` $ sqlite-utils insert dogs.db dogs dogs.json --convert ' row[""is_good""] = 1 return row' ``` Would be neat if this worked too: ``` $ sqlite-utils insert dogs.db dogs dogs.json \ --convert 'row[""is_good""] = 1' ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/371/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1097135860,I_kwDOCGYnMM5BZPb0,374,`--fmt` should imply `-t`,9599,simonw,closed,0,,,7558727,3.21,4,2022-01-09T08:23:07Z,2022-01-10T19:27:26Z,2022-01-09T18:07:59Z,OWNER,,Not sure why I didn't implement this.,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/374/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1097135732,I_kwDOCGYnMM5BZPZ0,373,List `--fmt` options in the docs ,9599,simonw,closed,0,,,7558727,3.21,3,2022-01-09T08:22:11Z,2022-01-10T19:27:24Z,2022-01-09T17:49:00Z,OWNER,,https://sqlite-utils.datasette.io/en/stable/cli.html#table-formatted-output currently cheats and tells the user to run `--help` - can fix this using `cog`. 
,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/373/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1097087280,I_kwDOCGYnMM5BZDkw,368,Offer `python -m sqlite_utils` as an alternative to `sqlite-utils`,9599,simonw,closed,0,,,7558727,3.21,3,2022-01-09T02:29:30Z,2022-01-10T19:27:20Z,2022-01-09T02:40:50Z,OWNER,,"> Add this to `sqlite_utils/cli.py`: > > ```python > if __name__ == ""__main__"": > cli() > ``` > Now the tool can be run using `python -m sqlite_utils.cli --help` _Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/364#issuecomment-1008214998_",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/368/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1095570074,I_kwDOCGYnMM5BTRKa,364,`--batch-size 1` doesn't seem to commit for every item,9599,simonw,closed,0,,,7558727,3.21,16,2022-01-06T18:18:50Z,2022-01-10T19:27:17Z,2022-01-10T05:36:19Z,OWNER,,"I'm trying this, but it doesn't seem to write anything to the database file until I hit `CTRL+C`: ``` heroku logs --app=simonwillisonblog --tail | grep 'measure#nginx.service' | \ sqlite-utils insert /tmp/herokutail.db log - --import re --convert ""$(cat < ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/362/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1094890366,PR_kwDOCGYnMM4wlm3B,361,--lines and --text and --convert and --import,9599,simonw,closed,0,,,,,15,2022-01-06T01:49:44Z,2022-01-06T06:37:03Z,2022-01-06T06:24:54Z,OWNER,simonw/sqlite-utils/pulls/361,"Refs #356 Still TODO: - [x] Get `--lines` working, with tests - [x] Get `--text` working, with tests - [x] Get regular JSON import working with `--convert` with tests - [x] Get `--lines` working with `--convert` with tests - [x] Get `--text` working with `--convert` with tests - [x] Get `--csv` and `--tsv` import working with `--convert` with tests - [x] Get `--nl` working with `--convert` with tests - [x] Documentation for all of the above",140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/361/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1077431957,I_kwDOCGYnMM5AOE6V,356,`sqlite-utils insert --convert` option,9599,simonw,closed,0,,,,,11,2021-12-11T07:24:48Z,2022-01-06T06:30:13Z,2022-01-06T06:28:53Z,OWNER,,"Idea come to me while re-reading this: https://simonwillison.net/2021/Aug/6/sqlite-utils-convert/ This is a bit of a hack: ``` cat /tmp/log.txt | \ jq --raw-input '{line: .}' --compact-output | \ sqlite-utils insert /tmp/logs.db log - --nl ``` Would be great if you could pipe lines to `insert` and transform them on the way in. 
A `--convert python-code` option, modeled after `sqlite-utils convert`, could do this.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/356/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 807437089,MDU6SXNzdWU4MDc0MzcwODk=,228,--no-headers option for CSV and TSV,9599,simonw,closed,0,,,,,10,2021-02-12T17:56:51Z,2021-12-26T07:01:31Z,2021-02-14T22:25:17Z,OWNER,,"https://bl.iro.bl.uk/work/ns/3037474a-761c-456d-a00c-9ef3c6773f4c has a fascinating CSV file that doesn't have a header row - it starts like this: ```csv Computation and measurement of turbulent flow through idealized turbine blade passages,,""Loizou, Panos A."",https://isni.org/isni/0000000136122593,,University of Manchester,https://isni.org/isni/0000000121662407,1989,Thesis (Ph.D.),,Physical Sciences,,,https://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.232781, ""Prolactin and growth hormone secretion in normal, hyperprolactinaemic and acromegalic man"",,""Prescott, R. W. G."",https://isni.org/isni/0000000134992122,,University of Newcastle upon Tyne,https://isni.org/isni/0000000104627212,1983,Thesis (Ph.D.),,Biological Sciences,,,https://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.232784, ``` It would be useful if `sqlite-utils insert ... --csv` had a mechanism for importing files like this one.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/228/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 781262510,MDU6SXNzdWU3ODEyNjI1MTA=,1181,"Certain database names results in 404: ""Database not found: None""",1470389,jieter,closed,0,,,6346396,Datasette 0.54,4,2021-01-07T12:01:16Z,2021-12-21T18:25:15Z,2021-01-25T05:13:19Z,NONE,,"I have a file named `test-database (1).sqlite`. When requesting the home route `/`, I see datasette is able to read it correctly: However, if I click any of the links, datasette replies with: `Error 404 Database not found: None` It seems the hash is crucial, as renaming the file to `database (1).sqlite` makes the error go away. This lines checks for a single dash: https://github.com/simonw/datasette/blob/97fb10c17dd007a275ab743742e93e932335ad67/datasette/views/base.py#L184 ``` $ datasette test-database\ \(1\).sqlite INFO: Started server process [68314] INFO: Waiting for application startup. INFO: Application startup complete. INFO: Uvicorn running on http://127.0.0.1:8001 (Press CTRL+C to quit) INFO: 127.0.0.1:54043 - ""GET /favicon.ico HTTP/1.1"" 200 OK INFO: 127.0.0.1:54043 - ""GET / HTTP/1.1"" 200 OK ... INFO: 127.0.0.1:54044 - ""GET /favicon.ico HTTP/1.1"" 200 OK INFO: 127.0.0.1:54044 - ""GET /test-database (1) HTTP/1.1"" 404 Not Found ``` Version: ``` $ datasette --version datasette, version 0.53 ``` ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1181/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1079129258,PR_kwDOBm6k_c4vynly,1554,TableView refactor,9599,simonw,closed,0,,,,,3,2021-12-13T23:16:04Z,2021-12-20T23:52:11Z,2021-12-20T23:52:04Z,OWNER,simonw/datasette/pulls/1554,"I'm starting a PR with almost nothing in it so I can use the GitHub code commenting feature to add a bunch of comments to the code I intend to refactor. 
Related issues: - #617 - #715 - #870 - #1518",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1554/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1084257842,I_kwDOBm6k_c5AoHYy,1575,__call__() got an unexpected keyword argument 'specname',9599,simonw,closed,0,,,,,1,2021-12-20T01:24:04Z,2021-12-20T01:48:03Z,2021-12-20T01:47:57Z,OWNER,,"> I've installed the alpha version but get an error when starting up Datasette: ``` Traceback (most recent call last): File ""/Users/tim/.pyenv/versions/stock-exchange/bin/datasette"", line 5, in from datasette.cli import cli File ""/Users/tim/.pyenv/versions/3.8.5/envs/stock-exchange/lib/python3.8/site-packages/datasette/cli.py"", line 15, in from .app import Datasette, DEFAULT_SETTINGS, SETTINGS, SQLITE_LIMIT_ATTACHED, pm File ""/Users/tim/.pyenv/versions/3.8.5/envs/stock-exchange/lib/python3.8/site-packages/datasette/app.py"", line 31, in from .views.database import DatabaseDownload, DatabaseView File ""/Users/tim/.pyenv/versions/3.8.5/envs/stock-exchange/lib/python3.8/site-packages/datasette/views/database.py"", line 25, in from datasette.plugins import pm File ""/Users/tim/.pyenv/versions/3.8.5/envs/stock-exchange/lib/python3.8/site-packages/datasette/plugins.py"", line 29, in mod = importlib.import_module(plugin) File ""/Users/tim/.pyenv/versions/3.8.5/lib/python3.8/importlib/__init__.py"", line 127, in import_module return _bootstrap._gcd_import(name[level:], package, level) File ""/Users/tim/.pyenv/versions/3.8.5/envs/stock-exchange/lib/python3.8/site-packages/datasette/filters.py"", line 9, in @hookimpl(specname=""filters_from_request"") TypeError: __call__() got an unexpected keyword argument 'specname' ``` _Originally posted by @wragge in https://github.com/simonw/datasette/issues/1547#issuecomment-997511968_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1575/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1076057610,I_kwDOBm6k_c5AI1YK,1546,validating the sql,50336793,jadsongmatos,closed,0,,,,,1,2021-12-09T21:35:57Z,2021-12-18T02:05:17Z,2021-12-18T02:05:16Z,NONE,,Could someone tell me that part of the code is responsible for validating the sql that guarantees that only a table can be read,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1546/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 519613116,MDU6SXNzdWU1MTk2MTMxMTY=,617,Refactor TableView.data() method,9599,simonw,closed,0,,,,,9,2019-11-08T01:55:41Z,2021-12-18T01:41:47Z,2021-12-11T19:17:11Z,OWNER,,"This is by far the most complex piece of Datasette - the `TableView.data()` method is over 500 lines long and is increasingly getting in the way of cleanly implementing new features (e.g. #615 and #613). 
Need to break it up into smaller, cleaner pieces.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/617/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 445850934,MDU6SXNzdWU0NDU4NTA5MzQ=,473,Plugin hook: filters_from_request,9599,simonw,closed,0,,,,,13,2019-05-19T18:44:33Z,2021-12-17T23:11:30Z,2021-12-17T19:02:17Z,OWNER,,"I meant to add this as part of the facets plugin mechanism but didn't quite get to it. Original idea was to allow plugins to register extra filters, as seen in `datasette/filters.py`: https://github.com/simonw/datasette/blob/260085838887ee343f4d3b177c422e7aef5ade9d/datasette/filters.py#L83-L98",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/473/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1082743068,PR_kwDOBm6k_c4v-izc,1559,"filters_from_request plugin hook, now used in TableView",9599,simonw,closed,0,,,,,6,2021-12-16T23:59:33Z,2021-12-17T23:09:41Z,2021-12-17T19:02:15Z,OWNER,simonw/datasette/pulls/1559,"New plugin hook, refs #473 Used it to extract the logic from TableView that handles _search and _through and _where - refs #1518",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1559/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1083246400,PR_kwDOBm6k_c4wAMK8,1562,"Update janus requirement from <0.8,>=0.6.2 to >=0.6.2,<1.1",49699333,dependabot[bot],closed,0,,,,,4,2021-12-17T13:11:10Z,2021-12-17T23:08:29Z,2021-12-17T23:08:28Z,CONTRIBUTOR,simonw/datasette/pulls/1562,"Updates the requirements on [janus](https://github.com/aio-libs/janus) to permit the latest version.
Release notes

Sourced from janus's releases.

janus 1.0.0 release

  • Dropped Python 3.6 support
  • Janus is marked as stable, no API changes was made for years
Changelog

Sourced from janus's changelog.

1.0.0 (2021-12-17)

  • Drop Python 3.6 support

0.7.0 (2021-11-24)

  • Add SyncQueue and AsyncQueue Protocols to provide type hints for sync and async queues #374

0.6.2 (2021-10-24)

  • Fix Python 3.10 compatibility #358

0.6.1 (2020-10-26)

  • Raise RuntimeError on queue.join() after queue closing. #295

  • Replace timeout type from Optional[int] to Optional[float] #267

0.6.0 (2020-10-10)

  • Drop Python 3.5, the minimal supported version is Python 3.6

  • Support Python 3.9

  • Refomat with black

0.5.0 (2020-04-23)

  • Remove explicit loop arguments and forbid creating queues outside event loops #246

0.4.0 (2018-07-28)

  • Add py.typed macro #89

  • Drop python 3.4 support and fix minimal version python3.5.3 #88

  • Add property with that indicates if queue is closed #86

0.3.2 (2018-07-06)

  • Fixed python 3.7 support #97

... (truncated)

Commits

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`. [//]: # (dependabot-automerge-start) [//]: # (dependabot-automerge-end) ---
Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR: - `@dependabot rebase` will rebase this PR - `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it - `@dependabot merge` will merge this PR after your CI passes on it - `@dependabot squash and merge` will squash and merge this PR after your CI passes on it - `@dependabot cancel merge` will cancel a previously requested merge and block automerging - `@dependabot reopen` will reopen this PR if it is closed - `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually - `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1562/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1079422215,I_kwDOCGYnMM5AVq0H,357,pytest-runner is not required,4067843,pgajdos,closed,0,,,,,1,2021-12-14T07:51:24Z,2021-12-16T20:43:19Z,2021-12-16T20:43:13Z,NONE,,Deprecated pytest-runner is not necessary for running the testsuite.,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/357/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 962391325,MDU6SXNzdWU5NjIzOTEzMjU=,1423,Show count of facet values if ?_facet_size=max,9599,simonw,closed,0,,,,,9,2021-08-06T04:42:20Z,2021-12-15T17:48:40Z,2021-08-16T18:56:43Z,OWNER,,"I sometimes want to get a count of the values in a facet - if it's a facet of US states for example I want to know if all 50 are represented. Idea: if `?_facet_size=max` is present, add a count to the facet heading. So on: https://latest.datasette.io/fixtures/compound_three_primary_keys?_facet=content&_facet_size=max&_facet=pk1&_facet=pk2#facet-pk2 It could have something like this: Note that the first column shows >1000 - because in that case we've truncated the facet calculation since the maximum allowed returned rows is 1000.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1423/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1072135269,PR_kwDOBm6k_c4vb__Y,1543,Bump black from 21.11b1 to 21.12b0,49699333,dependabot[bot],closed,0,,,,,1,2021-12-06T13:11:16Z,2021-12-13T23:22:29Z,2021-12-13T23:22:29Z,CONTRIBUTOR,simonw/datasette/pulls/1543,"Bumps [black](https://github.com/psf/black) from 21.11b1 to 21.12b0.
Release notes

Sourced from black's releases.

21.12b0

Black

  • Fix determination of f-string expression spans (#2654)
  • Fix bad formatting of error messages about EOF in multi-line statements (#2343)
  • Functions and classes in blocks now have more consistent surrounding spacing (#2472)

Jupyter Notebook support

  • Cell magics are now only processed if they are known Python cell magics. Earlier, all cell magics were tokenized, leading to possible indentation errors e.g. with %%writefile. (#2630)
  • Fix assignment to environment variables in Jupyter Notebooks (#2642)

Python 3.10 support

  • Point users to using --target-version py310 if we detect 3.10-only syntax (#2668)
  • Fix match statements with open sequence subjects, like match a, b: or match a, *b: (#2639) (#2659)
  • Fix match/case statements that contain match/case soft keywords multiple times, like match re.match() (#2661)
  • Fix case statements with an inline body (#2665)
  • Fix styling of starred expressions inside match subject (#2667)
  • Fix parser error location on invalid syntax in a match statement (#2649)
  • Fix Python 3.10 support on platforms without ProcessPoolExecutor (#2631)
  • Improve parsing performance on code that uses match under --target-version py310 up to ~50% (#2670)

Packaging


Thank you!

  • @​isidentical for the polishing up 3.10 syntax support (which they contributed in the first place!)
  • @​MarcoGorelli for their ever-continuing work on Black's jupyter support
  • @​jalaziz for cleaning up our Pyinstaller CD workflow
  • @​hauntsaninja for helping us drop the regex dependency

And also congrats to first contributors!

Changelog

Sourced from black's changelog.

21.12b0

Black

  • Fix determination of f-string expression spans (#2654)
  • Fix bad formatting of error messages about EOF in multi-line statements (#2343)
  • Functions and classes in blocks now have more consistent surrounding spacing (#2472)

Jupyter Notebook support

  • Cell magics are now only processed if they are known Python cell magics. Earlier, all cell magics were tokenized, leading to possible indentation errors e.g. with %%writefile. (#2630)
  • Fix assignment to environment variables in Jupyter Notebooks (#2642)

Python 3.10 support

  • Point users to using --target-version py310 if we detect 3.10-only syntax (#2668)
  • Fix match statements with open sequence subjects, like match a, b: or match a, *b: (#2639) (#2659)
  • Fix match/case statements that contain match/case soft keywords multiple times, like match re.match() (#2661)
  • Fix case statements with an inline body (#2665)
  • Fix styling of starred expressions inside match subject (#2667)
  • Fix parser error location on invalid syntax in a match statement (#2649)
  • Fix Python 3.10 support on platforms without ProcessPoolExecutor (#2631)
  • Improve parsing performance on code that uses match under --target-version py310 up to ~50% (#2670)

Packaging

Commits

[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=black&package-manager=pip&previous-version=21.11b1&new-version=21.12b0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores) Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`. [//]: # (dependabot-automerge-start) [//]: # (dependabot-automerge-end) ---
Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR: - `@dependabot rebase` will rebase this PR - `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it - `@dependabot merge` will merge this PR after your CI passes on it - `@dependabot squash and merge` will squash and merge this PR after your CI passes on it - `@dependabot cancel merge` will cancel a previously requested merge and block automerging - `@dependabot reopen` will reopen this PR if it is closed - `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually - `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1543/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1076834768,PR_kwDOBm6k_c4vrZxV,1548,"Update pytest-xdist requirement from <2.5,>=2.2.1 to >=2.2.1,<2.6",49699333,dependabot[bot],closed,0,,,,,1,2021-12-10T13:12:06Z,2021-12-13T23:22:22Z,2021-12-13T23:22:21Z,CONTRIBUTOR,simonw/datasette/pulls/1548,"Updates the requirements on [pytest-xdist](https://github.com/pytest-dev/pytest-xdist) to permit the latest version.
Changelog

Sourced from pytest-xdist's changelog.

pytest-xdist 2.5.0 (2021-12-10)

Features

  • [#722](https://github.com/pytest-dev/pytest-xdist/issues/722) <https://github.com/pytest-dev/pytest-xdist/issues/722>_: Full compatibility with pytest 7 - no deprecation warnings or use of legacy features.

  • [#733](https://github.com/pytest-dev/pytest-xdist/issues/733) <https://github.com/pytest-dev/pytest-xdist/issues/733>_: New --dist=loadgroup option, which ensures all tests marked with @pytest.mark.xdist_group run in the same session/worker. Other tests run distributed as in --dist=load.

Trivial Changes

  • [#708](https://github.com/pytest-dev/pytest-xdist/issues/708) <https://github.com/pytest-dev/pytest-xdist/issues/708>_: Use @pytest.hookspec decorator to declare hook options in newhooks.py to avoid warnings in pytest 7.0.

  • [#719](https://github.com/pytest-dev/pytest-xdist/issues/719) <https://github.com/pytest-dev/pytest-xdist/issues/719>_: Use up-to-date setup.cfg/pyproject.toml packaging setup.

  • [#720](https://github.com/pytest-dev/pytest-xdist/issues/720) <https://github.com/pytest-dev/pytest-xdist/issues/720>_: Require pytest>=6.2.0.

  • [#721](https://github.com/pytest-dev/pytest-xdist/issues/721) <https://github.com/pytest-dev/pytest-xdist/issues/721>_: Started using type annotations and mypy checking internally. The types are incomplete and not published.

pytest-xdist 2.4.0 (2021-09-20)

Features

  • [#696](https://github.com/pytest-dev/pytest-xdist/issues/696) <https://github.com/pytest-dev/pytest-xdist/issues/696>_: On Linux, the process title now changes to indicate the current worker state (running/idle).

    Depends on the setproctitle <https://pypi.org/project/setproctitle/>__ package, which can be installed with pip install pytest-xdist[setproctitle].

  • [#704](https://github.com/pytest-dev/pytest-xdist/issues/704) <https://github.com/pytest-dev/pytest-xdist/issues/704>_: Add support for Python 3.10.

pytest-xdist 2.3.0 (2021-06-16)

Deprecations and Removals

  • [#654](https://github.com/pytest-dev/pytest-xdist/issues/654) <https://github.com/pytest-dev/pytest-xdist/issues/654>_: Python 3.5 is no longer supported.

Features

  • [#646](https://github.com/pytest-dev/pytest-xdist/issues/646) <https://github.com/pytest-dev/pytest-xdist/issues/646>_: Add --numprocesses=logical flag, which automatically uses the number of logical CPUs available, instead of physical CPUs with auto.

... (truncated)

Commits
  • 13f3934 Remove unnecessary skip from test_logfinish_hook as we require pytest>=6.2
  • c76d562 Skip test_warning_captured_deprecated_in_pytest_6 in pytest>=7.1
  • 5f78c71 Fix CHANGELOG header
  • c8bbc03 Release 2.5.0
  • 8dbf367 Merge pull request #738 from pytest-dev/pre-commit-ci-update-config
  • a25c14b [pre-commit.ci] pre-commit autoupdate
  • 110c114 Merge pull request #734 from nicoddemus/revamp-readme
  • 83bdbf4 Revamp README
  • 630c1eb Merge pull request #733 from baekdohyeop/feature-loadgroup
  • 62e50d0 Address review
  • Additional commits viewable in compare view

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`. [//]: # (dependabot-automerge-start) [//]: # (dependabot-automerge-end) ---
Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR: - `@dependabot rebase` will rebase this PR - `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it - `@dependabot merge` will merge this PR after your CI passes on it - `@dependabot squash and merge` will squash and merge this PR after your CI passes on it - `@dependabot cancel merge` will cancel a previously requested merge and block automerging - `@dependabot reopen` will reopen this PR if it is closed - `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually - `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1548/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1077102934,I_kwDOCGYnMM5AM0lW,353,"Allow passing a file of code to ""sqlite-utils convert""",536941,fgregg,closed,0,,,,,8,2021-12-10T18:06:14Z,2021-12-11T01:38:29Z,2021-12-11T01:09:39Z,CONTRIBUTOR,,"sqlite-utils is so nice, but the ergonomics of the multiline code in kind of tough. It's really hard (maybe impossible) to make the newlines play well with Makefiles. it would be great to write your code fragment in a separate file and direct it into the sqlite-utils either like ```sqlite-utils convert my.db my_table my_column < custom_code.py``` or ```sqlite-utils convert my.db my_table my_column --custom-code=custom_code.py``` Thanks, as ever, for these great tools!",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/353/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1077243232,I_kwDOCGYnMM5ANW1g,354,Test failure in test_rebuild_fts,9599,simonw,closed,0,,,,,7,2021-12-10T21:27:55Z,2021-12-11T01:08:46Z,2021-12-11T01:08:46Z,OWNER,,"Not sure why this has only just started failing, but I'm getting this: https://github.com/simonw/sqlite-utils/runs/4488687639 ``` E sqlite3.DatabaseError: database disk image is malformed sqlite_utils/db.py:425: DatabaseError _______________________ test_rebuild_fts[searchable_fts] _______________________ fresh_db = > table_to_fix = 'searchable_fts' @pytest.mark.parametrize(""table_to_fix"", [""searchable"", ""searchable_fts""]) def test_rebuild_fts(fresh_db, table_to_fix): table = fresh_db[""searchable""] table.insert(search_records[0]) table.enable_fts([""text"", ""country""]) # Run a search rows = list(table.search(""tanuki"")) assert len(rows) == 1 assert { ""rowid"": 1, ""text"": ""tanuki are running tricksters"", ""country"": ""Japan"", ""not_searchable"": ""foo"", }.items() <= rows[0].items() # Delete from searchable_fts_data fresh_db[""searchable_fts_data""].delete_where() # This should have broken the index with pytest.raises(sqlite3.DatabaseError): list(table.search(""tanuki"")) # Running rebuild_fts() should fix it > fresh_db[table_to_fix].rebuild_fts() ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/354/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1077322009,I_kwDOCGYnMM5ANqEZ,355,Allow users to pass a full convert() function definition,9599,simonw,closed,0,,,,,4,2021-12-10T23:59:58Z,2021-12-11T00:51:15Z,2021-12-11T00:49:31Z,OWNER,,"> I think the fix for this is to change the rules about what code is accepted in both the `-` mode and the literal code string mode: you can pass in a Python expression, OR a fragment that gets turned into a function, OR code that implements its own `def convert(value)` function. 
So this would work too: > ```sh > sqlite-utils convert my.db mytable col1 ' > def convert(value): > return value.upper() > ' > ``` _Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/353#issuecomment-991381679_",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/355/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1072780607,I_kwDOCGYnMM4_8VU_,351,Support `--import xml.etree.ElementTree` in `sqlite-utils convert`,9599,simonw,closed,0,,,,,1,2021-12-07T00:40:29Z,2021-12-11T00:11:25Z,2021-12-11T00:11:25Z,OWNER,,"It's not possible to use a module that requires a nested import, such as `xml.etree.ElementTree`, at the moment. I found and fixed this bug in `git-history`, I should replicate that fix (and accompanying documentation) here: https://github.com/simonw/git-history/issues/39",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/351/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1073712378,I_kwDOBm6k_c4__4z6,1544,Code that detects the label column for a table is case-sensitive,9599,simonw,closed,0,,,,,2,2021-12-07T20:01:25Z,2021-12-07T20:03:43Z,2021-12-07T20:03:43Z,OWNER,,I just noticed that a column called `Name` is not being picked up as the label column for a table.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1544/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1058790545,I_kwDOBm6k_c4_G9yR,1519,base_url is omitted in JSON and CSV views,157158,phubbard,closed,0,,,,,22,2021-11-19T18:10:45Z,2021-12-01T17:50:09Z,2021-11-20T19:11:21Z,NONE,,"I have a datasette deployment, using Apache2 to reverse proxy: ProxyPass /ged http://thor.phfactor.net:8001 ProxyPreserveHost On In settings.json I have ```json { ""base_url"": ""/ged/"", ""trace_debug"": 1, ""template_debug"": 1 } ``` and datasette works correctly. However, if you view a query and then click on the 'This data as json, CSV' both links omit the base_url prefix and are therefore 404.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1519/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1059509927,I_kwDOBm6k_c4_Jtan,1525,"""Links from other tables"" broken for columns starting with underscore",9599,simonw,closed,0,,,,,3,2021-11-21T22:55:08Z,2021-11-30T06:39:01Z,2021-11-30T06:34:35Z,OWNER,,"Same bug as #1506, this time it's this link or the row page: ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1525/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1062414013,PR_kwDOBm6k_c4u9wQq,1529,"Update janus requirement from <0.7,>=0.6.2 to >=0.6.2,<0.8",49699333,dependabot[bot],closed,0,,,,,1,2021-11-24T13:12:42Z,2021-11-30T02:37:13Z,2021-11-30T02:37:13Z,CONTRIBUTOR,simonw/datasette/pulls/1529,"Updates the requirements on [janus](https://github.com/aio-libs/janus) to permit the latest version.
Changelog

Sourced from janus's changelog.

0.7.0 (2021-11-24)

  • Add SyncQueue and AsyncQueue Protocols to provide type hints for sync and async queues #374

0.6.2 (2021-10-24)

  • Fix Python 3.10 compatibility #358

0.6.1 (2020-10-26)

  • Raise RuntimeError on queue.join() after queue closing. #295

  • Replace timeout type from Optional[int] to Optional[float] #267

0.6.0 (2020-10-10)

  • Drop Python 3.5, the minimal supported version is Python 3.6

  • Support Python 3.9

  • Refomat with black

0.5.0 (2020-04-23)

  • Remove explicit loop arguments and forbid creating queues outside event loops #246

0.4.0 (2018-07-28)

  • Add py.typed macro #89

  • Drop python 3.4 support and fix minimal version python3.5.3 #88

  • Add property with that indicates if queue is closed #86

0.3.2 (2018-07-06)

  • Fixed python 3.7 support #97

0.3.1 (2018-01-30)

  • Fixed bug with join() in case tasks are added by sync_q.put() #75

... (truncated)

Commits

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`. [//]: # (dependabot-automerge-start) [//]: # (dependabot-automerge-end) ---
Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR: - `@dependabot rebase` will rebase this PR - `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it - `@dependabot merge` will merge this PR after your CI passes on it - `@dependabot squash and merge` will squash and merge this PR after your CI passes on it - `@dependabot cancel merge` will cancel a previously requested merge and block automerging - `@dependabot reopen` will reopen this PR if it is closed - `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually - `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1529/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1057340779,PR_kwDOBm6k_c4utsKs,1516,Bump black from 21.9b0 to 21.11b1,49699333,dependabot[bot],closed,0,,,,,1,2021-11-18T13:11:12Z,2021-11-30T02:35:29Z,2021-11-30T02:35:29Z,CONTRIBUTOR,simonw/datasette/pulls/1516,"Bumps [black](https://github.com/psf/black) from 21.9b0 to 21.11b1.
Release notes

Sourced from black's releases.

21.11b1

Black

  • Bumped regex version minimum to 2021.4.4 to fix Pattern class usage (#2621)

21.11b0

Black

  • Warn about Python 2 deprecation in more cases by improving Python 2 only syntax detection (#2592)
  • Add experimental PyPy support (#2559)
  • Add partial support for the match statement. As it's experimental, it's only enabled when --target-version py310 is explicitly specified (#2586)
  • Add support for parenthesized with (#2586)
  • Declare support for Python 3.10 for running Black (#2562)

Integrations

  • Fixed vim plugin with Python 3.10 by removing deprecated distutils import (#2610)
  • The vim plugin now parses skip_magic_trailing_comma from pyproject.toml (#2613)

21.10b0

Black

  • Document stability policy, that will apply for non-beta releases (#2529)
  • Add new --workers parameter (#2514)
  • Fixed feature detection for positional-only arguments in lambdas (#2532)
  • Bumped typed-ast version minimum to 1.4.3 for 3.10 compatiblity (#2519)
  • Fixed a Python 3.10 compatibility issue where the loop argument was still being passed even though it has been removed (#2580)
  • Deprecate Python 2 formatting support (#2523)

Blackd

  • Remove dependency on aiohttp-cors (#2500)
  • Bump required aiohttp version to 3.7.4 (#2509)

Black-Primer

  • Add primer support for --projects (#2555)
  • Print primer summary after individual failures (#2570)

Integrations

  • Allow to pass target_version in the vim plugin (#1319)
  • Install build tools in docker file and use multi-stage build to keep the image size down (#2582)
Changelog

Sourced from black's changelog.

21.11b1

Black

  • Bumped regex version minimum to 2021.4.4 to fix Pattern class usage (#2621)

21.11b0

Black

  • Warn about Python 2 deprecation in more cases by improving Python 2 only syntax detection (#2592)
  • Add experimental PyPy support (#2559)
  • Add partial support for the match statement. As it's experimental, it's only enabled when --target-version py310 is explicitly specified (#2586)
  • Add support for parenthesized with (#2586)
  • Declare support for Python 3.10 for running Black (#2562)

Integrations

  • Fixed vim plugin with Python 3.10 by removing deprecated distutils import (#2610)
  • The vim plugin now parses skip_magic_trailing_comma from pyproject.toml (#2613)

21.10b0

Black

  • Document stability policy, that will apply for non-beta releases (#2529)
  • Add new --workers parameter (#2514)
  • Fixed feature detection for positional-only arguments in lambdas (#2532)
  • Bumped typed-ast version minimum to 1.4.3 for 3.10 compatibility (#2519)
  • Fixed a Python 3.10 compatibility issue where the loop argument was still being passed even though it has been removed (#2580)
  • Deprecate Python 2 formatting support (#2523)

Blackd

  • Remove dependency on aiohttp-cors (#2500)
  • Bump required aiohttp version to 3.7.4 (#2509)

Black-Primer

  • Add primer support for --projects (#2555)
  • Print primer summary after individual failures (#2570)

Integrations

  • Allow to pass target_version in the vim plugin (#1319)
  • Install build tools in docker file and use multi-stage build to keep the image size down (#2582)
Commits

[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=black&package-manager=pip&previous-version=21.9b0&new-version=21.11b1)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores) Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`. [//]: # (dependabot-automerge-start) [//]: # (dependabot-automerge-end) ---
Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR: - `@dependabot rebase` will rebase this PR - `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it - `@dependabot merge` will merge this PR after your CI passes on it - `@dependabot squash and merge` will squash and merge this PR after your CI passes on it - `@dependabot cancel merge` will cancel a previously requested merge and block automerging - `@dependabot reopen` will reopen this PR if it is closed - `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually - `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1516/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1053655062,PR_kwDOBm6k_c4uiE0n,1508,Update docutils requirement from <0.18 to <0.19,49699333,dependabot[bot],closed,0,,,,,1,2021-11-15T13:15:47Z,2021-11-30T02:35:19Z,2021-11-30T02:35:19Z,CONTRIBUTOR,simonw/datasette/pulls/1508,"Updates the requirements on [docutils](http://docutils.sourceforge.net/) to permit the latest version. Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`. [//]: # (dependabot-automerge-start) [//]: # (dependabot-automerge-end) ---
Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR: - `@dependabot rebase` will rebase this PR - `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it - `@dependabot merge` will merge this PR after your CI passes on it - `@dependabot squash and merge` will squash and merge this PR after your CI passes on it - `@dependabot cancel merge` will cancel a previously requested merge and block automerging - `@dependabot reopen` will reopen this PR if it is closed - `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually - `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1508/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1066023866,PR_kwDOBm6k_c4vIJqi,1537,"Update aiofiles requirement from <0.8,>=0.4 to >=0.4,<0.9",49699333,dependabot[bot],closed,0,,,,,1,2021-11-29T13:13:52Z,2021-11-30T02:29:55Z,2021-11-30T02:29:54Z,CONTRIBUTOR,simonw/datasette/pulls/1537,"Updates the requirements on [aiofiles](https://github.com/Tinche/aiofiles) to permit the latest version.
Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`. [//]: # (dependabot-automerge-start) [//]: # (dependabot-automerge-end) ---
Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR: - `@dependabot rebase` will rebase this PR - `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it - `@dependabot merge` will merge this PR after your CI passes on it - `@dependabot squash and merge` will squash and merge this PR after your CI passes on it - `@dependabot cancel merge` will cancel a previously requested merge and block automerging - `@dependabot reopen` will reopen this PR if it is closed - `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually - `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1537/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1066501534,I_kwDOCGYnMM4_kYWe,345,`table.strict` introspection boolean for identifying STRICT mode tables,9599,simonw,closed,0,,,,,2,2021-11-29T21:05:10Z,2021-11-29T22:45:26Z,2021-11-29T22:44:36Z,OWNER,,"> From the STRICT docs: >> The SQLite parser accepts a comma-separated list of table options after the final close parenthesis in a CREATE TABLE statement. As of this writing (2021-08-23) only two options are recognized: >> >> - STRICT >> - [WITHOUT ROWID](https://www.sqlite.org/withoutrowid.html) > > So I think I need to read the `CREATE TABLE` statement from the `sqlite_master` table, split on the last `)`, split those tokens on `,` and see if `create` is in there (case insensitive). _Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/344#issuecomment-982020757_",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/345/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1066288689,I_kwDOBm6k_c4_jkYx,1538,Research pattern for re-registering existing Click tools with register_commands,9599,simonw,closed,0,,,,,3,2021-11-29T17:09:47Z,2021-11-29T17:32:44Z,2021-11-29T17:27:16Z,OWNER,,"Building a Datasette plugin that imports an existing Click CLI tool and re-registers it is proving hard - Click doesn't really want you to do that. I tried this: ```python from datasette import hookimpl from git_history.cli import file as git_history_file @hookimpl def register_commands(cli): cli.command(name=""git-history"")(git_history_file.callback) ``` But when I run this: ``` % datasette git-history --help Usage: datasette git-history [OPTIONS] Analyze the history of a specific file and write it to SQLite Options: --help Show this message and exit. ``` The options are all missing - which means that the command doesn't actually work. Will need to research this pattern separately. _Originally posted by @simonw in https://github.com/simonw/git-history/issues/21#issuecomment-981835305_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1538/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1039037439,PR_kwDOCGYnMM4t0uaI,333,Add functionality to read Parquet files.,2118708,Florents-Tselai,closed,0,,,,,3,2021-10-28T23:43:19Z,2021-11-25T19:47:35Z,2021-11-25T19:47:35Z,NONE,simonw/sqlite-utils/pulls/333,"I needed this for a project of mine, and I thought it'd be useful to have it in sqlite-utils (It's also mentioned in #248 ). The current implementation works (data is read & data types are inferred correctly. 
I've added a single straightforward test case, but @simonw please let me know if there are any non-obvious flags/combinations I should test too.",140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/333/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1058896236,I_kwDOBm6k_c4_HXls,1522,Deploy a live instance of demos/apache-proxy,9599,simonw,closed,0,,,,,34,2021-11-19T20:32:55Z,2021-11-23T03:00:34Z,2021-11-20T18:51:56Z,OWNER,,"> I'll get this working on my laptop first, but then I want to get it up and running on Cloud Run - maybe with a GitHub Actions workflow in this repo that re-deploys it on manual execution. _Originally posted by @simonw in https://github.com/simonw/datasette/issues/1521#issuecomment-974322178_ I started by following https://ahmet.im/blog/cloud-run-multiple-processes-easy-way/ - see example in https://github.com/ahmetb/multi-process-container-lazy-solution",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1522/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1059549523,I_kwDOBm6k_c4_J3FT,1526,"Add to vercel.json, rather than overwriting it.",192568,mroswell,closed,0,,,,,2,2021-11-22T00:47:12Z,2021-11-22T04:49:45Z,2021-11-22T04:13:47Z,CONTRIBUTOR,,"I'd like to be able to add to vercel.json. But Datasette overwrites whatever I put in that file. I originally reported this here: https://github.com/simonw/datasette-publish-vercel/issues/51 In that case, I wanted to do a rewrite... and now I need to do 301 redirects (because we had to rename our site). Can this be addressed? ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1526/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 273944952,MDU6SXNzdWUyNzM5NDQ5NTI=,93,Package as standalone binary,67420,atomotic,closed,0,,,,,18,2017-11-14T21:14:07Z,2021-11-21T07:00:23Z,2021-11-21T07:00:23Z,NONE,,"hint: more than the docker image a standalone and multiplatform binary (containing the app and the database) could be simpler to distribute. 
i would like to investigate the possibility to package everything with [pyinstaller](http://www.pyinstaller.org/) adding the database as a [data file](https://pythonhosted.org/PyInstaller/spec-files.html#adding-data-files)",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/93/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1059219106,I_kwDOBm6k_c4_Imai,1524,"Improve Apache proxy documentation, link to demo",9599,simonw,closed,0,,,,,4,2021-11-20T20:03:14Z,2021-11-20T23:34:03Z,2021-11-20T23:34:03Z,OWNER,,"> The latest demo is now live at https://datasette-apache-proxy-demo.fly.dev/prefix/fixtures/sortable?_facet=pk2 _Originally posted by @simonw in https://github.com/simonw/datasette/issues/1519#issuecomment-974697824_ I'm going to put out 0.59.3 bugfix release with this, but I'd like to first improve the documentation on https://docs.datasette.io/en/stable/deploying.html#apache-proxy-configuration to highlight the new demo.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1524/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 637395097,MDU6SXNzdWU2MzczOTUwOTc=,838,Incorrect URLs when served behind a proxy with base_url set,79913,tsibley,closed,0,,,6026070,0.51,14,2020-06-11T23:58:55Z,2021-11-20T19:35:48Z,2021-11-20T19:35:48Z,NONE,,"I'm running `datasette serve --config base_url:/foo/ …`, proxying to it with this Apache config: ProxyPass /foo/ http://localhost:8001/ ProxyPassReverse /foo/ http://localhost:8001/ and then accessing it via `https://example.com/foo/`. Although many of the URLs in the pages are correct (presumably because they either use absolute paths which include `base_url` or relative paths), the faceting and pagination links still use fully-qualified URLs pointing at `http://localhost:8001`. I looked into this a little in the source code, and it seems to be an issue anywhere `request.url` or `request.path` is used, as these contain the values for the request between the frontend (Apache) and backend (Datasette) server. Those properties are primarily used via the `path_with_…` family of utility functions and the `Datasette.absolute_url` method.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/838/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1058815557,I_kwDOBm6k_c4_HD5F,1521,Docker configuration for exercising Datasette behind Apache mod_proxy,9599,simonw,closed,0,,,,,10,2021-11-19T18:46:18Z,2021-11-19T20:32:29Z,2021-11-19T20:32:29Z,OWNER,,"> Having a live demo running on Cloud Run that proxies through Apache and uses `base_url` would be incredibly useful for replicating and debugging this kind of thing. I wonder how hard it is to run Apache and `mod_proxy` in the same Docker container as Datasette? 
_Originally posted by @simonw in https://github.com/simonw/datasette/issues/1519#issuecomment-974310208_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1521/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1058196641,I_kwDOCGYnMM4_Esyh,342,Extra options to `lookup()` which get passed to `insert()`,9599,simonw,closed,0,,,,,7,2021-11-19T06:53:03Z,2021-11-19T07:26:54Z,2021-11-19T07:26:54Z,OWNER,,"For https://github.com/simonw/git-history/issues/12 I found myself wanting to pass extra options to `lookup()` to set the column order, primary key etc.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/342/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1057996111,I_kwDOBm6k_c4_D71P,1517,Let `register_routes()` over-ride default routes within Datasette,9599,simonw,closed,0,,,3268330,Datasette 1.0,2,2021-11-19T00:22:15Z,2021-11-19T03:20:00Z,2021-11-19T03:07:27Z,OWNER,,"See https://github.com/simonw/datasette/issues/878#issuecomment-973554024_ - right now `register_routes()` can't replace default Datasette routes. It would be neat if plugins could do this - especially if there was a neat documented way for them to then re-dispatch to the original route code after making some kind of modification.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1517/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1056117435,PR_kwDOBm6k_c4up0R0,1514,Bump black from 21.9b0 to 21.11b0,49699333,dependabot[bot],closed,0,,,,,2,2021-11-17T13:13:55Z,2021-11-18T13:11:17Z,2021-11-18T13:11:15Z,CONTRIBUTOR,simonw/datasette/pulls/1514,"Bumps [black](https://github.com/psf/black) from 21.9b0 to 21.11b0.
Release notes

Sourced from black's releases.

21.11b0

Black

  • Warn about Python 2 deprecation in more cases by improving Python 2 only syntax detection (#2592)
  • Add experimental PyPy support (#2559)
  • Add partial support for the match statement. As it's experimental, it's only enabled when --target-version py310 is explicitly specified (#2586)
  • Add support for parenthesized with (#2586)
  • Declare support for Python 3.10 for running Black (#2562)

Integrations

  • Fixed vim plugin with Python 3.10 by removing deprecated distutils import (#2610)
  • The vim plugin now parses skip_magic_trailing_comma from pyproject.toml (#2613)

21.10b0

Black

  • Document stability policy, that will apply for non-beta releases (#2529)
  • Add new --workers parameter (#2514)
  • Fixed feature detection for positional-only arguments in lambdas (#2532)
  • Bumped typed-ast version minimum to 1.4.3 for 3.10 compatibility (#2519)
  • Fixed a Python 3.10 compatibility issue where the loop argument was still being passed even though it has been removed (#2580)
  • Deprecate Python 2 formatting support (#2523)

Blackd

  • Remove dependency on aiohttp-cors (#2500)
  • Bump required aiohttp version to 3.7.4 (#2509)

Black-Primer

  • Add primer support for --projects (#2555)
  • Print primer summary after individual failures (#2570)

Integrations

  • Allow to pass target_version in the vim plugin (#1319)
  • Install build tools in docker file and use multi-stage build to keep the image size down (#2582)
Changelog

Sourced from black's changelog.

21.11b0

Black

  • Warn about Python 2 deprecation in more cases by improving Python 2 only syntax detection (#2592)
  • Add experimental PyPy support (#2559)
  • Add partial support for the match statement. As it's experimental, it's only enabled when --target-version py310 is explicitly specified (#2586)
  • Add support for parenthesized with (#2586)
  • Declare support for Python 3.10 for running Black (#2562)

Integrations

  • Fixed vim plugin with Python 3.10 by removing deprecated distutils import (#2610)
  • The vim plugin now parses skip_magic_trailing_comma from pyproject.toml (#2613)

21.10b0

Black

  • Document stability policy, that will apply for non-beta releases (#2529)
  • Add new --workers parameter (#2514)
  • Fixed feature detection for positional-only arguments in lambdas (#2532)
  • Bumped typed-ast version minimum to 1.4.3 for 3.10 compatibility (#2519)
  • Fixed a Python 3.10 compatibility issue where the loop argument was still being passed even though it has been removed (#2580)
  • Deprecate Python 2 formatting support (#2523)

Blackd

  • Remove dependency on aiohttp-cors (#2500)
  • Bump required aiohttp version to 3.7.4 (#2509)

Black-Primer

  • Add primer support for --projects (#2555)
  • Print primer summary after individual failures (#2570)

Integrations

  • Allow to pass target_version in the vim plugin (#1319)
  • Install build tools in docker file and use multi-stage build to keep the image size down (#2582)
[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=black&package-manager=pip&previous-version=21.9b0&new-version=21.11b0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores) Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`. [//]: # (dependabot-automerge-start) [//]: # (dependabot-automerge-end) ---
Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR: - `@dependabot rebase` will rebase this PR - `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it - `@dependabot merge` will merge this PR after your CI passes on it - `@dependabot squash and merge` will squash and merge this PR after your CI passes on it - `@dependabot cancel merge` will cancel a previously requested merge and block automerging - `@dependabot reopen` will reopen this PR if it is closed - `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually - `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1514/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1041158024,PR_kwDOBm6k_c4t7RKr,1500,Bump black from 21.9b0 to 21.10b0,49699333,dependabot[bot],closed,0,,,,,2,2021-11-01T13:11:23Z,2021-11-17T13:14:00Z,2021-11-17T13:13:58Z,CONTRIBUTOR,simonw/datasette/pulls/1500,"Bumps [black](https://github.com/psf/black) from 21.9b0 to 21.10b0.
Release notes

Sourced from black's releases.

21.10b0

Black

  • Document stability policy, that will apply for non-beta releases (#2529)
  • Add new --workers parameter (#2514)
  • Fixed feature detection for positional-only arguments in lambdas (#2532)
  • Bumped typed-ast version minimum to 1.4.3 for 3.10 compatibility (#2519)
  • Fixed a Python 3.10 compatibility issue where the loop argument was still being passed even though it has been removed (#2580)
  • Deprecate Python 2 formatting support (#2523)

Blackd

  • Remove dependency on aiohttp-cors (#2500)
  • Bump required aiohttp version to 3.7.4 (#2509)

Black-Primer

  • Add primer support for --projects (#2555)
  • Print primer summary after individual failures (#2570)

Integrations

  • Allow to pass target_version in the vim plugin (#1319)
  • Install build tools in docker file and use multi-stage build to keep the image size down (#2582)
Changelog

Sourced from black's changelog.

21.10b0

Black

  • Document stability policy, that will apply for non-beta releases (#2529)
  • Add new --workers parameter (#2514)
  • Fixed feature detection for positional-only arguments in lambdas (#2532)
  • Bumped typed-ast version minimum to 1.4.3 for 3.10 compatibility (#2519)
  • Fixed a Python 3.10 compatibility issue where the loop argument was still being passed even though it has been removed (#2580)
  • Deprecate Python 2 formatting support (#2523)

Blackd

  • Remove dependency on aiohttp-cors (#2500)
  • Bump required aiohttp version to 3.7.4 (#2509)

Black-Primer

  • Add primer support for --projects (#2555)
  • Print primer summary after individual failures (#2570)

Integrations

  • Allow to pass target_version in the vim plugin (#1319)
  • Install build tools in docker file and use multi-stage build to keep the image size down (#2582)
[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=black&package-manager=pip&previous-version=21.9b0&new-version=21.10b0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores) Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`. [//]: # (dependabot-automerge-start) [//]: # (dependabot-automerge-end) ---
Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR: - `@dependabot rebase` will rebase this PR - `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it - `@dependabot merge` will merge this PR after your CI passes on it - `@dependabot squash and merge` will squash and merge this PR after your CI passes on it - `@dependabot cancel merge` will cancel a previously requested merge and block automerging - `@dependabot reopen` will reopen this PR if it is closed - `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually - `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1500/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1055402144,PR_kwDOBm6k_c4unfnq,1512,New pattern for async view classes,9599,simonw,closed,0,,,,,7,2021-11-16T21:55:44Z,2021-11-17T01:39:54Z,2021-11-17T01:39:44Z,OWNER,simonw/datasette/pulls/1512,Refs #878 - starting out with the new `AsyncBase` class implementing a pytest-inspired `asyncio` parallel execution mechanism.,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1512/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1, 1055469073,I_kwDOBm6k_c4-6S4R,1513,Research: CTEs and union all to calculate facets AND query at the same time,9599,simonw,closed,0,,,,,12,2021-11-16T22:26:45Z,2021-11-16T23:41:46Z,2021-11-16T23:41:46Z,OWNER,,"Consider this page: https://global-power-plants.datasettes.com/global-power-plants/global-power-plants?_search=plant&_facet=owner&_facet=country_long&_facet=primary_fuel Datasette needs to run the main query for the rows on that page, a count query for the total query, then a separate query for each of those three specified facets. This is a `_search=` query, so it needs to execute the FTS code once for the rows, again for the count, and then three more times for each of the facets. Could running that query as a CTE and doing the other queries as part of the same large query produce significant speed improvements?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1513/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 440222719,MDU6SXNzdWU0NDAyMjI3MTk=,448,_facet_array should work against views,9599,simonw,closed,0,,,3268330,Datasette 1.0,12,2019-05-03T21:08:04Z,2021-11-16T01:32:05Z,2021-11-16T01:19:40Z,OWNER,,"I created this view: https://json-view-facet-bug-demo-j7hipcg4aq-uc.a.run.app/russian-ads-8dbda00/ads_with_targets ``` CREATE VIEW ads_with_targets as select ads.*, json_group_array(targets.name) as target_names from ads join ad_targets on ad_targets.ad_id = ads.id join targets on ad_targets.target_id = targets.id group by ad_targets.ad_id ``` When I try to apply faceting by array it appears to work at first: https://json-view-facet-bug-demo-j7hipcg4aq-uc.a.run.app/russian-ads/ads_with_targets?_facet_array=target_names But actually it's doing the wrong thing - the SQL for the facets uses rowid, but rowid is not present on views at all! 
These results are incorrect, and clicking to select a facet will fail to produce any rows: https://json-view-facet-bug-demo-j7hipcg4aq-uc.a.run.app/russian-ads/ads_with_targets?_facet_array=target_names&target_names__arraycontains=people_who_match%3Ainterests%3AAfrican-American+Civil+Rights+Movement+%281954%E2%80%9468%29 Here's the SQL it should be using when you select a facet (note that it does not use a rowid): https://json-view-facet-bug-demo-j7hipcg4aq-uc.a.run.app/russian-ads?sql=select+*+from+ads_with_targets+where+id+in+%28%0D%0A++++++++++++select+ads_with_targets.id+from+ads_with_targets%2C+json_each%28ads_with_targets.target_names%29+j%0D%0A++++++++++++where+j.value+%3D+%3Ap0%0D%0A++++++++%29+limit+101&p0=people_who_match%3Ainterests%3ABlack+%28Color%29 So we need to do something a lot smarter here. I'm not sure what the fix will look like, or even if it's feasible given that views don't have a rowid to hook into so the JSON faceting SQL may have to be completely rewritten. ``` datasette publish cloudrun \ russian-ads.db \ --name json-view-facet-bug-demo \ --branch master \ --extra-options ""--config sql_time_limit_ms:5000 --config facet_time_limit_ms:5000"" ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/448/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 459590021,MDU6SXNzdWU0NTk1OTAwMjE=,519,Decide what goes into Datasette 1.0,9599,simonw,closed,0,,,3268330,Datasette 1.0,4,2019-06-23T15:47:41Z,2021-11-15T23:26:11Z,2021-11-15T23:26:11Z,OWNER,,Datasette ASGI #272 is a big part of it... but 1.0 will generally be an indicator that Datasette is a stable platform for developers to write plugins and custom templates against. So lots to think about.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/519/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1025726600,PR_kwDOCGYnMM4tKxHD,330,Test against Python 3.10,9599,simonw,closed,0,,,,,1,2021-10-13T21:50:22Z,2021-11-15T02:59:29Z,2021-10-13T22:25:05Z,OWNER,simonw/sqlite-utils/pulls/330,,140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/330/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1005891028,I_kwDOCGYnMM479K3U,329,Rethink approach to [ and ] in column names (currently throws error),9599,simonw,closed,0,,,,,12,2021-09-23T22:14:24Z,2021-11-15T02:57:51Z,2021-11-15T02:57:51Z,OWNER,,"> I think it's best to still keep `[` and `]` out of column names though. Transforming them into `(` and `)` seems reasonable - but should that happen here or in `sqlite-utils`? I think in `sqlite-utils`. 
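As a concrete illustration of the transformation being discussed, here is a minimal sketch (the helper name and the exact replacement rules are assumptions, not shipped sqlite-utils behaviour):

```python
def fix_square_braces(column_name):
    # Replace [ and ] with ( and ) so the resulting name can be quoted
    # safely in generated SQL - a sketch of the idea, not the final rules
    return column_name.replace('[', '(').replace(']', ')')


record = {'size [kg]': 3.2, 'name': 'widget'}
cleaned = {fix_square_braces(key): value for key, value in record.items()}
# cleaned == {'size (kg)': 3.2, 'name': 'widget'}
```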
_Originally posted by @simonw in https://github.com/simonw/datasette-app/issues/121#issuecomment-926200398_ This is a rethinking of the solution to: - https://github.com/simonw/sqlite-utils/issues/86",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/329/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1053136495,I_kwDOCGYnMM4-xZZv,341,`hash_id: Optional[Any]` should be `hash_id: Optional[str]`,9599,simonw,closed,0,,,,,0,2021-11-15T02:12:39Z,2021-11-15T02:19:31Z,2021-11-15T02:19:31Z,OWNER,,"In a few places: https://github.com/simonw/sqlite-utils/blob/54a2269e91ce72b059618662ed133a85f3d42e4a/sqlite_utils/db.py#L642 https://github.com/simonw/sqlite-utils/blob/54a2269e91ce72b059618662ed133a85f3d42e4a/sqlite_utils/db.py#L751 https://github.com/simonw/sqlite-utils/blob/54a2269e91ce72b059618662ed133a85f3d42e4a/sqlite_utils/db.py#L1049 https://github.com/simonw/sqlite-utils/blob/54a2269e91ce72b059618662ed133a85f3d42e4a/sqlite_utils/db.py#L1230 But it's correct here: https://github.com/simonw/sqlite-utils/blob/54a2269e91ce72b059618662ed133a85f3d42e4a/sqlite_utils/db.py#L2470",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/341/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1053122092,I_kwDOCGYnMM4-xV4s,339,`table.lookup()` option to populate additional columns when creating a record,9599,simonw,closed,0,,,,,4,2021-11-15T01:41:17Z,2021-11-15T02:02:34Z,2021-11-15T02:02:00Z,OWNER,,"> For the commits table I feel like I want a version of `table.lookup()` that can be passed additional columns to populate only if the record does not exist yet. 
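One possible call shape, sketched purely as an illustration (the second argument and its only-on-insert behaviour are assumptions about how such an option might look, not an existing API):

```python
import sqlite_utils

db = sqlite_utils.Database('commits.db')

# Hypothetical: look up (or create) a commit by hash; the extra columns in the
# second argument would only be written if a new row has to be inserted
commit_id = db['commits'].lookup(
    {'hash': 'dd72fc5'},
    {'commit_at': '2021-11-15T01:41:17Z', 'author': 'simonw'},
)
```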
_Originally posted by @simonw in https://github.com/simonw/git-history/issues/12#issuecomment-967455017_",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/339/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1053087862,I_kwDOCGYnMM4-xNh2,338,"dict, list, tuple should all map to TEXT",9599,simonw,closed,0,,,,,0,2021-11-15T00:28:01Z,2021-11-15T00:36:03Z,2021-11-15T00:36:03Z,OWNER,,"> This relates to the fact that dictionaries, lists and tuples get special treatment and are converted to JSON strings, using this code: https://github.com/simonw/sqlite-utils/blob/e8d958109ee290cfa1b44ef7a39629bb50ab673e/sqlite_utils/db.py#L2937-L2947 > > So the `COLUMN_TYPE_MAPPING` should include those too - right now it looks like this: https://github.com/simonw/sqlite-utils/blob/e8d958109ee290cfa1b44ef7a39629bb50ab673e/sqlite_utils/db.py#L165-L188 _Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/322#issuecomment-968401459_",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/338/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 979612115,MDExOlB1bGxSZXF1ZXN0NzE5OTk4MjI1,322,Add dict type to be mapped as TEXT in sqllite,2496189,minaeid90,closed,0,,,,,1,2021-08-25T20:54:26Z,2021-11-15T00:27:40Z,2021-11-15T00:27:40Z,CONTRIBUTOR,simonw/sqlite-utils/pulls/322,"the library deal with Postgres type jsonb as dictionary, add dict type as a TEXT for mapping to sqlite ",140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/322/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 988013247,MDExOlB1bGxSZXF1ZXN0NzI3MDEyOTk2,324,Use python-dateutil package instead of dateutils,191622,meatcar,closed,0,,,,,1,2021-09-03T18:31:19Z,2021-11-14T23:25:40Z,2021-11-14T23:25:40Z,CONTRIBUTOR,simonw/sqlite-utils/pulls/324,"While working on updating `sqlite-utils` for NixOS/Nixpkgs, I came a cross the following: In 5ec6686153e29ae10d4921a1ad4c841f192f20e2, a new dependency was added on `dateutils` (https://pypi.org/project/dateutils/). I believe this is unintentional, and instead `python-dateutil` (https://pypi.org/project/python-dateutil/) was intended. 
My reasoning is: - `python-dateutil` is imported here in [recipes.py](https://github.com/simonw/sqlite-utils/blob/5ec6686153e29ae10d4921a1ad4c841f192f20e2/sqlite_utils/recipes.py#L1) - The `mypy` `type-python-dateutil` dependency in [setup.py](https://github.com/simonw/sqlite-utils/blob/5ec6686153e29ae10d4921a1ad4c841f192f20e2/setup.py#L36) - `python-dateutil` is a dependency of `dateutils` as seen in the output in [docs/tutorial.ipynb](https://github.com/simonw/sqlite-utils/blob/77c240df56068341561e95e4a412cbfa24dc5bc7/docs/tutorial.ipynb#L43) Seems like the trailing ""s"" seems to be the source of confusion 😅 I've swapped the dependencies out, hope this helps.",140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/324/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1026794056,I_kwDOCGYnMM49M6JI,331,Mypy error: found module but no type hints or library stubs,53032010,andreaslongo,closed,0,,,,,2,2021-10-14T20:29:50Z,2021-11-14T23:21:08Z,2021-11-14T23:21:08Z,NONE,,"``` Python 3.9.5 mypy 0.910 sqlite-utils 3.17.1 ``` While using sqlite-utils as a library, when I use mypy for static type checking, it throws an error: ``` mypy . src/etl.py:5: error: Skipping analyzing ""sqlite_utils"": found module but no type hints or library stubs import sqlite_utils ^ src/etl.py:5: note: See https://mypy.readthedocs.io/en/stable/running_mypy.html#missing-imports test/test_etl.py:4: error: Skipping analyzing ""sqlite_utils"": found module but no type hints or library stubs import sqlite_utils ^ Found 2 errors in 2 files (checked 7 source files) ``` When I add a `py.typed` file to the sqlite-utils package to mark it as PEP 561 compatible, the error goes away. ``` al@nbal ..b/python3.9/site-packages/sqlite_utils (git)-[main] % la total 200 drwx------ 3 al al 4096 Oct 14 22:00 . drwx------ 117 al al 4096 Oct 12 21:12 .. -rw------- 1 al al 64409 Oct 12 21:11 cli.py -rw------- 1 al al 109092 Oct 12 21:11 db.py -rw------- 1 al al 0 Oct 14 22:00 py.typed -rw------- 1 al al 684 Oct 12 21:11 recipes.py -rw------- 1 al al 7988 Oct 12 21:11 utils.py -rw------- 1 al al 113 Oct 12 21:11 __init__.py ``` I would like to suggest adding a `py.typed` file to the repository. 
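Shipping the marker is usually a small packaging change; a sketch assuming a setuptools-based `setup.py` (only the relevant arguments are shown):

```python
# setup.py (sketch) - add an empty sqlite_utils/py.typed file to the package
# and tell setuptools to include it, so type checkers see the annotations
from setuptools import setup

setup(
    name='sqlite-utils',
    packages=['sqlite_utils'],
    package_data={'sqlite_utils': ['py.typed']},
    # ... other arguments unchanged ...
)
```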
See also the mypy docs on creating PEP 561 compatible packages: https://mypy.readthedocs.io/en/stable/installed_packages.html#creating-pep-561-compatible-packages ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/331/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1028056713,I_kwDOCGYnMM49RuaJ,332,`sqlite-utils memory --flatten` option to flatten nested JSON,22523840,rdtq,closed,0,,,,,1,2021-10-16T14:04:42Z,2021-11-14T23:05:05Z,2021-11-14T23:05:05Z,NONE,,"currently --flatten option works only for `insert` command, it would be cool if it worked for `memory` as well to query nested json",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/332/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1042569687,I_kwDOCGYnMM4-JFnX,335,sqlite-utils index-foreign-keys fails due to pre-existing index,596279,zaneselvans,closed,0,,,,,11,2021-11-02T16:22:11Z,2021-11-14T22:55:56Z,2021-11-14T22:55:56Z,NONE,,"While running the command: ```sh sqlite-utils index-foreign-keys $SQLITE_DIR/pudl.sqlite ``` I got the following error: ``` Traceback (most recent call last): File ""/home/zane/miniconda3/envs/pudl-dev/bin/sqlite-utils"", line 8, in sys.exit(cli()) File ""/home/zane/miniconda3/envs/pudl-dev/lib/python3.9/site-packages/click/core.py"", line 829, in __call__ return self.main(*args, **kwargs) File ""/home/zane/miniconda3/envs/pudl-dev/lib/python3.9/site-packages/click/core.py"", line 782, in main rv = self.invoke(ctx) File ""/home/zane/miniconda3/envs/pudl-dev/lib/python3.9/site-packages/click/core.py"", line 1259, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/home/zane/miniconda3/envs/pudl-dev/lib/python3.9/site-packages/click/core.py"", line 1066, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/home/zane/miniconda3/envs/pudl-dev/lib/python3.9/site-packages/click/core.py"", line 610, in invoke return callback(*args, **kwargs) File ""/home/zane/miniconda3/envs/pudl-dev/lib/python3.9/site-packages/sqlite_utils/cli.py"", line 454, in index_foreign_keys db.index_foreign_keys() File ""/home/zane/miniconda3/envs/pudl-dev/lib/python3.9/site-packages/sqlite_utils/db.py"", line 902, in index_foreign_keys table.create_index([fk.column]) File ""/home/zane/miniconda3/envs/pudl-dev/lib/python3.9/site-packages/sqlite_utils/db.py"", line 1563, in create_index self.db.execute(sql) File ""/home/zane/miniconda3/envs/pudl-dev/lib/python3.9/site-packages/sqlite_utils/db.py"", line 421, in execute return self.conn.execute(sql) sqlite3.OperationalError: index idx_generators_eia860_report_date already exists ``` This DB was created with the foreign key constraint `PRAGMA` enabled and a bunch of column-level `CHECK` constraints. Is this an expected behavior? Should one not try to index foreign keys if FK constraints are already being enforced within the DB? I'm also noticing that the size of the DB after FK indexes have been added went from 483MB to 835MB, which seems like a much bigger jump than when I've done this previously. Software versions... 
* sqlite-utils 3.17.1 * sqlite 3.36.0 * SQLAlchemy 1.4.26 (used to create the DB)",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/335/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1052851176,I_kwDOBm6k_c4-wTvo,1507,ReadTheDocs build failed for 0.59.2 release,9599,simonw,closed,0,,,,,6,2021-11-14T05:24:34Z,2021-11-14T05:41:55Z,2021-11-14T05:41:55Z,OWNER,,"I had to cancel the 0.59.2 release because ReadTheDocs was failing to build the documentation. https://readthedocs.org/projects/datasette/builds/15268454/ ``` /home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/0.59.2/bin/python -m sphinx -T -b html -d _build/doctrees -D language=en . _build/html Running Sphinx v1.8.5 loading translations [en]... done making output directory... building [mo]: targets for 0 po files that are out of date building [html]: targets for 27 source files that are out of date updating environment: 27 added, 0 changed, 0 removed reading sources... [ 3%] authentication Traceback (most recent call last): File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/0.59.2/lib/python2.7/site-packages/sphinx/cmd/build.py"", line 304, in build_main app.build(args.force_all, filenames) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/0.59.2/lib/python2.7/site-packages/sphinx/application.py"", line 341, in build self.builder.build_update() File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/0.59.2/lib/python2.7/site-packages/sphinx/builders/__init__.py"", line 347, in build_update len(to_build)) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/0.59.2/lib/python2.7/site-packages/sphinx/builders/__init__.py"", line 360, in build updated_docnames = set(self.read()) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/0.59.2/lib/python2.7/site-packages/sphinx/builders/__init__.py"", line 468, in read self._read_serial(docnames) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/0.59.2/lib/python2.7/site-packages/sphinx/builders/__init__.py"", line 490, in _read_serial self.read_doc(docname) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/0.59.2/lib/python2.7/site-packages/sphinx/builders/__init__.py"", line 534, in read_doc doctree = read_doc(self.app, self.env, self.env.doc2path(docname)) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/0.59.2/lib/python2.7/site-packages/sphinx/io.py"", line 318, in read_doc pub.publish() File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/0.59.2/lib/python2.7/site-packages/docutils/core.py"", line 219, in publish self.apply_transforms() File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/0.59.2/lib/python2.7/site-packages/docutils/core.py"", line 200, in apply_transforms self.document.transformer.apply_transforms() File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/0.59.2/lib/python2.7/site-packages/sphinx/transforms/__init__.py"", line 90, in apply_transforms Transformer.apply_transforms(self) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/0.59.2/lib/python2.7/site-packages/docutils/transforms/__init__.py"", line 171, in apply_transforms transform.apply(**kwargs) File 
""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/0.59.2/lib/python2.7/site-packages/sphinx/transforms/__init__.py"", line 245, in apply apply_source_workaround(n) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/0.59.2/lib/python2.7/site-packages/sphinx/util/nodes.py"", line 94, in apply_source_workaround for classifier in reversed(node.parent.traverse(nodes.classifier)): TypeError: argument to reversed() must be a sequence Exception occurred: File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/0.59.2/lib/python2.7/site-packages/sphinx/util/nodes.py"", line 94, in apply_source_workaround for classifier in reversed(node.parent.traverse(nodes.classifier)): TypeError: argument to reversed() must be a sequence The full traceback has been saved in /tmp/sphinx-err-vkl0oE.log, if you want to report the issue to the developers. Please also report this if it was a user error, so that a better error message can be provided next time. A bug report can be filed in the tracker at . Thanks! ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1507/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1050163432,I_kwDOBm6k_c4-mDjo,1503,`?_nocol=` removes that column from the filter interface,9599,simonw,closed,0,,,,,1,2021-11-10T18:22:50Z,2021-11-14T05:08:27Z,2021-11-14T04:53:07Z,OWNER,,"e.g. on https://latest.datasette.io/fixtures/sortable?_nocol=sortable This causes weird behaviour when you e.g. facet by a hidden column, since selecting facets and then re-submitting the form will clear the selected filter. ![nocol-bug](https://user-images.githubusercontent.com/9599/141171135-aded71d1-a4cb-4b7f-a4ea-26828fa98906.gif) ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1503/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1052826038,I_kwDOBm6k_c4-wNm2,1506,Columns beginning with an underscore do not facet correctly,9599,simonw,closed,0,,,,,1,2021-11-14T02:20:32Z,2021-11-14T04:45:21Z,2021-11-14T04:45:21Z,OWNER,,"Datasette treats columns that start with an underscore as querystring parameters it should ignore! Discovered in https://github.com/simonw/git-history/issues/14#issuecomment-968192464",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1506/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1041778507,I_kwDOCGYnMM4-GEdL,334,Filter by datetime objects using rows_where(),11642379,viseshrp,closed,0,,,,,0,2021-11-02T00:44:08Z,2021-11-13T19:23:21Z,2021-11-13T19:23:21Z,NONE,,"Firstly, thanks for this nice utility. It would be nice to have an example in the docs on how to filter by date range using `rows_where()`. This doesn't seem to work: ``` table.rows_where('datetime(created) between datetime(""2021-10-31T17:29:59.277428-04:00"") AND datetime(""2021-11-01T03:44:04.544651+00:00"")') ``` I could probably just use `db.query()`, which works for the above, but it would be nice if I could pass in `datetime` objects in `rows_where()`. 
Thanks.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/334/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1046271107,PR_kwDOCGYnMM4uK5z2,337,Default values for `--attach` and `--param` options,771193,urbas,closed,0,,,,,1,2021-11-05T21:57:53Z,2021-11-05T22:33:03Z,2021-11-05T22:33:02Z,NONE,simonw/sqlite-utils/pulls/337,"It seems that `click` 8.x uses `None` as the default value for `multiple=True` options. This change makes the code forward-compatible with `click` 8.x. See this build failure for more info: https://hydra.nixos.org/build/156926608",140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/337/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 707478649,MDU6SXNzdWU3MDc0Nzg2NDk=,173,Progress bar for sqlite-utils insert,9599,simonw,closed,0,,,,,6,2020-09-23T15:43:56Z,2021-11-01T08:42:24Z,2020-10-27T18:16:04Z,OWNER,,"It would be nice if `sqlite-utils insert` had a progress bar, for when it's churning through huge CSV files.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/173/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 761915790,MDU6SXNzdWU3NjE5MTU3OTA=,206,sqlite-utils should suggest --csv if JSON parsing fails,9599,simonw,closed,0,,,,,4,2020-12-11T05:17:56Z,2021-10-30T15:52:17Z,2021-01-03T18:42:22Z,OWNER,,"``` ~ % gsutil cat gs://ossf-criticality-score/python_top_200.csv | sqlite-utils insert /tmp/crit.db crit - ... File ""/usr/local/Cellar/python@3.9/3.9.0_3/Frameworks/Python.framework/Versions/3.9/lib/python3.9/json/decoder.py"", line 337, in decode obj, end = self.raw_decode(s, idx=_w(s, 0).end()) File ""/usr/local/Cellar/python@3.9/3.9.0_3/Frameworks/Python.framework/Versions/3.9/lib/python3.9/json/decoder.py"", line 355, in raw_decode raise JSONDecodeError(""Expecting value"", s, err.value) from None json.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0) ``` A nicer error message here would be one that says the JSON is invalid but suggests that maybe you could try `--csv`.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/206/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1029100823,PR_kwDOBm6k_c4tU5cz,1494,"Update pytest-asyncio requirement from <0.16,>=0.10 to >=0.10,<0.17",49699333,dependabot[bot],closed,0,,,,,1,2021-10-18T13:14:17Z,2021-10-24T22:22:40Z,2021-10-24T22:22:39Z,CONTRIBUTOR,simonw/datasette/pulls/1494,"Updates the requirements on [pytest-asyncio](https://github.com/pytest-dev/pytest-asyncio) to permit the latest version.
Commits
  • f2fe98e 0.16.0
  • 4e1df31 Remove obsolete test, add make test
  • 6ec7647 feat: Add support for Python 3.10.
  • 42ff5d1 ci: Include Python 3.10 in the CI test run.
  • be3b327 build: Include Python 3.10 in Tox test runs.
  • 1c283bd refactor: test_async_fixtures_with_finalizer no longer trigger a DeprecationW...
  • 2751982 refactor: Replaced tests asserting that the event loop is properly closed.
  • 70989fd refactor: Grouped test cases together that are related to the use of the asyn...
  • b27abe8 refactor: Removed TestUnexistingLoop.remove_loop fixture, because it has no e...
  • e3ec312 Adjusted Hypothesis integration test to use the same event loop initializatio...
  • Additional commits viewable in compare view

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`. [//]: # (dependabot-automerge-start) [//]: # (dependabot-automerge-end) ---
Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR: - `@dependabot rebase` will rebase this PR - `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it - `@dependabot merge` will merge this PR after your CI passes on it - `@dependabot squash and merge` will squash and merge this PR after your CI passes on it - `@dependabot cancel merge` will cancel a previously requested merge and block automerging - `@dependabot reopen` will reopen this PR if it is closed - `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually - `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1494/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1021550542,I_kwDOBm6k_c4845_O,1482,Support Python 3.10,9599,simonw,closed,0,,,,,2,2021-10-09T00:30:52Z,2021-10-24T22:21:40Z,2021-10-24T22:19:55Z,OWNER,,"I started work on this in #1481 where I found a Python 3.10 bug that needs a workaround in Janus, see: - https://github.com/aio-libs/janus/issues/358 This is a tracking issue for anything else that shows up. This is also needed for the Homebrew package to upgrade to 3.10: - https://github.com/Homebrew/homebrew-core/pull/86932",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1482/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1020436713,PR_kwDOBm6k_c4s6bJm,1481,Fix compatibility with Python 3.10,9599,simonw,closed,0,,,,,13,2021-10-07T20:34:23Z,2021-10-24T22:19:55Z,2021-10-24T22:19:54Z,OWNER,simonw/datasette/pulls/1481,,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1481/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1033864602,I_kwDOBm6k_c49n4Wa,1496,Named parameters docs should include an example of a cast,9599,simonw,closed,0,,,,,1,2021-10-22T18:56:04Z,2021-10-22T19:38:23Z,2021-10-22T19:34:27Z,OWNER,,"https://docs.datasette.io/en/stable/sql_queries.html#named-parameters It's not obvious that the values from parameters are always SQLite strings, which means that you can't do e.g. integer comparisons on them without casting them first. The documentation here should include an example of this.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1496/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 944903881,MDU6SXNzdWU5NDQ5MDM4ODE=,1396,"""invalid reference format"" publishing Docker image",9599,simonw,closed,0,,,,,9,2021-07-15T01:02:07Z,2021-10-19T08:10:26Z,2021-07-15T19:47:25Z,OWNER,,"Error ocurred at the end of the publish flow for Datasette 0.58: https://github.com/simonw/datasette/runs/3072216421 ``` Removing intermediate container cf32b9440907 ---> dfd6985b2afc Successfully built dfd6985b2afc Successfully tagged ***/datasette:0.58 invalid reference format Error: Process completed with exit code 1. ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1396/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1028115674,I_kwDOBm6k_c49R8za,1493,`--get '/:memory:.json?sql=select+3*5'` error with datasette 0.59,1580956,chenrui333,closed,0,,,,,1,2021-10-16T18:22:22Z,2021-10-19T04:39:11Z,2021-10-19T04:39:11Z,NONE,,"👋 trying to upgrade the formula to use the latest release, but runs into some regression test issue with `--get` command. My QQ is does this `datasette --get '/:memory:.json?sql=select+3*5'` supposed to return 15? Thanks! 
relates to https://github.com/Homebrew/homebrew-core/pull/87369",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1493/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 995098231,MDU6SXNzdWU5OTUwOTgyMzE=,1470,?_sort=rowid with _next= returns error,19851673,eigenfoo,closed,0,,,,,4,2021-09-13T16:36:15Z,2021-10-18T19:30:15Z,2021-10-10T01:15:03Z,NONE,,"For example: - Go to https://cryptics.eigenfoo.xyz/clues/clues?_next=100 (this is the second page of results in a Datasette site) - Search anything using the FTS search bar. For example, searching for `hello` will take you to https://cryptics.eigenfoo.xyz/clues/clues?_search=hello&_sort=rowid&_next=100 - A `500 Error: list index out of range` is raised. This is because the search URL includes the `&_next=100` UTM parameter, carried over from where the FTS search was run. However, there isn't a second page in the search results, so a `list index out of range` error is raised. You can confirm that removing this UTM parameter from the URL returns the appropriate search results. The FTS search request should strip any `_next` UTM parameter. --- ```bash datasette, version 0.58.1 sqlite-utils, version 3.17 ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1470/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 991575770,MDExOlB1bGxSZXF1ZXN0NzMwMDIwODY3,1467,Add Authorization header when CORS flag is set,3058200,jameslittle230,closed,0,,,,,3,2021-09-08T22:14:41Z,2021-10-17T02:29:07Z,2021-10-14T18:54:18Z,NONE,simonw/datasette/pulls/1467,"This PR adds the [`Access-Control-Allow-Headers`](https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/Access-Control-Allow-Headers) flag when CORS mode is enabled. This would fix https://github.com/simonw/datasette-auth-tokens/issues/4. When making cross-origin requests, the server must respond with all allowable HTTP headers. A Datasette instance using auth tokens must accept the `Authorization` HTTP header in order for cross-origin authenticated requests to take place. Please let me know if there's a better way of doing this! I couldn't figure out a way to change the app's response from the plugin itself, so I'm starting here. 
If you'd rather this logic live in the plugin, I'd love any guidance you're able to give.",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1467/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 964400482,MDU6SXNzdWU5NjQ0MDA0ODI=,310,`sqlite-utils insert --flatten` option to flatten nested JSON,9599,simonw,closed,0,,,,,3,2021-08-09T21:23:08Z,2021-10-16T13:54:56Z,2021-08-09T21:44:06Z,OWNER,,"I had to do this with a `jq` recipe today: https://til.simonwillison.net/cloudrun/tailing-cloud-run-request-logs ``` cat log.json | jq -c '[leaf_paths as $path | { ""key"": $path | join(""_""), ""value"": getpath($path) }] | from_entries' \ | sqlite-utils insert /tmp/logs.db logs - --nl --alter --batch-size 1 ``` That was to turn something like this: ```json { ""httpRequest"": { ""latency"": ""0.112114537s"", ""requestMethod"": ""GET"", ""requestSize"": ""534"", ""status"": 200, }, ""insertId"": ""6111722f000b5b4c4d4071e2"", ""labels"": { ""service"": ""datasette-io"" } } ``` Into this instead: ```json { ""httpRequest_latency"": ""0.112114537s"", ""httpRequest_requestMethod"": ""GET"", ""httpRequest_requestSize"": ""534"", ""httpRequest_status"": 200, ""insertId"": ""6111722f000b5b4c4d4071e2"", ""labels_service"": ""datasette-io"" } ``` I have to do this often enough that I think it should be an option, `--flatten` - so I can do this instead: ``` cat log.json | sqlite-utils insert /tmp/logs.db logs - --flatten ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/310/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 988555009,MDExOlB1bGxSZXF1ZXN0NzI3NDM2NDA0,1458,Rework the `--static` documentation a bit,51016,ctb,closed,0,,,,,1,2021-09-05T17:08:48Z,2021-10-15T13:24:29Z,2021-10-14T18:39:55Z,CONTRIBUTOR,simonw/datasette/pulls/1458,"Per https://github.com/simonw/datasette/issues/1457, I was confused by the current docs and took a few minutes to sort out what the right combination of locations was. This is a PR that differentiates the docs to split out `/static/` in URL from `--static` option and `./static/` path. Not wedded to the details in any way, happy to change to suit. Fixes #1457.",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1458/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 988553806,MDU6SXNzdWU5ODg1NTM4MDY=,1457,suggestion: distinguish names in `--static` documentation,51016,ctb,closed,0,,,,,0,2021-09-05T17:04:27Z,2021-10-14T18:39:55Z,2021-10-14T18:39:55Z,CONTRIBUTOR,,"Over in https://docs.datasette.io/en/stable/custom_templates.html#serving-static-files, there is the slightly comical example command - ``` datasette -m metadata.json --static static:static/ --memory ``` (now, with MORE STATIC!) It took me a while to sort out all the URLs and paths involved because I wasn't being very clever. 
But in the interests of simplification and distinction, I might suggest something like ``` datasette -m metadata.json --static loc:static-files/ --memory ``` I will submit a PR for your consideration.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1457/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1026379132,PR_kwDOBm6k_c4tM0JV,1489,"Update pyyaml requirement from ~=5.3 to >=5.3,<7.0",49699333,dependabot[bot],closed,0,,,,,3,2021-10-14T13:09:33Z,2021-10-14T18:10:43Z,2021-10-14T18:10:42Z,CONTRIBUTOR,simonw/datasette/pulls/1489,"Updates the requirements on [pyyaml](https://github.com/yaml/pyyaml) to permit the latest version.
Changelog

Sourced from pyyaml's changelog.

6.0 (2021-10-13)

5.4.1 (2021-01-20)

  • yaml/pyyaml#480 -- Fix stub compat with older pyyaml versions that may unwittingly load it

5.4 (2021-01-19)

5.3.1 (2020-03-18)

  • yaml/pyyaml#386 -- Prevents arbitrary code execution during python/object/new constructor

5.3 (2020-01-06)

5.2 (2019-12-02)

... (truncated)

Commits
  • 8cdff2c 6.0 release
  • a4fb55e Update Python 3.10 versions for Windows build
  • e45b964 Add Python 3.10 to tox.ini
  • 4808fba 6.0b1 release
  • d5aba40 Omnibus CI/artifact build update
  • a6d384c Various setup fixes
  • 8f3f979 No longer using appveyor
  • c274365 The yaml.load{,_all} functions require Loader= now
  • 2f87ac4 Add a basic test file for yaml.load and yaml.dump
  • 7bd92df Makefile tweaks
  • Additional commits viewable in compare view

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`. [//]: # (dependabot-automerge-start) [//]: # (dependabot-automerge-end) ---
",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1489/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1025754125,I_kwDOBm6k_c49I8QN,1488,Upgrade to httpx 0.20.0 (request() got an unexpected keyword argument 'allow_redirects'),9599,simonw,closed,0,,,,,5,2021-10-13T22:37:22Z,2021-10-14T18:03:45Z,2021-10-14T18:03:45Z,OWNER,,This is caused by a change made to `httpx` in https://github.com/encode/httpx/releases/tag/0.20.0,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1488/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1026664511,PR_kwDOBm6k_c4tNtoe,1490,Upgrade to httpx 0.20,9599,simonw,closed,0,,,,,0,2021-10-14T17:51:05Z,2021-10-14T18:03:45Z,2021-10-14T18:03:44Z,OWNER,simonw/datasette/pulls/1490,Refs #1488 ,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1490/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 991121619,MDExOlB1bGxSZXF1ZXN0NzI5NjMyNjQz,1463,"Update beautifulsoup4 requirement from <4.10.0,>=4.8.1 to >=4.8.1,<4.11.0",49699333,dependabot[bot],closed,0,,,,,1,2021-09-08T13:09:38Z,2021-10-13T22:35:37Z,2021-10-13T22:35:36Z,CONTRIBUTOR,simonw/datasette/pulls/1463,"Updates the requirements on [beautifulsoup4](http://www.crummy.com/software/BeautifulSoup/bs4/) to permit the latest version. Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`. [//]: # (dependabot-automerge-start) [//]: # (dependabot-automerge-end) ---
",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1463/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 929748885,MDExOlB1bGxSZXF1ZXN0Njc3NTU0OTI5,293,Test against Python 3.10-dev,9599,simonw,closed,0,,,,,3,2021-06-25T01:40:39Z,2021-10-13T21:49:33Z,2021-10-13T21:49:33Z,OWNER,simonw/sqlite-utils/pulls/293,,140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/293/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 991237645,MDExOlB1bGxSZXF1ZXN0NzI5NzMxNDQx,326,Test against 3.10-dev,9599,simonw,closed,0,,,,,2,2021-09-08T15:01:15Z,2021-10-13T21:49:28Z,2021-10-13T21:49:28Z,OWNER,simonw/sqlite-utils/pulls/326,"This tests against the latest 3.10 RC, https://www.python.org/downloads/release/python-3100rc2/",140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/326/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 996002181,PR_kwDOBm6k_c4rutdp,1471,Bump black from 21.7b0 to 21.9b0,49699333,dependabot[bot],closed,0,,,,,1,2021-09-14T13:10:35Z,2021-10-13T21:47:42Z,2021-10-13T21:47:42Z,CONTRIBUTOR,simonw/datasette/pulls/1471,"Bumps [black](https://github.com/psf/black) from 21.7b0 to 21.9b0.
Release notes

Sourced from black's releases.

21.9b0

Packaging

  • Fix missing modules in self-contained binaries (#2466)
  • Fix missing toml extra used during installation (#2475)

21.8b0

Black

  • Add support for formatting Jupyter Notebook files (#2357)
  • Move from appdirs dependency to platformdirs (#2375)
  • Present a more user-friendly error if .gitignore is invalid (#2414)
  • The failsafe for accidentally added backslashes in f-string expressions has been hardened to handle more edge cases during quote normalization (#2437)
  • Avoid changing a function return type annotation's type to a tuple by adding a trailing comma (#2384)
  • Parsing support has been added for unparenthesized walruses in set literals, set comprehensions, and indices (#2447).
  • Pin setuptools-scm build-time dependency version (#2457)
  • Exclude typing-extensions version 3.10.0.1 due to it being broken on Python 3.10 (#2460)

Blackd

  • Replace sys.exit(-1) with raise ImportError as it plays more nicely with tools that scan installed packages (#2440)

Integrations

  • The provided pre-commit hooks no longer specify language_version to avoid overriding default_language_version (#2430)
[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=black&package-manager=pip&previous-version=21.7b0&new-version=21.9b0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores) Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`. [//]: # (dependabot-automerge-start) [//]: # (dependabot-automerge-end) ---
",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1471/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 994450961,MDU6SXNzdWU5OTQ0NTA5NjE=,1469,"Column cog shows ""facet by this"" when already default faceted",9599,simonw,closed,0,,,,,2,2021-09-13T04:51:26Z,2021-10-13T21:20:07Z,2021-10-13T21:20:07Z,OWNER,,"e.g. on https://covid-19.datasettes.com/covid/economist_excess_deaths But if you add `?_facet=country` to the URL that goes away: https://covid-19.datasettes.com/covid/economist_excess_deaths?_facet_size=5&_facet=country The logic that decides if the ""Facet by this"" item is shown does not take default `metadata.json` facets into account.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1469/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 980228553,MDExOlB1bGxSZXF1ZXN0NzIwNTA2MTM1,1448,"Update pluggy requirement from ~=0.13.0 to >=0.13,<1.1",49699333,dependabot[bot],closed,0,,,,,1,2021-08-26T13:09:52Z,2021-10-13T21:11:01Z,2021-10-13T21:11:00Z,CONTRIBUTOR,simonw/datasette/pulls/1448,"Updates the requirements on [pluggy](https://github.com/pytest-dev/pluggy) to permit the latest version.
Changelog

Sourced from pluggy's changelog.

pluggy 1.0.0 (2021-08-25)

Deprecations and Removals

  • [#116](https://github.com/pytest-dev/pluggy/issues/116) <https://github.com/pytest-dev/pluggy/issues/116>_: Remove deprecated implprefix support. Decorate hook implementations using an instance of HookimplMarker instead. The deprecation was announced in release 0.7.0.

  • [#120](https://github.com/pytest-dev/pluggy/issues/120) <https://github.com/pytest-dev/pluggy/issues/120>_: Remove the deprecated proc argument to call_historic. Use result_callback instead, which has the same behavior. The deprecation was announced in release 0.7.0.

  • [#265](https://github.com/pytest-dev/pluggy/issues/265) <https://github.com/pytest-dev/pluggy/issues/265>_: Remove the _Result.result property. Use _Result.get_result() instead. Note that unlike result, get_result() raises the exception if the hook raised. The deprecation was announced in release 0.6.0.

  • [#267](https://github.com/pytest-dev/pluggy/issues/267) <https://github.com/pytest-dev/pluggy/issues/267>_: Remove official support for Python 3.4.

  • [#272](https://github.com/pytest-dev/pluggy/issues/272) <https://github.com/pytest-dev/pluggy/issues/272>_: Dropped support for Python 2. Continue to use pluggy 0.13.x for Python 2 support.

  • [#308](https://github.com/pytest-dev/pluggy/issues/308) <https://github.com/pytest-dev/pluggy/issues/308>_: Remove official support for Python 3.5.

  • [#313](https://github.com/pytest-dev/pluggy/issues/313) <https://github.com/pytest-dev/pluggy/issues/313>_: The internal pluggy.callers, pluggy.manager and pluggy.hooks are now explicitly marked private by a _ prefix (e.g. pluggy._callers). Only API exported by the top-level pluggy module is considered public.

  • [#59](https://github.com/pytest-dev/pluggy/issues/59) <https://github.com/pytest-dev/pluggy/issues/59>_: Remove legacy __multicall__ recursive hook calling system. The deprecation was announced in release 0.5.0.

Features

  • [#282](https://github.com/pytest-dev/pluggy/issues/282) <https://github.com/pytest-dev/pluggy/issues/282>_: When registering a hookimpl which is declared as hookwrapper=True but whose function is not a generator function, a PluggyValidationError exception is now raised.

    Previously this problem would cause an error only later, when calling the hook.

    In the unlikely case that you have a hookwrapper that returns a generator

... (truncated)

Commits
  • 4259fdd Fix CHANGELOG title manually
  • 906abca Preparing release 1.0.0
  • 56eb23c Rename HOWTORELEASE to RELEASING to follow pytest
  • fc6395c Fix scripts/release.py to use main instead of master
  • e04816f Merge pull request #324 from RonnyPfannschmidt/benchmarks
  • 1424ab0 add micro benchmarks for hook calling playing with a the number of callers, w...
  • 5e51864 Merge pull request #323 from RonnyPfannschmidt/switch-to-main
  • 05c3bbd switch to main as primary branch
  • 6b344fb Merge pull request #319 from RonnyPfannschmidt/pre-commit-update
  • 71f2d6b introduce pyupgrade and update black
  • Additional commits viewable in compare view

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`. [//]: # (dependabot-automerge-start) [//]: # (dependabot-automerge-end) ---
",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1448/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1000779422,PR_kwDOBm6k_c4r9CTw,1474,Update full_text_search.rst,72577720,MichaelTiemannOSC,closed,0,,,,,0,2021-09-20T09:59:45Z,2021-10-13T21:10:23Z,2021-10-13T21:10:23Z,CONTRIBUTOR,simonw/datasette/pulls/1474,"Change ""above"" to ""below"" to correct correspondence of reference to example.",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1474/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1002459220,PR_kwDOBm6k_c4sCfBT,1476,"Update pytest-xdist requirement from <2.4,>=2.2.1 to >=2.2.1,<2.5",49699333,dependabot[bot],closed,0,,,,,1,2021-09-21T13:13:01Z,2021-10-13T21:10:03Z,2021-10-13T21:10:03Z,CONTRIBUTOR,simonw/datasette/pulls/1476,"Updates the requirements on [pytest-xdist](https://github.com/pytest-dev/pytest-xdist) to permit the latest version.
Changelog

Sourced from pytest-xdist's changelog.

pytest-xdist 2.4.0 (2021-09-20)

Features

  • [#696](https://github.com/pytest-dev/pytest-xdist/issues/696) <https://github.com/pytest-dev/pytest-xdist/issues/696>_: On Linux, the process title now changes to indicate the current worker state (running/idle).

    Depends on the setproctitle <https://pypi.org/project/setproctitle/>__ package, which can be installed with pip install pytest-xdist[setproctitle].

  • [#704](https://github.com/pytest-dev/pytest-xdist/issues/704) <https://github.com/pytest-dev/pytest-xdist/issues/704>_: Add support for Python 3.10.

pytest-xdist 2.3.0 (2021-06-16)

Deprecations and Removals

  • [#654](https://github.com/pytest-dev/pytest-xdist/issues/654) <https://github.com/pytest-dev/pytest-xdist/issues/654>_: Python 3.5 is no longer supported.

Features

  • [#646](https://github.com/pytest-dev/pytest-xdist/issues/646) <https://github.com/pytest-dev/pytest-xdist/issues/646>_: Add --numprocesses=logical flag, which automatically uses the number of logical CPUs available, instead of physical CPUs with auto.

    This is very useful for test suites which are not CPU-bound.

  • [#650](https://github.com/pytest-dev/pytest-xdist/issues/650) <https://github.com/pytest-dev/pytest-xdist/issues/650>_: Added new pytest_handlecrashitem hook to allow handling and rescheduling crashed items.

Bug Fixes

  • [#421](https://github.com/pytest-dev/pytest-xdist/issues/421) <https://github.com/pytest-dev/pytest-xdist/issues/421>_: Copy the parent process sys.path into local workers, to work around execnet's python -c adding the current directory to sys.path.

  • [#638](https://github.com/pytest-dev/pytest-xdist/issues/638) <https://github.com/pytest-dev/pytest-xdist/issues/638>_: Fix issue caused by changing the branch name of the pytest repository.

Trivial Changes

  • [#592](https://github.com/pytest-dev/pytest-xdist/issues/592) <https://github.com/pytest-dev/pytest-xdist/issues/592>_: Replace master with controller where ever possible.

  • [#643](https://github.com/pytest-dev/pytest-xdist/issues/643) <https://github.com/pytest-dev/pytest-xdist/issues/643>_: Use 'main' to refer to pytest default branch in tox env names.

pytest-xdist 2.2.1 (2021-02-09)

... (truncated)

Commits
  • 4b487ed Manually fix changelog title
  • ecf4d3b Release 2.4.0
  • 87d8979 Merge pull request #704 from hugovk/add-3.10
  • b4544c8 Merge pull request #706 from pytest-dev/pre-commit-ci-update-config
  • 66dc390 [pre-commit.ci] pre-commit autoupdate
  • e0ce1b7 Add news file to add support for Python 3.10
  • ed47f0e Add support for Python 3.10
  • 1c8178a Merge pull request #703 from pytest-dev/pre-commit-ci-update-config
  • 9807064 [pre-commit.ci] pre-commit autoupdate
  • 766e67c Use setproctitle if available to show state (#696)
  • Additional commits viewable in compare view

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`. [//]: # (dependabot-automerge-start) [//]: # (dependabot-automerge-end) ---
",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1476/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1022688960,PR_kwDOBm6k_c4tBIEE,1485,"Update pytest-timeout requirement from <1.5,>=1.4.2 to >=1.4.2,<2.1",49699333,dependabot[bot],closed,0,,,,,1,2021-10-11T13:10:51Z,2021-10-13T21:09:23Z,2021-10-13T21:09:23Z,CONTRIBUTOR,simonw/datasette/pulls/1485,"Updates the requirements on [pytest-timeout](https://github.com/pytest-dev/pytest-timeout) to permit the latest version.
Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`. [//]: # (dependabot-automerge-start) [//]: # (dependabot-automerge-end) ---
",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1485/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1023243105,I_kwDOBm6k_c48_XNh,1486,pipx installation instructions for plugins don't reference pipx inject,41546558,RhetTbull,closed,0,,,,,0,2021-10-12T00:43:42Z,2021-10-13T21:09:11Z,2021-10-13T21:09:11Z,CONTRIBUTOR,,"The datasette [installation instructions](https://github.com/simonw/datasette/blob/main/docs/installation.rst) discuss how to install with pipx, how to upgrade with pipx, and how to upgrade plugins with pipx but do not mention how to install a plugin with pipx. You discussed this on your [blog](https://til.simonwillison.net/python/installing-upgrading-plugins-with-pipx) but looks like this didn't make it in when you updated the docs for pipx (#756). I'll submit a PR shortly to fix this.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1486/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1023245060,PR_kwDOBm6k_c4tC4Lx,1487,"Added instructions for installing plugins via pipx, #1486",41546558,RhetTbull,closed,0,,,,,1,2021-10-12T00:48:30Z,2021-10-13T21:09:11Z,2021-10-13T21:09:10Z,CONTRIBUTOR,simonw/datasette/pulls/1487,Adds missing instructions for installing plugins via pipx,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1487/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1021849766,I_kwDOBm6k_c486DCm,1483,Running a search on page 2 of results should not preserve ?_next=,9599,simonw,closed,0,,,,,0,2021-10-10T01:18:12Z,2021-10-13T21:08:10Z,2021-10-13T21:08:10Z,OWNER,,Reported by @eigenfoo in https://github.com/simonw/datasette/issues/1470,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1483/reactions"", ""total_count"": 2, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 1, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1022294524,PR_kwDOBm6k_c4s_4Cw,1484,GitHub Actions: Add Python 3.10 to the tests,3709715,cclauss,closed,0,,,,,0,2021-10-11T06:03:03Z,2021-10-11T06:03:31Z,2021-10-11T06:03:28Z,FIRST_TIME_CONTRIBUTOR,simonw/datasette/pulls/1484,,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1484/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 597671518,MDU6SXNzdWU1OTc2NzE1MTg=,98,"Only set .last_rowid and .last_pk for single update/inserts, not for .insert_all()/.upsert_all() with multiple records",9599,simonw,closed,0,,,,,7,2020-04-10T03:19:40Z,2021-09-28T04:38:44Z,2020-04-13T03:29:15Z,OWNER,,,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/98/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 990844088,MDU6SXNzdWU5OTA4NDQwODg=,325,sqlite-utils memory can't deal with multiple files with the same name,144773,karlb,closed,0,,,,,4,2021-09-08T08:14:42Z,2021-09-22T20:52:56Z,2021-09-22T20:45:45Z,NONE,,"When I use multiple files with the same name, e.g. 
in `sqlite-utils memory a/bug.csv b/bug.csv`, sqlite-utils creates invalid views. ``` Traceback (most recent call last): File ""/home/karl/.local/bin/sqlite-utils"", line 8, in sys.exit(cli()) File ""/home/karl/.local/pipx/venvs/sqlite-utils/lib/python3.9/site-packages/click/core.py"", line 1137, in __call__ return self.main(*args, **kwargs) File ""/home/karl/.local/pipx/venvs/sqlite-utils/lib/python3.9/site-packages/click/core.py"", line 1062, in main rv = self.invoke(ctx) File ""/home/karl/.local/pipx/venvs/sqlite-utils/lib/python3.9/site-packages/click/core.py"", line 1668, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/home/karl/.local/pipx/venvs/sqlite-utils/lib/python3.9/site-packages/click/core.py"", line 1404, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/home/karl/.local/pipx/venvs/sqlite-utils/lib/python3.9/site-packages/click/core.py"", line 763, in invoke return __callback(*args, **kwargs) File ""/home/karl/.local/pipx/venvs/sqlite-utils/lib/python3.9/site-packages/sqlite_utils/cli.py"", line 1299, in memory db[csv_table].transform(types=tracker.types) File ""/home/karl/.local/pipx/venvs/sqlite-utils/lib/python3.9/site-packages/sqlite_utils/db.py"", line 1287, in transform self.db.execute(sql) File ""/home/karl/.local/pipx/venvs/sqlite-utils/lib/python3.9/site-packages/sqlite_utils/db.py"", line 421, in execute return self.conn.execute(sql) sqlite3.OperationalError: error in view t1: no such table: main.bug ``` This can be reproduced with ```sh #!/bin/bash mkdir foo mkdir bar echo -e 'col1,col2\nval1,val2' > foo/bug.csv echo -e 'col3,col4\nval3,val4' > bar/bug.csv sqlite-utils memory */bug.csv 'SELECT 1' ``` Ideally, the tables would get unique names by including the next path segment until the names are unique. But just making the numbered t* aliases work would be good enough. This problem can of course be worked around by renaming the files, but it would be nice if this case was handled more gracefully. Thanks a lot for this great tool!",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/325/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1004613267,I_kwDOCGYnMM474S6T,328,Invalid JSON output when no rows,12752,gravis,closed,0,,,,,3,2021-09-22T18:37:26Z,2021-09-22T20:21:34Z,2021-09-22T20:20:18Z,NONE,,"`sqlite-utils query` generates a JSON output with the result from the query: ```json [{...},{...}] ``` If no rows are returned by the query, I'm expecting an empty JSON array: ```json [] ``` But actually I'm getting an empty string. 
To be consistent, the output should be `[]` when the request succeeds (return code == `0`).",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/328/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 984939366,MDU6SXNzdWU5ODQ5MzkzNjY=,58,"Error: Use either --since or --since_id, not both - still broken",42904,rubenv,closed,0,,,,,1,2021-09-01T09:45:28Z,2021-09-21T17:37:41Z,2021-09-21T17:37:41Z,CONTRIBUTOR,,"Hi Simon, It appears the fix for #57 doesn't fix things for me: ``` $ twitter-to-sqlite --version twitter-to-sqlite, version 0.21.4 $ python --version Python 3.9.6 ``` ``` $ twitter-to-sqlite home-timeline -a twitter-auth.json twitter/timeline.db --since Importing tweets Error: Use either --since or --since_id, not both ``` Is there any way I can help debug this?",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/58/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 984942782,MDExOlB1bGxSZXF1ZXN0NzI0MzE3NjUw,59,"Fix for since_id bug, closes #58",42904,rubenv,closed,0,,,,,1,2021-09-01T09:49:09Z,2021-09-21T17:37:40Z,2021-09-21T17:37:40Z,CONTRIBUTOR,dogsheep/twitter-to-sqlite/pulls/59,Fixes remaining instances of this bug,206156866,twitter-to-sqlite,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/59/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 982780906,MDExOlB1bGxSZXF1ZXN0NzIyNDgwNTQy,1453,Bump black from 21.7b0 to 21.8b0,49699333,dependabot[bot],closed,0,,,,,2,2021-08-30T13:13:39Z,2021-09-14T13:10:40Z,2021-09-14T13:10:38Z,CONTRIBUTOR,simonw/datasette/pulls/1453,"Bumps [black](https://github.com/psf/black) from 21.7b0 to 21.8b0.
Release notes

Sourced from black's releases.

21.8b0

Black

  • Add support for formatting Jupyter Notebook files (#2357)
  • Move from appdirs dependency to platformdirs (#2375)
  • Present a more user-friendly error if .gitignore is invalid (#2414)
  • The failsafe for accidentally added backslashes in f-string expressions has been hardened to handle more edge cases during quote normalization (#2437)
  • Avoid changing a function return type annotation's type to a tuple by adding a trailing comma (#2384)
  • Parsing support has been added for unparenthesized walruses in set literals, set comprehensions, and indices (#2447).
  • Pin setuptools-scm build-time dependency version (#2457)
  • Exclude typing-extensions version 3.10.0.1 due to it being broken on Python 3.10 (#2460)

Blackd

  • Replace sys.exit(-1) with raise ImportError as it plays more nicely with tools that scan installed packages (#2440)

Integrations

  • The provided pre-commit hooks no longer specify language_version to avoid overriding default_language_version (#2430)
[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=black&package-manager=pip&previous-version=21.7b0&new-version=21.8b0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores) Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`. [//]: # (dependabot-automerge-start) [//]: # (dependabot-automerge-end) ---
",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1453/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 930855052,MDExOlB1bGxSZXF1ZXN0Njc4NDU5NTU0,1385,Fix + improve get_metadata plugin hook docs,2670795,brandonrobertz,closed,0,,,,,1,2021-06-27T05:43:20Z,2021-09-13T18:53:11Z,2021-09-13T18:53:11Z,CONTRIBUTOR,simonw/datasette/pulls/1385,"This fixes documentation inaccuracies and adds a disclaimer about the signature of the `get_metadata` hook. Addresses the following comments: - https://github.com/simonw/datasette/issues/1384#issuecomment-869069926 - https://github.com/simonw/datasette/issues/1384#issuecomment-869075368",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1385/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 994390593,MDU6SXNzdWU5OTQzOTA1OTM=,1468,Faceting for custom SQL queries,72577720,MichaelTiemannOSC,closed,0,,,,,2,2021-09-13T02:52:16Z,2021-09-13T04:54:22Z,2021-09-13T04:54:17Z,CONTRIBUTOR,,"Facets are awesome. But not when I need to join to tidy tables together. Or even just running explicitly the default SQL query that simply lists all the rows and columns of a table (up to SIZE). That is to say, when I browse a table, I see facets: https://latest.datasette.io/fixtures/compound_three_primary_keys But when I run a custom query, I don't: https://latest.datasette.io/fixtures?sql=select+pk1%2C+pk2%2C+pk3%2C+content+from+compound_three_primary_keys+order+by+pk1%2C+pk2%2C+pk3+limit+101 Is there an idiom to cause custom SQL to come back with facet suggestions?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1468/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 989986586,MDU6SXNzdWU5ODk5ODY1ODY=,1461,Try blacken-docs,9599,simonw,closed,0,,,,,3,2021-09-07T13:28:50Z,2021-09-07T16:13:59Z,2021-09-07T16:13:59Z,OWNER,,https://github.com/asottile/blacken-docs,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1461/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 988325628,MDExOlB1bGxSZXF1ZXN0NzI3MjY1MDI1,1455,Add scientists to target groups,198537,rgieseke,closed,0,,,,,3,2021-09-04T16:28:58Z,2021-09-04T16:32:21Z,2021-09-04T16:31:38Z,CONTRIBUTOR,simonw/datasette/pulls/1455,"Not sure if you want them mentioned explicitly (it's already a long list), but following up on https://twitter.com/simonw/status/1434176989565382656",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1455/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 978357984,MDU6SXNzdWU5NzgzNTc5ODQ=,1446,Modify base.html template to support optional sticky footer,9599,simonw,closed,0,,,,,3,2021-08-24T18:11:12Z,2021-08-31T01:54:59Z,2021-08-24T20:32:47Z,OWNER,,"The neatest way to have the footer stick to the bottom of the browser window that I've found is to use the flexbox pattern from https://css-tricks.com/couple-takes-sticky-footer/ ```html
<body>
  <div class="content">
    content
  </div>
  <footer class="footer"></footer>
</body>
``` ```css html, body { height: 100%; } body { display: flex; flex-direction: column; } .content { flex: 1 0 auto; } .footer { flex-shrink: 0; } ``` I tried this in a custom plugin but it ended up having to duplicate the entire `base.html` template just to get a wrapper around the not-footer content. I think Datasette's own `base.html` template should have this wrapper element instead.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1446/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 981676832,MDU6SXNzdWU5ODE2NzY4MzI=,1449,`register_commands()` plugin hook to register extra CLI commands,9599,simonw,closed,0,,,,,13,2021-08-28T00:26:21Z,2021-08-28T01:58:35Z,2021-08-28T01:43:11Z,OWNER,,"The `datasette` CLI tool currently has 7 subcommands: `serve`, `inspect`, `install`, `package`, `plugins`, `publish` and `uninstall`. A plugin hook could allow plugins to register extra subcommands. I've avoided this for quite a while because I didn't have good use-cases for it - but the existence of the `datasette install xxx` command for installing packages into the correct virtual environment means that actually there's a good reason to do this: it would allow plugins to provide additional command-line mechanisms without the user having to understand how virtual environments work in order to install those commands into the same environment as the rest of Datasette.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1449/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 981681138,MDU6SXNzdWU5ODE2ODExMzg=,1450,"Datasette --help should show something more useful than ""Datasette!""",9599,simonw,closed,0,,,,,1,2021-08-28T00:44:51Z,2021-08-28T00:49:07Z,2021-08-28T00:49:07Z,OWNER,,"https://github.com/simonw/datasette/blob/a1a33bb5822214be1cebd98cd858b2058d91a4aa/datasette/cli.py#L122-L127 _Originally spotted by @simonw in https://github.com/simonw/datasette/issues/1449#issuecomment-907539668_ ``` ~ % datasette --help Usage: datasette [OPTIONS] COMMAND [ARGS]... Datasette! Options: --version Show the version and exit. --help Show this message and exit. 
```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1450/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 978743426,MDU6SXNzdWU5Nzg3NDM0MjY=,13,xml.etree.ElementTree.ParseError: not well-formed (invalid token),9599,simonw,closed,0,,,,,4,2021-08-25T05:48:21Z,2021-08-26T18:45:13Z,2021-08-26T18:45:13Z,MEMBER,,"Got this error today: ``` (evernote-to-sqlite) /tmp % evernote-to-sqlite enex evernote.db simonwillison\'s\ notebook.enex Importing from ENEX [######------------------------------] 17% Traceback (most recent call last): File ""/Users/simon/.local/bin/evernote-to-sqlite"", line 8, in sys.exit(cli()) File ""/Users/simon/.local/pipx/venvs/evernote-to-sqlite/lib/python3.9/site-packages/click/core.py"", line 1137, in __call__ return self.main(*args, **kwargs) File ""/Users/simon/.local/pipx/venvs/evernote-to-sqlite/lib/python3.9/site-packages/click/core.py"", line 1062, in main rv = self.invoke(ctx) File ""/Users/simon/.local/pipx/venvs/evernote-to-sqlite/lib/python3.9/site-packages/click/core.py"", line 1668, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/Users/simon/.local/pipx/venvs/evernote-to-sqlite/lib/python3.9/site-packages/click/core.py"", line 1404, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/Users/simon/.local/pipx/venvs/evernote-to-sqlite/lib/python3.9/site-packages/click/core.py"", line 763, in invoke return __callback(*args, **kwargs) File ""/Users/simon/.local/pipx/venvs/evernote-to-sqlite/lib/python3.9/site-packages/evernote_to_sqlite/cli.py"", line 31, in enex save_note(db, note) File ""/Users/simon/.local/pipx/venvs/evernote-to-sqlite/lib/python3.9/site-packages/evernote_to_sqlite/utils.py"", line 36, in save_note content = ET.tostring(ET.fromstring(content_xml)).decode(""utf-8"") File ""/usr/local/Cellar/python@3.9/3.9.6/Frameworks/Python.framework/Versions/3.9/lib/python3.9/xml/etree/ElementTree.py"", line 1347, in XML parser.feed(text) xml.etree.ElementTree.ParseError: not well-formed (invalid token): line 2, column 132 ```",303218369,evernote-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/evernote-to-sqlite/issues/13/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 979627285,MDU6SXNzdWU5Nzk2MjcyODU=,323,`table.convert()` method should clean up after itself,9599,simonw,closed,0,,,,,1,2021-08-25T21:15:39Z,2021-08-25T21:25:26Z,2021-08-25T21:25:18Z,OWNER,,"It currently works like this: https://github.com/simonw/sqlite-utils/blob/77c240df56068341561e95e4a412cbfa24dc5bc7/sqlite_utils/db.py#L2177-L2195 It's registering a function called `convert_value()` and then failing to de-register that function once it has finished. It might even be possible for two queries running against the same connection to clobber each other's `convert_value()` functions, leading to incorrect behaviour. So two fixes: firstly it should register the function with a unique name (maybe add a random suffix). 
Secondly, it should de-register that function once it has finished.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/323/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 947640902,MDExOlB1bGxSZXF1ZXN0NjkyNTk2MDA2,1400,Bump black from 21.6b0 to 21.7b0,49699333,dependabot[bot],closed,0,,,,,1,2021-07-19T13:13:41Z,2021-08-25T01:29:56Z,2021-08-25T01:29:55Z,CONTRIBUTOR,simonw/datasette/pulls/1400,"Bumps [black](https://github.com/psf/black) from 21.6b0 to 21.7b0.
Release notes

Sourced from black's releases.

21.7b0

Black

  • Configuration files using TOML features higher than spec v0.5.0 are now supported (#2301)
  • Add primer support and test for code piped into black via STDIN (#2315)
  • Fix internal error when FORCE_OPTIONAL_PARENTHESES feature is enabled (#2332)
  • Accept empty stdin (#2346)
  • Provide a more useful error when parsing fails during AST safety checks (#2304)

Docker

  • Add new latest_release tag automation to follow latest black release on docker images (#2374)

Integrations

  • The vim plugin now searches upwards from the directory containing the current buffer instead of the current working directory for pyproject.toml. (#1871)
  • The vim plugin now reads the correct string normalization option in pyproject.toml (#1869)
  • The vim plugin no longer crashes Black when there's boolean values in pyproject.toml (#1869)
[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=black&package-manager=pip&previous-version=21.6b0&new-version=21.7b0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores) Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`. [//]: # (dependabot-automerge-start) [//]: # (dependabot-automerge-end) ---
",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1400/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 970386262,MDExOlB1bGxSZXF1ZXN0NzEyMzQ2MTk5,1433,"Update trustme requirement from <0.9,>=0.7 to >=0.7,<0.10",49699333,dependabot[bot],closed,0,,,,,1,2021-08-13T13:10:24Z,2021-08-25T01:29:27Z,2021-08-25T01:29:26Z,CONTRIBUTOR,simonw/datasette/pulls/1433,"Updates the requirements on [trustme](https://github.com/python-trio/trustme) to permit the latest version.
Commits
  • 8fc5bf9 Bump version to 0.9.0
  • 913e21d Bump types-cryptography from 3.3.3 to 3.3.5 (#342)
  • c66709d Bump types-pyopenssl from 20.0.4 to 20.0.5 (#343)
  • 5131f79 Add type annotations (#341)
  • a411dad Bump charset-normalizer from 2.0.3 to 2.0.4 (#340)
  • be5ec8a Bump sphinx from 4.1.1 to 4.1.2
  • d3b8865 Bump charset-normalizer from 2.0.2 to 2.0.3
  • 4503bef Merge pull request #334 from python-trio/dependabot/pip/charset-normalizer-2.0.2
  • ce8099d Merge pull request #335 from python-trio/dependabot/pip/sphinx-4.1.1
  • 8b6d3c6 Merge pull request #336 from python-trio/dependabot/pip/idna-3.2
  • Additional commits viewable in compare view

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`. [//]: # (dependabot-automerge-start) [//]: # (dependabot-automerge-end) ---
",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1433/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 978614898,MDExOlB1bGxSZXF1ZXN0NzE5MTc1NTkz,1447,Remove underscore from search mode parameter name,127565,wragge,closed,0,,,,,1,2021-08-25T01:28:04Z,2021-08-25T01:28:58Z,2021-08-25T01:28:58Z,CONTRIBUTOR,simonw/datasette/pulls/1447,The fulltext search documentation refers to the parameter as `searchmode` but the `metadata.json` example uses `search_mode`. The latter doesn't actually seem to work.,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1447/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 976399638,MDU6SXNzdWU5NzYzOTk2Mzg=,319,[Enhancement] Please allow 'insert-files' to insert content as text.,66709385,pjamargh,closed,0,,,,,10,2021-08-22T15:10:46Z,2021-08-24T23:33:45Z,2021-08-24T23:33:44Z,NONE,,"'insert-files' creates BLOB columns for file contents. Transforming the column to TEXT still keep the content as binary. Even though I'm sure there is a transform that can be applied decoding the text it would be great to have a argument to make 'insert-files' to do it as text (with optional text encoding). The use case is a bunch of htmls (single file) on a directory structure that inserted with this command could be served in Datasette allowing full text search.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/319/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 978537855,MDExOlB1bGxSZXF1ZXN0NzE5MTA5NzA5,321,"Ability to insert file contents as text, in addition to blob",9599,simonw,closed,0,,,,,5,2021-08-24T22:37:18Z,2021-08-24T23:31:17Z,2021-08-24T23:31:13Z,OWNER,simonw/sqlite-utils/pulls/321,Refs #319.,140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/321/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 976405225,MDU6SXNzdWU5NzY0MDUyMjU=,320,sqlite-utils memory --analyze option,9599,simonw,closed,0,,,,,2,2021-08-22T15:37:10Z,2021-08-22T15:46:56Z,2021-08-22T15:44:29Z,OWNER,,To provide a way of running [analyze-tables](https://sqlite-utils.datasette.io/en/stable/cli.html#analyzing-tables) directly against JSON or CSV data.,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/320/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 975158266,MDU6SXNzdWU5NzUxNTgyNjY=,19,table activity_summary has no column named appleMoveTime,9599,simonw,closed,0,,,,,0,2021-08-20T00:46:44Z,2021-08-20T00:54:34Z,2021-08-20T00:54:34Z,MEMBER,,"Got this error today against a fresh export: table activity_summary has no column named appleMoveTime ",197882382,healthkit-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/19/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 779211940,MDExOlB1bGxSZXF1ZXN0NTQ5MjA0MDYz,55,Fix archive 
imports,21148,jacobian,closed,0,,,,,2,2021-01-05T15:54:48Z,2021-08-20T00:02:49Z,2021-08-20T00:02:49Z,CONTRIBUTOR,dogsheep/twitter-to-sqlite/pulls/55,This fixes the issues discussed in #54,206156866,twitter-to-sqlite,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/55/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 681575714,MDExOlB1bGxSZXF1ZXN0NDY5OTQ0OTk5,49,"Document the use of --stop_after with favorites, refs #20",370930,mikepqr,closed,0,,,,,1,2020-08-19T06:10:52Z,2021-08-20T00:02:11Z,2021-08-20T00:02:11Z,CONTRIBUTOR,dogsheep/twitter-to-sqlite/pulls/49,(I discovered this trawling the issues for how to use --since with favorites),206156866,twitter-to-sqlite,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/49/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 907645813,MDU6SXNzdWU5MDc2NDU4MTM=,57,"Error: Use either --since or --since_id, not both",42904,rubenv,closed,0,,,,,6,2021-05-31T18:11:04Z,2021-08-20T00:01:31Z,2021-08-20T00:01:31Z,CONTRIBUTOR,,"I'm using the following command: ``` twitter-to-sqlite user-timeline -a twitter-auth.json twitter/tweets.db --since ``` Which gives the following error: ``` Error: Use either --since or --since_id, not both ``` Running without `--since`. ``` Traceback (most recent call last): File ""/usr/local/bin/twitter-to-sqlite"", line 8, in sys.exit(cli()) File ""/usr/local/lib/python3.9/site-packages/click/core.py"", line 1137, in __call__ return self.main(*args, **kwargs) File ""/usr/local/lib/python3.9/site-packages/click/core.py"", line 1062, in main rv = self.invoke(ctx) File ""/usr/local/lib/python3.9/site-packages/click/core.py"", line 1668, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/usr/local/lib/python3.9/site-packages/click/core.py"", line 1404, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/usr/local/lib/python3.9/site-packages/click/core.py"", line 763, in invoke return __callback(*args, **kwargs) File ""/usr/local/lib/python3.9/site-packages/twitter_to_sqlite/cli.py"", line 317, in user_timeline for tweet in bar: File ""/usr/local/lib/python3.9/site-packages/click/_termui_impl.py"", line 328, in generator for rv in self.iter: File ""/usr/local/lib/python3.9/site-packages/twitter_to_sqlite/utils.py"", line 234, in fetch_user_timeline yield from fetch_timeline( File ""/usr/local/lib/python3.9/site-packages/twitter_to_sqlite/utils.py"", line 202, in fetch_timeline raise Exception(str(tweets[""errors""])) Exception: [{'code': 44, 'message': 'since_id parameter is invalid.'}] ``` ``` Python 3.9.5 twitter-to-sqlite, version 0.21.3 ```",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/57/reactions"", ""total_count"": 4, ""+1"": 4, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 974995592,MDU6SXNzdWU5NzQ5OTU1OTI=,1443,datasette.databases should be a documented property,9599,simonw,closed,0,,,,,1,2021-08-19T19:53:04Z,2021-08-19T21:25:07Z,2021-08-19T21:23:47Z,OWNER,,"https://github.com/simonw/datasette/blob/adb5b70de5cec3c3dd37184defe606a082c232cf/datasette/app.py#L231 I want to use it in https://github.com/simonw/datasette-block-robots/issues/5",107914493,datasette,issue,,,"{""url"": 
""https://api.github.com/repos/simonw/datasette/issues/1443/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 974987856,MDU6SXNzdWU5NzQ5ODc4NTY=,1442,Mechanism to cause specific branches to deploy their own demos,9599,simonw,closed,0,,,,,6,2021-08-19T19:41:39Z,2021-08-19T21:11:45Z,2021-08-19T21:09:40Z,OWNER,,"A useful capability would be if it was super-easy to say ""any pushes to branch X should be deployed to `latest-X.datasette.io`"". I'd like to use this for the column query information work in #1434",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1442/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 975049826,MDExOlB1bGxSZXF1ZXN0NzE2MjYyODI5,1444,Ability to deploy demos of branches,9599,simonw,closed,0,,,,,0,2021-08-19T21:08:04Z,2021-08-19T21:09:44Z,2021-08-19T21:09:39Z,OWNER,simonw/datasette/pulls/1444,See #1442.,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1444/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 913135723,MDU6SXNzdWU5MTMxMzU3MjM=,266,"Add some types, enforce with mypy",9599,simonw,closed,0,,,,,3,2021-06-07T06:05:56Z,2021-08-18T22:25:38Z,2021-08-18T22:25:38Z,OWNER,,"A good starting point would be adding type information to the members of these named tuples and the introspection methods that return them: https://github.com/simonw/sqlite-utils/blob/9dff7a38831d471b1dff16d40d89eb5c3b4e84d6/sqlite_utils/db.py#L51-L75",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/266/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 965210966,MDU6SXNzdWU5NjUyMTA5NjY=,314,Type signatures for `.create_table()` and `.create_table_sql()` and `.create()` and `Table.__init__`,9599,simonw,closed,0,,,,,2,2021-08-10T18:03:59Z,2021-08-18T22:25:21Z,2021-08-18T22:25:21Z,OWNER,,"> Adding type signatures to `create_table()` and `.create_table_sql()` is a bit too involved, I'll do that in a separate issue. _Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/312#issuecomment-896200682_",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/314/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 465815372,MDU6SXNzdWU0NjU4MTUzNzI=,37,Experiment with type hints,9599,simonw,closed,0,,,,,6,2019-07-09T14:30:34Z,2021-08-18T21:48:57Z,2021-08-18T21:48:57Z,OWNER,,"Since it's designed to be used in Jupyter or for rapid prototyping in an IDE (and it's still pretty small) `sqlite-utils` feels like a great candidate for me to finally try out Python type hints. https://veekaybee.github.io/2019/07/08/python-type-hints/ is good. 
It suggests the mypy docs for getting started: https://mypy.readthedocs.io/en/latest/existing_code.html plus this tutorial: https://pymbook.readthedocs.io/en/latest/typehinting.html",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/37/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 931752773,MDU6SXNzdWU5MzE3NTI3NzM=,294,Add a `sqlite-utils memory` example to the README,9599,simonw,closed,0,,,,,0,2021-06-28T16:35:59Z,2021-08-18T21:40:03Z,2021-08-18T21:40:03Z,OWNER,,,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/294/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 934123448,MDU6SXNzdWU5MzQxMjM0NDg=,295,Insert with --tsv and --no-headers give error about --nl arguments,7288187,davidscotson,closed,0,,,,,1,2021-06-30T21:01:01Z,2021-08-18T20:19:04Z,2021-08-18T20:18:57Z,NONE,,"Not quite sure if this is a bug, or just an assumption I made but I thought `--tsv` and `--no-headers` would work together when inserting from a file, and currently they seem not to (sqlite-utils, version 3.12, installed on Mac OS X via brew) Instead it says: `Error: Use just one of --nl, --csv or --tsv` As if it has interpreted the --no-headers as --nl. The --help does specifically say CSV: `--no-headers CSV file has no header row` And this heading in the documentation also only refers to CSV, but the text does mention TSV in passing, and I'd generally expect them to behave the same in most cases. https://sqlite-utils.datasette.io/en/stable/cli.html#csv-files-without-a-header-row",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/295/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 944326512,MDU6SXNzdWU5NDQzMjY1MTI=,296,"`table.search(..., quote=True)` parameter and `sqlite-utils search --quote` option",32427188,deafmute1,closed,0,,,,,6,2021-07-14T11:26:47Z,2021-08-18T20:13:12Z,2021-08-18T20:10:48Z,NONE,,"Hi, Recently got this error: ``` Traceback (most recent call last): File """", line 1, in File ""/home/ethan/git/music-metadata-indexer/src/mmindexer/__init__.py"", line 38, in start(""/home/ethan/git/music-metadata-indexer/sample"", ""/home/ethan/git/music-metadata-indexer/test.db"") File ""/home/ethan/git/music-metadata-indexer/src/mmindexer/__init__.py"", line 23, in start scanner.build_database() File ""/home/ethan/git/music-metadata-indexer/src/mmindexer/scan.py"", line 79, in build_database _import_song(self.db, Path(dirpath).joinpath(f), self.logger) File ""/home/ethan/git/music-metadata-indexer/src/mmindexer/scan.py"", line 23, in _import_song db.add_song(filepath) File ""/home/ethan/git/music-metadata-indexer/src/mmindexer/index.py"", line 166, in add_song for match in self.search(""albums"", album): File ""/home/ethan/git/music-metadata-indexer/env/lib/python3.9/site-packages/sqlite_utils/db.py"", line 1625, in search cursor = self.db.execute( File ""/home/ethan/git/music-metadata-indexer/env/lib/python3.9/site-packages/sqlite_utils/db.py"", line 243, in execute return self.conn.execute(sql, parameters) sqlite3.OperationalError: fts5: syntax error near ""."" ``` So, the error seems to suggest there was a ""."" character somewhere in the SQL 
command that was causing the error. I did a little digging and found this in the docs: https://www.sqlite.org/fts5.html#fts5_strings. ""."" is one of the many prohibited characters. My solution was to just strip these out of the query using this line `query = query.translate({e: None for e in itertools.chain(range(0,26), range(27, 48), range(58,65), range(91,95), [96], range(123,128))})` Perhaps this could be included into the `table.search()` function? ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/296/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 831751367,MDU6SXNzdWU4MzE3NTEzNjc=,246,Escaping FTS search strings,16001974,DeNeutoy,closed,0,,,,,4,2021-03-15T12:15:09Z,2021-08-18T18:57:13Z,2021-08-18T18:43:12Z,CONTRIBUTOR,," Thanks for the excellent library, it's very nice to use! I've been building some in memory search functionality for a data annotation tool i'm making, and I got tripped up a little bit with escaping the full text search queries. First I tried using `db.quote(q)`, which doesn't work, because sqlite FTS has it's own (separate)[ query syntax](https://www2.sqlite.org/fts5.html#full_text_query_syntax). You can see this happening here also: http://search-24ways.herokuapp.com/24ways-f8f455f/articles?_search=acces%2A I got around this by aggressively escaping quotes inside the query string like this: ```python quoted = q.replace('""', '""""') quoted = f'""{quoted}""' print(quoted) results = db[""data""].search(quoted, columns=[""id""]) return [x[""id""] for x in results] ``` This works in the sense it doesn't crash, but it also removes access to the search query syntax. Given the well specified definition, it might be possible for sqlite-utils to provide a `db.quote_query(q)` which would intelligently escape a query whilst leaving the syntax intact. This would be very nice! ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/246/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 832687563,MDExOlB1bGxSZXF1ZXN0NTkzODA1ODA0,247,FTS quote functionality from datasette,16001974,DeNeutoy,closed,0,,,,,2,2021-03-16T11:17:34Z,2021-08-18T18:43:12Z,2021-08-18T18:43:12Z,CONTRIBUTOR,simonw/sqlite-utils/pulls/247,"Addresses #246 - this is a bit of a kludge because it doesn't actually *validate* the FTS string, just makes sure that it will not crash when executed, but I figured that building a query parser is a bit out of the scope of sqlite-utils and if you actually want to use the query language, you probably need to parse that yourself. 
",140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/247/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 972827346,MDU6SXNzdWU5NzI4MjczNDY=,317,Link to a better example on docs index,9599,simonw,closed,0,,,,,1,2021-08-17T15:43:40Z,2021-08-18T18:31:43Z,2021-08-18T18:31:43Z,OWNER,,https://github.com/simonw/sqlite-utils/blob/7a19822ac9ee24be2fbb4c2326a0bf2f3d2d9c4d/docs/index.rst#L39 Is a very old example,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/317/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 268469569,MDU6SXNzdWUyNjg0Njk1Njk=,39,Protect against malicious SQL that causes damage even though our DB is immutable,9599,simonw,closed,0,,,2857392,Ship first public release,4,2017-10-25T16:44:27Z,2021-08-17T23:52:07Z,2017-11-05T02:53:47Z,OWNER,,"I’m currently operating under the assumption that it’s safe to allow arbitrary SQL statements because we are dealing with an immutable database. But this might not be the case - there are some pretty weird SQLite language extensions (ATTACH, PRAGMA etc) and I’m not certain they cannot be used to break things in a way that would affect future requests to the API. Solution: provide a “safe mode” option which disables the ?sql= mechanism. This still leaves the URL filter lookups, so I need to make sure that those are “safe”. In the future I may also implement a whitelist option where datasets can be configured to only allow specific filters against specific columns.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/39/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 970320615,MDU6SXNzdWU5NzAzMjA2MTU=,316,Fix visible backticks on reference page,9599,simonw,closed,0,,,,,1,2021-08-13T11:37:46Z,2021-08-14T05:12:23Z,2021-08-14T05:10:48Z,OWNER,,"https://sqlite-utils.datasette.io/en/latest/reference.html Search for backtick to reveal various minor markup bugs.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/316/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 969840302,MDU6SXNzdWU5Njk4NDAzMDI=,1431,`--help-config` should be called `--help-settings`,9599,simonw,closed,0,,,,,1,2021-08-13T00:46:48Z,2021-08-13T01:01:58Z,2021-08-13T01:01:58Z,OWNER,,Follow-on from #1105 rebranding exercise.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1431/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 969758038,MDExOlB1bGxSZXF1ZXN0NzExNzgzNjE2,1430,Column metadata,9599,simonw,closed,0,,,,,1,2021-08-12T23:34:39Z,2021-08-12T23:53:23Z,2021-08-12T23:53:23Z,OWNER,simonw/datasette/pulls/1430,"Refs #942 Still needs: - [x] Tests - [x] Documentation",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1430/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 
965102534,MDU6SXNzdWU5NjUxMDI1MzQ=,311,Add reference documentation generated from docstrings,9599,simonw,closed,0,,,,,4,2021-08-10T16:04:00Z,2021-08-11T12:03:50Z,2021-08-11T12:03:50Z,OWNER,,"Using https://www.sphinx-doc.org/en/master/usage/extensions/autodoc.html I'm not a big fan of this kind of documentation because it so often comes in place of narrative documentation - but the library has great narrative documentation now, so the reference documentation can link to it in places. This will also encourage me to add good docstrings everywhere, useful for IDEs and suchlike.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/311/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 965143346,MDExOlB1bGxSZXF1ZXN0NzA3NDkwNzg5,312,Add reference page to documentation using Sphinx autodoc,9599,simonw,closed,0,,,,,10,2021-08-10T16:59:17Z,2021-08-10T23:09:32Z,2021-08-10T23:09:28Z,OWNER,simonw/sqlite-utils/pulls/312,Refs #311.,140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/312/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 965440017,MDU6SXNzdWU5NjU0NDAwMTc=,315,`.delete_where()` returns `[]` when it should return self,9599,simonw,closed,0,,,,,1,2021-08-10T21:54:55Z,2021-08-10T23:09:29Z,2021-08-10T23:09:29Z,OWNER,,"If the table doesn't exist it should still return `self`, not `[]`: https://github.com/simonw/sqlite-utils/blob/ee469e3122d6f5973ec2584c1580d930daca2e7c/sqlite_utils/db.py#L1676-L1683 Spotted with `mypy` while working on #312.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/315/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 965166058,MDU6SXNzdWU5NjUxNjYwNTg=,313,`.add_foreign_keys()` doesn't reject being called with a View,9599,simonw,closed,0,,,,,0,2021-08-10T17:22:17Z,2021-08-10T17:25:34Z,2021-08-10T17:25:34Z,OWNER,,"Spotted this bug using `mypy` while working on #311 / #312! ``` % mypy sqlite_utils sqlite_utils/db.py:725: error: Item ""View"" of ""Union[Table, View]"" has no attribute ""foreign_keys"" Found 1 error in 1 file (checked 5 source files) ``` Refers to this code: https://github.com/simonw/sqlite-utils/blob/c11ff89894727270d4a9eb554d3a006f5b0d8d9d/sqlite_utils/db.py#L710-L720 It's a bug! 
We run some checks earlier but none of them ensure that it's a view: https://github.com/simonw/sqlite-utils/blob/c11ff89894727270d4a9eb554d3a006f5b0d8d9d/sqlite_utils/db.py#L697-L709",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/313/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 963897111,MDU6SXNzdWU5NjM4OTcxMTE=,309,"sqlite-utils insert errors should show SQL and parameters, if possible",16622642,scaleoutsean,closed,0,,,,,6,2021-08-09T11:24:14Z,2021-08-09T23:40:29Z,2021-08-09T22:25:58Z,NONE,,"I've tried several approaches, but this is the current one: ```sh echo $json-line | sqlite-utils insert json.db jsontable --truncate --alter --detect-types - ``` In all cases, I get this error: ```sh OverflowError: Python int too large to convert to SQLite INTEGER Traceback (most recent call last): File ""/home/sean/.local/bin/sqlite-utils"", line 8, in sys.exit(cli()) File ""/usr/lib/python3/dist-packages/click/core.py"", line 764, in __call__ return self.main(*args, **kwargs) File ""/usr/lib/python3/dist-packages/click/core.py"", line 717, in main rv = self.invoke(ctx) File ""/usr/lib/python3/dist-packages/click/core.py"", line 1137, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/usr/lib/python3/dist-packages/click/core.py"", line 956, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/usr/lib/python3/dist-packages/click/core.py"", line 555, in invoke return callback(*args, **kwargs) File ""/home/sean/.local/lib/python3.8/site-packages/sqlite_utils/cli.py"", line 841, in insert insert_upsert_implementation( File ""/home/sean/.local/lib/python3.8/site-packages/sqlite_utils/cli.py"", line 780, in insert_upsert_implementation db[table].insert_all( File ""/home/sean/.local/lib/python3.8/site-packages/sqlite_utils/db.py"", line 2145, in insert_all self.insert_chunk( File ""/home/sean/.local/lib/python3.8/site-packages/sqlite_utils/db.py"", line 1957, in insert_chunk result = self.db.execute(query, params) File ""/home/sean/.local/lib/python3.8/site-packages/sqlite_utils/db.py"", line 257, in execute return self.conn.execute(sql, parameters) ``` I googled the error and checked SO answers and advice, all good. I changed my JSON file to not use integers so I no longer get this error. Of course, that makes using the database a bit harder, so I also tried to solve the problem by modifying DB structure (while using integers in JSON). If change all `INTEGER` Data Types to something else (`STRING`, `TEXT`) and try to import again using `--truncate`, I still get this error. I suppose I should tell sqlite-utils which columns should use non-INTEGER Data Type rather than rely on it to check my SQL table configuration. If that is the case, can this error be a bit more specific for easier troubleshooting - maybe tell us which which record caused the problem when that error is thrown? My table has 60+ columns, many of which use 64-bit integers (not all records are large or known in advance), so while I can modify JSON to use strings instead of integers, it decreases usability and finding out which records have values for which SQLite integers aren't sufficient requires some work (I'm thinking about parsing all integers with `jq` and sorting output by length to identify those columns, but I'd prefer if sqlite-utils could tell me that). 
My environment: - Python 3.8.10 - sqlite-utils 3.14 - pandas 1.3.1 - numpy 1.21.1 - sqlite-fts4 1.0.1 - sqlite 3.31.1-4ubuntu0.2 ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/309/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 963528457,MDU6SXNzdWU5NjM1Mjg0NTc=,1425,render_cell() hook should support returning an awaitable,9599,simonw,closed,0,,,,,11,2021-08-08T22:32:29Z,2021-08-09T07:14:35Z,2021-08-09T03:00:37Z,OWNER,,"Many of the plugin hooks can return an awaitable - e.g. https://docs.datasette.io/en/stable/plugin_hooks.html#plugin-hook-extra-template-vars - but `render_cell()` doesn't support this. I recently found myself wanting to execute an additional SQL query from that hook, but it wasn't possible to do that since I couldn't use `await`.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1425/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 959999095,MDU6SXNzdWU5NTk5OTkwOTU=,1421,"""Query parameters"" form shows wrong input fields if query contains ""03:31"" style times",6988,j4mie,closed,0,,,,,11,2021-08-04T07:29:04Z,2021-08-09T03:41:07Z,2021-08-09T03:33:02Z,NONE,,"Datasette version `0.58.1`. I'm guessing this is a bug in the code that looks for `:param`-style query parameters.. ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1421/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 961367843,MDU6SXNzdWU5NjEzNjc4NDM=,1422,Ability to default to hiding the SQL for a canned query,9599,simonw,closed,0,,,,,4,2021-08-05T02:51:39Z,2021-08-07T05:32:29Z,2021-08-07T05:32:29Z,OWNER,,"I'm working on a project with some HUGE (400+ lines of SQL) canned queries right now. Any time you land on the canned query page you have to scroll down a long distance to get to the results! 
Would be useful to be able to default to https://latest.datasette.io/fixtures/magic_parameters?_hide_sql=1 without needing the parameter.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1422/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 959898166,MDU6SXNzdWU5NTk4OTgxNjY=,1420,`datasette publish cloudrun --cpu X` option,9599,simonw,closed,0,,,,,5,2021-08-04T05:04:31Z,2021-08-05T00:54:59Z,2021-08-04T05:33:48Z,OWNER,,"For setting the number of vCPUs - current valid values are 1, 2 or 4: https://cloud.google.com/run/docs/configuring/cpu Pass that through to `gcloud run deploy --image IMAGE_URL --cpu CPU`",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1420/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 959305209,MDU6SXNzdWU5NTkzMDUyMDk=,307,codespell to spell check documentation,9599,simonw,closed,0,,,,,0,2021-08-03T16:48:19Z,2021-08-03T16:48:53Z,2021-08-03T16:48:53Z,OWNER,,As seen in https://github.com/simonw/datasette/issues/1417 and https://til.simonwillison.net/python/codespell,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/307/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 959278472,MDU6SXNzdWU5NTkyNzg0NzI=,1417,Use codespell in CI to spot spelling errors,9599,simonw,closed,0,,,,,1,2021-08-03T16:14:15Z,2021-08-03T16:36:40Z,2021-08-03T16:36:40Z,OWNER,,"I noticed Rich is using this: https://github.com/willmcgugan/rich/commit/9c12a4537499797c43725fff5276ef0da62423ef#diff-ce84a1b2c9eb4ab3ea22f610cad7111cb9a2f66365c3b24679901376a2a73ab2 Ran it against the Datasette docs and found a bunch of obvious fixes, surprisingly with no false positives. 
``` datasette % codespell docs/*.rst docs/authentication.rst:63: perfom ==> perform docs/authentication.rst:76: perfom ==> perform docs/changelog.rst:429: repsonse ==> response docs/changelog.rst:503: permissons ==> permissions docs/changelog.rst:717: compatibilty ==> compatibility docs/changelog.rst:1172: browseable ==> browsable docs/deploying.rst:191: similiar ==> similar docs/internals.rst:434: Respons ==> Response, respond docs/internals.rst:440: Respons ==> Response, respond docs/internals.rst:717: tha ==> than, that, the docs/performance.rst:42: databse ==> database docs/plugin_hooks.rst:667: utilites ==> utilities docs/publish.rst:168: countainer ==> container docs/settings.rst:352: inalid ==> invalid docs/sql_queries.rst:406: preceeded ==> preceded, proceeded ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1417/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 959284434,MDExOlB1bGxSZXF1ZXN0NzAyNDIyMjYz,1418,Spelling corrections plus CI job for codespell,9599,simonw,closed,0,,,,,2,2021-08-03T16:21:19Z,2021-08-03T16:36:39Z,2021-08-03T16:36:38Z,OWNER,simonw/datasette/pulls/1418,Refs #1417.,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1418/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 959276629,MDU6SXNzdWU5NTkyNzY2Mjk=,1416,"Use rich to render tracebacks on errors, if available",9599,simonw,closed,0,,,,,0,2021-08-03T16:12:08Z,2021-08-03T16:12:51Z,2021-08-03T16:12:51Z,OWNER,,"> Now thinking I should try adding Rich as an optional dependency to Datasette - if it's there, show tracebacks using it. Could be really handy for development > https://twitter.com/simonw/status/1422576091055616003",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1416/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 951581763,MDU6SXNzdWU5NTE1ODE3NjM=,298,Read lines with JSON object,2172260,qqilihq,closed,0,,,,,2,2021-07-23T13:28:52Z,2021-08-03T06:50:47Z,2021-08-02T21:55:16Z,NONE,,"I found this posted on HN a while ago and love it -- thank you! As a minor improvement, it would be great to have the ability to parse a file with line-separated JSON objects. 
Currently the parser obviously requires an array wrapping all these objects.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/298/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 841377702,MDU6SXNzdWU4NDEzNzc3MDI=,251,"""sqlite-utils convert"" command to replace the separate ""sqlite-transform"" tool",9599,simonw,closed,0,,,,,15,2021-03-25T22:36:36Z,2021-08-02T22:39:46Z,2021-08-02T04:47:40Z,OWNER,,"See https://github.com/simonw/sqlite-transform/issues/11 - I built a separate `sqlite-transform` tool a while ago that uses the word ""transform"" to means something entirely different from `sqlite-utils transform` - I'd like to resolve this by merging the two tools.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/251/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 956832836,MDU6SXNzdWU5NTY4MzI4MzY=,300,Returning underlying cause for User Defined Functions ,71236,wsargent,closed,0,,,,,1,2021-07-30T15:08:21Z,2021-08-02T21:53:50Z,2021-08-02T21:53:50Z,NONE,,"The sqlite3 client takes user defined functions and replaces the text with ""user-defined function raised exception`"" so it's not apparent what's gone wrong: ``` Unexpected error: user-defined function raised exception ``` As mentioned in https://code.djangoproject.com/ticket/29500 and https://stackoverflow.com/questions/45824209/how-to-get-an-error-kind-from-sqlite-create-function/45834923#45834923 the workaround for this is to enable callback tracebacks: ``` sqlite3.enable_callback_tracebacks(True) ``` It would be nice if https://sqlite-utils.datasette.io/en/stable/python-api.html#registering-custom-sql-functions either included a reference to `enable_callback_tracebacks` or if registering a user defined function set this flag automatically.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/300/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 958516743,MDU6SXNzdWU5NTg1MTY3NDM=,306,Configure sphinx.ext.extlinks for issues,9599,simonw,closed,0,,,,,2,2021-08-02T21:19:19Z,2021-08-02T21:39:34Z,2021-08-02T21:29:22Z,OWNER,,As seen in Datasette: https://github.com/simonw/datasette/issues/1227,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/306/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 810394616,MDU6SXNzdWU4MTAzOTQ2MTY=,1227,Configure sphinx.ext.extlinks for issues,9599,simonw,closed,0,,,,,1,2021-02-17T17:38:02Z,2021-08-02T21:38:39Z,2021-02-18T01:20:33Z,OWNER,,"Spotted this in the aspw documentation: https://github.com/rogerbinns/apsw/blob/3.34.0-r1/doc/conf.py#L29-L36 ```python extlinks={ 'cvstrac': ('https://sqlite.org/cvstrac/tktview?tn=%s', 'SQLite ticket #'), 'sqliteapi': ('https://sqlite.org/c3ref/%s.html', 'XXYouShouldNotSeeThisXX'), 'issue': ('https://github.com/rogerbinns/apsw/issues/%s', 'APSW issue '), 'source': ('https://github.com/rogerbinns/apsw/blob/master/%s', ''), } ``` Which lets you link to issues like this: :issue:`268`",107914493,datasette,issue,,,"{""url"": 
""https://api.github.com/repos/simonw/datasette/issues/1227/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 957383814,MDU6SXNzdWU5NTczODM4MTQ=,301,insert-files should get a --silent option,9599,simonw,closed,0,,,,,0,2021-08-01T04:11:03Z,2021-08-02T19:12:21Z,2021-08-02T19:12:21Z,OWNER,,"The new `sqlite-utils convert` command I'm adding in #251 will have a `--silent` option for turning off the progress bars. The only other command that has progress bars right now is `insert-files` so it should get this option too, for consistency.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/301/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 957731178,MDU6SXNzdWU5NTc3MzExNzg=,304,"`table.convert(..., where=)` and `sqlite-utils convert ... --where=`",9599,simonw,closed,0,,,,,3,2021-08-02T04:27:23Z,2021-08-02T19:00:00Z,2021-08-02T18:58:10Z,OWNER,,"For applying the conversion to a subset of rows selected using the where clause. Should also take optional arguments, as seen in `db[""dogs""].delete_where(""age < ?"", [3])`. Follows #302 and #251. This was originally https://github.com/simonw/sqlite-transform/issues/9",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/304/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 957741820,MDU6SXNzdWU5NTc3NDE4MjA=,305,Python: need a way to execute a count with an extra where clause,9599,simonw,closed,0,,,,,1,2021-08-02T04:52:02Z,2021-08-02T05:08:22Z,2021-08-02T05:08:22Z,OWNER,,I need this for #304. I'll probably add this to the `.execute_count()` method as `where=` and `where_args=`.,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/305/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 957536983,MDExOlB1bGxSZXF1ZXN0NzAwOTQ0NjQ0,303,sqlite-utils convert command and db[table].convert(...) method,9599,simonw,closed,0,,,,,1,2021-08-01T16:52:42Z,2021-08-02T04:47:42Z,2021-08-02T04:47:39Z,OWNER,simonw/sqlite-utils/pulls/303,"Refs #251, #302. - [x] Get recipes working - [x] Document recipes - [x] Implement `db[table].convert(...)` method - [x] Add tests for recipes that use the new Python method - [x] Implement `db[table].convert(..., multi=True)` mechanism - [x] Documentation for `db[table].convert(...)` - [x] Refactor `sqlite-utils convert` to use the new method",140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/303/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 957529248,MDU6SXNzdWU5NTc1MjkyNDg=,302,Python library version of `sqlite-utils convert`,9599,simonw,closed,0,9599,simonw,,,1,2021-08-01T16:11:02Z,2021-08-02T04:47:40Z,2021-08-02T04:47:40Z,OWNER,,"Spin off from #251. 
The ability to execute Python functions to convert and split columns should be part of the library too, not just the CLI.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/302/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 957345476,MDU6SXNzdWU5NTczNDU0NzY=,1411,Canned query ?sql= is pointlessly echoed in query string starting from hidden mode,9599,simonw,closed,0,,,,,1,2021-08-01T00:17:13Z,2021-08-01T03:27:30Z,2021-08-01T00:58:17Z,OWNER,,"Example: https://latest.datasette.io/fixtures/neighborhood_search?text=cork&_hide_sql=1 Submitting that form again results in this: https://latest.datasette.io/fixtures/neighborhood_search?sql=%0D%0Aselect+neighborhood%2C+facet_cities.name%2C+state%0D%0Afrom+facetable%0D%0A++++join+facet_cities%0D%0A++++++++on+facetable.city_id+%3D+facet_cities.id%0D%0Awhere+neighborhood+like+%27%25%27+%7C%7C+%3Atext+%7C%7C+%27%25%27%0D%0Aorder+by+neighborhood%3B%0D%0A&_hide_sql=1&text=cork Because the HTML on https://latest.datasette.io/fixtures/neighborhood_search?text=cork&_hide_sql=1 includes this: ```html

<!-- the original HTML form markup was not preserved in this export; only its visible text remains -->
Custom SQL query returning 1 row (show)

```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1411/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 957298475,MDU6SXNzdWU5NTcyOTg0NzU=,1407,OSError: AF_UNIX path too long in ds_unix_domain_socket_server,9599,simonw,closed,0,,,,,2,2021-07-31T18:36:06Z,2021-07-31T19:03:44Z,2021-07-31T19:03:44Z,OWNER,,"Got this exception while working on #1406. ``` @pytest.fixture(scope=""session"") def ds_unix_domain_socket_server(tmp_path_factory): socket_folder = tmp_path_factory.mktemp(""uds"") uds = str(socket_folder / ""datasette.sock"") ds_proc = subprocess.Popen( [""datasette"", ""--memory"", ""--uds"", uds], stdout=subprocess.PIPE, stderr=subprocess.STDOUT, cwd=tempfile.gettempdir(), ) # Give the server time to start time.sleep(1.5) # Check it started successfully > assert not ds_proc.poll(), ds_proc.stdout.read().decode(""utf-8"") E AssertionError: INFO: Started server process [48453] E INFO: Waiting for application startup. E INFO: Application startup complete. E Traceback (most recent call last): E File ""/Users/simon/.local/share/virtualenvs/datasette-AWNrQs95/bin/datasette"", line 33, in E sys.exit(load_entry_point('datasette', 'console_scripts', 'datasette')()) E File ""/Users/simon/.local/share/virtualenvs/datasette-AWNrQs95/lib/python3.8/site-packages/click/core.py"", line 1137, in __call__ E return self.main(*args, **kwargs) E File ""/Users/simon/.local/share/virtualenvs/datasette-AWNrQs95/lib/python3.8/site-packages/click/core.py"", line 1062, in main E rv = self.invoke(ctx) E File ""/Users/simon/.local/share/virtualenvs/datasette-AWNrQs95/lib/python3.8/site-packages/click/core.py"", line 1668, in invoke E return _process_result(sub_ctx.command.invoke(sub_ctx)) E File ""/Users/simon/.local/share/virtualenvs/datasette-AWNrQs95/lib/python3.8/site-packages/click/core.py"", line 1404, in invoke E return ctx.invoke(self.callback, **ctx.params) E File ""/Users/simon/.local/share/virtualenvs/datasette-AWNrQs95/lib/python3.8/site-packages/click/core.py"", line 763, in invoke E return __callback(*args, **kwargs) E File ""/Users/simon/Dropbox/Development/datasette/datasette/cli.py"", line 583, in serve E uvicorn.run(ds.app(), **uvicorn_kwargs) E File ""/Users/simon/.local/share/virtualenvs/datasette-AWNrQs95/lib/python3.8/site-packages/uvicorn/main.py"", line 393, in run E server.run() E File ""/Users/simon/.local/share/virtualenvs/datasette-AWNrQs95/lib/python3.8/site-packages/uvicorn/server.py"", line 50, in run E loop.run_until_complete(self.serve(sockets=sockets)) E File ""/Users/simon/.pyenv/versions/3.8.2/lib/python3.8/asyncio/base_events.py"", line 616, in run_until_complete E return future.result() E File ""/Users/simon/.local/share/virtualenvs/datasette-AWNrQs95/lib/python3.8/site-packages/uvicorn/server.py"", line 67, in serve E await self.startup(sockets=sockets) E File ""/Users/simon/.local/share/virtualenvs/datasette-AWNrQs95/lib/python3.8/site-packages/uvicorn/server.py"", line 133, in startup E server = await asyncio.start_unix_server( E File ""/Users/simon/.pyenv/versions/3.8.2/lib/python3.8/asyncio/streams.py"", line 132, in start_unix_server E return await loop.create_unix_server(factory, path, **kwds) E File ""/Users/simon/.pyenv/versions/3.8.2/lib/python3.8/asyncio/unix_events.py"", line 296, in create_unix_server E sock.bind(path) E OSError: AF_UNIX path too long E E assert not 1 E + where 1 = >() E + where > = .poll 
```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1407/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 956303470,MDU6SXNzdWU5NTYzMDM0NzA=,1406,Tests failing with FileNotFoundError in runner.isolated_filesystem,9599,simonw,closed,0,,,,,8,2021-07-30T00:39:00Z,2021-07-31T18:56:35Z,2021-07-31T18:56:35Z,OWNER,,"e.g. https://github.com/simonw/datasette/runs/3197141955 I've seen this error before, but I don't yet have a good workaround for it. ``` @contextlib.contextmanager def isolated_filesystem( self, temp_dir: t.Optional[t.Union[str, os.PathLike]] = None ) -> t.Iterator[str]: """"""A context manager that creates a temporary directory and changes the current working directory to it. This isolates tests that affect the contents of the CWD to prevent them from interfering with each other. :param temp_dir: Create the temporary directory under this directory. If given, the created directory is not removed when exiting. .. versionchanged:: 8.0 Added the ``temp_dir`` parameter. """""" > cwd = os.getcwd() E FileNotFoundError: [Errno 2] No such file or directory /opt/hostedtoolcache/Python/3.6.14/x64/lib/python3.6/site-packages/click/testing.py:466: FileNotFoundError =========================== short test summary info ============================ FAILED tests/test_publish_cloudrun.py::test_publish_cloudrun_apt_get_install FAILED tests/test_publish_cloudrun.py::test_publish_cloudrun_extra_options[---setting force_https_urls on] FAILED tests/test_publish_cloudrun.py::test_publish_cloudrun_extra_options[--setting base_url /foo---setting base_url /foo --setting force_https_urls on] FAILED tests/test_publish_cloudrun.py::test_publish_cloudrun_extra_options[--setting force_https_urls off---setting force_https_urls off] FAILED tests/test_publish_heroku.py::test_publish_heroku_requires_heroku - Fi... FAILED tests/test_publish_heroku.py::test_publish_heroku_installs_plugin - Fi... FAILED tests/test_publish_heroku.py::test_publish_heroku - FileNotFoundError:... FAILED tests/test_publish_heroku.py::test_publish_heroku_plugin_secrets - Fil... ================== 8 failed, 920 passed in 188.22s (0:03:08) =================== ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1406/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 955316250,MDU6SXNzdWU5NTUzMTYyNTA=,1405,utils.parse_metadata() should be a documented internal function,9599,simonw,closed,0,,,,,3,2021-07-28T23:51:39Z,2021-07-29T23:33:30Z,2021-07-29T23:30:24Z,OWNER,,Because it's used by this plugin: https://github.com/simonw/datasette-remote-metadata,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1405/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 530513784,MDExOlB1bGxSZXF1ZXN0MzQ3MTc5MDgx,644,Validate metadata json on startup,6025893,chris48s,closed,0,,,,,1,2019-11-30T00:32:15Z,2021-07-28T17:58:45Z,2021-07-28T17:58:45Z,CONTRIBUTOR,simonw/datasette/pulls/644,"This PR adds a sanity check which builds up a marshmallow schema on-the-fly based on the structure of the database(s) on startup and then validates the metadata json against it. 
In case of invalid data, this will raise with a descriptive error e.g: ``` marshmallow.exceptions.ValidationError: {'databases': {'fixtures': {'tables': {'not_a_table': ['Unknown field.']}}}} ``` Closes #260 --- This was intended to be fairly self-contained, but then while I was working on it, I hit some problems getting the tests to pass in the context of the test suite as a whole. My tests passed in isolation, but then failed while doing a full test suite run. That's when the worms started coming out of the can :bug: After some sleuthing, it turned out this was essentially the result of several issues intersecting: * There are certain events in the application lifecycle where the metadata schema can be modified after it is loaded e.g: https://github.com/simonw/datasette/blob/a562f2965552fb2dbbbd74df245c9965ee23d886/datasette/app.py#L299-L320 This means that sometimes what goes in isn't always exactly what comes out when you call `/-/metadata`. * Because the test fixtures use session scope for performance reasons if one unit test performs an action which mutates the metadata, that can impact on other unit tests which run after it using the same fixture. * Because the `self._metadata` property was being set with a simple assignment `self._metadata = metadata`, that created an object reference to the test fixture data, so operating on `self._metadata` was actually modifying the test fixture `METADATA` meaning that depending on when it was loaded in the test suite lifecycle, `METADATA` had different content, which was somewhat unexpected. As such, I've added some band-aids in 3552024 and 6859fd8: * Switching the metadata object to a `deepcopy` of the input prevents us directly mutating the input fixture. * I've switched some of the tests to use a fixture with function scope instead of session scope so we're working on a clean copy that hasn't been mutated by other tests where necessary but keeping session scope in most cases for performance. * I haven't really addressed the fact that sometimes the metadata object gets mutated in place, so the object that is served from `/-/metadata` isn't necessarily always exactly the same as the file you fed into it on init. I'm not sure how much of a problem that is. The way the tests were written makes me think it was unexpected, but getting into it feels like too much scope creep for this PR so its probably best addressed as another issue.",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/644/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 953352015,MDU6SXNzdWU5NTMzNTIwMTU=,1404,`register_routes()` hook should take `datasette` argument,9599,simonw,closed,0,,,,,1,2021-07-26T23:00:33Z,2021-07-26T23:27:07Z,2021-07-26T23:26:00Z,OWNER,,Currently that plugin hook takes no arguments at all. This means it's not possible to conditionally register routes based on Datasette plugin configuration.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1404/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 952154468,MDU6SXNzdWU5NTIxNTQ0Njg=,299,Ability to see just specific table schemas with `sqlite-utils schema`,9599,simonw,closed,0,,,,,1,2021-07-24T22:00:05Z,2021-07-24T22:12:01Z,2021-07-24T22:08:46Z,OWNER,,"It currently accepts no arguments. 
Allowing for optional arguments specifying tables would be useful: sqlite-utils schema fixtures.db facetable searchable ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/299/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 946553953,MDExOlB1bGxSZXF1ZXN0NjkxNzA3NDA5,1397,"Fix for race condition in refresh_schemas(), closes #1231",9599,simonw,closed,0,,,,,0,2021-07-16T19:44:43Z,2021-07-16T19:45:00Z,2021-07-16T19:44:58Z,OWNER,simonw/datasette/pulls/1397,,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1397/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 811367257,MDU6SXNzdWU4MTEzNjcyNTc=,1231,Race condition errors in new refresh_schemas() mechanism,9599,simonw,closed,0,,,,,11,2021-02-18T18:49:54Z,2021-07-16T19:44:59Z,2021-07-16T19:44:59Z,OWNER,,I tried running a Locust load test against Datasette and hit an error message about a failure to create tables because they already existed. I think this means there are race conditions in the new `refresh_schemas()` mechanism added in #1150.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1231/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 944870799,MDU6SXNzdWU5NDQ4NzA3OTk=,1394,Big performance boost on faceting: skip the inner order by,9599,simonw,closed,0,,,,,4,2021-07-14T23:32:29Z,2021-07-16T02:23:32Z,2021-07-15T00:05:50Z,OWNER,,"I just noticed something that could make for a huge performance improvement in faceting. The default query used by Datasette when faceting looks like this: ```sql select country_long, count(*) from ( select * from [global-power-plants] order by rowid ) where country_long is not null group by country_long order by count(*) desc ``` Here it takes 53ms: https://global-power-plants.datasettes.com/global-power-plants?sql=select%0D%0A++country_long%2C%0D%0A++count%28*%29%0D%0Afrom+%28%0D%0A++select+*+from+%5Bglobal-power-plants%5D+order+by+rowid%0D%0A%29%0D%0Awhere%0D%0A++country_long+is+not+null%0D%0Agroup+by%0D%0A++country_long%0D%0Aorder+by%0D%0A++count%28*%29+desc Note that there's a `order by rowid` in there which isn't necessary - the order on that inner query doesn't matter since we're grouping and counting. I had assumed SQLite would optimize this away - but it turns out it doesn't! Consider this version of the query, with that pointless order by removed: ``` select country_long, count(*) from ( select * from [global-power-plants] ) where country_long is not null group by country_long order by count(*) desc ``` https://global-power-plants.datasettes.com/global-power-plants?sql=select%0D%0A++country_long%2C%0D%0A++count%28*%29%0D%0Afrom+%28%0D%0A++select+*+from+%5Bglobal-power-plants%5D%0D%0A%29%0D%0Awhere%0D%0A++country_long+is+not+null%0D%0Agroup+by%0D%0A++country_long%0D%0Aorder+by%0D%0A++count%28*%29+desc runs in 7.2ms! I tried this optimization on a table with 2.5m rows in it - without the optimization it took 5 seconds, with the optimization it took 450ms. 
So this is a very significant improvement!",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1394/reactions"", ""total_count"": 2, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 1, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 612673948,MDU6SXNzdWU2MTI2NzM5NDg=,759,fts search on a column doesn't work anymore due to escape_fts,133845,Krazybug,closed,0,,,,,3,2020-05-05T15:03:44Z,2021-07-16T02:11:54Z,2020-05-06T17:50:57Z,NONE,,"Hi and first, thank you for this awesome work you make with this projet. On a db indexed in full text search, I can't query on indexed column anymore. This request ""cauvin language:ita"": is running smoothly on a old version of datasette but not on the current version. Compare the current version query `select uuid, title, authors, year, series, language, formats, publisher, tags, identifiers from summary where rowid in (select rowid from summary_fts where summary_fts match escape_fts(:search)) order by uuid limit 101` To an older version: `select title, authors, series, uuid, language, identifiers, tags, publisher, formats, year, links from summary where rowid in (select rowid from summary_fts where summary_fts match :search) order by uuid limit 101` _language_ is a searchable column but now the search string is known as ""cauvin language:ita"" literally as a search term. columns are not parsed. ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/759/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 539590148,MDU6SXNzdWU1Mzk1OTAxNDg=,651,fts5 syntax error when using punctuation,2181410,clausjuhl,closed,0,,,,,3,2019-12-18T10:25:35Z,2021-07-14T19:26:06Z,2019-12-30T06:42:55Z,NONE,,"Hi Simon I get a syntax error when using punctuation or special characters in a fulltext search (using fts5). I created the virtual table using sqlite-utils' ""enable-fts""-command. The same error appears on Niche Museums [https://www.niche-museums.com/browse/search?q=park.](https://www.niche-museums.com/browse/search?q=park.), but works fine in most of your other datasette-examples, e.g. register-of-members-interests [https://register-of-members-interests.datasettes.com/regmem-98dc8b7/items?_search=mins.](https://register-of-members-interests.datasettes.com/regmem-98dc8b7/items?_search=mins.) What am I doing wrong? Many thanks! ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/651/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 941412189,MDExOlB1bGxSZXF1ZXN0Njg3MzA0MjQy,1393,Update deploying.rst,80737,aslakr,closed,0,,,,,1,2021-07-11T09:32:16Z,2021-07-13T18:32:49Z,2021-07-13T18:32:49Z,CONTRIBUTOR,simonw/datasette/pulls/1393,"Example on how to use Unix domain socket option on Apache. Not testet. (Usually I would have used [`ProxyPassReverse`](https://httpd.apache.org/docs/current/mod/mod_proxy.html#proxypassreverse) in combination with `ProxyPass` , i.e. 
```apache ProxyPass /my-datasette/ http://127.0.0.1:8009/my-datasette/ ProxyPassReverse /my-datasette/ http://127.0.0.1:8009/my-datasette/ ``` and ```apache ProxyPass /my-datasette/ unix:/tmp/datasette.sock|http://localhost/my-datasette/ ProxyPassReverse /my-datasette/ unix:/tmp/datasette.sock|http://localhost/my-datasette/ ``` )",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1393/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 941403676,MDExOlB1bGxSZXF1ZXN0Njg3Mjk4MTEy,1392,Update deploying.rst,80737,aslakr,closed,0,,,,,1,2021-07-11T08:43:19Z,2021-07-13T17:42:31Z,2021-07-13T17:42:27Z,CONTRIBUTOR,simonw/datasette/pulls/1392,Use same base url for Apache as in the example,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1392/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 931557895,MDExOlB1bGxSZXF1ZXN0Njc5MDM1ODQ3,1386,"Update asgiref requirement from <3.4.0,>=3.2.10 to >=3.2.10,<3.5.0",49699333,dependabot[bot],closed,0,,,,,1,2021-06-28T13:13:07Z,2021-07-11T01:36:19Z,2021-07-11T01:36:18Z,CONTRIBUTOR,simonw/datasette/pulls/1386,"Updates the requirements on [asgiref](https://github.com/django/asgiref) to permit the latest version.
Changelog

Sourced from asgiref's changelog.

3.4.0 (2021-06-27)

  • Calling sync_to_async directly from inside itself (which causes a deadlock when in the default, thread-sensitive mode) now has deadlock detection.

  • asyncio usage has been updated to use the new versions of get_event_loop, ensure_future, wait and gather, avoiding deprecation warnings in Python 3.10. Python 3.6 installs continue to use the old versions; this is only for 3.7+

  • sync_to_async and async_to_sync now have improved type hints that pass through the underlying function type correctly.

  • All Websocket* types are now spelled WebSocket, to match our specs and the official spelling. The old names will work until release 3.5.0, but will raise deprecation warnings.

  • The typing for WebSocketScope and HTTPScope's extensions key has been fixed.

3.3.4 (2021-04-06)

  • The async_to_sync type error is now a warning due the high false negative rate when trying to detect coroutine-returning callables in Python.

3.3.3 (2021-04-06)

  • The sync conversion functions now correctly detect functools.partial and other wrappers around async functions on earlier Python releases.

3.3.2 (2021-04-05)

  • SyncToAsync now takes an optional "executor" argument if you want to supply your own executor rather than using the built-in one.

  • async_to_sync and sync_to_async now check their arguments are functions of the correct type.

  • Raising CancelledError inside a SyncToAsync function no longer stops a future call from functioning.

  • ThreadSensitive now provides context hooks/override options so it can be made to be sensitive in a unit smaller than threads (e.g. per request)

... (truncated)

",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1386/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 939051549,MDU6SXNzdWU5MzkwNTE1NDk=,1388,Serve using UNIX domain socket,80737,aslakr,closed,0,,,,,13,2021-07-07T16:13:37Z,2021-07-11T01:18:38Z,2021-07-10T23:38:32Z,CONTRIBUTOR,,Would it be possible to make datasette serve using UNIX domain socket similar to Uvicorn's ``--uds``?,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1388/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 466996584,MDExOlB1bGxSZXF1ZXN0Mjk2NzM1MzIw,557,Get tests running on Windows using Travis CI,9599,simonw,closed,0,,,,,4,2019-07-11T16:36:57Z,2021-07-10T23:39:48Z,2021-07-10T23:39:48Z,OWNER,simonw/datasette/pulls/557,Refs #511,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/557/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 941300946,MDU6SXNzdWU5NDEzMDA5NDY=,1391,Stop using generated columns in fixtures.db,9599,simonw,closed,0,,,,,5,2021-07-10T18:26:11Z,2021-07-10T19:26:58Z,2021-07-10T19:26:00Z,OWNER,,"Refs #1376 - but I also keep running into this myself, where I try to run something against `fixtures.db` and get this confusing error: sqlite3.DatabaseError: malformed database schema (generated_columns) - near ""AS"": syntax error I'm going to stop using generated columns in `fixtures.db` and instead dynamically generate the generated column table for the duration of the relevant test.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1391/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 940077168,MDU6SXNzdWU5NDAwNzcxNjg=,1389,"""searchmode"": ""raw"" in table metadata",9599,simonw,closed,0,,,,,6,2021-07-08T17:32:10Z,2021-07-10T18:33:13Z,2021-07-10T18:33:13Z,OWNER,,"> http://localhost:8001/index/summary?_search=language%3Aeng&_sort=title&_searchmode=raw > > But I'm not able to manage it in the metadata file. 
Here is mine (note that the sort column is taken into account) > Here it is: > > ``` > { > ""databases"": { > ""index"": { > ""tables"": { > ""summary"": { > ""sort"": ""title"", > ""searchmode"": ""raw"" > } > } > } > } > } _Originally posted by @Krazybug in https://github.com/simonw/datasette/issues/759#issuecomment-624860451_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1389/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 940891698,MDU6SXNzdWU5NDA4OTE2OTg=,1390,Mention restarting systemd in documentation,9599,simonw,closed,0,,,,,2,2021-07-09T16:05:15Z,2021-07-09T16:32:57Z,2021-07-09T16:32:33Z,OWNER,,"https://docs.datasette.io/en/stable/deploying.html#running-datasette-using-systemd Need to clarify that if you add a new database or change metadata you need to restart systemd.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1390/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 935930820,MDU6SXNzdWU5MzU5MzA4MjA=,1387,absolute_url() behind a proxy assembles incorrect http://127.0.0.1:8001/ URLs,9599,simonw,closed,0,,,,,8,2021-07-02T16:58:25Z,2021-07-02T17:58:23Z,2021-07-02T17:33:05Z,OWNER,,"Reported in the wild on https://ilsweb.cincinnatilibrary.org/collection-analysis/current_collection-3d4a4b7/bib?_facet=bib_level_callnumber - the ""next page"" link links to https://127.0.0.1:8010/collection-analysis/current_collection-3d4a4b7/bib?_facet=bib_level_callnumber&_next=100 That installation uses `""base_url"": ""/collection-analysis/""` Weirdly all of the other links on that page - to facet results, sort orders, row permalinks etc - work fine. It's JUST the `next_url` one that is broken. Also broken in their JSON: https://ilsweb.cincinnatilibrary.org/collection-analysis/current_collection-3d4a4b7/bib.json?_size=1 returns ```json ""suggested_facets"": [], ""next"": ""1"", ""next_url"": ""https://127.0.0.1:8010/collection-analysis/current_collection-3d4a4b7/bib.json?_size=1&_next=1"", ""private"": false, ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1387/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 913865304,MDExOlB1bGxSZXF1ZXN0NjYzODM2OTY1,1368,DRAFT: A new plugin hook for dynamic metadata,2670795,brandonrobertz,closed,0,,,,,5,2021-06-07T18:56:00Z,2021-06-26T22:24:54Z,2021-06-26T22:24:54Z,CONTRIBUTOR,simonw/datasette/pulls/1368,"Note that this is a WORK IN PROGRESS! This PR adds the following plugin hook: get_metadata( datasette=self, key=key, database=database, table=table, fallback=fallback ) This gets called when we're building our metdata for the rest of the system to use. Datasette merges whatever the plugins return with any local metadata (from metadata.yml/yaml/json) allowing for a live-editable dynamic Datasette. __A major design consideration is this: should Datasette perform the metadata merge? Or should Datasette allow plugins to perform any modifications themselves?__ As a security precation, local meta is *not* overwritable by plugin hooks. The workflow for transitioning to live-meta would be to load the plugin with the full metadata.yaml and save. 
Then remove the parts of the metadata that you want to be able to change from the file. I have a WIP dynamic configuration plugin here, for reference: https://github.com/next-LI/datasette-live-config/",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1368/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 927789811,MDU6SXNzdWU5Mjc3ODk4MTE=,292,Add contributing documentation,9599,simonw,closed,0,,,,,0,2021-06-23T02:13:05Z,2021-06-25T17:53:51Z,2021-06-25T17:53:51Z,OWNER,,Like https://docs.datasette.io/en/latest/contributing.html (but simpler) - should cover how to run `black` and `flake8` and `mypy` and how to run the tests.,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/292/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 926777310,MDU6SXNzdWU5MjY3NzczMTA=,290,`db.query()` method (renamed `db.execute_returning_dicts()`),9599,simonw,closed,0,,,,,6,2021-06-22T03:03:54Z,2021-06-24T23:17:38Z,2021-06-24T22:54:43Z,OWNER,,"Most of this library deals with lists of Python dictionaries - `.insert_all()`, `.rows`, `.rows_where()`, `.search()`. The `db.execute()` method is the only thing that returns a `sqlite3` cursor. There is a clumsily named `db.execute_returning_dicts(sql)` method but it's not currently mentioned in the documentation. It needs a better name, and needs to be properly documented.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/290/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 927766296,MDU6SXNzdWU5Mjc3NjYyOTY=,291,Adopt flake8,9599,simonw,closed,0,,,,,2,2021-06-23T01:19:37Z,2021-06-24T17:50:27Z,2021-06-24T17:50:27Z,OWNER,,,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/291/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 920884085,MDU6SXNzdWU5MjA4ODQwODU=,1377,Mechanism for plugins to exclude certain paths from CSRF checks,9599,simonw,closed,0,,,,,3,2021-06-15T00:48:20Z,2021-06-23T22:51:33Z,2021-06-23T22:51:33Z,OWNER,,I need this for a plugin I'm building that offers a POST API.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1377/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 925677191,MDU6SXNzdWU5MjU2NzcxOTE=,289,Mypy fixes for rows_from_file(),857609,adamchainz,closed,0,,,,,3,2021-06-20T20:34:59Z,2021-06-22T18:44:36Z,2021-06-22T18:13:26Z,NONE,,"Following https://github.com/simonw/sqlite-utils/issues/279#issuecomment-864328927 You had two mypy errors. The first: > sqlite_utils/utils.py:157: error: Argument 1 to ""BufferedReader"" has incompatible type ""BinaryIO""; expected ""RawIOBase"" Looking at the `BufferedReader` docs, it seems to expect a `RawIOBase`, and this [has been copied into typeshed](https://github.com/python/typeshed/blob/9ec2f8712480c57353cea097a65d75a2c4ec1846/stdlib/io.pyi#L100). 
There may be scope to change how `BufferedReader` is documented and typed upstream, but for now it wouldn't be too bad to use a `typing.cast()`: ``` # Detect the format, then call this recursively buffered = io.BufferedReader( cast(io.RawIOBase, fp), # Undocumented BufferedReader support for BinaryIO buffer_size=4096, ) ``` The second error seems to be flagging a legitimate bug in your code: > sqlite_utils/utils.py:163: error: Argument 1 to ""decode"" of ""bytes"" has incompatible type ""Optional[str]""; expected ""str"" From your type hints, `encoding` may be `None`. In the CSV format block, you use `encoding or ""utf-8-sig""` to set a default, maybe that's desirable in this case too? ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/289/reactions"", ""total_count"": 1, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 1, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 913017577,MDU6SXNzdWU5MTMwMTc1Nzc=,1365,pathlib.Path breaks internal schema,25778,eyeseast,closed,0,,,,,1,2021-06-07T01:40:37Z,2021-06-21T15:57:39Z,2021-06-21T15:57:39Z,CONTRIBUTOR,,"Ran into an issue while trying to build a plugin to render GeoJSON. I'm using pytest's `tmp_path` fixture, which is a `pathlib.Path`, to get a temporary database path. I was getting a weird error involving writes, but I was doing reads. Turns out it's the internal database trying to insert a `Path` where it wants a string. My test looked like this: ```python @pytest.mark.asyncio async def test_render_feature_collection(tmp_path): database = tmp_path / ""test.db"" datasette = Datasette([database]) # this will break with a path await datasette.refresh_schemas() # build a url url = datasette.urls.table(database.stem, TABLE_NAME, format=""geojson"") response = await datasette.client.get(url) fc = response.json() assert 200 == response.status_code ``` I only ran into this while running tests, because passing in database paths from the CLI uses strings, but it's a weird error and probably something other people have run into. The fix is easy enough: Convert the path to a string and everything works. So this: ```python @pytest.mark.asyncio async def test_render_feature_collection(tmp_path): database = tmp_path / ""test.db"" datasette = Datasette([str(database)]) # this is fine now await datasette.refresh_schemas() ``` This could (probably, haven't tested) be fixed [here](https://github.com/simonw/datasette/blob/03ec71193b9545536898a4bc7493274fec48bdd7/datasette/app.py#L357) by calling `str(db.path)` or by doing that conversion earlier.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1365/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 914130834,MDExOlB1bGxSZXF1ZXN0NjY0MDcyMDQ2,1370,Ensure db.path is a string before trying to insert into internal database,25778,eyeseast,closed,0,,,,,2,2021-06-08T01:16:48Z,2021-06-21T15:57:39Z,2021-06-21T15:57:39Z,CONTRIBUTOR,simonw/datasette/pulls/1370,"Fixes #1365 This is the simplest possible fix, with a test that will fail without it. 
There are a bunch of places where `db.path` is getting converted to and from a `Path` type, so this fix errs on the side of calling `str(db.path)` right before it's inserted.",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1370/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 923697888,MDU6SXNzdWU5MjM2OTc4ODg=,278,"Support db as first parameter before subcommand, or as environment variable",601708,mcint,closed,0,,,,,3,2021-06-17T09:26:29Z,2021-06-20T22:39:57Z,2021-06-18T15:43:19Z,CONTRIBUTOR,,,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/278/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 925487946,MDU6SXNzdWU5MjU0ODc5NDY=,286,Add installation instructions,9599,simonw,closed,0,,,,,1,2021-06-19T23:55:36Z,2021-06-20T18:47:13Z,2021-06-20T18:47:13Z,OWNER,,"`pip install sqlite-utils`, `pipx install sqlite-utils` and `brew install sqlite-utils`",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/286/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 925544070,MDU6SXNzdWU5MjU1NDQwNzA=,287,Update rowid examples in the docs,9599,simonw,closed,0,,,,,0,2021-06-20T08:03:00Z,2021-06-20T18:26:21Z,2021-06-20T18:26:21Z,OWNER,,Changed in #284 - a couple of examples need updating on https://github.com/simonw/sqlite-utils/blob/3.10/docs/cli.rst.,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/287/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 925545468,MDU6SXNzdWU5MjU1NDU0Njg=,288,sqlite-utils memory blah.json --schema,9599,simonw,closed,0,,,,,0,2021-06-20T08:10:40Z,2021-06-20T18:26:21Z,2021-06-20T18:26:21Z,OWNER,,Like `--dump` but only outputs the schema - useful for understanding what you are about to run queries against.,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/288/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 925406964,MDU6SXNzdWU5MjU0MDY5NjQ=,1382,Datasette with Glitch - is it possible to use CSV with ISO-8859-1 encoding?,23701514,reichaves,closed,0,,,,,1,2021-06-19T14:37:20Z,2021-06-20T00:21:02Z,2021-06-20T00:20:06Z,NONE,,"Hi Please, I used Remix on Glitch to create a project on Glitch and uploaded a CSV But it's a CSV with ISO-8859-1 encoding (https://en.wikipedia.org/wiki/ISO/IEC_8859-1) Is it possible for me to change the encoding to correctly visualize the data? 
Example: https://emphasized-carpal-pillow.glitch.me/data/Emendas Best",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1382/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 923910375,MDExOlB1bGxSZXF1ZXN0NjcyNjIwMTgw,1378,"Update pytest-xdist requirement from <2.3,>=2.2.1 to >=2.2.1,<2.4",49699333,dependabot[bot],closed,0,,,,,1,2021-06-17T13:11:56Z,2021-06-20T00:17:07Z,2021-06-20T00:17:06Z,CONTRIBUTOR,simonw/datasette/pulls/1378,"Updates the requirements on [pytest-xdist](https://github.com/pytest-dev/pytest-xdist) to permit the latest version.
Changelog

Sourced from pytest-xdist's changelog.

pytest-xdist 2.3.0 (2021-06-16)

Deprecations and Removals

  • [#654](https://github.com/pytest-dev/pytest-xdist/issues/654) <https://github.com/pytest-dev/pytest-xdist/issues/654>_: Python 3.5 is no longer supported.

Features

  • [#646](https://github.com/pytest-dev/pytest-xdist/issues/646) <https://github.com/pytest-dev/pytest-xdist/issues/646>_: Add --numprocesses=logical flag, which automatically uses the number of logical CPUs available, instead of physical CPUs with auto.

    This is very useful for test suites which are not CPU-bound.

  • [#650](https://github.com/pytest-dev/pytest-xdist/issues/650) <https://github.com/pytest-dev/pytest-xdist/issues/650>_: Added new pytest_handlecrashitem hook to allow handling and rescheduling crashed items.

Bug Fixes

  • [#421](https://github.com/pytest-dev/pytest-xdist/issues/421) <https://github.com/pytest-dev/pytest-xdist/issues/421>_: Copy the parent process sys.path into local workers, to work around execnet's python -c adding the current directory to sys.path.

  • [#638](https://github.com/pytest-dev/pytest-xdist/issues/638) <https://github.com/pytest-dev/pytest-xdist/issues/638>_: Fix issue caused by changing the branch name of the pytest repository.

Trivial Changes

  • [#592](https://github.com/pytest-dev/pytest-xdist/issues/592) <https://github.com/pytest-dev/pytest-xdist/issues/592>_: Replace master with controller where ever possible.

  • [#643](https://github.com/pytest-dev/pytest-xdist/issues/643) <https://github.com/pytest-dev/pytest-xdist/issues/643>_: Use 'main' to refer to pytest default branch in tox env names.

pytest-xdist 2.2.1 (2021-02-09)

Bug Fixes

  • [#623](https://github.com/pytest-dev/pytest-xdist/issues/623) <https://github.com/pytest-dev/pytest-xdist/issues/623>_: Gracefully handle the pending deprecation of Node.fspath by using config.rootpath for topdir.

pytest-xdist 2.2.0 (2020-12-14)

Features

... (truncated)

Commits
  • fe57b39 fixup: add release title underline for 2.3.0
  • 26e7d95 prepare release 2.3.0
  • b02a6db Merge pull request #667 from graingert/fix-sys-path
  • b072267 add newsfile
  • 881cc48 Merge pull request #672 from pytest-dev/pre-commit-ci-update-config
  • 958679e [pre-commit.ci] pre-commit autoupdate
  • 7f07d50 Merge pull request #646 from kroeschl/numprocesses-logical
  • fb518de Merge pull request #669 from pytest-dev/pre-commit-ci-update-config
  • 0b14d92 [pre-commit.ci] pre-commit autoupdate
  • 02f971d swap docstring
  • Additional commits viewable in compare view

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`. [//]: # (dependabot-automerge-start) [//]: # (dependabot-automerge-end) ---
Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR: - `@dependabot rebase` will rebase this PR - `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it - `@dependabot merge` will merge this PR after your CI passes on it - `@dependabot squash and merge` will squash and merge this PR after your CI passes on it - `@dependabot cancel merge` will cancel a previously requested merge and block automerging - `@dependabot reopen` will reopen this PR if it is closed - `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually - `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1378/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 921878733,MDU6SXNzdWU5MjE4Nzg3MzM=,272,"Idea: import CSV to memory, run SQL, export in a single command",9599,simonw,closed,0,,,,,22,2021-06-15T23:02:48Z,2021-06-19T23:36:48Z,2021-06-18T15:05:03Z,OWNER,,"I quite often load a CSV file into a SQLite DB, then do stuff with it (like export results back out again as a new CSV) without any intention of keeping the CSV file around afterwards. What if `sqlite-utils` could do this for me? Something like this: sqlite-utils --csv blah.csv --csv baz.csv ""select * from blah join baz ..."" ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/272/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 925320167,MDU6SXNzdWU5MjUzMjAxNjc=,284,.transform(types=) turns rowid into a concrete column,9599,simonw,closed,0,,,,,5,2021-06-19T05:25:27Z,2021-06-19T15:28:30Z,2021-06-19T15:28:30Z,OWNER,,"Noticed this in the tests for `sqlite-utils memory` in #282 - is it possible to fix this? https://github.com/simonw/sqlite-utils/commit/ec5174ed40fa283cb06f25ee0c0136297ec313ae",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/284/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 925410305,MDU6SXNzdWU5MjU0MTAzMDU=,285,Introspection property for telling if a table is a rowid table,9599,simonw,closed,0,,,,,7,2021-06-19T14:56:16Z,2021-06-19T15:12:33Z,2021-06-19T15:12:33Z,OWNER,,_Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/284#issuecomment-864416785_,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/285/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 925319214,MDU6SXNzdWU5MjUzMTkyMTQ=,283,memory: Shouldn't detect types for JSON,9599,simonw,closed,0,,,,,1,2021-06-19T05:17:35Z,2021-06-19T14:52:48Z,2021-06-19T14:52:48Z,OWNER,,"https://github.com/simonw/sqlite-utils/blob/ec5174ed40fa283cb06f25ee0c0136297ec313ae/sqlite_utils/cli.py#L1244-L1251 This runs against JSON as well as CSV/TSV - which isn't necessary and In fact throws errors if there is any nested data.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/283/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 925305186,MDU6SXNzdWU5MjUzMDUxODY=,282,Automatic type detection for CSV data,9599,simonw,closed,0,,,,,4,2021-06-19T03:33:21Z,2021-06-19T04:42:03Z,2021-06-19T04:38:00Z,OWNER,,"I've touched on this before in #179 - but now that I've added `sqlite-utils memory` this is much more important - because unlike with `sqlite-utils insert` the in-memory command doesn't give you the opportunity to fix any types you imported from CSV, so queries like `select * from stdin where age > 3` are never going to work correctly against these temporary in-memory tables. 
Teaching `sqlite-utils insert` to detect types for columns in a CSV file would be a backwards-compatibility breaking change. Teaching `sqlite-utils memory` that trick would not be, since it hasn't been included in a release yet. It's a little inconsistent, but I'm going to have `sqlite-utils memory` default to detecting types while `sqlite-utils insert` does not. In each case this can be controlled by a new command-line option: cat file.csv | sqlite-utils memory - --no-detect-types To opt-in for `sqlite-utils insert`: cat file.csv | sqlite-utils insert blah.db blah - --detect-types I'll have short options for these too: `-n` for `--no-detect-types` and `-d` for `--detect-types`.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/282/reactions"", ""total_count"": 1, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 1, ""rocket"": 0, ""eyes"": 0}",,completed 709577625,MDU6SXNzdWU3MDk1Nzc2MjU=,179,sqlite-utils transform/insert --detect-types,9599,simonw,closed,0,,,,,4,2020-09-26T17:28:55Z,2021-06-19T03:36:16Z,2021-06-19T03:36:05Z,OWNER,,"Idea from https://github.com/simonw/datasette-edit-tables/issues/13 - provide Python utility methods and accompanying CLI options for detecting the likely types of TEXT columns. So if you have a text column that actually contained exclusively integer string values, it can let you know and let you run transform against it.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/179/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 924990677,MDU6SXNzdWU5MjQ5OTA2Nzc=,279,sqlite-utils memory should handle TSV and JSON in addition to CSV,9599,simonw,closed,0,,,,,7,2021-06-18T15:02:54Z,2021-06-19T03:11:59Z,2021-06-19T03:11:59Z,OWNER,,"- Use sniff to detect CSV or TSV (if `:tsv` or `:csv` was not specified) and delimiters Follow-on from #272",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/279/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 924992318,MDU6SXNzdWU5MjQ5OTIzMTg=,281,Mechanism for explicitly stating CSV or JSON or TSV for sqlite-utils memory,9599,simonw,closed,0,,,,,1,2021-06-18T15:04:53Z,2021-06-19T03:11:59Z,2021-06-19T03:11:59Z,OWNER,,"- Implement `filename.json:json` and `-:nl` and suchlike options for specifying the format rather than guessing it - see https://github.com/simonw/sqlite-utils/issues/272#issuecomment-861985944 Follows #272",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/281/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 924991194,MDU6SXNzdWU5MjQ5OTExOTQ=,280,Add --encoding option to sqlite-utils memory,9599,simonw,closed,0,,,,,0,2021-06-18T15:03:32Z,2021-06-18T15:29:46Z,2021-06-18T15:29:46Z,OWNER,,Follow-on from #272 - this will work like `--encoding` on `sqlite-utils insert` and will affect all CSV files processed by `sqlite-utils memory`.,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/280/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 
922099793,MDExOlB1bGxSZXF1ZXN0NjcxMDE0NzUx,273,sqlite-utils memory command for directly querying CSV/JSON data,9599,simonw,closed,0,,,,,8,2021-06-16T05:04:58Z,2021-06-18T15:01:17Z,2021-06-18T15:00:52Z,OWNER,simonw/sqlite-utils/pulls/273,"Refs #272. Initial implementation only does CSV data, still needs: - [x] Implement `--save` - [x] Add `--dump` to the documentation - [x] Add `--attach` example to the documentation - [x] Replace `:memory:` in documentation",140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/273/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 923602693,MDU6SXNzdWU5MjM2MDI2OTM=,276,support small help flag -h,601708,mcint,closed,0,,,,,0,2021-06-17T07:59:31Z,2021-06-18T14:56:59Z,2021-06-18T14:56:59Z,CONTRIBUTOR,,,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/276/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 923612361,MDExOlB1bGxSZXF1ZXN0NjcyMzU5NjA5,277,add -h support closes #276,601708,mcint,closed,0,,,,,2,2021-06-17T08:08:26Z,2021-06-18T14:56:59Z,2021-06-18T14:56:59Z,CONTRIBUTOR,simonw/sqlite-utils/pulls/277,This appears to be the [canonical solution](https://click.palletsprojects.com/en/7.x/documentation/#help-parameter-customization).,140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/277/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 268176505,MDU6SXNzdWUyNjgxNzY1MDU=,34,Support CSV export with a .csv extension,9599,simonw,closed,0,,,,,1,2017-10-24T20:34:43Z,2021-06-17T18:14:48Z,2018-05-28T20:45:34Z,OWNER,,"Maybe do this using streaming with multiple pagination SQL queries so we can support arbritrarily large exports. How would this work against a view which doesn’t have an obvious efficient pagination mechanism? Maybe limit views to up to 1000 exported records? 
Relates to #5 ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/34/reactions"", ""total_count"": 2, ""+1"": 2, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 323681589,MDU6SXNzdWUzMjM2ODE1ODk=,266,Export to CSV,9599,simonw,closed,0,,,,,27,2018-05-16T15:50:24Z,2021-06-17T18:14:24Z,2018-06-18T06:05:25Z,OWNER,,Datasette needs to be able to export data to CSV.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/266/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 333000163,MDU6SXNzdWUzMzMwMDAxNjM=,312,"HTML, CSV and JSON views should support ?_col=&_col=",9599,simonw,closed,0,,,,,1,2018-06-16T16:53:35Z,2021-06-17T18:14:24Z,2018-06-16T17:00:12Z,OWNER,,To support whitelisting columns to display.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/312/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 335141434,MDU6SXNzdWUzMzUxNDE0MzQ=,326,CSV should respect --cors and return cors headers,9599,simonw,closed,0,,,,,1,2018-06-24T00:44:07Z,2021-06-17T18:14:24Z,2018-06-24T00:59:45Z,OWNER,,Otherwise tools like Vega can't load data via CSV.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/326/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 395236066,MDU6SXNzdWUzOTUyMzYwNjY=,393,"CSV export in ""Advanced export"" pane doesn't respect query",1727065,ltrgoddard,closed,0,,,,,6,2019-01-02T12:39:41Z,2021-06-17T18:14:24Z,2019-01-03T02:44:10Z,NONE,,"It looks like there's an inconsistency when exporting to CSV via the the web interface. Say I'm looking at [songs released in 1989](https://fivethirtyeight.datasettes.com/fivethirtyeight-c300360/classic-rock%2Fclassic-rock-song-list?Release+Year__exact=1989) in the `classic-rock/classic-rock-song-list` table from the Five Thirty Eight data. The JSON and CSV export links at the top of the page both give me filtered data using `Release+Year__exact=1989` in the URL. In the `Advanced export` tab, though, the CSV option gives me the whole data set, while the JSON options preserve the query. 
It may be that this is intended behaviour related to the streaming CSV stuff [discussed here](https://github.com/simonw/datasette/issues/266), but if that's the case then I think it should be a little clearer.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/393/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 725184645,MDU6SXNzdWU3MjUxODQ2NDU=,1034,Better way of representing binary data in .csv output,9599,simonw,closed,0,,,6026070,0.51,19,2020-10-20T04:28:58Z,2021-06-17T18:13:21Z,2020-10-29T22:47:46Z,OWNER,,"I just noticed this: https://latest.datasette.io/fixtures/binary_data.csv ```csv rowid,data 1,b'\x15\x1c\x02\xc7\xad\x05\xfe' 2,b'\x15\x1c\x03\xc7\xad\x05\xfe' ``` There's no good way to represent binary data in a CSV file, but this seems like one of the more-bad options.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1034/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 503190241,MDU6SXNzdWU1MDMxOTAyNDE=,584,Codec error in some CSV exports,9599,simonw,closed,0,,,,,2,2019-10-07T01:15:34Z,2021-06-17T18:13:20Z,2019-10-18T05:23:16Z,OWNER,,"Got this exploring my Swarm checkins: ![448DBFC4-71F8-4846-83C0-BEA511B2157A](https://user-images.githubusercontent.com/9599/66279259-3af53480-e865-11e9-9651-04fd2d895392.jpeg) `/swarm/stickers.csv?stickerType=messageOnly&_size=max`",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/584/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 508100844,MDU6SXNzdWU1MDgxMDA4NDQ=,598,Character encoding bug with CSV export,46313,JoeGermuska,closed,0,,,,,1,2019-10-16T21:09:30Z,2021-06-17T18:13:20Z,2019-10-18T22:52:21Z,NONE,,"I was just poking around, and at [this URL](https://sql-murder-mystery.datasette.io/sql-murder-mystery/crime_scene_report.csv?_stream=on&type=arson&_size=max), I encountered this error: ``` 'latin-1' codec can't encode character '\u2019' in position 27: ordinal not in range(256) ``` ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/598/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 516748849,MDU6SXNzdWU1MTY3NDg4NDk=,612,CSV export is broken for tables with null foreign keys,9599,simonw,closed,0,,,,,2,2019-11-02T22:52:47Z,2021-06-17T18:13:20Z,2019-11-02T23:12:53Z,OWNER,,"Following on from #406 - this CSV export appears to be broken: https://14da705.datasette.io/fixtures/foreign_key_references.csv?_labels=on&_size=max ```csv pk,foreign_key_with_label,foreign_key_with_label_label,foreign_key_with_no_label,foreign_key_with_no_label_label 1,1,hello,1,1 2,, ``` That second row should have 5 values, but it only has 4.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/612/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 906385991,MDU6SXNzdWU5MDYzODU5OTE=,1349,CSV ?_stream=on redundantly calculates facets for every 
page,9599,simonw,closed,0,,,,,9,2021-05-29T06:11:23Z,2021-06-17T18:12:32Z,2021-06-01T15:52:53Z,OWNER,,"I'm trying to figure out why a full CSV export from https://covid-19.datasettes.com/covid/ny_times_us_counties runs unbearably slowly. It's because the streaming endpoint works by scrolling through every page, and it turns out every page calculates facets and suggested facets!",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1349/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 906993731,MDU6SXNzdWU5MDY5OTM3MzE=,1351,Get `?_trace=1` working with CSV and streaming CSVs,9599,simonw,closed,0,,,,,1,2021-05-31T03:02:15Z,2021-06-17T18:12:32Z,2021-06-01T15:50:09Z,OWNER,,"> I think it's worth getting `?_trace=1` to work with streaming CSV - this would have helped me spot this issue a long time ago. _Originally posted by @simonw in https://github.com/simonw/datasette/issues/1349#issuecomment-851133125_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1351/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 759695780,MDU6SXNzdWU3NTk2OTU3ODA=,1133,Option to omit header row in CSV export,9599,simonw,closed,0,,,,,2,2020-12-08T18:54:46Z,2021-06-17T18:12:31Z,2020-12-10T23:28:51Z,OWNER,,`?_header=off` - for symmetry with existing option `?_nl=on`.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1133/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 732685643,MDU6SXNzdWU3MzI2ODU2NDM=,1063,.csv should link to .blob downloads,9599,simonw,closed,0,,,6026070,0.51,3,2020-10-29T21:45:58Z,2021-06-17T18:12:30Z,2020-10-29T22:47:45Z,OWNER,,"- [x] Update `.csv` output to link to these things (and get that `xfail` test to pass) - ~~Add a `.csv?_blob_base64=1` argument that causes them to be output in base64 in the CSV~~ > Moving the CSV work to a separate ticket. _Originally posted by @simonw in https://github.com/simonw/datasette/pull/1061#issuecomment-719042601_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1063/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 922955697,MDU6SXNzdWU5MjI5NTU2OTc=,275,Enable code coverage,9599,simonw,closed,0,,,,,1,2021-06-16T18:33:49Z,2021-06-17T00:12:12Z,2021-06-17T00:12:12Z,OWNER,,"https://app.codecov.io/gh/simonw/sqlite-utils Same mechanism as Datasette. 
Need to copy across the token from that page and add an equivalent of this workflow: https://github.com/simonw/datasette/blob/main/.github/workflows/test-coverage.yml",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/275/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 922832113,MDU6SXNzdWU5MjI4MzIxMTM=,274,sqlite-utils dump my.db command,9599,simonw,closed,0,,,,,0,2021-06-16T16:30:14Z,2021-06-16T23:51:54Z,2021-06-16T23:51:54Z,OWNER,,"Inspired by the `--dump` mechanism I added to `sqlite-utils memory` here: https://github.com/simonw/sqlite-utils/issues/272#issuecomment-862018937 > Can use `.iterdump()` to implement this: https://docs.python.org/3/library/sqlite3.html#sqlite3.Connection.iterdump > > Maybe instead (or as-well-as) offer `--dump` which dumps out the SQL from that.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/274/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 919314806,MDU6SXNzdWU5MTkzMTQ4MDY=,270,Cannot set type JSON,4068,frafra,closed,0,,,,,4,2021-06-11T23:53:22Z,2021-06-16T17:34:49Z,2021-06-16T15:47:06Z,NONE,,"It would be great if the column type could be set to JSON. That would not be different from handling a regular string. It would be something like `repr(value)` and it would work with both JSON and CSV inputs, no matter if `value` is a real list or just a string representing a list.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/270/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 919250621,MDU6SXNzdWU5MTkyNTA2MjE=,269,bool type not supported,4068,frafra,closed,0,,,,,3,2021-06-11T22:00:36Z,2021-06-15T01:34:10Z,2021-06-15T01:34:10Z,NONE,,"Hi! Thank you for sharing this very nice tool :) It would be nice to have support for more types, like `bool`: it is not possible to convert to boolean at the moment. My suggestion would be to handle it as `bool(int(value))`, like csvkit does.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/269/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 919508498,MDU6SXNzdWU5MTk1MDg0OTg=,1375,JSON export dumps JSON fields as TEXT,4068,frafra,closed,0,,,,,2,2021-06-12T09:45:08Z,2021-06-14T09:41:59Z,2021-06-13T15:37:58Z,NONE,,"Hi! When a user tries to export data as JSON, I would expect to see the value of JSON columns represented as JSON instead of being rendered as a string. What do you think?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1375/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 916183914,MDExOlB1bGxSZXF1ZXN0NjY1ODkyMzEz,1373,"Update trustme requirement from <0.8,>=0.7 to >=0.7,<0.9",49699333,dependabot[bot],closed,0,,,,,1,2021-06-09T13:09:44Z,2021-06-13T15:38:47Z,2021-06-13T15:38:47Z,CONTRIBUTOR,simonw/datasette/pulls/1373,"Updates the requirements on [trustme](https://github.com/python-trio/trustme) to permit the latest version.
Commits
  • f9e13e0 Release 0.8.0
  • 4ae4435 Merge pull request #304 from python-trio/dependabot/add-v2-config-file
  • 8767902 Merge pull request #327 from graingert/test-on-py310
  • 6abfddd Merge branch 'master' of github.com:python-trio/trustme into test-on-py310
  • 51d3bdf Merge pull request #328 from tiran/correct_ku_eku
  • 034fb3a retry codecov more
  • 53e121d try codecov harder
  • c1e7923 require codecov in ci
  • e3ac2d6 Update tests/test_trustme.py
  • 496dca6 close the wrapped sockets to prevent Unraisable ResourceWarnings
  • Additional commits viewable in compare view

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`. [//]: # (dependabot-automerge-start) [//]: # (dependabot-automerge-end) ---
Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR: - `@dependabot rebase` will rebase this PR - `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it - `@dependabot merge` will merge this PR after your CI passes on it - `@dependabot squash and merge` will squash and merge this PR after your CI passes on it - `@dependabot cancel merge` will cancel a previously requested merge and block automerging - `@dependabot reopen` will reopen this PR if it is closed - `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually - `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1373/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 918730335,MDExOlB1bGxSZXF1ZXN0NjY4MTI5NDQx,1374,Bump black from 21.5b2 to 21.6b0,49699333,dependabot[bot],closed,0,,,,,1,2021-06-11T13:07:39Z,2021-06-13T15:33:23Z,2021-06-13T15:33:22Z,CONTRIBUTOR,simonw/datasette/pulls/1374,"Bumps [black](https://github.com/psf/black) from 21.5b2 to 21.6b0.
Release notes

Sourced from black's releases.

21.6b0

Black

  • Fix failure caused by fmt: skip and indentation (#2281)
  • Account for += assignment when deciding whether to split string (#2312)
  • Correct max string length calculation when there are string operators (#2292)
  • Fixed option usage when using the --code flag (#2259)
  • Do not call uvloop.install() when Black is used as a library (#2303)
  • Added --required-version option to require a specific version to be running (#2300)
  • Fix incorrect custom breakpoint indices when string group contains fake f-strings (#2311)
  • Fix regression where R prefixes would be lowercased for docstrings (#2285)
  • Fix handling of named escapes (\N{...}) when --experimental-string-processing is used (#2319)
Changelog

Sourced from black's changelog.

21.6b0

Black

  • Fix failure caused by fmt: skip and indentation (#2281)
  • Account for += assignment when deciding whether to split string (#2312)
  • Correct max string length calculation when there are string operators (#2292)
  • Fixed option usage when using the --code flag (#2259)
  • Do not call uvloop.install() when Black is used as a library (#2303)
  • Added --required-version option to require a specific version to be running (#2300)
  • Fix incorrect custom breakpoint indices when string group contains fake f-strings (#2311)
  • Fix regression where R prefixes would be lowercased for docstrings (#2285)
  • Fix handling of named escapes (\N{...}) when --experimental-string-processing is used (#2319)
Commits

[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=black&package-manager=pip&previous-version=21.5b2&new-version=21.6b0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores) Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`. [//]: # (dependabot-automerge-start) [//]: # (dependabot-automerge-end) ---
Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR: - `@dependabot rebase` will rebase this PR - `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it - `@dependabot merge` will merge this PR after your CI passes on it - `@dependabot squash and merge` will squash and merge this PR after your CI passes on it - `@dependabot cancel merge` will cancel a previously requested merge and block automerging - `@dependabot reopen` will reopen this PR if it is closed - `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually - `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1374/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 919733213,MDU6SXNzdWU5MTk3MzMyMTM=,33,Searching for whitespace throws an error,9599,simonw,closed,0,,,,,0,2021-06-13T06:57:57Z,2021-06-13T14:36:39Z,2021-06-13T14:36:39Z,MEMBER,,"https://datasette.io/-/beta?q=+ returns a 500 > fts5: syntax error near """"",197431109,dogsheep-beta,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-beta/issues/33/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 919702451,MDU6SXNzdWU5MTk3MDI0NTE=,271,table.upsert_all() fails if input has a single column that should be a primary key,9599,simonw,closed,0,,,,,1,2021-06-13T02:50:27Z,2021-06-13T02:57:29Z,2021-06-13T02:57:29Z,OWNER,,"This works: ```pycon >>> db['foo'].insert_all([{""name"": ""hello""}], pk=""name"")
``` But this fails: ``` >>> db['foo3'].upsert_all([{""name"": ""hello""}], pk=""name"") Traceback (most recent call last): File """", line 1, in File ""/Users/simon/.local/share/virtualenvs/datasette.io-TK86ygSO/lib/python3.9/site-packages/sqlite_utils/db.py"", line 1837, in upsert_all return self.insert_all( File ""/Users/simon/.local/share/virtualenvs/datasette.io-TK86ygSO/lib/python3.9/site-packages/sqlite_utils/db.py"", line 1778, in insert_all self.insert_chunk( File ""/Users/simon/.local/share/virtualenvs/datasette.io-TK86ygSO/lib/python3.9/site-packages/sqlite_utils/db.py"", line 1588, in insert_chunk result = self.db.execute(query, params) File ""/Users/simon/.local/share/virtualenvs/datasette.io-TK86ygSO/lib/python3.9/site-packages/sqlite_utils/db.py"", line 213, in execute return self.conn.execute(sql, parameters) sqlite3.OperationalError: near ""WHERE"": syntax error ``` With the debugger: ``` >>> import pdb; pdb.pm() > /Users/simon/.local/share/virtualenvs/datasette.io-TK86ygSO/lib/python3.9/site-packages/sqlite_utils/db.py(213)execute() -> return self.conn.execute(sql, parameters) (Pdb) print(sql, parameters) UPDATE [foo3] SET WHERE [name] = ? ['hello'] ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/271/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 919181559,MDU6SXNzdWU5MTkxODE1NTk=,268,db.schema property and sqlite-utils schema command,9599,simonw,closed,0,,,,,4,2021-06-11T20:25:47Z,2021-06-11T20:51:56Z,2021-06-11T20:51:56Z,OWNER,,"`table.schema` returns the schema for a table. `db.schema` should return the schema for the whole databes. Can do this using `select sql from sqlite_master where sql is not null`: https://latest.datasette.io/fixtures?sql=select+sql+from+sqlite_master+where+sql+is+not+null",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/268/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 915455228,MDU6SXNzdWU5MTU0NTUyMjg=,1371,Menu plugin hooks should include the request,9599,simonw,closed,0,,,,,1,2021-06-08T20:23:35Z,2021-06-10T04:46:01Z,2021-06-10T04:46:01Z,OWNER,,"https://docs.datasette.io/en/stable/plugin_hooks.html#menu-links-datasette-actor - `menu_links(datasette, actor)` - `table_actions(datasette, actor, database, table)` - `database_actions(datasette, actor, database)` All three of these should optionally also accept the `request` object. 
This would allow them to take into account additional cookies, `Authorization` headers or the current request URL (including the domain/subdomain) - or even access `request.scope` for extra context that might have been passed down from ASGI middleware.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1371/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 913823889,MDU6SXNzdWU5MTM4MjM4ODk=,1367,Navigation menu display bug,9599,simonw,closed,0,,,,,1,2021-06-07T18:18:08Z,2021-06-07T18:24:19Z,2021-06-07T18:24:19Z,OWNER,,"With Datasette 0.57 the navigation menu looks like this: ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1367/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 912959264,MDU6SXNzdWU5MTI5NTkyNjQ=,1364,Don't truncate columns on the list of databases,9599,simonw,closed,0,,,,,0,2021-06-06T22:01:56Z,2021-06-06T22:07:50Z,2021-06-06T22:07:50Z,OWNER,,"https://covid-19.datasettes.com/covid currently truncates at 9 database columns: Django SQL Dashboard showed me that this is a bad idea - having the full list of columns is actually really useful documentation for crafting custom SQL queries.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1364/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 325958506,MDU6SXNzdWUzMjU5NTg1MDY=,283,Support cross-database joins,9599,simonw,closed,0,,,,,26,2018-05-24T04:18:39Z,2021-06-06T09:40:18Z,2021-02-18T22:16:46Z,OWNER,,"SQLite has the ability to attach multiple databases to a single connection and then run joins across multiple databases. Since Datasette supports more than one database, this would make a pretty neat feature.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/283/reactions"", ""total_count"": 2, ""+1"": 2, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 912485040,MDU6SXNzdWU5MTI0ODUwNDA=,1361,Intermittent CI failure: restore_working_directory FileNotFoundError,9599,simonw,closed,0,,,,,4,2021-06-05T22:48:13Z,2021-06-05T23:16:24Z,2021-06-05T23:16:24Z,OWNER,,"e.g. 
in https://github.com/simonw/datasette/runs/2754772233 - this is an intermittent error: ``` __________ ERROR at setup of test_hook_register_routes_render_message __________ [gw0] linux -- Python 3.8.10 /opt/hostedtoolcache/Python/3.8.10/x64/bin/python tmpdir = local('/tmp/pytest-of-runner/pytest-0/popen-gw0/test_hook_register_routes_rend0') request = > @pytest.fixture def restore_working_directory(tmpdir, request): > previous_cwd = os.getcwd() E FileNotFoundError: [Errno 2] No such file or directory ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1361/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 912464443,MDU6SXNzdWU5MTI0NjQ0NDM=,1360,"Security flaw, to be fixed in 0.56.1 and 0.57",9599,simonw,closed,0,,,,,2,2021-06-05T21:53:51Z,2021-06-05T22:23:23Z,2021-06-05T22:22:06Z,OWNER,,"See security advisory here for details: https://github.com/simonw/datasette/security/advisories/GHSA-xw7c-jx9m-xh5g - the `?_trace=1` debugging option was not correctly escaping its JSON output, resulting in a [reflected cross-site scripting](https://owasp.org/www-community/attacks/xss/#reflected-xss-attacks) vulnerability.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1360/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 912418094,MDU6SXNzdWU5MTI0MTgwOTQ=,1358,Release Datasette 0.57,9599,simonw,closed,0,,,,,3,2021-06-05T19:56:13Z,2021-06-05T22:20:07Z,2021-06-05T22:20:07Z,OWNER,,"Need release notes. Changes are here: https://github.com/simonw/datasette/compare/0.56...368aa5f1b16ca35f82d90ff747023b9a2bfa27c1 Partial release notes already exist for the two alphas, https://github.com/simonw/datasette/releases/tag/0.57a0 and https://github.com/simonw/datasette/releases/tag/0.57a1",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1358/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 912419349,MDU6SXNzdWU5MTI0MTkzNDk=,1359,`?_trace=1` should only be available with a new `trace_debug` setting,9599,simonw,closed,0,,,,,0,2021-06-05T19:59:27Z,2021-06-05T20:18:46Z,2021-06-05T20:18:46Z,OWNER,,Just like template debug mode is controlled by this off-by-default setting: https://github.com/simonw/datasette/blob/368aa5f1b16ca35f82d90ff747023b9a2bfa27c1/datasette/app.py#L160-L164,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1359/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 849582643,MDExOlB1bGxSZXF1ZXN0NjA4MzM0MDk2,1291,Update docs: explain allow_download setting,5413548,louispotok,closed,0,,,,,2,2021-04-03T05:28:33Z,2021-06-05T19:48:51Z,2021-06-05T19:48:51Z,CONTRIBUTOR,simonw/datasette/pulls/1291,"This fixes one possible source of confusion seen in #502 and clarifies when database downloads will be shown and allowed.",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1291/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 813899472,MDU6SXNzdWU4MTM4OTk0NzI=,1238,Custom pages don't 
work with base_url setting,79913,tsibley,closed,0,,,,,9,2021-02-22T21:58:58Z,2021-06-05T18:59:55Z,2021-06-05T18:59:55Z,NONE,,"It seems that custom pages aren't routing properly when the `base_url` setting is used. To reproduce, with Datasette 0.55. Create a `templates/pages/custom.html` with some text. ``` mkdir -p templates/pages/ echo ""Hello, world!"" > templates/pages/custom.html ``` Start Datasette. ``` datasette --template-dir templates/ ``` Visit http://localhost:8001/custom and see ""Hello, world!"". Start Datasette with a `base_url`. ``` datasette --template-dir templates/ --setting base_url /prefix/ ``` Visit http://localhost:8001/prefix/custom and see a ""Database not found: custom"" 404. Note that like all routes, http://localhost:8001/custom still works when run with `base_url`. ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1238/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 912394511,MDExOlB1bGxSZXF1ZXN0NjYyNTU3MjQw,1357,Make custom pages compatible with base_url setting,9599,simonw,closed,0,,,,,1,2021-06-05T18:54:39Z,2021-06-05T18:59:54Z,2021-06-05T18:59:54Z,OWNER,simonw/datasette/pulls/1357,Refs #1238.,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1357/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 907642546,MDU6SXNzdWU5MDc2NDI1NDY=,264,"Supporting additional output formats, like GeoJSON",25778,eyeseast,closed,0,,,,,3,2021-05-31T18:03:32Z,2021-06-03T05:12:21Z,2021-06-03T05:12:21Z,CONTRIBUTOR,,"I have a project going where it would be useful to do some spatial processing in SQLite (instead of large files) and then output GeoJSON. So my workflow would be something like this: 1. Read Shapefiles, GeoJSON, CSVs into a SQLite database 2. Join, filter, prune as needed 3. Export GeoJSON for just the stuff I need at that moment, while still having a database of things that will be useful later I'm wondering if this is worth adding to SQLite-utils itself (GeoJSON, at least), or if it's better to make a counterpart to the ecosystem of `*-to-sqlite` tools, say a suite of `sqlite-to-*` things. Or would it be crazy to have a plugin system?",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/264/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 906356331,MDU6SXNzdWU5MDYzNTYzMzE=,263,`sqlite-utils indexes` command,9599,simonw,closed,0,,,,,6,2021-05-29T04:52:34Z,2021-06-03T04:34:38Z,2021-06-03T04:34:38Z,OWNER,,"While working on #260 I realized there's no command to show indexes in a database, even though there is one for showing tables and one for triggers. 
I should implement #261 first.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/263/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 906345899,MDU6SXNzdWU5MDYzNDU4OTk=,261,`table.xindexes` using `PRAGMA index_xinfo(table)`,9599,simonw,closed,0,,,,,5,2021-05-29T04:23:48Z,2021-06-03T03:54:14Z,2021-06-03T03:51:32Z,OWNER,,"> `PRAGMA index_xinfo(table)` DOES return that data: > ``` > (Pdb) [c[0] for c in fresh_db.execute(""PRAGMA > index_xinfo('idx_dogs_age_name')"").description] > ['seqno', 'cid', 'name', 'desc', 'coll', 'key'] > (Pdb) fresh_db.execute(""PRAGMA index_xinfo('idx_dogs_age_name')"").fetchall() > [(0, 2, 'age', 1, 'BINARY', 1), (1, 0, 'name', 0, 'BINARY', 1), (2, -1, None, 0, 'BINARY', 0)] > ``` > See https://sqlite.org/pragma.html#pragma_index_xinfo > > Example output: https://covid-19.datasettes.com/covid?sql=select+*+from+pragma_index_xinfo%28%27idx_ny_times_us_counties_date%27%29 _Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/260#issuecomment-850766552_",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/261/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 656959584,MDU6SXNzdWU2NTY5NTk1ODQ=,893,pip3 install datasette not serving static on linuxbrew.,44167,zodman,closed,0,,,,,1,2020-07-14T23:33:38Z,2021-06-02T04:29:56Z,2021-06-02T04:29:56Z,NONE,,"*This error wasn't thrown* ``` Traceback (most recent call last): File ""/home/linuxbrew/.linuxbrew/opt/python@3.8/lib/python3.8/site-packages/datasette/utils/asgi.py"", line 289, in inner_static full_path.relative_to(root_path) File ""/home/linuxbrew/.linuxbrew/opt/python@3.8/lib/python3.8/pathlib.py"", line 904, in relative_to raise ValueError(""{!r} does not start with {!r}"" ValueError: '/home/linuxbrew/.linuxbrew/lib/python3.8/site-packages/datasette/static/app.css' does not start with '/home/linuxbrew/.linuxbrew/opt/python@3.8/lib/python3.8/site-packages/datasette/static' ``` Linuxbrew install python@3.8 with symbolic links when You call the full_path.relative_to(root_path) throw ValueError. This happened when you install from pip3 when you install with python3 setup.py develop , works good. Well at the end the static wasn't serving. ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/893/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 756818250,MDU6SXNzdWU3NTY4MTgyNTA=,1127,Make the custom SQL query text box larger or resizable,596279,zaneselvans,closed,0,,,,,1,2020-12-04T05:37:11Z,2021-06-02T04:29:06Z,2021-06-02T04:28:55Z,NONE,,"The text entry field for custom SQL queries is too small to display a moderately complex query, especially when it's been formatted. 
Would it be easy to make the textbox resizable by the user rather than having a fixed height?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1127/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 864979486,MDExOlB1bGxSZXF1ZXN0NjIxMTE3OTc4,1306,Avoid error sorting by relationships if related tables are not allowed,416374,gfrmin,closed,0,,,,,4,2021-04-22T13:53:17Z,2021-06-02T04:27:00Z,2021-06-02T04:25:28Z,CONTRIBUTOR,simonw/datasette/pulls/1306,Refs #1305,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1306/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 864969683,MDU6SXNzdWU4NjQ5Njk2ODM=,1305,Index view crashes when any database table is not accessible to actor,416374,gfrmin,closed,0,,,,,0,2021-04-22T13:44:22Z,2021-06-02T04:26:29Z,2021-06-02T04:26:29Z,CONTRIBUTOR,,"Because of https://github.com/simonw/datasette/blob/main/datasette/views/index.py#L63, the ```tables``` dict built does not include invisible tables; however, if https://github.com/simonw/datasette/blob/main/datasette/views/index.py#L80 is reached (because table_counts was not successfully initialized, e.g. due to a very large database) then as db.get_all_foreign_keys() returns ALL tables, a KeyError will be raised. This error can be recreated with the fixtures.db if any table is hidden, e.g. by adding something like ```""foreign_key_references"": { ""allow"": {} }``` to fixtures-metadata.json and deleting ```or not table_counts``` from https://github.com/simonw/datasette/blob/main/datasette/views/index.py#L77. I'm not sure how to fix this error; perhaps by testing if the table is in the aforementions ```tables``` dict.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1305/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 904537568,MDExOlB1bGxSZXF1ZXN0NjU1Njg0NDc3,1346,Re-display user's query with an error message if an error occurs,9599,simonw,closed,0,,,,,3,2021-05-28T02:04:20Z,2021-06-02T03:46:21Z,2021-06-02T03:46:21Z,OWNER,simonw/datasette/pulls/1346,Refs #619,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1346/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 828811618,MDU6SXNzdWU4Mjg4MTE2MTg=,1257,Table names containing single quotes break things,9599,simonw,closed,0,,,,,2,2021-03-11T06:29:38Z,2021-06-02T03:28:29Z,2021-06-02T03:28:29Z,OWNER,,"e.g. 
I found a table called `Yesterday's ELRs by County` It threw an error inside the `detect_fts()` function attempting to run this SQL query: ```sql select name from sqlite_master where rootpage = 0 and ( sql like '%VIRTUAL TABLE%USING FTS%content=""Yesterday's ELRs by County""%' or sql like '%VIRTUAL TABLE%USING FTS%content=[Yesterday's ELRs by County]%' or ( tbl_name = ""Yesterday's ELRs by County"" and sql like '%VIRTUAL TABLE%USING FTS%' ) ) ``` Here's the code at fault: https://github.com/simonw/datasette/blob/640ac7071b73111ba4423812cd683756e0e1936b/datasette/utils/__init__.py#L534-L548",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1257/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 800669347,MDU6SXNzdWU4MDA2NjkzNDc=,1216,"/-/databases should reflect connection order, not alphabetical order",9599,simonw,closed,0,,,,,1,2021-02-03T20:20:23Z,2021-06-02T03:10:19Z,2021-06-02T03:10:19Z,OWNER,,"The order in which databases are attached to Datasette matters - it affects the homepage, and it's beginning to influence how certain plugins work (see https://github.com/simonw/datasette-tiles/issues/8). Two years ago in cccea85be6aaaeadb31f3b588ec7f732628815f5 I made `/-/databases` return things in alphabetical order, to fix a test failure in Python 3.5. Python 3.5 is no longer supported, so this is no longer necessary - and this behaviour should now be treated as a bug.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1216/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 908276134,MDExOlB1bGxSZXF1ZXN0NjU4OTkxNDA0,1352,Bump black from 21.5b1 to 21.5b2,49699333,dependabot[bot],closed,0,,,,,1,2021-06-01T13:08:52Z,2021-06-02T02:56:45Z,2021-06-02T02:56:44Z,CONTRIBUTOR,simonw/datasette/pulls/1352,"Bumps [black](https://github.com/psf/black) from 21.5b1 to 21.5b2.
Release notes

Sourced from black's releases.

21.5b2

Black

  • A space is no longer inserted into empty docstrings (#2249)
  • Fix handling of .gitignore files containing non-ASCII characters on Windows (#2229)
  • Respect .gitignore files in all levels, not only root/.gitignore file (apply .gitignore rules like git does) (#2225)
  • Restored compatibility with Click 8.0 on Python 3.6 when LANG=C used (#2227)
  • Add extra uvloop install + import support if in python env (#2258)
  • Fix --experimental-string-processing crash when matching parens are not found (#2283)
  • Make sure to split lines that start with a string operator (#2286)
  • Fix regular expression that black uses to identify f-expressions (#2287)

Blackd

  • Add a lower bound for the aiohttp-cors dependency. Only 0.4.0 or higher is supported. (#2231)

Packaging

  • Release self-contained x86_64 MacOS binaries as part of the GitHub release pipeline (#2198)
  • Always build binaries with the latest available Python (#2260)

Documentation

  • Add discussion of magic comments to FAQ page (#2272)
  • --experimental-string-processing will be enabled by default in the future (#2273)
  • Fix typos discovered by codespell (#2228)
  • Fix Vim plugin installation instructions. (#2235)
  • Add new Frequently Asked Questions page (#2247)
  • Fix encoding + symlink issues preventing proper build on Windows (#2262)
Changelog

Sourced from black's changelog.

21.5b2

Black

  • A space is no longer inserted into empty docstrings (#2249)
  • Fix handling of .gitignore files containing non-ASCII characters on Windows (#2229)
  • Respect .gitignore files in all levels, not only root/.gitignore file (apply .gitignore rules like git does) (#2225)
  • Restored compatibility with Click 8.0 on Python 3.6 when LANG=C used (#2227)
  • Add extra uvloop install + import support if in python env (#2258)
  • Fix --experimental-string-processing crash when matching parens are not found (#2283)
  • Make sure to split lines that start with a string operator (#2286)
  • Fix regular expression that black uses to identify f-expressions (#2287)

Blackd

  • Add a lower bound for the aiohttp-cors dependency. Only 0.4.0 or higher is supported. (#2231)

Integrations

  • The official Black action now supports choosing what version to use, and supports the major 3 OSes. (#1940)

Packaging

  • Release self-contained x86_64 MacOS binaries as part of the GitHub release pipeline (#2198)
  • Always build binaries with the latest available Python (#2260)

Documentation

  • Add discussion of magic comments to FAQ page (#2272)
  • --experimental-string-processing will be enabled by default in the future (#2273)
  • Fix typos discovered by codespell (#2228)
  • Fix Vim plugin installation instructions. (#2235)
  • Add new Frequently Asked Questions page (#2247)
  • Fix encoding + symlink issues preventing proper build on Windows (#2262)
Commits

[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=black&package-manager=pip&previous-version=21.5b1&new-version=21.5b2)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores) Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`. [//]: # (dependabot-automerge-start) [//]: # (dependabot-automerge-end) ---
Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR: - `@dependabot rebase` will rebase this PR - `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it - `@dependabot merge` will merge this PR after your CI passes on it - `@dependabot squash and merge` will squash and merge this PR after your CI passes on it - `@dependabot cancel merge` will cancel a previously requested merge and block automerging - `@dependabot reopen` will reopen this PR if it is closed - `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually - `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1352/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 323671577,MDU6SXNzdWUzMjM2NzE1Nzc=,263,Facets should not execute for ?shape=array|object,9599,simonw,closed,0,,,,,3,2018-05-16T15:26:13Z,2021-06-02T02:54:34Z,2021-06-02T02:54:34Z,OWNER,,Split off from #255 - there's no point executing the facet SQL for the `?_shape=array` and `?_shape=object` API responses.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/263/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 906977719,MDU6SXNzdWU5MDY5Nzc3MTk=,1350,?_nofacets=1 query string argument for disabling facets and suggested facets,9599,simonw,closed,0,,,,,2,2021-05-31T02:22:29Z,2021-06-01T16:19:38Z,2021-05-31T02:39:18Z,OWNER,,"This is needed as an internal option for #1349. `datasette-graphql` can benefit from this too - maybe can even use it so that if you pass `?_shape=array` it gets automatically added, fixing #263.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1350/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 908446997,MDU6SXNzdWU5MDg0NDY5OTc=,1353,?_nocount=1 for opting out of table counts,9599,simonw,closed,0,,,,,2,2021-06-01T15:53:27Z,2021-06-01T16:18:54Z,2021-06-01T16:17:04Z,OWNER,,"Running a trace against a CSV streaming export with the new `_trace=1` feature from #1351 shows that the following code is executing a `select count(*) from table` for every page of results returned: https://github.com/simonw/datasette/blob/d1d06ace49606da790a765689b4fbffa4c6deecb/datasette/views/table.py#L700-L705 This is inefficient - a new `?_nocount=1` option would let us disable this count in the same way as #1349: https://github.com/simonw/datasette/blob/d1d06ace49606da790a765689b4fbffa4c6deecb/datasette/views/base.py#L264-L276 ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1353/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 908465747,MDU6SXNzdWU5MDg0NjU3NDc=,1354,Update help in tests for latest Click,9599,simonw,closed,0,,,,,1,2021-06-01T16:14:31Z,2021-06-01T16:17:04Z,2021-06-01T16:17:04Z,OWNER,,"Now that Uvicorn 0.14 is out with an unpinned Click dependency - https://github.com/encode/uvicorn/pull/1033 - our test suite runs against Click 8.0 - which subtly changes the output of `--help` causing test failures: https://github.com/simonw/datasette/runs/2720383031?check_suite_focus=true ``` def test_help_includes(name, filename): expected = (docs_path / filename).read_text() runner = CliRunner() result = runner.invoke(cli, name.split() + [""--help""], terminal_width=88) actual = f""$ datasette {name} --help\n\n{result.output}"" # actual has ""Usage: cli package [OPTIONS] FILES"" # because it doesn't know that cli will be aliased to datasette expected = expected.replace(""Usage: datasette"", ""Usage: cli"") > assert expected == actual E AssertionError: assert '$ datasette ...e and exit.\n' == '$ datasette ...e and exit.\n' E Skipping 848 identical leading characters in diff, use -v to show E nt_id xxx E + E 
--version-note TEXT Additional note to show on /-/versions E --secret TEXT Secret used for signing secure values, such as signed E cookies E + E --title TEXT Title for metadata ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1354/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 904071938,MDU6SXNzdWU5MDQwNzE5Mzg=,1345,?_nocol= does not interact well with default facets,9599,simonw,closed,0,,,,,7,2021-05-27T18:39:55Z,2021-05-31T02:40:44Z,2021-05-31T02:31:21Z,OWNER,,"Clicking ""Hide this column"" on `fips` on https://covid-19.datasettes.com/covid/ny_times_us_counties shows this error: https://covid-19.datasettes.com/covid/ny_times_us_counties?_nocol=fips > ## Invalid SQL > no such column: fips The reason is that https://covid-19.datasettes.com/-/metadata sets up the following: ```json ""ny_times_us_counties"": { ""sort_desc"": ""date"", ""facets"": [ ""state"", ""county"", ""fips"" ], ``` It's setting `fips` as a default facet, which breaks if you attempt to remove the column using `?_nocol`.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1345/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 838148087,MDU6SXNzdWU4MzgxNDgwODc=,250,Handle byte order marks (BOMs) in CSV files,9599,simonw,closed,0,,,,,3,2021-03-22T22:13:18Z,2021-05-29T05:34:21Z,2021-05-29T05:34:21Z,OWNER,,I often find `sqlite-utils insert ... --csv` creates a first column with a weird character at the start of it - which it turns out is the UTF-8 BOM. Fix that.,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/250/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 906355849,MDExOlB1bGxSZXF1ZXN0NjU3MzczNzI2,262,Ability to add descending order indexes,9599,simonw,closed,0,,,,,0,2021-05-29T04:51:04Z,2021-05-29T05:01:42Z,2021-05-29T05:01:39Z,OWNER,simonw/sqlite-utils/pulls/262,Refs #260,140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/262/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 906330187,MDU6SXNzdWU5MDYzMzAxODc=,260,Support creating descending order indexes,9599,simonw,closed,0,,,,,12,2021-05-29T03:42:59Z,2021-05-29T05:01:39Z,2021-05-29T05:01:39Z,OWNER,,"SQLite lets you create indexes in reverse order, which can have a surprisingly big impact on performance, see https://github.com/simonw/covid-19-datasette/issues/27 I tried doing this using `sqlite-utils` like so, but it's didn't work: ```python db[""ny_times_us_counties""].create_index([""date desc""]) ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/260/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 858501079,MDU6SXNzdWU4NTg1MDEwNzk=,255,transform --help should tell you the available types,9599,simonw,closed,0,,,,,0,2021-04-15T05:24:48Z,2021-05-29T03:55:52Z,2021-05-29T03:55:52Z,OWNER,,"``` Usage: sqlite-utils transform [OPTIONS] PATH TABLE Transform a table beyond the capabilities of ALTER TABLE Options: --type 
... Change column type to X ``` This should specify that the possible types are 'INTEGER', 'TEXT', 'FLOAT', 'BLOB'.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/255/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 904582277,MDExOlB1bGxSZXF1ZXN0NjU1NzI2Mzg3,1347,Test docker platform blair only,10801138,blairdrummond,closed,0,,,,,0,2021-05-28T02:47:09Z,2021-05-28T02:47:28Z,2021-05-28T02:47:28Z,CONTRIBUTOR,simonw/datasette/pulls/1347,,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1347/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 881219362,MDExOlB1bGxSZXF1ZXN0NjM0ODIxMDY1,1319,Add Docker multi-arch support with Buildx,10801138,blairdrummond,closed,0,,,,,5,2021-05-08T19:35:03Z,2021-05-27T16:49:24Z,2021-05-27T16:49:24Z,CONTRIBUTOR,simonw/datasette/pulls/1319,"This adds Docker support to extra CPU architectures (like arm) using [Docker's Buildx action](https://github.com/marketplace/actions/docker-setup-buildx) You can see [what that looks like on Dockerhub](https://hub.docker.com/r/blairdrummond/datasette/tags?page=1&ordering=last_updated) And how it lets Datasette run on a Raspberry Pi (top is my dockerhub, bottom is upstream) ![Screenshot from 2021-05-08 15-32-25](https://user-images.githubusercontent.com/10801138/117551210-a17a9f80-b012-11eb-966b-10e1590dd4a9.png) The workflow log [here](https://github.com/blairdrummond/datasette/runs/2535743398?check_suite_focus=true) (I subbed `blairdrummond` for datasetteproject in my branch) ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1319/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 903978133,MDU6SXNzdWU5MDM5NzgxMzM=,1343,Figure out how to publish alpha/beta releases to Docker Hub,9599,simonw,closed,0,,,,,4,2021-05-27T16:42:17Z,2021-05-27T16:46:37Z,2021-05-27T16:45:41Z,OWNER,,"> It looks like all I need to do to ship an alpha version to Docker Hub is NOT point the `latest` tag at it after it goes live: https://github.com/simonw/datasette/blob/1a8972f9c012cd22b088c6b70661a9c3d3847853/.github/workflows/publish.yml#L75-L77 _Originally posted by @simonw in https://github.com/simonw/datasette/issues/1319#issuecomment-849780481_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1343/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 898904402,MDU6SXNzdWU4OTg5MDQ0MDI=,1337,"""More"" link for facets that shows _facet_size=max results",9599,simonw,closed,0,,,,,7,2021-05-23T00:08:51Z,2021-05-27T16:14:14Z,2021-05-27T16:01:03Z,OWNER,,"_Original title: ""More"" link for facets that shows the full set of results_ The simplest way to do this will be to have it link to a generated SQL query. 
_Originally posted by @simonw in https://github.com/simonw/datasette/issues/1332#issuecomment-846479062_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1337/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 893890496,MDU6SXNzdWU4OTM4OTA0OTY=,1332,?_facet_size=X to increase number of facets results on the page,192568,mroswell,closed,0,,,,,5,2021-05-18T02:40:16Z,2021-05-27T16:13:07Z,2021-05-23T00:34:37Z,CONTRIBUTOR,,"Is there a way to add a parameter to the URL to modify default_facet_size? LIkewise, a way to produce a link on the three dots to expand to all items (or match previous number of items, or add x more)? ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1332/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 903200328,MDU6SXNzdWU5MDMyMDAzMjg=,1341,"""Show all columns"" cog menu item should show if ?_col= is used",9599,simonw,closed,0,,,,,1,2021-05-27T04:28:17Z,2021-05-27T04:31:16Z,2021-05-27T04:31:16Z,OWNER,,"On https://latest.datasette.io/fixtures/sortable?_col=sortable the ""Show all columns"" item (from #615) is not shown (it should be): ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1341/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 517451234,MDU6SXNzdWU1MTc0NTEyMzQ=,615,?_col= and ?_nocol= support for toggling columns on table view,9599,simonw,closed,0,,,,,16,2019-11-04T22:55:41Z,2021-05-27T04:26:10Z,2021-05-27T04:17:44Z,OWNER,,Split off from #292 (I guess this is a re-opening of #312).,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/615/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 326800219,MDU6SXNzdWUzMjY4MDAyMTk=,292,Mechanism for customizing the SQL used to select specific columns in the table view,9599,simonw,closed,0,,,,,15,2018-05-27T09:05:52Z,2021-05-27T04:25:01Z,2021-05-27T04:25:01Z,OWNER,,"Some columns don't make a lot of sense in their default representation - binary blobs such as SpatiaLite geometries for example, or lengthy columns that really should be truncated somehow. We may also find that there are tables where we don't want to show all of the columns - so a mechanism to select a subset of columns would be nice. I think there are two features here: * the ability to request a subset of columns on the table view * the ability to override the SQL for a specific column and/or add extra columns - `AsGeoJSON(Geometry)` for example Both features should be available via both querystring arguments and in `metadata.json` The querystring argument for custom SQL should only work if `allow_sql` config is turned on. 
Refs #276",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/292/reactions"", ""total_count"": 2, ""+1"": 2, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 899851083,MDExOlB1bGxSZXF1ZXN0NjUxNDkyODg4,1339,?_col=/?_nocol= to show/hide columns on the table page,9599,simonw,closed,0,,,,,1,2021-05-24T17:15:20Z,2021-05-27T04:17:44Z,2021-05-27T04:17:43Z,OWNER,simonw/datasette/pulls/1339,"See #615. Still to do: - [x] Allow combination of `?_col=` and `?_nocol=` (`_nocol` wins) - [x] Deduplicate same column if passed in `?_col=` multiple times - [x] Validate that user did not try to remove a primary key - [x] Add tests - [x] Ensure this works correctly for SQL views - [x] Add documentation ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1339/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 892457208,MDU6SXNzdWU4OTI0NTcyMDg=,1327,Support Unicode characters in metadata.json,20846286,GmGniap,closed,0,,,,,2,2021-05-15T14:33:58Z,2021-05-24T19:10:21Z,2021-05-24T19:10:21Z,NONE,,"Hello , when I used Burmese (Unicode) characters in metadata.json like below - ![image](https://user-images.githubusercontent.com/20846286/118364978-cba70100-b5c0-11eb-967c-7dc3b62478f2.png) It gave wrong results when I run datasette - ![image](https://user-images.githubusercontent.com/20846286/118365025-fc873600-b5c0-11eb-97ce-19541b8cc6d8.png) It would be great & helpful for us if metadata.json can support in Unicode supported Asian Languages. Thanks & Regards. ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1327/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 884952179,MDU6SXNzdWU4ODQ5NTIxNzk=,1320,Can't use apt-get in Dockerfile when using datasetteproj/datasette as base,2670795,brandonrobertz,closed,0,,,,,4,2021-05-10T19:37:27Z,2021-05-24T18:15:56Z,2021-05-24T18:07:08Z,CONTRIBUTOR,,"The datasette base Docker image is super convenient, but there's one problem: if any of the plugins you install require additional system dependencies (e.g., xz, git, curl) then any attempt to use apt in said Dockerfile results in an explosion: ``` $ docker-compose build Building server [+] Building 9.9s (7/9) => [internal] load build definition from Dockerfile 0.0s => => transferring dockerfile: 666B 0.0s => [internal] load .dockerignore 0.0s => => transferring context: 34B 0.0s => [internal] load metadata for docker.io/datasetteproject/datasette:latest 0.6s => [base 1/4] FROM docker.io/datasetteproject/datasette@sha256:2250d0fbe57b1d615a8d6df0c9d43deb9533532e00bac68854773d8ff8dcf00a 0.0s => [internal] load build context 1.8s => => transferring context: 2.44MB 1.8s => CACHED [base 2/4] WORKDIR /datasette 0.0s => ERROR [base 3/4] RUN apt-get update && apt-get install --no-install-recommends -y git ssh curl xz-utils 9.2s ------ > [base 3/4] RUN apt-get update && apt-get install --no-install-recommends -y git ssh curl xz-utils: #6 0.446 Get:1 http://security.debian.org/debian-security buster/updates InRelease [65.4 kB] #6 0.449 Get:2 http://deb.debian.org/debian buster InRelease [121 kB] #6 0.459 Get:3 http://httpredir.debian.org/debian sid InRelease [157 kB] #6 0.784 Get:4 http://deb.debian.org/debian buster-updates InRelease [51.9 kB] #6 0.790 
Get:5 http://httpredir.debian.org/debian sid/main amd64 Packages [8626 kB] #6 1.003 Get:6 http://deb.debian.org/debian buster/main amd64 Packages [7907 kB] #6 1.180 Get:7 http://security.debian.org/debian-security buster/updates/main amd64 Packages [286 kB] #6 7.095 Get:8 http://deb.debian.org/debian buster-updates/main amd64 Packages [10.9 kB] #6 8.058 Fetched 17.2 MB in 8s (2243 kB/s) #6 8.058 Reading package lists... #6 9.166 E: flAbsPath on /var/lib/dpkg/status failed - realpath (2: No such file or directory) #6 9.166 E: Could not open file - open (2: No such file or directory) #6 9.166 E: Problem opening #6 9.166 E: The package lists or status file could not be parsed or opened. ``` The problem seems to be from completely wiping out `/var/lib/dpkg` in the upstream Dockerfile: https://github.com/simonw/datasette/blob/1b697539f5b53cec3fe13c0f4ada13ba655c88c7/Dockerfile#L18 I've tested without removing the directory and apt works as expected.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1320/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 899169307,MDU6SXNzdWU4OTkxNjkzMDc=,1338,Fix jinja2 warnings,9599,simonw,closed,0,,,,,0,2021-05-24T01:38:23Z,2021-05-24T01:41:55Z,2021-05-24T01:41:55Z,OWNER,,"Lots of these in the test suite now, after the Jinja upgrade in #1331: ``` tests/test_plugins.py::test_hook_render_cell_link_from_json datasette/tests/plugins/my_plugin_2.py:45: DeprecationWarning: 'jinja2.escape' is deprecated and will be removed in Jinja 3.1. Import 'markupsafe.escape' instead. label=jinja2.escape(data[""label""] or """") or "" "", tests/test_plugins.py::test_hook_render_cell_link_from_json datasette/tests/plugins/my_plugin_2.py:41: DeprecationWarning: 'jinja2.Markup' is deprecated and will be removed in Jinja 3.1. Import 'markupsafe.Markup' instead. return jinja2.Markup( ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1338/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 891969037,MDU6SXNzdWU4OTE5NjkwMzc=,1326,How to limit fields returned from the JSON API?,5268174,bram2000,closed,0,,,,,1,2021-05-14T14:27:41Z,2021-05-23T02:55:06Z,2021-05-23T02:55:00Z,NONE,,"Hi, I have quite wide tables, and in many cases only want a subset of the data (to save on network bandwidth). I need to use the JSON API as handling pagination is so much easier, but I can't see a way to select specific columns. Is there a way to do this, or is it a feature request? Thanks!",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1326/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 893537744,MDU6SXNzdWU4OTM1Mzc3NDQ=,1331,Add support for Jinja2 version 3.0,475613,MarkusH,closed,0,,,,,10,2021-05-17T17:14:36Z,2021-05-23T00:57:39Z,2021-05-23T00:57:39Z,NONE,,"A week ago, [The Pallets Project](https://github.com/pallets) released [new major versions of several of its projects](https://palletsprojects.com/blog/flask-2-0-released/). Among those updates is one for Jinja2, which bumps it to version 3.0.0. 
I'd like for datasette to support Jinaj2 version 3.0.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1331/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 887241681,MDExOlB1bGxSZXF1ZXN0NjQwNDg0OTY2,1321,Bump black from 21.4b2 to 21.5b1,49699333,dependabot[bot],closed,0,,,,,1,2021-05-11T13:12:28Z,2021-05-22T23:55:39Z,2021-05-22T23:55:39Z,CONTRIBUTOR,simonw/datasette/pulls/1321,"Bumps [black](https://github.com/psf/black) from 21.4b2 to 21.5b1.
Release notes

Sourced from black's releases.

21.5b1

Black

  • Refactor src/black/__init__.py into many files (#2206)

Documentation

  • Replaced all remaining references to the master branch with the main branch. Some additional changes in the source code were also made. (#2210)
  • Significantly reorganized the documentation to make much more sense. Check them out by heading over to the stable docs on RTD. (#2174)

21.5b0

Black

  • Set --pyi mode if --stdin-filename ends in .pyi (#2169)
  • Stop detecting target version as Python 3.9+ with pre-PEP-614 decorators that are being called but with no arguments (#2182)

Black-Primer

  • Add --no-diff to black-primer to suppress formatting changes (#2187)
Changelog

Sourced from black's changelog.

21.5b1

Black

  • Refactor src/black/__init__.py into many files (#2206)

Documentation

  • Replaced all remaining references to the master branch with the main branch. Some additional changes in the source code were also made. (#2210)
  • Significantly reorganized the documentation to make much more sense. Check them out by heading over to the stable docs on RTD. (#2174)

21.5b0

Black

  • Set --pyi mode if --stdin-filename ends in .pyi (#2169)
  • Stop detecting target version as Python 3.9+ with pre-PEP-614 decorators that are being called but with no arguments (#2182)

Black-Primer

  • Add --no-diff to black-primer to suppress formatting changes (#2187)
Commits

[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=black&package-manager=pip&previous-version=21.4b2&new-version=21.5b1)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores) Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`. [//]: # (dependabot-automerge-start) [//]: # (dependabot-automerge-end) ---
Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR: - `@dependabot rebase` will rebase this PR - `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it - `@dependabot merge` will merge this PR after your CI passes on it - `@dependabot squash and merge` will squash and merge this PR after your CI passes on it - `@dependabot cancel merge` will cancel a previously requested merge and block automerging - `@dependabot reopen` will reopen this PR if it is closed - `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1321/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 890073888,MDExOlB1bGxSZXF1ZXN0NjQzMTQ5Mjcz,1323,"Update click requirement from ~=7.1.1 to >=7.1.1,<8.1.0",49699333,dependabot[bot],closed,0,,,,,1,2021-05-12T13:08:56Z,2021-05-22T23:54:48Z,2021-05-22T23:54:48Z,CONTRIBUTOR,simonw/datasette/pulls/1323,"Updates the requirements on [click](https://github.com/pallets/click) to permit the latest version.
Release notes

Sourced from click's releases.

8.0.0

New major versions of all the core Pallets libraries, including Click 8.0, have been released! :tada:

This represents a significant amount of work, and there are quite a few changes. Be sure to carefully read the changelog, and use tools such as pip-compile and Dependabot to pin your dependencies and control your updates.

Changelog

Sourced from click's changelog.

Version 8.0.0

Released 2021-05-11

  • Drop support for Python 2 and 3.5.
  • Colorama is always installed on Windows in order to provide style and color support. :pr:1784
  • Adds a repr to Command, showing the command name for friendlier debugging. :issue:1267, :pr:1295
  • Add support for distinguishing the source of a command line parameter. :issue:1264, :pr:1329
  • Add an optional parameter to ProgressBar.update to set the current_item. :issue:1226, :pr:1332
  • version_option uses importlib.metadata (or the importlib_metadata backport) instead of pkg_resources. :issue:1582
  • If validation fails for a prompt with hide_input=True, the value is not shown in the error message. :issue:1460
  • An IntRange or FloatRange option shows the accepted range in its help text. :issue:1525, :pr:1303
  • IntRange and FloatRange bounds can be open (<) instead of closed (<=) by setting min_open and max_open. Error messages have changed to reflect this. :issue:1100
  • An option defined with duplicate flag names ("--foo/--foo") raises a ValueError. :issue:1465
  • echo() will not fail when using pytest's capsys fixture on Windows. :issue:1590
  • Resolving commands returns the canonical command name instead of the matched name. This makes behavior such as help text and Context.invoked_subcommand consistent when using patterns like AliasedGroup. :issue:1422
  • The BOOL type accepts the values "on" and "off". :issue:1629
  • A Group with invoke_without_command=True will always invoke its result callback. :issue:1178
  • nargs == -1 and nargs > 1 is parsed and validated for values from environment variables and defaults. :issue:729
  • Detect the program name when executing a module or package with python -m name. :issue:1603
  • Include required parent arguments in help synopsis of subcommands. :issue:1475
  • Help for boolean flags with show_default=True shows the flag name instead of True or False. :issue:1538
  • Non-string objects passed to style() and secho() will be converted to string. :pr:1146
  • edit(require_save=True) will detect saves for editors that exit very fast on filesystems with 1 second resolution. :pr:1050
  • New class attributes make it easier to use custom core objects throughout an entire application. :pr:938

... (truncated)

Commits
  • 9da1669 Merge pull request #1877 from pallets/release-8.0.0
  • dfa6369 release version 8.0.0
  • b862cb1 update requirements
  • f51584c Merge pull request #1876 from pallets/pre-commit-ci-schedule
  • 804c71c update pre-commit monthly
  • ac655f8 Merge pull request #1872 from janLuke/fix/formatter_write_text
  • dcd991d HelpFormatter.write_text uses full width
  • 5215fc1 Merge pull request #1870 from AdrienPensart/allow_colors_in_metavar
  • e3e1691 repr is erasing ANSI escapes codes
  • 482e6e6 Merge pull request #1875 from pallets/pre-commit-ci-update-config
  • Additional commits viewable in compare view

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`. [//]: # (dependabot-automerge-start) [//]: # (dependabot-automerge-end) ---
Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR: - `@dependabot rebase` will rebase this PR - `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it - `@dependabot merge` will merge this PR after your CI passes on it - `@dependabot squash and merge` will squash and merge this PR after your CI passes on it - `@dependabot cancel merge` will cancel a previously requested merge and block automerging - `@dependabot reopen` will reopen this PR if it is closed - `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1323/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 890073989,MDExOlB1bGxSZXF1ZXN0NjQzMTQ5MzY0,1325,"Update itsdangerous requirement from ~=1.1 to >=1.1,<3.0",49699333,dependabot[bot],closed,0,,,,,2,2021-05-12T13:09:03Z,2021-05-22T23:54:25Z,2021-05-22T23:54:25Z,CONTRIBUTOR,simonw/datasette/pulls/1325,"Updates the requirements on [itsdangerous](https://github.com/pallets/itsdangerous) to permit the latest version.
Release notes

Sourced from itsdangerous's releases.

2.0.0

New major versions of all the core Pallets libraries, including ItsDangerous 2.0, have been released! :tada:

This represents a significant amount of work, and there are quite a few changes. Be sure to carefully read the changelog, and use tools such as pip-compile and Dependabot to pin your dependencies and control your updates.

Changelog

Sourced from itsdangerous's changelog.

Version 2.0.0

Released 2021-05-11

  • Drop support for Python 2 and 3.5.
  • JWS support (JSONWebSignatureSerializer, TimedJSONWebSignatureSerializer) is deprecated. Use a dedicated JWS/JWT library such as authlib instead. :issue:129
  • Importing itsdangerous.json is deprecated. Import Python's json module instead. :pr:152
  • Simplejson is no longer used if it is installed. To use a different library, pass it as Serializer(serializer=...). :issue:146
  • datetime values are timezone-aware with timezone.utc. Code using TimestampSigner.unsign(return_timestamp=True) or BadTimeSignature.date_signed may need to change. :issue:150
  • If a signature has an age less than 0, it will raise SignatureExpired rather than appearing valid. This can happen if the timestamp offset is changed. :issue:126
  • BadTimeSignature.date_signed is always a datetime object rather than an int in some cases. :issue:124
  • Added support for key rotation. A list of keys can be passed as secret_key, oldest to newest. The newest key is used for signing, all keys are tried for unsigning. :pr:141
  • Removed the default SHA-512 fallback signer from default_fallback_signers. :issue:155
  • Add type information for static typing tools. :pr:186

Version 1.1.0

Released 2018-10-26

  • Change default signing algorithm back to SHA-1. :pr:113
  • Added a default SHA-512 fallback for users who used the yanked 1.0.0 release which defaulted to SHA-512. :pr:114
  • Add support for fallback algorithms during deserialization to support changing the default in the future without breaking existing signatures. :pr:113
  • Changed capitalization of packages back to lowercase as the change in capitalization broke some tooling. :pr:113

Version 1.0.0

Released 2018-10-18

YANKED

... (truncated)

Commits
  • d101100 Merge pull request #235 from pallets/release-2.0.0
  • ca0f59a release version 2.0.0
  • d1ed89f update requirements
  • d1722ea Merge pull request #234 from pallets/pre-commit-ci-schedule
  • d1eb7aa update pre-commit monthly
  • acbc456 Merge pull request #233 from pallets/pre-commit-ci-update-config
  • 04e485a [pre-commit.ci] pre-commit autoupdate
  • c0e6b48 Merge pull request #232 from pallets/pre-commit-ci-update-config
  • 6a9df83 [pre-commit.ci] pre-commit autoupdate
  • 477f42c Merge pull request #231 from pallets/dependabot/pip/pre-commit-2.12.1
  • Additional commits viewable in compare view

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`. [//]: # (dependabot-automerge-start) [//]: # (dependabot-automerge-end) ---
Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR: - `@dependabot rebase` will rebase this PR - `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it - `@dependabot merge` will merge this PR after your CI passes on it - `@dependabot squash and merge` will squash and merge this PR after your CI passes on it - `@dependabot cancel merge` will cancel a previously requested merge and block automerging - `@dependabot reopen` will reopen this PR if it is closed - `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1325/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 893314402,MDExOlB1bGxSZXF1ZXN0NjQ1ODQ5MDI3,1330,"Update aiofiles requirement from <0.7,>=0.4 to >=0.4,<0.8",49699333,dependabot[bot],closed,0,,,,,1,2021-05-17T13:07:31Z,2021-05-22T23:53:57Z,2021-05-22T23:53:56Z,CONTRIBUTOR,simonw/datasette/pulls/1330,"Updates the requirements on [aiofiles](https://github.com/Tinche/aiofiles) to permit the latest version.
Commits

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`. [//]: # (dependabot-automerge-start) [//]: # (dependabot-automerge-end) ---
Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR: - `@dependabot rebase` will rebase this PR - `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it - `@dependabot merge` will merge this PR after your CI passes on it - `@dependabot squash and merge` will squash and merge this PR after your CI passes on it - `@dependabot cancel merge` will cancel a previously requested merge and block automerging - `@dependabot reopen` will reopen this PR if it is closed - `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1330/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 895315478,MDExOlB1bGxSZXF1ZXN0NjQ3NTUyMTQx,1335,Fix small typo,3243482,abdusco,closed,0,,,,,1,2021-05-19T11:17:04Z,2021-05-22T23:53:34Z,2021-05-22T23:53:34Z,CONTRIBUTOR,simonw/datasette/pulls/1335,,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1335/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 771872303,MDExOlB1bGxSZXF1ZXN0NTQzMjQ2NTM1,59,Remove unneeded exists=True for -a/--auth flag.,631242,frosencrantz,closed,0,,,,,3,2020-12-21T06:03:55Z,2021-05-22T14:06:19Z,2021-05-19T16:08:12Z,CONTRIBUTOR,dogsheep/github-to-sqlite/pulls/59,The file does not need to exist when using an environment variable.,207052882,github-to-sqlite,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/59/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 797108702,MDExOlB1bGxSZXF1ZXN0NTY0MTcyMTQw,61,fixing typo in get cli help text,22578954,daniel-butler,closed,0,,,,,1,2021-01-29T18:57:04Z,2021-05-19T16:07:09Z,2021-05-19T16:07:09Z,CONTRIBUTOR,dogsheep/github-to-sqlite/pulls/61,,207052882,github-to-sqlite,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/61/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 857280617,MDExOlB1bGxSZXF1ZXN0NjE0NzI3MDM2,254,Fix incorrect create-table cli description,1935268,robjwells,closed,0,,,,,1,2021-04-13T20:03:15Z,2021-05-19T04:43:46Z,2021-05-19T02:57:26Z,CONTRIBUTOR,simonw/sqlite-utils/pulls/254,The description for `create-table` was duplicated from `create-index`.,140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/254/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 847423559,MDU6SXNzdWU4NDc0MjM1NTk=,253,fixtures.db example error in sql-utils blog post,192568,mroswell,closed,0,,,,,2,2021-03-31T22:07:36Z,2021-05-19T03:31:48Z,2021-05-19T03:31:47Z,NONE,,"En route to trying to understand column order transform documentation, I tried the instructions here: https://simonwillison.net/2020/Sep/23/sqlite-advanced-alter-table/ I get a malformed database schema syntax error. ``` $ wget https://latest.datasette.io/fixtures.db --2021-03-31 18:00:23-- https://latest.datasette.io/fixtures.db Resolving latest.datasette.io (latest.datasette.io)... 2607:f8b0:4004:801::2013, 142.250.73.211 Connecting to latest.datasette.io (latest.datasette.io)|2607:f8b0:4004:801::2013|:443... connected. HTTP request sent, awaiting response... 200 OK Length: unspecified [application/octet-stream] Saving to: ‘fixtures.db’ fixtures.db [ <=> ] 260.00K --.-KB/s in 0.1s 2021-03-31 18:00:23 (2.41 MB/s) - ‘fixtures.db’ saved [266240] $ sqlite3 fixtures.db '.schema facetable' Error: malformed database schema (generated_columns) - near ""AS"": syntax error $ sqlite3 fixtures.db SQLite version 3.28.0 2019-04-15 14:49:49 Enter "".help"" for usage hints. 
sqlite> .schema Error: malformed database schema (generated_columns) - near ""AS"": syntax error ``` ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/253/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 894948100,MDU6SXNzdWU4OTQ5NDgxMDA=,259,Suggest the --alter option if a new column cannot be added,9599,simonw,closed,0,,,,,1,2021-05-19T03:17:38Z,2021-05-19T03:27:33Z,2021-05-19T03:26:26Z,OWNER,,Refs #256.,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/259/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 861622839,MDU6SXNzdWU4NjE2MjI4Mzk=,256,inserting with --nl errors with: sqlite3.OperationalError: table
has no column named ,279769,rathboma,closed,0,,,,,2,2021-04-19T18:01:03Z,2021-05-19T03:26:54Z,2021-05-19T03:26:54Z,NONE,,"I have a `jsonl` file, it is 10,000 lines long. Inserting from the cli with `sqlite-utils insert db table file --nl --batch-size 10000` fails with this missing column error, even though I'm telling it to use the whole file in the first batch. This seems similar to #18 and #139, but maybe it's unique to `--nl`?",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/256/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 820468864,MDExOlB1bGxSZXF1ZXN0NTgzNDA3OTg5,244,Typo in upsert example,387669,j-e-d,closed,0,,,,,1,2021-03-02T23:14:14Z,2021-05-19T02:58:21Z,2021-05-19T02:58:21Z,CONTRIBUTOR,simonw/sqlite-utils/pulls/244,Remove extra `[`,140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/244/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 830803173,MDExOlB1bGxSZXF1ZXN0NTkyMjg5MzI0,245,Correct some typos,1076745,dbready,closed,0,,,,,1,2021-03-13T04:26:56Z,2021-05-19T02:58:04Z,2021-05-19T02:58:04Z,CONTRIBUTOR,simonw/sqlite-utils/pulls/245,Noticed a typo in the docs and followed that up with a spellcheck. Had to bite my tongue at some of the British spellings.,140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/245/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 868188068,MDU6SXNzdWU4NjgxODgwNjg=,257,"Insert from JSON containing strings with non-ascii characters are escaped as unicode for lists, tuples, dicts.",6586811,dylan-wu,closed,0,,,,,0,2021-04-26T20:46:25Z,2021-05-19T02:57:05Z,2021-05-19T02:57:05Z,CONTRIBUTOR,,"JSON Test File (test.json): ```json [ { ""id"": 123, ""text"": ""FR Théâtre"" }, { ""id"": 223, ""text"": [ ""FR Théâtre"" ] } ] ``` Command to import: ```bash sqlite-utils insert test.db text test.json --pk=id ``` Resulting table view from datasette: ![image](https://user-images.githubusercontent.com/6586811/116147833-cdf2fb00-a6a5-11eb-8412-0aae81b6e6dd.png) Original, db.py line 2225: ```python return json.dumps(value, default=repr) ``` Fix, db.py line 2225: ```python return json.dumps(value, default=repr, ensure_ascii=False) ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/257/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 868191959,MDExOlB1bGxSZXF1ZXN0NjIzNzU1NzIz,258,Fixing insert from JSON containing strings with non-ascii characters …,6586811,dylan-wu,closed,0,,,,,1,2021-04-26T20:50:00Z,2021-05-19T02:47:44Z,2021-05-19T02:47:44Z,CONTRIBUTOR,simonw/sqlite-utils/pulls/258,"…are escaped aps unicode for lists, tuples, dicts Fix of #257 ",140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/258/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 812228314,MDU6SXNzdWU4MTIyMjgzMTQ=,1236,Ability to increase size of the SQL editor 
window,9599,simonw,closed,0,,,,,9,2021-02-19T18:09:27Z,2021-05-18T03:28:25Z,2021-02-22T21:05:21Z,OWNER,,,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1236/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 890073940,MDExOlB1bGxSZXF1ZXN0NjQzMTQ5MzIw,1324,"Update jinja2 requirement from <2.12.0,>=2.10.3 to >=2.10.3,<3.1.0",49699333,dependabot[bot],closed,0,,,,,2,2021-05-12T13:08:59Z,2021-05-17T17:19:41Z,2021-05-17T17:19:40Z,CONTRIBUTOR,simonw/datasette/pulls/1324,"Updates the requirements on [jinja2](https://github.com/pallets/jinja) to permit the latest version.
Release notes

Sourced from jinja2's releases.

3.0.0

New major versions of all the core Pallets libraries, including Jinja 3.0, have been released! :tada:

This represents a significant amount of work, and there are quite a few changes. Be sure to carefully read the changelog, and use tools such as pip-compile and Dependabot to pin your dependencies and control your updates.

Changelog

Sourced from jinja2's changelog.

Version 3.0.0

Released 2021-05-11

  • Drop support for Python 2.7 and 3.5.
  • Bump MarkupSafe dependency to >=1.1.
  • Bump Babel optional dependency to >=2.1.
  • Remove code that was marked deprecated.
  • Add type hinting. :pr:1412
  • Use :pep:451 API to load templates with :class:~loaders.PackageLoader. :issue:1168
  • Fix a bug that caused imported macros to not have access to the current template's globals. :issue:688
  • Add ability to ignore trim_blocks using +%}. :issue:1036
  • Fix a bug that caused custom async-only filters to fail with constant input. :issue:1279
  • Fix UndefinedError incorrectly being thrown on an undefined variable instead of Undefined being returned on NativeEnvironment on Python 3.10. :issue:1335
  • Blocks can be marked as required. They must be overridden at some point, but not necessarily by the direct child. :issue:1147
  • Deprecate the autoescape and with extensions, they are built-in to the compiler. :issue:1203
  • The urlize filter recognizes mailto: links and takes extra_schemes (or env.policies["urlize.extra_schemes"]) to recognize other schemes. It tries to balance parentheses within a URL instead of ignoring trailing characters. The parsing in general has been updated to be more efficient and match more cases. URLs without a scheme are linked as https:// instead of http://. :issue:522, 827, 1172, :pr:1195
  • Filters that get attributes, such as map and groupby, can use a false or empty value as a default. :issue:1331
  • Fix a bug that prevented variables set in blocks or loops from being accessed in custom context functions. :issue:768
  • Fix a bug that caused scoped blocks from accessing special loop variables. :issue:1088
  • Update the template globals when calling Environment.get_template(globals=...) even if the template was already loaded. :issue:295
  • Do not raise an error for undefined filters in unexecuted if-statements and conditional expressions. :issue:842
  • Add is filter and is test tests to test if a name is a registered filter or test. This allows checking if a filter is available in a template before using it. Test functions can be decorated with @pass_environment, @pass_eval_context, or @pass_context. :issue:842, :pr:1248
  • Support pgettext and npgettext (message contexts) in i18n extension. :issue:441
  • The |indent filter's width argument can be a string to

... (truncated)

Commits

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`. [//]: # (dependabot-automerge-start) [//]: # (dependabot-automerge-end) ---
Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR: - `@dependabot rebase` will rebase this PR - `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it - `@dependabot merge` will merge this PR after your CI passes on it - `@dependabot squash and merge` will squash and merge this PR after your CI passes on it - `@dependabot cancel merge` will cancel a previously requested merge and block automerging - `@dependabot reopen` will reopen this PR if it is closed - `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1324/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 876431852,MDExOlB1bGxSZXF1ZXN0NjMwNTc4NzM1,1318,Bump black from 21.4b2 to 21.5b0,49699333,dependabot[bot],closed,0,,,,,2,2021-05-05T13:07:51Z,2021-05-11T13:12:32Z,2021-05-11T13:12:31Z,CONTRIBUTOR,simonw/datasette/pulls/1318,"Bumps [black](https://github.com/psf/black) from 21.4b2 to 21.5b0.
Release notes

Sourced from black's releases.

21.5b0

Black

  • Set --pyi mode if --stdin-filename ends in .pyi (#2169)
  • Stop detecting target version as Python 3.9+ with pre-PEP-614 decorators that are being called but with no arguments (#2182)

Black-Primer

  • Add --no-diff to black-primer to suppress formatting changes (#2187)
Changelog

Sourced from black's changelog.

21.5b0

Black

  • Set --pyi mode if --stdin-filename ends in .pyi (#2169)
  • Stop detecting target version as Python 3.9+ with pre-PEP-614 decorators that are being called but with no arguments (#2182)

Black-Primer

  • Add --no-diff to black-primer to suppress formatting changes (#2187)
Commits

[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=black&package-manager=pip&previous-version=21.4b2&new-version=21.5b0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores) Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`. [//]: # (dependabot-automerge-start) [//]: # (dependabot-automerge-end) ---
Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR: - `@dependabot rebase` will rebase this PR - `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it - `@dependabot merge` will merge this PR after your CI passes on it - `@dependabot squash and merge` will squash and merge this PR after your CI passes on it - `@dependabot cancel merge` will cancel a previously requested merge and block automerging - `@dependabot reopen` will reopen this PR if it is closed - `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1318/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 870125126,MDU6SXNzdWU4NzAxMjUxMjY=,1310,I'm creating a plugin to export a spreadsheet file (.ods or .xlsx),3747136,ColinMaudry,closed,0,,,,,2,2021-04-28T16:20:11Z,2021-04-30T07:26:11Z,2021-04-30T06:58:46Z,NONE,,"Hi, I have started developing a plugin to export records as a spreadsheet file. It could be ods or xlsx, whatever is easier. I have spotted the following packages: - ods files: https://pypi.org/project/odswriter/ - xlsx files: https://openpyxl.readthedocs.io/en/stable/index.html (quite powerful) or https://xlsxwriter.readthedocs.io/ (faster) This is the code I have so far, I test it with the `--plugins-dir` option: ```python from datasette import hookimpl from datasette.utils.asgi import Response import odswriter as ods def render_spreadsheet(rows): with ods.writer(open(""test.ods"",""wb"")) as odsfile: for row in rows: odsfile.writerow([""String"", ""ABCDEF123456"", ""123456""]) return Response(odsfile, content_type=""application/vnd.oasis.opendocument.spreadsheet"", status=200) @hookimpl def register_output_renderer(): return {""extension"": ""ods"", ""render"": render_spreadsheet} ``` I get the following error: ``` Traceback (most recent call last): File ""/home/colin/.local/lib/python3.8/site-packages/datasette/app.py"", line 1128, in route_path await response.asgi_send(send) File ""/home/colin/.local/lib/python3.8/site-packages/datasette/utils/asgi.py"", line 339, in asgi_send body = body.encode(""utf-8"") AttributeError: 'ODSWriter' object has no attribute 'encode' ERROR: Exception in ASGI application Traceback (most recent call last): File ""/home/colin/.local/lib/python3.8/site-packages/datasette/app.py"", line 1128, in route_path await response.asgi_send(send) File ""/home/colin/.local/lib/python3.8/site-packages/datasette/utils/asgi.py"", line 339, in asgi_send body = body.encode(""utf-8"") AttributeError: 'ODSWriter' object has no attribute 'encode' During handling of the above exception, another exception occurred: Traceback (most recent call last): File ""/home/colin/.local/lib/python3.8/site-packages/uvicorn/protocols/http/h11_impl.py"", line 396, in run_asgi result = await app(self.scope, self.receive, self.send) File ""/home/colin/.local/lib/python3.8/site-packages/uvicorn/middleware/proxy_headers.py"", line 45, in __call__ return await self.app(scope, receive, send) File ""/home/colin/.local/lib/python3.8/site-packages/datasette/utils/asgi.py"", line 161, in __call__ await self.app(scope, receive, send) File ""/home/colin/.local/lib/python3.8/site-packages/datasette/tracer.py"", line 75, in __call__ await self.app(scope, receive, send) File ""/home/colin/.local/lib/python3.8/site-packages/asgi_csrf.py"", line 107, in app_wrapped_with_csrf await app(scope, receive, wrapped_send) File ""/home/colin/.local/lib/python3.8/site-packages/datasette/app.py"", line 1086, in __call__ return await self.route_path(scope, receive, send, path) File ""/home/colin/.local/lib/python3.8/site-packages/datasette/app.py"", line 1133, in route_path return await self.handle_500(request, send, exception) File ""/home/colin/.local/lib/python3.8/site-packages/datasette/app.py"", line 1267, in handle_500 await asgi_send_html( File ""/home/colin/.local/lib/python3.8/site-packages/datasette/utils/asgi.py"", line 217, in asgi_send_html await asgi_send( 
File ""/home/colin/.local/lib/python3.8/site-packages/datasette/utils/asgi.py"", line 237, in asgi_send await asgi_start(send, status, headers, content_type) File ""/home/colin/.local/lib/python3.8/site-packages/datasette/utils/asgi.py"", line 246, in asgi_start await send( File ""/home/colin/.local/lib/python3.8/site-packages/asgi_csrf.py"", line 103, in wrapped_send await send(event) File ""/home/colin/.local/lib/python3.8/site-packages/uvicorn/protocols/http/h11_impl.py"", line 482, in send raise RuntimeError(msg % message_type) RuntimeError: Expected ASGI message 'http.response.body', but got 'http.response.start'. ``` I tried with `AsgiFileDownload` like in [DatabaseDownload](https://github.com/simonw/datasette/blob/main/datasette/views/database.py#L150) to deal with the binary nature of the ods file, but the renderer expects a Response: > should be dict or Response However, the `Response` class only supports the following methods, not binary: - html - text - json - redirect How would you suggest me to proceed to have my ods file downloaded? ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1310/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 871046111,MDExOlB1bGxSZXF1ZXN0NjI2MTMwMTM1,1313,Bump black from 20.8b1 to 21.4b2,27856297,dependabot-preview[bot],closed,0,,,,,2,2021-04-29T13:58:06Z,2021-04-29T15:47:50Z,2021-04-29T15:47:49Z,CONTRIBUTOR,simonw/datasette/pulls/1313,"Bumps [black](https://github.com/psf/black) from 20.8b1 to 21.4b2.
Release notes

Sourced from black's releases.

21.4b2

Black

  • Fix crash if the user configuration directory is inaccessible. (#2158)

  • Clarify circumstances in which Black may change the AST (#2159)

Packaging

  • Install primer.json (used by black-primer by default) with black. (#2154)

21.4b1

Black

  • Fix crash on docstrings ending with "\ ". (#2142)

  • Fix crash when atypical whitespace is cleaned out of docstrings (#2120)

  • Reflect the --skip-magic-trailing-comma and --experimental-string-processing flags in the name of the cache file. Without this fix, changes in these flags would not take effect if the cache had already been populated. (#2131)

  • Don't remove necessary parentheses from assignment expression containing assert / return statements. (#2143)

Packaging

  • Bump pathspec to >= 0.8.1 to solve invalid .gitignore exclusion handling

21.4b0

Black

  • Fixed a rare but annoying formatting instability created by the combination of optional trailing commas inserted by Black and optional parentheses looking at pre-existing "magic" trailing commas. This fixes issue #1629 and all of its many many duplicates. (#2126)

  • Black now processes one-line docstrings by stripping leading and trailing spaces, and adding a padding space when needed to break up """". (#1740)

  • Black now cleans up leading non-breaking spaces in comments (#2092)

  • Black now respects --skip-string-normalization when normalizing multiline docstring quotes (#1637)

  • Black no longer removes all empty lines between non-function code and decorators when formatting typing stubs. Now Black enforces a single empty line. (#1646)

... (truncated)

Changelog

Sourced from black's changelog.

21.4b2

Black

  • Fix crash if the user configuration directory is inaccessible. (#2158)

  • Clarify circumstances in which Black may change the AST (#2159)

Packaging

  • Install primer.json (used by black-primer by default) with black. (#2154)

21.4b1

Black

  • Fix crash on docstrings ending with "\ ". (#2142)

  • Fix crash when atypical whitespace is cleaned out of docstrings (#2120)

  • Reflect the --skip-magic-trailing-comma and --experimental-string-processing flags in the name of the cache file. Without this fix, changes in these flags would not take effect if the cache had already been populated. (#2131)

  • Don't remove necessary parentheses from assignment expression containing assert / return statements. (#2143)

Packaging

  • Bump pathspec to >= 0.8.1 to solve invalid .gitignore exclusion handling

21.4b0

Black

  • Fixed a rare but annoying formatting instability created by the combination of optional trailing commas inserted by Black and optional parentheses looking at pre-existing "magic" trailing commas. This fixes issue #1629 and all of its many many duplicates. (#2126)

  • Black now processes one-line docstrings by stripping leading and trailing spaces, and adding a padding space when needed to break up """". (#1740)

  • Black now cleans up leading non-breaking spaces in comments (#2092)

  • Black now respects --skip-string-normalization when normalizing multiline docstring quotes (#1637)

... (truncated)

Commits

[![Dependabot compatibility score](https://api.dependabot.com/badges/compatibility_score?dependency-name=black&package-manager=pip&previous-version=20.8b1&new-version=21.4b2)](https://dependabot.com/compatibility-score/?dependency-name=black&package-manager=pip&previous-version=20.8b1&new-version=21.4b2) Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`. [//]: # (dependabot-automerge-start) [//]: # (dependabot-automerge-end) ---
Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR: - `@dependabot rebase` will rebase this PR - `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it - `@dependabot merge` will merge this PR after your CI passes on it - `@dependabot squash and merge` will squash and merge this PR after your CI passes on it - `@dependabot cancel merge` will cancel a previously requested merge and block automerging - `@dependabot reopen` will reopen this PR if it is closed - `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually - `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself) - `@dependabot use these labels` will set the current labels as the default for future PRs for this repo and language - `@dependabot use these reviewers` will set the current reviewers as the default for future PRs for this repo and language - `@dependabot use these assignees` will set the current assignees as the default for future PRs for this repo and language - `@dependabot use this milestone` will set the current milestone as the default for future PRs for this repo and language - `@dependabot badge me` will comment on this PR with code to add a ""Dependabot enabled"" badge to your readme Additionally, you can set the following in your Dependabot [dashboard](https://app.dependabot.com): - Update frequency (including time of day and day of week) - Pull request limits (per update run and/or open at any time) - Out-of-range updates (receive only lockfile updates, if desired) - Security updates (receive only security updates, if desired)
",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1313/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 871157602,MDExOlB1bGxSZXF1ZXN0NjI2MjIyNjc2,1314,Upgrade to GitHub-native Dependabot,27856297,dependabot-preview[bot],closed,0,,,,,1,2021-04-29T15:36:41Z,2021-04-29T15:47:22Z,2021-04-29T15:47:21Z,CONTRIBUTOR,simonw/datasette/pulls/1314,"_Dependabot Preview will be shut down on August 3rd, 2021. In order to keep getting Dependabot updates, please merge this PR and migrate to GitHub-native Dependabot before then._ Dependabot has been fully integrated into GitHub, so you no longer have to install and manage a separate app. This pull request migrates your configuration from Dependabot.com to a config file, using the [new syntax][new_syntax]. When merged, we'll swap out `dependabot-preview` (me) for a new `dependabot` app, and you'll be all set! With this change, you'll now use the [Dependabot page in GitHub][dependabot_page], rather than the [Dependabot dashboard][dashboard], to monitor your version updates, and you'll configure Dependabot through the new config file rather than a UI. If you've got any questions or feedback for us, please let us know by creating an issue in the [dependabot/dependabot-core][issues] repository. [Learn more about migrating to GitHub-native Dependabot][learn] Please note that regular `@dependabot` commands do not work on this pull request. [dashboard]: https://app.dependabot.com/ [dependabot_page]: https://github.com/simonw/datasette/network/updates [issues]: https://github.com/dependabot/dependabot-core/issues/new?assignees=%40dependabot%2Fpreview-migration-reviewers&labels=E%3A+preview-migration&template=migration-issue.md [learn]: http://docs.github.com/code-security/supply-chain-security/upgrading-from-dependabotcom-to-github-native-dependabot [new_syntax]: https://help.github.com/en/github/administering-a-repository/configuration-options-for-dependency-updates [org_secrets_url]: https://github.com/settings/secrets/dependabot [repo_secrets_url]: https://github.com/simonw/datasette/settings/secrets/dependabot ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1314/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 870227815,MDExOlB1bGxSZXF1ZXN0NjI1NDU3NTc5,1311,Bump black from 20.8b1 to 21.4b1,27856297,dependabot-preview[bot],closed,0,,,,,2,2021-04-28T18:25:58Z,2021-04-29T13:58:11Z,2021-04-29T13:58:09Z,CONTRIBUTOR,simonw/datasette/pulls/1311,"Bumps [black](https://github.com/psf/black) from 20.8b1 to 21.4b1.
Release notes

Sourced from black's releases.

21.4b1

Black

  • Fix crash on docstrings ending with "\ ". (#2142)

  • Fix crash when atypical whitespace is cleaned out of docstrings (#2120)

  • Reflect the --skip-magic-trailing-comma and --experimental-string-processing flags in the name of the cache file. Without this fix, changes in these flags would not take effect if the cache had already been populated. (#2131)

  • Don't remove necessary parentheses from assignment expression containing assert / return statements. (#2143)

Packaging

  • Bump pathspec to >= 0.8.1 to solve invalid .gitignore exclusion handling

21.4b0

Black

  • Fixed a rare but annoying formatting instability created by the combination of optional trailing commas inserted by Black and optional parentheses looking at pre-existing "magic" trailing commas. This fixes issue #1629 and all of its many many duplicates. (#2126)

  • Black now processes one-line docstrings by stripping leading and trailing spaces, and adding a padding space when needed to break up """". (#1740)

  • Black now cleans up leading non-breaking spaces in comments (#2092)

  • Black now respects --skip-string-normalization when normalizing multiline docstring quotes (#1637)

  • Black no longer removes all empty lines between non-function code and decorators when formatting typing stubs. Now Black enforces a single empty line. (#1646)

  • Black no longer adds an incorrect space after a parenthesized assignment expression in if/while statements (#1655)

  • Added --skip-magic-trailing-comma / -C to avoid using trailing commas as a reason to split lines (#1824)

  • fixed a crash when PWD=/ on POSIX (#1631)

  • fixed "I/O operation on closed file" when using --diff (#1664)

  • Prevent coloured diff output being interleaved with multiple files (#1673)

  • Added support for PEP 614 relaxed decorator syntax on python 3.9 (#1711)

... (truncated)

Changelog

Sourced from black's changelog.

21.4b1

Black

  • Fix crash on docstrings ending with "\ ". (#2142)

  • Fix crash when atypical whitespace is cleaned out of docstrings (#2120)

  • Reflect the --skip-magic-trailing-comma and --experimental-string-processing flags in the name of the cache file. Without this fix, changes in these flags would not take effect if the cache had already been populated. (#2131)

  • Don't remove necessary parentheses from assignment expression containing assert / return statements. (#2143)

Packaging

  • Bump pathspec to >= 0.8.1 to solve invalid .gitignore exclusion handling

21.4b0

Black

  • Fixed a rare but annoying formatting instability created by the combination of optional trailing commas inserted by Black and optional parentheses looking at pre-existing "magic" trailing commas. This fixes issue #1629 and all of its many many duplicates. (#2126)

  • Black now processes one-line docstrings by stripping leading and trailing spaces, and adding a padding space when needed to break up """". (#1740)

  • Black now cleans up leading non-breaking spaces in comments (#2092)

  • Black now respects --skip-string-normalization when normalizing multiline docstring quotes (#1637)

  • Black no longer removes all empty lines between non-function code and decorators when formatting typing stubs. Now Black enforces a single empty line. (#1646)

  • Black no longer adds an incorrect space after a parenthesized assignment expression in if/while statements (#1655)

  • Added --skip-magic-trailing-comma / -C to avoid using trailing commas as a reason to split lines (#1824)

  • fixed a crash when PWD=/ on POSIX (#1631)

  • fixed "I/O operation on closed file" when using --diff (#1664)

  • Prevent coloured diff output being interleaved with multiple files (#1673)

... (truncated)

Commits

[![Dependabot compatibility score](https://api.dependabot.com/badges/compatibility_score?dependency-name=black&package-manager=pip&previous-version=20.8b1&new-version=21.4b1)](https://dependabot.com/compatibility-score/?dependency-name=black&package-manager=pip&previous-version=20.8b1&new-version=21.4b1) Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`. [//]: # (dependabot-automerge-start) [//]: # (dependabot-automerge-end) ---
Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR: - `@dependabot rebase` will rebase this PR - `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it - `@dependabot merge` will merge this PR after your CI passes on it - `@dependabot squash and merge` will squash and merge this PR after your CI passes on it - `@dependabot cancel merge` will cancel a previously requested merge and block automerging - `@dependabot reopen` will reopen this PR if it is closed - `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually - `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself) - `@dependabot use these labels` will set the current labels as the default for future PRs for this repo and language - `@dependabot use these reviewers` will set the current reviewers as the default for future PRs for this repo and language - `@dependabot use these assignees` will set the current assignees as the default for future PRs for this repo and language - `@dependabot use this milestone` will set the current milestone as the default for future PRs for this repo and language - `@dependabot badge me` will comment on this PR with code to add a ""Dependabot enabled"" badge to your readme Additionally, you can set the following in your Dependabot [dashboard](https://app.dependabot.com): - Update frequency (including time of day and day of week) - Pull request limits (per update run and/or open at any time) - Out-of-range updates (receive only lockfile updates, if desired) - Security updates (receive only security updates, if desired)
",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1311/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 869237023,MDExOlB1bGxSZXF1ZXN0NjI0NjM1NDQw,1309,Bump black from 20.8b1 to 21.4b0,27856297,dependabot-preview[bot],closed,0,,,,,2,2021-04-27T20:28:11Z,2021-04-28T18:26:06Z,2021-04-28T18:26:04Z,CONTRIBUTOR,simonw/datasette/pulls/1309,"Bumps [black](https://github.com/psf/black) from 20.8b1 to 21.4b0.
Release notes

Sourced from black's releases.

21.4b0

Black

  • Fixed a rare but annoying formatting instability created by the combination of optional trailing commas inserted by Black and optional parentheses looking at pre-existing "magic" trailing commas. This fixes issue #1629 and all of its many many duplicates. (#2126)

  • Black now processes one-line docstrings by stripping leading and trailing spaces, and adding a padding space when needed to break up """". (#1740)

  • Black now cleans up leading non-breaking spaces in comments (#2092)

  • Black now respects --skip-string-normalization when normalizing multiline docstring quotes (#1637)

  • Black no longer removes all empty lines between non-function code and decorators when formatting typing stubs. Now Black enforces a single empty line. (#1646)

  • Black no longer adds an incorrect space after a parenthesized assignment expression in if/while statements (#1655)

  • Added --skip-magic-trailing-comma / -C to avoid using trailing commas as a reason to split lines (#1824)

  • fixed a crash when PWD=/ on POSIX (#1631)

  • fixed "I/O operation on closed file" when using --diff (#1664)

  • Prevent coloured diff output being interleaved with multiple files (#1673)

  • Added support for PEP 614 relaxed decorator syntax on python 3.9 (#1711)

  • Added parsing support for unparenthesized tuples and yield expressions in annotated assignments (#1835)

  • use lowercase hex strings (#1692)

  • added --extend-exclude argument (PR #2005)

  • speed up caching by avoiding pathlib (#1950)

  • --diff correctly indicates when a file doesn't end in a newline (#1662)

  • Added --stdin-filename argument to allow stdin to respect --force-exclude rules (#1780)

  • Lines ending with fmt: skip will now be not formatted (#1800)

  • PR #2053: Black no longer relies on typed-ast for Python 3.8 and higher

... (truncated)

Changelog

Sourced from black's changelog.

21.4b0

Black

  • Fixed a rare but annoying formatting instability created by the combination of optional trailing commas inserted by Black and optional parentheses looking at pre-existing "magic" trailing commas. This fixes issue #1629 and all of its many many duplicates. (#2126)

  • Black now processes one-line docstrings by stripping leading and trailing spaces, and adding a padding space when needed to break up """". (#1740)

  • Black now cleans up leading non-breaking spaces in comments (#2092)

  • Black now respects --skip-string-normalization when normalizing multiline docstring quotes (#1637)

  • Black no longer removes all empty lines between non-function code and decorators when formatting typing stubs. Now Black enforces a single empty line. (#1646)

  • Black no longer adds an incorrect space after a parenthesized assignment expression in if/while statements (#1655)

  • Added --skip-magic-trailing-comma / -C to avoid using trailing commas as a reason to split lines (#1824)

  • fixed a crash when PWD=/ on POSIX (#1631)

  • fixed "I/O operation on closed file" when using --diff (#1664)

  • Prevent coloured diff output being interleaved with multiple files (#1673)

  • Added support for PEP 614 relaxed decorator syntax on python 3.9 (#1711)

  • Added parsing support for unparenthesized tuples and yield expressions in annotated assignments (#1835)

  • added --extend-exclude argument (PR #2005)

  • speed up caching by avoiding pathlib (#1950)

  • --diff correctly indicates when a file doesn't end in a newline (#1662)

  • Added --stdin-filename argument to allow stdin to respect --force-exclude rules (#1780)

  • Lines ending with fmt: skip will now be not formatted (#1800)

  • PR #2053: Black no longer relies on typed-ast for Python 3.8 and higher

... (truncated)

Commits

[![Dependabot compatibility score](https://api.dependabot.com/badges/compatibility_score?dependency-name=black&package-manager=pip&previous-version=20.8b1&new-version=21.4b0)](https://dependabot.com/compatibility-score/?dependency-name=black&package-manager=pip&previous-version=20.8b1&new-version=21.4b0) Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`. [//]: # (dependabot-automerge-start) [//]: # (dependabot-automerge-end) ---
Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR: - `@dependabot rebase` will rebase this PR - `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it - `@dependabot merge` will merge this PR after your CI passes on it - `@dependabot squash and merge` will squash and merge this PR after your CI passes on it - `@dependabot cancel merge` will cancel a previously requested merge and block automerging - `@dependabot reopen` will reopen this PR if it is closed - `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually - `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself) - `@dependabot use these labels` will set the current labels as the default for future PRs for this repo and language - `@dependabot use these reviewers` will set the current reviewers as the default for future PRs for this repo and language - `@dependabot use these assignees` will set the current assignees as the default for future PRs for this repo and language - `@dependabot use this milestone` will set the current milestone as the default for future PRs for this repo and language - `@dependabot badge me` will comment on this PR with code to add a ""Dependabot enabled"" badge to your readme Additionally, you can set the following in your Dependabot [dashboard](https://app.dependabot.com): - Update frequency (including time of day and day of week) - Pull request limits (per update run and/or open at any time) - Out-of-range updates (receive only lockfile updates, if desired) - Security updates (receive only security updates, if desired)
",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1309/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 866668415,MDU6SXNzdWU4NjY2Njg0MTU=,1308,"Columns named ""link"" display in bold",9599,simonw,closed,0,,,,,3,2021-04-24T05:58:11Z,2021-04-24T06:07:49Z,2021-04-24T06:07:49Z,OWNER,,Reported in office hours today.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1308/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 861331159,MDExOlB1bGxSZXF1ZXN0NjE4MDExOTc3,1303,"Update pytest-asyncio requirement from <0.15,>=0.10 to >=0.10,<0.16",27856297,dependabot-preview[bot],closed,0,,,,,1,2021-04-19T13:49:12Z,2021-04-19T18:18:17Z,2021-04-19T18:18:17Z,CONTRIBUTOR,simonw/datasette/pulls/1303,"Updates the requirements on [pytest-asyncio](https://github.com/pytest-dev/pytest-asyncio) to permit the latest version.
Commits

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`. [//]: # (dependabot-automerge-start) [//]: # (dependabot-automerge-end) ---
Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR: - `@dependabot rebase` will rebase this PR - `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it - `@dependabot merge` will merge this PR after your CI passes on it - `@dependabot squash and merge` will squash and merge this PR after your CI passes on it - `@dependabot cancel merge` will cancel a previously requested merge and block automerging - `@dependabot reopen` will reopen this PR if it is closed - `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually - `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself) - `@dependabot use these labels` will set the current labels as the default for future PRs for this repo and language - `@dependabot use these reviewers` will set the current reviewers as the default for future PRs for this repo and language - `@dependabot use these assignees` will set the current assignees as the default for future PRs for this repo and language - `@dependabot use this milestone` will set the current milestone as the default for future PRs for this repo and language - `@dependabot badge me` will comment on this PR with code to add a ""Dependabot enabled"" badge to your readme Additionally, you can set the following in your Dependabot [dashboard](https://app.dependabot.com): - Update frequency (including time of day and day of week) - Pull request limits (per update run and/or open at any time) - Out-of-range updates (receive only lockfile updates, if desired) - Security updates (receive only security updates, if desired)
",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1303/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 453131917,MDU6SXNzdWU0NTMxMzE5MTc=,502,Exporting sqlite database(s)?,7936571,chrismp,closed,0,,,,,3,2019-06-06T16:39:53Z,2021-04-03T05:16:54Z,2019-06-11T18:50:42Z,NONE,,"I'm working on datasette from one computer. But if I want to work on it from another computer and want to copy the SQLite database(s) already on the Heroku datasette instance, how to I copy the database(s) to the second computer so that I can then update it and push to online via datasette's command line code that pushes code to Heroku?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/502/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 849568079,MDExOlB1bGxSZXF1ZXN0NjA4MzIzMDI4,1290,Use pytest-xdist to speed up tests,9599,simonw,closed,0,,,,,1,2021-04-03T03:34:36Z,2021-04-03T03:42:29Z,2021-04-03T03:42:28Z,OWNER,simonw/datasette/pulls/1290,"Closes #1289, refs #1212.",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1290/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 849543502,MDU6SXNzdWU4NDk1NDM1MDI=,1289,Speed up tests with pytest-xdist,9599,simonw,closed,0,,,,,3,2021-04-03T00:47:39Z,2021-04-03T03:42:28Z,2021-04-03T03:42:28Z,OWNER,,"I think I can get this working for almost every test, then use the pattern in https://github.com/pytest-dev/pytest-xdist/issues/385#issuecomment-444545641 to opt specific tests out of being run in parallel.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1289/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 841456306,MDU6SXNzdWU4NDE0NTYzMDY=,1276,"Invalid SQL: ""no such table: pragma_database_list"" on database page",1314318,justinallen,closed,0,,,,,7,2021-03-26T00:03:53Z,2021-03-31T16:27:27Z,2021-03-28T23:52:31Z,NONE,,"Don't think this has been covered here yet. I'm a little stumped with this one and can't tell if it's a bug or I have something misconfigured. Oddly, when running locally the usual list of tables populates (i.e. at /charts a list of tables in charts.db). But when on the web server it throws an Invalid SQL error and ""no such table: pragma_database_list"" below. All the url endpoints seem to work fine aside from this - individual tables (/charts/chart_one), as well as stored queries (/charts/query_one). Not sure if this has anything to do with upgrading to Datasette 0.55, or something to do with our setup, which uses a metadata build script similar to [the one for the 538 server](https://github.com/simonw/fivethirtyeight-datasette/blob/main/make_metadata.py), or something else. 
",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1276/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 843739658,MDExOlB1bGxSZXF1ZXN0NjAzMDgyMjgw,1282,Fix little typo,192568,mroswell,closed,0,,,,,2,2021-03-29T19:45:28Z,2021-03-29T19:57:34Z,2021-03-29T19:57:34Z,CONTRIBUTOR,simonw/datasette/pulls/1282,,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1282/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 576722115,MDU6SXNzdWU1NzY3MjIxMTU=,696,Single failing unit test when run inside the Docker image,9599,simonw,closed,0,,,3268330,Datasette 1.0,2,2020-03-06T06:16:36Z,2021-03-29T17:04:19Z,2021-03-07T07:41:18Z,OWNER,,"``` docker run -it -v `pwd`:/mnt datasetteproject/datasette:latest /bin/bash root@0e1928cfdf79:/# cd /mnt root@0e1928cfdf79:/mnt# pip install -e .[test] root@0e1928cfdf79:/mnt# pytest ``` I get one failure! It was for `test_searchable[/fixtures/searchable.json?_search=te*+AND+do*&_searchmode=raw-expected_rows3]` ``` def test_searchable(app_client, path, expected_rows): response = app_client.get(path) > assert expected_rows == response.json[""rows""] E AssertionError: assert [[1, 'barry c...sel', 'puma']] == [] E Left contains 2 more items, first extra item: [1, 'barry cat', 'terry dog', 'panther'] E Full diff: E + [] E - [[1, 'barry cat', 'terry dog', 'panther'], E - [2, 'terry dog', 'sara weasel', 'puma']] ``` _Originally posted by @simonw in https://github.com/simonw/datasette/issues/695#issuecomment-595614469_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/696/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 724369025,MDExOlB1bGxSZXF1ZXN0NTA1NzY5NDYy,1031,Fallback to databases in inspect-data.json when no -i options are passed,299380,frankier,closed,0,,,,,6,2020-10-19T07:51:06Z,2021-03-29T01:46:45Z,2021-03-29T00:23:41Z,FIRST_TIME_CONTRIBUTOR,simonw/datasette/pulls/1031,Currenlty `Datasette.__init__` checks immutables against None to decide whether to fallback to inspect-data.json. This patch modifies the serve command to pass None when no -i options are passed so this fallback works correctly.,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1031/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 842881221,MDU6SXNzdWU4NDI4ODEyMjE=,1281,Latest Datasette tags missing from Docker Hub,9599,simonw,closed,0,,,,,7,2021-03-29T00:58:30Z,2021-03-29T01:41:48Z,2021-03-29T01:41:48Z,OWNER,,"Spotted this while testing https://github.com/simonw/datasette/issues/1249#issuecomment-808998719_ https://hub.docker.com/r/datasetteproject/datasette/tags?page=1&ordering=last_updated isn't showing the tags for any version more recent than 0.54.1 - we are up to 0.56 now. 
But the `:latest` tag is for the new 0.56 release.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1281/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 831163537,MDExOlB1bGxSZXF1ZXN0NTkyNTQ4MTAz,1260,Fix: code quality issues,25361949,withshubh,closed,0,,,,,2,2021-03-14T13:56:10Z,2021-03-29T00:22:41Z,2021-03-29T00:22:41Z,NONE,simonw/datasette/pulls/1260,"### Description Hi :wave: I work at [DeepSource](https://deepsource.io), I ran DeepSource analysis on the forked copy of this repo and found some interesting [code quality issues](https://deepsource.io/gh/withshubh/datasette/issues/?category=recommended) in the codebase, opening this PR so you can assess if our platform is right and helpful for you. ### Summary of changes - Replaced ternary syntax with if expression - Removed redundant `None` default - Used `is` to compare type of objects - Iterated dictionary directly - Removed unnecessary lambda expression - Refactored unnecessary `else` / `elif` when `if` block has a `return` statement - Refactored unnecessary `else` / `elif` when `if` block has a `raise` statement - Added .deepsource.toml to continuously analyze and detect code quality issues",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1260/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 807433181,MDU6SXNzdWU4MDc0MzMxODE=,1224,can't start immutable databases from configuration dir mode,295329,camallen,closed,0,,,,,0,2021-02-12T17:50:13Z,2021-03-29T00:17:31Z,2021-03-29T00:17:31Z,CONTRIBUTOR,,"Say I have a `/databases/` directory with multiple sqlite db files in that dir (`1.db` & `2.db`) and an `inspect-data.json` file. If I start datasette via `datasette -h 0.0.0.0 /databases/` then the resulting databases are set to `is_mutable: true` as inspected via http://127.0.0.1:8001/-/databases.json I don't want to have to list out the databases by name, e.g. `datasette -i /databases/1.db -i /databases/2.db` as i want the system to autodetect the sqlite dbs i have in the configuration directory According to the docs outlined in https://docs.datasette.io/en/latest/settings.html?highlight=immutable#configuration-directory-mode this should be possible > `inspect-data.json` the result of running datasette inspect - any database files listed here will be treated as immutable, so they should not be changed while Datasette is running I believe that if the `inspect-json.json` file present, then in theory the databases will be automatically set to immutable via this code https://github.com/simonw/datasette/blob/9603d893b9b72653895318c9104d754229fdb146/datasette/app.py#L211-L216 However it appears the Click Multiple Options will return a tuple via https://github.com/simonw/datasette/blob/9603d893b9b72653895318c9104d754229fdb146/datasette/cli.py#L311-L317 The resulting tuple is passed to the Datasette app via `kwargs` and overrides the behaviour to set the databases to immutable via this arg https://github.com/simonw/datasette/blob/9603d893b9b72653895318c9104d754229fdb146/datasette/app.py#L182 If you think this is a bug and needs fixing, I am willing to make a PR to check for the empty `immutable` tuple before calling the Datasette class initializer as I think leaving that class interface alone is the best path here. Thoughts? 
Also - i'm loving Datasette, it truly is a wonderful tool, thank you :)",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1224/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 763207948,MDU6SXNzdWU3NjMyMDc5NDg=,1141,Default styling for bullet point lists,9599,simonw,closed,0,,,,,0,2020-12-12T02:49:33Z,2021-03-29T00:14:05Z,2021-03-29T00:14:05Z,OWNER,,"I just noticed that https://datasette.io/content/recent_releases (which uses `datasette-render-markdown`) is missing its bullet points: ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1141/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 825217564,MDExOlB1bGxSZXF1ZXN0NTg3MzMyNDcz,1252,Add back styling to lists within table cells (fixes #1141),7476523,bobwhitelock,closed,0,,,,,2,2021-03-09T03:00:57Z,2021-03-29T00:14:04Z,2021-03-29T00:14:04Z,CONTRIBUTOR,simonw/datasette/pulls/1252,"This overrides the Datasette reset - see https://github.com/simonw/datasette/blob/d0fd833b8cdd97e1b91d0f97a69b494895d82bee/datasette/static/app.css#L35-L38 - to add back the default styling of list items displayed within Datasette table cells. Following this change, the same content as in the original issue looks like this: ![2021-03-09_02:57:32](https://user-images.githubusercontent.com/7476523/110411982-63e5ae80-8083-11eb-9b5c-e5dc825073e2.png) ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1252/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 842556944,MDExOlB1bGxSZXF1ZXN0NjAyMTA3OTM1,1279,Minor Docs Update. Added `--app` to fly install command.,1019791,koaning,closed,0,,,,,2,2021-03-27T16:58:08Z,2021-03-29T00:11:55Z,2021-03-29T00:11:55Z,CONTRIBUTOR,simonw/datasette/pulls/1279,"Without this flag, there's an error locally. ``` > datasette publish fly bigmac.db Usage: datasette publish fly [OPTIONS] [FILES]... Try 'datasette publish fly --help' for help. Error: Missing option '-a' / '--app'. ``` I also got an error message which later turned out to be because I hadn't added my credit card information yet to `fly`. I wasn't sure if I should add that mention to the docs here, or to submit a bug-report over at https://github.com/simonw/datasette-publish-fly. 
",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1279/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 842212586,MDU6SXNzdWU4NDIyMTI1ODY=,1277,Facet by array breaks if table name contains a space,9599,simonw,closed,0,,,,,1,2021-03-26T18:38:19Z,2021-03-27T03:49:38Z,2021-03-27T03:49:34Z,OWNER,,It breaks when you try to select a filtered item.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1277/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 842062949,MDU6SXNzdWU4NDIwNjI5NDk=,252,Support json-line files,279769,rathboma,closed,0,,,,,1,2021-03-26T15:19:39Z,2021-03-26T15:21:38Z,2021-03-26T15:21:38Z,NONE,,"It's common for many processes to create flat files where each line is a JSON object. So the file isn't a json array. Many tools (like jq) support this natively, it'd be great for sqlite-utils to also!",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/252/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 839008371,MDU6SXNzdWU4MzkwMDgzNzE=,1274,Might there be some way to comment metadata.json?,192568,mroswell,closed,0,,,,,2,2021-03-23T18:33:00Z,2021-03-23T20:14:54Z,2021-03-23T20:14:54Z,CONTRIBUTOR,,"I don't know what license to use... Would be nice to be able to add a comment regarding that uncertainty in my metadata.json file I like laktak's little video comment in favor of Human json (Hjson) https://stackoverflow.com/questions/244777/can-comments-be-used-in-json Hmmm... one of the commenters there said comments are allowed in yaml... so that's a good argument for yaml. Anyhow, just came to mind, and thought I'd mention it here. 
Looks like https://hjson.github.io/ has the details.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1274/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 280013907,MDU6SXNzdWUyODAwMTM5MDc=,164,datasette skeleton command for kick-starting database and table metadata,9599,simonw,closed,0,,,2949431,Custom templates edition,3,2017-12-07T06:13:28Z,2021-03-23T02:45:12Z,2017-12-07T06:20:45Z,OWNER,,Generates an example `metadata.json` file populated with all of the databases and tables inspected from the specified databases.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/164/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 279547886,MDU6SXNzdWUyNzk1NDc4ODY=,163,Document the querystring argument for setting a different time limit,9599,simonw,closed,0,,,,,2,2017-12-05T22:05:08Z,2021-03-23T02:44:33Z,2017-12-06T15:06:57Z,OWNER,,"http://datasette.readthedocs.io/en/latest/sql_queries.html#query-limits Need to explain why this is useful too.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/163/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 273775212,MDU6SXNzdWUyNzM3NzUyMTI=,88,Add NHS England Hospitals example to wiki,15543,tomdyson,closed,0,,,,,4,2017-11-14T12:29:10Z,2021-03-22T23:46:36Z,2017-11-14T22:54:06Z,CONTRIBUTOR,,"https://nhs-england-hospitals.now.sh and an associated map visualisation: http://run.plnkr.co/preview/cj9zlf1qc0003414y90ajkwpk/ Datasette is wonderful! ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/88/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 837348479,MDU6SXNzdWU4MzczNDg0Nzk=,1269,Don't attempt to run count(*) against virtual tables,9599,simonw,closed,0,,,,,2,2021-03-22T05:57:43Z,2021-03-22T17:40:42Z,2021-03-22T17:40:41Z,OWNER,,"Counting the rows in a virtual table doesn't seem very interesting to me, and it's the cause of at least one crashing bug with SpatiaLite 5.0 on Linux, see https://github.com/simonw/datasette/issues/1268",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1269/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 837208901,MDU6SXNzdWU4MzcyMDg5MDE=,1267,Update Datasette alternativeto listening with details,921217,RayBB,closed,0,,,,,1,2021-03-21T23:20:20Z,2021-03-22T04:37:26Z,2021-03-22T04:37:26Z,NONE,,"Hello, I recently learned about Datasette from an old hackernews post. It seems like an awesome project and I actually have use case I might be trying out in the coming months. Alas, to get a better understanding of your project I looked it up on alternativeto to see what it is similar too. I promise it's not spam, it's reputable enough to have a [Wikipedia](https://en.wikipedia.org/wiki/AlternativeTo) page. There was no listing on the website so I went ahead and created a listing that is now approved. I encourage anyone who likes this project and hopes to spread the word to help update the listing by: 1. 
Adding to the list of software it compares to 2. Uploading screenshots 3. Writing a review 4. Adding ""features"" I know this may seem spammy but I promise I have no affiliation with alternativeto I'm just a happy user and know it's a popular site for discovering software. Here is the listing for datasette: https://alternativeto.net/software/datasette/about/ Cheers",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1267/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 836963850,MDU6SXNzdWU4MzY5NjM4NTA=,249,Full text search possibly broken?,36287,prabhur,closed,0,,,,,2,2021-03-21T02:03:44Z,2021-03-21T02:43:32Z,2021-03-21T02:43:32Z,NONE,,"I'm not quite sure if this is an issue with sqlite-utils or datasette. **Background** I was previously using sqlite-utils version < 3.6. I have a bunch of csv files that have some data scraped from a website. ``` sqlite-utils create-table mydb.db post \ posted_date text \ url text \ title text \ raw_text text \ --not-null posted_date \ --not-null url \ --pk=url ``` FTS is enabled via `sqlite-utils enable-fts ./mydb.db post title raw_text` Data is loaded to the table via `sqlite-utils insert ./mydb.db post ${filename} --csv` Note that the data contains text in my language Tamil. Loading happens just fine. datasette serves the db file just fine. It recognizes FTS and shows the ""search"" box. However, none of the queries work. Whatever text I supply, it always returns 0 rows. I literally copy paste words from the row listing on the screen and paste it on the search box. Interestingly, only thing I can remember is switching to sqlite-utils 3.6. I had to do this because the prior version had an issue with column size. I have attached one of the csv files that can be loaded to the table. Substitute ""${filename}"" with that file for the sqlite-utils insert command. [posts_20200417-20201231.csv.zip](https://github.com/simonw/sqlite-utils/files/6176697/posts_20200417-20201231.csv.zip) Interestingly, the FTS based search from datasette worked just fine before this version upgrade. That is, the queries returned results. I will try to downgrade just to see if the theory is correct. I appreciate any help here. Thanks. ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/249/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 832092321,MDU6SXNzdWU4MzIwOTIzMjE=,1261,Some links aren't properly URL encoded.,812795,brimstone,closed,0,,,,,3,2021-03-15T18:43:59Z,2021-03-21T02:06:44Z,2021-03-20T21:36:06Z,NONE,,"It seems like a percent sign in the query causes some links to end invalid. The json and CSV links on this page don't behave like expected: https://honeypot-brimston3.vercel.app/honeypot?sql=select+time%2C+count%28time%29+as+count+from+%28select+strftime%28%22%25Y-%25m-%25d%22%2C+_etime%29+as+time+from+ssh+%29+group+by+time+order+by+time%3B I can take a swing at trying to fix this, but my python isn't strong and I need a pointer at the right approach and files to change. 
Thanks!",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1261/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 836273891,MDU6SXNzdWU4MzYyNzM4OTE=,1266,Documentation for Response.asgi_send(send) method,9599,simonw,closed,0,,,,,1,2021-03-19T18:52:49Z,2021-03-20T21:35:00Z,2021-03-20T21:32:28Z,OWNER,,"I found myself wanting to use this method for https://github.com/simonw/datasette-auth-passwords/issues/15 - but it's not documented. It should be documented. https://github.com/simonw/datasette/blob/8e18c7943181f228ce5ebcea48deb59ce50bee1f/datasette/utils/asgi.py#L320-L340",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1266/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 836123030,MDU6SXNzdWU4MzYxMjMwMzA=,1265,Support for HTTP Basic Authentication,468612,yunzheng,closed,0,,,,,3,2021-03-19T15:31:09Z,2021-03-19T22:05:12Z,2021-03-19T21:03:09Z,NONE,,"It would be nice if datasette could support [HTTP Basic Authentication](https://en.wikipedia.org/wiki/Basic_access_authentication). For now I could ofcourse leverage Nginx for basic authentication, but it would be nice to have support for this in datasette by default or via a plugin like datasette-auth-github. My main usecase is to put the whole datasette instance behind a username/password prompt via Basic Auth and not specific urls.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1265/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 824067604,MDU6SXNzdWU4MjQwNjc2MDQ=,1250,Research: Plugin hook for alternative database connections,9599,simonw,closed,0,,,,,2,2021-03-08T00:28:15Z,2021-03-12T01:01:25Z,2021-03-12T01:01:17Z,OWNER,,"The `Database` class is a natural looking fit for a plugin hook to load custom database connections... potentially even databases other than SQLite. DuckDB (refs #968) could make for a great starting point, since it looks very compatible with the existing SQLite code. The real win would be if this could lead to running Datasette against PostgreSQL. I made some initial explorations in that direction a while ago in #670.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1250/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 797649915,MDExOlB1bGxSZXF1ZXN0NTY0NjA4MjY0,1211,Use context manager instead of plain open,4488943,kbaikov,closed,0,,,,,3,2021-01-31T07:58:10Z,2021-03-11T16:15:50Z,2021-03-11T16:15:50Z,CONTRIBUTOR,simonw/datasette/pulls/1211,"Context manager with open closes the files after usage. Fixes: https://github.com/simonw/datasette/issues/1208 When the object is already a pathlib.Path i used read_text write_text functions In some cases pathlib.Path.open were used in context manager, it is basically the same as builtin open. 
Tests are passing: 850 passed, 5 xfailed, 10 xpassed",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1211/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 794554881,MDU6SXNzdWU3OTQ1NTQ4ODE=,1208,A lot of open(file) functions are used without a context manager thus producing ResourceWarning: unclosed file <_io.TextIOWrapper,4488943,kbaikov,closed,0,,,,,2,2021-01-26T20:56:28Z,2021-03-11T16:15:49Z,2021-03-11T16:15:49Z,CONTRIBUTOR,,"Your code is full of open files that are never closed, especially when you deal with reading/writing json/yaml files. If you run python with warnings enabled this problem becomes evident. This probably contributes to some memory leaks in long running datasettes if the GC will not 'collect' those resources properly. This is easily fixed by using a context manager instead of just using open: ```python with open('some_file', 'w') as opened_file: opened_file.write('string') ``` In some newer parts of the code you use Path objects 'read_text' and 'write_text' functions which close the file properly and are prefered in some cases. If you want I can create a PR for all places i found this pattern in. Bellow is a fraction of places where i found a ResourceWarning: ```python update-docs-help.py: 20 actual = actual.replace(""Usage: cli "", ""Usage: datasette "") 21: open(docs_path / filename, ""w"").write(actual) 22 datasette\app.py: 210 ): 211: inspect_data = json.load((config_dir / ""inspect-data.json"").open()) 212 if immutables is None: 266 if config_dir and (config_dir / ""settings.json"").exists() and not config: 267: config = json.load((config_dir / ""settings.json"").open()) 268 self._settings = dict(DEFAULT_SETTINGS, **(config or {})) 445 self._app_css_hash = hashlib.sha1( 446: open(os.path.join(str(app_root), ""datasette/static/app.css"")) 447 .read() datasette\cli.py: 130 else: 131: out = open(inspect_file, ""w"") 132 loop = asyncio.get_event_loop() 459 if inspect_file: 460: inspect_data = json.load(open(inspect_file)) 461 ``` ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1208/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 826613352,MDExOlB1bGxSZXF1ZXN0NTg4NjAxNjI3,1254,Update Docker Spatialite version to 5.0.1 + add support for Spatialite topology functions,3200608,durkie,closed,0,,,,,6,2021-03-09T20:49:08Z,2021-03-10T18:27:45Z,2021-03-09T22:04:23Z,NONE,simonw/datasette/pulls/1254,"This requires adding the RT Topology library (Spatialite changed to RT Topology from LWGEOM between 4.4 and 5.0), as well as upgrading the GEOS version (which is the reason for switching to `python:3.7.10-slim-buster` as the base image.) 
`autoconf` and `libtool` are added to build RT Topology, and Spatialite is now built with `--disable-minizip` (minizip wasn't an option in 4.4 and I didn't want to add another dependency) and `--disable-dependency-tracking` which, according to Spatialite, ""speeds up one-time builds""",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1254/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 827341657,MDExOlB1bGxSZXF1ZXN0NTg5MjYzMjk3,1256,Minor type in IP adress,6371750,JBPressac,closed,0,,,,,3,2021-03-10T08:28:22Z,2021-03-10T18:26:46Z,2021-03-10T18:26:40Z,CONTRIBUTOR,simonw/datasette/pulls/1256,127.0.01 replaced by 127.0.0.1,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1256/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 823035080,MDU6SXNzdWU4MjMwMzUwODA=,1248,duckdb database (very low performance in SQLite),15836677,verajosemanuel,closed,0,,,,,1,2021-03-05T12:20:29Z,2021-03-08T00:25:27Z,2021-03-08T00:25:27Z,NONE,,"My sqlite is getting too big to be processed by datasette (more than 10 minutes waiting to load) so I am working with duckdb and is waaaaay faster. I think the fastest embeddable database actually. https://duckdb.org/ Taking into account DuckDb is SQLite based it would be GREAT to use it with datasette. is that possible? Regards and thanks for a superb job",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1248/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 806918878,MDExOlB1bGxSZXF1ZXN0NTcyMjU0MTAz,1223,Add compile option to Dockerfile to fix failing test (fixes #696),7476523,bobwhitelock,closed,0,,,,,2,2021-02-12T03:38:05Z,2021-03-07T12:01:12Z,2021-03-07T07:41:17Z,CONTRIBUTOR,simonw/datasette/pulls/1223,"This test was failing when run inside the Docker container: `test_searchable[/fixtures/searchable.json?_search=te*+AND+do*&_searchmode=raw-expected_rows3]`, with this error: ``` def test_searchable(app_client, path, expected_rows): response = app_client.get(path) > assert expected_rows == response.json[""rows""] E AssertionError: assert [[1, 'barry c...sel', 'puma']] == [] E Left contains 2 more items, first extra item: [1, 'barry cat', 'terry dog', 'panther'] E Full diff: E + [] E - [[1, 'barry cat', 'terry dog', 'panther'], E - [2, 'terry dog', 'sara weasel', 'puma']] ``` The issue was that the version of sqlite3 built inside the Docker container was built with FTS3 and FTS4 enabled, but without the `SQLITE_ENABLE_FTS3_PARENTHESIS` compile option passed, which adds support for using `AND` and `NOT` within `match` expressions (see https://sqlite.org/fts3.html#compiling_and_enabling_fts3_and_fts4 and https://www.sqlite.org/compile.html). Without this, the `AND` used in the search in this test was being interpreted as a literal string, and so no matches were found. Adding this compile option fixes this. --- I actually ran into this issue because the same test was failing when I ran the test suite on my own machine, outside of Docker, and so I eventually tracked this down to my system sqlite3 also being compiled without this option. 
I wonder if this is a sign of a slightly deeper issue, that Datasette can silently behave differently based on the version and compilation of sqlite3 it is being used with. On my own system I fixed the test suite by running `pip install pysqlite3-binary`, so that this would be picked up instead of the `sqlite` package, as this seems to be compiled using this option, . Maybe using `pysqlite3-binary` could be installed/recommended by default so a more deterministic version of sqlite is used? Or there could be some feature detection done on the available sqlite version, to know what features are available and can be used/tested?",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1223/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 815955014,MDExOlB1bGxSZXF1ZXN0NTc5Njk3ODMz,1243,fix small typo,306240,UtahDave,closed,0,,,,,2,2021-02-25T00:22:34Z,2021-03-04T05:46:10Z,2021-03-04T05:46:10Z,CONTRIBUTOR,simonw/datasette/pulls/1243,,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1243/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 818430405,MDU6SXNzdWU4MTg0MzA0MDU=,1247,datasette.add_memory_database() method,9599,simonw,closed,0,,,,,2,2021-03-01T03:48:38Z,2021-03-01T04:02:26Z,2021-03-01T04:02:26Z,OWNER,,"I just wrote this code: https://github.com/simonw/datasette/blob/47eb885cc2c3aafa03645c330c6f597bee9b3b25/tests/test_facets.py#L334-L335 It would be nice if you didn't have to separately instantiate a database object here.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1247/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 817597268,MDU6SXNzdWU4MTc1OTcyNjg=,1246,Suggest for ArrayFacet possibly confused by blank values,9599,simonw,closed,0,,,,,3,2021-02-26T19:11:52Z,2021-03-01T03:46:11Z,2021-03-01T03:46:11Z,OWNER,,I sometimes don't get the suggestion for facet-by-array for columns that contain arrays. I think it may be because they have empty spaces in them - or perhaps it's because the null detection doesn't actually work.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1246/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 718259202,MDU6SXNzdWU3MTgyNTkyMDI=,1005,Remove xfail tests when new httpx is released,9599,simonw,closed,0,,,3268330,Datasette 1.0,3,2020-10-09T16:00:19Z,2021-02-28T22:41:08Z,2021-02-28T22:41:08Z,OWNER,,"> My `httpx` pull request adding `raw_path` support was just merged: https://github.com/encode/httpx/pull/1357 - but it's not in a release yet. > > I'm going to mark these tests as `xfail` so I can land this change - I'll remove that once an `httpx` release comes out that I can use to get the tests passing. 
> _Originally posted by @simonw in https://github.com/simonw/datasette/pull/1000#issuecomment-706263157_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1005/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 814591962,MDU6SXNzdWU4MTQ1OTE5NjI=,1240,Allow facetting on custom queries,7107523,Kabouik,closed,0,,,,,3,2021-02-23T15:52:19Z,2021-02-26T18:19:46Z,2021-02-26T18:18:18Z,NONE,,"Facets are a tremendously useful feature, especially for people peeking at the database for the first time and still having little knowledge about the details of the data. It is of great assistance to discover interesting features to explore futher in advanced queries. Yet, it seems it's impossible to use facets when running a custom SQL query, be it from the little gear icons in column names, the facet suggestions at the top (hidden when performing a custom query), or by appending a facet code to the URL. Is there a technical limitation, or is this something that could be unlocked easily?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1240/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 817528452,MDU6SXNzdWU4MTc1Mjg0NTI=,1244,Plugin tip: look at the examples linked from the hooks page,9599,simonw,closed,0,,,,,1,2021-02-26T17:18:27Z,2021-02-26T17:30:38Z,2021-02-26T17:27:15Z,OWNER,,"Someone asked ""what are good example plugins I can look at?"" and I realized that the answer is to look through the example links on https://docs.datasette.io/en/stable/plugin_hooks.html - but that tip should be written down somewhere on the https://docs.datasette.io/en/stable/writing_plugins.html page.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1244/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 815554385,MDU6SXNzdWU4MTU1NTQzODU=,237,"db[""my_table""].drop(ignore=True) parameter, plus sqlite-utils drop-table --ignore and drop-view --ignore",649467,mhalle,closed,0,,,,,3,2021-02-24T14:55:06Z,2021-02-25T17:11:41Z,2021-02-25T17:11:41Z,NONE,,"When I'm generating a derived table in python, I often drop the table and create it from scratch. However, the first time I generate the table, it doesn't exist, so the drop raises an exception. That means more boilerplate. I was going to submit a pull request that adds an ""if_exists"" option to the `drop` method of tables and views. However, for a utility like sqlite_utils, perhaps the ""IF EXISTS"" SQL semantics is what you want most of the time, and thus should be the default. 
What do you think?",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/237/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 816523763,MDU6SXNzdWU4MTY1MjM3NjM=,238,.add_foreign_key() corrupts database if column contains a space,9599,simonw,closed,0,,,,,1,2021-02-25T15:07:20Z,2021-02-25T16:54:02Z,2021-02-25T16:54:02Z,OWNER,,"I ran this: db[""Reports""].add_foreign_key(""Reported by ID"", ""Reporters"", ""id"") And got this: ``` ~/jupyter-venv/lib/python3.9/site-packages/sqlite_utils/db.py in add_foreign_keys(self, foreign_keys) 616 # Have to VACUUM outside the transaction to ensure .foreign_keys property 617 # can see the newly created foreign key. --> 618 self.vacuum() 619 620 def index_foreign_keys(self): ~/jupyter-venv/lib/python3.9/site-packages/sqlite_utils/db.py in vacuum(self) 629 630 def vacuum(self): --> 631 self.execute(""VACUUM;"") 632 633 ~/jupyter-venv/lib/python3.9/site-packages/sqlite_utils/db.py in execute(self, sql, parameters) 234 return self.conn.execute(sql, parameters) 235 else: --> 236 return self.conn.execute(sql) 237 238 def executescript(self, sql): DatabaseError: database disk image is malformed ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/238/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 816560819,MDU6SXNzdWU4MTY1NjA4MTk=,240,table.pks_and_rows_where() method returning primary keys along with the rows,9599,simonw,closed,0,,,,,7,2021-02-25T15:49:28Z,2021-02-25T16:39:23Z,2021-02-25T16:28:23Z,OWNER,,"*Original title: Easier way to update a row returned from .rows* Here's a surprisingly hard problem I ran into while trying to implement #239 - given a row returned by `db[table].rows` how can you update that row? The problem is that the `db[table].update(...)` method requires a primary key. But if you have a row from the `db[table].rows` iterator it might not even contain the primary key - provided the table is a `rowid` table. Instead, currently, you need to introspect the table and, if `rowid` is a primary key, explicitly include that in the `select=` argument to `table.rows_where(...)` - otherwise it will not be returned. 
A utility mechanism to make this easier would be very welcome.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/240/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 813978858,MDU6SXNzdWU4MTM5Nzg4NTg=,1239,JSON filter fails if column contains spaces,9599,simonw,closed,0,,,,,1,2021-02-23T00:18:07Z,2021-02-23T00:22:53Z,2021-02-23T00:22:53Z,OWNER,,"Got this exception: `ERROR: conn=, sql = 'select Address, Affiliation, County, [Has Report], [Latest report notes], [Latest report yes], Latitude, [Location Type], Longitude, Name, id, [Appointment scheduling instructions], [Availability Info], [Latest report] from locations where rowid in (\n select locations.rowid from locations, json_each(locations.Availability Info) j\n where j.value = :p0\n ) and ""Latest report yes"" = :p1 order by id limit 101', params = {'p0': 'Yes: appointment required', 'p1': '1'}: near ""Info"": syntax error`",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1239/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 783778672,MDU6SXNzdWU3ODM3Nzg2NzI=,220,Better error message for *_fts methods against views,649467,mhalle,closed,0,,,,,3,2021-01-11T23:24:00Z,2021-02-22T20:44:51Z,2021-02-14T22:34:26Z,NONE,,"enable_fts and its related methods only work on tables, not views. Could those methods and possibly others move up to the Queryable superclass? ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/220/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 797651831,MDU6SXNzdWU3OTc2NTE4MzE=,1212,Tests are very slow. ,4488943,kbaikov,closed,0,,,,,4,2021-01-31T08:06:16Z,2021-02-19T22:54:13Z,2021-02-19T22:54:13Z,CONTRIBUTOR,,"Working on my PR i noticed that tests are very slow. The plain pytest run took about 37 minutes for me. However i could shave of about 10 minutes from that if i used pytest-xdist to parallelize execution. `pytest -n 8` is run only in 28 minutes on my machine. I can create a PR to mention that in your documentation. This will be a simple change to add pytest-xdist to requirements and change a command to run pytest in documentation. Does that make sense to you? After a bit more investigation it looks like python-xdist is not an answer. It creates a race condition for tests that try to clead temp dir before run. Profiling shows that most time is spent on conn.executescript(TABLES) in make_app_client function. Which makes sense. Perhaps the better approach would be look at the app_client fixture which is already session scoped, but not used by all test cases. And/or use conn = sqlite3.connect("":memory:"") which is much faster. And/or truncate tables after each TC instead of deleting the file and re-creating them. I can take a look which is the best approach if you give the go-ahead. 
",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1212/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 811680502,MDU6SXNzdWU4MTE2ODA1MDI=,236,--attach command line option for attaching extra databases,9599,simonw,closed,0,,,,,1,2021-02-19T04:38:30Z,2021-02-19T05:10:41Z,2021-02-19T05:08:43Z,OWNER,,"This will enable cross-database joins, as seen in https://github.com/simonw/datasette/issues/283 Also refs #113",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/236/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 621286870,MDU6SXNzdWU2MjEyODY4NzA=,113,Syntactic sugar for ATTACH DATABASE,9599,simonw,closed,0,,,,,2,2020-05-19T21:10:00Z,2021-02-19T05:09:12Z,2021-02-19T04:56:36Z,OWNER,,"https://www.sqlite.org/lang_attach.html Maybe something like this: ```python db.attach(""other_db"", ""other_db.db"") ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/113/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 811589344,MDU6SXNzdWU4MTE1ODkzNDQ=,1235,Upgrade Python version used by official Datasette Docker image,9599,simonw,closed,0,,,,,2,2021-02-19T00:47:40Z,2021-02-19T01:48:31Z,2021-02-19T01:48:30Z,OWNER,,"Currently uses 3.7.2: https://github.com/simonw/datasette/blob/73bed175631a79e13a521eee82f8451dd0477eb3/Dockerfile#L1 There's a security fix for Python which it would be good to ship in this image (even though I'm reasonably confident it doesn't affect Datasette): https://bugs.python.org/issue42938",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1235/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 811407131,MDExOlB1bGxSZXF1ZXN0NTc1OTQwMTkz,1232,--crossdb option for joining across databases,9599,simonw,closed,0,,,,,8,2021-02-18T19:48:50Z,2021-02-18T22:09:13Z,2021-02-18T22:09:12Z,OWNER,simonw/datasette/pulls/1232,"Refs #283. Still needs: - [x] Unit test for --crossdb queries - [x] Show warning on console if it truncates at ten databases (or on web interface) - [x] Show connected databases on the `/_memory` database page - [x] Documentation - [x] https://latest.datasette.io/ demo should demonstrate this feature",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1232/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 808843401,MDU6SXNzdWU4MDg4NDM0MDE=,1226,--port option should validate port is between 0 and 65535,9599,simonw,closed,0,,,,,4,2021-02-15T22:01:33Z,2021-02-18T18:41:27Z,2021-02-18T18:41:27Z,OWNER,,"Currently throws an ugly error message: ``` (datasette-graphql) datasette-graphql % datasette fivethirtyeight.db -p 80094 INFO: Started server process [45497] INFO: Waiting for application startup. INFO: Application startup complete. Traceback (most recent call last): File ""/Users/simon/.local/share/virtualenvs/datasette-graphql-n1OSJCS8/bin/datasette"", line 8, in sys.exit(cli()) ... 
server = await loop.create_server( File ""/Users/simon/.pyenv/versions/3.8.2/lib/python3.8/asyncio/base_events.py"", line 1461, in create_server sock.bind(sa) OverflowError: bind(): port must be 0-65535. ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1226/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 807174161,MDU6SXNzdWU4MDcxNzQxNjE=,227,Error reading csv files with large column data,295329,camallen,closed,0,,,,,4,2021-02-12T11:51:47Z,2021-02-16T11:48:03Z,2021-02-14T21:17:19Z,NONE,,"*Feel free to close this issue - I mostly added it for reference for future folks that run into this :)* I have a CSV file with one column that has very long strings. When i try to import this file via the `insert` command I get the following error: ``` sqlite-utils insert database.db table_name file_with_large_column.csv Traceback (most recent call last): File ""/usr/local/bin/sqlite-utils"", line 10, in sys.exit(cli()) File ""/usr/local/lib/python3.7/site-packages/click/core.py"", line 829, in __call__ return self.main(*args, **kwargs) File ""/usr/local/lib/python3.7/site-packages/click/core.py"", line 782, in main rv = self.invoke(ctx) File ""/usr/local/lib/python3.7/site-packages/click/core.py"", line 1259, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/usr/local/lib/python3.7/site-packages/click/core.py"", line 1066, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/usr/local/lib/python3.7/site-packages/click/core.py"", line 610, in invoke return callback(*args, **kwargs) File ""/usr/local/lib/python3.7/site-packages/sqlite_utils/cli.py"", line 774, in insert default=default, File ""/usr/local/lib/python3.7/site-packages/sqlite_utils/cli.py"", line 705, in insert_upsert_implementation docs, pk=pk, batch_size=batch_size, alter=alter, **extra_kwargs File ""/usr/local/lib/python3.7/site-packages/sqlite_utils/db.py"", line 1852, in insert_all first_record = next(records) File ""/usr/local/lib/python3.7/site-packages/sqlite_utils/cli.py"", line 703, in docs = (decode_base64_values(doc) for doc in docs) File ""/usr/local/lib/python3.7/site-packages/sqlite_utils/cli.py"", line 681, in docs = (dict(zip(headers, row)) for row in reader) _csv.Error: field larger than field limit (131072) ``` Built with the docker image `datasetteproject/datasette:0.54` with the following versions: ``` # sqlite-utils --version sqlite-utils, version 3.4.1 # datasette --version datasette, version 0.54 ``` It appears this is a [known issue](https://stackoverflow.com/a/54517228/2761423) reading in csv files in python and [doesn't look to be modifiable](https://github.com/python/cpython/blob/ea46579067fd2d4e164d6605719ffec690c4d621/Modules/_csv.c#L1685) through system / env vars (i may be very wrong on this). Noting that using sqlite3 `import` command work without error (not using the python csv reader) ``` sqlite3 database.db sqlite> .mode csv sqlite> .import file_with_large_column.csv table_name ``` Sadly I couldn't see an easy way around this while using the cli as it appears this value needs to be changed in python code. FWIW I've switched to using https://datasette.io/tools/csvs-to-sqlite for importing csv data and it's working well. 
Finally, I'm loving https://datasette.io/ thank you very much for an amazing tool and data ecosytem 🙇‍♀️ ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/227/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 807817197,MDU6SXNzdWU4MDc4MTcxOTc=,229,Hitting `_csv.Error: field larger than field limit (131072)`,631242,frosencrantz,closed,0,,,,,3,2021-02-13T19:52:44Z,2021-02-14T21:33:33Z,2021-02-14T21:33:33Z,NONE,,"I have a csv file where one of the fields is so large it is throwing an exception with this error and stops loading: ``` _csv.Error: field larger than field limit (131072) ``` The stack trace occurs here: https://github.com/simonw/sqlite-utils/blob/3.1/sqlite_utils/cli.py#L633 There is a way to handle this that helps: https://stackoverflow.com/questions/15063936/csv-error-field-larger-than-field-limit-131072 One issue I had with this problem was sqlite-utils only provides limited context as to where the problem line is. There is the progress bar, but that is by percent rather than by line number. It would have been helpful if it could have provided a line number. Also, it would have been useful if it had allowed the loading to continue with later lines. ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/229/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 808008305,MDU6SXNzdWU4MDgwMDgzMDU=,230,--sniff option for sniffing delimiters,9599,simonw,closed,0,,,,,8,2021-02-14T17:43:54Z,2021-02-14T21:15:33Z,2021-02-14T19:24:32Z,OWNER,,"> I just spotted that `csv.Sniffer` in the Python standard library has a `.has_header(sample)` method which detects if the first row appears to be a header or not, which is interesting. https://docs.python.org/3/library/csv.html#csv.Sniffer _Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/228#issuecomment-778812050_",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/230/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 797159961,MDExOlB1bGxSZXF1ZXN0NTY0MjE1MDEx,225,fix for problem in Table.insert_all on search for columns per chunk of rows,261237,nieuwenhoven,closed,0,,,,,2,2021-01-29T20:16:07Z,2021-02-14T21:04:13Z,2021-02-14T21:04:13Z,NONE,simonw/sqlite-utils/pulls/225,"Hi, I ran into a problem when trying to create a database from my Apple Healthkit data using [healthkit-to-sqlite](https://github.com/dogsheep/healthkit-to-sqlite). The program crashed because of an invalid insert statement that was generated for table `rDistanceCycling`. The actual problem turned out to be in [sqlite-utils](https://github.com/simonw/sqlite-utils). `Table.insert_all` processes the data to be inserted in chunks of rows and checks for every chunk which columns are used, and it will collect all column names in the variable `all_columns`. The collection of columns is done using a nested list comprehension that is not completely correct. I'm using a Windows machine and had to make a few adjustments to the tests in order to be able to run them because they had a posix dependency. 
Thanks, kind regards, Frans ``` # this is a (condensed) chunk of data from my Apple healthkit export that caused the problem. # the 3 last items in the chunk have additional keys: metadata_HKMetadataKeySyncVersion and metadata_HKMetadataKeySyncIdentifier chunk = [{'sourceName': 'AppleÂ\xa0Watch van Frans', 'sourceVersion': '7.0.1', 'device': '<, name:Apple Watch, manufacturer:Apple Inc., model:Watch, hardware:Watch3,4, software:7.0.1>', 'unit': 'km', 'creationDate': '2020-10-10 12:29:09 +0100', 'startDate': '2020-10-10 12:29:06 +0100', 'endDate': '2020-10-10 12:29:07 +0100', 'value': '0.00518016'}, {'sourceName': 'AppleÂ\xa0Watch van Frans', 'sourceVersion': '7.0.1', 'device': '<, name:Apple Watch, manufacturer:Apple Inc., model:Watch, hardware:Watch3,4, software:7.0.1>', 'unit': 'km', 'creationDate': '2020-10-10 12:29:10 +0100', 'startDate': '2020-10-10 12:29:07 +0100', 'endDate': '2020-10-10 12:29:08 +0100', 'value': '0.00544049'}, {'sourceName': 'AppleÂ\xa0Watch van Frans', 'sourceVersion': '6.2.6', 'device': '<, name:Apple Watch, manufacturer:Apple Inc., model:Watch, hardware:Watch3,4, software:6.2.6>', 'unit': 'km', 'creationDate': '2020-10-14 05:54:12 +0100', 'startDate': '2020-07-15 16:40:50 +0100', 'endDate': '2020-07-15 16:42:49 +0100', 'value': '0.952092', 'metadata_HKMetadataKeySyncVersion': '1', 'metadata_HKMetadataKeySyncIdentifier': '3:674DBCDB-3FE8-40D1-9FC1-E54A2B413805:616520450.99823:616520569.99360:119'}, {'sourceName': 'AppleÂ\xa0Watch van Frans', 'sourceVersion': '6.2.6', 'device': '<, name:Apple Watch, manufacturer:Apple Inc., model:Watch, hardware:Watch3,4, software:6.2.6>', 'unit': 'km', 'creationDate': '2020-10-14 05:54:12 +0100', 'startDate': '2020-07-15 16:42:49 +0100', 'endDate': '2020-07-15 16:44:51 +0100', 'value': '0.848983', 'metadata_HKMetadataKeySyncVersion': '1', 'metadata_HKMetadataKeySyncIdentifier': '3:674DBCDB-3FE8-40D1-9FC1-E54A2B413805:616520569.99360:616520691.98826:119'}, {'sourceName': 'AppleÂ\xa0Watch van Frans', 'sourceVersion': '6.2.6', 'device': '<, name:Apple Watch, manufacturer:Apple Inc., model:Watch, hardware:Watch3,4, software:6.2.6>', 'unit': 'km', 'creationDate': '2020-10-14 05:54:12 +0100', 'startDate': '2020-07-15 16:44:51 +0100', 'endDate': '2020-07-15 16:46:50 +0100', 'value': '0.834403', 'metadata_HKMetadataKeySyncVersion': '1', 'metadata_HKMetadataKeySyncIdentifier': '3:674DBCDB-3FE8-40D1-9FC1-E54A2B413805:616520691.98826:616520810.98305:119'}] def all_columns_old(): all_columns = [col for col in chunk[0]] all_columns += [column for record in chunk for column in record if column not in all_columns] return all_columns def all_columns_new(): all_columns = [col for col in chunk[0]] for record in chunk: all_columns += [column for column in record if column not in all_columns] return all_columns if __name__ == '__main__': from pprint import pprint print('problem: ') pprint(all_columns_old()) print('\nfix: ') pprint(all_columns_new()) ``` ",140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/225/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 808046597,MDU6SXNzdWU4MDgwNDY1OTc=,234,.insert_all() fails if subsequent chunks contain additional columns,9599,simonw,closed,0,,,,,1,2021-02-14T21:01:51Z,2021-02-14T21:03:40Z,2021-02-14T21:03:40Z,OWNER,,Reported by @nieuwenhoven in #225 along with a proposed fix.,140912432,sqlite-utils,issue,,,"{""url"": 
""https://api.github.com/repos/simonw/sqlite-utils/issues/234/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 808036774,MDU6SXNzdWU4MDgwMzY3NzQ=,232,Run tests against Windows in GitHub Actions,9599,simonw,closed,0,,,,,0,2021-02-14T20:09:45Z,2021-02-14T20:39:55Z,2021-02-14T20:39:55Z,OWNER,,"> I'm going to try and get the test suite to run in Windows on GitHub Actions. _Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/225#issuecomment-778834504_",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/232/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 808037010,MDExOlB1bGxSZXF1ZXN0NTczMTQ3MTY4,233,"Run tests against Ubuntu, macOS and Windows",9599,simonw,closed,0,,,,,0,2021-02-14T20:11:02Z,2021-02-14T20:39:54Z,2021-02-14T20:39:54Z,OWNER,simonw/sqlite-utils/pulls/233,Refs #232,140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/233/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 808028757,MDU6SXNzdWU4MDgwMjg3NTc=,231,"limit=X, offset=Y parameters for more Python methods",9599,simonw,closed,0,,,,,2,2021-02-14T19:31:23Z,2021-02-14T20:03:08Z,2021-02-14T20:03:08Z,OWNER,,"> I'm going to add a `offset=` parameter to support this case. Thanks for the suggestion! _Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/224#issuecomment-778828495_",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/231/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 792297010,MDExOlB1bGxSZXF1ZXN0NTYwMjA0MzA2,224,Add fts offset docs.,37962604,polyrand,closed,0,,,,,2,2021-01-22T20:50:58Z,2021-02-14T19:31:06Z,2021-02-14T19:31:06Z,NONE,simonw/sqlite-utils/pulls/224,"The limit can be passed as a string to the query builder to have an offset. I have tested it using the shorthand `limit=f""15, 30""`, the standard syntax should work too.",140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/224/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 806743116,MDU6SXNzdWU4MDY3NDMxMTY=,1220,Installing datasette via docker: Path 'fixtures.db' does not exist,30607,aborruso,closed,0,,,,,4,2021-02-11T21:09:14Z,2021-02-12T21:35:17Z,2021-02-12T21:35:17Z,NONE,,"Hi, If I run ``` docker run -p 8001:8001 -v `pwd`:/mnt \ 1 ↵ datasetteproject/datasette \ datasette -p 8001 -h 0.0.0.0 fixtures.db ``` I have ``` Error: Invalid value for '[FILES]...': Path 'fixtures.db' does not exist. ``` If I run `test -f fixtures.db && echo ""it exists.""` I have `it exists.`. What's my error? 
Thank you",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1220/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 806861312,MDExOlB1bGxSZXF1ZXN0NTcyMjA5MjQz,1222,"--ssl-keyfile and --ssl-certfile, refs #1221",9599,simonw,closed,0,,,,,0,2021-02-12T00:45:58Z,2021-02-12T00:52:18Z,2021-02-12T00:52:17Z,OWNER,simonw/datasette/pulls/1222,,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1222/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 770712149,MDExOlB1bGxSZXF1ZXN0NTQyNDA2OTEw,10,BugFix for encoding and not update info.,1277270,riverzhou,closed,0,,,,,1,2020-12-18T08:58:54Z,2021-02-11T22:37:56Z,2021-02-11T22:37:56Z,NONE,dogsheep/evernote-to-sqlite/pulls/10,"Bugfix 1: Traceback (most recent call last): File ""d:\anaconda3\lib\runpy.py"", line 194, in _run_module_as_main return _run_code(code, main_globals, None, File ""d:\anaconda3\lib\runpy.py"", line 87, in _run_code exec(code, run_globals) File ""D:\Anaconda3\Scripts\evernote-to-sqlite.exe\__main__.py"", line 7, in File ""d:\anaconda3\lib\site-packages\click\core.py"", line 829, in __call__ File ""d:\anaconda3\lib\site-packages\click\core.py"", line 782, in main rv = self.invoke(ctx) File ""d:\anaconda3\lib\site-packages\click\core.py"", line 1259, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) return ctx.invoke(self.callback, **ctx.params) File ""d:\anaconda3\lib\site-packages\click\core.py"", line 610, in invoke return callback(*args, **kwargs) File ""d:\anaconda3\lib\site-packages\evernote_to_sqlite\cli.py"", line 30, in enex for tag, note in find_all_tags(fp, [""note""], progress_callback=bar.update): File ""d:\anaconda3\lib\site-packages\evernote_to_sqlite\utils.py"", line 11, in find_all_tags chunk = fp.read(1024 * 1024) UnicodeDecodeError: 'gbk' codec can't decode byte 0xa4 in position 383: illegal multibyte sequence Bugfix 2: Traceback (most recent call last): File ""D:\Anaconda3\Scripts\evernote-to-sqlite-script.py"", line 33, in sys.exit(load_entry_point('evernote-to-sqlite==0.3', 'console_scripts', 'evernote-to-sqlite')()) File ""D:\Anaconda3\lib\site-packages\click\core.py"", line 829, in __call__ return self.main(*args, **kwargs) File ""D:\Anaconda3\lib\site-packages\click\core.py"", line 782, in main rv = self.invoke(ctx) File ""D:\Anaconda3\lib\site-packages\click\core.py"", line 1259, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""D:\Anaconda3\lib\site-packages\click\core.py"", line 1066, in invoke return ctx.invoke(self.callback, **ctx.params) File ""D:\Anaconda3\lib\site-packages\click\core.py"", line 610, in invoke return callback(*args, **kwargs) File ""D:\Anaconda3\lib\site-packages\evernote_to_sqlite-0.3-py3.8.egg\evernote_to_sqlite\cli.py"", line 31, in enex File ""D:\Anaconda3\lib\site-packages\evernote_to_sqlite-0.3-py3.8.egg\evernote_to_sqlite\utils.py"", line 28, in save_note AttributeError: 'NoneType' object has no attribute 'text'",303218369,evernote-to-sqlite,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/evernote-to-sqlite/issues/10/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 748370021,MDExOlB1bGxSZXF1ZXN0NTI1MzcxMDI5,8,"fix import error if note has no ""updated"" 
element",4028322,mkorosec,closed,0,,,,,0,2020-11-22T22:51:05Z,2021-02-11T22:34:06Z,2021-02-11T22:34:06Z,CONTRIBUTOR,dogsheep/evernote-to-sqlite/pulls/8,"I got the following error when executing evernote-to-sqlite enex evernote.db evernote.enex ``` ... File ""evernote_to_sqlite/cli.py"", line 31, in enex save_note(db, note) File ""evernote_to_sqlite/utils.py"", line 28, in save_note updated = note.find(""updated"").text AttributeError: 'NoneType' object has no attribute 'text' ``` Seems that in some cases the updated element is not added to the note, this is a part of the problematic note: ``` 20201019T074518Z web.clip7 webclipper.evernote ```",303218369,evernote-to-sqlite,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/evernote-to-sqlite/issues/8/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 743297582,MDU6SXNzdWU3NDMyOTc1ODI=,7,evernote-to-sqlite on windows 10 give this error: TypeError: insert() got an unexpected keyword argument 'replace',42387931,martinvanwieringen,closed,0,,,,,1,2020-11-15T16:57:28Z,2021-02-11T22:13:17Z,2021-02-11T22:13:17Z,NONE,,"running evernote-to-sqlite 0.2 on windows 10. Command: evernote-to-sqlite enex evernote.db MyNotes.enex I get the followinng error: File ""C:\Users\marti\AppData\Roaming\Python\Python38\site-packages\evernote_to_sqlite\utils.py"", line 46, in save_note note_id = db[""notes""].insert(row, hash_id=""id"", replace=True, alter=True).last_pk TypeError: insert() got an unexpected keyword argument 'replace' Removing replace=True, Leads to below error: note_id = db[""notes""].insert(row, hash_id=""id"", alter=True).last_pk File ""C:\Users\marti\AppData\Roaming\Python\Python38\site-packages\sqlite_utils\db.py"", line 924, in insert return self.insert_all( File ""C:\Users\marti\AppData\Roaming\Python\Python38\site-packages\sqlite_utils\db.py"", line 1046, in insert_all result = self.db.conn.execute(sql, values) sqlite3.IntegrityError: UNIQUE constraint failed: notes.id",303218369,evernote-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/evernote-to-sqlite/issues/7/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 748372469,MDU6SXNzdWU3NDgzNzI0Njk=,9,ParseError: undefined entity š,4028322,mkorosec,closed,0,,,,,1,2020-11-22T23:04:35Z,2021-02-11T22:10:55Z,2021-02-11T22:10:55Z,CONTRIBUTOR,,"I encountered a parse error if the enex file contained š or   Run command: evernote-to-sqlite enex evernote.db evernote.enex ``` Traceback (most recent call last): ... 
File ""evernote_to_sqlite/cli.py"", line 31, in enex save_note(db, note) File ""evernote_to_sqlite/utils.py"", line 35, in save_note content = ET.tostring(ET.fromstring(content_xml)).decode(""utf-8"") File ""/usr/lib/python3.8/xml/etree/ElementTree.py"", line 1320, in XML parser.feed(text) xml.etree.ElementTree.ParseError: undefined entity š: line 3, column 35 ``` Workaround: ``` sed -i 's/š//g' evernote.enex sed -i 's/ //g' evernote.enex ```",303218369,evernote-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/evernote-to-sqlite/issues/9/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 792851444,MDU6SXNzdWU3OTI4NTE0NDQ=,11,XML parse error,3613583,dskrad,closed,0,,,,,2,2021-01-24T17:38:54Z,2021-02-11T21:18:58Z,2021-02-11T21:18:48Z,NONE,,"I am on Windows 10 using Windows Subsystem for Linux, Python 3.8. I installed evernote-to-sqlite via pipx (in a venv). I tried using enex files from the latest version of Evernote for Windows (10.6.9 which only lets you export 50 notes at a time) and from Legacy Evernote (6.25.2.9198 which lets you export all your notes at once). The enex file from latest evernote gives this error: File ""/usr/lib/python3.8/xml/etree/ElementTree.py"", line 1320, in XML parser.feed(text) xml.etree.ElementTree.ParseError: XML or text declaration not at start of entity: line 2, column 6 The enex file from Legacy Evernote gives this error: File ""/home/david/.local/pipx/venvs/evernote-to-sqlite/lib/python3.8/site-packages/evernote_to_sqlite/utils.py"", line 28, in save_note updated = note.find(""updated"").text AttributeError: 'NoneType' object has no attribute 'text'",303218369,evernote-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/evernote-to-sqlite/issues/11/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 802583450,MDU6SXNzdWU4MDI1ODM0NTA=,226,3.4 release is broken - includes a rogue line,9599,simonw,closed,0,,,,,0,2021-02-06T02:08:01Z,2021-02-06T02:10:26Z,2021-02-06T02:10:26Z,OWNER,,"I started seeing weird errors, caused by this line: https://github.com/simonw/sqlite-utils/blob/f8010ca78fed8c5fca6cde19658ec09fdd468420/sqlite_utils/cli.py#L1-L3 That was added by accident in 1b666f9315d4ea6bb332b2e75e48480c26100199 I'm surprised the tests didn't catch this!",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/226/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 788527932,MDU6SXNzdWU3ODg1Mjc5MzI=,223,--delimiter option for CSV import,9599,simonw,closed,0,,,,,2,2021-01-18T20:25:03Z,2021-02-06T01:39:47Z,2021-02-06T01:34:54Z,OWNER,,"https://bruxellesdata.opendatasoft.com/explore/dataset/dog-toilets/export/?location=12,50.85802,4.38054 says: > CSV uses semicolon (;) as a separator. 
Would be useful to be able to do this: sqlite-utils insert places.db places places.csv --delimiter ';' `--delimiter` could imply `--csv`",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/223/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 796234313,MDU6SXNzdWU3OTYyMzQzMTM=,1210,Immutable Database w/ Canned Queries,525780,heyarne,closed,0,,,,,2,2021-01-28T18:08:29Z,2021-02-05T11:30:34Z,2021-02-05T11:30:34Z,NONE,,"I have a database that I only want to read from; when instructing datasette to treat the database as immutable my defined canned queries disappear. Are these two features incompatible or have I hit an unintended bug? Thanks for datasette in any way, it's a joy to use!",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1210/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 796736607,MDU6SXNzdWU3OTY3MzY2MDc=,56,Not all quoted statuses get fetched?,42315895,gsajko,closed,0,,,,,3,2021-01-29T09:48:44Z,2021-02-03T10:36:36Z,2021-02-03T10:36:36Z,NONE,," ![image](https://user-images.githubusercontent.com/42315895/106259325-5f75dc80-621f-11eb-8311-db8f2fe2a257.png) In my database I have 13300 quote tweets, but eta 3600 have `quoted_status` empty. I fetched some of them using `https://api.twitter.com/1.1/statuses/show.json?id=xx` and they did have ids of quoted tweets.",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/56/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 799693777,MDU6SXNzdWU3OTk2OTM3Nzc=,1214,Re-submitting filter form duplicates _x querystring arguments,9599,simonw,closed,0,,,,,3,2021-02-02T21:13:35Z,2021-02-02T21:28:53Z,2021-02-02T21:21:13Z,OWNER,,"Really nasty bug, caused by #1194 fix in 07e163561592c743e4117f72102fcd350a600909 Navigate to this page: https://github-to-sqlite.dogsheep.net/github/labels?_search=help&_sort=id Click ""Apply"" to submit the form and the resulting URL is https://github-to-sqlite.dogsheep.net/github/labels?_search=help&_sort=id&_search=help&_sort=id That's because the (truncated) HTML for the form looks like this: ```html ... ...
... ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1214/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed