id,node_id,number,title,user,user_label,state,locked,assignee,assignee_label,milestone,milestone_label,comments,created_at,updated_at,closed_at,author_association,pull_request,body,repo,repo_label,type,active_lock_reason,performed_via_github_app,reactions,draft,state_reason 2029908157,I_kwDOBm6k_c54_fC9,2214,CSV export fails for some `text` foreign key references,2874,precipice,open,0,,,,,1,2023-12-07T05:04:34Z,2023-12-07T07:36:34Z,,NONE,,"I'm starting this issue without a clear reproduction in case someone else has seen this behavior, and to use the issue as a notebook for research. I'm using Datasette with the [SWITRS](https://iswitrs.chp.ca.gov/) data set, which is a California Highway Patrol collection of traffic incident data from the past decade or so. I receive data from them in CSV and want to work with it in Datasette, then export it to CSV for mapping in Felt.com. Their data makes extensive use of codes for incident column data (`1` for `Monday` and so on), some of it integer codes and some of it letter/text codes. The text codes are sometimes blank or `-`. During import, I'm creating lookup tables for foreign key references to make the Datasette UI presentation of the data easier to read. If I import the data and set up the integer foreign keys, everything works fine, but if I set up the text foreign keys, CSV export starts to fail. The foreign key configuration is as follows: ``` # Some tables use integer ids, like sensible tables do. Let's import them first # since we favor them. for TABLE in DAY_OF_WEEK CHP_SHIFT POPULATION SPECIAL_COND BEAT_TYPE COLLISION_SEVERITY do sqlite-utils create-table records.db $TABLE id integer name text --pk=id sqlite-utils insert records.db $TABLE lookup-tables/$TABLE.csv --csv sqlite-utils add-foreign-key records.db collisions $TABLE $TABLE id sqlite-utils create-index records.db collisions $TABLE done # *Other* tables use letter keys, like they were raised by WOLVES. Let's put them # at the end of the import queue. for TABLE in WEATHER_1 WEATHER_2 LOCATION_TYPE RAMP_INTERSECTION SIDE_OF_HWY \ PRIMARY_COLL_FACTOR PCF_CODE_OF_VIOL PCF_VIOL_CATEGORY TYPE_OF_COLLISION MVIW \ PED_ACTION ROAD_SURFACE ROAD_COND_1 ROAD_COND_2 LIGHTING CONTROL_DEVICE \ STWD_VEHTYPE_AT_FAULT CHP_VEHTYPE_AT_FAULT PRIMARY_RAMP SECONDARY_RAMP do sqlite-utils create-table records.db $TABLE key text name text --pk=key sqlite-utils insert records.db $TABLE lookup-tables/$TABLE.csv --csv sqlite-utils add-foreign-key records.db collisions $TABLE $TABLE key sqlite-utils create-index records.db collisions $TABLE done ``` You can see the full code and import script here: https://github.com/radical-bike-lobby/switrs-db If I run this code and then hit the CSV export link in the Datasette interface (the simple link or the ""advanced"" dialog), export fails after a small number of CSV rows are written. I am not seeing any detailed error messages but this appears in the logging output: ``` INFO: 127.0.0.1:57885 - ""GET /records/collisions.csv?_facet=PRIMARY_RD&PRIMARY_RD=ASHBY+AV&_labels=on&_size=max HTTP/1.1"" 200 OK Caught this error: ``` (No other output follows `error:` other than a blank line.) I've stared at the rows directly after the error occurs and can't yet see what is causing the problem. 
I'm going to set up a development environment and see if I get any more detailed error output, and then stare more at some problematic lines to see if I can get a simple reproduction.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2214/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 2029161033,I_kwDOCGYnMM548opJ,606,str and int as aliases for text and integer,9599,simonw,closed,0,,,,,2,2023-12-06T18:35:49Z,2023-12-06T19:44:04Z,2023-12-06T18:49:32Z,OWNER,,"I keep making this mistake: ```bash sqlite-utils add-column content.db assets _since int ``` ``` Usage: sqlite-utils add-column [OPTIONS] PATH TABLE COL_NAME [[integer|float|b lob|text|INTEGER|FLOAT|BLOB|TEXT]] Try 'sqlite-utils add-column -h' for help. Error: Invalid value for '[[integer|float|blob|text|INTEGER|FLOAT|BLOB|TEXT]]': 'int' is not one of 'integer', 'float', 'blob', 'text', 'INTEGER', 'FLOAT', 'BLOB', 'TEXT'. ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/606/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 2028698018,I_kwDOBm6k_c5463mi,2213,feature request: gzip compression of database downloads,536941,fgregg,open,0,,,,,1,2023-12-06T14:35:03Z,2023-12-06T15:05:46Z,,CONTRIBUTOR,,"At the bottom of database pages, datasette gives users the opportunity to download the underlying sqlite database. It would be great if that could be served gzip compressed. this is similar to #1213, but for me, i don't need datasette to compress html and json because my CDN layer does it for me, however, cloudflare at least, will not compress a mimetype of ""application"" (see list of mimetype: https://developers.cloudflare.com/speed/optimization/content/brotli/content-compression/)",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2213/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 2023057255,I_kwDOBm6k_c54lWdn,2212,Can't filter with numbers,605070,fzakaria,open,0,,,,,0,2023-12-04T05:26:29Z,2023-12-04T05:26:29Z,,NONE,,"I have a schema that uses numbers for a column (actually it's a boolean 1 or 0 but SQLite doesn't have Boolean). I can't seem to get the facet to work or even filtering on this column. My guess is that Datasette is ""stringifying"" the number and it's not matching? Example: https://debian-sqlelf.fly.dev/debian/elf_symbols?_sort_desc=name&_facet=exported&exported=0",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2212/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 2019811176,I_kwDOBm6k_c54Y99o,2211,Unreachable exception handlers for `sqlite3.OperationalError`,1214074,mattparmett,open,0,,,,,0,2023-12-01T00:50:22Z,2023-12-01T00:50:22Z,,NONE,,"There are several places where `sqlite3.OperationalError` is caught as part of an exception handler which catches multiple exceptions, but is then caught again immediately afterwards by a dedicated exception handler. Because the exception will be caught by the first handler, the logic in the second handler is unreachable and will never be executed. If this is intended behavior, the second handler can be removed. 
If this is not intended, and the second handler should be the one that catches this exception, then `sqlite3.OperationalError` should be removed from the tuple of exceptions in the first handler. This issue was found via a CodeQL query on the repository, and I've listed the occurrences found by the query below. There may be other instances of this issue in the code that were not surfaced by the query. I'd be happy to share the query if others would like to view or run it. One example: https://github.com/simonw/datasette/blob/452a587e236ef642cbc6ae345b58767ea8420cb5/datasette/views/database.py#L534-L537 Other instances: https://github.com/simonw/datasette/blob/main/datasette/views/base.py#L266-L270 https://github.com/simonw/datasette/blob/main/datasette/views/base.py#L452-L456",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2211/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 2007893839,I_kwDOCGYnMM53rgdP,605,Insert fails with `Error: Python int too large to convert to SQLite INTEGER`; can we use `NUMERIC` here?,12229877,Zac-HD,closed,0,,,,,1,2023-11-23T10:19:46Z,2023-12-08T05:07:54Z,2023-12-08T05:07:54Z,NONE,,"I'm currently working on a new feature for Hypothesis, where we can dump a tidy jsonlines table of all the test cases we tried - including arguments, outcomes, timings, coverage, etc. Exploring this seems like a perfect cases for `sqlite-utils` and `datasette`, but I pretty quickly ran into an integer overflow problem and don't want to recommend that experience to my users. I originally went to report this as a bug... and then found https://github.com/simonw/sqlite-utils/issues/309#issuecomment-895581038 almost exactly matched my repro 😅 https://github.com/simonw/sqlite-utils/issues/110#issuecomment-626391063 suggests that using `NUMERIC` would avoid this overflow error, although ""If the TEXT value is a well-formed integer literal that is too large to fit in a 64-bit signed integer, it is converted to REAL."" suggests that this would come at the cost of rounding to the nearest float value. Maybe I should just convert large integers to float before writing out my json? After a bit more hacking, ""manually cast large integers to float"" seems like a decent solution for my particular case, but having written it up I thought I might as well post this issue anyway - I hope it's useful feedback, and won't mind at all if you close as wontfix if it's not.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/605/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1994857251,I_kwDOBm6k_c525xsj,2208,No suggested facets when a column named 'value' is included,198537,rgieseke,open,0,,,,,1,2023-11-15T14:11:17Z,2023-11-15T14:18:59Z,,CONTRIBUTOR,,"When a column named 'value' is included there are no suggested facets is shown as the query uses an alias of 'value'. https://github.com/simonw/datasette/blob/452a587e236ef642cbc6ae345b58767ea8420cb5/datasette/facets.py#L168-L174 Currently the following is shown (from https://latest.datasette.io/fixtures/facetable) ![image](https://github.com/simonw/datasette/assets/198537/a919509a-ea88-461b-b25b-8b776720c7c5) When I add a column named 'value' only the JSON facets are processed. 
![image](https://github.com/simonw/datasette/assets/198537/092bd0b3-4c20-434e-88f8-47e2b8994a1d) I think that not using aliases could be a solution (except if someone wants to use a column named `count(*)` though this seems to be unlikely). I'll open a PR with that. There is also a TODO with a similar question in the same file. I have not looked into that yet. https://github.com/simonw/datasette/blob/452a587e236ef642cbc6ae345b58767ea8420cb5/datasette/facets.py#L512",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2208/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1994845152,I_kwDOBm6k_c525uvg,2207,ModuleNotFoundError: No module named 'click_default_group,283441,honzajavorek,open,0,,,,,0,2023-11-15T14:04:32Z,2023-11-15T14:04:32Z,,NONE,,"No matter what I do, I'm getting this error: ``` $ datasette Traceback (most recent call last): File ""/Users/honza/Library/Caches/pypoetry/virtualenvs/juniorguru-Lgaxwd2n-py3.11/bin/datasette"", line 5, in from datasette.cli import cli File ""/Users/honza/Library/Caches/pypoetry/virtualenvs/juniorguru-Lgaxwd2n-py3.11/lib/python3.11/site-packages/datasette/cli.py"", line 6, in from click_default_group import DefaultGroup ModuleNotFoundError: No module named 'click_default_group' ``` I have datasette in my dependencies like this: ```toml [tool.poetry.group.dev.dependencies] datasette = {version = ""1.0a7"", allow-prereleases = true} ``` I had the latest regular version (not pre-release) there originally, but the result was the same: ```toml [tool.poetry.group.dev.dependencies] datasette = ""0.64.5"" ``` Full pyproject.toml is at https://github.com/honzajavorek/junior.guru/ Previously datasette worked for me, but I guess something had to upgrade and now I can't even launch it.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2207/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1988525411,I_kwDOCGYnMM52hn1j,603,Pyhton 3.12 Bug report,1324252,constantinedev,open,0,,,,,1,2023-11-10T22:57:48Z,2023-12-08T05:10:31Z,,NONE,,"I start with new python3 verison 3.12.0 Also have the error where connect DataBase ``` Traceback (most recent call last): File ""/home/t/Development/python/FKPJ/ClinicSYS/run.py"", line 1, in import re, os, io, json, sqlite_utils, requests, pytz, logging File ""/home/t/.local/lib/python3.12/site-packages/sqlite_utils/__init__.py"", line 1, in from .db import Database File ""/home/t/.local/lib/python3.12/site-packages/sqlite_utils/db.py"", line 277, in class Database: File ""/home/t/.local/lib/python3.12/site-packages/sqlite_utils/db.py"", line 306, in Database filename_or_conn: Optional[Union[str, pathlib.Path, sqlite3.Connection]] = None, ^^^^^^^^^^^^^^^^^^ ``` This bug come from `sqlite-utils` since's v3.33. Anyone get the same ? As well now of the resolved plan just keep the sqlite-utils version in python3.12 with v3.32.1 [tested] but where are the sqlite3.Connection problem.... This won't happen on python version down to 3.11[tested] Just the python3.12.0, I have test this error are come from the sqlite3 connection The error say from `sqlite_utils` and with the sqlite3 Connection, what can I do. 
Let fix together.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/603/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1978603203,I_kwDOCGYnMM517xbD,602,`sqlite-utils transform` removes the `AUTOINCREMENT` keyword,4472046,ArsTapatun,open,0,,,,,0,2023-11-06T08:48:43Z,2023-11-06T08:48:43Z,,NONE,,"### Context We ran into this bug randomly, noticing that deleted `ROWID` would get reused after migrating the DB. Using `transform` to change any column in the table will also unexpectedly strip away the `AUTOINCREMENT` keyword from the primary key definition, even if it was not the transformation target. ### Reproducible example **Original database** ```sql $ sqlite3 test.db << EOF CREATE TABLE mytable ( col1 INTEGER PRIMARY KEY AUTOINCREMENT, col2 TEXT NOT NULL ) EOF $ sqlite3 test.db "".schema mytable"" CREATE TABLE mytable ( col1 INTEGER PRIMARY KEY AUTOINCREMENT, col2 TEXT NOT NULL ); ``` **Modified database after sqlite-utils** ```sql $ sqlite-utils transform test.db mytable --rename col2 renamedcol2 $ sqlite3 test.db ""SELECT sql FROM sqlite_master WHERE name = 'mytable';"" CREATE TABLE IF NOT EXISTS ""mytable"" ( [col1] INTEGER PRIMARY KEY, [renamedcol2] TEXT NOT NULL ); ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/602/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1978023780,I_kwDOBm6k_c515j9k,2205,request.post_vars() method obliterates form keys with multiple values,9599,simonw,open,0,,,8755003,Datasette 1.0a-next,3,2023-11-05T23:25:08Z,2023-11-06T04:10:34Z,,OWNER,,"https://github.com/simonw/datasette/blob/452a587e236ef642cbc6ae345b58767ea8420cb5/datasette/utils/asgi.py#L137-L139 In GET requests you can do `?foo=1&foo=2` - you can do the same in POST requests, but the `dict()` call here eliminates those duplicates. You can't even try calling `post_body()` and implement your own custom parsing because of: - #2204",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2205/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1978022687,I_kwDOBm6k_c515jsf,2204,request.post_body() can only be called once,9599,simonw,open,0,,,,,0,2023-11-05T23:22:03Z,2023-11-05T23:23:23Z,,OWNER,,"This code here: https://github.com/simonw/datasette/blob/452a587e236ef642cbc6ae345b58767ea8420cb5/datasette/utils/asgi.py#L127-L135 It consumes the messages, which means if you try to call it a second time you won't be able to get at the body. This is efficient - we don't end up with a `request` object property with potentially megabytes of content that we never look at again - but it's inconvenient for cases like middleware or functions where we don't know if the body has been consumed yet or not. Potential solution: set `request._body` the first time it is called, and return that on subsequent calls. Potential optimization: only do this for bodies that are shorter than a certain threshold - maybe 1MB - and raise an exception if you attempt to call `post_body()` multiple times against one of those larger bodies. 
I'm a bit nervous about that option though, since it could result in errors that don't show up in testing but do show up in production.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2204/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1977726056,I_kwDOBm6k_c514bRo,2203,custom plugin not seen as sql function,7113541,LyzardKing,open,0,,,,,0,2023-11-05T10:30:19Z,2023-11-05T10:30:19Z,,NONE,,"Hi, I'm not sure if this is the right repo for this issue. I'm using datasette with the parquet (to read a duckdb), and jellyfish plugins. Both work perfectly. Now I need to create a simple plugin that uses the python rouge package and returns a similarity score (similarly to how the jellyfish plugin works). If I create a custom plugin, even the example hello_world one, copied directly from the tutorial, I get the following error: ```duckdb.duckdb.CatalogException: Catalog Error: Scalar Function with name hello_world does not exist!``` Since the jellyfish plugin doesn't do anything more complex, I'm wondering if there is some other kind of issue with my setup.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2203/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1977155641,I_kwDOCGYnMM512QA5,601,Move plugin directory into documentation,9599,simonw,open,0,,,,,0,2023-11-04T04:07:52Z,2023-11-04T04:07:52Z,,OWNER,,"https://github.com/simonw/sqlite-utils-plugins should be in the official documentation. I can use the same pattern as https://llm.datasette.io/en/stable/plugins/directory.html https://til.simonwillison.net/readthedocs/stable-docs",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/601/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1976986318,I_kwDOCGYnMM511mrO,599,Cannot find spatialite on arm64 linux,37802088,MikeCoats,closed,0,,,,,1,2023-11-03T22:05:51Z,2023-11-04T01:06:31Z,2023-11-04T00:33:28Z,CONTRIBUTOR,,"Initially, I found an issue in `datasette` where it wouldn’t find `spatialite` when running on my Radxa Rock 5B - an RK3588 powered SBC, running the arm64 build of Debian Bullseye. I confirmed the same behaviour on my Raspberry Pi 4 - a BCM2711 powered SBC, running the arm64 build of Debian Bookworm. ``` $ datasette --load-extension=spatialite example.db Error: Could not find SpatiaLite extension ``` I did some digging and realised the issue originates in this project. Even with the `libsqlite3-mod-spatialite` package installed, `pytest` skips all of the GIS tests in the project. ``` $ apt list --installed | grep spatial […] libsqlite3-mod-spatialite/stable,now 5.0.1-3 arm64 [installed] $ ls -l /usr/lib/*/*spatial* lrwxrwxrwx 1 root root 23 Dec 1 2022 /usr/lib/aarch64-linux-gnu/mod_spatialite.so -> mod_spatialite.so.7.1.0 lrwxrwxrwx 1 root root 23 Dec 1 2022 /usr/lib/aarch64-linux-gnu/mod_spatialite.so.7 -> mod_spatialite.so.7.1.0 -rw-r--r-- 1 root root 7348584 Dec 1 2022 /usr/lib/aarch64-linux-gnu/mod_spatialite.so.7.1.0 ``` ``` $ pytest tests/test_get.py ...... [ 73%] tests/test_gis.py ssssssssssss [ 75%] tests/test_hypothesis.py .... 
[ 75%] ``` I tracked the issue down to the [`find_sqlite()` function in the `utils.py`](https://github.com/simonw/sqlite-utils/blob/622c3a5a7dd53a09c029e2af40c2643fe7579340/sqlite_utils/utils.py#L60) file. The [`SPATIALITE_PATHS`](https://github.com/simonw/sqlite-utils/blob/main/sqlite_utils/utils.py#L34-L39) array doesn’t have an entry for the location of this module on arm64 linux. ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/599/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1955676270,I_kwDOBm6k_c50kUBu,2201,Discord invite link is invalid,11708906,andrewsanchez,open,0,,,,,0,2023-10-21T21:50:05Z,2023-10-21T21:50:05Z,,NONE,,"https://datasette.io/discord leads to https://discord.com/invite/ktd74dm5mw and returns the following: ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2201/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1943259395,I_kwDOEhK-wc5z08kD,16, time data '2014-11-21T11:44:12.000Z' does not match format '%Y%m%dT%H%M%SZ',3746270,linonetwo,open,0,,,,,0,2023-10-14T13:24:39Z,2023-10-14T13:24:39Z,,NONE,," ``` evernote-to-sqlite enex evernote.db ./我的笔记.enex Importing from ENEX [#####-------------------------------] 14% Traceback (most recent call last): File ""/usr/local/bin/evernote-to-sqlite"", line 8, in sys.exit(cli()) ^^^^^ File ""/usr/local/lib/python3.11/site-packages/click/core.py"", line 1157, in __call__ return self.main(*args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^ File ""/usr/local/lib/python3.11/site-packages/click/core.py"", line 1078, in main rv = self.invoke(ctx) ^^^^^^^^^^^^^^^^ File ""/usr/local/lib/python3.11/site-packages/click/core.py"", line 1688, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File ""/usr/local/lib/python3.11/site-packages/click/core.py"", line 1434, in invoke return ctx.invoke(self.callback, **ctx.params) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File ""/usr/local/lib/python3.11/site-packages/click/core.py"", line 783, in invoke return __callback(*args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^ File ""/usr/local/lib/python3.11/site-packages/evernote_to_sqlite/cli.py"", line 31, in enex save_note(db, note) File ""/usr/local/lib/python3.11/site-packages/evernote_to_sqlite/utils.py"", line 46, in save_note ""created"": convert_datetime(created), ^^^^^^^^^^^^^^^^^^^^^^^^^ File ""/usr/local/lib/python3.11/site-packages/evernote_to_sqlite/utils.py"", line 111, in convert_datetime return datetime.datetime.strptime(s, ""%Y%m%dT%H%M%SZ"").isoformat() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File ""/usr/local/Cellar/python@3.11/3.11.5/Frameworks/Python.framework/Versions/3.11/lib/python3.11/_strptime.py"", line 568, in _strptime_datetime tt, fraction, gmtoff_fraction = _strptime(data_string, format) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File ""/usr/local/Cellar/python@3.11/3.11.5/Frameworks/Python.framework/Versions/3.11/lib/python3.11/_strptime.py"", line 349, in _strptime raise ValueError(""time data %r does not match format %r"" % ValueError: time data '2014-11-21T11:44:12.000Z' does not match format '%Y%m%dT%H%M%SZ' ``` enex is exported by evernote mac client ",303218369,evernote-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/evernote-to-sqlite/issues/16/reactions"", 
""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1940346034,I_kwDOBm6k_c5zp1Sy,2199,Detailed upgrade instructions for metadata.yaml -> datasette.yaml,9599,simonw,open,0,,,3268330,Datasette 1.0,7,2023-10-12T16:21:25Z,2023-10-12T22:08:42Z,,OWNER,,"> `Exception: Datasette no longer accepts plugin configuration in --metadata. Move your ""plugins"" configuration blocks to a separate file - we suggest calling that datasette..json - and start Datasette with datasette -c datasette..json. See https://docs.datasette.io/en/latest/configuration.html for more details.` > > I think we should link directly to documentation that tells people how to perform this upgrade. _Originally posted by @simonw in https://github.com/simonw/datasette/issues/2190#issuecomment-1759947021_ ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2199/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1931794126,I_kwDOBm6k_c5zJNbO,2198,--load-extension=spatialite not working with Windows,363004,hcarter333,open,0,,,,,0,2023-10-08T12:50:22Z,2023-10-08T12:50:22Z,,NONE,,"Using each of `python -m datasette counties.db -m metadata.yml --load-extension=SpatiaLite` and `python -m datasette counties.db --load-extension=""C:\Windows\System32\mod_spatialite.dll""` and `python -m datasette counties.db --load-extension=C:\Windows\System32\mod_spatialite.dll` I got the error: ``` File ""C:\Users\m3n7es\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.11_qbz5n2kfra8p0\LocalCache\local-packages\Python311\site-packages\datasette\database.py"", line 209, in in_thread self.ds._prepare_connection(conn, self.name) File ""C:\Users\m3n7es\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.11_qbz5n2kfra8p0\LocalCache\local-packages\Python311\site-packages\datasette\app.py"", line 596, in _prepare_connection conn.execute(""SELECT load_extension(?, ?)"", [path, entrypoint]) sqlite3.OperationalError: The specified module could not be found. ``` I finally tried modifying the code in app.py to read: ``` def _prepare_connection(self, conn, database): conn.row_factory = sqlite3.Row conn.text_factory = lambda x: str(x, ""utf-8"", ""replace"") if self.sqlite_extensions: conn.enable_load_extension(True) for extension in self.sqlite_extensions: # ""extension"" is either a string path to the extension # or a 2-item tuple that specifies which entrypoint to load. #if isinstance(extension, tuple): # path, entrypoint = extension # conn.execute(""SELECT load_extension(?, ?)"", [path, entrypoint]) #else: conn.execute(""SELECT load_extension('C:\Windows\System32\mod_spatialite.dll')"") ``` At which point the counties example worked. Is there a correct way to install/use the extension on Windows? My method will cause issues if there's a second extension to be used. 
On an unrelated note, my next step is to figure out how to write a query across the two loaded databases supplied from the command line: `python -m datasette rm_toucans_23_10_07.db counties.db -m metadata.yml --load-extension=SpatiaLite` ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2198/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1930008379,I_kwDOBm6k_c5zCZc7,2197,click-default-group-wheel dependency conflict,1176293,ar-jan,closed,0,,,,,3,2023-10-06T11:49:20Z,2023-10-12T21:53:17Z,2023-10-12T21:53:17Z,NONE,,"I upgraded my dependencies, then ran into this problem running `datasette inspect`: > env/lib/python3.9/site-packages/datasette/cli.py"", line 6, in > from click_default_group import DefaultGroup > ModuleNotFoundError: No module named 'click_default_group' Turns out the released version of datasette still depends on `click-default-group-wheel`, so `click-default-group` doesn't get installed/recognized: ``` $ virtualenv venv $ source venv/bin/activate $ pip install datasette $ pip list | grep click-default-group click-default-group 1.2.4 click-default-group-wheel 1.2.3 $ python -c ""from click_default_group import DefaultGroup"" Traceback (most recent call last): File """", line 1, in ModuleNotFoundError: No module named 'click_default_group' $ pip install --force-reinstall click-default-group ... ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts. datasette 0.64.4 requires click-default-group-wheel>=1.2.2, which is not installed. Successfully installed click-8.1.7 click-default-group-1.2.4 ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2197/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1920416843,I_kwDOCGYnMM5ydzxL,597,sqlite-utils insert-files should be able to convert fields,1737541,grimnight,open,0,,,,,0,2023-09-30T22:20:47Z,2023-09-30T22:20:47Z,,NONE,,"Currently using both `insert-files` and `convert` is needed in order to create sqlar files, it would be more convenient if it could be done with just one command. 
```shell ~ ❯ cat test.py import os class Example: def __init__(self, arg1, arg2): self.arg1 = arg1 ~ ❯ sqlite-utils insert-files test.sqlar sqlar test.py -c name:name -c data:content -c mode:mode -c mtime:mtime -c sz:size --pk=name [####################################] 100% ~ ❯ sqlite-utils convert test.sqlar sqlar data ""zlib.compress(value)"" --import=zlib --where ""name = 'test.py'"" [####################################] 100% ~ ❯ cat test.py | sqlite-utils convert test.sqlar sqlar data ""zlib.compress(sys.stdin.buffer.read())"" --import=zlib --import=sys --where ""name = 'test.py'"" # Alternative way [####################################] 100% ~ ❯ sqlite3 test.sqlar ""SELECT hex(data) FROM sqlar WHERE name = 'test.py';"" | python3 -c ""import sys, zlib; sys.stdout.buffer.write(zlib.decompress(bytes.fromhex(sys.stdin.read())))"" import os class Example: def __init__(self, arg1, arg2): self.arg1 = arg1 ~ ❯ rm test.py ~ ❯ sqlar -l test.sqlar test.py ~ ❯ sqlar -x test.sqlar ~ ❯ cat test.py import os class Example: def __init__(self, arg1, arg2): self.arg1 = arg1 ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/597/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1910269679,I_kwDOBm6k_c5x3Gbv,2196,Discord invite link returns 401,1892194,Olshansk,closed,0,,,,,2,2023-09-24T15:16:54Z,2023-10-13T00:07:08Z,2023-10-12T21:54:54Z,NONE,,"I found the link to the datasette discord channel via [this query](https://github.com/search?q=repo%3Asimonw%2Fdatasette%20discord&type=code). The following video should be self explanatory: https://github.com/simonw/datasette/assets/1892194/8cd33e88-bcaa-41f3-9818-ab4d589c3f02 Link for reference: https://discord.com/invite/ktd74dm5mw",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2196/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1907765514,I_kwDOBm6k_c5xtjEK,2195,`datasette publish` needs support for the new config/metadata split,9599,simonw,open,0,,,,,9,2023-09-21T21:08:12Z,2023-09-21T22:57:48Z,,OWNER,,"> ... which raises the challenge that `datasette publish` doesn't yet know what to do with a config file! 
_Originally posted by @simonw in https://github.com/simonw/datasette/issues/2194#issuecomment-1730259871_ ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2195/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1907695234,I_kwDOBm6k_c5xtR6C,2194,"Deploy failing with ""plugins/alternative_route.py: Not a directory""",9599,simonw,closed,0,,,,,8,2023-09-21T20:17:49Z,2023-09-21T22:08:19Z,2023-09-21T22:08:19Z,OWNER,,"https://github.com/simonw/datasette/actions/runs/6266449018/job/17017460074 This is a bit of a mystery, I don't think I've changed anything recently that could have broken this.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2194/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1907655261,I_kwDOBm6k_c5xtIJd,2193,"""Test DATASETTE_LOAD_PLUGINS"" test shows errors but did not fail the CI run",9599,simonw,closed,0,,,,,6,2023-09-21T19:49:34Z,2023-09-21T21:56:43Z,2023-09-21T21:56:43Z,OWNER,,"> That passed on 3.8 but should have failed: https://github.com/simonw/datasette/actions/runs/6266341481/job/17017099801 - the ""Test DATASETTE_LOAD_PLUGINS"" test shows errors but did not fail the CI run. _Originally posted by @simonw in https://github.com/simonw/datasette/issues/2057#issuecomment-1730201226_ ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2193/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1907281675,I_kwDOCGYnMM5xrs8L,595,Cascading DELETE not working with Table.delete(pk),123451970,cycle-data,closed,0,,,,,1,2023-09-21T15:46:41Z,2023-09-25T09:38:57Z,2023-09-25T09:38:13Z,NONE,,"Hi ! I noticed that when I am trying to use the delete method of the Table object, the record get properly deleted from the table, but the cascading delete triggers on foreign keys do not activate. `self.db[""contact""].delete(contact_id)` I tried querying the database directly via DB Browser and the triggers work without any issue. Looked up the source code and behind the scene this method is just querying the database normally so I'm not exactly sure where this behavior comes from. Thank you in advance for your time ! ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/595/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1901416155,I_kwDOBm6k_c5xVU7b,2189,Server hang on parallel execution of queries to named in-memory databases,9599,simonw,closed,0,,,,,31,2023-09-18T17:23:18Z,2023-09-21T22:26:21Z,2023-09-21T22:26:21Z,OWNER,,"I've started to encounter a bug where queries to tables inside named in-memory databases sometimes trigger server hangs. I'm still trying to figure out what's going on here - on one occasion I managed to Ctrl+C the server and saw an exception that mentioned a thread lock, but usually hitting Ctrl+C does nothing and I have to `kill -9` the PID instead. This is all running on my M2 Mac. 
I've seen the bug in the Datasette 1.0 alphas and in Datasette 0.64.3 - but reverting to 0.61 appeared to fix it.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2189/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1900026059,I_kwDOBm6k_c5xQBjL,2188,"Plugin Hooks for ""compile to SQL"" languages",15178711,asg017,open,0,,,,,2,2023-09-18T01:37:15Z,2023-09-18T06:58:53Z,,CONTRIBUTOR,,"There's a ton of tools/languages that compile to SQL, which may be nice in Datasette. Some examples: - Logica https://logica.dev - PRQL https://prql-lang.org - Malloy, but not sure if it works with SQLite? https://github.com/malloydata/malloy It would be cool if plugins could extend Datasette to use these languages, in both the code editor and API usage. A few things I'd imagine a `datasette-prql` or `datasette-logica` plugin would do: - `prql=` instead of `sql=` - Code editor support (syntax highlighting, autocomplete) - Hide/show SQL",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2188/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1899310542,I_kwDOBm6k_c5xNS3O,2187,Datasette for serving JSON only,19705106,geofinder,open,0,,,,,0,2023-09-16T05:48:29Z,2023-09-16T05:48:29Z,,NONE,,"Hi, is there any way to use datasette for serving json only without displaying webpage? I've tried to search about this in documentation but didn't get any information",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2187/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1898927976,I_kwDOBm6k_c5xL1do,2186,Mechanism for register_output_renderer hooks to access full count,9599,simonw,open,0,,,3268330,Datasette 1.0,2,2023-09-15T18:57:54Z,2023-09-15T19:27:59Z,,OWNER,,"The cause of this bug: - https://github.com/simonw/datasette-export-notebook/issues/17 Is that `datasette-export-notebook` was consulting `data[""filtered_table_rows_count""]` in the render output plugin function in order to show the total number of rows that would be exported. That field is no longer available by default - the `""count""` field is only available if `?_extra=count` was passed. It would be useful if plugins like this could access the total count on demand, should they need to.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2186/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1895266807,I_kwDOBm6k_c5w93n3,2184,Design decision - should configuration be exposed at /-/config ?,9599,simonw,open,0,,,,,0,2023-09-13T21:07:08Z,2023-09-13T21:07:38Z,,OWNER,,"> This made me think. That `{""$env"": ""ENV_VAR""}` hack was introduced back here: > > - https://github.com/simonw/datasette/issues/538 > > The problem it was solving was that metadata was visible to everyone with access to the instance at `/-/metadata` but plugins clearly needed a way to set secret settings. > > Now that this stuff is moving to config, we have some decisions to make: > > 1. Add `/-/config` to let people see the configuration of their instance, and keep the `$env` trick for secret settings. > 2. 
Say all configuration aside from metadata is secret and make `$env` optional or ditch it entirely. > 3. Allow plugins to announce which of their configuration options are secret so we can automatically redact them from `/-/config` > > I've found `/-/metadata` extraordinarily useful as a user of Datasette - it really helps me understand exactly what's going on if I run into any problems with a plugin, if I can quickly check what the settings look like. > > So I'm leaning towards option 1 or 3. _Originally posted by @simonw in https://github.com/simonw/datasette/pull/2183#discussion_r1325076924_ Also refs: - #2093",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2184/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1891614971,I_kwDOCGYnMM5wv8D7,594,Represent compound foreign keys in table.foreign_keys output,9599,simonw,open,0,,,,,2,2023-09-12T03:48:24Z,2023-09-12T03:51:13Z,,OWNER,,"Given this schema: ```sql CREATE TABLE departments ( campus_name TEXT NOT NULL, dept_code TEXT NOT NULL, dept_name TEXT, PRIMARY KEY (campus_name, dept_code) ); CREATE TABLE courses ( course_code TEXT PRIMARY KEY, course_name TEXT, campus_name TEXT NOT NULL, dept_code TEXT NOT NULL, FOREIGN KEY (campus_name, dept_code) REFERENCES departments(campus_name, dept_code) ); ``` The output of `db[""courses""].foreign_keys` right now is: ``` [ForeignKey(table='courses', column='campus_name', other_table='departments', other_column='campus_name'), ForeignKey(table='courses', column='dept_code', other_table='departments', other_column='dept_code')] ``` Which suggests two normal foreign keys, not one compound foreign key.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/594/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1888477283,I_kwDOC8SPRc5wj-Bj,38,Run `rebuild_fts` after building the index,9599,simonw,open,0,,,,,0,2023-09-08T23:17:45Z,2023-09-08T23:17:45Z,,MEMBER,,"In: - https://github.com/simonw/datasette.io/issues/152#issuecomment-1712323347 This turned out to be the fix: ```bash dogsheep-beta index dogsheep-index.db templates/dogsheep-beta.yml sqlite-utils rebuild-fts dogsheep-index.db ```",197431109,dogsheep-beta,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-beta/issues/38/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1886791100,I_kwDOBm6k_c5wdiW8,2180,Plugin hook: `actors_from_ids()`,9599,simonw,closed,0,,,,,6,2023-09-08T01:16:41Z,2023-09-10T17:44:14Z,2023-09-08T04:28:03Z,OWNER,,"In building Datasette Cloud we realized that a bunch of the features we are building need a way of resolving an actor ID to the actual actor, in order to display something more interesting than just an integer ID. Social plugins in particular need this - comments by X, CSV uploaded by X, that kind of thing. I think the solution is a new plugin hook: `actors_from_ids(datasette, ids)` which can return a list of actor dictionaries. The default implementation can return `[{""id"": ""...""}]` for the IDs passed to it. 
Pluggy has a `firstresult=True` option which is relevant here, since this is the first plugin hook we will have implemented where only one plugin should provide an answer.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2180/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1886771493,I_kwDOCGYnMM5wddkl,592,`table.transform()` should preserve `rowid` values,9599,simonw,closed,0,,,,,6,2023-09-08T00:42:38Z,2023-09-10T17:46:41Z,2023-09-09T00:45:32Z,OWNER,,"I just spotted a bug when using https://datasette.io/plugins/datasette-configure-fts and https://datasette.io/plugins/datasette-edit-schema at the same time. Steps to reproduce: - Configure FTS for a table, then run a test search - Edit the schema for that table and change the order of columns - Run the test search again I got the wrong search results, which I think is because the `_fts` table pointed to the first table by `rowid` but those `rowid` values were entirely rewritten as a consequence of running `table.transform()` on the table. Reconfiguring FTS on the table fixed the problem. I think `table.transform()` should be able to preserve `rowid` values.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/592/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1886649402,I_kwDOBm6k_c5wc_w6,2179,Flaky test: test_hidden_sqlite_stat1_table,9599,simonw,closed,0,,,,,0,2023-09-07T22:48:43Z,2023-09-07T22:51:19Z,2023-09-07T22:51:19Z,OWNER,,"This test here: https://github.com/simonw/datasette/blob/fbcb103c0cb6668018ace539a01a6a1f156e8d6a/tests/test_api.py#L1011-L1020 It failed for me like this: `E AssertionError: assert [('normal', False), ('sqlite_stat1', True), ('sqlite_stat4', True)] in ([('normal', False), ('sqlite_stat1', True)],)` Looks like some builds of SQLite include a `sqlite_stat4` table.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2179/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1886350562,I_kwDOBm6k_c5wb2zi,2178,Don't show foreign key links to tables the user cannot access,9599,simonw,closed,0,,,,,5,2023-09-07T17:56:41Z,2023-09-07T23:28:27Z,2023-09-07T23:28:27Z,OWNER,,"Spotted this problem while working on this plugin: - https://github.com/simonw/datasette-public It's possible to make a table public to any users - but then you may end up with situations like this: That table is public, but the foreign key links go to tables that are NOT public. We're also leaking the names of the values in those private tables here, which we shouldn't do. So this is a tiny bit of an information leak. 
Since this only affects people who have configured a table to be public that has foreign keys to a table that is private I don't think this is worth issuing a vulnerability report about - I very much doubt anyone is running Datasette configured in a way that could result in problems because of this.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2178/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1884408624,I_kwDOBm6k_c5wUcsw,2177,Move schema tables from _internal to _catalog,9599,simonw,open,0,,,,,1,2023-09-06T16:58:33Z,2023-09-06T17:04:30Z,,OWNER,,"This came up in discussion over: - https://github.com/simonw/datasette/pull/2174 ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2177/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1879214365,I_kwDOCGYnMM5wAokd,590,Ability to tell if a Database is an in-memory one,9599,simonw,open,0,,,,,1,2023-09-03T19:50:15Z,2023-09-03T19:50:36Z,,OWNER,,"Currently the constructor accepts `memory=True` or `memory_name=...` and uses those to create a connection, but does not record what those values were: https://github.com/simonw/sqlite-utils/blob/1260bdc7bfe31c36c272572c6389125f8de6ef71/sqlite_utils/db.py#L307-L349 This makes it hard to tell if a database object is to an in-memory or a file-based database, which is sometimes useful to know.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/590/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1879209560,I_kwDOCGYnMM5wAnZY,589,Mechanism for de-registering registered SQL functions,9599,simonw,open,0,,,,,3,2023-09-03T19:32:39Z,2023-09-03T19:36:34Z,,OWNER,,I used a custom SQL function in a migration script and then realized that it should be de-registered before the end of the script to avoid leaking into the calling code.,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/589/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1876407598,I_kwDOBm6k_c5v17Uu,2169,execute-sql on a database should imply view-database/view-permission,9599,simonw,closed,0,,,,,0,2023-08-31T22:45:56Z,2023-08-31T22:46:28Z,2023-08-31T22:46:28Z,OWNER,,"I noticed that a token with `execute-sql` permission alone did not work, because it was not allowed to view the instance of the database.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2169/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1876353656,I_kwDOBm6k_c5v1uJ4,2168,Consider a request/response wrapping hook slightly higher level than asgi_wrapper(),9599,simonw,open,0,,,,,6,2023-08-31T21:42:04Z,2023-09-10T17:54:08Z,,OWNER,,"There's a long justification for why this might be needed here: - https://github.com/simonw/datasette-auth-tokens/issues/10#issuecomment-1701820001 Short version: it would be neat if it was possible to stash some data on the `request` object such that a later plugin/middleware-type-thing could use that to influence the final returned 
response - similar to the kinds of things you can do with Django middleware. The `asgi_wrapper()` mechanism doesn't have access to the request or response objects - it gets `scope` and can mess around with `receive` and `send`, but those are pretty low-level primitives. Since Datasette has well-defined `request` and `response` objects now it might be nice to have a middleware layer that can manipulate those directly.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2168/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1875739055,I_kwDOBm6k_c5vzYGv,2167,Document return type of await ds.permission_allowed(),9599,simonw,open,0,,,,,0,2023-08-31T15:14:23Z,2023-08-31T15:14:23Z,,OWNER,,"The return type isn't documented here: https://github.com/simonw/datasette/blob/4c3ef033110407f3b3dbce501659d523724985e0/docs/internals.rst#L327-L350 On inspecting the code I'm not 100% sure if it's possible for this. method to return `None`, or if it can only return `True` or `False`. Need to confirm that. https://github.com/simonw/datasette/blob/4c3ef033110407f3b3dbce501659d523724985e0/datasette/app.py#L822C15-L853",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2167/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1874255116,I_kwDOBm6k_c5vtt0M,2164,Ability to only load a specific list of plugins,9599,simonw,closed,0,,,,,1,2023-08-30T19:33:41Z,2023-09-08T04:35:46Z,2023-08-30T22:12:27Z,OWNER,,"I'm going to try and get this working through an environment variable, so that you can start Datasette and it will only load a subset of plugins including those that use the `register_commands()` hook. Initial research on this: - https://github.com/pytest-dev/pluggy/issues/422",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2164/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1872043170,I_kwDOBm6k_c5vlRyi,2163,Rename core_X to catalog_X in the internals,9599,simonw,closed,0,,,,,1,2023-08-29T16:45:00Z,2023-08-29T17:01:31Z,2023-08-29T17:01:31Z,OWNER,,"Discussed with Alex this morning. We think the American spelling is fine here (it's shorter than `catalogue`) and that it's a slightly less lazy name than `core_`. Follows: - https://github.com/simonw/datasette/issues/2157",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2163/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1871935751,I_kwDOD079W85vk3kH,40, ImportError: cannot import name 'formatargspec' from 'inspect',36752421,hosslikw,closed,0,,,,,0,2023-08-29T15:36:31Z,2023-08-31T03:18:07Z,2023-08-31T03:18:06Z,NONE,,"I get the following error when running ""pip3 install dogsheep-photos"" "" from inspect import ismethod, isclass, formatargspec ImportError: cannot import name 'formatargspec' from 'inspect' (/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/inspect.py). 
Did you mean: 'formatargvalues'?"" Python 3.12.0rc1 sqlite 3.43.0 datasette, version 0.64.3",256834907,dogsheep-photos,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-photos/issues/40/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1868713944,I_kwDOCGYnMM5vYk_Y,588,`table.get(column=value)` option for retrieving things not by their primary key,9599,simonw,open,0,,,,,1,2023-08-28T00:41:23Z,2023-08-28T00:41:54Z,,OWNER,,"This came up working on this feature: - https://github.com/simonw/llm/pull/186 I have a table with this schema: ```sql CREATE TABLE [collections] ( [id] INTEGER PRIMARY KEY, [name] TEXT, [model] TEXT ); CREATE UNIQUE INDEX [idx_collections_name] ON [collections] ([name]); ``` So the primary key is an integer (because it's going to have a huge number of rows foreign key related to it, and I don't want to store a larger text value thousands of times), but there is a unique constraint on the `name` - that would be the primary key column if not for all of those foreign keys. Problem is, fetching the collection by name is actually pretty inconvenient. Fetch by numeric ID: ```python try: table[""collections""].get(1) except NotFoundError: # It doesn't exist ``` Fetching by name: ```python def get_collection(db, collection): rows = db[""collections""].rows_where(""name = ?"", [collection]) try: return next(rows) except StopIteration: raise NotFoundError(""Collection not found: {}"".format(collection)) ``` It would be neat if, for columns where we know that we should always get 0 or one result, we could do this instead: ```python try: collection = table[""collections""].get(name=""entries"") except NotFoundError: # It doesn't exist ``` The existing `.get()` method doesn't have any non-positional arguments, so using `**kwargs` like that should work: https://github.com/simonw/sqlite-utils/blob/1260bdc7bfe31c36c272572c6389125f8de6ef71/sqlite_utils/db.py#L1495",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/588/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1865869205,I_kwDOBm6k_c5vNueV,2157,"Proposal: Make the `_internal` database persistent, customizable, and hidden",15178711,asg017,open,0,,,,,3,2023-08-24T20:54:29Z,2023-08-31T02:45:56Z,,CONTRIBUTOR,,"The current `_internal` database is used by Datasette core to cache info about databases/tables/columns/foreign keys of databases in a Datasette instance. It's a temporary database created at startup, that can only be seen by the root user. See an [example `_internal` DB here](https://latest.datasette.io/_internal), after [logging in as root](https://latest.datasette.io/login-as-root). The current `_internal` database has a few rough edges: - It's part of `datasette.databases`, so many plugins have to specifically exclude `_internal` from their queries [examples here](https://github.com/search?q=datasette+hookimpl+%22_internal%22+language%3APython+-path%3Adatasette%2F&ref=opensearch&type=code) - It's only used by Datasette core and can't be used by plugins or 3rd parties - It's created from scratch at startup and stored in memory. Why is fine, the performance is great, but persistent storage would be nice. Additionally, it would be really nice if plugins could use this `_internal` database to store their own configuration, secrets, and settings. 
For example: - `datasette-auth-tokens` [creates a `_datasette_auth_tokens` table](https://github.com/simonw/datasette-auth-tokens/blob/main/datasette_auth_tokens/__init__.py#L15) to store auth token metadata. This could be moved into the `_internal` database to avoid writing to the gues database - `datasette-socrata` [creates a `socrata_imports`](https://github.com/simonw/datasette-socrata/blob/1409aa9b4d2fc3aff286b52e73af33b5786d56d0/datasette_socrata/__init__.py#L190-L198) table, which also can be in `_internal` - `datasette-upload-csvs` [creates a `_csv_progress_`](https://github.com/simonw/datasette-upload-csvs/blob/main/datasette_upload_csvs/__init__.py#L154) table, which can be in `_internal` - `datasette-write-ui` wants to have the ability for users to toggle whether a table appears editable, which can be either in `datasette.yaml` or on-the-fly by storing config in `_internal` In general, these are specific features that Datasette plugins would have access to if there was a central internal database they could read/write to: - **Dynamic configuration**. Changing the `datasette.yaml` file works, but can be tedious to restart the server every time. Plugins can define their own configuration table in `_internal`, and could read/write to it to store configuration based on user actions (cell menu click, API access, etc.) - **Caching**. If a plugin or Datasette Core needs to cache some expensive computation, they can store it inside `_internal` (possibly as a temporary table) instead of managing their own caching solution. - **Audit logs**. If a plugin performs some sensitive operations, they can log usage info to `_internal` for others to audit later. - **Long running process status**. Many plugins (`datasette-upload-csvs`, `datasette-litestream`, `datasette-socrata`) perform tasks that run for a really long time, and want to give continue status updates to the user. They can store this info inside` _internal` - **Safer authentication**. Passwords and authentication plugins usually store credentials/hashed secrets in configuration files or environment variables, which can be difficult to handle. Now, they can store them in `_internal` ## Proposal - We remove `_internal` from [`datasette.databases`](https://docs.datasette.io/en/latest/internals.html#databases) property. - We add new `datasette.get_internal_db()` method that returns the `_internal` database, for plugins to use - We add a new `--internal internal.db` flag. If provided, then the `_internal` DB will be sourced from that file, and further updates will be persisted to that file (instead of an in-memory database) - When creating internal.db, create a new `_datasette_internal` table to mark it a an ""datasette internal database"" - In `datasette serve`, we check for the existence of the `_datasette_internal` table. If it exists, we assume the user provided that file in error and raise an error. This is to limit the chance that someone accidentally publishes their internal database to the internet. We could optionally add a `--unsafe-allow-internal` flag (or database plugin) that allows someone to do this if they really want to. ## New features unlocked with this These features don't really need a standardized `_internal` table per-say (plugins could currently configure their own long-time storage features if they really wanted to), but it would make it much simpler to create these kinds of features with a persistent application database. 
- **`datasette-comments`** : A plugin for commenting on rows or specific values in a database. Comment contents + threads + email notification info can be stored in `_internal` - **Bookmarks**: ""Bookmarking"" an SQL query could be stored in `_internal`, or a URL link shortener - **Webhooks**: If a plugin wants to either consume a webhook or create a new one, they can store hashed credentials/API endpoints in `_internal`",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2157/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1865649347,I_kwDOBm6k_c5vM4zD,2156,datasette -s/--setting option for setting nested configuration options,9599,simonw,open,0,,,,,4,2023-08-24T18:09:27Z,2023-08-28T19:33:05Z,,OWNER,,"> I've been thinking about what it might look like to allow command-line arguments to be used to define _any_ of the configuration options in `datasette.yml`, as alternative and more convenient syntax. > > Here's what I've come up with: > ``` > datasette \ > -s settings.sql_time_limit_ms 1000 \ > -s plugins.datasette-auth-tokens.manage_tokens true \ > -s plugins.datasette-auth-tokens.manage_tokens_database tokens \ > mydatabase.db tokens.db > ``` > Which would be equivalent to `datasette.yml` containing this: > ```yaml > plugins: > datasette-auth-tokens: > manage_tokens: true > manage_tokens_database: tokens > settings: > sql_time_limit_ms: 1000 > ``` More details in https://github.com/simonw/datasette/issues/2143#issuecomment-1690792514 ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2156/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1865232341,I_kwDOBm6k_c5vLS_V,2153,Datasette --get --actor option,9599,simonw,closed,0,,,,,5,2023-08-24T14:00:03Z,2023-08-28T20:19:15Z,2023-08-28T20:15:53Z,OWNER,,"I experimented with a prototype of this here: - https://github.com/simonw/datasette/issues/2102#issuecomment-1691037971_ Which lets me run requests as if they belonged to a specific actor like this: ```bash datasette fixtures.db --get '/fixtures/facetable.json' --actor '{ ""_r"": { ""r"": { ""fixtures"": { ""facetable"": [ ""vt"" ] } } }, ""a"": ""user"" }' ``` Really useful for testing actors and `_r` options. Is this worth adding as a feature?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2153/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1863810783,I_kwDOBm6k_c5vF37f,2150,form label { width: 15% } is a bad default,9599,simonw,closed,0,,,,,4,2023-08-23T18:22:27Z,2023-08-23T18:37:18Z,2023-08-23T18:35:48Z,OWNER,,"See: - https://github.com/simonw/datasette-configure-fts/issues/14 - https://github.com/simonw/datasette-auth-tokens/issues/12",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2150/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1858228057,I_kwDOBm6k_c5uwk9Z,2147,Plugin hook for database queries that are run,18899,jackowayed,open,0,,,,,6,2023-08-20T18:43:50Z,2023-08-24T03:54:35Z,,NONE,,"I'm interested in making a plugin that saves every query that gets run to a table in the database. 
(I know about datasette-query-history but thought it would be good to have a server-side option.) As far as I can tell reading the docs, there isn't really a hook setup to allow this. Maybe I could hack it with some of the hooks that are passed requests, but that doesn't seem good. I'm a little surprised this isn't possible, so I thought I would open an issue and see if that's a deeply considered decision or just ""haven't needed it yet."" I'm potentially interested in implementing the hook if the latter.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2147/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1857851384,I_kwDOCGYnMM5uvI_4,587,New .add_foreign_key() can break if PRAGMA legacy_alter_table=ON and there's an invalid foreign key reference,9599,simonw,closed,0,,,,,3,2023-08-19T20:01:26Z,2023-08-19T20:04:33Z,2023-08-19T20:04:32Z,OWNER,,"Extremely detailed story of how I got to this point: - https://github.com/simonw/llm/issues/162 Steps to reproduce (only if that pragma is on though): ```bash python -c ' import sqlite_utils db = sqlite_utils.Database(memory=True) db.execute("""""" CREATE TABLE ""logs"" ( [id] INTEGER PRIMARY KEY, [model] TEXT, [prompt] TEXT, [system] TEXT, [prompt_json] TEXT, [options_json] TEXT, [response] TEXT, [response_json] TEXT, [reply_to_id] INTEGER, [chat_id] INTEGER REFERENCES [log]([id]), [duration_ms] INTEGER, [datetime_utc] TEXT ); """""") db[""logs""].add_foreign_key(""reply_to_id"", ""logs"", ""id"") ' ``` This succeeds in some environments, fails in others.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/587/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1857234285,I_kwDOBm6k_c5usyVt,2145,If a row has a primary key of `null` various things break,9599,simonw,open,0,,,,,23,2023-08-18T20:06:28Z,2023-08-21T17:30:01Z,,OWNER,,"Stumbled across this while experimenting with `datasette-write-ui`. 
The error I got was a 500 on the `/db` page: > `'NoneType' object has no attribute 'encode'` Tracked it down to this code, which assembles the URL for a row page: https://github.com/simonw/datasette/blob/943df09dcca93c3b9861b8c96277a01320db8662/datasette/utils/__init__.py#L120-L134 That's because `tilde_encode` can't handle `None`: https://github.com/simonw/datasette/blob/943df09dcca93c3b9861b8c96277a01320db8662/datasette/utils/__init__.py#L1175-L1178 ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2145/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1856075668,I_kwDOCGYnMM5uoXeU,586,.transform() fails to drop column if table is part of a view,9599,simonw,open,0,,,,,3,2023-08-18T05:25:22Z,2023-08-18T06:13:47Z,,OWNER,,"I got this error trying to drop a column from a table that was part of a SQL view: > error in view plugins: no such table: main.pypi_releases Upon further investigation I found that this pattern seemed to fix it: ```python def transform_the_table(conn): # Run this in a transaction: with conn: # We have to read all the views first, because we need to drop and recreate them db = sqlite_utils.Database(conn) views = {v.name: v.schema for v in db.views if table.lower() in v.schema.lower()} for view in views.keys(): db[view].drop() db[table].transform( types=types, rename=rename, drop=drop, column_order=[p[0] for p in order_pairs], ) # Now recreate the views for name, schema in views.items(): db.create_view(name, schema) ``` So grab a copy of any view that might reference this table, start a transaction, drop those views, run the transform, recreate the views again. > I wonder if this should become an option in `sqlite-utils`? Maybe a `recreate_views=True` argument for `table.transform(...)`? Should it be opt-in or opt-out? _Originally posted by @simonw in https://github.com/simonw/datasette-edit-schema/issues/35#issuecomment-1683370548_ ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/586/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1855894222,I_kwDOCGYnMM5unrLO,585,CLI equivalents to `transform(add_foreign_keys=)`,9599,simonw,closed,0,,,,,7,2023-08-18T01:07:15Z,2023-08-18T01:51:16Z,2023-08-18T01:51:15Z,OWNER,,"The new options added in: - #577 Deserve consideration in the CLI as well. https://github.com/simonw/sqlite-utils/blob/d2bcdc00c6ecc01a6e8135e775ffdb87572b802b/sqlite_utils/db.py#L1706-L1708",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/585/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1855885427,I_kwDOBm6k_c5unpBz,2143,De-tangling Metadata before Datasette 1.0,15178711,asg017,open,0,,,,,24,2023-08-18T00:51:50Z,2023-08-24T18:28:27Z,,CONTRIBUTOR,,"Metadata in Datasette is a really powerful feature, but is a bit difficult to work with. It was initially a way to add ""metadata"" about your ""data"" in Datasette instances, like descriptions for databases/tables/columns, titles, source URLs, licenses, etc. But it later became the go-to spot for other Datasette features that have nothing to do with metadata, like permissions/plugins/canned queries. 
Specifically, I've found the following problems when working with Datasette metadata: 1. Metadata cannot be updated without re-starting the entire Datasette instance. 2. The `metadata.json`/`metadata.yaml` has become a kitchen sink of unrelated (imo) features like plugin config, authentication config, canned queries 3. The Python APIs for defining extra metadata are a bit awkward (the `datasette.metadata()` class, `get_metadata()` hook, etc.) ## Possible solutions Here's a few ideas of Datasette core changes we can make to address these problems. ### Re-vamp the Datasette Python metadata APIs The Datasette object has a single `datasette.metadata()` method that's a bit difficult to work with. There's also no Python API for inserting new metadata, so plugins have to rely on the `get_metadata()` hook. The `get_metadata()` hook can also be improved - it doesn't work with async functions yet, so you're quite limited in what you can do. (I'm a bit fuzzy on what to actually do here, but I imagine it'll be very small breaking changes to a few Python methods) ### Add an optional `datasette_metadata` table Datasette should detect and use metadata stored in a new special table called `datasette_metadata`. This would be a regular table that a user can edit on their own, and would serve as a ""live updating"" source of metadata, that can be changed while the Datasette instance is running. Not too sure what the schema would look like, but I'd imagine: ```sql CREATE TABLE datasette_metadata( level text, target any, key text, value any, primary key (level, target) ) ``` Every row in this table would map to a single metadata ""entry"". - `level` would be one of ""datasette"", ""database"", ""table"", ""column"", which is the ""level"" the entry describes. For example, `level=""table""` means it is metadata about a specific table, `level=""database""` for a specific database, or `level=""datasette""` for the entire Datasette instance. - `target` would ""point"" to the specific object the entry metadata is about, and its format would depend on which `level` is specified. - `level=""database""`: `target` would be the string name of the database that the metadata entry is about. ex `""fixtures""` - `level=""table""`: `target` would be a JSON array of two strings. The first element would be the database name, and the second would be the table name. ex `[""fixtures"", ""students""]` - `level=""column""`: `target` would be a JSON array of 3 strings: The database name, table name, and column name. Ex `[""fixtures"", ""students"", ""student_id""]` - `key` would be the type of metadata entry the row has, similar to the current ""keys"" that exist in `metadata.json`. Ex `""about_url""`, `""source""`, `""description""`, etc - `value` would be the text value of the metadata entry. The literal text value of a description, about_url, column_label, etc A quick sample:

level | target | key | value
-- | -- | -- | --
datasette | NULL | title | my datasette title...
db | fixtures | source | 
table | [""fixtures"", ""students""] | label_column | student_name
column | [""fixtures"", ""students"", ""birthdate""] | description | 

This `datasette_metadata` would be configured with other tools, and hopefully not manually by end users. Datasette Core could also offer a UI for editing entries in `datasette_metadata`, to update descriptions/columns on the fly. 
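As a purely illustrative sketch (none of this exists yet), the sample entries above could be written with plain SQL against the proposed schema: ```sql -- Hypothetical inserts matching the proposed datasette_metadata schema INSERT INTO datasette_metadata(level, target, key, value) VALUES ('datasette', NULL, 'title', 'my datasette title...'); INSERT INTO datasette_metadata(level, target, key, value) VALUES ('table', json_array('fixtures', 'students'), 'label_column', 'student_name'); ``` Using `json_array()` keeps the `target` values consistent with the JSON-array encoding described above. 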
### Re-vamp `metadata.json` and move non-metadata config to another place The motivation behind this is that it's awkward that `metadata.json` contains config about things that are not strictly metadata, including: - Plugin configuration - [Authentication/permissions](https://docs.datasette.io/en/latest/authentication.html#access-permissions-in-metadata) (ex the `allow` key on datasettes/databases/tables) - Canned queries. This might be controversial, but in my mind, canned queries are application-specific code and configuration, and don't describe the data that exists in SQLite databases. I think we should move these outside of `metadata.json` and into a different file. The `datasette.json` idea in #2093 may be a good solution here: plugin/permissions/canned queries can be defined in `datasette.json`, while `metadata.json`/`datasette_metadata` will strictly be about documenting databases/tables/columns. ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2143/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1855836914,I_kwDOCGYnMM5undLy,583,Get rid of test.utils.collapse_whitespace,9599,simonw,closed,0,,,,,1,2023-08-17T23:31:09Z,2023-08-18T00:59:19Z,2023-08-18T00:59:19Z,OWNER,,"I have a neater pattern for this now - instead of: https://github.com/simonw/sqlite-utils/blob/1dc6b5aa644a92d3654f7068110ed7930989ce71/tests/test_create.py#L472-L475 I now prefer: https://github.com/simonw/sqlite-utils/blob/1dc6b5aa644a92d3654f7068110ed7930989ce71/tests/test_create.py#L1163-L1171",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/583/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1847201263,I_kwDOBm6k_c5uGg3v,2140,Remove all remaining documentation instances of '$ ',9599,simonw,closed,0,,,,,1,2023-08-11T17:42:13Z,2023-08-11T17:52:25Z,2023-08-11T17:45:00Z,OWNER,,"For example this: https://github.com/simonw/datasette/blob/4535568f2ce907af646304d0ebce2500ebd55677/docs/authentication.rst?plain=1#L33-L35 The problem with that `$ ` prefix is that it prevents users from copying and pasting the raw command. https://docs.datasette.io/en/stable/authentication.html#using-the-root-actor",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2140/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1846076261,I_kwDOBm6k_c5uCONl,2139,border-color: ##ff0000 bug - two hashes,9599,simonw,closed,0,,,8755003,Datasette 1.0a-next,2,2023-08-11T01:22:58Z,2023-08-11T05:16:24Z,2023-08-11T05:16:24Z,OWNER,,"Spotted this on https://latest.datasette.io/extra_database ```html
```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2139/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1844213115,I_kwDOBm6k_c5t7HV7,2138,on_success_message_sql option for writable canned queries,9599,simonw,closed,0,,,8755003,Datasette 1.0a-next,2,2023-08-10T00:20:14Z,2023-08-10T00:39:40Z,2023-08-10T00:34:26Z,OWNER,,"> Or... how about if the `on_success_message` option could define a SQL query to be executed to generate that message? Maybe `on_success_message_sql`. - https://github.com/simonw/datasette/issues/2134",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2138/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1843821954,I_kwDOBm6k_c5t5n2C,2137,Redesign row default JSON,9599,simonw,open,0,,,8755003,Datasette 1.0a-next,1,2023-08-09T18:49:11Z,2023-08-09T19:02:47Z,,OWNER,,"This URL here: https://latest.datasette.io/fixtures/simple_primary_key/1.json?_extras=foreign_key_tables ```json { ""database"": ""fixtures"", ""table"": ""simple_primary_key"", ""rows"": [ { ""id"": ""1"", ""content"": ""hello"" } ], ""columns"": [ ""id"", ""content"" ], ""primary_keys"": [ ""id"" ], ""primary_key_values"": [ ""1"" ], ""units"": {}, ""foreign_key_tables"": [ { ""other_table"": ""foreign_key_references"", ""column"": ""id"", ""other_column"": ""foreign_key_with_blank_label"", ""count"": 0, ""link"": ""/fixtures/foreign_key_references?foreign_key_with_blank_label=1"" }, { ""other_table"": ""foreign_key_references"", ""column"": ""id"", ""other_column"": ""foreign_key_with_label"", ""count"": 1, ""link"": ""/fixtures/foreign_key_references?foreign_key_with_label=1"" }, { ""other_table"": ""complex_foreign_keys"", ""column"": ""id"", ""other_column"": ""f3"", ""count"": 1, ""link"": ""/fixtures/complex_foreign_keys?f3=1"" }, { ""other_table"": ""complex_foreign_keys"", ""column"": ""id"", ""other_column"": ""f2"", ""count"": 0, ""link"": ""/fixtures/complex_foreign_keys?f2=1"" }, { ""other_table"": ""complex_foreign_keys"", ""column"": ""id"", ""other_column"": ""f1"", ""count"": 1, ""link"": ""/fixtures/complex_foreign_keys?f1=1"" } ], ""query_ms"": 4.226590999678592, ""source"": ""tests/fixtures.py"", ""source_url"": ""https://github.com/simonw/datasette/blob/main/tests/fixtures.py"", ""license"": ""Apache License 2.0"", ""license_url"": ""https://github.com/simonw/datasette/blob/main/LICENSE"", ""ok"": true, ""truncated"": false } ``` That `?_extras=` should be `?_extra=` - plus the row JSON should be redesigned to fit the new default JSON representation.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2137/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1843710170,I_kwDOBm6k_c5t5Mja,2136,Query view shouldn't return `columns`,9599,simonw,closed,0,,,9700784,Datasette 1.0a3,4,2023-08-09T17:23:57Z,2023-08-09T19:03:04Z,2023-08-09T19:03:04Z,OWNER,,"I just noticed that https://latest.datasette.io/fixtures/roadside_attraction_characteristics.json?_labels=on&_size=1 returns: ```json { ""ok"": true, ""next"": ""1"", ""rows"": [ { ""rowid"": 1, ""attraction_id"": { ""value"": 1, ""label"": ""The Mystery Spot"" }, ""characteristic_id"": { ""value"": 2, ""label"": 
""Paranormal"" } } ], ""truncated"": false } ``` But https://latest.datasette.io/fixtures.json?sql=select+rowid%2C+attraction_id%2C+characteristic_id+from+roadside_attraction_characteristics+order+by+rowid+limit+1 returns: ```json { ""rows"": [ { ""rowid"": 1, ""attraction_id"": 1, ""characteristic_id"": 2 } ], ""columns"": [ ""rowid"", ""attraction_id"", ""characteristic_id"" ], ""ok"": true, ""truncated"": false } ``` The `columns` key in the query response is inconsistent with the table response.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2136/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1843600087,I_kwDOBm6k_c5t4xrX,2135,Release notes for 1.0a3,9599,simonw,closed,0,,,9700784,Datasette 1.0a3,3,2023-08-09T16:09:26Z,2023-08-09T19:17:07Z,2023-08-09T19:17:06Z,OWNER,,118 commits! https://github.com/simonw/datasette/compare/1.0a2...26be9f0445b753fb84c802c356b0791a72269f25,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2135/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1843391585,I_kwDOBm6k_c5t3-xh,2134,Add writable canned query demo to latest.datasette.io,9599,simonw,closed,0,,,,,5,2023-08-09T14:31:30Z,2023-08-10T01:22:46Z,2023-08-10T01:05:56Z,OWNER,,"This would be useful while working on: - #2114",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2134/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1841501975,I_kwDOBm6k_c5twxcX,2133,[feature request]`datasette install plugins.json` options,54462,HaveF,closed,0,,,,,9,2023-08-08T15:06:50Z,2023-08-10T00:31:24Z,2023-08-09T22:04:46Z,NONE,,"Hi, simon ❤️ `datasette plugins --all > plugins.json` could generate all plugins info. On another machine, it would be great to install all plugins just by `datasette install plugins.json`",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2133/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1841343173,I_kwDOBm6k_c5twKrF,2132,Get form fields on query page working again ,9599,simonw,closed,0,,,9700784,Datasette 1.0a3,1,2023-08-08T13:39:05Z,2023-08-08T13:45:10Z,2023-08-08T13:45:09Z,OWNER,,"Caused by: - #2112 https://latest.datasette.io/fixtures?sql=select+pk1%2C+pk2%2C+pk3%2C+content+from+compound_three_primary_keys+where+%22pk1%22+%3D+%3Ap0+order+by+pk1%2C+pk2%2C+pk3+limit+101&p0=b The `:p0` form field is missing. 
Submitting the form results in this error: ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2132/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1840417903,I_kwDOBm6k_c5tsoxv,2131,Refactor code that supports templates_considered comment,9599,simonw,open,0,,,3268330,Datasette 1.0,1,2023-08-08T01:28:36Z,2023-08-09T15:27:41Z,,OWNER,,"I ended up duplicating it here: https://github.com/simonw/datasette/blob/7532feb424b1dce614351e21b2265c04f9669fe2/datasette/views/database.py#L164-L167 I think it should move to `datasette.render_template()` - and maybe have a renamed template variable too.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2131/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1840329615,I_kwDOBm6k_c5tsTOP,2130,Render plugin mechanism needs `error` and `truncated` fields,9599,simonw,closed,0,,,9700784,Datasette 1.0a3,2,2023-08-07T23:19:19Z,2023-08-08T01:51:54Z,2023-08-08T01:47:42Z,OWNER,,"While working on: - https://github.com/simonw/datasette/pull/2118 It became clear that the `render` callback function documented here: https://docs.datasette.io/en/0.64.3/plugin_hooks.html#register-output-renderer-datasette Needs to grow the ability to be told if an error occurred (an `error` string) and if the results were truncated (a `truncated` boolean).",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2130/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1840324765,I_kwDOBm6k_c5tsSCd,2129,CSV ?sql= should indicate errors,9599,simonw,open,0,,,3268330,Datasette 1.0,1,2023-08-07T23:13:04Z,2023-08-08T02:02:21Z,,OWNER,,"> https://latest.datasette.io/_memory.csv?sql=select+blah is a blank page right now: ```bash curl -I 'https://latest.datasette.io/_memory.csv?sql=select+blah' ``` ``` HTTP/2 200 access-control-allow-origin: * access-control-allow-headers: Authorization, Content-Type access-control-expose-headers: Link access-control-allow-methods: GET, POST, HEAD, OPTIONS access-control-max-age: 3600 content-type: text/plain; charset=utf-8 x-databases: _memory, _internal, fixtures, fixtures2, extra_database, ephemeral date: Mon, 07 Aug 2023 23:12:15 GMT server: Google Frontend ``` _Originally posted by @simonw in https://github.com/simonw/datasette/issues/2118#issuecomment-1668688947_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2129/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1839344979,I_kwDOCGYnMM5toi1T,582,Handling CSV/file input that contains NUL bytes,1448859,betatim,open,0,,,,,0,2023-08-07T12:24:14Z,2023-08-07T12:24:14Z,,NONE,,"I was using sqlite-utils to create a DB from a CSV and it turns out the CSV contains a NUL byte. When the processing reaches the line that contains the NUL, an exception is raised. I'm wondering if there is something that can be done in `sqlite-utils` to say ""skip lines with encoding errors"" or some such. I think it isn't super straightforward though as the exception comes from inside the `csv` module that does all the parsing. 
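For what it's worth, a common workaround outside of `sqlite-utils` is to strip the NULs from the stream before the `csv` module ever sees them - a minimal sketch, not `sqlite-utils` functionality: ```python import csv # csv.reader() accepts any iterable of strings, so drop NUL bytes line by line with open('KernelVersions.csv', newline='') as f: reader = csv.reader(line.replace('\0', '') for line in f) for row in reader: ... # insert into the database as usual ``` 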
Concretely the file is the `KernelVersions.csv` from https://www.kaggle.com/datasets/kaggle/meta-kaggle This is the command and output: ``` $ sqlite-utils insert --csv kaggle.db kaggle KernelVersions.csv [------------------------------------] 0% [#####################---------------] 60% 00:04:24Traceback (most recent call last): File ""/home/foobar/miniconda/envs/meta-kaggle/bin/sqlite-utils"", line 10, in sys.exit(cli()) File ""/home/foobar/miniconda/envs/meta-kaggle/lib/python3.10/site-packages/click/core.py"", line 1128, in __call__ return self.main(*args, **kwargs) File ""/home/foobar/miniconda/envs/meta-kaggle/lib/python3.10/site-packages/click/core.py"", line 1053, in main rv = self.invoke(ctx) File ""/home/foobar/miniconda/envs/meta-kaggle/lib/python3.10/site-packages/click/core.py"", line 1659, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/home/foobar/miniconda/envs/meta-kaggle/lib/python3.10/site-packages/click/core.py"", line 1395, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/home/foobar/miniconda/envs/meta-kaggle/lib/python3.10/site-packages/click/core.py"", line 754, in invoke return __callback(*args, **kwargs) File ""/home/foobar/miniconda/envs/meta-kaggle/lib/python3.10/site-packages/sqlite_utils/cli.py"", line 1223, in insert insert_upsert_implementation( File ""/home/foobar/miniconda/envs/meta-kaggle/lib/python3.10/site-packages/sqlite_utils/cli.py"", line 1085, in insert_upsert_implementation db[table].insert_all( File ""/home/foobar/miniconda/envs/meta-kaggle/lib/python3.10/site-packages/sqlite_utils/db.py"", line 3198, in insert_all chunk = list(chunk) File ""/home/foobar/miniconda/envs/meta-kaggle/lib/python3.10/site-packages/sqlite_utils/db.py"", line 3742, in fix_square_braces for record in records: File ""/home/foobar/miniconda/envs/meta-kaggle/lib/python3.10/site-packages/sqlite_utils/cli.py"", line 1071, in docs = (decode_base64_values(doc) for doc in docs) File ""/home/foobar/miniconda/envs/meta-kaggle/lib/python3.10/site-packages/sqlite_utils/cli.py"", line 1068, in docs = (verify_is_dict(doc) for doc in docs) File ""/home/foobar/miniconda/envs/meta-kaggle/lib/python3.10/site-packages/sqlite_utils/cli.py"", line 1003, in docs = (dict(zip(headers, row)) for row in reader) _csv.Error: line contains NUL ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/582/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1838469176,I_kwDOBm6k_c5tlNA4,2127,Context base class to support documenting the context,9599,simonw,open,0,,,3268330,Datasette 1.0,3,2023-08-07T00:01:02Z,2023-08-10T01:30:25Z,,OWNER,,"This idea first came up here: - https://github.com/simonw/datasette/issues/2112#issuecomment-1652751140 If `datasette.render_template(...)` takes an optional `Context` subclass as an alternative to a context dictionary, I could then use dataclasses to define the context made available to specific templates - which then gives me something I can use to help document what they are. 
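A minimal sketch of the idea (the class and field names here are hypothetical, purely to illustrate the dataclass approach): ```python from dataclasses import dataclass @dataclass class Context: 'Base class for documented template contexts' @dataclass class QueryContext(Context): # Each field documents one template variable database: str # name of the database being queried sql: str # the SQL query being executed ``` Then something like `await datasette.render_template('query.html', QueryContext(database='fixtures', sql='select 1'))` could document and type-check the context in one place. 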
Also refs: - https://github.com/simonw/datasette/issues/1510",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2127/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1838266862,I_kwDOBm6k_c5tkbnu,2126,Permissions in metadata.yml / metadata.json,36199671,ctsrc,closed,0,,,,,3,2023-08-06T16:24:10Z,2023-08-11T05:52:30Z,2023-08-11T05:52:29Z,NONE,,"https://docs.datasette.io/en/latest/authentication.html#other-permissions-in-metadata says the following: > For all other permissions, you can use one or more ""permissions"" blocks in your metadata. > To grant access to the permissions debug tool to all signed in users you can grant permissions-debug to any actor with an id matching the wildcard * by adding this at the root of your metadata: ```yaml permissions: debug-menu: id: '*' ``` I tried this. My `metadata.yml` file looks like: ```yaml permissions: debug-menu: id: '*' permissions-debug: id: '*' plugins: datasette-auth-passwords: myuser_password_hash: $env: ""PASSWORD_HASH_MYUSER"" ``` And then I run ```zsh datasette -m metadata.yml tiddlywiki.db --root ``` And I open a session for the ""root"" user of datasette with the link given. I open a private browser session and log in as ""myuser"" from http://127.0.0.1:8001/-/login Then I check http://127.0.0.1:8001/-/actor which confirms that I am logged in as the ""myuser"" actor ```json { ""actor"": { ""id"": ""myuser"" } } ``` In the session where I am logged in as ""myuser"" I then try to go to http://127.0.0.1:8001/-/permissions But all I get there as the logged in user ""myuser"" is > Forbidden > > Permission denied And then if I check the http://127.0.0.1:8001/-/permissions as the datasette ""root"" user from another browser session, I see: > permissions-debug checked at 2023-08-06T16:22:58.997841 ✗ (used default) > > Actor: {""id"": ""myuser""} It seems that in spite of having tried to give the `permissions-debug` permission to the ""myuser"" user in my `metadata.yml` file, datasette does not agree that ""myuser"" has permission `permissions-debug`. What do I need to do differently so that my ""myuser"" user is able to access http://127.0.0.1:8001/-/permissions ?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2126/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1825007061,I_kwDOBm6k_c5sx2XV,2123,datasette serve when invoked with --reload interprets the serve command as a file,79087,cadeef,open,0,,,,,2,2023-07-27T19:07:22Z,2023-09-18T13:02:46Z,,NONE,,"When running `datasette serve` with the `--reload` flag, the serve command is picked up as a file argument: ``` $ datasette serve --reload test_db Starting monitor for PID 13574. Error: Invalid value for '[FILES]...': Path 'serve' does not exist. Press ENTER or change a file to reload. ``` If a 'serve' file is created it launches properly (albeit with an empty database called serve): ``` $ touch serve; datasette serve --reload test_db Starting monitor for PID 13628. INFO: Started server process [13628] INFO: Waiting for application startup. INFO: Application startup complete. 
INFO: Uvicorn running on http://127.0.0.1:8001 (Press CTRL+C to quit) ``` Version (running from HEAD on main): ``` $ datasette --version datasette, version 1.0a2 ``` This issue appears to have existed for a while as https://github.com/simonw/datasette/issues/1380#issuecomment-953366110 mentions the error in a different context. I'm happy to debug and land a patch if it's welcome.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2123/reactions"", ""total_count"": 2, ""+1"": 2, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1824457306,I_kwDOBm6k_c5svwJa,2122,Parameters on canned queries: fixed or query-generated list?,1563881,meowcat,open,0,,,,,0,2023-07-27T14:07:07Z,2023-07-27T14:07:07Z,,NONE,,"Hi, currently parameters in canned queries are just text fields. It would be cool to have one of the options below. Would you accept a PR doing something in this direction? (Possibly this could even work as a plugin.) * adding facets, which would work like facets on tables or views, giving a list of selectable options (and leaving parameters as is) * making it possible to provide a query which returns selectable values for a parameter, e.g. ``` calendar_entries_current_instrument: sql: | select * from calendar_entries where DTEND_UNIX > UNIXEPOCH() and DTSTART_UNIX < UNIXEPOCH() + :days *24*60*60 and current = 1 and MACHINE = :instrument order by DTSTART_UNIX params: days: sql: ""SELECT VALUE FROM generate_series(1, 30, 1)"" # this obviously requires the corresponding sqlite extension instrument: sql: ""SELECT DISTINCT MACHINE FROM calendar_entries"" ``` * making it possible to provide a fixed list of parameters ``` calendar_entries_current_instrument: sql: | select * from calendar_entries where DTEND_UNIX > UNIXEPOCH() and DTSTART_UNIX < UNIXEPOCH() + :days *24*60*60 and current = 1 and MACHINE = :instrument order by DTSTART_UNIX params: days: values: [1, 2, 3, 5, 10, 20, 30] instrument: values: [supermachine, crappymachine, boringmachine] ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2122/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1823428714,I_kwDOBm6k_c5sr1Bq,2120,Add __all__ to datasette/__init__.py,9599,simonw,open,0,,,,,0,2023-07-27T01:07:10Z,2023-07-27T01:07:10Z,,OWNER,,"Currently looks like this: https://github.com/simonw/datasette/blob/08181823990a71ffa5a1b57b37259198eaa43e06/datasette/__init__.py#L1-L6 Adding `__all__ = [""Permission"", ""Forbidden""...]` would let me get rid of those `# noqa` comments.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2120/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1823393475,I_kwDOBm6k_c5srsbD,2119,"database color shows only on index page, not other pages",9599,simonw,closed,0,,,3268330,Datasette 1.0,3,2023-07-27T00:19:39Z,2023-08-11T05:25:45Z,2023-08-11T05:16:24Z,OWNER,,"I think this has been a bug for a long time. https://latest.datasette.io/ currently shows: Those colors are based on a hash of the database name. 
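(For reference, the color is derived from a truncated hash of the name - something like this sketch, which approximates rather than reproduces the actual implementation: ```python import hashlib def database_color(name): # First 6 hex digits of the MD5 hash give a stable CSS color per database return hashlib.md5(name.encode('utf8')).hexdigest()[:6] ``` ) 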
But when you click through to https://latest.datasette.io/fixtures It's red on all sub-pages too.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2119/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1823160748,I_kwDOCGYnMM5sqzms,581,`sqlite-utils convert --pdb` option,9599,simonw,closed,0,,,,,1,2023-07-26T21:02:50Z,2023-07-26T21:07:45Z,2023-07-26T21:06:10Z,OWNER,,While using `sqlite-utils convert` I realized it would be handy if you could pass `--pdb` to have it open the debugger at the first instance of a failed conversion.,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/581/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1822982933,I_kwDOBm6k_c5sqIMV,2117,Figure out what to do about `DatabaseView.name`,9599,simonw,closed,0,,,9700784,Datasette 1.0a3,1,2023-07-26T18:58:06Z,2023-08-08T02:02:07Z,2023-08-08T02:02:07Z,OWNER,,"In the old code: https://github.com/simonw/datasette/blob/08181823990a71ffa5a1b57b37259198eaa43e06/datasette/views/database.py#L34-L35 This `name` class attribute was later used by some of the plugin hooks, passed as `view_name`: https://github.com/simonw/datasette/blob/18dd88ee4d78fe9d760e9da96028ae06d938a85c/datasette/hookspecs.py#L50-L54 Figure out how that should work once I've refactored those classes to view functions instead. Refs: - #2109 ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2117/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1822949756,I_kwDOBm6k_c5sqAF8,2116,Turn DatabaseDownload into an async view function,9599,simonw,closed,0,,,9700784,Datasette 1.0a3,3,2023-07-26T18:31:59Z,2023-07-26T18:44:00Z,2023-07-26T18:44:00Z,OWNER,,"A minor refactor, but it is a good starting point for this new branch. 
Refs: - #2109",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2116/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1822940964,I_kwDOBm6k_c5sp98k,2115,Ensure all tests pass against new query view JSON,9599,simonw,closed,0,,,9700784,Datasette 1.0a3,0,2023-07-26T18:25:20Z,2023-08-08T02:01:39Z,2023-08-08T02:01:38Z,OWNER,,- #2109 ,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2115/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1822940263,I_kwDOBm6k_c5sp9xn,2114,Implement canned queries against new query JSON work,9599,simonw,closed,0,,,9700784,Datasette 1.0a3,3,2023-07-26T18:24:50Z,2023-08-09T15:26:58Z,2023-08-09T15:26:57Z,OWNER,,- #2109 ,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2114/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1822939274,I_kwDOBm6k_c5sp9iK,2113,Implement and document extras for the new query view page,9599,simonw,open,0,,,8755003,Datasette 1.0a-next,3,2023-07-26T18:24:01Z,2023-08-09T17:35:22Z,,OWNER,,- #2109 ,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2113/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1822938661,I_kwDOBm6k_c5sp9Yl,2112,Build HTML version of /content?sql=...,9599,simonw,closed,0,,,9700784,Datasette 1.0a3,5,2023-07-26T18:23:34Z,2023-08-08T02:01:09Z,2023-08-08T02:01:01Z,OWNER,,"This will help make the hook as robust as possible. - #2109 ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2112/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1822937426,I_kwDOBm6k_c5sp9FS,2111,Implement new /content.json?sql=...,9599,simonw,closed,0,,,9700784,Datasette 1.0a3,4,2023-07-26T18:22:39Z,2023-08-08T02:00:37Z,2023-08-08T02:00:22Z,OWNER,,"This will be the base that the remaining work builds on top of. Refs: - #2109 ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2111/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1822936521,I_kwDOBm6k_c5sp83J,2110,Merge database index page and query view,9599,simonw,closed,0,,,9700784,Datasette 1.0a3,1,2023-07-26T18:21:57Z,2023-07-26T19:53:25Z,2023-07-26T19:53:25Z,OWNER,,"Refs: - #2109 The idea here is that hitting `/content` without a `?sql=` will show an empty result set AND default to including a bunch of extras about the list of tables in the database. 
Then I won't have to think about `/content` and `/content?sql=` as separate pages any more.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2110/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1822934563,I_kwDOBm6k_c5sp8Yj,2109,Plan for getting the new JSON format query views working,9599,simonw,closed,0,,,9700784,Datasette 1.0a3,5,2023-07-26T18:20:18Z,2023-07-27T00:24:47Z,2023-07-26T18:25:34Z,OWNER,,"I've been stuck on this for too long. I'm breaking it down into a full milestone: https://github.com/simonw/datasette/milestone/29",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2109/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1822918995,I_kwDOCGYnMM5sp4lT,580,Add way to export to a csv file using the Python library,44324811,kevinlinxc,open,0,,,,,0,2023-07-26T18:09:26Z,2023-07-26T18:09:26Z,,NONE,,"According to the documentation, we can make a csv output using the CLI tool, but not the Python library. Could we have the latter?",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/580/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1822813627,I_kwDOBm6k_c5spe27,2108,some (many?) SQL syntax errors are not throwing errors with a .csv endpoint,536941,fgregg,open,0,,,,,0,2023-07-26T16:57:45Z,2023-07-26T16:58:07Z,,CONTRIBUTOR,,"here's a CTE query that should always fail with a syntax error: ```sql with foo as (nonsense) select * from foo; ``` when we make this query against the default endpoint, we do indeed get a 400 status code and the problem is returned to the user: https://global-power-plants.datasettes.com/global-power-plants?sql=with+foo+as+%28nonsense%29+select+*+from+foo%3B but, if we use the csv endpoint, we get a 200 status code and no indication of a problem: https://global-power-plants.datasettes.com/global-power-plants.csv?sql=with+foo+as+%28nonsense%29+select+*+from+foo%3B same with this bad sql ```sql select a, from foo; ``` https://global-power-plants.datasettes.com/global-power-plants?sql=select%0D%0A++a%2C%0D%0Afrom%0D%0A++foo%3B vs https://global-power-plants.datasettes.com/global-power-plants.csv?sql=select%0D%0A++a%2C%0D%0Afrom%0D%0A++foo%3B but, datasette catches this bad sql at both endpoints: ```sql slect a from foo; ``` https://global-power-plants.datasettes.com/global-power-plants?sql=slect%0D%0A++a%0D%0Afrom%0D%0A++foo%3B https://global-power-plants.datasettes.com/global-power-plants.csv?sql=slect%0D%0A++a%0D%0Afrom%0D%0A++foo%3B ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2108/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1821108702,I_kwDOCGYnMM5si-ne,579,Special handling for SQLite column of type `JSON`,15178711,asg017,open,0,,,,,0,2023-07-25T20:37:23Z,2023-07-25T20:37:23Z,,CONTRIBUTOR,,"`sqlite-utils` should detect and have special handling for columns with a declared `JSON` type. 
For example: ```sql CREATE TABLE ""dogs"" ( id INTEGER PRIMARY KEY, name TEXT, friends JSON ); ``` ## Automatic Nesting According to [""Nested JSON Values""](https://sqlite-utils.datasette.io/en/stable/cli.html#nested-json-values), sqlite-utils will only expand JSON if the `--json-cols` flag is passed. It looks like it'll try to `json.load` all text columns to test if it's JSON, which can get expensive on non-json columns. Instead, `sqlite-utils` should by default (ie without the `--json-cols` flag) do the `maybe_json()` operation on columns with a declared `JSON` type. So the above table would expand the `""friends""` column as expected, without the `--json-cols` flag: ```bash sqlite-utils dogs.db ""select * from dogs"" | python -mjson.tool ``` ``` [ { ""id"": 1, ""name"": ""Cleo"", ""friends"": [ { ""name"": ""Pancakes"" }, { ""name"": ""Bailey"" } ] } ] ``` --- I'm sure there's other ways `sqlite-utils` can specially handle JSON columns, so keeping this open while I think of more",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/579/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1818838294,I_kwDOCGYnMM5saUUW,578,Plugin hook for adding new output formats,9599,simonw,open,0,,,,,5,2023-07-24T17:29:18Z,2023-08-07T15:41:49Z,,OWNER,,"> What would it take to add a format hook? I'm still thinking about my GIS workflow, and being able to do `sqlite-utils query ... --geojson` would be nice. It's the one place my Datasette workflow is messy, having to do `datasette . --get /path/to/query.geojson --setting max_rows_returned 10000 --load-extension spatialite`. > I know the current pattern is `--csv`, but maybe `--format geojson` is more future-proof. https://discord.com/channels/823971286308356157/997738192360964156/1133076679011602432",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/578/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1817289521,I_kwDOCGYnMM5sUaMx,577,Get `add_foreign_keys()` to work without modifying `sqlite_master`,9599,simonw,closed,0,,,,,9,2023-07-23T20:40:18Z,2023-08-18T17:43:11Z,2023-08-18T00:48:10Z,OWNER,,"https://github.com/simonw/sqlite-utils/blob/13ebcc575d2547c45e8d31288b71a3242c16b886/sqlite_utils/db.py#L1165-L1174 This is the only place in the code that attempts to modify `sqlite_master` directly, which fails on some Python installations. Could this use the `.transform()` trick instead? Or automatically switch to that trick if it hits an error?",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/577/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1817281557,I_kwDOC8SPRc5sUYQV,37,cannot use jinja filters in display?,10352819,rprimet,closed,0,,,,,1,2023-07-23T20:09:54Z,2023-07-23T20:18:27Z,2023-07-23T20:18:26Z,NONE,,"Hi, I'm trying to have a display function in Dogsheep's `config.yml` that includes something like this: ```
{{ display.title }} (source)
{{ display.snippet|safe }}
``` Unfortunately, rendering fails with a message 'urls is undefined'. The same happens if I'm trying to build a row URL manually, using filters like `quote_plus` (as my keys are URLs). Any hints? Thanks!",197431109,dogsheep-beta,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-beta/issues/37/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1816997390,I_kwDOCGYnMM5sTS4O,576,Backfill the release notes prior to 0.4,9599,simonw,closed,0,,,,,2,2023-07-23T05:41:42Z,2023-07-23T05:49:51Z,2023-07-23T05:48:21Z,OWNER,,"Currently the changelog starts at 0.4: https://sqlite-utils.datasette.io/en/3.34/changelog.html#id115 I want the other releases - according to https://pypi.org/project/sqlite-utils/#history there are three missing: ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/576/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1816919568,I_kwDOCGYnMM5sS_4Q,575,Python API ability to opt-out of connection plugins,9599,simonw,closed,0,,,,,2,2023-07-22T23:01:13Z,2023-07-22T23:17:22Z,2023-07-22T23:08:22Z,OWNER,,"Plugins affecting the CLI by default makes sense to me. I'm less confident about them _always_ affecting users of the Python API. I'm going to have them apply by default, but I'm going to add a mechanism to opt-out on an individual database basis. Basically this: ```python from sqlite_utils import Database db = Database(memory=True, execute_plugins=False) # Anything using db from here on will not execute plugins ``` cc @asg017 Refs: - #567 - #574 ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/575/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1816918185,I_kwDOCGYnMM5sS_ip,574,`prepare_connection()` plugin hook,9599,simonw,closed,0,,,,,3,2023-07-22T22:52:47Z,2023-07-22T23:13:14Z,2023-07-22T22:59:10Z,OWNER,,"> Splitting off an issue for `prepare_connection()` since Alex got the PR in seconds before I shipped 3.34! 
_Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/567#issuecomment-1646686424_ ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/574/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1816877910,I_kwDOCGYnMM5sS1tW,572,Don't test Python 3.7 against textual,9599,simonw,closed,0,,,,,2,2023-07-22T19:57:03Z,2023-07-22T22:16:50Z,2023-07-22T22:16:50Z,OWNER,,"Spotted this in the GitHub Actions logs: ![IMG_5046](https://github.com/simonw/sqlite-utils/assets/9599/81fb1093-cd8a-4019-a612-2e49b500c933) ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/572/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1816876211,I_kwDOCGYnMM5sS1Sz,571,`.transform(keep_table=...)` option,9599,simonw,closed,0,,,,,1,2023-07-22T19:49:29Z,2023-07-22T22:32:18Z,2023-07-22T22:32:18Z,OWNER,,">> Also need a design for an option for the `.transform()` method to indicate that the new table should be created with a new name without dropping the old one. > > I think `keep_table=""name_of_table""` is good for this. _Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/565#issuecomment-1646657324_ ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/571/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1816857442,I_kwDOBm6k_c5sSwti,2106,`datasette install -e` option,9599,simonw,closed,0,,,,,3,2023-07-22T18:33:42Z,2023-07-26T18:28:33Z,2023-07-22T18:42:54Z,OWNER,,"As seen in LLM and now in `sqlite-utils` too: - https://github.com/simonw/sqlite-utils/issues/570 Useful for developing plugins, see tutorial at https://llm.datasette.io/en/stable/plugins/tutorial-model-plugin.html",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2106/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1816857105,I_kwDOCGYnMM5sSwoR,570,`sqlite-utils install -e` option,9599,simonw,closed,0,,,,,0,2023-07-22T18:32:23Z,2023-07-22T18:55:59Z,2023-07-22T18:32:56Z,OWNER,,"As seen in LLM. Needed while working on: - #567",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/570/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1816852402,I_kwDOCGYnMM5sSvey,569,register_command plugin hook,9599,simonw,closed,0,,,,,3,2023-07-22T18:17:27Z,2023-07-22T19:19:35Z,2023-07-22T19:19:35Z,OWNER,,"> I'm going to start by adding the `register_command` hook using the exact same pattern as Datasette and LLM. 
_Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/567#issuecomment-1646643450_ ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/569/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1816851056,I_kwDOCGYnMM5sSvJw,568,"table.create(..., replace=True)",9599,simonw,closed,0,,,,,7,2023-07-22T18:12:22Z,2023-07-22T19:25:35Z,2023-07-22T19:15:44Z,OWNER,,"Found myself using this pattern to quickly prototype a schema: ```python import sqlite_utils db = sqlite_utils.Database(memory=True) print(db[""answers_chunks""].create({ ""id"": int, ""content"": str, ""embedding_type_id"": int, ""embedding"": bytes, ""embedding_content_md5"": str, ""source"": str, }, pk=""id"", transform=True).schema) ``` Using `replace=True` to drop and then recreate the table would be neat here, and would be consistent with other places that use `replace=True`.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/568/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1816830546,I_kwDODEm0Qs5sSqJS,73,Twitter v1 API shutdown,6341745,david-perez,open,0,,,,,0,2023-07-22T16:57:41Z,2023-07-22T16:57:41Z,,NONE,,"I've been using this project reliably over the past two years to periodically download my liked tweets, but unfortunately since 19th July I get: ``` [2023-07-19 21:00:04.937536] File ""/home/pi/code/liked-tweets/lib/python3.7/site-packages/twitter_to_sqlite/utils.py"", line 202, in fetch_timeline [2023-07-19 21:00:04.937606] raise Exception(str(tweets[""errors""])) [2023-07-19 21:00:04.937678] Exception: [{'message': 'You currently have access to a subset of Twitter API v2 endpoints and limited v1.1 endpoints (e.g. media post, oauth) only. If you need access to this endpoint, you may need a different access level. You can learn more here: https://developer.twitter.com/en/portal/product', 'code': 453}] ``` It appears like Twitter has now shut down their v1 endpoints, which is rather gracious of them, considering they [announced they'd be deprecated on 29th April](https://twittercommunity.com/t/reminder-to-migrate-to-the-new-free-basic-or-enterprise-plans-of-the-twitter-api/189737). Unfortunately [retrieving likes using the v2 API](https://developer.twitter.com/en/docs/twitter-api/tweets/likes/introduction) is not part of their [free plan](https://developer.twitter.com/en/portal/products). In fact, with the free plan one can only post and delete tweets and retrieve information about oneself. So I'm afraid this is the end of this very nice project. It was very useful, thank you! 
",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/73/reactions"", ""total_count"": 1, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 1}",, 1811824307,I_kwDOBm6k_c5r_j6z,2105,When reverse proxying datasette with nginx an URL element gets erronously added,2235371,aki-k,open,0,,,,,3,2023-07-19T12:16:53Z,2023-07-21T21:17:09Z,,NONE,,"I use this nginx config: ``` location /datasette-llm { return 302 /datasette-llm/; } location /datasette-llm/ { proxy_set_header Upgrade $http_upgrade; proxy_set_header Connection ""Upgrade""; proxy_http_version 1.1; proxy_set_header X-Real-IP $remote_addr; proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for; proxy_set_header X-Forwarded-Proto https; proxy_set_header X-Forwarded-Host $http_host; proxy_set_header Host $host; proxy_max_temp_file_size 0; proxy_pass http://127.0.0.1:8001/datasette-llm/; proxy_redirect http:// https://; proxy_buffering off; proxy_request_buffering off; proxy_set_header Origin ''; client_max_body_size 0; auth_basic ""datasette-llm""; auth_basic_user_file /etc/nginx/custom-userdb; } ``` Then I start datasette with this command: ``` datasette serve --setting base_url /datasette-llm/ $(llm logs path) ``` Everything else works right, except the links in ""This data as json, CSV"". They get an extra URL element ""datasette-llm"" like this: https://192.168.1.3:5432/datasette-llm/datasette-llm/logs.json?sql=select+*+from+_llm_migrations https://192.168.1.3:5432/datasette-llm/datasette-llm/logs.csv?sql=select+*+from+_llm_migrations&_size=max When I remove that extra ""datasette-llm"" from the URL, those links work too.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2105/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1808215339,I_kwDOBm6k_c5rxy0r,2104,Tables starting with an underscore should be treated as hidden,9599,simonw,open,0,,,,,2,2023-07-17T17:13:53Z,2023-07-18T22:41:37Z,,OWNER,,"Plugins can then take advantage of this pattern, for example: - https://github.com/simonw/datasette-auth-tokens/pull/8",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2104/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1808116827,I_kwDOBm6k_c5rxaxb,2103,data attribute on Datasette tables exposing the primary key of the row,9599,simonw,open,0,,,,,0,2023-07-17T16:18:25Z,2023-07-17T16:18:25Z,,OWNER,,Maybe put it on the `` but probably better to go on the `td.type-pk`.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2103/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1805076818,I_kwDOBm6k_c5rl0lS,2102,API tokens with view-table but not view-database/view-instance cannot access the table,9599,simonw,closed,0,9599,simonw,,,20,2023-07-14T15:34:27Z,2023-08-29T16:32:36Z,2023-08-29T16:32:35Z,OWNER,,"> Spotted a problem while working on this: if you grant a token access to view table for a specific table but don't also grant view database and view instance permissions, that token is useless. 
> > This was a deliberate design decision in Datasette - it's documented on https://docs.datasette.io/en/1.0a2/authentication.html#access-permissions-in-metadata > >> If a user cannot access a specific database, they will not be able to access tables, views or queries within that database. If a user cannot access the instance they will not be able to access any of the databases, tables, views or queries. > > I'm now second-guessing if this was a good decision. _Originally posted by @simonw in https://github.com/simonw/datasette-auth-tokens/issues/7#issuecomment-1636031702_ ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2102/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1803264272,I_kwDOBm6k_c5re6EQ,2101,alter: true support for JSON write API,9599,simonw,open,0,,,,,1,2023-07-13T15:24:11Z,2023-07-13T15:24:18Z,,OWNER,,"Requested here: https://discord.com/channels/823971286308356157/823971286941302908/1129034187073134642 > The former datasette-insert plugin had an option `?alter=1` to auto-add new columns. Does the JSON write API also have this?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2101/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1801394744,I_kwDOCGYnMM5rXxo4,567,Plugin system,15178711,asg017,closed,0,,,,,9,2023-07-12T17:02:14Z,2023-07-22T22:59:37Z,2023-07-22T22:59:36Z,CONTRIBUTOR,,"I'd like there to be a plugin system for sqlite-utils, similar to the datasette/llm plugins. I'd like to make plugins that would do things like: - Register SQLite extensions for more SQL functions + virtual tables - Register new subcommands - Different input file formats for `sqlite-utils memory` - Different output file formats (in addition to `--csv` `--tsv` `--nl` etc. A few real-world use-cases of plugins I'd like to see in sqlite-utils: - Register many of my sqlite extensions in sqlite-utils (`sqlite-http`, `sqlite-lines`, `sqlite-regex`, etc.) 
- New subcommands to work with `sqlite-vss` vector tables - Input/ouput Parquet/Avro/Arrow IPC files with `sqlite-arrow`",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/567/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1795219865,I_kwDOCGYnMM5rAOGZ,566,`--no-headers` doesn't work on most formats,33625,zellyn,open,0,,,,,2,2023-07-09T03:43:36Z,2023-07-09T04:13:35Z,,NONE,,"Version 3.33 ``` sqlite-utils query library.db 'select asin from audible' --fmt plain --no-headers | head -3 asin 0062804006 0062891421 ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/566/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1795187493,I_kwDODLZ_YM5rAGMl,12,Switch to pyproject.toml,9599,simonw,closed,0,,,,,2,2023-07-09T01:06:56Z,2023-07-09T01:19:43Z,2023-07-09T01:19:42Z,MEMBER,,First of my CLI tools to use https://til.simonwillison.net/python/pyproject,213286752,pocket-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/pocket-to-sqlite/issues/12/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1795051447,I_kwDOBm6k_c5q_k-3,2097,Drop Python 3.7,9599,simonw,closed,0,,,,,0,2023-07-08T18:39:44Z,2023-08-23T18:18:00Z,2023-08-23T18:18:00Z,OWNER,,"> I'm going to drop Python 3.7. _Originally posted by @simonw in https://github.com/simonw/datasette/issues/1153#issuecomment-1627455892_ It's not supported any more: https://devguide.python.org/versions/",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2097/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1794097871,I_kwDOBm6k_c5q78LP,2095,"Introduce ""dark mode"" CSS",3315059,jamietanna,open,0,,,,,0,2023-07-07T19:15:58Z,2023-07-07T19:15:58Z,,NONE,,Using [the CSS media query `prefers-color-scheme`](https://developer.mozilla.org/en-US/docs/Web/CSS/@media/prefers-color-scheme) we can provide a dark-mode version of Datasette,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2095/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1786258502,I_kwDOCGYnMM5qeCRG,565,Table renaming: db.rename_table() and sqlite-utils rename-table,9599,simonw,closed,0,,,,,6,2023-07-03T14:07:42Z,2023-07-22T22:12:40Z,2023-07-22T22:12:40Z,OWNER,,"> I find myself wanting two new features in `sqlite-utils`: > - The ability to have the new transformed table set to a specific name, while keeping the old table around > - The ability to rename a table (`sqlite-utils` doesn't have a table rename function at all right now) _Originally posted by @simonw in https://github.com/simonw/llm/issues/65#issuecomment-1618375042_ ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/565/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1786243905,I_kwDOCGYnMM5qd-tB,564,Document that running `db.transform()` tidies up the schema 
indentation,9599,simonw,closed,0,,,,,0,2023-07-03T13:59:28Z,2023-07-22T22:15:34Z,2023-07-22T22:15:34Z,OWNER,,"> ... and it turns out running `.transform()` with no arguments still fixes the format of the schema! ```pycon >>> db[""log""].add_column(""foo"", str) >>> db[""log""].add_column(""bar"", str)
>>> db[""log""].add_column(""baz"", str)
>>> print(db[""log""].schema) CREATE TABLE ""log"" ( [id] INTEGER PRIMARY KEY, [name2] TEXT, [age] INTEGER, [weight] FLOAT , [foo] TEXT, [bar] TEXT, [baz] TEXT) >>> db[""log""].transform()
>>> print(db[""log""].schema) CREATE TABLE ""log"" ( [id] INTEGER PRIMARY KEY, [name2] TEXT, [age] INTEGER, [weight] FLOAT, [foo] TEXT, [bar] TEXT, [baz] TEXT ) ``` _Originally posted by @simonw in https://github.com/simonw/llm/issues/65#issuecomment-1618347727_ ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/564/reactions"", ""total_count"": 1, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 1}",,completed 1785360409,I_kwDOCGYnMM5qanAZ,563,`--empty-null` option when importing CSV,9599,simonw,closed,0,,,,,1,2023-07-03T05:23:36Z,2023-07-03T05:44:43Z,2023-07-03T05:42:30Z,OWNER,,"CSV files with empty cells in (which come through as the empty string) are common and a bit gross. Having an option that means ""and if it's an empty string store `null` instead) would be cool. I brainstormed name options here https://chat.openai.com/share/c947b738-ee7d-419c-af90-bc84e90987da",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/563/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1784794489,I_kwDOCGYnMM5qYc15,562,Explore the intersection between sqlite-utils and dataclasses,9599,simonw,open,0,,,,,1,2023-07-02T19:23:08Z,2023-07-02T19:26:39Z,,OWNER,,"> Aside: this makes me think it might be cool if `sqlite-utils` had a way of working with dataclasses rather than just dicts, and knew how to create a SQLite table to match a dataclass and maybe how to code-generate dataclasses for a specific table schema (dynamically or even using code-generation that can be written to disk, for better editor integrations). _Originally posted by @simonw in https://github.com/simonw/llm/issues/65#issuecomment-1616742529_ ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/562/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1783304750,I_kwDOBm6k_c5qSxIu,2094,JS Plugin Hooks for the Code Editor,15178711,asg017,open,0,,,,,0,2023-07-01T00:51:57Z,2023-07-01T00:51:57Z,,CONTRIBUTOR,,"When #2052 merges, I'd like to add support to add extensions/functions to the Datasette code editor. I'd eventually like to build a JS plugin for [`sqlite-docs`](https://github.com/asg017/sqlite-docs), to add things like: - Inline documentation for tables/columns on hover - Inline docs for custom functions that are loaded in - More detailed autocomplete for tables/columns/functions I did some hacking to see what this would look like, see here: There can be a new hook that allows JS plugins to add new ""extension"" in the CodeMirror editorview here: https://github.com/simonw/datasette/blob/8cd60fd1d899952f1153460469b3175465f33f80/datasette/static/cm-editor-6.0.1.js#L25 Will need some more planning. For example, the Codemirror bundle in Datasette has functions that we could re-export for plugins to use (so we don't load 2 version of `""@codemirror/autocomplete""`, for example. ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2094/reactions"", ""total_count"": 1, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 1, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1781530343,I_kwDOBm6k_c5qL_7n,2093,"Proposal: Combine settings, metadata, static, etc. 
into a single `datasette.yaml` File",15178711,asg017,open,0,,,,,8,2023-06-29T21:18:23Z,2023-09-11T20:19:32Z,,CONTRIBUTOR,,"Very often I get tripped up when trying to configure my Datasette instances. For example: if I want to change the port my app listen too, do I do that with a CLI flag, a `--setting` flag, inside `metadata.json`, or an env var? If I want to up the time limit of SQL statements, is that under `metadata.json` or a setting? Where does my plugin configuration go? Normally I need to look it up in Datasette docs, and I quickly find my answer, but the number of places where ""config"" goes it overwhelming. - Flat CLI flags like `--port`, `--host`, `--cors`, etc. - `--setting`, like `default_page_size`, `sql_time_limit_ms` etc - Inside `metadata.json`, including plugin configuration Typically my Datasette deploys are extremely long shell commands, with multiple `--setting` and other CLI flags. ## Proposal: Consolidate all ""config"" into `datasette.toml` I propose that we add a new `datasette.toml` that combines ""settings"", ""metadata"", and other common CLI flags like `--port` and `--cors` into a single file. It would be similar to ""Cargo.toml"" in Rust projects, ""package.json"" in Node projects, and ""pyproject.toml"" in Python, etc. A sample of what it could look like: ```toml # ""top level"" configuration that are currently CLI flags on `datasette serve` [config] port = 8020 host = ""0.0.0.0"" cors = true # replaces multiple `--setting` flags [settings] base_url = ""/app/datasette/"" default_allow_sql = true sql_time_limit_ms = 3500 # replaces `metadata.json`. # The contents of datasette-metadata.json could be defined in this file instead, but supporting separate files is nice (since those are easy to machine-generate) [metadata] include=""./datasette-metadata.json"" # plugin-specific [plugins] [plugins.datasette-auth-github] client_id = {env = ""DATASETTE_AUTH_GITHUB_CLIENT_ID""} client_secret = {env = ""GITHUB_CLIENT_SECRET""} [plugins.datasette-cluster-map] latitude_column = ""lat"" longitude_column = ""lon"" ``` ## Pros - Instead of multiple files and CLI flags, everything could be in one tidy file - Editing config in a separate file is easier than editing CLI flags, since you don't have to kill a process + edit a command every time - New users will know ""just edit my `datasette.toml` instead of needing to learn metadata + settings + CLI flags - Better dev experience for multiple environment. For example, could have `datasette -c datasette-dev.toml` for local dev environments (enables SQL, debug plugins, long timeouts, etc.), and a `datasette -c datasette-prod.toml` for ""production"" (lower timeouts, less plugins, monitoring plugins, etc.) ## Cons - Yet another config-management system. Now Datasette users will need to know about metadata, settings, CLI flags, _and_ `datasette.toml`. However with enough documentation + announcements + examples, I think we can get ahead of it. - If toml is chosen, would need to add a toml parser for Python version <3.11 - Multiple sources of config require priority. For example: Would `--setting default_allow_sql off` override the value inside `[settings]`? What about `--port`? ## Other Notes ### Toml I chose toml over json because toml supports comments. I chose toml over yaml because Python 3.11 has builtin support for it. I also find toml easier to work with since it doesn't have the odd ""gotchas"" that YAML has (""ex `3.10` resolving to `3.1`, Norway `NO` resolving to `false`, etc.). 
It also mimics `pyproject.toml` which is nice. Happy to change my mind about this however ### Plugin config will be difficult Plugin config is currently in `metadata.json` in two places: 1. Top level, under `""plugins.[plugin-name]""`. This fits well into `datasette.toml` as `[plugins.plugin-name]` 2. Table level, under `""databases.[db-name].tables.[table-name].plugins.[plugin-name]`. This doesn't fit that well into `datasette.toml`, unless it's nested under `[metadata]`? ### Extensions, static, one-off plugins? We could also include equivalents of `--plugins-dir`, `--static`, and `--load-extension` into `datasette.toml`, but I'd imagine there's a few security concerns there to think through. ### Explicitly list with plugins to use? I believe Datasette by default will load all install plugins on startup, but maybe `datasette.toml` can specify a list of plugins to use? For example, a dev version of `datasette.toml` can specify `datasette-pretty-traces`, but the prod version can leave it out",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2093/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1781047747,I_kwDOBm6k_c5qKKHD,2092,test_homepage intermittent failure,9599,simonw,closed,0,,,,,2,2023-06-29T15:20:37Z,2023-06-29T15:26:28Z,2023-06-29T15:24:13Z,OWNER,,"e.g. in https://github.com/simonw/datasette/actions/runs/5413590227/jobs/9839373852 ``` =================================== FAILURES =================================== ________________________________ test_homepage _________________________________ [gw0] linux -- Python 3.7.17 /opt/hostedtoolcache/Python/3.7.17/x64/bin/python ds_client = @pytest.mark.asyncio async def test_homepage(ds_client): response = await ds_client.get(""/.json"") assert response.status_code == 200 assert ""application/json; charset=utf-8"" == response.headers[""content-type""] data = response.json() assert data.keys() == {""fixtures"": 0}.keys() d = data[""fixtures""] assert d[""name""] == ""fixtures"" assert d[""tables_count""] == 24 assert len(d[""tables_and_views_truncated""]) == 5 assert d[""tables_and_views_more""] is True # 4 hidden FTS tables + no_primary_key (hidden in metadata) assert d[""hidden_tables_count""] == 6 # 201 in no_primary_key, plus 6 in other hidden tables: > assert d[""hidden_table_rows_sum""] == 207, data E AssertionError: {'fixtures': {'color': '9403e5', 'hash': None, 'hidden_table_rows_sum': 0, 'hidden_tables_count': 6, ...}} E assert 0 == 207 ``` My guess is that this is a timing error, where very occasionally the ""count rows but stop counting if it exceeds a time limit"" thing fails.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2092/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1781022369,I_kwDOBm6k_c5qKD6h,2091,Drop support for Python 3.7,9599,simonw,closed,0,,,,,3,2023-06-29T15:06:38Z,2023-08-23T18:18:18Z,2023-08-23T18:18:18Z,OWNER,,"It's EOL now, as of 2023-06-27 (two days ago): https://devguide.python.org/versions/ ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2091/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1781005740,I_kwDOBm6k_c5qJ_2s,2090,Adopt ruff for 
linting,9599,simonw,open,0,,,,,2,2023-06-29T14:56:43Z,2023-06-29T15:05:04Z,,OWNER,,https://beta.ruff.rs/docs/,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2090/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1780973290,I_kwDOBm6k_c5qJ37q,2089,codespell test failure,9599,simonw,closed,0,,,,,5,2023-06-29T14:40:10Z,2023-06-29T14:48:11Z,2023-06-29T14:48:10Z,OWNER,,"https://github.com/simonw/datasette/actions/runs/5413443676/jobs/9838999356 ``` codespell docs/*.rst --ignore-words docs/codespell-ignore-words.txt codespell datasette -S datasette/static --ignore-words docs/codespell-ignore-words.txt shell: /usr/bin/bash -e {0} env: pythonLocation: /opt/hostedtoolcache/Python/3.9.17/x64 LD_LIBRARY_PATH: /opt/hostedtoolcache/Python/3.9.17/x64/lib docs/metadata.rst:192: displaing ==> displaying ``` This failure is legit, it found a spelling mistake: https://github.com/simonw/datasette/blob/ede62036180993dbd9d4e5d280fc21c183cda1c3/docs/metadata.rst#L192",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2089/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1777548699,I_kwDOCGYnMM5p8z2b,561,`--stop-after` option for `insert` and `upsert` commands,9599,simonw,closed,0,,,,,1,2023-06-27T18:44:15Z,2023-06-27T18:50:09Z,2023-06-27T18:50:08Z,OWNER,,I found myself wanting to insert rows from a 849MB CSV file without processing the whole thing: https://huggingface.co/datasets/jerpint-org/HackAPrompt-Playground-Submissions/tree/main,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/561/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1773450152,I_kwDOCGYnMM5ptLOo,559,sqlean support,9599,simonw,closed,0,,,,,0,2023-06-25T19:27:26Z,2023-06-25T23:25:53Z,2023-06-25T23:25:53Z,OWNER,,"If sqlean is available, use that. Refs: - https://github.com/nalgeon/sqlean.py/issues/1#issuecomment-1605707788 This will provide a good workaround for: - #235 ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/559/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1765870617,I_kwDOBm6k_c5pQQwZ,2087,`--settings settings.json` option,9599,simonw,open,0,,,,,2,2023-06-20T17:48:45Z,2023-07-14T17:02:03Z,,OWNER,,"https://discord.com/channels/823971286308356157/823971286941302908/1120705940728066080 > May I add a request to the whole metadata / settings ? 
Allow to pass `--settings path/to/settings.json` instead of having to rely exclusively on directory mode to centralize settings (this would reflect the behavior of providing metadata)",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2087/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1764792125,I_kwDOBm6k_c5pMJc9,2086,Show information on startup in directory configuration mode,9599,simonw,open,0,,,,,0,2023-06-20T07:13:33Z,2023-06-20T07:13:33Z,,OWNER,,"https://discord.com/channels/823971286308356157/823971286941302908/1120516587036889098 > One thing that would be helpful would be message at launch indicating a metadata.json is getting picked up. I'm using directory mode and was editing the wrong file for awhile before I realize nothing I was doing was having any effect.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2086/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1762180409,I_kwDOBm6k_c5pCL05,2085,Interactive row selection in Datasette ,24938923,learning4life,open,0,,,,,0,2023-06-18T08:29:45Z,2023-06-18T08:31:23Z,,NONE,,"Simon did a excellent [prototype](https://til.simonwillison.net/datasette/row-selection-prototype) of an interactive row selection in Datasette. I hope this [functionality](https://camo.githubusercontent.com/3d4a0f31fb6a27fd279f809af5b53dc3b76faa63c7721e228951c5252b645a77/68747470733a2f2f7374617469632e73696d6f6e77696c6c69736f6e2e6e65742f7374617469632f323032332f6461746173657474652d7069636b65722e676966) can be turned into a Datasette plugin. ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2085/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1761613778,I_kwDOBm6k_c5pABfS,2084,Support facets for columns that contain timestamps,19492893,devxpy,open,0,,,,,0,2023-06-17T03:33:54Z,2023-06-17T03:33:54Z,,NONE,," Django has this very nice filter for datetime fields - It would be nice to have something similar to facet by a field that contains a timestamp in datasette too - Which doesn't seem to do anything with timestamps right now... ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2084/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1754174496,I_kwDOCGYnMM5ojpQg,558,Ability to define unique columns when creating a table,1910303,aguinane,open,0,,,,,0,2023-06-13T06:56:19Z,2023-08-18T01:06:03Z,,NONE,,"When creating a new table, it would be good to have an option to set unique columns similar to how not_null is set. ```python from sqlite_utils import Database columns = {""mRID"": str, ""name"": str} db = Database(""example.db"") db[""ExampleTable""].create(columns, pk=""mRID"", not_null=[""mRID""], if_not_exists=True) db[""ExampleTable""].create_index([""mRID""], unique=True, if_not_exists=True) ``` So something like this would add the UNIQUE flag to the table definition. 
```python db[""ExampleTable""].create(columns, pk=""mRID"", not_null=[""mRID""], unique=[""mRID""], if_not_exists=True) ``` ```sql CREATE TABLE ExampleTable ( mRID TEXT PRIMARY KEY NOT NULL UNIQUE, name TEXT ); ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/558/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1751214236,I_kwDOC8SPRc5oYWic,36,Getting sqlite_master may not be modified when creating dogsheep index,8711912,khushmeeet,open,0,,,,,0,2023-06-11T03:21:53Z,2023-06-11T03:21:53Z,,NONE,,"When creating a `dogsheep` index from `config.yml` file on pocket.db (created using pocket-to-sqlite), I am getting this error ``` Traceback (most recent call last): File ""/Users/khushmeeet/.pyenv/versions/3.11.2/bin/dogsheep-beta"", line 8, in sys.exit(cli()) ^^^^^ File ""/Users/khushmeeet/.pyenv/versions/3.11.2/lib/python3.11/site-packages/click/core.py"", line 1130, in __call__ return self.main(*args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^ File ""/Users/khushmeeet/.pyenv/versions/3.11.2/lib/python3.11/site-packages/click/core.py"", line 1055, in main rv = self.invoke(ctx) ^^^^^^^^^^^^^^^^ File ""/Users/khushmeeet/.pyenv/versions/3.11.2/lib/python3.11/site-packages/click/core.py"", line 1657, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File ""/Users/khushmeeet/.pyenv/versions/3.11.2/lib/python3.11/site-packages/click/core.py"", line 1404, in invoke return ctx.invoke(self.callback, **ctx.params) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File ""/Users/khushmeeet/.pyenv/versions/3.11.2/lib/python3.11/site-packages/click/core.py"", line 760, in invoke return __callback(*args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^ File ""/Users/khushmeeet/.pyenv/versions/3.11.2/lib/python3.11/site-packages/dogsheep_beta/cli.py"", line 36, in index run_indexer( File ""/Users/khushmeeet/.pyenv/versions/3.11.2/lib/python3.11/site-packages/dogsheep_beta/utils.py"", line 32, in run_indexer ensure_table_and_indexes(db, tokenize) File ""/Users/khushmeeet/.pyenv/versions/3.11.2/lib/python3.11/site-packages/dogsheep_beta/utils.py"", line 91, in ensure_table_and_indexes table.add_foreign_key(*fk) File ""/Users/khushmeeet/.pyenv/versions/3.11.2/lib/python3.11/site-packages/sqlite_utils/db.py"", line 2155, in add_foreign_key self.db.add_foreign_keys([(self.name, column, other_table, other_column)]) File ""/Users/khushmeeet/.pyenv/versions/3.11.2/lib/python3.11/site-packages/sqlite_utils/db.py"", line 1116, in add_foreign_keys cursor.execute( sqlite3.OperationalError: table sqlite_master may not be modified ``` Command I ran to get this error ``` dogsheep-beta index pocket.db config.yml ``` Dogsheep version ``` dogsheep-beta, version 0.10.2 ``` Python version ``` Python 3.11.2 ```",197431109,dogsheep-beta,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-beta/issues/36/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1740150327,I_kwDOCGYnMM5nuJY3,557,Aliased ROWID option for tables created from alter=True commands,7908073,chapmanjacobd,closed,0,,,,,2,2023-06-04T05:29:28Z,2023-06-14T06:09:21Z,2023-06-05T19:26:26Z,CONTRIBUTOR,,"> If you use INTEGER PRIMARY KEY column, the VACUUM does not change the values of that column. However, if you use unaliased rowid, the VACUUM command will reset the rowid values. 
ROWID should never be used with foreign keys but the simple act of aliasing rowid to id (which is what happens when one does `id integer primary key` DDL) makes it OK. It would be convenient if there were more options to use a string column (eg. filepath) as the PK, and be able to use it during upserts, but when creating a foreign key, to create an integer column which aliases rowid I made an attempt to switch to integer primary keys here but it is not going well... In my usecase the path column is a business key. Yes, it should be as simple as including the `id` column in any select statement where I plan on using `upsert` but it would be nice if this could be abstracted away somehow https://github.com/chapmanjacobd/library/commit/788cd125be01d76f0fe2153335d9f6b21db1343c https://github.com/chapmanjacobd/library/actions/runs/5173602136/jobs/9319024777",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/557/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1740026046,I_kwDOCGYnMM5ntrC-,556,Support storing incrementally piped values,601708,mcint,open,0,,,,,1,2023-06-04T00:45:23Z,2023-06-04T01:21:15Z,,CONTRIBUTOR,,"I'm trying to use sqlite-utils to data generated incrementally. There are a few aspects of this that I don't currently know how to handle. I would like an option to apply writes incrementally, line-by-line as they are received. I would like an option to echo incremental progress. And, it would be nice to have In particular, I'm using CoreLocationCLI -w -j to generate, newline-delimited JSON. One variant of the command `stdbuf -oL CoreLocationCLI -w -j | pee 'sqlite-utils insert loc.db loc -' nl` `pee`, from `moreutils`, is like `tee` but spawns and pipes to the processes created by invoking each of its arguments, so, for gratuitous demonstration, `pee 'sponge out.log' cat` would behave like `tee`. It looks like I can get what I want with: `stdbuf -oL CoreLocationCLI -w -j | while read line; do <<<""$line"" sqlite-utils insert loc.db loc -; echo ""$line""; done | nl` ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/556/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1733198948,I_kwDOCGYnMM5nToRk,555,Filter table by a large bunch of ids,10843208,redraw,open,0,,,,,1,2023-05-31T00:29:51Z,2023-06-14T22:01:57Z,,NONE,,"Hi! this might be a question related to both SQLite & sqlite-utils, and you might be more experienced with them. I have a large bunch of ids, and I'm wondering which is the best way to query them in terms of performance, and simplicity if possible. The naive approach would be something like `select * from table where rowid in (?, ?, ?...)` but that wouldn't scale if ids are >1k. Another approach might be creating a temp table, or in-memory db table, insert all ids in that table and then join with the target one. 
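A rough sketch of that temp-table-and-join idea with plain `sqlite3` (it reuses the `videos` / `video_id` names from the snippet below; `tmp_ids` is an illustrative name):

```python
import sqlite3

def rows_for_ids(conn, ids):
    # TEMP tables are private to this connection and are dropped
    # automatically when the connection closes.
    conn.execute('CREATE TEMP TABLE tmp_ids (id TEXT PRIMARY KEY)')
    conn.executemany('INSERT INTO tmp_ids (id) VALUES (?)', ((i,) for i in ids))
    rows = conn.execute(
        'SELECT videos.* FROM videos JOIN tmp_ids ON videos.video_id = tmp_ids.id'
    ).fetchall()
    conn.execute('DROP TABLE tmp_ids')
    return rows
```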
I failed to attach an in-memory db both using sqlite-utils, and plain sql's execute(), so my closest approach is something like, ```python def filter_existing_video_ids(video_ids): db = get_db() # contains a ""videos"" table db.execute(""CREATE TEMPORARY TABLE IF NOT EXISTS tmp (video_id TEXT NOT NULL PRIMARY KEY)"") db[""tmp""].insert_all([{""video_id"": video_id} for video_id in video_ids]) for row in db[""tmp""].rows_where(""video_id not in (select video_id from videos)""): yield row[""video_id""] db[""tmp""].drop() ``` That kinda worked, I couldn't find an option in sqlite-utils's `create_table()` to tell it's a temporary table. Also, `tmp` table is not dropped finally, neither using `.drop()` despite being created with the keyword `TEMPORARY`. I believe it should be automatically dropped after connection/session ends though I read.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/555/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1727478903,I_kwDOBm6k_c5m9zx3,2081,Update Endpoints defined in metadata throws 403 Forbidden after a while,15085007,cutmasta-kun,open,0,,,,,0,2023-05-26T11:52:30Z,2023-05-26T11:52:30Z,,NONE,,"Hello. I expose an endpoint to update `tasks`: ``` { ""title"": ""My Datasette Instance"", ""databases"": { ""tasks"": { ""queries"": { ""update_task"": { ""sql"": ""UPDATE tasks SET status = :status, result = :result, systemMessage = :systemMessage WHERE queueID = :queueID"", ""write"": true, ""on_success_message"": ""Task updated"", ""on_success_redirect"": ""/tasks/tasks.json"", ""on_error_message"": ""Task update failed"", ""on_error_redirect"": ""/tasks.json"", ""params"": [""queueID"", ""taskData"", ""status"", ""result"", ""systemMessage""] } } } } } ``` This works really well! But after a while, the Datasette Instanz answers with **403 Forbidden**. I have to delete the database and recreate it in order to work again. Any help here? (´。_。`)",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2081/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1726531350,I_kwDOBm6k_c5m6McW,2079,Datasette should serve Access-Control-Max-Age,9599,simonw,closed,0,,,,,8,2023-05-25T21:50:50Z,2023-05-25T22:56:28Z,2023-05-25T22:08:35Z,OWNER,,"Currently the CORS headers served are: https://github.com/simonw/datasette/blob/9584879534ff0556e04e4c420262972884cac87b/datasette/utils/__init__.py#L1139-L1143 Serving `Access-Control-Max-Age: 600` would allow browsers to cache that for 10 minutes, avoiding additional CORS pre-flight OPTIONS requests during that time.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2079/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1726236847,I_kwDOBm6k_c5m5Eiv,2078,Resolve the difference between `wrap_view()` and `BaseView`,9599,simonw,closed,0,,,,,16,2023-05-25T17:44:32Z,2023-05-26T00:18:46Z,2023-05-26T00:18:46Z,OWNER,,"There are two patterns for implementing views in Datasette at the moment. I want to combine those. 
Part of: - #2053",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2078/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1720096994,I_kwDOCGYnMM5mhpji,554,"`IndexError` when doing `.insert(..., pk='id')` after `insert_all`",1231935,xavdid,open,0,,,,,1,2023-05-22T17:13:02Z,2023-05-22T17:18:33Z,,NONE,,"I believe this is related to https://github.com/simonw/sqlite-utils/issues/98. When `pk` is specified by table A's `insert` call, it throws an index error if a different table has written a row with a higher rowid than exists in the first table. Here's a basic example: ```py from sqlite_utils import Database def test_pk_for_insert(fresh_db): user = {""id"": ""abc"", ""name"": ""david""} fresh_db[""users""].insert(user, pk=""id"") fresh_db[""comments""].insert_all( [ {""id"": ""def"", ""text"": ""ok""}, {""id"": ""ghi"", ""text"": ""great""}, ], ) fresh_db[""users""].insert( user, ignore=True, # BUG: when specifying pk on the second insert call # db.py goes into a block it doesn't expect and we get the error pk=""id"", ) if __name__ == ""__main__"": db = Database(""bug.db"") if db[""users""].exists(): raise ValueError( ""bug only shows on a new database - remove bug.db before running the script"" ) test_pk_for_insert(db) ``` The error is: ```py File ""/Users/david/projects/reddit-to-sqlite/.venv/lib/python3.11/site-packages/sqlite_utils/db.py"", line 2960, in insert_chunk row = list(self.rows_where(""rowid = ?"", [self.last_rowid]))[0] ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~^^^ IndexError: list index out of range ``` The issue is in this block: https://github.com/simonw/sqlite-utils/blob/2747257a3334d55e890b40ec58fada57ae8cfbfd/sqlite_utils/db.py#L2954-L2958 relevant locals are: - `pk`: `'id'` - `result.lastrowid`: `2` What's most interesting is the comment `# self.last_rowid will be 0 if a ""INSERT OR IGNORE"" happened`, which doesn't seem to be the case here. ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/554/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1718612569,I_kwDOCGYnMM5mb_JZ,552,Document how to setup shell auto-completion,9599,simonw,closed,0,,,,,1,2023-05-21T19:20:41Z,2023-05-21T21:05:16Z,2023-05-21T21:03:40Z,OWNER,,"https://click.palletsprojects.com/en/8.1.x/shell-completion/ This works for `zsh`: eval ""$(_SQLITE_UTILS_COMPLETE=zsh_source sqlite-utils)"" This will probably work for `bash`: eval ""$(_SQLITE_UTILS_COMPLETE=bash_source sqlite-utils)"" Need to add this to the installation docs here: https://sqlite-utils.datasette.io/en/stable/installation.html - along with the pattern for adding that to `.zshrc` or whatever.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/552/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1718607907,I_kwDOCGYnMM5mb-Aj,551,Make as many examples in the CLI docs as possible copy-and-pastable,9599,simonw,closed,0,,,,,6,2023-05-21T19:04:10Z,2023-05-21T21:04:04Z,2023-05-21T20:57:24Z,OWNER,,"e.g. 
in this section: https://sqlite-utils.datasette.io/en/stable/cli.html#running-queries-directly-against-csv-or-json The little copy button will also copy the `$ ` which breaks the examples when copied.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/551/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1718595700,I_kwDOCGYnMM5mb7B0,550,AttributeError: 'EntryPoints' object has no attribute 'get' for flake8 on Python 3.7,9599,simonw,closed,0,,,,,3,2023-05-21T18:24:39Z,2023-05-21T18:42:25Z,2023-05-21T18:41:58Z,OWNER,,"https://github.com/simonw/sqlite-utils/actions/runs/5039064797/jobs/9036965488 ``` Traceback (most recent call last): File ""/opt/hostedtoolcache/Python/3.7.16/x64/bin/flake8"", line 8, in sys.exit(main()) File ""/opt/hostedtoolcache/Python/3.7.16/x64/lib/python3.7/site-packages/flake8/main/cli.py"", line 22, in main app.run(argv) File ""/opt/hostedtoolcache/Python/3.7.16/x64/lib/python3.7/site-packages/flake8/main/application.py"", line 363, in run self._run(argv) File ""/opt/hostedtoolcache/Python/3.7.16/x64/lib/python3.7/site-packages/flake8/main/application.py"", line 350, in _run self.initialize(argv) File ""/opt/hostedtoolcache/Python/3.7.16/x64/lib/python3.7/site-packages/flake8/main/application.py"", line 330, in initialize self.find_plugins(config_finder) File ""/opt/hostedtoolcache/Python/3.7.16/x64/lib/python3.7/site-packages/flake8/main/application.py"", line 153, in find_plugins self.check_plugins = plugin_manager.Checkers(local_plugins.extension) File ""/opt/hostedtoolcache/Python/3.7.16/x64/lib/python3.7/site-packages/flake8/plugins/manager.py"", line 357, in __init__ self.namespace, local_plugins=local_plugins File ""/opt/hostedtoolcache/Python/3.7.16/x64/lib/python3.7/site-packages/flake8/plugins/manager.py"", line 238, in __init__ self._load_entrypoint_plugins() File ""/opt/hostedtoolcache/Python/3.7.16/x64/lib/python3.7/site-packages/flake8/plugins/manager.py"", line 254, in _load_entrypoint_plugins eps = importlib_metadata.entry_points().get(self.namespace, ()) AttributeError: 'EntryPoints' object has no attribute 'get' Error: Process completed with exit code 1. 
```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/550/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1718576761,I_kwDOCGYnMM5mb2Z5,548,analyze-tables should validate provide --column names,9599,simonw,closed,0,,,,,1,2023-05-21T17:20:24Z,2023-05-21T17:35:52Z,2023-05-21T17:35:52Z,OWNER,,"Noticed this while testing: - #547 If you pass a non-existent column to `-c/--column` you don't get an error message.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/548/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1718572201,I_kwDOCGYnMM5mb1Sp,547,No need to show common values if everything is null,9599,simonw,closed,0,,,,,1,2023-05-21T17:05:07Z,2023-05-21T17:19:21Z,2023-05-21T17:19:21Z,OWNER,,"Noticed this: ``` % sqlite-utils analyze-tables content.db repos -c delete_branch_on_merge --common-limit 20 --no-least repos.delete_branch_on_merge: (1/1) Total rows: 158 Null rows: 158 Blank rows: 0 Distinct values: 0 Most common: 158: None ``` The `158: None` there is duplicate information considering we already know there are 158/158 null rows.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/547/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1718517882,I_kwDOCGYnMM5mboB6,545,Try out Trogon for a tui interface,9599,simonw,closed,0,,,,,6,2023-05-21T14:08:25Z,2023-05-21T19:33:13Z,2023-05-21T18:41:58Z,OWNER,,https://github.com/Textualize/trogon,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/545/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1718515590,I_kwDOCGYnMM5mbneG,544,New options for analyze-tables --common-limit --no-most and --no-least,9599,simonw,closed,0,,,,,2,2023-05-21T14:03:19Z,2023-05-21T17:03:06Z,2023-05-21T16:19:31Z,OWNER,,"The ""least common"" section is frequently uninteresting, especially for huge tables with a large number of repeated-once values. 
sqlite-utils analyze-tables content.db repos --common-limit 20 --no-least",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/544/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1708030220,I_kwDOBm6k_c5lznkM,2073,Faceting doesn't work against integer columns in views,9599,simonw,open,0,,,,,2,2023-05-12T18:20:10Z,2023-05-12T18:24:07Z,,OWNER,,"Spotted this issue here: https://til.simonwillison.net/datasette/baseline I had to do this workaround: ```sql create view baseline as select _key, spec, '' || json_extract(status, '$.is_baseline') as is_baseline, json_extract(status, '$.since') as baseline_since, json_extract(status, '$.support.chrome') as baseline_chrome, json_extract(status, '$.support.edge') as baseline_edge, json_extract(status, '$.support.firefox') as baseline_firefox, json_extract(status, '$.support.safari') as baseline_safari, compat_features, caniuse, usage_stats, status from [index] ``` I think the core issue here is that, against a table, `select * from x where integer_column = '1'` works correctly, due to some kind of column type conversion mechanism... but this mechanism doesn't work against views.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2073/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1702354223,I_kwDOBm6k_c5ld90v,2070,Mechanism for deploying a preview of a branch using Vercel,9599,simonw,closed,0,,,,,2,2023-05-09T16:21:45Z,2023-05-09T16:25:00Z,2023-05-09T16:24:31Z,OWNER,,"I prototyped that here: https://github.com/simonw/one-off-actions/blob/main/.github/workflows/deploy-datasette-branch-preview.yml It deployed the `json-extras-query` branch here: https://datasette-preview-json-extras-query.vercel.app/",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2070/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1701018909,I_kwDOCGYnMM5lY30d,543,Tests broken on Windows due to new convert() lambda names,9599,simonw,closed,0,,,,,0,2023-05-08T22:11:29Z,2023-05-08T22:19:04Z,2023-05-08T22:19:04Z,OWNER,,"https://github.com/simonw/sqlite-utils/actions/runs/4920084038/jobs/8788501314 ```python sql = 'update [example] set [dt] = lambda_-9223371942137158589([dt]);' ``` From: - #526",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/543/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1700936245,I_kwDOCGYnMM5lYjo1,542,Remove `skip_false=True` and `--no-skip-false` in `sqlite-utils` 4.0,9599,simonw,open,0,,,9374594,4.0 backwards incomatible changes,1,2023-05-08T21:04:28Z,2023-05-08T21:07:41Z,,OWNER,,"Following: - #527 The only reason I didn't remove fix this mis-feature entirely is that it represents a backwards incompatible change. 
I'll make that change in 4.0.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/542/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1700840265,I_kwDOCGYnMM5lYMNJ,541,Get tests to pass with `pytest -Werror`,9599,simonw,open,0,,,,,1,2023-05-08T19:57:23Z,2023-05-08T19:59:35Z,,OWNER,,"Inspired by: - #534",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/541/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1699184583,I_kwDOCGYnMM5lR3_H,540,sphinx.builders.linkcheck build error,9599,simonw,closed,0,,,,,4,2023-05-07T18:37:09Z,2023-05-08T04:56:13Z,2023-05-07T18:42:36Z,OWNER,,"https://readthedocs.org/projects/sqlite-utils/builds/20512693/ ``` Running Sphinx v6.2.1 Traceback (most recent call last): File ""/home/docs/checkouts/readthedocs.org/user_builds/sqlite-utils/envs/latest/lib/python3.8/site-packages/sphinx/registry.py"", line 442, in load_extension mod = import_module(extname) File ""/home/docs/checkouts/readthedocs.org/user_builds/sqlite-utils/envs/latest/lib/python3.8/importlib/__init__.py"", line 127, in import_module return _bootstrap._gcd_import(name[level:], package, level) File """", line 1014, in _gcd_import File """", line 991, in _find_and_load File """", line 975, in _find_and_load_unlocked File """", line 671, in _load_unlocked File """", line 783, in exec_module File """", line 219, in _call_with_frames_removed File ""/home/docs/checkouts/readthedocs.org/user_builds/sqlite-utils/envs/latest/lib/python3.8/site-packages/sphinx/builders/linkcheck.py"", line 20, in from requests import Response File ""/home/docs/checkouts/readthedocs.org/user_builds/sqlite-utils/envs/latest/lib/python3.8/site-packages/requests/__init__.py"", line 43, in import urllib3 File ""/home/docs/checkouts/readthedocs.org/user_builds/sqlite-utils/envs/latest/lib/python3.8/site-packages/urllib3/__init__.py"", line 38, in raise ImportError( ImportError: urllib3 v2.0 only supports OpenSSL 1.1.1+, currently the 'ssl' module is compiled with OpenSSL 1.0.2n 7 Dec 2017. See: https://github.com/urllib3/urllib3/issues/2168 The above exception was the direct cause of the following exception: Traceback (most recent call last): File ""/home/docs/checkouts/readthedocs.org/user_builds/sqlite-utils/envs/latest/lib/python3.8/site-packages/sphinx/cmd/build.py"", line 280, in build_main app = Sphinx(args.sourcedir, args.confdir, args.outputdir, File ""/home/docs/checkouts/readthedocs.org/user_builds/sqlite-utils/envs/latest/lib/python3.8/site-packages/sphinx/application.py"", line 225, in __init__ self.setup_extension(extension) File ""/home/docs/checkouts/readthedocs.org/user_builds/sqlite-utils/envs/latest/lib/python3.8/site-packages/sphinx/application.py"", line 404, in setup_extension self.registry.load_extension(self, extname) File ""/home/docs/checkouts/readthedocs.org/user_builds/sqlite-utils/envs/latest/lib/python3.8/site-packages/sphinx/registry.py"", line 445, in load_extension raise ExtensionError(__('Could not import extension %s') % extname, sphinx.errors.ExtensionError: Could not import extension sphinx.builders.linkcheck (exception: urllib3 v2.0 only supports OpenSSL 1.1.1+, currently the 'ssl' module is compiled with OpenSSL 1.0.2n 7 Dec 2017. 
See: https://github.com/urllib3/urllib3/issues/2168) Extension error: Could not import extension sphinx.builders.linkcheck (exception: urllib3 v2.0 only supports OpenSSL 1.1.1+, currently the 'ssl' module is compiled with OpenSSL 1.0.2n 7 Dec 2017. See: https://github.com/urllib3/urllib3/issues/2168) ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/540/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1699174055,I_kwDOCGYnMM5lR1an,539,"`--raw-lines` option, like `--raw` for multiple lines",9599,simonw,closed,0,,,,,4,2023-05-07T18:07:46Z,2023-05-07T18:43:24Z,2023-05-07T18:26:18Z,OWNER,,I wanted to output newline-separated output of the first column of every row in the results - like `--row` but for more than one line.,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/539/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1698865182,I_kwDOBm6k_c5lQqAe,2069,[BUG] Cannot insert new data to deployed instance,31861128,yqlbu,open,0,,,,,1,2023-05-07T02:59:42Z,2023-05-07T03:17:35Z,,NONE,,"## Summary Recently, I deployed an instance of datasette to Vercel with the following plugins: - datasette-auth-tokens - datasette-insert With the above plugins, I was able to insert new data to local sqlite db. However, when it comes to the deployment on Vercel, things behave differently. I observed some errors from the logs console on Vercel: ```console File ""/var/task/datasette/database.py"", line 179, in _execute_writes conn = self.connect(write=True) File ""/var/task/datasette/database.py"", line 93, in connect assert not (write and not self.is_mutable) AssertionError ``` I think it is a potential bug. ## Reproduce
### metadata.json
```json { ""plugins"": { ""datasette-insert"": { ""allow"": { ""id"": ""*"" } }, ""datasette-auth-tokens"": { ""tokens"": [ { ""token"": { ""$env"": ""INSERT_TOKEN"" }, ""actor"": { ""id"": ""repeater"" } } ], ""param"": ""_auth_token"" } } } ```
### commands
```bash # deploy datasette publish vercel remote.db \ --project=repeater-bot-sqlite \ --metadata metadata.json \ --install datasette-auth-tokens \ --install datasette-insert \ --vercel-json=vercel.json # test insert cat fixtures/dogs.json | curl --request POST -d @- -H ""Authorization: Bearer "" \ 'https://repeater-bot-sqlite.vercel.app/-/insert/remote/dogs?pk=id' ```
### logs
```console Traceback (most recent call last): File ""/var/task/datasette/app.py"", line 1354, in route_path response = await view(request, send) File ""/var/task/datasette/app.py"", line 1500, in async_view_fn response = await async_call_with_supported_arguments( File ""/var/task/datasette/utils/__init__.py"", line 1005, in async_call_with_supported_arguments return await fn(*call_with) File ""/var/task/datasette_insert/__init__.py"", line 14, in insert_or_upsert response = await insert_or_upsert_implementation(request, datasette) File ""/var/task/datasette_insert/__init__.py"", line 91, in insert_or_upsert_implementation table_count = await db.execute_write_fn(write_in_thread, block=True) File ""/var/task/datasette/database.py"", line 167, in execute_write_fn raise result File ""/var/task/datasette/database.py"", line 179, in _execute_writes conn = self.connect(write=True) File ""/var/task/datasette/database.py"", line 93, in connect assert not (write and not self.is_mutable) AssertionError ```
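Reading the traceback above, the failure is `Database.connect(write=True)` hitting `assert not (write and not self.is_mutable)`, i.e. the bundled database file was opened as immutable, which is what read-only serverless filesystems typically force. A minimal sketch that reproduces the same assertion locally (assuming Datasette's `datasette.database.Database` constructor accepts `is_mutable`; the path is illustrative):

```python
from datasette.app import Datasette
from datasette.database import Database

ds = Datasette()
# Open the file the way an immutable deployment does:
db = Database(ds, path='remote.db', is_mutable=False)

# datasette-insert eventually asks for a write connection, which trips
# the same assertion seen in the Vercel logs:
db.connect(write=True)  # AssertionError
```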
",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2069/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1695428235,I_kwDOCGYnMM5lDi6L,538,`table.upsert_all` fails to write rows when `not_null` is present,1231935,xavdid,closed,0,,,,,9,2023-05-04T07:30:38Z,2023-05-08T20:06:35Z,2023-05-08T19:27:02Z,NONE,,"I found an odd bug today, where calls to `table.upsert_all` don't write rows if you include the `not_null` kwarg. ## Repro Example ```py from sqlite_utils import Database db = Database(""upsert-test.db"") db[""comments""].upsert_all( [{""id"": 1, ""name"": ""david""}], pk=""id"", not_null=[""name""], ) assert list(db[""comments""].rows) # err! ``` The schema is correctly created: ```sql CREATE TABLE [comments] ( [id] INTEGER PRIMARY KEY, [name] TEXT NOT NULL ) ``` But no rows are created. Removing either the `not_null` kwargs works as expected, as does an `insert_all` call. ## Version Info - Python: `3.11.0` - sqlite-utils: `3.30` - sqlite: `3.39.5 2022-10-14`",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/538/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1690765434,I_kwDOBm6k_c5kxwh6,2067,Litestream-restored db: errors on 3.11 and 3.10.8; but works on py3.10.7 and 3.10.6,39538958,justmars,open,0,,,,,1,2023-05-01T12:42:28Z,2023-05-03T00:16:03Z,,NONE,,"Hi! Wondering if this issue is limited to my local system or if it affects others as well. It seems like 3.11 errors out on a ""litestream-restored"" database. On further investigation, it also appears to conk out on 3.10.8 but works on 3.10.7 and 3.10.6. To demo issue I created a test database, replicated it to an aws s3 bucket, then restored the same under various .pyenv-versioned shells where I test whether I can read the database via the sqlite3 cli. ```sh # create new shell with 3.11.3 litestream restore -o data/db.sqlite s3://mytestbucketxx/db sqlite3 data/db.sqlite # SQLite version 3.41.2 2023-03-22 11:56:21 # Enter "".help"" for usage hints. # sqlite> .tables # _litestream_lock _litestream_seq movie # sqlite> ``` However this get me an `OperationalError` when reading via datasette:
Error on 3.11.3 and 3.10.8 ```sh datasette data/db.sqlite ``` ```console /tester/.venv/lib/python3.11/site-packages/pkg_resources/__init__.py:121: DeprecationWarning: pkg_resources is deprecated as an API warnings.warn(""pkg_resources is deprecated as an API"", DeprecationWarning) Traceback (most recent call last): File ""/tester/.venv/bin/datasette"", line 8, in sys.exit(cli()) ^^^^^ File ""/tester/.venv/lib/python3.11/site-packages/click/core.py"", line 1130, in __call__ return self.main(*args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^ File ""/tester/.venv/lib/python3.11/site-packages/click/core.py"", line 1055, in main rv = self.invoke(ctx) ^^^^^^^^^^^^^^^^ File ""/tester/.venv/lib/python3.11/site-packages/click/core.py"", line 1657, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File ""/tester/.venv/lib/python3.11/site-packages/click/core.py"", line 1404, in invoke return ctx.invoke(self.callback, **ctx.params) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File ""/tester/.venv/lib/python3.11/site-packages/click/core.py"", line 760, in invoke return __callback(*args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^ File ""/tester/.venv/lib/python3.11/site-packages/datasette/cli.py"", line 143, in wrapped return fn(*args, **kwargs) ^^^^^^^^^^^^^^^^^^^ File ""/tester/.venv/lib/python3.11/site-packages/datasette/cli.py"", line 615, in serve asyncio.get_event_loop().run_until_complete(check_databases(ds)) File ""/Users/mv/.pyenv/versions/3.11.3/lib/python3.11/asyncio/base_events.py"", line 653, in run_until_complete return future.result() ^^^^^^^^^^^^^^^ File ""/tester/.venv/lib/python3.11/site-packages/datasette/cli.py"", line 660, in check_databases await database.execute_fn(check_connection) File ""/tester/.venv/lib/python3.11/site-packages/datasette/database.py"", line 213, in execute_fn return await asyncio.get_event_loop().run_in_executor( ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File ""/Users/mv/.pyenv/versions/3.11.3/lib/python3.11/concurrent/futures/thread.py"", line 58, in run result = self.fn(*self.args, **self.kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File ""/tester/.venv/lib/python3.11/site-packages/datasette/database.py"", line 211, in in_thread return fn(conn) ^^^^^^^^ File ""/tester/.venv/lib/python3.11/site-packages/datasette/utils/__init__.py"", line 951, in check_connection for r in conn.execute( ^^^^^^^^^^^^^ sqlite3.OperationalError: unable to open database file ```
Works on 3.10.7, 3.10.6 ```sh # create new shell with 3.10.7 / 3.10.6 litestream restore -o data/db.sqlite s3://mytestbucketxx/db datasette data/db.sqlite # ... # INFO: Uvicorn running on http://127.0.0.1:8001 (Press CTRL+C to quit) ```
In both scenarios, the only dependencies were the pinned python version and the latest Datasette version 0.64.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2067/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1686042269,I_kwDOBm6k_c5kfvad,2066,Failing test: httpx.InvalidURL: URL too long,9599,simonw,closed,0,,,,,10,2023-04-27T03:48:47Z,2023-04-27T04:27:50Z,2023-04-27T04:27:50Z,OWNER,,"https://github.com/simonw/datasette/actions/runs/4815723640/jobs/8574667731 ``` def urlparse(url: str = """", **kwargs: typing.Optional[str]) -> ParseResult: # Initial basic checks on allowable URLs. # --------------------------------------- # Hard limit the maximum allowable URL length. if len(url) > MAX_URL_LENGTH: > raise InvalidURL(""URL too long"") E httpx.InvalidURL: URL too long /opt/hostedtoolcache/Python/3.7.16/x64/lib/python3.7/site-packages/httpx/_urlparse.py:155: InvalidURL =========================== short test summary info ============================ FAILED tests/test_csv.py::test_max_csv_mb - httpx.InvalidURL: URL too long ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2066/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1686033652,I_kwDOBm6k_c5kftT0,2065,Datasette cannot be installed with Rye,9599,simonw,closed,0,,,,,4,2023-04-27T03:35:42Z,2023-04-27T05:09:36Z,2023-04-27T05:09:36Z,OWNER,,"https://github.com/mitsuhiko/rye I tried this: rye install datasette But now: ``` % ~/.rye/shims/datasette Traceback (most recent call last): File ""/Users/simon/.rye/shims/datasette"", line 5, in from datasette.cli import cli File ""/Users/simon/.rye/tools/datasette/lib/python3.11/site-packages/datasette/cli.py"", line 17, in from .app import ( File ""/Users/simon/.rye/tools/datasette/lib/python3.11/site-packages/datasette/app.py"", line 14, in import pkg_resources ModuleNotFoundError: No module named 'pkg_resources' ``` I think that's because `setuptools` is not included in Rye.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2065/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1665510265,I_kwDOBm6k_c5jRat5,2060,Clean up a bunch of warnings from ruff,9599,simonw,open,0,,,,,0,2023-04-13T01:23:02Z,2023-04-13T01:23:02Z,,OWNER,,"See: - #2056 `ruff` spots a bunch of warnings about things like unused variables - would be good to clean up as many of these as possible.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2060/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1665053646,I_kwDOBm6k_c5jPrPO,2059,"""Deceptive site ahead"" alert on Heroku deployment",1186275,mtdukes,open,0,,,,,1,2023-04-12T18:34:51Z,2023-04-13T01:13:01Z,,NONE,,"I deployed a fairly basic instance of Datasette (`datasette-auth-passwords` is the only plugin) using Heroku. The deployed URL now gives a ""Deceptive site ahead"" warning to users. Is there way around this? Maybe a way to add ownership verification [through Google's search console](https://search.google.com/search-console/welcome)? 
",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2059/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1663399821,I_kwDOBm6k_c5jJXeN,2058,"500 ""attempt to write a readonly database"" error caused by ""PRAGMA schema_version""",9599,simonw,open,0,,,,,9,2023-04-11T23:57:50Z,2023-04-13T16:35:21Z,,OWNER,,"I've not been able to replicate this myself yet, but I've seen log files from a user affected by it. ``` File ""/usr/local/lib/python3.11/site-packages/datasette/views/base.py"", line 89, in dispatch_request await self.ds.refresh_schemas() File ""/usr/local/lib/python3.11/site-packages/datasette/app.py"", line 371, in refresh_schemas await self._refresh_schemas() File ""/usr/local/lib/python3.11/site-packages/datasette/app.py"", line 386, in _refresh_schemas schema_version = (await db.execute(""PRAGMA schema_version"")).first()[0] ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File ""/usr/local/lib/python3.11/site-packages/datasette/database.py"", line 267, in execute results = await self.execute_fn(sql_operation_in_thread) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File ""/usr/local/lib/python3.11/site-packages/datasette/database.py"", line 213, in execute_fn return await asyncio.get_event_loop().run_in_executor( ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File ""/usr/local/lib/python3.11/concurrent/futures/thread.py"", line 58, in run result = self.fn(*self.args, **self.kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File ""/usr/local/lib/python3.11/site-packages/datasette/database.py"", line 211, in in_thread return fn(conn) ^^^^^^^^ File ""/usr/local/lib/python3.11/site-packages/datasette/database.py"", line 237, in sql_operation_in_thread cursor.execute(sql, params if params is not None else {}) sqlite3.OperationalError: attempt to write a readonly database ``` That's running the official Datasette Docker image on https://fly.io/ - it's causing 500 errors on every page of their site.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2058/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1662951875,I_kwDOBm6k_c5jHqHD,2057,DeprecationWarning: pkg_resources is deprecated as an API,9599,simonw,closed,0,,,,,25,2023-04-11T17:41:20Z,2023-09-21T22:09:10Z,2023-09-21T22:09:10Z,OWNER,,"Got this running tests against Python 3.11. ``` ../../../.local/share/virtualenvs/datasette-big-local-6Yn-280V/lib/python3.11/site-packages/datasette/app.py:14: in import pkg_resources ../../../.local/share/virtualenvs/datasette-big-local-6Yn-280V/lib/python3.11/site-packages/pkg_resources/__init__.py:121: in warnings.warn(""pkg_resources is deprecated as an API"", DeprecationWarning) E DeprecationWarning: pkg_resources is deprecated as an API ``` I ran with `pytest -Werror --pdb -x` to get the debugger for that warning, but it turned out searching the code worked better. 
It's used in these two places:

https://github.com/simonw/datasette/blob/5890a20c374fb0812d88c9b0ef26a838bfa06c76/datasette/plugins.py#L43-L50

https://github.com/simonw/datasette/blob/5890a20c374fb0812d88c9b0ef26a838bfa06c76/datasette/app.py#L1037",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2057/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1661617056,I_kwDODD6af85jCkOg,15,ambiguous column name: createdAt - on checkin_details view,9599,simonw,closed,0,,,,,0,2023-04-11T01:07:47Z,2023-04-11T03:16:37Z,2023-04-11T03:16:37Z,MEMBER,,"It looks like Swarm changed their schema and now both `venues` and `checkins` have `createdAt` fields, which breaks this view: https://github.com/dogsheep/swarm-to-sqlite/blob/719b6e96a016d0ca8b316d3bed9c2a7a0cb499ee/swarm_to_sqlite/utils.py#L171-L188",205429375,swarm-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/swarm-to-sqlite/issues/15/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1657861026,I_kwDOBm6k_c5i0POi,2054,"Make detailed notes on how table, query and row views work right now",9599,simonw,open,0,,,,,13,2023-04-06T18:21:09Z,2023-04-07T20:14:38Z,,OWNER,,"Research to help influence the following:
- #2049
- #2053
- #2050
- #262",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2054/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1655860104,I_kwDOCGYnMM5ismuI,535,rows: --transpose or psql extended view-like functionality,7908073,chapmanjacobd,closed,0,,,,,2,2023-04-05T15:37:33Z,2023-06-15T08:39:49Z,2023-06-14T22:05:28Z,CONTRIBUTOR,,"It would be nice if the rows subcommand had a flag, perhaps called `--transpose`, which would print in long form instead of wide.
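To make the request concrete, here is a rough sketch of the kind of long-form output this flag could produce, written with the standard library `sqlite3` module since no such flag exists in sqlite-utils yet (the database and table names are the ones used below):

```python
import sqlite3

# Print each row as one column/value pair per line, in the spirit of
# psql's \x expanded display, instead of one wide row per line.
conn = sqlite3.connect('track_metadata.db')
conn.row_factory = sqlite3.Row  # rows become name-addressable

for i, row in enumerate(conn.execute('select * from songs limit 5'), 1):
    print(f'-[ RECORD {i} ]-')
    for key in row.keys():
        print(f'{key:20} | {row[key]}')
```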
Similar to extended display mode in psql (`\x`) In other words instead of this: ``` sqlite-utils rows --limit 5 --fmt github track_metadata.db songs ``` | track_id | title | song_id | release | artist_id | artist_mbid | artist_name | duration | artist_familiarity | artist_hotttnesss | year | track_7digitalid | shs_perf | shs_work | |--------------------|-------------------|--------------------|--------------------------------------|--------------------|--------------------------------------|------------------|------------|----------------------|---------------------|--------|--------------------|------------|------------| | TRMMMYQ128F932D901 | Silent Night | SOQMMHC12AB0180CB8 | Monster Ballads X-Mas | ARYZTJS1187B98C555 | 357ff05d-848a-44cf-b608-cb34b5701ae5 | Faster Pussy cat | 252.055 | 0.649822 | 0.394032 | 2003 | 7032331 | -1 | 0 | | TRMMMKD128F425225D | Tanssi vaan | SOVFVAK12A8C1350D9 | Karkuteillä | ARMVN3U1187FB3A1EB | 8d7ef530-a6fd-4f8f-b2e2-74aec765e0f9 | Karkkiautomaatti | 156.551 | 0.439604 | 0.356992 | 1995 | 1514808 | -1 | 0 | | TRMMMRX128F93187D9 | No One Could Ever | SOGTUKN12AB017F4F1 | Butter | ARGEKB01187FB50750 | 3d403d44-36ce-465c-ad43-ae877e65adc4 | Hudson Mohawke | 138.971 | 0.643681 | 0.437504 | 2006 | 6945353 | -1 | 0 | | TRMMMCH128F425532C | Si Vos Querés | SOBNYVR12A8C13558C | De Culo | ARNWYLR1187B9B2F9C | 12be7648-7094-495f-90e6-df4189d68615 | Yerba Brava | 145.058 | 0.448501 | 0.372349 | 2003 | 2168257 | -1 | 0 | | TRMMMWA128F426B589 | Tangle Of Aspens | SOHSBXH12A8C13B0DF | Rene Ablaze Presents Winter Sessions | AREQDTE1269FB37231 | | Der Mystic | 514.298 | 0 | 0 | 0 | 2264873 | -1 | 0 | The output would look something like this: ``` $ for col in (sqlite-columns track_metadata.db songs) sqlite-utils --fmt github track_metadata.db ""select $col from songs order by rowid desc limit 5"" end ``` | track_id | |--------------------| | TRYYYVU12903CD01E3 | | TRYYYDJ128F9310A21 | | TRYYYMG128F4260ECA | | TRYYYJO128F426DA37 | | TRYYYUS12903CD2DF0 | | title | |-------------------------------------| | Fernweh feat. Sektion Kuchikäschtli | | Faraday | | Novemba | | Jago Chhadeo | | O Samba Da Vida | | song_id | |--------------------| | SOWXJXQ12AB0189F43 | | SOLXGOR12A81C21EB7 | | SOHODZI12A8C137BB3 | | SOXQYIQ12A8C137FBB | | SOTXAME12AB018F136 | | release | |---------------------------------| | So Oder So | | The Trance Collection Vol. 2 | | Dub_Connected: electronic music | | Naale Baba Lassi Pee Gya | | Pacha V.I.P. 
| | artist_id | |--------------------| | AR7PLM21187B990D08 | | ARCMCOK1187B9B1073 | | ARZ3R6M1187B9AF750 | | ART5FZD1187B9A7FCF | | AR7Z4J81187FB3FC59 | | artist_mbid | |--------------------------------------| | 3af2b07e-c91c-4160-9bda-f0b9e3144ed3 | | 4ac5f3de-c5ad-475e-ad50-41f1ef9dba20 | | 8b97e9c8-61f5-4615-9a96-276f24204e34 | | 2357c400-9109-42b6-b3fe-9e2d9f8e3872 | | 9d50cb20-7e42-45cc-b0dd-154c3e92a577 | | artist_name | |----------------| | Texta | | Elude | | Gabriel Le Mar | | Kuldeep Manak | | Kiko Navarro | | duration | |------------| | 295.079 | | 484.519 | | 553.038 | | 244.166 | | 217.443 | | artist_familiarity | |----------------------| | 0.552977 | | 0.403668 | | 0.556918 | | 0.4015 | | 0.528617 | | artist_hotttnesss | |---------------------| | 0.454869 | | 0.256935 | | 0.336914 | | 0.374866 | | 0.411595 | | year | |--------| | 2004 | | 0 | | 0 | | 0 | | 0 | | track_7digitalid | |--------------------| | 8486723 | | 5472456 | | 2219291 | | 1632096 | | 7522478 | | shs_perf | |------------| | -1 | | -1 | | -1 | | -1 | | -1 | | shs_work | |------------| | 0 | | 0 | | 0 | | 0 | | 0 | ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/535/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1650981564,I_kwDOJHON9s5iZ_q8,12,Error running pytest,14314871,amlestin,open,0,,,,,0,2023-04-02T15:02:36Z,2023-04-02T15:07:10Z,,NONE,,"`______________________________________________________ ERROR collecting tests/test_apple_notes_to_sqlite.py _______________________________________________________ ImportError while importing test module '/Users/lol/development/apple-notes-to-sqlite/tests/test_apple_notes_to_sqlite.py'. Hint: make sure your test modules/packages have valid Python names. Traceback: /opt/homebrew/Cellar/python@3.9/3.9.16/Frameworks/Python.framework/Versions/3.9/lib/python3.9/importlib/__init__.py:127: in import_module return _bootstrap._gcd_import(name[level:], package, level) tests/test_apple_notes_to_sqlite.py:2: in from apple_notes_to_sqlite.cli import cli, COUNT_SCRIPT, FOLDERS_SCRIPT E ModuleNotFoundError: No module named 'apple_notes_to_sqlite'` Solution: This is likely a PYTHONPATH issue due to having pytest installed both globally and in the venv. We can guarantee the tests run by adding the current directory to sys.path automatically using `python -m pytest` The alternative is to activate the venv, install pytest, deactivate, then activate the venv again (https://stackoverflow.com/questions/35045038/how-do-i-use-pytest-with-virtualenv)",611552758,apple-notes-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/12/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1649793525,I_kwDOBm6k_c5iVdn1,2051,`?_extra=row_urls` for table pages,9599,simonw,open,0,,,,,0,2023-03-31T17:58:36Z,2023-03-31T17:58:36Z,,OWNER,,Provides URLs to the JSON version of those rows. Maybe it persists the `?_shape=` option too? 
Not sure about that.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2051/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1649791661,I_kwDOBm6k_c5iVdKt,2050,Row page JSON should use new ?_extra= format,9599,simonw,open,0,,,8755003,Datasette 1.0a-next,1,2023-03-31T17:56:53Z,2023-03-31T17:59:49Z,,OWNER,,"https://latest.datasette.io/fixtures/facetable/2.json Related: - #2049 - #1709 ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2050/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1646734246,I_kwDOBm6k_c5iJyum,2049,Custom SQL queries should use new JSON ?_extra= format,9599,simonw,open,0,,,8755003,Datasette 1.0a-next,4,2023-03-30T00:42:53Z,2023-04-05T23:29:27Z,,OWNER,,"Related: - #262 I've made the change to the table view, now I need the new format to work for arbitrary SQL queries too. Note that this incorporates both arbitrary SQL queries and canned queries.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2049/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1646068413,I_kwDOBm6k_c5iHQK9,2048,Test failures encountered while packaging for GNU Guix,8332263,Apteryks,open,0,,,,,0,2023-03-29T15:36:54Z,2023-03-29T15:36:54Z,,NONE,,"Hello, While reviewing a packaged submitted to Guix to add `datasette`, the test suite produces the following errors: ``` =================================== FAILURES =================================== _________________________ test_row_strange_table_name __________________________ [gw21] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python app_client = def test_row_strange_table_name(app_client): response = app_client.get( ""/fixtures/table~2Fwith~2Fslashes~2Ecsv/3.json?_shape=objects"" ) > assert response.status == 200 E assert 400 == 200 E + where 400 = .status /tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_api.py:701: AssertionError ----------------------------- Captured stderr call ----------------------------- ERROR: conn=, sql = 'select rowid, * from [table%7E2Fwith%7E2Fslashes%7E2Ecsv] where ""rowid""=:p0', params = {'p0': '3'}: no such table: table%7E2Fwith%7E2Fslashes%7E2Ecsv _______________ test_database_page_for_database_with_dot_in_name _______________ [gw15] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python app_client_with_dot = def test_database_page_for_database_with_dot_in_name(app_client_with_dot): response = app_client_with_dot.get(""/fixtures~2Edot.json"") > assert response.status == 200 E assert 302 == 200 E + where 302 = .status /tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_api.py:633: AssertionError ___________________ test_tilde_encoded_database_names[fo%o] ____________________ [gw6] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python db_name = 'fo%o' @pytest.mark.asyncio @pytest.mark.parametrize(""db_name"", (""foo"", r""fo%o"", ""f~/c.d"")) async def test_tilde_encoded_database_names(db_name): ds = Datasette() ds.add_memory_database(db_name) response = await ds.client.get(""/.json"") assert db_name in response.json().keys() path = 
response.json()[db_name][""path""] # And the JSON for that database response2 = await ds.client.get(path + "".json"") > assert response2.status_code == 200 E assert 302 == 200 E + where 302 = .status_code /tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_api.py:983: AssertionError __________________ test_tilde_encoded_database_names[f~/c.d] ___________________ [gw7] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python db_name = 'f~/c.d' @pytest.mark.asyncio @pytest.mark.parametrize(""db_name"", (""foo"", r""fo%o"", ""f~/c.d"")) async def test_tilde_encoded_database_names(db_name): ds = Datasette() ds.add_memory_database(db_name) response = await ds.client.get(""/.json"") assert db_name in response.json().keys() path = response.json()[db_name][""path""] # And the JSON for that database response2 = await ds.client.get(path + "".json"") > assert response2.status_code == 200 E assert 302 == 200 E + where 302 = .status_code /tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_api.py:983: AssertionError ______________ test_database_with_space_in_name[/searchable.json] ______________ [gw21] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python app_client_two_attached_databases = path = '/searchable.json' @pytest.mark.parametrize( ""path"", ( ""/"", "".json"", ""/searchable"", ""/searchable.json"", ""/searchable_view"", ""/searchable_view.json"", ), ) def test_database_with_space_in_name(app_client_two_attached_databases, path): > response = app_client_two_attached_databases.get( ""/extra~20database"" + path, follow_redirects=True ) /tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_api.py:920: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /gnu/store/mcclmphjgbrgpa0v037a4nlq336482g8-python-asgiref-3.4.1/lib/python3.9/site-packages/asgiref/sync.py:223: in __call__ return call_result.result() /gnu/store/65i3nhcwmz0p8rqbg48gaavyky4g4hwk-python-3.9.9/lib/python3.9/concurrent/futures/_base.py:438: in result return self.__get_result() /gnu/store/65i3nhcwmz0p8rqbg48gaavyky4g4hwk-python-3.9.9/lib/python3.9/concurrent/futures/_base.py:390: in __get_result raise self._exception /gnu/store/mcclmphjgbrgpa0v037a4nlq336482g8-python-asgiref-3.4.1/lib/python3.9/site-packages/asgiref/sync.py:292: in main_wrap result = await self.awaitable(*args, **kwargs) /tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/utils/testing.py:66: in get return await self._request( /tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/utils/testing.py:156: in _request httpx_response = await self.ds.client.request( /tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/app.py:1602: in request return await client.request( /gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1527: in request return await self.send(request, auth=auth, follow_redirects=follow_redirects) /gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1614: in send response = await self._send_handling_auth( /gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1642: in _send_handling_auth response = await self._send_handling_redirects( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = request = follow_redirects = True history = [, , , , , , ...] 
async def _send_handling_redirects( self, request: Request, follow_redirects: bool, history: typing.List[Response], ) -> Response: while True: if len(history) > self.max_redirects: > raise TooManyRedirects( ""Exceeded maximum allowed redirects."", request=request ) E httpx.TooManyRedirects: Exceeded maximum allowed redirects. /gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1672: TooManyRedirects ___________________ test_database_with_space_in_name[.json] ____________________ [gw19] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python app_client_two_attached_databases = path = '.json' @pytest.mark.parametrize( ""path"", ( ""/"", "".json"", ""/searchable"", ""/searchable.json"", ""/searchable_view"", ""/searchable_view.json"", ), ) def test_database_with_space_in_name(app_client_two_attached_databases, path): > response = app_client_two_attached_databases.get( ""/extra~20database"" + path, follow_redirects=True ) /tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_api.py:920: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /gnu/store/mcclmphjgbrgpa0v037a4nlq336482g8-python-asgiref-3.4.1/lib/python3.9/site-packages/asgiref/sync.py:223: in __call__ return call_result.result() /gnu/store/65i3nhcwmz0p8rqbg48gaavyky4g4hwk-python-3.9.9/lib/python3.9/concurrent/futures/_base.py:438: in result return self.__get_result() /gnu/store/65i3nhcwmz0p8rqbg48gaavyky4g4hwk-python-3.9.9/lib/python3.9/concurrent/futures/_base.py:390: in __get_result raise self._exception /gnu/store/mcclmphjgbrgpa0v037a4nlq336482g8-python-asgiref-3.4.1/lib/python3.9/site-packages/asgiref/sync.py:292: in main_wrap result = await self.awaitable(*args, **kwargs) /tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/utils/testing.py:66: in get return await self._request( /tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/utils/testing.py:156: in _request httpx_response = await self.ds.client.request( /tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/app.py:1602: in request return await client.request( /gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1527: in request return await self.send(request, auth=auth, follow_redirects=follow_redirects) /gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1614: in send response = await self._send_handling_auth( /gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1642: in _send_handling_auth response = await self._send_handling_redirects( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = request = follow_redirects = True history = [, , , , , , ...] async def _send_handling_redirects( self, request: Request, follow_redirects: bool, history: typing.List[Response], ) -> Response: while True: if len(history) > self.max_redirects: > raise TooManyRedirects( ""Exceeded maximum allowed redirects."", request=request ) E httpx.TooManyRedirects: Exceeded maximum allowed redirects. 
/gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1672: TooManyRedirects ______________ test_database_with_space_in_name[/searchable_view] ______________ [gw22] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python app_client_two_attached_databases = path = '/searchable_view' @pytest.mark.parametrize( ""path"", ( ""/"", "".json"", ""/searchable"", ""/searchable.json"", ""/searchable_view"", ""/searchable_view.json"", ), ) def test_database_with_space_in_name(app_client_two_attached_databases, path): > response = app_client_two_attached_databases.get( ""/extra~20database"" + path, follow_redirects=True ) /tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_api.py:920: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /gnu/store/mcclmphjgbrgpa0v037a4nlq336482g8-python-asgiref-3.4.1/lib/python3.9/site-packages/asgiref/sync.py:223: in __call__ return call_result.result() /gnu/store/65i3nhcwmz0p8rqbg48gaavyky4g4hwk-python-3.9.9/lib/python3.9/concurrent/futures/_base.py:438: in result return self.__get_result() /gnu/store/65i3nhcwmz0p8rqbg48gaavyky4g4hwk-python-3.9.9/lib/python3.9/concurrent/futures/_base.py:390: in __get_result raise self._exception /gnu/store/mcclmphjgbrgpa0v037a4nlq336482g8-python-asgiref-3.4.1/lib/python3.9/site-packages/asgiref/sync.py:292: in main_wrap result = await self.awaitable(*args, **kwargs) /tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/utils/testing.py:66: in get return await self._request( /tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/utils/testing.py:156: in _request httpx_response = await self.ds.client.request( /tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/app.py:1602: in request return await client.request( /gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1527: in request return await self.send(request, auth=auth, follow_redirects=follow_redirects) /gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1614: in send response = await self._send_handling_auth( /gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1642: in _send_handling_auth response = await self._send_handling_redirects( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = request = follow_redirects = True history = [, , , , , , ...] async def _send_handling_redirects( self, request: Request, follow_redirects: bool, history: typing.List[Response], ) -> Response: while True: if len(history) > self.max_redirects: > raise TooManyRedirects( ""Exceeded maximum allowed redirects."", request=request ) E httpx.TooManyRedirects: Exceeded maximum allowed redirects. 
/gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1672: TooManyRedirects _____________________ test_database_with_space_in_name[/] ______________________ [gw18] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python app_client_two_attached_databases = path = '/' @pytest.mark.parametrize( ""path"", ( ""/"", "".json"", ""/searchable"", ""/searchable.json"", ""/searchable_view"", ""/searchable_view.json"", ), ) def test_database_with_space_in_name(app_client_two_attached_databases, path): > response = app_client_two_attached_databases.get( ""/extra~20database"" + path, follow_redirects=True ) /tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_api.py:920: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /gnu/store/mcclmphjgbrgpa0v037a4nlq336482g8-python-asgiref-3.4.1/lib/python3.9/site-packages/asgiref/sync.py:223: in __call__ return call_result.result() /gnu/store/65i3nhcwmz0p8rqbg48gaavyky4g4hwk-python-3.9.9/lib/python3.9/concurrent/futures/_base.py:438: in result return self.__get_result() /gnu/store/65i3nhcwmz0p8rqbg48gaavyky4g4hwk-python-3.9.9/lib/python3.9/concurrent/futures/_base.py:390: in __get_result raise self._exception /gnu/store/mcclmphjgbrgpa0v037a4nlq336482g8-python-asgiref-3.4.1/lib/python3.9/site-packages/asgiref/sync.py:292: in main_wrap result = await self.awaitable(*args, **kwargs) /tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/utils/testing.py:66: in get return await self._request( /tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/utils/testing.py:156: in _request httpx_response = await self.ds.client.request( /tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/app.py:1602: in request return await client.request( /gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1527: in request return await self.send(request, auth=auth, follow_redirects=follow_redirects) /gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1614: in send response = await self._send_handling_auth( /gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1642: in _send_handling_auth response = await self._send_handling_redirects( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = request = follow_redirects = True history = [, , , , , , ...] async def _send_handling_redirects( self, request: Request, follow_redirects: bool, history: typing.List[Response], ) -> Response: while True: if len(history) > self.max_redirects: > raise TooManyRedirects( ""Exceeded maximum allowed redirects."", request=request ) E httpx.TooManyRedirects: Exceeded maximum allowed redirects. 
/gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1672: TooManyRedirects ________________ test_database_with_space_in_name[/searchable] _________________ [gw20] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python app_client_two_attached_databases = path = '/searchable' @pytest.mark.parametrize( ""path"", ( ""/"", "".json"", ""/searchable"", ""/searchable.json"", ""/searchable_view"", ""/searchable_view.json"", ), ) def test_database_with_space_in_name(app_client_two_attached_databases, path): > response = app_client_two_attached_databases.get( ""/extra~20database"" + path, follow_redirects=True ) /tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_api.py:920: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /gnu/store/mcclmphjgbrgpa0v037a4nlq336482g8-python-asgiref-3.4.1/lib/python3.9/site-packages/asgiref/sync.py:223: in __call__ return call_result.result() /gnu/store/65i3nhcwmz0p8rqbg48gaavyky4g4hwk-python-3.9.9/lib/python3.9/concurrent/futures/_base.py:438: in result return self.__get_result() /gnu/store/65i3nhcwmz0p8rqbg48gaavyky4g4hwk-python-3.9.9/lib/python3.9/concurrent/futures/_base.py:390: in __get_result raise self._exception /gnu/store/mcclmphjgbrgpa0v037a4nlq336482g8-python-asgiref-3.4.1/lib/python3.9/site-packages/asgiref/sync.py:292: in main_wrap result = await self.awaitable(*args, **kwargs) /tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/utils/testing.py:66: in get return await self._request( /tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/utils/testing.py:156: in _request httpx_response = await self.ds.client.request( /tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/app.py:1602: in request return await client.request( /gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1527: in request return await self.send(request, auth=auth, follow_redirects=follow_redirects) /gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1614: in send response = await self._send_handling_auth( /gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1642: in _send_handling_auth response = await self._send_handling_redirects( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = request = follow_redirects = True history = [, , , , , , ...] async def _send_handling_redirects( self, request: Request, follow_redirects: bool, history: typing.List[Response], ) -> Response: while True: if len(history) > self.max_redirects: > raise TooManyRedirects( ""Exceeded maximum allowed redirects."", request=request ) E httpx.TooManyRedirects: Exceeded maximum allowed redirects. 
/gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1672: TooManyRedirects ___________ test_database_with_space_in_name[/searchable_view.json] ____________ [gw23] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python app_client_two_attached_databases = path = '/searchable_view.json' @pytest.mark.parametrize( ""path"", ( ""/"", "".json"", ""/searchable"", ""/searchable.json"", ""/searchable_view"", ""/searchable_view.json"", ), ) def test_database_with_space_in_name(app_client_two_attached_databases, path): > response = app_client_two_attached_databases.get( ""/extra~20database"" + path, follow_redirects=True ) /tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_api.py:920: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /gnu/store/mcclmphjgbrgpa0v037a4nlq336482g8-python-asgiref-3.4.1/lib/python3.9/site-packages/asgiref/sync.py:223: in __call__ return call_result.result() /gnu/store/65i3nhcwmz0p8rqbg48gaavyky4g4hwk-python-3.9.9/lib/python3.9/concurrent/futures/_base.py:438: in result return self.__get_result() /gnu/store/65i3nhcwmz0p8rqbg48gaavyky4g4hwk-python-3.9.9/lib/python3.9/concurrent/futures/_base.py:390: in __get_result raise self._exception /gnu/store/mcclmphjgbrgpa0v037a4nlq336482g8-python-asgiref-3.4.1/lib/python3.9/site-packages/asgiref/sync.py:292: in main_wrap result = await self.awaitable(*args, **kwargs) /tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/utils/testing.py:66: in get return await self._request( /tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/utils/testing.py:156: in _request httpx_response = await self.ds.client.request( /tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/app.py:1602: in request return await client.request( /gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1527: in request return await self.send(request, auth=auth, follow_redirects=follow_redirects) /gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1614: in send response = await self._send_handling_auth( /gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1642: in _send_handling_auth response = await self._send_handling_redirects( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = request = follow_redirects = True history = [, , , , , , ...] async def _send_handling_redirects( self, request: Request, follow_redirects: bool, history: typing.List[Response], ) -> Response: while True: if len(history) > self.max_redirects: > raise TooManyRedirects( ""Exceeded maximum allowed redirects."", request=request ) E httpx.TooManyRedirects: Exceeded maximum allowed redirects. 
/gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1672: TooManyRedirects ________________ test_weird_database_names[database (1).sqlite] ________________ [gw7] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python tmpdir = local('/tmp/guix-build-datasette-0.64.2.drv-0/pytest-of-nixbld/pytest-0/popen-gw7/test_weird_database_names_data0') filename = 'database (1).sqlite' @pytest.mark.parametrize( ""filename"", [""test-database (1).sqlite"", ""database (1).sqlite""] ) def test_weird_database_names(tmpdir, filename): # https://github.com/simonw/datasette/issues/1181 runner = CliRunner() db_path = str(tmpdir / filename) sqlite3.connect(db_path).execute(""vacuum"") result1 = runner.invoke(cli, [db_path, ""--get"", ""/""]) assert result1.exit_code == 0, result1.output filename_no_stem = filename.rsplit(""."", 1)[0] expected_link = '{}'.format( tilde_encode(filename_no_stem), filename_no_stem ) assert expected_link in result1.output # Now try hitting that database page result2 = runner.invoke( cli, [db_path, ""--get"", ""/{}"".format(tilde_encode(filename_no_stem))] ) > assert result2.exit_code == 0, result2.output E AssertionError: E E assert 1 == 0 E + where 1 = .exit_code /tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_cli.py:321: AssertionError _____________ test_weird_database_names[test-database (1).sqlite] ______________ [gw6] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python tmpdir = local('/tmp/guix-build-datasette-0.64.2.drv-0/pytest-of-nixbld/pytest-0/popen-gw6/test_weird_database_names_test0') filename = 'test-database (1).sqlite' @pytest.mark.parametrize( ""filename"", [""test-database (1).sqlite"", ""database (1).sqlite""] ) def test_weird_database_names(tmpdir, filename): # https://github.com/simonw/datasette/issues/1181 runner = CliRunner() db_path = str(tmpdir / filename) sqlite3.connect(db_path).execute(""vacuum"") result1 = runner.invoke(cli, [db_path, ""--get"", ""/""]) assert result1.exit_code == 0, result1.output filename_no_stem = filename.rsplit(""."", 1)[0] expected_link = '{}'.format( tilde_encode(filename_no_stem), filename_no_stem ) assert expected_link in result1.output # Now try hitting that database page result2 = runner.invoke( cli, [db_path, ""--get"", ""/{}"".format(tilde_encode(filename_no_stem))] ) > assert result2.exit_code == 0, result2.output E AssertionError: E E assert 1 == 0 E + where 1 = .exit_code /tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_cli.py:321: AssertionError _ test_row_html_compound_primary_key[/fixtures/compound_primary_key/a~2Fb,~2Ec~2Dd-expected1] _ [gw11] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python app_client = path = '/fixtures/compound_primary_key/a~2Fb,~2Ec~2Dd' expected = [['
', '', '']] @pytest.mark.parametrize( ""path,expected"", ( ( ""/fixtures/compound_primary_key/a,b"", [ [ '', '', '', ] ], ), ( ""/fixtures/compound_primary_key/a~2Fb,~2Ec~2Dd"", [ [ '', '', '', ] ], ), ), ) def test_row_html_compound_primary_key(app_client, path, expected): response = app_client.get(path) > assert response.status == 200 E assert 302 == 200 E + where 302 = .status /tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_html.py:370: AssertionError _ test_css_classes_on_body[/fixtures/table~2Fwith~2Fslashes~2Ecsv-expected_classes5] _ [gw3] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python app_client = path = '/fixtures/table~2Fwith~2Fslashes~2Ecsv' expected_classes = ['table', 'db-fixtures', 'table-tablewithslashescsv-fa7563'] @pytest.mark.parametrize( ""path,expected_classes"", [ (""/"", [""index""]), (""/fixtures"", [""db"", ""db-fixtures""]), (""/fixtures?sql=select+1"", [""query"", ""db-fixtures""]), ( ""/fixtures/simple_primary_key"", [""table"", ""db-fixtures"", ""table-simple_primary_key""], ), ( ""/fixtures/neighborhood_search"", [""query"", ""db-fixtures"", ""query-neighborhood_search""], ), ( ""/fixtures/table~2Fwith~2Fslashes~2Ecsv"", [""table"", ""db-fixtures"", ""table-tablewithslashescsv-fa7563""], ), ( ""/fixtures/simple_primary_key/1"", [""row"", ""db-fixtures"", ""table-simple_primary_key""], ), ], ) def test_css_classes_on_body(app_client, path, expected_classes): response = app_client.get(path) > assert response.status == 200 E assert 302 == 200 E + where 302 = .status /tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_html.py:238: AssertionError _ test_templates_considered[/fixtures/table~2Fwith~2Fslashes~2Ecsv-table-fixtures-tablewithslashescsv-fa7563.html, *table.html] _ [gw3] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python app_client = path = '/fixtures/table~2Fwith~2Fslashes~2Ecsv' expected_considered = 'table-fixtures-tablewithslashescsv-fa7563.html, *table.html' @pytest.mark.parametrize( ""path,expected_considered"", [ (""/"", ""*index.html""), (""/fixtures"", ""database-fixtures.html, *database.html""), ( ""/fixtures/simple_primary_key"", ""table-fixtures-simple_primary_key.html, *table.html"", ), ( ""/fixtures/table~2Fwith~2Fslashes~2Ecsv"", ""table-fixtures-tablewithslashescsv-fa7563.html, *table.html"", ), ( ""/fixtures/simple_primary_key/1"", ""row-fixtures-simple_primary_key.html, *row.html"", ), ], ) def test_templates_considered(app_client, path, expected_considered): response = app_client.get(path) > assert response.status == 200 E assert 302 == 200 E + where 302 = .status /tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_html.py:264: AssertionError _ test_alternate_url_json[/fixtures/table~2Fwith~2Fslashes~2Ecsv-http://localhost/fixtures/table~2Fwith~2Fslashes~2Ecsv.json] _ [gw21] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python app_client = path = '/fixtures/table~2Fwith~2Fslashes~2Ecsv' expected = 'http://localhost/fixtures/table~2Fwith~2Fslashes~2Ecsv.json' @pytest.mark.parametrize( ""path,expected"", ( # Instance index page (""/"", ""http://localhost/.json""), # Table page (""/fixtures/facetable"", ""http://localhost/fixtures/facetable.json""), ( ""/fixtures/table~2Fwith~2Fslashes~2Ecsv"", ""http://localhost/fixtures/table~2Fwith~2Fslashes~2Ecsv.json"", ), # Row page ( ""/fixtures/no_primary_key/1"", ""http://localhost/fixtures/no_primary_key/1.json"", ), # Database index page ( 
""/fixtures"", ""http://localhost/fixtures.json"", ), # Custom query page ( ""/fixtures?sql=select+*+from+facetable"", ""http://localhost/fixtures.json?sql=select+*+from+facetable"", ), # Canned query page ( ""/fixtures/neighborhood_search?text=town"", ""http://localhost/fixtures/neighborhood_search.json?text=town"", ), # /-/ pages ( ""/-/plugins"", ""http://localhost/-/plugins.json"", ), ), ) def test_alternate_url_json(app_client, path, expected): response = app_client.get(path) > assert response.status == 200 E assert 302 == 200 E + where 302 = .status /tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_html.py:948: AssertionError _ test_edit_sql_link_on_canned_queries[/fixtures/~F0~9D~90~9C~F0~9D~90~A2~F0~9D~90~AD~F0~9D~90~A2~F0~9D~90~9E~F0~9D~90~AC-/fixtures?sql=select+id%2C+name+from+facet_cities+order+by+id+limit+1%3B] _ [gw18] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python app_client = path = '/fixtures/~F0~9D~90~9C~F0~9D~90~A2~F0~9D~90~AD~F0~9D~90~A2~F0~9D~90~9E~F0~9D~90~AC' expected = '/fixtures?sql=select+id%2C+name+from+facet_cities+order+by+id+limit+1%3B' @pytest.mark.parametrize( ""path,expected"", [ ( ""/fixtures/neighborhood_search"", ""/fixtures?sql=%0Aselect+_neighborhood%2C+facet_cities.name%2C+state%0Afrom+facetable%0A++++join+facet_cities%0A++++++++on+facetable._city_id+%3D+facet_cities.id%0Awhere+_neighborhood+like+%27%25%27+%7C%7C+%3Atext+%7C%7C+%27%25%27%0Aorder+by+_neighborhood%3B%0A&text="", ), ( ""/fixtures/neighborhood_search?text=ber"", ""/fixtures?sql=%0Aselect+_neighborhood%2C+facet_cities.name%2C+state%0Afrom+facetable%0A++++join+facet_cities%0A++++++++on+facetable._city_id+%3D+facet_cities.id%0Awhere+_neighborhood+like+%27%25%27+%7C%7C+%3Atext+%7C%7C+%27%25%27%0Aorder+by+_neighborhood%3B%0A&text=ber"", ), (""/fixtures/pragma_cache_size"", None), ( # /fixtures/𝐜𝐢𝐭𝐢𝐞𝐬 ""/fixtures/~F0~9D~90~9C~F0~9D~90~A2~F0~9D~90~AD~F0~9D~90~A2~F0~9D~90~9E~F0~9D~90~AC"", ""/fixtures?sql=select+id%2C+name+from+facet_cities+order+by+id+limit+1%3B"", ), (""/fixtures/magic_parameters"", None), ], ) def test_edit_sql_link_on_canned_queries(app_client, path, expected): response = app_client.get(path) > assert response.status == 200 E assert 302 == 200 E + where 302 = .status /tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_html.py:841: AssertionError _______________________ test_table_with_slashes_in_name ________________________ [gw9] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python app_client = def test_table_with_slashes_in_name(app_client): response = app_client.get( ""/fixtures/table~2Fwith~2Fslashes~2Ecsv.json?_shape=objects"" ) > assert response.status == 200 E assert 302 == 200 E + where 302 = .status /tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_table_api.py:141: AssertionError __________________ test_custom_query_with_unicode_characters ___________________ [gw8] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python app_client = def test_custom_query_with_unicode_characters(app_client): # /fixtures/𝐜𝐢𝐭𝐢𝐞𝐬.json response = app_client.get( ""/fixtures/~F0~9D~90~9C~F0~9D~90~A2~F0~9D~90~AD~F0~9D~90~A2~F0~9D~90~9E~F0~9D~90~AC.json?_shape=array"" ) > assert [{""id"": 1, ""name"": ""San Francisco""}] == response.json /tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_table_api.py:1042: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/utils/testing.py:40: in json return json.loads(self.text) /gnu/store/65i3nhcwmz0p8rqbg48gaavyky4g4hwk-python-3.9.9/lib/python3.9/json/__init__.py:346: in loads return _default_decoder.decode(s) /gnu/store/65i3nhcwmz0p8rqbg48gaavyky4g4hwk-python-3.9.9/lib/python3.9/json/decoder.py:337: in decode obj, end = self.raw_decode(s, idx=_w(s, 0).end()) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , s = '', idx = 0 def raw_decode(self, s, idx=0): """"""Decode a JSON document from ``s`` (a ``str`` beginning with a JSON document) and return a 2-tuple of the Python representation and the index in ``s`` where the document ended. This can be used to decode a JSON document from a string that may have extraneous data at the end. """""" try: obj, end = self.scan_once(s, idx) except StopIteration as err: > raise JSONDecodeError(""Expecting value"", s, err.value) from None E json.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0) /gnu/store/65i3nhcwmz0p8rqbg48gaavyky4g4hwk-python-3.9.9/lib/python3.9/json/decoder.py:355: JSONDecodeError _ test_searchable[/fixtures/searchable.json?_search=te*+AND+do*&_searchmode=raw-expected_rows3] _ [gw13] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python app_client = path = '/fixtures/searchable.json?_search=te*+AND+do*&_searchmode=raw' expected_rows = [[1, 'barry cat', 'terry dog', 'panther'], [2, 'terry dog', 'sara weasel', 'puma']] @pytest.mark.parametrize( ""path,expected_rows"", [ ( ""/fixtures/searchable.json?_search=dog"", [ [1, ""barry cat"", ""terry dog"", ""panther""], [2, ""terry dog"", ""sara weasel"", ""puma""], ], ), ( # Special keyword shouldn't break FTS query ""/fixtures/searchable.json?_search=AND"", [], ), ( # Without _searchmode=raw this should return no results ""/fixtures/searchable.json?_search=te*+AND+do*"", [], ), ( # _searchmode=raw ""/fixtures/searchable.json?_search=te*+AND+do*&_searchmode=raw"", [ [1, ""barry cat"", ""terry dog"", ""panther""], [2, ""terry dog"", ""sara weasel"", ""puma""], ], ), ( # _searchmode=raw combined with _search_COLUMN ""/fixtures/searchable.json?_search_text2=te*&_searchmode=raw"", [ [1, ""barry cat"", ""terry dog"", ""panther""], ], ), ( ""/fixtures/searchable.json?_search=weasel"", [[2, ""terry dog"", ""sara weasel"", ""puma""]], ), ( ""/fixtures/searchable.json?_search_text2=dog"", [[1, ""barry cat"", ""terry dog"", ""panther""]], ), ( ""/fixtures/searchable.json?_search_name%20with%20.%20and%20spaces=panther"", [[1, ""barry cat"", ""terry dog"", ""panther""]], ), ], ) def test_searchable(app_client, path, expected_rows): response = app_client.get(path) > assert expected_rows == response.json[""rows""] E AssertionError: assert [[1, 'barry cat', 'terry dog', 'panther'],\n [2, 'terry dog', 'sara weasel', 'puma']] == [] E Left contains 2 more items, first extra item: [1, 'barry cat', 'terry dog', 'panther'] E Full diff: E [ E - , E + [1, E + 'barry cat', E + 'terry dog', E + 'panther'], E + [2, E + 'terry dog', E + 'sara weasel', E + 'puma'], E ] /tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_table_api.py:402: AssertionError _____ test_searchmode[table_metadata1-_search=te*+AND+do*-expected_rows1] ______ [gw20] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python table_metadata = {'searchmode': 'raw'}, querystring = '_search=te*+AND+do*' expected_rows = [[1, 'barry cat', 'terry dog', 'panther'], [2, 'terry 
dog', 'sara weasel', 'puma']] @pytest.mark.parametrize( ""table_metadata,querystring,expected_rows"", [ ( {}, ""_search=te*+AND+do*"", [], ), ( {""searchmode"": ""raw""}, ""_search=te*+AND+do*"", _SEARCHMODE_RAW_RESULTS, ), ( {}, ""_search=te*+AND+do*&_searchmode=raw"", _SEARCHMODE_RAW_RESULTS, ), # Can be over-ridden with _searchmode=escaped ( {""searchmode"": ""raw""}, ""_search=te*+AND+do*&_searchmode=escaped"", [], ), ], ) def test_searchmode(table_metadata, querystring, expected_rows): with make_app_client( metadata={""databases"": {""fixtures"": {""tables"": {""searchable"": table_metadata}}}} ) as client: response = client.get(""/fixtures/searchable.json?"" + querystring) > assert expected_rows == response.json[""rows""] E AssertionError: assert [[1, 'barry cat', 'terry dog', 'panther'],\n [2, 'terry dog', 'sara weasel', 'puma']] == [] E Left contains 2 more items, first extra item: [1, 'barry cat', 'terry dog', 'panther'] E Full diff: E [ E - , E + [1, E + 'barry cat', E + 'terry dog', E + 'panther'], E + [2, E + 'terry dog', E + 'sara weasel', E + 'puma'], E ] /tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_table_api.py:442: AssertionError _ test_searchmode[table_metadata2-_search=te*+AND+do*&_searchmode=raw-expected_rows2] _ [gw20] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python table_metadata = {}, querystring = '_search=te*+AND+do*&_searchmode=raw' expected_rows = [[1, 'barry cat', 'terry dog', 'panther'], [2, 'terry dog', 'sara weasel', 'puma']] @pytest.mark.parametrize( ""table_metadata,querystring,expected_rows"", [ ( {}, ""_search=te*+AND+do*"", [], ), ( {""searchmode"": ""raw""}, ""_search=te*+AND+do*"", _SEARCHMODE_RAW_RESULTS, ), ( {}, ""_search=te*+AND+do*&_searchmode=raw"", _SEARCHMODE_RAW_RESULTS, ), # Can be over-ridden with _searchmode=escaped ( {""searchmode"": ""raw""}, ""_search=te*+AND+do*&_searchmode=escaped"", [], ), ], ) def test_searchmode(table_metadata, querystring, expected_rows): with make_app_client( metadata={""databases"": {""fixtures"": {""tables"": {""searchable"": table_metadata}}}} ) as client: response = client.get(""/fixtures/searchable.json?"" + querystring) > assert expected_rows == response.json[""rows""] E AssertionError: assert [[1, 'barry cat', 'terry dog', 'panther'],\n [2, 'terry dog', 'sara weasel', 'puma']] == [] E Left contains 2 more items, first extra item: [1, 'barry cat', 'terry dog', 'panther'] E Full diff: E [ E - , E + [1, E + 'barry cat', E + 'terry dog', E + 'panther'], E + [2, E + 'terry dog', E + 'sara weasel', E + 'puma'], E ] /tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_table_api.py:442: AssertionError =========================== short test summary info ============================ FAILED tests/test_api.py::test_row_strange_table_name - assert 400 == 200 FAILED tests/test_api.py::test_database_page_for_database_with_dot_in_name - ... FAILED tests/test_api.py::test_tilde_encoded_database_names[fo%o] - assert 30... FAILED tests/test_api.py::test_tilde_encoded_database_names[f~/c.d] - assert ... FAILED tests/test_api.py::test_database_with_space_in_name[/searchable.json] FAILED tests/test_api.py::test_database_with_space_in_name[.json] - httpx.Too... FAILED tests/test_api.py::test_database_with_space_in_name[/searchable_view] FAILED tests/test_api.py::test_database_with_space_in_name[/] - httpx.TooMany... FAILED tests/test_api.py::test_database_with_space_in_name[/searchable] - htt... 
FAILED tests/test_api.py::test_database_with_space_in_name[/searchable_view.json] FAILED tests/test_cli.py::test_weird_database_names[database (1).sqlite] - As... FAILED tests/test_cli.py::test_weird_database_names[test-database (1).sqlite] FAILED tests/test_html.py::test_row_html_compound_primary_key[/fixtures/compound_primary_key/a~2Fb,~2Ec~2Dd-expected1] FAILED tests/test_html.py::test_css_classes_on_body[/fixtures/table~2Fwith~2Fslashes~2Ecsv-expected_classes5] FAILED tests/test_html.py::test_templates_considered[/fixtures/table~2Fwith~2Fslashes~2Ecsv-table-fixtures-tablewithslashescsv-fa7563.html, *table.html] FAILED tests/test_html.py::test_alternate_url_json[/fixtures/table~2Fwith~2Fslashes~2Ecsv-http://localhost/fixtures/table~2Fwith~2Fslashes~2Ecsv.json] FAILED tests/test_html.py::test_edit_sql_link_on_canned_queries[/fixtures/~F0~9D~90~9C~F0~9D~90~A2~F0~9D~90~AD~F0~9D~90~A2~F0~9D~90~9E~F0~9D~90~AC-/fixtures?sql=select+id%2C+name+from+facet_cities+order+by+id+limit+1%3B] FAILED tests/test_table_api.py::test_table_with_slashes_in_name - assert 302 ... FAILED tests/test_table_api.py::test_custom_query_with_unicode_characters - j... FAILED tests/test_table_api.py::test_searchable[/fixtures/searchable.json?_search=te*+AND+do*&_searchmode=raw-expected_rows3] FAILED tests/test_table_api.py::test_searchmode[table_metadata1-_search=te*+AND+do*-expected_rows1] FAILED tests/test_table_api.py::test_searchmode[table_metadata2-_search=te*+AND+do*&_searchmode=raw-expected_rows2] =========== 22 failed, 1049 passed, 3 skipped in 1522.28s (0:25:22) ============ error: in phase 'check': uncaught exception: %exception #<&invoke-error program: ""/gnu/store/ziqwkzz6znb5d3c245xn0cq5ra2ly0w3-python-pytest-7.1.3/bin/pytest"" arguments: (""-vv"" ""-n"" ""24"" ""-m"" ""not serial"") exit-status: 1 term-signal: #f stop-signal: #f> phase `check' failed after 1523.3 seconds ``` The tests run in a private namespace without internet connectivity, and the Python dependencies are at: ``` python-aiofiles@0.6.0 python-asgi-csrf@0.9 python-asgiref@3.4.1 + python-beautifulsoup4@4.11.1 python-black@22.3.0 python-click-default-group@1.2.2 python-click@8.1.3 + python-cogapp@3.3.0 python-httpx@0.23.0 python-hupper@1.10.3 python-itsdangerous@2.0.1 + python-janus@1.0.0 python-jinja2@3.1.1 python-mergedeep@1.3.4 python-pint@0.20.1 python-pluggy@1.0.0 + python-pytest-asyncio@0.17.2 python-pytest-runner@5.2 python-pytest-timeout@2.0.2 + python-pytest-xdist@2.5.0 python-pytest@7.1.3 python-pyyaml@6.0 python-setuptools@64.0.3 + python-trustme@0.9.0 python-uvicorn@0.17.6 ``` With Python 3.9.9. Thank you!",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2048/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1641013220,I_kwDOBm6k_c5hz9_k,2045,First column on a view page has no facet option in cog menu,9599,simonw,open,0,,,3268330,Datasette 1.0,0,2023-03-26T18:02:47Z,2023-03-26T18:02:48Z,,OWNER,,"e.g. first column on this page - cog menu has no option to facet. 
https://datasette.io/content/tools",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2045/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1636616315,I_kwDOBm6k_c5hjMh7,2042,Gather feedback on new ?_extra= design,9599,simonw,open,0,,,,,0,2023-03-22T23:07:43Z,2023-03-22T23:08:19Z,,OWNER,,"Now that I've landed:
- #1999

See also:
- #262

I want to get some feedback from people on the design of the new `?_extra=` feature, before freezing it into Datasette 1.0.

The big change is that the default JSON representation is now MUCH slimmer - it only gives you keys for `""next""` and `""rows""`, where rows is a list of JSON objects (not a list of arrays as was previously the default) - for example https://latest.datasette.io/fixtures/sortable.json

If you want extra stuff you can ask for it with the new `?_extra=` parameter - e.g. https://latest.datasette.io/fixtures/sortable.json?_extra=columns&_extra=suggested_facets

You can use `?_extra=extras` to see a list of available extras: https://latest.datasette.io/fixtures/sortable.json?_extra=extras",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2042/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1633077183,I_kwDOBm6k_c5hVse_,2041,Remove obsolete table POST code,9599,simonw,closed,0,,,8755003,Datasette 1.0a-next,2,2023-03-21T01:01:40Z,2023-03-21T01:17:44Z,2023-03-21T01:17:43Z,OWNER,,"Spotted this in:
- #1999

`POST /db/table` currently executes obsolete code for inserting a row - I replaced that with `/db/table/-/insert` in https://github.com/simonw/datasette/commit/6e788b49edf4f842c0817f006eb9d865778eea5e but forgot to remove the old code.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2041/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1622640374,I_kwDOCGYnMM5gt4b2,534, ResourceWarning: unclosed file,1244826,djhenderson,closed,0,,,,,1,2023-03-14T03:02:18Z,2023-05-08T19:56:29Z,2023-05-08T19:56:29Z,NONE,,"Issuing either

```
py -Wdefault -m sqlite_utils insert dogs.db dogs dogs0.csv --csv
[#############-----------------------] 36%
[####################################] 100%C:\Users\Doug\AppData\Local\Programs\Python\Python311\Lib\site-packages\sqlite_utils\cli.py:1187: ResourceWarning: unclosed file <_io.TextIOWrapper name='dogs0.csv' encoding='utf-8-sig'>
  insert_upsert_implementation(
ResourceWarning: Enable tracemalloc to get the object allocation traceback
```

or

```
set pythonwarnings=default
sqlite-utils insert dogs.db dogs dogs0.csv --csv
[#############-----------------------] 36%
[####################################] 100%C:\Users\Doug\AppData\Local\Programs\Python\Python311\Lib\site-packages\sqlite_utils\cli.py:1187: ResourceWarning: unclosed file <_io.TextIOWrapper name='dogs0.csv' encoding='utf-8-sig'>
  insert_upsert_implementation(
ResourceWarning: Enable tracemalloc to get the object allocation traceback
```

exhibits a ResourceWarning indicating that the CSV file being loaded is not closed.
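The mechanics of the warning are easy to reproduce outside sqlite-utils; a minimal sketch of the leaking pattern versus the non-leaking one (the presumable fix would be closing the handle opened in `cli.py`, though the exact change is not shown here):

```python
import warnings

warnings.simplefilter('default', ResourceWarning)

# Warns: the file object is never explicitly closed, so it is only
# reported (and closed) when the object is garbage collected.
text = open('dogs0.csv', encoding='utf-8-sig').read()

# Silent: the context manager closes the handle deterministically.
with open('dogs0.csv', encoding='utf-8-sig') as fp:
    text = fp.read()
```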
sqlite-utils --version
sqlite-utils, version 3.30
py --version
Python 3.11.2
Windows Version 10.0.19045 Build 19045
SQLite version 3.41.0 2023-02-21 18:09:37",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/534/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1620516340,I_kwDOCGYnMM5glx30,533,ReadTheDocs error: not all arguments converted during string formatting,9599,simonw,closed,0,,,,,2,2023-03-12T21:21:05Z,2023-03-12T21:25:33Z,2023-03-12T21:25:33Z,OWNER,,"This came up as a failure running tests for:
- #531

Traceback on https://readthedocs.org/projects/sqlite-utils/builds/19749348/

```
  File ""/home/docs/checkouts/readthedocs.org/user_builds/sqlite-utils/envs/531/lib/python3.8/site-packages/docutils/parsers/rst/states.py"", line 889, in interpreted
    nodes, messages2 = role_fn(role, rawsource, text, lineno, self)
  File ""/home/docs/checkouts/readthedocs.org/user_builds/sqlite-utils/envs/531/lib/python3.8/site-packages/sphinx/ext/extlinks.py"", line 103, in role
    title = caption % part
TypeError: not all arguments converted during string formatting

Exception occurred:
  File ""/home/docs/checkouts/readthedocs.org/user_builds/sqlite-utils/envs/531/lib/python3.8/site-packages/sphinx/ext/extlinks.py"", line 103, in role
    title = caption % part
TypeError: not all arguments converted during string formatting
```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/533/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1620515757,I_kwDOBm6k_c5glxut,2039,Subtle bug with `--load-extension` and `--static` flags with absolute Windows paths with `C:\`,15178711,asg017,open,0,,,,,0,2023-03-12T21:18:52Z,2023-03-12T21:18:52Z,,CONTRIBUTOR,,"From the Datasette discord: A user tried running the following command on Windows:

```
datasette --load-extension=""C:\spatialite\mod_spatialite-5.0.1-win-x86\mod_spatialite.dll""
```

This failed with `""The specified module could not be found""`, because the entrypoint option introduced in #1789 splits the input differently. Instead of loading the extension found at `""C:\spatialite\mod_spatialite-5.0.1-win-x86\mod_spatialite.dll""`, it instead tried to load the extension at `""C""` with entrypoint `""\spatialite\mod_spatialite-5.0.1-win-x86\mod_spatialite.dll""`.

This is hard because most absolute Windows paths have a colon in them, like `C:\foo.txt` or `D:\bar.txt`. I'd imagine the `--static` flag is also vulnerable to this type of bug.

The ""solution"" is to use a relative path instead, but that doesn't feel that great.
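One possible direction, sketched with a hypothetical helper (this is not how Datasette currently parses the flag): treat a leading drive prefix such as `C:\` or `C:/` as part of the path before looking for an entrypoint separator.

```python
import re

def split_extension_option(value):
    # Hypothetical parser for the path[:entrypoint] syntax that leaves
    # Windows drive prefixes like 'C:\' intact instead of splitting on them.
    if re.match(r'^[A-Za-z]:[/\\]', value):
        drive, rest = value[:2], value[2:]
    else:
        drive, rest = '', value
    if ':' in rest:
        path, entrypoint = rest.split(':', 1)
        return drive + path, entrypoint
    return drive + rest, None

assert split_extension_option(r'C:\spatialite\mod_spatialite.dll') == (
    r'C:\spatialite\mod_spatialite.dll', None)
assert split_extension_option('ext.so:init_fn') == ('ext.so', 'init_fn')
```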
",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2039/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1620254998,I_kwDOCGYnMM5gkyEW,532,Show more information when JSON can't be imported with sqlite-utils insert,83080728,voltagex,closed,0,,,,,2,2023-03-12T06:41:44Z,2023-05-08T20:32:16Z,2023-05-08T20:32:02Z,NONE,,"I am currently trying to import the [JSON export of my data from Discord](https://support.discord.com/hc/en-us/articles/360004027692-Requesting-a-Copy-of-your-Data), specifically `activity/reporting/events-*.json` ``` sqlite-utils.exe insert test.db reporting events-2023-00000-of-00001.json [###################################-] 99% 00:00:00 Error: Invalid JSON - use --csv for CSV or --tsv for TSV files ``` Please show more information as to *why* this is invalid, if possible. I am using version 3.30 with Python 3.10 on Windows 11.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/532/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1618249044,I_kwDOBm6k_c5gdIVU,2038,Consider a `strict_templates` setting,9599,simonw,open,0,,,,,2,2023-03-10T02:09:13Z,2023-03-10T02:11:06Z,,OWNER,,"A setting which turns on Jinja strict mode, so any templates that access undefined variables raise a hard error. Prototype here: ```diff diff --git a/datasette/app.py b/datasette/app.py index 40416713..1428a3f0 100644 --- a/datasette/app.py +++ b/datasette/app.py @@ -200,6 +200,7 @@ SETTINGS = ( ""Allow display of SQL trace debug information with ?_trace=1"", ), Setting(""base_url"", ""/"", ""Datasette URLs should use this base path""), + Setting(""strict_templates"", False, ""Raise errors for undefined template variables""), ) _HASH_URLS_REMOVED = ""The hash_urls setting has been removed, try the datasette-hashed-urls plugin instead"" OBSOLETE_SETTINGS = { @@ -399,11 +400,14 @@ class Datasette: ), ] ) + env_extras = {} + if self.setting(""strict_templates""): + env_extras[""undefined""] = StrictUndefined self.jinja_env = Environment( loader=template_loader, autoescape=True, enable_async=True, - undefined=StrictUndefined, + **env_extras, ) self.jinja_env.filters[""escape_css_string""] = escape_css_string self.jinja_env.filters[""quote_plus""] = urllib.parse.quote_plus ``` Explored this idea a bit in: - #1999",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2038/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1618130434,I_kwDOJHON9s5gcrYC,11,Implement a SQL view to make it easier to query files in a nested folder,9599,simonw,open,0,,,,,3,2023-03-09T23:19:28Z,2023-03-09T23:24:01Z,,MEMBER,,"Working with nested data in SQL is tricky, can I make it easier with a view or canned query?",611552758,apple-notes-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/11/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1617962395,I_kwDOJHON9s5gcCWb,10,Include schema in README,9599,simonw,closed,0,,,,,0,2023-03-09T20:38:59Z,2023-03-09T20:48:18Z,2023-03-09T20:48:18Z,MEMBER,,As seen in other tools like 
https://github.com/simonw/git-history,611552758,apple-notes-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/10/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1617938730,I_kwDOJHON9s5gb8kq,9,"Default to just storing plaintext, store HTML if `--html` is passed",9599,simonw,open,0,,,,,0,2023-03-09T20:19:06Z,2023-03-09T20:19:06Z,,MEMBER,,"The full `body` version of the notes can get HUGE, due to embedded images. It turns out for my own purposes I'm usually happy with just the `plaintext` version. I'm tempted to say you don't get HTML unless you pass a `--html` option.",611552758,apple-notes-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/9/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1617823309,I_kwDOJHON9s5gbgZN,8,Increase performance using macnotesapp,41546558,RhetTbull,closed,0,,,,,1,2023-03-09T18:51:05Z,2023-03-14T22:00:22Z,2023-03-14T22:00:21Z,NONE,,"Neat project! You can probably increase performance using my python interface to Notes, [macnotesapp](https://github.com/RhetTbull/macnotesapp), which uses Scripting Bridge and bulk queries for much better performance than AppleScript. Another related project is [PyXA](https://github.com/SKaplanOfficial/PyXA) which uses Scripting Bridge to access Notes (and many other apps) and can return all the notes at once as opposed to calling AppleScript for each note. macnotesapp allows you to access multiple accounts and folders as well. ```python from macnotesapp import NotesApp # NotesApp() provides interface to Notes.app notesapp = NotesApp() # Get list of notes (Note objects for each note) notes = notesapp.notes() note = notes[0] print( note.id, note.account, note.folder, note.name, note.body, note.plaintext, note.password_protected, ) print(note.asdict()) ```",611552758,apple-notes-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/8/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1617769847,I_kwDOJHON9s5gbTV3,7,Folder support,9599,simonw,closed,0,,,,,6,2023-03-09T18:21:33Z,2023-03-09T20:48:18Z,2023-03-09T20:48:18Z,MEMBER,,Notes can live in folders. 
These relationships should be exported too.,611552758,apple-notes-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/7/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1617602868,I_kwDOJHON9s5gaqk0,6,Character encoding problem,9599,simonw,open,0,,,,,2,2023-03-09T16:44:34Z,2023-04-14T15:22:09Z,,MEMBER,,"I ran against a recent note with this in it: > Or just ""Actions ⚙️ "" And got back: > `Actions ‚öôÔ∏è` Pasting that into https://ftfy.vercel.app/?s=Actions+%E2%80%9A%C3%B6%C3%B4%C3%94%E2%88%8F%C3%A8+ gives this: ```python s = 'Actions â\x80\x9aöôÃ\x94â\x88\x8fè' s = s.encode('latin-1') s = s.decode('utf-8') s = s.encode('macroman') s = s.decode('utf-8') print(s) ``` ",611552758,apple-notes-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/6/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1616440856,I_kwDOJHON9s5gWO4Y,5,Configure full text search,9599,simonw,open,0,,,,,0,2023-03-09T05:20:46Z,2023-03-09T05:20:46Z,,MEMBER,,"FTS would be useful. Maybe even extract the plain text from the notes to make that index easier to create, rather than creating it against the HTML. Can use the `plaintext` property for that.",611552758,apple-notes-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/5/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1616429236,I_kwDOJHON9s5gWMC0,4,Support incremental updates,9599,simonw,open,0,,,,,2,2023-03-09T05:14:00Z,2023-03-09T18:20:56Z,,MEMBER,,"Running this script can take several hours against a large notes database. Would be neat if it could run against just the notes that have been modified since it last ran. Could pull the max `updated` date and then keep on looping until it finds one modified before then. Problem is I don't actually know what order it iterates over the notes in.",611552758,apple-notes-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/4/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1616422013,I_kwDOJHON9s5gWKR9,3,`apple-notes-to-sqlite --dump` option,9599,simonw,closed,0,,,,,0,2023-03-09T05:05:49Z,2023-03-09T05:06:14Z,2023-03-09T05:06:14Z,MEMBER,,"Option that doesn't write to the database at all, it just outputs all the notes to stdout as newline-delimited JSON.",611552758,apple-notes-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/3/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1616354999,I_kwDOJHON9s5gV563,2,First working version,9599,simonw,closed,0,,,,,7,2023-03-09T03:53:00Z,2023-03-09T05:10:22Z,2023-03-09T05:10:22Z,MEMBER,,"It's going to shell out to `osascript` as seen in: - #1 I'm going with that option because https://appscript.sourceforge.io/status.html warns against the other potential methods: > Apple eliminated its Mac Automation department in 2016. The future of AppleScript and its related technologies is unclear. Caveat emptor. 
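For reference, the shape of the shelling-out approach is roughly this (a minimal sketch, not the final implementation):

```python
import subprocess

def run_applescript(script):
    # osascript -e runs the given AppleScript source and prints the result
    result = subprocess.run(
        ['osascript', '-e', script], capture_output=True, text=True, check=True
    )
    return result.stdout.strip()

print(run_applescript('tell application ""Notes"" to get the count of notes'))
```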
But `osascript` looks pretty stable to me.",611552758,apple-notes-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/2/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1616347574,I_kwDOJHON9s5gV4G2,1,Initial proof of concept with ChatGPT,9599,simonw,closed,0,,,,,3,2023-03-09T03:44:39Z,2023-03-09T03:51:55Z,2023-03-09T03:51:55Z,MEMBER,,I'm using ChatGPT to figure out enough AppleScript to get at my notes data.,611552758,apple-notes-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/1/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1615891776,I_kwDOBm6k_c5gUI1A,2037,Test failure: FAILED tests/test_cli.py::test_install_requirements - FileNotFoundError,9599,simonw,closed,0,,,,,3,2023-03-08T20:30:06Z,2023-03-09T22:33:39Z,2023-03-09T22:33:39Z,OWNER,,"> FAILED tests/test_cli.py::test_install_requirements - FileNotFoundError: [Errno 2] No such file or directory From https://github.com/simonw/datasette/actions/runs/4348548218/jobs/7597208191 ``` =================================== FAILURES =================================== __________________________ test_install_requirements ___________________________ run_module = @mock.patch(""datasette.cli.run_module"") def test_install_requirements(run_module): runner = CliRunner() > with runner.isolated_filesystem(): /home/runner/work/datasette/datasette/tests/test_cli.py:184: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /opt/hostedtoolcache/Python/3.9.16/x64/lib/python3.9/contextlib.py:119: in __enter__ return next(self.gen) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , temp_dir = None @contextlib.contextmanager def isolated_filesystem( self, temp_dir: t.Optional[t.Union[str, os.PathLike]] = None ) -> t.Iterator[str]: """"""A context manager that creates a temporary directory and changes the current working directory to it. This isolates tests that affect the contents of the CWD to prevent them from interfering with each other. :param temp_dir: Create the temporary directory under this directory. If given, the created directory is not removed when exiting. .. versionchanged:: 8.0 Added the ``temp_dir`` parameter. 
"""""" > cwd = os.getcwd() E FileNotFoundError: [Errno 2] No such file or directory /opt/hostedtoolcache/Python/3.9.16/x64/lib/python3.9/site-packages/click/testing.py:466: FileNotFoundError ``` Not sure why it only affected the ""[Calculate test coverage](https://github.com/simonw/datasette/actions/workflows/test-coverage.yml)"" one.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2037/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1615862295,I_kwDOBm6k_c5gUBoX,2036,"`publish cloudrun` reuses image tags, which can lead to very surprising deploy problems",9599,simonw,closed,0,,,,,6,2023-03-08T20:11:44Z,2023-03-08T20:57:34Z,2023-03-08T20:57:34Z,OWNER,,"See this issue: - https://github.com/simonw/datasette.io/issues/141",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2036/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1615692818,I_kwDOBm6k_c5gTYQS,2035,Potential feature: special support for `?a=1&a=2` on the query page,9599,simonw,open,0,,,3268330,Datasette 1.0,14,2023-03-08T18:05:03Z,2023-03-31T16:09:08Z,,OWNER,,"From a discussion on Discord: https://discord.com/channels/823971286308356157/996877076982415491/1082789517062320138 The key idea is to make it easier for people to implement `where id in (...)` that's populated from query string arguments. What if you could add `?id=11&id=32&id=62` to the URL and have that made available as a list that can be used in the query?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2035/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1612296210,I_kwDOBm6k_c5gGbAS,2033,`datasette install -r requirements.txt`,9599,simonw,closed,0,,,,,2,2023-03-06T22:17:17Z,2023-03-06T22:54:52Z,2023-03-06T22:27:34Z,OWNER,,"Would be useful for cases where you want to install a whole set of plugins in one go, e.g. when running tutorials in GitHub Codespaces.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2033/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1605959201,I_kwDOBm6k_c5fuP4h,2032,datasette errors when foreign key integrity is enabled,193185,cldellow,open,0,,,,,0,2023-03-02T01:27:51Z,2023-03-02T01:31:58Z,,CONTRIBUTOR,,"By default, [SQLite does not enforce foreign key constraints](https://www.sqlite.org/foreignkeys.html#fk_enable). I typically enable these checks by running: ```sql PRAGMA foreign_keys = ON; ``` inside of a `prepare_connection` hook. 
If a plugin causes the schema to change (eg datasette-scraper creating a new table, or datasette-edit-schema changing a column), then https://github.com/simonw/datasette/blob/0b4a28691468b5c758df74fa1d72a823813c96bf/datasette/utils/internal_db.py#L71-L77 will fail with: ``` FOREIGN KEY constraint failed ``` This could be resolved by either: - deleting from the `tables` table last - changing the schema so that the foreign keys have [ON DELETE CASCADE](https://www.sqlite.org/foreignkeys.html#fk_actions) Let me know if you'd be open to a PR that addresses this -- since foreign key constraints aren't enabled by default, I guess it's questionable whether this is a bug. I think I can work around this by inspecting the database parameter in `prepare_connection` and trying not to enable fkey checks on the `_internal` database.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2032/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1595340692,I_kwDOCGYnMM5fFveU,530,"add ability to configure ""on delete"" and ""on update"" attributes of foreign keys:",536941,fgregg,open,0,,,,,2,2023-02-22T15:44:14Z,2023-05-08T20:39:01Z,,CONTRIBUTOR,,"sqlite supports these, and it would be quite nice to be able to add them with sqlite-utils. https://www.sqlite.org/foreignkeys.html#fk_actions",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/530/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1594383280,I_kwDOBm6k_c5fCFuw,2030,How to use Datasette with apache webserver on GCP?,19700859,gk7279,closed,0,,,,,2,2023-02-22T03:08:49Z,2023-02-22T21:54:39Z,2023-02-22T21:54:39Z,NONE,,"Hi Simon and Datasette team- I have installed apache2 webserver inside GCP VM using apt. I can see my ""Hello World"" index.html if I use the external IP of this GCP in a browser. However, when I try to run datasette with different combinations of -h and -p, I am still unable to access the webpage. I cannot install Docker on this VM. Any pointers to use datasette with already existing apache2 webserver on GCP is appreciated. Thanks.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2030/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1592327343,I_kwDOBm6k_c5e6Pyv,2029,"Sorry Simon, didn't know how else to contact you",5804626,llchristopherson,open,0,,,,,0,2023-02-20T19:02:53Z,2023-02-20T19:02:53Z,,NONE,,"Hi Simon, Would you be willing to chat with me about Datasette? I have some questions. I am working on a project to evaluate data ingestion tools for a research organization and I ran across Datasette. I have looked through a lot of your documentation, but still have some questions, which are very specific. If you would be willing to write me back about this, my email is laura@renci.org. 
Thanks, Laura",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2029/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1590183272,I_kwDOBm6k_c5eyEVo,2027,"How to redirect from ""/"" to a specific db/table",1350673,dmick,open,0,,,,,4,2023-02-18T03:14:01Z,2023-03-08T04:42:22Z,,NONE,,"Using nginx to redirect public IP to the local uvicorn server as 'normal'. I can't figure out how to redirect such that '/' results in accessing the one db/table I want to serve; redirecting / to /db/table breaks some of the CSS; fooling with base_url doesn't seem to help. Can someone explain this, if it's possible?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2027/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1581090327,I_kwDOCGYnMM5ePYYX,529,Microsoft line endings,7908073,chapmanjacobd,closed,0,,,,,1,2023-02-12T02:20:48Z,2023-06-14T23:12:12Z,2023-06-14T23:11:47Z,CONTRIBUTOR,,"sqlite-utils prints `\r\n` but [it should probably](https://devblogs.microsoft.com/commandline/extended-eol-in-notepad/) print `\n` (unless the platform is detected as Windows?) It has tripped me up a few times when piping the output of sqlite-utils to other programs: ``` $ sqlite-utils --no-headers --csv ~/lb/fs/d.db 'select path from media limit 1' | cat -A /mnt/d7/file^M$ $ sqlite-utils --no-headers --csv ~/lb/fs/d.db 'select path from media limit 1' | tr -d '\r' | cat -A /mnt/d7/file$ ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/529/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1579973223,I_kwDOBm6k_c5eLHpn,2024,Mention WAL mode in documentation,9599,simonw,open,0,,,,,1,2023-02-10T16:11:10Z,2023-02-10T16:11:53Z,,OWNER,,It's not currently obvious from the docs how you can ensure that Datasette runs well in situations where other processes may update the underlying SQLite files.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2024/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1579695809,I_kwDOBm6k_c5eKD7B,2023,Error: Invalid setting 'hash_urls' in settings.json in 0.64.1,80409402,mlaparie,closed,0,,,,,2,2023-02-10T13:35:01Z,2023-02-10T15:40:00Z,2023-02-10T15:39:59Z,NONE,,"On a Debian machine, using datasette 0.64.1 installed with `pip3`, I am getting a `datasette[114272]: Error: Invalid setting 'hash_urls' in settings.json` in `journalctl -xe`. The same settings work on 0.54.1 on another Debian server. This is my `settings.json`: ```json { ""default_page_size"": 200, ""max_returned_rows"": 8000, ""num_sql_threads"": 3, ""sql_time_limit_ms"": 1000, ""default_facet_size"": 30, ""facet_time_limit_ms"": 200, ""facet_suggest_time_limit_ms"": 50, ""hash_urls"": false, ""allow_facet"": true, ""allow_download"": true, ""suggest_facets"": true, ""default_cache_ttl"": 5, ""default_cache_ttl_hashed"": 31536000, ""cache_size_kb"": 0, ""allow_csv_stream"": true, ""max_csv_mb"": 100, ""truncate_cells_html"": 2048, ""force_https_urls"": false, ""template_debug"": false, ""base_url"": ""/pclim/db/"" } ``` This looks ok to me. 
Would you have any ideas?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2023/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1578790070,I_kwDOCGYnMM5eGmy2,527,`Table.convert()` skips falsey values,167893,mcarpenter,closed,0,,,,,5,2023-02-10T00:00:52Z,2023-05-09T21:15:05Z,2023-05-08T21:03:24Z,CONTRIBUTOR,,"# Summary By design, `Table.convert()` does [not attempt](https://github.com/simonw/sqlite-utils/blob/fc221f9b62ed8624b1d2098e564f525c84497969/sqlite_utils/db.py#L2663) conversion of falsey values (`None`, `""""`, `0`, ...). This is surprising (directly contradicts the docstring) and `convert()` may quietly skip cells where the user assumed a conversion would take place. # Example Increment a column of integers by one ``` python from sqlite_utils import Database db = Database(memory=True) table = db['table'] col = 'x' table.insert_all([{col: 0}, {col:1}]) print(table.get(1)) # 0 print(table.get(2)) # 1 print() table.convert(col, lambda x: x+1) print(table.get(1)) # got 0, expected 1 ⚠⚠⚠ print(table.get(2)) # got 2, expected 2 ``` Another example might be, say, transforming cells containing empty string to `NULL`. # Discussion This was, I think, a pragmatic choice so that consumers can skip writing guard clauses for these falsey values (particularly from the CLI). But this surprising undocumented behavior can lead to incorrect data. I don't think this is a good trade-off between convenience and correctness. In the absence of this convenience users will either have to write guard clauses into their conversion expressions (or adapt the called function to do the same), so: ``` python fn(value) if value else value ``` instead of: ``` python fn(value) ``` This is more typing and sometimes I will forget, and there will be errors. (But they will be noisy errors, which is a good thing). Such a change will certainly inconvenience some existing consumers; there will be some breakage. But I think this is worth it to avoid quietly not converting some values by default, which can lead to quietly bad data. I have a PR that I will attach, please take a look and see what you think.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/527/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1578609658,I_kwDOBm6k_c5eF6v6,2022,Error 500 - not clear the cause,1667631,DavidPratten,closed,0,,,,,1,2023-02-09T20:57:17Z,2023-02-09T21:13:50Z,2023-02-09T21:13:50Z,NONE,,"On the database that I have sent via linkedIn, datasette works great, but the following URL gives a 500 error. http://127.0.0.1:8001/literature/authors_papers?authorId=100550354 The cause of the error is not apparent. Is this expected behaviour? David",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2022/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1577548579,I_kwDOBm6k_c5eB3sj,2021,Docker images for 1.0 alphas?,1563881,meowcat,open,0,,,,,0,2023-02-09T09:35:52Z,2023-02-09T09:35:52Z,,NONE,,"Hi, would you consider putting 1.0alpha images on Dockerhub? 
(Also, how usable are the alphas?)",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2021/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1575880841,I_kwDOBm6k_c5d7giJ,2020,"Documentation refers to ""off"" setting; doesn't seem to work, ""false"" does",1350673,dmick,open,0,,,,,0,2023-02-08T10:38:10Z,2023-02-08T10:38:10Z,,NONE,,"https://docs.datasette.io/en/stable/settings.html#suggest-facets, among others, suggests using ""off"" to disable the setting; however, this doesn't appear to work in the JSON config files, where it apparently needs to be a ""JSON boolean"" and have the values ""true"" or ""false"". Perhaps the Python code is more flexible?...but either way, the documentation probably should mention it.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2020/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1575131737,I_kwDOCGYnMM5d4ppZ,525,Repeated calls to `Table.convert()` fail,167893,mcarpenter,closed,0,,,,,4,2023-02-07T22:40:47Z,2023-05-08T21:59:41Z,2023-05-08T21:54:02Z,CONTRIBUTOR,,"## Summary When using the API, repeated calls to `Table.convert()` do not work correctly since all conversions quietly use the callable (function, lambda) from the first call to `convert()` only. Subsequent invocations with different callables use the callable from the first invocation only. ## Example ```python from sqlite_utils import Database db = Database(memory=True) table = db['table'] col = 'x' table.insert_all([{col: 1}]) print(table.get(1)) table.convert(col, lambda x: x*2) print(table.get(1)) def zeroize(x): return 0 #zeroize = lambda x: 0 #zeroize.__name__ = 'zeroize' table.convert(col, zeroize) print(table.get(1)) ``` Output: ``` {'x': 1} {'x': 2} {'x': 4} ``` Expected: ``` {'x': 1} {'x': 2} {'x': 0} ``` ## Explanation This is some relevant [documentation](https://github.com/simonw/sqlite-utils/blob/1491b66dd7439dd87cd5cd4c4684f46eb3c5751b/docs/python-api.rst#registering-custom-sql-functions:~:text=By%20default%20registering%20a%20function%20with%20the%20same%20name%20and%20number%20of%20arguments%20will%20have%20no%20effect). * `Table.convert()` takes a `Callable` to perform data conversion on a column * The `Callable` is passed to `Database.register_function()` * `Database.register_function()` uses the callable's `__name__` attribute for registration * (Aside: all lambdas have a `__name__` of ``: I thought this was the problem, and it was close, but not quite) * However `convert()` first wraps the callable by local function [`convert_value()`](https://github.com/simonw/sqlite-utils/blob/fc221f9b62ed8624b1d2098e564f525c84497969/sqlite_utils/db.py#L2661) * Consequently `register_function()` sees name `convert_value` for all invocations from `convert()` * `register_function()` silently ignores registrations using the same name, retaining only the first such registration There's a mismatch between the comments and the code: https://github.com/simonw/sqlite-utils/blob/fc221f9b62ed8624b1d2098e564f525c84497969/sqlite_utils/db.py#L404 but actually the existing function is returned/used instead (as the ""registering custom sql functions"" doc I linked above says too). Seems like this can be rectified to match the comment? ## Suggested fix I think there are four things: 1. 
The call to `register_function()` from `convert()` should have an explicit `name=` parameter (to continue using `convert_value()` and the progress bar). 2. For functions, this name can be the real function name. (I understand the sqlite api needs a name, and it's nice if those are recognizable names where possible). For lambdas would `'lambda-{uuid}'` or similar be acceptable? 3. `register_function()` really should throw an error on repeated attempts to register a duplicate (function, arity)-pair. 4. A test? I haven't looked at the test framework here but seems this should be testable. ## See also - #458 ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/525/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1573424830,I_kwDOBm6k_c5dyI6-,2019,Refactor out the keyset pagination code,9599,simonw,open,0,,,,,14,2023-02-06T23:04:00Z,2023-02-08T01:40:46Z,,OWNER,,"While working on: - #1999 I noticed that some of the most complex code in the existing table view is the code that implements keyset pagination: https://github.com/simonw/datasette/blob/0b4a28691468b5c758df74fa1d72a823813c96bf/datasette/views/table.py#L417-L493 Extracting that into a utility function would simplify that code a lot.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2019/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1572766460,I_kwDOCGYnMM5dvoL8,524,Transformation type `--type DATETIME`,21095447,4l1fe,closed,0,,,,,15,2023-02-06T15:18:42Z,2023-02-15T12:10:54Z,2023-02-15T12:10:54Z,NONE,,"Hey. Currently I do transformations with the type `--type TEXT`, but I noticed using the SQLAlchemy-based library [dataset](https://github.com/pudo/dataset) that reading and writing differ depending on the column types `TEXT`, `DATETIME`. Is it possible to alter a column type to `DATETIME` somehow using sqlite-utils?",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/524/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1571711808,I_kwDOBm6k_c5drmtA,2018,`check_visibility` gives confusing (wrong?) results if permission is `None`,193185,cldellow,open,0,,,,,0,2023-02-06T01:03:08Z,2023-02-06T01:03:46Z,,CONTRIBUTOR,,"I'm trying to gate access to an edit UI on the user having `update-row` on the underlying view or table. I expected [datasette.check_visibility](https://docs.datasette.io/en/latest/internals.html#await-check-visibility-actor-action-none-resource-none-permissions-none) to be a good way to do this: ```python visible, private = await datasette.check_visibility( request.actor, permissions=[ (""update-row"", (database, table)), ], ) if not visible: return None ``` But `visible` is returning true, even when there is no explicit `update-row` permission. (In this case, `request.actor` is `None`.) Based on [the update-row permissions docs](https://docs.datasette.io/en/latest/authentication.html#update-row), I expected this to be default deny, and so no explicit permission would result in false. I think the root cause is that `check_visibility` calls `ensure_permissions` and expects it to throw if the permission is not available. 
But `ensure_permissions` does not throw when `permission_allowed` returns None: https://github.com/simonw/datasette/blob/1.0a2/datasette/app.py#L825-L829",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2018/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1571207083,I_kwDOBm6k_c5dprer,2016,Database metadata fields like description are not available in the index page template's context,9993,palewire,open,0,,,3268330,Datasette 1.0,1,2023-02-05T02:25:53Z,2023-02-05T22:56:43Z,,NONE,,"When looping through `databases` in the index.html template, I'd like to print the description of each database alongside its name. But it appears that isn't passed in from the view, unless I'm missing it. It would be great to have that.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2016/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1570375808,I_kwDODFdgUs5dmgiA,79,Deploy demo job is failing due to rate limit,9599,simonw,open,0,,,,,2,2023-02-03T20:05:01Z,2023-12-08T14:50:15Z,,MEMBER,,https://github.com/dogsheep/github-to-sqlite/actions/runs/4080058087/jobs/7032116511,207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/79/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1565179870,I_kwDOBm6k_c5dSr_e,2013,Datasette uses non-standard quoting for identifiers,193185,cldellow,open,0,,,,,0,2023-02-01T00:05:39Z,2023-02-01T00:06:30Z,,CONTRIBUTOR,,"Related to #2001, but where #2001 was about literals, this is about identifiers From https://www.sqlite.org/lang_keywords.html: > ""keyword"" A keyword in double-quotes is an identifier. > [keyword] A keyword enclosed in square brackets is an identifier. This is not standard SQL. This quoting mechanism is used by MS Access and SQL Server and is included in SQLite for compatibility. Datasette uses this quoting here -- https://github.com/simonw/datasette/blob/0b4a28691468b5c758df74fa1d72a823813c96bf/datasette/utils/__init__.py#L345-L349, in some of the other DB access code, and in some of the test fixtures. 
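For comparison, a standard-conforming identifier escaper is a small function (a sketch, not Datasette's current implementation):

```python
def escape_identifier(name):
    # Wrap the name in double quotes, doubling any embedded double quotes,
    # which is the escaping rule standard SQL specifies
    return '""{}""'.format(name.replace('""', '""""'))
```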
Migrating to standard double quote identifiers would make it easier to get Datasette working with alternative backends.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2013/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1564774831,I_kwDOBm6k_c5dRJGv,2012,Missing space in database summary,9599,simonw,open,0,,,,,0,2023-01-31T18:01:13Z,2023-01-31T18:01:13Z,,OWNER,,"Spotted this on an instance index page: ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2012/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1564769997,I_kwDOBm6k_c5dRH7N,2011,"Applied facet did not result in an ""x"" icon to dismiss it",9599,simonw,open,0,,,,,1,2023-01-31T17:57:44Z,2023-01-31T17:58:54Z,,OWNER,,"![CleanShot 2023-01-31 at 09 55 56@2x](https://user-images.githubusercontent.com/9599/215843684-1761a230-d490-4f87-be6d-186319366794.png) That's against this data https://data.sfgov.org/City-Management-and-Ethics/Supplier-Contracts/cqi5-hm2d imported using https://datasette.io/plugins/datasette-socrata It's for `Contract Type` of `Non-Purchasing Contract (Rents, etc.)` - so possible that some of the spaces or punctuation in either the name or the value tripped up the code that decides if the X icon should be displayed.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2011/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1563264257,I_kwDOBm6k_c5dLYUB,2010,Row page should default to card view,9599,simonw,open,0,,,3268330,Datasette 1.0,1,2023-01-30T21:49:37Z,2023-01-30T21:52:06Z,,OWNER,,"Datasette currently uses the same table layout on the row pages as it does on the table pages: https://datasette.io/content/pypi_packages?_sort=name&name__exact=datasette-column-inspect https://datasette.io/content/pypi_packages/datasette-column-inspect If you shrink down to mobile width you get this instead, on both of those pages: I think that view, which I think of as the ""card view"", is plain better if you're looking at just a single row - and it (or a variant of it) should be the default presentation on the row page. 
",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2010/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1560662739,I_kwDOBm6k_c5dBdLT,2007,`render_cell()` hook should take an optional `request` argument,9599,simonw,closed,0,,,,,1,2023-01-28T03:13:00Z,2023-08-09T17:15:03Z,2023-01-28T03:34:26Z,OWNER,,From Discord: https://discordapp.com/channels/823971286308356157/996877076982415491/1068227071156965486,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2007/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1560651350,I_kwDOCGYnMM5dBaZW,523,Feature request: trim all leading and trailing white space for all columns for all tables in a database,536941,fgregg,open,0,,,,,1,2023-01-28T02:40:10Z,2023-01-28T02:41:14Z,,CONTRIBUTOR,,"It's pretty common that i need to trim leading or trailing white space from lots of columns in a database a part of an initial ETL. I use the following recipe a lot, and it would be great to include this functionality into sqlite-utils `trimify.sql` ```sql select 'select group_concat(''update [' || name || '] set ['' || name || ''] = trim(['' || name || ''])'', ''; '') || ''; '' as sql_to_run from pragma_table_info('''||name||''');' from sqlite_schema; ``` then something like: ```bash sqlite3 example.db < scripts/trimify.sql > table_trim.sql && \ sqlite3 $example.db < table_trim.sql > trim.sql && \ sqlite3 $example.db < trim.sql ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/523/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1558644003,I_kwDOBm6k_c5c5wUj,2006,Teach `datasette publish` to pin to `datasette<1.0` in a 0.x release,9599,simonw,open,0,,,3268330,Datasette 1.0,2,2023-01-26T19:17:40Z,2023-01-26T19:20:53Z,,OWNER,,"I just realized that when I ship Datasette 1.0 there may be automated deployments out there which could deploy the 1.0 version by accident, potentially breaking any customizations that aren't compatible with the 1.0 changes. 
I can hopefully help avoid that by shipping one last entry in the `0.x` series that ensures `datasette publish` pins to `<1.0` when it installs Datasette itself.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2006/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1557599877,I_kwDODFE5qs5c1xaF,12,location history changes,14809320,gerardrbentley,open,0,,,,,0,2023-01-26T03:57:25Z,2023-01-26T03:57:25Z,,NONE,,"not sure if each download is unique, but I had to change some things to work with the takeout zip I made 2023-01-25 filename changed from ""Location History.json"" to ""Records.json"" `""timestampMs""` is not present, `""timestamp""` is roughly iso timestamp ```py def get_timestamp_ms(raw_timestamp): try: return datetime.datetime.strptime(raw_timestamp, ""%Y-%m-%dT%H:%M:%SZ"").timestamp() except ValueError: return datetime.datetime.strptime(raw_timestamp, ""%Y-%m-%dT%H:%M:%S.%fZ"").timestamp() def save_location_history(db, zf): location_history = json.load( zf.open(""Takeout/Location History/Records.json"") ) db[""location_history""].upsert_all( ( { ""id"": id_for_location_history(row), ""latitude"": row[""latitudeE7""] / 1e7, ""longitude"": row[""longitudeE7""] / 1e7, ""accuracy"": row[""accuracy""], ""timestampMs"": get_timestamp_ms(row[""timestamp""]), ""when"": row[""timestamp""], } for row in location_history[""locations""] ), pk=""id"", ) def id_for_location_history(row): # We want an ID that is unique but can be sorted by in # date order - so we use the isoformat date + the first # 6 characters of a hash of the JSON first_six = hashlib.sha1( json.dumps(row, separators=("","", "":""), sort_keys=True).encode(""utf8"") ).hexdigest()[:6] return ""{}-{}"".format( row['timestamp'], first_six, ) ``` example locations from mine ```json { ""latitudeE7"": 427220206, ""longitudeE7"": -923423972, ""accuracy"": 10, ""deviceTag"": -1312429967, ""deviceDesignation"": ""PRIMARY"", ""timestamp"": ""2019-01-08T23:31:50.867Z"" } ``` ```json { ""latitudeE7"": 427011317, ""longitudeE7"": -923448300, ""accuracy"": 5, ""deviceTag"": -1312429967, ""deviceDesignation"": ""PRIMARY"", ""timestamp"": ""2019-01-08T23:33:53Z"" }, ```",206649770,google-takeout-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/google-takeout-to-sqlite/issues/12/reactions"", ""total_count"": 2, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 2}",, 1557507274,I_kwDOBm6k_c5c1azK,2005,`extra_template_vars` should be OK to return `None`,9599,simonw,open,0,,,,,1,2023-01-26T01:40:45Z,2023-01-26T01:41:50Z,,OWNER,,"Got this exception and had to make sure it always returned `{}`: ``` File "".../python3.11/site-packages/datasette/app.py"", line 1049, in render_template assert isinstance(extra_vars, dict), ""extra_vars is of type {}"".format( AssertionError: extra_vars is of type ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2005/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1554032168,I_kwDOBm6k_c5coKYo,2002,Document how actors are displayed,9599,simonw,open,0,,,,,0,2023-01-24T00:08:49Z,2023-01-24T00:08:49Z,,OWNER,,"https://github.com/simonw/datasette/blob/e4ebef082de90db4e1b8527abc0d582b7ae0bc9d/datasette/utils/__init__.py#L1052-L1056 This logic should be reflected in the 
documentation on https://docs.datasette.io/en/stable/authentication.html#actors",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2002/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1553615704,I_kwDOBm6k_c5cmktY,2001,Datasette is not compatible with SQLite's strict quoting compilation option,406380,gwk,open,0,,,,,4,2023-01-23T19:10:07Z,2023-01-25T04:59:58Z,,NONE,,"I have linked Python3.11 on macOS against recent SQLite that was compiled using `-DSQLITE_DQS=0`. This option disables interpretation of double-quoted identifiers as string literals, described in the SQLite docs as a ""MySQL 3.x misfeature"". See https://www.sqlite.org/quirks.html#dblquote for background. Datasette uses the double-quote syntax in a number of key places, and is thus completely broken in this environment. My experience was to `pip install datasette`, then run `datasette serve -I my-data.db`. When I visit `http://127.0.0.1:8001` I get a 500 response. The error: `sqlite3.OperationalError: no such column: geometry_columns` The responsible SQL: `'select 1 from sqlite_master where tbl_name = ""geometry_columns""'` I then installed datasette from GitHub master in development mode and changed the offending SQL to use correct quotes: `""select 1 from sqlite_master where tbl_name = 'geometry_columns'""`. With this change, I get a little further, but have the same problem with the first table name in my database (in my case, ""Meta""): ``` OperationalError: no such column: Meta Traceback (most recent call last): File ""/Users/gwk/external/datasette/datasette/app.py"", line 1522, in route_path response = await view(request, send) ^^^^^^^^^^^^^^^^^^^^^^^^^ File ""/Users/gwk/external/datasette/datasette/views/base.py"", line 151, in view return await self.dispatch_request(request) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File ""/Users/gwk/external/datasette/datasette/views/base.py"", line 105, in dispatch_request response = await handler(request) ^^^^^^^^^^^^^^^^^^^^^^ File ""/Users/gwk/external/datasette/datasette/views/index.py"", line 70, in get ""fts_table"": await db.fts_table(table), ^^^^^^^^^^^^^^^^^^^^^^^^^ File ""/Users/gwk/external/datasette/datasette/database.py"", line 363, in fts_table return await self.execute_fn(lambda conn: detect_fts(conn, table)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File ""/Users/gwk/external/datasette/datasette/database.py"", line 213, in execute_fn return await asyncio.get_event_loop().run_in_executor( ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File ""/usr/local/py/Python.framework/Versions/3.11/lib/python3.11/concurrent/futures/thread.py"", line 58, in run result = self.fn(*self.args, **self.kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File ""/Users/gwk/external/datasette/datasette/database.py"", line 211, in in_thread return fn(conn) ^^^^^^^^ File ""/Users/gwk/external/datasette/datasette/database.py"", line 363, in return await self.execute_fn(lambda conn: detect_fts(conn, table)) ^^^^^^^^^^^^^^^^^^^^^^^ File ""/Users/gwk/external/datasette/datasette/utils/__init__.py"", line 588, in detect_fts rows = conn.execute(detect_fts_sql(table)).fetchall() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ sqlite3.OperationalError: no such column: Meta INFO: 127.0.0.1:50258 - ""GET / HTTP/1.1"" 500 Internal Server Error ``` I will try to continue playing with this, but I also hope that the datasette developers will enable this mode in a test 
environment as I am unlikely to be able to exercise all of the SQL in the codebase, or make a pull request very soon. Note that the DQS setting compile-time option can be overridden at runtime with calls to the C API: ``` sqlite3_db_config(db, SQLITE_DBCONFIG_DQS_DDL, 0, (void*)0); sqlite3_db_config(db, SQLITE_DBCONFIG_DQS_DML, 0, (void*)0); ``` As far as I can tell, `sqlite3_db_config` is not exposed in Python, but perhaps we could figure out how to invoke it using `ctypes`. ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2001/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1553425465,I_kwDOCGYnMM5cl2Q5,522,Add COLUMN_TYPE_MAPPING for timedelta,81377,maport,closed,0,,,,,0,2023-01-23T16:49:54Z,2023-11-04T00:49:51Z,2023-11-04T00:49:51Z,NONE,,"Currently trying to create a column with Python type `datetime.timedelta` results in an error: ``` >>> from sqlite_utils import Database >>> db = Database(""test.db"") >>> test_tbl = db['test'] >>> test_tbl.insert({'col1': datetime.timedelta()}) Traceback (most recent call last): File """", line 1, in File ""/usr/local/lib/python3.10/dist-packages/sqlite_utils/db.py"", line 2979, in insert return self.insert_all( File ""/usr/local/lib/python3.10/dist-packages/sqlite_utils/db.py"", line 3082, in insert_all self.create( File ""/usr/local/lib/python3.10/dist-packages/sqlite_utils/db.py"", line 1574, in create self.db.create_table( File ""/usr/local/lib/python3.10/dist-packages/sqlite_utils/db.py"", line 961, in create_table sql = self.create_table_sql( File ""/usr/local/lib/python3.10/dist-packages/sqlite_utils/db.py"", line 852, in create_table_sql column_type=COLUMN_TYPE_MAPPING[column_type], KeyError: ``` The reason this would be useful is that `MySQLdb` uses `timedelta` for MySQL `TIME` columns: ``` >>> import MySQLdb >>> conn = MySQLdb.connect(host='database', user='user', passwd='pw') >>> csr = conn.cursor() >>> csr.execute(""SELECT CAST('11:20' AS TIME)"") >>> tuple(csr) ((datetime.timedelta(seconds=40800),),) ``` So currently any attempt to convert a MySQL DB with a `TIME` column using `db-to-sqlite` will result in the above error. I was rather surprised that `MySQLdb` uses `timedelta` for `TIME` columns but I see that [this column type](https://dev.mysql.com/doc/refman/8.0/en/time.html) is intended for time intervals as well as the time of day so it makes sense. ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/522/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1552368054,I_kwDOBm6k_c5ch0G2,2000,rewrite_sql hook,193185,cldellow,open,0,,,,,1,2023-01-23T01:02:52Z,2023-01-23T06:08:01Z,,CONTRIBUTOR,,"I'm not sold that this is a good idea, but thought it'd be worth writing up a ticket. Proposal: add a hook like ```python def rewrite_sql(datasette, database, request, fn, sql, params) ``` It would be called from Database.execute, Database.execute_write, Database.execute_write_script, Database.execute_write_many before running the user's SQL. `fn` would indicate which method was being used, in case that's relevant for the SQL inspection -- for example `execute` only permits a single statement. The hook could return a SQL statement to be executed instead, or an async function to be awaited on that returned the SQL to be executed. 
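For concreteness, a plugin built on the proposed hook might look like this (hypothetical: the hook does not exist yet, and `audit_log` is an invented table name):

```python
from datasette import hookimpl

@hookimpl
def rewrite_sql(datasette, database, request, fn, sql, params):
    # Example policy: block single-statement queries that touch a sensitive table
    if fn == 'execute' and 'audit_log' in sql.lower():
        raise ValueError('queries against audit_log are not permitted')
    return sql
```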
Plugins that could be written with this hook: - https://github.com/cldellow/datasette-ersatz-table-valued-functions would use this to avoid monkey-patching - a plugin to inspect and reject unsafe Spatialite function calls (reported by [Simon in Discord](https://discord.com/channels/823971286308356157/823971286941302908/1066438832293159004)) - a plugin to do more general rewrites of queries to enforce table or row-level security, for example, based on the currently logged in actor's ID - a plugin to maintain audit tables when users write to a table - a plugin to cache expensive queries (eg the queries that drive facets) - these could allow stale reads if previously cached, then refresh them in an offline queue Flaws with this idea: `execute_fn` and `execute_write_fn` would not go through this hook, which limits the guarantees you can make about it for security purposes.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2000/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1551113681,I_kwDOBm6k_c5cdB3R,1998,`datasette --version` should also show the SQLite version,9599,simonw,open,0,,,,,2,2023-01-20T16:11:30Z,2023-01-20T18:19:06Z,,OWNER,,Idea came up here: https://discord.com/channels/823971286308356157/823971286941302908/1066026473003159783,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1998/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1550536442,I_kwDOCGYnMM5ca076,521,Custom JSON encoder,31504,janrito,open,0,,,,,0,2023-01-20T09:19:40Z,2023-01-20T09:19:40Z,,NONE,,"It would be nice if we could specify a custom encoder (and decoder) for types that will need extra deserialisation – e.g., sets, enums or sparse matrices – or even project-specific types",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/521/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1538197093,I_kwDOBm6k_c5brwZl,1995,foreign_keys error 500,137183,jonschoning,open,0,,,,,0,2023-01-18T15:27:36Z,2023-01-18T16:44:01Z,,NONE,,"**Error 500 expected string or bytes-like object** [espial-new.sqlite3.zip](https://github.com/simonw/datasette/files/10447965/espial-new.sqlite3.zip) run `datasette espial-new.sqlite3` & click on any table other than `User` ``` /home/jon/.local/lib/python3.10/site-packages/datasette/app.py:814 in │ │ expand_foreign_keys │ │ │ │ 811 │ │ │ from {other_table} │ │ 812 │ │ │ where {other_column} in ({placeholders}) │ │ 813 │ │ """""".format( │ │ ❱ 814 │ │ │ other_column=escape_sqlite(fk[""other_column""]), │ │ 815 │ │ │ label_column=escape_sqlite(label_column), │ │ 816 │ │ │ other_table=escape_sqlite(fk[""other_table""]), │ │ 817 │ │ │ placeholders="", "".join([""?""] * len(set(values))), │ │ │ │ ╭───────────────────────────── locals ──────────────────────────────╮ │ │ │ column = 'user_id' │ │ │ │ database = 'espial-new' │ │ │ │ db = │ │ │ │ fk = { │ │ │ │ │ 'column': 'user_id', │ │ │ │ │ 'other_table': 'user', │ │ │ │ │ 'other_column': None │ │ │ │ } │ │ │ │ foreign_keys = [ │ │ │ │ │ { │ │ │ │ │ │ 'column': 'user_id', │ │ │ │ │ │ 'other_table': 'user', │ │ │ │ │ │ 'other_column': None │ │ │ │ │ } │ │ │ │ ] │ │ │ │ label_column = 'name' │ │ │ │ labeled_fks = {} │ │ │ │ self = │ │ │ 
│ table = 'bookmark' │ │ │ │ values = [] │ │ │ ╰───────────────────────────────────────────────────────────────────╯ │ │ │ │ /home/jon/.local/lib/python3.10/site-packages/datasette/utils/__init__.py:346 │ │ in escape_sqlite │ │ │ │ 343 │ │ 344 │ │ 345 def escape_sqlite(s): │ │ ❱ 346 │ if _boring_keyword_re.match(s) and (s.lower() not in reserved_words) │ │ 347 │ │ return s │ │ 348 │ else: │ │ 349 │ │ return f""[{s}]"" │ │ │ │ ╭─ locals ─╮ │ │ │ s = None │ │ │ ╰──────────╯ │ ╰─────────────────────────────────────────────────────────────────────────────────╯ TypeError: expected string or bytes-like object Traceback (most recent call last): File ""/home/jon/.local/lib/python3.10/site-packages/datasette/app.py"", line 1354, in route_path response = await view(request, send) File ""/home/jon/.local/lib/python3.10/site-packages/datasette/views/base.py"", line 134, in view return await self.dispatch_request(request) File ""/home/jon/.local/lib/python3.10/site-packages/datasette/views/base.py"", line 91, in dispatch_request return await handler(request) File ""/home/jon/.local/lib/python3.10/site-packages/datasette/views/base.py"", line 361, in get response_or_template_contexts = await self.data(request, **data_kwargs) File ""/home/jon/.local/lib/python3.10/site-packages/datasette/views/table.py"", line 158, in data return await self._data_traced(request, default_labels, _next, _size) File ""/home/jon/.local/lib/python3.10/site-packages/datasette/views/table.py"", line 603, in _data_traced await self.ds.expand_foreign_keys( File ""/home/jon/.local/lib/python3.10/site-packages/datasette/app.py"", line 814, in expand_foreign_keys other_column=escape_sqlite(fk[""other_column""]), File ""/home/jon/.local/lib/python3.10/site-packages/datasette/utils/__init__.py"", line 346, in escape_sqlite if _boring_keyword_re.match(s) and (s.lower() not in reserved_words): TypeError: expected string or bytes-like object INFO: 127.0.0.1:38574 - ""GET /espial-new/bookmark HTTP/1.1"" 500 Internal Server Error INFO: 127.0.0.1:38574 - ""GET /-/static/app.css?d59929 HTTP/1.1"" 200 OK ``` Schema: ``` CREATE TABLE IF NOT EXISTS ""user"" ( ""id"" INTEGER PRIMARY KEY, ""name"" VARCHAR NOT NULL, ""password_hash"" VARCHAR NOT NULL, ""api_token"" VARCHAR NULL, ""private_default"" BOOLEAN NOT NULL, ""archive_default"" BOOLEAN NOT NULL, ""privacy_lock"" BOOLEAN NOT NULL, CONSTRAINT ""unique_user_name"" UNIQUE (""name"") ); CREATE TABLE IF NOT EXISTS ""bookmark"" ( ""id"" INTEGER PRIMARY KEY, ""user_id"" INTEGER NOT NULL REFERENCES ""user"" ON DELETE RESTRICT ON UPDATE RESTRICT, ""slug"" VARCHAR NOT NULL DEFAULT (Lower(Hex(Randomblob(6)))), ""href"" VARCHAR NOT NULL, ""description"" VARCHAR NOT NULL, ""extended"" VARCHAR NOT NULL, ""time"" TIMESTAMP NOT NULL, ""shared"" BOOLEAN NOT NULL, ""to_read"" BOOLEAN NOT NULL, ""selected"" BOOLEAN NOT NULL, ""archive_href"" VARCHAR NULL, CONSTRAINT ""unique_user_href"" UNIQUE (""user_id"", ""href""), CONSTRAINT ""unique_user_slug"" UNIQUE (""user_id"", ""slug"") ); CREATE TABLE IF NOT EXISTS ""bookmark_tag"" ( ""id"" INTEGER PRIMARY KEY, ""user_id"" INTEGER NOT NULL REFERENCES ""user"" ON DELETE RESTRICT ON UPDATE RESTRICT, ""tag"" VARCHAR NOT NULL, ""bookmark_id"" INTEGER NOT NULL REFERENCES ""bookmark"" ON DELETE RESTRICT ON UPDATE RESTRICT, ""seq"" INTEGER NOT NULL, CONSTRAINT ""unique_user_tag_bookmark_id"" UNIQUE (""user_id"", ""tag"", ""bookmark_id""), CONSTRAINT ""unique_user_bookmark_id_tag_seq"" UNIQUE (""user_id"", ""bookmark_id"", ""tag"", ""seq"") ); CREATE TABLE IF NOT EXISTS 
""note"" ( ""id"" INTEGER PRIMARY KEY, ""user_id"" INTEGER NOT NULL REFERENCES ""user"" ON DELETE RESTRICT ON UPDATE RESTRICT, ""slug"" VARCHAR NOT NULL DEFAULT (Lower(Hex(Randomblob(10)))), ""length"" INTEGER NOT NULL, ""title"" VARCHAR NOT NULL, ""text"" VARCHAR NOT NULL, ""is_markdown"" BOOLEAN NOT NULL, ""shared"" BOOLEAN NOT NULL DEFAULT false, ""created"" TIMESTAMP NOT NULL, ""updated"" TIMESTAMP NOT NULL ); CREATE INDEX idx_bookmark_time ON bookmark (user_id, time DESC); CREATE INDEX idx_bookmark_tag_bookmark_id ON bookmark_tag (bookmark_id, id, tag, seq); CREATE INDEX idx_note_user_created ON note (user_id, created DESC); ``` ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1995/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1536851861,I_kwDOBm6k_c5bmn-V,1994,Stuck on loading screen,10913053,jackhagley,open,0,,,,,1,2023-01-17T18:33:49Z,2023-01-23T08:21:08Z,,NONE,,"Can’t actually open it! Downloaded today from the releases tab Running macOS13.1 ``` bin/python3.9 --version Python 3.9.6 Took 83ms bin/python3.9 --version Python 3.9.6 Took 113ms bin/pip install datasette>=0.59 datasette-app-support>=0.11.6 datasette-vega>=0.6.2 datasette-cluster-map>=0.17.1 datasette-pretty-json>=0.2.1 datasette-edit-schema>=0.4 datasette-configure-fts>=1.1 datasette-leaflet>=0.2.2 --disable-pip-version-check Requirement already satisfied: datasette>=0.59 in lib/python3.9/site-packages (0.63) Requirement already satisfied: datasette-app-support>=0.11.6 in lib/python3.9/site-packages (0.11.6) Requirement already satisfied: datasette-vega>=0.6.2 in lib/python3.9/site-packages (0.6.2) Requirement already satisfied: datasette-cluster-map>=0.17.1 in lib/python3.9/site-packages (0.17.2) Requirement already satisfied: datasette-pretty-json>=0.2.1 in lib/python3.9/site-packages (0.2.2) Requirement already satisfied: datasette-edit-schema>=0.4 in lib/python3.9/site-packages (0.5.1) Requirement already satisfied: datasette-configure-fts>=1.1 in lib/python3.9/site-packages (1.1) Requirement already satisfied: datasette-leaflet>=0.2.2 in lib/python3.9/site-packages (0.2.2) Requirement already satisfied: click>=7.1.1 in lib/python3.9/site-packages (from datasette>=0.59) (8.1.3) Requirement already satisfied: hupper>=1.9 in lib/python3.9/site-packages (from datasette>=0.59) (1.10.3) Requirement already satisfied: pint>=0.9 in lib/python3.9/site-packages (from datasette>=0.59) (0.20.1) Requirement already satisfied: PyYAML>=5.3 in lib/python3.9/site-packages (from datasette>=0.59) (6.0) Requirement already satisfied: httpx>=0.20 in lib/python3.9/site-packages (from datasette>=0.59) (0.23.0) Requirement already satisfied: aiofiles>=0.4 in lib/python3.9/site-packages (from datasette>=0.59) (22.1.0) Requirement already satisfied: asgi-csrf>=0.9 in lib/python3.9/site-packages (from datasette>=0.59) (0.9) Requirement already satisfied: asgiref>=3.2.10 in lib/python3.9/site-packages (from datasette>=0.59) (3.5.2) Requirement already satisfied: uvicorn>=0.11 in lib/python3.9/site-packages (from datasette>=0.59) (0.19.0) Requirement already satisfied: itsdangerous>=1.1 in lib/python3.9/site-packages (from datasette>=0.59) (2.1.2) Requirement already satisfied: click-default-group-wheel>=1.2.2 in lib/python3.9/site-packages (from datasette>=0.59) (1.2.2) Requirement already satisfied: janus>=0.6.2 in lib/python3.9/site-packages (from datasette>=0.59) (1.0.0) Requirement already 
satisfied: pluggy>=1.0 in lib/python3.9/site-packages (from datasette>=0.59) (1.0.0) Requirement already satisfied: Jinja2>=2.10.3 in lib/python3.9/site-packages (from datasette>=0.59) (3.1.2) Requirement already satisfied: mergedeep>=1.1.1 in lib/python3.9/site-packages (from datasette>=0.59) (1.3.4) Requirement already satisfied: sqlite-utils in lib/python3.9/site-packages (from datasette-app-support>=0.11.6) (3.30) Requirement already satisfied: packaging in lib/python3.9/site-packages (from datasette-app-support>=0.11.6) (21.3) Requirement already satisfied: python-multipart in lib/python3.9/site-packages (from asgi-csrf>=0.9->datasette>=0.59) (0.0.5) Requirement already satisfied: httpcore<0.16.0,>=0.15.0 in lib/python3.9/site-packages (from httpx>=0.20->datasette>=0.59) (0.15.0) Requirement already satisfied: certifi in lib/python3.9/site-packages (from httpx>=0.20->datasette>=0.59) (2022.9.24) Requirement already satisfied: rfc3986[idna2008]<2,>=1.3 in lib/python3.9/site-packages (from httpx>=0.20->datasette>=0.59) (1.5.0) Requirement already satisfied: sniffio in lib/python3.9/site-packages (from httpx>=0.20->datasette>=0.59) (1.3.0) Requirement already satisfied: h11<0.13,>=0.11 in lib/python3.9/site-packages (from httpcore<0.16.0,>=0.15.0->httpx>=0.20->datasette>=0.59) (0.12.0) Requirement already satisfied: anyio==3.* in lib/python3.9/site-packages (from httpcore<0.16.0,>=0.15.0->httpx>=0.20->datasette>=0.59) (3.6.2) Requirement already satisfied: idna>=2.8 in lib/python3.9/site-packages (from anyio==3.*->httpcore<0.16.0,>=0.15.0->httpx>=0.20->datasette>=0.59) (3.4) Requirement already satisfied: typing-extensions>=3.7.4.3 in lib/python3.9/site-packages (from janus>=0.6.2->datasette>=0.59) (4.4.0) Requirement already satisfied: MarkupSafe>=2.0 in lib/python3.9/site-packages (from Jinja2>=2.10.3->datasette>=0.59) (2.1.1) Requirement already satisfied: tabulate in lib/python3.9/site-packages (from sqlite-utils->datasette-app-support>=0.11.6) (0.9.0) Requirement already satisfied: python-dateutil in lib/python3.9/site-packages (from sqlite-utils->datasette-app-support>=0.11.6) (2.8.2) Requirement already satisfied: sqlite-fts4 in lib/python3.9/site-packages (from sqlite-utils->datasette-app-support>=0.11.6) (1.0.3) Requirement already satisfied: pyparsing!=3.0.5,>=2.0.2 in lib/python3.9/site-packages (from packaging->datasette-app-support>=0.11.6) (3.0.9) Requirement already satisfied: six>=1.5 in lib/python3.9/site-packages (from python-dateutil->sqlite-utils->datasette-app-support>=0.11.6) (1.16.0) Took 784ms ``` STUCK",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1994/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1533673397,I_kwDOBm6k_c5baf-1,1991,fts5 tables are not auto-detected and hidden,83819,keturn,open,0,,,,,0,2023-01-15T06:00:42Z,2023-01-20T04:54:24Z,,NONE,,"I set up a [Datasette instance](https://huggingface.co/spaces/Sygil/INE-dataset-explorer/tree/main) and was following the docs on full-text search. When I used fts4, datasette automatically hid the FTS tables and added the FTS search box where appropriate, but when I changed to fts5 it no longer does either. If I [manually set](https://huggingface.co/spaces/keturn/INED-datasette/blob/main/metadata.json#L9) `fts_table` for a view, then search does work as expected. 
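For anyone else hitting this, that manual workaround in `metadata.json` looks roughly like this - database and table names here are placeholders matching my captions example, and `fts_table`/`fts_pk` are the documented table properties:

```json
{
  ""databases"": {
    ""data"": {
      ""tables"": {
        ""captions"": {
          ""fts_table"": ""captions_fts"",
          ""fts_pk"": ""image_key""
        }
      }
    }
  }
}
```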
My table and view creation code looks like this: ```py connection.execute(""""""CREATE TABLE IF NOT EXISTS captions(image_key text PRIMARY KEY, caption text NOT NULL) """""")   connection.execute(""""""CREATE VIRTUAL TABLE captions_fts USING fts5(caption, image_key UNINDEXED, content=captions) """""") ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1991/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1532000914,I_kwDOBm6k_c5bUHqS,1990,Suggestion: Highlight error messages ('These facets timed out'),116795,pax,open,0,,,,,0,2023-01-13T09:40:58Z,2023-01-13T09:40:58Z,,NONE,,"I had trouble figuring out why faceting didn't work in some instances; it took a while before I noticed the _These facets timed out_ notice. It might help if that notice were highlighted - or given a highlight that fades out, if a permanent one would be too visually disturbing.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1990/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1531991339,I_kwDOBm6k_c5bUFUr,1989,Suggestion: Hiding columns,116795,pax,open,0,,,,,3,2023-01-13T09:33:32Z,2023-03-31T06:18:05Z,,NONE,,As there's already the possibility of [hiding tables](https://docs.datasette.io/en/stable/metadata.html#hiding-tables) - I've run into the **need to hide specific columns** - data that's either not relevant to the public or can't be shown for privacy reasons. ,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1989/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1529707837,I_kwDOBm6k_c5bLX09,1988,Reconsider pattern where plugins could break existing template context,9599,simonw,open,0,,,3268330,Datasette 1.0,4,2023-01-11T21:13:43Z,2023-01-11T21:25:05Z,,OWNER,,"> I hadn't run into an issue with plugins like `datasette-template-sql` interfering with the existing context for other features before! Definitely not a good thing. _Originally posted by @simonw in https://github.com/simonw/datasette-write/issues/6#issuecomment-1379490596_ ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1988/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1529452371,I_kwDOBm6k_c5bKZdT,1987,installpython3.com is now a spam website,9599,simonw,closed,0,,,,,4,2023-01-11T17:55:12Z,2023-01-11T18:29:26Z,2023-01-11T18:29:25Z,OWNER,,"Need to stop linking to it from the docs. 
I'll link to https://www.python.org/about/gettingstarted/ instead.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1987/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1528448642,I_kwDOBm6k_c5bGkaC,1985,Don't let Datasette(path) without a list cause weird errors,9599,simonw,closed,0,,,,,1,2023-01-11T05:17:44Z,2023-01-11T18:25:04Z,2023-01-11T18:25:04Z,OWNER,,"I got a confusing `sqlite3.OperationalError: disk I/O error` error in my tests, it turned out it was because this: ```python ds = Datasette(path) ``` Should have been this: ```python ds = Datasette([path]) ``` _Originally posted by @simonw in https://github.com/simonw/datasette-faiss/issues/1#issuecomment-1378252673_ ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1985/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1525815985,I_kwDOBm6k_c5a8hqx,1983,Make CustomJSONEncoder a documented public API,9599,simonw,open,0,,,,,3,2023-01-09T15:27:05Z,2023-01-09T15:35:58Z,,OWNER,,It's used by `datasette-geojson` here: https://github.com/eyeseast/datasette-geojson/commit/902bf135a5a33a0dc8264673d00a59a67cb05152,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1983/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1524983536,I_kwDOBm6k_c5a5Wbw,1981,Canned query field labels truncated,9599,simonw,open,0,,,,,1,2023-01-09T06:04:24Z,2023-01-09T06:05:44Z,,OWNER,,"Eg here on mobile: https://timezones.datasette.io/timezones/by_point?longitude=-0.1406632&latitude=50.8246776 ![107A1894-D1DA-4158-9EA3-40C840DD10E3](https://user-images.githubusercontent.com/9599/211248895-c922ce61-95d3-47ca-9314-dcff7c86afab.jpeg) ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1981/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1524867951,I_kwDOBm6k_c5a46Nv,1980,"""Cannot sort table by id"" when sortable_columns is used",9599,simonw,open,0,,,,,2,2023-01-09T03:21:33Z,2023-01-09T03:23:53Z,,OWNER,,"I had an instance with this in `metadata.yml`: ```yaml databases: timezones: tables: timezones: sortable_columns: - tzid ``` When I clicked on the ""Apply"" button here: It sent me to `/timezones/timezones?_sort=id&id__exact=133` with the error message: > 500: Cannot sort table by id",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1980/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1524431805,I_kwDODEm0Qs5a3Pu9,72,"Import thread, including self- and others' replies",601708,mcint,open,0,,,,,0,2023-01-08T09:51:06Z,2023-01-08T09:51:06Z,,NONE,,"statuses-lookup, home-timeline, mentions (only for auth'ed user) don't cover this. `twitter-to-sqlite fetch-thread tw-group1.db 1234123412341234` twitter-to-sqlite focuses on archiving users, but does not easily support archiving conversations or community activity. 
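To sketch what I mean (pure pseudocode against a hypothetical `fetch_replies()` helper - nothing here exists in twitter-to-sqlite today):

```python
def fetch_thread(fetch_replies, root_tweet):
    # Walk the reply tree from the root tweet, collecting every
    # reply (and reply-to-reply) exactly once.
    seen = {}
    queue = [root_tweet]
    while queue:
        tweet = queue.pop()
        if tweet[""id""] in seen:
            continue
        seen[tweet[""id""]] = tweet
        # fetch_replies would hit the search API (e.g. to:username or
        # conversation_id) and filter on in_reply_to_status_id.
        queue.extend(fetch_replies(tweet))
    return list(seen.values())
```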
For reference, this is [implemented in twarc](https://sourcegraph.com/github.com/DocNow/twarc/-/blob/twarc/client.py?L708-766&subtree=true), using a search, optionally recursively. Other research suggests that this formerly, or currently, requires a [search query](https://stackoverflow.com/a/30480103/1020467), use of the [undocumented `related_results` api](https://stackoverflow.com/a/9419346/1020467), or requested inclusion of the [newer conversation_id](https://stackoverflow.com/a/68115718/1020467) field with a subsequent query. ",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/72/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1524076587,I_kwDOBm6k_c5a15Ar,1979,More useful error message if enable_load_extension is not available,9599,simonw,closed,0,,,,,5,2023-01-07T19:13:19Z,2023-01-08T00:21:23Z,2023-01-08T00:21:23Z,OWNER,,"I get this from: datasette --load-extension spatialite --get /-/versions.json ``` File ""/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/site-packages/datasette/app.py"", line 614, in _prepare_connection conn.enable_load_extension(True) AttributeError: 'sqlite3.Connection' object has no attribute 'enable_load_extension' ``` It would be useful if Datasette caught this error and output something more friendly.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1979/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1522778923,I_kwDOBm6k_c5aw8Mr,1978,Document datasette.urls.row and row_blob,25778,eyeseast,closed,0,,,,,2,2023-01-06T15:45:51Z,2023-01-09T14:30:00Z,2023-01-09T14:30:00Z,CONTRIBUTOR,,"These are in the codebase but not in documentation. I think everything else in this class is documented. ```python class Urls: ... def row(self, database, table, row_path, format=None): path = f""{self.table(database, table)}/{row_path}"" if format is not None: path = path_with_format(path=path, format=format) return PrefixedUrlString(path) def row_blob(self, database, table, row_path, column): return self.table(database, table) + ""/{}.blob?_blob_column={}"".format( row_path, urllib.parse.quote_plus(column) ) ``` ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1978/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,not_planned 1516815571,I_kwDOBm6k_c5aaMTT,1975,_col=id can cause id column to export twice in CSV export,9599,simonw,open,0,,,,,0,2023-01-03T00:25:15Z,2023-01-03T00:25:21Z,,OWNER,,"https://datasette.simonwillison.net/simonwillisonblog/blog_entry.csv?_col=id&_col=title&_col=body&_labels=on&_size=1 ```csv id,id,title,body 1,1,WaSP Phase II,""

The Web Standards project has launched Phase II.

"" ``` That should not have two `id` columns.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1975/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1516644980,I_kwDOCGYnMM5aZip0,520,rows_from_file() raises confusing error if file-like object is not in binary mode,9599,simonw,closed,0,,,,,3,2023-01-02T19:00:14Z,2023-05-08T22:08:07Z,2023-05-08T22:08:07Z,OWNER,,"I got this error: ``` File ""/Users/simon/Dropbox/Development/openai-to-sqlite/openai_to_sqlite/cli.py"", line 27, in embeddings rows, _ = rows_from_file(input) ^^^^^^^^^^^^^^^^^^^^^ File ""/Users/simon/.local/share/virtualenvs/openai-to-sqlite-jt4obeb2/lib/python3.11/site-packages/sqlite_utils/utils.py"", line 305, in rows_from_file first_bytes = buffered.peek(2048).strip() ^^^^^^^^^^^^^^^^^^^ ``` From this code: ```python @cli.command() @click.argument( ""db_path"", type=click.Path(file_okay=True, dir_okay=False, allow_dash=False), ) @click.option( ""-i"", ""--input"", type=click.File(""r""), default=""-"", ) def embeddings(db_path, input): ""Store embeddings for one or more text documents"" click.echo(""Here is some output"") db = sqlite_utils.Database(db_path) rows, _ = rows_from_file(input) print(list(rows)) ``` The error went away when I changed it to `type=click.File(""rb"")`. This should either be called out in the documentation or `rows_from_file()` should be fixed to handle text-mode files in addition to binary files.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/520/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1515883470,I_kwDOC8tyDs5aWovO,24,DOC: xml.etree.ElementTree.ParseError due to healthkit version 12 ,6231413,mmngreco,open,0,,,,,2,2023-01-01T23:00:38Z,2023-03-30T10:17:31Z,,NONE,,"Hi @simonw I hope you find this issue ok, the idea is provide some documentation to other users like me about how to solve this problem and save some time. 
Following the instructions from the `README.md`, I ran into this error: ```bash (venv) mgreco@pop-os apple-health master* (23:44|0s) $ healthkit-to-sqlite apple_health_export/export.xml healthkit.db --xml Importing from HealthKit [------------------------------------] 0% Traceback (most recent call last): File ""/home/mgreco/github/mmngreco/apple-health/venv/bin/healthkit-to-sqlite"", line 33, in sys.exit(load_entry_point('healthkit-to-sqlite', 'console_scripts', 'healthkit-to-sqlite')()) File ""/home/mgreco/github/mmngreco/apple-health/venv/lib/python3.10/site-packages/click/core.py"", line 1130, in __call__ return self.main(*args, **kwargs) File ""/home/mgreco/github/mmngreco/apple-health/venv/lib/python3.10/site-packages/click/core.py"", line 1055, in main rv = self.invoke(ctx) File ""/home/mgreco/github/mmngreco/apple-health/venv/lib/python3.10/site-packages/click/core.py"", line 1404, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/home/mgreco/github/mmngreco/apple-health/venv/lib/python3.10/site-packages/click/core.py"", line 760, in invoke return __callback(*args, **kwargs) File ""/home/mgreco/github/mmngreco/apple-health/.deps/healthkit-to-sqlite/healthkit_to_sqlite/cli.py"", line 57, in cli convert_xml_to_sqlite(fp, db, progress_callback=bar.update, zipfile=zf) File ""/home/mgreco/github/mmngreco/apple-health/.deps/healthkit-to-sqlite/healthkit_to_sqlite/utils.py"", line 25, in convert_xml_to_sqlite for tag, el in find_all_tags( File ""/home/mgreco/github/mmngreco/apple-health/.deps/healthkit-to-sqlite/healthkit_to_sqlite/utils.py"", line 12, in find_all_tags for event, el in parser.read_events(): File ""/home/mgreco/github/mmngreco/apple-health/venv/lib/python3.10/xml/etree/ElementTree.py"", line 1324, in read_events raise event File ""/home/mgreco/github/mmngreco/apple-health/venv/lib/python3.10/xml/etree/ElementTree.py"", line 1296, in feed self._parser.feed(data) xml.etree.ElementTree.ParseError: syntax error: line 156, column 0 ``` So, after debugging and searching on the internet I found this useful link: https://discussions.apple.com/thread/254202523 (etresoft, the real hero), which basically says that the xml given by the health app (healthkit version 12) has some bugs but fortunately, they can be solved with a couple of commands: 1. Uncompress the zip and move the new folder where `export.xml` is. 1. Create a `patch.txt` with the following content ```diff --- export.xml 2022-09-18 15:17:09.000000000 -0400 +++ export-fixed.xml 2022-09-18 16:37:08.000000000 -0400 @@ -15,6 +15,7 @@ HKCharacteristicTypeIdentifierBiologicalSex CDATA #REQUIRED HKCharacteristicTypeIdentifierBloodType CDATA #REQUIRED HKCharacteristicTypeIdentifierFitzpatrickSkinType CDATA #REQUIRED + HKCharacteristicTypeIdentifierCardioFitnessMedicationsUse CDATA #IMPLIED > - + - + - device CDATA #IMPLIED - - -> ]> ``` 1. Apply the patch with the command: `patch < patch.txt` 1. Fix endDates with the command `sed 's/startDate/endDate/2' export.xml > export-fixed.xml` 1. 
Try again `healthkit-to-sqlite export-fixed.xml healthkit.db --xml`",197882382,healthkit-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/24/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1515815014,I_kwDOBm6k_c5aWYBm,1973,render_cell plugin hook's row object is not a sqlite.Row,193185,cldellow,open,0,,,,,4,2023-01-01T20:27:46Z,2023-01-29T00:40:31Z,,CONTRIBUTOR,,"From https://docs.datasette.io/en/stable/plugin_hooks.html#render-cell-row-value-column-table-database-datasette: > row - sqlite.Row > The SQLite row object that the value being rendered is part of This appears to actually be a [CustomRow](https://github.com/simonw/datasette/blob/f0fadc28ddb9f82e5cc1ecaa51e8a342eb6dc528/datasette/utils/__init__.py#L773-L789), but I think that's unrelated to my issue. I have a table: ```sql CREATE TABLE IF NOT EXISTS ""dss_job_stats""( job_id integer not null references dss_job(id) on delete cascade, host text not null, // other columns elided as irrelevant primary key (job_id, host) ); ``` On datasette 0.63.2, the `render_cell` hook receives a `row` value that looks like: ``` CustomRow([('job_id', {'value': 2, 'label': '2'}), ('host', 'cldellow.com')]) ``` I expected the `job_id` value to be `2`, but it's actually `{'value': 2, 'label': '2'}`. I can work around this, but was wondering if this was intended behaviour?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1973/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1515186569,I_kwDOBm6k_c5aT-mJ,1972,Fix Sphinx warning about extlink extension,9599,simonw,closed,0,,,,,0,2022-12-31T19:12:04Z,2022-12-31T19:13:26Z,2022-12-31T19:13:26Z,OWNER,,"``` [sphinx-autobuild] > sphinx-build -b html /Users/simon/Dropbox/Development/datasette/docs /Users/simon/Dropbox/Development/datasette/docs/_build Running Sphinx v5.3.0 loading pickled environment... done WARNING: extlinks: Sphinx-6.0 will require a caption string to contain exactly one '%s' and all other '%' need to be escaped as '%%'. ``` _Originally posted by @simonw in https://github.com/simonw/datasette/issues/1971#issuecomment-1368266904_ ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1972/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1515185383,I_kwDOBm6k_c5aT-Tn,1971,Upgrade for Sphinx 6.0 (once Furo has support for it),9599,simonw,closed,0,,,,,3,2022-12-31T19:04:35Z,2023-01-10T02:02:34Z,2023-01-10T02:02:34Z,OWNER,,"A deployment of #1967 to ReadTheDocs just failed like this: https://readthedocs.org/projects/datasette/builds/19045460/ ``` Running Sphinx v6.0.0 making output directory... done building [mo]: targets for 0 po files that are out of date building [html]: targets for 28 source files that are out of date updating environment: [new config] 28 added, 0 changed, 0 removed reading sources... [ 3%] authentication reading sources... [ 7%] binary_data reading sources... 
[ 10%] changelog Traceback (most recent call last): File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/statemachine.py"", line 299, in next_line self.line = self.input_lines[self.line_offset] File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/statemachine.py"", line 1136, in __getitem__ return self.data[i] IndexError: list index out of range During handling of the above exception, another exception occurred: Traceback (most recent call last): File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/statemachine.py"", line 226, in run self.next_line() File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/statemachine.py"", line 302, in next_line raise EOFError EOFError During handling of the above exception, another exception occurred: Traceback (most recent call last): File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/sphinx/cmd/build.py"", line 281, in build_main app.build(args.force_all, args.filenames) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/sphinx/application.py"", line 344, in build self.builder.build_update() File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/sphinx/builders/__init__.py"", line 310, in build_update self.build(to_build, File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/sphinx/builders/__init__.py"", line 326, in build updated_docnames = set(self.read()) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/sphinx/builders/__init__.py"", line 433, in read self._read_serial(docnames) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/sphinx/builders/__init__.py"", line 454, in _read_serial self.read_doc(docname) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/sphinx/builders/__init__.py"", line 510, in read_doc publisher.publish() File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/core.py"", line 224, in publish self.document = self.reader.read(self.source, self.parser, File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/sphinx/io.py"", line 103, in read self.parse() File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/readers/__init__.py"", line 76, in parse self.parser.parse(self.input, document) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/sphinx/parsers.py"", line 78, in parse self.statemachine.run(inputlines, document, inliner=self.inliner) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 169, in run results = StateMachineWS.run(self, input_lines, input_offset, File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/statemachine.py"", line 233, in run context, next_state, result = self.check_line( File 
""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/statemachine.py"", line 445, in check_line return method(match, context, next_state) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 3024, in text self.section(title.lstrip(), source, style, lineno + 1, messages) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 325, in section self.new_subsection(title, lineno, messages) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 391, in new_subsection newabsoffset = self.nested_parse( File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 279, in nested_parse state_machine.run(block, input_offset, memo=self.memo, File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 195, in run results = StateMachineWS.run(self, input_lines, input_offset) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/statemachine.py"", line 233, in run context, next_state, result = self.check_line( File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/statemachine.py"", line 445, in check_line return method(match, context, next_state) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 2785, in underline self.section(title, source, style, lineno - 1, messages) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 325, in section self.new_subsection(title, lineno, messages) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 391, in new_subsection newabsoffset = self.nested_parse( File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 279, in nested_parse state_machine.run(block, input_offset, memo=self.memo, File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 195, in run results = StateMachineWS.run(self, input_lines, input_offset) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/statemachine.py"", line 233, in run context, next_state, result = self.check_line( File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/statemachine.py"", line 445, in check_line return method(match, context, next_state) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 1273, in bullet i, blank_finish = self.list_item(match.end()) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 1295, in list_item self.nested_parse(indented, 
input_offset=line_offset, File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 279, in nested_parse state_machine.run(block, input_offset, memo=self.memo, File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 195, in run results = StateMachineWS.run(self, input_lines, input_offset) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/statemachine.py"", line 239, in run result = state.eof(context) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 2725, in eof self.blank(None, context, None) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 2716, in blank paragraph, literalnext = self.paragraph( File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 416, in paragraph textnodes, messages = self.inline_text(text, lineno) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 425, in inline_text nodes, messages = self.inliner.parse(text, lineno, File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 649, in parse before, inlines, remaining, sysmessages = method(self, match, File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 792, in interpreted_or_phrase_ref nodelist, messages = self.interpreted(rawsource, escaped, role, File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 889, in interpreted nodes, messages2 = role_fn(role, rawsource, text, lineno, self) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/sphinx/ext/extlinks.py"", line 101, in role title = caption % part TypeError: not all arguments converted during string formatting Exception occurred: File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/sphinx/ext/extlinks.py"", line 101, in role title = caption % part TypeError: not all arguments converted during string formatting The full traceback has been saved in /tmp/sphinx-err-kq7ylgqo.log, if you want to report the issue to the developers. Please also report this if it was a user error, so that a better error message can be provided next time. A bug report can be filed in the tracker at . Thanks! 
```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1971/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1515182998,I_kwDOBm6k_c5aT9uW,1970,"Path ""None"" in _internal database table",9599,simonw,closed,0,,,,,2,2022-12-31T18:51:05Z,2022-12-31T19:22:58Z,2022-12-31T18:52:49Z,OWNER,,"See https://latest.datasette.io/_internal/databases (after https://latest.datasette.io/login-as-root) ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1970/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1509783085,I_kwDOBm6k_c5Z_XYt,1969,sql-formatter javascript is not now working with CloudFlare rocketloader,536941,fgregg,open,0,,,,,0,2022-12-23T21:14:06Z,2023-01-10T01:56:33Z,,CONTRIBUTOR,,"This is probably not a bug with datasette, but I thought you might want to know, @simonw. I noticed today that my CloudFlare proxied datasette instance lost the ""Format SQL"" option. I'm pretty sure it was there last week. In the CloudFlare settings, if I turn off [Rocket Loader](https://developers.cloudflare.com/fundamentals/speed/rocket-loader/), I get the ""Format SQL"" option back. Rocket Loader works by asynchronously loading the javascript, so maybe there was a recent change that doesn't play well with the asynch loading? I'm up to date with https://github.com/simonw/datasette/commit/e03aed00026cc2e59c09ca41f69a247e1a85cc89",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1969/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1505411725,I_kwDODFdgUs5ZusKN,78,self-hosted or corp github enterprise,549431,ebdavison,open,0,,,,,0,2022-12-20T22:51:45Z,2022-12-20T22:51:45Z,,NONE,,"We use github enterprise at work and I would like to use this tool to pull info from that site rather than the public github.com instance. Is there an option for this? If not, can one be added for a custom repo URL?",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/78/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1504352503,I_kwDOBm6k_c5Zqpj3,1968,Allow to hide some queries in metadata.yml,562352,CharlesNepote,open,0,,,,,0,2022-12-20T10:45:41Z,2022-12-20T10:45:41Z,,NONE,,"By default all queries are displayed. But there are many cases where it would be interesting to hide the queries by default: * the website is targeting non-tech people * the query is veeeeeery long ([eg.](https://mirabelle.openfoodfacts.org/products/energy_calculator)) * reading the query is not important for the users, they only want to see the result Of course, the user still could have the option to see the query. 
It could be an option in the metadata file: ```yml databases: awesome_db: tables: products: hide_sql: true queries: great_query: hide_sql: true sql: select * from products where code = :barcode ``` The priority could be: * no option in the metadata and nothing in the URL: query displayed * hide_sql in the metadata and nothing in the URL: query shown or hidden as the metadata asks * hide_sql in the metadata and &_hide_sql= in the URL: query shown or hidden as the URL asks See also: #1824 ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1968/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1501900064,I_kwDOBm6k_c5ZhS0g,1966,Broken link to live demo in Getting started docs,7551922,lbellomo,closed,0,,,,,1,2022-12-18T13:17:00Z,2022-12-31T19:15:19Z,2022-12-31T19:15:10Z,NONE,,The link in [Play with a live demo in Getting started](https://github.com/simonw/datasette/blob/main/docs/getting_started.rst#play-with-a-live-demo) to [https://fivethirtyeight.datasettes.com/fivethirtyeight](https://fivethirtyeight.datasettes.com/fivethirtyeight) is broken and the datasette is no longer working (maybe due to the end of the free tier).,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1966/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1501778647,I_kwDOBm6k_c5Zg1LX,1964,Cog menu is not keyboard accessible (also no ARIA),9599,simonw,open,0,,,,,1,2022-12-18T06:36:28Z,2022-12-18T06:37:28Z,,OWNER,,"This menu here: https://latest.datasette.io/fixtures/attraction_characteristic You can tab to it (see the outline) and hit space or enter to open it, but you can't then navigate the items in the open menu using the keyboard. ![cog-menu](https://user-images.githubusercontent.com/9599/208284973-2a04cdab-ed95-4316-979c-67fe5f7787db.gif) ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1964/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1501713288,I_kwDOBm6k_c5ZglOI,1963,0.63.3 bugfix release,9599,simonw,closed,0,,,,,2,2022-12-18T02:48:15Z,2022-12-18T03:26:55Z,2022-12-18T03:26:55Z,OWNER,,"I'm going to ship a release which back-ports these two fixes: - https://github.com/simonw/datasette/issues/1958 - https://github.com/simonw/datasette/issues/1955",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1963/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1500636982,I_kwDOBm6k_c5Zcec2,1962,"Alternative, async-friendly pattern for `make_app_client()` and similar - fully retire `TestClient`",9599,simonw,open,0,,,,,1,2022-12-16T17:56:51Z,2022-12-16T21:55:29Z,,OWNER,,"In this issue I replaced a whole bunch of places that used the non-async `app_client` fixture with an async `ds_client` fixture instead: - #1959 But I didn't get everything, and a lot of tests are still using the old `TestClient` mechanism as a result. The main work here is replacing all of the `app_client_...` fixtures which use variants on the default client - and changing the tests that call `make_app_client()` to do something else instead. This requires some careful thought. 
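One possible shape - strictly a sketch, assuming pytest-asyncio and an in-memory instance, not a settled design:

```python
import pytest_asyncio
from datasette.app import Datasette

@pytest_asyncio.fixture
async def ds_client_factory():
    # Returns a factory so each test can request its own variant
    # (extra metadata, settings, etc.) without leaving a file open.
    async def make(**kwargs):
        ds = Datasette(memory=True, **kwargs)
        await ds.invoke_startup()
        return ds.client
    return make
```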
I need to come up with a really nice pattern for creating variants on the `ds_client` default fixture - and do so in a way that minimizes the number of open files, refs: - #1843",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1962/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1499081664,I_kwDOBm6k_c5ZWivA,1959,Refactor test suite to use mostly `async def` tests,9599,simonw,closed,0,,,,,9,2022-12-15T21:02:54Z,2022-12-17T21:49:37Z,2022-12-17T21:49:36Z,OWNER,,"I got blocked working on this issue due to weird and hard-to-debug test suite problems: - #1955 The test suite has needed a major upgrade for several years now. It has a LOT of `def test_...` synchronous functions that could be upgraded to `async def` for better performance and less test complexity - I've used the new `async def` pattern in plugins and new tests for a couple of years now. Hopefully I can get more of the tests to use in-memory named databases too, ideally so I can fix this consistent problem: - #1843",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1959/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1497577017,I_kwDOBm6k_c5ZQzY5,1957,Reconsider row value truncation on query page,9599,simonw,open,0,,,,,1,2022-12-14T23:49:47Z,2022-12-14T23:50:50Z,,OWNER,,"Consider this example: https://ripgrep.datasette.io/repos?sql=select+json_group_array%28full_name%29+from+repos ```sql select json_group_array(full_name) from repos ``` ![CleanShot 2022-12-14 at 15 48 32@2x](https://user-images.githubusercontent.com/9599/207739709-8177f683-f938-49a1-8225-42791fad88fe.png) My intention here was to get a string of JSON I can copy and paste elsewhere - see: https://til.simonwillison.net/sqlite/compare-before-after-json The truncation isn't helping here.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1957/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1497288666,I_kwDOBm6k_c5ZPs_a,1956,Handle abbreviations properly in permission_allowed_actor_restrictions,9599,simonw,closed,0,,,8711695, Datasette 1.0a2,2,2022-12-14T19:54:21Z,2022-12-14T20:04:29Z,2022-12-14T20:04:28Z,OWNER,,"This code currently assumes abbreviations are: ```python action_initials = """".join([word[0] for word in action.split(""-"")]) ``` https://github.com/simonw/datasette/blob/1a3dcf494376e32f7cff110c86a88e5b0a3f3924/datasette/default_permissions.py#L182-L208 That's no longer correct; they are now registered by the new plugin hook: - #1939 ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1956/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1496652622,I_kwDOBm6k_c5ZNRtO,1955,"invoke_startup() is not run in some conditions, e.g. gunicorn/uvicorn workers, breaking lots of things",32839123,Rik-de-Kort,closed,0,,,,,36,2022-12-14T13:39:56Z,2022-12-19T04:34:16Z,2022-12-18T02:45:18Z,NONE,,"In the past (pre-September 14, #1809) I had a running deployment of Datasette on Azure WebApps by emulating the call in cli.py to Gunicorn: `gunicorn -w 2 -k uvicorn.workers.UvicornWorker app:app`. 
My most recent deployment, however, fails loudly by shouting that `Datasette.invoke_startup()` was not called. It does not seem to be possible to call `invoke_startup` when running using a uvicorn command directly like this (I've reproduced this locally using `uvicorn`). Two candidates that I have tried: * Uvicorn has a `--factory` option, but the app factory has to be synchronous, so no `await invoke_startup` there * `asyncio.get_event_loop().run_until_complete` is also not an option because `uvicorn` already has the event loop running. One additional option is: * Use Gunicorn's [server hooks](https://docs.gunicorn.org/en/stable/settings.html#server-hooks) to call `invoke_startup`. These are also synchronous, but I might be able to get ahead of the event loop starting here. In my current deployment setup, it does not appear to be possible to use `datasette serve` directly, so I'm stuck either * Trying to rework my complete deployment setup, for instance, using Azure functions as described [here](https://github.com/simonw/azure-functions-datasette) * Or digging into the ASGI spec and writing a wrapper for the sole purpose of launching Datasette using a direct Uvicorn invocation. Questions for the maintainers: * Is this intended behaviour/will not support/etc.? If so, I'd be happy to add a PR with a couple lines in the documentation. * If this is not intended behaviour, what is a good way to fix it? I could have a go at the ASGI spec thing (I think the Azure Functions thing is related) and provide a PR with the wrapper here, but I'm all ears! Almost forgot, minimal reproducer: ```python from datasette import Datasette ds = Datasette(files=['./global-power-plants.db']) app = ds.app() ``` Save as app.py in the same folder as global-power-plants.db, and then try running `uvicorn app:app`. Opening the resulting Datasette instance in the browser will show the error message.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1955/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1495821607,I_kwDOBm6k_c5ZKG0n,1953,Release notes for Datasette 1.0a2,9599,simonw,closed,0,,,8711695, Datasette 1.0a2,2,2022-12-14T06:26:40Z,2022-12-15T02:02:15Z,2022-12-15T02:01:08Z,OWNER,,"https://github.com/simonw/datasette/milestone/27?closed=1 https://github.com/simonw/datasette/compare/1.0a1...9ad76d279e2c3874ca5070626a25458ce129f126",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1953/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1495716243,I_kwDOBm6k_c5ZJtGT,1952,Improvements to /-/create-token restrictions interface,9599,simonw,open,0,,,8755003,Datasette 1.0a-next,1,2022-12-14T05:22:39Z,2022-12-14T05:23:13Z,,OWNER,,"> It would be neat not to show write permissions against immutable databases too - and not hard from a performance perspective since it doesn't involve hundreds more permission checks. > > That will need permissions to grow a flag for if they need a mutable database though, which is a bigger job. 
_Originally posted by @simonw in https://github.com/simonw/datasette/issues/1947#issuecomment-1350414402_ Also, DO show the `_memory` database there if Datasette was started in `--crossdb` mode.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1952/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1495431932,I_kwDOBm6k_c5ZInr8,1951,`datasette.create_token(...)` method for creating signed API tokens,9599,simonw,closed,0,,,8711695, Datasette 1.0a2,6,2022-12-14T01:25:34Z,2022-12-14T02:43:45Z,2022-12-14T02:42:05Z,OWNER,,"I need this for: - #1947 And I can refactor this to use it too: - #1855 By making this a documented internal API it can be used by other plugins too. It's also going to be really useful for writing tests.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1951/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1495241162,I_kwDOBm6k_c5ZH5HK,1950,"Bad ?_sort returns a 500 error, should be a 400",9599,simonw,closed,0,,,,,2,2022-12-13T22:08:16Z,2022-12-13T22:23:22Z,2022-12-13T22:23:22Z,OWNER,,"https://latest.datasette.io/fixtures/facetable?_sort=bad ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1950/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1497909798,I_kwDOBm6k_c5ZSEom,1958,datasette --root running in Docker doesn't reliably show the magic URL,11729897,davidhaley,closed,0,,,,,11,2022-12-13T16:29:13Z,2022-12-16T00:59:12Z,2022-12-16T00:55:19Z,NONE,,"I followed these steps: `docker run datasetteproject/datasette pip install datasette-upload-csvs` `docker commit $(docker ps -lq) datasette-with-plugins` `docker run -p 8001:8001 -v $(pwd):/mnt datasette-with-plugins datasette --root -p 8001 -h 0.0.0.0` Visited: http://127.0.0.1:8001/-/plugins ![image](https://user-images.githubusercontent.com/11729897/207392071-d939cd5e-1d96-4e11-b0be-dc06dd207866.png) Visited: http://localhost:8001/-/upload-csvs ![image](https://user-images.githubusercontent.com/11729897/207389241-3e96ca66-ca74-4a16-8b7d-4427ee862c5e.png) I may have missed a step? Thank you. --- Ubuntu 22.04.1 LTS",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1958/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1493471221,I_kwDOBm6k_c5ZBI_1,1949,`.json` errors should be returned as JSON,9599,simonw,open,0,,,8755003,Datasette 1.0a-next,10,2022-12-13T06:14:12Z,2022-12-15T00:46:27Z,,OWNER,,"Eg the error in this issue: - #1945 ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1949/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1493404423,I_kwDOBm6k_c5ZA4sH,1948,500 error on permission debug page when testing actors with _r,9599,simonw,open,0,,,,,1,2022-12-13T05:22:03Z,2022-12-13T05:22:19Z,,OWNER,," The 500 error is silent unless you are looking at the DevTools network pane. 
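An actor of roughly this shape (pasted into the debug tool) is enough to trigger it - `_r` keys per the create-token design, where ""a"" means all, ""d"" means databases and ""r"" means resources; treat the exact abbreviations as approximate:

```json
{
  ""id"": ""root"",
  ""_r"": {
    ""a"": [""vi""],
    ""d"": {""fixtures"": [""vd""]},
    ""r"": {""fixtures"": {""facetable"": [""vt""]}}
  }
}
```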
",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1948/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1493390939,I_kwDOBm6k_c5ZA1Zb,1947,UI to create reduced scope tokens from the `/-/create-token` page,9599,simonw,closed,0,,,8711695, Datasette 1.0a2,22,2022-12-13T05:10:48Z,2022-12-14T05:22:00Z,2022-12-14T05:13:24Z,OWNER,,"Split from: - #1855",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1947/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1493339206,I_kwDOBm6k_c5ZAoxG,1946,`datasette --get` mechanism for sending tokens,9599,simonw,closed,0,,,8711695, Datasette 1.0a2,2,2022-12-13T04:25:05Z,2022-12-13T04:36:57Z,2022-12-13T04:36:57Z,OWNER,,"> For the tests for `datasette create-token` it would be useful if `datasette --get` had a mechanism for sending an `Authorization: Bearer X` header. _Originally posted by @simonw in https://github.com/simonw/datasette/issues/1855#issuecomment-1347731288_ ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1946/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1493306655,I_kwDOBm6k_c5ZAg0f,1945,`view-instance` should not be checked for /-/actor.json,9599,simonw,closed,0,,,8711695, Datasette 1.0a2,0,2022-12-13T04:01:46Z,2022-12-13T04:11:56Z,2022-12-13T04:11:56Z,OWNER,,"Spotted this while testing: - #1855 ``` export TOKEN=$(datasette create-token root --secret s -a foo) curl -H ""Authorization: Bearer $TOKEN"" http://localhost:8002/-/actor.json ``` Returned a Forbidden error (and not in JSON either).",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1945/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1490576818,I_kwDOBm6k_c5Y2GWy,1943,`/-/permissions` should list available permissions,9599,simonw,open,0,,,8755003,Datasette 1.0a-next,1,2022-12-11T23:38:03Z,2022-12-15T00:41:37Z,,OWNER,,"> Idea: a `/-/permissions` introspection endpoint for listing registered permissions _Originally posted by @simonw in https://github.com/simonw/datasette/issues/1939#issuecomment-1345691103_ ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1943/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1487764628,I_kwDOCGYnMM5YrXyU,518,flake8 ValueError: Error code '#' supplied to 'extend-ignore' option...,9599,simonw,closed,0,,,,,0,2022-12-10T01:30:24Z,2022-12-10T01:36:46Z,2022-12-10T01:36:46Z,OWNER,,"> `Error code '#' supplied to 'extend-ignore' option does not match '^[A-Z]{1,3}[0-9]{0,3}$'` https://github.com/simonw/sqlite-utils/actions/runs/3662011265/jobs/6190770361 I think from this: https://github.com/simonw/sqlite-utils/blob/e660635cea6c32f4022818380b1e1ee88e7c93a6/setup.cfg#L1-L3 ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/518/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 
1487757143,I_kwDOCGYnMM5YrV9X,517,Drop support for Python 3.6,9599,simonw,closed,0,,,,,1,2022-12-10T01:23:31Z,2022-12-10T01:36:36Z,2022-12-10T01:36:36Z,OWNER,,"CI has started failing for Python 3.6: https://github.com/simonw/sqlite-utils/actions/runs/3576322798 It's fixable by switching away from `ubuntu-latest` according to: - https://github.com/actions/setup-python/issues/355#issuecomment-1335042510 But https://endoflife.date/python says that 3.6 end of life was almost 6 years ago, and end of security support nearly 1 year ago. So I'm OK dropping support entirely - Python 3.6 users will still be able to install version 3.30, just not any releases that come next.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/517/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1487738738,I_kwDOBm6k_c5YrRdy,1942,Option for plugins to request that JSON be served on the page,9599,simonw,open,0,,,3268330,Datasette 1.0,1,2022-12-10T01:08:53Z,2022-12-10T01:11:30Z,,OWNER,,"Idea came from a conversation with @hydrosquall - what if a Datasette plugin could say ""I'd like the JSON for a page to be included in a variable on the HTML page""? `datasette-cluster-map` already needs this - the first thing it does when the page loads is `fetch()` a JSON representation of that same data. This idea fits with my overall goals to unify the JSON and HTML context too. Refs: - #1711",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1942/reactions"", ""total_count"": 1, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 1, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1486036269,I_kwDOBm6k_c5Ykx0t,1941,Mechanism for supporting key rotation for DATASETTE_SECRET,9599,simonw,open,0,,,,,1,2022-12-09T05:24:53Z,2022-12-09T05:25:20Z,,OWNER,,"Currently if you change `DATASETTE_SECRET` all existing signed tokens - both cookies and API tokens and potentially other things too - will instantly expire. Adding support for key rotation would allow keys to be rotated on a semi-regular basis without logging everyone out / invalidating every API token instantly. 
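For what it's worth, itsdangerous - which Datasette already uses for signing - supports this natively as of 2.0: `secret_key` may be a list of keys, where the newest (last) key is used for signing and every key in the list is accepted when verifying. A minimal sketch:

```python
from itsdangerous import URLSafeSerializer

# Oldest key first, newest last; new signatures use the last key
# while anything signed with an older key still verifies.
serializer = URLSafeSerializer([""old-secret"", ""new-secret""])
token = serializer.dumps({""id"": ""root""})
assert serializer.loads(token) == {""id"": ""root""}
```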
Can model this on how Django does it: https://github.com/django/django/commit/0dcd549bbe36c060f536ec270d34d9e7d4b8e6c7",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1941/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1485757511,I_kwDOBm6k_c5YjtxH,1939,register_permissions(datasette) plugin hook,9599,simonw,closed,0,,,8711695, Datasette 1.0a2,20,2022-12-09T01:33:25Z,2022-12-13T02:07:50Z,2022-12-13T02:05:56Z,OWNER,,"A plugin hook that adds more named permissions to the list, which is initially populated here: https://github.com/simonw/datasette/blob/e539c1c024bc62d88df91d9107cbe37e7f0fe55f/datasette/permissions.py#L1-L19 Originally imagined this hook in this comment: - https://github.com/simonw/datasette/issues/1881#issuecomment-1301639370 I need this for a few reasons: - https://github.com/simonw/datasette/issues/1636 - Needs it in order to validate that permissions defined in `metadata.json` are set in the right place (don't set an instance permission at table level, for example) - https://github.com/simonw/datasette/issues/1855 - Needs it to be able to register additional abbreviations for use in signed cookies - And for validation when you use `datasette create-token` and pass in extra permissions - The https://latest.datasette.io/-/permissions debug interface needs it to add extra debug options to the `
a/b.c-dcabca/b.c-dc
> >>> db[""blah""].columns > [Column(cid=0, name='id', type='INTEGER', notnull=0, default_value=None, is_pk=0), Column(cid=1, name='name', type='TEXT', notnull=1, default_value=""'bob'"", is_pk=0)] > ``` > Note how a default value of the Python string `bob` is represented in the results of `PRAGMA table_info()` as `default_value=""'bob'""` - it's got single quotes added to it! > > So comparing default values from introspecting the database needs me to first parse that syntax. This may require a new table introspection method. _Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/468#issuecomment-1229279539_",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/475/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1353088849,I_kwDOBm6k_c5Qpn9R,1795,Consider automatically cleaning up curly quotes in searches,9599,simonw,open,0,,,,,0,2022-08-27T16:35:25Z,2022-08-27T16:35:25Z,,OWNER,,"If your phone helpfully adds curly quotes for you then phrase searches against FTS won't work: “Rebecca Sugar” In regular (not `?_searchmode=raw` search mode Datasette could clean these up for you to help avoid that mistake.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1795/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1353074021,I_kwDOCGYnMM5QpkVl,474,Add an option for specifying column names when inserting CSV data,14294,hubgit,open,0,,,,,3,2022-08-27T15:29:59Z,2022-08-31T03:42:36Z,,NONE,,"https://sqlite-utils.datasette.io/en/stable/cli.html#csv-files-without-a-header-row > The first row of any CSV or TSV file is expected to contain the names of the columns in that file. > If your file does not include this row, you can use the `--no-headers` option to specify that the tool should not use that fist row as headers. > If you do this, the table will be created with column names called `untitled_1` and `untitled_2` and so on. You can then rename them using the `sqlite-utils transform ... --rename` command. It would be nice to be able to specify the column names when importing CSV/TSV without a header row, via an extra command line option. 
(renaming a column of a large table can take a long time, which makes it an inconvenient workaround)",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/474/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1352946135,I_kwDOCGYnMM5QpFHX,472,Reuse the locals/globals fix from --functions for other code accepting options,9599,simonw,closed,0,,,8355157,3.29,2,2022-08-27T05:12:05Z,2022-08-27T05:20:12Z,2022-08-27T05:20:12Z,OWNER,,"I figured out a workaround for the ugly `global x` hack here: - https://github.com/simonw/sqlite-utils/issues/471#issuecomment-1229120653",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/472/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1352932716,I_kwDOCGYnMM5QpB1s,471,sqlite-utils query --functions mechanism for registering extra functions,9599,simonw,closed,0,,,8355157,3.29,12,2022-08-27T03:57:53Z,2022-09-07T03:46:26Z,2022-08-27T05:10:57Z,OWNER,,"It would be really cool if you could register additional custom SQL functions for use with the `sqlite-utils query` command - something like this: ``` sqlite-utils data.db 'update images set domain = extract_domain(url)' --functions ' from urllib.parse import urlparse def extract_domain(url): return urlparse(url).netloc ' ``` Every function defined in that code block would be registered with the connection, unless the name began with an underscore.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/471/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1352932038,I_kwDOCGYnMM5QpBrG,470,Upgrade `--load-extension` to accept entrypoints like Datasette,9599,simonw,closed,0,,,8355157,3.29,6,2022-08-27T03:53:20Z,2022-08-27T05:55:49Z,2022-08-27T05:55:48Z,OWNER,,"Imitate: - https://github.com/simonw/datasette/pull/1789 ``` # would load default entrypoint like before datasette data.db --load-extension ext # loads the extensions with the ""sqlite3_foo_init"" entrypoint datasette data.db --load-extension ext:sqlite3_foo_init # loads the extensions with the ""sqlite3_bar_init"" entrypoint datasette data.db --load-extension ext:sqlite3_bar_init ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/470/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1352931464,I_kwDOCGYnMM5QpBiI,469,sqlite-utils rows --order option,9599,simonw,closed,0,,,8355157,3.29,1,2022-08-27T03:49:51Z,2022-08-27T04:30:49Z,2022-08-27T04:10:32Z,OWNER,,"For consistency with `search`: https://sqlite-utils.datasette.io/en/stable/cli-reference.html#search ``` -o, --order TEXT Order by ('column' or 'column desc') ``` I wanted to run `sqlite-utils rows db.db mytable --order 'rowid desc'` to see the most recently imported rows.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/469/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1348169997,I_kwDOCGYnMM5QW3EN,467,Mechanism for ensuring a table has all the
columns,9599,simonw,closed,0,,,8355157,3.29,13,2022-08-23T15:50:23Z,2022-08-27T23:19:41Z,2022-08-27T23:17:56Z,OWNER,,Suggested by @jefftriplett on Discord: https://discord.com/channels/823971286308356157/997738192360964156/1011655389063958600,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/467/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1347717749,I_kwDOBm6k_c5QVIp1,1791,Updating metadata.json on Datasette for MacOS,1780782,ment4list,open,0,,,,,1,2022-08-23T10:41:16Z,2022-08-23T13:29:51Z,,NONE,,"I've installed Datasette for Mac as per [the documentation](https://docs.datasette.io/en/stable/installation.html#datasette-desktop-for-mac) and it's working great! However, I'm not sure how to go about adding something like ""[Canned Queries](https://docs.datasette.io/en/stable/sql_queries.html#canned-queries)"" or utilising other advanced features or settings by manipulating the `metadata.json` or `settings.json` files. I can view these files from the Datasette App from the top right ""burger"" menu but it only shows the contents of the file with no way to edit or change it. Am I missing something? Where can I update the `metadata.json` file using the MacOS App? PS: This is a fantastic tool! Thanks so much for all the effort and especially adding a bunch of different ways to get started quickly!",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1791/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1345561209,I_kwDOBm6k_c5QM6J5,1790,A better HTML title for canned query pages,9599,simonw,open,0,,,,,0,2022-08-21T18:27:46Z,2022-08-21T18:27:46Z,,OWNER,,"https://scotrail.datasette.io/scotrail/assemble_sentence?terms=This+train+is+formed+of%2Cbomb+which Current title is: `scotrail: with phrases as ( select key, value from json_each('["' || replace(:terms, ',', '","') || '"]')),matches as (select phrases.key, phrases.value, ( select File from announcements where announcements.Transcription like '%' || trim(phrases.value) || '%' order by length(announcements.Transcription) limit 1 ) as Filefrom phrases),results as ( select key, announcements.Transcription, announcements.mp3 from announcements join matches on announcements.File = matches.File order by key)select 'Combined sentence:' as mp3, group_concat(Transcription, ' ') as Transcription, -1 as keyfrom results unionselect mp3, Transcription, keyfrom resultsorder by key` I think a better title would be: `scotrail: assemble_sentence, terms = This train is formed of,bomb which`",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1790/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1345452427,I_kwDODLZ_YM5QMfmL,11,"-a option is used for ""--auth"" and for ""--all""",2467,fernand0,closed,0,,,,,3,2022-08-21T10:50:48Z,2022-08-21T21:11:57Z,2022-08-21T21:11:57Z,NONE,,"I'm not sure which option is best, instead of -a -all.",213286752,pocket-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/pocket-to-sqlite/issues/11/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1343732788,I_kwDOBm6k_c5QF7w0,1788,Make 
it more obvious that Datasette publish can publish multiple databases,9599,simonw,closed,0,,,,,0,2022-08-18T22:57:51Z,2022-08-18T23:06:16Z,2022-08-18T23:06:16Z,OWNER,,Feedback initially for `datasette-publish-fly` but it applies to the others too.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1788/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1343422749,I_kwDOBm6k_c5QEwEd,1787,"Move ""datasette --get"" from Getting Started to CLI Reference",9599,simonw,closed,0,,,,,5,2022-08-18T17:53:39Z,2022-08-18T21:57:09Z,2022-08-18T21:56:21Z,OWNER,,It really shouldn't be here: https://docs.datasette.io/en/0.62/getting_started.html#datasette-get,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1787/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1342430983,I_kwDOBm6k_c5QA98H,1786,Adjust height of textarea for no JS case,9599,simonw,closed,0,,,,,4,2022-08-18T01:15:15Z,2022-10-27T21:50:12Z,2022-08-18T16:06:09Z,OWNER,,"Datasette Lite: https://lite.datasette.io/?sql=https://gist.githubusercontent.com/simonw/1f8a91123ccefd8844187225b1832d7a/raw/5069075b86aa79358fbab3d4482d1d269077d632/recipes.sql#/data?sql=select+id%2C+name%2C+ingredients%2C+%28%0A++select+json_group_array%28value%29+from+json_each%28ingredients%29%0A++where+value+in+%28select+value+from+json_each%28%3Ap0%29%29%0A%29+as+matching_ingredients%0Afrom+recipes%0Awhere+json_array_length%28matching_ingredients%29+%3E+0%0Aorder+by+json_array_length%28matching_ingredients%29+desc&p0=%5B%22sugar%22%2C+%22cheese%22%5D ![46F8101E-8CE3-4F61-B200-F865E6B5DBCC](https://user-images.githubusercontent.com/9599/185270723-f55513b0-b561-434d-9d7c-4fe5be9756e0.jpeg) ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1786/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1340900019,I_kwDOBm6k_c5P7IKz,1785,Can't use cog menu to facet by first column in a view,9599,simonw,open,0,,,,,0,2022-08-16T21:27:23Z,2022-08-16T21:27:23Z,,OWNER,,"https://latest.datasette.io/fixtures/paginated_view Compare with: ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1785/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1339663518,I_kwDOBm6k_c5P2aSe,1784,"Include ""entrypoint"" option on `--load-extension`?",15178711,asg017,closed,0,,,,,2,2022-08-16T00:22:57Z,2022-08-23T18:34:31Z,2022-08-23T18:34:31Z,CONTRIBUTOR,,"## Problem SQLite extensions have the option to define multiple ""entrypoints"" in each loadable extension. For example, the upcoming version of `sqlite-lines` will have 2 entrypoints: the default `sqlite3_lines_init` (which SQLite will automatically guess for) and `sqlite3_lines_noread_init`. The `sqlite3_lines_noread_init` version omits functions that read from the filesystem, which is necessary for security purposes when running untrusted SQL (which Datasette does). (Similar multiple entrypoints will also be added for sqlite-http). The `--load-extension` flag, however, doesn't give the option to specify a different entrypoint, so the default one is always used. 
## Proposal I want there to be a new command line option for the `--load-extension` flag to specify a custom entrypoint like so: ``` datasette my.db \ --load-extension ./lines0 sqlite3_lines0_noread_init ``` Then, under the hood, this line of code: https://github.com/simonw/datasette/blob/7af67b54b7d9bca43e948510fc62f6db2b748fa8/datasette/app.py#L562 Would look something like this: ```python conn.execute(""SELECT load_extension(?, ?)"", [extension, entrypoint]) ``` One potential problem: For backward compatibility, I'm not sure if Click allows cli flags to have a variable number of options (""arity""). So I guess it could also use a `:` delimiter like `--static`: ``` datasette my.db \ --load-extension ./lines0:sqlite3_lines0_noread_init ``` Or maybe even a new flag name? ``` datasette my.db \ --load-extension-entrypoint ./lines0 sqlite3_lines0_noread_init ``` Personally I prefer the `:` option... and maybe even `--load-extension` -> `--load`? Definitely out of scope for this issue tho ``` datasette my.db \ --load ./lines0:sqlite3_lines0_noread_init ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1784/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1339444565,I_kwDOBm6k_c5P1k1V,1783,Better guidance as to what to do after you've installed Datasette,9599,simonw,open,0,,,,,2,2022-08-15T20:11:06Z,2022-08-15T20:14:01Z,,OWNER,,"Feedback [from Discord](https://discord.com/channels/823971286308356157/823971286941302908/1008822978793984060): > hello, love the project and came for help and to point out a possible gap in the docs. starting with ""getting started"" and ""installation"" every thing looks great, but then there's a giant leap after you have it installed and running. from the user perspective of ""i have a csv of set of csvs that i want to turn into a table(s), what do i do next?"" --- so something like maybe a page for creating your first project should go after ""installation"".
- https://docs.datasette.io/en/0.62/getting_started.html - https://docs.datasette.io/en/0.62/installation.html",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1783/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1338278056,I_kwDOBm6k_c5PxICo,1782,Release notes for Datasette 0.62,9599,simonw,closed,0,,,8303187,Datasette 0.62,2,2022-08-14T15:26:45Z,2022-08-14T17:40:45Z,2022-08-14T17:32:54Z,OWNER,,"I've written a lot of these already for the alphas: - https://github.com/simonw/datasette/releases/tag/0.62a0 - https://github.com/simonw/datasette/releases/tag/0.62a1",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1782/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1338137350,I_kwDOBm6k_c5PwlsG,1781,Ensure Datasette Lite is promoted in docs and README,9599,simonw,closed,0,,,8303187,Datasette 0.62,1,2022-08-14T05:12:35Z,2022-08-14T15:24:40Z,2022-08-14T15:24:40Z,OWNER,,As of 0.62 https://lite.datasette.io is a supported piece of the overall Datasette ecosystem.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1781/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1338001039,I_kwDOCGYnMM5PwEaP,464,Link from documentation to source code,9599,simonw,closed,0,,,,,5,2022-08-13T16:19:57Z,2022-08-17T23:38:03Z,2022-08-17T23:38:03Z,OWNER,,Twitter conversation asking for ways to automate this here: https://twitter.com/simonw/status/1558260492015046656,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/464/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1337541526,I_kwDOBm6k_c5PuUOW,1780,`facet_time_limit_ms` and `sql_time_limit_ms` overlap?,53165,davepeck,open,0,,,,,1,2022-08-12T17:55:37Z,2022-08-15T23:50:08Z,,NONE,,"I needed more than the default 200ms to facet a specific column in a database I was working with, so I ran `datasette` with `--setting facet_time_limit_ms 30000` — definitely overkill! But it still didn't work; it took a moment to realize I also needed to up my `sql_time_limit_ms` to something larger too. I'm happy to submit a PR that documents this behavior if it's helpful. Or, if there's a code change we'd like to make (like making sure `sql_time_limit_ms` is always set to the larger of itself and `facet_time_limit_ms`), happy to do that too. Apologies if I missed this somewhere in the docs. And: thanks. 
I'm really enjoying the simple, effective tooling datasette gives me out of the box for exploring my databases!",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1780/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1334628400,I_kwDOBm6k_c5PjNAw,1779,google cloudrun updated their limits on maxscale based on memory and cpu count,536941,fgregg,closed,0,,,8303187,Datasette 0.62,13,2022-08-10T13:27:21Z,2022-08-14T19:42:59Z,2022-08-14T17:07:34Z,CONTRIBUTOR,,"if you don't set an explicit limit on container scaling, then [google defaults to 100](https://cloud.google.com/run/docs/configuring/max-instances#limits) google recently updated the [limits on container scaling](https://cloud.google.com/run/docs/configuring/max-instances#limits), such that if you set up datasette to use more memory or cpu, then you need to set the maxScale argument much smaller than 100. would be nice if `datasette publish` could do this math for you and set the right maxScale. [Log of a failing publish run](https://github.com/labordata/warehouse/runs/7764725972?check_suite_focus=true#step:8:332). ``` ERROR: (gcloud.run.deploy) spec.template.spec.containers[0].resources.limits.cpu: Invalid value specified for cpu. For the specified value, maxScale may not exceed 15. Consider running your workload in a region with greater capacity, decreasing your requested cpu-per-instance, or requesting an increase in quota for this region if you are seeing sustained usage near this limit, see https://cloud.google.com/run/quotas. Your project may gain access to further scaling by adding billing information to your account. Traceback (most recent call last): File ""/home/runner/.local/bin/datasette"", line 8, in sys.exit(cli()) File ""/home/runner/.local/lib/python3.8/site-packages/click/core.py"", line 1128, in __call__ return self.main(*args, **kwargs) File ""/home/runner/.local/lib/python3.8/site-packages/click/core.py"", line 1053, in main rv = self.invoke(ctx) File ""/home/runner/.local/lib/python3.8/site-packages/click/core.py"", line 1659, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/home/runner/.local/lib/python3.8/site-packages/click/core.py"", line 1659, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/home/runner/.local/lib/python3.8/site-packages/click/core.py"", line 1395, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/home/runner/.local/lib/python3.8/site-packages/click/core.py"", line 754, in invoke return __callback(*args, **kwargs) File ""/home/runner/.local/lib/python3.8/site-packages/datasette/publish/cloudrun.py"", line 160, in cloudrun check_call( File ""/usr/lib/python3.8/subprocess.py"", line 364, in check_call raise CalledProcessError(retcode, cmd) subprocess.CalledProcessError: Command 'gcloud run deploy --allow-unauthenticated --platform=managed --image gcr.io/labordata/datasette warehouse --memory 8Gi --cpu 2' returned non-zero exit status 1.
```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1779/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1326349129,I_kwDOCGYnMM5PDntJ,461,Consider including animated SVG console demos,9599,simonw,open,0,,,,,1,2022-08-02T20:10:04Z,2022-08-02T20:12:14Z,,OWNER,,"I recorded this one using https://github.com/nbedos/termtosvg - with `pipx install termtosvg` and then `termtosvg` - execute demo - `exit` to save. ![sqlite-utils-insert-json](https://user-images.githubusercontent.com/9599/182464206-f4976af4-eda8-4020-8257-4ada1867fb44.svg) ```json [ { ""id"": 1, ""name"": ""Catimus"" }, { ""id"": 2, ""name"": ""Feliopia"" } ] ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/461/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1324659241,I_kwDOCGYnMM5O9LIp,459,Single quoted transform recipes on Windows do not work as expected ,19921,shakeel,open,0,,,,,0,2022-08-01T16:14:54Z,2022-08-01T16:14:54Z,,CONTRIBUTOR,,"Trying to follow the tutorial for sqlite-utils and datasette https://datasette.io/tutorials/clean-data on Windows 11 OS `Microsoft Windows [Version 10.0.22622.440]`, with sqlite-utils and datasette installed using pipx. ``` pipx list package datasette 0.61.1, installed using Python 3.10.4 - datasette.exe package sqlite-utils 3.28, installed using Python 3.10.4 - sqlite-utils.exe ``` In the step to transform dates into ISO dates the quoted value `'r.parsedatetime(value)'` is copied verbatim into the columns instead of applying the output of the Python recipe. ``` sqlite-utils convert manatees.db locations \ REPDATE created_date last_edited_date \ 'r.parsedatetime(value)' --dry-run 1975/01/31 00:00:00+00 --- becomes: r.parsedatetime(value) Would affect 13568 rows ``` However, if I change the code from single quotes to double quotes, it works as expected. 
``` sqlite-utils convert manatees.db locations \ REPDATE created_date last_edited_date \ ""r.parsedatetime(value)"" --dry-run 1975/01/31 00:00:00+00 --- becomes: 1975-01-31T00:00:00+00:00 Would affect 13568 rows ``` Specifying the transform code recipe should work with single quotes on Windows.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/459/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1323346408,I_kwDOBm6k_c5O4Kno,1775,i18n support,428820,johnfelipe,open,0,,,,,9,2022-07-31T02:51:04Z,2023-02-10T18:04:40Z,,NONE,,"I want to contribute translations of the UI to es, de and it if you share the strings",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1775/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1323332006,I_kwDOBm6k_c5O4HGm,1774,Request of feature for mongo,428820,johnfelipe,open,0,,,,,0,2022-07-31T01:00:05Z,2022-07-31T01:00:05Z,,NONE,,Would love it if we could use datasette for mongo and all its pipelines and workflows,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1774/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1320243134,I_kwDOCGYnMM5OsU--,458,Support custom names for registered functions,9599,simonw,closed,0,,,8355157,3.29,1,2022-07-28T00:13:00Z,2022-08-27T03:56:01Z,2022-07-28T00:13:57Z,OWNER,,"In this example: ```python @db.register_function def reverse_string(s): return """".join(reversed(list(s))) print(db.execute('select reverse_string(""hello"")').fetchone()[0]) ``` There's currently no way to over-ride the automatically selected name for the SQL function.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/458/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1318907685,I_kwDOBm6k_c5OnO8l,1773,500 error if sorted by a column not in the ?_col= list,9599,simonw,closed,0,,,8303187,Datasette 0.62,4,2022-07-27T01:20:27Z,2022-08-14T16:06:25Z,2022-08-14T15:44:05Z,OWNER,,"For example: https://latest.datasette.io/fixtures/sortable?_sort_desc=sortable&_col=sortable_with_nulls That's `?_sort_desc=sortable&_col=sortable_with_nulls` ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1773/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1310243385,I_kwDOCGYnMM5OGLo5,456,feature request: pivot command,536941,fgregg,open,0,,,,,5,2022-07-20T00:58:08Z,2022-07-20T17:50:50Z,,CONTRIBUTOR,,pivoting a long-format table to a wide-format table is pretty common and kind of a pain (see the SQL sketch below).
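For context, a sketch of the manual workaround in plain SQL, using conditional aggregation (table and column names here are illustrative):

```sql
-- long format: sales(region, quarter, amount)
-- wide format: one row per region, one column per quarter
select region,
       sum(case when quarter = 'Q1' then amount end) as q1,
       sum(case when quarter = 'Q2' then amount end) as q2
from sales
group by region;
```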
would love to see this feature in sqlite-utils!,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/456/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1308461063,I_kwDODFdgUs5N_YgH,74,500 error in github-to-sqlite demo,9599,simonw,closed,0,,,,,5,2022-07-18T19:39:32Z,2022-07-18T21:16:18Z,2022-07-18T21:14:22Z,MEMBER,,"https://github-to-sqlite.dogsheep.net/github/issue_comments throws a 500: > `cannot import name 'etree' from 'markdown.util' (/usr/local/lib/python3.8/site-packages/markdown/util.py)` https://console.cloud.google.com/run/detail/us-central1/github-to-sqlite/metrics?project=datasette-222320 suggests this started happening 3 days ago.",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/74/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1306984363,I_kwDOBm6k_c5N5v-r,1771,minor a11y:
``` The track in the first row has an album, the second track doesn't. Now I extract album information into a separate table: ```ipython In [4]: db[""listens""].extract(columns=[""album_title""], table=""albums"", fk_column=""album_id"") Out[4]:
In [5]: list(db[""albums""].rows) Out[5]: [{'id': 1, 'album_title': 'bar'}, {'id': 2, 'album_title': None}] In [6]: list(db[""listens""].rows) Out[6]: [{'id': 1, 'track_title': 'foo', 'album_id': 1}, {'id': 2, 'track_title': 'baz', 'album_id': None}] ``` This behaves as expected -- the `albums` table contains entries for both the existing album and the NULL album. The `listens` table has a foreign key only for the first row (since the album in the second row was empty). Now I want to extract the track information as well. Album information belongs to the track so I want to extract both columns to a new table. ```ipython In [7]: db[""listens""].extract(columns=[""track_title"", ""album_id""], table=""tracks"", fk_column=""track_id"") Out[7]:
In [8]: list(db[""tracks""].rows) Out[8]: [{'id': 1, 'track_title': 'foo', 'album_id': 1}, {'id': 2, 'track_title': 'baz', 'album_id': None}] In [9]: list(db[""listens""].rows) Out[9]: [{'id': 1, 'track_id': 1}, {'id': 2, 'track_id': None}] ``` Extracting to the `tracks` table worked fine (both tracks are present with correct columns). However, the `listens` table only has a foreign key to the newly created tracks for the first row, the foreign key in the second row is NULL. Changing the order of extracts doesn't help. I poked around in the source a bit and I believe [this line](https://github.com/simonw/sqlite-utils/blob/433813612ff9b4b501739fd7543bef0040dd51fe/sqlite_utils/db.py#L1737) (essentially comparing `NULL = NULL`) is the problem, but I don't know enough about SQL to create a reliable fix myself.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/423/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1198822563,I_kwDOBm6k_c5HdJSj,1706,"[feature] immutable mode for a directory, not just individual sqlite file",9020979,hydrosquall,open,0,,,,,4,2022-04-10T00:50:57Z,2022-12-09T19:11:40Z,,CONTRIBUTOR,,"## Motivation - I have a directory of sqlite databases - I'd like to use immutable mode when opening them for better performance [docs](https://docs.datasette.io/en/0.54/performance.html#immutable-mode) - Currently using this flag throws the following error IsADirectoryError: [Errno 21] Is a directory: '/name-of-directory' ## Proposal Immutable flag works for both single files and directories datasette -i /folder-of-sqlite-files",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1706/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1197926598,I_kwDOBm6k_c5HZujG,1705,How to upgrade your plugin for 1.0 documentation,9599,simonw,open,0,,,8755003,Datasette 1.0a-next,1,2022-04-08T23:16:47Z,2022-12-13T05:29:05Z,,OWNER,,"Among other things, needed by: - #1704",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1705/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1197925865,I_kwDOBm6k_c5HZuXp,1704,File PRs against incompatible plugins pinning to datasette<1.0,9599,simonw,open,0,,,3268330,Datasette 1.0,0,2022-04-08T23:15:30Z,2022-04-08T23:15:30Z,,OWNER,,"As part of the preparation for the 1.0 release, test all existing known plugins against the alpha. For any that break, submit a PR suggesting they pin to a version <1.0 - and include a link to the documentation on how to upgrade the plugin for 1.0.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1704/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1196327155,I_kwDOBm6k_c5HToDz,1702,Be more consistent with column quoting,9599,simonw,open,0,,,,,0,2022-04-07T16:59:20Z,2022-04-07T16:59:20Z,,OWNER,,"This tutorial made me notice that Datasette is pretty inconsistent with how column quoting works: https://datasette.io/tutorials/learn-sql It has examples of each of `""table_name""` and `[table_name]` and `table_name`, and it uses single quoted values too. 
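For reference, SQLite accepts all of these spellings for the same table identifier, while single quotes always mean a string value:

```sql
select * from ""table_name"";  -- standard SQL identifier quoting
select * from [table_name];  -- SQL Server style, also accepted by SQLite
select * from table_name;    -- bare identifier
select * from table_name where name = 'single quotes are a string';
```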
Datasette should generate SQL as consistently as possible to support learners. That tutorial should also provide a tiny bit of extra information about what's going on here.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1702/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1194790504,I_kwDOBm6k_c5HNw5o,1701,Use + for spaces instead of ~20,9599,simonw,closed,0,,,3268330,Datasette 1.0,0,2022-04-06T15:40:48Z,2022-04-06T15:55:10Z,2022-04-06T15:55:05Z,OWNER,,"Tilde encoding introduced in #1657 means that database files with spaces in the name - e.g. the Apple Mail `Envelope Index` database - end up with URLs like this: http://127.0.0.1:8001/Envelope~20Index I think this would be prettier: http://127.0.0.1:9933/Envelope+Index",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1701/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1193090967,I_kwDOBm6k_c5HHR-X,1699,Proposal: datasette query,25778,eyeseast,open,0,,,,,6,2022-04-05T12:36:43Z,2022-04-11T01:32:12Z,,CONTRIBUTOR,,"I started sketching out a plugin to add a `datasette query` subcommand to export data from the command line. This is based on discussions in #1356 and #1605. Before I get too far down this rabbit hole, I figure it's worth getting some feedback here (unless this should happen in `Discussions`). Here's what I'm thinking: At its most basic, it will write the results of a query to STDOUT. ```sh datasette query -d data.db 'select * from data' > results.json ``` This isn't much of an improvement over using [sqlite-utils](https://github.com/simonw/sqlite-utils). To make better use of datasette and its ecosystem, run `datasette query` using a canned query defined in a `metadata.yml` file. For example, using the metadata file from [alltheplaces-datasette](https://github.com/eyeseast/alltheplaces-datasette/blob/main/metadata.yml): ```sh cd alltheplaces-datasette datasette query -d alltheplaces.db -m metadata.yml count_by_spider ``` That query would be good to get as CSV, and we can auto-discover metadata and databases in the current directory: ```sh cd alltheplaces-datasette datasette query count_by_spider -f csv ``` In this case, `count_by_spider` is a canned query defined on the `alltheplaces` database. If the same query is defined on multiple databases or it's otherwise unclear which database `query` should use, pass the `-d` or `--database` option. If a query takes parameters, I can pass them in at runtime, using the `--param` or `-p` option: ```sh datasette query -d data.db -p value something 'select * from neighborhoods where some_column = :value' ``` I'm very interested in feedback on this, including whether it should be a plugin or in Datasette core.
(I don't have a strong opinion about this, but I'm prototyping it as a plugin to start.)",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1699/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1190828163,I_kwDOBm6k_c5G-piD,1698,Add a warning about bots and Cloud Run,9599,simonw,closed,0,,,,,1,2022-04-03T05:57:17Z,2022-04-03T06:10:24Z,2022-04-03T06:10:24Z,OWNER,,Recommend the https://github.com/simonw/datasette-block-robots plugin if you are going to run a large database in Cloud Run (one with a lot of rows).,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1698/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1189113609,I_kwDOBm6k_c5G4G8J,1697,"`Request.fake(..., url_vars={})`",9599,simonw,closed,0,,,3268330,Datasette 1.0,1,2022-04-01T01:48:40Z,2022-04-01T02:02:18Z,2022-04-01T02:02:10Z,OWNER,,"I just created an alternative `.fake()` method because I wanted to fake the `url_vars` captured in the route as well: ```python from datasette.utils.asgi import Request class Request(Request): @classmethod def fake(cls, path_with_query_string, method=""GET"", scheme=""http"", url_vars=None): """"""Useful for constructing Request objects for tests"""""" path, _, query_string = path_with_query_string.partition(""?"") scope = { ""http_version"": ""1.1"", ""method"": method, ""path"": path, ""raw_path"": path_with_query_string.encode(""latin-1""), ""query_string"": query_string.encode(""latin-1""), ""scheme"": scheme, ""type"": ""http"", } if url_vars: scope[""url_route""] = { ""kwargs"": url_vars } return cls(scope, None) ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1697/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1186696202,I_kwDOBm6k_c5Gu4wK,1696,Show foreign key label when filtering,9599,simonw,open,0,,,,,2,2022-03-30T16:18:54Z,2023-01-29T20:56:20Z,,OWNER,,"For example here: 3 corresponds to ""Human Related: Other"" - it would be neat to display this in this area of the page somehow.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1696/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1185868354,I_kwDOBm6k_c5GrupC,1695,Option to un-filter facet not shown for `?col__exact=value`,9599,simonw,open,0,,,,,2,2022-03-30T04:44:02Z,2022-03-30T04:46:18Z,,OWNER,,"Spotted this on a page with `COUNTY__exact=Lee` in the URL: ![CleanShot 2022-03-29 at 21 41 46@2x](https://user-images.githubusercontent.com/9599/160752849-a9039343-3770-4655-920b-f19e25687a57.png) With `COUNTY=Lee` you get this instead: ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1695/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1182227211,I_kwDOBm6k_c5Gd1sL,1692,[plugins][feature request]: Support additional script tag attributes when loading custom JS,9020979,hydrosquall,open,0,,,,,2,2022-03-27T01:16:03Z,2022-03-30T06:14:51Z,,CONTRIBUTOR,,"## Motivation - The build system for my new 
[plugin](https://github.com/hydrosquall/datasette-nteract-data-explorer) has two output JS files, one for browsers that support ES modules, one for browsers that don't. At present, I'm only passing one of them into Datasette. - I'd like to specify the non-es-module script as a fallback for older browsers. I don't want to load it by default, because browsers will only need one, and it's heavy, so for now I'm only supporting modern browsers. To be able to support legacy browsers without slowing down users with modern browsers, I would like to be able to set additional HTML attributes on the fallback script tag, `nomodule` and `defer`. My injected scripts should look something like this: ```html <script type=""module"" src=""/index.my-es-module-bundle.js""></script> <script src=""/index.my-legacy-fallback-bundle.js"" nomodule defer></script> ``` ## Proposal To achieve this, I propose additional optional properties to the API accepted by the `extra_js_urls` hook and custom JS field in the `metadata.json` [described here](https://docs.datasette.io/en/stable/custom_templates.html#custom-css-and-javascript). Under this API, I'd write something like this to get the above HTML rendered in Datasette. ```json { ""extra_js_urls"": [ { ""url"": ""/index.my-es-module-bundle.js"", ""module"": true, }, { ""url"": ""/index.my-legacy-fallback-bundle.js"", ""nomodule"": """", ""defer"": true } ] } ``` ## Resources - [MDN on the script tag](https://developer.mozilla.org/en-US/docs/Web/HTML/Element/script) - There may be other properties that could be added that are potentially valuable, like `async` or `referrerpolicy`, but I don't have an immediate need for those.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1692/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1182143895,I_kwDOBm6k_c5GdhWX,1691,Bug in pytest-httpx example,9599,simonw,closed,0,,,,,0,2022-03-26T22:45:30Z,2022-03-26T22:46:09Z,2022-03-26T22:46:09Z,OWNER,,"https://docs.datasette.io/en/0.61.1/testing_plugins.html#testing-outbound-http-calls-with-pytest-httpx says: ```python async def test_outbound_http_call(httpx_mock): httpx_mock.add_response( url='https://www.example.com/', data='Hello world', ) ``` That's wrong - `data=` should be `text=`.
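The corrected version of that example would be:

```python
async def test_outbound_http_call(httpx_mock):
    httpx_mock.add_response(
        url='https://www.example.com/',
        text='Hello world',
    )
```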
https://github.com/Colin-b/pytest_httpx/blob/v0.20.0/README.md#reply-with-custom-body",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1691/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1182141761,I_kwDOBm6k_c5Gdg1B,1690,"Idea: `datasette.set_actor_cookie(response, actor)`",9599,simonw,open,0,,,,,2,2022-03-26T22:41:52Z,2022-03-26T22:43:00Z,,OWNER,,"I just wrote this code in a plugin and it felt like it could benefit from an abstraction: https://github.com/simonw/datasette-auth0/blob/152e6eb21e96e9b73bd9c205f9749a1297d0ef0b/datasette_auth0/__init__.py#L79-L92 ```python redirect_response = Response.redirect(""/"") expires_at = int(time.time()) + (24 * 60 * 60) redirect_response.set_cookie( ""ds_actor"", datasette.sign( { ""a"": profile_response.json(), ""e"": baseconv.base62.encode(expires_at), }, ""actor"", ), ) return redirect_response ``` ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1690/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1182065616,I_kwDOBm6k_c5GdOPQ,1689,datasette.add_message() documentation is incorrect,9599,simonw,closed,0,,,,,1,2022-03-26T20:49:42Z,2022-03-26T21:35:57Z,2022-03-26T20:51:21Z,OWNER,,"https://docs.datasette.io/en/0.61.1/internals.html#add-message-request-message-message-type-datasette-info says: `.add_message(request, message, message_type=datasette.INFO)` But in the code it's: https://github.com/simonw/datasette/blob/6b99e4a66ba0ed8fca8ee41ceb7206928b60d5d1/datasette/app.py#L582",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1689/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1181432624,I_kwDOBm6k_c5Gazsw,1688,[plugins][documentation] Is it possible to serve per-plugin static folders when writing one-off (single file) plugins?,9020979,hydrosquall,closed,0,,,,,3,2022-03-26T01:17:44Z,2022-03-27T01:01:14Z,2022-03-26T21:34:47Z,CONTRIBUTOR,,"I'm trying to make a small plugin that depends on static assets, by following the guide [here](https://docs.datasette.io/en/stable/writing_plugins.html#writing-one-off-plugins). I made a `plugins/` directory with `datasette_nteract_data_explorer.py`. I am trying to follow the example of `datasette_vega` and serve static assets. I created a `statics/` directory within `plugins/` to serve my JS and CSS. https://github.com/simonw/datasette-vega/blob/00de059ab1ef77394ba9f9547abfacf966c479c4/datasette_vega/__init__.py#L13 Unfortunately, datasette doesn't seem to be able to find my assets. Input: ```bash datasette ~/Library/Safari/History.db --plugins-dir=plugins/ ``` ![Image 2022-03-25 at 9 18 17 PM](https://user-images.githubusercontent.com/9020979/160218979-a3ff474b-5255-4a76-85d1-6f90ab2e3b44.jpg) Output: ![Image 2022-03-25 at 9 11 00 PM](https://user-images.githubusercontent.com/9020979/160218733-ca5144cf-f23f-43d8-a8d3-e3a871e57f3a.jpg) I suspect this issue might go away if I move away from ""one-off"" plugin mode, but it's been a while since I created a new python package so I'm not sure how much work there is to go between ""one off"" and ""packaged for PyPI"".
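For context, packaged plugins get this for free: Datasette automatically serves a `static/` directory found inside an installed plugin package at `/-/static-plugins/<plugin-package-name>/`. A rough sketch of that layout (the plugin name here is illustrative):

```
datasette-my-plugin/
    setup.py
    datasette_my_plugin/
        __init__.py
        static/
            app.js   # served at /-/static-plugins/datasette-my-plugin/app.js
```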
I'd like to try to avoid needing to repackage a new `tar.gz` file and/or reinstall my library repeatedly when developing new python code. 1. Is there a way to serve static assets when using the `plugins/` directory method instead of installing plugins as a new python package? 2. If not, is there a way I can work on developing a plugin without creating and repackaging tar.gz files after every change, or is that the recommended path? Thanks for your help! ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1688/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1181364043,I_kwDOBm6k_c5Gai9L,1687,Make show_json.html or a similar mechanism stable for plugins,9599,simonw,open,0,,,,,0,2022-03-25T23:42:45Z,2022-03-25T23:42:45Z,,OWNER,,"I used `show_json.html` in the new `datasette-packages` plugin, which means it will break if that template changes: - https://github.com/simonw/datasette-packages/issues/3 It would be useful if it (or something like it) was documented and stable for plugins to use. Also relevant: - #878",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1687/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1181236173,I_kwDOCGYnMM5GaDvN,422,Reconsider not running convert functions against null values,9599,simonw,open,0,,,,,1,2022-03-25T20:22:40Z,2022-03-25T20:23:21Z,,OWNER,,"I just got caught out by the fact that `None` values are not processed by the `.convert()` mechanism https://github.com/simonw/sqlite-utils/blob/0b7b80bd40fe86e4d66a04c9f607d94991c45c0b/sqlite_utils/db.py#L2504-L2510 I had run this code while working on #420 and I wasn't sure why it didn't work: ``` $ sqlite-utils add-column content.db articles score float $ sqlite-utils convert content.db articles score ' import random random.seed(10) def convert(value): global random return random.random() ' ``` The reason it didn't work is that the newly added `score` column was full of `null` values. I fixed it by doing this instead: $ sqlite-utils add-column content.db articles score float --not-null-default 1.0 But this indicates to me that the design of `convert()` here may be incorrect.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/422/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1181037277,I_kwDOBm6k_c5GZTLd,1686,heroku bails if app name specified in datasette publish is the same as existing app,2115933,tlongers,open,0,,,,,0,2022-03-25T17:10:34Z,2022-03-25T17:10:34Z,,NONE,,"It seems that `heroku` does not accept an app overwrite triggered by specifying the app name using `datasette publish`, as below: ``` datasette publish heroku some.db --name ""jazzy-name"" ``` The resulting error has the below traceback: ``` Creating jazzy-name... !
▸ Name jazzy-name is already taken Traceback (most recent call last): File ""/opt/homebrew/bin/datasette"", line 33, in sys.exit(load_entry_point('datasette==0.60.1', 'console_scripts', 'datasette')()) File ""/opt/homebrew/Cellar/datasette/0.60.1/libexec/lib/python3.10/site-packages/click/core.py"", line 1128, in __call__ return self.main(*args, **kwargs) File ""/opt/homebrew/Cellar/datasette/0.60.1/libexec/lib/python3.10/site-packages/click/core.py"", line 1053, in main rv = self.invoke(ctx) File ""/opt/homebrew/Cellar/datasette/0.60.1/libexec/lib/python3.10/site-packages/click/core.py"", line 1659, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/opt/homebrew/Cellar/datasette/0.60.1/libexec/lib/python3.10/site-packages/click/core.py"", line 1659, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/opt/homebrew/Cellar/datasette/0.60.1/libexec/lib/python3.10/site-packages/click/core.py"", line 1395, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/opt/homebrew/Cellar/datasette/0.60.1/libexec/lib/python3.10/site-packages/click/core.py"", line 754, in invoke return __callback(*args, **kwargs) File ""/opt/homebrew/Cellar/datasette/0.60.1/libexec/lib/python3.10/site-packages/datasette/publish/heroku.py"", line 127, in heroku create_output = check_output(cmd).decode(""utf8"") File ""/opt/homebrew/Cellar/python@3.10/3.10.2/Frameworks/Python.framework/Versions/3.10/lib/python3.10/subprocess.py"", line 420, in check_output return run(*popenargs, stdout=PIPE, timeout=timeout, check=True, File ""/opt/homebrew/Cellar/python@3.10/3.10.2/Frameworks/Python.framework/Versions/3.10/lib/python3.10/subprocess.py"", line 524, in run raise CalledProcessError(retcode, process.args, subprocess.CalledProcessError: Command '['heroku', 'apps:create', 'jazzy-name', '--json']' returned non-zero exit status 1. 
``` It's a solid failsafe, but does `datasette publish` have a way to force an overwrite?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1686/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1180427792,I_kwDOCGYnMM5GW-YQ,421,"""Error: near ""("": syntax error"" when using sqlite-utils indexes CLI",24938923,learning4life,closed,0,,,,,8,2022-03-25T07:12:51Z,2022-04-13T22:41:59Z,2022-04-13T22:41:59Z,NONE,,"This bug relates to https://github.com/simonw/sqlite-utils/issues/408#issuecomment-1066139147 **New error when using CLI: ""sqlite-utils indexes global.db --table""** ``` (app-root) sqlite-utils indexes global.db --table Error: near ""("": syntax error (app-root) sqlite-utils --version sqlite-utils, version 3.25.1 (app-root) sqlite3 --version 3.36.0 2021-06-18 18:36:39 (app-root) python --version Python 3.8.11 ``` Dockerfile ``` FROM centos/python-38-centos7 USER root RUN yum update -y RUN yum upgrade -y # epel RUN yum -y install epel-release && yum clean all # SQLite RUN yum -y install zlib-devel geos geos-devel proj proj-devel freexl freexl-devel libxml2-devel WORKDIR /build/ COPY sqlite-autoconf-3360000.tar.gz ./ RUN tar -zxf sqlite-autoconf-3360000.tar.gz WORKDIR /build/sqlite-autoconf-3360000 RUN ./configure RUN make RUN make install # RUN /opt/app-root/bin/python3.8 -m pip install --upgrade pip RUN pip install sqlite-utils ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/421/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1179998071,I_kwDOBm6k_c5GVVd3,1684,Mechanism for disabling faceting on large tables only,9599,simonw,open,0,,,,,1,2022-03-24T20:06:11Z,2022-03-24T20:13:19Z,,OWNER,,"Forest turned off faceting on https://labordata.bunkum.us/ because it was causing performance problems on some of the huge tables - but it would be nice if it could still be an option on smaller tables such as https://labordata.bunkum.us/voluntary_recognitions-4421085/voluntary_recognitions One option: a new setting that automatically disables faceting (and facet suggestion) for tables that have either more than X rows or that are so big that the count could not be completed within the time limit.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1684/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1179928510,I_kwDOBm6k_c5GVEe-,1683,allow_facet: False should be respected by column cog menu,9599,simonw,closed,0,,,,,0,2022-03-24T19:05:06Z,2022-03-24T19:16:36Z,2022-03-24T19:16:36Z,OWNER,,"The column cog menu currently shows ""Facet by this"" even if faceting is disabled for the Datasette instance.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1683/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1178546862,I_kwDOCGYnMM5GPzKu,420,Document how to use a `--convert` function that runs initialization code first,770231,strada,closed,0,,,,,12,2022-03-23T19:07:36Z,2022-08-28T11:34:37Z,2022-03-25T20:07:33Z,NONE,,"When I have an insert command with transform like this: ``` cat items.json | jq '.data' | sqlite-utils insert 
listings.db listings - --convert ' d = enchant.Dict(""en_US"") row[""is_dictionary_word""] = d.check(row[""name""]) ' --import=enchant --ignore ``` I noticed as the number of rows increases the operation becomes quite slow, likely due to the creation of the `d = enchant.Dict(""en_US"")` object for each row. Is there a way to share that instance `d` between transform function calls, like a shared context?",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/420/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1178521513,I_kwDOBm6k_c5GPs-p,1682,SQL queries against databases with different routes are broken,9599,simonw,closed,0,,,,,1,2022-03-23T18:42:57Z,2022-03-23T18:48:16Z,2022-03-23T18:48:16Z,OWNER,,"500 error on https://datasette-hashed-urls-preview.vercel.app/fixtures-09f8f95?sql=select+*+from+facetable Here's the trace: ``` File ""/Users/simon/.local/share/virtualenvs/datasette-hashed-urls-ssI2fO50/lib/python3.10/site-packages/datasette/views/database.py"", line 54, in data return await QueryView(self.ds).data( File ""/Users/simon/.local/share/virtualenvs/datasette-hashed-urls-ssI2fO50/lib/python3.10/site-packages/datasette/views/database.py"", line 232, in data self.ds.get_database(database), sql File ""/Users/simon/.local/share/virtualenvs/datasette-hashed-urls-ssI2fO50/lib/python3.10/site-packages/datasette/app.py"", line 401, in get_database return self.databases[name] KeyError: 'fixtures-aa7318b' ``` It looks like this is a Datasette bug, which is frustrating because I just shipped Datasette 0.61 five minutes ago! _Originally posted by @simonw in https://github.com/simonw/datasette-hashed-urls/issues/13#issuecomment-1076693667_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1682/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1178456794,I_kwDOCGYnMM5GPdLa,418,Add generated files to .gitignore,25778,eyeseast,closed,0,,,,,0,2022-03-23T17:48:12Z,2022-03-24T21:01:44Z,2022-03-24T21:01:44Z,CONTRIBUTOR,,"I end up with these in my local directory: .hypothesis/ Pipfile Pipfile.lock pyproject.toml Might as well gitignore them.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/418/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1177101697,I_kwDOBm6k_c5GKSWB,1681,Potential bug in numeric handling where_clause for filters,9599,simonw,open,0,,,,,2,2022-03-22T17:43:50Z,2022-03-22T17:49:09Z,,OWNER,,"> Note that Datasette does already have special logic to convert parameters to integers for numeric comparisons like `>`: > > https://github.com/simonw/datasette/blob/c4c9dbd0386e46d2bf199f0ed34e4895c98cb78c/datasette/filters.py#L203-L212 > > Though... it looks like there's a bug in that? It doesn't account for `float` values - `""3.5"".isdigit()` return `False` - probably for the best, because `int(3.5)` would break that value anyway. 
_Originally posted by @simonw in https://github.com/simonw/datasette/issues/1671#issuecomment-1075432283_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1681/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1177059481,I_kwDODFdgUs5GKICZ,71,Store commit parents,64686,carltongibson,closed,0,,,,,0,2022-03-22T17:06:48Z,2022-04-22T12:44:04Z,2022-04-22T12:44:04Z,NONE,,"Hi @simonw 👋 Currently, stored commit data doesn't quite give me the information I'm needing... Committer date and author date are not 100% reliable for dividing a commit history up by release or branch. A PR created before a release but merged after can have earlier dates… — this can be quite frustrating if you're trying to pin down commits for a release: _It should be there!_, but then isn't. (This gets worse using release branches.) Would you be open to adding the `sha` of a `parent` of a commit to the commit table? (As an FK? 🤔 — likely not feasible.) It's part of the [response body](https://docs.github.com/en/rest/reference/commits#get-a-commit): ``` ""parents"": [ { ""url"": ""https://api.github.com/repos/octocat/Hello-World/commits/6dcb09b5b57875f334f61aebed695e2e4193db5e"", ""sha"": ""6dcb09b5b57875f334f61aebed695e2e4193db5e"" } ], ``` I think this list should only have a single entry. (🤔 — not sure why it's a list then...) With this it would be possible to build/reconstruct a chain of commits from the history, that I don't **think** is available as yet (unless you know a better way). It is certainly possible to get sequential lists of commits out of git directly, so the same would be possible combining tools, but wondering if a single tool could do it. What do you think? Thanks! 🏅 ",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/71/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1175894898,I_kwDOBm6k_c5GFrty,1680,Consider simplifying permissions for 1.0,9599,simonw,open,0,,,3268330,Datasette 1.0,0,2022-03-21T20:17:29Z,2022-03-21T20:17:29Z,,OWNER,,"Permission checks right now can express one of three opinions: - `False` means ""do not grant this permission"" - `True` means ""grant this permission"" - `None` means ""I have no opinion"" But... there's also a concept of a ""default"" for a given permission check, which might be `False` or `True`. I worry this is too complicated. Could this be simplified before 1.0? In particular the default concept. See also: - #1676 ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1680/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1175854982,I_kwDOBm6k_c5GFh-G,1679,Research: how much overhead does the n=1 time limit have?,9599,simonw,closed,0,,,3268330,Datasette 1.0,11,2022-03-21T19:27:46Z,2022-03-21T21:55:57Z,2022-03-21T21:55:56Z,OWNER,,"https://github.com/simonw/datasette/blob/1a7750eb29fd15dd2eea3b9f6e33028ce441b143/datasette/utils/__init__.py#L181-L200 ```python @contextmanager def sqlite_timelimit(conn, ms): deadline = time.perf_counter() + (ms / 1000) # n is the number of SQLite virtual machine instructions that will be # executed between each check. It's hard to know what to pick here.
# After some experimentation, I've decided to go with 1000 by default and # 1 for time limits that are less than 50ms n = 1000 if ms < 50: n = 1 def handler(): if time.perf_counter() >= deadline: return 1 conn.set_progress_handler(handler, n) try: yield finally: conn.set_progress_handler(None, n) ``` How often do I set a time limit of 50 or less? How much slower does it go thanks to this code?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1679/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1175744654,I_kwDOCGYnMM5GFHCO,417,insert fails on JSONL with whitespace,9954,blaine,closed,0,,,,,3,2022-03-21T17:58:14Z,2022-03-25T21:19:06Z,2022-03-25T21:17:13Z,NONE,,"Any JSON that is newline-delimited and has whitespace (newlines) between the start of a JSON object and an attribute fails due to a parse error. e.g. given the valid JSONL: ```{ ""attribute"": ""value"" } { ""attribute"": ""value2"" } ``` I would expect that `sqlite-utils insert --nl my.db mytable file.jsonl` would properly import the data into `mytable`. However, the following error is thrown instead: `json.decoder.JSONDecodeError: Expecting property name enclosed in double quotes: line 2 column 1 (char 2)` It makes sense that since the file is intended to be newline separated, the thing being parsed is ""{"" (which obviously fails), however the default newline-separated output of `jq` isn't compact. Using `jq -c` avoids this problem, but the fix is unintuitive and undocumented. Proposed solutions: 1. Default to a ""loose"" newline-separated parse; this could be implemented internally as [the equivalent of] a `jq -c` filter ahead of the insert step. 2. Catch the JSONDecodeError (or pre-empt it in the case of a record === ""{\n"") and give the user a ""it looks like your json isn't _actually_ newline-delimited; try running it through `jq -c` instead"" error message. 
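In the meantime, the workaround looks roughly like this, compacting each object onto its own line before piping it in:

```
jq -c . file.json | sqlite-utils insert --nl my.db mytable -
```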
It might just have been too early in the morning when I was playing with this, but running pipes of data through sqlite-utils without the 'knack' of it led to some false starts.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/417/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1175715988,I_kwDOBm6k_c5GFACU,1678,Make `check_visibility()` a documented API,9599,simonw,closed,0,,,3268330,Datasette 1.0,1,2022-03-21T17:30:34Z,2022-03-21T19:04:03Z,2022-03-21T19:01:46Z,OWNER,,"Spotted this while working on: - #1677 https://github.com/simonw/datasette/blob/e627510b760198ccedba9e5af47a771e847785c9/datasette/utils/__init__.py#L1005-L1021",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1678/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1175694248,I_kwDOBm6k_c5GE6uo,1677,Remove `check_permission()` from `BaseView`,9599,simonw,closed,0,,,3268330,Datasette 1.0,1,2022-03-21T17:18:18Z,2022-03-21T18:45:04Z,2022-03-21T18:45:03Z,OWNER,,"Follow-on from: - #1675 Refs: - #1660",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1677/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1175690070,I_kwDOBm6k_c5GE5tW,1676,"Reconsider ensure_permissions() logic, can it be less confusing?",9599,simonw,open,0,,,3268330,Datasette 1.0,3,2022-03-21T17:14:57Z,2022-12-02T01:23:40Z,,OWNER,,"> Updated documentation: https://github.com/simonw/datasette/blob/e627510b760198ccedba9e5af47a771e847785c9/docs/internals.rst#await-ensure_permissionsactor-permissions > >> This method allows multiple permissions to be checked at once. It raises a `datasette.Forbidden` exception if any of the checks are denied before one of them is explicitly granted. >> >> This is useful when you need to check multiple permissions at once. For example, an actor should be able to view a table if either one of the following checks returns `True` or not a single one of them returns `False`: > > That's pretty hard to understand! I'm going to open a separate issue to reconsider if this is a useful enough abstraction given how confusing it is. 
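For reference, a minimal sketch of the documented semantics (`check_permission()` here is a hypothetical stand-in for the real permission plumbing):

```python
async def ensure_permissions(actor, permissions):
    # The first check that explicitly returns False denies the request;
    # the first that explicitly returns True grants it; if every check
    # abstains (returns None), no denial happened, so execution continues
    for permission in permissions:
        result = await check_permission(actor, permission)
        if result is False:
            raise Forbidden(str(permission))
        if result is True:
            return
```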
_Originally posted by @simonw in https://github.com/simonw/datasette/issues/1675#issuecomment-1074177827_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1676/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1175648453,I_kwDOBm6k_c5GEvjF,1675,Extract out `check_permissions()` from `BaseView,9599,simonw,closed,0,,,,,7,2022-03-21T16:39:46Z,2022-03-21T17:14:31Z,2022-03-21T17:13:21Z,OWNER,,"> I'm going to refactor this stuff out and document it so it can be easily used by plugins: https://github.com/simonw/datasette/blob/4a4164b81191dec35e423486a208b05a9edc65e4/datasette/views/base.py#L69-L103 _Originally posted by @simonw in https://github.com/simonw/datasette/issues/1660#issuecomment-1074136176_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1675/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1174717287,I_kwDOBm6k_c5GBMNn,1674,Tweak design of /.json,9599,simonw,open,0,,,3268330,Datasette 1.0,1,2022-03-20T22:58:01Z,2022-03-20T22:58:40Z,,OWNER,,"https://latest.datasette.io/.json Currently: ```json { ""_memory"": { ""name"": ""_memory"", ""hash"": null, ""color"": ""a6c7b9"", ""path"": ""/_memory"", ""tables_and_views_truncated"": [], ""tables_and_views_more"": false, ""tables_count"": 0, ""table_rows_sum"": 0, ""show_table_row_counts"": false, ""hidden_table_rows_sum"": 0, ""hidden_tables_count"": 0, ""views_count"": 0, ""private"": false }, ""fixtures"": { ""name"": ""fixtures"", ""hash"": ""645005884646eb941c89997fbd1c0dd6be517cb1b493df9816ae497c0c5afbaa"", ""color"": ""645005"", ""path"": ""/fixtures"", ""tables_and_views_truncated"": [ { ""name"": ""compound_three_primary_keys"", ""columns"": [ ""pk1"", ""pk2"", ""pk3"", ""content"" ], ""primary_keys"": [ ""pk1"", ""pk2"", ""pk3"" ], ""count"": 1001, ""hidden"": false, ""fts_table"": null, ""num_relationships_for_sorting"": 0, ""private"": false }, ``` As-of this issue the `""path""` key is confusing, it doesn't match what https://latest.datasette.io/-/databases returns: - #1668",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1674/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1174708375,I_kwDOBm6k_c5GBKCX,1673,Streaming CSV spends a lot of time in `table_column_details`,9599,simonw,open,0,,,,,1,2022-03-20T22:25:28Z,2022-03-20T22:34:06Z,,OWNER,,"At least I think it does. 
I tried running `py-spy top -p $PID` against a Datasette process that was trying to do: datasette covid.db --get '/covid/ny_times_us_counties.csv?_size=10&_stream=on' While investigating: - #1355 And spotted this: ``` datasette covid.db --get /covid/ny_times_us_counties.csv?_size=10&_stream=on' (python v3.10.2) Total Samples 5800 GIL: 71.00%, Active: 98.00%, Threads: 4 %Own %Total OwnTime TotalTime Function (filename:line) 8.00% 8.00% 4.32s 4.38s sql_operation_in_thread (datasette/database.py:212) 5.00% 5.00% 3.77s 3.93s table_column_details (datasette/utils/__init__.py:614) 6.00% 6.00% 3.72s 3.72s _worker (concurrent/futures/thread.py:81) 7.00% 7.00% 2.98s 2.98s _read_from_self (asyncio/selector_events.py:120) 5.00% 6.00% 2.35s 2.49s detect_fts (datasette/utils/__init__.py:571) 4.00% 4.00% 1.34s 1.34s _write_to_self (asyncio/selector_events.py:140) ``` Relevant code: https://github.com/simonw/datasette/blob/798f075ef9b98819fdb564f9f79c78975a0f71e8/datasette/utils/__init__.py#L609-L625 ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1673/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1174697144,I_kwDOBm6k_c5GBHS4,1672,Refactor CSV handling code out of DataView,9599,simonw,open,0,,,3268330,Datasette 1.0,1,2022-03-20T21:47:00Z,2022-03-20T21:52:39Z,,OWNER,,"> I think the way to get rid of most of the remaining complexity in `DataView` is to refactor how CSV stuff works - pulling it in line with other export factors and extracting the streaming mechanism. Opening a fresh issue for that. _Originally posted by @simonw in https://github.com/simonw/datasette/issues/1660#issuecomment-1073355032_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1672/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1174655187,I_kwDOBm6k_c5GA9DT,1671,Filters fail to work correctly against calculated numeric columns returned by SQL views because type affinity rules do not apply,9308268,rayvoelker,open,0,,,,,8,2022-03-20T19:17:24Z,2022-03-22T17:43:12Z,,NONE,,"I found a strange behavior, and I'm not sure if it's related to views and boolean values perhaps, or if there's something else weird going on here, but I'll provide an example that may help show what I'm seeing happen. 
```bash #!/bin/bash echo ""\""id\"",\""expiration_date\"" 0,2018-01-04 1,2019-01-05 2,2020-01-06 3,2021-01-07 4,2022-01-08 5,2023-01-09 6,2024-01-10 7,2025-01-11 8,2026-01-12 9,2027-01-13 "" > test.csv csvs-to-sqlite test.csv test.db sqlite-utils create-view --replace test.db test_view ""select id, expiration_date, case when julianday('NOW') >= julianday(expiration_date) then 1 else 0 end as has_expired FROM test"" ``` ```bash datasette test.db ``` ![image](https://user-images.githubusercontent.com/9308268/159178745-9c6152f7-eac6-4bf9-bef5-a2d63d3ee13f.png) ![image](https://user-images.githubusercontent.com/9308268/159178824-c8952137-270c-42a4-ad1c-f6ad2c51e499.png) ![image](https://user-images.githubusercontent.com/9308268/159178877-23e00b36-443a-43ef-83e5-e0bdddd3fdcd.png) ![image](https://user-images.githubusercontent.com/9308268/159178918-65922cc7-2514-4735-a72d-4904b99976d4.png) Thanks again and let me know if you want me to provide anything else!",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1671/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1174423568,I_kwDOBm6k_c5GAEgQ,1670,Ship Datasette 0.61,9599,simonw,closed,0,,,,,6,2022-03-20T02:47:54Z,2022-03-23T18:32:32Z,2022-03-23T18:32:03Z,OWNER,,"Let the alpha bake for a while, since #1668 is a big last-minute change. After shipping, release a new `datasette-hashed-urls` that depends on it, also this: - https://github.com/simonw/datasette-hashed-urls/issues/11",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1670/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1174404647,I_kwDOBm6k_c5F__4n,1669,Release 0.61 alpha,9599,simonw,closed,0,,,,,2,2022-03-20T00:35:35Z,2022-03-20T01:24:36Z,2022-03-20T01:24:36Z,OWNER,,"> I'm going to release this as a 0.61 alpha so I can more easily depend on it from `datasette-hashed-urls`. 
_Originally posted by @simonw in https://github.com/simonw/datasette/issues/1668#issuecomment-1073136896_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1669/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1174306154,I_kwDOBm6k_c5F_n1q,1668,"Introduce concept of a database `route`, separate from its name",9599,simonw,closed,0,,,3268330,Datasette 1.0,20,2022-03-19T16:48:28Z,2022-03-20T16:43:16Z,2022-03-20T16:43:16Z,OWNER,,"Some issues came up in the new `datasette-hashed-urls` plugin relating to the way it renames databases on startup to achieve unique URLs that depend on the database SHA-256 content: - https://github.com/simonw/datasette-hashed-urls/issues/10 - https://github.com/simonw/datasette-hashed-urls/issues/9 - https://github.com/simonw/datasette-hashed-urls/issues/8 All three of these could be addressed by making the ""path"" concept for a database (the `/foo` bit where it is served) work independently of the database's name, which would be used for default display and also as the alias when configuring cross-database aliases.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1668/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1174302994,I_kwDOBm6k_c5F_nES,1667,Make route matched pattern groups more consistent,9599,simonw,closed,0,,,3268330,Datasette 1.0,3,2022-03-19T16:32:35Z,2022-03-19T20:37:42Z,2022-03-19T20:37:41Z,OWNER,,"> ... highlights how inconsistent the way the capturing works is. Especially `as_format` which can be `None` or `""""` or `.json` or `json` or not used at all in the case of `TableView`. https://github.com/simonw/datasette/blob/764738dfcb16cd98b0987d443f59d5baa9d3c332/tests/test_routes.py#L12-L36 _Originally posted by @simonw in https://github.com/simonw/datasette/issues/1666#issuecomment-1073039670_ Part of: - #1660",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1667/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1174162781,I_kwDOBm6k_c5F_E1d,1666,Refactor URL routing to enable testing,9599,simonw,closed,0,,,3268330,Datasette 1.0,3,2022-03-19T03:52:29Z,2022-03-19T16:32:03Z,2022-03-19T16:32:03Z,OWNER,,"I ran into some bugs earlier with URL routing - having more robust testing around this (especially since they are defined using regular expressions) would be really useful. - A utility function that resolves a path against a list of reflexes and returns the match - Make the routes and regular expressions available from a private Datasette method - Add tests that exercise them Related: - #1660",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1666/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1173023272,I_kwDOCGYnMM5F6uoo,416,Options for how `r.parsedate()` should handle invalid dates,638427,mattkiefer,closed,0,,,,,11,2022-03-17T23:29:55Z,2022-05-03T21:36:49Z,2022-03-21T04:01:39Z,NONE,,"Exceptions are normal expected behavior when typecasting an invalid format. However, r.parsedate() is really just re-formatting strings and keeping the type as text. 
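(Purely as an illustration - this is a sketch, not sqlite-utils code - a lenient wrapper could report bad values and pass them through unchanged:)

```python
from dateutil import parser


def parsedate_lenient(value):
    # Print the unparseable value and return it as-is, rather than
    # aborting the whole convert run on the first bad date
    try:
        return parser.parse(value).date().isoformat()
    except parser.ParserError:
        print(""could not parse date: {!r}"".format(value))
        return value
```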
So it may be better to print-and-pass on exception so the user can see a complete list of invalid values -- while also allowing for the parser to reformat the remaining valid values. ``` sqlite-utils convert idfpr.db license ""Expiration Date"" ""r.parsedate(value)"" [#######-----------------------------] 21% 00:01:57Traceback (most recent call last): File ""/usr/local/lib/python3.9/dist-packages/sqlite_utils/db.py"", line 2336, in convert_value return fn(v) File ""<string>"", line 2, in fn File ""/usr/local/lib/python3.9/dist-packages/sqlite_utils/recipes.py"", line 8, in parsedate parser.parse(value, dayfirst=dayfirst, yearfirst=yearfirst).date().isoformat() File ""/usr/lib/python3/dist-packages/dateutil/parser/_parser.py"", line 1374, in parse return DEFAULTPARSER.parse(timestr, **kwargs) File ""/usr/lib/python3/dist-packages/dateutil/parser/_parser.py"", line 652, in parse raise ParserError(""String does not contain a date: %s"", timestr) dateutil.parser._parser.ParserError: String does not contain a date: / / ``` In this case, I had just one variation of an invalid date: ' / / '. But theoretically there could be many values that would have to be fixed one at a time with the current exception handling. ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/416/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1171599874,I_kwDOCGYnMM5F1TIC,415,Convert with `--multi` and `--dry-run` flag does not work,3976183,dotcs,closed,0,,,,,2,2022-03-16T21:59:46Z,2022-03-21T04:18:24Z,2022-03-21T04:18:24Z,NONE,,"It's not possible to combine `--multi` and `--dry-run` flag in the `convert` command. Let's first create a simple database from JSON string ```console $ echo '[{""foo"": ""abc""}]' | sqlite-utils insert demo.db demo - $ sqlite-utils query demo.db ""SELECT * FROM demo"" [{""foo"": ""abc""}] ``` and then try to convert the ""foo"" column with a static value ""bar"" (see docs [Converting a column into multiple columns](https://sqlite-utils.datasette.io/en/stable/cli.html#converting-a-column-into-multiple-columns)) ```console $ sqlite-utils convert demo.db demo foo '{""foo"": ""bar""}' --multi --dry-run Traceback (most recent call last): File ""/home/dotcs/anaconda3/envs/tools/bin/sqlite-utils"", line 8, in <module> sys.exit(cli()) File ""/home/dotcs/anaconda3/envs/tools/lib/python3.9/site-packages/click/core.py"", line 1128, in __call__ return self.main(*args, **kwargs) File ""/home/dotcs/anaconda3/envs/tools/lib/python3.9/site-packages/click/core.py"", line 1053, in main rv = self.invoke(ctx) File ""/home/dotcs/anaconda3/envs/tools/lib/python3.9/site-packages/click/core.py"", line 1659, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/home/dotcs/anaconda3/envs/tools/lib/python3.9/site-packages/click/core.py"", line 1395, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/home/dotcs/anaconda3/envs/tools/lib/python3.9/site-packages/click/core.py"", line 754, in invoke return __callback(*args, **kwargs) File ""/home/dotcs/anaconda3/envs/tools/lib/python3.9/site-packages/sqlite_utils/cli.py"", line 2686, in convert for row in db.conn.execute(sql, where_args).fetchall(): sqlite3.OperationalError: user-defined function raised exception ``` But without the `--dry-run` flag it does work as expected: ```console $ sqlite-utils convert demo.db demo foo '{""foo"": ""bar""}' --multi $ sqlite-utils query demo.db ""SELECT * FROM demo"" 
[{""foo"": ""bar""}] ``` ```console $ sqlite-utils --version sqlite-utils, version 3.25.1 ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/415/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1170554975,I_kwDOBm6k_c5FxUBf,1663,Document the internals that were used in datasette-hashed-urls,9599,simonw,closed,0,,,3268330,Datasette 1.0,2,2022-03-16T05:17:08Z,2022-03-19T04:04:50Z,2022-03-17T21:32:38Z,OWNER,,"The https://github.com/simonw/datasette-hashed-urls used a couple of currently undocumented features: - `db.hash` - `Datasette(..., immutables=[...])`",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1663/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1170497629,I_kwDOBm6k_c5FxGBd,1662,[feature request] Publish to fully static website,32609395,contrun,closed,0,,,,,1,2022-03-16T03:32:28Z,2022-03-19T00:42:23Z,2022-03-19T00:42:23Z,NONE,,"It seems currently all datasette publish requires a real backend server which is able to query the database and send results back to the frontend. There are a few projects to on-demand download a portion of data from the database from a sqlite lite database url, and present it directly to the user. These methods leverages web assembly under the hood. I think datasette is a perfect use case for this technology. Below are a few examples of querying sqlite database from frontend directly. * [Using sqlite3 as a notekeeping document graph with automatic reference indexing](https://epilys.github.io/bibliothecula/notekeeping.html) * [Hosting SQLite databases on Github Pages - (or any static file hoster) - phiresky's blog](https://phiresky.github.io/blog/2021/hosting-sqlite-databases-on-github-pages/) * [Static torrent website with peer-to-peer queries over BitTorrent on 2M records](https://boredcaveman.xyz/post/0x2_static-torrent-website-p2p-queries.html)",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1662/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1170355774,I_kwDOBm6k_c5FwjY-,1661,Remove Hashed URL mode,9599,simonw,closed,0,,,3268330,Datasette 1.0,10,2022-03-15T23:13:56Z,2022-03-19T00:37:37Z,2022-03-19T00:37:36Z,OWNER,,"It's now handled by a plugin instead: - #647 - https://github.com/simonw/datasette-hashed-urls/issues/3 https://github.com/simonw/datasette-hashed-urls Sub-tasks: - [x] Remove hashed URL mode implementation - [x] Update documentation - [x] Ensure `--setting hash_urls 1` shows a useful message",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1661/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1170144879,I_kwDOBm6k_c5Fvv5v,1660,Refactor and simplify Datasette routing and views,9599,simonw,closed,0,,,3268330,Datasette 1.0,8,2022-03-15T19:56:56Z,2022-03-21T19:19:12Z,2022-03-21T19:19:01Z,OWNER,,"While working on: - https://github.com/simonw/datasette/issues/1657 - https://github.com/simonw/datasette/issues/1439 It became very clear that the least maintainable part of Datasette at the moment is the way routing to the database, table and 
row views work - in particular the subclassing mechanism with BaseView and DataView, but also the complex variety of ways in which the URL routes capture different named regular expression groups.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1660/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1169840669,I_kwDOBm6k_c5Fulod,1658,Revert main to version that passes tests,9599,simonw,closed,0,,,3268330,Datasette 1.0,1,2022-03-15T15:37:02Z,2022-03-19T04:04:50Z,2022-03-15T15:42:58Z,OWNER,,"> I've made a real mess of this. I'm going to revert Datasette`main` back to the last commit that passed the tests and try this again in a branch. _Originally posted by @simonw in https://github.com/simonw/datasette/issues/1657#issuecomment-1068125636_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1658/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1168995756,I_kwDOBm6k_c5FrXWs,1657,Tilde encoding: use ~ instead of - for dash-encoding,9599,simonw,closed,0,,,3268330,Datasette 1.0,12,2022-03-14T22:55:17Z,2022-03-15T18:25:11Z,2022-03-15T18:01:58Z,OWNER,,Refs #1439,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1657/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1166731361,I_kwDOCGYnMM5Fiuhh,414,I forgot to include the changelog in the 3.25.1 release,9599,simonw,closed,0,,,,,7,2022-03-11T18:32:36Z,2022-03-11T18:40:39Z,2022-03-11T18:40:39Z,OWNER,,"I pushed a release for https://github.com/simonw/sqlite-utils/releases/tag/3.25.1 but forgot to include the release notes in `docs/changelog.rst` This means https://sqlite-utils.datasette.io/en/stable/changelog.html isn't showing them.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/414/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1166587040,I_kwDOCGYnMM5FiLSg,413,Display autodoc type information more legibly,9599,simonw,closed,0,,,,,5,2022-03-11T15:58:20Z,2022-03-11T18:07:10Z,2022-03-11T18:07:10Z,OWNER,,"https://sqlite-utils.datasette.io/en/3.25/reference.html#sqlite_utils.db.Table.insert looks like this at the moment: ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/413/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1163369515,I_kwDOBm6k_c5FV5wr,1655,query result page is using 400mb of browser memory 40x size of html page and 400x size of csv data,536941,fgregg,open,0,,,,,8,2022-03-09T00:56:40Z,2023-10-17T21:53:17Z,,CONTRIBUTOR,,"[this 
page](https://labordata.bunkum.us/opdr-8335ea3?sql=with+most_recent_lu+as+%28%0D%0A++select%0D%0A++++*%0D%0A++from%0D%0A++++%28%0D%0A++++++select%0D%0A++++++++*%0D%0A++++++from%0D%0A++++++++lm_data%0D%0A++++++order+by%0D%0A++++++++f_num%2C%0D%0A++++++++receive_date+desc%0D%0A++++%29+t%0D%0A++group+by%0D%0A++++f_num%0D%0A%29%0D%0Aselect%0D%0A++aff_abbr+%7C%7C+coalesce%28%27+local+%27+%7C%7C+desig_num%2C+%27+%27+%7C%7C+unit_name%29+as+abbr_local_name%2C%0D%0A++coalesce%28%0D%0A++++regexp_match%28%27%28.*%3F%29%28%2C%3F+AFL-CIO%24%29%27%2C+union_name%29%2C%0D%0A++++regexp_match%28%27%28.*%3F%29%28+IND%24%29%27%2C+union_name%29%2C%0D%0A++++union_name%0D%0A++%29+%7C%7C+coalesce%28%27+local+%27+%7C%7C+desig_num%2C+%27+%27+%7C%7C+unit_name%29+as+full_local_name%2C%0D%0A++*%0D%0Afrom%0D%0A++most_recent_lu%0D%0Awhere+%28desig_num+IS+NOT+NULL+OR+unit_name+IS+NOT+NULL%29+AND+desig_name+%21%3D+%27HQ%27%0D%0Alimit%0D%0A++5000+offset+0) is using about 400 mb in firefox 97 on mac os x. if you download the html for the page, it's about 11mb and if you get the csv for the data its about 1mb. it's using over a 1G on chrome 99. i found this because, i was trying to figure out why editing the SQL was getting very slow. ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1655/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1161969891,I_kwDOBm6k_c5FQkDj,1654,Adopt a code of conduct,9599,simonw,closed,0,,,,,5,2022-03-07T22:00:24Z,2022-03-07T22:19:35Z,2022-03-07T22:19:35Z,OWNER,,"This is long overdue, especially given the size of the project now.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1654/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1161937073,I_kwDOBm6k_c5FQcCx,1653,Mechanism to default a table to sorting by multiple columns,9599,simonw,open,0,,,,,2,2022-03-07T21:20:11Z,2022-03-07T21:23:39Z,,OWNER,,"### Discussed in https://github.com/simonw/datasette/discussions/1652
Originally posted by **zaneselvans** March 7, 2022 It's easy to tell datasette to sort tables using a single column, as [described in the docs](https://docs.datasette.io/en/stable/metadata.html#setting-a-default-sort-order): ```yaml databases: ferc1: tables: f1_edcfu_epda: sort: created_time ``` But is there some way to tell it to sort using a composite key, like you would in an `ORDER BY` clause instead? For example, the way it's being done **[in this query](https://data.catalyst.coop/ferc1?sql=select%0D%0A++rowid%2C%0D%0A++respondent_id%2C%0D%0A++report_year%2C%0D%0A++spplmnt_num%2C%0D%0A++row_number%2C%0D%0A++row_seq%2C%0D%0A++row_prvlg%2C%0D%0A++acct_num%2C%0D%0A++depr_plnt_base%2C%0D%0A++est_avg_srvce_lf%2C%0D%0A++net_salvage%2C%0D%0A++apply_depr_rate%2C%0D%0A++mrtlty_crv_typ%2C%0D%0A++avg_remaining_lf%2C%0D%0A++report_prd%0D%0Afrom%0D%0A++f1_edcfu_epda%0D%0Awhere%0D%0A++respondent_id+%3D+210%0D%0A++AND+report_year+%3D+2020%0D%0Aorder+by%0D%0A++report_year%2C+report_prd%2C+respondent_id%2C+spplmnt_num%2C+row_number%0D%0Alimit%0D%0A++1000)** on our Datasette? ```sql SELECT respondent_id, report_year, spplmnt_num, row_number, row_seq, row_prvlg, acct_num, depr_plnt_base, est_avg_srvce_lf, net_salvage, apply_depr_rate, mrtlty_crv_typ, avg_remaining_lf, report_prd FROM f1_edcfu_epda WHERE respondent_id = 210 AND report_year = 2020 ORDER BY report_year, report_prd, respondent_id, spplmnt_num, row_number LIMIT 1000 ``` The problem here is that by default it's using `rowid` (the SQLite assigned autoincrementing integer key) to order the records, but the table **should** have a natural composite primary key, but the original database that this data is being migrated from doesn't enforce unique primary keys, so there are dupes, and we don't want to drop those rows, and the records are somehow getting jumbled in the database (the `rowid` ordering isn't lined up with the expected ordering based on the composite primary key, though it's close) and this jumbling is confusing to users that expect to see the data ordered based on the natural primary key. I've tried setting the `sort` metadata parameter to a list of column names, a tuple of column names, a quoted string of comma-separated column names, a quoted string of a tuple of column names... ```yaml databases: ferc1: tables: f1_edcfu_epda: sort: ""(report_year, report_prd, respondent_id, spplmnt_num, row_number)"" ``` and they all give me server errors like: ``` Cannot sort table by (report_year, report_prd, respondent_id, spplmnt_num, row_number) ```
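In the meantime, one workaround might be to bake the ordering into a SQL view and browse that instead - a sketch using sqlite-utils, with a hypothetical view name:

```python
import sqlite_utils

db = sqlite_utils.Database(""ferc1.db"")
db.create_view(
    ""f1_edcfu_epda_sorted"",  # hypothetical name
    """"""
    select *
    from f1_edcfu_epda
    order by report_year, report_prd, respondent_id, spplmnt_num, row_number
    """""",
    replace=True,
)
```

In practice SQLite returns the view's rows in that baked-in order when no explicit `_sort` is applied, though the SQL standard doesn't formally guarantee it.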
",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1653/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1161584460,I_kwDOBm6k_c5FPF9M,1651,Get rid of the no-longer necessary ?_format=json hack for tables called x.json,9599,simonw,closed,0,,,3268330,Datasette 1.0,8,2022-03-07T15:40:42Z,2022-03-19T04:04:50Z,2022-03-15T18:25:42Z,OWNER,,"Tidy up from: - #1439",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1651/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1160750713,I_kwDOBm6k_c5FL6Z5,1650,Implement redirects from old % encoding to new dash encoding,9599,simonw,closed,0,,,3268330,Datasette 1.0,5,2022-03-06T23:40:02Z,2022-03-07T19:26:15Z,2022-03-07T19:26:14Z,OWNER,,"> One big advantage to this scheme is that redirecting old links to `%2F` pages (e.g. https://fivethirtyeight.datasettes.com/fivethirtyeight/twitter-ratio%2Fsenators) is easy - if you see a `%` in the `raw_path`, redirect to that page with the `%` replaced by `-`. _Originally posted by @simonw in https://github.com/simonw/datasette/issues/1439#issuecomment-1060044007_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1650/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1160407071,I_kwDOBm6k_c5FKmgf,1647,Test failures with SQLite 3.37.0+ due to column affinity case,9599,simonw,closed,0,,,,,5,2022-03-05T17:37:46Z,2022-03-05T19:56:28Z,2022-03-05T19:47:04Z,OWNER,,"These three tests are failing on my local machine: ``` FAILED tests/test_internals_database.py::test_table_column_details[facetable-expected0] - AssertionError: assert [Column(cid=0, name='pk', type='INTEGER', no... FAILED tests/test_internals_database.py::test_table_column_details[sortable-expected1] - AssertionError: assert [Column(cid=0, name='pk1', type='varchar(30)'... FAILED tests/test_table_html.py::test_sort_links - AssertionError: assert [{'a_href': None,\n 'attrs': {'class': ['col-Link'],\n 'data-column': '... ``` I ran `pytest --lf -vv` and the output had things like this in it: ``` E - Column(cid=1, name='created', type='text', notnull=0, default_value=None, is_pk=0, hidden=0), E ? ^^^^ E + Column(cid=1, name='created', type='TEXT', notnull=0, default_value=None, is_pk=0, hidden=0), ... E {'a_href': '/fixtures/sortable?_sort=sortable_with_nulls_2', E 'attrs': {'class': ['col-sortable_with_nulls_2'], E 'data-column': 'sortable_with_nulls_2', E 'data-column-not-null': '0', E - 'data-column-type': 'real', E ? ^^^^ E + 'data-column-type': 'REAL', E ? 
^^^^ ``` Something is causing column types to come back in uppercase where previously they were lowercase.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1647/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1160182768,I_kwDOCGYnMM5FJvvw,412,Optional Pandas integration,9599,simonw,open,0,,,,,13,2022-03-05T01:49:27Z,2022-06-14T15:36:29Z,,OWNER,,"It would be neat if there was a way to use this more seamlessly with Pandas, in particular Pandas dataframes - but without making Pandas a required dependency.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/412/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1160034488,I_kwDOCGYnMM5FJLi4,411,Support for generated columns,25778,eyeseast,open,0,,,,,8,2022-03-04T20:41:33Z,2022-03-11T22:32:43Z,,CONTRIBUTOR,,"This is a fairly new feature -- SQLite version 3.31.0 (2020-01-22) -- that I, admittedly, haven't gotten to work yet. But it looks _incredibly_ useful: https://dgl.cx/2020/06/sqlite-json-support I'm not sure if this is an option on `add-column` or a separate command like `add-generated-column`. Either way, it needs an argument to populate it. It could be something like this: ```sh sqlite-utils add-column data.db table-name generated --as 'json_extract(data, ""$.field"")' --virtual ``` More here: https://www.sqlite.org/gencol.html",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/411/reactions"", ""total_count"": 2, ""+1"": 2, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1157182254,I_kwDOBm6k_c5E-TMu,1646,Configuration directory mode does not pick up other file extensions than .db,15640196,dnsos,closed,0,,,,,3,2022-03-02T13:15:23Z,2022-10-07T23:06:17Z,2022-10-07T23:03:35Z,NONE,,"Hello, I've been trying to run Datasette with the [configuration directory mode](https://docs.datasette.io/en/stable/settings.html#configuration-directory-mode) with a structure such as this one: ```plain some-directory/ example.sqlite3 another-example.db one-more.custom [...] ``` (In my scenario I can't just change the filename extension without other problems arising) Now databases with the `.sqlite3` or the custom filename extension are ignored by Datasette in this case. I'm aware that the docs state that a `.db` extension is required, but I was wondering if there is a reason for restricting this or any workaround available? When I run `datasette example.sqlite3` or `datasette one-more.custom` the databases are served by Datasette without a problem. 
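(One workaround that might work here - an untested sketch - is to symlink the other files to `.db` names so that configuration directory mode picks them up:)

```bash
cd some-directory
ln -s example.sqlite3 example.db
ln -s one-more.custom one-more.db
```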
",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1646/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1154399841,I_kwDOBm6k_c5Ezr5h,1645,"Sensible `cache-control` headers for static assets, including those served by plugins",697092,curiousleo,open,0,,,3268330,Datasette 1.0,4,2022-02-28T18:12:03Z,2022-03-08T02:59:29Z,,NONE,,"## What I'm seeing With `default_cache_ttl = 86400`, I see the following: A table view returns `Cache-control: max-age=86400`: ![Screenshot_20220228_190000](https://user-images.githubusercontent.com/697092/156034352-4d64683e-39c8-49af-81df-0217a5957bbd.png) A static asset returns no `Cache-control` header: ![Screenshot_20220228_185933](https://user-images.githubusercontent.com/697092/156034363-d0b03cc2-5889-4ed2-b601-8c1846b8469a.png) ## What I expected to see I expected the static asset to return a `Cache-control` header indicating that this response can be cached. ## Why this matters I'm productionising a Datasette deployment right now and was looking into putting it behind a Varnish instance. I was surprised to see requests for static assets being served from Datasette rather than Varnish, this is what led me to look more closely at the response headers. While Datasette serves those static assets pretty quickly, I don't see why Datasette should serve them. By their nature, static assets like images and JS files are very cacheable, so it should be easy to serve them from a cache like Varnish. (Note that Varnish can easily be configured to override this header, enabling caching for static assets. But it would be better if this override was not necessary.) ## Discussion It seems clear to me that serving static assets without a `Cache-control` header is not ideal. I see two options here: A. Static assets use the same logic as table / SQL views to set the `Cache-control` header based on `default_cache_ttl`. B. An additional setting for static assets is introduced (`default_static_cache_ttl`, say).",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1645/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1152072027,I_kwDOBm6k_c5Eqzlb,1642,Dependency issue with asgiref and uvicorn,9599,simonw,closed,0,,,,,1,2022-02-26T18:00:35Z,2022-03-05T01:11:27Z,2022-03-05T01:11:17Z,OWNER,,"``` ERROR: After October 2020 you may experience errors when installing or updating packages. This is because pip will change the way that it resolves dependency conflicts. We recommend you use --use-feature=2020-resolver to test your packages with the new resolver before it becomes the default. datasette 0.60.2 requires asgiref<3.5.0,>=3.2.10, but you'll have asgiref 3.5.0 which is incompatible. ``` That's after I forced an upgrade of `uvicorn` due to this warning: ``` ERROR: After October 2020 you may experience errors when installing or updating packages. This is because pip will change the way that it resolves dependency conflicts. We recommend you use --use-feature=2020-resolver to test your packages with the new resolver before it becomes the default. uvicorn 0.13.1 requires click==7.*, but you'll have click 8.0.4 which is incompatible. 
```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1642/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1149661489,I_kwDOCGYnMM5EhnEx,409,`with db:` for transactions,9599,simonw,open,0,,,,,3,2022-02-24T19:22:06Z,2022-10-01T03:42:50Z,,OWNER,,This can be a documented wrapper around `with db.conn:`.,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/409/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1149310456,I_kwDOBm6k_c5EgRX4,1641,Tweak mobile keyboard settings,9599,simonw,open,0,,,,,1,2022-02-24T13:47:10Z,2022-02-24T13:49:26Z,,OWNER,,"https://developer.apple.com/library/archive/documentation/StringsTextFonts/Conceptual/TextAndWebiPhoneOS/KeyboardManagement/KeyboardManagement.html#//apple_ref/doc/uid/TP40009542-CH5-SW12 `autocorrect=""off""` is worth experimenting with. Twitter: https://twitter.com/forestgregg/status/1496842959563726852",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1641/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1148725876,I_kwDOBm6k_c5EeCp0,1640,"Support static assets where file length may change, e.g. logs",57859326,broccolihighkicks,open,0,,,,,2,2022-02-24T00:34:42Z,2022-03-05T01:19:25Z,,NONE,,"This is a bit of an oxymoron. I am serving a log.txt file for a background process using the Datasette --static CLI. This is useful as I can observe a background process from the web UI to see any errors that occur (instead of spelunking the logs via docker exec/ssh etc). I get this error, which I think is because Datasette assumes that the size of the content does not change (but appending new log lines means the content length changes). 
```python Traceback (most recent call last): File ""/usr/local/lib/python3.9/site-packages/datasette/app.py"", line 1181, in route_path response = await view(request, send) File ""/usr/local/lib/python3.9/site-packages/datasette/utils/asgi.py"", line 305, in inner_static await asgi_send_file(send, full_path, chunk_size=chunk_size) File ""/usr/local/lib/python3.9/site-packages/datasette/utils/asgi.py"", line 280, in asgi_send_file await send( File ""/usr/local/lib/python3.9/site-packages/asgi_csrf.py"", line 104, in wrapped_send await send(event) File ""/usr/local/lib/python3.9/site-packages/uvicorn/protocols/http/h11_impl.py"", line 460, in send output = self.conn.send(event) File ""/usr/local/lib/python3.9/site-packages/h11/_connection.py"", line 468, in send data_list = self.send_with_data_passthrough(event) File ""/usr/local/lib/python3.9/site-packages/h11/_connection.py"", line 501, in send_with_data_passthrough writer(event, data_list.append) File ""/usr/local/lib/python3.9/site-packages/h11/_writers.py"", line 58, in __call__ self.send_data(event.data, write) File ""/usr/local/lib/python3.9/site-packages/h11/_writers.py"", line 78, in send_data raise LocalProtocolError(""Too much data for declared Content-Length"") h11._util.LocalProtocolError: Too much data for declared Content-Length ERROR: Exception in ASGI application Traceback (most recent call last): File ""/usr/local/lib/python3.9/site-packages/datasette/app.py"", line 1181, in route_path response = await view(request, send) File ""/usr/local/lib/python3.9/site-packages/datasette/utils/asgi.py"", line 305, in inner_static await asgi_send_file(send, full_path, chunk_size=chunk_size) File ""/usr/local/lib/python3.9/site-packages/datasette/utils/asgi.py"", line 280, in asgi_send_file await send( File ""/usr/local/lib/python3.9/site-packages/asgi_csrf.py"", line 104, in wrapped_send await send(event) File ""/usr/local/lib/python3.9/site-packages/uvicorn/protocols/http/h11_impl.py"", line 460, in send output = self.conn.send(event) File ""/usr/local/lib/python3.9/site-packages/h11/_connection.py"", line 468, in send data_list = self.send_with_data_passthrough(event) File ""/usr/local/lib/python3.9/site-packages/h11/_connection.py"", line 501, in send_with_data_passthrough writer(event, data_list.append) File ""/usr/local/lib/python3.9/site-packages/h11/_writers.py"", line 58, in __call__ self.send_data(event.data, write) File ""/usr/local/lib/python3.9/site-packages/h11/_writers.py"", line 78, in send_data raise LocalProtocolError(""Too much data for declared Content-Length"") h11._util.LocalProtocolError: Too much data for declared Content-Length ``` Thanks, I am finding Datasette very useful.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1640/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1148638868,I_kwDOBm6k_c5EdtaU,1639,Make datasette-redirect-forbidden unneccessary,9599,simonw,open,0,,,,,0,2022-02-23T22:18:46Z,2022-02-23T22:18:46Z,,OWNER,,"I wrote `datasette-redirect-forbidden` today because I needed 403 errors to redirect to `/-/login` and it was the quickest way to solve that problem. This should be a feature of Datasette core. 
- https://github.com/simonw/datasette-redirect-forbidden/issues/2",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1639/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1145882578,I_kwDOCGYnMM5ETMfS,408,`deterministic=True` fails on versions of SQLite prior to 3.8.3,24938923,learning4life,closed,0,,,,,6,2022-02-21T14:36:43Z,2022-03-13T16:54:09Z,2022-03-02T00:38:11Z,NONE,,"Hi, love your work. I am unable to lookup indexes in a database using sqlite-utils: ` sqlite-utils indexes city_spec.db --table` or `sqlite-utils indexes city_spec.db MyTable ` **Software** sqlite-utils, version 3.24 sqlite3 --version: 3.36.0 **Output:** Traceback (most recent call last): File ""/opt/app-root/bin/sqlite-utils"", line 8, in sys.exit(cli()) File ""/opt/app-root/lib64/python3.8/site-packages/click/core.py"", line 1128, in __call__ return self.main(*args, **kwargs) File ""/opt/app-root/lib64/python3.8/site-packages/click/core.py"", line 1053, in main rv = self.invoke(ctx) File ""/opt/app-root/lib64/python3.8/site-packages/click/core.py"", line 1659, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/opt/app-root/lib64/python3.8/site-packages/click/core.py"", line 1395, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/opt/app-root/lib64/python3.8/site-packages/click/core.py"", line 754, in invoke return __callback(*args, **kwargs) File ""/opt/app-root/lib64/python3.8/site-packages/click/decorators.py"", line 26, in new_func return f(get_current_context(), *args, **kwargs) File ""/opt/app-root/lib64/python3.8/site-packages/sqlite_utils/cli.py"", line 2123, in indexes ctx.invoke( File ""/opt/app-root/lib64/python3.8/site-packages/click/core.py"", line 754, in invoke return __callback(*args, **kwargs) File ""/opt/app-root/lib64/python3.8/site-packages/sqlite_utils/cli.py"", line 1624, in query db.register_fts4_bm25() File ""/opt/app-root/lib64/python3.8/site-packages/sqlite_utils/db.py"", line 403, in register_fts4_bm25 self.register_function(rank_bm25, deterministic=True) File ""/opt/app-root/lib64/python3.8/site-packages/sqlite_utils/db.py"", line 399, in register_function register(fn) File ""/opt/app-root/lib64/python3.8/site-packages/sqlite_utils/db.py"", line 392, in register self.conn.create_function(name, arity, fn, **kwargs) sqlite3.NotSupportedError: deterministic=True requires SQLite 3.8.3 or higher ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/408/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1142107925,I_kwDOBm6k_c5EEy8V,1638,`filters_from_request` plugin hook docs should mention that returning an async function is allowed,9599,simonw,open,0,,,,,0,2022-02-18T00:08:26Z,2022-02-18T00:08:26Z,,OWNER,,"https://docs.datasette.io/en/stable/plugin_hooks.html#filters-from-request-request-database-table-datasette doesn't mention that you can return an `async` function - but you can, and in fact Datasette itself uses that here: https://github.com/simonw/datasette/blob/aa7f0037a46eb76ae6fe9bf2a1f616c58738ecdf/datasette/filters.py#L43-L47",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1638/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 
0}",, 1138008042,I_kwDOBm6k_c5D1J_q,1636,"""permissions"" propery in metadata for configuring arbitrary permissions",9599,simonw,closed,0,,,8711695, Datasette 1.0a2,14,2022-02-15T00:25:59Z,2022-12-13T02:40:50Z,2022-12-13T02:40:50Z,OWNER,,"The `""allow""` block mechanism can already be used to configure various default permissions. When adding permissions to `datasette-tiddlywiki` I realized it would be good to be able to configure arbitrary permissions such as `edit-tiddlywiki` there too.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1636/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1131295060,I_kwDOBm6k_c5DbjFU,1634,Update Dockerfile generated by `datasette publish`,9599,simonw,open,0,,,3268330,Datasette 1.0,4,2022-02-11T00:07:26Z,2022-03-11T17:38:08Z,,OWNER,,"The generated `Dockerfile` currently looks something like this: ```Dockerfile FROM python:3.8 COPY . /app WORKDIR /app ENV DATASETTE_SECRET 'edab49cbc5d5f6f33238f54852037e3fee710821960b73edd2ce743454182ae2' RUN pip install -U datasette datasette-auth-passwords datasette-tiddlywiki datasette-graphql RUN datasette inspect fixtures.db other.db --inspect-file inspect-data.json ENV PORT 8080 EXPOSE 8080 CMD datasette serve --host 0.0.0.0 -i fixtures.db -i other.db --cors --inspect-file inspect-data.json --metadata metadata.json --create --port $PORT /data/*.db ``` This is still on Python 3.8, and it generates a pretty large image compared to the `Dockerfile` used for https://hub.docker.com/datasetteproject/datasette - https://github.com/simonw/datasette/blob/0.60.2/Dockerfile Here's the code that generates it: https://github.com/simonw/datasette/blob/7d24fd405f3c60e4c852c5d746c91aa2ba23cf5b/datasette/utils/__init__.py#L389-L400",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1634/reactions"", ""total_count"": 2, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 2, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1129052172,I_kwDOBm6k_c5DS_gM,1633,base_url or prefix does not work with _exact match,6613091,henrikek,open,0,,,,,2,2022-02-09T21:45:07Z,2022-04-28T09:12:56Z,,NONE,,"When i hit ""Apply"" button to search with ""_exact"" for a column syntax the URL prefix is removed from the url. ![image](https://user-images.githubusercontent.com/6613091/153293758-0b757d55-5757-4987-992e-9426e69a7956.png) And the result is: ![image](https://user-images.githubusercontent.com/6613091/153294672-87be7809-bb7b-455d-bf1a-41e90bbfa4ae.png) If I add the marked row to url_builder.py it seams to work: ![image](https://user-images.githubusercontent.com/6613091/153295231-bdd52e37-efcf-4b21-9d37-69f182a922f4.png) ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1633/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1128466114,I_kwDOCGYnMM5DQwbC,406,Creating tables with custom datatypes,82988,psychemedia,open,0,,,,,5,2022-02-09T12:16:31Z,2022-09-15T18:13:50Z,,NONE,,"Via https://stackoverflow.com/a/18622264/454773 I note the ability to register custom handlers for novel datatypes that can map into and out of things like sqlite `BLOB`s. From a quick look and a quick play, I didn't spot a way to do this in `sqlite_utils`? 
For example: ```python # Via https://stackoverflow.com/a/18622264/454773 import sqlite3 import numpy as np import io def adapt_array(arr): """""" http://stackoverflow.com/a/31312102/190597 (SoulNibbler) """""" out = io.BytesIO() np.save(out, arr) out.seek(0) return sqlite3.Binary(out.read()) def convert_array(text): out = io.BytesIO(text) out.seek(0) return np.load(out) # Converts np.array to TEXT when inserting sqlite3.register_adapter(np.ndarray, adapt_array) # Converts TEXT to np.array when selecting sqlite3.register_converter(""array"", convert_array) ``` ```python from sqlite_utils import Database db = Database('test.db') # Reset the database connection to use the parsed datatype # sqlite_utils doesn't seem to support eg: # Database('test.db', detect_types=sqlite3.PARSE_DECLTYPES) db.conn = sqlite3.connect('test.db', detect_types=sqlite3.PARSE_DECLTYPES) # Create a table the old fashioned way # but using the new custom data type vector_table_create = """""" CREATE TABLE dummy (title TEXT, vector array ); """""" cur = db.conn.cursor() cur.execute(vector_table_create) # sqlite_utils doesn't appear to support custom types (yet?!) # The following errors on the ""array"" datatype """""" db[""dummy""].create({ ""title"": str, ""vector"": ""array"", }) """""" ``` We can then add / retrieve records from the database where the datatype of the `vector` field is a custom registered `array` type (which is to say, a `numpy` array): ```python import numpy as np db[""dummy""].insert({'title':""test1"", 'vector':np.array([1,2,3])}) for row in db.query(""SELECT * FROM dummy""): print(row['title'], row['vector'], type(row['vector'])) """""" test1 [1 2 3] """""" ``` It would be handy to be able to do this idiomatically in `sqlite_utils`.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/406/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1128139375,I_kwDOCGYnMM5DPgpv,405,"`Database(memory_name=""name"")` constructor argument",9599,simonw,closed,0,,,,,2,2022-02-09T07:15:03Z,2022-02-16T01:23:16Z,2022-02-16T01:23:16Z,OWNER,,"SQLite in-memory databases can be named, in which case multiple connections can be opened to a shared in-memory database running within the same process. Datasette supports this - SQLite could support it too. https://docs.datasette.io/en/0.60.2/internals.html#database-ds-path-none-is-mutable-false-is-memory-false-memory-name-none",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/405/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1128120451,I_kwDOCGYnMM5DPcCD,404,Add example of `--convert` to the help for `sqlite-utils insert`,9599,simonw,closed,0,,,,,2,2022-02-09T06:49:09Z,2022-02-09T06:56:35Z,2022-02-09T06:55:16Z,OWNER,,"https://sqlite-utils.datasette.io/en/3.23/cli-reference.html#insert would be more useful if it included an example of `--convert` in action. 
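Something along these lines, perhaps (an untested sketch):

```bash
# Insert lines of text, upper-casing each one on the way in;
# with --lines the Python snippet sees each input line as `line`
printf 'hello\nworld\n' | sqlite-utils insert demo.db lines - \
  --lines --convert '{""line"": line.upper()}'
```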
I can maybe use an example from https://simonwillison.net/2022/Jan/11/sqlite-utils/",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/404/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1126692066,I_kwDOCGYnMM5DJ_Ti,403,Document how to add a primary key to a rowid table using `sqlite-utils transform --pk`,536941,fgregg,closed,0,,,,,4,2022-02-08T01:39:40Z,2022-02-09T04:22:43Z,2022-02-08T19:33:59Z,CONTRIBUTOR,,"*Original title: Add option for adding a new, serial, primary key* sometimes we have tables that don't have primary keys, but ought to have them. we *can* use rowid for that, but it would often be nicer to have an explicit primary key. using the current value of rowid would be fine.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/403/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1126604194,I_kwDOBm6k_c5DJp2i,1632,"datasette one.db one.db opens database twice, as one and one_2",9599,simonw,closed,0,,,3268330,Datasette 1.0,6,2022-02-07T23:14:47Z,2022-03-19T04:04:49Z,2022-02-07T23:50:01Z,OWNER,,"> ``` > % mkdir /tmp/data > % cp ~/Dropbox/Development/datasette/fixtures.db /tmp/data > % datasette /tmp/data/*.db /tmp/data/created.db --create -p 8852 > ... > INFO: Uvicorn running on http://127.0.0.1:8852 (Press CTRL+C to quit) > ^CINFO: Shutting down > % datasette /tmp/data/*.db /tmp/data/created.db --create -p 8852 > ... > INFO: 127.0.0.1:49533 - ""GET / HTTP/1.1"" 200 OK > ``` > The first time I ran Datasette I got two databases - `fixtures` and `created` > > BUT... when I ran Datasette the second time it looked like this: > > > > This is the same result you get if you run: > > datasette /tmp/data/fixtures.db /tmp/data/created.db /tmp/data/created.db > > This is caused by this Datasette issue: > - https://github.com/simonw/datasette/issues/509 > > So... 
either I teach Datasette to de-duplicate multiple identical file paths passed to the command, or I can't use `/data/*.db` in the `Dockerfile` here and I need to go back to other solutions for the challenge described in this comment: https://github.com/simonw/datasette-publish-fly/pull/12#issuecomment-1031971831 _Originally posted by @simonw in https://github.com/simonw/datasette-publish-fly/pull/12#issuecomment-1032029874_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1632/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1125576543,I_kwDOBm6k_c5DFu9f,1630,Review datasette.utils and decide which functions should be documented for 1.0,9599,simonw,open,0,,,3268330,Datasette 1.0,0,2022-02-07T06:39:52Z,2022-02-07T06:39:52Z,,OWNER,,"Follows: - #1176",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1630/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1125297737,I_kwDOCGYnMM5DEq5J,402,Advanced class-based `conversions=` mechanism,9599,simonw,open,0,,,,,14,2022-02-06T19:47:41Z,2022-02-16T10:18:55Z,,OWNER,,"The `conversions=` parameter works like this at the moment: https://sqlite-utils.datasette.io/en/3.23/python-api.html#converting-column-values-using-sql-functions ```python db[""places""].insert( {""name"": ""Wales"", ""geometry"": wkt}, conversions={""geometry"": ""GeomFromText(?, 4326)""}, ) ``` This proposal is to support values in that dictionary that are objects, not strings, which can represent more complex conversions - spun out from #399. New proposed mechanism: ```python from sqlite_utils.utils import LongitudeLatitude db[""places""].insert( { ""name"": ""London"", ""point"": (-0.118092, 51.509865) }, conversions={""point"": LongitudeLatitude}, ) ``` Here `LongitudeLatitude` is a magical value which does TWO things: it sets up the `GeomFromText(?, 4326)` SQL function, and it handles converting the `(51.509865, -0.118092)` tuple into a `POINT({} {})` string. This would involve a change to the `conversions=` contract - where it usually expects a SQL string fragment, but it can also take an object which combines that SQL string fragment with a Python conversion function. Best of all... this resolves the `lat, lon` v.s. `lon, lat` dilemma because you can use `from sqlite_utils.utils import LongitudeLatitude` OR `from sqlite_utils.utils import LatitudeLongitude` depending on which you prefer! 
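A first sketch of what such an object could look like (names and shape illustrative only, not a final design):

```python
class LongitudeLatitude:
    # SQL fragment that the insert wraps around the converted value
    sql = ""GeomFromText(?, 4326)""

    @staticmethod
    def convert(value):
        longitude, latitude = value
        return ""POINT({} {})"".format(longitude, latitude)


class LatitudeLongitude(LongitudeLatitude):
    @staticmethod
    def convert(value):
        latitude, longitude = value
        return ""POINT({} {})"".format(longitude, latitude)
```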
_Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/399#issuecomment-1030739566_",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/402/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1125081640,I_kwDOCGYnMM5DD2Io,401,Update SpatiaLite example in the documentation,9599,simonw,closed,0,,,,,2,2022-02-06T02:02:07Z,2022-02-06T02:05:03Z,2022-02-06T02:03:24Z,OWNER,,"This one here: https://sqlite-utils.datasette.io/en/3.23/python-api.html#converting-column-values-using-sql-functions It should take advantage of the new methods from: - #79",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/401/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1125077063,I_kwDOCGYnMM5DD1BH,400,`sqlite-utils create-table` ... `--if-not-exists`,9599,simonw,closed,0,,,,,1,2022-02-06T01:32:53Z,2022-02-06T01:34:53Z,2022-02-06T01:34:46Z,OWNER,,"Inspired by: - #397 To match the option on `create-index`: https://sqlite-utils.datasette.io/en/stable/cli-reference.html#create-index ``` --if-not-exists Ignore if index already exists ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/400/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1124731464,I_kwDOCGYnMM5DCgpI,399,"Make it easier to insert geometries, with documentation and maybe code",9599,simonw,open,0,,,,,25,2022-02-05T00:11:26Z,2023-05-16T03:11:52Z,,OWNER,,"In playing with the new SpatiaLite helpers from #385 I noticed that actually populating geometry columns is still a little bit tricky. Here's what I ended up doing: ```python import httpx, sqlite_utils db = sqlite_utils.Database(""/tmp/spatial.db"") attractions = httpx.get(""https://latest.datasette.io/fixtures/roadside_attractions.json?_shape=array"").json() db[""attractions""].insert_all(attractions, pk=""pk"") # Schema of that table is now: # CREATE TABLE [attractions] ( # [pk] INTEGER PRIMARY KEY, # [name] TEXT, # [address] TEXT, # [latitude] FLOAT, # [longitude] FLOAT # ) db.init_spatialite() db[""attractions""].add_geometry_column(""point"", ""POINT"") db.execute("""""" update attractions set point = GeomFromText( 'POINT(' || longitude || ' ' || latitude || ')', 4326 ) """""") ``` That last line took some figuring out - especially the need for the SRID of `4326`, without which I got this error: > `IntegrityError: attractions.point violates Geometry constraint [geom-type or SRID not allowed]` It would be good to both document this in more detail, but ideally also to come up with a more obvious pattern for inserting common types of spatial data. Also related: - #398 - #79",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/399/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1124237013,I_kwDOCGYnMM5DAn7V,398,Add SpatiaLite helpers to CLI,25778,eyeseast,closed,0,,,,,9,2022-02-04T14:01:28Z,2022-02-16T01:02:29Z,2022-02-16T00:58:07Z,CONTRIBUTOR,,"Now that #385 is merged, add CLI versions of those methods. 
```sh # init spatialite sqlite-utils init-spatialite database.db # or maybe/also sqlite-utils create database.db --enable-wal --spatialite # add geometry columns # needs a database, table, geometry column name, type, with optional SRID and not-null # this needs to create a table if it doesn't already exist sqlite-utils add-geometry-column database.db table-name geometry --srid 4326 --not-null # spatial index an existing table/column sqlite-utils create-spatial-index database.db table-name geometry ``` Should be mostly straightforward. The one thing worth highlighting in docs is that geometry columns can only be added to existing tables. Trying to add a geometry column to a table that doesn't exist yet might mean you have a schema like `{""rowid"": int, ""geometry"": bytes}`. Might be worth nudging people to explicitly create a table first, then add geometry columns. ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/398/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1123903919,I_kwDOCGYnMM5C_Wmv,397,Support IF NOT EXISTS for table creation,738408,rafguns,closed,0,,,,,3,2022-02-04T07:41:15Z,2022-02-06T01:30:46Z,2022-02-06T01:29:01Z,NONE,,"Currently, I have a bunch of code that looks like this: ```python subjects = db[""subjects""] if db[""subjects""].exists() else db[""subjects""].create({ ... }) ``` It would be neat if sqlite-utils could simplify that by supporting `CREATE TABLE IF NOT EXISTS`, so that I'd be able to write, e.g. ```python subjects = db[""subjects""].create({...}, if_not_exists=True) ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/397/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1123851690,I_kwDOCGYnMM5C_J2q,396,"mypy failure, sqlite_utils/utils.py:56",9599,simonw,closed,0,,,,,0,2022-02-04T06:08:09Z,2022-02-04T06:10:33Z,2022-02-04T06:10:33Z,OWNER,,"https://github.com/simonw/sqlite-utils/runs/5062725880?check_suite_focus=true > `sqlite_utils/utils.py:56: error: Incompatible return value type (got ""None"", expected ""str"")`",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/396/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1123849278,I_kwDOCGYnMM5C_JQ-,395,"""apt-get: command not found"" error on macOS",9599,simonw,closed,0,,,,,1,2022-02-04T06:03:42Z,2022-02-04T06:10:58Z,2022-02-04T06:10:58Z,OWNER,,"Yeah, `apt-get` isn't a thing on macOS so 4a2a3e2fd0d5534f446b3f1fee34cb165e4d86d2 (to test #79 against real SpatiaLite) broke.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/395/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1123393829,I_kwDODFE5qs5C9aEl,10,sqlite3.OperationalError: no such table: main.my_activity,69208826,glxblt14,open,0,,,,,1,2022-02-03T17:59:29Z,2022-03-20T02:38:07Z,,NONE,,"Hello, When i run the command `google-takeout-to-sqlite my-activity db.db takeout-20220203T174446Z-001.zip`, i get this error : ``` Traceback (most recent call last): File ""c:\users\julie\appdata\local\programs\python\python39-32\lib\runpy.py"", line 
197, in _run_module_as_main return _run_code(code, main_globals, None, File ""c:\users\julie\appdata\local\programs\python\python39-32\lib\runpy.py"", line 87, in _run_code exec(code, run_globals) File ""C:\Users\julie\AppData\Local\Programs\Python\Python39-32\Scripts\google-takeout-to-sqlite.exe\__main__.py"", line 7, in <module> File ""c:\users\julie\appdata\local\programs\python\python39-32\lib\site-packages\click\core.py"", line 1128, in __call__ return self.main(*args, **kwargs) File ""c:\users\julie\appdata\local\programs\python\python39-32\lib\site-packages\click\core.py"", line 1053, in main rv = self.invoke(ctx) File ""c:\users\julie\appdata\local\programs\python\python39-32\lib\site-packages\click\core.py"", line 1659, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""c:\users\julie\appdata\local\programs\python\python39-32\lib\site-packages\click\core.py"", line 1395, in invoke return ctx.invoke(self.callback, **ctx.params) File ""c:\users\julie\appdata\local\programs\python\python39-32\lib\site-packages\click\core.py"", line 754, in invoke return __callback(*args, **kwargs) File ""c:\users\julie\appdata\local\programs\python\python39-32\lib\site-packages\google_takeout_to_sqlite\cli.py"", line 31, in my_activity utils.save_my_activity(db, zf) File ""c:\users\julie\appdata\local\programs\python\python39-32\lib\site-packages\google_takeout_to_sqlite\utils.py"", line 19, in save_my_activity db[""my_activity""].create_index([""time""]) File ""c:\users\julie\appdata\local\programs\python\python39-32\lib\site-packages\sqlite_utils\db.py"", line 629, in create_index self.db.conn.execute(sql) sqlite3.OperationalError: no such table: main.my_activity ``` Thank you for your help. Sorry for my bad English. EDIT: I used the JSON format",206649770,google-takeout-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/google-takeout-to-sqlite/issues/10/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1122557010,I_kwDOBm6k_c5C6NxS,1627,Get the tests passing against Windows,9599,simonw,open,0,,,,,0,2022-02-03T01:23:06Z,2022-02-03T01:23:32Z,,OWNER,,"> OK, the tests do NOT pass against Windows!
https://github.com/simonw/datasette/runs/5044105941 > > _Originally posted by @simonw in https://github.com/simonw/datasette/issues/1626#issuecomment-1028515161_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1627/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1122450452,I_kwDOBm6k_c5C5zwU,1625,Try running tests against macOS and Windows in addition to Ubuntu,9599,simonw,open,0,,,,,0,2022-02-02T22:25:57Z,2022-02-02T22:25:57Z,,OWNER,,"I already do this for `sqlite-utils`: https://github.com/simonw/sqlite-utils/blob/3.22.1/.github/workflows/test.yml Related: - #1617 - #1545",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1625/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1122446693,I_kwDOCGYnMM5C5y1l,394,Test against Python 3.11-dev,9599,simonw,open,0,,,,,1,2022-02-02T22:21:03Z,2022-02-03T21:06:35Z,,OWNER,,"Same as: - https://github.com/simonw/datasette/issues/1621",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/394/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1122427321,I_kwDOBm6k_c5C5uG5,1624,Index page `/` has no CORS headers,9599,simonw,open,0,,,,,2,2022-02-02T21:56:10Z,2022-09-28T16:54:22Z,,OWNER,,"Compare the following: ``` % curl -I 'https://latest.datasette.io/fixtures' HTTP/1.1 200 OK link: https://latest.datasette.io/fixtures.json; rel=""alternate""; type=""application/json+datasette"" cache-control: max-age=5 referrer-policy: no-referrer access-control-allow-origin: * access-control-allow-headers: Authorization access-control-expose-headers: Link content-type: text/html; charset=utf-8 x-databases: _memory, _internal, fixtures, extra_database Date: Wed, 02 Feb 2022 21:55:49 GMT Server: Google Frontend Transfer-Encoding: chunked % curl -I 'https://latest.datasette.io/' HTTP/1.1 200 OK link: https://latest.datasette.io/.json; rel=""alternate""; type=""application/json+datasette"" content-type: text/html; charset=utf-8 x-databases: _memory, _internal, fixtures, extra_database Date: Wed, 02 Feb 2022 21:55:52 GMT Server: Google Frontend Transfer-Encoding: chunked ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1624/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1122416919,I_kwDOBm6k_c5C5rkX,1623,/-/patterns returns link: alternate JSON header to 404,9599,simonw,closed,0,,,3268330,Datasette 1.0,2,2022-02-02T21:42:49Z,2022-03-19T04:04:49Z,2022-02-02T21:48:56Z,OWNER,,"Bug from: - #1620 ``` % curl -s -I 'https://latest.datasette.io/-/patterns' | grep link link: https://latest.datasette.io/-/patterns.json; rel=""alternate""; type=""application/json+datasette"" ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1623/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1122413719,I_kwDOBm6k_c5C5qyX,1621,Test against Python 3.11 dev version,9599,simonw,closed,0,,,3268330,Datasette 1.0,0,2022-02-02T21:38:57Z,2022-03-19T04:04:49Z,2022-02-02T21:58:54Z,OWNER,,"To 
avoid another surprise like we got with 3.10: https://simonwillison.net/2021/Oct/9/finding-and-reporting-a-bug/ From a quick GitHub code search it looks like `3.11-dev` should work: https://cs.github.com/urllib3/urllib3/blob/7bec77e81aa0a194c98381053225813f5347c9d2/.github/workflows/ci.yml#L60",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1621/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1121618041,I_kwDOBm6k_c5C2oh5,1620,"Link: rel=""alternate"" to JSON for queries too",9599,simonw,closed,0,,,3268330,Datasette 1.0,3,2022-02-02T08:02:42Z,2022-02-02T21:53:02Z,2022-02-02T21:33:00Z,OWNER,,"Following: - #1533 I implemented it for tables and rows but I should have done queries as well.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1620/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1121583414,I_kwDOBm6k_c5C2gE2,1619,JSON link on row page is 404 if base_url setting is used,9599,simonw,open,0,,,,,5,2022-02-02T07:09:53Z,2023-03-24T15:38:04Z,,OWNER,,"On my local environment: datasette fixtures.db -p 3344 --setting base_url /foo/bar/ Then hit http://127.0.0.1:3344/foo/bar/fixtures/table%2Fwith%2Fslashes.csv/3 But... that `json` link goes here, which is a 404: http://127.0.0.1:3344/foo/bar/foo/bar/fixtures/table%2Fwith%2Fslashes.csv/3?_format=json",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1619/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1121121305,I_kwDOBm6k_c5C0vQZ,1618,"Reconsider policy on blocking queries containing the string ""pragma""",770231,strada,open,0,,,,,6,2022-02-01T19:39:46Z,2022-02-02T19:42:03Z,,NONE,,"First of all, thanks for creating this cool project, and also supporting publishing to various hosting services out of the box. While testing out, I noticed legitimate queries such as ``` select * from books where title like 'Pragmatic%' ``` or ``` select * from books where title = 'The Pragmatic Programmer' ``` are blocked, due to the regular expression check here: https://github.com/simonw/datasette/blob/main/datasette/utils/__init__.py#L185 Example as seen from a Datasette instance: https://fivethirtyeight.datasettes.com/polls?sql=select+*+from+books+where+title+like+%27Pragmatic%25%27%0D%0A I'd propose a regular expression like ``` re.compile(f""pragma_(?!({'|'.join(allowed_pragmas)}))""), ``` instead of ``` re.compile(f""pragma(?!_({'|'.join(allowed_pragmas)}))""), ``` I can create a pull request with this change, unless the maintainers think it would allow unwanted queries to be executed. 
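Here's a quick sanity check of the proposed pattern, using an illustrative subset of `allowed_pragmas`:

```python
import re

allowed_pragmas = (""database_list"", ""function_list"")
current = re.compile(f""pragma(?!_({'|'.join(allowed_pragmas)}))"")
proposed = re.compile(f""pragma_(?!({'|'.join(allowed_pragmas)}))"")

sql = ""select * from books where title like 'pragmatic%'""
assert current.search(sql)       # blocked today
assert not proposed.search(sql)  # allowed under the proposal
# Disallowed pragma functions would still be caught:
assert proposed.search(""select pragma_table_info('books')"")
```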
",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1618/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1118585417,I_kwDOCGYnMM5CrEJJ,393,Better documentation for insert-replace,9599,simonw,closed,0,,,,,1,2022-01-30T15:40:23Z,2022-02-03T22:13:24Z,2022-02-03T22:13:24Z,OWNER,,"Currently: https://sqlite-utils.datasette.io/en/stable/python-api.html#insert-replacing-data > If you want to insert a record or replace an existing record with the same primary key, using the replace=True argument to .insert() or .insert_all(): Should describe the exception you get first, then how to use replace to avoid it.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/393/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1117132741,I_kwDOBm6k_c5ClhfF,1615,Potential simplified publishing mechanism,369053,aidansteele,closed,0,,,,,2,2022-01-28T08:34:50Z,2022-02-02T07:34:21Z,2022-02-02T07:34:17Z,NONE,,"Hi, Forewarning: this idea is one I've only been thinking about for a while and it's not fully fleshed-out yet. I love Datasette and what it stands for. I was thinking about how we could make it accessible to more people, especially those without access to credit cards required for a lot of hosting options. Or they might not feel comfortable signing up for said services. So I was thinking I might create a service that hosts Datasette instances for folks. I'd probably stick it on AWS Lambda and limit requests to something like n/month to avoid bankrupting myself. If I did build such a hypothetical service, I was thinking I would rely on GitHub Actions to do the heavy lifting. E.g. user `johndoe` creates a repo `my-animals` with a couple of files: `dogs.csv`, `cats.csv` and the following GitHub Actions workflow: ```yaml # .github/workflows/push.yml on: push # this allows the publish action to use OIDC to authenticate johndoe/my-animals permissions: id-token: write contents: read jobs: publish: runs-on: ubuntu-latest steps: - uses: actions/setup-python@v2 - run: pip install sqlite-utils - uses: actions/checkout@v2 - run: | set -eux sqlite-utils create-database animals.db sqlite-utils insert animals.db dogs dogs.csv --csv sqlite-utils insert animals.db cats cats.csv --csv - uses: datasette-hub/publish@v1 with: db: animals.db metadata: meta.yml # this step is helpful for debugging why the # generated sqlite db was rejected - uses: actions/upload-artifact@v2 if: failure() with: path: animals.db retention-days: 1 ``` This would then cause a Datasette instance to be available at `https://johndoe-my-animals.datasette-hub.test/`. It feels like this could significantly reduce the friction to someone being able to go from data set to Datasette. What do you think? Does this address a real need? Or am I perhaps misunderstanding the main friction points? 
As a bonus: it feels like this would pair well with [git scraping](https://simonwillison.net/2020/Oct/9/git-scraping/).",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1615/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1115435536,I_kwDOBm6k_c5CfDIQ,1614,Try again with SQLite codemirror support,9599,simonw,open,0,,,,,1,2022-01-26T20:05:20Z,2022-12-23T21:27:10Z,,OWNER,,"I tried and failed to implement autocomplete a while ago. Relevant code: https://github.com/codemirror/legacy-modes/blob/8f36abca5f55024258cd23d9cfb0203d8d244f0d/mode/sql.js#L335 Sounds like upgrading to CodeMirror 6 ASAP would be worthwhile since it has better accessibility and touch screen support: https://codemirror.net/6/",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1614/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1114640101,I_kwDOCGYnMM5CcA7l,392,`sqlite-utils bulk --batch-size` option,9599,simonw,closed,0,,,,,4,2022-01-26T05:17:11Z,2022-01-26T18:17:59Z,2022-01-26T18:17:59Z,OWNER,,"> Could add support for `--batch-size` as seen in `insert`/`upsert` too - causing it to break the list up into batches and commit for each one. _Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/391#issuecomment-1021876055_",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/392/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1114638930,I_kwDOCGYnMM5CcApS,391,`sqlite-utils bulk` progress bar,9599,simonw,closed,0,,,,,2,2022-01-26T05:14:49Z,2022-01-26T05:17:20Z,2022-01-26T05:16:51Z,OWNER,,"It can easily have a progress bar because it works by looping through an iterator: https://github.com/simonw/sqlite-utils/blob/a9fca7efa4184fbb2a65ca1275c326950ed9d3c1/sqlite_utils/cli.py#L1014-L1018 Should also support the `--silent` option if I add this.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/391/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1114628238,I_kwDOBm6k_c5Cb-CO,1613,Improvements to help make Datasette a better tool for learning SQL,9599,simonw,open,0,,,,,5,2022-01-26T04:56:07Z,2022-01-26T16:41:46Z,,OWNER,,Tracking issue for the general goal of making Datasette a better tool for learning SQL.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1613/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1114557284,I_kwDOCGYnMM5Cbstk,390,`sqlite-utils upsert` should require `--pk` more elegantly,9599,simonw,closed,0,,,,,1,2022-01-26T02:20:31Z,2022-01-26T03:20:25Z,2022-01-26T03:19:43Z,OWNER,,"Currently throws an ugly traceback: ``` % echo '[ {""id"": 1, ""name"": ""Lila""}, {""id"": 1, ""name"": ""Lila""} ]' | sqlite-utils upsert data.db chickens - Traceback (most recent call last): File ""/Users/simon/.local/share/virtualenvs/sqlite-utils-C4Ilevlm/bin/sqlite-utils"", line 33, in <module> sys.exit(load_entry_point('sqlite-utils', 'console_scripts', 'sqlite-utils')()) File
""/Users/simon/.local/share/virtualenvs/sqlite-utils-C4Ilevlm/lib/python3.8/site-packages/click/core.py"", line 1137, in __call__ return self.main(*args, **kwargs) File ""/Users/simon/.local/share/virtualenvs/sqlite-utils-C4Ilevlm/lib/python3.8/site-packages/click/core.py"", line 1062, in main rv = self.invoke(ctx) File ""/Users/simon/.local/share/virtualenvs/sqlite-utils-C4Ilevlm/lib/python3.8/site-packages/click/core.py"", line 1668, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/Users/simon/.local/share/virtualenvs/sqlite-utils-C4Ilevlm/lib/python3.8/site-packages/click/core.py"", line 1404, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/Users/simon/.local/share/virtualenvs/sqlite-utils-C4Ilevlm/lib/python3.8/site-packages/click/core.py"", line 763, in invoke return __callback(*args, **kwargs) File ""/Users/simon/Dropbox/Development/sqlite-utils/sqlite_utils/cli.py"", line 1104, in upsert insert_upsert_implementation( File ""/Users/simon/Dropbox/Development/sqlite-utils/sqlite_utils/cli.py"", line 906, in insert_upsert_implementation db[table].insert_all( File ""/Users/simon/Dropbox/Development/sqlite-utils/sqlite_utils/db.py"", line 2615, in insert_all raise PrimaryKeyRequired(""upsert() requires a pk"") sqlite_utils.db.PrimaryKeyRequired: upsert() requires a pk ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/390/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1114544727,I_kwDOCGYnMM5CbppX,389,Plausible analytics for documentation,9599,simonw,closed,0,,,,,2,2022-01-26T01:58:35Z,2022-01-26T02:07:41Z,2022-01-26T02:07:41Z,OWNER,,"```html ``` _Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/388#issuecomment-1021785268_",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/389/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1114543475,I_kwDOCGYnMM5CbpVz,388,Link to stable docs from older versions,9599,simonw,closed,0,,,,,7,2022-01-26T01:55:46Z,2023-03-26T23:43:12Z,2022-01-26T02:00:22Z,OWNER,,"https://sqlite-utils.datasette.io/en/2.14.1/ isn't showing a link to the stable release right now. I should also apply the same fix I used for Datasette in: - https://github.com/simonw/datasette/issues/1608 TIL: https://til.simonwillison.net/readthedocs/link-from-latest-to-stable",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/388/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1114147905,I_kwDOBm6k_c5CaIxB,1612,Move canned queries closer to the SQL input area,639012,jsfenfen,closed,0,,,3268330,Datasette 1.0,5,2022-01-25T17:06:39Z,2022-03-19T04:04:49Z,2022-01-25T18:34:21Z,CONTRIBUTOR,,"*Original title: Consider placing example queries above the sql input?* Hi! Have been enjoying deploying ad hoc datasettes for collaborators to pick over! I keep finding myself manually ""fixing"" the database.html template so that the ""example queries"" (canned queries) appear directly *over* the sql box? So they are sorta more a suggestion for collaborators who aren't inclined to write their own queries? 
My sense is any time I go to the trouble of writing canned queries my users should see 'em? (( I have also considered a client-side reactive-ish option where selecting a query just places the raw SQL in the box and doesn't execute it, but this seems to end up being an inconvenience, rather than a teaching tool. )) ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1612/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1113384383,I_kwDOBm6k_c5CXOW_,1611,Avoid ever running count(*) against SpatiaLite KNN table,9599,simonw,open,0,,,,,1,2022-01-25T03:32:54Z,2022-02-02T06:45:47Z,,OWNER,,"Got this in a trace: Looks like running `count(*)` against KNN took 83s! It ignored the time limit. And still only returned a count of 0.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1611/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1111293050,I_kwDOCGYnMM5CPPx6,387,Python library docs should start with a self contained example,9599,simonw,closed,0,,,,,1,2022-01-22T06:23:56Z,2022-01-26T01:37:17Z,2022-01-26T01:35:30Z,OWNER,,You have to read a lot of stuff in a lot of different places to get started with the Python library. Add a getting started introduction to https://sqlite-utils.datasette.io/en/stable/python-api.html,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/387/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1109884720,I_kwDOBm6k_c5CJ38w,1609,"Ensure ""pip install datasette"" still works with Python 3.6",9599,simonw,closed,0,,,,,12,2022-01-21T00:08:10Z,2022-01-24T19:20:09Z,2022-01-21T02:24:13Z,OWNER,,"## Original title: Can I keep ""pip install datasette"" working on Python 3.6? I dropped support for 3.6 in: - #1577 I'm getting reports that `pip3 install datasette` throws an error on that Python, even though I haven't made that new release yet - presumably due to lack of pinning of Uvicorn: https://twitter.com/ldodds/status/1484289475195080706 Is it possible to get `pip` on that version of Python to install the highest possible version of the packages that are still known to support Python 3.6? If so, how?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1609/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1109808154,I_kwDOBm6k_c5CJlQa,1608,Documentation should clarify /stable/ vs /latest/,9599,simonw,closed,0,,,,,15,2022-01-20T22:02:59Z,2023-03-26T23:41:12Z,2022-01-20T22:53:17Z,OWNER,,"It's not currently clear what the difference between https://docs.datasette.io/en/latest/ and https://docs.datasette.io/en/stable/ is - I should fix that. 
On Twitter: https://twitter.com/simonw/status/1484285006243528705",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1608/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1109783030,I_kwDOBm6k_c5CJfH2,1607,More detailed information about installed SpatiaLite version,9599,simonw,closed,0,,,3268330,Datasette 1.0,2,2022-01-20T21:28:03Z,2022-02-09T06:42:02Z,2022-02-09T06:32:28Z,OWNER,,"https://www.gaia-gis.it/gaia-sins/spatialite-sql-5.0.0.html#version has a whole bunch of interesting functions for things like `freexl_version()` and `geos_version()` and `HasMathSQL()` and suchlike. These could be shown on the `/-/versions` page.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1607/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1108846067,I_kwDOBm6k_c5CF6Xz,1606,Tests failing against Python 3.6,9599,simonw,closed,0,,,,,3,2022-01-20T04:22:44Z,2022-01-20T04:36:42Z,2022-01-20T04:36:42Z,OWNER,,"https://github.com/simonw/datasette/runs/4877484366 ``` E File ""/opt/hostedtoolcache/Python/3.6.15/x64/lib/python3.6/site-packages/uvicorn/server.py"", line 67, in run E return asyncio.run(self.serve(sockets=sockets)) E AttributeError: module 'asyncio' has no attribute 'run' ``` I think this may mean `uvicorn` has dropped support for Python 3.6.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1606/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1108671952,I_kwDOBm6k_c5CFP3Q,1605,Scripted exports,25778,eyeseast,open,0,,,,,10,2022-01-19T23:45:55Z,2022-11-30T15:06:38Z,,CONTRIBUTOR,,"Posting this while I'm thinking about it: I mentioned at the end of [this thread](https://twitter.com/eyeseast/status/1483893011658551299) that I'm usually doing `datasette --get` to export canned queries. I used to use a tool called [datafreeze](https://github.com/pudo/datafreeze) to do scripted exports, but that project looks dead now. The ergonomics of it are pretty nice, though, and the `Freezefile.yml` structure is actually not too far from Datasette's canned queries. This is related to the idea for `datasette query` (#1356) but I think it's a distinct feature. 
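For the record, my `--get` pattern looks something like this (database and query names here are made up):

```sh
# One line per canned query to export
datasette data.db --get '/data/my_canned_query.csv' > my_canned_query.csv
datasette data.db --get '/data/my_canned_query.json' > my_canned_query.json
```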
It's most likely a plugin, but I want to raise it here because it's probably something other people have thought about.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1605/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1108300685,I_kwDOBm6k_c5CD1ON,1604,Option to assign a domain/subdomain using `datasette publish cloudrun`,9599,simonw,open,0,,,,,1,2022-01-19T16:21:17Z,2022-01-19T16:23:54Z,,OWNER,,Looks like this API should be able to do that: https://twitter.com/steren/status/1483835859191304192 - https://cloud.google.com/run/docs/reference/rest/v1/namespaces.domainmappings/create,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1604/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1108235694,I_kwDOBm6k_c5CDlWu,1603,A proper favicon,9599,simonw,closed,0,,,3268330,Datasette 1.0,19,2022-01-19T15:24:55Z,2022-03-19T04:04:49Z,2022-01-20T06:07:31Z,OWNER,,"Tips here: https://adamj.eu/tech/2022/01/18/how-to-add-a-favicon-to-your-django-site/ - I think a PNG served at `/favicon.ico` is the best option, since safari doesn't support SVG yet. Relevant code: https://github.com/simonw/datasette/blob/cb29119db9115b1f40de2fb45263ed77e3bfbb3e/datasette/app.py#L182-L183 I can reuse the icon for https://datasette.io/desktop",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1603/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1107557831,I_kwDOCGYnMM5CA_3H,386,"Better ""contributing"" documentation",9599,simonw,closed,0,,,,,0,2022-01-19T02:11:48Z,2022-01-19T02:15:21Z,2022-01-19T02:15:21Z,OWNER,,"This page jumps straight into running the tests: https://sqlite-utils.datasette.io/en/latest/contributing.html It should add a little more about expected collaboration styles - opening an issue before filing a pull request - and probably link to https://simonwillison.net/2022/Jan/12/how-i-build-a-feature/",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/386/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1105916061,I_kwDOBm6k_c5B6vCd,1601,Add KNN and data_licenses to hidden tables list,25778,eyeseast,closed,0,,,,,5,2022-01-17T14:19:57Z,2022-01-20T21:29:44Z,2022-01-20T04:38:54Z,CONTRIBUTOR,,"They're generated by Spatialite and not very interesting in most cases. 
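In the meantime, both can be hidden per-instance via table metadata (database name here is illustrative):

```json
{
  ""databases"": {
    ""spatial"": {
      ""tables"": {
        ""KNN"": {""hidden"": true},
        ""data_licenses"": {""hidden"": true}
      }
    }
  }
}
```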
",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1601/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1104691662,I_kwDOBm6k_c5B2EHO,1600,plugins --all example should use cog,9599,simonw,closed,0,,,,,1,2022-01-15T11:47:49Z,2022-01-20T05:06:21Z,2022-01-20T05:04:16Z,OWNER,,The example output for `datasette plugins --all`on this page has got out of date: https://docs.datasette.io/en/stable/plugins.html#seeing-what-plugins-are-installed,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1600/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1102966378,I_kwDOBm6k_c5Bve5q,1599,Add architecture documentation,9599,simonw,open,0,,,,,0,2022-01-14T04:55:38Z,2022-01-14T04:56:03Z,,OWNER,,"Inspired by https://matklad.github.io/2021/02/06/ARCHITECTURE.md.html Good example: https://github.com/rust-analyzer/rust-analyzer/blob/d7c99931d05e3723d878bea5dc26766791fa4e69/docs/dev/architecture.md",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1599/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1102637351,I_kwDOBm6k_c5BuOkn,1598,Replace update-docs-help.py script with cog,9599,simonw,closed,0,,,,,1,2022-01-14T00:33:27Z,2022-01-14T00:47:57Z,2022-01-14T00:47:57Z,OWNER,,"I introduced `cog` in #1594 - I can use this to replace the older `update-docs-help.py` mechanism: https://github.com/simonw/datasette/blob/76d66d5b2bf10249c0beaac0999b93ac8d757f48/tests/test_docs.py#L36-L53",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1598/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1102612922,I_kwDOBm6k_c5BuIm6,1597,"""datasette inspect"" has no help summary",9599,simonw,closed,0,,,,,1,2022-01-14T00:02:16Z,2022-01-14T00:07:36Z,2022-01-14T00:07:36Z,OWNER,,"Made obvious by the new CLI reference page added in #1594. https://docs.datasette.io/en/latest/cli-reference.html#datasette-inspect-help ``` Commands: serve* Serve up specified SQLite database files with a web UI inspect install Install Python packages - e.g. ``` ``` Usage: datasette inspect [OPTIONS] [FILES]... Options: --inspect-file TEXT --load-extension TEXT Path to a SQLite extension to load --help Show this message and exit. 
```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1597/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1102568047,I_kwDOBm6k_c5Bt9pv,1596,Documentation page warning of changes coming in 1.0,9599,simonw,open,0,,,,,0,2022-01-13T23:26:04Z,2022-01-13T23:26:04Z,,OWNER,,I should start this relatively soon.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1596/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1102484126,I_kwDOBm6k_c5BtpKe,1595,Release notes for 0.60,9599,simonw,closed,0,,,7571612,Datasette 0.60,4,2022-01-13T22:23:14Z,2022-01-14T01:37:39Z,2022-01-14T01:37:39Z,OWNER,,,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1595/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1102359726,I_kwDOBm6k_c5BtKyu,1594,"Add a CLI reference page to the docs, inspired by sqlite-utils",9599,simonw,closed,0,,,7571612,Datasette 0.60,3,2022-01-13T20:55:08Z,2022-01-13T22:28:22Z,2022-01-13T21:38:48Z,OWNER,,"Thought of this while posting this comment: https://github.com/simonw/datasette/issues/1591#issuecomment-1012506595 I added https://sqlite-utils.datasette.io/en/stable/cli-reference.html to `sqlite-utils` in https://github.com/simonw/sqlite-utils/issues/383 and I _really_ like it - it's a page showing the `--help` output of every CLI command for that tool. It's maintained using `cog`. One of the benefits is that I get a free commit history of changes to `--help` at https://github.com/simonw/sqlite-utils/commits/main/docs/cli-reference.rst",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1594/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1100499619,I_kwDOBm6k_c5BmEqj,1592,Row pages should show links to foreign keys,9599,simonw,open,0,,,,,1,2022-01-12T15:50:20Z,2022-01-12T15:52:17Z,,OWNER,,Refs #1518 refactor.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1592/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1100015398,I_kwDOBm6k_c5BkOcm,1591,Maybe let plugins define custom serve options?,9599,simonw,open,0,,,,,7,2022-01-12T08:18:47Z,2022-01-15T11:56:59Z,,OWNER,,"https://twitter.com/psychemedia/status/1481171650934714370 > can extensions be passed their own cli args? eg `--ext-tiddlywiki-dbname tiddlywiki2.sqlite` ? 
I've thought something like this might be useful for other plugins in the past, too.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1591/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1099897648,I_kwDOCGYnMM5Bjxsw,384,Add examples to every `--help`,9599,simonw,closed,0,,,,,0,2022-01-12T05:31:25Z,2022-01-26T03:15:02Z,2022-01-26T03:15:02Z,OWNER,,Everything on https://sqlite-utils.datasette.io/en/stable/cli-reference.html would benefit from an example.,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/384/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1099723916,I_kwDOBm6k_c5BjHSM,1590,Table+query JSON and CSV links broken when using `base_url` setting,1001306,eelkevdbos,closed,0,,,7571612,Datasette 0.60,11,2022-01-11T23:46:39Z,2022-01-14T01:16:34Z,2022-01-14T01:16:08Z,NONE,,"Datasette appends the prefix found in the `base_url` setting twice if a `base_url` is set. In the follow asgi example, I'm hosting a custom Datasette instance: ```python # asgi.py import pathlib from asgi_cors import asgi_cors from channels.routing import URLRouter from django.urls import re_path from datasette.app import Datasette datasette_ = Datasette( files=[], settings={ ""base_url"": ""/datasettes/"", ""plugins"": {} }, config_dir=pathlib.Path('.'), ) application = URLRouter([ re_path(r""^datasettes/.*"", asgi_cors(datasette_.app(), allow_all=True)), ]) ``` Running it with: ```shell $ daphne -p 8002 asgi:application ``` Using a simple query on the `_memory` table: ```sql select sqlite_version() ``` http://localhost:8002/datasettes/_memory?sql=select+sqlite_version%28%29 It renders the following upon inspection: ![image](https://user-images.githubusercontent.com/1001306/149038851-aa842950-126a-467c-9a86-fae13bce6221.png) I am using datasette version `0.59.4`",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1590/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1099586786,I_kwDOCGYnMM5Bilzi,383,Add documentation page with the output of `--help`,9599,simonw,closed,0,,,,,4,2022-01-11T20:25:58Z,2022-01-11T22:55:05Z,2022-01-11T21:44:05Z,OWNER,,"Can be maintained using `cog` from #373. Similar in purpose to the API reference page, but this is for the CLI.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/383/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1099585611,I_kwDOCGYnMM5BilhL,382,`--where` option for `sqlite-rows`,9599,simonw,closed,0,,,,,1,2022-01-11T20:24:23Z,2022-01-11T23:33:14Z,2022-01-11T23:32:47Z,OWNER,,CLI equivalent of `table.rows_where()` - should accept parameters too. 
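Something like this - flags sketched to mirror `sqlite-utils query`:

```sh
sqlite-utils rows data.db mytable --where 'id > :min_id' -p min_id 10
```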
Work on this at the same time as #381.,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/382/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1099584685,I_kwDOCGYnMM5BilSt,381,`sqlite-utils rows` options `--limit` and `--offset`,9599,simonw,closed,0,,,,,2,2022-01-11T20:23:12Z,2022-01-11T23:33:37Z,2022-01-11T23:19:36Z,OWNER,,Because I often want to use it just to preview a few rows from the database. Piping through `| head -n 20` works for JSON and CSV (they stream) but not for `--table`.,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/381/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1098574572,I_kwDOCGYnMM5Beurs,380,Release notes for 3.21,9599,simonw,closed,0,,,7558727,3.21,1,2022-01-11T02:12:30Z,2022-01-11T02:34:26Z,2022-01-11T02:34:26Z,OWNER,,For these commits: https://github.com/simonw/sqlite-utils/compare/3.20...129141572f249ea290e2a075437e2ebaad215859,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/380/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1098544628,I_kwDOCGYnMM5BenX0,379,CLI options for running ANALYZE,9599,simonw,closed,0,,,7558727,3.21,0,2022-01-11T01:09:16Z,2022-01-11T01:38:01Z,2022-01-11T01:36:48Z,OWNER,,"> The Python methods are all done now, next step is the CLI options. I'll do those in a separate issue. _Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/366#issuecomment-1009508865_ - [x] `sqlite-utils analyze` command - [x] `sqlite-utils create-index --analyze` option (see #365) - [x] `sqlite-utils insert --analyze` option - [x] `sqlite-utils upsert --analyze` option In #378 I also added `.delete_where(..., analyze=True)` but there isn't currently a `sqlite-utils delete-where` CLI command - deletions via CLI are expected to be handled using SQL queries.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/379/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1098309897,I_kwDOCGYnMM5BduEJ,378,analyze=True parameter for some methods,9599,simonw,closed,0,,,7558727,3.21,0,2022-01-10T19:54:52Z,2022-01-11T01:08:11Z,2022-01-11T01:08:09Z,OWNER,,"This would cause `ANALYZE` to be run against the relevant table at the end of executing the method. 
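For example, the goal is to be able to write something like this (sketch):

```python
db['dogs'].insert_all(records, analyze=True)  # runs ANALYZE against [dogs] afterwards
```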
> Having browsed the API reference I think the methods that would benefit from an `analyze=True` parameter are: - [x] `table.create_index` - [x] `table.insert_all` - [x] `table.upsert_all` - [x] `table.delete_where` _Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/366#issuecomment-1009288898_",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/378/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1097436959,I_kwDOCGYnMM5BaY8f,376,`--nl` mode should ignore blank lines,9599,simonw,closed,0,,,7558727,3.21,0,2022-01-10T04:10:54Z,2022-01-10T19:27:41Z,2022-01-10T04:12:46Z,OWNER,,Spotted this while manually testing #364 - there's no reason `--nl` should crash if you feed it an empty line in between JSON objects.,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/376/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1097332098,I_kwDODEm0Qs5BZ_WC,64,Include all entities for tweets,111631,max,open,0,,,,,0,2022-01-09T23:35:28Z,2022-01-09T23:35:28Z,,NONE,,"Per our conversation [on Twitter](https://twitter.com/mschoening/status/1480312477246054401): It would be neat if all entities (including URLs) were captured. This way you can ensure, that URLs are parsed out exactly the same way Twitter parses URLs – we all know parsing URLs with a regex ain't fun. Right now, I believe the tool filters out all entities that are not of type `media`.",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/64/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1097251014,I_kwDOCGYnMM5BZrjG,375,`sqlite-utils bulk` command,9599,simonw,closed,0,,,7558727,3.21,3,2022-01-09T17:12:38Z,2022-01-11T02:12:58Z,2022-01-11T02:10:55Z,OWNER,,"The `.executemany()` method is a very efficient way to execute the same SQL query against a huge list of parameters. `sqlite-utils insert` supports a bunch of ways of loading a list of dictionaries - from CSV, TSV, JSON, newline JSON and more thanks to: - #361 What if you could load a list of dictionaries and provide a SQL query with `:named` parameters that correspond to keys in those dictionaries instead? 
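Something like this, sketching the command shape (table and file names invented):

```sh
sqlite-utils bulk chickens.db \
    'update chickens set weight = :weight where id = :id' \
    updates.csv --csv
```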
This would need to be a new command - I thought about adding a `--sql` option to `insert` but that doesn't make sense as that command already requires a table name.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/375/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1097135860,I_kwDOCGYnMM5BZPb0,374,`--fmt` should imply `-t`,9599,simonw,closed,0,,,7558727,3.21,4,2022-01-09T08:23:07Z,2022-01-10T19:27:26Z,2022-01-09T18:07:59Z,OWNER,,Not sure why I didn't implement this.,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/374/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1097135732,I_kwDOCGYnMM5BZPZ0,373,List `--fmt` options in the docs ,9599,simonw,closed,0,,,7558727,3.21,3,2022-01-09T08:22:11Z,2022-01-10T19:27:24Z,2022-01-09T17:49:00Z,OWNER,,https://sqlite-utils.datasette.io/en/stable/cli.html#table-formatted-output currently cheats and tells the user to run `--help` - can fix this using `cog`. ,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/373/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1097129710,I_kwDOCGYnMM5BZN7u,372,Idea: `suffix` and `stem` file columns,9599,simonw,closed,0,,,7558727,3.21,1,2022-01-09T07:48:53Z,2022-01-10T19:27:34Z,2022-01-09T20:17:00Z,OWNER,,"For https://sqlite-utils.datasette.io/en/stable/cli.html#inserting-data-from-files Given a file called `dogs.jpg` stem would be `dogs` and ext would be `jpg`. Need to decide what happens for `dogs.and.cats.jpg.gz`.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/372/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1097128334,I_kwDOCGYnMM5BZNmO,371,Support mutating row in `--convert` without returning it,9599,simonw,closed,0,,,7558727,3.21,6,2022-01-09T07:38:44Z,2022-01-10T19:27:30Z,2022-01-09T20:06:15Z,OWNER,,"Currently you have to do this: ``` $ sqlite-utils insert dogs.db dogs dogs.json --convert ' row[""is_good""] = 1 return row' ``` Would be neat if this worked too: ``` $ sqlite-utils insert dogs.db dogs dogs.json \ --convert 'row[""is_good""] = 1' ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/371/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1097101917,I_kwDOBm6k_c5BZHJd,1588,`explain query plan select` is too strict about whitespace,9599,simonw,closed,0,,,7571612,Datasette 0.60,3,2022-01-09T04:22:42Z,2022-01-13T22:28:19Z,2022-01-13T20:35:05Z,OWNER,,"`explain query plan select * from facetable` is allowed: https://latest.datasette.io/fixtures?sql=explain+query+plan+select+*+from+facetable But... 
`explain query plan select * from facetable` (with two spaces before the `select`) returns a ""Statement must be a SELECT"" error: https://latest.datasette.io/fixtures?sql=explain+query+plan++select+*+from+facetable",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1588/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1097091527,I_kwDOCGYnMM5BZEnH,369,Research how much of a difference analyze / sqlite_stat1 makes,9599,simonw,closed,0,,,,,11,2022-01-09T03:03:36Z,2022-02-03T21:07:41Z,2022-02-03T21:07:35Z,OWNER,,"> Is there a downside to having a `sqlite_stat1` table if it has wildly incorrect statistics in it? _Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/365#issuecomment-1008163050_ More generally: how much of a difference does the `sqlite_stat1` table created by `ANALYZE` make to queries? I'm particularly interested in `group by` / `count *` queries since Datasette uses those for faceting.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/369/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1097087280,I_kwDOCGYnMM5BZDkw,368,Offer `python -m sqlite_utils` as an alternative to `sqlite-utils`,9599,simonw,closed,0,,,7558727,3.21,3,2022-01-09T02:29:30Z,2022-01-10T19:27:20Z,2022-01-09T02:40:50Z,OWNER,,"> Add this to `sqlite_utils/cli.py`: > > ```python > if __name__ == ""__main__"": > cli() > ``` > Now the tool can be run using `python -m sqlite_utils.cli --help` _Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/364#issuecomment-1008214998_",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/368/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1097040427,I_kwDOBm6k_c5BY4Ir,1587,Add `sqlite_stat1`(-4) tables to hidden table list,9599,simonw,closed,0,,,,,2,2022-01-08T21:28:20Z,2022-01-20T04:12:59Z,2022-01-20T04:12:59Z,OWNER,,"> Running `ANALYZE` creates a new visible table called `sqlite_stat1`: https://www.sqlite.org/fileformat.html#the_sqlite_stat1_table > > This should be added to the default list of hidden tables in Datasette.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1587/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1096563265,I_kwDOCGYnMM5BXDpB,366,Python library methods for calling ANALYZE,9599,simonw,closed,0,,,7558727,3.21,10,2022-01-07T18:28:01Z,2022-01-11T01:09:33Z,2022-01-11T01:09:33Z,OWNER,,"> Relevant documentation: https://www.sqlite.org/lang_analyze.html _Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/365#issuecomment-1007633376_",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/366/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1096558279,I_kwDOCGYnMM5BXCbH,365,create-index should run analyze after creating 
index,536941,fgregg,closed,0,,,7558727,3.21,16,2022-01-07T18:21:25Z,2022-01-11T02:43:34Z,2022-01-11T01:36:48Z,CONTRIBUTOR,,"sqlite's query planner depends upon analyze to make good use of indices. It would be nice if analyze was run as part of the create-index command. If data is inserted later, things can get out of date, but it would still probably be a net win. ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/365/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1096536240,I_kwDOBm6k_c5BW9Cw,1586,run analyze on all databases as part of start up or publishing,536941,fgregg,open,0,,,,,1,2022-01-07T17:52:34Z,2022-02-02T07:13:37Z,,CONTRIBUTOR,,"Running `analyze;` lets sqlite's query planner make *much* better use of any indices. It might be nice if the analyze was run as part of the start up of ""serve"" or ""publish"".",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1586/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1095570074,I_kwDOCGYnMM5BTRKa,364,`--batch-size 1` doesn't seem to commit for every item,9599,simonw,closed,0,,,7558727,3.21,16,2022-01-06T18:18:50Z,2022-01-10T19:27:17Z,2022-01-10T05:36:19Z,OWNER,,"I'm trying this, but it doesn't seem to write anything to the database file until I hit `CTRL+C`: ``` heroku logs --app=simonwillisonblog --tail | grep 'measure#nginx.service' | \ sqlite-utils insert /tmp/herokutail.db log - --import re --convert ""$(cat < sys.exit(load_entry_point('sqlite-utils', 'console_scripts', 'sqlite-utils')()) File ""/Users/simon/.local/share/virtualenvs/sqlite-utils-C4Ilevlm/lib/python3.8/site-packages/click/core.py"", line 1137, in __call__ return self.main(*args, **kwargs) File ""/Users/simon/.local/share/virtualenvs/sqlite-utils-C4Ilevlm/lib/python3.8/site-packages/click/core.py"", line 1062, in main rv = self.invoke(ctx) File ""/Users/simon/.local/share/virtualenvs/sqlite-utils-C4Ilevlm/lib/python3.8/site-packages/click/core.py"", line 1668, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/Users/simon/.local/share/virtualenvs/sqlite-utils-C4Ilevlm/lib/python3.8/site-packages/click/core.py"", line 1404, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/Users/simon/.local/share/virtualenvs/sqlite-utils-C4Ilevlm/lib/python3.8/site-packages/click/core.py"", line 763, in invoke return __callback(*args, **kwargs) File ""/Users/simon/Dropbox/Development/sqlite-utils/sqlite_utils/cli.py"", line 949, in insert insert_upsert_implementation( File ""/Users/simon/Dropbox/Development/sqlite-utils/sqlite_utils/cli.py"", line 834, in insert_upsert_implementation db[table].insert_all( File ""/Users/simon/Dropbox/Development/sqlite-utils/sqlite_utils/db.py"", line 2602, in insert_all first_record = next(records) File ""/Users/simon/Dropbox/Development/sqlite-utils/sqlite_utils/db.py"", line 3044, in fix_square_braces for record in records: File ""/Users/simon/Dropbox/Development/sqlite-utils/sqlite_utils/cli.py"", line 831, in <genexpr> docs = (decode_base64_values(doc) for doc in docs) File ""/Users/simon/Dropbox/Development/sqlite-utils/sqlite_utils/utils.py"", line 86, in decode_base64_values to_fix = [ File ""/Users/simon/Dropbox/Development/sqlite-utils/sqlite_utils/utils.py"", line 89, in <listcomp> if isinstance(doc[k], dict) TypeError: string
indices must be integers ``` It would be nicer if that returned a more useful error message. _Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/361#issuecomment-1006295276_",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/363/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1094974713,I_kwDOCGYnMM5BQ_z5,362,upsert --detect-types is broken,9599,simonw,closed,0,,,,,0,2022-01-06T05:12:10Z,2022-01-06T06:54:45Z,2022-01-06T06:28:34Z,OWNER,,"Noticed this thanks to syntax highlighting in VS Code showing an unused variable - need to fix it and add a test. ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/362/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1091850530,I_kwDODEm0Qs5BFFEi,63,Import archive error 'withheld_in_countries',521097,pauloxnet,open,0,,,,,0,2022-01-01T16:58:59Z,2022-01-01T16:58:59Z,,NONE,,"Importing the twitter archive I received this error: ```bash $ twitter-to-sqlite import archive.db twitter-2021-12-31-.zip birdwatch-note-rating: not yet implemented birdwatch-note: not yet implemented branch-links: not yet implemented community-tweet: not yet implemented contact: not yet implemented device-token: not yet implemented direct-message-mute: not yet implemented mute: not yet implemented periscope-account-information: not yet implemented periscope-ban-information: not yet implemented periscope-broadcast-metadata: not yet implemented periscope-comments-made-by-user: not yet implemented periscope-expired-broadcasts: not yet implemented periscope-followers: not yet implemented periscope-profile-description: not yet implemented professional-data: not yet implemented protected-history: not yet implemented reply-prompt: not yet implemented screen-name-change: not yet implemented smartblock: not yet implemented spaces-metadata: not yet implemented sso: not yet implemented Traceback (most recent call last): File ""/home/paulox/.virtualenvs/dogsheep/bin/twitter-to-sqlite"", line 8, in sys.exit(cli()) File ""/home/paulox/.virtualenvs/dogsheep/lib/python3.9/site-packages/click/core.py"", line 1128, in __call__ return self.main(*args, **kwargs) File ""/home/paulox/.virtualenvs/dogsheep/lib/python3.9/site-packages/click/core.py"", line 1053, in main rv = self.invoke(ctx) File ""/home/paulox/.virtualenvs/dogsheep/lib/python3.9/site-packages/click/core.py"", line 1659, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/home/paulox/.virtualenvs/dogsheep/lib/python3.9/site-packages/click/core.py"", line 1395, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/home/paulox/.virtualenvs/dogsheep/lib/python3.9/site-packages/click/core.py"", line 754, in invoke return __callback(*args, **kwargs) File ""/home/paulox/.virtualenvs/dogsheep/lib/python3.9/site-packages/twitter_to_sqlite/cli.py"", line 759, in import_ archive.import_from_file(db, filename, content) File ""/home/paulox/.virtualenvs/dogsheep/lib/python3.9/site-packages/twitter_to_sqlite/archive.py"", line 246, in import_from_file db[table_name].insert_all(rows, pk=pk, replace=True) File ""/home/paulox/.virtualenvs/dogsheep/lib/python3.9/site-packages/sqlite_utils/db.py"", line 2625, in insert_all self.insert_chunk( File 
""/home/paulox/.virtualenvs/dogsheep/lib/python3.9/site-packages/sqlite_utils/db.py"", line 2406, in insert_chunk result = self.db.execute(query, params) File ""/home/paulox/.virtualenvs/dogsheep/lib/python3.9/site-packages/sqlite_utils/db.py"", line 422, in execute return self.conn.execute(sql, parameters) sqlite3.OperationalError: table archive_tweet has no column named withheld_in_countries ``` I found only a single tweet with the key `withheld_in_countries` in `tweet.js` that seems the problems: ```JSON [ { ""tweet"" : { ""retweeted"" : false, ""source"" : ""Twitter for Android"", ""entities"" : { ""hashtags"" : [ { ""text"" : ""NowOnAndroid"", ""indices"" : [ ""64"", ""77"" ] } ], ""symbols"" : [ ], ""user_mentions"" : [ { ""name"" : ""Periscope"", ""screen_name"" : ""PeriscopeCo"", ""indices"" : [ ""3"", ""15"" ], ""id_str"" : ""1111111111"", ""id"" : ""222222222"" } ], ""urls"" : [ { ""url"" : ""https://t.co/xxxxxxxxx"", ""expanded_url"" : ""https://vine.co/v/xxxxxxxxx"", ""display_url"" : ""vine.co/v/xxxxxxxxxx"", ""indices"" : [ ""78"", ""101"" ] } ] }, ""display_text_range"" : [ ""0"", ""101"" ], ""favorite_count"" : ""0"", ""id_str"" : ""1111111111111111111111"", ""truncated"" : false, ""retweet_count"" : ""0"", ""withheld_in_countries"" : [ ""TR"" ], ""id"" : ""000000000000000000"", ""possibly_sensitive"" : false, ""created_at"" : ""Fri Aug 14 06:04:03 +0000 2015"", ""favorited"" : false, ""full_text"" : ""RT @periscopeco: Travel the world. LIVE. The Global Map is here #NowOnAndroid https://t.co/NZXdsPWROk"", ""lang"" : ""en"" } } ] ``` I solved the error removing the key from the `tweet.js` but I'm reporting this error to improve the project.",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/63/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1091838742,I_kwDOBm6k_c5BFCMW,1585,Fire base caching for `publish cloudrun`,9599,simonw,open,0,,,,,1,2022-01-01T15:38:15Z,2022-01-01T15:40:38Z,,OWNER,,"https://gist.github.com/steren/03d3e58c58c9a53fd49bb78f58541872 has a recipe for this, via https://twitter.com/steren/status/1477038411114446848 Could this enable easier vanity URLs of the format `https://$project_id.web.app/`? 
How about CDN caching?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1585/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1091819089,I_kwDOCGYnMM5BE9ZR,360,MemoryError,559453,nzaar9,closed,0,,,,,1,2022-01-01T13:39:17Z,2022-03-21T04:22:46Z,2022-03-21T04:22:46Z,NONE,,"Hi, when dealing with a large JSON file (~170GB) I got the following error: ``` Traceback (most recent call last): File ""/usr/local/bin/sqlite-utils"", line 8, in sys.exit(cli()) File ""/usr/lib/python3/dist-packages/click/core.py"", line 1126, in __call__ return self.main(*args, **kwargs) File ""/usr/lib/python3/dist-packages/click/core.py"", line 1051, in main rv = self.invoke(ctx) File ""/usr/lib/python3/dist-packages/click/core.py"", line 1657, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/usr/lib/python3/dist-packages/click/core.py"", line 1393, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/usr/lib/python3/dist-packages/click/core.py"", line 752, in invoke return __callback(*args, **kwargs) File ""/usr/local/lib/python3.9/dist-packages/sqlite_utils/cli.py"", line 1300, in memory rows, format_used = rows_from_file(csv_fp, format=format, encoding=encoding) File ""/usr/local/lib/python3.9/dist-packages/sqlite_utils/utils.py"", line 185, in rows_from_file return rows_from_file(buffered, format=Format.JSON) File ""/usr/local/lib/python3.9/dist-packages/sqlite_utils/utils.py"", line 156, in rows_from_file decoded = json.load(fp) File ""/usr/lib/python3.9/json/__init__.py"", line 293, in load return loads(fp.read(), MemoryError ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/360/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1091257796,I_kwDOBm6k_c5BC0XE,1584,give error with recursive sql,58088336,tunguyenatwork,open,0,,,,,0,2021-12-30T18:53:16Z,2021-12-30T18:53:16Z,,NONE,,"I got an error ""near ""WITH"": syntax error"" after I upgraded to version 0.59 from 0.52.4. This error is related to recursive SQL: it works great on the previous version but fails after the upgrade.
Below is an example of the SQL: ```sql WITH RECURSIVE manager_of(position, super_position) AS (SELECT position, case ifnull(INDIRECT_SUPER_POSITION,'') when '' then super_position else INDIRECT_SUPER_POSITION end as SUPER_POSITION FROM position where super_position<>'SGV000000001' and super_position!='' and position <> super_position),chain_manager_of_position(position, level) AS (SELECT super_position, 1 as level FROM manager_of WHERE super_position!='' and (position=:pos or position in (Select position from employee where employee=:ein)) UNION ALL SELECT super_position, level+1 as level FROM manager_of JOIN chain_manager_of_position USING(position)) SELECT * FROM chain_manager_of_position left join employee using(position) where employee is not NULL order by level limit 1 ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1584/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1090810196,I_kwDOBm6k_c5BBHFU,1583,consider adding deletion step of cloudbuild artifacts to gcloud publish,536941,fgregg,open,0,,,,,1,2021-12-30T00:33:23Z,2021-12-30T00:34:16Z,,CONTRIBUTOR,,"right now, as part of the publish process images and other artifacts are stored to gcloud's cloud storage before being deployed to cloudrun. after successfully deploying, it would be nice if the script deleted these artifacts. otherwise, if you have a regularly scheduled build process, you can end up paying to store lots of out of date artifacts.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1583/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1090798237,I_kwDOCGYnMM5BBEKd,359,Use RETURNING if available to populate last_pk,9599,simonw,open,0,,,,,0,2021-12-29T23:43:23Z,2021-12-29T23:43:23Z,,OWNER,,"Inspired by this: https://news.ycombinator.com/item?id=29729283 > Because SQLite is effectively serializing all the writes for us, we have zero locking in our code. We used to have to lock when inserting new items (to get the LastInsertRowId), but the newer version of SQLite supports the RETURNING keyword, so we don't even have to lock on inserts now.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/359/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1089529555,I_kwDOBm6k_c5A8ObT,1581,"when hashed urls are turned on, the _memory db has improperly long-lived cache expiry",536941,fgregg,closed,0,,,,,1,2021-12-28T00:05:48Z,2022-03-24T04:08:18Z,2022-03-24T04:08:18Z,CONTRIBUTOR,,"if hashed_urls are on, then a -000 suffix is added to the `_memory` database, and the cache settings are set just as if it was a normal hashed database. in particular, this header is set: `cache-control: max-age=31536000` this is not appropriate because the `_memory-000` database isn't really hashed based on the contents of the databases (see #1561). Either the cache-control header should be changed, or the _memory db should have a hash suffix that does depend on the contents of the databases.
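A rough sketch of that combined-hash idea, purely illustrative (the function and argument shape here are hypothetical, not Datasette internals):

```python
import hashlib

def memory_db_suffix(database_hashes):
    # database_hashes: per-database content hashes keyed by database name -
    # the values hashed URL mode already computes for each file (assumed shape)
    digest = hashlib.sha256()
    for name in sorted(database_hashes):
        digest.update(name.encode('utf-8'))
        digest.update(database_hashes[name].encode('utf-8'))
    # keep it short, in the same spirit as the current -000 suffix
    return digest.hexdigest()[:7]
```

Any change to any attached database would then change the `_memory` suffix and correctly bust the cache.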
",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1581/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1088816961,I_kwDODEm0Qs5A5gdB,62,KeyError: 'created_at' for private accounts?,6764957,swyxio,closed,0,,,,,2,2021-12-26T17:51:51Z,2022-03-12T02:36:32Z,2022-02-24T18:10:18Z,NONE,,"hey Simon! i was running `twitter-to-sqlite user-timeline twitter.db` for [my private alt](https://twitter.com/swyxio) and ran into this error:
![image](https://user-images.githubusercontent.com/6764957/147416165-46b69c30-100a-406f-8534-8612b75547ae.png) ```bash Traceback (most recent call last): File ""/Users/swyx/Work/datasette/env/bin/twitter-to-sqlite"", line 8, in sys.exit(cli()) File ""/Users/swyx/Work/datasette/env/lib/python3.9/site-packages/click/core.py"", line 1128, in __call__ return self.main(*args, **kwargs) File ""/Users/swyx/Work/datasette/env/lib/python3.9/site-packages/click/core.py"", line 1053, in main rv = self.invoke(ctx) File ""/Users/swyx/Work/datasette/env/lib/python3.9/site-packages/click/core.py"", line 1659, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/Users/swyx/Work/datasette/env/lib/python3.9/site-packages/click/core.py"", line 1395, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/Users/swyx/Work/datasette/env/lib/python3.9/site-packages/click/core.py"", line 754, in invoke return __callback(*args, **kwargs) File ""/Users/swyx/Work/datasette/env/lib/python3.9/site-packages/twitter_to_sqlite/cli.py"", line 291, in user_timeline profile = utils.get_profile(db, session, **kwargs) File ""/Users/swyx/Work/datasette/env/lib/python3.9/site-packages/twitter_to_sqlite/utils.py"", line 133, in get_profile save_users(db, [profile]) File ""/Users/swyx/Work/datasette/env/lib/python3.9/site-packages/twitter_to_sqlite/utils.py"", line 453, in save_users transform_user(user) File ""/Users/swyx/Work/datasette/env/lib/python3.9/site-packages/twitter_to_sqlite/utils.py"", line 285, in transform_user user[""created_at""] = parser.parse(user[""created_at""]) KeyError: 'created_at' ```
this looks awfully like #37 but it can't be, because i'm authed into my account and obviously i have perms to read my own account. wonder if there's any diagnostic methods i should apply here? just filing an issue for others to find while i investigate.",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/62/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1087931918,I_kwDOBm6k_c5A2IYO,1579,`.execute_write(... block=True)` should be the default behaviour,9599,simonw,closed,0,,,7571612,Datasette 0.60,7,2021-12-23T18:54:28Z,2022-01-13T22:28:08Z,2021-12-23T19:18:26Z,OWNER,,"Every single piece of code I've written against the write APIs has used the `block=True` option to wait for the result. Without that, it instead fires the write into the queue but then continues even before it has finished executing. `block=True` should clearly be the default behaviour here!",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1579/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1087919372,I_kwDOBm6k_c5A2FUM,1578,Confirm if documented nginx proxy config works for row pages with escaped characters in their primary key,9599,simonw,open,0,,,,,4,2021-12-23T18:27:59Z,2021-12-24T21:33:19Z,,OWNER,,"Found this while working on https://github.com/simonw/datasette-tiddlywiki Then clicking on `/tiddlywiki/tiddlers/%24%3A%2FDefaultTiddlers` returns a 404.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1578/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1087913724,I_kwDOBm6k_c5A2D78,1577,Drop support for Python 3.6,9599,simonw,closed,0,,,3268330,Datasette 1.0,6,2021-12-23T18:17:03Z,2022-01-25T23:30:03Z,2022-01-20T04:31:41Z,OWNER,,"*Original title: Decide when to drop support for Python 3.6* > `context_vars` can solve this but they were introduced in Python 3.7: https://www.python.org/dev/peps/pep-0567/ > > Python 3.6 support ends in a few days time, and it looks like Glitch has updated to 3.7 now - so maybe I can get away with Datasette needing 3.7 these days? > > Tweeted about that here: https://twitter.com/simonw/status/1473761478155010048 _Originally posted by @simonw in https://github.com/simonw/datasette/issues/1576#issuecomment-999878907_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1577/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1087181951,I_kwDOBm6k_c5AzRR_,1576,Traces should include SQL executed by subtasks created with `asyncio.gather`,9599,simonw,closed,0,,,3268330,Datasette 1.0,12,2021-12-22T20:52:02Z,2022-02-05T05:21:35Z,2022-02-05T05:19:53Z,OWNER,,"I tried running some parallel SQL queries using `asyncio.gather()` but the SQL that was executed didn't show up in the trace rendered by https://datasette.io/plugins/datasette-pretty-traces I realized that was because traces are keyed against the current task ID, which changes when a sub-task is run using `asyncio.gather` or similar. 
The faceting and suggest faceting queries are missing from this trace: ![image](https://user-images.githubusercontent.com/9599/147153855-2d611f07-922a-4d18-9e6e-4be89e010dc4.png) > The reason they aren't showing up in the traces is that traces are stored just for the currently executing `asyncio` task ID: https://github.com/simonw/datasette/blob/ace86566b28280091b3844cf5fbecd20158e9004/datasette/tracer.py#L13-L25 > > This is so traces for other incoming requests don't end up mixed together. But there's no current mechanism to track async tasks that are effectively ""child tasks"" of the current request, and hence should be tracked the same. > > https://stackoverflow.com/a/69349501/6083 suggests that you pass the task ID as an argument to the child tasks that are executed using `asyncio.gather()` to work around this kind of problem. _Originally posted by @simonw in https://github.com/simonw/datasette/issues/1518#issuecomment-999870993_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1576/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1084257842,I_kwDOBm6k_c5AoHYy,1575,__call__() got an unexpected keyword argument 'specname',9599,simonw,closed,0,,,,,1,2021-12-20T01:24:04Z,2021-12-20T01:48:03Z,2021-12-20T01:47:57Z,OWNER,,"> I've installed the alpha version but get an error when starting up Datasette: ``` Traceback (most recent call last): File ""/Users/tim/.pyenv/versions/stock-exchange/bin/datasette"", line 5, in from datasette.cli import cli File ""/Users/tim/.pyenv/versions/3.8.5/envs/stock-exchange/lib/python3.8/site-packages/datasette/cli.py"", line 15, in from .app import Datasette, DEFAULT_SETTINGS, SETTINGS, SQLITE_LIMIT_ATTACHED, pm File ""/Users/tim/.pyenv/versions/3.8.5/envs/stock-exchange/lib/python3.8/site-packages/datasette/app.py"", line 31, in from .views.database import DatabaseDownload, DatabaseView File ""/Users/tim/.pyenv/versions/3.8.5/envs/stock-exchange/lib/python3.8/site-packages/datasette/views/database.py"", line 25, in from datasette.plugins import pm File ""/Users/tim/.pyenv/versions/3.8.5/envs/stock-exchange/lib/python3.8/site-packages/datasette/plugins.py"", line 29, in mod = importlib.import_module(plugin) File ""/Users/tim/.pyenv/versions/3.8.5/lib/python3.8/importlib/__init__.py"", line 127, in import_module return _bootstrap._gcd_import(name[level:], package, level) File ""/Users/tim/.pyenv/versions/3.8.5/envs/stock-exchange/lib/python3.8/site-packages/datasette/filters.py"", line 9, in @hookimpl(specname=""filters_from_request"") TypeError: __call__() got an unexpected keyword argument 'specname' ``` _Originally posted by @wragge in https://github.com/simonw/datasette/issues/1547#issuecomment-997511968_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1575/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1084185188,I_kwDOBm6k_c5An1pk,1573,Make trace() a documented internal API,9599,simonw,open,0,,,,,1,2021-12-19T20:32:56Z,2021-12-19T21:13:13Z,,OWNER,,"This should be documented so plugin authors can use it to add their own custom traces: https://github.com/simonw/datasette/blob/8f311d6c1d9f73f4ec643009767749c17b5ca5dd/datasette/tracer.py#L28-L52 Including the new `kwargs` pattern I added in #1571: 
https://github.com/simonw/datasette/blob/f65817000fdf87ce8a0c23edc40784ebe33b5842/datasette/database.py#L128-L132",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1573/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1084007781,I_kwDOBm6k_c5AnKVl,1572,"""Query took"" should be ""Queries took""",9599,simonw,closed,0,,,7571612,Datasette 0.60,0,2021-12-19T04:03:00Z,2022-01-13T22:27:43Z,2021-12-19T04:03:24Z,OWNER,,"This is misleading, since usually there have been more than one query executed: ![CleanShot 2021-12-18 at 20 02 35@2x](https://user-images.githubusercontent.com/9599/146663457-9c4c2900-5cc0-4650-a565-bb1ff0b8a725.png) ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1572/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1083927147,I_kwDOBm6k_c5Am2pr,1571,Track number of executions for execute_write_many() in traces,9599,simonw,closed,0,,,7571612,Datasette 0.60,0,2021-12-18T19:16:17Z,2022-01-13T22:27:49Z,2021-12-19T20:30:40Z,OWNER,,"Spotted while working on #1555 There's no indication there of how many times `execute_write_many()` executed the SQL. Solving this is a tiny bit tricky because `params_seq` is an iterator that we don't want to exhaust before passing it to `conn.executemany()` - so we need to instead wrap it in something that counts how many times it was called. But then we need a way to attach that to the trace here: https://github.com/simonw/datasette/blob/d637ed46762fdbbd8e32b86f258cd9a53c1cfdc7/datasette/database.py#L115-L122 So probably need to redesign the `trace()` decorator to allow extra pairs to be attached to it within the `with` statement. 
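As a sketch of the counting piece (illustrative only, not the final design):

```python
class CountingIterator:
    # Wraps params_seq so conn.executemany() can still consume it lazily
    # while we keep a tally of how many parameter sets it actually used
    def __init__(self, inner):
        self.inner = iter(inner)
        self.count = 0

    def __iter__(self):
        return self

    def __next__(self):
        value = next(self.inner)  # StopIteration ends executemany() as usual
        self.count += 1
        return value
```

After `conn.executemany(sql, wrapper)` returns, `wrapper.count` would hold the number of executions, ready to be attached to the trace as one of those extra pairs.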
",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1571/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1083921371,I_kwDOBm6k_c5Am1Pb,1570,Separate db.execute_write() into three methods,9599,simonw,closed,0,,,7571612,Datasette 0.60,2,2021-12-18T18:45:54Z,2022-01-13T22:27:38Z,2021-12-18T18:57:25Z,OWNER,,"> Rather than adding a `executemany=True` parameter, I'm now thinking a better design might be to have three methods: > > - `db.execute_write(sql, params=None, block=False)` > - `db.execute_write_script(sql, block=False)` > - `db.execute_write_many(sql, params_seq, block=False)` _Originally posted by @simonw in https://github.com/simonw/datasette/issues/1555#issuecomment-997267416_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1570/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1083895395,I_kwDOBm6k_c5Amu5j,1569,"db.execute_write(..., executescript=True) parameter",9599,simonw,closed,0,,,7571612,Datasette 0.60,2,2021-12-18T18:20:47Z,2022-01-13T22:27:27Z,2021-12-18T18:34:18Z,OWNER,,"> Idea: teach `execute_write` to accept an optional `executescript=True` parameter, like this: ```diff diff --git a/datasette/database.py b/datasette/database.py index 468e936..1a424f5 100644 --- a/datasette/database.py +++ b/datasette/database.py @@ -94,10 +94,14 @@ class Database: f""file:{self.path}{qs}"", uri=True, check_same_thread=False ) - async def execute_write(self, sql, params=None, block=False): + async def execute_write(self, sql, params=None, executescript=False, block=False): + assert not executescript and params, ""Cannot use params with executescript=True"" def _inner(conn): with conn: - return conn.execute(sql, params or []) + if executescript: + return conn.executescript(sql) + else: + return conn.execute(sql, params or []) with trace(""sql"", database=self.name, sql=sql.strip(), params=params): results = await self.execute_write_fn(_inner, block=block) ``` _Originally posted by @simonw in https://github.com/simonw/datasette/issues/1555#issuecomment-997248364_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1569/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1083726550,I_kwDOBm6k_c5AmFrW,1568,Trace should show queries on the write connection too,9599,simonw,closed,0,,,7571612,Datasette 0.60,2,2021-12-18T02:34:12Z,2022-01-13T22:27:23Z,2021-12-18T02:42:34Z,OWNER,,"> Here's why - `trace` only applies to read, not write SQL operations: https://github.com/simonw/datasette/blob/7c8f8aa209e4ba7bf83976f8495d67c28fbfca24/datasette/database.py#L209-L211 _Originally posted by @simonw in https://github.com/simonw/datasette/issues/1555#issuecomment-997128508_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1568/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1083718998,I_kwDOBm6k_c5AmD1W,1567,Remove undocumented sqlite_functions mechanism,9599,simonw,closed,0,,,7571612,Datasette 0.60,0,2021-12-18T01:51:10Z,2022-01-13T22:27:04Z,2021-12-18T01:54:46Z,OWNER,,"I added this in 
0b8c1b0a6da9cb8ac0d28cc90dd783de87554036 but it's never been documented and the same thing can now be achieved using the `prepare_connection` plugin hook. https://github.com/simonw/datasette/blob/0c91e59d2bbfc08884cfcf5d1b902a2f4968b7ff/datasette/app.py#L262 https://github.com/simonw/datasette/blob/0c91e59d2bbfc08884cfcf5d1b902a2f4968b7ff/datasette/app.py#L551-L552 It's used here in the tests: https://github.com/simonw/datasette/blob/69244a617b1118dcbd04a8f102173f04680cf08c/tests/fixtures.py#L156",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1567/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1083669410,I_kwDOBm6k_c5Al3ui,1566,Release Datasette 0.60,9599,simonw,closed,0,,,7571612,Datasette 0.60,6,2021-12-17T22:58:12Z,2022-01-14T01:59:55Z,2022-01-14T01:59:55Z,OWNER,,Using this as a tracking issue. I'm hoping to get the bulk of the JSON redesign work from the refactor in #1554 in for this release.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1566/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1083657868,I_kwDOBm6k_c5Al06M,1565,Documented JavaScript variables on different templates made available for plugins,9599,simonw,open,0,,,,,8,2021-12-17T22:30:51Z,2021-12-19T22:37:29Z,,OWNER,,"While working on https://github.com/simonw/datasette-leaflet-freedraw/issues/10 I found myself writing this atrocity to figure out the SQL query used for a specific table page: ```javascript let innerSql = Array.from(document.getElementsByTagName(""span"")).filter( el => el.innerText == ""View and edit SQL"" )[0].parentElement.getAttribute(""title"") ``` This is obviously bad - it's very brittle, and will break if I ever change the text on that link (like localizing it for example). 
Instead, I think pages like that one should have a block of script at the bottom something like this: ```javascript window.datasette = window.datasette || {}; datasette.view_name = 'table'; datasette.table_sql = 'select * from ...'; ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1565/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1083581011,I_kwDOBm6k_c5AliJT,1564,_prepare_connection not called on write connections,9599,simonw,closed,0,,,7571612,Datasette 0.60,1,2021-12-17T20:06:47Z,2022-01-20T21:29:43Z,2021-12-18T01:58:44Z,OWNER,,"I was trying to initialize SpatiaLite in a write connection: ```pycon >>> from datasette.app import Datasette >>> ds = Datasette(memory=True, files=[], sqlite_extensions=[""spatialite""]) >>> db = ds.add_memory_database('geo') >>> await db.execute_write(""select InitSpatialMetadata(1)"") UUID('3f143baa-4e3d-5842-a36f-4fa2f683b72f') no such function: InitSpatialMetadata ``` It looks like the code that loads additional modules only works on read-only connections, not on write connections: https://github.com/simonw/datasette/blob/92a5280d2e75c39424a75ad6226fc74400ae984f/datasette/database.py#L146-L153 Compared to: https://github.com/simonw/datasette/blob/92a5280d2e75c39424a75ad6226fc74400ae984f/datasette/database.py#L124-L132",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1564/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1083573206,I_kwDOBm6k_c5AlgPW,1563,Datasette(... files=) should not be a required argument,9599,simonw,closed,0,,,7571612,Datasette 0.60,2,2021-12-17T19:54:18Z,2022-01-13T22:27:18Z,2021-12-18T02:19:40Z,OWNER,,"```pycon >>> ds = Datasette(memory=True) Traceback (most recent call last): File ""<stdin>"", line 1, in <module> TypeError: __init__() missing 1 required positional argument: 'files' >>> ds = Datasette(memory=True, files=[]) ``` I wanted to create an in-memory Datasette for running some tests, no point in forcing me to pass `files=[]` to do that.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1563/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1082765654,I_kwDOBm6k_c5AibFW,1561,"add hash id to ""_memory"" url if hashed url mode is turned on and crossdb is also turned on",536941,fgregg,closed,0,,,,,3,2021-12-17T00:45:12Z,2022-03-19T04:45:40Z,2022-03-19T04:45:40Z,CONTRIBUTOR,,"If hashed_url mode is turned on and crossdb is also turned on, then queries to _memory should have a hash_id. One way that it could work is to have the _memory hash be a hash of all the individual databases. Otherwise, crossdb queries can get quite out of date if using aggressive caching. ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1561/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1082746149,I_kwDOBm6k_c5AiWUl,1560,"Table page title has ""where where"" in it",9599,simonw,closed,0,,,7571612,Datasette 0.60,0,2021-12-17T00:05:48Z,2022-01-13T22:28:35Z,2022-01-13T22:20:15Z,OWNER,,"Just noticed this while working on #1518.
``` % curl -s 'https://latest.datasette.io/fixtures/facetable?_sort=pk&on_earth__exact=1' | grep -C 1 '<title>' <head> <title>fixtures: facetable: 14 rows where where on_earth = 1 sorted by pk</title> ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1560/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1082651698,I_kwDOCGYnMM5Ah_Qy,358,Support for CHECK constraints,11597658,luxint,open,0,,,,,7,2021-12-16T21:19:45Z,2022-09-25T07:15:59Z,,NONE,,"Hi, I noticed the `transform.table()` method doesn't have an option to add/change or drop a check constraint (see https://sqlite.org/lang_createtable.html -> 3.7 Check Constraints). Would be great to have this as an option! ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/358/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1082584499,I_kwDOBm6k_c5Ahu2z,1558,Redesign `facet_results` JSON structure prior to Datasette 1.0,9599,simonw,open,0,,,3268330,Datasette 1.0,3,2021-12-16T19:45:10Z,2023-01-09T15:31:17Z,,OWNER,,"> Decision: as an initial fix I'm going to de-duplicate those keys by using `tags__array` etc - with a `_2` on the end if that key is already used. > > I'll open a separate issue to redesign this better for Datasette 1.0. _Originally posted by @simonw in https://github.com/simonw/datasette/issues/625#issuecomment-996130862_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1558/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1082564912,I_kwDOBm6k_c5AhqEw,1557,`?_nosuggest=1` parameter for disabling facet suggestions on table view,9599,simonw,closed,0,,,7571612,Datasette 0.60,1,2021-12-16T19:21:42Z,2022-01-13T22:26:48Z,2021-12-16T19:24:59Z,OWNER,,"Found I wanted this while I was debugging #625 just to clean up the debug traces, but it makes sense as a partner to `?_nofacet=1` and `?_nocount=1` from #1350 and #1353.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1557/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1081318247,I_kwDOBm6k_c5Ac5tn,1556,"Show count of facet values always, not just for `?_facet_size=max`",9599,simonw,closed,0,,,7571612,Datasette 0.60,1,2021-12-15T17:49:01Z,2022-01-13T22:26:07Z,2021-12-15T17:58:06Z,OWNER,,"> You've caused me to rethink this feature - I no longer think there's value in only showing these numbers if `?_facet_size=max` as opposed to all of the time.
_Originally posted by @simonw in https://github.com/simonw/datasette/issues/1423#issuecomment-995023410_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1556/reactions"", ""total_count"": 1, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 1, ""rocket"": 0, ""eyes"": 0}",,completed 1079422215,I_kwDOCGYnMM5AVq0H,357,pytest-runner is not required,4067843,pgajdos,closed,0,,,,,1,2021-12-14T07:51:24Z,2021-12-16T20:43:19Z,2021-12-16T20:43:13Z,NONE,,Deprecated pytest-runner is not necessary for running the testsuite.,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/357/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1079149656,I_kwDOBm6k_c5AUoRY,1555,Optimize all those calls to index_list and foreign_key_list,9599,simonw,closed,0,,,7571612,Datasette 0.60,27,2021-12-13T23:50:56Z,2022-01-13T22:27:32Z,2021-12-19T20:55:59Z,OWNER,,"On the first hit to a restarted index I'm seeing this in the SQL traces: https://latest-with-plugins.datasette.io/github/commits?_trace=1 I imagine this could be sped up a lot using tricks like this one from the SQLite documentation: https://sqlite.org/pragma.html#pragfunc ```sql SELECT DISTINCT m.name || '.' || ii.name AS 'indexed-columns' FROM sqlite_schema AS m, pragma_index_list(m.name) AS il, pragma_index_info(il.name) AS ii WHERE m.type='table' ORDER BY 1; ``` https://latest-with-plugins.datasette.io/fixtures?sql=SELECT+DISTINCT+m.name+%7C%7C+%27.%27+%7C%7C+ii.name+AS+%27indexed-columns%27%0D%0A++FROM+sqlite_schema+AS+m%2C%0D%0A+++++++pragma_index_list%28m.name%29+AS+il%2C%0D%0A+++++++pragma_index_info%28il.name%29+AS+ii%0D%0A+WHERE+m.type%3D%27table%27%0D%0A+ORDER+BY+1%3B",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1555/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1079111498,I_kwDOBm6k_c5AUe9K,1553,if csv export is truncated in non streaming mode set informative response header,536941,fgregg,open,0,,,,,3,2021-12-13T22:50:44Z,2021-12-16T19:17:28Z,,CONTRIBUTOR,,"streaming mode is currently not enabled for custom queries, so the queries will be truncated to max row limit. it would be great if, when a response is truncated, a header signalling the truncation was set on the response. i need to write some pagination code for getting full results back for a custom query and it would make the code much better if i could reliably know when there is nothing more to limit/offset ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1553/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1078702875,I_kwDOBm6k_c5AS7Mb,1552,Allow to set `facets_array` in metadata (like current `facets`),3556,davidbgk,closed,0,,,7571612,Datasette 0.60,9,2021-12-13T16:00:44Z,2022-01-13T22:26:15Z,2021-12-16T18:47:48Z,CONTRIBUTOR,,"For now, you can set a `facets` value (array) in your metadata file but I couldn't find a way to set a `facets_array` in order to provide default facets for arrays (like tags). My use-case is to access [that kind of view](https://latest.datasette.io/fixtures/facetable?_facet_array=tags) by default without URL parameters as with other default facets.
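For illustration, the kind of configuration I would expect to be able to write in `metadata.json` (hypothetical syntax, mirroring the `_facet_array=tags` query string parameter):

```json
{
    ""databases"": {
        ""fixtures"": {
            ""tables"": {
                ""facetable"": {
                    ""facets"": [""planet_int"", {""array"": ""tags""}]
                }
            }
        }
    }
}
```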
_I'm new to datasette, and I'm willing to help with a PR if that is not already implemented and I missed it!_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1552/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1077893013,I_kwDOBm6k_c5AP1eV,1551,`keep_blank_values=True` when parsing `request.args`,9599,simonw,closed,0,,,7571612,Datasette 0.60,3,2021-12-12T19:53:07Z,2022-01-13T22:26:04Z,2021-12-12T20:02:01Z,OWNER,,"This code in `TableView` wouldn't be necessary: https://github.com/simonw/datasette/blob/492f9835aa7e90540dd0c6324282b109f73df71b/datasette/views/table.py#L396-L399 If that happened here instead: https://github.com/simonw/datasette/blob/492f9835aa7e90540dd0c6324282b109f73df71b/datasette/utils/asgi.py#L98-L100 _Originally posted by @simonw in https://github.com/simonw/datasette/issues/1518#issuecomment-991827468_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1551/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1077628073,I_kwDOBm6k_c5AO0yp,1550,Research option for returning all rows from arbitrary query,9599,simonw,open,0,,,,,2,2021-12-11T19:31:11Z,2021-12-11T23:43:24Z,,OWNER,,"Inspired by thinking about #1549 - returning ALL rows from an arbitrary query is a lot easier if you just run that query and keep iterating over the cursor. I've avoided doing that in the past because it could tie up a connection for a long time - but in private instances this wouldn't be such a problem.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1550/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1077620955,I_kwDOBm6k_c5AOzDb,1549,Redesign CSV export to improve usability,536941,fgregg,open,0,,,3268330,Datasette 1.0,5,2021-12-11T19:02:12Z,2022-04-04T11:17:13Z,,CONTRIBUTOR,,"*Original title: Set content type for CSV so that browsers will attempt to download instead of opening in the browser* Right now, if the user clicks on the CSV related to a table or a query, the response header for the content type is ""content-type: text/plain; charset=utf-8"" Most browsers will try to open a file with this content-type in the browser. This is not what most people want to do, and lots of folks don't know that if they want to download the CSV and open it in a spreadsheet program they next need to save the page through their browser. It would be great if the response header could be something like ``` 'Content-type: text/csv'); 'Content-disposition: attachment;filename=MyVerySpecial.csv'); ``` which would lead browsers to open a download dialog.
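For reference, a minimal sketch of that header combination as a bare ASGI app (illustrative only, not Datasette's actual CSV code):

```python
async def csv_download_app(scope, receive, send):
    # Responds to any request with a small CSV that browsers will download
    body = b'id,name\n1,example\n'
    await send({
        'type': 'http.response.start',
        'status': 200,
        'headers': [
            (b'content-type', b'text/csv; charset=utf-8'),
            (b'content-disposition', b'attachment; filename=""MyVerySpecial.csv""'),
        ],
    })
    await send({'type': 'http.response.body', 'body': body})
```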
",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1549/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1077560091,I_kwDODEm0Qs5AOkMb,61,"Data Pull fails for ""Essential"" level access to the Twitter API (for Documentation)",57161638,jmnickerson05,open,0,,,,,1,2021-12-11T14:59:41Z,2022-10-31T14:47:58Z,,NONE,,"Per Twitter documentation: https://developer.twitter.com/en/docs/twitter-api/getting-started/about-twitter-api#v2-access-leve This isn't any fault of twitter-to-sqlite of course, but it should probably be documented as a side-note. ![image](https://user-images.githubusercontent.com/57161638/145681272-8c85b3b9-be95-44ff-9760-1bafa4917ce2.png) And this is how I'm surfacing the message from utils.py: ![image](https://user-images.githubusercontent.com/57161638/145681005-2776c0ad-9822-4461-b43a-450ab2e828eb.png) ",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/61/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1077431957,I_kwDOCGYnMM5AOE6V,356,`sqlite-utils insert --convert` option,9599,simonw,closed,0,,,,,11,2021-12-11T07:24:48Z,2022-01-06T06:30:13Z,2022-01-06T06:28:53Z,OWNER,,"Idea come to me while re-reading this: https://simonwillison.net/2021/Aug/6/sqlite-utils-convert/ This is a bit of a hack: ``` cat /tmp/log.txt | \ jq --raw-input '{line: .}' --compact-output | \ sqlite-utils insert /tmp/logs.db log - --nl ``` Would be great if you could pipe lines to `insert` and transform them on the way in. A `--convert python-code` option, modeled after `sqlite-utils convert`, could do this.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/356/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1077322009,I_kwDOCGYnMM5ANqEZ,355,Allow users to pass a full convert() function definition,9599,simonw,closed,0,,,,,4,2021-12-10T23:59:58Z,2021-12-11T00:51:15Z,2021-12-11T00:49:31Z,OWNER,,"> I think the fix for this is to change the rules about what code is accepted in both the `-` mode and the literal code string mode: you can pass in a Python expression, OR a fragment that gets turned into a function, OR code that implements its own `def convert(value)` function. 
So this would work too: > ```sh > sqlite-utils convert my.db mytable col1 ' > def convert(value): > return value.upper() > ' > ``` _Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/353#issuecomment-991381679_",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/355/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1077243232,I_kwDOCGYnMM5ANW1g,354,Test failure in test_rebuild_fts,9599,simonw,closed,0,,,,,7,2021-12-10T21:27:55Z,2021-12-11T01:08:46Z,2021-12-11T01:08:46Z,OWNER,,"Not sure why this has only just started failing, but I'm getting this: https://github.com/simonw/sqlite-utils/runs/4488687639 ``` E sqlite3.DatabaseError: database disk image is malformed sqlite_utils/db.py:425: DatabaseError _______________________ test_rebuild_fts[searchable_fts] _______________________ fresh_db = > table_to_fix = 'searchable_fts' @pytest.mark.parametrize(""table_to_fix"", [""searchable"", ""searchable_fts""]) def test_rebuild_fts(fresh_db, table_to_fix): table = fresh_db[""searchable""] table.insert(search_records[0]) table.enable_fts([""text"", ""country""]) # Run a search rows = list(table.search(""tanuki"")) assert len(rows) == 1 assert { ""rowid"": 1, ""text"": ""tanuki are running tricksters"", ""country"": ""Japan"", ""not_searchable"": ""foo"", }.items() <= rows[0].items() # Delete from searchable_fts_data fresh_db[""searchable_fts_data""].delete_where() # This should have broken the index with pytest.raises(sqlite3.DatabaseError): list(table.search(""tanuki"")) # Running rebuild_fts() should fix it > fresh_db[table_to_fix].rebuild_fts() ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/354/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1077102934,I_kwDOCGYnMM5AM0lW,353,"Allow passing a file of code to ""sqlite-utils convert""",536941,fgregg,closed,0,,,,,8,2021-12-10T18:06:14Z,2021-12-11T01:38:29Z,2021-12-11T01:09:39Z,CONTRIBUTOR,,"sqlite-utils is so nice, but the ergonomics of the multiline code is kind of tough. It's really hard (maybe impossible) to make the newlines play well with Makefiles. It would be great to write your code fragment in a separate file and direct it into the sqlite-utils either like ```sqlite-utils convert my.db my_table my_column < custom_code.py``` or ```sqlite-utils convert my.db my_table my_column --custom-code=custom_code.py``` Thanks, as ever, for these great tools!",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/353/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1076388044,I_kwDOBm6k_c5AKGDM,1547,Writable canned queries fail to load custom templates,127565,wragge,closed,0,,,7571612,Datasette 0.60,6,2021-12-10T03:31:48Z,2022-01-13T22:27:59Z,2021-12-19T21:12:00Z,CONTRIBUTOR,,"I've created a canned query with `""write"": true` set. I've also created a custom template for it, but the template doesn't seem to be found.
If I look in the HTML I see (`stock_exchange` is the db name): `` My non-writeable canned queries pick up custom templates as expected, and if I look at their HTML I see the canned query name added to the templates considered (the canned query here is `date_search`): `` So it seems like the writeable canned query is behaving differently for some reason. Is it an authentication thing? I'm using the built in `--root` authentication. Thanks! ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1547/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1076057610,I_kwDOBm6k_c5AI1YK,1546,validating the sql,50336793,jadsongmatos,closed,0,,,,,1,2021-12-09T21:35:57Z,2021-12-18T02:05:17Z,2021-12-18T02:05:16Z,NONE,,Could someone tell me what part of the code is responsible for validating the SQL that guarantees that only a table can be read?,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1546/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1075893249,I_kwDOBm6k_c5AINQB,1545,Custom pages don't work on windows,559711,ryascott,closed,0,,,,,3,2021-12-09T18:53:05Z,2022-02-03T02:08:31Z,2022-02-03T01:58:35Z,NONE,,"It seems that custom pages don't work when put in templates/pages To reproduce on datasette version 0.59.4 using PowerShell on Windows 10 with Python 3.10.0 mkdir -p templates/pages echo ""hello world"" >> templates/pages/about.html Start datasette datasette --template-dir templates/ Navigate to [http://127.0.0.1:8001/about](url) and receive: Error 404: Database not found: about ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1545/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1073712378,I_kwDOBm6k_c4__4z6,1544,Code that detects the label column for a table is case-sensitive,9599,simonw,closed,0,,,,,2,2021-12-07T20:01:25Z,2021-12-07T20:03:43Z,2021-12-07T20:03:43Z,OWNER,,I just noticed that a column called `Name` is not being picked up as the label column for a table.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1544/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1072792507,I_kwDOCGYnMM4_8YO7,352,`sqlite-utils insert --extract colname`,9599,simonw,open,0,,,,,4,2021-12-07T00:55:44Z,2022-02-03T22:59:36Z,,OWNER,,"Is there a reason I've not added `--extract` as an option for `sqlite-utils insert` next?
There's an `extracts=` option for the various `table.insert()` etc methods - last line in this code block: https://github.com/simonw/sqlite-utils/blob/213a0ff177f23a35f3b235386366ff132eb879f1/sqlite_utils/db.py#L2483-L2495",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/352/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1072780607,I_kwDOCGYnMM4_8VU_,351,Support `--import xml.etree.ElementTree` in `sqlite-utils convert`,9599,simonw,closed,0,,,,,1,2021-12-07T00:40:29Z,2021-12-11T00:11:25Z,2021-12-11T00:11:25Z,OWNER,,"It's not possible to use a module that requires a nested import, such as `xml.etree.ElementTree`, at the moment. I found and fixed this bug in `git-history`, I should replicate that fix (and accompanying documentation) here: https://github.com/simonw/git-history/issues/39",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/351/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1072435124,I_kwDOCGYnMM4_7A-0,350,Optional caching mechanism for table.lookup(),9599,simonw,open,0,,,,,3,2021-12-06T17:54:25Z,2021-12-06T17:56:57Z,,OWNER,,"Inspired by work on `git-history` where I used this pattern: ```python column_name_to_id = {} def column_id(column): if column not in column_name_to_id: id = db[""columns""].lookup( {""namespace"": namespace_id, ""name"": column}, foreign_keys=((""namespace"", ""namespaces"", ""id""),), ) column_name_to_id[column] = id return column_name_to_id[column] ``` If you're going to be doing a large number of `table.lookup(...)` calls and you know that no other script will be modifying the database at the same time you can presumably get a big speedup using a Python in-memory cache - maybe even an LRU one to avoid memory bloat.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/350/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1072106103,I_kwDOBm6k_c4_5wp3,1542,feature request: order and dependency of plugins (that use js),33631,fs111,open,0,,,,,1,2021-12-06T12:40:45Z,2021-12-15T17:47:08Z,,NONE,,"I have been playing with datasette for the last couple of weeks and it is great! I am a big fan of `datasette-cluster-map` and wanted to enhance it a bit with what I would call a sub-plugin. I basically want to add more controls to the map that cluster map provides. I have been looking into its code and how the plugin management works, but it seems what I am trying to do is not doable without hacks in js. Basically what I would like to have is a way to say load my plugin after the plugins I depend on have been loaded and rendered. There seems to be no prior art where plugins have these dependencies on the js level so I was wondering if that could be added or if it exists how to do it. Basically what I want to do is: my-awesome-plugin has a dependency on datasette-cluster-map. Whenever datasette cluster map has finished rendering on page load, call my plugin, but no earlier. To make that work datasette probably needs some total order in which plugins are loaded and initialized.
Since I am new to datasette, I may be missing something obvious, so please let me know if the above makes no sense.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1542/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1071531082,I_kwDOCGYnMM4_3kRK,349,A way of creating indexes on newly created tables,9599,simonw,open,0,,,,,3,2021-12-05T18:56:12Z,2021-12-07T01:04:37Z,,OWNER,,"I'm writing code for https://github.com/simonw/git-history/issues/33 that creates a table inside a loop: ```python item_pk = db[item_table].lookup( {""_item_id"": item_id}, item_to_insert, column_order=(""_id"", ""_item_id""), pk=""_id"", ) ``` I need to look things up by `_item_id` on this table, which means I need an index on that column (the table can get very big). But there's no mechanism in SQLite utils to detect if the table was created for the first time and add an index to it. And I don't want to run `CREATE INDEX IF NOT EXISTS` every time through the loop. This should work like the `foreign_keys=` mechanism. ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/349/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1071071397,I_kwDODFdgUs4_10Cl,69,View that combines issues and issue comments,9599,simonw,open,0,,,,,1,2021-12-04T00:34:33Z,2021-12-04T00:34:52Z,,MEMBER,,I want to see a reverse chronologically ordered interface onto both issues and comments - essentially a unified log of comments and issues opened across one or multiple projects.,207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/69/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1069881276,I_kwDOBm6k_c4_xRe8,1541,Different default layout for row page,9599,simonw,open,0,,,,,1,2021-12-02T18:56:36Z,2021-12-02T18:56:54Z,,OWNER,,"The row page displays as a table even though it only has one table row. Maybe default to the same display as the narrow page version, even for wide pages?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1541/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1068791148,I_kwDOBm6k_c4_tHVs,1540,Idea: hover to reveal details of linked row,9599,simonw,open,0,,,,,6,2021-12-01T19:28:07Z,2021-12-09T23:38:39Z,,OWNER,," Hovering over that could work a little bit like GitHub issue links: ![hover](https://user-images.githubusercontent.com/9599/144300537-9cd9e9af-ac16-42db-842f-37661bc94063.gif) ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1540/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1067775061,I_kwDOBm6k_c4_pPRV,1539,Research PRAGMA query_only,9599,simonw,open,0,,,,,0,2021-11-30T23:30:24Z,2021-11-30T23:30:24Z,,OWNER,,"https://www.sqlite.org/pragma.html#pragma_query_only > The query_only pragma prevents data changes on database files when enabled.
When this pragma is enabled, any attempt to CREATE, DELETE, DROP, INSERT, or UPDATE will result in an [SQLITE_READONLY](https://www.sqlite.org/rescode.html#readonly) error. However, the database is not truly read-only. You can still run a [checkpoint](https://www.sqlite.org/wal.html#ckpt) or a [COMMIT](https://www.sqlite.org/lang_transaction.html) and the return value of the [sqlite3_db_readonly()](https://www.sqlite.org/c3ref/db_readonly.html) routine is not affected. Would it be worth adding this as an extra protection against accidental writes to a DB file over a read-only connection?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1539/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1067771698,I_kwDOCGYnMM4_pOcy,348,Command for creating an empty database,9599,simonw,closed,0,,,7558727,3.21,6,2021-11-30T23:24:27Z,2022-01-13T07:06:59Z,2022-01-09T20:33:20Z,OWNER,,"I sometimes find the need to create an empty SQLite database file - for example if I want to enable WAL on it before using it with another script. I currently do that like this: sqlite3 my.db vacuum sqlite-utils enable-wal my.db It would be nice if `sqlite-utils` had a convenience command for doing this.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/348/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1066563554,I_kwDOCGYnMM4_knfi,346,Way to test SQLite 3.37 (and potentially other versions) in CI,9599,simonw,open,0,,,,,5,2021-11-29T22:21:06Z,2021-11-29T23:12:49Z,,OWNER,,"> Need to figure out a good pattern for testing this in CI too - it will currently skip the new tests if it doesn't have SQLite 3.37 or higher. _Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/344#issuecomment-982076924_",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/346/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1066501534,I_kwDOCGYnMM4_kYWe,345,`table.strict` introspection boolean for identifying STRICT mode tables,9599,simonw,closed,0,,,,,2,2021-11-29T21:05:10Z,2021-11-29T22:45:26Z,2021-11-29T22:44:36Z,OWNER,,"> From the STRICT docs: >> The SQLite parser accepts a comma-separated list of table options after the final close parenthesis in a CREATE TABLE statement. As of this writing (2021-08-23) only two options are recognized: >> >> - STRICT >> - [WITHOUT ROWID](https://www.sqlite.org/withoutrowid.html) > > So I think I need to read the `CREATE TABLE` statement from the `sqlite_master` table, split on the last `)`, split those tokens on `,` and see if `strict` is in there (case insensitive).
_Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/344#issuecomment-982020757_",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/345/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1066474200,I_kwDOCGYnMM4_kRrY,344,Support STRICT tables,9599,simonw,closed,0,,,,,14,2021-11-29T20:32:23Z,2023-12-08T05:22:39Z,2023-12-08T05:22:39Z,OWNER,,"New in SQLite 3.37.0, released a few days ago: https://www.sqlite.org/stricttables.html",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/344/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1066288689,I_kwDOBm6k_c4_jkYx,1538,Research pattern for re-registering existing Click tools with register_commands,9599,simonw,closed,0,,,,,3,2021-11-29T17:09:47Z,2021-11-29T17:32:44Z,2021-11-29T17:27:16Z,OWNER,,"Building a Datasette plugin that imports an existing Click CLI tool and re-registers it is proving hard - Click doesn't really want you to do that. I tried this: ```python from datasette import hookimpl from git_history.cli import file as git_history_file @hookimpl def register_commands(cli): cli.command(name=""git-history"")(git_history_file.callback) ``` But when I run this: ``` % datasette git-history --help Usage: datasette git-history [OPTIONS] Analyze the history of a specific file and write it to SQLite Options: --help Show this message and exit. ``` The options are all missing - which means that the command doesn't actually work. Will need to research this pattern separately. _Originally posted by @simonw in https://github.com/simonw/git-history/issues/21#issuecomment-981835305_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1538/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1065432388,I_kwDOBm6k_c4_gTVE,1534,Maybe return JSON from HTML pages if `Accept: application/json` is sent,9599,simonw,closed,0,,,,,4,2021-11-28T20:48:09Z,2022-04-27T21:59:34Z,2022-02-02T23:39:33Z,OWNER,,"Relates to #1533 - and to the work I've been doing on the https://github.com/simonw/datasette-table Web Component. It would be useful to support users pasting in a URL to a Datasette table or query without first having to add the `.json` extension themselves - since then other systems could hit that URL with `Accept: application/json` to get back the JSON representation without first needing to read the `Link: ` header from #1533 to figure out what the URL to that JSON is. (There is weird logic deep in Datasette that says that you add `.json` to the path UNLESS the table name itself ends with `.json`, in which case you add `?_format=json` - this is super-confusing). 
[Update: I removed that confusing feature here: [https://simonwillison.net/2022/Mar/19/weeknotes/](https://simonwillison.net/2022/Mar/19/weeknotes/)]",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1534/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1065431383,I_kwDOBm6k_c4_gTFX,1533,"Add `Link: rel=""alternate""` header pointing to JSON for a table/query",9599,simonw,closed,0,,,3268330,Datasette 1.0,4,2021-11-28T20:43:25Z,2022-02-02T07:56:51Z,2022-02-02T07:49:33Z,OWNER,,"Originally explored in https://github.com/simonw/datasette-notebook/issues/2#issuecomment-980789406 - I wanted an efficient way to scan a list of URLs and figure out which if any of those corresponded to Datasette tables, canned queries or SQL output that could be represented as a table on a page. It looks like a neat way to do that is with ` Link:` header like this: `Link: http://127.0.0.1:8058/fixtures/compound_three_primary_keys.json; rel=""alternate""; type=""application/datasette+json""` I can put a ` ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1525/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1059219106,I_kwDOBm6k_c4_Imai,1524,"Improve Apache proxy documentation, link to demo",9599,simonw,closed,0,,,,,4,2021-11-20T20:03:14Z,2021-11-20T23:34:03Z,2021-11-20T23:34:03Z,OWNER,,"> The latest demo is now live at https://datasette-apache-proxy-demo.fly.dev/prefix/fixtures/sortable?_facet=pk2 _Originally posted by @simonw in https://github.com/simonw/datasette/issues/1519#issuecomment-974697824_ I'm going to put out 0.59.3 bugfix release with this, but I'd like to first improve the documentation on https://docs.datasette.io/en/stable/deploying.html#apache-proxy-configuration to highlight the new demo.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1524/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1059209412,I_kwDOBm6k_c4_IkDE,1523,Come up with a more elegant solution for base_url than ds.urls.path(),9599,simonw,open,0,,,,,0,2021-11-20T19:05:22Z,2021-11-20T19:05:22Z,,OWNER,,"While fixing #1519 I added a lot of ugly code that looks like this: https://github.com/simonw/datasette/blob/08947fa76433d18988aa1ee1d929bd8320c75fe2/datasette/facets.py#L228-L230 See these two commits in particular: fe687fd0207c4c56c4778d3e92e3505fc4b18172 and 08947fa76433d18988aa1ee1d929bd8320c75fe2 It would be great to come up with a less verbose and error-prone way of handling this problem.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1523/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1058896236,I_kwDOBm6k_c4_HXls,1522,Deploy a live instance of demos/apache-proxy,9599,simonw,closed,0,,,,,34,2021-11-19T20:32:55Z,2021-11-23T03:00:34Z,2021-11-20T18:51:56Z,OWNER,,"> I'll get this working on my laptop first, but then I want to get it up and running on Cloud Run - maybe with a GitHub Actions workflow in this repo that re-deploys it on manual execution. 
_Originally posted by @simonw in https://github.com/simonw/datasette/issues/1521#issuecomment-974322178_ I started by following https://ahmet.im/blog/cloud-run-multiple-processes-easy-way/ - see example in https://github.com/ahmetb/multi-process-container-lazy-solution",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1522/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1058815557,I_kwDOBm6k_c4_HD5F,1521,Docker configuration for exercising Datasette behind Apache mod_proxy,9599,simonw,closed,0,,,,,10,2021-11-19T18:46:18Z,2021-11-19T20:32:29Z,2021-11-19T20:32:29Z,OWNER,,"> Having a live demo running on Cloud Run that proxies through Apache and uses `base_url` would be incredibly useful for replicating and debugging this kind of thing. I wonder how hard it is to run Apache and `mod_proxy` in the same Docker container as Datasette? _Originally posted by @simonw in https://github.com/simonw/datasette/issues/1519#issuecomment-974310208_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1521/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1058803238,I_kwDOBm6k_c4_HA4m,1520,Pattern for avoiding accidental URL over-rides,9599,simonw,open,0,,,,,1,2021-11-19T18:28:05Z,2021-11-19T18:29:26Z,,OWNER,,"Following #1517 I'm experimenting with a plugin that does this: ```python @hookimpl def register_routes(): return [ (r""/(?P<db_name>[^/]+)/(?P<table_and_format>[^/]+?)$"", Table().view), ] ``` This is supposed to replace the default table page with new code... but there's a problem: `/-/versions` on that instance now returns 404 `Database '-' does not exist`! Need to figure out a pattern to avoid that happening. Plugins get to add their routes before Datasette's default routes, which is why this is happening here.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1520/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1058790545,I_kwDOBm6k_c4_G9yR,1519,base_url is omitted in JSON and CSV views,157158,phubbard,closed,0,,,,,22,2021-11-19T18:10:45Z,2021-12-01T17:50:09Z,2021-11-20T19:11:21Z,NONE,,"I have a datasette deployment, using Apache2 to reverse proxy: ProxyPass /ged http://thor.phfactor.net:8001 ProxyPreserveHost On In settings.json I have ```json { ""base_url"": ""/ged/"", ""trace_debug"": 1, ""template_debug"": 1 } ``` and datasette works correctly. 
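(A quick sanity check of that, sketched with httpx - pages under the prefix resolve through the proxy:)

```python
import httpx

# Sketch: a page under the /ged/ prefix loads fine via the Apache proxy
print(httpx.get(""http://thor.phfactor.net/ged/"").status_code)  # expect 200 while healthy
```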
However, if you view a query and then click on the 'This data as json, CSV' both links omit the base_url prefix and are therefore 404.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1519/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1058196641,I_kwDOCGYnMM4_Esyh,342,Extra options to `lookup()` which get passed to `insert()`,9599,simonw,closed,0,,,,,7,2021-11-19T06:53:03Z,2021-11-19T07:26:54Z,2021-11-19T07:26:54Z,OWNER,,"For https://github.com/simonw/git-history/issues/12 I found myself wanting to pass extra options to `lookup()` to set the column order, primary key etc.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/342/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1058072543,I_kwDOBm6k_c4_EOff,1518,Complete refactor of TableView and table.html template,9599,simonw,open,0,,,3268330,Datasette 1.0,45,2021-11-19T02:55:16Z,2022-03-15T18:35:49Z,,OWNER,,"Split from #878. The current `TableView` class is by far the most complex part of Datasette, and the most difficult to work on: https://github.com/simonw/datasette/blob/0.59.2/datasette/views/table.py In #878 I started exploring a new pattern for building views. In doing so it became clear that `TableView` is the first beast that I need to slay - if I can refactor that into something neat the pattern for building other views will emerge as a natural consequence. I've been trying to build this as a `register_routes()` plugin, as originally suggested in #870 - though unfortunately it looks like those plugins can't replace existing Datasette default views at the moment, see #1517. [UPDATE: I was wrong about this, plugins can over-ride default views just fine] I also know that I want to have a fully documented template context for `table.html` as a major step on the way to Datasette 1.0, see #1510. All of this adds up to the `TableView` refactor being a major project that will unblock a whole flurry of other things - so I'm going to work on that in this separate issue.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1518/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1057996111,I_kwDOBm6k_c4_D71P,1517,Let `register_routes()` over-ride default routes within Datasette,9599,simonw,closed,0,,,3268330,Datasette 1.0,2,2021-11-19T00:22:15Z,2021-11-19T03:20:00Z,2021-11-19T03:07:27Z,OWNER,,"See https://github.com/simonw/datasette/issues/878#issuecomment-973554024 - right now `register_routes()` can't replace default Datasette routes. 
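For illustration, a plugin shaped roughly like this (a sketch - the view function is a hypothetical stand-in) currently loses to the built-in table route:

```python
from datasette import hookimpl

async def my_table_view(request):
    # Hypothetical replacement for the default /db/table page
    ...

@hookimpl
def register_routes():
    # The built-in route for /db/table wins over this one today
    return [
        (r""/(?P<database>[^/]+)/(?P<table>[^/]+?)$"", my_table_view),
    ]
```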
It would be neat if plugins could do this - especially if there was a neat documented way for them to then re-dispatch to the original route code after making some kind of modification.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1517/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1056746091,I_kwDOBm6k_c4-_Kpr,1515,Handle foreign keys that point to a non-existent table,9599,simonw,open,0,,,,,0,2021-11-17T23:40:13Z,2021-11-18T01:31:56Z,,OWNER,,"Spotted in https://github.com/simonw/datasette-graphql/issues/79 Demo: https://datasette-graphql-demo.datasette.io/fixtures/bad_foreign_key The foreign key links to a 404 page. ![B87009C7-CFCA-4DF9-8FBA-FA3E6CA28EC2](https://user-images.githubusercontent.com/9599/142334788-4d1a4acd-bc87-4426-b333-d46b221afcec.jpeg) ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1515/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1055469073,I_kwDOBm6k_c4-6S4R,1513,Research: CTEs and union all to calculate facets AND query at the same time,9599,simonw,closed,0,,,,,12,2021-11-16T22:26:45Z,2021-11-16T23:41:46Z,2021-11-16T23:41:46Z,OWNER,,"Consider this page: https://global-power-plants.datasettes.com/global-power-plants/global-power-plants?_search=plant&_facet=owner&_facet=country_long&_facet=primary_fuel Datasette needs to run the main query for the rows on that page, a count query for the total count, then a separate query for each of those three specified facets. This is a `_search=` query, so it needs to execute the FTS code once for the rows, again for the count, and then three more times for each of the facets. Could running that query as a CTE and doing the other queries as part of the same large query produce significant speed improvements?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1513/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1054246919,I_kwDOBm6k_c4-1ogH,1511,Review plugin hooks for Datasette 1.0,9599,simonw,open,0,,,3268330,Datasette 1.0,1,2021-11-15T23:26:05Z,2021-11-16T01:20:14Z,,OWNER,,I need to perform a detailed review of the plugin interface - especially the plugin hooks like [register_facet_classes()](https://docs.datasette.io/en/stable/plugin_hooks.html#register-facet-classes) which I don't yet have complete confidence in.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1511/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1054244712,I_kwDOBm6k_c4-1n9o,1510,Datasette 1.0 documented template context (maybe via API docs),9599,simonw,open,0,,,3268330,Datasette 1.0,3,2021-11-15T23:23:58Z,2023-06-28T02:05:21Z,,OWNER,,Documented context plus protective unit tests. 
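A sketch of what ""protective"" could mean here - pin the documented key set so a removal fails loudly (the key names below are purely illustrative, not the real documented context):

```python
# Illustrative only: the real key set would come from the documented context
DOCUMENTED_TABLE_KEYS = {""database"", ""table"", ""columns"", ""display_rows""}

def assert_documented_context(context):
    missing = DOCUMENTED_TABLE_KEYS - context.keys()
    assert not missing, f""documented template context keys removed: {missing}""

assert_documented_context(
    {""database"": ""fixtures"", ""table"": ""facetable"", ""columns"": [], ""display_rows"": []}
)
```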
Goal is that custom templates built for 1.x will not break without a 2.x release.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1510/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1054243511,I_kwDOBm6k_c4-1nq3,1509,Datasette 1.0 JSON API (and documentation),9599,simonw,open,0,,,3268330,Datasette 1.0,3,2021-11-15T23:22:45Z,2022-03-15T20:38:56Z,,OWNER,,"The new JSON API in a stable, documented form.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1509/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1053136495,I_kwDOCGYnMM4-xZZv,341,`hash_id: Optional[Any]` should be `hash_id: Optional[str]`,9599,simonw,closed,0,,,,,0,2021-11-15T02:12:39Z,2021-11-15T02:19:31Z,2021-11-15T02:19:31Z,OWNER,,"In a few places: https://github.com/simonw/sqlite-utils/blob/54a2269e91ce72b059618662ed133a85f3d42e4a/sqlite_utils/db.py#L642 https://github.com/simonw/sqlite-utils/blob/54a2269e91ce72b059618662ed133a85f3d42e4a/sqlite_utils/db.py#L751 https://github.com/simonw/sqlite-utils/blob/54a2269e91ce72b059618662ed133a85f3d42e4a/sqlite_utils/db.py#L1049 https://github.com/simonw/sqlite-utils/blob/54a2269e91ce72b059618662ed133a85f3d42e4a/sqlite_utils/db.py#L1230 But it's correct here: https://github.com/simonw/sqlite-utils/blob/54a2269e91ce72b059618662ed133a85f3d42e4a/sqlite_utils/db.py#L2470",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/341/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1053122092,I_kwDOCGYnMM4-xV4s,339,`table.lookup()` option to populate additional columns when creating a record,9599,simonw,closed,0,,,,,4,2021-11-15T01:41:17Z,2021-11-15T02:02:34Z,2021-11-15T02:02:00Z,OWNER,,"> For the commits table I feel like I want a version of `table.lookup()` that can be passed additional columns to populate only if the record does not exist yet. 
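The call shape being asked for might look something like this sketch (the second argument is the proposed addition, not an existing API):

```python
import sqlite_utils

db = sqlite_utils.Database("":memory:"")

# Proposed: values in the second dict would only be written when lookup()
# has to create the row, not when it finds an existing one
commit_id = db[""commits""].lookup(
    {""hash"": ""dd33e1b""},
    {""commit_at"": ""2021-11-15T01:41:17Z""},
)
```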
_Originally posted by @simonw in https://github.com/simonw/git-history/issues/12#issuecomment-967455017_",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/339/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1053087862,I_kwDOCGYnMM4-xNh2,338,"dict, list, tuple should all map to TEXT",9599,simonw,closed,0,,,,,0,2021-11-15T00:28:01Z,2021-11-15T00:36:03Z,2021-11-15T00:36:03Z,OWNER,,"> This relates to the fact that dictionaries, lists and tuples get special treatment and are converted to JSON strings, using this code: https://github.com/simonw/sqlite-utils/blob/e8d958109ee290cfa1b44ef7a39629bb50ab673e/sqlite_utils/db.py#L2937-L2947 > > So the `COLUMN_TYPE_MAPPING` should include those too - right now it looks like this: https://github.com/simonw/sqlite-utils/blob/e8d958109ee290cfa1b44ef7a39629bb50ab673e/sqlite_utils/db.py#L165-L188 _Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/322#issuecomment-968401459_",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/338/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1052851176,I_kwDOBm6k_c4-wTvo,1507,ReadTheDocs build failed for 0.59.2 release,9599,simonw,closed,0,,,,,6,2021-11-14T05:24:34Z,2021-11-14T05:41:55Z,2021-11-14T05:41:55Z,OWNER,,"I had to cancel the 0.59.2 release because ReadTheDocs was failing to build the documentation. https://readthedocs.org/projects/datasette/builds/15268454/ ``` /home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/0.59.2/bin/python -m sphinx -T -b html -d _build/doctrees -D language=en . _build/html Running Sphinx v1.8.5 loading translations [en]... done making output directory... building [mo]: targets for 0 po files that are out of date building [html]: targets for 27 source files that are out of date updating environment: 27 added, 0 changed, 0 removed reading sources... 
[ 3%] authentication Traceback (most recent call last): File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/0.59.2/lib/python2.7/site-packages/sphinx/cmd/build.py"", line 304, in build_main app.build(args.force_all, filenames) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/0.59.2/lib/python2.7/site-packages/sphinx/application.py"", line 341, in build self.builder.build_update() File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/0.59.2/lib/python2.7/site-packages/sphinx/builders/__init__.py"", line 347, in build_update len(to_build)) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/0.59.2/lib/python2.7/site-packages/sphinx/builders/__init__.py"", line 360, in build updated_docnames = set(self.read()) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/0.59.2/lib/python2.7/site-packages/sphinx/builders/__init__.py"", line 468, in read self._read_serial(docnames) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/0.59.2/lib/python2.7/site-packages/sphinx/builders/__init__.py"", line 490, in _read_serial self.read_doc(docname) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/0.59.2/lib/python2.7/site-packages/sphinx/builders/__init__.py"", line 534, in read_doc doctree = read_doc(self.app, self.env, self.env.doc2path(docname)) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/0.59.2/lib/python2.7/site-packages/sphinx/io.py"", line 318, in read_doc pub.publish() File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/0.59.2/lib/python2.7/site-packages/docutils/core.py"", line 219, in publish self.apply_transforms() File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/0.59.2/lib/python2.7/site-packages/docutils/core.py"", line 200, in apply_transforms self.document.transformer.apply_transforms() File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/0.59.2/lib/python2.7/site-packages/sphinx/transforms/__init__.py"", line 90, in apply_transforms Transformer.apply_transforms(self) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/0.59.2/lib/python2.7/site-packages/docutils/transforms/__init__.py"", line 171, in apply_transforms transform.apply(**kwargs) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/0.59.2/lib/python2.7/site-packages/sphinx/transforms/__init__.py"", line 245, in apply apply_source_workaround(n) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/0.59.2/lib/python2.7/site-packages/sphinx/util/nodes.py"", line 94, in apply_source_workaround for classifier in reversed(node.parent.traverse(nodes.classifier)): TypeError: argument to reversed() must be a sequence Exception occurred: File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/0.59.2/lib/python2.7/site-packages/sphinx/util/nodes.py"", line 94, in apply_source_workaround for classifier in reversed(node.parent.traverse(nodes.classifier)): TypeError: argument to reversed() must be a sequence The full traceback has been saved in /tmp/sphinx-err-vkl0oE.log, if you want to report the issue to the developers. Please also report this if it was a user error, so that a better error message can be provided next time. A bug report can be filed in the tracker at . Thanks! 
```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1507/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1052826038,I_kwDOBm6k_c4-wNm2,1506,Columns beginning with an underscore do not facet correctly,9599,simonw,closed,0,,,,,1,2021-11-14T02:20:32Z,2021-11-14T04:45:21Z,2021-11-14T04:45:21Z,OWNER,,"Datasette treats columns that start with an underscore as querystring parameters it should ignore! Discovered in https://github.com/simonw/git-history/issues/14#issuecomment-968192464",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1506/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1052247023,I_kwDOBm6k_c4-uAPv,1505,Datasette should have an option to output CSV with semicolons,9599,simonw,open,0,,,,,1,2021-11-12T18:02:21Z,2021-11-16T11:40:52Z,,OWNER,,,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1505/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1051277222,I_kwDOBm6k_c4-qTem,1504,Link to ?_size=max at bottom of table page,9599,simonw,open,0,,,,,0,2021-11-11T19:06:33Z,2021-11-11T19:06:33Z,,OWNER,,"This can have text such as ""Show 1,000 rows per page"", based on the max size limit setting. Would make it easier for people to see more data at once without having to know how to hack the URL, similar to the `...` for facet sizes I added in #1337.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1504/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1050163432,I_kwDOBm6k_c4-mDjo,1503,`?_nocol=` removes that column from the filter interface,9599,simonw,closed,0,,,,,1,2021-11-10T18:22:50Z,2021-11-14T05:08:27Z,2021-11-14T04:53:07Z,OWNER,,"e.g. on https://latest.datasette.io/fixtures/sortable?_nocol=sortable This causes weird behaviour when you e.g. facet by a hidden column, since selecting facets and then re-submitting the form will clear the selected filter. 
![nocol-bug](https://user-images.githubusercontent.com/9599/141171135-aded71d1-a4cb-4b7f-a4ea-26828fa98906.gif) ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1503/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1049946823,I_kwDOBm6k_c4-lOrH,1502,"Full-text search: No support for unary ""-"" operator",516827,gustavorps,open,0,,,,,0,2021-11-10T15:11:19Z,2021-11-10T15:11:19Z,,NONE,,"Reference: https://www.sqlite.org/fts3.html#set_operations_using_the_standard_query_syntax Test: https://fara.datasettes.com/fara/FARA_All_ShortForms?_search=manafort+-freedman&_sort=rowid",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1502/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1044267332,I_kwDOCGYnMM4-PkFE,336,"sqlite-utils transform --column-order mangles columns of type ""timestamp""",536941,fgregg,closed,0,,,,,1,2021-11-04T01:15:38Z,2023-05-08T21:13:38Z,2023-05-08T21:13:38Z,CONTRIBUTOR,,"Reproducible code below: ```bash > echo 'create table bar (baz text, created_at timestamp default CURRENT_TIMESTAMP)' | sqlite3 foo.db > sqlite3 foo.db SQLite version 3.36.0 2021-06-18 18:36:39 Enter "".help"" for usage hints. sqlite> .schema bar CREATE TABLE bar (baz text, created_at timestamp default CURRENT_TIMESTAMP); sqlite> .exit > sqlite-utils transform foo.db bar --column-order baz sqlite3 foo.db SQLite version 3.36.0 2021-06-18 18:36:39 Enter "".help"" for usage hints. sqlite> .schema bar CREATE TABLE IF NOT EXISTS ""bar"" ( [baz] TEXT, [created_at] FLOAT DEFAULT 'CURRENT_TIMESTAMP' ); sqlite> .exit > sqlite-utils transform foo.db bar --column-order baz > sqlite3 foo.db SQLite version 3.36.0 2021-06-18 18:36:39 Enter "".help"" for usage hints. 
sqlite> .schema bar CREATE TABLE IF NOT EXISTS ""bar"" ( [baz] TEXT, [created_at] FLOAT DEFAULT '''CURRENT_TIMESTAMP''' ); ``` ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/336/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1042569687,I_kwDOCGYnMM4-JFnX,335,sqlite-utils index-foreign-keys fails due to pre-existing index,596279,zaneselvans,closed,0,,,,,11,2021-11-02T16:22:11Z,2021-11-14T22:55:56Z,2021-11-14T22:55:56Z,NONE,,"While running the command: ```sh sqlite-utils index-foreign-keys $SQLITE_DIR/pudl.sqlite ``` I got the following error: ``` Traceback (most recent call last): File ""/home/zane/miniconda3/envs/pudl-dev/bin/sqlite-utils"", line 8, in sys.exit(cli()) File ""/home/zane/miniconda3/envs/pudl-dev/lib/python3.9/site-packages/click/core.py"", line 829, in __call__ return self.main(*args, **kwargs) File ""/home/zane/miniconda3/envs/pudl-dev/lib/python3.9/site-packages/click/core.py"", line 782, in main rv = self.invoke(ctx) File ""/home/zane/miniconda3/envs/pudl-dev/lib/python3.9/site-packages/click/core.py"", line 1259, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/home/zane/miniconda3/envs/pudl-dev/lib/python3.9/site-packages/click/core.py"", line 1066, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/home/zane/miniconda3/envs/pudl-dev/lib/python3.9/site-packages/click/core.py"", line 610, in invoke return callback(*args, **kwargs) File ""/home/zane/miniconda3/envs/pudl-dev/lib/python3.9/site-packages/sqlite_utils/cli.py"", line 454, in index_foreign_keys db.index_foreign_keys() File ""/home/zane/miniconda3/envs/pudl-dev/lib/python3.9/site-packages/sqlite_utils/db.py"", line 902, in index_foreign_keys table.create_index([fk.column]) File ""/home/zane/miniconda3/envs/pudl-dev/lib/python3.9/site-packages/sqlite_utils/db.py"", line 1563, in create_index self.db.execute(sql) File ""/home/zane/miniconda3/envs/pudl-dev/lib/python3.9/site-packages/sqlite_utils/db.py"", line 421, in execute return self.conn.execute(sql) sqlite3.OperationalError: index idx_generators_eia860_report_date already exists ``` This DB was created with the foreign key constraint `PRAGMA` enabled and a bunch of column-level `CHECK` constraints. Is this an expected behavior? Should one not try to index foreign keys if FK constraints are already being enforced within the DB? I'm also noticing that the size of the DB after FK indexes have been added went from 483MB to 835MB, which seems like a much bigger jump than when I've done this previously. Software versions... * sqlite-utils 3.17.1 * sqlite 3.36.0 * SQLAlchemy 1.4.26 (used to create the DB)",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/335/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1041778507,I_kwDOCGYnMM4-GEdL,334,Filter by datetime objects using rows_where(),11642379,viseshrp,closed,0,,,,,0,2021-11-02T00:44:08Z,2021-11-13T19:23:21Z,2021-11-13T19:23:21Z,NONE,,"Firstly, thanks for this nice utility. It would be nice to have an example in the docs on how to filter by date range using `rows_where()`. 
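(For contrast with the `rows_where()` attempt below, a parameterized `db.query()` sketch - the table and column names here are illustrative:)

```python
import sqlite_utils

db = sqlite_utils.Database(""example.db"")  # illustrative path
rows = db.query(
    ""select * from entries where datetime(created) ""
    ""between datetime(:start) and datetime(:end)"",
    {
        ""start"": ""2021-10-31T17:29:59.277428-04:00"",
        ""end"": ""2021-11-01T03:44:04.544651+00:00"",
    },
)
for row in rows:
    print(row)
```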
This doesn't seem to work: ``` table.rows_where('datetime(created) between datetime(""2021-10-31T17:29:59.277428-04:00"") AND datetime(""2021-11-01T03:44:04.544651+00:00"")') ``` I could probably just use `db.query()`, which works for the above, but it would be nice if I could pass in `datetime` objects in `rows_where()`. Thanks.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/334/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1034535001,I_kwDOBm6k_c49qcBZ,1497,"Publish to Docker Hub failing with ""libcrypt.so.1: cannot open shared object file""",9599,simonw,closed,0,,,,,18,2021-10-24T22:57:07Z,2023-01-18T17:13:45Z,2021-10-24T23:36:55Z,OWNER,,"This means the Datasette 0.59.1 release has not been published to Docker Hub. Here's where that failed: https://github.com/simonw/datasette/runs/3991043374?check_suite_focus=true ``` Preparing to unpack .../libc6_2.32-4_amd64.deb ... debconf: unable to initialize frontend: Dialog debconf: (TERM is not set, so the dialog frontend is not usable.) debconf: falling back to frontend: Readline debconf: unable to initialize frontend: Readline debconf: (Can't locate Term/ReadLine.pm in @INC (you may need to install the Term::ReadLine module) (@INC contains: /etc/perl /usr/local/lib/x86_64-linux-gnu/perl/5.28.1 /usr/local/share/perl/5.28.1 /usr/lib/x86_64-linux-gnu/perl5/5.28 /usr/share/perl5 /usr/lib/x86_64-linux-gnu/perl/5.28 /usr/share/perl/5.28 /usr/local/lib/site_perl /usr/lib/x86_64-linux-gnu/perl-base) at /usr/share/perl5/Debconf/FrontEnd/Readline.pm line 7.) debconf: falling back to frontend: Teletype Checking for services that may need to be restarted... Checking init scripts... Unpacking libc6:amd64 (2.32-4) over (2.28-10) ... Setting up libc6:amd64 (2.32-4) ... /usr/bin/perl: error while loading shared libraries: libcrypt.so.1: cannot open shared object file: No such file or directory dpkg: error processing package libc6:amd64 (--configure): installed libc6:amd64 package post-installation script subprocess returned error exit status 127 Errors were encountered while processing: libc6:amd64 E: Sub-process /usr/bin/dpkg returned an error code (1) The command '/bin/sh -c apt-get update && apt-get -y --no-install-recommends install software-properties-common && add-apt-repository ""deb http://httpredir.debian.org/debian sid main"" && apt-get update && apt-get -t sid install -y --no-install-recommends libsqlite3-mod-spatialite && apt-get remove -y software-properties-common && apt clean && rm -rf /var/lib/apt && rm -rf /var/lib/dpkg/info/*' returned a non-zero code: 100 ``` Same problem when I attempted to publish using the ""Push specific Docker tag"" workflow: https://github.com/simonw/datasette/runs/3991059912?check_suite_focus=true",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1497/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1033864602,I_kwDOBm6k_c49n4Wa,1496,Named parameters docs should include an example of a cast,9599,simonw,closed,0,,,,,1,2021-10-22T18:56:04Z,2021-10-22T19:38:23Z,2021-10-22T19:34:27Z,OWNER,,"https://docs.datasette.io/en/stable/sql_queries.html#named-parameters It's not obvious that the values from parameters are always SQLite strings, which means that you can't do e.g. 
integer comparisons on them without casting them first. The documentation here should include an example of this.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1496/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1028115674,I_kwDOBm6k_c49R8za,1493,`--get '/:memory:.json?sql=select+3*5'` error with datasette 0.59,1580956,chenrui333,closed,0,,,,,1,2021-10-16T18:22:22Z,2021-10-19T04:39:11Z,2021-10-19T04:39:11Z,NONE,,"👋 Trying to upgrade the formula to use the latest release, but it runs into a regression test issue with the `--get` command. My quick question: is `datasette --get '/:memory:.json?sql=select+3*5'` supposed to return 15? Thanks! relates to https://github.com/Homebrew/homebrew-core/pull/87369",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1493/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1028056713,I_kwDOCGYnMM49RuaJ,332,`sqlite-utils memory --flatten` option to flatten nested JSON,22523840,rdtq,closed,0,,,,,1,2021-10-16T14:04:42Z,2021-11-14T23:05:05Z,2021-11-14T23:05:05Z,NONE,,"Currently the --flatten option works only for the `insert` command; it would be cool if it worked for `memory` as well, to query nested JSON.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/332/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1026794056,I_kwDOCGYnMM49M6JI,331,Mypy error: found module but no type hints or library stubs,53032010,andreaslongo,closed,0,,,,,2,2021-10-14T20:29:50Z,2021-11-14T23:21:08Z,2021-11-14T23:21:08Z,NONE,,"``` Python 3.9.5 mypy 0.910 sqlite-utils 3.17.1 ``` While using sqlite-utils as a library, when I use mypy for static type checking, it throws an error: ``` mypy . src/etl.py:5: error: Skipping analyzing ""sqlite_utils"": found module but no type hints or library stubs import sqlite_utils ^ src/etl.py:5: note: See https://mypy.readthedocs.io/en/stable/running_mypy.html#missing-imports test/test_etl.py:4: error: Skipping analyzing ""sqlite_utils"": found module but no type hints or library stubs import sqlite_utils ^ Found 2 errors in 2 files (checked 7 source files) ``` When I add a `py.typed` file to the sqlite-utils package to mark it as PEP 561 compatible, the error goes away. ``` al@nbal ..b/python3.9/site-packages/sqlite_utils (git)-[main] % la total 200 drwx------ 3 al al 4096 Oct 14 22:00 . drwx------ 117 al al 4096 Oct 12 21:12 .. -rw------- 1 al al 64409 Oct 12 21:11 cli.py -rw------- 1 al al 109092 Oct 12 21:11 db.py -rw------- 1 al al 0 Oct 14 22:00 py.typed -rw------- 1 al al 684 Oct 12 21:11 recipes.py -rw------- 1 al al 7988 Oct 12 21:11 utils.py -rw------- 1 al al 113 Oct 12 21:11 __init__.py ``` I would like to suggest adding a `py.typed` file to the repository. 
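For a setuptools-based project that amounts to shipping the marker as package data - a sketch:

```python
# setup.py (sketch): include the py.typed marker so installed copies of
# the package are recognized as PEP 561 compatible by mypy
from setuptools import setup

setup(
    name=""sqlite-utils"",
    packages=[""sqlite_utils""],
    package_data={""sqlite_utils"": [""py.typed""]},
)
```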
See also the mypy docs on creating PEP 561 compatible packages: https://mypy.readthedocs.io/en/stable/installed_packages.html#creating-pep-561-compatible-packages ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/331/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1025754125,I_kwDOBm6k_c49I8QN,1488,Upgrade to httpx 0.20.0 (request() got an unexpected keyword argument 'allow_redirects'),9599,simonw,closed,0,,,,,5,2021-10-13T22:37:22Z,2021-10-14T18:03:45Z,2021-10-14T18:03:45Z,OWNER,,This is caused by a change made to `httpx` in https://github.com/encode/httpx/releases/tag/0.20.0,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1488/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1023243105,I_kwDOBm6k_c48_XNh,1486,pipx installation instructions for plugins don't reference pipx inject,41546558,RhetTbull,closed,0,,,,,0,2021-10-12T00:43:42Z,2021-10-13T21:09:11Z,2021-10-13T21:09:11Z,CONTRIBUTOR,,"The datasette [installation instructions](https://github.com/simonw/datasette/blob/main/docs/installation.rst) discuss how to install with pipx, how to upgrade with pipx, and how to upgrade plugins with pipx but do not mention how to install a plugin with pipx. You discussed this on your [blog](https://til.simonwillison.net/python/installing-upgrading-plugins-with-pipx) but looks like this didn't make it in when you updated the docs for pipx (#756). I'll submit a PR shortly to fix this.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1486/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1021849766,I_kwDOBm6k_c486DCm,1483,Running a search on page 2 of results should not preserve ?_next=,9599,simonw,closed,0,,,,,0,2021-10-10T01:18:12Z,2021-10-13T21:08:10Z,2021-10-13T21:08:10Z,OWNER,,Reported by @eigenfoo in https://github.com/simonw/datasette/issues/1470,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1483/reactions"", ""total_count"": 2, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 1, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1021550542,I_kwDOBm6k_c4845_O,1482,Support Python 3.10,9599,simonw,closed,0,,,,,2,2021-10-09T00:30:52Z,2021-10-24T22:21:40Z,2021-10-24T22:19:55Z,OWNER,,"I started work on this in #1481 where I found a Python 3.10 bug that needs a workaround in Janus, see: - https://github.com/aio-libs/janus/issues/358 This is a tracking issue for anything else that shows up. This is also needed for the Homebrew package to upgrade to 3.10: - https://github.com/Homebrew/homebrew-core/pull/86932",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1482/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1015646369,I_kwDOBm6k_c48iYih,1480,Exceeding Cloud Run memory limits when deploying a 4.8G database,110420,ghing,open,0,,,,,9,2021-10-04T21:20:24Z,2022-10-07T04:39:10Z,,CONTRIBUTOR,,"When I try to deploy a 4.8G SQLite database to Google Cloud Run, I get this error message: > Memory limit of 8192M exceeded with 8826M used. 
Consider increasing the memory limit, see https://cloud.google.com/run/docs/configuring/memory-limits Unfortunately, the maximum amount of memory that can be allocated to an instance is 8192M. Naively profiling the memory usage of running Datasette with this database locally on my MacBook shows the following memory usage (using Activity Monitor) when I just start up Datasette locally: - Real Memory Size: 70.6 MB - Virtual Memory Size: 4.51 GB - Shared Memory Size: 2.5 MB - Private Memory Size: 57.4 MB I'm trying to understand if there's a query or other operation that gets run during container deployment that causes memory use to be so large and if this can be avoided somehow. This is somewhat related to #1082, but on a different platform, so I decided to open a new issue.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1480/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1010112818,I_kwDOBm6k_c48NRky,1479,"Win32 ""used by another process"" error with datasette publish",76450761,kirajano,open,0,,,,,7,2021-09-28T19:12:00Z,2023-09-07T02:14:16Z,,NONE,,"I unfortunately was not successful in deploying to fly.io. Please see below the details of the three scenarios that I tried. I am also new to datasette. Failed to deploy. Attaching logs: 1. Tried with an app created via `flyctl apps create frosty-fog-8565` and then ran `datasette publish fly covid.db --app frosty-fog-8565` ``` Deploying frosty-fog-8565 ==> Validating app configuration --> Validating app configuration done Services TCP 80/443 ⇢ 8080 Error error connecting to docker: An unknown error occured. Traceback (most recent call last): File ""c:\users\grott\anaconda3\lib\runpy.py"", line 193, in _run_module_as_main ""__main__"", mod_spec) File ""c:\users\grott\anaconda3\lib\runpy.py"", line 85, in _run_code exec(code, run_globals) File ""C:\Users\grott\Anaconda3\Scripts\datasette.exe\__main__.py"", line 7, in File ""c:\users\grott\anaconda3\lib\site-packages\click\core.py"", line 829, in __call__ return self.main(*args, **kwargs) File ""c:\users\grott\anaconda3\lib\site-packages\click\core.py"", line 782, in main rv = self.invoke(ctx) File ""c:\users\grott\anaconda3\lib\site-packages\click\core.py"", line 1259, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""c:\users\grott\anaconda3\lib\site-packages\click\core.py"", line 1259, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""c:\users\grott\anaconda3\lib\site-packages\click\core.py"", line 1066, in invoke return ctx.invoke(self.callback, **ctx.params) File ""c:\users\grott\anaconda3\lib\site-packages\click\core.py"", line 610, in invoke return callback(*args, **kwargs) File ""c:\users\grott\anaconda3\lib\site-packages\datasette_publish_fly\__init__.py"", line 156, in fly ""--remote-only"", File ""c:\users\grott\anaconda3\lib\contextlib.py"", line 119, in __exit__ next(self.gen) File ""c:\users\grott\anaconda3\lib\site-packages\datasette\utils\__init__.py"", line 451, in temporary_docker_directory tmp.cleanup() File ""c:\users\grott\anaconda3\lib\tempfile.py"", line 811, in cleanup _shutil.rmtree(self.name) File ""c:\users\grott\anaconda3\lib\shutil.py"", line 516, in rmtree return _rmtree_unsafe(path, onerror) File ""c:\users\grott\anaconda3\lib\shutil.py"", line 395, in _rmtree_unsafe _rmtree_unsafe(fullname, onerror) File ""c:\users\grott\anaconda3\lib\shutil.py"", line 404, in _rmtree_unsafe 
onerror(os.rmdir, path, sys.exc_info()) File ""c:\users\grott\anaconda3\lib\shutil.py"", line 402, in _rmtree_unsafe os.rmdir(path) PermissionError: [WinError 32] The process cannot access the file because it is being used by another process: 'C:\\Users\\grott\\AppData\\Local\\Temp\\tmpgcm8cz66\\frosty-fog-8565' ``` 2. Tried also with an app that gets autogenerated when running `flyctl launch`. This also generates the .toml file. Then ran `datasette publish fly covid.db --app dark-feather-168` **but different error now** ```Deploying dark-feather-168 ==> Validating app configuration Error not possible to validate configuration: server returned Post ""https://api.fly.io/graphql"": unexpected EOF Traceback (most recent call last): File ""c:\users\grott\anaconda3\lib\runpy.py"", line 193, in _run_module_as_main ""__main__"", mod_spec) exec(code, run_globals) File ""C:\Users\grott\Anaconda3\Scripts\datasette.exe\__main__.py"", line 7, in File ""c:\users\grott\anaconda3\lib\site-packages\click\core.py"", line 829, in __call__ return self.main(*args, **kwargs) File ""c:\users\grott\anaconda3\lib\site-packages\click\core.py"", line 782, in main rv = self.invoke(ctx) File ""c:\users\grott\anaconda3\lib\site-packages\click\core.py"", line 1259, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""c:\users\grott\anaconda3\lib\site-packages\click\core.py"", line 1259, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""c:\users\grott\anaconda3\lib\site-packages\click\core.py"", line 1066, in invoke return ctx.invoke(self.callback, **ctx.params) File ""c:\users\grott\anaconda3\lib\site-packages\click\core.py"", line 610, in invoke return callback(*args, **kwargs) File ""c:\users\grott\anaconda3\lib\site-packages\datasette_publish_fly\__init__.py"", line 156, in fly ""--remote-only"", File ""c:\users\grott\anaconda3\lib\contextlib.py"", line 119, in __exit__ next(self.gen) File ""c:\users\grott\anaconda3\lib\site-packages\datasette\utils\__init__.py"", line 451, in temporary_docker_directory tmp.cleanup() File ""c:\users\grott\anaconda3\lib\tempfile.py"", line 811, in cleanup _shutil.rmtree(self.name) File ""c:\users\grott\anaconda3\lib\shutil.py"", line 516, in rmtree return _rmtree_unsafe(path, onerror) File ""c:\users\grott\anaconda3\lib\shutil.py"", line 395, in _rmtree_unsafe _rmtree_unsafe(fullname, onerror) File ""c:\users\grott\anaconda3\lib\shutil.py"", line 404, in _rmtree_unsafe onerror(os.rmdir, path, sys.exc_info()) File ""c:\users\grott\anaconda3\lib\shutil.py"", line 402, in _rmtree_unsafe os.rmdir(path) PermissionError: [WinError 32] The process cannot access the file because it is being used by another process: 'C:\\Users\\grott\\AppData\\Local\\Temp\\tmpnoyewcre\\dark-feather-168' ``` These are also the contents of the generated **.toml file** in the second scenario: ``` # fly.toml file generated for dark-feather-168 on 2021-09-28T20:35:44+02:00 app = ""dark-feather-168"" kill_signal = ""SIGINT"" kill_timeout = 5 processes = [] [env] [experimental] allowed_public_ports = [] auto_rollback = true [[services]] http_checks = [] internal_port = 8080 processes = [""app""] protocol = ""tcp"" script_checks = [] [services.concurrency] hard_limit = 25 soft_limit = 20 type = ""connections"" [[services.ports]] handlers = [""http""] port = 80 [[services.ports]] handlers = [""tls"", ""http""] port = 443 [[services.tcp_checks]] grace_period = ""1s"" interval = ""15s"" restart_limit = 6 timeout = ""2s"" ``` 3. 
But also trying `datasette package covid.db` to create a local DOCKERFILE to later try to push it via `flyctl deploy` fails as well. ```[+] Building 147.3s (11/11) FINISHED => [internal] load build definition from Dockerfile 0.2s => => transferring dockerfile: 396B 0.0s => [internal] load .dockerignore 0.1s => => transferring context: 2B 0.0s => [internal] load metadata for docker.io/library/python:3.8 4.7s => [auth] library/python:pull token for registry-1.docker.io 0.0s => [internal] load build context 0.1s => => transferring context: 82.37kB 0.0s => [1/5] FROM docker.io/library/python:3.8@sha256:530de807b46a11734e2587a784573c12c5034f2f14025f838589e6c0e3 108.3s => => resolve docker.io/library/python:3.8@sha256:530de807b46a11734e2587a784573c12c5034f2f14025f838589e6c0e3b5 0.0s => => sha256:56182bcdf4d4283aa1f46944b4ef7ac881e28b4d5526720a4e9ba03a4730846a 2.22kB / 2.22kB 0.0s => => sha256:955615a668ce169f8a1443fc6b6e6215f43fe0babfb4790712a2d3171f34d366 54.93MB / 54.93MB 21.6s => => sha256:911ea9f2bd51e53a455297e0631e18a72a86d7e2c8e1807176e80f991bde5d64 10.87MB / 10.87MB 15.5s => => sha256:530de807b46a11734e2587a784573c12c5034f2f14025f838589e6c0e3b5c5b6 1.86kB / 1.86kB 0.0s => => sha256:ff08f08727e50193dcf499afc30594c47e70cc96f6fcfd1a01240524624264d0 8.65kB / 8.65kB 0.0s => => sha256:2756ef5f69a5190f4308619e0f446d95f5515eef4a814dbad0bcebbbbc7b25a8 5.15MB / 5.15MB 6.4s => => sha256:27b0a22ee906271a6ce9ddd1754fdd7d3b59078e0b57b6cc054c7ed7ac301587 54.57MB / 54.57MB 37.7s => => sha256:8584d51a9262f9a3a436dea09ba40fa50f85802018f9bd299eee1bf538481077 196.45MB / 196.45MB 82.3s => => sha256:524774b7d3638702fe9ae0ea3fcfb81b027dfd75cc2fc14f0119e764b9543d58 6.29MB / 6.29MB 26.6s => => extracting sha256:955615a668ce169f8a1443fc6b6e6215f43fe0babfb4790712a2d3171f34d366 5.4s => => sha256:9460f6b75036e38367e2f27bb15e85777c5d6cd52ad168741c9566186415aa26 16.81MB / 16.81MB 40.5s => => extracting sha256:2756ef5f69a5190f4308619e0f446d95f5515eef4a814dbad0bcebbbbc7b25a8 0.6s => => extracting sha256:911ea9f2bd51e53a455297e0631e18a72a86d7e2c8e1807176e80f991bde5d64 0.6s => => sha256:9bc548096c181514aa1253966a330134d939496027f92f57ab376cd236eb280b 232B / 232B 40.1s => => extracting sha256:27b0a22ee906271a6ce9ddd1754fdd7d3b59078e0b57b6cc054c7ed7ac301587 5.8s => => sha256:1d87379b86b89fd3b8bb1621128f00c8f962756e6aaaed264ec38db733273543 2.35MB / 2.35MB 41.8s => => extracting sha256:8584d51a9262f9a3a436dea09ba40fa50f85802018f9bd299eee1bf538481077 18.8s => => extracting sha256:524774b7d3638702fe9ae0ea3fcfb81b027dfd75cc2fc14f0119e764b9543d58 1.2s => => extracting sha256:9460f6b75036e38367e2f27bb15e85777c5d6cd52ad168741c9566186415aa26 2.9s => => extracting sha256:9bc548096c181514aa1253966a330134d939496027f92f57ab376cd236eb280b 0.0s => => extracting sha256:1d87379b86b89fd3b8bb1621128f00c8f962756e6aaaed264ec38db733273543 0.8s => [2/5] COPY . 
/app 2.3s => [3/5] WORKDIR /app 0.2s => [4/5] RUN pip install -U datasette 26.9s => [5/5] RUN datasette inspect covid.db --inspect-file inspect-data.json 3.1s => exporting to image 1.2s => => exporting layers 1.2s => => writing image sha256:b5db0c205cd3454c21fbb00ecf6043f261540bcf91c2dfc36d418f1a23a75d7a 0.0s Use 'docker scan' to run Snyk tests against images to find vulnerabilities and learn how to fix them Traceback (most recent call last): ""__main__"", mod_spec) File ""c:\users\grott\anaconda3\lib\runpy.py"", line 85, in _run_code exec(code, run_globals) File ""C:\Users\grott\Anaconda3\Scripts\datasette.exe\__main__.py"", line 7, in File ""c:\users\grott\anaconda3\lib\site-packages\click\core.py"", line 829, in __call__ return self.main(*args, **kwargs) File ""c:\users\grott\anaconda3\lib\site-packages\click\core.py"", line 782, in main rv = self.invoke(ctx) File ""c:\users\grott\anaconda3\lib\site-packages\click\core.py"", line 1259, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""c:\users\grott\anaconda3\lib\site-packages\click\core.py"", line 1066, in invoke return ctx.invoke(self.callback, **ctx.params) File ""c:\users\grott\anaconda3\lib\site-packages\click\core.py"", line 610, in invoke return callback(*args, **kwargs) File ""c:\users\grott\anaconda3\lib\site-packages\datasette\cli.py"", line 283, in package call(args) File ""c:\users\grott\anaconda3\lib\contextlib.py"", line 119, in __exit__ next(self.gen) File ""c:\users\grott\anaconda3\lib\site-packages\datasette\utils\__init__.py"", line 451, in temporary_docker_directory tmp.cleanup() File ""c:\users\grott\anaconda3\lib\tempfile.py"", line 811, in cleanup _shutil.rmtree(self.name) File ""c:\users\grott\anaconda3\lib\shutil.py"", line 516, in rmtree return _rmtree_unsafe(path, onerror) File ""c:\users\grott\anaconda3\lib\shutil.py"", line 395, in _rmtree_unsafe _rmtree_unsafe(fullname, onerror) File ""c:\users\grott\anaconda3\lib\shutil.py"", line 404, in _rmtree_unsafe onerror(os.rmdir, path, sys.exc_info()) File ""c:\users\grott\anaconda3\lib\shutil.py"", line 402, in _rmtree_unsafe os.rmdir(path) PermissionError: [WinError 32] The process cannot access the file because it is being used by another process: 'C:\\Users\\grott\\AppData\\Local\\Temp\\tmpkb27qid3\\datasette'```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1479/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1006781949,I_kwDOBm6k_c48AkX9,1478,Documentation Request: Feature alternative ID instead of default ID,192568,mroswell,open,0,,,,,0,2021-09-24T19:56:13Z,2021-09-25T16:18:54Z,,CONTRIBUTOR,,"My data already has an ID that comes from a federal agency. Would love to have documentation on how to modify the template to: - Remove the generated ID from the table - Link the federal ID to the detail page - and to ensure that the JSON file uses that as the ID. I'd be happy to include the database ID in the export, but not as a key. I don't want to remove the ID from the database, though, because my experience with the federal agency is that data often has anomalies. I don't want all hell to break loose if they end up applying the same ID to multiple rows (which they haven't done yet). I just don't want it to display in the table or the data exports. Perhaps this isn't a template issue, maybe more of a db manipulation... 
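(One hedged sketch of the db-manipulation route with sqlite-utils - the table and column names are hypothetical - would be to make the agency's ID the primary key:)

```python
import sqlite_utils

db = sqlite_utils.Database(""data.db"")  # illustrative path
# transform() rebuilds the table; with the federal ID as the primary key,
# row pages and exports would key off it instead of a generated id
db[""records""].transform(pk=""federal_id"")
```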
Margie",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1478/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1006016302,I_kwDOBm6k_c479pcu,1477,Consider adding request to the documented default template context,9599,simonw,open,0,,,,,0,2021-09-24T02:34:09Z,2021-09-24T02:34:09Z,,OWNER,,I made a plugin for this today but I think perhaps it should be a default thing instead: https://datasette.io/plugins/datasette-template-request,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1477/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1005891028,I_kwDOCGYnMM479K3U,329,Rethink approach to [ and ] in column names (currently throws error),9599,simonw,closed,0,,,,,12,2021-09-23T22:14:24Z,2021-11-15T02:57:51Z,2021-11-15T02:57:51Z,OWNER,,"> I think it's best to still keep `[` and `]` out of column names though. Transforming them into `(` and `)` seems reasonable - but should that happen here or in `sqlite-utils`? I think in `sqlite-utils`. _Originally posted by @simonw in https://github.com/simonw/datasette-app/issues/121#issuecomment-926200398_ This is a rethinking of the solution to: - https://github.com/simonw/sqlite-utils/issues/86",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/329/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1004613267,I_kwDOCGYnMM474S6T,328,Invalid JSON output when no rows,12752,gravis,closed,0,,,,,3,2021-09-22T18:37:26Z,2021-09-22T20:21:34Z,2021-09-22T20:20:18Z,NONE,,"`sqlite-utils query` generates a JSON output with the result from the query: ```json [{...},{...}] ``` If no rows are returned by the query, I'm expecting an empty JSON array: ```json [] ``` But actually I'm getting an empty string. To be consistent, the output should be `[]` when the request succeeds (return code == `0`).",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/328/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 999902754,I_kwDOBm6k_c47mU4i,1473,base logo link visits `undefined` rather than href url,192568,mroswell,open,0,,,,,2,2021-09-18T04:17:04Z,2021-09-19T00:45:32Z,,CONTRIBUTOR,,"I have two connected sites: http://www.SaferOrToxic.org (a Hugo website) and: http://disinfectants.SaferOrToxic.org/disinfectants/listN (a datasette table page) The latter is linked as ""The List"" in the former's menu. (I'd love a prettier URL, but that's what I've got.) On: http://disinfectants.SaferOrToxic.org/disinfectants/listN ... all the other menu links should point back to: https://www.SaferOrToxic.org And they do! But the logo, for some reason--though it has an href pointing to: https://www.SaferOrToxic.org Keeps going to this instead: https://disinfectants.saferortoxic.org/disinfectants/undefined What is causing that? How can I fix it? In #1284 back in March, I was doing battle with the index.html template, in a still unresolved issue. (I wanted only a single table page at the root.) 
But I thought, well, if I can't resolve that, at least I could just point the main website to the datasette page (""The List,"") and then have the List point back to the home website. The menu hrefs to https://www.SaferOrToxic.org work just fine, exactly as they should, from the datasette page. Even the Home link works properly. But the logo link keeps rewriting to: https://disinfectants.saferortoxic.org/disinfectants/undefined This is the HTML: ``` ``` Is this somehow related to cloudflare? Or something in the datasette code? I'm starting to think it's a cloudflare issue. Can I at least rule out it being a datasette issue? My repository is here: https://github.com/mroswell/list-N (BTW, I couldn't figure out how to reference a local image, either, on the datasette side, which is why I'm using the image from the www home page.) ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1473/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 995098231,MDU6SXNzdWU5OTUwOTgyMzE=,1470,?_sort=rowid with _next= returns error,19851673,eigenfoo,closed,0,,,,,4,2021-09-13T16:36:15Z,2021-10-18T19:30:15Z,2021-10-10T01:15:03Z,NONE,,"For example: - Go to https://cryptics.eigenfoo.xyz/clues/clues?_next=100 (this is the second page of results in a Datasette site) - Search anything using the FTS search bar. For example, searching for `hello` will take you to https://cryptics.eigenfoo.xyz/clues/clues?_search=hello&_sort=rowid&_next=100 - A `500 Error: list index out of range` is raised. This is because the search URL includes the `&_next=100` UTM parameter, carried over from where the FTS search was run. However, there isn't a second page in the search results, so a `list index out of range` error is raised. You can confirm that removing this UTM parameter from the URL returns the appropriate search results. The FTS search request should strip any `_next` UTM parameter. --- ```bash datasette, version 0.58.1 sqlite-utils, version 3.17 ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1470/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 994450961,MDU6SXNzdWU5OTQ0NTA5NjE=,1469,"Column cog shows ""facet by this"" when already default faceted",9599,simonw,closed,0,,,,,2,2021-09-13T04:51:26Z,2021-10-13T21:20:07Z,2021-10-13T21:20:07Z,OWNER,,"e.g. on https://covid-19.datasettes.com/covid/economist_excess_deaths But if you add `?_facet=country` to the URL that goes away: https://covid-19.datasettes.com/covid/economist_excess_deaths?_facet_size=5&_facet=country The logic that decides if the ""Facet by this"" item is shown does not take default `metadata.json` facets into account.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1469/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 994390593,MDU6SXNzdWU5OTQzOTA1OTM=,1468,Faceting for custom SQL queries,72577720,MichaelTiemannOSC,closed,0,,,,,2,2021-09-13T02:52:16Z,2021-09-13T04:54:22Z,2021-09-13T04:54:17Z,CONTRIBUTOR,,"Facets are awesome. But not when I need to join to tidy tables together. Or even just running explicitly the default SQL query that simply lists all the rows and columns of a table (up to SIZE). 
That is to say, when I browse a table, I see facets: https://latest.datasette.io/fixtures/compound_three_primary_keys But when I run a custom query, I don't: https://latest.datasette.io/fixtures?sql=select+pk1%2C+pk2%2C+pk3%2C+content+from+compound_three_primary_keys+order+by+pk1%2C+pk2%2C+pk3+limit+101 Is there an idiom to cause custom SQL to come back with facet suggestions?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1468/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 991467558,MDU6SXNzdWU5OTE0Njc1NTg=,1466,Add Datasette Desktop to installation documentation,9599,simonw,closed,0,,,7571612,Datasette 0.60,2,2021-09-08T19:41:27Z,2022-01-13T22:28:28Z,2022-01-13T21:55:18Z,OWNER,,See https://datasette.io/desktop and https://simonwillison.net/2021/Sep/8/datasette-desktop/,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1466/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 991191951,MDU6SXNzdWU5OTExOTE5NTE=,1464,clean checkout & clean environment has test failures,51016,ctb,open,0,,,,,6,2021-09-08T14:16:23Z,2021-09-13T22:17:17Z,,CONTRIBUTOR,,"I followed the instructions [here](https://docs.datasette.io/en/stable/contributing.html#setting-up-a-development-environment), and even after running `python update-docs-help.py` I get the following failed tests -- any thoughts? ``` FAILED tests/test_api.py::test_searchable[/fixtures/searchable.json?_search=te*+AND+do*&_searchmode=raw-expected_rows3] FAILED tests/test_api.py::test_searchmode[table_metadata1-_search=te*+AND+do*-expected_rows1] FAILED tests/test_api.py::test_searchmode[table_metadata2-_search=te*+AND+do*&_searchmode=raw-expected_rows2] ``` This is with python 3.9.7 and lots of other packages, as in attached environment listing from `conda list`. [conda-installed.txt](https://github.com/simonw/datasette/files/7129487/conda-installed.txt) ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1464/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 990844088,MDU6SXNzdWU5OTA4NDQwODg=,325,sqlite-utils memory can't deal with multiple files with the same name,144773,karlb,closed,0,,,,,4,2021-09-08T08:14:42Z,2021-09-22T20:52:56Z,2021-09-22T20:45:45Z,NONE,,"When I use multiple files with the same name, e.g. in `sqlite-utils memory a/bug.csv b/bug.csv`, sqlite-utils creates invalid views. 
``` Traceback (most recent call last): File ""/home/karl/.local/bin/sqlite-utils"", line 8, in sys.exit(cli()) File ""/home/karl/.local/pipx/venvs/sqlite-utils/lib/python3.9/site-packages/click/core.py"", line 1137, in __call__ return self.main(*args, **kwargs) File ""/home/karl/.local/pipx/venvs/sqlite-utils/lib/python3.9/site-packages/click/core.py"", line 1062, in main rv = self.invoke(ctx) File ""/home/karl/.local/pipx/venvs/sqlite-utils/lib/python3.9/site-packages/click/core.py"", line 1668, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/home/karl/.local/pipx/venvs/sqlite-utils/lib/python3.9/site-packages/click/core.py"", line 1404, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/home/karl/.local/pipx/venvs/sqlite-utils/lib/python3.9/site-packages/click/core.py"", line 763, in invoke return __callback(*args, **kwargs) File ""/home/karl/.local/pipx/venvs/sqlite-utils/lib/python3.9/site-packages/sqlite_utils/cli.py"", line 1299, in memory db[csv_table].transform(types=tracker.types) File ""/home/karl/.local/pipx/venvs/sqlite-utils/lib/python3.9/site-packages/sqlite_utils/db.py"", line 1287, in transform self.db.execute(sql) File ""/home/karl/.local/pipx/venvs/sqlite-utils/lib/python3.9/site-packages/sqlite_utils/db.py"", line 421, in execute return self.conn.execute(sql) sqlite3.OperationalError: error in view t1: no such table: main.bug ``` This can be reproduced with ```sh #!/bin/bash mkdir foo mkdir bar echo -e 'col1,col2\nval1,val2' > foo/bug.csv echo -e 'col3,col4\nval3,val4' > bar/bug.csv sqlite-utils memory */bug.csv 'SELECT 1' ``` Ideally, the tables would get unique names by including the next path segment until the names are unique. But just making the numbered t* aliases work would be good enough. This problem can of course be worked around by renaming the files, but it would be nice if this case was handled more gracefully. Thanks a lot for this great tool!",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/325/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 990367646,MDU6SXNzdWU5OTAzNjc2NDY=,1462,"Separate out ""debug"" options from ""root"" options",9599,simonw,open,0,,,,,1,2021-09-07T21:27:34Z,2021-09-07T21:34:33Z,,OWNER,,"> I ditched ""root"" for ""admin"" because root by default gives you a whole bunch of stuff which I think could be confusing: > > > > Maybe the real problem here is that I'm conflating ""root"" permissions with ""debug"" options. Perhaps there should be an extra Datasette mode that unlocks debug tools for the root user? 
_Originally posted by @simonw in https://github.com/simonw/datasette-app-support/issues/8#issuecomment-914638998_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1462/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 989986586,MDU6SXNzdWU5ODk5ODY1ODY=,1461,Try blacken-docs,9599,simonw,closed,0,,,,,3,2021-09-07T13:28:50Z,2021-09-07T16:13:59Z,2021-09-07T16:13:59Z,OWNER,,https://github.com/asottile/blacken-docs,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1461/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 989109888,MDU6SXNzdWU5ODkxMDk4ODg=,1460,Override column metadata with metadata from another column,72577720,MichaelTiemannOSC,open,0,,,,,0,2021-09-06T12:13:33Z,2021-09-06T12:13:33Z,,CONTRIBUTOR,,"I have a table from the PUDL project (https://github.com/catalyst-cooperative/pudl) that looks like this: ``` CREATE TABLE fuel_ferc1 ( id INTEGER NOT NULL, record_id TEXT, utility_id_ferc1 INTEGER, report_year INTEGER, plant_name_ferc1 TEXT, fuel_type_code_pudl VARCHAR(7), fuel_unit VARCHAR(7), fuel_qty_burned FLOAT, fuel_mmbtu_per_unit FLOAT, fuel_cost_per_unit_burned FLOAT, fuel_cost_per_unit_delivered FLOAT, fuel_cost_per_mmbtu FLOAT, PRIMARY KEY (id), FOREIGN KEY(plant_name_ferc1, utility_id_ferc1) REFERENCES plants_ferc1 (plant_name_ferc1, utility_id_ferc1), CONSTRAINT fuel_ferc1_fuel_type_code_pudl_enum CHECK (fuel_type_code_pudl IN ('coal', 'oil', 'gas', 'solar', 'wind', 'hydro', 'nuclear', 'waste', 'unknown')), CONSTRAINT fuel_ferc1_fuel_unit_enum CHECK (fuel_unit IN ('ton', 'mcf', 'bbl', 'gal', 'kgal', 'gramsU', 'kgU', 'klbs', 'btu', 'mmbtu', 'mwdth', 'mwhth', 'unknown')) ); ``` Note that `fuel_unit` is a unit that **pint** can understand, and that `fuel_qty_burned` is a column of data that could be expressed in terms of actual units, not merely as a dimensionless number. Ditto the `fuel_cost_per_unit_...` columns. Is there a way to give a column a default metadata unit (such as *tons* or *USD/ton*) and then let that be overridden when the metadata in another column says *barrels* or *USD/gramsU*? @catalyst-cooperative",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1460/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 988556488,MDU6SXNzdWU5ODg1NTY0ODg=,1459,suggestion: allow `datasette --open` to take a relative URL,51016,ctb,open,0,,,,,1,2021-09-05T17:17:07Z,2021-09-05T19:59:15Z,,CONTRIBUTOR,,"(soft suggestion because I'm not sure I'm using datasette right yet) Over at https://github.com/ctb/2021-sourmash-datasette, I'm playing around with datasette, and I'm creating some static pages to send people to the right facets. There may well be better ways of achieving this end goal, and I will find out if so, I'm sure! But regardless I think it might be neat to support an option to allow `-o/--open` to take a relative URL, that then gets appended to the hostname and port. This would let me improve my documentation. I don't see any downsides, either, but 🤷 there may well be some :) Happy to dig in and provide a PR if it's of interest. 
I'm not sure off the top of my head how to support an optional value to a parameter in argparse - the current `-o` behavior is kinda nice so it'd be suboptimal to require a url for `-o`. Maybe `--open-url=` or something would work? ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1459/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 988553806,MDU6SXNzdWU5ODg1NTM4MDY=,1457,suggestion: distinguish names in `--static` documentation,51016,ctb,closed,0,,,,,0,2021-09-05T17:04:27Z,2021-10-14T18:39:55Z,2021-10-14T18:39:55Z,CONTRIBUTOR,,"Over in https://docs.datasette.io/en/stable/custom_templates.html#serving-static-files, there is the slightly comical example command - ``` datasette -m metadata.json --static static:static/ --memory ``` (now, with MORE STATIC!) It took me a while to sort out all the URLs and paths involved because I wasn't being very clever. But in the interests of simplification and distinction, I might suggest something like ``` datasette -m metadata.json --static loc:static-files/ --memory ``` I will submit a PR for your consideration.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1457/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 988552851,MDU6SXNzdWU5ODg1NTI4NTE=,1456,conda install results in non-functioning `datasette serve` due to out-of-date asgiref,51016,ctb,open,0,,,,,0,2021-09-05T16:59:55Z,2021-09-05T16:59:55Z,,CONTRIBUTOR,,"Over in https://github.com/ctb/2021-sourmash-datasette, I discovered that the following commands fail: ``` conda create -n datasette4 -y datasette=0.58.1 conda activate datasette4 datasette gathertax.db ``` with `ImportError: cannot import name 'WebSocketScope' from 'asgiref.typing'`. This appears to be because asgiref 3.3.4 doesn't have WebSocketScope, but later versions do - a simple ``` pip install asgiref==3.4.1 ``` fixes the problem for me, at least to the point where I can run datasette and poke around as usual. I note that over in the conda-forge recipe, https://github.com/conda-forge/datasette-feedstock/blob/master/recipe/meta.yaml pins asgiref to < 3.4.0, but I'm not sure why - so I'm not sure how to best resolve this issue :).",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1456/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 986829194,MDU6SXNzdWU5ODY4MjkxOTQ=,14,xml.etree.ElementTree.Parse Error - mismatched tag,46968,step21,open,0,,,,,1,2021-09-02T14:46:36Z,2021-09-02T14:53:11Z,,NONE,,"This is an error message I get upon parsing the enex file of my Inbox. Please find the full error message below. Any hints welcome. 
``` Importing from ENEX [##################------------------] 50% 00:00:50 Traceback (most recent call last): File ""/Users/utopist/.virtualenvs/evernote-to-sqlite-Og2PIW3Y/bin/evernote-to-sqlite"", line 8, in sys.exit(cli()) File ""/Users/utopist/.virtualenvs/evernote-to-sqlite-Og2PIW3Y/lib/python3.9/site-packages/click/core.py"", line 1137, in __call__ return self.main(*args, **kwargs) File ""/Users/utopist/.virtualenvs/evernote-to-sqlite-Og2PIW3Y/lib/python3.9/site-packages/click/core.py"", line 1062, in main rv = self.invoke(ctx) File ""/Users/utopist/.virtualenvs/evernote-to-sqlite-Og2PIW3Y/lib/python3.9/site-packages/click/core.py"", line 1668, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/Users/utopist/.virtualenvs/evernote-to-sqlite-Og2PIW3Y/lib/python3.9/site-packages/click/core.py"", line 1404, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/Users/utopist/.virtualenvs/evernote-to-sqlite-Og2PIW3Y/lib/python3.9/site-packages/click/core.py"", line 763, in invoke return __callback(*args, **kwargs) File ""/Users/utopist/.virtualenvs/evernote-to-sqlite-Og2PIW3Y/lib/python3.9/site-packages/evernote_to_sqlite/cli.py"", line 30, in enex for tag, note in find_all_tags(fp, [""note""], progress_callback=bar.update): File ""/Users/utopist/.virtualenvs/evernote-to-sqlite-Og2PIW3Y/lib/python3.9/site-packages/evernote_to_sqlite/utils.py"", line 17, in find_all_tags for event, el in parser.read_events(): File ""/usr/local/Cellar/python@3.9/3.9.6/Frameworks/Python.framework/Versions/3.9/lib/python3.9/xml/etree/ElementTree.py"", line 1329, in read_events raise event File ""/usr/local/Cellar/python@3.9/3.9.6/Frameworks/Python.framework/Versions/3.9/lib/python3.9/xml/etree/ElementTree.py"", line 1301, in feed self._parser.feed(data) xml.etree.ElementTree.ParseError: mismatched tag: line 6837961, column 2 ``` ",303218369,evernote-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/evernote-to-sqlite/issues/14/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 984939366,MDU6SXNzdWU5ODQ5MzkzNjY=,58,"Error: Use either --since or --since_id, not both - still broken",42904,rubenv,closed,0,,,,,1,2021-09-01T09:45:28Z,2021-09-21T17:37:41Z,2021-09-21T17:37:41Z,CONTRIBUTOR,,"Hi Simon, It appears the fix for #57 doesn't fix things for me: ``` $ twitter-to-sqlite --version twitter-to-sqlite, version 0.21.4 $ python --version Python 3.9.6 ``` ``` $ twitter-to-sqlite home-timeline -a twitter-auth.json twitter/timeline.db --since Importing tweets Error: Use either --since or --since_id, not both ``` Is there any way I can help debug this?",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/58/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 983221851,MDU6SXNzdWU5ODMyMjE4NTE=,34,Data folder as index command parameter,1223625,humrochagf,open,0,,,,,0,2021-08-30T21:29:33Z,2021-08-30T21:29:33Z,,NONE,,"Hi, First of all, thank you for this wonderful project :smile: I started to use dogsheep to make my personal data searchable, and by using the project I noticed an issue with the index command. It always expects you are running it from the root folder from where the data is located, so I got some errors while trying to make it work on my setup. 
I separate all databases inside a `data` folder (I published my setup to be easier to follow: https://github.com/humrochagf/my-dogsheep) Before, I configured `dogsheep.yml` to add the data folder to its path like this: ```yml data/twitter.db: tweets: sql: |- ... ``` And running the index command like this: ``` dogsheep-beta index data/dogsheep.db dogsheep.yml ``` It worked to the normal search feature with no problem this way, but when I started adding `display_sql` rules the app started to crash, because at datasette `get_database` it was looking for `data/twitter` and it only had a db called `twitter` there. So my workaround to that was to cd into the data folder and run the indexer. You can check the way I'm doing it at this line of the makefile: https://github.com/humrochagf/my-dogsheep/blob/main/makefile#L3 It works but it would be nice to have an option to pass the path where the data is located to the index function.",197431109,dogsheep-beta,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-beta/issues/34/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 982803408,MDU6SXNzdWU5ODI4MDM0MDg=,1454,Feature Request: Publish to IPFS,1560788,blitmap,open,0,,,,,0,2021-08-30T13:36:18Z,2021-08-30T13:36:18Z,,NONE,,"Hello, I am a huge fan of this being used for exploring data. I think it has a lot of flexibility not found in other tools. I'm not sure if what I'm asking for is possible: Can this be extended to publish to IPFS? IPFS is an attractive hosting option for decentralized journalism. Food for thought ~",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1454/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 981681138,MDU6SXNzdWU5ODE2ODExMzg=,1450,"Datasette --help should show something more useful than ""Datasette!""",9599,simonw,closed,0,,,,,1,2021-08-28T00:44:51Z,2021-08-28T00:49:07Z,2021-08-28T00:49:07Z,OWNER,,"https://github.com/simonw/datasette/blob/a1a33bb5822214be1cebd98cd858b2058d91a4aa/datasette/cli.py#L122-L127 _Originally spotted by @simonw in https://github.com/simonw/datasette/issues/1449#issuecomment-907539668_ ``` ~ % datasette --help Usage: datasette [OPTIONS] COMMAND [ARGS]... Datasette! Options: --version Show the version and exit. --help Show this message and exit. ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1450/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 981676832,MDU6SXNzdWU5ODE2NzY4MzI=,1449,`register_commands()` plugin hook to register extra CLI commands,9599,simonw,closed,0,,,,,13,2021-08-28T00:26:21Z,2021-08-28T01:58:35Z,2021-08-28T01:43:11Z,OWNER,,"The `datasette` CLI tool currently has 7 subcommands: `serve`, `inspect`, `install`, `package`, `plugins`, `publish` and `uninstall`. A plugin hook could allow plugins to register extra subcommands. 
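A minimal sketch of what a plugin using such a hook might look like - the hook name and signature here are a proposal, assuming plugins are handed the root click group:

```python
from datasette import hookimpl
import click


@hookimpl
def register_commands(cli):
    # cli is the click Group behind the datasette CLI
    @cli.command()
    @click.argument('files', nargs=-1)
    def verify(files):
        'Verify that each file is a valid SQLite database'
        for path in files:
            click.echo('checking: ' + path)
```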
I've avoided this for quite a while because I didn't have good use-cases for it - but the existence of the `datasette install xxx` command for installing packages into the correct virtual environment means that actually there's a good reason to do this: it would allow plugins to provide additional command-line mechanisms without the user having to understand how virtual environments work in order to install those commands into the same environment as the rest of Datasette.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1449/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 979627285,MDU6SXNzdWU5Nzk2MjcyODU=,323,`table.convert()` method should clean up after itself,9599,simonw,closed,0,,,,,1,2021-08-25T21:15:39Z,2021-08-25T21:25:26Z,2021-08-25T21:25:18Z,OWNER,,"It currently works like this: https://github.com/simonw/sqlite-utils/blob/77c240df56068341561e95e4a412cbfa24dc5bc7/sqlite_utils/db.py#L2177-L2195 It's registering a function called `convert_value()` and then failing to de-register that function once it has finished. It might even be possible for two queries running against the same connection to clobber each other's `convert_value()` functions, leading to incorrect behaviour. So two fixes: firstly it should register the function with a unique name (maybe add a random suffix). Secondly, it should de-register that function once it has finished.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/323/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 978743426,MDU6SXNzdWU5Nzg3NDM0MjY=,13,xml.etree.ElementTree.ParseError: not well-formed (invalid token),9599,simonw,closed,0,,,,,4,2021-08-25T05:48:21Z,2021-08-26T18:45:13Z,2021-08-26T18:45:13Z,MEMBER,,"Got this error today: ``` (evernote-to-sqlite) /tmp % evernote-to-sqlite enex evernote.db simonwillison\'s\ notebook.enex Importing from ENEX [######------------------------------] 17% Traceback (most recent call last): File ""/Users/simon/.local/bin/evernote-to-sqlite"", line 8, in sys.exit(cli()) File ""/Users/simon/.local/pipx/venvs/evernote-to-sqlite/lib/python3.9/site-packages/click/core.py"", line 1137, in __call__ return self.main(*args, **kwargs) File ""/Users/simon/.local/pipx/venvs/evernote-to-sqlite/lib/python3.9/site-packages/click/core.py"", line 1062, in main rv = self.invoke(ctx) File ""/Users/simon/.local/pipx/venvs/evernote-to-sqlite/lib/python3.9/site-packages/click/core.py"", line 1668, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/Users/simon/.local/pipx/venvs/evernote-to-sqlite/lib/python3.9/site-packages/click/core.py"", line 1404, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/Users/simon/.local/pipx/venvs/evernote-to-sqlite/lib/python3.9/site-packages/click/core.py"", line 763, in invoke return __callback(*args, **kwargs) File ""/Users/simon/.local/pipx/venvs/evernote-to-sqlite/lib/python3.9/site-packages/evernote_to_sqlite/cli.py"", line 31, in enex save_note(db, note) File ""/Users/simon/.local/pipx/venvs/evernote-to-sqlite/lib/python3.9/site-packages/evernote_to_sqlite/utils.py"", line 36, in save_note content = ET.tostring(ET.fromstring(content_xml)).decode(""utf-8"") File 
""/usr/local/Cellar/python@3.9/3.9.6/Frameworks/Python.framework/Versions/3.9/lib/python3.9/xml/etree/ElementTree.py"", line 1347, in XML parser.feed(text) xml.etree.ElementTree.ParseError: not well-formed (invalid token): line 2, column 132 ```",303218369,evernote-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/evernote-to-sqlite/issues/13/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 978357984,MDU6SXNzdWU5NzgzNTc5ODQ=,1446,Modify base.html template to support optional sticky footer,9599,simonw,closed,0,,,,,3,2021-08-24T18:11:12Z,2021-08-31T01:54:59Z,2021-08-24T20:32:47Z,OWNER,,"The neatest way to have the footer stick to the bottom of the browser window that I've found is to use the flexbox pattern from https://css-tricks.com/couple-takes-sticky-footer/ ```html
<body>
  <div class=""content"">content</div>
  <footer class=""footer""></footer>
</body>
``` ```css html, body { height: 100%; } body { display: flex; flex-direction: column; } .content { flex: 1 0 auto; } .footer { flex-shrink: 0; } ``` I tried this in a custom plugin but it ended up having to duplicate the entire `base.html` template just to get a wrapper around the not-footer content. I think Datasette's own `base.html` template should have this wrapper element instead.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1446/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 977323133,MDU6SXNzdWU5NzczMjMxMzM=,1445,Ability to search for text across all columns in a table,9599,simonw,open,0,,,,,5,2021-08-23T18:50:48Z,2021-08-23T19:10:17Z,,OWNER,,"When I'm working with new data I often find myself wanting to run a search for text embedded in ANY of the columns of a table, without having to even fully understand the schema first. I figured out a trick for doing that using a SQL-generated SQL query here: https://til.simonwillison.net/datasette/search-all-columns-trick But maybe this should be a core Datasette feature? Or a plugin?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1445/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 977128935,MDU6SXNzdWU5NzcxMjg5MzU=,21,Duplicate Column,32016596,FabianHertwig,open,0,,,,,1,2021-08-23T15:00:44Z,2021-08-23T17:00:59Z,,NONE,,"Hey, thank you for this repo! When I try to convert my export, I get a multiple column error. Here is the stack trace: ```sh (.venv) (base) computer:bodyweight_app user$ healthkit-to-sqlite ./data/Health_export.zip ./data/healthkit.db Importing from HealthKit [###############################-----] 87% 00:00:22 Traceback (most recent call last): File ""/MyProject/.venv/bin/healthkit-to-sqlite"", line 10, in sys.exit(cli()) File ""/MyProject/.venv/lib/python3.7/site-packages/click/core.py"", line 829, in __call__ return self.main(*args, **kwargs) File ""/MyProject/.venv/lib/python3.7/site-packages/click/core.py"", line 782, in main rv = self.invoke(ctx) File ""/MyProject/.venv/lib/python3.7/site-packages/click/core.py"", line 1066, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/MyProject/.venv/lib/python3.7/site-packages/click/core.py"", line 610, in invoke return callback(*args, **kwargs) File ""/MyProject/.venv/lib/python3.7/site-packages/healthkit_to_sqlite/cli.py"", line 57, in cli convert_xml_to_sqlite(fp, db, progress_callback=bar.update, zipfile=zf) File ""/MyProject/.venv/lib/python3.7/site-packages/healthkit_to_sqlite/utils.py"", line 41, in convert_xml_to_sqlite write_records(records, db) File ""/MyProject/.venv/lib/python3.7/site-packages/healthkit_to_sqlite/utils.py"", line 146, in write_records batch_size=50, File ""/MyProject/.venv/lib/python3.7/site-packages/sqlite_utils/db.py"", line 2579, in insert_all extracts=extracts, File ""/MyProject/.venv/lib/python3.7/site-packages/sqlite_utils/db.py"", line 1246, in create extracts=extracts, File ""/MyProject/.venv/lib/python3.7/site-packages/sqlite_utils/db.py"", line 767, in create_table self.execute(sql) File ""/MyProject/.venv/lib/python3.7/site-packages/sqlite_utils/db.py"", line 421, in execute return self.conn.execute(sql) sqlite3.OperationalError: duplicate column name: metadata_Meal ```",197882382,healthkit-to-sqlite,issue,,,"{""url"": 
""https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/21/reactions"", ""total_count"": 5, ""+1"": 5, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 976405225,MDU6SXNzdWU5NzY0MDUyMjU=,320,sqlite-utils memory --analyze option,9599,simonw,closed,0,,,,,2,2021-08-22T15:37:10Z,2021-08-22T15:46:56Z,2021-08-22T15:44:29Z,OWNER,,To provide a way of running [analyze-tables](https://sqlite-utils.datasette.io/en/stable/cli.html#analyzing-tables) directly against JSON or CSV data.,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/320/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 976399638,MDU6SXNzdWU5NzYzOTk2Mzg=,319,[Enhancement] Please allow 'insert-files' to insert content as text.,66709385,pjamargh,closed,0,,,,,10,2021-08-22T15:10:46Z,2021-08-24T23:33:45Z,2021-08-24T23:33:44Z,NONE,,"'insert-files' creates BLOB columns for file contents. Transforming the column to TEXT still keep the content as binary. Even though I'm sure there is a transform that can be applied decoding the text it would be great to have a argument to make 'insert-files' to do it as text (with optional text encoding). The use case is a bunch of htmls (single file) on a directory structure that inserted with this command could be served in Datasette allowing full text search.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/319/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 975166271,MDU6SXNzdWU5NzUxNjYyNzE=,20,Add index on workout_points.date,9599,simonw,open,0,,,,,2,2021-08-20T01:08:04Z,2021-08-20T01:12:48Z,,MEMBER,,"Sorting that by date makes sense for seeing most recent points, and my DB has 2.5m points in so it's an expensive sort!",197882382,healthkit-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/20/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 975158266,MDU6SXNzdWU5NzUxNTgyNjY=,19,table activity_summary has no column named appleMoveTime,9599,simonw,closed,0,,,,,0,2021-08-20T00:46:44Z,2021-08-20T00:54:34Z,2021-08-20T00:54:34Z,MEMBER,,"Got this error today against a fresh export: table activity_summary has no column named appleMoveTime ",197882382,healthkit-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/19/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 974995592,MDU6SXNzdWU5NzQ5OTU1OTI=,1443,datasette.databases should be a documented property,9599,simonw,closed,0,,,,,1,2021-08-19T19:53:04Z,2021-08-19T21:25:07Z,2021-08-19T21:23:47Z,OWNER,,"https://github.com/simonw/datasette/blob/adb5b70de5cec3c3dd37184defe606a082c232cf/datasette/app.py#L231 I want to use it in https://github.com/simonw/datasette-block-robots/issues/5",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1443/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 974987856,MDU6SXNzdWU5NzQ5ODc4NTY=,1442,Mechanism to cause specific branches to 
deploy their own demos,9599,simonw,closed,0,,,,,6,2021-08-19T19:41:39Z,2021-08-19T21:11:45Z,2021-08-19T21:09:40Z,OWNER,,"A useful capability would be if it was super-easy to say ""any pushes to branch X should be deployed to `latest-X.datasette.io`"". I'd like to use this for the column query information work in #1434",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1442/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 974067156,MDU6SXNzdWU5NzQwNjcxNTY=,318,Research: handle gzipped CSV directly,9599,simonw,open,0,,,,,2,2021-08-18T21:23:04Z,2021-08-18T21:25:30Z,,OWNER,,"Would it be worthwhile for the `sqlite-utils` command-line tool to grow features to efficiently directly interact with gzipped CSV data? Maybe add `--gz` options to both `insert` and to the various commands that output query results.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/318/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 973139047,MDU6SXNzdWU5NzMxMzkwNDc=,1439,Rethink how .ext formats (v.s. ?_format=) works before 1.0,9599,simonw,closed,0,,,3268330,Datasette 1.0,48,2021-08-17T23:32:51Z,2022-03-15T20:51:26Z,2022-03-15T20:51:26Z,OWNER,,"Datasette currently has surprising special behaviour for if a table name ends in `.csv` - which can happen when a tool like `csvs-to-sqlite` creates tables that match the filename that they were imported from. https://latest.datasette.io/fixtures/table%2Fwith%2Fslashes.csv illustrates this behaviour: it links to `.csv` and `.json` that look like this: - https://latest.datasette.io/fixtures/table%2Fwith%2Fslashes.csv?_format=json - https://latest.datasette.io/fixtures/table%2Fwith%2Fslashes.csv?_format=csv&_size=max Where normally Datasette would add the `.csv` or `.json` extension to the path component of the URL (as seen on other pages such as https://latest.datasette.io/fixtures/facet_cities) here the [path_with_format() function](https://github.com/simonw/datasette/blob/adb5b70de5cec3c3dd37184defe606a082c232cf/datasette/utils/__init__.py#L710) notices that there is already a `.` in the path and instead adds `?_format=csv` to the query string instead. The problem with this mechanism is that it's pretty surprising. Anyone writing external code to Datasette who wants to get back the `.csv` or `.json` version giving the URL to a table page will need to know about and implement this behaviour themselves. That's likely to cause all kinds of bugs in the future.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1439/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 972918533,MDU6SXNzdWU5NzI5MTg1MzM=,1438,Query page .csv and .json links are not correctly URL-encoded on Vercel under unknown specific conditions,9599,simonw,open,0,,,,,7,2021-08-17T17:35:36Z,2021-08-18T00:22:23Z,,OWNER,,"> Confirmed: https://thesession.vercel.app/thesession?sql=select+*+from+tunes+where+name+like+%22%25wise+maid%25%22%0D%0A is a page where the URL correctly encoded `%` as `%25` - but then in the HTML on that page that links to the CSV and JSON versions we get this: > > ```html >

This data as > json, > CSV >

> ``` Those CSV and JSON links are incorrect. _Originally posted by @simonw in https://github.com/simonw/datasette-publish-vercel/issues/48#issuecomment-900497579_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1438/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 972827346,MDU6SXNzdWU5NzI4MjczNDY=,317,Link to a better example on docs index,9599,simonw,closed,0,,,,,1,2021-08-17T15:43:40Z,2021-08-18T18:31:43Z,2021-08-18T18:31:43Z,OWNER,,https://github.com/simonw/sqlite-utils/blob/7a19822ac9ee24be2fbb4c2326a0bf2f3d2d9c4d/docs/index.rst#L39 Is a very old example,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/317/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 970626625,MDU6SXNzdWU5NzA2MjY2MjU=,1435,Turn off suggest facets on tables with large numbers of columns,9599,simonw,open,0,,,,,0,2021-08-13T18:30:48Z,2021-08-13T18:30:48Z,,OWNER,,If a table has 200 columns it will take multiple seconds to try and suggest facets. I should either quit after the first 20 or not suggest facets at all.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1435/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 970320615,MDU6SXNzdWU5NzAzMjA2MTU=,316,Fix visible backticks on reference page,9599,simonw,closed,0,,,,,1,2021-08-13T11:37:46Z,2021-08-14T05:12:23Z,2021-08-14T05:10:48Z,OWNER,,"https://sqlite-utils.datasette.io/en/latest/reference.html Search for backtick to reveal various minor markup bugs.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/316/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 969855774,MDU6SXNzdWU5Njk4NTU3NzQ=,1432,Rename Datasette.__init__(config=) parameter to settings=,9599,simonw,open,0,,,,,8,2021-08-13T01:00:27Z,2021-10-19T01:16:41Z,,OWNER,,"> While I'm doing this I should rename this internal variable to avoid confusion in the future: > > https://github.com/simonw/datasette/blob/e837095ef35ae155b4c78cc9a8b7133a48c94f03/datasette/app.py#L203 _Originally posted by @simonw in https://github.com/simonw/datasette/issues/1431#issuecomment-898072940_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1432/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 969840302,MDU6SXNzdWU5Njk4NDAzMDI=,1431,`--help-config` should be called `--help-settings`,9599,simonw,closed,0,,,,,1,2021-08-13T00:46:48Z,2021-08-13T01:01:58Z,2021-08-13T01:01:58Z,OWNER,,Follow-on from #1105 rebranding exercise.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1431/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 969548935,MDU6SXNzdWU5Njk1NDg5MzU=,1429,UI for setting `?_size=max` on table page,9599,simonw,open,0,,,,,2,2021-08-12T20:52:09Z,2021-08-13T04:37:41Z,,OWNER,,"It defaults to 100 per page, but you can increase that to 1000 per page using 
`?_size=max` (or higher if `max_returned_rows` is set higher than that). But... that's only available to people who know how to hack URLs. Solution: add a link that sets that option to the pagination block at the bottom of the table: ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1429/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 965440017,MDU6SXNzdWU5NjU0NDAwMTc=,315,`.delete_where()` returns `[]` when it should return self,9599,simonw,closed,0,,,,,1,2021-08-10T21:54:55Z,2021-08-10T23:09:29Z,2021-08-10T23:09:29Z,OWNER,,"If the table doesn't exist it should still return `self`, not `[]`: https://github.com/simonw/sqlite-utils/blob/ee469e3122d6f5973ec2584c1580d930daca2e7c/sqlite_utils/db.py#L1676-L1683 Spotted with `mypy` while working on #312.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/315/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 965210966,MDU6SXNzdWU5NjUyMTA5NjY=,314,Type signatures for `.create_table()` and `.create_table_sql()` and `.create()` and `Table.__init__`,9599,simonw,closed,0,,,,,2,2021-08-10T18:03:59Z,2021-08-18T22:25:21Z,2021-08-18T22:25:21Z,OWNER,,"> Adding type signatures to `create_table()` and `.create_table_sql()` is a bit too involved, I'll do that in a separate issue. _Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/312#issuecomment-896200682_",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/314/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 965166058,MDU6SXNzdWU5NjUxNjYwNTg=,313,`.add_foreign_keys()` doesn't reject being called with a View,9599,simonw,closed,0,,,,,0,2021-08-10T17:22:17Z,2021-08-10T17:25:34Z,2021-08-10T17:25:34Z,OWNER,,"Spotted this bug using `mypy` while working on #311 / #312! ``` % mypy sqlite_utils sqlite_utils/db.py:725: error: Item ""View"" of ""Union[Table, View]"" has no attribute ""foreign_keys"" Found 1 error in 1 file (checked 5 source files) ``` Refers to this code: https://github.com/simonw/sqlite-utils/blob/c11ff89894727270d4a9eb554d3a006f5b0d8d9d/sqlite_utils/db.py#L710-L720 It's a bug! We run some checks earlier but none of them ensure that it's a view: https://github.com/simonw/sqlite-utils/blob/c11ff89894727270d4a9eb554d3a006f5b0d8d9d/sqlite_utils/db.py#L697-L709",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/313/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 965102534,MDU6SXNzdWU5NjUxMDI1MzQ=,311,Add reference documentation generated from docstrings,9599,simonw,closed,0,,,,,4,2021-08-10T16:04:00Z,2021-08-11T12:03:50Z,2021-08-11T12:03:50Z,OWNER,,"Using https://www.sphinx-doc.org/en/master/usage/extensions/autodoc.html I'm not a big fan of this kind of documentation because it so often comes in place of narrative documentation - but the library has great narrative documentation now, so the reference documentation can link to it in places. 
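For example, here is the kind of docstring autodoc picks up - the signature is illustrative:

```python
def create_table(self, name, columns, pk=None):
    '''
    Create a table with the given name and column definitions.

    :param name: Name of the table to create
    :param columns: Dictionary mapping column names to their types
    :param pk: Column to use as the primary key, if any
    '''
```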
This will also encourage me to add good docstrings everywhere, useful for IDEs and suchlike.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/311/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 964400482,MDU6SXNzdWU5NjQ0MDA0ODI=,310,`sqlite-utils insert --flatten` option to flatten nested JSON,9599,simonw,closed,0,,,,,3,2021-08-09T21:23:08Z,2021-10-16T13:54:56Z,2021-08-09T21:44:06Z,OWNER,,"I had to do this with a `jq` recipe today: https://til.simonwillison.net/cloudrun/tailing-cloud-run-request-logs ``` cat log.json | jq -c '[leaf_paths as $path | { ""key"": $path | join(""_""), ""value"": getpath($path) }] | from_entries' \ | sqlite-utils insert /tmp/logs.db logs - --nl --alter --batch-size 1 ``` That was to turn something like this: ```json { ""httpRequest"": { ""latency"": ""0.112114537s"", ""requestMethod"": ""GET"", ""requestSize"": ""534"", ""status"": 200, }, ""insertId"": ""6111722f000b5b4c4d4071e2"", ""labels"": { ""service"": ""datasette-io"" } } ``` Into this instead: ```json { ""httpRequest_latency"": ""0.112114537s"", ""httpRequest_requestMethod"": ""GET"", ""httpRequest_requestSize"": ""534"", ""httpRequest_status"": 200, ""insertId"": ""6111722f000b5b4c4d4071e2"", ""labels_service"": ""datasette-io"" } ``` I have to do this often enough that I think it should be an option, `--flatten` - so I can do this instead: ``` cat log.json | sqlite-utils insert /tmp/logs.db logs - --flatten ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/310/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 964322136,MDU6SXNzdWU5NjQzMjIxMzY=,1426,"Manage /robots.txt in Datasette core, block robots by default",9599,simonw,open,0,,,,,9,2021-08-09T19:56:56Z,2021-12-04T07:11:29Z,,OWNER,,"See accompanying Twitter thread: https://twitter.com/simonw/status/1424820203603431439 > Datasette currently has a plugin for configuring robots.txt, but I'm beginning to think it should be part of core and crawlers should be blocked by default - having people explicitly opt-in to having their sites crawled and indexed feels a lot safer https://datasette.io/plugins/datasette-block-robots I have a lot of Datasettes deployed now, and tailing logs shows that they are being *hammered* by search engine crawlers even though many of them are not interesting enough to warrant indexing. I'm starting to think blocking crawlers would actually be a better default for most people, provided it was well documented and easy to understand how to allow them. 
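For reference, a default-deny robots.txt that blocks all crawlers is just two lines:

```
User-agent: *
Disallow: /
```

Allowing specific crawlers back in would then be an explicit opt-in, via additional User-agent sections.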
Default-deny is usually a better policy than default-allow!",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1426/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 963897111,MDU6SXNzdWU5NjM4OTcxMTE=,309,"sqlite-utils insert errors should show SQL and parameters, if possible",16622642,scaleoutsean,closed,0,,,,,6,2021-08-09T11:24:14Z,2021-08-09T23:40:29Z,2021-08-09T22:25:58Z,NONE,,"I've tried several approaches, but this is the current one: ```sh echo $json-line | sqlite-utils insert json.db jsontable --truncate --alter --detect-types - ``` In all cases, I get this error: ```sh OverflowError: Python int too large to convert to SQLite INTEGER Traceback (most recent call last): File ""/home/sean/.local/bin/sqlite-utils"", line 8, in sys.exit(cli()) File ""/usr/lib/python3/dist-packages/click/core.py"", line 764, in __call__ return self.main(*args, **kwargs) File ""/usr/lib/python3/dist-packages/click/core.py"", line 717, in main rv = self.invoke(ctx) File ""/usr/lib/python3/dist-packages/click/core.py"", line 1137, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/usr/lib/python3/dist-packages/click/core.py"", line 956, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/usr/lib/python3/dist-packages/click/core.py"", line 555, in invoke return callback(*args, **kwargs) File ""/home/sean/.local/lib/python3.8/site-packages/sqlite_utils/cli.py"", line 841, in insert insert_upsert_implementation( File ""/home/sean/.local/lib/python3.8/site-packages/sqlite_utils/cli.py"", line 780, in insert_upsert_implementation db[table].insert_all( File ""/home/sean/.local/lib/python3.8/site-packages/sqlite_utils/db.py"", line 2145, in insert_all self.insert_chunk( File ""/home/sean/.local/lib/python3.8/site-packages/sqlite_utils/db.py"", line 1957, in insert_chunk result = self.db.execute(query, params) File ""/home/sean/.local/lib/python3.8/site-packages/sqlite_utils/db.py"", line 257, in execute return self.conn.execute(sql, parameters) ``` I googled the error and checked SO answers and advice, all good. I changed my JSON file to not use integers so I no longer get this error. Of course, that makes using the database a bit harder, so I also tried to solve the problem by modifying DB structure (while using integers in JSON). If change all `INTEGER` Data Types to something else (`STRING`, `TEXT`) and try to import again using `--truncate`, I still get this error. I suppose I should tell sqlite-utils which columns should use non-INTEGER Data Type rather than rely on it to check my SQL table configuration. If that is the case, can this error be a bit more specific for easier troubleshooting - maybe tell us which which record caused the problem when that error is thrown? My table has 60+ columns, many of which use 64-bit integers (not all records are large or known in advance), so while I can modify JSON to use strings instead of integers, it decreases usability and finding out which records have values for which SQLite integers aren't sufficient requires some work (I'm thinking about parsing all integers with `jq` and sorting output by length to identify those columns, but I'd prefer if sqlite-utils could tell me that). 
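In the meantime, a minimal sketch for locating the offending values, assuming newline-delimited JSON on stdin (SQLite stores INTEGER as a signed 64-bit value, so the cutoff is 2**63 - 1):

```python
import json
import sys

SQLITE_MAX_INT = 2 ** 63 - 1  # SQLite INTEGER is a signed 64-bit value

def oversized(obj, path=''):
    # Recursively yield (path, value) pairs for out-of-range integers
    if isinstance(obj, dict):
        for key, value in obj.items():
            yield from oversized(value, path + '.' + key)
    elif isinstance(obj, list):
        for i, value in enumerate(obj):
            yield from oversized(value, '{}[{}]'.format(path, i))
    elif isinstance(obj, int) and not (-SQLITE_MAX_INT - 1 <= obj <= SQLITE_MAX_INT):
        yield path, obj

for line in sys.stdin:
    for path, value in oversized(json.loads(line)):
        print(path, value)
```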
My environment: - Python 3.8.10 - sqlite-utils 3.14 - pandas 1.3.1 - numpy 1.21.1 - sqlite-fts4 1.0.1 - sqlite 3.31.1-4ubuntu0.2 ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/309/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 963528457,MDU6SXNzdWU5NjM1Mjg0NTc=,1425,render_cell() hook should support returning an awaitable,9599,simonw,closed,0,,,,,11,2021-08-08T22:32:29Z,2021-08-09T07:14:35Z,2021-08-09T03:00:37Z,OWNER,,"Many of the plugin hooks can return an awaitable - e.g. https://docs.datasette.io/en/stable/plugin_hooks.html#plugin-hook-extra-template-vars - but `render_cell()` doesn't support this. I recently found myself wanting to execute an additional SQL query from that hook, but it wasn't possible to do that since I couldn't use `await`.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1425/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 963527045,MDU6SXNzdWU5NjM1MjcwNDU=,1424,Document exceptions that can be raised by db.execute() and friends,9599,simonw,open,0,,,,,4,2021-08-08T22:23:25Z,2021-08-08T22:27:31Z,,OWNER,,Not currently covered here: https://docs.datasette.io/en/stable/internals.html#await-db-execute-sql,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1424/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 962391325,MDU6SXNzdWU5NjIzOTEzMjU=,1423,Show count of facet values if ?_facet_size=max,9599,simonw,closed,0,,,,,9,2021-08-06T04:42:20Z,2021-12-15T17:48:40Z,2021-08-16T18:56:43Z,OWNER,,"I sometimes want to get a count of the values in a facet - if it's a facet of US states for example I want to know if all 50 are represented. Idea: if `?_facet_size=max` is present, add a count to the facet heading. So on: https://latest.datasette.io/fixtures/compound_three_primary_keys?_facet=content&_facet_size=max&_facet=pk1&_facet=pk2#facet-pk2 It could have something like this: Note that the first column shows >1000 - because in that case we've truncated the facet calculation since the maximum allowed returned rows is 1000.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1423/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 961367843,MDU6SXNzdWU5NjEzNjc4NDM=,1422,Ability to default to hiding the SQL for a canned query,9599,simonw,closed,0,,,,,4,2021-08-05T02:51:39Z,2021-08-07T05:32:29Z,2021-08-07T05:32:29Z,OWNER,,"I'm working on a project with some HUGE (400+ lines of SQL) canned queries right now. Any time you land on the canned query page you have to scroll down a long distance to get to the results! 
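One possible shape for this: a per-canned-query option in metadata.json that defaults the SQL to hidden (the `hide_sql` key here is a hypothetical name):

```json
{
  ""databases"": {
    ""fixtures"": {
      ""queries"": {
        ""magic_parameters"": {
          ""sql"": ""..."",
          ""hide_sql"": true
        }
      }
    }
  }
}
```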
Would be useful to be able to default to https://latest.datasette.io/fixtures/magic_parameters?_hide_sql=1 without needing the parameter.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1422/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 961008507,MDU6SXNzdWU5NjEwMDg1MDc=,308,Add an interactive tutorial as a Jupyter notebook,9599,simonw,open,0,,,,,2,2021-08-04T20:34:22Z,2021-08-04T21:30:59Z,,OWNER,,Can show people how to open this up in Binder.,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/308/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 959999095,MDU6SXNzdWU5NTk5OTkwOTU=,1421,"""Query parameters"" form shows wrong input fields if query contains ""03:31"" style times",6988,j4mie,closed,0,,,,,11,2021-08-04T07:29:04Z,2021-08-09T03:41:07Z,2021-08-09T03:33:02Z,NONE,,"Datasette version `0.58.1`. I'm guessing this is a bug in the code that looks for `:param`-style query parameters.. ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1421/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 959898166,MDU6SXNzdWU5NTk4OTgxNjY=,1420,`datasette publish cloudrun --cpu X` option,9599,simonw,closed,0,,,,,5,2021-08-04T05:04:31Z,2021-08-05T00:54:59Z,2021-08-04T05:33:48Z,OWNER,,"For setting the number of vCPUs - current valid values are 1, 2 or 4: https://cloud.google.com/run/docs/configuring/cpu Pass that through to `gcloud run deploy --image IMAGE_URL --cpu CPU`",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1420/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 959710008,MDU6SXNzdWU5NTk3MTAwMDg=,1419,`publish cloudrun` should deploy a more recent SQLite version,536941,fgregg,open,0,,,,,3,2021-08-04T00:45:55Z,2021-08-05T03:23:24Z,,CONTRIBUTOR,,"I recently changed from deploying a datasette using `datasette publish heroku` to `datasette publish cloudrun`. 
[A query that ran on the heroku site](https://odpr.bunkum.us/odpr-6c2f4fc?sql=with+pivot_members+as+%28%0D%0A++select%0D%0A++++f_num%2C%0D%0A++++max%28union_name%29+as+union_name%2C%0D%0A++++max%28aff_abbr%29+as+abbreviation%2C%0D%0A++++sum%28number%29+filter+%28%0D%0A++++++where%0D%0A++++++++yr_covered+%3D+2007%0D%0A++++%29+as+%222007%22%2C%0D%0A++++sum%28number%29+filter+%28%0D%0A++++++where%0D%0A++++++++yr_covered+%3D+2008%0D%0A++++%29+as+%222008%22%2C%0D%0A++++sum%28number%29+filter+%28%0D%0A++++++where%0D%0A++++++++yr_covered+%3D+2009%0D%0A++++%29+as+%222009%22%2C%0D%0A++++sum%28number%29+filter+%28%0D%0A++++++where%0D%0A++++++++yr_covered+%3D+2010%0D%0A++++%29+as+%222010%22%2C%0D%0A++++sum%28number%29+filter+%28%0D%0A++++++where%0D%0A++++++++yr_covered+%3D+2011%0D%0A++++%29+as+%222011%22%2C%0D%0A++++sum%28number%29+filter+%28%0D%0A++++++where%0D%0A++++++++yr_covered+%3D+2012%0D%0A++++%29+as+%222012%22%2C%0D%0A++++sum%28number%29+filter+%28%0D%0A++++++where%0D%0A++++++++yr_covered+%3D+2013%0D%0A++++%29+as+%222013%22%2C%0D%0A++++sum%28number%29+filter+%28%0D%0A++++++where%0D%0A++++++++yr_covered+%3D+2014%0D%0A++++%29+as+%222014%22%2C%0D%0A++++sum%28number%29+filter+%28%0D%0A++++++where%0D%0A++++++++yr_covered+%3D+2015%0D%0A++++%29+as+%222015%22%2C%0D%0A++++sum%28number%29+filter+%28%0D%0A++++++where%0D%0A++++++++yr_covered+%3D+2016%0D%0A++++%29+as+%222016%22%2C%0D%0A++++sum%28number%29+filter+%28%0D%0A++++++where%0D%0A++++++++yr_covered+%3D+2017%0D%0A++++%29+as+%222017%22%2C%0D%0A++++sum%28number%29+filter+%28%0D%0A++++++where%0D%0A++++++++yr_covered+%3D+2018%0D%0A++++%29+as+%222018%22%2C%0D%0A++++sum%28number%29+filter+%28%0D%0A++++++where%0D%0A++++++++yr_covered+%3D+2019%0D%0A++++%29+as+%222019%22%0D%0A++from%0D%0A++++lm_data%0D%0A++++left+join+ar_membership+using+%28rpt_id%29%0D%0A++where%0D%0A++++members+%21%3D+%27%27%0D%0A++++and+trim%28desig_name%29+%3D+%27NHQ%27%0D%0A++++and+union_name+not+in+%28%0D%0A++++++%27AFL-CIO%27%2C%0D%0A++++++%27CHANGE+TO+WIN%27%2C%0D%0A++++++%27FOOD+ALLIED+SVC+TRADES+DEPT+AFL-CIO%27%0D%0A++++%29%0D%0A++++and+f_num+not+in+%28387%2C+296%2C+123%2C+347%2C+531897%2C+30410%2C+49%29%0D%0A++++and+lower%28ar_membership.category%29+IN+%28%0D%0A++++++%27active+members%27%2C%0D%0A++++++%27active+professional%27%2C%0D%0A++++++%27regular+members%27%2C%0D%0A++++++%27full+time+member%27%2C%0D%0A++++++%27full+time+members%27%2C%0D%0A++++++%27active+education+support+professional%27%2C%0D%0A++++++%27active%27%2C%0D%0A++++++%27active+member%27%2C%0D%0A++++++%27members%27%2C%0D%0A++++++%27see+item+69%27%2C%0D%0A++++++%27regular%27%2C%0D%0A++++++%27dues+paying+members%27%2C%0D%0A++++++%27building+trades+journeyman%27%2C%0D%0A++++++%27full+per+capita+tax+payers%27%2C%0D%0A++++++%27active+postal%27%2C%0D%0A++++++%27active+membership%27%2C%0D%0A++++++%27full-time+member%27%2C%0D%0A++++++%27regular+active+member%27%2C%0D%0A++++++%27journeyman%27%2C%0D%0A++++++%27member%27%2C%0D%0A++++++%27journeymen%27%2C%0D%0A++++++%27one+half+per+capita+tax+payers%27%2C%0D%0A++++++%27schedule+1%27%2C%0D%0A++++++%27active+memebers%27%2C%0D%0A++++++%27active+members+-+us%27%2C%0D%0A++++++%27dues+paying+membership%27%0D%0A++++%29%0D%0A++GROUP+by%0D%0A++++f_num%0D%0A%29%0D%0Aselect%0D%0A++*%0D%0Afrom%0D%0A++pivot_members%0D%0Aorder+by%0D%0A++%222019%22+desc%3B), now throws a syntax error on the [cloudrun 
site](https://labordata.bunkum.us/odpr-6c2f4fc?sql=with+pivot_members+as+%28%0D%0A++select%0D%0A++++f_num%2C%0D%0A++++max%28union_name%29+as+union_name%2C%0D%0A++++max%28aff_abbr%29+as+abbreviation%2C%0D%0A++++sum%28number%29+filter+%28%0D%0A++++++where%0D%0A++++++++yr_covered+%3D+2007%0D%0A++++%29+as+%222007%22%2C%0D%0A++++sum%28number%29+filter+%28%0D%0A++++++where%0D%0A++++++++yr_covered+%3D+2008%0D%0A++++%29+as+%222008%22%2C%0D%0A++++sum%28number%29+filter+%28%0D%0A++++++where%0D%0A++++++++yr_covered+%3D+2009%0D%0A++++%29+as+%222009%22%2C%0D%0A++++sum%28number%29+filter+%28%0D%0A++++++where%0D%0A++++++++yr_covered+%3D+2010%0D%0A++++%29+as+%222010%22%2C%0D%0A++++sum%28number%29+filter+%28%0D%0A++++++where%0D%0A++++++++yr_covered+%3D+2011%0D%0A++++%29+as+%222011%22%2C%0D%0A++++sum%28number%29+filter+%28%0D%0A++++++where%0D%0A++++++++yr_covered+%3D+2012%0D%0A++++%29+as+%222012%22%2C%0D%0A++++sum%28number%29+filter+%28%0D%0A++++++where%0D%0A++++++++yr_covered+%3D+2013%0D%0A++++%29+as+%222013%22%2C%0D%0A++++sum%28number%29+filter+%28%0D%0A++++++where%0D%0A++++++++yr_covered+%3D+2014%0D%0A++++%29+as+%222014%22%2C%0D%0A++++sum%28number%29+filter+%28%0D%0A++++++where%0D%0A++++++++yr_covered+%3D+2015%0D%0A++++%29+as+%222015%22%2C%0D%0A++++sum%28number%29+filter+%28%0D%0A++++++where%0D%0A++++++++yr_covered+%3D+2016%0D%0A++++%29+as+%222016%22%2C%0D%0A++++sum%28number%29+filter+%28%0D%0A++++++where%0D%0A++++++++yr_covered+%3D+2017%0D%0A++++%29+as+%222017%22%2C%0D%0A++++sum%28number%29+filter+%28%0D%0A++++++where%0D%0A++++++++yr_covered+%3D+2018%0D%0A++++%29+as+%222018%22%2C%0D%0A++++sum%28number%29+filter+%28%0D%0A++++++where%0D%0A++++++++yr_covered+%3D+2019%0D%0A++++%29+as+%222019%22%0D%0A++from%0D%0A++++lm_data%0D%0A++++left+join+ar_membership+using+%28rpt_id%29%0D%0A++where%0D%0A++++members+%21%3D+%27%27%0D%0A++++and+trim%28desig_name%29+%3D+%27NHQ%27%0D%0A++++and+union_name+not+in+%28%0D%0A++++++%27AFL-CIO%27%2C%0D%0A++++++%27CHANGE+TO+WIN%27%2C%0D%0A++++++%27FOOD+ALLIED+SVC+TRADES+DEPT+AFL-CIO%27%0D%0A++++%29%0D%0A++++and+f_num+not+in+%28387%2C+296%2C+123%2C+347%2C+531897%2C+30410%2C+49%29%0D%0A++++and+lower%28ar_membership.category%29+IN+%28%0D%0A++++++%27active+members%27%2C%0D%0A++++++%27active+professional%27%2C%0D%0A++++++%27regular+members%27%2C%0D%0A++++++%27full+time+member%27%2C%0D%0A++++++%27full+time+members%27%2C%0D%0A++++++%27active+education+support+professional%27%2C%0D%0A++++++%27active%27%2C%0D%0A++++++%27active+member%27%2C%0D%0A++++++%27members%27%2C%0D%0A++++++%27see+item+69%27%2C%0D%0A++++++%27regular%27%2C%0D%0A++++++%27dues+paying+members%27%2C%0D%0A++++++%27building+trades+journeyman%27%2C%0D%0A++++++%27full+per+capita+tax+payers%27%2C%0D%0A++++++%27active+postal%27%2C%0D%0A++++++%27active+membership%27%2C%0D%0A++++++%27full-time+member%27%2C%0D%0A++++++%27regular+active+member%27%2C%0D%0A++++++%27journeyman%27%2C%0D%0A++++++%27member%27%2C%0D%0A++++++%27journeymen%27%2C%0D%0A++++++%27one+half+per+capita+tax+payers%27%2C%0D%0A++++++%27schedule+1%27%2C%0D%0A++++++%27active+memebers%27%2C%0D%0A++++++%27active+members+-+us%27%2C%0D%0A++++++%27dues+paying+membership%27%0D%0A++++%29%0D%0A++GROUP+by%0D%0A++++f_num%0D%0A%29%0D%0Aselect%0D%0A++*%0D%0Afrom%0D%0A++pivot_members%0D%0Aorder+by%0D%0A++%222019%22+desc%3B). I suspect this is because they are running different versions of sqlite3. - Heroku: sqlite3 3.31.1 ([-/versions](https://odpr.bunkum.us/-/versions)) - Cloudrun: sqlite3 3.27.2 ([-/versions](https://labordata.bunkum.us/-/versions)) If so, it would be great to 1. 
harmonize the sqlite3 versions across platforms and 2. update the Docker files so that Cloud Run deploys get a more recent sqlite3 version",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1419/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 959305209,MDU6SXNzdWU5NTkzMDUyMDk=,307,codespell to spell check documentation,9599,simonw,closed,0,,,,,0,2021-08-03T16:48:19Z,2021-08-03T16:48:53Z,2021-08-03T16:48:53Z,OWNER,,As seen in https://github.com/simonw/datasette/issues/1417 and https://til.simonwillison.net/python/codespell,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/307/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 959278472,MDU6SXNzdWU5NTkyNzg0NzI=,1417,Use codespell in CI to spot spelling errors,9599,simonw,closed,0,,,,,1,2021-08-03T16:14:15Z,2021-08-03T16:36:40Z,2021-08-03T16:36:40Z,OWNER,,"I noticed Rich is using this: https://github.com/willmcgugan/rich/commit/9c12a4537499797c43725fff5276ef0da62423ef#diff-ce84a1b2c9eb4ab3ea22f610cad7111cb9a2f66365c3b24679901376a2a73ab2 Ran it against the Datasette docs and found a bunch of obvious fixes, surprisingly with no false positives. ``` datasette % codespell docs/*.rst docs/authentication.rst:63: perfom ==> perform docs/authentication.rst:76: perfom ==> perform docs/changelog.rst:429: repsonse ==> response docs/changelog.rst:503: permissons ==> permissions docs/changelog.rst:717: compatibilty ==> compatibility docs/changelog.rst:1172: browseable ==> browsable docs/deploying.rst:191: similiar ==> similar docs/internals.rst:434: Respons ==> Response, respond docs/internals.rst:440: Respons ==> Response, respond docs/internals.rst:717: tha ==> than, that, the docs/performance.rst:42: databse ==> database docs/plugin_hooks.rst:667: utilites ==> utilities docs/publish.rst:168: countainer ==> container docs/settings.rst:352: inalid ==> invalid docs/sql_queries.rst:406: preceeded ==> preceded, proceeded ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1417/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 959276629,MDU6SXNzdWU5NTkyNzY2Mjk=,1416,"Use rich to render tracebacks on errors, if available",9599,simonw,closed,0,,,,,0,2021-08-03T16:12:08Z,2021-08-03T16:12:51Z,2021-08-03T16:12:51Z,OWNER,,"> Now thinking I should try adding Rich as an optional dependency to Datasette - if it's there, show tracebacks using it. Could be really handy for development > https://twitter.com/simonw/status/1422576091055616003",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1416/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 959137143,MDU6SXNzdWU5NTkxMzcxNDM=,1415,feature request: document minimum permissions for service account for cloudrun,536941,fgregg,open,0,,,,,4,2021-08-03T13:48:43Z,2023-11-05T16:46:59Z,,CONTRIBUTOR,,"Thanks again for such a powerful project. For deploying to cloudrun from github actions, I'd like to create a service account with minimal permissions. It would be great to document the minimum permissions that need to be set in IAM. 
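As a purely hypothetical starting point (the role names below are an assumption, not verified against GCP documentation), the grants might look something like this:

```python
import subprocess

# Assumed-minimal roles for deploying to Cloud Run from CI - an educated guess,
# not an official list; the project and service account names are placeholders
ROLES = ['roles/run.admin', 'roles/iam.serviceAccountUser', 'roles/storage.admin']
for role in ROLES:
    subprocess.run([
        'gcloud', 'projects', 'add-iam-policy-binding', 'my-project',
        '--member', 'serviceAccount:deployer@my-project.iam.gserviceaccount.com',
        '--role', role,
    ], check=True)
```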
",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1415/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 958516743,MDU6SXNzdWU5NTg1MTY3NDM=,306,Configure sphinx.ext.extlinks for issues,9599,simonw,closed,0,,,,,2,2021-08-02T21:19:19Z,2021-08-02T21:39:34Z,2021-08-02T21:29:22Z,OWNER,,As seen in Datasette: https://github.com/simonw/datasette/issues/1227,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/306/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 957741820,MDU6SXNzdWU5NTc3NDE4MjA=,305,Python: need a way to execute a count with an extra where clause,9599,simonw,closed,0,,,,,1,2021-08-02T04:52:02Z,2021-08-02T05:08:22Z,2021-08-02T05:08:22Z,OWNER,,I need this for #304. I'll probably add this to the `.execute_count()` method as `where=` and `where_args=`.,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/305/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 957731178,MDU6SXNzdWU5NTc3MzExNzg=,304,"`table.convert(..., where=)` and `sqlite-utils convert ... --where=`",9599,simonw,closed,0,,,,,3,2021-08-02T04:27:23Z,2021-08-02T19:00:00Z,2021-08-02T18:58:10Z,OWNER,,"For applying the conversion to a subset of rows selected using the where clause. Should also take optional arguments, as seen in `db[""dogs""].delete_where(""age < ?"", [3])`. Follows #302 and #251. This was originally https://github.com/simonw/sqlite-transform/issues/9",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/304/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 957529248,MDU6SXNzdWU5NTc1MjkyNDg=,302,Python library version of `sqlite-utils convert`,9599,simonw,closed,0,9599,simonw,,,1,2021-08-01T16:11:02Z,2021-08-02T04:47:40Z,2021-08-02T04:47:40Z,OWNER,,"Spin off from #251. The ability to execute Python functions to convert and split columns should be part of the library too, not just the CLI.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/302/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 957383814,MDU6SXNzdWU5NTczODM4MTQ=,301,insert-files should get a --silent option,9599,simonw,closed,0,,,,,0,2021-08-01T04:11:03Z,2021-08-02T19:12:21Z,2021-08-02T19:12:21Z,OWNER,,"The new `sqlite-utils convert` command I'm adding in #251 will have a `--silent` option for turning off the progress bars. 
The only other command that has progress bars right now is `insert-files` so it should get this option too, for consistency.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/301/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 957347860,MDU6SXNzdWU5NTczNDc4NjA=,1412,Mention WAL mode in the documentation (plus backup tips),9599,simonw,open,0,,,,,0,2021-08-01T00:27:11Z,2021-08-01T00:27:11Z,,OWNER,,"This is useful for people who are deploying Datasette with any write functionality, especially if they might be using something like EFS. I can add a section about this to the bottom of https://docs.datasette.io/en/stable/deploying.html and then link to that section from both https://docs.datasette.io/en/stable/sql_queries.html#writable-canned-queries and https://docs.datasette.io/en/stable/internals.html#await-db-execute-write-sql-params-none-block-false Also useful: mention that just copying a SQLite database file while it is being written to may not get a consistent file, so tell people to use one of the SQLite backup mechanisms instead.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1412/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 957345476,MDU6SXNzdWU5NTczNDU0NzY=,1411,Canned query ?sql= is pointlessly echoed in query string starting from hidden mode,9599,simonw,closed,0,,,,,1,2021-08-01T00:17:13Z,2021-08-01T03:27:30Z,2021-08-01T00:58:17Z,OWNER,,"Example: https://latest.datasette.io/fixtures/neighborhood_search?text=cork&_hide_sql=1 Submitting that form again results in this: https://latest.datasette.io/fixtures/neighborhood_search?sql=%0D%0Aselect+neighborhood%2C+facet_cities.name%2C+state%0D%0Afrom+facetable%0D%0A++++join+facet_cities%0D%0A++++++++on+facetable.city_id+%3D+facet_cities.id%0D%0Awhere+neighborhood+like+%27%25%27+%7C%7C+%3Atext+%7C%7C+%27%25%27%0D%0Aorder+by+neighborhood%3B%0D%0A&_hide_sql=1&text=cork Because the HTML on https://latest.datasette.io/fixtures/neighborhood_search?text=cork&_hide_sql=1 includes this: ```html
Custom SQL query returning 1 row (show)
```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1411/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 957315684,MDU6SXNzdWU5NTczMTU2ODQ=,1410,Rename settings to `default_allow_facet` and `default_allow_download` and `default_allow_csv_stream`,9599,simonw,open,0,,,3268330,Datasette 1.0,0,2021-07-31T20:27:12Z,2021-07-31T20:27:49Z,,OWNER,,"> If I was prone to over-thinking (which I am) I'd note that `allow_facet` and `allow_download` and `allow_csv_stream` are all settings that do NOT have an equivalent in the newer permissions system, which is itself a little weird and inconsistent. > > So maybe there's a future task where I introduce those as both permissions and metadata `""allow_x""` blocks, then rename the settings themselves to be called `default_allow_facet` and `default_allow_download` and `default_allow_csv_stream`. > > If I was going to do that I should get it in before Datasette 1.0. _Originally posted by @simonw in https://github.com/simonw/datasette/issues/1409#issuecomment-890400425_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1410/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 957310278,MDU6SXNzdWU5NTczMTAyNzg=,1409,`default_allow_sql` setting (a re-imagining of the old `allow_sql` setting),9599,simonw,closed,0,,,3268330,Datasette 1.0,10,2021-07-31T19:48:56Z,2023-01-07T18:06:01Z,2023-01-05T00:51:31Z,OWNER,,"In 49d6d2f7b0f6cb02e25022e1c9403811f1fa0a7c as part of #813 I removed the `allow_sql` setting - on the basis that users could disable the ability to execute custom SQL queries using the new permission system instead. I don't think this was the right decision. Disabling custom SQL is an important security capability, and explaining how to do it using permissions is significantly more complex than letting people know they can add `--setting allow_sql off`. So I want to bring that setting back - maybe with a different, better name - and have it modify the default for that option if the permissions system doesn't have an opinion. That way people can still use the setting but then use permissions to allow specific signed-in users access to execute SQL.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1409/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 957302085,MDU6SXNzdWU5NTczMDIwODU=,1408,"Review places in codebase that use os.chdir(), in particularly relating to tests",9599,simonw,open,0,,,,,2,2021-07-31T18:57:06Z,2021-07-31T19:00:32Z,,OWNER,,"> To clarify: the core problem here is that an error is thrown any time you call `os.getcwd()` but the directory you are currently in has been deleted. > > `runner.isolated_filesystem()` assumes that the current directory in has not been deleted. But the various temporary directory utilities in `pytest` work by creating directories and then deleting them. > > Maybe there's a larger problem here that I play a bit fast and loose with `os.chdir()` in both the test suite and in various lines of code in Datasette itself (in particular in the publish commands)? 
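A minimal reproduction of that failure mode, as a sketch (Linux/macOS):

```python
import os
import shutil
import tempfile

# Make a temporary directory the cwd, then delete it out from under ourselves
doomed = tempfile.mkdtemp()
os.chdir(doomed)
shutil.rmtree(doomed)

os.getcwd()  # FileNotFoundError: [Errno 2] No such file or directory
```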
_Originally posted by @simonw in https://github.com/simonw/datasette/issues/1406#issuecomment-890390198_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1408/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 957298475,MDU6SXNzdWU5NTcyOTg0NzU=,1407,OSError: AF_UNIX path too long in ds_unix_domain_socket_server,9599,simonw,closed,0,,,,,2,2021-07-31T18:36:06Z,2021-07-31T19:03:44Z,2021-07-31T19:03:44Z,OWNER,,"Got this exception while working on #1406. ``` @pytest.fixture(scope=""session"") def ds_unix_domain_socket_server(tmp_path_factory): socket_folder = tmp_path_factory.mktemp(""uds"") uds = str(socket_folder / ""datasette.sock"") ds_proc = subprocess.Popen( [""datasette"", ""--memory"", ""--uds"", uds], stdout=subprocess.PIPE, stderr=subprocess.STDOUT, cwd=tempfile.gettempdir(), ) # Give the server time to start time.sleep(1.5) # Check it started successfully > assert not ds_proc.poll(), ds_proc.stdout.read().decode(""utf-8"") E AssertionError: INFO: Started server process [48453] E INFO: Waiting for application startup. E INFO: Application startup complete. E Traceback (most recent call last): E File ""/Users/simon/.local/share/virtualenvs/datasette-AWNrQs95/bin/datasette"", line 33, in E sys.exit(load_entry_point('datasette', 'console_scripts', 'datasette')()) E File ""/Users/simon/.local/share/virtualenvs/datasette-AWNrQs95/lib/python3.8/site-packages/click/core.py"", line 1137, in __call__ E return self.main(*args, **kwargs) E File ""/Users/simon/.local/share/virtualenvs/datasette-AWNrQs95/lib/python3.8/site-packages/click/core.py"", line 1062, in main E rv = self.invoke(ctx) E File ""/Users/simon/.local/share/virtualenvs/datasette-AWNrQs95/lib/python3.8/site-packages/click/core.py"", line 1668, in invoke E return _process_result(sub_ctx.command.invoke(sub_ctx)) E File ""/Users/simon/.local/share/virtualenvs/datasette-AWNrQs95/lib/python3.8/site-packages/click/core.py"", line 1404, in invoke E return ctx.invoke(self.callback, **ctx.params) E File ""/Users/simon/.local/share/virtualenvs/datasette-AWNrQs95/lib/python3.8/site-packages/click/core.py"", line 763, in invoke E return __callback(*args, **kwargs) E File ""/Users/simon/Dropbox/Development/datasette/datasette/cli.py"", line 583, in serve E uvicorn.run(ds.app(), **uvicorn_kwargs) E File ""/Users/simon/.local/share/virtualenvs/datasette-AWNrQs95/lib/python3.8/site-packages/uvicorn/main.py"", line 393, in run E server.run() E File ""/Users/simon/.local/share/virtualenvs/datasette-AWNrQs95/lib/python3.8/site-packages/uvicorn/server.py"", line 50, in run E loop.run_until_complete(self.serve(sockets=sockets)) E File ""/Users/simon/.pyenv/versions/3.8.2/lib/python3.8/asyncio/base_events.py"", line 616, in run_until_complete E return future.result() E File ""/Users/simon/.local/share/virtualenvs/datasette-AWNrQs95/lib/python3.8/site-packages/uvicorn/server.py"", line 67, in serve E await self.startup(sockets=sockets) E File ""/Users/simon/.local/share/virtualenvs/datasette-AWNrQs95/lib/python3.8/site-packages/uvicorn/server.py"", line 133, in startup E server = await asyncio.start_unix_server( E File ""/Users/simon/.pyenv/versions/3.8.2/lib/python3.8/asyncio/streams.py"", line 132, in start_unix_server E return await loop.create_unix_server(factory, path, **kwds) E File ""/Users/simon/.pyenv/versions/3.8.2/lib/python3.8/asyncio/unix_events.py"", line 296, in create_unix_server E sock.bind(path) E 
OSError: AF_UNIX path too long E E assert not 1 E + where 1 = >() E + where > = .poll ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1407/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 956832836,MDU6SXNzdWU5NTY4MzI4MzY=,300,Returning underlying cause for User Defined Functions ,71236,wsargent,closed,0,,,,,1,2021-07-30T15:08:21Z,2021-08-02T21:53:50Z,2021-08-02T21:53:50Z,NONE,,"The sqlite3 client takes user defined functions and replaces the text with ""user-defined function raised exception`"" so it's not apparent what's gone wrong: ``` Unexpected error: user-defined function raised exception ``` As mentioned in https://code.djangoproject.com/ticket/29500 and https://stackoverflow.com/questions/45824209/how-to-get-an-error-kind-from-sqlite-create-function/45834923#45834923 the workaround for this is to enable callback tracebacks: ``` sqlite3.enable_callback_tracebacks(True) ``` It would be nice if https://sqlite-utils.datasette.io/en/stable/python-api.html#registering-custom-sql-functions either included a reference to `enable_callback_tracebacks` or if registering a user defined function set this flag automatically.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/300/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 956303470,MDU6SXNzdWU5NTYzMDM0NzA=,1406,Tests failing with FileNotFoundError in runner.isolated_filesystem,9599,simonw,closed,0,,,,,8,2021-07-30T00:39:00Z,2021-07-31T18:56:35Z,2021-07-31T18:56:35Z,OWNER,,"e.g. https://github.com/simonw/datasette/runs/3197141955 I've seen this error before, but I don't yet have a good workaround for it. ``` @contextlib.contextmanager def isolated_filesystem( self, temp_dir: t.Optional[t.Union[str, os.PathLike]] = None ) -> t.Iterator[str]: """"""A context manager that creates a temporary directory and changes the current working directory to it. This isolates tests that affect the contents of the CWD to prevent them from interfering with each other. :param temp_dir: Create the temporary directory under this directory. If given, the created directory is not removed when exiting. .. versionchanged:: 8.0 Added the ``temp_dir`` parameter. """""" > cwd = os.getcwd() E FileNotFoundError: [Errno 2] No such file or directory /opt/hostedtoolcache/Python/3.6.14/x64/lib/python3.6/site-packages/click/testing.py:466: FileNotFoundError =========================== short test summary info ============================ FAILED tests/test_publish_cloudrun.py::test_publish_cloudrun_apt_get_install FAILED tests/test_publish_cloudrun.py::test_publish_cloudrun_extra_options[---setting force_https_urls on] FAILED tests/test_publish_cloudrun.py::test_publish_cloudrun_extra_options[--setting base_url /foo---setting base_url /foo --setting force_https_urls on] FAILED tests/test_publish_cloudrun.py::test_publish_cloudrun_extra_options[--setting force_https_urls off---setting force_https_urls off] FAILED tests/test_publish_heroku.py::test_publish_heroku_requires_heroku - Fi... FAILED tests/test_publish_heroku.py::test_publish_heroku_installs_plugin - Fi... FAILED tests/test_publish_heroku.py::test_publish_heroku - FileNotFoundError:... FAILED tests/test_publish_heroku.py::test_publish_heroku_plugin_secrets - Fil... 
================== 8 failed, 920 passed in 188.22s (0:03:08) =================== ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1406/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 955316250,MDU6SXNzdWU5NTUzMTYyNTA=,1405,utils.parse_metadata() should be a documented internal function,9599,simonw,closed,0,,,,,3,2021-07-28T23:51:39Z,2021-07-29T23:33:30Z,2021-07-29T23:30:24Z,OWNER,,Because it's used by this plugin: https://github.com/simonw/datasette-remote-metadata,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1405/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 953352015,MDU6SXNzdWU5NTMzNTIwMTU=,1404,`register_routes()` hook should take `datasette` argument,9599,simonw,closed,0,,,,,1,2021-07-26T23:00:33Z,2021-07-26T23:27:07Z,2021-07-26T23:26:00Z,OWNER,,Currently that plugin hook takes no arguments at all. This means it's not possible to conditionally register routes based on Datasette plugin configuration.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1404/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 953218043,MDU6SXNzdWU5NTMyMTgwNDM=,1403,Labels explaining what hidden tables are for,9599,simonw,open,0,,,,,0,2021-07-26T19:29:22Z,2022-03-21T22:20:37Z,,OWNER,,"A reasonable question: ""What are those hidden tables for?"" This could be answered by adding a small piece of explanatory text to each table - based on if it's related to FTS or to SpatiaLite or configured to be hidden for some other reason.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1403/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 952189173,MDU6SXNzdWU5NTIxODkxNzM=,3,Use HN algolia endpoint to retrieve trees,9599,simonw,open,0,,,,,3,2021-07-25T03:35:27Z,2021-07-25T18:41:17Z,,MEMBER,,"The `trees` command currently has to make a request for every single comment. Algolia have an endpoint that bundles the entire thread together into a single request. `https://hn.algolia.com/api/v1/items/ID` Here's an example that loads quickly, with about 50 comments: https://hn.algolia.com/api/v1/items/27941108 It doesn't appear to use pagination at all - if a thread is big then the response is big. I ran this search to find some stories with more than 1000 comments: https://hn.algolia.com/api/v1/search?tags=story&numericFilters=num_comments%3E=1000 Here's one: https://news.ycombinator.com/item?id=25015967 with 4759 comments. Hitting the API takes 41s and returns 3.7 MB of JSON! 
``` wget 'https://hn.algolia.com/api/v1/items/25015967' 0.03s user 0.04s system 0% cpu 41.368 total /tmp % ls -lah 25015967 -rw-r--r-- 1 simon wheel 3.7M Jul 24 20:31 25015967 ```",248903544,hacker-news-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/hacker-news-to-sqlite/issues/3/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 952179830,MDU6SXNzdWU5NTIxNzk4MzA=,2,Command for fetching Hacker News threads from the search API,9599,simonw,open,0,,,,,4,2021-07-25T02:00:45Z,2021-07-25T03:12:57Z,,MEMBER,,"I want to be able to fetch every item for a domain, e.g. https://news.ycombinator.com/from?site=simonwillison.net",248903544,hacker-news-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/hacker-news-to-sqlite/issues/2/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 952154468,MDU6SXNzdWU5NTIxNTQ0Njg=,299,Ability to see just specific table schemas with `sqlite-utils schema`,9599,simonw,closed,0,,,,,1,2021-07-24T22:00:05Z,2021-07-24T22:12:01Z,2021-07-24T22:08:46Z,OWNER,,"It currently accepts no arguments. Allowing for optional arguments specifying tables would be useful: sqlite-utils schema fixtures.db facetable searchable ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/299/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 951817328,MDU6SXNzdWU5NTE4MTczMjg=,12,403 when getting token,285352,treyhunner,open,0,,,,,1,2021-07-23T18:43:26Z,2021-10-12T18:31:57Z,,NONE,,"I tried to use https://your-foursquare-oauth-token.glitch.me/ to get my Swarm auth token and got a 403 after I clicked the Allow button: ![image](https://user-images.githubusercontent.com/285352/126826478-60e53614-263d-40bb-9f1d-c1a676644eb0.png) I'm not sure if this is the right repo to report this in",205429375,swarm-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/swarm-to-sqlite/issues/12/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 951581763,MDU6SXNzdWU5NTE1ODE3NjM=,298,Read lines with JSON object,2172260,qqilihq,closed,0,,,,,2,2021-07-23T13:28:52Z,2021-08-03T06:50:47Z,2021-08-02T21:55:16Z,NONE,,"I found this posted on HN a while ago and love it -- thank you! As a minor improvement, it would be great to have the ability to parse a file with line-separated JSON objects. 
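That is, input where each line is its own object, e.g. `{""id"": 1, ""name"": ""one""}`. A sketch of what parsing that could look like (not the library's actual code):

```python
import json

# Read newline-delimited JSON: one object per line, blank lines skipped
def rows_from_ndjson(fp):
    return [json.loads(line) for line in fp if line.strip()]
```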
Currently the parser obviously requires an array wrapping all these objects.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/298/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 951185411,MDU6SXNzdWU5NTExODU0MTE=,1402,feature request: social meta tags,536941,fgregg,open,0,,,,,2,2021-07-23T01:57:23Z,2021-07-26T19:31:41Z,,CONTRIBUTOR,,"it would be very nice if the twitter, slack, and other social media could make rich cards when people post a link to a datasette instance ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1402/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 950664971,MDU6SXNzdWU5NTA2NjQ5NzE=,1401,unordered list is not rendering bullet points in description_html on database page,536941,fgregg,open,0,,,,,2,2021-07-22T13:24:18Z,2021-10-23T13:09:10Z,,CONTRIBUTOR,,"Thanks for this tremendous package, @simonw! In the `description_html` for a database, I [have an unordered list](https://github.com/labordata/warehouse/blob/fcea4502e5b615b0eb3e0bdcb45ec634abe20bb6/warehouse_metadata.yml#L19-L22). However, on the database page on the deployed site, it is not rendering this as a bulleted list. ![Screenshot 2021-07-22 at 09-21-51 nlrb](https://user-images.githubusercontent.com/536941/126645923-2777b7f1-fd4c-4d2d-af70-a35e49a07675.png) Page here: https://labordata-warehouse.herokuapp.com/nlrb-9da4ae5 The documentation gives an [example of using an unordered list](https://docs.datasette.io/en/stable/metadata.html#using-yaml-for-metadata) in a `description_html`, so I expected this will work.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1401/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 947044667,MDU6SXNzdWU5NDcwNDQ2Njc=,1398,Documentation on using Datasette as a library,9599,simonw,open,0,,,,,1,2021-07-18T14:15:27Z,2021-07-30T03:21:49Z,,OWNER,,"Instantiating `Datasette()` directly is an increasingly interesting pattern. I do it in tests all the time, but thanks to `datasette.client` there are plenty of neat things you can do with it in a library context. Maybe support `from datasette import Datasette` for this.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1398/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 944903881,MDU6SXNzdWU5NDQ5MDM4ODE=,1396,"""invalid reference format"" publishing Docker image",9599,simonw,closed,0,,,,,9,2021-07-15T01:02:07Z,2021-10-19T08:10:26Z,2021-07-15T19:47:25Z,OWNER,,"Error ocurred at the end of the publish flow for Datasette 0.58: https://github.com/simonw/datasette/runs/3072216421 ``` Removing intermediate container cf32b9440907 ---> dfd6985b2afc Successfully built dfd6985b2afc Successfully tagged ***/datasette:0.58 invalid reference format Error: Process completed with exit code 1. 
```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1396/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 944870799,MDU6SXNzdWU5NDQ4NzA3OTk=,1394,Big performance boost on faceting: skip the inner order by,9599,simonw,closed,0,,,,,4,2021-07-14T23:32:29Z,2021-07-16T02:23:32Z,2021-07-15T00:05:50Z,OWNER,,"I just noticed something that could make for a huge performance improvement in faceting. The default query used by Datasette when faceting looks like this: ```sql select country_long, count(*) from ( select * from [global-power-plants] order by rowid ) where country_long is not null group by country_long order by count(*) desc ``` Here it takes 53ms: https://global-power-plants.datasettes.com/global-power-plants?sql=select%0D%0A++country_long%2C%0D%0A++count%28*%29%0D%0Afrom+%28%0D%0A++select+*+from+%5Bglobal-power-plants%5D+order+by+rowid%0D%0A%29%0D%0Awhere%0D%0A++country_long+is+not+null%0D%0Agroup+by%0D%0A++country_long%0D%0Aorder+by%0D%0A++count%28*%29+desc Note that there's a `order by rowid` in there which isn't necessary - the order on that inner query doesn't matter since we're grouping and counting. I had assumed SQLite would optimize this away - but it turns out it doesn't! Consider this version of the query, with that pointless order by removed: ``` select country_long, count(*) from ( select * from [global-power-plants] ) where country_long is not null group by country_long order by count(*) desc ``` https://global-power-plants.datasettes.com/global-power-plants?sql=select%0D%0A++country_long%2C%0D%0A++count%28*%29%0D%0Afrom+%28%0D%0A++select+*+from+%5Bglobal-power-plants%5D%0D%0A%29%0D%0Awhere%0D%0A++country_long+is+not+null%0D%0Agroup+by%0D%0A++country_long%0D%0Aorder+by%0D%0A++count%28*%29+desc runs in 7.2ms! I tried this optimization on a table with 2.5m rows in it - without the optimization it took 5 seconds, with the optimization it took 450ms. So this is a very significant improvement!",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1394/reactions"", ""total_count"": 2, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 1, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 944846776,MDU6SXNzdWU5NDQ4NDY3NzY=,297,Option for importing CSV data using the SQLite .import mechanism,9599,simonw,open,0,,,,,23,2021-07-14T22:36:41Z,2023-09-22T20:49:52Z,,OWNER,,"As seen in https://til.simonwillison.net/sqlite/import-csv - `.mode csv` and then `.import school.csv schools` is hugely faster than importing via `sqlite-utils insert` and doing the work in Python - but it can only be implemented by shelling out to the `sqlite3` CLI tool, it's not functionality that is exposed to the Python `sqlite3` module. 
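A sketch of that shell-out from Python (assuming the `sqlite3` binary is on the path):

```python
import subprocess

# Feed dot-commands to the sqlite3 CLI via stdin; .mode and .import are
# features of the binary, not of Python's sqlite3 module
subprocess.run(
    ['sqlite3', 'school.db'],
    input='.mode csv\n.import school.csv schools\n',
    text=True,
    check=True,
)
```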
An option to use this would be useful - maybe something like this: sqlite-utils insert blah.db blah blah.csv --fast",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/297/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 944326512,MDU6SXNzdWU5NDQzMjY1MTI=,296,"`table.search(..., quote=True)` parameter and `sqlite-utils search --quote` option",32427188,deafmute1,closed,0,,,,,6,2021-07-14T11:26:47Z,2021-08-18T20:13:12Z,2021-08-18T20:10:48Z,NONE,,"Hi, Recently got this error: ``` Traceback (most recent call last): File """", line 1, in File ""/home/ethan/git/music-metadata-indexer/src/mmindexer/__init__.py"", line 38, in start(""/home/ethan/git/music-metadata-indexer/sample"", ""/home/ethan/git/music-metadata-indexer/test.db"") File ""/home/ethan/git/music-metadata-indexer/src/mmindexer/__init__.py"", line 23, in start scanner.build_database() File ""/home/ethan/git/music-metadata-indexer/src/mmindexer/scan.py"", line 79, in build_database _import_song(self.db, Path(dirpath).joinpath(f), self.logger) File ""/home/ethan/git/music-metadata-indexer/src/mmindexer/scan.py"", line 23, in _import_song db.add_song(filepath) File ""/home/ethan/git/music-metadata-indexer/src/mmindexer/index.py"", line 166, in add_song for match in self.search(""albums"", album): File ""/home/ethan/git/music-metadata-indexer/env/lib/python3.9/site-packages/sqlite_utils/db.py"", line 1625, in search cursor = self.db.execute( File ""/home/ethan/git/music-metadata-indexer/env/lib/python3.9/site-packages/sqlite_utils/db.py"", line 243, in execute return self.conn.execute(sql, parameters) sqlite3.OperationalError: fts5: syntax error near ""."" ``` So, the error seems to suggest there was a ""."" character somewhere in the SQL command that was causing the error. I did a little digging and found this in the docs: https://www.sqlite.org/fts5.html#fts5_strings. ""."" is one of the many prohibited characters. My solution was to just strip these out of the query using this line `query = query.translate({e: None for e in itertools.chain(range(0,26), range(27, 48), range(58,65), range(91,95), [96], range(123,128))})` Perhaps this could be included into the `table.search()` function? 
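An alternative sketch (not the library's actual behavior): quote each token rather than stripping characters, since FTS5 treats a double-quoted string as a literal:

```python
# Wrap every token in double quotes so punctuation such as . loses its
# FTS5 query-syntax meaning; balance any dangling user-supplied quote first
def quote_fts_query(query):
    if query.count('""') % 2:
        query += '""'
    return ' '.join('""{}""'.format(token) for token in query.split())
```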
",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/296/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 941300946,MDU6SXNzdWU5NDEzMDA5NDY=,1391,Stop using generated columns in fixtures.db,9599,simonw,closed,0,,,,,5,2021-07-10T18:26:11Z,2021-07-10T19:26:58Z,2021-07-10T19:26:00Z,OWNER,,"Refs #1376 - but I also keep running into this myself, where I try to run something against `fixtures.db` and get this confusing error: sqlite3.DatabaseError: malformed database schema (generated_columns) - near ""AS"": syntax error I'm going to stop using generated columns in `fixtures.db` and instead dynamically generate the generated column table for the duration of the relevant test.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1391/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 940891698,MDU6SXNzdWU5NDA4OTE2OTg=,1390,Mention restarting systemd in documentation,9599,simonw,closed,0,,,,,2,2021-07-09T16:05:15Z,2021-07-09T16:32:57Z,2021-07-09T16:32:33Z,OWNER,,"https://docs.datasette.io/en/stable/deploying.html#running-datasette-using-systemd Need to clarify that if you add a new database or change metadata you need to restart systemd.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1390/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 940077168,MDU6SXNzdWU5NDAwNzcxNjg=,1389,"""searchmode"": ""raw"" in table metadata",9599,simonw,closed,0,,,,,6,2021-07-08T17:32:10Z,2021-07-10T18:33:13Z,2021-07-10T18:33:13Z,OWNER,,"> http://localhost:8001/index/summary?_search=language%3Aeng&_sort=title&_searchmode=raw > > But I'm not able to manage it in the metadata file. 
Here is mine (note that the sort column is taken into account) > Here it is: > > ``` > { > ""databases"": { > ""index"": { > ""tables"": { > ""summary"": { > ""sort"": ""title"", > ""searchmode"": ""raw"" > } > } > } > } > } _Originally posted by @Krazybug in https://github.com/simonw/datasette/issues/759#issuecomment-624860451_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1389/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 939051549,MDU6SXNzdWU5MzkwNTE1NDk=,1388,Serve using UNIX domain socket,80737,aslakr,closed,0,,,,,13,2021-07-07T16:13:37Z,2021-07-11T01:18:38Z,2021-07-10T23:38:32Z,CONTRIBUTOR,,Would it be possible to make datasette serve using UNIX domain socket similar to Uvicorn's ``--uds``?,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1388/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 935930820,MDU6SXNzdWU5MzU5MzA4MjA=,1387,absolute_url() behind a proxy assembles incorrect http://127.0.0.1:8001/ URLs,9599,simonw,closed,0,,,,,8,2021-07-02T16:58:25Z,2021-07-02T17:58:23Z,2021-07-02T17:33:05Z,OWNER,,"Reported in the wild on https://ilsweb.cincinnatilibrary.org/collection-analysis/current_collection-3d4a4b7/bib?_facet=bib_level_callnumber - the ""next page"" link links to https://127.0.0.1:8010/collection-analysis/current_collection-3d4a4b7/bib?_facet=bib_level_callnumber&_next=100 That installation uses `""base_url"": ""/collection-analysis/""` Weirdly all of the other links on that page - to facet results, sort orders, row permalinks etc - work fine. It's JUST the `next_url` one that is broken. Also broken in their JSON: https://ilsweb.cincinnatilibrary.org/collection-analysis/current_collection-3d4a4b7/bib.json?_size=1 returns ```json ""suggested_facets"": [], ""next"": ""1"", ""next_url"": ""https://127.0.0.1:8010/collection-analysis/current_collection-3d4a4b7/bib.json?_size=1&_next=1"", ""private"": false, ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1387/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 934123448,MDU6SXNzdWU5MzQxMjM0NDg=,295,Insert with --tsv and --no-headers give error about --nl arguments,7288187,davidscotson,closed,0,,,,,1,2021-06-30T21:01:01Z,2021-08-18T20:19:04Z,2021-08-18T20:18:57Z,NONE,,"Not quite sure if this is a bug, or just an assumption I made but I thought `--tsv` and `--no-headers` would work together when inserting from a file, and currently they seem not to (sqlite-utils, version 3.12, installed on Mac OS X via brew) Instead it says: `Error: Use just one of --nl, --csv or --tsv` As if it has interpreted the --no-headers as --nl. The --help does specifically say CSV: `--no-headers CSV file has no header row` And this heading in the documentation also only refers to CSV, but the text does mention TSV in passing, and I'd generally expect them to behave the same in most cases. 
https://sqlite-utils.datasette.io/en/stable/cli.html#csv-files-without-a-header-row",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/295/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 931752773,MDU6SXNzdWU5MzE3NTI3NzM=,294,Add a `sqlite-utils memory` example to the README,9599,simonw,closed,0,,,,,0,2021-06-28T16:35:59Z,2021-08-18T21:40:03Z,2021-08-18T21:40:03Z,OWNER,,,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/294/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 930946817,MDU6SXNzdWU5MzA5NDY4MTc=,7,KeyError: 'accuracy' when processing Location History,403152,davidwilemski,open,0,,,,,0,2021-06-27T14:39:43Z,2021-06-27T14:39:43Z,,NONE,,"I'm new to both the dogsheep tools and datasette but have been experimenting a bit the last few days and these are really cool tools! I encountered a problem running my Google location history through this tool running the latest release in a docker container: ``` Traceback (most recent call last): File ""/usr/local/bin/google-takeout-to-sqlite"", line 8, in sys.exit(cli()) File ""/usr/local/lib/python3.9/site-packages/click/core.py"", line 829, in __call__ return self.main(*args, **kwargs) File ""/usr/local/lib/python3.9/site-packages/click/core.py"", line 782, in main rv = self.invoke(ctx) File ""/usr/local/lib/python3.9/site-packages/click/core.py"", line 1259, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/usr/local/lib/python3.9/site-packages/click/core.py"", line 1066, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/usr/local/lib/python3.9/site-packages/click/core.py"", line 610, in invoke return callback(*args, **kwargs) File ""/usr/local/lib/python3.9/site-packages/google_takeout_to_sqlite/cli.py"", line 49, in my_activity utils.save_location_history(db, zf) File ""/usr/local/lib/python3.9/site-packages/google_takeout_to_sqlite/utils.py"", line 27, in save_location_history db[""location_history""].upsert_all( File ""/usr/local/lib/python3.9/site-packages/sqlite_utils/db.py"", line 1105, in upsert_all return self.insert_all( File ""/usr/local/lib/python3.9/site-packages/sqlite_utils/db.py"", line 990, in insert_all chunk = list(chunk) File ""/usr/local/lib/python3.9/site-packages/google_takeout_to_sqlite/utils.py"", line 33, in ""accuracy"": row[""accuracy""], KeyError: 'accuracy' ``` It looks like the tool assumes the `accuracy` key will be in every location history entry. My first attempt at a local patch to get myself going was to convert accessing the `accuracy` key to a `.get` instead to hopefully make the row nullable but I wasn't quite sure what `sqlite_utils` would do there. That did work in that the import happened and so I was going to propose a patch that made that change but in updating the existing test to include an entry with a missing accuracy entry, I noticed the expected type of the field appeared to be changing to a string in the test (and from a quick scan through the sqlite_utils code, probably TEXT in the database). Given this change in column type, it seemed that opening an issue first before proposing a fix seemed warranted. It seems the schema would need to be explicitly specified if you wanted a nullable integer column. 
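For reference, the `.get` change described above amounts to something like this (a sketch; the surrounding field names are illustrative):

```python
def location_row(row):
    # dict.get() returns None for a missing key - stored as NULL in SQLite -
    # instead of raising KeyError the way row[""accuracy""] does
    return {
        ""latitudeE7"": row[""latitudeE7""],
        ""longitudeE7"": row[""longitudeE7""],
        ""accuracy"": row.get(""accuracy""),
    }
```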
Now that I've done a successful import run using my initial fix of calling `.get` on the row dict, I can see with datasette that I only have 7 data points (out of ~250k) that have a null accuracy column. They are all from 2011-2012 in an import that includes points spanning ~2010-2016 so perhaps another approach might be to filter those entries out during import if it really is that infrequent? I'm happy to provide a PR for a fix but figured I'd ask about which direction is preferred first.",206649770,google-takeout-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/google-takeout-to-sqlite/issues/7/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 930807135,MDU6SXNzdWU5MzA4MDcxMzU=,1384,Plugin hook for dynamic metadata,9599,simonw,open,0,,,,,22,2021-06-26T22:36:03Z,2022-03-14T00:36:42Z,,OWNER,,"@brandonrobertz contributed an implementation of this in PR #1368, which I just merged. Opening this ticket to track further work on this before it goes out in a Datasette release (likely preceded by an alpha).",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1384/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 927789811,MDU6SXNzdWU5Mjc3ODk4MTE=,292,Add contributing documentation,9599,simonw,closed,0,,,,,0,2021-06-23T02:13:05Z,2021-06-25T17:53:51Z,2021-06-25T17:53:51Z,OWNER,,Like https://docs.datasette.io/en/latest/contributing.html (but simpler) - should cover how to run `black` and `flake8` and `mypy` and how to run the tests.,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/292/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 927766296,MDU6SXNzdWU5Mjc3NjYyOTY=,291,Adopt flake8,9599,simonw,closed,0,,,,,2,2021-06-23T01:19:37Z,2021-06-24T17:50:27Z,2021-06-24T17:50:27Z,OWNER,,,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/291/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 927385540,MDU6SXNzdWU5MjczODU1NDA=,8,any guidance / experience on imessage-to-sqlite ?,2675621,Casyfill,open,0,,,,,0,2021-06-22T15:46:16Z,2021-06-22T15:46:16Z,,NONE,,,214746582,dogsheep.github.io,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep.github.io/issues/8/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 926777310,MDU6SXNzdWU5MjY3NzczMTA=,290,`db.query()` method (renamed `db.execute_returning_dicts()`),9599,simonw,closed,0,,,,,6,2021-06-22T03:03:54Z,2021-06-24T23:17:38Z,2021-06-24T22:54:43Z,OWNER,,"Most of this library deals with lists of Python dictionaries - `.insert_all()`, `.rows`, `.rows_where()`, `.search()`. The `db.execute()` method is the only thing that returns a `sqlite3` cursor. There is a clumsily named `db.execute_returning_dicts(sql)` method but it's not currently mentioned in the documentation. 
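The issue title suggests `db.query()` as the new name; a minimal sketch of that shape (assumed, not the actual implementation):

```python
import sqlite3

# Return rows as dictionaries keyed by column name instead of a raw cursor
def query(db: sqlite3.Connection, sql, params=None):
    cursor = db.execute(sql, params or [])
    columns = [column[0] for column in cursor.description]
    return [dict(zip(columns, row)) for row in cursor]
```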
It needs a better name, and needs to be properly documented.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/290/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 925677191,MDU6SXNzdWU5MjU2NzcxOTE=,289,Mypy fixes for rows_from_file(),857609,adamchainz,closed,0,,,,,3,2021-06-20T20:34:59Z,2021-06-22T18:44:36Z,2021-06-22T18:13:26Z,NONE,,"Following https://github.com/simonw/sqlite-utils/issues/279#issuecomment-864328927 You had two mypy errors. The first: > sqlite_utils/utils.py:157: error: Argument 1 to ""BufferedReader"" has incompatible type ""BinaryIO""; expected ""RawIOBase"" Looking at the `BufferedReader` docs, it seems to expect a `RawIOBase`, and this [has been copied into typeshed](https://github.com/python/typeshed/blob/9ec2f8712480c57353cea097a65d75a2c4ec1846/stdlib/io.pyi#L100). There may be scope to change how `BufferedReader` is documented and typed upstream, but for now it wouldn't be too bad to use a `typing.cast()`: ``` # Detect the format, then call this recursively buffered = io.BufferedReader( cast(io.RawIOBase, fp), # Undocumented BufferedReader support for BinaryIO buffer_size=4096, ) ``` The second error seems to be flagging a legitimate bug in your code: > sqlite_utils/utils.py:163: error: Argument 1 to ""decode"" of ""bytes"" has incompatible type ""Optional[str]""; expected ""str"" From your type hints, `encoding` may be `None`. In the CSV format block, you use `encoding or ""utf-8-sig""` to set a default, maybe that's desirable in this case too? ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/289/reactions"", ""total_count"": 1, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 1, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 925545468,MDU6SXNzdWU5MjU1NDU0Njg=,288,sqlite-utils memory blah.json --schema,9599,simonw,closed,0,,,,,0,2021-06-20T08:10:40Z,2021-06-20T18:26:21Z,2021-06-20T18:26:21Z,OWNER,,Like `--dump` but only outputs the schema - useful for understanding what you are about to run queries against.,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/288/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 925544070,MDU6SXNzdWU5MjU1NDQwNzA=,287,Update rowid examples in the docs,9599,simonw,closed,0,,,,,0,2021-06-20T08:03:00Z,2021-06-20T18:26:21Z,2021-06-20T18:26:21Z,OWNER,,Changed in #284 - a couple of examples need updating on https://github.com/simonw/sqlite-utils/blob/3.10/docs/cli.rst.,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/287/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 925491857,MDU6SXNzdWU5MjU0OTE4NTc=,1383,Improve test coverage for `inspect.py`,9599,simonw,open,0,,,,,0,2021-06-20T00:22:43Z,2021-06-20T00:22:49Z,,OWNER,,https://codecov.io/gh/simonw/datasette/src/main/datasette/inspect.py shows only 36% coverage for that module at the moment.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1383/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 
925487946,MDU6SXNzdWU5MjU0ODc5NDY=,286,Add installation instructions,9599,simonw,closed,0,,,,,1,2021-06-19T23:55:36Z,2021-06-20T18:47:13Z,2021-06-20T18:47:13Z,OWNER,,"`pip install sqlite-utils`, `pipx install sqlite-utils` and `brew install sqlite-utils`",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/286/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 925410305,MDU6SXNzdWU5MjU0MTAzMDU=,285,Introspection property for telling if a table is a rowid table,9599,simonw,closed,0,,,,,7,2021-06-19T14:56:16Z,2021-06-19T15:12:33Z,2021-06-19T15:12:33Z,OWNER,,_Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/284#issuecomment-864416785_,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/285/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 925406964,MDU6SXNzdWU5MjU0MDY5NjQ=,1382,Datasette with Glitch - is it possible to use CSV with ISO-8859-1 encoding?,23701514,reichaves,closed,0,,,,,1,2021-06-19T14:37:20Z,2021-06-20T00:21:02Z,2021-06-20T00:20:06Z,NONE,,"Hi Please, I used Remix on Glitch to create a project on Glitch and uploaded a CSV But it's a CSV with ISO-8859-1 encoding (https://en.wikipedia.org/wiki/ISO/IEC_8859-1) Is it possible for me to change the encoding to correctly visualize the data? Example: https://emphasized-carpal-pillow.glitch.me/data/Emendas Best",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1382/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 925320167,MDU6SXNzdWU5MjUzMjAxNjc=,284,.transform(types=) turns rowid into a concrete column,9599,simonw,closed,0,,,,,5,2021-06-19T05:25:27Z,2021-06-19T15:28:30Z,2021-06-19T15:28:30Z,OWNER,,"Noticed this in the tests for `sqlite-utils memory` in #282 - is it possible to fix this? 
https://github.com/simonw/sqlite-utils/commit/ec5174ed40fa283cb06f25ee0c0136297ec313ae",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/284/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 925319214,MDU6SXNzdWU5MjUzMTkyMTQ=,283,memory: Shouldn't detect types for JSON,9599,simonw,closed,0,,,,,1,2021-06-19T05:17:35Z,2021-06-19T14:52:48Z,2021-06-19T14:52:48Z,OWNER,,"https://github.com/simonw/sqlite-utils/blob/ec5174ed40fa283cb06f25ee0c0136297ec313ae/sqlite_utils/cli.py#L1244-L1251 This runs against JSON as well as CSV/TSV - which isn't necessary and in fact throws errors if there is any nested data.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/283/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 925305186,MDU6SXNzdWU5MjUzMDUxODY=,282,Automatic type detection for CSV data,9599,simonw,closed,0,,,,,4,2021-06-19T03:33:21Z,2021-06-19T04:42:03Z,2021-06-19T04:38:00Z,OWNER,,"I've touched on this before in #179 - but now that I've added `sqlite-utils memory` this is much more important - because unlike with `sqlite-utils insert` the in-memory command doesn't give you the opportunity to fix any types you imported from CSV, so queries like `select * from stdin where age > 3` are never going to work correctly against these temporary in-memory tables. Teaching `sqlite-utils insert` to detect types for columns in a CSV file would be a backwards-compatibility breaking change. Teaching `sqlite-utils memory` that trick would not be, since it hasn't been included in a release yet. It's a little inconsistent, but I'm going to have `sqlite-utils memory` default to detecting types while `sqlite-utils insert` does not. 
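The detection itself can stay simple; a sketch of the general idea (assumed logic, not the shipped implementation):

```python
# Pick the narrowest SQLite type that every non-empty string value fits
def detect_column_type(values):
    for caster, type_name in ((int, 'INTEGER'), (float, 'FLOAT')):
        try:
            for value in values:
                if value != '':
                    caster(value)
        except ValueError:
            continue
        return type_name
    return 'TEXT'
```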
In each case this can be controlled by a new command-line option: cat file.csv | sqlite-utils memory - --no-detect-types To opt-in for `sqlite-utils insert`: cat file.csv | sqlite-utils insert blah.db blah - --detect-types I'll have short options for these too: `-n` for `--no-detect-types` and `-d` for `--detect-types`.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/282/reactions"", ""total_count"": 1, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 1, ""rocket"": 0, ""eyes"": 0}",,completed 924992318,MDU6SXNzdWU5MjQ5OTIzMTg=,281,Mechanism for explicitly stating CSV or JSON or TSV for sqlite-utils memory,9599,simonw,closed,0,,,,,1,2021-06-18T15:04:53Z,2021-06-19T03:11:59Z,2021-06-19T03:11:59Z,OWNER,,"- Implement `filename.json:json` and `-:nl` and suchlike options for specifying the format rather than guessing it - see https://github.com/simonw/sqlite-utils/issues/272#issuecomment-861985944 Follows #272",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/281/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 924991194,MDU6SXNzdWU5MjQ5OTExOTQ=,280,Add --encoding option to sqlite-utils memory,9599,simonw,closed,0,,,,,0,2021-06-18T15:03:32Z,2021-06-18T15:29:46Z,2021-06-18T15:29:46Z,OWNER,,Follow-on from #272 - this will work like `--encoding` on `sqlite-utils insert` and will affect all CSV files processed by `sqlite-utils memory`.,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/280/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 924990677,MDU6SXNzdWU5MjQ5OTA2Nzc=,279,sqlite-utils memory should handle TSV and JSON in addition to CSV,9599,simonw,closed,0,,,,,7,2021-06-18T15:02:54Z,2021-06-19T03:11:59Z,2021-06-19T03:11:59Z,OWNER,,"- Use sniff to detect CSV or TSV (if `:tsv` or `:csv` was not specified) and delimiters Follow-on from #272",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/279/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 924748955,MDU6SXNzdWU5MjQ3NDg5NTU=,1380,Serve all db files in a folder,193463,stratosgear,open,0,,,,,5,2021-06-18T10:03:32Z,2021-11-13T08:09:11Z,,NONE,,"I tried to get the `serve` command to serve all the .db files in the `/mnt` folder but it seems that the server does not refresh the list of files. In more detail: * Starting datasette as a docker container with: ``` docker run -p 8001:8001 -v `pwd`:/mnt \ datasetteproject/datasette \ datasette -p 8001 -h 0.0.0.0 /mnt ``` * Datasette correctly serves all the *.db files found in the /mnt folder * When the server is running, if I copy a new file in the $PWD folder, Datasette does not seem to see the new files, forcing me to restart Docker. Is there an option/setting that I overlooked, or is this something missing? BTW, the `--reload` setting, although at first glance is what you think you need, does not seem to do anything with regard to seeing all *.db files. 
Thanks!",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1380/reactions"", ""total_count"": 2, ""+1"": 2, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 924203783,MDU6SXNzdWU5MjQyMDM3ODM=,1379,Idea: ?_end=1 option for streaming CSV responses,9599,simonw,open,0,,,,,0,2021-06-17T18:11:21Z,2021-06-17T18:11:30Z,,OWNER,,"As discussed in this thread: https://twitter.com/simonw/status/1405554676993433605 - one of the disadvantages of Datasette's streaming CSV feature is that it's hard to tell if you got the whole file or if the connection ended early - or if an error occurred. Idea: offer an optional `?_end=1` parameter which, if enabled, adds a single row to the end of the CSV file that looks like this: `END,,,,,,,,,` For however many columns the CSV file usually has.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1379/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 923697888,MDU6SXNzdWU5MjM2OTc4ODg=,278,"Support db as first parameter before subcommand, or as environment variable",601708,mcint,closed,0,,,,,3,2021-06-17T09:26:29Z,2021-06-20T22:39:57Z,2021-06-18T15:43:19Z,CONTRIBUTOR,,,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/278/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 923602693,MDU6SXNzdWU5MjM2MDI2OTM=,276,support small help flag -h,601708,mcint,closed,0,,,,,0,2021-06-17T07:59:31Z,2021-06-18T14:56:59Z,2021-06-18T14:56:59Z,CONTRIBUTOR,,,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/276/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 922955697,MDU6SXNzdWU5MjI5NTU2OTc=,275,Enable code coverage,9599,simonw,closed,0,,,,,1,2021-06-16T18:33:49Z,2021-06-17T00:12:12Z,2021-06-17T00:12:12Z,OWNER,,"https://app.codecov.io/gh/simonw/sqlite-utils Same mechanism as Datasette. 
Need to copy across the token from that page and add an equivalent of this workflow: https://github.com/simonw/datasette/blob/main/.github/workflows/test-coverage.yml",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/275/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 922832113,MDU6SXNzdWU5MjI4MzIxMTM=,274,sqlite-utils dump my.db command,9599,simonw,closed,0,,,,,0,2021-06-16T16:30:14Z,2021-06-16T23:51:54Z,2021-06-16T23:51:54Z,OWNER,,"Inspired by the `--dump` mechanism I added to `sqlite-utils memory` here: https://github.com/simonw/sqlite-utils/issues/272#issuecomment-862018937 > Can use `.iterdump()` to implement this: https://docs.python.org/3/library/sqlite3.html#sqlite3.Connection.iterdump > > Maybe instead (or as-well-as) offer `--dump` which dumps out the SQL from that.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/274/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 921878733,MDU6SXNzdWU5MjE4Nzg3MzM=,272,"Idea: import CSV to memory, run SQL, export in a single command",9599,simonw,closed,0,,,,,22,2021-06-15T23:02:48Z,2021-06-19T23:36:48Z,2021-06-18T15:05:03Z,OWNER,,"I quite often load a CSV file into a SQLite DB, then do stuff with it (like export results back out again as a new CSV) without any intention of keeping the CSV file around afterwards. What if `sqlite-utils` could do this for me? Something like this: sqlite-utils --csv blah.csv --csv baz.csv ""select * from blah join baz ..."" ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/272/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 920884085,MDU6SXNzdWU5MjA4ODQwODU=,1377,Mechanism for plugins to exclude certain paths from CSRF checks,9599,simonw,closed,0,,,,,3,2021-06-15T00:48:20Z,2021-06-23T22:51:33Z,2021-06-23T22:51:33Z,OWNER,,I need this for a plugin I'm building that offers a POST API.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1377/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 920636216,MDU6SXNzdWU5MjA2MzYyMTY=,64,"feature: support ""events""",231498,khimaros,open,0,,,,,5,2021-06-14T17:42:49Z,2021-06-15T00:48:37Z,,NONE,,"the GitHub API provides the ability to fetch all events for a given user, organization, or repository: https://docs.github.com/en/rest/reference/activity#list-events-for-the-authenticated-user this would allow users to export all of the issue comments, new issues, etc. that they created. 
something which is currently missing from the GitHub takeout exports.",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/64/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 919822817,MDU6SXNzdWU5MTk4MjI4MTc=,1376,Official Datasette Docker image should use SQLite >= 3.31.0 (for generated columns),1726460,jcgregorio,open,0,,,,,3,2021-06-13T15:25:51Z,2021-06-13T15:39:37Z,,NONE,,"Trying to run datasette via the Docker container doesn't seem to work: ``` $ docker run -p 8001:8001 -v `pwd`:/mnt datasetteproject/datasette datasette -p 8001 -h 0.0.0.0 /mnt/fixtures.db Traceback (most recent call last): File ""/usr/local/bin/datasette"", line 8, in sys.exit(cli()) File ""/usr/local/lib/python3.9/site-packages/click/core.py"", line 829, in __call__ return self.main(*args, **kwargs) File ""/usr/local/lib/python3.9/site-packages/click/core.py"", line 782, in main rv = self.invoke(ctx) File ""/usr/local/lib/python3.9/site-packages/click/core.py"", line 1259, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/usr/local/lib/python3.9/site-packages/click/core.py"", line 1066, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/usr/local/lib/python3.9/site-packages/click/core.py"", line 610, in invoke return callback(*args, **kwargs) File ""/usr/local/lib/python3.9/site-packages/datasette/cli.py"", line 544, in serve asyncio.get_event_loop().run_until_complete(check_databases(ds)) File ""/usr/local/lib/python3.9/asyncio/base_events.py"", line 642, in run_until_complete return future.result() File ""/usr/local/lib/python3.9/site-packages/datasette/cli.py"", line 584, in check_databases await database.execute_fn(check_connection) File ""/usr/local/lib/python3.9/site-packages/datasette/database.py"", line 155, in execute_fn return await asyncio.get_event_loop().run_in_executor( File ""/usr/local/lib/python3.9/concurrent/futures/thread.py"", line 52, in run result = self.fn(*self.args, **self.kwargs) File ""/usr/local/lib/python3.9/site-packages/datasette/database.py"", line 153, in in_thread return fn(conn) File ""/usr/local/lib/python3.9/site-packages/datasette/utils/__init__.py"", line 892, in check_connection for r in conn.execute( sqlite3.DatabaseError: malformed database schema (generated_columns) - near ""AS"": syntax error ``` I have confirmed that the downloaded `fixtures.db` database is fine: ``` [skia-public] jcgregorio@jcgregorio840 ~/Downloads $ sqlite3 fixtures.db SQLite version 3.34.1 2021-01-20 14:10:07 Enter "".help"" for usage hints. 
sqlite> pragma integrity_check; ok sqlite> ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1376/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 919733213,MDU6SXNzdWU5MTk3MzMyMTM=,33,Searching for whitespace throws an error,9599,simonw,closed,0,,,,,0,2021-06-13T06:57:57Z,2021-06-13T14:36:39Z,2021-06-13T14:36:39Z,MEMBER,,"https://datasette.io/-/beta?q=+ returns a 500 > fts5: syntax error near """"",197431109,dogsheep-beta,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-beta/issues/33/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 919702451,MDU6SXNzdWU5MTk3MDI0NTE=,271,table.upsert_all() fails if input has a single column that should be a primary key,9599,simonw,closed,0,,,,,1,2021-06-13T02:50:27Z,2021-06-13T02:57:29Z,2021-06-13T02:57:29Z,OWNER,,"This works: ```pycon >>> db['foo'].insert_all([{""name"": ""hello""}], pk=""name"")
``` But this fails: ``` >>> db['foo3'].upsert_all([{""name"": ""hello""}], pk=""name"") Traceback (most recent call last): File """", line 1, in File ""/Users/simon/.local/share/virtualenvs/datasette.io-TK86ygSO/lib/python3.9/site-packages/sqlite_utils/db.py"", line 1837, in upsert_all return self.insert_all( File ""/Users/simon/.local/share/virtualenvs/datasette.io-TK86ygSO/lib/python3.9/site-packages/sqlite_utils/db.py"", line 1778, in insert_all self.insert_chunk( File ""/Users/simon/.local/share/virtualenvs/datasette.io-TK86ygSO/lib/python3.9/site-packages/sqlite_utils/db.py"", line 1588, in insert_chunk result = self.db.execute(query, params) File ""/Users/simon/.local/share/virtualenvs/datasette.io-TK86ygSO/lib/python3.9/site-packages/sqlite_utils/db.py"", line 213, in execute return self.conn.execute(sql, parameters) sqlite3.OperationalError: near ""WHERE"": syntax error ``` With the debugger: ``` >>> import pdb; pdb.pm() > /Users/simon/.local/share/virtualenvs/datasette.io-TK86ygSO/lib/python3.9/site-packages/sqlite_utils/db.py(213)execute() -> return self.conn.execute(sql, parameters) (Pdb) print(sql, parameters) UPDATE [foo3] SET WHERE [name] = ? ['hello'] ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/271/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 919508498,MDU6SXNzdWU5MTk1MDg0OTg=,1375,JSON export dumps JSON fields as TEXT,4068,frafra,closed,0,,,,,2,2021-06-12T09:45:08Z,2021-06-14T09:41:59Z,2021-06-13T15:37:58Z,NONE,,"Hi! When a user tries to export data as JSON, I would expect to see the value of JSON columns represented as JSON instead of being rendered as a string. What do you think?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1375/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 919314806,MDU6SXNzdWU5MTkzMTQ4MDY=,270,Cannot set type JSON,4068,frafra,closed,0,,,,,4,2021-06-11T23:53:22Z,2021-06-16T17:34:49Z,2021-06-16T15:47:06Z,NONE,,"It would be great if the column type could be set to JSON. That would not be different from handling a regular string. It would be something like `repr(value)` and it would work with both JSON and CSV inputs, no matter if `value` is a real list or just a string representing a list.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/270/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 919250621,MDU6SXNzdWU5MTkyNTA2MjE=,269,bool type not supported,4068,frafra,closed,0,,,,,3,2021-06-11T22:00:36Z,2021-06-15T01:34:10Z,2021-06-15T01:34:10Z,NONE,,"Hi! Thank you for sharing this very nice tool :) It would be nice to have support for more types, like `bool`: it is not possible to convert to boolean at the moment. 
My suggestion would be to handle it as `bool(int(value))`, like csvkit does.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/269/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 919181559,MDU6SXNzdWU5MTkxODE1NTk=,268,db.schema property and sqlite-utils schema command,9599,simonw,closed,0,,,,,4,2021-06-11T20:25:47Z,2021-06-11T20:51:56Z,2021-06-11T20:51:56Z,OWNER,,"`table.schema` returns the schema for a table. `db.schema` should return the schema for the whole database. Can do this using `select sql from sqlite_master where sql is not null`: https://latest.datasette.io/fixtures?sql=select+sql+from+sqlite_master+where+sql+is+not+null",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/268/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 915488244,MDU6SXNzdWU5MTU0ODgyNDQ=,1372,"Add section to ""writing plugins"" about security, e.g. avoiding XSS",9599,simonw,open,0,,,,,0,2021-06-08T20:49:33Z,2021-06-08T20:49:46Z,,OWNER,,https://docs.datasette.io/en/stable/writing_plugins.html should have tips on writing secure plugins.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1372/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 915455228,MDU6SXNzdWU5MTU0NTUyMjg=,1371,Menu plugin hooks should include the request,9599,simonw,closed,0,,,,,1,2021-06-08T20:23:35Z,2021-06-10T04:46:01Z,2021-06-10T04:46:01Z,OWNER,,"https://docs.datasette.io/en/stable/plugin_hooks.html#menu-links-datasette-actor - `menu_links(datasette, actor)` - `table_actions(datasette, actor, database, table)` - `database_actions(datasette, actor, database)` All three of these should optionally also accept the `request` object. This would allow them to take into account additional cookies, `Authorization` headers or the current request URL (including the domain/subdomain) - or even access `request.scope` for extra context that might have been passed down from ASGI middleware.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1371/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 915421499,MDU6SXNzdWU5MTU0MjE0OTk=,267,row.update() or row.pk,12721157,Gravitar64,open,0,,,,,4,2021-06-08T19:56:00Z,2021-06-22T17:27:27Z,,NONE,,"Hi, fantastic framework for working with Sqlite3 databases!!! I tried to update specific rows in a table and used

    for row in db[tablename]:
        newValue = row[""counter""] * row[""prize""]
        row.update({""Fieldname"": newValue})
        print(row)

This updates the value in the printed row, but not in the database. So I switched to

    db[tablename].update(id, {""Fieldname"": newValue})

This works fine. But row.update would be nicer, because there is no need for the id (it's that row), no need for the tablename and the db (all defined in the for row ... loop).
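For reference, a minimal sketch of the pattern that works today with `table.update()` - the database path, table name, column names and the `id` primary key here are all made up:

```python
import sqlite_utils

db = sqlite_utils.Database('records.db')
table = db['items']
# update each row via its primary key, computing the new value first
for row in table.rows:
    new_value = row['counter'] * row['prize']
    table.update(row['id'], {'total': new_value})
```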
Thx ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/267/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 913900374,MDU6SXNzdWU5MTM5MDAzNzQ=,1369,Don't show foreign key IDs twice if no label,9599,simonw,open,0,,,,,1,2021-06-07T19:47:02Z,2021-06-07T19:47:24Z,,OWNER,,"![B5B54D94-A768-4544-A88D-CDCAB417CD3C](https://user-images.githubusercontent.com/9599/121078979-6e9d0600-c78e-11eb-8b70-20e6d29b48b1.jpeg) ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1369/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 913823889,MDU6SXNzdWU5MTM4MjM4ODk=,1367,Navigation menu display bug,9599,simonw,closed,0,,,,,1,2021-06-07T18:18:08Z,2021-06-07T18:24:19Z,2021-06-07T18:24:19Z,OWNER,,"With Datasette 0.57 the navigation menu looks like this: ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1367/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 913809802,MDU6SXNzdWU5MTM4MDk4MDI=,1366,Get rid of this `restore_working_directory` hack entirely,9599,simonw,open,0,,,,,2,2021-06-07T18:01:21Z,2021-06-07T18:03:03Z,,OWNER,,"> That seems to have fixed it. I'd love to get rid of this `restore_working_directory` hack entirely. _Originally posted by @simonw in https://github.com/simonw/datasette/issues/1361#issuecomment-855308811_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1366/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 913135723,MDU6SXNzdWU5MTMxMzU3MjM=,266,"Add some types, enforce with mypy",9599,simonw,closed,0,,,,,3,2021-06-07T06:05:56Z,2021-08-18T22:25:38Z,2021-08-18T22:25:38Z,OWNER,,"A good starting point would be adding type information to the members of these named tuples and the introspection methods that return them: https://github.com/simonw/sqlite-utils/blob/9dff7a38831d471b1dff16d40d89eb5c3b4e84d6/sqlite_utils/db.py#L51-L75",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/266/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 913017577,MDU6SXNzdWU5MTMwMTc1Nzc=,1365,pathlib.Path breaks internal schema,25778,eyeseast,closed,0,,,,,1,2021-06-07T01:40:37Z,2021-06-21T15:57:39Z,2021-06-21T15:57:39Z,CONTRIBUTOR,,"Ran into an issue while trying to build a plugin to render GeoJSON. I'm using pytest's `tmp_path` fixture, which is a `pathlib.Path`, to get a temporary database path. I was getting a weird error involving writes, but I was doing reads. Turns out it's the internal database trying to insert a `Path` where it wants a string. 
My test looked like this:

```python
@pytest.mark.asyncio
async def test_render_feature_collection(tmp_path):
    database = tmp_path / ""test.db""
    datasette = Datasette([database])  # this will break with a path
    await datasette.refresh_schemas()
    # build a url
    url = datasette.urls.table(database.stem, TABLE_NAME, format=""geojson"")
    response = await datasette.client.get(url)
    fc = response.json()
    assert 200 == response.status_code
```

I only ran into this while running tests, because passing in database paths from the CLI uses strings, but it's a weird error and probably something other people have run into. The fix is easy enough: Convert the path to a string and everything works. So this:

```python
@pytest.mark.asyncio
async def test_render_feature_collection(tmp_path):
    database = tmp_path / ""test.db""
    datasette = Datasette([str(database)])  # this is fine now
    await datasette.refresh_schemas()
```

This could (probably, haven't tested) be fixed [here](https://github.com/simonw/datasette/blob/03ec71193b9545536898a4bc7493274fec48bdd7/datasette/app.py#L357) by calling `str(db.path)` or by doing that conversion earlier.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1365/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 912959264,MDU6SXNzdWU5MTI5NTkyNjQ=,1364,Don't truncate columns on the list of databases,9599,simonw,closed,0,,,,,0,2021-06-06T22:01:56Z,2021-06-06T22:07:50Z,2021-06-06T22:07:50Z,OWNER,,"https://covid-19.datasettes.com/covid currently truncates at 9 database columns: Django SQL Dashboard showed me that this is a bad idea - having the full list of columns is actually really useful documentation for crafting custom SQL queries.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1364/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 912864936,MDU6SXNzdWU5MTI4NjQ5MzY=,1362,Consider using CSP to protect against future XSS,9599,simonw,open,0,,,,,17,2021-06-06T15:32:20Z,2022-10-08T18:42:09Z,,OWNER,,The XSS in #1360 would have been a lot less damaging if Datasette used CSP to protect against such vulnerabilities: https://developer.mozilla.org/en-US/docs/Web/HTTP/CSP,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1362/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 912485040,MDU6SXNzdWU5MTI0ODUwNDA=,1361,Intermittent CI failure: restore_working_directory FileNotFoundError,9599,simonw,closed,0,,,,,4,2021-06-05T22:48:13Z,2021-06-05T23:16:24Z,2021-06-05T23:16:24Z,OWNER,,"e.g.
in https://github.com/simonw/datasette/runs/2754772233 - this is an intermittent error: ``` __________ ERROR at setup of test_hook_register_routes_render_message __________ [gw0] linux -- Python 3.8.10 /opt/hostedtoolcache/Python/3.8.10/x64/bin/python tmpdir = local('/tmp/pytest-of-runner/pytest-0/popen-gw0/test_hook_register_routes_rend0') request = > @pytest.fixture def restore_working_directory(tmpdir, request): > previous_cwd = os.getcwd() E FileNotFoundError: [Errno 2] No such file or directory ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1361/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 912464443,MDU6SXNzdWU5MTI0NjQ0NDM=,1360,"Security flaw, to be fixed in 0.56.1 and 0.57",9599,simonw,closed,0,,,,,2,2021-06-05T21:53:51Z,2021-06-05T22:23:23Z,2021-06-05T22:22:06Z,OWNER,,"See security advisory here for details: https://github.com/simonw/datasette/security/advisories/GHSA-xw7c-jx9m-xh5g - the `?_trace=1` debugging option was not correctly escaping its JSON output, resulting in a [reflected cross-site scripting](https://owasp.org/www-community/attacks/xss/#reflected-xss-attacks) vulnerability.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1360/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 912419349,MDU6SXNzdWU5MTI0MTkzNDk=,1359,`?_trace=1` should only be available with a new `trace_debug` setting,9599,simonw,closed,0,,,,,0,2021-06-05T19:59:27Z,2021-06-05T20:18:46Z,2021-06-05T20:18:46Z,OWNER,,Just like template debug mode is controlled by this off-by-default setting: https://github.com/simonw/datasette/blob/368aa5f1b16ca35f82d90ff747023b9a2bfa27c1/datasette/app.py#L160-L164,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1359/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 912418094,MDU6SXNzdWU5MTI0MTgwOTQ=,1358,Release Datasette 0.57,9599,simonw,closed,0,,,,,3,2021-06-05T19:56:13Z,2021-06-05T22:20:07Z,2021-06-05T22:20:07Z,OWNER,,"Need release notes. Changes are here: https://github.com/simonw/datasette/compare/0.56...368aa5f1b16ca35f82d90ff747023b9a2bfa27c1 Partial release notes already exist for the two alphas, https://github.com/simonw/datasette/releases/tag/0.57a0 and https://github.com/simonw/datasette/releases/tag/0.57a1",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1358/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 910092577,MDU6SXNzdWU5MTAwOTI1Nzc=,1356,"Research: syntactic sugar for using --get with SQL queries, maybe ""datasette query""",9599,simonw,open,0,,,,,10,2021-06-03T04:49:42Z,2022-01-20T01:06:37Z,,OWNER,,"Inspired by https://github.com/simonw/sqlite-utils/issues/264 - in particular this example: ``` datasette covid.db --get='/covid.yaml?sql=select * from ny_times_us_counties limit 1' - date: '2020-01-21' county: Snohomish state: Washington fips: 53061 cases: 1 deaths: 0 ``` Having to construct that URL - including potentially URL escaping the SQL query - isn't a great developer experience. 
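For context, building that URL by hand means something like this - a small sketch using only the Python standard library, with the database and SQL taken from the example above:

```python
import urllib.parse

sql = 'select * from ny_times_us_counties limit 1'
# the SQL has to be URL-encoded before it can go into the --get path
path = '/covid.yaml?sql=' + urllib.parse.quote(sql)
print(path)
# /covid.yaml?sql=select%20%2A%20from%20ny_times_us_counties%20limit%201
```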
Imagine if you could do this instead: datasette covid.db --query ""select * from ny_times_us_counties limit 1"" --format yaml ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1356/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 910088936,MDU6SXNzdWU5MTAwODg5MzY=,1355,datasette --get should efficiently handle streaming CSV,9599,simonw,open,0,,,,,2,2021-06-03T04:40:40Z,2022-03-20T22:38:53Z,,OWNER,,"It would be great if you could use `datasette --get` to run queries that return streaming CSV data without running out of RAM. Current implementation looks like it loads the entire result into memory first: https://github.com/simonw/datasette/blob/f78ebdc04537a6102316d6dbbf6c887565806078/datasette/cli.py#L546-L552",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1355/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 908465747,MDU6SXNzdWU5MDg0NjU3NDc=,1354,Update help in tests for latest Click,9599,simonw,closed,0,,,,,1,2021-06-01T16:14:31Z,2021-06-01T16:17:04Z,2021-06-01T16:17:04Z,OWNER,,"Now that Uvicorn 0.14 is out with an unpinned Click dependency - https://github.com/encode/uvicorn/pull/1033 - our test suite runs against Click 8.0 - which subtly changes the output of `--help` causing test failures: https://github.com/simonw/datasette/runs/2720383031?check_suite_focus=true ``` def test_help_includes(name, filename): expected = (docs_path / filename).read_text() runner = CliRunner() result = runner.invoke(cli, name.split() + [""--help""], terminal_width=88) actual = f""$ datasette {name} --help\n\n{result.output}"" # actual has ""Usage: cli package [OPTIONS] FILES"" # because it doesn't know that cli will be aliased to datasette expected = expected.replace(""Usage: datasette"", ""Usage: cli"") > assert expected == actual E AssertionError: assert '$ datasette ...e and exit.\n' == '$ datasette ...e and exit.\n' E Skipping 848 identical leading characters in diff, use -v to show E nt_id xxx E + E --version-note TEXT Additional note to show on /-/versions E --secret TEXT Secret used for signing secure values, such as signed E cookies E + E --title TEXT Title for metadata ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1354/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 908446997,MDU6SXNzdWU5MDg0NDY5OTc=,1353,?_nocount=1 for opting out of table counts,9599,simonw,closed,0,,,,,2,2021-06-01T15:53:27Z,2021-06-01T16:18:54Z,2021-06-01T16:17:04Z,OWNER,,"Running a trace against a CSV streaming export with the new `_trace=1` feature from #1351 shows that the following code is executing a `select count(*) from table` for every page of results returned: https://github.com/simonw/datasette/blob/d1d06ace49606da790a765689b4fbffa4c6deecb/datasette/views/table.py#L700-L705 This is inefficient - a new `?_nocount=1` option would let us disable this count in the same way as #1349: https://github.com/simonw/datasette/blob/d1d06ace49606da790a765689b4fbffa4c6deecb/datasette/views/base.py#L264-L276 ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1353/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, 
""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 907795562,MDU6SXNzdWU5MDc3OTU1NjI=,265,Using enable_fts before search term,36287,prabhur,open,0,,,,,1,2021-06-01T01:43:34Z,2023-04-01T17:27:18Z,,NONE,,"Many thanks for the sqlite-utils suite of utilities. Has made my life much much easier. I used this to create a table and enable FTS. All works fine. The datasette utility detects FTS and shows a text box. Searching for a term using that interface works well. However, when I start to use features by following https://www.sqlite.org/fts5.html section **""3. Full-text Query Syntax""** I seem to run into issues that I suspect is due to `escape_fts` wrapper function. As an example, if i search for the term `""^குகை"" `on the text box in datasette it produces 140 results. However, when i tweak the query produced by datasette to not use ""escape_fts"" it produces 5 results. Similarly, when I try to restrict the search to a single column in FTS using a spec like `{title : ^குகை}` it returns no rows. The same thing pulls results when used without `escape_fts`. The text in the table is in Tamil language and the search term is a Tamil word. ``` ... where posts_fts match escape_fts(:search) ``` vs ``` ... where posts_fts match (:search) ``` Any ideas why? How can I get the benefits of both escaping as well as utilizing different facets of providing / controlling search terms? Thanks.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/265/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 907645813,MDU6SXNzdWU5MDc2NDU4MTM=,57,"Error: Use either --since or --since_id, not both",42904,rubenv,closed,0,,,,,6,2021-05-31T18:11:04Z,2021-08-20T00:01:31Z,2021-08-20T00:01:31Z,CONTRIBUTOR,,"I'm using the following command: ``` twitter-to-sqlite user-timeline -a twitter-auth.json twitter/tweets.db --since ``` Which gives the following error: ``` Error: Use either --since or --since_id, not both ``` Running without `--since`. 
``` Traceback (most recent call last): File ""/usr/local/bin/twitter-to-sqlite"", line 8, in sys.exit(cli()) File ""/usr/local/lib/python3.9/site-packages/click/core.py"", line 1137, in __call__ return self.main(*args, **kwargs) File ""/usr/local/lib/python3.9/site-packages/click/core.py"", line 1062, in main rv = self.invoke(ctx) File ""/usr/local/lib/python3.9/site-packages/click/core.py"", line 1668, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/usr/local/lib/python3.9/site-packages/click/core.py"", line 1404, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/usr/local/lib/python3.9/site-packages/click/core.py"", line 763, in invoke return __callback(*args, **kwargs) File ""/usr/local/lib/python3.9/site-packages/twitter_to_sqlite/cli.py"", line 317, in user_timeline for tweet in bar: File ""/usr/local/lib/python3.9/site-packages/click/_termui_impl.py"", line 328, in generator for rv in self.iter: File ""/usr/local/lib/python3.9/site-packages/twitter_to_sqlite/utils.py"", line 234, in fetch_user_timeline yield from fetch_timeline( File ""/usr/local/lib/python3.9/site-packages/twitter_to_sqlite/utils.py"", line 202, in fetch_timeline raise Exception(str(tweets[""errors""])) Exception: [{'code': 44, 'message': 'since_id parameter is invalid.'}] ``` ``` Python 3.9.5 twitter-to-sqlite, version 0.21.3 ```",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/57/reactions"", ""total_count"": 4, ""+1"": 4, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 907642546,MDU6SXNzdWU5MDc2NDI1NDY=,264,"Supporting additional output formats, like GeoJSON",25778,eyeseast,closed,0,,,,,3,2021-05-31T18:03:32Z,2021-06-03T05:12:21Z,2021-06-03T05:12:21Z,CONTRIBUTOR,,"I have a project going where it would be useful to do some spatial processing in SQLite (instead of large files) and then output GeoJSON. So my workflow would be something like this: 1. Read Shapefiles, GeoJSON, CSVs into a SQLite database 2. Join, filter, prune as needed 3. Export GeoJSON for just the stuff I need at that moment, while still having a database of things that will be useful later I'm wondering if this is worth adding to SQLite-utils itself (GeoJSON, at least), or if it's better to make a counterpart to the ecosystem of `*-to-sqlite` tools, say a suite of `sqlite-to-*` things. Or would it be crazy to have a plugin system?",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/264/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 906993731,MDU6SXNzdWU5MDY5OTM3MzE=,1351,Get `?_trace=1` working with CSV and streaming CSVs,9599,simonw,closed,0,,,,,1,2021-05-31T03:02:15Z,2021-06-17T18:12:32Z,2021-06-01T15:50:09Z,OWNER,,"> I think it's worth getting `?_trace=1` to work with streaming CSV - this would have helped me spot this issue a long time ago. 
_Originally posted by @simonw in https://github.com/simonw/datasette/issues/1349#issuecomment-851133125_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1351/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 906977719,MDU6SXNzdWU5MDY5Nzc3MTk=,1350,?_nofacets=1 query string argument for disabling facets and suggested facets,9599,simonw,closed,0,,,,,2,2021-05-31T02:22:29Z,2021-06-01T16:19:38Z,2021-05-31T02:39:18Z,OWNER,,"This is needed as an internal option for #1349. `datasette-graphql` can benefit from this too - maybe can even use it so that if you pass `?_shape=array` it gets automatically added, fixing #263.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1350/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 906385991,MDU6SXNzdWU5MDYzODU5OTE=,1349,CSV ?_stream=on redundantly calculates facets for every page,9599,simonw,closed,0,,,,,9,2021-05-29T06:11:23Z,2021-06-17T18:12:32Z,2021-06-01T15:52:53Z,OWNER,,"I'm trying to figure out why a full CSV export from https://covid-19.datasettes.com/covid/ny_times_us_counties runs unbearably slowly. It's because the streaming endpoint works by scrolling through every page, and it turns out every page calculates facets and suggested facets!",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1349/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 906356331,MDU6SXNzdWU5MDYzNTYzMzE=,263,`sqlite-utils indexes` command,9599,simonw,closed,0,,,,,6,2021-05-29T04:52:34Z,2021-06-03T04:34:38Z,2021-06-03T04:34:38Z,OWNER,,"While working on #260 I realized there's no command to show indexes in a database, even though there is one for showing tables and one for triggers. 
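Under the hood this could be a thin wrapper over `PRAGMA index_list` - a rough sketch of the kind of query involved (the database and table names are placeholders, and the five-column row shape assumes a reasonably recent SQLite):

```python
import sqlite3

conn = sqlite3.connect('records.db')
# each row describes one index: sequence, name, unique flag, origin, partial flag
for seq, name, unique, origin, partial in conn.execute(
    ""PRAGMA index_list('my_table')""
):
    print(name, 'unique' if unique else 'non-unique')
```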
I should implement #261 first.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/263/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 906345899,MDU6SXNzdWU5MDYzNDU4OTk=,261,`table.xindexes` using `PRAGMA index_xinfo(table)`,9599,simonw,closed,0,,,,,5,2021-05-29T04:23:48Z,2021-06-03T03:54:14Z,2021-06-03T03:51:32Z,OWNER,,"> `PRAGMA index_xinfo(table)` DOES return that data:
> ```
> (Pdb) [c[0] for c in fresh_db.execute(""PRAGMA
> index_xinfo('idx_dogs_age_name')"").description]
> ['seqno', 'cid', 'name', 'desc', 'coll', 'key']
> (Pdb) fresh_db.execute(""PRAGMA index_xinfo('idx_dogs_age_name')"").fetchall()
> [(0, 2, 'age', 1, 'BINARY', 1), (1, 0, 'name', 0, 'BINARY', 1), (2, -1, None, 0, 'BINARY', 0)]
> ```
> See https://sqlite.org/pragma.html#pragma_index_xinfo
>
> Example output: https://covid-19.datasettes.com/covid?sql=select+*+from+pragma_index_xinfo%28%27idx_ny_times_us_counties_date%27%29

_Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/260#issuecomment-850766552_",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/261/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 906330187,MDU6SXNzdWU5MDYzMzAxODc=,260,Support creating descending order indexes,9599,simonw,closed,0,,,,,12,2021-05-29T03:42:59Z,2021-05-29T05:01:39Z,2021-05-29T05:01:39Z,OWNER,,"SQLite lets you create indexes in reverse order, which can have a surprisingly big impact on performance, see https://github.com/simonw/covid-19-datasette/issues/27 I tried doing this using `sqlite-utils` like so, but it didn't work:

```python
db[""ny_times_us_counties""].create_index([""date desc""])
```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/260/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 904071938,MDU6SXNzdWU5MDQwNzE5Mzg=,1345,?_nocol= does not interact well with default facets,9599,simonw,closed,0,,,,,7,2021-05-27T18:39:55Z,2021-05-31T02:40:44Z,2021-05-31T02:31:21Z,OWNER,,"Clicking ""Hide this column"" on `fips` on https://covid-19.datasettes.com/covid/ny_times_us_counties shows this error: https://covid-19.datasettes.com/covid/ny_times_us_counties?_nocol=fips

> ## Invalid SQL
> no such column: fips

The reason is that https://covid-19.datasettes.com/-/metadata sets up the following:

```json
""ny_times_us_counties"": {
    ""sort_desc"": ""date"",
    ""facets"": [
        ""state"",
        ""county"",
        ""fips""
    ],
```

It's setting `fips` as a default facet, which breaks if you attempt to remove the column using `?_nocol`.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1345/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 903986178,MDU6SXNzdWU5MDM5ODYxNzg=,1344,Test Datasette Docker images built for different architectures,9599,simonw,open,0,,,,,10,2021-05-27T16:52:29Z,2022-09-06T00:07:58Z,,OWNER,,"Continuing on from #1319 - now that we have the ability to build Datasette's Docker image against multiple architectures we should test that it works.
We can do this with QEMU emulation, see https://twitter.com/nevali/status/1397958044571602945",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1344/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 903978133,MDU6SXNzdWU5MDM5NzgxMzM=,1343,Figure out how to publish alpha/beta releases to Docker Hub,9599,simonw,closed,0,,,,,4,2021-05-27T16:42:17Z,2021-05-27T16:46:37Z,2021-05-27T16:45:41Z,OWNER,,"> It looks like all I need to do to ship an alpha version to Docker Hub is NOT point the `latest` tag at it after it goes live: https://github.com/simonw/datasette/blob/1a8972f9c012cd22b088c6b70661a9c3d3847853/.github/workflows/publish.yml#L75-L77 _Originally posted by @simonw in https://github.com/simonw/datasette/issues/1319#issuecomment-849780481_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1343/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 903902495,MDU6SXNzdWU5MDM5MDI0OTU=,1342,Improve `path_with_replaced_args()` and friends and document them,9599,simonw,open,0,,,,,3,2021-05-27T15:18:28Z,2021-05-27T15:23:02Z,,OWNER,,"> In order to cleanly implement this I need to expose the `path_with_replaced_args` utility function to Datasette's template engine. This is the first time this will become an exposed (and hence should-by-documented) API and I don't like its shape much. _Originally posted by @simonw in https://github.com/simonw/datasette/issues/1337#issuecomment-849721280_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1342/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 903200328,MDU6SXNzdWU5MDMyMDAzMjg=,1341,"""Show all columns"" cog menu item should show if ?_col= is used",9599,simonw,closed,0,,,,,1,2021-05-27T04:28:17Z,2021-05-27T04:31:16Z,2021-05-27T04:31:16Z,OWNER,,"On https://latest.datasette.io/fixtures/sortable?_col=sortable the ""Show all columns"" item (from #615) is not shown (it should be): ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1341/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 901009787,MDU6SXNzdWU5MDEwMDk3ODc=,1340,Research: Cell action menu (like column action but for individual cells),9599,simonw,open,0,,,,,1,2021-05-25T15:49:16Z,2021-05-26T18:59:58Z,,OWNER,,"Had an idea today that it might be useful to select an individual cell and say things like ""show me all other rows with the same value"" - maybe even a set of other menu options against cells as well. 
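For the ""show me all other rows with the same value"" case, the menu item could probably just link to the existing column filter interface - a sketch of building such a link, with made-up table, column and value names:

```python
import urllib.parse

column, value = 'state', 'Washington'
# ?column__exact=value is filter syntax the table view already understands
url = '/fixtures/sortable?' + urllib.parse.urlencode({column + '__exact': value})
print(url)  # /fixtures/sortable?state__exact=Washington
```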
Mocked up a show-on-hover ellipses demo using the CSS inspector: ![idea](https://user-images.githubusercontent.com/9599/119528316-f0744480-bd35-11eb-8eb4-1deea6d60cce.gif) ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1340/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 899169307,MDU6SXNzdWU4OTkxNjkzMDc=,1338,Fix jinja2 warnings,9599,simonw,closed,0,,,,,0,2021-05-24T01:38:23Z,2021-05-24T01:41:55Z,2021-05-24T01:41:55Z,OWNER,,"Lots of these in the test suite now, after the Jinja upgrade in #1331: ``` tests/test_plugins.py::test_hook_render_cell_link_from_json datasette/tests/plugins/my_plugin_2.py:45: DeprecationWarning: 'jinja2.escape' is deprecated and will be removed in Jinja 3.1. Import 'markupsafe.escape' instead. label=jinja2.escape(data[""label""] or """") or "" "", tests/test_plugins.py::test_hook_render_cell_link_from_json datasette/tests/plugins/my_plugin_2.py:41: DeprecationWarning: 'jinja2.Markup' is deprecated and will be removed in Jinja 3.1. Import 'markupsafe.Markup' instead. return jinja2.Markup( ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1338/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 898904402,MDU6SXNzdWU4OTg5MDQ0MDI=,1337,"""More"" link for facets that shows _facet_size=max results",9599,simonw,closed,0,,,,,7,2021-05-23T00:08:51Z,2021-05-27T16:14:14Z,2021-05-27T16:01:03Z,OWNER,,"_Original title: ""More"" link for facets that shows the full set of results_ The simplest way to do this will be to have it link to a generated SQL query. _Originally posted by @simonw in https://github.com/simonw/datasette/issues/1332#issuecomment-846479062_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1337/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 897212458,MDU6SXNzdWU4OTcyMTI0NTg=,63,Ability to fetch commits from branches other than the default,9599,simonw,open,0,,,,,0,2021-05-20T17:58:08Z,2021-05-20T17:58:08Z,,MEMBER,,This tool is currently almost entirely ignorant of the concept of branches. 
One example: you can't retrieve commits from any branch other than the default (usually main).,207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/63/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 895686039,MDU6SXNzdWU4OTU2ODYwMzk=,1336,Document turning on WAL for live served SQLite databases,9599,simonw,open,0,,,,,1,2021-05-19T17:08:58Z,2022-01-13T21:55:59Z,,OWNER,,"Datasette docs don't talk about WAL yet, which allows you to safely serve reads from a database file while it is accepting writes.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1336/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 894948100,MDU6SXNzdWU4OTQ5NDgxMDA=,259,Suggest the --alter option if a new column cannot be added,9599,simonw,closed,0,,,,,1,2021-05-19T03:17:38Z,2021-05-19T03:27:33Z,2021-05-19T03:26:26Z,OWNER,,Refs #256.,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/259/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 893890496,MDU6SXNzdWU4OTM4OTA0OTY=,1332,?_facet_size=X to increase number of facets results on the page,192568,mroswell,closed,0,,,,,5,2021-05-18T02:40:16Z,2021-05-27T16:13:07Z,2021-05-23T00:34:37Z,CONTRIBUTOR,,"Is there a way to add a parameter to the URL to modify default_facet_size? Likewise, a way to produce a link on the three dots to expand to all items (or match previous number of items, or add x more)? ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1332/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 893537744,MDU6SXNzdWU4OTM1Mzc3NDQ=,1331,Add support for Jinja2 version 3.0,475613,MarkusH,closed,0,,,,,10,2021-05-17T17:14:36Z,2021-05-23T00:57:39Z,2021-05-23T00:57:39Z,NONE,,"A week ago, [The Pallets Project](https://github.com/pallets) released [new major versions of several of its projects](https://palletsprojects.com/blog/flask-2-0-released/). Among those updates is one for Jinja2, which bumps it to version 3.0.0. I'd like for datasette to support Jinja2 version 3.0.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1331/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 892457208,MDU6SXNzdWU4OTI0NTcyMDg=,1327,Support Unicode characters in metadata.json,20846286,GmGniap,closed,0,,,,,2,2021-05-15T14:33:58Z,2021-05-24T19:10:21Z,2021-05-24T19:10:21Z,NONE,,"Hello, when I used Burmese (Unicode) characters in metadata.json like below - ![image](https://user-images.githubusercontent.com/20846286/118364978-cba70100-b5c0-11eb-967c-7dc3b62478f2.png) It gave wrong results when I ran datasette - ![image](https://user-images.githubusercontent.com/20846286/118365025-fc873600-b5c0-11eb-97ce-19541b8cc6d8.png) It would be great & helpful for us if metadata.json could support Unicode for Asian languages. Thanks & Regards.
",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1327/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 891969037,MDU6SXNzdWU4OTE5NjkwMzc=,1326,How to limit fields returned from the JSON API?,5268174,bram2000,closed,0,,,,,1,2021-05-14T14:27:41Z,2021-05-23T02:55:06Z,2021-05-23T02:55:00Z,NONE,,"Hi, I have quite wide tables, and in many cases only want a subset of the data (to save on network bandwidth). I need to use the JSON API as handling pagination is so much easier, but I can't see a way to select specific columns. Is there a way to do this, or is it a feature request? Thanks!",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1326/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 884952179,MDU6SXNzdWU4ODQ5NTIxNzk=,1320,Can't use apt-get in Dockerfile when using datasetteproj/datasette as base,2670795,brandonrobertz,closed,0,,,,,4,2021-05-10T19:37:27Z,2021-05-24T18:15:56Z,2021-05-24T18:07:08Z,CONTRIBUTOR,,"The datasette base Docker image is super convenient, but there's one problem: if any of the plugins you install require additional system dependencies (e.g., xz, git, curl) then any attempt to use apt in said Dockerfile results in an explosion: ``` $ docker-compose build Building server [+] Building 9.9s (7/9) => [internal] load build definition from Dockerfile 0.0s => => transferring dockerfile: 666B 0.0s => [internal] load .dockerignore 0.0s => => transferring context: 34B 0.0s => [internal] load metadata for docker.io/datasetteproject/datasette:latest 0.6s => [base 1/4] FROM docker.io/datasetteproject/datasette@sha256:2250d0fbe57b1d615a8d6df0c9d43deb9533532e00bac68854773d8ff8dcf00a 0.0s => [internal] load build context 1.8s => => transferring context: 2.44MB 1.8s => CACHED [base 2/4] WORKDIR /datasette 0.0s => ERROR [base 3/4] RUN apt-get update && apt-get install --no-install-recommends -y git ssh curl xz-utils 9.2s ------ > [base 3/4] RUN apt-get update && apt-get install --no-install-recommends -y git ssh curl xz-utils: #6 0.446 Get:1 http://security.debian.org/debian-security buster/updates InRelease [65.4 kB] #6 0.449 Get:2 http://deb.debian.org/debian buster InRelease [121 kB] #6 0.459 Get:3 http://httpredir.debian.org/debian sid InRelease [157 kB] #6 0.784 Get:4 http://deb.debian.org/debian buster-updates InRelease [51.9 kB] #6 0.790 Get:5 http://httpredir.debian.org/debian sid/main amd64 Packages [8626 kB] #6 1.003 Get:6 http://deb.debian.org/debian buster/main amd64 Packages [7907 kB] #6 1.180 Get:7 http://security.debian.org/debian-security buster/updates/main amd64 Packages [286 kB] #6 7.095 Get:8 http://deb.debian.org/debian buster-updates/main amd64 Packages [10.9 kB] #6 8.058 Fetched 17.2 MB in 8s (2243 kB/s) #6 8.058 Reading package lists... #6 9.166 E: flAbsPath on /var/lib/dpkg/status failed - realpath (2: No such file or directory) #6 9.166 E: Could not open file - open (2: No such file or directory) #6 9.166 E: Problem opening #6 9.166 E: The package lists or status file could not be parsed or opened. 
```

The problem seems to be from completely wiping out `/var/lib/dpkg` in the upstream Dockerfile: https://github.com/simonw/datasette/blob/1b697539f5b53cec3fe13c0f4ada13ba655c88c7/Dockerfile#L18 I've tested without removing the directory and apt works as expected.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1320/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 871304967,MDU6SXNzdWU4NzEzMDQ5Njc=,1315,"settings.json should be picked up by ""datasette publish cloudrun""",9599,simonw,open,0,,,,,0,2021-04-29T18:16:41Z,2021-04-29T18:16:41Z,,OWNER,,,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1315/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 870946764,MDU6SXNzdWU4NzA5NDY3NjQ=,1312,how to query many-to-many relationship via json API?,5268174,bram2000,open,0,,,,,0,2021-04-29T12:09:49Z,2021-04-29T12:09:49Z,,NONE,,"Hi, Firstly thanks for Datasette, it's great! I'm trying to use the JSON API to query data from a Datasette instance. I have a simple 3 table many-to-many relationship, like so:

`category` - list of categories
`document` - list of documents
`document_category` - join table (a category contains many documents, and a document can be a member of multiple categories)

The `document_category` table foreign keys to the other two using their respective row_ids. Now I want to return ""all documents within category X"" but I cannot see a way to do this without executing two queries; the first to look up the row_id of category X, and the second to join `document` with `document_category` where category ID is . I could easily write this in SQL, but this makes programmatic handling of pagination much more difficult (we'd have to dynamically modify the SQL to select the row_id and include the correct where and limit clauses). Is there a way to achieve this using the JSON API? ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1312/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 870125126,MDU6SXNzdWU4NzAxMjUxMjY=,1310,I'm creating a plugin to export a spreadsheet file (.ods or .xlsx),3747136,ColinMaudry,closed,0,,,,,2,2021-04-28T16:20:11Z,2021-04-30T07:26:11Z,2021-04-30T06:58:46Z,NONE,,"Hi, I have started developing a plugin to export records as a spreadsheet file. It could be ods or xlsx, whatever is easier.
I have spotted the following packages: - ods files: https://pypi.org/project/odswriter/ - xlsx files: https://openpyxl.readthedocs.io/en/stable/index.html (quite powerful) or https://xlsxwriter.readthedocs.io/ (faster) This is the code I have so far, I test it with the `--plugins-dir` option: ```python from datasette import hookimpl from datasette.utils.asgi import Response import odswriter as ods def render_spreadsheet(rows): with ods.writer(open(""test.ods"",""wb"")) as odsfile: for row in rows: odsfile.writerow([""String"", ""ABCDEF123456"", ""123456""]) return Response(odsfile, content_type=""application/vnd.oasis.opendocument.spreadsheet"", status=200) @hookimpl def register_output_renderer(): return {""extension"": ""ods"", ""render"": render_spreadsheet} ``` I get the following error: ``` Traceback (most recent call last): File ""/home/colin/.local/lib/python3.8/site-packages/datasette/app.py"", line 1128, in route_path await response.asgi_send(send) File ""/home/colin/.local/lib/python3.8/site-packages/datasette/utils/asgi.py"", line 339, in asgi_send body = body.encode(""utf-8"") AttributeError: 'ODSWriter' object has no attribute 'encode' ERROR: Exception in ASGI application Traceback (most recent call last): File ""/home/colin/.local/lib/python3.8/site-packages/datasette/app.py"", line 1128, in route_path await response.asgi_send(send) File ""/home/colin/.local/lib/python3.8/site-packages/datasette/utils/asgi.py"", line 339, in asgi_send body = body.encode(""utf-8"") AttributeError: 'ODSWriter' object has no attribute 'encode' During handling of the above exception, another exception occurred: Traceback (most recent call last): File ""/home/colin/.local/lib/python3.8/site-packages/uvicorn/protocols/http/h11_impl.py"", line 396, in run_asgi result = await app(self.scope, self.receive, self.send) File ""/home/colin/.local/lib/python3.8/site-packages/uvicorn/middleware/proxy_headers.py"", line 45, in __call__ return await self.app(scope, receive, send) File ""/home/colin/.local/lib/python3.8/site-packages/datasette/utils/asgi.py"", line 161, in __call__ await self.app(scope, receive, send) File ""/home/colin/.local/lib/python3.8/site-packages/datasette/tracer.py"", line 75, in __call__ await self.app(scope, receive, send) File ""/home/colin/.local/lib/python3.8/site-packages/asgi_csrf.py"", line 107, in app_wrapped_with_csrf await app(scope, receive, wrapped_send) File ""/home/colin/.local/lib/python3.8/site-packages/datasette/app.py"", line 1086, in __call__ return await self.route_path(scope, receive, send, path) File ""/home/colin/.local/lib/python3.8/site-packages/datasette/app.py"", line 1133, in route_path return await self.handle_500(request, send, exception) File ""/home/colin/.local/lib/python3.8/site-packages/datasette/app.py"", line 1267, in handle_500 await asgi_send_html( File ""/home/colin/.local/lib/python3.8/site-packages/datasette/utils/asgi.py"", line 217, in asgi_send_html await asgi_send( File ""/home/colin/.local/lib/python3.8/site-packages/datasette/utils/asgi.py"", line 237, in asgi_send await asgi_start(send, status, headers, content_type) File ""/home/colin/.local/lib/python3.8/site-packages/datasette/utils/asgi.py"", line 246, in asgi_start await send( File ""/home/colin/.local/lib/python3.8/site-packages/asgi_csrf.py"", line 103, in wrapped_send await send(event) File ""/home/colin/.local/lib/python3.8/site-packages/uvicorn/protocols/http/h11_impl.py"", line 482, in send raise RuntimeError(msg % message_type) RuntimeError: Expected ASGI message 
'http.response.body', but got 'http.response.start'. ``` I tried with `AsgiFileDownload` like in [DatabaseDownload](https://github.com/simonw/datasette/blob/main/datasette/views/database.py#L150) to deal with the binary nature of the ods file, but the renderer expects a Response: > should be dict or Response However, the `Response` class only supports the following methods, not binary: - html - text - json - redirect How would you suggest I proceed to have my ods file downloaded? ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1310/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 868188068,MDU6SXNzdWU4NjgxODgwNjg=,257,"Insert from JSON containing strings with non-ascii characters are escaped as unicode for lists, tuples, dicts.",6586811,dylan-wu,closed,0,,,,,0,2021-04-26T20:46:25Z,2021-05-19T02:57:05Z,2021-05-19T02:57:05Z,CONTRIBUTOR,,"JSON Test File (test.json): ```json [ { ""id"": 123, ""text"": ""FR Théâtre"" }, { ""id"": 223, ""text"": [ ""FR Théâtre"" ] } ] ``` Command to import: ```bash sqlite-utils insert test.db text test.json --pk=id ``` Resulting table view from datasette: ![image](https://user-images.githubusercontent.com/6586811/116147833-cdf2fb00-a6a5-11eb-8412-0aae81b6e6dd.png) Original, db.py line 2225: ```python return json.dumps(value, default=repr) ``` Fix, db.py line 2225: ```python return json.dumps(value, default=repr, ensure_ascii=False) ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/257/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 866668415,MDU6SXNzdWU4NjY2Njg0MTU=,1308,"Columns named ""link"" display in bold",9599,simonw,closed,0,,,,,3,2021-04-24T05:58:11Z,2021-04-24T06:07:49Z,2021-04-24T06:07:49Z,OWNER,,Reported in office hours today.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1308/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 864969683,MDU6SXNzdWU4NjQ5Njk2ODM=,1305,Index view crashes when any database table is not accessible to actor,416374,gfrmin,closed,0,,,,,0,2021-04-22T13:44:22Z,2021-06-02T04:26:29Z,2021-06-02T04:26:29Z,CONTRIBUTOR,,"Because of https://github.com/simonw/datasette/blob/main/datasette/views/index.py#L63, the ```tables``` dict that gets built does not include invisible tables; however, if https://github.com/simonw/datasette/blob/main/datasette/views/index.py#L80 is reached (because table_counts was not successfully initialized, e.g. due to a very large database) then as db.get_all_foreign_keys() returns ALL tables, a KeyError will be raised. This error can be recreated with the fixtures.db if any table is hidden, e.g. by adding something like ```""foreign_key_references"": { ""allow"": {} }``` to fixtures-metadata.json and deleting ```or not table_counts``` from https://github.com/simonw/datasette/blob/main/datasette/views/index.py#L77.
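One possible guard, as a rough sketch (this paraphrases the loop around index.py#L80 rather than quoting it exactly, so treat the names as approximate):
```python
# rough sketch: only attach foreign key info for tables that made it
# into the visible ``tables`` dict; skip the invisible ones
for table, foreign_keys in db.get_all_foreign_keys().items():
    if table not in tables:
        continue  # hidden/invisible table, ignore it
    tables[table][""foreign_keys""] = foreign_keys
```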
I'm not sure how best to fix this error; perhaps by testing if the table is in the aforementioned ```tables``` dict, as in the sketch above.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1305/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 863884805,MDU6SXNzdWU4NjM4ODQ4MDU=,1304,"Document how to send multiple values for ""Named parameters"" ",9308268,rayvoelker,open,0,,,,,4,2021-04-21T13:19:06Z,2021-12-08T03:23:14Z,,NONE,,"https://docs.datasette.io/en/stable/sql_queries.html#named-parameters I thought that I had seen an example of how to do this (example below), but I can't seem to find it ```sql select * from bib where bib.bib_record_num in (1008088,1008092) ``` ```sql select * from bib where bib.bib_record_num in (:bib_record_numbers) ``` ![image](https://user-images.githubusercontent.com/9308268/115558839-2333a480-a281-11eb-85e6-ce3bada79140.png) https://ilsweb.cincinnatilibrary.org/collection-analysis/current_collection-204d100?sql=select%0D%0A++*%0D%0Afrom%0D%0A++bib%0D%0Awhere%0D%0A++bib.bib_record_num+in+%28%3Abib_record_numbers%29&bib_record_numbers=1008088%2C1008092 Or, maybe this isn't a fully supported feature. ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1304/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 861622839,MDU6SXNzdWU4NjE2MjI4Mzk=,256,inserting with --nl errors with: sqlite3.OperationalError: table has no column named ,279769,rathboma,closed,0,,,,,2,2021-04-19T18:01:03Z,2021-05-19T03:26:54Z,2021-05-19T03:26:54Z,NONE,,"I have a `jsonl` file, it is 10,000 lines long. Inserting from the cli with `sqlite-utils insert db table file --nl --batch-size 10000` fails with this missing column error, even though I'm telling it to use the whole file in the first batch. This seems similar to #18 and #139, but maybe it's unique to `--nl`?",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/256/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 860734722,MDU6SXNzdWU4NjA3MzQ3MjI=,1302,Fix disappearing facets,192568,mroswell,open,0,,,,,0,2021-04-18T18:42:33Z,2021-04-20T07:40:15Z,,CONTRIBUTOR,,"1. Clone https://github.com/mroswell/list-N 2. Run `datasette disinfectants.db -o` 3. Select the `Safer_or_Toxic` facet. 4. Select `Toxic`. 5. Close out the `Safer_or_Toxic` facet. 6. Examine `Suggested facets` list. `Safer_or_Toxic` is GONE. 7. Try some other facets. When you select an element, and then close the list, in some cases, the facet properly returns to the `Suggested facet` list... Arrays and dates properly return to the list, but fields with strings don't return to the list. Since my site is devoted to whether disinfectants are Safer or Toxic, having the suggested facet disappear from the suggested facet list is very confusing* to end-users. This, along with a few other issues, unfortunately proved beyond my own programming ability to address. So I hired a Senior-level developer to address a number of issues, including this disappearing act. 8. Open a new terminal. Run `datasette disinfectants.db -m metadata.json --static static:static/ --template-dir templates/ --plugins-dir plugins/ -p 8001 -o` 9. Repeat steps 3-6, but this time, the Safer_or_Toxic facet returns to the list (and the related URL parameters are removed). I'm not sure how to do a pull request for this, because the plugin contains other functionality that goes beyond this bug. I wanted the facets sorted in a certain order (both in the suggested facet list, and the detail lists) (... the detail lists were hopping around all over the place before...) I wanted the duplicate facets removed (leaving only the one where you can facet by individual item in an array.) I wanted the arrays to be presented in a prettier fashion (I did that in the template... That could be moved over to the plugin at some point) I'm thinking it'll be very helpful if applicable parts of my project's plugin (sort_suggested_facets_plugin.py) can be incorporated back into datasette, but I leave that to you to consider. (* The disappearing facet bug was especially confusing because I'm removing the filters and sql from the table page, at the request of the organization. The filters and sql detail created a lot of confusion for end users who try to find disinfectants used by Hospitals, for instance, as an '=' won't find them, since they are part of the Use_site array.) My disappearing-facet confusion was documented in my own issue: https://github.com/mroswell/list-N/issues/57 (addressed by the plugin). Other facet-related issues here: https://github.com/mroswell/list-N/issues/54 (addressed by the plugin); https://github.com/mroswell/list-N/issues/15 (addressed by template); https://github.com/mroswell/list-N/issues/53 (not yet addressed).
",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1302/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 860722711,MDU6SXNzdWU4NjA3MjI3MTE=,1301,Publishing to cloudrun with immutable mode?,5413548,louispotok,open,0,,,,,1,2021-04-18T17:51:46Z,2022-10-07T02:38:04Z,,CONTRIBUTOR,,"I'm a bit confused about immutable mode and publishing to cloudrun. (I want to publish with immutable mode so that I can support database downloads.) Running `datasette publish cloudrun --extra-options=""-i example.db""` leads to an error: > Error: Invalid value for '-i' / '--immutable': Path 'example.db' does not exist. However, running `datasette publish cloudrun example.db` not only works but seems to publish in immutable mode anyway! I'm seeing this both with `/-/databases.json` and the fact that downloads are working. When I just `datasette serve` locally, this succeeds both ways and works as expected.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1301/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 860625833,MDU6SXNzdWU4NjA2MjU4MzM=,1300,Make row available to `render_cell` plugin hook,3243482,abdusco,closed,0,,,,,5,2021-04-18T10:14:37Z,2022-07-07T16:34:05Z,2022-07-07T16:31:22Z,CONTRIBUTOR,,"*Original title: **Generating URL for a row inside `render_cell` hook*** Hey, I am using Datasette to view a database that contains video metadata. It has BLOB columns that contain video thumbnails in JPG format (around 100-500KB per row). I've registered an output formatter that extends the `datasette.blob_renderer.render_blob` function and serves the column with `image/jpeg` content type. ```python from datasette.blob_renderer import render_blob async def render_jpg(datasette, database, rows, columns, request, table, view_name): response = await render_blob(datasette, database, rows, columns, request, table, view_name) response.content_type = ""image/jpeg"" response.headers[""Content-Disposition""] = f'inline; filename=""image.jpg""' return response @hookimpl def register_output_renderer(): return { ""extension"": ""jpg"", ""render"": render_jpg, ""can_render"": lambda: True, } ``` This works well. I can visit `http://localhost:8001/mydb/videos/1.jpg?_blob_column=thumbnail` and view the image. I want to display the image directly with an `<img>` tag (lazy-loaded of course). So, I need a URL, because embedding base64 would increase the page size too much (each image > 100KB). Datasette generates a link with `.blob` extension for blob columns. It does this by calling `datasette.urls.row_blob` https://github.com/simonw/datasette/blob/7a2ed9f8a119e220b66d67c7b9e07cbab47b1196/datasette/views/table.py#L169-L179 But I have no way of getting the row inside the `render_cell` hook.
```python @hookimpl def render_cell(value, column, table, database, datasette): if isinstance(value, bytes) and imghdr.what(None, value): # generate url return '$renderedLink' ``` Any pointers?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1300/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 858501079,MDU6SXNzdWU4NTg1MDEwNzk=,255,transform --help should tell you the available types,9599,simonw,closed,0,,,,,0,2021-04-15T05:24:48Z,2021-05-29T03:55:52Z,2021-05-29T03:55:52Z,OWNER,,"``` Usage: sqlite-utils transform [OPTIONS] PATH TABLE Transform a table beyond the capabilities of ALTER TABLE Options: --type ... Change column type to X ``` This should specify that the possible types are 'INTEGER', 'TEXT', 'FLOAT', 'BLOB'.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/255/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 856895291,MDU6SXNzdWU4NTY4OTUyOTE=,1299,Design better empty states,9599,simonw,open,0,,,,,0,2021-04-13T12:06:12Z,2021-04-13T12:06:12Z,,OWNER,,Inspiration here: https://emptystat.es/,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1299/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 855476501,MDU6SXNzdWU4NTU0NzY1MDE=,1298,improve table horizontal scroll experience,192568,mroswell,open,0,,,,,4,2021-04-12T01:55:16Z,2022-08-30T21:11:49Z,,CONTRIBUTOR,,"Wide tables aren't a huge problem if you know to click and drag right. But it's not at all obvious to do that. (It also tends to blue-select any content as you're dragging.) Depending on column widths, public users might entirely miss all the columns to the right. There is a scrollbar at the bottom of the table, but I'm displaying ALL my records because it's the only way for datasette-vega to make accurate charts. So that bottom scrollbar is likely to be missed. I wonder if some sort of javascript-y mouseover to an arrow might help, similar to those seen in image carousels. Ah: here's a perfect example: 1. Visit http://google.com 2. Search for: animals endangered 3. Note the 'g-right-button' (in the code) that looks like a right-facing caret in a circle. 4. Click on that and the carousel scrolls right (and 'g-left-button' appears on the left). Might be tricky to do that on a table, rather than a one-row carousel, but it's worth experimenting with. Another option is just to put the scrollbars at the top of the table, too. Meantime, I'm trying to build a button like the ""View/hide all columns"" button on https://salaries.news.baltimoresun.com/salaries-be494cf/2019+Maryland+state+salaries Might be nice to have that available by default, with settings in the metadata showing which are on by default. (I saw some other closed issues related to horizontal scrolling, and admit I don't entirely understand them. For instance, the animated gif at https://github.com/simonw/datasette/issues/998#issuecomment-714117534 confuses me.
) ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1298/reactions"", ""total_count"": 4, ""+1"": 4, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 855451460,MDU6SXNzdWU4NTU0NTE0NjA=,1297,"Documentation: json1, and introspection endpoints",192568,mroswell,open,0,,,,,0,2021-04-12T00:38:00Z,2021-04-12T01:29:33Z,,CONTRIBUTOR,,"https://docs.datasette.io/en/stable/facets.html notes that: > If your SQLite installation provides the json1 extension (you can check using /-/versions) Datasette will automatically detect columns that contain JSON arrays... When I check /-/versions I see two sections relevant to json1: ``` ""extensions"": { ""json1"": null }, ""compile_options"": [ ... ""ENABLE_JSON1"", ``` The ENABLE_JSON1 makes me think json1 is likely available. But the `""json1"": null` made me think it wasn't available (because of the `null`). It would help if the documentation provided clarity about how to know if json1 is installed. It would also be helpful if the `/-/versions` information signalled somehow that it is to be appended to the hostname or domain name (or whatever you want to call it), or simply showed it as `example.com/-/versions` instead of `/-/versions`. Likewise on that last point, for https://docs.datasette.io/en/stable/introspection.html#introspection , it would help if at some point that page detailed where those introspection endpoints go. (Sometimes documentation can be so abbreviated that it's hard for new users to figure out what's going on.) ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1297/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 855296937,MDU6SXNzdWU4NTUyOTY5Mzc=,1295,Errors should have links to further information,9599,simonw,open,0,,,,,2,2021-04-11T12:39:12Z,2022-12-14T23:28:49Z,,OWNER,,"Inspired by this tweet: https://twitter.com/willmcgugan/status/1381186384510255104 > While I am thinking about faqs. I’d also like to add short URLs to Rich exceptions. > > I loath cryptic error messages, and I’ve created a fair few myself. In Rich I’ve tried to make them as plain English as possible. But... > > would be great if every error message linked to a page that explains the error in detail and offers fixes.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1295/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 853672224,MDU6SXNzdWU4NTM2NzIyMjQ=,1294,"""You can check out any time you like. But you can never leave!""",192568,mroswell,open,0,,,,,0,2021-04-08T17:02:15Z,2021-04-08T18:35:50Z,,CONTRIBUTOR,,"(Feel free to rename this one.) - The column gear lets you ""Show not-blank rows."" Then it places a parameter in the URL, which a web developer would notice, but a lot of users won't notice, or know to delete it. Would be good to toggle ""Show not-blank rows"" with ""Show all rows."" (Also would be quite helpful to have a ""Show blank rows | Show all rows"" option) - The column gear lets you ""Sort ascending"" and ""Sort descending"" but then you're stuck with some sort of sorted version thereafter, unless you know to sort the ID column, or to remove the full _sort parameter and its value in the URL. Would be good to offer a ""Remove sort"" option in the gear.
- These requests are in the same camp as: https://github.com/simonw/datasette-vega/issues/36 - I suspect there are other URL parameter instances where similar analysis would be helpful, but the three above are the use cases I've run across. UPDATE: - It would be helpful to have a ""Previous page"" available for all but the first table page.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1294/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 849978964,MDU6SXNzdWU4NDk5Nzg5NjQ=,1293,Show column metadata plus links for foreign keys on arbitrary query results,9599,simonw,open,0,,,,,51,2021-04-04T22:59:42Z,2022-09-02T17:34:09Z,,OWNER,,"Related to #620. It would be _really_ cool if Datasette could magically detect the source of the data displayed in an arbitrary query and, if that data represents a foreign key, display it as a hyperlink. Compare https://latest.datasette.io/fixtures/facetable to https://latest.datasette.io/fixtures?sql=select+pk%2C+created%2C+planet_int%2C+on_earth%2C+state%2C+city_id%2C+neighborhood%2C+tags%2C+complex_array%2C+distinct_some_null+from+facetable+order+by+pk+limit+101 ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1293/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,reopened 849975810,MDU6SXNzdWU4NDk5NzU4MTA=,1292,Research ctypes.util.find_library('spatialite'),9599,simonw,open,0,,,,,1,2021-04-04T22:36:59Z,2022-01-20T21:28:50Z,,OWNER,,"Spotted this in the Django SpatiaLite backend: https://github.com/django/django/blob/8f6a7a0e9e7c5404af6520ae606927e32415eb00/django/contrib/gis/db/backends/spatialite/base.py#L24-L36 ```python ctypes.util.find_library('spatialite') ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1292/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 849543502,MDU6SXNzdWU4NDk1NDM1MDI=,1289,Speed up tests with pytest-xdist,9599,simonw,closed,0,,,,,3,2021-04-03T00:47:39Z,2021-04-03T03:42:28Z,2021-04-03T03:42:28Z,OWNER,,"I think I can get this working for almost every test, then use the pattern in https://github.com/pytest-dev/pytest-xdist/issues/385#issuecomment-444545641 to opt specific tests out of being run in parallel.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1289/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 849512840,MDU6SXNzdWU4NDk1MTI4NDA=,1288,Facets: show counts for null,1111743,jungle-boogie,open,0,,,,,0,2021-04-02T22:33:44Z,2021-04-02T22:33:44Z,,NONE,,"Hi, Thank you for Datasette and being a fan of SQLite! Not all rows in a record will always contain data. So when using a facet on a column where some records have data and others don't, you don't get an accurate count of the results.
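For what it's worth, the underlying SQL already produces a null bucket; here is a quick sketch against a made-up table:
```python
# sketch with a hypothetical table: SQLite's GROUP BY already
# collects NULLs into their own group, so the count Datasette needs exists
import sqlite3

conn = sqlite3.connect("":memory:"")
conn.execute(""create table plants (name text, safer_or_toxic text)"")
conn.executemany(
    ""insert into plants values (?, ?)"",
    [(""a"", ""Safer""), (""b"", ""Toxic""), (""c"", None), (""d"", None)],
)
print(conn.execute(
    ""select safer_or_toxic, count(*) from plants""
    "" group by safer_or_toxic order by safer_or_toxic""
).fetchall())
# [(None, 2), ('Safer', 1), ('Toxic', 1)]
```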
Please consider also counting and showing null records with facets.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1288/reactions"", ""total_count"": 2, ""+1"": 2, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 849396758,MDU6SXNzdWU4NDkzOTY3NTg=,1287,Upgrade to Python 3.9.4,9599,simonw,open,0,,,,,5,2021-04-02T18:43:15Z,2021-04-03T22:38:39Z,,OWNER,,Has some security fixes https://pythoninsider.blogspot.com/2021/04/python-393-and-389-are-now-available.html,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1287/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 849220154,MDU6SXNzdWU4NDkyMjAxNTQ=,1286,Better default display of arrays of items,192568,mroswell,open,0,,,,,5,2021-04-02T13:31:40Z,2021-06-12T12:36:15Z,,CONTRIBUTOR,,"Would be great to have template filters that convert array fields to bullets and/or delimited lists upon table display: ``` |to_bullets |to_comma_delimited |to_semicolon_delimited ``` or maybe: ``` |join_array(""bullet"") |join_array(""bullet"",""square"") |join_array("";"") |join_array("","") ``` Keeping in mind that bullets show up in html as \ while other delimiting characters appear after the value. Of course, the fields themselves would remain as facetable arrays. ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1286/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 847700726,MDU6SXNzdWU4NDc3MDA3MjY=,1285,Feature Request or Plugin Request: Numeric Range Facets,192568,mroswell,open,0,,,,,0,2021-04-01T01:50:20Z,2021-04-01T02:28:19Z,,CONTRIBUTOR,,"It would be great to offer facets for numeric data ranges. The ranges could pull from typical GIS methods of creating choropleth maps. https://gisgeography.com/choropleth-maps-data-classification/ Of the following, for mapping, I've always preferred a Jenks Natural Breaks, or a cross between Jenks and Pretty breaks. - Equal Intervals - Quantile (equal count) - Standard Deviation - Natural Breaks (Jenks) Classification - Pretty Breaks - Some sort of Aggregate Jenks Classification (this isn't standard, but it would be nice to be able to set classification ranges that work across tables.) Here are some links for Natural Breaks, in case this method is unfamiliar. - https://en.wikipedia.org/wiki/Jenks_natural_breaks_optimization - http://wiki.gis.com/wiki/index.php/Jenks_Natural_Breaks_Classification - https://medium.com/analytics-vidhya/jenks-natural-breaks-best-range-finder-algorithm-8d1907192051 Per that last link, there is a Jenks Python module... They also describe it as data-intensive for larger datasets. Maybe this is a good plugin idea. An example of equal intervals would be:
0 – < 10
10 – < 20
20 – < 30
30 – < 40
It's kind of confusing to have that less-than sign in there. It could also be displayed as:
0 – 10
10 – 20
20 – 30
30 – 40
But then it's not completely clear which category 10 is in, for instance. (Best to right-justify, and use an ""en dash"" between numbers.)
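To make the simplest case concrete, here is a rough sketch of computing equal-interval breaks (a real plugin would presumably swap in Jenks or another classification scheme here):
```python
# rough sketch: equal-interval class breaks for a numeric column
def equal_interval_breaks(values, classes=4):
    lo, hi = min(values), max(values)
    width = (hi - lo) / classes
    return [lo + width * i for i in range(classes + 1)]

print(equal_interval_breaks([0, 3, 12, 19, 25, 38]))
# [0.0, 9.5, 19.0, 28.5, 38.0]
```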
",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1285/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 847423559,MDU6SXNzdWU4NDc0MjM1NTk=,253,fixtures.db example error in sql-utils blog post,192568,mroswell,closed,0,,,,,2,2021-03-31T22:07:36Z,2021-05-19T03:31:48Z,2021-05-19T03:31:47Z,NONE,,"En route to trying to understand column order transform documentation, I tried the instructions here: https://simonwillison.net/2020/Sep/23/sqlite-advanced-alter-table/ I get a malformed database schema syntax error. ``` $ wget https://latest.datasette.io/fixtures.db --2021-03-31 18:00:23-- https://latest.datasette.io/fixtures.db Resolving latest.datasette.io (latest.datasette.io)... 2607:f8b0:4004:801::2013, 142.250.73.211 Connecting to latest.datasette.io (latest.datasette.io)|2607:f8b0:4004:801::2013|:443... connected. HTTP request sent, awaiting response... 200 OK Length: unspecified [application/octet-stream] Saving to: ‘fixtures.db’ fixtures.db [ <=> ] 260.00K --.-KB/s in 0.1s 2021-03-31 18:00:23 (2.41 MB/s) - ‘fixtures.db’ saved [266240] $ sqlite3 fixtures.db '.schema facetable' Error: malformed database schema (generated_columns) - near ""AS"": syntax error $ sqlite3 fixtures.db SQLite version 3.28.0 2019-04-15 14:49:49 Enter "".help"" for usage hints. sqlite> .schema Error: malformed database schema (generated_columns) - near ""AS"": syntax error ``` ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/253/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 845794436,MDU6SXNzdWU4NDU3OTQ0MzY=,1284,Feature or Documentation Request: Individual table as home page template,192568,mroswell,open,0,,,,,4,2021-03-31T03:56:17Z,2021-11-04T03:15:01Z,,CONTRIBUTOR,,"It would be great to have a sample showing how to move a single database that has a single table, to the index page. I'm trying it now, and find there is a real depth of Datasette and Python understanding that's required to be successful. I've got all the basic jinja concepts down... variables, template control structures, template inheritance, template overrides, css, html, the --template-dir and --static arguments, etc. But copying the table.html file to index.html doesn't work. There are undocumented functions and filters... I can figure some of them out (yay, url_builder.py and utils/__init__.py!) but it's a slog better handled by a much stronger Python developer. One sample would make a world of difference. The ideal form of this documentation would be a diff between the default table.html and how that would look if essentially moved to index.html. The use case is for everyone who wants to create a public-facing website to explore a single table at the root directory. (Maybe a second bit of documentation for people who have a single database with multiple tables.) (Hmm... might be cool to have a setting for that, where it happens automagically! If only one table, then home page is at the table level. if only one database, then home page is at the database level.... as an option.) I suppose I could ignore this, and somehow do this in the DNS settings once I hook up Vercel to a domain name, maybe.. and remove the breadcrumbs in table.html... but for now, a documentation request in the form of a diff... 
for viewing a single table (or a single database) at the root. (Actually, there's probably room for a whole expanded section on templates. Noticed some nice table metadata in one of the datasette examples, for instance... Hmm... maybe a whole library of solutions in one place... maybe a documentation hackathon! If that's of interest, of course it's a separate issue.) ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1284/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 843884745,MDU6SXNzdWU4NDM4ODQ3NDU=,1283,advanced #export causes unexpected scrolling,192568,mroswell,open,0,,,,,0,2021-03-29T22:46:57Z,2021-03-29T22:46:57Z,,CONTRIBUTOR,,"1. Visit a datasette table page 2. Click on the ""(advanced)"" link. This adds a fragment identifier ""#export"" to the URL, and scrolls down to the ""Advanced export"" div with the ""export"" id. 3. Manually scroll back up, and click on a suggested facet. The fragment identifier is still present, and the app scrolls back down to the ""Advanced export"" div. I think this is unwanted behavior. The user remedy seems to be to manually remove the ""#export"" from the URL. This behavior happens in my project, and in: https://covid-19.datasettes.com/covid/economist_excess_deaths (for instance) but not in this table: https://global-power-plants.datasettes.com/global-power-plants/global-power-plants",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1283/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 842881221,MDU6SXNzdWU4NDI4ODEyMjE=,1281,Latest Datasette tags missing from Docker Hub,9599,simonw,closed,0,,,,,7,2021-03-29T00:58:30Z,2021-03-29T01:41:48Z,2021-03-29T01:41:48Z,OWNER,,"Spotted this while testing https://github.com/simonw/datasette/issues/1249#issuecomment-808998719 https://hub.docker.com/r/datasetteproject/datasette/tags?page=1&ordering=last_updated isn't showing the tags for any version more recent than 0.54.1 - we are up to 0.56 now. But the `:latest` tag is for the new 0.56 release.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1281/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 842862708,MDU6SXNzdWU4NDI4NjI3MDg=,1280,Ability to run CI against multiple SQLite versions,9599,simonw,open,0,,,,,2,2021-03-28T23:54:50Z,2021-05-10T19:07:46Z,,OWNER,,"Issue #1276 happened because I didn't run tests against a SQLite version prior to 3.16.0 (released 2017-01-02). Glitch is a deployment target and runs SQLite 3.11.0 from 2016-02-15. If CI ran against that version of SQLite this bug could have been avoided.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1280/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 842695374,MDU6SXNzdWU4NDI2OTUzNzQ=,35,Support annotating photos on OSes other than macOS,1151557,ligurio,open,0,,,,,1,2021-03-28T09:01:25Z,2021-04-05T07:37:57Z,,NONE,,"dogsheep-photos allows you to annotate photos using Apple Photos' db. It would be nice to have such an ability on other OSes too.
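For the Google Vision API route, a rough sketch (the `google-cloud-vision` client usage is an assumption on my part, and wiring the labels into the dogsheep-photos tables is left out):
```python
# rough sketch: label a photo with the Google Cloud Vision API
from google.cloud import vision

def label_photo(path):
    client = vision.ImageAnnotatorClient()
    with open(path, ""rb"") as f:
        image = vision.Image(content=f.read())
    response = client.label_detection(image=image)
    return [(a.description, a.score) for a in response.label_annotations]
```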
For example, using a trained local model, or using the Google Vision API as sketched above (see #14).",256834907,dogsheep-photos,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-photos/issues/35/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,