id,node_id,number,title,user,user_label,state,locked,assignee,assignee_label,milestone,milestone_label,comments,created_at,updated_at,closed_at,author_association,pull_request,body,repo,repo_label,type,active_lock_reason,performed_via_github_app,reactions,draft,state_reason 1570375808,I_kwDODFdgUs5dmgiA,79,Deploy demo job is failing due to rate limit,9599,simonw,open,0,,,,,2,2023-02-03T20:05:01Z,2023-12-08T14:50:15Z,,MEMBER,,https://github.com/dogsheep/github-to-sqlite/actions/runs/4080058087/jobs/7032116511,207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/79/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1066474200,I_kwDOCGYnMM4_kRrY,344,Support STRICT tables,9599,simonw,closed,0,,,,,14,2021-11-29T20:32:23Z,2023-12-08T05:22:39Z,2023-12-08T05:22:39Z,OWNER,,"New in SQLite 3.37.0, released a few days ago: https://www.sqlite.org/stricttables.html",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/344/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1988525411,I_kwDOCGYnMM52hn1j,603,Pyhton 3.12 Bug report,1324252,constantinedev,open,0,,,,,1,2023-11-10T22:57:48Z,2023-12-08T05:10:31Z,,NONE,,"I start with new python3 verison 3.12.0 Also have the error where connect DataBase ``` Traceback (most recent call last): File ""/home/t/Development/python/FKPJ/ClinicSYS/run.py"", line 1, in import re, os, io, json, sqlite_utils, requests, pytz, logging File ""/home/t/.local/lib/python3.12/site-packages/sqlite_utils/__init__.py"", line 1, in from .db import Database File ""/home/t/.local/lib/python3.12/site-packages/sqlite_utils/db.py"", line 277, in class Database: File ""/home/t/.local/lib/python3.12/site-packages/sqlite_utils/db.py"", line 306, in Database filename_or_conn: Optional[Union[str, pathlib.Path, sqlite3.Connection]] = None, ^^^^^^^^^^^^^^^^^^ ``` This bug come from `sqlite-utils` since's v3.33. Anyone get the same ? As well now of the resolved plan just keep the sqlite-utils version in python3.12 with v3.32.1 [tested] but where are the sqlite3.Connection problem.... This won't happen on python version down to 3.11[tested] Just the python3.12.0, I have test this error are come from the sqlite3 connection The error say from `sqlite_utils` and with the sqlite3 Connection, what can I do. Let fix together.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/603/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 2007893839,I_kwDOCGYnMM53rgdP,605,Insert fails with `Error: Python int too large to convert to SQLite INTEGER`; can we use `NUMERIC` here?,12229877,Zac-HD,closed,0,,,,,1,2023-11-23T10:19:46Z,2023-12-08T05:07:54Z,2023-12-08T05:07:54Z,NONE,,"I'm currently working on a new feature for Hypothesis, where we can dump a tidy jsonlines table of all the test cases we tried - including arguments, outcomes, timings, coverage, etc. Exploring this seems like a perfect cases for `sqlite-utils` and `datasette`, but I pretty quickly ran into an integer overflow problem and don't want to recommend that experience to my users. I originally went to report this as a bug... 
and then found https://github.com/simonw/sqlite-utils/issues/309#issuecomment-895581038 almost exactly matched my repro 😅 https://github.com/simonw/sqlite-utils/issues/110#issuecomment-626391063 suggests that using `NUMERIC` would avoid this overflow error, although ""If the TEXT value is a well-formed integer literal that is too large to fit in a 64-bit signed integer, it is converted to REAL."" suggests that this would come at the cost of rounding to the nearest float value. Maybe I should just convert large integers to float before writing out my json? After a bit more hacking, ""manually cast large integers to float"" seems like a decent solution for my particular case, but having written it up I thought I might as well post this issue anyway - I hope it's useful feedback, and won't mind at all if you close as wontfix if it's not.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/605/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 2029908157,I_kwDOBm6k_c54_fC9,2214,CSV export fails for some `text` foreign key references,2874,precipice,open,0,,,,,1,2023-12-07T05:04:34Z,2023-12-07T07:36:34Z,,NONE,,"I'm starting this issue without a clear reproduction in case someone else has seen this behavior, and to use the issue as a notebook for research. I'm using Datasette with the [SWITRS](https://iswitrs.chp.ca.gov/) data set, which is a California Highway Patrol collection of traffic incident data from the past decade or so. I receive data from them in CSV and want to work with it in Datasette, then export it to CSV for mapping in Felt.com. Their data makes extensive use of codes for incident column data (`1` for `Monday` and so on), some of it integer codes and some of it letter/text codes. The text codes are sometimes blank or `-`. During import, I'm creating lookup tables for foreign key references to make the Datasette UI presentation of the data easier to read. If I import the data and set up the integer foreign keys, everything works fine, but if I set up the text foreign keys, CSV export starts to fail. The foreign key configuration is as follows: ``` # Some tables use integer ids, like sensible tables do. Let's import them first # since we favor them. for TABLE in DAY_OF_WEEK CHP_SHIFT POPULATION SPECIAL_COND BEAT_TYPE COLLISION_SEVERITY do sqlite-utils create-table records.db $TABLE id integer name text --pk=id sqlite-utils insert records.db $TABLE lookup-tables/$TABLE.csv --csv sqlite-utils add-foreign-key records.db collisions $TABLE $TABLE id sqlite-utils create-index records.db collisions $TABLE done # *Other* tables use letter keys, like they were raised by WOLVES. Let's put them # at the end of the import queue. 
for TABLE in WEATHER_1 WEATHER_2 LOCATION_TYPE RAMP_INTERSECTION SIDE_OF_HWY \ PRIMARY_COLL_FACTOR PCF_CODE_OF_VIOL PCF_VIOL_CATEGORY TYPE_OF_COLLISION MVIW \ PED_ACTION ROAD_SURFACE ROAD_COND_1 ROAD_COND_2 LIGHTING CONTROL_DEVICE \ STWD_VEHTYPE_AT_FAULT CHP_VEHTYPE_AT_FAULT PRIMARY_RAMP SECONDARY_RAMP do sqlite-utils create-table records.db $TABLE key text name text --pk=key sqlite-utils insert records.db $TABLE lookup-tables/$TABLE.csv --csv sqlite-utils add-foreign-key records.db collisions $TABLE $TABLE key sqlite-utils create-index records.db collisions $TABLE done ``` You can see the full code and import script here: https://github.com/radical-bike-lobby/switrs-db If I run this code and then hit the CSV export link in the Datasette interface (the simple link or the ""advanced"" dialog), export fails after a small number of CSV rows are written. I am not seeing any detailed error messages but this appears in the logging output: ``` INFO: 127.0.0.1:57885 - ""GET /records/collisions.csv?_facet=PRIMARY_RD&PRIMARY_RD=ASHBY+AV&_labels=on&_size=max HTTP/1.1"" 200 OK Caught this error: ``` (No other output follows `error:` other than a blank line.) I've stared at the rows directly after the error occurs and can't yet see what is causing the problem. I'm going to set up a development environment and see if I get any more detailed error output, and then stare more at some problematic lines to see if I can get a simple reproduction.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2214/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 2029161033,I_kwDOCGYnMM548opJ,606,str and int as aliases for text and integer,9599,simonw,closed,0,,,,,2,2023-12-06T18:35:49Z,2023-12-06T19:44:04Z,2023-12-06T18:49:32Z,OWNER,,"I keep making this mistake: ```bash sqlite-utils add-column content.db assets _since int ``` ``` Usage: sqlite-utils add-column [OPTIONS] PATH TABLE COL_NAME [[integer|float|b lob|text|INTEGER|FLOAT|BLOB|TEXT]] Try 'sqlite-utils add-column -h' for help. Error: Invalid value for '[[integer|float|blob|text|INTEGER|FLOAT|BLOB|TEXT]]': 'int' is not one of 'integer', 'float', 'blob', 'text', 'INTEGER', 'FLOAT', 'BLOB', 'TEXT'. ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/606/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 2028698018,I_kwDOBm6k_c5463mi,2213,feature request: gzip compression of database downloads,536941,fgregg,open,0,,,,,1,2023-12-06T14:35:03Z,2023-12-06T15:05:46Z,,CONTRIBUTOR,,"At the bottom of database pages, datasette gives users the opportunity to download the underlying sqlite database. It would be great if that could be served gzip compressed. 
this is similar to #1213, but for me, i don't need datasette to compress html and json because my CDN layer does it for me, however, cloudflare at least, will not compress a mimetype of ""application"" (see list of mimetype: https://developers.cloudflare.com/speed/optimization/content/brotli/content-compression/)",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2213/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 2023057255,I_kwDOBm6k_c54lWdn,2212,Can't filter with numbers,605070,fzakaria,open,0,,,,,0,2023-12-04T05:26:29Z,2023-12-04T05:26:29Z,,NONE,,"I have a schema that uses numbers for a column (actually it's a boolean 1 or 0 but SQLite doesn't have Boolean). I can't seem to get the facet to work or even filtering on this column. My guess is that Datasette is ""stringifying"" the number and it's not matching? Example: https://debian-sqlelf.fly.dev/debian/elf_symbols?_sort_desc=name&_facet=exported&exported=0",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2212/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 2019811176,I_kwDOBm6k_c54Y99o,2211,Unreachable exception handlers for `sqlite3.OperationalError`,1214074,mattparmett,open,0,,,,,0,2023-12-01T00:50:22Z,2023-12-01T00:50:22Z,,NONE,,"There are several places where `sqlite3.OperationalError` is caught as part of an exception handler which catches multiple exceptions, but is then caught again immediately afterwards by a dedicated exception handler. Because the exception will be caught by the first handler, the logic in the second handler is unreachable and will never be executed. If this is intended behavior, the second handler can be removed. If this is not intended, and the second handler should be the one that catches this exception, then `sqlite3.OperationalError` should be removed from the tuple of exceptions in the first handler. This issue was found via a CodeQL query on the repository, and I've listed the occurrences found by the query below. There may be other instances of this issue in the code that were not surfaced by the query. I'd be happy to share the query if others would like to view or run it. One example: https://github.com/simonw/datasette/blob/452a587e236ef642cbc6ae345b58767ea8420cb5/datasette/views/database.py#L534-L537 Other instances: https://github.com/simonw/datasette/blob/main/datasette/views/base.py#L266-L270 https://github.com/simonw/datasette/blob/main/datasette/views/base.py#L452-L456",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2211/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 564833696,MDU6SXNzdWU1NjQ4MzM2OTY=,670,Prototoype for Datasette on PostgreSQL,9599,simonw,open,0,,,,,15,2020-02-13T17:17:55Z,2023-11-17T15:32:21Z,,OWNER,,"I thought this would never happen, but now that I'm deep in the weeds of running SQLite in production for Datasette Cloud I'm starting to reconsider my policy of only supporting SQLite. Some of the factors making me think PostgreSQL support could be worth the effort: - Serverless. I'm getting increasingly excited about writable-database use-cases for Datasette. 
If it could talk to PostgreSQL then users could easily deploy it on Heroku or other serverless providers that can talk to a managed RDS-style PostgreSQL. - Existing databases. Plenty of organizations have PostgreSQL databases. They can export to SQLite using [db-to-sqlite](https://github.com/simonw/db-to-sqlite) but that's a pretty big barrier to getting started - being able to run `datasette postgresql://connection-string` and start trying it out would be a massively better experience. - Data size. I keep running into use-cases where I want to run Datasette against many GBs of data. SQLite can do this but PostgreSQL is much more optimized for large data, especially given the existence of tools like Citus. - Marketing. Convincing people to trust their data to SQLite is potentially a big barrier to adoption. Even if I've convinced myself it's trustworthy I still have to convince everyone else. - It might not be that hard? If this required a ground-up rewrite it wouldn't be worth the effort, but I have a hunch that it may not be too hard - most of the SQL in Datasette should work on both databases since it's almost all portable SELECT statements. If Datasette did DML this would be a lot harder, but it doesn't. - Plugins! This feels like a natural surface for a plugin - at which point people could add MySQL support and suchlike in the future. The above reasons feel strong enough to justify a prototype.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/670/reactions"", ""total_count"": 19, ""+1"": 14, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 5, ""rocket"": 0, ""eyes"": 0}",, 1994857251,I_kwDOBm6k_c525xsj,2208,No suggested facets when a column named 'value' is included,198537,rgieseke,open,0,,,,,1,2023-11-15T14:11:17Z,2023-11-15T14:18:59Z,,CONTRIBUTOR,,"When a column named 'value' is included there are no suggested facets is shown as the query uses an alias of 'value'. https://github.com/simonw/datasette/blob/452a587e236ef642cbc6ae345b58767ea8420cb5/datasette/facets.py#L168-L174 Currently the following is shown (from https://latest.datasette.io/fixtures/facetable) ![image](https://github.com/simonw/datasette/assets/198537/a919509a-ea88-461b-b25b-8b776720c7c5) When I add a column named 'value' only the JSON facets are processed. ![image](https://github.com/simonw/datasette/assets/198537/092bd0b3-4c20-434e-88f8-47e2b8994a1d) I think that not using aliases could be a solution (except if someone wants to use a column named `count(*)` though this seems to be unlikely). I'll open a PR with that. There is also a TODO with a similar question in the same file. I have not looked into that yet. 
https://github.com/simonw/datasette/blob/452a587e236ef642cbc6ae345b58767ea8420cb5/datasette/facets.py#L512",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2208/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1994845152,I_kwDOBm6k_c525uvg,2207,ModuleNotFoundError: No module named 'click_default_group,283441,honzajavorek,open,0,,,,,0,2023-11-15T14:04:32Z,2023-11-15T14:04:32Z,,NONE,,"No matter what I do, I'm getting this error: ``` $ datasette Traceback (most recent call last): File ""/Users/honza/Library/Caches/pypoetry/virtualenvs/juniorguru-Lgaxwd2n-py3.11/bin/datasette"", line 5, in from datasette.cli import cli File ""/Users/honza/Library/Caches/pypoetry/virtualenvs/juniorguru-Lgaxwd2n-py3.11/lib/python3.11/site-packages/datasette/cli.py"", line 6, in from click_default_group import DefaultGroup ModuleNotFoundError: No module named 'click_default_group' ``` I have datasette in my dependencies like this: ```toml [tool.poetry.group.dev.dependencies] datasette = {version = ""1.0a7"", allow-prereleases = true} ``` I had the latest regular version (not pre-release) there originally, but the result was the same: ```toml [tool.poetry.group.dev.dependencies] datasette = ""0.64.5"" ``` Full pyproject.toml is at https://github.com/honzajavorek/junior.guru/ Previously datasette worked for me, but I guess something had to upgrade and now I can't even launch it.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2207/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1978603203,I_kwDOCGYnMM517xbD,602,`sqlite-utils transform` removes the `AUTOINCREMENT` keyword,4472046,ArsTapatun,open,0,,,,,0,2023-11-06T08:48:43Z,2023-11-06T08:48:43Z,,NONE,,"### Context We ran into this bug randomly, noticing that deleted `ROWID` would get reused after migrating the DB. Using `transform` to change any column in the table will also unexpectedly strip away the `AUTOINCREMENT` keyword from the primary key definition, even if it was not the transformation target. ### Reproducible example **Original database** ```sql $ sqlite3 test.db << EOF CREATE TABLE mytable ( col1 INTEGER PRIMARY KEY AUTOINCREMENT, col2 TEXT NOT NULL ) EOF $ sqlite3 test.db "".schema mytable"" CREATE TABLE mytable ( col1 INTEGER PRIMARY KEY AUTOINCREMENT, col2 TEXT NOT NULL ); ``` **Modified database after sqlite-utils** ```sql $ sqlite-utils transform test.db mytable --rename col2 renamedcol2 $ sqlite3 test.db ""SELECT sql FROM sqlite_master WHERE name = 'mytable';"" CREATE TABLE IF NOT EXISTS ""mytable"" ( [col1] INTEGER PRIMARY KEY, [renamedcol2] TEXT NOT NULL ); ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/602/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1978023780,I_kwDOBm6k_c515j9k,2205,request.post_vars() method obliterates form keys with multiple values,9599,simonw,open,0,,,8755003,Datasette 1.0a-next,3,2023-11-05T23:25:08Z,2023-11-06T04:10:34Z,,OWNER,,"https://github.com/simonw/datasette/blob/452a587e236ef642cbc6ae345b58767ea8420cb5/datasette/utils/asgi.py#L137-L139 In GET requests you can do `?foo=1&foo=2` - you can do the same in POST requests, but the `dict()` call here eliminates those duplicates. 
You can't even try calling `post_body()` and implement your own custom parsing because of: - #2204",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2205/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1978022687,I_kwDOBm6k_c515jsf,2204,request.post_body() can only be called once,9599,simonw,open,0,,,,,0,2023-11-05T23:22:03Z,2023-11-05T23:23:23Z,,OWNER,,"This code here: https://github.com/simonw/datasette/blob/452a587e236ef642cbc6ae345b58767ea8420cb5/datasette/utils/asgi.py#L127-L135 It consumes the messages, which means if you try to call it a second time you won't be able to get at the body. This is efficient - we don't end up with a `request` object property with potentially megabytes of content that we never look at again - but it's inconvenient for cases like middleware or functions where we don't know if the body has been consumed yet or not. Potential solution: set `request._body` the first time it is called, and return that on subsequent calls. Potential optimization: only do this for bodies that are shorter than a certain threshold - maybe 1MB - and raise an exception if you attempt to call `post_body()` multiple times against one of those larger bodies. I'm a bit nervous about that option though, since it could result in errors that don't show up in testing but do show up in production.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2204/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 959137143,MDU6SXNzdWU5NTkxMzcxNDM=,1415,feature request: document minimum permissions for service account for cloudrun,536941,fgregg,open,0,,,,,4,2021-08-03T13:48:43Z,2023-11-05T16:46:59Z,,CONTRIBUTOR,,"Thanks again for such a powerful project. For deploying to cloudrun from github actions, I'd like to create a service account with minimal permissions. It would be great to document what those minimum permission that need to be set in the IAM. ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1415/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1977726056,I_kwDOBm6k_c514bRo,2203,custom plugin not seen as sql function,7113541,LyzardKing,open,0,,,,,0,2023-11-05T10:30:19Z,2023-11-05T10:30:19Z,,NONE,,"Hi, I'm not sure if this is the right repo for this issue. I'm using datasette with the parquet (to read a duckdb), and jellyfish plugins. Both work perfectly. Now I need to create a simple plugin that uses the python rouge package and returns a similarity score (similarly to how the jellyfish plugin works). 
If I create a custom plugin, even the example hello_world one, copied directly from the tutorial, I get the following error: ```duckdb.duckdb.CatalogException: Catalog Error: Scalar Function with name hello_world does not exist!``` Since the jellyfish plugin doesn't do anything more complex, I'm wondering if there is some other kind of issue with my setup.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2203/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1977155641,I_kwDOCGYnMM512QA5,601,Move plugin directory into documentation,9599,simonw,open,0,,,,,0,2023-11-04T04:07:52Z,2023-11-04T04:07:52Z,,OWNER,,"https://github.com/simonw/sqlite-utils-plugins should be in the official documentation. I can use the same pattern as https://llm.datasette.io/en/stable/plugins/directory.html https://til.simonwillison.net/readthedocs/stable-docs",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/601/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1976986318,I_kwDOCGYnMM511mrO,599,Cannot find spatialite on arm64 linux,37802088,MikeCoats,closed,0,,,,,1,2023-11-03T22:05:51Z,2023-11-04T01:06:31Z,2023-11-04T00:33:28Z,CONTRIBUTOR,,"Initially, I found an issue in `datasette` where it wouldn’t find `spatialite` when running on my Radxa Rock 5B - an RK3588 powered SBC, running the arm64 build of Debian Bullseye. I confirmed the same behaviour on my Raspberry Pi 4 - a BCM2711 powered SBC, running the arm64 build of Debian Bookworm. ``` $ datasette --load-extension=spatialite example.db Error: Could not find SpatiaLite extension ``` I did some digging and realised the issue originates in this project. Even with the `libsqlite3-mod-spatialite` package installed, `pytest` skips all of the GIS tests in the project. ``` $ apt list --installed | grep spatial […] libsqlite3-mod-spatialite/stable,now 5.0.1-3 arm64 [installed] $ ls -l /usr/lib/*/*spatial* lrwxrwxrwx 1 root root 23 Dec 1 2022 /usr/lib/aarch64-linux-gnu/mod_spatialite.so -> mod_spatialite.so.7.1.0 lrwxrwxrwx 1 root root 23 Dec 1 2022 /usr/lib/aarch64-linux-gnu/mod_spatialite.so.7 -> mod_spatialite.so.7.1.0 -rw-r--r-- 1 root root 7348584 Dec 1 2022 /usr/lib/aarch64-linux-gnu/mod_spatialite.so.7.1.0 ``` ``` $ pytest tests/test_get.py ...... [ 73%] tests/test_gis.py ssssssssssss [ 75%] tests/test_hypothesis.py .... [ 75%] ``` I tracked the issue down to the [`find_sqlite()` function in the `utils.py`](https://github.com/simonw/sqlite-utils/blob/622c3a5a7dd53a09c029e2af40c2643fe7579340/sqlite_utils/utils.py#L60) file. The [`SPATIALITE_PATHS`](https://github.com/simonw/sqlite-utils/blob/main/sqlite_utils/utils.py#L34-L39) array doesn’t have an entry for the location of this module on arm64 linux. 
",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/599/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1553425465,I_kwDOCGYnMM5cl2Q5,522,Add COLUMN_TYPE_MAPPING for timedelta,81377,maport,closed,0,,,,,0,2023-01-23T16:49:54Z,2023-11-04T00:49:51Z,2023-11-04T00:49:51Z,NONE,,"Currently trying to create a column with Python type `datetime.timedelta` results in an error: ``` >>> from sqlite_utils import Database >>> db = Database(""test.db"") >>> test_tbl = db['test'] >>> test_tbl.insert({'col1': datetime.timedelta()}) Traceback (most recent call last): File """", line 1, in File ""/usr/local/lib/python3.10/dist-packages/sqlite_utils/db.py"", line 2979, in insert return self.insert_all( File ""/usr/local/lib/python3.10/dist-packages/sqlite_utils/db.py"", line 3082, in insert_all self.create( File ""/usr/local/lib/python3.10/dist-packages/sqlite_utils/db.py"", line 1574, in create self.db.create_table( File ""/usr/local/lib/python3.10/dist-packages/sqlite_utils/db.py"", line 961, in create_table sql = self.create_table_sql( File ""/usr/local/lib/python3.10/dist-packages/sqlite_utils/db.py"", line 852, in create_table_sql column_type=COLUMN_TYPE_MAPPING[column_type], KeyError: ``` The reason this would be useful is that `MySQLdb` uses `timedelta` for MySQL `TIME` columns: ``` >>> import MySQLdb >>> conn = MySQLdb.connect(host='database', user='user', passwd='pw') >>> csr = conn.cursor() >>> csr.execute(""SELECT CAST('11:20' AS TIME)"") >>> tuple(csr) ((datetime.timedelta(seconds=40800),),) ``` So currently any attempt to convert a MySQL DB with a `TIME` column using `db-to-sqlite` will result in the above error. I was rather surprised that `MySQLdb` uses `timedelta` for `TIME` columns but I see that [this column type](https://dev.mysql.com/doc/refman/8.0/en/time.html) is intended for time intervals as well as the time of day so it makes sense. ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/522/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1239034903,I_kwDOCGYnMM5J2iwX,433,CLI eats my cursor,7908073,chapmanjacobd,closed,0,,,,,10,2022-05-17T18:52:52Z,2023-11-04T00:46:30Z,2023-11-04T00:46:30Z,CONTRIBUTOR,,"I'm not sure why this happens but `sqlite-utils` makes my terminal cursor disappear after running commands like `sqlite-utils insert`. I've only noticed this behavior in `sqlite-utils`, not in any other CLI tools I can still type commands after it runs but the text cursor is invisible",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/433/reactions"", ""total_count"": 5, ""+1"": 5, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 684961449,MDU6SXNzdWU2ODQ5NjE0NDk=,949,Try out CodeMirror SQL hints,9599,simonw,closed,0,,,,,5,2020-08-24T20:58:21Z,2023-11-03T05:28:58Z,2020-11-01T03:29:48Z,OWNER,,"> It would also be interesting to try out the SQL hint mode, which can autocomplete against tables and columns. 
This demo shows how to configure that: https://codemirror.net/mode/sql/ > > Some missing documentation: https://stackoverflow.com/questions/20023381/codemirror-how-add-tables-to-sql-hint _Originally posted by @simonw in https://github.com/simonw/datasette/issues/948#issuecomment-679355426_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/949/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 410384988,MDU6SXNzdWU0MTAzODQ5ODg=,411,How to pass named parameter into spatialite MakePoint() function,1055831,dazzag24,closed,0,,,,,3,2019-02-14T16:30:22Z,2023-10-25T13:23:04Z,2019-05-05T12:25:04Z,NONE,,"Hi, datasette version: ""0.26.2"" extensions: spatialite: ""4.4.0-RC0"" sqlite version: ""3.22.0"" I have a table of airports with latitude and longitude columns. I've added spatialite (with KNN support). After creating the db using csvs-to-sqlit, I run these commands to setup the spatialite tables: ``` conn.execute('SELECT InitSpatialMetadata(1)') conn.execute(""SELECT AddGeometryColumn('airports', 'point_geom', 4326, 'POINT', 2);"") conn.execute('''UPDATE airports SET point_geom = GeomFromText('POINT('||""longitude""||' '||""latitude""||')',4326);''') conn.execute(""SELECT CreateSpatialIndex('airports', 'point_geom');"") ``` I'm attempting to create a canned query and have this in my metadata.json file: ``` ""find_airports_nearest_to_point"":{ ""sql"":""SELECT a.pos AS rank, b.id, b.name, b.country, b.latitude AS latitude, b.longitude AS longitude, a.distance / 1000.0 AS dist_km FROM KNN AS a JOIN airports AS b ON (b.rowid = a.fid) WHERE f_table_name = \""airports\"" AND ref_geometry = MakePoint( :Long , :Lat ) AND max_items = 10;""} ``` which doesn't seem to perform the templating of the name parameters correctly and I get no results. Have also tired: ``` MakePoint( || :Long || , || :Lat || ) ``` which returns this error: ``` near ""||"": syntax error ``` However I cannot seem to find the correct combination of named parameter syntax (:Lat) or sqlite concatenation operator to make it work. Any ideas if using named parameters inside functions is supported? 
Thanks Darren",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/411/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1955676270,I_kwDOBm6k_c50kUBu,2201,Discord invite link is invalid,11708906,andrewsanchez,open,0,,,,,0,2023-10-21T21:50:05Z,2023-10-21T21:50:05Z,,NONE,,"https://datasette.io/discord leads to https://discord.com/invite/ktd74dm5mw and returns the following: ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2201/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1163369515,I_kwDOBm6k_c5FV5wr,1655,query result page is using 400mb of browser memory 40x size of html page and 400x size of csv data,536941,fgregg,open,0,,,,,8,2022-03-09T00:56:40Z,2023-10-17T21:53:17Z,,CONTRIBUTOR,,"[this page](https://labordata.bunkum.us/opdr-8335ea3?sql=with+most_recent_lu+as+%28%0D%0A++select%0D%0A++++*%0D%0A++from%0D%0A++++%28%0D%0A++++++select%0D%0A++++++++*%0D%0A++++++from%0D%0A++++++++lm_data%0D%0A++++++order+by%0D%0A++++++++f_num%2C%0D%0A++++++++receive_date+desc%0D%0A++++%29+t%0D%0A++group+by%0D%0A++++f_num%0D%0A%29%0D%0Aselect%0D%0A++aff_abbr+%7C%7C+coalesce%28%27+local+%27+%7C%7C+desig_num%2C+%27+%27+%7C%7C+unit_name%29+as+abbr_local_name%2C%0D%0A++coalesce%28%0D%0A++++regexp_match%28%27%28.*%3F%29%28%2C%3F+AFL-CIO%24%29%27%2C+union_name%29%2C%0D%0A++++regexp_match%28%27%28.*%3F%29%28+IND%24%29%27%2C+union_name%29%2C%0D%0A++++union_name%0D%0A++%29+%7C%7C+coalesce%28%27+local+%27+%7C%7C+desig_num%2C+%27+%27+%7C%7C+unit_name%29+as+full_local_name%2C%0D%0A++*%0D%0Afrom%0D%0A++most_recent_lu%0D%0Awhere+%28desig_num+IS+NOT+NULL+OR+unit_name+IS+NOT+NULL%29+AND+desig_name+%21%3D+%27HQ%27%0D%0Alimit%0D%0A++5000+offset+0) is using about 400 mb in firefox 97 on mac os x. if you download the html for the page, it's about 11mb and if you get the csv for the data its about 1mb. it's using over a 1G on chrome 99. i found this because, i was trying to figure out why editing the SQL was getting very slow. 
",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1655/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1943259395,I_kwDOEhK-wc5z08kD,16, time data '2014-11-21T11:44:12.000Z' does not match format '%Y%m%dT%H%M%SZ',3746270,linonetwo,open,0,,,,,0,2023-10-14T13:24:39Z,2023-10-14T13:24:39Z,,NONE,," ``` evernote-to-sqlite enex evernote.db ./我的笔记.enex Importing from ENEX [#####-------------------------------] 14% Traceback (most recent call last): File ""/usr/local/bin/evernote-to-sqlite"", line 8, in sys.exit(cli()) ^^^^^ File ""/usr/local/lib/python3.11/site-packages/click/core.py"", line 1157, in __call__ return self.main(*args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^ File ""/usr/local/lib/python3.11/site-packages/click/core.py"", line 1078, in main rv = self.invoke(ctx) ^^^^^^^^^^^^^^^^ File ""/usr/local/lib/python3.11/site-packages/click/core.py"", line 1688, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File ""/usr/local/lib/python3.11/site-packages/click/core.py"", line 1434, in invoke return ctx.invoke(self.callback, **ctx.params) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File ""/usr/local/lib/python3.11/site-packages/click/core.py"", line 783, in invoke return __callback(*args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^ File ""/usr/local/lib/python3.11/site-packages/evernote_to_sqlite/cli.py"", line 31, in enex save_note(db, note) File ""/usr/local/lib/python3.11/site-packages/evernote_to_sqlite/utils.py"", line 46, in save_note ""created"": convert_datetime(created), ^^^^^^^^^^^^^^^^^^^^^^^^^ File ""/usr/local/lib/python3.11/site-packages/evernote_to_sqlite/utils.py"", line 111, in convert_datetime return datetime.datetime.strptime(s, ""%Y%m%dT%H%M%SZ"").isoformat() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File ""/usr/local/Cellar/python@3.11/3.11.5/Frameworks/Python.framework/Versions/3.11/lib/python3.11/_strptime.py"", line 568, in _strptime_datetime tt, fraction, gmtoff_fraction = _strptime(data_string, format) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File ""/usr/local/Cellar/python@3.11/3.11.5/Frameworks/Python.framework/Versions/3.11/lib/python3.11/_strptime.py"", line 349, in _strptime raise ValueError(""time data %r does not match format %r"" % ValueError: time data '2014-11-21T11:44:12.000Z' does not match format '%Y%m%dT%H%M%SZ' ``` enex is exported by evernote mac client ",303218369,evernote-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/evernote-to-sqlite/issues/16/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1910269679,I_kwDOBm6k_c5x3Gbv,2196,Discord invite link returns 401,1892194,Olshansk,closed,0,,,,,2,2023-09-24T15:16:54Z,2023-10-13T00:07:08Z,2023-10-12T21:54:54Z,NONE,,"I found the link to the datasette discord channel via [this query](https://github.com/search?q=repo%3Asimonw%2Fdatasette%20discord&type=code). 
The following video should be self explanatory: https://github.com/simonw/datasette/assets/1892194/8cd33e88-bcaa-41f3-9818-ab4d589c3f02 Link for reference: https://discord.com/invite/ktd74dm5mw",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2196/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1940346034,I_kwDOBm6k_c5zp1Sy,2199,Detailed upgrade instructions for metadata.yaml -> datasette.yaml,9599,simonw,open,0,,,3268330,Datasette 1.0,7,2023-10-12T16:21:25Z,2023-10-12T22:08:42Z,,OWNER,,"> `Exception: Datasette no longer accepts plugin configuration in --metadata. Move your ""plugins"" configuration blocks to a separate file - we suggest calling that datasette..json - and start Datasette with datasette -c datasette..json. See https://docs.datasette.io/en/latest/configuration.html for more details.` > > I think we should link directly to documentation that tells people how to perform this upgrade. _Originally posted by @simonw in https://github.com/simonw/datasette/issues/2190#issuecomment-1759947021_ ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2199/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1930008379,I_kwDOBm6k_c5zCZc7,2197,click-default-group-wheel dependency conflict,1176293,ar-jan,closed,0,,,,,3,2023-10-06T11:49:20Z,2023-10-12T21:53:17Z,2023-10-12T21:53:17Z,NONE,,"I upgraded my dependencies, then ran into this problem running `datasette inspect`: > env/lib/python3.9/site-packages/datasette/cli.py"", line 6, in > from click_default_group import DefaultGroup > ModuleNotFoundError: No module named 'click_default_group' Turns out the released version of datasette still depends on `click-default-group-wheel`, so `click-default-group` doesn't get installed/recognized: ``` $ virtualenv venv $ source venv/bin/activate $ pip install datasette $ pip list | grep click-default-group click-default-group 1.2.4 click-default-group-wheel 1.2.3 $ python -c ""from click_default_group import DefaultGroup"" Traceback (most recent call last): File """", line 1, in ModuleNotFoundError: No module named 'click_default_group' $ pip install --force-reinstall click-default-group ... ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts. datasette 0.64.4 requires click-default-group-wheel>=1.2.2, which is not installed. 
Successfully installed click-8.1.7 click-default-group-1.2.4 ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2197/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1931794126,I_kwDOBm6k_c5zJNbO,2198,--load-extension=spatialite not working with Windows,363004,hcarter333,open,0,,,,,0,2023-10-08T12:50:22Z,2023-10-08T12:50:22Z,,NONE,,"Using each of `python -m datasette counties.db -m metadata.yml --load-extension=SpatiaLite` and `python -m datasette counties.db --load-extension=""C:\Windows\System32\mod_spatialite.dll""` and `python -m datasette counties.db --load-extension=C:\Windows\System32\mod_spatialite.dll` I got the error: ``` File ""C:\Users\m3n7es\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.11_qbz5n2kfra8p0\LocalCache\local-packages\Python311\site-packages\datasette\database.py"", line 209, in in_thread self.ds._prepare_connection(conn, self.name) File ""C:\Users\m3n7es\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.11_qbz5n2kfra8p0\LocalCache\local-packages\Python311\site-packages\datasette\app.py"", line 596, in _prepare_connection conn.execute(""SELECT load_extension(?, ?)"", [path, entrypoint]) sqlite3.OperationalError: The specified module could not be found. ``` I finally tried modifying the code in app.py to read: ``` def _prepare_connection(self, conn, database): conn.row_factory = sqlite3.Row conn.text_factory = lambda x: str(x, ""utf-8"", ""replace"") if self.sqlite_extensions: conn.enable_load_extension(True) for extension in self.sqlite_extensions: # ""extension"" is either a string path to the extension # or a 2-item tuple that specifies which entrypoint to load. #if isinstance(extension, tuple): # path, entrypoint = extension # conn.execute(""SELECT load_extension(?, ?)"", [path, entrypoint]) #else: conn.execute(""SELECT load_extension('C:\Windows\System32\mod_spatialite.dll')"") ``` At which point the counties example worked. Is there a correct way to install/use the extension on Windows? My method will cause issues if there's a second extension to be used. On an unrelated note, my next step is to figure out how to write a query across the two loaded databases supplied from the command line: `python -m datasette rm_toucans_23_10_07.db counties.db -m metadata.yml --load-extension=SpatiaLite` ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2198/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1920416843,I_kwDOCGYnMM5ydzxL,597,sqlite-utils insert-files should be able to convert fields,1737541,grimnight,open,0,,,,,0,2023-09-30T22:20:47Z,2023-09-30T22:20:47Z,,NONE,,"Currently using both `insert-files` and `convert` is needed in order to create sqlar files, it would be more convenient if it could be done with just one command. 
```shell ~ ❯ cat test.py import os class Example: def __init__(self, arg1, arg2): self.arg1 = arg1 ~ ❯ sqlite-utils insert-files test.sqlar sqlar test.py -c name:name -c data:content -c mode:mode -c mtime:mtime -c sz:size --pk=name [####################################] 100% ~ ❯ sqlite-utils convert test.sqlar sqlar data ""zlib.compress(value)"" --import=zlib --where ""name = 'test.py'"" [####################################] 100% ~ ❯ cat test.py | sqlite-utils convert test.sqlar sqlar data ""zlib.compress(sys.stdin.buffer.read())"" --import=zlib --import=sys --where ""name = 'test.py'"" # Alternative way [####################################] 100% ~ ❯ sqlite3 test.sqlar ""SELECT hex(data) FROM sqlar WHERE name = 'test.py';"" | python3 -c ""import sys, zlib; sys.stdout.buffer.write(zlib.decompress(bytes.fromhex(sys.stdin.read())))"" import os class Example: def __init__(self, arg1, arg2): self.arg1 = arg1 ~ ❯ rm test.py ~ ❯ sqlar -l test.sqlar test.py ~ ❯ sqlar -x test.sqlar ~ ❯ cat test.py import os class Example: def __init__(self, arg1, arg2): self.arg1 = arg1 ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/597/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 777333388,MDU6SXNzdWU3NzczMzMzODg=,1168,Mechanism for storing metadata in _metadata tables,9599,simonw,open,0,,,,,21,2021-01-01T18:47:27Z,2023-09-28T18:29:05Z,,OWNER,,"_Original title: Perhaps metadata should all live in a `_metadata` in-memory database_ Inspired by #1150 - metadata should be exposed as an API, and for large Datasette instances that API may need to be paginated. So why not expose it through an in-memory database table? One catch to this: plugins. #860 aims to add a plugin hook for metadata. But if the metadata comes from an in-memory table, how do the plugins interact with it? The need to paginate over metadata does make a plugin hook that returns metadata for an individual table seem less wise, since we don't want to have to do 10,000 plugin hook invocations to show a list of all metadata. If those plugins write directly to the in-memory table how can their contributions survive the server restarting?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1168/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1907281675,I_kwDOCGYnMM5xrs8L,595,Cascading DELETE not working with Table.delete(pk),123451970,cycle-data,closed,0,,,,,1,2023-09-21T15:46:41Z,2023-09-25T09:38:57Z,2023-09-25T09:38:13Z,NONE,,"Hi ! I noticed that when I am trying to use the delete method of the Table object, the record get properly deleted from the table, but the cascading delete triggers on foreign keys do not activate. `self.db[""contact""].delete(contact_id)` I tried querying the database directly via DB Browser and the triggers work without any issue. Looked up the source code and behind the scene this method is just querying the database normally so I'm not exactly sure where this behavior comes from. Thank you in advance for your time ! 
",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/595/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 944846776,MDU6SXNzdWU5NDQ4NDY3NzY=,297,Option for importing CSV data using the SQLite .import mechanism,9599,simonw,open,0,,,,,23,2021-07-14T22:36:41Z,2023-09-22T20:49:52Z,,OWNER,,"As seen in https://til.simonwillison.net/sqlite/import-csv - `.mode csv` and then `.import school.csv schools` is hugely faster than importing via `sqlite-utils insert` and doing the work in Python - but it can only be implemented by shelling out to the `sqlite3` CLI tool, it's not functionality that is exposed to the Python `sqlite3` module. An option to use this would be useful - maybe something like this: sqlite-utils insert blah.db blah blah.csv --fast",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/297/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1907765514,I_kwDOBm6k_c5xtjEK,2195,`datasette publish` needs support for the new config/metadata split,9599,simonw,open,0,,,,,9,2023-09-21T21:08:12Z,2023-09-21T22:57:48Z,,OWNER,,"> ... which raises the challenge that `datasette publish` doesn't yet know what to do with a config file! _Originally posted by @simonw in https://github.com/simonw/datasette/issues/2194#issuecomment-1730259871_ ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2195/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1901416155,I_kwDOBm6k_c5xVU7b,2189,Server hang on parallel execution of queries to named in-memory databases,9599,simonw,closed,0,,,,,31,2023-09-18T17:23:18Z,2023-09-21T22:26:21Z,2023-09-21T22:26:21Z,OWNER,,"I've started to encounter a bug where queries to tables inside named in-memory databases sometimes trigger server hangs. I'm still trying to figure out what's going on here - on one occasion I managed to Ctrl+C the server and saw an exception that mentioned a thread lock, but usually hitting Ctrl+C does nothing and I have to `kill -9` the PID instead. This is all running on my M2 Mac. I've seen the bug in the Datasette 1.0 alphas and in Datasette 0.64.3 - but reverting to 0.61 appeared to fix it.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2189/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1662951875,I_kwDOBm6k_c5jHqHD,2057,DeprecationWarning: pkg_resources is deprecated as an API,9599,simonw,closed,0,,,,,25,2023-04-11T17:41:20Z,2023-09-21T22:09:10Z,2023-09-21T22:09:10Z,OWNER,,"Got this running tests against Python 3.11. ``` ../../../.local/share/virtualenvs/datasette-big-local-6Yn-280V/lib/python3.11/site-packages/datasette/app.py:14: in import pkg_resources ../../../.local/share/virtualenvs/datasette-big-local-6Yn-280V/lib/python3.11/site-packages/pkg_resources/__init__.py:121: in warnings.warn(""pkg_resources is deprecated as an API"", DeprecationWarning) E DeprecationWarning: pkg_resources is deprecated as an API ``` I ran with `pytest -Werror --pdb -x` to get the debugger for that warning, but it turned out searching the code worked better. 
It's used in these two places: https://github.com/simonw/datasette/blob/5890a20c374fb0812d88c9b0ef26a838bfa06c76/datasette/plugins.py#L43-L50 https://github.com/simonw/datasette/blob/5890a20c374fb0812d88c9b0ef26a838bfa06c76/datasette/app.py#L1037",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2057/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1907695234,I_kwDOBm6k_c5xtR6C,2194,"Deploy failing with ""plugins/alternative_route.py: Not a directory""",9599,simonw,closed,0,,,,,8,2023-09-21T20:17:49Z,2023-09-21T22:08:19Z,2023-09-21T22:08:19Z,OWNER,,"https://github.com/simonw/datasette/actions/runs/6266449018/job/17017460074 This is a bit of a mystery, I don't think I've changed anything recently that could have broken this.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2194/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1907655261,I_kwDOBm6k_c5xtIJd,2193,"""Test DATASETTE_LOAD_PLUGINS"" test shows errors but did not fail the CI run",9599,simonw,closed,0,,,,,6,2023-09-21T19:49:34Z,2023-09-21T21:56:43Z,2023-09-21T21:56:43Z,OWNER,,"> That passed on 3.8 but should have failed: https://github.com/simonw/datasette/actions/runs/6266341481/job/17017099801 - the ""Test DATASETTE_LOAD_PLUGINS"" test shows errors but did not fail the CI run. _Originally posted by @simonw in https://github.com/simonw/datasette/issues/2057#issuecomment-1730201226_ ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2193/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1825007061,I_kwDOBm6k_c5sx2XV,2123,datasette serve when invoked with --reload interprets the serve command as a file,79087,cadeef,open,0,,,,,2,2023-07-27T19:07:22Z,2023-09-18T13:02:46Z,,NONE,,"When running `datasette serve` with the `--reload` flag, the serve command is picked up as a file argument: ``` $ datasette serve --reload test_db Starting monitor for PID 13574. Error: Invalid value for '[FILES]...': Path 'serve' does not exist. Press ENTER or change a file to reload. ``` If a 'serve' file is created it launches properly (albeit with an empty database called serve): ``` $ touch serve; datasette serve --reload test_db Starting monitor for PID 13628. INFO: Started server process [13628] INFO: Waiting for application startup. INFO: Application startup complete. INFO: Uvicorn running on http://127.0.0.1:8001 (Press CTRL+C to quit) ``` Version (running from HEAD on main): ``` $ datasette --version datasette, version 1.0a2 ``` This issue appears to have existed for awhile as https://github.com/simonw/datasette/issues/1380#issuecomment-953366110 mentions the error in a different context. 
I'm happy to debug and land a patch if it's welcome.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2123/reactions"", ""total_count"": 2, ""+1"": 2, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1900026059,I_kwDOBm6k_c5xQBjL,2188,"Plugin Hooks for ""compile to SQL"" languages",15178711,asg017,open,0,,,,,2,2023-09-18T01:37:15Z,2023-09-18T06:58:53Z,,CONTRIBUTOR,,"There's a ton of tools/languages that compile to SQL, which may be nice in Datasette. Some examples: - Logica https://logica.dev - PRQL https://prql-lang.org - Malloy, but not sure if it works with SQLite? https://github.com/malloydata/malloy It would be cool if plugins could extend Datasette to use these languages, in both the code editor and API usage. A few things I'd imagine a `datasette-prql` or `datasette-logica` plugin would do: - `prql=` instead of `sql=` - Code editor support (syntax highlighting, autocomplete) - Hide/show SQL",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2188/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 787098345,MDU6SXNzdWU3ODcwOTgzNDU=,1191,Ability for plugins to collaborate when adding extra HTML to blocks in default templates,9599,simonw,open,0,,,3268330,Datasette 1.0,12,2021-01-15T18:18:51Z,2023-09-18T06:55:52Z,,OWNER,,"Sometimes a plugin may want to add content to an existing default template - for example `datasette-search-all` adds a new search box at the top of `index.html`. I also want `datasette-upload-csvs` to add a CTA on the `database.html` page: https://github.com/simonw/datasette-upload-csvs/issues/18 Currently plugins can do this by providing a new version of the `index.html` template - but if multiple plugins try to do that only one of them will succeed. It would be better if there were known areas of those templates which plugins could add additional content to, such that multiple plugins can use the same spot.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1191/reactions"", ""total_count"": 4, ""+1"": 4, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1899310542,I_kwDOBm6k_c5xNS3O,2187,Datasette for serving JSON only,19705106,geofinder,open,0,,,,,0,2023-09-16T05:48:29Z,2023-09-16T05:48:29Z,,NONE,,"Hi, is there any way to use datasette for serving json only without displaying webpage? I've tried to search about this in documentation but didn't get any information",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2187/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1898927976,I_kwDOBm6k_c5xL1do,2186,Mechanism for register_output_renderer hooks to access full count,9599,simonw,open,0,,,3268330,Datasette 1.0,2,2023-09-15T18:57:54Z,2023-09-15T19:27:59Z,,OWNER,,"The cause of this bug: - https://github.com/simonw/datasette-export-notebook/issues/17 Is that `datasette-export-notebook` was consulting `data[""filtered_table_rows_count""]` in the render output plugin function in order to show the total number of rows that would be exported. That field is no longer available by default - the `""count""` field is only available if `?_extra=count` was passed. 
It would be useful if plugins like this could access the total count on demand, should they need to.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2186/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1895266807,I_kwDOBm6k_c5w93n3,2184,Design decision - should configuration be exposed at /-/config ?,9599,simonw,open,0,,,,,0,2023-09-13T21:07:08Z,2023-09-13T21:07:38Z,,OWNER,,"> This made me think. That `{""$env"": ""ENV_VAR""}` hack was introduced back here: > > - https://github.com/simonw/datasette/issues/538 > > The problem it was solving was that metadata was visible to everyone with access to the instance at `/-/metadata` but plugins clearly needed a way to set secret settings. > > Now that this stuff is moving to config, we have some decisions to make: > > 1. Add `/-/config` to let people see the configuration of their instance, and keep the `$env` trick for secret settings. > 2. Say all configuration aside from metadata is secret and make `$env` optional or ditch it entirely. > 3. Allow plugins to announce which of their configuration options are secret so we can automatically redact them from `/-/config` > > I've found `/-/metadata` extraordinarily useful as a user of Datasette - it really helps me understand exactly what's going on if I run into any problems with a plugin, if I can quickly check what the settings look like. > > So I'm leaning towards option 1 or 3. _Originally posted by @simonw in https://github.com/simonw/datasette/pull/2183#discussion_r1325076924_ Also refs: - #2093",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2184/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1891614971,I_kwDOCGYnMM5wv8D7,594,Represent compound foreign keys in table.foreign_keys output,9599,simonw,open,0,,,,,2,2023-09-12T03:48:24Z,2023-09-12T03:51:13Z,,OWNER,,"Given this schema: ```sql CREATE TABLE departments ( campus_name TEXT NOT NULL, dept_code TEXT NOT NULL, dept_name TEXT, PRIMARY KEY (campus_name, dept_code) ); CREATE TABLE courses ( course_code TEXT PRIMARY KEY, course_name TEXT, campus_name TEXT NOT NULL, dept_code TEXT NOT NULL, FOREIGN KEY (campus_name, dept_code) REFERENCES departments(campus_name, dept_code) ); ``` The output of `db[""courses""].foreign_keys` right now is: ``` [ForeignKey(table='courses', column='campus_name', other_table='departments', other_column='campus_name'), ForeignKey(table='courses', column='dept_code', other_table='departments', other_column='dept_code')] ``` Which suggests two normal foreign keys, not one compound foreign key.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/594/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1781530343,I_kwDOBm6k_c5qL_7n,2093,"Proposal: Combine settings, metadata, static, etc. into a single `datasette.yaml` File",15178711,asg017,open,0,,,,,8,2023-06-29T21:18:23Z,2023-09-11T20:19:32Z,,CONTRIBUTOR,,"Very often I get tripped up when trying to configure my Datasette instances. For example: if I want to change the port my app listen too, do I do that with a CLI flag, a `--setting` flag, inside `metadata.json`, or an env var? 
If I want to up the time limit of SQL statements, is that under `metadata.json` or a setting? Where does my plugin configuration go? Normally I need to look it up in the Datasette docs, and I quickly find my answer, but the number of places where ""config"" goes is overwhelming. - Flat CLI flags like `--port`, `--host`, `--cors`, etc. - `--setting` flags, like `default_page_size`, `sql_time_limit_ms`, etc. - Inside `metadata.json`, including plugin configuration Typically my Datasette deploys are extremely long shell commands, with multiple `--setting` and other CLI flags. ## Proposal: Consolidate all ""config"" into `datasette.toml` I propose that we add a new `datasette.toml` that combines ""settings"", ""metadata"", and other common CLI flags like `--port` and `--cors` into a single file. It would be similar to ""Cargo.toml"" in Rust projects, ""package.json"" in Node projects, and ""pyproject.toml"" in Python, etc. A sample of what it could look like: ```toml # ""top level"" configuration that are currently CLI flags on `datasette serve` [config] port = 8020 host = ""0.0.0.0"" cors = true # replaces multiple `--setting` flags [settings] base_url = ""/app/datasette/"" default_allow_sql = true sql_time_limit_ms = 3500 # replaces `metadata.json`. # The contents of datasette-metadata.json could be defined in this file instead, but supporting separate files is nice (since those are easy to machine-generate) [metadata] include=""./datasette-metadata.json"" # plugin-specific [plugins] [plugins.datasette-auth-github] client_id = {env = ""DATASETTE_AUTH_GITHUB_CLIENT_ID""} client_secret = {env = ""GITHUB_CLIENT_SECRET""} [plugins.datasette-cluster-map] latitude_column = ""lat"" longitude_column = ""lon"" ``` ## Pros - Instead of multiple files and CLI flags, everything could be in one tidy file - Editing config in a separate file is easier than editing CLI flags, since you don't have to kill a process + edit a command every time - New users will know to ""just edit my `datasette.toml`"" instead of needing to learn metadata + settings + CLI flags - Better dev experience for multiple environments. For example, you could have `datasette -c datasette-dev.toml` for local dev environments (enables SQL, debug plugins, long timeouts, etc.), and a `datasette -c datasette-prod.toml` for ""production"" (lower timeouts, fewer plugins, monitoring plugins, etc.) ## Cons - Yet another config-management system. Now Datasette users will need to know about metadata, settings, CLI flags, _and_ `datasette.toml`. However, with enough documentation + announcements + examples, I think we can get ahead of it. - If toml is chosen, we would need to add a toml parser for Python versions <3.11 - Multiple sources of config require priority. For example: would `--setting default_allow_sql off` override the value inside `[settings]`? What about `--port`? ## Other Notes ### Toml I chose toml over json because toml supports comments. I chose toml over yaml because Python 3.11 has builtin support for it. I also find toml easier to work with since it doesn't have the odd ""gotchas"" that YAML has (ex: `3.10` resolving to `3.1`, Norway `NO` resolving to `false`, etc.). It also mimics `pyproject.toml`, which is nice. Happy to change my mind about this, however. ### Plugin config will be difficult Plugin config is currently in `metadata.json` in two places: 1. Top level, under `""plugins.[plugin-name]""`. This fits well into `datasette.toml` as `[plugins.plugin-name]` 2. Table level, under `""databases.[db-name].tables.[table-name].plugins.[plugin-name]""`. 
This doesn't fit that well into `datasette.toml`, unless it's nested under `[metadata]`? ### Extensions, static, one-off plugins? We could also include equivalents of `--plugins-dir`, `--static`, and `--load-extension` into `datasette.toml`, but I'd imagine there are a few security concerns there to think through. ### Explicitly list which plugins to use? I believe Datasette by default will load all installed plugins on startup, but maybe `datasette.toml` can specify a list of plugins to use? For example, a dev version of `datasette.toml` can specify `datasette-pretty-traces`, but the prod version can leave it out.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2093/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1876353656,I_kwDOBm6k_c5v1uJ4,2168,Consider a request/response wrapping hook slightly higher level than asgi_wrapper(),9599,simonw,open,0,,,,,6,2023-08-31T21:42:04Z,2023-09-10T17:54:08Z,,OWNER,,"There's a long justification for why this might be needed here: - https://github.com/simonw/datasette-auth-tokens/issues/10#issuecomment-1701820001 Short version: it would be neat if it was possible to stash some data on the `request` object such that a later plugin/middleware-type-thing could use that to influence the final returned response - similar to the kinds of things you can do with Django middleware. The `asgi_wrapper()` mechanism doesn't have access to the request or response objects - it gets `scope` and can mess around with `receive` and `send`, but those are pretty low-level primitives. Since Datasette has well-defined `request` and `response` objects now it might be nice to have a middleware layer that can manipulate those directly.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2168/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1886771493,I_kwDOCGYnMM5wddkl,592,`table.transform()` should preserve `rowid` values,9599,simonw,closed,0,,,,,6,2023-09-08T00:42:38Z,2023-09-10T17:46:41Z,2023-09-09T00:45:32Z,OWNER,,"I just spotted a bug when using https://datasette.io/plugins/datasette-configure-fts and https://datasette.io/plugins/datasette-edit-schema at the same time. Steps to reproduce: - Configure FTS for a table, then run a test search - Edit the schema for that table and change the order of columns - Run the test search again I got the wrong search results, which I think is because the `_fts` table pointed to the first table by `rowid` but those `rowid` values were entirely rewritten as a consequence of running `table.transform()` on the table. Reconfiguring FTS on the table fixed the problem. I think `table.transform()` should be able to preserve `rowid` values.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/592/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1886791100,I_kwDOBm6k_c5wdiW8,2180,Plugin hook: `actors_from_ids()`,9599,simonw,closed,0,,,,,6,2023-09-08T01:16:41Z,2023-09-10T17:44:14Z,2023-09-08T04:28:03Z,OWNER,,"In building Datasette Cloud we realized that a bunch of the features we are building need a way of resolving an actor ID to the actual actor, in order to display something more interesting than just an integer ID. 
Social plugins in particular need this - comments by X, CSV uploaded by X, that kind of thing. I think the solution is a new plugin hook: `actors_from_ids(datasette, ids)` which can return a list of actor dictionaries. The default implementation can return `[{""id"": ""...""}]` for the IDs passed to it. Pluggy has a `firstresult=True` option which is relevant here, since this is the first plugin hook we will have implemented where only one plugin should provide an answer.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2180/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1888477283,I_kwDOC8SPRc5wj-Bj,38,Run `rebuild_fts` after building the index,9599,simonw,open,0,,,,,0,2023-09-08T23:17:45Z,2023-09-08T23:17:45Z,,MEMBER,,"In: - https://github.com/simonw/datasette.io/issues/152#issuecomment-1712323347 This turned out to be the fix: ```bash dogsheep-beta index dogsheep-index.db templates/dogsheep-beta.yml sqlite-utils rebuild-fts dogsheep-index.db ```",197431109,dogsheep-beta,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-beta/issues/38/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1874255116,I_kwDOBm6k_c5vtt0M,2164,Ability to only load a specific list of plugins,9599,simonw,closed,0,,,,,1,2023-08-30T19:33:41Z,2023-09-08T04:35:46Z,2023-08-30T22:12:27Z,OWNER,,"I'm going to try and get this working through an environment variable, so that you can start Datasette and it will only load a subset of plugins including those that use the `register_commands()` hook. Initial research on this: - https://github.com/pytest-dev/pluggy/issues/422",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2164/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1886350562,I_kwDOBm6k_c5wb2zi,2178,Don't show foreign key links to tables the user cannot access,9599,simonw,closed,0,,,,,5,2023-09-07T17:56:41Z,2023-09-07T23:28:27Z,2023-09-07T23:28:27Z,OWNER,,"Spotted this problem while working on this plugin: - https://github.com/simonw/datasette-public It's possible to make a table public to any users - but then you may end up with situations like this: That table is public, but the foreign key links go to tables that are NOT public. We're also leaking the names of the values in those private tables here, which we shouldn't do. So this is a tiny bit of an information leak. 
Since this only affects people who have configured a table to be public that has foreign keys to a table that is private I don't think this is worth issuing a vulnerability report about - I very much doubt anyone is running Datasette configured in a way that could result in problems because of this.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2178/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1886649402,I_kwDOBm6k_c5wc_w6,2179,Flaky test: test_hidden_sqlite_stat1_table,9599,simonw,closed,0,,,,,0,2023-09-07T22:48:43Z,2023-09-07T22:51:19Z,2023-09-07T22:51:19Z,OWNER,,"This test here: https://github.com/simonw/datasette/blob/fbcb103c0cb6668018ace539a01a6a1f156e8d6a/tests/test_api.py#L1011-L1020 It failed for me like this: `E AssertionError: assert [('normal', False), ('sqlite_stat1', True), ('sqlite_stat4', True)] in ([('normal', False), ('sqlite_stat1', True)],)` Looks like some builds of SQLite include a `sqlite_stat4` table.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2179/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1010112818,I_kwDOBm6k_c48NRky,1479,"Win32 ""used by another process"" error with datasette publish",76450761,kirajano,open,0,,,,,7,2021-09-28T19:12:00Z,2023-09-07T02:14:16Z,,NONE,,"I unfortunately was not successful to deploy to fly.io. Please see the details above of the three scenarios that I took. I am also new to datasette. Failed to deploy. Attaching logs: 1. Tried with an app created via `flyctl apps create frosty-fog-8565` and the ran `datasette publish fly covid.db --app frosty-fog-8565` ``` Deploying frosty-fog-8565 ==> Validating app configuration --> Validating app configuration done Services TCP 80/443 ⇢ 8080 Error error connecting to docker: An unknown error occured. 
Traceback (most recent call last): File ""c:\users\grott\anaconda3\lib\runpy.py"", line 193, in _run_module_as_main ""__main__"", mod_spec) File ""c:\users\grott\anaconda3\lib\runpy.py"", line 85, in _run_code exec(code, run_globals) File ""C:\Users\grott\Anaconda3\Scripts\datasette.exe\__main__.py"", line 7, in File ""c:\users\grott\anaconda3\lib\site-packages\click\core.py"", line 829, in __call__ return self.main(*args, **kwargs) File ""c:\users\grott\anaconda3\lib\site-packages\click\core.py"", line 782, in main rv = self.invoke(ctx) File ""c:\users\grott\anaconda3\lib\site-packages\click\core.py"", line 1259, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""c:\users\grott\anaconda3\lib\site-packages\click\core.py"", line 1259, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""c:\users\grott\anaconda3\lib\site-packages\click\core.py"", line 1066, in invoke return ctx.invoke(self.callback, **ctx.params) File ""c:\users\grott\anaconda3\lib\site-packages\click\core.py"", line 610, in invoke return callback(*args, **kwargs) File ""c:\users\grott\anaconda3\lib\site-packages\datasette_publish_fly\__init__.py"", line 156, in fly ""--remote-only"", File ""c:\users\grott\anaconda3\lib\contextlib.py"", line 119, in __exit__ next(self.gen) File ""c:\users\grott\anaconda3\lib\site-packages\datasette\utils\__init__.py"", line 451, in temporary_docker_directory tmp.cleanup() File ""c:\users\grott\anaconda3\lib\tempfile.py"", line 811, in cleanup _shutil.rmtree(self.name) File ""c:\users\grott\anaconda3\lib\shutil.py"", line 516, in rmtree return _rmtree_unsafe(path, onerror) File ""c:\users\grott\anaconda3\lib\shutil.py"", line 395, in _rmtree_unsafe _rmtree_unsafe(fullname, onerror) File ""c:\users\grott\anaconda3\lib\shutil.py"", line 404, in _rmtree_unsafe onerror(os.rmdir, path, sys.exc_info()) File ""c:\users\grott\anaconda3\lib\shutil.py"", line 402, in _rmtree_unsafe os.rmdir(path) PermissionError: [WinError 32] The process cannot access the file because it is being used by another process: 'C:\\Users\\grott\\AppData\\Local\\Temp\\tmpgcm8cz66\\frosty-fog-8565' ``` 2. Tried also with an app that gets autogenerate when running `flyctl launch`. This also generates the .toml file. 
Ran then `datasette publish fly covid.db --app dark-feather-168` **but different error now** ```Deploying dark-feather-168 ==> Validating app configuration Error not possible to validate configuration: server returned Post ""https://api.fly.io/graphql"": unexpected EOF Traceback (most recent call last): File ""c:\users\grott\anaconda3\lib\runpy.py"", line 193, in _run_module_as_main ""__main__"", mod_spec) exec(code, run_globals) File ""C:\Users\grott\Anaconda3\Scripts\datasette.exe\__main__.py"", line 7, in File ""c:\users\grott\anaconda3\lib\site-packages\click\core.py"", line 829, in __call__ return self.main(*args, **kwargs) File ""c:\users\grott\anaconda3\lib\site-packages\click\core.py"", line 782, in main rv = self.invoke(ctx) File ""c:\users\grott\anaconda3\lib\site-packages\click\core.py"", line 1259, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""c:\users\grott\anaconda3\lib\site-packages\click\core.py"", line 1259, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""c:\users\grott\anaconda3\lib\site-packages\click\core.py"", line 1066, in invoke return ctx.invoke(self.callback, **ctx.params) File ""c:\users\grott\anaconda3\lib\site-packages\click\core.py"", line 610, in invoke return callback(*args, **kwargs) File ""c:\users\grott\anaconda3\lib\site-packages\datasette_publish_fly\__init__.py"", line 156, in fly ""--remote-only"", File ""c:\users\grott\anaconda3\lib\contextlib.py"", line 119, in __exit__ next(self.gen) File ""c:\users\grott\anaconda3\lib\site-packages\datasette\utils\__init__.py"", line 451, in temporary_docker_directory tmp.cleanup() File ""c:\users\grott\anaconda3\lib\tempfile.py"", line 811, in cleanup _shutil.rmtree(self.name) File ""c:\users\grott\anaconda3\lib\shutil.py"", line 516, in rmtree return _rmtree_unsafe(path, onerror) File ""c:\users\grott\anaconda3\lib\shutil.py"", line 395, in _rmtree_unsafe _rmtree_unsafe(fullname, onerror) File ""c:\users\grott\anaconda3\lib\shutil.py"", line 404, in _rmtree_unsafe onerror(os.rmdir, path, sys.exc_info()) File ""c:\users\grott\anaconda3\lib\shutil.py"", line 402, in _rmtree_unsafe os.rmdir(path) PermissionError: [WinError 32] The process cannot access the file because it is being used by another process: 'C:\\Users\\grott\\AppData\\Local\\Temp\\tmpnoyewcre\\dark-feather-168' ``` These are also the contents of the generated **.toml file** in 2 scenario: ``` # fly.toml file generated for dark-feather-168 on 2021-09-28T20:35:44+02:00 app = ""dark-feather-168"" kill_signal = ""SIGINT"" kill_timeout = 5 processes = [] [env] [experimental] allowed_public_ports = [] auto_rollback = true [[services]] http_checks = [] internal_port = 8080 processes = [""app""] protocol = ""tcp"" script_checks = [] [services.concurrency] hard_limit = 25 soft_limit = 20 type = ""connections"" [[services.ports]] handlers = [""http""] port = 80 [[services.ports]] handlers = [""tls"", ""http""] port = 443 [[services.tcp_checks]] grace_period = ""1s"" interval = ""15s"" restart_limit = 6 timeout = ""2s"" ``` 3. But also trying `datasette package covid.db` to create a local DOCKERFILE to later try to push it via `flyctl deploy` fails as well. 
```[+] Building 147.3s (11/11) FINISHED => [internal] load build definition from Dockerfile 0.2s => => transferring dockerfile: 396B 0.0s => [internal] load .dockerignore 0.1s => => transferring context: 2B 0.0s => [internal] load metadata for docker.io/library/python:3.8 4.7s => [auth] library/python:pull token for registry-1.docker.io 0.0s => [internal] load build context 0.1s => => transferring context: 82.37kB 0.0s => [1/5] FROM docker.io/library/python:3.8@sha256:530de807b46a11734e2587a784573c12c5034f2f14025f838589e6c0e3 108.3s => => resolve docker.io/library/python:3.8@sha256:530de807b46a11734e2587a784573c12c5034f2f14025f838589e6c0e3b5 0.0s => => sha256:56182bcdf4d4283aa1f46944b4ef7ac881e28b4d5526720a4e9ba03a4730846a 2.22kB / 2.22kB 0.0s => => sha256:955615a668ce169f8a1443fc6b6e6215f43fe0babfb4790712a2d3171f34d366 54.93MB / 54.93MB 21.6s => => sha256:911ea9f2bd51e53a455297e0631e18a72a86d7e2c8e1807176e80f991bde5d64 10.87MB / 10.87MB 15.5s => => sha256:530de807b46a11734e2587a784573c12c5034f2f14025f838589e6c0e3b5c5b6 1.86kB / 1.86kB 0.0s => => sha256:ff08f08727e50193dcf499afc30594c47e70cc96f6fcfd1a01240524624264d0 8.65kB / 8.65kB 0.0s => => sha256:2756ef5f69a5190f4308619e0f446d95f5515eef4a814dbad0bcebbbbc7b25a8 5.15MB / 5.15MB 6.4s => => sha256:27b0a22ee906271a6ce9ddd1754fdd7d3b59078e0b57b6cc054c7ed7ac301587 54.57MB / 54.57MB 37.7s => => sha256:8584d51a9262f9a3a436dea09ba40fa50f85802018f9bd299eee1bf538481077 196.45MB / 196.45MB 82.3s => => sha256:524774b7d3638702fe9ae0ea3fcfb81b027dfd75cc2fc14f0119e764b9543d58 6.29MB / 6.29MB 26.6s => => extracting sha256:955615a668ce169f8a1443fc6b6e6215f43fe0babfb4790712a2d3171f34d366 5.4s => => sha256:9460f6b75036e38367e2f27bb15e85777c5d6cd52ad168741c9566186415aa26 16.81MB / 16.81MB 40.5s => => extracting sha256:2756ef5f69a5190f4308619e0f446d95f5515eef4a814dbad0bcebbbbc7b25a8 0.6s => => extracting sha256:911ea9f2bd51e53a455297e0631e18a72a86d7e2c8e1807176e80f991bde5d64 0.6s => => sha256:9bc548096c181514aa1253966a330134d939496027f92f57ab376cd236eb280b 232B / 232B 40.1s => => extracting sha256:27b0a22ee906271a6ce9ddd1754fdd7d3b59078e0b57b6cc054c7ed7ac301587 5.8s => => sha256:1d87379b86b89fd3b8bb1621128f00c8f962756e6aaaed264ec38db733273543 2.35MB / 2.35MB 41.8s => => extracting sha256:8584d51a9262f9a3a436dea09ba40fa50f85802018f9bd299eee1bf538481077 18.8s => => extracting sha256:524774b7d3638702fe9ae0ea3fcfb81b027dfd75cc2fc14f0119e764b9543d58 1.2s => => extracting sha256:9460f6b75036e38367e2f27bb15e85777c5d6cd52ad168741c9566186415aa26 2.9s => => extracting sha256:9bc548096c181514aa1253966a330134d939496027f92f57ab376cd236eb280b 0.0s => => extracting sha256:1d87379b86b89fd3b8bb1621128f00c8f962756e6aaaed264ec38db733273543 0.8s => [2/5] COPY . 
/app 2.3s => [3/5] WORKDIR /app 0.2s => [4/5] RUN pip install -U datasette 26.9s => [5/5] RUN datasette inspect covid.db --inspect-file inspect-data.json 3.1s => exporting to image 1.2s => => exporting layers 1.2s => => writing image sha256:b5db0c205cd3454c21fbb00ecf6043f261540bcf91c2dfc36d418f1a23a75d7a 0.0s Use 'docker scan' to run Snyk tests against images to find vulnerabilities and learn how to fix them Traceback (most recent call last): ""__main__"", mod_spec) File ""c:\users\grott\anaconda3\lib\runpy.py"", line 85, in _run_code exec(code, run_globals) File ""C:\Users\grott\Anaconda3\Scripts\datasette.exe\__main__.py"", line 7, in File ""c:\users\grott\anaconda3\lib\site-packages\click\core.py"", line 829, in __call__ return self.main(*args, **kwargs) File ""c:\users\grott\anaconda3\lib\site-packages\click\core.py"", line 782, in main rv = self.invoke(ctx) File ""c:\users\grott\anaconda3\lib\site-packages\click\core.py"", line 1259, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""c:\users\grott\anaconda3\lib\site-packages\click\core.py"", line 1066, in invoke return ctx.invoke(self.callback, **ctx.params) File ""c:\users\grott\anaconda3\lib\site-packages\click\core.py"", line 610, in invoke return callback(*args, **kwargs) File ""c:\users\grott\anaconda3\lib\site-packages\datasette\cli.py"", line 283, in package call(args) File ""c:\users\grott\anaconda3\lib\contextlib.py"", line 119, in __exit__ next(self.gen) File ""c:\users\grott\anaconda3\lib\site-packages\datasette\utils\__init__.py"", line 451, in temporary_docker_directory tmp.cleanup() File ""c:\users\grott\anaconda3\lib\tempfile.py"", line 811, in cleanup _shutil.rmtree(self.name) File ""c:\users\grott\anaconda3\lib\shutil.py"", line 516, in rmtree return _rmtree_unsafe(path, onerror) File ""c:\users\grott\anaconda3\lib\shutil.py"", line 395, in _rmtree_unsafe _rmtree_unsafe(fullname, onerror) File ""c:\users\grott\anaconda3\lib\shutil.py"", line 404, in _rmtree_unsafe onerror(os.rmdir, path, sys.exc_info()) File ""c:\users\grott\anaconda3\lib\shutil.py"", line 402, in _rmtree_unsafe os.rmdir(path) PermissionError: [WinError 32] The process cannot access the file because it is being used by another process: 'C:\\Users\\grott\\AppData\\Local\\Temp\\tmpkb27qid3\\datasette'```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1479/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1884408624,I_kwDOBm6k_c5wUcsw,2177,Move schema tables from _internal to _catalog,9599,simonw,open,0,,,,,1,2023-09-06T16:58:33Z,2023-09-06T17:04:30Z,,OWNER,,"This came up in discussion over: - https://github.com/simonw/datasette/pull/2174 ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2177/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 336464733,MDU6SXNzdWUzMzY0NjQ3MzM=,328,"Installation instructions, including how to use the docker image",9599,simonw,closed,0,,,,,4,2018-06-28T03:59:33Z,2023-09-05T14:10:39Z,2018-06-28T04:02:10Z,OWNER,,,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/328/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1879214365,I_kwDOCGYnMM5wAokd,590,Ability to tell if a 
Database is an in-memory one,9599,simonw,open,0,,,,,1,2023-09-03T19:50:15Z,2023-09-03T19:50:36Z,,OWNER,,"Currently the constructor accepts `memory=True` or `memory_name=...` and uses those to create a connection, but does not record what those values were: https://github.com/simonw/sqlite-utils/blob/1260bdc7bfe31c36c272572c6389125f8de6ef71/sqlite_utils/db.py#L307-L349 This makes it hard to tell if a database object points to an in-memory or a file-based database, which is sometimes useful to know.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/590/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1879209560,I_kwDOCGYnMM5wAnZY,589,Mechanism for de-registering registered SQL functions,9599,simonw,open,0,,,,,3,2023-09-03T19:32:39Z,2023-09-03T19:36:34Z,,OWNER,,I used a custom SQL function in a migration script and then realized that it should be de-registered before the end of the script to avoid leaking into the calling code.,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/589/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1292370469,I_kwDOBm6k_c5NCAIl,1765,Document plugins providing new plugin hook-,9599,simonw,closed,0,,,,,1,2022-07-03T17:05:14Z,2023-08-31T23:08:24Z,2023-08-31T23:06:31Z,OWNER,,I've used this pattern twice now: https://til.simonwillison.net/datasette/register-new-plugin-hooks - in `datasette-graphql` and `datasette-low-disk-space-hook`. I should describe the pattern on https://docs.datasette.io/en/stable/writing_plugins.html,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1765/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1876407598,I_kwDOBm6k_c5v17Uu,2169,execute-sql on a database should imply view-database/view-permission,9599,simonw,closed,0,,,,,0,2023-08-31T22:45:56Z,2023-08-31T22:46:28Z,2023-08-31T22:46:28Z,OWNER,,"I noticed that a token with `execute-sql` permission alone did not work, because it was not allowed to view the instance of the database.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2169/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1875739055,I_kwDOBm6k_c5vzYGv,2167,Document return type of await ds.permission_allowed(),9599,simonw,open,0,,,,,0,2023-08-31T15:14:23Z,2023-08-31T15:14:23Z,,OWNER,,"The return type isn't documented here: https://github.com/simonw/datasette/blob/4c3ef033110407f3b3dbce501659d523724985e0/docs/internals.rst#L327-L350 On inspecting the code I'm not 100% sure if it's possible for this method to return `None`, or if it can only return `True` or `False`. Need to confirm that. 
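In the meantime, here is a hedged sketch of a typical call site - the `default=` fallback is why the `None` question matters (the actor, action and resource values are illustrative; the implementation is linked below):

```python
from datasette.utils.asgi import Forbidden

# Illustrative call site - datasette and request come from the
# surrounding view or plugin hook
allowed = await datasette.permission_allowed(
    request.actor,
    'view-table',
    resource=('mydb', 'mytable'),
    default=False,
)
if not allowed:
    # Treat anything that is not True as a denial
    raise Forbidden('view-table permission required')
```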
https://github.com/simonw/datasette/blob/4c3ef033110407f3b3dbce501659d523724985e0/datasette/app.py#L822C15-L853",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2167/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1871935751,I_kwDOD079W85vk3kH,40, ImportError: cannot import name 'formatargspec' from 'inspect',36752421,hosslikw,closed,0,,,,,0,2023-08-29T15:36:31Z,2023-08-31T03:18:07Z,2023-08-31T03:18:06Z,NONE,,"I get the following error when running ""pip3 install dogsheep-photos"" "" from inspect import ismethod, isclass, formatargspec ImportError: cannot import name 'formatargspec' from 'inspect' (/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/inspect.py). Did you mean: 'formatargvalues'?"" Python 3.12.0rc1 sqlite 3.43.0 datasette, version 0.64.3",256834907,dogsheep-photos,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-photos/issues/40/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1865869205,I_kwDOBm6k_c5vNueV,2157,""Proposal: Make the `_internal` database persistent, customizable, and hidden"",15178711,asg017,open,0,,,,,3,2023-08-24T20:54:29Z,2023-08-31T02:45:56Z,,CONTRIBUTOR,,"The current `_internal` database is used by Datasette core to cache info about databases/tables/columns/foreign keys of databases in a Datasette instance. It's a temporary database created at startup that can only be seen by the root user. See an [example `_internal` DB here](https://latest.datasette.io/_internal), after [logging in as root](https://latest.datasette.io/login-as-root). The current `_internal` database has a few rough edges: - It's part of `datasette.databases`, so many plugins have to specifically exclude `_internal` from their queries [examples here](https://github.com/search?q=datasette+hookimpl+%22_internal%22+language%3APython+-path%3Adatasette%2F&ref=opensearch&type=code) - It's only used by Datasette core and can't be used by plugins or 3rd parties - It's created from scratch at startup and stored in memory. Which is fine - the performance is great - but persistent storage would be nice. Additionally, it would be really nice if plugins could use this `_internal` database to store their own configuration, secrets, and settings. For example: - `datasette-auth-tokens` [creates a `_datasette_auth_tokens` table](https://github.com/simonw/datasette-auth-tokens/blob/main/datasette_auth_tokens/__init__.py#L15) to store auth token metadata. This could be moved into the `_internal` database to avoid writing to the guest database - `datasette-socrata` [creates a `socrata_imports`](https://github.com/simonw/datasette-socrata/blob/1409aa9b4d2fc3aff286b52e73af33b5786d56d0/datasette_socrata/__init__.py#L190-L198) table, which also can be in `_internal` - `datasette-upload-csvs` [creates a `_csv_progress_`](https://github.com/simonw/datasette-upload-csvs/blob/main/datasette_upload_csvs/__init__.py#L154) table, which can be in `_internal` - `datasette-write-ui` wants to have the ability for users to toggle whether a table appears editable, which can be either in `datasette.yaml` or on-the-fly by storing config in `_internal` In general, these are specific features that Datasette plugins would have access to if there was a central internal database they could read/write to: - **Dynamic configuration**. 
Changing the `datasette.yaml` file works, but it can be tedious to restart the server every time. Plugins can define their own configuration table in `_internal`, and could read/write to it to store configuration based on user actions (cell menu click, API access, etc.) - **Caching**. If a plugin or Datasette Core needs to cache some expensive computation, they can store it inside `_internal` (possibly as a temporary table) instead of managing their own caching solution. - **Audit logs**. If a plugin performs some sensitive operations, they can log usage info to `_internal` for others to audit later. - **Long running process status**. Many plugins (`datasette-upload-csvs`, `datasette-litestream`, `datasette-socrata`) perform tasks that run for a really long time, and want to give continued status updates to the user. They can store this info inside `_internal` - **Safer authentication**. Passwords and authentication plugins usually store credentials/hashed secrets in configuration files or environment variables, which can be difficult to handle. Now, they can store them in `_internal` ## Proposal - We remove `_internal` from the [`datasette.databases`](https://docs.datasette.io/en/latest/internals.html#databases) property. - We add a new `datasette.get_internal_db()` method that returns the `_internal` database, for plugins to use - We add a new `--internal internal.db` flag. If provided, then the `_internal` DB will be sourced from that file, and further updates will be persisted to that file (instead of an in-memory database) - When creating internal.db, create a new `_datasette_internal` table to mark it as a ""datasette internal database"" - In `datasette serve`, we check for the existence of the `_datasette_internal` table. If it exists, we assume the user provided that file in error and raise an error. This is to limit the chance that someone accidentally publishes their internal database to the internet. We could optionally add a `--unsafe-allow-internal` flag (or database plugin) that allows someone to do this if they really want to. ## New features unlocked with this These features don't really need a standardized `_internal` table per se (plugins could currently configure their own long-term storage features if they really wanted to), but it would make it much simpler to create these kinds of features with a persistent application database. - **`datasette-comments`**: A plugin for commenting on rows or specific values in a database. Comment contents + threads + email notification info can be stored in `_internal` - **Bookmarks**: ""Bookmarking"" an SQL query could be stored in `_internal`, or a URL link shortener - **Webhooks**: If a plugin wants to either consume a webhook or create a new one, they can store hashed credentials/API endpoints in `_internal`",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2157/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 742041667,MDU6SXNzdWU3NDIwNDE2Njc=,1092,Make cascading permission checks available to plugins,9599,simonw,closed,0,,,,,1,2020-11-13T01:02:55Z,2023-08-30T22:17:42Z,2023-08-30T22:17:41Z,OWNER,,"The `BaseView` class has a method for cascading permission checks, but it's not easily accessible to plugins. 
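Roughly, the logic each plugin ends up duplicating looks like this - an illustrative sketch only (the helper name is made up; the real implementation is linked below):

```python
# Illustrative sketch of the cascading check described in this issue -
# not the actual BaseView code
async def cascading_view_allowed(datasette, actor, database):
    # An explicit True/False opinion on view-database wins outright
    db_opinion = await datasette.permission_allowed(
        actor, 'view-database', resource=database, default=None
    )
    if db_opinion is not None:
        return db_opinion
    # No opinion - fall back to the instance-level permission
    return await datasette.permission_allowed(
        actor, 'view-instance', default=True
    )
```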
https://github.com/simonw/datasette/blob/5eb8e9bf250b26e30b017d39a392c33973997656/datasette/views/base.py#L75-L99 This leaves plugins like `datasette-graphql` having to implement their own versions of this logic, which is bad: https://github.com/simonw/datasette-graphql/issues/65 > First check `view-database` - if that says `False` then disallow access, if it says `True` then allow access. If it says `None` check `view-instance`. This should become a supported API that plugins are encouraged to use.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1092/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 594237015,MDU6SXNzdWU1OTQyMzcwMTU=,718,Plugin idea: datasette-redirects,9599,simonw,open,0,,,,,0,2020-04-05T03:41:38Z,2023-08-30T22:17:31Z,,OWNER,,"I just had to write a one-off custom plugin to redirect niche-musems.com to www.niche-museums.com (https://github.com/simonw/museums/issues/21) - it would be great if this kind of thing could be handled by a configurable plugin. https://github.com/simonw/museums/blob/6b1faf00c463b2228860d4d62d104b11935e01b1/plugins/redirect_www.py",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/718/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,reopened 787098146,MDU6SXNzdWU3ODcwOTgxNDY=,1190,`datasette publish upload` mechanism for uploading databases to an existing Datasette instance,1024355,tomershvueli,closed,0,,,,,3,2021-01-15T18:18:42Z,2023-08-30T22:16:39Z,2023-08-30T22:16:38Z,NONE,,"If I have a self-hosted instance of Datasette up and running, I'd like to be able to the use the CLI to publish databases to that instance, not only Google or Heroku. Ideally there'd be a `url` parameter or something similar to which one could point the publish command to their instance. ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1190/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1872043170,I_kwDOBm6k_c5vlRyi,2163,Rename core_X to catalog_X in the internals,9599,simonw,closed,0,,,,,1,2023-08-29T16:45:00Z,2023-08-29T17:01:31Z,2023-08-29T17:01:31Z,OWNER,,"Discussed with Alex this morning. We think the American spelling is fine here (it's shorter than `catalogue`) and that it's a slightly less lazy name than `core_`. Follows: - https://github.com/simonw/datasette/issues/2157",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2163/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1805076818,I_kwDOBm6k_c5rl0lS,2102,API tokens with view-table but not view-database/view-instance cannot access the table,9599,simonw,closed,0,9599,simonw,,,20,2023-07-14T15:34:27Z,2023-08-29T16:32:36Z,2023-08-29T16:32:35Z,OWNER,,"> Spotted a problem while working on this: if you grant a token access to view table for a specific table but don't also grant view database and view instance permissions, that token is useless. 
> > This was a deliberate design decision in Datasette - it's documented on https://docs.datasette.io/en/1.0a2/authentication.html#access-permissions-in-metadata > >> If a user cannot access a specific database, they will not be able to access tables, views or queries within that database. If a user cannot access the instance they will not be able to access any of the databases, tables, views or queries. > > I'm now second-guessing if this was a good decision. _Originally posted by @simonw in https://github.com/simonw/datasette-auth-tokens/issues/7#issuecomment-1636031702_ ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2102/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1865232341,I_kwDOBm6k_c5vLS_V,2153,Datasette --get --actor option,9599,simonw,closed,0,,,,,5,2023-08-24T14:00:03Z,2023-08-28T20:19:15Z,2023-08-28T20:15:53Z,OWNER,,"I experimented with a prototype of this here: - https://github.com/simonw/datasette/issues/2102#issuecomment-1691037971_ Which lets me run requests as if they belonged to a specific actor like this: ```bash datasette fixtures.db --get '/fixtures/facetable.json' --actor '{ ""_r"": { ""r"": { ""fixtures"": { ""facetable"": [ ""vt"" ] } } }, ""a"": ""user"" }' ``` Really useful for testing actors an `_r` options. Is this worth adding as a feature?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2153/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1865649347,I_kwDOBm6k_c5vM4zD,2156,datasette -s/--setting option for setting nested configuration options,9599,simonw,open,0,,,,,4,2023-08-24T18:09:27Z,2023-08-28T19:33:05Z,,OWNER,,"> I've been thinking about what it might look like to allow command-line arguments to be used to define _any_ of the configuration options in `datasette.yml`, as alternative and more convenient syntax. 
> > Here's what I've come up with: > ``` > datasette \ > -s settings.sql_time_limit_ms 1000 \ > -s plugins.datasette-auth-tokens.manage_tokens true \ > -s plugins.datasette-auth-tokens.manage_tokens_database tokens \ > mydatabase.db tokens.db > ``` > Which would be equivalent to `datasette.yml` containing this: > ```yaml > plugins: > datasette-auth-tokens: > manage_tokens: true > manage_tokens_database: tokens > settings: > sql_time_limit_ms: 1000 > ``` More details in https://github.com/simonw/datasette/issues/2143#issuecomment-1690792514 ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2156/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1868713944,I_kwDOCGYnMM5vYk_Y,588,`table.get(column=value)` option for retrieving things not by their primary key,9599,simonw,open,0,,,,,1,2023-08-28T00:41:23Z,2023-08-28T00:41:54Z,,OWNER,,"This came up working on this feature: - https://github.com/simonw/llm/pull/186 I have a table with this schema: ```sql CREATE TABLE [collections] ( [id] INTEGER PRIMARY KEY, [name] TEXT, [model] TEXT ); CREATE UNIQUE INDEX [idx_collections_name] ON [collections] ([name]); ``` So the primary key is an integer (because it's going to have a huge number of rows foreign key related to it, and I don't want to store a larger text value thousands of times), but there is a unique constraint on the `name` - that would be the primary key column if not for all of those foreign keys. Problem is, fetching the collection by name is actually pretty inconvenient. Fetch by numeric ID: ```python try: table[""collections""].get(1) except NotFoundError: # It doesn't exist ``` Fetching by name: ```python def get_collection(db, collection): rows = db[""collections""].rows_where(""name = ?"", [collection]) try: return next(rows) except StopIteration: raise NotFoundError(""Collection not found: {}"".format(collection)) ``` It would be neat if, for columns where we know that we should always get 0 or one result, we could do this instead: ```python try: collection = table[""collections""].get(name=""entries"") except NotFoundError: # It doesn't exist ``` The existing `.get()` method doesn't have any non-positional arguments, so using `**kwargs` like that should work: https://github.com/simonw/sqlite-utils/blob/1260bdc7bfe31c36c272572c6389125f8de6ef71/sqlite_utils/db.py#L1495",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/588/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 685806511,MDU6SXNzdWU2ODU4MDY1MTE=,950,Private/secret databases: database files that are only visible to plugins,9599,simonw,closed,0,,,,,6,2020-08-25T20:46:17Z,2023-08-24T22:26:09Z,2023-08-24T22:26:08Z,OWNER,,"In thinking about the best way to implement https://github.com/simonw/datasette-auth-passwords/issues/6 (SQL-backed user accounts for `datasette-auth-passwords`) I realized that there are a few different use-cases where a plugin might want to store data that isn't visible to regular Datasette users: - Storing password hashes - Storing API tokens - Storing secrets that are used for data import integrations (secrets for talking to the Twitter API for example) Idea: allow one or more private database files to be attached to Datasette, something like this: datasette github.db linkedin.db -s secrets.db -m metadata.yml The 
`secrets.db` file would not be visible using any of Datasette's usual interface or API routes - but plugins would be able to run queries against it. So `datasette-auth-passwords` might then be configured like this: ```yaml plugins: datasette-auth-passwords: database: secrets sql: ""select password_hash from passwords where username = :username"" ``` The plugin could even refuse to operate against a database that hadn't been loaded as a secret database.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/950/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 814595021,MDU6SXNzdWU4MTQ1OTUwMjE=,1241,Share button for copying current URL,7107523,Kabouik,open,0,,,,,6,2021-02-23T15:55:40Z,2023-08-24T20:09:52Z,,NONE,,"I use datasette in an `iframe` inside another HTML file that contains other ways to represent my data (mostly Leaflet maps built with R on summarized data), and the datasette `iframe` is a tab in that page. This particular use prevents users from accessing the full URLs of their datasette views and queries, which is a shame because the way datasette handles URLs to make every view or query easy to share is awesome. I know how to get the URL from the context menu of my browser, but I don't think many visitors would do it or even notice that datasette uses permalinks for pretty much every action they do. Would it be possible to add a ""Share link"" button to the interface, either in datasette itself or in a plugin?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1241/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1855885427,I_kwDOBm6k_c5unpBz,2143,De-tangling Metadata before Datasette 1.0,15178711,asg017,open,0,,,,,24,2023-08-18T00:51:50Z,2023-08-24T18:28:27Z,,CONTRIBUTOR,,"Metadata in Datasette is a really powerful feature, but is a bit difficult to work with. It was initially a way to add ""metadata"" about your ""data"" in Datasette instances, like descriptions for databases/tables/columns, titles, source URLs, licenses, etc. But it later became the go-to spot for other Datasette features that have nothing to do with metadata, like permissions/plugins/canned queries. Specifically, I've found the following problems when working with Datasette metadata: 1. Metadata cannot be updated without re-starting the entire Datasette instance. 2. The `metadata.json`/`metadata.yaml` has become a kitchen sink of unrelated (imo) features like plugin config, authentication config, canned queries 3. The Python APIs for defining extra metadata are a bit awkward (the `datasette.metadata()` class, `get_metadata()` hook, etc.) ## Possible solutions Here are a few ideas of Datasette core changes we can make to address these problems. ### Re-vamp the Datasette Python metadata APIs The Datasette object has a single `datasette.metadata()` method that's a bit difficult to work with. There's also no Python API for inserting new metadata, so plugins have to rely on the `get_metadata()` hook. The `get_metadata()` hook can also be improved - it doesn't work with async functions yet, so you're quite limited to what you can do. 
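For reference, here is a minimal sketch of the current synchronous hook shape (the returned values are illustrative):

```python
from datasette import hookimpl

@hookimpl
def get_metadata(datasette, key, database, table):
    # Must return a metadata dictionary synchronously - there is
    # no way to await a database query or other async work in here
    return {'databases': {'fixtures': {'description': 'A demo database'}}}
```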
(I'm a bit fuzzy on what to actually do here, but I imagine it'll be very small breaking changes to a few Python methods) ### Add an optional `datasette_metadata` table Datasette should detect and use metadata stored in a new special table called `datasette_metadata`. This would be a regular table that a user can edit on their own, and would serve as a ""live updating"" source of metadata that can be changed while the Datasette instance is running. Not too sure what the schema would look like, but I'd imagine: ```sql CREATE TABLE datasette_metadata( level text, target any, key text, value any, primary key (level, target) ) ``` Every row in this table would map to a single metadata ""entry"". - `level` would be one of ""datasette"", ""database"", ""table"", ""column"", which is the ""level"" the entry describes. For example, `level=""table""` means it is metadata about a specific table, `level=""database""` for a specific database, or `level=""datasette""` for the entire Datasette instance. - `target` would ""point"" to the specific object the entry metadata is about, and would depend on which `level` is specified. - `level=""database""`: `target` would be the string name of the database that the metadata entry is about. ex `""fixtures""` - `level=""table""`: `target` would be a JSON array of two strings. The first element would be the database name, and the second would be the table name. ex `[""fixtures"", ""students""]` - `level=""column""`: `target` would be a JSON array of 3 strings: the database name, table name, and column name. Ex `[""fixtures"", ""students"", ""student_id""]` - `key` would be the type of metadata entry the row has, similar to the current ""keys"" that exist in `metadata.json`. Ex `""about_url""`, `""source""`, `""description""`, etc. - `value` would be the text value of the metadata entry. The literal text value of a description, about_url, column_label, etc. A quick sample:

level | target | key | value
-- | -- | -- | --
datasette | NULL | title | my datasette title...
db | fixtures | source | 
table | [""fixtures"", ""students""] | label_column | student_name
column | [""fixtures"", ""students"", ""birthdate""] | description | 

This `datasette_metadata` would be configured with other tools, and hopefully not manually by end users. Datasette Core could also offer a UI for editing entries in `datasette_metadata`, to update descriptions/columns on the fly. ### Re-vamp `metadata.json` and move non-metadata config to another place The motivation behind this is that it's awkward that `metadata.json` contains config about things that are not strictly metadata, including: - Plugin configuration - [Authentication/permissions](https://docs.datasette.io/en/latest/authentication.html#access-permissions-in-metadata) (ex: the `allow` key on datasettes/databases/tables) - Canned queries. This might be controversial, but in my mind, canned queries are application-specific code and configuration, and don't describe the data that exists in SQLite databases. I think we should move these outside of `metadata.json` and into a different file. The `datasette.json` idea in #2093 may be a good solution here: plugin/permissions/canned queries can be defined in `datasette.json`, while `metadata.json`/`datasette_metadata` will strictly be about documenting databases/tables/columns. 
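To make the `datasette_metadata` idea concrete, here is a hedged sketch of populating it with `sqlite-utils` - the schema and keys come from the proposal above, everything else is illustrative:

```python
import json
import sqlite_utils

db = sqlite_utils.Database('fixtures.db')
# Populate the proposed datasette_metadata table with sample entries
db['datasette_metadata'].insert_all([
    {'level': 'datasette', 'target': None,
     'key': 'title', 'value': 'my datasette title...'},
    {'level': 'table', 'target': json.dumps(['fixtures', 'students']),
     'key': 'label_column', 'value': 'student_name'},
])
```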
",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2143/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1858228057,I_kwDOBm6k_c5uwk9Z,2147,Plugin hook for database queries that are run,18899,jackowayed,open,0,,,,,6,2023-08-20T18:43:50Z,2023-08-24T03:54:35Z,,NONE,,"I'm interested in making a plugin that saves every query that gets run to a table in the database. (I know about datasette-query-history but thought it would be good to have a server-side option.) As far as I can tell reading the docs, there isn't really a hook setup to allow this. Maybe I could hack it with some of the hooks that are passed requests, but that doesn't seem good. I'm a little surprised this isn't possible, so I thought I would open an issue and see if that's a deeply considered decision or just ""haven't needed it yet."" I'm potentially interested in implementing the hook if the latter.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2147/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1863810783,I_kwDOBm6k_c5vF37f,2150,form label { width: 15% } is a bad default,9599,simonw,closed,0,,,,,4,2023-08-23T18:22:27Z,2023-08-23T18:37:18Z,2023-08-23T18:35:48Z,OWNER,,"See: - https://github.com/simonw/datasette-configure-fts/issues/14 - https://github.com/simonw/datasette-auth-tokens/issues/12",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2150/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1781022369,I_kwDOBm6k_c5qKD6h,2091,Drop support for Python 3.7,9599,simonw,closed,0,,,,,3,2023-06-29T15:06:38Z,2023-08-23T18:18:18Z,2023-08-23T18:18:18Z,OWNER,,"It's EOL now, as of 2023-06-27 (two days ago): https://devguide.python.org/versions/ ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2091/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1795051447,I_kwDOBm6k_c5q_k-3,2097,Drop Python 3.7,9599,simonw,closed,0,,,,,0,2023-07-08T18:39:44Z,2023-08-23T18:18:00Z,2023-08-23T18:18:00Z,OWNER,,"> I'm going to drop Python 3.7. _Originally posted by @simonw in https://github.com/simonw/datasette/issues/1153#issuecomment-1627455892_ It's not supported any more: https://devguide.python.org/versions/",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2097/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 459509126,MDU6SXNzdWU0NTk1MDkxMjY=,516,Enforce import sort order with isort,9599,simonw,open,0,,,,,8,2019-06-22T20:35:50Z,2023-08-23T02:15:36Z,,OWNER,,"I want to use isort to order imports. 
A few steps here: - [x] Add a .isort.cfg file (see below) - [x] Use `isort -rc` to reformat existing code - [ ] Commit this change - [x] Add a unit test that ensures future changes remain isort compatible",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/516/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 449886319,MDU6SXNzdWU0NDk4ODYzMTk=,493,Rename metadata.json to config.json,9599,simonw,closed,0,,,3268330,Datasette 1.0,7,2019-05-29T15:48:03Z,2023-08-23T01:29:21Z,2023-08-23T01:29:20Z,OWNER,,"It is increasingly being useful configuration options, when it started out as purely metadata. Could cause confusion with the `--config` mechanism though - maybe that should be called ""settings"" instead?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/493/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 324720095,MDU6SXNzdWUzMjQ3MjAwOTU=,275,"""config"" section in metadata.json (root, database and table level)",9599,simonw,closed,0,,,,,3,2018-05-20T16:02:28Z,2023-08-23T01:28:37Z,2023-08-23T01:28:37Z,OWNER,,"Split off from #274 Metadata should an optional `""config""` section at root, table or database level. The TableView and RowView and DatabaseView and BaseView classes could all have a `.config(""key"")` method which knows how to resolve the hierarchy of configs. This will allow individual tables (or databases) to set their own config settings for things like `sql_time_limit_ms`",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/275/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1857234285,I_kwDOBm6k_c5usyVt,2145,If a row has a primary key of `null` various things break,9599,simonw,open,0,,,,,23,2023-08-18T20:06:28Z,2023-08-21T17:30:01Z,,OWNER,,"Stumbled across this while experimenting with `datasette-write-ui`. 
The error I got was a 500 on the `/db` page: > `'NoneType' object has no attribute 'encode'` Tracked it down to this code, which assembles the URL for a row page: https://github.com/simonw/datasette/blob/943df09dcca93c3b9861b8c96277a01320db8662/datasette/utils/__init__.py#L120-L134 That's because `tilde_encode` can't handle `None`: https://github.com/simonw/datasette/blob/943df09dcca93c3b9861b8c96277a01320db8662/datasette/utils/__init__.py#L1175-L1178 ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2145/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1857851384,I_kwDOCGYnMM5uvI_4,587,New .add_foreign_key() can break if PRAGMA legacy_alter_table=ON and there's an invalid foreign key reference,9599,simonw,closed,0,,,,,3,2023-08-19T20:01:26Z,2023-08-19T20:04:33Z,2023-08-19T20:04:32Z,OWNER,,"Extremely detailed story of how I got to this point: - https://github.com/simonw/llm/issues/162 Steps to reproduce (only if that pragma is on though): ```bash python -c ' import sqlite_utils db = sqlite_utils.Database(memory=True) db.execute("""""" CREATE TABLE ""logs"" ( [id] INTEGER PRIMARY KEY, [model] TEXT, [prompt] TEXT, [system] TEXT, [prompt_json] TEXT, [options_json] TEXT, [response] TEXT, [response_json] TEXT, [reply_to_id] INTEGER, [chat_id] INTEGER REFERENCES [log]([id]), [duration_ms] INTEGER, [datetime_utc] TEXT ); """""") db[""logs""].add_foreign_key(""reply_to_id"", ""logs"", ""id"") ' ``` This succeeds in some environments, fails in others.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/587/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1817289521,I_kwDOCGYnMM5sUaMx,577,Get `add_foreign_keys()` to work without modifying `sqlite_master`,9599,simonw,closed,0,,,,,9,2023-07-23T20:40:18Z,2023-08-18T17:43:11Z,2023-08-18T00:48:10Z,OWNER,,"https://github.com/simonw/sqlite-utils/blob/13ebcc575d2547c45e8d31288b71a3242c16b886/sqlite_utils/db.py#L1165-L1174 This is the only place in the code that attempts to modify `sqlite_master` directly, which fails on some Python installations. Could this use the `.transform()` trick instead? 
Or automatically switch to that trick if it hits an error?",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/577/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1856075668,I_kwDOCGYnMM5uoXeU,586,.transform() fails to drop column if table is part of a view,9599,simonw,open,0,,,,,3,2023-08-18T05:25:22Z,2023-08-18T06:13:47Z,,OWNER,,"I got this error trying to drop a column from a table that was part of a SQL view: > error in view plugins: no such table: main.pypi_releases Upon further investigation I found that this pattern seemed to fix it: ```python def transform_the_table(conn): # Run this in a transaction: with conn: # We have to read all the views first, because we need to drop and recreate them db = sqlite_utils.Database(conn) views = {v.name: v.schema for v in db.views if table.lower() in v.schema.lower()} for view in views.keys(): db[view].drop() db[table].transform( types=types, rename=rename, drop=drop, column_order=[p[0] for p in order_pairs], ) # Now recreate the views for name, schema in views.items(): db.create_view(name, schema) ``` So grab a copy of any view that might reference this table, start a transaction, drop those views, run the transform, recreate the views again. > I wonder if this should become an option in `sqlite-utils`? Maybe a `recreate_views=True` argument for `table.transform(...)`? Should it be opt-in or opt-out? _Originally posted by @simonw in https://github.com/simonw/datasette-edit-schema/issues/35#issuecomment-1683370548_ ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/586/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1855894222,I_kwDOCGYnMM5unrLO,585,CLI equivalents to `transform(add_foreign_keys=)`,9599,simonw,closed,0,,,,,7,2023-08-18T01:07:15Z,2023-08-18T01:51:16Z,2023-08-18T01:51:15Z,OWNER,,"The new options added in: - #577 Deserve consideration in the CLI as well. https://github.com/simonw/sqlite-utils/blob/d2bcdc00c6ecc01a6e8135e775ffdb87572b802b/sqlite_utils/db.py#L1706-L1708",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/585/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1754174496,I_kwDOCGYnMM5ojpQg,558,Ability to define unique columns when creating a table,1910303,aguinane,open,0,,,,,0,2023-06-13T06:56:19Z,2023-08-18T01:06:03Z,,NONE,,"When creating a new table, it would be good to have an option to set unique columns similar to how not_null is set. ```python from sqlite_utils import Database columns = {""mRID"": str, ""name"": str} db = Database(""example.db"") db[""ExampleTable""].create(columns, pk=""mRID"", not_null=[""mRID""], if_not_exists=True) db[""ExampleTable""].create_index([""mRID""], unique=True, if_not_exists=True) ``` So something like this would add the UNIQUE flag to the table definition. 
```python db[""ExampleTable""].create(columns, pk=""mRID"", not_null=[""mRID""], unique=[""mRID""], if_not_exists=True) ``` ```sql CREATE TABLE ExampleTable ( mRID TEXT PRIMARY KEY NOT NULL UNIQUE, name TEXT ); ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/558/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1855836914,I_kwDOCGYnMM5undLy,583,Get rid of test.utils.collapse_whitespace,9599,simonw,closed,0,,,,,1,2023-08-17T23:31:09Z,2023-08-18T00:59:19Z,2023-08-18T00:59:19Z,OWNER,,"I have a neater pattern for this now - instead of: https://github.com/simonw/sqlite-utils/blob/1dc6b5aa644a92d3654f7068110ed7930989ce71/tests/test_create.py#L472-L475 I now prefer: https://github.com/simonw/sqlite-utils/blob/1dc6b5aa644a92d3654f7068110ed7930989ce71/tests/test_create.py#L1163-L1171",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/583/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1847201263,I_kwDOBm6k_c5uGg3v,2140,Remove all remaining documentation instances of '$ ',9599,simonw,closed,0,,,,,1,2023-08-11T17:42:13Z,2023-08-11T17:52:25Z,2023-08-11T17:45:00Z,OWNER,,"For example this: https://github.com/simonw/datasette/blob/4535568f2ce907af646304d0ebce2500ebd55677/docs/authentication.rst?plain=1#L33-L35 The problem with that `$ ` prefix is that it prevents users from copying and pasting the raw command. https://docs.datasette.io/en/stable/authentication.html#using-the-root-actor",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2140/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1838266862,I_kwDOBm6k_c5tkbnu,2126,Permissions in metadata.yml / metadata.json,36199671,ctsrc,closed,0,,,,,3,2023-08-06T16:24:10Z,2023-08-11T05:52:30Z,2023-08-11T05:52:29Z,NONE,,"https://docs.datasette.io/en/latest/authentication.html#other-permissions-in-metadata says the following: > For all other permissions, you can use one or more ""permissions"" blocks in your metadata. > To grant access to the permissions debug tool to all signed in users you can grant permissions-debug to any actor with an id matching the wildcard * by adding this a the root of your metadata: ```yaml permissions: debug-menu: id: '*' ``` I tried this. My `metadata.yml` file looks like: ```yaml permissions: debug-menu: id: '*' permissions-debug: id: '*' plugins: datasette-auth-passwords: myuser_password_hash: $env: ""PASSWORD_HASH_MYUSER"" ``` And then I run ```zsh datasette -m metadata.yml tiddlywiki.db --root ``` And I open a session for the ""root"" user of datasette with the link given. 
I open a private browser session and log in as ""myuser"" from http://127.0.0.1:8001/-/login Then I check http://127.0.0.1:8001/-/actor which confirms that I am logged in as the ""myuser"" actor ```json { ""actor"": { ""id"": ""myuser"" } } ``` In the session where I am logged in as ""myuser"" I then try to go to http://127.0.0.1:8001/-/permissions But all I get there as the logged in user ""myuser"" is > Forbidden > > Permission denied And then if I check the http://127.0.0.1:8001/-/permissions as the datasette ""root"" user from another browser session, I see: > permissions-debug checked at 2023-08-06T16:22:58.997841 ✗ (used default) > > Actor: {""id"": ""myuser""} It seems that in spite of having tried to give the `permissions-debug` permission to the ""myuser"" user in my `metadata.yml` file, datasette does not agree that ""myuser"" has permission `permissions-debug`.. What do I need to do differently so that my ""myuser"" user is able to access http://127.0.0.1:8001/-/permissions ?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2126/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1823393475,I_kwDOBm6k_c5srsbD,2119,"database color shows only on index page, not other pages",9599,simonw,closed,0,,,3268330,Datasette 1.0,3,2023-07-27T00:19:39Z,2023-08-11T05:25:45Z,2023-08-11T05:16:24Z,OWNER,,"I think this has been a bug for a long time. https://latest.datasette.io/ currently shows: Those colors are based on a hash of the database name. But when you click through to https://latest.datasette.io/fixtures It's red on all sub-pages too.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2119/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 476852861,MDU6SXNzdWU0NzY4NTI4NjE=,568,Add database_color as a configurable option,50906992,LBHELewis,open,0,,,,,1,2019-08-05T13:14:45Z,2023-08-11T05:19:42Z,,NONE,,This would be really useful as it would allow us to tie in with colour schemes.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/568/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1846076261,I_kwDOBm6k_c5uCONl,2139,border-color: ##ff0000 bug - two hashes,9599,simonw,closed,0,,,8755003,Datasette 1.0a-next,2,2023-08-11T01:22:58Z,2023-08-11T05:16:24Z,2023-08-11T05:16:24Z,OWNER,,"Spotted this on https://latest.datasette.io/extra_database ```html
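<!-- The HTML snippet itself was lost in this export; a representative line with the doubled hash named in the issue title would be: -->
<div style=""border-color: ##ff0000"">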
```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2139/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1838469176,I_kwDOBm6k_c5tlNA4,2127,Context base class to support documenting the context,9599,simonw,open,0,,,3268330,Datasette 1.0,3,2023-08-07T00:01:02Z,2023-08-10T01:30:25Z,,OWNER,,"This idea first came up here: - https://github.com/simonw/datasette/issues/2112#issuecomment-1652751140 If `datasette.render_template(...)` takes an optional `Context` subclass as an alternative to a context dictionary, I could then use dataclasses to define the context made available to specific templates - which then gives me something I can use to help document what they are. Also refs: - https://github.com/simonw/datasette/issues/1510",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2127/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1843391585,I_kwDOBm6k_c5t3-xh,2134,Add writable canned query demo to latest.datasette.io,9599,simonw,closed,0,,,,,5,2023-08-09T14:31:30Z,2023-08-10T01:22:46Z,2023-08-10T01:05:56Z,OWNER,,"This would be useful while working on: - #2114",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2134/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1844213115,I_kwDOBm6k_c5t7HV7,2138,on_success_message_sql option for writable canned queries,9599,simonw,closed,0,,,8755003,Datasette 1.0a-next,2,2023-08-10T00:20:14Z,2023-08-10T00:39:40Z,2023-08-10T00:34:26Z,OWNER,,"> Or... how about if the `on_success_message` option could define a SQL query to be executed to generate that message? Maybe `on_success_message_sql`. - https://github.com/simonw/datasette/issues/2134",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2138/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1841501975,I_kwDOBm6k_c5twxcX,2133,[feature request]`datasette install plugins.json` options,54462,HaveF,closed,0,,,,,9,2023-08-08T15:06:50Z,2023-08-10T00:31:24Z,2023-08-09T22:04:46Z,NONE,,"Hi, simon ❤️ `datasette plugins --all > plugins.json` could generate all plugins info. On another machine, it would be great to install all plugins just by `datasette install plugins.json`",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2133/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 627794879,MDU6SXNzdWU2Mjc3OTQ4Nzk=,782,Redesign default .json format,9599,simonw,closed,0,,,8755003,Datasette 1.0a-next,55,2020-05-30T18:47:07Z,2023-08-10T00:07:17Z,2023-08-10T00:07:17Z,OWNER,,The default JSON just isn't right. 
I find myself using `?_shape=array` for almost everything I build against the API.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/782/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1843600087,I_kwDOBm6k_c5t4xrX,2135,Release notes for 1.0a3,9599,simonw,closed,0,,,9700784,Datasette 1.0a3,3,2023-08-09T16:09:26Z,2023-08-09T19:17:07Z,2023-08-09T19:17:06Z,OWNER,,118 commits! https://github.com/simonw/datasette/compare/1.0a2...26be9f0445b753fb84c802c356b0791a72269f25,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2135/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1843710170,I_kwDOBm6k_c5t5Mja,2136,Query view shouldn't return `columns`,9599,simonw,closed,0,,,9700784,Datasette 1.0a3,4,2023-08-09T17:23:57Z,2023-08-09T19:03:04Z,2023-08-09T19:03:04Z,OWNER,,"I just noticed that https://latest.datasette.io/fixtures/roadside_attraction_characteristics.json?_labels=on&_size=1 returns: ```json { ""ok"": true, ""next"": ""1"", ""rows"": [ { ""rowid"": 1, ""attraction_id"": { ""value"": 1, ""label"": ""The Mystery Spot"" }, ""characteristic_id"": { ""value"": 2, ""label"": ""Paranormal"" } } ], ""truncated"": false } ``` But https://latest.datasette.io/fixtures.json?sql=select+rowid%2C+attraction_id%2C+characteristic_id+from+roadside_attraction_characteristics+order+by+rowid+limit+1 returns: ```json { ""rows"": [ { ""rowid"": 1, ""attraction_id"": 1, ""characteristic_id"": 2 } ], ""columns"": [ ""rowid"", ""attraction_id"", ""characteristic_id"" ], ""ok"": true, ""truncated"": false } ``` The `columns` key in the query response is inconsistent with the table response.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2136/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1843821954,I_kwDOBm6k_c5t5n2C,2137,Redesign row default JSON,9599,simonw,open,0,,,8755003,Datasette 1.0a-next,1,2023-08-09T18:49:11Z,2023-08-09T19:02:47Z,,OWNER,,"This URL here: https://latest.datasette.io/fixtures/simple_primary_key/1.json?_extras=foreign_key_tables ```json { ""database"": ""fixtures"", ""table"": ""simple_primary_key"", ""rows"": [ { ""id"": ""1"", ""content"": ""hello"" } ], ""columns"": [ ""id"", ""content"" ], ""primary_keys"": [ ""id"" ], ""primary_key_values"": [ ""1"" ], ""units"": {}, ""foreign_key_tables"": [ { ""other_table"": ""foreign_key_references"", ""column"": ""id"", ""other_column"": ""foreign_key_with_blank_label"", ""count"": 0, ""link"": ""/fixtures/foreign_key_references?foreign_key_with_blank_label=1"" }, { ""other_table"": ""foreign_key_references"", ""column"": ""id"", ""other_column"": ""foreign_key_with_label"", ""count"": 1, ""link"": ""/fixtures/foreign_key_references?foreign_key_with_label=1"" }, { ""other_table"": ""complex_foreign_keys"", ""column"": ""id"", ""other_column"": ""f3"", ""count"": 1, ""link"": ""/fixtures/complex_foreign_keys?f3=1"" }, { ""other_table"": ""complex_foreign_keys"", ""column"": ""id"", ""other_column"": ""f2"", ""count"": 0, ""link"": ""/fixtures/complex_foreign_keys?f2=1"" }, { ""other_table"": ""complex_foreign_keys"", ""column"": ""id"", ""other_column"": ""f1"", ""count"": 1, ""link"": 
""/fixtures/complex_foreign_keys?f1=1"" } ], ""query_ms"": 4.226590999678592, ""source"": ""tests/fixtures.py"", ""source_url"": ""https://github.com/simonw/datasette/blob/main/tests/fixtures.py"", ""license"": ""Apache License 2.0"", ""license_url"": ""https://github.com/simonw/datasette/blob/main/LICENSE"", ""ok"": true, ""truncated"": false } ``` That `?_extras=` should be `?_extra=` - plus the row JSON should be redesigned to fit the new default JSON representation.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2137/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1822939274,I_kwDOBm6k_c5sp9iK,2113,Implement and document extras for the new query view page,9599,simonw,open,0,,,8755003,Datasette 1.0a-next,3,2023-07-26T18:24:01Z,2023-08-09T17:35:22Z,,OWNER,,- #2109 ,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2113/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1560662739,I_kwDOBm6k_c5dBdLT,2007,`render_cell()` hook should take an optional `request` argument,9599,simonw,closed,0,,,,,1,2023-01-28T03:13:00Z,2023-08-09T17:15:03Z,2023-01-28T03:34:26Z,OWNER,,From Discord: https://discordapp.com/channels/823971286308356157/996877076982415491/1068227071156965486,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2007/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1840417903,I_kwDOBm6k_c5tsoxv,2131,Refactor code that supports templates_considered comment,9599,simonw,open,0,,,3268330,Datasette 1.0,1,2023-08-08T01:28:36Z,2023-08-09T15:27:41Z,,OWNER,,"I ended up duplicating it here: https://github.com/simonw/datasette/blob/7532feb424b1dce614351e21b2265c04f9669fe2/datasette/views/database.py#L164-L167 I think it should move to `datasette.render_template()` - and maybe have a renamed template variable too.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2131/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1822940263,I_kwDOBm6k_c5sp9xn,2114,Implement canned queries against new query JSON work,9599,simonw,closed,0,,,9700784,Datasette 1.0a3,3,2023-07-26T18:24:50Z,2023-08-09T15:26:58Z,2023-08-09T15:26:57Z,OWNER,,- #2109 ,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2114/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1841343173,I_kwDOBm6k_c5twKrF,2132,Get form fields on query page working again ,9599,simonw,closed,0,,,9700784,Datasette 1.0a3,1,2023-08-08T13:39:05Z,2023-08-08T13:45:10Z,2023-08-08T13:45:09Z,OWNER,,"Caused by: - #2112 https://latest.datasette.io/fixtures?sql=select+pk1%2C+pk2%2C+pk3%2C+content+from+compound_three_primary_keys+where+%22pk1%22+%3D+%3Ap0+order+by+pk1%2C+pk2%2C+pk3+limit+101&p0=b The `:p0` form field is missing. 
Submitting the form results in this error: ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2132/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1840324765,I_kwDOBm6k_c5tsSCd,2129,CSV ?sql= should indicate errors,9599,simonw,open,0,,,3268330,Datasette 1.0,1,2023-08-07T23:13:04Z,2023-08-08T02:02:21Z,,OWNER,,"> https://latest.datasette.io/_memory.csv?sql=select+blah is a blank page right now: ```bash curl -I 'https://latest.datasette.io/_memory.csv?sql=select+blah' ``` ``` HTTP/2 200 access-control-allow-origin: * access-control-allow-headers: Authorization, Content-Type access-control-expose-headers: Link access-control-allow-methods: GET, POST, HEAD, OPTIONS access-control-max-age: 3600 content-type: text/plain; charset=utf-8 x-databases: _memory, _internal, fixtures, fixtures2, extra_database, ephemeral date: Mon, 07 Aug 2023 23:12:15 GMT server: Google Frontend ``` _Originally posted by @simonw in https://github.com/simonw/datasette/issues/2118#issuecomment-1668688947_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2129/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1822982933,I_kwDOBm6k_c5sqIMV,2117,Figure out what to do about `DatabaseView.name`,9599,simonw,closed,0,,,9700784,Datasette 1.0a3,1,2023-07-26T18:58:06Z,2023-08-08T02:02:07Z,2023-08-08T02:02:07Z,OWNER,,"In the old code: https://github.com/simonw/datasette/blob/08181823990a71ffa5a1b57b37259198eaa43e06/datasette/views/database.py#L34-L35 This `name` class attribute was later used by some of the plugin hooks, passed as `view_name`: https://github.com/simonw/datasette/blob/18dd88ee4d78fe9d760e9da96028ae06d938a85c/datasette/hookspecs.py#L50-L54 Figure out how that should work once I've refactored those classes to view functions instead. Refs: - #2109 ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2117/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1822940964,I_kwDOBm6k_c5sp98k,2115,Ensure all tests pass against new query view JSON,9599,simonw,closed,0,,,9700784,Datasette 1.0a3,0,2023-07-26T18:25:20Z,2023-08-08T02:01:39Z,2023-08-08T02:01:38Z,OWNER,,- #2109 ,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2115/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1822938661,I_kwDOBm6k_c5sp9Yl,2112,Build HTML version of /content?sql=...,9599,simonw,closed,0,,,9700784,Datasette 1.0a3,5,2023-07-26T18:23:34Z,2023-08-08T02:01:09Z,2023-08-08T02:01:01Z,OWNER,,"This will help make the hook as robust as possible. - #2109 ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2112/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1822937426,I_kwDOBm6k_c5sp9FS,2111,Implement new /content.json?sql=...,9599,simonw,closed,0,,,9700784,Datasette 1.0a3,4,2023-07-26T18:22:39Z,2023-08-08T02:00:37Z,2023-08-08T02:00:22Z,OWNER,,"This will be the base that the remaining work builds on top of. 
Refs: - #2109 ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2111/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1840329615,I_kwDOBm6k_c5tsTOP,2130,Render plugin mechanism needs `error` and `truncated` fields,9599,simonw,closed,0,,,9700784,Datasette 1.0a3,2,2023-08-07T23:19:19Z,2023-08-08T01:51:54Z,2023-08-08T01:47:42Z,OWNER,,"While working on: - https://github.com/simonw/datasette/pull/2118 It became clear that the `render` callback function documented here: https://docs.datasette.io/en/0.64.3/plugin_hooks.html#register-output-renderer-datasette Needs to grow the ability to be told if an error occurred (an `error` string) and if the results were truncated (a `truncated` boolean).",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2130/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1818838294,I_kwDOCGYnMM5saUUW,578,Plugin hook for adding new output formats,9599,simonw,open,0,,,,,5,2023-07-24T17:29:18Z,2023-08-07T15:41:49Z,,OWNER,,"> What would it take to add a format hook? I'm still thinking about my GIS workflow, and being able to do `sqlite-utils query ... --geojson` would be nice. It's the one place my Datasette workflow is messy, having to do `datasette . --get /path/to/query.geojson --setting max_rows_returned 10000 --load-extension spatialite`. > I know the current pattern is `--csv`, but maybe `--format geojson` is more future-proof. https://discord.com/channels/823971286308356157/997738192360964156/1133076679011602432",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/578/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1839344979,I_kwDOCGYnMM5toi1T,582,Handling CSV/file input that contains NUL bytes,1448859,betatim,open,0,,,,,0,2023-08-07T12:24:14Z,2023-08-07T12:24:14Z,,NONE,,"I was using sqlite-utils to create a DB from a CSV and it turns out the CSV contains a NUL byte. When the processing reaches the line that contains the NUL an exception is raised. I'm wondering if there is something that can be done in `sqlite-utils` to say ""skip lines with encoding errors"" or some such. I think it isn't super straightforward though as the exception comes from inside the `csv` module that does all the parsing. 
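A rough sketch of the kind of pre-filtering workaround this implies, stripping NUL bytes before the `csv` module parses each line (a hypothetical helper, not an existing `sqlite-utils` option):

```python
import csv

def lines_without_nul(fp):
    # csv.reader accepts any iterable of strings, so scrub each line first
    for line in fp:
        yield line.replace('\0', '')

with open('KernelVersions.csv', newline='', encoding='utf-8') as fp:
    for row in csv.reader(lines_without_nul(fp)):
        pass  # hand the cleaned rows to the database insert here
```
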
Concretely the file is the `KernelVersions.csv` from https://www.kaggle.com/datasets/kaggle/meta-kaggle This is the command and output: ``` $ sqlite-utils insert --csv kaggle.db kaggle KernelVersions.csv [------------------------------------] 0% [#####################---------------] 60% 00:04:24Traceback (most recent call last): File ""/home/foobar/miniconda/envs/meta-kaggle/bin/sqlite-utils"", line 10, in sys.exit(cli()) File ""/home/foobar/miniconda/envs/meta-kaggle/lib/python3.10/site-packages/click/core.py"", line 1128, in __call__ return self.main(*args, **kwargs) File ""/home/foobar/miniconda/envs/meta-kaggle/lib/python3.10/site-packages/click/core.py"", line 1053, in main rv = self.invoke(ctx) File ""/home/foobar/miniconda/envs/meta-kaggle/lib/python3.10/site-packages/click/core.py"", line 1659, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/home/foobar/miniconda/envs/meta-kaggle/lib/python3.10/site-packages/click/core.py"", line 1395, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/home/foobar/miniconda/envs/meta-kaggle/lib/python3.10/site-packages/click/core.py"", line 754, in invoke return __callback(*args, **kwargs) File ""/home/foobar/miniconda/envs/meta-kaggle/lib/python3.10/site-packages/sqlite_utils/cli.py"", line 1223, in insert insert_upsert_implementation( File ""/home/foobar/miniconda/envs/meta-kaggle/lib/python3.10/site-packages/sqlite_utils/cli.py"", line 1085, in insert_upsert_implementation db[table].insert_all( File ""/home/foobar/miniconda/envs/meta-kaggle/lib/python3.10/site-packages/sqlite_utils/db.py"", line 3198, in insert_all chunk = list(chunk) File ""/home/foobar/miniconda/envs/meta-kaggle/lib/python3.10/site-packages/sqlite_utils/db.py"", line 3742, in fix_square_braces for record in records: File ""/home/foobar/miniconda/envs/meta-kaggle/lib/python3.10/site-packages/sqlite_utils/cli.py"", line 1071, in docs = (decode_base64_values(doc) for doc in docs) File ""/home/foobar/miniconda/envs/meta-kaggle/lib/python3.10/site-packages/sqlite_utils/cli.py"", line 1068, in docs = (verify_is_dict(doc) for doc in docs) File ""/home/foobar/miniconda/envs/meta-kaggle/lib/python3.10/site-packages/sqlite_utils/cli.py"", line 1003, in docs = (dict(zip(headers, row)) for row in reader) _csv.Error: line contains NUL ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/582/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1824457306,I_kwDOBm6k_c5svwJa,2122,Parameters on canned queries: fixed or query-generated list?,1563881,meowcat,open,0,,,,,0,2023-07-27T14:07:07Z,2023-07-27T14:07:07Z,,NONE,,"Hi, currently parameters in canned queries are just text fields. It would be cool to have one of the options below. Would you accept a PR doing something in this direction? (Possibly this could even work as a plugin.) * adding facets, which would work like facets on tables or views, giving a list of selectable options (and leaving parameters as is) * making it possible to provide a query which returns selectable values for a parameter, e.g. 
``` calendar_entries_current_instrument: sql: | select * from calendar_entries where DTEND_UNIX > UNIXEPOCH() and DTSTART_UNIX < UNIXEPOCH() + :days *24*60*60 and current = 1 and MACHINE = :instrument order by DTSTART_UNIX params: days: sql: ""SELECT VALUE FROM generate_series(1, 30, 1)"" # this obviously requires the corresponding sqlite extension instrument: sql: ""SELECT DISTINCT MACHINE FROM calendar_entries"" ``` * making it possible to provide a fixed list of parameters ``` calendar_entries_current_instrument: sql: | select * from calendar_entries where DTEND_UNIX > UNIXEPOCH() and DTSTART_UNIX < UNIXEPOCH() + :days *24*60*60 and current = 1 and MACHINE = :instrument order by DTSTART_UNIX params: days: values: [1, 2, 3, 5, 10, 20, 30] instrument: values: [supermachine, crappymachine, boringmachine] ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2122/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1823428714,I_kwDOBm6k_c5sr1Bq,2120,Add __all__ to datasette/__init__.py,9599,simonw,open,0,,,,,0,2023-07-27T01:07:10Z,2023-07-27T01:07:10Z,,OWNER,,"Currently looks like this: https://github.com/simonw/datasette/blob/08181823990a71ffa5a1b57b37259198eaa43e06/datasette/__init__.py#L1-L6 Adding `__all__ = [""Permission"", ""Forbidden""...]` would let me get rid of those `# noqa` comments.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2120/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1822934563,I_kwDOBm6k_c5sp8Yj,2109,Plan for getting the new JSON format query views working,9599,simonw,closed,0,,,9700784,Datasette 1.0a3,5,2023-07-26T18:20:18Z,2023-07-27T00:24:47Z,2023-07-26T18:25:34Z,OWNER,,"I've been stuck on this for too long. I'm breaking it down into a full milestone: https://github.com/simonw/datasette/milestone/29",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2109/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1823160748,I_kwDOCGYnMM5sqzms,581,`sqlite-utils convert --pdb` option,9599,simonw,closed,0,,,,,1,2023-07-26T21:02:50Z,2023-07-26T21:07:45Z,2023-07-26T21:06:10Z,OWNER,,While using `sqlite-utils convert` I realized it would be handy if you could pass `--pdb` to have it open the debugger at the first instance of a failed conversion.,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/581/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1822936521,I_kwDOBm6k_c5sp83J,2110,Merge database index page and query view,9599,simonw,closed,0,,,9700784,Datasette 1.0a3,1,2023-07-26T18:21:57Z,2023-07-26T19:53:25Z,2023-07-26T19:53:25Z,OWNER,,"Refs: - #2109 The idea here is that hitting `/content` without a `?sql=` will show an empty result set AND default to including a bunch of extras about the list of tables in the database. 
Then I won't have to think about `/content` and `/content?sql=` as separate pages any more.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2110/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1822949756,I_kwDOBm6k_c5sqAF8,2116,Turn DatabaseDownload into an async view function,9599,simonw,closed,0,,,9700784,Datasette 1.0a3,3,2023-07-26T18:31:59Z,2023-07-26T18:44:00Z,2023-07-26T18:44:00Z,OWNER,,"A minor refactor, but it is a good starting point for this new branch. Refs: - #2109",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2116/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1816857442,I_kwDOBm6k_c5sSwti,2106,`datasette install -e` option,9599,simonw,closed,0,,,,,3,2023-07-22T18:33:42Z,2023-07-26T18:28:33Z,2023-07-22T18:42:54Z,OWNER,,"As seen in LLM and now in `sqlite-utils` too: - https://github.com/simonw/sqlite-utils/issues/570 Useful for developing plugins, see tutorial at https://llm.datasette.io/en/stable/plugins/tutorial-model-plugin.html",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2106/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1822918995,I_kwDOCGYnMM5sp4lT,580,Add way to export to a csv file using the Python library,44324811,kevinlinxc,open,0,,,,,0,2023-07-26T18:09:26Z,2023-07-26T18:09:26Z,,NONE,,"According to the documentation, we can make a csv output using the CLI tool, but not the Python library. Could we have the latter?",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/580/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1822813627,I_kwDOBm6k_c5spe27,2108,some (many?) 
SQL syntax errors are not throwing errors with a .csv endpoint,536941,fgregg,open,0,,,,,0,2023-07-26T16:57:45Z,2023-07-26T16:58:07Z,,CONTRIBUTOR,,"here's a CTE query that should always fail with a syntax error: ```sql with foo as (nonsense) select * from foo; ``` when we make this query against the default endpoint, we do indeed get a 400 status code and the problem is returned to the user: https://global-power-plants.datasettes.com/global-power-plants?sql=with+foo+as+%28nonsense%29+select+*+from+foo%3B but, if we use the csv endpoint, we get a 200 status code and no indication of a problem: https://global-power-plants.datasettes.com/global-power-plants.csv?sql=with+foo+as+%28nonsense%29+select+*+from+foo%3B same with this bad sql ```sql select a, from foo; ``` https://global-power-plants.datasettes.com/global-power-plants?sql=select%0D%0A++a%2C%0D%0Afrom%0D%0A++foo%3B vs https://global-power-plants.datasettes.com/global-power-plants.csv?sql=select%0D%0A++a%2C%0D%0Afrom%0D%0A++foo%3B but, datasette catches this bad sql at both endpoints: ```sql slect a from foo; ``` https://global-power-plants.datasettes.com/global-power-plants?sql=slect%0D%0A++a%0D%0Afrom%0D%0A++foo%3B https://global-power-plants.datasettes.com/global-power-plants.csv?sql=slect%0D%0A++a%0D%0Afrom%0D%0A++foo%3B ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2108/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1821108702,I_kwDOCGYnMM5si-ne,579,Special handling for SQLite column of type `JSON`,15178711,asg017,open,0,,,,,0,2023-07-25T20:37:23Z,2023-07-25T20:37:23Z,,CONTRIBUTOR,,"`sqlite-utils` should detect and have special handling for columns with a `JSON` type. For example: ```sql CREATE TABLE ""dogs"" ( id INTEGER PRIMARY KEY, name TEXT, friends JSON ); ``` ## Automatic Nesting According to [""Nested JSON Values""](https://sqlite-utils.datasette.io/en/stable/cli.html#nested-json-values), sqlite-utils will only expand JSON if the `--json-cols` flag is passed. It looks like it'll try to `json.load` all text columns to test if it's JSON, which can get expensive on non-json columns. Instead, `sqlite-utils` should by default (ie without the `--json-cols` flag) do the `maybe_json()` operation on columns with a declared `JSON` type. So the above table would expand the `""friends""` column as expected, without the `--json-cols` flag: ```bash sqlite-utils dogs.db ""select * from dogs"" | python -mjson.tool ``` ``` [ { ""id"": 1, ""name"": ""Cleo"", ""friends"": [ { ""name"": ""Pancakes"" }, { ""name"": ""Bailey"" } ] } ] ``` --- I'm sure there are other ways `sqlite-utils` can specially handle JSON columns, so keeping this open while I think of more",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/579/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1817281557,I_kwDOC8SPRc5sUYQV,37,cannot use jinja filters in display?,10352819,rprimet,closed,0,,,,,1,2023-07-23T20:09:54Z,2023-07-23T20:18:27Z,2023-07-23T20:18:26Z,NONE,,"Hi, I'm trying to have a display function in Dogsheep's `config.yml` that includes something like this: ```

{{ display.title }} (source)

{{ display.snippet|safe }}

``` Unfortunately, rendering fails with a message 'urls is undefined'. The same happens if I'm trying to build a row URL manually, using filters like `quote_plus` (as my keys are URLs). Any hints? Thanks!",197431109,dogsheep-beta,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-beta/issues/37/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1816997390,I_kwDOCGYnMM5sTS4O,576,Backfill the release notes prior to 0.4,9599,simonw,closed,0,,,,,2,2023-07-23T05:41:42Z,2023-07-23T05:49:51Z,2023-07-23T05:48:21Z,OWNER,,"Currently the changelog starts at 0.4: https://sqlite-utils.datasette.io/en/3.34/changelog.html#id115 I want the other releases - according to https://pypi.org/project/sqlite-utils/#history there are three missing: ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/576/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1816919568,I_kwDOCGYnMM5sS_4Q,575,Python API ability to opt-out of connection plugins,9599,simonw,closed,0,,,,,2,2023-07-22T23:01:13Z,2023-07-22T23:17:22Z,2023-07-22T23:08:22Z,OWNER,,"Plugins affecting the CLI by default makes sense to me. I'm less confident about them _always_ affecting users of the Python API. I'm going to have them apply by default, but I'm going to add a mechanism to opt-out on an individual database basis. Basically this: ```python from sqlite_utils import Database db = Database(memory=True, execute_plugins=False) # Anything using db from here on will not execute plugins ``` cc @asg017 Refs: - #567 - #574 ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/575/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1816918185,I_kwDOCGYnMM5sS_ip,574,`prepare_connection()` plugin hook,9599,simonw,closed,0,,,,,3,2023-07-22T22:52:47Z,2023-07-22T23:13:14Z,2023-07-22T22:59:10Z,OWNER,,"> Splitting off an issue for `prepare_connection()` since Alex got the PR in seconds before I shipped 3.34! _Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/567#issuecomment-1646686424_ ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/574/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1801394744,I_kwDOCGYnMM5rXxo4,567,Plugin system,15178711,asg017,closed,0,,,,,9,2023-07-12T17:02:14Z,2023-07-22T22:59:37Z,2023-07-22T22:59:36Z,CONTRIBUTOR,,"I'd like there to be a plugin system for sqlite-utils, similar to the datasette/llm plugins. I'd like to make plugins that would do things like: - Register SQLite extensions for more SQL functions + virtual tables - Register new subcommands - Different input file formats for `sqlite-utils memory` - Different output file formats (in addition to `--csv` `--tsv` `--nl` etc.) A few real-world use-cases of plugins I'd like to see in sqlite-utils: - Register many of my sqlite extensions in sqlite-utils (`sqlite-http`, `sqlite-lines`, `sqlite-regex`, etc.) 
- New subcommands to work with `sqlite-vss` vector tables - Input/output Parquet/Avro/Arrow IPC files with `sqlite-arrow`",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/567/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1816876211,I_kwDOCGYnMM5sS1Sz,571,`.transform(keep_table=...)` option,9599,simonw,closed,0,,,,,1,2023-07-22T19:49:29Z,2023-07-22T22:32:18Z,2023-07-22T22:32:18Z,OWNER,,">> Also need a design for an option for the `.transform()` method to indicate that the new table should be created with a new name without dropping the old one. > > I think `keep_table=""name_of_table""` is good for this. _Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/565#issuecomment-1646657324_ ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/571/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1816877910,I_kwDOCGYnMM5sS1tW,572,Don't test Python 3.7 against textual,9599,simonw,closed,0,,,,,2,2023-07-22T19:57:03Z,2023-07-22T22:16:50Z,2023-07-22T22:16:50Z,OWNER,,"Spotted this in the GitHub Actions logs: ![IMG_5046](https://github.com/simonw/sqlite-utils/assets/9599/81fb1093-cd8a-4019-a612-2e49b500c933) ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/572/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1786243905,I_kwDOCGYnMM5qd-tB,564,Document that running `db.transform()` tidies up the schema indentation,9599,simonw,closed,0,,,,,0,2023-07-03T13:59:28Z,2023-07-22T22:15:34Z,2023-07-22T22:15:34Z,OWNER,,"> ... and it turns out running `.transform()` with no arguments still fixes the format of the schema! ```pycon >>> db[""log""].add_column(""foo"", str) >>> db[""log""].add_column(""bar"", str)
>>> db[""log""].add_column(""baz"", str)
>>> print(db[""log""].schema) CREATE TABLE ""log"" ( [id] INTEGER PRIMARY KEY, [name2] TEXT, [age] INTEGER, [weight] FLOAT , [foo] TEXT, [bar] TEXT, [baz] TEXT) >>> db[""log""].transform()
>>> print(db[""log""].schema) CREATE TABLE ""log"" ( [id] INTEGER PRIMARY KEY, [name2] TEXT, [age] INTEGER, [weight] FLOAT, [foo] TEXT, [bar] TEXT, [baz] TEXT ) ``` _Originally posted by @simonw in https://github.com/simonw/llm/issues/65#issuecomment-1618347727_ ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/564/reactions"", ""total_count"": 1, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 1}",,completed 1205687423,I_kwDOCGYnMM5H3VR_,426,CLI docs should link to Python docs and vice versa,9599,simonw,closed,0,9599,simonw,,,1,2022-04-15T16:05:15Z,2023-07-22T22:13:22Z,2023-07-22T22:13:22Z,OWNER,,"For every command/API method there should be a link to the equivalent in the other form factor. Maybe also link to the API and CLI reference pages too.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/426/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1786258502,I_kwDOCGYnMM5qeCRG,565,Table renaming: db.rename_table() and sqlite-utils rename-table,9599,simonw,closed,0,,,,,6,2023-07-03T14:07:42Z,2023-07-22T22:12:40Z,2023-07-22T22:12:40Z,OWNER,,"> I find myself wanting two new features in `sqlite-utils`: > - The ability to have the new transformed table set to a specific name, while keeping the old table around > - The ability to rename a table (`sqlite-utils` doesn't have a table rename function at all right now) _Originally posted by @simonw in https://github.com/simonw/llm/issues/65#issuecomment-1618375042_ ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/565/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1816851056,I_kwDOCGYnMM5sSvJw,568,"table.create(..., replace=True)",9599,simonw,closed,0,,,,,7,2023-07-22T18:12:22Z,2023-07-22T19:25:35Z,2023-07-22T19:15:44Z,OWNER,,"Found myself using this pattern to quickly prototype a schema: ```python import sqlite_utils db = sqlite_utils.Database(memory=True) print(db[""answers_chunks""].create({ ""id"": int, ""content"": str, ""embedding_type_id"": int, ""embedding"": bytes, ""embedding_content_md5"": str, ""source"": str, }, pk=""id"", transform=True).schema) ``` Using `replace=True` to drop and then recreate the table would be neat here, and would be consistent with other places that use `replace=True`.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/568/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1816852402,I_kwDOCGYnMM5sSvey,569,register_command plugin hook,9599,simonw,closed,0,,,,,3,2023-07-22T18:17:27Z,2023-07-22T19:19:35Z,2023-07-22T19:19:35Z,OWNER,,"> I'm going to start by adding the `register_command` hook using the exact same pattern as Datasette and LLM. 
_Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/567#issuecomment-1646643450_ ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/569/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1816857105,I_kwDOCGYnMM5sSwoR,570,`sqlite-utils install -e` option,9599,simonw,closed,0,,,,,0,2023-07-22T18:32:23Z,2023-07-22T18:55:59Z,2023-07-22T18:32:56Z,OWNER,,"As seen in LLM. Needed while working on: - #567",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/570/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1816830546,I_kwDODEm0Qs5sSqJS,73,Twitter v1 API shutdown,6341745,david-perez,open,0,,,,,0,2023-07-22T16:57:41Z,2023-07-22T16:57:41Z,,NONE,,"I've been using this project reliably over the past two years to periodically download my liked tweets, but unfortunately since 19th July I get: ``` [2023-07-19 21:00:04.937536] File ""/home/pi/code/liked-tweets/lib/python3.7/site-packages/twitter_to_sqlite/utils.py"", line 202, in fetch_timeline [2023-07-19 21:00:04.937606] raise Exception(str(tweets[""errors""])) [2023-07-19 21:00:04.937678] Exception: [{'message': 'You currently have access to a subset of Twitter API v2 endpoints and limited v1.1 endpoints (e.g. media post, oauth) only. If you need access to this endpoint, you may need a different access level. You can learn more here: https://developer.twitter.com/en/portal/product', 'code': 453}] ``` It appears like Twitter has now shut down their v1 endpoints, which is rather gracious of them, considering they [announced they'd be deprecated on 29th April](https://twittercommunity.com/t/reminder-to-migrate-to-the-new-free-basic-or-enterprise-plans-of-the-twitter-api/189737). Unfortunately [retrieving likes using the v2 API](https://developer.twitter.com/en/docs/twitter-api/tweets/likes/introduction) is not part of their [free plan](https://developer.twitter.com/en/portal/products). In fact, with the free plan one can only post and delete tweets and retrieve information about oneself. So I'm afraid this is the end of this very nice project. It was very useful, thank you! 
",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/73/reactions"", ""total_count"": 1, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 1}",, 1811824307,I_kwDOBm6k_c5r_j6z,2105,When reverse proxying datasette with nginx an URL element gets erronously added,2235371,aki-k,open,0,,,,,3,2023-07-19T12:16:53Z,2023-07-21T21:17:09Z,,NONE,,"I use this nginx config: ``` location /datasette-llm { return 302 /datasette-llm/; } location /datasette-llm/ { proxy_set_header Upgrade $http_upgrade; proxy_set_header Connection ""Upgrade""; proxy_http_version 1.1; proxy_set_header X-Real-IP $remote_addr; proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for; proxy_set_header X-Forwarded-Proto https; proxy_set_header X-Forwarded-Host $http_host; proxy_set_header Host $host; proxy_max_temp_file_size 0; proxy_pass http://127.0.0.1:8001/datasette-llm/; proxy_redirect http:// https://; proxy_buffering off; proxy_request_buffering off; proxy_set_header Origin ''; client_max_body_size 0; auth_basic ""datasette-llm""; auth_basic_user_file /etc/nginx/custom-userdb; } ``` Then I start datasette with this command: ``` datasette serve --setting base_url /datasette-llm/ $(llm logs path) ``` Everything else works right, except the links in ""This data as json, CSV"". They get an extra URL element ""datasette-llm"" like this: https://192.168.1.3:5432/datasette-llm/datasette-llm/logs.json?sql=select+*+from+_llm_migrations https://192.168.1.3:5432/datasette-llm/datasette-llm/logs.csv?sql=select+*+from+_llm_migrations&_size=max When I remove that extra ""datasette-llm"" from the URL, those links work too.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2105/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1808215339,I_kwDOBm6k_c5rxy0r,2104,Tables starting with an underscore should be treated as hidden,9599,simonw,open,0,,,,,2,2023-07-17T17:13:53Z,2023-07-18T22:41:37Z,,OWNER,,"Plugins can then take advantage of this pattern, for example: - https://github.com/simonw/datasette-auth-tokens/pull/8",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2104/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1808116827,I_kwDOBm6k_c5rxaxb,2103,data attribute on Datasette tables exposing the primary key of the row,9599,simonw,open,0,,,,,0,2023-07-17T16:18:25Z,2023-07-17T16:18:25Z,,OWNER,,Maybe put it on the `` but probably better to go on the `td.type-pk`.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2103/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1765870617,I_kwDOBm6k_c5pQQwZ,2087,`--settings settings.json` option,9599,simonw,open,0,,,,,2,2023-06-20T17:48:45Z,2023-07-14T17:02:03Z,,OWNER,,"https://discord.com/channels/823971286308356157/823971286941302908/1120705940728066080 > May I add a request to the whole metadata / settings ? 
Allow to pass `--settings path/to/settings.json` instead of having to rely exclusively on directory mode to centralize settings (this would reflect the behavior of providing metadata)",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2087/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1803264272,I_kwDOBm6k_c5re6EQ,2101,alter: true support for JSON write API,9599,simonw,open,0,,,,,1,2023-07-13T15:24:11Z,2023-07-13T15:24:18Z,,OWNER,,"Requested here: https://discord.com/channels/823971286308356157/823971286941302908/1129034187073134642 > The former datasette-insert plugin had an option `?alter=1` to auto-add new columns. Does the JSON write API also have this?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2101/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 771608692,MDU6SXNzdWU3NzE2MDg2OTI=,14,UNIQUE constraint failed: workouts.id,1234956,n8henrie,open,0,,,,,5,2020-12-20T15:11:20Z,2023-07-10T14:46:52Z,,NONE,,"I'm getting an error on my initial attempt to import data: ```console $ healthkit-to-sqlite 20201119\ healthkit\ export.zip healthkit.db Importing from HealthKit [###################################-] 98% 00:00:01 Traceback (most recent call last): File ""venv/bin/healthkit-to-sqlite"", line 8, in sys.exit(cli()) File ""venv/lib/python3.9/site-packages/click/core.py"", line 829, in __call__ return self.main(*args, **kwargs) File ""venv/lib/python3.9/site-packages/click/core.py"", line 782, in main rv = self.invoke(ctx) File ""venv/lib/python3.9/site-packages/click/core.py"", line 1066, in invoke return ctx.invoke(self.callback, **ctx.params) File ""venv/lib/python3.9/site-packages/click/core.py"", line 610, in invoke return callback(*args, **kwargs) File ""venv/lib/python3.9/site-packages/healthkit_to_sqlite/cli.py"", line 57, in cli convert_xml_to_sqlite(fp, db, progress_callback=bar.update, zipfile=zf) File ""venv/lib/python3.9/site-packages/healthkit_to_sqlite/utils.py"", line 34, in convert_xml_to_sqlite workout_to_db(el, db, zipfile) File ""venv/lib/python3.9/site-packages/healthkit_to_sqlite/utils.py"", line 57, in workout_to_db pk = db[""workouts""].insert(record, alter=True, hash_id=""id"").last_pk File ""venv/lib/python3.9/site-packages/sqlite_utils/db.py"", line 1660, in insert return self.insert_all( File ""venv/lib/python3.9/site-packages/sqlite_utils/db.py"", line 1778, in insert_all self.insert_chunk( File ""venv/lib/python3.9/site-packages/sqlite_utils/db.py"", line 1588, in insert_chunk result = self.db.execute(query, params) File ""venv/lib/python3.9/site-packages/sqlite_utils/db.py"", line 213, in execute return self.conn.execute(sql, parameters) sqlite3.IntegrityError: UNIQUE constraint failed: workouts.id ```",197882382,healthkit-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/14/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1795219865,I_kwDOCGYnMM5rAOGZ,566,`--no-headers` doesn't work on most formats,33625,zellyn,open,0,,,,,2,2023-07-09T03:43:36Z,2023-07-09T04:13:35Z,,NONE,,"Version 3.33 ``` sqlite-utils query library.db 'select asin from audible' --fmt plain --no-headers | head -3 asin 0062804006 0062891421 
```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/566/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1795187493,I_kwDODLZ_YM5rAGMl,12,Switch to pyproject.toml,9599,simonw,closed,0,,,,,2,2023-07-09T01:06:56Z,2023-07-09T01:19:43Z,2023-07-09T01:19:42Z,MEMBER,,First of my CLI tools to use https://til.simonwillison.net/python/pyproject,213286752,pocket-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/pocket-to-sqlite/issues/12/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 771202454,MDU6SXNzdWU3NzEyMDI0NTQ=,1153,"Use YAML examples in documentation by default, not JSON",9599,simonw,closed,0,,,,,22,2020-12-18T22:20:15Z,2023-07-08T20:09:48Z,2023-07-08T20:08:13Z,OWNER,,"YAML configuration is much better for multi-line strings, and I'm increasingly adding configuration options to Datasette that benefit from that - fragments of HTML in `description_html` or SQL queries used to configure things like https://github.com/simonw/datasette-atom for example. Rather than confusing things by showing both in the documentation, I should switch all of the default examples to use YAML instead.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1153/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1794097871,I_kwDOBm6k_c5q78LP,2095,"Introduce ""dark mode"" CSS",3315059,jamietanna,open,0,,,,,0,2023-07-07T19:15:58Z,2023-07-07T19:15:58Z,,NONE,,Using [the CSS media query `prefers-color-scheme`](https://developer.mozilla.org/en-US/docs/Web/CSS/@media/prefers-color-scheme) we can provide a dark-mode version of Datasette,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2095/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1785360409,I_kwDOCGYnMM5qanAZ,563,`--empty-null` option when importing CSV,9599,simonw,closed,0,,,,,1,2023-07-03T05:23:36Z,2023-07-03T05:44:43Z,2023-07-03T05:42:30Z,OWNER,,"CSV files with empty cells in (which come through as the empty string) are common and a bit gross. Having an option that means ""and if it's an empty string store `null` instead) would be cool. 
I brainstormed name options here https://chat.openai.com/share/c947b738-ee7d-419c-af90-bc84e90987da",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/563/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1355148385,I_kwDOBm6k_c5Qxexh,1796,Research an upgrade to CodeMirror 6,9599,simonw,closed,0,,,,,4,2022-08-30T04:27:46Z,2023-07-03T04:58:21Z,2023-07-03T04:58:21Z,OWNER,,"There are still a bunch of bugs in CodeMirror 5 that affect various mobile browsers - see Datasette Discord report here: https://discord.com/channels/823971286308356157/823971286941302908/1013878624992108645 https://user-images.githubusercontent.com/9599/187349269-7b7c0c8c-3894-4810-82f0-de7c1eb940b3.mp4 ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1796/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1784794489,I_kwDOCGYnMM5qYc15,562,Explore the intersection between sqlite-utils and dataclasses,9599,simonw,open,0,,,,,1,2023-07-02T19:23:08Z,2023-07-02T19:26:39Z,,OWNER,,"> Aside: this makes me think it might be cool if `sqlite-utils` had a way of working with dataclasses rather than just dicts, and knew how to create a SQLite table to match a dataclass and maybe how to code-generate dataclasses for a specific table schema (dynamically or even using code-generation that can be written to disk, for better editor integrations). _Originally posted by @simonw in https://github.com/simonw/llm/issues/65#issuecomment-1616742529_ ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/562/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1783304750,I_kwDOBm6k_c5qSxIu,2094,JS Plugin Hooks for the Code Editor,15178711,asg017,open,0,,,,,0,2023-07-01T00:51:57Z,2023-07-01T00:51:57Z,,CONTRIBUTOR,,"When #2052 merges, I'd like to add support for adding extensions/functions to the Datasette code editor. I'd eventually like to build a JS plugin for [`sqlite-docs`](https://github.com/asg017/sqlite-docs), to add things like: - Inline documentation for tables/columns on hover - Inline docs for custom functions that are loaded in - More detailed autocomplete for tables/columns/functions I did some hacking to see what this would look like, see here: There can be a new hook that allows JS plugins to add a new ""extension"" to the CodeMirror EditorView here: https://github.com/simonw/datasette/blob/8cd60fd1d899952f1153460469b3175465f33f80/datasette/static/cm-editor-6.0.1.js#L25 Will need some more planning. For example, the CodeMirror bundle in Datasette has functions that we could re-export for plugins to use (so we don't load two versions of `""@codemirror/autocomplete""`, for example). ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2094/reactions"", ""total_count"": 1, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 1, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1781047747,I_kwDOBm6k_c5qKKHD,2092,test_homepage intermittent failure,9599,simonw,closed,0,,,,,2,2023-06-29T15:20:37Z,2023-06-29T15:26:28Z,2023-06-29T15:24:13Z,OWNER,,"e.g.
in https://github.com/simonw/datasette/actions/runs/5413590227/jobs/9839373852 ``` =================================== FAILURES =================================== ________________________________ test_homepage _________________________________ [gw0] linux -- Python 3.7.17 /opt/hostedtoolcache/Python/3.7.17/x64/bin/python ds_client = @pytest.mark.asyncio async def test_homepage(ds_client): response = await ds_client.get(""/.json"") assert response.status_code == 200 assert ""application/json; charset=utf-8"" == response.headers[""content-type""] data = response.json() assert data.keys() == {""fixtures"": 0}.keys() d = data[""fixtures""] assert d[""name""] == ""fixtures"" assert d[""tables_count""] == 24 assert len(d[""tables_and_views_truncated""]) == 5 assert d[""tables_and_views_more""] is True # 4 hidden FTS tables + no_primary_key (hidden in metadata) assert d[""hidden_tables_count""] == 6 # 201 in no_primary_key, plus 6 in other hidden tables: > assert d[""hidden_table_rows_sum""] == 207, data E AssertionError: {'fixtures': {'color': '9403e5', 'hash': None, 'hidden_table_rows_sum': 0, 'hidden_tables_count': 6, ...}} E assert 0 == 207 ``` My guess is that this is a timing error, where very occasionally the ""count rows but stop counting if it exceeds a time limit"" thing fails.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2092/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1781005740,I_kwDOBm6k_c5qJ_2s,2090,Adopt ruff for linting,9599,simonw,open,0,,,,,2,2023-06-29T14:56:43Z,2023-06-29T15:05:04Z,,OWNER,,https://beta.ruff.rs/docs/,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2090/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1780973290,I_kwDOBm6k_c5qJ37q,2089,codespell test failure,9599,simonw,closed,0,,,,,5,2023-06-29T14:40:10Z,2023-06-29T14:48:11Z,2023-06-29T14:48:10Z,OWNER,,"https://github.com/simonw/datasette/actions/runs/5413443676/jobs/9838999356 ``` codespell docs/*.rst --ignore-words docs/codespell-ignore-words.txt codespell datasette -S datasette/static --ignore-words docs/codespell-ignore-words.txt shell: /usr/bin/bash -e {0} env: pythonLocation: /opt/hostedtoolcache/Python/3.9.17/x64 LD_LIBRARY_PATH: /opt/hostedtoolcache/Python/3.9.17/x64/lib docs/metadata.rst:192: displaing ==> displaying ``` This failure is legit, it found a spelling mistake: https://github.com/simonw/datasette/blob/ede62036180993dbd9d4e5d280fc21c183cda1c3/docs/metadata.rst#L192",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2089/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1054244712,I_kwDOBm6k_c4-1n9o,1510,Datasette 1.0 documented template context (maybe via API docs),9599,simonw,open,0,,,3268330,Datasette 1.0,3,2021-11-15T23:23:58Z,2023-06-28T02:05:21Z,,OWNER,,Documented context plus protective unit tests. 
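A "protective unit test" could be as simple as pinning the documented keys, something like this sketch (the key names and the fixture here are invented placeholders, not the actual contract this issue would define):

```python
# Illustrative sketch only - key names and fixture are invented placeholders
DOCUMENTED_TABLE_CONTEXT_KEYS = {"database", "table", "columns", "display_rows"}

def test_documented_table_context_keys(table_template_context):
    # table_template_context: assumed fixture returning the rendered context dict
    missing = DOCUMENTED_TABLE_CONTEXT_KEYS - set(table_template_context)
    assert not missing, f"documented context keys went missing: {missing}"
```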
Goal is that custom templates built for 1.x will not break without a 2.x release.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1510/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1777548699,I_kwDOCGYnMM5p8z2b,561,`--stop-after` option for `insert` and `upsert` commands,9599,simonw,closed,0,,,,,1,2023-06-27T18:44:15Z,2023-06-27T18:50:09Z,2023-06-27T18:50:08Z,OWNER,,I found myself wanting to insert rows from a 849MB CSV file without processing the whole thing: https://huggingface.co/datasets/jerpint-org/HackAPrompt-Playground-Submissions/tree/main,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/561/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 810618495,MDU6SXNzdWU4MTA2MTg0OTU=,235,Extract columns cannot create foreign key relation: sqlite3.OperationalError: table sqlite_master may not be modified,6913891,kristomi,closed,0,,,,,18,2021-02-17T23:33:23Z,2023-06-26T01:47:01Z,2023-06-25T23:25:53Z,NONE,,"Thanks for what seems like a truly great suite of libraries. I wanted to try out Datasette, but never got more than half way through your YouTube video with the SF tree dataset. Whenever I try to extract a column, I get a `sqlite3.OperationalError: table sqlite_master may not be modified` error from Python. This snippet reproduces the error on my system, Python 3.9.1 and sqlite-utils 3.5 on an M1 Macbook Pro running in rosetta mode: ``` curl ""https://data.nasa.gov/resource/y77d-th95.json"" | \ sqlite-utils insert meteorites.db meteorites - --pk=id sqlite-utils extract meteorites.db meteorites recclass ``` I have tried googling the problem, but all I've found is that this *might* be a problem with the sqlite3 database running in defensive mode, but I definitely can't know for sure. Does the problem seem familiar to you? ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/235/reactions"", ""total_count"": 3, ""+1"": 3, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1773450152,I_kwDOCGYnMM5ptLOo,559,sqlean support,9599,simonw,closed,0,,,,,0,2023-06-25T19:27:26Z,2023-06-25T23:25:53Z,2023-06-25T23:25:53Z,OWNER,,"If sqlean is available, use that. Refs: - https://github.com/nalgeon/sqlean.py/issues/1#issuecomment-1605707788 This will provide a good workaround for: - #235 ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/559/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 323223872,MDU6SXNzdWUzMjMyMjM4NzI=,260,Validate metadata.json on startup,9599,simonw,open,0,,,,,7,2018-05-15T13:42:56Z,2023-06-21T12:51:22Z,,OWNER,,"It's easy to misspell the name of a database or table and then be puzzled when the metadata settings silently fail. 
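For illustration, catching such a misspelling at startup could be as simple as this sketch (purely illustrative, not Datasette's actual implementation; `db_name` is a placeholder for whichever database the metadata block configures):

```python
import json
import sqlite3
import sys

def check_metadata(metadata_path, db_name, db_path):
    # Fail fast if metadata.json mentions tables the database doesn't have
    with open(metadata_path) as fp:
        metadata = json.load(fp)
    known = {
        row[0]
        for row in sqlite3.connect(db_path).execute(
            "select name from sqlite_master where type in ('table', 'view')"
        )
    }
    for table in metadata.get("databases", {}).get(db_name, {}).get("tables", {}):
        if table not in known:
            sys.exit(f"metadata.json references unknown table: {table}")
```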
To avoid this, let's sanity check the provided metadata.json on startup and quit with a useful error message if we find any obvious mistakes.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/260/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1764792125,I_kwDOBm6k_c5pMJc9,2086,Show information on startup in directory configuration mode,9599,simonw,open,0,,,,,0,2023-06-20T07:13:33Z,2023-06-20T07:13:33Z,,OWNER,,"https://discord.com/channels/823971286308356157/823971286941302908/1120516587036889098 > One thing that would be helpful would be message at launch indicating a metadata.json is getting picked up. I'm using directory mode and was editing the wrong file for awhile before I realize nothing I was doing was having any effect.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2086/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1762180409,I_kwDOBm6k_c5pCL05,2085,Interactive row selection in Datasette ,24938923,learning4life,open,0,,,,,0,2023-06-18T08:29:45Z,2023-06-18T08:31:23Z,,NONE,,"Simon did an excellent [prototype](https://til.simonwillison.net/datasette/row-selection-prototype) of an interactive row selection in Datasette. I hope this [functionality](https://camo.githubusercontent.com/3d4a0f31fb6a27fd279f809af5b53dc3b76faa63c7721e228951c5252b645a77/68747470733a2f2f7374617469632e73696d6f6e77696c6c69736f6e2e6e65742f7374617469632f323032332f6461746173657474652d7069636b65722e676966) can be turned into a Datasette plugin. ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2085/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1761613778,I_kwDOBm6k_c5pABfS,2084,Support facets for columns that contain timestamps,19492893,devxpy,open,0,,,,,0,2023-06-17T03:33:54Z,2023-06-17T03:33:54Z,,NONE,,"Django has this very nice filter for datetime fields. It would be nice to have something similar in datasette too, to facet by a field that contains a timestamp - which doesn't seem to do anything with timestamps right now... ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2084/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1655860104,I_kwDOCGYnMM5ismuI,535,rows: --transpose or psql extended view-like functionality,7908073,chapmanjacobd,closed,0,,,,,2,2023-04-05T15:37:33Z,2023-06-15T08:39:49Z,2023-06-14T22:05:28Z,CONTRIBUTOR,,"It would be nice if the rows subcommand had a flag, perhaps called `--transpose`, which would print in long form instead of wide.
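For a rough idea of the long-form output being asked for, here is a sketch using the Python API (the `track_metadata.db` and `songs` names come from the example below):

```python
from sqlite_utils import Database

db = Database("track_metadata.db")
for row in db["songs"].rows_where(limit=5):
    for column, value in row.items():
        print(f"{column}: {value}")
    print("-" * 40)  # record separator, similar to psql's expanded display
```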
Similar to extended display mode in psql (`\x`) In other words instead of this: ``` sqlite-utils rows --limit 5 --fmt github track_metadata.db songs ``` | track_id | title | song_id | release | artist_id | artist_mbid | artist_name | duration | artist_familiarity | artist_hotttnesss | year | track_7digitalid | shs_perf | shs_work | |--------------------|-------------------|--------------------|--------------------------------------|--------------------|--------------------------------------|------------------|------------|----------------------|---------------------|--------|--------------------|------------|------------| | TRMMMYQ128F932D901 | Silent Night | SOQMMHC12AB0180CB8 | Monster Ballads X-Mas | ARYZTJS1187B98C555 | 357ff05d-848a-44cf-b608-cb34b5701ae5 | Faster Pussy cat | 252.055 | 0.649822 | 0.394032 | 2003 | 7032331 | -1 | 0 | | TRMMMKD128F425225D | Tanssi vaan | SOVFVAK12A8C1350D9 | Karkuteillä | ARMVN3U1187FB3A1EB | 8d7ef530-a6fd-4f8f-b2e2-74aec765e0f9 | Karkkiautomaatti | 156.551 | 0.439604 | 0.356992 | 1995 | 1514808 | -1 | 0 | | TRMMMRX128F93187D9 | No One Could Ever | SOGTUKN12AB017F4F1 | Butter | ARGEKB01187FB50750 | 3d403d44-36ce-465c-ad43-ae877e65adc4 | Hudson Mohawke | 138.971 | 0.643681 | 0.437504 | 2006 | 6945353 | -1 | 0 | | TRMMMCH128F425532C | Si Vos Querés | SOBNYVR12A8C13558C | De Culo | ARNWYLR1187B9B2F9C | 12be7648-7094-495f-90e6-df4189d68615 | Yerba Brava | 145.058 | 0.448501 | 0.372349 | 2003 | 2168257 | -1 | 0 | | TRMMMWA128F426B589 | Tangle Of Aspens | SOHSBXH12A8C13B0DF | Rene Ablaze Presents Winter Sessions | AREQDTE1269FB37231 | | Der Mystic | 514.298 | 0 | 0 | 0 | 2264873 | -1 | 0 | The output would look something like this: ``` $ for col in (sqlite-columns track_metadata.db songs) sqlite-utils --fmt github track_metadata.db ""select $col from songs order by rowid desc limit 5"" end ``` | track_id | |--------------------| | TRYYYVU12903CD01E3 | | TRYYYDJ128F9310A21 | | TRYYYMG128F4260ECA | | TRYYYJO128F426DA37 | | TRYYYUS12903CD2DF0 | | title | |-------------------------------------| | Fernweh feat. Sektion Kuchikäschtli | | Faraday | | Novemba | | Jago Chhadeo | | O Samba Da Vida | | song_id | |--------------------| | SOWXJXQ12AB0189F43 | | SOLXGOR12A81C21EB7 | | SOHODZI12A8C137BB3 | | SOXQYIQ12A8C137FBB | | SOTXAME12AB018F136 | | release | |---------------------------------| | So Oder So | | The Trance Collection Vol. 2 | | Dub_Connected: electronic music | | Naale Baba Lassi Pee Gya | | Pacha V.I.P. 
| | artist_id | |--------------------| | AR7PLM21187B990D08 | | ARCMCOK1187B9B1073 | | ARZ3R6M1187B9AF750 | | ART5FZD1187B9A7FCF | | AR7Z4J81187FB3FC59 | | artist_mbid | |--------------------------------------| | 3af2b07e-c91c-4160-9bda-f0b9e3144ed3 | | 4ac5f3de-c5ad-475e-ad50-41f1ef9dba20 | | 8b97e9c8-61f5-4615-9a96-276f24204e34 | | 2357c400-9109-42b6-b3fe-9e2d9f8e3872 | | 9d50cb20-7e42-45cc-b0dd-154c3e92a577 | | artist_name | |----------------| | Texta | | Elude | | Gabriel Le Mar | | Kuldeep Manak | | Kiko Navarro | | duration | |------------| | 295.079 | | 484.519 | | 553.038 | | 244.166 | | 217.443 | | artist_familiarity | |----------------------| | 0.552977 | | 0.403668 | | 0.556918 | | 0.4015 | | 0.528617 | | artist_hotttnesss | |---------------------| | 0.454869 | | 0.256935 | | 0.336914 | | 0.374866 | | 0.411595 | | year | |--------| | 2004 | | 0 | | 0 | | 0 | | 0 | | track_7digitalid | |--------------------| | 8486723 | | 5472456 | | 2219291 | | 1632096 | | 7522478 | | shs_perf | |------------| | -1 | | -1 | | -1 | | -1 | | -1 | | shs_work | |------------| | 0 | | 0 | | 0 | | 0 | | 0 | ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/535/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1581090327,I_kwDOCGYnMM5ePYYX,529,Microsoft line endings,7908073,chapmanjacobd,closed,0,,,,,1,2023-02-12T02:20:48Z,2023-06-14T23:12:12Z,2023-06-14T23:11:47Z,CONTRIBUTOR,,"sqlite-utils prints `\r\n` but [it should probably](https://devblogs.microsoft.com/commandline/extended-eol-in-notepad/) print `\n` (unless the platform is detected as Windows?) It has tripped me up a few times when piping the output of sqlite-utils to other programs: ``` $ sqlite-utils --no-headers --csv ~/lb/fs/d.db 'select path from media limit 1' | cat -A /mnt/d7/file^M$ $ sqlite-utils --no-headers --csv ~/lb/fs/d.db 'select path from media limit 1' | tr -d '\r' | cat -A /mnt/d7/file$ ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/529/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1383646615,I_kwDOCGYnMM5SeMWX,491,Ability to merge databases and tables,8904453,sgraaf,open,0,,,,,7,2022-09-23T11:10:55Z,2023-06-14T22:14:24Z,,NONE,,"Hi! Let me firstly say that I am a big fan of your work -- I follow your tweets and blog posts with great interest 😄. Now onto the matter at hand: I think it would be great if `sqlite-utils` included a `merge` or `combine` command, with the purpose of combining different SQLite databases into a single SQLite database. This way, the newly ""merged"" database would contain all differently named tables contained in the databases to be merged as-is, as well a concatenation of all tables of the same name. This could look something like this: ```bash sqlite-utils merge cats.db dogs.db > animals.db ``` I imagine this is rather straightforward if all databases involved in the merge contain differently named tables (i.e. no chance of conflicts), but things get slightly more complicated if two or more of the databases to be merged contain tables with the same name. Not only do you have to ""do something"" with the primary key(s), but these tables could also simply have different schemas (and therefore be incompatible for concatenation to begin with). 
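As a starting point for that discussion, a deliberately naive merge might look like this sketch (the `merge()` function is invented for illustration, and it sidesteps the primary key and schema conflicts raised above):

```python
from sqlite_utils import Database

def merge(target_path, *source_paths):
    target = Database(target_path)
    for source_path in source_paths:
        for table in Database(source_path).tables:
            # alter=True adds any columns missing from the target table;
            # colliding primary keys would still need a real strategy
            target[table.name].insert_all(table.rows, alter=True)

merge("animals.db", "cats.db", "dogs.db")
```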
Anyhow, I would love your thoughts on this, and, if you are open to it, work together on the design and implementation!",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/491/reactions"", ""total_count"": 2, ""+1"": 2, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1733198948,I_kwDOCGYnMM5nToRk,555,Filter table by a large bunch of ids,10843208,redraw,open,0,,,,,1,2023-05-31T00:29:51Z,2023-06-14T22:01:57Z,,NONE,,"Hi! This might be a question related to both SQLite & sqlite-utils, and you might be more experienced with them. I have a large bunch of ids, and I'm wondering which is the best way to query them in terms of performance, and simplicity if possible. The naive approach would be something like `select * from table where rowid in (?, ?, ?...)` but that wouldn't scale if ids are >1k. Another approach might be creating a temp table, or an in-memory db table, inserting all the ids into that table and then joining with the target one. I failed to attach an in-memory db using both sqlite-utils and plain sql's execute(), so my closest approach is something like: ```python def filter_existing_video_ids(video_ids): db = get_db() # contains a ""videos"" table db.execute(""CREATE TEMPORARY TABLE IF NOT EXISTS tmp (video_id TEXT NOT NULL PRIMARY KEY)"") db[""tmp""].insert_all([{""video_id"": video_id} for video_id in video_ids]) for row in db[""tmp""].rows_where(""video_id not in (select video_id from videos)""): yield row[""video_id""] db[""tmp""].drop() ``` That kinda worked, but I couldn't find an option in sqlite-utils's `create_table()` to say it's a temporary table. Also, the `tmp` table is not dropped at the end, not even using `.drop()`, despite being created with the keyword `TEMPORARY`. From what I read, I believe it should be automatically dropped when the connection/session ends.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/555/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1740150327,I_kwDOCGYnMM5nuJY3,557,Aliased ROWID option for tables created from alter=True commands,7908073,chapmanjacobd,closed,0,,,,,2,2023-06-04T05:29:28Z,2023-06-14T06:09:21Z,2023-06-05T19:26:26Z,CONTRIBUTOR,,"> If you use INTEGER PRIMARY KEY column, the VACUUM does not change the values of that column. However, if you use unaliased rowid, the VACUUM command will reset the rowid values. ROWID should never be used with foreign keys, but the simple act of aliasing rowid to id (which is what happens when one does `id integer primary key` DDL) makes it OK. It would be convenient if there were more options: use a string column (eg. filepath) as the PK and be able to use it during upserts, but, when creating a foreign key, create an integer column which aliases rowid. I made an attempt to switch to integer primary keys here but it is not going well... In my use case the path column is a business key.
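The table shape being described might look like this sketch with the current API (the `library.db` and `media` names are invented; an integer `id` aliases rowid for foreign keys, while the `path` business key stays unique alongside it):

```python
from sqlite_utils import Database

db = Database("library.db")
# id aliases rowid, so VACUUM keeps it stable and foreign keys are safe;
# path remains the unique business key used for lookups
db["media"].create({"id": int, "path": str}, pk="id", not_null={"path"})
db["media"].create_index(["path"], unique=True)
```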
Yes, it should be as simple as including the `id` column in any select statement where I plan on using `upsert` but it would be nice if this could be abstracted away somehow https://github.com/chapmanjacobd/library/commit/788cd125be01d76f0fe2153335d9f6b21db1343c https://github.com/chapmanjacobd/library/actions/runs/5173602136/jobs/9319024777",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/557/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1751214236,I_kwDOC8SPRc5oYWic,36,Getting sqlite_master may not be modified when creating dogsheep index,8711912,khushmeeet,open,0,,,,,0,2023-06-11T03:21:53Z,2023-06-11T03:21:53Z,,NONE,,"When creating a `dogsheep` index from `config.yml` file on pocket.db (created using pocket-to-sqlite), I am getting this error ``` Traceback (most recent call last): File ""/Users/khushmeeet/.pyenv/versions/3.11.2/bin/dogsheep-beta"", line 8, in sys.exit(cli()) ^^^^^ File ""/Users/khushmeeet/.pyenv/versions/3.11.2/lib/python3.11/site-packages/click/core.py"", line 1130, in __call__ return self.main(*args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^ File ""/Users/khushmeeet/.pyenv/versions/3.11.2/lib/python3.11/site-packages/click/core.py"", line 1055, in main rv = self.invoke(ctx) ^^^^^^^^^^^^^^^^ File ""/Users/khushmeeet/.pyenv/versions/3.11.2/lib/python3.11/site-packages/click/core.py"", line 1657, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File ""/Users/khushmeeet/.pyenv/versions/3.11.2/lib/python3.11/site-packages/click/core.py"", line 1404, in invoke return ctx.invoke(self.callback, **ctx.params) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File ""/Users/khushmeeet/.pyenv/versions/3.11.2/lib/python3.11/site-packages/click/core.py"", line 760, in invoke return __callback(*args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^ File ""/Users/khushmeeet/.pyenv/versions/3.11.2/lib/python3.11/site-packages/dogsheep_beta/cli.py"", line 36, in index run_indexer( File ""/Users/khushmeeet/.pyenv/versions/3.11.2/lib/python3.11/site-packages/dogsheep_beta/utils.py"", line 32, in run_indexer ensure_table_and_indexes(db, tokenize) File ""/Users/khushmeeet/.pyenv/versions/3.11.2/lib/python3.11/site-packages/dogsheep_beta/utils.py"", line 91, in ensure_table_and_indexes table.add_foreign_key(*fk) File ""/Users/khushmeeet/.pyenv/versions/3.11.2/lib/python3.11/site-packages/sqlite_utils/db.py"", line 2155, in add_foreign_key self.db.add_foreign_keys([(self.name, column, other_table, other_column)]) File ""/Users/khushmeeet/.pyenv/versions/3.11.2/lib/python3.11/site-packages/sqlite_utils/db.py"", line 1116, in add_foreign_keys cursor.execute( sqlite3.OperationalError: table sqlite_master may not be modified ``` Command I ran to get this error ``` dogsheep-beta index pocket.db config.yml ``` Dogsheep version ``` dogsheep-beta, version 0.10.2 ``` Python version ``` Python 3.11.2 ```",197431109,dogsheep-beta,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-beta/issues/36/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1740026046,I_kwDOCGYnMM5ntrC-,556,Support storing incrementally piped values,601708,mcint,open,0,,,,,1,2023-06-04T00:45:23Z,2023-06-04T01:21:15Z,,CONTRIBUTOR,,"I'm trying to use sqlite-utils to data generated incrementally. 
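A minimal Python sketch of the line-by-line pattern being described, assuming newline-delimited JSON arrives on stdin (the `loc.db` / `loc` names come from the example further down):

```python
import json
import sys

from sqlite_utils import Database

db = Database("loc.db")
for line in sys.stdin:
    line = line.strip()
    if line:
        db["loc"].insert(json.loads(line))  # write each row as it arrives
        print(line, flush=True)  # echo it through, tee-style
```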
There are a few aspects of this that I don't currently know how to handle. I would like an option to apply writes incrementally, line-by-line as they are received. I would like an option to echo incremental progress. And, it would be nice to have... In particular, I'm using `CoreLocationCLI -w -j` to generate newline-delimited JSON. One variant of the command is `stdbuf -oL CoreLocationCLI -w -j | pee 'sqlite-utils insert loc.db loc -' nl`. `pee`, from `moreutils`, is like `tee` but spawns and pipes to the processes created by invoking each of its arguments, so, for gratuitous demonstration, `pee 'sponge out.log' cat` would behave like `tee`. It looks like I can get what I want with: `stdbuf -oL CoreLocationCLI -w -j | while read line; do <<<""$line"" sqlite-utils insert loc.db loc -; echo ""$line""; done | nl` ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/556/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1727478903,I_kwDOBm6k_c5m9zx3,2081,Update Endpoints defined in metadata throws 403 Forbidden after a while,15085007,cutmasta-kun,open,0,,,,,0,2023-05-26T11:52:30Z,2023-05-26T11:52:30Z,,NONE,,"Hello. I expose an endpoint to update `tasks`: ``` { ""title"": ""My Datasette Instance"", ""databases"": { ""tasks"": { ""queries"": { ""update_task"": { ""sql"": ""UPDATE tasks SET status = :status, result = :result, systemMessage = :systemMessage WHERE queueID = :queueID"", ""write"": true, ""on_success_message"": ""Task updated"", ""on_success_redirect"": ""/tasks/tasks.json"", ""on_error_message"": ""Task update failed"", ""on_error_redirect"": ""/tasks.json"", ""params"": [""queueID"", ""taskData"", ""status"", ""result"", ""systemMessage""] } } } } } ``` This works really well! But after a while, the Datasette instance answers with **403 Forbidden**. I have to delete the database and recreate it in order for it to work again. Any help here? (´。_。`)",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2081/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1726236847,I_kwDOBm6k_c5m5Eiv,2078,Resolve the difference between `wrap_view()` and `BaseView`,9599,simonw,closed,0,,,,,16,2023-05-25T17:44:32Z,2023-05-26T00:18:46Z,2023-05-26T00:18:46Z,OWNER,,"There are two patterns for implementing views in Datasette at the moment. I want to combine those.
Part of: - #2053",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2078/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1726531350,I_kwDOBm6k_c5m6McW,2079,Datasette should serve Access-Control-Max-Age,9599,simonw,closed,0,,,,,8,2023-05-25T21:50:50Z,2023-05-25T22:56:28Z,2023-05-25T22:08:35Z,OWNER,,"Currently the CORS headers served are: https://github.com/simonw/datasette/blob/9584879534ff0556e04e4c420262972884cac87b/datasette/utils/__init__.py#L1139-L1143 Serving `Access-Control-Max-Age: 600` would allow browsers to cache that for 10 minutes, avoiding additional CORS pre-flight OPTIONS requests during that time.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2079/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1720096994,I_kwDOCGYnMM5mhpji,554,"`IndexError` when doing `.insert(..., pk='id')` after `insert_all`",1231935,xavdid,open,0,,,,,1,2023-05-22T17:13:02Z,2023-05-22T17:18:33Z,,NONE,,"I believe this is related to https://github.com/simonw/sqlite-utils/issues/98. When `pk` is specified by table A's `insert` call, it throws an index error if a different table has written a row with a higher rowid than exists in the first table. Here's a basic example: ```py from sqlite_utils import Database def test_pk_for_insert(fresh_db): user = {""id"": ""abc"", ""name"": ""david""} fresh_db[""users""].insert(user, pk=""id"") fresh_db[""comments""].insert_all( [ {""id"": ""def"", ""text"": ""ok""}, {""id"": ""ghi"", ""text"": ""great""}, ], ) fresh_db[""users""].insert( user, ignore=True, # BUG: when specifying pk on the second insert call # db.py goes into a block it doesn't expect and we get the error pk=""id"", ) if __name__ == ""__main__"": db = Database(""bug.db"") if db[""users""].exists(): raise ValueError( ""bug only shows on a new database - remove bug.db before running the script"" ) test_pk_for_insert(db) ``` The error is: ```py File ""/Users/david/projects/reddit-to-sqlite/.venv/lib/python3.11/site-packages/sqlite_utils/db.py"", line 2960, in insert_chunk row = list(self.rows_where(""rowid = ?"", [self.last_rowid]))[0] ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~^^^ IndexError: list index out of range ``` The issue is in this block: https://github.com/simonw/sqlite-utils/blob/2747257a3334d55e890b40ec58fada57ae8cfbfd/sqlite_utils/db.py#L2954-L2958 relevant locals are: - `pk`: `'id'` - `result.lastrowid`: `2` What's most interesting is the comment `# self.last_rowid will be 0 if a ""INSERT OR IGNORE"" happened`, which doesn't seem to be the case here. 
",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/554/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1718612569,I_kwDOCGYnMM5mb_JZ,552,Document how to setup shell auto-completion,9599,simonw,closed,0,,,,,1,2023-05-21T19:20:41Z,2023-05-21T21:05:16Z,2023-05-21T21:03:40Z,OWNER,,"https://click.palletsprojects.com/en/8.1.x/shell-completion/ This works for `zsh`: eval ""$(_SQLITE_UTILS_COMPLETE=zsh_source sqlite-utils)"" This will probably work for `bash`: eval ""$(_SQLITE_UTILS_COMPLETE=bash_source sqlite-utils)"" Need to add this to the installation docs here: https://sqlite-utils.datasette.io/en/stable/installation.html - along with the pattern for adding that to `.zshrc` or whatever.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/552/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1718607907,I_kwDOCGYnMM5mb-Aj,551,Make as many examples in the CLI docs as possible copy-and-pastable,9599,simonw,closed,0,,,,,6,2023-05-21T19:04:10Z,2023-05-21T21:04:04Z,2023-05-21T20:57:24Z,OWNER,,"e.g. in this section: https://sqlite-utils.datasette.io/en/stable/cli.html#running-queries-directly-against-csv-or-json The little copy button will also copy the `$ ` which breaks the examples when copied.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/551/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1718517882,I_kwDOCGYnMM5mboB6,545,Try out Trogon for a tui interface,9599,simonw,closed,0,,,,,6,2023-05-21T14:08:25Z,2023-05-21T19:33:13Z,2023-05-21T18:41:58Z,OWNER,,https://github.com/Textualize/trogon,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/545/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1718595700,I_kwDOCGYnMM5mb7B0,550,AttributeError: 'EntryPoints' object has no attribute 'get' for flake8 on Python 3.7,9599,simonw,closed,0,,,,,3,2023-05-21T18:24:39Z,2023-05-21T18:42:25Z,2023-05-21T18:41:58Z,OWNER,,"https://github.com/simonw/sqlite-utils/actions/runs/5039064797/jobs/9036965488 ``` Traceback (most recent call last): File ""/opt/hostedtoolcache/Python/3.7.16/x64/bin/flake8"", line 8, in sys.exit(main()) File ""/opt/hostedtoolcache/Python/3.7.16/x64/lib/python3.7/site-packages/flake8/main/cli.py"", line 22, in main app.run(argv) File ""/opt/hostedtoolcache/Python/3.7.16/x64/lib/python3.7/site-packages/flake8/main/application.py"", line 363, in run self._run(argv) File ""/opt/hostedtoolcache/Python/3.7.16/x64/lib/python3.7/site-packages/flake8/main/application.py"", line 350, in _run self.initialize(argv) File ""/opt/hostedtoolcache/Python/3.7.16/x64/lib/python3.7/site-packages/flake8/main/application.py"", line 330, in initialize self.find_plugins(config_finder) File ""/opt/hostedtoolcache/Python/3.7.16/x64/lib/python3.7/site-packages/flake8/main/application.py"", line 153, in find_plugins self.check_plugins = plugin_manager.Checkers(local_plugins.extension) File ""/opt/hostedtoolcache/Python/3.7.16/x64/lib/python3.7/site-packages/flake8/plugins/manager.py"", line 357, in __init__ 
self.namespace, local_plugins=local_plugins File ""/opt/hostedtoolcache/Python/3.7.16/x64/lib/python3.7/site-packages/flake8/plugins/manager.py"", line 238, in __init__ self._load_entrypoint_plugins() File ""/opt/hostedtoolcache/Python/3.7.16/x64/lib/python3.7/site-packages/flake8/plugins/manager.py"", line 254, in _load_entrypoint_plugins eps = importlib_metadata.entry_points().get(self.namespace, ()) AttributeError: 'EntryPoints' object has no attribute 'get' Error: Process completed with exit code 1. ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/550/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1718576761,I_kwDOCGYnMM5mb2Z5,548,analyze-tables should validate provide --column names,9599,simonw,closed,0,,,,,1,2023-05-21T17:20:24Z,2023-05-21T17:35:52Z,2023-05-21T17:35:52Z,OWNER,,"Noticed this while testing: - #547 If you pass a non-existent column to `-c/--column` you don't get an error message.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/548/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1718572201,I_kwDOCGYnMM5mb1Sp,547,No need to show common values if everything is null,9599,simonw,closed,0,,,,,1,2023-05-21T17:05:07Z,2023-05-21T17:19:21Z,2023-05-21T17:19:21Z,OWNER,,"Noticed this: ``` % sqlite-utils analyze-tables content.db repos -c delete_branch_on_merge --common-limit 20 --no-least repos.delete_branch_on_merge: (1/1) Total rows: 158 Null rows: 158 Blank rows: 0 Distinct values: 0 Most common: 158: None ``` The `158: None` there is duplicate information considering we already know there are 158/158 null rows.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/547/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1718515590,I_kwDOCGYnMM5mbneG,544,New options for analyze-tables --common-limit --no-most and --no-least,9599,simonw,closed,0,,,,,2,2023-05-21T14:03:19Z,2023-05-21T17:03:06Z,2023-05-21T16:19:31Z,OWNER,,"The ""least common"" section is frequently uninteresting, especially for huge tables with a large number of repeated-once values. sqlite-utils analyze-tables content.db repos --common-limit 20 --no-least",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/544/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1124731464,I_kwDOCGYnMM5DCgpI,399,"Make it easier to insert geometries, with documentation and maybe code",9599,simonw,open,0,,,,,25,2022-02-05T00:11:26Z,2023-05-16T03:11:52Z,,OWNER,,"In playing with the new SpatiaLite helpers from #385 I noticed that actually populating geometry columns is still a little bit tricky. 
Here's what I ended up doing: ```python import httpx, sqlite_utils db = sqlite_utils.Database(""/tmp/spatial.db"") attractions = httpx.get(""https://latest.datasette.io/fixtures/roadside_attractions.json?_shape=array"").json() db[""attractions""].insert_all(attractions, pk=""pk"") # Schema of that table is now: # CREATE TABLE [attractions] ( # [pk] INTEGER PRIMARY KEY, # [name] TEXT, # [address] TEXT, # [latitude] FLOAT, # [longitude] FLOAT # ) db.init_spatialite() db[""attractions""].add_geometry_column(""point"", ""POINT"") db.execute("""""" update attractions set point = GeomFromText( 'POINT(' || longitude || ' ' || latitude || ')', 4326 ) """""") ``` That last line took some figuring out - especially the need for the SRID of `4326`, without which I got this error: > `IntegrityError: attractions.point violates Geometry constraint [geom-type or SRID not allowed]` It would be good to both document this in more detail, but ideally also to come up with a more obvious pattern for inserting common types of spatial data. Also related: - #398 - #79",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/399/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1708030220,I_kwDOBm6k_c5lznkM,2073,Faceting doesn't work against integer columns in views,9599,simonw,open,0,,,,,2,2023-05-12T18:20:10Z,2023-05-12T18:24:07Z,,OWNER,,"Spotted this issue here: https://til.simonwillison.net/datasette/baseline I had to do this workaround: ```sql create view baseline as select _key, spec, '' || json_extract(status, '$.is_baseline') as is_baseline, json_extract(status, '$.since') as baseline_since, json_extract(status, '$.support.chrome') as baseline_chrome, json_extract(status, '$.support.edge') as baseline_edge, json_extract(status, '$.support.firefox') as baseline_firefox, json_extract(status, '$.support.safari') as baseline_safari, compat_features, caniuse, usage_stats, status from [index] ``` I think the core issue here is that, against a table, `select * from x where integer_column = '1'` works correctly, due to some kind of column type conversion mechanism... but this mechanism doesn't work against views.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2073/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1578790070,I_kwDOCGYnMM5eGmy2,527,`Table.convert()` skips falsey values,167893,mcarpenter,closed,0,,,,,5,2023-02-10T00:00:52Z,2023-05-09T21:15:05Z,2023-05-08T21:03:24Z,CONTRIBUTOR,,"# Summary By design, `Table.convert()` does [not attempt](https://github.com/simonw/sqlite-utils/blob/fc221f9b62ed8624b1d2098e564f525c84497969/sqlite_utils/db.py#L2663) conversion of falsey values (`None`, `""""`, `0`, ...). This is surprising (directly contradicts the docstring) and `convert()` may quietly skip cells where the user assumed a conversion would take place. # Example Increment a column of integers by one ``` python from sqlite_utils import Database db = Database(memory=True) table = db['table'] col = 'x' table.insert_all([{col: 0}, {col:1}]) print(table.get(1)) # 0 print(table.get(2)) # 1 print() table.convert(col, lambda x: x+1) print(table.get(1)) # got 0, expected 1 ⚠⚠⚠ print(table.get(2)) # got 2, expected 2 ``` Another example might be, say, transforming cells containing empty string to `NULL`. 
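That second example shows the trap concretely; in this sketch (in-memory database, made-up `t` / `notes` names) the lambda is never called, because the empty string is falsey and `convert()` skips falsey values by default:

```python
from sqlite_utils import Database

db = Database(memory=True)
db["t"].insert({"notes": ""})
# Intention: replace empty strings with NULL...
db["t"].convert("notes", lambda value: None if value == "" else value)
# ...but "" is falsey, so the lambda never fires:
print(list(db["t"].rows))  # [{'notes': ''}] - unchanged
```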
# Discussion This was, I think, a pragmatic choice so that consumers can skip writing guard clauses for these falsey values (particularly from the CLI). But this surprising undocumented behavior can lead to incorrect data. I don't think this is a good trade-off between convenience and correctness. In the absence of this convenience users will either have to write guard clauses into their conversion expressions (or adapt the called function to do the same), so: ``` python fn(value) if value else value ``` instead of: ``` python fn(value) ``` This is more typing and sometimes I will forget, and there will be errors. (But they will be noisy errors, which is a good thing). Such a change will certainly inconvenience some existing consumers; there will be some breakage. But I think this is worth it to avoid quietly not converting some values by default, which can lead to quietly bad data. I have a PR that I will attach, please take a look and see what you think.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/527/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1702354223,I_kwDOBm6k_c5ld90v,2070,Mechanism for deploying a preview of a branch using Vercel,9599,simonw,closed,0,,,,,2,2023-05-09T16:21:45Z,2023-05-09T16:25:00Z,2023-05-09T16:24:31Z,OWNER,,"I prototyped that here: https://github.com/simonw/one-off-actions/blob/main/.github/workflows/deploy-datasette-branch-preview.yml It deployed the `json-extras-query` branch here: https://datasette-preview-json-extras-query.vercel.app/",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2070/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1701018909,I_kwDOCGYnMM5lY30d,543,Tests broken on Windows due to new convert() lambda names,9599,simonw,closed,0,,,,,0,2023-05-08T22:11:29Z,2023-05-08T22:19:04Z,2023-05-08T22:19:04Z,OWNER,,"https://github.com/simonw/sqlite-utils/actions/runs/4920084038/jobs/8788501314 ```python sql = 'update [example] set [dt] = lambda_-9223371942137158589([dt]);' ``` From: - #526",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/543/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1516644980,I_kwDOCGYnMM5aZip0,520,rows_from_file() raises confusing error if file-like object is not in binary mode,9599,simonw,closed,0,,,,,3,2023-01-02T19:00:14Z,2023-05-08T22:08:07Z,2023-05-08T22:08:07Z,OWNER,,"I got this error: ``` File ""/Users/simon/Dropbox/Development/openai-to-sqlite/openai_to_sqlite/cli.py"", line 27, in embeddings rows, _ = rows_from_file(input) ^^^^^^^^^^^^^^^^^^^^^ File ""/Users/simon/.local/share/virtualenvs/openai-to-sqlite-jt4obeb2/lib/python3.11/site-packages/sqlite_utils/utils.py"", line 305, in rows_from_file first_bytes = buffered.peek(2048).strip() ^^^^^^^^^^^^^^^^^^^ ``` From this code: ```python @cli.command() @click.argument( ""db_path"", type=click.Path(file_okay=True, dir_okay=False, allow_dash=False), ) @click.option( ""-i"", ""--input"", type=click.File(""r""), default=""-"", ) def embeddings(db_path, input): ""Store embeddings for one or more text documents"" click.echo(""Here is some output"") db = sqlite_utils.Database(db_path) rows, _ = 
rows_from_file(input) print(list(rows)) ``` The error went away when I changed it to `type=click.File(""rb"")`. This should either be called out in the documentation or `rows_from_file()` should be fixed to handle text-mode files in addition to binary files.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/520/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1279144769,I_kwDOCGYnMM5MPjNB,448,Reading rows from a file => AttributeError: '_io.StringIO' object has no attribute 'readinto',236907,mungewell,closed,0,,,,,5,2022-06-21T21:48:27Z,2023-05-08T22:01:00Z,2023-05-08T22:01:00Z,NONE,,"Attempting to run the example given here (without extra bracket ;-): https://sqlite-utils.datasette.io/en/stable/python-api.html#reading-rows-from-a-file ``` from sqlite_utils.utils import rows_from_file import io rows, format = rows_from_file(io.StringIO(""id,name\n1,Cleo"")) print(list(rows), format) # Outputs [{'id': '1', 'name': 'Cleo'}] Format.CSV ``` Gives error ``` >""c:\Program Files\Python37\python.exe"" test2.py Traceback (most recent call last): File ""test2.py"", line 4, in rows, format = rows_from_file(io.StringIO(""id,name\n1,Cleo"")) File ""C:\Users\swood\Downloads\sqlite-utils-main-20220621\sqlite-utils-main\sqlite_utils\utils.py"", line 300, in rows_from_file first_bytes = buffered.peek(2048).strip() AttributeError: '_io.StringIO' object has no attribute 'readinto' ``` I am running Python on Windows. ``` >""c:\Program Files\Python37\python.exe"" Python 3.7.4 (tags/v3.7.4:e09359112e, Jul 8 2019, 20:34:20) [MSC v.1916 64 bit (AMD64)] on win32 Type ""help"", ""copyright"", ""credits"" or ""license"" for more information. ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/448/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1575131737,I_kwDOCGYnMM5d4ppZ,525,Repeated calls to `Table.convert()` fail,167893,mcarpenter,closed,0,,,,,4,2023-02-07T22:40:47Z,2023-05-08T21:59:41Z,2023-05-08T21:54:02Z,CONTRIBUTOR,,"## Summary When using the API, repeated calls to `Table.convert()` do not work correctly since all conversions quietly use the callable (function, lambda) from the first call to `convert()` only. Subsequent invocations with different callables use the callable from the first invocation only. ## Example ```python from sqlite_utils import Database db = Database(memory=True) table = db['table'] col = 'x' table.insert_all([{col: 1}]) print(table.get(1)) table.convert(col, lambda x: x*2) print(table.get(1)) def zeroize(x): return 0 #zeroize = lambda x: 0 #zeroize.__name__ = 'zeroize' table.convert(col, zeroize) print(table.get(1)) ``` Output: ``` {'x': 1} {'x': 2} {'x': 4} ``` Expected: ``` {'x': 1} {'x': 2} {'x': 0} ``` ## Explanation This is some relevant [documentation](https://github.com/simonw/sqlite-utils/blob/1491b66dd7439dd87cd5cd4c4684f46eb3c5751b/docs/python-api.rst#registering-custom-sql-functions:~:text=By%20default%20registering%20a%20function%20with%20the%20same%20name%20and%20number%20of%20arguments%20will%20have%20no%20effect). 
* `Table.convert()` takes a `Callable` to perform data conversion on a column * The `Callable` is passed to `Database.register_function()` * `Database.register_function()` uses the callable's `__name__` attribute for registration * (Aside: all lambdas have a `__name__` of ``: I thought this was the problem, and it was close, but not quite) * However `convert()` first wraps the callable by local function [`convert_value()`](https://github.com/simonw/sqlite-utils/blob/fc221f9b62ed8624b1d2098e564f525c84497969/sqlite_utils/db.py#L2661) * Consequently `register_function()` sees name `convert_value` for all invocations from `convert()` * `register_function()` silently ignores registrations using the same name, retaining only the first such registration There's a mismatch between the comments and the code: https://github.com/simonw/sqlite-utils/blob/fc221f9b62ed8624b1d2098e564f525c84497969/sqlite_utils/db.py#L404 but actually the existing function is returned/used instead (as the ""registering custom sql functions"" doc I linked above says too). Seems like this can be rectified to match the comment? ## Suggested fix I think there are four things: 1. The call to `register_function()` from `convert()`should have an explicit `name=` parameter (to continue using `convert_value()` and the progress bar). 2. For functions, this name can be the real function name. (I understand the sqlite api needs a name, and it's nice if those are recognizable names where possible). For lambdas would `'lambda-{uuid}'` or similar be acceptable? 3. `register_function()` really should throw an error on repeated attempts to register a duplicate (function, arity)-pair. 4. A test? I haven't looked at the test framework here but seems this should be testable. ## See also - #458 ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/525/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1465194249,I_kwDOCGYnMM5XVRcJ,514,upsert of new row with check constraints fails,193185,cldellow,closed,0,,,,,5,2022-11-26T16:12:23Z,2023-05-08T21:50:52Z,2023-05-08T21:50:51Z,NONE,,"(I originally opened this in https://github.com/simonw/datasette-insert/issues/20, but I see that that library depends on sqlite-utils) In the case of a new row, upsert first adds the row, specifying only its pkeys: https://github.com/simonw/sqlite-utils/blob/965ca0d5f5bffe06cc02cd7741344d1ddddf9d56/sqlite_utils/db.py#L2783-L2787 This means that a table with NON NULL (or other constraint) columns that aren't part of the pkey can't have new rows upserted.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/514/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1044267332,I_kwDOCGYnMM4-PkFE,336,"sqlite-util tranform --column-order mangles columns of type ""timestamp""",536941,fgregg,closed,0,,,,,1,2021-11-04T01:15:38Z,2023-05-08T21:13:38Z,2023-05-08T21:13:38Z,CONTRIBUTOR,,"Reproducible code below: ```bash > echo 'create table bar (baz text, created_at timestamp default CURRENT_TIMESTAMP)' | sqlite3 foo.db > sqlite3 foo.db SQLite version 3.36.0 2021-06-18 18:36:39 Enter "".help"" for usage hints. 
sqlite> .schema bar CREATE TABLE bar (baz text, created_at timestamp default CURRENT_TIMESTAMP); sqlite> .exit > sqlite-utils transform foo.db bar --column-order baz sqlite3 foo.db SQLite version 3.36.0 2021-06-18 18:36:39 Enter "".help"" for usage hints. sqlite> .schema bar CREATE TABLE IF NOT EXISTS ""bar"" ( [baz] TEXT, [created_at] FLOAT DEFAULT 'CURRENT_TIMESTAMP' ); sqlite> .exit > sqlite-utils transform foo.db bar --column-order baz > sqlite3 foo.db SQLite version 3.36.0 2021-06-18 18:36:39 Enter "".help"" for usage hints. sqlite> .schema bar CREATE TABLE IF NOT EXISTS ""bar"" ( [baz] TEXT, [created_at] FLOAT DEFAULT '''CURRENT_TIMESTAMP''' ); ``` ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/336/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1432377191,I_kwDOCGYnMM5VYFdn,509,`sqlite-utils transform` breaks DEFAULT string values and STRFTIME(),2199875,kennysong,closed,0,,,,,0,2022-11-02T02:32:23Z,2023-05-08T21:13:38Z,2023-05-08T21:13:38Z,NONE,,"Very nice library! Our team found sqlite-utils through @simonw's [comment on the ""Simple declarative schema migration for SQLite"" article](https://news.ycombinator.com/item?id=31249823), and we were excited to use it, but unfortunately `sqlite-utils transform` seems to break our DB. Running `sqlite-utils transform` to modify a column mangles their DEFAULT values: - Default string values are wrapped in extra single quotes - Function expressions such as [`STRFTIME()`](https://www.sqlite.org/lang_datefunc.html) are turned into strings! ------ Here are steps to reproduce: **Original database** ``` $ sqlite3 test.db << EOF CREATE TABLE mytable ( col1 TEXT DEFAULT 'foo', col2 TEXT DEFAULT (STRFTIME('%Y-%m-%d %H:%M:%f', 'NOW')) ) EOF $ sqlite3 test.db ""SELECT sql FROM sqlite_master WHERE name = 'mytable';"" CREATE TABLE mytable ( col1 TEXT DEFAULT 'foo', col2 TEXT DEFAULT (STRFTIME('%Y-%m-%d %H:%M:%f', 'NOW')) ) ``` **Modified database after sqlite-utils** ``` $ sqlite3 test.db ""INSERT INTO mytable DEFAULT VALUES; SELECT * FROM mytable;"" foo|2022-11-02 02:26:58.038 $ sqlite-utils transform test.db mytable --rename col1 renamedcol1 $ sqlite3 test.db ""SELECT sql FROM sqlite_master WHERE name = 'mytable';"" CREATE TABLE ""mytable"" ( [renamedcol1] TEXT DEFAULT '''foo''', [col2] TEXT DEFAULT 'STRFTIME(''%Y-%m-%d %H:%M:%f'', ''NOW'')' ) $ sqlite3 test.db ""INSERT INTO mytable DEFAULT VALUES; SELECT * FROM mytable;"" foo|2022-11-02 02:26:58.038 'foo'|STRFTIME('%Y-%m-%d %H:%M:%f', 'NOW') ``` (Related: #336)",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/509/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1700936245,I_kwDOCGYnMM5lYjo1,542,Remove `skip_false=True` and `--no-skip-false` in `sqlite-utils` 4.0,9599,simonw,open,0,,,9374594,4.0 backwards incomatible changes,1,2023-05-08T21:04:28Z,2023-05-08T21:07:41Z,,OWNER,,"Following: - #527 The only reason I didn't remove fix this mis-feature entirely is that it represents a backwards incompatible change. 
I'll make that change in 4.0.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/542/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1595340692,I_kwDOCGYnMM5fFveU,530,"add ability to configure ""on delete"" and ""on update"" attributes of foreign keys:",536941,fgregg,open,0,,,,,2,2023-02-22T15:44:14Z,2023-05-08T20:39:01Z,,CONTRIBUTOR,,"sqlite supports these, and it would be quite nice to be able to add them with sqlite-utils. https://www.sqlite.org/foreignkeys.html#fk_actions",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/530/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1620254998,I_kwDOCGYnMM5gkyEW,532,Show more information when JSON can't be imported with sqlite-utils insert,83080728,voltagex,closed,0,,,,,2,2023-03-12T06:41:44Z,2023-05-08T20:32:16Z,2023-05-08T20:32:02Z,NONE,,"I am currently trying to import the [JSON export of my data from Discord](https://support.discord.com/hc/en-us/articles/360004027692-Requesting-a-Copy-of-your-Data), specifically `activity/reporting/events-*.json` ``` sqlite-utils.exe insert test.db reporting events-2023-00000-of-00001.json [###################################-] 99% 00:00:00 Error: Invalid JSON - use --csv for CSV or --tsv for TSV files ``` Please show more information as to *why* this is invalid, if possible. I am using version 3.30 with Python 3.10 on Windows 11.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/532/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1695428235,I_kwDOCGYnMM5lDi6L,538,`table.upsert_all` fails to write rows when `not_null` is present,1231935,xavdid,closed,0,,,,,9,2023-05-04T07:30:38Z,2023-05-08T20:06:35Z,2023-05-08T19:27:02Z,NONE,,"I found an odd bug today, where calls to `table.upsert_all` don't write rows if you include the `not_null` kwarg. ## Repro Example ```py from sqlite_utils import Database db = Database(""upsert-test.db"") db[""comments""].upsert_all( [{""id"": 1, ""name"": ""david""}], pk=""id"", not_null=[""name""], ) assert list(db[""comments""].rows) # err! ``` The schema is correctly created: ```sql CREATE TABLE [comments] ( [id] INTEGER PRIMARY KEY, [name] TEXT NOT NULL ) ``` But no rows are created. Removing either the `not_null` kwargs works as expected, as does an `insert_all` call. 
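A possible workaround while this is open, as a sketch reusing the repro's names: declare the NOT NULL constraint when creating the table, then upsert without passing `not_null`:

```python
from sqlite_utils import Database

db = Database("upsert-test.db")
# declare the constraint at creation time instead of in upsert_all()
db["comments"].create({"id": int, "name": str}, pk="id", not_null={"name"})
db["comments"].upsert_all([{"id": 1, "name": "david"}], pk="id")
assert list(db["comments"].rows)  # the row is written this time
```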
## Version Info

- Python: `3.11.0`
- sqlite-utils: `3.30`
- sqlite: `3.39.5 2022-10-14`",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/538/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1700840265,I_kwDOCGYnMM5lYMNJ,541,Get tests to pass with `pytest -Werror`,9599,simonw,open,0,,,,,1,2023-05-08T19:57:23Z,2023-05-08T19:59:35Z,,OWNER,,"Inspired by:

- #534",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/541/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1622640374,I_kwDOCGYnMM5gt4b2,534, ResourceWarning: unclosed file,1244826,djhenderson,closed,0,,,,,1,2023-03-14T03:02:18Z,2023-05-08T19:56:29Z,2023-05-08T19:56:29Z,NONE,,"Issuing either

```
py -Wdefault -m sqlite_utils insert dogs.db dogs dogs0.csv --csv
[#############-----------------------] 36%
[####################################] 100%C:\Users\Doug\AppData\Local\Programs\Python\Python311\Lib\site-packages\sqlite_utils\cli.py:1187: ResourceWarning: unclosed file <_io.TextIOWrapper name='dogs0.csv' encoding='utf-8-sig'>
  insert_upsert_implementation(
ResourceWarning: Enable tracemalloc to get the object allocation traceback
```

or

```
set pythonwarnings=default
sqlite-utils insert dogs.db dogs dogs0.csv --csv
[#############-----------------------] 36%
[####################################] 100%C:\Users\Doug\AppData\Local\Programs\Python\Python311\Lib\site-packages\sqlite_utils\cli.py:1187: ResourceWarning: unclosed file <_io.TextIOWrapper name='dogs0.csv' encoding='utf-8-sig'>
  insert_upsert_implementation(
ResourceWarning: Enable tracemalloc to get the object allocation traceback
```

exhibits a ResourceWarning indicating that the CSV file being loaded is not closed.
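For reference, the standard fix for this class of warning is to close the file deterministically rather than relying on garbage collection; a minimal sketch of the pattern (not the actual `cli.py` code) looks like this:

```py
import csv

# A with-block guarantees the CSV file is closed on exit, which is the
# usual way to silence this ResourceWarning
with open('dogs0.csv', encoding='utf-8-sig') as fp:
    for row in csv.DictReader(fp):
        pass  # insert each row here
```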
sqlite-utils --version
sqlite-utils, version 3.30
py --version
Python 3.11.2
Windows Version 10.0.19045 Build 19045
SQLite version 3.41.0 2023-02-21 18:09:37",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/534/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1699184583,I_kwDOCGYnMM5lR3_H,540,sphinx.builders.linkcheck build error,9599,simonw,closed,0,,,,,4,2023-05-07T18:37:09Z,2023-05-08T04:56:13Z,2023-05-07T18:42:36Z,OWNER,,"https://readthedocs.org/projects/sqlite-utils/builds/20512693/

```
Running Sphinx v6.2.1

Traceback (most recent call last):
  File ""/home/docs/checkouts/readthedocs.org/user_builds/sqlite-utils/envs/latest/lib/python3.8/site-packages/sphinx/registry.py"", line 442, in load_extension
    mod = import_module(extname)
  File ""/home/docs/checkouts/readthedocs.org/user_builds/sqlite-utils/envs/latest/lib/python3.8/importlib/__init__.py"", line 127, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File """", line 1014, in _gcd_import
  File """", line 991, in _find_and_load
  File """", line 975, in _find_and_load_unlocked
  File """", line 671, in _load_unlocked
  File """", line 783, in exec_module
  File """", line 219, in _call_with_frames_removed
  File ""/home/docs/checkouts/readthedocs.org/user_builds/sqlite-utils/envs/latest/lib/python3.8/site-packages/sphinx/builders/linkcheck.py"", line 20, in
    from requests import Response
  File ""/home/docs/checkouts/readthedocs.org/user_builds/sqlite-utils/envs/latest/lib/python3.8/site-packages/requests/__init__.py"", line 43, in
    import urllib3
  File ""/home/docs/checkouts/readthedocs.org/user_builds/sqlite-utils/envs/latest/lib/python3.8/site-packages/urllib3/__init__.py"", line 38, in
    raise ImportError(
ImportError: urllib3 v2.0 only supports OpenSSL 1.1.1+, currently the 'ssl' module is compiled with OpenSSL 1.0.2n 7 Dec 2017. See: https://github.com/urllib3/urllib3/issues/2168

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File ""/home/docs/checkouts/readthedocs.org/user_builds/sqlite-utils/envs/latest/lib/python3.8/site-packages/sphinx/cmd/build.py"", line 280, in build_main
    app = Sphinx(args.sourcedir, args.confdir, args.outputdir,
  File ""/home/docs/checkouts/readthedocs.org/user_builds/sqlite-utils/envs/latest/lib/python3.8/site-packages/sphinx/application.py"", line 225, in __init__
    self.setup_extension(extension)
  File ""/home/docs/checkouts/readthedocs.org/user_builds/sqlite-utils/envs/latest/lib/python3.8/site-packages/sphinx/application.py"", line 404, in setup_extension
    self.registry.load_extension(self, extname)
  File ""/home/docs/checkouts/readthedocs.org/user_builds/sqlite-utils/envs/latest/lib/python3.8/site-packages/sphinx/registry.py"", line 445, in load_extension
    raise ExtensionError(__('Could not import extension %s') % extname,
sphinx.errors.ExtensionError: Could not import extension sphinx.builders.linkcheck (exception: urllib3 v2.0 only supports OpenSSL 1.1.1+, currently the 'ssl' module is compiled with OpenSSL 1.0.2n 7 Dec 2017. See: https://github.com/urllib3/urllib3/issues/2168)

Extension error:
Could not import extension sphinx.builders.linkcheck (exception: urllib3 v2.0 only supports OpenSSL 1.1.1+, currently the 'ssl' module is compiled with OpenSSL 1.0.2n 7 Dec 2017.
See: https://github.com/urllib3/urllib3/issues/2168)
```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/540/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1699174055,I_kwDOCGYnMM5lR1an,539,"`--raw-lines` option, like `--raw` for multiple lines",9599,simonw,closed,0,,,,,4,2023-05-07T18:07:46Z,2023-05-07T18:43:24Z,2023-05-07T18:26:18Z,OWNER,,I wanted newline-separated output of the first column of every row in the results - like `--raw` but for more than one line.,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/539/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1698865182,I_kwDOBm6k_c5lQqAe,2069,[BUG] Cannot insert new data to deployed instance,31861128,yqlbu,open,0,,,,,1,2023-05-07T02:59:42Z,2023-05-07T03:17:35Z,,NONE,,"## Summary

Recently, I deployed an instance of datasette to Vercel with the following plugins:

- datasette-auth-tokens
- datasette-insert

With the above plugins, I was able to insert new data into the local sqlite db. However, when it comes to the deployment on Vercel, things behave differently. I observed some errors in the logs console on Vercel:

```console
File ""/var/task/datasette/database.py"", line 179, in _execute_writes
    conn = self.connect(write=True)
File ""/var/task/datasette/database.py"", line 93, in connect
    assert not (write and not self.is_mutable)
AssertionError
```

I think it is a potential bug.

## Reproduce
metadata.json
```json
{
    ""plugins"": {
        ""datasette-insert"": {
            ""allow"": {
                ""id"": ""*""
            }
        },
        ""datasette-auth-tokens"": {
            ""tokens"": [
                {
                    ""token"": {
                        ""$env"": ""INSERT_TOKEN""
                    },
                    ""actor"": {
                        ""id"": ""repeater""
                    }
                }
            ],
            ""param"": ""_auth_token""
        }
    }
}
```
commands
```bash
# deploy
datasette publish vercel remote.db \
    --project=repeater-bot-sqlite \
    --metadata metadata.json \
    --install datasette-auth-tokens \
    --install datasette-insert \
    --vercel-json=vercel.json

# test insert
cat fixtures/dogs.json | curl --request POST -d @- -H ""Authorization: Bearer "" \
    'https://repeater-bot-sqlite.vercel.app/-/insert/remote/dogs?pk=id'
```
logs
```console
Traceback (most recent call last):
  File ""/var/task/datasette/app.py"", line 1354, in route_path
    response = await view(request, send)
  File ""/var/task/datasette/app.py"", line 1500, in async_view_fn
    response = await async_call_with_supported_arguments(
  File ""/var/task/datasette/utils/__init__.py"", line 1005, in async_call_with_supported_arguments
    return await fn(*call_with)
  File ""/var/task/datasette_insert/__init__.py"", line 14, in insert_or_upsert
    response = await insert_or_upsert_implementation(request, datasette)
  File ""/var/task/datasette_insert/__init__.py"", line 91, in insert_or_upsert_implementation
    table_count = await db.execute_write_fn(write_in_thread, block=True)
  File ""/var/task/datasette/database.py"", line 167, in execute_write_fn
    raise result
  File ""/var/task/datasette/database.py"", line 179, in _execute_writes
    conn = self.connect(write=True)
  File ""/var/task/datasette/database.py"", line 93, in connect
    assert not (write and not self.is_mutable)
AssertionError
```
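For context, a rough sketch of the failing logic, reconstructed only from the traceback above (not copied from the Datasette source): a database opened as immutable refuses write connections, so any insert against a read-only deployment such as Vercel's filesystem trips this assertion.

```py
# Reconstructed from the traceback above, not the actual Datasette class
class Database:
    def __init__(self, is_mutable):
        self.is_mutable = is_mutable

    def connect(self, write=False):
        # A write connection is refused whenever the db was opened as immutable
        assert not (write and not self.is_mutable)

Database(is_mutable=False).connect(write=True)  # AssertionError, as in the logs
```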
",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2069/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1393202060,I_kwDOCGYnMM5TCpOM,496,devrel/python api: Pylance type hinting,7908073,chapmanjacobd,open,0,,,,,4,2022-10-01T03:03:34Z,2023-05-03T05:53:27Z,,CONTRIBUTOR,,"Pylance is generally pretty good at figuring out stuff but `sqlite-utils` has some quirks which make type hinting kinda useless. Maybe you don't care but I thought I would bring it to your attention. For example: ``` db[""subs""].insert_all(subs, pk=""index"") ``` ``` Cannot access member ""insert_all"" for type ""View"" Member ""insert_all"" is unknown ``` `insert_all` and all the other methods show up as a type issues because the program can't know whether something is a View or a Table. Fair enough. But that basically throws all type checking out the window. `pk=""index""` also shows up as a type issue: ``` Argument of type ""Literal['index']"" cannot be assigned to parameter ""pk"" of type ""Default"" in function ""insert_all"" ""Literal['index']"" is incompatible with ""Default"" ``` I think this is because DEFAULT is an empty class? maybe a few small changes could be made to make the library more type-friendly The interim solution is of course to turn off type hints completely for the line ``` db[""subs""].insert_all(subs, pk=""index"") # type: ignore ``` ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/496/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1690765434,I_kwDOBm6k_c5kxwh6,2067,Litestream-restored db: errors on 3.11 and 3.10.8; but works on py3.10.7 and 3.10.6,39538958,justmars,open,0,,,,,1,2023-05-01T12:42:28Z,2023-05-03T00:16:03Z,,NONE,,"Hi! Wondering if this issue is limited to my local system or if it affects others as well. It seems like 3.11 errors out on a ""litestream-restored"" database. On further investigation, it also appears to conk out on 3.10.8 but works on 3.10.7 and 3.10.6. To demo issue I created a test database, replicated it to an aws s3 bucket, then restored the same under various .pyenv-versioned shells where I test whether I can read the database via the sqlite3 cli. ```sh # create new shell with 3.11.3 litestream restore -o data/db.sqlite s3://mytestbucketxx/db sqlite3 data/db.sqlite # SQLite version 3.41.2 2023-03-22 11:56:21 # Enter "".help"" for usage hints. # sqlite> .tables # _litestream_lock _litestream_seq movie # sqlite> ``` However this get me an `OperationalError` when reading via datasette:
Error on 3.11.3 and 3.10.8 ```sh datasette data/db.sqlite ``` ```console /tester/.venv/lib/python3.11/site-packages/pkg_resources/__init__.py:121: DeprecationWarning: pkg_resources is deprecated as an API warnings.warn(""pkg_resources is deprecated as an API"", DeprecationWarning) Traceback (most recent call last): File ""/tester/.venv/bin/datasette"", line 8, in sys.exit(cli()) ^^^^^ File ""/tester/.venv/lib/python3.11/site-packages/click/core.py"", line 1130, in __call__ return self.main(*args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^ File ""/tester/.venv/lib/python3.11/site-packages/click/core.py"", line 1055, in main rv = self.invoke(ctx) ^^^^^^^^^^^^^^^^ File ""/tester/.venv/lib/python3.11/site-packages/click/core.py"", line 1657, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File ""/tester/.venv/lib/python3.11/site-packages/click/core.py"", line 1404, in invoke return ctx.invoke(self.callback, **ctx.params) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File ""/tester/.venv/lib/python3.11/site-packages/click/core.py"", line 760, in invoke return __callback(*args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^ File ""/tester/.venv/lib/python3.11/site-packages/datasette/cli.py"", line 143, in wrapped return fn(*args, **kwargs) ^^^^^^^^^^^^^^^^^^^ File ""/tester/.venv/lib/python3.11/site-packages/datasette/cli.py"", line 615, in serve asyncio.get_event_loop().run_until_complete(check_databases(ds)) File ""/Users/mv/.pyenv/versions/3.11.3/lib/python3.11/asyncio/base_events.py"", line 653, in run_until_complete return future.result() ^^^^^^^^^^^^^^^ File ""/tester/.venv/lib/python3.11/site-packages/datasette/cli.py"", line 660, in check_databases await database.execute_fn(check_connection) File ""/tester/.venv/lib/python3.11/site-packages/datasette/database.py"", line 213, in execute_fn return await asyncio.get_event_loop().run_in_executor( ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File ""/Users/mv/.pyenv/versions/3.11.3/lib/python3.11/concurrent/futures/thread.py"", line 58, in run result = self.fn(*self.args, **self.kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File ""/tester/.venv/lib/python3.11/site-packages/datasette/database.py"", line 211, in in_thread return fn(conn) ^^^^^^^^ File ""/tester/.venv/lib/python3.11/site-packages/datasette/utils/__init__.py"", line 951, in check_connection for r in conn.execute( ^^^^^^^^^^^^^ sqlite3.OperationalError: unable to open database file ```
Works on 3.10.7, 3.10.6

```sh
# create new shell with 3.10.7 / 3.10.6
litestream restore -o data/db.sqlite s3://mytestbucketxx/db
datasette data/db.sqlite
# ...
# INFO: Uvicorn running on http://127.0.0.1:8001 (Press CTRL+C to quit)
```
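One way to narrow this down further would be to open the restored file from Python directly on each interpreter, bypassing Datasette entirely; a minimal diagnostic sketch (the `movie` table name is taken from the `.tables` output above):

```py
import sqlite3
import sys

# Open the litestream-restored file read-only, the same way on every
# Python version under test, to separate stdlib behaviour from Datasette's
conn = sqlite3.connect('file:data/db.sqlite?mode=ro', uri=True)
print(sys.version, conn.execute('select count(*) from movie').fetchone())
```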
In both scenarios, the only dependencies were the pinned python version and the latest Datasette version 0.64.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2067/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1686033652,I_kwDOBm6k_c5kftT0,2065,Datasette cannot be installed with Rye,9599,simonw,closed,0,,,,,4,2023-04-27T03:35:42Z,2023-04-27T05:09:36Z,2023-04-27T05:09:36Z,OWNER,,"https://github.com/mitsuhiko/rye

I tried this:

    rye install datasette

But now:

```
% ~/.rye/shims/datasette
Traceback (most recent call last):
  File ""/Users/simon/.rye/shims/datasette"", line 5, in
    from datasette.cli import cli
  File ""/Users/simon/.rye/tools/datasette/lib/python3.11/site-packages/datasette/cli.py"", line 17, in
    from .app import (
  File ""/Users/simon/.rye/tools/datasette/lib/python3.11/site-packages/datasette/app.py"", line 14, in
    import pkg_resources
ModuleNotFoundError: No module named 'pkg_resources'
```

I think that's because `setuptools` is not included in Rye.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2065/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1686042269,I_kwDOBm6k_c5kfvad,2066,Failing test: httpx.InvalidURL: URL too long,9599,simonw,closed,0,,,,,10,2023-04-27T03:48:47Z,2023-04-27T04:27:50Z,2023-04-27T04:27:50Z,OWNER,,"https://github.com/simonw/datasette/actions/runs/4815723640/jobs/8574667731

```
    def urlparse(url: str = """", **kwargs: typing.Optional[str]) -> ParseResult:
        # Initial basic checks on allowable URLs.
        # ---------------------------------------
        # Hard limit the maximum allowable URL length.
        if len(url) > MAX_URL_LENGTH:
>           raise InvalidURL(""URL too long"")
E           httpx.InvalidURL: URL too long

/opt/hostedtoolcache/Python/3.7.16/x64/lib/python3.7/site-packages/httpx/_urlparse.py:155: InvalidURL
=========================== short test summary info ============================
FAILED tests/test_csv.py::test_max_csv_mb - httpx.InvalidURL: URL too long
```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2066/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1203842656,I_kwDOCGYnMM5HwS5g,425,`sqlite3.NotSupportedError`: deterministic=True requires SQLite 3.8.3 or higher,9599,simonw,closed,0,,,,,5,2022-04-13T22:16:53Z,2023-04-15T20:14:58Z,2022-04-13T22:48:57Z,OWNER,,"Got this error while investigating:

- #421

Even though I was using the `LD_PRELOAD` trick from https://til.simonwillison.net/sqlite/ld-preload to use a newer version of SQLite.
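The usual pattern for coping with this (a sketch of the general technique, not necessarily what sqlite-utils does internally) is to try `deterministic=True` and fall back when the underlying SQLite is too old:

```py
import sqlite3

conn = sqlite3.connect(':memory:')
try:
    # deterministic=True needs SQLite 3.8.3+ underneath (and Python 3.8+
    # to accept the keyword at all)
    conn.create_function('myfunc', 1, lambda x: x, deterministic=True)
except sqlite3.NotSupportedError:
    # Older SQLite: register the function without the flag
    conn.create_function('myfunc', 1, lambda x: x)
```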
_Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/421#issuecomment-1098531354_",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/425/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1617602868,I_kwDOJHON9s5gaqk0,6,Character encoding problem,9599,simonw,open,0,,,,,2,2023-03-09T16:44:34Z,2023-04-14T15:22:09Z,,MEMBER,,"I ran against a recent note with this in it: > Or just ""Actions ⚙️ "" And got back: > `Actions ‚öôÔ∏è` Pasting that into https://ftfy.vercel.app/?s=Actions+%E2%80%9A%C3%B6%C3%B4%C3%94%E2%88%8F%C3%A8+ gives this: ```python s = 'Actions â\x80\x9aöôÃ\x94â\x88\x8fè' s = s.encode('latin-1') s = s.decode('utf-8') s = s.encode('macroman') s = s.decode('utf-8') print(s) ``` ",611552758,apple-notes-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/6/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1663399821,I_kwDOBm6k_c5jJXeN,2058,"500 ""attempt to write a readonly database"" error caused by ""PRAGMA schema_version""",9599,simonw,open,0,,,,,9,2023-04-11T23:57:50Z,2023-04-13T16:35:21Z,,OWNER,,"I've not been able to replicate this myself yet, but I've seen log files from a user affected by it. ``` File ""/usr/local/lib/python3.11/site-packages/datasette/views/base.py"", line 89, in dispatch_request await self.ds.refresh_schemas() File ""/usr/local/lib/python3.11/site-packages/datasette/app.py"", line 371, in refresh_schemas await self._refresh_schemas() File ""/usr/local/lib/python3.11/site-packages/datasette/app.py"", line 386, in _refresh_schemas schema_version = (await db.execute(""PRAGMA schema_version"")).first()[0] ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File ""/usr/local/lib/python3.11/site-packages/datasette/database.py"", line 267, in execute results = await self.execute_fn(sql_operation_in_thread) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File ""/usr/local/lib/python3.11/site-packages/datasette/database.py"", line 213, in execute_fn return await asyncio.get_event_loop().run_in_executor( ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File ""/usr/local/lib/python3.11/concurrent/futures/thread.py"", line 58, in run result = self.fn(*self.args, **self.kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File ""/usr/local/lib/python3.11/site-packages/datasette/database.py"", line 211, in in_thread return fn(conn) ^^^^^^^^ File ""/usr/local/lib/python3.11/site-packages/datasette/database.py"", line 237, in sql_operation_in_thread cursor.execute(sql, params if params is not None else {}) sqlite3.OperationalError: attempt to write a readonly database ``` That's running the official Datasette Docker image on https://fly.io/ - it's causing 500 errors on every page of their site.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2058/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1665510265,I_kwDOBm6k_c5jRat5,2060,Clean up a bunch of warnings from ruff,9599,simonw,open,0,,,,,0,2023-04-13T01:23:02Z,2023-04-13T01:23:02Z,,OWNER,,"See: - #2056 `ruff` spots a bunch of warnings about things like unused variables - would be good to clean up as many of these as possible.",107914493,datasette,issue,,,"{""url"": 
""https://api.github.com/repos/simonw/datasette/issues/2060/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1665053646,I_kwDOBm6k_c5jPrPO,2059,"""Deceptive site ahead"" alert on Heroku deployment",1186275,mtdukes,open,0,,,,,1,2023-04-12T18:34:51Z,2023-04-13T01:13:01Z,,NONE,,"I deployed a fairly basic instance of Datasette (`datasette-auth-passwords` is the only plugin) using Heroku. The deployed URL now gives a ""Deceptive site ahead"" warning to users. Is there way around this? Maybe a way to add ownership verification [through Google's search console](https://search.google.com/search-console/welcome)? ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2059/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1373210675,I_kwDODD6af85R2Ygz,13,fails before generating views. ERR: table sqlite_master may not be modified,116795,pax,open,0,,,,,4,2022-09-14T15:41:50Z,2023-04-11T03:46:17Z,,NONE,,"generates checkins.db but seems to fail before generating views note: it worked on an Ubuntu WSL but fails on macOS 12.5.1 later edit: I suspect this is a problem with my local set-up, `dogsheep-beta index` also throws the same error full error: Importing 2591 checkins [###################################-] 98% 00:00:00 Traceback (most recent call last): File ""/Users/pax/devbox/envAll/bin/swarm-to-sqlite"", line 8, in sys.exit(cli()) File ""/Users/pax/devbox/envAll/lib/python3.8/site-packages/click/core.py"", line 829, in __call__ return self.main(*args, **kwargs) File ""/Users/pax/devbox/envAll/lib/python3.8/site-packages/click/core.py"", line 782, in main rv = self.invoke(ctx) File ""/Users/pax/devbox/envAll/lib/python3.8/site-packages/click/core.py"", line 1066, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/Users/pax/devbox/envAll/lib/python3.8/site-packages/click/core.py"", line 610, in invoke return callback(*args, **kwargs) File ""/Users/pax/devbox/envAll/lib/python3.8/site-packages/swarm_to_sqlite/cli.py"", line 77, in cli ensure_foreign_keys(db) File ""/Users/pax/devbox/envAll/lib/python3.8/site-packages/swarm_to_sqlite/utils.py"", line 145, in ensure_foreign_keys db[fk.table].add_foreign_key(fk.column, fk.other_table, fk.other_column) File ""/Users/pax/devbox/envAll/lib/python3.8/site-packages/sqlite_utils/db.py"", line 2123, in add_foreign_key self.db.add_foreign_keys([(self.name, column, other_table, other_column)]) File ""/Users/pax/devbox/envAll/lib/python3.8/site-packages/sqlite_utils/db.py"", line 1086, in add_foreign_keys cursor.execute( sqlite3.OperationalError: table sqlite_master may not be modified",205429375,swarm-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/swarm-to-sqlite/issues/13/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1661617056,I_kwDODD6af85jCkOg,15,ambiguous column name: createdAt - on checkin_details view,9599,simonw,closed,0,,,,,0,2023-04-11T01:07:47Z,2023-04-11T03:16:37Z,2023-04-11T03:16:37Z,MEMBER,,"It looks like Swarm changed their schema and now both `venues` and `checkins` have `createdAt` fields. 
Which breaks this view: https://github.com/dogsheep/swarm-to-sqlite/blob/719b6e96a016d0ca8b316d3bed9c2a7a0cb499ee/swarm_to_sqlite/utils.py#L171-L188",205429375,swarm-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/swarm-to-sqlite/issues/15/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1657861026,I_kwDOBm6k_c5i0POi,2054,"Make detailed notes on how table, query and row views work right now",9599,simonw,open,0,,,,,13,2023-04-06T18:21:09Z,2023-04-07T20:14:38Z,,OWNER,,"Research to help influence the following: - #2049 - #2053 - #2050 - #262 ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2054/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1646734246,I_kwDOBm6k_c5iJyum,2049,Custom SQL queries should use new JSON ?_extra= format,9599,simonw,open,0,,,8755003,Datasette 1.0a-next,4,2023-03-30T00:42:53Z,2023-04-05T23:29:27Z,,OWNER,,"Related: - #262 I've made the change to the table view, now I need the new format to work for arbitrary SQL queries too. Note that this incorporates both arbitrary SQL queries and canned queries.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2049/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1650981564,I_kwDOJHON9s5iZ_q8,12,Error running pytest,14314871,amlestin,open,0,,,,,0,2023-04-02T15:02:36Z,2023-04-02T15:07:10Z,,NONE,,"`______________________________________________________ ERROR collecting tests/test_apple_notes_to_sqlite.py _______________________________________________________ ImportError while importing test module '/Users/lol/development/apple-notes-to-sqlite/tests/test_apple_notes_to_sqlite.py'. Hint: make sure your test modules/packages have valid Python names. Traceback: /opt/homebrew/Cellar/python@3.9/3.9.16/Frameworks/Python.framework/Versions/3.9/lib/python3.9/importlib/__init__.py:127: in import_module return _bootstrap._gcd_import(name[level:], package, level) tests/test_apple_notes_to_sqlite.py:2: in from apple_notes_to_sqlite.cli import cli, COUNT_SCRIPT, FOLDERS_SCRIPT E ModuleNotFoundError: No module named 'apple_notes_to_sqlite'` Solution: This is likely a PYTHONPATH issue due to having pytest installed both globally and in the venv. We can guarantee the tests run by adding the current directory to sys.path automatically using `python -m pytest` The alternative is to activate the venv, install pytest, deactivate, then activate the venv again (https://stackoverflow.com/questions/35045038/how-do-i-use-pytest-with-virtualenv)",611552758,apple-notes-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/12/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 907795562,MDU6SXNzdWU5MDc3OTU1NjI=,265,Using enable_fts before search term,36287,prabhur,open,0,,,,,1,2021-06-01T01:43:34Z,2023-04-01T17:27:18Z,,NONE,,"Many thanks for the sqlite-utils suite of utilities. Has made my life much much easier. I used this to create a table and enable FTS. All works fine. The datasette utility detects FTS and shows a text box. Searching for a term using that interface works well. 
However, when I start to use features by following https://www.sqlite.org/fts5.html section **""3. Full-text Query Syntax""**, I seem to run into issues that I suspect are due to the `escape_fts` wrapper function.

As an example, if I search for the term `""^குகை""` in the text box in datasette it produces 140 results. However, when I tweak the query produced by datasette to not use ""escape_fts"" it produces 5 results. Similarly, when I try to restrict the search to a single column in FTS using a spec like `{title : ^குகை}` it returns no rows. The same thing pulls results when used without `escape_fts`. The text in the table is in the Tamil language and the search term is a Tamil word.

```
... where posts_fts match escape_fts(:search)
```

vs

```
... where posts_fts match (:search)
```

Any ideas why? How can I get the benefits of both escaping as well as utilizing the different facets of providing / controlling search terms? Thanks.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/265/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
702386948,MDU6SXNzdWU3MDIzODY5NDg=,159,.delete_where() does not auto-commit (unlike .insert() or .upsert()),11712349,spdkils,open,0,,,,,9,2020-09-16T01:55:52Z,2023-04-01T17:21:05Z,,NONE,,"When you use the delete_where() function on a table, it never commits... Is that intentional?",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/159/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1649791661,I_kwDOBm6k_c5iVdKt,2050,Row page JSON should use new ?_extra= format,9599,simonw,open,0,,,8755003,Datasette 1.0a-next,1,2023-03-31T17:56:53Z,2023-03-31T17:59:49Z,,OWNER,,"https://latest.datasette.io/fixtures/facetable/2.json

Related:

- #2049
- #1709",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2050/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1649793525,I_kwDOBm6k_c5iVdn1,2051,`?_extra=row_urls` for table pages,9599,simonw,open,0,,,,,0,2023-03-31T17:58:36Z,2023-03-31T17:58:36Z,,OWNER,,Provides URLs to the JSON version of those rows. Maybe it persists the `?_shape=` option too? Not sure about that.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2051/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1615692818,I_kwDOBm6k_c5gTYQS,2035,Potential feature: special support for `?a=1&a=2` on the query page,9599,simonw,open,0,,,3268330,Datasette 1.0,14,2023-03-08T18:05:03Z,2023-03-31T16:09:08Z,,OWNER,,"From a discussion on Discord: https://discord.com/channels/823971286308356157/996877076982415491/1082789517062320138

The key idea is to make it easier for people to implement `where id in (...)` that's populated from query string arguments; a sketch of the kind of expansion involved is below.
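The following is a minimal sketch with hypothetical table and parameter names, showing how repeated arguments could be expanded into a safely parameterized `in (...)` clause:

```py
import sqlite3

ids = [int(i) for i in ('11', '32', '62')]  # as parsed from ?id=11&id=32&id=62
placeholders = ', '.join('?' for _ in ids)

conn = sqlite3.connect(':memory:')
conn.execute('create table mytable (id integer)')
conn.executemany('insert into mytable values (?)', [(11,), (32,), (62,), (99,)])

# One ? placeholder per repeated argument keeps the query parameterized
rows = conn.execute(
    f'select id from mytable where id in ({placeholders})', ids
).fetchall()
print(rows)  # only the three matching rows
```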
What if you could add `?id=11&id=32&id=62` to the URL and have that made available as a list that can be used in the query?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2035/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1531991339,I_kwDOBm6k_c5bUFUr,1989,Suggestion: Hiding columns,116795,pax,open,0,,,,,3,2023-01-13T09:33:32Z,2023-03-31T06:18:05Z,,NONE,,As there's the possibility of [hiding tables](https://docs.datasette.io/en/stable/metadata.html#hiding-tables) - I've run into the **need of hiding specific columns** - data that's either not relevant for public or can't be shown due to privacy reasons. ,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1989/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1515883470,I_kwDOC8tyDs5aWovO,24,DOC: xml.etree.ElementTree.ParseError due to healthkit version 12 ,6231413,mmngreco,open,0,,,,,2,2023-01-01T23:00:38Z,2023-03-30T10:17:31Z,,NONE,,"Hi @simonw I hope you find this issue ok, the idea is provide some documentation to other users like me about how to solve this problem and save some time. Following the instructions from the `README.md` I've faced this error: ```bash (venv) mgreco@pop-os apple-health master* (23:44|0s) $ healthkit-to-sqlite apple_health_export/export.xml healthkit.db --xml Importing from HealthKit [------------------------------------] 0% Traceback (most recent call last): File ""/home/mgreco/github/mmngreco/apple-health/venv/bin/healthkit-to-sqlite"", line 33, in sys.exit(load_entry_point('healthkit-to-sqlite', 'console_scripts', 'healthkit-to-sqlite')()) File ""/home/mgreco/github/mmngreco/apple-health/venv/lib/python3.10/site-packages/click/core.py"", line 1130, in __call__ return self.main(*args, **kwargs) File ""/home/mgreco/github/mmngreco/apple-health/venv/lib/python3.10/site-packages/click/core.py"", line 1055, in main rv = self.invoke(ctx) File ""/home/mgreco/github/mmngreco/apple-health/venv/lib/python3.10/site-packages/click/core.py"", line 1404, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/home/mgreco/github/mmngreco/apple-health/venv/lib/python3.10/site-packages/click/core.py"", line 760, in invoke return __callback(*args, **kwargs) File ""/home/mgreco/github/mmngreco/apple-health/.deps/healthkit-to-sqlite/healthkit_to_sqlite/cli.py"", line 57, in cli convert_xml_to_sqlite(fp, db, progress_callback=bar.update, zipfile=zf) File ""/home/mgreco/github/mmngreco/apple-health/.deps/healthkit-to-sqlite/healthkit_to_sqlite/utils.py"", line 25, in convert_xml_to_sqlite for tag, el in find_all_tags( File ""/home/mgreco/github/mmngreco/apple-health/.deps/healthkit-to-sqlite/healthkit_to_sqlite/utils.py"", line 12, in find_all_tags for event, el in parser.read_events(): File ""/home/mgreco/github/mmngreco/apple-health/venv/lib/python3.10/xml/etree/ElementTree.py"", line 1324, in read_events raise event File ""/home/mgreco/github/mmngreco/apple-health/venv/lib/python3.10/xml/etree/ElementTree.py"", line 1296, in feed self._parser.feed(data) xml.etree.ElementTree.ParseError: syntax error: line 156, column 0 ``` So, after debugging and searching on internet I found this useful link: https://discussions.apple.com/thread/254202523 (etresoft, the real hero). 
Which basically says that the xml given by the health app (healthkit version 12) has some bugs but fortunately, they can be solved with a couple of commads: 1. Uncompress the zip and move the new folder where `export.xml` is. 1. Create a `patch.txt` with the following content ```diff --- export.xml 2022-09-18 15:17:09.000000000 -0400 +++ export-fixed.xml 2022-09-18 16:37:08.000000000 -0400 @@ -15,6 +15,7 @@ HKCharacteristicTypeIdentifierBiologicalSex CDATA #REQUIRED HKCharacteristicTypeIdentifierBloodType CDATA #REQUIRED HKCharacteristicTypeIdentifierFitzpatrickSkinType CDATA #REQUIRED + HKCharacteristicTypeIdentifierCardioFitnessMedicationsUse CDATA #IMPLIED > - + - + - device CDATA #IMPLIED - - -> ]> ``` 1. Apply the path with the command: `patch < patch.txt` 1. Fix endDates with the command `sed 's/startDate/endDate/2' export.xml > export-fixed.xml` 1. Try again `healthkit-to-sqlite export-fixed.xml healthkit.db --xml`",197882382,healthkit-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/24/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1646068413,I_kwDOBm6k_c5iHQK9,2048,Test failures encountered while packaging for GNU Guix,8332263,Apteryks,open,0,,,,,0,2023-03-29T15:36:54Z,2023-03-29T15:36:54Z,,NONE,,"Hello, While reviewing a packaged submitted to Guix to add `datasette`, the test suite produces the following errors: ``` =================================== FAILURES =================================== _________________________ test_row_strange_table_name __________________________ [gw21] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python app_client = def test_row_strange_table_name(app_client): response = app_client.get( ""/fixtures/table~2Fwith~2Fslashes~2Ecsv/3.json?_shape=objects"" ) > assert response.status == 200 E assert 400 == 200 E + where 400 = .status /tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_api.py:701: AssertionError ----------------------------- Captured stderr call ----------------------------- ERROR: conn=, sql = 'select rowid, * from [table%7E2Fwith%7E2Fslashes%7E2Ecsv] where ""rowid""=:p0', params = {'p0': '3'}: no such table: table%7E2Fwith%7E2Fslashes%7E2Ecsv _______________ test_database_page_for_database_with_dot_in_name _______________ [gw15] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python app_client_with_dot = def test_database_page_for_database_with_dot_in_name(app_client_with_dot): response = app_client_with_dot.get(""/fixtures~2Edot.json"") > assert response.status == 200 E assert 302 == 200 E + where 302 = .status /tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_api.py:633: AssertionError ___________________ test_tilde_encoded_database_names[fo%o] ____________________ [gw6] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python db_name = 'fo%o' @pytest.mark.asyncio @pytest.mark.parametrize(""db_name"", (""foo"", r""fo%o"", ""f~/c.d"")) async def test_tilde_encoded_database_names(db_name): ds = Datasette() ds.add_memory_database(db_name) response = await ds.client.get(""/.json"") assert db_name in response.json().keys() path = response.json()[db_name][""path""] # And the JSON for that database response2 = await ds.client.get(path + "".json"") > assert response2.status_code == 200 E assert 302 == 200 E + where 302 = .status_code 
/tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_api.py:983: AssertionError __________________ test_tilde_encoded_database_names[f~/c.d] ___________________ [gw7] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python db_name = 'f~/c.d' @pytest.mark.asyncio @pytest.mark.parametrize(""db_name"", (""foo"", r""fo%o"", ""f~/c.d"")) async def test_tilde_encoded_database_names(db_name): ds = Datasette() ds.add_memory_database(db_name) response = await ds.client.get(""/.json"") assert db_name in response.json().keys() path = response.json()[db_name][""path""] # And the JSON for that database response2 = await ds.client.get(path + "".json"") > assert response2.status_code == 200 E assert 302 == 200 E + where 302 = .status_code /tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_api.py:983: AssertionError ______________ test_database_with_space_in_name[/searchable.json] ______________ [gw21] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python app_client_two_attached_databases = path = '/searchable.json' @pytest.mark.parametrize( ""path"", ( ""/"", "".json"", ""/searchable"", ""/searchable.json"", ""/searchable_view"", ""/searchable_view.json"", ), ) def test_database_with_space_in_name(app_client_two_attached_databases, path): > response = app_client_two_attached_databases.get( ""/extra~20database"" + path, follow_redirects=True ) /tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_api.py:920: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /gnu/store/mcclmphjgbrgpa0v037a4nlq336482g8-python-asgiref-3.4.1/lib/python3.9/site-packages/asgiref/sync.py:223: in __call__ return call_result.result() /gnu/store/65i3nhcwmz0p8rqbg48gaavyky4g4hwk-python-3.9.9/lib/python3.9/concurrent/futures/_base.py:438: in result return self.__get_result() /gnu/store/65i3nhcwmz0p8rqbg48gaavyky4g4hwk-python-3.9.9/lib/python3.9/concurrent/futures/_base.py:390: in __get_result raise self._exception /gnu/store/mcclmphjgbrgpa0v037a4nlq336482g8-python-asgiref-3.4.1/lib/python3.9/site-packages/asgiref/sync.py:292: in main_wrap result = await self.awaitable(*args, **kwargs) /tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/utils/testing.py:66: in get return await self._request( /tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/utils/testing.py:156: in _request httpx_response = await self.ds.client.request( /tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/app.py:1602: in request return await client.request( /gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1527: in request return await self.send(request, auth=auth, follow_redirects=follow_redirects) /gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1614: in send response = await self._send_handling_auth( /gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1642: in _send_handling_auth response = await self._send_handling_redirects( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = request = follow_redirects = True history = [, , , , , , ...] 
async def _send_handling_redirects( self, request: Request, follow_redirects: bool, history: typing.List[Response], ) -> Response: while True: if len(history) > self.max_redirects: > raise TooManyRedirects( ""Exceeded maximum allowed redirects."", request=request ) E httpx.TooManyRedirects: Exceeded maximum allowed redirects. /gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1672: TooManyRedirects ___________________ test_database_with_space_in_name[.json] ____________________ [gw19] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python app_client_two_attached_databases = path = '.json' @pytest.mark.parametrize( ""path"", ( ""/"", "".json"", ""/searchable"", ""/searchable.json"", ""/searchable_view"", ""/searchable_view.json"", ), ) def test_database_with_space_in_name(app_client_two_attached_databases, path): > response = app_client_two_attached_databases.get( ""/extra~20database"" + path, follow_redirects=True ) /tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_api.py:920: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /gnu/store/mcclmphjgbrgpa0v037a4nlq336482g8-python-asgiref-3.4.1/lib/python3.9/site-packages/asgiref/sync.py:223: in __call__ return call_result.result() /gnu/store/65i3nhcwmz0p8rqbg48gaavyky4g4hwk-python-3.9.9/lib/python3.9/concurrent/futures/_base.py:438: in result return self.__get_result() /gnu/store/65i3nhcwmz0p8rqbg48gaavyky4g4hwk-python-3.9.9/lib/python3.9/concurrent/futures/_base.py:390: in __get_result raise self._exception /gnu/store/mcclmphjgbrgpa0v037a4nlq336482g8-python-asgiref-3.4.1/lib/python3.9/site-packages/asgiref/sync.py:292: in main_wrap result = await self.awaitable(*args, **kwargs) /tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/utils/testing.py:66: in get return await self._request( /tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/utils/testing.py:156: in _request httpx_response = await self.ds.client.request( /tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/app.py:1602: in request return await client.request( /gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1527: in request return await self.send(request, auth=auth, follow_redirects=follow_redirects) /gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1614: in send response = await self._send_handling_auth( /gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1642: in _send_handling_auth response = await self._send_handling_redirects( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = request = follow_redirects = True history = [, , , , , , ...] async def _send_handling_redirects( self, request: Request, follow_redirects: bool, history: typing.List[Response], ) -> Response: while True: if len(history) > self.max_redirects: > raise TooManyRedirects( ""Exceeded maximum allowed redirects."", request=request ) E httpx.TooManyRedirects: Exceeded maximum allowed redirects. 
/gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1672: TooManyRedirects ______________ test_database_with_space_in_name[/searchable_view] ______________ [gw22] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python app_client_two_attached_databases = path = '/searchable_view' @pytest.mark.parametrize( ""path"", ( ""/"", "".json"", ""/searchable"", ""/searchable.json"", ""/searchable_view"", ""/searchable_view.json"", ), ) def test_database_with_space_in_name(app_client_two_attached_databases, path): > response = app_client_two_attached_databases.get( ""/extra~20database"" + path, follow_redirects=True ) /tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_api.py:920: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /gnu/store/mcclmphjgbrgpa0v037a4nlq336482g8-python-asgiref-3.4.1/lib/python3.9/site-packages/asgiref/sync.py:223: in __call__ return call_result.result() /gnu/store/65i3nhcwmz0p8rqbg48gaavyky4g4hwk-python-3.9.9/lib/python3.9/concurrent/futures/_base.py:438: in result return self.__get_result() /gnu/store/65i3nhcwmz0p8rqbg48gaavyky4g4hwk-python-3.9.9/lib/python3.9/concurrent/futures/_base.py:390: in __get_result raise self._exception /gnu/store/mcclmphjgbrgpa0v037a4nlq336482g8-python-asgiref-3.4.1/lib/python3.9/site-packages/asgiref/sync.py:292: in main_wrap result = await self.awaitable(*args, **kwargs) /tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/utils/testing.py:66: in get return await self._request( /tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/utils/testing.py:156: in _request httpx_response = await self.ds.client.request( /tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/app.py:1602: in request return await client.request( /gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1527: in request return await self.send(request, auth=auth, follow_redirects=follow_redirects) /gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1614: in send response = await self._send_handling_auth( /gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1642: in _send_handling_auth response = await self._send_handling_redirects( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = request = follow_redirects = True history = [, , , , , , ...] async def _send_handling_redirects( self, request: Request, follow_redirects: bool, history: typing.List[Response], ) -> Response: while True: if len(history) > self.max_redirects: > raise TooManyRedirects( ""Exceeded maximum allowed redirects."", request=request ) E httpx.TooManyRedirects: Exceeded maximum allowed redirects. 
/gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1672: TooManyRedirects _____________________ test_database_with_space_in_name[/] ______________________ [gw18] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python app_client_two_attached_databases = path = '/' @pytest.mark.parametrize( ""path"", ( ""/"", "".json"", ""/searchable"", ""/searchable.json"", ""/searchable_view"", ""/searchable_view.json"", ), ) def test_database_with_space_in_name(app_client_two_attached_databases, path): > response = app_client_two_attached_databases.get( ""/extra~20database"" + path, follow_redirects=True ) /tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_api.py:920: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /gnu/store/mcclmphjgbrgpa0v037a4nlq336482g8-python-asgiref-3.4.1/lib/python3.9/site-packages/asgiref/sync.py:223: in __call__ return call_result.result() /gnu/store/65i3nhcwmz0p8rqbg48gaavyky4g4hwk-python-3.9.9/lib/python3.9/concurrent/futures/_base.py:438: in result return self.__get_result() /gnu/store/65i3nhcwmz0p8rqbg48gaavyky4g4hwk-python-3.9.9/lib/python3.9/concurrent/futures/_base.py:390: in __get_result raise self._exception /gnu/store/mcclmphjgbrgpa0v037a4nlq336482g8-python-asgiref-3.4.1/lib/python3.9/site-packages/asgiref/sync.py:292: in main_wrap result = await self.awaitable(*args, **kwargs) /tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/utils/testing.py:66: in get return await self._request( /tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/utils/testing.py:156: in _request httpx_response = await self.ds.client.request( /tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/app.py:1602: in request return await client.request( /gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1527: in request return await self.send(request, auth=auth, follow_redirects=follow_redirects) /gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1614: in send response = await self._send_handling_auth( /gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1642: in _send_handling_auth response = await self._send_handling_redirects( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = request = follow_redirects = True history = [, , , , , , ...] async def _send_handling_redirects( self, request: Request, follow_redirects: bool, history: typing.List[Response], ) -> Response: while True: if len(history) > self.max_redirects: > raise TooManyRedirects( ""Exceeded maximum allowed redirects."", request=request ) E httpx.TooManyRedirects: Exceeded maximum allowed redirects. 
/gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1672: TooManyRedirects ________________ test_database_with_space_in_name[/searchable] _________________ [gw20] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python app_client_two_attached_databases = path = '/searchable' @pytest.mark.parametrize( ""path"", ( ""/"", "".json"", ""/searchable"", ""/searchable.json"", ""/searchable_view"", ""/searchable_view.json"", ), ) def test_database_with_space_in_name(app_client_two_attached_databases, path): > response = app_client_two_attached_databases.get( ""/extra~20database"" + path, follow_redirects=True ) /tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_api.py:920: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /gnu/store/mcclmphjgbrgpa0v037a4nlq336482g8-python-asgiref-3.4.1/lib/python3.9/site-packages/asgiref/sync.py:223: in __call__ return call_result.result() /gnu/store/65i3nhcwmz0p8rqbg48gaavyky4g4hwk-python-3.9.9/lib/python3.9/concurrent/futures/_base.py:438: in result return self.__get_result() /gnu/store/65i3nhcwmz0p8rqbg48gaavyky4g4hwk-python-3.9.9/lib/python3.9/concurrent/futures/_base.py:390: in __get_result raise self._exception /gnu/store/mcclmphjgbrgpa0v037a4nlq336482g8-python-asgiref-3.4.1/lib/python3.9/site-packages/asgiref/sync.py:292: in main_wrap result = await self.awaitable(*args, **kwargs) /tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/utils/testing.py:66: in get return await self._request( /tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/utils/testing.py:156: in _request httpx_response = await self.ds.client.request( /tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/app.py:1602: in request return await client.request( /gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1527: in request return await self.send(request, auth=auth, follow_redirects=follow_redirects) /gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1614: in send response = await self._send_handling_auth( /gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1642: in _send_handling_auth response = await self._send_handling_redirects( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = request = follow_redirects = True history = [, , , , , , ...] async def _send_handling_redirects( self, request: Request, follow_redirects: bool, history: typing.List[Response], ) -> Response: while True: if len(history) > self.max_redirects: > raise TooManyRedirects( ""Exceeded maximum allowed redirects."", request=request ) E httpx.TooManyRedirects: Exceeded maximum allowed redirects. 
/gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1672: TooManyRedirects ___________ test_database_with_space_in_name[/searchable_view.json] ____________ [gw23] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python app_client_two_attached_databases = path = '/searchable_view.json' @pytest.mark.parametrize( ""path"", ( ""/"", "".json"", ""/searchable"", ""/searchable.json"", ""/searchable_view"", ""/searchable_view.json"", ), ) def test_database_with_space_in_name(app_client_two_attached_databases, path): > response = app_client_two_attached_databases.get( ""/extra~20database"" + path, follow_redirects=True ) /tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_api.py:920: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /gnu/store/mcclmphjgbrgpa0v037a4nlq336482g8-python-asgiref-3.4.1/lib/python3.9/site-packages/asgiref/sync.py:223: in __call__ return call_result.result() /gnu/store/65i3nhcwmz0p8rqbg48gaavyky4g4hwk-python-3.9.9/lib/python3.9/concurrent/futures/_base.py:438: in result return self.__get_result() /gnu/store/65i3nhcwmz0p8rqbg48gaavyky4g4hwk-python-3.9.9/lib/python3.9/concurrent/futures/_base.py:390: in __get_result raise self._exception /gnu/store/mcclmphjgbrgpa0v037a4nlq336482g8-python-asgiref-3.4.1/lib/python3.9/site-packages/asgiref/sync.py:292: in main_wrap result = await self.awaitable(*args, **kwargs) /tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/utils/testing.py:66: in get return await self._request( /tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/utils/testing.py:156: in _request httpx_response = await self.ds.client.request( /tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/app.py:1602: in request return await client.request( /gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1527: in request return await self.send(request, auth=auth, follow_redirects=follow_redirects) /gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1614: in send response = await self._send_handling_auth( /gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1642: in _send_handling_auth response = await self._send_handling_redirects( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = request = follow_redirects = True history = [, , , , , , ...] async def _send_handling_redirects( self, request: Request, follow_redirects: bool, history: typing.List[Response], ) -> Response: while True: if len(history) > self.max_redirects: > raise TooManyRedirects( ""Exceeded maximum allowed redirects."", request=request ) E httpx.TooManyRedirects: Exceeded maximum allowed redirects. 
/gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1672: TooManyRedirects ________________ test_weird_database_names[database (1).sqlite] ________________ [gw7] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python tmpdir = local('/tmp/guix-build-datasette-0.64.2.drv-0/pytest-of-nixbld/pytest-0/popen-gw7/test_weird_database_names_data0') filename = 'database (1).sqlite' @pytest.mark.parametrize( ""filename"", [""test-database (1).sqlite"", ""database (1).sqlite""] ) def test_weird_database_names(tmpdir, filename): # https://github.com/simonw/datasette/issues/1181 runner = CliRunner() db_path = str(tmpdir / filename) sqlite3.connect(db_path).execute(""vacuum"") result1 = runner.invoke(cli, [db_path, ""--get"", ""/""]) assert result1.exit_code == 0, result1.output filename_no_stem = filename.rsplit(""."", 1)[0] expected_link = '{}'.format( tilde_encode(filename_no_stem), filename_no_stem ) assert expected_link in result1.output # Now try hitting that database page result2 = runner.invoke( cli, [db_path, ""--get"", ""/{}"".format(tilde_encode(filename_no_stem))] ) > assert result2.exit_code == 0, result2.output E AssertionError: E E assert 1 == 0 E + where 1 = .exit_code /tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_cli.py:321: AssertionError _____________ test_weird_database_names[test-database (1).sqlite] ______________ [gw6] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python tmpdir = local('/tmp/guix-build-datasette-0.64.2.drv-0/pytest-of-nixbld/pytest-0/popen-gw6/test_weird_database_names_test0') filename = 'test-database (1).sqlite' @pytest.mark.parametrize( ""filename"", [""test-database (1).sqlite"", ""database (1).sqlite""] ) def test_weird_database_names(tmpdir, filename): # https://github.com/simonw/datasette/issues/1181 runner = CliRunner() db_path = str(tmpdir / filename) sqlite3.connect(db_path).execute(""vacuum"") result1 = runner.invoke(cli, [db_path, ""--get"", ""/""]) assert result1.exit_code == 0, result1.output filename_no_stem = filename.rsplit(""."", 1)[0] expected_link = '{}'.format( tilde_encode(filename_no_stem), filename_no_stem ) assert expected_link in result1.output # Now try hitting that database page result2 = runner.invoke( cli, [db_path, ""--get"", ""/{}"".format(tilde_encode(filename_no_stem))] ) > assert result2.exit_code == 0, result2.output E AssertionError: E E assert 1 == 0 E + where 1 = .exit_code /tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_cli.py:321: AssertionError _ test_row_html_compound_primary_key[/fixtures/compound_primary_key/a~2Fb,~2Ec~2Dd-expected1] _ [gw11] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python app_client = path = '/fixtures/compound_primary_key/a~2Fb,~2Ec~2Dd' expected = [['
', '', '']] @pytest.mark.parametrize( ""path,expected"", ( ( ""/fixtures/compound_primary_key/a,b"", [ [ '', '', '', ] ], ), ( ""/fixtures/compound_primary_key/a~2Fb,~2Ec~2Dd"", [ [ '', '', '', ] ], ), ), ) def test_row_html_compound_primary_key(app_client, path, expected): response = app_client.get(path) > assert response.status == 200 E assert 302 == 200 E + where 302 = .status /tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_html.py:370: AssertionError _ test_css_classes_on_body[/fixtures/table~2Fwith~2Fslashes~2Ecsv-expected_classes5] _ [gw3] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python app_client = path = '/fixtures/table~2Fwith~2Fslashes~2Ecsv' expected_classes = ['table', 'db-fixtures', 'table-tablewithslashescsv-fa7563'] @pytest.mark.parametrize( ""path,expected_classes"", [ (""/"", [""index""]), (""/fixtures"", [""db"", ""db-fixtures""]), (""/fixtures?sql=select+1"", [""query"", ""db-fixtures""]), ( ""/fixtures/simple_primary_key"", [""table"", ""db-fixtures"", ""table-simple_primary_key""], ), ( ""/fixtures/neighborhood_search"", [""query"", ""db-fixtures"", ""query-neighborhood_search""], ), ( ""/fixtures/table~2Fwith~2Fslashes~2Ecsv"", [""table"", ""db-fixtures"", ""table-tablewithslashescsv-fa7563""], ), ( ""/fixtures/simple_primary_key/1"", [""row"", ""db-fixtures"", ""table-simple_primary_key""], ), ], ) def test_css_classes_on_body(app_client, path, expected_classes): response = app_client.get(path) > assert response.status == 200 E assert 302 == 200 E + where 302 = .status /tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_html.py:238: AssertionError _ test_templates_considered[/fixtures/table~2Fwith~2Fslashes~2Ecsv-table-fixtures-tablewithslashescsv-fa7563.html, *table.html] _ [gw3] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python app_client = path = '/fixtures/table~2Fwith~2Fslashes~2Ecsv' expected_considered = 'table-fixtures-tablewithslashescsv-fa7563.html, *table.html' @pytest.mark.parametrize( ""path,expected_considered"", [ (""/"", ""*index.html""), (""/fixtures"", ""database-fixtures.html, *database.html""), ( ""/fixtures/simple_primary_key"", ""table-fixtures-simple_primary_key.html, *table.html"", ), ( ""/fixtures/table~2Fwith~2Fslashes~2Ecsv"", ""table-fixtures-tablewithslashescsv-fa7563.html, *table.html"", ), ( ""/fixtures/simple_primary_key/1"", ""row-fixtures-simple_primary_key.html, *row.html"", ), ], ) def test_templates_considered(app_client, path, expected_considered): response = app_client.get(path) > assert response.status == 200 E assert 302 == 200 E + where 302 = .status /tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_html.py:264: AssertionError _ test_alternate_url_json[/fixtures/table~2Fwith~2Fslashes~2Ecsv-http://localhost/fixtures/table~2Fwith~2Fslashes~2Ecsv.json] _ [gw21] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python app_client = path = '/fixtures/table~2Fwith~2Fslashes~2Ecsv' expected = 'http://localhost/fixtures/table~2Fwith~2Fslashes~2Ecsv.json' @pytest.mark.parametrize( ""path,expected"", ( # Instance index page (""/"", ""http://localhost/.json""), # Table page (""/fixtures/facetable"", ""http://localhost/fixtures/facetable.json""), ( ""/fixtures/table~2Fwith~2Fslashes~2Ecsv"", ""http://localhost/fixtures/table~2Fwith~2Fslashes~2Ecsv.json"", ), # Row page ( ""/fixtures/no_primary_key/1"", ""http://localhost/fixtures/no_primary_key/1.json"", ), # Database index page ( 
""/fixtures"", ""http://localhost/fixtures.json"", ), # Custom query page ( ""/fixtures?sql=select+*+from+facetable"", ""http://localhost/fixtures.json?sql=select+*+from+facetable"", ), # Canned query page ( ""/fixtures/neighborhood_search?text=town"", ""http://localhost/fixtures/neighborhood_search.json?text=town"", ), # /-/ pages ( ""/-/plugins"", ""http://localhost/-/plugins.json"", ), ), ) def test_alternate_url_json(app_client, path, expected): response = app_client.get(path) > assert response.status == 200 E assert 302 == 200 E + where 302 = .status /tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_html.py:948: AssertionError _ test_edit_sql_link_on_canned_queries[/fixtures/~F0~9D~90~9C~F0~9D~90~A2~F0~9D~90~AD~F0~9D~90~A2~F0~9D~90~9E~F0~9D~90~AC-/fixtures?sql=select+id%2C+name+from+facet_cities+order+by+id+limit+1%3B] _ [gw18] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python app_client = path = '/fixtures/~F0~9D~90~9C~F0~9D~90~A2~F0~9D~90~AD~F0~9D~90~A2~F0~9D~90~9E~F0~9D~90~AC' expected = '/fixtures?sql=select+id%2C+name+from+facet_cities+order+by+id+limit+1%3B' @pytest.mark.parametrize( ""path,expected"", [ ( ""/fixtures/neighborhood_search"", ""/fixtures?sql=%0Aselect+_neighborhood%2C+facet_cities.name%2C+state%0Afrom+facetable%0A++++join+facet_cities%0A++++++++on+facetable._city_id+%3D+facet_cities.id%0Awhere+_neighborhood+like+%27%25%27+%7C%7C+%3Atext+%7C%7C+%27%25%27%0Aorder+by+_neighborhood%3B%0A&text="", ), ( ""/fixtures/neighborhood_search?text=ber"", ""/fixtures?sql=%0Aselect+_neighborhood%2C+facet_cities.name%2C+state%0Afrom+facetable%0A++++join+facet_cities%0A++++++++on+facetable._city_id+%3D+facet_cities.id%0Awhere+_neighborhood+like+%27%25%27+%7C%7C+%3Atext+%7C%7C+%27%25%27%0Aorder+by+_neighborhood%3B%0A&text=ber"", ), (""/fixtures/pragma_cache_size"", None), ( # /fixtures/𝐜𝐢𝐭𝐢𝐞𝐬 ""/fixtures/~F0~9D~90~9C~F0~9D~90~A2~F0~9D~90~AD~F0~9D~90~A2~F0~9D~90~9E~F0~9D~90~AC"", ""/fixtures?sql=select+id%2C+name+from+facet_cities+order+by+id+limit+1%3B"", ), (""/fixtures/magic_parameters"", None), ], ) def test_edit_sql_link_on_canned_queries(app_client, path, expected): response = app_client.get(path) > assert response.status == 200 E assert 302 == 200 E + where 302 = .status /tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_html.py:841: AssertionError _______________________ test_table_with_slashes_in_name ________________________ [gw9] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python app_client = def test_table_with_slashes_in_name(app_client): response = app_client.get( ""/fixtures/table~2Fwith~2Fslashes~2Ecsv.json?_shape=objects"" ) > assert response.status == 200 E assert 302 == 200 E + where 302 = .status /tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_table_api.py:141: AssertionError __________________ test_custom_query_with_unicode_characters ___________________ [gw8] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python app_client = def test_custom_query_with_unicode_characters(app_client): # /fixtures/𝐜𝐢𝐭𝐢𝐞𝐬.json response = app_client.get( ""/fixtures/~F0~9D~90~9C~F0~9D~90~A2~F0~9D~90~AD~F0~9D~90~A2~F0~9D~90~9E~F0~9D~90~AC.json?_shape=array"" ) > assert [{""id"": 1, ""name"": ""San Francisco""}] == response.json /tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_table_api.py:1042: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/utils/testing.py:40: in json return json.loads(self.text) /gnu/store/65i3nhcwmz0p8rqbg48gaavyky4g4hwk-python-3.9.9/lib/python3.9/json/__init__.py:346: in loads return _default_decoder.decode(s) /gnu/store/65i3nhcwmz0p8rqbg48gaavyky4g4hwk-python-3.9.9/lib/python3.9/json/decoder.py:337: in decode obj, end = self.raw_decode(s, idx=_w(s, 0).end()) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , s = '', idx = 0 def raw_decode(self, s, idx=0): """"""Decode a JSON document from ``s`` (a ``str`` beginning with a JSON document) and return a 2-tuple of the Python representation and the index in ``s`` where the document ended. This can be used to decode a JSON document from a string that may have extraneous data at the end. """""" try: obj, end = self.scan_once(s, idx) except StopIteration as err: > raise JSONDecodeError(""Expecting value"", s, err.value) from None E json.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0) /gnu/store/65i3nhcwmz0p8rqbg48gaavyky4g4hwk-python-3.9.9/lib/python3.9/json/decoder.py:355: JSONDecodeError _ test_searchable[/fixtures/searchable.json?_search=te*+AND+do*&_searchmode=raw-expected_rows3] _ [gw13] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python app_client = path = '/fixtures/searchable.json?_search=te*+AND+do*&_searchmode=raw' expected_rows = [[1, 'barry cat', 'terry dog', 'panther'], [2, 'terry dog', 'sara weasel', 'puma']] @pytest.mark.parametrize( ""path,expected_rows"", [ ( ""/fixtures/searchable.json?_search=dog"", [ [1, ""barry cat"", ""terry dog"", ""panther""], [2, ""terry dog"", ""sara weasel"", ""puma""], ], ), ( # Special keyword shouldn't break FTS query ""/fixtures/searchable.json?_search=AND"", [], ), ( # Without _searchmode=raw this should return no results ""/fixtures/searchable.json?_search=te*+AND+do*"", [], ), ( # _searchmode=raw ""/fixtures/searchable.json?_search=te*+AND+do*&_searchmode=raw"", [ [1, ""barry cat"", ""terry dog"", ""panther""], [2, ""terry dog"", ""sara weasel"", ""puma""], ], ), ( # _searchmode=raw combined with _search_COLUMN ""/fixtures/searchable.json?_search_text2=te*&_searchmode=raw"", [ [1, ""barry cat"", ""terry dog"", ""panther""], ], ), ( ""/fixtures/searchable.json?_search=weasel"", [[2, ""terry dog"", ""sara weasel"", ""puma""]], ), ( ""/fixtures/searchable.json?_search_text2=dog"", [[1, ""barry cat"", ""terry dog"", ""panther""]], ), ( ""/fixtures/searchable.json?_search_name%20with%20.%20and%20spaces=panther"", [[1, ""barry cat"", ""terry dog"", ""panther""]], ), ], ) def test_searchable(app_client, path, expected_rows): response = app_client.get(path) > assert expected_rows == response.json[""rows""] E AssertionError: assert [[1, 'barry cat', 'terry dog', 'panther'],\n [2, 'terry dog', 'sara weasel', 'puma']] == [] E Left contains 2 more items, first extra item: [1, 'barry cat', 'terry dog', 'panther'] E Full diff: E [ E - , E + [1, E + 'barry cat', E + 'terry dog', E + 'panther'], E + [2, E + 'terry dog', E + 'sara weasel', E + 'puma'], E ] /tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_table_api.py:402: AssertionError _____ test_searchmode[table_metadata1-_search=te*+AND+do*-expected_rows1] ______ [gw20] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python table_metadata = {'searchmode': 'raw'}, querystring = '_search=te*+AND+do*' expected_rows = [[1, 'barry cat', 'terry dog', 'panther'], [2, 'terry 
dog', 'sara weasel', 'puma']] @pytest.mark.parametrize( ""table_metadata,querystring,expected_rows"", [ ( {}, ""_search=te*+AND+do*"", [], ), ( {""searchmode"": ""raw""}, ""_search=te*+AND+do*"", _SEARCHMODE_RAW_RESULTS, ), ( {}, ""_search=te*+AND+do*&_searchmode=raw"", _SEARCHMODE_RAW_RESULTS, ), # Can be over-ridden with _searchmode=escaped ( {""searchmode"": ""raw""}, ""_search=te*+AND+do*&_searchmode=escaped"", [], ), ], ) def test_searchmode(table_metadata, querystring, expected_rows): with make_app_client( metadata={""databases"": {""fixtures"": {""tables"": {""searchable"": table_metadata}}}} ) as client: response = client.get(""/fixtures/searchable.json?"" + querystring) > assert expected_rows == response.json[""rows""] E AssertionError: assert [[1, 'barry cat', 'terry dog', 'panther'],\n [2, 'terry dog', 'sara weasel', 'puma']] == [] E Left contains 2 more items, first extra item: [1, 'barry cat', 'terry dog', 'panther'] E Full diff: E [ E - , E + [1, E + 'barry cat', E + 'terry dog', E + 'panther'], E + [2, E + 'terry dog', E + 'sara weasel', E + 'puma'], E ] /tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_table_api.py:442: AssertionError _ test_searchmode[table_metadata2-_search=te*+AND+do*&_searchmode=raw-expected_rows2] _ [gw20] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python table_metadata = {}, querystring = '_search=te*+AND+do*&_searchmode=raw' expected_rows = [[1, 'barry cat', 'terry dog', 'panther'], [2, 'terry dog', 'sara weasel', 'puma']] @pytest.mark.parametrize( ""table_metadata,querystring,expected_rows"", [ ( {}, ""_search=te*+AND+do*"", [], ), ( {""searchmode"": ""raw""}, ""_search=te*+AND+do*"", _SEARCHMODE_RAW_RESULTS, ), ( {}, ""_search=te*+AND+do*&_searchmode=raw"", _SEARCHMODE_RAW_RESULTS, ), # Can be over-ridden with _searchmode=escaped ( {""searchmode"": ""raw""}, ""_search=te*+AND+do*&_searchmode=escaped"", [], ), ], ) def test_searchmode(table_metadata, querystring, expected_rows): with make_app_client( metadata={""databases"": {""fixtures"": {""tables"": {""searchable"": table_metadata}}}} ) as client: response = client.get(""/fixtures/searchable.json?"" + querystring) > assert expected_rows == response.json[""rows""] E AssertionError: assert [[1, 'barry cat', 'terry dog', 'panther'],\n [2, 'terry dog', 'sara weasel', 'puma']] == [] E Left contains 2 more items, first extra item: [1, 'barry cat', 'terry dog', 'panther'] E Full diff: E [ E - , E + [1, E + 'barry cat', E + 'terry dog', E + 'panther'], E + [2, E + 'terry dog', E + 'sara weasel', E + 'puma'], E ] /tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_table_api.py:442: AssertionError =========================== short test summary info ============================ FAILED tests/test_api.py::test_row_strange_table_name - assert 400 == 200 FAILED tests/test_api.py::test_database_page_for_database_with_dot_in_name - ... FAILED tests/test_api.py::test_tilde_encoded_database_names[fo%o] - assert 30... FAILED tests/test_api.py::test_tilde_encoded_database_names[f~/c.d] - assert ... FAILED tests/test_api.py::test_database_with_space_in_name[/searchable.json] FAILED tests/test_api.py::test_database_with_space_in_name[.json] - httpx.Too... FAILED tests/test_api.py::test_database_with_space_in_name[/searchable_view] FAILED tests/test_api.py::test_database_with_space_in_name[/] - httpx.TooMany... FAILED tests/test_api.py::test_database_with_space_in_name[/searchable] - htt... 
FAILED tests/test_api.py::test_database_with_space_in_name[/searchable_view.json] FAILED tests/test_cli.py::test_weird_database_names[database (1).sqlite] - As... FAILED tests/test_cli.py::test_weird_database_names[test-database (1).sqlite] FAILED tests/test_html.py::test_row_html_compound_primary_key[/fixtures/compound_primary_key/a~2Fb,~2Ec~2Dd-expected1] FAILED tests/test_html.py::test_css_classes_on_body[/fixtures/table~2Fwith~2Fslashes~2Ecsv-expected_classes5] FAILED tests/test_html.py::test_templates_considered[/fixtures/table~2Fwith~2Fslashes~2Ecsv-table-fixtures-tablewithslashescsv-fa7563.html, *table.html] FAILED tests/test_html.py::test_alternate_url_json[/fixtures/table~2Fwith~2Fslashes~2Ecsv-http://localhost/fixtures/table~2Fwith~2Fslashes~2Ecsv.json] FAILED tests/test_html.py::test_edit_sql_link_on_canned_queries[/fixtures/~F0~9D~90~9C~F0~9D~90~A2~F0~9D~90~AD~F0~9D~90~A2~F0~9D~90~9E~F0~9D~90~AC-/fixtures?sql=select+id%2C+name+from+facet_cities+order+by+id+limit+1%3B] FAILED tests/test_table_api.py::test_table_with_slashes_in_name - assert 302 ... FAILED tests/test_table_api.py::test_custom_query_with_unicode_characters - j... FAILED tests/test_table_api.py::test_searchable[/fixtures/searchable.json?_search=te*+AND+do*&_searchmode=raw-expected_rows3] FAILED tests/test_table_api.py::test_searchmode[table_metadata1-_search=te*+AND+do*-expected_rows1] FAILED tests/test_table_api.py::test_searchmode[table_metadata2-_search=te*+AND+do*&_searchmode=raw-expected_rows2] =========== 22 failed, 1049 passed, 3 skipped in 1522.28s (0:25:22) ============ error: in phase 'check': uncaught exception: %exception #<&invoke-error program: ""/gnu/store/ziqwkzz6znb5d3c245xn0cq5ra2ly0w3-python-pytest-7.1.3/bin/pytest"" arguments: (""-vv"" ""-n"" ""24"" ""-m"" ""not serial"") exit-status: 1 term-signal: #f stop-signal: #f> phase `check' failed after 1523.3 seconds ``` The tests run in a private namespace without internet connectivity, and the Python dependencies are at: ``` python-aiofiles@0.6.0 python-asgi-csrf@0.9 python-asgiref@3.4.1 + python-beautifulsoup4@4.11.1 python-black@22.3.0 python-click-default-group@1.2.2 python-click@8.1.3 + python-cogapp@3.3.0 python-httpx@0.23.0 python-hupper@1.10.3 python-itsdangerous@2.0.1 + python-janus@1.0.0 python-jinja2@3.1.1 python-mergedeep@1.3.4 python-pint@0.20.1 python-pluggy@1.0.0 + python-pytest-asyncio@0.17.2 python-pytest-runner@5.2 python-pytest-timeout@2.0.2 + python-pytest-xdist@2.5.0 python-pytest@7.1.3 python-pyyaml@6.0 python-setuptools@64.0.3 + python-trustme@0.9.0 python-uvicorn@0.17.6 ``` With Python 3.9.9. Thank you!",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2048/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 323658641,MDU6SXNzdWUzMjM2NTg2NDE=,262,Add ?_extra= mechanism for requesting extra properties in JSON,9599,simonw,open,0,,,3268330,Datasette 1.0,27,2018-05-16T14:55:42Z,2023-03-29T06:22:22Z,,OWNER,,"Datasette views currently work by creating a set of data that should be returned as JSON, then defining an additional, optional `template_data()` function which is called if the view is being rendered as HTML. This `template_data()` function calculates extra template context variables which are necessary for the HTML view but should not be included in the JSON. 
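In pseudocode the current split looks something like this (a simplified sketch with hypothetical names, not the actual view code):

```python
# Hypothetical sketch of the data / template_data split - not the real implementation
class ExampleView:
    def data(self):
        # Everything returned here ends up in the JSON representation
        return {'rows': [{'id': 1}, {'id': 2}]}

    def template_data(self):
        # Extra context that only the HTML template needs
        return {'suggested_facets': ['city', 'state']}

view = ExampleView()
json_payload = view.data()
html_context = {**view.data(), **view.template_data()}
```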
Example of how that is used today: https://github.com/simonw/datasette/blob/2b79f2bdeb1efa86e0756e741292d625f91cb93d/datasette/views/table.py#L672-L704 With features like Facets in #255 I'm beginning to want to move more items into the `template_data()` - in the case of facets it's the `suggested_facets` array. This saves that feature from being calculated (involving several SQL queries) for the JSON case where it is unlikely to be used. But... as an API user, I want to still optionally be able to access that information. Solution: Add a `?_extra=suggested_facets&_extra=table_metadata` argument which can be used to optionally request additional blocks to be added to the JSON API. Then redefine as many of the current `template_data()` features as extra arguments instead, and teach Datasette to return certain extras by default when rendering templates. This could allow the JSON representation to be slimmed down further (removing e.g. the `table_definition` and `view_definition` keys) while still making that information available to API users who need it.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/262/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1114543475,I_kwDOCGYnMM5CbpVz,388,Link to stable docs from older versions,9599,simonw,closed,0,,,,,7,2022-01-26T01:55:46Z,2023-03-26T23:43:12Z,2022-01-26T02:00:22Z,OWNER,,"https://sqlite-utils.datasette.io/en/2.14.1/ isn't showing a link to the stable release right now. I should also apply the same fix I used for Datasette in: - https://github.com/simonw/datasette/issues/1608 TIL: https://til.simonwillison.net/readthedocs/link-from-latest-to-stable",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/388/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1109808154,I_kwDOBm6k_c5CJlQa,1608,Documentation should clarify /stable/ vs /latest/,9599,simonw,closed,0,,,,,15,2022-01-20T22:02:59Z,2023-03-26T23:41:12Z,2022-01-20T22:53:17Z,OWNER,,"It's not currently clear what the difference between https://docs.datasette.io/en/latest/ and https://docs.datasette.io/en/stable/ is - I should fix that. On Twitter: https://twitter.com/simonw/status/1484285006243528705",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1608/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1641013220,I_kwDOBm6k_c5hz9_k,2045,First column on a view page has no facet option in cog menu,9599,simonw,open,0,,,3268330,Datasette 1.0,0,2023-03-26T18:02:47Z,2023-03-26T18:02:48Z,,OWNER,,"e.g. first column on this page - cog menu has no option to facet. 
https://datasette.io/content/tools ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2045/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1121583414,I_kwDOBm6k_c5C2gE2,1619,JSON link on row page is 404 if base_url setting is used,9599,simonw,open,0,,,,,5,2022-02-02T07:09:53Z,2023-03-24T15:38:04Z,,OWNER,,"On my local environment: datasette fixtures.db -p 3344 --setting base_url /foo/bar/ Then hit http://127.0.0.1:3344/foo/bar/fixtures/table%2Fwith%2Fslashes.csv/3 But... that `json` link goes here, which is a 404: http://127.0.0.1:3344/foo/bar/foo/bar/fixtures/table%2Fwith%2Fslashes.csv/3?_format=json",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1619/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1636616315,I_kwDOBm6k_c5hjMh7,2042,Gather feedback on new ?_extra= design,9599,simonw,open,0,,,,,0,2023-03-22T23:07:43Z,2023-03-22T23:08:19Z,,OWNER,,"Now that I've landed: - #1999 See also: - #262 I want to get some feedback from people on the design of the new `?_extra=` feature, before freezing it into Datasette 1.0. The big change is that the default JSON representation is now MUCH slimmer - it only gives you keys for `""next""` and `""rows""`, where rows is a list of JSON objects (not a list of arrays as was previously the default) - for example https://latest.datasette.io/fixtures/sortable.json If you want extra stuff you can ask for it with the new `?_extra=` parameter - e.g. https://latest.datasette.io/fixtures/sortable.json?_extra=columns&_extra=suggested_facets You can use `?_extra=extras` to see a list of available extras: https://latest.datasette.io/fixtures/sortable.json?_extra=extras ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2042/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1633077183,I_kwDOBm6k_c5hVse_,2041,Remove obsolete table POST code,9599,simonw,closed,0,,,8755003,Datasette 1.0a-next,2,2023-03-21T01:01:40Z,2023-03-21T01:17:44Z,2023-03-21T01:17:43Z,OWNER,,"Spotted this in: - #1999 `POST /db/table` currently executes obsolete code for inserting a row - I replaced that with `/db/table/-/insert` in https://github.com/simonw/datasette/commit/6e788b49edf4f842c0817f006eb9d865778eea5e but forgot to remove the old code.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2041/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1617823309,I_kwDOJHON9s5gbgZN,8,Increase performance using macnotesapp,41546558,RhetTbull,closed,0,,,,,1,2023-03-09T18:51:05Z,2023-03-14T22:00:22Z,2023-03-14T22:00:21Z,NONE,,"Neat project! You can probably increase performance using my python interface to Notes, [macnotesapp](https://github.com/RhetTbull/macnotesapp), which uses Scripting Bridge and bulk queries for much better performance than AppleScript. Another related project is [PyXA](https://github.com/SKaplanOfficial/PyXA) which uses Scripting Bridge to access Notes (and many other apps) and can return all the notes at once as opposed to calling AppleScript for each note. 
macnotesapp allows you to access multiple accounts and folders as well. ```python from macnotesapp import NotesApp # NotesApp() provides interface to Notes.app notesapp = NotesApp() # Get list of notes (Note objects for each note) notes = notesapp.notes() note = notes[0] print( note.id, note.account, note.folder, note.name, note.body, note.plaintext, note.password_protected, ) print(note.asdict()) ```",611552758,apple-notes-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/8/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1620516340,I_kwDOCGYnMM5glx30,533,ReadTheDocs error: not all arguments converted during string formatting,9599,simonw,closed,0,,,,,2,2023-03-12T21:21:05Z,2023-03-12T21:25:33Z,2023-03-12T21:25:33Z,OWNER,,"This came up as a failure running tests for: - #531 Traceback on https://readthedocs.org/projects/sqlite-utils/builds/19749348/ ``` File ""/home/docs/checkouts/readthedocs.org/user_builds/sqlite-utils/envs/531/lib/python3.8/site-packages/docutils/parsers/rst/states.py"", line 889, in interpreted nodes, messages2 = role_fn(role, rawsource, text, lineno, self) File ""/home/docs/checkouts/readthedocs.org/user_builds/sqlite-utils/envs/531/lib/python3.8/site-packages/sphinx/ext/extlinks.py"", line 103, in role title = caption % part TypeError: not all arguments converted during string formatting Exception occurred: File ""/home/docs/checkouts/readthedocs.org/user_builds/sqlite-utils/envs/531/lib/python3.8/site-packages/sphinx/ext/extlinks.py"", line 103, in role title = caption % part TypeError: not all arguments converted during string formatting ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/533/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1620515757,I_kwDOBm6k_c5glxut,2039,Subtle bug with `--load-extension` and `--static` flags with absolute Windows paths with `C:\`,15178711,asg017,open,0,,,,,0,2023-03-12T21:18:52Z,2023-03-12T21:18:52Z,,CONTRIBUTOR,,"From the Datasette discord: A user tried running the following command on Windows: ``` datasette --load-extension=""C:\spatialite\mod_spatialite-5.0.1-win-x86\mod_spatialite.dll"" ``` This failed with `""The specified module could not be found""`, because the entrypoint option introduced in #1789 splits the input differently. Instead of loading the extension found at `""C:\spatialite\mod_spatialite-5.0.1-win-x86\mod_spatialite.dll""`, it instead tried to load the extension at `""C""` with entrypoint `""\spatialite\mod_spatialite-5.0.1-win-x86\mod_spatialite.dll""`. This is hard because most absolute Windows paths have a colon in them, like `C:\foo.txt` or `D:\bar.txt`. I'd imagine the `--static` flag is also vulnerable to this type of bug. The ""solution"" is to use a relative path instead, but that doesn't feel that great. ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2039/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 317001500,MDU6SXNzdWUzMTcwMDE1MDA=,236,datasette publish lambda plugin,9599,simonw,open,0,,,,,11,2018-04-23T22:10:30Z,2023-03-12T14:04:15Z,,OWNER,,"Refs #217 - create a publish plugin that can deploy to AWS Lambda.
https://docs.aws.amazon.com/lambda/latest/dg/limits.html says lambda packages can be up to 50 MB, so this would only work with smaller databases (the command can check the filesize before attempting to package and deploy it). Lambdas do get a 512 MB `/tmp` directory too, so for larger databases the function could start and then download up to 512MB from an S3 bucket - so the plugin could take an optional S3 bucket to write to and know how to upload the `.db` file there and then have the lambda download it on startup.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/236/reactions"", ""total_count"": 2, ""+1"": 2, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1618249044,I_kwDOBm6k_c5gdIVU,2038,Consider a `strict_templates` setting,9599,simonw,open,0,,,,,2,2023-03-10T02:09:13Z,2023-03-10T02:11:06Z,,OWNER,,"A setting which turns on Jinja strict mode, so any templates that access undefined variables raise a hard error. Prototype here: ```diff diff --git a/datasette/app.py b/datasette/app.py index 40416713..1428a3f0 100644 --- a/datasette/app.py +++ b/datasette/app.py @@ -200,6 +200,7 @@ SETTINGS = ( ""Allow display of SQL trace debug information with ?_trace=1"", ), Setting(""base_url"", ""/"", ""Datasette URLs should use this base path""), + Setting(""strict_templates"", False, ""Raise errors for undefined template variables""), ) _HASH_URLS_REMOVED = ""The hash_urls setting has been removed, try the datasette-hashed-urls plugin instead"" OBSOLETE_SETTINGS = { @@ -399,11 +400,14 @@ class Datasette: ), ] ) + env_extras = {} + if self.setting(""strict_templates""): + env_extras[""undefined""] = StrictUndefined self.jinja_env = Environment( loader=template_loader, autoescape=True, enable_async=True, - undefined=StrictUndefined, + **env_extras, ) self.jinja_env.filters[""escape_css_string""] = escape_css_string self.jinja_env.filters[""quote_plus""] = urllib.parse.quote_plus ``` Explored this idea a bit in: - #1999",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2038/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1618130434,I_kwDOJHON9s5gcrYC,11,Implement a SQL view to make it easier to query files in a nested folder,9599,simonw,open,0,,,,,3,2023-03-09T23:19:28Z,2023-03-09T23:24:01Z,,MEMBER,,"Working with nested data in SQL is tricky, can I make it easier with a view or canned query?",611552758,apple-notes-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/11/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1615891776,I_kwDOBm6k_c5gUI1A,2037,Test failure: FAILED tests/test_cli.py::test_install_requirements - FileNotFoundError,9599,simonw,closed,0,,,,,3,2023-03-08T20:30:06Z,2023-03-09T22:33:39Z,2023-03-09T22:33:39Z,OWNER,,"> FAILED tests/test_cli.py::test_install_requirements - FileNotFoundError: [Errno 2] No such file or directory From https://github.com/simonw/datasette/actions/runs/4348548218/jobs/7597208191 ``` =================================== FAILURES =================================== __________________________ test_install_requirements ___________________________ run_module = @mock.patch(""datasette.cli.run_module"") def test_install_requirements(run_module): runner = CliRunner() > with 
runner.isolated_filesystem(): /home/runner/work/datasette/datasette/tests/test_cli.py:184: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /opt/hostedtoolcache/Python/3.9.16/x64/lib/python3.9/contextlib.py:119: in __enter__ return next(self.gen) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , temp_dir = None @contextlib.contextmanager def isolated_filesystem( self, temp_dir: t.Optional[t.Union[str, os.PathLike]] = None ) -> t.Iterator[str]: """"""A context manager that creates a temporary directory and changes the current working directory to it. This isolates tests that affect the contents of the CWD to prevent them from interfering with each other. :param temp_dir: Create the temporary directory under this directory. If given, the created directory is not removed when exiting. .. versionchanged:: 8.0 Added the ``temp_dir`` parameter. """""" > cwd = os.getcwd() E FileNotFoundError: [Errno 2] No such file or directory /opt/hostedtoolcache/Python/3.9.16/x64/lib/python3.9/site-packages/click/testing.py:466: FileNotFoundError ``` Not sure why it only affected the ""[Calculate test coverage](https://github.com/simonw/datasette/actions/workflows/test-coverage.yml)"" one.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2037/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1617769847,I_kwDOJHON9s5gbTV3,7,Folder support,9599,simonw,closed,0,,,,,6,2023-03-09T18:21:33Z,2023-03-09T20:48:18Z,2023-03-09T20:48:18Z,MEMBER,,Notes can live in folders. These relationships should be exported too.,611552758,apple-notes-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/7/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1617962395,I_kwDOJHON9s5gcCWb,10,Include schema in README,9599,simonw,closed,0,,,,,0,2023-03-09T20:38:59Z,2023-03-09T20:48:18Z,2023-03-09T20:48:18Z,MEMBER,,As seen in other tools like https://github.com/simonw/git-history,611552758,apple-notes-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/10/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1617938730,I_kwDOJHON9s5gb8kq,9,"Default to just storing plaintext, store HTML if `--html` is passed",9599,simonw,open,0,,,,,0,2023-03-09T20:19:06Z,2023-03-09T20:19:06Z,,MEMBER,,"The full `body` version of the notes can get HUGE, due to embedded images. It turns out for my own purposes I'm usually happy with just the `plaintext` version. I'm tempted to say you don't get HTML unless you pass a `--html` option.",611552758,apple-notes-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/9/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1616429236,I_kwDOJHON9s5gWMC0,4,Support incremental updates,9599,simonw,open,0,,,,,2,2023-03-09T05:14:00Z,2023-03-09T18:20:56Z,,MEMBER,,"Running this script can take several hours against a large notes database. Would be neat if it could run against just the notes that have been modified since it last ran. 
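Finding the cutoff could be a single query, roughly like this sketch (assuming the existing `notes` table keeps its `updated` column; the incremental logic itself is hypothetical):

```python
# Rough sketch: find the most recent `updated` value already stored
import sqlite_utils

db = sqlite_utils.Database('notes.db')
cutoff = None
if 'notes' in db.table_names():
    cutoff = next(db.query('select max(updated) as m from notes'))['m']
print(cutoff)  # None on a fresh database, otherwise the last-seen timestamp
```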
Could pull the max `updated` date and then keep on looping until it finds one modified before then. Problem is I don't actually know what order it iterates over the notes in.",611552758,apple-notes-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/4/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1616440856,I_kwDOJHON9s5gWO4Y,5,Configure full text search,9599,simonw,open,0,,,,,0,2023-03-09T05:20:46Z,2023-03-09T05:20:46Z,,MEMBER,,"FTS would be useful. Maybe even extract the plain text from the notes to make that index easier to create, rather than creating it against the HTML. Can use the `plaintext` property for that.",611552758,apple-notes-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/5/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1616354999,I_kwDOJHON9s5gV563,2,First working version,9599,simonw,closed,0,,,,,7,2023-03-09T03:53:00Z,2023-03-09T05:10:22Z,2023-03-09T05:10:22Z,MEMBER,,"It's going to shell out to `osascript` as seen in: - #1 I'm going with that option because https://appscript.sourceforge.io/status.html warns against the other potential methods: > Apple eliminated its Mac Automation department in 2016. The future of AppleScript and its related technologies is unclear. Caveat emptor. But `osascript` looks pretty stable to me.",611552758,apple-notes-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/2/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1616422013,I_kwDOJHON9s5gWKR9,3,`apple-notes-to-sqlite --dump` option,9599,simonw,closed,0,,,,,0,2023-03-09T05:05:49Z,2023-03-09T05:06:14Z,2023-03-09T05:06:14Z,MEMBER,,"Option that doesn't write to the database at all, it just outputs all the notes to stdout as newline-delimited JSON.",611552758,apple-notes-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/3/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1616347574,I_kwDOJHON9s5gV4G2,1,Initial proof of concept with ChatGPT,9599,simonw,closed,0,,,,,3,2023-03-09T03:44:39Z,2023-03-09T03:51:55Z,2023-03-09T03:51:55Z,MEMBER,,I'm using ChatGPT to figure out enough AppleScript to get at my notes data.,611552758,apple-notes-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/1/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1615862295,I_kwDOBm6k_c5gUBoX,2036,"`publish cloudrun` reuses image tags, which can lead to very surprising deploy problems",9599,simonw,closed,0,,,,,6,2023-03-08T20:11:44Z,2023-03-08T20:57:34Z,2023-03-08T20:57:34Z,OWNER,,"See this issue: - https://github.com/simonw/datasette.io/issues/141",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2036/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1590183272,I_kwDOBm6k_c5eyEVo,2027,"How to redirect from ""/"" to a specific 
db/table",1350673,dmick,open,0,,,,,4,2023-02-18T03:14:01Z,2023-03-08T04:42:22Z,,NONE,,"Using nginx to redirect public IP to the local uvicorn server as 'normal'. I can't figure out how to redirect such that '/' results in accessing the one db/table I want to serve; redirecting / to /db/table breaks some of the CSS; fooling with base_url doesn't seem to help. Can someone explain this, if it's possible?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2027/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1612296210,I_kwDOBm6k_c5gGbAS,2033,`datasette install -r requirements.txt`,9599,simonw,closed,0,,,,,2,2023-03-06T22:17:17Z,2023-03-06T22:54:52Z,2023-03-06T22:27:34Z,OWNER,,"Would be useful for cases where you want to install a whole set of plugins in one go, e.g. when running tutorials in GitHub Codespaces.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2033/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1605959201,I_kwDOBm6k_c5fuP4h,2032,datasette errors when foreign key integrity is enabled,193185,cldellow,open,0,,,,,0,2023-03-02T01:27:51Z,2023-03-02T01:31:58Z,,CONTRIBUTOR,,"By default, [SQLite does not enforce foreign key constraints](https://www.sqlite.org/foreignkeys.html#fk_enable). I typically enable these checks by running: ```sql PRAGMA foreign_keys = ON; ``` inside of a `prepare_connection` hook. If a plugin causes the schema to change (e.g. datasette-scraper creating a new table, or datasette-edit-schema changing a column), then https://github.com/simonw/datasette/blob/0b4a28691468b5c758df74fa1d72a823813c96bf/datasette/utils/internal_db.py#L71-L77 will fail with: ``` FOREIGN KEY constraint failed ``` This could be resolved by either: - deleting from the `tables` column last - changing the schema so that the foreign keys have [ON DELETE CASCADE](https://www.sqlite.org/foreignkeys.html#fk_actions) Let me know if you'd be open to a PR that addresses this -- since foreign key constraints aren't enabled by default, I guess it's questionable whether this is a bug. I think I can work around this by inspecting the database parameter in `prepare_connection` and trying not to enable fkey checks on the `_internal` database.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2032/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1594383280,I_kwDOBm6k_c5fCFuw,2030,How to use Datasette with apache webserver on GCP?,19700859,gk7279,closed,0,,,,,2,2023-02-22T03:08:49Z,2023-02-22T21:54:39Z,2023-02-22T21:54:39Z,NONE,,"Hi Simon and Datasette team- I have installed apache2 webserver inside GCP VM using apt. I can see my ""Hello World"" index.html if I use the external IP of this GCP in a browser. However, when I try to run datasette with different combinations of -h and -p, I am still unable to access the webpage. I cannot install Docker on this VM. Any pointers to use datasette with the already existing apache2 webserver on GCP are appreciated.
Thanks.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2030/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 828858421,MDU6SXNzdWU4Mjg4NTg0MjE=,1258,Allow canned query params to specify default values,1385831,wdccdw,open,0,,,,,5,2021-03-11T07:19:02Z,2023-02-20T23:39:58Z,,NONE,,"If I call a canned query that includes named parameters, without passing any parameters, datasette runs the query anyway, resulting in an HTTP status code 400, and a visible error in the browser, with only a link back to home. This means that one of the default links on https://site/database/ will lead to a broken page with no apparent way out. ![image](https://user-images.githubusercontent.com/1385831/110748683-13e72300-820e-11eb-855c-32e03dfef5bf.png) Is there any way to skip performing the query when parameters aren't supplied, but otherwise render the usual canned query page? Alternatively, can I supply default values for my parameters, either when defining my canned queries or when linking to the canned query page from the default database template?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1258/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1592327343,I_kwDOBm6k_c5e6Pyv,2029,"Sorry Simon, didn't know how else to contact you",5804626,llchristopherson,open,0,,,,,0,2023-02-20T19:02:53Z,2023-02-20T19:02:53Z,,NONE,,"Hi Simon, Would you be willing to chat with me about Datasette? I have some questions. I am working on a project to evaluate data ingestion tools for a research organization and I ran across Datasette. I have looked through a lot of your documentation, but still have some questions, which are very specific. If you would be willing to write me back about this, my email is laura@renci.org. Thanks, Laura",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2029/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1572766460,I_kwDOCGYnMM5dvoL8,524,Transformation type `--type DATETIME`,21095447,4l1fe,closed,0,,,,,15,2023-02-06T15:18:42Z,2023-02-15T12:10:54Z,2023-02-15T12:10:54Z,NONE,,"Hey. Currently I do transformations with the type `--type TEXT`, but I noticed, using the SQLAlchemy-based library [dataset](https://github.com/pudo/dataset), that reading and writing differ depending on the column types `TEXT` and `DATETIME`.
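For example, the difference shows up like this (a sketch relying on dataset's type guessing, not my real code):

```python
# Sketch: dataset guesses a DATETIME column from a datetime value and
# reads it back as a datetime object, not the plain string a TEXT column gives
import datetime
import dataset

db = dataset.connect('sqlite:///demo.db')
db['events'].insert({'created': datetime.datetime.utcnow()})
row = next(iter(db['events']))
print(type(row['created']))
```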
Is it possible to alter a column type to `DATETIME` somehow using sqlite-utils?",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/524/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1323346408,I_kwDOBm6k_c5O4Kno,1775,i18n support,428820,johnfelipe,open,0,,,,,9,2022-07-31T02:51:04Z,2023-02-10T18:04:40Z,,NONE,,"I want to contribute translations of the UI to es, de and it if you share the strings",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1775/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1579973223,I_kwDOBm6k_c5eLHpn,2024,Mention WAL mode in documentation,9599,simonw,open,0,,,,,1,2023-02-10T16:11:10Z,2023-02-10T16:11:53Z,,OWNER,,"It's not currently obvious from the docs how you can ensure that Datasette runs well in situations where other processes may update the underlying SQLite files.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2024/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1579695809,I_kwDOBm6k_c5eKD7B,2023,Error: Invalid setting 'hash_urls' in settings.json in 0.64.1,80409402,mlaparie,closed,0,,,,,2,2023-02-10T13:35:01Z,2023-02-10T15:40:00Z,2023-02-10T15:39:59Z,NONE,,"On a Debian machine, using datasette 0.64.1 installed with `pip3`, I am getting a `datasette[114272]: Error: Invalid setting 'hash_urls' in settings.json` in `journalctl -xe`. The same settings work on 0.54.1 on another Debian server. This is my `settings.json`: ```json { ""default_page_size"": 200, ""max_returned_rows"": 8000, ""num_sql_threads"": 3, ""sql_time_limit_ms"": 1000, ""default_facet_size"": 30, ""facet_time_limit_ms"": 200, ""facet_suggest_time_limit_ms"": 50, ""hash_urls"": false, ""allow_facet"": true, ""allow_download"": true, ""suggest_facets"": true, ""default_cache_ttl"": 5, ""default_cache_ttl_hashed"": 31536000, ""cache_size_kb"": 0, ""allow_csv_stream"": true, ""max_csv_mb"": 100, ""truncate_cells_html"": 2048, ""force_https_urls"": false, ""template_debug"": false, ""base_url"": ""/pclim/db/"" } ``` This looks ok to me. Would you have any ideas?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2023/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1578609658,I_kwDOBm6k_c5eF6v6,2022,Error 500 - not clear the cause,1667631,DavidPratten,closed,0,,,,,1,2023-02-09T20:57:17Z,2023-02-09T21:13:50Z,2023-02-09T21:13:50Z,NONE,,"On the database that I have sent via LinkedIn, datasette works great, but the following URL gives a 500 error. http://127.0.0.1:8001/literature/authors_papers?authorId=100550354 The cause of the error is not apparent. Is this expected behaviour?
David",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2022/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1577548579,I_kwDOBm6k_c5eB3sj,2021,Docker images for 1.0 alphas?,1563881,meowcat,open,0,,,,,0,2023-02-09T09:35:52Z,2023-02-09T09:35:52Z,,NONE,,"Hi, would you consider putting 1.0alpha images on Dockerhub? (Also, how usable are the alphas?)",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2021/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1575880841,I_kwDOBm6k_c5d7giJ,2020,"Documentation refers to ""off"" setting; doesn't seem to work, ""false"" does",1350673,dmick,open,0,,,,,0,2023-02-08T10:38:10Z,2023-02-08T10:38:10Z,,NONE,,"https://docs.datasette.io/en/stable/settings.html#suggest-facets, among others, suggests using ""off"" to disable the setting; however, this doesn't appear to work in the JSON config files, where it apparently needs to be a ""JSON boolean"" and have the values ""true"" or ""false"". Perhaps the Python code is more flexible?...but either way, the documentation probably should mention it.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2020/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1573424830,I_kwDOBm6k_c5dyI6-,2019,Refactor out the keyset pagination code,9599,simonw,open,0,,,,,14,2023-02-06T23:04:00Z,2023-02-08T01:40:46Z,,OWNER,,"While working on: - #1999 I noticed that some of the most complex code in the existing table view is the code that implements keyset pagination: https://github.com/simonw/datasette/blob/0b4a28691468b5c758df74fa1d72a823813c96bf/datasette/views/table.py#L417-L493 Extracting that into a utility function would simplify that code a lot.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2019/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1571711808,I_kwDOBm6k_c5drmtA,2018,`check_visibility` gives confusing (wrong?) results if permission is `None`,193185,cldellow,open,0,,,,,0,2023-02-06T01:03:08Z,2023-02-06T01:03:46Z,,CONTRIBUTOR,,"I'm trying to gate access to an edit UI on the user having `update-row` on the underlying view or table. I expected [datasette.check_visibility](https://docs.datasette.io/en/latest/internals.html#await-check-visibility-actor-action-none-resource-none-permissions-none) to be a good way to do this: ```python visible, private = await datasette.check_visibility( request.actor, permissions=[ (""update-row"", (database, table)), ], ) if not visible: return None ``` But `visible` is returning true, even when there is no explicit `update-row` permission. (In this case, `request.actor` is `None`.) Based on [the update-row permissions docs](https://docs.datasette.io/en/latest/authentication.html#update-row), I expected this to be default deny, and so no explicit permission would result in false. I think the root cause is that `check_visibility` calls `ensure_permissions` and expects it to throw if the permission is not available. 
But `ensure_permissions` does not throw when `permission_allowed` returns None: https://github.com/simonw/datasette/blob/1.0a2/datasette/app.py#L825-L829",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2018/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1571207083,I_kwDOBm6k_c5dprer,2016,Database metadata fields like description are not available in the index page template's context,9993,palewire,open,0,,,3268330,Datasette 1.0,1,2023-02-05T02:25:53Z,2023-02-05T22:56:43Z,,NONE,,"When looping through `databases` in the index.html template, I'd like to print the description of each database alongside its name. But it appears that isn't passed in from the view, unless I'm missing it. It would be great to have that.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2016/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1565179870,I_kwDOBm6k_c5dSr_e,2013,Datasette uses non-standard quoting for identifiers,193185,cldellow,open,0,,,,,0,2023-02-01T00:05:39Z,2023-02-01T00:06:30Z,,CONTRIBUTOR,,"Related to #2001, but where #2001 was about literals, this is about identifiers From https://www.sqlite.org/lang_keywords.html: > ""keyword"" A keyword in double-quotes is an identifier. > [keyword] A keyword enclosed in square brackets is an identifier. This is not standard SQL. This quoting mechanism is used by MS Access and SQL Server and is included in SQLite for compatibility. Datasette uses this quoting here -- https://github.com/simonw/datasette/blob/0b4a28691468b5c758df74fa1d72a823813c96bf/datasette/utils/__init__.py#L345-L349, in some of the other DB access code, and in some of the test fixtures. 
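All three accepted forms are easy to demonstrate with the standard library (a minimal sketch; only the double-quoted statement is standard SQL):

```python
import sqlite3

conn = sqlite3.connect(':memory:')
conn.execute('CREATE TABLE ""group"" (id INTEGER)')  # standard: double-quoted identifier
conn.execute('SELECT id FROM [group]')  # MS Access / SQL Server style
conn.execute('SELECT id FROM `group`')  # MySQL style, also accepted
```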
Migrating to standard double quote identifiers would make it easier to get Datasette working with alternative backends.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2013/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1564774831,I_kwDOBm6k_c5dRJGv,2012,Missing space in database summary,9599,simonw,open,0,,,,,0,2023-01-31T18:01:13Z,2023-01-31T18:01:13Z,,OWNER,,"Spotted this on an instance index page: ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2012/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1564769997,I_kwDOBm6k_c5dRH7N,2011,"Applied facet did not result in an ""x"" icon to dismiss it",9599,simonw,open,0,,,,,1,2023-01-31T17:57:44Z,2023-01-31T17:58:54Z,,OWNER,,"![CleanShot 2023-01-31 at 09 55 56@2x](https://user-images.githubusercontent.com/9599/215843684-1761a230-d490-4f87-be6d-186319366794.png) That's against this data https://data.sfgov.org/City-Management-and-Ethics/Supplier-Contracts/cqi5-hm2d imported using https://datasette.io/plugins/datasette-socrata It's for `Contract Type` of `Non-Purchasing Contract (Rents, etc.)` - so possible that some of the spaces or punctuation in either the name or the value tripped up the code that decides if the X icon should be displayed.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2011/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1563264257,I_kwDOBm6k_c5dLYUB,2010,Row page should default to card view,9599,simonw,open,0,,,3268330,Datasette 1.0,1,2023-01-30T21:49:37Z,2023-01-30T21:52:06Z,,OWNER,,"Datasette currently uses the same table layout on the row pages as it does on the table pages: https://datasette.io/content/pypi_packages?_sort=name&name__exact=datasette-column-inspect https://datasette.io/content/pypi_packages/datasette-column-inspect If you shrink down to mobile width you get this instead, on both of those pages: I think that view, which I think of as the ""card view"", is plain better if you're looking at just a single row - and it (or a variant of it) should be the default presentation on the row page.
",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2010/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1186696202,I_kwDOBm6k_c5Gu4wK,1696,Show foreign key label when filtering,9599,simonw,open,0,,,,,2,2022-03-30T16:18:54Z,2023-01-29T20:56:20Z,,OWNER,,"For example here: 3 corresponds to ""Human Related: Other"" - it would be neat to display this in this area of the page somehow.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1696/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1515815014,I_kwDOBm6k_c5aWYBm,1973,render_cell plugin hook's row object is not a sqlite.Row,193185,cldellow,open,0,,,,,4,2023-01-01T20:27:46Z,2023-01-29T00:40:31Z,,CONTRIBUTOR,,"From https://docs.datasette.io/en/stable/plugin_hooks.html#render-cell-row-value-column-table-database-datasette: > row - sqlite.Row > The SQLite row object that the value being rendered is part of This appears to actually be a [CustomRow](https://github.com/simonw/datasette/blob/f0fadc28ddb9f82e5cc1ecaa51e8a342eb6dc528/datasette/utils/__init__.py#L773-L789), but I think that's unrelated to my issue. I have a table: ```sql CREATE TABLE IF NOT EXISTS ""dss_job_stats""( job_id integer not null references dss_job(id) on delete cascade, host text not null, // other columns elided as irrelevant primary key (job_id, host) ); ``` On datasette 0.63.2, the `render_cell` hook receives a `row` value that looks like: ``` CustomRow([('job_id', {'value': 2, 'label': '2'}), ('host', 'cldellow.com')]) ``` I expected the `job_id` value to be `2`, but it's actually `{'value': 2, 'label': '2'}`. I can work around this, but was wondering if this was intended behaviour?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1973/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1560651350,I_kwDOCGYnMM5dBaZW,523,Feature request: trim all leading and trailing white space for all columns for all tables in a database,536941,fgregg,open,0,,,,,1,2023-01-28T02:40:10Z,2023-01-28T02:41:14Z,,CONTRIBUTOR,,"It's pretty common that I need to trim leading or trailing white space from lots of columns in a database as part of an initial ETL.
I use the following recipe a lot, and it would be great to include this functionality into sqlite-utils `trimify.sql` ```sql select 'select group_concat(''update [' || name || '] set ['' || name || ''] = trim(['' || name || ''])'', ''; '') || ''; '' as sql_to_run from pragma_table_info('''||name||''');' from sqlite_schema; ``` then something like: ```bash sqlite3 example.db < scripts/trimify.sql > table_trim.sql && \ sqlite3 $example.db < table_trim.sql > trim.sql && \ sqlite3 $example.db < trim.sql ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/523/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1558644003,I_kwDOBm6k_c5c5wUj,2006,Teach `datasette publish` to pin to `datasette<1.0` in a 0.x release,9599,simonw,open,0,,,3268330,Datasette 1.0,2,2023-01-26T19:17:40Z,2023-01-26T19:20:53Z,,OWNER,,"I just realized that when I ship Datasette 1.0 there may be automated deployments out there which could deploy the 1.0 version by accident, potentially breaking any customizations that aren't compatible with the 1.0 changes. I can hopefully help avoid that by shipping one last entry in the `0.x` series that ensures `datasette publish` pins to `<1.0` when it installs Datasette itself.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2006/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1557599877,I_kwDODFE5qs5c1xaF,12,location history changes,14809320,gerardrbentley,open,0,,,,,0,2023-01-26T03:57:25Z,2023-01-26T03:57:25Z,,NONE,,"not sure if each download is unique, but I had to change some things to work with the takeout zip I made 2023-01-25 filename changed from ""Location History.json"" to ""Records.json"" `""timestampMs""` is not present, `""timestamp""` is roughly iso timestamp ```py def get_timestamp_ms(raw_timestamp): try: return datetime.datetime.strptime(raw_timestamp, ""%Y-%m-%dT%H:%M:%SZ"").timestamp() except ValueError: return datetime.datetime.strptime(raw_timestamp, ""%Y-%m-%dT%H:%M:%S.%fZ"").timestamp() def save_location_history(db, zf): location_history = json.load( zf.open(""Takeout/Location History/Records.json"") ) db[""location_history""].upsert_all( ( { ""id"": id_for_location_history(row), ""latitude"": row[""latitudeE7""] / 1e7, ""longitude"": row[""longitudeE7""] / 1e7, ""accuracy"": row[""accuracy""], ""timestampMs"": get_timestamp_ms(row[""timestamp""]), ""when"": row[""timestamp""], } for row in location_history[""locations""] ), pk=""id"", ) def id_for_location_history(row): # We want an ID that is unique but can be sorted by in # date order - so we use the isoformat date + the first # 6 characters of a hash of the JSON first_six = hashlib.sha1( json.dumps(row, separators=("","", "":""), sort_keys=True).encode(""utf8"") ).hexdigest()[:6] return ""{}-{}"".format( row['timestamp'], first_six, ) ``` example locations from mine ```json { ""latitudeE7"": 427220206, ""longitudeE7"": -923423972, ""accuracy"": 10, ""deviceTag"": -1312429967, ""deviceDesignation"": ""PRIMARY"", ""timestamp"": ""2019-01-08T23:31:50.867Z"" } ``` ```json { ""latitudeE7"": 427011317, ""longitudeE7"": -923448300, ""accuracy"": 5, ""deviceTag"": -1312429967, ""deviceDesignation"": ""PRIMARY"", ""timestamp"": ""2019-01-08T23:33:53Z"" }, ```",206649770,google-takeout-to-sqlite,issue,,,"{""url"": 
""https://api.github.com/repos/dogsheep/google-takeout-to-sqlite/issues/12/reactions"", ""total_count"": 2, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 2}",, 1557507274,I_kwDOBm6k_c5c1azK,2005,`extra_template_vars` should be OK to return `None`,9599,simonw,open,0,,,,,1,2023-01-26T01:40:45Z,2023-01-26T01:41:50Z,,OWNER,,"Got this exception and had to make sure it always returned `{}`: ``` File "".../python3.11/site-packages/datasette/app.py"", line 1049, in render_template assert isinstance(extra_vars, dict), ""extra_vars is of type {}"".format( AssertionError: extra_vars is of type ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2005/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1553615704,I_kwDOBm6k_c5cmktY,2001,Datasette is not compatible with SQLite's strict quoting compilation option,406380,gwk,open,0,,,,,4,2023-01-23T19:10:07Z,2023-01-25T04:59:58Z,,NONE,,"I have linked Python3.11 on macOS against recent SQLite that was compiled using `-DSQLITE_DQS=0`. This option disables interpretation of double-quoted identifiers as string literals, described in the SQLite docs as a ""MySQL 3.x misfeature"". See https://www.sqlite.org/quirks.html#dblquote for background. Datasette uses the double-quote syntax in a number of key places, and is thus completely broken in this environment. My experience was to `pip install datasette`, then run `datasette serve -I my-data.db`. When I visit `http://127.0.0.1:8001` I get a 500 response. The error: `sqlite3.OperationalError: no such column: geometry_columns` The responsible SQL: `'select 1 from sqlite_master where tbl_name = ""geometry_columns""'` I then installed datasette from GitHub master in development mode and changed the offending SQL to use correct quotes: `""select 1 from sqlite_master where tbl_name = 'geometry_columns'""`. 
With this change, I get a little further, but have the same problem with the first table name in my database (in my case, ""Meta""): ``` OperationalError: no such column: Meta Traceback (most recent call last): File ""/Users/gwk/external/datasette/datasette/app.py"", line 1522, in route_path response = await view(request, send) ^^^^^^^^^^^^^^^^^^^^^^^^^ File ""/Users/gwk/external/datasette/datasette/views/base.py"", line 151, in view return await self.dispatch_request(request) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File ""/Users/gwk/external/datasette/datasette/views/base.py"", line 105, in dispatch_request response = await handler(request) ^^^^^^^^^^^^^^^^^^^^^^ File ""/Users/gwk/external/datasette/datasette/views/index.py"", line 70, in get ""fts_table"": await db.fts_table(table), ^^^^^^^^^^^^^^^^^^^^^^^^^ File ""/Users/gwk/external/datasette/datasette/database.py"", line 363, in fts_table return await self.execute_fn(lambda conn: detect_fts(conn, table)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File ""/Users/gwk/external/datasette/datasette/database.py"", line 213, in execute_fn return await asyncio.get_event_loop().run_in_executor( ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File ""/usr/local/py/Python.framework/Versions/3.11/lib/python3.11/concurrent/futures/thread.py"", line 58, in run result = self.fn(*self.args, **self.kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File ""/Users/gwk/external/datasette/datasette/database.py"", line 211, in in_thread return fn(conn) ^^^^^^^^ File ""/Users/gwk/external/datasette/datasette/database.py"", line 363, in return await self.execute_fn(lambda conn: detect_fts(conn, table)) ^^^^^^^^^^^^^^^^^^^^^^^ File ""/Users/gwk/external/datasette/datasette/utils/__init__.py"", line 588, in detect_fts rows = conn.execute(detect_fts_sql(table)).fetchall() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ sqlite3.OperationalError: no such column: Meta INFO: 127.0.0.1:50258 - ""GET / HTTP/1.1"" 500 Internal Server Error ``` I will try to continue playing with this, but I also hope that the datasette developers will enable this mode in a test environment as I am unlikely to be able to exercise all of the SQL in the codebase, or make a pull request very soon. Note that the DQS setting compile-time option can be overridden at runtime with calls to the C API: ``` sqlite3_db_config(db, SQLITE_DBCONFIG_DQS_DDL, 0, (void*)0); sqlite3_db_config(db, SQLITE_DBCONFIG_DQS_DML, 0, (void*)0); ``` As far as I can tell, `sqlite3_db_config` is not exposed in Python, but perhaps we could figure out how to invoke it using `ctypes`. 
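A rough sketch of what that might look like, assuming `ctypes` can locate the system SQLite library (the verb constants below are copied from `sqlite3.h`; note that the stdlib `sqlite3` module does not expose the underlying `sqlite3*` pointer, so this only demonstrates the call against a scratch handle opened through the C API, not against Datasette's own connections):

```py
import ctypes
import ctypes.util

# Verb constants from sqlite3.h
SQLITE_DBCONFIG_DQS_DML = 1013
SQLITE_DBCONFIG_DQS_DDL = 1014

# Load the shared library and open a scratch handle directly via the C API
lib = ctypes.CDLL(ctypes.util.find_library('sqlite3'))
db = ctypes.c_void_p()
lib.sqlite3_open(b':memory:', ctypes.byref(db))

# sqlite3_db_config is variadic; for the DQS verbs the trailing arguments
# are (int new_value, int *old_value_out) - pass None for the out-pointer
lib.sqlite3_db_config(db, SQLITE_DBCONFIG_DQS_DDL, 0, None)
lib.sqlite3_db_config(db, SQLITE_DBCONFIG_DQS_DML, 0, None)
lib.sqlite3_close(db)
```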
",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2001/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 743371103,MDU6SXNzdWU3NDMzNzExMDM=,1099,Support linking to compound foreign keys,9599,simonw,open,0,,,,,6,2020-11-15T23:23:17Z,2023-01-25T00:58:26Z,,OWNER,,Reported as a bug in #1098 because they caused 500 errors - but it would be even better if Datasette could hyperlink to related rows via compound foreign keys.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1099/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1554032168,I_kwDOBm6k_c5coKYo,2002,Document how actors are displayed,9599,simonw,open,0,,,,,0,2023-01-24T00:08:49Z,2023-01-24T00:08:49Z,,OWNER,,"https://github.com/simonw/datasette/blob/e4ebef082de90db4e1b8527abc0d582b7ae0bc9d/datasette/utils/__init__.py#L1052-L1056 This logic should be reflected in the documentation on https://docs.datasette.io/en/stable/authentication.html#actors",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2002/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1536851861,I_kwDOBm6k_c5bmn-V,1994,Stuck on loading screen,10913053,jackhagley,open,0,,,,,1,2023-01-17T18:33:49Z,2023-01-23T08:21:08Z,,NONE,,"Can’t actually open it! Downloaded today from the releases tab Running macOS13.1 ``` bin/python3.9 --version Python 3.9.6 Took 83ms bin/python3.9 --version Python 3.9.6 Took 113ms bin/pip install datasette>=0.59 datasette-app-support>=0.11.6 datasette-vega>=0.6.2 datasette-cluster-map>=0.17.1 datasette-pretty-json>=0.2.1 datasette-edit-schema>=0.4 datasette-configure-fts>=1.1 datasette-leaflet>=0.2.2 --disable-pip-version-check Requirement already satisfied: datasette>=0.59 in lib/python3.9/site-packages (0.63) Requirement already satisfied: datasette-app-support>=0.11.6 in lib/python3.9/site-packages (0.11.6) Requirement already satisfied: datasette-vega>=0.6.2 in lib/python3.9/site-packages (0.6.2) Requirement already satisfied: datasette-cluster-map>=0.17.1 in lib/python3.9/site-packages (0.17.2) Requirement already satisfied: datasette-pretty-json>=0.2.1 in lib/python3.9/site-packages (0.2.2) Requirement already satisfied: datasette-edit-schema>=0.4 in lib/python3.9/site-packages (0.5.1) Requirement already satisfied: datasette-configure-fts>=1.1 in lib/python3.9/site-packages (1.1) Requirement already satisfied: datasette-leaflet>=0.2.2 in lib/python3.9/site-packages (0.2.2) Requirement already satisfied: click>=7.1.1 in lib/python3.9/site-packages (from datasette>=0.59) (8.1.3) Requirement already satisfied: hupper>=1.9 in lib/python3.9/site-packages (from datasette>=0.59) (1.10.3) Requirement already satisfied: pint>=0.9 in lib/python3.9/site-packages (from datasette>=0.59) (0.20.1) Requirement already satisfied: PyYAML>=5.3 in lib/python3.9/site-packages (from datasette>=0.59) (6.0) Requirement already satisfied: httpx>=0.20 in lib/python3.9/site-packages (from datasette>=0.59) (0.23.0) Requirement already satisfied: aiofiles>=0.4 in lib/python3.9/site-packages (from datasette>=0.59) (22.1.0) Requirement already satisfied: asgi-csrf>=0.9 in lib/python3.9/site-packages (from datasette>=0.59) (0.9) Requirement already satisfied: 
asgiref>=3.2.10 in lib/python3.9/site-packages (from datasette>=0.59) (3.5.2) Requirement already satisfied: uvicorn>=0.11 in lib/python3.9/site-packages (from datasette>=0.59) (0.19.0) Requirement already satisfied: itsdangerous>=1.1 in lib/python3.9/site-packages (from datasette>=0.59) (2.1.2) Requirement already satisfied: click-default-group-wheel>=1.2.2 in lib/python3.9/site-packages (from datasette>=0.59) (1.2.2) Requirement already satisfied: janus>=0.6.2 in lib/python3.9/site-packages (from datasette>=0.59) (1.0.0) Requirement already satisfied: pluggy>=1.0 in lib/python3.9/site-packages (from datasette>=0.59) (1.0.0) Requirement already satisfied: Jinja2>=2.10.3 in lib/python3.9/site-packages (from datasette>=0.59) (3.1.2) Requirement already satisfied: mergedeep>=1.1.1 in lib/python3.9/site-packages (from datasette>=0.59) (1.3.4) Requirement already satisfied: sqlite-utils in lib/python3.9/site-packages (from datasette-app-support>=0.11.6) (3.30) Requirement already satisfied: packaging in lib/python3.9/site-packages (from datasette-app-support>=0.11.6) (21.3) Requirement already satisfied: python-multipart in lib/python3.9/site-packages (from asgi-csrf>=0.9->datasette>=0.59) (0.0.5) Requirement already satisfied: httpcore<0.16.0,>=0.15.0 in lib/python3.9/site-packages (from httpx>=0.20->datasette>=0.59) (0.15.0) Requirement already satisfied: certifi in lib/python3.9/site-packages (from httpx>=0.20->datasette>=0.59) (2022.9.24) Requirement already satisfied: rfc3986[idna2008]<2,>=1.3 in lib/python3.9/site-packages (from httpx>=0.20->datasette>=0.59) (1.5.0) Requirement already satisfied: sniffio in lib/python3.9/site-packages (from httpx>=0.20->datasette>=0.59) (1.3.0) Requirement already satisfied: h11<0.13,>=0.11 in lib/python3.9/site-packages (from httpcore<0.16.0,>=0.15.0->httpx>=0.20->datasette>=0.59) (0.12.0) Requirement already satisfied: anyio==3.* in lib/python3.9/site-packages (from httpcore<0.16.0,>=0.15.0->httpx>=0.20->datasette>=0.59) (3.6.2) Requirement already satisfied: idna>=2.8 in lib/python3.9/site-packages (from anyio==3.*->httpcore<0.16.0,>=0.15.0->httpx>=0.20->datasette>=0.59) (3.4) Requirement already satisfied: typing-extensions>=3.7.4.3 in lib/python3.9/site-packages (from janus>=0.6.2->datasette>=0.59) (4.4.0) Requirement already satisfied: MarkupSafe>=2.0 in lib/python3.9/site-packages (from Jinja2>=2.10.3->datasette>=0.59) (2.1.1) Requirement already satisfied: tabulate in lib/python3.9/site-packages (from sqlite-utils->datasette-app-support>=0.11.6) (0.9.0) Requirement already satisfied: python-dateutil in lib/python3.9/site-packages (from sqlite-utils->datasette-app-support>=0.11.6) (2.8.2) Requirement already satisfied: sqlite-fts4 in lib/python3.9/site-packages (from sqlite-utils->datasette-app-support>=0.11.6) (1.0.3) Requirement already satisfied: pyparsing!=3.0.5,>=2.0.2 in lib/python3.9/site-packages (from packaging->datasette-app-support>=0.11.6) (3.0.9) Requirement already satisfied: six>=1.5 in lib/python3.9/site-packages (from python-dateutil->sqlite-utils->datasette-app-support>=0.11.6) (1.16.0) Took 784ms ``` STUCK",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1994/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1552368054,I_kwDOBm6k_c5ch0G2,2000,rewrite_sql hook,193185,cldellow,open,0,,,,,1,2023-01-23T01:02:52Z,2023-01-23T06:08:01Z,,CONTRIBUTOR,,"I'm not sold that this is a good idea, but 
thought it'd be worth writing up a ticket. Proposal: add a hook like ```python def rewrite_sql(datasette, database, request, fn, sql, params) ``` It would be called from Database.execute, Database.execute_write, Database.execute_write_script, Database.execute_write_many before running the user's SQL. `fn` would indicate which method was being used, in case that's relevant for the SQL inspection -- for example `execute` only permits a single statement. The hook could return a SQL statement to be executed instead, or an async function that, when awaited, returns the SQL to be executed. Plugins that could be written with this hook: - https://github.com/cldellow/datasette-ersatz-table-valued-functions would use this to avoid monkey-patching - a plugin to inspect and reject unsafe Spatialite function calls (reported by [Simon in Discord](https://discord.com/channels/823971286308356157/823971286941302908/1066438832293159004)) - a plugin to do more general rewrites of queries to enforce table or row-level security, for example, based on the currently logged in actor's ID - a plugin to maintain audit tables when users write to a table - a plugin to cache expensive queries (eg the queries that drive facets) - these could allow stale reads if previously cached, then refresh them in an offline queue Flaws with this idea: `execute_fn` and `execute_write_fn` would not go through this hook, which limits the guarantees you can make about it for security purposes.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2000/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 749283032,MDU6SXNzdWU3NDkyODMwMzI=,1101,register_output_renderer() should support streaming data,9599,simonw,open,0,,,3268330,Datasette 1.0,13,2020-11-24T02:17:09Z,2023-01-21T22:07:19Z,,OWNER,,"> I'd like to implement this by first extending the `register_output_renderer()` hook to support streaming huge responses, then switching CSV to use the plugin hook in addition to TSV using it. 
_Originally posted by @simonw in https://github.com/simonw/datasette/issues/1096#issuecomment-732542285_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1101/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1551113681,I_kwDOBm6k_c5cdB3R,1998,`datasette --version` should also show the SQLite version,9599,simonw,open,0,,,,,2,2023-01-20T16:11:30Z,2023-01-20T18:19:06Z,,OWNER,,Idea came up here: https://discord.com/channels/823971286308356157/823971286941302908/1066026473003159783,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1998/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1550536442,I_kwDOCGYnMM5ca076,521,Custom JSON encoder,31504,janrito,open,0,,,,,0,2023-01-20T09:19:40Z,2023-01-20T09:19:40Z,,NONE,,"It would be nice if we could specify a custom encoder (and decoder) for types that will need extra deserialisation – e.g., sets, enums or sparse matrices – or even project-specific types",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/521/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1533673397,I_kwDOBm6k_c5baf-1,1991,fts5 tables are not auto-detected and hidden,83819,keturn,open,0,,,,,0,2023-01-15T06:00:42Z,2023-01-20T04:54:24Z,,NONE,,"I set up a [Datasette instance](https://huggingface.co/spaces/Sygil/INE-dataset-explorer/tree/main) and was following the docs on full-text search. When I used fts4, datasette automatically hid the FTS tables and added the FTS search box where appropriate, but when I changed to fts5 it no longer does either. If I [manually set](https://huggingface.co/spaces/keturn/INED-datasette/blob/main/metadata.json#L9) `fts_table` for a view, then search does work as expected. My table and view creation code looks like this: ```py connection.execute(""""""CREATE TABLE IF NOT EXISTS captions(image_key text PRIMARY KEY, caption text NOT NULL) """""")   connection.execute(""""""CREATE VIRTUAL TABLE captions_fts USING fts5(caption, image_key UNINDEXED, content=captions) """""") ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1991/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1034535001,I_kwDOBm6k_c49qcBZ,1497,"Publish to Docker Hub failing with ""libcrypt.so.1: cannot open shared object file""",9599,simonw,closed,0,,,,,18,2021-10-24T22:57:07Z,2023-01-18T17:13:45Z,2021-10-24T23:36:55Z,OWNER,,"This means the Datasette 0.59.1 release has not been published to Docker Hub. Here's where that failed: https://github.com/simonw/datasette/runs/3991043374?check_suite_focus=true ``` Preparing to unpack .../libc6_2.32-4_amd64.deb ... debconf: unable to initialize frontend: Dialog debconf: (TERM is not set, so the dialog frontend is not usable.) 
debconf: falling back to frontend: Readline debconf: unable to initialize frontend: Readline debconf: (Can't locate Term/ReadLine.pm in @INC (you may need to install the Term::ReadLine module) (@INC contains: /etc/perl /usr/local/lib/x86_64-linux-gnu/perl/5.28.1 /usr/local/share/perl/5.28.1 /usr/lib/x86_64-linux-gnu/perl5/5.28 /usr/share/perl5 /usr/lib/x86_64-linux-gnu/perl/5.28 /usr/share/perl/5.28 /usr/local/lib/site_perl /usr/lib/x86_64-linux-gnu/perl-base) at /usr/share/perl5/Debconf/FrontEnd/Readline.pm line 7.) debconf: falling back to frontend: Teletype Checking for services that may need to be restarted... Checking init scripts... Unpacking libc6:amd64 (2.32-4) over (2.28-10) ... Setting up libc6:amd64 (2.32-4) ... /usr/bin/perl: error while loading shared libraries: libcrypt.so.1: cannot open shared object file: No such file or directory dpkg: error processing package libc6:amd64 (--configure): installed libc6:amd64 package post-installation script subprocess returned error exit status 127 Errors were encountered while processing: libc6:amd64 E: Sub-process /usr/bin/dpkg returned an error code (1) The command '/bin/sh -c apt-get update && apt-get -y --no-install-recommends install software-properties-common && add-apt-repository ""deb http://httpredir.debian.org/debian sid main"" && apt-get update && apt-get -t sid install -y --no-install-recommends libsqlite3-mod-spatialite && apt-get remove -y software-properties-common && apt clean && rm -rf /var/lib/apt && rm -rf /var/lib/dpkg/info/*' returned a non-zero code: 100 ``` Same problem when I attempted to publish using the ""Push specific Docker tag"" workflow: https://github.com/simonw/datasette/runs/3991059912?check_suite_focus=true",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1497/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1538197093,I_kwDOBm6k_c5brwZl,1995,foreign_keys error 500,137183,jonschoning,open,0,,,,,0,2023-01-18T15:27:36Z,2023-01-18T16:44:01Z,,NONE,,"**Error 500 expected string or bytes-like object** [espial-new.sqlite3.zip](https://github.com/simonw/datasette/files/10447965/espial-new.sqlite3.zip) run `datasette espial-new.sqlite3` & click on any table other than `User` ``` /home/jon/.local/lib/python3.10/site-packages/datasette/app.py:814 in │ │ expand_foreign_keys │ │ │ │ 811 │ │ │ from {other_table} │ │ 812 │ │ │ where {other_column} in ({placeholders}) │ │ 813 │ │ """""".format( │ │ ❱ 814 │ │ │ other_column=escape_sqlite(fk[""other_column""]), │ │ 815 │ │ │ label_column=escape_sqlite(label_column), │ │ 816 │ │ │ other_table=escape_sqlite(fk[""other_table""]), │ │ 817 │ │ │ placeholders="", "".join([""?""] * len(set(values))), │ │ │ │ ╭───────────────────────────── locals ──────────────────────────────╮ │ │ │ column = 'user_id' │ │ │ │ database = 'espial-new' │ │ │ │ db = │ │ │ │ fk = { │ │ │ │ │ 'column': 'user_id', │ │ │ │ │ 'other_table': 'user', │ │ │ │ │ 'other_column': None │ │ │ │ } │ │ │ │ foreign_keys = [ │ │ │ │ │ { │ │ │ │ │ │ 'column': 'user_id', │ │ │ │ │ │ 'other_table': 'user', │ │ │ │ │ │ 'other_column': None │ │ │ │ │ } │ │ │ │ ] │ │ │ │ label_column = 'name' │ │ │ │ labeled_fks = {} │ │ │ │ self = │ │ │ │ table = 'bookmark' │ │ │ │ values = [] │ │ │ ╰───────────────────────────────────────────────────────────────────╯ │ │ │ │ /home/jon/.local/lib/python3.10/site-packages/datasette/utils/__init__.py:346 │ │ in escape_sqlite │ │ │ │ 343 │ │ 
344 │ │ 345 def escape_sqlite(s): │ │ ❱ 346 │ if _boring_keyword_re.match(s) and (s.lower() not in reserved_words) │ │ 347 │ │ return s │ │ 348 │ else: │ │ 349 │ │ return f""[{s}]"" │ │ │ │ ╭─ locals ─╮ │ │ │ s = None │ │ │ ╰──────────╯ │ ╰─────────────────────────────────────────────────────────────────────────────────╯ TypeError: expected string or bytes-like object Traceback (most recent call last): File ""/home/jon/.local/lib/python3.10/site-packages/datasette/app.py"", line 1354, in route_path response = await view(request, send) File ""/home/jon/.local/lib/python3.10/site-packages/datasette/views/base.py"", line 134, in view return await self.dispatch_request(request) File ""/home/jon/.local/lib/python3.10/site-packages/datasette/views/base.py"", line 91, in dispatch_request return await handler(request) File ""/home/jon/.local/lib/python3.10/site-packages/datasette/views/base.py"", line 361, in get response_or_template_contexts = await self.data(request, **data_kwargs) File ""/home/jon/.local/lib/python3.10/site-packages/datasette/views/table.py"", line 158, in data return await self._data_traced(request, default_labels, _next, _size) File ""/home/jon/.local/lib/python3.10/site-packages/datasette/views/table.py"", line 603, in _data_traced await self.ds.expand_foreign_keys( File ""/home/jon/.local/lib/python3.10/site-packages/datasette/app.py"", line 814, in expand_foreign_keys other_column=escape_sqlite(fk[""other_column""]), File ""/home/jon/.local/lib/python3.10/site-packages/datasette/utils/__init__.py"", line 346, in escape_sqlite if _boring_keyword_re.match(s) and (s.lower() not in reserved_words): TypeError: expected string or bytes-like object INFO: 127.0.0.1:38574 - ""GET /espial-new/bookmark HTTP/1.1"" 500 Internal Server Error INFO: 127.0.0.1:38574 - ""GET /-/static/app.css?d59929 HTTP/1.1"" 200 OK ``` Schema: ``` CREATE TABLE IF NOT EXISTS ""user"" ( ""id"" INTEGER PRIMARY KEY, ""name"" VARCHAR NOT NULL, ""password_hash"" VARCHAR NOT NULL, ""api_token"" VARCHAR NULL, ""private_default"" BOOLEAN NOT NULL, ""archive_default"" BOOLEAN NOT NULL, ""privacy_lock"" BOOLEAN NOT NULL, CONSTRAINT ""unique_user_name"" UNIQUE (""name"") ); CREATE TABLE IF NOT EXISTS ""bookmark"" ( ""id"" INTEGER PRIMARY KEY, ""user_id"" INTEGER NOT NULL REFERENCES ""user"" ON DELETE RESTRICT ON UPDATE RESTRICT, ""slug"" VARCHAR NOT NULL DEFAULT (Lower(Hex(Randomblob(6)))), ""href"" VARCHAR NOT NULL, ""description"" VARCHAR NOT NULL, ""extended"" VARCHAR NOT NULL, ""time"" TIMESTAMP NOT NULL, ""shared"" BOOLEAN NOT NULL, ""to_read"" BOOLEAN NOT NULL, ""selected"" BOOLEAN NOT NULL, ""archive_href"" VARCHAR NULL, CONSTRAINT ""unique_user_href"" UNIQUE (""user_id"", ""href""), CONSTRAINT ""unique_user_slug"" UNIQUE (""user_id"", ""slug"") ); CREATE TABLE IF NOT EXISTS ""bookmark_tag"" ( ""id"" INTEGER PRIMARY KEY, ""user_id"" INTEGER NOT NULL REFERENCES ""user"" ON DELETE RESTRICT ON UPDATE RESTRICT, ""tag"" VARCHAR NOT NULL, ""bookmark_id"" INTEGER NOT NULL REFERENCES ""bookmark"" ON DELETE RESTRICT ON UPDATE RESTRICT, ""seq"" INTEGER NOT NULL, CONSTRAINT ""unique_user_tag_bookmark_id"" UNIQUE (""user_id"", ""tag"", ""bookmark_id""), CONSTRAINT ""unique_user_bookmark_id_tag_seq"" UNIQUE (""user_id"", ""bookmark_id"", ""tag"", ""seq"") ); CREATE TABLE IF NOT EXISTS ""note"" ( ""id"" INTEGER PRIMARY KEY, ""user_id"" INTEGER NOT NULL REFERENCES ""user"" ON DELETE RESTRICT ON UPDATE RESTRICT, ""slug"" VARCHAR NOT NULL DEFAULT (Lower(Hex(Randomblob(10)))), ""length"" INTEGER NOT NULL, ""title"" VARCHAR NOT 
NULL, ""text"" VARCHAR NOT NULL, ""is_markdown"" BOOLEAN NOT NULL, ""shared"" BOOLEAN NOT NULL DEFAULT false, ""created"" TIMESTAMP NOT NULL, ""updated"" TIMESTAMP NOT NULL ); CREATE INDEX idx_bookmark_time ON bookmark (user_id, time DESC); CREATE INDEX idx_bookmark_tag_bookmark_id ON bookmark_tag (bookmark_id, id, tag, seq); CREATE INDEX idx_note_user_created ON note (user_id, created DESC); ``` ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1995/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1532000914,I_kwDOBm6k_c5bUHqS,1990,Suggestion: Highlight error messages ('These facets timed out'),116795,pax,open,0,,,,,0,2023-01-13T09:40:58Z,2023-01-13T09:40:58Z,,NONE,,"I had trouble figuring out why faceting didn't work in some instances, it took a while before I noticed the _These facets timed out_ notice. It might help if that would be highlighted, or fading out highlight - if one might think it would be too visually disturbing.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1990/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1529707837,I_kwDOBm6k_c5bLX09,1988,Reconsider pattern where plugins could break existing template context,9599,simonw,open,0,,,3268330,Datasette 1.0,4,2023-01-11T21:13:43Z,2023-01-11T21:25:05Z,,OWNER,,"> I hadn't run into an issue with plugins like `datasette-template-sql` interfering with the existing context for other features before! Definitely not a good thing. _Originally posted by @simonw in https://github.com/simonw/datasette-write/issues/6#issuecomment-1379490596_ ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1988/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1529452371,I_kwDOBm6k_c5bKZdT,1987,installpython3.com is now a spam website,9599,simonw,closed,0,,,,,4,2023-01-11T17:55:12Z,2023-01-11T18:29:26Z,2023-01-11T18:29:25Z,OWNER,,"Need to stop linking to it from the docs. 
I'll link to https://www.python.org/about/gettingstarted/ instead.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1987/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1528448642,I_kwDOBm6k_c5bGkaC,1985,Don't let Datasette(path) without a list cause weird errors,9599,simonw,closed,0,,,,,1,2023-01-11T05:17:44Z,2023-01-11T18:25:04Z,2023-01-11T18:25:04Z,OWNER,,"I got a confusing `sqlite3.OperationalError: disk I/O error` error in my tests, it turned out it was because this: ```python ds = Datasette(path) ``` Should have been this: ```python ds = Datasette([path]) ``` _Originally posted by @simonw in https://github.com/simonw/datasette-faiss/issues/1#issuecomment-1378252673_ ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1985/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1515185383,I_kwDOBm6k_c5aT-Tn,1971,Upgrade for Sphinx 6.0 (once Furo has support for it),9599,simonw,closed,0,,,,,3,2022-12-31T19:04:35Z,2023-01-10T02:02:34Z,2023-01-10T02:02:34Z,OWNER,,"A deployment of #1967 to ReadTheDocs just failed like this: https://readthedocs.org/projects/datasette/builds/19045460/ ``` Running Sphinx v6.0.0 making output directory... done building [mo]: targets for 0 po files that are out of date building [html]: targets for 28 source files that are out of date updating environment: [new config] 28 added, 0 changed, 0 removed reading sources... [ 3%] authentication reading sources... [ 7%] binary_data reading sources... [ 10%] changelog Traceback (most recent call last): File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/statemachine.py"", line 299, in next_line self.line = self.input_lines[self.line_offset] File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/statemachine.py"", line 1136, in __getitem__ return self.data[i] IndexError: list index out of range During handling of the above exception, another exception occurred: Traceback (most recent call last): File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/statemachine.py"", line 226, in run self.next_line() File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/statemachine.py"", line 302, in next_line raise EOFError EOFError During handling of the above exception, another exception occurred: Traceback (most recent call last): File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/sphinx/cmd/build.py"", line 281, in build_main app.build(args.force_all, args.filenames) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/sphinx/application.py"", line 344, in build self.builder.build_update() File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/sphinx/builders/__init__.py"", line 310, in build_update self.build(to_build, File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/sphinx/builders/__init__.py"", line 326, in build updated_docnames = set(self.read()) File 
""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/sphinx/builders/__init__.py"", line 433, in read self._read_serial(docnames) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/sphinx/builders/__init__.py"", line 454, in _read_serial self.read_doc(docname) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/sphinx/builders/__init__.py"", line 510, in read_doc publisher.publish() File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/core.py"", line 224, in publish self.document = self.reader.read(self.source, self.parser, File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/sphinx/io.py"", line 103, in read self.parse() File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/readers/__init__.py"", line 76, in parse self.parser.parse(self.input, document) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/sphinx/parsers.py"", line 78, in parse self.statemachine.run(inputlines, document, inliner=self.inliner) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 169, in run results = StateMachineWS.run(self, input_lines, input_offset, File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/statemachine.py"", line 233, in run context, next_state, result = self.check_line( File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/statemachine.py"", line 445, in check_line return method(match, context, next_state) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 3024, in text self.section(title.lstrip(), source, style, lineno + 1, messages) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 325, in section self.new_subsection(title, lineno, messages) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 391, in new_subsection newabsoffset = self.nested_parse( File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 279, in nested_parse state_machine.run(block, input_offset, memo=self.memo, File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 195, in run results = StateMachineWS.run(self, input_lines, input_offset) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/statemachine.py"", line 233, in run context, next_state, result = self.check_line( File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/statemachine.py"", line 445, in check_line return method(match, context, next_state) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 2785, in 
underline self.section(title, source, style, lineno - 1, messages) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 325, in section self.new_subsection(title, lineno, messages) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 391, in new_subsection newabsoffset = self.nested_parse( File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 279, in nested_parse state_machine.run(block, input_offset, memo=self.memo, File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 195, in run results = StateMachineWS.run(self, input_lines, input_offset) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/statemachine.py"", line 233, in run context, next_state, result = self.check_line( File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/statemachine.py"", line 445, in check_line return method(match, context, next_state) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 1273, in bullet i, blank_finish = self.list_item(match.end()) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 1295, in list_item self.nested_parse(indented, input_offset=line_offset, File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 279, in nested_parse state_machine.run(block, input_offset, memo=self.memo, File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 195, in run results = StateMachineWS.run(self, input_lines, input_offset) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/statemachine.py"", line 239, in run result = state.eof(context) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 2725, in eof self.blank(None, context, None) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 2716, in blank paragraph, literalnext = self.paragraph( File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 416, in paragraph textnodes, messages = self.inline_text(text, lineno) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 425, in inline_text nodes, messages = self.inliner.parse(text, lineno, File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 649, in parse before, inlines, remaining, sysmessages = method(self, match, File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 792, in 
interpreted_or_phrase_ref nodelist, messages = self.interpreted(rawsource, escaped, role, File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 889, in interpreted nodes, messages2 = role_fn(role, rawsource, text, lineno, self) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/sphinx/ext/extlinks.py"", line 101, in role title = caption % part TypeError: not all arguments converted during string formatting Exception occurred: File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/sphinx/ext/extlinks.py"", line 101, in role title = caption % part TypeError: not all arguments converted during string formatting The full traceback has been saved in /tmp/sphinx-err-kq7ylgqo.log, if you want to report the issue to the developers. Please also report this if it was a user error, so that a better error message can be provided next time. A bug report can be filed in the tracker at . Thanks! ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1971/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1509783085,I_kwDOBm6k_c5Z_XYt,1969,sql-formatter javascript is not now working with CloudFlare rocketloader,536941,fgregg,open,0,,,,,0,2022-12-23T21:14:06Z,2023-01-10T01:56:33Z,,CONTRIBUTOR,,"This is probably not a bug with datasette, but I thought you might want to know, @simonw. I noticed today that my CloudFlare proxied datasette instance lost the ""Format SQL"" option. I'm pretty sure it was there last week. In the CloudFlare settings, if I turn off [Rocket Loader](https://developers.cloudflare.com/fundamentals/speed/rocket-loader/), I get the ""Format SQL"" option back. Rocket Loader works by asynchronously loading the javascript, so maybe there was a recent change that doesn't play well with the asynch loading? I'm up to date with https://github.com/simonw/datasette/commit/e03aed00026cc2e59c09ca41f69a247e1a85cc89",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1969/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1525815985,I_kwDOBm6k_c5a8hqx,1983,Make CustomJSONEncoder a documented public API,9599,simonw,open,0,,,,,3,2023-01-09T15:27:05Z,2023-01-09T15:35:58Z,,OWNER,,It's used by `datasette-geojson` here: https://github.com/eyeseast/datasette-geojson/commit/902bf135a5a33a0dc8264673d00a59a67cb05152,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1983/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1426080014,I_kwDOBm6k_c5VAEEO,1867,/db/table/-/rename API (also allows atomic replace),9599,simonw,open,0,,,8755003,Datasette 1.0a-next,1,2022-10-27T18:13:23Z,2023-01-09T15:34:12Z,,OWNER,,"> There's one catch with batched inserts: if your CLI tool fails half way through you could end up with a partially populated table - since a bunch of batches will have succeeded first. > > ... 
> > If people care about that kind of thing they could always push all of their inserts to a table called `_tablename` and then atomically rename that once they've uploaded all of the data (assuming I provide an atomic-rename-this-table mechanism). _Originally posted by @simonw in https://github.com/simonw/datasette/issues/1866#issuecomment-1293893789_ ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1867/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 710650633,MDU6SXNzdWU3MTA2NTA2MzM=,979,Default table view JSON should include CREATE TABLE,9599,simonw,closed,0,,,,,3,2020-09-28T23:54:58Z,2023-01-09T15:32:39Z,2023-01-09T15:32:22Z,OWNER,,"https://latest.datasette.io/fixtures/facetable.json doesn't currently include the CREATE TABLE statement for the page, even though it's available on the HTML version at https://latest.datasette.io/fixtures/facetable",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/979/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1082584499,I_kwDOBm6k_c5Ahu2z,1558,Redesign `facet_results` JSON structure prior to Datasette 1.0,9599,simonw,open,0,,,3268330,Datasette 1.0,3,2021-12-16T19:45:10Z,2023-01-09T15:31:17Z,,OWNER,,"> Decision: as an initial fix I'm going to de-duplicate those keys by using `tags__array` etc - with a `_2` on the end if that key is already used. > > I'll open a separate issue to redesign this better for Datasette 1.0. _Originally posted by @simonw in https://github.com/simonw/datasette/issues/625#issuecomment-996130862_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1558/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1522778923,I_kwDOBm6k_c5aw8Mr,1978,Document datasette.urls.row and row_blob,25778,eyeseast,closed,0,,,,,2,2023-01-06T15:45:51Z,2023-01-09T14:30:00Z,2023-01-09T14:30:00Z,CONTRIBUTOR,,"These are in the codebase but not in documentation. I think everything else in this class is documented. ```python class Urls: ... 
def row(self, database, table, row_path, format=None): path = f""{self.table(database, table)}/{row_path}"" if format is not None: path = path_with_format(path=path, format=format) return PrefixedUrlString(path) def row_blob(self, database, table, row_path, column): return self.table(database, table) + ""/{}.blob?_blob_column={}"".format( row_path, urllib.parse.quote_plus(column) ) ``` ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1978/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,not_planned 1524983536,I_kwDOBm6k_c5a5Wbw,1981,Canned query field labels truncated,9599,simonw,open,0,,,,,1,2023-01-09T06:04:24Z,2023-01-09T06:05:44Z,,OWNER,,"Eg here on mobile: https://timezones.datasette.io/timezones/by_point?longitude=-0.1406632&latitude=50.8246776 ![107A1894-D1DA-4158-9EA3-40C840DD10E3](https://user-images.githubusercontent.com/9599/211248895-c922ce61-95d3-47ca-9314-dcff7c86afab.jpeg) ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1981/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1524867951,I_kwDOBm6k_c5a46Nv,1980,"""Cannot sort table by id"" when sortable_columns is used",9599,simonw,open,0,,,,,2,2023-01-09T03:21:33Z,2023-01-09T03:23:53Z,,OWNER,,"I had an instance with this in `metadata.yml`: ```yaml databases: timezones: tables: timezones: sortable_columns: - tzid ``` When I clicked on the ""Apply"" button here: It sent me to `/timezones/timezones?_sort=id&id__exact=133` with the error message: > 500: Cannot sort table by id",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1980/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1524431805,I_kwDODEm0Qs5a3Pu9,72,"Import thread, including self- and others' replies",601708,mcint,open,0,,,,,0,2023-01-08T09:51:06Z,2023-01-08T09:51:06Z,,NONE,,"statuses-lookup, home-timeline, mentions (only for auth'ed user) don't cover this. `twitter-to-sqlite fetch-thread tw-group1.db 1234123412341234` twitter-to-sqlite focuses on archiving users, but does not easily support archiving conversations or community activity. For reference, this is [implemented in twarc](https://sourcegraph.com/github.com/DocNow/twarc/-/blob/twarc/client.py?L708-766&subtree=true), using a search, optionally recursively. Other research suggests that this formerly, or currently, requires a [search query](https://stackoverflow.com/a/30480103/1020467), use of [undocumented `related_results` api](https://stackoverflow.com/a/9419346/1020467), or with requested inclusion of [newer conversation_id](https://stackoverflow.com/a/68115718/1020467) with subsequent query. 
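To make the shape of this concrete, here is a hypothetical sketch of the fetch side, using the v2 recent-search endpoint and the `conversation_id` operator linked above (the function name, auth handling, and table layout are all assumptions rather than existing twitter-to-sqlite API, and recent search only covers roughly the last seven days - full-archive search needs the academic track):

```py
import requests

def fetch_thread(conversation_id, bearer_token):
    # Hypothetical: page through every tweet in one conversation,
    # following meta.next_token until the results are exhausted
    url = 'https://api.twitter.com/2/tweets/search/recent'
    params = {'query': 'conversation_id:' + str(conversation_id), 'max_results': 100}
    headers = {'Authorization': 'Bearer ' + bearer_token}
    while True:
        data = requests.get(url, params=params, headers=headers).json()
        yield from data.get('data', [])
        next_token = data.get('meta', {}).get('next_token')
        if next_token is None:
            return
        params['next_token'] = next_token
```

The rows could then be upserted into a tweets table the same way the other commands do.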
",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/72/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1524076587,I_kwDOBm6k_c5a15Ar,1979,More useful error message if enable_load_extension is not available,9599,simonw,closed,0,,,,,5,2023-01-07T19:13:19Z,2023-01-08T00:21:23Z,2023-01-08T00:21:23Z,OWNER,,"I get this from: datasette --load-extension spatialite --get /-/versions.json ``` File ""/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/site-packages/datasette/app.py"", line 614, in _prepare_connection conn.enable_load_extension(True) AttributeError: 'sqlite3.Connection' object has no attribute 'enable_load_extension' ``` It would be useful if Datasette caught this error and output something more friendly.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1979/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 957310278,MDU6SXNzdWU5NTczMTAyNzg=,1409,`default_allow_sql` setting (a re-imagining of the old `allow_sql` setting),9599,simonw,closed,0,,,3268330,Datasette 1.0,10,2021-07-31T19:48:56Z,2023-01-07T18:06:01Z,2023-01-05T00:51:31Z,OWNER,,"In 49d6d2f7b0f6cb02e25022e1c9403811f1fa0a7c as part of #813 I removed the `allow_sql` setting - on the basis that users could disable the ability to execute custom SQL queries using the new permission system instead. I don't think this was the right decision. Disabling custom SQL is an important security capability, and explaining how to do it using permissions is significantly more complex than letting people know they can add `--setting allow_sql off`. So I want to bring that setting back - maybe with a different, better name - and have it modify the default for that option if the permissions system doesn't have an opinion. That way people can still use the setting but then use permissions to allow specific signed-in users access to execute SQL.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1409/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 779088071,MDU6SXNzdWU3NzkwODgwNzE=,54,Archive import appears to be broken on recent exports,21148,jacobian,open,0,,,,,5,2021-01-05T14:18:01Z,2023-01-04T11:06:55Z,,CONTRIBUTOR,,"I requested a Twitter export yesterday, and unfortunately they seem to have changed it such that `twitter-to-sqlite import` can't handle it anymore 😢 So far I've ran into two issues. The first was easy to work around, but the second will take more investigation. If I can find the time I'll keep working on it and update this issue accordingly. The issues (so far): ### 1. Data seems to have moved to a `data/` subdirectory Running `twitter-to-sqlite import` on the raw zip file reports a bunch of ""not yet implemented"" errors, and then exits without actually importing anything: ``` ❯ twitter-to-sqlite import tarchive.db twitter.zip ... data/manifest: not yet implemented data/account-creation-ip: not yet implemented data/account-suspension: not yet implemented ... (dozens of more lines like this, including critical stuff like data/tweets) ... 
``` (`tarchive.db` now exists, but is empty) Workaround: unpack the zip file, and run `twitter-to-sqlite import tarchive.db path/to/archive/data` That gets further, but: ### 2. Some schema(s?) have changed At least, the `blocks` schema seems different now: ``` ❯ twitter-to-sqlite import tarchive.db archive/data direct-messages-group: not yet implemented branch-links: not yet implemented periscope-expired-broadcasts: not yet implemented direct-messages: not yet implemented mute: not yet implemented Traceback (most recent call last): File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jacobian-dogsheep-4AXaN4tu-py3.8/bin/twitter-to-sqlite"", line 8, in sys.exit(cli()) File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jacobian-dogsheep-4AXaN4tu-py3.8/lib/python3.8/site-packages/click/core.py"", line 829, in __call__ return self.main(*args, **kwargs) File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jacobian-dogsheep-4AXaN4tu-py3.8/lib/python3.8/site-packages/click/core.py"", line 782, in main rv = self.invoke(ctx) File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jacobian-dogsheep-4AXaN4tu-py3.8/lib/python3.8/site-packages/click/core.py"", line 1259, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jacobian-dogsheep-4AXaN4tu-py3.8/lib/python3.8/site-packages/click/core.py"", line 1066, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jacobian-dogsheep-4AXaN4tu-py3.8/lib/python3.8/site-packages/click/core.py"", line 610, in invoke return callback(*args, **kwargs) File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jacobian-dogsheep-4AXaN4tu-py3.8/lib/python3.8/site-packages/twitter_to_sqlite/cli.py"", line 772, in import_ archive.import_from_file(db, filepath.name, open(filepath, ""rb"").read()) File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jacobian-dogsheep-4AXaN4tu-py3.8/lib/python3.8/site-packages/twitter_to_sqlite/archive.py"", line 215, in import_from_file to_insert = transformer(data) File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jacobian-dogsheep-4AXaN4tu-py3.8/lib/python3.8/site-packages/twitter_to_sqlite/archive.py"", line 115, in lists_member return {""lists-member"": _list_from_common(data)} File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jacobian-dogsheep-4AXaN4tu-py3.8/lib/python3.8/site-packages/twitter_to_sqlite/archive.py"", line 200, in _list_from_common for url in block[""userListInfo""][""urls""]: KeyError: 'urls' ``` That's as far as I got before I needed to work on something else. I'll report back if I get further!",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/54/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1516815571,I_kwDOBm6k_c5aaMTT,1975,_col=id can cause id column to export twice in CSV export,9599,simonw,open,0,,,,,0,2023-01-03T00:25:15Z,2023-01-03T00:25:21Z,,OWNER,,"https://datasette.simonwillison.net/simonwillisonblog/blog_entry.csv?_col=id&_col=title&_col=body&_labels=on&_size=1 ```csv id,id,title,body 1,1,WaSP Phase II,""

The Web Standards project has launched Phase II.

"" ``` That should not have two `id` columns.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1975/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1515182998,I_kwDOBm6k_c5aT9uW,1970,"Path ""None"" in _internal database table",9599,simonw,closed,0,,,,,2,2022-12-31T18:51:05Z,2022-12-31T19:22:58Z,2022-12-31T18:52:49Z,OWNER,,"See https://latest.datasette.io/_internal/databases (after https://latest.datasette.io/login-as-root) ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1970/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1501900064,I_kwDOBm6k_c5ZhS0g,1966,Broken link to live demo in Getting started docs,7551922,lbellomo,closed,0,,,,,1,2022-12-18T13:17:00Z,2022-12-31T19:15:19Z,2022-12-31T19:15:10Z,NONE,,The link in [Play with a live demo in Getting started](https://github.com/simonw/datasette/blob/main/docs/getting_started.rst#play-with-a-live-demo) to [https://fivethirtyeight.datasettes.com/fivethirtyeight](https://fivethirtyeight.datasettes.com/fivethirtyeight) is broken and the datasette is no longer working (maybe due to the end of the free tier).,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1966/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1515186569,I_kwDOBm6k_c5aT-mJ,1972,Fix Sphinx warning about extlink extension,9599,simonw,closed,0,,,,,0,2022-12-31T19:12:04Z,2022-12-31T19:13:26Z,2022-12-31T19:13:26Z,OWNER,,"``` [sphinx-autobuild] > sphinx-build -b html /Users/simon/Dropbox/Development/datasette/docs /Users/simon/Dropbox/Development/datasette/docs/_build Running Sphinx v5.3.0 loading pickled environment... done WARNING: extlinks: Sphinx-6.0 will require a caption string to contain exactly one '%s' and all other '%' need to be escaped as '%%'. ``` _Originally posted by @simonw in https://github.com/simonw/datasette/issues/1971#issuecomment-1368266904_ ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1972/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1115435536,I_kwDOBm6k_c5CfDIQ,1614,Try again with SQLite codemirror support,9599,simonw,open,0,,,,,1,2022-01-26T20:05:20Z,2022-12-23T21:27:10Z,,OWNER,,"I tried and failed to implement autocomplete a while ago. 
Relevant code: https://github.com/codemirror/legacy-modes/blob/8f36abca5f55024258cd23d9cfb0203d8d244f0d/mode/sql.js#L335 Sounds like upgrading to CodeMirror 6 ASAP would be worthwhile since it has better accessibility and touch screen support: https://codemirror.net/6/",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1614/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1373224657,I_kwDOCGYnMM5R2b7R,488,`sqlite-utils transform` should set empty strings to null when converting text columns to integer/float,9599,simonw,open,0,,,,,5,2022-09-14T15:51:30Z,2022-12-23T17:38:55Z,,OWNER,,"``` /tmp % echo ""id,age,weight\n1,3,2.5\n2,,"" | sqlite-utils insert test.db test - --csv /tmp % sqlite-utils schema test.db CREATE TABLE [test] ( [id] TEXT, [age] TEXT, [weight] TEXT ); /tmp % sqlite-utils transform test.db test --type age integer --type weight float /tmp % sqlite-utils schema test.db CREATE TABLE ""test"" ( [id] TEXT, [age] INTEGER, [weight] FLOAT ); /tmp % sqlite-utils rows test.db test [{""id"": ""1"", ""age"": 3, ""weight"": 2.5}, {""id"": ""2"", ""age"": """", ""weight"": """"}] ``` It would be neat if this resulted in the following instead: ``` {""id"": ""2"", ""age"": null, ""weight"": null} ``` Related Discord discussion: https://discord.com/channels/823971286308356157/823971286941302908/1019635490833567794",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/488/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1505411725,I_kwDODFdgUs5ZusKN,78,self-hosted or corp github enterprise,549431,ebdavison,open,0,,,,,0,2022-12-20T22:51:45Z,2022-12-20T22:51:45Z,,NONE,,"We use github enterprise at work and I would like to use this tool to pull info from that site rather than the public github.com instance. Is there an option for this? If not, can one be added for a custom repo URL?",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/78/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 664485022,MDU6SXNzdWU2NjQ0ODUwMjI=,46,Feature: pull request reviews and comments,1326704,bhrutledge,open,0,,,,,6,2020-07-23T13:43:45Z,2022-12-20T14:40:15Z,,NONE,,"Hi there! I saw your [presentation at Boston Python](https://www.meetup.com/bostonpython/events/271887195). I'm already a light user of Datasette (thank you!), but wasn't aware of this project. I've been working on a ""pull request dashboard"" to get a comprehensive view of the state of open PR's, esp. related to reviews (i.e., pending, approved, changes requested). Currently it's a CLI command, but I thought a Datasette UI might be fun. I see that PR's are available from the `issues` command, but I don't see reviews anywhere. From the [API docs](https://docs.github.com/en/rest/reference/pulls#reviews), it looks like there are separate endpoints for those (as well as pull requests in general). What do you think about adding that? Would you accept a PR? 
Any sense of the level of effort?",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/46/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1504352503,I_kwDOBm6k_c5Zqpj3,1968,Allow to hide some queries in metadata.yml,562352,CharlesNepote,open,0,,,,,0,2022-12-20T10:45:41Z,2022-12-20T10:45:41Z,,NONE,,"By default all queries are displayed. But there are many cases where it would be interesting to hide the queries by default: * the website is targeting non-tech people * the query is veeeeeery long ([eg.](https://mirabelle.openfoodfacts.org/products/energy_calculator)) * reading the query is not important for the users, they only want to see the result Of course, the user still could have the option to see the query. It could be an option in the metadata file: ```yml databases: awesome_db: tables: products: hide_sql: true queries: great_query: hide_sql: true sql: select * from products where code = :barcode ``` The priority could be: * no option in the metadata and nothing in the URL: query displayed * hide_sql in the metadata and nothing in the URL: query displayed as asked in the metadata * hide_sql in the metadata and &_hide_sql= in the URL: query as asked in the URL See also: #1824 ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1968/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1496652622,I_kwDOBm6k_c5ZNRtO,1955,"invoke_startup() is not run in some conditions, e.g. gunicorn/uvicorn workers, breaking lots of things",32839123,Rik-de-Kort,closed,0,,,,,36,2022-12-14T13:39:56Z,2022-12-19T04:34:16Z,2022-12-18T02:45:18Z,NONE,,"In the past (pre-september 14, #1809) I had a running deployment of Datasette on Azure WebApps by emulating the call in cli.py to Gunicorn: `gunicorn -w 2 -k uvicorn.workers.UvicornWorker app:app`. My most recent deployment, however, fails loudly by shouting that `Datasette.invoke_startup()` was not called. It does not seem to be possible to call `invoke_startup` when running using a uvicorn command directly like this (I've reproduced this locally using `uvicorn`). Two candidates that I have tried: * Uvicorn has a `--factory` option, but the app factory has to be synchronous, so no `await invoke_startup` there * `asyncio.get_event_loop().run_until_complete` is also not an option because `uvicorn` already has the event loop running. One additional option is: * Use Gunicorn's [server hooks](https://docs.gunicorn.org/en/stable/settings.html#server-hooks) to call `invoke_startup`. These are also synchronous, but I might be able to get ahead of the event loop starting here. In my current deployment setup, it does not appear to be possible to use `datasette serve` directly, so I'm stuck either * Trying to rework my complete deployment setup, for instance, using Azure functions as described [here](https://github.com/simonw/azure-functions-datasette)) * Or dig into the ASGI spec and write a wrapper for the sole purpose of launching Datasette using a direct Uvicorn invocation. Questions for the maintainers: * Is this intended behaviour/will not support/etc.? If so, I'd be happy to add a PR with a couple lines in the documentation. * if this is not intended behaviour, what is a good way to fix it? 
I could have a go at the ASGI spec thing (I think the Azure Functions thing is related) and provide a PR with the wrapper here, but I'm all ears! Almost forgot, minimal reproducer: ```python from datasette import Datasette ds = Datasette(files=['./global-power-plants.db']) app = ds.app() ``` Save as app.py in the same folder as global-power-plants.db, and then try running `uvicorn app:app`. Opening the resulting Datasette instance in the browser will show the error message.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1955/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1447050738,I_kwDOBm6k_c5WQD3y,1886,"Call for birthday presents: if you're using Datasette, let us know how you're using it here",9599,simonw,open,0,,,,,13,2022-11-13T19:25:51Z,2022-12-18T17:34:20Z,,OWNER,,"Datasette is 5 years old today. To celebrate, I'm asking the community for birthday presents: https://simonwillison.net/2022/Nov/13/datasette-birthday/ > To celebrate this open source project’s birthday, I’ve decided to try something new: I’m going to ask for birthday presents. > > An aspect of Datasette’s marketing that I’ve so far neglected is social proof. I think it’s time to change that: I know people are using the software to do cool things, but this often happens behind closed doors. > > For Datasette’s birthday, I’m looking for endorsements and case studies and just general demonstrations that show how people are using it to do cool stuff. > > So: if you’ve used Datasette to solve a problem, and you’re willing to publicize it, please give us the gift of your endorsement! > > [...] > > Add a comment to [this issue thread](https://github.com/simonw/datasette/issues/1886) describing what you’re doing. Just a few sentences is fine—though a screenshot or even a link to a live instance would be even better",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1886/reactions"", ""total_count"": 2, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 2, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1501778647,I_kwDOBm6k_c5Zg1LX,1964,Cog menu is not keyboard accessible (also no ARIA),9599,simonw,open,0,,,,,1,2022-12-18T06:36:28Z,2022-12-18T06:37:28Z,,OWNER,,"This menu here: https://latest.datasette.io/fixtures/attraction_characteristic You can tab to it (see the outline) and hit space or enter to open it, but you can't then navigate the items in the open menu using the keyboard.
![cog-menu](https://user-images.githubusercontent.com/9599/208284973-2a04cdab-ed95-4316-979c-67fe5f7787db.gif) ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1964/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1306984363,I_kwDOBm6k_c5N5v-r,1771,minor a11y: `",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1939/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1384549993,I_kwDOBm6k_c5Sho5p,1818,Setting to turn off table row counts entirely,9599,simonw,open,0,,,,,4,2022-09-24T06:39:22Z,2022-12-11T02:03:09Z,,OWNER,,"There are situations - such as loading SQLite files remotely using HTTP range headers - where counting all of the rows in a table should be avoided entirely. > > Also, this chunked inefficiency means that I have to hack the URL to not load tables of a database as it seems to try to load the whole database when I click on a database. > > I bet that's because Datasette tries to show a count of all of the rows in each table when it shows the list on that page, which triggers a full table scan. > > Would be great to have a setting that turns that feature off, which could then be exposed as a query string option for Datasette Lite. _Originally posted by @simonw in https://github.com/simonw/datasette-lite/issues/49#issuecomment-1256880715_ ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1818/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1487764628,I_kwDOCGYnMM5YrXyU,518,flake8 ValueError: Error code '#' supplied to 'extend-ignore' option...,9599,simonw,closed,0,,,,,0,2022-12-10T01:30:24Z,2022-12-10T01:36:46Z,2022-12-10T01:36:46Z,OWNER,,"> `Error code '#' supplied to 'extend-ignore' option does not match '^[A-Z]{1,3}[0-9]{0,3}$'` https://github.com/simonw/sqlite-utils/actions/runs/3662011265/jobs/6190770361 I think from this: https://github.com/simonw/sqlite-utils/blob/e660635cea6c32f4022818380b1e1ee88e7c93a6/setup.cfg#L1-L3 ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/518/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1487757143,I_kwDOCGYnMM5YrV9X,517,Drop support for Python 3.6,9599,simonw,closed,0,,,,,1,2022-12-10T01:23:31Z,2022-12-10T01:36:36Z,2022-12-10T01:36:36Z,OWNER,,"CI has started failing for Python 3.6: https://github.com/simonw/sqlite-utils/actions/runs/3576322798 It's fixable by switching away from `ubuntu-latest` according to: - https://github.com/actions/setup-python/issues/355#issuecomment-1335042510 But https://endoflife.date/python says that 3.6 end of life was almost 6 years ago, and end of security support nearly 1 year ago.
So I'm OK dropping support entirely - Python 3.6 users will still be able to install version 3.30, just not any releases that come next.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/517/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1487738738,I_kwDOBm6k_c5YrRdy,1942,Option for plugins to request that JSON be served on the page,9599,simonw,open,0,,,3268330,Datasette 1.0,1,2022-12-10T01:08:53Z,2022-12-10T01:11:30Z,,OWNER,,"Idea came from a conversation with @hydrosquall - what if a Datasette plugin could say ""I'd like the JSON for a page to be included in a variable on the HTML page""? `datasette-cluster-map` already needs this - the first thing it does when the page loads is `fetch()` a JSON representation of that same data. This idea fits with my overall goals to unify the JSON and HTML context too. Refs: - #1711",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1942/reactions"", ""total_count"": 1, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 1, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1198822563,I_kwDOBm6k_c5HdJSj,1706,"[feature] immutable mode for a directory, not just individual sqlite file",9020979,hydrosquall,open,0,,,,,4,2022-04-10T00:50:57Z,2022-12-09T19:11:40Z,,CONTRIBUTOR,,"## Motivation - I have a directory of sqlite databases - I'd like to use immutable mode when opening them for better performance [docs](https://docs.datasette.io/en/0.54/performance.html#immutable-mode) - Currently using this flag throws the following error IsADirectoryError: [Errno 21] Is a directory: '/name-of-directory' ## Proposal Immutable flag works for both single files and directories datasette -i /folder-of-sqlite-files",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1706/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1486036269,I_kwDOBm6k_c5Ykx0t,1941,Mechanism for supporting key rotation for DATASETTE_SECRET,9599,simonw,open,0,,,,,1,2022-12-09T05:24:53Z,2022-12-09T05:25:20Z,,OWNER,,"Currently if you change `DATASETTE_SECRET` all existing signed tokens - both cookies and API tokens and potentially other things too - will instantly expire. Adding support for key rotation would allow keys to be rotated on a semi-regular basis without logging everyone out / invalidating every API token instantly. 
Can model this on how Django does it: https://github.com/django/django/commit/0dcd549bbe36c060f536ec270d34d9e7d4b8e6c7",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1941/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1434094365,I_kwDOBm6k_c5Veosd,1881,Tool for simulating permission checks against actors,9599,simonw,closed,0,,,,,9,2022-11-03T04:43:20Z,2022-12-09T01:38:21Z,2022-11-04T00:13:05Z,OWNER,,"In working on this issue: - #1855 I realized that if I'm going to make actors more complicated (the proposed `_r` key for additional restricted permissions) I really need an interactive tool for simulating these checks, similar to the https://latest.datasette.io/-/allow-debug tool.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1881/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1485017981,I_kwDODEpn8M5Yg5N9,2,table identifications has no column named previous_observation_taxon,520541,heaversm,open,0,,,,,0,2022-12-08T16:47:17Z,2022-12-08T16:47:17Z,,NONE,,"Installed successfully with pip and ran `inaturalist-to-sqlite inaturalist.db simonw` and got the error: ``` sqlite3.OperationalError: table identifications has no column named previous_observation_taxon ```",206202864,inaturalist-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/inaturalist-to-sqlite/issues/2/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1483250004,I_kwDOBm6k_c5YaJlU,1936,Fix /db/table/-/upsert in the API explorer,9599,simonw,open,0,,,3268330,Datasette 1.0,2,2022-12-08T00:59:34Z,2022-12-08T01:36:02Z,,OWNER,,"Split from: - #1931 - #1878 This is a bit tricky because the code needs to figure out what the primary keys are for an item, and whether or not `rowid` should be included.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1936/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1432013704,I_kwDOBm6k_c5VWsuI,1878,/db/table/-/upsert API,9599,simonw,closed,0,,,8711695, Datasette 1.0a2,8,2022-11-01T20:01:18Z,2022-12-08T01:12:18Z,2022-12-08T01:12:17Z,OWNER,,Equivalent to `sqlite-utils upsert`: https://sqlite-utils.datasette.io/en/stable/python-api.html#upserting-data,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1878/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1473659191,I_kwDOBm6k_c5X1kE3,1929,Incorrect link from the API explorer to the JSON API documentation,3556,davidbgk,closed,0,,,,,4,2022-12-03T02:08:58Z,2022-12-06T19:36:23Z,2022-12-06T19:34:20Z,CONTRIBUTOR,,"I installed `datasette==1.0a1`. When I go to http://127.0.0.1:8001/-/api I have a link: `Use this tool to try out the [Datasette API](https://docs.datasette.io/en/1.0a1/json_api.html).` but that documentation page does not exist. 
I'm not sure where it has to be fixed, should it link to the stable page https://docs.datasette.io/en/stable/json_api.html , the latest one https://docs.datasette.io/en/latest/json_api.html#the-json-write-api or would it be more appropriate to deploy documentation for the `1.0a1` version?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1929/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1479914599,I_kwDOCGYnMM5YNbRn,516,Feature request: output number of ignored/replaced rows for insert command,9599,simonw,open,0,,,,,4,2022-12-06T18:59:21Z,2022-12-06T19:08:14Z,,OWNER,,"https://hachyderm.io/@briandorsey/109468185742876820 > I'm fiddling with piping json to `insert -ignore` I'd love to see the count of records inserted & ignored, but didn't see a way to do that in the help/docs. > > Example: `xh ""https://hachyderm.io/api/v1/timelines/tag/rust?max_id=109443380308326328"" | sqlite-utils insert aoc.db aoc - --pk=id --ignore`",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/516/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1479920517,I_kwDOBm6k_c5YNcuF,1934,Return number of ignored/replaced items from /-/insert,9599,simonw,open,0,,,3268330,Datasette 1.0,0,2022-12-06T19:01:58Z,2022-12-06T19:02:03Z,,OWNER,,"Idea from here: - https://github.com/simonw/sqlite-utils/issues/516",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1934/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1473481262,I_kwDOBm6k_c5X04ou,1928,Hacker News Datasette write demo,9599,simonw,closed,0,,,,,7,2022-12-02T21:17:41Z,2022-12-02T23:47:11Z,2022-12-02T21:43:19Z,OWNER,,"Idea is to have my existing scraper at https://github.com/simonw/scrape-hacker-news-by-domain also write to my private Datasette Cloud account, then create an atom feed from it. ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1928/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1175690070,I_kwDOBm6k_c5GE5tW,1676,"Reconsider ensure_permissions() logic, can it be less confusing?",9599,simonw,open,0,,,3268330,Datasette 1.0,3,2022-03-21T17:14:57Z,2022-12-02T01:23:40Z,,OWNER,,"> Updated documentation: https://github.com/simonw/datasette/blob/e627510b760198ccedba9e5af47a771e847785c9/docs/internals.rst#await-ensure_permissionsactor-permissions > >> This method allows multiple permissions to be checked at once. It raises a `datasette.Forbidden` exception if any of the checks are denied before one of them is explicitly granted. >> >> This is useful when you need to check multiple permissions at once. For example, an actor should be able to view a table if either one of the following checks returns `True` or not a single one of them returns `False`: > > That's pretty hard to understand! I'm going to open a separate issue to reconsider if this is a useful enough abstraction given how confusing it is.
_Originally posted by @simonw in https://github.com/simonw/datasette/issues/1675#issuecomment-1074177827_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1676/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1471969984,I_kwDOBm6k_c5XvHrA,1926,Release notes for 1.0a1 (and release it),9599,simonw,closed,0,,,7867486,Datasette 1.0a1,1,2022-12-01T21:18:12Z,2022-12-01T22:06:13Z,2022-12-01T22:06:12Z,OWNER,,"Mainly CORS support and a few small bug fixes. Changes: https://github.com/simonw/datasette/compare/1.0a0...99da46f7258225fc6fd8e94ddc20859ccccc4109",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1926/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1214859703,I_kwDOBm6k_c5IaUm3,1719,Refactor `RowView` and remove `RowTableShared`,9599,simonw,closed,0,,,,,3,2022-04-25T18:06:24Z,2022-12-01T21:15:19Z,2022-04-25T18:33:44Z,OWNER,,"> The `RowTableShared` class is making this a whole lot more complicated. > > I'm going to split the `RowView` view out into an entirely separate `views/row.py` module. _Originally posted by @simonw in https://github.com/simonw/datasette/issues/1715#issuecomment-1108875068_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1719/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1215174094,I_kwDOBm6k_c5IbhXO,1720,Design plugin hook for extras,9599,simonw,closed,0,,,,,14,2022-04-26T00:08:10Z,2022-12-01T21:15:19Z,2022-04-26T20:20:27Z,OWNER,,"Refs: - #262 - #1709 I realized that this is a really natural plugin hook - and if I design it as a hook I can implement Datasette's core extras as default plugins.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1720/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1212823665,I_kwDOBm6k_c5ISjhx,1715,Refactor TableView to use asyncinject,9599,simonw,closed,0,,,,,13,2022-04-22T21:43:39Z,2022-12-01T21:15:18Z,2022-04-28T22:26:56Z,OWNER,,"I've been working on a dependency injection mechanism in a separate library: - https://github.com/simonw/asyncinject I think it's ready to try out with Datasette to see if it's a pattern that will work here. I'm going to attempt to refactor `TableView` to use it. There are two overall goals here: - Use `asyncinject` to add parallel execution of some aspects of the table page - most notably I want to be able to execute the `count(*)` query, the `select ...` query, the various faceting queries and the facet suggestion queries in parallel - and measure if doing so is good for performance. - Use it to execute different output formats (possibly with some changes to the existing `register_output_renderer()` plugin hook). I want CSV and JSON to use the same mechanism that plugins use. 
Stretch goal is to get this working with streaming data too, see: - #1101",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1715/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1469973742,I_kwDOBm6k_c5XngTu,1922,Make sure CORS works for write APIs,9599,simonw,closed,0,,,7867486,Datasette 1.0a1,13,2022-11-30T17:15:55Z,2022-12-01T18:47:00Z,2022-12-01T18:47:00Z,OWNER,,"Split from: - #1850",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1922/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1470509936,I_kwDOBm6k_c5XpjNw,1924,Docs for replace:true and ignore:true options for insert API,9599,simonw,closed,0,,,7867486,Datasette 1.0a1,4,2022-12-01T01:33:25Z,2022-12-01T18:15:15Z,2022-12-01T02:08:02Z,OWNER,,Equivalent to https://sqlite-utils.datasette.io/en/stable/cli.html#insert-replacing-data,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1924/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1469821027,I_kwDOBm6k_c5Xm7Bj,1921,Document methods to get canned queries,25778,eyeseast,open,0,,,,,0,2022-11-30T15:26:33Z,2022-11-30T23:34:21Z,,CONTRIBUTOR,,"Two methods will get canned queries for a Datasette instance: [`Datasette.get_canned_queries`](https://github.com/simonw/datasette/blob/main/datasette/app.py#L575) will return all canned queries for a database that an `actor` can see. [`Datasette.get_canned_query`](https://github.com/simonw/datasette/blob/main/datasette/app.py#L592) will return a single canned query by name. ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1921/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1470320227,I_kwDOBm6k_c5Xo05j,1923,latest.datasette.io Cloud Run deploys failing,9599,simonw,closed,0,,,,,3,2022-11-30T22:49:34Z,2022-11-30T23:04:56Z,2022-11-30T23:04:56Z,OWNER,,"https://github.com/simonw/datasette/actions/runs/3587402085/jobs/6038106719v ``` Warning: ""service_account_key"" has been deprecated. Please switch to using google-github-actions/auth which supports both Workload Identity Federation and Service Account Key JSON authentication. For more details, see https://github.com/google-github-actions/setup-gcloud#authorization Error: google-github-actions/setup-gcloud failed with: failed to execute command `gcloud --quiet auth activate-service-account *** --key-file -`: /opt/hostedtoolcache/gcloud/275.0.0/x64/lib/googlecloudsdk/core/console/console_io.py:544: SyntaxWarning: ""is"" with a literal. Did you mean ""==""? 
if answer is None or (answer is '' and default is not None): ERROR: gcloud failed to load: module 'collections' has no attribute 'MutableMapping' ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1923/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1469796454,I_kwDOBm6k_c5Xm1Bm,1920,Document Datasette.metadata() method,25778,eyeseast,open,0,,,,,0,2022-11-30T15:10:36Z,2022-11-30T15:10:36Z,,CONTRIBUTOR,,"Code is here: https://github.com/simonw/datasette/blob/main/datasette/app.py#L503 This will be the official way to access metadata from plugins.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1920/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1108671952,I_kwDOBm6k_c5CFP3Q,1605,Scripted exports,25778,eyeseast,open,0,,,,,10,2022-01-19T23:45:55Z,2022-11-30T15:06:38Z,,CONTRIBUTOR,,"Posting this while I'm thinking about it: I mentioned at the end of [this thread](https://twitter.com/eyeseast/status/1483893011658551299) that I'm usually doing `datasette --get` to export canned queries. I used to use a tool called [datafreeze](https://github.com/pudo/datafreeze) to do scripted exports, but that project looks dead now. The ergonomics of it are pretty nice, though, and the `Freezefile.yml` structure is actually not too far from Datasette's canned queries. This is related to the idea for `datasette query` (#1356) but I think it's a distinct feature. It's most likely a plugin, but I want to raise it here because it's probably something other people have thought about.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1605/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1469044738,I_kwDOBm6k_c5Xj9gC,1918,API explorer should list mutable databases first,9599,simonw,closed,0,,,7867486,Datasette 1.0a1,1,2022-11-30T04:53:33Z,2022-11-30T05:22:07Z,2022-11-30T05:07:56Z,OWNER,,"https://latest.datasette.io/-/api hides `ephemeral` down at the bottom, would be more interesting if it was at the top. 
Related: - #1915 ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1918/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1469062686,I_kwDOBm6k_c5XkB4e,1919,Intermittent `test_delete_row` test failure ,9599,simonw,open,0,,,,,1,2022-11-30T05:18:46Z,2022-11-30T05:20:56Z,,OWNER,,"https://github.com/simonw/datasette/actions/runs/3580503393/jobs/6022689591 ``` delete_response = await ds_write.client.post( ""/data/{}/{}/-/delete"".format(table, delete_path), headers={ ""Authorization"": ""***"".format(write_token(ds_write)), }, ) > assert delete_response.status_code == 200 E assert 404 == 200 E + where 404 = .status_code /home/runner/work/datasette/datasette/tests/test_api_write.py:396: AssertionError =========================== short test summary info ============================ FAILED tests/test_api_write.py::test_delete_row[compound_pk_table-row_for_create2-pks2-article,k] - assert 404 == 200 + where 404 = .status_code ``` This passes most of the time, but very occasionally fails - in this case in Python 3.7 It seems to only fail for the `article,k` compound primary key test.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1919/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1469015001,I_kwDOBm6k_c5Xj2PZ,1916,GET requests against POST endpoints should not 500 error,9599,simonw,closed,0,,,7867486,Datasette 1.0a1,1,2022-11-30T04:04:43Z,2022-11-30T05:15:19Z,2022-11-30T05:15:19Z,OWNER,,"![CF37BA4D-0677-4DDD-A339-EAF163BB63B7](https://user-images.githubusercontent.com/9599/204705025-6f88e9f7-757d-45e8-a89c-ab97e84781e8.png) ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1916/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1469043836,I_kwDOBm6k_c5Xj9R8,1917,Don't allow writable API to edit the `_memory` database,9599,simonw,closed,0,,,7867486,Datasette 1.0a1,2,2022-11-30T04:51:59Z,2022-11-30T05:07:56Z,2022-11-30T05:07:55Z,OWNER,,"It shows up on https://latest.datasette.io/-/api (once you are signed in as root) - but there's no point in creating tables in it because they likely won't persist from one request to the next, as it's not a shared named database. ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1917/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1468709531,I_kwDOBm6k_c5Xirqb,1915,Interactive demo of Datasette 1.0 write APIs,9599,simonw,closed,0,,,,,6,2022-11-29T21:16:03Z,2022-11-30T04:05:46Z,2022-11-30T04:05:46Z,OWNER,,"I'm going to try to get this working on https://latest.datasette.io/ - it already has a way for people to sign in as root, but none of the databases there are writable. So I'm going to build a plugin which adds a writable named in-memory database. And some kind of mechanism for clearing out that database on a regular basis - maybe tables in that database get deleted automatically an hour after they are created? 
(Would be neat to display their time-left-until-deleted too)",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1915/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1421529723,I_kwDOBm6k_c5UutJ7,1850,Write API in Datasette core,9599,simonw,closed,0,,,8658075,Datasette 1.0a0,13,2022-10-24T22:13:24Z,2022-11-29T20:11:20Z,2022-11-29T20:11:20Z,OWNER,,"I need this for Datasette Cloud, and in thinking it through I realized that it's really time Datasette grew a default write API as well. I'm going to mostly model this off `sqlite-utils`, since I've spent a bunch of time iterating on a pseudo-JSON API for that over the past few years (piping JSON to stdin etc). I want this for Datasette 1.0. I'm going to be building it in the new [1.0-dev](https://github.com/simonw/datasette/tree/1.0-dev) branch, which is automatically deployed to https://latest-1-0-dev.datasette.io/ running on Cloud Run. API features to build: - [x] #1852 - [x] #1856 - [x] #1857 - [x] #1858 - [x] #1859 - [x] #1871 - [x] #1888 - [x] #1868 - [x] #1851 - [x] #1863 - [x] #1864 - [x] #1866 - [x] https://github.com/simonw/datasette/issues/1882 - [x] #1862 - [x] #1874 - [x] https://github.com/simonw/datasette/issues/1887 - [x] #1877 Bumped to later on: - #1855 - #1878 - #1873 - #1875 - Make sure CORS works - https://github.com/simonw/datasette/issues/1889 - Alter a table - `sqlite-utils transform` style (more powerful than straight ALTER) - Execute SQL against a write connection - Maybe even multiple write SQL statements bundled in a single transaction - https://github.com/simonw/datasette/issues/1867 ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1850/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1468603401,I_kwDOBm6k_c5XiRwJ,1913,Release Datasette 1.0a0,9599,simonw,closed,0,,,8658075,Datasette 1.0a0,9,2022-11-29T19:41:42Z,2022-11-29T20:10:35Z,2022-11-29T20:10:35Z,OWNER,,"I attempted the release just now - https://github.com/simonw/datasette/releases/tag/1.0a0 - and got an unexpected test failure: https://github.com/simonw/datasette/actions/runs/3577355358/attempts/1 ``` > assert delete_response.status_code == 200 E assert 404 == 200 E + where 404 = .status_code /home/runner/work/datasette/datasette/tests/test_api_write.py:396: AssertionError =========================== short test summary info ============================ FAILED tests/test_api_write.py::test_delete_row[compound_pk_table-row_for_create2-pks2-article,k] - assert 404 == 200 + where 404 = .status_code ``` I hit ""retry"" on that test but I expect it to fail again. ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1913/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1432012302,I_kwDOBm6k_c5VWsYO,1877,Refactor and tidy up final write API code,9599,simonw,closed,0,,,,,1,2022-11-01T20:00:11Z,2022-11-29T19:44:16Z,2022-11-29T19:44:07Z,OWNER,,"- `views/table.py` has got a bit too big - I think the write classes should be pulled out into a separate module. 
- [x] There's duplicate logic for deciding if the table and database exist and checking permissions",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1877/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1450312343,I_kwDOBm6k_c5WcgKX,1892,Merge 1.0-dev branch back to main,9599,simonw,closed,0,,,8658075,Datasette 1.0a0,3,2022-11-15T20:04:25Z,2022-11-29T19:40:23Z,2022-11-29T19:40:23Z,OWNER,,"I'm committed enough to the 1.0 work now that I'm ready for the `main` branch to reflect that instead. If I need to make any dot-releases against 0.63 I can do those from a branch.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1892/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1450303205,I_kwDOBm6k_c5Wcd7l,1891,1.0a0 release notes,9599,simonw,closed,0,,,8658075,Datasette 1.0a0,4,2022-11-15T19:58:20Z,2022-11-29T19:23:41Z,2022-11-29T19:23:41Z,OWNER,,"This release will mainly help preview the new Datasette write API: - #1850",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1891/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1425029275,I_kwDOBm6k_c5U8Dib,1864,Delete a single record from an existing table,9599,simonw,closed,0,,,8658075,Datasette 1.0a0,4,2022-10-27T04:53:22Z,2022-11-29T18:54:04Z,2022-11-29T18:54:04Z,OWNER,,"API design: ``` POST /db/table/row-pks/-/delete Or... DELETE /db/table/row-pks/-/delete ``` I'm just going to do `POST` for the moment, like I did here: - #1874 Permission: `delete-row` Still needed: - [ ] Tests for rowid tables - [ ] Tests for compound primary keys",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1864/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1468519699,I_kwDOBm6k_c5Xh9UT,1911,`/db/-/create` should support creating tables with compound primary keys,9599,simonw,closed,0,,,8658075,Datasette 1.0a0,2,2022-11-29T18:30:47Z,2022-11-29T18:50:58Z,2022-11-29T18:48:05Z,OWNER,,"Found myself needing this to write the tests for: - #1864 ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1911/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1425029242,I_kwDOBm6k_c5U8Dh6,1863,Update a single record in an existing table,9599,simonw,closed,0,,,8658075,Datasette 1.0a0,16,2022-10-27T04:53:17Z,2022-11-29T18:08:53Z,2022-11-29T18:06:37Z,OWNER,,"API design: ``` POST /db/table/row-pks/-/update { ""field"": ""updated_value"" } ``` Only the fields that you pass will be updated. Maybe this is the wrong design though? The design for insert currently looks like this: - https://github.com/simonw/datasette/issues/1851#issuecomment-1294224185 ``` POST /db/table/-/insert Authorization: Bearer xxx Content-Type: application/json { ""row"": { ""id"": 1, ""name"": ""New name"" } } ``` I could use the same format for `/-/update`, but in this case the API doesn't require you to pass every field so `""row""` doesn't seem like the right key. 
I think I'll go with this: ``` POST /db/table/1/-/update Authorization: Bearer xxx Content-Type: application/json { ""update"": { ""name"": ""New name"" } } ``` The benefit of having an `""update""` key is that it allows me to use other keys in the future. Maybe a `""alter"": true` key to indicate that new columns should be added if they are missing.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1863/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1466952626,I_kwDOBm6k_c5Xb-uy,1909,Option to sort facets alphabetically,9599,simonw,open,0,,,,,1,2022-11-28T19:18:14Z,2022-11-28T19:19:26Z,,OWNER,,"Suggested here: - https://github.com/simonw/datasette/discussions/1908",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1909/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1439009231,I_kwDOBm6k_c5VxYnP,1884,Exclude virtual tables from datasette inspect,25778,eyeseast,open,0,,,,,6,2022-11-07T21:26:01Z,2022-11-21T04:40:56Z,,CONTRIBUTOR,,"Ran `inspect` on a spatialite database and got these warnings: ``` ERROR: conn=, sql = 'select count(*) from [SpatialIndex]', params = None: no such module: VirtualSpatialIndex ERROR: conn=, sql = 'select count(*) from [ElementaryGeometries]', params = None: no such module: VirtualElementary ERROR: conn=, sql = 'select count(*) from [KNN]', params = None: no such module: VirtualKNN ``` It still worked, but probably want to catch this. ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1884/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1455928469,I_kwDOBm6k_c5Wx7SV,1903,Refactor all error classes into a datasette.exceptions module,9599,simonw,open,0,,,3268330,Datasette 1.0,2,2022-11-18T22:44:45Z,2022-11-20T22:35:01Z,,OWNER,,"While working on this issue: - #1896 I realized that Datasette has error classes scattered around a fair bit, including some in the `datasette.utils.asgi` module for some reason. I should clean these up.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1903/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1456012874,I_kwDOBm6k_c5WyP5K,1905,`publish heroku` failing due to old Python version,9599,simonw,closed,0,,,,,4,2022-11-19T00:01:45Z,2022-11-19T01:12:05Z,2022-11-19T00:52:29Z,OWNER,,"Reported on Discord: https://discord.com/channels/823971286308356157/823971286941302908/1042814317118115901 ``` -----> Building on the Heroku-22 stack -----> Determining which buildpack to use for this app -----> Python app detected -----> Using Python version specified in runtime.txt ! Requested runtime 'python-3.8.10' is not available for this stack (heroku-22). ! For supported versions, see: https://devcenter.heroku.com/articles/python-support ! Push rejected, failed to compile Python app. ! 
Push failed ▸ Build failed ``` ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1905/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1456013930,I_kwDOBm6k_c5WyQJq,1906,Extract publish Heroku support to a plugin,9599,simonw,open,0,,,3268330,Datasette 1.0,0,2022-11-19T00:02:51Z,2022-11-19T00:03:10Z,,OWNER,,"> This is a strong argument for extracting the Heroku support out to a plugin - it would allow this to be fixed with a plugin release without needing to push a full release of Datasette itself. _Originally posted by @simonw in https://github.com/simonw/datasette/issues/1905#issuecomment-1320678715_ ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1906/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1455932972,I_kwDOBm6k_c5Wx8Ys,1904,Datasette Lite tests failing due to httpx upgrade,9599,simonw,closed,0,,,8658075,Datasette 1.0a0,0,2022-11-18T22:49:31Z,2022-11-18T22:57:48Z,2022-11-18T22:52:22Z,OWNER,,"Same problem as this one: - https://github.com/simonw/datasette-lite/issues/56 Caused this failure: https://github.com/simonw/datasette/actions/runs/3500765964",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1904/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1452364777,I_kwDOBm6k_c5WkVPp,1896,Extract logic for resolving a URL to a database / table / row,9599,simonw,closed,0,,,8658075,Datasette 1.0a0,4,2022-11-16T22:25:20Z,2022-11-18T22:57:47Z,2022-11-18T22:56:55Z,OWNER,,"> In trying to write this I realize that there's a lot of duplicated code with delete row, specifically around resolving the incoming URL into a row (or a database or a table). > > Since this is so common, I think it's worth extracting the logic out first. _Originally posted by @simonw in https://github.com/simonw/datasette/issues/1863#issuecomment-1317755263_ ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1896/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1434911255,I_kwDOCGYnMM5VhwIX,510,Cannot enable FTS5 despite it being available,1176293,ar-jan,closed,0,,,,,3,2022-11-03T16:03:49Z,2022-11-18T18:37:52Z,2022-11-17T10:36:28Z,NONE,,"When I do `sqlite-utils enable-fts my.db table_name column_name` (with or without `--fts5`), I get an FTS4 virtual table instead of the expected FTS5. FTS5 is however available and Python/SQLite versions do not seem to be the issue. I can manually create the FTS5 virtual table, and then Datasette also works with it from this same Python environment. `>>> sqlite3.version` `2.6.0` `>>> sqlite3.sqlite_version` `3.39.4` `PRAGMA compile_options;` includes `ENABLE_FTS5`. `sqlite-utils, version 3.30`. 
Any ideas what's happening and how to fix?",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/510/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1452572348,I_kwDOBm6k_c5WlH68,1900,datasette package --spatialite throws error during build,419145,rdmurphy,open,0,,,,,11,2022-11-17T02:03:28Z,2022-11-18T08:00:38Z,,NONE,,"Hello! Attempting to use `datasette package` to bundle up a SpatiaLite DB and I'm getting this error during the `docker build`: ``` sqlite3.OperationalError: /usr/lib/x86_64-linux-gnu/mod_spatialite.so.so: cannot open shared object file: No such file or directory ``` Seems to be throwing when this step is ran: ``` ERROR [6/6] RUN datasette inspect results.db --inspect-file inspect-data.json ``` This is with `v0.63.1`.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1900/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1450796965,I_kwDOBm6k_c5WeWel,1894,Initialize CodeMirror during DOMContentLoaded instead of onload,95570,bgrins,closed,0,,,,,0,2022-11-16T03:52:19Z,2022-11-18T07:29:02Z,2022-11-18T07:29:02Z,CONTRIBUTOR,,As per https://github.com/simonw/datasette/pull/1893/files#r1023248927 this should prevent a flash between the textarea being replaced by CodeMirror.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1894/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1452495049,I_kwDOBm6k_c5Wk1DJ,1899,Clicking within the CodeMirror area below the SQL (i.e. when there's only a single line) doesn't cause the editor to get focused ,95570,bgrins,closed,0,,,,,4,2022-11-17T00:29:52Z,2022-11-18T07:28:28Z,2022-11-18T07:20:53Z,CONTRIBUTOR,,"After the upgrade to 6 (#1893) I noticed this. I think it's because we're doing overflow:hidden to accomplish the CSS resizer. When there's a single line of SQL there's a gap below that line where clicking doesn't do anything. 
It should focus at the end of the line.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1899/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1453813400,I_kwDOBm6k_c5Wp26Y,1901,"Some plugins show ""home"" breadcrumbs twice in the top left",95570,bgrins,closed,0,,,,,8,2022-11-17T18:44:58Z,2022-11-18T07:22:37Z,2022-11-18T07:02:56Z,CONTRIBUTOR,," ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1901/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1454532488,I_kwDOBm6k_c5WsmeI,1902,Document {% block crumbs %} for plugin authors,9599,simonw,open,0,,,3268330,Datasette 1.0,0,2022-11-18T06:16:30Z,2022-11-18T06:16:39Z,,OWNER,,"> For `datasette-copyable` I want to show breadcrumbs that take database/instance permissions into account, so I'm removing `{% block nav %}` entirely and replacing it with this: > > ```html+jinja > {% block crumbs %} > {{ crumbs.nav(request=request, database=database, table=table) }} > {% endblock %} > ``` _Originally posted by @simonw in https://github.com/simonw/datasette/issues/1901#issuecomment-1319588163_ I should document this.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1902/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1452457263,I_kwDOBm6k_c5Wkr0v,1897,Serve schema JSON to the SQL editor to enable autocomplete,9599,simonw,closed,0,,,,,9,2022-11-16T23:47:45Z,2022-11-18T05:33:20Z,2022-11-18T02:54:43Z,OWNER,,"See: - https://github.com/simonw/datasette/issues/1893#issuecomment-1317831555 ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1897/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1453134846,I_kwDOCGYnMM5WnRP-,513,Add or document streamlined workflow for importing Datasette csv / json exports,19328961,henry501,open,0,,,,,0,2022-11-17T10:54:47Z,2022-11-17T10:54:47Z,,NONE,,"I'm working on some small front-end enhancements to the laion-aesthetic-datasette project, and I wanted to partially populate a database directly using exports from the existing Datasette instance instead of downloading the parquet files and creating my own multi-GB database. There have been a number of small issues that are certainly related to my relative lack of familiarity with the toolkit, but that are still surprising. For example: a CSV export of the images table (http://laion-aesthetic.datasette.io/laion-aesthetic-6pls.csv?sql=select+rowid%2C+url%2C+text%2C+domain_id%2C+width%2C+height%2C+similarity%2C+punsafe%2C+pwatermark%2C+aesthetic%2C+hash%2C+__index_level_0__+from+images+order+by+random%28%29+limit+100) has nested single quotes, double quotes, and commas that aren't handled by rows_from_file. Similarly, the json output has to be manually transformed to add the column names and remove extraneous information before sqlite_utils can import it. 
I was able to work through these issues, but as an enhancement it would be really helpful to create or document a clear workflow that avoids the friction of this data transformation.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/513/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1448143294,I_kwDOBm6k_c5WUOm-,1890,Autocomplete text entry for filter values that correspond to facets,536941,fgregg,closed,0,,,,,16,2022-11-14T14:11:31Z,2022-11-17T00:47:36Z,2022-11-16T03:23:01Z,CONTRIBUTOR,,"datasette allows users to enter in the value for named parameters into a free-text form field. I think it would add a lot of usability if the form field could be a drop-down of options when the query value is already a faceted column.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1890/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1452360613,I_kwDOBm6k_c5WkUOl,1895,Avoid using host name when building absolute URLs?,14294,hubgit,open,0,,,,,0,2022-11-16T22:21:27Z,2022-11-16T22:21:27Z,,NONE,,"When deploying Datasette to Cloud Run and rewriting certain routes from a Firebase app to the Cloud Run service, some of the URLs in the page start with `https://[service].run.app` rather than the (custom) domain of the Firebase app. I guess this is because a) the custom domain of the Firebase app isn't being passed through in the `host` header of the request to the Cloud Run instance and b) the `absolute_url` function in Datasette is using information from the request to build the URL. Would it be possible to not use the host name when building the absolute URLs, i.e. only include the path in the URL?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1895/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1433576351,I_kwDOBm6k_c5VcqOf,1880,Datasette with many and large databases > Memory use,525934,amitkoth,open,0,,,,,4,2022-11-02T18:10:27Z,2022-11-16T17:50:29Z,,NONE,,"> Datasette maintains an in-memory SQLite database with details of the databases, tables and columns for all of the attached databases. The above is from the docs ^. There are two problems here - the number of datasette ""instances"" in a single server/VM and the size of the database itself. We want the **opposite** of in-memory, including what happens on SQLite - documented in https://www.sqlite.org/inmemorydb.html From the context in https://github.com/simonw/datasette/issues/1150 - does it mean datasette is memory-bound to the size of the dataset - which might be a deal-breaker for many large-scale use cases? In an extreme case - let's say a single server had 100 SQLite databases, which would enable 100 ""instances"" of datasette to run, one per client (e.g. in a SaaS multi-tenant environment). How could we achieve all these goals: 1. Allow any _one_ of these 100 databases to grow to say 2Tb in size 2. Have one datasette instance, which connects to 1 of the 100 instances, based on incoming credentials/tenant ID 3. Minimize memory use entirely - both by datasette and SQLite, such that almost all operations are executed in real-time on-disk with little to no memory consumption per-tenant, or per-database.
Any ideas appreciated - we're looking to use this in a SaaS type of setting - many instances, single server. @simonw great work on datasette, in general! Possibly related to https://github.com/simonw/datasette/issues/1480 but we don't want use any kind of serverless infra - this is a long-running VM/server.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1880/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1450952393,I_kwDOCGYnMM5We8bJ,512,mypy failures in CI,9599,simonw,closed,0,,,,,3,2022-11-16T06:22:48Z,2022-11-16T07:49:51Z,2022-11-16T07:49:50Z,OWNER,,"https://github.com/simonw/sqlite-utils/actions/runs/3472012235 failed on Python 3.11: Truncated output: ``` sqlite_utils/db.py:2467: note: PEP 484 prohibits implicit Optional. Accordingly, mypy has changed its default to no_implicit_optional=True sqlite_utils/db.py:2467: note: Use https://github.com/hauntsaninja/no_implicit_optional to automatically upgrade your codebase sqlite_utils/db.py:2530: error: Incompatible default for argument ""where"" (default has type ""None"", argument has type ""str"") [assignment] sqlite_utils/db.py:2530: note: PEP 484 prohibits implicit Optional. Accordingly, mypy has changed its default to no_implicit_optional=True sqlite_utils/db.py:2530: note: Use https://github.com/hauntsaninja/no_implicit_optional to automatically upgrade your codebase sqlite_utils/db.py:2658: error: Argument 1 to ""count_where"" of ""Queryable"" has incompatible type ""Optional[str]""; expected ""str"" [arg-type] Found 23 errors in 1 file (checked 51 source files) ``` Best look at https://github.com/hauntsaninja/no_implicit_optional",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/512/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1447388809,I_kwDOBm6k_c5WRWaJ,1887,Add a confirm step to the drop table API,9599,simonw,closed,0,,,8658075,Datasette 1.0a0,2,2022-11-14T04:59:53Z,2022-11-15T19:59:59Z,2022-11-14T05:18:51Z,OWNER,,"> In playing with the API explorer just now I realized it's way too easy to accidentally drop a table using it. 
_Originally posted by @simonw in https://github.com/simonw/datasette/issues/1871#issuecomment-1313097057_ Added drop table API in: - #1874",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1887/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1429030341,I_kwDOBm6k_c5VLUXF,1874,API to drop a table,9599,simonw,closed,0,,,8658075,Datasette 1.0a0,4,2022-10-30T21:55:11Z,2022-11-15T19:59:53Z,2022-11-14T05:45:06Z,OWNER,,"`POST /db/table/-/drop` Require `drop-table` permission.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1874/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1425011030,I_kwDOBm6k_c5U7_FW,1862,"Create a new table from one or more records, `sqlite-utils` style",9599,simonw,closed,0,,,8658075,Datasette 1.0a0,5,2022-10-27T04:25:02Z,2022-11-15T19:59:47Z,2022-11-15T06:42:09Z,OWNER,,"It's interesting to also think about what the form-based UI for this could look like - since that would involve users creating new columns of different types on the fly. Will need the `create-table` permission.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1862/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1435294468,I_kwDOBm6k_c5VjNsE,1882,`/db/-/create` API for creating tables,9599,simonw,closed,0,,,8658075,Datasette 1.0a0,12,2022-11-03T21:44:32Z,2022-11-15T19:59:43Z,2022-11-15T06:00:41Z,OWNER,,"> It really feels like this should be accompanied by a `/db/-/create` API for creating tables. 
I had to add that to `sqlite-utils` eventually (initially it only supported creating by passing in an example document): > > https://sqlite-utils.datasette.io/en/stable/cli.html#cli-create-table _Originally posted by @simonw in https://github.com/simonw/datasette/issues/1862#issuecomment-1299073433_ ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1882/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1426001541,I_kwDOBm6k_c5U_w6F,1866,API for bulk inserting records into a table,9599,simonw,closed,0,,,8658075,Datasette 1.0a0,12,2022-10-27T17:19:25Z,2022-11-15T19:59:34Z,2022-10-30T06:04:07Z,OWNER,,"Similar to https://github.com/simonw/datasette-insert/blob/0.8/README.md#inserting-data-and-creating-tables I expect this to become by far the most common way that data gets into a Datasette instance - more so than the individual row API in: - #1851 ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1866/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1421544654,I_kwDOBm6k_c5UuwzO,1851,API to insert a single record into an existing table,9599,simonw,closed,0,,,8658075,Datasette 1.0a0,22,2022-10-24T22:24:21Z,2022-11-15T19:59:18Z,2022-10-28T00:59:25Z,OWNER,,Controlled by a new `insert-row` permission.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1851/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1426195437,I_kwDOBm6k_c5VAgPt,1868,Design URLs for the write API,9599,simonw,closed,0,,,8658075,Datasette 1.0a0,5,2022-10-27T19:55:30Z,2022-11-15T19:59:14Z,2022-10-27T20:07:01Z,OWNER,,"My original design for this issue: - #1851 Was `POST /db/table` with JSON of `{""insert"": {...}}`.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1868/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1447439985,I_kwDOBm6k_c5WRi5x,1888,API explorer should take immutability into account,9599,simonw,closed,0,,,8658075,Datasette 1.0a0,1,2022-11-14T06:00:14Z,2022-11-15T19:59:10Z,2022-11-14T06:04:48Z,OWNER,,"Refs: - #1871 I noticed the API explorer doesn't show any links on https://latest-1-0-dev.datasette.io/-/api because the `fixtures` database is immutable. 
It should still show read examples there.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1888/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1427293909,I_kwDOBm6k_c5VEsbV,1871,API explorer tool,9599,simonw,closed,0,,,8658075,Datasette 1.0a0,24,2022-10-28T13:49:11Z,2022-11-15T19:59:05Z,2022-11-14T04:59:59Z,OWNER,,The API will be much easier to develop if there's a page that helps you execute JSON POSTs against it.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1871/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1423369494,I_kwDOBm6k_c5U1uUW,1859,datasette create-token CLI command,9599,simonw,closed,0,,,8658075,Datasette 1.0a0,3,2022-10-26T03:12:59Z,2022-11-15T19:59:00Z,2022-10-26T04:31:39Z,OWNER,,The CLI equivalent of the `/-/create-token` page.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1859/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1423364990,I_kwDOBm6k_c5U1tN-,1858,`max_signed_tokens_ttl` setting for a maximum duration on API tokens,9599,simonw,closed,0,,,8658075,Datasette 1.0a0,4,2022-10-26T03:05:53Z,2022-11-15T19:58:52Z,2022-10-27T03:15:05Z,OWNER,,"It's currently possible to use `/-/create-token` to create a token that lasts forever. Some administrators may wish to have a maximum expiry instead. I should support that with a setting.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1858/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1423347412,I_kwDOBm6k_c5U1o7U,1857,Prevent API tokens from using /-/create-token to create more tokens,9599,simonw,closed,0,,,8658075,Datasette 1.0a0,1,2022-10-26T02:38:09Z,2022-11-15T19:57:11Z,2022-10-26T02:57:26Z,OWNER,,"> It strikes me that users should NOT be able to use a token to create additional tokens. > > The current design actually does allow that, since the `dstok_` Bearer token can be used to authenticate calls to the `/-/create-token` page. > > So I think I need a mechanism whereby that page can only allow access to users authenticated by cookie. > > Not obvious how to do that though, since Datasette's authentication actor system is designed to abstract that detail away! 
_Originally posted by @simonw in https://github.com/simonw/datasette/issues/1850#issuecomment-1291417100_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1857/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1423336122,I_kwDOBm6k_c5U1mK6,1856,allow_signed_tokens setting for disabling API signed token mechanism,9599,simonw,closed,0,,,8658075,Datasette 1.0a0,3,2022-10-26T02:20:55Z,2022-11-15T19:57:05Z,2022-10-26T02:58:35Z,OWNER,,"Had some design thoughts here: https://github.com/simonw/datasette/issues/1852#issuecomment-1291272280 I liked this option the most: --setting allow_create_tokens off",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1856/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1421552095,I_kwDOBm6k_c5Uuynf,1852,Default API token authentication mechanism,9599,simonw,closed,0,,,8658075,Datasette 1.0a0,30,2022-10-24T22:31:07Z,2022-11-15T19:57:00Z,2022-10-26T02:19:54Z,OWNER,,"API authentication will be via `Authorization: Bearer XXX` request headers. I'm inclined to add a default token mechanism to Datasette based on tokens that are signed with the `DATASETTE_SECRET`. Maybe the root user can access `/-/create-token` which provides a UI for generating a time-limited signed token? Could also have a `datasette token` command for creating such tokens at the command-line. Plugins can then define alternative ways of creating tokens, such as the existing https://datasette.io/plugins/datasette-auth-tokens plugin. _Originally posted by @simonw in https://github.com/simonw/datasette/issues/1850#issuecomment-1289706439_ ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1852/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1446657889,I_kwDOBm6k_c5WOj9h,1885,Integrate inside GUI app (tkinter),5115787,dmalves,open,0,,,,,0,2022-11-13T00:10:43Z,2022-11-13T00:11:09Z,,NONE,,"Hi, I'd like to integrate datasette inside a tkinter app. The app should be able to start/stop the Datasette server. How could I integrate datasette inside my app, so it can start and stop the Datasette server?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1885/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1435917503,I_kwDOBm6k_c5Vlly_,1883,Errors when using table filters behind a proxy,31312775,mattmalcher,closed,0,,,,,13,2022-11-04T11:18:47Z,2022-11-11T09:20:22Z,2022-11-11T06:54:58Z,NONE,,"Using datasette==0.63, table filters do not respect the `base_url` setting as described [here](https://docs.datasette.io/en/stable/deploying.html#running-datasette-behind-a-proxy) To reproduce, go to: https://datasette-apache-proxy-demo.datasette.io/prefix/fixtures/binary_data Then use the table filter buttons.
The `/prefix/` is dropped, resulting in URL not found: https://datasette-apache-proxy-demo.datasette.io/fixtures/binary_data?_sort=rowid&rowid__exact=1 ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1883/reactions"", ""total_count"": 1, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 1}",,completed 1436539554,I_kwDOCGYnMM5Vn9qi,511,"[insert_all, upsert_all] IntegrityError: constraint failed",7908073,chapmanjacobd,closed,0,,,,,2,2022-11-04T19:21:48Z,2022-11-04T22:59:54Z,2022-11-04T22:54:09Z,CONTRIBUTOR,,"My understanding is that `INSERT OR IGNORE` will ignore when inserts would cause duplicate keys so I'm not sure exactly why the error is raised from `sqlite3`. ``` import argparse from pathlib import Path from xklb import db, utils from xklb.utils import log def parse_args() -> argparse.Namespace: parser = argparse.ArgumentParser() parser.add_argument(""database"") parser.add_argument(""dbs"", nargs=""*"") parser.add_argument(""--upsert"") parser.add_argument(""--db"", ""-db"", help=argparse.SUPPRESS) parser.add_argument(""--verbose"", ""-v"", action=""count"", default=0) args = parser.parse_args() if args.db: args.database = args.db Path(args.database).touch() args.db = db.connect(args) log.info(utils.dict_filter_bool(args.__dict__)) return args def merge_db(args, source_db): source_db = str(Path(source_db).resolve()) s_db = db.connect(argparse.Namespace(database=source_db, verbose=args.verbose)) for table in [s for s in s_db.table_names() if not ""_fts"" in s and not s.startswith(""sqlite_"")]: log.info(""[%s]: %s"", source_db, table) with s_db.conn: data = s_db[table].rows with args.db.conn: if args.upsert: args.db[table].upsert_all(data, pk=args.upsert.split("",""), alter=True) else: args.db[table].insert_all(data, alter=True, replace=True) def merge_dbs(): args = parse_args() for s_db in args.dbs: merge_db(args, s_db) if __name__ == ""__main__"": merge_dbs() ``` ``` $ lb-dev merge video.db tube_71.db --upsert path -vv SQL: INSERT OR IGNORE INTO [media]([path]) VALUES(?); - params: ['https://archive.org/details/088ghostofachanceroygetssackedrevengeofthelivinglunchdvdripxvidphz'] ...
File ~/.local/lib/python3.10/site-packages/sqlite_utils/db.py:3122, in Table.insert_all(self, records, pk, foreign_keys, column_order, not_null, defaults, batch_size, hash_id, hash_id_columns, alter, ignore, replace, truncate, extracts, conversions, columns, upsert, analyze) 3116 all_columns += [ 3117 column for column in record if column not in all_columns 3118 ] 3120 first = False -> 3122 self.insert_chunk( 3123 alter, 3124 extracts, 3125 chunk, 3126 all_columns, 3127 hash_id, 3128 hash_id_columns, 3129 upsert, 3130 pk, 3131 conversions, 3132 num_records_processed, 3133 replace, 3134 ignore, 3135 ) 3137 if analyze: 3138 self.analyze() File ~/.local/lib/python3.10/site-packages/sqlite_utils/db.py:2887, in Table.insert_chunk(self, alter, extracts, chunk, all_columns, hash_id, hash_id_columns, upsert, pk, conversions, num_records_processed, replace, ignore) 2885 for query, params in queries_and_params: 2886 try: -> 2887 result = self.db.execute(query, params) 2888 except OperationalError as e: 2889 if alter and ("" column"" in e.args[0]): 2890 # Attempt to add any missing columns, then try again File ~/.local/lib/python3.10/site-packages/sqlite_utils/db.py:484, in Database.execute(self, sql, parameters) 482 self._tracer(sql, parameters) 483 if parameters is not None: --> 484 return self.conn.execute(sql, parameters) 485 else: 486 return self.conn.execute(sql) IntegrityError: constraint failed > /home/xk/.local/lib/python3.10/site-packages/sqlite_utils/db.py(484)execute() 482 self._tracer(sql, parameters) 483 if parameters is not None: --> 484 return self.conn.execute(sql, parameters) 485 else: 486 return self.conn.execute(sql) ``` ``` sqlite3 --version 3.36.0 2021-06-18 18:36:39 ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/511/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 473083260,MDU6SXNzdWU0NzMwODMyNjA=,50,"""Too many SQL variables"" on large inserts",9599,simonw,closed,0,,,,,4,2019-07-25T21:43:31Z,2022-11-04T14:38:36Z,2019-07-28T11:59:33Z,OWNER,,"Reported here: https://github.com/dogsheep/healthkit-to-sqlite/issues/9 It looks like there's a default limit of 999 variables - we need to be smart about that, maybe dynamically lower the batch size based on the number of columns.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/50/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 802513359,MDU6SXNzdWU4MDI1MTMzNTk=,1217,Possible to deploy as a python app (for Rstudio connect server)?,6165713,plpxsk,open,0,,,,,4,2021-02-05T22:21:24Z,2022-11-04T11:37:52Z,,NONE,,"Is it possible to deploy a `datasette` application as a python web app? In my enterprise, I have option to deploy python apps via [Rstudio Connect](https://github.com/rstudio/rsconnect-python), and I would like to publish a `datasette` dashboard for sharing. 
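A minimal sketch of one way this could work, assuming uvicorn is installed (the file name `datasette_app.py` is just an example, not an official deployment recipe):

```python
# datasette_app.py - a sketch, not an official deployment recipe
from datasette.app import Datasette

# Datasette exposes a standard ASGI application
ds = Datasette(files=['my_data.db'])
app = ds.app()

if __name__ == '__main__':
    import uvicorn
    uvicorn.run(app, host='127.0.0.1', port=8001)
```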
I welcome any pointers to converting `datasette serve` into a python app that can be run as something like `python datasette.py --my_data.db`",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1217/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1432037325,I_kwDOBm6k_c5VWyfN,1879,Make it easier to fix URL proxy problems,9599,simonw,open,0,,,,,5,2022-11-01T20:19:23Z,2022-11-01T20:33:52Z,,OWNER,,"This came up on Discord again today: figuring out how to run Datasette behind a proxy that might hide the incoming Host: header (and strip HTTPS) is really hard! https://discord.com/channels/823971286308356157/823971286941302908/1037012475322847263",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1879/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1429029604,I_kwDOCGYnMM5VLULk,506,Make `cursor.rowcount` accessible (wontfix),9599,simonw,closed,0,,,,,3,2022-10-30T21:51:55Z,2022-11-01T17:37:47Z,2022-11-01T17:37:13Z,OWNER,,"In building this Datasette feature on top of `sqlite-utils` I thought it might be useful to expose the number of rows that had been affected by a bulk insert or update - the `cursor.rowcount`: - https://github.com/simonw/datasette/issues/1866 This isn't currently exposed by `sqlite-utils`.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/506/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1431786951,I_kwDOBm6k_c5VV1XH,1876,SQL query should wrap on SQL interrupted screen,9599,simonw,closed,0,,,,,2,2022-11-01T17:14:01Z,2022-11-01T17:22:33Z,2022-11-01T17:22:33Z,OWNER,,"Just saw this: ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1876/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1430325103,I_kwDOCGYnMM5VQQdv,507,conn.execute: UnicodeEncodeError: 'utf-8' codec can't encode character,7908073,chapmanjacobd,closed,0,,,,,1,2022-10-31T18:49:51Z,2022-11-01T00:40:17Z,2022-11-01T00:40:16Z,CONTRIBUTOR,,"I'm not really sure what caused this and it happened in the middle of my program (after running for 35775 seconds). ``` Extracting metadata 49.9% (chunk 9893 of 19831) ... 
File ""/home/xk/.local/lib/python3.10/site-packages/xklb/fs_extract.py"", line 90, in extract_chunk args.db[""media""].insert_all(utils.list_dict_filter_bool(media), pk=""path"", alter=True, replace=True) File ""/home/xk/.local/lib/python3.10/site-packages/sqlite_utils/db.py"", line 3107, in insert_all self.insert_chunk( File ""/home/xk/.local/lib/python3.10/site-packages/sqlite_utils/db.py"", line 2872, in insert_chunk result = self.db.execute(query, params) File ""/home/xk/.local/lib/python3.10/site-packages/sqlite_utils/db.py"", line 483, in execute return self.conn.execute(sql, parameters) UnicodeEncodeError: 'utf-8' codec can't encode character '\udcc3' in position 62: surrogates not allowed ``` This might be relevant: https://stackoverflow.com/questions/31898353/python-cant-encode-with-surrogateescape I'm going to try re-running with ```py def execute( self, sql: str, parameters: Optional[Union[Iterable, dict]] = None ) -> sqlite3.Cursor: """""" Execute SQL query and return a ``sqlite3.Cursor``. :param sql: SQL query to execute :param parameters: Parameters to use in that query - an iterable for ``where id = ?`` parameters, or a dictionary for ``where id = :id`` """""" try: if self._tracer: self._tracer(sql, parameters) if parameters is not None: return self.conn.execute(sql, parameters) else: return self.conn.execute(sql) except UnicodeEncodeError: sql = sql.encode('utf-8', 'surrogatepass').decode('utf-8') if parameters is not None: parameters = parameters.encode('utf-8', 'surrogatepass').decode('utf-8') return self.execute(sql, parameters) ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/507/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1077560091,I_kwDODEm0Qs5AOkMb,61,"Data Pull fails for ""Essential"" level access to the Twitter API (for Documentation)",57161638,jmnickerson05,open,0,,,,,1,2021-12-11T14:59:41Z,2022-10-31T14:47:58Z,,NONE,,"Per Twitter documentation: https://developer.twitter.com/en/docs/twitter-api/getting-started/about-twitter-api#v2-access-leve This isn't any fault of twitter-to-sqlite of course, but it should probably be documented as a side-note. ![image](https://user-images.githubusercontent.com/57161638/145681272-8c85b3b9-be95-44ff-9760-1bafa4917ce2.png) And this is how I'm surfacing the message from utils.py: ![image](https://user-images.githubusercontent.com/57161638/145681005-2776c0ad-9822-4461-b43a-450ab2e828eb.png) ",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/61/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1428560020,I_kwDOBm6k_c5VJhiU,1872,"SITE-BUSTING ERROR: ""render_template() called before await ds.invoke_startup()""",192568,mroswell,closed,0,,,,,3,2022-10-30T02:28:39Z,2022-10-30T06:26:01Z,2022-10-30T06:26:01Z,CONTRIBUTOR,,"1. My https://list.saferdisinfectants.org/disinfectants/listN page (linked from https://SaferDisinfectants.org ) has been running beautifully for a year and a half, including a GitHub Actions workflow that's been routinely updating the database. 2. I received a recent report that the list page is down. I don't know when it went down, but the content is replaced with: ""render_template() called before await ds.invoke_startup()"" 3. The local datasette repo runs without incident. 4. 
The site is hosted on Vercel, linked to my GitHub repo. Perhaps some Vercel changes were made, but not by anyone on our side. Here is a screenshot of the current project settings: Here is a screenshot of the latest deployment status: This is my repository: https://github.com/mroswell/list-N (I notice: datasette==0.59 in my requirements.txt file) Because it's been a long while since I actively worked on this or any other datasette project, I forget a lot of what I knew at one point. Perhaps some configuration file could be missing? Or perhaps I just need to know the right incantation to add to that Vercel settings page. Help is welcome as the nonprofit org is soon hosting its annual conference, and we'd love to have the page working again. ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1872/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1426253476,I_kwDOBm6k_c5VAuak,1869,Release 0.63,9599,simonw,closed,0,,,,,3,2022-10-27T20:53:01Z,2022-10-27T22:24:38Z,2022-10-27T22:11:33Z,OWNER,,"Most of the release notes are already written: - https://github.com/simonw/datasette/releases/tag/0.63a0 - https://github.com/simonw/datasette/releases/tag/0.63a1",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1869/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1342430983,I_kwDOBm6k_c5QA98H,1786,Adjust height of textarea for no JS case,9599,simonw,closed,0,,,,,4,2022-08-18T01:15:15Z,2022-10-27T21:50:12Z,2022-08-18T16:06:09Z,OWNER,,"Datasette Lite: https://lite.datasette.io/?sql=https://gist.githubusercontent.com/simonw/1f8a91123ccefd8844187225b1832d7a/raw/5069075b86aa79358fbab3d4482d1d269077d632/recipes.sql#/data?sql=select+id%2C+name%2C+ingredients%2C+%28%0A++select+json_group_array%28value%29+from+json_each%28ingredients%29%0A++where+value+in+%28select+value+from+json_each%28%3Ap0%29%29%0A%29+as+matching_ingredients%0Afrom+recipes%0Awhere+json_array_length%28matching_ingredients%29+%3E+0%0Aorder+by+json_array_length%28matching_ingredients%29+desc&p0=%5B%22sugar%22%2C+%22cheese%22%5D ![46F8101E-8CE3-4F61-B200-F865E6B5DBCC](https://user-images.githubusercontent.com/9599/185270723-f55513b0-b561-434d-9d7c-4fe5be9756e0.jpeg) ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1786/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1424378012,I_kwDOBm6k_c5U5kic,1860,SQL query field can't begin with a comment,562352,CharlesNepote,closed,0,,,,,12,2022-10-26T16:55:31Z,2022-10-27T18:57:37Z,2022-10-27T04:21:40Z,NONE,,"![image](https://user-images.githubusercontent.com/562352/198085197-f26fcd61-4dac-4ca4-a346-e70f88a30ecc.png) SQL comments are **very** useful to explain the meaning of the query. It's currently impossible to put one at the beginning of the field, as seen in the screen capture: it leads to an error: `Statement must be a SELECT`.
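For example, SQLite itself is perfectly happy with a leading comment - a quick sketch demonstrating that (the table and column names are hypothetical):

```python
import sqlite3

conn = sqlite3.connect(':memory:')
conn.execute('create table products (error text)')
# SQLite accepts a query that starts with a comment:
rows = conn.execute('''
-- Count the most common data quality errors
select error, count(*) as n from products group by error order by n desc
''').fetchall()
```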
It would be great to make it possible because: * as the request is the title of the page: * it eases the search with search engines * it eases the search in the browser's URL field * it acts as a kind of title: the global meaning of the query is immediately understandable * some tools, such as Slack, shorten long URLs and display the beginning of the URLs (e.g. `https://example.org/products?sql=select+%28length%28data_quality_errors_ta[...]+%21%3D+%22%22+group+by+NB_of_issues+order+by+NB_of_issues+desc+limit+200`) Beginning a query with a comment is possible with SQLite. ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1860/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1425682079,I_kwDOBm6k_c5U-i6f,1865,Stop syncing main to master,9599,simonw,closed,0,,,,,1,2022-10-27T13:55:38Z,2022-10-27T13:58:27Z,2022-10-27T13:56:13Z,OWNER,,"I think it's been long enough now that I can drop the code that syncs the main branch to master. I originally added this for people who might be using `datasette publish ... --branch master` - which might only have been me anyway!",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1865/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 639072811,MDU6SXNzdWU2MzkwNzI4MTE=,849,Rename master branch to main,9599,simonw,closed,0,,,3268330,Datasette 1.0,10,2020-06-15T19:05:54Z,2022-10-27T13:57:08Z,2020-09-15T20:37:14Z,OWNER,,"I was waiting for consensus to form around this (and kind-of hoping for `trunk` since I like the tree metaphor) and it looks like `main` is it. I've seen convincing arguments against `trunk` too - it indicates that the branch has some special significance like in Subversion (where all branches come from trunk) when it doesn't. So `main` is better anyway.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/849/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1424980545,I_kwDOBm6k_c5U73pB,1861,"request.headers.get(""Content-Type"") fails",9599,simonw,open,0,,,,,0,2022-10-27T03:39:12Z,2022-10-27T03:39:12Z,,OWNER,,"Turns out this is case-sensitive - it needs to be: request.headers.get(""content-type"") != ""application/json"" That's not great usability.
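A tiny sketch of the kind of case-insensitive lookup I mean (the `get_header` helper is hypothetical; it assumes ASGI-style lower-cased header keys):

```python
headers = {'content-type': 'application/json'}  # as stored by the ASGI layer

def get_header(headers, name, default=None):
    # Normalize the requested name so 'Content-Type' matches 'content-type'
    return headers.get(name.lower(), default)

assert get_header(headers, 'Content-Type') == 'application/json'
```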
It should be case insensitive.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1861/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1420174670,I_kwDOBm6k_c5UpiVO,1849,NoneType' object has no attribute 'actor',9599,simonw,closed,0,,,,,5,2022-10-24T04:02:15Z,2022-10-26T21:13:40Z,2022-10-26T21:13:40Z,OWNER,,"``` File ""/usr/local/lib/python3.10/site-packages/datasette/templates/_crumbs.html"", line 3, in template {% set items=crumb_items(request=request, database=database, table=table) %} File ""jinja2/async_utils.py"", line 65, in auto_await return await t.cast(""t.Awaitable[V]"", value) File ""datasette/app.py"", line 638, in _crumb_items actor=request.actor, action=""view-instance"", default=True ``` From Sentry.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1849/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 642297505,MDU6SXNzdWU2NDIyOTc1MDU=,857,Comprehensive documentation for variables made available to templates,9599,simonw,closed,0,,,3268330,Datasette 1.0,1,2020-06-20T03:19:43Z,2022-10-26T02:58:17Z,2022-10-26T02:58:17Z,OWNER,,"Needed for the Datasette 1.0 release, so template authors can trust that Datasette is unlikely to break their templates.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/857/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1423182778,I_kwDOCGYnMM5U1Au6,505,Release sqlite-utils 3.30,9599,simonw,closed,0,,,,,2,2022-10-25T22:20:05Z,2022-10-25T22:41:26Z,2022-10-25T22:41:16Z,OWNER,,https://github.com/simonw/sqlite-utils/compare/3.29...defa2974c6d3abc19be28d6b319649b8028dc966,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/505/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1386562662,I_kwDOCGYnMM5SpURm,493,Tiny typographical error in install/uninstall docs,9599,simonw,open,0,,,,,3,2022-09-26T19:00:42Z,2022-10-25T21:31:15Z,,OWNER,,"Added in: - #483 I don't know how to fix this in Sphinx: I'm getting this: https://sqlite-utils.datasette.io/en/latest/cli.html#cli-install > The [insert –convert](https://sqlite-utils.datasette.io/en/latest/cli.html#cli-insert-convert) and [query –functions](https://sqlite-utils.datasette.io/en/latest/cli.html#cli-query-functions) options But I want it to display `insert --convert` and not `insert –convert` there. Here's the code: https://github.com/simonw/sqlite-utils/blob/85247038f70d7eb2f3e272cfeaa4c44459cafba8/docs/cli.rst#L2125",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/493/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1392690202,I_kwDOCGYnMM5TAsQa,495,Support JSON values returned from .convert() functions,649467,mhalle,closed,0,,,,,3,2022-09-30T16:33:49Z,2022-10-25T21:23:37Z,2022-10-25T21:23:28Z,NONE,,"When using the convert function on a JSON column, the result of the conversion function must be a string. 
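A minimal workaround sketch - serialize the structure back to a JSON string before returning it (the `items` table and `tags` column here are hypothetical):

```python
import json
from sqlite_utils import Database

db = Database('data.db')
# json.dumps() turns the Python list back into a JSON string,
# which is the string result that .convert() expects
db['items'].convert('tags', lambda v: json.dumps(sorted(json.loads(v))))
```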
If the return value is either a dict (object) or a list (array), the convert call will error out with an unhelpful user-defined function exception. It makes sense that since the original column value was a string and required conversion to data structures, the result should be converted back into a JSON string as well. However, other functions auto-convert to JSON string representation, so the fact that convert doesn't could be surprising. At least the documentation should note this requirement, because the sqlite error messages won't readily reveal the issue. If only sqlite's JSON column type meant something :)",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/495/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1393212964,I_kwDOCGYnMM5TCr4k,497,column_names,7908073,chapmanjacobd,closed,0,,,,,1,2022-10-01T03:34:21Z,2022-10-25T21:09:28Z,2022-10-25T21:09:28Z,CONTRIBUTOR,,"It would be nice to have a `column_names`. Similar to `table_names`. Or if you could get one or all of the following syntax to work for both Database and Table that might be even better: Style 1 - `if 'table1' in db` - `if 'col1' in db['table1']` Style 2 - `if 'table1' in db.tables` - `if 'col1' in db['table1'].columns` maybe the table ones actually work but I'm too lazy to check. I just know that I have to do: `[c.name for c in db['table1'].columns]` Edit: This is possible with `columns_dict`. I have actually used that before but I forgot about it. Feel free to close, but I do think accessing this data could be more consistent and intuitive.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/497/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1423069384,I_kwDOCGYnMM5U0lDI,504,"db.close() method, calling db.conn.close()",9599,simonw,closed,0,,,,,1,2022-10-25T20:50:50Z,2022-10-25T21:00:29Z,2022-10-25T20:57:47Z,OWNER,,"I ended up needing to use `db.conn.close()` to fix this issue: - #503 I think `.close()` should be a method on `Database` itself.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/504/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1423000702,I_kwDOCGYnMM5U0UR-,503,test_recreate failing on Windows Python 3.11,9599,simonw,closed,0,,,,,10,2022-10-25T20:01:41Z,2022-10-25T20:47:34Z,2022-10-25T20:45:43Z,OWNER,,"https://github.com/simonw/sqlite-utils/actions/runs/3323672128/jobs/5494726927 Related: - #502 ``` FAILED tests/test_recreate.py::test_recreate[True-True] - PermissionError: [WinError 32] The process cannot access the file because it is being used by another process: 'C:\\Users\\runneradmin\\AppData\\Local\\Temp\\pytest-of-runneradmin\\pytest-0\\test_recreate_True_True_0\\data.db' FAILED tests/test_recreate.py::test_recreate[False-True] - PermissionError: [WinError 32] The process cannot access the file because it is being used by another process: 'C:\\Users\\runneradmin\\AppData\\Local\\Temp\\pytest-of-runneradmin\\pytest-0\\test_recreate_False_True_0\\data.db' ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/503/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0,
""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1422973111,I_kwDOBm6k_c5U0Ni3,1854,Flaky test: test_serve_localhost_http,9599,simonw,closed,0,,,,,3,2022-10-25T19:37:35Z,2022-10-25T19:53:02Z,2022-10-25T19:53:02Z,OWNER,,Failing on Python 3.10 at the moment: https://github.com/simonw/datasette/actions/runs/3323629947/jobs/5494340302,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1854/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1422915587,I_kwDOBm6k_c5Uz_gD,1853,Upgrade Datasette Docker to Python 3.11,9599,simonw,closed,0,,,,,7,2022-10-25T18:44:31Z,2022-10-25T19:28:56Z,2022-10-25T19:05:16Z,OWNER,,"Related: - #1768 I think this base image looks right: [3.11.0-slim-bullseye](https://hub.docker.com/layers/library/python/3.11.0-slim-bullseye/images/sha256-244c0b0e6e7608a16f87382fc8a5ef3c330d042113a9a7b6fc15a95360181651?context=explore)",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1853/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1422954582,I_kwDOCGYnMM5U0JBW,502,Fix tests for Python 3.11,9599,simonw,closed,0,,,,,1,2022-10-25T19:20:31Z,2022-10-25T19:23:47Z,2022-10-25T19:23:47Z,OWNER,,"The way errors are represented has changed: https://github.com/simonw/sqlite-utils/actions/runs/3323588047/jobs/5494127154 ``` _________________________ test_query_invalid_function __________________________ db_path = '/tmp/pytest-of-runner/pytest-0/test_query_invalid_function0/test.db' def test_query_invalid_function(db_path): result = CliRunner().invoke( cli.cli, [db_path, ""select bad()"", ""--functions"", ""def invalid_python""] ) assert result.exit_code == 1 > assert ( result.output.strip() == ""Error: Error in functions definition: invalid syntax (, line 1)"" ) E AssertionError: assert 'Error: Error...ing>, line 1)' == 'Error: Error...ing>, line 1)' E - Error: Error in functions definition: invalid syntax (, line 1) E ? ^^^^^^ ^^^^^^ E + Error: Error in functions definition: expected '(' (, line 1) E ? ^^^^^^^ ^^^ ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/502/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1420090659,I_kwDOBm6k_c5UpN0j,1848,Private database page should show padlock on every table,9599,simonw,closed,0,,,,,3,2022-10-24T02:28:38Z,2022-10-24T02:50:29Z,2022-10-24T02:42:34Z,OWNER,,"Following: - #1829 https://latest.datasette.io/_internal looks like this: But those queries and tables are private too, and should also show the padlock icon.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1848/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1396948693,I_kwDOBm6k_c5TQ77V,1829,Table/database that is private due to inherited permissions does not show padlock,9599,simonw,closed,0,,,,,8,2022-10-04T23:14:16Z,2022-10-24T02:23:46Z,2022-10-24T02:11:37Z,OWNER,,"I noticed that a table page that is private because the database or instance is private, e.g. 
this one: Is not displaying the padlock icon that indicates the table is not visible to the public. Same issue for the database page too, which in this case is private due to `view-instance`.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1829/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1420055377,I_kwDOBm6k_c5UpFNR,1847,Both _local_metadata and _metadata_local?,9599,simonw,closed,0,,,,,2,2022-10-24T01:43:08Z,2022-10-24T01:53:13Z,2022-10-24T01:53:13Z,OWNER,,"Spotted this in the debugger against the `datasette` object while running tests (`pytest -k test_permissions_cascade` to be exact): ``` (Pdb) [p for p in dir(self) if p.startswith('_') and '__' not in p] ['_actor', '_asset_urls', '_connected_databases', '_crumb_items', '_local_metadata', '_metadata', '_metadata_local', '_metadata_recursive_update', '_permission_checks', '_plugins', '_prepare_connection', '_refresh_schemas', '_refresh_schemas_lock', '_register_custom_units', '_register_renderers', '_root_token', '_routes', '_secret', '_settings', '_show_messages', '_startup_hook_calculation', '_startup_hook_fired', '_startup_invoked', '_threads', '_versions', '_write_messages_to_response'] ``` ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1847/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1413641049,I_kwDOCGYnMM5UQnNZ,501,Tests failing due to updated tabulate library,9599,simonw,closed,0,,,,,4,2022-10-18T18:07:52Z,2022-10-18T18:23:40Z,2022-10-18T18:23:40Z,OWNER,,"Failure here: https://github.com/simonw/sqlite-utils/actions/runs/3275786702/jobs/5391063221 I figured out the problem: ```diff diff --git a/docs/cli-reference.rst b/docs/cli-reference.rst index b88e38a..82b4b6c 100644 --- a/docs/cli-reference.rst +++ b/docs/cli-reference.rst @@ -112,11 +112,15 @@ See :ref:`cli_query`. --tsv Output TSV --no-headers Omit CSV headers -t, --table Output as a formatted table - --fmt TEXT Table format - one of fancy_grid, fancy_outline, - github, grid, html, jira, latex, latex_booktabs, - latex_longtable, latex_raw, mediawiki, moinmoin, - orgtbl, pipe, plain, presto, pretty, psql, rst, - simple, textile, tsv, unsafehtml, youtrack + --fmt TEXT Table format - one of asciidoc, double_grid, + double_outline, fancy_grid, fancy_outline, github, + grid, heavy_grid, heavy_outline, html, jira, + latex, latex_booktabs, latex_longtable, latex_raw, + mediawiki, mixed_grid, mixed_outline, moinmoin, + orgtbl, outline, pipe, plain, presto, pretty, + psql, rounded_grid, rounded_outline, rst, simple, + simple_grid, simple_outline, textile, tsv, + unsafehtml, youtrack --json-cols Detect JSON cols and output them as JSON, not escaped strings -r, --raw Raw output, first column of first row @@ -176,11 +180,15 @@ See :ref:`cli_memory`. 
--tsv Output TSV --no-headers Omit CSV headers -t, --table Output as a formatted table - --fmt TEXT Table format - one of fancy_grid, fancy_outline, - github, grid, html, jira, latex, latex_booktabs, - latex_longtable, latex_raw, mediawiki, moinmoin, - orgtbl, pipe, plain, presto, pretty, psql, rst, - simple, textile, tsv, unsafehtml, youtrack + --fmt TEXT Table format - one of asciidoc, double_grid, + double_outline, fancy_grid, fancy_outline, github, + grid, heavy_grid, heavy_outline, html, jira, + latex, latex_booktabs, latex_longtable, latex_raw, + mediawiki, mixed_grid, mixed_outline, moinmoin, + orgtbl, outline, pipe, plain, presto, pretty, + psql, rounded_grid, rounded_outline, rst, simple, + simple_grid, simple_outline, textile, tsv, + unsafehtml, youtrack --json-cols Detect JSON cols and output them as JSON, not escaped strings -r, --raw Raw output, first column of first row @@ -401,11 +409,14 @@ See :ref:`cli_search`. --tsv Output TSV --no-headers Omit CSV headers -t, --table Output as a formatted table - --fmt TEXT Table format - one of fancy_grid, fancy_outline, - github, grid, html, jira, latex, latex_booktabs, - latex_longtable, latex_raw, mediawiki, moinmoin, - orgtbl, pipe, plain, presto, pretty, psql, rst, simple, - textile, tsv, unsafehtml, youtrack + --fmt TEXT Table format - one of asciidoc, double_grid, + double_outline, fancy_grid, fancy_outline, github, + grid, heavy_grid, heavy_outline, html, jira, latex, + latex_booktabs, latex_longtable, latex_raw, mediawiki, + mixed_grid, mixed_outline, moinmoin, orgtbl, outline, + pipe, plain, presto, pretty, psql, rounded_grid, + rounded_outline, rst, simple, simple_grid, + simple_outline, textile, tsv, unsafehtml, youtrack --json-cols Detect JSON cols and output them as JSON, not escaped strings --load-extension TEXT Path to SQLite extension, with optional :entrypoint @@ -651,11 +662,14 @@ See :ref:`cli_tables`. --tsv Output TSV --no-headers Omit CSV headers -t, --table Output as a formatted table - --fmt TEXT Table format - one of fancy_grid, fancy_outline, - github, grid, html, jira, latex, latex_booktabs, - latex_longtable, latex_raw, mediawiki, moinmoin, - orgtbl, pipe, plain, presto, pretty, psql, rst, simple, - textile, tsv, unsafehtml, youtrack + --fmt TEXT Table format - one of asciidoc, double_grid, + double_outline, fancy_grid, fancy_outline, github, + grid, heavy_grid, heavy_outline, html, jira, latex, + latex_booktabs, latex_longtable, latex_raw, mediawiki, + mixed_grid, mixed_outline, moinmoin, orgtbl, outline, + pipe, plain, presto, pretty, psql, rounded_grid, + rounded_outline, rst, simple, simple_grid, + simple_outline, textile, tsv, unsafehtml, youtrack --json-cols Detect JSON cols and output them as JSON, not escaped strings --columns Include list of columns for each table @@ -689,11 +703,14 @@ See :ref:`cli_views`. 
--tsv Output TSV --no-headers Omit CSV headers -t, --table Output as a formatted table - --fmt TEXT Table format - one of fancy_grid, fancy_outline, - github, grid, html, jira, latex, latex_booktabs, - latex_longtable, latex_raw, mediawiki, moinmoin, - orgtbl, pipe, plain, presto, pretty, psql, rst, simple, - textile, tsv, unsafehtml, youtrack + --fmt TEXT Table format - one of asciidoc, double_grid, + double_outline, fancy_grid, fancy_outline, github, + grid, heavy_grid, heavy_outline, html, jira, latex, + latex_booktabs, latex_longtable, latex_raw, mediawiki, + mixed_grid, mixed_outline, moinmoin, orgtbl, outline, + pipe, plain, presto, pretty, psql, rounded_grid, + rounded_outline, rst, simple, simple_grid, + simple_outline, textile, tsv, unsafehtml, youtrack --json-cols Detect JSON cols and output them as JSON, not escaped strings --columns Include list of columns for each view @@ -732,11 +749,15 @@ See :ref:`cli_rows`. --tsv Output TSV --no-headers Omit CSV headers -t, --table Output as a formatted table - --fmt TEXT Table format - one of fancy_grid, fancy_outline, - github, grid, html, jira, latex, latex_booktabs, - latex_longtable, latex_raw, mediawiki, moinmoin, - orgtbl, pipe, plain, presto, pretty, psql, rst, - simple, textile, tsv, unsafehtml, youtrack + --fmt TEXT Table format - one of asciidoc, double_grid, + double_outline, fancy_grid, fancy_outline, github, + grid, heavy_grid, heavy_outline, html, jira, + latex, latex_booktabs, latex_longtable, latex_raw, + mediawiki, mixed_grid, mixed_outline, moinmoin, + orgtbl, outline, pipe, plain, presto, pretty, + psql, rounded_grid, rounded_outline, rst, simple, + simple_grid, simple_outline, textile, tsv, + unsafehtml, youtrack --json-cols Detect JSON cols and output them as JSON, not escaped strings --load-extension TEXT Path to SQLite extension, with optional @@ -768,11 +789,14 @@ See :ref:`cli_triggers`. --tsv Output TSV --no-headers Omit CSV headers -t, --table Output as a formatted table - --fmt TEXT Table format - one of fancy_grid, fancy_outline, - github, grid, html, jira, latex, latex_booktabs, - latex_longtable, latex_raw, mediawiki, moinmoin, - orgtbl, pipe, plain, presto, pretty, psql, rst, simple, - textile, tsv, unsafehtml, youtrack + --fmt TEXT Table format - one of asciidoc, double_grid, + double_outline, fancy_grid, fancy_outline, github, + grid, heavy_grid, heavy_outline, html, jira, latex, + latex_booktabs, latex_longtable, latex_raw, mediawiki, + mixed_grid, mixed_outline, moinmoin, orgtbl, outline, + pipe, plain, presto, pretty, psql, rounded_grid, + rounded_outline, rst, simple, simple_grid, + simple_outline, textile, tsv, unsafehtml, youtrack --json-cols Detect JSON cols and output them as JSON, not escaped strings --load-extension TEXT Path to SQLite extension, with optional :entrypoint @@ -804,11 +828,14 @@ See :ref:`cli_indexes`. 
--tsv Output TSV --no-headers Omit CSV headers -t, --table Output as a formatted table - --fmt TEXT Table format - one of fancy_grid, fancy_outline, - github, grid, html, jira, latex, latex_booktabs, - latex_longtable, latex_raw, mediawiki, moinmoin, - orgtbl, pipe, plain, presto, pretty, psql, rst, simple, - textile, tsv, unsafehtml, youtrack + --fmt TEXT Table format - one of asciidoc, double_grid, + double_outline, fancy_grid, fancy_outline, github, + grid, heavy_grid, heavy_outline, html, jira, latex, + latex_booktabs, latex_longtable, latex_raw, mediawiki, + mixed_grid, mixed_outline, moinmoin, orgtbl, outline, + pipe, plain, presto, pretty, psql, rounded_grid, + rounded_outline, rst, simple, simple_grid, + simple_outline, textile, tsv, unsafehtml, youtrack --json-cols Detect JSON cols and output them as JSON, not escaped strings --load-extension TEXT Path to SQLite extension, with optional :entrypoint diff --git a/docs/cli.rst b/docs/cli.rst index 8bc4176..1d67e88 100644 --- a/docs/cli.rst +++ b/docs/cli.rst @@ -187,10 +187,15 @@ Available ``--fmt`` options are: cog.out(""\n"" + ""\n"".join('- ``{}``'.format(t) for t in tabulate.tabulate_formats) + ""\n\n"") .. ]]] +- ``asciidoc`` +- ``double_grid`` +- ``double_outline`` - ``fancy_grid`` - ``fancy_outline`` - ``github`` - ``grid`` +- ``heavy_grid`` +- ``heavy_outline`` - ``html`` - ``jira`` - ``latex`` @@ -198,15 +203,22 @@ Available ``--fmt`` options are: - ``latex_longtable`` - ``latex_raw`` - ``mediawiki`` +- ``mixed_grid`` +- ``mixed_outline`` - ``moinmoin`` - ``orgtbl`` +- ``outline`` - ``pipe`` - ``plain`` - ``presto`` - ``pretty`` - ``psql`` +- ``rounded_grid`` +- ``rounded_outline`` - ``rst`` - ``simple`` +- ``simple_grid`` +- ``simple_outline`` - ``textile`` - ``tsv`` - ``unsafehtml`` ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/501/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1413610718,I_kwDOCGYnMM5UQfze,500,Turn --flatten into a documented utility function,9599,simonw,closed,0,,,,,4,2022-10-18T17:43:36Z,2022-10-18T18:02:10Z,2022-10-18T18:00:40Z,OWNER,,The `--flatten` implementation isn't currently available to Python code - people have to roll their own implementation. Feedback from a conversation at DjangoCon.,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/500/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1410548368,I_kwDODFdgUs5UE0KQ,77,Feature: Support GitHub discussions,631242,frosencrantz,open,0,,,,,0,2022-10-16T16:53:38Z,2022-10-16T16:53:38Z,,CONTRIBUTOR,,"Hi @simonw I've been a happy user of this tool. Thank you for writing it and sharing it. I wanted to suggest a feature request to support Discussions. For example the VisiData project has discussions https://github.com/saulpw/visidata/discussions , and it would be useful if there was a way to pull that data into the database. 
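For reference, a rough sketch of what fetching those could look like against GitHub's GraphQL API (the query shape is my assumption, not code from this repo; YOUR_TOKEN is a placeholder for a personal access token):

```python
import requests

# Sketch: pull discussions for a repository via the GraphQL API
query = '''{
  repository(owner: ""saulpw"", name: ""visidata"") {
    discussions(first: 20) {
      nodes { number title body createdAt }
    }
  }
}'''
response = requests.post(
    'https://api.github.com/graphql',
    json={'query': query},
    headers={'Authorization': 'bearer YOUR_TOKEN'},
)
print(response.json())
```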
However, I'm not offering a pull request.",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/77/reactions"", ""total_count"": 2, ""+1"": 2, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1410305897,I_kwDOBm6k_c5UD49p,1845,Reconsider the Datasette first-run experience,9599,simonw,open,0,,,,,3,2022-10-15T22:21:31Z,2022-10-16T08:54:53Z,,OWNER,,"Had a really interesting conversation today about how hard it is to get from ""I installed Datasette"" to ""I've done something useful with it"": https://news.ycombinator.com/item?id=33216789#33218590 Spending some time focusing on that first-run experience feels very worthwhile.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1845/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1409679008,I_kwDOBm6k_c5UBf6g,1844,Update screenshots in documentation to match latest designs,9599,simonw,closed,0,,,,,18,2022-10-14T18:01:18Z,2022-10-14T23:51:46Z,2022-10-14T19:57:17Z,OWNER,,"https://docs.datasette.io/en/0.62/full_text_search.html has this out-of-date screenshot: ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1844/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1063982712,I_kwDODEm0Qs4_axZ4,60,Execution on Windows,1733616,bernard01,open,0,,,,,1,2021-11-26T00:24:34Z,2022-10-14T16:58:27Z,,NONE,,"My installation on Windows using pip has been successful. I have Python 3.6. How do I run twitter-to-sqlite? I cannot even figure out how ""auth"" is a command. I have python on my path: C:\prog\python\Python36;C:\prog\python\Python36\Scripts Where should the commands be executed, and where are the files created? Could some basics please be added to the documentation to get beginners started?",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/60/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 703246031,MDU6SXNzdWU3MDMyNDYwMzE=,51,github-to-sqlite should handle rate limits better,9599,simonw,open,0,,,,,4,2020-09-17T04:01:50Z,2022-10-14T16:34:07Z,,MEMBER,,From #50 - right now it will crash with an error if it hits the rate limit.
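A rough sketch of the sleep-and-retry behaviour this could use instead, assuming `requests` and GitHub's documented `X-RateLimit-Remaining` / `X-RateLimit-Reset` headers (the wrapper function is hypothetical):

```python
import time
import requests

def get_with_backoff(url, headers=None):
    # GitHub signals an exhausted limit with a 403 and X-RateLimit-Remaining: 0
    response = requests.get(url, headers=headers)
    if response.status_code == 403 and response.headers.get('X-RateLimit-Remaining') == '0':
        reset = int(response.headers.get('X-RateLimit-Reset', time.time() + 60))
        time.sleep(max(0, reset - time.time()) + 1)  # wait until the window resets
        response = requests.get(url, headers=headers)
    return response
```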
Since the rate limit information (including reset time) is available in the headers, it could automatically sleep and try again instead.,207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/51/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1397084281,I_kwDOBm6k_c5TRdB5,1831,If user can see table but NOT database/instance nav links should not display,9599,simonw,closed,0,,,,,10,2022-10-05T02:16:31Z,2022-10-13T21:52:04Z,2022-10-13T21:52:04Z,OWNER,,"Spotted this bug while building this plugin: - https://github.com/simonw/datasette-public This is a public table, but the two links in the nav go to forbidden pages: Those nav links shouldn't be shown at all.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1831/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1406860394,I_kwDOBm6k_c5T2vxq,1841,Drop format_bytes for Jinja filesizeformat filter,9599,simonw,open,0,,,,,0,2022-10-12T22:06:34Z,2022-10-12T22:06:34Z,,OWNER,,"Turns out this isn't necessary: https://github.com/simonw/datasette/blob/5aa359b86907d11b3ee601510775a85a90224da8/datasette/utils/__init__.py#L849-L858 I can use this instead: https://jinja.palletsprojects.com/en/3.1.x/templates/#jinja-filters.filesizeformat",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1841/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1361355564,I_kwDOCGYnMM5RJKMs,482,balanced table default column_order,7908073,chapmanjacobd,closed,0,,,,,1,2022-09-05T03:00:18Z,2022-10-10T17:43:02Z,2022-09-06T20:17:27Z,CONTRIBUTOR,,"Is there any performance or size difference with column order in SQLite? Similar to this https://www.cybertec-postgresql.com/en/column-order-in-postgresql-does-matter/ It might be interesting to have an option to create with an optimized column order. I'm assuming this would look something like INTEGER columns, REAL columns, BLOB columns, TEXT columns, NULL columns.
NULL columns at the end because they are more likely to be TEXT and it is impossible to know if they will become INTEGER (Of course, any schema evolution would reduce optimization but maybe column order could also be re-evaluated when schema changes) edit: this is easy to accomplish with the existing `transform` method: ``` int_columns = [k for k, v in table_columns.items() if v == int] db[table].transform(column_order=[*int_columns]) ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/482/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 912864936,MDU6SXNzdWU5MTI4NjQ5MzY=,1362,Consider using CSP to protect against future XSS,9599,simonw,open,0,,,,,17,2021-06-06T15:32:20Z,2022-10-08T18:42:09Z,,OWNER,,The XSS in #1360 would have been a lot less damaging if Datasette used CSP to protect against such vulnerabilities: https://developer.mozilla.org/en-US/docs/Web/HTTP/CSP,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1362/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1400374908,I_kwDOBm6k_c5TeAZ8,1836,docker image is duplicating db files somehow,536941,fgregg,open,0,,,,,13,2022-10-06T22:35:54Z,2022-10-08T16:56:51Z,,CONTRIBUTOR,,"if you look into the docker image created by docker publish, the `datasette inspect` line is duplicating the db files. here's the result of the inspect command: ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1836/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1157182254,I_kwDOBm6k_c5E-TMu,1646,Configuration directory mode does not pick up other file extensions than .db,15640196,dnsos,closed,0,,,,,3,2022-03-02T13:15:23Z,2022-10-07T23:06:17Z,2022-10-07T23:03:35Z,NONE,,"Hello, I've been trying to run Datasette with the [configuration directory mode](https://docs.datasette.io/en/stable/settings.html#configuration-directory-mode) with a structure such as this one: ```plain some-directory/ example.sqlite3 another-example.db one-more.custom [...] ``` (In my scenario I can't just change the filename extension without other problems arising) Now databases with the `.sqlite3` or the custom filename extension are ignored by Datasette in this case. I'm aware that the docs state that a `.db` extension is required, but I was wondering if there is a reason for restricting this or any workaround available? When I run `datasette example.sqlite3` or `datasette one-more.custom` the databases are served by Datasette without a problem. ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1646/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1015646369,I_kwDOBm6k_c48iYih,1480,Exceeding Cloud Run memory limits when deploying a 4.8G database,110420,ghing,open,0,,,,,9,2021-10-04T21:20:24Z,2022-10-07T04:39:10Z,,CONTRIBUTOR,,"When I try to deploy a 4.8G SQLite database to Google Cloud Run, I get this error message: > Memory limit of 8192M exceeded with 8826M used. 
> Consider increasing the memory limit, see https://cloud.google.com/run/docs/configuring/memory-limits

Unfortunately, the maximum amount of memory that can be allocated to an instance is 8192M. Naively profiling the memory usage of running Datasette with this database locally on my MacBook shows the following memory usage (using Activity Monitor) when I just start up Datasette locally:

- Real Memory Size: 70.6 MB
- Virtual Memory Size: 4.51 GB
- Shared Memory Size: 2.5 MB
- Private Memory Size: 57.4 MB

I'm trying to understand if there's a query or other operation that gets run during container deployment that causes memory use to be so large, and if this can be avoided somehow. This is somewhat related to #1082, but on a different platform, so I decided to open a new issue.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1480/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 860722711,MDU6SXNzdWU4NjA3MjI3MTE=,1301,Publishing to cloudrun with immutable mode?,5413548,louispotok,open,0,,,,,1,2021-04-18T17:51:46Z,2022-10-07T02:38:04Z,,CONTRIBUTOR,,"I'm a bit confused about immutable mode and publishing to cloudrun. (I want to publish with immutable mode so that I can support database downloads.) Running `datasette publish cloudrun --extra-options=""-i example.db""` leads to an error: > Error: Invalid value for '-i' / '--immutable': Path 'example.db' does not exist. However, running `datasette publish cloudrun example.db` not only works but seems to publish in immutable mode anyway! I'm seeing this both with `/-/databases.json` and the fact that downloads are working. When I just `datasette serve` locally, this succeeds both ways and works as expected.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1301/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1400083043,I_kwDOBm6k_c5Tc5Jj,1834,inspect data is not used for caching database hash,536941,fgregg,closed,0,,,,,0,2022-10-06T17:52:01Z,2022-10-06T20:06:21Z,2022-10-06T20:06:08Z,CONTRIBUTOR,,"When databases are loaded, https://github.com/simonw/datasette/blob/cb1e093fd361b758120aefc1a444df02462389a3/datasette/app.py#L257-L260 there is nothing preventing the rehashing of the database for immutable databases. https://github.com/simonw/datasette/blob/cb1e093fd361b758120aefc1a444df02462389a3/datasette/database.py#L50-L53 What I might expect is that relevant values of `inspect_data` get passed to the `Database` class to prevent re-hashing? With data that is many gigs large, this is a significant start-up time. ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1834/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1399933513,I_kwDOBm6k_c5TcUpJ,1833,Ability to submit long queries by POST,9599,simonw,open,0,,,,,0,2022-10-06T16:03:26Z,2022-10-06T16:18:00Z,,OWNER,,"Datasette doesn't limit URL lengths but some common web proxies do - the one in front of Google Cloud Run for example limits to 8KB total for incoming request headers: https://cloud.google.com/load-balancing/docs/quotas#https-lb-header-limits This means longer SQL queries can break!
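For illustration, a rough sketch (not part of Datasette) of how a client could check whether an encoded query URL will blow past such a proxy limit - the 8KB figure is taken from the Cloud Run documentation above, and the table name is hypothetical:

```python
from urllib.parse import urlencode

# Hypothetical long query - a few thousand IDs in an IN clause
sql = 'select * from big_table where id in ({})'.format(
    ', '.join(str(i) for i in range(2000))
)
url = 'https://example.com/mydb.json?' + urlencode({'sql': sql})
# The URL alone already exceeds the 8192 byte header budget
print(len(url))
```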
Need an optional mechanism for submitting queries by POST instead.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1833/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1387712501,I_kwDOBm6k_c5Sts_1,1824,Convert &_hide_sql=1 to #_hide_sql,562352,CharlesNepote,open,0,,,,,1,2022-09-27T12:53:31Z,2022-10-05T12:56:27Z,,NONE,,"Hiding the SQL textarea with `&_hide_sql=1` enforces a page reload, which can take several seconds and use server resources (which is annoying for big databases or complex queries). It could probably be done with a few lines of JavaScript (I'm going to see if I can do that).",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1824/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1397193691,I_kwDOBm6k_c5TR3vb,1832,__bool__ method on Results,9599,simonw,closed,0,,,,,2,2022-10-05T04:18:12Z,2022-10-05T04:32:33Z,2022-10-05T04:32:33Z,OWNER,,"Wrote this code today: https://github.com/simonw/datasette-public/blob/1401bfae50e71c1dfd2bfb6954f2e86d5a7ab21b/datasette_public/__init__.py#L41

```python
results = await db.execute(
    ""select 1 from _public_tables where table_name = ?"", [table_name]
)
if len(results):
    return True
```

Would be nice if I could use `if results` there instead.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1832/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1396977994,I_kwDOBm6k_c5TRDFK,1830,Add documentation for writing tests with signed actor cookies,9599,simonw,open,0,,,,,0,2022-10-04T23:51:26Z,2022-10-04T23:51:26Z,,OWNER,,"I use this pattern in a lot of plugin tests, e.g. https://github.com/simonw/datasette-edit-templates/blob/087f6a6cabc20020f2b0524f11aa3a7836320848/tests/test_edit_templates.py#L55-L58

```python
actor = ds.sign({""a"": {""id"": ""root""}}, ""actor"")
response1 = await ds.client.get(
    ""/-/edit-templates/_footer.html"", cookies={""ds_actor"": actor}
)
```

I should add this to the documentation on this page: https://docs.datasette.io/en/latest/testing_plugins.html",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1830/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1363552780,I_kwDOBm6k_c5RRioM,1805,truncate_cells_html does not work for links?,562352,CharlesNepote,open,0,,,,,7,2022-09-06T16:41:29Z,2022-10-03T09:18:06Z,,NONE,,"We have many links inside our dataset (please don't blame us ;-). When I use `--settings truncate_cells_html 60` it is not working for the links. E.g. https://images.openfoodfacts.org/images/products/000/000/000/088/nutrition_fr.5.200.jpg (87 chars) is not truncated: ![image](https://user-images.githubusercontent.com/562352/188689045-1946d776-2305-47cf-bfc5-b5685b9206b7.png) IMHO it would make sense that links should be treated as HTML.
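For illustration, a minimal sketch (not Datasette's actual implementation) of truncating only the visible text of a link while keeping the full URL as the href:

```python
def truncated_link(url, limit=60):
    # Keep the full URL in href; shorten only the display text,
    # preserving the file extension at the end
    display = url if len(url) <= limit else url[: limit - 9] + '[...]' + url[-4:]
    return '<a href=""{}"">{}</a>'.format(url, display)

print(truncated_link('https://images.openfoodfacts.org/images/products/000/000/000/088/nutrition_fr.5.200.jpg'))
```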
The link should work of course, but Datasette could truncate it: [https://images.openfoodfacts.org/images/products/00[...].jpg](https://images.openfoodfacts.org/images/products/000/000/000/088/nutrition_fr.5.200.jpg) ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1805/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,reopened 447469253,MDU6SXNzdWU0NDc0NjkyNTM=,485,Improvements to table label detection ,9599,simonw,open,0,9599,simonw,,,10,2019-05-23T06:19:49Z,2022-10-03T00:04:42Z,,OWNER,,"Label detection doesn't work if the primary key is called pk rather than id, so this page doesn't work: https://latest.datasette.io/fixtures/roadside_attraction_characteristics Code is here: https://github.com/simonw/datasette/blob/cccea85be6aaaeadb31f3b588ec7f732628815f5/datasette/app.py#L644-L653",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/485/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1393903845,I_kwDOBm6k_c5TFUjl,1828,word-wrap: anywhere resulting in weird display,9599,simonw,closed,0,,,,,2,2022-10-02T21:25:03Z,2022-10-02T23:01:17Z,2022-10-02T23:01:17Z,OWNER,,"e.g. on https://github-to-sqlite.dogsheep.net/github/commits This is from a change introduced here: https://github.com/simonw/datasette/commit/bf8d84af5422606597be893cedd375020cb2b369 in #1805 https://github.com/simonw/datasette/blob/bf8d84af5422606597be893cedd375020cb2b369/datasette/static/app.css#L447-L450",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1828/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1149661489,I_kwDOCGYnMM5EhnEx,409,`with db:` for transactions,9599,simonw,open,0,,,,,3,2022-02-24T19:22:06Z,2022-10-01T03:42:50Z,,OWNER,,This can be a documented wrapper around `with db.conn:`.,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/409/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 377155320,MDU6SXNzdWUzNzcxNTUzMjA=,370,Integration with JupyterLab,82988,psychemedia,open,0,,,,,4,2018-11-04T13:57:13Z,2022-09-29T08:17:47Z,,CONTRIBUTOR,,"I just watched a demo video for the [JupyterLab Chart Editor](https://www.crowdcast.io/e/introducing-JupyterLab-Chart-Editor/) which wraps the plotly chart editor app in a JupyterLab panel and lets you open a plotly chart JSON file in that editor. Essentially, it pops an HTML app into a panel in JupyterLab, and I think registers the app as a file viewer for a particular file type. (I'm not completely taken by it, tbh, because it means you can do irreproducible things to the chart definition file, but that's another issue). JupyterLab extensions can also open files from a dialogue as the iframe/html previewer shows: https://github.com/timkpaine/jupyterlab_iframe. This made me wonder about what `datasette` integration with JupyterLab might do. 
For example, by right-clicking on a CSV file (for which there is already a CSV table view) in the file browser, offer a *View / Run as datasette* file viewer option that will:

- run the CSV file through `csvs-to-sqlite`;
- launch the `datasette` server and display the `datasette` view in a JupyterLab panel.

(? Create a new SQLite db for each CSV file and launch each datasette view on a new port? Or have a JupyterLab (session?) SQLite db that stores all `datasette` viewed CSVs and runs on a single port?) As a freebie, the `datasette` API would allow you to run efficient SQL queries against the file, e.g. using `pandas.read_sql()` queries in a notebook in the same space. Related:

- [JupyterLab extensions docs](https://jupyterlab.readthedocs.io/en/stable/user/extensions.html)
- a [cookiecutter for writing JupyterLab extensions using Javascript](https://github.com/jupyterlab/extension-cookiecutter-js)
- a [cookiecutter for writing JupyterLab extensions using Typescript](https://github.com/jupyterlab/extension-cookiecutter-ts)
- tutorial: [Let’s Make an xkcd JupyterLab Extension](https://jupyterlab.readthedocs.io/en/stable/developer/xkcd_extension_tutorial.html)",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/370/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1122427321,I_kwDOBm6k_c5C5uG5,1624,Index page `/` has no CORS headers,9599,simonw,open,0,,,,,2,2022-02-02T21:56:10Z,2022-09-28T16:54:22Z,,OWNER,,"Compare the following:

```
% curl -I 'https://latest.datasette.io/fixtures'
HTTP/1.1 200 OK
link: https://latest.datasette.io/fixtures.json; rel=""alternate""; type=""application/json+datasette""
cache-control: max-age=5
referrer-policy: no-referrer
access-control-allow-origin: *
access-control-allow-headers: Authorization
access-control-expose-headers: Link
content-type: text/html; charset=utf-8
x-databases: _memory, _internal, fixtures, extra_database
Date: Wed, 02 Feb 2022 21:55:49 GMT
Server: Google Frontend
Transfer-Encoding: chunked

% curl -I 'https://latest.datasette.io/'
HTTP/1.1 200 OK
link: https://latest.datasette.io/.json; rel=""alternate""; type=""application/json+datasette""
content-type: text/html; charset=utf-8
x-databases: _memory, _internal, fixtures, extra_database
Date: Wed, 02 Feb 2022 21:55:52 GMT
Server: Google Frontend
Transfer-Encoding: chunked
```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1624/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 732674148,MDU6SXNzdWU3MzI2NzQxNDg=,1062,Refactor .csv to be an output renderer - and teach register_output_renderer to stream all rows,9599,simonw,open,0,,,3268330,Datasette 1.0,5,2020-10-29T21:25:02Z,2022-09-28T14:09:54Z,,OWNER,,This can drive the upgrade of the `register_output_renderer` hook to be able to handle streaming all rows in a large query.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1062/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1388631785,I_kwDOBm6k_c5SxNbp,1826,render_cell documentation example doesn't match the method signature,66709385,pjamargh,closed,0,,,,,3,2022-09-28T02:37:59Z,2022-09-28T04:30:28Z,2022-09-28T04:05:16Z,NONE,,"Open Datasette stable doc at
https://docs.datasette.io/en/stable/plugin_hooks.html?highlight=render_cell#render-cell-row-value-column-table-database-datasette The render_cell plugin hook method signature is `render_cell(row, value, column, table, database, datasette)`; the example shown inline uses `render_cell(value)`. ![image](https://user-images.githubusercontent.com/66709385/192674691-34265b81-6cdd-41d2-8424-aa12f8bc8c94.png) ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1826/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 459882902,MDU6SXNzdWU0NTk4ODI5MDI=,526,Stream all results for arbitrary SQL and canned queries,50578294,matej-fr,open,0,,,,,23,2019-06-24T13:09:45Z,2022-09-28T04:01:25Z,,NONE,,"I think that there is a difficulty with canned queries. When I want to stream all results of a canned query TwoDays I get only the first 1,000 records. Example: `http://myserver/history_sample/two_days.csv?_stream=on` returns only the first 1,000 records. If I do the same with the whole database, i.e. `http://myserver/history_sample/database.csv?_stream=on`, I correctly get all records. Any ideas?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/526/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1386854246,I_kwDOBm6k_c5Sqbdm,1822,Switch to keyword-only arguments for a bunch of internal methods,9599,simonw,open,0,,,3268330,Datasette 1.0,3,2022-09-26T23:20:38Z,2022-09-27T00:44:04Z,,OWNER,,"This is a good idea, and one that needs to happen before Datasette 1.0:

> While you are adding features, would you be future-proofing your APIs if you switched some arguments over to keyword-only arguments or would that be too disruptive?
>
> Thinking out loud:
>
> ```
> async def render_template(
>     self, templates, *, context=None, plugin_context=None, request=None, view_name=None
> ):
> ```

_Originally posted by @jefftriplett in https://github.com/simonw/datasette/issues/1817#issuecomment-1256781274_ ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1822/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1384273985,I_kwDOBm6k_c5SglhB,1817,Expose `sql` and `params` arguments to various plugin hooks,9599,simonw,open,0,,,,,7,2022-09-23T20:34:45Z,2022-09-27T00:27:53Z,,OWNER,,"On Discord: https://discord.com/channels/823971286308356157/996877076982415491/1022784534363787305

> Hi! I'm attempting to write a plugin that would provide some statistics on text fields (most common words, etc). I would want this information displayed in the table pages, and (ideally) also updated when users make custom queries from the table pages.
>
> It seems one way to do this would be to use the extra_template_vars hook, and make the appropriate SQL query there. So extra_template_vars would create a variable that is a list of most common words, and this is displayed on the page, possibly above the regular table view.
>
> Is there a way that the plugin code can access the SQL query (or even the data) that was used to produce the table view? I can see that TableView class constructs the SQL query, but I can't seem to find a way to access that information from the objects that are available to extra_template_vars.
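For context, a sketch of the workaround available today using the documented `extra_template_vars(template, database, table, columns, view_name, request, datasette)` hook - the plugin has to issue its own query against the table, because the hook does not receive the `sql`/`params` used by the table view (the count query here is a placeholder for real text statistics):

```python
from datasette import hookimpl


@hookimpl
def extra_template_vars(datasette, database, table):
    async def inner():
        if table is None:
            return {}
        db = datasette.get_database(database)
        # Placeholder for the real text-statistics query
        result = await db.execute('select count(*) from [{}]'.format(table))
        return {'row_count': result.first()[0]}

    return inner
```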
",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1817/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1385026210,I_kwDOBm6k_c5SjdKi,1819,Preserve query on timeout,2182,danp,closed,0,,,,,3,2022-09-25T13:32:31Z,2022-09-26T23:16:15Z,2022-09-26T23:06:06Z,CONTRIBUTOR,,"If a query hits the timeout it shows a message like: > SQL query took too long. The time limit is controlled by the [sql_time_limit_ms](https://docs.datasette.io/en/stable/settings.html#sql-time-limit-ms) configuration option. But the query is lost. Hitting the browser back button shows the query _before_ the one that errored. It would be nice if the query that errored was preserved for more tweaking. This would make it similar to how ""invalid syntax"" works since #1346 / #619.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1819/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1386734383,I_kwDOBm6k_c5Sp-Mv,1821,Release Datasette 0.63a0,9599,simonw,closed,0,,,,,1,2022-09-26T21:15:27Z,2022-09-26T22:06:39Z,2022-09-26T22:06:39Z,OWNER,,"> - The [prepare_jinja2_environment(env, datasette)](https://docs.datasette.io/en/latest/plugin_hooks.html#plugin-hook-prepare-jinja2-environment) plugin hook now accepts an optional `datasette` argument. Hook implementations can also now return an `async` function which will be awaited automatically. ([#1809](https://github.com/simonw/datasette/issues/1809)) > - `--load-extension` option now supports entrypoints. Thanks, Alex Garcia. ([#1789](https://github.com/simonw/datasette/pull/1789)) > - New tutorial: [Cleaning data with sqlite-utils and Datasette](https://datasette.io/tutorials/clean-data). > - Facet size can now be set per-table with the new `facet_size` table metadata option. ([#1804](https://github.com/simonw/datasette/issues/1804)) > - `truncate_cells_html` setting now also affects long URLs in columns. ([#1805](https://github.com/simonw/datasette/issues/1805)) > - `Database(is_mutable=)` now defaults to `True`. ([#1808](https://github.com/simonw/datasette/issues/1808)) > - Non-JavaScript textarea now increases height to fit the SQL query. ([#1786](https://github.com/simonw/datasette/issues/1786)) > - More detailed command descriptions on the [CLI reference](https://docs.datasette.io/en/latest/cli-reference.html#cli-reference) page. ([#1787](https://github.com/simonw/datasette/issues/1787)) > - Datasette no longer enforces upper bounds on its depenedencies. ([#1800](https://github.com/simonw/datasette/issues/1800)) > - Facets are now displayed with better line-breaks in long values. Thanks, Daniel Rech. ([#1794](https://github.com/simonw/datasette/pull/1794)) > - The `settings.json` file used in [Configuration directory mode](https://docs.datasette.io/en/latest/settings.html#config-dir) is now validated on startup. 
",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1821/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1386593843,I_kwDOCGYnMM5Spb4z,494,Document how to use Just,9599,simonw,closed,0,,,,,2,2022-09-26T19:25:12Z,2022-09-26T19:32:36Z,2022-09-26T19:26:39Z,OWNER,,"I'm using `just` a lot now, based on this file - I should add that to https://sqlite-utils.datasette.io/en/latest/contributing.html https://github.com/simonw/sqlite-utils/blob/afbd2b2cba45cccb305c3d4638d18db4dd3d4bbd/Justfile#L1-L24",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/494/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1363765916,I_kwDOCGYnMM5RSWqc,483,`sqlite-utils install` command,9599,simonw,closed,0,,,,,2,2022-09-06T20:13:55Z,2022-09-26T19:04:43Z,2022-09-26T18:57:15Z,OWNER,,"With the addition of `--functions` in: - #471 In addition to the existing `convert` command, there are now very good reasons to want to install additional packages into the same virtual environment as `sqlite-utils` itself, to allow them to be used with those features. This isn't easy if you installed the tool with `pipx` or `brew install sqlite-utils`. Datasette solved this problem with the `datasette install` command: - https://github.com/simonw/datasette/issues/925 `sqlite-utils` could benefit from the same idea.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/483/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1386530156,I_kwDOCGYnMM5SpMVs,492,Idea: ability to pass extra variables to `--convert` scripts,9599,simonw,open,0,,,,,1,2022-09-26T18:30:45Z,2022-09-26T18:33:19Z,,OWNER,,"Got this idea from this example in https://jeqo.github.io/notes/2022-09-24-ingest-logs-sqlite/

```bash
sqlite-utils insert /tmp/kafka-logs.db logs server.log.2022-09-24-21 --text --convert ""
import re
r = re.compile(r'^\[(?P\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2},\d{3})\] (?P\w+) (?P(.+(\n(?\!\[).+|)+))', re.MULTILINE)
def convert(text):
    rows = [m.groupdict() for m in r.finditer(text)]
    for row in rows:
        row.update({'server': 'localhost'})
        row.update({'component': 'broker'})
    return rows
""
```

And the accompanying note:

> The `row.update` allows to label rows as I’m planning to ingest logs from different hosts and potentially different components.

This made me think: it might be neat if you could inject additional variable values into that script with extra command-line options, to make this kind of reuse easier.
Something like this:

```bash
sqlite-utils insert /tmp/kafka-logs.db logs server.log.2022-09-24-21 --text --convert ""
import re
r = re.compile(r'^\[(?P\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2},\d{3})\] (?P\w+) (?P(.+(\n(?\!\[).+|)+))', re.MULTILINE)
def convert(text):
    rows = [m.groupdict() for m in r.finditer(text)]
    for row in rows:
        row.update({'server': server})
        row.update({'component': component})
    return rows
"" --var server ""localhost"" --var component ""broker""
```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/492/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1382457780,I_kwDOCGYnMM5SZqG0,490,Ability to insert multi-line files,6180701,jeqo,closed,0,,,,,4,2022-09-22T13:29:22Z,2022-09-26T18:24:44Z,2022-09-23T16:37:58Z,NONE,,"I was looking into how to parse application log files that contain multiline text (e.g. Java stack traces) into sqlite. I can see that at the moment `--lines` helps, but falls short when processing multi-line texts. I wonder if this functionality would be useful for sqlite-utils. A similar approach to Elastic logstash/filebeat can be adopted: https://www.elastic.co/guide/en/beats/filebeat/current/multiline-examples.html

Potential changes:
- add a `--multiline` option
- additional properties for:
  - multiline-pattern (regex expression)
  - multiline-negate: true/false
  - multiline-what: previous or next

Or if this is achievable in a different way, please share. Thanks!",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/490/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1217759117,I_kwDOBm6k_c5IlYeN,1727,Research: demonstrate if parallel SQL queries are worthwhile,9599,simonw,open,0,,,,,32,2022-04-27T18:54:21Z,2022-09-26T14:48:31Z,,OWNER,,"I added parallel SQL query execution here: - https://github.com/simonw/datasette/issues/1723 My hunch is that this will take advantage of multiple cores, since Python's `sqlite3` module releases the GIL once a query is passed to SQLite. I'd really like to prove this is the case though. Just not sure how to do it! Larger question: is this performance optimization actually improving performance at all? Under what circumstances is it worthwhile?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1727/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1082651698,I_kwDOCGYnMM5Ah_Qy,358,Support for CHECK constraints,11597658,luxint,open,0,,,,,7,2021-12-16T21:19:45Z,2022-09-25T07:15:59Z,,NONE,,"Hi, I noticed the `table.transform()` method doesn't have an option to add/change or drop a check constraint (see https://sqlite.org/lang_createtable.html -> 3.7 Check Constraints). It would be great to have this as an option!
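In the meantime, a sketch of the usual workaround (assuming a hypothetical `prices` table) - the same copy-and-rename dance `transform()` performs internally, but with a `CHECK` constraint in the new schema:

```python
import sqlite_utils

db = sqlite_utils.Database('data.db')
# Build the replacement table with the desired CHECK constraint
db.execute('''
    CREATE TABLE prices_new (
        id INTEGER PRIMARY KEY,
        amount INTEGER CHECK (amount >= 0)
    )
''')
db.execute('INSERT INTO prices_new SELECT id, amount FROM prices')
db.execute('DROP TABLE prices')
db.execute('ALTER TABLE prices_new RENAME TO prices')
db.conn.commit()
```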
",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/358/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 520508502,MDU6SXNzdWU1MjA1MDg1MDI=,31,"""friends"" command (similar to ""followers"")",9599,simonw,closed,0,,,,,2,2019-11-09T20:20:20Z,2022-09-20T05:05:03Z,2020-02-07T07:03:28Z,MEMBER,,"Current list of commands: ``` followers Save followers for specified user (defaults to... followers-ids Populate followers table with IDs of account followers friends-ids Populate followers table with IDs of account friends ``` Obvious omission here is `friends`, which would be powered by `https://api.twitter.com/1.1/friends/list.json`: https://developer.twitter.com/en/docs/accounts-and-users/follow-search-get-users/api-reference/get-friends-list",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/31/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1378640768,I_kwDOBm6k_c5SLGOA,1816,Validate settings.json on startup in configuration directory mode,9599,simonw,closed,0,,,,,2,2022-09-19T23:35:18Z,2022-09-20T01:15:48Z,2022-09-20T01:15:48Z,OWNER,,"> It might have been useful for Datasette to show an error when started against a `settings.json` file that contains an invalid setting though. _Originally posted by @simonw in https://github.com/simonw/datasette/issues/1814#issuecomment-1251677554_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1816/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1378495690,I_kwDOBm6k_c5SKizK,1814,Static files not served,4068,frafra,closed,0,,,,,2,2022-09-19T20:38:17Z,2022-09-19T23:35:06Z,2022-09-19T23:34:30Z,NONE,,"Folder structure: ``` bibliography/ bibliography/static-files bibliography/static-files/styles.css bibliography/bibliography.db bibliography/metadata.json bibliography/settings.json ``` ``` $ cat bibliography/settings.json { ""suggest_facets"": false, ""truncate_cells_html"": 1000, ""static"": ""assets:static-files/"" } ``` File `/assets/styles.css` is not found (HTTP 404, `Database not found: assets`). Using datasette revision d0737e4de51ce178e556fc011ccb8cc46bbb6359.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1814/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1378636455,I_kwDOBm6k_c5SLFKn,1815,"`datasette publish provider .` to publish whole directory, similar to configuration directory mode",9599,simonw,open,0,,,,,0,2022-09-19T23:28:59Z,2022-09-19T23:29:11Z,,OWNER,,"> I haven't done this with any of my other `datasette publish` tools, but I do think it's a good idea. Being able to publish the entire directory - with templates and plugins and metadata - does seem very useful to me. 
_Originally posted by @simonw in https://github.com/simonw/datasette-publish-fly/issues/23#issuecomment-1251673489_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1815/reactions"", ""total_count"": 2, ""+1"": 2, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1377811868,I_kwDOBm6k_c5SH72c,1813,missing next and next_url in JSON responses from an instance deployed on Fly ,883348,adipasquale,closed,0,,,,,1,2022-09-19T11:32:34Z,2022-09-19T11:34:45Z,2022-09-19T11:34:45Z,CONTRIBUTOR,,"👋 thank you for an incredibly useful project! I have noticed that my deployed instance on Fly does not include the `next` and `next_url` keys even for a truncated response. This is publicly accessible here: `https://collectif-objets-datasette.fly.dev/collectif-objets.json?sql=select+*+from+mairies` However, when I run the datasette server locally with the same data I get these next keys for the exact same query. I am wondering if I've missed some config or something specific to deployments on Fly.io? I am running datasette v0.62, without any specific config:

- locally: `poetry run datasette data/collectif-objets.sqlite`
- for the deploy: `poetry run datasette publish fly data/collectif-objets.sqlite`

as visible in [the Makefile](https://github.com/adipasquale/collectif-objets-datasette/blob/main/Makefile). _The very limited codebase is public but the sqlite db is not versioned yet because it is too large._",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1813/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1373595927,I_kwDOBm6k_c5R32kX,1809,`prepare_jinja2_environment()` hook should take `datasette` argument,9599,simonw,closed,0,,,,,11,2022-09-14T21:15:46Z,2022-09-17T03:39:05Z,2022-09-17T03:38:33Z,OWNER,,"That plugin hook's current signature is: https://github.com/simonw/datasette/blob/610425460b519e9c16d386cb81aa081c9d730ef0/datasette/hookspecs.py#L28-L30 As a result in the first alpha release of `datasette-edit-templates` I had to include this horrific hack: https://github.com/simonw/datasette-edit-templates/blob/087f6a6cabc20020f2b0524f11aa3a7836320848/datasette_edit_templates/__init__.py#L72-L75

```python
@hookimpl
def prepare_jinja2_environment(env):
    # TODO: This should ideally take datasette, but that's not an argument yet
    datasette = inspect.currentframe().f_back.f_back.f_back.f_back.f_locals[""self""]
```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1809/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1375792876,I_kwDOBm6k_c5SAO7s,1811,"Drop-down menu with ""REGEXP"" choice",562352,CharlesNepote,open,0,,,,,0,2022-09-16T11:06:18Z,2022-09-16T15:30:31Z,,NONE,,"The drop-down menu below could offer a ""REGEXP"" choice when the REGEXP SQLite extension is installed and usable. ![image](https://user-images.githubusercontent.com/562352/190675352-810fbdca-0827-4034-8b9f-fd67d5c35afb.png) Not sure.
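For background, a sketch of how the REGEXP operator is usually made available - SQLite rewrites `X REGEXP Y` into a call to a user-defined `regexp(Y, X)` function, so the presence of such a function could be what the menu checks for:

```python
import re
import sqlite3

conn = sqlite3.connect(':memory:')
# X REGEXP Y is evaluated as regexp(Y, X), i.e. (pattern, value)
conn.create_function(
    'regexp', 2, lambda pattern, value: re.search(pattern, value or '') is not None
)
print(conn.execute(""select 'haystack' regexp 'hay'"").fetchone())  # (1,)
```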
Close the issue if you don't find it relevant.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1811/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1374939463,I_kwDOCGYnMM5R8-lH,489,Ability to load JSON records held in a file with a single top level key that is a list of objects,9599,simonw,open,0,,,,,9,2022-09-15T18:46:03Z,2022-09-15T20:56:10Z,,OWNER,,"It's very common for JSON to look like this:

```json
{
    ""Version"": ""5.5.52.6"",
    ""List"": [
        {
            ""Description"": ""Nonpartisan"",
            ""Id"": 1,
            ""ExternalId"": """"
        },
        {
            ""Description"": ""Undeclared"",
            ""Id"": 2,
            ""ExternalId"": """"
        }
    ]
}
```

This example is taken from the records downloaded from https://www.elections.alaska.gov/election-results/e/ Right now you can't import this into `sqlite-utils` - you need to run it through `jq .List` first. But since this is so common, it would be neat if `sqlite-utils` could have a rule of thumb that says ""if it's an object, but it has a single key that is a list of objects, use that instead"".",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/489/reactions"", ""total_count"": 2, ""+1"": 2, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1366423176,I_kwDOCGYnMM5RcfaI,485,Progressbar not shown when inserting/upserting jsonlines file,99098079,MischaU8,closed,0,,,,,1,2022-09-08T14:13:18Z,2022-09-15T20:39:52Z,2022-09-15T20:37:52Z,CONTRIBUTOR,,"When inserting or upserting a jsonlines file, no progressbar is shown. Expected behavior is that, just like with .csv/.tsv files, a progressbar is also shown for a jsonlines file (--nl) unless --silent is provided.

```bash
sqlite-utils upsert mydb.db posts posts.jl --nl --pk post_id
(silence)
```

Currently `file_progress` is only called within the tsv/csv logic, however I think it can be safely wrapped around all the input formats that use `decoded`: https://github.com/simonw/sqlite-utils/blob/main/sqlite_utils/cli.py#L963",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/485/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1128466114,I_kwDOCGYnMM5DQwbC,406,Creating tables with custom datatypes,82988,psychemedia,open,0,,,,,5,2022-02-09T12:16:31Z,2022-09-15T18:13:50Z,,NONE,,"Via https://stackoverflow.com/a/18622264/454773 I note the ability to register custom handlers for novel datatypes that can map into and out of things like sqlite `BLOB`s. From a quick look and a quick play, I didn't spot a way to do this in `sqlite_utils`?
For example:

```python
# Via https://stackoverflow.com/a/18622264/454773
import sqlite3
import numpy as np
import io

def adapt_array(arr):
    """"""
    http://stackoverflow.com/a/31312102/190597 (SoulNibbler)
    """"""
    out = io.BytesIO()
    np.save(out, arr)
    out.seek(0)
    return sqlite3.Binary(out.read())

def convert_array(text):
    out = io.BytesIO(text)
    out.seek(0)
    return np.load(out)

# Converts np.ndarray to a BLOB when inserting
sqlite3.register_adapter(np.ndarray, adapt_array)

# Converts a BLOB back to np.ndarray when selecting
sqlite3.register_converter(""array"", convert_array)
```

```python
from sqlite_utils import Database

db = Database('test.db')

# Reset the database connection to use the parsed datatype
# sqlite_utils doesn't seem to support eg:
# Database('test.db', detect_types=sqlite3.PARSE_DECLTYPES)
db.conn = sqlite3.connect('test.db', detect_types=sqlite3.PARSE_DECLTYPES)

# Create a table the old fashioned way
# but using the new custom data type
vector_table_create = """"""
CREATE TABLE dummy
    (title TEXT, vector array );
""""""

cur = db.conn.cursor()
cur.execute(vector_table_create)

# sqlite_utils doesn't appear to support custom types (yet?!)
# The following errors on the ""array"" datatype
""""""
db[""dummy""].create({
    ""title"": str,
    ""vector"": ""array"",
})
""""""
```

We can then add / retrieve records from the database where the datatype of the `vector` field is a custom registered `array` type (which is to say, a `numpy` array):

```python
import numpy as np

db[""dummy""].insert({'title': ""test1"", 'vector': np.array([1, 2, 3])})

for row in db.query(""SELECT * FROM dummy""):
    print(row['title'], row['vector'], type(row['vector']))

""""""
test1 [1 2 3] 
""""""
```

It would be handy to be able to do this idiomatically in `sqlite_utils`.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/406/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1374626873,I_kwDOBm6k_c5R7yQ5,1810,Featured table(s) on the homepage,9599,simonw,open,0,,,,,4,2022-09-15T14:30:49Z,2022-09-15T15:51:25Z,,OWNER,,"Many Datasette instances mainly exist to serve a single table - for example: - https://global-power-plants.datasettes.com/global-power-plants/global-power-plants - https://laion-aesthetic.datasette.io/laion-aesthetic-6pls/images It would be neat if the / homepage of those instances could be configured to highlight that specific table. Or maybe more than one?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1810/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1367835380,I_kwDOCGYnMM5Rh4L0,487,Specify foreign key against compound key in other table,540968,ryanfox,closed,0,,,,,2,2022-09-09T13:32:09Z,2022-09-11T04:00:44Z,2022-09-11T04:00:44Z,NONE,,"When inserting rows via the library, is it possible to specify a foreign key to a compound primary key?
For example, suppose I create a table:

```
db = Database('events.db')
db['events'].insert_all([
    {'venue': 'Times Square', 'date': '2022-12-31', 'title': 'Rockin New Year Eve'},
    {'venue': 'Wembley Stadium', 'date': '2022-06-05', 'title': 'FA Cup'},
    {'venue': 'Times Square', 'date': '2021-12-31', 'title': 'Rockin New Year Eve'},
], pk=('date', 'venue'))
```

And I want to add related data in another table:

```
act = {
    'name': 'Rick Astley',
    'venue': 'Times Square',
    'date': '2021-12-31'
}
db['performers'].insert(act, pk=)
```

Is it possible to specify a value for `pk` that will point to the compound primary key in `events`? SQLite does support it: https://www.sqlite.org/foreignkeys.html#fk_composite",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/487/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1368030952,I_kwDOBm6k_c5Rin7o,1808,Database() constructor currently defaults is_mutable to False,9599,simonw,closed,0,,,,,5,2022-09-09T16:02:41Z,2022-09-09T16:37:57Z,2022-09-09T16:19:25Z,OWNER,,"This is surprising. It caused a bug in `datasette-upload-dbs` because I didn't expect it to do that. > I think this is an API design flaw in Datasette itself, but I can fix it here first. _Originally posted by @simonw in https://github.com/simonw/datasette-upload-dbs/issues/6#issuecomment-1242150394_ Code in question: https://github.com/simonw/datasette/blob/bf8d84af5422606597be893cedd375020cb2b369/datasette/database.py#L29-L32",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1808/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1366915240,I_kwDOBm6k_c5ReXio,1807,Plugin ecosystem needs to avoid crashes due to no available databases,9599,simonw,open,0,,,,,1,2022-09-08T19:54:34Z,2022-09-08T20:14:05Z,,OWNER,,"Opening this here to track the issue first reported in: - https://github.com/simonw/datasette-upload-dbs/issues/5 Plugins that expect to be able to write to a database need to not crash in situations where no writable database is available.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1807/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1365741480,I_kwDOBm6k_c5RZ4-o,1806,"UX to recover from Error 500: ""You can only execute one statement at a time.""",1470389,jieter,open,0,,,,,0,2022-09-08T08:01:27Z,2022-09-08T08:01:37Z,,NONE,,"When using the custom SQL query view, accidentally adding a semicolon in the middle of my query makes Datasette error with: > # Error 500 > You can only execute one statement at a time. The error view doesn't contain the query textarea anymore, so it provides no easy way to recover from the error. It would be nice if I could change and submit it again.
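For reference, the underlying error is straightforward to reproduce and catch, so a friendlier error page could re-render the form with the submitted `sql` intact (sketch only; the exception class has varied between `sqlite3.Warning` and `sqlite3.ProgrammingError` across Python versions):

```python
import sqlite3

sql = 'select 1; select 2'
try:
    sqlite3.connect(':memory:').execute(sql)
except (sqlite3.Warning, sqlite3.ProgrammingError) as ex:
    # Re-render the query form here, keeping `sql` populated
    print(ex)  # You can only execute one statement at a time.
```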
",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1806/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1363766973,I_kwDOCGYnMM5RSW69,484,Expose convert recipes to `sqlite-utils --functions`,9599,simonw,open,0,,,,,11,2022-09-06T20:15:08Z,2022-09-07T19:09:52Z,,OWNER,,"`--functions` was added in: - #471 It would be useful if the `r.jsonsplit()` and similar recipes for `sqlite-utils convert` could be used in these blocks of code too: https://sqlite-utils.datasette.io/en/stable/cli.html#sqlite-utils-convert-recipes",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/484/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1246826792,I_kwDODLZ_YM5KUREo,10,"When running `auth` command, don't overwrite an existing auth.json file",11887,ashanan,closed,0,,,,,3,2022-05-24T16:42:20Z,2022-09-07T15:07:38Z,2022-08-22T16:17:19Z,NONE,,"Ran the `auth` command in the same directory I'd previously set up an auth.json file for `twitter-to-sqlite` and it was completely overwritten. Not the biggest issue, but still unexpected. Ideally, for me, the keys would just be added to the existing file, but getting a warning and a chance to back out would be a good solution as well.",213286752,pocket-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/pocket-to-sqlite/issues/10/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1353441389,I_kwDOCGYnMM5Qq-Bt,477,Conda Forge,49702524,thewchan,closed,0,,,,,2,2022-08-28T19:03:08Z,2022-09-07T03:46:55Z,2022-09-07T03:46:55Z,NONE,,"Hello! I have successfully put this package on to Conda Forge, and I have extending the invitation for the owner/maintainers of this package to be maintainers on Conda Forge as well. Let me know if you are interested! Thanks. 
https://github.com/conda-forge/sqlite-utils-feedstock",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/477/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1352932716,I_kwDOCGYnMM5QpB1s,471,sqlite-utils query --functions mechanism for registering extra functions,9599,simonw,closed,0,,,8355157,3.29,12,2022-08-27T03:57:53Z,2022-09-07T03:46:26Z,2022-08-27T05:10:57Z,OWNER,,"It would be really cool if you could register additional custom SQL functions for use with the `sqlite-utils query` command - something like this:

```
sqlite-utils data.db 'update images set domain = extract_domain(url)' --functions '
from urllib.parse import urlparse

def extract_domain(url):
    return urlparse(url).netloc
'
```

Every function defined in that code block would be registered with the connection, unless the name began with an underscore.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/471/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 507454958,MDU6SXNzdWU1MDc0NTQ5NTg=,596,Handle really wide tables better,9599,simonw,open,0,,,,,9,2019-10-15T20:05:46Z,2022-09-07T00:58:41Z,,OWNER,,"If a table has hundreds of columns the Datasette UI starts getting unwieldy. Addressing this would be neat. One option would be to only select the first 30 columns by default and provide a UI for selecting more.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/596/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1363440999,I_kwDOBm6k_c5RRHVn,1804,Ability to set a custom facet_size per table,9599,simonw,closed,0,,,,,6,2022-09-06T15:11:40Z,2022-09-07T00:21:56Z,2022-09-06T18:06:53Z,OWNER,,"Suggestion from Discord: https://discord.com/channels/823971286308356157/823971286941302908/1016725586351247430 > Is it possible to limit the facet size per database or even per table? This is a really good idea, it could be done in `metadata.yml`.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1804/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1363244199,I_kwDODFdgUs5RQXSn,75,Fetch repos doesn't support organisations,2757699,OverkillGuy,open,0,,,,,0,2022-09-06T12:55:06Z,2022-09-06T12:55:06Z,,NONE,,"Say I want to get all my GitHub org's repos info, for data analysis. Not just the public repos, but also the private/internal repos.
The endpoints are different for organisations, and this tool doesn't take that into account: https://github.com/dogsheep/github-to-sqlite/blob/ace13ec3d98090d99bd71871c286a4a612c96a50/github_to_sqlite/utils.py#L453 https://github.com/dogsheep/github-to-sqlite/blob/ace13ec3d98090d99bd71871c286a4a612c96a50/github_to_sqlite/utils.py#L455 The endpoint for organisation repos is instead ([source](https://docs.github.com/en/rest/repos/repos#list-organization-repositories)): `url = ""https://api.github.com/orgs/{}/repos"".format(username)` Let's add support for organisation repo scraping.",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/75/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1362402998,I_kwDOBm6k_c5RNJ62,1802,Tests reliably failing on Python 3.7,9599,simonw,closed,0,,,,,15,2022-09-05T19:21:16Z,2022-09-06T00:40:20Z,2022-09-06T00:40:20Z,OWNER,," https://github.com/simonw/datasette/runs/8194907739?check_suite_focus=true I thought this might be an intermittent failure but attempts to re-run the tests have not made it pass. End of that trace is:

```
/home/runner/work/datasette/datasette/datasette/app.py:234: in __init__
    self._refresh_schemas_lock = asyncio.Lock()
/opt/hostedtoolcache/Python/3.7.13/x64/lib/python3.7/asyncio/locks.py:161: in __init__
    self._loop = events.get_event_loop()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 

    def get_event_loop(self):
        """"""Get the event loop for the current context.

        Returns an instance of EventLoop or raises an exception.
        """"""
        if (self._local._loop is None and
                not self._local._set_called and
                isinstance(threading.current_thread(), threading._MainThread)):
            self.set_event_loop(self.new_event_loop())

        if self._local._loop is None:
            raise RuntimeError('There is no current event loop in thread %r.'
>                              % threading.current_thread().name)
E           RuntimeError: There is no current event loop in thread 'MainThread'.
```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1802/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 903986178,MDU6SXNzdWU5MDM5ODYxNzg=,1344,Test Datasette Docker images built for different architectures,9599,simonw,open,0,,,,,10,2021-05-27T16:52:29Z,2022-09-06T00:07:58Z,,OWNER,,"Continuing on from #1319 - now that we have the ability to build Datasette's Docker image against multiple architectures we should test that it works. We can do this with QEMU emulation, see https://twitter.com/nevali/status/1397958044571602945",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1344/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1362363685,I_kwDOBm6k_c5RNAUl,1800,Remove upper bound dependencies as a default policy,9599,simonw,closed,0,,,,,3,2022-09-05T18:23:45Z,2022-09-05T18:39:52Z,2022-09-05T18:35:41Z,OWNER,,"https://iscinumpy.dev/post/bound-version-constraints/ has convinced me not to use upper bound dependencies unless I'm certain they are needed.
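Purely illustrative (not the exact lines from Datasette's `setup.py`, and `httpx` versions here are made up), the policy amounts to this kind of change:

```python
# Before: upper bound blocks compatible future releases
install_requires = ['httpx>=0.20,<0.24']

# After: lower bound only, per the policy above
install_requires = ['httpx>=0.20']
```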
Relevant PR: - https://github.com/simonw/datasette/pull/1799

Also:

https://github.com/simonw/datasette/blob/ba35105eee2d3ba620e4f230028a02b2e2571df2/setup.py#L45-L46
https://github.com/simonw/datasette/blob/ba35105eee2d3ba620e4f230028a02b2e2571df2/setup.py#L48-L49
https://github.com/simonw/datasette/blob/ba35105eee2d3ba620e4f230028a02b2e2571df2/setup.py#L51-L55
https://github.com/simonw/datasette/blob/ba35105eee2d3ba620e4f230028a02b2e2571df2/setup.py#L57-L59
https://github.com/simonw/datasette/blob/ba35105eee2d3ba620e4f230028a02b2e2571df2/setup.py#L75-L78
https://github.com/simonw/datasette/blob/ba35105eee2d3ba620e4f230028a02b2e2571df2/setup.py#L81-L82
",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1800/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 816526538,MDU6SXNzdWU4MTY1MjY1Mzg=,239,sqlite-utils extract could handle nested objects,9599,simonw,open,0,,,,,16,2021-02-25T15:10:28Z,2022-09-03T23:46:02Z,,OWNER,,"Imagine a table (imported from a nested JSON file) where one of the columns contains values that look like this: {""email"": ""anonymous@noreply.airtable.com"", ""id"": ""usrROSHARE0000000"", ""name"": ""Anonymous""} The `sqlite-utils extract` command already uses single text values in a column to populate a new table. It would not be much of a stretch for it to be able to use JSON instead, including specifying which of those values should be used as the primary key in the new table.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/239/reactions"", ""total_count"": 6, ""+1"": 5, ""-1"": 0, ""laugh"": 0, ""hooray"": 1, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 849978964,MDU6SXNzdWU4NDk5Nzg5NjQ=,1293,Show column metadata plus links for foreign keys on arbitrary query results,9599,simonw,open,0,,,,,51,2021-04-04T22:59:42Z,2022-09-02T17:34:09Z,,OWNER,,"Related to #620. It would be _really_ cool if Datasette could magically detect the source of the data displayed in an arbitrary query and, if that data represents a foreign key, display it as a hyperlink.
Compare https://latest.datasette.io/fixtures/facetable to https://latest.datasette.io/fixtures?sql=select+pk%2C+created%2C+planet_int%2C+on_earth%2C+state%2C+city_id%2C+neighborhood%2C+tags%2C+complex_array%2C+distinct_some_null+from+facetable+order+by+pk+limit+101 ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1293/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,reopened 1359604075,I_kwDOCGYnMM5RCelr,481,"Idea: `sqlite-utils create-table tablename --sql ""select ...""`",9599,simonw,open,0,,,,,0,2022-09-02T01:41:24Z,2022-09-02T01:42:08Z,,OWNER,,"Could offer syntactic sugar for:

```sql
create table foo as select * from bar
```

```
sqlite-utils create-table data.db foo --sql ""select * from bar""
```

https://sqlite-utils.datasette.io/en/stable/cli-reference.html#create-table",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/481/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1359557737,I_kwDOBm6k_c5RCTRp,1798,"Parts of YAML file do not work when db name is ""off""",562352,CharlesNepote,closed,0,,,,,4,2022-09-01T22:10:57Z,2022-09-02T00:02:53Z,2022-09-01T23:56:33Z,NONE,,"I guess this issue is not very important and probably rare. To reproduce:
* create and populate a db named `off.db`
* in the yaml file, add any kind of information below `databases:\n off:`
* the data are not taken into account (because ""off"" is interpreted as ""false"")

YAML file:
```yaml
title: Some title
description_html: |-
  <p>
  This is an experiment.
  </p>
databases:
  off:
    tables:
      products_from_owners:
        title: products_from_owners*
        description_html: |-
          <p>
          Description
          </p>
```

The result for http://xxxx.xxx/-/metadata gives:
```json
{
  ""title"": ""Some title"",
  ""description_html"": ""<p>\nThis is an experiment.\n</p>"",
  ""databases"": {
    ""false"": {
      ""tables"": {
        ""products_from_owners"": {
          ""title"": ""products_from_owners*"",
          ""description_html"": ""<p>\nDescription\n</p>""
"" } } } } } ``` => see the `""false""` instead of `""off""`.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1798/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1353074021,I_kwDOCGYnMM5QpkVl,474,Add an option for specifying column names when inserting CSV data,14294,hubgit,open,0,,,,,3,2022-08-27T15:29:59Z,2022-08-31T03:42:36Z,,NONE,,"https://sqlite-utils.datasette.io/en/stable/cli.html#csv-files-without-a-header-row > The first row of any CSV or TSV file is expected to contain the names of the columns in that file. > If your file does not include this row, you can use the `--no-headers` option to specify that the tool should not use that fist row as headers. > If you do this, the table will be created with column names called `untitled_1` and `untitled_2` and so on. You can then rename them using the `sqlite-utils transform ... --rename` command. It would be nice to be able to specify the column names when importing CSV/TSV without a header row, via an extra command line option. (renaming a column of a large table can take a long time, which makes it an inconvenient workaround)",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/474/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 855476501,MDU6SXNzdWU4NTU0NzY1MDE=,1298,improve table horizontal scroll experience,192568,mroswell,open,0,,,,,4,2021-04-12T01:55:16Z,2022-08-30T21:11:49Z,,CONTRIBUTOR,,"Wide tables aren't a huge problem if you know to click and drag right. But it's not at all obvious to do that. (it also tends to blue-select any content as it's dragging.) Depending on column widths, public users might entirely miss all the columns to the right. There is a scrollbar at the bottom of the table, but I'm displaying ALL my records because it's the only way for datasette-vega to make accurate charts. So that bottom scrollbar is likely to be missed. I wonder if some sort of javascript-y mouseover to an arrow might help, similar to those seen in image carousels. Ah: here's a perfect example: 1. Visit http://google.com 2. Search for: animals endangered 3. Note the 'g-right-button' (in the code) that looks like a right-facing caret in a circle. 4. Click on that and the carousel scrolls right (and 'g-left-button' appears on the left). Might be tricky to do that on a table, rather than a one-row carousel, but it's worth experimenting with. Another option is just to put the scrollbars at the top of the table, too. Meantime, I'm trying to build a button like the ""View/hide all columns on https://salaries.news.baltimoresun.com/salaries-be494cf/2019+Maryland+state+salaries Might be nice to have that available by default, with settings in the metadata showing which are on by default. (I saw some other closed issues related to horizontal scrolling, and admit I don't entirely understand them. For instance, the animated gif at https://github.com/simonw/datasette/issues/998#issuecomment-714117534 confuses me. 
) ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1298/reactions"", ""total_count"": 4, ""+1"": 4, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1355193529,I_kwDOCGYnMM5Qxpy5,479,OperationalError: cannot VACUUM from within a transaction,7908073,chapmanjacobd,open,0,,,,,0,2022-08-30T05:34:24Z,2022-08-30T05:34:24Z,,CONTRIBUTOR,,"Maybe when calling `.vacuum()` and other DB-level write-lock operations `sqlite_utils` could guard against this error message by automatically committing first? ``` 46 db[""media""].optimize() # type: ignore ---> 47 db.vacuum() File ~/.local/lib/python3.10/site-packages/sqlite_utils/db.py:1047, in Database.vacuum(self) 1045 def vacuum(self): 1046 ""Run a SQLite ``VACUUM`` against the database."" -> 1047 self.execute(""VACUUM;"") File ~/.local/lib/python3.10/site-packages/sqlite_utils/db.py:470, in Database.execute(self, sql, parameters) 468 return self.conn.execute(sql, parameters) 469 else: --> 470 return self.conn.execute(sql) OperationalError: cannot VACUUM from within a transaction ``` It might also be nice to add a sentence or two about how transactions are committed on the [docs page](https://sqlite-utils.datasette.io/en/latest/python-api.html#detect-fts). When I was swapping out my sqlite3 code for this library it was nice that everything was pretty much drop-in but I was/am unsure what to do about the places I explicitly call `.commit()` in my code Related to https://github.com/simonw/sqlite-utils/issues/121",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/479/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1353481513,I_kwDOCGYnMM5QrH0p,478,`sqlite-utils tables data.db table1 table2`,9599,simonw,open,0,,,,,1,2022-08-28T22:05:53Z,2022-08-28T22:22:35Z,,OWNER,,"The `sqlite-utils tables` command currently lists all tables. If you have a huge table in there then running it with `--counts` can get expensive, because of the huge table. Would be useful if it could accept an optional list of tables that it should execute against, as an alternative to the default of all of them. This should be a backwards compatible change. 
Current design is: https://sqlite-utils.datasette.io/en/stable/cli-reference.html#tables ``` Usage: sqlite-utils tables [OPTIONS] PATH List the tables in the database Example: sqlite-utils tables trees.db ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/478/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1353411865,I_kwDODEpn8M5Qq20Z,1,Problem with my user,2467,fernand0,open,0,,,,,0,2022-08-28T16:59:37Z,2022-08-28T16:59:37Z,,NONE,,"If I call the program with: inaturalist-to-sqlite inaturalist.db ftricas the program exits with an error: `Importing 36 observations Traceback (most recent call last): File ""/home/ftricas/.pyenv/versions/3.10.6/bin/inaturalist-to-sqlite"", line 8, in sys.exit(cli()) File ""/home/ftricas/.pyenv/versions/3.10.6/lib/python3.10/site-packages/click/core.py"", line 1130, in __call__ return self.main(*args, **kwargs) File ""/home/ftricas/.pyenv/versions/3.10.6/lib/python3.10/site-packages/click/core.py"", line 1055, in main rv = self.invoke(ctx) File ""/home/ftricas/.pyenv/versions/3.10.6/lib/python3.10/site-packages/click/core.py"", line 1404, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/home/ftricas/.pyenv/versions/3.10.6/lib/python3.10/site-packages/click/core.py"", line 760, in invoke return __callback(*args, **kwargs) File ""/home/ftricas/.pyenv/versions/3.10.6/lib/python3.10/site-packages/inaturalist_to_sqlite/cli.py"", line 51, in cli save_observation(observation, db) File ""/home/ftricas/.pyenv/versions/3.10.6/lib/python3.10/site-packages/inaturalist_to_sqlite/utils.py"", line 34, in save_observation db[""observations""] File ""/home/ftricas/.pyenv/versions/3.10.6/lib/python3.10/site-packages/sqlite_utils/db.py"", line 2965, in insert return self.insert_all( File ""/home/ftricas/.pyenv/versions/3.10.6/lib/python3.10/site-packages/sqlite_utils/db.py"", line 3068, in insert_all self.create( File ""/home/ftricas/.pyenv/versions/3.10.6/lib/python3.10/site-packages/sqlite_utils/db.py"", line 1564, in create self.db.create_table( File ""/home/ftricas/.pyenv/versions/3.10.6/lib/python3.10/site-packages/sqlite_utils/db.py"", line 951, in create_table sql = self.create_table_sql( File ""/home/ftricas/.pyenv/versions/3.10.6/lib/python3.10/site-packages/sqlite_utils/db.py"", line 765, in create_table_sql foreign_keys = self.resolve_foreign_keys(name, foreign_keys or []) File ""/home/ftricas/.pyenv/versions/3.10.6/lib/python3.10/site-packages/sqlite_utils/db.py"", line 702, in resolve_foreign_keys other_table = table.guess_foreign_table(column) File ""/home/ftricas/.pyenv/versions/3.10.6/lib/python3.10/site-packages/sqlite_utils/db.py"", line 2061, in guess_foreign_table raise NoObviousTable( sqlite_utils.db.NoObviousTable: No obvious foreign key table for column 'taxon' - tried ['taxon', 'taxons'] ` If I call the program with your user everything seems to go well and then, I can call the program with my own user without problems. Moreover, I can call the program again with my own user and everything goes well now. Additional info, the command: sqlite-utils tables inaturalist.db shows that the correct name can be 'taxons'. 
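A possible workaround, as an untested sketch (assuming the foreign key guessing only needs a `taxons` table to already exist): create that table before the first import:

```
sqlite-utils create-table inaturalist.db taxons id integer --pk=id
```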
There is another small problem with a warning: warnings.warn(""urllib3 ({}) or chardet ({})/charset_normalizer ({}) doesn't match a supported "" ",206202864,inaturalist-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/inaturalist-to-sqlite/issues/1/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1178546862,I_kwDOCGYnMM5GPzKu,420,Document how to use a `--convert` function that runs initialization code first,770231,strada,closed,0,,,,,12,2022-03-23T19:07:36Z,2022-08-28T11:34:37Z,2022-03-25T20:07:33Z,NONE,,"When I have an insert command with transform like this: ``` cat items.json | jq '.data' | sqlite-utils insert listings.db listings - --convert ' d = enchant.Dict(""en_US"") row[""is_dictionary_word""] = d.check(row[""name""]) ' --import=enchant --ignore ``` I noticed as the number of rows increases the operation becomes quite slow, likely due to the creation of the `d = enchant.Dict(""en_US"")` object for each row. Is there a way to share that instance `d` between transform function calls, like a shared context?",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/420/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1353196970,I_kwDOCGYnMM5QqCWq,476,Release notes for 3.29,9599,simonw,closed,0,,,8355157,3.29,2,2022-08-27T23:21:21Z,2022-08-28T04:07:15Z,2022-08-28T04:07:03Z,OWNER,,https://github.com/simonw/sqlite-utils/compare/3.28...104f37fa4d2e7e5999c1d829267b62c737f74d3e,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/476/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1348169997,I_kwDOCGYnMM5QW3EN,467,Mechanism for ensuring a table has all the columns,9599,simonw,closed,0,,,8355157,3.29,13,2022-08-23T15:50:23Z,2022-08-27T23:19:41Z,2022-08-27T23:17:56Z,OWNER,,Suggested by @jefftriplett on Discord: https://discord.com/channels/823971286308356157/997738192360964156/1011655389063958600,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/467/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1353189941,I_kwDOCGYnMM5QqAo1,475,table.default_values introspection property,9599,simonw,closed,0,,,8355157,3.29,1,2022-08-27T22:33:31Z,2022-08-27T22:44:46Z,2022-08-27T22:43:02Z,OWNER,,"> Interesting challenge with `default_value`: I need to be able to tell if the default values passed to `.create()` differ from those in the database already. > > Introspecting that is a bit tricky: > > ```pycon > >>> import sqlite_utils > >>> db = sqlite_utils.Database(memory=True) > >>> db[""blah""].create({""id"": int, ""name"": str}, not_null=(""name"",), defaults={""name"": ""bob""}) >
> <Table blah (id, name)>
> >>> db[""blah""].columns > [Column(cid=0, name='id', type='INTEGER', notnull=0, default_value=None, is_pk=0), Column(cid=1, name='name', type='TEXT', notnull=1, default_value=""'bob'"", is_pk=0)] > ``` > Note how a default value of the Python string `bob` is represented in the results of `PRAGMA table_info()` as `default_value=""'bob'""` - it's got single quotes added to it! > > So comparing default values from introspecting the database needs me to first parse that syntax. This may require a new table introspection method. _Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/468#issuecomment-1229279539_",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/475/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1353088849,I_kwDOBm6k_c5Qpn9R,1795,Consider automatically cleaning up curly quotes in searches,9599,simonw,open,0,,,,,0,2022-08-27T16:35:25Z,2022-08-27T16:35:25Z,,OWNER,,"If your phone helpfully adds curly quotes for you then phrase searches against FTS won't work: “Rebecca Sugar” In regular (not `?_searchmode=raw` search mode Datasette could clean these up for you to help avoid that mistake.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1795/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1199158210,I_kwDOCGYnMM5HebPC,423,.extract() doesn't set foreign key when extracted columns contain NULL value,37447552,jlieth,closed,0,,,,,1,2022-04-10T20:05:30Z,2022-08-27T14:45:04Z,2022-08-27T14:45:04Z,NONE,,"I've run into an issue with `extract` and I don't believe this is the intended behaviour. I'm working with a database with music listening information. Currently it has one large table `listens` that contains all information. I'm trying to normalize the database by extracting relevant columns to separate tables (`artists`, `tracks`, `albums`). Not every track has an album. A simplified demonstration with just `track_title` and `album_title` columns: ```ipython In [1]: import sqlite_utils In [2]: db = sqlite_utils.Database(memory=True) In [3]: db[""listens""].insert_all([ ...: {""id"": 1, ""track_title"": ""foo"", ""album_title"": ""bar""}, ...: {""id"": 2, ""track_title"": ""baz"", ""album_title"": None} ...: ], pk=""id"") Out[3]:
``` The track in the first row has an album, the second track doesn't. Now I extract album information into a separate column: ```ipython In [4]: db[""listens""].extract(columns=[""album_title""], table=""albums"", fk_column=""album_id"") Out[4]:
In [5]: list(db[""albums""].rows) Out[5]: [{'id': 1, 'album_title': 'bar'}, {'id': 2, 'album_title': None}] In [6]: list(db[""listens""].rows) Out[6]: [{'id': 1, 'track_title': 'foo', 'album_id': 1}, {'id': 2, 'track_title': 'baz', 'album_id': None}] ``` This behaves as expected -- the `album` table contains entries for both the existing album and the NULL album. The `listens` table has a foreign key only for the first row (since the album in the second row was empty). Now I want to extract the track information as well. Album information belongs to the track so I want to extract both columns to a new table. ```ipython In [7]: db[""listens""].extract(columns=[""track_title"", ""album_id""], table=""tracks"", fk_column=""track_id"") Out[7]:
In [8]: list(db[""tracks""].rows) Out[8]: [{'id': 1, 'track_title': 'foo', 'album_id': 1}, {'id': 2, 'track_title': 'baz', 'album_id': None}] In [9]: list(db[""listens""].rows) Out[9]: [{'id': 1, 'track_id': 1}, {'id': 2, 'track_id': None}] ``` Extracting to the `tracks` table worked fine (both tracks are present with correct columns). However, the `listens` table only has a foreign key to the newly created tracks for the first row, the foreign key in the second row is NULL. Changing the order of extracts doesn't help. I poked around in the source a bit and I believe [this line](https://github.com/simonw/sqlite-utils/blob/433813612ff9b4b501739fd7543bef0040dd51fe/sqlite_utils/db.py#L1737) (essentially comparing `NULL = NULL`) is the problem, but I don't know enough about SQL to create a reliable fix myself.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/423/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1352932038,I_kwDOCGYnMM5QpBrG,470,Upgrade `--load-extension` to accept entrypoints like Datasette,9599,simonw,closed,0,,,8355157,3.29,6,2022-08-27T03:53:20Z,2022-08-27T05:55:49Z,2022-08-27T05:55:48Z,OWNER,,"Imitate: - https://github.com/simonw/datasette/pull/1789 ``` # would load default entrypoint like before datasette data.db --load-extension ext # loads the extensions with the ""sqlite3_foo_init"" entrpoint datasette data.db --load-extension ext:sqlite3_foo_init # loads the extensions with the ""sqlite3_bar_init"" entrpoint datasette data.db --load-extension ext:sqlite3_bar_init ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/470/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1352946135,I_kwDOCGYnMM5QpFHX,472,Reuse the locals/globals fix from --functions for other code accepting options,9599,simonw,closed,0,,,8355157,3.29,2,2022-08-27T05:12:05Z,2022-08-27T05:20:12Z,2022-08-27T05:20:12Z,OWNER,,"I figured out a workaround for the ugly `global x` hack here: - https://github.com/simonw/sqlite-utils/issues/471#issuecomment-1229120653",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/472/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1352931464,I_kwDOCGYnMM5QpBiI,469,sqlite-utils rows --order option,9599,simonw,closed,0,,,8355157,3.29,1,2022-08-27T03:49:51Z,2022-08-27T04:30:49Z,2022-08-27T04:10:32Z,OWNER,,"For consistency with `search`: https://sqlite-utils.datasette.io/en/stable/cli-reference.html#search ``` -o, --order TEXT Order by ('column' or 'column desc') ``` I wanted to run `sqlite-utils rows db.db mytable --order 'rowid desc'` to see the most recently imported rows.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/469/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1320243134,I_kwDOCGYnMM5OsU--,458,Support custom names for registered functions,9599,simonw,closed,0,,,8355157,3.29,1,2022-07-28T00:13:00Z,2022-08-27T03:56:01Z,2022-07-28T00:13:57Z,OWNER,,"In this example: ```python @db.register_function def reverse_string(s): return 
"""".join(reversed(list(s))) print(db.execute('select reverse_string(""hello"")').fetchone()[0]) ``` There's currently no way to over-ride the automatically selected name for the SQL function.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/458/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1339663518,I_kwDOBm6k_c5P2aSe,1784,"Include ""entrypoint"" option on `--load-extension`?",15178711,asg017,closed,0,,,,,2,2022-08-16T00:22:57Z,2022-08-23T18:34:31Z,2022-08-23T18:34:31Z,CONTRIBUTOR,,"## Problem SQLite extensions have the option to define multiple ""entrypoints"" in each loadable extension. For example, the upcoming version of `sqlite-lines` will have 2 entrypoints: the default `sqlite3_lines_init` (which SQLite will automatically guess for) and `sqlite3_lines_noread_init`. The `sqlite3_lines_noread_init` version omits functions that read from the filesystem, which is necessary for security purposes when running untrusted SQL (which Datasette does). (Similar multiple entrypoints will also be added for sqlite-http). The `--load-extension` flag, however, doesn't give the option to specify a different entrypoint, so the default one is always used. ## Proposal I want there to be a new command line option of the `--load-extension` flag to specify a custom entrypoint like so: ``` datasette my.db \ --load-extension ./lines0 sqlite3_lines0_noread_init ``` Then, under the hood, this line of code: https://github.com/simonw/datasette/blob/7af67b54b7d9bca43e948510fc62f6db2b748fa8/datasette/app.py#L562 Would look something like this: ```python conn.execute(""SELECT load_extension(?, ?)"", [extension, entrypoint]) ``` One potential problem: For backward compatibility, I'm not sure if Click allows cli flags to have variable number of options (""arity""). So I guess it could also use a `:` delimiter like `--static`: ``` datasette my.db \ --load-extension ./lines0:sqlite3_lines0_noread_init ``` Or maybe even a new flag name? ``` datasette my.db \ --load-extension-entrypoint ./lines0 sqlite3_lines0_noread_init ``` Personally I prefer the `:` option... and maybe even `--load-extension` -> `--load`? Definitely out of scope for this issue tho ``` datasette my.db \ --load./lines0:sqlite3_lines0_noread_init ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1784/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1347717749,I_kwDOBm6k_c5QVIp1,1791,Updating metadata.json on Datasette for MacOS,1780782,ment4list,open,0,,,,,1,2022-08-23T10:41:16Z,2022-08-23T13:29:51Z,,NONE,,"I've installed Datasette for Mac as per [the documentation](https://docs.datasette.io/en/stable/installation.html#datasette-desktop-for-mac) and it's working great! However, I'm not sure how to go about adding something like ""[Canned Queries](https://docs.datasette.io/en/stable/sql_queries.html#canned-queries)"" or utilising other advanced features or settings by manipulating the `metadata.json` or `settings.json` files. I can view these files from the Datasette App from the top right ""burger"" menu but it only shows the contents of the file with no way to edit or change it. Am I missing something? Where can I update the `metadata.json` file using the MacOS App? PS: This is a fantastic tool! 
Thanks so much for all the effort and especially adding a bunch of different ways to get started quickly!",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1791/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1345452427,I_kwDODLZ_YM5QMfmL,11,"-a option is used for ""--auth"" and for ""--all""",2467,fernand0,closed,0,,,,,3,2022-08-21T10:50:48Z,2022-08-21T21:11:57Z,2022-08-21T21:11:57Z,NONE,,"I'm not sure which option is best, instead of -a -all.",213286752,pocket-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/pocket-to-sqlite/issues/11/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1345561209,I_kwDOBm6k_c5QM6J5,1790,A better HTML title for canned query pages,9599,simonw,open,0,,,,,0,2022-08-21T18:27:46Z,2022-08-21T18:27:46Z,,OWNER,,"https://scotrail.datasette.io/scotrail/assemble_sentence?terms=This+train+is+formed+of%2Cbomb+which Current title is: `scotrail: with phrases as ( select key, value from json_each('["' || replace(:terms, ',', '","') || '"]')),matches as (select phrases.key, phrases.value, ( select File from announcements where announcements.Transcription like '%' || trim(phrases.value) || '%' order by length(announcements.Transcription) limit 1 ) as Filefrom phrases),results as ( select key, announcements.Transcription, announcements.mp3 from announcements join matches on announcements.File = matches.File order by key)select 'Combined sentence:' as mp3, group_concat(Transcription, ' ') as Transcription, -1 as keyfrom results unionselect mp3, Transcription, keyfrom resultsorder by key` I think a better title would be: `scotrail: assemble_sentence, terms = This train is formed of,bomb which`",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1790/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1343732788,I_kwDOBm6k_c5QF7w0,1788,Make it more obvious that Datasette publish can publish multiple databases,9599,simonw,closed,0,,,,,0,2022-08-18T22:57:51Z,2022-08-18T23:06:16Z,2022-08-18T23:06:16Z,OWNER,,Feedback initially for `datasette-publish-fly` but it applies to the others too.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1788/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1343422749,I_kwDOBm6k_c5QEwEd,1787,"Move ""datasette --get"" from Getting Started to CLI Reference",9599,simonw,closed,0,,,,,5,2022-08-18T17:53:39Z,2022-08-18T21:57:09Z,2022-08-18T21:56:21Z,OWNER,,It really shouldn't be here: https://docs.datasette.io/en/0.62/getting_started.html#datasette-get,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1787/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1338001039,I_kwDOCGYnMM5PwEaP,464,Link from documentation to source code,9599,simonw,closed,0,,,,,5,2022-08-13T16:19:57Z,2022-08-17T23:38:03Z,2022-08-17T23:38:03Z,OWNER,,Twitter conversation asking for ways to automate this here: 
https://twitter.com/simonw/status/1558260492015046656,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/464/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1340900019,I_kwDOBm6k_c5P7IKz,1785,Can't use cog menu to facet by first column in a view,9599,simonw,open,0,,,,,0,2022-08-16T21:27:23Z,2022-08-16T21:27:23Z,,OWNER,,"https://latest.datasette.io/fixtures/paginated_view Compare with: ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1785/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1337541526,I_kwDOBm6k_c5PuUOW,1780,`facet_time_limit_ms` and `sql_time_limit_ms` overlap?,53165,davepeck,open,0,,,,,1,2022-08-12T17:55:37Z,2022-08-15T23:50:08Z,,NONE,,"I needed more than the default 200ms to facet a specific column in a database I was working with, so I ran `datasette` with `--setting facet_time_limit_ms 30000` — definitely overkill! But it still didn't work; it took a moment to realize I also needed to up my `sql_time_limit_ms` to something larger too. I'm happy to submit a PR that documents this behavior if it's helpful. Or, if there's a code change we'd like to make (like making sure `sql_time_limit_ms` is always set to the larger of itself and `facet_time_limit_ms`), happy to do that too. Apologies if I missed this somewhere in the docs. And: thanks. I'm really enjoying the simple, effective tooling datasette gives me out of the box for exploring my databases!",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1780/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1339444565,I_kwDOBm6k_c5P1k1V,1783,Better guidance as to what to do after you've installed Datasette,9599,simonw,open,0,,,,,2,2022-08-15T20:11:06Z,2022-08-15T20:14:01Z,,OWNER,,"Feedback [from Discord](https://discord.com/channels/823971286308356157/823971286941302908/1008822978793984060): > hello, love the project and came for help and to point out a possible gap in the docs. starting with ""getting started"" and ""installation"" every thing looks great, but then there's a giant leap after you have it installed and running. from the user perspective of ""i have a csv of set of csvs that i want to turn into a table(s), what do i do next?"" --- so something like maybe a page for creating your first project should go after ""installation"". 
- https://docs.datasette.io/en/0.62/getting_started.html - https://docs.datasette.io/en/0.62/installation.html",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1783/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1334628400,I_kwDOBm6k_c5PjNAw,1779,google cloudrun updated their limits on maxscale based on memory and cpu count,536941,fgregg,closed,0,,,8303187,Datasette 0.62,13,2022-08-10T13:27:21Z,2022-08-14T19:42:59Z,2022-08-14T17:07:34Z,CONTRIBUTOR,,"if you don't set an explicit limit on container scaling, then [google defaults to 100](https://cloud.google.com/run/docs/configuring/max-instances#limits) google recently updated the [limits on container scaling](https://cloud.google.com/run/docs/configuring/max-instances#limits), such that if you set up datasette to use more memory or cpu, then you need to set the maxScale argument much smaller than 100. would be nice if `datasette publish` could do this math for you and set the right maxScale. [Log of an failing publish run](https://github.com/labordata/warehouse/runs/7764725972?check_suite_focus=true#step:8:332). ``` ERROR: (gcloud.run.deploy) spec.template.spec.containers[0].resources.limits.cpu: Invalid value specified for cpu. For the specified value, maxScale may not exceed 15. Consider running your workload in a region with greater capacity, decreasing your requested cpu-per-instance, or requesting an increase in quota for this region if you are seeing sustained usage near this limit, see https://cloud.google.com/run/quotas. Your project may gain access to further scaling by adding billing information to your account. Traceback (most recent call last): File ""/home/runner/.local/bin/datasette"", line 8, in sys.exit(cli()) File ""/home/runner/.local/lib/python3.8/site-packages/click/core.py"", line 1128, in __call__ return self.main(*args, **kwargs) File ""/home/runner/.local/lib/python3.8/site-packages/click/core.py"", line 1053, in main rv = self.invoke(ctx) File ""/home/runner/.local/lib/python3.8/site-packages/click/core.py"", line 1659, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/home/runner/.local/lib/python3.8/site-packages/click/core.py"", line 1659, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/home/runner/.local/lib/python3.8/site-packages/click/core.py"", line 1395, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/home/runner/.local/lib/python3.8/site-packages/click/core.py"", line 754, in invoke return __callback(*args, **kwargs) File ""/home/runner/.local/lib/python3.8/site-packages/datasette/publish/cloudrun.py"", line 160, in cloudrun check_call( File ""/usr/lib/python3.8/subprocess.py"", line 364, in check_call raise CalledProcessError(retcode, cmd) subprocess.CalledProcessError: Command 'gcloud run deploy --allow-unauthenticated --platform=managed --image gcr.io/labordata/datasette warehouse --memory 8Gi --cpu 2' returned non-zero exit status 1. 
```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1779/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1338278056,I_kwDOBm6k_c5PxICo,1782,Release notes for Datasette 0.62,9599,simonw,closed,0,,,8303187,Datasette 0.62,2,2022-08-14T15:26:45Z,2022-08-14T17:40:45Z,2022-08-14T17:32:54Z,OWNER,,"I've written a lot of these already for the alphas: - https://github.com/simonw/datasette/releases/tag/0.62a0 - https://github.com/simonw/datasette/releases/tag/0.62a1",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1782/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1218133366,I_kwDOBm6k_c5Imz12,1728,Writable canned queries fail with useless non-error against immutable databases,127565,wragge,closed,0,,,8303187,Datasette 0.62,13,2022-04-28T03:10:34Z,2022-08-14T16:34:40Z,2022-08-14T16:34:40Z,CONTRIBUTOR,,"I've been banging my head against a wall for a while and would appreciate any pointers... - I have a writeable canned query to update rows in the db. - I'm using the github-oauth plugin for authentication. - I have `allow` set on the query to accept my GitHub id and a GH organisation. - Authentication seems to work as expected both locally and on Cloudrun -- viewing `/-/actor` gives the same result in both environments - I can access the 'padlocked' canned query in both environments. Everything seems to be the same, but the canned query works perfectly when run locally, and fails when I try it on Cloudrun. I'm redirected back to the canned query page and the db is not changed. There's nothing in the Cloudstor logs to indicate an error. 
Any clues as to where I should be looking for the problem?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1728/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1223527226,I_kwDOBm6k_c5I7Ys6,1738,"""Cannot use _sort and _sort_desc at the same time""",9599,simonw,closed,0,,,8303187,Datasette 0.62,2,2022-05-03T01:06:24Z,2022-08-14T16:13:55Z,2022-08-14T16:13:55Z,OWNER,,"Triggered this error while playing with the sort desc checkbox and the apply button that are only visible on this page at mobile screen width: https://latest.datasette.io/fixtures/compound_three_primary_keys?_sort_desc=pk1 Navigate to that page (with the browser narrow enough to show the box), un-check the box and click Apply: ![sort-bug](https://user-images.githubusercontent.com/9599/166390804-cb289b29-63dc-4986-b7f9-81cf2ae04914.gif) Also notable: I managed to get to a page with `?_sort_desk=pk1` in the URL three times by clicking around with that button.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1738/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1318907685,I_kwDOBm6k_c5OnO8l,1773,500 error if sorted by a column not in the ?_col= list,9599,simonw,closed,0,,,8303187,Datasette 0.62,4,2022-07-27T01:20:27Z,2022-08-14T16:06:25Z,2022-08-14T15:44:05Z,OWNER,,"For example: https://latest.datasette.io/fixtures/sortable?_sort_desc=sortable&_col=sortable_with_nulls That's `?_sort_desc=sortable&_col=sortable_with_nulls` ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1773/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1296222572,I_kwDOBm6k_c5NQsls,1768,Upgrade to 3.10.6-slim-bullseye Docker base image,9599,simonw,closed,0,,,8303187,Datasette 0.62,5,2022-07-06T18:37:49Z,2022-08-14T15:54:36Z,2022-08-14T15:54:11Z,OWNER,,For the package published to Docker Hub and also the containers used by `datasette package` and `datasette publish cloudrun`.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1768/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1306492437,I_kwDOBm6k_c5N334V,1770,`handle_exception` plugin hook for custom error handling,9599,simonw,closed,0,,,8303187,Datasette 0.62,14,2022-07-15T20:52:49Z,2022-08-14T15:25:51Z,2022-08-14T15:25:51Z,OWNER,,"I need this for a couple of plugins, both of which are broken at the moment: - https://github.com/simonw/datasette-sentry/issues/1 - https://github.com/simonw/datasette-show-errors/issues/2",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1770/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1338137350,I_kwDOBm6k_c5PwlsG,1781,Ensure Datasette Lite is promoted in docs and README,9599,simonw,closed,0,,,8303187,Datasette 0.62,1,2022-08-14T05:12:35Z,2022-08-14T15:24:40Z,2022-08-14T15:24:40Z,OWNER,,As of 0.62 https://lite.datasette.io is a supported piece of the overall Datasette ecosystem.,107914493,datasette,issue,,,"{""url"": 
""https://api.github.com/repos/simonw/datasette/issues/1781/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1326349129,I_kwDOCGYnMM5PDntJ,461,Consider including animated SVG console demos,9599,simonw,open,0,,,,,1,2022-08-02T20:10:04Z,2022-08-02T20:12:14Z,,OWNER,,"I recorded this one using https://github.com/nbedos/termtosvg - with `pipx install termtosvg` and then `termtosvg` - execute demo - `exit` to save. ![sqlite-utils-insert-json](https://user-images.githubusercontent.com/9599/182464206-f4976af4-eda8-4020-8257-4ada1867fb44.svg) ```json [ { ""id"": 1, ""name"": ""Catimus"" }, { ""id"": 2, ""name"": ""Feliopia"" } ] ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/461/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1324659241,I_kwDOCGYnMM5O9LIp,459,Single quoted transform recipes on Windows do not work as expected ,19921,shakeel,open,0,,,,,0,2022-08-01T16:14:54Z,2022-08-01T16:14:54Z,,CONTRIBUTOR,,"Trying to follow the tutorial for sqlite-utils and datasette https://datasette.io/tutorials/clean-data on Windows 11 OS `Microsoft Windows [Version 10.0.22622.440]`, with sqlite-utils and datasette installed using pipx. ``` pipx list package datasette 0.61.1, installed using Python 3.10.4 - datasette.exe package sqlite-utils 3.28, installed using Python 3.10.4 - sqlite-utils.exe ``` In the step to transform dates into ISO dates the quoted value `'r.parsedatetime(value)'` is copied verbatim into the columns instead of applying the output of the Python recipe. ``` sqlite-utils convert manatees.db locations \ REPDATE created_date last_edited_date \ 'r.parsedatetime(value)' --dry-run 1975/01/31 00:00:00+00 --- becomes: r.parsedatetime(value) Would affect 13568 rows ``` However, if I change the code from single quotes to double quotes, it works as expected. ``` sqlite-utils convert manatees.db locations \ REPDATE created_date last_edited_date \ ""r.parsedatetime(value)"" --dry-run 1975/01/31 00:00:00+00 --- becomes: 1975-01-31T00:00:00+00:00 Would affect 13568 rows ``` Specifying the transform code recipe should work with single quotes on Windows.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/459/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1323332006,I_kwDOBm6k_c5O4HGm,1774,Request of feature for mongo,428820,johnfelipe,open,0,,,,,0,2022-07-31T01:00:05Z,2022-07-31T01:00:05Z,,NONE,,Will love if can we use datasette for mongo and all pipelines and workflows,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1774/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 838245338,MDU6SXNzdWU4MzgyNDUzMzg=,1272,Unit tests for the Dockerfile,9599,simonw,open,0,,,,,3,2021-03-23T01:36:29Z,2022-07-29T10:22:59Z,,OWNER,,"Working on the Dockerfile in #1249 made me wish for automated tests - to confirm that it boots up correctly, can run SpatiaLite and doesn't have weird bugs like the `/db` page hanging. 
These could run in CI too, but maybe only if the `Dockerfile` is updated.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1272/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 779156520,MDU6SXNzdWU3NzkxNTY1MjA=,1175,Use structlog for logging,9599,simonw,open,0,,,,,4,2021-01-05T15:11:36Z,2022-07-26T12:52:10Z,,OWNER,,To solve #241 JSON logging.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1175/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1310243385,I_kwDOCGYnMM5OGLo5,456,feature request: pivot command,536941,fgregg,open,0,,,,,5,2022-07-20T00:58:08Z,2022-07-20T17:50:50Z,,CONTRIBUTOR,,pivoting long-format table to wide-format tables is pretty common and kind of pain. would love to see this feature in sqlite-utils!,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/456/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 663145122,MDU6SXNzdWU2NjMxNDUxMjI=,903,Add temporary plugin testing pattern to the testing docs,9599,simonw,closed,0,,,,,1,2020-07-21T16:22:34Z,2022-07-18T21:34:33Z,2022-07-18T21:31:22Z,OWNER,,"https://til.simonwillison.net/pytest/registering-plugins-in-tests Would be useful to include this pattern on https://datasette.readthedocs.io/en/stable/testing_plugins.html",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/903/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1308461063,I_kwDODFdgUs5N_YgH,74,500 error in github-to-sqlite demo,9599,simonw,closed,0,,,,,5,2022-07-18T19:39:32Z,2022-07-18T21:16:18Z,2022-07-18T21:14:22Z,MEMBER,,"https://github-to-sqlite.dogsheep.net/github/issue_comments throws a 500: > `cannot import name 'etree' from 'markdown.util' (/usr/local/lib/python3.8/site-packages/markdown/util.py)` https://console.cloud.google.com/run/detail/us-central1/github-to-sqlite/metrics?project=datasette-222320 suggests this started happening 3 days ago.",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/74/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1292368833,I_kwDOBm6k_c5NB_vB,1764,Keep track of config_dir in directory mode (for plugins),25778,eyeseast,closed,0,,,,,0,2022-07-03T16:57:49Z,2022-07-18T01:12:45Z,2022-07-18T01:12:45Z,CONTRIBUTOR,,"I started working on using `config_dir` with my [datasette-query-files plugin](https://github.com/eyeseast/datasette-query-files) and realized Datasette doesn't actually hold onto the `config_dir` argument. It gets used in `__init__` but then forgotten. It would be nice to be able to use it in plugins, though. 
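For example, something along these lines (a hypothetical sketch; it assumes the argument gets stored on the instance as a `config_dir` attribute, which is exactly the bit that's missing today):

```python
from datasette import hookimpl

@hookimpl
def startup(datasette):
    # Look for *.sql files alongside metadata.json in the config directory
    if getattr(datasette, ""config_dir"", None):  # hypothetical attribute
        for path in datasette.config_dir.glob(""*.sql""):
            print(path)
```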
Here's the reference issue: https://github.com/eyeseast/datasette-query-files/issues/4 ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1764/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1292060682,I_kwDOCGYnMM5NA0gK,450,Add --ignore option to more commands,9599,simonw,closed,0,,,,,9,2022-07-02T13:52:02Z,2022-07-15T22:39:09Z,2022-07-15T22:37:45Z,OWNER,,"As seen in https://sqlite-utils.datasette.io/en/stable/cli-reference.html#add-foreign-key Could make this TIL trick unnecessary: https://til.simonwillison.net/bash/ignore-errors",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/450/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1298531653,I_kwDOCGYnMM5NZgVF,451,Make sqlite_utils.utils.chunks a documented function,9599,simonw,closed,0,,,,,2,2022-07-08T06:01:04Z,2022-07-15T22:09:34Z,2022-07-15T21:59:33Z,OWNER,,I want to use it in another project: https://github.com/simonw/sqlite-utils/blob/8a9fe6498faf783a1fdeb1793e661ad194a05267/sqlite_utils/utils.py#L471-L474,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/451/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1303169663,I_kwDOCGYnMM5NrMp_,453,'unclosed file' warning when using insert_upsert_implementation from Python,311257,makkus,closed,0,,,,,1,2022-07-13T09:34:35Z,2022-07-15T21:52:25Z,2022-07-15T21:52:21Z,NONE,,"I'm using the `[insert_upsert_implementation](https://github.com/simonw/sqlite-utils/blob/main/sqlite_utils/cli.py)` function directly in my Python code to import a csv file with all the bells and whistles `sqlite-utils` provides, but I'm getting a resource warning that a io.TextWrapper object is not closed. The warning goes away when wrapping the code from [this line](https://github.com/simonw/sqlite-utils/blob/42440d6345c242ee39778045e29143fb550bd2c2/sqlite_utils/cli.py#L924) in a try/finally block like: ``` try: ... ... finally: decoded.close() ``` (might be that `sniff_buffer` must also be closed if non null, but I might be wrong) I suspect Python closes the reference automatically when the sqlite-utils cli run is done, but since my code doesn't exit, I'm getting the warning. 
Alternatively, it'd be cool if the 'import csv/tsv' functionality could be added directly to the Database class.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/453/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1306548397,I_kwDOCGYnMM5N4Fit,454,CLI command for duplicating tables,9599,simonw,closed,0,,,,,1,2022-07-15T21:31:27Z,2022-07-15T21:48:23Z,2022-07-15T21:45:51Z,OWNER,,"CLI equivalent of: - #449",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/454/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1279863844,I_kwDOCGYnMM5MSSwk,449,Utilities for duplicating tables and creating a table with the results of a query,1690072,davidleejy,closed,0,,,,,4,2022-06-22T09:41:43Z,2022-07-15T21:46:13Z,2022-07-15T21:21:36Z,CONTRIBUTOR,,"is there a duplicate table functionality? Otherwise, I'd be happy to submit a PR. In sqlite3 it would look like: ```python import sqlite3 as sl con = sl.connect('prompt-tune.db') def db_duplicate_table(table_name, table_name_new, con=con): # Duplicates table `table_name` to a new table `table_name_new`. try: cur = con.cursor() cur.execute(f""""""CREATE TABLE {table_name_new} AS SELECT * FROM {table_name}"""""") except Exception as e: print(e) finally: cur.close() db_duplicate_table('orig_table', 'new_table') ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/449/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 728905098,MDU6SXNzdWU3Mjg5MDUwOTg=,1048,Documentation and unit tests for urls.row() urls.row_blob() methods,9599,simonw,open,0,,,,,7,2020-10-25T00:13:53Z,2022-07-10T16:23:57Z,,OWNER,,https://github.com/simonw/datasette/blob/5db7ae3ce165ded57c7fb1cfbdb3258b1cf06c10/datasette/app.py#L1307-L1313,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1048/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1271426387,I_kwDOCGYnMM5LyG1T,444,CSV `extras_key=` and `ignore_extras=` equivalents for CLI tool,9599,simonw,open,0,,,,,5,2022-06-14T22:22:47Z,2022-07-07T16:39:18Z,,OWNER,,"> I forgot to add equivalents of `extras_key=` and `ignore_extras=` to the CLI tool - will do that in a separate issue. _Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/440#issuecomment-1155767915_",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/444/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 860625833,MDU6SXNzdWU4NjA2MjU4MzM=,1300,Make row available to `render_cell` plugin hook,3243482,abdusco,closed,0,,,,,5,2021-04-18T10:14:37Z,2022-07-07T16:34:05Z,2022-07-07T16:31:22Z,CONTRIBUTOR,,"*Original title: **Generating URL for a row inside `render_cell` hook*** Hey, I am using Datasette to view a database that contains video metadata. It has BLOB columns that contain video thumbnails in JPG format (around 100-500KB per row). 
I've registered an output formatter that extends `datasette.blob_renderer.render_blob` function and serves the column with `image/jpeg` content type. ```python from datasette.blob_renderer import render_blob async def render_jpg(datasette, database, rows, columns, request, table, view_name): response = await render_blob(datasette, database, rows, columns, request, table, view_name) response.content_type = ""image/jpeg"" response.headers[""Content-Disposition""] = f'inline; filename=""image.jpg""' return response @hookimpl def register_output_renderer(): return { ""extension"": ""jpg"", ""render"": render_jpg, ""can_render"": lambda: True, } ``` This works well. I can visit `http://localhost:8001/mydb/videos/1.jpg?_blob_column=thumbnail` and view the image. I want to display the image directly with an `` tag (lazy-loaded of course). So, I need a URL, because embedding base64 would increase the page size too much (each image > 100KB). Datasette generates a link with `.blob` extension for blob columns. It does this by calling `datasette.urls.row_blob` https://github.com/simonw/datasette/blob/7a2ed9f8a119e220b66d67c7b9e07cbab47b1196/datasette/views/table.py#L169-L179 But I have no way of getting the row inside the `render_cell` hook. ```python @hookimpl def render_cell(value, column, table, database, datasette): if isinstance(value, bytes) and imghdr.what(None, value): # generate url return '$renderedLink' ``` Any pointers?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1300/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1294641696,I_kwDOBm6k_c5NKqog,1767,Ability to set a custom favicon,9599,simonw,open,0,,,,,9,2022-07-05T18:41:12Z,2022-07-05T18:56:43Z,,OWNER,,"If you're running a website on Datasette, like https://www.niche-museums.com/ or https://til.simonwillison.net/ - you should have the ability to easily specify a custom favicon. Currently the `/favicon.ico` view is hard-coded to do this: https://github.com/simonw/datasette/blob/9f1eb0d4eac483b953392157bd9fd6cc4df37de7/datasette/app.py#L179-L188",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1767/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1203943272,I_kwDOBm6k_c5Hwrdo,1713,Datasette feature for publishing snapshots of query results,9599,simonw,open,0,,,,,5,2022-04-14T01:42:00Z,2022-07-04T05:16:35Z,,OWNER,,"https://twitter.com/simonw/status/1514392335718645760 > Maybe [@datasetteproj](https://twitter.com/datasetteproj) should grow a feature that lets you cache the results of a query and give that snapshot a stable permalink > > A plugin that publishes the JSON output of a query to an S3 bucket would be pretty neat... especially if it could also be configured to re-publish the results on a schedule A lot of people said they would find this useful. 
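The S3 version could be as small as this sketch (hypothetical plugin code; the function name and arguments are illustrative, and it assumes boto3 credentials are already configured):

```python
import json
import boto3

async def publish_snapshot(datasette, database, sql, bucket, key):
    # Run the query, then upload the JSON results under a stable key
    db = datasette.get_database(database)
    results = await db.execute(sql)
    body = json.dumps([dict(row) for row in results.rows], default=str)
    boto3.client(""s3"").put_object(Bucket=bucket, Key=key, Body=body)
```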
Probably going to build this as a plugin.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1713/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1212701569,I_kwDOCGYnMM5ISFuB,427,"sqlite-utils convert date parsing recipe complains about trying to parse ""*""",1385831,wdccdw,closed,0,,,,,1,2022-04-22T19:27:10Z,2022-07-02T13:59:59Z,2022-07-02T13:59:32Z,NONE,,"Missing values in my dataset are denoted by a single asterisk. I am trying to parse string dates into dates. This works fine for columns without missing values, but, when the column contains ""*"", I get the following: ``` $ sqlite-utils convert ${dbfile} details dob 'r.parsedate(value)' [------------------------------------] 0%Traceback (most recent call last): File ""/usr/local/Cellar/sqlite-utils/3.25.1/libexec/lib/python3.9/site-packages/sqlite_utils/db.py"", line 2508, in convert_value return fn(v) File """", line 2, in fn File ""/usr/local/Cellar/sqlite-utils/3.25.1/libexec/lib/python3.9/site-packages/sqlite_utils/recipes.py"", line 8, in parsedate parser.parse(value, dayfirst=dayfirst, yearfirst=yearfirst).date().isoformat() File ""/usr/local/Cellar/sqlite-utils/3.25.1/libexec/lib/python3.9/site-packages/dateutil/parser/_parser.py"", line 1368, in parse return DEFAULTPARSER.parse(timestr, **kwargs) File ""/usr/local/Cellar/sqlite-utils/3.25.1/libexec/lib/python3.9/site-packages/dateutil/parser/_parser.py"", line 643, in parse raise ParserError(""Unknown string format: %s"", timestr) dateutil.parser._parser.ParserError: Unknown string format: * Traceback (most recent call last): File ""/usr/local/bin/sqlite-utils"", line 33, in sys.exit(load_entry_point('sqlite-utils==3.25.1', 'console_scripts', 'sqlite-utils')()) File ""/usr/local/Cellar/sqlite-utils/3.25.1/libexec/lib/python3.9/site-packages/click/core.py"", line 1128, in __call__ return self.main(*args, **kwargs) File ""/usr/local/Cellar/sqlite-utils/3.25.1/libexec/lib/python3.9/site-packages/click/core.py"", line 1053, in main rv = self.invoke(ctx) File ""/usr/local/Cellar/sqlite-utils/3.25.1/libexec/lib/python3.9/site-packages/click/core.py"", line 1659, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/usr/local/Cellar/sqlite-utils/3.25.1/libexec/lib/python3.9/site-packages/click/core.py"", line 1395, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/usr/local/Cellar/sqlite-utils/3.25.1/libexec/lib/python3.9/site-packages/click/core.py"", line 754, in invoke return __callback(*args, **kwargs) File ""/usr/local/Cellar/sqlite-utils/3.25.1/libexec/lib/python3.9/site-packages/sqlite_utils/cli.py"", line 2698, in convert db[table].convert( File ""/usr/local/Cellar/sqlite-utils/3.25.1/libexec/lib/python3.9/site-packages/sqlite_utils/db.py"", line 2524, in convert self.db.execute(sql, where_args or []) File ""/usr/local/Cellar/sqlite-utils/3.25.1/libexec/lib/python3.9/site-packages/sqlite_utils/db.py"", line 458, in execute return self.conn.execute(sql, parameters) sqlite3.OperationalError: user-defined function raised exception ``` ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/427/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 455486286,MDU6SXNzdWU0NTU0ODYyODY=,26,Mechanism for turning nested JSON into foreign keys / 
many-to-many,9599,simonw,open,0,,,,,14,2019-06-13T00:52:06Z,2022-06-29T23:35:29Z,,OWNER,,"The GitHub JSON APIs have a really interesting convention with respect to related objects. Consider https://api.github.com/repos/simonw/sqlite-utils/issues - here's a truncated subset: ```json { ""id"": 449818897, ""node_id"": ""MDU6SXNzdWU0NDk4MTg4OTc="", ""number"": 24, ""title"": ""Additional Column Constraints?"", ""user"": { ""login"": ""IgnoredAmbience"", ""id"": 98555, ""node_id"": ""MDQ6VXNlcjk4NTU1"", ""avatar_url"": ""https://avatars0.githubusercontent.com/u/98555?v=4"", ""gravatar_id"": """" }, ""labels"": [ { ""id"": 993377884, ""node_id"": ""MDU6TGFiZWw5OTMzNzc4ODQ="", ""url"": ""https://api.github.com/repos/simonw/sqlite-utils/labels/enhancement"", ""name"": ""enhancement"", ""color"": ""a2eeef"", ""default"": true } ], ""state"": ""open"" } ``` The `user` column lists a complete user. The `labels` column has a list of labels. Since both user and label have populated `id` field this is actually enough information for us to create records for them AND set up the corresponding foreign key (for user) and m2m relationships (for labels). It would be really neat if `sqlite-utils` had some kind of mechanism for correctly processing these kind of patterns. Thanks to `jq` there's not much need for extra customization of the shape here - if we support a narrowly defined structure users can use `jq` to reshape arbitrary JSON to match.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/26/reactions"", ""total_count"": 4, ""+1"": 4, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1227571375,I_kwDOCGYnMM5JK0Cv,431,Allow making m2m relation of a table to itself,738408,rafguns,open,0,,,,,3,2022-05-06T08:30:43Z,2022-06-23T14:12:51Z,,NONE,,"I am building a database, in which one of the tables has a many-to-many relationship to itself. As far as I can see, this is not (yet) possible using `.m2m()` in sqlite-utils. This may be a bit of a niche use case, so feel free to close this issue if you feel it would introduce too much complexity compared to the benefits. Example: suppose I have a table of people, and I want to store the information that John and Mary have two children, Michael and Suzy. It would be neat if I could do something like this: ```python from sqlite_utils import Database db = Database(memory=True) db[""people""].insert({""name"": ""John""}, pk=""name"").m2m( ""people"", [{""name"": ""Michael""}, {""name"": ""Suzy""}], m2m_table=""parent_child"", pk=""name"" ) db[""people""].insert({""name"": ""Mary""}, pk=""name"").m2m( ""people"", [{""name"": ""Michael""}, {""name"": ""Suzy""}], m2m_table=""parent_child"", pk=""name"" ) ``` But if I do that, the many-to-many table `parent_child` has only one column: ``` CREATE TABLE [parent_child] ( [people_id] TEXT REFERENCES [people]([name]), PRIMARY KEY ([people_id], [people_id]) ) ``` This could be solved by adding one or two keyword_arguments to `.m2m()`, e.g. 
`.m2m(..., left_name=None, right_name=None)` or `.m2m(..., names=(None, None))`.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/431/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 727848625,MDU6SXNzdWU3Mjc4NDg2MjU=,12,"Some workout columns should be float, not text",9599,simonw,open,0,,,,,4,2020-10-23T02:47:02Z,2022-06-23T04:35:02Z,,MEMBER,,"Columns `duration`, `totalDistance` and `totalEnergyBurned` should be converted to float. https://github.com/dogsheep/healthkit-to-sqlite/blob/71e36e1cf034b96de2a8e6652265d782d3fdf63b/healthkit_to_sqlite/utils.py#L50-L57",197882382,healthkit-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/12/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1280799259,I_kwDOBm6k_c5MV3Ib,1761,ensure_ascii=False,1473102,mustafa0x,open,0,,,,,0,2022-06-22T19:58:13Z,2022-06-22T19:58:30Z,,NONE,,"Hi, thanks for the project! For the JSON output, I would consider defaulting to `ensure_ascii=False` (UTF-8 seems pretty universal) or making it an option. When dealing with non-Latin text, `ensure_ascii=True` (the default) can triple the size of the output.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1761/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1277328147,I_kwDOCGYnMM5MInsT,446,Use Just to automate running tests and linters locally,9599,simonw,closed,0,,,,,2,2022-06-20T19:51:09Z,2022-06-21T19:28:35Z,2022-06-20T19:54:50Z,OWNER,,I keep committing code that fails additional tests like `mypy` and `flake8` and `black`. Automate those using Just.,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/446/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1278571700,I_kwDOCGYnMM5MNXS0,447,Incorrect syntax highlighting in docs CLI reference,9599,simonw,closed,0,,,,,3,2022-06-21T14:53:10Z,2022-06-21T18:48:47Z,2022-06-21T18:48:46Z,OWNER,,"https://sqlite-utils.datasette.io/en/stable/cli-reference.html#insert ![CE020DDA-27FB-49C3-9EA6-37457DC4C321](https://user-images.githubusercontent.com/9599/174830380-06530537-b870-41c0-a8af-03c7fa720c6f.jpeg) It looks like Python keywords are being incorrectly highlighted here.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/447/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1269998342,I_kwDOCGYnMM5LsqMG,443,Make `utils.rows_from_file()` a documented API,9599,simonw,closed,0,,,,,2,2022-06-13T21:53:24Z,2022-06-20T19:49:37Z,2022-06-14T20:12:46Z,OWNER,,"> `rows_from_file()` isn't part of the documented API but maybe it should be! 
_Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/440#issuecomment-1154385916_",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/443/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1277295119,I_kwDOCGYnMM5MIfoP,445,`sqlite_utils.utils.TypeTracker` should be a documented API,9599,simonw,closed,0,,,,,3,2022-06-20T19:08:28Z,2022-06-20T19:49:02Z,2022-06-20T19:46:58Z,OWNER,,"I've used it in a couple of external places now: - https://github.com/simonw/datasette-socrata/blob/32fb256a461bf0e790eca10bdc7dd9d96c20f7c4/datasette_socrata/__init__.py#L264-L280 - https://github.com/simonw/datasette-lite/blob/caa8eade10f0321c64f9f65c4561186f02d57c5b/webworker.js#L55-L64 Refs: - https://github.com/simonw/datasette-lite/issues/32",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/445/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1250495688,I_kwDOCGYnMM5KiQzI,439,Misleading progress bar against utf-16-le CSV input,4068,frafra,open,0,,,,,12,2022-05-27T08:34:49Z,2022-06-15T03:53:43Z,,NONE,,"The program crashes without any error. ``` wget ""https://artsdatabanken.no/Fab2018/api/export/csv"" sqlite-utils create-database test.db sqlite-utils insert --csv --delimiter "";"" --encoding ""utf-16-le"" test test.db csv [------------------------------------] 0% [#################-------------------] 49% 00:00:01 ``` I would like to highlight various issues: 1. sqlite-utils catches exceptions without printing the stacktrace and/or reraising the exception, so there is no easy way to use `pdb` or similar to debug the program, solution: add a debug option 2. Silent crash: this is related to (1.), and it happens when there is a catch-all mechanism; solution: let the program fail.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/439/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1224112817,I_kwDOCGYnMM5I9nqx,430,Document how to use `PRAGMA temp_store` to avoid errors when running VACUUM against huge databases,9308268,rayvoelker,open,0,,,,,2,2022-05-03T13:33:58Z,2022-06-14T23:26:37Z,,NONE,,"I'm trying to figure out a way to get the `table.extract()` method to complete successfully -- I'm not sure if maybe the cause (and a possible solution) of this on Ubuntu Server 22.04 is to adjust some of the PRAGMA values within SQLite itself ... on another Linux system (PopOS), using this method on this same database appears to work just fine. 
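(Illustrative aside, not part of the original report: a minimal diagnostic sketch for comparing the two machines, since temp-file behaviour can also be baked in at compile time via the SQLITE_TEMP_STORE option; the database path is hypothetical.)

```python
import sqlite3

conn = sqlite3.connect('records.db')  # hypothetical path
# 0 = default, 1 = file, 2 = memory
print(conn.execute('PRAGMA temp_store').fetchone())
# look for a SQLITE_TEMP_STORE=... entry among the build flags
print(conn.execute('PRAGMA compile_options').fetchall())
print(sqlite3.sqlite_version)
```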
Here's the bit that's causing the error, and the resulting error output: ```python # combine these columns into 1 table ""bib_properties"" : # best_title # bib_level_code # mat_type # material_code # best_author db[""circ_trans""].extract( [""best_title"", ""bib_level_code"", ""mat_type"", ""material_code"", ""best_author""], table=""bib_properties"", fk_column=""bib_properties_id"" ) db[""circ_trans""].extract( [""call_number""], table=""call_number"", fk_column=""call_number_id"", rename={""call_number"": ""value""} ) ``` ```python --------------------------------------------------------------------------- OperationalError Traceback (most recent call last) Input In [17], in <cell line: 7>() 1 # combine these columns into 1 table ""bib_properties"" : 2 # best_title 3 # bib_level_code 4 # mat_type 5 # material_code 6 # best_author ----> 7 db[""circ_trans""].extract( 8 [""best_title"", ""bib_level_code"", ""mat_type"", ""material_code"", ""best_author""], 9 table=""bib_properties"", 10 fk_column=""bib_properties_id"" 11 ) 13 db[""circ_trans""].extract( 14 [""call_number""], 15 table=""call_number"", 16 fk_column=""call_number_id"", 17 rename={""call_number"": ""value""} 18 ) File ~/jupyter/venv/lib/python3.10/site-packages/sqlite_utils/db.py:1764, in Table.extract(self, columns, table, fk_column, rename) 1761 column_order.append(c.name) 1763 # Drop the unnecessary columns and rename lookup column -> 1764 self.transform( 1765 drop=set(columns), 1766 rename={magic_lookup_column: fk_column}, 1767 column_order=column_order, 1768 ) 1770 # And add the foreign key constraint 1771 self.add_foreign_key(fk_column, table, ""id"") File ~/jupyter/venv/lib/python3.10/site-packages/sqlite_utils/db.py:1526, in Table.transform(self, types, rename, drop, pk, not_null, defaults, drop_foreign_keys, column_order) 1524 with self.db.conn: 1525 for sql in sqls: -> 1526 self.db.execute(sql) 1527 # Run the foreign_key_check before we commit 1528 if pragma_foreign_keys_was_on: File ~/jupyter/venv/lib/python3.10/site-packages/sqlite_utils/db.py:465, in Database.execute(self, sql, parameters) 463 return self.conn.execute(sql, parameters) 464 else: --> 465 return self.conn.execute(sql) OperationalError: database or disk is full ``` This database is about 17G in total size, so I'm assuming the error is coming from the vacuum ... where I'm assuming it's maybe trying to do the temp storage in a location that doesn't have sufficient room. The disk space is more than ample on the host in question (1.8T is free in the directory where the sqlite db resides). The `/tmp` directory, however, is limited on a smaller disk associated with the OS. I'm trying to think if there's a way to set the `PRAGMA temp_store` or maybe if it's `temp_store_directory` that I'm after ... to use the same local directory of where the file is located (maybe this is a property of the version of sqlite on the system?) ```python # SET the temp file store to be a file ... print(db.execute('PRAGMA temp_store').fetchall()) print(db.execute('PRAGMA temp_store=FILE').fetchall()) print(db.execute('PRAGMA temp_store').fetchall()) # the user's home directory ...
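# (Hypothetical aside, not part of the original exploration: on unix builds
# SQLite also consults the SQLITE_TMPDIR environment variable when choosing
# where temp files go, so something like
#   os.environ['SQLITE_TMPDIR'] = '/home/plchuser/tmp'
# before connecting may be another way around this.)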
print(db.execute(""PRAGMA temp_store_directory='/home/plchuser/'"").fetchall()) print(db.execute(""PRAGMA sqlite3_temp_directory='/home/plchuser/'"").fetchall()) print(db.execute(""PRAGMA temp_store_directory"").fetchall()) print(db.execute(""PRAGMA sqlite3_temp_directory"").fetchall()) ``` ```text [(1,)] [] [(1,)] [] [] [('/home/plchuser/',)] [] ``` Here's the docs on the Temporary File Storage Locations https://www.sqlite.org/tempfiles.html",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/430/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1243151184,I_kwDOCGYnMM5KGPtQ,434,`detect_fts()` identifies the wrong table if tables have names that are subsets of each other,559711,ryascott,closed,0,,,,,3,2022-05-20T13:28:31Z,2022-06-14T23:24:09Z,2022-06-14T23:24:09Z,NONE,,"Windows 10 Python 3.9.6 When I was running a full text search through the Python library, I noticed that the query was being run on a different full text search table than the one I was trying to search. I took a look at the following function https://github.com/simonw/sqlite-utils/blob/841ad44bacaff05ec79ef78166d12e80c82ba6d7/sqlite_utils/db.py#L2213 and noticed: ```python sql LIKE '%VIRTUAL TABLE%USING FTS%content=%{table}%' ``` My database contains tables with similar names and %{table}% was matching another table that ended differently in its name. I have included a sample test that shows this occurring: I search for Marsupials in db[""books""] and The Clue of the Broken Blade is returned. This occurs since the search for Marsupials was ""successfully"" done against db[""booksb""] and rowid 1 is returned. ""The Clue of the Broken Blade"" has a rowid of 1 in db[""books""] and this is what is returned from the search. ```python def test_fts_search_with_similar_table_names(fresh_db): db = Database(memory=True) db[""books""].insert_all( [ { ""title"": ""The Clue of the Broken Blade"", ""author"": ""Franklin W. Dixon"", }, { ""title"": ""Habits of Australian Marsupials"", ""author"": ""Marlee Hawkins"", }, ] ) db[""booksb""].insert( { ""title"": ""Habits of Australian Marsupials"", ""author"": ""Marlee Hawkins"", } ) db[""booksb""].enable_fts([""title"", ""author""]) db[""books""].enable_fts([""title"", ""author""]) query = ""Marsupials"" assert [ { ""rowid"": 1, ""title"": ""Habits of Australian Marsupials"", ""author"": ""Marlee Hawkins"", }, ] == list(db[""books""].search(query)) ``` ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/434/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1250629388,I_kwDOCGYnMM5KixcM,440,CSV files with too many values in a row cause errors,4068,frafra,closed,0,,,,,20,2022-05-27T10:54:44Z,2022-06-14T22:23:01Z,2022-06-14T20:12:46Z,NONE,,"*Original title: csv.DictReader can have None as key* In some cases, `csv.DictReader` can have `None` as key for unnamed columns, and a list of values as value. 
`sqlite_utils.utils.rows_from_file` cannot handle that: ```python url=""https://artsdatabanken.no/Fab2018/api/export/csv"" db = sqlite_utils.Database("":memory:"") with urlopen(url) as fab: reader, _ = sqlite_utils.utils.rows_from_file(fab, encoding=""utf-16le"") db[""fab2018""].insert_all(reader, pk=""Id"") ``` Result: ``` Traceback (most recent call last): File ""<stdin>"", line 3, in <module> File ""/home/user/.local/pipx/venvs/sqlite-utils/lib/python3.8/site-packages/sqlite_utils/db.py"", line 2924, in insert_all chunk = list(chunk) File ""/home/user/.local/pipx/venvs/sqlite-utils/lib/python3.8/site-packages/sqlite_utils/db.py"", line 3454, in fix_square_braces if any(""["" in key or ""]"" in key for key in record.keys()): File ""/home/user/.local/pipx/venvs/sqlite-utils/lib/python3.8/site-packages/sqlite_utils/db.py"", line 3454, in <genexpr> if any(""["" in key or ""]"" in key for key in record.keys()): TypeError: argument of type 'NoneType' is not iterable ``` Code: https://github.com/simonw/sqlite-utils/blob/59be60c471fd7a2c4be7f75e8911163e618ff5ca/sqlite_utils/db.py#L3454 `sqlite-utils insert` from the command line is not affected by this issue.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/440/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1236693079,I_kwDOCGYnMM5JtnBX,432,"Support `rows_where()`, `delete_where()` etc for attached alias databases",11597658,luxint,open,0,,,,,5,2022-05-16T06:38:58Z,2022-06-14T22:16:48Z,,NONE,,"Hi, I noticed `rows_where()` doesn't return any rows from tables which are from attached databases. The `exists()` function returns false. As far as I can see this is because the `table_names()` function only looks for table names in the current database and not in attached (or temp) databases. Besides `rows_where()`, `insert_all()` and `delete_where()` also didn't do what I was expecting because of this. For the moment I've patched `table_names()` for myself (see below), but I'm not sure what the total impact is on the other functions like `lookup()`, `truncate()` etc., which all use `exists()`. Also `view_names()` doesn't look for views in attached or temp databases. ```python def table_names(self, fts4: bool = False, fts5: bool = False) -> List[str]: ""A list of string table names in this database."" where = [""type = 'table'""] if fts4: where.append(""sql like '%USING FTS4%'"") if fts5: where.append(""sql like '%USING FTS5%'"") dbs = [x[1] for x in self.execute('pragma database_list').fetchall()] lst=[] for db in dbs: sql = ""select name from {} where {}"".format(db+"".sqlite_master"","" AND "".join(where)) lst.extend(r[0] for r in self.execute(sql).fetchall()) return lst ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/432/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1257724585,I_kwDOCGYnMM5K91qp,441,Combining `rows_where()` and `search()` to limit which rows are searched,1448859,betatim,closed,0,,,,,4,2022-06-02T06:01:55Z,2022-06-14T21:57:57Z,2022-06-14T21:54:38Z,NONE,,"What is the right way to limit a full text search query to some rows of a table? For example, I have a table that contains the following columns: `title`, `content`, `owner` (each row represents a document). The `owner` column is a username.
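(To make the goal concrete, a runnable sketch of the hand-edited query described below; the table and column names `documents`, `documents_fts` and `owner` are this example's own assumptions:)

```python
from sqlite_utils import Database

db = Database('docs.db')  # hypothetical database
rows = db.execute(
    'select documents.* from documents '
    'join documents_fts on documents.rowid = documents_fts.rowid '
    'where documents_fts match :query and documents.owner = :owner',
    {'query': 'marsupials', 'owner': '1234'},
).fetchall()
```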
It feels right to store all documents in one table, instead of having one table per owner. In particular because I'd like to full text search all documents, only documents owned by one user, and documents owned by a set of users. I tried to combine `.rows_where(""owner = ?"", ""1234"")` and `.search()` from the `Table` class but I don't think that is meant to work. I discovered `.search_sql()` as a way to generate the FTS SQL statement. By hand I can edit it to add an `AND [original].[owner] = :owner` to the `where` clause. This seems to do what I want. My two questions: 1. is adding an `AND ...` to the `where` clause actually the right thing to do or should I be doing something else (my SQL skills are low)? 2. is there a built-in sqlite-utils way to achieve this? Right now I am thinking I will make my own version of `search_sql()` that generates a query that contains an additional `owner = :owner` for my particular use-case. Bonus question: is this generally useful/something to add to sqlite-utils or too niche?",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/441/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1269886084,I_kwDOCGYnMM5LsOyE,442,`maximize_csv_field_size_limit()` utility function,9599,simonw,closed,0,,,,,2,2022-06-13T19:54:54Z,2022-06-14T21:55:15Z,2022-06-14T21:31:49Z,OWNER,,"This code here runs only if `cli.py` is imported: https://github.com/simonw/sqlite-utils/blob/7ddf5300886a32d6daf60cf1d71efe492b65c87e/sqlite_utils/cli.py#L50-L59 I found myself needing the same fix in another library: - https://github.com/simonw/datasette-socrata/issues/13 It should be a documented utility function.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/442/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1160182768,I_kwDOCGYnMM5FJvvw,412,Optional Pandas integration,9599,simonw,open,0,,,,,13,2022-03-05T01:49:27Z,2022-06-14T15:36:29Z,,OWNER,,"It would be neat if there was a way to use this more seamlessly with Pandas, in particular Pandas dataframes - but without making Pandas a required dependency.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/412/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1060631257,I_kwDOBm6k_c4_N_LZ,1528,"Add new `""sql_file""` key to Canned Queries in metadata?",15178711,asg017,open,0,,,,,3,2021-11-22T21:58:01Z,2022-06-10T03:23:08Z,,CONTRIBUTOR,,"Currently for canned queries, you have to inline SQL in your `metadata.yaml` like so: ```yaml databases: fixtures: queries: neighborhood_search: sql: |- select neighborhood, facet_cities.name, state from facetable join facet_cities on facetable.city_id = facet_cities.id where neighborhood like '%' || :text || '%' order by neighborhood title: Search neighborhoods ``` This works fine, but for a few reasons, I usually have my canned queries already written in separate `.sql` files. I'd like to re-use those instead of re-writing them.
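(One conceivable stopgap, sketched purely for illustration: a build step that inlines `.sql` files into the metadata before starting Datasette; the `sql_file` key used here is the proposal below, not an existing feature:)

```python
import pathlib

import yaml

meta = yaml.safe_load(pathlib.Path('metadata.yml').read_text())
for db in meta.get('databases', {}).values():
    for query in db.get('queries', {}).values():
        if 'sql_file' in query:
            # replace the proposed sql_file key with inline SQL
            query['sql'] = pathlib.Path(query.pop('sql_file')).read_text()
pathlib.Path('metadata.built.yml').write_text(yaml.dump(meta))
```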
So, I'd like to see a new `""sql_file""` key that works like so: `metadata.yaml`: ```yaml databases: fixtures: queries: neighborhood_search: sql_file: neighborhood_search.sql title: Search neighborhoods ``` `neighborhood_search.sql`: ```sql select neighborhood, facet_cities.name, state from facetable join facet_cities on facetable.city_id = facet_cities.id where neighborhood like '%' || :text || '%' order by neighborhood ``` Both of these would work in the exact same way, where Datasette would instead open + include `neighborhood_search.sql` on startup. A few reasons why I'd like to keep my canned queries SQL separate from metadata.yaml: - Keeping SQL in standalone SQL files means syntax highlighting and other text editor integrations in my code - Multiline strings in yaml, while functional, are a tad cumbersome and are hard to edit - Works well with other tools (can pipe `.sql` files into the `sqlite3` CLI, or use with other SQLite clients easier) - Typically my canned queries are quite long compared to everything else in my metadata.yaml, so I'd love to separate it where possible Let me know if this is a feature you'd like to see, I can try to send up a PR if this sounds right!",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1528/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1266329095,I_kwDOBm6k_c5LeqYH,1756,Mechanism for creating databases in WAL mode,9599,simonw,open,0,,,,,0,2022-06-09T15:39:28Z,2022-06-09T15:39:28Z,,OWNER,,"The `--create` option currently creates databases if they are missing, but does not enable WAL mode for them. It turns out WAL mode is useful for databases that are accepting writes! I think a `--create-wal` option that both creates them AND sets WAL mode on any that are created would be a good idea.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1756/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1266207143,I_kwDOBm6k_c5LeMmn,1755,Gunicorn,1176293,ar-jan,open,0,,,,,0,2022-06-09T14:18:46Z,2022-06-09T14:18:46Z,,NONE,,"I've read issue #514 which resulted in running Datasette via systemd as recommended approach. We've also adopted this (for now), but I notice that Uvicorn [says the following](https://www.uvicorn.org/#running-with-gunicorn): > Uvicorn includes a Gunicorn worker class allowing you to run ASGI applications, with all of Uvicorn's performance benefits, while also giving you Gunicorn's fully-featured process management. > > This allows you to increase or decrease the number of worker processes on the fly, restart worker processes gracefully, or perform server upgrades without downtime. > > For production deployments we recommend using gunicorn with the uvicorn worker class. We usually deploy Python applications via Gunicorn for these process management features (e.g. `--daemon` and `--pid`). 
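(For concreteness, a rough sketch of how I imagine this might be wired up, assuming `Datasette(...).app()` returns the ASGI application; untested:)

```python
# app.py
from datasette.app import Datasette

ds = Datasette(files=['data.db'])  # hypothetical database file
app = ds.app()

# then, hypothetically:
#   gunicorn -w 4 -k uvicorn.workers.UvicornWorker app:app
```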
Is this something that would/could work with Datasette as well?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1755/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1251739062,I_kwDOBm6k_c5KnAW2,1752,Research if I can drop Janus,9599,simonw,open,0,,,,,0,2022-05-28T22:46:52Z,2022-05-28T22:46:52Z,,OWNER,,"> It seems to me Janus dependency is not necessary, `async with app.database_write_mutex(): out = await app.transaction(func)` may be enough. Comment here: https://lobste.rs/s/fki4tj/architecture_notes_datasette#c_a2ihon",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1752/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1251710928,I_kwDOBm6k_c5Km5fQ,1751,Add scrollbars to table presentation in default layout,408765,knutwannheden,closed,0,,,,,1,2022-05-28T19:44:57Z,2022-05-28T19:52:17Z,2022-05-28T19:52:17Z,NONE,,"(As you will be able to tell from the terminology I use, I am not a frontend guy, but I hope you will understand.) When a table is wide and needs horizontal scrolling to see the columns towards the end, the user needs to scroll horizontally. However, since the container for the HTML table (`div` with class `table-wrapper`) isn't limited by the window size, I first need to vertically scroll near to the bottom of the page in order to scroll horizontally. Then I can scroll back up again. This isn't very user friendly. Instead, I think it would make sense to constrain the table's size (when necessary), so that the vertical and horizontal scrollbars either always are visible or at least not far out of reach. I understand that I could provide my own template and / or CSS, but I think it would probably make sense to adjust the default in this regard.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1751/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1251700382,I_kwDOBm6k_c5Km26e,1750,Allow `label_column` to specify array of columns,408765,knutwannheden,open,0,,,,,0,2022-05-28T18:45:48Z,2022-05-28T18:45:48Z,,NONE,,"I think it would be great if the Datasette metadata would allow the `label_column` table key to list multiple columns. Something like: ```json ""tables"": { ""person"": { ""label_column"": [""first_name"", ""last_name""] }, ``` It would even be interesting with a ""label expression"" similar to a Python f-string. E.g. 
`{row.last_name}, {row.first_name}`.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1750/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1250161887,I_kwDOCGYnMM5Kg_Tf,438,illegal UTF-16 surrogate,4068,frafra,closed,0,,,,,2,2022-05-26T22:49:52Z,2022-05-27T08:21:53Z,2022-05-27T08:21:53Z,NONE,,"I am trying to insert `https://artsdatabanken.no/Fab2018/api/export/csv` into a SQLite database, but I have an error when using `sqlite-utils`: ``` sqlite-utils insert --csv --delimiter "";"" --encoding=""utf-16-le"" --pk ""Id"" csv fremmedart test.db [------------------------------------] 0% Error: 'utf-16-le' codec can't decode bytes in position 98-99: illegal UTF-16 surrogate The input you provided uses a character encoding other than utf-8. You can fix this by passing the --encoding= option with the encoding of the file. If you do not know the encoding, running 'file filename.csv' may tell you. It's often worth trying: --encoding=latin-1 ``` I tried to convert the file using `iconv -f ""utf-16le"" -t ""utf-8""`, but I still get a similar error (slightly different position): ``` sqlite-utils insert --csv --delimiter "";"" --encoding=utf-8 --pk ""Id"" csv_utf8 fremmedart test.db [------------------------------------] 0% Error: 'utf-8' codec can't decode byte 0xd9 in position 99: invalid continuation byte The input you provided uses a character encoding other than utf-8. You can fix this by passing the --encoding= option with the encoding of the file. If you do not know the encoding, running 'file filename.csv' may tell you. It's often worth trying: --encoding=latin-1 ``` I have no issues reading such a file using this Python code: ```python content = open('csv', encoding='utf-16-le').read() ``` `in2csv` works too.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/438/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1247315144,I_kwDOBm6k_c5KWITI,1749,LDAP auth plugin,380241,benswift,open,0,,,,,0,2022-05-25T01:35:12Z,2022-05-25T01:35:12Z,,NONE,,"A [search of the plugins directory](https://datasette.io/plugins?q=ldap) doesn't turn up anything, but is it possible to set up a Datasette app which uses my organisation's LDAP for auth?
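(To scope the question, the kind of thing I imagine, as a purely hypothetical sketch using the `ldap3` package and Datasette's `actor_from_request` plugin hook; the server and DN layout are made up:)

```python
import base64

from datasette import hookimpl
from ldap3 import Connection, Server

LDAP_SERVER = Server('ldap.example.org')  # hypothetical host

@hookimpl
def actor_from_request(datasette, request):
    auth = request.headers.get('authorization') or ''
    if not auth.startswith('Basic '):
        return None
    username, _, password = base64.b64decode(auth[6:]).decode().partition(':')
    dn = f'uid={username},ou=people,dc=example,dc=org'  # hypothetical DN layout
    if Connection(LDAP_SERVER, user=dn, password=password).bind():
        return {'id': username}
    return None
```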
If not, how much work would it be to write one (I _may_ have some spare cycles on my team to do this, but we haven't written a datasette plugin before).",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1749/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1243715381,I_kwDOCGYnMM5KIZc1,436,"Add ""copy to clipboard"" button to code examples in documentation",9599,simonw,closed,0,,,,,0,2022-05-20T21:53:23Z,2022-05-20T21:57:53Z,2022-05-20T21:57:53Z,OWNER,,"Follows: - #435 Imitates: - https://github.com/simonw/datasette/issues/1748 I'll use https://github.com/executablebooks/sphinx-copybutton - here's the Datasette commit: https://github.com/simonw/datasette/commit/1465fea4798599eccfe7e8f012bd8d9adfac3039",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/436/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1243704847,I_kwDOCGYnMM5KIW4P,435,Switch to Furo documentation theme,9599,simonw,closed,0,,,,,2,2022-05-20T21:46:39Z,2022-05-20T21:56:10Z,2022-05-20T21:54:43Z,OWNER,,"As seen in: - https://github.com/simonw/datasette/issues/1746 - https://github.com/simonw/shot-scraper/issues/77",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/435/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1243498298,I_kwDOBm6k_c5KHkc6,1746,Switch documentation theme to Furo,9599,simonw,closed,0,,,,,21,2022-05-20T18:42:17Z,2022-05-20T21:28:29Z,2022-05-20T21:28:29Z,OWNER,,"https://github.com/pradyunsg/furo I just did this for `shot-scraper` and I really like it: https://shot-scraper.datasette.io/en/latest/ - https://github.com/simonw/shot-scraper/issues/77",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1746/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1243517592,I_kwDOBm6k_c5KHpKY,1748,Add copy buttons next to code examples in the documentation,9599,simonw,closed,0,,,,,2,2022-05-20T19:09:00Z,2022-05-20T19:15:00Z,2022-05-20T19:11:32Z,OWNER,,Similar to the ones in `datasette-copyable` which are implemented here: https://github.com/executablebooks/sphinx-copybutton/tree/f84c001a0507f8ec46779d0701b079a265564583,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1748/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1243512344,I_kwDOBm6k_c5KHn4Y,1747,Add tutorials to the getting started guide,9599,simonw,closed,0,,,,,1,2022-05-20T19:01:52Z,2022-05-20T19:12:30Z,2022-05-20T19:05:34Z,OWNER,,On https://docs.datasette.io/en/stable/getting_started.html,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1747/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1239008850,I_kwDOBm6k_c5J2cZS,1744,`--nolock` feature for opening locked databases,9599,simonw,closed,0,,,,,7,2022-05-17T18:25:16Z,2022-05-17T19:46:38Z,2022-05-17T19:40:30Z,OWNER,,"The 
getting started docs currently suggest you try this to browse your Chrome history: datasette ~/Library/Application\ Support/Google/Chrome/Default/History But if Chrome is running you will likely get this error: sqlite3.OperationalError: database is locked Turns out there's a workaround for this which I just spotted [on the SQLite forum](https://sqlite.org/forum/forumpost/86a67f6995): > You can do this using a [URI filename](https://sqlite.org/uri.html): > ``` > sqlite3 'file:places.sqlite?mode=ro&nolock=1' > ``` > That opens the file `places.sqlite` in read-only mode with locking disabled. This isn't safe, in that changes to the database made by other connections are likely to cause this connection to return incorrect results or crash. Read-only mode should at least mean that you don't corrupt the database in the process.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1744/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1239080102,I_kwDOBm6k_c5J2tym,1745,Documentation on running cog,9599,simonw,closed,0,,,,,1,2022-05-17T19:41:06Z,2022-05-17T19:45:51Z,2022-05-17T19:43:45Z,OWNER,,Noticed that `cog -r docs/*.rst` isn't documented in https://docs.datasette.io/en/latest/contributing.html#editing-and-building-the-documentation,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1745/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1237871948,I_kwDOBm6k_c5JyG1M,1743,`datasette.utils.to_css_class()` should be a documented internal,9599,simonw,open,0,,,,,0,2022-05-16T23:57:26Z,2022-05-16T23:57:26Z,,OWNER,,"Because I'm using it in this plugin: - https://github.com/simonw/datasette-upload-dbs/issues/1",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1743/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1237586379,I_kwDOBm6k_c5JxBHL,1742,?_trace=1 fails with datasette-geojson for some reason,9599,simonw,open,0,,,,,4,2022-05-16T19:06:05Z,2022-05-16T19:42:13Z,,OWNER,,view-source:https://calands.datasettes.com/calands/CPAD_2020a_SuperUnits.geojson?_sort=id&id__exact=4&_labels=on&_trace=1 is showing me a blank page.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1742/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 607223136,MDU6SXNzdWU2MDcyMjMxMzY=,741,"Replace ""datasette publish --extra-options"" with ""--setting""",9599,simonw,open,0,,,3268330,Datasette 1.0,9,2020-04-27T04:29:04Z,2022-05-12T19:21:16Z,,OWNER,,"See https://github.com/simonw/datasette-publish-now/issues/9#issuecomment-618155764 - the `--extra-options` mechanism is in practice just used to set `--config` options in data that you publish, but that means you end up with pretty messy looking commands: datasette publish my.db --extra-options=""--config default_page_size:50 --config sql_time_limit_ms:3500"" A neater design would be to support `--config` as an option for `datasette publish` directly: datasette publish my.db --config default_page_size:50 --config sql_time_limit_ms:3500 ",107914493,datasette,issue,,,"{""url"": 
""https://api.github.com/repos/simonw/datasette/issues/741/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1223699280,I_kwDOBm6k_c5I8CtQ,1739,.db downloads should be served with an ETag,9599,simonw,closed,0,,,,,6,2022-05-03T05:11:21Z,2022-05-04T18:21:18Z,2022-05-03T14:59:51Z,OWNER,,"I noticed that my Pyodide Datasette prototype is downloading the same database file every single time rather than browser caching it: ![image](https://user-images.githubusercontent.com/9599/166407074-dee19587-0667-4424-9e88-d3b5b90fd819.png) ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1739/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1223234932,I_kwDOBm6k_c5I6RV0,1733,Get Datasette compatible with Pyodide,9599,simonw,closed,0,,,,,9,2022-05-02T19:01:58Z,2022-05-04T15:14:01Z,2022-05-02T20:15:27Z,OWNER,,"I've already got this working as a prototype. Here are the changes I had to make: - Replace the two dependencies that don't publish pure Python wheels to PyPI: `click-default-group` and `python-baseconv` - Get Datasette to work without threading - which it turns out is exclusively used for database connections - Make the `uvicorn` dependency optional (only needed when Datasette runs in the CLI) TODO: - [x] Switch to `click-default-group-wheel` - [x] https://github.com/simonw/datasette/issues/1734 - [x] Work around `uvicorn` import error - [x] https://github.com/simonw/datasette/issues/1735 - [x] #1737 Goal is to be able to do the following directly in https://pyodide.org/en/stable/console.html ```python import micropip await micropip.install(""datasette"") from datasette.app import Datasette ds = Datasette() await ds.client.get(""/.json"") ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1733/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1173023272,I_kwDOCGYnMM5F6uoo,416,Options for how `r.parsedate()` should handle invalid dates,638427,mattkiefer,closed,0,,,,,11,2022-03-17T23:29:55Z,2022-05-03T21:36:49Z,2022-03-21T04:01:39Z,NONE,,"Exceptions are normal expected behavior when typecasting an invalid format. However, r.parsedate() is really just re-formatting strings and keeping the type as text. So it may be better to print-and-pass on exception so the user can see a complete list of invalid values -- while also allowing for the parser to reformat the remaining valid values. 
``` sqlite-utils convert idfpr.db license ""Expiration Date"" ""r.parsedate(value)"" [#######-----------------------------] 21% 00:01:57Traceback (most recent call last): File ""/usr/local/lib/python3.9/dist-packages/sqlite_utils/db.py"", line 2336, in convert_value return fn(v) File """", line 2, in fn File ""/usr/local/lib/python3.9/dist-packages/sqlite_utils/recipes.py"", line 8, in parsedate parser.parse(value, dayfirst=dayfirst, yearfirst=yearfirst).date().isoformat() File ""/usr/lib/python3/dist-packages/dateutil/parser/_parser.py"", line 1374, in parse return DEFAULTPARSER.parse(timestr, **kwargs) File ""/usr/lib/python3/dist-packages/dateutil/parser/_parser.py"", line 652, in parse raise ParserError(""String does not contain a date: %s"", timestr) dateutil.parser._parser.ParserError: String does not contain a date: / / ``` In this case, I had just one variation of an invalid date: ' / / '. But theoretically there could be many values that would have to be fixed one at a time with the current exception handling. ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/416/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1221849746,I_kwDOBm6k_c5I0_KS,1732,Custom page variables aren't decoded,52649,tannewt,open,0,,,,,2,2022-04-30T14:55:46Z,2022-05-03T01:50:45Z,,NONE,,"I have a page `templates/filer/{filer_id}.html`. It uses `filer_id` in a `sql()` call to fetch data. With 0.61.1 this no longer works because the spaces in IDs isn't preserved. Instead, the escaped version is passed into the template and the id isn't present in my db. Datasette should unescape the url component before passing them into the template.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1732/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1223459734,I_kwDOBm6k_c5I7IOW,1737,Automated test for Pyodide compatibility,9599,simonw,closed,0,,,,,4,2022-05-02T23:24:25Z,2022-05-02T23:40:50Z,2022-05-02T23:40:50Z,OWNER,,"Refs: - #1733 Need something in the test suite such that if Datasette breaks against Pyodide in the future we hear about it. I'm thinking this is an opportunity to use [shot-scraper javascript](https://github.com/simonw/shot-scraper#scraping-pages-using-javascript).",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1737/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1223241647,I_kwDOBm6k_c5I6S-v,1734,Remove python-baseconv dependency,9599,simonw,closed,0,,,,,3,2022-05-02T19:08:37Z,2022-05-02T23:25:49Z,2022-05-02T19:39:20Z,OWNER,,"> I was going to vendor `baseconv.py`, but then I reconsidered - what if there are plugins out there that expect `import baseconv` to work because they have depended on Datasette? > > I used https://cs.github.com/ and as far as I can tell there aren't any! > > So I'm going to remove that dependency and work out a smarter way to do this - probably by providing a utility function within Datasette itself. 
_Originally posted by @simonw in https://github.com/simonw/datasette/issues/1733#issuecomment-1115258737_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1734/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1223263540,I_kwDOBm6k_c5I6YU0,1735,Datasette setting to disable threading (for Pyodide),9599,simonw,closed,0,,,,,1,2022-05-02T19:31:08Z,2022-05-02T23:25:49Z,2022-05-02T20:13:52Z,OWNER,,"> I'm going to add a Datasette setting to disable threading entirely, designed for usage in this particular case. > > I thought about adding a new setting, then I noticed this: > > datasette mydatabase.db --setting num_sql_threads 10 > > I'm going to let users set that to `0` to disable threaded execution of SQL queries. _Originally posted by @simonw in https://github.com/simonw/datasette/issues/1733#issuecomment-1115278325_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1735/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1219398983,I_kwDOBm6k_c5Iro1H,1730,SQL tracing should much more closely track the SQL query execution,9599,simonw,open,0,,,,,0,2022-04-28T22:41:04Z,2022-04-28T22:41:10Z,,OWNER,,"In #1727 I realized that the SQL tracing was measuring a whole bunch of stuff outside of the SQL query itself. I started experimenting with this fix for that but it didn't work - I got back an empty JSON array of traces for some reason: ```diff diff --git a/datasette/database.py b/datasette/database.py index ba594a8..d7f9172 100644 --- a/datasette/database.py +++ b/datasette/database.py @@ -7,7 +7,7 @@ import sys import threading import uuid -from .tracer import trace +from .tracer import trace, trace_child_tasks from .utils import ( detect_fts, detect_primary_keys, @@ -207,30 +207,31 @@ class Database: time_limit_ms = custom_time_limit with sqlite_timelimit(conn, time_limit_ms): - try: - cursor = conn.cursor() - cursor.execute(sql, params if params is not None else {}) - max_returned_rows = self.ds.max_returned_rows - if max_returned_rows == page_size: - max_returned_rows += 1 - if max_returned_rows and truncate: - rows = cursor.fetchmany(max_returned_rows + 1) - truncated = len(rows) > max_returned_rows - rows = rows[:max_returned_rows] - else: - rows = cursor.fetchall() - truncated = False - except (sqlite3.OperationalError, sqlite3.DatabaseError) as e: - if e.args == (""interrupted"",): - raise QueryInterrupted(e, sql, params) - if log_sql_errors: - sys.stderr.write( - ""ERROR: conn={}, sql = {}, params = {}: {}\n"".format( - conn, repr(sql), params, e + with trace(""sql"", database=self.name, sql=sql.strip(), params=params): + try: + cursor = conn.cursor() + cursor.execute(sql, params if params is not None else {}) + max_returned_rows = self.ds.max_returned_rows + if max_returned_rows == page_size: + max_returned_rows += 1 + if max_returned_rows and truncate: + rows = cursor.fetchmany(max_returned_rows + 1) + truncated = len(rows) > max_returned_rows + rows = rows[:max_returned_rows] + else: + rows = cursor.fetchall() + truncated = False + except (sqlite3.OperationalError, sqlite3.DatabaseError) as e: + if e.args == (""interrupted"",): + raise QueryInterrupted(e, sql, params) + if log_sql_errors: + sys.stderr.write( + ""ERROR: conn={}, sql = {}, params = {}: {}\n"".format( + conn, repr(sql), params, e + 
) ) - ) - sys.stderr.flush() - raise + sys.stderr.flush() + raise if truncate: return Results(rows, truncated, cursor.description) @@ -238,9 +239,8 @@ class Database: else: return Results(rows, False, cursor.description) - with trace(""sql"", database=self.name, sql=sql.strip(), params=params): - results = await self.execute_fn(sql_operation_in_thread) - return results + with trace_child_tasks(): + return await self.execute_fn(sql_operation_in_thread) @property def size(self): ``` _Originally posted by @simonw in https://github.com/simonw/datasette/issues/1727#issuecomment-1111602802_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1730/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1129052172,I_kwDOBm6k_c5DS_gM,1633,base_url or prefix does not work with _exact match,6613091,henrikek,open,0,,,,,2,2022-02-09T21:45:07Z,2022-04-28T09:12:56Z,,NONE,,"When I hit the ""Apply"" button to search with the ""_exact"" column syntax, the URL prefix is removed from the URL. ![image](https://user-images.githubusercontent.com/6613091/153293758-0b757d55-5757-4987-992e-9426e69a7956.png) And the result is: ![image](https://user-images.githubusercontent.com/6613091/153294672-87be7809-bb7b-455d-bf1a-41e90bbfa4ae.png) If I add the marked row to url_builder.py it seems to work: ![image](https://user-images.githubusercontent.com/6613091/153295231-bdd52e37-efcf-4b21-9d37-69f182a922f4.png) ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1633/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1065432388,I_kwDOBm6k_c4_gTVE,1534,Maybe return JSON from HTML pages if `Accept: application/json` is sent,9599,simonw,closed,0,,,,,4,2021-11-28T20:48:09Z,2022-04-27T21:59:34Z,2022-02-02T23:39:33Z,OWNER,,"Relates to #1533 - and to the work I've been doing on the https://github.com/simonw/datasette-table Web Component. It would be useful to support users pasting in a URL to a Datasette table or query without first having to add the `.json` extension themselves - since then other systems could hit that URL with `Accept: application/json` to get back the JSON representation without first needing to read the `Link: ` header from #1533 to figure out what the URL to that JSON is. (There is weird logic deep in Datasette that says that you add `.json` to the path UNLESS the table name itself ends with `.json`, in which case you add `?_format=json` - this is super-confusing).
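(If this were supported, a client could skip URL rewriting entirely; an illustrative sketch, assuming the feature existed:)

```python
import httpx

# hypothetical: ask the HTML table page for JSON via content negotiation
response = httpx.get(
    'https://latest.datasette.io/fixtures/facetable',
    headers={'Accept': 'application/json'},
)
data = response.json()
```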
[Update: I removed that confusing feature here: [https://simonwillison.net/2022/Mar/19/weeknotes/](https://simonwillison.net/2022/Mar/19/weeknotes/)]",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1534/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1216508080,I_kwDOBm6k_c5IgnCw,1723,Research running SQL in table view in parallel using `asyncio.gather()`,9599,simonw,closed,0,,,,,5,2022-04-26T21:42:48Z,2022-04-27T18:53:44Z,2022-04-26T22:19:09Z,OWNER,,"Spun off from: - #1715",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1723/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1217014076,I_kwDOBm6k_c5Iiik8,1726,Security page in the documentation,9599,simonw,open,0,,,,,0,2022-04-27T08:43:30Z,2022-04-27T08:43:30Z,,OWNER,,"A page talking about how to run Datasette securely, and security concerns to take into account.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1726/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1216619276,I_kwDOBm6k_c5IhCMM,1724,?_trace=1 doesn't work on Global Power Plants demo,9599,simonw,closed,0,,,,,3,2022-04-27T00:15:02Z,2022-04-27T06:15:14Z,2022-04-27T00:18:30Z,OWNER,,"https://global-power-plants.datasettes.com/global-power-plants/global-power-plants?_trace=1 is not showing the trace JSON at the bottom of the page. Confirmed that `trace_debug` is `true` on https://global-power-plants.datasettes.com/-/settings Possibly related: - https://github.com/simonw/datasette-total-page-time/issues/1",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1724/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1216622905,I_kwDOBm6k_c5IhDE5,1725,Performance question - what is happening in this gap?,9599,simonw,open,0,,,,,0,2022-04-27T00:21:11Z,2022-04-27T00:21:11Z,,OWNER,,"Trace from https://latest-with-plugins.datasette.io/github/commits?_facet=repo&_trace=1&_facet=committer ![CleanShot 2022-04-26 at 17 20 06@2x](https://user-images.githubusercontent.com/9599/165413811-db2cd599-2acc-46ce-b9c2-f9bc45b879e9.png) What's going on in that gap? 
Can I improve the tracing output to show some non-SQL queries to figure that out?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1725/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1216479167,I_kwDOBm6k_c5Igf-_,1722,`db.primary_keys()` and `db.table_columns()` don't show up in traces,9599,simonw,open,0,,,,,0,2022-04-26T21:08:36Z,2022-04-26T21:08:36Z,,OWNER,,"Noticed this while working on: - #1715 This code here isn't showing up in traces: https://github.com/simonw/datasette/blob/579f59dcec43a91dd7d404e00b87a00afd8515f2/datasette/views/table.py#L218-L220 Because those functions don't use the regular trace-instrumented `db.execute()` code path - they work directly against a connection instead: https://github.com/simonw/datasette/blob/579f59dcec43a91dd7d404e00b87a00afd8515f2/datasette/utils/__init__.py#L610-L626 ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1722/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1215216249,I_kwDOCGYnMM5Ibrp5,428,Research adding support for savepoints,9599,simonw,open,0,,,,,1,2022-04-26T01:04:01Z,2022-04-26T01:05:29Z,,OWNER,,"https://www.sqlite.org/lang_savepoint.html Savepoints are like regular transactions except they have names and can be nested. Would there be any value in adding support to them to `sqlite-utils`, potentially as some kind of context manager? Something like this: ```python with db.savepoint(""name""): # do stuff with db.savepoint(""name2""): # do more stuff raise Release # Rolls back to before ""name2"" savepoint ``` I've never used this feature so I'm not comfortable adding anything like this without a bunch of extra research.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/428/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1213683988,I_kwDOBm6k_c5IV1kU,1718,Code examples in the documentation should be formatted with Black,9599,simonw,closed,0,,,,,12,2022-04-24T15:22:50Z,2022-04-24T16:24:14Z,2022-04-24T16:18:03Z,OWNER,,"For example on this page: https://docs.datasette.io/en/stable/writing_plugins.html#packaging-a-plugin I wonder if there's an easy way for me to enforce this for Sphinx documentation?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1718/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1212838949,I_kwDOBm6k_c5ISnQl,1716,Configure git blame to ignore Black commit,9599,simonw,closed,0,,,,,1,2022-04-22T21:56:37Z,2022-04-22T22:02:19Z,2022-04-22T22:02:19Z,OWNER,,"GitHub can support this in blame views now too: https://docs.github.com/en/repositories/working-with-files/using-files/viewing-a-file#ignore-commits-in-the-blame-view",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1716/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1177059481,I_kwDODFdgUs5GKICZ,71,Store commit parents,64686,carltongibson,closed,0,,,,,0,2022-03-22T17:06:48Z,2022-04-22T12:44:04Z,2022-04-22T12:44:04Z,NONE,,"Hi @simonw 👋 
Currently, stored commit data doesn't quite give me the information I'm needing... Committer date and author date are not 100% reliable for dividing a commit history up by release or branch. A PR created before a release but merged after can have earlier dates… — this can be quite frustrating if you're trying to pin down commits for a release: _It should be there!_, but then isn't. (This gets worse using release branches.) Would you be open to adding the `sha` of a `parent` of a commit to the commit table? (As an FK? 🤔 — likely not feasible.) It's part of the [response body](https://docs.github.com/en/rest/reference/commits#get-a-commit): ``` ""parents"": [ { ""url"": ""https://api.github.com/repos/octocat/Hello-World/commits/6dcb09b5b57875f334f61aebed695e2e4193db5e"", ""sha"": ""6dcb09b5b57875f334f61aebed695e2e4193db5e"" } ], ``` I think this list should only have a single entry. (🤔 — not sure why it's a list then...) With this it would be possible to build/reconstruct a chain of commits from the history, that I don't **think** is available as yet (unless you know a better way). It is certainly possible to get sequential lists of commits out of git directly, so the same would be possible combining tools, but wondering if a single tool could do it. What do you think? Thanks! 🏅 ",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/71/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1211283427,I_kwDODFdgUs5IMrfj,72,feature: display progress bar when downloading multi-page responses,9020979,hydrosquall,open,0,,,,,1,2022-04-21T16:37:12Z,2022-04-21T17:29:31Z,,NONE,,"## Motivation For a long running command (longer than 1 minute) for a big table (like pull requests or commits), it can be tricky to know if the script is still running, or if a rate limit/error was encountered We know how many pages there are, so it may be possible to indicate how many remain. https://github.com/dogsheep/github-to-sqlite/blob/a6e237f75a4b86963d91dcb5c9582e3a1b3349d6/github_to_sqlite/utils.py#L367 ## Resources - Using the existing Click API: - https://click.palletsprojects.com/en/5.x/utils/#showing-progress-bars - Loading spinner: https://github.com/pavdmyt/yaspin - Progress bar: https://github.com/tqdm/tqdm",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/72/reactions"", ""total_count"": 3, ""+1"": 3, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1205867842,I_kwDODtX3eM5H4BVC,4,Retrieve the top-level story for a comment,1755789,telotortium,open,0,,,,,0,2022-04-15T20:25:39Z,2022-04-15T20:25:39Z,,NONE,,"I think that each comment inserted into the database should include a column `onstory` that contains the ID of the story on which the comment was made. This is exactly equivalent to the link after ""on:"" at the top of an HN comment page ([example](https://news.ycombinator.com/item?id=18358028)). 
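(A minimal sketch of the second, recursive approach described just below, using the public Firebase API; for illustration only:)

```python
import functools

import requests

@functools.lru_cache(maxsize=None)
def fetch_item(item_id):
    url = f'https://hacker-news.firebaseio.com/v0/item/{item_id}.json'
    return requests.get(url).json()

def onstory(item_id):
    # walk parent pointers until we reach the top-level story
    item = fetch_item(item_id)
    while item.get('parent'):
        item = fetch_item(item['parent'])
    return item['id']
```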
We could do this either by directly retrieving the HTML page and using Beautiful Soup to find that link, or alternatively recurse up the tree in the Firebase API using the `parent` field (probably using `functools.lru_cache` in case a person has commented a bunch of times on the same story).",248903544,hacker-news-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/hacker-news-to-sqlite/issues/4/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1180427792,I_kwDOCGYnMM5GW-YQ,421,"""Error: near ""("": syntax error"" when using sqlite-utils indexes CLI",24938923,learning4life,closed,0,,,,,8,2022-03-25T07:12:51Z,2022-04-13T22:41:59Z,2022-04-13T22:41:59Z,NONE,,"This bug relates to https://github.com/simonw/sqlite-utils/issues/408#issuecomment-1066139147 **New error when using CLI: ""sqlite-utils indexes global.db --table""** ``` (app-root) sqlite-utils indexes global.db --table Error: near ""("": syntax error (app-root) sqlite-utils --version sqlite-utils, version 3.25.1 (app-root) sqlite3 --version 3.36.0 2021-06-18 18:36:39 (app-root) python --version Python 3.8.11 ``` Dockerfile ``` FROM centos/python-38-centos7 USER root RUN yum update -y RUN yum upgrade -y # epel RUN yum -y install epel-release && yum clean all # SQLite RUN yum -y install zlib-devel geos geos-devel proj proj-devel freexl freexl-devel libxml2-devel WORKDIR /build/ COPY sqlite-autoconf-3360000.tar.gz ./ RUN tar -zxf sqlite-autoconf-3360000.tar.gz WORKDIR /build/sqlite-autoconf-3360000 RUN ./configure RUN make RUN make install # RUN /opt/app-root/bin/python3.8 -m pip install --upgrade pip RUN pip install sqlite-utils ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/421/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1200866134,I_kwDOCGYnMM5Hk8NW,424,Better error message if you try to create a table with no columns,9599,simonw,closed,0,,,,,1,2022-04-12T02:43:20Z,2022-04-13T22:40:15Z,2022-04-13T22:40:10Z,OWNER,,"Seen here: - https://github.com/simonw/geojson-to-sqlite/issues/30 Attempting to create a table with no columns produced this confusing error: ``` File ""/Users/simon/.local/pipx/venvs/geojson-to-sqlite/lib/python3.9/site-packages/geojson_to_sqlite/utils.py"", line 69, in import_features db[table].create(column_types, pk=pk) File ""/Users/simon/.local/pipx/venvs/geojson-to-sqlite/lib/python3.9/site-packages/sqlite_utils/db.py"", line 863, in create self.db.create_table( File ""/Users/simon/.local/pipx/venvs/geojson-to-sqlite/lib/python3.9/site-packages/sqlite_utils/db.py"", line 517, in create_table self.execute(sql) File ""/Users/simon/.local/pipx/venvs/geojson-to-sqlite/lib/python3.9/site-packages/sqlite_utils/db.py"", line 236, in execute return self.conn.execute(sql) sqlite3.OperationalError: near "")"": syntax error ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/424/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1202227104,I_kwDOBm6k_c5HqIeg,1712,"Make """" easier to read",9599,simonw,closed,0,,,,,3,2022-04-12T18:17:07Z,2022-04-12T19:12:22Z,2022-04-12T18:44:20Z,OWNER,,"`Binary: 2,427,344 bytes` would be nicer - even better, include a tooltip showing that size translated using this 
function: https://github.com/simonw/datasette/blob/138e4d9a53e3982137294ba383303c3a848cfca4/datasette/utils/__init__.py#L837-L846 ![CleanShot 2022-04-12 at 11 15 04@2x](https://user-images.githubusercontent.com/9599/163027324-b0b6092e-6e11-438b-8077-789025d0bb37.png) ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1712/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1200649889,I_kwDOBm6k_c5HkHah,1710,Guide for plugin authors to upgrade their plugins for 1.0,9599,simonw,closed,0,,,,,1,2022-04-11T22:58:25Z,2022-04-11T23:04:01Z,2022-04-11T23:03:25Z,OWNER,,I'll also encourage testing against both Datasette 0.x and Datasette 1.0 using a GitHub Actions matrix.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1710/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1200224939,I_kwDOBm6k_c5Hifqr,1707,[feature] expanded detail page,536941,fgregg,open,0,,,,,1,2022-04-11T16:29:17Z,2022-04-11T16:33:00Z,,CONTRIBUTOR,,"Right now, if you click on the detail page for a row you get the info for the row and links to related tables: ![Screenshot 2022-04-11 at 12-27-26 lm20 filing](https://user-images.githubusercontent.com/536941/162786802-90ac1a71-4624-47c4-ae55-b783f4f6c92d.png) It would be very cool if there was an option to expand the rows of the related tables from within this detail view. If you had that then datasette could fulfill a pretty common use case where you want to search for an entity and get a consolidated detail view about what you know about that entity. ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1707/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1193090967,I_kwDOBm6k_c5HHR-X,1699,Proposal: datasette query,25778,eyeseast,open,0,,,,,6,2022-04-05T12:36:43Z,2022-04-11T01:32:12Z,,CONTRIBUTOR,,"I started sketching out a plugin to add a `datasette query` subcommand to export data from the command line. This is based on discussions in #1356 and #1605. Before I get too far down this rabbit hole, I figure it's worth getting some feedback here (unless this should happen in `Discussions`). Here's what I'm thinking: At its most basic, it will write the results of a query to STDOUT. ```sh datasette query -d data.db 'select * from data' > results.json ``` This isn't much improvement over using [sqlite-utils](https://github.com/simonw/sqlite-utils). To make better use of datasette and its ecosystem, run `datasette query` using a canned query defined in a `metadata.yml` file. For example, using the metadata file from [alltheplaces-datasette](https://github.com/eyeseast/alltheplaces-datasette/blob/main/metadata.yml): ```sh cd alltheplaces-datasette datasette query -d alltheplaces.db -m metadata.yml count_by_spider ``` That query would be good to get as CSV, and we can auto-discover metadata and databases in the current directory: ```sh cd alltheplaces-datasette datasette query count_by_spider -f csv ``` In this case, `count_by_spider` is a canned query defined on the `alltheplaces` database. If the same query is defined on multiple databases or it's otherwise unclear which database `query` should use, pass the `-d` or `--database` option.
If a query takes parameters, I can pass them in at runtime, using the `--param` or `-p` option: ```sh datasette query -d data.db -p value something 'select * from neighborhoods where some_column = :value' ``` I'm very interested in feedback on this, including whether it should be a plugin or in Datasette core. (I don't have a strong opinion about this, but I'm prototyping it as a plugin to start.)",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1699/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1197925865,I_kwDOBm6k_c5HZuXp,1704,File PRs against incompatible plugins pinning to datasette<1.0,9599,simonw,open,0,,,3268330,Datasette 1.0,0,2022-04-08T23:15:30Z,2022-04-08T23:15:30Z,,OWNER,,"As part of the preparation for the 1.0 release, test all existing known plugins against the alpha. For any that break, submit a PR suggesting they pin to a version <1.0 - and include a link to the documentation on how to upgrade the plugin for 1.0.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1704/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1196327155,I_kwDOBm6k_c5HToDz,1702,Be more consistent with column quoting,9599,simonw,open,0,,,,,0,2022-04-07T16:59:20Z,2022-04-07T16:59:20Z,,OWNER,,"This tutorial made me notice that Datasette is pretty inconsistent with how column quoting works: https://datasette.io/tutorials/learn-sql It has examples of each of `""table_name""` and `[table_name]` and `table_name`, and it uses single quoted values too. Datasette should generate SQL as consistently as possible to support learners. That tutorial should also provide a tiny bit of extra information about what's going on here.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1702/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1194790504,I_kwDOBm6k_c5HNw5o,1701,Use + for spaces instead of ~20,9599,simonw,closed,0,,,3268330,Datasette 1.0,0,2022-04-06T15:40:48Z,2022-04-06T15:55:10Z,2022-04-06T15:55:05Z,OWNER,,"Tilde encoding introduced in #1657 means that database files with spaces in the name - e.g. the Apple Mail `Envelope Index` database - end up with URLs like this: http://127.0.0.1:8001/Envelope~20Index I think this would be prettier: http://127.0.0.1:9933/Envelope+Index",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1701/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1077620955,I_kwDOBm6k_c5AOzDb,1549,Redesign CSV export to improve usability,536941,fgregg,open,0,,,3268330,Datasette 1.0,5,2021-12-11T19:02:12Z,2022-04-04T11:17:13Z,,CONTRIBUTOR,,"*Original title: Set content type for CSV so that browsers will attempt to download instead of opening in the browser* Right now, if the user clicks on the CSV related to a table or a query, the response header for the content type is ""content-type: text/plain; charset=utf-8"" Most browsers will try to open a file with this content-type in the browser.
This is not what most people want to do, and lots of folks don't know that if they want to download the CSV and open it in a spreadsheet program they then need to save the page through their browser. It would be great if the response headers could be something like ``` Content-Type: text/csv Content-Disposition: attachment; filename=MyVerySpecial.csv ``` which would lead browsers to open a download dialog. ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1549/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1190828163,I_kwDOBm6k_c5G-piD,1698,Add a warning about bots and Cloud Run,9599,simonw,closed,0,,,,,1,2022-04-03T05:57:17Z,2022-04-03T06:10:24Z,2022-04-03T06:10:24Z,OWNER,,Recommend the https://github.com/simonw/datasette-block-robots plugin if you are going to run a large database in Cloud Run (one with a lot of rows).,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1698/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1189113609,I_kwDOBm6k_c5G4G8J,1697,"`Request.fake(..., url_vars={})`",9599,simonw,closed,0,,,3268330,Datasette 1.0,1,2022-04-01T01:48:40Z,2022-04-01T02:02:18Z,2022-04-01T02:02:10Z,OWNER,,"I just created an alternative `.fake()` method because I wanted to fake the `url_vars` captured in the route as well: ```python from datasette.utils.asgi import Request class Request(Request): @classmethod def fake(cls, path_with_query_string, method=""GET"", scheme=""http"", url_vars=None): """"""Useful for constructing Request objects for tests"""""" path, _, query_string = path_with_query_string.partition(""?"") scope = { ""http_version"": ""1.1"", ""method"": method, ""path"": path, ""raw_path"": path_with_query_string.encode(""latin-1""), ""query_string"": query_string.encode(""latin-1""), ""scheme"": scheme, ""type"": ""http"", } if url_vars: scope[""url_route""] = { ""kwargs"": url_vars } return cls(scope, None) ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1697/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1182227211,I_kwDOBm6k_c5Gd1sL,1692,[plugins][feature request]: Support additional script tag attributes when loading custom JS,9020979,hydrosquall,open,0,,,,,2,2022-03-27T01:16:03Z,2022-03-30T06:14:51Z,,CONTRIBUTOR,,"## Motivation - The build system for my new [plugin](https://github.com/hydrosquall/datasette-nteract-data-explorer) has two output JS files, one for browsers that support ES modules, one for browsers that don't. At present, I'm only passing one of them into Datasette. - I'd like to specify the non-es-module script as a fallback for older browsers. I don't want to load it by default, because browsers will only need one, and it's heavy, so for now I'm only supporting modern browsers. To be able to support legacy browsers without slowing down users with modern browsers, I would like to be able to set additional HTML attributes on the fallback script tag, `nomodule` and `defer`.
My injected scripts should look something like this: ```html <script type=""module"" src=""/index.my-es-module-bundle.js""></script> <script nomodule defer src=""/index.my-legacy-fallback-bundle.js""></script> ``` ## Proposal To achieve this, I propose additional optional properties to the API accepted by the `extra_js_urls` hook and custom JS field in the `metadata.json` [described here](https://docs.datasette.io/en/stable/custom_templates.html#custom-css-and-javascript). Under this API, I'd write something like this to get the above HTML rendered in Datasette. ```json { ""extra_js_urls"": [ { ""url"": ""/index.my-es-module-bundle.js"", ""module"": true, }, { ""url"": ""/index.my-legacy-fallback-bundle.js"", ""nomodule"": """", ""defer"": true } ] } ``` ## Resources - [MDN on the script tag](https://developer.mozilla.org/en-US/docs/Web/HTML/Element/script) - There may be other properties that could be added that are potentially valuable, like `async` or `referrerpolicy`, but I don't have an immediate need for those.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1692/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1185868354,I_kwDOBm6k_c5GrupC,1695,Option to un-filter facet not shown for `?col__exact=value`,9599,simonw,open,0,,,,,2,2022-03-30T04:44:02Z,2022-03-30T04:46:18Z,,OWNER,,"Spotted this on a page with `COUNTY__exact=Lee` in the URL: ![CleanShot 2022-03-29 at 21 41 46@2x](https://user-images.githubusercontent.com/9599/160752849-a9039343-3770-4655-920b-f19e25687a57.png) With `COUNTY=Lee` you get this instead: ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1695/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1181432624,I_kwDOBm6k_c5Gazsw,1688,[plugins][documentation] Is it possible to serve per-plugin static folders when writing one-off (single file) plugins?,9020979,hydrosquall,closed,0,,,,,3,2022-03-26T01:17:44Z,2022-03-27T01:01:14Z,2022-03-26T21:34:47Z,CONTRIBUTOR,,"I'm trying to make a small plugin that depends on static assets, by following the guide [here](https://docs.datasette.io/en/stable/writing_plugins.html#writing-one-off-plugins). I made a `plugins/` directory with `datasette_nteract_data_explorer.py`. I am trying to follow the example of `datasette_vega` and serve static assets. I created a `statics/` directory within `plugins/` to serve my JS and CSS. https://github.com/simonw/datasette-vega/blob/00de059ab1ef77394ba9f9547abfacf966c479c4/datasette_vega/__init__.py#L13 Unfortunately, datasette doesn't seem to be able to find my assets. Input: ```bash datasette ~/Library/Safari/History.db --plugins-dir=plugins/ ``` ![Image 2022-03-25 at 9 18 17 PM](https://user-images.githubusercontent.com/9020979/160218979-a3ff474b-5255-4a76-85d1-6f90ab2e3b44.jpg) Output: ![Image 2022-03-25 at 9 11 00 PM](https://user-images.githubusercontent.com/9020979/160218733-ca5144cf-f23f-43d8-a8d3-e3a871e57f3a.jpg) I suspect this issue might go away if I move away from ""one-off"" plugin mode, but it's been a while since I created a new python package so I'm not sure how much work there is to go between ""one off"" and ""packaged for PyPI"". I'd like to try to avoid needing to repackage a new `tar.gz` file and/or reinstall my library repeatedly when developing new python code. 1. Is there a way to serve static assets when using the `plugins/` directory method instead of installing plugins as a new python package? 2.
If not, is there a way I can work on developing a plugin without creating and repackaging tar.gz files after every change, or is that the recommended path? Thanks for your help! ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1688/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1182143895,I_kwDOBm6k_c5GdhWX,1691,Bug in pytest-httpx example,9599,simonw,closed,0,,,,,0,2022-03-26T22:45:30Z,2022-03-26T22:46:09Z,2022-03-26T22:46:09Z,OWNER,,"https://docs.datasette.io/en/0.61.1/testing_plugins.html#testing-outbound-http-calls-with-pytest-httpx says: ```python async def test_outbound_http_call(httpx_mock): httpx_mock.add_response( url='https://www.example.com/', data='Hello world', ) ``` That's wrong - `data=` should be `text=`. https://github.com/Colin-b/pytest_httpx/blob/v0.20.0/README.md#reply-with-custom-body",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1691/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1182141761,I_kwDOBm6k_c5Gdg1B,1690,"Idea: `datasette.set_actor_cookie(response, actor)`",9599,simonw,open,0,,,,,2,2022-03-26T22:41:52Z,2022-03-26T22:43:00Z,,OWNER,,"I just wrote this code in a plugin and it felt like it could benefit from an abstraction: https://github.com/simonw/datasette-auth0/blob/152e6eb21e96e9b73bd9c205f9749a1297d0ef0b/datasette_auth0/__init__.py#L79-L92 ```python redirect_response = Response.redirect(""/"") expires_at = int(time.time()) + (24 * 60 * 60) redirect_response.set_cookie( ""ds_actor"", datasette.sign( { ""a"": profile_response.json(), ""e"": baseconv.base62.encode(expires_at), }, ""actor"", ), ) return redirect_response ``` ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1690/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1182065616,I_kwDOBm6k_c5GdOPQ,1689,datasette.add_message() documentation is incorrect,9599,simonw,closed,0,,,,,1,2022-03-26T20:49:42Z,2022-03-26T21:35:57Z,2022-03-26T20:51:21Z,OWNER,,"https://docs.datasette.io/en/0.61.1/internals.html#add-message-request-message-message-type-datasette-info says: `.add_message(request, message, message_type=datasette.INFO)` But in the code it's: https://github.com/simonw/datasette/blob/6b99e4a66ba0ed8fca8ee41ceb7206928b60d5d1/datasette/app.py#L582",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1689/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1181364043,I_kwDOBm6k_c5Gai9L,1687,Make show_json.html or a similar mechanism stable for plugins,9599,simonw,open,0,,,,,0,2022-03-25T23:42:45Z,2022-03-25T23:42:45Z,,OWNER,,"I used `show_json.html` in the new `datasette-packages` plugin, which means it will break if that template changes: - https://github.com/simonw/datasette-packages/issues/3 It would be useful if it (or something like it) was documented and stable for plugins to use. 
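For reference, here's a hedged sketch of the kind of reuse involved - rendering the internal template from a plugin view via `datasette.render_template()`; the context keys shown are illustrative, not a documented contract:

```python
# Sketch of a plugin view that reuses the internal show_json.html template.
# The template name is internal and undocumented; context keys are illustrative.
from datasette.utils.asgi import Response

async def packages_view(datasette, request):
    data = {'datasette': '0.61', 'sqlite-utils': '3.25.1'}
    return Response.html(
        await datasette.render_template(
            'show_json.html',
            {'filename': 'packages.json', 'data': data},
            request=request,
        )
    )
```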
Also relevant: - #878",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1687/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1175744654,I_kwDOCGYnMM5GFHCO,417,insert fails on JSONL with whitespace,9954,blaine,closed,0,,,,,3,2022-03-21T17:58:14Z,2022-03-25T21:19:06Z,2022-03-25T21:17:13Z,NONE,,"Any JSON that is newline-delimited and has whitespace (newlines) between the start of a JSON object and an attribute fails due to a parse error. e.g. given the valid JSONL: ```{ ""attribute"": ""value"" } { ""attribute"": ""value2"" } ``` I would expect that `sqlite-utils insert --nl my.db mytable file.jsonl` would properly import the data into `mytable`. However, the following error is thrown instead: `json.decoder.JSONDecodeError: Expecting property name enclosed in double quotes: line 2 column 1 (char 2)` It makes sense that since the file is intended to be newline separated, the thing being parsed is ""{"" (which obviously fails), however the default newline-separated output of `jq` isn't compact. Using `jq -c` avoids this problem, but the fix is unintuitive and undocumented. Proposed solutions: 1. Default to a ""loose"" newline-separated parse; this could be implemented internally as [the equivalent of] a `jq -c` filter ahead of the insert step. 2. Catch the JSONDecodeError (or pre-empt it in the case of a record === ""{\n"") and give the user a ""it looks like your json isn't _actually_ newline-delimited; try running it through `jq -c` instead"" error message. It might just have been too early in the morning when I was playing with this, but running pipes of data through sqlite-utils without the 'knack' of it led to some false starts.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/417/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1181236173,I_kwDOCGYnMM5GaDvN,422,Reconsider not running convert functions against null values,9599,simonw,open,0,,,,,1,2022-03-25T20:22:40Z,2022-03-25T20:23:21Z,,OWNER,,"I just got caught out by the fact that `None` values are not processed by the `.convert()` mechanism https://github.com/simonw/sqlite-utils/blob/0b7b80bd40fe86e4d66a04c9f607d94991c45c0b/sqlite_utils/db.py#L2504-L2510 I had run this code while working on #420 and I wasn't sure why it didn't work: ``` $ sqlite-utils add-column content.db articles score float $ sqlite-utils convert content.db articles score ' import random random.seed(10) def convert(value): global random return random.random() ' ``` The reason it didn't work is that the newly added `score` column was full of `null` values. 
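A minimal repro of that behavior using the Python API (an in-memory database; `convert()` leaves NULL values untouched):

```python
# Minimal repro: .convert() never calls the function for NULL values.
import sqlite_utils

db = sqlite_utils.Database(memory=True)
db['articles'].insert_all([{'id': 1}, {'id': 2}], pk='id')
db['articles'].add_column('score', float)  # every score is NULL at this point

db['articles'].convert('score', lambda value: 0.5)
# The scores are still NULL - the lambda was never invoked
print(db.execute('select score from articles').fetchall())  # [(None,), (None,)]
```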
I fixed it by doing this instead: $ sqlite-utils add-column content.db articles score float --not-null-default 1.0 But this indicates to me that the design of `convert()` here may be incorrect.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/422/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1181037277,I_kwDOBm6k_c5GZTLd,1686,heroku bails if app name specified in datasette publish is the same as existing app,2115933,tlongers,open,0,,,,,0,2022-03-25T17:10:34Z,2022-03-25T17:10:34Z,,NONE,,"It seems that `heroku` does not accept an app overwrite triggered by specifying the app name using `datasette publish`, as below: ``` datasette publish heroku some.db --name ""jazzy-name"" ``` The resulting error has the below traceback: ``` Creating jazzy-name... ! ▸ Name jazzy-name is already taken Traceback (most recent call last): File ""/opt/homebrew/bin/datasette"", line 33, in sys.exit(load_entry_point('datasette==0.60.1', 'console_scripts', 'datasette')()) File ""/opt/homebrew/Cellar/datasette/0.60.1/libexec/lib/python3.10/site-packages/click/core.py"", line 1128, in __call__ return self.main(*args, **kwargs) File ""/opt/homebrew/Cellar/datasette/0.60.1/libexec/lib/python3.10/site-packages/click/core.py"", line 1053, in main rv = self.invoke(ctx) File ""/opt/homebrew/Cellar/datasette/0.60.1/libexec/lib/python3.10/site-packages/click/core.py"", line 1659, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/opt/homebrew/Cellar/datasette/0.60.1/libexec/lib/python3.10/site-packages/click/core.py"", line 1659, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/opt/homebrew/Cellar/datasette/0.60.1/libexec/lib/python3.10/site-packages/click/core.py"", line 1395, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/opt/homebrew/Cellar/datasette/0.60.1/libexec/lib/python3.10/site-packages/click/core.py"", line 754, in invoke return __callback(*args, **kwargs) File ""/opt/homebrew/Cellar/datasette/0.60.1/libexec/lib/python3.10/site-packages/datasette/publish/heroku.py"", line 127, in heroku create_output = check_output(cmd).decode(""utf8"") File ""/opt/homebrew/Cellar/python@3.10/3.10.2/Frameworks/Python.framework/Versions/3.10/lib/python3.10/subprocess.py"", line 420, in check_output return run(*popenargs, stdout=PIPE, timeout=timeout, check=True, File ""/opt/homebrew/Cellar/python@3.10/3.10.2/Frameworks/Python.framework/Versions/3.10/lib/python3.10/subprocess.py"", line 524, in run raise CalledProcessError(retcode, process.args, subprocess.CalledProcessError: Command '['heroku', 'apps:create', 'jazzy-name', '--json']' returned non-zero exit status 1.
``` It's a solid failsafe, but does `datasette publish` have a way to force an overwrite?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1686/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1178456794,I_kwDOCGYnMM5GPdLa,418,Add generated files to .gitignore,25778,eyeseast,closed,0,,,,,0,2022-03-23T17:48:12Z,2022-03-24T21:01:44Z,2022-03-24T21:01:44Z,CONTRIBUTOR,,"I end up with these in my local directory: .hypothesis/ Pipfile Pipfile.lock pyproject.toml Might as well gitignore them.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/418/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1179998071,I_kwDOBm6k_c5GVVd3,1684,Mechanism for disabling faceting on large tables only,9599,simonw,open,0,,,,,1,2022-03-24T20:06:11Z,2022-03-24T20:13:19Z,,OWNER,,"Forest turned off faceting on https://labordata.bunkum.us/ because it was causing performance problems on some of the huge tables - but it would be nice if it could still be an option on smaller tables such as https://labordata.bunkum.us/voluntary_recognitions-4421085/voluntary_recognitions One option: a new setting that automatically disables faceting (and facet suggestion) for tables that have either more than X rows or that are so big that the count could not be completed within the time limit.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1684/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1179928510,I_kwDOBm6k_c5GVEe-,1683,allow_facet: False should be respected by column cog menu,9599,simonw,closed,0,,,,,0,2022-03-24T19:05:06Z,2022-03-24T19:16:36Z,2022-03-24T19:16:36Z,OWNER,,"The column cog menu currently shows ""Facet by this"" even if faceting is disabled for the Datasette instance.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1683/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1089529555,I_kwDOBm6k_c5A8ObT,1581,"when hashed urls are turned on, the _memory db has improperly long-lived cache expiry",536941,fgregg,closed,0,,,,,1,2021-12-28T00:05:48Z,2022-03-24T04:08:18Z,2022-03-24T04:08:18Z,CONTRIBUTOR,,"if hashed_urls are on, then a -000 suffix is added to the `_memory` database, and the cache settings are set just as if it was a normal hashed database. in particular, this header is set: `cache-control: max-age=31536000` this is not appropriate because the `_memory-000` database isn't really hashed based on the contents of the databases (see #1561). Either the cache-control header should be changed, or the _memory db should have a hash suffix that does depend on the contents of the databases. 
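One hedged sketch of what that could look like - derive the `_memory` suffix from the content hashes of every attached database, so it changes whenever any of them change (this is just the shape of the idea, not how Datasette computes its hashes today):

```python
# Sketch: a _memory suffix that depends on the contents of all databases.
import hashlib

def memory_db_suffix(db_paths):
    combined = hashlib.sha256()
    for path in sorted(db_paths):  # sort so argument order doesn't matter
        with open(path, 'rb') as f:
            combined.update(hashlib.sha256(f.read()).digest())
    return combined.hexdigest()[:7]

# e.g. serve /_memory-a1b2c3d instead of the static /_memory-000
```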
",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1581/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1178521513,I_kwDOBm6k_c5GPs-p,1682,SQL queries against databases with different routes are broken,9599,simonw,closed,0,,,,,1,2022-03-23T18:42:57Z,2022-03-23T18:48:16Z,2022-03-23T18:48:16Z,OWNER,,"500 error on https://datasette-hashed-urls-preview.vercel.app/fixtures-09f8f95?sql=select+*+from+facetable Here's the trace: ``` File ""/Users/simon/.local/share/virtualenvs/datasette-hashed-urls-ssI2fO50/lib/python3.10/site-packages/datasette/views/database.py"", line 54, in data return await QueryView(self.ds).data( File ""/Users/simon/.local/share/virtualenvs/datasette-hashed-urls-ssI2fO50/lib/python3.10/site-packages/datasette/views/database.py"", line 232, in data self.ds.get_database(database), sql File ""/Users/simon/.local/share/virtualenvs/datasette-hashed-urls-ssI2fO50/lib/python3.10/site-packages/datasette/app.py"", line 401, in get_database return self.databases[name] KeyError: 'fixtures-aa7318b' ``` It looks like this is a Datasette bug, which is frustrating because I just shipped Datasette 0.61 five minutes ago! _Originally posted by @simonw in https://github.com/simonw/datasette-hashed-urls/issues/13#issuecomment-1076693667_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1682/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1174423568,I_kwDOBm6k_c5GAEgQ,1670,Ship Datasette 0.61,9599,simonw,closed,0,,,,,6,2022-03-20T02:47:54Z,2022-03-23T18:32:32Z,2022-03-23T18:32:03Z,OWNER,,"Let the alpha bake for a while, since #1668 is a big last-minute change. After shipping, release a new `datasette-hashed-urls` that depends on it, also this: - https://github.com/simonw/datasette-hashed-urls/issues/11",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1670/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1177101697,I_kwDOBm6k_c5GKSWB,1681,Potential bug in numeric handling where_clause for filters,9599,simonw,open,0,,,,,2,2022-03-22T17:43:50Z,2022-03-22T17:49:09Z,,OWNER,,"> Note that Datasette does already have special logic to convert parameters to integers for numeric comparisons like `>`: > > https://github.com/simonw/datasette/blob/c4c9dbd0386e46d2bf199f0ed34e4895c98cb78c/datasette/filters.py#L203-L212 > > Though... it looks like there's a bug in that? It doesn't account for `float` values - `""3.5"".isdigit()` return `False` - probably for the best, because `int(3.5)` would break that value anyway. 
_Originally posted by @simonw in https://github.com/simonw/datasette/issues/1671#issuecomment-1075432283_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1681/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1174655187,I_kwDOBm6k_c5GA9DT,1671,Filters fail to work correctly against calculated numeric columns returned by SQL views because type affinity rules do not apply,9308268,rayvoelker,open,0,,,,,8,2022-03-20T19:17:24Z,2022-03-22T17:43:12Z,,NONE,,"I found a strange behavior, and I'm not sure if it's related to views and boolean values perhaps, or if there's something else weird going on here, but I'll provide an example that may help show what I'm seeing happen. ```bash #!/bin/bash echo ""\""id\"",\""expiration_date\"" 0,2018-01-04 1,2019-01-05 2,2020-01-06 3,2021-01-07 4,2022-01-08 5,2023-01-09 6,2024-01-10 7,2025-01-11 8,2026-01-12 9,2027-01-13 "" > test.csv csvs-to-sqlite test.csv test.db sqlite-utils create-view --replace test.db test_view ""select id, expiration_date, case when julianday('NOW') >= julianday(expiration_date) then 1 else 0 end as has_expired FROM test"" ``` ```bash datasette test.db ``` ![image](https://user-images.githubusercontent.com/9308268/159178745-9c6152f7-eac6-4bf9-bef5-a2d63d3ee13f.png) ![image](https://user-images.githubusercontent.com/9308268/159178824-c8952137-270c-42a4-ad1c-f6ad2c51e499.png) ![image](https://user-images.githubusercontent.com/9308268/159178877-23e00b36-443a-43ef-83e5-e0bdddd3fdcd.png) ![image](https://user-images.githubusercontent.com/9308268/159178918-65922cc7-2514-4735-a72d-4904b99976d4.png) Thanks again and let me know if you want me to provide anything else!",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1671/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 340396247,MDU6SXNzdWUzNDAzOTYyNDc=,339,Expose SANIC_RESPONSE_TIMEOUT config option in a sensible way,12617395,bsilverm,closed,0,,,,,4,2018-07-11T20:38:06Z,2022-03-21T22:22:40Z,2022-03-21T22:22:34Z,NONE,,"Is it possible to configure the sql_time_limit_ms beyond 60 seconds? It seems queries are still timing out at 60 seconds when sql_time_limit_ms is set to 180000. We have a very large data set and often encounter timeouts when testing new queries from the datasette UI. We are optimizing our database as much as we can, but still may require more than 60 seconds for complex queries.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/339/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 324835838,MDU6SXNzdWUzMjQ4MzU4Mzg=,276,Handle spatialite geometry columns better,45057,russss,closed,0,,,,,21,2018-05-21T08:46:55Z,2022-03-21T22:22:20Z,2022-03-21T22:22:20Z,CONTRIBUTOR,,"I'd like to see spatialite geometry columns rendered more sensibly - at the moment they come through as well-known-binary unless you use custom SQL, and WKB isn't of much use to anyone on the web. In HTML: they should be shown either as simple lat/long (if it's just a point, for example), or as a sensible placeholder if they're more complex geometries. In JSON: they should be GeoJSON geometries, (which means they can be automatically fed into a leaflet map with no further messing around). 
In CSV: they should be WKT. I briefly wondered if this should go into a plugin, but I suspect it needs hooking in at a deeper level than the plugin architecture will support any time soon.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/276/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 953218043,MDU6SXNzdWU5NTMyMTgwNDM=,1403,Labels explaining what hidden tables are for,9599,simonw,open,0,,,,,0,2021-07-26T19:29:22Z,2022-03-21T22:20:37Z,,OWNER,,"A reasonable question: ""What are those hidden tables for?"" This could be answered by adding a small piece of explanatory text to each table - based on if it's related to FTS or to SpatiaLite or configured to be hidden for some other reason.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1403/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1175854982,I_kwDOBm6k_c5GFh-G,1679,Research: how much overhead does the n=1 time limit have?,9599,simonw,closed,0,,,3268330,Datasette 1.0,11,2022-03-21T19:27:46Z,2022-03-21T21:55:57Z,2022-03-21T21:55:56Z,OWNER,,"https://github.com/simonw/datasette/blob/1a7750eb29fd15dd2eea3b9f6e33028ce441b143/datasette/utils/__init__.py#L181-L200 ```python @contextmanager def sqlite_timelimit(conn, ms): deadline = time.perf_counter() + (ms / 1000) # n is the number of SQLite virtual machine instructions that will be # executed between each check. It's hard to know what to pick here. # After some experimentation, I've decided to go with 1000 by default and # 1 for time limits that are less than 50ms n = 1000 if ms < 50: n = 1 def handler(): if time.perf_counter() >= deadline: return 1 conn.set_progress_handler(handler, n) try: yield finally: conn.set_progress_handler(None, n) ``` How often do I set a time limit of 50 or less? How much slower does it go thanks to this code?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1679/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1175894898,I_kwDOBm6k_c5GFrty,1680,Consider simplifying permissions for 1.0,9599,simonw,open,0,,,3268330,Datasette 1.0,0,2022-03-21T20:17:29Z,2022-03-21T20:17:29Z,,OWNER,,"Permission checks right now can express one of three opinions: - `False` means ""do not grant this permission"" - `True` means ""grant this permission"" - `None` means ""I have no opinion"" But... there's also a concept of a ""default"" for a given permission check, which might be `False` or `True`. I worry this is too complicated. Could this be simplified before 1.0? In particular the default concept.
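For discussion, a toy model of the current three-valued resolution plus a default; the precedence shown here (deny wins, then grant, then default) is a simplification for illustration, not a spec:

```python
# Toy model: combine hook opinions (True / False / None) with a default.
def resolve_permission(opinions, default=False):
    if any(opinion is False for opinion in opinions):
        return False  # an explicit deny always wins
    if any(opinion is True for opinion in opinions):
        return True   # otherwise an explicit grant wins
    return default    # everyone abstained: fall back to the default

print(resolve_permission([None, True]))                 # True
print(resolve_permission([True, False], default=True))  # False
print(resolve_permission([None, None], default=True))   # True
```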
See also: - #1676 ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1680/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1170144879,I_kwDOBm6k_c5Fvv5v,1660,Refactor and simplify Datasette routing and views,9599,simonw,closed,0,,,3268330,Datasette 1.0,8,2022-03-15T19:56:56Z,2022-03-21T19:19:12Z,2022-03-21T19:19:01Z,OWNER,,"While working on: - https://github.com/simonw/datasette/issues/1657 - https://github.com/simonw/datasette/issues/1439 It became very clear that the least maintainable part of Datasette at the moment is the way routing to the database, table and row views work - in particular the subclassing mechanism with BaseView and DataView, but also the complex variety of ways in which the URL routes capture different named regular expression groups.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1660/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1175715988,I_kwDOBm6k_c5GFACU,1678,Make `check_visibility()` a documented API,9599,simonw,closed,0,,,3268330,Datasette 1.0,1,2022-03-21T17:30:34Z,2022-03-21T19:04:03Z,2022-03-21T19:01:46Z,OWNER,,"Spotted this while working on: - #1677 https://github.com/simonw/datasette/blob/e627510b760198ccedba9e5af47a771e847785c9/datasette/utils/__init__.py#L1005-L1021",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1678/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1175694248,I_kwDOBm6k_c5GE6uo,1677,Remove `check_permission()` from `BaseView`,9599,simonw,closed,0,,,3268330,Datasette 1.0,1,2022-03-21T17:18:18Z,2022-03-21T18:45:04Z,2022-03-21T18:45:03Z,OWNER,,"Follow-on from: - #1675 Refs: - #1660",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1677/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1175648453,I_kwDOBm6k_c5GEvjF,1675,Extract out `check_permissions()` from `BaseView,9599,simonw,closed,0,,,,,7,2022-03-21T16:39:46Z,2022-03-21T17:14:31Z,2022-03-21T17:13:21Z,OWNER,,"> I'm going to refactor this stuff out and document it so it can be easily used by plugins: https://github.com/simonw/datasette/blob/4a4164b81191dec35e423486a208b05a9edc65e4/datasette/views/base.py#L69-L103 _Originally posted by @simonw in https://github.com/simonw/datasette/issues/1660#issuecomment-1074136176_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1675/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 780153562,MDU6SXNzdWU3ODAxNTM1NjI=,1177,Ability to stream all rows as newline-delimited JSON,9599,simonw,open,0,,,3268330,Datasette 1.0,1,2021-01-06T07:10:48Z,2022-03-21T15:08:52Z,,OWNER,,"> Yet another use-case for this: I want to be able to stream newline-delimited JSON in order to better import into Pandas: > > pandas.read_json(""https://latest.datasette.io/fixtures/compound_three_primary_keys.json?_shape=array&_nl=on"", lines=True) _Originally posted by @simonw in 
https://github.com/simonw/datasette/issues/1101#issuecomment-755128038_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1177/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1091819089,I_kwDOCGYnMM5BE9ZR,360,MemoryError,559453,nzaar9,closed,0,,,,,1,2022-01-01T13:39:17Z,2022-03-21T04:22:46Z,2022-03-21T04:22:46Z,NONE,,"Hi, when dealing with a large json file (~170GB) I got the following error ``` Traceback (most recent call last): File ""/usr/local/bin/sqlite-utils"", line 8, in sys.exit(cli()) File ""/usr/lib/python3/dist-packages/click/core.py"", line 1126, in __call__ return self.main(*args, **kwargs) File ""/usr/lib/python3/dist-packages/click/core.py"", line 1051, in main rv = self.invoke(ctx) File ""/usr/lib/python3/dist-packages/click/core.py"", line 1657, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/usr/lib/python3/dist-packages/click/core.py"", line 1393, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/usr/lib/python3/dist-packages/click/core.py"", line 752, in invoke return __callback(*args, **kwargs) File ""/usr/local/lib/python3.9/dist-packages/sqlite_utils/cli.py"", line 1300, in memory rows, format_used = rows_from_file(csv_fp, format=format, encoding=encoding) File ""/usr/local/lib/python3.9/dist-packages/sqlite_utils/utils.py"", line 185, in rows_from_file return rows_from_file(buffered, format=Format.JSON) File ""/usr/local/lib/python3.9/dist-packages/sqlite_utils/utils.py"", line 156, in rows_from_file decoded = json.load(fp) File ""/usr/lib/python3.9/json/__init__.py"", line 293, in load return loads(fp.read(), MemoryError ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/360/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1171599874,I_kwDOCGYnMM5F1TIC,415,Convert with `--multi` and `--dry-run` flag does not work,3976183,dotcs,closed,0,,,,,2,2022-03-16T21:59:46Z,2022-03-21T04:18:24Z,2022-03-21T04:18:24Z,NONE,,"It's not possible to combine the `--multi` and `--dry-run` flags in the `convert` command.
Let's first create a simple database from a JSON string ```console $ echo '[{""foo"": ""abc""}]' | sqlite-utils insert demo.db demo - $ sqlite-utils query demo.db ""SELECT * FROM demo"" [{""foo"": ""abc""}] ``` and then try to convert the ""foo"" column with a static value ""bar"" (see docs [Converting a column into multiple columns](https://sqlite-utils.datasette.io/en/stable/cli.html#converting-a-column-into-multiple-columns)) ```console $ sqlite-utils convert demo.db demo foo '{""foo"": ""bar""}' --multi --dry-run Traceback (most recent call last): File ""/home/dotcs/anaconda3/envs/tools/bin/sqlite-utils"", line 8, in sys.exit(cli()) File ""/home/dotcs/anaconda3/envs/tools/lib/python3.9/site-packages/click/core.py"", line 1128, in __call__ return self.main(*args, **kwargs) File ""/home/dotcs/anaconda3/envs/tools/lib/python3.9/site-packages/click/core.py"", line 1053, in main rv = self.invoke(ctx) File ""/home/dotcs/anaconda3/envs/tools/lib/python3.9/site-packages/click/core.py"", line 1659, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/home/dotcs/anaconda3/envs/tools/lib/python3.9/site-packages/click/core.py"", line 1395, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/home/dotcs/anaconda3/envs/tools/lib/python3.9/site-packages/click/core.py"", line 754, in invoke return __callback(*args, **kwargs) File ""/home/dotcs/anaconda3/envs/tools/lib/python3.9/site-packages/sqlite_utils/cli.py"", line 2686, in convert for row in db.conn.execute(sql, where_args).fetchall(): sqlite3.OperationalError: user-defined function raised exception ``` But without the `--dry-run` flag it does work as expected: ```console $ sqlite-utils convert demo.db demo foo '{""foo"": ""bar""}' --multi $ sqlite-utils query demo.db ""SELECT * FROM demo"" [{""foo"": ""bar""}] ``` ```console $ sqlite-utils --version sqlite-utils, version 3.25.1 ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/415/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1174717287,I_kwDOBm6k_c5GBMNn,1674,Tweak design of /.json,9599,simonw,open,0,,,3268330,Datasette 1.0,1,2022-03-20T22:58:01Z,2022-03-20T22:58:40Z,,OWNER,,"https://latest.datasette.io/.json Currently: ```json { ""_memory"": { ""name"": ""_memory"", ""hash"": null, ""color"": ""a6c7b9"", ""path"": ""/_memory"", ""tables_and_views_truncated"": [], ""tables_and_views_more"": false, ""tables_count"": 0, ""table_rows_sum"": 0, ""show_table_row_counts"": false, ""hidden_table_rows_sum"": 0, ""hidden_tables_count"": 0, ""views_count"": 0, ""private"": false }, ""fixtures"": { ""name"": ""fixtures"", ""hash"": ""645005884646eb941c89997fbd1c0dd6be517cb1b493df9816ae497c0c5afbaa"", ""color"": ""645005"", ""path"": ""/fixtures"", ""tables_and_views_truncated"": [ { ""name"": ""compound_three_primary_keys"", ""columns"": [ ""pk1"", ""pk2"", ""pk3"", ""content"" ], ""primary_keys"": [ ""pk1"", ""pk2"", ""pk3"" ], ""count"": 1001, ""hidden"": false, ""fts_table"": null, ""num_relationships_for_sorting"": 0, ""private"": false }, ``` As-of this issue the `""path""` key is confusing, it doesn't match what https://latest.datasette.io/-/databases returns: - #1668",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1674/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"":
0}",, 910088936,MDU6SXNzdWU5MTAwODg5MzY=,1355,datasette --get should efficiently handle streaming CSV,9599,simonw,open,0,,,,,2,2021-06-03T04:40:40Z,2022-03-20T22:38:53Z,,OWNER,,"It would be great if you could use `datasette --get` to run queries that return streaming CSV data without running out of RAM. Current implementation looks like it loads the entire result into memory first: https://github.com/simonw/datasette/blob/f78ebdc04537a6102316d6dbbf6c887565806078/datasette/cli.py#L546-L552",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1355/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1174708375,I_kwDOBm6k_c5GBKCX,1673,Streaming CSV spends a lot of time in `table_column_details`,9599,simonw,open,0,,,,,1,2022-03-20T22:25:28Z,2022-03-20T22:34:06Z,,OWNER,,"At least I think it does. I tried running `py-spy top -p $PID` against a Datasette process that was trying to do: datasette covid.db --get '/covid/ny_times_us_counties.csv?_size=10&_stream=on' While investigating: - #1355 And spotted this: ``` datasette covid.db --get /covid/ny_times_us_counties.csv?_size=10&_stream=on' (python v3.10.2) Total Samples 5800 GIL: 71.00%, Active: 98.00%, Threads: 4 %Own %Total OwnTime TotalTime Function (filename:line) 8.00% 8.00% 4.32s 4.38s sql_operation_in_thread (datasette/database.py:212) 5.00% 5.00% 3.77s 3.93s table_column_details (datasette/utils/__init__.py:614) 6.00% 6.00% 3.72s 3.72s _worker (concurrent/futures/thread.py:81) 7.00% 7.00% 2.98s 2.98s _read_from_self (asyncio/selector_events.py:120) 5.00% 6.00% 2.35s 2.49s detect_fts (datasette/utils/__init__.py:571) 4.00% 4.00% 1.34s 1.34s _write_to_self (asyncio/selector_events.py:140) ``` Relevant code: https://github.com/simonw/datasette/blob/798f075ef9b98819fdb564f9f79c78975a0f71e8/datasette/utils/__init__.py#L609-L625 ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1673/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1174697144,I_kwDOBm6k_c5GBHS4,1672,Refactor CSV handling code out of DataView,9599,simonw,open,0,,,3268330,Datasette 1.0,1,2022-03-20T21:47:00Z,2022-03-20T21:52:39Z,,OWNER,,"> I think the way to get rid of most of the remaining complexity in `DataView` is to refactor how CSV stuff works - pulling it in line with other export factors and extracting the streaming mechanism. Opening a fresh issue for that. _Originally posted by @simonw in https://github.com/simonw/datasette/issues/1660#issuecomment-1073355032_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1672/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 688351054,MDU6SXNzdWU2ODgzNTEwNTQ=,140,Idea: insert-files mechanism for adding extra columns with fixed values,9599,simonw,open,0,,,,,1,2020-08-28T20:57:36Z,2022-03-20T19:45:45Z,,OWNER,,"Say for example you want to populate a `file_type` column with the value `gif`. 
That could work like this: ``` sqlite-utils insert-files gifs.db images *.gif \ -c path -c md5 -c last_modified:mtime \ -c file_type:text:gif --pk=path ``` So a column defined as a `text` column with a value that follows a second colon.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/140/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1174306154,I_kwDOBm6k_c5F_n1q,1668,"Introduce concept of a database `route`, separate from its name",9599,simonw,closed,0,,,3268330,Datasette 1.0,20,2022-03-19T16:48:28Z,2022-03-20T16:43:16Z,2022-03-20T16:43:16Z,OWNER,,"Some issues came up in the new `datasette-hashed-urls` plugin relating to the way it renames databases on startup to achieve unique URLs that depend on the database SHA-256 content: - https://github.com/simonw/datasette-hashed-urls/issues/10 - https://github.com/simonw/datasette-hashed-urls/issues/9 - https://github.com/simonw/datasette-hashed-urls/issues/8 All three of these could be addressed by making the ""path"" concept for a database (the `/foo` bit where it is served) work independently of the database's name, which would be used for default display and also as the alias when configuring cross-database aliases.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1668/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1123393829,I_kwDODFE5qs5C9aEl,10,sqlite3.OperationalError: no such table: main.my_activity,69208826,glxblt14,open,0,,,,,1,2022-02-03T17:59:29Z,2022-03-20T02:38:07Z,,NONE,,"Hello, When I run the command `google-takeout-to-sqlite my-activity db.db takeout-20220203T174446Z-001.zip`, I get this error: ``` Traceback (most recent call last): File ""c:\users\julie\appdata\local\programs\python\python39-32\lib\runpy.py"", line 197, in _run_module_as_main return _run_code(code, main_globals, None, File ""c:\users\julie\appdata\local\programs\python\python39-32\lib\runpy.py"", line 87, in _run_code exec(code, run_globals) File ""C:\Users\julie\AppData\Local\Programs\Python\Python39-32\Scripts\google-takeout-to-sqlite.exe\__main__.py"", line 7, in File ""c:\users\julie\appdata\local\programs\python\python39-32\lib\site-packages\click\core.py"", line 1128, in __call__ return self.main(*args, **kwargs) File ""c:\users\julie\appdata\local\programs\python\python39-32\lib\site-packages\click\core.py"", line 1053, in main rv = self.invoke(ctx) File ""c:\users\julie\appdata\local\programs\python\python39-32\lib\site-packages\click\core.py"", line 1659, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""c:\users\julie\appdata\local\programs\python\python39-32\lib\site-packages\click\core.py"", line 1395, in invoke return ctx.invoke(self.callback, **ctx.params) File ""c:\users\julie\appdata\local\programs\python\python39-32\lib\site-packages\click\core.py"", line 754, in invoke return __callback(*args, **kwargs) File ""c:\users\julie\appdata\local\programs\python\python39-32\lib\site-packages\google_takeout_to_sqlite\cli.py"", line 31, in my_activity utils.save_my_activity(db, zf) File ""c:\users\julie\appdata\local\programs\python\python39-32\lib\site-packages\google_takeout_to_sqlite\utils.py"", line 19, in save_my_activity db[""my_activity""].create_index([""time""]) File
""c:\users\julie\appdata\local\programs\python\python39-32\lib\site-packages\sqlite_utils\db.py"", line 629, in create_index self.db.conn.execute(sql) sqlite3.OperationalError: no such table: main.my_activity ``` Thank you for your help Sorry for my bad English EDIT: i used the json format",206649770,google-takeout-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/google-takeout-to-sqlite/issues/10/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1174404647,I_kwDOBm6k_c5F__4n,1669,Release 0.61 alpha,9599,simonw,closed,0,,,,,2,2022-03-20T00:35:35Z,2022-03-20T01:24:36Z,2022-03-20T01:24:36Z,OWNER,,"> I'm going to release this as a 0.61 alpha so I can more easily depend on it from `datasette-hashed-urls`. _Originally posted by @simonw in https://github.com/simonw/datasette/issues/1668#issuecomment-1073136896_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1669/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1174302994,I_kwDOBm6k_c5F_nES,1667,Make route matched pattern groups more consistent,9599,simonw,closed,0,,,3268330,Datasette 1.0,3,2022-03-19T16:32:35Z,2022-03-19T20:37:42Z,2022-03-19T20:37:41Z,OWNER,,"> ... highlights how inconsistent the way the capturing works is. Especially `as_format` which can be `None` or `""""` or `.json` or `json` or not used at all in the case of `TableView`. https://github.com/simonw/datasette/blob/764738dfcb16cd98b0987d443f59d5baa9d3c332/tests/test_routes.py#L12-L36 _Originally posted by @simonw in https://github.com/simonw/datasette/issues/1666#issuecomment-1073039670_ Part of: - #1660",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1667/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1174162781,I_kwDOBm6k_c5F_E1d,1666,Refactor URL routing to enable testing,9599,simonw,closed,0,,,3268330,Datasette 1.0,3,2022-03-19T03:52:29Z,2022-03-19T16:32:03Z,2022-03-19T16:32:03Z,OWNER,,"I ran into some bugs earlier with URL routing - having more robust testing around this (especially since they are defined using regular expressions) would be really useful. - A utility function that resolves a path against a list of reflexes and returns the match - Make the routes and regular expressions available from a private Datasette method - Add tests that exercise them Related: - #1660",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1666/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 648435885,MDU6SXNzdWU2NDg0MzU4ODU=,878,"New pattern for views that return either JSON or HTML, available for plugins",9599,simonw,open,0,,,3268330,Datasette 1.0,26,2020-06-30T19:26:13Z,2022-03-19T16:19:30Z,,OWNER,,"Can be part of #870 - refactoring existing views to use `register_routes()`. > I'm going to put the new `check_permissions()` method on `BaseView` as well. If I want that method to be available to plugins I can do so by turning that `BaseView` class into a documented API that plugins are encouraged to use themselves. 
_Originally posted by @simonw in https://github.com/simonw/datasette/issues/832#issuecomment-651995453_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/878/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 531755959,MDU6SXNzdWU1MzE3NTU5NTk=,647,Move hashed URL mode out to a plugin,9599,simonw,closed,0,,,3268330,Datasette 1.0,9,2019-12-03T06:29:03Z,2022-03-19T11:56:05Z,2022-03-15T23:13:06Z,OWNER,,"They used to be the default until #418. Since making them optional I haven't felt the need to use them even once. That suggests to me that they should be removed. I think their effect could be entirely handled by an ASGI wrapping plugin. https://datasette.readthedocs.io/en/0.32/performance.html#hashed-url-mode",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/647/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 810397025,MDU6SXNzdWU4MTAzOTcwMjU=,1228,500 error caused by faceting if a column called `n` exists,7107523,Kabouik,closed,0,,,,,5,2021-02-17T17:41:20Z,2022-03-19T06:44:40Z,2022-03-19T01:38:04Z,NONE,,"I recently discovered `datasette` thanks to your great talk at FOSDEM and would like to use it for some projects. However, when trying to use it on databases created from some csv or tsv files, I am sometimes getting this issue when going to http://127.0.0.1:8001/databasetest/databasetest and I don't exactly understand what it refers to. So far, I couldn't find anything relevant when reviewing the raw text files that could explain this issue, nor could I find something obvious between the files that generate this issue and those that don't. Does the error ring a bell and, if so, could you please point me in the right direction? ``` $ datasette databasetest.db INFO: Started server process [1408482] INFO: Waiting for application startup. INFO: Application startup complete.
INFO: Uvicorn running on http://127.0.0.1:8001 (Press CTRL+C to quit) INFO: 127.0.0.1:56394 - ""GET / HTTP/1.1"" 200 OK INFO: 127.0.0.1:56394 - ""GET /-/static/app.css?4e362c HTTP/1.1"" 200 OK INFO: 127.0.0.1:56396 - ""GET /-/static-plugins/datasette_vega/main.2acbb312.css HTTP/1.1"" 200 OK INFO: 127.0.0.1:56398 - ""GET /-/static-plugins/datasette_vega/main.08f5d3d8.js HTTP/1.1"" 200 OK Traceback (most recent call last): File ""/home/kabouik/.local/lib/python3.7/site-packages/datasette/app.py"", line 1099, in route_path response = await view(request, send) File ""/home/kabouik/.local/lib/python3.7/site-packages/datasette/views/base.py"", line 147, in view request, **request.scope[""url_route""][""kwargs""] File ""/home/kabouik/.local/lib/python3.7/site-packages/datasette/views/base.py"", line 121, in dispatch_request return await handler(request, *args, **kwargs) File ""/home/kabouik/.local/lib/python3.7/site-packages/datasette/views/base.py"", line 260, in get request, database, hash, correct_hash_provided, **kwargs File ""/home/kabouik/.local/lib/python3.7/site-packages/datasette/views/base.py"", line 434, in view_get request, database, hash, **kwargs File ""/home/kabouik/.local/lib/python3.7/site-packages/datasette/views/table.py"", line 782, in data suggested_facets.extend(await facet.suggest()) File ""/home/kabouik/.local/lib/python3.7/site-packages/datasette/facets.py"", line 168, in suggest and any(r[""n""] > 1 for r in distinct_values) File ""/home/kabouik/.local/lib/python3.7/site-packages/datasette/facets.py"", line 168, in and any(r[""n""] > 1 for r in distinct_values) TypeError: '>' not supported between instances of 'str' and 'int' INFO: 127.0.0.1:56402 - ""GET /databasetest/databasetest HTTP/1.1"" 500 Internal Server Error INFO: 127.0.0.1:56402 - ""GET /-/static/app.css?4e362c HTTP/1.1"" 200 OK INFO: 127.0.0.1:56404 - ""GET / HTTP/1.1"" 200 OK INFO: 127.0.0.1:56404 - ""GET /-/static/app.css?4e362c HTTP/1.1"" 200 OK INFO: 127.0.0.1:56406 - ""GET /-/static-plugins/datasette_vega/main.2acbb312.css HTTP/1.1"" 200 OK INFO: 127.0.0.1:56408 - ""GET /-/static-plugins/datasette_vega/main.08f5d3d8.js HTTP/1.1"" 200 OK INFO: 127.0.0.1:56408 - ""GET /databasetest HTTP/1.1"" 200 OK INFO: 127.0.0.1:56408 - ""GET /-/static/app.css?4e362c HTTP/1.1"" 200 OK INFO: 127.0.0.1:56404 - ""GET /-/static-plugins/datasette_vega/main.2acbb312.css HTTP/1.1"" 200 OK INFO: 127.0.0.1:56406 - ""GET /-/static/codemirror-5.57.0.min.css HTTP/1.1"" 200 OK INFO: 127.0.0.1:56410 - ""GET /-/static-plugins/datasette_vega/main.08f5d3d8.js HTTP/1.1"" 200 OK INFO: 127.0.0.1:56414 - ""GET /-/static/codemirror-5.57.0-sql.min.js HTTP/1.1"" 200 OK INFO: 127.0.0.1:56412 - ""GET /-/static/codemirror-5.57.0.min.js HTTP/1.1"" 200 OK INFO: 127.0.0.1:56408 - ""GET /-/static/sql-formatter-2.3.3.min.js HTTP/1.1"" 200 OK INFO: 127.0.0.1:56408 - ""GET /databasetest?sql=select+*+from+databasetest HTTP/1.1"" 200 OK INFO: 127.0.0.1:56410 - ""GET /-/static/app.css?4e362c HTTP/1.1"" 200 OK INFO: 127.0.0.1:56408 - ""GET /-/static-plugins/datasette_vega/main.2acbb312.css HTTP/1.1"" 200 OK INFO: 127.0.0.1:56412 - ""GET /-/static/codemirror-5.57.0.min.css HTTP/1.1"" 200 OK INFO: 127.0.0.1:56404 - ""GET /-/static/sql-formatter-2.3.3.min.js HTTP/1.1"" 200 OK INFO: 127.0.0.1:56406 - ""GET /-/static/codemirror-5.57.0.min.js HTTP/1.1"" 200 OK INFO: 127.0.0.1:56414 - ""GET /-/static-plugins/datasette_vega/main.08f5d3d8.js HTTP/1.1"" 200 OK INFO: 127.0.0.1:56408 - ""GET /-/static/codemirror-5.57.0-sql.min.js HTTP/1.1"" 200 OK INFO: 127.0.0.1:56410 
- ""GET /databasetest.json?sql=select+*+from+databasetest&_shape=array&_shape=array HTTP/1.1"" 200 OK ^CINFO: Shutting down INFO: Waiting for application shutdown. INFO: Application shutdown complete. INFO: Finished server process [1408482] ``` Note that there is no error if I go to http://127.0.0.1:8001/databasetest and then click on `Run SQL`.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1228/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1082765654,I_kwDOBm6k_c5AibFW,1561,"add hash id to ""_memory"" url if hashed url mode is turned on and crossdb is also turned on",536941,fgregg,closed,0,,,,,3,2021-12-17T00:45:12Z,2022-03-19T04:45:40Z,2022-03-19T04:45:40Z,CONTRIBUTOR,,"If hashed_url mode is turned on and crossdb is also turned on, then queries to _memory should have a hash_id. One way that it could work is to have the _memory hash be a hash of all the individual databases. Otherwise, crossdb queries can get quit out of data if using aggressive caching. ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1561/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1161584460,I_kwDOBm6k_c5FPF9M,1651,Get rid of the no-longer necessary ?_format=json hack for tables called x.json,9599,simonw,closed,0,,,3268330,Datasette 1.0,8,2022-03-07T15:40:42Z,2022-03-19T04:04:50Z,2022-03-15T18:25:42Z,OWNER,,"Tidy up from: - #1439",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1651/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1169840669,I_kwDOBm6k_c5Fulod,1658,Revert main to version that passes tests,9599,simonw,closed,0,,,3268330,Datasette 1.0,1,2022-03-15T15:37:02Z,2022-03-19T04:04:50Z,2022-03-15T15:42:58Z,OWNER,,"> I've made a real mess of this. I'm going to revert Datasette`main` back to the last commit that passed the tests and try this again in a branch. 
_Originally posted by @simonw in https://github.com/simonw/datasette/issues/1657#issuecomment-1068125636_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1658/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1170554975,I_kwDOBm6k_c5FxUBf,1663,Document the internals that were used in datasette-hashed-urls,9599,simonw,closed,0,,,3268330,Datasette 1.0,2,2022-03-16T05:17:08Z,2022-03-19T04:04:50Z,2022-03-17T21:32:38Z,OWNER,,"The https://github.com/simonw/datasette-hashed-urls plugin used a couple of currently undocumented features: - `db.hash` - `Datasette(..., immutables=[...])`",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1663/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1108235694,I_kwDOBm6k_c5CDlWu,1603,A proper favicon,9599,simonw,closed,0,,,3268330,Datasette 1.0,19,2022-01-19T15:24:55Z,2022-03-19T04:04:49Z,2022-01-20T06:07:31Z,OWNER,,"Tips here: https://adamj.eu/tech/2022/01/18/how-to-add-a-favicon-to-your-django-site/ - I think a PNG served at `/favicon.ico` is the best option, since Safari doesn't support SVG yet. Relevant code: https://github.com/simonw/datasette/blob/cb29119db9115b1f40de2fb45263ed77e3bfbb3e/datasette/app.py#L182-L183 I can reuse the icon for https://datasette.io/desktop",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1603/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1114147905,I_kwDOBm6k_c5CaIxB,1612,Move canned queries closer to the SQL input area,639012,jsfenfen,closed,0,,,3268330,Datasette 1.0,5,2022-01-25T17:06:39Z,2022-03-19T04:04:49Z,2022-01-25T18:34:21Z,CONTRIBUTOR,,"*Original title: Consider placing example queries above the sql input?* Hi! Have been enjoying deploying ad hoc datasettes for collaborators to pick over! I keep finding myself manually ""fixing"" the database.html template so that the ""example queries"" (canned queries) appear directly *over* the sql box? So they are sorta more a suggestion for collaborators who aren't inclined to write their own queries? My sense is any time I go to the trouble of writing canned queries my users should see 'em? (( I have also considered a client-side reactive-ish option where selecting a query just places the raw SQL in the box and doesn't execute it, but this seems to end up being an inconvenience, rather than a teaching tool. 
)) ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1612/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1122413719,I_kwDOBm6k_c5C5qyX,1621,Test against Python 3.11 dev version,9599,simonw,closed,0,,,3268330,Datasette 1.0,0,2022-02-02T21:38:57Z,2022-03-19T04:04:49Z,2022-02-02T21:58:54Z,OWNER,,"To avoid another surprise like we got with 3.10: https://simonwillison.net/2021/Oct/9/finding-and-reporting-a-bug/ From a quick GitHub code search it looks like `3.11-dev` should work: https://cs.github.com/urllib3/urllib3/blob/7bec77e81aa0a194c98381053225813f5347c9d2/.github/workflows/ci.yml#L60",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1621/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1122416919,I_kwDOBm6k_c5C5rkX,1623,/-/patterns returns link: alternate JSON header to 404,9599,simonw,closed,0,,,3268330,Datasette 1.0,2,2022-02-02T21:42:49Z,2022-03-19T04:04:49Z,2022-02-02T21:48:56Z,OWNER,,"Bug from: - #1620 ``` % curl -s -I 'https://latest.datasette.io/-/patterns' | grep link link: https://latest.datasette.io/-/patterns.json; rel=""alternate""; type=""application/json+datasette"" ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1623/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1126604194,I_kwDOBm6k_c5DJp2i,1632,"datasette one.db one.db opens database twice, as one and one_2",9599,simonw,closed,0,,,3268330,Datasette 1.0,6,2022-02-07T23:14:47Z,2022-03-19T04:04:49Z,2022-02-07T23:50:01Z,OWNER,,"> ``` > % mkdir /tmp/data > % cp ~/Dropbox/Development/datasette/fixtures.db /tmp/data > % datasette /tmp/data/*.db /tmp/data/created.db --create -p 8852 > ... > INFO: Uvicorn running on http://127.0.0.1:8852 (Press CTRL+C to quit) > ^CINFO: Shutting down > % datasette /tmp/data/*.db /tmp/data/created.db --create -p 8852 > ... > INFO: 127.0.0.1:49533 - ""GET / HTTP/1.1"" 200 OK > ``` > The first time I ran Datasette I got two databases - `fixtures` and `created` > > BUT... when I ran Datasette the second time it looked like this: > > > > This is the same result you get if you run: > > datasette /tmp/data/fixtures.db /tmp/data/created.db /tmp/data/created.db > > This is caused by this Datasette issue: > - https://github.com/simonw/datasette/issues/509 > > So... 
either I teach Datasette to de-duplicate multiple identical file paths passed to the command, or I can't use `/data/*.db` in the `Dockerfile` here and I need to go back to other solutions for the challenge described in this comment: https://github.com/simonw/datasette-publish-fly/pull/12#issuecomment-1031971831 _Originally posted by @simonw in https://github.com/simonw/datasette-publish-fly/pull/12#issuecomment-1032029874_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1632/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1170497629,I_kwDOBm6k_c5FxGBd,1662,[feature request] Publish to fully static website,32609395,contrun,closed,0,,,,,1,2022-03-16T03:32:28Z,2022-03-19T00:42:23Z,2022-03-19T00:42:23Z,NONE,,"It seems that currently every `datasette publish` target requires a real backend server which is able to query the database and send results back to the frontend. There are a few projects that download a portion of the data on demand from a SQLite database URL and present it directly to the user. These methods leverage WebAssembly under the hood. I think Datasette is a perfect use case for this technology. Below are a few examples of querying a SQLite database directly from the frontend. * [Using sqlite3 as a notekeeping document graph with automatic reference indexing](https://epilys.github.io/bibliothecula/notekeeping.html) * [Hosting SQLite databases on Github Pages - (or any static file hoster) - phiresky's blog](https://phiresky.github.io/blog/2021/hosting-sqlite-databases-on-github-pages/) * [Static torrent website with peer-to-peer queries over BitTorrent on 2M records](https://boredcaveman.xyz/post/0x2_static-torrent-website-p2p-queries.html)",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1662/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1170355774,I_kwDOBm6k_c5FwjY-,1661,Remove Hashed URL mode,9599,simonw,closed,0,,,3268330,Datasette 1.0,10,2022-03-15T23:13:56Z,2022-03-19T00:37:37Z,2022-03-19T00:37:36Z,OWNER,,"It's now handled by a plugin instead: - #647 - https://github.com/simonw/datasette-hashed-urls/issues/3 https://github.com/simonw/datasette-hashed-urls Sub-tasks: - [x] Remove hashed URL mode implementation - [x] Update documentation - [x] Ensure `--setting hash_urls 1` shows a useful message",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1661/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1065429936,I_kwDOBm6k_c4_gSuw,1532,Use datasette-table Web Component to guide the design of the JSON API for 1.0,9599,simonw,open,0,,,3268330,Datasette 1.0,4,2021-11-28T20:37:18Z,2022-03-16T20:13:34Z,,OWNER,,"I realized that one of the reasons I'm having trouble committing to nailing down the JSON API for 1.0 is that I don't use it much myself - I use the `?_shape=array` one quite often, but I don't have any projects that are using the default, more fully-featured API. 
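(For anyone following along, a minimal sketch of the difference between the two shapes, assuming the public fixtures instance at latest.datasette.io and the current pre-1.0 default keys - exactly the part of the design this issue may end up changing:)

```python
import json
from urllib.request import urlopen

url = 'https://latest.datasette.io/fixtures/facetable.json'

# Default shape: an object with 'columns' and 'rows' keys plus paging metadata
data = json.loads(urlopen(url).read())
print(data['columns'])

# ?_shape=array: a bare list of row dictionaries instead
rows = json.loads(urlopen(url + '?_shape=array').read())
print(rows[0])
```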
As an experiment I built a Web Component for embedding Datasette tables on pages - https://github.com/simonw/datasette-table - and I think it's actually going to be a really useful tool for helping me dog food the v1.0 API design.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1532/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 973139047,MDU6SXNzdWU5NzMxMzkwNDc=,1439,Rethink how .ext formats (v.s. ?_format=) works before 1.0,9599,simonw,closed,0,,,3268330,Datasette 1.0,48,2021-08-17T23:32:51Z,2022-03-15T20:51:26Z,2022-03-15T20:51:26Z,OWNER,,"Datasette currently has surprising special behaviour for if a table name ends in `.csv` - which can happen when a tool like `csvs-to-sqlite` creates tables that match the filename that they were imported from. https://latest.datasette.io/fixtures/table%2Fwith%2Fslashes.csv illustrates this behaviour: it links to `.csv` and `.json` that look like this: - https://latest.datasette.io/fixtures/table%2Fwith%2Fslashes.csv?_format=json - https://latest.datasette.io/fixtures/table%2Fwith%2Fslashes.csv?_format=csv&_size=max Where normally Datasette would add the `.csv` or `.json` extension to the path component of the URL (as seen on other pages such as https://latest.datasette.io/fixtures/facet_cities) here the [path_with_format() function](https://github.com/simonw/datasette/blob/adb5b70de5cec3c3dd37184defe606a082c232cf/datasette/utils/__init__.py#L710) notices that there is already a `.` in the path and instead adds `?_format=csv` to the query string instead. The problem with this mechanism is that it's pretty surprising. Anyone writing external code to Datasette who wants to get back the `.csv` or `.json` version giving the URL to a table page will need to know about and implement this behaviour themselves. That's likely to cause all kinds of bugs in the future.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1439/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 531502365,MDU6SXNzdWU1MzE1MDIzNjU=,646,Make database level information from metadata.json available in the index.html template,18017473,lagolucas,open,0,,,3268330,Datasette 1.0,3,2019-12-02T19:55:10Z,2022-03-15T20:50:34Z,,NONE,,"Did a search on the issues here and didn't find anything related to what I want. I want to have information that is on the database level of the JSON like title, source and source_url, and use it on the index page. I tried some small tweaks on the python and html files, but failed to get that result. Is there a way? 
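To make the question concrete, here is a sketch of the kind of database-level keys I mean, via the Python API (the `mydata.db` filename and the values are made up):

```python
from datasette.app import Datasette

# Database-level metadata - title, source, source_url - keyed by database name
ds = Datasette(
    ['mydata.db'],
    metadata={
        'databases': {
            'mydata': {
                'title': 'My Data',
                'source': 'Example Agency',
                'source_url': 'https://example.com/data',
            }
        }
    },
)
```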
Thanks!",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/646/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 626593402,MDU6SXNzdWU2MjY1OTM0MDI=,780,Internals documentation for datasette.metadata() method,9599,simonw,open,0,,,3268330,Datasette 1.0,2,2020-05-28T15:14:22Z,2022-03-15T20:50:34Z,,OWNER,,https://github.com/simonw/datasette/blob/40885ef24e32d91502b6b8bbad1c7376f50f2830/datasette/app.py#L297-L328,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/780/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1054243511,I_kwDOBm6k_c4-1nq3,1509,Datasette 1.0 JSON API (and documentation),9599,simonw,open,0,,,3268330,Datasette 1.0,3,2021-11-15T23:22:45Z,2022-03-15T20:38:56Z,,OWNER,,"The new JSON API in a stable, documented form.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1509/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 646737558,MDU6SXNzdWU2NDY3Mzc1NTg=,870,Refactor default views to use register_routes,9599,simonw,open,0,,,,,10,2020-06-27T18:53:12Z,2022-03-15T20:07:18Z,,OWNER,,"It would be much cleaner if Datasette's default views were all registered using the new `register_routes()` plugin hook. Could dramatically reduce the code in `datasette/app.py`. > The ideal fix here would be to rework my `BaseView` subclass mechanism to work with `register_routes()` so that those views don't have any special privileges above plugin-provided views. _Originally posted by @simonw in https://github.com/simonw/datasette/issues/864#issuecomment-648580556_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/870/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1058072543,I_kwDOBm6k_c4_EOff,1518,Complete refactor of TableView and table.html template,9599,simonw,open,0,,,3268330,Datasette 1.0,45,2021-11-19T02:55:16Z,2022-03-15T18:35:49Z,,OWNER,,"Split from #878. The current `TableView` class is by far the most complex part of Datasette, and the most difficult to work on: https://github.com/simonw/datasette/blob/0.59.2/datasette/views/table.py In #878 I started exploring a new pattern for building views. In doing so it became clear that `TableView` is the first beast that I need to slay - if I can refactor that into something neat the pattern for building other views will emerge as a natural consequence. I've been trying to build this as a `register_routes()` plugin, as originally suggested in #870 - though unfortunately it looks like those plugins can't replace existing Datasette default views at the moment, see #1517. [UPDATE: I was wrong about this, plugins can over-ride default views just fine] I also know that I want to have a fully documented template context for `table.html` as a major step on the way to Datasette 1.0, see #1510. 
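(For illustration, the rough shape of a `register_routes()` based view - only a sketch, the real `TableView` does vastly more:)

```python
from datasette import hookimpl
from datasette.utils.asgi import Response

@hookimpl
def register_routes():
    async def table_view(request, datasette):
        # A real implementation would resolve the database and table,
        # run the query and render table.html with a documented context
        return Response.text('table page placeholder')

    # Each route is a (regex, view function) pair
    return [(r'^/(?P<database>[^/]+)/(?P<table>[^/]+?)$', table_view)]
```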
All of this adds up to the `TableView` refactor being a major project that will unblock a whole flurry of other things - so I'm going to work on that in this separate issue.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1518/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1168995756,I_kwDOBm6k_c5FrXWs,1657,Tilde encoding: use ~ instead of - for dash-encoding,9599,simonw,closed,0,,,3268330,Datasette 1.0,12,2022-03-14T22:55:17Z,2022-03-15T18:25:11Z,2022-03-15T18:01:58Z,OWNER,,Refs #1439,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1657/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 675753042,MDU6SXNzdWU2NzU3NTMwNDI=,131,sqlite-utils insert: options for column types,9599,simonw,open,0,,,,,5,2020-08-09T18:59:11Z,2022-03-15T13:21:42Z,,OWNER,,"The `insert` command currently results in string types for every column - at least when used against CSV or TSV inputs. It would be useful if you could do the following: - automatically detect the column types based on eg the first 1000 records - explicitly state the rule for specific columns `--detect-types` could work for the former - or it could do that by default and allow opt-out using `--no-detect-types` For specific columns maybe this: sqlite-utils insert db.db images images.tsv \ --tsv \ -c id int \ -c score float",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/131/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 930807135,MDU6SXNzdWU5MzA4MDcxMzU=,1384,Plugin hook for dynamic metadata,9599,simonw,open,0,,,,,22,2021-06-26T22:36:03Z,2022-03-14T00:36:42Z,,OWNER,,"@brandonrobertz contributed an implementation of this in PR #1368, which I just merged. Opening this ticket to track further work on this before it goes out in a Datasette release (likely preceded by an alpha).",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1384/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1145882578,I_kwDOCGYnMM5ETMfS,408,`deterministic=True` fails on versions of SQLite prior to 3.8.3,24938923,learning4life,closed,0,,,,,6,2022-02-21T14:36:43Z,2022-03-13T16:54:09Z,2022-03-02T00:38:11Z,NONE,,"Hi, love your work. 
I am unable to lookup indexes in a database using sqlite-utils: ` sqlite-utils indexes city_spec.db --table` or `sqlite-utils indexes city_spec.db MyTable ` **Software** sqlite-utils, version 3.24 sqlite3 --version: 3.36.0 **Output:** Traceback (most recent call last): File ""/opt/app-root/bin/sqlite-utils"", line 8, in sys.exit(cli()) File ""/opt/app-root/lib64/python3.8/site-packages/click/core.py"", line 1128, in __call__ return self.main(*args, **kwargs) File ""/opt/app-root/lib64/python3.8/site-packages/click/core.py"", line 1053, in main rv = self.invoke(ctx) File ""/opt/app-root/lib64/python3.8/site-packages/click/core.py"", line 1659, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/opt/app-root/lib64/python3.8/site-packages/click/core.py"", line 1395, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/opt/app-root/lib64/python3.8/site-packages/click/core.py"", line 754, in invoke return __callback(*args, **kwargs) File ""/opt/app-root/lib64/python3.8/site-packages/click/decorators.py"", line 26, in new_func return f(get_current_context(), *args, **kwargs) File ""/opt/app-root/lib64/python3.8/site-packages/sqlite_utils/cli.py"", line 2123, in indexes ctx.invoke( File ""/opt/app-root/lib64/python3.8/site-packages/click/core.py"", line 754, in invoke return __callback(*args, **kwargs) File ""/opt/app-root/lib64/python3.8/site-packages/sqlite_utils/cli.py"", line 1624, in query db.register_fts4_bm25() File ""/opt/app-root/lib64/python3.8/site-packages/sqlite_utils/db.py"", line 403, in register_fts4_bm25 self.register_function(rank_bm25, deterministic=True) File ""/opt/app-root/lib64/python3.8/site-packages/sqlite_utils/db.py"", line 399, in register_function register(fn) File ""/opt/app-root/lib64/python3.8/site-packages/sqlite_utils/db.py"", line 392, in register self.conn.create_function(name, arity, fn, **kwargs) sqlite3.NotSupportedError: deterministic=True requires SQLite 3.8.3 or higher ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/408/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1088816961,I_kwDODEm0Qs5A5gdB,62,KeyError: 'created_at' for private accounts?,6764957,swyxio,closed,0,,,,,2,2021-12-26T17:51:51Z,2022-03-12T02:36:32Z,2022-02-24T18:10:18Z,NONE,,"hey Simon! i was running `twitter-to-sqlite user-timeline twitter.db` for [my private alt](https://twitter.com/swyxio) and ran into this error:
![image](https://user-images.githubusercontent.com/6764957/147416165-46b69c30-100a-406f-8534-8612b75547ae.png) ```bash Traceback (most recent call last): File ""/Users/swyx/Work/datasette/env/bin/twitter-to-sqlite"", line 8, in sys.exit(cli()) File ""/Users/swyx/Work/datasette/env/lib/python3.9/site-packages/click/core.py"", line 1128, in __call__ return self.main(*args, **kwargs) File ""/Users/swyx/Work/datasette/env/lib/python3.9/site-packages/click/core.py"", line 1053, in main rv = self.invoke(ctx) File ""/Users/swyx/Work/datasette/env/lib/python3.9/site-packages/click/core.py"", line 1659, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/Users/swyx/Work/datasette/env/lib/python3.9/site-packages/click/core.py"", line 1395, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/Users/swyx/Work/datasette/env/lib/python3.9/site-packages/click/core.py"", line 754, in invoke return __callback(*args, **kwargs) File ""/Users/swyx/Work/datasette/env/lib/python3.9/site-packages/twitter_to_sqlite/cli.py"", line 291, in user_timeline profile = utils.get_profile(db, session, **kwargs) File ""/Users/swyx/Work/datasette/env/lib/python3.9/site-packages/twitter_to_sqlite/utils.py"", line 133, in get_profile save_users(db, [profile]) File ""/Users/swyx/Work/datasette/env/lib/python3.9/site-packages/twitter_to_sqlite/utils.py"", line 453, in save_users transform_user(user) File ""/Users/swyx/Work/datasette/env/lib/python3.9/site-packages/twitter_to_sqlite/utils.py"", line 285, in transform_user user[""created_at""] = parser.parse(user[""created_at""]) KeyError: 'created_at' ```
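(speculating, but a guard along these lines in `transform_user` would at least avoid the hard crash - a sketch based purely on the frames above:)

```python
from dateutil import parser

def transform_user(user):
    # some profiles evidently come back without a 'created_at' key,
    # so only parse it when it is actually present
    if 'created_at' in user:
        user['created_at'] = parser.parse(user['created_at'])
```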
this looks awfully like #37 but it can't be, because i'm authed into my account and obviously i have perms to read my own account. wonder if there's any diagnostic methods i should apply here? just filing an issue for others to find while i investigate.",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/62/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1160034488,I_kwDOCGYnMM5FJLi4,411,Support for generated columns,25778,eyeseast,open,0,,,,,8,2022-03-04T20:41:33Z,2022-03-11T22:32:43Z,,CONTRIBUTOR,,"This is a fairly new feature -- SQLite version 3.31.0 (2020-01-22) -- that I, admittedly, haven't gotten to work yet. But it looks _incredibly_ useful: https://dgl.cx/2020/06/sqlite-json-support I'm not sure if this is an option on `add-column` or a separate command like `add-generated-column`. Either way, it needs an argument to populate it. It could be something like this: ```sh sqlite-utils add-column data.db table-name generated --as 'json_extract(data, ""$.field"")' --virtual ``` More here: https://www.sqlite.org/gencol.html",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/411/reactions"", ""total_count"": 2, ""+1"": 2, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1166731361,I_kwDOCGYnMM5Fiuhh,414,I forgot to include the changelog in the 3.25.1 release,9599,simonw,closed,0,,,,,7,2022-03-11T18:32:36Z,2022-03-11T18:40:39Z,2022-03-11T18:40:39Z,OWNER,,"I pushed a release for https://github.com/simonw/sqlite-utils/releases/tag/3.25.1 but forgot to include the release notes in `docs/changelog.rst` This means https://sqlite-utils.datasette.io/en/stable/changelog.html isn't showing them.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/414/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1166587040,I_kwDOCGYnMM5FiLSg,413,Display autodoc type information more legibly,9599,simonw,closed,0,,,,,5,2022-03-11T15:58:20Z,2022-03-11T18:07:10Z,2022-03-11T18:07:10Z,OWNER,,"https://sqlite-utils.datasette.io/en/3.25/reference.html#sqlite_utils.db.Table.insert looks like this at the moment: ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/413/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1131295060,I_kwDOBm6k_c5DbjFU,1634,Update Dockerfile generated by `datasette publish`,9599,simonw,open,0,,,3268330,Datasette 1.0,4,2022-02-11T00:07:26Z,2022-03-11T17:38:08Z,,OWNER,,"The generated `Dockerfile` currently looks something like this: ```Dockerfile FROM python:3.8 COPY . 
/app WORKDIR /app ENV DATASETTE_SECRET 'edab49cbc5d5f6f33238f54852037e3fee710821960b73edd2ce743454182ae2' RUN pip install -U datasette datasette-auth-passwords datasette-tiddlywiki datasette-graphql RUN datasette inspect fixtures.db other.db --inspect-file inspect-data.json ENV PORT 8080 EXPOSE 8080 CMD datasette serve --host 0.0.0.0 -i fixtures.db -i other.db --cors --inspect-file inspect-data.json --metadata metadata.json --create --port $PORT /data/*.db ``` This is still on Python 3.8, and it generates a pretty large image compared to the `Dockerfile` used for https://hub.docker.com/datasetteproject/datasette - https://github.com/simonw/datasette/blob/0.60.2/Dockerfile Here's the code that generates it: https://github.com/simonw/datasette/blob/7d24fd405f3c60e4c852c5d746c91aa2ba23cf5b/datasette/utils/__init__.py#L389-L400",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1634/reactions"", ""total_count"": 2, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 2, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 678760988,MDU6SXNzdWU2Nzg3NjA5ODg=,932,End-user documentation,9599,simonw,open,0,,,3268330,Datasette 1.0,6,2020-08-13T22:04:39Z,2022-03-08T15:20:48Z,,OWNER,,"Datasette's documentation is aimed at people who install and configure it. What about end users of preconfigured and deployed Datasette instances? Something that can be linked to from the Datasette UI would be really useful.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/932/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1154399841,I_kwDOBm6k_c5Ezr5h,1645,"Sensible `cache-control` headers for static assets, including those served by plugins",697092,curiousleo,open,0,,,3268330,Datasette 1.0,4,2022-02-28T18:12:03Z,2022-03-08T02:59:29Z,,NONE,,"## What I'm seeing With `default_cache_ttl = 86400`, I see the following: A table view returns `Cache-control: max-age=86400`: ![Screenshot_20220228_190000](https://user-images.githubusercontent.com/697092/156034352-4d64683e-39c8-49af-81df-0217a5957bbd.png) A static asset returns no `Cache-control` header: ![Screenshot_20220228_185933](https://user-images.githubusercontent.com/697092/156034363-d0b03cc2-5889-4ed2-b601-8c1846b8469a.png) ## What I expected to see I expected the static asset to return a `Cache-control` header indicating that this response can be cached. ## Why this matters I'm productionising a Datasette deployment right now and was looking into putting it behind a Varnish instance. I was surprised to see requests for static assets being served from Datasette rather than Varnish, this is what led me to look more closely at the response headers. While Datasette serves those static assets pretty quickly, I don't see why Datasette should serve them. By their nature, static assets like images and JS files are very cacheable, so it should be easy to serve them from a cache like Varnish. (Note that Varnish can easily be configured to override this header, enabling caching for static assets. But it would be better if this override was not necessary.) ## Discussion It seems clear to me that serving static assets without a `Cache-control` header is not ideal. I see two options here: A. Static assets use the same logic as table / SQL views to set the `Cache-control` header based on `default_cache_ttl`. B. 
An additional setting for static assets is introduced (`default_static_cache_ttl`, say).",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1645/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1161969891,I_kwDOBm6k_c5FQkDj,1654,Adopt a code of conduct,9599,simonw,closed,0,,,,,5,2022-03-07T22:00:24Z,2022-03-07T22:19:35Z,2022-03-07T22:19:35Z,OWNER,,"This is long overdue, especially given the size of the project now.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1654/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1161937073,I_kwDOBm6k_c5FQcCx,1653,Mechanism to default a table to sorting by multiple columns,9599,simonw,open,0,,,,,2,2022-03-07T21:20:11Z,2022-03-07T21:23:39Z,,OWNER,,"### Discussed in https://github.com/simonw/datasette/discussions/1652
Originally posted by **zaneselvans** March 7, 2022 It's easy to tell datasette to sort tables using a single column, as [described in the docs](https://docs.datasette.io/en/stable/metadata.html#setting-a-default-sort-order): ```yaml databases: ferc1: tables: f1_edcfu_epda: sort: created_time ``` But is there some way to tell it to sort using a composite key, like you would in an `ORDER BY` clause instead? For example, the way it's being done **[in this query](https://data.catalyst.coop/ferc1?sql=select%0D%0A++rowid%2C%0D%0A++respondent_id%2C%0D%0A++report_year%2C%0D%0A++spplmnt_num%2C%0D%0A++row_number%2C%0D%0A++row_seq%2C%0D%0A++row_prvlg%2C%0D%0A++acct_num%2C%0D%0A++depr_plnt_base%2C%0D%0A++est_avg_srvce_lf%2C%0D%0A++net_salvage%2C%0D%0A++apply_depr_rate%2C%0D%0A++mrtlty_crv_typ%2C%0D%0A++avg_remaining_lf%2C%0D%0A++report_prd%0D%0Afrom%0D%0A++f1_edcfu_epda%0D%0Awhere%0D%0A++respondent_id+%3D+210%0D%0A++AND+report_year+%3D+2020%0D%0Aorder+by%0D%0A++report_year%2C+report_prd%2C+respondent_id%2C+spplmnt_num%2C+row_number%0D%0Alimit%0D%0A++1000)** on our Datasette? ```sql SELECT respondent_id, report_year, spplmnt_num, row_number, row_seq, row_prvlg, acct_num, depr_plnt_base, est_avg_srvce_lf, net_salvage, apply_depr_rate, mrtlty_crv_typ, avg_remaining_lf, report_prd FROM f1_edcfu_epda WHERE respondent_id = 210 AND report_year = 2020 ORDER BY report_year, report_prd, respondent_id, spplmnt_num, row_number LIMIT 1000 ``` The problem here is that by default it's using `rowid` (the SQLite assigned autoincrementing integer key) to order the records, but the table **should** have a natural composite primary key, but the original database that this data is being migrated from doesn't enforce unique primary keys, so there are dupes, and we don't want to drop those rows, and the records are somehow getting jumbled in the database (the `rowid` ordering isn't lined up with the expected ordering based on the composite primary key, though it's close) and this jumbling is confusing to users that expect to see the data ordered based on the natural primary key. I've tried setting the `sort` metadata parameter to a list of column names, a tuple of column names, a quoted string of comma-separated column names, a quoted string of a tuple of column names... ```yaml databases: ferc1: tables: f1_edcfu_epda: sort: ""(report_year, report_prd, respondent_id, spplmnt_num, row_number)"" ``` and they all give me server errors like: ``` Cannot sort table by (report_year, report_prd, respondent_id, spplmnt_num, row_number) ```
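One possible workaround until this is supported, sketched with `sqlite-utils` (the `ferc1.db` filename is illustrative, and SQLite does not strictly guarantee that a view's internal ORDER BY survives further querying):

```python
import sqlite_utils

db = sqlite_utils.Database('ferc1.db')
# Bake the composite ordering into a view and browse that instead
db.create_view(
    'f1_edcfu_epda_sorted',
    'select * from f1_edcfu_epda '
    'order by report_year, report_prd, respondent_id, spplmnt_num, row_number',
)
```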
",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1653/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1160750713,I_kwDOBm6k_c5FL6Z5,1650,Implement redirects from old % encoding to new dash encoding,9599,simonw,closed,0,,,3268330,Datasette 1.0,5,2022-03-06T23:40:02Z,2022-03-07T19:26:15Z,2022-03-07T19:26:14Z,OWNER,,"> One big advantage to this scheme is that redirecting old links to `%2F` pages (e.g. https://fivethirtyeight.datasettes.com/fivethirtyeight/twitter-ratio%2Fsenators) is easy - if you see a `%` in the `raw_path`, redirect to that page with the `%` replaced by `-`. _Originally posted by @simonw in https://github.com/simonw/datasette/issues/1439#issuecomment-1060044007_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1650/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1160407071,I_kwDOBm6k_c5FKmgf,1647,Test failures with SQLite 3.37.0+ due to column affinity case,9599,simonw,closed,0,,,,,5,2022-03-05T17:37:46Z,2022-03-05T19:56:28Z,2022-03-05T19:47:04Z,OWNER,,"These three tests are failing on my local machine: ``` FAILED tests/test_internals_database.py::test_table_column_details[facetable-expected0] - AssertionError: assert [Column(cid=0, name='pk', type='INTEGER', no... FAILED tests/test_internals_database.py::test_table_column_details[sortable-expected1] - AssertionError: assert [Column(cid=0, name='pk1', type='varchar(30)'... FAILED tests/test_table_html.py::test_sort_links - AssertionError: assert [{'a_href': None,\n 'attrs': {'class': ['col-Link'],\n 'data-column': '... ``` I ran `pytest --lf -vv` and the output had things like this in it: ``` E - Column(cid=1, name='created', type='text', notnull=0, default_value=None, is_pk=0, hidden=0), E ? ^^^^ E + Column(cid=1, name='created', type='TEXT', notnull=0, default_value=None, is_pk=0, hidden=0), ... E {'a_href': '/fixtures/sortable?_sort=sortable_with_nulls_2', E 'attrs': {'class': ['col-sortable_with_nulls_2'], E 'data-column': 'sortable_with_nulls_2', E 'data-column-not-null': '0', E - 'data-column-type': 'real', E ? ^^^^ E + 'data-column-type': 'REAL', E ? ^^^^ ``` Something is causing column types to come back in uppercase where previously they were lowercase.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1647/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1148725876,I_kwDOBm6k_c5EeCp0,1640,"Support static assets where file length may change, e.g. logs",57859326,broccolihighkicks,open,0,,,,,2,2022-02-24T00:34:42Z,2022-03-05T01:19:25Z,,NONE,,"This is a bit of an oxymoron. I am serving a log.txt file for a background process using the Datasette --static CLI. This is useful as I can observe a background process from the web UI to see any errors that occur (instead of spelunking the logs via docker exec/ssh etc). I get this error, which I think is because Datasette assumes that the size of the content does not change (but appending new log lines means the content length changes). 
```python Traceback (most recent call last): File ""/usr/local/lib/python3.9/site-packages/datasette/app.py"", line 1181, in route_path response = await view(request, send) File ""/usr/local/lib/python3.9/site-packages/datasette/utils/asgi.py"", line 305, in inner_static await asgi_send_file(send, full_path, chunk_size=chunk_size) File ""/usr/local/lib/python3.9/site-packages/datasette/utils/asgi.py"", line 280, in asgi_send_file await send( File ""/usr/local/lib/python3.9/site-packages/asgi_csrf.py"", line 104, in wrapped_send await send(event) File ""/usr/local/lib/python3.9/site-packages/uvicorn/protocols/http/h11_impl.py"", line 460, in send output = self.conn.send(event) File ""/usr/local/lib/python3.9/site-packages/h11/_connection.py"", line 468, in send data_list = self.send_with_data_passthrough(event) File ""/usr/local/lib/python3.9/site-packages/h11/_connection.py"", line 501, in send_with_data_passthrough writer(event, data_list.append) File ""/usr/local/lib/python3.9/site-packages/h11/_writers.py"", line 58, in __call__ self.send_data(event.data, write) File ""/usr/local/lib/python3.9/site-packages/h11/_writers.py"", line 78, in send_data raise LocalProtocolError(""Too much data for declared Content-Length"") h11._util.LocalProtocolError: Too much data for declared Content-Length ERROR: Exception in ASGI application Traceback (most recent call last): File ""/usr/local/lib/python3.9/site-packages/datasette/app.py"", line 1181, in route_path response = await view(request, send) File ""/usr/local/lib/python3.9/site-packages/datasette/utils/asgi.py"", line 305, in inner_static await asgi_send_file(send, full_path, chunk_size=chunk_size) File ""/usr/local/lib/python3.9/site-packages/datasette/utils/asgi.py"", line 280, in asgi_send_file await send( File ""/usr/local/lib/python3.9/site-packages/asgi_csrf.py"", line 104, in wrapped_send await send(event) File ""/usr/local/lib/python3.9/site-packages/uvicorn/protocols/http/h11_impl.py"", line 460, in send output = self.conn.send(event) File ""/usr/local/lib/python3.9/site-packages/h11/_connection.py"", line 468, in send data_list = self.send_with_data_passthrough(event) File ""/usr/local/lib/python3.9/site-packages/h11/_connection.py"", line 501, in send_with_data_passthrough writer(event, data_list.append) File ""/usr/local/lib/python3.9/site-packages/h11/_writers.py"", line 58, in __call__ self.send_data(event.data, write) File ""/usr/local/lib/python3.9/site-packages/h11/_writers.py"", line 78, in send_data raise LocalProtocolError(""Too much data for declared Content-Length"") h11._util.LocalProtocolError: Too much data for declared Content-Length ``` Thanks, I am finding Datasette very useful.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1640/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1152072027,I_kwDOBm6k_c5Eqzlb,1642,Dependency issue with asgiref and uvicorn,9599,simonw,closed,0,,,,,1,2022-02-26T18:00:35Z,2022-03-05T01:11:27Z,2022-03-05T01:11:17Z,OWNER,,"``` ERROR: After October 2020 you may experience errors when installing or updating packages. This is because pip will change the way that it resolves dependency conflicts. We recommend you use --use-feature=2020-resolver to test your packages with the new resolver before it becomes the default. datasette 0.60.2 requires asgiref<3.5.0,>=3.2.10, but you'll have asgiref 3.5.0 which is incompatible. 
``` That's after I forced an upgrade of `uvicorn` due to this warning: ``` ERROR: After October 2020 you may experience errors when installing or updating packages. This is because pip will change the way that it resolves dependency conflicts. We recommend you use --use-feature=2020-resolver to test your packages with the new resolver before it becomes the default. uvicorn 0.13.1 requires click==7.*, but you'll have click 8.0.4 which is incompatible. ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1642/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1063388037,I_kwDOCGYnMM4_YgOF,343,Provide function to generate hash_id from specified columns,82988,psychemedia,closed,0,,,,,4,2021-11-25T10:12:12Z,2022-03-02T04:25:25Z,2022-03-02T04:25:25Z,NONE,,"Hi I note that you define `_hash()` to create a `hash_id` from non-id column values in a table [here](https://github.com/simonw/sqlite-utils/blob/8f386a0d300d1b1c76132bb75972b755049fb742/sqlite_utils/db.py#L2996). It would be useful to be able to call a complementary function to generate a corresponding `_id` from a subset of specified columns when adding items to another table, eg to support the creation of foreign keys. Or is there a better pattern for doing that?",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/343/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1149310456,I_kwDOBm6k_c5EgRX4,1641,Tweak mobile keyboard settings,9599,simonw,open,0,,,,,1,2022-02-24T13:47:10Z,2022-02-24T13:49:26Z,,OWNER,,"https://developer.apple.com/library/archive/documentation/StringsTextFonts/Conceptual/TextAndWebiPhoneOS/KeyboardManagement/KeyboardManagement.html#//apple_ref/doc/uid/TP40009542-CH5-SW12 `autocorrect=""off""` is worth experimenting with. Twitter: https://twitter.com/forestgregg/status/1496842959563726852",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1641/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1148638868,I_kwDOBm6k_c5EdtaU,1639,Make datasette-redirect-forbidden unneccessary,9599,simonw,open,0,,,,,0,2022-02-23T22:18:46Z,2022-02-23T22:18:46Z,,OWNER,,"I wrote `datasette-redirect-forbidden` today because I needed 403 errors to redirect to `/-/login` and it was the quickest way to solve that problem. This should be a feature of Datasette core. 
- https://github.com/simonw/datasette-redirect-forbidden/issues/2",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1639/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 677272618,MDU6SXNzdWU2NzcyNzI2MTg=,928,Test failures caused by failed attempts to mock pip,9599,simonw,closed,0,,,,,4,2020-08-11T23:53:18Z,2022-02-23T16:19:47Z,2020-08-12T00:07:49Z,OWNER,,"Errors like this one: https://github.com/simonw/datasette/pull/927/checks?check_run_id=973559696 ``` 2020-08-11T23:36:39.8801334Z =================================== FAILURES =================================== 2020-08-11T23:36:39.8802411Z _________________________________ test_install _________________________________ 2020-08-11T23:36:39.8803242Z 2020-08-11T23:36:39.8804935Z thing = 2020-08-11T23:36:39.8806663Z comp = 'main', import_path = 'pip._internal.cli.main' 2020-08-11T23:36:39.8807696Z 2020-08-11T23:36:39.8808728Z def _dot_lookup(thing, comp, import_path): 2020-08-11T23:36:39.8810573Z try: 2020-08-11T23:36:39.8812262Z > return getattr(thing, comp) 2020-08-11T23:36:39.8817136Z E AttributeError: module 'pip._internal.cli' has no attribute 'main' 2020-08-11T23:36:39.8843043Z 2020-08-11T23:36:39.8855951Z /opt/hostedtoolcache/Python/3.8.5/x64/lib/python3.8/unittest/mock.py:1215: AttributeError 2020-08-11T23:36:39.8873372Z 2020-08-11T23:36:39.8877803Z During handling of the above exception, another exception occurred: 2020-08-11T23:36:39.8906532Z 2020-08-11T23:36:39.8925767Z def get_src_prefix(): 2020-08-11T23:36:39.8928277Z # type: () -> str 2020-08-11T23:36:39.8930068Z if running_under_virtualenv(): 2020-08-11T23:36:39.8949721Z src_prefix = os.path.join(sys.prefix, 'src') 2020-08-11T23:36:39.8951813Z else: 2020-08-11T23:36:39.8969014Z # FIXME: keep src in cwd for now (it is not a temporary folder) 2020-08-11T23:36:39.9012110Z try: 2020-08-11T23:36:39.9013489Z > src_prefix = os.path.join(os.getcwd(), 'src') 2020-08-11T23:36:39.9014538Z E FileNotFoundError: [Errno 2] No such file or directory 2020-08-11T23:36:39.9016122Z 2020-08-11T23:36:39.9017617Z /opt/hostedtoolcache/Python/3.8.5/x64/lib/python3.8/site-packages/pip/_internal/locations.py:50: FileNotFoundError 2020-08-11T23:36:39.9018802Z 2020-08-11T23:36:39.9020070Z During handling of the above exception, another exception occurred: 2020-08-11T23:36:39.9020930Z 2020-08-11T23:36:39.9022275Z args = (), keywargs = {} 2020-08-11T23:36:39.9023183Z 2020-08-11T23:36:39.9024077Z @wraps(func) 2020-08-11T23:36:39.9024984Z def patched(*args, **keywargs): 2020-08-11T23:36:39.9028770Z > with self.decoration_helper(patched, 2020-08-11T23:36:39.9031861Z args, 2020-08-11T23:36:39.9038358Z keywargs) as (newargs, newkeywargs): 2020-08-11T23:36:39.9039654Z 2020-08-11T23:36:39.9040566Z /opt/hostedtoolcache/Python/3.8.5/x64/lib/python3.8/unittest/mock.py:1322: 2020-08-11T23:36:39.9041492Z _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/928/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1142107925,I_kwDOBm6k_c5EEy8V,1638,`filters_from_request` plugin hook docs should mention that returning an async function is 
allowed,9599,simonw,open,0,,,,,0,2022-02-18T00:08:26Z,2022-02-18T00:08:26Z,,OWNER,,"https://docs.datasette.io/en/stable/plugin_hooks.html#filters-from-request-request-database-table-datasette doesn't mention that you can return an `async` function - but you can, and in fact Datasette itself uses that here: https://github.com/simonw/datasette/blob/aa7f0037a46eb76ae6fe9bf2a1f616c58738ecdf/datasette/filters.py#L43-L47",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1638/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 335200136,MDU6SXNzdWUzMzUyMDAxMzY=,327,Explore if SquashFS can be used to shrink size of packaged Docker containers,9599,simonw,open,0,,,,,4,2018-06-24T18:15:16Z,2022-02-17T23:37:24Z,,OWNER,,"Inspired by this article: https://cldellow.com/2018/06/22/sqlite-parquet-vtable.html#sqlite-database-indexed--squashed https://en.wikipedia.org/wiki/SquashFS is ""a compressed read-only file system for Linux"" - which means it could be a really nice fit for Datasette and its read-only SQLite databases. It would be interesting to explore a Dockerfile recipe that used SquashFS to compress the SQLite database file that was bundled up by `datasette package` and friends.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/327/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1125297737,I_kwDOCGYnMM5DEq5J,402,Advanced class-based `conversions=` mechanism,9599,simonw,open,0,,,,,14,2022-02-06T19:47:41Z,2022-02-16T10:18:55Z,,OWNER,,"The `conversions=` parameter works like this at the moment: https://sqlite-utils.datasette.io/en/3.23/python-api.html#converting-column-values-using-sql-functions ```python db[""places""].insert( {""name"": ""Wales"", ""geometry"": wkt}, conversions={""geometry"": ""GeomFromText(?, 4326)""}, ) ``` This proposal is to support values in that dictionary that are objects, not strings, which can represent more complex conversions - spun out from #399. New proposed mechanism: ```python from sqlite_utils.utils import LongitudeLatitude db[""places""].insert( { ""name"": ""London"", ""point"": (-0.118092, 51.509865) }, conversions={""point"": LongitudeLatitude}, ) ``` Here `LongitudeLatitude` is a magical value which does TWO things: it sets up the `GeomFromText(?, 4326)` SQL function, and it handles converting the `(51.509865, -0.118092)` tuple into a `POINT({} {})` string. This would involve a change to the `conversions=` contract - where it usually expects a SQL string fragment, but it can also take an object which combines that SQL string fragment with a Python conversion function. Best of all... this resolves the `lat, lon` v.s. `lon, lat` dilemma because you can use `from sqlite_utils.utils import LongitudeLatitude` OR `from sqlite_utils.utils import LatitudeLongitude` depending on which you prefer! 
_Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/399#issuecomment-1030739566_",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/402/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1128139375,I_kwDOCGYnMM5DPgpv,405,"`Database(memory_name=""name"")` constructor argument",9599,simonw,closed,0,,,,,2,2022-02-09T07:15:03Z,2022-02-16T01:23:16Z,2022-02-16T01:23:16Z,OWNER,,"SQLite in-memory databases can be named, in which case multiple connections can be opened to a shared in-memory database running within the same process. Datasette supports this - SQLite could support it too. https://docs.datasette.io/en/0.60.2/internals.html#database-ds-path-none-is-mutable-false-is-memory-false-memory-name-none",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/405/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1124237013,I_kwDOCGYnMM5DAn7V,398,Add SpatiaLite helpers to CLI,25778,eyeseast,closed,0,,,,,9,2022-02-04T14:01:28Z,2022-02-16T01:02:29Z,2022-02-16T00:58:07Z,CONTRIBUTOR,,"Now that #385 is merged, add CLI versions of those methods. ```sh # init spatialite sqlite-utils init-spatialite database.db # or maybe/also sqlite-utils create database.db --enable-wal --spatialite # add geometry columns # needs a database, table, geometry column name, type, with optional SRID and not-null # this needs to create a table if it doesn't already exist sqlite-utils add-geometry-column database.db table-name geometry --srid 4326 --not-null # spatial index an existing table/column sqlite-utils create-spatial-index database.db table-name geometry ``` Should be mostly straightforward. The one thing worth highlighting in docs is that geometry columns can only be added to existing tables. Trying to add a geometry column to a table that doesn't exist yet might mean you have a schema like `{""rowid"": int, ""geometry"": bytes}`. Might be worth nudging people to explicitly create a table first, then add geometry columns. ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/398/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 764059235,MDU6SXNzdWU3NjQwNTkyMzU=,1143,"More flexible CORS support in core, to encourage good security practices",114388,yurivish,open,0,,,3268330,Datasette 1.0,6,2020-12-12T17:06:35Z,2022-02-13T17:41:17Z,,NONE,,"It would be nice if the `--cors` option accepted an origin regex to more securely allow secure local development. As an example, Observable notebooks namespace every user's notebooks by their username and user content is served from username.observableusercontent.com, so you would set `--cors-origin username.observableusercontent.com` to restrict access to a local development Datasette instance to only your own notebooks, rather than exposing the data to any website that makes a request. 
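(A sketch of the kind of check this implies - the pattern below is made up for illustration and is not a proposed Datasette API:)

```python
import re

ALLOWED_ORIGIN = re.compile(r'^https://\w+\.observableusercontent\.com$')

def cors_headers(request_origin):
    # Echo back only an origin matching the allow-list pattern;
    # anything else gets no Access-Control-Allow-Origin header at all
    if request_origin and ALLOWED_ORIGIN.match(request_origin):
        return {'Access-Control-Allow-Origin': request_origin}
    return {}
```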
Thank you for all of your work on Datasette!",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1143/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1128120451,I_kwDOCGYnMM5DPcCD,404,Add example of `--convert` to the help for `sqlite-utils insert`,9599,simonw,closed,0,,,,,2,2022-02-09T06:49:09Z,2022-02-09T06:56:35Z,2022-02-09T06:55:16Z,OWNER,,"https://sqlite-utils.datasette.io/en/3.23/cli-reference.html#insert would be more useful if it included an example of `--convert` in action. I can maybe use an example from https://simonwillison.net/2022/Jan/11/sqlite-utils/",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/404/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1109783030,I_kwDOBm6k_c5CJfH2,1607,More detailed information about installed SpatiaLite version,9599,simonw,closed,0,,,3268330,Datasette 1.0,2,2022-01-20T21:28:03Z,2022-02-09T06:42:02Z,2022-02-09T06:32:28Z,OWNER,,"https://www.gaia-gis.it/gaia-sins/spatialite-sql-5.0.0.html#version has a whole bunch of interesting functions for things like `freexl_version()` and `geos_version()` and `HasMathSQL()` and suchlike. These could be shown on the `/-/versions` page.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1607/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1126692066,I_kwDOCGYnMM5DJ_Ti,403,Document how to add a primary key to a rowid table using `sqlite-utils transform --pk`,536941,fgregg,closed,0,,,,,4,2022-02-08T01:39:40Z,2022-02-09T04:22:43Z,2022-02-08T19:33:59Z,CONTRIBUTOR,,"*Original title: Add option for adding a new, serial, primary key* sometimes we have tables that don't have primary keys, but ought to have them. we *can* use rowid for that, but it would often be nicer to have an explicit primary key. using the current value of rowid would be fine.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/403/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 779691739,MDU6SXNzdWU3Nzk2OTE3Mzk=,1176,"Policy on documenting ""public"" datasette.utils functions",9599,simonw,closed,0,,,3268330,Datasette 1.0,13,2021-01-05T22:55:25Z,2022-02-07T06:43:32Z,2022-02-07T06:42:58Z,OWNER,,"https://github.com/simonw/datasette-css-properties starts [like this](https://github.com/simonw/datasette-css-properties/blob/0.1/datasette_css_properties/__init__.py#L1-L3): ```python from datasette import hookimpl from datasette.utils.asgi import Response from datasette.utils import escape_css_string, to_css_class ``` `escape_css_string` and `to_css_class` are not documented, which means relying on them is risky since there's no promise that they won't change. 
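For example, a plugin relying on them today is coding against behavior inferred purely from the names - nothing below is a documented contract:

```python
from datasette.utils import escape_css_string, to_css_class

css_class = to_css_class('table/with/slashes.csv')  # a CSS-safe class name
escaped = escape_css_string('multi\nline value')    # safe inside a CSS string
```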
Would be good to figure out a policy on this, and maybe promote some of them to ""documented"" status.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1176/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1125576543,I_kwDOBm6k_c5DFu9f,1630,Review datasette.utils and decide which functions should be documented for 1.0,9599,simonw,open,0,,,3268330,Datasette 1.0,0,2022-02-07T06:39:52Z,2022-02-07T06:39:52Z,,OWNER,,"Follows: - #1176",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1630/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 688622148,MDU6SXNzdWU2ODg2MjIxNDg=,957,Simplify imports of common classes,9599,simonw,closed,0,,,3268330,Datasette 1.0,7,2020-08-29T23:44:04Z,2022-02-06T06:36:41Z,2022-02-06T06:34:37Z,OWNER,,"There are only a few classes that plugins need to import. It would be nice if these imports were as short and memorable as possible. For example: ```python from datasette.app import Datasette from datasette.utils.asgi import Response ``` Could both become: ```python from datasette import Datasette from datasette import Response ``` ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/957/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1125081640,I_kwDOCGYnMM5DD2Io,401,Update SpatiaLite example in the documentation,9599,simonw,closed,0,,,,,2,2022-02-06T02:02:07Z,2022-02-06T02:05:03Z,2022-02-06T02:03:24Z,OWNER,,"This one here: https://sqlite-utils.datasette.io/en/3.23/python-api.html#converting-column-values-using-sql-functions It should take advantage of the new methods from: - #79",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/401/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1125077063,I_kwDOCGYnMM5DD1BH,400,`sqlite-utils create-table` ... `--if-not-exists`,9599,simonw,closed,0,,,,,1,2022-02-06T01:32:53Z,2022-02-06T01:34:53Z,2022-02-06T01:34:46Z,OWNER,,"Inspired by: - #397 To match the option on `create-index`: https://sqlite-utils.datasette.io/en/stable/cli-reference.html#create-index ``` --if-not-exists Ignore if index already exists ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/400/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1123903919,I_kwDOCGYnMM5C_Wmv,397,Support IF NOT EXISTS for table creation,738408,rafguns,closed,0,,,,,3,2022-02-04T07:41:15Z,2022-02-06T01:30:46Z,2022-02-06T01:29:01Z,NONE,,"Currently, I have a bunch of code that looks like this: ```python subjects = db[""subjects""] if db[""subjects""].exists() else db[""subjects""].create({ ... }) ``` It would be neat if sqlite-utils could simplify that by supporting `CREATE TABLE IF NOT EXISTS`, so that I'd be able to write, e.g. 
```python subjects = db[""subjects""].create({...}, if_not_exists=True) ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/397/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1087181951,I_kwDOBm6k_c5AzRR_,1576,Traces should include SQL executed by subtasks created with `asyncio.gather`,9599,simonw,closed,0,,,3268330,Datasette 1.0,12,2021-12-22T20:52:02Z,2022-02-05T05:21:35Z,2022-02-05T05:19:53Z,OWNER,,"I tried running some parallel SQL queries using `asyncio.gather()` but the SQL that was executed didn't show up in the trace rendered by https://datasette.io/plugins/datasette-pretty-traces I realized that was because traces are keyed against the current task ID, which changes when a sub-task is run using `asyncio.gather` or similar. The faceting and suggest faceting queries are missing from this trace: ![image](https://user-images.githubusercontent.com/9599/147153855-2d611f07-922a-4d18-9e6e-4be89e010dc4.png) > The reason they aren't showing up in the traces is that traces are stored just for the currently executing `asyncio` task ID: https://github.com/simonw/datasette/blob/ace86566b28280091b3844cf5fbecd20158e9004/datasette/tracer.py#L13-L25 > > This is so traces for other incoming requests don't end up mixed together. But there's no current mechanism to track async tasks that are effectively ""child tasks"" of the current request, and hence should be tracked the same. > > https://stackoverflow.com/a/69349501/6083 suggests that you pass the task ID as an argument to the child tasks that are executed using `asyncio.gather()` to work around this kind of problem. _Originally posted by @simonw in https://github.com/simonw/datasette/issues/1518#issuecomment-999870993_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1576/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 683805434,MDU6SXNzdWU2ODM4MDU0MzQ=,135,Code for finding SpatiaLite in the usual locations,9599,simonw,closed,0,,,,,3,2020-08-21T20:15:34Z,2022-02-05T00:04:26Z,2020-08-21T20:30:13Z,OWNER,,"I built this for `shapefile-to-sqlite` but it would be useful in `sqlite-utils` too: https://github.com/simonw/shapefile-to-sqlite/blob/e754d0747ca2facf9a7433e2d5d15a6a37a9cf6e/shapefile_to_sqlite/utils.py#L16-L19 ```python SPATIALITE_PATHS = ( ""/usr/lib/x86_64-linux-gnu/mod_spatialite.so"", ""/usr/local/lib/mod_spatialite.dylib"", ) ``` https://github.com/simonw/shapefile-to-sqlite/blob/e754d0747ca2facf9a7433e2d5d15a6a37a9cf6e/shapefile_to_sqlite/utils.py#L105-L109 ```python def find_spatialite(): for path in SPATIALITE_PATHS: if os.path.exists(path): return path return None ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/135/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 683812642,MDU6SXNzdWU2ODM4MTI2NDI=,136,--load-extension=spatialite shortcut option,9599,simonw,closed,0,,,,,3,2020-08-21T20:31:25Z,2022-02-05T00:04:26Z,2020-10-16T19:14:32Z,OWNER,,In conjunction with #135 - this would do the same thing as `--load-extension=path-to-spatialite` (see #134),140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/136/reactions"", 
""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 723460107,MDU6SXNzdWU3MjM0NjAxMDc=,187,Maybe: Utility method / CLI tool for initializing SpatiaLite,9599,simonw,closed,0,,,,,2,2020-10-16T19:04:03Z,2022-02-05T00:04:26Z,2020-10-16T19:15:13Z,OWNER,,"> I think this should initialize SpatiaLite against the current database if it has not been initialized already. > > Relevant code: https://github.com/simonw/shapefile-to-sqlite/blob/e754d0747ca2facf9a7433e2d5d15a6a37a9cf6e/shapefile_to_sqlite/utils.py#L112-L126",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/187/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 723708310,MDU6SXNzdWU3MjM3MDgzMTA=,188,About loading spatialite,30607,aborruso,closed,0,,,,,1,2020-10-17T08:47:02Z,2022-02-05T00:04:26Z,2020-10-17T08:52:58Z,NONE,,"Hi @simonw , If I run ``` sqlite3 .load /usr/local/lib/mod_spatialite.so select spatialite_version(); ``` I have `5.0.0`. ![image](https://user-images.githubusercontent.com/30607/96332706-d8cd3300-1065-11eb-906b-daf99963198e.png) If I run ``` sqlite-utils :memory: ""select spatialite_version()"" --load-extension=spatialite ``` I have ``` Traceback (most recent call last): File ""/home/aborruso/.local/bin/sqlite-utils"", line 8, in sys.exit(cli()) File ""/home/aborruso/.local/lib/python3.8/site-packages/click/core.py"", line 829, in __call__ return self.main(*args, **kwargs) File ""/home/aborruso/.local/lib/python3.8/site-packages/click/core.py"", line 782, in main rv = self.invoke(ctx) File ""/home/aborruso/.local/lib/python3.8/site-packages/click/core.py"", line 1259, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/home/aborruso/.local/lib/python3.8/site-packages/click/core.py"", line 1066, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/home/aborruso/.local/lib/python3.8/site-packages/click/core.py"", line 610, in invoke return callback(*args, **kwargs) File ""/home/aborruso/.local/lib/python3.8/site-packages/sqlite_utils/cli.py"", line 936, in query _load_extensions(db, load_extension) File ""/home/aborruso/.local/lib/python3.8/site-packages/sqlite_utils/cli.py"", line 1326, in _load_extensions db.conn.load_extension(ext) TypeError: argument 1 must be str, not None ``` How to load properly spatialite extension in sqlite-utils? Thank you very muc",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/188/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 534507142,MDU6SXNzdWU1MzQ1MDcxNDI=,69,Feature request: enable extensions loading,30607,aborruso,closed,0,,,,,3,2019-12-08T08:06:25Z,2022-02-05T00:04:25Z,2020-10-16T18:42:49Z,NONE,,"Hi, it would be great to add a parameter that enables the load of a sqlite extension you need. Something like ""-ext modspatialite"". In this way your great tool would be even more comfortable and powerful. 
Thank you very much",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/69/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 557825032,MDU6SXNzdWU1NTc4MjUwMzI=,77,Ability to insert data that is transformed by a SQL function,9599,simonw,closed,0,,,,,2,2020-01-30T23:45:55Z,2022-02-05T00:04:25Z,2020-01-31T00:24:32Z,OWNER,,"I want to be able to run the equivalent of this SQL insert: ```python # Convert to ""Well Known Text"" format wkt = shape(geojson['geometry']).wkt # Insert and commit the record conn.execute(""INSERT INTO places (id, name, geom) VALUES(null, ?, GeomFromText(?, 4326))"", ( ""Wales"", wkt )) conn.commit() ``` From the Datasette SpatiaLite docs: https://datasette.readthedocs.io/en/stable/spatialite.html To do this, I need a way of telling `sqlite-utils` that a specific column should be wrapped in `GeomFromText(?, 4326)`.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/77/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 557842245,MDU6SXNzdWU1NTc4NDIyNDU=,79,Helper methods for working with SpatiaLite,9599,simonw,closed,0,,,,,8,2020-01-31T00:39:19Z,2022-02-05T00:04:25Z,2022-02-04T05:55:11Z,OWNER,,"As demonstrated by this piece of documentation, using SpatiaLite with sqlite-utils requires a fair bit of boilerplate: https://github.com/simonw/sqlite-utils/blob/f7289174e66ae4d91d57de94bbd9d09fabf7aff4/docs/python-api.rst#L880-L909",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/79/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 734777631,MDU6SXNzdWU3MzQ3Nzc2MzE=,1080,"""View all"" option for facets, to provide a (paginated) list of ALL of the facet counts plus a link to view them",9599,simonw,open,0,,,3268330,Datasette 1.0,7,2020-11-02T19:55:06Z,2022-02-04T06:25:18Z,,OWNER,,Can use `/database/-/...` namespace from #296,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1080/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1123849278,I_kwDOCGYnMM5C_JQ-,395,"""apt-get: command not found"" error on macOS",9599,simonw,closed,0,,,,,1,2022-02-04T06:03:42Z,2022-02-04T06:10:58Z,2022-02-04T06:10:58Z,OWNER,,"Yeah, `apt-get` isn't a thing on macOS so 4a2a3e2fd0d5534f446b3f1fee34cb165e4d86d2 (to test #79 against real SpatiaLite) broke.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/395/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1123851690,I_kwDOCGYnMM5C_J2q,396,"mypy failure, sqlite_utils/utils.py:56",9599,simonw,closed,0,,,,,0,2022-02-04T06:08:09Z,2022-02-04T06:10:33Z,2022-02-04T06:10:33Z,OWNER,,"https://github.com/simonw/sqlite-utils/runs/5062725880?check_suite_focus=true > `sqlite_utils/utils.py:56: error: Incompatible return value type (got ""None"", expected ""str"")`",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/396/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, 
""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1072792507,I_kwDOCGYnMM4_8YO7,352,`sqlite-utils insert --extract colname`,9599,simonw,open,0,,,,,4,2021-12-07T00:55:44Z,2022-02-03T22:59:36Z,,OWNER,,"Is there a reason I've not added `--extract` as an option for `sqlite-utils insert` next? There's a `extracts=` option for the various `table.insert()` etc methods - last line in this code block: https://github.com/simonw/sqlite-utils/blob/213a0ff177f23a35f3b235386366ff132eb879f1/sqlite_utils/db.py#L2483-L2495",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/352/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1094981339,I_kwDOCGYnMM5BRBbb,363,Better error message if `--convert` code fails to return a dict,9599,simonw,closed,0,,,,,4,2022-01-06T05:26:28Z,2022-02-03T22:52:30Z,2022-02-03T22:51:30Z,OWNER,,"Here's the traceback if your `--convert` function doesn't return a dict right now: ``` % sqlite-utils insert /tmp/all.db blah /tmp/log.log --convert 'all.upper()' --all Traceback (most recent call last): File ""/Users/simon/.local/share/virtualenvs/sqlite-utils-C4Ilevlm/bin/sqlite-utils"", line 33, in sys.exit(load_entry_point('sqlite-utils', 'console_scripts', 'sqlite-utils')()) File ""/Users/simon/.local/share/virtualenvs/sqlite-utils-C4Ilevlm/lib/python3.8/site-packages/click/core.py"", line 1137, in __call__ return self.main(*args, **kwargs) File ""/Users/simon/.local/share/virtualenvs/sqlite-utils-C4Ilevlm/lib/python3.8/site-packages/click/core.py"", line 1062, in main rv = self.invoke(ctx) File ""/Users/simon/.local/share/virtualenvs/sqlite-utils-C4Ilevlm/lib/python3.8/site-packages/click/core.py"", line 1668, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/Users/simon/.local/share/virtualenvs/sqlite-utils-C4Ilevlm/lib/python3.8/site-packages/click/core.py"", line 1404, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/Users/simon/.local/share/virtualenvs/sqlite-utils-C4Ilevlm/lib/python3.8/site-packages/click/core.py"", line 763, in invoke return __callback(*args, **kwargs) File ""/Users/simon/Dropbox/Development/sqlite-utils/sqlite_utils/cli.py"", line 949, in insert insert_upsert_implementation( File ""/Users/simon/Dropbox/Development/sqlite-utils/sqlite_utils/cli.py"", line 834, in insert_upsert_implementation db[table].insert_all( File ""/Users/simon/Dropbox/Development/sqlite-utils/sqlite_utils/db.py"", line 2602, in insert_all first_record = next(records) File ""/Users/simon/Dropbox/Development/sqlite-utils/sqlite_utils/db.py"", line 3044, in fix_square_braces for record in records: File ""/Users/simon/Dropbox/Development/sqlite-utils/sqlite_utils/cli.py"", line 831, in docs = (decode_base64_values(doc) for doc in docs) File ""/Users/simon/Dropbox/Development/sqlite-utils/sqlite_utils/utils.py"", line 86, in decode_base64_values to_fix = [ File ""/Users/simon/Dropbox/Development/sqlite-utils/sqlite_utils/utils.py"", line 89, in if isinstance(doc[k], dict) TypeError: string indices must be integers ``` It would be nicer if that returned a more useful error message. 
_Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/361#issuecomment-1006295276_",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/363/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1118585417,I_kwDOCGYnMM5CrEJJ,393,Better documentation for insert-replace,9599,simonw,closed,0,,,,,1,2022-01-30T15:40:23Z,2022-02-03T22:13:24Z,2022-02-03T22:13:24Z,OWNER,,"Currently: https://sqlite-utils.datasette.io/en/stable/python-api.html#insert-replacing-data > If you want to insert a record or replace an existing record with the same primary key, using the replace=True argument to .insert() or .insert_all(): Should describe the exception you get first, then how to use replace to avoid it.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/393/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1097091527,I_kwDOCGYnMM5BZEnH,369,Research how much of a difference analyze / sqlite_stat1 makes,9599,simonw,closed,0,,,,,11,2022-01-09T03:03:36Z,2022-02-03T21:07:41Z,2022-02-03T21:07:35Z,OWNER,,"> Is there a downside to having a `sqlite_stat1` table if it has wildly incorrect statistics in it? _Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/365#issuecomment-1008163050_ More generally: how much of a difference does the `sqlite_stat1` table created by `ANALYZE` make to queries? I'm particularly interested in `group by` / `count *` queries since Datasette uses those for faceting.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/369/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1122446693,I_kwDOCGYnMM5C5y1l,394,Test against Python 3.11-dev,9599,simonw,open,0,,,,,1,2022-02-02T22:21:03Z,2022-02-03T21:06:35Z,,OWNER,,"Same as: - https://github.com/simonw/datasette/issues/1621",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/394/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1075893249,I_kwDOBm6k_c5AINQB,1545,Custom pages don't work on windows,559711,ryascott,closed,0,,,,,3,2021-12-09T18:53:05Z,2022-02-03T02:08:31Z,2022-02-03T01:58:35Z,NONE,,"It seems that custom pages don't work when put in templates/pages. To reproduce on datasette version 0.59.4 using PowerShell on Windows 10 with Python 3.10.0 mkdir -p templates/pages echo ""hello world"" >> templates/pages/about.html Start datasette datasette --template-dir templates/ Navigate to [http://127.0.0.1:8001/about](url) and receive: Error 404: Database not found: about ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1545/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1122557010,I_kwDOBm6k_c5C6NxS,1627,Get the tests passing against Windows,9599,simonw,open,0,,,,,0,2022-02-03T01:23:06Z,2022-02-03T01:23:32Z,,OWNER,,"> OK, the tests do NOT pass against Windows!
https://github.com/simonw/datasette/runs/5044105941 > > _Originally posted by @simonw in https://github.com/simonw/datasette/issues/1626#issuecomment-1028515161_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1627/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1122450452,I_kwDOBm6k_c5C5zwU,1625,Try running tests against macOS and Windows in addition to Ubuntu,9599,simonw,open,0,,,,,0,2022-02-02T22:25:57Z,2022-02-02T22:25:57Z,,OWNER,,"I already do this for `sqlite-utils`: https://github.com/simonw/sqlite-utils/blob/3.22.1/.github/workflows/test.yml Related: - #1617 - #1545",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1625/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1121618041,I_kwDOBm6k_c5C2oh5,1620,"Link: rel=""alternate"" to JSON for queries too",9599,simonw,closed,0,,,3268330,Datasette 1.0,3,2022-02-02T08:02:42Z,2022-02-02T21:53:02Z,2022-02-02T21:33:00Z,OWNER,,"Following: - #1533 I implemented it for tables and rows but I should have done queries as well.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1620/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1121121305,I_kwDOBm6k_c5C0vQZ,1618,"Reconsider policy on blocking queries containing the string ""pragma""",770231,strada,open,0,,,,,6,2022-02-01T19:39:46Z,2022-02-02T19:42:03Z,,NONE,,"First of all, thanks for creating this cool project, and also supporting publishing to various hosting services out of the box. While testing out, I noticed legitimate queries such as ``` select * from books where title like 'Pragmatic%' ``` or ``` select * from books where title = 'The Pragmatic Programmer' ``` are blocked, due to the regular expression check here: https://github.com/simonw/datasette/blob/main/datasette/utils/__init__.py#L185 Example as seen from a Datasette instance: https://fivethirtyeight.datasettes.com/polls?sql=select+*+from+books+where+title+like+%27Pragmatic%25%27%0D%0A I'd propose a regular expression like ``` re.compile(f""pragma_(?!({'|'.join(allowed_pragmas)}))""), ``` instead of ``` re.compile(f""pragma(?!_({'|'.join(allowed_pragmas)}))""), ``` I can create a pull request with this change, unless the maintainers think it would allow unwanted queries to be executed. ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1618/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1065431383,I_kwDOBm6k_c4_gTFX,1533,"Add `Link: rel=""alternate""` header pointing to JSON for a table/query",9599,simonw,closed,0,,,3268330,Datasette 1.0,4,2021-11-28T20:43:25Z,2022-02-02T07:56:51Z,2022-02-02T07:49:33Z,OWNER,,"Originally explored in https://github.com/simonw/datasette-notebook/issues/2#issuecomment-980789406 - I wanted an efficient way to scan a list of URLs and figure out which if any of those corresponded to Datasette tables, canned queries or SQL output that could be represented as a table on a page. 
It looks like a neat way to do that is with ` Link:` header like this: `Link: http://127.0.0.1:8058/fixtures/compound_three_primary_keys.json; rel=""alternate""; type=""application/datasette+json""` I can put a ` Looks like running `count(*)` against KNN took 83s! It ignored the time limit. And still only returned a count of 0.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1611/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1114640101,I_kwDOCGYnMM5CcA7l,392,`sqlite-utils bulk --batch-size` option,9599,simonw,closed,0,,,,,4,2022-01-26T05:17:11Z,2022-01-26T18:17:59Z,2022-01-26T18:17:59Z,OWNER,,"> Could add support for `--batch-size` as seen in `insert`/`upsert` too - causing it to break the list up into batches and commit for each one. _Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/391#issuecomment-1021876055_",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/392/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1114628238,I_kwDOBm6k_c5Cb-CO,1613,Improvements to help make Datasette a better tool for learning SQL,9599,simonw,open,0,,,,,5,2022-01-26T04:56:07Z,2022-01-26T16:41:46Z,,OWNER,,Tracking issue for the general goal of making Datasette a better tool for learning SQL.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1613/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1114638930,I_kwDOCGYnMM5CcApS,391,`sqlite-utils bulk` progress bar,9599,simonw,closed,0,,,,,2,2022-01-26T05:14:49Z,2022-01-26T05:17:20Z,2022-01-26T05:16:51Z,OWNER,,"It can easily have a progress bar because it works by looping through an iterator: https://github.com/simonw/sqlite-utils/blob/a9fca7efa4184fbb2a65ca1275c326950ed9d3c1/sqlite_utils/cli.py#L1014-L1018 Should also support the `--silent` option if I add this.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/391/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1114557284,I_kwDOCGYnMM5Cbstk,390,`sqlite-utils upsert` should require `--pk` more elegantly,9599,simonw,closed,0,,,,,1,2022-01-26T02:20:31Z,2022-01-26T03:20:25Z,2022-01-26T03:19:43Z,OWNER,,"Currently throws an ugly traceback: ``` % echo '[ {""id"": 1, ""name"": ""Lila""}, {""id"": 1, ""name"": ""Lila""} ]' | sqlite-utils upsert data.db chickens - Traceback (most recent call last): File ""/Users/simon/.local/share/virtualenvs/sqlite-utils-C4Ilevlm/bin/sqlite-utils"", line 33, in <module> sys.exit(load_entry_point('sqlite-utils', 'console_scripts', 'sqlite-utils')()) File ""/Users/simon/.local/share/virtualenvs/sqlite-utils-C4Ilevlm/lib/python3.8/site-packages/click/core.py"", line 1137, in __call__ return self.main(*args, **kwargs) File ""/Users/simon/.local/share/virtualenvs/sqlite-utils-C4Ilevlm/lib/python3.8/site-packages/click/core.py"", line 1062, in main rv = self.invoke(ctx) File ""/Users/simon/.local/share/virtualenvs/sqlite-utils-C4Ilevlm/lib/python3.8/site-packages/click/core.py"", line 1668, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File
""/Users/simon/.local/share/virtualenvs/sqlite-utils-C4Ilevlm/lib/python3.8/site-packages/click/core.py"", line 1404, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/Users/simon/.local/share/virtualenvs/sqlite-utils-C4Ilevlm/lib/python3.8/site-packages/click/core.py"", line 763, in invoke return __callback(*args, **kwargs) File ""/Users/simon/Dropbox/Development/sqlite-utils/sqlite_utils/cli.py"", line 1104, in upsert insert_upsert_implementation( File ""/Users/simon/Dropbox/Development/sqlite-utils/sqlite_utils/cli.py"", line 906, in insert_upsert_implementation db[table].insert_all( File ""/Users/simon/Dropbox/Development/sqlite-utils/sqlite_utils/db.py"", line 2615, in insert_all raise PrimaryKeyRequired(""upsert() requires a pk"") sqlite_utils.db.PrimaryKeyRequired: upsert() requires a pk ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/390/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1099897648,I_kwDOCGYnMM5Bjxsw,384,Add examples to every `--help`,9599,simonw,closed,0,,,,,0,2022-01-12T05:31:25Z,2022-01-26T03:15:02Z,2022-01-26T03:15:02Z,OWNER,,Everything on https://sqlite-utils.datasette.io/en/stable/cli-reference.html would benefit from an example.,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/384/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 471818939,MDU6SXNzdWU0NzE4MTg5Mzk=,48,"Jupyter notebook demo of the library, launchable on Binder",9599,simonw,closed,0,,,,,2,2019-07-23T17:05:05Z,2022-01-26T02:08:46Z,2022-01-26T02:08:39Z,OWNER,,,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/48/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1114544727,I_kwDOCGYnMM5CbppX,389,Plausible analytics for documentation,9599,simonw,closed,0,,,,,2,2022-01-26T01:58:35Z,2022-01-26T02:07:41Z,2022-01-26T02:07:41Z,OWNER,,"```html ``` _Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/388#issuecomment-1021785268_",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/389/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1111293050,I_kwDOCGYnMM5CPPx6,387,Python library docs should start with a self contained example,9599,simonw,closed,0,,,,,1,2022-01-22T06:23:56Z,2022-01-26T01:37:17Z,2022-01-26T01:35:30Z,OWNER,,You have to read a lot of stuff in a lot of different places to get started with the Python library. 
Add a getting started introduction to https://sqlite-utils.datasette.io/en/stable/python-api.html,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/387/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1087913724,I_kwDOBm6k_c5A2D78,1577,Drop support for Python 3.6,9599,simonw,closed,0,,,3268330,Datasette 1.0,6,2021-12-23T18:17:03Z,2022-01-25T23:30:03Z,2022-01-20T04:31:41Z,OWNER,,"*Original title: Decide when to drop support for Python 3.6* > `context_vars` can solve this but they were introduced in Python 3.7: https://www.python.org/dev/peps/pep-0567/ > > Python 3.6 support ends in a few days time, and it looks like Glitch has updated to 3.7 now - so maybe I can get away with Datasette needing 3.7 these days? > > Tweeted about that here: https://twitter.com/simonw/status/1473761478155010048 _Originally posted by @simonw in https://github.com/simonw/datasette/issues/1576#issuecomment-999878907_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1577/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1109884720,I_kwDOBm6k_c5CJ38w,1609,"Ensure ""pip install datasette"" still works with Python 3.6",9599,simonw,closed,0,,,,,12,2022-01-21T00:08:10Z,2022-01-24T19:20:09Z,2022-01-21T02:24:13Z,OWNER,,"## Original title: Can I keep ""pip install datasette"" working on Python 3.6? I dropped support for 3.6 in: - #1577 I'm getting reports that `pip3 install datasette` throws an error on that Python, even though I haven't made that new release yet - presumably due to lack of pinning of Uvicorn: https://twitter.com/ldodds/status/1484289475195080706 Is it possible to get `pip` on that version of Python to install the highest possible version of the packages that are still known to support Python 3.6? If so, how?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1609/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1105916061,I_kwDOBm6k_c5B6vCd,1601,Add KNN and data_licenses to hidden tables list,25778,eyeseast,closed,0,,,,,5,2022-01-17T14:19:57Z,2022-01-20T21:29:44Z,2022-01-20T04:38:54Z,CONTRIBUTOR,,"They're generated by Spatialite and not very interesting in most cases. 
",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1601/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 824064069,MDU6SXNzdWU4MjQwNjQwNjk=,1249,Updated Dockerfile with SpatiaLite version 5.0,9599,simonw,closed,0,,,,,45,2021-03-08T00:17:36Z,2022-01-20T21:29:43Z,2021-03-29T00:57:13Z,OWNER,,"The version bundled in Datasette's Docker image right now is 4.4.0-RC0 https://github.com/simonw/datasette/blob/d0fd833b8cdd97e1b91d0f97a69b494895d82bee/Dockerfile#L16-L17 5 has been out for a couple of months and has a bunch of big improvements, most notable stable KNN support.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1249/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 837308703,MDU6SXNzdWU4MzczMDg3MDM=,1268,Figure out why SpatiaLite 5.0 hangs the database page on Linux,9599,simonw,closed,0,,,,,18,2021-03-22T04:44:16Z,2022-01-20T21:29:43Z,2021-03-22T17:41:12Z,OWNER,,"See detailed notes in https://github.com/simonw/datasette/issues/1249 - for some reason SpatiaLite 5.0 hangs the `/dbname` page on Linux (inside Docker containers, both with a custom compiled SpatiaLite and one installed from the Ubuntu 20.10 package repository). This doesn't happen on macOS with SpatiaLite 5 installed using Homebrew.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1268/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 842416110,MDU6SXNzdWU4NDI0MTYxMTA=,1278,SpatiaLite timezones demo is broken,9599,simonw,closed,0,,,,,2,2021-03-27T04:45:27Z,2022-01-20T21:29:43Z,2021-03-27T16:17:13Z,OWNER,,https://github.com/simonw/datasette/blob/5fd02890650db790b2ffdb90eb9f78f8e0639c37/docs/spatialite.rst#L96,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1278/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1083581011,I_kwDOBm6k_c5AliJT,1564,_prepare_connection not called on write connections,9599,simonw,closed,0,,,7571612,Datasette 0.60,1,2021-12-17T20:06:47Z,2022-01-20T21:29:43Z,2021-12-18T01:58:44Z,OWNER,,"I was trying to initalize SpatiaLite in a write connection: ```pycon >>> from datasette.app import Datasette >>> ds = Datasette(memory=True, files=[], sqlite_extensions=[""spatialite""]) >>> db = ds.add_memory_database('geo') >>> await db.execute_write(""select InitSpatialMetadata(1)"") UUID('3f143baa-4e3d-5842-a36f-4fa2f683b72f') no such function: InitSpatialMetadata ``` It looks like the code that loads additional modules only works on read-only connections, not on write connections: https://github.com/simonw/datasette/blob/92a5280d2e75c39424a75ad6226fc74400ae984f/datasette/database.py#L146-L153 Compared to: https://github.com/simonw/datasette/blob/92a5280d2e75c39424a75ad6226fc74400ae984f/datasette/database.py#L124-L132",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1564/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 
752966476,MDU6SXNzdWU3NTI5NjY0NzY=,1114,--load-extension=spatialite not working with datasetteproject/datasette docker image,2182,danp,closed,0,,,,,4,2020-11-29T17:35:20Z,2022-01-20T21:29:42Z,2020-11-29T17:37:45Z,CONTRIBUTOR,,"https://github.com/simonw/datasette/commit/6aa5886379dd9017215904fb28567b80018902f9 added the `--load-extension=spatialite` shortcut looking for the extension in these places: https://github.com/simonw/datasette/blob/12877d7a48e2aa28bb5e780f929a218f7265d849/datasette/utils/__init__.py#L56-L60 However, in the datasetteproject/datasette docker image the file is at `/usr/local/lib/mod_spatialite.so`. This results in the example command [here](https://docs.datasette.io/en/stable/installation.html#loading-spatialite) failing: ``` % docker run --rm -p 8001:8001 -v `pwd`:/mnt datasetteproject/datasette datasette -p 8001 -h 0.0.0.0 /mnt/data.db --load-extension=spatialite Error: Could not find SpatiaLite extension ``` But it does work when given an explicit path: ``` % docker run --rm -p 8001:8001 -v `pwd`:/mnt datasetteproject/datasette datasette -p 8001 -h 0.0.0.0 /mnt/data.db --load-extension=/usr/local/lib/mod_spatialite.so INFO: Started server process [1] INFO: Waiting for application startup. INFO: Application startup complete. INFO: Uvicorn running on http://0.0.0.0:8001 (Press CTRL+C to quit) ... ``` Perhaps `SPATIALITE_PATHS` should include `/usr/local/lib/mod_spatialite.so`?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1114/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 752995227,MDU6SXNzdWU3NTI5OTUyMjc=,1115,SpatiaLite error could suggest --load-extension=spatialite,9599,simonw,closed,0,,,,,1,2020-11-29T20:05:07Z,2022-01-20T21:29:42Z,2020-11-29T20:13:22Z,OWNER,,"https://github.com/simonw/datasette/blob/242bc89fdf2e775e340d69a4e851b3a9accb31c6/datasette/cli.py#L533-L548 This could use the `find_spatialite()` function and, if it finds something, suggest the user use `--load-extension=spatialite` https://github.com/simonw/datasette/blob/242bc89fdf2e775e340d69a4e851b3a9accb31c6/datasette/utils/__init__.py#L1015-L1019",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1115/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 443040665,MDU6SXNzdWU0NDMwNDA2NjU=,466,"Move ""no such module: VirtualSpatialIndex"" code elsewhere",9599,simonw,closed,0,,,4305096,0.28,2,2019-05-11T22:09:00Z,2022-01-20T21:29:41Z,2019-05-11T22:57:22Z,OWNER,,"We currently show a useful warning (from #331) when the user tries to open a spatialite database without first loading the module: https://github.com/simonw/datasette/blob/c692cd291111050483a32bea1ee08e994a0b781b/datasette/app.py#L547-L554 This code is part of `.inspect()` which is going away - see #462 - so I need to find somewhere else for it to live.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/466/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 563347679,MDU6SXNzdWU1NjMzNDc2Nzk=,668,Make it easier to load SpatiaLite,9599,simonw,closed,0,,,,,2,2020-02-11T17:03:43Z,2022-01-20T21:29:41Z,2021-01-04T20:18:39Z,OWNER,,"``` $ datasette spatial.db Serve! 
files=('spatial.db',) (immutables=()) on port 8001 ERROR: conn=, sql = 'PRAGMA table_info(SpatialIndex);', params = None: no such module: VirtualSpatialIndex Usage: datasette serve [OPTIONS] [FILES]... Error: It looks like you're trying to load a SpatiaLite database without first loading the SpatiaLite module. Read more: https://datasette.readthedocs.io/en/latest/spatialite.html ``` This error message could sniff around in the common locations for the SpatiaLite module and output the CLI command you should use to enable it: ``` datasette spatial.db --load-extension=/usr/local/lib/mod_spatialite.dylib ``` Even better: if Datasette had a `--spatialite` option which automatically loads the extension from common locations, if it can find it.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/668/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 723803777,MDU6SXNzdWU3MjM4MDM3Nzc=,1028,--load-extension=spatialite shortcut,9599,simonw,closed,0,,,6026070,0.51,1,2020-10-17T17:02:08Z,2022-01-20T21:29:41Z,2020-10-19T22:37:55Z,OWNER,,I added this to `sqlite-utils` in https://github.com/simonw/sqlite-utils/issues/136 and I really like it: pass a special value of `spatialite` and Datasette should attempt to load it from known likely installation locations.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1028/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 336936010,MDU6SXNzdWUzMzY5MzYwMTA=,331,Datasette throws error when loading spatialite db without extension loaded,82988,psychemedia,closed,0,,,,,2,2018-06-29T09:51:14Z,2022-01-20T21:29:40Z,2018-07-10T15:13:36Z,CONTRIBUTOR,,"When starting datasette on a SpatiaLite database *without* loading the SpatiaLite extension (using eg `--load-extension=/usr/local/lib/mod_spatialite.dylib`) an error is thrown and the server fails to start: ``` datasette -p 8003 adminboundaries.db Serve!
files=('adminboundaries.db',) on port 8003 Traceback (most recent call last): File ""/Users/ajh59/anaconda3/bin/datasette"", line 11, in <module> sys.exit(cli()) File ""/Users/ajh59/anaconda3/lib/python3.6/site-packages/click/core.py"", line 722, in __call__ return self.main(*args, **kwargs) File ""/Users/ajh59/anaconda3/lib/python3.6/site-packages/click/core.py"", line 697, in main rv = self.invoke(ctx) File ""/Users/ajh59/anaconda3/lib/python3.6/site-packages/click/core.py"", line 1066, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/Users/ajh59/anaconda3/lib/python3.6/site-packages/click/core.py"", line 895, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/Users/ajh59/anaconda3/lib/python3.6/site-packages/click/core.py"", line 535, in invoke return callback(*args, **kwargs) File ""/Users/ajh59/anaconda3/lib/python3.6/site-packages/datasette/cli.py"", line 552, in serve ds.inspect() File ""/Users/ajh59/anaconda3/lib/python3.6/site-packages/datasette/app.py"", line 273, in inspect ""tables"": inspect_tables(conn, self.metadata.get(""databases"", {}).get(name, {})) File ""/Users/ajh59/anaconda3/lib/python3.6/site-packages/datasette/inspect.py"", line 79, in inspect_tables ""PRAGMA table_info({});"".format(escape_sqlite(table)) sqlite3.OperationalError: no such module: VirtualSpatialIndex ``` It would be nice to trap this and return a message saying something like: ``` It looks like you're trying to load a SpatiaLite database? Make sure you load in the SpatiaLite extension when starting datasette. Read more: https://datasette.readthedocs.io/en/latest/spatialite.html ``` ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/331/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 328155946,MDU6SXNzdWUzMjgxNTU5NDY=,301,"--spatialite option for ""datasette publish heroku""",9599,simonw,open,0,,,,,1,2018-05-31T14:13:09Z,2022-01-20T21:28:50Z,,OWNER,,Split off from #243.
Need to figure out how to install and configure SpatiaLite on Heroku.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/301/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 838382890,MDU6SXNzdWU4MzgzODI4OTA=,1273,Refresh SpatiaLite documentation,9599,simonw,open,0,,,,,4,2021-03-23T06:05:55Z,2022-01-20T21:28:50Z,,OWNER,,https://docs.datasette.io/en/0.55/spatialite.html was written before I had tools like [geojson-to-sqlite](https://datasette.io/tools/geojson-to-sqlite) and [shapefile-to-sqlite](https://datasette.io/tools/shapefile-to-sqlite).,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1273/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 849975810,MDU6SXNzdWU4NDk5NzU4MTA=,1292,Research ctypes.util.find_library('spatialite'),9599,simonw,open,0,,,,,1,2021-04-04T22:36:59Z,2022-01-20T21:28:50Z,,OWNER,,"Spotted this in the Django SpatiaLite backend: https://github.com/django/django/blob/8f6a7a0e9e7c5404af6520ae606927e32415eb00/django/contrib/gis/db/backends/spatialite/base.py#L24-L36 ```python ctypes.util.find_library('spatialite') ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1292/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1104691662,I_kwDOBm6k_c5B2EHO,1600,plugins --all example should use cog,9599,simonw,closed,0,,,,,1,2022-01-15T11:47:49Z,2022-01-20T05:06:21Z,2022-01-20T05:04:16Z,OWNER,,The example output for `datasette plugins --all`on this page has got out of date: https://docs.datasette.io/en/stable/plugins.html#seeing-what-plugins-are-installed,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1600/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1108846067,I_kwDOBm6k_c5CF6Xz,1606,Tests failing against Python 3.6,9599,simonw,closed,0,,,,,3,2022-01-20T04:22:44Z,2022-01-20T04:36:42Z,2022-01-20T04:36:42Z,OWNER,,"https://github.com/simonw/datasette/runs/4877484366 ``` E File ""/opt/hostedtoolcache/Python/3.6.15/x64/lib/python3.6/site-packages/uvicorn/server.py"", line 67, in run E return asyncio.run(self.serve(sockets=sockets)) E AttributeError: module 'asyncio' has no attribute 'run' ``` I think this may mean `uvicorn` has dropped support for Python 3.6.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1606/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1097040427,I_kwDOBm6k_c5BY4Ir,1587,Add `sqlite_stat1`(-4) tables to hidden table list,9599,simonw,closed,0,,,,,2,2022-01-08T21:28:20Z,2022-01-20T04:12:59Z,2022-01-20T04:12:59Z,OWNER,,"> Running `ANALYZE` creates a new visible table called `sqlite_stat1`: https://www.sqlite.org/fileformat.html#the_sqlite_stat1_table > > This should be added to the default list of hidden tables in Datasette.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1587/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, 
""rocket"": 0, ""eyes"": 0}",,completed 910092577,MDU6SXNzdWU5MTAwOTI1Nzc=,1356,"Research: syntactic sugar for using --get with SQL queries, maybe ""datasette query""",9599,simonw,open,0,,,,,10,2021-06-03T04:49:42Z,2022-01-20T01:06:37Z,,OWNER,,"Inspired by https://github.com/simonw/sqlite-utils/issues/264 - in particular this example: ``` datasette covid.db --get='/covid.yaml?sql=select * from ny_times_us_counties limit 1' - date: '2020-01-21' county: Snohomish state: Washington fips: 53061 cases: 1 deaths: 0 ``` Having to construct that URL - including potentially URL escaping the SQL query - isn't a great developer experience. Imagine if you could do this instead: datasette covid.db --query ""select * from ny_times_us_counties limit 1"" --format yaml ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1356/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1108300685,I_kwDOBm6k_c5CD1ON,1604,Option to assign a domain/subdomain using `datasette publish cloudrun`,9599,simonw,open,0,,,,,1,2022-01-19T16:21:17Z,2022-01-19T16:23:54Z,,OWNER,,Looks like this API should be able to do that: https://twitter.com/steren/status/1483835859191304192 - https://cloud.google.com/run/docs/reference/rest/v1/namespaces.domainmappings/create,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1604/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1107557831,I_kwDOCGYnMM5CA_3H,386,"Better ""contributing"" documentation",9599,simonw,closed,0,,,,,0,2022-01-19T02:11:48Z,2022-01-19T02:15:21Z,2022-01-19T02:15:21Z,OWNER,,"This page jumps straight into running the tests: https://sqlite-utils.datasette.io/en/latest/contributing.html It should add a little more about expected collaboration styles - opening an issue before filing a pull request - and probably link to https://simonwillison.net/2022/Jan/12/how-i-build-a-feature/",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/386/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1100015398,I_kwDOBm6k_c5BkOcm,1591,Maybe let plugins define custom serve options?,9599,simonw,open,0,,,,,7,2022-01-12T08:18:47Z,2022-01-15T11:56:59Z,,OWNER,,"https://twitter.com/psychemedia/status/1481171650934714370 > can extensions be passed their own cli args? eg `--ext-tiddlywiki-dbname tiddlywiki2.sqlite` ? 
I've thought something like this might be useful for other plugins in the past, too.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1591/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1102966378,I_kwDOBm6k_c5Bve5q,1599,Add architecture documentation,9599,simonw,open,0,,,,,0,2022-01-14T04:55:38Z,2022-01-14T04:56:03Z,,OWNER,,"Inspired by https://matklad.github.io/2021/02/06/ARCHITECTURE.md.html Good example: https://github.com/rust-analyzer/rust-analyzer/blob/d7c99931d05e3723d878bea5dc26766791fa4e69/docs/dev/architecture.md",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1599/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1083669410,I_kwDOBm6k_c5Al3ui,1566,Release Datasette 0.60,9599,simonw,closed,0,,,7571612,Datasette 0.60,6,2021-12-17T22:58:12Z,2022-01-14T01:59:55Z,2022-01-14T01:59:55Z,OWNER,,Using this as a tracking issue. I'm hoping to get the bulk of the JSON redesign work from the refactor in #1554 in for this release.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1566/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1102484126,I_kwDOBm6k_c5BtpKe,1595,Release notes for 0.60,9599,simonw,closed,0,,,7571612,Datasette 0.60,4,2022-01-13T22:23:14Z,2022-01-14T01:37:39Z,2022-01-14T01:37:39Z,OWNER,,,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1595/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1099723916,I_kwDOBm6k_c5BjHSM,1590,Table+query JSON and CSV links broken when using `base_url` setting,1001306,eelkevdbos,closed,0,,,7571612,Datasette 0.60,11,2022-01-11T23:46:39Z,2022-01-14T01:16:34Z,2022-01-14T01:16:08Z,NONE,,"Datasette appends the prefix found in the `base_url` setting twice if a `base_url` is set. 
In the following ASGI example, I'm hosting a custom Datasette instance: ```python # asgi.py import pathlib from asgi_cors import asgi_cors from channels.routing import URLRouter from django.urls import re_path from datasette.app import Datasette datasette_ = Datasette( files=[], settings={ ""base_url"": ""/datasettes/"", ""plugins"": {} }, config_dir=pathlib.Path('.'), ) application = URLRouter([ re_path(r""^datasettes/.*"", asgi_cors(datasette_.app(), allow_all=True)), ]) ``` Running it with: ```shell $ daphne -p 8002 asgi:application ``` Using a simple query on the `_memory` table: ```sql select sqlite_version() ``` http://localhost:8002/datasettes/_memory?sql=select+sqlite_version%28%29 It renders the following upon inspection: ![image](https://user-images.githubusercontent.com/1001306/149038851-aa842950-126a-467c-9a86-fae13bce6221.png) I am using datasette version `0.59.4`",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1590/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1059555791,I_kwDOBm6k_c4_J4nP,1527,Columns starting with an underscore behave poorly in filters,9599,simonw,closed,0,,,7571612,Datasette 0.60,7,2021-11-22T01:01:36Z,2022-01-14T00:57:08Z,2022-01-14T00:57:08Z,OWNER,,"Similar bug to #1525 (and #1506 before it). Start on https://latest.datasette.io/fixtures/facetable?_facet=_neighborhood - then select a neighborhood - then try to remove that filter using the little ""x"" and submitting the form again. ![filter-bug](https://user-images.githubusercontent.com/9599/142786754-31d265a2-944d-4ea2-af6f-305d445a2ccb.gif) ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1527/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1102637351,I_kwDOBm6k_c5BuOkn,1598,Replace update-docs-help.py script with cog,9599,simonw,closed,0,,,,,1,2022-01-14T00:33:27Z,2022-01-14T00:47:57Z,2022-01-14T00:47:57Z,OWNER,,"I introduced `cog` in #1594 - I can use this to replace the older `update-docs-help.py` mechanism: https://github.com/simonw/datasette/blob/76d66d5b2bf10249c0beaac0999b93ac8d757f48/tests/test_docs.py#L36-L53",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1598/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1102612922,I_kwDOBm6k_c5BuIm6,1597,""datasette inspect"" has no help summary",9599,simonw,closed,0,,,,,1,2022-01-14T00:02:16Z,2022-01-14T00:07:36Z,2022-01-14T00:07:36Z,OWNER,,"Made obvious by the new CLI reference page added in #1594. https://docs.datasette.io/en/latest/cli-reference.html#datasette-inspect-help ``` Commands: serve* Serve up specified SQLite database files with a web UI inspect install Install Python packages - e.g. ``` ``` Usage: datasette inspect [OPTIONS] [FILES]... Options: --inspect-file TEXT --load-extension TEXT Path to a SQLite extension to load --help Show this message and exit.
```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1597/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1102568047,I_kwDOBm6k_c5Bt9pv,1596,Documentation page warning of changes coming in 1.0,9599,simonw,open,0,,,,,0,2022-01-13T23:26:04Z,2022-01-13T23:26:04Z,,OWNER,,I should start this relatively soon.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1596/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1082746149,I_kwDOBm6k_c5AiWUl,1560,"Table page title has ""where where"" in it",9599,simonw,closed,0,,,7571612,Datasette 0.60,0,2021-12-17T00:05:48Z,2022-01-13T22:28:35Z,2022-01-13T22:20:15Z,OWNER,,"Just noticed this while working on #1518. ``` % curl -s 'https://latest.datasette.io/fixtures/facetable?_sort=pk&on_earth__exact=1' | grep -C 1 '' <head> <title>fixtures: facetable: 14 rows where where on_earth = 1 sorted by pk ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1560/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 991467558,MDU6SXNzdWU5OTE0Njc1NTg=,1466,Add Datasette Desktop to installation documentation,9599,simonw,closed,0,,,7571612,Datasette 0.60,2,2021-09-08T19:41:27Z,2022-01-13T22:28:28Z,2022-01-13T21:55:18Z,OWNER,,See https://datasette.io/desktop and https://simonwillison.net/2021/Sep/8/datasette-desktop/,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1466/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1102359726,I_kwDOBm6k_c5BtKyu,1594,"Add a CLI reference page to the docs, inspired by sqlite-utils",9599,simonw,closed,0,,,7571612,Datasette 0.60,3,2022-01-13T20:55:08Z,2022-01-13T22:28:22Z,2022-01-13T21:38:48Z,OWNER,,"Thought of this while posting this comment: https://github.com/simonw/datasette/issues/1591#issuecomment-1012506595 I added https://sqlite-utils.datasette.io/en/stable/cli-reference.html to `sqlite-utils` in https://github.com/simonw/sqlite-utils/issues/383 and I _really_ like it - it's a page showing the `--help` output of every CLI command for that tool. It's maintained using `cog`. One of the benefits is that I get a free commit history of changes to `--help` at https://github.com/simonw/sqlite-utils/commits/main/docs/cli-reference.rst",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1594/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1097101917,I_kwDOBm6k_c5BZHJd,1588,`explain query plan select` is too strict about whitespace,9599,simonw,closed,0,,,7571612,Datasette 0.60,3,2022-01-09T04:22:42Z,2022-01-13T22:28:19Z,2022-01-13T20:35:05Z,OWNER,,"`explain query plan select * from facetable` is allowed: https://latest.datasette.io/fixtures?sql=explain+query+plan+select+*+from+facetable But... 
`explain query plan  select * from facetable` (with two spaces before the `select`) returns a ""Statement must be a SELECT"" error: https://latest.datasette.io/fixtures?sql=explain+query+plan++select+*+from+facetable",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1588/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1087931918,I_kwDOBm6k_c5A2IYO,1579,`.execute_write(... block=True)` should be the default behaviour,9599,simonw,closed,0,,,7571612,Datasette 0.60,7,2021-12-23T18:54:28Z,2022-01-13T22:28:08Z,2021-12-23T19:18:26Z,OWNER,,"Every single piece of code I've written against the write APIs has used the `block=True` option to wait for the result. Without that, it instead fires the write into the queue but then continues even before it has finished executing. `block=True` should clearly be the default behaviour here!",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1579/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1076388044,I_kwDOBm6k_c5AKGDM,1547,Writable canned queries fail to load custom templates,127565,wragge,closed,0,,,7571612,Datasette 0.60,6,2021-12-10T03:31:48Z,2022-01-13T22:27:59Z,2021-12-19T21:12:00Z,CONTRIBUTOR,,"I've created a canned query with `""write"": true` set. I've also created a custom template for it, but the template doesn't seem to be found. If I look in the HTML I see (`stock_exchange` is the db name): `` My non-writeable canned queries pick up custom templates as expected, and if I look at their HTML I see the canned query name added to the templates considered (the canned query here is `date_search`): `` So it seems like the writeable canned query is behaving differently for some reason. Is it an authentication thing? I'm using the built in `--root` authentication. Thanks! ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1547/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1083927147,I_kwDOBm6k_c5Am2pr,1571,Track number of executions for execute_write_many() in traces,9599,simonw,closed,0,,,7571612,Datasette 0.60,0,2021-12-18T19:16:17Z,2022-01-13T22:27:49Z,2021-12-19T20:30:40Z,OWNER,,"Spotted while working on #1555 There's no indication there of how many times `execute_write_many()` executed the SQL. Solving this is a tiny bit tricky because `params_seq` is an iterator that we don't want to exhaust before passing it to `conn.executemany()` - so we need to instead wrap it in something that counts how many times it was called. But then we need a way to attach that to the trace here: https://github.com/simonw/datasette/blob/d637ed46762fdbbd8e32b86f258cd9a53c1cfdc7/datasette/database.py#L115-L122 So probably need to redesign the `trace()` decorator to allow extra pairs to be attached to it within the `with` statement.
",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1571/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1084007781,I_kwDOBm6k_c5AnKVl,1572,"""Query took"" should be ""Queries took""",9599,simonw,closed,0,,,7571612,Datasette 0.60,0,2021-12-19T04:03:00Z,2022-01-13T22:27:43Z,2021-12-19T04:03:24Z,OWNER,,"This is misleading, since usually there have been more than one query executed: ![CleanShot 2021-12-18 at 20 02 35@2x](https://user-images.githubusercontent.com/9599/146663457-9c4c2900-5cc0-4650-a565-bb1ff0b8a725.png) ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1572/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1083921371,I_kwDOBm6k_c5Am1Pb,1570,Separate db.execute_write() into three methods,9599,simonw,closed,0,,,7571612,Datasette 0.60,2,2021-12-18T18:45:54Z,2022-01-13T22:27:38Z,2021-12-18T18:57:25Z,OWNER,,"> Rather than adding a `executemany=True` parameter, I'm now thinking a better design might be to have three methods: > > - `db.execute_write(sql, params=None, block=False)` > - `db.execute_write_script(sql, block=False)` > - `db.execute_write_many(sql, params_seq, block=False)` _Originally posted by @simonw in https://github.com/simonw/datasette/issues/1555#issuecomment-997267416_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1570/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1079149656,I_kwDOBm6k_c5AUoRY,1555,Optimize all those calls to index_list and foreign_key_list,9599,simonw,closed,0,,,7571612,Datasette 0.60,27,2021-12-13T23:50:56Z,2022-01-13T22:27:32Z,2021-12-19T20:55:59Z,OWNER,,"On the first hit to a restarted index I'm seeing this in the SQL traces: https://latest-with-plugins.datasette.io/github/commits?_trace=1 I imagine this could be sped up a lot using tricks like this one from the SQLite documentation: https://sqlite.org/pragma.html#pragfunc ```sql SELECT DISTINCT m.name || '.' 
|| ii.name AS 'indexed-columns' FROM sqlite_schema AS m, pragma_index_list(m.name) AS il, pragma_index_info(il.name) AS ii WHERE m.type='table' ORDER BY 1; ``` https://latest-with-plugins.datasette.io/fixtures?sql=SELECT+DISTINCT+m.name+%7C%7C+%27.%27+%7C%7C+ii.name+AS+%27indexed-columns%27%0D%0A++FROM+sqlite_schema+AS+m%2C%0D%0A+++++++pragma_index_list%28m.name%29+AS+il%2C%0D%0A+++++++pragma_index_info%28il.name%29+AS+ii%0D%0A+WHERE+m.type%3D%27table%27%0D%0A+ORDER+BY+1%3B",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1555/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1083895395,I_kwDOBm6k_c5Amu5j,1569,"db.execute_write(..., executescript=True) parameter",9599,simonw,closed,0,,,7571612,Datasette 0.60,2,2021-12-18T18:20:47Z,2022-01-13T22:27:27Z,2021-12-18T18:34:18Z,OWNER,,"> Idea: teach `execute_write` to accept an optional `executescript=True` parameter, like this: ```diff diff --git a/datasette/database.py b/datasette/database.py index 468e936..1a424f5 100644 --- a/datasette/database.py +++ b/datasette/database.py @@ -94,10 +94,14 @@ class Database: f""file:{self.path}{qs}"", uri=True, check_same_thread=False ) - async def execute_write(self, sql, params=None, block=False): + async def execute_write(self, sql, params=None, executescript=False, block=False): + assert not executescript and params, ""Cannot use params with executescript=True"" def _inner(conn): with conn: - return conn.execute(sql, params or []) + if executescript: + return conn.executescript(sql) + else: + return conn.execute(sql, params or []) with trace(""sql"", database=self.name, sql=sql.strip(), params=params): results = await self.execute_write_fn(_inner, block=block) ``` _Originally posted by @simonw in https://github.com/simonw/datasette/issues/1555#issuecomment-997248364_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1569/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1083726550,I_kwDOBm6k_c5AmFrW,1568,Trace should show queries on the write connection too,9599,simonw,closed,0,,,7571612,Datasette 0.60,2,2021-12-18T02:34:12Z,2022-01-13T22:27:23Z,2021-12-18T02:42:34Z,OWNER,,"> Here's why - `trace` only applies to read, not write SQL operations: https://github.com/simonw/datasette/blob/7c8f8aa209e4ba7bf83976f8495d67c28fbfca24/datasette/database.py#L209-L211 _Originally posted by @simonw in https://github.com/simonw/datasette/issues/1555#issuecomment-997128508_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1568/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1083573206,I_kwDOBm6k_c5AlgPW,1563,Datasette(... 
files=) should not be a required argument,9599,simonw,closed,0,,,7571612,Datasette 0.60,2,2021-12-17T19:54:18Z,2022-01-13T22:27:18Z,2021-12-18T02:19:40Z,OWNER,,"```pycon >>> ds = Datasette(memory=True) Traceback (most recent call last): File """", line 1, in TypeError: __init__() missing 1 required positional argument: 'files' >>> ds = Datasette(memory=True, files=[]) ``` I wanted to create an in-memory Datasette for running some tests, no point in forcing me to pass `files=[]` to do that.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1563/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1083718998,I_kwDOBm6k_c5AmD1W,1567,Remove undocumented sqlite_functions mechanism,9599,simonw,closed,0,,,7571612,Datasette 0.60,0,2021-12-18T01:51:10Z,2022-01-13T22:27:04Z,2021-12-18T01:54:46Z,OWNER,,"I added this in 0b8c1b0a6da9cb8ac0d28cc90dd783de87554036 but it's never been documented and the same thing can now be achieved using the `prepare_connection` plugin hook. https://github.com/simonw/datasette/blob/0c91e59d2bbfc08884cfcf5d1b902a2f4968b7ff/datasette/app.py#L262 https://github.com/simonw/datasette/blob/0c91e59d2bbfc08884cfcf5d1b902a2f4968b7ff/datasette/app.py#L551-L552 It's used here in the tests: https://github.com/simonw/datasette/blob/69244a617b1118dcbd04a8f102173f04680cf08c/tests/fixtures.py#L156",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1567/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 520740741,MDU6SXNzdWU1MjA3NDA3NDE=,625,If you apply ?_facet_array=tags then &_facet=tags does nothing,9599,simonw,closed,0,,,7571612,Datasette 0.60,13,2019-11-11T04:59:29Z,2022-01-13T22:26:58Z,2021-12-16T20:12:22Z,OWNER,,"Start here: https://v0-30-2.datasette.io/fixtures/facetable?_facet_array=tags Note that `tags` is offered as a suggested facet. But if you click that you get this: https://v0-30-2.datasette.io/fixtures/facetable?_facet_array=tags&_facet=tags The `_facet=tags` is added to the URL and it's removed from the list of suggested tags... 
but the facet itself is not displayed: The `_facet=tags` facet should look like this: ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/625/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1082564912,I_kwDOBm6k_c5AhqEw,1557,`?_nosuggest=1` parameter for disabling facet suggestions on table view,9599,simonw,closed,0,,,7571612,Datasette 0.60,1,2021-12-16T19:21:42Z,2022-01-13T22:26:48Z,2021-12-16T19:24:59Z,OWNER,,"Found I wanted this while I was debugging #625 just to clean up the debug traces, but it makes sense as a partner to `?_nofacet=1` and `?_nocount=1` from #1350 and #1353.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1557/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1078702875,I_kwDOBm6k_c5AS7Mb,1552,Allow to set `facets_array` in metadata (like current `facets`),3556,davidbgk,closed,0,,,7571612,Datasette 0.60,9,2021-12-13T16:00:44Z,2022-01-13T22:26:15Z,2021-12-16T18:47:48Z,CONTRIBUTOR,,"For now, you can set a `facets` value (array) in your metadata file but I couldn't find a way to set a `facets_array` in order to provide default facets for arrays (like tags). My use-case is to access to [that kind of view](https://latest.datasette.io/fixtures/facetable?_facet_array=tags) by default without URL's parameters as with other default facets. _I'm new to datasette, and I'm willing to help with a PR if that is not already implemented and I missed it!_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1552/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1081318247,I_kwDOBm6k_c5Ac5tn,1556,"Show count of facet values always, not just for `?_facet_size=max`",9599,simonw,closed,0,,,7571612,Datasette 0.60,1,2021-12-15T17:49:01Z,2022-01-13T22:26:07Z,2021-12-15T17:58:06Z,OWNER,,"> You've caused me to rethink this feature - I no longer think there's value in only showing these numbers if `?_facet_size=max` as opposed to all of the time. 
_Originally posted by @simonw in https://github.com/simonw/datasette/issues/1423#issuecomment-995023410_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1556/reactions"", ""total_count"": 1, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 1, ""rocket"": 0, ""eyes"": 0}",,completed 1077893013,I_kwDOBm6k_c5AP1eV,1551,`keep_blank_values=True` when parsing `request.args`,9599,simonw,closed,0,,,7571612,Datasette 0.60,3,2021-12-12T19:53:07Z,2022-01-13T22:26:04Z,2021-12-12T20:02:01Z,OWNER,,"This code in `TableView` wouldn't be necessary: https://github.com/simonw/datasette/blob/492f9835aa7e90540dd0c6324282b109f73df71b/datasette/views/table.py#L396-L399 If that happened here instead: https://github.com/simonw/datasette/blob/492f9835aa7e90540dd0c6324282b109f73df71b/datasette/utils/asgi.py#L98-L100 _Originally posted by @simonw in https://github.com/simonw/datasette/issues/1518#issuecomment-991827468_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1551/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 777140799,MDU6SXNzdWU3NzcxNDA3OTk=,1166,Adopt Prettier for JavaScript code formatting,9599,simonw,open,0,,,,,10,2020-12-31T21:25:27Z,2022-01-13T22:22:18Z,,OWNER,,https://prettier.io/ - I'm going to go with 2 spaces.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1166/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 520655983,MDU6SXNzdWU1MjA2NTU5ODM=,619,"""Invalid SQL"" page should let you edit the SQL",9599,simonw,closed,0,,,,,14,2019-11-10T20:54:12Z,2022-01-13T22:21:42Z,2021-06-02T04:15:54Z,OWNER,,"https://latest.datasette.io/fixtures?sql=select%0D%0A++*%0D%0Afrom%0D%0A++%5Bfoo%5D Would be useful if this page showed you the invalid SQL you entered so you can edit it and try again.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/619/reactions"", ""total_count"": 2, ""+1"": 2, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 642651572,MDU6SXNzdWU2NDI2NTE1NzI=,860,Plugin hook for instance/database/table metadata,9599,simonw,closed,0,,,,,10,2020-06-21T22:20:25Z,2022-01-13T22:21:42Z,2021-06-26T22:56:28Z,OWNER,,"I'm not happy with how `metadata.(json|yaml)` keeps growing new features. Rather than having a single plugin hook for all of `metadata.json` I'm going to split out the feature that shows actual real metadata for tables and databases - `source`, `license` etc - into its own plugin-powered mechanism. 
_Originally posted by @simonw in https://github.com/simonw/datasette/issues/357#issuecomment-647189045_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/860/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 681334912,MDU6SXNzdWU2ODEzMzQ5MTI=,942,Support column descriptions in metadata.json,9599,simonw,closed,0,,,,,18,2020-08-18T20:52:00Z,2022-01-13T22:21:42Z,2021-08-12T23:53:24Z,OWNER,,"Could look something like this: ```json { ""title"": ""Five Thirty Eight"", ""license"": ""CC Attribution 4.0 License"", ""license_url"": ""https://creativecommons.org/licenses/by/4.0/"", ""source"": ""fivethirtyeight/data on GitHub"", ""source_url"": ""https://github.com/fivethirtyeight/data"", ""databases"": { ""fivethirtyeight"": { ""tables"": { ""mueller-polls/mueller-approval-polls"": { ""description_html"": ""
<p>
....
</p>
"", ""columns"": { ""name_of_column"": ""column_description goes here"" } ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/942/reactions"", ""total_count"": 4, ""+1"": 4, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 712202333,MDU6SXNzdWU3MTIyMDIzMzM=,982,"SQL editor should allow execution of write queries, if you have permission",9599,simonw,open,0,,,,,2,2020-09-30T19:04:35Z,2022-01-13T22:21:29Z,,OWNER,,"The `datasette-write` plugin provides this at the moment https://github.com/simonw/datasette-write - but it feels like it should be a built-in capability, protected by a default permission. UI concept: if you have write permission then the existing SQL editor gets an ""execute write"" checkbox underneath it. JavaScript can spot if you appear to be trying to execute an UPDATE or INSERT or DELETE query and check that checkbox for you. If you link to a query page with a non-SELECT then that query will be displayed in the box ready for you to POST submit it. The page will also then get ""cannot be embedded"" headers to protect against clickjacking.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/982/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 733999615,MDU6SXNzdWU3MzM5OTk2MTU=,1079,Handle long breadcrumbs better with new menu,9599,simonw,open,0,,,,,1,2020-11-01T15:57:41Z,2022-01-13T22:21:29Z,,OWNER,,"On this page when signed in as root: https://latest.datasette.io/fixtures/roadside_attraction_characteristics/1 ![EF921CB1-625F-4D04-A850-490B812A72B3](https://user-images.githubusercontent.com/9599/97807807-db0fbf80-1c17-11eb-9c77-ae5169b12c3d.jpeg) ![A49D8B76-5ACF-4F71-A8B4-21A44F5C8D51](https://user-images.githubusercontent.com/9599/97807809-dea34680-1c17-11eb-9511-a49af56a4bd2.jpeg) ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1079/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 776634318,MDU6SXNzdWU3NzY2MzQzMTg=,1164,Mechanism for minifying JavaScript that ships with Datasette,9599,simonw,open,0,,,,,9,2020-12-30T20:59:06Z,2022-01-13T22:21:29Z,,OWNER,,"> If I'm going to minify it I'll need to figure out a build step in Datasette itself so that I can easily work on that minified version. _Originally posted by @simonw in https://github.com/simonw/datasette/issues/983#issuecomment-752748496_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1164/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 776635426,MDU6SXNzdWU3NzY2MzU0MjY=,1165,Mechanism for executing JavaScript unit tests,9599,simonw,open,0,,,,,9,2020-12-30T21:02:34Z,2022-01-13T22:21:29Z,,OWNER,,"> I'm going to need to add JavaScript unit tests for this new plugin system. 
_Originally posted by @simonw in https://github.com/simonw/datasette/issues/983#issuecomment-752757289_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1165/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 323718842,MDU6SXNzdWUzMjM3MTg4NDI=,268,Mechanism for ranking results from SQLite full-text search,9599,simonw,open,0,,,,,12,2018-05-16T17:36:40Z,2022-01-13T22:21:28Z,,OWNER,,This isn't particularly straight-forward - all the more reason for Datasette to implement it for you. This article is helpful: http://charlesleifer.com/blog/using-sqlite-full-text-search-with-python/,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/268/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 707849175,MDU6SXNzdWU3MDc4NDkxNzU=,974,static assets and favicon aren't cached by the browser,45416,obra,open,0,,,,,1,2020-09-24T04:44:55Z,2022-01-13T22:21:28Z,,NONE,,"Using datasette to solve some frustrating problems with our fulfillment provider today, I was surprised to see repeated requests for assets under /-/static and the favicon. While it won't likely be a huge performance bottleneck, I bet datasette would feel a bit zippier if you had Uvicorn serving up some caching-related headers telling the browser it was safe to cache static assets.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/974/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 534629631,MDU6SXNzdWU1MzQ2Mjk2MzE=,650,Add a glossary to the documentation,9599,simonw,open,0,,,,,3,2019-12-09T00:23:45Z,2022-01-13T22:04:56Z,,OWNER,,"Call it `glossary.rst` - it can use a definition list something like this: ```rst .. _glossary: Glossary ======== Term A definition of the term. Another term Another definition. ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/650/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 895686039,MDU6SXNzdWU4OTU2ODYwMzk=,1336,Document turning on WAL for live served SQLite databases,9599,simonw,open,0,,,,,1,2021-05-19T17:08:58Z,2022-01-13T21:55:59Z,,OWNER,,"Datasette docs don't talk about WAL yet, which allows you to safely serve reads from a database file while it is accepting writes.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1336/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1067771698,I_kwDOCGYnMM4_pOcy,348,Command for creating an empty database,9599,simonw,closed,0,,,7558727,3.21,6,2021-11-30T23:24:27Z,2022-01-13T07:06:59Z,2022-01-09T20:33:20Z,OWNER,,"I sometimes find the need to create an empty SQLite database file - for example if I want to enable WAL on it before using it with another script. 
I currently do that like this: sqlite3 my.db vacuum sqlite-utils enable-wal my.db It would be nice if `sqlite-utils` had a convenience command for doing this.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/348/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1100499619,I_kwDOBm6k_c5BmEqj,1592,Row pages should show links to foreign keys,9599,simonw,open,0,,,,,1,2022-01-12T15:50:20Z,2022-01-12T15:52:17Z,,OWNER,,Refs #1518 refactor.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1592/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1099584685,I_kwDOCGYnMM5BilSt,381,`sqlite-utils rows` options `--limit` and `--offset`,9599,simonw,closed,0,,,,,2,2022-01-11T20:23:12Z,2022-01-11T23:33:37Z,2022-01-11T23:19:36Z,OWNER,,Because I often want to use it just to preview a few rows from the database. Piping through `| head -n 20` works for JSON and CSV (they stream) but not for `--table`.,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/381/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1099585611,I_kwDOCGYnMM5BilhL,382,`--where` option for `sqlite-rows`,9599,simonw,closed,0,,,,,1,2022-01-11T20:24:23Z,2022-01-11T23:33:14Z,2022-01-11T23:32:47Z,OWNER,,CLI equivalent of `table.rows_where()` - should accept parameters too. Work on this at the same time as #381.,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/382/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1099586786,I_kwDOCGYnMM5Bilzi,383,Add documentation page with the output of `--help`,9599,simonw,closed,0,,,,,4,2022-01-11T20:25:58Z,2022-01-11T22:55:05Z,2022-01-11T21:44:05Z,OWNER,,"Can be maintained using `cog` from #373. Similar in purpose to the API reference page, but this is for the CLI.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/383/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1096558279,I_kwDOCGYnMM5BXCbH,365,create-index should run analyze after creating index,536941,fgregg,closed,0,,,7558727,3.21,16,2022-01-07T18:21:25Z,2022-01-11T02:43:34Z,2022-01-11T01:36:48Z,CONTRIBUTOR,,"sqlite's query planner depends upon analyze to make good use of indices. It would be nice if analyze was run as part of the create-index command. If data is inserted later, things can get out date, but it would still probably be a net win. 
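Until that lands, a sketch of the manual equivalent using the stdlib `sqlite3` module (`mytable` / `mycolumn` are placeholder names):
```python
import sqlite3

conn = sqlite3.connect(':memory:')
conn.execute('create table mytable (mycolumn text)')
conn.executemany('insert into mytable values (:v)', [{'v': 'a'}, {'v': 'b'}])
conn.execute('create index idx_mycolumn on mytable(mycolumn)')
# ANALYZE gathers statistics into sqlite_stat1, which the query
# planner consults when deciding whether an index is worth using
conn.execute('ANALYZE')
print(conn.execute('select * from sqlite_stat1').fetchall())
```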
",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/365/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1098574572,I_kwDOCGYnMM5Beurs,380,Release notes for 3.21,9599,simonw,closed,0,,,7558727,3.21,1,2022-01-11T02:12:30Z,2022-01-11T02:34:26Z,2022-01-11T02:34:26Z,OWNER,,For these commits: https://github.com/simonw/sqlite-utils/compare/3.20...129141572f249ea290e2a075437e2ebaad215859,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/380/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1097251014,I_kwDOCGYnMM5BZrjG,375,`sqlite-utils bulk` command,9599,simonw,closed,0,,,7558727,3.21,3,2022-01-09T17:12:38Z,2022-01-11T02:12:58Z,2022-01-11T02:10:55Z,OWNER,,"The `.executemany()` method is a very efficient way to execute the same SQL query against a huge list of parameters. `sqlite-utils insert` supports a bunch of ways of loading a list of dictionaries - from CSV, TSV, JSON, newline JSON and more thanks to: - #361 What if you could load a list of dictionaries and provide a SQL query with `:named` parameters that correspond to keys in those dictionaries instead? This would need to be a new command - I thought about adding a `--sql` option to `insert` but that doesn't make sense as that command already requires a table name.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/375/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1098544628,I_kwDOCGYnMM5BenX0,379,CLI options for running ANALYZE,9599,simonw,closed,0,,,7558727,3.21,0,2022-01-11T01:09:16Z,2022-01-11T01:38:01Z,2022-01-11T01:36:48Z,OWNER,,"> The Python methods are all done now, next step is the CLI options. I'll do those in a separate issue. 
_Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/366#issuecomment-1009508865_ - [x] `sqlite-utils analyze` command - [x] `sqlite-utils create-index --analyze` option (see #365) - [x] `sqlite-utils insert --analyze` option - [x] `sqlite-utils upsert --analyze` option In #378 I also added `.delete_where(..., analyze=True)` but there isn't currently a `sqlite-utils delete-where` CLI command - deletions via CLI are expected to be handled using SQL queries.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/379/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1096563265,I_kwDOCGYnMM5BXDpB,366,Python library methods for calling ANALYZE,9599,simonw,closed,0,,,7558727,3.21,10,2022-01-07T18:28:01Z,2022-01-11T01:09:33Z,2022-01-11T01:09:33Z,OWNER,,"> Relevant documentation: https://www.sqlite.org/lang_analyze.html _Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/365#issuecomment-1007633376_",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/366/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1098309897,I_kwDOCGYnMM5BduEJ,378,analyze=True parameter for some methods,9599,simonw,closed,0,,,7558727,3.21,0,2022-01-10T19:54:52Z,2022-01-11T01:08:11Z,2022-01-11T01:08:09Z,OWNER,,"This would cause `ANALYZE` to be run against the relevant table at the end of executing the method. > Having browsed the API reference I think the methods that would benefit from an `analyze=True` parameter are: - [x] `table.create_index` - [x] `table.insert_all` - [x] `table.upsert_all` - [x] `table.delete_where` _Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/366#issuecomment-1009288898_",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/378/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1097436959,I_kwDOCGYnMM5BaY8f,376,`--nl` mode should ignore blank lines,9599,simonw,closed,0,,,7558727,3.21,0,2022-01-10T04:10:54Z,2022-01-10T19:27:41Z,2022-01-10T04:12:46Z,OWNER,,Spotted this while manually testing #364 - there's no reason `--nl` should crash if you feed it an empty line in between JSON objects.,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/376/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1097129710,I_kwDOCGYnMM5BZN7u,372,Idea: `suffix` and `stem` file columns,9599,simonw,closed,0,,,7558727,3.21,1,2022-01-09T07:48:53Z,2022-01-10T19:27:34Z,2022-01-09T20:17:00Z,OWNER,,"For https://sqlite-utils.datasette.io/en/stable/cli.html#inserting-data-from-files Given a file called `dogs.jpg` stem would be `dogs` and ext would be `jpg`. 
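For reference, a quick sketch of what the stdlib `pathlib` module says about these names - one possible answer, not necessarily the right one for this feature:
```python
from pathlib import Path

p = Path('dogs.jpg')
print(p.stem, p.suffix)  # dogs .jpg

p = Path('dogs.and.cats.jpg.gz')
print(p.stem, p.suffix)  # dogs.and.cats.jpg .gz
print(p.suffixes)        # ['.and', '.cats', '.jpg', '.gz']
```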
Need to decide what happens for `dogs.and.cats.jpg.gz`.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/372/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1097128334,I_kwDOCGYnMM5BZNmO,371,Support mutating row in `--convert` without returning it,9599,simonw,closed,0,,,7558727,3.21,6,2022-01-09T07:38:44Z,2022-01-10T19:27:30Z,2022-01-09T20:06:15Z,OWNER,,"Currently you have to do this: ``` $ sqlite-utils insert dogs.db dogs dogs.json --convert ' row[""is_good""] = 1 return row' ``` Would be neat if this worked too: ``` $ sqlite-utils insert dogs.db dogs dogs.json \ --convert 'row[""is_good""] = 1' ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/371/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1097135860,I_kwDOCGYnMM5BZPb0,374,`--fmt` should imply `-t`,9599,simonw,closed,0,,,7558727,3.21,4,2022-01-09T08:23:07Z,2022-01-10T19:27:26Z,2022-01-09T18:07:59Z,OWNER,,Not sure why I didn't implement this.,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/374/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1097135732,I_kwDOCGYnMM5BZPZ0,373,List `--fmt` options in the docs ,9599,simonw,closed,0,,,7558727,3.21,3,2022-01-09T08:22:11Z,2022-01-10T19:27:24Z,2022-01-09T17:49:00Z,OWNER,,https://sqlite-utils.datasette.io/en/stable/cli.html#table-formatted-output currently cheats and tells the user to run `--help` - can fix this using `cog`. 
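A sketch of the `cog` approach, modelled on the API reference setup (exact rst markers and imports may differ from what ends up in the repo):
```
.. [[[cog
    from click.testing import CliRunner
    from sqlite_utils import cli
    # cog is injected as a global by the cog tool
    result = CliRunner().invoke(cli.cli, ['--help'])
    cog.out('::\n\n')
    cog.out('    ' + result.output.replace('\n', '\n    '))
.. ]]]
.. [[[end]]]
```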
,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/373/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1097087280,I_kwDOCGYnMM5BZDkw,368,Offer `python -m sqlite_utils` as an alternative to `sqlite-utils`,9599,simonw,closed,0,,,7558727,3.21,3,2022-01-09T02:29:30Z,2022-01-10T19:27:20Z,2022-01-09T02:40:50Z,OWNER,,"> Add this to `sqlite_utils/cli.py`: > > ```python > if __name__ == ""__main__"": > cli() > ``` > Now the tool can be run using `python -m sqlite_utils.cli --help` _Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/364#issuecomment-1008214998_",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/368/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1095570074,I_kwDOCGYnMM5BTRKa,364,`--batch-size 1` doesn't seem to commit for every item,9599,simonw,closed,0,,,7558727,3.21,16,2022-01-06T18:18:50Z,2022-01-10T19:27:17Z,2022-01-10T05:36:19Z,OWNER,,"I'm trying this, but it doesn't seem to write anything to the database file until I hit `CTRL+C`: ``` heroku logs --app=simonwillisonblog --tail | grep 'measure#nginx.service' | \ sqlite-utils insert /tmp/herokutail.db log - --import re --convert ""$(cat < ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/362/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1077431957,I_kwDOCGYnMM5AOE6V,356,`sqlite-utils insert --convert` option,9599,simonw,closed,0,,,,,11,2021-12-11T07:24:48Z,2022-01-06T06:30:13Z,2022-01-06T06:28:53Z,OWNER,,"Idea come to me while re-reading this: https://simonwillison.net/2021/Aug/6/sqlite-utils-convert/ This is a bit of a hack: ``` cat /tmp/log.txt | \ jq --raw-input '{line: .}' --compact-output | \ sqlite-utils insert /tmp/logs.db log - --nl ``` Would be great if you could pipe lines to `insert` and transform them on the way in. 
A `--convert python-code` option, modeled after `sqlite-utils convert`, could do this.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/356/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1091850530,I_kwDODEm0Qs5BFFEi,63,Import archive error 'withheld_in_countries',521097,pauloxnet,open,0,,,,,0,2022-01-01T16:58:59Z,2022-01-01T16:58:59Z,,NONE,,"Importing the twitter archive I received this error: ```bash $ twitter-to-sqlite import archive.db twitter-2021-12-31-.zip birdwatch-note-rating: not yet implemented birdwatch-note: not yet implemented branch-links: not yet implemented community-tweet: not yet implemented contact: not yet implemented device-token: not yet implemented direct-message-mute: not yet implemented mute: not yet implemented periscope-account-information: not yet implemented periscope-ban-information: not yet implemented periscope-broadcast-metadata: not yet implemented periscope-comments-made-by-user: not yet implemented periscope-expired-broadcasts: not yet implemented periscope-followers: not yet implemented periscope-profile-description: not yet implemented professional-data: not yet implemented protected-history: not yet implemented reply-prompt: not yet implemented screen-name-change: not yet implemented smartblock: not yet implemented spaces-metadata: not yet implemented sso: not yet implemented Traceback (most recent call last): File ""/home/paulox/.virtualenvs/dogsheep/bin/twitter-to-sqlite"", line 8, in sys.exit(cli()) File ""/home/paulox/.virtualenvs/dogsheep/lib/python3.9/site-packages/click/core.py"", line 1128, in __call__ return self.main(*args, **kwargs) File ""/home/paulox/.virtualenvs/dogsheep/lib/python3.9/site-packages/click/core.py"", line 1053, in main rv = self.invoke(ctx) File ""/home/paulox/.virtualenvs/dogsheep/lib/python3.9/site-packages/click/core.py"", line 1659, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/home/paulox/.virtualenvs/dogsheep/lib/python3.9/site-packages/click/core.py"", line 1395, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/home/paulox/.virtualenvs/dogsheep/lib/python3.9/site-packages/click/core.py"", line 754, in invoke return __callback(*args, **kwargs) File ""/home/paulox/.virtualenvs/dogsheep/lib/python3.9/site-packages/twitter_to_sqlite/cli.py"", line 759, in import_ archive.import_from_file(db, filename, content) File ""/home/paulox/.virtualenvs/dogsheep/lib/python3.9/site-packages/twitter_to_sqlite/archive.py"", line 246, in import_from_file db[table_name].insert_all(rows, pk=pk, replace=True) File ""/home/paulox/.virtualenvs/dogsheep/lib/python3.9/site-packages/sqlite_utils/db.py"", line 2625, in insert_all self.insert_chunk( File ""/home/paulox/.virtualenvs/dogsheep/lib/python3.9/site-packages/sqlite_utils/db.py"", line 2406, in insert_chunk result = self.db.execute(query, params) File ""/home/paulox/.virtualenvs/dogsheep/lib/python3.9/site-packages/sqlite_utils/db.py"", line 422, in execute return self.conn.execute(sql, parameters) sqlite3.OperationalError: table archive_tweet has no column named withheld_in_countries ``` I found only a single tweet with the key `withheld_in_countries` in `tweet.js` that seems the problems: ```JSON [ { ""tweet"" : { ""retweeted"" : false, ""source"" : ""Twitter for Android"", ""entities"" : { ""hashtags"" : [ { ""text"" : ""NowOnAndroid"", ""indices"" : [ ""64"", ""77"" ] } ], 
""symbols"" : [ ], ""user_mentions"" : [ { ""name"" : ""Periscope"", ""screen_name"" : ""PeriscopeCo"", ""indices"" : [ ""3"", ""15"" ], ""id_str"" : ""1111111111"", ""id"" : ""222222222"" } ], ""urls"" : [ { ""url"" : ""https://t.co/xxxxxxxxx"", ""expanded_url"" : ""https://vine.co/v/xxxxxxxxx"", ""display_url"" : ""vine.co/v/xxxxxxxxxx"", ""indices"" : [ ""78"", ""101"" ] } ] }, ""display_text_range"" : [ ""0"", ""101"" ], ""favorite_count"" : ""0"", ""id_str"" : ""1111111111111111111111"", ""truncated"" : false, ""retweet_count"" : ""0"", ""withheld_in_countries"" : [ ""TR"" ], ""id"" : ""000000000000000000"", ""possibly_sensitive"" : false, ""created_at"" : ""Fri Aug 14 06:04:03 +0000 2015"", ""favorited"" : false, ""full_text"" : ""RT @periscopeco: Travel the world. LIVE. The Global Map is here #NowOnAndroid https://t.co/NZXdsPWROk"", ""lang"" : ""en"" } } ] ``` I solved the error removing the key from the `tweet.js` but I'm reporting this error to improve the project.",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/63/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1091838742,I_kwDOBm6k_c5BFCMW,1585,Fire base caching for `publish cloudrun`,9599,simonw,open,0,,,,,1,2022-01-01T15:38:15Z,2022-01-01T15:40:38Z,,OWNER,,"https://gist.github.com/steren/03d3e58c58c9a53fd49bb78f58541872 has a recipe for this, via https://twitter.com/steren/status/1477038411114446848 Could this enable easier vanity URLs of the format `https://$project_id.web.app/`? How about CDN caching?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1585/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1091257796,I_kwDOBm6k_c5BC0XE,1584,give error with recursive sql,58088336,tunguyenatwork,open,0,,,,,0,2021-12-30T18:53:16Z,2021-12-30T18:53:16Z,,NONE,,"I got an error ""near ""WITH"": syntax error"" after I upgraded to version 0.59 from 0.52.4. This error is related to recursive sql. It works great on the previous version but it failed after upgraded. 
Below is an example of sql: WITH RECURSIVE manager_of(position, super_position) AS (SELECT position, case ifnull(INDIRECT_SUPER_POSITION,'') when '' then super_position else INDIRECT_SUPER_POSITION end as SUPER_POSITION FROM position where super_position<>'SGV000000001' and super_position!='' and position <> super_position),chain_manager_of_position(position, level) AS (SELECT super_position, 1 as level FROM manager_of WHERE super_position!='' and (position=:pos or position in (Select position from employee where employee=:ein)) UNION ALL SELECT super_position, level+1 as level FROM manager_of JOIN chain_manager_of_position USING(position)) SELECT * FROM chain_manager_of_position left join employee using(position) where employee is not NULL order by level limit 1",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1584/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1090810196,I_kwDOBm6k_c5BBHFU,1583,consider adding deletion step of cloudbuild artifacts to gcloud publish,536941,fgregg,open,0,,,,,1,2021-12-30T00:33:23Z,2021-12-30T00:34:16Z,,CONTRIBUTOR,,"right now, as part of the the publish process images and other artifacts are stored to gcloud's cloud storage before being deployed to cloudrun. after successfully deploying, it would be nice if the the script deleted these artifacts. otherwise, if you have regularly scheduled build process, you can end up paying to store lots of out of date artifacts.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1583/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1090798237,I_kwDOCGYnMM5BBEKd,359,Use RETURNING if available to populate last_pk,9599,simonw,open,0,,,,,0,2021-12-29T23:43:23Z,2021-12-29T23:43:23Z,,OWNER,,"Inspired by this: https://news.ycombinator.com/item?id=29729283 > Because SQLite is effectively serializing all the writes for us, we have zero locking in our code. We used to have to lock when inserting new items (to get the LastInsertRowId), but the newer version of SQLite supports the RETURNING keyword, so we don't even have to lock on inserts now.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/359/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 770598024,MDU6SXNzdWU3NzA1OTgwMjQ=,1152,Efficiently calculate list of databases/tables a user can view,9599,simonw,open,0,,,,,12,2020-12-18T06:13:01Z,2021-12-27T23:04:31Z,,OWNER,,"> The homepage currently performs a massive flurry of permission checks - one for each, database, table and view: https://github.com/simonw/datasette/blob/0.53/datasette/views/index.py#L21-L75 > > A paginated version of this is a little daunting as the permission checks would have to be carried out in every single table just to calculate the count that will be paginated. 
_Originally posted by @simonw in https://github.com/simonw/datasette/issues/1150#issuecomment-747864831_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1152/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 807437089,MDU6SXNzdWU4MDc0MzcwODk=,228,--no-headers option for CSV and TSV,9599,simonw,closed,0,,,,,10,2021-02-12T17:56:51Z,2021-12-26T07:01:31Z,2021-02-14T22:25:17Z,OWNER,,"https://bl.iro.bl.uk/work/ns/3037474a-761c-456d-a00c-9ef3c6773f4c has a fascinating CSV file that doesn't have a header row - it starts like this: ```csv Computation and measurement of turbulent flow through idealized turbine blade passages,,""Loizou, Panos A."",https://isni.org/isni/0000000136122593,,University of Manchester,https://isni.org/isni/0000000121662407,1989,Thesis (Ph.D.),,Physical Sciences,,,https://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.232781, ""Prolactin and growth hormone secretion in normal, hyperprolactinaemic and acromegalic man"",,""Prescott, R. W. G."",https://isni.org/isni/0000000134992122,,University of Newcastle upon Tyne,https://isni.org/isni/0000000104627212,1983,Thesis (Ph.D.),,Biological Sciences,,,https://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.232784, ``` It would be useful if `sqlite-utils insert ... --csv` had a mechanism for importing files like this one.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/228/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1087919372,I_kwDOBm6k_c5A2FUM,1578,Confirm if documented nginx proxy config works for row pages with escaped characters in their primary key,9599,simonw,open,0,,,,,4,2021-12-23T18:27:59Z,2021-12-24T21:33:19Z,,OWNER,,"Found this while working on https://github.com/simonw/datasette-tiddlywiki Then clicking on `/tiddlywiki/tiddlers/%24%3A%2FDefaultTiddlers` returns a 404.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1578/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 781262510,MDU6SXNzdWU3ODEyNjI1MTA=,1181,"Certain database names results in 404: ""Database not found: None""",1470389,jieter,closed,0,,,6346396,Datasette 0.54,4,2021-01-07T12:01:16Z,2021-12-21T18:25:15Z,2021-01-25T05:13:19Z,NONE,,"I have a file named `test-database (1).sqlite`. When requesting the home route `/`, I see datasette is able to read it correctly: However, if I click any of the links, datasette replies with: `Error 404 Database not found: None` It seems the hash is crucial, as renaming the file to `database (1).sqlite` makes the error go away. This lines checks for a single dash: https://github.com/simonw/datasette/blob/97fb10c17dd007a275ab743742e93e932335ad67/datasette/views/base.py#L184 ``` $ datasette test-database\ \(1\).sqlite INFO: Started server process [68314] INFO: Waiting for application startup. INFO: Application startup complete. INFO: Uvicorn running on http://127.0.0.1:8001 (Press CTRL+C to quit) INFO: 127.0.0.1:54043 - ""GET /favicon.ico HTTP/1.1"" 200 OK INFO: 127.0.0.1:54043 - ""GET / HTTP/1.1"" 200 OK ... 
INFO: 127.0.0.1:54044 - ""GET /favicon.ico HTTP/1.1"" 200 OK INFO: 127.0.0.1:54044 - ""GET /test-database (1) HTTP/1.1"" 404 Not Found ``` Version: ``` $ datasette --version datasette, version 0.53 ``` ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1181/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1084257842,I_kwDOBm6k_c5AoHYy,1575,__call__() got an unexpected keyword argument 'specname',9599,simonw,closed,0,,,,,1,2021-12-20T01:24:04Z,2021-12-20T01:48:03Z,2021-12-20T01:47:57Z,OWNER,,"> I've installed the alpha version but get an error when starting up Datasette: ``` Traceback (most recent call last): File ""/Users/tim/.pyenv/versions/stock-exchange/bin/datasette"", line 5, in from datasette.cli import cli File ""/Users/tim/.pyenv/versions/3.8.5/envs/stock-exchange/lib/python3.8/site-packages/datasette/cli.py"", line 15, in from .app import Datasette, DEFAULT_SETTINGS, SETTINGS, SQLITE_LIMIT_ATTACHED, pm File ""/Users/tim/.pyenv/versions/3.8.5/envs/stock-exchange/lib/python3.8/site-packages/datasette/app.py"", line 31, in from .views.database import DatabaseDownload, DatabaseView File ""/Users/tim/.pyenv/versions/3.8.5/envs/stock-exchange/lib/python3.8/site-packages/datasette/views/database.py"", line 25, in from datasette.plugins import pm File ""/Users/tim/.pyenv/versions/3.8.5/envs/stock-exchange/lib/python3.8/site-packages/datasette/plugins.py"", line 29, in mod = importlib.import_module(plugin) File ""/Users/tim/.pyenv/versions/3.8.5/lib/python3.8/importlib/__init__.py"", line 127, in import_module return _bootstrap._gcd_import(name[level:], package, level) File ""/Users/tim/.pyenv/versions/3.8.5/envs/stock-exchange/lib/python3.8/site-packages/datasette/filters.py"", line 9, in @hookimpl(specname=""filters_from_request"") TypeError: __call__() got an unexpected keyword argument 'specname' ``` _Originally posted by @wragge in https://github.com/simonw/datasette/issues/1547#issuecomment-997511968_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1575/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 706001517,MDU6SXNzdWU3MDYwMDE1MTc=,163,Idea: conversions= could take Python functions,9599,simonw,open,0,,,,,4,2020-09-22T00:37:12Z,2021-12-20T00:56:52Z,,OWNER,,"Right now you use `conversions=` like this: ```python db[""example""].insert({ ""name"": ""The Bigfoot Discovery Museum"" }, conversions={""name"": ""upper(?)""}) ``` How about if you could optionally provide a Python function (or a lambda) like this? 
```python db[""example""].insert({ ""name"": ""The Bigfoot Discovery Museum"" }, conversions={""name"": lambda s: s.upper()}) ``` This would work by creating a random name for that function, registering it (similar to #162), executing the SQL and then un-registering the custom function at the end.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/163/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1083657868,I_kwDOBm6k_c5Al06M,1565,Documented JavaScript variables on different templates made available for plugins,9599,simonw,open,0,,,,,8,2021-12-17T22:30:51Z,2021-12-19T22:37:29Z,,OWNER,,"While working on https://github.com/simonw/datasette-leaflet-freedraw/issues/10 I found myself writing this atrocity to figure out the SQL query used for a specific table page: ```javascript let innerSql = Array.from(document.getElementsByTagName(""span"")).filter( el => el.innerText == ""View and edit SQL"" )[0].parentElement.getAttribute(""title"") ``` This is obviously bad - it's very brittle, and will break if I ever change the text on that link (like localizing it for example). Instead, I think pages like that one should have a block of script at the bottom something like this: ```javascript window.datasette = window.datasette || {}; datasette.view_name = 'table'; datasette.table_sql = 'select * from ...'; ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1565/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1084185188,I_kwDOBm6k_c5An1pk,1573,Make trace() a documented internal API,9599,simonw,open,0,,,,,1,2021-12-19T20:32:56Z,2021-12-19T21:13:13Z,,OWNER,,"This should be documented so plugin authors can use it to add their own custom traces: https://github.com/simonw/datasette/blob/8f311d6c1d9f73f4ec643009767749c17b5ca5dd/datasette/tracer.py#L28-L52 Including the new `kwargs` pattern I added in #1571: https://github.com/simonw/datasette/blob/f65817000fdf87ce8a0c23edc40784ebe33b5842/datasette/database.py#L128-L132",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1573/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1076057610,I_kwDOBm6k_c5AI1YK,1546,validating the sql,50336793,jadsongmatos,closed,0,,,,,1,2021-12-09T21:35:57Z,2021-12-18T02:05:17Z,2021-12-18T02:05:16Z,NONE,,Could someone tell me that part of the code is responsible for validating the sql that guarantees that only a table can be read,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1546/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 520681725,MDU6SXNzdWU1MjA2ODE3MjU=,621,Syntax for ?_through= that works as a form field,9599,simonw,open,0,,,,,7,2019-11-11T00:19:03Z,2021-12-18T01:42:33Z,,OWNER,,"The current syntax for `?_through=` uses JSON to avoid any risk of confusion with table or column names that contain special characters. This means you can't target a form field at it. 
We should be able to support both - `?x.y.z=value` for tables and columns with ""regular"" names, falling back to the current JSON syntax for columns or tables that won't work with the key/value syntax.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/621/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 519613116,MDU6SXNzdWU1MTk2MTMxMTY=,617,Refactor TableView.data() method,9599,simonw,closed,0,,,,,9,2019-11-08T01:55:41Z,2021-12-18T01:41:47Z,2021-12-11T19:17:11Z,OWNER,,"This is by far the most complex piece of Datasette - the `TableView.data()` method is over 500 lines long and is increasingly getting in the way of cleanly implementing new features (e.g. #615 and #613). Need to break it up into smaller, cleaner pieces.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/617/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 445850934,MDU6SXNzdWU0NDU4NTA5MzQ=,473,Plugin hook: filters_from_request,9599,simonw,closed,0,,,,,13,2019-05-19T18:44:33Z,2021-12-17T23:11:30Z,2021-12-17T19:02:17Z,OWNER,,"I meant to add this as part of the facets plugin mechanism but didn't quite get to it. Original idea was to allow plugins to register extra filters, as seen in `datasette/filters.py`: https://github.com/simonw/datasette/blob/260085838887ee343f4d3b177c422e7aef5ade9d/datasette/filters.py#L83-L98",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/473/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1079422215,I_kwDOCGYnMM5AVq0H,357,pytest-runner is not required,4067843,pgajdos,closed,0,,,,,1,2021-12-14T07:51:24Z,2021-12-16T20:43:19Z,2021-12-16T20:43:13Z,NONE,,Deprecated pytest-runner is not necessary for running the testsuite.,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/357/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 636511683,MDU6SXNzdWU2MzY1MTE2ODM=,830,Redesign register_facet_classes plugin hook,9599,simonw,open,0,,,3268330,Datasette 1.0,3,2020-06-10T20:03:27Z,2021-12-16T19:58:22Z,,OWNER,,"Nothing uses this plugin hook yet, so the design is not yet proven. I'm going to build a real plugin against it and use that process to inform any design changes that may need to be made. I'll add a warning about this to the documentation.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/830/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1079111498,I_kwDOBm6k_c5AUe9K,1553,if csv export is truncated in non streaming mode set informative response header,536941,fgregg,open,0,,,,,3,2021-12-13T22:50:44Z,2021-12-16T19:17:28Z,,CONTRIBUTOR,,"streaming mode is currently not enabled for custom queries, so the queries will be truncated to max row limit. it would be great if a response is truncated that an header signalling that was set in the header. 
i need to write some pagination code for getting full results back for a custom query and it would make the code much better if i could reliably known when there is nothing more to limit/offset ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1553/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 741231849,MDU6SXNzdWU3NDEyMzE4NDk=,1087,Idea: ?_extra=urls for getting back URLs to useful things,9599,simonw,open,0,,,,,0,2020-11-12T02:55:41Z,2021-12-15T18:06:16Z,,OWNER,,"Working on https://github.com/simonw/datasette-search-all/issues/10 made me realize that sometimes it can be difficult to calculate the URL for a database, table or row within Datasette. It would be useful to have an optional extra JSON extension (using `?_extra=` from #262) that can help with this.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1087/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 705840673,MDU6SXNzdWU3MDU4NDA2NzM=,972,Support faceting against arbitrary SQL queries,9599,simonw,open,0,,,,,1,2020-09-21T19:00:43Z,2021-12-15T18:02:20Z,,OWNER,,"> ... support for running facets against arbitrary custom SQL queries is half-done in that facets now execute against wrapped subqueries as-of ea66c45df96479ef66a89caa71fff1a97a862646 > > https://github.com/simonw/datasette/blob/ea66c45df96479ef66a89caa71fff1a97a862646/datasette/facets.py#L192-L200 _Originally posted by @simonw in https://github.com/simonw/datasette/issues/971#issuecomment-696307922_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/972/reactions"", ""total_count"": 3, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 3, ""rocket"": 0, ""eyes"": 0}",, 962391325,MDU6SXNzdWU5NjIzOTEzMjU=,1423,Show count of facet values if ?_facet_size=max,9599,simonw,closed,0,,,,,9,2021-08-06T04:42:20Z,2021-12-15T17:48:40Z,2021-08-16T18:56:43Z,OWNER,,"I sometimes want to get a count of the values in a facet - if it's a facet of US states for example I want to know if all 50 are represented. Idea: if `?_facet_size=max` is present, add a count to the facet heading. So on: https://latest.datasette.io/fixtures/compound_three_primary_keys?_facet=content&_facet_size=max&_facet=pk1&_facet=pk2#facet-pk2 It could have something like this: Note that the first column shows >1000 - because in that case we've truncated the facet calculation since the maximum allowed returned rows is 1000.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1423/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1072106103,I_kwDOBm6k_c4_5wp3,1542,feature request: order and dependency of plugins (that use js),33631,fs111,open,0,,,,,1,2021-12-06T12:40:45Z,2021-12-15T17:47:08Z,,NONE,,"I have been playing with datasette for the last couple of weeks and it is great! I am a big fan of `datasette-cluster-map` and wanted to enhance it a bit with a what I would call a sub-plugin. I basically want to add more controls to the map that cluster map provides. I have been looking into its code and how the plugin management works, but it seems what I am trying to do is not doable without hacks in js. 
Basically what I would like to have is a way to say: load my plugin after the plugins I depend on have been loaded and rendered. There seems to be no prior art where plugins have these dependencies on the js level, so I was wondering if that could be added, or how to do it if it already exists. Basically what I want to do is: my-awesome-plugin has a dependency on datasette-cluster-map. Whenever datasette-cluster-map has finished rendering on page load, call my plugin, but no earlier. To make that work, datasette probably needs some total order in which plugins are loaded and initialized. Since I am new to datasette, I may be missing something obvious, so please let me know if the above makes no sense.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1542/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1077628073,I_kwDOBm6k_c5AO0yp,1550,Research option for returning all rows from arbitrary query,9599,simonw,open,0,,,,,2,2021-12-11T19:31:11Z,2021-12-11T23:43:24Z,,OWNER,,"Inspired by thinking about #1549 - returning ALL rows from an arbitrary query is a lot easier if you just run that query and keep iterating over the cursor. I've avoided doing that in the past because it could tie up a connection for a long time - but in private instances this wouldn't be such a problem.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1550/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1077102934,I_kwDOCGYnMM5AM0lW,353,"Allow passing a file of code to ""sqlite-utils convert""",536941,fgregg,closed,0,,,,,8,2021-12-10T18:06:14Z,2021-12-11T01:38:29Z,2021-12-11T01:09:39Z,CONTRIBUTOR,,"sqlite-utils is so nice, but the ergonomics of the multiline code is kind of tough. It's really hard (maybe impossible) to make the newlines play well with Makefiles.
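For example, even a small fragment like this one (a made-up cleanup function, in the spirit of the `convert(value)` example in #355) is painful to squeeze onto a single quoted command line or into a Makefile recipe:

```python
import re

def convert(value):
    # collapse internal runs of whitespace and strip the ends
    return re.sub(r'\s+', ' ', value).strip()
```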
it would be great to write your code fragment in a separate file and direct it into sqlite-utils, either like ```sqlite-utils convert my.db my_table my_column < custom_code.py``` or ```sqlite-utils convert my.db my_table my_column --custom-code=custom_code.py``` Thanks, as ever, for these great tools!",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/353/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1077243232,I_kwDOCGYnMM5ANW1g,354,Test failure in test_rebuild_fts,9599,simonw,closed,0,,,,,7,2021-12-10T21:27:55Z,2021-12-11T01:08:46Z,2021-12-11T01:08:46Z,OWNER,,"Not sure why this has only just started failing, but I'm getting this: https://github.com/simonw/sqlite-utils/runs/4488687639 ``` E sqlite3.DatabaseError: database disk image is malformed sqlite_utils/db.py:425: DatabaseError _______________________ test_rebuild_fts[searchable_fts] _______________________ fresh_db = > table_to_fix = 'searchable_fts' @pytest.mark.parametrize(""table_to_fix"", [""searchable"", ""searchable_fts""]) def test_rebuild_fts(fresh_db, table_to_fix): table = fresh_db[""searchable""] table.insert(search_records[0]) table.enable_fts([""text"", ""country""]) # Run a search rows = list(table.search(""tanuki"")) assert len(rows) == 1 assert { ""rowid"": 1, ""text"": ""tanuki are running tricksters"", ""country"": ""Japan"", ""not_searchable"": ""foo"", }.items() <= rows[0].items() # Delete from searchable_fts_data fresh_db[""searchable_fts_data""].delete_where() # This should have broken the index with pytest.raises(sqlite3.DatabaseError): list(table.search(""tanuki"")) # Running rebuild_fts() should fix it > fresh_db[table_to_fix].rebuild_fts() ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/354/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1077322009,I_kwDOCGYnMM5ANqEZ,355,Allow users to pass a full convert() function definition,9599,simonw,closed,0,,,,,4,2021-12-10T23:59:58Z,2021-12-11T00:51:15Z,2021-12-11T00:49:31Z,OWNER,,"> I think the fix for this is to change the rules about what code is accepted in both the `-` mode and the literal code string mode: you can pass in a Python expression, OR a fragment that gets turned into a function, OR code that implements its own `def convert(value)` function. So this would work too: > ```sh > sqlite-utils convert my.db mytable col1 ' > def convert(value): > return value.upper() > ' > ``` _Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/353#issuecomment-991381679_",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/355/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1072780607,I_kwDOCGYnMM4_8VU_,351,Support `--import xml.etree.ElementTree` in `sqlite-utils convert`,9599,simonw,closed,0,,,,,1,2021-12-07T00:40:29Z,2021-12-11T00:11:25Z,2021-12-11T00:11:25Z,OWNER,,"It's not possible to use a module that requires a nested import, such as `xml.etree.ElementTree`, at the moment. 
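A likely culprit is how Python's builtin `__import__` handles dotted names - a standalone illustration, not necessarily the exact code path inside `sqlite-utils`:

```python
import importlib

# __import__ on a dotted path returns the *top-level* package...
mod = __import__('xml.etree.ElementTree')
print(mod.__name__)  # xml - not the nested module that was asked for

# ...while importlib.import_module returns the nested module itself
et = importlib.import_module('xml.etree.ElementTree')
print(et.__name__)  # xml.etree.ElementTree
```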
I found and fixed this bug in `git-history`; I should replicate that fix (and accompanying documentation) here: https://github.com/simonw/git-history/issues/39",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/351/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1068791148,I_kwDOBm6k_c4_tHVs,1540,Idea: hover to reveal details of linked row,9599,simonw,open,0,,,,,6,2021-12-01T19:28:07Z,2021-12-09T23:38:39Z,,OWNER,," Hovering over that could work a little bit like GitHub issue links: ![hover](https://user-images.githubusercontent.com/9599/144300537-9cd9e9af-ac16-42db-842f-37661bc94063.gif) ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1540/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 863884805,MDU6SXNzdWU4NjM4ODQ4MDU=,1304,"Document how to send multiple values for ""Named parameters"" ",9308268,rayvoelker,open,0,,,,,4,2021-04-21T13:19:06Z,2021-12-08T03:23:14Z,,NONE,,"https://docs.datasette.io/en/stable/sql_queries.html#named-parameters I thought that I had seen an example of how to do this (example below), but I can't seem to find it ```sql select * from bib where bib.bib_record_num in (1008088,1008092) ``` ```sql select * from bib where bib.bib_record_num in (:bib_record_numbers) ``` ![image](https://user-images.githubusercontent.com/9308268/115558839-2333a480-a281-11eb-85e6-ce3bada79140.png) https://ilsweb.cincinnatilibrary.org/collection-analysis/current_collection-204d100?sql=select%0D%0A++*%0D%0Afrom%0D%0A++bib%0D%0Awhere%0D%0A++bib.bib_record_num+in+%28%3Abib_record_numbers%29&bib_record_numbers=1008088%2C1008092 Or, maybe this isn't a fully supported feature. ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1304/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1073712378,I_kwDOBm6k_c4__4z6,1544,Code that detects the label column for a table is case-sensitive,9599,simonw,closed,0,,,,,2,2021-12-07T20:01:25Z,2021-12-07T20:03:43Z,2021-12-07T20:03:43Z,OWNER,,I just noticed that a column called `Name` is not being picked up as the label column for a table.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1544/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1071531082,I_kwDOCGYnMM4_3kRK,349,A way of creating indexes on newly created tables,9599,simonw,open,0,,,,,3,2021-12-05T18:56:12Z,2021-12-07T01:04:37Z,,OWNER,,"I'm writing code for https://github.com/simonw/git-history/issues/33 that creates a table inside a loop: ```python item_pk = db[item_table].lookup( {""_item_id"": item_id}, item_to_insert, column_order=(""_id"", ""_item_id""), pk=""_id"", ) ``` I need to look things up by `_item_id` on this table, which means I need an index on that column (the table can get very big). But there's no mechanism in `sqlite-utils` to detect if the table was created for the first time and add an index to it. And I don't want to run `CREATE INDEX IF NOT EXISTS` every time through the loop. This should work like the `foreign_keys=` mechanism. 
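One workaround sketch, reusing the names from the snippet above: check `table.exists()` once before the `lookup()` call, then add the index only if that call had to create the table:

```python
table = db[item_table]
first_time = not table.exists()
item_pk = table.lookup(
    {'_item_id': item_id},
    item_to_insert,
    column_order=('_id', '_item_id'),
    pk='_id',
)
if first_time:
    # runs exactly once, right after the table is first created
    table.create_index(['_item_id'])
```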
",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/349/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1072435124,I_kwDOCGYnMM4_7A-0,350,Optional caching mechanism for table.lookup(),9599,simonw,open,0,,,,,3,2021-12-06T17:54:25Z,2021-12-06T17:56:57Z,,OWNER,,"Inspired by work on `git-history` where I used this pattern: ```python column_name_to_id = {} def column_id(column): if column not in column_name_to_id: id = db[""columns""].lookup( {""namespace"": namespace_id, ""name"": column}, foreign_keys=((""namespace"", ""namespaces"", ""id""),), ) column_name_to_id[column] = id return column_name_to_id[column] ``` If you're going to be doing a large number of `table.lookup(...)` calls and you know that no other script will be modifying the database at the same time you can presumably get a big speedup using a Python in-memory cache - maybe even a LRU one to avoid memory bloat.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/350/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 964322136,MDU6SXNzdWU5NjQzMjIxMzY=,1426,"Manage /robots.txt in Datasette core, block robots by default",9599,simonw,open,0,,,,,9,2021-08-09T19:56:56Z,2021-12-04T07:11:29Z,,OWNER,,"See accompanying Twitter thread: https://twitter.com/simonw/status/1424820203603431439 > Datasette currently has a plugin for configuring robots.txt, but I'm beginning to think it should be part of core and crawlers should be blocked by default - having people explicitly opt-in to having their sites crawled and indexed feels a lot safer https://datasette.io/plugins/datasette-block-robots I have a lot of Datasettes deployed now, and tailing logs shows that they are being *hammered* by search engine crawlers even though many of them are not interesting enough to warrant indexing. I'm starting to think blocking crawlers would actually be a better default for most people, provided it was well documented and easy to understand how to allow them. Default-deny is usually a better policy than default-allow!",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1426/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1071071397,I_kwDODFdgUs4_10Cl,69,View that combines issues and issue comments,9599,simonw,open,0,,,,,1,2021-12-04T00:34:33Z,2021-12-04T00:34:52Z,,MEMBER,,I want to see a reverse chronologically ordered interface onto both issues and comments - essentially a unified log of comments and issues opened across one or multiple projects.,207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/69/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1069881276,I_kwDOBm6k_c4_xRe8,1541,Different default layout for row page,9599,simonw,open,0,,,,,1,2021-12-02T18:56:36Z,2021-12-02T18:56:54Z,,OWNER,,"The row page displays as a table even though it only has one table row. 
maybe default to the same display as the narrow page version, even for wide pages?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1541/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1058790545,I_kwDOBm6k_c4_G9yR,1519,base_url is omitted in JSON and CSV views,157158,phubbard,closed,0,,,,,22,2021-11-19T18:10:45Z,2021-12-01T17:50:09Z,2021-11-20T19:11:21Z,NONE,,"I have a datasette deployment, using Apache2 to reverse proxy: ProxyPass /ged http://thor.phfactor.net:8001 ProxyPreserveHost On In settings.json I have ```json { ""base_url"": ""/ged/"", ""trace_debug"": 1, ""template_debug"": 1 } ``` and datasette works correctly. However, if you view a query and then click on the 'This data as json, CSV' both links omit the base_url prefix and are therefore 404.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1519/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1067775061,I_kwDOBm6k_c4_pPRV,1539,Research PRAGMA query_only,9599,simonw,open,0,,,,,0,2021-11-30T23:30:24Z,2021-11-30T23:30:24Z,,OWNER,,"https://www.sqlite.org/pragma.html#pragma_query_only > The query_only pragma prevents data changes on database files when enabled. When this pragma is enabled, any attempt to CREATE, DELETE, DROP, INSERT, or UPDATE will result in an [SQLITE_READONLY](https://www.sqlite.org/rescode.html#readonly) error. However, the database is not truly read-only. You can still run a [checkpoint](https://www.sqlite.org/wal.html#ckpt) or a [COMMIT](https://www.sqlite.org/lang_transaction.html) and the return value of the [sqlite3_db_readonly()](https://www.sqlite.org/c3ref/db_readonly.html) routine is not affected. Would it be worth adding this as an extra protection against accidental writes to a DB file over a read-only connection?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1539/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1059509927,I_kwDOBm6k_c4_Jtan,1525,"""Links from other tables"" broken for columns starting with underscore",9599,simonw,closed,0,,,,,3,2021-11-21T22:55:08Z,2021-11-30T06:39:01Z,2021-11-30T06:34:35Z,OWNER,,"Same bug as #1506, this time it's this link or the row page: ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1525/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1066563554,I_kwDOCGYnMM4_knfi,346,Way to test SQLite 3.37 (and potentially other versions) in CI,9599,simonw,open,0,,,,,5,2021-11-29T22:21:06Z,2021-11-29T23:12:49Z,,OWNER,,"> Need to figure out a good pattern for testing this in CI too - it will currently skip the new tests if it doesn't have SQLite 3.37 or higher. 
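The conditional skip in question presumably looks something like this (a sketch, assuming pytest and the standard library `sqlite3` module):

```python
import sqlite3

import pytest

@pytest.mark.skipif(
    sqlite3.sqlite_version_info < (3, 37, 0),
    reason='SQLite 3.37+ is required for STRICT tables',
)
def test_strict_table():
    db = sqlite3.connect(':memory:')
    db.execute('create table t (id integer, name text) strict')
```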
_Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/344#issuecomment-982076924_",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/346/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1066501534,I_kwDOCGYnMM4_kYWe,345,`table.strict` introspection boolean for identifying STRICT mode tables,9599,simonw,closed,0,,,,,2,2021-11-29T21:05:10Z,2021-11-29T22:45:26Z,2021-11-29T22:44:36Z,OWNER,,"> From the STRICT docs: >> The SQLite parser accepts a comma-separated list of table options after the final close parenthesis in a CREATE TABLE statement. As of this writing (2021-08-23) only two options are recognized: >> >> - STRICT >> - [WITHOUT ROWID](https://www.sqlite.org/withoutrowid.html) > > So I think I need to read the `CREATE TABLE` statement from the `sqlite_master` table, split on the last `)`, split those tokens on `,` and see if `create` is in there (case insensitive). _Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/344#issuecomment-982020757_",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/345/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1066288689,I_kwDOBm6k_c4_jkYx,1538,Research pattern for re-registering existing Click tools with register_commands,9599,simonw,closed,0,,,,,3,2021-11-29T17:09:47Z,2021-11-29T17:32:44Z,2021-11-29T17:27:16Z,OWNER,,"Building a Datasette plugin that imports an existing Click CLI tool and re-registers it is proving hard - Click doesn't really want you to do that. I tried this: ```python from datasette import hookimpl from git_history.cli import file as git_history_file @hookimpl def register_commands(cli): cli.command(name=""git-history"")(git_history_file.callback) ``` But when I run this: ``` % datasette git-history --help Usage: datasette git-history [OPTIONS] Analyze the history of a specific file and write it to SQLite Options: --help Show this message and exit. ``` The options are all missing - which means that the command doesn't actually work. Will need to research this pattern separately. _Originally posted by @simonw in https://github.com/simonw/git-history/issues/21#issuecomment-981835305_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1538/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1058896236,I_kwDOBm6k_c4_HXls,1522,Deploy a live instance of demos/apache-proxy,9599,simonw,closed,0,,,,,34,2021-11-19T20:32:55Z,2021-11-23T03:00:34Z,2021-11-20T18:51:56Z,OWNER,,"> I'll get this working on my laptop first, but then I want to get it up and running on Cloud Run - maybe with a GitHub Actions workflow in this repo that re-deploys it on manual execution. 
_Originally posted by @simonw in https://github.com/simonw/datasette/issues/1521#issuecomment-974322178_ I started by following https://ahmet.im/blog/cloud-run-multiple-processes-easy-way/ - see example in https://github.com/ahmetb/multi-process-container-lazy-solution",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1522/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1059549523,I_kwDOBm6k_c4_J3FT,1526,"Add to vercel.json, rather than overwriting it.",192568,mroswell,closed,0,,,,,2,2021-11-22T00:47:12Z,2021-11-22T04:49:45Z,2021-11-22T04:13:47Z,CONTRIBUTOR,,"I'd like to be able to add to vercel.json. But Datasette overwrites whatever I put in that file. I originally reported this here: https://github.com/simonw/datasette-publish-vercel/issues/51 In that case, I wanted to do a rewrite... and now I need to do 301 redirects (because we had to rename our site). Can this be addressed? ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1526/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 273944952,MDU6SXNzdWUyNzM5NDQ5NTI=,93,Package as standalone binary,67420,atomotic,closed,0,,,,,18,2017-11-14T21:14:07Z,2021-11-21T07:00:23Z,2021-11-21T07:00:23Z,NONE,,"hint: rather than the docker image, a standalone and multiplatform binary (containing the app and the database) could be simpler to distribute. i would like to investigate the possibility of packaging everything with [pyinstaller](http://www.pyinstaller.org/), adding the database as a [data file](https://pythonhosted.org/PyInstaller/spec-files.html#adding-data-files)",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/93/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1059219106,I_kwDOBm6k_c4_Imai,1524,"Improve Apache proxy documentation, link to demo",9599,simonw,closed,0,,,,,4,2021-11-20T20:03:14Z,2021-11-20T23:34:03Z,2021-11-20T23:34:03Z,OWNER,,"> The latest demo is now live at https://datasette-apache-proxy-demo.fly.dev/prefix/fixtures/sortable?_facet=pk2 _Originally posted by @simonw in https://github.com/simonw/datasette/issues/1519#issuecomment-974697824_ I'm going to put out a 0.59.3 bugfix release with this, but I'd like to first improve the documentation on https://docs.datasette.io/en/stable/deploying.html#apache-proxy-configuration to highlight the new demo.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1524/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 637395097,MDU6SXNzdWU2MzczOTUwOTc=,838,Incorrect URLs when served behind a proxy with base_url set,79913,tsibley,closed,0,,,6026070,0.51,14,2020-06-11T23:58:55Z,2021-11-20T19:35:48Z,2021-11-20T19:35:48Z,NONE,,"I'm running `datasette serve --config base_url:/foo/ …`, proxying to it with this Apache config: ProxyPass /foo/ http://localhost:8001/ ProxyPassReverse /foo/ http://localhost:8001/ and then accessing it via `https://example.com/foo/`. 
Although many of the URLs in the pages are correct (presumably because they either use absolute paths which include `base_url` or relative paths), the faceting and pagination links still use fully-qualified URLs pointing at `http://localhost:8001`. I looked into this a little in the source code, and it seems to be an issue anywhere `request.url` or `request.path` is used, as these contain the values for the request between the frontend (Apache) and backend (Datasette) server. Those properties are primarily used via the `path_with_…` family of utility functions and the `Datasette.absolute_url` method.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/838/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1059209412,I_kwDOBm6k_c4_IkDE,1523,Come up with a more elegant solution for base_url than ds.urls.path(),9599,simonw,open,0,,,,,0,2021-11-20T19:05:22Z,2021-11-20T19:05:22Z,,OWNER,,"While fixing #1519 I added a lot of ugly code that looks like this: https://github.com/simonw/datasette/blob/08947fa76433d18988aa1ee1d929bd8320c75fe2/datasette/facets.py#L228-L230 See these two commits in particular: fe687fd0207c4c56c4778d3e92e3505fc4b18172 and 08947fa76433d18988aa1ee1d929bd8320c75fe2 It would be great to come up with a less verbose and error-prone way of handling this problem.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1523/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1058815557,I_kwDOBm6k_c4_HD5F,1521,Docker configuration for exercising Datasette behind Apache mod_proxy,9599,simonw,closed,0,,,,,10,2021-11-19T18:46:18Z,2021-11-19T20:32:29Z,2021-11-19T20:32:29Z,OWNER,,"> Having a live demo running on Cloud Run that proxies through Apache and uses `base_url` would be incredibly useful for replicating and debugging this kind of thing. I wonder how hard it is to run Apache and `mod_proxy` in the same Docker container as Datasette? _Originally posted by @simonw in https://github.com/simonw/datasette/issues/1519#issuecomment-974310208_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1521/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1058803238,I_kwDOBm6k_c4_HA4m,1520,Pattern for avoiding accidental URL over-rides,9599,simonw,open,0,,,,,1,2021-11-19T18:28:05Z,2021-11-19T18:29:26Z,,OWNER,,"Following #1517 I'm experimenting with a plugin that does this: ```python @hookimpl def register_routes(): return [ (r""/(?P[^/]+)/(?P[^/]+?)$"", Table().view), ] ``` This is supposed to replace the default table page with new code... but there's a problem: `/-/versions` on that instance now returns 404 `Database '-' does not exist`! Need to figure out a pattern to avoid that happening. 
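One possible pattern (just a sketch, reusing `hookimpl` and `Table` from the snippet above): tighten the plugin's regex so it can never capture a first path segment that starts with `-`, leaving the `/-/...` pages to Datasette:

```python
@hookimpl
def register_routes():
    return [
        # (?!-) refuses to match when the first path segment starts
        # with -, so /-/versions and friends fall through to
        # Datasette's own routes (plain groups used here for brevity)
        (r'/(?!-)([^/]+)/([^/]+?)$', Table().view),
    ]
```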
Plugins get to add their routes before Datasette's default routes, which is why this is happening here.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1520/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1058196641,I_kwDOCGYnMM4_Esyh,342,Extra options to `lookup()` which get passed to `insert()`,9599,simonw,closed,0,,,,,7,2021-11-19T06:53:03Z,2021-11-19T07:26:54Z,2021-11-19T07:26:54Z,OWNER,,"For https://github.com/simonw/git-history/issues/12 I found myself wanting to pass extra options to `lookup()` to set the column order, primary key etc.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/342/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1057996111,I_kwDOBm6k_c4_D71P,1517,Let `register_routes()` over-ride default routes within Datasette,9599,simonw,closed,0,,,3268330,Datasette 1.0,2,2021-11-19T00:22:15Z,2021-11-19T03:20:00Z,2021-11-19T03:07:27Z,OWNER,,"See https://github.com/simonw/datasette/issues/878#issuecomment-973554024 - right now `register_routes()` can't replace default Datasette routes. It would be neat if plugins could do this - especially if there was a neat documented way for them to then re-dispatch to the original route code after making some kind of modification.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1517/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1056746091,I_kwDOBm6k_c4-_Kpr,1515,Handle foreign keys that point to a non-existent table,9599,simonw,open,0,,,,,0,2021-11-17T23:40:13Z,2021-11-18T01:31:56Z,,OWNER,,"Spotted in https://github.com/simonw/datasette-graphql/issues/79 Demo: https://datasette-graphql-demo.datasette.io/fixtures/bad_foreign_key The foreign key links to a 404 page. ![B87009C7-CFCA-4DF9-8FBA-FA3E6CA28EC2](https://user-images.githubusercontent.com/9599/142334788-4d1a4acd-bc87-4426-b333-d46b221afcec.jpeg) ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1515/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1055469073,I_kwDOBm6k_c4-6S4R,1513,Research: CTEs and union all to calculate facets AND query at the same time,9599,simonw,closed,0,,,,,12,2021-11-16T22:26:45Z,2021-11-16T23:41:46Z,2021-11-16T23:41:46Z,OWNER,,"Consider this page: https://global-power-plants.datasettes.com/global-power-plants/global-power-plants?_search=plant&_facet=owner&_facet=country_long&_facet=primary_fuel Datasette needs to run the main query for the rows on that page, a count query for the total, then a separate query for each of those three specified facets. This is a `_search=` query, so it needs to execute the FTS code once for the rows, again for the count, and then three more times for each of the facets. 
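For illustration, the combined query might look something like this sketch - it assumes the usual `_fts` shadow table naming and only shows the count plus facet branches; the rows themselves would need another branch padded to the same column shape:

```python
import sqlite3

sql = '''
with filtered as (
    select * from [global-power-plants]
    where rowid in (
        select rowid from [global-power-plants_fts]
        where [global-power-plants_fts] match :search
    )
)
select 'count' as facet, null as value, count(*) as n from filtered
union all
select 'owner', owner, count(*) from filtered group by owner
union all
select 'country_long', country_long, count(*) from filtered group by country_long
union all
select 'primary_fuel', primary_fuel, count(*) from filtered group by primary_fuel
'''
db = sqlite3.connect('global-power-plants.db')
for facet, value, n in db.execute(sql, {'search': 'plant'}):
    print(facet, value, n)
```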
Could running that query as a CTE and doing the other queries as part of the same large query produce significant speed improvements?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1513/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 812704869,MDU6SXNzdWU4MTI3MDQ4Njk=,1237,?_pretty=1 option for pretty-printing JSON output,9599,simonw,open,0,,,3268330,Datasette 1.0,1,2021-02-20T20:54:40Z,2021-11-16T18:28:33Z,,OWNER,,Suggested by @frankieroberto in https://github.com/simonw/datasette/issues/782#issuecomment-782746755,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1237/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 718540751,MDU6SXNzdWU3MTg1NDA3NTE=,1012,For 1.0 update trove classifier in setup.py,9599,simonw,open,0,,,3268330,Datasette 1.0,5,2020-10-10T05:52:08Z,2021-11-16T13:18:36Z,,OWNER,, Development Status :: 5 - Production/Stable,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1012/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1052247023,I_kwDOBm6k_c4-uAPv,1505,Datasette should have an option to output CSV with semicolons,9599,simonw,open,0,,,,,1,2021-11-12T18:02:21Z,2021-11-16T11:40:52Z,,OWNER,,,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1505/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 440222719,MDU6SXNzdWU0NDAyMjI3MTk=,448,_facet_array should work against views,9599,simonw,closed,0,,,3268330,Datasette 1.0,12,2019-05-03T21:08:04Z,2021-11-16T01:32:05Z,2021-11-16T01:19:40Z,OWNER,,"I created this view: https://json-view-facet-bug-demo-j7hipcg4aq-uc.a.run.app/russian-ads-8dbda00/ads_with_targets ``` CREATE VIEW ads_with_targets as select ads.*, json_group_array(targets.name) as target_names from ads join ad_targets on ad_targets.ad_id = ads.id join targets on ad_targets.target_id = targets.id group by ad_targets.ad_id ``` When I try to apply faceting by array it appears to work at first: https://json-view-facet-bug-demo-j7hipcg4aq-uc.a.run.app/russian-ads/ads_with_targets?_facet_array=target_names But actually it's doing the wrong thing - the SQL for the facets uses rowid, but rowid is not present on views at all! These results are incorrect, and clicking to select a facet will fail to produce any rows: https://json-view-facet-bug-demo-j7hipcg4aq-uc.a.run.app/russian-ads/ads_with_targets?_facet_array=target_names&target_names__arraycontains=people_who_match%3Ainterests%3AAfrican-American+Civil+Rights+Movement+%281954%E2%80%9468%29 Here's the SQL it should be using when you select a facet (note that it does not use a rowid): https://json-view-facet-bug-demo-j7hipcg4aq-uc.a.run.app/russian-ads?sql=select+*+from+ads_with_targets+where+id+in+%28%0D%0A++++++++++++select+ads_with_targets.id+from+ads_with_targets%2C+json_each%28ads_with_targets.target_names%29+j%0D%0A++++++++++++where+j.value+%3D+%3Ap0%0D%0A++++++++%29+limit+101&p0=people_who_match%3Ainterests%3ABlack+%28Color%29 So we need to do something a lot smarter here. 
I'm not sure what the fix will look like, or even if it's feasible given that views don't have a rowid to hook into, so the JSON faceting SQL may have to be completely rewritten. ``` datasette publish cloudrun \ russian-ads.db \ --name json-view-facet-bug-demo \ --branch master \ --extra-options ""--config sql_time_limit_ms:5000 --config facet_time_limit_ms:5000"" ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/448/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1054246919,I_kwDOBm6k_c4-1ogH,1511,Review plugin hooks for Datasette 1.0,9599,simonw,open,0,,,3268330,Datasette 1.0,1,2021-11-15T23:26:05Z,2021-11-16T01:20:14Z,,OWNER,,I need to perform a detailed review of the plugin interface - especially the plugin hooks like [register_facet_classes()](https://docs.datasette.io/en/stable/plugin_hooks.html#register-facet-classes) which I don't yet have complete confidence in.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1511/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 459590021,MDU6SXNzdWU0NTk1OTAwMjE=,519,Decide what goes into Datasette 1.0,9599,simonw,closed,0,,,3268330,Datasette 1.0,4,2019-06-23T15:47:41Z,2021-11-15T23:26:11Z,2021-11-15T23:26:11Z,OWNER,,Datasette ASGI #272 is a big part of it... but 1.0 will generally be an indicator that Datasette is a stable platform for developers to write plugins and custom templates against. So lots to think about.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/519/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1005891028,I_kwDOCGYnMM479K3U,329,Rethink approach to [ and ] in column names (currently throws error),9599,simonw,closed,0,,,,,12,2021-09-23T22:14:24Z,2021-11-15T02:57:51Z,2021-11-15T02:57:51Z,OWNER,,"> I think it's best to still keep `[` and `]` out of column names though. Transforming them into `(` and `)` seems reasonable - but should that happen here or in `sqlite-utils`? I think in `sqlite-utils`. 
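The transformation itself would be tiny - a sketch of the rename rule being discussed:

```python
def fix_square_braces(name):
    # turn [ and ] into ( and ) so the column name stays readable without
    # tripping over SQLite's [identifier] quoting
    return name.replace('[', '(').replace(']', ')')

assert fix_square_braces('size [cm]') == 'size (cm)'
```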
_Originally posted by @simonw in https://github.com/simonw/datasette-app/issues/121#issuecomment-926200398_ This is a rethinking of the solution to: - https://github.com/simonw/sqlite-utils/issues/86",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/329/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1053136495,I_kwDOCGYnMM4-xZZv,341,`hash_id: Optional[Any]` should be `hash_id: Optional[str]`,9599,simonw,closed,0,,,,,0,2021-11-15T02:12:39Z,2021-11-15T02:19:31Z,2021-11-15T02:19:31Z,OWNER,,"In a few places: https://github.com/simonw/sqlite-utils/blob/54a2269e91ce72b059618662ed133a85f3d42e4a/sqlite_utils/db.py#L642 https://github.com/simonw/sqlite-utils/blob/54a2269e91ce72b059618662ed133a85f3d42e4a/sqlite_utils/db.py#L751 https://github.com/simonw/sqlite-utils/blob/54a2269e91ce72b059618662ed133a85f3d42e4a/sqlite_utils/db.py#L1049 https://github.com/simonw/sqlite-utils/blob/54a2269e91ce72b059618662ed133a85f3d42e4a/sqlite_utils/db.py#L1230 But it's correct here: https://github.com/simonw/sqlite-utils/blob/54a2269e91ce72b059618662ed133a85f3d42e4a/sqlite_utils/db.py#L2470",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/341/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1053122092,I_kwDOCGYnMM4-xV4s,339,`table.lookup()` option to populate additional columns when creating a record,9599,simonw,closed,0,,,,,4,2021-11-15T01:41:17Z,2021-11-15T02:02:34Z,2021-11-15T02:02:00Z,OWNER,,"> For the commits table I feel like I want a version of `table.lookup()` that can be passed additional columns to populate only if the record does not exist yet. 
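The call shape being asked for might look like this (a sketch of the proposed API; the column and variable names are hypothetical `git-history` ones):

```python
# the second argument would only populate columns when the row
# has to be created, leaving existing records untouched
commit_id = db['commits'].lookup(
    {'hash': commit_hash},
    {'commit_at': commit_at, 'message': message},
)
```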
_Originally posted by @simonw in https://github.com/simonw/git-history/issues/12#issuecomment-967455017_",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/339/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1053087862,I_kwDOCGYnMM4-xNh2,338,"dict, list, tuple should all map to TEXT",9599,simonw,closed,0,,,,,0,2021-11-15T00:28:01Z,2021-11-15T00:36:03Z,2021-11-15T00:36:03Z,OWNER,,"> This relates to the fact that dictionaries, lists and tuples get special treatment and are converted to JSON strings, using this code: https://github.com/simonw/sqlite-utils/blob/e8d958109ee290cfa1b44ef7a39629bb50ab673e/sqlite_utils/db.py#L2937-L2947 > > So the `COLUMN_TYPE_MAPPING` should include those too - right now it looks like this: https://github.com/simonw/sqlite-utils/blob/e8d958109ee290cfa1b44ef7a39629bb50ab673e/sqlite_utils/db.py#L165-L188 _Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/322#issuecomment-968401459_",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/338/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1026794056,I_kwDOCGYnMM49M6JI,331,Mypy error: found module but no type hints or library stubs,53032010,andreaslongo,closed,0,,,,,2,2021-10-14T20:29:50Z,2021-11-14T23:21:08Z,2021-11-14T23:21:08Z,NONE,,"``` Python 3.9.5 mypy 0.910 sqlite-utils 3.17.1 ``` While using sqlite-utils as a library, when I use mypy for static type checking, it throws an error: ``` mypy . src/etl.py:5: error: Skipping analyzing ""sqlite_utils"": found module but no type hints or library stubs import sqlite_utils ^ src/etl.py:5: note: See https://mypy.readthedocs.io/en/stable/running_mypy.html#missing-imports test/test_etl.py:4: error: Skipping analyzing ""sqlite_utils"": found module but no type hints or library stubs import sqlite_utils ^ Found 2 errors in 2 files (checked 7 source files) ``` When I add a `py.typed` file to the sqlite-utils package to mark it as PEP 561 compatible, the error goes away. ``` al@nbal ..b/python3.9/site-packages/sqlite_utils (git)-[main] % la total 200 drwx------ 3 al al 4096 Oct 14 22:00 . drwx------ 117 al al 4096 Oct 12 21:12 .. -rw------- 1 al al 64409 Oct 12 21:11 cli.py -rw------- 1 al al 109092 Oct 12 21:11 db.py -rw------- 1 al al 0 Oct 14 22:00 py.typed -rw------- 1 al al 684 Oct 12 21:11 recipes.py -rw------- 1 al al 7988 Oct 12 21:11 utils.py -rw------- 1 al al 113 Oct 12 21:11 __init__.py ``` I would like to suggest adding a `py.typed` file to the repository. 
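Alongside the empty marker file itself, the packaging config has to ship it - a `setup.py` sketch along the lines of what the PEP 561 guidance describes:

```python
from setuptools import setup

setup(
    name='sqlite-utils',
    packages=['sqlite_utils'],
    # ship the (empty) py.typed marker so type checkers treat the
    # installed package as PEP 561 compatible
    package_data={'sqlite_utils': ['py.typed']},
)
```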
See also the mypy docs on creating PEP 561 compatible packages: https://mypy.readthedocs.io/en/stable/installed_packages.html#creating-pep-561-compatible-packages ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/331/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1028056713,I_kwDOCGYnMM49RuaJ,332,`sqlite-utils memory --flatten` option to flatten nested JSON,22523840,rdtq,closed,0,,,,,1,2021-10-16T14:04:42Z,2021-11-14T23:05:05Z,2021-11-14T23:05:05Z,NONE,,"currently --flatten option works only for `insert` command, it would be cool if it worked for `memory` as well to query nested json",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/332/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1042569687,I_kwDOCGYnMM4-JFnX,335,sqlite-utils index-foreign-keys fails due to pre-existing index,596279,zaneselvans,closed,0,,,,,11,2021-11-02T16:22:11Z,2021-11-14T22:55:56Z,2021-11-14T22:55:56Z,NONE,,"While running the command: ```sh sqlite-utils index-foreign-keys $SQLITE_DIR/pudl.sqlite ``` I got the following error: ``` Traceback (most recent call last): File ""/home/zane/miniconda3/envs/pudl-dev/bin/sqlite-utils"", line 8, in sys.exit(cli()) File ""/home/zane/miniconda3/envs/pudl-dev/lib/python3.9/site-packages/click/core.py"", line 829, in __call__ return self.main(*args, **kwargs) File ""/home/zane/miniconda3/envs/pudl-dev/lib/python3.9/site-packages/click/core.py"", line 782, in main rv = self.invoke(ctx) File ""/home/zane/miniconda3/envs/pudl-dev/lib/python3.9/site-packages/click/core.py"", line 1259, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/home/zane/miniconda3/envs/pudl-dev/lib/python3.9/site-packages/click/core.py"", line 1066, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/home/zane/miniconda3/envs/pudl-dev/lib/python3.9/site-packages/click/core.py"", line 610, in invoke return callback(*args, **kwargs) File ""/home/zane/miniconda3/envs/pudl-dev/lib/python3.9/site-packages/sqlite_utils/cli.py"", line 454, in index_foreign_keys db.index_foreign_keys() File ""/home/zane/miniconda3/envs/pudl-dev/lib/python3.9/site-packages/sqlite_utils/db.py"", line 902, in index_foreign_keys table.create_index([fk.column]) File ""/home/zane/miniconda3/envs/pudl-dev/lib/python3.9/site-packages/sqlite_utils/db.py"", line 1563, in create_index self.db.execute(sql) File ""/home/zane/miniconda3/envs/pudl-dev/lib/python3.9/site-packages/sqlite_utils/db.py"", line 421, in execute return self.conn.execute(sql) sqlite3.OperationalError: index idx_generators_eia860_report_date already exists ``` This DB was created with the foreign key constraint `PRAGMA` enabled and a bunch of column-level `CHECK` constraints. Is this an expected behavior? Should one not try to index foreign keys if FK constraints are already being enforced within the DB? I'm also noticing that the size of the DB after FK indexes have been added went from 483MB to 835MB, which seems like a much bigger jump than when I've done this previously. Software versions... 
* sqlite-utils 3.17.1 * sqlite 3.36.0 * SQLAlchemy 1.4.26 (used to create the DB)",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/335/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1052851176,I_kwDOBm6k_c4-wTvo,1507,ReadTheDocs build failed for 0.59.2 release,9599,simonw,closed,0,,,,,6,2021-11-14T05:24:34Z,2021-11-14T05:41:55Z,2021-11-14T05:41:55Z,OWNER,,"I had to cancel the 0.59.2 release because ReadTheDocs was failing to build the documentation. https://readthedocs.org/projects/datasette/builds/15268454/ ``` /home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/0.59.2/bin/python -m sphinx -T -b html -d _build/doctrees -D language=en . _build/html Running Sphinx v1.8.5 loading translations [en]... done making output directory... building [mo]: targets for 0 po files that are out of date building [html]: targets for 27 source files that are out of date updating environment: 27 added, 0 changed, 0 removed reading sources... [ 3%] authentication Traceback (most recent call last): File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/0.59.2/lib/python2.7/site-packages/sphinx/cmd/build.py"", line 304, in build_main app.build(args.force_all, filenames) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/0.59.2/lib/python2.7/site-packages/sphinx/application.py"", line 341, in build self.builder.build_update() File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/0.59.2/lib/python2.7/site-packages/sphinx/builders/__init__.py"", line 347, in build_update len(to_build)) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/0.59.2/lib/python2.7/site-packages/sphinx/builders/__init__.py"", line 360, in build updated_docnames = set(self.read()) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/0.59.2/lib/python2.7/site-packages/sphinx/builders/__init__.py"", line 468, in read self._read_serial(docnames) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/0.59.2/lib/python2.7/site-packages/sphinx/builders/__init__.py"", line 490, in _read_serial self.read_doc(docname) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/0.59.2/lib/python2.7/site-packages/sphinx/builders/__init__.py"", line 534, in read_doc doctree = read_doc(self.app, self.env, self.env.doc2path(docname)) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/0.59.2/lib/python2.7/site-packages/sphinx/io.py"", line 318, in read_doc pub.publish() File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/0.59.2/lib/python2.7/site-packages/docutils/core.py"", line 219, in publish self.apply_transforms() File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/0.59.2/lib/python2.7/site-packages/docutils/core.py"", line 200, in apply_transforms self.document.transformer.apply_transforms() File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/0.59.2/lib/python2.7/site-packages/sphinx/transforms/__init__.py"", line 90, in apply_transforms Transformer.apply_transforms(self) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/0.59.2/lib/python2.7/site-packages/docutils/transforms/__init__.py"", line 171, in apply_transforms transform.apply(**kwargs) File 
""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/0.59.2/lib/python2.7/site-packages/sphinx/transforms/__init__.py"", line 245, in apply apply_source_workaround(n) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/0.59.2/lib/python2.7/site-packages/sphinx/util/nodes.py"", line 94, in apply_source_workaround for classifier in reversed(node.parent.traverse(nodes.classifier)): TypeError: argument to reversed() must be a sequence Exception occurred: File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/0.59.2/lib/python2.7/site-packages/sphinx/util/nodes.py"", line 94, in apply_source_workaround for classifier in reversed(node.parent.traverse(nodes.classifier)): TypeError: argument to reversed() must be a sequence The full traceback has been saved in /tmp/sphinx-err-vkl0oE.log, if you want to report the issue to the developers. Please also report this if it was a user error, so that a better error message can be provided next time. A bug report can be filed in the tracker at . Thanks! ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1507/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1050163432,I_kwDOBm6k_c4-mDjo,1503,`?_nocol=` removes that column from the filter interface,9599,simonw,closed,0,,,,,1,2021-11-10T18:22:50Z,2021-11-14T05:08:27Z,2021-11-14T04:53:07Z,OWNER,,"e.g. on https://latest.datasette.io/fixtures/sortable?_nocol=sortable This causes weird behaviour when you e.g. facet by a hidden column, since selecting facets and then re-submitting the form will clear the selected filter. ![nocol-bug](https://user-images.githubusercontent.com/9599/141171135-aded71d1-a4cb-4b7f-a4ea-26828fa98906.gif) ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1503/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1052826038,I_kwDOBm6k_c4-wNm2,1506,Columns beginning with an underscore do not facet correctly,9599,simonw,closed,0,,,,,1,2021-11-14T02:20:32Z,2021-11-14T04:45:21Z,2021-11-14T04:45:21Z,OWNER,,"Datasette treats columns that start with an underscore as querystring parameters it should ignore! Discovered in https://github.com/simonw/git-history/issues/14#issuecomment-968192464",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1506/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1041778507,I_kwDOCGYnMM4-GEdL,334,Filter by datetime objects using rows_where(),11642379,viseshrp,closed,0,,,,,0,2021-11-02T00:44:08Z,2021-11-13T19:23:21Z,2021-11-13T19:23:21Z,NONE,,"Firstly, thanks for this nice utility. It would be nice to have an example in the docs on how to filter by date range using `rows_where()`. This doesn't seem to work: ``` table.rows_where('datetime(created) between datetime(""2021-10-31T17:29:59.277428-04:00"") AND datetime(""2021-11-01T03:44:04.544651+00:00"")') ``` I could probably just use `db.query()`, which works for the above, but it would be nice if I could pass in `datetime` objects in `rows_where()`. 
Thanks.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/334/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 924748955,MDU6SXNzdWU5MjQ3NDg5NTU=,1380,Serve all db files in a folder,193463,stratosgear,open,0,,,,,5,2021-06-18T10:03:32Z,2021-11-13T08:09:11Z,,NONE,,"I tried to get the `serve` command to serve all the .db files in the `/mnt` folder but it seems that the server does not refresh the list of files. In more detail: * Starting datasette as a docker container with: ``` docker run -p 8001:8001 -v `pwd`:/mnt \ datasetteproject/datasette \ datasette -p 8001 -h 0.0.0.0 /mnt ``` * Datasette correctly serves all the *.db files found in the /mnt folder * When the server is running, if I copy a new file into the $PWD folder, Datasette does not seem to see the new files, forcing me to restart Docker. Is there an option/setting that I overlooked, or is this something missing? BTW, the `--reload` setting, although at first glance it is what you think you need, does not seem to do anything with regard to seeing all *.db files. Thanks!",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1380/reactions"", ""total_count"": 2, ""+1"": 2, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1051277222,I_kwDOBm6k_c4-qTem,1504,Link to ?_size=max at bottom of table page,9599,simonw,open,0,,,,,0,2021-11-11T19:06:33Z,2021-11-11T19:06:33Z,,OWNER,,"This can have text such as ""Show 1,000 rows per page"", based on the max size limit setting. Would make it easier for people to see more data at once without having to know how to hack the URL, similar to the `...` for facet sizes I added in #1337.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1504/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1049946823,I_kwDOBm6k_c4-lOrH,1502,"Full-text search: No support to unary ""-"" operator",516827,gustavorps,open,0,,,,,0,2021-11-10T15:11:19Z,2021-11-10T15:11:19Z,,NONE,,"Reference: https://www.sqlite.org/fts3.html#set_operations_using_the_standard_query_syntax Test: https://fara.datasettes.com/fara/FARA_All_ShortForms?_search=manafort+-freedman&_sort=rowid",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1502/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 845794436,MDU6SXNzdWU4NDU3OTQ0MzY=,1284,Feature or Documentation Request: Individual table as home page template,192568,mroswell,open,0,,,,,4,2021-03-31T03:56:17Z,2021-11-04T03:15:01Z,,CONTRIBUTOR,,"It would be great to have a sample showing how to move a single database that has a single table to the index page. I'm trying it now, and find there is a real depth of Datasette and Python understanding that's required to be successful. I've got all the basic jinja concepts down... variables, template control structures, template inheritance, template overrides, css, html, the --template-dir and --static arguments, etc. But copying the table.html file to index.html doesn't work. There are undocumented functions and filters... I can figure some of them out (yay, url_builder.py and utils/__init__.py!) 
but it's a slog better handled by a much stronger Python developer. One sample would make a world of difference. The ideal form of this documentation would be a diff between the default table.html and how that would look if essentially moved to index.html. The use case is for everyone who wants to create a public-facing website to explore a single table at the root directory. (Maybe a second bit of documentation for people who have a single database with multiple tables.) (Hmm... might be cool to have a setting for that, where it happens automagically! If only one table, then home page is at the table level. if only one database, then home page is at the database level.... as an option.) I suppose I could ignore this, and somehow do this in the DNS settings once I hook up Vercel to a domain name, maybe.. and remove the breadcrumbs in table.html... but for now, a documentation request in the form of a diff... for viewing a single table (or a single database) at the root. (Actually, there's probably room for a whole expanded section on templates. Noticed some nice table metadata in one of the datasette examples, for instance... Hmm... maybe a whole library of solutions in one place... maybe a documentation hackathon! If that's of interest, of course it's a separate issue. ) ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1284/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 707478649,MDU6SXNzdWU3MDc0Nzg2NDk=,173,Progress bar for sqlite-utils insert,9599,simonw,closed,0,,,,,6,2020-09-23T15:43:56Z,2021-11-01T08:42:24Z,2020-10-27T18:16:04Z,OWNER,,"It would be nice if `sqlite-utils insert` had a progress bar, for when it's churning through huge CSV files.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/173/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 826064552,MDU6SXNzdWU4MjYwNjQ1NTI=,1253,"Capture ""Ctrl + Enter"" or ""⌘ + Enter"" to send SQL query?",9308268,rayvoelker,open,0,,,,,1,2021-03-09T15:00:50Z,2021-10-30T16:00:42Z,,NONE,,"It appears as though ""Shift + Enter"" triggers the form submit action to submit SQL, but could that action be bound to the ""Ctrl + Enter"" or ""⌘ + Enter"" action? I feel like that pattern already exists in a number of similar tools and could improve usability of the editor.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1253/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 761915790,MDU6SXNzdWU3NjE5MTU3OTA=,206,sqlite-utils should suggest --csv if JSON parsing fails,9599,simonw,closed,0,,,,,4,2020-12-11T05:17:56Z,2021-10-30T15:52:17Z,2021-01-03T18:42:22Z,OWNER,,"``` ~ % gsutil cat gs://ossf-criticality-score/python_top_200.csv | sqlite-utils insert /tmp/crit.db crit - ... 
File ""/usr/local/Cellar/python@3.9/3.9.0_3/Frameworks/Python.framework/Versions/3.9/lib/python3.9/json/decoder.py"", line 337, in decode obj, end = self.raw_decode(s, idx=_w(s, 0).end()) File ""/usr/local/Cellar/python@3.9/3.9.0_3/Frameworks/Python.framework/Versions/3.9/lib/python3.9/json/decoder.py"", line 355, in raw_decode raise JSONDecodeError(""Expecting value"", s, err.value) from None json.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0) ``` A nicer error message here would be one that says the JSON is invalid but suggests that maybe you could try `--csv`.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/206/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 836829560,MDU6SXNzdWU4MzY4Mjk1NjA=,248,support for Apache Arrow / parquet files I/O,649467,mhalle,open,0,,,,,1,2021-03-20T14:59:30Z,2021-10-28T23:46:48Z,,NONE,,"I just started looking at Apache Arrow using pyarrow for import and export of tabular datasets, and it looks quite compelling. It might be worth looking at for sqlite-utils and/or datasette. As a test, I took a random jsonl data dump of a dataset I have with floats, strings, and ints and converted it to arrow's parquet format using the naive `pyarrow.parquet.write_table()` command, which has automatic type inference. It compressed down to 7% of the original size. Conversion of a 26MB JSON file and serializing it to parquet was eyeblink instantaneous. Parquet files are portable and can be directly imported into pandas and other analytics software. The only hangup is the automatic type inference of the naive reader. It's great for general laziness and for parsing JSON columns (it correctly interpreted a table of mine with a JSON array). However, I did get an exception for a string column where most entries looked integer-like but had a couple values that weren't -- the reader tried to coerce all of them for some reason, even though the JSON type is string. Since the writer optionally takes a schema, it shouldn't be too hard to grab the sqlite header types. With some additional hinting, you might get datetime columns and JSON, which are native Arrow types. Somewhat tangentially, someone even wrote an sqlite vfs extension for Parquet: https://cldellow.com/2018/06/22/sqlite-parquet-vtable.html ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/248/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 817989436,MDU6SXNzdWU4MTc5ODk0MzY=,242,Async support,25778,eyeseast,open,0,,,,,13,2021-02-27T18:29:38Z,2021-10-28T14:37:56Z,,CONTRIBUTOR,,"Following our conversation last week, want to note this here before I forget. I've had a couple situations where I'd like to do a bunch of updates in an async event loop, but I run into SQLite's issues with concurrent writes. This feels like something sqlite-utils could help with. PeeWee ORM has a [SQLite write queue](http://docs.peewee-orm.com/en/latest/peewee/playhouse.html#sqliteq) that might be a good model. It's using threads or gevent, but I _think_ that approach would translate well enough to asyncio. 
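A minimal sketch of that translation - a single consumer task owns every write, so concurrent coroutines never touch SQLite at the same time (a real version would push the blocking calls into a thread):

```python
import asyncio

import sqlite_utils

async def write_worker(db, queue):
    # the one task allowed to write to the database
    while True:
        table, record = await queue.get()
        # blocking call - a real version would wrap this in
        # asyncio.to_thread() or hand it to a thread pool
        db[table].insert(record)
        queue.task_done()

async def main():
    db = sqlite_utils.Database('queued.db')
    queue = asyncio.Queue()
    worker = asyncio.create_task(write_worker(db, queue))
    for i in range(100):
        await queue.put(('items', {'id': i}))
    await queue.join()  # wait until every queued write has landed
    worker.cancel()

asyncio.run(main())
```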
Happy to help with this, too.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/242/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1021550542,I_kwDOBm6k_c4845_O,1482,Support Python 3.10,9599,simonw,closed,0,,,,,2,2021-10-09T00:30:52Z,2021-10-24T22:21:40Z,2021-10-24T22:19:55Z,OWNER,,"I started work on this in #1481 where I found a Python 3.10 bug that needs a workaround in Janus, see: - https://github.com/aio-libs/janus/issues/358 This is a tracking issue for anything else that shows up. This is also needed for the Homebrew package to upgrade to 3.10: - https://github.com/Homebrew/homebrew-core/pull/86932",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1482/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 950664971,MDU6SXNzdWU5NTA2NjQ5NzE=,1401,unordered list is not rendering bullet points in description_html on database page,536941,fgregg,open,0,,,,,2,2021-07-22T13:24:18Z,2021-10-23T13:09:10Z,,CONTRIBUTOR,,"Thanks for this tremendous package, @simonw! In the `description_html` for a database, I [have an unordered list](https://github.com/labordata/warehouse/blob/fcea4502e5b615b0eb3e0bdcb45ec634abe20bb6/warehouse_metadata.yml#L19-L22). However, on the database page on the deployed site, it is not rendering this as a bulleted list. ![Screenshot 2021-07-22 at 09-21-51 nlrb](https://user-images.githubusercontent.com/536941/126645923-2777b7f1-fd4c-4d2d-af70-a35e49a07675.png) Page here: https://labordata-warehouse.herokuapp.com/nlrb-9da4ae5 The documentation gives an [example of using an unordered list](https://docs.datasette.io/en/stable/metadata.html#using-yaml-for-metadata) in a `description_html`, so I expected this to work.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1401/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1033864602,I_kwDOBm6k_c49n4Wa,1496,Named parameters docs should include an example of a cast,9599,simonw,closed,0,,,,,1,2021-10-22T18:56:04Z,2021-10-22T19:38:23Z,2021-10-22T19:34:27Z,OWNER,,"https://docs.datasette.io/en/stable/sql_queries.html#named-parameters It's not obvious that the values from parameters are always SQLite strings, which means that you can't do e.g. integer comparisons on them without casting them first. The documentation here should include an example of this.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1496/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 944903881,MDU6SXNzdWU5NDQ5MDM4ODE=,1396,"""invalid reference format"" publishing Docker image",9599,simonw,closed,0,,,,,9,2021-07-15T01:02:07Z,2021-10-19T08:10:26Z,2021-07-15T19:47:25Z,OWNER,,"Error occurred at the end of the publish flow for Datasette 0.58: https://github.com/simonw/datasette/runs/3072216421 ``` Removing intermediate container cf32b9440907 ---> dfd6985b2afc Successfully built dfd6985b2afc Successfully tagged ***/datasette:0.58 invalid reference format Error: Process completed with exit code 1. 
```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1396/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1028115674,I_kwDOBm6k_c49R8za,1493,`--get '/:memory:.json?sql=select+3*5'` error with datasette 0.59,1580956,chenrui333,closed,0,,,,,1,2021-10-16T18:22:22Z,2021-10-19T04:39:11Z,2021-10-19T04:39:11Z,NONE,,"👋 trying to upgrade the formula to use the latest release, but it runs into a regression test issue with the `--get` command. My QQ is: is `datasette --get '/:memory:.json?sql=select+3*5'` supposed to return 15? Thanks! relates to https://github.com/Homebrew/homebrew-core/pull/87369",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1493/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 969855774,MDU6SXNzdWU5Njk4NTU3NzQ=,1432,Rename Datasette.__init__(config=) parameter to settings=,9599,simonw,open,0,,,,,8,2021-08-13T01:00:27Z,2021-10-19T01:16:41Z,,OWNER,,"> While I'm doing this I should rename this internal variable to avoid confusion in the future: > > https://github.com/simonw/datasette/blob/e837095ef35ae155b4c78cc9a8b7133a48c94f03/datasette/app.py#L203 _Originally posted by @simonw in https://github.com/simonw/datasette/issues/1431#issuecomment-898072940_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1432/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 995098231,MDU6SXNzdWU5OTUwOTgyMzE=,1470,?_sort=rowid with _next= returns error,19851673,eigenfoo,closed,0,,,,,4,2021-09-13T16:36:15Z,2021-10-18T19:30:15Z,2021-10-10T01:15:03Z,NONE,,"For example: - Go to https://cryptics.eigenfoo.xyz/clues/clues?_next=100 (this is the second page of results in a Datasette site) - Search anything using the FTS search bar. For example, searching for `hello` will take you to https://cryptics.eigenfoo.xyz/clues/clues?_search=hello&_sort=rowid&_next=100 - A `500 Error: list index out of range` is raised. This is because the search URL includes the `&_next=100` query parameter, carried over from where the FTS search was run. However, there isn't a second page in the search results, so a `list index out of range` error is raised. You can confirm that removing this query parameter from the URL returns the appropriate search results. The FTS search request should strip any `_next` query parameter. 
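Something like this (a hypothetical helper of mine, not Datasette's actual internals) would be enough:

```python
from urllib.parse import parse_qsl, urlencode

def strip_next(query_string):
    # Drop any stale _next pagination token before re-running the search
    return urlencode([(k, v) for k, v in parse_qsl(query_string) if k != '_next'])

print(strip_next('_search=hello&_sort=rowid&_next=100'))
# _search=hello&_sort=rowid
```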
--- ```bash datasette, version 0.58.1 sqlite-utils, version 3.17 ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1470/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 964400482,MDU6SXNzdWU5NjQ0MDA0ODI=,310,`sqlite-utils insert --flatten` option to flatten nested JSON,9599,simonw,closed,0,,,,,3,2021-08-09T21:23:08Z,2021-10-16T13:54:56Z,2021-08-09T21:44:06Z,OWNER,,"I had to do this with a `jq` recipe today: https://til.simonwillison.net/cloudrun/tailing-cloud-run-request-logs ``` cat log.json | jq -c '[leaf_paths as $path | { ""key"": $path | join(""_""), ""value"": getpath($path) }] | from_entries' \ | sqlite-utils insert /tmp/logs.db logs - --nl --alter --batch-size 1 ``` That was to turn something like this: ```json { ""httpRequest"": { ""latency"": ""0.112114537s"", ""requestMethod"": ""GET"", ""requestSize"": ""534"", ""status"": 200, }, ""insertId"": ""6111722f000b5b4c4d4071e2"", ""labels"": { ""service"": ""datasette-io"" } } ``` Into this instead: ```json { ""httpRequest_latency"": ""0.112114537s"", ""httpRequest_requestMethod"": ""GET"", ""httpRequest_requestSize"": ""534"", ""httpRequest_status"": 200, ""insertId"": ""6111722f000b5b4c4d4071e2"", ""labels_service"": ""datasette-io"" } ``` I have to do this often enough that I think it should be an option, `--flatten` - so I can do this instead: ``` cat log.json | sqlite-utils insert /tmp/logs.db logs - --flatten ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/310/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 988553806,MDU6SXNzdWU5ODg1NTM4MDY=,1457,suggestion: distinguish names in `--static` documentation,51016,ctb,closed,0,,,,,0,2021-09-05T17:04:27Z,2021-10-14T18:39:55Z,2021-10-14T18:39:55Z,CONTRIBUTOR,,"Over in https://docs.datasette.io/en/stable/custom_templates.html#serving-static-files, there is the slightly comical example command - ``` datasette -m metadata.json --static static:static/ --memory ``` (now, with MORE STATIC!) It took me a while to sort out all the URLs and paths involved because I wasn't being very clever. 
But in the interests of simplification and distinction, I might suggest something like ``` datasette -m metadata.json --static loc:static-files/ --memory ``` I will submit a PR for your consideration.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1457/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1025754125,I_kwDOBm6k_c49I8QN,1488,Upgrade to httpx 0.20.0 (request() got an unexpected keyword argument 'allow_redirects'),9599,simonw,closed,0,,,,,5,2021-10-13T22:37:22Z,2021-10-14T18:03:45Z,2021-10-14T18:03:45Z,OWNER,,This is caused by a change made to `httpx` in https://github.com/encode/httpx/releases/tag/0.20.0,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1488/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 994450961,MDU6SXNzdWU5OTQ0NTA5NjE=,1469,"Column cog shows ""facet by this"" when already default faceted",9599,simonw,closed,0,,,,,2,2021-09-13T04:51:26Z,2021-10-13T21:20:07Z,2021-10-13T21:20:07Z,OWNER,,"e.g. on https://covid-19.datasettes.com/covid/economist_excess_deaths But if you add `?_facet=country` to the URL that goes away: https://covid-19.datasettes.com/covid/economist_excess_deaths?_facet_size=5&_facet=country The logic that decides if the ""Facet by this"" item is shown does not take default `metadata.json` facets into account.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1469/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1023243105,I_kwDOBm6k_c48_XNh,1486,pipx installation instructions for plugins don't reference pipx inject,41546558,RhetTbull,closed,0,,,,,0,2021-10-12T00:43:42Z,2021-10-13T21:09:11Z,2021-10-13T21:09:11Z,CONTRIBUTOR,,"The datasette [installation instructions](https://github.com/simonw/datasette/blob/main/docs/installation.rst) discuss how to install with pipx, how to upgrade with pipx, and how to upgrade plugins with pipx but do not mention how to install a plugin with pipx. You discussed this on your [blog](https://til.simonwillison.net/python/installing-upgrading-plugins-with-pipx) but looks like this didn't make it in when you updated the docs for pipx (#756). 
I'll submit a PR shortly to fix this.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1486/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1021849766,I_kwDOBm6k_c486DCm,1483,Running a search on page 2 of results should not preserve ?_next=,9599,simonw,closed,0,,,,,0,2021-10-10T01:18:12Z,2021-10-13T21:08:10Z,2021-10-13T21:08:10Z,OWNER,,Reported by @eigenfoo in https://github.com/simonw/datasette/issues/1470,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1483/reactions"", ""total_count"": 2, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 1, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 951817328,MDU6SXNzdWU5NTE4MTczMjg=,12,403 when getting token,285352,treyhunner,open,0,,,,,1,2021-07-23T18:43:26Z,2021-10-12T18:31:57Z,,NONE,,"I tried to use https://your-foursquare-oauth-token.glitch.me/ to get my Swarm auth token and got a 403 after I clicked the Allow button: ![image](https://user-images.githubusercontent.com/285352/126826478-60e53614-263d-40bb-9f1d-c1a676644eb0.png) I'm not sure if this is the right repo to report this in",205429375,swarm-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/swarm-to-sqlite/issues/12/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 602533481,MDU6SXNzdWU2MDI1MzM0ODE=,3,"Import EXIF data into SQLite - lens used, ISO, aperture etc",9599,simonw,open,0,,,5324096,Apple Photos online and securely browsable,2,2020-04-18T19:24:31Z,2021-10-05T12:38:24Z,,MEMBER,,,256834907,dogsheep-photos,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-photos/issues/3/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 597671518,MDU6SXNzdWU1OTc2NzE1MTg=,98,"Only set .last_rowid and .last_pk for single update/inserts, not for .insert_all()/.upsert_all() with multiple records",9599,simonw,closed,0,,,,,7,2020-04-10T03:19:40Z,2021-09-28T04:38:44Z,2020-04-13T03:29:15Z,OWNER,,,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/98/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1006781949,I_kwDOBm6k_c48AkX9,1478,Documentation Request: Feature alternative ID instead of default ID,192568,mroswell,open,0,,,,,0,2021-09-24T19:56:13Z,2021-09-25T16:18:54Z,,CONTRIBUTOR,,"My data already has an ID that comes from a federal agency. Would love to have documentation on how to modify the template to: - Remove the generated ID from the table - Link the federal ID to the detail page - and to ensure that the JSON file uses that as the ID. I'd be happy to include the database ID in the export, but not as a key. I don't want to remove the ID from the database, though, because my experience with the federal agency is that data often has anomalies. I don't want all hell to break loose if they end up applying the same ID to multiple rows (which they haven't done yet). I just don't want it to display in the table or the data exports. Perhaps this isn't a template issue, maybe more of a db manipulation... 
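If the db-manipulation route is acceptable, a sketch with sqlite-utils (the table name `records` and column `federal_id` are made up here) could promote the agency ID to the primary key:

```python
import sqlite_utils

db = sqlite_utils.Database('data.db')
# Promote the agency ID to the primary key so row pages and JSON exports
# key off it; the old generated id is kept as an ordinary column.
db['records'].transform(pk='federal_id')
```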
Margie",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1478/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1006016302,I_kwDOBm6k_c479pcu,1477,Consider adding request to the documented default template context,9599,simonw,open,0,,,,,0,2021-09-24T02:34:09Z,2021-09-24T02:34:09Z,,OWNER,,I made a plugin for this today but I think perhaps it should be a default thing instead: https://datasette.io/plugins/datasette-template-request,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1477/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 990844088,MDU6SXNzdWU5OTA4NDQwODg=,325,sqlite-utils memory can't deal with multiple files with the same name,144773,karlb,closed,0,,,,,4,2021-09-08T08:14:42Z,2021-09-22T20:52:56Z,2021-09-22T20:45:45Z,NONE,,"When I use multiple files with the same name, e.g. in `sqlite-utils memory a/bug.csv b/bug.csv`, sqlite-utils creates invalid views. ``` Traceback (most recent call last): File ""/home/karl/.local/bin/sqlite-utils"", line 8, in sys.exit(cli()) File ""/home/karl/.local/pipx/venvs/sqlite-utils/lib/python3.9/site-packages/click/core.py"", line 1137, in __call__ return self.main(*args, **kwargs) File ""/home/karl/.local/pipx/venvs/sqlite-utils/lib/python3.9/site-packages/click/core.py"", line 1062, in main rv = self.invoke(ctx) File ""/home/karl/.local/pipx/venvs/sqlite-utils/lib/python3.9/site-packages/click/core.py"", line 1668, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/home/karl/.local/pipx/venvs/sqlite-utils/lib/python3.9/site-packages/click/core.py"", line 1404, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/home/karl/.local/pipx/venvs/sqlite-utils/lib/python3.9/site-packages/click/core.py"", line 763, in invoke return __callback(*args, **kwargs) File ""/home/karl/.local/pipx/venvs/sqlite-utils/lib/python3.9/site-packages/sqlite_utils/cli.py"", line 1299, in memory db[csv_table].transform(types=tracker.types) File ""/home/karl/.local/pipx/venvs/sqlite-utils/lib/python3.9/site-packages/sqlite_utils/db.py"", line 1287, in transform self.db.execute(sql) File ""/home/karl/.local/pipx/venvs/sqlite-utils/lib/python3.9/site-packages/sqlite_utils/db.py"", line 421, in execute return self.conn.execute(sql) sqlite3.OperationalError: error in view t1: no such table: main.bug ``` This can be reproduced with ```sh #!/bin/bash mkdir foo mkdir bar echo -e 'col1,col2\nval1,val2' > foo/bug.csv echo -e 'col3,col4\nval3,val4' > bar/bug.csv sqlite-utils memory */bug.csv 'SELECT 1' ``` Ideally, the tables would get unique names by including the next path segment until the names are unique. But just making the numbered t* aliases work would be good enough. This problem can of course be worked around by renaming the files, but it would be nice if this case was handled more gracefully. 
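A naive sketch of that naming idea (mine, not a committed design), prefixing parent directory segments until the collision disappears:

```python
from pathlib import Path

def unique_names(paths):
    # Keep adding parent directory segments to the front of the name
    # until it no longer collides (the first file keeps the bare name)
    names = {}
    for p in map(Path, paths):
        name = p.stem
        parents = list(p.parts[:-1])
        while name in names and parents:
            name = f'{parents.pop()}_{name}'
        names[name] = str(p)
    return names

print(unique_names(['foo/bug.csv', 'bar/bug.csv']))
# {'bug': 'foo/bug.csv', 'bar_bug': 'bar/bug.csv'}
```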
Thanks a lot for this great tool!",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/325/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1004613267,I_kwDOCGYnMM474S6T,328,Invalid JSON output when no rows,12752,gravis,closed,0,,,,,3,2021-09-22T18:37:26Z,2021-09-22T20:21:34Z,2021-09-22T20:20:18Z,NONE,,"`sqlite-utils query` generates a JSON output with the result from the query: ```json [{...},{...}] ``` If no rows are returned by the query, I'm expecting an empty JSON array: ```json [] ``` But actually I'm getting an empty string. To be consistent, the output should be `[]` when the request succeeds (return code == `0`).",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/328/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 274615452,MDU6SXNzdWUyNzQ2MTU0NTI=,111,Add “updated” to metadata,9599,simonw,open,0,,,,,12,2017-11-16T18:22:20Z,2021-09-21T22:48:27Z,,OWNER,,"To give an indication as to when the data was last updated. This should be a field in the metadata that is then shown on the index page and in the footer, if it is set. Also support setting it using an option to “datasette publish” and “datasette package” - which can either be a string or can be the magic string “today” to set it to today’s date: datasette publish file.db --updated=today",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/111/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 984939366,MDU6SXNzdWU5ODQ5MzkzNjY=,58,"Error: Use either --since or --since_id, not both - still broken",42904,rubenv,closed,0,,,,,1,2021-09-01T09:45:28Z,2021-09-21T17:37:41Z,2021-09-21T17:37:41Z,CONTRIBUTOR,,"Hi Simon, It appears the fix for #57 doesn't fix things for me: ``` $ twitter-to-sqlite --version twitter-to-sqlite, version 0.21.4 $ python --version Python 3.9.6 ``` ``` $ twitter-to-sqlite home-timeline -a twitter-auth.json twitter/timeline.db --since Importing tweets Error: Use either --since or --since_id, not both ``` Is there any way I can help debug this?",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/58/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 999902754,I_kwDOBm6k_c47mU4i,1473,base logo link visits `undefined` rather than href url,192568,mroswell,open,0,,,,,2,2021-09-18T04:17:04Z,2021-09-19T00:45:32Z,,CONTRIBUTOR,,"I have two connected sites: http://www.SaferOrToxic.org (a Hugo website) and: http://disinfectants.SaferOrToxic.org/disinfectants/listN (a datasette table page) The latter is linked as ""The List"" in the former's menu. (I'd love a prettier URL, but that's what I've got.) On: http://disinfectants.SaferOrToxic.org/disinfectants/listN ... all the other menu links should point back to: https://www.SaferOrToxic.org And they do! But the logo, for some reason--though it has an href pointing to: https://www.SaferOrToxic.org Keeps going to this instead: https://disinfectants.saferortoxic.org/disinfectants/undefined What is causing that? How can I fix it? 
In #1284 back in March, I was doing battle with the index.html template, in a still unresolved issue. (I wanted only a single table page at the root.) But I thought, well, if I can't resolve that, at least I could just point the main website to the datasette page (""The List,"") and then have the List point back to the home website. The menu hrefs to https://www.SaferOrToxic.org work just fine, exactly as they should, from the datasette page. Even the Home link works properly. But the logo link keeps rewriting to: https://disinfectants.saferortoxic.org/disinfectants/undefined This is the HTML: ``` ``` Is this somehow related to cloudflare? Or something in the datasette code? I'm starting to think it's a cloudflare issue. Can I at least rule out it being a datasette issue? My repository is here: https://github.com/mroswell/list-N (BTW, I couldn't figure out how to reference a local image, either, on the datasette side, which is why I'm using the image from the www home page.) ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1473/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 991191951,MDU6SXNzdWU5OTExOTE5NTE=,1464,clean checkout & clean environment has test failures,51016,ctb,open,0,,,,,6,2021-09-08T14:16:23Z,2021-09-13T22:17:17Z,,CONTRIBUTOR,,"I followed the instructions [here](https://docs.datasette.io/en/stable/contributing.html#setting-up-a-development-environment), and even after running `python update-docs-help.py` I get the following failed tests -- any thoughts? ``` FAILED tests/test_api.py::test_searchable[/fixtures/searchable.json?_search=te*+AND+do*&_searchmode=raw-expected_rows3] FAILED tests/test_api.py::test_searchmode[table_metadata1-_search=te*+AND+do*-expected_rows1] FAILED tests/test_api.py::test_searchmode[table_metadata2-_search=te*+AND+do*&_searchmode=raw-expected_rows2] ``` This is with python 3.9.7 and lots of other packages, as in attached environment listing from `conda list`. [conda-installed.txt](https://github.com/simonw/datasette/files/7129487/conda-installed.txt) ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1464/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 994390593,MDU6SXNzdWU5OTQzOTA1OTM=,1468,Faceting for custom SQL queries,72577720,MichaelTiemannOSC,closed,0,,,,,2,2021-09-13T02:52:16Z,2021-09-13T04:54:22Z,2021-09-13T04:54:17Z,CONTRIBUTOR,,"Facets are awesome. But not when I need to join two tidy tables together. Or even when just explicitly running the default SQL query that simply lists all the rows and columns of a table (up to SIZE). 
That is to say, when I browse a table, I see facets: https://latest.datasette.io/fixtures/compound_three_primary_keys But when I run a custom query, I don't: https://latest.datasette.io/fixtures?sql=select+pk1%2C+pk2%2C+pk3%2C+content+from+compound_three_primary_keys+order+by+pk1%2C+pk2%2C+pk3+limit+101 Is there an idiom to cause custom SQL to come back with facet suggestions?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1468/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 990367646,MDU6SXNzdWU5OTAzNjc2NDY=,1462,"Separate out ""debug"" options from ""root"" options",9599,simonw,open,0,,,,,1,2021-09-07T21:27:34Z,2021-09-07T21:34:33Z,,OWNER,,"> I ditched ""root"" for ""admin"" because root by default gives you a whole bunch of stuff which I think could be confusing: > > > > Maybe the real problem here is that I'm conflating ""root"" permissions with ""debug"" options. Perhaps there should be an extra Datasette mode that unlocks debug tools for the root user? _Originally posted by @simonw in https://github.com/simonw/datasette-app-support/issues/8#issuecomment-914638998_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1462/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 989986586,MDU6SXNzdWU5ODk5ODY1ODY=,1461,Try blacken-docs,9599,simonw,closed,0,,,,,3,2021-09-07T13:28:50Z,2021-09-07T16:13:59Z,2021-09-07T16:13:59Z,OWNER,,https://github.com/asottile/blacken-docs,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1461/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 989109888,MDU6SXNzdWU5ODkxMDk4ODg=,1460,Override column metadata with metadata from another column,72577720,MichaelTiemannOSC,open,0,,,,,0,2021-09-06T12:13:33Z,2021-09-06T12:13:33Z,,CONTRIBUTOR,,"I have a table from the PUDL project (https://github.com/catalyst-cooperative/pudl) that looks like this: ``` CREATE TABLE fuel_ferc1 ( id INTEGER NOT NULL, record_id TEXT, utility_id_ferc1 INTEGER, report_year INTEGER, plant_name_ferc1 TEXT, fuel_type_code_pudl VARCHAR(7), fuel_unit VARCHAR(7), fuel_qty_burned FLOAT, fuel_mmbtu_per_unit FLOAT, fuel_cost_per_unit_burned FLOAT, fuel_cost_per_unit_delivered FLOAT, fuel_cost_per_mmbtu FLOAT, PRIMARY KEY (id), FOREIGN KEY(plant_name_ferc1, utility_id_ferc1) REFERENCES plants_ferc1 (plant_name_ferc1, utility_id_ferc1), CONSTRAINT fuel_ferc1_fuel_type_code_pudl_enum CHECK (fuel_type_code_pudl IN ('coal', 'oil', 'gas', 'solar', 'wind', 'hydro', 'nuclear', 'waste', 'unknown')), CONSTRAINT fuel_ferc1_fuel_unit_enum CHECK (fuel_unit IN ('ton', 'mcf', 'bbl', 'gal', 'kgal', 'gramsU', 'kgU', 'klbs', 'btu', 'mmbtu', 'mwdth', 'mwhth', 'unknown')) ); ``` Note that `fuel_unit` is a unit that **pint** can understand, and that `fuel_qty_burned` is a column of data that could be expressed in terms of actual units, not merely as a dimensionless number. Ditto the `fuel_cost_per_unit_...` columns. Is there a way to give a column a default metadata unit (such as *tons* or *USD/ton*) and then let that be overridden when the metadata in another column says *barrels* or *USD/gramsU*? 
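The per-row resolution itself is tiny with pint - here is a sketch of mine (not an existing Datasette or metadata feature) of the fallback logic:

```python
import pint

ureg = pint.UnitRegistry()

def as_quantity(value, row_unit, default_unit='ton'):
    # Prefer the unit named in the row's fuel_unit column; fall back to
    # a column-level default when the cell is empty or 'unknown'
    unit = row_unit if row_unit and row_unit != 'unknown' else default_unit
    return value * ureg(unit)

print(as_quantity(2.5, None))               # 2.5 ton (column default)
print(as_quantity(2.5, 'gal').to('liter'))  # row-level unit wins
```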
@catalyst-cooperative",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1460/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 988556488,MDU6SXNzdWU5ODg1NTY0ODg=,1459,suggestion: allow `datasette --open` to take a relative URL,51016,ctb,open,0,,,,,1,2021-09-05T17:17:07Z,2021-09-05T19:59:15Z,,CONTRIBUTOR,,"(soft suggestion because I'm not sure I'm using datasette right yet) Over at https://github.com/ctb/2021-sourmash-datasette, I'm playing around with datasette, and I'm creating some static pages to send people to the right facets. There may well be better ways of achieving this end goal, and I will find out if so, I'm sure! But regardless I think it might be neat to support an option to allow `-o/--open` to take a relative URL, that then gets appended to the hostname and port. This would let me improve my documentation. I don't see any downsides, either, but 🤷 there may well be some :) Happy to dig in and provide a PR if it's of interest. I'm not sure off the top of my head how to support an optional value to a parameter in argparse - the current `-o` behavior is kinda nice so it'd be suboptimal to require a url for `-o`. Maybe `--open-url=` or something would work? ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1459/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 988552851,MDU6SXNzdWU5ODg1NTI4NTE=,1456,conda install results in non-functioning `datasette serve` due to out-of-date asgiref,51016,ctb,open,0,,,,,0,2021-09-05T16:59:55Z,2021-09-05T16:59:55Z,,CONTRIBUTOR,,"Over in https://github.com/ctb/2021-sourmash-datasette, I discovered that the following commands fail: ``` conda create -n datasette4 -y datasette=0.58.1 conda activate datasette4 datasette gathertax.db ``` with `ImportError: cannot import name 'WebSocketScope' from 'asgiref.typing'`. This appears to be because asgiref 3.3.4 doesn't have WebSocketScope, but later versions do - a simple ``` pip install asgiref==3.4.1 ``` fixes the problem for me, at least to the point where I can run datasette and poke around as usual. I note that over in the conda-forge recipe, https://github.com/conda-forge/datasette-feedstock/blob/master/recipe/meta.yaml pins asgiref to < 3.4.0, but I'm not sure why - so I'm not sure how to best resolve this issue :).",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1456/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 986829194,MDU6SXNzdWU5ODY4MjkxOTQ=,14,xml.etree.ElementTree.Parse Error - mismatched tag,46968,step21,open,0,,,,,1,2021-09-02T14:46:36Z,2021-09-02T14:53:11Z,,NONE,,"This is an error message I get upon parsing the enex file of my Inbox. Please find the full error message below. Any hints welcome. 
``` Importing from ENEX [##################------------------] 50% 00:00:50 Traceback (most recent call last): File ""/Users/utopist/.virtualenvs/evernote-to-sqlite-Og2PIW3Y/bin/evernote-to-sqlite"", line 8, in sys.exit(cli()) File ""/Users/utopist/.virtualenvs/evernote-to-sqlite-Og2PIW3Y/lib/python3.9/site-packages/click/core.py"", line 1137, in __call__ return self.main(*args, **kwargs) File ""/Users/utopist/.virtualenvs/evernote-to-sqlite-Og2PIW3Y/lib/python3.9/site-packages/click/core.py"", line 1062, in main rv = self.invoke(ctx) File ""/Users/utopist/.virtualenvs/evernote-to-sqlite-Og2PIW3Y/lib/python3.9/site-packages/click/core.py"", line 1668, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/Users/utopist/.virtualenvs/evernote-to-sqlite-Og2PIW3Y/lib/python3.9/site-packages/click/core.py"", line 1404, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/Users/utopist/.virtualenvs/evernote-to-sqlite-Og2PIW3Y/lib/python3.9/site-packages/click/core.py"", line 763, in invoke return __callback(*args, **kwargs) File ""/Users/utopist/.virtualenvs/evernote-to-sqlite-Og2PIW3Y/lib/python3.9/site-packages/evernote_to_sqlite/cli.py"", line 30, in enex for tag, note in find_all_tags(fp, [""note""], progress_callback=bar.update): File ""/Users/utopist/.virtualenvs/evernote-to-sqlite-Og2PIW3Y/lib/python3.9/site-packages/evernote_to_sqlite/utils.py"", line 17, in find_all_tags for event, el in parser.read_events(): File ""/usr/local/Cellar/python@3.9/3.9.6/Frameworks/Python.framework/Versions/3.9/lib/python3.9/xml/etree/ElementTree.py"", line 1329, in read_events raise event File ""/usr/local/Cellar/python@3.9/3.9.6/Frameworks/Python.framework/Versions/3.9/lib/python3.9/xml/etree/ElementTree.py"", line 1301, in feed self._parser.feed(data) xml.etree.ElementTree.ParseError: mismatched tag: line 6837961, column 2 ``` ",303218369,evernote-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/evernote-to-sqlite/issues/14/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 978357984,MDU6SXNzdWU5NzgzNTc5ODQ=,1446,Modify base.html template to support optional sticky footer,9599,simonw,closed,0,,,,,3,2021-08-24T18:11:12Z,2021-08-31T01:54:59Z,2021-08-24T20:32:47Z,OWNER,,"The neatest way to have the footer stick to the bottom of the browser window that I've found is to use the flexbox pattern from https://css-tricks.com/couple-takes-sticky-footer/ ```html
<body>
  <div class=""content"">content</div>
  <footer class=""footer""></footer>
</body>
``` ```css html, body { height: 100%; } body { display: flex; flex-direction: column; } .content { flex: 1 0 auto; } .footer { flex-shrink: 0; } ``` I tried this in a custom plugin but it ended up having to duplicate the entire `base.html` template just to get a wrapper around the not-footer content. I think Datasette's own `base.html` template should have this wrapper element instead.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1446/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 983221851,MDU6SXNzdWU5ODMyMjE4NTE=,34,Data folder as index command parameter,1223625,humrochagf,open,0,,,,,0,2021-08-30T21:29:33Z,2021-08-30T21:29:33Z,,NONE,,"Hi, First of all, thank you for this wonderful project :smile: I started to use dogsheep to make my personal data searchable, and by using the project I noticed an issue with the index command. It always expects you are running it from the root folder from where the data is located, so I got some errors while trying to make it work on my setup. I separate all databases inside a `data` folder (I published my setup to be easier to follow: https://github.com/humrochagf/my-dogsheep) Before, I configured `dogsheep.yml` to add the data folder to its path like this: ```yml data/twitter.db: tweets: sql: |- ... ``` And running the index command like this: ``` dogsheep-beta index data/dogsheep.db dogsheep.yml ``` It worked to the normal search feature with no problem this way, but when I started adding `display_sql` rules the app started to crash, because at datasette `get_database` it was looking for `data/twitter` and it only had a db called `twitter` there. So my workaround to that was to cd into the data folder and run the indexer. You can check the way I'm doing it at this line of the makefile: https://github.com/humrochagf/my-dogsheep/blob/main/makefile#L3 It works but it would be nice to have an option to pass the path where the data is located to the index function.",197431109,dogsheep-beta,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-beta/issues/34/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 982803408,MDU6SXNzdWU5ODI4MDM0MDg=,1454,Feature Request: Publish to IPFS,1560788,blitmap,open,0,,,,,0,2021-08-30T13:36:18Z,2021-08-30T13:36:18Z,,NONE,,"Hello, I am a huge fan of this being used for exploring data. I think it has a lot of flexibility not found in other tools. I'm not sure if what I'm asking for is possible: Can this be extended to publish to IPFS? IPFS is an attractive hosting option for decentralized journalism. Food for thought ~",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1454/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 981676832,MDU6SXNzdWU5ODE2NzY4MzI=,1449,`register_commands()` plugin hook to register extra CLI commands,9599,simonw,closed,0,,,,,13,2021-08-28T00:26:21Z,2021-08-28T01:58:35Z,2021-08-28T01:43:11Z,OWNER,,"The `datasette` CLI tool currently has 7 subcommands: `serve`, `inspect`, `install`, `package`, `plugins`, `publish` and `uninstall`. A plugin hook could allow plugins to register extra subcommands. 
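From a plugin author's perspective it could look something like this (a hypothetical sketch of mine - the real hook name and signature would be up to the eventual design):

```python
import click
from datasette import hookimpl

@hookimpl
def register_commands(cli):
    # `cli` would be the root click group behind the `datasette` command
    @cli.command()
    @click.argument('db_path')
    def verify(db_path):
        'Check that DB_PATH is a valid SQLite database file.'
        click.echo(f'Verifying {db_path}')
```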
I've avoided this for quite a while because I didn't have good use-cases for it - but the existence of the `datasette install xxx` command for installing packages into the correct virtual environment means that actually there's a good reason to do this: it would allow plugins to provide additional command-line mechanisms without the user having to understand how virtual environments work in order to install those commands into the same environment as the rest of Datasette.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1449/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 981681138,MDU6SXNzdWU5ODE2ODExMzg=,1450,"Datasette --help should show something more useful than ""Datasette!""",9599,simonw,closed,0,,,,,1,2021-08-28T00:44:51Z,2021-08-28T00:49:07Z,2021-08-28T00:49:07Z,OWNER,,"https://github.com/simonw/datasette/blob/a1a33bb5822214be1cebd98cd858b2058d91a4aa/datasette/cli.py#L122-L127 _Originally spotted by @simonw in https://github.com/simonw/datasette/issues/1449#issuecomment-907539668_ ``` ~ % datasette --help Usage: datasette [OPTIONS] COMMAND [ARGS]... Datasette! Options: --version Show the version and exit. --help Show this message and exit. ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1450/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 724759588,MDU6SXNzdWU3MjQ3NTk1ODg=,29,Add search highlighting snippets,9599,simonw,open,0,,,,,5,2020-10-19T16:00:48Z,2021-08-26T20:23:11Z,,MEMBER,,Like on https://til.simonwillison.net/til/search?q=Snippet,197431109,dogsheep-beta,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-beta/issues/29/reactions"", ""total_count"": 1, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 1, ""rocket"": 0, ""eyes"": 0}",, 978743426,MDU6SXNzdWU5Nzg3NDM0MjY=,13,xml.etree.ElementTree.ParseError: not well-formed (invalid token),9599,simonw,closed,0,,,,,4,2021-08-25T05:48:21Z,2021-08-26T18:45:13Z,2021-08-26T18:45:13Z,MEMBER,,"Got this error today: ``` (evernote-to-sqlite) /tmp % evernote-to-sqlite enex evernote.db simonwillison\'s\ notebook.enex Importing from ENEX [######------------------------------] 17% Traceback (most recent call last): File ""/Users/simon/.local/bin/evernote-to-sqlite"", line 8, in sys.exit(cli()) File ""/Users/simon/.local/pipx/venvs/evernote-to-sqlite/lib/python3.9/site-packages/click/core.py"", line 1137, in __call__ return self.main(*args, **kwargs) File ""/Users/simon/.local/pipx/venvs/evernote-to-sqlite/lib/python3.9/site-packages/click/core.py"", line 1062, in main rv = self.invoke(ctx) File ""/Users/simon/.local/pipx/venvs/evernote-to-sqlite/lib/python3.9/site-packages/click/core.py"", line 1668, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/Users/simon/.local/pipx/venvs/evernote-to-sqlite/lib/python3.9/site-packages/click/core.py"", line 1404, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/Users/simon/.local/pipx/venvs/evernote-to-sqlite/lib/python3.9/site-packages/click/core.py"", line 763, in invoke return __callback(*args, **kwargs) File ""/Users/simon/.local/pipx/venvs/evernote-to-sqlite/lib/python3.9/site-packages/evernote_to_sqlite/cli.py"", line 31, in enex save_note(db, note) File 
""/Users/simon/.local/pipx/venvs/evernote-to-sqlite/lib/python3.9/site-packages/evernote_to_sqlite/utils.py"", line 36, in save_note content = ET.tostring(ET.fromstring(content_xml)).decode(""utf-8"") File ""/usr/local/Cellar/python@3.9/3.9.6/Frameworks/Python.framework/Versions/3.9/lib/python3.9/xml/etree/ElementTree.py"", line 1347, in XML parser.feed(text) xml.etree.ElementTree.ParseError: not well-formed (invalid token): line 2, column 132 ```",303218369,evernote-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/evernote-to-sqlite/issues/13/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 602585497,MDU6SXNzdWU2MDI1ODU0OTc=,7,Integrate image content hashing,9599,simonw,open,0,,,,,2,2020-04-19T00:36:58Z,2021-08-26T02:01:01Z,,MEMBER,,To spot duplicate images (where the file content differs such that the sha256 is no longer a match) it would be useful to calculate and store perceptual hashes of some sort.,256834907,dogsheep-photos,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-photos/issues/7/reactions"", ""total_count"": 1, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 1, ""rocket"": 0, ""eyes"": 0}",, 642572841,MDU6SXNzdWU2NDI1NzI4NDE=,859,Database page loads too slowly with many large tables (due to table counts),3243482,abdusco,open,0,,,,,21,2020-06-21T14:23:17Z,2021-08-25T21:59:55Z,,CONTRIBUTOR,,"Hey, I have a database that I save in HTML from couple of web scrapers. There are around 200k+, 50+ rows in a couple of tables, with sqlite file weighing around 600MB. The app runs on a VPS with 2 core CPU, 4GB RAM and refreshing database page regularly takes more than 10 seconds. I was suspecting that counting tables was the culprit, but manually running `select count(*) from table_name` for the largest table finishes under a second. I've looked at the source code. There's a check for index page for mutable databases larger than 100MB https://github.com/simonw/datasette/blob/799c5d53570d773203527f19530cf772dc2eeb24/datasette/views/index.py#L15 but this check is not performed for database page. I've manually crippled `Database::table_counts` method ```py async def table_counts(self, limit=10): if not self.is_mutable and self.cached_table_counts is not None: return self.cached_table_counts # Try to get counts for each table, $limit timeout for each count counts = {} for table in await self.table_names(): try: # table_count = ( # await self.execute( # ""select count(*) from [{}]"".format(table), # custom_time_limit=limit, # ) # ).rows[0][0] counts[table] = 10 # table_count # In some cases I saw ""SQL Logic Error"" here in addition to # QueryInterrupted - so we catch that too: except (QueryInterrupted, sqlite3.OperationalError, sqlite3.DatabaseError): counts[table] = None if not self.is_mutable: self.cached_table_counts = counts return counts ``` now the page loads in <100ms. Is it possible to apply size check on database page too?
/-/versions output
{
    ""python"": {
        ""version"": ""3.8.0"",
        ""full"": ""3.8.0 (default, Oct 28 2019, 16:14:01) \n[GCC 8.3.0]""
    },
    ""datasette"": {
        ""version"": ""0.44""
    },
    ""asgi"": ""3.0"",
    ""uvicorn"": ""0.11.5"",
    ""sqlite"": {
        ""version"": ""3.22.0"",
        ""fts_versions"": [
            ""FTS5"",
            ""FTS4"",
            ""FTS3""
        ],
        ""extensions"": {
            ""json1"": null
        },
        ""compile_options"": [
            ""COMPILER=gcc-7.4.0"",
            ""ENABLE_COLUMN_METADATA"",
            ""ENABLE_DBSTAT_VTAB"",
            ""ENABLE_FTS3"",
            ""ENABLE_FTS3_PARENTHESIS"",
            ""ENABLE_FTS3_TOKENIZER"",
            ""ENABLE_FTS4"",
            ""ENABLE_FTS5"",
            ""ENABLE_JSON1"",
            ""ENABLE_LOAD_EXTENSION"",
            ""ENABLE_PREUPDATE_HOOK"",
            ""ENABLE_RTREE"",
            ""ENABLE_SESSION"",
            ""ENABLE_STMTVTAB"",
            ""ENABLE_UNLOCK_NOTIFY"",
            ""ENABLE_UPDATE_DELETE_LIMIT"",
            ""HAVE_ISNAN"",
            ""LIKE_DOESNT_MATCH_BLOBS"",
            ""MAX_SCHEMA_RETRY=25"",
            ""MAX_VARIABLE_NUMBER=250000"",
            ""OMIT_LOOKASIDE"",
            ""SECURE_DELETE"",
            ""SOUNDEX"",
            ""TEMP_STORE=1"",
            ""THREADSAFE=1""
        ]
    }
}
",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/859/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 979627285,MDU6SXNzdWU5Nzk2MjcyODU=,323,`table.convert()` method should clean up after itself,9599,simonw,closed,0,,,,,1,2021-08-25T21:15:39Z,2021-08-25T21:25:26Z,2021-08-25T21:25:18Z,OWNER,,"It currently works like this: https://github.com/simonw/sqlite-utils/blob/77c240df56068341561e95e4a412cbfa24dc5bc7/sqlite_utils/db.py#L2177-L2195 It's registering a function called `convert_value()` and then failing to de-register that function once it has finished. It might even be possible for two queries running against the same connection to clobber each other's `convert_value()` functions, leading to incorrect behaviour. So two fixes: firstly it should register the function with a unique name (maybe add a random suffix). Secondly, it should de-register that function once it has finished.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/323/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 976399638,MDU6SXNzdWU5NzYzOTk2Mzg=,319,[Enhancement] Please allow 'insert-files' to insert content as text.,66709385,pjamargh,closed,0,,,,,10,2021-08-22T15:10:46Z,2021-08-24T23:33:45Z,2021-08-24T23:33:44Z,NONE,,"'insert-files' creates BLOB columns for file contents. Transforming the column to TEXT still keep the content as binary. Even though I'm sure there is a transform that can be applied decoding the text it would be great to have a argument to make 'insert-files' to do it as text (with optional text encoding). The use case is a bunch of htmls (single file) on a directory structure that inserted with this command could be served in Datasette allowing full text search.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/319/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 977323133,MDU6SXNzdWU5NzczMjMxMzM=,1445,Ability to search for text across all columns in a table,9599,simonw,open,0,,,,,5,2021-08-23T18:50:48Z,2021-08-23T19:10:17Z,,OWNER,,"When I'm working with new data I often find myself wanting to run a search for text embedded in ANY of the columns of a table, without having to even fully understand the schema first. I figured out a trick for doing that using a SQL-generated SQL query here: https://til.simonwillison.net/datasette/search-all-columns-trick But maybe this should be a core Datasette feature? Or a plugin?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1445/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 977128935,MDU6SXNzdWU5NzcxMjg5MzU=,21,Duplicate Column,32016596,FabianHertwig,open,0,,,,,1,2021-08-23T15:00:44Z,2021-08-23T17:00:59Z,,NONE,,"Hey, thank you for this repo! When I try to convert my export, I get a multiple column error. 
Here is the stack trace: ```sh (.venv) (base) computer:bodyweight_app user$ healthkit-to-sqlite ./data/Health_export.zip ./data/healthkit.db Importing from HealthKit [###############################-----] 87% 00:00:22 Traceback (most recent call last): File ""/MyProject/.venv/bin/healthkit-to-sqlite"", line 10, in sys.exit(cli()) File ""/MyProject/.venv/lib/python3.7/site-packages/click/core.py"", line 829, in __call__ return self.main(*args, **kwargs) File ""/MyProject/.venv/lib/python3.7/site-packages/click/core.py"", line 782, in main rv = self.invoke(ctx) File ""/MyProject/.venv/lib/python3.7/site-packages/click/core.py"", line 1066, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/MyProject/.venv/lib/python3.7/site-packages/click/core.py"", line 610, in invoke return callback(*args, **kwargs) File ""/MyProject/.venv/lib/python3.7/site-packages/healthkit_to_sqlite/cli.py"", line 57, in cli convert_xml_to_sqlite(fp, db, progress_callback=bar.update, zipfile=zf) File ""/MyProject/.venv/lib/python3.7/site-packages/healthkit_to_sqlite/utils.py"", line 41, in convert_xml_to_sqlite write_records(records, db) File ""/MyProject/.venv/lib/python3.7/site-packages/healthkit_to_sqlite/utils.py"", line 146, in write_records batch_size=50, File ""/MyProject/.venv/lib/python3.7/site-packages/sqlite_utils/db.py"", line 2579, in insert_all extracts=extracts, File ""/MyProject/.venv/lib/python3.7/site-packages/sqlite_utils/db.py"", line 1246, in create extracts=extracts, File ""/MyProject/.venv/lib/python3.7/site-packages/sqlite_utils/db.py"", line 767, in create_table self.execute(sql) File ""/MyProject/.venv/lib/python3.7/site-packages/sqlite_utils/db.py"", line 421, in execute return self.conn.execute(sql) sqlite3.OperationalError: duplicate column name: metadata_Meal ```",197882382,healthkit-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/21/reactions"", ""total_count"": 5, ""+1"": 5, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 976405225,MDU6SXNzdWU5NzY0MDUyMjU=,320,sqlite-utils memory --analyze option,9599,simonw,closed,0,,,,,2,2021-08-22T15:37:10Z,2021-08-22T15:46:56Z,2021-08-22T15:44:29Z,OWNER,,To provide a way of running [analyze-tables](https://sqlite-utils.datasette.io/en/stable/cli.html#analyzing-tables) directly against JSON or CSV data.,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/320/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 657572753,MDU6SXNzdWU2NTc1NzI3NTM=,894,?sort=colname~numeric to sort by by column cast to real,9599,simonw,open,0,,,,,21,2020-07-15T18:47:48Z,2021-08-20T02:07:53Z,,OWNER,,"If a text column actually contains numbers, being able to ""sort by column, treated as numeric"" would be really useful. 
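The underlying SQL is just a cast; a quick standard-library illustration of the difference:

```python
import sqlite3

conn = sqlite3.connect(':memory:')
conn.execute('create table t (name text)')
conn.executemany('insert into t values (?)', [('10',), ('2',), ('1',)])
# Text ordering compares character by character, so '10' sorts before '2'
print([r[0] for r in conn.execute('select name from t order by name')])
# ['1', '10', '2']
print([r[0] for r in conn.execute('select name from t order by cast(name as real)')])
# ['1', '2', '10']
```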
Probably depends on column actions enabled by #690",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/894/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 975166271,MDU6SXNzdWU5NzUxNjYyNzE=,20,Add index on workout_points.date,9599,simonw,open,0,,,,,2,2021-08-20T01:08:04Z,2021-08-20T01:12:48Z,,MEMBER,,"Sorting that by date makes sense for seeing most recent points, and my DB has 2.5m points in so it's an expensive sort!",197882382,healthkit-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/20/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 975158266,MDU6SXNzdWU5NzUxNTgyNjY=,19,table activity_summary has no column named appleMoveTime,9599,simonw,closed,0,,,,,0,2021-08-20T00:46:44Z,2021-08-20T00:54:34Z,2021-08-20T00:54:34Z,MEMBER,,"Got this error today against a fresh export: table activity_summary has no column named appleMoveTime ",197882382,healthkit-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/19/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 907645813,MDU6SXNzdWU5MDc2NDU4MTM=,57,"Error: Use either --since or --since_id, not both",42904,rubenv,closed,0,,,,,6,2021-05-31T18:11:04Z,2021-08-20T00:01:31Z,2021-08-20T00:01:31Z,CONTRIBUTOR,,"I'm using the following command: ``` twitter-to-sqlite user-timeline -a twitter-auth.json twitter/tweets.db --since ``` Which gives the following error: ``` Error: Use either --since or --since_id, not both ``` Running without `--since`. 
``` Traceback (most recent call last): File ""/usr/local/bin/twitter-to-sqlite"", line 8, in sys.exit(cli()) File ""/usr/local/lib/python3.9/site-packages/click/core.py"", line 1137, in __call__ return self.main(*args, **kwargs) File ""/usr/local/lib/python3.9/site-packages/click/core.py"", line 1062, in main rv = self.invoke(ctx) File ""/usr/local/lib/python3.9/site-packages/click/core.py"", line 1668, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/usr/local/lib/python3.9/site-packages/click/core.py"", line 1404, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/usr/local/lib/python3.9/site-packages/click/core.py"", line 763, in invoke return __callback(*args, **kwargs) File ""/usr/local/lib/python3.9/site-packages/twitter_to_sqlite/cli.py"", line 317, in user_timeline for tweet in bar: File ""/usr/local/lib/python3.9/site-packages/click/_termui_impl.py"", line 328, in generator for rv in self.iter: File ""/usr/local/lib/python3.9/site-packages/twitter_to_sqlite/utils.py"", line 234, in fetch_user_timeline yield from fetch_timeline( File ""/usr/local/lib/python3.9/site-packages/twitter_to_sqlite/utils.py"", line 202, in fetch_timeline raise Exception(str(tweets[""errors""])) Exception: [{'code': 44, 'message': 'since_id parameter is invalid.'}] ``` ``` Python 3.9.5 twitter-to-sqlite, version 0.21.3 ```",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/57/reactions"", ""total_count"": 4, ""+1"": 4, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 974995592,MDU6SXNzdWU5NzQ5OTU1OTI=,1443,datasette.databases should be a documented property,9599,simonw,closed,0,,,,,1,2021-08-19T19:53:04Z,2021-08-19T21:25:07Z,2021-08-19T21:23:47Z,OWNER,,"https://github.com/simonw/datasette/blob/adb5b70de5cec3c3dd37184defe606a082c232cf/datasette/app.py#L231 I want to use it in https://github.com/simonw/datasette-block-robots/issues/5",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1443/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 974987856,MDU6SXNzdWU5NzQ5ODc4NTY=,1442,Mechanism to cause specific branches to deploy their own demos,9599,simonw,closed,0,,,,,6,2021-08-19T19:41:39Z,2021-08-19T21:11:45Z,2021-08-19T21:09:40Z,OWNER,,"A useful capability would be if it was super-easy to say ""any pushes to branch X should be deployed to `latest-X.datasette.io`"". 
I'd like to use this for the column query information work in #1434",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1442/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 913135723,MDU6SXNzdWU5MTMxMzU3MjM=,266,"Add some types, enforce with mypy",9599,simonw,closed,0,,,,,3,2021-06-07T06:05:56Z,2021-08-18T22:25:38Z,2021-08-18T22:25:38Z,OWNER,,"A good starting point would be adding type information to the members of these named tuples and the introspection methods that return them: https://github.com/simonw/sqlite-utils/blob/9dff7a38831d471b1dff16d40d89eb5c3b4e84d6/sqlite_utils/db.py#L51-L75",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/266/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 965210966,MDU6SXNzdWU5NjUyMTA5NjY=,314,Type signatures for `.create_table()` and `.create_table_sql()` and `.create()` and `Table.__init__`,9599,simonw,closed,0,,,,,2,2021-08-10T18:03:59Z,2021-08-18T22:25:21Z,2021-08-18T22:25:21Z,OWNER,,"> Adding type signatures to `create_table()` and `.create_table_sql()` is a bit too involved, I'll do that in a separate issue. _Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/312#issuecomment-896200682_",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/314/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 465815372,MDU6SXNzdWU0NjU4MTUzNzI=,37,Experiment with type hints,9599,simonw,closed,0,,,,,6,2019-07-09T14:30:34Z,2021-08-18T21:48:57Z,2021-08-18T21:48:57Z,OWNER,,"Since it's designed to be used in Jupyter or for rapid prototyping in an IDE (and it's still pretty small) `sqlite-utils` feels like a great candidate for me to finally try out Python type hints. https://veekaybee.github.io/2019/07/08/python-type-hints/ is good. It suggests the mypy docs for getting started: https://mypy.readthedocs.io/en/latest/existing_code.html plus this tutorial: https://pymbook.readthedocs.io/en/latest/typehinting.html",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/37/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 931752773,MDU6SXNzdWU5MzE3NTI3NzM=,294,Add a `sqlite-utils memory` example to the README,9599,simonw,closed,0,,,,,0,2021-06-28T16:35:59Z,2021-08-18T21:40:03Z,2021-08-18T21:40:03Z,OWNER,,,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/294/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 974067156,MDU6SXNzdWU5NzQwNjcxNTY=,318,Research: handle gzipped CSV directly,9599,simonw,open,0,,,,,2,2021-08-18T21:23:04Z,2021-08-18T21:25:30Z,,OWNER,,"Would it be worthwhile for the `sqlite-utils` command-line tool to grow features to efficiently directly interact with gzipped CSV data? 
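The mechanics are cheap in Python - a sketch of the insert side (my example: the file and table names are made up, and no `--gz` option exists yet):

```python
import csv
import gzip
import sqlite_utils

db = sqlite_utils.Database('data.db')
# Stream rows straight out of the gzipped file, no temporary decompressed copy
with gzip.open('rows.csv.gz', 'rt', newline='') as f:
    db['rows'].insert_all(csv.DictReader(f))
```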
Maybe add `--gz` options to both `insert` and to the various commands that output query results.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/318/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 934123448,MDU6SXNzdWU5MzQxMjM0NDg=,295,Insert with --tsv and --no-headers give error about --nl arguments,7288187,davidscotson,closed,0,,,,,1,2021-06-30T21:01:01Z,2021-08-18T20:19:04Z,2021-08-18T20:18:57Z,NONE,,"Not quite sure if this is a bug, or just an assumption I made but I thought `--tsv` and `--no-headers` would work together when inserting from a file, and currently they seem not to (sqlite-utils, version 3.12, installed on Mac OS X via brew) Instead it says: `Error: Use just one of --nl, --csv or --tsv` As if it has interpreted the --no-headers as --nl. The --help does specifically say CSV: `--no-headers CSV file has no header row` And this heading in the documentation also only refers to CSV, but the text does mention TSV in passing, and I'd generally expect them to behave the same in most cases. https://sqlite-utils.datasette.io/en/stable/cli.html#csv-files-without-a-header-row",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/295/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 944326512,MDU6SXNzdWU5NDQzMjY1MTI=,296,"`table.search(..., quote=True)` parameter and `sqlite-utils search --quote` option",32427188,deafmute1,closed,0,,,,,6,2021-07-14T11:26:47Z,2021-08-18T20:13:12Z,2021-08-18T20:10:48Z,NONE,,"Hi, Recently got this error: ``` Traceback (most recent call last): File """", line 1, in File ""/home/ethan/git/music-metadata-indexer/src/mmindexer/__init__.py"", line 38, in start(""/home/ethan/git/music-metadata-indexer/sample"", ""/home/ethan/git/music-metadata-indexer/test.db"") File ""/home/ethan/git/music-metadata-indexer/src/mmindexer/__init__.py"", line 23, in start scanner.build_database() File ""/home/ethan/git/music-metadata-indexer/src/mmindexer/scan.py"", line 79, in build_database _import_song(self.db, Path(dirpath).joinpath(f), self.logger) File ""/home/ethan/git/music-metadata-indexer/src/mmindexer/scan.py"", line 23, in _import_song db.add_song(filepath) File ""/home/ethan/git/music-metadata-indexer/src/mmindexer/index.py"", line 166, in add_song for match in self.search(""albums"", album): File ""/home/ethan/git/music-metadata-indexer/env/lib/python3.9/site-packages/sqlite_utils/db.py"", line 1625, in search cursor = self.db.execute( File ""/home/ethan/git/music-metadata-indexer/env/lib/python3.9/site-packages/sqlite_utils/db.py"", line 243, in execute return self.conn.execute(sql, parameters) sqlite3.OperationalError: fts5: syntax error near ""."" ``` So, the error seems to suggest there was a ""."" character somewhere in the SQL command that was causing the error. I did a little digging and found this in the docs: https://www.sqlite.org/fts5.html#fts5_strings. ""."" is one of the many prohibited characters. My solution was to just strip these out of the query using this line `query = query.translate({e: None for e in itertools.chain(range(0,26), range(27, 48), range(58,65), range(91,95), [96], range(123,128))})` Perhaps this could be included into the `table.search()` function? 
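For illustration, a minimal sketch of that kind of escaping (the function name is invented): FTS5 reads a double-quoted string, with embedded quotes doubled, as a literal, so quoting every term avoids the syntax errors at the cost of disabling the operator syntax: ```python def quote_fts_query(query): # Quote each whitespace-separated term as an FTS5 string literal, # doubling any embedded double quotes. return "" "".join( '""{}""'.format(term.replace('""', '""""')) for term in query.split() ) # e.g. in the add_song() code from the traceback above: for match in db[""albums""].search(quote_fts_query(album), columns=[""id""]): print(match) ```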
",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/296/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 831751367,MDU6SXNzdWU4MzE3NTEzNjc=,246,Escaping FTS search strings,16001974,DeNeutoy,closed,0,,,,,4,2021-03-15T12:15:09Z,2021-08-18T18:57:13Z,2021-08-18T18:43:12Z,CONTRIBUTOR,," Thanks for the excellent library, it's very nice to use! I've been building some in memory search functionality for a data annotation tool i'm making, and I got tripped up a little bit with escaping the full text search queries. First I tried using `db.quote(q)`, which doesn't work, because sqlite FTS has it's own (separate)[ query syntax](https://www2.sqlite.org/fts5.html#full_text_query_syntax). You can see this happening here also: http://search-24ways.herokuapp.com/24ways-f8f455f/articles?_search=acces%2A I got around this by aggressively escaping quotes inside the query string like this: ```python quoted = q.replace('""', '""""') quoted = f'""{quoted}""' print(quoted) results = db[""data""].search(quoted, columns=[""id""]) return [x[""id""] for x in results] ``` This works in the sense it doesn't crash, but it also removes access to the search query syntax. Given the well specified definition, it might be possible for sqlite-utils to provide a `db.quote_query(q)` which would intelligently escape a query whilst leaving the syntax intact. This would be very nice! ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/246/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 972827346,MDU6SXNzdWU5NzI4MjczNDY=,317,Link to a better example on docs index,9599,simonw,closed,0,,,,,1,2021-08-17T15:43:40Z,2021-08-18T18:31:43Z,2021-08-18T18:31:43Z,OWNER,,https://github.com/simonw/sqlite-utils/blob/7a19822ac9ee24be2fbb4c2326a0bf2f3d2d9c4d/docs/index.rst#L39 Is a very old example,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/317/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 972918533,MDU6SXNzdWU5NzI5MTg1MzM=,1438,Query page .csv and .json links are not correctly URL-encoded on Vercel under unknown specific conditions,9599,simonw,open,0,,,,,7,2021-08-17T17:35:36Z,2021-08-18T00:22:23Z,,OWNER,,"> Confirmed: https://thesession.vercel.app/thesession?sql=select+*+from+tunes+where+name+like+%22%25wise+maid%25%22%0D%0A is a page where the URL correctly encoded `%` as `%25` - but then in the HTML on that page that links to the CSV and JSON versions we get this: > > ```html >
> This data as json, CSV
> ``` Those CSV and JSON links are incorrect. _Originally posted by @simonw in https://github.com/simonw/datasette-publish-vercel/issues/48#issuecomment-900497579_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1438/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 268469569,MDU6SXNzdWUyNjg0Njk1Njk=,39,Protect against malicious SQL that causes damage even though our DB is immutable,9599,simonw,closed,0,,,2857392,Ship first public release,4,2017-10-25T16:44:27Z,2021-08-17T23:52:07Z,2017-11-05T02:53:47Z,OWNER,,"I’m currently operating under the assumption that it’s safe to allow arbitrary SQL statements because we are dealing with an immutable database. But this might not be the case - there are some pretty weird SQLite language extensions (ATTACH, PRAGMA etc) and I’m not certain they cannot be used to break things in a way that would affect future requests to the API. Solution: provide a “safe mode” option which disables the ?sql= mechanism. This still leaves the URL filter lookups, so I need to make sure that those are “safe”. In the future I may also implement a whitelist option where datasets can be configured to only allow specific filters against specific columns.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/39/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 970320615,MDU6SXNzdWU5NzAzMjA2MTU=,316,Fix visible backticks on reference page,9599,simonw,closed,0,,,,,1,2021-08-13T11:37:46Z,2021-08-14T05:12:23Z,2021-08-14T05:10:48Z,OWNER,,"https://sqlite-utils.datasette.io/en/latest/reference.html Search for backtick to reveal various minor markup bugs.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/316/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 970626625,MDU6SXNzdWU5NzA2MjY2MjU=,1435,Turn off suggest facets on tables with large numbers of columns,9599,simonw,open,0,,,,,0,2021-08-13T18:30:48Z,2021-08-13T18:30:48Z,,OWNER,,If a table has 200 columns it will take multiple seconds to try and suggest facets. I should either quit after the first 20 or not suggest facets at all.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1435/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 969548935,MDU6SXNzdWU5Njk1NDg5MzU=,1429,UI for setting `?_size=max` on table page,9599,simonw,open,0,,,,,2,2021-08-12T20:52:09Z,2021-08-13T04:37:41Z,,OWNER,,"It defaults to 100 per page, but you can increase that to 1000 per page using `?_size=max` (or higher if `max_returned_rows` is set higher than that). But... that's only available to people who know how to hack URLs. 
Solution: add a link that sets that option to the pagination block at the bottom of the table: ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1429/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 969840302,MDU6SXNzdWU5Njk4NDAzMDI=,1431,`--help-config` should be called `--help-settings`,9599,simonw,closed,0,,,,,1,2021-08-13T00:46:48Z,2021-08-13T01:01:58Z,2021-08-13T01:01:58Z,OWNER,,Follow-on from #1105 rebranding exercise.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1431/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 722816436,MDU6SXNzdWU3MjI4MTY0MzY=,186,.extract() shouldn't extract null values,9599,simonw,open,0,,,,,7,2020-10-16T02:41:08Z,2021-08-12T12:32:14Z,,OWNER,,"This almost works, but it creates a rogue `type` record with a value of None. ``` In [1]: import sqlite_utils In [2]: db = sqlite_utils.Database(memory=True) In [5]: db[""creatures""].insert_all([ {""id"": 1, ""name"": ""Simon"", ""type"": None}, {""id"": 2, ""name"": ""Natalie"", ""type"": None}, {""id"": 3, ""name"": ""Cleo"", ""type"": ""dog""}], pk=""id"") Out[5]: <Table creatures (id, name, type)>
In [7]: db[""creatures""].extract(""type"") Out[7]: <Table creatures (id, name, type_id)>
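# .extract() replaced the type column with a type_id foreign key and created a # new type lookup table - but note in Out[10] below that the lookup table also # gained a rogue row for None.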
In [8]: list(db[""creatures""].rows) Out[8]: [{'id': 1, 'name': 'Simon', 'type_id': None}, {'id': 2, 'name': 'Natalie', 'type_id': None}, {'id': 3, 'name': 'Cleo', 'type_id': 2}] In [9]: db[""type""] Out[9]: <Table type (id, type)>
In [10]: list(db[""type""].rows) Out[10]: [{'id': 1, 'type': None}, {'id': 2, 'type': 'dog'}] ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/186/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,