html_url,issue_url,id,node_id,user,user_label,created_at,updated_at,author_association,body,reactions,issue,issue_label,performed_via_github_app https://github.com/simonw/sqlite-utils/issues/267#issuecomment-866184260,https://api.github.com/repos/simonw/sqlite-utils/issues/267,866184260,MDEyOklzc3VlQ29tbWVudDg2NjE4NDI2MA==,9599,simonw,2021-06-22T17:26:18Z,2021-06-22T17:27:27Z,OWNER,"If an`.update()` method doesn't work because it collides with an existing dictionary method a `.pk` property could still be nice: ```python for row in db[""sometable""].rows: db[""sometable""].update(row.pk, {""modified"": 1}) ``` ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",915421499,row.update() or row.pk, https://github.com/simonw/sqlite-utils/issues/267#issuecomment-866182655,https://api.github.com/repos/simonw/sqlite-utils/issues/267,866182655,MDEyOklzc3VlQ29tbWVudDg2NjE4MjY1NQ==,9599,simonw,2021-06-22T17:24:03Z,2021-06-22T17:24:03Z,OWNER,"I'm re-opening this as a research task because it may be possible to cleanly implement this using a `dict` subclass - some notes on that here: https://treyhunner.com/2019/04/why-you-shouldnt-inherit-from-list-and-dict-in-python/ Since this would just be for adding methods (and maybe a property for returning the primary keys for a row) the usual disadvantages of subclassing `dict` described in that article shouldn't apply. One catch: dictionaries already have a `.update()` method! So would have to pick another name.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",915421499,row.update() or row.pk, https://github.com/simonw/sqlite-utils/issues/290#issuecomment-865510796,https://api.github.com/repos/simonw/sqlite-utils/issues/290,865510796,MDEyOklzc3VlQ29tbWVudDg2NTUxMDc5Ng==,9599,simonw,2021-06-22T04:04:40Z,2021-06-22T04:04:48Z,OWNER,"Still needs documentation, which will involve rewriting the whole [Executing queries](https://sqlite-utils.datasette.io/en/3.11/python-api.html#executing-queries) section.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",926777310,`db.query()` method (renamed `db.execute_returning_dicts()`), https://github.com/simonw/sqlite-utils/issues/290#issuecomment-865497846,https://api.github.com/repos/simonw/sqlite-utils/issues/290,865497846,MDEyOklzc3VlQ29tbWVudDg2NTQ5Nzg0Ng==,9599,simonw,2021-06-22T03:21:38Z,2021-06-22T03:21:38Z,OWNER,"The Python docs say: https://docs.python.org/3/library/sqlite3.html > To retrieve data after executing a SELECT statement, you can either treat the cursor as an iterator, call the cursor’s `fetchone()` method to retrieve a single matching row, or call `fetchall()` to get a list of the matching rows. 
Looking at the C source code, both `fetchmany()` and `fetchall()` work under the hood by assembling a Python list: https://github.com/python/cpython/blob/be1cb3214d09d4bf0288bc45f3c1f167f67e4514/Modules/_sqlite/cursor.c#L907-L972 - see calls to `PyList_Append()` So it looks like the most efficient way to iterate over a cursor may well be `for row in cursor:` - which I think calls this C function: https://github.com/python/cpython/blob/be1cb3214d09d4bf0288bc45f3c1f167f67e4514/Modules/_sqlite/cursor.c#L813-L876","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",926777310,`db.query()` method (renamed `db.execute_returning_dicts()`), https://github.com/simonw/sqlite-utils/issues/290#issuecomment-865495370,https://api.github.com/repos/simonw/sqlite-utils/issues/290,865495370,MDEyOklzc3VlQ29tbWVudDg2NTQ5NTM3MA==,9599,simonw,2021-06-22T03:14:30Z,2021-06-22T03:14:30Z,OWNER,"One small problem with the existing method: https://github.com/simonw/sqlite-utils/blob/8cedc6a8b29180e68326f6b76f249d5e39e4b591/sqlite_utils/db.py#L362-L365 It returns a full list, but what if the user would rather have a generator they can iterate over without loading the results into memory in one go?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",926777310,`db.query()` method (renamed `db.execute_returning_dicts()`), https://github.com/simonw/sqlite-utils/issues/290#issuecomment-865491922,https://api.github.com/repos/simonw/sqlite-utils/issues/290,865491922,MDEyOklzc3VlQ29tbWVudDg2NTQ5MTkyMg==,9599,simonw,2021-06-22T03:05:35Z,2021-06-22T03:05:35Z,OWNER,"Potential names: - `db.query(sql)` - it's weird to have both this and `db.execute()` but it is at least short and memorable - `db.sql(sql)` - `db.execute_d(sql)` - ugly - `db.execute_dicts(sql)` - confusing - `db.execute_sql(sql)` - easily confused with `db.execute(sql)` I think `db.query(sql)` may be the best option here.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",926777310,`db.query()` method (renamed `db.execute_returning_dicts()`), https://github.com/simonw/datasette/pull/1368#issuecomment-865160132,https://api.github.com/repos/simonw/datasette/issues/1368,865160132,MDEyOklzc3VlQ29tbWVudDg2NTE2MDEzMg==,9599,simonw,2021-06-21T16:07:06Z,2021-06-21T16:08:48Z,OWNER,"A few tests failed - Black, the test that checks the docs mention the new hook - the most interesting failing test looks like this one: ``` updated_metadata[""databases""][""fixtures""][""queries""][""magic_parameters""][ ""allow"" ] = (allow if ""query"" in permissions else deny) > cascade_app_client.ds._metadata = updated_metadata E AttributeError: can't set attribute ``` From https://github.com/simonw/datasette/blob/0a7621f96f8ad14da17e7172e8a7bce24ef78966/tests/test_permissions.py#L439-L467 This test is directly manipulating `_metadata` purely for the purposes of simulating different permissions - I think updating it to manipulate `_local_metadata` instead would fix that.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",913865304,DRAFT: A new plugin hook for dynamic metadata, 
https://github.com/simonw/sqlite-utils/issues/289#issuecomment-864609271,https://api.github.com/repos/simonw/sqlite-utils/issues/289,864609271,MDEyOklzc3VlQ29tbWVudDg2NDYwOTI3MQ==,9599,simonw,2021-06-20T20:42:07Z,2021-06-20T20:42:07Z,OWNER,"Wow, thank you! I didn't know about `typing.cast()`.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",925677191,Mypy fixes for rows_from_file(), https://github.com/simonw/sqlite-utils/issues/286#issuecomment-864594956,https://api.github.com/repos/simonw/sqlite-utils/issues/286,864594956,MDEyOklzc3VlQ29tbWVudDg2NDU5NDk1Ng==,9599,simonw,2021-06-20T18:38:05Z,2021-06-20T18:38:05Z,OWNER,3.10 is out in Homebrew now (they turn that around so fast): https://formulae.brew.sh/formula/sqlite-utils,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",925487946,Add installation instructions, https://github.com/simonw/datasette/issues/1382#issuecomment-864480051,https://api.github.com/repos/simonw/datasette/issues/1382,864480051,MDEyOklzc3VlQ29tbWVudDg2NDQ4MDA1MQ==,9599,simonw,2021-06-20T00:20:06Z,2021-06-20T00:21:02Z,OWNER,"Yes you can - thanks for pointing this out, I've added a comment to the `install.sh` script in the `datasette-csvs` Glitch project: ```bash pip3 install -U --no-cache-dir -r requirements.txt --user && \ mkdir -p .data && \ rm .data/data.db || true && \ for f in *.csv do # Add --encoding=latin-1 to the following if your CSVs use a different encoding: sqlite-utils insert .data/data.db ${f%.*} $f --csv done ``` So if you edit that file in your own project and change the line to this: sqlite-utils insert .data/data.db ${f%.*} $f --csv --encoding=iso-8859-1 It should fix this for you.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",925406964,Datasette with Glitch - is it possible to use CSV with ISO-8859-1 encoding?, https://github.com/simonw/sqlite-utils/issues/272#issuecomment-864476167,https://api.github.com/repos/simonw/sqlite-utils/issues/272,864476167,MDEyOklzc3VlQ29tbWVudDg2NDQ3NjE2Nw==,9599,simonw,2021-06-19T23:36:48Z,2021-06-19T23:36:48Z,OWNER,Wrote this up on my blog here: https://simonwillison.net/2021/Jun/19/sqlite-utils-memory/ - with a video demo here: https://www.youtube.com/watch?v=OUjd0rkc678,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",921878733,"Idea: import CSV to memory, run SQL, export in a single command", https://github.com/simonw/sqlite-utils/issues/284#issuecomment-864419283,https://api.github.com/repos/simonw/sqlite-utils/issues/284,864419283,MDEyOklzc3VlQ29tbWVudDg2NDQxOTI4Mw==,9599,simonw,2021-06-19T15:15:34Z,2021-06-19T15:15:34Z,OWNER,"I think this code is at fault: https://github.com/simonw/sqlite-utils/blob/5b257949d996fe43dc5d218d4308b88796a90740/sqlite_utils/db.py#L1017-L1023 It's using `.pks` which adds `rowid` if it's missing.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",925320167,.transform(types=) turns rowid into a concrete column, https://github.com/simonw/sqlite-utils/issues/285#issuecomment-864418795,https://api.github.com/repos/simonw/sqlite-utils/issues/285,864418795,MDEyOklzc3VlQ29tbWVudDg2NDQxODc5NQ==,9599,simonw,2021-06-19T15:11:05Z,2021-06-19T15:11:14Z,OWNER,"Actually I'm going to go 
with `use_rowid` instead - because the table doesn't inherently use a rowid itself, but you should use one if you want to query it in a way that gives you back a primary key.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",925410305,Introspection property for telling if a table is a rowid table, https://github.com/simonw/sqlite-utils/issues/285#issuecomment-864418188,https://api.github.com/repos/simonw/sqlite-utils/issues/285,864418188,MDEyOklzc3VlQ29tbWVudDg2NDQxODE4OA==,9599,simonw,2021-06-19T15:05:53Z,2021-06-19T15:05:53Z,OWNER,"```python @property def uses_rowid(self): return not any(column for column in self.columns if column.is_pk) ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",925410305,Introspection property for telling if a table is a rowid table, https://github.com/simonw/sqlite-utils/issues/285#issuecomment-864417808,https://api.github.com/repos/simonw/sqlite-utils/issues/285,864417808,MDEyOklzc3VlQ29tbWVudDg2NDQxNzgwOA==,9599,simonw,2021-06-19T15:03:00Z,2021-06-19T15:03:00Z,OWNER,I think I like `table.uses_rowid` best - it reads well.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",925410305,Introspection property for telling if a table is a rowid table, https://github.com/simonw/sqlite-utils/issues/285#issuecomment-864417765,https://api.github.com/repos/simonw/sqlite-utils/issues/285,864417765,MDEyOklzc3VlQ29tbWVudDg2NDQxNzc2NQ==,9599,simonw,2021-06-19T15:02:42Z,2021-06-19T15:02:42Z,OWNER,"Some options: - `table.rowid_only` - `table.rowid_as_pk` - `table.no_pks` - `table.no_pk` - `table.uses_rowid` - `table.use_rowid`","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",925410305,Introspection property for telling if a table is a rowid table, https://github.com/simonw/sqlite-utils/issues/285#issuecomment-864417493,https://api.github.com/repos/simonw/sqlite-utils/issues/285,864417493,MDEyOklzc3VlQ29tbWVudDg2NDQxNzQ5Mw==,9599,simonw,2021-06-19T15:00:43Z,2021-06-19T15:00:43Z,OWNER,"I have to be careful about the language I use here. Here's the official definition: https://www.sqlite.org/rowidtable.html > A ""rowid table"" is any table in an SQLite schema that > > - is *not* a [virtual table](https://www.sqlite.org/vtab.html), and > - is *not* a [WITHOUT ROWID](https://www.sqlite.org/withoutrowid.html) table. > > Most tables in a typical SQLite database schema are rowid tables. > > Rowid tables are distinguished by the fact that they all have a unique, non-NULL, signed 64-bit integer [rowid](https://www.sqlite.org/lang_createtable.html#rowid) that is used as the access key for the data in the underlying [B-tree](https://www.sqlite.org/fileformat2.html#btree) storage engine. So it's not correct to call a table a ""rowid table"" only if it is missing its own primary keys. Maybe `table.has_rowid` is the right language to use here? 
No, that's no good - because tables with their own primary keys usually also have a rowid.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",925410305,Introspection property for telling if a table is a rowid table, https://github.com/simonw/sqlite-utils/issues/285#issuecomment-864417133,https://api.github.com/repos/simonw/sqlite-utils/issues/285,864417133,MDEyOklzc3VlQ29tbWVudDg2NDQxNzEzMw==,9599,simonw,2021-06-19T14:57:36Z,2021-06-19T14:57:36Z,OWNER,"So the logic is: ```python [column.name for column in self.columns if column.is_pk] ``` I need to decide on a property name. Existing names are documented here: https://sqlite-utils.datasette.io/en/stable/python-api.html#introspection","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",925410305,Introspection property for telling if a table is a rowid table, https://github.com/simonw/sqlite-utils/issues/285#issuecomment-864417031,https://api.github.com/repos/simonw/sqlite-utils/issues/285,864417031,MDEyOklzc3VlQ29tbWVudDg2NDQxNzAzMQ==,9599,simonw,2021-06-19T14:56:45Z,2021-06-19T14:56:45Z,OWNER,"```pycon >>> db = sqlite_utils.Database(memory=True) >>> db[""rowid_table""].insert({""name"": ""Cleo""}) >>> db[""regular_table""].insert({""id"": 1, ""name"": ""Cleo""}, pk=""id"")
>>> db[""rowid_table""].pks ['rowid'] >>> db[""regular_table""].pks ['id'] ``` But that's because the `.pks` property hides the difference: https://github.com/simonw/sqlite-utils/blob/dc94f4bb8cfe922bb2f9c89f8f0f29092ea63133/sqlite_utils/db.py#L805-L810 ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",925410305,Introspection property for telling if a table is a rowid table, https://github.com/simonw/sqlite-utils/issues/284#issuecomment-864416911,https://api.github.com/repos/simonw/sqlite-utils/issues/284,864416911,MDEyOklzc3VlQ29tbWVudDg2NDQxNjkxMQ==,9599,simonw,2021-06-19T14:55:45Z,2021-06-19T14:55:45Z,OWNER,"https://github.com/simonw/sqlite-utils/blob/dc94f4bb8cfe922bb2f9c89f8f0f29092ea63133/sqlite_utils/db.py#L805-L810 So I can indeed detect a `rowid` table by looking for no `is_pk` columns.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",925320167,.transform(types=) turns rowid into a concrete column, https://github.com/simonw/sqlite-utils/issues/284#issuecomment-864416785,https://api.github.com/repos/simonw/sqlite-utils/issues/284,864416785,MDEyOklzc3VlQ29tbWVudDg2NDQxNjc4NQ==,9599,simonw,2021-06-19T14:54:41Z,2021-06-19T14:54:41Z,OWNER,"```pycon >>> db = sqlite_utils.Database(memory=True) >>> db[""rowid_table""].insert({""name"": ""Cleo""})
>>> db[""regular_table""].insert({""id"": 1, ""name"": ""Cleo""}, pk=""id"")
>>> db[""rowid_table""].pks ['rowid'] >>> db[""regular_table""].pks ['id'] ``` I think I need an introspection property for working out if a table is a `rowid` table or not.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",925320167,.transform(types=) turns rowid into a concrete column, https://github.com/simonw/sqlite-utils/issues/283#issuecomment-864416086,https://api.github.com/repos/simonw/sqlite-utils/issues/283,864416086,MDEyOklzc3VlQ29tbWVudDg2NDQxNjA4Ng==,9599,simonw,2021-06-19T14:49:06Z,2021-06-19T14:49:13Z,OWNER,"Once again, this is difficult because of the use of a generator here - `rows_from_file()` only yields rows, so there is no obvious mechanism for it to communicate back to the wrapping code that the detected format was CSV or TSV as opposed to JSON. I'm going to change `rows_from_file()` to return a `(generator, detected_format)` tuple.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",925319214,memory: Shouldn't detect types for JSON, https://github.com/simonw/sqlite-utils/issues/284#issuecomment-864358951,https://api.github.com/repos/simonw/sqlite-utils/issues/284,864358951,MDEyOklzc3VlQ29tbWVudDg2NDM1ODk1MQ==,9599,simonw,2021-06-19T05:30:00Z,2021-06-19T05:30:00Z,OWNER,If this can be fixed it will be in the `transform_sql()` method.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",925320167,.transform(types=) turns rowid into a concrete column, https://github.com/simonw/sqlite-utils/issues/284#issuecomment-864358680,https://api.github.com/repos/simonw/sqlite-utils/issues/284,864358680,MDEyOklzc3VlQ29tbWVudDg2NDM1ODY4MA==,9599,simonw,2021-06-19T05:27:13Z,2021-06-19T05:27:13Z,OWNER,How easy is it to detect a `rowid` table? Is it as simple as `.pks` returning `None`? 
If so the documentation should mention that.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",925320167,.transform(types=) turns rowid into a concrete column, https://github.com/simonw/sqlite-utils/issues/282#issuecomment-864354627,https://api.github.com/repos/simonw/sqlite-utils/issues/282,864354627,MDEyOklzc3VlQ29tbWVudDg2NDM1NDYyNw==,9599,simonw,2021-06-19T04:42:03Z,2021-06-19T04:42:03Z,OWNER,"Demo: curl -s 'https://api.github.com/users/simonw/repos?per_page=100' | \ sqlite-utils memory - 'select sum(size), sum(stargazers_count) from stdin limit 1' [{""sum(size)"": 2042547, ""sum(stargazers_count)"": 6769}] ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",925305186,Automatic type detection for CSV data, https://github.com/simonw/sqlite-utils/issues/282#issuecomment-864350407,https://api.github.com/repos/simonw/sqlite-utils/issues/282,864350407,MDEyOklzc3VlQ29tbWVudDg2NDM1MDQwNw==,9599,simonw,2021-06-19T03:52:20Z,2021-06-19T03:52:20Z,OWNER,I'll have an environment variable for `--detect-types` so users who really want that as the default option can turn it on.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",925305186,Automatic type detection for CSV data, https://github.com/simonw/sqlite-utils/issues/282#issuecomment-864349123,https://api.github.com/repos/simonw/sqlite-utils/issues/282,864349123,MDEyOklzc3VlQ29tbWVudDg2NDM0OTEyMw==,9599,simonw,2021-06-19T03:36:54Z,2021-06-19T03:36:54Z,OWNER,"I may change the default for `sqlite-utils insert` to detect types if I release `sqlite-utils` 4.0, as a backwards-incompatible change.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",925305186,Automatic type detection for CSV data, https://github.com/simonw/sqlite-utils/issues/179#issuecomment-864349066,https://api.github.com/repos/simonw/sqlite-utils/issues/179,864349066,MDEyOklzc3VlQ29tbWVudDg2NDM0OTA2Ng==,9599,simonw,2021-06-19T03:36:04Z,2021-06-19T03:36:04Z,OWNER,This work is going to happen in #282.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",709577625,sqlite-utils transform/insert --detect-types, https://github.com/simonw/sqlite-utils/issues/282#issuecomment-864348954,https://api.github.com/repos/simonw/sqlite-utils/issues/282,864348954,MDEyOklzc3VlQ29tbWVudDg2NDM0ODk1NA==,9599,simonw,2021-06-19T03:34:42Z,2021-06-19T03:35:46Z,OWNER,"I built some prototype code here for something which looks at every row in a CSV import and records the likely types: https://gist.github.com/simonw/465f9356f175d1cf86957947dff501d4 This could be used by the command-line tools to figure out what `table.transform(types=...)` method to use at the end. 
This is a different approach to the pure SQL version I tried building in https://github.com/simonw/sqlite-utils/issues/179 - I think this is a better approach though, it's less prone to weird idiosyncrasies of SQLite types, and it's also easy for us to add on to the existing CSV import code in a way that won't require scanning the data twice.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",925305186,Automatic type detection for CSV data, https://github.com/simonw/sqlite-utils/issues/279#issuecomment-864330508,https://api.github.com/repos/simonw/sqlite-utils/issues/279,864330508,MDEyOklzc3VlQ29tbWVudDg2NDMzMDUwOA==,9599,simonw,2021-06-19T00:34:24Z,2021-06-19T00:34:24Z,OWNER,"Got this working: % curl 'https://api.github.com/repos/simonw/datasette/issues' | sqlite-utils memory - 'select id from stdin' ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",924990677,sqlite-utils memory should handle TSV and JSON in addition to CSV, https://github.com/simonw/sqlite-utils/issues/279#issuecomment-864328927,https://api.github.com/repos/simonw/sqlite-utils/issues/279,864328927,MDEyOklzc3VlQ29tbWVudDg2NDMyODkyNw==,9599,simonw,2021-06-19T00:25:08Z,2021-06-19T00:25:17Z,OWNER,"I tried writing this function with type hints, but eventually gave up: ```python def rows_from_file( fp: BinaryIO, format: Optional[Format] = None, dialect: Optional[Type[csv.Dialect]] = None, encoding: Optional[str] = None, ) -> Generator[dict, None, None]: if format == Format.JSON: decoded = json.load(fp) if isinstance(decoded, dict): decoded = [decoded] if not isinstance(decoded, list): raise RowsFromFileBadJSON(""JSON must be a list or a dictionary"") yield from decoded elif format == Format.CSV: decoded_fp = io.TextIOWrapper(fp, encoding=encoding or ""utf-8-sig"") yield from csv.DictReader(decoded_fp) elif format == Format.TSV: yield from rows_from_file( fp, format=Format.CSV, dialect=csv.excel_tab, encoding=encoding ) elif format is None: # Detect the format, then call this recursively buffered = io.BufferedReader(fp, buffer_size=4096) first_bytes = buffered.peek(2048).strip() if first_bytes[0] in (b""["", b""{""): # TODO: Detect newline-JSON yield from rows_from_file(fp, format=Format.JSON) else: dialect = csv.Sniffer().sniff(first_bytes.decode(encoding, ""ignore"")) yield from rows_from_file( fp, format=Format.CSV, dialect=dialect, encoding=encoding ) else: raise RowsFromFileError(""Bad format"") ``` mypy said: ``` sqlite_utils/utils.py:157: error: Argument 1 to ""BufferedReader"" has incompatible type ""BinaryIO""; expected ""RawIOBase"" sqlite_utils/utils.py:163: error: Argument 1 to ""decode"" of ""bytes"" has incompatible type ""Optional[str]""; expected ""str"" ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",924990677,sqlite-utils memory should handle TSV and JSON in addition to CSV, https://github.com/simonw/sqlite-utils/issues/281#issuecomment-864323438,https://api.github.com/repos/simonw/sqlite-utils/issues/281,864323438,MDEyOklzc3VlQ29tbWVudDg2NDMyMzQzOA==,9599,simonw,2021-06-18T23:55:06Z,2021-06-18T23:55:06Z,OWNER,"The `-:json` idea is flawed: Click thinks that's the syntax for an option called `:json`. 
I'm going to do `stdin:json` - which means you can't open a file called `stdin` - but you could use `cat stdin | sqlite-utils memory stdin:json ...` instead which is an OK workaround.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",924992318,Mechanism for explicitly stating CSV or JSON or TSV for sqlite-utils memory, https://github.com/simonw/sqlite-utils/issues/279#issuecomment-864208476,https://api.github.com/repos/simonw/sqlite-utils/issues/279,864208476,MDEyOklzc3VlQ29tbWVudDg2NDIwODQ3Ng==,9599,simonw,2021-06-18T18:30:08Z,2021-06-18T23:30:19Z,OWNER,"So maybe this is a function which can either be told the format or, if none is provided, it detects one for itself. ```python def rows_from_file(fp, format=None): # ... yield from rows ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",924990677,sqlite-utils memory should handle TSV and JSON in addition to CSV, https://github.com/simonw/sqlite-utils/issues/279#issuecomment-864207841,https://api.github.com/repos/simonw/sqlite-utils/issues/279,864207841,MDEyOklzc3VlQ29tbWVudDg2NDIwNzg0MQ==,9599,simonw,2021-06-18T18:28:40Z,2021-06-18T18:28:46Z,OWNER,"```python def detect_format(fp): # ... return ""csv"", fp, dialect # or return ""json"", fp, parsed_data # or return ""json-nl"", fp, docs ``` The mixed return types here are ugly. In all of these cases what we really want is to return a generator of `{...}` objects. So maybe it returns that instead. ```python def filepointer_to_documents(fp): # ... yield from documents ``` I can refactor `sqlite-utils insert` to use this new code too.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",924990677,sqlite-utils memory should handle TSV and JSON in addition to CSV, https://github.com/simonw/sqlite-utils/issues/279#issuecomment-864206308,https://api.github.com/repos/simonw/sqlite-utils/issues/279,864206308,MDEyOklzc3VlQ29tbWVudDg2NDIwNjMwOA==,9599,simonw,2021-06-18T18:25:04Z,2021-06-18T18:25:04Z,OWNER,"Or... since I'm not using a streaming JSON parser at the moment, if I think something is JSON I can load the entire thing into memory to validate it. I still need to detect newline-delimited JSON. For that I can consume the first line of the input to see if it's a valid JSON object, then maybe sniff the second line too? This does mean that if the input is a single line of GIANT JSON it will all be consumed into memory at once, but that's going to happen anyway. So I need a function which, given a file pointer, consumes from it, detects the type, then returns that type AND a file pointer to the beginning of the file again. I can use `io.BufferedReader` for this.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",924990677,sqlite-utils memory should handle TSV and JSON in addition to CSV, https://github.com/simonw/sqlite-utils/issues/279#issuecomment-864129273,https://api.github.com/repos/simonw/sqlite-utils/issues/279,864129273,MDEyOklzc3VlQ29tbWVudDg2NDEyOTI3Mw==,9599,simonw,2021-06-18T15:47:47Z,2021-06-18T15:47:47Z,OWNER,"Detecting valid JSON is tricky - just because a stream starts with `[` or `{` doesn't mean the entire stream is valid JSON. You need to parse the entire stream to determine that for sure. One way to solve this would be with a custom state machine. 
Another would be to use the `ijson` streaming parser - annoyingly it throws the same exception class for invalid JSON for different reasons, but the `e.args[0]` for that exception includes human-readable text about the error - if it's anything other than `parse error: premature EOF` then it probably means the JSON was invalid.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",924990677,sqlite-utils memory should handle TSV and JSON in addition to CSV, https://github.com/simonw/sqlite-utils/issues/278#issuecomment-864128489,https://api.github.com/repos/simonw/sqlite-utils/issues/278,864128489,MDEyOklzc3VlQ29tbWVudDg2NDEyODQ4OQ==,9599,simonw,2021-06-18T15:46:24Z,2021-06-18T15:46:24Z,OWNER,A workaround could be to define a bash or zsh alias of some sort.,"{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",923697888,"Support db as first parameter before subcommand, or as environment variable", https://github.com/simonw/sqlite-utils/issues/278#issuecomment-864126781,https://api.github.com/repos/simonw/sqlite-utils/issues/278,864126781,MDEyOklzc3VlQ29tbWVudDg2NDEyNjc4MQ==,9599,simonw,2021-06-18T15:43:19Z,2021-06-18T15:43:19Z,OWNER,"I don't think it's possible to do this without breaking backwards compatibility, unfortunately.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",923697888,"Support db as first parameter before subcommand, or as environment variable", https://github.com/simonw/sqlite-utils/issues/279#issuecomment-864103005,https://api.github.com/repos/simonw/sqlite-utils/issues/279,864103005,MDEyOklzc3VlQ29tbWVudDg2NDEwMzAwNQ==,9599,simonw,2021-06-18T15:04:15Z,2021-06-18T15:04:15Z,OWNER,"To detect JSON, check to see if the stream starts with `[` or `{` - maybe do something more sophisticated than that. 
","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",924990677,sqlite-utils memory should handle TSV and JSON in addition to CSV, https://github.com/simonw/sqlite-utils/issues/272#issuecomment-864101267,https://api.github.com/repos/simonw/sqlite-utils/issues/272,864101267,MDEyOklzc3VlQ29tbWVudDg2NDEwMTI2Nw==,9599,simonw,2021-06-18T15:01:41Z,2021-06-18T15:01:41Z,OWNER,I'll split the remaining work out into separate issues.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",921878733,"Idea: import CSV to memory, run SQL, export in a single command", https://github.com/simonw/sqlite-utils/pull/273#issuecomment-864099764,https://api.github.com/repos/simonw/sqlite-utils/issues/273,864099764,MDEyOklzc3VlQ29tbWVudDg2NDA5OTc2NA==,9599,simonw,2021-06-18T14:59:27Z,2021-06-18T14:59:27Z,OWNER,I'm going to merge this as-is and work on the JSON/TSV support in a separate issue.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",922099793,sqlite-utils memory command for directly querying CSV/JSON data, https://github.com/simonw/sqlite-utils/pull/277#issuecomment-864092515,https://api.github.com/repos/simonw/sqlite-utils/issues/277,864092515,MDEyOklzc3VlQ29tbWVudDg2NDA5MjUxNQ==,9599,simonw,2021-06-18T14:47:57Z,2021-06-18T14:47:57Z,OWNER,This is a neat improvement.,"{""total_count"": 1, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 1, ""rocket"": 0, ""eyes"": 0}",923612361,add -h support closes #276, https://github.com/simonw/sqlite-utils/issues/275#issuecomment-862617165,https://api.github.com/repos/simonw/sqlite-utils/issues/275,862617165,MDEyOklzc3VlQ29tbWVudDg2MjYxNzE2NQ==,9599,simonw,2021-06-16T18:34:51Z,2021-06-16T18:34:51Z,OWNER,"Also use this: https://github.com/simonw/datasette/blob/83e9c8bc7585dcc62f200e37c2daefcd669ee05e/codecov.yml And add a badge, as seen on https://github.com/simonw/asgi-csrf","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",922955697,Enable code coverage, https://github.com/simonw/sqlite-utils/pull/273#issuecomment-862605436,https://api.github.com/repos/simonw/sqlite-utils/issues/273,862605436,MDEyOklzc3VlQ29tbWVudDg2MjYwNTQzNg==,9599,simonw,2021-06-16T18:19:05Z,2021-06-16T18:19:05Z,OWNER,`--attach` documentation: https://github.com/simonw/sqlite-utils/blob/192dc2c5b73bd836ab8e2e5fed4b36c6ea02f250/docs/cli.rst#joining-in-memory-data-against-existing-databases-using-attach,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",922099793,sqlite-utils memory command for directly querying CSV/JSON data, https://github.com/simonw/sqlite-utils/issues/267#issuecomment-862494864,https://api.github.com/repos/simonw/sqlite-utils/issues/267,862494864,MDEyOklzc3VlQ29tbWVudDg2MjQ5NDg2NA==,9599,simonw,2021-06-16T15:51:28Z,2021-06-16T16:26:15Z,OWNER,"I did add a slightly clumsy mechanism recently to help a bit here though: the `pks_and_rows_where()` method: https://sqlite-utils.datasette.io/en/stable/python-api.html#listing-rows-with-their-primary-keys More details in the issue for that feature: #240 The idea here is that if you want to call update you need the primary key for the row - so you can do this: ```python for pk, row in 
db[""sometable""].pks_and_rows_where(): db[""sometable""].update(pk, {""modified"": 1}) ``` The `pk` may end up as a single value or a tuple depending on if the table has a compound primary key - but you don't need to worry about that if you use this method as it will return the correct primary key value for you.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",915421499,row.update() or row.pk, https://github.com/simonw/sqlite-utils/issues/131#issuecomment-862495803,https://api.github.com/repos/simonw/sqlite-utils/issues/131,862495803,MDEyOklzc3VlQ29tbWVudDg2MjQ5NTgwMw==,9599,simonw,2021-06-16T15:52:33Z,2021-06-16T15:52:33Z,OWNER,I like `-t` or `--type` for this.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",675753042,sqlite-utils insert: options for column types, https://github.com/simonw/sqlite-utils/issues/267#issuecomment-862493179,https://api.github.com/repos/simonw/sqlite-utils/issues/267,862493179,MDEyOklzc3VlQ29tbWVudDg2MjQ5MzE3OQ==,9599,simonw,2021-06-16T15:49:13Z,2021-06-16T15:49:13Z,OWNER,"The big challenge here is that the rows returned by this library aren't objects, they are Python dictionaries - so adding methods to them isn't possible without changing the type that is returned by these methods. Part of the philosophy of the library is that it should make it as easy as possible to round-trip between Python dictionaries and SQLite table data, so I don't think adding methods like this is going to fit.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",915421499,row.update() or row.pk, https://github.com/simonw/sqlite-utils/issues/270#issuecomment-862491721,https://api.github.com/repos/simonw/sqlite-utils/issues/270,862491721,MDEyOklzc3VlQ29tbWVudDg2MjQ5MTcyMQ==,9599,simonw,2021-06-16T15:47:06Z,2021-06-16T15:47:06Z,OWNER,"SQLite doesn't have a JSON column type - it has JSON processing functions, but they operate against TEXT columns - so there's nothing I can do here unfortunately.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",919314806,Cannot set type JSON, https://github.com/simonw/sqlite-utils/issues/272#issuecomment-862491016,https://api.github.com/repos/simonw/sqlite-utils/issues/272,862491016,MDEyOklzc3VlQ29tbWVudDg2MjQ5MTAxNg==,9599,simonw,2021-06-16T15:46:13Z,2021-06-16T15:46:13Z,OWNER,"Columns from data imported from CSV in this way are currently treated as `TEXT`, which means numeric sorts and suchlike won't work as people might expect. It would be good to do automatic type detection here, see #179.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",921878733,"Idea: import CSV to memory, run SQL, export in a single command", https://github.com/simonw/sqlite-utils/issues/272#issuecomment-862485408,https://api.github.com/repos/simonw/sqlite-utils/issues/272,862485408,MDEyOklzc3VlQ29tbWVudDg2MjQ4NTQwOA==,9599,simonw,2021-06-16T15:38:58Z,2021-06-16T15:39:28Z,OWNER,"Also `sqlite-utils memory` reflects the existing `sqlite-utils :memory:` mechanism, which is a point in its favour. 
And it helps emphasize that the file you are querying will be loaded into memory, so probably don't try this against a 1GB CSV file.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",921878733,"Idea: import CSV to memory, run SQL, export in a single command", https://github.com/simonw/sqlite-utils/issues/272#issuecomment-862484557,https://api.github.com/repos/simonw/sqlite-utils/issues/272,862484557,MDEyOklzc3VlQ29tbWVudDg2MjQ4NDU1Nw==,9599,simonw,2021-06-16T15:37:51Z,2021-06-16T15:38:34Z,OWNER,"I wonder if there's a better name for this than `sqlite-utils memory`? - `sqlite-utils memory hello.csv ""select * from hello""` - `sqlite-utils mem hello.csv ""select * from hello""` - `sqlite-utils temp hello.csv ""select * from hello""` - `sqlite-utils adhoc hello.csv ""select * from hello""` - `sqlite-utils scratch hello.csv ""select * from hello""` I think `memory` is best. I don't like the others, except for `scratch` which is OK.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",921878733,"Idea: import CSV to memory, run SQL, export in a single command", https://github.com/simonw/sqlite-utils/issues/272#issuecomment-862479704,https://api.github.com/repos/simonw/sqlite-utils/issues/272,862479704,MDEyOklzc3VlQ29tbWVudDg2MjQ3OTcwNA==,9599,simonw,2021-06-16T15:31:31Z,2021-06-16T15:31:31Z,OWNER,"Plus, could I make this change to `sqlite-utils query` without breaking backwards compatibility? Adding a new `sqlite-utils memory` command is completely safe from that perspective.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",921878733,"Idea: import CSV to memory, run SQL, export in a single command", https://github.com/simonw/sqlite-utils/issues/272#issuecomment-862478881,https://api.github.com/repos/simonw/sqlite-utils/issues/272,862478881,MDEyOklzc3VlQ29tbWVudDg2MjQ3ODg4MQ==,9599,simonw,2021-06-16T15:30:24Z,2021-06-16T15:30:24Z,OWNER,"But... `sqlite-utils my.csv ""select * from my""` is a much more compelling initial experience than `sqlite-utils memory my.csv ""select * from my""`.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",921878733,"Idea: import CSV to memory, run SQL, export in a single command", https://github.com/simonw/sqlite-utils/issues/272#issuecomment-862475685,https://api.github.com/repos/simonw/sqlite-utils/issues/272,862475685,MDEyOklzc3VlQ29tbWVudDg2MjQ3NTY4NQ==,9599,simonw,2021-06-16T15:26:19Z,2021-06-16T15:29:38Z,OWNER,"Here's a radical idea: what if I combined `sqlite-utils memory` into `sqlite-utils query`? The trick here would be to detect if the arguments passed on the command-line refer to SQLite databases or if they refer to CSV/JSON data that should be imported into temporary tables. Detecting a SQLite database file is actually really easy - they all start with the same binary string: ```pycon >>> open(""my.db"", ""rb"").read(100) b'SQLite format 3\x00... ``` (Need to carefully check that a CSV file with`SQLite format 3` as the first column name doesn't accidentally get interpreted as a SQLite DB though). So then what would the semantics of `sqlite-utils query` (which is also the default command) be? 
- `sqlite-utils mydb.db ""select * from x""` - `sqlite-utils my.csv ""select * from my""` - `sqlite-utils mydb.db my.csv ""select * from mydb.x join my on ...""` - this is where it gets weird. We can't import the CSV data directly into `mydb.db` - it's supposed to go into the in-memory database - so now we need to start using database aliases like `mydb.x` because we passed at least one other file? The complexity here is definitely in the handling of a combination of SQLite database files and CSV filenames. Also, `sqlite-utils query` doesn't accept multiple filenames at the moment, so that will change. I'm not 100% sold on this as being better than having a separate `sqlite-utils memory` command, as seen in #273.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",921878733,"Idea: import CSV to memory, run SQL, export in a single command", https://github.com/simonw/sqlite-utils/pull/273#issuecomment-862046009,https://api.github.com/repos/simonw/sqlite-utils/issues/273,862046009,MDEyOklzc3VlQ29tbWVudDg2MjA0NjAwOQ==,9599,simonw,2021-06-16T05:15:38Z,2021-06-16T05:15:38Z,OWNER,"I'm going to add a `--encoding` option - it will affect ALL CSV input files, so if you have CSV files with different encodings you'll need to sort that mess out yourself (likely by importing each CSV file separately into a database using `sqlite-utils insert` with different `--encoding` values).","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",922099793,sqlite-utils memory command for directly querying CSV/JSON data, https://github.com/simonw/sqlite-utils/pull/273#issuecomment-862045639,https://api.github.com/repos/simonw/sqlite-utils/issues/273,862045639,MDEyOklzc3VlQ29tbWVudDg2MjA0NTYzOQ==,9599,simonw,2021-06-16T05:14:38Z,2021-06-16T05:14:38Z,OWNER,"Can't share much code though since a bunch of that `insert` stuff is specific to that command - showing progress bars, returning errors on illegal option combinations etc.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",922099793,sqlite-utils memory command for directly querying CSV/JSON data, https://github.com/simonw/sqlite-utils/pull/273#issuecomment-862045438,https://api.github.com/repos/simonw/sqlite-utils/issues/273,862045438,MDEyOklzc3VlQ29tbWVudDg2MjA0NTQzOA==,9599,simonw,2021-06-16T05:14:00Z,2021-06-16T05:14:00Z,OWNER,I should probably refactor the CSV/JSON/loading stuff into a function in `utils.py` in order to share some of the implementation with the existing `sqlite-utils insert` code: https://github.com/simonw/sqlite-utils/blob/287cdcae8908916687f2ecccc87c38549d004ac6/sqlite_utils/cli.py#L691-L734,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",922099793,sqlite-utils memory command for directly querying CSV/JSON data, https://github.com/simonw/sqlite-utils/pull/273#issuecomment-862043974,https://api.github.com/repos/simonw/sqlite-utils/issues/273,862043974,MDEyOklzc3VlQ29tbWVudDg2MjA0Mzk3NA==,9599,simonw,2021-06-16T05:10:12Z,2021-06-16T05:10:12Z,OWNER,"I can stop promoting `:memory:` here and promote `memory` instead: https://github.com/simonw/sqlite-utils/blob/c7234cae8336b8525034e8f917d82dd0699abd42/docs/cli.rst#L83-L86","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, 
""rocket"": 0, ""eyes"": 0}",922099793,sqlite-utils memory command for directly querying CSV/JSON data, https://github.com/simonw/sqlite-utils/pull/273#issuecomment-862042110,https://api.github.com/repos/simonw/sqlite-utils/issues/273,862042110,MDEyOklzc3VlQ29tbWVudDg2MjA0MjExMA==,9599,simonw,2021-06-16T05:05:51Z,2021-06-16T05:06:11Z,OWNER,"Initial documentation is here: https://github.com/simonw/sqlite-utils/blob/c7234cae8336b8525034e8f917d82dd0699abd42/docs/cli.rst#running-queries-directly-against-csv-data It only talks about CSV at the moment - needs to be updated to mention JSON too once that is implemented.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",922099793,sqlite-utils memory command for directly querying CSV/JSON data, https://github.com/simonw/sqlite-utils/issues/272#issuecomment-862040971,https://api.github.com/repos/simonw/sqlite-utils/issues/272,862040971,MDEyOklzc3VlQ29tbWVudDg2MjA0MDk3MQ==,9599,simonw,2021-06-16T05:02:56Z,2021-06-16T05:02:56Z,OWNER,Moving this to a PR.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",921878733,"Idea: import CSV to memory, run SQL, export in a single command", https://github.com/simonw/sqlite-utils/issues/272#issuecomment-862040906,https://api.github.com/repos/simonw/sqlite-utils/issues/272,862040906,MDEyOklzc3VlQ29tbWVudDg2MjA0MDkwNg==,9599,simonw,2021-06-16T05:02:47Z,2021-06-16T05:02:47Z,OWNER,"Got a prototype working! ``` % curl -s 'https://fivethirtyeight.datasettes.com/polls/president_approval_polls.csv?_size=max&_stream=1' | sqlite-utils memory - 'select * from t limit 5' --nl {""rowid"": ""1"", ""question_id"": ""139304"", ""poll_id"": ""74225"", ""state"": """", ""politician_id"": ""11"", ""politician"": ""Donald Trump"", ""pollster_id"": ""568"", ""pollster"": ""YouGov"", ""sponsor_ids"": ""352"", ""sponsors"": ""Economist"", ""display_name"": ""YouGov"", ""pollster_rating_id"": ""391"", ""pollster_rating_name"": ""YouGov"", ""fte_grade"": ""B"", ""sample_size"": ""1500"", ""population"": ""a"", ""population_full"": ""a"", ""methodology"": ""Online"", ""start_date"": ""1/16/21"", ""end_date"": ""1/19/21"", ""sponsor_candidate"": """", ""tracking"": """", ""created_at"": ""1/20/21 10:18"", ""notes"": """", ""url"": ""https://docs.cdn.yougov.com/y9zsit5bzd/weeklytrackingreport.pdf"", ""source"": ""538"", ""yes"": ""42.0"", ""no"": ""53.0""} {""rowid"": ""2"", ""question_id"": ""139305"", ""poll_id"": ""74225"", ""state"": """", ""politician_id"": ""11"", ""politician"": ""Donald Trump"", ""pollster_id"": ""568"", ""pollster"": ""YouGov"", ""sponsor_ids"": ""352"", ""sponsors"": ""Economist"", ""display_name"": ""YouGov"", ""pollster_rating_id"": ""391"", ""pollster_rating_name"": ""YouGov"", ""fte_grade"": ""B"", ""sample_size"": ""1155"", ""population"": ""rv"", ""population_full"": ""rv"", ""methodology"": ""Online"", ""start_date"": ""1/16/21"", ""end_date"": ""1/19/21"", ""sponsor_candidate"": """", ""tracking"": """", ""created_at"": ""1/20/21 10:18"", ""notes"": """", ""url"": ""https://docs.cdn.yougov.com/y9zsit5bzd/weeklytrackingreport.pdf"", ""source"": ""538"", ""yes"": ""44.0"", ""no"": ""55.0""} {""rowid"": ""3"", ""question_id"": ""139306"", ""poll_id"": ""74226"", ""state"": """", ""politician_id"": ""11"", ""politician"": ""Donald Trump"", ""pollster_id"": ""23"", ""pollster"": ""American Research Group"", ""sponsor_ids"": """", ""sponsors"": """", 
""display_name"": ""American Research Group"", ""pollster_rating_id"": ""9"", ""pollster_rating_name"": ""American Research Group"", ""fte_grade"": ""B"", ""sample_size"": ""1100"", ""population"": ""a"", ""population_full"": ""a"", ""methodology"": ""Live Phone"", ""start_date"": ""1/16/21"", ""end_date"": ""1/19/21"", ""sponsor_candidate"": """", ""tracking"": """", ""created_at"": ""1/20/21 10:18"", ""notes"": """", ""url"": ""https://americanresearchgroup.com/economy/"", ""source"": ""538"", ""yes"": ""30.0"", ""no"": ""66.0""} {""rowid"": ""4"", ""question_id"": ""139307"", ""poll_id"": ""74226"", ""state"": """", ""politician_id"": ""11"", ""politician"": ""Donald Trump"", ""pollster_id"": ""23"", ""pollster"": ""American Research Group"", ""sponsor_ids"": """", ""sponsors"": """", ""display_name"": ""American Research Group"", ""pollster_rating_id"": ""9"", ""pollster_rating_name"": ""American Research Group"", ""fte_grade"": ""B"", ""sample_size"": ""990"", ""population"": ""rv"", ""population_full"": ""rv"", ""methodology"": ""Live Phone"", ""start_date"": ""1/16/21"", ""end_date"": ""1/19/21"", ""sponsor_candidate"": """", ""tracking"": """", ""created_at"": ""1/20/21 10:18"", ""notes"": """", ""url"": ""https://americanresearchgroup.com/economy/"", ""source"": ""538"", ""yes"": ""29.0"", ""no"": ""67.0""} {""rowid"": ""5"", ""question_id"": ""139298"", ""poll_id"": ""74224"", ""state"": """", ""politician_id"": ""11"", ""politician"": ""Donald Trump"", ""pollster_id"": ""1528"", ""pollster"": ""AtlasIntel"", ""sponsor_ids"": """", ""sponsors"": """", ""display_name"": ""AtlasIntel"", ""pollster_rating_id"": ""546"", ""pollster_rating_name"": ""AtlasIntel"", ""fte_grade"": ""B/C"", ""sample_size"": ""5188"", ""population"": ""a"", ""population_full"": ""a"", ""methodology"": ""Online"", ""start_date"": ""1/15/21"", ""end_date"": ""1/19/21"", ""sponsor_candidate"": """", ""tracking"": """", ""created_at"": ""1/19/21 21:52"", ""notes"": """", ""url"": ""https://projects.fivethirtyeight.com/polls/20210119_US_Atlas2.pdf"", ""source"": ""538"", ""yes"": ""44.6"", ""no"": ""53.9""} ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",921878733,"Idea: import CSV to memory, run SQL, export in a single command", https://github.com/simonw/sqlite-utils/issues/272#issuecomment-862018937,https://api.github.com/repos/simonw/sqlite-utils/issues/272,862018937,MDEyOklzc3VlQ29tbWVudDg2MjAxODkzNw==,9599,simonw,2021-06-16T03:59:28Z,2021-06-16T04:00:05Z,OWNER,"Mainly for debugging purposes it would be useful to be able to save the created in-memory database back to a file again later. 
This could be done with: sqlite-utils memory blah.csv --save saved.db Can use `.iterdump()` to implement this: https://docs.python.org/3/library/sqlite3.html#sqlite3.Connection.iterdump Maybe instead (or as-well-as) offer `--dump` which dumps out the SQL from that.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",921878733,"Idea: import CSV to memory, run SQL, export in a single command", https://github.com/simonw/sqlite-utils/issues/272#issuecomment-861989987,https://api.github.com/repos/simonw/sqlite-utils/issues/272,861989987,MDEyOklzc3VlQ29tbWVudDg2MTk4OTk4Nw==,9599,simonw,2021-06-16T02:34:21Z,2021-06-16T02:34:21Z,OWNER,"The documentation already covers this ``` $ sqlite-utils :memory: ""select sqlite_version()"" [{""sqlite_version()"": ""3.29.0""}] ``` https://sqlite-utils.datasette.io/en/latest/cli.html#running-queries-and-returning-json `sqlite-utils memory ""select sqlite_version()""` is a little bit more intuitive than that.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",921878733,"Idea: import CSV to memory, run SQL, export in a single command", https://github.com/simonw/sqlite-utils/issues/272#issuecomment-861987651,https://api.github.com/repos/simonw/sqlite-utils/issues/272,861987651,MDEyOklzc3VlQ29tbWVudDg2MTk4NzY1MQ==,9599,simonw,2021-06-16T02:27:20Z,2021-06-16T02:27:20Z,OWNER,Solution: `sqlite-utils memory -` attempts to detect the input based on if it starts with a `{` or `[` (likely JSON) or if it doesn't use the `csv.Sniffer()` mechanism. Or you can use `sqlite-utils memory -:csv` to specifically indicate the type of input.,"{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",921878733,"Idea: import CSV to memory, run SQL, export in a single command", https://github.com/simonw/sqlite-utils/issues/272#issuecomment-861985944,https://api.github.com/repos/simonw/sqlite-utils/issues/272,861985944,MDEyOklzc3VlQ29tbWVudDg2MTk4NTk0NA==,9599,simonw,2021-06-16T02:22:52Z,2021-06-16T02:22:52Z,OWNER,"Another option: allow an optional `:suffix` specifying the type of the file. If this is missing we detect based on the filename. sqlite-utils memory somefile:csv ""select * from somefile"" One catch: how to treat `-` for standard input? cat blah.csv | sqlite-utils memory - ""select * from stdin"" That's fine for CSV, but what about TSV or JSON or nl-JSON? Maybe this: cat blah.csv | sqlite-utils memory -:json ""select * from stdin"" Bit weird though. The alternative would be to support this: cat blah.csv | sqlite-utils memory --load-csv - But that's verbose compared to the version without the long `--load-x` option.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",921878733,"Idea: import CSV to memory, run SQL, export in a single command", https://github.com/simonw/sqlite-utils/issues/272#issuecomment-861984707,https://api.github.com/repos/simonw/sqlite-utils/issues/272,861984707,MDEyOklzc3VlQ29tbWVudDg2MTk4NDcwNw==,9599,simonw,2021-06-16T02:19:48Z,2021-06-16T02:19:48Z,OWNER,"This is going to need to be a separate command, for relatively non-obvious reasons. sqlite-utils blah.db ""select * from x"" Is equivalent to this, because `query` is the default sub-command: sqlite-utils query blah.db ""select * from x"" But... 
this means that making the filename optional doesn't actually work - because then this is ambiguous: sqlite-utils --load-csv blah.csv ""select * from blah"" So instead, I'm going to add a new sub-command. I'm currently thinking `memory` to reflect that this command operates on an in-memory database: sqlite-utils memory --load-csv blah.csv ""select * from blah"" I still think I need to use `--load-csv` rather than `--csv` because one interesting use-case for this is loading in CSV and converting it to JSON, or vice-versa. Another option: allow multiple arguments which are filenames, and use the extension (or sniff the content) to decide what to do with them: sqlite-utils memory blah.csv foo.csv ""select * from foo join blah on ..."" This would require the last positional argument to always be a SQL query, and would treat all other positional arguments as files that should be imported into memory.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",921878733,"Idea: import CSV to memory, run SQL, export in a single command", https://github.com/simonw/sqlite-utils/issues/272#issuecomment-861891835,https://api.github.com/repos/simonw/sqlite-utils/issues/272,861891835,MDEyOklzc3VlQ29tbWVudDg2MTg5MTgzNQ==,9599,simonw,2021-06-15T23:09:31Z,2021-06-15T23:09:31Z,OWNER,`--load-csv` and `--load-json` and `--load-nl` and `--load-tsv` are unambiguous.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",921878733,"Idea: import CSV to memory, run SQL, export in a single command", https://github.com/simonw/sqlite-utils/issues/272#issuecomment-861891693,https://api.github.com/repos/simonw/sqlite-utils/issues/272,861891693,MDEyOklzc3VlQ29tbWVudDg2MTg5MTY5Mw==,9599,simonw,2021-06-15T23:09:08Z,2021-06-15T23:09:08Z,OWNER,Problem: `--csv` and `--json` and `--nl` are already options for `sqlite-utils query` - need new non-conflicting names.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",921878733,"Idea: import CSV to memory, run SQL, export in a single command", https://github.com/simonw/sqlite-utils/issues/272#issuecomment-861891272,https://api.github.com/repos/simonw/sqlite-utils/issues/272,861891272,MDEyOklzc3VlQ29tbWVudDg2MTg5MTI3Mg==,9599,simonw,2021-06-15T23:08:02Z,2021-06-15T23:08:02Z,OWNER,"`--csv -` should work though, for reading from stdin. 
The table can be called `stdin`.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",921878733,"Idea: import CSV to memory, run SQL, export in a single command", https://github.com/simonw/sqlite-utils/issues/272#issuecomment-861891110,https://api.github.com/repos/simonw/sqlite-utils/issues/272,861891110,MDEyOklzc3VlQ29tbWVudDg2MTg5MTExMA==,9599,simonw,2021-06-15T23:07:38Z,2021-06-15T23:07:38Z,OWNER,`--csvt` seems unnecessary to me: if people want to load different CSV files with the same filename (but in different directories) they will get an error unless they rename the files first.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",921878733,"Idea: import CSV to memory, run SQL, export in a single command", https://github.com/simonw/sqlite-utils/issues/272#issuecomment-861890689,https://api.github.com/repos/simonw/sqlite-utils/issues/272,861890689,MDEyOklzc3VlQ29tbWVudDg2MTg5MDY4OQ==,9599,simonw,2021-06-15T23:06:37Z,2021-06-15T23:06:37Z,OWNER,"How about `--json` and `--nl` and `--tsv` too? Imitating the format options for `sqlite-utils insert`. And what happens if you provide a filename too? I'm tempted to say that the `--csv` stuff still gets loaded into an in-memory database but it's given a name and can then be joined against using SQLite `memory.blah` syntax.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",921878733,"Idea: import CSV to memory, run SQL, export in a single command", https://github.com/simonw/sqlite-utils/issues/272#issuecomment-861889437,https://api.github.com/repos/simonw/sqlite-utils/issues/272,861889437,MDEyOklzc3VlQ29tbWVudDg2MTg4OTQzNw==,9599,simonw,2021-06-15T23:03:26Z,2021-06-15T23:03:26Z,OWNER,Maybe also support `--csvt` as an alternative option which takes two arguments: the CSV path and the name of the table that should be created from it (rather than auto-detecting from the filename).,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",921878733,"Idea: import CSV to memory, run SQL, export in a single command", https://github.com/simonw/sqlite-utils/issues/269#issuecomment-861103967,https://api.github.com/repos/simonw/sqlite-utils/issues/269,861103967,MDEyOklzc3VlQ29tbWVudDg2MTEwMzk2Nw==,9599,simonw,2021-06-15T01:34:10Z,2021-06-15T01:34:10Z,OWNER,"SQLite doesn't have the concept of a boolean column, so there's not much I can do here: https://www.sqlite.org/datatype3.html#boolean_datatype","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",919250621,bool type not supported, https://github.com/simonw/sqlite-utils/issues/266#issuecomment-861103684,https://api.github.com/repos/simonw/sqlite-utils/issues/266,861103684,MDEyOklzc3VlQ29tbWVudDg2MTEwMzY4NA==,9599,simonw,2021-06-15T01:33:13Z,2021-06-15T01:33:13Z,OWNER,Dupe of #37,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",913135723,"Add some types, enforce with mypy", https://github.com/simonw/datasette/issues/1377#issuecomment-861089794,https://api.github.com/repos/simonw/datasette/issues/1377,861089794,MDEyOklzc3VlQ29tbWVudDg2MTA4OTc5NA==,9599,simonw,2021-06-15T00:53:29Z,2021-06-15T00:53:29Z,OWNER,"Potential hook names: 
- `skip_csrf(scope, datasette)` - ... I can't think of any other ones I would tolerate to be honest","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",920884085,Mechanism for plugins to exclude certain paths from CSRF checks, https://github.com/simonw/datasette/issues/1377#issuecomment-861087949,https://api.github.com/repos/simonw/datasette/issues/1377,861087949,MDEyOklzc3VlQ29tbWVudDg2MTA4Nzk0OQ==,9599,simonw,2021-06-15T00:49:19Z,2021-06-15T00:49:19Z,OWNER,"The new `skip_if_scope` mechanism in `asgi-csrf` https://github.com/simonw/asgi-csrf/issues/20 is designed to help here. Now I need to design a plugin hook that allows plugins to have an opinion on whether a specific `scope` should have CSRF skipped.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",920884085,Mechanism for plugins to exclude certain paths from CSRF checks, https://github.com/dogsheep/github-to-sqlite/issues/64#issuecomment-861042050,https://api.github.com/repos/dogsheep/github-to-sqlite/issues/64,861042050,MDEyOklzc3VlQ29tbWVudDg2MTA0MjA1MA==,9599,simonw,2021-06-14T22:45:42Z,2021-06-14T22:45:42Z,MEMBER,I'm definitely interested in supporting events in this tool - see #14.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",920636216,"feature: support ""events""", https://github.com/dogsheep/github-to-sqlite/issues/64#issuecomment-861041597,https://api.github.com/repos/dogsheep/github-to-sqlite/issues/64,861041597,MDEyOklzc3VlQ29tbWVudDg2MTA0MTU5Nw==,9599,simonw,2021-06-14T22:44:54Z,2021-06-14T22:44:54Z,MEMBER,"Have you found a way to access events in GraphQL? I can only see a way to access a timeline of events for a single issue or a single pull request. 
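For reference, the per-issue timeline I am thinking of is the `timelineItems` connection. A rough sketch of querying it from Python - the field names are from memory and untested, so treat this as an assumption rather than a verified query, and you would need a real personal access token:

```python
import requests

query = '''
query {
  repository(owner: ""dogsheep"", name: ""github-to-sqlite"") {
    issue(number: 64) {
      timelineItems(first: 10) {
        nodes { __typename }
      }
    }
  }
}
'''
# Requires a GitHub personal access token with repo read access
response = requests.post(
    'https://api.github.com/graphql',
    json={'query': query},
    headers={'Authorization': 'bearer YOUR_TOKEN_HERE'},
)
print(response.json())
```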
See also https://github.community/t/get-event-equivalent-for-v4/13600/2","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",920636216,"feature: support ""events""", https://github.com/simonw/datasette/issues/1376#issuecomment-860230663,https://api.github.com/repos/simonw/datasette/issues/1376,860230663,MDEyOklzc3VlQ29tbWVudDg2MDIzMDY2Mw==,9599,simonw,2021-06-13T15:39:37Z,2021-06-13T15:39:37Z,OWNER,Actually it looks like there is a PR open already that addresses this: #1296 ,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",919822817,Official Datasette Docker image should use SQLite >= 3.31.0 (for generated columns), https://github.com/simonw/datasette/issues/1375#issuecomment-860230385,https://api.github.com/repos/simonw/datasette/issues/1375,860230385,MDEyOklzc3VlQ29tbWVudDg2MDIzMDM4NQ==,9599,simonw,2021-06-13T15:37:49Z,2021-06-13T15:37:49Z,OWNER,"There is a feature for this at the moment, but it's a little bit hidden: you can use `?_json=col` to tell Datasette that you would like a specific column to be exported as nested JSON: https://docs.datasette.io/en/stable/json_api.html#special-json-arguments I considered trying to make this automatic - so it detects columns that appear to contain valid JSON and outputs them as nested objects - but the problem with that is that it can lead to inconsistent results - you might hit the API and find that not every column contains valid JSON (compared to the previous day) resulting in the API returning a string instead of the expected dictionary and breaking your code.","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",919508498,JSON export dumps JSON fields as TEXT, https://github.com/simonw/datasette/issues/1376#issuecomment-860229397,https://api.github.com/repos/simonw/datasette/issues/1376,860229397,MDEyOklzc3VlQ29tbWVudDg2MDIyOTM5Nw==,9599,simonw,2021-06-13T15:31:02Z,2021-06-13T15:31:02Z,OWNER,Alternative fix would be to update that section of the documentation - if the container upgrade proves tricky I can fall back on that.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",919822817,Official Datasette Docker image should use SQLite >= 3.31.0 (for generated columns), https://github.com/simonw/datasette/issues/1376#issuecomment-860229226,https://api.github.com/repos/simonw/datasette/issues/1376,860229226,MDEyOklzc3VlQ29tbWVudDg2MDIyOTIyNg==,9599,simonw,2021-06-13T15:29:45Z,2021-06-13T15:29:45Z,OWNER,"Oh good catch - this is a SQLite version issue. The `fixtures.db` file used on https://latest.datasette.io/ includes a generated column (for testing purposes) which is a feature added in SQLite 3.31.0 on 2020-01-22. https://latest.datasette.io/-/versions But... it looks like the packaged Datasette Docker container doesn't have that SQLite version! I should fix that. I'm renaming this issue. 
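For reference, generated columns - the SQLite 3.31.0+ feature in question - look something like this. A quick sketch using Python's built-in `sqlite3` module with an invented table (this is not the actual `fixtures.db` schema):

```python
import sqlite3

conn = sqlite3.connect(':memory:')
print(conn.execute('select sqlite_version()').fetchone())

# Generated columns need SQLite 3.31.0 or later - on an older
# library this create table statement raises sqlite3.OperationalError
conn.execute('''
    create table demo (
        price integer,
        quantity integer,
        total integer generated always as (price * quantity) virtual
    )
''')
conn.execute('insert into demo (price, quantity) values (?, ?)', (3, 4))
print(conn.execute('select total from demo').fetchone())  # (12,)
```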
","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",919822817,Official Datasette Docker image should use SQLite >= 3.31.0 (for generated columns), https://github.com/simonw/sqlite-utils/issues/271#issuecomment-860142489,https://api.github.com/repos/simonw/sqlite-utils/issues/271,860142489,MDEyOklzc3VlQ29tbWVudDg2MDE0MjQ4OQ==,9599,simonw,2021-06-13T02:53:06Z,2021-06-13T02:53:06Z,OWNER,"Looks like this is the problem: https://github.com/simonw/sqlite-utils/blob/b0f9d1e494c9891ce407e27b0f5c6deeea361d30/sqlite_utils/db.py#L1724-L1742 Note how `set_cols = [col for col in all_columns if col not in pks] ` can potentially return an empty list if ALL of the columns are primary keys - but the next line of code that assigns `sql2` continues regardless, when it should instead be skipped if there are no columns in `set_cols`.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",919702451,table.upsert_all() fails if input has a single column that should be a primary key, https://github.com/simonw/sqlite-utils/issues/270#issuecomment-859986489,https://api.github.com/repos/simonw/sqlite-utils/issues/270,859986489,MDEyOklzc3VlQ29tbWVudDg1OTk4NjQ4OQ==,9599,simonw,2021-06-12T02:47:12Z,2021-06-12T02:47:12Z,OWNER,"Can you expand on what you'd like to change here? The library and CLI tool already allow JSON data to be stored in columns: - https://sqlite-utils.datasette.io/en/stable/cli.html#nested-json-values - https://sqlite-utils.datasette.io/en/stable/python-api.html#storing-json","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",919314806,Cannot set type JSON, https://github.com/simonw/sqlite-utils/issues/268#issuecomment-859898736,https://api.github.com/repos/simonw/sqlite-utils/issues/268,859898736,MDEyOklzc3VlQ29tbWVudDg1OTg5ODczNg==,9599,simonw,2021-06-11T20:37:44Z,2021-06-11T20:37:44Z,OWNER,"From the prototype: ``` % sqlite-utils schema 24ways.db CREATE TABLE [articles] ( [title] TEXT , [contents] TEXT , [year] TEXT , [author] TEXT , [author_slug] TEXT , [published] TEXT , [url] TEXT , [topic] TEXT ); CREATE VIRTUAL TABLE ""articles_fts"" USING FTS5 ( title, author, contents, content=""articles"" ); CREATE TABLE 'articles_fts_data'(id INTEGER PRIMARY KEY, block BLOB); CREATE TABLE 'articles_fts_idx'(segid, term, pgno, PRIMARY KEY(segid, term)) WITHOUT ROWID; CREATE TABLE 'articles_fts_docsize'(id INTEGER PRIMARY KEY, sz BLOB); CREATE TABLE 'articles_fts_config'(k PRIMARY KEY, v) WITHOUT ROWID; % sqlite-utils schema 24ways.db | sqlite3 /tmp/boo.db Error: near line 15: table 'articles_fts_data' already exists Error: near line 16: table 'articles_fts_idx' already exists Error: near line 17: table 'articles_fts_docsize' already exists Error: near line 18: table 'articles_fts_config' already exists ``` The problem here is that the `CREATE VIRTUAL TABLE ""articles_fts""...` line causes those next four tables to be created - but that means that piping the output of this command into `sqlite3` in order to re-create those tables throws errors. I don't think this matters. 
I see this tool as more for introspection than for recreating table structures.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",919181559,db.schema property and sqlite-utils schema command, https://github.com/simonw/sqlite-utils/issues/268#issuecomment-859895540,https://api.github.com/repos/simonw/sqlite-utils/issues/268,859895540,MDEyOklzc3VlQ29tbWVudDg1OTg5NTU0MA==,9599,simonw,2021-06-11T20:30:34Z,2021-06-11T20:30:34Z,OWNER,"You can currently see the `sql` on the CLI using: % sqlite-utils rows fixtures.db sqlite_master -c name -c sql name sql -------------------------------------------------------------- ---------------------------------------------------------------------------------------------------------------------------------------------------------------- simple_primary_key CREATE TABLE simple_primary_key ( id varchar(30) primary key, content text ) sqlite_autoindex_simple_primary_key_1 primary_key_multiple_columns CREATE TABLE primary_key_multiple_columns ( id varchar(30) primary key, content text, content2 text ) sqlite_autoindex_primary_key_multiple_columns_1 primary_key_multiple_columns_explicit_label CREATE TABLE primary_key_multiple_columns_explicit_label ( id varchar(30) primary key, content text, content2 text ) sqlite_autoindex_primary_key_multiple_columns_explicit_label_1 compound_primary_key CREATE TABLE compound_primary_key ( pk1 varchar(30), pk2 varchar(30), content text, PRIMARY KEY (pk1, pk2) ) sqlite_autoindex_compound_primary_key_1 compound_three_primary_keys CREATE TABLE compound_three_primary_keys ( pk1 varchar(30), pk2 varchar(30), pk3 varchar(30), content text, PRIMARY KEY (pk1, pk2, pk3) ) sqlite_autoindex_compound_three_primary_keys_1 foreign_key_references CREATE TABLE foreign_key_references ( pk varchar(30) primary key, foreign_key_with_label varchar(30), foreign_key_with_no_label varchar(30), FOREIGN KEY (foreign_key_with_label) REFERENCES simple_primary_key(id), FOREIGN KEY (foreign_key_with_no_label) REFERENCES primary_key_multiple_columns(id) ) sqlite_autoindex_foreign_key_references_1 sortable CREATE TABLE sortable ( pk1 varchar(30), pk2 varchar(30), content text, sortable integer, sortable_with_nulls real, sortable_with_nulls_2 real, text text, PRIMARY KEY (pk1, pk2) ) sqlite_autoindex_sortable_1 no_primary_key CREATE TABLE no_primary_key ( content text, a text, b text, c text ) 123_starts_with_digits CREATE TABLE [123_starts_with_digits] ( content text ) paginated_view CREATE VIEW paginated_view AS SELECT content, '- ' || content || ' -' AS content_extra FROM no_primary_key Table With Space In Name CREATE TABLE ""Table With Space In Name"" ( pk varchar(30) primary key, content text ) sqlite_autoindex_Table With Space In Name_1 table/with/slashes.csv CREATE TABLE ""table/with/slashes.csv"" ( pk varchar(30) primary key, content text ) sqlite_autoindex_table/with/slashes.csv_1 complex_foreign_keys CREATE TABLE ""complex_foreign_keys"" ( pk varchar(30) primary key, f1 text, f2 text, f3 text, FOREIGN KEY (""f1"") REFERENCES [simple_primary_key](id), FOREIGN KEY (""f2"") REFERENCES [simple_primary_key](id), FOREIGN KEY (""f3"") REFERENCES [simple_primary_key](id) ) sqlite_autoindex_complex_foreign_keys_1 custom_foreign_key_label CREATE TABLE ""custom_foreign_key_label"" ( pk varchar(30) primary key, foreign_key_with_custom_label text, FOREIGN KEY (""foreign_key_with_custom_label"") REFERENCES [primary_key_multiple_columns_explicit_label](id) ) 
sqlite_autoindex_custom_foreign_key_label_1 units CREATE TABLE units ( pk integer primary key, distance int, frequency int ) searchable CREATE TABLE searchable ( pk integer primary key, text1 text, text2 text, [name with . and spaces] text ) searchable_fts CREATE VIRTUAL TABLE ""searchable_fts"" USING FTS3 (text1, text2, [name with . and spaces], content=""searchable"") searchable_fts_content CREATE TABLE 'searchable_fts_content'(docid INTEGER PRIMARY KEY, 'c0text1', 'c1text2', 'c2name with . and spaces', 'c3content') searchable_fts_segments CREATE TABLE 'searchable_fts_segments'(blockid INTEGER PRIMARY KEY, block BLOB) searchable_fts_segdir CREATE TABLE 'searchable_fts_segdir'(level INTEGER,idx INTEGER,start_block INTEGER,leaves_end_block INTEGER,end_block INTEGER,root BLOB,PRIMARY KEY(level, idx)) sqlite_autoindex_searchable_fts_segdir_1 select CREATE TABLE [select] ( [group] text, [having] text, [and] text ) facet_cities CREATE TABLE facet_cities ( id integer primary key, name text ) simple_view CREATE VIEW simple_view AS SELECT content, upper(content) AS upper_content FROM simple_primary_key ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",919181559,db.schema property and sqlite-utils schema command, https://github.com/simonw/sqlite-utils/issues/268#issuecomment-859894105,https://api.github.com/repos/simonw/sqlite-utils/issues/268,859894105,MDEyOklzc3VlQ29tbWVudDg1OTg5NDEwNQ==,9599,simonw,2021-06-11T20:28:52Z,2021-06-11T20:28:52Z,OWNER,"Out of interest, here are the rows from that table where `sql` is `null`: https://latest.datasette.io/fixtures?sql=select%0D%0A++*%0D%0Afrom%0D%0A++sqlite_master%0D%0Awhere%0D%0A++sql+is+null ```csv type,name,tbl_name,rootpage,sql index,sqlite_autoindex_simple_primary_key_1,simple_primary_key,3, index,sqlite_autoindex_primary_key_multiple_columns_1,primary_key_multiple_columns,5, index,sqlite_autoindex_primary_key_multiple_columns_explicit_label_1,primary_key_multiple_columns_explicit_label,7, index,sqlite_autoindex_compound_primary_key_1,compound_primary_key,9, index,sqlite_autoindex_compound_three_primary_keys_1,compound_three_primary_keys,11, index,sqlite_autoindex_foreign_key_references_1,foreign_key_references,14, index,sqlite_autoindex_sortable_1,sortable,16, index,sqlite_autoindex_Table With Space In Name_1,Table With Space In Name,20, index,sqlite_autoindex_table/with/slashes.csv_1,table/with/slashes.csv,22, index,sqlite_autoindex_complex_foreign_keys_1,complex_foreign_keys,24, index,sqlite_autoindex_custom_foreign_key_label_1,custom_foreign_key_label,26, index,sqlite_autoindex_tags_1,tags,31, index,sqlite_autoindex_searchable_tags_1,searchable_tags,34, index,sqlite_autoindex_searchable_fts_segdir_1,searchable_fts_segdir,37, ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",919181559,db.schema property and sqlite-utils schema command, https://github.com/simonw/sqlite-utils/issues/268#issuecomment-859888469,https://api.github.com/repos/simonw/sqlite-utils/issues/268,859888469,MDEyOklzc3VlQ29tbWVudDg1OTg4ODQ2OQ==,9599,simonw,2021-06-11T20:26:20Z,2021-06-11T20:26:20Z,OWNER,`sqlite-utils schema data.db` could output the same thing to the console.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",919181559,db.schema property and sqlite-utils schema command, 
https://github.com/simonw/datasette/issues/1371#issuecomment-858099514,https://api.github.com/repos/simonw/datasette/issues/1371,858099514,MDEyOklzc3VlQ29tbWVudDg1ODA5OTUxNA==,9599,simonw,2021-06-09T21:03:49Z,2021-06-09T21:03:49Z,OWNER,I'll release these as an alpha straight away - it makes sense to have plugin hook changes available for people to test as alpha dependencies ASAP.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",915455228,Menu plugin hooks should include the request, https://github.com/simonw/datasette/pull/1370#issuecomment-857139881,https://api.github.com/repos/simonw/datasette/issues/1370,857139881,MDEyOklzc3VlQ29tbWVudDg1NzEzOTg4MQ==,9599,simonw,2021-06-08T20:58:41Z,2021-06-08T20:58:41Z,OWNER,We can remove a bunch of unnecessary `str(path)` calls too - this search finds a bunch of possible candidates: https://ripgrep.datasette.io/-/ripgrep?pattern=str%5C%28.*%28db%7Cpath%29&glob=datasette%2F**%2F*.py,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",914130834,Ensure db.path is a string before trying to insert into internal database, https://github.com/simonw/sqlite-utils/issues/266#issuecomment-856231119,https://api.github.com/repos/simonw/sqlite-utils/issues/266,856231119,MDEyOklzc3VlQ29tbWVudDg1NjIzMTExOQ==,9599,simonw,2021-06-07T20:26:05Z,2021-06-07T20:26:05Z,OWNER,"https://github.com/python/cpython/blob/2ab27c4af4ddf7528e1375e77c787c7fbb09b5e6/Lib/typing.py#L2173-L2195 In Python 3.6 or higher can do this: ```python class Employee(NamedTuple): name: str id: int ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",913135723,"Add some types, enforce with mypy", https://github.com/simonw/datasette/issues/1365#issuecomment-856212136,https://api.github.com/repos/simonw/datasette/issues/1365,856212136,MDEyOklzc3VlQ29tbWVudDg1NjIxMjEzNg==,9599,simonw,2021-06-07T19:54:04Z,2021-06-07T19:54:04Z,OWNER,"I've hit this one too. 
I agree, fixing this in Datasette itself is better than fixing it in the tests across multiple other projects.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",913017577,pathlib.Path breaks internal schema, https://github.com/simonw/datasette/issues/1369#issuecomment-856208637,https://api.github.com/repos/simonw/datasette/issues/1369,856208637,MDEyOklzc3VlQ29tbWVudDg1NjIwODYzNw==,9599,simonw,2021-06-07T19:47:23Z,2021-06-07T19:47:23Z,OWNER,No point in showing the IDs twice if the blue label doesn't differ from the gray ID,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",913900374,Don't show foreign key IDs twice if no label, https://github.com/simonw/datasette/issues/1367#issuecomment-856160770,https://api.github.com/repos/simonw/datasette/issues/1367,856160770,MDEyOklzc3VlQ29tbWVudDg1NjE2MDc3MA==,9599,simonw,2021-06-07T18:22:33Z,2021-06-07T18:22:33Z,OWNER,Here's why: https://github.com/simonw/datasette/blob/03ec71193b9545536898a4bc7493274fec48bdd7/datasette/static/app.css#L455-L458,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",913823889,Navigation menu display bug, https://github.com/simonw/datasette/issues/1366#issuecomment-856147969,https://api.github.com/repos/simonw/datasette/issues/1366,856147969,MDEyOklzc3VlQ29tbWVudDg1NjE0Nzk2OQ==,9599,simonw,2021-06-07T18:03:03Z,2021-06-07T18:03:03Z,OWNER,"Here's an example of a test that uses it. It's necessary because sometimes fixtures that create temporary directories break in unexpected ways: https://github.com/simonw/datasette/blob/0a7621f96f8ad14da17e7172e8a7bce24ef78966/tests/test_plugins.py#L658-L666","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",913809802,Get rid of this `restore_working_directory` hack entirely, https://github.com/simonw/datasette/issues/1366#issuecomment-856147450,https://api.github.com/repos/simonw/datasette/issues/1366,856147450,MDEyOklzc3VlQ29tbWVudDg1NjE0NzQ1MA==,9599,simonw,2021-06-07T18:02:13Z,2021-06-07T18:02:13Z,OWNER,"The hack in question is this fixture, which I've been using in an ad-hoc manner to work around errors while running the tests: https://github.com/simonw/datasette/blob/030deb4b25cda842ff7129ab7c18550c44dd8379/tests/conftest.py#L62-L75 I don't understand the underlying issue well enough to know how to get rid of it.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",913809802,Get rid of this `restore_working_directory` hack entirely, https://github.com/simonw/sqlite-utils/issues/266#issuecomment-855611939,https://api.github.com/repos/simonw/sqlite-utils/issues/266,855611939,MDEyOklzc3VlQ29tbWVudDg1NTYxMTkzOQ==,9599,simonw,2021-06-07T06:07:41Z,2021-06-07T06:07:41Z,OWNER,"Looks like this is the way to do this: ```python Point = typing.NamedTuple( ""Point"", [('x', int), ('y', int)] ) ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",913135723,"Add some types, enforce with mypy", 
https://github.com/simonw/datasette/issues/1362#issuecomment-855430317,https://api.github.com/repos/simonw/datasette/issues/1362,855430317,MDEyOklzc3VlQ29tbWVudDg1NTQzMDMxNw==,9599,simonw,2021-06-06T17:07:48Z,2021-06-06T17:07:48Z,OWNER,"I guess I can offer a `disable_csp` setting so that people with complex custom templates aren't completely blocked from using them with Datasette, but maybe it would be better not to offer that? Or to offer it as a `datasette-insecure-csp` plugin instead? I like the idea of very actively encouraging CSP across all Datasette projects, but I'm nervous about making the software unusable for certain edge cases. Maybe require CSP and wait for someone to complain?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",912864936,Consider using CSP to protect against future XSS, https://github.com/simonw/datasette/issues/1362#issuecomment-855429111,https://api.github.com/repos/simonw/datasette/issues/1362,855429111,MDEyOklzc3VlQ29tbWVudDg1NTQyOTExMQ==,9599,simonw,2021-06-06T16:59:05Z,2021-06-06T17:00:15Z,OWNER,"Twitter conversation: https://twitter.com/simonw/status/1401565566045806594 @dracos provided some really useful code examples there: > We generate it here: https://github.com/mysociety/fixmystreet/blob/e9fec4e567e7148ed128816e5770c2963be51af6/perllib/FixMyStreet/Cobrand/Default.pm#L89-L90 And use it e.g. https://github.com/mysociety/fixmystreet/blob/ba6788cd25d8f471a4e3308403607627b4d2f4f6/templates/web/base/common_header_tags.html or https://github.com/mysociety/fixmystreet/blob/cb4f2b96364d151988b5c664888468b25cc62240/templates/web/fixmystreet.com/header/css.html","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",912864936,Consider using CSP to protect against future XSS, https://github.com/simonw/datasette/issues/1362#issuecomment-855428601,https://api.github.com/repos/simonw/datasette/issues/1362,855428601,MDEyOklzc3VlQ29tbWVudDg1NTQyODYwMQ==,9599,simonw,2021-06-06T16:55:33Z,2021-06-06T16:55:33Z,OWNER,"> No, because Vary header is about _request_ headers that cause the response to vary, not response headers. Hah, of course! Thanks for the correction. 
So the nonce mechanism would actually be pretty great here, especially for the `extra_body_script()` hook.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",912864936,Consider using CSP to protect against future XSS, https://github.com/simonw/datasette/issues/1362#issuecomment-855427396,https://api.github.com/repos/simonw/datasette/issues/1362,855427396,MDEyOklzc3VlQ29tbWVudDg1NTQyNzM5Ng==,9599,simonw,2021-06-06T16:46:17Z,2021-06-06T16:46:17Z,OWNER,"Mind you, since that plugin hook looks like this: ```python @hookimpl def extra_body_script(): return { ""module"": True, ""script"": ""console.log('Your JavaScript goes here...')"" } ``` Having it calculate a sha256 hash wouldn't be difficult.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",912864936,Consider using CSP to protect against future XSS, https://github.com/simonw/datasette/issues/1362#issuecomment-855426750,https://api.github.com/repos/simonw/datasette/issues/1362,855426750,MDEyOklzc3VlQ29tbWVudDg1NTQyNjc1MA==,9599,simonw,2021-06-06T16:41:30Z,2021-06-06T16:44:49Z,OWNER,"This is from the current `base.html` template: https://github.com/simonw/datasette/blob/030deb4b25cda842ff7129ab7c18550c44dd8379/datasette/templates/base.html#L62-L66 Which includes this: https://github.com/simonw/datasette/blob/030deb4b25cda842ff7129ab7c18550c44dd8379/datasette/templates/_close_open_menus.html#L1-L16 The `body_scripts` bit is for this `extra_body_script` plugin hook, which is the thing that will be the most affected by implementing CSP: https://docs.datasette.io/en/stable/plugin_hooks.html#extra-body-script-template-database-table-columns-view-name-request-datasette","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",912864936,Consider using CSP to protect against future XSS, https://github.com/simonw/datasette/issues/1362#issuecomment-855426516,https://api.github.com/repos/simonw/datasette/issues/1362,855426516,MDEyOklzc3VlQ29tbWVudDg1NTQyNjUxNg==,9599,simonw,2021-06-06T16:39:34Z,2021-06-06T16:39:34Z,OWNER,The reason Datasette uses small inline scripts right now is to avoid the overhead of an extra HTTP request for a JavaScript file - but these are both inherently cachable and perform much better under HTTP/2 so that's likely a false optimization.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",912864936,Consider using CSP to protect against future XSS, https://github.com/simonw/datasette/issues/1362#issuecomment-855426314,https://api.github.com/repos/simonw/datasette/issues/1362,855426314,MDEyOklzc3VlQ29tbWVudDg1NTQyNjMxNA==,9599,simonw,2021-06-06T16:38:04Z,2021-06-06T16:38:04Z,OWNER,"The other option for inline scripts is the CSP nonce: Content-Security-Policy: script-src 'nonce-2726c7f26c' Then: Since an attacker can't guess what the nonce will be it prevents them from injecting their own script block - this seems easier to make available to plugins than a full hashing mechanism, just make `{{ csp_nonce() }}` available to the template. That template function can then be smart enough to set a flag which Datasette uses to decide if the `script-src 'nonce-2726c7f26c'` policy should be sent or not. 
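A rough sketch of how that could fit together - this is hypothetical, not existing Datasette code, just one way the request-scoped nonce plus the it-was-used flag might work:

```python
import secrets

class NonceTracker:
    # One instance per request: hands out a single random nonce and
    # records whether any template actually asked for it.
    def __init__(self):
        self.nonce = secrets.token_hex(16)
        self.used = False

    def __call__(self):
        self.used = True
        return self.nonce

tracker = NonceTracker()
# A template rendering {{ csp_nonce() }} would end up calling tracker()
print(tracker())
# After rendering, only send the policy header if the nonce was used:
if tracker.used:
    print(""script-src 'nonce-{}'"".format(tracker.nonce))
```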
Presumably this would also require adding `Content-Security-Policy` to the `Vary` header though, which will have a nasty effect on Cloudflare and Fastly and such like.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",912864936,Consider using CSP to protect against future XSS, https://github.com/simonw/datasette/issues/1362#issuecomment-855418899,https://api.github.com/repos/simonw/datasette/issues/1362,855418899,MDEyOklzc3VlQ29tbWVudDg1NTQxODg5OQ==,9599,simonw,2021-06-06T15:42:55Z,2021-06-06T15:42:55Z,OWNER,Another consideration: testing that this works correctly could require adoption of a real browser test environment (probably Cypress or maybe Playwright) to execute tests that will fail if CSP is violated.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",912864936,Consider using CSP to protect against future XSS, https://github.com/simonw/datasette/issues/1362#issuecomment-855418698,https://api.github.com/repos/simonw/datasette/issues/1362,855418698,MDEyOklzc3VlQ29tbWVudDg1NTQxODY5OA==,9599,simonw,2021-06-06T15:41:24Z,2021-06-06T15:41:24Z,OWNER,"I think the best way to answer these questions is with some prototyping - of both Datasette and some of the existing JavaScript plugins. I can start with a `datasette-experimental-csp` plugin that sets the header (and could even run an optional report URI mechanism).","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",912864936,Consider using CSP to protect against future XSS, https://github.com/simonw/datasette/issues/1362#issuecomment-855418401,https://api.github.com/repos/simonw/datasette/issues/1362,855418401,MDEyOklzc3VlQ29tbWVudDg1NTQxODQwMQ==,9599,simonw,2021-06-06T15:39:38Z,2021-06-06T15:39:38Z,OWNER,"The security benefit of forcing all JavaScript plugins to be written as CSP-friendly external scripts is very compelling though. Other plugin-heavy ecosystems such as WordPress have suffered greatly from insecurely written plugins - could this be a huge security win for the Datasette ecosystem generally?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",912864936,Consider using CSP to protect against future XSS, https://github.com/simonw/datasette/issues/1362#issuecomment-855418065,https://api.github.com/repos/simonw/datasette/issues/1362,855418065,MDEyOklzc3VlQ29tbWVudDg1NTQxODA2NQ==,9599,simonw,2021-06-06T15:37:11Z,2021-06-06T15:37:11Z,OWNER,"The easiest way to apply CSP is to remove all inline `