html_url,issue_url,id,node_id,user,user_label,created_at,updated_at,author_association,body,reactions,issue,issue_label,performed_via_github_app https://github.com/simonw/sqlite-utils/issues/230#issuecomment-778812684,https://api.github.com/repos/simonw/sqlite-utils/issues/230,778812684,MDEyOklzc3VlQ29tbWVudDc3ODgxMjY4NA==,9599,simonw,2021-02-14T17:45:16Z,2021-02-14T17:45:16Z,OWNER,"Running this could take any CSV (or TSV) file and automatically detect the delimiter. If no header row is detected it could add `unknown1,unknown2` headers: sqlite-utils insert db.db data file.csv --sniff (Using `--sniff` would imply `--csv`) This could be called `--sniffer` instead but I like `--sniff` better.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",808008305,--sniff option for sniffing delimiters, https://github.com/simonw/sqlite-utils/issues/228#issuecomment-778812050,https://api.github.com/repos/simonw/sqlite-utils/issues/228,778812050,MDEyOklzc3VlQ29tbWVudDc3ODgxMjA1MA==,9599,simonw,2021-02-14T17:41:30Z,2021-02-14T17:41:30Z,OWNER,"I just spotted that `csv.Sniffer` in the Python standard library has a `.has_header(sample)` method which detects if the first row appears to be a header or not, which is interesting. https://docs.python.org/3/library/csv.html#csv.Sniffer","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",807437089,--no-headers option for CSV and TSV, https://github.com/simonw/sqlite-utils/issues/228#issuecomment-778811934,https://api.github.com/repos/simonw/sqlite-utils/issues/228,778811934,MDEyOklzc3VlQ29tbWVudDc3ODgxMTkzNA==,9599,simonw,2021-02-14T17:40:48Z,2021-02-14T17:40:48Z,OWNER,"Another pattern that might be useful is to generate a header that is just ""unknown1,unknown2,unknown3"" for each of the columns in the rest of the file. This makes it easy to e.g. facet-explore within Datasette to figure out the correct names, then use `sqlite-utils transform --rename` to rename the columns. I needed to do that for the https://bl.iro.bl.uk/work/ns/3037474a-761c-456d-a00c-9ef3c6773f4c example.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",807437089,--no-headers option for CSV and TSV, https://github.com/simonw/sqlite-utils/issues/228#issuecomment-778511347,https://api.github.com/repos/simonw/sqlite-utils/issues/228,778511347,MDEyOklzc3VlQ29tbWVudDc3ODUxMTM0Nw==,9599,simonw,2021-02-12T23:27:50Z,2021-02-12T23:27:50Z,OWNER,"For the moment, a workaround can be to `cat` an additional row onto the start of the file. 
echo ""name,url,description"" | cat - missing_headings.csv | sqlite-utils insert blah.db table - --csv","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",807437089,--no-headers option for CSV and TSV, https://github.com/simonw/sqlite-utils/issues/131#issuecomment-778510528,https://api.github.com/repos/simonw/sqlite-utils/issues/131,778510528,MDEyOklzc3VlQ29tbWVudDc3ODUxMDUyOA==,9599,simonw,2021-02-12T23:25:06Z,2021-02-12T23:25:06Z,OWNER,"If `-c` isn't available, maybe `-t` or `--type` would work for specifying column types: ``` sqlite-utils insert db.db images images.tsv \ --tsv \ --type id int \ --type score float ``` or ``` sqlite-utils insert db.db images images.tsv \ --tsv \ -t id int \ -t score float ```","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",675753042,sqlite-utils insert: options for column types, https://github.com/simonw/sqlite-utils/issues/131#issuecomment-778508887,https://api.github.com/repos/simonw/sqlite-utils/issues/131,778508887,MDEyOklzc3VlQ29tbWVudDc3ODUwODg4Nw==,9599,simonw,2021-02-12T23:20:11Z,2021-02-12T23:20:11Z,OWNER,"Annoyingly `-c` is currently a shortcut for `--csv` - so I'd have to do a major version bump to use that. https://github.com/simonw/sqlite-utils/blob/726219c3503e77440975cd15b74d006639feb0f8/sqlite_utils/cli.py#L601-L603 Particularly annoying because I attempted to remove the `-c` shortcut in https://github.com/simonw/sqlite-utils/commit/2c00567aac6d9c79087cfff0d054f64922b1473d#diff-76294b3d4afeb27e74e738daa01c26dd4dc9ccb6f4477451483a2ece1095902eL48 but forgot to remove it from the input options (I removed it from the output options).","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",675753042,sqlite-utils insert: options for column types, https://github.com/simonw/datasette/issues/1220#issuecomment-778467759,https://api.github.com/repos/simonw/datasette/issues/1220,778467759,MDEyOklzc3VlQ29tbWVudDc3ODQ2Nzc1OQ==,30607,aborruso,2021-02-12T21:35:17Z,2021-02-12T21:35:17Z,NONE,Thank you,"{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",806743116,Installing datasette via docker: Path 'fixtures.db' does not exist, https://github.com/simonw/datasette/issues/1220#issuecomment-778439617,https://api.github.com/repos/simonw/datasette/issues/1220,778439617,MDEyOklzc3VlQ29tbWVudDc3ODQzOTYxNw==,7476523,bobwhitelock,2021-02-12T20:33:27Z,2021-02-12T20:33:27Z,CONTRIBUTOR,"That Docker command will mount your current directory inside the Docker container at `/mnt` - so you shouldn't need to change anything locally, just run ``` docker run -p 8001:8001 -v `pwd`:/mnt \ datasetteproject/datasette \ datasette -p 8001 -h 0.0.0.0 /mnt/fixtures.db ``` and it will use the `fixtures.db` file within your current directory","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",806743116,Installing datasette via docker: Path 'fixtures.db' does not exist, https://github.com/dogsheep/github-to-sqlite/issues/60#issuecomment-770069864,https://api.github.com/repos/dogsheep/github-to-sqlite/issues/60,770069864,MDEyOklzc3VlQ29tbWVudDc3MDA2OTg2NA==,22578954,daniel-butler,2021-01-29T21:52:05Z,2021-02-12T18:29:43Z,CONTRIBUTOR,"For the purposes below I am assuming the 
organization I would get all the repositories and their related commits from is called `gh-organization`. The github's owner id of gh-orgnization is `123456789`. ```bash github-to-sqlite repos github.db gh-organization ``` I'm on a windows computer running git bash to be able to use the `|` command. This works for me ```bash sqlite3 github.db ""SELECT full_name FROM repos WHERE owner = '123456789';"" | tr '\n\r' ' ' | xargs | { read repos; github-to-sqlite commits github.db $repos; } ``` On a pure linux system I think this would work because the new line character is normally `\n` ```bash sqlite3 github.db ""SELECT full_name FROM repos WHERE owner = '123456789';"" | tr '\n' ' ' | xargs | { read repos; github-to-sqlite commits github.db $repos; }` ``` As expected I ran into rate limit issues #51 ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",797097140,Use Data from SQLite in other commands, https://github.com/simonw/sqlite-utils/issues/228#issuecomment-778349672,https://api.github.com/repos/simonw/sqlite-utils/issues/228,778349672,MDEyOklzc3VlQ29tbWVudDc3ODM0OTY3Mg==,9599,simonw,2021-02-12T18:00:43Z,2021-02-12T18:00:43Z,OWNER,"I could combine this with #131 to allow types to be specified in addition to column names. Probably need an option that means ""ignore the existing heading row and use this one instead"".","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",807437089,--no-headers option for CSV and TSV, https://github.com/simonw/sqlite-utils/issues/227#issuecomment-778349142,https://api.github.com/repos/simonw/sqlite-utils/issues/227,778349142,MDEyOklzc3VlQ29tbWVudDc3ODM0OTE0Mg==,9599,simonw,2021-02-12T17:59:35Z,2021-02-12T17:59:35Z,OWNER,It looks like I can at least bump this size limit up to the maximum allowed by Python - I'll take a look at that. ,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",807174161,Error reading csv files with large column data, https://github.com/dogsheep/dogsheep-photos/issues/33#issuecomment-778246347,https://api.github.com/repos/dogsheep/dogsheep-photos/issues/33,778246347,MDEyOklzc3VlQ29tbWVudDc3ODI0NjM0Nw==,41546558,RhetTbull,2021-02-12T15:00:43Z,2021-02-12T15:00:43Z,CONTRIBUTOR,"Yes, Big Sur Photos database doesn't have `ZGENERICASSET` table. PR #31 will fix this.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",803338729,photo-to-sqlite: command not found, https://github.com/dogsheep/dogsheep-photos/issues/33#issuecomment-778014990,https://api.github.com/repos/dogsheep/dogsheep-photos/issues/33,778014990,MDEyOklzc3VlQ29tbWVudDc3ODAxNDk5MA==,675335,leafgarland,2021-02-12T06:54:14Z,2021-02-12T06:54:14Z,NONE,"Ahh, that might be because macOS Big Sur has changed the structure of the photos db. Might need to wait for a later release, there is a PR which adds support for Big Sur. 
","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",803338729,photo-to-sqlite: command not found, https://github.com/simonw/datasette/issues/1220#issuecomment-778008752,https://api.github.com/repos/simonw/datasette/issues/1220,778008752,MDEyOklzc3VlQ29tbWVudDc3ODAwODc1Mg==,30607,aborruso,2021-02-12T06:37:34Z,2021-02-12T06:37:34Z,NONE,"I have used my path, I'm running it from the folder in wich I have the db. Do I must an absolute path? Do I must create exactly that folder?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",806743116,Installing datasette via docker: Path 'fixtures.db' does not exist, https://github.com/dogsheep/dogsheep-photos/issues/33#issuecomment-778002092,https://api.github.com/repos/dogsheep/dogsheep-photos/issues/33,778002092,MDEyOklzc3VlQ29tbWVudDc3ODAwMjA5Mg==,11855322,robmarkcole,2021-02-12T06:19:32Z,2021-02-12T06:19:32Z,NONE,"hi @leafgarland that results in a new error: ``` (venv) (base) Robins-MacBook:datasette robin$ dogsheep-photos apple-photos photos.db Traceback (most recent call last): File ""/Users/robin/datasette/venv/bin/dogsheep-photos"", line 8, in <module> sys.exit(cli()) File ""/Users/robin/datasette/venv/lib/python3.8/site-packages/click/core.py"", line 829, in __call__ return self.main(*args, **kwargs) File ""/Users/robin/datasette/venv/lib/python3.8/site-packages/click/core.py"", line 782, in main rv = self.invoke(ctx) File ""/Users/robin/datasette/venv/lib/python3.8/site-packages/click/core.py"", line 1259, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/Users/robin/datasette/venv/lib/python3.8/site-packages/click/core.py"", line 1066, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/Users/robin/datasette/venv/lib/python3.8/site-packages/click/core.py"", line 610, in invoke return callback(*args, **kwargs) File ""/Users/robin/datasette/venv/lib/python3.8/site-packages/dogsheep_photos/cli.py"", line 206, in apple_photos db.conn.execute( sqlite3.OperationalError: no such table: attached.ZGENERICASSET ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",803338729,photo-to-sqlite: command not found, https://github.com/dogsheep/dogsheep-photos/issues/33#issuecomment-777951854,https://api.github.com/repos/dogsheep/dogsheep-photos/issues/33,777951854,MDEyOklzc3VlQ29tbWVudDc3Nzk1MTg1NA==,675335,leafgarland,2021-02-12T03:54:39Z,2021-02-12T03:54:39Z,NONE,"I think that is a typo in the docs, you can use > dogsheep-photos apple-photos photos.db","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",803338729,photo-to-sqlite: command not found, https://github.com/simonw/datasette/pull/1223#issuecomment-777949755,https://api.github.com/repos/simonw/datasette/issues/1223,777949755,MDEyOklzc3VlQ29tbWVudDc3Nzk0OTc1NQ==,22429695,codecov[bot],2021-02-12T03:45:31Z,2021-02-12T03:45:31Z,NONE,"# [Codecov](https://codecov.io/gh/simonw/datasette/pull/1223?src=pr&el=h1) Report > Merging [#1223](https://codecov.io/gh/simonw/datasette/pull/1223?src=pr&el=desc) (d1cd1f2) into [main](https://codecov.io/gh/simonw/datasette/commit/9603d893b9b72653895318c9104d754229fdb146?el=desc) (9603d89) will **not change** coverage. > The diff coverage is `n/a`. 
[](https://codecov.io/gh/simonw/datasette/pull/1223?src=pr&el=tree) ```diff @@ Coverage Diff @@ ## main #1223 +/- ## ======================================= Coverage 91.42% 91.42% ======================================= Files 32 32 Lines 3955 3955 ======================================= Hits 3616 3616 Misses 339 339 ``` ------ [Continue to review full report at Codecov](https://codecov.io/gh/simonw/datasette/pull/1223?src=pr&el=continue). > **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta) > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data` > Powered by [Codecov](https://codecov.io/gh/simonw/datasette/pull/1223?src=pr&el=footer). Last update [9603d89...d1cd1f2](https://codecov.io/gh/simonw/datasette/pull/1223?src=pr&el=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments). ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",806918878,Add compile option to Dockerfile to fix failing test (fixes #696), https://github.com/simonw/datasette/issues/1220#issuecomment-777927946,https://api.github.com/repos/simonw/datasette/issues/1220,777927946,MDEyOklzc3VlQ29tbWVudDc3NzkyNzk0Ng==,7476523,bobwhitelock,2021-02-12T02:29:54Z,2021-02-12T02:29:54Z,CONTRIBUTOR,"According to https://github.com/simonw/datasette/blob/master/docs/installation.rst#using-docker it should be ``` docker run -p 8001:8001 -v `pwd`:/mnt \ datasetteproject/datasette \ datasette -p 8001 -h 0.0.0.0 /mnt/fixtures.db ``` This uses `/mnt/fixtures.db` whereas you're using `fixtures.db` - did you try using this path instead?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",806743116,Installing datasette via docker: Path 'fixtures.db' does not exist, https://github.com/simonw/datasette/issues/1221#issuecomment-777901052,https://api.github.com/repos/simonw/datasette/issues/1221,777901052,MDEyOklzc3VlQ29tbWVudDc3NzkwMTA1Mg==,9599,simonw,2021-02-12T01:09:54Z,2021-02-12T01:09:54Z,OWNER,"I also tested this manually. I generated certificate files like so: cd /tmp python -m trustme This created `/tmp/server.pem`, `/tmp/client.pem` and `/tmp/server.key` Then I started Datasette like this: datasette --memory --ssl-keyfile=/tmp/server.key --ssl-certfile=/tmp/server.pem And exercise it using `curl` like so: /tmp % curl --cacert /tmp/client.pem 'https://localhost:8001/_memory.json' {""database"": ""_memory"", ""path"": ""/_memory"", ""size"": 0, ""tables"": [], ""hidden_count"": 0, ""views"": [], ""queries"": [], ""private"": false, ""allow_execute_sql"": true, ""query_ms"": 0.8843200000114848} Note that without the `--cacert` option I get an error: ``` /tmp % curl 'https://localhost:8001/_memory.json' curl: (60) SSL certificate problem: Invalid certificate chain More details here: https://curl.haxx.se/docs/sslcerts.html curl failed to verify the legitimacy of the server and therefore could not establish a secure connection to it. To learn more about this situation and how to fix it, please visit the web page mentioned above. 
","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",806849424,Support SSL/TLS directly, https://github.com/simonw/datasette/issues/1221#issuecomment-777887190,https://api.github.com/repos/simonw/datasette/issues/1221,777887190,MDEyOklzc3VlQ29tbWVudDc3Nzg4NzE5MA==,9599,simonw,2021-02-12T00:29:18Z,2021-02-12T00:29:18Z,OWNER,I can use this recipe to start a `datasette` server in a sub-process during the pytest run and exercise it with real HTTP requests: https://til.simonwillison.net/pytest/subprocess-server,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",806849424,Support SSL/TLS directly, https://github.com/simonw/datasette/issues/1221#issuecomment-777883452,https://api.github.com/repos/simonw/datasette/issues/1221,777883452,MDEyOklzc3VlQ29tbWVudDc3Nzg4MzQ1Mg==,9599,simonw,2021-02-12T00:19:30Z,2021-02-12T00:19:40Z,OWNER,"Uvicorn supports these options: https://www.uvicorn.org/#command-line-options ``` --ssl-keyfile TEXT SSL key file --ssl-certfile TEXT SSL certificate file --ssl-keyfile-password TEXT SSL keyfile password --ssl-version INTEGER SSL version to use (see stdlib ssl module's) [default: 2] --ssl-cert-reqs INTEGER Whether client certificate is required (see stdlib ssl module's) [default: 0] --ssl-ca-certs TEXT CA certificates file --ssl-ciphers TEXT Ciphers to use (see stdlib ssl module's) [default: TLSv1] ``` For the moment I'm going to support just `--ssl-keyfile` and `--ssl-certfile` as arguments to `datasette serve`. I'll add other options if people ask for them.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",806849424,Support SSL/TLS directly, https://github.com/dogsheep/evernote-to-sqlite/pull/10#issuecomment-777839351,https://api.github.com/repos/dogsheep/evernote-to-sqlite/issues/10,777839351,MDEyOklzc3VlQ29tbWVudDc3NzgzOTM1MQ==,9599,simonw,2021-02-11T22:37:55Z,2021-02-11T22:37:55Z,MEMBER,"I've merged these changes by hand now, thanks!","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",770712149,BugFix for encoding and not update info., https://github.com/dogsheep/evernote-to-sqlite/issues/7#issuecomment-777827396,https://api.github.com/repos/dogsheep/evernote-to-sqlite/issues/7,777827396,MDEyOklzc3VlQ29tbWVudDc3NzgyNzM5Ng==,9599,simonw,2021-02-11T22:13:14Z,2021-02-11T22:13:14Z,MEMBER,My best guess is that you have an older version of `sqlite-utils` installed here - the `replace=True` argument was added in version 2.0. I've bumped the dependency in `setup.py`.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",743297582,evernote-to-sqlite on windows 10 give this error: TypeError: insert() got an unexpected keyword argument 'replace', https://github.com/dogsheep/evernote-to-sqlite/issues/9#issuecomment-777821383,https://api.github.com/repos/dogsheep/evernote-to-sqlite/issues/9,777821383,MDEyOklzc3VlQ29tbWVudDc3NzgyMTM4Mw==,9599,simonw,2021-02-11T22:01:28Z,2021-02-11T22:01:28Z,MEMBER,"Aha! I think I've figured out what's going on here. 
The CData blocks containing the notes look like this: `<![CDATA[<!DOCTYPE en-note SYSTEM ""http://xml.evernote.com/pub/enml2.dtd""><en-note><div>This note includes two images.</div><div><br /></div>...` The DTD at http://xml.evernote.com/pub/enml2.dtd includes some entities: ``` <!--=========== External character mnemonic entities ===================--> <!ENTITY % HTMLlat1 PUBLIC ""-//W3C//ENTITIES Latin 1 for XHTML//EN"" ""http://www.w3.org/TR/xhtml1/DTD/xhtml-lat1.ent""> %HTMLlat1; <!ENTITY % HTMLsymbol PUBLIC ""-//W3C//ENTITIES Symbols for XHTML//EN"" ""http://www.w3.org/TR/xhtml1/DTD/xhtml-symbol.ent""> %HTMLsymbol; <!ENTITY % HTMLspecial PUBLIC ""-//W3C//ENTITIES Special for XHTML//EN"" ""http://www.w3.org/TR/xhtml1/DTD/xhtml-special.ent""> %HTMLspecial; ``` So I need to be able to handle all of those different entities. I think I can do that using `html.entities.entitydefs` from the Python standard library, which looks a bit like this: ```python {'Aacute': 'Á', 'aacute': 'á', 'Aacute;': 'Á', 'aacute;': 'á', 'Abreve;': 'Ă', 'abreve;': 'ă', 'ac;': '∾', 'acd;': '∿', # ... } ``` ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",748372469,ParseError: undefined entity š, https://github.com/dogsheep/evernote-to-sqlite/issues/11#issuecomment-777798330,https://api.github.com/repos/dogsheep/evernote-to-sqlite/issues/11,777798330,MDEyOklzc3VlQ29tbWVudDc3Nzc5ODMzMA==,9599,simonw,2021-02-11T21:18:58Z,2021-02-11T21:18:58Z,MEMBER,Thanks for the fix!,"{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",792851444,XML parse error, https://github.com/dogsheep/evernote-to-sqlite/issues/11#issuecomment-777690332,https://api.github.com/repos/dogsheep/evernote-to-sqlite/issues/11,777690332,MDEyOklzc3VlQ29tbWVudDc3NzY5MDMzMg==,3613583,dskrad,2021-02-11T18:16:01Z,2021-02-11T18:16:01Z,NONE,"I solved this issue by modifying line 31 of utils.py content = ET.tostring(ET.fromstring(content_xml.strip())).decode(""utf-8"")","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",792851444,XML parse error, https://github.com/simonw/datasette/issues/1200#issuecomment-777178728,https://api.github.com/repos/simonw/datasette/issues/1200,777178728,MDEyOklzc3VlQ29tbWVudDc3NzE3ODcyOA==,9599,simonw,2021-02-11T03:13:59Z,2021-02-11T03:13:59Z,OWNER,"I came up with the need for this while playing with this tool: 
https://calands.datasettes.com/calands?sql=select%0D%0A++AsGeoJSON(geometry)%2C+*%0D%0Afrom%0D%0A++CPAD_2020a_SuperUnits%0D%0Awhere%0D%0A++PARK_NAME+like+'%25mini%25'+and%0D%0A++Intersects(GeomFromGeoJSON(%3Afreedraw)%2C+geometry)+%3D+1%0D%0A++and+CPAD_2020a_SuperUnits.rowid+in+(%0D%0A++++select%0D%0A++++++rowid%0D%0A++++from%0D%0A++++++SpatialIndex%0D%0A++++where%0D%0A++++++f_table_name+%3D+'CPAD_2020a_SuperUnits'%0D%0A++++++and+search_frame+%3D+GeomFromGeoJSON(%3Afreedraw)%0D%0A++)&freedraw={""type""%3A""MultiPolygon""%2C""coordinates""%3A[[[[-122.42202758789064%2C37.82280243352759]%2C[-122.39868164062501%2C37.823887203271454]%2C[-122.38220214843751%2C37.81846319511331]%2C[-122.35061645507814%2C37.77071473849611]%2C[-122.34924316406251%2C37.74465712069939]%2C[-122.37258911132814%2C37.703380457832374]%2C[-122.39044189453125%2C37.690340943717715]%2C[-122.41241455078126%2C37.680559803205135]%2C[-122.44262695312501%2C37.67295135774715]%2C[-122.47283935546876%2C37.67295135774715]%2C[-122.52502441406251%2C37.68382032669382]%2C[-122.53463745117189%2C37.6892542140253]%2C[-122.54699707031251%2C37.690340943717715]%2C[-122.55798339843751%2C37.72945260537781]%2C[-122.54287719726564%2C37.77831314799672]%2C[-122.49893188476564%2C37.81303878836991]%2C[-122.46185302734376%2C37.82822612280363]%2C[-122.42889404296876%2C37.82822612280363]%2C[-122.42202758789064%2C37.82280243352759]]]]} - before I fixed https://github.com/simonw/datasette-leaflet-geojson/issues/16 it was loading a LOT of maps, which felt bad. I wanted to be able to link people to that page with a hard limit on the number of rows displayed on that page. It's mainly to guard against unexpected behaviour from limit-less queries though. It's not a very high priority feature!","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",792890765,?_size=10 option for the arbitrary query page would be useful, https://github.com/simonw/datasette/issues/1200#issuecomment-777132761,https://api.github.com/repos/simonw/datasette/issues/1200,777132761,MDEyOklzc3VlQ29tbWVudDc3NzEzMjc2MQ==,7476523,bobwhitelock,2021-02-11T00:29:52Z,2021-02-11T00:29:52Z,CONTRIBUTOR,I'm probably missing something but what's the use case here - what would this offer over adding `limit 10` to the query?,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",792890765,?_size=10 option for the arbitrary query page would be useful, https://github.com/simonw/datasette/issues/1219#issuecomment-775442039,https://api.github.com/repos/simonw/datasette/issues/1219,775442039,MDEyOklzc3VlQ29tbWVudDc3NTQ0MjAzOQ==,9599,simonw,2021-02-08T20:39:52Z,2021-02-08T22:13:00Z,OWNER,"This comment helped me find a pattern for running Scalene against the Datasette test suite: https://github.com/emeryberger/scalene/issues/70#issuecomment-755245858 ``` pip install scalene ``` Then I created a file called `run_tests.py` with the following contents: ```python if __name__ == ""__main__"": import sys, pytest pytest.main(sys.argv) ``` Then I ran this: ``` scalene --profile-all run_tests.py -sv -x . ``` But... it quit with a segmentation fault! ``` (datasette) datasette % scalene --profile-all run_tests.py -sv -x . 
======================================================================== test session starts ======================================================================== platform darwin -- Python 3.8.6, pytest-6.0.1, py-1.9.0, pluggy-0.13.1 -- python cachedir: .pytest_cache rootdir: /Users/simon/Dropbox/Development/datasette, configfile: pytest.ini plugins: asyncio-0.14.0, timeout-1.4.2 collecting ... Fatal Python error: Segmentation fault Current thread 0x0000000110c1edc0 (most recent call first): File ""/Users/simon/Dropbox/Development/datasette/datasette/utils/__init__.py"", line 553 in detect_json1 File ""/Users/simon/Dropbox/Development/datasette/datasette/filters.py"", line 168 in Filters File ""/Users/simon/Dropbox/Development/datasette/datasette/filters.py"", line 94 in <module> File ""<frozen importlib._bootstrap>"", line 219 in _call_with_frames_removed File ""<frozen importlib._bootstrap_external>"", line 783 in exec_module File ""<frozen importlib._bootstrap>"", line 671 in _load_unlocked File ""<frozen importlib._bootstrap>"", line 975 in _find_and_load_unlocked File ""<frozen importlib._bootstrap>"", line 991 in _find_and_load File ""/Users/simon/Dropbox/Development/datasette/datasette/views/table.py"", line 27 in <module> File ""<frozen importlib._bootstrap>"", line 219 in _call_with_frames_removed File ""<frozen importlib._bootstrap_external>"", line 783 in exec_module File ""<frozen importlib._bootstrap>"", line 671 in _load_unlocked File ""<frozen importlib._bootstrap>"", line 975 in _find_and_load_unlocked File ""<frozen importlib._bootstrap>"", line 991 in _find_and_load File ""/Users/simon/Dropbox/Development/datasette/datasette/app.py"", line 42 in <module> File ""<frozen importlib._bootstrap>"", line 219 in _call_with_frames_removed File ""<frozen importlib._bootstrap_external>"", line 783 in exec_module File ""<frozen importlib._bootstrap>"", line 671 in _load_unlocked File ""<frozen importlib._bootstrap>"", line 975 in _find_and_load_unlocked File ""<frozen importlib._bootstrap>"", line 991 in _find_and_load File ""/Users/simon/Dropbox/Development/datasette/tests/test_api.py"", line 1 in <module> File ""/Users/simon/.local/share/virtualenvs/datasette-AWNrQs95/lib/python3.8/site-packages/_pytest/assertion/rewrite.py"", line 170 in exec_module File ""<frozen importlib._bootstrap>"", line 671 in _load_unlocked File ""<frozen importlib._bootstrap>"", line 975 in _find_and_load_unlocked File ""<frozen importlib._bootstrap>"", line 991 in _find_and_load File ""<frozen importlib._bootstrap>"", line 1014 in _gcd_import File ""/usr/local/opt/python@3.8/Frameworks/Python.framework/Versions/3.8/lib/python3.8/importlib/__init__.py"", line 127 in import_module File ""/Users/simon/.local/share/virtualenvs/datasette-AWNrQs95/lib/python3.8/site-packages/_pytest/pathlib.py"", line 520 in import_path File ""/Users/simon/.local/share/virtualenvs/datasette-AWNrQs95/lib/python3.8/site-packages/_pytest/python.py"", line 552 in _importtestmodule File ""/Users/simon/.local/share/virtualenvs/datasette-AWNrQs95/lib/python3.8/site-packages/_pytest/python.py"", line 484 in _getobj File ""/Users/simon/.local/share/virtualenvs/datasette-AWNrQs95/lib/python3.8/site-packages/_pytest/python.py"", line 288 in obj File ""/Users/simon/.local/share/virtualenvs/datasette-AWNrQs95/lib/python3.8/site-packages/_pytest/python.py"", line 500 in _inject_setup_module_fixture File ""/Users/simon/.local/share/virtualenvs/datasette-AWNrQs95/lib/python3.8/site-packages/_pytest/python.py"", line 487 in collect File 
""/Users/simon/.local/share/virtualenvs/datasette-AWNrQs95/lib/python3.8/site-packages/_pytest/runner.py"", line 324 in <lambda> File ""/Users/simon/.local/share/virtualenvs/datasette-AWNrQs95/lib/python3.8/site-packages/_pytest/runner.py"", line 294 in from_call File ""/Users/simon/.local/share/virtualenvs/datasette-AWNrQs95/lib/python3.8/site-packages/_pytest/runner.py"", line 324 in pytest_make_collect_report File ""/Users/simon/.local/share/virtualenvs/datasette-AWNrQs95/lib/python3.8/site-packages/pluggy/callers.py"", line 187 in _multicall File ""/Users/simon/.local/share/virtualenvs/datasette-AWNrQs95/lib/python3.8/site-packages/pluggy/manager.py"", line 84 in <lambda> File ""/Users/simon/.local/share/virtualenvs/datasette-AWNrQs95/lib/python3.8/site-packages/pluggy/manager.py"", line 93 in _hookexec File ""/Users/simon/.local/share/virtualenvs/datasette-AWNrQs95/lib/python3.8/site-packages/pluggy/hooks.py"", line 286 in __call__ File ""/Users/simon/.local/share/virtualenvs/datasette-AWNrQs95/lib/python3.8/site-packages/_pytest/runner.py"", line 441 in collect_one_node File ""/Users/simon/.local/share/virtualenvs/datasette-AWNrQs95/lib/python3.8/site-packages/_pytest/main.py"", line 768 in genitems File ""/Users/simon/.local/share/virtualenvs/datasette-AWNrQs95/lib/python3.8/site-packages/_pytest/main.py"", line 771 in genitems File ""/Users/simon/.local/share/virtualenvs/datasette-AWNrQs95/lib/python3.8/site-packages/_pytest/main.py"", line 568 in _perform_collect File ""/Users/simon/.local/share/virtualenvs/datasette-AWNrQs95/lib/python3.8/site-packages/_pytest/main.py"", line 516 in perform_collect File ""/Users/simon/.local/share/virtualenvs/datasette-AWNrQs95/lib/python3.8/site-packages/_pytest/main.py"", line 306 in pytest_collection File ""/Users/simon/.local/share/virtualenvs/datasette-AWNrQs95/lib/python3.8/site-packages/pluggy/callers.py"", line 187 in _multicall File ""/Users/simon/.local/share/virtualenvs/datasette-AWNrQs95/lib/python3.8/site-packages/pluggy/manager.py"", line 84 in <lambda> File ""/Users/simon/.local/share/virtualenvs/datasette-AWNrQs95/lib/python3.8/site-packages/pluggy/manager.py"", line 93 in _hookexec File ""/Users/simon/.local/share/virtualenvs/datasette-AWNrQs95/lib/python3.8/site-packages/pluggy/hooks.py"", line 286 in __call__ File ""/Users/simon/.local/share/virtualenvs/datasette-AWNrQs95/lib/python3.8/site-packages/_pytest/main.py"", line 295 in _main File ""/Users/simon/.local/share/virtualenvs/datasette-AWNrQs95/lib/python3.8/site-packages/_pytest/main.py"", line 240 in wrap_session File ""/Users/simon/.local/share/virtualenvs/datasette-AWNrQs95/lib/python3.8/site-packages/_pytest/main.py"", line 289 in pytest_cmdline_main File ""/Users/simon/.local/share/virtualenvs/datasette-AWNrQs95/lib/python3.8/site-packages/pluggy/callers.py"", line 187 in _multicall File ""/Users/simon/.local/share/virtualenvs/datasette-AWNrQs95/lib/python3.8/site-packages/pluggy/manager.py"", line 84 in <lambda> File ""/Users/simon/.local/share/virtualenvs/datasette-AWNrQs95/lib/python3.8/site-packages/pluggy/manager.py"", line 93 in _hookexec File ""/Users/simon/.local/share/virtualenvs/datasette-AWNrQs95/lib/python3.8/site-packages/pluggy/hooks.py"", line 286 in __call__ File ""/Users/simon/.local/share/virtualenvs/datasette-AWNrQs95/lib/python3.8/site-packages/_pytest/config/__init__.py"", line 157 in main File ""run_tests.py"", line 3 in <module> File 
""/Users/simon/.local/share/virtualenvs/datasette-AWNrQs95/lib/python3.8/site-packages/scalene/scalene_profiler.py"", line 1525 in main File ""/Users/simon/.local/share/virtualenvs/datasette-AWNrQs95/lib/python3.8/site-packages/scalene/__main__.py"", line 7 in main File ""/Users/simon/.local/share/virtualenvs/datasette-AWNrQs95/lib/python3.8/site-packages/scalene/__main__.py"", line 14 in <module> File ""/usr/local/opt/python@3.8/Frameworks/Python.framework/Versions/3.8/lib/python3.8/runpy.py"", line 87 in _run_code File ""/usr/local/opt/python@3.8/Frameworks/Python.framework/Versions/3.8/lib/python3.8/runpy.py"", line 194 in _run_module_as_main Scalene error: received signal SIGSEGV ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",803929694,Try profiling Datasette using scalene, https://github.com/simonw/datasette/issues/1219#issuecomment-775497449,https://api.github.com/repos/simonw/datasette/issues/1219,775497449,MDEyOklzc3VlQ29tbWVudDc3NTQ5NzQ0OQ==,9599,simonw,2021-02-08T22:11:34Z,2021-02-08T22:11:34Z,OWNER,"https://github.com/emeryberger/scalene/issues/110 reports a ""received signal SIGSEGV"" error that was fixed by upgrading to the latest Scalene version, but I'm running that already.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",803929694,Try profiling Datasette using scalene, https://github.com/dogsheep/pocket-to-sqlite/issues/9#issuecomment-774730656,https://api.github.com/repos/dogsheep/pocket-to-sqlite/issues/9,774730656,MDEyOklzc3VlQ29tbWVudDc3NDczMDY1Ng==,635179,merwok,2021-02-07T18:45:04Z,2021-02-07T18:45:04Z,NONE,"That URL uses TLS 1.3, but maybe only if the client supports it. 
It could be your Python version or your SSL library that’s not recent enough.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",801780625,SSL Error, https://github.com/dogsheep/pocket-to-sqlite/issues/9#issuecomment-774726123,https://api.github.com/repos/dogsheep/pocket-to-sqlite/issues/9,774726123,MDEyOklzc3VlQ29tbWVudDc3NDcyNjEyMw==,12669260,jfeiwell,2021-02-07T18:21:08Z,2021-02-07T18:21:08Z,NONE,@simonw any ideas here?,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",801780625,SSL Error, https://github.com/simonw/datasette/issues/1217#issuecomment-774528913,https://api.github.com/repos/simonw/datasette/issues/1217,774528913,MDEyOklzc3VlQ29tbWVudDc3NDUyODkxMw==,639730,virtadpt,2021-02-06T19:23:41Z,2021-02-06T19:23:41Z,NONE,I've had a lot of success running it as an OpenFaaS lambda.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",802513359,Possible to deploy as a python app (for Rstudio connect server)?, https://github.com/simonw/datasette/issues/1217#issuecomment-774385092,https://api.github.com/repos/simonw/datasette/issues/1217,774385092,MDEyOklzc3VlQ29tbWVudDc3NDM4NTA5Mg==,6165713,plpxsk,2021-02-06T02:49:11Z,2021-02-06T02:49:11Z,NONE,"A good reference seems to be the note to run `datasette` as a module in https://github.com/simonw/datasette/pull/556 ","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",802513359,Possible to deploy as a python app (for Rstudio connect server)?, https://github.com/simonw/sqlite-utils/issues/223#issuecomment-774373829,https://api.github.com/repos/simonw/sqlite-utils/issues/223,774373829,MDEyOklzc3VlQ29tbWVudDc3NDM3MzgyOQ==,9599,simonw,2021-02-06T01:39:47Z,2021-02-06T01:39:47Z,OWNER,Documentation: https://sqlite-utils.datasette.io/en/stable/cli.html#cli-insert-csv-tsv-delimiter,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",788527932,--delimiter option for CSV import, https://github.com/simonw/datasette/issues/1208#issuecomment-774286962,https://api.github.com/repos/simonw/datasette/issues/1208,774286962,MDEyOklzc3VlQ29tbWVudDc3NDI4Njk2Mg==,4488943,kbaikov,2021-02-05T21:02:39Z,2021-02-05T21:02:39Z,CONTRIBUTOR,@simonw could you please take a look at the PR 1211 that fixes this issue?,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",794554881,A lot of open(file) functions are used without a context manager thus producing ResourceWarning: unclosed file <_io.TextIOWrapper, https://github.com/simonw/sqlite-utils/pull/203#issuecomment-774217792,https://api.github.com/repos/simonw/sqlite-utils/issues/203,774217792,MDEyOklzc3VlQ29tbWVudDc3NDIxNzc5Mg==,1049910,drkane,2021-02-05T18:44:13Z,2021-02-05T18:44:13Z,NONE,"Thanks for looking at this - home schooling kids has prevented me from replying. I'd struggled with how to adapt the API for the foreign keys too - I definitely tried the String/Tuple approach. I hadn't considered the breaking changes that would introduce though. I can take a look at this and try and make the change - see which of your options works best. 
I've got a workaround for the use-case I was looking at this for, so it wouldn't be a problem for me if it was put on the back burner until a hypothetical v4.0 anyway. ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",743384829,changes to allow for compound foreign keys, https://github.com/simonw/datasette/issues/1210#issuecomment-773977128,https://api.github.com/repos/simonw/datasette/issues/1210,773977128,MDEyOklzc3VlQ29tbWVudDc3Mzk3NzEyOA==,525780,heyarne,2021-02-05T11:30:34Z,2021-02-05T11:30:34Z,NONE,"Thanks for your quick reply! Having changed my `metadata.yml`, queries AND database I can't really reproduce it anymore, sorry. But at least I'm happy to say that it works now! :) Thanks again for the super nifty tool, very appreciated.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",796234313,Immutable Database w/ Canned Queries, https://github.com/simonw/datasette/issues/1216#issuecomment-772796111,https://api.github.com/repos/simonw/datasette/issues/1216,772796111,MDEyOklzc3VlQ29tbWVudDc3Mjc5NjExMQ==,9599,simonw,2021-02-03T20:20:48Z,2021-02-03T20:20:48Z,OWNER,Relevant code: https://github.com/simonw/datasette/blob/1600d2a3ec3ada1f6fb5b1eb73bdaeccb5f80530/datasette/app.py#L620-L632,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",800669347,"/-/databases should reflect connection order, not alphabetical order", https://github.com/dogsheep/twitter-to-sqlite/issues/56#issuecomment-772408273,https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/56,772408273,MDEyOklzc3VlQ29tbWVudDc3MjQwODI3Mw==,42315895,gsajko,2021-02-03T10:36:36Z,2021-02-03T10:36:36Z,NONE,"I figured it out. Those tweets are in database, because somebody quote tweeted them, or retweeted them. And if you grab quoted tweet or reweeted tweet from other tweet json, It doesn't grab all of the details. So if someone quote tweeted a quote tweet, the second quote tweet won't have `quoted_status`. ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",796736607,Not all quoted statuses get fetched?, https://github.com/simonw/datasette/issues/1212#issuecomment-772007663,https://api.github.com/repos/simonw/datasette/issues/1212,772007663,MDEyOklzc3VlQ29tbWVudDc3MjAwNzY2Mw==,4488943,kbaikov,2021-02-02T21:36:56Z,2021-02-02T21:36:56Z,CONTRIBUTOR,"How do you get 4-5 minutes? I run my tests in WSL 2, so may be i need to try a real linux VM.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",797651831,Tests are very slow. 
, https://github.com/simonw/datasette/issues/1214#issuecomment-772001787,https://api.github.com/repos/simonw/datasette/issues/1214,772001787,MDEyOklzc3VlQ29tbWVudDc3MjAwMTc4Nw==,9599,simonw,2021-02-02T21:28:53Z,2021-02-02T21:28:53Z,OWNER,"Fix is now live on https://latest.datasette.io/fixtures/searchable?_search=terry - clearing ""terry"" and re-submitting the form now works as expected.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",799693777,Re-submitting filter form duplicates _x querystring arguments, https://github.com/simonw/datasette/issues/1214#issuecomment-771992628,https://api.github.com/repos/simonw/datasette/issues/1214,771992628,MDEyOklzc3VlQ29tbWVudDc3MTk5MjYyOA==,9599,simonw,2021-02-02T21:15:18Z,2021-02-02T21:15:18Z,OWNER,"The cause of this bug is form fields which begin with `_` but ARE displayed as form inputs on the page - hence should not be duplicated in an `<input type=""hidden"">` element.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",799693777,Re-submitting filter form duplicates _x querystring arguments, https://github.com/simonw/datasette/issues/1214#issuecomment-771992025,https://api.github.com/repos/simonw/datasette/issues/1214,771992025,MDEyOklzc3VlQ29tbWVudDc3MTk5MjAyNQ==,9599,simonw,2021-02-02T21:14:16Z,2021-02-02T21:14:16Z,OWNER,"As a result, navigating to https://github-to-sqlite.dogsheep.net/github/labels?_search=help and clearing out the `_search` field then submitting the form does NOT clear the search term.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",799693777,Re-submitting filter form duplicates _x querystring arguments, https://github.com/simonw/datasette/issues/1212#issuecomment-771976561,https://api.github.com/repos/simonw/datasette/issues/1212,771976561,MDEyOklzc3VlQ29tbWVudDc3MTk3NjU2MQ==,9599,simonw,2021-02-02T20:53:27Z,2021-02-02T20:53:27Z,OWNER,"It would be great if we could get `python-xdist` to run too - I tried it in the past and gave up when I ran into those race conditions, but I've not done any further digging to see if there's a way to fix that.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",797651831,Tests are very slow. , https://github.com/simonw/datasette/issues/1212#issuecomment-771975941,https://api.github.com/repos/simonw/datasette/issues/1212,771975941,MDEyOklzc3VlQ29tbWVudDc3MTk3NTk0MQ==,9599,simonw,2021-02-02T20:52:36Z,2021-02-02T20:52:36Z,OWNER,"37 minutes, wow! They're a little slow for me (4-5 minutes perhaps) but not nearly that bad. Thanks for running that profile. I think you're right: figuring out how to use more session scopes would definitely help. The `:memory:` idea is interesting too. The new `memory_name=` feature added in #1151 (released in Datasette 0.54) could help a lot here, since it allows Datasette instances to share the same in-memory database across multiple HTTP requests and connections. Note that `memory_name=` also persists within test runs themselves, independently of any `scope=` options on the fixtures. That might actually help us here! I'd be delighted if you explored this issue further, especially the option of using `memory_name=` for the fixtures databases used by the tests. 
","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",797651831,Tests are very slow. , https://github.com/simonw/datasette/issues/1213#issuecomment-771968675,https://api.github.com/repos/simonw/datasette/issues/1213,771968675,MDEyOklzc3VlQ29tbWVudDc3MTk2ODY3NQ==,9599,simonw,2021-02-02T20:41:55Z,2021-02-02T20:41:55Z,OWNER,"So maybe I could a special response header which ASGI middleware can pick up that means ""Don't attempt to gzip this, just stream it through"".","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",799663959,gzip support for HTML (and JSON) responses, https://github.com/simonw/datasette/issues/1213#issuecomment-771968177,https://api.github.com/repos/simonw/datasette/issues/1213,771968177,MDEyOklzc3VlQ29tbWVudDc3MTk2ODE3Nw==,9599,simonw,2021-02-02T20:41:13Z,2021-02-02T20:41:13Z,OWNER,"Starlette accumulates the full response body in a `body` variable and then does this: ```python elif message_type == ""http.response.body"": # Remaining body in streaming GZip response. body = message.get(""body"", b"""") more_body = message.get(""more_body"", False) self.gzip_file.write(body) if not more_body: self.gzip_file.close() message[""body""] = self.gzip_buffer.getvalue() self.gzip_buffer.seek(0) self.gzip_buffer.truncate() await self.send(message) ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",799663959,gzip support for HTML (and JSON) responses, https://github.com/simonw/datasette/issues/1213#issuecomment-771965281,https://api.github.com/repos/simonw/datasette/issues/1213,771965281,MDEyOklzc3VlQ29tbWVudDc3MTk2NTI4MQ==,9599,simonw,2021-02-02T20:37:08Z,2021-02-02T20:39:24Z,OWNER,Starlette's gzip middleware implementation is here: https://github.com/encode/starlette/blob/0.14.2/starlette/middleware/gzip.py,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",799663959,gzip support for HTML (and JSON) responses, https://github.com/simonw/datasette/pull/1211#issuecomment-771127458,https://api.github.com/repos/simonw/datasette/issues/1211,771127458,MDEyOklzc3VlQ29tbWVudDc3MTEyNzQ1OA==,4488943,kbaikov,2021-02-01T20:13:39Z,2021-02-01T20:13:39Z,CONTRIBUTOR,Ping @simonw ,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",797649915,Use context manager instead of plain open, https://github.com/simonw/datasette/pull/1159#issuecomment-770865698,https://api.github.com/repos/simonw/datasette/issues/1159,770865698,MDEyOklzc3VlQ29tbWVudDc3MDg2NTY5OA==,552629,lovasoa,2021-02-01T13:42:29Z,2021-02-01T13:42:29Z,NONE,@simonw : Could you have a look at this ? 
I think this really improves readability.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",774332247,Improve the display of facets information, https://github.com/simonw/datasette/pull/1211#issuecomment-770343684,https://api.github.com/repos/simonw/datasette/issues/1211,770343684,MDEyOklzc3VlQ29tbWVudDc3MDM0MzY4NA==,22429695,codecov[bot],2021-01-31T08:03:40Z,2021-01-31T08:03:40Z,NONE,"# [Codecov](https://codecov.io/gh/simonw/datasette/pull/1211?src=pr&el=h1) Report > Merging [#1211](https://codecov.io/gh/simonw/datasette/pull/1211?src=pr&el=desc) (e33ccaa) into [main](https://codecov.io/gh/simonw/datasette/commit/dde3c500c73ace33529672f7d862b76753d309cc?el=desc) (dde3c50) will **decrease** coverage by `0.00%`. > The diff coverage is `92.85%`. [](https://codecov.io/gh/simonw/datasette/pull/1211?src=pr&el=tree) ```diff @@ Coverage Diff @@ ## main #1211 +/- ## ========================================== - Coverage 91.54% 91.53% -0.01% ========================================== Files 32 32 Lines 3948 3959 +11 ========================================== + Hits 3614 3624 +10 - Misses 334 335 +1 ``` | [Impacted Files](https://codecov.io/gh/simonw/datasette/pull/1211?src=pr&el=tree) | Coverage Δ | | |---|---|---| | [datasette/cli.py](https://codecov.io/gh/simonw/datasette/pull/1211/diff?src=pr&el=tree#diff-ZGF0YXNldHRlL2NsaS5weQ==) | `77.29% <66.66%> (-0.31%)` | :arrow_down: | | [datasette/app.py](https://codecov.io/gh/simonw/datasette/pull/1211/diff?src=pr&el=tree#diff-ZGF0YXNldHRlL2FwcC5weQ==) | `95.62% <100.00%> (+<0.01%)` | :arrow_up: | | [datasette/publish/cloudrun.py](https://codecov.io/gh/simonw/datasette/pull/1211/diff?src=pr&el=tree#diff-ZGF0YXNldHRlL3B1Ymxpc2gvY2xvdWRydW4ucHk=) | `96.96% <100.00%> (+0.09%)` | :arrow_up: | | [datasette/publish/heroku.py](https://codecov.io/gh/simonw/datasette/pull/1211/diff?src=pr&el=tree#diff-ZGF0YXNldHRlL3B1Ymxpc2gvaGVyb2t1LnB5) | `87.73% <100.00%> (+0.60%)` | :arrow_up: | | [datasette/utils/\_\_init\_\_.py](https://codecov.io/gh/simonw/datasette/pull/1211/diff?src=pr&el=tree#diff-ZGF0YXNldHRlL3V0aWxzL19faW5pdF9fLnB5) | `94.13% <100.00%> (+0.02%)` | :arrow_up: | ------ [Continue to review full report at Codecov](https://codecov.io/gh/simonw/datasette/pull/1211?src=pr&el=continue). > **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta) > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data` > Powered by [Codecov](https://codecov.io/gh/simonw/datasette/pull/1211?src=pr&el=footer). Last update [dde3c50...e33ccaa](https://codecov.io/gh/simonw/datasette/pull/1211?src=pr&el=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments). ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",797649915,Use context manager instead of plain open, https://github.com/dogsheep/github-to-sqlite/issues/51#issuecomment-770150526,https://api.github.com/repos/dogsheep/github-to-sqlite/issues/51,770150526,MDEyOklzc3VlQ29tbWVudDc3MDE1MDUyNg==,22578954,daniel-butler,2021-01-30T03:44:19Z,2021-01-30T03:47:24Z,CONTRIBUTOR,I don't have much experience with github's rate limiting. 
In my day job we use the [tenacity library](https://github.com/jd/tenacity) to handle http errors we get.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",703246031,github-to-sqlite should handle rate limits better, https://github.com/dogsheep/github-to-sqlite/issues/60#issuecomment-770112248,https://api.github.com/repos/dogsheep/github-to-sqlite/issues/60,770112248,MDEyOklzc3VlQ29tbWVudDc3MDExMjI0OA==,22578954,daniel-butler,2021-01-30T00:01:03Z,2021-01-30T01:14:42Z,CONTRIBUTOR,"Yes that would be cool! I wouldn't mind helping. Is this the meat of it? https://github.com/dogsheep/twitter-to-sqlite/blob/21fc1cad6dd6348c67acff90a785b458d3a81275/twitter_to_sqlite/utils.py#L512 It looks like the cli option is added with this decorator : https://github.com/dogsheep/twitter-to-sqlite/blob/21fc1cad6dd6348c67acff90a785b458d3a81275/twitter_to_sqlite/cli.py#L14 I looked a bit at utils.py in the GitHub repository. I was surprised at the amount of manual mapping of the API response you had to do to get this to work.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",797097140,Use Data from SQLite in other commands, https://github.com/dogsheep/github-to-sqlite/issues/60#issuecomment-770071568,https://api.github.com/repos/dogsheep/github-to-sqlite/issues/60,770071568,MDEyOklzc3VlQ29tbWVudDc3MDA3MTU2OA==,9599,simonw,2021-01-29T21:56:15Z,2021-01-29T21:56:15Z,MEMBER,"I really like the way you're using pipes here - really smart. It's similar to how I build the demo database in this GitHub Actions workflow: https://github.com/dogsheep/github-to-sqlite/blob/62dfd3bc4014b108200001ef4bc746feb6f33b45/.github/workflows/deploy-demo.yml#L52-L82 `twitter-to-sqlite` actually has a mechanism for doing this kind of thing, documented at https://github.com/dogsheep/twitter-to-sqlite#providing-input-from-a-sql-query-with---sql-and---attach It lets you do things like: ``` $ twitter-to-sqlite users-lookup my.db --sql=""select follower_id from following"" --ids ``` Maybe I should add something similar to `github-to-sqlite`? Feels like it could be really useful.","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",797097140,Use Data from SQLite in other commands, https://github.com/dogsheep/twitter-to-sqlite/issues/56#issuecomment-769973212,https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/56,769973212,MDEyOklzc3VlQ29tbWVudDc2OTk3MzIxMg==,42315895,gsajko,2021-01-29T18:29:02Z,2021-01-29T18:31:55Z,NONE,"I think it was with `twitter-to-sqlite home-timeline home.db -a auth.json --since` and Im using only this command to grab tweets from cron tab `2,7,12,17,22,27,32,37,42,47,52,57 * * * * run-one /home/gsajko/miniconda3/bin/twitter-to-sqlite home-timeline /home/gsajko/work/custom_twitter_feed/home.db -a /home/gsajko/work/custom_twitter_feed/auth/auth.json --since` ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",796736607,Not all quoted statuses get fetched?, https://github.com/dogsheep/twitter-to-sqlite/issues/56#issuecomment-769957751,https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/56,769957751,MDEyOklzc3VlQ29tbWVudDc2OTk1Nzc1MQ==,9599,simonw,2021-01-29T17:59:40Z,2021-01-29T17:59:40Z,MEMBER,"This is interesting - how did you create that initial table? 
Was this using the `twitter-to-sqlite import archive.db ~/Downloads/twitter-2019-06-25-b31f2.zip` command, or something else?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",796736607,Not all quoted statuses get fetched?, https://github.com/simonw/datasette/issues/1207#issuecomment-769534187,https://api.github.com/repos/simonw/datasette/issues/1207,769534187,MDEyOklzc3VlQ29tbWVudDc2OTUzNDE4Nw==,9599,simonw,2021-01-29T02:37:19Z,2021-01-29T02:37:19Z,OWNER,https://docs.datasette.io/en/latest/testing_plugins.html#using-pdb-for-errors-thrown-inside-datasette,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",793881756,"Document the Datasette(..., pdb=True) testing pattern", https://github.com/simonw/datasette/issues/1209#issuecomment-769455370,https://api.github.com/repos/simonw/datasette/issues/1209,769455370,MDEyOklzc3VlQ29tbWVudDc2OTQ1NTM3MA==,9599,simonw,2021-01-28T23:00:21Z,2021-01-28T23:00:21Z,OWNER,"Good catch on the workaround here. The root problem is that `datasette-template-sql` looks for the first available databsae if you don't provide it with a `database=` argument, and in Datasette 0.54 the first available database changed to being the new `_internal` database. Is this a bug? I think it is - because the documented behaviour on https://docs.datasette.io/en/stable/internals.html#get-database-name is this: > `name` - string, optional > > The name to be used for this database - this will be used in the URL path, e.g. `/dbname`. If not specified Datasette will pick one based on the filename or memory name. Since the new behaviour differs from what was in the documentation I'm going to treat this as a bug and fix it.","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",795367402,v0.54 500 error from sql query in custom template; code worked in v0.53; found a workaround, https://github.com/simonw/datasette/issues/1205#issuecomment-769453074,https://api.github.com/repos/simonw/datasette/issues/1205,769453074,MDEyOklzc3VlQ29tbWVudDc2OTQ1MzA3NA==,9599,simonw,2021-01-28T22:54:49Z,2021-01-28T22:55:02Z,OWNER," I also checked that the following works: echo '{""foo"": ""bar""}' | sqlite-utils insert _memory.db demo - datasette _memory.db --memory Sure enough, it results in the following Datasette homepage - thanks to #509 <img width=""274"" alt=""Datasette___memory___memory_2"" src=""https://user-images.githubusercontent.com/9599/106208790-c8564980-6178-11eb-8b8b-053a9f1d0193.png"">","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",793027837,Rename /:memory: to /_memory, https://github.com/simonw/datasette/issues/1205#issuecomment-769452084,https://api.github.com/repos/simonw/datasette/issues/1205,769452084,MDEyOklzc3VlQ29tbWVudDc2OTQ1MjA4NA==,9599,simonw,2021-01-28T22:52:23Z,2021-01-28T22:52:23Z,OWNER,Here are the redirect tests: https://github.com/simonw/datasette/blob/1600d2a3ec3ada1f6fb5b1eb73bdaeccb5f80530/tests/test_api.py#L635-L648,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",793027837,Rename /:memory: to /_memory, 
https://github.com/simonw/datasette/issues/1205#issuecomment-769442165,https://api.github.com/repos/simonw/datasette/issues/1205,769442165,MDEyOklzc3VlQ29tbWVudDc2OTQ0MjE2NQ==,9599,simonw,2021-01-28T22:30:16Z,2021-01-28T22:30:27Z,OWNER,"I'm going to do this, with redirects from `/:memory:*`.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",793027837,Rename /:memory: to /_memory, https://github.com/simonw/datasette/issues/1210#issuecomment-769274591,https://api.github.com/repos/simonw/datasette/issues/1210,769274591,MDEyOklzc3VlQ29tbWVudDc2OTI3NDU5MQ==,9599,simonw,2021-01-28T18:10:02Z,2021-01-28T18:10:02Z,OWNER,That definitely sounds like a bug! Can you provide a copy of your `metadata.JSON` and the command-line you are using to launch Datasette?,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",796234313,Immutable Database w/ Canned Queries, https://github.com/dogsheep/twitter-to-sqlite/issues/54#issuecomment-767888743,https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/54,767888743,MDEyOklzc3VlQ29tbWVudDc2Nzg4ODc0Mw==,19328961,henry501,2021-01-26T23:07:41Z,2021-01-26T23:07:41Z,NONE,"My import got much further with the applied fixes than 0.21.3, but not 100%. I do appear to have all of the tweets imported at least. Not sure when I'll have a chance to look further to try to fix or see what didn't make it into the import. Here's my output: ``` direct-messages-group: not yet implemented branch-links: not yet implemented periscope-expired-broadcasts: not yet implemented direct-messages: not yet implemented mute: not yet implemented periscope-comments-made-by-user: not yet implemented periscope-ban-information: not yet implemented periscope-profile-description: not yet implemented screen-name-change: not yet implemented manifest: not yet implemented fleet: not yet implemented user-link-clicks: not yet implemented periscope-broadcast-metadata: not yet implemented contact: not yet implemented fleet-mute: not yet implemented device-token: not yet implemented protected-history: not yet implemented direct-message-mute: not yet implemented Traceback (most recent call last): File ""/Users/henry/.local/share/virtualenvs/python-sqlite-testing-mF3G2xKl/bin/twitter-to-sqlite"", line 33, in <module> sys.exit(load_entry_point('twitter-to-sqlite==0.21.3', 'console_scripts', 'twitter-to-sqlite')()) File ""/Users/henry/.local/share/virtualenvs/python-sqlite-testing-mF3G2xKl/lib/python3.9/site-packages/click/core.py"", line 829, in __call__ return self.main(*args, **kwargs) File ""/Users/henry/.local/share/virtualenvs/python-sqlite-testing-mF3G2xKl/lib/python3.9/site-packages/click/core.py"", line 782, in main rv = self.invoke(ctx) File ""/Users/henry/.local/share/virtualenvs/python-sqlite-testing-mF3G2xKl/lib/python3.9/site-packages/click/core.py"", line 1259, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/Users/henry/.local/share/virtualenvs/python-sqlite-testing-mF3G2xKl/lib/python3.9/site-packages/click/core.py"", line 1066, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/Users/henry/.local/share/virtualenvs/python-sqlite-testing-mF3G2xKl/lib/python3.9/site-packages/click/core.py"", line 610, in invoke return callback(*args, **kwargs) File ""/Users/henry/.local/share/virtualenvs/python-sqlite-testing-mF3G2xKl/lib/python3.9/site-packages/twitter_to_sqlite/cli.py"", line 772, in import_ 
archive.import_from_file(db, filepath.name, open(filepath, ""rb"").read()) File ""/Users/henry/.local/share/virtualenvs/python-sqlite-testing-mF3G2xKl/lib/python3.9/site-packages/twitter_to_sqlite/archive.py"", line 233, in import_from_file to_insert = transformer(data) File ""/Users/henry/.local/share/virtualenvs/python-sqlite-testing-mF3G2xKl/lib/python3.9/site-packages/twitter_to_sqlite/archive.py"", line 21, in callback return {filename: [fn(item) for item in data]} File ""/Users/henry/.local/share/virtualenvs/python-sqlite-testing-mF3G2xKl/lib/python3.9/site-packages/twitter_to_sqlite/archive.py"", line 21, in <listcomp> return {filename: [fn(item) for item in data]} File ""/Users/henry/.local/share/virtualenvs/python-sqlite-testing-mF3G2xKl/lib/python3.9/site-packages/twitter_to_sqlite/archive.py"", line 88, in ageinfo return item[""ageMeta""][""ageInfo""] KeyError: 'ageInfo' ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",779088071,Archive import appears to be broken on recent exports, https://github.com/simonw/datasette/issues/1208#issuecomment-767823684,https://api.github.com/repos/simonw/datasette/issues/1208,767823684,MDEyOklzc3VlQ29tbWVudDc2NzgyMzY4NA==,9599,simonw,2021-01-26T20:58:51Z,2021-01-26T20:58:51Z,OWNER,"This is a good catch - I've been lazy about this, but you're right that it's an issue that needs cleaning up. Would be very happy to apply a PR, thanks!","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",794554881,A lot of open(file) functions are used without a context manager thus producing ResourceWarning: unclosed file <_io.TextIOWrapper, https://github.com/simonw/datasette/issues/1151#issuecomment-767762551,https://api.github.com/repos/simonw/datasette/issues/1151,767762551,MDEyOklzc3VlQ29tbWVudDc2Nzc2MjU1MQ==,9599,simonw,2021-01-26T19:07:44Z,2021-01-26T19:07:44Z,OWNER,Mentioned in https://simonwillison.net/2021/Jan/25/datasette/,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",770448622,Database class mechanism for cross-connection in-memory databases, https://github.com/simonw/datasette/issues/991#issuecomment-767761155,https://api.github.com/repos/simonw/datasette/issues/991,767761155,MDEyOklzc3VlQ29tbWVudDc2Nzc2MTE1NQ==,9599,simonw,2021-01-26T19:05:21Z,2021-01-26T19:06:36Z,OWNER,"Idea: implement this using the existing table view, with a custom template called `table-internal-bb0ec0-tables.html` - that's the custom template listed in the HTML comments at the bottom of https://latest.datasette.io/_internal/tables","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",714377268,Redesign application homepage, https://github.com/simonw/datasette/issues/1201#issuecomment-766991680,https://api.github.com/repos/simonw/datasette/issues/1201,766991680,MDEyOklzc3VlQ29tbWVudDc2Njk5MTY4MA==,9599,simonw,2021-01-25T17:42:21Z,2021-01-25T17:42:21Z,OWNER,https://docs.datasette.io/en/stable/changelog.html#v0-54,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",792904595,Release notes for Datasette 0.54, 
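To illustrate the context-manager clean-up discussed in the `ResourceWarning` comment above, here is a generic before/after sketch; the `read_metadata` helper is hypothetical and not code taken from Datasette itself.

```python
import json


# Before: the file handle is never closed explicitly, so Python emits
# "ResourceWarning: unclosed file <_io.TextIOWrapper ...>" when warnings are enabled
def read_metadata_leaky(path):
    return json.load(open(path))


# After: the with-statement closes the handle deterministically
def read_metadata(path):
    with open(path) as fp:
        return json.load(fp)
```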
https://github.com/simonw/datasette/pull/1206#issuecomment-766589070,https://api.github.com/repos/simonw/datasette/issues/1206,766589070,MDEyOklzc3VlQ29tbWVudDc2NjU4OTA3MA==,22429695,codecov[bot],2021-01-25T06:50:30Z,2021-01-25T17:31:11Z,NONE,"# [Codecov](https://codecov.io/gh/simonw/datasette/pull/1206?src=pr&el=h1) Report > Merging [#1206](https://codecov.io/gh/simonw/datasette/pull/1206?src=pr&el=desc) (06480e1) into [main](https://codecov.io/gh/simonw/datasette/commit/a5ede3cdd455e2bb1a1fb2f4e1b5a9855caf5179?el=desc) (a5ede3c) will **not change** coverage. > The diff coverage is `100.00%`. [](https://codecov.io/gh/simonw/datasette/pull/1206?src=pr&el=tree) ```diff @@ Coverage Diff @@ ## main #1206 +/- ## ======================================= Coverage 91.53% 91.53% ======================================= Files 32 32 Lines 3947 3947 ======================================= Hits 3613 3613 Misses 334 334 ``` | [Impacted Files](https://codecov.io/gh/simonw/datasette/pull/1206?src=pr&el=tree) | Coverage Δ | | |---|---|---| | [datasette/version.py](https://codecov.io/gh/simonw/datasette/pull/1206/diff?src=pr&el=tree#diff-ZGF0YXNldHRlL3ZlcnNpb24ucHk=) | `100.00% <100.00%> (ø)` | | ------ [Continue to review full report at Codecov](https://codecov.io/gh/simonw/datasette/pull/1206?src=pr&el=continue). > **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta) > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data` > Powered by [Codecov](https://codecov.io/gh/simonw/datasette/pull/1206?src=pr&el=footer). Last update [a5ede3c...571476d](https://codecov.io/gh/simonw/datasette/pull/1206?src=pr&el=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments). ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",793086333,Release 0.54, https://github.com/simonw/datasette/pull/1206#issuecomment-766588371,https://api.github.com/repos/simonw/datasette/issues/1206,766588371,MDEyOklzc3VlQ29tbWVudDc2NjU4ODM3MQ==,9599,simonw,2021-01-25T06:49:06Z,2021-01-25T06:49:06Z,OWNER,"Last thing to do: write up the annotated version of these release notes, assign it a URL on my blog and link to it from the release notes here so I can publish them simultaneously.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",793086333,Release 0.54, https://github.com/simonw/datasette/pull/1206#issuecomment-766588020,https://api.github.com/repos/simonw/datasette/issues/1206,766588020,MDEyOklzc3VlQ29tbWVudDc2NjU4ODAyMA==,9599,simonw,2021-01-25T06:48:20Z,2021-01-25T06:48:20Z,OWNER,"Issues to reference in the commit message: #509, #1091, #1150, #1151, #1166, #1167, #1178, #1181, #1182, #1184, #1185, #1186, #1187, #1194, #1198","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",793086333,Release 0.54, https://github.com/simonw/datasette/issues/1201#issuecomment-766586151,https://api.github.com/repos/simonw/datasette/issues/1201,766586151,MDEyOklzc3VlQ29tbWVudDc2NjU4NjE1MQ==,9599,simonw,2021-01-25T06:44:43Z,2021-01-25T06:44:43Z,OWNER,"OK, release notes are ready to merge from that branch. 
I'll ship the release in the morning, to give me time to write the accompanying annotated release notes.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",792904595,Release notes for Datasette 0.54, https://github.com/simonw/datasette/issues/1201#issuecomment-766545604,https://api.github.com/repos/simonw/datasette/issues/1201,766545604,MDEyOklzc3VlQ29tbWVudDc2NjU0NTYwNA==,9599,simonw,2021-01-25T05:14:31Z,2021-01-25T05:14:31Z,OWNER,"The two big ticket items are `<script type=""module"">` support and the new `_internal` mechanism.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",792904595,Release notes for Datasette 0.54, https://github.com/simonw/datasette/issues/1201#issuecomment-766545442,https://api.github.com/repos/simonw/datasette/issues/1201,766545442,MDEyOklzc3VlQ29tbWVudDc2NjU0NTQ0Mg==,9599,simonw,2021-01-25T05:13:59Z,2021-01-25T05:13:59Z,OWNER,"The big stuff: - Database(memory_name=) for shared in-memory databases, closes #1151 - The `_internal` database - #1150 - script type=module support, closes #1186 , #1187 - Improved design for the `.add_database()` method 8919f99c2f7f245aca7f94bd53d5ac9d04aa42b5 - which means databases with the same stem can now be opened, #509 - Adopted Prettier #1166 Smaller: - force_https_urls on for publish cloudrun, refs #1178 - Fixed bug in example nginx config, refs #1091 - Shrunk ecosystem docs in favour of datasette.io, closes #1182 - request.full_path property, closes #1184 - Better PRAGMA error message, closes #1185 - publish heroku now uses python-3.8.7 - Plugin testing documentation on using pytest-httpx Closes #1198 - Contributing docs for Black and Prettier, closes #1167 - All ?_ parameters now copied to hidden form fields, closes #1194 - Fixed bug loading database called 'test-database (1).sqlite' - Closes #1181. ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",792904595,Release notes for Datasette 0.54, https://github.com/simonw/datasette/issues/1201#issuecomment-766543387,https://api.github.com/repos/simonw/datasette/issues/1201,766543387,MDEyOklzc3VlQ29tbWVudDc2NjU0MzM4Nw==,9599,simonw,2021-01-25T05:07:40Z,2021-01-25T05:13:29Z,OWNER,Changes: https://github.com/simonw/datasette/compare/0.53...a5ede3cdd455e2bb1a1fb2f4e1b5a9855caf5179,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",792904595,Release notes for Datasette 0.54, https://github.com/simonw/datasette/issues/983#issuecomment-766536076,https://api.github.com/repos/simonw/datasette/issues/983,766536076,MDEyOklzc3VlQ29tbWVudDc2NjUzNjA3Ng==,9599,simonw,2021-01-25T04:43:53Z,2021-01-25T04:43:53Z,OWNER,"... actually not going to include this in 0.54, I need to write a couple of plugins myself using it before I even make it available in preview.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",712260429,JavaScript plugin hooks mechanism similar to pluggy, https://github.com/simonw/datasette/issues/1195#issuecomment-766535046,https://api.github.com/repos/simonw/datasette/issues/1195,766535046,MDEyOklzc3VlQ29tbWVudDc2NjUzNTA0Ng==,9599,simonw,2021-01-25T04:40:08Z,2021-01-25T04:40:08Z,OWNER,"Also: should the view name really be the same for both of these pages? 
- https://latest.datasette.io/fixtures?sql=select+*+from+facetable - https://latest.datasette.io/fixtures/neighborhood_search Where one of them is a canned query and the other is an arbitrary query?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",789336592,"view_name = ""query"" for the query page", https://github.com/simonw/datasette/issues/1195#issuecomment-766534748,https://api.github.com/repos/simonw/datasette/issues/1195,766534748,MDEyOklzc3VlQ29tbWVudDc2NjUzNDc0OA==,9599,simonw,2021-01-25T04:38:56Z,2021-01-25T04:38:56Z,OWNER,"Here's a diff showing how far I got before I abandoned this particular effort: ```diff diff --git a/datasette/views/base.py b/datasette/views/base.py index a21b929..04e4aa9 100644 --- a/datasette/views/base.py +++ b/datasette/views/base.py @@ -120,7 +120,7 @@ class BaseView: handler = getattr(self, request.method.lower(), None) return await handler(request, *args, **kwargs) - async def render(self, templates, request, context=None): + async def render(self, templates, request, context=None, view_name=None): context = context or {} template = self.ds.jinja_env.select_template(templates) template_context = { @@ -135,7 +135,7 @@ class BaseView: } return Response.html( await self.ds.render_template( - template, template_context, request=request, view_name=self.name + template, template_context, request=request, view_name=view_name ) ) @@ -155,7 +155,7 @@ class BaseView: class DataView(BaseView): - name = """" + view_name = ""no-view-name"" re_named_parameter = re.compile("":([a-zA-Z0-9_]+)"") async def options(self, request, *args, **kwargs): @@ -414,6 +414,10 @@ class DataView(BaseView): args[""table""] = urllib.parse.unquote_plus(args[""table""]) return _format, args + async def get_view_name(self, request, database, hash, **kwargs): + # Defaults to self.view_name, but can be over-ridden by subclasses + return self.view_name + async def view_get(self, request, database, hash, correct_hash_provided, **kwargs): _format, kwargs = await self.get_format(request, database, kwargs) @@ -424,6 +428,8 @@ class DataView(BaseView): # HTML views default to expanding all foreign key labels kwargs[""default_labels""] = True + view_name = await self.get_view_name(request, database, hash, **kwargs) + extra_template_data = {} start = time.perf_counter() status_code = 200 @@ -489,7 +495,7 @@ class DataView(BaseView): database=database, table=data.get(""table""), request=request, - view_name=self.name, + view_name=view_name, # These will be deprecated in Datasette 1.0: args=request.args, data=data, @@ -533,7 +539,7 @@ class DataView(BaseView): database=database, table=data.get(""table""), request=request, - view_name=self.name, + view_name=view_name, ) it_can_render = await await_me_maybe(it_can_render) if it_can_render: @@ -565,7 +571,7 @@ class DataView(BaseView): } if ""metadata"" not in context: context[""metadata""] = self.ds.metadata - r = await self.render(templates, request=request, context=context) + r = await self.render(templates, request=request, context=context, view_name=view_name) r.status = status_code ttl = request.args.get(""_ttl"", None) diff --git a/datasette/views/database.py b/datasette/views/database.py index f6fd579..e425213 100644 --- a/datasette/views/database.py +++ b/datasette/views/database.py @@ -23,7 +23,11 @@ from .base import DatasetteError, DataView class DatabaseView(DataView): - name = ""database"" + async def get_view_name(self, request, db_name, 
table_and_format): + if request.args.get(""sql""): + return ""query"" + else: + return ""database"" async def data(self, request, database, hash, default_labels=False, _size=None): await self.check_permissions( @@ -145,7 +149,7 @@ class DatabaseView(DataView): class DatabaseDownload(DataView): - name = ""database_download"" + view_name = ""database_download"" async def view_get(self, request, database, hash, correct_hash_present, **kwargs): await self.check_permissions( diff --git a/datasette/views/index.py b/datasette/views/index.py index b6b8cbe..d750e3d 100644 --- a/datasette/views/index.py +++ b/datasette/views/index.py @@ -16,7 +16,7 @@ COUNT_DB_SIZE_LIMIT = 100 * 1024 * 1024 class IndexView(BaseView): - name = ""index"" + view_name = ""index"" async def get(self, request, as_format): await self.check_permission(request, ""view-instance"") diff --git a/datasette/views/special.py b/datasette/views/special.py index 9750dd0..dbd1e00 100644 --- a/datasette/views/special.py +++ b/datasette/views/special.py @@ -6,7 +6,7 @@ import secrets class JsonDataView(BaseView): - name = ""json_data"" + view_name = ""json_data"" def __init__(self, datasette, filename, data_callback, needs_request=False): self.ds = datasette @@ -42,7 +42,7 @@ class JsonDataView(BaseView): class PatternPortfolioView(BaseView): - name = ""patterns"" + view_name = ""patterns"" async def get(self, request): await self.check_permission(request, ""view-instance"") @@ -50,7 +50,7 @@ class PatternPortfolioView(BaseView): class AuthTokenView(BaseView): - name = ""auth_token"" + view_name = ""auth_token"" async def get(self, request): token = request.args.get(""token"") or """" @@ -68,7 +68,7 @@ class AuthTokenView(BaseView): class LogoutView(BaseView): - name = ""logout"" + view_name = ""logout"" async def get(self, request): if not request.actor: @@ -87,7 +87,7 @@ class LogoutView(BaseView): class PermissionsDebugView(BaseView): - name = ""permissions_debug"" + view_name = ""permissions_debug"" async def get(self, request): await self.check_permission(request, ""view-instance"") @@ -102,7 +102,7 @@ class PermissionsDebugView(BaseView): class AllowDebugView(BaseView): - name = ""allow_debug"" + view_name = ""allow_debug"" async def get(self, request): errors = [] @@ -136,7 +136,7 @@ class AllowDebugView(BaseView): class MessagesDebugView(BaseView): - name = ""messages_debug"" + view_name = ""messages_debug"" async def get(self, request): await self.check_permission(request, ""view-instance"") diff --git a/datasette/views/table.py b/datasette/views/table.py index 0a3504b..45d298a 100644 --- a/datasette/views/table.py +++ b/datasette/views/table.py @@ -257,7 +257,16 @@ class RowTableShared(DataView): class TableView(RowTableShared): - name = ""table"" + view_name = ""table"" + + async def get_view_name(self, request, db_name, table_and_format): + canned_query = await self.ds.get_canned_query( + db_name, table_and_format, request.actor + ) + if canned_query: + return ""query"" + else: + return ""table"" async def post(self, request, db_name, table_and_format): # Handle POST to a canned query @@ -923,7 +932,7 @@ async def _sql_params_pks(db, table, pk_values): class RowView(RowTableShared): - name = ""row"" + view_name = ""row"" async def data(self, request, database, hash, table, pk_path, default_labels=False): await self.check_permissions( diff --git a/tests/test_plugins.py b/tests/test_plugins.py index 715c7c1..7ce2b1b 100644 --- a/tests/test_plugins.py +++ b/tests/test_plugins.py @@ -252,7 +252,7 @@ def 
test_plugin_config_file(app_client): }, ), ( - ""/fixtures/"", + ""/fixtures"", { ""template"": ""database.html"", ""database"": ""fixtures"", @@ -285,6 +285,38 @@ def test_plugin_config_file(app_client): ], }, ), + ( + ""/fixtures?sql=select+1+as+one"", + { + ""template"": ""query.html"", + ""database"": ""fixtures"", + ""table"": None, + ""config"": {""depth"": ""database""}, + ""view_name"": ""query"", + ""request_path"": ""/fixtures"", + ""added"": 15, + ""columns"": [ + ""one"", + ], + }, + ), + ( + ""/fixtures/neighborhood_search"", + { + ""template"": ""query.html"", + ""database"": ""fixtures"", + ""table"": None, + ""config"": {""depth"": ""database""}, + ""view_name"": ""query"", + ""request_path"": ""/fixtures/neighborhood_search"", + ""added"": 15, + ""columns"": [ + ""neighborhood"", + ""name"", + ""state"", + ], + }, + ), ], ) def test_hook_extra_body_script(app_client, path, expected_extra_body_script): ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",789336592,"view_name = ""query"" for the query page", https://github.com/simonw/datasette/issues/1195#issuecomment-766534634,https://api.github.com/repos/simonw/datasette/issues/1195,766534634,MDEyOklzc3VlQ29tbWVudDc2NjUzNDYzNA==,9599,simonw,2021-01-25T04:38:30Z,2021-01-25T04:38:30Z,OWNER,"This has proved surprisingly difficult to implement, due to the weird way the QueryView is actually called. The class itself isn't used like other view classes - instead, the `.data()` methods of both `DatabaseView` and `TableView` dispatch out to `QueryView.data()` when they need to: https://github.com/simonw/datasette/blob/07e163561592c743e4117f72102fcd350a600909/datasette/views/table.py#L259-L270 https://github.com/simonw/datasette/blob/07e163561592c743e4117f72102fcd350a600909/datasette/views/table.py#L290-L294 https://github.com/simonw/datasette/blob/07e163561592c743e4117f72102fcd350a600909/datasette/views/database.py#L39-L44 It turns out this is a bad pattern because it makes changes like this one WAY harder than they should be. I think I should clean this up as part of #878.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",789336592,"view_name = ""query"" for the query page", https://github.com/simonw/datasette/pull/1204#issuecomment-766525337,https://api.github.com/repos/simonw/datasette/issues/1204,766525337,MDEyOklzc3VlQ29tbWVudDc2NjUyNTMzNw==,9599,simonw,2021-01-25T04:04:28Z,2021-01-25T04:04:28Z,OWNER,"Writing the tests will be a bit tricky since we need to confirm that the `include_table_top(datasette, database, actor, table)` arguments were all passed correctly but the only thing we get back from the plugin is a list of templates. 
Maybe encode those values into the template names somehow?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",793002853,WIP: Plugin includes, https://github.com/simonw/datasette/issues/987#issuecomment-766524141,https://api.github.com/repos/simonw/datasette/issues/987,766524141,MDEyOklzc3VlQ29tbWVudDc2NjUyNDE0MQ==,9599,simonw,2021-01-25T03:59:55Z,2021-01-25T03:59:55Z,OWNER,"This is joined with #1191 now, which I've bumped from 0.54.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",712984738,Documented HTML hooks for JavaScript plugin authors, https://github.com/simonw/datasette/issues/1191#issuecomment-766524016,https://api.github.com/repos/simonw/datasette/issues/1191,766524016,MDEyOklzc3VlQ29tbWVudDc2NjUyNDAxNg==,9599,simonw,2021-01-25T03:59:17Z,2021-01-25T03:59:17Z,OWNER,More work can happen in the PR: #1204,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",787098345,Ability for plugins to collaborate when adding extra HTML to blocks in default templates, https://github.com/simonw/datasette/issues/1191#issuecomment-766523866,https://api.github.com/repos/simonw/datasette/issues/1191,766523866,MDEyOklzc3VlQ29tbWVudDc2NjUyMzg2Ng==,9599,simonw,2021-01-25T03:58:34Z,2021-01-25T03:58:34Z,OWNER,"I've got a good prototype working now, but I'm dropping this from the Datasette 0.54 milestone because it requires a bunch of additional work to make sure it is really well tested and documented.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",787098345,Ability for plugins to collaborate when adding extra HTML to blocks in default templates, https://github.com/simonw/datasette/issues/1194#issuecomment-766491911,https://api.github.com/repos/simonw/datasette/issues/1194,766491911,MDEyOklzc3VlQ29tbWVudDc2NjQ5MTkxMQ==,9599,simonw,2021-01-25T02:02:15Z,2021-01-25T02:02:15Z,OWNER,I'm going to copy across anything starting with an underscore.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",788447787,?_size= argument is not persisted by hidden form fields in the table filters, https://github.com/simonw/datasette/issues/1167#issuecomment-766491566,https://api.github.com/repos/simonw/datasette/issues/1167,766491566,MDEyOklzc3VlQ29tbWVudDc2NjQ5MTU2Ng==,9599,simonw,2021-01-25T02:01:19Z,2021-01-25T02:01:19Z,OWNER,New documentation section here (I documented Black as well): https://docs.datasette.io/en/latest/contributing.html#code-formatting,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",777145954,Add Prettier to contributing documentation, https://github.com/simonw/datasette/issues/1167#issuecomment-766487520,https://api.github.com/repos/simonw/datasette/issues/1167,766487520,MDEyOklzc3VlQ29tbWVudDc2NjQ4NzUyMA==,9599,simonw,2021-01-25T01:44:43Z,2021-01-25T01:44:43Z,OWNER,"Thanks @benpickles, I just merged that in. I'll use it in the documentation. 
# To check code is conformant npm run prettier -- --check # To fix it if it isn't npm run fix ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",777145954,Add Prettier to contributing documentation, https://github.com/simonw/datasette/issues/983#issuecomment-766484915,https://api.github.com/repos/simonw/datasette/issues/983,766484915,MDEyOklzc3VlQ29tbWVudDc2NjQ4NDkxNQ==,9599,simonw,2021-01-25T01:33:29Z,2021-01-25T01:33:29Z,OWNER,I'm going to ship a version of this in Datasette 0.54 with a warning that the interface should be considered unstable (see #1202) so that we can start trying this out.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",712260429,JavaScript plugin hooks mechanism similar to pluggy, https://github.com/simonw/datasette/issues/1179#issuecomment-766484435,https://api.github.com/repos/simonw/datasette/issues/1179,766484435,MDEyOklzc3VlQ29tbWVudDc2NjQ4NDQzNQ==,9599,simonw,2021-01-25T01:31:36Z,2021-01-25T01:31:36Z,OWNER,Relevant existing tests: https://github.com/simonw/datasette/blob/461670a0b87efa953141b449a9a261919864ceb3/tests/test_utils.py#L365-L398,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",780278550,Make original path available to render hooks, https://github.com/simonw/datasette/issues/1179#issuecomment-766484257,https://api.github.com/repos/simonw/datasette/issues/1179,766484257,MDEyOklzc3VlQ29tbWVudDc2NjQ4NDI1Nw==,9599,simonw,2021-01-25T01:30:57Z,2021-01-25T01:30:57Z,OWNER,"The challenge here is figuring out what the original path, without the `.format`, actually was - while taking into account that Datasette has a special case for tables that themselves end in a `.something`. The `path_with_format()` function nearly does what we need here: https://github.com/simonw/datasette/blob/b6a7b58fa01af0cd5a5e94bd17d686d283a46819/datasette/utils/__init__.py#L710-L729 It can be called with `replace_format=""csv""` to REMOVE the `.csv` format and replace it with something else. Problem is, we want to use it to get rid of the format entirely. We could update `path_with_format()` to accept `format=''` to mean ""remove the format entirely"", but it's a bit messy. 
It may be better to reconsider the design of `path_with_format()` and related utility functions entirely.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",780278550,Make original path available to render hooks, https://github.com/simonw/datasette/issues/1191#issuecomment-766466030,https://api.github.com/repos/simonw/datasette/issues/1191,766466030,MDEyOklzc3VlQ29tbWVudDc2NjQ2NjAzMA==,9599,simonw,2021-01-25T00:11:04Z,2021-01-25T00:11:04Z,OWNER,"I can combine this with #987 - each of these areas of the page can be wrapped in a `<div>` with a class that matches the name of the plugin hook, that way JavaScript plugins can append their content in the same place as Python plugins.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",787098345,Ability for plugins to collaborate when adding extra HTML to blocks in default templates, https://github.com/simonw/datasette/issues/1154#issuecomment-766465719,https://api.github.com/repos/simonw/datasette/issues/1154,766465719,MDEyOklzc3VlQ29tbWVudDc2NjQ2NTcxOQ==,9599,simonw,2021-01-25T00:09:22Z,2021-01-25T00:09:22Z,OWNER,"https://docs.datasette.io/en/latest/internals.html#the-internal-database <img width=""738"" alt=""Internals_for_plugins_—_Datasette_documentation"" src=""https://user-images.githubusercontent.com/9599/105648087-76eb4900-5e5e-11eb-9c73-c00ba37b7b10.png""> ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",771208009,Documentation for new _internal database and tables, https://github.com/simonw/datasette/issues/1202#issuecomment-766464136,https://api.github.com/repos/simonw/datasette/issues/1202,766464136,MDEyOklzc3VlQ29tbWVudDc2NjQ2NDEzNg==,9599,simonw,2021-01-25T00:01:02Z,2021-01-25T00:01:02Z,OWNER,I'm going to use the existing `.. warning::` pattern for this.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",792931244,Documentation convention for marking unstable APIs., https://github.com/simonw/datasette/issues/1090#issuecomment-766463496,https://api.github.com/repos/simonw/datasette/issues/1090,766463496,MDEyOklzc3VlQ29tbWVudDc2NjQ2MzQ5Ng==,9599,simonw,2021-01-24T23:57:00Z,2021-01-24T23:57:00Z,OWNER,Related: I built [datasette-leaflet-freedraw](https://datasette.io/plugins/datasette-leaflet-freedraw) which turns any canned query field called `freedraw` or `something_freedraw` into an interactive map that you can draw on to create a GeoJSON MultiPolygon.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",741862364,Custom widgets for canned query forms, https://github.com/simonw/datasette/issues/1202#issuecomment-766462475,https://api.github.com/repos/simonw/datasette/issues/1202,766462475,MDEyOklzc3VlQ29tbWVudDc2NjQ2MjQ3NQ==,9599,simonw,2021-01-24T23:49:28Z,2021-01-24T23:50:33Z,OWNER,"Can use an ""admonition"" similar to this: ```sphinx .. warning:: Restricting access to tables and views in this way will NOT prevent users from querying them using arbitrary SQL queries, `like this <https://latest.datasette.io/fixtures?sql=select+*+from+facetable>`__ for example. 
``` As seen on https://docs.datasette.io/en/stable/authentication.html#controlling-access-to-specific-tables-and-views Documentation: https://docutils.sourceforge.io/docs/ref/rst/directives.html#specific-admonitions","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",792931244,Documentation convention for marking unstable APIs., https://github.com/simonw/datasette/issues/1154#issuecomment-766462197,https://api.github.com/repos/simonw/datasette/issues/1154,766462197,MDEyOklzc3VlQ29tbWVudDc2NjQ2MjE5Nw==,9599,simonw,2021-01-24T23:47:06Z,2021-01-24T23:47:06Z,OWNER,"I'm going to document this but mark it as unstable, using a new documentation convention for marking unstable APIs.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",771208009,Documentation for new _internal database and tables, https://github.com/simonw/datasette/issues/1179#issuecomment-766434629,https://api.github.com/repos/simonw/datasette/issues/1179,766434629,MDEyOklzc3VlQ29tbWVudDc2NjQzNDYyOQ==,9599,simonw,2021-01-24T21:23:47Z,2021-01-24T21:23:47Z,OWNER,I'm just going to do `path` and `full_path` (which includes the querystring)`. The `datasette.absolute_url()` method can be used by plugins that need the full URL.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",780278550,Make original path available to render hooks, https://github.com/simonw/datasette/issues/1190#issuecomment-766433153,https://api.github.com/repos/simonw/datasette/issues/1190,766433153,MDEyOklzc3VlQ29tbWVudDc2NjQzMzE1Mw==,9599,simonw,2021-01-24T21:13:25Z,2021-01-24T21:13:25Z,OWNER,"This ties in to a bunch of other ideas that are in flight at the moment. If you're publishing databases by uploading them, how do you attach metadata? Ideally by baking it into the database file itself, using the mechanism from #1169. How could this interact with the `datasette insert` concept from #1163? Could you pass a CSV file to the `upload` command and have that converted and uploaded for you, or would you create the database file locally using `datasette insert` and then upload it as a separate `datasette upload` step? Lots to think about here.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",787098146,`datasette publish upload` mechanism for uploading databases to an existing Datasette instance, https://github.com/simonw/datasette/issues/1190#issuecomment-766430644,https://api.github.com/repos/simonw/datasette/issues/1190,766430644,MDEyOklzc3VlQ29tbWVudDc2NjQzMDY0NA==,9599,simonw,2021-01-24T20:57:03Z,2021-01-24T20:57:03Z,OWNER,"I really like this idea. 
It feels like an opportunity for a plugin that adds two things: an API endpoint to Datasette for accepting uploaded databases, and a `datasette publish upload` subcommand which can upload files to that endpoint (with some kind of authentication mechanism).","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",787098146,`datasette publish upload` mechanism for uploading databases to an existing Datasette instance, https://github.com/simonw/datasette/issues/1197#issuecomment-766430111,https://api.github.com/repos/simonw/datasette/issues/1197,766430111,MDEyOklzc3VlQ29tbWVudDc2NjQzMDExMQ==,9599,simonw,2021-01-24T20:53:40Z,2021-01-24T20:53:40Z,OWNER,"https://devcenter.heroku.com/articles/slug-compiler#slug-size says that the maximum allowed size is 500MB - my hunch is that the Datasette application itself weighs in at only a dozen or so MB but I haven't measured it. So I would imagine anything up to around 450MB should work OK on Heroku. Cloud Run works for up to about 2GB in my experience.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",791381623,DB size limit for publishing with Heroku, https://github.com/simonw/datasette/issues/1198#issuecomment-766428183,https://api.github.com/repos/simonw/datasette/issues/1198,766428183,MDEyOklzc3VlQ29tbWVudDc2NjQyODE4Mw==,9599,simonw,2021-01-24T20:40:37Z,2021-01-24T20:40:37Z,OWNER,https://docs.datasette.io/en/latest/testing_plugins.html#testing-outbound-http-calls-with-pytest-httpx,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",792625812,Plugin testing documentation on using pytest-httpx, https://github.com/simonw/datasette/issues/1199#issuecomment-766181628,https://api.github.com/repos/simonw/datasette/issues/1199,766181628,MDEyOklzc3VlQ29tbWVudDc2NjE4MTYyOA==,9599,simonw,2021-01-23T21:25:18Z,2021-01-23T21:25:18Z,OWNER,"Comment thread here: https://news.ycombinator.com/item?id=25881911 - cperciva says: > There's an even better reason for databases to not write to memory mapped pages: Pages get synched out to disk at the kernel's leisure. This can be ok for a cache but it's definitely not what you want for a database! But... Datasette is often used in read-only mode, so that disadvantage often doesn't apply.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",792652391,Experiment with PRAGMA mmap_size=N, https://github.com/simonw/sqlite-utils/pull/224#issuecomment-765678057,https://api.github.com/repos/simonw/sqlite-utils/issues/224,765678057,MDEyOklzc3VlQ29tbWVudDc2NTY3ODA1Nw==,37962604,polyrand,2021-01-22T20:53:06Z,2021-01-23T20:13:27Z,NONE,"I'm using the FTS methods in sqlite-utils for this website: [drwn.io](https://drwn.io/). 
I wanted to get pagination to have some kind of infinite scrolling in the landing page, and I ended up using that.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",792297010,Add fts offset docs., https://github.com/simonw/datasette/issues/1191#issuecomment-765757433,https://api.github.com/repos/simonw/datasette/issues/1191,765757433,MDEyOklzc3VlQ29tbWVudDc2NTc1NzQzMw==,9599,simonw,2021-01-22T23:43:43Z,2021-01-22T23:43:43Z,OWNER,"Another potential use for this: plugins that provide authentication (like `datasette-auth-passwords` and `datasette-auth-github`) could use it to add a chunk of HTML to the ""permission denied"" page that links to their mechanism of authenticating.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",787098345,Ability for plugins to collaborate when adding extra HTML to blocks in default templates, https://github.com/simonw/datasette/issues/1196#issuecomment-765639968,https://api.github.com/repos/simonw/datasette/issues/1196,765639968,MDEyOklzc3VlQ29tbWVudDc2NTYzOTk2OA==,2826376,QAInsights,2021-01-22T19:37:15Z,2021-01-22T19:37:15Z,NONE,I tried deployment in WSL. It is working fine https://jmeter.vercel.app/,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",791237799,Access Denied Error in Windows, https://github.com/dogsheep/genome-to-sqlite/issues/1#issuecomment-765525338,https://api.github.com/repos/dogsheep/genome-to-sqlite/issues/1,765525338,MDEyOklzc3VlQ29tbWVudDc2NTUyNTMzOA==,25372415,cobiadigital,2021-01-22T16:22:44Z,2021-01-22T16:22:44Z,NONE,"rs1333049 associated with coronary artery disease https://www.snpedia.com/index.php/Rs1333049 ``` select rsid, genotype, case genotype when 'CC' then '1.9x increased risk for coronary artery disease' when 'CG' then '1.5x increased risk for CAD' when 'GG' then 'normal' end as interpretation from genome where rsid = 'rs1333049' ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",496415321,Figure out some interesting example SQL queries, https://github.com/dogsheep/genome-to-sqlite/issues/1#issuecomment-765523517,https://api.github.com/repos/dogsheep/genome-to-sqlite/issues/1,765523517,MDEyOklzc3VlQ29tbWVudDc2NTUyMzUxNw==,25372415,cobiadigital,2021-01-22T16:20:25Z,2021-01-22T16:20:25Z,NONE,"rs53576: the oxytocin receptor (OXTR) gene ``` select rsid, genotype, case genotype when 'AA' then 'Lack of empathy?' when 'AG' then 'Lack of empathy?' 
when 'GG' then 'Optimistic and empathetic; handle stress well' end as interpretation from genome where rsid = 'rs53576' ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",496415321,Figure out some interesting example SQL queries, https://github.com/dogsheep/genome-to-sqlite/issues/1#issuecomment-765506901,https://api.github.com/repos/dogsheep/genome-to-sqlite/issues/1,765506901,MDEyOklzc3VlQ29tbWVudDc2NTUwNjkwMQ==,25372415,cobiadigital,2021-01-22T15:58:41Z,2021-01-22T15:58:58Z,NONE,"Both rs10757274 and rs2383206 can both indicate higher risks of heart disease https://www.snpedia.com/index.php/Rs2383206 ``` select rsid, genotype, case genotype when 'AA' then 'Normal' when 'AG' then '~1.2x increased risk for heart disease' when 'GG' then '~1.3x increased risk for heart disease' end as interpretation from genome where rsid = 'rs10757274' ``` ``` select rsid, genotype, case genotype when 'AA' then 'Normal' when 'AG' then '1.4x increased risk for heart disease' when 'GG' then '1.7x increased risk for heart disease' end as interpretation from genome where rsid = 'rs2383206' ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",496415321,Figure out some interesting example SQL queries, https://github.com/dogsheep/genome-to-sqlite/issues/1#issuecomment-765502845,https://api.github.com/repos/dogsheep/genome-to-sqlite/issues/1,765502845,MDEyOklzc3VlQ29tbWVudDc2NTUwMjg0NQ==,25372415,cobiadigital,2021-01-22T15:53:19Z,2021-01-22T15:53:19Z,NONE,"rs7903146 Influences risk of Type-2 diabetes https://www.snpedia.com/index.php/Rs7903146 ``` select rsid, genotype, case genotype when 'CC' then 'Normal (lower) risk of Type 2 Diabetes and Gestational Diabetes.' when 'CT' then '1.4x increased risk for diabetes (and perhaps colon cancer).' 
when 'TT' then '2x increased risk for Type-2 diabetes' end as interpretation from genome where rsid = 'rs7903146' ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",496415321,Figure out some interesting example SQL queries, https://github.com/dogsheep/genome-to-sqlite/issues/1#issuecomment-765498984,https://api.github.com/repos/dogsheep/genome-to-sqlite/issues/1,765498984,MDEyOklzc3VlQ29tbWVudDc2NTQ5ODk4NA==,25372415,cobiadigital,2021-01-22T15:48:25Z,2021-01-22T15:49:33Z,NONE,"The ""Warrior Gene"" https://www.snpedia.com/index.php/Rs4680 ``` select rsid, genotype, case genotype when 'AA' then '(worrier) advantage in memory and attention tasks' when 'AG' then 'Intermediate dopamine levels, other effects' when 'GG' then '(warrior) multiple associations, see details' end as interpretation from genome where rsid = 'rs4680' ``` ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",496415321,Figure out some interesting example SQL queries, https://github.com/dogsheep/genome-to-sqlite/issues/1#issuecomment-765495861,https://api.github.com/repos/dogsheep/genome-to-sqlite/issues/1,765495861,MDEyOklzc3VlQ29tbWVudDc2NTQ5NTg2MQ==,25372415,cobiadigital,2021-01-22T15:44:00Z,2021-01-22T15:44:00Z,NONE,"Risk of autoimmune disorders: https://www.snpedia.com/index.php/Genotype ``` select rsid, genotype, case genotype when 'AA' then '2x risk of rheumatoid arthritis and other autoimmune diseases' when 'GG' then 'Normal risk for autoimmune disorders' end as interpretation from genome where rsid = 'rs2476601' ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",496415321,Figure out some interesting example SQL queries, https://github.com/simonw/datasette/issues/1195#issuecomment-763108730,https://api.github.com/repos/simonw/datasette/issues/1195,763108730,MDEyOklzc3VlQ29tbWVudDc2MzEwODczMA==,9599,simonw,2021-01-19T20:22:37Z,2021-01-19T20:22:37Z,OWNER,I can use this test: https://github.com/simonw/datasette/blob/c38c42948cbfddd587729413fd6082ba352eaece/tests/test_plugins.py#L238-L294,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",789336592,"view_name = ""query"" for the query page", https://github.com/simonw/sqlite-utils/issues/223#issuecomment-762540514,https://api.github.com/repos/simonw/sqlite-utils/issues/223,762540514,MDEyOklzc3VlQ29tbWVudDc2MjU0MDUxNA==,9599,simonw,2021-01-19T01:14:58Z,2021-01-19T01:15:54Z,OWNER,"Example from https://docs.python.org/3/library/csv.html#csv.reader ```pycon >>> import csv >>> with open('eggs.csv', newline='') as csvfile: ... spamreader = csv.reader(csvfile, delimiter=' ', quotechar='|') ... for row in spamreader: ... 
print(', '.join(row)) Spam, Spam, Spam, Spam, Spam, Baked Beans Spam, Lovely Spam, Wonderful Spam ``` I'm going to add `--quotechar` as well as `--delimiter`.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",788527932,--delimiter option for CSV import, https://github.com/simonw/datasette/issues/1175#issuecomment-762488336,https://api.github.com/repos/simonw/datasette/issues/1175,762488336,MDEyOklzc3VlQ29tbWVudDc2MjQ4ODMzNg==,758858,hannseman,2021-01-18T21:59:28Z,2021-01-18T22:00:31Z,NONE,"I encountered your issue when trying to find a solution and came up with the following, maybe it can help. ```python import logging.config from typing import Tuple import structlog import uvicorn from example.config import settings shared_processors: Tuple[structlog.types.Processor, ...] = ( structlog.contextvars.merge_contextvars, structlog.stdlib.add_logger_name, structlog.stdlib.add_log_level, structlog.processors.TimeStamper(fmt=""iso""), ) logging_config = { ""version"": 1, ""disable_existing_loggers"": False, ""formatters"": { ""json"": { ""()"": structlog.stdlib.ProcessorFormatter, ""processor"": structlog.processors.JSONRenderer(), ""foreign_pre_chain"": shared_processors, }, ""console"": { ""()"": structlog.stdlib.ProcessorFormatter, ""processor"": structlog.dev.ConsoleRenderer(), ""foreign_pre_chain"": shared_processors, }, **uvicorn.config.LOGGING_CONFIG[""formatters""], }, ""handlers"": { ""default"": { ""level"": ""DEBUG"", ""class"": ""logging.StreamHandler"", ""formatter"": ""json"" if not settings.debug else ""console"", }, ""uvicorn.access"": { ""level"": ""INFO"", ""class"": ""logging.StreamHandler"", ""formatter"": ""access"", }, ""uvicorn.default"": { ""level"": ""INFO"", ""class"": ""logging.StreamHandler"", ""formatter"": ""default"", }, }, ""loggers"": { """": {""handlers"": [""default""], ""level"": ""INFO""}, ""uvicorn.error"": { ""handlers"": [""default"" if not settings.debug else ""uvicorn.default""], ""level"": ""INFO"", ""propagate"": False, }, ""uvicorn.access"": { ""handlers"": [""default"" if not settings.debug else ""uvicorn.access""], ""level"": ""INFO"", ""propagate"": False, }, }, } def setup_logging() -> None: structlog.configure( processors=[ structlog.stdlib.filter_by_level, *shared_processors, structlog.stdlib.PositionalArgumentsFormatter(), structlog.processors.StackInfoRenderer(), structlog.processors.format_exc_info, structlog.stdlib.ProcessorFormatter.wrap_for_formatter, ], context_class=dict, logger_factory=structlog.stdlib.LoggerFactory(), wrapper_class=structlog.stdlib.AsyncBoundLogger, cache_logger_on_first_use=True, ) logging.config.dictConfig(logging_config) ``` And then it'll be run on the startup event: ```python @app.on_event(""startup"") async def startup_event() -> None: setup_logging() ``` It depends on a setting called `debug` which controls if it should output the regular uvicorn logging or json. 
","{""total_count"": 15, ""+1"": 7, ""-1"": 0, ""laugh"": 1, ""hooray"": 1, ""confused"": 0, ""heart"": 5, ""rocket"": 1, ""eyes"": 0}",779156520,Use structlog for logging, https://github.com/simonw/datasette/issues/1036#issuecomment-762391426,https://api.github.com/repos/simonw/datasette/issues/1036,762391426,MDEyOklzc3VlQ29tbWVudDc2MjM5MTQyNg==,4997607,philshem,2021-01-18T17:45:00Z,2021-01-18T17:45:00Z,NONE,"It might be possible with this library: https://docs.python.org/3/library/imghdr.html quick test of the downloaded blob: ``` >>> import imghdr >>> imghdr.what('material_culture-1-image.blob') 'jpeg' ``` The output plugin would be cool. I'll look into making my first datasette plugin. I'm also imagining displaying the image in the browser -- but that would be a step 2. ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",725996507,Make it possible to download BLOB data from the Datasette UI, https://github.com/simonw/datasette/issues/1194#issuecomment-762390568,https://api.github.com/repos/simonw/datasette/issues/1194,762390568,MDEyOklzc3VlQ29tbWVudDc2MjM5MDU2OA==,9599,simonw,2021-01-18T17:43:03Z,2021-01-18T17:43:03Z,OWNER,Should I just blanket copy over any query string argument that starts with an underscore? Any reason _not_ to do that?,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",788447787,?_size= argument is not persisted by hidden form fields in the table filters, https://github.com/simonw/datasette/issues/1194#issuecomment-762390401,https://api.github.com/repos/simonw/datasette/issues/1194,762390401,MDEyOklzc3VlQ29tbWVudDc2MjM5MDQwMQ==,9599,simonw,2021-01-18T17:42:38Z,2021-01-18T17:42:38Z,OWNER,"Relevant code: https://github.com/simonw/datasette/blob/a882d679626438ba0d809944f06f239bcba8ee96/datasette/views/table.py#L815-L827 It looks like there are other arguments that may not be persisted too.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",788447787,?_size= argument is not persisted by hidden form fields in the table filters, https://github.com/simonw/datasette/issues/1036#issuecomment-762387875,https://api.github.com/repos/simonw/datasette/issues/1036,762387875,MDEyOklzc3VlQ29tbWVudDc2MjM4Nzg3NQ==,9599,simonw,2021-01-18T17:36:36Z,2021-01-18T17:36:36Z,OWNER,"As you can see, I'm pretty paranoid about serving content with `Content-Type` HTTP headers because I'm so worried about execution vulnerabilities. I'm much more comfortable exploring that kind of thing in plugins, since that way people can opt-in to riskier features. You found `datasette-media` which is my most comprehensive exploration of that idea so far - but there's definitely lots of room for more plugins along those lines. Maybe even an output plugin? 
`.jpg` as an export format which returns the `BLOB` column for a row as a JPEG image with the correct `content-type` header (but first verifies that the binary content does indeed look like a real JPEG) could be interesting.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",725996507,Make it possible to download BLOB data from the Datasette UI, https://github.com/simonw/datasette/issues/1036#issuecomment-762385981,https://api.github.com/repos/simonw/datasette/issues/1036,762385981,MDEyOklzc3VlQ29tbWVudDc2MjM4NTk4MQ==,4997607,philshem,2021-01-18T17:32:13Z,2021-01-18T17:34:50Z,NONE,"Hi Simon Just finding this old issue regarding downloading blobs. Nice work! <img width=""212"" alt=""image"" src=""https://user-images.githubusercontent.com/4997607/104946741-df4bad80-59ba-11eb-96e3-727c85cc4dc6.png""> As a feature request, maybe it would be possible to assign a blob column as a certain data type (e.g. `.jpg`) and then each blob could be downloaded as that type of file (perhaps if the file types were constrained to normal blobs that people store in sqlite databases, this could avoid the execution stuff mentioned above). I guess the column blob-type definition could fit into this dropdown selection: <img width=""201"" alt=""image"" src=""https://user-images.githubusercontent.com/4997607/104947000-479a8f00-59bb-11eb-87d9-1644e5940894.png""> Let me know if I should open a new issue with a feature request. (This could slowly go in the direction of displaying image blob-types in the browser.) Thanks for the great tool! --- edit: just reading the rest of the twitter thread: https://twitter.com/simonw/status/1318685933256855552 perhaps this is already possible in some form with the plugin datasette-media: https://github.com/simonw/datasette-media","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",725996507,Make it possible to download BLOB data from the Datasette UI, https://github.com/dogsheep/swarm-to-sqlite/issues/11#issuecomment-761967094,https://api.github.com/repos/dogsheep/swarm-to-sqlite/issues/11,761967094,MDEyOklzc3VlQ29tbWVudDc2MTk2NzA5NA==,9599,simonw,2021-01-18T04:11:13Z,2021-01-18T04:11:13Z,MEMBER,"I just got a similar error: ``` File ""/home/dogsheep/datasette-venv/lib/python3.8/site-packages/swarm_to_sqlite/utils.py"", line 79, in save_checkin checkins_table.m2m(""users"", user, m2m_table=""with"", pk=""id"") File ""/home/dogsheep/datasette-venv/lib/python3.8/site-packages/sqlite_utils/db.py"", line 2048, in m2m id = other_table.insert(record, pk=pk, replace=True).last_pk File ""/home/dogsheep/datasette-venv/lib/python3.8/site-packages/sqlite_utils/db.py"", line 1781, in insert return self.insert_all( File ""/home/dogsheep/datasette-venv/lib/python3.8/site-packages/sqlite_utils/db.py"", line 1899, in insert_all self.insert_chunk( File ""/home/dogsheep/datasette-venv/lib/python3.8/site-packages/sqlite_utils/db.py"", line 1709, in insert_chunk result = self.db.execute(query, params) File ""/home/dogsheep/datasette-venv/lib/python3.8/site-packages/sqlite_utils/db.py"", line 226, in execute return self.conn.execute(sql, parameters) pysqlite3.dbapi2.OperationalError: table users has no column named countryCode ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",743400216,Error thrown: sqlite3.OperationalError: table users has no column named 
lastName, https://github.com/simonw/datasette/issues/1191#issuecomment-761705076,https://api.github.com/repos/simonw/datasette/issues/1191,761705076,MDEyOklzc3VlQ29tbWVudDc2MTcwNTA3Ng==,9599,simonw,2021-01-17T00:35:13Z,2021-01-17T00:37:51Z,OWNER,"I'm going to try using Jinja macros to implement this: https://jinja.palletsprojects.com/en/2.11.x/templates/#macros Maybe using one of these tricks to auto-load the macro? http://codyaray.com/2015/05/auto-load-jinja2-macros","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",787098345,Ability for plugins to collaborate when adding extra HTML to blocks in default templates, https://github.com/simonw/datasette/issues/1181#issuecomment-761703555,https://api.github.com/repos/simonw/datasette/issues/1181,761703555,MDEyOklzc3VlQ29tbWVudDc2MTcwMzU1NQ==,9599,simonw,2021-01-17T00:24:20Z,2021-01-17T00:24:40Z,OWNER,"Here's the incomplete sketch of a test - to go at the bottom of `test_cli.py`. ```python @pytest.mark.parametrize( ""filename"", [""test-database (1).sqlite"", ""database (1).sqlite""] ) def test_weird_database_names(ensure_eventloop, tmpdir, filename): # https://github.com/simonw/datasette/issues/1181 runner = CliRunner() db_path = str(tmpdir / filename) sqlite3.connect(db_path).execute(""vacuum"") result1 = runner.invoke(cli, [db_path, ""--get"", ""/""]) assert result1.exit_code == 0, result1.output homepage_html = result1.output assert False ``` ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",781262510,"Certain database names results in 404: ""Database not found: None""", https://github.com/simonw/datasette/issues/1191#issuecomment-761703368,https://api.github.com/repos/simonw/datasette/issues/1191,761703368,MDEyOklzc3VlQ29tbWVudDc2MTcwMzM2OA==,9599,simonw,2021-01-17T00:22:46Z,2021-01-17T00:22:46Z,OWNER,I'm going to prototype this in a branch.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",787098345,Ability for plugins to collaborate when adding extra HTML to blocks in default templates, https://github.com/simonw/datasette/issues/1191#issuecomment-761703232,https://api.github.com/repos/simonw/datasette/issues/1191,761703232,MDEyOklzc3VlQ29tbWVudDc2MTcwMzIzMg==,9599,simonw,2021-01-17T00:21:31Z,2021-01-17T00:21:54Z,OWNER,"I think this ends up being a whole collection of new plugin hooks, something like: - `include_table_top` - `include_table_bottom` - `include_row_top` - `include_row_bottom` - `include_database_top` - `include_database_bottom` - `include_query_bottom` - `include_query_bottom` - `include_index_bottom` - `include_index_bottom`","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",787098345,Ability for plugins to collaborate when adding extra HTML to blocks in default templates, https://github.com/simonw/datasette/issues/1191#issuecomment-761703022,https://api.github.com/repos/simonw/datasette/issues/1191,761703022,MDEyOklzc3VlQ29tbWVudDc2MTcwMzAyMg==,9599,simonw,2021-01-17T00:20:00Z,2021-01-17T00:20:00Z,OWNER,"Plugins that want to provide extra context to the template can already do so using the `extra_template_vars()` plugin hook. So this hook could work by returning a list of template filenames to be included. Those templates can be bundled with the plugin. 
Since they are included they will have access to the template context and to any `extra_template_vars()` values.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",787098345,Ability for plugins to collaborate when adding extra HTML to blocks in default templates, https://github.com/simonw/datasette/issues/657#issuecomment-761179229,https://api.github.com/repos/simonw/datasette/issues/657,761179229,MDEyOklzc3VlQ29tbWVudDc2MTE3OTIyOQ==,9599,simonw,2021-01-15T20:24:35Z,2021-01-15T20:24:35Z,OWNER,"I'm not sure how I missed this issue but it's almost a year later and I'm finally taking a look at your Parquet work. This is yet more evidence that allowing plugins to provide their own custom `Database` objects would be a good idea. I started exploring what Datasette would like on PostgreSQL in #670 - my concern was that I would need to add a large amount of database abstraction code which would dramatically increase the complexity of the core project, but my thinking now is that it might be tractable - Datasette doesn't actually construct SQL in complex ways anywhere outside of the `TableView` class so abstracting away just that bit should be feasible.","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",548591089,Allow creation of virtual tables at startup, https://github.com/simonw/datasette/issues/1191#issuecomment-761103910,https://api.github.com/repos/simonw/datasette/issues/1191,761103910,MDEyOklzc3VlQ29tbWVudDc2MTEwMzkxMA==,9599,simonw,2021-01-15T18:19:29Z,2021-01-15T18:19:29Z,OWNER,This relates to #987 (documented HTML hooks for JavaScript plugins) but is not quite the same thing.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",787098345,Ability for plugins to collaborate when adding extra HTML to blocks in default templates, https://github.com/simonw/datasette/issues/657#issuecomment-761101878,https://api.github.com/repos/simonw/datasette/issues/657,761101878,MDEyOklzc3VlQ29tbWVudDc2MTEwMTg3OA==,9599,simonw,2021-01-15T18:16:01Z,2021-01-15T18:16:01Z,OWNER,"I think the `startup()` plugin hook at https://docs.datasette.io/en/stable/plugin_hooks.html#startup-datasette should be able to fit this. You can write a plugin which uses that hook to execute `CREATE VIRTUAL TABLE` against one or more databases when Datasette first starts running. Would that work here?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",548591089,Allow creation of virtual tables at startup, https://github.com/simonw/sqlite-utils/issues/220#issuecomment-761015218,https://api.github.com/repos/simonw/sqlite-utils/issues/220,761015218,MDEyOklzc3VlQ29tbWVudDc2MTAxNTIxOA==,649467,mhalle,2021-01-15T15:40:08Z,2021-01-15T15:40:08Z,NONE,"Make sense. If you're coming from the sqlite3 side of things, rather than the datasette side, wanting the fts methods to work for views makes more sense. sqlite3 allows fts5 tables on views, so I was looking for CLI functionality to build the fts virtual tables. Ultimately, though, sharing fts virtual tables across tables and derivative views is likely more efficient. Maybe an explicit error message like, ""fts is not supported for views"" rather than just throwing an exception that the method doesn't exist"" might be helpful. Not critical though. 
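A rough sketch of the `startup()` hook approach suggested for the virtual-tables-at-startup issue above; the database choice, table name and FTS5 schema here are invented for illustration and are not part of any real plugin:

```python
from datasette import hookimpl


@hookimpl
def startup(datasette):
    # Returning an async callable lets Datasette await it during startup
    async def create_virtual_tables():
        db = datasette.get_database()  # first attached database
        await db.execute_write(
            "CREATE VIRTUAL TABLE IF NOT EXISTS search_index "
            "USING fts5(title, body)",
            block=True,
        )

    return create_virtual_tables
```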
Thanks.","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",783778672,Better error message for *_fts methods against views, https://github.com/dogsheep/twitter-to-sqlite/pull/55#issuecomment-760950128,https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/55,760950128,MDEyOklzc3VlQ29tbWVudDc2MDk1MDEyOA==,21148,jacobian,2021-01-15T13:44:52Z,2021-01-15T13:44:52Z,CONTRIBUTOR,"I found and fixed another bug, this one around importing the tweets table. @simonw let me know if you'd prefer this broken out into multiple PRs, happy to do that if it makes review/merging easier.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",779211940,Fix archive imports, https://github.com/simonw/datasette/issues/1187#issuecomment-759875239,https://api.github.com/repos/simonw/datasette/issues/1187,759875239,MDEyOklzc3VlQ29tbWVudDc1OTg3NTIzOQ==,9599,simonw,2021-01-14T02:02:24Z,2021-01-14T02:02:31Z,OWNER,"This plugin hook currently returns a string of JavaScript. It could optionally return this instead of a string: ```json { ""script"": ""string of JavaScript goes here"", ""module"": true } ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",785588942,"extra_body_script() support for script type=""module""", https://github.com/simonw/datasette/issues/1186#issuecomment-759874332,https://api.github.com/repos/simonw/datasette/issues/1186,759874332,MDEyOklzc3VlQ29tbWVudDc1OTg3NDMzMg==,9599,simonw,2021-01-14T01:59:35Z,2021-01-14T01:59:35Z,OWNER,Updated documentation: https://docs.datasette.io/en/latest/custom_templates.html#custom-css-and-javascript and https://docs.datasette.io/en/latest/plugin_hooks.html#extra-js-urls-template-database-table-columns-view-name-request-datasette,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",785573793,"script type=""module"" support", https://github.com/simonw/datasette/pull/1159#issuecomment-759306228,https://api.github.com/repos/simonw/datasette/issues/1159,759306228,MDEyOklzc3VlQ29tbWVudDc1OTMwNjIyOA==,552629,lovasoa,2021-01-13T08:59:31Z,2021-01-13T08:59:31Z,NONE,@simonw : Did you have the time to take a look at this ?,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",774332247,Improve the display of facets information, https://github.com/simonw/sqlite-utils/issues/220#issuecomment-759098964,https://api.github.com/repos/simonw/sqlite-utils/issues/220,759098964,MDEyOklzc3VlQ29tbWVudDc1OTA5ODk2NA==,9599,simonw,2021-01-12T23:19:55Z,2021-01-12T23:19:55Z,OWNER,"I don't think it makes sense to call `.enable_fts()` on a view does it? When I'm working with views and FTS I tend to write my queries against a FTS table for one of the tables that is used by the view - https://docs.datasette.io/en/stable/full_text_search.html#configuring-full-text-search-for-a-table-or-view describes how I do that in Datasette for example. Can you expand on your use-case for FTS and views? 
I'm ready to be convinced otherwise, but I don't see how it would work right now.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",783778672,Better error message for *_fts methods against views, https://github.com/simonw/datasette/issues/1185#issuecomment-759069342,https://api.github.com/repos/simonw/datasette/issues/1185,759069342,MDEyOklzc3VlQ29tbWVudDc1OTA2OTM0Mg==,9599,simonw,2021-01-12T22:13:18Z,2021-01-12T22:13:18Z,OWNER,I'm going to change the error message to list the allowed pragmas.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",784628163,"""Statement may not contain PRAGMA"" error is not strictly true", https://github.com/simonw/datasette/issues/1185#issuecomment-759067427,https://api.github.com/repos/simonw/datasette/issues/1185,759067427,MDEyOklzc3VlQ29tbWVudDc1OTA2NzQyNw==,9599,simonw,2021-01-12T22:09:21Z,2021-01-12T22:09:21Z,OWNER,"That allow-list was added in #761 but is not currently documented. It's here in the code: https://github.com/simonw/datasette/blob/8e8fc5cee5c78da8334495c6d6257d5612c40792/datasette/utils/__init__.py#L173-L186","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",784628163,"""Statement may not contain PRAGMA"" error is not strictly true", https://github.com/simonw/datasette/issues/1185#issuecomment-759066777,https://api.github.com/repos/simonw/datasette/issues/1185,759066777,MDEyOklzc3VlQ29tbWVudDc1OTA2Njc3Nw==,9599,simonw,2021-01-12T22:07:58Z,2021-01-12T22:07:58Z,OWNER,"https://docs.datasette.io/en/stable/sql_queries.html?highlight=pragma#named-parameters documentation is out-of-date as well: > Datasette disallows custom SQL containing the string PRAGMA, as SQLite pragma statements can be used to change database settings at runtime. If you need to include the string ""pragma"" in a query you can do so safely using a named parameter.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",784628163,"""Statement may not contain PRAGMA"" error is not strictly true", https://github.com/simonw/datasette/issues/1091#issuecomment-758668359,https://api.github.com/repos/simonw/datasette/issues/1091,758668359,MDEyOklzc3VlQ29tbWVudDc1ODY2ODM1OQ==,6739646,tballison,2021-01-12T13:52:29Z,2021-01-12T13:52:29Z,NONE,"Y, thank you to both of you!","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",742011049,.json and .csv exports fail to apply base_url, https://github.com/simonw/datasette/issues/1091#issuecomment-758448525,https://api.github.com/repos/simonw/datasette/issues/1091,758448525,MDEyOklzc3VlQ29tbWVudDc1ODQ0ODUyNQ==,19328961,henry501,2021-01-12T06:55:08Z,2021-01-12T06:55:08Z,NONE,"Great, really happy I could help! Reverse proxies get tricky.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",742011049,.json and .csv exports fail to apply base_url, https://github.com/simonw/datasette/issues/1183#issuecomment-758356640,https://api.github.com/repos/simonw/datasette/issues/1183,758356640,MDEyOklzc3VlQ29tbWVudDc1ODM1NjY0MA==,9599,simonw,2021-01-12T02:42:08Z,2021-01-12T02:42:08Z,OWNER,"Should Datasette have subcommands for this? 
`datasette enable-counts data.db` and `datasette disable-counts data.db` and `datasette reset-counts data.db` commands? Maybe. The `sqlite-utils` CLI tool could be used here instead, but that won't be easily available if Datasette was installed as a standalone binary or using `brew install datasette` or `pipx install datasette`.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",782708469,"Take advantage of sqlite-utils cached table counts, if available", https://github.com/simonw/datasette/issues/1183#issuecomment-758356097,https://api.github.com/repos/simonw/datasette/issues/1183,758356097,MDEyOklzc3VlQ29tbWVudDc1ODM1NjA5Nw==,9599,simonw,2021-01-12T02:40:30Z,2021-01-12T02:40:30Z,OWNER,"So how would this work? I think I'm going to automatically use these values if the `_counts` table exists, unless a `ignore_counts_table` boolean setting has been set. I won't bother looking to see if the triggers have been created.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",782708469,"Take advantage of sqlite-utils cached table counts, if available", https://github.com/simonw/datasette/issues/1091#issuecomment-758283074,https://api.github.com/repos/simonw/datasette/issues/1091,758283074,MDEyOklzc3VlQ29tbWVudDc1ODI4MzA3NA==,9599,simonw,2021-01-11T23:12:46Z,2021-01-11T23:12:46Z,OWNER,Fantastic!,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",742011049,.json and .csv exports fail to apply base_url, https://github.com/simonw/datasette/issues/1091#issuecomment-758280611,https://api.github.com/repos/simonw/datasette/issues/1091,758280611,MDEyOklzc3VlQ29tbWVudDc1ODI4MDYxMQ==,6739646,tballison,2021-01-11T23:06:10Z,2021-01-11T23:06:10Z,NONE,"+1 Yep! Fixes it. If I navigate to https://corpora.tika.apache.org/datasette, I get a 404 (database not found: datasette), but if I navigate to https://corpora.tika.apache.org/datasette/file_profiles/, everything WORKS! 
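Going back to the cached-counts idea above: a sketch of how Datasette might read the `_counts` table that sqlite-utils maintains. `cached_table_counts` is a hypothetical helper, and the `[table]`/`count` column names follow the cached-counts schema sqlite-utils documents:

```python
# Hypothetical helper - not part of Datasette
async def cached_table_counts(db):
    if not await db.table_exists("_counts"):
        return {}
    result = await db.execute("select [table], count from _counts")
    return {row["table"]: row["count"] for row in result.rows}
```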
Thank you!","{""total_count"": 1, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 1, ""eyes"": 0}",742011049,.json and .csv exports fail to apply base_url, https://github.com/simonw/datasette/issues/1182#issuecomment-757375858,https://api.github.com/repos/simonw/datasette/issues/1182,757375858,MDEyOklzc3VlQ29tbWVudDc1NzM3NTg1OA==,9599,simonw,2021-01-09T22:18:47Z,2021-01-09T22:18:47Z,OWNER,https://docs.datasette.io/en/latest/ecosystem.html,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",782692159,"Retire ""Ecosystem"" page in favour of datasette.io/plugins and /tools", https://github.com/simonw/datasette/issues/1182#issuecomment-757373741,https://api.github.com/repos/simonw/datasette/issues/1182,757373741,MDEyOklzc3VlQ29tbWVudDc1NzM3Mzc0MQ==,9599,simonw,2021-01-09T22:01:41Z,2021-01-09T22:01:41Z,OWNER,It can talk about Dogsheep too.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",782692159,"Retire ""Ecosystem"" page in favour of datasette.io/plugins and /tools", https://github.com/simonw/datasette/issues/1182#issuecomment-757373082,https://api.github.com/repos/simonw/datasette/issues/1182,757373082,MDEyOklzc3VlQ29tbWVudDc1NzM3MzA4Mg==,9599,simonw,2021-01-09T21:55:33Z,2021-01-09T21:55:33Z,OWNER,I'll leave the page there but change it into more of a blurb about the existence of the plugins and tools directories.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",782692159,"Retire ""Ecosystem"" page in favour of datasette.io/plugins and /tools", https://github.com/simonw/datasette/issues/1181#issuecomment-756487966,https://api.github.com/repos/simonw/datasette/issues/1181,756487966,MDEyOklzc3VlQ29tbWVudDc1NjQ4Nzk2Ng==,9599,simonw,2021-01-08T01:25:42Z,2021-01-08T01:25:42Z,OWNER,I'm going to add a unit test that tries a variety of weird database names.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",781262510,"Certain database names results in 404: ""Database not found: None""", https://github.com/simonw/datasette/issues/1181#issuecomment-756482163,https://api.github.com/repos/simonw/datasette/issues/1181,756482163,MDEyOklzc3VlQ29tbWVudDc1NjQ4MjE2Mw==,9599,simonw,2021-01-08T01:06:23Z,2021-01-08T01:06:54Z,OWNER,"Yes, that logic is definitely at fault. 
It looks like it applies `urllib.parse.unquote_plus()` AFTER it's tried to do the `-` hash splitting thing, which is why it's failing here: https://github.com/simonw/datasette/blob/97fb10c17dd007a275ab743742e93e932335ad67/datasette/views/base.py#L184-L198","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",781262510,"Certain database names results in 404: ""Database not found: None""", https://github.com/simonw/datasette/issues/1091#issuecomment-756453945,https://api.github.com/repos/simonw/datasette/issues/1091,756453945,MDEyOklzc3VlQ29tbWVudDc1NjQ1Mzk0NQ==,9599,simonw,2021-01-07T23:42:50Z,2021-01-07T23:42:50Z,OWNER,"@henry501 it looks like you spotted a bug in the documentation - I just addressed that, the fix is now live here: https://docs.datasette.io/en/latest/deploying.html#running-datasette-behind-a-proxy","{""total_count"": 1, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 1, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",742011049,.json and .csv exports fail to apply base_url, https://github.com/simonw/datasette/issues/1091#issuecomment-756453010,https://api.github.com/repos/simonw/datasette/issues/1091,756453010,MDEyOklzc3VlQ29tbWVudDc1NjQ1MzAxMA==,9599,simonw,2021-01-07T23:39:58Z,2021-01-07T23:40:25Z,OWNER,"@tballison I think that's the solution! It looks like you need to use this in your config: `ProxyPass /datasette http://127.0.0.1:8001/datasette` Instead of this: `ProxyPass /datasette http://127.0.0.1:8001/` Give that a go and let me know if it fixes it.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",742011049,.json and .csv exports fail to apply base_url, https://github.com/simonw/datasette/issues/1091#issuecomment-756425587,https://api.github.com/repos/simonw/datasette/issues/1091,756425587,MDEyOklzc3VlQ29tbWVudDc1NjQyNTU4Nw==,19328961,henry501,2021-01-07T22:27:19Z,2021-01-07T22:27:19Z,NONE,"I found this issue while troubleshooting the same behavior with an nginx reverse proxy. The solution was to make sure I set: `proxy_pass http://server:8001/baseurl/ ` instead of just: `proxy_pass http://server:8001 ` The custom SQL query and header links are now correct. 
","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",742011049,.json and .csv exports fail to apply base_url, https://github.com/simonw/datasette/issues/1171#issuecomment-756335394,https://api.github.com/repos/simonw/datasette/issues/1171,756335394,MDEyOklzc3VlQ29tbWVudDc1NjMzNTM5NA==,9599,simonw,2021-01-07T19:35:59Z,2021-01-07T19:35:59Z,OWNER,"I requested a D-U-N-S number as a first step in getting a developer certificate: https://developer.apple.com/support/D-U-N-S/ ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",778450486,GitHub Actions workflow to build and sign macOS binary executables, https://github.com/simonw/datasette/issues/1180#issuecomment-756312213,https://api.github.com/repos/simonw/datasette/issues/1180,756312213,MDEyOklzc3VlQ29tbWVudDc1NjMxMjIxMw==,9599,simonw,2021-01-07T18:56:24Z,2021-01-07T18:56:24Z,OWNER,The `async_call_with_supported_arguments` version should be able to await any of the lazy arguments that are awaitable - can use `await_me_maybe` for that.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",780767542,Lazily evaluated arguments for call_with_supported_arguments, https://github.com/simonw/datasette/issues/1180#issuecomment-755500475,https://api.github.com/repos/simonw/datasette/issues/1180,755500475,MDEyOklzc3VlQ29tbWVudDc1NTUwMDQ3NQ==,9599,simonw,2021-01-06T18:43:41Z,2021-01-06T18:43:41Z,OWNER,Relevant code: https://github.com/simonw/datasette/blob/97fb10c17dd007a275ab743742e93e932335ad67/datasette/utils/__init__.py#L919-L940,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",780767542,Lazily evaluated arguments for call_with_supported_arguments, https://github.com/simonw/datasette/issues/1179#issuecomment-755495387,https://api.github.com/repos/simonw/datasette/issues/1179,755495387,MDEyOklzc3VlQ29tbWVudDc1NTQ5NTM4Nw==,9599,simonw,2021-01-06T18:39:23Z,2021-01-06T18:39:23Z,OWNER,"In that case maybe there are three new arguments: `path`, `full_path` and `url`. 
I'll also add `request.full_path` for consistency with these: https://github.com/simonw/datasette/blob/97fb10c17dd007a275ab743742e93e932335ad67/datasette/utils/asgi.py#L77-L90","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",780278550,Make original path available to render hooks, https://github.com/simonw/datasette/issues/1179#issuecomment-755492945,https://api.github.com/repos/simonw/datasette/issues/1179,755492945,MDEyOklzc3VlQ29tbWVudDc1NTQ5Mjk0NQ==,9599,simonw,2021-01-06T18:37:39Z,2021-01-06T18:37:39Z,OWNER,I think I'll call this `full_path` for consistency with Django.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",780278550,Make original path available to render hooks, https://github.com/simonw/datasette/issues/1179#issuecomment-755489974,https://api.github.com/repos/simonw/datasette/issues/1179,755489974,MDEyOklzc3VlQ29tbWVudDc1NTQ4OTk3NA==,9599,simonw,2021-01-06T18:35:24Z,2021-01-06T18:35:24Z,OWNER,Django calls this ` HttpRequest.get_full_path()` - for the path plus the querystring.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",780278550,Make original path available to render hooks, https://github.com/simonw/datasette/issues/1179#issuecomment-755486103,https://api.github.com/repos/simonw/datasette/issues/1179,755486103,MDEyOklzc3VlQ29tbWVudDc1NTQ4NjEwMw==,9599,simonw,2021-01-06T18:32:41Z,2021-01-06T18:34:11Z,OWNER,"This parameter will return the URL path, with querystring arguments, to the HTML version of the page - e.g. `/github/issue_comments` or `/github/issue_comments?_sort_desc=created_at` Open questions: - What should it be called? `path` could be misleading since it also includes the querystring. - Should I provide a `url` or `full_url` version which includes `https://blah.com/...`?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",780278550,Make original path available to render hooks, https://github.com/simonw/datasette/issues/782#issuecomment-755484384,https://api.github.com/repos/simonw/datasette/issues/782,755484384,MDEyOklzc3VlQ29tbWVudDc1NTQ4NDM4NA==,9599,simonw,2021-01-06T18:31:14Z,2021-01-06T18:31:57Z,OWNER,"In building https://latest-with-plugins.datasette.io/github/issue_comments.Notebook?_labels=on I discovered the following patterns for importing data into both Pandas and Observable/d3: ```python import pandas df = pandas.read_json( ""https://latest-with-plugins.datasette.io/github/issue_comments.json?_shape=array"" ) ``` And: ```javascript d3 = require(""d3@5"") rows = d3.json( ""https://latest-with-plugins.datasette.io/github/issue_comments.json?_shape=array"" ) ``` Once again I find myself torn on the best possible default. A list of JSON objects is instantly compatible with both `pandas.read_json()` and `d3.json()` - but it leaves nowhere to put the extra information like pagination and suchlike! Even given this I still think the correct default is an object with `""rows""`, `""total""` and `""next_url""` keys. 
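With that default shape, the Pandas import from the earlier comment would presumably need one extra step - a sketch, assuming the proposed `rows`, `total` and `next_url` keys:

```python
import json
import urllib.request

import pandas

url = "https://latest-with-plugins.datasette.io/github/issue_comments.json"
data = json.load(urllib.request.urlopen(url))
df = pandas.DataFrame(data["rows"])
# data["total"] and data["next_url"] would still be available for pagination
```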
I should commit to that and implement it - this thought exercise has been running for far too long.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",627794879,Redesign default .json format, https://github.com/simonw/datasette/issues/1178#issuecomment-755476820,https://api.github.com/repos/simonw/datasette/issues/1178,755476820,MDEyOklzc3VlQ29tbWVudDc1NTQ3NjgyMA==,9599,simonw,2021-01-06T18:24:47Z,2021-01-06T18:24:47Z,OWNER,"Issue fixed - https://latest-with-plugins.datasette.io/github/issue_comments.Notebook?_labels=on displays the correct schemes now. I can't think of a reason anyone on Cloud Run would ever NOT want the `force_https_urls` option, but just in case I've made it so if you pass `--extra-options --setting force_https_urls off` to `publish cloudrun` your setting will be respected. https://github.com/simonw/datasette/blob/97fb10c17dd007a275ab743742e93e932335ad67/datasette/publish/cloudrun.py#L105-L110","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",780267857,Use force_https_urls on when deploying with Cloud Run, https://github.com/simonw/datasette/issues/1178#issuecomment-755468795,https://api.github.com/repos/simonw/datasette/issues/1178,755468795,MDEyOklzc3VlQ29tbWVudDc1NTQ2ODc5NQ==,9599,simonw,2021-01-06T18:14:35Z,2021-01-06T18:14:35Z,OWNER,Deploying that change now to test it.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",780267857,Use force_https_urls on when deploying with Cloud Run, https://github.com/simonw/datasette/issues/1178#issuecomment-755163886,https://api.github.com/repos/simonw/datasette/issues/1178,755163886,MDEyOklzc3VlQ29tbWVudDc1NTE2Mzg4Ng==,9599,simonw,2021-01-06T08:37:51Z,2021-01-06T08:37:51Z,OWNER,"Easiest fix would be for `publish cloudrun` to set `force_https_urls`: `datasette publish now` used to do this: https://github.com/simonw/datasette/blob/07e208cc6d9e901b87552c1be2854c220b3f9b6d/datasette/publish/now.py#L59-L63","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",780267857,Use force_https_urls on when deploying with Cloud Run, https://github.com/simonw/datasette/issues/1179#issuecomment-755161574,https://api.github.com/repos/simonw/datasette/issues/1179,755161574,MDEyOklzc3VlQ29tbWVudDc1NTE2MTU3NA==,9599,simonw,2021-01-06T08:32:31Z,2021-01-06T08:32:31Z,OWNER,An optional `path` argument to https://docs.datasette.io/en/stable/plugin_hooks.html#register-output-renderer-datasette which shows the path WITHOUT the `.Notebook` extension would be useful here.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",780278550,Make original path available to render hooks, https://github.com/simonw/datasette/issues/1178#issuecomment-755160187,https://api.github.com/repos/simonw/datasette/issues/1178,755160187,MDEyOklzc3VlQ29tbWVudDc1NTE2MDE4Nw==,9599,simonw,2021-01-06T08:29:35Z,2021-01-06T08:29:35Z,OWNER,"https://latest-with-plugins.datasette.io/-/asgi-scope ``` {'asgi': {'spec_version': '2.1', 'version': '3.0'}, 'client': ('169.254.8.129', 54971), 'headers': [(b'host', b'latest-with-plugins.datasette.io'), (b'user-agent', b'Mozilla/5.0 (Macintosh; Intel Mac OS X 10.15; rv:84.0) Gecko' b'/20100101 Firefox/84.0'), (b'accept', 
b'text/html,application/xhtml+xml,application/xml;q=0.9,image/' b'webp,*/*;q=0.8'), (b'accept-language', b'en-US,en;q=0.5'), (b'dnt', b'1'), (b'cookie', b'_ga_LL6M7BK6D4=GS1.1.1609886546.49.1.1609886923.0; _ga=GA1.1' b'.894633707.1607575712'), (b'upgrade-insecure-requests', b'1'), (b'x-client-data', b'CgSL6ZsV'), (b'x-cloud-trace-context', b'e776af843c657d2a3da28a73b726e6fe/14187666787557102189;o=1'), (b'x-forwarded-for', b'148.64.98.14'), (b'x-forwarded-proto', b'https'), (b'forwarded', b'for=""148.64.98.14"";proto=https'), (b'accept-encoding', b'gzip, deflate, br'), (b'content-length', b'0')], 'http_version': '1.1', 'method': 'GET', 'path': '/-/asgi-scope', 'query_string': b'', 'raw_path': b'/-/asgi-scope', 'root_path': '', 'scheme': 'http', 'server': ('169.254.8.130', 8080), 'type': 'http'} ``` Note the `'scheme': 'http'` but also the `(b'x-forwarded-proto', b'https')`.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",780267857,Use force_https_urls on when deploying with Cloud Run, https://github.com/simonw/datasette/issues/1176#issuecomment-755159583,https://api.github.com/repos/simonw/datasette/issues/1176,755159583,MDEyOklzc3VlQ29tbWVudDc1NTE1OTU4Mw==,9599,simonw,2021-01-06T08:28:20Z,2021-01-06T08:28:20Z,OWNER,I used `from datasette.utils import path_with_format` in https://github.com/simonw/datasette-export-notebook/blob/0.1/datasette_export_notebook/__init__.py just now.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",779691739,"Policy on documenting ""public"" datasette.utils functions", https://github.com/simonw/datasette/issues/1178#issuecomment-755158310,https://api.github.com/repos/simonw/datasette/issues/1178,755158310,MDEyOklzc3VlQ29tbWVudDc1NTE1ODMxMA==,9599,simonw,2021-01-06T08:25:31Z,2021-01-06T08:25:31Z,OWNER,Moving this to the Datasette repo.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",780267857,Use force_https_urls on when deploying with Cloud Run, https://github.com/simonw/datasette/issues/1178#issuecomment-755157732,https://api.github.com/repos/simonw/datasette/issues/1178,755157732,MDEyOklzc3VlQ29tbWVudDc1NTE1NzczMg==,9599,simonw,2021-01-06T08:24:12Z,2021-01-06T08:24:12Z,OWNER,https://latest-with-plugins.datasette.io/fixtures/sortable.json has the bug too - the `next_url` is `http://` when it should be `https://`.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",780267857,Use force_https_urls on when deploying with Cloud Run, https://github.com/simonw/datasette/issues/1178#issuecomment-755157281,https://api.github.com/repos/simonw/datasette/issues/1178,755157281,MDEyOklzc3VlQ29tbWVudDc1NTE1NzI4MQ==,9599,simonw,2021-01-06T08:23:14Z,2021-01-06T08:23:14Z,OWNER,"https://latest-with-plugins.datasette.io/-/settings says `""force_https_urls"": false`","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",780267857,Use force_https_urls on when deploying with Cloud Run, https://github.com/simonw/datasette/issues/1178#issuecomment-755157066,https://api.github.com/repos/simonw/datasette/issues/1178,755157066,MDEyOklzc3VlQ29tbWVudDc1NTE1NzA2Ng==,9599,simonw,2021-01-06T08:22:47Z,2021-01-06T08:22:47Z,OWNER,"Weird... 
https://github.com/simonw/datasette/blob/a882d679626438ba0d809944f06f239bcba8ee96/datasette/app.py#L609-L613 ```python def absolute_url(self, request, path): url = urllib.parse.urljoin(request.url, path) if url.startswith(""http://"") and self.setting(""force_https_urls""): url = ""https://"" + url[len(""http://"") :] return url ``` That looks like it should work. Needs more digging.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",780267857,Use force_https_urls on when deploying with Cloud Run, https://github.com/simonw/datasette/issues/1178#issuecomment-755156606,https://api.github.com/repos/simonw/datasette/issues/1178,755156606,MDEyOklzc3VlQ29tbWVudDc1NTE1NjYwNg==,9599,simonw,2021-01-06T08:21:49Z,2021-01-06T08:21:49Z,OWNER,"https://github.com/simonw/datasette-export-notebook/blob/aec398eab4f34791d240d7bc47b6eec575b357be/datasette_export_notebook/__init__.py#L18-L23 Maybe this is a bug in `datasette.absolute_url`? Perhaps it doesn't take the scheme into account.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",780267857,Use force_https_urls on when deploying with Cloud Run, https://github.com/simonw/datasette/issues/1101#issuecomment-755134771,https://api.github.com/repos/simonw/datasette/issues/1101,755134771,MDEyOklzc3VlQ29tbWVudDc1NTEzNDc3MQ==,9599,simonw,2021-01-06T07:28:01Z,2021-01-06T07:28:01Z,OWNER,"With this structure it will become possible to stream non-newline-delimited JSON array-of-objects too - the `stream_rows()` method could output `[` first, then each row followed by a comma, then `]` after the very last row.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",749283032,register_output_renderer() should support streaming data, https://github.com/simonw/datasette/issues/1101#issuecomment-755133937,https://api.github.com/repos/simonw/datasette/issues/1101,755133937,MDEyOklzc3VlQ29tbWVudDc1NTEzMzkzNw==,9599,simonw,2021-01-06T07:25:48Z,2021-01-06T07:26:43Z,OWNER,"Idea: instead of returning a dictionary, `register_output_renderer` could return an object. 
The object could have the following properties: - `.extension` - the extension to use - `.can_render(...)` - says if it can render this - `.can_stream(...)` - says if streaming is supported - `async .stream_rows(rows_iterator, send)` - method that loops through all rows and uses `send` to send them to the response in the correct format I can then deprecate the existing `dict` return type for 1.0.","{""total_count"": 2, ""+1"": 2, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",749283032,register_output_renderer() should support streaming data, https://github.com/simonw/datasette/issues/1101#issuecomment-755128038,https://api.github.com/repos/simonw/datasette/issues/1101,755128038,MDEyOklzc3VlQ29tbWVudDc1NTEyODAzOA==,9599,simonw,2021-01-06T07:10:22Z,2021-01-06T07:10:22Z,OWNER,"Yet another use-case for this: I want to be able to stream newline-delimited JSON in order to better import into Pandas: pandas.read_json(""https://latest.datasette.io/fixtures/compound_three_primary_keys.json?_shape=array&_nl=on"", lines=True)","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",749283032,register_output_renderer() should support streaming data, https://github.com/simonw/datasette/issues/1171#issuecomment-754958998,https://api.github.com/repos/simonw/datasette/issues/1171,754958998,MDEyOklzc3VlQ29tbWVudDc1NDk1ODk5OA==,9599,simonw,2021-01-05T23:16:33Z,2021-01-05T23:16:33Z,OWNER,"That's really useful, thanks @rcoup ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",778450486,GitHub Actions workflow to build and sign macOS binary executables, https://github.com/simonw/datasette/issues/782#issuecomment-754958610,https://api.github.com/repos/simonw/datasette/issues/782,754958610,MDEyOklzc3VlQ29tbWVudDc1NDk1ODYxMA==,9599,simonw,2021-01-05T23:15:24Z,2021-01-05T23:15:24Z,OWNER,https://latest-with-plugins.datasette.io/fixtures/roadside_attraction_characteristics/1.json-preview returns a 500 error at the moment - a KeyError on 'filtered_table_rows_count'.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",627794879,Redesign default .json format, https://github.com/simonw/datasette/issues/576#issuecomment-754957658,https://api.github.com/repos/simonw/datasette/issues/576,754957658,MDEyOklzc3VlQ29tbWVudDc1NDk1NzY1OA==,9599,simonw,2021-01-05T23:12:50Z,2021-01-05T23:12:50Z,OWNER,See https://docs.datasette.io/en/stable/internals.html,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",497170355,Documented internals API for use in plugins, https://github.com/simonw/datasette/issues/576#issuecomment-754957563,https://api.github.com/repos/simonw/datasette/issues/576,754957563,MDEyOklzc3VlQ29tbWVudDc1NDk1NzU2Mw==,9599,simonw,2021-01-05T23:12:37Z,2021-01-05T23:12:37Z,OWNER,"I'm happy with how this has evolved, so I'm closing the issue.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",497170355,Documented internals API for use in plugins, 
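A sketch of the renderer-object idea from the `register_output_renderer()` discussion above - the attribute and method names are the ones proposed in that comment, while the class name, JSON framing and iterator details are assumptions:

```python
import json


class ArrayRenderer:
    extension = "json-array"

    def can_render(self, columns, **kwargs):
        return True

    def can_stream(self, **kwargs):
        return True

    async def stream_rows(self, rows_iterator, send):
        # Emit a JSON array-of-objects without buffering the whole result in memory
        await send("[")
        first = True
        async for row in rows_iterator:
            if not first:
                await send(",")
            await send(json.dumps(dict(row)))
            first = False
        await send("]")
```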
https://github.com/simonw/datasette/issues/1176#issuecomment-754957378,https://api.github.com/repos/simonw/datasette/issues/1176,754957378,MDEyOklzc3VlQ29tbWVudDc1NDk1NzM3OA==,9599,simonw,2021-01-05T23:12:03Z,2021-01-05T23:12:03Z,OWNER,This needs to be done for Datasette 1.0. At the very least I need to ensure it's clear that `datasette.utils` is not part of the public API unless explicitly marked as such.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",779691739,"Policy on documenting ""public"" datasette.utils functions", https://github.com/simonw/datasette/issues/1176#issuecomment-754952146,https://api.github.com/repos/simonw/datasette/issues/1176,754952146,MDEyOklzc3VlQ29tbWVudDc1NDk1MjE0Ng==,9599,simonw,2021-01-05T22:57:26Z,2021-01-05T22:57:26Z,OWNER,Known public APIs might be worth adding type annotations to as well.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",779691739,"Policy on documenting ""public"" datasette.utils functions", https://github.com/simonw/datasette/issues/1176#issuecomment-754952040,https://api.github.com/repos/simonw/datasette/issues/1176,754952040,MDEyOklzc3VlQ29tbWVudDc1NDk1MjA0MA==,9599,simonw,2021-01-05T22:57:09Z,2021-01-05T22:57:09Z,OWNER,It might be neater to move all of the non-public functions into a separate module - `datasette.utils.internal` perhaps.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",779691739,"Policy on documenting ""public"" datasette.utils functions", https://github.com/simonw/datasette/issues/1176#issuecomment-754951786,https://api.github.com/repos/simonw/datasette/issues/1176,754951786,MDEyOklzc3VlQ29tbWVudDc1NDk1MTc4Ng==,9599,simonw,2021-01-05T22:56:27Z,2021-01-05T22:56:43Z,OWNER,"Idea: introduce a `@documented` decorator which marks specific functions as part of the public, documented API. The unit tests can then confirm that anything with that decorator is both documented and tested. ```python @documented def escape_css_string(s): return _css_re.sub( lambda m: ""\\"" + (f""{ord(m.group()):X}"".zfill(6)), s.replace(""\r\n"", ""\n""), ) ``` Or maybe `@public`?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",779691739,"Policy on documenting ""public"" datasette.utils functions", https://github.com/simonw/datasette/issues/1171#issuecomment-754911290,https://api.github.com/repos/simonw/datasette/issues/1171,754911290,MDEyOklzc3VlQ29tbWVudDc1NDkxMTI5MA==,59874,rcoup,2021-01-05T21:31:15Z,2021-01-05T21:31:15Z,NONE,"We did this for [Sno](https://sno.earth) under macOS — it's a PyInstaller binary/setup which uses [Packages](http://s.sudre.free.fr/Software/Packages/about.html) for packaging. * [Building & Signing](https://github.com/koordinates/sno/blob/master/platforms/Makefile#L67-L95) * [Packaging & Notarizing](https://github.com/koordinates/sno/blob/master/platforms/Makefile#L121-L215) * [Github Workflow](https://github.com/koordinates/sno/blob/master/.github/workflows/build.yml#L228-L269) has the CI side of it FYI (if you ever get to it) for Windows you need to get a code signing certificate. And if you want automated CI, you'll want to get an ""EV CodeSigning for HSM"" certificate from GlobalSign, which then lets you put the certificate into Azure Key Vault. 
Which you can use with [azuresigntool](https://github.com/vcsjones/AzureSignTool) to sign your code & installer. (Non-EV certificates are a waste of time, the user still gets big warnings at install time). ","{""total_count"": 1, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 1, ""rocket"": 0, ""eyes"": 0}",778450486,GitHub Actions workflow to build and sign macOS binary executables, https://github.com/dogsheep/twitter-to-sqlite/issues/54#issuecomment-754729035,https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/54,754729035,MDEyOklzc3VlQ29tbWVudDc1NDcyOTAzNQ==,21148,jacobian,2021-01-05T16:03:29Z,2021-01-05T16:03:29Z,CONTRIBUTOR,"I was able to fix this, at least enough to get _my_ archive to import. Not sure if there's more work to be done here or not.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",779088071,Archive import appears to be broken on recent exports, https://github.com/dogsheep/twitter-to-sqlite/pull/55#issuecomment-754728696,https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/55,754728696,MDEyOklzc3VlQ29tbWVudDc1NDcyODY5Ng==,21148,jacobian,2021-01-05T16:02:55Z,2021-01-05T16:02:55Z,CONTRIBUTOR,"This now works for me, though I'm entirely ensure if it's a just-my-export thing or a wider issue. Also, this doesn't contain any tests. So I'm not sure if there's more work to be done here, or if this is good enough.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",779211940,Fix archive imports, https://github.com/dogsheep/twitter-to-sqlite/issues/54#issuecomment-754721153,https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/54,754721153,MDEyOklzc3VlQ29tbWVudDc1NDcyMTE1Mw==,21148,jacobian,2021-01-05T15:51:09Z,2021-01-05T15:51:09Z,CONTRIBUTOR,"Correction: the failure is on `lists-member.js` (I was thrown by the `block` variable name, but that's just a coincidence)","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",779088071,Archive import appears to be broken on recent exports, https://github.com/simonw/datasette/issues/1175#issuecomment-754696725,https://api.github.com/repos/simonw/datasette/issues/1175,754696725,MDEyOklzc3VlQ29tbWVudDc1NDY5NjcyNQ==,9599,simonw,2021-01-05T15:12:30Z,2021-01-05T15:12:30Z,OWNER,Some tips here: https://github.com/tiangolo/fastapi/issues/78,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",779156520,Use structlog for logging, https://github.com/simonw/datasette/issues/1167#issuecomment-754619930,https://api.github.com/repos/simonw/datasette/issues/1167,754619930,MDEyOklzc3VlQ29tbWVudDc1NDYxOTkzMA==,3637,benpickles,2021-01-05T12:57:57Z,2021-01-05T12:57:57Z,CONTRIBUTOR,"Not sure where exactly to put the actual docs (presumably somewhere in [docs/contributing.rst](https://github.com/simonw/datasette/blob/main/docs/contributing.rst)) but I've made a slight change to make it easier to run locally (copying [the approach in excalidraw](https://github.com/excalidraw/excalidraw/blob/ade2565f497243a5e428f4906d8ed80c872fd981/package.json#L90-L94)): https://github.com/simonw/datasette/compare/main...benpickles:prettier-docs ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",777145954,Add 
Prettier to contributing documentation, https://github.com/simonw/datasette/issues/859#issuecomment-647922203,https://api.github.com/repos/simonw/datasette/issues/859,647922203,MDEyOklzc3VlQ29tbWVudDY0NzkyMjIwMw==,3243482,abdusco,2020-06-23T05:44:58Z,2021-01-05T08:22:43Z,CONTRIBUTOR,"I'm seeing the problem on database page. Index page and table page runs quite fast. - Tables have <10 columns (`id`, `url`, `title`, `body_html`, `date`, `author`, `meta` (for keeping unstructured json)). I've added index on `date` columns (using `sqlite-utils`) in addition to the index present on `id` columns. - All tables have FTS enabled on `text` and `varchar` columns (`title`, `body_html` etc) to speed up searching. - There are couple of tables related with foreign keys (think a thread in a forum and posts in that thread, related with `thread_id`) ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",642572841,Database page loads too slowly with many large tables (due to table counts), https://github.com/simonw/datasette/issues/1173#issuecomment-754463845,https://api.github.com/repos/simonw/datasette/issues/1173,754463845,MDEyOklzc3VlQ29tbWVudDc1NDQ2Mzg0NQ==,9599,simonw,2021-01-05T07:41:43Z,2021-01-05T07:41:43Z,OWNER,"https://github.com/oleksis/pyinstaller-manylinux looks useful, via https://twitter.com/oleksis/status/1346341987876823040","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",778682317,GitHub Actions workflow to build manylinux binary, https://github.com/simonw/datasette/issues/1171#issuecomment-754296761,https://api.github.com/repos/simonw/datasette/issues/1171,754296761,MDEyOklzc3VlQ29tbWVudDc1NDI5Njc2MQ==,9599,simonw,2021-01-04T23:55:44Z,2021-01-04T23:55:44Z,OWNER,Bit uncomfortable that it looks like you need to include your Apple ID username and password in the CI configuration to do this. I'll use GitHub Secrets for this but I don't like it - I'll definitely setup a dedicated code signing account that's not my access-to-everything AppleID for this.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",778450486,GitHub Actions workflow to build and sign macOS binary executables, https://github.com/simonw/datasette/issues/1171#issuecomment-754295380,https://api.github.com/repos/simonw/datasette/issues/1171,754295380,MDEyOklzc3VlQ29tbWVudDc1NDI5NTM4MA==,9599,simonw,2021-01-04T23:54:32Z,2021-01-04T23:54:32Z,OWNER,"https://github.com/search?l=YAML&q=gon+json&type=Code reveals some examples of people using `gon` in workflows. 
These look useful: * https://github.com/coherence/hub-server/blob/3b7e9c7c5bce9e244b14b854f1f89d66f53a5a39/.github/workflows/release_build.yml * https://github.com/simoncozens/pilcrow/blob/5abc145e7fb9577086afe47b48fd730cb8195386/.github/workflows/buildapp.yaml","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",778450486,GitHub Actions workflow to build and sign macOS binary executables, https://github.com/simonw/datasette/issues/1171#issuecomment-754287882,https://api.github.com/repos/simonw/datasette/issues/1171,754287882,MDEyOklzc3VlQ29tbWVudDc1NDI4Nzg4Mg==,9599,simonw,2021-01-04T23:40:10Z,2021-01-04T23:42:32Z,OWNER,"This looks VERY useful: https://github.com/mitchellh/gon - "" Sign, notarize, and package macOS CLI tools and applications written in any language. Available as both a CLI and a Go library."" And it installs like this: brew install mitchellh/gon/gon","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",778450486,GitHub Actions workflow to build and sign macOS binary executables, https://github.com/simonw/datasette/issues/1171#issuecomment-754286783,https://api.github.com/repos/simonw/datasette/issues/1171,754286783,MDEyOklzc3VlQ29tbWVudDc1NDI4Njc4Mw==,9599,simonw,2021-01-04T23:38:18Z,2021-01-04T23:38:18Z,OWNER,Oh wow maybe I need to Notarize it too? https://developer.apple.com/documentation/xcode/notarizing_macos_software_before_distribution,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",778450486,GitHub Actions workflow to build and sign macOS binary executables, https://github.com/simonw/datasette/issues/1171#issuecomment-754286618,https://api.github.com/repos/simonw/datasette/issues/1171,754286618,MDEyOklzc3VlQ29tbWVudDc1NDI4NjYxOA==,9599,simonw,2021-01-04T23:37:45Z,2021-01-04T23:37:45Z,OWNER,https://github.com/actions/virtual-environments/issues/1820#issuecomment-719549887 looks useful - not sure if those notes are for iOS or macOS though.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",778450486,GitHub Actions workflow to build and sign macOS binary executables, https://github.com/simonw/datasette/issues/93#issuecomment-754285795,https://api.github.com/repos/simonw/datasette/issues/93,754285795,MDEyOklzc3VlQ29tbWVudDc1NDI4NTc5NQ==,9599,simonw,2021-01-04T23:35:13Z,2021-01-04T23:35:13Z,OWNER,Next step is to automate this all!,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273944952,Package as standalone binary, https://github.com/simonw/datasette/issues/1152#issuecomment-754285588,https://api.github.com/repos/simonw/datasette/issues/1152,754285588,MDEyOklzc3VlQ29tbWVudDc1NDI4NTU4OA==,9599,simonw,2021-01-04T23:34:30Z,2021-01-04T23:34:30Z,OWNER,"I think the way to do this is to have a new plugin hook that returns two SQL where clauses: one returning a list of resources that the user should be able to access (the allow-list) and one returning a list of resources they are explicitly forbidden from accessing (the deny-list). Either of these can be blank. Datasette can then combine those into a full SQL query and use it to answer the question ""show me a list of resources that the user is allowed to perform action X on"". 
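A sketch of how the allow-list and deny-list clauses described above might be combined into a single query - the `catalog` table and its columns are invented for illustration:

```python
def visible_resources_sql(allow_where=None, deny_where=None):
    sql = "select database_name, table_name from catalog"
    clauses = []
    if allow_where:
        clauses.append("({})".format(allow_where))
    if deny_where:
        clauses.append("not ({})".format(deny_where))
    if clauses:
        sql += " where " + " and ".join(clauses)
    return sql


# e.g. visible_resources_sql("database_name = 'fixtures'", "table_name like 'secret%'")
```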
It can also answer the existing question, ""is user X allowed to perform action Y on resource Z""?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",770598024,Efficiently calculate list of databases/tables a user can view, https://github.com/simonw/datasette/issues/93#issuecomment-754233960,https://api.github.com/repos/simonw/datasette/issues/93,754233960,MDEyOklzc3VlQ29tbWVudDc1NDIzMzk2MA==,9599,simonw,2021-01-04T21:35:37Z,2021-01-04T21:35:37Z,OWNER,"I tested it by running a `tmate` session on the GitHub macOS machines, and it worked! ``` Mac-1609795972770:tmp runner$ wget 'https://github.com/simonw/datasette/releases/download/0.53/datasette-0.53-macos-binary.zip' --2021-01-04 21:34:10-- https://github.com/simonw/datasette/releases/download/0.53/datasette-0.53-macos-binary.zip Resolving github.com (github.com)... 140.82.114.4 Connecting to github.com (github.com)|140.82.114.4|:443... connected. HTTP request sent, awaiting response... 302 Found Location: https://github-production-release-asset-2e65be.s3.amazonaws.com/107914493/74658700-4e90-11eb-8f3b-ee77e6dfad90?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=AKIAIWNJYAX4CSVEH53A%2F20210104%2Fus-east-1%2Fs3%2Faws4_request&X-Amz-Date=20210104T213414Z&X-Amz-Expires=300&X-Amz-Signature=6f3c54211077092553590b33a7c36cd052895c9d4619607ad1df094782f64acf&X-Amz-SignedHeaders=host&actor_id=0&key_id=0&repo_id=107914493&response-content-disposition=attachment%3B%20filename%3Ddatasette-0.53-macos-binary.zip&response-content-type=application%2Foctet-stream [following] --2021-01-04 21:34:14-- https://github-production-release-asset-2e65be.s3.amazonaws.com/107914493/74658700-4e90-11eb-8f3b-ee77e6dfad90?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=AKIAIWNJYAX4CSVEH53A%2F20210104%2Fus-east-1%2Fs3%2Faws4_request&X-Amz-Date=20210104T213414Z&X-Amz-Expires=300&X-Amz-Signature=6f3c54211077092553590b33a7c36cd052895c9d4619607ad1df094782f64acf&X-Amz-SignedHeaders=host&actor_id=0&key_id=0&repo_id=107914493&response-content-disposition=attachment%3B%20filename%3Ddatasette-0.53-macos-binary.zip&response-content-type=application%2Foctet-stream Resolving github-production-release-asset-2e65be.s3.amazonaws.com (github-production-release-asset-2e65be.s3.amazonaws.com)... 52.217.43.164 Connecting to github-production-release-asset-2e65be.s3.amazonaws.com (github-production-release-asset-2e65be.s3.amazonaws.com)|52.217.43.164|:443... connected. HTTP request sent, awaiting response... 200 OK Length: 8297283 (7.9M) [application/octet-stream] Saving to: ‘datasette-0.53-macos-binary.zip’ datasette-0.53-maco 100%[===================>] 7.91M --.-KB/s in 0.1s 2021-01-04 21:34:14 (73.4 MB/s) - ‘datasette-0.53-macos-binary.zip’ saved [8297283/8297283] Mac-1609795972770:tmp runner$ unzip datasette-0.53-macos-binary.zip Archive: datasette-0.53-macos-binary.zip creating: datasette-0.53-macos-binary/ inflating: datasette-0.53-macos-binary/datasette Mac-1609795972770:tmp runner$ datasette-0.53-macos-binary/datasette --help Usage: datasette [OPTIONS] COMMAND [ARGS]... Datasette! Options: --version Show the version and exit. --help Show this message and exit. Commands: serve* Serve up specified SQLite database files with a web UI inspect install Install Python packages - e.g. package Package specified SQLite files into a new datasette Docker... plugins List currently available plugins publish Publish specified SQLite database files to the internet along... uninstall Uninstall Python packages (e.g. 
Mac-1609795972770:tmp runner$ datasette-0.53-macos-binary/datasette --get /-/versions.json {""python"": {""version"": ""3.9.1"", ""full"": ""3.9.1 (default, Dec 10 2020, 10:36:35) \n[Clang 12.0.0 (clang-1200.0.32.27)]""}, ""datasette"": {""version"": ""0.53""}, ""asgi"": ""3.0"", ""uvicorn"": ""0.13.3"", ""sqlite"": {""version"": ""3.34.0"", ""fts_versions"": [""FTS5"", ""FTS4"", ""FTS3""], ""extensions"": {""json1"": null}, ""compile_options"": [""COMPILER=clang-12.0.0"", ""ENABLE_COLUMN_METADATA"", ""ENABLE_FTS3"", ""ENABLE_FTS3_PARENTHESIS"", ""ENABLE_FTS4"", ""ENABLE_FTS5"", ""ENABLE_GEOPOLY"", ""ENABLE_JSON1"", ""ENABLE_PREUPDATE_HOOK"", ""ENABLE_RTREE"", ""ENABLE_SESSION"", ""MAX_VARIABLE_NUMBER=250000"", ""THREADSAFE=1""]}} Mac-1609795972770:tmp runner$ ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273944952,Package as standalone binary, https://github.com/simonw/datasette/issues/93#issuecomment-754229977,https://api.github.com/repos/simonw/datasette/issues/93,754229977,MDEyOklzc3VlQ29tbWVudDc1NDIyOTk3Nw==,9599,simonw,2021-01-04T21:28:01Z,2021-01-04T21:28:01Z,OWNER,"As an experiment, I put the macOS one in a zip file and attached it to the latest release: ``` mkdir datasette-0.53-macos-binary cp dist/datasette datasette-0.53-macos-binary zip -r datasette-0.53-macos-binary.zip datasette-0.53-macos-binary ``` It's available here: https://github.com/simonw/datasette/releases/tag/0.53 - download URL is https://github.com/simonw/datasette/releases/download/0.53/datasette-0.53-macos-binary.zip","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273944952,Package as standalone binary, https://github.com/simonw/datasette/issues/93#issuecomment-754227543,https://api.github.com/repos/simonw/datasette/issues/93,754227543,MDEyOklzc3VlQ29tbWVudDc1NDIyNzU0Mw==,9599,simonw,2021-01-04T21:23:13Z,2021-01-04T21:23:13Z,OWNER,"``` (pyinstaller-venv) root@dogsheep:/tmp/pyinstaller-venv# dist/datasette --get /-/databases.json [{""name"": "":memory:"", ""path"": null, ""size"": 0, ""is_mutable"": true, ""is_memory"": true, ""hash"": null}] (pyinstaller-venv) root@dogsheep:/tmp/pyinstaller-venv# ls -lah dist/datasette -rwxr-xr-x 1 root root 8.9M Jan 4 21:05 dist/datasette ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273944952,Package as standalone binary, https://github.com/simonw/datasette/issues/93#issuecomment-754219002,https://api.github.com/repos/simonw/datasette/issues/93,754219002,MDEyOklzc3VlQ29tbWVudDc1NDIxOTAwMg==,9599,simonw,2021-01-04T21:06:49Z,2021-01-04T21:22:27Z,OWNER,"Works on Linux/Ubuntu too, except I had to do `export BASE=` on a separate line. 
I also did this: ``` apt-get install python3 python3-venv python3 -m venv pyinstaller-venv source pyinstaller-venv/bin/activate pip install wheel pip install datasette pyinstaller export DATASETTE_BASE=$(python -c 'import os; print(os.path.dirname(__import__(""datasette"").__file__))') pyinstaller -F \ --add-data ""$DATASETTE_BASE/templates:datasette/templates"" \ --add-data ""$DATASETTE_BASE/static:datasette/static"" \ --hidden-import datasette.publish \ --hidden-import datasette.publish.heroku \ --hidden-import datasette.publish.cloudrun \ --hidden-import datasette.facets \ --hidden-import datasette.sql_functions \ --hidden-import datasette.actor_auth_cookie \ --hidden-import datasette.default_permissions \ --hidden-import datasette.default_magic_parameters \ --hidden-import datasette.blob_renderer \ --hidden-import datasette.default_menu_links \ --hidden-import uvicorn \ --hidden-import uvicorn.logging \ --hidden-import uvicorn.loops \ --hidden-import uvicorn.loops.auto \ --hidden-import uvicorn.protocols \ --hidden-import uvicorn.protocols.http \ --hidden-import uvicorn.protocols.http.auto \ --hidden-import uvicorn.protocols.websockets \ --hidden-import uvicorn.protocols.websockets.auto \ --hidden-import uvicorn.lifespan \ --hidden-import uvicorn.lifespan.on \ $(which datasette) ``` ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273944952,Package as standalone binary, https://github.com/simonw/datasette/issues/93#issuecomment-754218545,https://api.github.com/repos/simonw/datasette/issues/93,754218545,MDEyOklzc3VlQ29tbWVudDc1NDIxODU0NQ==,9599,simonw,2021-01-04T21:05:57Z,2021-01-04T21:05:57Z,OWNER,That BASE= trick seems to work with `zsh` but not with `bash`.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273944952,Package as standalone binary, https://github.com/simonw/datasette/issues/93#issuecomment-754215392,https://api.github.com/repos/simonw/datasette/issues/93,754215392,MDEyOklzc3VlQ29tbWVudDc1NDIxNTM5Mg==,9599,simonw,2021-01-04T20:59:20Z,2021-01-04T21:03:14Z,OWNER,"Updated `pyinstaller` recipe - lots of hidden imports needed now: ``` pip install wheel pip install datasette pyinstaller BASE=$(python -c 'import os; print(os.path.dirname(__import__(""datasette"").__file__))') \ pyinstaller -F \ --add-data ""$BASE/templates:datasette/templates"" \ --add-data ""$BASE/static:datasette/static"" \ --hidden-import datasette.publish \ --hidden-import datasette.publish.heroku \ --hidden-import datasette.publish.cloudrun \ --hidden-import datasette.facets \ --hidden-import datasette.sql_functions \ --hidden-import datasette.actor_auth_cookie \ --hidden-import datasette.default_permissions \ --hidden-import datasette.default_magic_parameters \ --hidden-import datasette.blob_renderer \ --hidden-import datasette.default_menu_links \ --hidden-import uvicorn \ --hidden-import uvicorn.logging \ --hidden-import uvicorn.loops \ --hidden-import uvicorn.loops.auto \ --hidden-import uvicorn.protocols \ --hidden-import uvicorn.protocols.http \ --hidden-import uvicorn.protocols.http.auto \ --hidden-import uvicorn.protocols.websockets \ --hidden-import uvicorn.protocols.websockets.auto \ --hidden-import uvicorn.lifespan \ --hidden-import uvicorn.lifespan.on \ $(which datasette) ```","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273944952,Package as 
standalone binary, https://github.com/simonw/datasette/issues/93#issuecomment-754215793,https://api.github.com/repos/simonw/datasette/issues/93,754215793,MDEyOklzc3VlQ29tbWVudDc1NDIxNTc5Mw==,9599,simonw,2021-01-04T21:00:14Z,2021-01-04T21:00:14Z,OWNER,"``` (pyinstaller-datasette) pyinstaller-datasette % file dist/datasette dist/datasette: Mach-O 64-bit executable x86_64 (pyinstaller-datasette) pyinstaller-datasette % ls -lah dist/datasette -rwxr-xr-x 1 simon wheel 8.0M Jan 4 12:58 dist/datasette ``` I'm surprised it's only 8MB!","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273944952,Package as standalone binary, https://github.com/simonw/datasette/issues/983#issuecomment-754210356,https://api.github.com/repos/simonw/datasette/issues/983,754210356,MDEyOklzc3VlQ29tbWVudDc1NDIxMDM1Ng==,222245,carlmjohnson,2021-01-04T20:49:05Z,2021-01-04T20:49:05Z,NONE,"For reasons [I've written about elsewhere](https://blog.carlmjohnson.net/post/2020/time-to-kill-ie11/), I'm in favor of modules. It has several beneficial effects. One, old browsers just ignore it all together. Two, if you include the same plain script on the page more than once, it will be executed twice, but if you include the same module script on a page twice, it will only execute once. Three, you get a module local namespace, instead of having to use the global window namespace or a function private namespace. OTOH, if you are going to use an old style script, the code from before isn't ideal, because you wipe out your registry if the script it included more than once. Also you may as well use object methods and splat arguments. The event based architecture probably makes more sense though. Just make up some event names prefixed with `datasette:` and listen for them on the root. The only concern with that approach is it can sometimes be tricky to make sure your plugins are run after datasette has run. 
Maybe ```js function mycallback(){ // whatever } if (window.datasette) { window.datasette.init(mycallback); } else { document.addEventListener('datasette:init', mycallback); } ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",712260429,JavaScript plugin hooks mechanism similar to pluggy, https://github.com/simonw/datasette/issues/668#issuecomment-754194996,https://api.github.com/repos/simonw/datasette/issues/668,754194996,MDEyOklzc3VlQ29tbWVudDc1NDE5NDk5Ng==,9599,simonw,2021-01-04T20:18:39Z,2021-01-04T20:18:39Z,OWNER,I fixed this in #1115 - you can run `--load-extension=spatialite` now and it will look for the extension in common places.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",563347679,Make it easier to load SpatiaLite, https://github.com/simonw/datasette/issues/436#issuecomment-754193501,https://api.github.com/repos/simonw/datasette/issues/436,754193501,MDEyOklzc3VlQ29tbWVudDc1NDE5MzUwMQ==,9599,simonw,2021-01-04T20:15:41Z,2021-01-04T20:15:41Z,OWNER,"Sadly `publish.datasettes.com` was broken by changes to Zeit, and I don't think I'll be bringing it back.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",435819321,400 Error when trying to register new user via https://publish.datasettes.com/, https://github.com/simonw/datasette/issues/371#issuecomment-754192873,https://api.github.com/repos/simonw/datasette/issues/371,754192873,MDEyOklzc3VlQ29tbWVudDc1NDE5Mjg3Mw==,9599,simonw,2021-01-04T20:14:28Z,2021-01-04T20:14:28Z,OWNER,"Now that Digital Ocean has App Platform this is less necessary, especially since the documentation covers how to use App Platform here: https://docs.datasette.io/en/stable/deploying.html#deploying-using-buildpacks","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",377156339,datasette publish digitalocean plugin, https://github.com/simonw/datasette/issues/102#issuecomment-754192267,https://api.github.com/repos/simonw/datasette/issues/102,754192267,MDEyOklzc3VlQ29tbWVudDc1NDE5MjI2Nw==,9599,simonw,2021-01-04T20:13:19Z,2021-01-04T20:13:19Z,OWNER,"I'm more likely to do Lambda than Elastic Beanstalk, especially now the size limit for Lambdas has been increased as part of their support for Docker.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",274264175,datasette publish elasticbeanstalk, https://github.com/simonw/datasette/issues/221#issuecomment-754191699,https://api.github.com/repos/simonw/datasette/issues/221,754191699,MDEyOklzc3VlQ29tbWVudDc1NDE5MTY5OQ==,9599,simonw,2021-01-04T20:12:14Z,2021-01-04T20:12:14Z,OWNER,I'm going to close this. Plugins can register their own CLI tools (see https://github.com/simonw/click-app) if they need to.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",315142414,Allow plugins to add new cli sub commands , https://github.com/simonw/datasette/issues/221#issuecomment-754190952,https://api.github.com/repos/simonw/datasette/issues/221,754190952,MDEyOklzc3VlQ29tbWVudDc1NDE5MDk1Mg==,9599,simonw,2021-01-04T20:10:51Z,2021-01-04T20:10:51Z,OWNER,Is this still a good idea? 
I don't have any pressing need for it at the moment.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",315142414,Allow plugins to add new cli sub commands , https://github.com/simonw/datasette/issues/221#issuecomment-754190814,https://api.github.com/repos/simonw/datasette/issues/221,754190814,MDEyOklzc3VlQ29tbWVudDc1NDE5MDgxNA==,9599,simonw,2021-01-04T20:10:34Z,2021-01-04T20:10:34Z,OWNER,"For the `csvs-to-sqlite` case I'm going with `datasette insert` instead, see #1160.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",315142414,Allow plugins to add new cli sub commands , https://github.com/simonw/datasette/issues/18#issuecomment-754188383,https://api.github.com/repos/simonw/datasette/issues/18,754188383,MDEyOklzc3VlQ29tbWVudDc1NDE4ODM4Mw==,9599,simonw,2021-01-04T20:05:48Z,2021-01-04T20:05:48Z,OWNER,"I'm not using Sanic any more, but this is still very feasible. If I ever do it I'll write a plugin.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267739593,See if I can get a websockets interface working, https://github.com/simonw/datasette/issues/103#issuecomment-754188099,https://api.github.com/repos/simonw/datasette/issues/103,754188099,MDEyOklzc3VlQ29tbWVudDc1NDE4ODA5OQ==,9599,simonw,2021-01-04T20:05:14Z,2021-01-04T20:05:14Z,OWNER,"Wontfix, Cloud Run is already implemented and is a better fit for Datasette.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",274265878,datasette publish appengine, https://github.com/simonw/datasette/issues/913#issuecomment-754187520,https://api.github.com/repos/simonw/datasette/issues/913,754187520,MDEyOklzc3VlQ29tbWVudDc1NDE4NzUyMA==,9599,simonw,2021-01-04T20:04:10Z,2021-01-04T20:04:10Z,OWNER,That's pretty elegant: each plugin gets its own namespace and can register new settings.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",670209331,Mechanism for passing additional options to `datasette my.db` that affect plugins, https://github.com/simonw/datasette/issues/913#issuecomment-754187326,https://api.github.com/repos/simonw/datasette/issues/913,754187326,MDEyOklzc3VlQ29tbWVudDc1NDE4NzMyNg==,9599,simonw,2021-01-04T20:03:50Z,2021-01-04T20:03:50Z,OWNER,"I renamed `--config` to `--setting` and changed it to work like this: datasette --setting sql_time_limit_ms 1000 Note the lack of colons. This actually makes colons cleaner to use for plugins - I could support this: datasette --setting datasette-insert:unsafe 1","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",670209331,Mechanism for passing additional options to `datasette my.db` that affect plugins, https://github.com/simonw/datasette/issues/1111#issuecomment-754184287,https://api.github.com/repos/simonw/datasette/issues/1111,754184287,MDEyOklzc3VlQ29tbWVudDc1NDE4NDI4Nw==,9599,simonw,2021-01-04T19:57:53Z,2021-01-04T19:57:53Z,OWNER,Relevant new feature in sqlite-utils: the ability to use triggers to maintain fast counts. This optimization could help a lot here. 
https://github.com/simonw/sqlite-utils/issues/212,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",751195017,Accessing a database's `.json` is slow for very large SQLite files, https://github.com/simonw/datasette/issues/1164#issuecomment-754182058,https://api.github.com/repos/simonw/datasette/issues/1164,754182058,MDEyOklzc3VlQ29tbWVudDc1NDE4MjA1OA==,9599,simonw,2021-01-04T19:53:31Z,2021-01-04T19:53:31Z,OWNER,This will be helped by the new `package.json` added in #1170.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",776634318,Mechanism for minifying JavaScript that ships with Datasette, https://github.com/simonw/datasette/pull/1170#issuecomment-754181646,https://api.github.com/repos/simonw/datasette/issues/1170,754181646,MDEyOklzc3VlQ29tbWVudDc1NDE4MTY0Ng==,9599,simonw,2021-01-04T19:52:40Z,2021-01-04T19:52:40Z,OWNER,Thank you very much!,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",778126516,Install Prettier via package.json, https://github.com/simonw/datasette/issues/983#issuecomment-754181647,https://api.github.com/repos/simonw/datasette/issues/983,754181647,MDEyOklzc3VlQ29tbWVudDc1NDE4MTY0Nw==,11941245,jussiarpalahti,2021-01-04T19:52:40Z,2021-01-04T19:52:40Z,NONE,"I was thinking JavaScript plugins going with server side template extensions custom HTML. Attach my own widgets on there and listen for Datasette events to refresh when user interacts with main UI. Like a map view or table that updates according to selected column. There's certainly other ways to look at this. Perhaps you could list possible hooks or high level design doc on what would be possible with the plugin system? Re: modules. I would like to see modules supported at least in development. The developer experience is so much better than what JavaScript coding has been in the past. With large parts of NPM at your disposal I’d imagine even less experienced coder can whisk a custom plugin in no time. Proper production build system (like one you get with Pika or Parcel) could package everything up into bundles that older browsers can understand. Though that does come with performance and size penalties alongside the added complexity. 
","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",712260429,JavaScript plugin hooks mechanism similar to pluggy, https://github.com/simonw/datasette/issues/1169#issuecomment-754007242,https://api.github.com/repos/simonw/datasette/issues/1169,754007242,MDEyOklzc3VlQ29tbWVudDc1NDAwNzI0Mg==,3637,benpickles,2021-01-04T14:29:57Z,2021-01-04T14:29:57Z,CONTRIBUTOR,I somewhat share your reluctance to add a package.json to seemingly every project out there but ultimately if they're project dependencies it's important they're managed within the codebase.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",777677671,Prettier package not actually being cached, https://github.com/simonw/datasette/pull/1170#issuecomment-754004715,https://api.github.com/repos/simonw/datasette/issues/1170,754004715,MDEyOklzc3VlQ29tbWVudDc1NDAwNDcxNQ==,3637,benpickles,2021-01-04T14:25:44Z,2021-01-04T14:25:44Z,CONTRIBUTOR,I was going to re-add the filter to only run Prettier when there have been changes in `datasette/static` but that would mean it wouldn't run when the package is updated. That plus the fact that [the last run of the job took only 8 seconds](https://github.com/benpickles/datasette/runs/1640121514) is why I decided not to re-add the filter.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",778126516,Install Prettier via package.json, https://github.com/simonw/datasette/pull/1170#issuecomment-754002859,https://api.github.com/repos/simonw/datasette/issues/1170,754002859,MDEyOklzc3VlQ29tbWVudDc1NDAwMjg1OQ==,22429695,codecov[bot],2021-01-04T14:22:52Z,2021-01-04T14:22:52Z,NONE,"# [Codecov](https://codecov.io/gh/simonw/datasette/pull/1170?src=pr&el=h1) Report > Merging [#1170](https://codecov.io/gh/simonw/datasette/pull/1170?src=pr&el=desc) (a5761cc) into [main](https://codecov.io/gh/simonw/datasette/commit/1e8fa3ac7cb2d6e516c47c306c86ed2334fc3dc0?el=desc) (1e8fa3a) will **not change** coverage. > The diff coverage is `n/a`. [](https://codecov.io/gh/simonw/datasette/pull/1170?src=pr&el=tree) ```diff @@ Coverage Diff @@ ## main #1170 +/- ## ======================================= Coverage 91.55% 91.55% ======================================= Files 32 32 Lines 3932 3932 ======================================= Hits 3600 3600 Misses 332 332 ``` ------ [Continue to review full report at Codecov](https://codecov.io/gh/simonw/datasette/pull/1170?src=pr&el=continue). > **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta) > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data` > Powered by [Codecov](https://codecov.io/gh/simonw/datasette/pull/1170?src=pr&el=footer). Last update [1e8fa3a...a5761cc](https://codecov.io/gh/simonw/datasette/pull/1170?src=pr&el=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments). 
","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",778126516,Install Prettier via package.json, https://github.com/simonw/datasette/issues/983#issuecomment-753690280,https://api.github.com/repos/simonw/datasette/issues/983,753690280,MDEyOklzc3VlQ29tbWVudDc1MzY5MDI4MA==,9599,simonw,2021-01-03T23:13:30Z,2021-01-03T23:13:30Z,OWNER,"Oh that's interesting, I hadn't thought about plugins firing events - just responding to events fired by the rest of the application.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",712260429,JavaScript plugin hooks mechanism similar to pluggy, https://github.com/simonw/sqlite-utils/issues/219#issuecomment-753671902,https://api.github.com/repos/simonw/sqlite-utils/issues/219,753671902,MDEyOklzc3VlQ29tbWVudDc1MzY3MTkwMg==,9599,simonw,2021-01-03T20:31:04Z,2021-01-03T20:32:13Z,OWNER,A `table.has_count_triggers` property.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",777707544,reset_counts() method and command, https://github.com/simonw/sqlite-utils/issues/219#issuecomment-753671235,https://api.github.com/repos/simonw/sqlite-utils/issues/219,753671235,MDEyOklzc3VlQ29tbWVudDc1MzY3MTIzNQ==,9599,simonw,2021-01-03T20:25:10Z,2021-01-03T20:25:10Z,OWNER,"To detect tables, look at the names of the triggers - `{table}{counts_table}_insert` and `{table}{counts_table}_delete`.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",777707544,reset_counts() method and command, https://github.com/simonw/sqlite-utils/issues/219#issuecomment-753671009,https://api.github.com/repos/simonw/sqlite-utils/issues/219,753671009,MDEyOklzc3VlQ29tbWVudDc1MzY3MTAwOQ==,9599,simonw,2021-01-03T20:22:53Z,2021-01-03T20:22:53Z,OWNER,I think this should be accompanied by a `sqlite-utils reset-counts` command.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",777707544,reset_counts() method and command, https://github.com/simonw/sqlite-utils/issues/219#issuecomment-753670833,https://api.github.com/repos/simonw/sqlite-utils/issues/219,753670833,MDEyOklzc3VlQ29tbWVudDc1MzY3MDgzMw==,9599,simonw,2021-01-03T20:20:54Z,2021-01-03T20:20:54Z,OWNER,"This is a little tricky. We should assume that the existing values in the `_counts` table cannot be trusted at all when this method is called - so we should probably clear that table entirely and then re-populate it. 
But that means we need to figure out which tables in the database have the counts triggers defined.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",777707544,reset_counts() method and command, https://github.com/simonw/sqlite-utils/issues/215#issuecomment-753668099,https://api.github.com/repos/simonw/sqlite-utils/issues/215,753668099,MDEyOklzc3VlQ29tbWVudDc1MzY2ODA5OQ==,9599,simonw,2021-01-03T19:55:53Z,2021-01-03T19:55:53Z,OWNER,So if you instantiate the `Database()` constructor with `use_counts_table=True` any access to the `.count` properties will go through this table - otherwise regular `count(*)` queries will be executed.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",777535402,Use _counts to speed up counts, https://github.com/simonw/sqlite-utils/issues/215#issuecomment-753665521,https://api.github.com/repos/simonw/sqlite-utils/issues/215,753665521,MDEyOklzc3VlQ29tbWVudDc1MzY2NTUyMQ==,9599,simonw,2021-01-03T19:31:33Z,2021-01-03T19:31:33Z,OWNER,"I'm having second thoughts about this being the default behaviour. It's pretty weird. I feel like HUGE databases that need this are rare, so having it on by default doesn't make sense.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",777535402,Use _counts to speed up counts, https://github.com/simonw/sqlite-utils/issues/215#issuecomment-753662490,https://api.github.com/repos/simonw/sqlite-utils/issues/215,753662490,MDEyOklzc3VlQ29tbWVudDc1MzY2MjQ5MA==,9599,simonw,2021-01-03T19:05:53Z,2021-01-03T19:05:53Z,OWNER,Idea: a `.execute_count()` method that never uses the cache.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",777535402,Use _counts to speed up counts, https://github.com/simonw/sqlite-utils/issues/215#issuecomment-753661292,https://api.github.com/repos/simonw/sqlite-utils/issues/215,753661292,MDEyOklzc3VlQ29tbWVudDc1MzY2MTI5Mg==,9599,simonw,2021-01-03T18:56:06Z,2021-01-03T18:56:23Z,OWNER,"Another option: on creation of the `Database()` object, check to see if the `_counts` table exists and use that as the default for a `use_counts_table` property. Also flip that property to `True` if the user calls `.enable_counts()` at any time.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",777535402,Use _counts to speed up counts, https://github.com/simonw/sqlite-utils/issues/215#issuecomment-753661158,https://api.github.com/repos/simonw/sqlite-utils/issues/215,753661158,MDEyOklzc3VlQ29tbWVudDc1MzY2MTE1OA==,9599,simonw,2021-01-03T18:55:16Z,2021-01-03T18:55:16Z,OWNER,"Alternative implementation: provided `db.should_trust_counts` is `True`, try running the query: ```sql select count from _counts where [table] = ? 
``` If the query fails to return a result OR throws an error because the table doesn't exist, run the `count(*)` query.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",777535402,Use _counts to speed up counts, https://github.com/simonw/sqlite-utils/issues/215#issuecomment-753660814,https://api.github.com/repos/simonw/sqlite-utils/issues/215,753660814,MDEyOklzc3VlQ29tbWVudDc1MzY2MDgxNA==,9599,simonw,2021-01-03T18:53:05Z,2021-01-03T18:53:05Z,OWNER,"Here's the current `.count` property: https://github.com/simonw/sqlite-utils/blob/036ec6d32313487527c66dea613a3e7118b97459/sqlite_utils/db.py#L597-L609 It's implemented on `Queryable` which means it's available on both `Table` and `View` - the optimization doesn't make sense for views. I'm a bit cautious about making that property so much more complex. In order to decide if it should try the `_counts` table first it needs to know: - Should it be trusting the counts? I'm thinking a `.should_trust_counts` property on `Database` which defaults to `True` would be good - then advanced users can turn that off if they know the counts should not be trusted. - Does the `_counts` table exist? - Are the triggers defined? Then it can do the query, and if the query fails it can fall back on the `count(*)`. That's quite a lot of extra activity though.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",777535402,Use _counts to speed up counts, https://github.com/simonw/sqlite-utils/issues/215#issuecomment-753660379,https://api.github.com/repos/simonw/sqlite-utils/issues/215,753660379,MDEyOklzc3VlQ29tbWVudDc1MzY2MDM3OQ==,9599,simonw,2021-01-03T18:50:15Z,2021-01-03T18:50:15Z,OWNER,"```python def cached_counts(self, tables=None): sql = ""select [table], count from {}"".format(self._counts_table_name) if tables: sql += "" where [table] in ({})"".format("", "".join(""?"" for table in tables)) return {r[0]: r[1] for r in self.execute(sql, tables).fetchall()} ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",777535402,Use _counts to speed up counts, https://github.com/simonw/sqlite-utils/issues/206#issuecomment-753659260,https://api.github.com/repos/simonw/sqlite-utils/issues/206,753659260,MDEyOklzc3VlQ29tbWVudDc1MzY1OTI2MA==,9599,simonw,2021-01-03T18:42:01Z,2021-01-03T18:42:01Z,OWNER,"``` % sqlite-utils insert blah.db blah global_power_plant_database.csv Error: Invalid JSON - use --csv for CSV or --tsv for TSV files ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",761915790,sqlite-utils should suggest --csv if JSON parsing fails, https://github.com/simonw/datasette/issues/1169#issuecomment-753657180,https://api.github.com/repos/simonw/datasette/issues/1169,753657180,MDEyOklzc3VlQ29tbWVudDc1MzY1NzE4MA==,9599,simonw,2021-01-03T18:23:30Z,2021-01-03T18:23:30Z,OWNER,"Also welcome in that PR would be a bit of documentation for contributors, see #1167 - but no problem if you leave that out, I'm happy to add it later.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",777677671,Prettier package not actually being cached, 
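Going back to the cached-count fallback idea a few comments up: here is a minimal standalone sketch of that logic using plain `sqlite3`, with a hypothetical `trust_counts` flag standing in for the proposed `db.should_trust_counts` property:

```python
import sqlite3

def table_count(conn: sqlite3.Connection, table: str, trust_counts: bool = True) -> int:
    # Try the cached value first; fall back to a full count(*) if the
    # _counts table is missing or has no row for this table.
    if trust_counts:
        try:
            row = conn.execute(
                'SELECT count FROM _counts WHERE [table] = ?', (table,)
            ).fetchone()
            if row is not None:
                return row[0]
        except sqlite3.OperationalError:
            pass  # no _counts table in this database
    return conn.execute('SELECT count(*) FROM [{}]'.format(table)).fetchone()[0]
```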
https://github.com/simonw/datasette/issues/1169#issuecomment-753653260,https://api.github.com/repos/simonw/datasette/issues/1169,753653260,MDEyOklzc3VlQ29tbWVudDc1MzY1MzI2MA==,9599,simonw,2021-01-03T17:54:40Z,2021-01-03T17:54:40Z,OWNER,And @benpickles yes I would land that pull request straight away as-is. Thanks!,"{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",777677671,Prettier package not actually being cached, https://github.com/simonw/datasette/issues/1169#issuecomment-753653033,https://api.github.com/repos/simonw/datasette/issues/1169,753653033,MDEyOklzc3VlQ29tbWVudDc1MzY1MzAzMw==,9599,simonw,2021-01-03T17:52:53Z,2021-01-03T17:52:53Z,OWNER,"Oh that's so frustrating! I was worried about that - I spotted a few runs that seemed faster and hoped that it meant that the package was coming out of the `~/.npm` cache, but evidently that's not the case. You've convinced me that Datasette itself should have a `package.json` - the Dependabot argument is a really good one. But... I'd really love to figure out a general pattern for using `npx` scripts in GitHub Actions workflows in a cache-friendly way. I have plenty of other projects that I'd love to run Prettier or Uglify or `puppeteer-cli` in without adding a `package.json` to them. Any ideas? The best I can think of is for the workflow itself to write out a `package.json` file (using `echo '{ ... }' > package.json`) as part of the run - that way the cache should work (I think) but I don't get a misleading `package.json` file sitting in the repo.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",777677671,Prettier package not actually being cached, https://github.com/simonw/datasette/issues/983#issuecomment-753600999,https://api.github.com/repos/simonw/datasette/issues/983,753600999,MDEyOklzc3VlQ29tbWVudDc1MzYwMDk5OQ==,475613,MarkusH,2021-01-03T11:11:21Z,2021-01-03T11:11:21Z,NONE,"With regards to JS/Browser events, given your example of menu items that plugins could add, I could imagine this code to work: ```js // as part of datasette datasette.events.AddMenuItem = 'DatasetteAddMenuItemEvent'; document.addEventListener(datasette.events.AddMenuItem, (e) => { // do whatever is needed to add the menu item. Data comes from `e` alert(e.title + ' ' + e.link); }); // as part of a plugin const event = new Event(datasette.events.AddMenuItem, {link: '/foo/bar', title: 'Go somewhere'}); Document.dispatchEvent(event) ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",712260429,JavaScript plugin hooks mechanism similar to pluggy, https://github.com/simonw/datasette/issues/983#issuecomment-753587963,https://api.github.com/repos/simonw/datasette/issues/983,753587963,MDEyOklzc3VlQ29tbWVudDc1MzU4Nzk2Mw==,154364,dracos,2021-01-03T09:02:50Z,2021-01-03T10:00:05Z,NONE,"> but I'm already commited to requiring support for () => {} arrow functions Don't think you are :) (e.g. gzipped, using arrow functions in my example saves 2 bytes over spelling out function). 
On FMS, past month, looking at popular browsers, looks like we'd have 95.41% arrow support, 94.19% module support, and 4.58% (mostly IE9/IE11/Safari 9) supporting neither.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",712260429,JavaScript plugin hooks mechanism similar to pluggy, https://github.com/simonw/datasette/issues/983#issuecomment-753570710,https://api.github.com/repos/simonw/datasette/issues/983,753570710,MDEyOklzc3VlQ29tbWVudDc1MzU3MDcxMA==,9599,simonw,2021-01-03T05:29:56Z,2021-01-03T05:29:56Z,OWNER,"I thought about using browser events, but they don't quite match the API that I'm looking to provide. In particular, the great thing about Pluggy is that if you have multiple handlers registered for a specific plugin hook each of those handlers can return a value, and Pluggy will combine those values into a list of replies. This is great for things like plugin hooks that add extra menu items - each plugin can return a menu item (maybe as a label/URL/click-callback object) and the calling code can then add all of those items to the menu. See https://docs.datasette.io/en/stable/plugin_hooks.html#table-actions-datasette-actor-database-table for a Python example. I'm on the fence about relying on JavaScript modules. I need to think about browser compatibility for them - but I'm already commited to requiring support for `() => {}` arrow functions so maybe I'm committed to module support too already?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",712260429,JavaScript plugin hooks mechanism similar to pluggy, https://github.com/simonw/datasette/issues/1160#issuecomment-753568428,https://api.github.com/repos/simonw/datasette/issues/1160,753568428,MDEyOklzc3VlQ29tbWVudDc1MzU2ODQyOA==,9599,simonw,2021-01-03T05:02:32Z,2021-01-03T05:02:32Z,OWNER,"Should this command include a `--fts` option for configuring full-text search on one-or-more columns? I thought about doing that for `sqlite-utils insert` in https://github.com/simonw/sqlite-utils/issues/202 and decided not to because of the need to include extra options covering the FTS version, porter stemming options and whether or not to create triggers. But maybe I can set sensible defaults for that with `datasette insert ... -f title -f body`? Worth thinking about a bit more.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",775666296,"""datasette insert"" command and plugin hook", https://github.com/simonw/sqlite-utils/issues/202#issuecomment-753568264,https://api.github.com/repos/simonw/sqlite-utils/issues/202,753568264,MDEyOklzc3VlQ29tbWVudDc1MzU2ODI2NA==,9599,simonw,2021-01-03T05:00:24Z,2021-01-03T05:00:24Z,OWNER,"I'm not going to implement this, because it actually needs several additional options that already exist on `sqlite-utils enable-fts`: ``` --fts4 Use FTS4 --fts5 Use FTS5 --tokenize TEXT Tokenizer to use, e.g. porter --create-triggers Create triggers to update the FTS tables when the parent table changes. 
``` I'd rather not add all four of those options to `sqlite-utils insert` just to support this shortcut.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",738514367,sqlite-utils insert -f colname - for configuring full-text search, https://github.com/simonw/sqlite-utils/issues/202#issuecomment-753567969,https://api.github.com/repos/simonw/sqlite-utils/issues/202,753567969,MDEyOklzc3VlQ29tbWVudDc1MzU2Nzk2OQ==,9599,simonw,2021-01-03T04:55:17Z,2021-01-03T04:55:43Z,OWNER,"The long version of this can be `--fts`, same as in `csvs-to-sqlite`.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",738514367,sqlite-utils insert -f colname - for configuring full-text search, https://github.com/simonw/sqlite-utils/pull/203#issuecomment-753567932,https://api.github.com/repos/simonw/sqlite-utils/issues/203,753567932,MDEyOklzc3VlQ29tbWVudDc1MzU2NzkzMg==,9599,simonw,2021-01-03T04:54:43Z,2021-01-03T04:54:43Z,OWNER,"Another option: expand the `ForeignKey` object to have `.columns` and `.other_columns` properties in addition to the existing `.column` and `.other_column` properties. These new plural properties would always return a tuple, which would be a one-item tuple for a non-compound-foreign-key. The question then is what should `.column` and `.other_column` return for compound foreign keys? I'd be inclined to say they should return `None` - which would trigger errors in code that encounters a compound foreign key for the first time, but those errors would at least be a strong indicator as to what had gone wrong. We can label `.column` and `.other_column` as deprecated and then remove them in `sqlite-utils 4.0`. Since this would still be a breaking change in some minor edge-cases I'm thinking maybe 4.0 needs to happen in order to land this feature. I'm not opposed to doing that, I was just hoping it might be avoidable.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",743384829,changes to allow for compound foreign keys, https://github.com/simonw/sqlite-utils/pull/203#issuecomment-753567744,https://api.github.com/repos/simonw/sqlite-utils/issues/203,753567744,MDEyOklzc3VlQ29tbWVudDc1MzU2Nzc0NA==,9599,simonw,2021-01-03T04:51:44Z,2021-01-03T04:51:44Z,OWNER,"One way that this could avoid a breaking change would be to have `fk.column` and `fk.other_column` remain as strings for non-compound-foreign-keys, but turn into tuples for a compound foreign key. This is a bit of an ugly API design, and it could still break existing code that encounters a compound foreign key for the first time - but it would leave code working for the more common case of a non-compound-foreign-key.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",743384829,changes to allow for compound foreign keys, https://github.com/simonw/sqlite-utils/pull/203#issuecomment-753567508,https://api.github.com/repos/simonw/sqlite-utils/issues/203,753567508,MDEyOklzc3VlQ29tbWVudDc1MzU2NzUwOA==,9599,simonw,2021-01-03T04:48:17Z,2021-01-03T04:48:17Z,OWNER,"Sorry for taking so long to review this! This approach looks great to me - being able to optionally pass a tuple anywhere the API currently expects a column is smart, and it's consistent with how the `pk=` parameter works elsewhere. 
There's just one problem I can see with this: the way it changes the `ForeignKey(...)` interface to always return a tuple for `.column` and `.other_column`, even if that tuple only contains a single item. This represents a breaking change to the existing API - any code that expects `ForeignKey.column` to be a single string (which is any code that has been written against that) will break. As such, I'd have to bump the major version of `sqlite-utils` to `4.0` in order to ship this. Ideally I'd like to make this change in a way that doesn't represent an API compatibility break. I need to think a bit harder about how that might be achieved.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",743384829,changes to allow for compound foreign keys, https://github.com/simonw/sqlite-utils/issues/217#issuecomment-753566184,https://api.github.com/repos/simonw/sqlite-utils/issues/217,753566184,MDEyOklzc3VlQ29tbWVudDc1MzU2NjE4NA==,9599,simonw,2021-01-03T04:27:38Z,2021-01-03T04:27:38Z,OWNER,Documented here: https://sqlite-utils.datasette.io/en/latest/python-api.html#quoting-strings-for-use-in-sql,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",777543336,Rename .escape() to .quote(), https://github.com/simonw/sqlite-utils/issues/216#issuecomment-753566156,https://api.github.com/repos/simonw/sqlite-utils/issues/216,753566156,MDEyOklzc3VlQ29tbWVudDc1MzU2NjE1Ng==,9599,simonw,2021-01-03T04:27:14Z,2021-01-03T04:27:14Z,OWNER,Documented here: https://sqlite-utils.datasette.io/en/latest/python-api.html#introspection,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",777540352,database.triggers_dict introspection property, https://github.com/simonw/sqlite-utils/issues/218#issuecomment-753563757,https://api.github.com/repos/simonw/sqlite-utils/issues/218,753563757,MDEyOklzc3VlQ29tbWVudDc1MzU2Mzc1Nw==,9599,simonw,2021-01-03T03:49:51Z,2021-01-03T03:49:51Z,OWNER,Documentation: https://sqlite-utils.datasette.io/en/latest/cli.html#listing-triggers,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",777560474,"""sqlite-utils triggers"" command", https://github.com/simonw/sqlite-utils/issues/215#issuecomment-753545757,https://api.github.com/repos/simonw/sqlite-utils/issues/215,753545757,MDEyOklzc3VlQ29tbWVudDc1MzU0NTc1Nw==,9599,simonw,2021-01-02T23:58:07Z,2021-01-02T23:58:07Z,OWNER,"Thought: maybe there should be a `.reset_counts()` method too, for if the table gets out of date with the triggers. One way that could happen is if a table is dropped and recreated - the counts in the `_counts` table would likely no longer match the number of rows in that table.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",777535402,Use _counts to speed up counts, https://github.com/simonw/sqlite-utils/issues/215#issuecomment-753545381,https://api.github.com/repos/simonw/sqlite-utils/issues/215,753545381,MDEyOklzc3VlQ29tbWVudDc1MzU0NTM4MQ==,9599,simonw,2021-01-02T23:52:52Z,2021-01-02T23:52:52Z,OWNER,Idea: a `db.cached_counts()` method that returns a dictionary of data from the `_counts` table. 
Call it with a list of tables to get back the counts for just those tables.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",777535402,Use _counts to speed up counts, https://github.com/simonw/sqlite-utils/issues/217#issuecomment-753544914,https://api.github.com/repos/simonw/sqlite-utils/issues/217,753544914,MDEyOklzc3VlQ29tbWVudDc1MzU0NDkxNA==,9599,simonw,2021-01-02T23:47:42Z,2021-01-02T23:47:42Z,OWNER,https://github.com/simonw/sqlite-utils/blob/9a5c92b63e7917c93cc502478493c51c781b2ecc/sqlite_utils/db.py#L231-L239,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",777543336,Rename .escape() to .quote(), https://github.com/simonw/sqlite-utils/issues/213#issuecomment-753535488,https://api.github.com/repos/simonw/sqlite-utils/issues/213,753535488,MDEyOklzc3VlQ29tbWVudDc1MzUzNTQ4OA==,9599,simonw,2021-01-02T22:03:48Z,2021-01-02T22:03:48Z,OWNER,"I got this error while prototyping this: too many levels of trigger recursion It looks like that's because SQLite doesn't like triggers on a table that themselves then update that table - so I'm going to exclude the `_counts` table from this mechanism.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",777529979,db.enable_counts() method, https://github.com/simonw/sqlite-utils/issues/213#issuecomment-753533775,https://api.github.com/repos/simonw/sqlite-utils/issues/213,753533775,MDEyOklzc3VlQ29tbWVudDc1MzUzMzc3NQ==,9599,simonw,2021-01-02T21:47:10Z,2021-01-02T21:47:10Z,OWNER,"I'm going to skip virtual tables, which I can identify using this property: https://github.com/simonw/sqlite-utils/blob/1cad7fad3e7a5b734088f5cc545b69a055e636da/sqlite_utils/db.py#L720-L726","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",777529979,db.enable_counts() method, https://github.com/simonw/datasette/issues/1012#issuecomment-753531657,https://api.github.com/repos/simonw/datasette/issues/1012,753531657,MDEyOklzc3VlQ29tbWVudDc1MzUzMTY1Nw==,45380,bollwyvl,2021-01-02T21:25:36Z,2021-01-02T21:25:36Z,CONTRIBUTOR,"Actually, on more research, I found out this is handled by the [trove-classifiers package](https://github.com/pypa/trove-classifiers/blob/master/src/trove_classifiers/__init__.py#L2) now, so it's just a one-liner pr instead of fire-up-a-docker-container-and-do-some-migrations","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",718540751,For 1.0 update trove classifier in setup.py, https://github.com/simonw/datasette/issues/1168#issuecomment-753524779,https://api.github.com/repos/simonw/datasette/issues/1168,753524779,MDEyOklzc3VlQ29tbWVudDc1MzUyNDc3OQ==,9599,simonw,2021-01-02T20:19:26Z,2021-01-02T20:19:26Z,OWNER,Idea: version the metadata scheme. 
If the table is called `_metadata_v1` it gives me a clear path to designing a new scheme in the future.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",777333388,Mechanism for storing metadata in _metadata tables, https://github.com/simonw/sqlite-utils/issues/212#issuecomment-753422324,https://api.github.com/repos/simonw/sqlite-utils/issues/212,753422324,MDEyOklzc3VlQ29tbWVudDc1MzQyMjMyNA==,9599,simonw,2021-01-02T03:00:34Z,2021-01-02T03:00:34Z,OWNER,"Here's a prototype: ```python with db.conn: db.conn.executescript("""""" CREATE TABLE IF NOT EXISTS [_counts] ([table] TEXT PRIMARY KEY, [count] INTEGER DEFAULT 0); CREATE TRIGGER IF NOT EXISTS [Street_Tree_List_counts_ai] AFTER INSERT ON [Street_Tree_List] BEGIN INSERT OR REPLACE INTO _counts VALUES ('Street_Tree_List', COALESCE( (SELECT count FROM _counts WHERE [table]='Street_Tree_List'), 0) + 1); END; CREATE TRIGGER IF NOT EXISTS [Street_Tree_List_counts_ad] AFTER DELETE ON [Street_Tree_List] BEGIN INSERT OR REPLACE INTO _counts VALUES ('Street_Tree_List', COALESCE( (SELECT count FROM _counts WHERE [table]='Street_Tree_List'), 0) - 1); END; INSERT OR REPLACE INTO _counts VALUES ('Street_Tree_List', (select count(*) from [Street_Tree_List])); """""") ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",777392020,Mechanism for maintaining cache of table counts using triggers, https://github.com/simonw/sqlite-utils/issues/210#issuecomment-753406744,https://api.github.com/repos/simonw/sqlite-utils/issues/210,753406744,MDEyOklzc3VlQ29tbWVudDc1MzQwNjc0NA==,9599,simonw,2021-01-02T00:02:39Z,2021-01-02T00:02:39Z,OWNER,"It looks like https://github.com/ofajardo/pyreadr is a good library for this. I won't add this to `sqlite-utils` because it's quite a bulky dependency for a relatively small feature. Normally I'd write a `rdata-to-sqlite` tool similar to https://pypi.org/project/dbf-to-sqlite/ - but I'm actually working on a new plugin hook for Datasette that might be an even better fit for this. The idea is to allow Datasette plugins to define input formats - such as RData - which would then result in being able to import them on the command-line with `datasette insert my.db file.rdata` or by uploading a file through the Datasette web interface. That work is happening over here: https://github.com/simonw/datasette/issues/1160 - I'll close this issue in favour of a sometime-in-the-future `datasette-import-rdata` plugin.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",767685961,Support of RData files, https://github.com/simonw/sqlite-utils/issues/209#issuecomment-753405835,https://api.github.com/repos/simonw/sqlite-utils/issues/209,753405835,MDEyOklzc3VlQ29tbWVudDc1MzQwNTgzNQ==,9599,simonw,2021-01-01T23:52:06Z,2021-01-01T23:52:06Z,OWNER,I just hit this one too. 
Such a weird bug!,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",766156875,Test failure with sqlite 3.34 in test_cli.py::test_optimize, https://github.com/simonw/datasette/issues/1168#issuecomment-753402423,https://api.github.com/repos/simonw/datasette/issues/1168,753402423,MDEyOklzc3VlQ29tbWVudDc1MzQwMjQyMw==,9599,simonw,2021-01-01T23:16:05Z,2021-01-01T23:16:05Z,OWNER,"One catch: solving the ""show me all metadata for everything in this Datasette instance"" problem. Ideally there would be a SQLite table that can be queried for this. But the need to resolve the potentially complex set of precedence rules means that table would be difficult if not impossible to provide at run-time. Ideally a denormalized table would be available that featured the results of running those precedence rule calculations. But how to handle keeping this up-to-date? It would need to be recalculated any time a `_metadata` table in any of the attached databases had an update. This is a much larger problem - but one potential fix would be to use triggers to maintain a ""version number"" for the `_metadata` table - similar to SQLite's own built-in `schema_version` mechanism. Triggers could increment a counter any time a record in that table was added, deleted or updated. Such a mechanism would have applications outside of just this `_metadata` system. The ability to attach a version number to any table and have it automatically incremented when that table changes (via triggers) could help with all kinds of other Datasette-at-scale problems, including things like cached table counts.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",777333388,Mechanism for storing metadata in _metadata tables, https://github.com/simonw/datasette/issues/1168#issuecomment-753401001,https://api.github.com/repos/simonw/datasette/issues/1168,753401001,MDEyOklzc3VlQ29tbWVudDc1MzQwMTAwMQ==,9599,simonw,2021-01-01T23:01:45Z,2021-01-01T23:01:45Z,OWNER,I need to prototype this. Could I do that as a plugin? I think so - I could try out the algorithm for loading metadata and display it on pages using some custom templates.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",777333388,Mechanism for storing metadata in _metadata tables, https://github.com/simonw/datasette/issues/1168#issuecomment-753400420,https://api.github.com/repos/simonw/datasette/issues/1168,753400420,MDEyOklzc3VlQ29tbWVudDc1MzQwMDQyMA==,9599,simonw,2021-01-01T22:53:58Z,2021-01-01T22:53:58Z,OWNER,"Precedence idea: - First priority is non-_internal metadata from other databases - if those conflict then pick then the alphabetically-ordered-first database name wins - Next priority: `_internal` metadata, which should have been loaded from `metadata.json` - Last priority: the `_metadata` table from that database itself, i.e. 
the default ""baked in"" metadata","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",777333388,Mechanism for storing metadata in _metadata tables, https://github.com/simonw/datasette/issues/1168#issuecomment-753400306,https://api.github.com/repos/simonw/datasette/issues/1168,753400306,MDEyOklzc3VlQ29tbWVudDc1MzQwMDMwNg==,9599,simonw,2021-01-01T22:52:44Z,2021-01-01T22:52:44Z,OWNER,"Also: probably load column metadata as part of the table metadata rather than loading column metadata individually, since it's going to be rare to want the metadata for a single column rather than for an entire table full of columns.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",777333388,Mechanism for storing metadata in _metadata tables, https://github.com/simonw/datasette/issues/1168#issuecomment-753400265,https://api.github.com/repos/simonw/datasette/issues/1168,753400265,MDEyOklzc3VlQ29tbWVudDc1MzQwMDI2NQ==,9599,simonw,2021-01-01T22:52:09Z,2021-01-01T22:52:09Z,OWNER,"From an implementation perspective, I think the way this works is SQL queries read the relevant metadata from ALL available metadata tables, then Python code solves the precedence rules to produce the final, combined metadata for a database/table/column.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",777333388,Mechanism for storing metadata in _metadata tables, https://github.com/simonw/datasette/issues/1168#issuecomment-753399635,https://api.github.com/repos/simonw/datasette/issues/1168,753399635,MDEyOklzc3VlQ29tbWVudDc1MzM5OTYzNQ==,9599,simonw,2021-01-01T22:45:21Z,2021-01-01T22:50:21Z,OWNER,"Would also need to figure out the precedence rules: - What happens if the database has a `_metadata` table with data that conflicts with a remote metadata record from another database? I think the other database should win, because that allows plugins to over-ride the default metadata for something. - Do JSON values get merged together? So if one table provides a description and another provides a title do both values get returned? - If a database has a `license`, does that ""cascade"" down to the tables? What about `source` and `about`? - What if there are two databases (or more) that provide conflicting metadata for a table in some other database? Also, `_internal` may have loaded data from `metadata.json` that conflicts with some other remote table metadata definition.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",777333388,Mechanism for storing metadata in _metadata tables, https://github.com/simonw/datasette/issues/1168#issuecomment-753399428,https://api.github.com/repos/simonw/datasette/issues/1168,753399428,MDEyOklzc3VlQ29tbWVudDc1MzM5OTQyOA==,9599,simonw,2021-01-01T22:43:14Z,2021-01-01T22:43:22Z,OWNER,"Could this use a compound primary key on `database, table, column`? 
Does that work with null values?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",777333388,Mechanism for storing metadata in _metadata tables, https://github.com/simonw/datasette/issues/1168#issuecomment-753399366,https://api.github.com/repos/simonw/datasette/issues/1168,753399366,MDEyOklzc3VlQ29tbWVudDc1MzM5OTM2Ng==,9599,simonw,2021-01-01T22:42:37Z,2021-01-01T22:42:37Z,OWNER,"So what would the database schema for this look like? I'm leaning towards a single table called `_metadata`, because that's a neater fit for baking the metadata into the database file along with the data that it is describing. Alternatively I could have multiple tables sharing that prefix - `_metadata_database` and `_metadata_tables` and `_metadata_columns` perhaps. If it's just a single `_metadata` table, the schema could look like this: | database | table | column | metadata | | --- | --- | --- | --- | | | mytable | | {""title"": ""My Table"" } | | | mytable | mycolumn | {""description"": ""Column description"" } | | otherdb | othertable | | {""description"": ""Table in another DB"" } | If the `database` column is `null` it means ""this is describing a table in the same database file as this `_metadata` table"". The alternative to the `metadata` JSON column would be separate columns for each potential metadata value - `license`, `source`, `about`, `about_url` etc. But that makes it harder for people to create custom metadata fields.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",777333388,Mechanism for storing metadata in _metadata tables, https://github.com/simonw/datasette/issues/1168#issuecomment-753398542,https://api.github.com/repos/simonw/datasette/issues/1168,753398542,MDEyOklzc3VlQ29tbWVudDc1MzM5ODU0Mg==,9599,simonw,2021-01-01T22:37:24Z,2021-01-01T22:37:24Z,OWNER,"The direction I'm leaning in now is the following: - Metadata always lives in SQLite tables - These tables can be co-located with the database they describe (same DB file) - ... or they can be in a different DB file and reference the other database that they are describing - Metadata provided on startup in a `metadata.json` file is loaded into an in-memory metadata table using that same mechanism Plugins that want to provide metadata can do so by populating a table. They could even maintain their own in-memory database for this, or they could write to the `_internal` in-memory database, or they could write to a table in a database on disk.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",777333388,Mechanism for storing metadata in _metadata tables, https://github.com/simonw/datasette/issues/1168#issuecomment-753392102,https://api.github.com/repos/simonw/datasette/issues/1168,753392102,MDEyOklzc3VlQ29tbWVudDc1MzM5MjEwMg==,9599,simonw,2021-01-01T22:06:33Z,2021-01-01T22:06:33Z,OWNER,"Some SQLite databases include SQL comments in the schema definition which tell you what each column means: ```sql CREATE TABLE User -- A table comment ( uid INTEGER, -- A field comment flags INTEGER -- Another field comment ); ``` The problem with these is that they're not exposed to SQLite in any mechanism other than parsing the `CREATE TABLE` statement from the `sqlite_master` table to extract those columns. I had an idea to build a plugin that could return these. 
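A plugin (or a one-off script) could approximate that parsing with something like the sketch below - the function name is invented and the line-by-line parsing is only a heuristic, so treat it as an illustration rather than a robust SQL parser:

```python
import sqlite3

def column_comments(conn: sqlite3.Connection, table: str) -> dict:
    # Heuristic: pull the trailing -- comment text off each column definition
    # line in the CREATE TABLE statement stored in sqlite_master.
    row = conn.execute(
        'select sql from sqlite_master where type = ? and name = ?',
        ('table', table),
    ).fetchone()
    comments = {}
    if not row or not row[0]:
        return comments
    for line in row[0].splitlines():
        if '--' not in line or line.lstrip().lower().startswith('create table'):
            continue
        definition, _, comment = line.partition('--')
        words = definition.strip().split()
        if words:
            # First token of a column definition line is (usually) the column name
            comments[words[0].strip('[],`')] = comment.strip()
    return comments
```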
That would be easy with a ""get metadata for this column"" plugin hook - in the absence of one a plugin could still run that reads the schemas on startup and uses them to populate a metadata database table somewhere.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",777333388,Mechanism for storing metadata in _metadata tables, https://github.com/simonw/datasette/issues/1168#issuecomment-753391869,https://api.github.com/repos/simonw/datasette/issues/1168,753391869,MDEyOklzc3VlQ29tbWVudDc1MzM5MTg2OQ==,9599,simonw,2021-01-01T22:04:30Z,2021-01-01T22:04:30Z,OWNER,"The sticking point here seems to be the plugin hook. Allowing plugins to over-ride the way the question ""give me the metadata for this database/table/column"" is answered makes the database-backed metadata mechanisms much more complicated to think about. What if plugins didn't get to over-ride metadata in this way, but could instead update the metadata in a persistent Datasette-managed storage mechanism? Then maybe Datasette could do the following: - Maintain metadata in `_internal` that has been loaded from `metadata.json` - Know how to check a database for baked-in metadata (maybe in a `_metadata` table) - Know how to fall back on the `_internal` metadata if no baked-in metadata is available If database files were optionally allowed to store metadata about tables that live in another database file this could perhaps solve the plugin needs - since an ""edit metadata"" plugin would be able to edit records in a separate, dedicated `metadata.db` database to store new information about tables in other files.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",777333388,Mechanism for storing metadata in _metadata tables, https://github.com/simonw/datasette/issues/1168#issuecomment-753390791,https://api.github.com/repos/simonw/datasette/issues/1168,753390791,MDEyOklzc3VlQ29tbWVudDc1MzM5MDc5MQ==,9599,simonw,2021-01-01T22:00:42Z,2021-01-01T22:00:42Z,OWNER,"Here are the requirements I'm currently trying to satisfy: - It should be possible to query the metadata for ALL attached tables in one place, potentially with pagination and filtering - Metadata should be able to exist in the current `metadata.json` file - It should also be possible to bundle metadata in a table in the SQLite database files themselves - Plugins should be able to define their own special mechanisms for metadata. This is particularly interesting for providing a UI that allows users to edit the metadata for their existing tables.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",777333388,Mechanism for storing metadata in _metadata tables, https://github.com/simonw/datasette/issues/1168#issuecomment-753390262,https://api.github.com/repos/simonw/datasette/issues/1168,753390262,MDEyOklzc3VlQ29tbWVudDc1MzM5MDI2Mg==,9599,simonw,2021-01-01T21:58:11Z,2021-01-01T21:58:11Z,OWNER,"One possibility: plugins could write directly to that in-memory database table. But how would they know to write again should the server restart? Maybe they would write to it once when called by the `startup` plugin hook, and then update it (and their own backing store) when metadata changes for some reason. Feels a bit messy though. 
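To make that concrete, a plugin doing this might look roughly like the following - everything here (the table layout, the file name, the sample record) is invented for the sketch, and it writes to a throwaway on-disk file rather than the real in-memory database just to stay self-contained:

```python
import json
import sqlite3
from datasette import hookimpl

PLUGIN_METADATA = {
    ('fixtures', 'facetable', None): {'description': 'Written by my plugin'},
}

@hookimpl
def startup(datasette):
    # Repopulate the metadata table on every startup, so the records come
    # back even if the backing table was wiped by a server restart.
    conn = sqlite3.connect('plugin-metadata.db')
    with conn:
        conn.execute(
            'create table if not exists _metadata '
            '([database] text, [table] text, [column] text, metadata text)'
        )
        conn.execute('delete from _metadata')
        for (database, table, column), meta in PLUGIN_METADATA.items():
            conn.execute(
                'insert into _metadata values (?, ?, ?, ?)',
                (database, table, column, json.dumps(meta)),
            )
    conn.close()
```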
Also: if I want to support metadata optionally living in a `_metadata` table colocated with the data in a SQLite database file itself, how would that affect the `metadata` columns in `_internal`? How often would Datasette denormalize and copy data across from the on-disk `_metadata` tables to the `_internal` in-memory columns?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",777333388,Mechanism for storing metadata in _metadata tables, https://github.com/simonw/datasette/issues/1168#issuecomment-753389938,https://api.github.com/repos/simonw/datasette/issues/1168,753389938,MDEyOklzc3VlQ29tbWVudDc1MzM4OTkzOA==,9599,simonw,2021-01-01T21:54:15Z,2021-01-01T21:54:15Z,OWNER,"So what if the `databases`, `tables` and `columns` tables in `_internal` each grew a new `metadata` text column? These columns could be populated by Datasette on startup through reading the `metadata.json` file. But how would plugins interact with them?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",777333388,Mechanism for storing metadata in _metadata tables, https://github.com/simonw/datasette/issues/1168#issuecomment-753389477,https://api.github.com/repos/simonw/datasette/issues/1168,753389477,MDEyOklzc3VlQ29tbWVudDc1MzM4OTQ3Nw==,9599,simonw,2021-01-01T21:49:57Z,2021-01-01T21:49:57Z,OWNER,"What if metadata was stored in a JSON text column in the existing `_internal` tables? This would allow for users to invent additional metadata fields in the future beyond the current `license`, `license_url` etc fields - without needing a schema change. The downside of JSON columns generally is that they're harder to run indexed queries against. For metadata I don't think that matters - even with 10,000 tables each with their own metadata a SQL query asking for e.g. ""everything that has Apache 2 as the license"" would return in just a few ms.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",777333388,Mechanism for storing metadata in _metadata tables, https://github.com/simonw/datasette/issues/1168#issuecomment-753388809,https://api.github.com/repos/simonw/datasette/issues/1168,753388809,MDEyOklzc3VlQ29tbWVudDc1MzM4ODgwOQ==,9599,simonw,2021-01-01T21:47:51Z,2021-01-01T21:47:51Z,OWNER,"A database that exposes metadata will have the same restriction as the new `_internal` database that exposes columns and tables, in that it needs to take permissions into account. A user should not be able to view metadata for tables that they are not able to see. As such, I'd rather bundle any metadata tables into the existing `_internal` database so I don't have to solve that permissions problem in two places.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",777333388,Mechanism for storing metadata in _metadata tables, https://github.com/simonw/datasette/issues/1168#issuecomment-753366024,https://api.github.com/repos/simonw/datasette/issues/1168,753366024,MDEyOklzc3VlQ29tbWVudDc1MzM2NjAyNA==,9599,simonw,2021-01-01T18:48:34Z,2021-01-01T18:48:34Z,OWNER,Also: in #188 I proposed bundling metadata in the SQLite database itself alongside the data. This is a great way of ensuring metadata travels with the data when it is downloaded as a SQLite `.db` file. But how would that play with the idea of an in-memory `_metadata` table? 
Could that table perhaps offer views that join data across multiple attached physical databases?,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",777333388,Mechanism for storing metadata in _metadata tables, https://github.com/simonw/datasette/issues/983#issuecomment-753224999,https://api.github.com/repos/simonw/datasette/issues/983,753224999,MDEyOklzc3VlQ29tbWVudDc1MzIyNDk5OQ==,11941245,jussiarpalahti,2020-12-31T23:29:36Z,2020-12-31T23:29:36Z,NONE,"I have yet to build Datasette plugin and am unfamiliar with Pluggy. Since browsers have event handling builtin Datasette could communicate with plugins through it. Handlers register as listeners for custom Datasette events and Datasette's JS can then trigger said events. I was also wondering if you had looked at Javascript Modules for JS plugins? With services like Skypack (https://www.skypack.dev) NPM libraries can be loaded directly into browser, no build step needed. Same goes for local JS if you adhere to ES Module spec. If minification is required then tools such as Snowpack (https://www.snowpack.dev) could fit better. It uses https://github.com/evanw/esbuild for bundling and minification. On plugins you'd simply: ```javascript import {register} from '/assets/js/datasette' register.on({'click' : my_func}) ``` In Datasette HTML pages' head you'd merely import these files as modules one by one.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",712260429,JavaScript plugin hooks mechanism similar to pluggy, https://github.com/simonw/datasette/issues/1166#issuecomment-753224351,https://api.github.com/repos/simonw/datasette/issues/1166,753224351,MDEyOklzc3VlQ29tbWVudDc1MzIyNDM1MQ==,9599,simonw,2020-12-31T23:23:29Z,2020-12-31T23:23:29Z,OWNER,I should configure the action to only run if changes have been made within the `datasette/static` directory.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",777140799,Adopt Prettier for JavaScript code formatting, https://github.com/simonw/datasette/issues/983#issuecomment-753221646,https://api.github.com/repos/simonw/datasette/issues/983,753221646,MDEyOklzc3VlQ29tbWVudDc1MzIyMTY0Ng==,9599,simonw,2020-12-31T22:58:47Z,2020-12-31T22:58:47Z,OWNER,"https://github.com/mishoo/UglifyJS/issues/1905#issuecomment-300485490 says: > `sourceMappingURL` aren't added by default in `3.x` due to one of the feature requests not to - some users are putting them within HTTP response headers instead. 
> > So the command line for that would be: > > ```js > $ uglifyjs main.js -cmo main.min.js --source-map url=main.min.js.map > ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",712260429,JavaScript plugin hooks mechanism similar to pluggy, https://github.com/simonw/datasette/issues/1164#issuecomment-753221362,https://api.github.com/repos/simonw/datasette/issues/1164,753221362,MDEyOklzc3VlQ29tbWVudDc1MzIyMTM2Mg==,9599,simonw,2020-12-31T22:55:57Z,2020-12-31T22:55:57Z,OWNER,"I had to add this as the first line in `table.min.js` for the source mapping to work: ``` //# sourceMappingURL=/-/static/table.min.js.map ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",776634318,Mechanism for minifying JavaScript that ships with Datasette, https://github.com/simonw/datasette/issues/1164#issuecomment-753220665,https://api.github.com/repos/simonw/datasette/issues/1164,753220665,MDEyOklzc3VlQ29tbWVudDc1MzIyMDY2NQ==,9599,simonw,2020-12-31T22:49:36Z,2020-12-31T22:49:36Z,OWNER,"I started with a 7K `table.js` file. `npx uglifyjs table.js --source-map -o table.min.js` gave me a 5.6K `table.min.js` file. `npx uglifyjs table.js --source-map -o table.min.js --compress --mangle` gave me 4.5K.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",776634318,Mechanism for minifying JavaScript that ships with Datasette, https://github.com/simonw/datasette/issues/1164#issuecomment-753220412,https://api.github.com/repos/simonw/datasette/issues/1164,753220412,MDEyOklzc3VlQ29tbWVudDc1MzIyMDQxMg==,9599,simonw,2020-12-31T22:47:36Z,2020-12-31T22:47:36Z,OWNER,"I'm trying to minify `table.js` and I ran into a problem: Uglification failed. Unexpected character '`' It turns out `uglify-js` doesn't support ES6 syntax! But `uglify-es` does: npm install uglify-es Annoyingly it looks like `uglify-es` uses the same CLI command, `uglifyjs`. 
So after installing it this seemed to work: npx uglifyjs table.js --source-map -o table.min.js I really don't like how `npx uglifyjs` could mean different things depending on which package was installed.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",776634318,Mechanism for minifying JavaScript that ships with Datasette, https://github.com/simonw/datasette/issues/983#issuecomment-753219521,https://api.github.com/repos/simonw/datasette/issues/983,753219521,MDEyOklzc3VlQ29tbWVudDc1MzIxOTUyMQ==,9599,simonw,2020-12-31T22:39:52Z,2020-12-31T22:39:52Z,OWNER,For inlining the `plugins.min.js` file into the Jinja templates I could use the trick described here: https://stackoverflow.com/a/41404611 - which adds a `{{ include_file('file.txt') }}` function to Jinja.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",712260429,JavaScript plugin hooks mechanism similar to pluggy, https://github.com/simonw/datasette/issues/983#issuecomment-753219407,https://api.github.com/repos/simonw/datasette/issues/983,753219407,MDEyOklzc3VlQ29tbWVudDc1MzIxOTQwNw==,9599,simonw,2020-12-31T22:38:45Z,2020-12-31T22:39:10Z,OWNER,"You'll be able to add JavaScript plugins using a bunch of different mechanisms: - In a custom template, dropping the code in to a `<script>` block - A bookmarklet that injects an extra script (I'm really excited to try this out) - A separate `script.js` file that's loaded into Datasette using the `""extra_js_urls""` metadata option, documented here: https://docs.datasette.io/en/stable/custom_templates.html#custom-css-and-javascript - A plugin you can install, like `datasette-vega` or `datasette-cluster-map` - since plugins can bundle their own script files that then get loaded on pages via this hook: https://docs.datasette.io/en/stable/plugin_hooks.html#extra-js-urls-template-database-table-columns-view-name-request-datasette ","{""total_count"": 1, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 1, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",712260429,JavaScript plugin hooks mechanism similar to pluggy, https://github.com/simonw/datasette/issues/983#issuecomment-753218817,https://api.github.com/repos/simonw/datasette/issues/983,753218817,MDEyOklzc3VlQ29tbWVudDc1MzIxODgxNw==,173848,yozlet,2020-12-31T22:32:25Z,2020-12-31T22:32:25Z,NONE,"Amazing work! And you've put in far more work than I'd expect to reduce the payload (which is admirable). So, to add a plugin with the current design, it goes in (a) the template or (b) a bookmarklet, right?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",712260429,JavaScript plugin hooks mechanism similar to pluggy, https://github.com/simonw/datasette/issues/983#issuecomment-753217917,https://api.github.com/repos/simonw/datasette/issues/983,753217917,MDEyOklzc3VlQ29tbWVudDc1MzIxNzkxNw==,9599,simonw,2020-12-31T22:23:29Z,2020-12-31T22:23:36Z,OWNER,"If I'm going to do that, it would be good if subsequent plugins that register against the `load` event are executed straight away. That's a bit of a weird edge-case in plugin world - it would involve the bulkier code that gets loaded redefining how `datasette.plugins.register` works to special-case the `'load'` hook. 
Maybe the tiny bootstrap code could define a `datasette.plugins.onload(callbackFunction)` method which gets upgraded later into something that fires straight away? Would add more bytes though.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",712260429,JavaScript plugin hooks mechanism similar to pluggy, https://github.com/simonw/datasette/issues/983#issuecomment-753217714,https://api.github.com/repos/simonw/datasette/issues/983,753217714,MDEyOklzc3VlQ29tbWVudDc1MzIxNzcxNA==,9599,simonw,2020-12-31T22:21:33Z,2020-12-31T22:21:33Z,OWNER,"Eventually I'd like to provide a whole bunch of other `datasette.X` utility functions that plugins can use - things like `datasette.addTabbedContentPane()` or similar. But I don't want to inline those into the page. So... I think the basic plugin system remains inline - maybe from an inlined file called `plugins-bootstrap.js`. Then a separate `plugins.js` contains the rest of the API functionality. If a plugin wants to take advantage of those APIs, maybe it registers itself using `datasette.plugins.register('load', () => ...)` - that `load` hook can then be fired once the bulkier plugin code has been loaded.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",712260429,JavaScript plugin hooks mechanism similar to pluggy, https://github.com/simonw/datasette/issues/987#issuecomment-753217127,https://api.github.com/repos/simonw/datasette/issues/987,753217127,MDEyOklzc3VlQ29tbWVudDc1MzIxNzEyNw==,9599,simonw,2020-12-31T22:16:46Z,2020-12-31T22:16:46Z,OWNER,"I'm going to use `class=""plugin-content-pre-table""` rather than `id=` - just because I still want to be able to display all of this stuff on the single https://latest.datasette.io/-/patterns page so duplicate IDs are best avoided.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",712984738,Documented HTML hooks for JavaScript plugin authors, https://github.com/simonw/datasette/issues/983#issuecomment-753215761,https://api.github.com/repos/simonw/datasette/issues/983,753215761,MDEyOklzc3VlQ29tbWVudDc1MzIxNTc2MQ==,9599,simonw,2020-12-31T22:07:31Z,2020-12-31T22:07:31Z,OWNER,"I think I need to keep the mechanism whereby a plugin can return `undefined` in order to indicate that it has nothing to say for that specific item - that's borrowed from Pluggy and I've used it a bunch in my Python plugins. That makes the code a bit longer. I'll write some example plugins to help me decide if the filtering-out-of-undefined mechanism is needed or not.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",712260429,JavaScript plugin hooks mechanism similar to pluggy, https://github.com/simonw/datasette/issues/983#issuecomment-753215545,https://api.github.com/repos/simonw/datasette/issues/983,753215545,MDEyOklzc3VlQ29tbWVudDc1MzIxNTU0NQ==,9599,simonw,2020-12-31T22:05:41Z,2020-12-31T22:05:41Z,OWNER,Using object destructuring like that is a great idea. 
I'm going to play with your version - it's delightfully succinct.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",712260429,JavaScript plugin hooks mechanism similar to pluggy, https://github.com/simonw/datasette/issues/1166#issuecomment-753214664,https://api.github.com/repos/simonw/datasette/issues/1166,753214664,MDEyOklzc3VlQ29tbWVudDc1MzIxNDY2NA==,9599,simonw,2020-12-31T21:58:04Z,2020-12-31T21:58:04Z,OWNER,Wrote a TIL about this: https://til.simonwillison.net/github-actions/prettier-github-actions,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",777140799,Adopt Prettier for JavaScript code formatting, https://github.com/simonw/datasette/issues/1166#issuecomment-753211535,https://api.github.com/repos/simonw/datasette/issues/1166,753211535,MDEyOklzc3VlQ29tbWVudDc1MzIxMTUzNQ==,9599,simonw,2020-12-31T21:46:04Z,2020-12-31T21:46:04Z,OWNER,"https://github.com/simonw/datasette/runs/1631682372?check_suite_focus=true failed! <img width=""693"" alt=""Trying_out_bad_formatting__refs__1166_·_simonw_datasette_8087091"" src=""https://user-images.githubusercontent.com/9599/103426449-841e5c00-4b6e-11eb-95e6-26432a7b27dd.png""> ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",777140799,Adopt Prettier for JavaScript code formatting, https://github.com/simonw/datasette/issues/1166#issuecomment-753210536,https://api.github.com/repos/simonw/datasette/issues/1166,753210536,MDEyOklzc3VlQ29tbWVudDc1MzIxMDUzNg==,9599,simonw,2020-12-31T21:45:19Z,2020-12-31T21:45:19Z,OWNER,"Oops, committed that bad formatting test to `main` instead of a branch!","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",777140799,Adopt Prettier for JavaScript code formatting, https://github.com/simonw/datasette/issues/1166#issuecomment-753209192,https://api.github.com/repos/simonw/datasette/issues/1166,753209192,MDEyOklzc3VlQ29tbWVudDc1MzIwOTE5Mg==,9599,simonw,2020-12-31T21:44:22Z,2020-12-31T21:44:22Z,OWNER,"Tests passed in https://github.com/simonw/datasette/runs/1631677726?check_suite_focus=true I'm going to try submitting a pull request with badly formatted JavaScript to see if it gets caught.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",777140799,Adopt Prettier for JavaScript code formatting, https://github.com/simonw/datasette/issues/1166#issuecomment-753200580,https://api.github.com/repos/simonw/datasette/issues/1166,753200580,MDEyOklzc3VlQ29tbWVudDc1MzIwMDU4MA==,9599,simonw,2020-12-31T21:38:06Z,2020-12-31T21:38:06Z,OWNER,"I think this should work: ``` - uses: actions/cache@v2 with: path: ~/.npm key: ${{ runner.os }}-node-${{ hashFiles('**/prettier.yml') }} ``` I'll use the `prettier.yml` workflow that I'm about to create as the cache key for the NPM cache.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",777140799,Adopt Prettier for JavaScript code formatting, https://github.com/simonw/datasette/issues/1166#issuecomment-753197957,https://api.github.com/repos/simonw/datasette/issues/1166,753197957,MDEyOklzc3VlQ29tbWVudDc1MzE5Nzk1Nw==,9599,simonw,2020-12-31T21:36:14Z,2020-12-31T21:36:14Z,OWNER,"Maybe not that action actually - I 
wanted to use a pre-built action to avoid installing Prettier every time, but that's what it seems to do: https://github.com/creyD/prettier_action/blob/bb361e2979cff283ca7684908deac8f95400e779/entrypoint.sh#L28-L37","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",777140799,Adopt Prettier for JavaScript code formatting, https://github.com/simonw/datasette/issues/1166#issuecomment-753195905,https://api.github.com/repos/simonw/datasette/issues/1166,753195905,MDEyOklzc3VlQ29tbWVudDc1MzE5NTkwNQ==,9599,simonw,2020-12-31T21:34:46Z,2020-12-31T21:34:46Z,OWNER,This action looks good - tag 3.2 is equivalent to this commit hash: https://github.com/creyD/prettier_action/tree/bb361e2979cff283ca7684908deac8f95400e779,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",777140799,Adopt Prettier for JavaScript code formatting, https://github.com/simonw/datasette/issues/1166#issuecomment-753193475,https://api.github.com/repos/simonw/datasette/issues/1166,753193475,MDEyOklzc3VlQ29tbWVudDc1MzE5MzQ3NQ==,9599,simonw,2020-12-31T21:33:00Z,2020-12-31T21:33:00Z,OWNER,"I want a CI check that confirms that files conform to prettier - but only `datasette/static/*.js` files that are not already minified. This seems to do the job: npx prettier --check 'datasette/static/*[!.min].js' ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",777140799,Adopt Prettier for JavaScript code formatting, https://github.com/simonw/datasette/issues/1165#issuecomment-753033121,https://api.github.com/repos/simonw/datasette/issues/1165,753033121,MDEyOklzc3VlQ29tbWVudDc1MzAzMzEyMQ==,154364,dracos,2020-12-31T19:33:47Z,2020-12-31T19:33:47Z,NONE,"Sorry to go on about it, but it's my only example ;) And thought it might be of interest/use. Here is FixMyStreet's Cypress workflow https://github.com/mysociety/fixmystreet/blob/master/.github/workflows/cypress.yml with the master script that sets up server etc at https://github.com/mysociety/fixmystreet/blob/master/bin/browser-tests (that has features such as working inside/outside Vagrant, and can do JS code coverage) and then the tests are at https://github.com/mysociety/fixmystreet/tree/master/.cypress/cypress/integration","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",776635426,Mechanism for executing JavaScript unit tests, https://github.com/simonw/datasette/issues/983#issuecomment-752882797,https://api.github.com/repos/simonw/datasette/issues/983,752882797,MDEyOklzc3VlQ29tbWVudDc1Mjg4Mjc5Nw==,154364,dracos,2020-12-31T08:07:59Z,2020-12-31T15:04:32Z,NONE,"If you're using arrow functions, you can presumably use default parameters, not much difference in support. That would save you 9 bytes. But OTOH you need `""use strict"";` to use arrow functions etc, and that's 13 bytes. Your latest 250-byte one, with use strict, gzips to 199 bytes. 
The following might be 292 bytes, but compresses to 204, basically the same, and works in any browser (well, IE9+) at all: `var datasette=datasette||{};datasette.plugins=function(){var d={};return{register:function(b,c,e){d[b]||(d[b]=[]);d[b].push([c,e])},call:function(b,c){c=c||{};var e=[];(d[b]||[]).forEach(function(a){a=a[0].apply(a[0],a[1].map(function(a){return c[a]}));void 0!==a&&e.push(a)});return e}}}();` Source for that is below; I replaced the [fn,parameters] because closure-compiler includes a polyfill for that, and I ran `closure-compiler --language_out ECMASCRIPT3`: ```js var datasette = datasette || {}; datasette.plugins = (() => { var registry = {}; return { register: (hook, fn, parameters) => { if (!registry[hook]) { registry[hook] = []; } registry[hook].push([fn, parameters]); }, call: (hook, args) => { args = args || {}; var results = []; (registry[hook] || []).forEach((data) => { /* Call with the correct arguments */ var result = data[0].apply(data[0], data[1].map(parameter => args[parameter])); if (result !== undefined) { results.push(result); } }); return results; } }; })(); ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",712260429,JavaScript plugin hooks mechanism similar to pluggy, https://github.com/simonw/datasette/issues/983#issuecomment-752888552,https://api.github.com/repos/simonw/datasette/issues/983,752888552,MDEyOklzc3VlQ29tbWVudDc1Mjg4ODU1Mg==,154364,dracos,2020-12-31T08:33:11Z,2020-12-31T08:34:27Z,NONE,"If you could say that all hook functions had to accept one options parameter (and could use object destructuring if they wished to only see a subset), you could have this, which minifies (to all-browser-JS) to 200 bytes, gzips to 146, and works practically the same: ```js var datasette = datasette || {}; datasette.plugins = (() => { var registry = {}; return { register: (hook, fn) => { registry[hook] = registry[hook] || []; registry[hook].push(fn); }, call: (hook, args) => { var results = (registry[hook] || []).map(fn => fn(args||{})); return results; } }; })(); ``` `var datasette=datasette||{};datasette.plugins=function(){var b={};return{register:function(a,c){b[a]=b[a]||[];b[a].push(c)},call:function(a,c){return(b[a]||[]).map(function(a){return a(c||{})})}}}();` Called the same, definitions tiny bit different: ```js datasette.plugins.register('numbers', ({a, b}) => a + b) datasette.plugins.register('numbers', o => o.a * o.b) datasette.plugins.call('numbers', {a: 4, b: 6}) ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",712260429,JavaScript plugin hooks mechanism similar to pluggy, https://github.com/simonw/datasette/issues/1165#issuecomment-752846267,https://api.github.com/repos/simonw/datasette/issues/1165,752846267,MDEyOklzc3VlQ29tbWVudDc1Mjg0NjI2Nw==,9599,simonw,2020-12-31T05:10:41Z,2020-12-31T05:13:14Z,OWNER,"https://github.com/PostHog/posthog/tree/master/cypress/integration has some useful examples, linked from this article: https://posthog.com/blog/cypress-end-to-end-tests Also useful: their workflow https://github.com/PostHog/posthog/blob/master/.github/workflows/e2e.yml","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",776635426,Mechanism for executing JavaScript unit tests, 
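A minimal test sketch for the destructuring-based registry quoted in the comment above, assuming Node's built-in `assert` module as the test harness (Jest and Cypress are discussed elsewhere in the thread; the re-declared registry and the `numbers` hook follow the comment's own examples and are assumptions for illustration, not part of the original thread):

```javascript
// Sketch only: re-declares the small destructuring-based registry from the
// comment above, then checks register()/call() behaviour with Node's assert.
const assert = require("assert");

var datasette = datasette || {};
datasette.plugins = (() => {
  var registry = {};
  return {
    register: (hook, fn) => {
      registry[hook] = registry[hook] || [];
      registry[hook].push(fn);
    },
    call: (hook, args) => (registry[hook] || []).map((fn) => fn(args || {})),
  };
})();

// Usage mirrors the examples in the comment above.
datasette.plugins.register("numbers", ({ a, b }) => a + b);
datasette.plugins.register("numbers", (o) => o.a * o.b);

assert.deepStrictEqual(datasette.plugins.call("numbers", { a: 4, b: 6 }), [10, 24]);
assert.deepStrictEqual(datasette.plugins.call("unregistered-hook"), []);
console.log("plugin registry sketch: all assertions passed");
```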
https://github.com/simonw/datasette/issues/1165#issuecomment-752839433,https://api.github.com/repos/simonw/datasette/issues/1165,752839433,MDEyOklzc3VlQ29tbWVudDc1MjgzOTQzMw==,9599,simonw,2020-12-31T04:29:40Z,2020-12-31T04:29:40Z,OWNER,Important to absorb the slightly bizarre assertion syntax from Chai - docs here https://www.chaijs.com/api/bdd/,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",776635426,Mechanism for executing JavaScript unit tests, https://github.com/simonw/datasette/issues/1165#issuecomment-752828851,https://api.github.com/repos/simonw/datasette/issues/1165,752828851,MDEyOklzc3VlQ29tbWVudDc1MjgyODg1MQ==,9599,simonw,2020-12-31T03:19:38Z,2020-12-31T03:19:38Z,OWNER,"I got Cypress working! I added the `datasette.plugins` code to the table template and ran a test called `plugins.spec.js` using the following: ```javascript context('datasette.plugins API', () => { beforeEach(() => { cy.visit('/fixtures/compound_three_primary_keys') }); it('should exist', () => { let datasette; cy.window().then(win => { datasette = win.datasette; }).then(() => { expect(datasette).to.exist; expect(datasette.plugins).to.exist; }); }); it('should register and execute plugins', () => { let datasette; cy.window().then(win => { datasette = win.datasette; }).then(() => { expect(datasette.plugins.call('numbers')).to.deep.equal([]); // Register a plugin datasette.plugins.register(""numbers"", (a, b) => a + b, ['a', 'b']); var result = datasette.plugins.call(""numbers"", {a: 1, b: 2}); expect(result).to.deep.equal([3]); // Second plugin datasette.plugins.register(""numbers"", (a, b) => a * b, ['a', 'b']); var result2 = datasette.plugins.call(""numbers"", {a: 1, b: 2}); expect(result2).to.deep.equal([3, 2]); }); }); }); ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",776635426,Mechanism for executing JavaScript unit tests, https://github.com/simonw/datasette/issues/1165#issuecomment-752780000,https://api.github.com/repos/simonw/datasette/issues/1165,752780000,MDEyOklzc3VlQ29tbWVudDc1Mjc4MDAwMA==,9599,simonw,2020-12-30T22:41:25Z,2020-12-30T22:41:25Z,OWNER,Jest works with Puppeteer: https://jestjs.io/docs/en/puppeteer,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",776635426,Mechanism for executing JavaScript unit tests, https://github.com/simonw/datasette/issues/1165#issuecomment-752779820,https://api.github.com/repos/simonw/datasette/issues/1165,752779820,MDEyOklzc3VlQ29tbWVudDc1Mjc3OTgyMA==,9599,simonw,2020-12-30T22:40:28Z,2020-12-30T22:40:28Z,OWNER,"I don't know if Jest on the command-line is the right tool for this. It works for the `plugins.js` script but I'm increasingly going to want to start adding tests for browser JavaScript features - like the https://github.com/simonw/datasette/blob/0.53/datasette/static/table.js script - which will need to run in a browser. So maybe I should just find a browser testing solution and figure out how to run that under CI in GitHub Actions. 
Maybe https://www.cypress.io/ ?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",776635426,Mechanism for executing JavaScript unit tests, https://github.com/simonw/datasette/issues/1165#issuecomment-752779490,https://api.github.com/repos/simonw/datasette/issues/1165,752779490,MDEyOklzc3VlQ29tbWVudDc1Mjc3OTQ5MA==,9599,simonw,2020-12-30T22:38:43Z,2020-12-30T22:38:43Z,OWNER,Turned that into a TIL: https://til.simonwillison.net/javascript/jest-without-package-json,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",776635426,Mechanism for executing JavaScript unit tests, https://github.com/simonw/datasette/issues/1165#issuecomment-752777744,https://api.github.com/repos/simonw/datasette/issues/1165,752777744,MDEyOklzc3VlQ29tbWVudDc1Mjc3Nzc0NA==,9599,simonw,2020-12-30T22:30:24Z,2020-12-30T22:30:24Z,OWNER,"https://www.valentinog.com/blog/jest/ was useful. I created a `static/__tests__` folder and added this file as `plugins.spec.js`: ```javascript const datasette = require(""../plugins.js""); describe(""Datasette Plugins"", () => { test(""it should have datasette.plugins"", () => { expect(!!datasette.plugins).toEqual(true); }); test(""registering a plugin should work"", () => { datasette.plugins.register(""numbers"", (a, b) => a + b, [""a"", ""b""]); var result = datasette.plugins.call(""numbers"", { a: 1, b: 2 }); expect(result).toEqual([3]); datasette.plugins.register(""numbers"", (a, b) => a * b, [""a"", ""b""]); var result2 = datasette.plugins.call(""numbers"", { a: 1, b: 2 }); expect(result2).toEqual([3, 2]); }); }); ``` In `static/plugins.js` I put this: ```javascript var datasette = datasette || {}; datasette.plugins = (() => { var registry = {}; return { register: (hook, fn, parameters) => { if (!registry[hook]) { registry[hook] = []; } registry[hook].push([fn, parameters]); }, call: (hook, args) => { args = args || {}; var results = []; (registry[hook] || []).forEach(([fn, parameters]) => { /* Call with the correct arguments */ var result = fn.apply(fn, parameters.map(parameter => args[parameter])); if (result !== undefined) { results.push(result); } }); return results; } }; })(); module.exports = datasette; ``` Note the `module.exports` line at the end. Then inside `static/` I ran the following command: ``` % npx jest -c '{}' PASS __tests__/plugins.spec.js Datasette Plugins ✓ it should have datasette.plugins (3 ms) ✓ registering a plugin should work (1 ms) Test Suites: 1 passed, 1 total Tests: 2 passed, 2 total Snapshots: 0 total Time: 1.163 s Ran all test suites. ``` The `-c {}` was necessary because I didn't have a Jest configuration or a `package.json`.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",776635426,Mechanism for executing JavaScript unit tests, https://github.com/simonw/datasette/issues/983#issuecomment-752773508,https://api.github.com/repos/simonw/datasette/issues/983,752773508,MDEyOklzc3VlQ29tbWVudDc1Mjc3MzUwOA==,9599,simonw,2020-12-30T22:10:08Z,2020-12-30T22:11:34Z,OWNER,"https://twitter.com/dracos/status/1344402639476424706 points out that plugins returning 0 will be ignored. 
This should probably check for `result !== undefined` instead - knocks the size up to 250.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",712260429,JavaScript plugin hooks mechanism similar to pluggy, https://github.com/simonw/datasette/issues/983#issuecomment-752770488,https://api.github.com/repos/simonw/datasette/issues/983,752770488,MDEyOklzc3VlQ29tbWVudDc1Mjc3MDQ4OA==,9599,simonw,2020-12-30T21:55:35Z,2020-12-30T21:58:26Z,OWNER,"This one minifies to 241: ```javascript var datasette = datasette || {}; datasette.plugins = (() => { var registry = {}; return { register: (hook, fn, parameters) => { if (!registry[hook]) { registry[hook] = []; } registry[hook].push([fn, parameters]); }, call: (hook, args) => { args = args || {}; var results = []; (registry[hook] || []).forEach(([fn, parameters]) => { /* Call with the correct arguments */ var result = fn.apply(fn, parameters.map(parameter => args[parameter])); if (result) { results.push(result); } }); return results; } }; })(); ``` `var datasette=datasette||{};datasette.plugins=(()=>{var a={};return{register:(t,r,e)=>{a[t]||(a[t]=[]),a[t].push([r,e])},call:(t,r)=>{r=r||{};var e=[];return(a[t]||[]).forEach(([a,t])=>{var s=a.apply(a,t.map(a=>r[a]));s&&e.push(s)}),e}}})();`","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",712260429,JavaScript plugin hooks mechanism similar to pluggy, https://github.com/simonw/datasette/issues/983#issuecomment-752770133,https://api.github.com/repos/simonw/datasette/issues/983,752770133,MDEyOklzc3VlQ29tbWVudDc1Mjc3MDEzMw==,9599,simonw,2020-12-30T21:53:45Z,2020-12-30T21:54:22Z,OWNER,"FixMyStreet inlines some JavaScript, and it's always a good idea to copy what they're doing when it comes to web performance: https://github.com/mysociety/fixmystreet/blob/23e9564b58a86b783ce47f3c0bf837cbd4fe7282/templates/web/base/common_header_tags.html#L19-L25 Note `var fixmystreet=fixmystreet||{};` which is shorter - https://twitter.com/dracos/status/1344399909794045954","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",712260429,JavaScript plugin hooks mechanism similar to pluggy, https://github.com/simonw/datasette/issues/1164#issuecomment-752769452,https://api.github.com/repos/simonw/datasette/issues/1164,752769452,MDEyOklzc3VlQ29tbWVudDc1Mjc2OTQ1Mg==,9599,simonw,2020-12-30T21:50:16Z,2020-12-30T21:50:16Z,OWNER,If I implement this I can automate the CodeMirror minification and remove the bit about running `uglify-js` against it from the documentation here: https://docs.datasette.io/en/0.53/contributing.html#upgrading-codemirror,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",776634318,Mechanism for minifying JavaScript that ships with Datasette, https://github.com/simonw/datasette/issues/1164#issuecomment-752768785,https://api.github.com/repos/simonw/datasette/issues/1164,752768785,MDEyOklzc3VlQ29tbWVudDc1Mjc2ODc4NQ==,9599,simonw,2020-12-30T21:47:06Z,2020-12-30T21:47:06Z,OWNER,If I'm going to minify `table.js` I'd like to offer a source map for it.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",776634318,Mechanism for minifying JavaScript that ships with Datasette, 
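A small sketch of the truthiness problem flagged in the comments above, assuming nothing beyond plain JavaScript: collecting results with `if (result)` silently drops an implementation that returns `0`, while the suggested `result !== undefined` check keeps it.

```javascript
// Two ways of collecting hook results, given implementations returning 3 and 0.
const implementations = [() => 1 + 2, () => 2 * 0];

function callTruthy() {
  const results = [];
  implementations.forEach((fn) => {
    const result = fn();
    if (result) results.push(result); // drops the 0
  });
  return results;
}

function callDefined() {
  const results = [];
  implementations.forEach((fn) => {
    const result = fn();
    if (result !== undefined) results.push(result); // keeps the 0
  });
  return results;
}

console.log(callTruthy()); // [ 3 ]
console.log(callDefined()); // [ 3, 0 ]
```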
https://github.com/simonw/datasette/issues/1164#issuecomment-752768652,https://api.github.com/repos/simonw/datasette/issues/1164,752768652,MDEyOklzc3VlQ29tbWVudDc1Mjc2ODY1Mg==,9599,simonw,2020-12-30T21:46:29Z,2020-12-30T21:46:29Z,OWNER,Running https://skalman.github.io/UglifyJS-online/ against https://github.com/simonw/datasette/blob/0.53/datasette/static/table.js knocks it down from 7810 characters to 4643.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",776634318,Mechanism for minifying JavaScript that ships with Datasette, https://github.com/simonw/datasette/issues/983#issuecomment-752767500,https://api.github.com/repos/simonw/datasette/issues/983,752767500,MDEyOklzc3VlQ29tbWVudDc1Mjc2NzUwMA==,9599,simonw,2020-12-30T21:42:07Z,2020-12-30T21:42:07Z,OWNER,"Another option: have both ""dev"" and ""production"" versions of the plugin mechanism script. Make it easy to switch between the two. Build JavaScript unit tests that exercise the ""production"" APIs against the development version, and have extra tests that just work against the features in the development version.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",712260429,JavaScript plugin hooks mechanism similar to pluggy, https://github.com/simonw/datasette/issues/983#issuecomment-752767174,https://api.github.com/repos/simonw/datasette/issues/983,752767174,MDEyOklzc3VlQ29tbWVudDc1Mjc2NzE3NA==,9599,simonw,2020-12-30T21:40:44Z,2020-12-30T21:40:44Z,OWNER,Started a Twitter thread about this here: https://twitter.com/simonw/status/1344392603794477056,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",712260429,JavaScript plugin hooks mechanism similar to pluggy, https://github.com/simonw/datasette/issues/983#issuecomment-752751490,https://api.github.com/repos/simonw/datasette/issues/983,752751490,MDEyOklzc3VlQ29tbWVudDc1Mjc1MTQ5MA==,9599,simonw,2020-12-30T20:40:04Z,2020-12-30T21:34:22Z,OWNER,"This one is 683 bytes with Uglify - I like how https://skalman.github.io/UglifyJS-online/ shows you the minified character count as you edit the script: ```javascript window.datasette = window.datasette || {}; window.datasette.plugins = (() => { var registry = {}; var definitions = {}; var stringify = JSON.stringify; function extractParameters(fn) { var match = /\((.*)\)/.exec(fn.toString()); if (match && match[1].trim()) { return match[1].split(',').map(s => s.trim()); } else { return []; } } function isSubSet(a, b) { return a.every(parameter => b.includes(parameter)) } return { _r: registry, define: (hook, parameters) => { definitions[hook] = parameters || []; }, register: (hook, fn, parameters) => { parameters = parameters || extractParameters(fn); if (!definitions[hook]) { throw 'Hook ""' + hook + '"" not defined'; } /* Check parameters is a subset of definitions[hook] */ var validParameters = definitions[hook]; if (!isSubSet(parameters, validParameters)) { throw '""' + hook + '"" valid args: ' + stringify(validParameters); } if (!registry[hook]) { registry[hook] = []; } registry[hook].push([fn, parameters]); }, call: (hook, args) => { args = args || {}; if (!definitions[hook]) { throw '""' + hook + '"" hook not defined'; } if (!isSubSet(Object.keys(args), definitions[hook])) { throw '""' + hook + '"" valid args: ' + stringify(definitions[hook]); } var implementations = registry[hook] || []; var 
results = []; implementations.forEach(([fn, parameters]) => { /* Call with the correct arguments */ var callWith = parameters.map(parameter => args[parameter]); var result = fn.apply(fn, callWith); if (result) { results.push(result); } }); return results; } }; })(); ``` `window.datasette=window.datasette||{},window.datasette.plugins=(()=>{var t={},r={},e=JSON.stringify;function i(t,r){return t.every(t=>r.includes(t))}return{_r:t,define:(t,e)=>{r[t]=e||[]},register:(a,n,o)=>{if(o=o||function(t){var r=/\((.*)\)/.exec(t.toString());return r&&r[1].trim()?r[1].split("","").map(t=>t.trim()):[]}(n),!r[a])throw'Hook ""'+a+'"" not defined';var d=r[a];if(!i(o,d))throw'""'+a+'"" valid args: '+e(d);t[a]||(t[a]=[]),t[a].push([n,o])},call:(a,n)=>{if(n=n||{},!r[a])throw'""'+a+'"" hook not defined';if(!i(Object.keys(n),r[a]))throw'""'+a+'"" valid args: '+e(r[a]);var o=t[a]||[],d=[];return o.forEach(([t,r])=>{var e=r.map(t=>n[t]),i=t.apply(t,e);i&&d.push(i)}),d}}})();`","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",712260429,JavaScript plugin hooks mechanism similar to pluggy, https://github.com/simonw/datasette/issues/983#issuecomment-752760815,https://api.github.com/repos/simonw/datasette/issues/983,752760815,MDEyOklzc3VlQ29tbWVudDc1Mjc2MDgxNQ==,9599,simonw,2020-12-30T21:15:41Z,2020-12-30T21:15:41Z,OWNER,"I'm going to write a few example plugins and try them out against the longer and shorter versions of the script, to get a better feel for how useful the longer versions with the error handling and explicit definition actually are.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",712260429,JavaScript plugin hooks mechanism similar to pluggy, https://github.com/simonw/datasette/issues/983#issuecomment-752760054,https://api.github.com/repos/simonw/datasette/issues/983,752760054,MDEyOklzc3VlQ29tbWVudDc1Mjc2MDA1NA==,9599,simonw,2020-12-30T21:12:36Z,2020-12-30T21:14:05Z,OWNER,"I gotta admit that 262 byte version is pretty tempting, if it's going to end up in the `<head>` of every single page.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",712260429,JavaScript plugin hooks mechanism similar to pluggy, https://github.com/simonw/datasette/issues/983#issuecomment-752759885,https://api.github.com/repos/simonw/datasette/issues/983,752759885,MDEyOklzc3VlQ29tbWVudDc1Mjc1OTg4NQ==,9599,simonw,2020-12-30T21:11:52Z,2020-12-30T21:14:00Z,OWNER,"262 bytes if I remove the parameter introspection code, instead requiring plugin authors to specify the arguments they take: ```javascript window.datasette = window.datasette || {}; window.datasette.plugins = (() => { var registry = {}; return { register: (hook, fn, parameters) => { if (!registry[hook]) { registry[hook] = []; } registry[hook].push([fn, parameters]); }, call: (hook, args) => { args = args || {}; var results = []; (registry[hook] || []).forEach(([fn, parameters]) => { /* Call with the correct arguments */ var callWith = parameters.map(parameter => args[parameter]); var result = fn.apply(fn, callWith); if (result) { results.push(result); } }); return results; } }; })(); ``` `window.datasette=window.datasette||{},window.datasette.plugins=(()=>{var a={};return{register:(t,e,r)=>{a[t]||(a[t]=[]),a[t].push([e,r])},call:(t,e)=>{e=e||{};var r=[];return(a[t]||[]).forEach(([a,t])=>{var 
s=t.map(a=>e[a]),d=a.apply(a,s);d&&r.push(d)}),r}}})();`","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",712260429,JavaScript plugin hooks mechanism similar to pluggy, https://github.com/simonw/datasette/issues/983#issuecomment-752758802,https://api.github.com/repos/simonw/datasette/issues/983,752758802,MDEyOklzc3VlQ29tbWVudDc1Mjc1ODgwMg==,9599,simonw,2020-12-30T21:07:33Z,2020-12-30T21:10:10Z,OWNER,"Removing the `datasette.plugin.define()` method and associated error handling reduces the uglified version from 683 bytes to 380 bytes. I think the error checking is worth the extra 303 bytes per page load, even if it's only really needed for a better developer experience. ```javascript window.datasette = window.datasette || {}; window.datasette.plugins = (() => { var registry = {}; function extractParameters(fn) { var match = /\((.*)\)/.exec(fn.toString()); if (match && match[1].trim()) { return match[1].split(',').map(s => s.trim()); } else { return []; } } return { register: (hook, fn, parameters) => { parameters = parameters || extractParameters(fn); if (!registry[hook]) { registry[hook] = []; } registry[hook].push([fn, parameters]); }, call: (hook, args) => { args = args || {}; var implementations = registry[hook] || []; var results = []; implementations.forEach(([fn, parameters]) => { /* Call with the correct arguments */ var callWith = parameters.map(parameter => args[parameter]); var result = fn.apply(fn, callWith); if (result) { results.push(result); } }); return results; } }; })(); ``` `window.datasette=window.datasette||{},window.datasette.plugins=(()=>{var t={};return{register:(r,a,e)=>{e=e||function(t){var r=/\((.*)\)/.exec(t.toString());return r&&r[1].trim()?r[1].split("","").map(t=>t.trim()):[]}(a),t[r]||(t[r]=[]),t[r].push([a,e])},call:(r,a)=>{a=a||{};var e=t[r]||[],i=[];return e.forEach(([t,r])=>{var e=r.map(t=>a[t]),n=t.apply(t,e);n&&i.push(n)}),i}}})();`","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",712260429,JavaScript plugin hooks mechanism similar to pluggy, https://github.com/simonw/datasette/issues/1165#issuecomment-752757910,https://api.github.com/repos/simonw/datasette/issues/1165,752757910,MDEyOklzc3VlQ29tbWVudDc1Mjc1NzkxMA==,9599,simonw,2020-12-30T21:04:18Z,2020-12-30T21:04:18Z,OWNER,https://jestjs.io/ looks worth trying here.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",776635426,Mechanism for executing JavaScript unit tests, https://github.com/simonw/datasette/issues/983#issuecomment-752757289,https://api.github.com/repos/simonw/datasette/issues/983,752757289,MDEyOklzc3VlQ29tbWVudDc1Mjc1NzI4OQ==,9599,simonw,2020-12-30T21:02:20Z,2020-12-30T21:02:20Z,OWNER,I'm going to need to add JavaScript unit tests for this new plugin system.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",712260429,JavaScript plugin hooks mechanism similar to pluggy, https://github.com/simonw/datasette/issues/1164#issuecomment-752757075,https://api.github.com/repos/simonw/datasette/issues/1164,752757075,MDEyOklzc3VlQ29tbWVudDc1Mjc1NzA3NQ==,9599,simonw,2020-12-30T21:01:27Z,2020-12-30T21:01:27Z,OWNER,"I don't want Datasette contributors to need a working Node.js install to run the tests or work on Datasette unless they are explicitly working on the 
JavaScript. I think I'm going to do this with a unit test that runs only if `upglify-js` is available on the path and confirms that the `*.min.js` version of each script in the repository correctly matches the results from running `uglify-js` against it. That way if anyone checks in a change to JavaScript but forgets to run the minifier the tests will fail in CI.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",776634318,Mechanism for minifying JavaScript that ships with Datasette, https://github.com/simonw/datasette/issues/1164#issuecomment-752756612,https://api.github.com/repos/simonw/datasette/issues/1164,752756612,MDEyOklzc3VlQ29tbWVudDc1Mjc1NjYxMg==,9599,simonw,2020-12-30T20:59:54Z,2020-12-30T20:59:54Z,OWNER,"I tried a few different pure-Python JavaScript minifying libraries and none of them produced results as good as https://www.npmjs.com/package/uglify-js for the plugin code I'm considering in #983. So I think I'll need to rely on a Node.js tool for this.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",776634318,Mechanism for minifying JavaScript that ships with Datasette, https://github.com/simonw/datasette/issues/983#issuecomment-752750551,https://api.github.com/repos/simonw/datasette/issues/983,752750551,MDEyOklzc3VlQ29tbWVudDc1Mjc1MDU1MQ==,9599,simonw,2020-12-30T20:36:38Z,2020-12-30T20:37:48Z,OWNER,"This version minifies to 702 characters: ```javascript window.datasette = window.datasette || {}; window.datasette.plugins = (() => { var registry = {}; var definitions = {}; var stringify = JSON.stringify; function extractParameters(fn) { var match = /\((.*)\)/.exec(fn.toString()); if (match && match[1].trim()) { return match[1].split(',').map(s => s.trim()); } else { return []; } } function isSubSet(a, b) { return a.every(parameter => b.includes(parameter)) } return { _registry: registry, define: (hook, parameters) => { definitions[hook] = parameters || []; }, register: (hook, fn, parameters) => { parameters = parameters || extractParameters(fn); if (!definitions[hook]) { throw '""' + hook + '"" is not a defined hook'; } /* Check parameters is a subset of definitions[hook] */ var validParameters = definitions[hook]; if (!isSubSet(parameters, validParameters)) { throw '""' + hook + '"" valid args are ' + stringify(validParameters); } if (!registry[hook]) { registry[hook] = []; } registry[hook].push([fn, parameters]); }, call: (hook, args) => { args = args || {}; if (!definitions[hook]) { throw '""' + hook + '"" hook is not defined'; } if (!isSubSet(Object.keys(args), definitions[hook])) { throw '""' + hook + '"" valid args: ' + stringify(definitions[hook]); } var implementations = registry[hook] || []; var results = []; implementations.forEach(([fn, parameters]) => { /* Call with the correct arguments */ var callWith = parameters.map(parameter => args[parameter]); var result = fn.apply(fn, callWith); if (result) { results.push(result); } }); return results; } }; })(); ``` Or 701 characters using https://skalman.github.io/UglifyJS-online/","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",712260429,JavaScript plugin hooks mechanism similar to pluggy, 
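A hedged usage sketch for the `define()`/`register()`/`call()` API in the version quoted above; the hook name `numbers` and its parameters mirror the usage examples given elsewhere in the thread, and the exact error strings are left out since they differ between the drafts in this thread.

```javascript
// Assumes the datasette.plugins implementation from the comment above has
// already been evaluated on the page.
datasette.plugins.define("numbers", ["a", "b"]);

// Parameter names are introspected from the function signature ('a', 'b').
datasette.plugins.register("numbers", (a, b) => a + b);

console.log(datasette.plugins.call("numbers", { a: 4, b: 6 })); // [10]

try {
  // 'c' was never define()d as a parameter for this hook, so register() throws.
  datasette.plugins.register("numbers", (c) => c * 2);
} catch (e) {
  console.log(e); // a string naming the valid arguments for the hook
}

try {
  // Calling a hook that was never define()d also throws.
  datasette.plugins.call("nmbers", { a: 1, b: 2 });
} catch (e) {
  console.log(e); // a string explaining that the hook is not defined
}
```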
https://github.com/simonw/datasette/issues/983#issuecomment-752749189,https://api.github.com/repos/simonw/datasette/issues/983,752749189,MDEyOklzc3VlQ29tbWVudDc1Mjc0OTE4OQ==,9599,simonw,2020-12-30T20:31:28Z,2020-12-30T20:31:28Z,OWNER,"Using raw string exceptions, `throw '""' + hook + '"" hook has not been defined';`, knocks it down to 795 characters.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",712260429,JavaScript plugin hooks mechanism similar to pluggy, https://github.com/simonw/datasette/issues/983#issuecomment-752748496,https://api.github.com/repos/simonw/datasette/issues/983,752748496,MDEyOklzc3VlQ29tbWVudDc1Mjc0ODQ5Ng==,9599,simonw,2020-12-30T20:28:48Z,2020-12-30T20:28:48Z,OWNER,If I'm going to minify it I'll need to figure out a build step in Datasette itself so that I can easily work on that minified version.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",712260429,JavaScript plugin hooks mechanism similar to pluggy, https://github.com/simonw/datasette/issues/983#issuecomment-752747999,https://api.github.com/repos/simonw/datasette/issues/983,752747999,MDEyOklzc3VlQ29tbWVudDc1Mjc0Nzk5OQ==,9599,simonw,2020-12-30T20:27:00Z,2020-12-30T20:27:00Z,OWNER,"I need to decide how this code is going to be loaded. Putting it in a blocking `<script>` element in the head would work, but I'd rather not block loading of the rest of the page. Using a `<script async>` method would be nicer, but then I have to worry about plugins attempting to register themselves before the page has fully loaded. Running it through https://javascript-minifier.com/ produces this, which is 855 characters - so maybe I could inline that into the header of the page? 
`window.datasette={},window.datasette.plugins=function(){var r={},n={};function e(r,n){return r.every(r=>n.includes(r))}return{define:function(r,e){n[r]=e||[]},register:function(t,i,o){if(o=o||function(r){var n=/\((.*)\)/.exec(r.toString());return n&&n[1].trim()?n[1].split("","").map(r=>r.trim()):[]}(i),!n[t])throw new Error('""'+t+'"" is not a defined plugin hook');if(!n[t])throw new Error('""'+t+'"" is not a defined plugin hook');var a=n[t];if(!e(o,a))throw new Error('""'+t+'"" valid parameters are '+JSON.stringify(a));r[t]||(r[t]=[]),r[t].push([i,o])},_registry:r,call:function(t,i){if(i=i||{},!n[t])throw new Error('""'+t+'"" hook has not been defined');if(!e(Object.keys(i),n[t]))throw new Error('""'+t+'"" valid arguments are '+JSON.stringify(n[t]));var o=r[t]||[],a=[];return o.forEach(([r,n])=>{var e=n.map(r=>i[r]),t=r.apply(r,e);t&&a.push(t)}),a}}}();`","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",712260429,JavaScript plugin hooks mechanism similar to pluggy, https://github.com/simonw/datasette/issues/983#issuecomment-752747169,https://api.github.com/repos/simonw/datasette/issues/983,752747169,MDEyOklzc3VlQ29tbWVudDc1Mjc0NzE2OQ==,9599,simonw,2020-12-30T20:24:07Z,2020-12-30T20:24:07Z,OWNER,"This version adds `datasette.plugins.define()` plus extra validation of both `.register()` and `.call()`: ```javascript window.datasette = {}; window.datasette.plugins = (function() { var registry = {}; var definitions = {}; function extractParameters(fn) { var match = /\((.*)\)/.exec(fn.toString()); if (match && match[1].trim()) { return match[1].split(',').map(s => s.trim()); } else { return []; } } function define(hook, parameters) { definitions[hook] = parameters || []; } function isSubSet(a, b) { return a.every(parameter => b.includes(parameter)) } function register(hook, fn, parameters) { parameters = parameters || extractParameters(fn); if (!definitions[hook]) { throw new Error('""' + hook + '"" is not a defined plugin hook'); } if (!definitions[hook]) { throw new Error('""' + hook + '"" is not a defined plugin hook'); } /* Check parameters is a subset of definitions[hook] */ var validParameters = definitions[hook]; if (!isSubSet(parameters, validParameters)) { throw new Error('""' + hook + '"" valid parameters are ' + JSON.stringify(validParameters)); } if (!registry[hook]) { registry[hook] = []; } registry[hook].push([fn, parameters]); } function call(hook, args) { args = args || {}; if (!definitions[hook]) { throw new Error('""' + hook + '"" hook has not been defined'); } if (!isSubSet(Object.keys(args), definitions[hook])) { throw new Error('""' + hook + '"" valid arguments are ' + JSON.stringify(definitions[hook])); } var implementations = registry[hook] || []; var results = []; implementations.forEach(([fn, parameters]) => { /* Call with the correct arguments */ var callWith = parameters.map(parameter => args[parameter]); var result = fn.apply(fn, callWith); if (result) { results.push(result); } }); return results; } return { define: define, register: register, _registry: registry, call: call }; })(); ``` Usage: ```javascript datasette.plugins.define('numbers', ['a', 'b']) datasette.plugins.register('numbers', (a, b) => a + b) datasette.plugins.register('numbers', (a, b) => a * b) datasette.plugins.call('numbers', {a: 4, b: 6}) ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",712260429,JavaScript plugin hooks 
mechanism similar to pluggy, https://github.com/simonw/datasette/issues/983#issuecomment-752744311,https://api.github.com/repos/simonw/datasette/issues/983,752744311,MDEyOklzc3VlQ29tbWVudDc1Mjc0NDMxMQ==,9599,simonw,2020-12-30T20:12:50Z,2020-12-30T20:13:02Z,OWNER,"This could work to define a plugin hook: ```javascript datasette.plugins.define('numbers', ['a' ,'b']) ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",712260429,JavaScript plugin hooks mechanism similar to pluggy, https://github.com/simonw/datasette/issues/983#issuecomment-752744195,https://api.github.com/repos/simonw/datasette/issues/983,752744195,MDEyOklzc3VlQ29tbWVudDc1Mjc0NDE5NQ==,9599,simonw,2020-12-30T20:12:26Z,2020-12-30T20:12:26Z,OWNER,"This implementation doesn't have an equivalent of ""hookspecs"" which can identify if a registered plugin implementation matches a known signature. I should add that, it will provide a better developer experience if someone has a typo.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",712260429,JavaScript plugin hooks mechanism similar to pluggy, https://github.com/simonw/datasette/issues/983#issuecomment-752742669,https://api.github.com/repos/simonw/datasette/issues/983,752742669,MDEyOklzc3VlQ29tbWVudDc1Mjc0MjY2OQ==,9599,simonw,2020-12-30T20:07:05Z,2020-12-30T20:07:18Z,OWNER,"Initial prototype: ```javascript window.datasette = {}; window.datasette.plugins = (function() { var registry = {}; function extractParameters(fn) { var match = /\((.*)\)/.exec(fn.toString()); if (match && match[1].trim()) { return match[1].split(',').map(s => s.trim()); } else { return []; } } function register(hook, fn, parameters) { parameters = parameters || extractParameters(fn); if (!registry[hook]) { registry[hook] = []; } registry[hook].push([fn, parameters]); } function call(hook, args) { args = args || {}; var implementations = registry[hook] || []; var results = []; implementations.forEach(([fn, parameters]) => { /* Call with the correct arguments */ var callWith = parameters.map(parameter => args[parameter]); var result = fn.apply(fn, callWith); if (result) { results.push(result); } }); return results; } return { register: register, _registry: registry, call: call }; })(); ``` Usage example: ```javascript datasette.plugins.register('numbers', (a, b) => a + b) datasette.plugins.register('numbers', (a, b) => a * b) datasette.plugins.call('numbers', {a: 4, b: 6}) /* Returns [10, 24] */ ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",712260429,JavaScript plugin hooks mechanism similar to pluggy, https://github.com/simonw/datasette/issues/983#issuecomment-752729035,https://api.github.com/repos/simonw/datasette/issues/983,752729035,MDEyOklzc3VlQ29tbWVudDc1MjcyOTAzNQ==,9599,simonw,2020-12-30T19:15:56Z,2020-12-30T19:16:44Z,OWNER,"The `column_actions` hook is the obvious first place to try this out. What are some demo plugins I could build for it? - Word cloud for this column - Count values (essentially linking to the SQL query for that column, as an extended version of the facet counts) - would be great if this could include pagination somehow, via #856. 
- Extract this column into a separate table - Add an index to this column","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",712260429,JavaScript plugin hooks mechanism similar to pluggy, https://github.com/simonw/datasette/issues/983#issuecomment-752722863,https://api.github.com/repos/simonw/datasette/issues/983,752722863,MDEyOklzc3VlQ29tbWVudDc1MjcyMjg2Mw==,9599,simonw,2020-12-30T18:52:39Z,2020-12-30T18:52:39Z,OWNER,"Then to call the plugins: ```javascript datasette.plugins.call('column_actions', {database: 'database', table: 'table'}) ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",712260429,JavaScript plugin hooks mechanism similar to pluggy, https://github.com/simonw/datasette/issues/983#issuecomment-752721840,https://api.github.com/repos/simonw/datasette/issues/983,752721840,MDEyOklzc3VlQ29tbWVudDc1MjcyMTg0MA==,9599,simonw,2020-12-30T18:48:53Z,2020-12-30T18:51:51Z,OWNER,"Potential design: ```javascript datasette.plugins.register('column_actions', function(database, table, column, actor) { /* ... */ }) ``` Or if you want to be explicit to survive minification: ```javascript datasette.plugins.register('column_actions', function(database, table, column, actor) { /* ... */ }, ['database', 'table', 'column', 'actor']) ``` I'm making that list of parameter names an optional third argument to the `register()` function. If that argument isn't passed, introspection will be used to figure out the parameter names. ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",712260429,JavaScript plugin hooks mechanism similar to pluggy, https://github.com/simonw/datasette/issues/983#issuecomment-752721069,https://api.github.com/repos/simonw/datasette/issues/983,752721069,MDEyOklzc3VlQ29tbWVudDc1MjcyMTA2OQ==,9599,simonw,2020-12-30T18:46:10Z,2020-12-30T18:46:10Z,OWNER,"Pluggy does dependency injection by introspecting the named arguments to the Python function, which I really like. That's trickier in JavaScript. It looks like the only way to introspect a function is to look at the `.toString()` representation of it and parse the `(parameter, list)` using a regular expression. Even more challenging: JavaScript developers love minifying their code, and minification can shorten the function parameter names. 
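A short sketch, not from the original thread, of the `.toString()` introspection described above and of how minification defeats it; `extractParameters` is copied from the prototypes earlier in the thread.

```javascript
// Parse parameter names out of a function's source, as in the prototypes above.
function extractParameters(fn) {
  var match = /\((.*)\)/.exec(fn.toString());
  return match && match[1].trim()
    ? match[1].split(",").map((s) => s.trim())
    : [];
}

// As written by a plugin author:
console.log(extractParameters(function (database, table) {})); // [ 'database', 'table' ]

// After a minifier renames the parameters, named-argument lookup breaks:
console.log(extractParameters(function (a, b) {})); // [ 'a', 'b' ]
```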
From https://code-maven.com/dependency-injection-in-angularjs it looks like Angular.js does dependency injection and solves this by letting you optionally provide a separate list of the arguments your function uses: ```javascript angular.module('DemoApp', []) .controller('DemoController', ['$scope', '$log', function($scope, $log) { $scope.message = ""Hello World""; $log.debug('logging hello'); }]); ``` I can copy that approach: I'll introspect by default, but provide a documented mechanism for explicitly listing your parameter names so that if you know your plugin code will be minified you can use that instead.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",712260429,JavaScript plugin hooks mechanism similar to pluggy, https://github.com/simonw/datasette/issues/983#issuecomment-752715412,https://api.github.com/repos/simonw/datasette/issues/983,752715412,MDEyOklzc3VlQ29tbWVudDc1MjcxNTQxMg==,9599,simonw,2020-12-30T18:25:31Z,2020-12-30T18:25:31Z,OWNER,I'm going to introduce a global `datasette` object which holds all the documented JavaScript API for plugin authors.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",712260429,JavaScript plugin hooks mechanism similar to pluggy, https://github.com/simonw/datasette/issues/983#issuecomment-752715236,https://api.github.com/repos/simonw/datasette/issues/983,752715236,MDEyOklzc3VlQ29tbWVudDc1MjcxNTIzNg==,9599,simonw,2020-12-30T18:24:54Z,2020-12-30T18:24:54Z,OWNER,"I think I'm going to try building a very lightweight clone of the core API design of Pluggy - not the advanced features, just the idea that plugins can register and a call to `plugin.nameOfHook()` will return the concatenated results of all of the registered hooks.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",712260429,JavaScript plugin hooks mechanism similar to pluggy, https://github.com/simonw/datasette/issues/987#issuecomment-752714747,https://api.github.com/repos/simonw/datasette/issues/987,752714747,MDEyOklzc3VlQ29tbWVudDc1MjcxNDc0Nw==,9599,simonw,2020-12-30T18:23:08Z,2020-12-30T18:23:20Z,OWNER,"In terms of ""places to put your plugin content"", the simplest solution I can think of is something like this: ```html <div id=""plugin-content-pre-table""></div> ``` Alternative designs: - A documented JavaScript function that returns the CSS selector where plugins should put their content - A documented JavaScript function that returns a DOM node where plugins should put their content. This would allow the JavaScript to create the element if it does not already exist (though it wouldn't be obvious WHERE that element should be created) - Documented JavaScript functions for things like ""append this node/HTML to the place-where-plugins-go"" I think the original option - an empty `<div>` with a known `id` attribute - is the right one to go with here. 
It's the simplest, it's very easy for custom template authors to understand and it acknowledges that plugins may have all kinds of extra crazy stuff they want to do - like checking in that div to see if another plugin has written to it already, for example.","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",712984738,Documented HTML hooks for JavaScript plugin authors, https://github.com/simonw/datasette/issues/987#issuecomment-752697279,https://api.github.com/repos/simonw/datasette/issues/987,752697279,MDEyOklzc3VlQ29tbWVudDc1MjY5NzI3OQ==,9599,simonw,2020-12-30T17:23:27Z,2020-12-30T17:23:32Z,OWNER,Related problem: right now if you're writing custom template it's not at all obvious how to write them such that visualization plugins like `datasette-cluster-map` will continue to work.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",712984738,Documented HTML hooks for JavaScript plugin authors, https://github.com/simonw/datasette/issues/987#issuecomment-752696499,https://api.github.com/repos/simonw/datasette/issues/987,752696499,MDEyOklzc3VlQ29tbWVudDc1MjY5NjQ5OQ==,9599,simonw,2020-12-30T17:21:08Z,2020-12-30T17:21:08Z,OWNER,"More generally, I need to document certain areas of the page that JavaScript plugins are invited to append their content to - such that plugin authors can use them and feel confident that future changes to the Datasette templates won't break their plugins.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",712984738,Documented HTML hooks for JavaScript plugin authors, https://github.com/simonw/datasette/issues/1160#issuecomment-752275611,https://api.github.com/repos/simonw/datasette/issues/1160,752275611,MDEyOklzc3VlQ29tbWVudDc1MjI3NTYxMQ==,9599,simonw,2020-12-29T23:32:04Z,2020-12-29T23:32:04Z,OWNER,"If I can get this working for CSV, TSV, JSON and JSON-NL that should be enough to exercise the API design pretty well across both streaming and non-streaming formats.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",775666296,"""datasette insert"" command and plugin hook", https://github.com/simonw/datasette/issues/1160#issuecomment-752274509,https://api.github.com/repos/simonw/datasette/issues/1160,752274509,MDEyOklzc3VlQ29tbWVudDc1MjI3NDUwOQ==,9599,simonw,2020-12-29T23:26:02Z,2020-12-29T23:26:02Z,OWNER,"The documentation for this plugin hook is going to be pretty detailed, since it involves writing custom classes. 
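To make that "custom classes" point concrete, here is a rough Python sketch of what one format implementation might look like. Everything here - the `Format` base class, `can_handle()`, `rows()`, `pick_format()` - is invented for illustration and is not Datasette's actual plugin hook API, which was still being designed at this point:

```python
import csv


class Format:
    """Hypothetical base class that a format plugin might implement."""

    name = None

    def can_handle(self, filename=None, url=None, first_bytes=b""):
        raise NotImplementedError

    def rows(self, fp):
        """Yield one dictionary per record from an open file object."""
        raise NotImplementedError


class CsvFormat(Format):
    name = "csv"

    def can_handle(self, filename=None, url=None, first_bytes=b""):
        # Cheap check on the filename or URL; a sniffer could inspect first_bytes instead
        return (url or filename or "").lower().endswith(".csv")

    def rows(self, fp):
        reader = csv.reader(fp)
        headers = next(reader)
        yield from (dict(zip(headers, row)) for row in reader)


def pick_format(formats, filename=None, url=None, first_bytes=b""):
    # First registered format that claims it can handle the input wins
    for fmt in formats:
        if fmt.can_handle(filename=filename, url=url, first_bytes=first_bytes):
            return fmt
    return None
```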
I'll stick it all on the existing hooks page for the moment, but I should think about breaking up the plugin hook documentation into a page-per-hook in the future.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",775666296,"""datasette insert"" command and plugin hook", https://github.com/simonw/datasette/issues/1163#issuecomment-752274340,https://api.github.com/repos/simonw/datasette/issues/1163,752274340,MDEyOklzc3VlQ29tbWVudDc1MjI3NDM0MA==,9599,simonw,2020-12-29T23:25:02Z,2020-12-29T23:25:02Z,OWNER,This will be built on top of `httpx` since that's already a dependency.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",776128565,"""datasette insert data.db url-to-csv""", https://github.com/simonw/datasette/issues/1160#issuecomment-752274078,https://api.github.com/repos/simonw/datasette/issues/1160,752274078,MDEyOklzc3VlQ29tbWVudDc1MjI3NDA3OA==,9599,simonw,2020-12-29T23:23:39Z,2020-12-29T23:23:39Z,OWNER,"If I design this right I can ship a full version of the command-line `datasette insert` command in a release without doing any work at all on the Web UI version of it - that UI can then come later, without needing any changes to be made to the plugin hook.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",775666296,"""datasette insert"" command and plugin hook", https://github.com/simonw/datasette/issues/1160#issuecomment-752273873,https://api.github.com/repos/simonw/datasette/issues/1160,752273873,MDEyOklzc3VlQ29tbWVudDc1MjI3Mzg3Mw==,9599,simonw,2020-12-29T23:22:30Z,2020-12-29T23:22:30Z,OWNER,"How much of this should I get done in a branch before merging into `main`? The challenge here is the plugin hook design: ideally I don't want an incomplete plugin hook design in `main` since that could be a blocker for a release.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",775666296,"""datasette insert"" command and plugin hook", https://github.com/simonw/datasette/issues/1160#issuecomment-752273400,https://api.github.com/repos/simonw/datasette/issues/1160,752273400,MDEyOklzc3VlQ29tbWVudDc1MjI3MzQwMA==,9599,simonw,2020-12-29T23:19:46Z,2020-12-29T23:19:46Z,OWNER,I'm going to break out some separate tickets.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",775666296,"""datasette insert"" command and plugin hook", https://github.com/simonw/datasette/issues/1160#issuecomment-752273306,https://api.github.com/repos/simonw/datasette/issues/1160,752273306,MDEyOklzc3VlQ29tbWVudDc1MjI3MzMwNg==,9599,simonw,2020-12-29T23:19:15Z,2020-12-29T23:19:15Z,OWNER,It would be nice if this abstraction could support progress bars as well. 
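One way progress reporting could work for file-based inputs is to wrap the file object and count what has been consumed against a known total (the file size, or the `content-length` for URLs when present). A rough sketch, not actual Datasette or sqlite-utils code - in text mode it counts characters rather than bytes, which is approximate but fine for a progress bar:

```python
class ProgressReader:
    """Wrap a file-like object and report how far through it we are."""

    def __init__(self, fp, total, on_progress):
        self.fp = fp
        self.total = total  # file size, or content-length for URLs when available
        self.on_progress = on_progress
        self.done = 0

    def read(self, size=-1):
        chunk = self.fp.read(size)
        self.done += len(chunk)
        self.on_progress(self.done, self.total)
        return chunk

    def __iter__(self):
        # csv.reader() only needs an iterable of lines
        for line in self.fp:
            self.done += len(line)
            self.on_progress(self.done, self.total)
            yield line
```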
These won't necessarily work for every format - or they might work for things loaded from files but not things loaded over URLs (if the `content-length` HTTP header is missing) - but if they ARE possible it would be good to provide them - both for the CLI interface and the web insert UI.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",775666296,"""datasette insert"" command and plugin hook", https://github.com/simonw/datasette/issues/1160#issuecomment-752267905,https://api.github.com/repos/simonw/datasette/issues/1160,752267905,MDEyOklzc3VlQ29tbWVudDc1MjI2NzkwNQ==,9599,simonw,2020-12-29T22:52:09Z,2020-12-29T22:52:09Z,OWNER,"What's the simplest thing that could possibly work? I think it's `datasette insert blah.db data.csv` - no URL handling, no other formats.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",775666296,"""datasette insert"" command and plugin hook", https://github.com/simonw/datasette/issues/1160#issuecomment-752266076,https://api.github.com/repos/simonw/datasette/issues/1160,752266076,MDEyOklzc3VlQ29tbWVudDc1MjI2NjA3Ng==,9599,simonw,2020-12-29T22:42:23Z,2020-12-29T22:42:59Z,OWNER,"Aside: maybe `datasette insert` works against simple files, but a later mechanism called `datasette import` allows plugins to register sub-commands, like `datasette import github ...` or `datasette import jira ...` or whatever. This would be useful for import mechanisms that are likely to need their own custom set of command-line options unique to that source.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",775666296,"""datasette insert"" command and plugin hook", https://github.com/simonw/datasette/issues/1160#issuecomment-752265600,https://api.github.com/repos/simonw/datasette/issues/1160,752265600,MDEyOklzc3VlQ29tbWVudDc1MjI2NTYwMA==,9599,simonw,2020-12-29T22:39:56Z,2020-12-29T22:39:56Z,OWNER,"Does it definitely make sense to break this operation up into the code that turns the incoming format into an iterator of dictionaries, then the code that inserts those into the database using `sqlite-utils`? That seems right for simple imports, where the incoming file represents a sequence of records in a single table. But what about more complex formats? What if a format needs to be represented as multiple tables?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",775666296,"""datasette insert"" command and plugin hook", https://github.com/simonw/datasette/issues/1160#issuecomment-752259345,https://api.github.com/repos/simonw/datasette/issues/1160,752259345,MDEyOklzc3VlQ29tbWVudDc1MjI1OTM0NQ==,9599,simonw,2020-12-29T22:11:54Z,2020-12-29T22:11:54Z,OWNER,"Important detail from https://docs.python.org/3/library/csv.html#csv.reader > If *csvfile* is a file object, it should be opened with `newline=''`. [1] > > [...] > > If `newline=''` is not specified, newlines embedded inside quoted fields will not be interpreted correctly, and on platforms that use `\r\n` linendings on write an extra `\r` will be added. 
It should always be safe to specify `newline=''`, since the csv module does its own ([universal](https://docs.python.org/3/glossary.html#term-universal-newlines)) newline handling.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",775666296,"""datasette insert"" command and plugin hook", https://github.com/simonw/datasette/issues/1160#issuecomment-752257666,https://api.github.com/repos/simonw/datasette/issues/1160,752257666,MDEyOklzc3VlQ29tbWVudDc1MjI1NzY2Ng==,9599,simonw,2020-12-29T22:09:18Z,2020-12-29T22:09:18Z,OWNER,"### Figuring out the API design I want to be able to support different formats, and be able to parse them into tables either streaming or in one go depending on if the format supports that. Ideally I want to be able to pull the first 1,024 bytes for the purpose of detecting the format, then replay those bytes again later. I'm considering this a stretch goal though. CSV is easy to parse as a stream - here’s [how sqlite-utils does it](https://github.com/simonw/sqlite-utils/blob/f1277f638f3a54a821db6e03cb980adad2f2fa35/sqlite_utils/cli.py#L630): dialect = ""excel-tab"" if tsv else ""excel"" with file_progress(json_file, silent=silent) as json_file: reader = csv_std.reader(json_file, dialect=dialect) headers = next(reader) docs = (dict(zip(headers, row)) for row in reader) Problem: using `db.insert_all()` could block for a long time on a big set of rows. Probably easiest to batch the records before calling `insert_all()` and then run a batch at a time using a `db.execute_write_fn()` call.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",775666296,"""datasette insert"" command and plugin hook", https://github.com/simonw/datasette/issues/1161#issuecomment-752253095,https://api.github.com/repos/simonw/datasette/issues/1161,752253095,MDEyOklzc3VlQ29tbWVudDc1MjI1MzA5NQ==,9599,simonw,2020-12-29T21:49:57Z,2020-12-29T21:49:57Z,OWNER,https://github.com/Homebrew/homebrew-core/pull/67983,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",776101101,Update a whole bunch of links to datasette.io instead of datasette.readthedocs.io, https://github.com/simonw/datasette/issues/1160#issuecomment-752236520,https://api.github.com/repos/simonw/datasette/issues/1160,752236520,MDEyOklzc3VlQ29tbWVudDc1MjIzNjUyMA==,9599,simonw,2020-12-29T20:48:51Z,2020-12-29T20:48:51Z,OWNER,It would be neat if `datasette insert` could accept a `--plugins-dir` option which allowed one-off format plugins to be registered. 
Bit tricky to implement since the `--format` Click option will already be populated by that plugin hook call.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",775666296,"""datasette insert"" command and plugin hook", https://github.com/simonw/datasette/issues/1160#issuecomment-751925934,https://api.github.com/repos/simonw/datasette/issues/1160,751925934,MDEyOklzc3VlQ29tbWVudDc1MTkyNTkzNA==,9599,simonw,2020-12-29T02:40:13Z,2020-12-29T20:25:57Z,OWNER,"Basic command design: datasette insert data.db blah.csv The options can include: - `--format` to specify the exact format - without this it will be guessed based on the filename - `--table` to specify the table (otherwise the filename is used) - `--pk` to specify one or more primary key columns - `--replace` to specify that existing rows with a matching primary key should be replaced - `--upsert` to specify that existing matching rows should be upserted - `--ignore` to ignore matching rows - `--alter` to alter the table to add missing columns - `--type column type` to specify the type of a column - useful when working with CSV or TSV files","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",775666296,"""datasette insert"" command and plugin hook", https://github.com/simonw/datasette/issues/1160#issuecomment-752208036,https://api.github.com/repos/simonw/datasette/issues/1160,752208036,MDEyOklzc3VlQ29tbWVudDc1MjIwODAzNg==,9599,simonw,2020-12-29T19:06:35Z,2020-12-29T19:06:35Z,OWNER,"If I'm going to execute 1000s of writes in an `async def` operation it may make sense to break that up into smaller chunks, so as not to block the event loop for too long. 
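A rough sketch of that chunked-write approach, assuming an iterator of dictionaries (like the CSV generator discussed above), Datasette's `db.execute_write_fn()` and `sqlite-utils` for the actual inserts - the helper name is made up for illustration:

```python
import asyncio
import itertools

import sqlite_utils


async def insert_in_batches(db, table, docs, batch_size=100):
    # db is a Datasette Database object; docs is an iterator of dictionaries
    docs = iter(docs)
    while True:
        batch = list(itertools.islice(docs, batch_size))
        if not batch:
            break

        def write(conn, batch=batch):
            # Runs on Datasette's write thread with the raw sqlite3 connection
            sqlite_utils.Database(conn)[table].insert_all(batch)

        await db.execute_write_fn(write)
        # Yield control so other requests are not starved between batches
        await asyncio.sleep(0)
```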
https://stackoverflow.com/a/36648102 and https://github.com/python/asyncio/issues/284 confirm that `await asyncio.sleep(0)` is the recommended way of doing this.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",775666296,"""datasette insert"" command and plugin hook", https://github.com/simonw/datasette/issues/1160#issuecomment-752203909,https://api.github.com/repos/simonw/datasette/issues/1160,752203909,MDEyOklzc3VlQ29tbWVudDc1MjIwMzkwOQ==,9599,simonw,2020-12-29T18:54:19Z,2020-12-29T18:54:19Z,OWNER,More thoughts on this: the key mechanism that populates the tables needs to be an `async def` method of some sort so that it can run as part of the async loop in core Datasette - for importing from web uploads.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",775666296,"""datasette insert"" command and plugin hook", https://github.com/simonw/datasette/issues/417#issuecomment-752098906,https://api.github.com/repos/simonw/datasette/issues/417,752098906,MDEyOklzc3VlQ29tbWVudDc1MjA5ODkwNg==,82988,psychemedia,2020-12-29T14:34:30Z,2020-12-29T14:34:50Z,CONTRIBUTOR,"FWIW, I had a look at `watchdog` for a `datasette` powered Jupyter notebook search tool: https://github.com/ouseful-testing/nbsearch/blob/main/nbsearch/nbwatchdog.py Not a production thing, just an experiment trying to explore what might be possible...","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",421546944,Datasette Library, https://github.com/simonw/datasette/issues/1160#issuecomment-751947991,https://api.github.com/repos/simonw/datasette/issues/1160,751947991,MDEyOklzc3VlQ29tbWVudDc1MTk0Nzk5MQ==,9599,simonw,2020-12-29T05:06:50Z,2020-12-29T05:07:03Z,OWNER,"Given the URL option could it be possible for plugins to ""subscribe"" to URLs that keep on streaming? datasette insert db.db https://example.com/streaming-api \ --format api-stream","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",775666296,"""datasette insert"" command and plugin hook", https://github.com/simonw/datasette/issues/1160#issuecomment-751946262,https://api.github.com/repos/simonw/datasette/issues/1160,751946262,MDEyOklzc3VlQ29tbWVudDc1MTk0NjI2Mg==,9599,simonw,2020-12-29T04:56:12Z,2020-12-29T04:56:32Z,OWNER,"Potential design for this: a `datasette memory` command which takes most of the same arguments as `datasette serve` but starts an in-memory database and treats the command arguments as things that should be inserted into that in-memory database. tail -f access.log | datasette memory - \ --format clf -p 8002 -o","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",775666296,"""datasette insert"" command and plugin hook", https://github.com/simonw/datasette/issues/1160#issuecomment-751945094,https://api.github.com/repos/simonw/datasette/issues/1160,751945094,MDEyOklzc3VlQ29tbWVudDc1MTk0NTA5NA==,9599,simonw,2020-12-29T04:48:11Z,2020-12-29T04:48:11Z,OWNER,"It would be pretty cool if you could launch Datasette directly against an insert-compatible file or URL without first having to load it into a SQLite database file. 
Or imagine being able to tail a log file and pipe that directly into a new Datasette process, which then runs a web server with the UI while simultaneously continuing to load new entries from that log into the in-memory SQLite database that it is serving... Not quite sure what that CLI interface would look like. Maybe treat that as a future stretch goal for the moment.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",775666296,"""datasette insert"" command and plugin hook", https://github.com/simonw/datasette/issues/1160#issuecomment-751943837,https://api.github.com/repos/simonw/datasette/issues/1160,751943837,MDEyOklzc3VlQ29tbWVudDc1MTk0MzgzNw==,9599,simonw,2020-12-29T04:40:30Z,2020-12-29T04:40:30Z,OWNER,"The `insert` command should also accept URLs - anything starting with `http://` or `https://`. It should accept more than one file name at a time for bulk inserts. If using a URL, that URL will be passed to the method that decides if a plugin implementation can handle the import or not. This will allow plugins to register themselves for specific websites.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",775666296,"""datasette insert"" command and plugin hook", https://github.com/simonw/datasette/issues/1160#issuecomment-751926437,https://api.github.com/repos/simonw/datasette/issues/1160,751926437,MDEyOklzc3VlQ29tbWVudDc1MTkyNjQzNw==,9599,simonw,2020-12-29T02:43:21Z,2020-12-29T02:43:37Z,OWNER,"Default formats to support: - CSV - TSV - JSON and newline-delimited JSON - YAML Each of these will be implemented as a default plugin.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",775666296,"""datasette insert"" command and plugin hook", https://github.com/simonw/datasette/issues/1160#issuecomment-751926218,https://api.github.com/repos/simonw/datasette/issues/1160,751926218,MDEyOklzc3VlQ29tbWVudDc1MTkyNjIxOA==,9599,simonw,2020-12-29T02:41:57Z,2020-12-29T02:41:57Z,OWNER,"Other names I considered: - `datasette load` - `datasette import` - I decided to keep this name available for any future work that might involve plugins that help import data from APIs as opposed to inserting it from files","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",775666296,"""datasette insert"" command and plugin hook", https://github.com/simonw/datasette/issues/1160#issuecomment-751926095,https://api.github.com/repos/simonw/datasette/issues/1160,751926095,MDEyOklzc3VlQ29tbWVudDc1MTkyNjA5NQ==,9599,simonw,2020-12-29T02:41:15Z,2020-12-29T02:41:15Z,OWNER,"The UI can live at `/-/insert` and be available by default to the `root` user only. 
It can offer the following: - Upload a file and have the import type detected (equivalent to `datasette insert data.db thatfile.csv`) - Copy and paste the data to be inserted into a textarea - API equivalents of these","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",775666296,"""datasette insert"" command and plugin hook", https://github.com/simonw/datasette/issues/675#issuecomment-751826749,https://api.github.com/repos/simonw/datasette/issues/675,751826749,MDEyOklzc3VlQ29tbWVudDc1MTgyNjc0OQ==,9599,simonw,2020-12-28T18:49:21Z,2020-12-28T18:49:21Z,OWNER,"That `--exec` could help solve all sorts of other problems too, like needing to `apt-get install` extra packages or download files from somewhere using `wget`.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",567902704,--cp option for datasette publish and datasette package for shipping additional files and directories, https://github.com/simonw/datasette/issues/675#issuecomment-751826621,https://api.github.com/repos/simonw/datasette/issues/675,751826621,MDEyOklzc3VlQ29tbWVudDc1MTgyNjYyMQ==,9599,simonw,2020-12-28T18:48:51Z,2020-12-28T18:48:51Z,OWNER,"I could make `--include` work if I also had a mechanism for running some shell commands inside the container at the end of the build - which users could then use to move files into the correct place. datasette publish cloudrun my.db --include src/ --exec 'mv /app/src/config.yml /etc/conf/config.yml'","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",567902704,--cp option for datasette publish and datasette package for shipping additional files and directories, https://github.com/simonw/datasette/issues/417#issuecomment-751504136,https://api.github.com/repos/simonw/datasette/issues/417,751504136,MDEyOklzc3VlQ29tbWVudDc1MTUwNDEzNg==,212369,drewda,2020-12-27T19:02:06Z,2020-12-27T19:02:06Z,NONE,"Very much looking forward to seeing this functionality come together. This is probably out-of-scope for an initial release, but in the future it could be useful to also think of how to run this is a container'ized context. For example, an immutable datasette container that points to an S3 bucket of SQLite DBs or CSVs. Or an immutable datasette container pointing to a NFS volume elsewhere on a Kubernetes cluster.","{""total_count"": 2, ""+1"": 2, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",421546944,Datasette Library, https://github.com/simonw/datasette/issues/1150#issuecomment-751476406,https://api.github.com/repos/simonw/datasette/issues/1150,751476406,MDEyOklzc3VlQ29tbWVudDc1MTQ3NjQwNg==,18221871,noklam,2020-12-27T14:51:39Z,2020-12-27T14:51:39Z,NONE,"I like the idea of _internal, it's a nice way to get a data catalog quickly. 
I wonder if this trick applies to db other than SQLite.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",770436876,Maintain an in-memory SQLite table of connected databases and their tables, https://github.com/dogsheep/github-to-sqlite/pull/59#issuecomment-751375487,https://api.github.com/repos/dogsheep/github-to-sqlite/issues/59,751375487,MDEyOklzc3VlQ29tbWVudDc1MTM3NTQ4Nw==,631242,frosencrantz,2020-12-26T17:08:44Z,2020-12-26T17:08:44Z,CONTRIBUTOR,"Hi @simonw, do I need to do anything else for this PR to be considered to be included? I've tried using this project and it is quite nice to be able to explore a repository, but noticed that a couple commands don't allow you to use authorization from the environment variable.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",771872303,Remove unneeded exists=True for -a/--auth flag., https://github.com/simonw/datasette/issues/417#issuecomment-751127485,https://api.github.com/repos/simonw/datasette/issues/417,751127485,MDEyOklzc3VlQ29tbWVudDc1MTEyNzQ4NQ==,9599,simonw,2020-12-24T22:58:05Z,2020-12-24T22:58:05Z,OWNER,"That's a great idea. I'd ruled that out because working with the different operating system versions of those is tricky, but if `watchdog` can handle those differences for me this could be a really good option.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",421546944,Datasette Library, https://github.com/simonw/datasette/issues/417#issuecomment-751127384,https://api.github.com/repos/simonw/datasette/issues/417,751127384,MDEyOklzc3VlQ29tbWVudDc1MTEyNzM4NA==,1279360,dyllan-to-you,2020-12-24T22:56:48Z,2020-12-24T22:56:48Z,NONE,"Instead of scanning the directory every 10s, have you considered listening for the native system events to notify you of updates? I think python has a nice module to do this for you called [watchdog](https://pypi.org/project/watchdog/)","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",421546944,Datasette Library, https://github.com/dogsheep/dogsheep-photos/issues/28#issuecomment-751125270,https://api.github.com/repos/dogsheep/dogsheep-photos/issues/28,751125270,MDEyOklzc3VlQ29tbWVudDc1MTEyNTI3MA==,129786,jmelloy,2020-12-24T22:26:22Z,2020-12-24T22:26:22Z,NONE,This comes around if you’ve run the photo export without running an s3 upload. ,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",624490929,Invalid SQL no such table: main.uploads, https://github.com/simonw/datasette/pull/1159#issuecomment-750849460,https://api.github.com/repos/simonw/datasette/issues/1159,750849460,MDEyOklzc3VlQ29tbWVudDc1MDg0OTQ2MA==,22429695,codecov[bot],2020-12-24T11:07:35Z,2020-12-24T11:29:21Z,NONE,"# [Codecov](https://codecov.io/gh/simonw/datasette/pull/1159?src=pr&el=h1) Report > Merging [#1159](https://codecov.io/gh/simonw/datasette/pull/1159?src=pr&el=desc) (c820abd) into [main](https://codecov.io/gh/simonw/datasette/commit/a882d679626438ba0d809944f06f239bcba8ee96?el=desc) (a882d67) will **not change** coverage. > The diff coverage is `n/a`. 
[](https://codecov.io/gh/simonw/datasette/pull/1159?src=pr&el=tree) ```diff @@ Coverage Diff @@ ## main #1159 +/- ## ======================================= Coverage 91.55% 91.55% ======================================= Files 32 32 Lines 3930 3930 ======================================= Hits 3598 3598 Misses 332 332 ``` ------ [Continue to review full report at Codecov](https://codecov.io/gh/simonw/datasette/pull/1159?src=pr&el=continue). > **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta) > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data` > Powered by [Codecov](https://codecov.io/gh/simonw/datasette/pull/1159?src=pr&el=footer). Last update [a882d67...c820abd](https://codecov.io/gh/simonw/datasette/pull/1159?src=pr&el=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments). ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",774332247,Improve the display of facets information, https://github.com/simonw/datasette/pull/1158#issuecomment-750390741,https://api.github.com/repos/simonw/datasette/issues/1158,750390741,MDEyOklzc3VlQ29tbWVudDc1MDM5MDc0MQ==,9599,simonw,2020-12-23T17:05:32Z,2020-12-23T17:05:32Z,OWNER,"Thanks for this! I'm fine keeping the `os.path` stuff as is.","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",773913793,Modernize code to Python 3.6+, https://github.com/simonw/datasette/pull/1158#issuecomment-750389683,https://api.github.com/repos/simonw/datasette/issues/1158,750389683,MDEyOklzc3VlQ29tbWVudDc1MDM4OTY4Mw==,6774676,eumiro,2020-12-23T17:02:50Z,2020-12-23T17:02:50Z,CONTRIBUTOR,"The dict/set suggestion comes from `pyupgrade --py36-plus`, but then had to `black` the change. The rest comes from PyCharm's Inspect code function. I reviewed all the suggestions and fixed a thing or two, such as leading/trailing spaces in the docstrings or turned around the chained conditions. Then I tried to convert all `os.path/glob/open` to `Path`, but there were some local test issues, so I'll have to start over in smaller chunks if you want to have that too.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",773913793,Modernize code to Python 3.6+, https://github.com/simonw/datasette/pull/1158#issuecomment-750373603,https://api.github.com/repos/simonw/datasette/issues/1158,750373603,MDEyOklzc3VlQ29tbWVudDc1MDM3MzYwMw==,9599,simonw,2020-12-23T16:26:21Z,2020-12-23T16:26:21Z,OWNER,"Did you use a tool to find these or are they all from manual code inspection? They look good - I particularly like the set/dict literals.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",773913793,Modernize code to Python 3.6+, https://github.com/simonw/datasette/pull/1158#issuecomment-750373496,https://api.github.com/repos/simonw/datasette/issues/1158,750373496,MDEyOklzc3VlQ29tbWVudDc1MDM3MzQ5Ng==,22429695,codecov[bot],2020-12-23T16:26:06Z,2020-12-23T16:26:06Z,NONE,"# [Codecov](https://codecov.io/gh/simonw/datasette/pull/1158?src=pr&el=h1) Report > Merging [#1158](https://codecov.io/gh/simonw/datasette/pull/1158?src=pr&el=desc) (37ce72f) into [main](https://codecov.io/gh/simonw/datasette/commit/90eba4c3ca569c57e96bce314e7ac8caf67d884e?el=desc) (90eba4c) will **not change** coverage. 
> The diff coverage is `87.50%`. [](https://codecov.io/gh/simonw/datasette/pull/1158?src=pr&el=tree) ```diff @@ Coverage Diff @@ ## main #1158 +/- ## ======================================= Coverage 91.55% 91.55% ======================================= Files 32 32 Lines 3930 3930 ======================================= Hits 3598 3598 Misses 332 332 ``` | [Impacted Files](https://codecov.io/gh/simonw/datasette/pull/1158?src=pr&el=tree) | Coverage Δ | | |---|---|---| | [datasette/cli.py](https://codecov.io/gh/simonw/datasette/pull/1158/diff?src=pr&el=tree#diff-ZGF0YXNldHRlL2NsaS5weQ==) | `77.41% <ø> (ø)` | | | [datasette/facets.py](https://codecov.io/gh/simonw/datasette/pull/1158/diff?src=pr&el=tree#diff-ZGF0YXNldHRlL2ZhY2V0cy5weQ==) | `89.04% <ø> (ø)` | | | [datasette/filters.py](https://codecov.io/gh/simonw/datasette/pull/1158/diff?src=pr&el=tree#diff-ZGF0YXNldHRlL2ZpbHRlcnMucHk=) | `94.35% <ø> (ø)` | | | [datasette/hookspecs.py](https://codecov.io/gh/simonw/datasette/pull/1158/diff?src=pr&el=tree#diff-ZGF0YXNldHRlL2hvb2tzcGVjcy5weQ==) | `100.00% <ø> (ø)` | | | [datasette/inspect.py](https://codecov.io/gh/simonw/datasette/pull/1158/diff?src=pr&el=tree#diff-ZGF0YXNldHRlL2luc3BlY3QucHk=) | `36.11% <ø> (ø)` | | | [datasette/renderer.py](https://codecov.io/gh/simonw/datasette/pull/1158/diff?src=pr&el=tree#diff-ZGF0YXNldHRlL3JlbmRlcmVyLnB5) | `94.02% <ø> (ø)` | | | [datasette/views/base.py](https://codecov.io/gh/simonw/datasette/pull/1158/diff?src=pr&el=tree#diff-ZGF0YXNldHRlL3ZpZXdzL2Jhc2UucHk=) | `95.01% <50.00%> (ø)` | | | [datasette/app.py](https://codecov.io/gh/simonw/datasette/pull/1158/diff?src=pr&el=tree#diff-ZGF0YXNldHRlL2FwcC5weQ==) | `95.85% <100.00%> (ø)` | | | [datasette/utils/\_\_init\_\_.py](https://codecov.io/gh/simonw/datasette/pull/1158/diff?src=pr&el=tree#diff-ZGF0YXNldHRlL3V0aWxzL19faW5pdF9fLnB5) | `94.11% <100.00%> (ø)` | | | [datasette/utils/asgi.py](https://codecov.io/gh/simonw/datasette/pull/1158/diff?src=pr&el=tree#diff-ZGF0YXNldHRlL3V0aWxzL2FzZ2kucHk=) | `92.13% <100.00%> (ø)` | | | ... and [1 more](https://codecov.io/gh/simonw/datasette/pull/1158/diff?src=pr&el=tree-more) | | ------ [Continue to review full report at Codecov](https://codecov.io/gh/simonw/datasette/pull/1158?src=pr&el=continue). > **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta) > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data` > Powered by [Codecov](https://codecov.io/gh/simonw/datasette/pull/1158?src=pr&el=footer). Last update [90eba4c...37ce72f](https://codecov.io/gh/simonw/datasette/pull/1158?src=pr&el=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments). 
","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",773913793,Modernize code to Python 3.6+, https://github.com/simonw/datasette/issues/1099#issuecomment-749845797,https://api.github.com/repos/simonw/datasette/issues/1099,749845797,MDEyOklzc3VlQ29tbWVudDc0OTg0NTc5Nw==,9599,simonw,2020-12-23T00:13:29Z,2020-12-23T00:14:25Z,OWNER,"Also need to solve displaying these links in the opposite direction: https://latest.datasette.io/_internal/tables/fixtures,facet_cities <img width=""617"" alt=""_internal__tables"" src=""https://user-images.githubusercontent.com/9599/102944742-84cd3900-4470-11eb-95a0-bfaf00b17e82.png""> That page should link to lists of records in columns, foreign_keys and indexes - like this example: https://latest.datasette.io/fixtures/roadside_attractions/1 <img width=""712"" alt=""fixtures__roadside_attractions"" src=""https://user-images.githubusercontent.com/9599/102944816-c1009980-4470-11eb-9dbd-50d0ba4fd137.png""> ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",743371103,Support linking to compound foreign keys, https://github.com/simonw/datasette/issues/1152#issuecomment-748206874,https://api.github.com/repos/simonw/datasette/issues/1152,748206874,MDEyOklzc3VlQ29tbWVudDc0ODIwNjg3NA==,9599,simonw,2020-12-18T17:03:00Z,2020-12-22T23:58:04Z,OWNER,"Another permissions thought: what if ALL Datasette permissions were default-deny, and plugins could only grant permission to things, not block permission? Right now a plugin can reply `False` to block, `True` to allow or `None` for ""I have no opinion on this, ask someone else"" - but even I'm confused by the interactions between block and allow and I implemented the system! If everything in Datasette was default-deny then the user could use `--public-view` as an option when starting the server to default-allow view actions. More importantly: plugins could return SQL statements that select a list of databases/tables the user is allowed access to. 
These could then be combined with `UNION` to create a full list of available resources.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",770598024,Efficiently calculate list of databases/tables a user can view, https://github.com/simonw/datasette/issues/1152#issuecomment-747920515,https://api.github.com/repos/simonw/datasette/issues/1152,747920515,MDEyOklzc3VlQ29tbWVudDc0NzkyMDUxNQ==,9599,simonw,2020-12-18T07:29:21Z,2020-12-22T23:57:29Z,OWNER,Could I solve this using a configured canned query against the `_internal` tables with the actor's properties as inputs?,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",770598024,Efficiently calculate list of databases/tables a user can view, https://github.com/simonw/datasette/issues/1099#issuecomment-749771231,https://api.github.com/repos/simonw/datasette/issues/1099,749771231,MDEyOklzc3VlQ29tbWVudDc0OTc3MTIzMQ==,9599,simonw,2020-12-22T20:54:25Z,2020-12-22T20:54:25Z,OWNER,"https://latest.datasette.io/_internal/foreign_keys (use https://latest.datasette.io/login-as-root first) is now a compound foreign key table: ```sql CREATE TABLE foreign_keys ( ""database_name"" TEXT, ""table_name"" TEXT, ""id"" INTEGER, ""seq"" INTEGER, ""table"" TEXT, ""from"" TEXT, ""to"" TEXT, ""on_update"" TEXT, ""on_delete"" TEXT, ""match"" TEXT, PRIMARY KEY (database_name, table_name, id, seq), FOREIGN KEY (database_name) REFERENCES databases(database_name), FOREIGN KEY (database_name, table_name) REFERENCES tables(database_name, table_name) ); ``` Currently the `database_name` column becomes a link (because it's a single foreign key) but the `table_name` one remains a non-link: <img width=""1079"" alt=""_internal__foreign_keys__24_rows"" src=""https://user-images.githubusercontent.com/9599/102932110-87ba3080-4454-11eb-9ad5-b70a65129588.png""> My original idea for compound foreign keys was to turn both of those columns into links, but that doesn't fit here because `database_name` is already part of a different foreign key.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",743371103,Support linking to compound foreign keys, https://github.com/simonw/datasette/issues/1152#issuecomment-749750995,https://api.github.com/repos/simonw/datasette/issues/1152,749750995,MDEyOklzc3VlQ29tbWVudDc0OTc1MDk5NQ==,9599,simonw,2020-12-22T20:05:30Z,2020-12-22T20:05:30Z,OWNER,"#1150 is landed now, which means there's a new, hidden `_internal` SQLite in-memory database containing all of the tables and databases.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",770598024,Efficiently calculate list of databases/tables a user can view, https://github.com/simonw/datasette/issues/509#issuecomment-749749948,https://api.github.com/repos/simonw/datasette/issues/509,749749948,MDEyOklzc3VlQ29tbWVudDc0OTc0OTk0OA==,9599,simonw,2020-12-22T20:03:10Z,2020-12-22T20:03:10Z,OWNER,"If you open multiple files with the same filename, e.g. 
like this: datasette fixtures.db templates/fixtures.db plugins/fixtures.db You'll now get this: <img width=""702"" alt=""Datasette__fixtures__fixtures_2__fixtures_3"" src=""https://user-images.githubusercontent.com/9599/102928494-a4069f00-444d-11eb-9c1e-382a4e27bb51.png""> ","{""total_count"": 1, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 1, ""rocket"": 0, ""eyes"": 0}",456568880,Support opening multiple databases with the same stem, https://github.com/simonw/datasette/issues/509#issuecomment-749738241,https://api.github.com/repos/simonw/datasette/issues/509,749738241,MDEyOklzc3VlQ29tbWVudDc0OTczODI0MQ==,9599,simonw,2020-12-22T19:38:14Z,2020-12-22T19:38:14Z,OWNER,I'm fixing this in #1155.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",456568880,Support opening multiple databases with the same stem, https://github.com/simonw/datasette/issues/1155#issuecomment-749723557,https://api.github.com/repos/simonw/datasette/issues/1155,749723557,MDEyOklzc3VlQ29tbWVudDc0OTcyMzU1Nw==,9599,simonw,2020-12-22T19:08:27Z,2020-12-22T19:08:27Z,OWNER,"I'm going to have the `.add_database()` method select the name used in the path, de-duping against any existing names. It will then set database.name to that so that the database has access to its own name.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",771216293,Better internal database_name for _internal database, https://github.com/simonw/datasette/issues/1155#issuecomment-748356492,https://api.github.com/repos/simonw/datasette/issues/1155,748356492,MDEyOklzc3VlQ29tbWVudDc0ODM1NjQ5Mg==,9599,simonw,2020-12-18T22:49:32Z,2020-12-22T01:13:05Z,OWNER,"There's some messy code that needs fixing here. The `datasette.databases` dictionary right now has a key that corresponds to the `/_internal` URL in the path, and a value that's a `Database()` object. BUT... the `Database()` object doesn't know what its key is. 
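A toy illustration of that de-duping idea (not the actual `.add_database()` implementation): derive the name from the file stem and append `_2`, `_3` and so on when there is a collision, which is what produces the `fixtures`, `fixtures_2`, `fixtures_3` names shown above.

```python
from pathlib import Path


def unique_name(path, existing):
    # Derive a database name from the file stem, de-duping against existing names
    stem = Path(path).stem
    name, suffix = stem, 2
    while name in existing:
        name = "{}_{}".format(stem, suffix)
        suffix += 1
    return name


databases = {}
for path in ("fixtures.db", "templates/fixtures.db", "plugins/fixtures.db"):
    databases[unique_name(path, databases)] = path

print(list(databases))  # ['fixtures', 'fixtures_2', 'fixtures_3']
```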
While fixing this I should fix the issue where Datasette gets confused by multiple databases with the same stem: https://github.com/simonw/datasette/issues/509","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",771216293,Better internal database_name for _internal database, https://github.com/simonw/datasette/issues/1157#issuecomment-749179460,https://api.github.com/repos/simonw/datasette/issues/1157,749179460,MDEyOklzc3VlQ29tbWVudDc0OTE3OTQ2MA==,9599,simonw,2020-12-21T20:24:19Z,2020-12-21T20:24:19Z,OWNER,"Three places to fix: https://github.com/simonw/datasette/blob/dcdfb2c301341d45b66683e3e3be72f9c7585b2f/datasette/tracer.py#L40-L42 https://github.com/simonw/datasette/blob/dcdfb2c301341d45b66683e3e3be72f9c7585b2f/datasette/utils/__init__.py#L139-L152 https://github.com/simonw/datasette/blob/dcdfb2c301341d45b66683e3e3be72f9c7585b2f/datasette/views/base.py#L460-L461","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",772438273,Use time.perf_counter() instead of time.time() to measure performance, https://github.com/simonw/datasette/issues/1155#issuecomment-749176936,https://api.github.com/repos/simonw/datasette/issues/1155,749176936,MDEyOklzc3VlQ29tbWVudDc0OTE3NjkzNg==,9599,simonw,2020-12-21T20:18:15Z,2020-12-21T20:18:15Z,OWNER,"Fun query: ```sql select table_name, group_concat(name, ', ') from columns group by database_name, table_name ``` https://latest.datasette.io/_internal?sql=select+table_name%2C+group_concat%28name%2C+%27%2C+%27%29+from+columns+group+by+database_name%2C+table_name","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",771216293,Better internal database_name for _internal database, https://github.com/simonw/datasette/issues/1155#issuecomment-749170608,https://api.github.com/repos/simonw/datasette/issues/1155,749170608,MDEyOklzc3VlQ29tbWVudDc0OTE3MDYwOA==,9599,simonw,2020-12-21T20:01:47Z,2020-12-21T20:01:47Z,OWNER,I removed that `MEMORY` object() in dcdfb2c301341d45b66683e3e3be72f9c7585b2f,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",771216293,Better internal database_name for _internal database, https://github.com/simonw/datasette/issues/1156#issuecomment-749158111,https://api.github.com/repos/simonw/datasette/issues/1156,749158111,MDEyOklzc3VlQ29tbWVudDc0OTE1ODExMQ==,9599,simonw,2020-12-21T19:33:45Z,2020-12-21T19:33:45Z,OWNER,"One reason for this change: it means I can use that database for more stuff. 
I've been thinking about moving metadata storage there for example, which fits a database called `_internal` but not one called `_schemas`.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",772408750,Rename _schemas to _internal, https://github.com/dogsheep/dogsheep-photos/pull/31#issuecomment-748562330,https://api.github.com/repos/dogsheep/dogsheep-photos/issues/31,748562330,MDEyOklzc3VlQ29tbWVudDc0ODU2MjMzMA==,41546558,RhetTbull,2020-12-20T04:45:08Z,2020-12-20T04:45:08Z,CONTRIBUTOR,Fixes the issue mentioned here: https://github.com/dogsheep/dogsheep-photos/issues/15#issuecomment-748436115,"{""total_count"": 1, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 1, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",771511344,Update for Big Sur, https://github.com/dogsheep/dogsheep-photos/issues/15#issuecomment-748562288,https://api.github.com/repos/dogsheep/dogsheep-photos/issues/15,748562288,MDEyOklzc3VlQ29tbWVudDc0ODU2MjI4OA==,41546558,RhetTbull,2020-12-20T04:44:22Z,2020-12-20T04:44:22Z,CONTRIBUTOR,"@nickvazz @simonw I opened a [PR](https://github.com/dogsheep/dogsheep-photos/pull/31) that replaces the SQL for `ZCOMPUTEDASSETATTRIBUTES` to use osxphotos which now exposes all this data and has been updated for Big Sur. I did regression tests to confirm the extracted data is identical, with one exception which should not affect operation: the old code pulled data from `ZCOMPUTEDASSETATTRIBUTES` for missing photos while the main loop ignores missing photos and does not add them to `apple_photos`. The new code does not add rows to the `apple_photos_scores` table for missing photos.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",612151767,Expose scores from ZCOMPUTEDASSETATTRIBUTES, https://github.com/dogsheep/dogsheep-photos/issues/15#issuecomment-748436779,https://api.github.com/repos/dogsheep/dogsheep-photos/issues/15,748436779,MDEyOklzc3VlQ29tbWVudDc0ODQzNjc3OQ==,41546558,RhetTbull,2020-12-19T07:49:00Z,2020-12-19T07:49:00Z,CONTRIBUTOR,@nickvazz ZGENERICASSET changed to ZASSET in Big Sur. Here's a list of other changes to the schema in Big Sur: https://github.com/RhetTbull/osxphotos/wiki/Changes-in-Photos-6---Big-Sur,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",612151767,Expose scores from ZCOMPUTEDASSETATTRIBUTES, https://github.com/dogsheep/dogsheep-photos/issues/15#issuecomment-748436115,https://api.github.com/repos/dogsheep/dogsheep-photos/issues/15,748436115,MDEyOklzc3VlQ29tbWVudDc0ODQzNjExNQ==,8573886,nickvazz,2020-12-19T07:43:38Z,2020-12-19T07:47:36Z,NONE,"Hey Simon! I really enjoy datasette so far, just started trying it out today following your iPhone photos [example](https://simonwillison.net/2020/May/21/dogsheep-photos/). I am not sure if you had run into this or not, but it seems like they might have changed one of the column names from `ZGENERICASSET` to `ZASSET`. Should I open a PR? 
Would change: - [here](https://github.com/dogsheep/dogsheep-photos/blob/master/dogsheep_photos/cli.py#L209-L213) - [here](https://github.com/dogsheep/dogsheep-photos/blob/master/dogsheep_photos/cli.py#L238) - [here](https://github.com/dogsheep/dogsheep-photos/blob/master/dogsheep_photos/cli.py#L240)","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",612151767,Expose scores from ZCOMPUTEDASSETATTRIBUTES, https://github.com/dogsheep/twitter-to-sqlite/issues/53#issuecomment-748436453,https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/53,748436453,MDEyOklzc3VlQ29tbWVudDc0ODQzNjQ1Mw==,27,anotherjesse,2020-12-19T07:47:01Z,2020-12-19T07:47:01Z,NONE,"I think this should probably be closed as won't fix. Attempting to make a patch for this I realized that the since_id would limit to tweets posted since that since_id, not when it was favorited. So favoriting something older would be missed if you used `--since` with a cron script. Better to just use `--stop_after` set to a couple hundred","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",771324837,--since support for favorites, https://github.com/dogsheep/dogsheep-photos/issues/21#issuecomment-748436195,https://api.github.com/repos/dogsheep/dogsheep-photos/issues/21,748436195,MDEyOklzc3VlQ29tbWVudDc0ODQzNjE5NQ==,8573886,nickvazz,2020-12-19T07:44:32Z,2020-12-19T07:44:49Z,NONE,"I have also run into this a bit, would it be possible to post your `requirements.txt` so I can try and reproduce your [blog post](https://simonwillison.net/2020/May/21/dogsheep-photos/)?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",615474990,bpylist.archiver.CircularReference: archive has a cycle with uid(13), https://github.com/dogsheep/dogsheep-beta/issues/31#issuecomment-748426877,https://api.github.com/repos/dogsheep/dogsheep-beta/issues/31,748426877,MDEyOklzc3VlQ29tbWVudDc0ODQyNjg3Nw==,9599,simonw,2020-12-19T06:16:11Z,2020-12-19T06:16:11Z,MEMBER,"Here's why: if ""fts5"" in str(e): But the error being raised here is: sqlite3.OperationalError: no such column: to I'm going to attempt the escaped query on every error.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",771316301,"Searching for ""github-to-sqlite"" throws an error", https://github.com/dogsheep/dogsheep-beta/issues/31#issuecomment-748426663,https://api.github.com/repos/dogsheep/dogsheep-beta/issues/31,748426663,MDEyOklzc3VlQ29tbWVudDc0ODQyNjY2Mw==,9599,simonw,2020-12-19T06:14:06Z,2020-12-19T06:14:06Z,MEMBER,Looks like I already do that here: https://github.com/dogsheep/dogsheep-beta/blob/9ba4401017ac24ffa3bc1db38e0910ea49de7616/dogsheep_beta/__init__.py#L141-L146,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",771316301,"Searching for ""github-to-sqlite"" throws an error", https://github.com/dogsheep/dogsheep-beta/issues/31#issuecomment-748426581,https://api.github.com/repos/dogsheep/dogsheep-beta/issues/31,748426581,MDEyOklzc3VlQ29tbWVudDc0ODQyNjU4MQ==,9599,simonw,2020-12-19T06:13:17Z,2020-12-19T06:13:17Z,MEMBER,"One fix for this could be to try running the raw query, but if it throws an error run it again with the query escaped.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, 
""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",771316301,"Searching for ""github-to-sqlite"" throws an error", https://github.com/dogsheep/dogsheep-beta/issues/31#issuecomment-748426501,https://api.github.com/repos/dogsheep/dogsheep-beta/issues/31,748426501,MDEyOklzc3VlQ29tbWVudDc0ODQyNjUwMQ==,9599,simonw,2020-12-19T06:12:22Z,2020-12-19T06:12:22Z,MEMBER,I deliberately added support for advanced FTS in https://github.com/dogsheep/dogsheep-beta/commit/cbb2491b85d7ff416d6d429b60109e6c2d6d50b9 for #13 but that's the cause of this bug.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",771316301,"Searching for ""github-to-sqlite"" throws an error", https://github.com/simonw/datasette/issues/1155#issuecomment-748397998,https://api.github.com/repos/simonw/datasette/issues/1155,748397998,MDEyOklzc3VlQ29tbWVudDc0ODM5Nzk5OA==,9599,simonw,2020-12-19T01:28:18Z,2020-12-19T01:28:18Z,OWNER,"`datasette-graphql` returns an error due to this issue: <img width=""878"" alt=""GraphiQL_and_fixtures__compound_three_primary_keys__1_001_rows"" src=""https://user-images.githubusercontent.com/9599/102677257-61e01380-4156-11eb-8f88-fe26b9c4a40b.png""> On the console: ``` INFO: 127.0.0.1:63116 - ""POST /graphql/_internal HTTP/1.1"" 500 Internal Server Error Traceback (most recent call last): File ""/Users/simon/.local/share/virtualenvs/datasette-AWNrQs95/lib/python3.8/site-packages/promise/promise.py"", line 844, in handle_future_result resolve(future.result()) File ""/Users/simon/.local/share/virtualenvs/datasette-AWNrQs95/lib/python3.8/site-packages/datasette_graphql/utils.py"", line 603, in resolve_table data, _, _ = await view.data( File ""/Users/simon/Dropbox/Development/datasette/datasette/views/table.py"", line 304, in data db = self.ds.databases[database] graphql.error.located_error.GraphQLLocatedError: ':memory:c6dd5abe1a757a7de00d99b699175bd33d9a575f05b5751bf856b8656fb07edd' ``` ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",771216293,Better internal database_name for _internal database, https://github.com/simonw/datasette/issues/1155#issuecomment-748368660,https://api.github.com/repos/simonw/datasette/issues/1155,748368660,MDEyOklzc3VlQ29tbWVudDc0ODM2ODY2MA==,9599,simonw,2020-12-18T23:18:04Z,2020-12-19T01:12:00Z,OWNER,"A `Database` should have a `.name` which is unique across the Datasette instance and is used in the URL. The `path` should be optional, only set for file databases. 
A new `.memory_name` property can be used for shared memory databases.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",771216293,Better internal database_name for _internal database, https://github.com/simonw/datasette/issues/1155#issuecomment-748368938,https://api.github.com/repos/simonw/datasette/issues/1155,748368938,MDEyOklzc3VlQ29tbWVudDc0ODM2ODkzOA==,9599,simonw,2020-12-18T23:19:04Z,2020-12-18T23:19:04Z,OWNER,`Database` internal class is documented here: https://docs.datasette.io/en/latest/internals.html#database-class,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",771216293,Better internal database_name for _internal database, https://github.com/simonw/datasette/issues/1155#issuecomment-748368384,https://api.github.com/repos/simonw/datasette/issues/1155,748368384,MDEyOklzc3VlQ29tbWVudDc0ODM2ODM4NA==,9599,simonw,2020-12-18T23:17:00Z,2020-12-18T23:17:00Z,OWNER,Here's the commit where I added it. https://github.com/simonw/datasette/commit/9743e1d91b5f0a2b3c1c0bd6ffce8739341f43c4 - I didn't yet have the `.add_database()` mechanism. Today the `MEMORY` object bit is no longer needed.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",771216293,Better internal database_name for _internal database, https://github.com/simonw/datasette/issues/1155#issuecomment-748367922,https://api.github.com/repos/simonw/datasette/issues/1155,748367922,MDEyOklzc3VlQ29tbWVudDc0ODM2NzkyMg==,9599,simonw,2020-12-18T23:15:24Z,2020-12-18T23:15:24Z,OWNER,"The code for building up that `.databases` dictionary is a bit convoluted. Here's the code that adds a `:memory:` database if the user specified `--memory` OR if there are no files to be attached: https://github.com/simonw/datasette/blob/ebc7aa287c99fe6114b79aeab8efb8d4489a6182/datasette/app.py#L221-L241 I'm not sure why I wrote it this way, instead of just calling `.add_database("":memory:"", Database(..., is_memory=True)`.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",771216293,Better internal database_name for _internal database, https://github.com/simonw/datasette/issues/509#issuecomment-748356637,https://api.github.com/repos/simonw/datasette/issues/509,748356637,MDEyOklzc3VlQ29tbWVudDc0ODM1NjYzNw==,9599,simonw,2020-12-18T22:50:03Z,2020-12-18T22:50:03Z,OWNER,Related problem caused by the new `_schemas` database - if a user attempts to open their own `_schemas.db` file it will fail. 
I'd like to open and mount that as `/_schemas_` instead.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",456568880,Support opening multiple databases with the same stem, https://github.com/simonw/datasette/issues/1150#issuecomment-748354841,https://api.github.com/repos/simonw/datasette/issues/1150,748354841,MDEyOklzc3VlQ29tbWVudDc0ODM1NDg0MQ==,9599,simonw,2020-12-18T22:43:49Z,2020-12-18T22:43:49Z,OWNER,"For a demo, visit https://latest.datasette.io/login-as-root and then hit https://latest.datasette.io/_schemas","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",770436876,Maintain an in-memory SQLite table of connected databases and their tables, https://github.com/simonw/datasette/issues/1150#issuecomment-748352106,https://api.github.com/repos/simonw/datasette/issues/1150,748352106,MDEyOklzc3VlQ29tbWVudDc0ODM1MjEwNg==,9599,simonw,2020-12-18T22:34:40Z,2020-12-18T22:34:40Z,OWNER,"Needs documentation, but I can wait to write that until I've tested out the feature a bit more.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",770436876,Maintain an in-memory SQLite table of connected databases and their tables, https://github.com/simonw/datasette/issues/1150#issuecomment-748351350,https://api.github.com/repos/simonw/datasette/issues/1150,748351350,MDEyOklzc3VlQ29tbWVudDc0ODM1MTM1MA==,9599,simonw,2020-12-18T22:32:13Z,2020-12-18T22:32:13Z,OWNER,Getting all the tests to pass is tricky because this adds a whole extra database to Datasette - and there's various code that loops through `ds.databases` as part of the tests.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",770436876,Maintain an in-memory SQLite table of connected databases and their tables, https://github.com/simonw/datasette/issues/493#issuecomment-748305976,https://api.github.com/repos/simonw/datasette/issues/493,748305976,MDEyOklzc3VlQ29tbWVudDc0ODMwNTk3Ng==,50527,jefftriplett,2020-12-18T20:34:39Z,2020-12-18T20:34:39Z,CONTRIBUTOR,"I can't keep up with the renaming contexts, but I like having the ability to run datasette + datasette-ripgrep against different configs: ```shell datasette serve --metadata=./metadata.json ``` I have one for all of my code and one per client who has lots of code. So as long as I can point datasette to something, it's easy to work with. 
","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",449886319,Rename metadata.json to config.json, https://github.com/simonw/datasette/issues/1150#issuecomment-748260875,https://api.github.com/repos/simonw/datasette/issues/1150,748260875,MDEyOklzc3VlQ29tbWVudDc0ODI2MDg3NQ==,9599,simonw,2020-12-18T18:55:12Z,2020-12-18T18:55:12Z,OWNER,"I'm going to move the code into a `utils/schemas.py` module, to avoid further extending the `Datasette` class definition and to make it more easily testable.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",770436876,Maintain an in-memory SQLite table of connected databases and their tables, https://github.com/simonw/datasette/issues/1150#issuecomment-748260118,https://api.github.com/repos/simonw/datasette/issues/1150,748260118,MDEyOklzc3VlQ29tbWVudDc0ODI2MDExOA==,9599,simonw,2020-12-18T18:54:12Z,2020-12-18T18:54:12Z,OWNER,"I'm going to tidy this up and land it. A couple of additional decisions: - The database will be called `/_schemas` - By default it will only be visible to `root` - thus avoiding having to solve the permissions problem with regards to users seeing schemas for tables that are otherwise invisible to them.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",770436876,Maintain an in-memory SQLite table of connected databases and their tables, https://github.com/simonw/datasette/issues/493#issuecomment-747966232,https://api.github.com/repos/simonw/datasette/issues/493,747966232,MDEyOklzc3VlQ29tbWVudDc0Nzk2NjIzMg==,9599,simonw,2020-12-18T09:19:41Z,2020-12-18T09:19:41Z,OWNER,Is there any reason to keep `--setting` rather than moving those items into a `configure.json` file with all the configuration options that currently live in `metadata.json`?,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",449886319,Rename metadata.json to config.json, https://github.com/simonw/datasette/issues/1152#issuecomment-747921195,https://api.github.com/repos/simonw/datasette/issues/1152,747921195,MDEyOklzc3VlQ29tbWVudDc0NzkyMTE5NQ==,9599,simonw,2020-12-18T07:31:25Z,2020-12-18T07:31:25Z,OWNER,It's also a really good fit for the new mechanism that's coming together in #1150.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",770598024,Efficiently calculate list of databases/tables a user can view, https://github.com/simonw/datasette/issues/1152#issuecomment-747920852,https://api.github.com/repos/simonw/datasette/issues/1152,747920852,MDEyOklzc3VlQ29tbWVudDc0NzkyMDg1Mg==,9599,simonw,2020-12-18T07:30:22Z,2020-12-18T07:30:22Z,OWNER,Redefining all Datasette permissions in terms of SQL queries that return the set of databases and tables that the user is allowed to interact with does feel VERY Datasette-y.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",770598024,Efficiently calculate list of databases/tables a user can view, https://github.com/simonw/datasette/issues/1152#issuecomment-747920087,https://api.github.com/repos/simonw/datasette/issues/1152,747920087,MDEyOklzc3VlQ29tbWVudDc0NzkyMDA4Nw==,9599,simonw,2020-12-18T07:27:58Z,2020-12-18T07:28:30Z,OWNER,"I want to keep the 
existing `metadata.json` ""allow"" blocks mechanism working. Note that if you have 1,000 tables and a permissions policy you won't be using ""allow"" blocks, you'll be using a more sophisticated permissions plugin instead.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",770598024,Efficiently calculate list of databases/tables a user can view, https://github.com/simonw/datasette/issues/1152#issuecomment-747919782,https://api.github.com/repos/simonw/datasette/issues/1152,747919782,MDEyOklzc3VlQ29tbWVudDc0NzkxOTc4Mg==,9599,simonw,2020-12-18T07:27:01Z,2020-12-18T07:27:01Z,OWNER,"Perhaps this can be solved by keeping the existing plugin hooks and adding new, optional ones for bulk lookups. If your plugin doesn't implement the bulk lookup hooks Datasette will do an inefficient loop through everything checking permissions on each one. If you DO implement it you can speed things up dramatically. Not sure if this would solve the homepage problem though, where you might need to run 1,000 table permission checks. That's more a case where you want to think in terms of a SQL where clause.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",770598024,Efficiently calculate list of databases/tables a user can view, https://github.com/simonw/datasette/issues/1150#issuecomment-747893704,https://api.github.com/repos/simonw/datasette/issues/1150,747893704,MDEyOklzc3VlQ29tbWVudDc0Nzg5MzcwNA==,9599,simonw,2020-12-18T06:19:13Z,2020-12-18T06:19:13Z,OWNER,I'm not going to block this issue on permissions - I will tackle the efficient bulk permissions problem in #1152.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",770436876,Maintain an in-memory SQLite table of connected databases and their tables, https://github.com/simonw/datasette/issues/1152#issuecomment-747893423,https://api.github.com/repos/simonw/datasette/issues/1152,747893423,MDEyOklzc3VlQ29tbWVudDc0Nzg5MzQyMw==,9599,simonw,2020-12-18T06:18:24Z,2020-12-18T06:18:24Z,OWNER,"What would Datasette's permission hooks look like if they all dealt with sets of items rather than individual items? So plugins could return a set of items that the user has permission to access, or even a WHERE clause?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",770598024,Efficiently calculate list of databases/tables a user can view, https://github.com/simonw/datasette/issues/1152#issuecomment-747892731,https://api.github.com/repos/simonw/datasette/issues/1152,747892731,MDEyOklzc3VlQ29tbWVudDc0Nzg5MjczMQ==,9599,simonw,2020-12-18T06:16:29Z,2020-12-18T06:16:29Z,OWNER,"One enormous advantage I have is that after #1150 I will have a database table full of databases and tables that I can execute queries against. This means I could calculate visible tables using SQL where clauses, which should be easily fast enough even against ten thousand plus tables. The catch is the permissions hooks. 
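Rough sketch of the shape of that idea - the `tables` catalog schema, the column names and the `visible_tables()` helper below are all hypothetical, just to illustrate filtering with a where clause instead of running a separate permission check per table:
```python
import sqlite3

# Hypothetical in-memory catalog - this is not the real schema design,
# it only exists to show the where-clause filtering pattern
catalog = sqlite3.connect(':memory:')
catalog.executescript('''
    create table tables (database_name text, table_name text);
    insert into tables values
        ('fixtures', 'facetable'),
        ('fixtures', 'searchable'),
        ('private', 'secret_stuff');
''')

def visible_tables(where_clause, params):
    # where_clause would be contributed by the permissions layer (e.g. a plugin)
    sql = 'select database_name, table_name from tables where ' + where_clause
    return catalog.execute(sql, params).fetchall()

# One query instead of thousands of individual permission checks
print(visible_tables('database_name != :hidden', {'hidden': 'private'}))
```
The appealing part is that the permissions layer only has to produce a where clause fragment once, rather than answering a separate yes/no question for every table.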
Since I haven't hit Datasette 1.0 yet maybe I should redesign those hooks to work against the new in-memory database schema stuff?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",770598024,Efficiently calculate list of databases/tables a user can view, https://github.com/simonw/datasette/issues/1152#issuecomment-747891854,https://api.github.com/repos/simonw/datasette/issues/1152,747891854,MDEyOklzc3VlQ29tbWVudDc0Nzg5MTg1NA==,9599,simonw,2020-12-18T06:14:09Z,2020-12-18T06:14:15Z,OWNER,"This is a classic challenge in permissions systems. If I want Datasette to be able to handle thousands of tables I need a reasonable solution for it. Twitter conversation: https://twitter.com/simonw/status/1339791768842248192","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",770598024,Efficiently calculate list of databases/tables a user can view, https://github.com/simonw/datasette/issues/1150#issuecomment-747864831,https://api.github.com/repos/simonw/datasette/issues/1150,747864831,MDEyOklzc3VlQ29tbWVudDc0Nzg2NDgzMQ==,9599,simonw,2020-12-18T04:46:18Z,2020-12-18T04:46:18Z,OWNER,"The homepage currently performs a massive flurry of permission checks - one for each, database, table and view: https://github.com/simonw/datasette/blob/0.53/datasette/views/index.py#L21-L75 A paginated version of this is a little daunting as the permission checks would have to be carried out in every single table just to calculate the count that will be paginated.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",770436876,Maintain an in-memory SQLite table of connected databases and their tables, https://github.com/simonw/datasette/issues/1150#issuecomment-747864080,https://api.github.com/repos/simonw/datasette/issues/1150,747864080,MDEyOklzc3VlQ29tbWVudDc0Nzg2NDA4MA==,9599,simonw,2020-12-18T04:43:29Z,2020-12-18T04:43:29Z,OWNER,"I may be overthinking that problem. Many queries are fast in SQLite. If a Datasette instance has 1,000 connected tables will even that be a performance problem for permission checks? I should benchmark to find out.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",770436876,Maintain an in-memory SQLite table of connected databases and their tables, https://github.com/simonw/datasette/issues/1150#issuecomment-747862001,https://api.github.com/repos/simonw/datasette/issues/1150,747862001,MDEyOklzc3VlQ29tbWVudDc0Nzg2MjAwMQ==,9599,simonw,2020-12-18T04:35:34Z,2020-12-18T04:35:34Z,OWNER,"I do need to solve the permissions problem properly though, because one of the goals of this system is to provide a paginated, searchable list of databases and tables for the homepage of the instance - #991. 
As such, the homepage will need to be able to display only the tables and databases that the user has permission to view.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",770436876,Maintain an in-memory SQLite table of connected databases and their tables, https://github.com/simonw/datasette/issues/1150#issuecomment-747861556,https://api.github.com/repos/simonw/datasette/issues/1150,747861556,MDEyOklzc3VlQ29tbWVudDc0Nzg2MTU1Ng==,9599,simonw,2020-12-18T04:33:45Z,2020-12-18T04:33:45Z,OWNER,"One solution on permissions: if Datasette had an efficient way of saying ""list the tables that this user has access to"" I could use that as a filter any time the user views the schema information. The implementation could be tricky though.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",770436876,Maintain an in-memory SQLite table of connected databases and their tables, https://github.com/simonw/datasette/issues/1150#issuecomment-747861357,https://api.github.com/repos/simonw/datasette/issues/1150,747861357,MDEyOklzc3VlQ29tbWVudDc0Nzg2MTM1Nw==,9599,simonw,2020-12-18T04:32:52Z,2020-12-18T04:32:52Z,OWNER,"I need to figure out how this will interact with Datasette permissions. If some tables are private, but others are public, should users be able to see the private tables listed in the schema metadata? If not, how can that mechanism work?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",770436876,Maintain an in-memory SQLite table of connected databases and their tables, https://github.com/simonw/datasette/issues/1150#issuecomment-747833639,https://api.github.com/repos/simonw/datasette/issues/1150,747833639,MDEyOklzc3VlQ29tbWVudDc0NzgzMzYzOQ==,9599,simonw,2020-12-18T02:49:40Z,2020-12-18T03:52:12Z,OWNER,"I'm going to use five tables to start off with: - `databases` - a list of databases. Each one has a `name`, `path` (if it's on disk), `is_memory`, `schema_version` - `tables` - a list of tables. Each row is `database_name`, `table_name`, `sql` (the create table statement) - may add more tables in the future, in particular maybe a `last_row_count` to cache results of counting the rows. - `columns` - a list of columns. It's the output of `pragma_table_xinfo` with the `database_name` and `table_name` columns added at the beginning. - `foreign_keys` - a list of foreign keys - `pragma_foreign_key_list` output plus `database_name` and `table_name`. 
- `indexes` - a list of indexes - `pragma_table_xinfo` output plus `database_name` and `table_name`.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",770436876,Maintain an in-memory SQLite table of connected databases and their tables, https://github.com/simonw/datasette/issues/1150#issuecomment-747847405,https://api.github.com/repos/simonw/datasette/issues/1150,747847405,MDEyOklzc3VlQ29tbWVudDc0Nzg0NzQwNQ==,9599,simonw,2020-12-18T03:36:04Z,2020-12-18T03:36:04Z,OWNER,I could have another table that stores the combined rows from `sqlite_máster` on every connected database so I have a copy of the schema SQL.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",770436876,Maintain an in-memory SQLite table of connected databases and their tables, https://github.com/simonw/datasette/issues/1150#issuecomment-747847180,https://api.github.com/repos/simonw/datasette/issues/1150,747847180,MDEyOklzc3VlQ29tbWVudDc0Nzg0NzE4MA==,9599,simonw,2020-12-18T03:35:15Z,2020-12-18T03:35:15Z,OWNER,"Simpler implementation idea: a Datasette method `.refresh_schemas()` which loops through all known databases, checks their schema version and updates the in-memory schemas database if they have changed.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",770436876,Maintain an in-memory SQLite table of connected databases and their tables, https://github.com/simonw/datasette/issues/1150#issuecomment-747834762,https://api.github.com/repos/simonw/datasette/issues/1150,747834762,MDEyOklzc3VlQ29tbWVudDc0NzgzNDc2Mg==,9599,simonw,2020-12-18T02:53:22Z,2020-12-18T02:53:22Z,OWNER,I think I'm going to have to build this without using the `pragma_x()` SQL functions as they were only added in 3.16 in 2017-01-02 and I've seen plenty of Datasette instances running on older versions of SQLite.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",770436876,Maintain an in-memory SQLite table of connected databases and their tables, https://github.com/simonw/datasette/issues/1150#issuecomment-747834462,https://api.github.com/repos/simonw/datasette/issues/1150,747834462,MDEyOklzc3VlQ29tbWVudDc0NzgzNDQ2Mg==,9599,simonw,2020-12-18T02:52:19Z,2020-12-18T02:52:26Z,OWNER,Maintaining this database will be the responsibility of a subclass of `Database` called `_SchemaDatabase` which will be managed by the `Datasette` instance.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",770436876,Maintain an in-memory SQLite table of connected databases and their tables, https://github.com/simonw/datasette/issues/1150#issuecomment-747834113,https://api.github.com/repos/simonw/datasette/issues/1150,747834113,MDEyOklzc3VlQ29tbWVudDc0NzgzNDExMw==,9599,simonw,2020-12-18T02:51:13Z,2020-12-18T02:51:20Z,OWNER,"SQLite uses `indexes` rather than `indices` as the plural, so I'll go with that: https://sqlite.org/lang_createindex.html","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",770436876,Maintain an in-memory SQLite table of connected databases and their tables, 
https://github.com/simonw/datasette/issues/1150#issuecomment-747809670,https://api.github.com/repos/simonw/datasette/issues/1150,747809670,MDEyOklzc3VlQ29tbWVudDc0NzgwOTY3MA==,9599,simonw,2020-12-18T01:29:30Z,2020-12-18T01:29:30Z,OWNER,I've been rediscovering the pattern I already documented in this TIL: https://github.com/simonw/til/blob/main/sqlite/list-all-columns-in-a-database.md#better-alternative-using-a-join,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",770436876,Maintain an in-memory SQLite table of connected databases and their tables, https://github.com/simonw/datasette/issues/1150#issuecomment-747807891,https://api.github.com/repos/simonw/datasette/issues/1150,747807891,MDEyOklzc3VlQ29tbWVudDc0NzgwNzg5MQ==,9599,simonw,2020-12-18T01:23:59Z,2020-12-18T01:23:59Z,OWNER,"https://www.sqlite.org/pragma.html#pragfunc says: > * This feature is experimental and is subject to change. Further documentation will become available if and when the table-valued functions for PRAGMAs feature becomes officially supported. > * The table-valued functions for PRAGMA feature was added in SQLite version 3.16.0 (2017-01-02). Prior versions of SQLite cannot use this feature. ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",770436876,Maintain an in-memory SQLite table of connected databases and their tables, https://github.com/simonw/datasette/issues/1150#issuecomment-747807289,https://api.github.com/repos/simonw/datasette/issues/1150,747807289,MDEyOklzc3VlQ29tbWVudDc0NzgwNzI4OQ==,9599,simonw,2020-12-18T01:22:05Z,2020-12-18T01:22:05Z,OWNER,"Here's a simpler query pattern (not using CTEs so should work on older versions of SQLite) - this one lists all indexes for all tables: ```sql select sqlite_master.name as 'table', indexes.* from sqlite_master join pragma_index_list(sqlite_master.name) indexes where sqlite_master.type = 'table' ``` https://latest.datasette.io/fixtures?sql=select%0D%0A++sqlite_master.name+as+%27table%27%2C%0D%0A++indexes.*%0D%0Afrom%0D%0A++sqlite_master%0D%0A++join+pragma_index_list%28sqlite_master.name%29+indexes%0D%0Awhere%0D%0A++sqlite_master.type+%3D+%27table%27","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",770436876,Maintain an in-memory SQLite table of connected databases and their tables, https://github.com/simonw/datasette/issues/1150#issuecomment-747805275,https://api.github.com/repos/simonw/datasette/issues/1150,747805275,MDEyOklzc3VlQ29tbWVudDc0NzgwNTI3NQ==,9599,simonw,2020-12-18T01:15:27Z,2020-12-18T01:16:17Z,OWNER,"This query uses a join to pull foreign key information for every table: https://latest.datasette.io/fixtures?sql=with+tables+as+%28%0D%0A++select%0D%0A++++name%0D%0A++from%0D%0A++++sqlite_master%0D%0A++where%0D%0A++++type+%3D+%27table%27%0D%0A%29%0D%0Aselect%0D%0A++tables.name+as+%27table%27%2C%0D%0A++foo.*%0D%0Afrom%0D%0A++tables%0D%0A++join+pragma_foreign_key_list%28tables.name%29+foo ```sql with tables as ( select name from sqlite_master where type = 'table' ) select tables.name as 'table', foo.* from tables join pragma_foreign_key_list(tables.name) foo ``` Same query for `pragma_table_xinfo`: 
https://latest.datasette.io/fixtures?sql=with+tables+as+%28%0D%0A++select%0D%0A++++name%0D%0A++from%0D%0A++++sqlite_master%0D%0A++where%0D%0A++++type+%3D+%27table%27%0D%0A%29%0D%0Aselect%0D%0A++tables.name+as+%27table%27%2C%0D%0A++foo.*%0D%0Afrom%0D%0A++tables%0D%0A++join+pragma_table_xinfo%28tables.name%29+foo","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",770436876,Maintain an in-memory SQLite table of connected databases and their tables, https://github.com/simonw/datasette/issues/1150#issuecomment-747804254,https://api.github.com/repos/simonw/datasette/issues/1150,747804254,MDEyOklzc3VlQ29tbWVudDc0NzgwNDI1NA==,9599,simonw,2020-12-18T01:12:13Z,2020-12-18T01:12:13Z,OWNER,"Prototype: https://latest.datasette.io/fixtures?sql=select+%27facetable%27+as+%27table%27%2C+*+from+pragma_table_xinfo%28%27facetable%27%29%0D%0Aunion%0D%0Aselect+%27searchable%27+as+%27table%27%2C+*+from+pragma_table_xinfo%28%27searchable%27%29%0D%0Aunion%0D%0Aselect+%27compound_three_primary_keys%27+as+%27table%27%2C+*+from+pragma_table_xinfo%28%27compound_three_primary_keys%27%29 ```sql select 'facetable' as 'table', * from pragma_table_xinfo('facetable') union select 'searchable' as 'table', * from pragma_table_xinfo('searchable') union select 'compound_three_primary_keys' as 'table', * from pragma_table_xinfo('compound_three_primary_keys') ``` table | cid | name | type | notnull | dflt_value | pk | hidden -- | -- | -- | -- | -- | -- | -- | -- compound_three_primary_keys | 0 | pk1 | varchar(30) | 0 | | 1 | 0 compound_three_primary_keys | 1 | pk2 | varchar(30) | 0 | | 2 | 0 compound_three_primary_keys | 2 | pk3 | varchar(30) | 0 | | 3 | 0 compound_three_primary_keys | 3 | content | text | 0 | | 0 | 0 facetable | 0 | pk | integer | 0 | | 1 | 0 facetable | 1 | created | text | 0 | | 0 | 0 facetable | 2 | planet_int | integer | 0 | | 0 | 0 facetable | 3 | on_earth | integer | 0 | | 0 | 0 facetable | 4 | state | text | 0 | | 0 | 0 facetable | 5 | city_id | integer | 0 | | 0 | 0 facetable | 6 | neighborhood | text | 0 | | 0 | 0 facetable | 7 | tags | text | 0 | | 0 | 0 facetable | 8 | complex_array | text | 0 | | 0 | 0 facetable | 9 | distinct_some_null | | 0 | | 0 | 0 searchable | 0 | pk | integer | 0 | | 1 | 0 searchable | 1 | text1 | text | 0 | | 0 | 0 searchable | 2 | text2 | text | 0 | | 0 | 0 searchable | 3 | name with . and spaces | text | 0 | | 0 | 0","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",770436876,Maintain an in-memory SQLite table of connected databases and their tables, https://github.com/simonw/datasette/issues/1150#issuecomment-747803268,https://api.github.com/repos/simonw/datasette/issues/1150,747803268,MDEyOklzc3VlQ29tbWVudDc0NzgwMzI2OA==,9599,simonw,2020-12-18T01:08:40Z,2020-12-18T01:08:40Z,OWNER,"Next step: design a schema for the in-memory database table that exposes all of the tables. 
I want to support things like: - Show me all of the tables - Show me the columns in a table - Show me all tables that contain a `tags` column - Show me the indexes - Show me every table configured for full-text search Maybe a starting point would be to build concrete tables using the results of things like `PRAGMA foreign_key_list(table)` and `PRAGMA table_xinfo(table)` - note though that `table_xinfo` is SQLite 3.26.0 or higher, as shown here: https://github.com/simonw/datasette/blob/5e9895c67f08e9f42acedd3d6d29512ac446e15f/datasette/utils/__init__.py#L563-L579","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",770436876,Maintain an in-memory SQLite table of connected databases and their tables, https://github.com/simonw/datasette/issues/1151#issuecomment-747801751,https://api.github.com/repos/simonw/datasette/issues/1151,747801751,MDEyOklzc3VlQ29tbWVudDc0NzgwMTc1MQ==,9599,simonw,2020-12-18T01:03:39Z,2020-12-18T01:03:39Z,OWNER,"This feature is illustrated by the tests: https://github.com/simonw/datasette/blob/5e9895c67f08e9f42acedd3d6d29512ac446e15f/tests/test_internals_database.py#L469-L496 I added new documentation for the `Datasette()` constructor here as well: https://docs.datasette.io/en/latest/internals.html#database-ds-path-none-is-mutable-false-is-memory-false-memory-name-none","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",770448622,Database class mechanism for cross-connection in-memory databases, https://github.com/simonw/datasette/issues/1151#issuecomment-747801084,https://api.github.com/repos/simonw/datasette/issues/1151,747801084,MDEyOklzc3VlQ29tbWVudDc0NzgwMTA4NA==,9599,simonw,2020-12-18T01:01:26Z,2020-12-18T01:01:26Z,OWNER,"I tested this with a one-off plugin and it worked! ```python from datasette import hookimpl from datasette.database import Database @hookimpl def startup(datasette): datasette.add_database(""statistics"", Database( datasette, memory_name=""statistics"" )) ``` This created a `/statistics` database when I ran `datasette` - and if I installed https://github.com/simonw/datasette-write I could then create tables in it which persisted until I restarted the server.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",770448622,Database class mechanism for cross-connection in-memory databases, https://github.com/simonw/datasette/issues/1151#issuecomment-747784199,https://api.github.com/repos/simonw/datasette/issues/1151,747784199,MDEyOklzc3VlQ29tbWVudDc0Nzc4NDE5OQ==,9599,simonw,2020-12-18T00:09:36Z,2020-12-18T00:09:36Z,OWNER,"Is it possible to connect to a memory database in read-only mode? `file:foo?mode=memory&cache=shared&mode=ro` isn't valid because it features `mode=` more than once. 
https://stackoverflow.com/a/40548682 suggests using `PRAGMA query_only` on the connection instead.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",770448622,Database class mechanism for cross-connection in-memory databases, https://github.com/simonw/datasette/issues/1151#issuecomment-747775245,https://api.github.com/repos/simonw/datasette/issues/1151,747775245,MDEyOklzc3VlQ29tbWVudDc0Nzc3NTI0NQ==,9599,simonw,2020-12-17T23:43:41Z,2020-12-17T23:56:27Z,OWNER,"I'm going to add an argument to the `Database()` constructor which means ""connect to named in-memory database called X"". ```python db = Database(ds, memory_name=""datasette"") ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",770448622,Database class mechanism for cross-connection in-memory databases, https://github.com/simonw/datasette/issues/1151#issuecomment-747779056,https://api.github.com/repos/simonw/datasette/issues/1151,747779056,MDEyOklzc3VlQ29tbWVudDc0Nzc3OTA1Ng==,9599,simonw,2020-12-17T23:55:57Z,2020-12-17T23:55:57Z,OWNER,Wait I do use it - if you run `datasette --memory` - which is useful for trying things out in SQL that doesn't need to run against a table.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",770448622,Database class mechanism for cross-connection in-memory databases, https://github.com/simonw/datasette/issues/1151#issuecomment-747775792,https://api.github.com/repos/simonw/datasette/issues/1151,747775792,MDEyOklzc3VlQ29tbWVudDc0Nzc3NTc5Mg==,9599,simonw,2020-12-17T23:45:20Z,2020-12-17T23:45:20Z,OWNER,"Do I use the current `is_memory=` boolean anywhere at the moment? https://ripgrep.datasette.io/-/ripgrep?pattern=is_memory - doesn't look like it. I may remove that feature, since it's not actually useful, and replace it with a mechanism for creating shared named memory databases instead.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",770448622,Database class mechanism for cross-connection in-memory databases, https://github.com/simonw/datasette/issues/1151#issuecomment-747774855,https://api.github.com/repos/simonw/datasette/issues/1151,747774855,MDEyOklzc3VlQ29tbWVudDc0Nzc3NDg1NQ==,9599,simonw,2020-12-17T23:42:34Z,2020-12-17T23:42:34Z,OWNER,"This worked as a prototype: ```diff diff --git a/datasette/database.py b/datasette/database.py index 412e0c5..a90e617 100644 --- a/datasette/database.py +++ b/datasette/database.py @@ -24,11 +24,12 @@ connections = threading.local() class Database: - def __init__(self, ds, path=None, is_mutable=False, is_memory=False): + def __init__(self, ds, path=None, is_mutable=False, is_memory=False, uri=None): self.ds = ds self.path = path self.is_mutable = is_mutable self.is_memory = is_memory + self.uri = uri self.hash = None self.cached_size = None self.cached_table_counts = None @@ -46,6 +47,8 @@ class Database: } def connect(self, write=False): + if self.uri: + return sqlite3.connect(self.uri, uri=True, check_same_thread=False) if self.is_memory: return sqlite3.connect("":memory:"") # mode=ro or immutable=1? 
``` Then in `ipython`: ``` from datasette.app import Datasette from datasette.database import Database ds = Datasette([]) db = Database(ds, uri=""file:datasette?mode=memory&cache=shared"", is_memory=True) await db.execute_write(""create table foo (bar text)"") await db.table_names() # Outputs [""foo""] db2 = Database(ds, uri=""file:datasette?mode=memory&cache=shared"", is_memory=True) await db2.table_names() # Also outputs [""foo""] ``` ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",770448622,Database class mechanism for cross-connection in-memory databases, https://github.com/simonw/datasette/issues/1151#issuecomment-747770581,https://api.github.com/repos/simonw/datasette/issues/1151,747770581,MDEyOklzc3VlQ29tbWVudDc0Nzc3MDU4MQ==,9599,simonw,2020-12-17T23:31:18Z,2020-12-17T23:32:07Z,OWNER,"This works in `ipython`: ``` In [1]: import sqlite3 In [2]: c1 = sqlite3.connect(""file:datasette?mode=memory&cache=shared"", uri=True) In [3]: c2 = sqlite3.connect(""file:datasette?mode=memory&cache=shared"", uri=True) In [4]: c1.executescript(""CREATE TABLE hello (world TEXT)"") Out[4]: <sqlite3.Cursor at 0x1104addc0> In [5]: c1.execute(""select * from sqlite_master"").fetchall() Out[5]: [('table', 'hello', 'hello', 2, 'CREATE TABLE hello (world TEXT)')] In [6]: c2.execute(""select * from sqlite_master"").fetchall() Out[6]: [('table', 'hello', 'hello', 2, 'CREATE TABLE hello (world TEXT)')] In [7]: c3 = sqlite3.connect(""file:datasette?mode=memory&cache=shared"", uri=True) In [9]: c3.execute(""select * from sqlite_master"").fetchall() Out[9]: [('table', 'hello', 'hello', 2, 'CREATE TABLE hello (world TEXT)')] In [10]: c4 = sqlite3.connect(""file:datasette?mode=memory"", uri=True) In [11]: c4.execute(""select * from sqlite_master"").fetchall() Out[11]: [] ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",770448622,Database class mechanism for cross-connection in-memory databases, https://github.com/simonw/datasette/issues/1151#issuecomment-747770082,https://api.github.com/repos/simonw/datasette/issues/1151,747770082,MDEyOklzc3VlQ29tbWVudDc0Nzc3MDA4Mg==,9599,simonw,2020-12-17T23:29:53Z,2020-12-17T23:29:53Z,OWNER,I'm going to try with `file:datasette?mode=memory&cache=shared`.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",770448622,Database class mechanism for cross-connection in-memory databases, https://github.com/simonw/datasette/issues/1151#issuecomment-747769830,https://api.github.com/repos/simonw/datasette/issues/1151,747769830,MDEyOklzc3VlQ29tbWVudDc0Nzc2OTgzMA==,9599,simonw,2020-12-17T23:29:08Z,2020-12-17T23:29:08Z,OWNER,"https://sqlite.org/inmemorydb.html > The database ceases to exist as soon as the database connection is closed. Every :memory: database is distinct from every other. So, opening two database connections each with the filename "":memory:"" will create two independent in-memory databases. > > [...] > > The special `"":memory:""` filename also works when using URI filenames. For example: > > rc = sqlite3_open(""file::memory:"", &db); > > [...] > > However, the same in-memory database can be opened by two or more database connections as follows: > > rc = sqlite3_open(""file::memory:?cache=shared"", &db); > > [...] 
> If two or more distinct but shareable in-memory databases are needed in a single process, then the mode=memory query parameter can be used with a URI filename to create a named in-memory database: > > rc = sqlite3_open(""file:memdb1?mode=memory&cache=shared"", &db); ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",770448622,Database class mechanism for cross-connection in-memory databases, https://github.com/simonw/datasette/issues/1150#issuecomment-747768112,https://api.github.com/repos/simonw/datasette/issues/1150,747768112,MDEyOklzc3VlQ29tbWVudDc0Nzc2ODExMg==,9599,simonw,2020-12-17T23:25:21Z,2020-12-17T23:25:21Z,OWNER,"Next challenge: figure out how to use the `Database` class from https://github.com/simonw/datasette/blob/0.53/datasette/database.py for an in-memory database which persists data for the duration of the lifetime of the server, and allows access to that in-memory database from multiple threads in a way that lets them see each other's changes.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",770436876,Maintain an in-memory SQLite table of connected databases and their tables, https://github.com/simonw/datasette/issues/1150#issuecomment-747767598,https://api.github.com/repos/simonw/datasette/issues/1150,747767598,MDEyOklzc3VlQ29tbWVudDc0Nzc2NzU5OA==,9599,simonw,2020-12-17T23:24:03Z,2020-12-17T23:24:03Z,OWNER,"I'm going to assume that even the heaviest user will have trouble going beyond a few hundred database files, so this is fine.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",770436876,Maintain an in-memory SQLite table of connected databases and their tables, https://github.com/simonw/datasette/issues/1150#issuecomment-747767499,https://api.github.com/repos/simonw/datasette/issues/1150,747767499,MDEyOklzc3VlQ29tbWVudDc0Nzc2NzQ5OQ==,9599,simonw,2020-12-17T23:23:44Z,2020-12-17T23:23:44Z,OWNER,Grabbing the schema version of 380 files in the root directory takes 70ms.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",770436876,Maintain an in-memory SQLite table of connected databases and their tables, https://github.com/simonw/datasette/issues/1150#issuecomment-747767055,https://api.github.com/repos/simonw/datasette/issues/1150,747767055,MDEyOklzc3VlQ29tbWVudDc0Nzc2NzA1NQ==,9599,simonw,2020-12-17T23:22:41Z,2020-12-17T23:22:41Z,OWNER,"It's just recursion that's expensive. I created 380 empty SQLite databases in a folder and timed `list(pathlib.Path(""/tmp"").glob(""*.db""));` and it took 0.002s. So maybe I tell users that all SQLite databases have to be in the root folder.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",770436876,Maintain an in-memory SQLite table of connected databases and their tables, https://github.com/simonw/datasette/issues/1150#issuecomment-747766310,https://api.github.com/repos/simonw/datasette/issues/1150,747766310,MDEyOklzc3VlQ29tbWVudDc0Nzc2NjMxMA==,9599,simonw,2020-12-17T23:20:49Z,2020-12-17T23:20:49Z,OWNER,"I tried against my entire `~/Development/Dropbox` folder - deeply nested with 381 SQLite database files in sub-folders - and it took 25s! 
But it turned out 23.9s of that was the call to `pathlib.Path(""/Users/simon/Dropbox/Development"").glob('**/*.db')`. So it looks like connecting to a SQLite database file and getting the schema version is extremely fast. Scanning directories is slower.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",770436876,Maintain an in-memory SQLite table of connected databases and their tables, https://github.com/simonw/datasette/issues/1150#issuecomment-747764712,https://api.github.com/repos/simonw/datasette/issues/1150,747764712,MDEyOklzc3VlQ29tbWVudDc0Nzc2NDcxMg==,9599,simonw,2020-12-17T23:16:31Z,2020-12-17T23:16:31Z,OWNER,"Quick micro-benchmark, run against a folder with 46 database files adding up to 1.4GB total: ```python import pathlib, sqlite3, time paths = list(pathlib.Path(""."").glob('*.db')) def schema_version(path): db = sqlite3.connect(path) version = db.execute(""PRAGMA schema_version"").fetchall()[0] db.close() return version def all(): versions = {} for path in paths: versions[path.name] = schema_version(path) return versions start = time.time(); all(); print(time.time() - start) # 0.012346982955932617 ``` So that's 12ms. ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",770436876,Maintain an in-memory SQLite table of connected databases and their tables, https://github.com/simonw/datasette/issues/1150#issuecomment-747754229,https://api.github.com/repos/simonw/datasette/issues/1150,747754229,MDEyOklzc3VlQ29tbWVudDc0Nzc1NDIyOQ==,9599,simonw,2020-12-17T23:04:38Z,2020-12-17T23:04:38Z,OWNER,"Open question: will this work for hundreds of database files, or is the overhead of connecting to each of 100 databases in turn to run `PRAGMA schema_version` too high?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",770436876,Maintain an in-memory SQLite table of connected databases and their tables, https://github.com/simonw/datasette/issues/1150#issuecomment-747754082,https://api.github.com/repos/simonw/datasette/issues/1150,747754082,MDEyOklzc3VlQ29tbWVudDc0Nzc1NDA4Mg==,9599,simonw,2020-12-17T23:04:13Z,2020-12-17T23:04:13Z,OWNER,"Pages that need a list of all databases - the index page and /-/databases for example - could trigger a ""check for new directories in the configured directories"" scan. That scan would run at most once every 5 (n) seconds - the check is triggered if it’s run more recently than that it doesn’t run. Hopefully this means it could be done as a blocking operation, rather than trying to run it in a thread. When it runs it scans for *.db or *.sqlite files (maybe one or two other extensions) that it hasn’t seen before. It also checks that the existing list of known database files still exists. If it finds any new ones it connects to them once to run `.schema`. It also runs `PRAGMA schema_version` on each known database so that it can compare the schema version number to the last one it saw. 
That's how it detects if there are new tables or if the cached schema needs to be updated.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",770436876,Maintain an in-memory SQLite table of connected databases and their tables, https://github.com/simonw/datasette/issues/461#issuecomment-747734273,https://api.github.com/repos/simonw/datasette/issues/461,747734273,MDEyOklzc3VlQ29tbWVudDc0NzczNDI3Mw==,9599,simonw,2020-12-17T22:14:46Z,2020-12-17T22:14:46Z,OWNER,"I've been thinking about this a bunch. For Datasette to be useful as a private repository of data (Datasette Library, #417) it's crucial that it can handle a much, much larger number of databases. This makes me worry about how many connections (and open file handles) it makes sense to have open at one time. I realize now that this is much less of a problem for private instances. Public instances on the internet could get traffic to any database at any time, so connections could easily get out of control. A private instance with only a few users could instead get away with only opening connections to databases in ""active use"". This does however make it even more important for Datasette to maintain a cached set of metadata about the tables - which is also needed to power this feature.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",443021509,Paginate + search for databases/tables on the homepage, https://github.com/simonw/datasette/issues/1005#issuecomment-747209115,https://api.github.com/repos/simonw/datasette/issues/1005,747209115,MDEyOklzc3VlQ29tbWVudDc0NzIwOTExNQ==,9599,simonw,2020-12-17T05:11:04Z,2020-12-17T05:11:04Z,OWNER,Tracking ticket for the next HTTPX release is https://github.com/encode/httpx/pull/1403,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",718259202,Remove xfail tests when new httpx is released, https://github.com/simonw/datasette/issues/741#issuecomment-747208543,https://api.github.com/repos/simonw/datasette/issues/741,747208543,MDEyOklzc3VlQ29tbWVudDc0NzIwODU0Mw==,9599,simonw,2020-12-17T05:09:03Z,2020-12-17T05:09:03Z,OWNER,I really like this in `datasette-publish-vercel` - I'm definitely going to bring this to the other publish implementations as well.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",607223136,"Replace ""datasette publish --extra-options"" with ""--setting""", https://github.com/simonw/datasette/issues/1149#issuecomment-747207787,https://api.github.com/repos/simonw/datasette/issues/1149,747207787,MDEyOklzc3VlQ29tbWVudDc0NzIwNzc4Nw==,9599,simonw,2020-12-17T05:06:16Z,2020-12-17T05:06:16Z,OWNER,"So, an idea: what if Datasette's default CSS applied only to elements with classes - or maybe to childen of a `body class=""datasette""` element? 
In such a way that you could write your own custom HTML that reused elements of Datasette's CSS - the cog menu styling for example - but only on an opt-in basis?","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",769520939,Make it easier to theme Datasette with CSS, https://github.com/simonw/datasette/issues/1149#issuecomment-747207487,https://api.github.com/repos/simonw/datasette/issues/1149,747207487,MDEyOklzc3VlQ29tbWVudDc0NzIwNzQ4Nw==,9599,simonw,2020-12-17T05:05:08Z,2020-12-17T05:05:08Z,OWNER,"I think what I want is for it to be easy to reuse portions of Datasette's CSS - the bit that styles the cog menu for example - without pulling in the whole thing. I tried linking in the `<link rel=""stylesheet"" href=""/-/static/app.css"">` stylesheet and the page broke, wildly: <img width=""1030"" alt=""content__categories__3_rows_and_Datasette_search__pytest"" src=""https://user-images.githubusercontent.com/9599/102446222-2290b600-3fe2-11eb-9d3a-e32ca72d1dac.png""> That's because Datasette's [built-in CSS](https://github.com/simonw/datasette/blob/0.53/datasette/static/app.css) applies styles directly to a whole bunch of different tags - `body`, `header`, `footer` etc - which means that if you import that stylesheet it can play havoc with the site you have already built.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",769520939,Make it easier to theme Datasette with CSS, https://github.com/dogsheep/google-takeout-to-sqlite/issues/2#issuecomment-747130908,https://api.github.com/repos/dogsheep/google-takeout-to-sqlite/issues/2,747130908,MDEyOklzc3VlQ29tbWVudDc0NzEzMDkwOA==,231498,khimaros,2020-12-17T00:47:04Z,2020-12-17T00:47:43Z,NONE,"it looks like almost all of the memory consumption is coming from `json.load()`. another direction here may be to use the new ""Semantic Location History"" data which is already broken down by year and month. it also provides much more interesting data, such as estimated address, form of travel, etc.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",769376447,killed by oomkiller on large location-history, https://github.com/dogsheep/google-takeout-to-sqlite/issues/2#issuecomment-747126777,https://api.github.com/repos/dogsheep/google-takeout-to-sqlite/issues/2,747126777,MDEyOklzc3VlQ29tbWVudDc0NzEyNjc3Nw==,9599,simonw,2020-12-17T00:36:52Z,2020-12-17T00:36:52Z,MEMBER,The memory profiler tricks I used in https://github.com/dogsheep/healthkit-to-sqlite/issues/7 could help figure out what's going on here.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",769376447,killed by oomkiller on large location-history, https://github.com/simonw/datasette/issues/675#issuecomment-747070709,https://api.github.com/repos/simonw/datasette/issues/675,747070709,MDEyOklzc3VlQ29tbWVudDc0NzA3MDcwOQ==,9599,simonw,2020-12-16T22:09:15Z,2020-12-16T22:09:15Z,OWNER,"The other way this could work is passing a single argument - the file (or directory) to be copied in - and assuming it should always go in the `/app` root. 
Something like: datasette publish cloudrun my.db --include src/ --include dogsheep-beta.yml Which would add `/app/src/...` and `/app/dogsheep-beta.yml`.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",567902704,--cp option for datasette publish and datasette package for shipping additional files and directories, https://github.com/simonw/datasette/issues/675#issuecomment-747068624,https://api.github.com/repos/simonw/datasette/issues/675,747068624,MDEyOklzc3VlQ29tbWVudDc0NzA2ODYyNA==,9599,simonw,2020-12-16T22:04:42Z,2020-12-16T22:04:42Z,OWNER,"I can't just use `COPY /path/to/blah.yml /app` in the `Dockerfile` because it runs on the Google Cloud Build servers, not on the user's laptop - so I need to first copy the files they specify to that temporary directory that gets uploaded to the cloud, then rewrite the `COPY` lines in the `Dockerfile` to copy from there.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",567902704,--cp option for datasette publish and datasette package for shipping additional files and directories, https://github.com/simonw/datasette/issues/675#issuecomment-747067864,https://api.github.com/repos/simonw/datasette/issues/675,747067864,MDEyOklzc3VlQ29tbWVudDc0NzA2Nzg2NA==,9599,simonw,2020-12-16T22:02:55Z,2020-12-16T22:02:55Z,OWNER,"But since we're already running `COPY . /app` anything that's made it into the temporary directory will get copied into `/app`. But... I feel the usability of the command will be better if users can use absolute paths on the `target` side: datasette publish cloudrun my.db --cp dogsheep-beta.yml /app","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",567902704,--cp option for datasette publish and datasette package for shipping additional files and directories, https://github.com/simonw/datasette/issues/675#issuecomment-747066629,https://api.github.com/repos/simonw/datasette/issues/675,747066629,MDEyOklzc3VlQ29tbWVudDc0NzA2NjYyOQ==,9599,simonw,2020-12-16T21:59:58Z,2020-12-16T22:00:48Z,OWNER,"Note that `datasette publish cloudrun` uses a working directory of `/app` - so users will need to copy their files into `/app` if that's where they need to live. 
https://github.com/simonw/datasette/blob/17cbbb1f7f230b39650afac62dd16476626001b5/datasette/utils/__init__.py#L348-L357","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",567902704,--cp option for datasette publish and datasette package for shipping additional files and directories, https://github.com/simonw/datasette/issues/1148#issuecomment-747065487,https://api.github.com/repos/simonw/datasette/issues/1148,747065487,MDEyOklzc3VlQ29tbWVudDc0NzA2NTQ4Nw==,9599,simonw,2020-12-16T21:57:29Z,2020-12-16T21:57:29Z,OWNER,I filed a new public bug in their issue tracker here: https://github.com/vercel/vercel/issues/5575,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",767561886,Syntax error with + symbol when deployed to Vercel, https://github.com/simonw/datasette/issues/1148#issuecomment-747062909,https://api.github.com/repos/simonw/datasette/issues/1148,747062909,MDEyOklzc3VlQ29tbWVudDc0NzA2MjkwOQ==,9599,simonw,2020-12-16T21:51:54Z,2020-12-16T21:51:54Z,OWNER,"This is a really frustrating bug with Vercel: https://github.com/simonw/datasette-publish-vercel/issues/28 `+` characters in URLs get translated into spaces before they get to Datasette. They know about the bug and said they were working on a fix a few months ago, but looks like it's still a problem. A workaround is to avoid `+` and use `-` instead - I think this SQL query does the same thing as yours: https://aws-partners-singapore.vercel.app/partners?sql=select%0D%0A++A.launch_rank%2C%0D%0A++A.partner_info%0D%0Afrom%0D%0A++summary+A%0D%0A++INNER+JOIN+summary+B+ON+A.launch_rank+%3E%3D+B.launch_rank+-+3%0D%0A++AND+A.launch_rank+-4+%3C%3D+B.launch_rank%0D%0AWHERE%0D%0A++B.%22partner_info%22+LIKE+%27%25Palo+Alto%25%27 ```sql select A.launch_rank, A.partner_info from summary A INNER JOIN summary B ON A.launch_rank >= B.launch_rank - 3 AND A.launch_rank -4 <= B.launch_rank WHERE B.""partner_info"" LIKE '%Palo Alto%' ``` I've been moving projects from Vercel to Cloud Run when they run into this, but that's not a great situation to be in.","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",767561886,Syntax error with + symbol when deployed to Vercel, https://github.com/simonw/datasette/issues/675#issuecomment-747059277,https://api.github.com/repos/simonw/datasette/issues/675,747059277,MDEyOklzc3VlQ29tbWVudDc0NzA1OTI3Nw==,9599,simonw,2020-12-16T21:43:52Z,2020-12-16T21:43:52Z,OWNER,"It turns out I need this for a couple of projects: - [datasette-ripgrep](https://github.com/simonw/datasette-ripgrep) needs to ship a whole bunch of source code files up in a known location. I worked around this with a nasty hack involving `--static` but it would be better if I wasn't doing that. 
- [dogsheep-beta](https://github.com/dogsheep/dogsheep-beta) uses an additional `dogsheep-beta.yml` configuration file in the project root (a sibling to `metadata.yml`) which needs to be included when publishing - see https://github.com/simonw/datasette.io/issues/21#issuecomment-747058067 I want this for `datasette publish cloudrun`, not just for `datasette package`.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",567902704,--cp option for datasette publish and datasette package for shipping additional files and directories, https://github.com/dogsheep/dogsheep-beta/issues/29#issuecomment-747034481,https://api.github.com/repos/dogsheep/dogsheep-beta/issues/29,747034481,MDEyOklzc3VlQ29tbWVudDc0NzAzNDQ4MQ==,9599,simonw,2020-12-16T21:17:05Z,2020-12-16T21:17:05Z,MEMBER,I'm just going to add `q` for the moment.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",724759588,Add search highlighting snippets, https://github.com/dogsheep/dogsheep-beta/issues/29#issuecomment-747031608,https://api.github.com/repos/dogsheep/dogsheep-beta/issues/29,747031608,MDEyOklzc3VlQ29tbWVudDc0NzAzMTYwOA==,9599,simonw,2020-12-16T21:15:18Z,2020-12-16T21:15:18Z,MEMBER,Should I pass any other details to the `display_sql` here as well?,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",724759588,Add search highlighting snippets, https://github.com/dogsheep/dogsheep-beta/issues/29#issuecomment-747030964,https://api.github.com/repos/dogsheep/dogsheep-beta/issues/29,747030964,MDEyOklzc3VlQ29tbWVudDc0NzAzMDk2NA==,9599,simonw,2020-12-16T21:14:54Z,2020-12-16T21:14:54Z,MEMBER,"To do this I'll need the search term to be passed to the `display_sql` SQL query: https://github.com/dogsheep/dogsheep-beta/blob/4890ec87b5e2ec48940f32c9ad1f5aae25c75a4d/dogsheep_beta/__init__.py#L164-L171","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",724759588,Add search highlighting snippets, https://github.com/dogsheep/dogsheep-beta/issues/29#issuecomment-747029636,https://api.github.com/repos/dogsheep/dogsheep-beta/issues/29,747029636,MDEyOklzc3VlQ29tbWVudDc0NzAyOTYzNg==,9599,simonw,2020-12-16T21:14:03Z,2020-12-16T21:14:03Z,MEMBER,"I think I can do this as a cunning trick in `display_sql`. 
Consider this example query: https://til.simonwillison.net/tils?sql=select%0D%0A++path%2C%0D%0A++snippet%28til_fts%2C+-1%2C+%27b4de2a49c8%27%2C+%278c94a2ed4b%27%2C+%27...%27%2C+60%29+as+snippet%0D%0Afrom%0D%0A++til%0D%0A++join+til_fts+on+til.rowid+%3D+til_fts.rowid%0D%0Awhere%0D%0A++til_fts+match+escape_fts%28%3Aq%29%0D%0A++and+path+%3D+%27asgi_lifespan-test-httpx.md%27%0D%0A&q=pytest ```sql select path, snippet(til_fts, -1, 'b4de2a49c8', '8c94a2ed4b', '...', 60) as snippet from til join til_fts on til.rowid = til_fts.rowid where til_fts match escape_fts(:q) and path = 'asgi_lifespan-test-httpx.md' ``` The `and path = 'asgi_lifespan-test-httpx.md'` bit means we only get back a specific document - but the snippet highlighting is applied to it.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",724759588,Add search highlighting snippets, https://github.com/simonw/datasette/issues/1143#issuecomment-746827083,https://api.github.com/repos/simonw/datasette/issues/1143,746827083,MDEyOklzc3VlQ29tbWVudDc0NjgyNzA4Mw==,9599,simonw,2020-12-16T18:56:07Z,2020-12-16T18:56:07Z,OWNER,"I think the right way to do this is to support multiple optional `--cors-origin=` pattern values, like you suggested.","{""total_count"": 2, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 1, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",764059235,"More flexible CORS support in core, to encourage good security practices", https://github.com/dogsheep/github-to-sqlite/issues/58#issuecomment-746735889,https://api.github.com/repos/dogsheep/github-to-sqlite/issues/58,746735889,MDEyOklzc3VlQ29tbWVudDc0NjczNTg4OQ==,9599,simonw,2020-12-16T17:59:50Z,2020-12-16T17:59:50Z,MEMBER,"I don't want to add a full HTML parser (like BeautifulSoup) as a dependency for this feature. Since the HTML comes from a single, trusted source (GitHub) I could probably handle this using [regular expressions](https://stackoverflow.com/a/1732454).","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",769150394,Readme HTML has broken internal links, https://github.com/dogsheep/github-to-sqlite/issues/58#issuecomment-746734412,https://api.github.com/repos/dogsheep/github-to-sqlite/issues/58,746734412,MDEyOklzc3VlQ29tbWVudDc0NjczNDQxMg==,9599,simonw,2020-12-16T17:58:56Z,2020-12-16T17:58:56Z,MEMBER,"I'm going to rewrite those `<a href=""#filtering-tables"">` links to `<a href=""#user-content-filtering-tables"">` - but only if a corresponding `id=""user-content-filtering-tables""` element exists.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",769150394,Readme HTML has broken internal links, https://github.com/simonw/datasette/issues/1142#issuecomment-745162571,https://api.github.com/repos/simonw/datasette/issues/1142,745162571,MDEyOklzc3VlQ29tbWVudDc0NTE2MjU3MQ==,6622733,nitinpaultifr,2020-12-15T09:22:58Z,2020-12-15T09:22:58Z,NONE,"You're right, probably more straightforward to have the links for JSON. 
I was imagining to toggle the `href` for the 'Export JSON' link (button) to the selected shape, but it'll probably be needlessly complex in the end.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",763361458,"""Stream all rows"" is not at all obvious", https://github.com/simonw/datasette/issues/1143#issuecomment-744618787,https://api.github.com/repos/simonw/datasette/issues/1143,744618787,MDEyOklzc3VlQ29tbWVudDc0NDYxODc4Nw==,114388,yurivish,2020-12-14T18:15:00Z,2020-12-15T02:21:53Z,NONE,"From a quick look at the README, it does seem to do everything I need, thanks! I think the argument for inclusion in core is to lower the chances of unwanted data access. A local server can be accessed by anybody who can make an HTTP request to your computer regardless of CORS rules, but the default `*` rule additionally opens up access to the local instance to any website you visit while it is running. That's probably not what people typically intend, particularly when the data is of a sensitive nature. A default of requiring the user to specify the origin (allowing `*` but encouraging a narrower scope) would solve this problem entirely, I think. ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",764059235,"More flexible CORS support in core, to encourage good security practices", https://github.com/simonw/datasette/issues/1143#issuecomment-744757558,https://api.github.com/repos/simonw/datasette/issues/1143,744757558,MDEyOklzc3VlQ29tbWVudDc0NDc1NzU1OA==,9599,simonw,2020-12-14T22:42:10Z,2020-12-14T22:42:10Z,OWNER,"This may involve a breaking change to the CLI settings interface, so I'm adding this to the 1.0 milestone.","{""total_count"": 1, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 1, ""rocket"": 0, ""eyes"": 0}",764059235,"More flexible CORS support in core, to encourage good security practices", https://github.com/simonw/datasette/issues/1143#issuecomment-744756861,https://api.github.com/repos/simonw/datasette/issues/1143,744756861,MDEyOklzc3VlQ29tbWVudDc0NDc1Njg2MQ==,9599,simonw,2020-12-14T22:40:28Z,2020-12-14T22:40:28Z,OWNER,"That's a very convincing argument. I'm keen on making sure Datasette is ""secure by default"" so you're right, encouraging finely grains CORS rules in core rather than leaving that to a plugin sounds like the right call.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",764059235,"More flexible CORS support in core, to encourage good security practices", https://github.com/simonw/datasette/issues/1142#issuecomment-744576894,https://api.github.com/repos/simonw/datasette/issues/1142,744576894,MDEyOklzc3VlQ29tbWVudDc0NDU3Njg5NA==,9599,simonw,2020-12-14T17:03:13Z,2020-12-14T17:03:13Z,OWNER,"I'm not sure about the radio boxes for JSON, just because you can't right-click on a radio box and copy it to your clipboard like you can with links. Worth trying it out though. 
The radio boxes for that CSV option are definitely the right way to go.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",763361458,"""Stream all rows"" is not at all obvious", https://github.com/simonw/datasette/issues/1142#issuecomment-744563209,https://api.github.com/repos/simonw/datasette/issues/1142,744563209,MDEyOklzc3VlQ29tbWVudDc0NDU2MzIwOQ==,9599,simonw,2020-12-14T16:41:11Z,2020-12-14T16:41:11Z,OWNER,"To check out and start the server: /tmp % git clone git@github.com:nitinpaul/datasette Cloning into 'datasette'... remote: Enumerating objects: 124, done. # ... datasette % python3 -m venv venv datasette % source venv/bin/activate (venv) datasette % pip install -e '.[test]' Obtaining file:///private/tmp/datasette Collecting asgiref<3.4.0,>=3.2.10 Using cached asgiref-3.3.1-py3-none-any.whl (19 kB) # ... (venv) datasette % datasette INFO: Started server process [24002] INFO: Waiting for application startup. INFO: Application startup complete. INFO: Uvicorn running on http://127.0.0.1:8001 (Press CTRL+C to quit) And to run the tests: (venv) datasette % pytest ======================================================================== test session starts ======================================================================== platform darwin -- Python 3.9.1, pytest-6.1.2, py-1.10.0, pluggy-0.13.1 SQLite: 3.34.0 rootdir: /private/tmp/datasette, configfile: pytest.ini plugins: asyncio-0.14.0, timeout-1.4.2 collected 841 items tests/test_package.py .. [ 0%] ","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",763361458,"""Stream all rows"" is not at all obvious", https://github.com/simonw/datasette/issues/1142#issuecomment-744522099,https://api.github.com/repos/simonw/datasette/issues/1142,744522099,MDEyOklzc3VlQ29tbWVudDc0NDUyMjA5OQ==,6622733,nitinpaultifr,2020-12-14T15:37:47Z,2020-12-14T15:37:47Z,NONE,"Alright I could give it a try! This might be a stupid question, can you tell me how to run the server from my fork? So that I can test the changes?","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",763361458,"""Stream all rows"" is not at all obvious", https://github.com/simonw/datasette/issues/1144#issuecomment-744489028,https://api.github.com/repos/simonw/datasette/issues/1144,744489028,MDEyOklzc3VlQ29tbWVudDc0NDQ4OTAyOA==,475613,MarkusH,2020-12-14T14:47:11Z,2020-12-14T14:47:11Z,NONE,"Thanks for opening the issue, @simonw. Let me elaborate on my Tweets. [datasette-chartjs](https://github.com/MarkusH/datasette-chartjs) provides drop down lists to pick the chart visualization (e.g. bar, line, doughnut, pie, ...) as well as the column used for the ""x axis"" (e.g. time). A user can change the values on-demand. The chart will be redrawn w/o querying the database again. However, if a user wants to change the underlying query, they will use the SQL field provided by datasette or any of the other datasette built-in features to amend a query. In order to maintain a user's selections for the plugin, datasette-chartjs copies some parts of [datasette-vega](https://github.com/simonw/datasette-vega) which persist the chosen visualization and column in the hash part of a URL (the stuff behind the `#`). The plugin load the config from the hash upon initialization on the next page and use it accordingly. 
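To make that concrete, here is a rough sketch of the prefix-based round-trip between a plugin's config and the URL hash. This is illustration only - the prefix, key names and helper functions are hypothetical, and it is written in Python purely because that is compact; the real plugins do this in JavaScript against `location.hash`: ```python
from urllib.parse import urlencode, parse_qsl

PREFIX = 'chartjs-'  # hypothetical prefix so several plugins can share the hash

def config_to_fragment(config):
    # Serialize only this plugin's options, each key carrying the prefix
    return urlencode({PREFIX + key: value for key, value in config.items()})

def fragment_to_config(fragment):
    # Keep only the options with our prefix and strip the prefix off again
    return {
        key[len(PREFIX):]: value
        for key, value in parse_qsl(fragment)
        if key.startswith(PREFIX)
    }

# Round-trip example
fragment = config_to_fragment({'type': 'bar', 'x-column': 'created'})
assert fragment_to_config(fragment) == {'type': 'bar', 'x-column': 'created'}
```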
Additionally, datasette-vega and datasette-chartjs need to make sure to include the hash in all links and forms that cause a reload of the page. This is, such that the config persists between clicks. This ticket is about moving thes parts into datasette that provide the functionality to do so. This includes: 1. a way to load config options with a given prefix from the current URL hash 1. a way to update the current URL hash with a new config value or a bunch of config options 1. updating all necessary links and forms on the current page to include the URL hash whenever its updated 1. to prevent leaking config options to external pages, only ""internal"" links should be updated There's another, optional, feature that we might want to think about during the design phase: the scope of the config. Links within a datasette instance have 1 of 3 scopes: 1. global, for the whole datasette project 1. database, for all tables in a database 1. table, only for a table within a database When updating the links and forms as pointed out in 3. above, it might be worth considering which links need to be updated. I could imagine a plugin that wants to persist some setting across all tables within a database but another setting only within a table.","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",765637324,JavaScript to help plugins interact with the fragment part of the URL, https://github.com/simonw/datasette/pull/1145#issuecomment-744475543,https://api.github.com/repos/simonw/datasette/issues/1145,744475543,MDEyOklzc3VlQ29tbWVudDc0NDQ3NTU0Mw==,22429695,codecov[bot],2020-12-14T14:26:25Z,2020-12-14T14:26:25Z,NONE,"# [Codecov](https://codecov.io/gh/simonw/datasette/pull/1145?src=pr&el=h1) Report > Merging [#1145](https://codecov.io/gh/simonw/datasette/pull/1145?src=pr&el=desc) (a8588f9) into [main](https://codecov.io/gh/simonw/datasette/commit/0c616f732cee79db80cad830917666f41b344262?el=desc) (0c616f7) will **not change** coverage. > The diff coverage is `n/a`. [](https://codecov.io/gh/simonw/datasette/pull/1145?src=pr&el=tree) ```diff @@ Coverage Diff @@ ## main #1145 +/- ## ======================================= Coverage 91.41% 91.41% ======================================= Files 31 31 Lines 3881 3881 ======================================= Hits 3548 3548 Misses 333 333 ``` ------ [Continue to review full report at Codecov](https://codecov.io/gh/simonw/datasette/pull/1145?src=pr&el=continue). > **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta) > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data` > Powered by [Codecov](https://codecov.io/gh/simonw/datasette/pull/1145?src=pr&el=footer). Last update [0c616f7...a8588f9](https://codecov.io/gh/simonw/datasette/pull/1145?src=pr&el=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments). ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",766494367,"Update pytest requirement from <6.2.0,>=5.2.2 to >=5.2.2,<6.3.0", https://github.com/simonw/datasette/issues/276#issuecomment-744461856,https://api.github.com/repos/simonw/datasette/issues/276,744461856,MDEyOklzc3VlQ29tbWVudDc0NDQ2MTg1Ng==,296686,robintw,2020-12-14T14:04:57Z,2020-12-14T14:04:57Z,NONE,"I'm looking into using datasette with a database with spatialite geometry columns, and came across this issue. Has there been any progress on this since 2018? 
In one of my tables I'm just storing lat/lon points in a spatialite point geometry, and I've managed to make datasette-cluster-map display the points by extracting the lat and lon in SQL - using something like `select ... ST_X(location) as longitude, ST_Y(location) as latitude from Blah`. Something more 'built-in' would be great though - particularly for the tables I have that store more complex geometries.","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324835838,Handle spatialite geometry columns better, https://github.com/simonw/datasette/issues/1142#issuecomment-744251252,https://api.github.com/repos/simonw/datasette/issues/1142,744251252,MDEyOklzc3VlQ29tbWVudDc0NDI1MTI1Mg==,9599,simonw,2020-12-14T07:56:38Z,2020-12-14T07:56:38Z,OWNER,That's a really solid design for this! I'd be very happy to review a pull request - you should be able to implement this with just template edits and some CSS changes I think.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",763361458,"""Stream all rows"" is not at all obvious", https://github.com/simonw/datasette/issues/1143#issuecomment-744249157,https://api.github.com/repos/simonw/datasette/issues/1143,744249157,MDEyOklzc3VlQ29tbWVudDc0NDI0OTE1Nw==,9599,simonw,2020-12-14T07:53:15Z,2020-12-14T07:53:15Z,OWNER,"Does this plugin do everything you need? https://github.com/simonw/datasette-cors I'm open to arguments as to why this should be in core rather than in a plugin - I'm on the fence about that at the moment.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",764059235,"More flexible CORS support in core, to encourage good security practices", https://github.com/simonw/datasette/issues/741#issuecomment-744142692,https://api.github.com/repos/simonw/datasette/issues/741,744142692,MDEyOklzc3VlQ29tbWVudDc0NDE0MjY5Mg==,9599,simonw,2020-12-14T03:28:56Z,2020-12-14T03:28:56Z,OWNER,I'm going to try this out on `datasette-publish-vercel` first.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",607223136,"Replace ""datasette publish --extra-options"" with ""--setting""", https://github.com/simonw/datasette/issues/983#issuecomment-744066249,https://api.github.com/repos/simonw/datasette/issues/983,744066249,MDEyOklzc3VlQ29tbWVudDc0NDA2NjI0OQ==,9599,simonw,2020-12-13T20:47:52Z,2020-12-13T20:47:52Z,OWNER,"@yozlet just spotted this comment. Wow that is interesting! With the right plugin hooks on the page (see also #987) one relatively simple way to do that could be with bookmarklets - users could install bookmarklets which, when executed against a Datasette page in their browser, use the existing JavaScript plugin integration points to add all kinds of functionality. 
Doing full sandboxing is certainly daunting, but it looks like Figma figured it out so TIL it's technically feasible.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",712260429,JavaScript plugin hooks mechanism similar to pluggy, https://github.com/simonw/datasette/pull/1031#issuecomment-744003454,https://api.github.com/repos/simonw/datasette/issues/1031,744003454,MDEyOklzc3VlQ29tbWVudDc0NDAwMzQ1NA==,299380,frankier,2020-12-13T12:52:56Z,2020-12-13T12:52:56Z,NONE,"Please let me know if there's anything I can do to help get this merged. This is causing problems for me because it means when I build my Docker image my databases aren't considered immutable, which I would like them to be so that a download link is produced.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",724369025,Fallback to databases in inspect-data.json when no -i options are passed, https://github.com/simonw/datasette/issues/1142#issuecomment-743998792,https://api.github.com/repos/simonw/datasette/issues/1142,743998792,MDEyOklzc3VlQ29tbWVudDc0Mzk5ODc5Mg==,6622733,nitinpaultifr,2020-12-13T12:14:06Z,2020-12-13T12:14:06Z,NONE,"Agreed, it would definitely provide better controls. However, I do feel it makes for a bit of inconsistent UX for the 'Advanced export' section, with links to download for JSON, checkboxes and radio buttons + button to download for CSV. Do you think this example makes the UX a bit nicer/consistent?  I could give it a try if you'd like but I've never contributed to an actual project! ","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",763361458,"""Stream all rows"" is not at all obvious", https://github.com/simonw/sqlite-utils/issues/207#issuecomment-743966801,https://api.github.com/repos/simonw/sqlite-utils/issues/207,743966801,MDEyOklzc3VlQ29tbWVudDc0Mzk2NjgwMQ==,9599,simonw,2020-12-13T07:25:23Z,2020-12-13T07:25:23Z,OWNER,"CLI documentation: https://sqlite-utils.readthedocs.io/en/latest/cli.html#analyzing-tables Python library documentation: https://sqlite-utils.readthedocs.io/en/latest/python-api.html#analyzing-a-column","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",763283616,sqlite-utils analyze-tables command, https://github.com/simonw/sqlite-utils/pull/203#issuecomment-743966289,https://api.github.com/repos/simonw/sqlite-utils/issues/203,743966289,MDEyOklzc3VlQ29tbWVudDc0Mzk2NjI4OQ==,9599,simonw,2020-12-13T07:20:51Z,2020-12-13T07:20:51Z,OWNER,Sorry for not reviewing this yet! 
I'll try to carve out time to look at it in the next few days.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",743384829,changes to allow for compound foreign keys, https://github.com/simonw/sqlite-utils/pull/208#issuecomment-743956666,https://api.github.com/repos/simonw/sqlite-utils/issues/208,743956666,MDEyOklzc3VlQ29tbWVudDc0Mzk1NjY2Ng==,9599,simonw,2020-12-13T05:44:49Z,2020-12-13T05:44:49Z,OWNER,"Example output: ``` % sqlite-utils analyze-tables github.db tags tags.repo: (1/3) Total rows: 261 Null rows: 0 Blank rows: 0 Distinct values: 14 Most common: 88: 107914493 75: 140912432 27: 206156866 21: 207052882 17: 197431109 8: 197882382 5: 256834907 5: 205429375 4: 248903544 3: 206202864 Least common: 1: 209590345 2: 206649770 2: 303218369 3: 206202864 3: 213286752 4: 248903544 5: 205429375 5: 256834907 8: 197882382 17: 197431109 tags.name: (2/3) Total rows: 261 Null rows: 0 Blank rows: 0 Distinct values: 175 Most common: 10: 0.2 9: 0.1 7: 0.3 6: 0.4 5: 0.7 5: 0.5 5: 0.1a 4: 0.9 4: 0.8 4: 0.6 Least common: 1: 0.1.1 1: 0.11.1 1: 0.1a2 1: 0.20.1 1: 0.21.1 1: 0.21.2 1: 0.21.3 1: 0.22 1: 0.22.1 1: 0.23 tags.sha: (3/3) Total rows: 261 Null rows: 0 Blank rows: 0 Distinct values: 261 ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",763320133,sqlite-utils analyze-tables command and table.analyze_column() method, https://github.com/simonw/datasette/issues/1142#issuecomment-743913004,https://api.github.com/repos/simonw/datasette/issues/1142,743913004,MDEyOklzc3VlQ29tbWVudDc0MzkxMzAwNA==,9599,simonw,2020-12-12T22:17:46Z,2020-12-12T22:17:46Z,OWNER,"You're actually choosing between two options here: the 100 rows you can see on the screen, or the x,000 rows that match the current query. Maybe a radio box would be more obvious?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",763361458,"""Stream all rows"" is not at all obvious", https://github.com/simonw/datasette/issues/1142#issuecomment-743912875,https://api.github.com/repos/simonw/datasette/issues/1142,743912875,MDEyOklzc3VlQ29tbWVudDc0MzkxMjg3NQ==,9599,simonw,2020-12-12T22:16:38Z,2020-12-12T22:16:38Z,OWNER,"Yeah, maybe with the number of rows to make it completely clear. 
`Include all 2,455 rows` perhaps.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",763361458,"""Stream all rows"" is not at all obvious", https://github.com/simonw/datasette/issues/1142#issuecomment-743732440,https://api.github.com/repos/simonw/datasette/issues/1142,743732440,MDEyOklzc3VlQ29tbWVudDc0MzczMjQ0MA==,6622733,nitinpaultifr,2020-12-12T09:56:40Z,2020-12-12T09:56:40Z,NONE,'Include all rows' seem like a fairly obvious alternative,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",763361458,"""Stream all rows"" is not at all obvious", https://github.com/simonw/sqlite-utils/pull/208#issuecomment-743708524,https://api.github.com/repos/simonw/sqlite-utils/issues/208,743708524,MDEyOklzc3VlQ29tbWVudDc0MzcwODUyNA==,9599,simonw,2020-12-12T05:48:20Z,2020-12-12T05:48:32Z,OWNER,"``` % sqlite-utils analyze-tables ../datasette/fixtures.db facetable --column pk 1/1: ColumnDetails(table='facetable', column='pk', total_rows=15, num_null=0, num_blank=0, num_distinct=15, most_common=None, least_common=None) ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",763320133,sqlite-utils analyze-tables command and table.analyze_column() method, https://github.com/simonw/sqlite-utils/pull/208#issuecomment-743708325,https://api.github.com/repos/simonw/sqlite-utils/issues/208,743708325,MDEyOklzc3VlQ29tbWVudDc0MzcwODMyNQ==,9599,simonw,2020-12-12T05:46:27Z,2020-12-12T05:46:27Z,OWNER,"It would be neat if you could optionally specify a subset of columns to analyze, using `-c` or `--column`.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",763320133,sqlite-utils analyze-tables command and table.analyze_column() method, https://github.com/simonw/sqlite-utils/pull/208#issuecomment-743708169,https://api.github.com/repos/simonw/sqlite-utils/issues/208,743708169,MDEyOklzc3VlQ29tbWVudDc0MzcwODE2OQ==,9599,simonw,2020-12-12T05:44:46Z,2020-12-12T05:44:46Z,OWNER,"If there are less than ten values is it worth outputting them twice, once in `most_common` and then in reverse in `least_common`? 
Feels redundant - I think I should leave `least_common` empty in that case.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",763320133,sqlite-utils analyze-tables command and table.analyze_column() method, https://github.com/simonw/sqlite-utils/pull/208#issuecomment-743708080,https://api.github.com/repos/simonw/sqlite-utils/issues/208,743708080,MDEyOklzc3VlQ29tbWVudDc0MzcwODA4MA==,9599,simonw,2020-12-12T05:43:45Z,2020-12-12T05:43:45Z,OWNER,"CLI output looks like this at the moment, which is bad: ``` % sqlite-utils analyze-tables ../datasette/fixtures.db facetable 1/10: ColumnDetails(table='facetable', column='pk', total_rows=15, num_null=0, num_blank=0, num_distinct=15, most_common=None, least_common=None) 2/10: ColumnDetails(table='facetable', column='created', total_rows=15, num_null=0, num_blank=0, num_distinct=4, most_common=[('2019-01-17 08:00:00', 4), ('2019-01-15 08:00:00', 4), ('2019-01-14 08:00:00', 4), ('2019-01-16 08:00:00', 3)], least_common=[('2019-01-16 08:00:00', 3), ('2019-01-14 08:00:00', 4), ('2019-01-15 08:00:00', 4), ('2019-01-17 08:00:00', 4)]) 3/10: ColumnDetails(table='facetable', column='planet_int', total_rows=15, num_null=0, num_blank=0, num_distinct=2, most_common=[(1, 14), (2, 1)], least_common=[(2, 1), (1, 14)]) 4/10: ColumnDetails(table='facetable', column='on_earth', total_rows=15, num_null=0, num_blank=0, num_distinct=2, most_common=[(1, 14), (0, 1)], least_common=[(0, 1), (1, 14)]) 5/10: ColumnDetails(table='facetable', column='state', total_rows=15, num_null=0, num_blank=0, num_distinct=3, most_common=[('CA', 10), ('MI', 4), ('MC', 1)], least_common=[('MC', 1), ('MI', 4), ('CA', 10)]) 6/10: ColumnDetails(table='facetable', column='city_id', total_rows=15, num_null=0, num_blank=0, num_distinct=4, most_common=[(1, 6), (3, 4), (2, 4), (4, 1)], least_common=[(4, 1), (2, 4), (3, 4), (1, 6)]) 7/10: ColumnDetails(table='facetable', column='neighborhood', total_rows=15, num_null=0, num_blank=0, num_distinct=14, most_common=[('Downtown', 2), ('Tenderloin', 1), ('SOMA', 1), ('Mission', 1), ('Mexicantown', 1), ('Los Feliz', 1), ('Koreatown', 1), ('Hollywood', 1), ('Hayes Valley', 1), ('Greektown', 1)], least_common=[('Arcadia Planitia', 1), ('Bernal Heights', 1), ('Corktown', 1), ('Dogpatch', 1), ('Greektown', 1), ('Hayes Valley', 1), ('Hollywood', 1), ('Koreatown', 1), ('Los Feliz', 1), ('Mexicantown', 1)]) 8/10: ColumnDetails(table='facetable', column='tags', total_rows=15, num_null=0, num_blank=0, num_distinct=3, most_common=[('[]', 13), ('[""tag1"", ""tag3""]', 1), ('[""tag1"", ""tag2""]', 1)], least_common=[('[""tag1"", ""tag2""]', 1), ('[""tag1"", ""tag3""]', 1), ('[]', 13)]) 9/10: ColumnDetails(table='facetable', column='complex_array', total_rows=15, num_null=0, num_blank=0, num_distinct=2, most_common=[('[]', 14), ('[{""foo"": ""bar""}]', 1)], least_common=[('[{""foo"": ""bar""}]', 1), ('[]', 14)]) 10/10: ColumnDetails(table='facetable', column='distinct_some_null', total_rows=15, num_null=13, num_blank=0, num_distinct=2, most_common=[(None, 13), ('two', 1), ('one', 1)], least_common=[('one', 1), ('two', 1), (None, 13)]) (sqlite-utils) sqlite-utils % ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",763320133,sqlite-utils analyze-tables command and table.analyze_column() method, 
https://github.com/simonw/sqlite-utils/pull/208#issuecomment-743707969,https://api.github.com/repos/simonw/sqlite-utils/issues/208,743707969,MDEyOklzc3VlQ29tbWVudDc0MzcwNzk2OQ==,9599,simonw,2020-12-12T05:42:26Z,2020-12-12T05:43:06Z,OWNER,"Should truncate values in the least/most common JSON array to a sensible length, otherwise you end up with stuff like this: ```json [ [ ""b'\\x00\\x05barry\\x03\\x01\\x02\\x00\\x00\\x03cat\\x03\\x01\\x03\\x00\\x00\\x03dog\\x08\\x01\\x01\\x01\\x03\\x00\\x01\\x03\\x00\\x00\\x07panther\\x05\\x01\\x01\\x02\\x02\\x00\\x01\\x03uma\\x05\\x02\\x01\\x02\\x02\\x00\\x00\\x04sara\\x05\\x02\\x01\\x01\\x02\\x00\\x00\\x05terry\\x08\\x01\\x01\\x01\\x02\\x00\\x01\\x02\\x00\\x00\\x06weasel\\x05\\x02\\x01\\x01\\x03\\x00'"", 1 ] ] ``` This example also shows that binary values (like those in `_fts` tables) look a bit weird, but I think I'm OK with that since binary data can't be represented neatly in JSON anyway.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",763320133,sqlite-utils analyze-tables command and table.analyze_column() method, https://github.com/simonw/sqlite-utils/issues/207#issuecomment-743701697,https://api.github.com/repos/simonw/sqlite-utils/issues/207,743701697,MDEyOklzc3VlQ29tbWVudDc0MzcwMTY5Nw==,9599,simonw,2020-12-12T04:39:51Z,2020-12-12T04:39:51Z,OWNER,"CLI could be: sqlite-utils analyze-tables To analyze all tables or: sqlite-utils analyze-tables table1 table2 To analyze specific tables.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",763283616,sqlite-utils analyze-tables command, https://github.com/simonw/sqlite-utils/issues/207#issuecomment-743701599,https://api.github.com/repos/simonw/sqlite-utils/issues/207,743701599,MDEyOklzc3VlQ29tbWVudDc0MzcwMTU5OQ==,9599,simonw,2020-12-12T04:38:52Z,2020-12-12T04:39:07Z,OWNER,I'll add a `table.analyze_column(column)` method which is used by the CLI tool - with a note that this is an unstable interface which may change in the future.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",763283616,sqlite-utils analyze-tables command, https://github.com/simonw/sqlite-utils/issues/207#issuecomment-743701422,https://api.github.com/repos/simonw/sqlite-utils/issues/207,743701422,MDEyOklzc3VlQ29tbWVudDc0MzcwMTQyMg==,9599,simonw,2020-12-12T04:37:14Z,2020-12-12T04:38:25Z,OWNER,"Prototype: ```python from collections import namedtuple ColumnDetails = namedtuple(""ColumnDetails"", (""column"", ""num_null"", ""num_blank"", ""num_distinct"", ""most_common"", ""least_common"")) def analyze_column(db, table, column, values=10): num_null = db.execute(""select count(*) from [{}] where [{}] is null"".format(table, column)).fetchone()[0] num_blank = db.execute(""select count(*) from [{}] where [{}] = ''"".format(table, column)).fetchone()[0] num_distinct = db.execute(""select count(distinct [{}]) from [{}]"".format(column, table)).fetchone()[0] most_common = None least_common = None if num_distinct != 1: most_common = [(r[0], r[1]) for r in db.execute( ""select [{}], count(*) from [{}] group by [{}] order by count(*) desc limit "".format(column, table, column, values) ).fetchall()] if num_distinct <= values: # No need to run the query if it will just return the results in revers order least_common = most_common[::-1] else: least_common = [(r[0], r[1]) for r in db.execute( ""select 
[{}], count(*) from [{}] group by [{}] order by count(*) limit {}"".format(column, table, column, values) ).fetchall()] return ColumnDetails(column, num_null, num_blank, num_distinct, most_common, least_common) def analyze_table(db, table): for column in db[table].columns: details = analyze_column(db, table, column.name) print(details) ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",763283616,sqlite-utils analyze-tables command, https://github.com/simonw/datasette/issues/998#issuecomment-743080047,https://api.github.com/repos/simonw/datasette/issues/998,743080047,MDEyOklzc3VlQ29tbWVudDc0MzA4MDA0Nw==,6371750,JBPressac,2020-12-11T09:25:09Z,2020-12-11T09:25:09Z,CONTRIBUTOR,"Hello Simon, I have a similar problem with horizontal scrollbar display with Datasette version 0.51 and superior for a table with more than 30 rows. With Datasette 0.50, the horizontal scrollbar is displayed, if I upgrade Datasette to 0.51 and superior, the horizontal scrollbar disappears. Datasette 0.50: horizontal scrollbar  Datasette 0.51 and superior: no horizontal scrollbar  Thanks,","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",717699884,Wide tables should scroll horizontally within the page, https://github.com/simonw/sqlite-utils/issues/205#issuecomment-742737794,https://api.github.com/repos/simonw/sqlite-utils/issues/205,742737794,MDEyOklzc3VlQ29tbWVudDc0MjczNzc5NA==,9599,simonw,2020-12-10T19:18:22Z,2020-12-10T19:18:22Z,OWNER,"Yup, it looks like you're using a window function that was added in SQLite 3.25.0: https://www.sqlite.org/changes.html#version_3_25_0","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",760960559,"sqlite3.OperationalError: near ""("": syntax error", https://github.com/simonw/sqlite-utils/issues/205#issuecomment-742299584,https://api.github.com/repos/simonw/sqlite-utils/issues/205,742299584,MDEyOklzc3VlQ29tbWVudDc0MjI5OTU4NA==,765871,kaihendry,2020-12-10T07:24:22Z,2020-12-10T07:24:22Z,NONE,Bumping to ubuntu-20.04 appears to have solved my syntax error. 🤷,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",760960559,"sqlite3.OperationalError: near ""("": syntax error", https://github.com/simonw/datasette/issues/1134#issuecomment-742260116,https://api.github.com/repos/simonw/datasette/issues/1134,742260116,MDEyOklzc3VlQ29tbWVudDc0MjI2MDExNg==,2181410,clausjuhl,2020-12-10T05:57:17Z,2020-12-10T05:57:17Z,NONE,"Hi Simon Thank you for the quick fix! And glad you like our use of Datasette (launches 1. january 2021). It's a site that currently (more to come) makes all minutes and their annexes from Aarhus City Council and the major committees (1997-2019) available to the public. So we're putting Datasette to good use :)","{""total_count"": 2, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 2, ""rocket"": 0, ""eyes"": 0}",760312579,"""_searchmode=raw"" throws an index out of range error when combined with ""_search_COLUMN""", https://github.com/simonw/datasette/issues/1091#issuecomment-741992106,https://api.github.com/repos/simonw/datasette/issues/1091,741992106,MDEyOklzc3VlQ29tbWVudDc0MTk5MjEwNg==,9599,simonw,2020-12-09T19:19:54Z,2020-12-09T20:27:45Z,OWNER,"Could you try removing the `ProxyPassReverse /datasette http://0.0.0.0:8001` line? 
My hunch is that `ProxyPassReverse` is rewriting some of the links in the HTML (or maybe in the HTTP headers) in a way that breaks things. Normally you would need `ProxyPassReverse` to compensate for the underlying application being unable to rewrite its links - but Datasette's `base_url` setting causes Datasette to rewrite all of the links for you, so `ProxyPassReverse` should be unnecessary.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",742011049,.json and .csv exports fail to apply base_url, https://github.com/simonw/datasette/issues/1134#issuecomment-742024588,https://api.github.com/repos/simonw/datasette/issues/1134,742024588,MDEyOklzc3VlQ29tbWVudDc0MjAyNDU4OA==,9599,simonw,2020-12-09T20:19:59Z,2020-12-09T20:20:33Z,OWNER,https://byraadsarkivet.aarhus.dk/db/cases?_searchmode=raw&_search=sundhedsfrem%2A is an absolutely beautiful example of a themed Datasette! Very excited to show this to people.,"{""total_count"": 1, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 1, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",760312579,"""_searchmode=raw"" throws an index out of range error when combined with ""_search_COLUMN""", https://github.com/simonw/datasette/issues/1134#issuecomment-742023775,https://api.github.com/repos/simonw/datasette/issues/1134,742023775,MDEyOklzc3VlQ29tbWVudDc0MjAyMzc3NQ==,9599,simonw,2020-12-09T20:18:23Z,2020-12-09T20:18:23Z,OWNER,A fix for this should be available if you upgrade to 0.52.5,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",760312579,"""_searchmode=raw"" throws an index out of range error when combined with ""_search_COLUMN""", https://github.com/simonw/datasette/issues/1091#issuecomment-742023541,https://api.github.com/repos/simonw/datasette/issues/1091,742023541,MDEyOklzc3VlQ29tbWVudDc0MjAyMzU0MQ==,9599,simonw,2020-12-09T20:17:54Z,2020-12-09T20:17:54Z,OWNER,OK that is really weird. 
I'll have another go at replicating this locally.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",742011049,.json and .csv exports fail to apply base_url, https://github.com/simonw/datasette/issues/1136#issuecomment-742023111,https://api.github.com/repos/simonw/datasette/issues/1136,742023111,MDEyOklzc3VlQ29tbWVudDc0MjAyMzExMQ==,9599,simonw,2020-12-09T20:17:02Z,2020-12-09T20:17:02Z,OWNER,Documentation for this procedure is now here: https://docs.datasette.io/en/latest/contributing.html#releasing-bug-fixes-from-a-branch,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",760621356,Establish pattern for release branches to support bug fixes, https://github.com/simonw/datasette/issues/1136#issuecomment-742022222,https://api.github.com/repos/simonw/datasette/issues/1136,742022222,MDEyOklzc3VlQ29tbWVudDc0MjAyMjIyMg==,9599,simonw,2020-12-09T20:15:24Z,2020-12-09T20:15:51Z,OWNER,Used this procedure for the first time for 0.52.5 - deploy run here: https://github.com/simonw/datasette/actions/runs/411465648 - PyPI release here: https://pypi.org/project/datasette/0.52.5/,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",760621356,Establish pattern for release branches to support bug fixes, https://github.com/simonw/datasette/issues/1136#issuecomment-742017622,https://api.github.com/repos/simonw/datasette/issues/1136,742017622,MDEyOklzc3VlQ29tbWVudDc0MjAxNzYyMg==,9599,simonw,2020-12-09T20:06:47Z,2020-12-09T20:06:47Z,OWNER,"Then I can ship the release directly from that branch, creating the tag as part of the release process: <img width=""1057"" alt=""New_release_·_simonw_datasette"" src=""https://user-images.githubusercontent.com/9599/101681442-ffc93500-3a16-11eb-8801-3a0e357a2bc1.png""> ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",760621356,Establish pattern for release branches to support bug fixes, https://github.com/simonw/datasette/issues/1136#issuecomment-742014881,https://api.github.com/repos/simonw/datasette/issues/1136,742014881,MDEyOklzc3VlQ29tbWVudDc0MjAxNDg4MQ==,9599,simonw,2020-12-09T20:01:27Z,2020-12-09T20:01:27Z,OWNER,"I'll write the release notes in the branch, then cherry-pick them over to `main`.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",760621356,Establish pattern for release branches to support bug fixes, https://github.com/simonw/datasette/issues/1136#issuecomment-742014366,https://api.github.com/repos/simonw/datasette/issues/1136,742014366,MDEyOklzc3VlQ29tbWVudDc0MjAxNDM2Ng==,9599,simonw,2020-12-09T20:00:35Z,2020-12-09T20:00:35Z,OWNER,"Actually I'll start from 0.52.4 and then cherry-pick the fixes. git branch 0.52.x 0.52.4 git checkout 0.52.x git cherry-pick COMMIT","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",760621356,Establish pattern for release branches to support bug fixes, https://github.com/simonw/datasette/issues/1091#issuecomment-742010306,https://api.github.com/repos/simonw/datasette/issues/1091,742010306,MDEyOklzc3VlQ29tbWVudDc0MjAxMDMwNg==,6739646,tballison,2020-12-09T19:53:18Z,2020-12-09T19:59:52Z,NONE,"I can't imagine this helps (esp. 
given your point about potential rewrites), but you can see that /datasette/ was correctly added to the sql form, but not to the ""export-links"" <img width=""484"" alt=""Screen Shot 2020-12-09 at 2 51 09 PM"" src=""https://user-images.githubusercontent.com/6739646/101680055-234baa00-3a2e-11eb-8650-2b369bc6f031.png""> ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",742011049,.json and .csv exports fail to apply base_url, https://github.com/simonw/datasette/issues/1134#issuecomment-742012324,https://api.github.com/repos/simonw/datasette/issues/1134,742012324,MDEyOklzc3VlQ29tbWVudDc0MjAxMjMyNA==,9599,simonw,2020-12-09T19:57:05Z,2020-12-09T19:57:05Z,OWNER,Thanks for the bug report!,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",760312579,"""_searchmode=raw"" throws an index out of range error when combined with ""_search_COLUMN""", https://github.com/simonw/datasette/issues/1136#issuecomment-742009294,https://api.github.com/repos/simonw/datasette/issues/1136,742009294,MDEyOklzc3VlQ29tbWVudDc0MjAwOTI5NA==,9599,simonw,2020-12-09T19:51:18Z,2020-12-09T19:51:18Z,OWNER,"Likewise, Read The Docs publishes as stable the docs from the latest tagged release, so I would expect that to work fine as well.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",760621356,Establish pattern for release branches to support bug fixes, https://github.com/simonw/datasette/issues/1136#issuecomment-742009101,https://api.github.com/repos/simonw/datasette/issues/1136,742009101,MDEyOklzc3VlQ29tbWVudDc0MjAwOTEwMQ==,9599,simonw,2020-12-09T19:50:53Z,2020-12-09T19:50:53Z,OWNER,"My concern is if this will break anything about CI. I don't think it will - the code that deploys the latest `main` to https://latest.datasette.io/ should be unaffected, and the checkout code in `publish.yml` should check out the correct code based on the tag used for that release.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",760621356,Establish pattern for release branches to support bug fixes, https://github.com/simonw/datasette/issues/1136#issuecomment-742008087,https://api.github.com/repos/simonw/datasette/issues/1136,742008087,MDEyOklzc3VlQ29tbWVudDc0MjAwODA4Nw==,9599,simonw,2020-12-09T19:48:56Z,2020-12-09T19:48:56Z,OWNER,I think I'm going to create a branch called `0.52.x` that starts with `8ae0f9f7f0d644b0161165a1084f53acd2786f7c` and then tag the release from there.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",760621356,Establish pattern for release branches to support bug fixes, https://github.com/simonw/datasette/issues/1091#issuecomment-742001510,https://api.github.com/repos/simonw/datasette/issues/1091,742001510,MDEyOklzc3VlQ29tbWVudDc0MjAwMTUxMA==,6739646,tballison,2020-12-09T19:36:42Z,2020-12-09T19:38:04Z,NONE,"I don't think this fixes it: ``` grep -R datasette . 
./sites-available/000-default.conf: ProxyPass /datasette http://127.0.0.1:8001/ ./sites-available/000-default.conf: #ProxyPassReverse /datasette http://127.0.0.1:8001/ ./sites-available/corpora-le-ssl.conf: ProxyPass /datasette http://0.0.0.0:8001 ./sites-available/corpora-le-ssl.conf: #ProxyPassReverse /datasette http://0.0.0.0:8001 ./sites-enabled/corpora-le-ssl.conf: ProxyPass /datasette http://0.0.0.0:8001 ./sites-enabled/corpora-le-ssl.conf: #ProxyPassReverse /datasette http://0.0.0.0:8001 ``` And I confirmed that I actually restarted the server. :rofl: https://corpora.tika.apache.org/datasette/file_profiles","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",742011049,.json and .csv exports fail to apply base_url, https://github.com/simonw/datasette/issues/1091#issuecomment-741804334,https://api.github.com/repos/simonw/datasette/issues/1091,741804334,MDEyOklzc3VlQ29tbWVudDc0MTgwNDMzNA==,6739646,tballison,2020-12-09T14:26:05Z,2020-12-09T14:26:05Z,NONE,"Anything we can do to help debug this? Thank you, again!","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",742011049,.json and .csv exports fail to apply base_url, https://github.com/simonw/datasette/issues/766#issuecomment-741665253,https://api.github.com/repos/simonw/datasette/issues/766,741665253,MDEyOklzc3VlQ29tbWVudDc0MTY2NTI1Mw==,2181410,clausjuhl,2020-12-09T09:59:05Z,2020-12-09T09:59:05Z,NONE,Hi Simon. Any news on using wildcard-searches with datasette? Thanks!,"{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",617323873,Enable wildcard-searches by default, https://github.com/simonw/datasette/issues/1133#issuecomment-740850920,https://api.github.com/repos/simonw/datasette/issues/1133,740850920,MDEyOklzc3VlQ29tbWVudDc0MDg1MDkyMA==,9599,simonw,2020-12-08T18:55:59Z,2020-12-08T18:55:59Z,OWNER,Inspiration was this script: https://gist.github.com/simonw/f6e3cd29fde5d15ea9cd746c942046ba - which pipes output through `tail -n +2` to strip off the headers.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",759695780,Option to omit header row in CSV export, https://github.com/simonw/datasette/issues/1133#issuecomment-740850057,https://api.github.com/repos/simonw/datasette/issues/1133,740850057,MDEyOklzc3VlQ29tbWVudDc0MDg1MDA1Nw==,9599,simonw,2020-12-08T18:55:29Z,2020-12-08T18:55:29Z,OWNER,Can work on this as part of #1062.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",759695780,Option to omit header row in CSV export, https://github.com/simonw/sqlite-utils/pull/204#issuecomment-740796067,https://api.github.com/repos/simonw/sqlite-utils/issues/204,740796067,MDEyOklzc3VlQ29tbWVudDc0MDc5NjA2Nw==,9599,simonw,2020-12-08T17:49:22Z,2020-12-08T17:49:22Z,OWNER,"Great catch, thank you.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",752888228,use jsonify_if_need for sql updates, https://github.com/simonw/datasette/issues/815#issuecomment-740385032,https://api.github.com/repos/simonw/datasette/issues/815,740385032,MDEyOklzc3VlQ29tbWVudDc0MDM4NTAzMg==,9599,simonw,2020-12-08T05:26:09Z,2020-12-08T05:26:16Z,OWNER,"Sure! 
It's a bit of a fiddle one - I've not found an approach that I like, but I also haven't thought about it since June. I'd love to see what you come up with!","{""total_count"": 1, ""+1"": 0, ""-1"": 0, ""laugh"": 1, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",634663505,Group permission checks by request on /-/permissions debug page, https://github.com/simonw/datasette/issues/815#issuecomment-740383884,https://api.github.com/repos/simonw/datasette/issues/815,740383884,MDEyOklzc3VlQ29tbWVudDc0MDM4Mzg4NA==,11761973,sturzl,2020-12-08T05:23:18Z,2020-12-08T05:23:18Z,NONE,hey! I'd like to take a look at this if you're open to a PR for it,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",634663505,Group permission checks by request on /-/permissions debug page, https://github.com/simonw/datasette/issues/1132#issuecomment-740228858,https://api.github.com/repos/simonw/datasette/issues/1132,740228858,MDEyOklzc3VlQ29tbWVudDc0MDIyODg1OA==,9599,simonw,2020-12-07T22:50:36Z,2020-12-07T22:50:36Z,OWNER,"Documented here: https://docs.datasette.io/en/latest/json_api.html#column-filter-arguments Demo: https://latest.datasette.io/fixtures/facetable?tags__arraynotcontains=tag2","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",758899581,New filter: array does not contain, https://github.com/simonw/datasette/issues/1131#issuecomment-739414118,https://api.github.com/repos/simonw/datasette/issues/1131,739414118,MDEyOklzc3VlQ29tbWVudDczOTQxNDExOA==,9599,simonw,2020-12-05T20:48:33Z,2020-12-05T20:48:33Z,OWNER,"Oddly enough, I tried fixing this with `sys.stderr.write(""{}\n"".format(e))` - but my Click `CLIRunner` tests failed because `result.stderr` was an empty string. Adding `sys.stderr.flush()` to the code that output errors fixed that issue.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",757481949,"""datasette inspect"" outputs invalid JSON if an error is logged", https://github.com/simonw/datasette/issues/398#issuecomment-739357330,https://api.github.com/repos/simonw/datasette/issues/398,739357330,MDEyOklzc3VlQ29tbWVudDczOTM1NzMzMA==,9599,simonw,2020-12-05T19:36:27Z,2020-12-05T19:36:27Z,OWNER,This was fixed in #749 by 88ac538b41a4753c3de9b509c3a0e13077f66182,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",398011658,Ensure downloading a 100+MB SQLite database file works, https://github.com/simonw/datasette/pull/1128#issuecomment-739355855,https://api.github.com/repos/simonw/datasette/issues/1128,739355855,MDEyOklzc3VlQ29tbWVudDczOTM1NTg1NQ==,9599,simonw,2020-12-05T19:34:57Z,2020-12-05T19:34:57Z,OWNER,Thanks for this!,"{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",756867924,Fix startup error on windows, https://github.com/simonw/datasette/issues/1131#issuecomment-739083673,https://api.github.com/repos/simonw/datasette/issues/1131,739083673,MDEyOklzc3VlQ29tbWVudDczOTA4MzY3Mw==,9599,simonw,2020-12-05T00:02:10Z,2020-12-05T00:02:10Z,OWNER,"https://clig.dev/#the-basics > **Send messaging to stderr**. Log messages, errors, and so on should all be sent to stderr. 
This means that when commands are piped together, these messages are displayed to the user and not fed into the next command.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",757481949,"""datasette inspect"" outputs invalid JSON if an error is logged", https://github.com/simonw/datasette/issues/1131#issuecomment-739083472,https://api.github.com/repos/simonw/datasette/issues/1131,739083472,MDEyOklzc3VlQ29tbWVudDczOTA4MzQ3Mg==,9599,simonw,2020-12-05T00:01:12Z,2020-12-05T00:01:12Z,OWNER,Here's why: https://github.com/simonw/datasette/blob/37f87b5e52e7f8ddd1c4ffcf368bd7a62a406a6d/datasette/database.py#L158-L163,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",757481949,"""datasette inspect"" outputs invalid JSON if an error is logged", https://github.com/dogsheep/dogsheep-photos/pull/29#issuecomment-739058820,https://api.github.com/repos/dogsheep/dogsheep-photos/issues/29,739058820,MDEyOklzc3VlQ29tbWVudDczOTA1ODgyMA==,9599,simonw,2020-12-04T22:32:35Z,2020-12-04T22:32:35Z,MEMBER,Thanks for this!,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",638375985,Fixed bug in SQL query for photo scores, https://github.com/simonw/datasette/pull/1130#issuecomment-738907852,https://api.github.com/repos/simonw/datasette/issues/1130,738907852,MDEyOklzc3VlQ29tbWVudDczODkwNzg1Mg==,3243482,abdusco,2020-12-04T17:22:29Z,2020-12-04T17:31:25Z,CONTRIBUTOR,"EDIT: I misunderstood the problem. This seems like a fix better suited for Safari. But I don't have any Apple device to test it. ```css body { min-height: 100vh; min-height: -webkit-fill-available; } html { height: -webkit-fill-available; } ``` https://css-tricks.com/css-fix-for-100vh-in-mobile-webkit/ --- It's actually not that difficult to fix. Well, this is actually a workaround to keep viewport in place. I usually put a transition (forgot to do it here) that keeps page from resizing. ```css .container { min-height: 100vh; transition: height 10000s steps(0); } ``` `steps()` function prevents excessive layout calculations, and lets the page snap back into place (10000s ~= 3h later) in a single step. 
This fix also prevents page from jumping around when the keyboard pops up and down.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",756876238,Fix footer not sticking to bottom in short pages, https://github.com/simonw/datasette/issues/188#issuecomment-738905376,https://api.github.com/repos/simonw/datasette/issues/188,738905376,MDEyOklzc3VlQ29tbWVudDczODkwNTM3Ng==,9599,simonw,2020-12-04T17:18:34Z,2020-12-04T17:18:34Z,OWNER,This is likely to be covered by plugin hooks: #860 for the metadata and after investigating in #1042 it looks like the existing `prepare_jinja2_environment` hook may already be enough to load templates from the database.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309047460,Ability to bundle metadata and templates inside the SQLite file, https://github.com/simonw/datasette/issues/111#issuecomment-738904347,https://api.github.com/repos/simonw/datasette/issues/111,738904347,MDEyOklzc3VlQ29tbWVudDczODkwNDM0Nw==,9599,simonw,2020-12-04T17:16:56Z,2020-12-04T17:16:56Z,OWNER,This is STILL a good idea.,"{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",274615452,Add “updated” to metadata, https://github.com/simonw/datasette/pull/1130#issuecomment-738897582,https://api.github.com/repos/simonw/datasette/issues/1130,738897582,MDEyOklzc3VlQ29tbWVudDczODg5NzU4Mg==,9599,simonw,2020-12-04T17:03:30Z,2020-12-04T17:03:30Z,OWNER,"I deployed this to https://datasette-issue-1129.vercel.app/ (using `datasette publish vercel fixtures.db --branch 8d4c69c6fb0ef741a19070f5172017ea3522e83c --about_url https://github.com/simonw/datasette/issues/1129 --about datasette/issues/1129 --project datasette-issue-1129`) - weirdly, on Mobile Safari the footer appears just below the visible window:  I've seen other problems with fixed footers on Mobile Safari too: at Eventbrite this was a really nasty problem for us to figure out: https://www.eventbrite.com/engineering/mobile-safari-why/","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",756876238,Fix footer not sticking to bottom in short pages, https://github.com/simonw/datasette/pull/1130#issuecomment-738620153,https://api.github.com/repos/simonw/datasette/issues/1130,738620153,MDEyOklzc3VlQ29tbWVudDczODYyMDE1Mw==,22429695,codecov[bot],2020-12-04T07:34:48Z,2020-12-04T07:34:48Z,NONE,"# [Codecov](https://codecov.io/gh/simonw/datasette/pull/1130?src=pr&el=h1) Report > Merging [#1130](https://codecov.io/gh/simonw/datasette/pull/1130?src=pr&el=desc) (8d4c69c) into [main](https://codecov.io/gh/simonw/datasette/commit/49d8fc056844d5a537d6cfd96dab0dd5686fe718?el=desc) (49d8fc0) will **not change** coverage. > The diff coverage is `n/a`. [](https://codecov.io/gh/simonw/datasette/pull/1130?src=pr&el=tree) ```diff @@ Coverage Diff @@ ## main #1130 +/- ## ======================================= Coverage 91.42% 91.42% ======================================= Files 31 31 Lines 3873 3873 ======================================= Hits 3541 3541 Misses 332 332 ``` ------ [Continue to review full report at Codecov](https://codecov.io/gh/simonw/datasette/pull/1130?src=pr&el=continue). > **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta) > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? 
= missing data` > Powered by [Codecov](https://codecov.io/gh/simonw/datasette/pull/1130?src=pr&el=footer). Last update [49d8fc0...8d4c69c](https://codecov.io/gh/simonw/datasette/pull/1130?src=pr&el=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments). ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",756876238,Fix footer not sticking to bottom in short pages, https://github.com/simonw/datasette/pull/1128#issuecomment-738613497,https://api.github.com/repos/simonw/datasette/issues/1128,738613497,MDEyOklzc3VlQ29tbWVudDczODYxMzQ5Nw==,22429695,codecov[bot],2020-12-04T07:17:12Z,2020-12-04T07:17:12Z,NONE,"# [Codecov](https://codecov.io/gh/simonw/datasette/pull/1128?src=pr&el=h1) Report > Merging [#1128](https://codecov.io/gh/simonw/datasette/pull/1128?src=pr&el=desc) (7004c3b) into [main](https://codecov.io/gh/simonw/datasette/commit/49d8fc056844d5a537d6cfd96dab0dd5686fe718?el=desc) (49d8fc0) will **decrease** coverage by `0.00%`. > The diff coverage is `n/a`. [](https://codecov.io/gh/simonw/datasette/pull/1128?src=pr&el=tree) ```diff @@ Coverage Diff @@ ## main #1128 +/- ## ========================================== - Coverage 91.42% 91.42% -0.01% ========================================== Files 31 31 Lines 3873 3872 -1 ========================================== - Hits 3541 3540 -1 Misses 332 332 ``` | [Impacted Files](https://codecov.io/gh/simonw/datasette/pull/1128?src=pr&el=tree) | Coverage Δ | | |---|---|---| | [datasette/utils/asgi.py](https://codecov.io/gh/simonw/datasette/pull/1128/diff?src=pr&el=tree#diff-ZGF0YXNldHRlL3V0aWxzL2FzZ2kucHk=) | `92.13% <ø> (-0.04%)` | :arrow_down: | ------ [Continue to review full report at Codecov](https://codecov.io/gh/simonw/datasette/pull/1128?src=pr&el=continue). > **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta) > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data` > Powered by [Codecov](https://codecov.io/gh/simonw/datasette/pull/1128?src=pr&el=footer). Last update [49d8fc0...7004c3b](https://codecov.io/gh/simonw/datasette/pull/1128?src=pr&el=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments). 
","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",756867924,Fix startup error on windows, https://github.com/simonw/datasette/issues/1125#issuecomment-738554392,https://api.github.com/repos/simonw/datasette/issues/1125,738554392,MDEyOklzc3VlQ29tbWVudDczODU1NDM5Mg==,9599,simonw,2020-12-04T04:16:57Z,2020-12-04T04:16:57Z,OWNER,"https://latest.datasette.io/-/versions now shows this: ```json { ""python"": { ""version"": ""3.8.6"", ""full"": ""3.8.6 (default, Nov 18 2020, 13:49:49) \n[GCC 8.3.0]"" }, ""datasette"": { ""version"": ""0.52.3"", ""note"": ""49d8fc056844d5a537d6cfd96dab0dd5686fe718"" }, ""asgi"": ""3.0"", ""uvicorn"": ""0.12.3"", ""sqlite"": { ""version"": ""3.33.0"", ""fts_versions"": [ ""FTS5"", ""FTS4"", ""FTS3"" ], ""extensions"": { ""json1"": null }, ""compile_options"": [] }, ""pysqlite3"": ""0.4.4"" } ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",756622648,Show pysqlite3 version on /-/versions, https://github.com/simonw/datasette/issues/1125#issuecomment-738551280,https://api.github.com/repos/simonw/datasette/issues/1125,738551280,MDEyOklzc3VlQ29tbWVudDczODU1MTI4MA==,9599,simonw,2020-12-04T04:03:54Z,2020-12-04T04:03:54Z,OWNER,"I'm going to check `pkg_resources.get_distribution(""pysqlite3-binary"").version` too.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",756622648,Show pysqlite3 version on /-/versions, https://github.com/simonw/datasette/issues/1125#issuecomment-738550588,https://api.github.com/repos/simonw/datasette/issues/1125,738550588,MDEyOklzc3VlQ29tbWVudDczODU1MDU4OA==,9599,simonw,2020-12-04T04:01:10Z,2020-12-04T04:01:10Z,OWNER,"Urgh, figuring out the version of `pysqlite3` is WAY harder than I expected. The `getversion` module looks like the smartest attempt at solving this problem generally, but I'd like to avoid adding another dependency just for this: https://github.com/smarie/python-getversion","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",756622648,Show pysqlite3 version on /-/versions, https://github.com/simonw/datasette/issues/1125#issuecomment-738548693,https://api.github.com/repos/simonw/datasette/issues/1125,738548693,MDEyOklzc3VlQ29tbWVudDczODU0ODY5Mw==,9599,simonw,2020-12-04T03:52:51Z,2020-12-04T03:52:51Z,OWNER,"That didn't work - https://latest.datasette.io/-/versions isn't showing the package. 
I bet that's because I'm actually installing `pysqlite3-binary` here: https://github.com/simonw/datasette/blob/e2fea36540e952d8d72c1bd0af7144b85b7a4671/.github/workflows/deploy-latest.yml#L57","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",756622648,Show pysqlite3 version on /-/versions, https://github.com/simonw/datasette/issues/1126#issuecomment-738548393,https://api.github.com/repos/simonw/datasette/issues/1126,738548393,MDEyOklzc3VlQ29tbWVudDczODU0ODM5Mw==,9599,simonw,2020-12-04T03:51:38Z,2020-12-04T03:51:38Z,OWNER,That worked.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",756761963,Switch to google-github-actions/setup-gcloud for demo deploy, https://github.com/simonw/datasette/issues/1125#issuecomment-738347171,https://api.github.com/repos/simonw/datasette/issues/1125,738347171,MDEyOklzc3VlQ29tbWVudDczODM0NzE3MQ==,9599,simonw,2020-12-03T22:04:52Z,2020-12-03T22:04:52Z,OWNER,"``` pkg_resources.get_distribution(""pysqlite3"").version Out[14]: '0.4.4' ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",756622648,Show pysqlite3 version on /-/versions, https://github.com/simonw/datasette/issues/1124#issuecomment-738215686,https://api.github.com/repos/simonw/datasette/issues/1124,738215686,MDEyOklzc3VlQ29tbWVudDczODIxNTY4Ng==,9599,simonw,2020-12-03T18:50:48Z,2020-12-03T21:42:02Z,OWNER,I'm going to punt on writing a unit test for this (not sure how I'd simulate those symlinks) - I'll manually test it and push out a dot release instead.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",756439516,Datasette on Amazon Linux on ARM returns 404 for static assets, https://github.com/simonw/datasette/issues/1124#issuecomment-738215487,https://api.github.com/repos/simonw/datasette/issues/1124,738215487,MDEyOklzc3VlQ29tbWVudDczODIxNTQ4Nw==,9599,simonw,2020-12-03T18:50:26Z,2020-12-03T21:41:25Z,OWNER,"This fix works - calling `.resolve()` on the `root_path` before the comparison to ensure symlinks are resolved: ```python # Ensure full_path is within root_path to avoid weird ""../"" tricks try: print(""full_path={}, root_path={}"".format(full_path, root_path)) full_path.relative_to(root_path.resolve()) except ValueError as e: print("" ValueError:"", e) await asgi_send_html(send, ""404"", 404) return ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",756439516,Datasette on Amazon Linux on ARM returns 404 for static assets, https://github.com/simonw/datasette/issues/1124#issuecomment-738213342,https://api.github.com/repos/simonw/datasette/issues/1124,738213342,MDEyOklzc3VlQ29tbWVudDczODIxMzM0Mg==,9599,simonw,2020-12-03T18:46:22Z,2020-12-03T21:40:51Z,OWNER,"I replaced that function with this code: ```python def asgi_static(root_path, chunk_size=4096, headers=None, content_type=None): async def inner_static(request, send): path = request.scope[""url_route""][""kwargs""][""path""] print(""path ="", path) try: full_path = (Path(root_path) / path).resolve().absolute() except FileNotFoundError as e: print(""FileNotFoundError:"", e) await asgi_send_html(send, ""404"", 404) return if full_path.is_dir(): await asgi_send_html(send, ""403: Directory listing is not allowed"", 403) return # Ensure 
full_path is within root_path to avoid weird ""../"" tricks try: print(""full_path={}, root_path={}"".format(full_path, root_path)) full_path.relative_to(root_path) except ValueError as e: print("" ValueError:"", e) await asgi_send_html(send, ""404"", 404) return try: await asgi_send_file(send, full_path, chunk_size=chunk_size) except FileNotFoundError: await asgi_send_html(send, ""404"", 404) return return inner_static ``` Edited using `vi /home/ec2-user/.local/pipx/venvs/datasette/lib/python3.7/site-packages/datasette/utils/asgi.py` The output shows me what the bug is: ``` $ datasette --get /-/static/app.css --pdb app_root = /home/ec2-user/.local/pipx/venvs/datasette/lib64/python3.7/site-packages path = app.css full_path=/home/ec2-user/.local/pipx/venvs/datasette/lib/python3.7/site-packages/datasette/static/app.css, root_path=/home/ec2-user/.local/pipx/venvs/datasette/lib64/python3.7/site-packages/datasette/static ValueError: '/home/ec2-user/.local/pipx/venvs/datasette/lib/python3.7/site-packages/datasette/static/app.css' does not start with '/home/ec2-user/.local/pipx/venvs/datasette/lib64/python3.7/site-packages/datasette/static' 404 ``` ` ValueError: '/home/ec2-user/.local/pipx/venvs/datasette/lib/python3.7/site-packages/datasette/static/app.css' does not start with '/home/ec2-user/.local/pipx/venvs/datasette/lib64/python3.7/site-packages/datasette/static'` One is `../lib/python3.7/..` and the other is `../lib64/python3.7/..` - there's clearly some kind of symlink in play here which I'm not taking into account.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",756439516,Datasette on Amazon Linux on ARM returns 404 for static assets, https://github.com/simonw/datasette/issues/1124#issuecomment-738313399,https://api.github.com/repos/simonw/datasette/issues/1124,738313399,MDEyOklzc3VlQ29tbWVudDczODMxMzM5OQ==,9599,simonw,2020-12-03T21:10:54Z,2020-12-03T21:10:54Z,OWNER,Confirmed that installing a fresh copy of Datasette 0.52.3 on that server works correctly as expected.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",756439516,Datasette on Amazon Linux on ARM returns 404 for static assets, https://github.com/simonw/datasette/issues/1124#issuecomment-738224865,https://api.github.com/repos/simonw/datasette/issues/1124,738224865,MDEyOklzc3VlQ29tbWVudDczODIyNDg2NQ==,9599,simonw,2020-12-03T19:01:52Z,2020-12-03T19:01:52Z,OWNER,"https://github.com/simonw/datasette/runs/1494631261 ``` /home/runner/work/datasette/datasette/tests/test_html.py:81: AssertionError ----------------------------- Captured stderr call ----------------------------- Traceback (most recent call last): File ""/home/runner/work/datasette/datasette/datasette/app.py"", line 1039, in route_path response = await view(request, send) File ""/home/runner/work/datasette/datasette/datasette/utils/asgi.py"", line 297, in inner_static full_path.relative_to(root_path.resolve()) AttributeError: 'str' object has no attribute 'resolve' ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",756439516,Datasette on Amazon Linux on ARM returns 404 for static assets, 
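The #1124 comments above converge on resolving symlinks before the containment check: the pipx virtualenv exposes the same site-packages directory under both `lib/` and `lib64/`, so comparing unresolved paths makes `relative_to()` raise `ValueError` and the static handler answers 404. Below is a minimal standalone sketch of that check, assuming only `pathlib` semantics; the helper name `serve_path_is_safe` is hypothetical and this is not Datasette's actual `asgi_static()` implementation.

```python
from pathlib import Path


def serve_path_is_safe(root_path: str, requested: str) -> bool:
    # Hypothetical helper sketching the check discussed in #1124:
    # resolve symlinks on BOTH the root and the requested file so that
    # .../lib64/... and .../lib/... compare equal when lib64 is a symlink.
    root = Path(root_path).resolve()
    full_path = (root / requested).resolve()
    try:
        # Raises ValueError if full_path is outside root (the "../" trick).
        full_path.relative_to(root)
        return True
    except ValueError:
        return False


# With the symlinked layout from the bug report, the resolved paths agree,
# so app.css would be served instead of returning a 404.
```

The traceback quoted later in this thread (`AttributeError: 'str' object has no attribute 'resolve'`) is why `root_path` has to be wrapped in `Path` before `.resolve()` can be called.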
https://github.com/simonw/datasette/issues/1124#issuecomment-738220067,https://api.github.com/repos/simonw/datasette/issues/1124,738220067,MDEyOklzc3VlQ29tbWVudDczODIyMDA2Nw==,9599,simonw,2020-12-03T18:58:17Z,2020-12-03T18:58:17Z,OWNER,"I tested this by running: pipx uninstall datasette pipx install 'https://github.com/simonw/datasette/archive/6b4c55efea3e9d34d92cbe5f0066553ad9b14071.zip' To replace that version of Datasette (in the correct virtual environment) with this patch. It worked! ``` [ec2-user@ip-172-31-30-7 ~]$ datasette --get /-/static/app.css /* Reset and Page Setup ==================================================== */ ... ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",756439516,Datasette on Amazon Linux on ARM returns 404 for static assets, https://github.com/simonw/datasette/issues/1124#issuecomment-738211776,https://api.github.com/repos/simonw/datasette/issues/1124,738211776,MDEyOklzc3VlQ29tbWVudDczODIxMTc3Ng==,9599,simonw,2020-12-03T18:43:21Z,2020-12-03T18:43:21Z,OWNER,I'm suspicious of this code here:https://github.com/simonw/datasette/blob/e048791a9a2686f47d81a2c8aa88aa1966d82521/datasette/utils/asgi.py#L284-L307,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",756439516,Datasette on Amazon Linux on ARM returns 404 for static assets, https://github.com/simonw/datasette/issues/1124#issuecomment-738211152,https://api.github.com/repos/simonw/datasette/issues/1124,738211152,MDEyOklzc3VlQ29tbWVudDczODIxMTE1Mg==,9599,simonw,2020-12-03T18:42:12Z,2020-12-03T18:42:12Z,OWNER,"Added a line to print out `app_root` from https://github.com/simonw/datasette/blob/e048791a9a2686f47d81a2c8aa88aa1966d82521/datasette/app.py#L848-L853 ``` app_root = /home/ec2-user/.local/pipx/venvs/datasette/lib64/python3.7/site-packages ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",756439516,Datasette on Amazon Linux on ARM returns 404 for static assets, https://github.com/simonw/datasette/issues/1124#issuecomment-738209642,https://api.github.com/repos/simonw/datasette/issues/1124,738209642,MDEyOklzc3VlQ29tbWVudDczODIwOTY0Mg==,9599,simonw,2020-12-03T18:39:19Z,2020-12-03T18:39:19Z,OWNER,"The CSS files are in the expected location: ``` [ec2-user@ip-172-31-30-7 ~]$ find /home/ec2-user/.local/pipx/venvs/datasette | grep css /home/ec2-user/.local/pipx/venvs/datasette/lib/python3.7/site-packages/datasette/static/app.css /home/ec2-user/.local/pipx/venvs/datasette/lib/python3.7/site-packages/datasette/static/codemirror-5.57.0.min.css ``` Wow it's running an ANCIENT version of SQLite: ``` [ec2-user@ip-172-31-30-7 ~]$ datasette --get /-/versions.json {""python"": {""version"": ""3.7.9"", ""full"": ""3.7.9 (default, Aug 27 2020, 21:58:41) \n[GCC 7.3.1 20180712 (Red Hat 7.3.1-9)]""}, ""datasette"": {""version"": ""0.52.2""}, ""asgi"": ""3.0"", ""uvicorn"": ""0.12.3"", ""sqlite"": {""version"": ""3.7.17"", ""fts_versions"": [""FTS4"", ""FTS3""], ""extensions"": {}, ""compile_options"": [""DISABLE_DIRSYNC"", ""ENABLE_COLUMN_METADATA"", ""ENABLE_FTS3"", ""ENABLE_RTREE"", ""ENABLE_UNLOCK_NOTIFY"", ""SECURE_DELETE"", ""TEMP_STORE=1"", ""THREADSAFE=1""]}} ``` http://www.sqlite.org/releaselog/3_7_17.html - SQLite Release 3.7.17 On 2013-05-20","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, 
""eyes"": 0}",756439516,Datasette on Amazon Linux on ARM returns 404 for static assets, https://github.com/simonw/datasette/issues/1121#issuecomment-737591281,https://api.github.com/repos/simonw/datasette/issues/1121,737591281,MDEyOklzc3VlQ29tbWVudDczNzU5MTI4MQ==,9599,simonw,2020-12-03T01:03:18Z,2020-12-03T01:03:18Z,OWNER,"Demo: https://latest.datasette.io/fixtures?_bot=1 <img width=""729"" alt=""fixtures"" src=""https://user-images.githubusercontent.com/9599/100950004-43c0b500-34c0-11eb-918c-aa959376461f.png""> ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",754178780,Table actions cog is misaligned, https://github.com/simonw/datasette/issues/1100#issuecomment-737589314,https://api.github.com/repos/simonw/datasette/issues/1100,737589314,MDEyOklzc3VlQ29tbWVudDczNzU4OTMxNA==,9599,simonw,2020-12-03T00:57:35Z,2020-12-03T00:57:35Z,OWNER,"Fixed in the demo: ``` % curl -XOPTIONS https://latest.datasette.io/fixtures.json ok% ``` ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",747702144,Error on OPTIONS request to database, https://github.com/simonw/datasette/issues/1123#issuecomment-737586248,https://api.github.com/repos/simonw/datasette/issues/1123,737586248,MDEyOklzc3VlQ29tbWVudDczNzU4NjI0OA==,9599,simonw,2020-12-03T00:47:37Z,2020-12-03T00:47:37Z,OWNER,"Affected tests: ``` FAILED tests/test_plugins.py::test_hook_table_actions[facetable] - AssertionE... FAILED tests/test_plugins.py::test_hook_table_actions[simple_view] - Assertio... ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",755721275,"Table actions hook are order dependent, should not be", https://github.com/simonw/datasette/issues/1100#issuecomment-737581719,https://api.github.com/repos/simonw/datasette/issues/1100,737581719,MDEyOklzc3VlQ29tbWVudDczNzU4MTcxOQ==,9599,simonw,2020-12-03T00:35:23Z,2020-12-03T00:35:23Z,OWNER,"Replicated this against the live demo as well: ``` /tmp % curl -XOPTIONS https://latest.datasette.io/fixtures.json {""ok"": false, ""error"": ""object Response can't be used in 'await' expression"", ""status"": 500, ""title"": null}% /tmp % ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",747702144,Error on OPTIONS request to database, https://github.com/simonw/datasette/pull/1122#issuecomment-737580813,https://api.github.com/repos/simonw/datasette/issues/1122,737580813,MDEyOklzc3VlQ29tbWVudDczNzU4MDgxMw==,9599,simonw,2020-12-03T00:33:09Z,2020-12-03T00:33:09Z,OWNER,"This is a very neat fix, thank you.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",754179035,Fix misaligned table actions cog, https://github.com/simonw/datasette/issues/749#issuecomment-737580084,https://api.github.com/repos/simonw/datasette/issues/749,737580084,MDEyOklzc3VlQ29tbWVudDczNzU4MDA4NA==,9599,simonw,2020-12-03T00:31:14Z,2020-12-03T00:31:14Z,OWNER,"This works! ``` /tmp % wget 'https://covid-19.datasettes.com/covid.db' --2020-12-02 16:28:02-- https://covid-19.datasettes.com/covid.db Resolving covid-19.datasettes.com (covid-19.datasettes.com)... 172.217.5.83 Connecting to covid-19.datasettes.com (covid-19.datasettes.com)|172.217.5.83|:443... connected. HTTP request sent, awaiting response... 
200 OK Length: unspecified [application/octet-stream] Saving to: ‘covid.db’ covid.db [ <=> ] 306.42M 3.27MB/s in 98s 2020-12-02 16:29:40 (3.13 MB/s) - ‘covid.db’ saved [321306624] ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",610829227,Cloud Run fails to serve database files larger than 32MB, https://github.com/simonw/datasette/issues/749#issuecomment-737563699,https://api.github.com/repos/simonw/datasette/issues/749,737563699,MDEyOklzc3VlQ29tbWVudDczNzU2MzY5OQ==,9599,simonw,2020-12-02T23:45:42Z,2020-12-02T23:45:42Z,OWNER,"I asked about this on Twitter - https://twitter.com/steren/status/1334281184965140483 > You simply need to send the `Transfer-Encoding: chunked` header.","{""total_count"": 2, ""+1"": 2, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",610829227,Cloud Run fails to serve database files larger than 32MB, https://github.com/simonw/datasette/issues/942#issuecomment-737463116,https://api.github.com/repos/simonw/datasette/issues/942,737463116,MDEyOklzc3VlQ29tbWVudDczNzQ2MzExNg==,9599,simonw,2020-12-02T20:02:10Z,2020-12-02T20:03:01Z,OWNER,"My idea is that if you installed my proposed plugin you wouldn't need `metadata.json` at all - your metadata would instead live in a table in the connected SQLite database files - either one table per database (so the metadata can live in the same place as the data) or maybe also in a dedicated separate database file, for if you want to add metadata to an otherwise read-only database. The plugin would then provide a UI for editing that metadata - maybe by configuring some writable canned queries or maybe something more custom than that. Or you could edit the metadata by manually editing the SQLite database file (or loading data into it using a tool like [yaml-to-sqlite](https://github.com/simonw/yaml-to-sqlite)).","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",681334912,Support column descriptions in metadata.json, https://github.com/simonw/datasette/issues/942#issuecomment-737428262,https://api.github.com/repos/simonw/datasette/issues/942,737428262,MDEyOklzc3VlQ29tbWVudDczNzQyODI2Mg==,596279,zaneselvans,2020-12-02T18:55:21Z,2020-12-02T18:55:21Z,NONE,"Are you thinking that those metadata tables would be added to the SQLite DB by Datasette, when you tell it to wrap up the database, with the metadata coming from the `metadata.json`? Would it be easy to allow the prepopulation of those tables in the database itself? 
We've been struggling with the best way to make sure that the data is always accompanied by metadata, and baking it all into the database itself would be nice, since then we wouldn't need to worry about separately distributing different files in different contexts.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",681334912,Support column descriptions in metadata.json, https://github.com/simonw/datasette/issues/942#issuecomment-737402392,https://api.github.com/repos/simonw/datasette/issues/942,737402392,MDEyOklzc3VlQ29tbWVudDczNzQwMjM5Mg==,9599,simonw,2020-12-02T18:08:55Z,2020-12-02T18:08:55Z,OWNER,"SQLite does let you add comments in your CREATE TABLE statements: ```sql CREATE TABLE something ( id integer primary key, -- integer primary key created text -- created date as ISO datetime ); ``` But the only mechanism for reading those back is to retrieve that `CREATE TABLE` block of SQL from the `sqlite_master` table and run a parser against it. I've so far resisted adding a SQL syntax parser to Datasette for complexity reasons - though I'm increasingly thinking I'll need to do it at some point. I think I'll leave this to plugins. I'm definitely going to build a plugin that lets you store metadata for tables and columns in a SQLite database table, which will then support interactively editing metadata through a UI. A plugin which extracts column comments from the SQLite CREATE TABLE comments would be feasible too, if I design the plugin hooks well.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",681334912,Support column descriptions in metadata.json, https://github.com/simonw/datasette/issues/1111#issuecomment-736322290,https://api.github.com/repos/simonw/datasette/issues/1111,736322290,MDEyOklzc3VlQ29tbWVudDczNjMyMjI5MA==,3243482,abdusco,2020-12-01T08:54:47Z,2020-12-01T08:54:47Z,CONTRIBUTOR,"Somewhat related: https://github.com/simonw/datasette/issues/859 I fixed the issue with forking and disabling the counts for hidden tables.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",751195017,Accessing a database's `.json` is slow for very large SQLite files, https://github.com/simonw/datasette/pull/1122#issuecomment-736318377,https://api.github.com/repos/simonw/datasette/issues/1122,736318377,MDEyOklzc3VlQ29tbWVudDczNjMxODM3Nw==,22429695,codecov[bot],2020-12-01T08:47:33Z,2020-12-01T08:47:33Z,NONE,"# [Codecov](https://codecov.io/gh/simonw/datasette/pull/1122?src=pr&el=h1) Report > Merging [#1122](https://codecov.io/gh/simonw/datasette/pull/1122?src=pr&el=desc) (94ea22f) into [main](https://codecov.io/gh/simonw/datasette/commit/a970276b9999687b96c5e11ea1c817d814f5d267?el=desc) (a970276) will **not change** coverage. > The diff coverage is `n/a`. [](https://codecov.io/gh/simonw/datasette/pull/1122?src=pr&el=tree) ```diff @@ Coverage Diff @@ ## main #1122 +/- ## ======================================= Coverage 91.49% 91.49% ======================================= Files 31 31 Lines 3856 3856 ======================================= Hits 3528 3528 Misses 328 328 ``` ------ [Continue to review full report at Codecov](https://codecov.io/gh/simonw/datasette/pull/1122?src=pr&el=continue). > **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta) > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? 
= missing data` > Powered by [Codecov](https://codecov.io/gh/simonw/datasette/pull/1122?src=pr&el=footer). Last update [a970276...94ea22f](https://codecov.io/gh/simonw/datasette/pull/1122?src=pr&el=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments). ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",754179035,Fix misaligned table actions cog, https://github.com/simonw/datasette/issues/942#issuecomment-736173084,https://api.github.com/repos/simonw/datasette/issues/942,736173084,MDEyOklzc3VlQ29tbWVudDczNjE3MzA4NA==,596279,zaneselvans,2020-12-01T02:20:58Z,2020-12-01T02:20:58Z,NONE,"Are there common patterns for storing column-based metadata inside SQLite itself? I know Postgres allows ""comment"" fields, which this is kind of trying to replicate. Should the `units` and `description` and possibly other per-column metadata fields be combined into a single (tabular?) structure, that would be displayed above the data on the table / query results page?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",681334912,Support column descriptions in metadata.json, https://github.com/simonw/datasette/issues/1119#issuecomment-736142201,https://api.github.com/repos/simonw/datasette/issues/1119,736142201,MDEyOklzc3VlQ29tbWVudDczNjE0MjIwMQ==,9599,simonw,2020-12-01T00:41:14Z,2020-12-01T00:41:14Z,OWNER,"On my laptop: <img width=""1123"" alt=""fixtures__generated_columns__1_row"" src=""https://user-images.githubusercontent.com/9599/100681839-8734eb00-3329-11eb-8aea-813bdc18efa1.png""> https://latest.datasette.io/-/versions is running SQLite 3.27.2 at the moment so it won't show that table until it gets to 3.31.0.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",753876808,"Include generated columns in fixtures.db, if SQLite version supports it", https://github.com/simonw/datasette/pull/1120#issuecomment-736135125,https://api.github.com/repos/simonw/datasette/issues/1120,736135125,MDEyOklzc3VlQ29tbWVudDczNjEzNTEyNQ==,22429695,codecov[bot],2020-12-01T00:22:36Z,2020-12-01T00:22:36Z,NONE,"# [Codecov](https://codecov.io/gh/simonw/datasette/pull/1120?src=pr&el=h1) Report > Merging [#1120](https://codecov.io/gh/simonw/datasette/pull/1120?src=pr&el=desc) (ddad8db) into [main](https://codecov.io/gh/simonw/datasette/commit/461670a0b87efa953141b449a9a261919864ceb3?el=desc) (461670a) will **increase** coverage by `0.00%`. > The diff coverage is `100.00%`. 
[](https://codecov.io/gh/simonw/datasette/pull/1120?src=pr&el=tree) ```diff @@ Coverage Diff @@ ## main #1120 +/- ## ======================================= Coverage 91.48% 91.49% ======================================= Files 31 31 Lines 3852 3856 +4 ======================================= + Hits 3524 3528 +4 Misses 328 328 ``` | [Impacted Files](https://codecov.io/gh/simonw/datasette/pull/1120?src=pr&el=tree) | Coverage Δ | | |---|---|---| | [datasette/utils/\_\_init\_\_.py](https://codecov.io/gh/simonw/datasette/pull/1120/diff?src=pr&el=tree#diff-ZGF0YXNldHRlL3V0aWxzL19faW5pdF9fLnB5) | `94.10% <100.00%> (ø)` | | | [datasette/utils/sqlite.py](https://codecov.io/gh/simonw/datasette/pull/1120/diff?src=pr&el=tree#diff-ZGF0YXNldHRlL3V0aWxzL3NxbGl0ZS5weQ==) | `100.00% <100.00%> (ø)` | | ------ [Continue to review full report at Codecov](https://codecov.io/gh/simonw/datasette/pull/1120?src=pr&el=continue). > **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta) > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data` > Powered by [Codecov](https://codecov.io/gh/simonw/datasette/pull/1120?src=pr&el=footer). Last update [461670a...ddad8db](https://codecov.io/gh/simonw/datasette/pull/1120?src=pr&el=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments). ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",753898359,generated_columns table in fixtures.py, https://github.com/simonw/datasette/pull/1117#issuecomment-736088949,https://api.github.com/repos/simonw/datasette/issues/1117,736088949,MDEyOklzc3VlQ29tbWVudDczNjA4ODk0OQ==,2789593,nattaylor,2020-11-30T22:15:58Z,2020-11-30T22:23:19Z,NONE,"I just deployed this and its working great. ~In a very unscientific benchmark my response times went from around 22-25ms to 33-36ms, but I didn't even dig enough to confirm the latency is related to the change. It's on a VPS, so maybe the load changed.~ I don't see any difference in performance.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",753767911,Support for generated columns, https://github.com/simonw/datasette/pull/1117#issuecomment-736067475,https://api.github.com/repos/simonw/datasette/issues/1117,736067475,MDEyOklzc3VlQ29tbWVudDczNjA2NzQ3NQ==,22429695,codecov[bot],2020-11-30T21:28:22Z,2020-11-30T21:28:22Z,NONE,"# [Codecov](https://codecov.io/gh/simonw/datasette/pull/1117?src=pr&el=h1) Report > Merging [#1117](https://codecov.io/gh/simonw/datasette/pull/1117?src=pr&el=desc) (ccdf2c6) into [main](https://codecov.io/gh/simonw/datasette/commit/dea3c508b39528e566d711c38a467b3d372d220b?el=desc) (dea3c50) will **decrease** coverage by `0.00%`. > The diff coverage is `95.23%`. 
[](https://codecov.io/gh/simonw/datasette/pull/1117?src=pr&el=tree) ```diff @@ Coverage Diff @@ ## main #1117 +/- ## ========================================== - Coverage 91.48% 91.48% -0.01% ========================================== Files 30 31 +1 Lines 3841 3852 +11 ========================================== + Hits 3514 3524 +10 - Misses 327 328 +1 ``` | [Impacted Files](https://codecov.io/gh/simonw/datasette/pull/1117?src=pr&el=tree) | Coverage Δ | | |---|---|---| | [datasette/utils/\_\_init\_\_.py](https://codecov.io/gh/simonw/datasette/pull/1117/diff?src=pr&el=tree#diff-ZGF0YXNldHRlL3V0aWxzL19faW5pdF9fLnB5) | `94.10% <87.50%> (-0.20%)` | :arrow_down: | | [datasette/utils/sqlite.py](https://codecov.io/gh/simonw/datasette/pull/1117/diff?src=pr&el=tree#diff-ZGF0YXNldHRlL3V0aWxzL3NxbGl0ZS5weQ==) | `100.00% <100.00%> (ø)` | | ------ [Continue to review full report at Codecov](https://codecov.io/gh/simonw/datasette/pull/1117?src=pr&el=continue). > **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta) > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data` > Powered by [Codecov](https://codecov.io/gh/simonw/datasette/pull/1117?src=pr&el=footer). Last update [dea3c50...ccdf2c6](https://codecov.io/gh/simonw/datasette/pull/1117?src=pr&el=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments). ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",753767911,Support for generated columns, https://github.com/simonw/datasette/issues/1116#issuecomment-736030599,https://api.github.com/repos/simonw/datasette/issues/1116,736030599,MDEyOklzc3VlQ29tbWVudDczNjAzMDU5OQ==,9599,simonw,2020-11-30T20:41:41Z,2020-11-30T20:41:41Z,OWNER,"Here's the problem: https://www.sqlite.org/changes.html#version_3_26_0 > ### 2018-12-01 (3.26.0) > > - Added [PRAGMA table_xinfo](https://www.sqlite.org/pragma.html#pragma_table_xinfo) that works just like [PRAGMA table_info](https://www.sqlite.org/pragma.html#pragma_table_info) except that it also shows [hidden columns](https://www.sqlite.org/vtab.html#hiddencol) in virtual tables. CI is running 3.22.0.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",753668177,GENERATED column support, https://github.com/simonw/datasette/pull/1117#issuecomment-736029337,https://api.github.com/repos/simonw/datasette/issues/1117,736029337,MDEyOklzc3VlQ29tbWVudDczNjAyOTMzNw==,9599,simonw,2020-11-30T20:39:06Z,2020-11-30T20:39:06Z,OWNER,"Here's the problem: https://www.sqlite.org/changes.html#version_3_26_0 > ### 2018-12-01 (3.26.0) > > - Added [PRAGMA table_xinfo](https://www.sqlite.org/pragma.html#pragma_table_xinfo) that works just like [PRAGMA table_info](https://www.sqlite.org/pragma.html#pragma_table_info) except that it also shows [hidden columns](https://www.sqlite.org/vtab.html#hiddencol) in virtual tables. 
CI is running 3.22.0.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",753767911,Support for generated columns, https://github.com/simonw/datasette/pull/1117#issuecomment-736028726,https://api.github.com/repos/simonw/datasette/issues/1117,736028726,MDEyOklzc3VlQ29tbWVudDczNjAyODcyNg==,9599,simonw,2020-11-30T20:37:50Z,2020-11-30T20:37:50Z,OWNER,"This kind of problem is why I have a `tmate` workflow: <img width=""1086"" alt=""Actions_·_simonw_datasette"" src=""https://user-images.githubusercontent.com/9599/100660845-72dff680-3307-11eb-89dc-211d226f68dd.png""> ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",753767911,Support for generated columns, https://github.com/simonw/datasette/pull/1117#issuecomment-736023089,https://api.github.com/repos/simonw/datasette/issues/1117,736023089,MDEyOklzc3VlQ29tbWVudDczNjAyMzA4OQ==,9599,simonw,2020-11-30T20:26:27Z,2020-11-30T20:26:27Z,OWNER,"On my laptop: ``` platform darwin -- Python 3.8.6, pytest-6.0.1, py-1.9.0, pluggy-0.13.1 SQLite: 3.33.0 ``` In CI they are all SQLite: 3.22.0","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",753767911,Support for generated columns, https://github.com/simonw/datasette/pull/1117#issuecomment-736018609,https://api.github.com/repos/simonw/datasette/issues/1117,736018609,MDEyOklzc3VlQ29tbWVudDczNjAxODYwOQ==,9599,simonw,2020-11-30T20:17:31Z,2020-11-30T20:17:31Z,OWNER,I need to replicate these failures on my laptop. My hunch is that this is down to the version of SQLite available to Python.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",753767911,Support for generated columns, https://github.com/simonw/datasette/issues/1116#issuecomment-736015487,https://api.github.com/repos/simonw/datasette/issues/1116,736015487,MDEyOklzc3VlQ29tbWVudDczNjAxNTQ4Nw==,9599,simonw,2020-11-30T20:11:07Z,2020-11-30T20:11:07Z,OWNER,Working on this in a pull request: https://github.com/simonw/datasette/pull/1117,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",753668177,GENERATED column support, https://github.com/simonw/datasette/issues/1116#issuecomment-736014372,https://api.github.com/repos/simonw/datasette/issues/1116,736014372,MDEyOklzc3VlQ29tbWVudDczNjAxNDM3Mg==,9599,simonw,2020-11-30T20:08:48Z,2020-11-30T20:08:48Z,OWNER,"Ouch, the tests pass on my laptop but failed in CI: https://github.com/simonw/datasette/actions/runs/392367997 Lots of failures look like this: ``` ERROR: conn=<sqlite3.Connection object at 0x7f44f0494030>, sql = 'select rowid, from facetable order by rowid limit 51', params = {}: near ""from"": syntax error ``` Note the `select rowid, from...` - so it looks like invalid SQL queries are being constructed maybe due to mis-detecting columns somehow. 
I wonder why it didn't fail on my laptop?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",753668177,GENERATED column support, https://github.com/simonw/datasette/issues/1116#issuecomment-736010720,https://api.github.com/repos/simonw/datasette/issues/1116,736010720,MDEyOklzc3VlQ29tbWVudDczNjAxMDcyMA==,9599,simonw,2020-11-30T20:01:53Z,2020-11-30T20:01:53Z,OWNER,"I'm OK exposing hidden columns, unless someone comes up with a pressing reason not to.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",753668177,GENERATED column support, https://github.com/simonw/datasette/issues/1116#issuecomment-736005833,https://api.github.com/repos/simonw/datasette/issues/1116,736005833,MDEyOklzc3VlQ29tbWVudDczNjAwNTgzMw==,2789593,nattaylor,2020-11-30T19:54:39Z,2020-11-30T19:54:39Z,NONE,"@simonw thanks for investigating so quickly. If it is undesirable to change that hidden behavior, maybe something like this is a suitable workaround: ``` SELECT * FROM pragma_table_xinfo('deeds') where hidden in (0,2); 0|body|TEXT|0||0|0 1|id|INT GENERATED ALWAYS|0||0|2 2|consideration|INT GENERATED ALWAYS|0||0|2 ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",753668177,GENERATED column support, https://github.com/simonw/datasette/issues/1116#issuecomment-736004383,https://api.github.com/repos/simonw/datasette/issues/1116,736004383,MDEyOklzc3VlQ29tbWVudDczNjAwNDM4Mw==,9599,simonw,2020-11-30T19:51:51Z,2020-11-30T19:51:51Z,OWNER,"This change will also have an impact on how hidden virtual FTS tables are displayed, since apparently those have some hidden columns: https://latest.datasette.io/fixtures?sql=select+*+from+pragma_table_xinfo%28%27searchable_fts%27%29 | cid | name | type | notnull | dflt_value | pk | hidden | | --- | --- | --- | --- | --- | --- | --- | | 0 | text1 | | 0 | | 0 | 0 | | 1 | text2 | | 0 | | 0 | 0 | | 2 | name with . 
and spaces | | 0 | | 0 | 0 | | 3 | searchable_fts | | 0 | | 0 | 1 | | 4 | docid | | 0 | | 0 | 1 | | 5 | __langid | | 0 | | 0 | 1 |","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",753668177,GENERATED column support, https://github.com/simonw/datasette/issues/1116#issuecomment-735995695,https://api.github.com/repos/simonw/datasette/issues/1116,735995695,MDEyOklzc3VlQ29tbWVudDczNTk5NTY5NQ==,9599,simonw,2020-11-30T19:34:15Z,2020-11-30T19:34:15Z,OWNER,"Generated column support was added in SQLite 3.31.0, so any unit tests I write for this should use skipIf to only run on that version or later.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",753668177,GENERATED column support, https://github.com/simonw/datasette/issues/1116#issuecomment-735993935,https://api.github.com/repos/simonw/datasette/issues/1116,735993935,MDEyOklzc3VlQ29tbWVudDczNTk5MzkzNQ==,9599,simonw,2020-11-30T19:30:44Z,2020-11-30T19:32:15Z,OWNER,"It looks like `PRAGMA table_info` skips ""hidden"" columns: https://www.sqlite.org/pragma.html#pragma_table_info But `PRAGMA table_xinfo` does not: https://www.sqlite.org/pragma.html#pragma_table_xinfo Compare https://latest.datasette.io/fixtures?sql=select+*+from+pragma_table_info%28%27searchable%27%29 to https://latest.datasette.io/fixtures?sql=select+*+from+pragma_table_xinfo%28%27searchable%27%29 - the `xinfo` one has an additional `hidden` column.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",753668177,GENERATED column support, https://github.com/simonw/datasette/issues/1116#issuecomment-735992106,https://api.github.com/repos/simonw/datasette/issues/1116,735992106,MDEyOklzc3VlQ29tbWVudDczNTk5MjEwNg==,9599,simonw,2020-11-30T19:27:10Z,2020-11-30T19:27:10Z,OWNER,"I'm treating this as a bug - these columns should definitely be visible in Datasette. 
I created my own test database using SQLite from Homebrew like this: ``` /usr/local/Cellar/sqlite/3.33.0/bin/sqlite3 deeds.db << EOF CREATE TABLE deeds ( body TEXT, id INT GENERATED ALWAYS AS (json_extract(body, '$.id')) STORED, consideration INT GENERATED ALWAYS AS (json_extract(body, '$.consideration')) STORED ); INSERT INTO deeds (body) VALUES ('{ ""id"": 1, ""consideration"": ""This is the consideration"" }'); EOF ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",753668177,GENERATED column support, https://github.com/simonw/datasette/issues/263#issuecomment-735960132,https://api.github.com/repos/simonw/datasette/issues/263,735960132,MDEyOklzc3VlQ29tbWVudDczNTk2MDEzMg==,9599,simonw,2020-11-30T18:25:17Z,2020-11-30T18:25:17Z,OWNER,Fixing this would unblock this issue for switching `datasette-graphql` to using `datasette.client` internally: https://github.com/simonw/datasette-graphql/issues/61,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323671577,Facets should not execute for ?shape=array|object, https://github.com/dogsheep/github-to-sqlite/issues/53#issuecomment-735485677,https://api.github.com/repos/dogsheep/github-to-sqlite/issues/53,735485677,MDEyOklzc3VlQ29tbWVudDczNTQ4NTY3Nw==,9599,simonw,2020-11-30T00:36:09Z,2020-11-30T00:36:09Z,MEMBER,Given rate limits (see #51) this command might be better implemented by running a `git clone` into a temporary directory - doing so would retrieve all of the files in one go.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",753000405,Command for fetching file contents, https://github.com/dogsheep/github-to-sqlite/issues/51#issuecomment-735484186,https://api.github.com/repos/dogsheep/github-to-sqlite/issues/51,735484186,MDEyOklzc3VlQ29tbWVudDczNTQ4NDE4Ng==,9599,simonw,2020-11-30T00:29:31Z,2020-11-30T00:29:31Z,MEMBER,"This just caused a failure in deploying the demo: https://github.com/dogsheep/github-to-sqlite/runs/1471304407?check_suite_focus=true ``` File ""/opt/hostedtoolcache/Python/3.8.6/x64/bin/github-to-sqlite"", line 33, in <module> sys.exit(load_entry_point('github-to-sqlite', 'console_scripts', 'github-to-sqlite')()) File ""/opt/hostedtoolcache/Python/3.8.6/x64/lib/python3.8/site-packages/click/core.py"", line 829, in __call__ return self.main(*args, **kwargs) File ""/opt/hostedtoolcache/Python/3.8.6/x64/lib/python3.8/site-packages/click/core.py"", line 782, in main rv = self.invoke(ctx) File ""/opt/hostedtoolcache/Python/3.8.6/x64/lib/python3.8/site-packages/click/core.py"", line 1259, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/opt/hostedtoolcache/Python/3.8.6/x64/lib/python3.8/site-packages/click/core.py"", line 1066, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/opt/hostedtoolcache/Python/3.8.6/x64/lib/python3.8/site-packages/click/core.py"", line 610, in invoke return callback(*args, **kwargs) File ""/home/runner/work/github-to-sqlite/github-to-sqlite/github_to_sqlite/cli.py"", line 142, in issue_comments for comment in utils.fetch_issue_comments(repo, token, issue): File ""/home/runner/work/github-to-sqlite/github-to-sqlite/github_to_sqlite/utils.py"", line 380, in fetch_issue_comments for comments in paginate(url, headers): File ""/home/runner/work/github-to-sqlite/github-to-sqlite/github_to_sqlite/utils.py"", line 472, in paginate 
raise GitHubError.from_response(response) github_to_sqlite.utils.GitHubError: ('API rate limit exceeded for user ID 9599.', 403) Error: Process completed with exit code 1. ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",703246031,github-to-sqlite should handle rate limits better, https://github.com/dogsheep/github-to-sqlite/issues/46#issuecomment-735483820,https://api.github.com/repos/dogsheep/github-to-sqlite/issues/46,735483820,MDEyOklzc3VlQ29tbWVudDczNTQ4MzgyMA==,9599,simonw,2020-11-30T00:27:47Z,2020-11-30T00:27:47Z,MEMBER,"So it looks like anything that pulls reviews needs to pull each review, then for each one pull the comments. I'm going to consider this blocked on smarter rate limit handling in #51.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",664485022,Feature: pull request reviews and comments, https://github.com/dogsheep/github-to-sqlite/issues/46#issuecomment-735483604,https://api.github.com/repos/dogsheep/github-to-sqlite/issues/46,735483604,MDEyOklzc3VlQ29tbWVudDczNTQ4MzYwNA==,9599,simonw,2020-11-30T00:26:50Z,2020-11-30T00:26:50Z,MEMBER,"It seems like there's a lot missing from that - those aren't particularly interesting given the data that is returned. From the docs at https://docs.github.com/en/free-pro-team@latest/rest/reference/pulls#reviews it looks like each review consists of multiple comments, and the comments are where the useful material is - https://docs.github.com/en/free-pro-team@latest/rest/reference/pulls#list-comments-for-a-pull-request-review `github-to-sqlite get https://api.github.com/repos/dogsheep/github-to-sqlite/pulls/48/reviews/503368921/comments --accept 'application/vnd.github.v3+json'` ```json [ { ""id"": 500603838, ""node_id"": ""MDI0OlB1bGxSZXF1ZXN0UmV2aWV3Q29tbWVudDUwMDYwMzgzOA=="", ""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/pulls/comments/500603838"", ""pull_request_review_id"": 503368921, ""diff_hunk"": ""@@ -0,0 +1,370 @@\n+[\n+ {\n+ \""url\"": \""https://api.github.com/repos/simonw/datasette/pulls/571\"",\n+ \""id\"": 313384926,\n+ \""node_id\"": \""MDExOlB1bGxSZXF1ZXN0MzEzMzg0OTI2\"",\n+ \""html_url\"": \""https://github.com/simonw/datasette/pull/571\"",\n+ \""diff_url\"": \""https://github.com/simonw/datasette/pull/571.diff\"",\n+ \""patch_url\"": \""https://github.com/simonw/datasette/pull/571.patch\"",\n+ \""issue_url\"": \""https://api.github.com/repos/simonw/datasette/issues/571\"",\n+ \""number\"": 571,\n+ \""state\"": \""closed\"",\n+ \""locked\"": false,\n+ \""title\"": \""detect_fts now works with alternative table escaping\"",\n+ \""user\"": {\n+ \""login\"": \""simonw\"",\n+ \""id\"": 9599,\n+ \""node_id\"": \""MDQ6VXNlcjk1OTk=\"",\n+ \""avatar_url\"": \""https://avatars0.githubusercontent.com/u/9599?v=4\"",\n+ \""gravatar_id\"": \""\"",\n+ \""url\"": \""https://api.github.com/users/simonw\"",\n+ \""html_url\"": \""https://github.com/simonw\"",\n+ \""followers_url\"": \""https://api.github.com/users/simonw/followers\"",\n+ \""following_url\"": \""https://api.github.com/users/simonw/following{/other_user}\"",\n+ \""gists_url\"": \""https://api.github.com/users/simonw/gists{/gist_id}\"",\n+ \""starred_url\"": \""https://api.github.com/users/simonw/starred{/owner}{/repo}\"",\n+ \""subscriptions_url\"": \""https://api.github.com/users/simonw/subscriptions\"",\n+ \""organizations_url\"": \""https://api.github.com/users/simonw/orgs\"",\n+ 
\""repos_url\"": \""https://api.github.com/users/simonw/repos\"",\n+ \""events_url\"": \""https://api.github.com/users/simonw/events{/privacy}\"",\n+ \""received_events_url\"": \""https://api.github.com/users/simonw/received_events\"",\n+ \""type\"": \""User\"",\n+ \""site_admin\"": false\n+ },\n+ \""body\"": \""Fixes #570\"",\n+ \""created_at\"": \""2019-09-03T00:23:39Z\"",\n+ \""updated_at\"": \""2019-09-03T00:32:28Z\"",\n+ \""closed_at\"": \""2019-09-03T00:32:28Z\"",\n+ \""merged_at\"": \""2019-09-03T00:32:28Z\"",\n+ \""merge_commit_sha\"": \""2dc5c8dc259a0606162673d394ba8cc1c6f54428\"",\n+ \""assignee\"": null,\n+ \""assignees\"": [\n+\n+ ],\n+ \""requested_reviewers\"": [\n+\n+ ],\n+ \""requested_teams\"": [\n+\n+ ],\n+ \""labels\"": [\n+\n+ ],\n+ \""milestone\"": null,\n+ \""draft\"": false,\n+ \""commits_url\"": \""https://api.github.com/repos/simonw/datasette/pulls/571/commits\"",\n+ \""review_comments_url\"": \""https://api.github.com/repos/simonw/datasette/pulls/571/comments\"",\n+ \""review_comment_url\"": \""https://api.github.com/repos/simonw/datasette/pulls/comments{/number}\"",\n+ \""comments_url\"": \""https://api.github.com/repos/simonw/datasette/issues/571/comments\"",\n+ \""statuses_url\"": \""https://api.github.com/repos/simonw/datasette/statuses/a85239f69261c10f1a9f90514c8b5d113cb94585\"",\n+ \""head\"": {\n+ \""label\"": \""simonw:detect-fts\"",\n+ \""ref\"": \""detect-fts\"",\n+ \""sha\"": \""a85239f69261c10f1a9f90514c8b5d113cb94585\"",\n+ \""user\"": {\n+ \""login\"": \""simonw\"",\n+ \""id\"": 9599,\n+ \""node_id\"": \""MDQ6VXNlcjk1OTk=\"",\n+ \""avatar_url\"": \""https://avatars0.githubusercontent.com/u/9599?v=4\"",\n+ \""gravatar_id\"": \""\"",\n+ \""url\"": \""https://api.github.com/users/simonw\"",\n+ \""html_url\"": \""https://github.com/simonw\"",\n+ \""followers_url\"": \""https://api.github.com/users/simonw/followers\"",\n+ \""following_url\"": \""https://api.github.com/users/simonw/following{/other_user}\"",\n+ \""gists_url\"": \""https://api.github.com/users/simonw/gists{/gist_id}\"",\n+ \""starred_url\"": \""https://api.github.com/users/simonw/starred{/owner}{/repo}\"",\n+ \""subscriptions_url\"": \""https://api.github.com/users/simonw/subscriptions\"",\n+ \""organizations_url\"": \""https://api.github.com/users/simonw/orgs\"",\n+ \""repos_url\"": \""https://api.github.com/users/simonw/repos\"",\n+ \""events_url\"": \""https://api.github.com/users/simonw/events{/privacy}\"",\n+ \""received_events_url\"": \""https://api.github.com/users/simonw/received_events\"",\n+ \""type\"": \""User\"",\n+ \""site_admin\"": false\n+ },\n+ \""repo\"": {\n+ \""id\"": 107914493,\n+ \""node_id\"": \""MDEwOlJlcG9zaXRvcnkxMDc5MTQ0OTM=\"",\n+ \""name\"": \""datasette\"",\n+ \""full_name\"": \""simonw/datasette\"",\n+ \""private\"": false,\n+ \""owner\"": {\n+ \""login\"": \""simonw\"",\n+ \""id\"": 9599,\n+ \""node_id\"": \""MDQ6VXNlcjk1OTk=\"",\n+ \""avatar_url\"": \""https://avatars0.githubusercontent.com/u/9599?v=4\"",\n+ \""gravatar_id\"": \""\"",\n+ \""url\"": \""https://api.github.com/users/simonw\"",\n+ \""html_url\"": \""https://github.com/simonw\"",\n+ \""followers_url\"": \""https://api.github.com/users/simonw/followers\"",\n+ \""following_url\"": \""https://api.github.com/users/simonw/following{/other_user}\"",\n+ \""gists_url\"": \""https://api.github.com/users/simonw/gists{/gist_id}\"",\n+ \""starred_url\"": \""https://api.github.com/users/simonw/starred{/owner}{/repo}\"",\n+ \""subscriptions_url\"": \""https://api.github.com/users/simonw/subscriptions\"",\n+ 
\""organizations_url\"": \""https://api.github.com/users/simonw/orgs\"",\n+ \""repos_url\"": \""https://api.github.com/users/simonw/repos\"",\n+ \""events_url\"": \""https://api.github.com/users/simonw/events{/privacy}\"",\n+ \""received_events_url\"": \""https://api.github.com/users/simonw/received_events\"",\n+ \""type\"": \""User\"",\n+ \""site_admin\"": false\n+ },\n+ \""html_url\"": \""https://github.com/simonw/datasette\"",\n+ \""description\"": \""An open source multi-tool for exploring and publishing data\"",\n+ \""fork\"": false,\n+ \""url\"": \""https://api.github.com/repos/simonw/datasette\"",\n+ \""forks_url\"": \""https://api.github.com/repos/simonw/datasette/forks\"",\n+ \""keys_url\"": \""https://api.github.com/repos/simonw/datasette/keys{/key_id}\"",\n+ \""collaborators_url\"": \""https://api.github.com/repos/simonw/datasette/collaborators{/collaborator}\"",\n+ \""teams_url\"": \""https://api.github.com/repos/simonw/datasette/teams\"",\n+ \""hooks_url\"": \""https://api.github.com/repos/simonw/datasette/hooks\"",\n+ \""issue_events_url\"": \""https://api.github.com/repos/simonw/datasette/issues/events{/number}\"",\n+ \""events_url\"": \""https://api.github.com/repos/simonw/datasette/events\"",\n+ \""assignees_url\"": \""https://api.github.com/repos/simonw/datasette/assignees{/user}\"",\n+ \""branches_url\"": \""https://api.github.com/repos/simonw/datasette/branches{/branch}\"",\n+ \""tags_url\"": \""https://api.github.com/repos/simonw/datasette/tags\"",\n+ \""blobs_url\"": \""https://api.github.com/repos/simonw/datasette/git/blobs{/sha}\"",\n+ \""git_tags_url\"": \""https://api.github.com/repos/simonw/datasette/git/tags{/sha}\"",\n+ \""git_refs_url\"": \""https://api.github.com/repos/simonw/datasette/git/refs{/sha}\"",\n+ \""trees_url\"": \""https://api.github.com/repos/simonw/datasette/git/trees{/sha}\"",\n+ \""statuses_url\"": \""https://api.github.com/repos/simonw/datasette/statuses/{sha}\"",\n+ \""languages_url\"": \""https://api.github.com/repos/simonw/datasette/languages\"",\n+ \""stargazers_url\"": \""https://api.github.com/repos/simonw/datasette/stargazers\"",\n+ \""contributors_url\"": \""https://api.github.com/repos/simonw/datasette/contributors\"",\n+ \""subscribers_url\"": \""https://api.github.com/repos/simonw/datasette/subscribers\"",\n+ \""subscription_url\"": \""https://api.github.com/repos/simonw/datasette/subscription\"",\n+ \""commits_url\"": \""https://api.github.com/repos/simonw/datasette/commits{/sha}\"",\n+ \""git_commits_url\"": \""https://api.github.com/repos/simonw/datasette/git/commits{/sha}\"",\n+ \""comments_url\"": \""https://api.github.com/repos/simonw/datasette/comments{/number}\"",\n+ \""issue_comment_url\"": \""https://api.github.com/repos/simonw/datasette/issues/comments{/number}\"",\n+ \""contents_url\"": \""https://api.github.com/repos/simonw/datasette/contents/{+path}\"",\n+ \""compare_url\"": \""https://api.github.com/repos/simonw/datasette/compare/{base}...{head}\"",\n+ \""merges_url\"": \""https://api.github.com/repos/simonw/datasette/merges\"",\n+ \""archive_url\"": \""https://api.github.com/repos/simonw/datasette/{archive_format}{/ref}\"",\n+ \""downloads_url\"": \""https://api.github.com/repos/simonw/datasette/downloads\"",\n+ \""issues_url\"": \""https://api.github.com/repos/simonw/datasette/issues{/number}\"",\n+ \""pulls_url\"": \""https://api.github.com/repos/simonw/datasette/pulls{/number}\"",\n+ \""milestones_url\"": \""https://api.github.com/repos/simonw/datasette/milestones{/number}\"",\n+ \""notifications_url\"": 
\""https://api.github.com/repos/simonw/datasette/notifications{?since,all,participating}\"",\n+ \""labels_url\"": \""https://api.github.com/repos/simonw/datasette/labels{/name}\"",\n+ \""releases_url\"": \""https://api.github.com/repos/simonw/datasette/releases{/id}\"",\n+ \""deployments_url\"": \""https://api.github.com/repos/simonw/datasette/deployments\"",\n+ \""created_at\"": \""2017-10-23T00:39:03Z\"",\n+ \""updated_at\"": \""2020-07-27T20:42:15Z\"",\n+ \""pushed_at\"": \""2020-07-26T01:21:05Z\"",\n+ \""git_url\"": \""git://github.com/simonw/datasette.git\"",\n+ \""ssh_url\"": \""git@github.com:simonw/datasette.git\"",\n+ \""clone_url\"": \""https://github.com/simonw/datasette.git\"",\n+ \""svn_url\"": \""https://github.com/simonw/datasette\"",\n+ \""homepage\"": \""http://datasette.readthedocs.io/\"",\n+ \""size\"": 3487,\n+ \""stargazers_count\"": 3642,\n+ \""watchers_count\"": 3642,\n+ \""language\"": \""Python\"",\n+ \""has_issues\"": true,\n+ \""has_projects\"": false,\n+ \""has_downloads\"": true,\n+ \""has_wiki\"": true,\n+ \""has_pages\"": false,\n+ \""forks_count\"": 206,\n+ \""mirror_url\"": null,\n+ \""archived\"": false,\n+ \""disabled\"": false,\n+ \""open_issues_count\"": 190,\n+ \""license\"": {\n+ \""key\"": \""apache-2.0\"",\n+ \""name\"": \""Apache License 2.0\"",\n+ \""spdx_id\"": \""Apache-2.0\"",\n+ \""url\"": \""https://api.github.com/licenses/apache-2.0\"",\n+ \""node_id\"": \""MDc6TGljZW5zZTI=\""\n+ },\n+ \""forks\"": 206,\n+ \""open_issues\"": 190,\n+ \""watchers\"": 3642,\n+ \""default_branch\"": \""master\""\n+ }\n+ },\n+ \""base\"": {\n+ \""label\"": \""simonw:master\"",\n+ \""ref\"": \""master\"",\n+ \""sha\"": \""f04deebec4f3842f7bd610cd5859de529f77d50e\"",\n+ \""user\"": {\n+ \""login\"": \""simonw\"",\n+ \""id\"": 9599,\n+ \""node_id\"": \""MDQ6VXNlcjk1OTk=\"",\n+ \""avatar_url\"": \""https://avatars0.githubusercontent.com/u/9599?v=4\"",\n+ \""gravatar_id\"": \""\"",\n+ \""url\"": \""https://api.github.com/users/simonw\"",\n+ \""html_url\"": \""https://github.com/simonw\"",\n+ \""followers_url\"": \""https://api.github.com/users/simonw/followers\"",\n+ \""following_url\"": \""https://api.github.com/users/simonw/following{/other_user}\"",\n+ \""gists_url\"": \""https://api.github.com/users/simonw/gists{/gist_id}\"",\n+ \""starred_url\"": \""https://api.github.com/users/simonw/starred{/owner}{/repo}\"",\n+ \""subscriptions_url\"": \""https://api.github.com/users/simonw/subscriptions\"",\n+ \""organizations_url\"": \""https://api.github.com/users/simonw/orgs\"",\n+ \""repos_url\"": \""https://api.github.com/users/simonw/repos\"",\n+ \""events_url\"": \""https://api.github.com/users/simonw/events{/privacy}\"",\n+ \""received_events_url\"": \""https://api.github.com/users/simonw/received_events\"",\n+ \""type\"": \""User\"",\n+ \""site_admin\"": false\n+ },\n+ \""repo\"": {\n+ \""id\"": 107914493,\n+ \""node_id\"": \""MDEwOlJlcG9zaXRvcnkxMDc5MTQ0OTM=\"",\n+ \""name\"": \""datasette\"",\n+ \""full_name\"": \""simonw/datasette\"",\n+ \""private\"": false,\n+ \""owner\"": {\n+ \""login\"": \""simonw\"",\n+ \""id\"": 9599,\n+ \""node_id\"": \""MDQ6VXNlcjk1OTk=\"",\n+ \""avatar_url\"": \""https://avatars0.githubusercontent.com/u/9599?v=4\"",\n+ \""gravatar_id\"": \""\"",\n+ \""url\"": \""https://api.github.com/users/simonw\"",\n+ \""html_url\"": \""https://github.com/simonw\"",\n+ \""followers_url\"": \""https://api.github.com/users/simonw/followers\"",\n+ \""following_url\"": \""https://api.github.com/users/simonw/following{/other_user}\"",\n+ \""gists_url\"": 
\""https://api.github.com/users/simonw/gists{/gist_id}\"",\n+ \""starred_url\"": \""https://api.github.com/users/simonw/starred{/owner}{/repo}\"",\n+ \""subscriptions_url\"": \""https://api.github.com/users/simonw/subscriptions\"",\n+ \""organizations_url\"": \""https://api.github.com/users/simonw/orgs\"",\n+ \""repos_url\"": \""https://api.github.com/users/simonw/repos\"",\n+ \""events_url\"": \""https://api.github.com/users/simonw/events{/privacy}\"",\n+ \""received_events_url\"": \""https://api.github.com/users/simonw/received_events\"",\n+ \""type\"": \""User\"",\n+ \""site_admin\"": false\n+ },\n+ \""html_url\"": \""https://github.com/simonw/datasette\"",\n+ \""description\"": \""An open source multi-tool for exploring and publishing data\"",\n+ \""fork\"": false,\n+ \""url\"": \""https://api.github.com/repos/simonw/datasette\"",\n+ \""forks_url\"": \""https://api.github.com/repos/simonw/datasette/forks\"",\n+ \""keys_url\"": \""https://api.github.com/repos/simonw/datasette/keys{/key_id}\"",\n+ \""collaborators_url\"": \""https://api.github.com/repos/simonw/datasette/collaborators{/collaborator}\"",\n+ \""teams_url\"": \""https://api.github.com/repos/simonw/datasette/teams\"",\n+ \""hooks_url\"": \""https://api.github.com/repos/simonw/datasette/hooks\"",\n+ \""issue_events_url\"": \""https://api.github.com/repos/simonw/datasette/issues/events{/number}\"",\n+ \""events_url\"": \""https://api.github.com/repos/simonw/datasette/events\"",\n+ \""assignees_url\"": \""https://api.github.com/repos/simonw/datasette/assignees{/user}\"",\n+ \""branches_url\"": \""https://api.github.com/repos/simonw/datasette/branches{/branch}\"",\n+ \""tags_url\"": \""https://api.github.com/repos/simonw/datasette/tags\"",\n+ \""blobs_url\"": \""https://api.github.com/repos/simonw/datasette/git/blobs{/sha}\"",\n+ \""git_tags_url\"": \""https://api.github.com/repos/simonw/datasette/git/tags{/sha}\"",\n+ \""git_refs_url\"": \""https://api.github.com/repos/simonw/datasette/git/refs{/sha}\"",\n+ \""trees_url\"": \""https://api.github.com/repos/simonw/datasette/git/trees{/sha}\"",\n+ \""statuses_url\"": \""https://api.github.com/repos/simonw/datasette/statuses/{sha}\"",\n+ \""languages_url\"": \""https://api.github.com/repos/simonw/datasette/languages\"",\n+ \""stargazers_url\"": \""https://api.github.com/repos/simonw/datasette/stargazers\"",\n+ \""contributors_url\"": \""https://api.github.com/repos/simonw/datasette/contributors\"",\n+ \""subscribers_url\"": \""https://api.github.com/repos/simonw/datasette/subscribers\"",\n+ \""subscription_url\"": \""https://api.github.com/repos/simonw/datasette/subscription\"",\n+ \""commits_url\"": \""https://api.github.com/repos/simonw/datasette/commits{/sha}\"",\n+ \""git_commits_url\"": \""https://api.github.com/repos/simonw/datasette/git/commits{/sha}\"",\n+ \""comments_url\"": \""https://api.github.com/repos/simonw/datasette/comments{/number}\"",\n+ \""issue_comment_url\"": \""https://api.github.com/repos/simonw/datasette/issues/comments{/number}\"",\n+ \""contents_url\"": \""https://api.github.com/repos/simonw/datasette/contents/{+path}\"",\n+ \""compare_url\"": \""https://api.github.com/repos/simonw/datasette/compare/{base}...{head}\"",\n+ \""merges_url\"": \""https://api.github.com/repos/simonw/datasette/merges\"",\n+ \""archive_url\"": \""https://api.github.com/repos/simonw/datasette/{archive_format}{/ref}\"",\n+ \""downloads_url\"": \""https://api.github.com/repos/simonw/datasette/downloads\"",\n+ \""issues_url\"": 
\""https://api.github.com/repos/simonw/datasette/issues{/number}\"",\n+ \""pulls_url\"": \""https://api.github.com/repos/simonw/datasette/pulls{/number}\"",\n+ \""milestones_url\"": \""https://api.github.com/repos/simonw/datasette/milestones{/number}\"",\n+ \""notifications_url\"": \""https://api.github.com/repos/simonw/datasette/notifications{?since,all,participating}\"",\n+ \""labels_url\"": \""https://api.github.com/repos/simonw/datasette/labels{/name}\"",\n+ \""releases_url\"": \""https://api.github.com/repos/simonw/datasette/releases{/id}\"",\n+ \""deployments_url\"": \""https://api.github.com/repos/simonw/datasette/deployments\"",\n+ \""created_at\"": \""2017-10-23T00:39:03Z\"",\n+ \""updated_at\"": \""2020-07-27T20:42:15Z\"",\n+ \""pushed_at\"": \""2020-07-26T01:21:05Z\"",\n+ \""git_url\"": \""git://github.com/simonw/datasette.git\"",\n+ \""ssh_url\"": \""git@github.com:simonw/datasette.git\"",\n+ \""clone_url\"": \""https://github.com/simonw/datasette.git\"",\n+ \""svn_url\"": \""https://github.com/simonw/datasette\"",\n+ \""homepage\"": \""http://datasette.readthedocs.io/\"",\n+ \""size\"": 3487,\n+ \""stargazers_count\"": 3642,\n+ \""watchers_count\"": 3642,\n+ \""language\"": \""Python\"",\n+ \""has_issues\"": true,\n+ \""has_projects\"": false,\n+ \""has_downloads\"": true,\n+ \""has_wiki\"": true,\n+ \""has_pages\"": false,\n+ \""forks_count\"": 206,\n+ \""mirror_url\"": null,\n+ \""archived\"": false,\n+ \""disabled\"": false,\n+ \""open_issues_count\"": 190,\n+ \""license\"": {\n+ \""key\"": \""apache-2.0\"",\n+ \""name\"": \""Apache License 2.0\"",\n+ \""spdx_id\"": \""Apache-2.0\"",\n+ \""url\"": \""https://api.github.com/licenses/apache-2.0\"",\n+ \""node_id\"": \""MDc6TGljZW5zZTI=\""\n+ },\n+ \""forks\"": 206,\n+ \""open_issues\"": 190,\n+ \""watchers\"": 3642,\n+ \""default_branch\"": \""master\""\n+ }\n+ },\n+ \""_links\"": {\n+ \""self\"": {\n+ \""href\"": \""https://api.github.com/repos/simonw/datasette/pulls/571\""\n+ },\n+ \""html\"": {\n+ \""href\"": \""https://github.com/simonw/datasette/pull/571\""\n+ },\n+ \""issue\"": {\n+ \""href\"": \""https://api.github.com/repos/simonw/datasette/issues/571\""\n+ },\n+ \""comments\"": {\n+ \""href\"": \""https://api.github.com/repos/simonw/datasette/issues/571/comments\""\n+ },\n+ \""review_comments\"": {\n+ \""href\"": \""https://api.github.com/repos/simonw/datasette/pulls/571/comments\""\n+ },\n+ \""review_comment\"": {\n+ \""href\"": \""https://api.github.com/repos/simonw/datasette/pulls/comments{/number}\""\n+ },\n+ \""commits\"": {\n+ \""href\"": \""https://api.github.com/repos/simonw/datasette/pulls/571/commits\""\n+ },\n+ \""statuses\"": {\n+ \""href\"": \""https://api.github.com/repos/simonw/datasette/statuses/a85239f69261c10f1a9f90514c8b5d113cb94585\""\n+ }\n+ },\n+ \""author_association\"": \""OWNER\"",\n+ \""active_lock_reason\"": null,\n+ \""merged\"": true,\n+ \""mergeable\"": null,\n+ \""rebaseable\"": null,\n+ \""mergeable_state\"": \""unknown\"",\n+ \""merged_by\"": {"", ""path"": ""tests/pull_requests.json"", ""position"": 342, ""original_position"": 342, ""commit_id"": ""3a0d5c498f9faae4e40aab204cd01b965a4f61f3"", ""user"": { ""login"": ""simonw"", ""id"": 9599, ""node_id"": ""MDQ6VXNlcjk1OTk="", ""avatar_url"": ""https://avatars0.githubusercontent.com/u/9599?u=5968723deb1a55b82620e106f5ca58e9b11a0942&v=4"", ""gravatar_id"": """", ""url"": ""https://api.github.com/users/simonw"", ""html_url"": ""https://github.com/simonw"", ""followers_url"": ""https://api.github.com/users/simonw/followers"", 
""following_url"": ""https://api.github.com/users/simonw/following{/other_user}"", ""gists_url"": ""https://api.github.com/users/simonw/gists{/gist_id}"", ""starred_url"": ""https://api.github.com/users/simonw/starred{/owner}{/repo}"", ""subscriptions_url"": ""https://api.github.com/users/simonw/subscriptions"", ""organizations_url"": ""https://api.github.com/users/simonw/orgs"", ""repos_url"": ""https://api.github.com/users/simonw/repos"", ""events_url"": ""https://api.github.com/users/simonw/events{/privacy}"", ""received_events_url"": ""https://api.github.com/users/simonw/received_events"", ""type"": ""User"", ""site_admin"": false }, ""body"": ""Running this should create a `merged_by` column on the `pull_requests` table which is a foreign key to the `users` table."", ""created_at"": ""2020-10-06T21:22:47Z"", ""updated_at"": ""2020-10-20T20:56:33Z"", ""html_url"": ""https://github.com/dogsheep/github-to-sqlite/pull/48#discussion_r500603838"", ""pull_request_url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/pulls/48"", ""author_association"": ""MEMBER"", ""_links"": { ""self"": { ""href"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/pulls/comments/500603838"" }, ""html"": { ""href"": ""https://github.com/dogsheep/github-to-sqlite/pull/48#discussion_r500603838"" }, ""pull_request"": { ""href"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/pulls/48"" } }, ""original_commit_id"": ""4f33b850bd37829262dd29e1c520afffebedc19c"" }, { ""id"": 500606198, ""node_id"": ""MDI0OlB1bGxSZXF1ZXN0UmV2aWV3Q29tbWVudDUwMDYwNjE5OA=="", ""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/pulls/comments/500606198"", ""pull_request_review_id"": 503368921, ""diff_hunk"": ""@@ -0,0 +1,124 @@\n+from github_to_sqlite import utils\n+import pytest\n+import pathlib\n+import sqlite_utils\n+from sqlite_utils.db import ForeignKey\n+import json\n+\n+\n+@pytest.fixture\n+def pull_requests():\n+ return json.load(open(pathlib.Path(__file__).parent / \""pull_requests.json\""))\n+\n+\n+@pytest.fixture\n+def db(pull_requests):\n+ db = sqlite_utils.Database(memory=True)\n+ db[\""repos\""].insert(\n+ {\""id\"": 1},\n+ pk=\""id\"",\n+ columns={\""organization\"": int, \""topics\"": str, \""name\"": str, \""description\"": str},\n+ )\n+ utils.save_pull_requests(db, pull_requests, {\""id\"": 1})\n+ return db\n+\n+\n+def test_tables(db):\n+ assert {\""pull_requests\"", \""users\"", \""repos\"", \""milestones\""} == set(\n+ db.table_names()\n+ )\n+ assert {\n+ ForeignKey(\n+ table=\""pull_requests\"", column=\""repo\"", other_table=\""repos\"", other_column=\""id\""\n+ ),\n+ ForeignKey(\n+ table=\""pull_requests\"",\n+ column=\""milestone\"",\n+ other_table=\""milestones\"",\n+ other_column=\""id\"",\n+ ),\n+ ForeignKey(\n+ table=\""pull_requests\"", column=\""assignee\"", other_table=\""users\"", other_column=\""id\""\n+ ),\n+ ForeignKey(\n+ table=\""pull_requests\"", column=\""user\"", other_table=\""users\"", other_column=\""id\""\n+ ),\n+ } == set(db[\""pull_requests\""].foreign_keys)\n+\n+\n+def test_pull_requests(db):\n+ pull_request_rows = list(db[\""pull_requests\""].rows)\n+ assert [\n+ {\n+ 'id': 313384926,\n+ 'node_id': 'MDExOlB1bGxSZXF1ZXN0MzEzMzg0OTI2',\n+ 'number': 571,\n+ 'state': 'closed',\n+ 'locked': 0,\n+ 'title': 'detect_fts now works with alternative table escaping',\n+ 'user': 9599,\n+ 'body': 'Fixes #570',\n+ 'created_at': '2019-09-03T00:23:39Z',\n+ 'updated_at': '2019-09-03T00:32:28Z',\n+ 'closed_at': '2019-09-03T00:32:28Z',\n+ 'merged_at': 
'2019-09-03T00:32:28Z',\n+ 'merge_commit_sha': '2dc5c8dc259a0606162673d394ba8cc1c6f54428',\n+ 'assignee': None,\n+ 'milestone': None,\n+ 'draft': 0,\n+ 'head': 'a85239f69261c10f1a9f90514c8b5d113cb94585',\n+ 'base': 'f04deebec4f3842f7bd610cd5859de529f77d50e',\n+ 'author_association': 'OWNER',\n+ 'merged': 1,\n+ 'mergeable': None,\n+ 'rebaseable': None,\n+ 'mergeable_state': 'unknown',\n+ 'merged_by': '{\""login\"": \""simonw\"", \""id\"": 9599, \""node_id\"": \""MDQ6VXNlcjk1OTk=\"", \""avatar_url\"": \""https://avatars0.githubusercontent.com/u/9599?v=4\"", \""gravatar_id\"": \""\"", \""url\"": \""https://api.github.com/users/simonw\"", \""html_url\"": \""https://github.com/simonw\"", \""followers_url\"": \""https://api.github.com/users/simonw/followers\"", \""following_url\"": \""https://api.github.com/users/simonw/following{/other_user}\"", \""gists_url\"": \""https://api.github.com/users/simonw/gists{/gist_id}\"", \""starred_url\"": \""https://api.github.com/users/simonw/starred{/owner}{/repo}\"", \""subscriptions_url\"": \""https://api.github.com/users/simonw/subscriptions\"", \""organizations_url\"": \""https://api.github.com/users/simonw/orgs\"", \""repos_url\"": \""https://api.github.com/users/simonw/repos\"", \""events_url\"": \""https://api.github.com/users/simonw/events{/privacy}\"", \""received_events_url\"": \""https://api.github.com/users/simonw/received_events\"", \""type\"": \""User\"", \""site_admin\"": false}',"", ""path"": ""tests/test_pull_requests.py"", ""position"": null, ""original_position"": 76, ""commit_id"": ""3a0d5c498f9faae4e40aab204cd01b965a4f61f3"", ""user"": { ""login"": ""simonw"", ""id"": 9599, ""node_id"": ""MDQ6VXNlcjk1OTk="", ""avatar_url"": ""https://avatars0.githubusercontent.com/u/9599?u=5968723deb1a55b82620e106f5ca58e9b11a0942&v=4"", ""gravatar_id"": """", ""url"": ""https://api.github.com/users/simonw"", ""html_url"": ""https://github.com/simonw"", ""followers_url"": ""https://api.github.com/users/simonw/followers"", ""following_url"": ""https://api.github.com/users/simonw/following{/other_user}"", ""gists_url"": ""https://api.github.com/users/simonw/gists{/gist_id}"", ""starred_url"": ""https://api.github.com/users/simonw/starred{/owner}{/repo}"", ""subscriptions_url"": ""https://api.github.com/users/simonw/subscriptions"", ""organizations_url"": ""https://api.github.com/users/simonw/orgs"", ""repos_url"": ""https://api.github.com/users/simonw/repos"", ""events_url"": ""https://api.github.com/users/simonw/events{/privacy}"", ""received_events_url"": ""https://api.github.com/users/simonw/received_events"", ""type"": ""User"", ""site_admin"": false }, ""body"": ""See above - this should be 9599, an integer reference to the row in the users table."", ""created_at"": ""2020-10-06T21:27:43Z"", ""updated_at"": ""2020-10-20T20:56:33Z"", ""html_url"": ""https://github.com/dogsheep/github-to-sqlite/pull/48#discussion_r500606198"", ""pull_request_url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/pulls/48"", ""author_association"": ""MEMBER"", ""_links"": { ""self"": { ""href"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/pulls/comments/500606198"" }, ""html"": { ""href"": ""https://github.com/dogsheep/github-to-sqlite/pull/48#discussion_r500606198"" }, ""pull_request"": { ""href"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/pulls/48"" } }, ""original_commit_id"": ""4f33b850bd37829262dd29e1c520afffebedc19c"" }, { ""id"": 500606665, ""node_id"": ""MDI0OlB1bGxSZXF1ZXN0UmV2aWV3Q29tbWVudDUwMDYwNjY2NQ=="", ""url"": 
""https://api.github.com/repos/dogsheep/github-to-sqlite/pulls/comments/500606665"", ""pull_request_review_id"": 503368921, ""diff_hunk"": ""@@ -0,0 +1,124 @@\n+from github_to_sqlite import utils\n+import pytest\n+import pathlib\n+import sqlite_utils\n+from sqlite_utils.db import ForeignKey\n+import json\n+\n+\n+@pytest.fixture\n+def pull_requests():\n+ return json.load(open(pathlib.Path(__file__).parent / \""pull_requests.json\""))\n+\n+\n+@pytest.fixture\n+def db(pull_requests):\n+ db = sqlite_utils.Database(memory=True)\n+ db[\""repos\""].insert(\n+ {\""id\"": 1},\n+ pk=\""id\"",\n+ columns={\""organization\"": int, \""topics\"": str, \""name\"": str, \""description\"": str},\n+ )\n+ utils.save_pull_requests(db, pull_requests, {\""id\"": 1})\n+ return db\n+\n+\n+def test_tables(db):\n+ assert {\""pull_requests\"", \""users\"", \""repos\"", \""milestones\""} == set(\n+ db.table_names()\n+ )\n+ assert {\n+ ForeignKey(\n+ table=\""pull_requests\"", column=\""repo\"", other_table=\""repos\"", other_column=\""id\""\n+ ),\n+ ForeignKey(\n+ table=\""pull_requests\"",\n+ column=\""milestone\"",\n+ other_table=\""milestones\"",\n+ other_column=\""id\"",\n+ ),\n+ ForeignKey(\n+ table=\""pull_requests\"", column=\""assignee\"", other_table=\""users\"", other_column=\""id\""\n+ ),\n+ ForeignKey(\n+ table=\""pull_requests\"", column=\""user\"", other_table=\""users\"", other_column=\""id\""\n+ ),\n+ } == set(db[\""pull_requests\""].foreign_keys)\n+\n+\n+def test_pull_requests(db):\n+ pull_request_rows = list(db[\""pull_requests\""].rows)\n+ assert [\n+ {\n+ 'id': 313384926,"", ""path"": ""tests/test_pull_requests.py"", ""position"": null, ""original_position"": 53, ""commit_id"": ""3a0d5c498f9faae4e40aab204cd01b965a4f61f3"", ""user"": { ""login"": ""simonw"", ""id"": 9599, ""node_id"": ""MDQ6VXNlcjk1OTk="", ""avatar_url"": ""https://avatars0.githubusercontent.com/u/9599?u=5968723deb1a55b82620e106f5ca58e9b11a0942&v=4"", ""gravatar_id"": """", ""url"": ""https://api.github.com/users/simonw"", ""html_url"": ""https://github.com/simonw"", ""followers_url"": ""https://api.github.com/users/simonw/followers"", ""following_url"": ""https://api.github.com/users/simonw/following{/other_user}"", ""gists_url"": ""https://api.github.com/users/simonw/gists{/gist_id}"", ""starred_url"": ""https://api.github.com/users/simonw/starred{/owner}{/repo}"", ""subscriptions_url"": ""https://api.github.com/users/simonw/subscriptions"", ""organizations_url"": ""https://api.github.com/users/simonw/orgs"", ""repos_url"": ""https://api.github.com/users/simonw/repos"", ""events_url"": ""https://api.github.com/users/simonw/events{/privacy}"", ""received_events_url"": ""https://api.github.com/users/simonw/received_events"", ""type"": ""User"", ""site_admin"": false }, ""body"": ""Minor detail: I use Black for this repo, which requires double quotes - running \""black .\"" in the root directory (with the latest version of Black) should handle this for you."", ""created_at"": ""2020-10-06T21:28:31Z"", ""updated_at"": ""2020-10-20T20:56:33Z"", ""html_url"": ""https://github.com/dogsheep/github-to-sqlite/pull/48#discussion_r500606665"", ""pull_request_url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/pulls/48"", ""author_association"": ""MEMBER"", ""_links"": { ""self"": { ""href"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/pulls/comments/500606665"" }, ""html"": { ""href"": ""https://github.com/dogsheep/github-to-sqlite/pull/48#discussion_r500606665"" }, ""pull_request"": { ""href"": 
""https://api.github.com/repos/dogsheep/github-to-sqlite/pulls/48"" } }, ""original_commit_id"": ""4f33b850bd37829262dd29e1c520afffebedc19c"" } ] ``` That's a lot more interesting.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",664485022,Feature: pull request reviews and comments, https://github.com/dogsheep/github-to-sqlite/issues/46#issuecomment-735482546,https://api.github.com/repos/dogsheep/github-to-sqlite/issues/46,735482546,MDEyOklzc3VlQ29tbWVudDczNTQ4MjU0Ng==,9599,simonw,2020-11-30T00:22:02Z,2020-11-30T00:22:02Z,MEMBER,"As for reviews... here's the output of `github-to-sqlite get https://api.github.com/repos/dogsheep/github-to-sqlite/pulls/48/reviews --accept 'application/vnd.github.v3+json'` ```json [ { ""id"": 503368921, ""node_id"": ""MDE3OlB1bGxSZXF1ZXN0UmV2aWV3NTAzMzY4OTIx"", ""user"": { ""login"": ""simonw"", ""id"": 9599, ""node_id"": ""MDQ6VXNlcjk1OTk="", ""avatar_url"": ""https://avatars0.githubusercontent.com/u/9599?u=5968723deb1a55b82620e106f5ca58e9b11a0942&v=4"", ""gravatar_id"": """", ""url"": ""https://api.github.com/users/simonw"", ""html_url"": ""https://github.com/simonw"", ""followers_url"": ""https://api.github.com/users/simonw/followers"", ""following_url"": ""https://api.github.com/users/simonw/following{/other_user}"", ""gists_url"": ""https://api.github.com/users/simonw/gists{/gist_id}"", ""starred_url"": ""https://api.github.com/users/simonw/starred{/owner}{/repo}"", ""subscriptions_url"": ""https://api.github.com/users/simonw/subscriptions"", ""organizations_url"": ""https://api.github.com/users/simonw/orgs"", ""repos_url"": ""https://api.github.com/users/simonw/repos"", ""events_url"": ""https://api.github.com/users/simonw/events{/privacy}"", ""received_events_url"": ""https://api.github.com/users/simonw/received_events"", ""type"": ""User"", ""site_admin"": false }, ""body"": """", ""state"": ""CHANGES_REQUESTED"", ""html_url"": ""https://github.com/dogsheep/github-to-sqlite/pull/48#pullrequestreview-503368921"", ""pull_request_url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/pulls/48"", ""author_association"": ""MEMBER"", ""_links"": { ""html"": { ""href"": ""https://github.com/dogsheep/github-to-sqlite/pull/48#pullrequestreview-503368921"" }, ""pull_request"": { ""href"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/pulls/48"" } }, ""submitted_at"": ""2020-10-06T21:28:40Z"", ""commit_id"": ""4f33b850bd37829262dd29e1c520afffebedc19c"" }, { ""id"": 513118561, ""node_id"": ""MDE3OlB1bGxSZXF1ZXN0UmV2aWV3NTEzMTE4NTYx"", ""user"": { ""login"": ""adamjonas"", ""id"": 755825, ""node_id"": ""MDQ6VXNlcjc1NTgyNQ=="", ""avatar_url"": ""https://avatars1.githubusercontent.com/u/755825?v=4"", ""gravatar_id"": """", ""url"": ""https://api.github.com/users/adamjonas"", ""html_url"": ""https://github.com/adamjonas"", ""followers_url"": ""https://api.github.com/users/adamjonas/followers"", ""following_url"": ""https://api.github.com/users/adamjonas/following{/other_user}"", ""gists_url"": ""https://api.github.com/users/adamjonas/gists{/gist_id}"", ""starred_url"": ""https://api.github.com/users/adamjonas/starred{/owner}{/repo}"", ""subscriptions_url"": ""https://api.github.com/users/adamjonas/subscriptions"", ""organizations_url"": ""https://api.github.com/users/adamjonas/orgs"", ""repos_url"": ""https://api.github.com/users/adamjonas/repos"", ""events_url"": ""https://api.github.com/users/adamjonas/events{/privacy}"", ""received_events_url"": 
""https://api.github.com/users/adamjonas/received_events"", ""type"": ""User"", ""site_admin"": false }, ""body"": """", ""state"": ""COMMENTED"", ""html_url"": ""https://github.com/dogsheep/github-to-sqlite/pull/48#pullrequestreview-513118561"", ""pull_request_url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/pulls/48"", ""author_association"": ""CONTRIBUTOR"", ""_links"": { ""html"": { ""href"": ""https://github.com/dogsheep/github-to-sqlite/pull/48#pullrequestreview-513118561"" }, ""pull_request"": { ""href"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/pulls/48"" } }, ""submitted_at"": ""2020-10-20T20:45:05Z"", ""commit_id"": ""4f33b850bd37829262dd29e1c520afffebedc19c"" }, { ""id"": 513127529, ""node_id"": ""MDE3OlB1bGxSZXF1ZXN0UmV2aWV3NTEzMTI3NTI5"", ""user"": { ""login"": ""adamjonas"", ""id"": 755825, ""node_id"": ""MDQ6VXNlcjc1NTgyNQ=="", ""avatar_url"": ""https://avatars1.githubusercontent.com/u/755825?v=4"", ""gravatar_id"": """", ""url"": ""https://api.github.com/users/adamjonas"", ""html_url"": ""https://github.com/adamjonas"", ""followers_url"": ""https://api.github.com/users/adamjonas/followers"", ""following_url"": ""https://api.github.com/users/adamjonas/following{/other_user}"", ""gists_url"": ""https://api.github.com/users/adamjonas/gists{/gist_id}"", ""starred_url"": ""https://api.github.com/users/adamjonas/starred{/owner}{/repo}"", ""subscriptions_url"": ""https://api.github.com/users/adamjonas/subscriptions"", ""organizations_url"": ""https://api.github.com/users/adamjonas/orgs"", ""repos_url"": ""https://api.github.com/users/adamjonas/repos"", ""events_url"": ""https://api.github.com/users/adamjonas/events{/privacy}"", ""received_events_url"": ""https://api.github.com/users/adamjonas/received_events"", ""type"": ""User"", ""site_admin"": false }, ""body"": """", ""state"": ""COMMENTED"", ""html_url"": ""https://github.com/dogsheep/github-to-sqlite/pull/48#pullrequestreview-513127529"", ""pull_request_url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/pulls/48"", ""author_association"": ""CONTRIBUTOR"", ""_links"": { ""html"": { ""href"": ""https://github.com/dogsheep/github-to-sqlite/pull/48#pullrequestreview-513127529"" }, ""pull_request"": { ""href"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/pulls/48"" } }, ""submitted_at"": ""2020-10-20T20:57:33Z"", ""commit_id"": ""3a0d5c498f9faae4e40aab204cd01b965a4f61f3"" } ] ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",664485022,Feature: pull request reviews and comments, https://github.com/dogsheep/github-to-sqlite/issues/46#issuecomment-735482187,https://api.github.com/repos/dogsheep/github-to-sqlite/issues/46,735482187,MDEyOklzc3VlQ29tbWVudDczNTQ4MjE4Nw==,9599,simonw,2020-11-30T00:20:11Z,2020-11-30T00:20:11Z,MEMBER,"Pull request are now added, thanks to @adamjonas.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",664485022,Feature: pull request reviews and comments, https://github.com/dogsheep/github-to-sqlite/issues/54#issuecomment-735465708,https://api.github.com/repos/dogsheep/github-to-sqlite/issues/54,735465708,MDEyOklzc3VlQ29tbWVudDczNTQ2NTcwOA==,9599,simonw,2020-11-29T22:08:46Z,2020-11-29T22:08:46Z,MEMBER,"Demo: - https://github-to-sqlite.dogsheep.net/github/steps?_facet=repo - https://github-to-sqlite.dogsheep.net/github/workflows - 
https://github-to-sqlite.dogsheep.net/github/jobs","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",753026003,github-to-sqlite workflows command, https://github.com/dogsheep/github-to-sqlite/issues/54#issuecomment-735464493,https://api.github.com/repos/dogsheep/github-to-sqlite/issues/54,735464493,MDEyOklzc3VlQ29tbWVudDczNTQ2NDQ5Mw==,9599,simonw,2020-11-29T21:57:32Z,2020-11-29T21:57:32Z,MEMBER,`$ github-to-sqlite workflows github.db simonw/datasette dogsheep/github-to-sqlite`,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",753026003,github-to-sqlite workflows command, https://github.com/dogsheep/github-to-sqlite/issues/54#issuecomment-735464438,https://api.github.com/repos/dogsheep/github-to-sqlite/issues/54,735464438,MDEyOklzc3VlQ29tbWVudDczNTQ2NDQzOA==,9599,simonw,2020-11-29T21:57:08Z,2020-11-29T21:57:08Z,MEMBER,Inspired by this tweet from Michael Heap https://twitter.com/mheap/status/1333108608817631238,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",753026003,github-to-sqlite workflows command, https://github.com/simonw/datasette/issues/1114#issuecomment-735447635,https://api.github.com/repos/simonw/datasette/issues/1114,735447635,MDEyOklzc3VlQ29tbWVudDczNTQ0NzYzNQ==,9599,simonw,2020-11-29T20:16:32Z,2020-11-29T20:17:11Z,OWNER,"The new Docker container is pushed to Docker Hub too. Here's it in action: ``` % docker pull datasetteproject/datasette % docker run --rm -p 8001:8001 -v `pwd`:/mnt datasetteproject/datasette datasette --version datasette, version 0.52.1 % docker run --rm -p 8001:8001 -v `pwd`:/mnt datasetteproject/datasette datasette -p 8001 -h 0.0.0.0 /mnt/nps-spatialite.db --load-extension=spatialite INFO: Started server process [1] INFO: Waiting for application startup. INFO: Application startup complete. INFO: Uvicorn running on http://0.0.0.0:8001 (Press CTRL+C to quit) ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",752966476,--load-extension=spatialite not working with datasetteproject/datasette docker image, https://github.com/simonw/datasette/issues/1115#issuecomment-735447198,https://api.github.com/repos/simonw/datasette/issues/1115,735447198,MDEyOklzc3VlQ29tbWVudDczNTQ0NzE5OA==,9599,simonw,2020-11-29T20:12:49Z,2020-11-29T20:12:49Z,OWNER,"``` $ datasette ../shapefile-to-sqlite/nps-spatialite.db Usage: datasette serve [OPTIONS] [FILES]... Error: It looks like you're trying to load a SpatiaLite database without first loading the SpatiaLite module. Try adding the --load-extension=spatialite option. 
Read more: https://docs.datasette.io/en/stable/spatialite.html ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",752995227,SpatiaLite error could suggest --load-extension=spatialite, https://github.com/simonw/datasette/issues/1099#issuecomment-735444858,https://api.github.com/repos/simonw/datasette/issues/1099,735444858,MDEyOklzc3VlQ29tbWVudDczNTQ0NDg1OA==,9599,simonw,2020-11-29T19:51:58Z,2020-11-29T19:51:58Z,OWNER,"My fix in deb0be4ae56f191f121239b29e83dd53b62d6305 for #1098 was to have Datasette deliberately pretend that compound foreign keys don't exist: https://github.com/simonw/datasette/blob/deb0be4ae56f191f121239b29e83dd53b62d6305/datasette/utils/__init__.py#L470-L495 This workaround will need to be rethought to implement real support for them.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",743371103,Support linking to compound foreign keys, https://github.com/simonw/datasette/issues/1098#issuecomment-735444698,https://api.github.com/repos/simonw/datasette/issues/1098,735444698,MDEyOklzc3VlQ29tbWVudDczNTQ0NDY5OA==,9599,simonw,2020-11-29T19:50:14Z,2020-11-29T19:50:14Z,OWNER,Demo of the fix: https://latest.datasette.io/fixtures/foreign_key_references (the compound foreign key columns do not link to anything),"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",743370900,Foreign key links break for compound foreign keys, https://github.com/simonw/datasette/issues/1098#issuecomment-735443654,https://api.github.com/repos/simonw/datasette/issues/1098,735443654,MDEyOklzc3VlQ29tbWVudDczNTQ0MzY1NA==,9599,simonw,2020-11-29T19:41:01Z,2020-11-29T19:41:01Z,OWNER,Fix is out in 0.52.1: https://docs.datasette.io/en/latest/changelog.html#v0-52-1,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",743370900,Foreign key links break for compound foreign keys, https://github.com/simonw/datasette/issues/1114#issuecomment-735443626,https://api.github.com/repos/simonw/datasette/issues/1114,735443626,MDEyOklzc3VlQ29tbWVudDczNTQ0MzYyNg==,9599,simonw,2020-11-29T19:40:49Z,2020-11-29T19:40:49Z,OWNER,Fix is out in 0.52.1: https://docs.datasette.io/en/latest/changelog.html#v0-52-1,"{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",752966476,--load-extension=spatialite not working with datasetteproject/datasette docker image, https://github.com/simonw/datasette/issues/1098#issuecomment-735442226,https://api.github.com/repos/simonw/datasette/issues/1098,735442226,MDEyOklzc3VlQ29tbWVudDczNTQ0MjIyNg==,9599,simonw,2020-11-29T19:28:04Z,2020-11-29T19:28:04Z,OWNER,"For the moment I'm going to solve this by teaching Datasette's internal introspection methods - in particular these ones - to ignore compound foreign keys entirely: https://github.com/simonw/datasette/blob/e800ffcf7cc6a915eb554b369c654f87162575e5/datasette/utils/__init__.py#L470-L505","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",743370900,Foreign key links break for compound foreign keys, 
https://github.com/simonw/datasette/issues/123#issuecomment-735440555,https://api.github.com/repos/simonw/datasette/issues/123,735440555,MDEyOklzc3VlQ29tbWVudDczNTQ0MDU1NQ==,11912854,jsancho-gpl,2020-11-29T19:12:30Z,2020-11-29T19:12:30Z,NONE,"[datasette-connectors](https://github.com/pytables/datasette-connectors) provides an API for making connectors for any file based database. For example, [datasette-pytables](https://github.com/pytables/datasette-pytables) is a connector for HDF5 files, so now is possible to use this type of files with Datasette. It'd be nice if Datasette coud provide that API directly, for other file formats and for urls too.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275125561,Datasette serve should accept paths/URLs to CSVs and other file formats, https://github.com/simonw/datasette/issues/1114#issuecomment-735436014,https://api.github.com/repos/simonw/datasette/issues/1114,735436014,MDEyOklzc3VlQ29tbWVudDczNTQzNjAxNA==,2182,danp,2020-11-29T18:33:30Z,2020-11-29T18:33:30Z,CONTRIBUTOR,Thank you!,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",752966476,--load-extension=spatialite not working with datasetteproject/datasette docker image, https://github.com/simonw/datasette/issues/1114#issuecomment-735429041,https://api.github.com/repos/simonw/datasette/issues/1114,735429041,MDEyOklzc3VlQ29tbWVudDczNTQyOTA0MQ==,9599,simonw,2020-11-29T17:36:18Z,2020-11-29T17:36:18Z,OWNER,"This is a great catch, thanks.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",752966476,--load-extension=spatialite not working with datasetteproject/datasette docker image, https://github.com/simonw/datasette/issues/1102#issuecomment-735356882,https://api.github.com/repos/simonw/datasette/issues/1102,735356882,MDEyOklzc3VlQ29tbWVudDczNTM1Njg4Mg==,9599,simonw,2020-11-29T07:47:22Z,2020-11-29T07:47:22Z,OWNER,Live: https://docs.datasette.io/en/latest/testing_plugins.html,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",749289611,Plugin testing docs should show datasette.client, https://github.com/simonw/datasette/issues/1111#issuecomment-735320736,https://api.github.com/repos/simonw/datasette/issues/1111,735320736,MDEyOklzc3VlQ29tbWVudDczNTMyMDczNg==,9599,simonw,2020-11-29T02:46:23Z,2020-11-29T02:46:23Z,OWNER,"This is a really useful bug report, thanks! I agree: more aggressive timeouts on table counts sounds like the right solution here. I've learned that avoiding `count(*)` is crucial for handling these larger databases. 
Datasette has been getting better about that over time but there are still some edge-cases.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",751195017,Accessing a database's `.json` is slow for very large SQLite files, https://github.com/simonw/datasette/issues/1108#issuecomment-735320499,https://api.github.com/repos/simonw/datasette/issues/1108,735320499,MDEyOklzc3VlQ29tbWVudDczNTMyMDQ5OQ==,9599,simonw,2020-11-29T02:42:42Z,2020-11-29T02:42:42Z,OWNER,https://docs.datasette.io/en/stable/config.html now redirects correctly.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",750087350,Configure /en/stable/config.html redirect when I ship 0.52, https://github.com/simonw/datasette/issues/1113#issuecomment-735304071,https://api.github.com/repos/simonw/datasette/issues/1113,735304071,MDEyOklzc3VlQ29tbWVudDczNTMwNDA3MQ==,9599,simonw,2020-11-28T23:23:31Z,2020-11-28T23:23:31Z,OWNER,https://github.com/simonw/datasette/blob/bbde835a1fec01458e8d00929e7bab6d6a5ba948/datasette/views/table.py#L1013-L1026 - looks like I fixed this in #1088.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",752789159,500 error on row page if query against foreign keys hits time limit, https://github.com/simonw/datasette/pull/1112#issuecomment-735283033,https://api.github.com/repos/simonw/datasette/issues/1112,735283033,MDEyOklzc3VlQ29tbWVudDczNTI4MzAzMw==,9599,simonw,2020-11-28T19:53:36Z,2020-11-28T19:53:36Z,OWNER,Thanks!,"{""total_count"": 1, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 1, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",752749485,Fix --metadata doc usage, https://github.com/simonw/datasette/issues/493#issuecomment-735281577,https://api.github.com/repos/simonw/datasette/issues/493,735281577,MDEyOklzc3VlQ29tbWVudDczNTI4MTU3Nw==,50527,jefftriplett,2020-11-28T19:39:53Z,2020-11-28T19:39:53Z,CONTRIBUTOR,"I was confused by `--config` and I tried passing the json from datasette-ripgrep into `config.json` just as a wild guess. A short term solution might be pointing out in plugins that their snippet json can go in `metadata.json` at least makes it easier to search for config options or to know where to start if someone is new. ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",449886319,Rename metadata.json to config.json, https://github.com/simonw/datasette/pull/1112#issuecomment-735279733,https://api.github.com/repos/simonw/datasette/issues/1112,735279733,MDEyOklzc3VlQ29tbWVudDczNTI3OTczMw==,22429695,codecov[bot],2020-11-28T19:24:28Z,2020-11-28T19:24:28Z,NONE,"# [Codecov](https://codecov.io/gh/simonw/datasette/pull/1112?src=pr&el=h1) Report > Merging [#1112](https://codecov.io/gh/simonw/datasette/pull/1112?src=pr&el=desc) (1a30fc2) into [main](https://codecov.io/gh/simonw/datasette/commit/37d18a5bce08c9ee53c080f613bae84fc2ccc853?el=desc) (37d18a5) will **not change** coverage. > The diff coverage is `n/a`. 
[](https://codecov.io/gh/simonw/datasette/pull/1112?src=pr&el=tree) ```diff @@ Coverage Diff @@ ## main #1112 +/- ## ======================================= Coverage 91.44% 91.44% ======================================= Files 30 30 Lines 3833 3833 ======================================= Hits 3505 3505 Misses 328 328 ``` ------ [Continue to review full report at Codecov](https://codecov.io/gh/simonw/datasette/pull/1112?src=pr&el=continue). > **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta) > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data` > Powered by [Codecov](https://codecov.io/gh/simonw/datasette/pull/1112?src=pr&el=footer). Last update [37d18a5...1a30fc2](https://codecov.io/gh/simonw/datasette/pull/1112?src=pr&el=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments). ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",752749485,Fix --metadata doc usage, https://github.com/simonw/datasette/pull/1112#issuecomment-735279355,https://api.github.com/repos/simonw/datasette/issues/1112,735279355,MDEyOklzc3VlQ29tbWVudDczNTI3OTM1NQ==,50527,jefftriplett,2020-11-28T19:21:09Z,2020-11-28T19:21:09Z,CONTRIBUTOR,"(Even more annoying is that I see my editor leaked an extra delete space at the end of the line. I'm happy to rebuild this to be less annoying, but you probably don't want the changelog update either way)","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",752749485,Fix --metadata doc usage, https://github.com/simonw/datasette/issues/1110#issuecomment-733432916,https://api.github.com/repos/simonw/datasette/issues/1110,733432916,MDEyOklzc3VlQ29tbWVudDczMzQzMjkxNg==,9599,simonw,2020-11-25T03:04:29Z,2020-11-25T03:04:29Z,OWNER,Already have a pattern for extra packages in the form of the `--spatialite` feature: https://github.com/simonw/datasette/blob/f2e2bfcdd9ad4891f3f66c9104c09943d943ffe4/datasette/utils/__init__.py#L50-L55,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",750330029,datasette publish option for installing extra apt-get packages, https://github.com/simonw/datasette/issues/1110#issuecomment-733432768,https://api.github.com/repos/simonw/datasette/issues/1110,733432768,MDEyOklzc3VlQ29tbWVudDczMzQzMjc2OA==,9599,simonw,2020-11-25T03:04:00Z,2020-11-25T03:04:00Z,OWNER,"Design: datasette publish cloudrun blah.db --apt-get-install ripgrep --install datasette-ripgrep","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",750330029,datasette publish option for installing extra apt-get packages, https://github.com/simonw/datasette/issues/860#issuecomment-733288841,https://api.github.com/repos/simonw/datasette/issues/860,733288841,MDEyOklzc3VlQ29tbWVudDczMzI4ODg0MQ==,9599,simonw,2020-11-24T23:19:47Z,2020-11-24T23:20:24Z,OWNER,Here's what I have today - it's an undocumented `datasette.metadata()` method that returns a full JSON dictionary of values OR a single value if the optional `key=` argument is provided: https://github.com/simonw/datasette/blob/f2e2bfcdd9ad4891f3f66c9104c09943d943ffe4/datasette/app.py#L357-L388,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",642651572,Plugin hook for 
instance/database/table metadata, https://github.com/simonw/datasette/issues/860#issuecomment-733288522,https://api.github.com/repos/simonw/datasette/issues/860,733288522,MDEyOklzc3VlQ29tbWVudDczMzI4ODUyMg==,9599,simonw,2020-11-24T23:18:47Z,2020-11-24T23:18:47Z,OWNER,"In #942 I want to add support for per-column metadata - which means this new lookup mechanism will need to be able to answer the question ""what description is available for this column"". So what should the `.metadata()` method look like? A couple of options: - `datasette.metadata(""description"", table=x, database=y)` - can take optional `column=` too. - `datasette.table_metadata(""description"", table=x, database=y)` and `datasette.database_metadata(""description"", database=y)` and so on - multiple methods for the different types of metadata.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",642651572,Plugin hook for instance/database/table metadata, https://github.com/simonw/datasette/issues/860#issuecomment-733287619,https://api.github.com/repos/simonw/datasette/issues/860,733287619,MDEyOklzc3VlQ29tbWVudDczMzI4NzYxOQ==,9599,simonw,2020-11-24T23:16:21Z,2020-11-24T23:16:21Z,OWNER,I'll also allow any key to be looked up - so if users want to invent their own metadata keys other than the default `license_url` etc they can do so.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",642651572,Plugin hook for instance/database/table metadata, https://github.com/simonw/datasette/issues/860#issuecomment-733287416,https://api.github.com/repos/simonw/datasette/issues/860,733287416,MDEyOklzc3VlQ29tbWVudDczMzI4NzQxNg==,9599,simonw,2020-11-24T23:15:44Z,2020-11-24T23:15:44Z,OWNER,"I'm going to go with a plugin hook (and Datasette method) that returns individual values - so you ask it for e.g. the `license_url` for a specific table and it returns a string or `None`. 
The default plugin hook implementation that ships with Datasette will then implement cascading lookups against `metadata.json` - but other plugins will be able to provide their own implementations, which should make it easy to build a plugin that lets you keep metadata in a database file and edit it interactively.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",642651572,Plugin hook for instance/database/table metadata, https://github.com/simonw/datasette/issues/1107#issuecomment-733261501,https://api.github.com/repos/simonw/datasette/issues/1107,733261501,MDEyOklzc3VlQ29tbWVudDczMzI2MTUwMQ==,9599,simonw,2020-11-24T22:09:11Z,2020-11-24T22:09:11Z,OWNER,Documentation: https://docs.datasette.io/en/latest/internals.html#setting-key,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",750079085,Rename datasette.config() method to datasette.setting(), https://github.com/simonw/datasette/issues/1107#issuecomment-733257071,https://api.github.com/repos/simonw/datasette/issues/1107,733257071,MDEyOklzc3VlQ29tbWVudDczMzI1NzA3MQ==,9599,simonw,2020-11-24T21:59:32Z,2020-11-24T21:59:32Z,OWNER,I'm going to make this a documented method in https://docs.datasette.io/en/latest/internals.html#datasette-class,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",750079085,Rename datasette.config() method to datasette.setting(), https://github.com/simonw/datasette/issues/1105#issuecomment-733249176,https://api.github.com/repos/simonw/datasette/issues/1105,733249176,MDEyOklzc3VlQ29tbWVudDczMzI0OTE3Ng==,9599,simonw,2020-11-24T21:40:28Z,2020-11-24T21:40:28Z,OWNER,This rebranding is complete - #1107 is a follow-up internal refactor.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",749982022,Rebrand config as settings, https://github.com/simonw/datasette/issues/1106#issuecomment-733248437,https://api.github.com/repos/simonw/datasette/issues/1106,733248437,MDEyOklzc3VlQ29tbWVudDczMzI0ODQzNw==,9599,simonw,2020-11-24T21:38:50Z,2020-11-24T21:38:50Z,OWNER,"I used an ""exact redirect"" instead and it worked: <img width=""654"" alt=""Edit_Redirects___Read_the_Docs"" src=""https://user-images.githubusercontent.com/9599/100154418-60bf0d80-2e5a-11eb-92bf-527a14edb5c5.png""> ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",749983857,Rebrand and redirect config.rst as settings.rst, https://github.com/simonw/datasette/issues/1106#issuecomment-733247101,https://api.github.com/repos/simonw/datasette/issues/1106,733247101,MDEyOklzc3VlQ29tbWVudDczMzI0NzEwMQ==,9599,simonw,2020-11-24T21:35:29Z,2020-11-24T21:36:04Z,OWNER,"<img width=""861"" alt=""Edit_Redirects___Read_the_Docs"" src=""https://user-images.githubusercontent.com/9599/100153884-a3341a80-2e59-11eb-8d8c-7027c714b9cf.png""> https://docs.datasette.io/en/latest/config.html isn't redirecting though, even after I tried running a rebuild of the `latest` version.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",749983857,Rebrand and redirect config.rst as settings.rst, 
https://github.com/simonw/datasette/issues/1106#issuecomment-733245596,https://api.github.com/repos/simonw/datasette/issues/1106,733245596,MDEyOklzc3VlQ29tbWVudDczMzI0NTU5Ng==,9599,simonw,2020-11-24T21:32:11Z,2020-11-24T21:32:11Z,OWNER,https://docs.datasette.io/en/latest/settings.html is now live - need to redirect https://docs.datasette.io/en/latest/config.html to it using the ReadTheDocs redirects interface.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",749983857,Rebrand and redirect config.rst as settings.rst, https://github.com/simonw/datasette/issues/1107#issuecomment-733245097,https://api.github.com/repos/simonw/datasette/issues/1107,733245097,MDEyOklzc3VlQ29tbWVudDczMzI0NTA5Nw==,9599,simonw,2020-11-24T21:31:10Z,2020-11-24T21:31:10Z,OWNER,"Most of these use `plugin_config` which is unaffected. It looks like the only code I need to worry about is this trick in `datasette-graphl`: https://github.com/simonw/datasette-graphql/blob/483c9a9e203bb90365def3df8b8f01dda1e75865/datasette_graphql/utils.py#L456-L460 ```python class DatasetteSpecialConfig(wrapt.ObjectProxy): def config(self, key): if key == ""suggest_facets"": return False return self.__wrapped__.config(key) ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",750079085,Rename datasette.config() method to datasette.setting(), https://github.com/simonw/datasette/issues/1107#issuecomment-733244471,https://api.github.com/repos/simonw/datasette/issues/1107,733244471,MDEyOklzc3VlQ29tbWVudDczMzI0NDQ3MQ==,9599,simonw,2020-11-24T21:29:59Z,2020-11-24T21:29:59Z,OWNER,"I ran `rg '.config\(' datasette-*/` in my top-level directory: ``` datasette-sentry/test_datasette_sentry.py: def plugin_config(self, name): datasette-sentry/datasette_sentry.py: config = datasette.plugin_config(""datasette-sentry"") or {} datasette-render-markdown/datasette_render_markdown/__init__.py: datasette.plugin_config( datasette-render-images/datasette_render_images.py: plugin_config = datasette.plugin_config(""datasette-render-images"") or {} datasette-render-html/datasette_render_html.py: config = datasette.plugin_config( datasette-render-timestamps/datasette_render_timestamps/__init__.py: datasette.plugin_config( datasette-permissions-sql/datasette_permissions_sql/__init__.py: for rule in datasette.plugin_config(""datasette-permissions-sql"") or []: datasette-mask-columns/datasette_mask_columns/__init__.py: datasette.plugin_config(""datasette-mask-columns"", database=database) or {} datasette-mask-columns/datasette_mask_columns/__init__.py: masks = datasette.plugin_config(""datasette-mask-columns"", database=database) or {} datasette-media/datasette_media/__init__.py: plugin_config = datasette.plugin_config(""datasette-media"") or {} datasette-insert/datasette_insert/__init__.py: plugin_config = datasette.plugin_config(""datasette-insert"") or {} datasette-indieauth/datasette_indieauth/__init__.py: plugin_config = datasette.plugin_config(""datasette-indieauth"") or {} datasette-graphql/datasette_graphql/__init__.py: config = datasette.plugin_config(""datasette-graphql"") or {} datasette-graphql/datasette_graphql/__init__.py: config = datasette.plugin_config(""datasette-graphql"") or {} datasette-graphql/datasette_graphql/utils.py: auto_camelcase=(datasette.plugin_config(""datasette-graphql"") or {}).get( datasette-graphql/datasette_graphql/utils.py: table_plugin_config = datasette.plugin_config( 
datasette-graphql/datasette_graphql/utils.py: def config(self, key): datasette-graphql/datasette_graphql/utils.py: return self.__wrapped__.config(key) datasette-init/datasette_init/__init__.py: config = datasette.plugin_config(""datasette-init"") datasette-edit-templates/datasette_edit_templates/__init__.py: plugin_config = datasette.plugin_config(""datasette-edit-templates"") or {} datasette-cors/datasette_cors.py: config = datasette.plugin_config(""datasette-cors"") or {} datasette-cluster-map-old/build/lib/datasette_cluster_map/__init__.py: datasette.plugin_config(""datasette-cluster-map"", database=database, table=table) datasette-cluster-map/datasette_cluster_map/__init__.py: datasette.plugin_config(""datasette-cluster-map"", database=database, table=table) datasette-cluster-map/datasette_cluster_map/__init__.py: datasette.plugin_config(""datasette-cluster-map"", database=database, table=table) datasette-cluster-map/tests/test_cluster_map.py:async def test_plugin_config(db_path, config, table, expected_fragments): datasette-configure-asgi/datasette_configure_asgi.py: configs = datasette.plugin_config(""datasette-configure-asgi"") or [] datasette-configure-asgi/test_datasette_configure_asgi.py: def plugin_config(self, name): datasette-cluster-map-old/datasette_cluster_map/__init__.py: datasette.plugin_config(""datasette-cluster-map"", database=database, table=table) datasette-auth-passwords/datasette_auth_passwords/__init__.py: config = datasette.plugin_config(""datasette-auth-passwords"") or {} datasette-auth-github/datasette_auth_github/views.py:def verify_config(config): datasette-auth-github/datasette_auth_github/views.py: config = datasette.plugin_config(""datasette-auth-github"") datasette-auth-github/datasette_auth_github/views.py: verify_config(config) datasette-auth-github/datasette_auth_github/views.py: config = datasette.plugin_config(""datasette-auth-github"") datasette-auth-github/datasette_auth_github/views.py: verify_config(config) datasette-atom/datasette_atom/__init__.py: plugin_config = datasette.plugin_config(""datasette-atom"") datasette-auth-google/datasette_auth_google/__init__.py: config = datasette.plugin_config(""datasette-auth-github"") or {} datasette-auth-existing-cookies/test_datasette_auth_existing_cookies.py: def plugin_config(self, name): datasette-auth-passwords/build/lib/datasette_auth_passwords/__init__.py: config = datasette.plugin_config(""datasette-auth-passwords"") or {} datasette-annotate/datasette_annotate/utils.py: plugin_config = datasette.plugin_config(""datasette-annotate"") or {} datasette-auth-existing-cookies/datasette_auth_existing_cookies/__init__.py: config = datasette.plugin_config(""datasette-auth-existing-cookies"") or {} datasette-auth-simple/datasette_auth_simple/__init__.py:datasette.plugin_config(""datasette-auth-simple"") datasette-auth-tokens/datasette_auth_tokens/__init__.py: config = datasette.plugin_config(""datasette-auth-tokens"") or {} datasette-block-robots/datasette_block_robots/__init__.py: config = datasette.plugin_config(""datasette-block-robots"") or {} datasette-block-robots/datasette_block_robots/__init__.py: config = datasette.plugin_config(""datasette-block-robots"") or {} ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",750079085,Rename datasette.config() method to datasette.setting(), 
https://github.com/simonw/datasette/issues/1107#issuecomment-733241949,https://api.github.com/repos/simonw/datasette/issues/1107,733241949,MDEyOklzc3VlQ29tbWVudDczMzI0MTk0OQ==,9599,simonw,2020-11-24T21:24:26Z,2020-11-24T21:24:26Z,OWNER,Are there any plugins that use this API even though it isn't documented?,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",750079085,Rename datasette.config() method to datasette.setting(), https://github.com/simonw/datasette/issues/1106#issuecomment-733221359,https://api.github.com/repos/simonw/datasette/issues/1106,733221359,MDEyOklzc3VlQ29tbWVudDczMzIyMTM1OQ==,9599,simonw,2020-11-24T20:40:21Z,2020-11-24T20:40:21Z,OWNER,https://readthedocs.org/dashboard/datasette/redirects/,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",749983857,Rebrand and redirect config.rst as settings.rst, https://github.com/simonw/datasette/issues/1104#issuecomment-733212084,https://api.github.com/repos/simonw/datasette/issues/1104,733212084,MDEyOklzc3VlQ29tbWVudDczMzIxMjA4NA==,9599,simonw,2020-11-24T20:20:33Z,2020-11-24T20:20:33Z,OWNER,I'll throw an error if a `config.json` file is detected.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",749981663,config.json in directory config mode should be settings.json, https://github.com/simonw/datasette/issues/226#issuecomment-733198051,https://api.github.com/repos/simonw/datasette/issues/226,733198051,MDEyOklzc3VlQ29tbWVudDczMzE5ODA1MQ==,9599,simonw,2020-11-24T19:52:46Z,2020-11-24T19:52:46Z,OWNER,This is well handled now: https://github.com/simonw/datasette/tree/0.51.1/tests/plugins,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",315738696,Unit tests for installable plugins, https://github.com/simonw/datasette/issues/1105#issuecomment-733190827,https://api.github.com/repos/simonw/datasette/issues/1105,733190827,MDEyOklzc3VlQ29tbWVudDczMzE5MDgyNw==,9599,simonw,2020-11-24T19:38:02Z,2020-11-24T19:38:02Z,OWNER,I'd like to redirect https://docs.datasette.io/en/stable/config.html to a new https://docs.datasette.io/en/stable/settings.html page too. 
I can use https://docs.readthedocs.io/en/stable/user-defined-redirects.html for that.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",749982022,Rebrand config as settings, https://github.com/simonw/datasette/issues/1103#issuecomment-733189737,https://api.github.com/repos/simonw/datasette/issues/1103,733189737,MDEyOklzc3VlQ29tbWVudDczMzE4OTczNw==,9599,simonw,2020-11-24T19:35:45Z,2020-11-24T19:35:45Z,OWNER,Part of #1105,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",749979454,Rename /-/config to /-/settings, https://github.com/simonw/datasette/issues/992#issuecomment-733189693,https://api.github.com/repos/simonw/datasette/issues/992,733189693,MDEyOklzc3VlQ29tbWVudDczMzE4OTY5Mw==,9599,simonw,2020-11-24T19:35:38Z,2020-11-24T19:35:38Z,OWNER,Part of #1105,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",714449879,"Change ""--config foo:bar"" to ""--setting foo bar""", https://github.com/simonw/datasette/issues/1104#issuecomment-733189620,https://api.github.com/repos/simonw/datasette/issues/1104,733189620,MDEyOklzc3VlQ29tbWVudDczMzE4OTYyMA==,9599,simonw,2020-11-24T19:35:30Z,2020-11-24T19:35:30Z,OWNER,Part of #1105,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",749981663,config.json in directory config mode should be settings.json, https://github.com/simonw/datasette/issues/1103#issuecomment-733187586,https://api.github.com/repos/simonw/datasette/issues/1103,733187586,MDEyOklzc3VlQ29tbWVudDczMzE4NzU4Ng==,9599,simonw,2020-11-24T19:31:23Z,2020-11-24T19:31:23Z,OWNER,I'll set up a redirect from `/-/config` to `/-/settings`.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",749979454,Rename /-/config to /-/settings, https://github.com/simonw/datasette/issues/992#issuecomment-733180289,https://api.github.com/repos/simonw/datasette/issues/992,733180289,MDEyOklzc3VlQ29tbWVudDczMzE4MDI4OQ==,9599,simonw,2020-11-24T19:16:30Z,2020-11-24T19:16:30Z,OWNER,Need to figure out the `--setting foo bar` alternative for this `--config foo:bar` logic: https://github.com/simonw/datasette/blob/4bac9f18f9d04e5ed10f072502bcc508e365438e/datasette/cli.py#L31-L63,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",714449879,"Change ""--config foo:bar"" to ""--setting foo bar""", https://github.com/simonw/datasette/issues/992#issuecomment-733176252,https://api.github.com/repos/simonw/datasette/issues/992,733176252,MDEyOklzc3VlQ29tbWVudDczMzE3NjI1Mg==,9599,simonw,2020-11-24T19:07:49Z,2020-11-24T19:07:49Z,OWNER,I'm going to keep `--config` for the moment but show a deprecation warning that it will be gone in Datasette 1.0.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",714449879,"Change ""--config foo:bar"" to ""--setting foo bar""", https://github.com/simonw/datasette/issues/992#issuecomment-733175965,https://api.github.com/repos/simonw/datasette/issues/992,733175965,MDEyOklzc3VlQ29tbWVudDczMzE3NTk2NQ==,9599,simonw,2020-11-24T19:07:13Z,2020-11-24T19:07:13Z,OWNER,"This is blocking progress on other metadata tickets like #860 because I want to split 
the concept of concrete metadata (source, license, etc) from configuration that currently lives in metadata (default sort order, default facets). I'm going to solve this next to unblock that stuff.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",714449879,"Change ""--config foo:bar"" to ""--setting foo bar""", https://github.com/simonw/datasette/issues/860#issuecomment-733175454,https://api.github.com/repos/simonw/datasette/issues/860,733175454,MDEyOklzc3VlQ29tbWVudDczMzE3NTQ1NA==,9599,simonw,2020-11-24T19:06:07Z,2020-11-24T19:06:07Z,OWNER,"I see two ways this plugin hook could work. It could be asked about a specific instance, database or table and return the full metadata for that object. OR it could ask for a specific metadata field - e.g. `source_url` for table X, and return that. The more finely grained one would allow plugins to implement their own cascading rules pretty easily. Is there a reason it would be better for the hook to return an entire block of JSON for a specific table or database? I also need to decide if this hook is just going to be about source/license/about displayed metadata, or if it will include the functionality that has been sneaking into `metadata.json` over time - stuff like page size, default sort order or default facets. Perhaps I should split those out into a ""configuration"" concept first, after renaming `--config` to `--setting` in #992.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",642651572,Plugin hook for instance/database/table metadata, https://github.com/simonw/datasette/issues/1101#issuecomment-732544590,https://api.github.com/repos/simonw/datasette/issues/1101,732544590,MDEyOklzc3VlQ29tbWVudDczMjU0NDU5MA==,9599,simonw,2020-11-24T02:22:55Z,2020-11-24T02:22:55Z,OWNER,"The trick I'm using here is to follow the `next_url` in order to paginate through all of the matching results. 
The loop calls the `data()` method multiple times, once for each page of results: https://github.com/simonw/datasette/blob/4bac9f18f9d04e5ed10f072502bcc508e365438e/datasette/views/base.py#L304-L307","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",749283032,register_output_renderer() should support streaming data, https://github.com/simonw/datasette/issues/1101#issuecomment-732543700,https://api.github.com/repos/simonw/datasette/issues/1101,732543700,MDEyOklzc3VlQ29tbWVudDczMjU0MzcwMA==,9599,simonw,2020-11-24T02:20:30Z,2020-11-24T02:20:30Z,OWNER,"Current design: https://docs.datasette.io/en/stable/plugin_hooks.html#register-output-renderer-datasette ```python @hookimpl def register_output_renderer(datasette): return { ""extension"": ""test"", ""render"": render_demo, ""can_render"": can_render_demo, # Optional } ``` Where `render_demo` looks something like this: ```python async def render_demo(datasette, columns, rows): db = datasette.get_database() result = await db.execute(""select sqlite_version()"") first_row = "" | "".join(columns) lines = [first_row] lines.append(""="" * len(first_row)) for row in rows: lines.append("" | "".join(row)) return Response( ""\n"".join(lines), content_type=""text/plain; charset=utf-8"", headers={""x-sqlite-version"": result.first()[0]} ) ``` Meanwhile here's where the CSV streaming mode is implemented: https://github.com/simonw/datasette/blob/4bac9f18f9d04e5ed10f072502bcc508e365438e/datasette/views/base.py#L297-L380","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",749283032,register_output_renderer() should support streaming data, https://github.com/simonw/datasette/issues/1096#issuecomment-732542285,https://api.github.com/repos/simonw/datasette/issues/1096,732542285,MDEyOklzc3VlQ29tbWVudDczMjU0MjI4NQ==,9599,simonw,2020-11-24T02:16:22Z,2020-11-24T02:16:22Z,OWNER,"I'd like to implement this by first extending the `register_output_renderer()` hook to support streaming huge responses, then switching CSV to use the plugin hook in addition to TSV using it.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",743359646,TSV should be a default export option, https://github.com/simonw/datasette/issues/860#issuecomment-731658059,https://api.github.com/repos/simonw/datasette/issues/860,731658059,MDEyOklzc3VlQ29tbWVudDczMTY1ODA1OQ==,9599,simonw,2020-11-22T00:31:47Z,2020-11-22T00:33:48Z,OWNER,"Documented behaviour right now, for metadata set at the instance level, is: https://docs.datasette.io/en/stable/metadata.html > The above metadata will be displayed on the index page of your Datasette-powered site. The source and license information will also be included in the footer of every page served by Datasette. > > ... > > Metadata at the top level of the JSON will be shown on the index page and in the footer on every page of the site. 
The license and source is expected to apply to all of your data.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",642651572,Plugin hook for instance/database/table metadata, https://github.com/simonw/datasette/issues/860#issuecomment-731657660,https://api.github.com/repos/simonw/datasette/issues/860,731657660,MDEyOklzc3VlQ29tbWVudDczMTY1NzY2MA==,9599,simonw,2020-11-22T00:27:32Z,2020-11-22T00:31:54Z,OWNER,"Open question: how should cascading work? If a table is missing a field but the database or instance has it, should that value cascade down to the table? It feels like `license` should definitely cascade: if an instance lists a certain `license` that should absolutely filter through to all databases and tables. But... should the other fields cascade? Cascading `description` doesn't feel right at all, and neither does `title`. What about `about` and `about_url` and `source` and `source_url`? I'm a bit torn on whether they should cascade or not. I'm leaning towards cascading them.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",642651572,Plugin hook for instance/database/table metadata, https://github.com/simonw/datasette/issues/860#issuecomment-731657486,https://api.github.com/repos/simonw/datasette/issues/860,731657486,MDEyOklzc3VlQ29tbWVudDczMTY1NzQ4Ng==,9599,simonw,2020-11-22T00:25:34Z,2020-11-22T00:25:34Z,OWNER,"There are three layers of metadata: table, database and instance. Currently the metadata fields are (ignoring not-quite-metadata like `sort` and `sort_desc`): - `title` - `description` (or `description_html`) - `about` / `about_url` - `source` / `source_url` - `license` / `license_url`","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",642651572,Plugin hook for instance/database/table metadata, https://github.com/simonw/datasette/issues/1084#issuecomment-731654132,https://api.github.com/repos/simonw/datasette/issues/1084,731654132,MDEyOklzc3VlQ29tbWVudDczMTY1NDEzMg==,9599,simonw,2020-11-21T23:45:59Z,2020-11-21T23:45:59Z,OWNER,https://datasette-graphql-demo.datasette.io/github/users now demonstrates the fix.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",737394470,Table/database action menu cut off if too short , https://github.com/simonw/datasette/issues/1084#issuecomment-731653083,https://api.github.com/repos/simonw/datasette/issues/1084,731653083,MDEyOklzc3VlQ29tbWVudDczMTY1MzA4Mw==,9599,simonw,2020-11-21T23:35:07Z,2020-11-21T23:35:07Z,OWNER,I got to use CSS `calc()` for this: https://github.com/simonw/datasette/blob/4bac9f18f9d04e5ed10f072502bcc508e365438e/datasette/static/app.css#L367-L371,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",737394470,Table/database action menu cut off if too short , https://github.com/simonw/datasette/issues/1084#issuecomment-731652991,https://api.github.com/repos/simonw/datasette/issues/1084,731652991,MDEyOklzc3VlQ29tbWVudDczMTY1Mjk5MQ==,9599,simonw,2020-11-21T23:34:22Z,2020-11-21T23:34:22Z,OWNER,"Fixed by positioning the menu relative to the `<div class=""page-header"">` element rather than the cog icon: <img width=""966"" alt=""data"" 
src=""https://user-images.githubusercontent.com/9599/99889950-02eab580-2c0f-11eb-85dc-dd4bc840a92e.png""> ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",737394470,Table/database action menu cut off if too short , https://github.com/simonw/datasette/issues/1084#issuecomment-731644064,https://api.github.com/repos/simonw/datasette/issues/1084,731644064,MDEyOklzc3VlQ29tbWVudDczMTY0NDA2NA==,9599,simonw,2020-11-21T22:10:15Z,2020-11-21T22:10:15Z,OWNER,"Another example of this bug: https://datasette-graphql-demo.datasette.io/github/users <img width=""350"" alt=""github__users__4_823_rows_and_Comparing_1_1___main_·_simonw_datasette-graphql"" src=""https://user-images.githubusercontent.com/9599/99888639-3889a180-2c03-11eb-83b9-41f2eb12287e.png""> ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",737394470,Table/database action menu cut off if too short , https://github.com/simonw/datasette/issues/1094#issuecomment-731260091,https://api.github.com/repos/simonw/datasette/issues/1094,731260091,MDEyOklzc3VlQ29tbWVudDczMTI2MDA5MQ==,4808085,bapowell,2020-11-20T16:11:29Z,2020-11-20T16:11:29Z,NONE,"I can confirm this issue, running version 0.51.1 under Windows. Fixed by commenting out the following line near the top of datasette\utils\asgi.py : `#from os import EX_CANTCREAT` ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",743011397,import EX_CANTCREAT means datasette fails to work on Windows, https://github.com/simonw/datasette/issues/511#issuecomment-730893729,https://api.github.com/repos/simonw/datasette/issues/511,730893729,MDEyOklzc3VlQ29tbWVudDczMDg5MzcyOQ==,4060506,Carib0u,2020-11-20T06:35:13Z,2020-11-20T06:35:13Z,NONE,"Trying to run on Windows today, I get an error from the utils/asgi.py module. It's trying `from os import EX_CANTCREAT` which is Unix-only. I commented this line out, and (so far) it's working. ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",456578474,Get Datasette tests passing on Windows in GitHub Actions, https://github.com/dogsheep/twitter-to-sqlite/issues/52#issuecomment-729484478,https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/52,729484478,MDEyOklzc3VlQ29tbWVudDcyOTQ4NDQ3OA==,4169772,fatihky,2020-11-18T07:12:45Z,2020-11-18T07:12:45Z,NONE,I'm so sorry that you already have `--since_id` option and that's enough for the case I've mentioned. 
Thank you for this excellent tool!,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",745393298,Discussion: Adding support for fetching only fresh tweets, https://github.com/simonw/datasette/issues/1091#issuecomment-729045320,https://api.github.com/repos/simonw/datasette/issues/1091,729045320,MDEyOklzc3VlQ29tbWVudDcyOTA0NTMyMA==,6739646,tballison,2020-11-17T16:31:00Z,2020-11-17T16:31:00Z,NONE,We're using mod_proxy.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",742011049,.json and .csv exports fail to apply base_url, https://github.com/simonw/datasette/issues/1091#issuecomment-729018386,https://api.github.com/repos/simonw/datasette/issues/1091,729018386,MDEyOklzc3VlQ29tbWVudDcyOTAxODM4Ng==,6739646,tballison,2020-11-17T15:48:58Z,2020-11-17T15:48:58Z,NONE,"I don't think we are, but I'll check with Maruan. I think this is the relevant part of our config? ``` Alias ""/base/"" ""/usr/share/corpora/"" <Directory ""/usr/share/corpora/""> Options +Indexes -Multiviews AllowOverride None </Directory> ProxyPreserveHost On ProxyPass /datasette http://0.0.0.0:8001 ProxyPassReverse /datasette http://0.0.0.0:8001 </VirtualHost> ``` ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",742011049,.json and .csv exports fail to apply base_url, https://github.com/simonw/datasette/issues/1091#issuecomment-728262974,https://api.github.com/repos/simonw/datasette/issues/1091,728262974,MDEyOklzc3VlQ29tbWVudDcyODI2Mjk3NA==,9599,simonw,2020-11-16T19:05:08Z,2020-11-16T19:05:08Z,OWNER,I have a hunch that there may be some extra configuration in play here - could Apache itself be rewriting some of the links using [mod_proxy_html](https://httpd.apache.org/docs/2.4/mod/mod_proxy_html.html)?,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",742011049,.json and .csv exports fail to apply base_url, https://github.com/dogsheep/swarm-to-sqlite/issues/11#issuecomment-727692413,https://api.github.com/repos/dogsheep/swarm-to-sqlite/issues/11,727692413,MDEyOklzc3VlQ29tbWVudDcyNzY5MjQxMw==,9599,simonw,2020-11-16T02:15:22Z,2020-11-16T02:15:22Z,MEMBER,"Thanks, I'll look into this.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",743400216,Error thrown: sqlite3.OperationalError: table users has no column named lastName, https://github.com/simonw/datasette/issues/1098#issuecomment-727656208,https://api.github.com/repos/simonw/datasette/issues/1098,727656208,MDEyOklzc3VlQ29tbWVudDcyNzY1NjIwOA==,9599,simonw,2020-11-15T23:26:14Z,2020-11-15T23:26:14Z,OWNER,"Schema for that broken example: ```sql CREATE TABLE generators_eia860 ( id INTEGER NOT NULL, plant_id_eia INTEGER, generator_id TEXT, report_date DATE, operational_status_code TEXT, operational_status TEXT, ownership_code TEXT, utility_id_eia INTEGER, capacity_mw FLOAT, summer_capacity_mw FLOAT, winter_capacity_mw FLOAT, energy_source_code_1 TEXT, energy_source_code_2 TEXT, energy_source_code_3 TEXT, energy_source_code_4 TEXT, energy_source_code_5 TEXT, energy_source_code_6 TEXT, fuel_type_code_pudl TEXT, multiple_fuels BOOLEAN, deliver_power_transgrid BOOLEAN, syncronized_transmission_grid BOOLEAN, turbines_num INTEGER, planned_modifications BOOLEAN, 
planned_net_summer_capacity_uprate_mw FLOAT, planned_net_winter_capacity_uprate_mw FLOAT, planned_uprate_date DATE, planned_net_summer_capacity_derate_mw FLOAT, planned_net_winter_capacity_derate_mw FLOAT, planned_derate_date DATE, planned_new_prime_mover_code TEXT, planned_energy_source_code_1 TEXT, planned_repower_date DATE, other_planned_modifications BOOLEAN, other_modifications_date DATE, planned_retirement_date DATE, carbon_capture BOOLEAN, startup_source_code_1 TEXT, startup_source_code_2 TEXT, startup_source_code_3 TEXT, startup_source_code_4 TEXT, technology_description TEXT, turbines_inverters_hydrokinetics TEXT, time_cold_shutdown_full_load_code TEXT, planned_new_capacity_mw FLOAT, cofire_fuels BOOLEAN, switch_oil_gas BOOLEAN, nameplate_power_factor FLOAT, minimum_load_mw FLOAT, uprate_derate_during_year BOOLEAN, uprate_derate_completed_date DATE, current_planned_operating_date DATE, summer_estimated_capability_mw FLOAT, winter_estimated_capability_mw FLOAT, retirement_date DATE, PRIMARY KEY (id), FOREIGN KEY(plant_id_eia, generator_id) REFERENCES generators_entity_eia (plant_id_eia, generator_id), FOREIGN KEY(utility_id_eia) REFERENCES utilities_entity_eia (utility_id_eia), CHECK (multiple_fuels IN (0, 1)), CHECK (deliver_power_transgrid IN (0, 1)), CHECK (syncronized_transmission_grid IN (0, 1)), CHECK (planned_modifications IN (0, 1)), CHECK (other_planned_modifications IN (0, 1)), CHECK (carbon_capture IN (0, 1)), CHECK (cofire_fuels IN (0, 1)), CHECK (switch_oil_gas IN (0, 1)), CHECK (uprate_derate_during_year IN (0, 1)) ); ``` https://pudl-datasette-xl7xwcpe2a-uc.a.run.app/pudl/generators_entity_eia is: ```sql CREATE TABLE generators_entity_eia ( plant_id_eia INTEGER NOT NULL, generator_id TEXT NOT NULL, prime_mover_code TEXT, duct_burners BOOLEAN, operating_date DATE, topping_bottoming_code TEXT, solid_fuel_gasification BOOLEAN, pulverized_coal_tech BOOLEAN, fluidized_bed_tech BOOLEAN, subcritical_tech BOOLEAN, supercritical_tech BOOLEAN, ultrasupercritical_tech BOOLEAN, stoker_tech BOOLEAN, other_combustion_tech BOOLEAN, bypass_heat_recovery BOOLEAN, rto_iso_lmp_node_id TEXT, rto_iso_location_wholesale_reporting_id TEXT, associated_combined_heat_power BOOLEAN, original_planned_operating_date DATE, operating_switch TEXT, previously_canceled BOOLEAN, PRIMARY KEY (plant_id_eia, generator_id), FOREIGN KEY(plant_id_eia) REFERENCES plants_entity_eia (plant_id_eia), CHECK (duct_burners IN (0, 1)), CHECK (solid_fuel_gasification IN (0, 1)), CHECK (pulverized_coal_tech IN (0, 1)), CHECK (fluidized_bed_tech IN (0, 1)), CHECK (subcritical_tech IN (0, 1)), CHECK (supercritical_tech IN (0, 1)), CHECK (ultrasupercritical_tech IN (0, 1)), CHECK (stoker_tech IN (0, 1)), CHECK (other_combustion_tech IN (0, 1)), CHECK (bypass_heat_recovery IN (0, 1)), CHECK (associated_combined_heat_power IN (0, 1)), CHECK (previously_canceled IN (0, 1)) ); ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",743370900,Foreign key links break for compound foreign keys, https://github.com/simonw/datasette/issues/1098#issuecomment-727655636,https://api.github.com/repos/simonw/datasette/issues/1098,727655636,MDEyOklzc3VlQ29tbWVudDcyNzY1NTYzNg==,9599,simonw,2020-11-15T23:22:27Z,2020-11-15T23:22:27Z,OWNER,"Need to replicate this in the fixtures, then fix it.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",743370900,Foreign key links break for 
compound foreign keys, https://github.com/simonw/datasette/pull/1097#issuecomment-727655018,https://api.github.com/repos/simonw/datasette/issues/1097,727655018,MDEyOklzc3VlQ29tbWVudDcyNzY1NTAxOA==,22429695,codecov[bot],2020-11-15T23:18:18Z,2020-11-15T23:18:18Z,NONE,"# [Codecov](https://codecov.io/gh/simonw/datasette/pull/1097?src=pr&el=h1) Report > Merging [#1097](https://codecov.io/gh/simonw/datasette/pull/1097?src=pr&el=desc) (e89211d) into [main](https://codecov.io/gh/simonw/datasette/commit/5eb8e9bf250b26e30b017d39a392c33973997656?el=desc) (5eb8e9b) will **not change** coverage. > The diff coverage is `84.61%`. [](https://codecov.io/gh/simonw/datasette/pull/1097?src=pr&el=tree) ```diff @@ Coverage Diff @@ ## main #1097 +/- ## ======================================= Coverage 91.38% 91.38% ======================================= Files 30 30 Lines 3785 3785 ======================================= Hits 3459 3459 Misses 326 326 ``` | [Impacted Files](https://codecov.io/gh/simonw/datasette/pull/1097?src=pr&el=tree) | Coverage Δ | | |---|---|---| | [datasette/cli.py](https://codecov.io/gh/simonw/datasette/pull/1097/diff?src=pr&el=tree#diff-ZGF0YXNldHRlL2NsaS5weQ==) | `73.63% <0.00%> (ø)` | | | [datasette/inspect.py](https://codecov.io/gh/simonw/datasette/pull/1097/diff?src=pr&el=tree#diff-ZGF0YXNldHRlL2luc3BlY3QucHk=) | `36.11% <ø> (ø)` | | | [datasette/publish/common.py](https://codecov.io/gh/simonw/datasette/pull/1097/diff?src=pr&el=tree#diff-ZGF0YXNldHRlL3B1Ymxpc2gvY29tbW9uLnB5) | `94.73% <ø> (ø)` | | | [datasette/tracer.py](https://codecov.io/gh/simonw/datasette/pull/1097/diff?src=pr&el=tree#diff-ZGF0YXNldHRlL3RyYWNlci5weQ==) | `81.60% <0.00%> (ø)` | | | [datasette/utils/testing.py](https://codecov.io/gh/simonw/datasette/pull/1097/diff?src=pr&el=tree#diff-ZGF0YXNldHRlL3V0aWxzL3Rlc3RpbmcucHk=) | `95.16% <ø> (ø)` | | | [datasette/publish/heroku.py](https://codecov.io/gh/simonw/datasette/pull/1097/diff?src=pr&el=tree#diff-ZGF0YXNldHRlL3B1Ymxpc2gvaGVyb2t1LnB5) | `87.12% <50.00%> (ø)` | | | [datasette/app.py](https://codecov.io/gh/simonw/datasette/pull/1097/diff?src=pr&el=tree#diff-ZGF0YXNldHRlL2FwcC5weQ==) | `96.46% <66.66%> (ø)` | | | [datasette/filters.py](https://codecov.io/gh/simonw/datasette/pull/1097/diff?src=pr&el=tree#diff-ZGF0YXNldHRlL2ZpbHRlcnMucHk=) | `94.35% <77.77%> (ø)` | | | [datasette/utils/\_\_init\_\_.py](https://codecov.io/gh/simonw/datasette/pull/1097/diff?src=pr&el=tree#diff-ZGF0YXNldHRlL3V0aWxzL19faW5pdF9fLnB5) | `94.01% <86.20%> (ø)` | | | [datasette/views/table.py](https://codecov.io/gh/simonw/datasette/pull/1097/diff?src=pr&el=tree#diff-ZGF0YXNldHRlL3ZpZXdzL3RhYmxlLnB5) | `95.92% <92.30%> (ø)` | | | ... and [9 more](https://codecov.io/gh/simonw/datasette/pull/1097/diff?src=pr&el=tree-more) | | ------ [Continue to review full report at Codecov](https://codecov.io/gh/simonw/datasette/pull/1097?src=pr&el=continue). > **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta) > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data` > Powered by [Codecov](https://codecov.io/gh/simonw/datasette/pull/1097?src=pr&el=footer). Last update [5eb8e9b...e89211d](https://codecov.io/gh/simonw/datasette/pull/1097?src=pr&el=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments). 
","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",743369188,Use f-strings, https://github.com/simonw/datasette/issues/942#issuecomment-727626657,https://api.github.com/repos/simonw/datasette/issues/942,727626657,MDEyOklzc3VlQ29tbWVudDcyNzYyNjY1Nw==,9599,simonw,2020-11-15T19:54:44Z,2020-11-15T19:54:44Z,OWNER,This will also benefit from the metadata plugin hook: #860 ,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",681334912,Support column descriptions in metadata.json, https://github.com/simonw/datasette/issues/1091#issuecomment-727233553,https://api.github.com/repos/simonw/datasette/issues/1091,727233553,MDEyOklzc3VlQ29tbWVudDcyNzIzMzU1Mw==,9599,simonw,2020-11-14T16:46:52Z,2020-11-14T16:46:52Z,OWNER,@tballison could I see the section of your Apache config that configures the proxying to `/datasette/`?,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",742011049,.json and .csv exports fail to apply base_url, https://github.com/simonw/datasette/issues/1091#issuecomment-726801731,https://api.github.com/repos/simonw/datasette/issues/1091,726801731,MDEyOklzc3VlQ29tbWVudDcyNjgwMTczMQ==,6739646,tballison,2020-11-13T14:40:56Z,2020-11-13T14:40:56Z,NONE,"My headers aren't clickable/sortable with custom sql, but I think that's by design. In the default view, https://corpora.tika.apache.org/datasette/file_profiles/file_profiles, ah, y, now I see that the headers should be sortable, but you're right the base_url is not applied. base_url works with ""View and Edit SQL"" and with ""(advanced)"" As you point out, does not work with the export csv, json, other or with the ""Next page"" navigational button at the bottom.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",742011049,.json and .csv exports fail to apply base_url, https://github.com/simonw/datasette/issues/1091#issuecomment-726798745,https://api.github.com/repos/simonw/datasette/issues/1091,726798745,MDEyOklzc3VlQ29tbWVudDcyNjc5ODc0NQ==,6739646,tballison,2020-11-13T14:35:22Z,2020-11-13T14:35:22Z,NONE,"I'm starting this with docker like so: `docker run --name datasette -d -p 8001:8001 -v `pwd`:/mnt datasetteproject/datasette datasette -p 8001 -h 0.0.0.0 /mnt/file_profiles.db --config sql_time_limit_ms:120000 --config max_returned_rows:100000 --config base_url:/datasette/ --config cache_size_kb:50000` I'm not doing any templating or anything else custom. Apropos of nothing, I swapped out a simpler db, so this query should now work: https://corpora.tika.apache.org/datasette/file_profiles?sql=select%0D%0A++*%0D%0Afrom%0D%0A++file_profiles+fp%0D%0Alimit%0D%0A++10","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",742011049,.json and .csv exports fail to apply base_url, https://github.com/simonw/datasette/issues/268#issuecomment-726419027,https://api.github.com/repos/simonw/datasette/issues/268,726419027,MDEyOklzc3VlQ29tbWVudDcyNjQxOTAyNw==,9599,simonw,2020-11-13T00:09:04Z,2020-11-13T00:09:04Z,OWNER,Part of the challenge here is that this is the first time the `TableView` will have had a complete rewrite of the SQL it is going to execute. 
That SQL is currently constructed here: https://github.com/simonw/datasette/blob/5eb8e9bf250b26e30b017d39a392c33973997656/datasette/views/table.py#L628-L636,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323718842,Mechanism for ranking results from SQLite full-text search, https://github.com/simonw/datasette/issues/749#issuecomment-726417847,https://api.github.com/repos/simonw/datasette/issues/749,726417847,MDEyOklzc3VlQ29tbWVudDcyNjQxNzg0Nw==,9599,simonw,2020-11-13T00:05:14Z,2020-11-13T00:05:14Z,OWNER,"https://cloud.google.com/blog/products/serverless/cloud-run-now-supports-http-grpc-server-streaming indicates this limit should no longer apply: > With this addition, Cloud Run can now ... Send responses larger than the previous 32 MB limit But I'm still getting errors from Cloud Run attempting to download `.db` files larger than 32 MB. I filed a question in their issue tracker about that here: https://issuetracker.google.com/issues/173038375","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",610829227,Cloud Run fails to serve database files larger than 32MB, https://github.com/simonw/datasette/issues/1091#issuecomment-726416330,https://api.github.com/repos/simonw/datasette/issues/1091,726416330,MDEyOklzc3VlQ29tbWVudDcyNjQxNjMzMA==,9599,simonw,2020-11-13T00:00:43Z,2020-11-13T00:00:43Z,OWNER,Here's where `url_csv` comes from: https://github.com/simonw/datasette/blob/11eb1e026f3d84cb771f8d6e204939cbaee130cd/datasette/views/base.py#L542-L545,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",742011049,.json and .csv exports fail to apply base_url, https://github.com/simonw/datasette/issues/1091#issuecomment-726415991,https://api.github.com/repos/simonw/datasette/issues/1091,726415991,MDEyOklzc3VlQ29tbWVudDcyNjQxNTk5MQ==,9599,simonw,2020-11-12T23:59:34Z,2020-11-12T23:59:34Z,OWNER,"The sort headers are generated by this template code: https://github.com/simonw/datasette/blob/5eb8e9bf250b26e30b017d39a392c33973997656/datasette/templates/_table.html#L11-L15 The export links use this code: https://github.com/simonw/datasette/blob/5eb8e9bf250b26e30b017d39a392c33973997656/datasette/templates/table.html#L134 https://github.com/simonw/datasette/blob/5eb8e9bf250b26e30b017d39a392c33973997656/datasette/templates/table.html#L180-L201","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",742011049,.json and .csv exports fail to apply base_url, https://github.com/simonw/datasette/issues/1091#issuecomment-726415019,https://api.github.com/repos/simonw/datasette/issues/1091,726415019,MDEyOklzc3VlQ29tbWVudDcyNjQxNTAxOQ==,9599,simonw,2020-11-12T23:56:23Z,2020-11-12T23:56:23Z,OWNER,@tballison is there any chance you're running any custom templates in that installation? I'm really confused as to why I can't replicate the bug.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",742011049,.json and .csv exports fail to apply base_url, https://github.com/simonw/datasette/issues/1091#issuecomment-726413829,https://api.github.com/repos/simonw/datasette/issues/1091,726413829,MDEyOklzc3VlQ29tbWVudDcyNjQxMzgyOQ==,9599,simonw,2020-11-12T23:52:50Z,2020-11-12T23:54:16Z,OWNER,"Hmm... 
it's not just the `.csv` and `.json` export links - it's the column headings (which can be clicked to change the sort order) as well. Here's an extract of the HTML from that page: ```html <p class=""export-links"">This data as <a href=""/corpora-metadata/REF_PARSE_EXCEPTION_TYPES.json"">json</a>, <a href=""/corpora-metadata/REF_PARSE_EXCEPTION_TYPES.csv?_size=max"">CSV</a> ( <a href=""#export"">advanced</a>) </p> <div class=""table-wrapper""> <table class=""rows-and-columns""> <thead> <tr> <th class=""col-Link"" scope=""col"" data-column=""Link"" data-column-type="""" data-column-not-null=""0"" data-is-pk=""0""> Link </th> <th class=""col-rowid"" scope=""col"" data-column=""rowid"" data-column-type=""integer"" data-column-not-null=""0"" data-is-pk=""1""> <a href=""/corpora-metadata/REF_PARSE_EXCEPTION_TYPES?_sort_desc=rowid"" rel=""nofollow"">rowid ▼</a> </th> <th class=""col-PARSE_EXCEPTION_ID"" scope=""col"" data-column=""PARSE_EXCEPTION_ID"" data-column-type=""INTEGER"" data-column-not-null=""0"" data-is-pk=""0""> <a href=""/corpora-metadata/REF_PARSE_EXCEPTION_TYPES?_sort=PARSE_EXCEPTION_ID"" rel=""nofollow"">PARSE_EXCEPTION_ID</a> </th> <th class=""col-PARSE_EXCEPTION_DESCRIPTION"" scope=""col"" data-column=""PARSE_EXCEPTION_DESCRIPTION"" data-column-type=""VARCHAR(128)"" data-column-not-null=""0"" data-is-pk=""0""> <a href=""/corpora-metadata/REF_PARSE_EXCEPTION_TYPES?_sort=PARSE_EXCEPTION_DESCRIPTION"" rel=""nofollow"">PARSE_EXCEPTION_DESCRIPTION</a> </th> </tr> </thead> <tbody> <tr> <td class=""col-Link type-pk""> <a href=""/datasette/corpora-metadata/REF_PARSE_EXCEPTION_TYPES/1"">1</a> </td> <td class=""col-rowid type-int"">1</td> <td class=""col-PARSE_EXCEPTION_ID type-int"">0</td> <td class=""col-PARSE_EXCEPTION_DESCRIPTION type-str"">RUNTIME</td> </tr> <tr> <td class=""col-Link type-pk""> <a href=""/datasette/corpora-metadata/REF_PARSE_EXCEPTION_TYPES/2"">2</a> </td> <td class=""col-rowid type-int"">2</td> <td class=""col-PARSE_EXCEPTION_ID type-int"">1</td> <td class=""col-PARSE_EXCEPTION_DESCRIPTION type-str"">ENCRYPTION</td> </tr> <tr> <td class=""col-Link type-pk""> <a href=""/datasette/corpora-metadata/REF_PARSE_EXCEPTION_TYPES/3"">3</a> </td> <td class=""col-rowid type-int"">3</td> <td class=""col-PARSE_EXCEPTION_ID type-int"">2</td> <td class=""col-PARSE_EXCEPTION_DESCRIPTION type-str"">ACCESS_PERMISSION</td> </tr> <tr> <td class=""col-Link type-pk""> <a href=""/datasette/corpora-metadata/REF_PARSE_EXCEPTION_TYPES/4"">4</a> </td> <td class=""col-rowid type-int"">4</td> <td class=""col-PARSE_EXCEPTION_ID type-int"">3</td> <td class=""col-PARSE_EXCEPTION_DESCRIPTION type-str"">UNSUPPORTED_VERSION</td> </tr> </tbody> </table> </div> <div id=""export"" class=""advanced-export""> <h3>Advanced export</h3> <p>JSON shape: <a href=""/corpora-metadata/REF_PARSE_EXCEPTION_TYPES.json"">default</a>, <a href=""/corpora-metadata/REF_PARSE_EXCEPTION_TYPES.json?_shape=array"">array</a>, <a href=""/corpora-metadata/REF_PARSE_EXCEPTION_TYPES.json?_shape=array&_nl=on"">newline-delimited</a> </p> <form action=""/corpora-metadata/REF_PARSE_EXCEPTION_TYPES.csv"" method=""get""> <p> CSV options: <label> <input type=""checkbox"" name=""_dl""> download file </label> <input type=""submit"" value=""Export CSV""> <input type=""hidden"" name=""_size"" value=""max""> </p> </form> </div> ``` But here's something _really_ weird - the links to the individual rows DO include the `/datasette/` prefix: ```html <td class=""col-Link type-pk""> <a 
href=""/datasette/corpora-metadata/REF_PARSE_EXCEPTION_TYPES/2"">2</a> </td> ``` The navigation bar on that page is correct too: ```html <p class=""crumbs""> <a href=""/datasette/"">home</a> / <a href=""/datasette/corpora-metadata"">corpora-metadata</a> </p> ``` I've also been unable to replicate this in my own local environment, running `datasette fixtures.db --config base_url:/datasette/`.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",742011049,.json and .csv exports fail to apply base_url, https://github.com/simonw/datasette/issues/865#issuecomment-726412057,https://api.github.com/repos/simonw/datasette/issues/865,726412057,MDEyOklzc3VlQ29tbWVudDcyNjQxMjA1Nw==,9599,simonw,2020-11-12T23:49:23Z,2020-11-12T23:49:23Z,OWNER,"@tballison thanks, I've split that out into a new issue #1091","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",644582921,"base_url doesn't seem to work when adding criteria and clicking ""apply""", https://github.com/simonw/datasette/issues/865#issuecomment-726385782,https://api.github.com/repos/simonw/datasette/issues/865,726385782,MDEyOklzc3VlQ29tbWVudDcyNjM4NTc4Mg==,6739646,tballison,2020-11-12T22:41:06Z,2020-11-12T22:41:06Z,NONE,"The same is true if I select advanced export and hit the 'export csv' at the bottom of the page. ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",644582921,"base_url doesn't seem to work when adding criteria and clicking ""apply""", https://github.com/simonw/datasette/issues/865#issuecomment-726385422,https://api.github.com/repos/simonw/datasette/issues/865,726385422,MDEyOklzc3VlQ29tbWVudDcyNjM4NTQyMg==,6739646,tballison,2020-11-12T22:40:14Z,2020-11-12T22:40:14Z,NONE,"Just tested with the latest Docker image, and it works pretty much everywhere! THANK YOU! I did notice that if I try to export json or csv, the base is not applied. Not sure if I should reopen this issue or open a new one. To see this, go here: https://corpora.tika.apache.org/datasette/corpora-metadata/REF_PARSE_EXCEPTION_TYPES Click/hover over json or CSV and you'll see that the 'datasette' base is not included. Again, many thanks!","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",644582921,"base_url doesn't seem to work when adding criteria and clicking ""apply""", https://github.com/simonw/datasette/issues/1089#issuecomment-726127465,https://api.github.com/repos/simonw/datasette/issues/1089,726127465,MDEyOklzc3VlQ29tbWVudDcyNjEyNzQ2NQ==,9599,simonw,2020-11-12T14:54:11Z,2020-11-12T14:54:11Z,OWNER,"Suggested list to look out for from that PR: - simply/simple - easy/easier/easiest - obvious/obviously - just - merely - straightforward - ridiculous","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",741665726,Sweep documentation for words that minimize involved difficulty, https://github.com/simonw/datasette/issues/1088#issuecomment-725830716,https://api.github.com/repos/simonw/datasette/issues/1088,725830716,MDEyOklzc3VlQ29tbWVudDcyNTgzMDcxNg==,9599,simonw,2020-11-12T04:35:38Z,2020-11-12T04:35:38Z,OWNER,"I'm going to fix this without a test, because writing a test for this is a bit fiddly and it's a very minor bug. 
If it comes back again I'll do the work to test for it.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",741268956,OperationalError('interrupted') can 500 on row page, https://github.com/simonw/datasette/issues/1088#issuecomment-725830533,https://api.github.com/repos/simonw/datasette/issues/1088,725830533,MDEyOklzc3VlQ29tbWVudDcyNTgzMDUzMw==,9599,simonw,2020-11-12T04:35:08Z,2020-11-12T04:35:08Z,OWNER,"Yup, swapping `QueryInterrupted` fixed this against my giant database.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",741268956,OperationalError('interrupted') can 500 on row page, https://github.com/simonw/datasette/issues/1088#issuecomment-725829903,https://api.github.com/repos/simonw/datasette/issues/1088,725829903,MDEyOklzc3VlQ29tbWVudDcyNTgyOTkwMw==,9599,simonw,2020-11-12T04:33:14Z,2020-11-12T04:33:14Z,OWNER,"I'm suspicious of this code: https://github.com/simonw/datasette/blob/e8e0a6f284ca953b2980186c4356594c07bd1929/datasette/views/table.py#L1032-L1045 This code uses a different exception: https://github.com/simonw/datasette/blob/e8e0a6f284ca953b2980186c4356594c07bd1929/datasette/views/table.py#L658-L663 ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",741268956,OperationalError('interrupted') can 500 on row page, https://github.com/simonw/datasette/pull/1085#issuecomment-725731685,https://api.github.com/repos/simonw/datasette/issues/1085,725731685,MDEyOklzc3VlQ29tbWVudDcyNTczMTY4NQ==,22429695,codecov[bot],2020-11-12T00:01:18Z,2020-11-12T00:01:18Z,NONE,"# [Codecov](https://codecov.io/gh/simonw/datasette/pull/1085?src=pr&el=h1) Report > Merging [#1085](https://codecov.io/gh/simonw/datasette/pull/1085?src=pr&el=desc) (51e7651) into [main](https://codecov.io/gh/simonw/datasette/commit/2a981e2ac1d13125973904b777d00ea75e8df4e6?el=desc) (2a981e2) will **not change** coverage. > The diff coverage is `n/a`. [](https://codecov.io/gh/simonw/datasette/pull/1085?src=pr&el=tree) ```diff @@ Coverage Diff @@ ## main #1085 +/- ## ======================================= Coverage 91.38% 91.38% ======================================= Files 30 30 Lines 3785 3785 ======================================= Hits 3459 3459 Misses 326 326 ``` ------ [Continue to review full report at Codecov](https://codecov.io/gh/simonw/datasette/pull/1085?src=pr&el=continue). > **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta) > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data` > Powered by [Codecov](https://codecov.io/gh/simonw/datasette/pull/1085?src=pr&el=footer). Last update [2a981e2...51e7651](https://codecov.io/gh/simonw/datasette/pull/1085?src=pr&el=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments). 
","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",740512882,Use FTS4 in fixtures, https://github.com/simonw/datasette/issues/1086#issuecomment-725729857,https://api.github.com/repos/simonw/datasette/issues/1086,725729857,MDEyOklzc3VlQ29tbWVudDcyNTcyOTg1Nw==,9599,simonw,2020-11-11T23:55:39Z,2020-11-11T23:55:39Z,OWNER,"Demo: https://latest.datasette.io/fixtures/foreign_key_references?_facet=foreign_key_with_blank_label <img width=""772"" alt=""fixtures__foreign_key_references__2_rows"" src=""https://user-images.githubusercontent.com/9599/98877868-53575b80-2436-11eb-8669-f24eeb8d0f3c.png""> ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",741021342,Foreign keys with blank titles result in non-clickable links, https://github.com/simonw/datasette/issues/1086#issuecomment-725714908,https://api.github.com/repos/simonw/datasette/issues/1086,725714908,MDEyOklzc3VlQ29tbWVudDcyNTcxNDkwOA==,9599,simonw,2020-11-11T23:17:26Z,2020-11-11T23:17:26Z,OWNER,I'm just going to use a regular coloured hyphen.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",741021342,Foreign keys with blank titles result in non-clickable links, https://github.com/simonw/datasette/issues/1086#issuecomment-725622784,https://api.github.com/repos/simonw/datasette/issues/1086,725622784,MDEyOklzc3VlQ29tbWVudDcyNTYyMjc4NA==,9599,simonw,2020-11-11T19:41:45Z,2020-11-11T19:41:45Z,OWNER,Maybe use a grey hyphen here?,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",741021342,Foreign keys with blank titles result in non-clickable links, https://github.com/simonw/sqlite-utils/issues/168#issuecomment-723829147,https://api.github.com/repos/simonw/sqlite-utils/issues/168,723829147,MDEyOklzc3VlQ29tbWVudDcyMzgyOTE0Nw==,9599,simonw,2020-11-09T07:43:30Z,2020-11-09T07:43:30Z,OWNER,Yeah whatever process the have in place is working great without any extra intervention: they upgraded to 3.0 four hours ago.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",706167456,Automate (as much as possible) updates published to Homebrew, https://github.com/simonw/datasette/issues/268#issuecomment-723740546,https://api.github.com/repos/simonw/datasette/issues/268,723740546,MDEyOklzc3VlQ29tbWVudDcyMzc0MDU0Ng==,9599,simonw,2020-11-09T04:01:50Z,2020-11-09T04:01:50Z,OWNER,I should depend on `sqlite-fts4` - I'm doing that in `sqlite-utils` now and it works great: https://github.com/simonw/sqlite-utils/issues/198,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323718842,Mechanism for ranking results from SQLite full-text search, https://github.com/simonw/sqlite-utils/issues/192#issuecomment-723637930,https://api.github.com/repos/simonw/sqlite-utils/issues/192,723637930,MDEyOklzc3VlQ29tbWVudDcyMzYzNzkzMA==,9599,simonw,2020-11-08T17:06:56Z,2020-11-08T17:06:56Z,OWNER,"This looks pretty good now! 
``` % sqlite-utils search 24ways.db articles simon -c title -c author -t title author ----------------------------------------------------------------------------- ------------------ Don't be eval() Simon Willison DOM Scripting Your Way to Better Blockquotes Jeremy Keith Swooshy Curly Quotes Without Images Simon Collison The Articulate Web Designer of Tomorrow Simon Collison Writing Responsible JavaScript Drew McLellan Going Nuts with CSS Transitions Natalie Downe Managing a Mind Christopher Murphy Design Systems Laura Kalbag Bringing Your Code to the Streets Ruth John Taming Complexity Simon Collison Unobtrusively Mapping Microformats with jQuery Simon Willison Crafting the Front-end Ben Bodien Develop Your Naturalist Superpowers with Observable Notebooks and iNaturalist Natalie Downe Fast Autocomplete Search for Your Website Simon Willison ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",735532751,sqlite-utils search command, https://github.com/simonw/sqlite-utils/issues/201#issuecomment-723369033,https://api.github.com/repos/simonw/sqlite-utils/issues/201,723369033,MDEyOklzc3VlQ29tbWVudDcyMzM2OTAzMw==,9599,simonw,2020-11-07T01:28:11Z,2020-11-07T01:28:11Z,OWNER,Need to fix this to close #192 and #197.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",738128913,.search(columns=) and sqlite-utils search -c ... bug, https://github.com/simonw/sqlite-utils/issues/194#issuecomment-723368528,https://api.github.com/repos/simonw/sqlite-utils/issues/194,723368528,MDEyOklzc3VlQ29tbWVudDcyMzM2ODUyOA==,9599,simonw,2020-11-07T01:24:55Z,2020-11-07T01:24:55Z,OWNER,Here's the alpha release: https://github.com/simonw/sqlite-utils/releases/tag/3.0a0,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",735650864,3.0 release with some minor breaking changes, https://github.com/simonw/sqlite-utils/issues/197#issuecomment-723365651,https://api.github.com/repos/simonw/sqlite-utils/issues/197,723365651,MDEyOklzc3VlQ29tbWVudDcyMzM2NTY1MQ==,9599,simonw,2020-11-07T01:06:32Z,2020-11-07T01:06:32Z,OWNER,Documentation: https://sqlite-utils.readthedocs.io/en/latest/python-api.html#searching-with-table-search,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",737153927,Rethink how table.search() method works, https://github.com/simonw/sqlite-utils/issues/197#issuecomment-723360842,https://api.github.com/repos/simonw/sqlite-utils/issues/197,723360842,MDEyOklzc3VlQ29tbWVudDcyMzM2MDg0Mg==,9599,simonw,2020-11-07T00:40:55Z,2020-11-07T00:40:55Z,OWNER,"The `order=` parameter should be called `order_by` for consistency with this: ```python for row in db[""dogs""].rows_where(""age > 1"", order_by=""age""): ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",737153927,Rethink how table.search() method works, https://github.com/simonw/sqlite-utils/issues/200#issuecomment-723357855,https://api.github.com/repos/simonw/sqlite-utils/issues/200,723357855,MDEyOklzc3VlQ29tbWVudDcyMzM1Nzg1NQ==,9599,simonw,2020-11-07T00:24:37Z,2020-11-07T00:24:37Z,OWNER,"``` (sqlite-utils) sqlite-utils % sqlite-utils rows 24ways-fts4.db articles -c title -c author | head -n 3 [{""title"": ""Why Bother with Accessibility?"", ""author"": ""Laura 
Kalbag""}, {""title"": ""Levelling Up"", ""author"": ""Ashley Baxter""}, {""title"": ""Project Hubs: A Home Base for Design Projects"", ""author"": ""Brad Frost""}, ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",738115165,sqlite-utils rows -c option, https://github.com/simonw/sqlite-utils/issues/192#issuecomment-723357117,https://api.github.com/repos/simonw/sqlite-utils/issues/192,723357117,MDEyOklzc3VlQ29tbWVudDcyMzM1NzExNw==,9599,simonw,2020-11-07T00:21:05Z,2020-11-07T00:21:05Z,OWNER,"I'm going to have it only output the exact `-c` columns you requested (if you requested any). Add `--rank` to specify the rank column, since you may not know what its name is.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",735532751,sqlite-utils search command, https://github.com/simonw/sqlite-utils/issues/194#issuecomment-723356020,https://api.github.com/repos/simonw/sqlite-utils/issues/194,723356020,MDEyOklzc3VlQ29tbWVudDcyMzM1NjAyMA==,9599,simonw,2020-11-07T00:16:06Z,2020-11-07T00:16:06Z,OWNER,"I'm going to release this as an alpha first and sit on it for a few days, since I don't want to ship any mistakes that would result in having to bump straight to 4.0!","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",735650864,3.0 release with some minor breaking changes, https://github.com/simonw/sqlite-utils/issues/91#issuecomment-723350956,https://api.github.com/repos/simonw/sqlite-utils/issues/91,723350956,MDEyOklzc3VlQ29tbWVudDcyMzM1MDk1Ng==,9599,simonw,2020-11-06T23:53:25Z,2020-11-06T23:53:25Z,OWNER,"This is now possible, for both FTS4 and FTS5 - see #197.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",577302229,Enable ordering FTS results by rank, https://github.com/simonw/sqlite-utils/issues/192#issuecomment-723348722,https://api.github.com/repos/simonw/sqlite-utils/issues/192,723348722,MDEyOklzc3VlQ29tbWVudDcyMzM0ODcyMg==,9599,simonw,2020-11-06T23:43:09Z,2020-11-06T23:43:09Z,OWNER,"Also that order looks incorrect. 
It looks like most relevant came back last, not first.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",735532751,sqlite-utils search command, https://github.com/simonw/sqlite-utils/issues/192#issuecomment-723348614,https://api.github.com/repos/simonw/sqlite-utils/issues/192,723348614,MDEyOklzc3VlQ29tbWVudDcyMzM0ODYxNA==,9599,simonw,2020-11-06T23:42:38Z,2020-11-06T23:42:38Z,OWNER,"This is a bit surprising: ``` (sqlite-utils) sqlite-utils % sqlite-utils search 24ways-fts4.db articles maps -c title [{""rowid"": 41, ""title"": ""What Is Vagrant and Why Should I Care?"", ""rank"": -1.9252039178908076}, {""rowid"": 298, ""title"": ""First Steps in VR"", ""rank"": -1.9945466378736434}, {""rowid"": 43, ""title"": ""Content Production Planning"", ""rank"": -2.1928058363046143}, {""rowid"": 100, ""title"": ""Moo'y Christmas"", ""rank"": -2.2698482999851675}, {""rowid"": 91, ""title"": ""Infinite Canvas: Moving Beyond the Page"", ""rank"": -2.290928999035195}, {""rowid"": 175, ""title"": ""Front-End Code Reusability with CSS and JavaScript"", ""rank"": -2.498731782924352}, {""rowid"": 209, ""title"": ""Feeding the Audio Graph"", ""rank"": -2.619968930100356}, {""rowid"": 296, ""title"": ""Animation in Design Systems"", ""rank"": -2.62060151817201}, {""rowid"": 118, ""title"": ""Ghosts On The Internet"", ""rank"": -2.7224894534521087}, {""rowid"": 77, ""title"": ""Colour Accessibility"", ""rank"": -2.7389782859427343}, {""rowid"": 245, ""title"": ""Web Content Accessibility Guidelines 2.1\u2014for People Who Haven\u2019t Read the Update"", ""rank"": -2.9750992611162888}, {""rowid"": 56, ""title"": ""Helping VIPs Care About Performance"", ""rank"": -3.0819662908932535}, {""rowid"": 109, ""title"": ""Geotag Everywhere with Fire Eagle"", ""rank"": -3.1371975973877277}, {""rowid"": 203, ""title"": ""Jobs-to-Be-Done in Your UX Toolbox"", ""rank"": -3.2416719461682733}, {""rowid"": 276, ""title"": ""Your jQuery: Now With 67% Less Suck"", ""rank"": -3.4947916564653028}, {""rowid"": 58, ""title"": ""Beyond the Style Guide"", ""rank"": -3.7508321464447905}, {""rowid"": 225, ""title"": ""Good Ideas Grow on Paper"", ""rank"": -4.120077674716844}, {""rowid"": 168, ""title"": ""Unobtrusively Mapping Microformats with jQuery"", ""rank"": -4.662224207228984}, {""rowid"": 27, ""title"": ""Putting Design on the Map"", ""rank"": -5.667327088267961}, {""rowid"": 220, ""title"": ""Finding Your Way with Static Maps"", ""rank"": -9.952534352591737}] ``` I requested just `-c title` but also got back `rowid` and `rank`.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",735532751,sqlite-utils search command, https://github.com/simonw/sqlite-utils/issues/194#issuecomment-723234493,https://api.github.com/repos/simonw/sqlite-utils/issues/194,723234493,MDEyOklzc3VlQ29tbWVudDcyMzIzNDQ5Mw==,9599,simonw,2020-11-06T18:32:34Z,2020-11-06T18:32:34Z,OWNER,"The breaking changes will be: - `table.search()` now returns a generator that produces dictionaries, similar to `table.rows_where()` - The `-c` shortcut no longer works, use `--csv` instead. 
- The `-f` shortcut no longer works, use `--fmt` instead.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",735650864,3.0 release with some minor breaking changes, https://github.com/simonw/sqlite-utils/issues/197#issuecomment-723230732,https://api.github.com/repos/simonw/sqlite-utils/issues/197,723230732,MDEyOklzc3VlQ29tbWVudDcyMzIzMDczMg==,9599,simonw,2020-11-06T18:24:29Z,2020-11-06T18:24:29Z,OWNER,Still need to update docs.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",737153927,Rethink how table.search() method works, https://github.com/simonw/sqlite-utils/pull/195#issuecomment-723148906,https://api.github.com/repos/simonw/sqlite-utils/issues/195,723148906,MDEyOklzc3VlQ29tbWVudDcyMzE0ODkwNg==,9599,simonw,2020-11-06T15:43:51Z,2020-11-06T15:43:51Z,OWNER,Thanks to #198 (introducing a `rank_bm25()` custom function for FTS4) this feature will be able to offer relevance search for both FTS5 AND FTS4 tables.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",735663855,table.search() improvements plus sqlite-utils search command, https://github.com/simonw/sqlite-utils/issues/197#issuecomment-723148310,https://api.github.com/repos/simonw/sqlite-utils/issues/197,723148310,MDEyOklzc3VlQ29tbWVudDcyMzE0ODMxMA==,9599,simonw,2020-11-06T15:42:43Z,2020-11-06T15:42:43Z,OWNER,Having `.search()` return tuples when `.rows_where()` returns dictionaries just feels like bad API design to me - it's inconsistent. ,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",737153927,Rethink how table.search() method works, https://github.com/simonw/sqlite-utils/issues/199#issuecomment-723147463,https://api.github.com/repos/simonw/sqlite-utils/issues/199,723147463,MDEyOklzc3VlQ29tbWVudDcyMzE0NzQ2Mw==,9599,simonw,2020-11-06T15:41:00Z,2020-11-06T15:41:00Z,OWNER,"Something like this: ``` @db.register_function(replace=True) def my_function(a): return a.upper() ``` If `replace=True` then this function will be registered even if a `my_function` of arity 1 has already been registered previously. It defaults to `False` though which means the Database object tracks what functions and arities have been registered in the past and silently ignores any new attempts to register the same name/arity. ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",737855731,"@db.register_function(..., replace=False) to avoid double-registering custom functions", https://github.com/simonw/sqlite-utils/issues/198#issuecomment-723145383,https://api.github.com/repos/simonw/sqlite-utils/issues/198,723145383,MDEyOklzc3VlQ29tbWVudDcyMzE0NTM4Mw==,9599,simonw,2020-11-06T15:36:47Z,2020-11-06T15:36:47Z,OWNER,"Should I register the custom `rank_bm25` SQLite function for every connection, or should I register it against the connection just the first time the user attempts an FTS4 search? 
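The lazy version could look something like this (a rough sketch only - `_rank_bm25_registered` is a made-up flag name, and it assumes the `rank_bm25()` function from the `sqlite-fts4` package takes a single matchinfo argument):

```python
from sqlite_fts4 import rank_bm25

def ensure_rank_bm25(db):
    # Register the custom function lazily, the first time an FTS4
    # search actually needs it, rather than on every connection.
    if getattr(db, '_rank_bm25_registered', False):
        return
    db.conn.create_function('rank_bm25', 1, rank_bm25)
    db._rank_bm25_registered = True
```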
I think I'd rather register it only if it is needed.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",737476423,Support order by relevance against FTS4, https://github.com/simonw/sqlite-utils/issues/198#issuecomment-723144893,https://api.github.com/repos/simonw/sqlite-utils/issues/198,723144893,MDEyOklzc3VlQ29tbWVudDcyMzE0NDg5Mw==,9599,simonw,2020-11-06T15:35:45Z,2020-11-06T15:35:45Z,OWNER,"Here's a demo of that rank query. I had to sort by `rank` instead of `rank desc` - need to double-check that: https://datasette-sqlite-fts4.datasette.io/24ways-fts4?sql=with+original+as+(%0D%0A++++select%0D%0A++++++++rowid%2C%0D%0A++++++++*%0D%0A++++from+[articles]%0D%0A)%0D%0Aselect%0D%0A++++original.*%2C%0D%0A++++rank_bm25(matchinfo([articles_fts]%2C+%27pcnalx%27))+as+rank%0D%0Afrom%0D%0A++++[original]%0D%0A++++join+[articles_fts]+on+[original].rowid+%3D+[articles_fts].rowid%0D%0Awhere%0D%0A++++[articles_fts]+match+%3Aquery%0D%0Aorder+by%0D%0A++++rank%0D%0Alimit+20&query=jquery+maps","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",737476423,Support order by relevance against FTS4, https://github.com/simonw/sqlite-utils/issues/198#issuecomment-723143633,https://api.github.com/repos/simonw/sqlite-utils/issues/198,723143633,MDEyOklzc3VlQ29tbWVudDcyMzE0MzYzMw==,9599,simonw,2020-11-06T15:33:12Z,2020-11-06T15:33:12Z,OWNER,"Here's the FTS5 query: ```sql with original as ( select rowid, * from [global-power-plants] ) select original.*, [global-power-plants_fts].rank as rank from [original] join [global-power-plants_fts] on [original].rowid = [global-power-plants_fts].rowid where [global-power-plants_fts] match :query order by rank desc limit 20 ``` The equivalent using `rank_bm25()` for FTS4 would be: ```sql with original as ( select rowid, * from [global-power-plants] ) select original.*, rank_bm25(matchinfo([global-power-plants_fts], 'pcnalx')) as rank from [original] join [global-power-plants_fts] on [original].rowid = [global-power-plants_fts].rowid where [global-power-plants_fts] match :query order by rank desc limit 20 ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",737476423,Support order by relevance against FTS4, https://github.com/simonw/sqlite-utils/issues/198#issuecomment-722895825,https://api.github.com/repos/simonw/sqlite-utils/issues/198,722895825,MDEyOklzc3VlQ29tbWVudDcyMjg5NTgyNQ==,9599,simonw,2020-11-06T06:29:17Z,2020-11-06T06:29:17Z,OWNER,I released a 1.0 (and 1.0.1) version of https://github.com/simonw/sqlite-fts4,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",737476423,Support order by relevance against FTS4, https://github.com/simonw/sqlite-utils/issues/198#issuecomment-722852262,https://api.github.com/repos/simonw/sqlite-utils/issues/198,722852262,MDEyOklzc3VlQ29tbWVudDcyMjg1MjI2Mg==,9599,simonw,2020-11-06T05:41:58Z,2020-11-06T05:41:58Z,OWNER,"Example query (from the tests): ```sql select c0, c1, rank_bm25(matchinfo(search, 'pcnalx')) as bm25 from search where search match ? 
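-- (aside: 'pcnalx' is the matchinfo() format string that rank_bm25() expects -
--  'p', 'c' and 'x' are the standard phrase, column and hit-count fields,
--  while 'n', 'a' and 'l' add the row count, average column lengths and
--  per-row column lengths that the BM25 formula needs)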
```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",737476423,Support order by relevance against FTS4, https://github.com/simonw/sqlite-utils/issues/198#issuecomment-722849539,https://api.github.com/repos/simonw/sqlite-utils/issues/198,722849539,MDEyOklzc3VlQ29tbWVudDcyMjg0OTUzOQ==,9599,simonw,2020-11-06T05:39:17Z,2020-11-06T05:39:17Z,OWNER,I'd have to copy almost all of the code in https://github.com/simonw/sqlite-fts4/blob/master/sqlite_fts4/__init__.py so I think I will add it as a dependency instead.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",737476423,Support order by relevance against FTS4, https://github.com/simonw/sqlite-utils/issues/197#issuecomment-722545442,https://api.github.com/repos/simonw/sqlite-utils/issues/197,722545442,MDEyOklzc3VlQ29tbWVudDcyMjU0NTQ0Mg==,9599,simonw,2020-11-05T18:05:33Z,2020-11-05T18:05:33Z,OWNER,This is likely to result in a 3.0 release due to a backwards-incompatible change to the current `.search()` method - #194,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",737153927,Rethink how table.search() method works, https://github.com/simonw/sqlite-utils/pull/195#issuecomment-722542895,https://api.github.com/repos/simonw/sqlite-utils/issues/195,722542895,MDEyOklzc3VlQ29tbWVudDcyMjU0Mjg5NQ==,9599,simonw,2020-11-05T18:01:33Z,2020-11-05T18:01:33Z,OWNER,"Latest test failure: ``` 114 -> assert [(""racoons are biting trash pandas"", ""USA"", ""bar"")] == table.search( 115 ""bite"", order=""rowid"" 116 ) 117 118 119 def test_optimize_fts(fresh_db): (Pdb) table.search(""bite"") [(2, 'racoons are biting trash pandas', 'USA', 'bar', -9.641434262948206e-07)] ``` The problem here is that the `table.search()` method now behaves differently for FTS4 v.s. FTS5 tables. With FTS4 you get back just the table columns. With FTS5 you also get back the `rowid` as the first column and the `rank` score as the last column. This is weird. It also makes me question whether having `.search()` return a list of tuples is the right API design.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",735663855,table.search() improvements plus sqlite-utils search command, https://github.com/simonw/sqlite-utils/issues/196#issuecomment-722086105,https://api.github.com/repos/simonw/sqlite-utils/issues/196,722086105,MDEyOklzc3VlQ29tbWVudDcyMjA4NjEwNQ==,9599,simonw,2020-11-05T02:29:50Z,2020-11-05T03:39:58Z,OWNER,"The finished monster: ```python _virtual_table_using_re = re.compile(r"""""" ^ # Start of string \s*CREATE\s+VIRTUAL\s+TABLE\s+ # CREATE VIRTUAL TABLE ( '(?P<squoted_table>[^']*(?:''[^']*)*)' | # single quoted name ""(?P<dquoted_table>[^""]*(?:""""[^""]*)*)"" | # double quoted name `(?P<backtick_table>[^`]+)` | # `backtick` quoted name \[(?P<squarequoted_table>[^\]]+)\] | # [...] quoted name (?P<identifier> # SQLite non-quoted identifier [A-Za-z_\u0080-\uffff] # \u0080-\uffff = ""any character larger than u007f"" [A-Za-z_\u0080-\uffff0-9\$]* # zero-or-more alphanemuric or $ ) ) \s+(IF\s+NOT\s+EXISTS\s+)? # IF NOT EXISTS (optional) USING\s+(?P<using>\w+) # e.g. 
USING FTS5 """""", re.VERBOSE | re.IGNORECASE) ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",736520310,Introspect if table is FTS4 or FTS5, https://github.com/simonw/sqlite-utils/issues/196#issuecomment-722084593,https://api.github.com/repos/simonw/sqlite-utils/issues/196,722084593,MDEyOklzc3VlQ29tbWVudDcyMjA4NDU5Mw==,9599,simonw,2020-11-05T02:24:47Z,2020-11-05T02:24:47Z,OWNER,"And an identifier is ""ALPHABETIC character and continue with zero or more ALPHANUMERIC characters and/or ""$"" (u0024) characters"" So... [\u0041-\u005a\u0061-\u0071\u007f-\uffff\u005f][\u0041-\u005a\u0061-\u0071\u007f-\uffff\u005f\u0030-\u0039\u0024]+","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",736520310,Introspect if table is FTS4 or FTS5, https://github.com/simonw/sqlite-utils/issues/196#issuecomment-722084213,https://api.github.com/repos/simonw/sqlite-utils/issues/196,722084213,MDEyOklzc3VlQ29tbWVudDcyMjA4NDIxMw==,9599,simonw,2020-11-05T02:23:37Z,2020-11-05T02:23:37Z,OWNER,"So... ALPHABETIC: `[\u0041-\u005a\u0061-\u0071\u007f-\uffff\u005f]` NUMERIC: `[\u0030-\u0039]`","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",736520310,Introspect if table is FTS4 or FTS5, https://github.com/simonw/sqlite-utils/issues/196#issuecomment-722083527,https://api.github.com/repos/simonw/sqlite-utils/issues/196,722083527,MDEyOklzc3VlQ29tbWVudDcyMjA4MzUyNw==,9599,simonw,2020-11-05T02:21:26Z,2020-11-05T02:21:26Z,OWNER,I think that's `\u007F-\uFFFF` in regex range speak.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",736520310,Introspect if table is FTS4 or FTS5, https://github.com/simonw/sqlite-utils/issues/196#issuecomment-722082874,https://api.github.com/repos/simonw/sqlite-utils/issues/196,722082874,MDEyOklzc3VlQ29tbWVudDcyMjA4Mjg3NA==,9599,simonw,2020-11-05T02:19:18Z,2020-11-05T02:19:18Z,OWNER,"""any other character larger than u007f."" Need to figure that out!","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",736520310,Introspect if table is FTS4 or FTS5, https://github.com/simonw/sqlite-utils/issues/196#issuecomment-722082759,https://api.github.com/repos/simonw/sqlite-utils/issues/196,722082759,MDEyOklzc3VlQ29tbWVudDcyMjA4Mjc1OQ==,9599,simonw,2020-11-05T02:18:58Z,2020-11-05T02:18:58Z,OWNER,"More from that document, describing `ALPHANUMERIC`: > **ALPHABETIC** > > Any of the characters in the range u0041 through u005a (letters ""A"" through ""Z"") or in the range u0061 through u007a (letters ""a"" through ""z"") or the character u005f (""_"") or any other character larger than u007f. 
> > **NUMERIC** > > Any of the characters in the range u0030 through u0039 (digits ""0"" through ""9"") > > **ALPHANUMERIC** > > Any character which is either ALPHABETIC or NUMERIC","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",736520310,Introspect if table is FTS4 or FTS5, https://github.com/simonw/sqlite-utils/issues/196#issuecomment-722082497,https://api.github.com/repos/simonw/sqlite-utils/issues/196,722082497,MDEyOklzc3VlQ29tbWVudDcyMjA4MjQ5Nw==,9599,simonw,2020-11-05T02:18:08Z,2020-11-05T02:18:08Z,OWNER,"I'm missing the case where a table has no quotes around it at all - `create virtual table foo using fts5` So I need to know how to create a regex for a SQLite identifier. https://www.sqlite.org/draft/tokenreq.html seems to be the only available documentation for that. > > ### Identifier tokens > > Identifiers follow the usual rules with the exception that SQLite allows the dollar-sign symbol in the interior of an identifier. The dollar-sign is for compatibility with Microsoft SQL-Server and is not part of the SQL standard. > > > **H41130:** SQLite shall recognize as an ID token any sequence of characters that begins with an ALPHABETIC character and continue with zero or more ALPHANUMERIC characters and/or ""$"" (u0024) characters and which is not a keyword token. > > Identifiers can be arbitrary character strings within square brackets. This feature is also for compatibility with Microsoft SQL-Server and not a part of the SQL standard. > > > **H41140:** SQLite shall recognize as an ID token any sequence of non-zero characters that begins with ""["" (u005b) and continuing through the first ""]"" (u005d) character. > > The standard way of quoting SQL identifiers is to use double-quotes. > > > **H41150:** SQLite shall recognize as an ID token any sequence of characters that begins with a double-quote (u0022), is followed by zero or more non-zero characters and/or pairs of double-quotes (u0022) and terminates with a double-quote (u0022) that is not part of a pair. > > MySQL allows identifiers to be quoted using the grave accent character. SQLite supports this for interoperability. 
> > > **H41160:** SQLite shall recognize as an ID token any sequence of characters that begins with a grave accent (u0060), is followed by zero or more non-zero characters and/or pairs ofgrave accents (u0060) and terminates with a grave accent (u0022) that is not part of a pair.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",736520310,Introspect if table is FTS4 or FTS5, https://github.com/simonw/sqlite-utils/issues/196#issuecomment-722078361,https://api.github.com/repos/simonw/sqlite-utils/issues/196,722078361,MDEyOklzc3VlQ29tbWVudDcyMjA3ODM2MQ==,9599,simonw,2020-11-05T02:04:33Z,2020-11-05T02:04:33Z,OWNER,Next step: lots of unit tests.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",736520310,Introspect if table is FTS4 or FTS5, https://github.com/simonw/sqlite-utils/issues/196#issuecomment-722078286,https://api.github.com/repos/simonw/sqlite-utils/issues/196,722078286,MDEyOklzc3VlQ29tbWVudDcyMjA3ODI4Ng==,9599,simonw,2020-11-05T02:04:18Z,2020-11-05T02:04:18Z,OWNER,"I think this might be it: ```python create_virtual_table_re = re.compile(r"""""" \s*CREATE\s+VIRTUAL\s+TABLE\s+ # CREATE VIRTUAL TABLE ( '(?P<squoted_table>[^']*(?:''[^']*)*)' | # single quoted name ""(?P<dquoted_table>[^""]*(?:""""[^""]*)*)"" | # double quoted name `(?P<backtick_table>[^`]+)` | # `backtick` quoted name \[(?P<squarequoted_table>[^\]]+)\] # [...] quoted name ) \s+(IF\s+NOT\s+EXISTS\s+)? # IF NOT EXISTS (optional) USING\s+(?P<using>\w+) """""", re.VERBOSE | re.IGNORECASE) ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",736520310,Introspect if table is FTS4 or FTS5, https://github.com/simonw/sqlite-utils/issues/196#issuecomment-722070569,https://api.github.com/repos/simonw/sqlite-utils/issues/196,722070569,MDEyOklzc3VlQ29tbWVudDcyMjA3MDU2OQ==,9599,simonw,2020-11-05T01:38:40Z,2020-11-05T01:38:40Z,OWNER,"I'm going to try `re.VERBOSE` to see if I can make this readable with comments. https://docs.python.org/3/howto/regex.html ```python charref = re.compile(r"""""" &[#] # Start of a numeric entity reference ( 0[0-7]+ # Octal form | [0-9]+ # Decimal form | x[0-9a-fA-F]+ # Hexadecimal form ) ; # Trailing semicolon """""", re.VERBOSE) ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",736520310,Introspect if table is FTS4 or FTS5, https://github.com/simonw/sqlite-utils/issues/196#issuecomment-722064258,https://api.github.com/repos/simonw/sqlite-utils/issues/196,722064258,MDEyOklzc3VlQ29tbWVudDcyMjA2NDI1OA==,9599,simonw,2020-11-05T01:18:07Z,2020-11-05T01:21:31Z,OWNER,"``` In [8]: r = re.compile(r""""""'[^']*(?:''[^']*)*'"""""") In [9]: r.match(""'fo'o'"") Out[9]: <re.Match object; span=(0, 4), match=""'fo'""> In [10]: r.match(""'fo''o'"") Out[10]: <re.Match object; span=(0, 7), match=""'fo''o'""> ``` `'[^']*(?:''[^']*)*'` This matches a single quote, then 0+ not-single-quotes, then 0+ (either 0+ not-single quotes or a double single quote), then a single quote. 
Unrolling the loop technique described here: http://www.softec.lu/site/RegularExpressions/UnrollingTheLoop","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",736520310,Introspect if table is FTS4 or FTS5, https://github.com/simonw/sqlite-utils/issues/196#issuecomment-722062449,https://api.github.com/repos/simonw/sqlite-utils/issues/196,722062449,MDEyOklzc3VlQ29tbWVudDcyMjA2MjQ0OQ==,9599,simonw,2020-11-05T01:12:14Z,2020-11-05T01:12:14Z,OWNER,"Good news: I don't think I have to deal with `foo.tablename`, because that doesn't get reflected in the `sqlite_master` table: ``` sqlite> attach 'foo.db' as foo; sqlite> create table foo.`bant` (id int); sqlite> select * from foo.sqlite_master; table|bant|bant|2|CREATE TABLE `bant` (id int) ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",736520310,Introspect if table is FTS4 or FTS5, https://github.com/simonw/sqlite-utils/issues/196#issuecomment-722062082,https://api.github.com/repos/simonw/sqlite-utils/issues/196,722062082,MDEyOklzc3VlQ29tbWVudDcyMjA2MjA4Mg==,9599,simonw,2020-11-05T01:10:51Z,2020-11-05T01:10:51Z,OWNER,"I confirmed all three of these are valid syntax for creating tables: ``` ~ % sqlite3 tmp.db SQLite version 3.28.0 2019-04-15 14:49:49 Enter "".help"" for usage hints. sqlite> create table 'foo''and' (id int); sqlite> create table ""bar""""and"" (id int); sqlite> create table [baz] (id int); sqlite> create table `bant` (id int); sqlite> .schema CREATE TABLE IF NOT EXISTS 'foo''and' (id int); CREATE TABLE IF NOT EXISTS ""bar""""and"" (id int); CREATE TABLE [baz] (id int); CREATE TABLE `bant` (id int); sqlite> select * from sqlite_master; table|foo'and|foo'and|2|CREATE TABLE 'foo''and' (id int) table|bar""and|bar""and|3|CREATE TABLE ""bar""""and"" (id int) table|baz|baz|4|CREATE TABLE [baz] (id int) table|bant|bant|5|CREATE TABLE `bant` (id int) ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",736520310,Introspect if table is FTS4 or FTS5, https://github.com/simonw/sqlite-utils/issues/196#issuecomment-722058598,https://api.github.com/repos/simonw/sqlite-utils/issues/196,722058598,MDEyOklzc3VlQ29tbWVudDcyMjA1ODU5OA==,9599,simonw,2020-11-05T00:59:58Z,2020-11-05T00:59:58Z,OWNER,"That two-in-a-row thing works for `""` too: https://latest.datasette.io/fixtures?sql=select+%22foo%22%2C+%27bar%27%2C+%22foo%22%22and%22%2C+%27bar%27%27and%27 <img width=""501"" alt=""fixtures__select__foo____bar____foo__and____bar__and_"" src=""https://user-images.githubusercontent.com/9599/98184381-1da6f580-1ebf-11eb-9396-beb4586d69ca.png""> ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",736520310,Introspect if table is FTS4 or FTS5, https://github.com/simonw/sqlite-utils/issues/196#issuecomment-722057923,https://api.github.com/repos/simonw/sqlite-utils/issues/196,722057923,MDEyOklzc3VlQ29tbWVudDcyMjA1NzkyMw==,9599,simonw,2020-11-05T00:57:22Z,2020-11-05T00:57:22Z,OWNER,"Then https://sqlite.org/lang_expr.html#literal_values_constants_ says: > A string constant is formed by enclosing the string in single quotes ('). A single quote within the string can be encoded by putting two single quotes in a row - as in Pascal. C-style escapes using the backslash character are not supported because they are not standard SQL. 
","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",736520310,Introspect if table is FTS4 or FTS5, https://github.com/simonw/sqlite-utils/issues/196#issuecomment-722057392,https://api.github.com/repos/simonw/sqlite-utils/issues/196,722057392,MDEyOklzc3VlQ29tbWVudDcyMjA1NzM5Mg==,9599,simonw,2020-11-05T00:55:31Z,2020-11-05T00:55:51Z,OWNER,"https://sqlite.org/lang_keywords.html says: > There are four ways of quoting keywords in SQLite: > > **'keyword'** A keyword in single quotes is a string literal. > **""keyword""** A keyword in double-quotes is an identifier. > **[keyword]** A keyword enclosed in square brackets is an identifier. This is not standard SQL. This quoting mechanism is used by MS Access and SQL Server and is included in SQLite for compatibility. > **\`keyword\`** A keyword enclosed in grave accents (ASCII code 96) is an identifier. This is not standard SQL. This quoting mechanism is used by MySQL and is included in SQLite for compatibility.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",736520310,Introspect if table is FTS4 or FTS5, https://github.com/simonw/sqlite-utils/issues/196#issuecomment-722056576,https://api.github.com/repos/simonw/sqlite-utils/issues/196,722056576,MDEyOklzc3VlQ29tbWVudDcyMjA1NjU3Ng==,9599,simonw,2020-11-05T00:52:42Z,2020-11-05T00:52:42Z,OWNER,"I could use a parsing library like https://parsy.readthedocs.io/en/latest/tutorial.html for this - or `pyparsing` which has a SQLite example here: https://github.com/pyparsing/pyparsing/blob/master/examples/select_parser.py I'd rather not add a new dependency for this though so I'm going to see if I can get something that's good-enough just using a regular expression.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",736520310,Introspect if table is FTS4 or FTS5, https://github.com/simonw/sqlite-utils/issues/196#issuecomment-722055291,https://api.github.com/repos/simonw/sqlite-utils/issues/196,722055291,MDEyOklzc3VlQ29tbWVudDcyMjA1NTI5MQ==,9599,simonw,2020-11-05T00:48:10Z,2020-11-05T00:48:10Z,OWNER,This is blocking landing `.search()` in #195,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",736520310,Introspect if table is FTS4 or FTS5, https://github.com/simonw/sqlite-utils/issues/196#issuecomment-722055104,https://api.github.com/repos/simonw/sqlite-utils/issues/196,722055104,MDEyOklzc3VlQ29tbWVudDcyMjA1NTEwNA==,9599,simonw,2020-11-05T00:47:34Z,2020-11-05T00:47:34Z,OWNER,"This is surprisingly difficult. I need to parse the `CREATE VIRTUAL TABLE` statement, which will look something like this: ```sql CREATE VIRTUAL TABLE ""global-power-plants_fts"" USING FTS5 (""name"", content=""global-power-plants"") ``` The problem is I need to be able to handle various different quoting formats for the table name (`mytable` v.s. `""mytable""` v.s. 
`[mytable]`) plus I need to look out for `CREATE TABLE IF NOT EXISTS`.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",736520310,Introspect if table is FTS4 or FTS5, https://github.com/simonw/sqlite-utils/issues/192#issuecomment-722054264,https://api.github.com/repos/simonw/sqlite-utils/issues/192,722054264,MDEyOklzc3VlQ29tbWVudDcyMjA1NDI2NA==,9599,simonw,2020-11-05T00:44:39Z,2020-11-05T00:44:39Z,OWNER,"I want `.search()` to work against both FTS5 and FTS4 tables - but sort by rank should only work for FTS5. This means I need to be able to introspect and tell if a table is FTS4 or FTS5.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",735532751,sqlite-utils search command, https://github.com/simonw/datasette/issues/1082#issuecomment-721931504,https://api.github.com/repos/simonw/datasette/issues/1082,721931504,MDEyOklzc3VlQ29tbWVudDcyMTkzMTUwNA==,9599,simonw,2020-11-04T19:32:47Z,2020-11-04T19:35:44Z,OWNER,"I wonder if setting a soft memory limit within Datasette would help here: https://www.sqlite.org/malloc.html#_setting_memory_usage_limits > If attempts are made to allocate more memory than specified by the soft heap limit, then SQLite will first attempt to free cache memory before continuing with the allocation request. https://www.sqlite.org/pragma.html#pragma_soft_heap_limit > **PRAGMA soft_heap_limit** > **PRAGMA soft_heap_limit=N** > > This pragma invokes the [sqlite3_soft_heap_limit64()](https://www.sqlite.org/c3ref/hard_heap_limit64.html) interface with the argument N, if N is specified and is a non-negative integer. The soft_heap_limit pragma always returns the same integer that would be returned by the [sqlite3_soft_heap_limit64](https://www.sqlite.org/c3ref/hard_heap_limit64.html)(-1) C-language function.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",735852274,DigitalOcean buildpack memory errors for large sqlite db?, https://github.com/simonw/datasette/issues/1083#issuecomment-721927254,https://api.github.com/repos/simonw/datasette/issues/1083,721927254,MDEyOklzc3VlQ29tbWVudDcyMTkyNzI1NA==,9599,simonw,2020-11-04T19:24:34Z,2020-11-04T19:24:34Z,OWNER,"Related: #856 - if it's possible to paginate correctly configured canned query then the CSV option to ""stream all rows"" could work for queries as well as tables.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",736365306,Advanced CSV export for arbitrary queries, https://github.com/simonw/datasette/issues/1083#issuecomment-721926827,https://api.github.com/repos/simonw/datasette/issues/1083,721926827,MDEyOklzc3VlQ29tbWVudDcyMTkyNjgyNw==,9599,simonw,2020-11-04T19:23:42Z,2020-11-04T19:23:42Z,OWNER,"https://latest.datasette.io/fixtures/sortable#export has advanced export options, but https://latest.datasette.io/fixtures?sql=select+pk1%2C+pk2%2C+content%2C+sortable%2C+sortable_with_nulls%2C+sortable_with_nulls_2%2C+text+from+sortable+order+by+pk1%2C+pk2+limit+101 does not.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",736365306,Advanced CSV export for arbitrary queries, 
https://github.com/simonw/datasette/issues/268#issuecomment-721896822,https://api.github.com/repos/simonw/datasette/issues/268,721896822,MDEyOklzc3VlQ29tbWVudDcyMTg5NjgyMg==,9599,simonw,2020-11-04T18:23:29Z,2020-11-04T18:23:29Z,OWNER,"Worth noting that joining to get the rank works for FTS5 but not for FTS4 - see comment here: https://github.com/simonw/sqlite-utils/issues/192#issuecomment-721420539 Easiest solution would be to only support sort-by-rank for FTS5 tables. Alternative would be to depend on https://github.com/simonw/sqlite-fts4","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323718842,Mechanism for ranking results from SQLite full-text search, https://github.com/simonw/datasette/issues/1082#issuecomment-721547177,https://api.github.com/repos/simonw/datasette/issues/1082,721547177,MDEyOklzc3VlQ29tbWVudDcyMTU0NzE3Nw==,39538958,justmars,2020-11-04T06:52:30Z,2020-11-04T06:53:16Z,NONE,"I think I tried the same db size on the following scenarios in Digital Ocean: 1. Basic ($5/month) with 512MB RAM 2. Basic ($10/month) with 1GB RAM 3. Pro ($12/month) with 1GB RAM All such attempts conked out with ""out of memory"" errors","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",735852274,DigitalOcean buildpack memory errors for large sqlite db?, https://github.com/simonw/datasette/issues/1082#issuecomment-721545090,https://api.github.com/repos/simonw/datasette/issues/1082,721545090,MDEyOklzc3VlQ29tbWVudDcyMTU0NTA5MA==,9599,simonw,2020-11-04T06:47:15Z,2020-11-04T06:47:15Z,OWNER,"I've run into a similar problem with Google Cloud Run: beyond a certain size of database file I find myself needing to run instances there with more RAM assigned to them. I haven't yet figured out a method to estimate the amount of RAM that will be needed to successfully serve a database file of a specific size- I've been using trial and error. 5GB is quite a big database file, so it doesn't surprise me that it may need a bigger instance. I recommend trying it on a 1GB or 2GB of RAM Digital Ocean instance (their default is 512MB) and see if that works. 
Let me know what you find out!","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",735852274,DigitalOcean buildpack memory errors for large sqlite db?, https://github.com/simonw/sqlite-utils/issues/192#issuecomment-721453779,https://api.github.com/repos/simonw/sqlite-utils/issues/192,721453779,MDEyOklzc3VlQ29tbWVudDcyMTQ1Mzc3OQ==,9599,simonw,2020-11-04T00:59:24Z,2020-11-04T00:59:36Z,OWNER,"FTS5 was added in SQLite 3.9.0 in 2015-10-14 - so about a year after CTEs, which means CTEs will always be safe to use with FTS5 queries.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",735532751,sqlite-utils search command, https://github.com/simonw/sqlite-utils/issues/192#issuecomment-721420907,https://api.github.com/repos/simonw/sqlite-utils/issues/192,721420907,MDEyOklzc3VlQ29tbWVudDcyMTQyMDkwNw==,9599,simonw,2020-11-03T23:07:01Z,2020-11-03T23:07:01Z,OWNER,"I could depend on my `sqlite-fts4` library to solve this: https://github.com/simonw/sqlite-fts4 Or I could say that only `FTS5` is supported for ranked searches - but still support `.search()` against FTS4 just without the option to sort by relevance.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",735532751,sqlite-utils search command, https://github.com/simonw/sqlite-utils/issues/192#issuecomment-721420539,https://api.github.com/repos/simonw/sqlite-utils/issues/192,721420539,MDEyOklzc3VlQ29tbWVudDcyMTQyMDUzOQ==,9599,simonw,2020-11-03T23:05:53Z,2020-11-03T23:05:53Z,OWNER,"Just realized there's a problem with the SQL I am using here: joining to get `rank` in this way only works against FTS5 tables, it doesn't work against FTS4. https://github.com/simonw/sqlite-utils/blob/c8a900df59efd34f394c863c0adff9912f1bf1d7/sqlite_utils/db.py#L1303-L1319","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",735532751,sqlite-utils search command, https://github.com/simonw/sqlite-utils/pull/195#issuecomment-721397665,https://api.github.com/repos/simonw/sqlite-utils/issues/195,721397665,MDEyOklzc3VlQ29tbWVudDcyMTM5NzY2NQ==,9599,simonw,2020-11-03T22:02:57Z,2020-11-03T22:02:57Z,OWNER,Documentation so far: https://github.com/simonw/sqlite-utils/blob/6cadc6103ff1ba58c6409ce7fba74259e72965d9/docs/cli.rst#executing-searches,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",735663855,table.search() improvements plus sqlite-utils search command, https://github.com/simonw/sqlite-utils/issues/192#issuecomment-721319602,https://api.github.com/repos/simonw/sqlite-utils/issues/192,721319602,MDEyOklzc3VlQ29tbWVudDcyMTMxOTYwMg==,9599,simonw,2020-11-03T19:05:05Z,2020-11-03T19:05:05Z,OWNER,"Relevant example using a SQLite CTE: https://github.com/simonw/datasette/issues/268#issuecomment-675725464 CTEs were added in SQLite 3.8.3 in 2014-02-03 so they should be safe to use. 
If someone tries to run `sqlite-utils search` no an older version of SQLite they'll get an error, which I think is OK.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",735532751,sqlite-utils search command, https://github.com/simonw/datasette/issues/596#issuecomment-720741903,https://api.github.com/repos/simonw/datasette/issues/596,720741903,MDEyOklzc3VlQ29tbWVudDcyMDc0MTkwMw==,132978,terrycojones,2020-11-02T21:44:45Z,2020-11-02T21:44:45Z,NONE,Hi & thanks for the note @simonw! I wish I had more time to play with (and contribute to) datasette. I know you don't need me to tell you that it's super cool :-),"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",507454958,Handle really wide tables better, https://github.com/simonw/datasette/issues/830#issuecomment-720700065,https://api.github.com/repos/simonw/datasette/issues/830,720700065,MDEyOklzc3VlQ29tbWVudDcyMDcwMDA2NQ==,9599,simonw,2020-11-02T20:15:36Z,2020-11-02T20:15:36Z,OWNER,"#427 had a bunch of ambitious plans for faceting that I haven't realized yet: > Think of all of the potential kinds of facets: > > * `?_facet_array=tags` where tags is a JSON array of values > * `_facet_date=datetimecol` - faceted by date part of a datetime > * `_facet_bins=numeric_column` - can I do some kind of fancy binning here? Might need to take an argument > * `?_facet_bins=numeric_column:5` - could be a way to take an argument. We’ll ignore columns with a : in their name. > * `?_facet_json=jsoncol:jsonpath` - could use a JSON path to extract out something to facet on? > * `?_facet_percentile=numericcolumn` - could this work? > * `?_facet_function=column:sqlfunctionname` - maybe this could be interesting? Would allow for e.g. facet by soundex > * `?_facet_prefix=column:prefix` - facet by terms but only if they start with a specific prefix > * `?_facet_substring=column:3,6` - facet by a substr(column, 3, 6)","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",636511683,Redesign register_facet_classes plugin hook, https://github.com/simonw/datasette/issues/1080#issuecomment-720696827,https://api.github.com/repos/simonw/datasette/issues/1080,720696827,MDEyOklzc3VlQ29tbWVudDcyMDY5NjgyNw==,9599,simonw,2020-11-02T20:08:49Z,2020-11-02T20:13:56Z,OWNER,"Implementing pagination for facets will be interesting. Would be easier if I had a nicer reusable internal pagination mechanism, which is also needed for #856 (pagination of canned queries).","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",734777631,"""View all"" option for facets, to provide a (paginated) list of ALL of the facet counts plus a link to view them", https://github.com/simonw/datasette/issues/1080#issuecomment-720699160,https://api.github.com/repos/simonw/datasette/issues/1080,720699160,MDEyOklzc3VlQ29tbWVudDcyMDY5OTE2MA==,9599,simonw,2020-11-02T20:13:42Z,2020-11-02T20:13:42Z,OWNER,Also relevant to this issue: #830 - redesigning the facet plugin hook in preparation for Datasette 1.0. 
And #972 supporting faceting against arbitrary queries.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",734777631,"""View all"" option for facets, to provide a (paginated) list of ALL of the facet counts plus a link to view them", https://github.com/simonw/datasette/issues/1080#issuecomment-720698577,https://api.github.com/repos/simonw/datasette/issues/1080,720698577,MDEyOklzc3VlQ29tbWVudDcyMDY5ODU3Nw==,9599,simonw,2020-11-02T20:12:26Z,2020-11-02T20:12:26Z,OWNER,"For regular column faceting, here's the query that is used: https://github.com/simonw/datasette/blob/13d1228d80c91d382a05b1a9549ed02c300ef851/datasette/facets.py#L196-L204 Since it uses `order by count desc, value` maybe those values could be used to implement cursor-based pagination. That wouldn't be robust in the face of changing data, but I'm not sure it's possible to implement paginated faceting in a way that survives ongoing changes to the underlying data.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",734777631,"""View all"" option for facets, to provide a (paginated) list of ALL of the facet counts plus a link to view them", https://github.com/simonw/datasette/issues/1080#issuecomment-720697226,https://api.github.com/repos/simonw/datasette/issues/1080,720697226,MDEyOklzc3VlQ29tbWVudDcyMDY5NzIyNg==,9599,simonw,2020-11-02T20:09:38Z,2020-11-02T20:09:38Z,OWNER,"Maybe this ends up being code that defers to a simulated canned query, rendered using the existing `query.html` template.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",734777631,"""View all"" option for facets, to provide a (paginated) list of ALL of the facet counts plus a link to view them", https://github.com/simonw/datasette/issues/1080#issuecomment-720695174,https://api.github.com/repos/simonw/datasette/issues/1080,720695174,MDEyOklzc3VlQ29tbWVudDcyMDY5NTE3NA==,9599,simonw,2020-11-02T20:05:26Z,2020-11-02T20:05:26Z,OWNER,"URL design: `/database/table/-/facet/colname` And for other types of facet (to be supported later): `/database/table/-/facet/colname?_type=m2m`","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",734777631,"""View all"" option for facets, to provide a (paginated) list of ALL of the facet counts plus a link to view them", https://github.com/simonw/datasette/issues/596#issuecomment-720689653,https://api.github.com/repos/simonw/datasette/issues/596,720689653,MDEyOklzc3VlQ29tbWVudDcyMDY4OTY1Mw==,9599,simonw,2020-11-02T19:53:36Z,2020-11-02T19:53:47Z,OWNER,"In #998 I implemented a horizontal scrollbar for these tables, which is a big improvement - demo here: https://global-power-plants.datasettes.com/global-power-plants/global-power-plants","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",507454958,Handle really wide tables better, https://github.com/simonw/datasette/issues/1077#issuecomment-720654925,https://api.github.com/repos/simonw/datasette/issues/1077,720654925,MDEyOklzc3VlQ29tbWVudDcyMDY1NDkyNQ==,9599,simonw,2020-11-02T18:43:25Z,2020-11-02T18:43:25Z,OWNER,Demo: https://latest.datasette.io/fixtures?_bot=1,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, 
""eyes"": 0}",733829385,database_actions plugin hook, https://github.com/simonw/datasette/issues/1077#issuecomment-720637322,https://api.github.com/repos/simonw/datasette/issues/1077,720637322,MDEyOklzc3VlQ29tbWVudDcyMDYzNzMyMg==,9599,simonw,2020-11-02T18:09:17Z,2020-11-02T18:09:17Z,OWNER,Here's the `table_actions` implementation: 2f7731e9e5ff9b324beb5039fbe2be55d704a184,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",733829385,database_actions plugin hook, https://github.com/simonw/datasette/issues/838#issuecomment-720354227,https://api.github.com/repos/simonw/datasette/issues/838,720354227,MDEyOklzc3VlQ29tbWVudDcyMDM1NDIyNw==,82988,psychemedia,2020-11-02T09:33:58Z,2020-11-02T09:33:58Z,CONTRIBUTOR,"Thanks; just a note that the `datasette.urls.static(path)` and `datasette.urls.static_plugins(plugin_name, path)` items both seem to be repeated and appear in the docs twice?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",637395097,Incorrect URLs when served behind a proxy with base_url set, https://github.com/simonw/datasette/issues/1079#issuecomment-720110298,https://api.github.com/repos/simonw/datasette/issues/1079,720110298,MDEyOklzc3VlQ29tbWVudDcyMDExMDI5OA==,9599,simonw,2020-11-01T15:58:22Z,2020-11-01T15:58:22Z,OWNER,Might try a drop shadow on that menu too.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",733999615,Handle long breadcrumbs better with new menu, https://github.com/simonw/datasette/issues/782#issuecomment-720028476,https://api.github.com/repos/simonw/datasette/issues/782,720028476,MDEyOklzc3VlQ29tbWVudDcyMDAyODQ3Ng==,9599,simonw,2020-11-01T05:00:05Z,2020-11-01T05:00:05Z,OWNER,This should be the key focus for Datasette 0.52.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",627794879,Redesign default .json format, https://github.com/simonw/datasette/issues/949#issuecomment-720021029,https://api.github.com/repos/simonw/datasette/issues/949,720021029,MDEyOklzc3VlQ29tbWVudDcyMDAyMTAyOQ==,9599,simonw,2020-11-01T03:29:48Z,2020-11-01T03:29:48Z,OWNER,I'm not going to do any more work on this - SQL isn't an auto-complete friendly enough language.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",684961449,Try out CodeMirror SQL hints, https://github.com/simonw/datasette/issues/1077#issuecomment-720003026,https://api.github.com/repos/simonw/datasette/issues/1077,720003026,MDEyOklzc3VlQ29tbWVudDcyMDAwMzAyNg==,9599,simonw,2020-10-31T23:48:21Z,2020-10-31T23:50:07Z,OWNER,Needed by https://github.com/simonw/datasette-backup/issues/6,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",733829385,database_actions plugin hook, https://github.com/simonw/datasette/issues/1047#issuecomment-719994676,https://api.github.com/repos/simonw/datasette/issues/1047,719994676,MDEyOklzc3VlQ29tbWVudDcxOTk5NDY3Ng==,9599,simonw,2020-10-31T22:11:25Z,2020-10-31T22:11:25Z,OWNER,https://docs.datasette.io/en/latest/binary_data.html,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",728895233,A new section in the docs about how Datasette 
handles BLOB columns, https://github.com/simonw/datasette/issues/1027#issuecomment-719988113,https://api.github.com/repos/simonw/datasette/issues/1027,719988113,MDEyOklzc3VlQ29tbWVudDcxOTk4ODExMw==,9599,simonw,2020-10-31T21:03:37Z,2020-10-31T21:03:37Z,OWNER,"On my Mac, I run: datasette . -p 8009 --config base_url:/datasette-prefix/ Then I edited `/usr/local/etc/httpd/httpd.conf` and add this section: ``` LoadModule proxy_module lib/httpd/modules/mod_proxy.so LoadModule proxy_http_module lib/httpd/modules/mod_proxy_http.so ProxyPass /datasette-prefix/ http://localhost:8009/datasette-prefix/ ``` I ran Apache in the foreground like so: apachectl -X Now hitting http://localhost:8081/datasette-prefix/fixtures/compound_three_primary_keys worked!","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",722758132,Add documentation on serving Datasette behind a proxy using base_url, https://github.com/simonw/datasette/issues/1023#issuecomment-719986922,https://api.github.com/repos/simonw/datasette/issues/1023,719986922,MDEyOklzc3VlQ29tbWVudDcxOTk4NjkyMg==,9599,simonw,2020-10-31T20:51:01Z,2020-10-31T20:51:01Z,OWNER,This should all be working correctly now.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",722673818,Fix issues relating to base_url, https://github.com/simonw/datasette/issues/838#issuecomment-719986904,https://api.github.com/repos/simonw/datasette/issues/838,719986904,MDEyOklzc3VlQ29tbWVudDcxOTk4NjkwNA==,9599,simonw,2020-10-31T20:50:41Z,2020-10-31T20:50:41Z,OWNER,"OK, this should be working now. You can use the `datasette.urls.static_plugins()` method to generate the correct URLs in the `extra_css_urls` plugin hook: https://docs.datasette.io/en/latest/internals.html#datasette-urls","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",637395097,Incorrect URLs when served behind a proxy with base_url set, https://github.com/simonw/datasette/issues/1041#issuecomment-719986800,https://api.github.com/repos/simonw/datasette/issues/1041,719986800,MDEyOklzc3VlQ29tbWVudDcxOTk4NjgwMA==,9599,simonw,2020-10-31T20:49:28Z,2020-10-31T20:49:28Z,OWNER,Implemented in a4ca26a2659d21779adf625183061d8879954c15,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",727627923,extra_js_urls and extra_css_urls should respect base_url setting, https://github.com/simonw/datasette/issues/1072#issuecomment-719986698,https://api.github.com/repos/simonw/datasette/issues/1072,719986698,MDEyOklzc3VlQ29tbWVudDcxOTk4NjY5OA==,9599,simonw,2020-10-31T20:48:17Z,2020-10-31T20:48:17Z,OWNER,Here's the `datasette-edit-templates` plugin WIP I had before removing the hook: https://github.com/simonw/datasette-edit-templates/tree/82855c2612b84bc09c48fca885f831633a0d1552,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",733499930,load_template hook doesn't work for include/extends, https://github.com/simonw/datasette/issues/1075#issuecomment-719983750,https://api.github.com/repos/simonw/datasette/issues/1075,719983750,MDEyOklzc3VlQ29tbWVudDcxOTk4Mzc1MA==,9599,simonw,2020-10-31T20:22:29Z,2020-10-31T20:22:29Z,OWNER,I bet this is because I'm mucking around with one of those `__` methods. 
I'll try just doing the non-underscore methods instead.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",733796942,PrefixedUrlString mechanism broke everything, https://github.com/simonw/datasette/issues/1075#issuecomment-719983565,https://api.github.com/repos/simonw/datasette/issues/1075,719983565,MDEyOklzc3VlQ29tbWVudDcxOTk4MzU2NQ==,9599,simonw,2020-10-31T20:21:03Z,2020-10-31T20:21:03Z,OWNER,"Here's the output of `dir(str)`: `['__add__', '__class__', '__contains__', '__delattr__', '__dir__', '__doc__', '__eq__', '__format__', '__ge__', '__getattribute__', '__getitem__', '__getnewargs__', '__gt__', '__hash__', '__init__', '__init_subclass__', '__iter__', '__le__', '__len__', '__lt__', '__mod__', '__mul__', '__ne__', '__new__', '__reduce__', '__reduce_ex__', '__repr__', '__rmod__', '__rmul__', '__setattr__', '__sizeof__', '__str__', '__subclasshook__', 'capitalize', 'casefold', 'center', 'count', 'encode', 'endswith', 'expandtabs', 'find', 'format', 'format_map', 'index', 'isalnum', 'isalpha', 'isascii', 'isdecimal', 'isdigit', 'isidentifier', 'islower', 'isnumeric', 'isprintable', 'isspace', 'istitle', 'isupper', 'join', 'ljust', 'lower', 'lstrip', 'maketrans', 'partition', 'replace', 'rfind', 'rindex', 'rjust', 'rpartition', 'rsplit', 'rstrip', 'split', 'splitlines', 'startswith', 'strip', 'swapcase', 'title', 'translate', 'upper', 'zfill']`","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",733796942,PrefixedUrlString mechanism broke everything, https://github.com/simonw/datasette/issues/1075#issuecomment-719983484,https://api.github.com/repos/simonw/datasette/issues/1075,719983484,MDEyOklzc3VlQ29tbWVudDcxOTk4MzQ4NA==,9599,simonw,2020-10-31T20:20:28Z,2020-10-31T20:20:28Z,OWNER,"It looks like this is specific to the way `PrefixedUrlString` is built. ``` (Pdb) class Weird(str): pass (Pdb) isinstance(Weird('bob'), collections.abc.Awaitable) False ``` So subclassing strings doesn't trigger this bug, but something about `PrefixedUrlString` causes the problem. Here's the current `PrefixedUrlString` implementation: https://github.com/simonw/datasette/blob/bf18b9ba175a7b25fb8b765847397dd6efb8bb7b/datasette/utils/__init__.py#L1015-L1035","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",733796942,PrefixedUrlString mechanism broke everything, https://github.com/simonw/datasette/issues/1075#issuecomment-719983240,https://api.github.com/repos/simonw/datasette/issues/1075,719983240,MDEyOklzc3VlQ29tbWVudDcxOTk4MzI0MA==,9599,simonw,2020-10-31T20:18:49Z,2020-10-31T20:18:49Z,OWNER,"Here's the core problem: ``` (Pdb) isinstance('bob', collections.abc.Awaitable) False (Pdb) isinstance(PrefixedUrlString('bob'), collections.abc.Awaitable) *** TypeError: issubclass() arg 1 must be a class ``` For some reason `isinstance()` does not like being handed an instance of PrefixedUrlString.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",733796942,PrefixedUrlString mechanism broke everything, https://github.com/simonw/datasette/issues/1075#issuecomment-719981173,https://api.github.com/repos/simonw/datasette/issues/1075,719981173,MDEyOklzc3VlQ29tbWVudDcxOTk4MTE3Mw==,9599,simonw,2020-10-31T20:02:30Z,2020-10-31T20:03:45Z,OWNER,"I wonder how Jinja's `Markup()` class works? 
It uses https://pypi.org/project/MarkupSafe/ It's a subclass of `str`, defined here: https://github.com/pallets/markupsafe/blob/master/src/markupsafe/__init__.py","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",733796942,PrefixedUrlString mechanism broke everything, https://github.com/simonw/datasette/issues/1075#issuecomment-719980742,https://api.github.com/repos/simonw/datasette/issues/1075,719980742,MDEyOklzc3VlQ29tbWVudDcxOTk4MDc0Mg==,9599,simonw,2020-10-31T19:58:57Z,2020-10-31T19:58:57Z,OWNER,"Sample traceback: ``` <link rel=""stylesheet"" href=""{{ urls.static('app.css') }}?{{ app_css_hash }}""> /opt/hostedtoolcache/Python/3.7.9/x64/lib/python3.7/site-packages/jinja2/asyncsupport.py:173: in auto_await if inspect.isawaitable(value): /opt/hostedtoolcache/Python/3.7.9/x64/lib/python3.7/inspect.py:226: in isawaitable isinstance(object, collections.abc.Awaitable)) /opt/hostedtoolcache/Python/3.7.9/x64/lib/python3.7/abc.py:139: in __instancecheck__ return _abc_instancecheck(cls, instance) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ cls = <class 'collections.abc.Awaitable'> subclass = <bound method PrefixedUrlString.__getattribute__.<locals>.method of '/-/static/app.css'> def __subclasscheck__(cls, subclass): """"""Override for issubclass(subclass, cls)."""""" > return _abc_subclasscheck(cls, subclass) E TypeError: issubclass() arg 1 must be a class ``` This is within Jinja. It looks like Jinja really doesn't like methods that return non-string objects like `PrefixedUrlString`.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",733796942,PrefixedUrlString mechanism broke everything, https://github.com/simonw/datasette/issues/1074#issuecomment-719977864,https://api.github.com/repos/simonw/datasette/issues/1074,719977864,MDEyOklzc3VlQ29tbWVudDcxOTk3Nzg2NA==,9599,simonw,2020-10-31T19:35:01Z,2020-10-31T19:35:01Z,OWNER,"These plugins were not designed to be actually hosted online, so they do some nasty things like linking to the made-up `plugin-example.com` domain. I should fix that. <img width=""853"" alt=""Datasette_Fixtures__fixtures"" src=""https://user-images.githubusercontent.com/9599/97788348-7b60d800-1b75-11eb-8519-a58e52108841.png""> ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",733768037,latest.datasette.io should include plugins from fixtures, https://github.com/simonw/datasette/issues/1067#issuecomment-719966176,https://api.github.com/repos/simonw/datasette/issues/1067,719966176,MDEyOklzc3VlQ29tbWVudDcxOTk2NjE3Ng==,9599,simonw,2020-10-31T17:51:31Z,2020-10-31T17:51:31Z,OWNER,"Demo: - https://latest.datasette.io/fixtures/facetable?_bot=1 - https://latest.datasette.io/fixtures/simple_view?_bot=1","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",732905360,"Table actions menu on view pages, not on query pages", https://github.com/simonw/datasette/issues/1074#issuecomment-719965426,https://api.github.com/repos/simonw/datasette/issues/1074,719965426,MDEyOklzc3VlQ29tbWVudDcxOTk2NTQyNg==,9599,simonw,2020-10-31T17:45:00Z,2020-10-31T17:45:00Z,OWNER,"This is working. 
Go to https://latest.datasette.io/login-as-root and click the button, then visit this page to see extra content added by plugins: https://latest.datasette.io/fixtures/compound_three_primary_keys  ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",733768037,latest.datasette.io should include plugins from fixtures, https://github.com/simonw/datasette/issues/1074#issuecomment-719963074,https://api.github.com/repos/simonw/datasette/issues/1074,719963074,MDEyOklzc3VlQ29tbWVudDcxOTk2MzA3NA==,9599,simonw,2020-10-31T17:23:48Z,2020-10-31T17:23:48Z,OWNER,"Needs a way to login as root, seeing as several plugins only show extra content if the user is logged in as root.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",733768037,latest.datasette.io should include plugins from fixtures, https://github.com/simonw/datasette/issues/1067#issuecomment-719961701,https://api.github.com/repos/simonw/datasette/issues/1067,719961701,MDEyOklzc3VlQ29tbWVudDcxOTk2MTcwMQ==,9599,simonw,2020-10-31T17:11:59Z,2020-10-31T17:11:59Z,OWNER,It bothers me that these aren't visible in any public demos. Maybe `latest.datasette.io` should include the `my_plugins.py` and `my_plugins2.py` plugins?,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",732905360,"Table actions menu on view pages, not on query pages", https://github.com/simonw/datasette/issues/1026#issuecomment-719959754,https://api.github.com/repos/simonw/datasette/issues/1026,719959754,MDEyOklzc3VlQ29tbWVudDcxOTk1OTc1NA==,9599,simonw,2020-10-31T16:56:35Z,2020-10-31T16:56:35Z,OWNER,#1041 can also benefit from the string subclass that shows that `base_url` has been added.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",722738988,How should datasette.client interact with base_url, https://github.com/simonw/datasette/issues/1067#issuecomment-719959419,https://api.github.com/repos/simonw/datasette/issues/1067,719959419,MDEyOklzc3VlQ29tbWVudDcxOTk1OTQxOQ==,9599,simonw,2020-10-31T16:53:42Z,2020-10-31T16:53:42Z,OWNER,For the 0.51 release I'm going to add tests that show this works on view pages. I won't implement it for query pages.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",732905360,"Table actions menu on view pages, not on query pages", https://github.com/simonw/datasette/issues/1067#issuecomment-719956184,https://api.github.com/repos/simonw/datasette/issues/1067,719956184,MDEyOklzc3VlQ29tbWVudDcxOTk1NjE4NA==,9599,simonw,2020-10-31T16:26:09Z,2020-10-31T16:26:09Z,OWNER,"Should the hook provide an indication that it's running on a different type of page? I think yes for queries. 
Not sure about views - they behave very much like tables, and the plugin can always introspect to see if something is a view if it needs to.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",732905360,"Table actions menu on view pages, not on query pages", https://github.com/simonw/datasette/issues/1070#issuecomment-719955724,https://api.github.com/repos/simonw/datasette/issues/1070,719955724,MDEyOklzc3VlQ29tbWVudDcxOTk1NTcyNA==,9599,simonw,2020-10-31T16:22:45Z,2020-10-31T16:22:45Z,OWNER,I've removed this plugin hook in #1073.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",733390884,load_template() example in documentation showing loading from a database, https://github.com/simonw/datasette/issues/1072#issuecomment-719955491,https://api.github.com/repos/simonw/datasette/issues/1072,719955491,MDEyOklzc3VlQ29tbWVudDcxOTk1NTQ5MQ==,9599,simonw,2020-10-31T16:20:58Z,2020-10-31T16:20:58Z,OWNER,"Here's the proof of concept `FunctionLoader` that showed me that this wasn't going to work: ```diff diff --git a/datasette/app.py b/datasette/app.py index 4b28e71..b076be7 100644 --- a/datasette/app.py +++ b/datasette/app.py @@ -21,7 +21,7 @@ from pathlib import Path from markupsafe import Markup from itsdangerous import URLSafeSerializer import jinja2 -from jinja2 import ChoiceLoader, Environment, FileSystemLoader, PrefixLoader +from jinja2 import ChoiceLoader, Environment, FileSystemLoader, FunctionLoader, PrefixLoader from jinja2.environment import Template from jinja2.exceptions import TemplateNotFound import uvicorn @@ -300,6 +300,7 @@ class Datasette: template_paths.append(default_templates) template_loader = ChoiceLoader( [ + FunctionLoader(self._load_template_from_plugins), FileSystemLoader(template_paths), # Support {% extends ""default:table.html"" %}: PrefixLoader( @@ -322,6 +323,17 @@ class Datasette: self._root_token = secrets.token_hex(32) self.client = DatasetteClient(self) + def _load_template_from_plugins(self, template): + # ""If auto reloading is enabled it’s called to check if the template changed"" + uptodatefunc = lambda: True + source = pm.hook.load_template( + template=template, + datasette=self, + ) + if source is None: + return None + return source, template, uptodatefunc + @property def urls(self): return Urls(self) @@ -719,35 +731,7 @@ class Datasette: else: if isinstance(templates, str): templates = [templates] - - # Give plugins first chance at loading the template - break_outer = False - plugin_template_source = None - plugin_template_name = None - template_name = None - for template_name in templates: - if break_outer: - break - plugin_template_source = pm.hook.load_template( - template=template_name, - request=request, - datasette=self, - ) - plugin_template_source = await await_me_maybe(plugin_template_source) - if plugin_template_source: - break_outer = True - plugin_template_name = template_name - break - if plugin_template_source is not None: - template = self.jinja_env.from_string(plugin_template_source) - else: - template = self.jinja_env.select_template(templates) - for template_name in templates: - from_plugin = template_name == plugin_template_name - used = from_plugin or template_name == template.name - templates_considered.append( - {""name"": template_name, ""used"": used, ""from_plugin"": from_plugin} - ) + template = self.jinja_env.select_template(templates) body_scripts = [] # pylint: 
disable=no-member for extra_script in pm.hook.extra_body_script( diff --git a/datasette/hookspecs.py b/datasette/hookspecs.py index ca84b35..7804def 100644 --- a/datasette/hookspecs.py +++ b/datasette/hookspecs.py @@ -50,7 +50,7 @@ def extra_template_vars( @hookspec(firstresult=True) -def load_template(template, request, datasette): +def load_template(template, datasette): ""Load the specified template, returning the template code as a string"" diff --git a/docs/plugin_hooks.rst b/docs/plugin_hooks.rst index 3c57b6a..8f2704e 100644 --- a/docs/plugin_hooks.rst +++ b/docs/plugin_hooks.rst @@ -273,15 +273,12 @@ Example: `datasette-cluster-map <https://github.com/simonw/datasette-cluster-map .. _plugin_hook_load_template: -load_template(template, request, datasette) -------------------------------------------- +load_template(template, datasette) +---------------------------------- ``template`` - string The template that is being rendered, e.g. ``database.html`` -``request`` - object or None - The current HTTP :ref:`internals_request`. This can be ``None`` if the request object is not available. - ``datasette`` - :ref:`internals_datasette` You can use this to access plugin configuration options via ``datasette.plugin_config(your_plugin_name)`` ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",733499930,load_template hook doesn't work for include/extends, https://github.com/simonw/datasette/issues/1073#issuecomment-719834200,https://api.github.com/repos/simonw/datasette/issues/1073,719834200,MDEyOklzc3VlQ29tbWVudDcxOTgzNDIwMA==,9599,simonw,2020-10-30T22:52:48Z,2020-10-30T22:52:48Z,OWNER,Should mostly be a case of backing out the changes from this commit: 81dea4b07ab2b6f4eaaf248307d2b588472054a1,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",733560417,Remove load_template plugin hook, https://github.com/simonw/datasette/issues/1072#issuecomment-719833744,https://api.github.com/repos/simonw/datasette/issues/1072,719833744,MDEyOklzc3VlQ29tbWVudDcxOTgzMzc0NA==,9599,simonw,2020-10-30T22:50:57Z,2020-10-30T22:50:57Z,OWNER,Yeah I'm going to remove the `load_template` plugin hook and see if it's possible to build the edit templates extension against `prepare_jinja2_environment` instead.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",733499930,load_template hook doesn't work for include/extends, https://github.com/simonw/datasette/issues/1072#issuecomment-719833070,https://api.github.com/repos/simonw/datasette/issues/1072,719833070,MDEyOklzc3VlQ29tbWVudDcxOTgzMzA3MA==,9599,simonw,2020-10-30T22:48:04Z,2020-10-30T22:48:04Z,OWNER,"https://github.com/simonw/datasette/blob/a2a709072059c6b3da365df9a332ca744c2079e9/datasette/app.py#L310-L318 So yeah that plugin hook can probably modify the list of loaders available to the `Environment`.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",733499930,load_template hook doesn't work for include/extends, https://github.com/simonw/datasette/issues/1072#issuecomment-719832853,https://api.github.com/repos/simonw/datasette/issues/1072,719832853,MDEyOklzc3VlQ29tbWVudDcxOTgzMjg1Mw==,9599,simonw,2020-10-30T22:47:12Z,2020-10-30T22:47:12Z,OWNER,Maybe I should ditch this hook entirely in favour of the existing `prepare_jinja2_environment` 
hook. Could that add new template loaders?,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",733499930,load_template hook doesn't work for include/extends, https://github.com/simonw/datasette/issues/1072#issuecomment-719832651,https://api.github.com/repos/simonw/datasette/issues/1072,719832651,MDEyOklzc3VlQ29tbWVudDcxOTgzMjY1MQ==,9599,simonw,2020-10-30T22:46:25Z,2020-10-30T22:46:25Z,OWNER,"I tried using a `FunctionLoader` and got this error on startup: ```python File ""/Users/simon/Dropbox/Development/datasette/datasette/app.py"", line 989, in __init__ for filepath in self.ds.jinja_env.list_templates() File ""/Users/simon/.local/share/virtualenvs/datasette-edit-templates-agoZyE3x/lib/python3.8/site-packages/jinja2/environment.py"", line 810, in list_templates names = self.loader.list_templates() File ""/Users/simon/.local/share/virtualenvs/datasette-edit-templates-agoZyE3x/lib/python3.8/site-packages/jinja2/loaders.py"", line 434, in list_templates found.update(loader.list_templates()) File ""/Users/simon/.local/share/virtualenvs/datasette-edit-templates-agoZyE3x/lib/python3.8/site-packages/jinja2/loaders.py"", line 99, in list_templates raise TypeError(""this loader cannot iterate over all templates"") TypeError: this loader cannot iterate over all templates ``` So if I'm going to define a custom Jinja loader I'll need to teach plugins to answer the ""list templates"" query.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",733499930,load_template hook doesn't work for include/extends, https://github.com/simonw/datasette/issues/1072#issuecomment-719819331,https://api.github.com/repos/simonw/datasette/issues/1072,719819331,MDEyOklzc3VlQ29tbWVudDcxOTgxOTMzMQ==,9599,simonw,2020-10-30T22:00:43Z,2020-10-30T22:00:43Z,OWNER,I'll try getting that to work. If I can't get it to work I'll drop the plugin hook for the moment.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",733499930,load_template hook doesn't work for include/extends, https://github.com/simonw/datasette/issues/1072#issuecomment-719819234,https://api.github.com/repos/simonw/datasette/issues/1072,719819234,MDEyOklzc3VlQ29tbWVudDcxOTgxOTIzNA==,9599,simonw,2020-10-30T22:00:21Z,2020-10-30T22:00:21Z,OWNER,"There might be a way to save this. Async template loading can't be supported, but what if you could define a `load_template()` hook which returned a sync function that returned templates... Then the `datasette-edit-templates` plugin could reply to `load_template` by loading all DB templates into memory and returning a `load_template` sync function that looked up the values in those already-loaded templates. It could even maintain an in-memory cache that gets updated when a template is edited. 
If I do this, I could remove the ability to return an `async` function from `load_template()` but add that in the future should Jinja implement a mechanism for async template loading.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",733499930,load_template hook doesn't work for include/extends, https://github.com/simonw/datasette/issues/1072#issuecomment-719814279,https://api.github.com/repos/simonw/datasette/issues/1072,719814279,MDEyOklzc3VlQ29tbWVudDcxOTgxNDI3OQ==,9599,simonw,2020-10-30T21:45:33Z,2020-10-30T21:45:33Z,OWNER,Sadly I'm going to bump `load_template` from Datasette 0.51 - I don't think I should block the release on resolving this issue.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",733499930,load_template hook doesn't work for include/extends, https://github.com/simonw/datasette/issues/1072#issuecomment-719813970,https://api.github.com/repos/simonw/datasette/issues/1072,719813970,MDEyOklzc3VlQ29tbWVudDcxOTgxMzk3MA==,9599,simonw,2020-10-30T21:44:40Z,2020-10-30T21:44:40Z,OWNER,"I'm pretty sure that `run_in_executor()` workaround won't work. https://github.com/django/asgiref/blob/7becc9daca2628c46af1cb7e46b4c47c1ea27adf/asgiref/sync.py#L83 for example says ""You cannot use AsyncToSync in the same thread as an async event loop"".","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",733499930,load_template hook doesn't work for include/extends, https://github.com/simonw/datasette/issues/1072#issuecomment-719813212,https://api.github.com/repos/simonw/datasette/issues/1072,719813212,MDEyOklzc3VlQ29tbWVudDcxOTgxMzIxMg==,9599,simonw,2020-10-30T21:42:35Z,2020-10-30T21:42:35Z,OWNER,Filed a feature request here: https://github.com/pallets/jinja/issues/1304,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",733499930,load_template hook doesn't work for include/extends, https://github.com/simonw/datasette/issues/1072#issuecomment-719811312,https://api.github.com/repos/simonw/datasette/issues/1072,719811312,MDEyOklzc3VlQ29tbWVudDcxOTgxMTMxMg==,9599,simonw,2020-10-30T21:36:49Z,2020-10-30T21:36:49Z,OWNER,"There's one other option: in `datasette-edit-templates` I could maybe use `asyncio.get_event_loop().run_in_executor(...)` to load the templates asynchronously within the Jinja template loader mechanism. ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",733499930,load_template hook doesn't work for include/extends, https://github.com/simonw/datasette/issues/1072#issuecomment-719810533,https://api.github.com/repos/simonw/datasette/issues/1072,719810533,MDEyOklzc3VlQ29tbWVudDcxOTgxMDUzMw==,9599,simonw,2020-10-30T21:34:38Z,2020-10-30T21:34:38Z,OWNER,"... no wait, my comments above assume that I'm just building the `datasette-edit-templates` plugin. Does this work as a general solution for all of Datasette? I don't think it does. 
This may mean I need to delay the whole feature.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",733499930,load_template hook doesn't work for include/extends, https://github.com/simonw/datasette/issues/1072#issuecomment-719810023,https://api.github.com/repos/simonw/datasette/issues/1072,719810023,MDEyOklzc3VlQ29tbWVudDcxOTgxMDAyMw==,9599,simonw,2020-10-30T21:33:06Z,2020-10-30T21:33:06Z,OWNER,"The ideal solution is for Jinja to offer `async` template loading. I'll file a feature request, then I'll implement the second option above (async load all templates from the DB before each render).","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",733499930,load_template hook doesn't work for include/extends, https://github.com/simonw/datasette/issues/1072#issuecomment-719809780,https://api.github.com/repos/simonw/datasette/issues/1072,719809780,MDEyOklzc3VlQ29tbWVudDcxOTgwOTc4MA==,9599,simonw,2020-10-30T21:32:28Z,2020-10-30T21:32:28Z,OWNER,"Here's an alternative that would definitely work and would be a lot simpler, at the cost of a fair amount of RAM: 1. Before rendering the template, load ALL of the most-recent-versions of the templates that are stored in the DB. Use those to populate a `DictLoader`. 2. Render the template. This does mean loading template bodies that we won't use. Provided an instance has less than 100 templates I imagine this will work just fine.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",733499930,load_template hook doesn't work for include/extends, https://github.com/simonw/datasette/issues/1072#issuecomment-719809259,https://api.github.com/repos/simonw/datasette/issues/1072,719809259,MDEyOklzc3VlQ29tbWVudDcxOTgwOTI1OQ==,9599,simonw,2020-10-30T21:31:10Z,2020-10-30T21:31:10Z,OWNER,"How can we tell what template Jinja will need to render? One approach that could work: 1. Set up a dummy template loader which records the name of the template that was requested 2. Load the template 3. Now we know the list of templates that were requested. Async load those 4. The dummy template loader can now return the ones we have loaded. Load the template again. 5. Did it request any more templates? If so, load those, and repeat. 6. Keep on with this loop until a template load (which might even have to be a render) fails to request any templates that we have not yet loaded. 7. Render the template. This is GROSS. It feels like a huge waste of CPU, and it could lead to very weird behaviour if any template variables have side effects.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",733499930,load_template hook doesn't work for include/extends, https://github.com/simonw/datasette/issues/1072#issuecomment-719807502,https://api.github.com/repos/simonw/datasette/issues/1072,719807502,MDEyOklzc3VlQ29tbWVudDcxOTgwNzUwMg==,9599,simonw,2020-10-30T21:26:49Z,2020-10-30T21:26:49Z,OWNER,"It looks like Jinja does not have a mechanism for asynchronous template loading - the loader API is synchronous. One option may be to figure out which templates are needed (including inherited templates and includes) before rendering the template. 
Then async load those templates from the database into a `DictLoader`, then pass that `DictLoader` to Jinja.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",733499930,load_template hook doesn't work for include/extends, https://github.com/simonw/datasette/issues/1072#issuecomment-719803880,https://api.github.com/repos/simonw/datasette/issues/1072,719803880,MDEyOklzc3VlQ29tbWVudDcxOTgwMzg4MA==,9599,simonw,2020-10-30T21:17:11Z,2020-10-30T21:17:11Z,OWNER,"Example from the Jinja docs: https://jinja.palletsprojects.com/en/2.11.x/api/#jinja2.BaseLoader ```python from jinja2 import BaseLoader, TemplateNotFound from os.path import join, exists, getmtime class MyLoader(BaseLoader): def __init__(self, path): self.path = path def get_source(self, environment, template): path = join(self.path, template) if not exists(path): raise TemplateNotFound(template) mtime = getmtime(path) with file(path) as f: source = f.read().decode('utf-8') return source, path, lambda: mtime == getmtime(path) ``` Also available: `jinja2.FunctionLoader(load_func)` which lets me pass it a function like this one: ``` >>> def load_template(name): ... if name == 'index.html': ... return '...' ... >>> loader = FunctionLoader(load_template) ``` Just one catch: I need to be able to load templates asynchronously, because they live in the database. Let's hope Jinja has a mechanism for that!","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",733499930,load_template hook doesn't work for include/extends, https://github.com/simonw/datasette/issues/1072#issuecomment-719785005,https://api.github.com/repos/simonw/datasette/issues/1072,719785005,MDEyOklzc3VlQ29tbWVudDcxOTc4NTAwNQ==,9599,simonw,2020-10-30T20:36:22Z,2020-10-30T20:36:22Z,OWNER,"It should be easy enough to show a comment that says which original template names were considered, but I may not be able to show which one was actually used (or which ones came from plugins).","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",733499930,load_template hook doesn't work for include/extends, https://github.com/simonw/datasette/issues/1072#issuecomment-719784606,https://api.github.com/repos/simonw/datasette/issues/1072,719784606,MDEyOklzc3VlQ29tbWVudDcxOTc4NDYwNg==,9599,simonw,2020-10-30T20:35:33Z,2020-10-30T20:35:33Z,OWNER,"To fix this I think I need to move the `load_template` implementation into a Jinja template loader. 
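A rough sketch of what that loader could look like - purely illustrative, and it only works if the hook can return template source synchronously, which is exactly the limitation discussed above (the class name and the fallback handling are assumptions):

```python
from jinja2 import BaseLoader

from datasette.plugins import pm


class PluginTemplateLoader(BaseLoader):
    # Illustrative sketch: route Jinja template lookups (including
    # {% extends %} and {% include %}) through the load_template hook,
    # falling back to the regular loader if no plugin responds.
    def __init__(self, datasette, fallback_loader):
        self.datasette = datasette
        self.fallback_loader = fallback_loader

    def get_source(self, environment, template):
        # The loader API has no access to the current request, hence request=None
        source = pm.hook.load_template(
            template=template, request=None, datasette=self.datasette
        )
        if source is not None:
            # No sensible uptodate check for plugin-provided templates
            return source, template, lambda: False
        return self.fallback_loader.get_source(environment, template)
```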
I'm not sure I'll be able to keep the `Templates considered` comment working though: https://github.com/simonw/datasette/blob/a2a709072059c6b3da365df9a332ca744c2079e9/datasette/app.py#L745-L750","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",733499930,load_template hook doesn't work for include/extends, https://github.com/simonw/datasette/issues/1071#issuecomment-719777499,https://api.github.com/repos/simonw/datasette/issues/1071,719777499,MDEyOklzc3VlQ29tbWVudDcxOTc3NzQ5OQ==,9599,simonw,2020-10-30T20:20:01Z,2020-10-30T20:20:01Z,OWNER,"Fixed: https://latest.datasette.io/-/messages  ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",733485423,Messages should be displayed full width, https://github.com/simonw/datasette/pull/1069#issuecomment-719657478,https://api.github.com/repos/simonw/datasette/issues/1069,719657478,MDEyOklzc3VlQ29tbWVudDcxOTY1NzQ3OA==,22429695,codecov[bot],2020-10-30T16:31:21Z,2020-10-30T17:46:36Z,NONE,"# [Codecov](https://codecov.io/gh/simonw/datasette/pull/1069?src=pr&el=h1) Report > Merging [#1069](https://codecov.io/gh/simonw/datasette/pull/1069?src=pr&el=desc) into [main](https://codecov.io/gh/simonw/datasette/commit/222f79bb4c6e2aa5426cc5ff25f1b2461e18a300?el=desc) will **increase** coverage by `0.01%`. > The diff coverage is `95.83%`. [](https://codecov.io/gh/simonw/datasette/pull/1069?src=pr&el=tree) ```diff @@ Coverage Diff @@ ## main #1069 +/- ## ========================================== + Coverage 91.30% 91.32% +0.01% ========================================== Files 29 29 Lines 3736 3756 +20 ========================================== + Hits 3411 3430 +19 - Misses 325 326 +1 ``` | [Impacted Files](https://codecov.io/gh/simonw/datasette/pull/1069?src=pr&el=tree) | Coverage Δ | | |---|---|---| | [datasette/views/base.py](https://codecov.io/gh/simonw/datasette/pull/1069/diff?src=pr&el=tree#diff-ZGF0YXNldHRlL3ZpZXdzL2Jhc2UucHk=) | `93.94% <ø> (-0.04%)` | :arrow_down: | | [datasette/app.py](https://codecov.io/gh/simonw/datasette/pull/1069/diff?src=pr&el=tree#diff-ZGF0YXNldHRlL2FwcC5weQ==) | `96.38% <95.45%> (-0.05%)` | :arrow_down: | | [datasette/hookspecs.py](https://codecov.io/gh/simonw/datasette/pull/1069/diff?src=pr&el=tree#diff-ZGF0YXNldHRlL2hvb2tzcGVjcy5weQ==) | `100.00% <100.00%> (ø)` | | ------ [Continue to review full report at Codecov](https://codecov.io/gh/simonw/datasette/pull/1069?src=pr&el=continue). > **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta) > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data` > Powered by [Codecov](https://codecov.io/gh/simonw/datasette/pull/1069?src=pr&el=footer). Last update [222f79b...92f3840](https://codecov.io/gh/simonw/datasette/pull/1069?src=pr&el=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments). 
","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",733303548,load_template() plugin hook, https://github.com/simonw/datasette/pull/1069#issuecomment-719672967,https://api.github.com/repos/simonw/datasette/issues/1069,719672967,MDEyOklzc3VlQ29tbWVudDcxOTY3Mjk2Nw==,9599,simonw,2020-10-30T16:58:01Z,2020-10-30T16:58:01Z,OWNER,"OK, new hook specification is: ```python @hookspec(firstresult=True) def load_template(template, request, datasette): ""Load the specified template, returning the template code as a string"" ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",733303548,load_template() plugin hook, https://github.com/simonw/datasette/pull/1069#issuecomment-719670714,https://api.github.com/repos/simonw/datasette/issues/1069,719670714,MDEyOklzc3VlQ29tbWVudDcxOTY3MDcxNA==,9599,simonw,2020-10-30T16:53:56Z,2020-10-30T16:53:56Z,OWNER,"I'm having second thoughts about the design of the plugin hook. Consider the following: ```python plugin_template_source = pm.hook.load_template( template=template_name, database=context.get(""database""), table=context.get(""table""), columns=context.get(""columns""), view_name=self.name, request=request, datasette=self.ds, ) ``` It's a bit gross that `database`, `table` and `columns` are pulled out of the context like that. This doesn't make sense for pages that are rendered by plugins, for example. So maybe for the first release of this plugin hook I should cut it down to just seeing `template`, `request` and `datasette`. I can add the table/view/etc stuff back in later if it turns out to be necessary.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",733303548,load_template() plugin hook, https://github.com/simonw/datasette/pull/1069#issuecomment-719666912,https://api.github.com/repos/simonw/datasette/issues/1069,719666912,MDEyOklzc3VlQ29tbWVudDcxOTY2NjkxMg==,9599,simonw,2020-10-30T16:47:44Z,2020-10-30T16:47:44Z,OWNER,"Bringing over a comment from #1042: > I'd like to do this all in the `datasette.render_template()` method to ensure it's available to plugins as well, not just core code that uses the `BaseView` class. > > This code is the problem: > > https://github.com/simonw/datasette/blob/d3e9b0aecb6f8e9b2befd9c654ccb7ce852db3e7/datasette/views/base.py#L114-L133 > > I think I'll fix this by moving the `select_templates` mechanism into `datasette.render_templates()`. 
","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",733303548,load_template() plugin hook, https://github.com/simonw/datasette/pull/1069#issuecomment-719664530,https://api.github.com/repos/simonw/datasette/issues/1069,719664530,MDEyOklzc3VlQ29tbWVudDcxOTY2NDUzMA==,9599,simonw,2020-10-30T16:43:40Z,2020-10-30T16:43:40Z,OWNER,I should include an example in the documentation that shows loading templates from a database table.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",733303548,load_template() plugin hook, https://github.com/simonw/datasette/pull/1069#issuecomment-719640430,https://api.github.com/repos/simonw/datasette/issues/1069,719640430,MDEyOklzc3VlQ29tbWVudDcxOTY0MDQzMA==,9599,simonw,2020-10-30T16:01:13Z,2020-10-30T16:01:13Z,OWNER,Next steps: build a demonstration plugin against this.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",733303548,load_template() plugin hook, https://github.com/simonw/datasette/issues/1068#issuecomment-719630745,https://api.github.com/repos/simonw/datasette/issues/1068,719630745,MDEyOklzc3VlQ29tbWVudDcxOTYzMDc0NQ==,9599,simonw,2020-10-30T15:44:13Z,2020-10-30T15:44:13Z,OWNER,Documentation: https://docs.datasette.io/en/latest/authentication.html#debug-menu,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",732939921,Default menu links should check a real permission , https://github.com/simonw/datasette/issues/1068#issuecomment-719332460,https://api.github.com/repos/simonw/datasette/issues/1068,719332460,MDEyOklzc3VlQ29tbWVudDcxOTMzMjQ2MA==,9599,simonw,2020-10-30T07:13:10Z,2020-10-30T07:13:10Z,OWNER,I mainly want this so I can add that debug menu to my Dogsheep.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",732939921,Default menu links should check a real permission , https://github.com/simonw/datasette/issues/1068#issuecomment-719331236,https://api.github.com/repos/simonw/datasette/issues/1068,719331236,MDEyOklzc3VlQ29tbWVudDcxOTMzMTIzNg==,9599,simonw,2020-10-30T07:11:58Z,2020-10-30T07:11:58Z,OWNER,Document the new permission here: https://docs.datasette.io/en/stable/authentication.html#built-in-permissions,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",732939921,Default menu links should check a real permission , https://github.com/simonw/datasette/issues/1068#issuecomment-719329219,https://api.github.com/repos/simonw/datasette/issues/1068,719329219,MDEyOklzc3VlQ29tbWVudDcxOTMyOTIxOQ==,9599,simonw,2020-10-30T07:09:59Z,2020-10-30T07:09:59Z,OWNER,Permission idea: `debug-menu`,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",732939921,Default menu links should check a real permission , https://github.com/simonw/datasette/issues/1068#issuecomment-719328661,https://api.github.com/repos/simonw/datasette/issues/1068,719328661,MDEyOklzc3VlQ29tbWVudDcxOTMyODY2MQ==,9599,simonw,2020-10-30T07:09:30Z,2020-10-30T07:09:30Z,OWNER,Then this can make it available to root: 
https://github.com/simonw/datasette/blob/18a64fbb29271ce607937110bbdb55488c43f4e0/datasette/default_permissions.py,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",732939921,Default menu links should check a real permission , https://github.com/simonw/datasette/issues/1067#issuecomment-719322666,https://api.github.com/repos/simonw/datasette/issues/1067,719322666,MDEyOklzc3VlQ29tbWVudDcxOTMyMjY2Ng==,9599,simonw,2020-10-30T07:04:02Z,2020-10-30T07:04:02Z,OWNER,"Maybe rename it to `actions_menu` and have it work for database, view, table and query pages using different arguments on each.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",732905360,"Table actions menu on view pages, not on query pages", https://github.com/simonw/datasette/issues/1067#issuecomment-719320948,https://api.github.com/repos/simonw/datasette/issues/1067,719320948,MDEyOklzc3VlQ29tbWVudDcxOTMyMDk0OA==,9599,simonw,2020-10-30T07:02:37Z,2020-10-30T07:02:37Z,OWNER,"Yes, this should be possible - no point restricting what plugin authors can do with the feature. Will need to add some extra arguments to the plugin hook for this.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",732905360,"Table actions menu on view pages, not on query pages", https://github.com/simonw/datasette/issues/690#issuecomment-719195346,https://api.github.com/repos/simonw/datasette/issues/690,719195346,MDEyOklzc3VlQ29tbWVudDcxOTE5NTM0Ng==,9599,simonw,2020-10-30T05:20:42Z,2020-10-30T05:20:42Z,OWNER,"I've now added two new plugin hooks: [menu_links()](https://docs.datasette.io/en/latest/plugin_hooks.html#menu-links-datasette-actor) and [table_actions()](https://docs.datasette.io/en/latest/plugin_hooks.html#table-actions-datasette-actor-database-table). I'm going to close this issue. Further work (on column actions and and database actions) can happen in separate tickets, but I won't include them in Datasette 0.51 since they're much less interesting than table and instance actions.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",573755726,Mechanism for plugins to add action menu items for various things, https://github.com/simonw/datasette/issues/1066#issuecomment-719194756,https://api.github.com/repos/simonw/datasette/issues/1066,719194756,MDEyOklzc3VlQ29tbWVudDcxOTE5NDc1Ng==,9599,simonw,2020-10-30T05:18:35Z,2020-10-30T05:18:35Z,OWNER,Documentation: https://docs.datasette.io/en/latest/plugin_hooks.html#table-actions-datasette-actor-database-table,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",732859030,Table actions menu plus plugin hook, https://github.com/simonw/datasette/issues/1066#issuecomment-719194619,https://api.github.com/repos/simonw/datasette/issues/1066,719194619,MDEyOklzc3VlQ29tbWVudDcxOTE5NDYxOQ==,9599,simonw,2020-10-30T05:18:04Z,2020-10-30T05:18:04Z,OWNER,"The cog only appears if at least one table action has been registered by a plugin. 
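A minimal `table_actions` implementation against the documented signature looks something like this (the target URL here is invented for the example):

```python
from datasette import hookimpl


@hookimpl
def table_actions(datasette, actor, database, table):
    # Only offer the action to signed-in users
    if actor is None:
        return []
    return [
        {
            'href': datasette.urls.path(
                '/-/edit-schema/{}/{}'.format(database, table)
            ),
            'label': 'Edit schema for this table',
        }
    ]
```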
It looks like this:  ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",732859030,Table actions menu plus plugin hook, https://github.com/simonw/datasette/issues/1066#issuecomment-719154646,https://api.github.com/repos/simonw/datasette/issues/1066,719154646,MDEyOklzc3VlQ29tbWVudDcxOTE1NDY0Ng==,9599,simonw,2020-10-30T03:48:15Z,2020-10-30T03:48:15Z,OWNER,This will use a very similar implementation to the navigation menu in #1064 - similar plugin hook and I'll use a `<details><summary>` to implement it.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",732859030,Table actions menu plus plugin hook, https://github.com/simonw/datasette/pull/1065#issuecomment-719153773,https://api.github.com/repos/simonw/datasette/issues/1065,719153773,MDEyOklzc3VlQ29tbWVudDcxOTE1Mzc3Mw==,22429695,codecov[bot],2020-10-30T03:44:57Z,2020-10-30T03:44:57Z,NONE,"# [Codecov](https://codecov.io/gh/simonw/datasette/pull/1065?src=pr&el=h1) Report > Merging [#1065](https://codecov.io/gh/simonw/datasette/pull/1065?src=pr&el=desc) into [main](https://codecov.io/gh/simonw/datasette/commit/1a861be19e326e0c88230a711a1b6536366697d7?el=desc) will **increase** coverage by `0.03%`. > The diff coverage is `100.00%`. [](https://codecov.io/gh/simonw/datasette/pull/1065?src=pr&el=tree) ```diff @@ Coverage Diff @@ ## main #1065 +/- ## ========================================== + Coverage 91.23% 91.27% +0.03% ========================================== Files 28 29 +1 Lines 3710 3724 +14 ========================================== + Hits 3385 3399 +14 Misses 325 325 ``` | [Impacted Files](https://codecov.io/gh/simonw/datasette/pull/1065?src=pr&el=tree) | Coverage Δ | | |---|---|---| | [datasette/plugins.py](https://codecov.io/gh/simonw/datasette/pull/1065/diff?src=pr&el=tree#diff-ZGF0YXNldHRlL3BsdWdpbnMucHk=) | `82.35% <ø> (ø)` | | | [datasette/app.py](https://codecov.io/gh/simonw/datasette/pull/1065/diff?src=pr&el=tree#diff-ZGF0YXNldHRlL2FwcC5weQ==) | `96.42% <100.00%> (+0.03%)` | :arrow_up: | | [datasette/default\_menu\_links.py](https://codecov.io/gh/simonw/datasette/pull/1065/diff?src=pr&el=tree#diff-ZGF0YXNldHRlL2RlZmF1bHRfbWVudV9saW5rcy5weQ==) | `100.00% <100.00%> (ø)` | | | [datasette/hookspecs.py](https://codecov.io/gh/simonw/datasette/pull/1065/diff?src=pr&el=tree#diff-ZGF0YXNldHRlL2hvb2tzcGVjcy5weQ==) | `100.00% <100.00%> (ø)` | | ------ [Continue to review full report at Codecov](https://codecov.io/gh/simonw/datasette/pull/1065?src=pr&el=continue). > **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta) > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data` > Powered by [Codecov](https://codecov.io/gh/simonw/datasette/pull/1065?src=pr&el=footer). Last update [1a861be...5f118b5](https://codecov.io/gh/simonw/datasette/pull/1065?src=pr&el=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments). 
","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",732856937,Nav menu plus menu_links() hook, https://github.com/simonw/datasette/issues/1064#issuecomment-719117185,https://api.github.com/repos/simonw/datasette/issues/1064,719117185,MDEyOklzc3VlQ29tbWVudDcxOTExNzE4NQ==,9599,simonw,2020-10-30T01:35:17Z,2020-10-30T01:35:17Z,OWNER,"I'm going to go with a list of `{""label"": ..., ""href"": ...}` as the first iteration of this. The logout link will not be returned as part of the plugin output. A default plugin will provide the debug tools if the user is logged in as root.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",732798913,Navigation menu plus plugin hook, https://github.com/simonw/datasette/issues/1064#issuecomment-719111597,https://api.github.com/repos/simonw/datasette/issues/1064,719111597,MDEyOklzc3VlQ29tbWVudDcxOTExMTU5Nw==,9599,simonw,2020-10-30T01:15:05Z,2020-10-30T01:15:05Z,OWNER,I'm torn on this one. I think I have a very slight preference for plugins returning structured objects as opposed to HTML. Less likely to regret that choice in the future?,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",732798913,Navigation menu plus plugin hook, https://github.com/simonw/datasette/issues/1064#issuecomment-719111373,https://api.github.com/repos/simonw/datasette/issues/1064,719111373,MDEyOklzc3VlQ29tbWVudDcxOTExMTM3Mw==,9599,simonw,2020-10-30T01:14:13Z,2020-10-30T01:14:13Z,OWNER,"Plugins returning HTML makes more sense for some of the other areas that plugins will be able to inject content - e.g. injecting content on the table or row page above the table. If I'm going to have that as a pattern though it may make sense to use HTML here, since that will be consistent with other places that plugins can inject additional content.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",732798913,Navigation menu plus plugin hook, https://github.com/simonw/datasette/issues/1064#issuecomment-719110808,https://api.github.com/repos/simonw/datasette/issues/1064,719110808,MDEyOklzc3VlQ29tbWVudDcxOTExMDgwOA==,9599,simonw,2020-10-30T01:12:09Z,2020-10-30T01:12:19Z,OWNER,Or... plugins could return HTML - maybe optionally using helper functions to generate common HTML such that plugins which use the helpers can have their HTML modified in the future.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",732798913,Navigation menu plus plugin hook, https://github.com/simonw/datasette/issues/1064#issuecomment-719110582,https://api.github.com/repos/simonw/datasette/issues/1064,719110582,MDEyOklzc3VlQ29tbWVudDcxOTExMDU4Mg==,9599,simonw,2020-10-30T01:11:13Z,2020-10-30T01:11:13Z,OWNER,"Should plugins be able to add forms like the logout form here, or should they be restricted to adding navigation links? I can't think of a reason a plugin would need to add a form. The logout form is a special case to protect against logout-csrf attacks. So I think plugins get to return a list of dictionaries, each with a `label` and an `href`: ```python return [{ ""label"": ""Upload CSVs"", ""href"": datasette.urls.path(""/-/upload-csvs"") }] ``` But... is there an argument for returning headings, to divide up the menu? 
I think so. I also like the idea that a default plugin checks for the `root` user and outputs links to the different debugging tools - maybe those should be wrapped in a section heading.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",732798913,Navigation menu plus plugin hook, https://github.com/simonw/datasette/issues/1064#issuecomment-719109770,https://api.github.com/repos/simonw/datasette/issues/1064,719109770,MDEyOklzc3VlQ29tbWVudDcxOTEwOTc3MA==,9599,simonw,2020-10-30T01:08:14Z,2020-10-30T01:08:14Z,OWNER,"How should the plugin hook work? Here's the first version of the HTML: ```html <div class=""nav-menu-inner""> <ul> <li><a href=""{{ urls.instance() }}"">Home</a></li> <li><a href=""{{ urls.path('/-/plugins') }}"">Installed plugins</a></li> <li><a href=""{{ urls.path('/-/versions') }}"">Software versions</a></li> <li><a href=""{{ urls.path('/-/metadata') }}"">Metadata</a></li> {% if show_logout %} <form action=""{{ urls.logout() }}"" method=""post""> <input type=""hidden"" name=""csrftoken"" value=""{{ csrftoken() }}""> <button class=""button-as-link"">Log out</button> </form>{% endif %} </ul> </div> ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",732798913,Navigation menu plus plugin hook, https://github.com/simonw/datasette/issues/1064#issuecomment-719106174,https://api.github.com/repos/simonw/datasette/issues/1064,719106174,MDEyOklzc3VlQ29tbWVudDcxOTEwNjE3NA==,9599,simonw,2020-10-30T00:55:12Z,2020-10-30T00:55:12Z,OWNER,"So what should go in this menu? If the user is logged in as root, I'll link to the various debug pages. If they're not logged in at all I don't think the menu should appear. If they are logged in as anyone, it should display to give them access to the ""log out"" button. Plugins can add links to it. 
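A sketch of that default plugin - check for the root actor and return the debug links (the exact set of links shown here is illustrative):

```python
from datasette import hookimpl


@hookimpl
def menu_links(datasette, actor):
    # Only the root user sees the debug section of the menu
    if actor is None or actor.get('id') != 'root':
        return []
    return [
        {'href': datasette.urls.path('/-/permissions'), 'label': 'Debug permissions'},
        {'href': datasette.urls.path('/-/allow-debug'), 'label': 'Debug allow rules'},
        {'href': datasette.urls.path('/-/messages'), 'label': 'Debug messages'},
    ]
```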
If those plugins add links, the menu will display.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",732798913,Navigation menu plus plugin hook, https://github.com/simonw/datasette/issues/1064#issuecomment-719105641,https://api.github.com/repos/simonw/datasette/issues/1064,719105641,MDEyOklzc3VlQ29tbWVudDcxOTEwNTY0MQ==,9599,simonw,2020-10-30T00:53:00Z,2020-10-30T00:53:00Z,OWNER,Tips for making this accessible: https://css-tricks.com/accessible-svgs/,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",732798913,Navigation menu plus plugin hook, https://github.com/simonw/datasette/issues/1064#issuecomment-719104883,https://api.github.com/repos/simonw/datasette/issues/1064,719104883,MDEyOklzc3VlQ29tbWVudDcxOTEwNDg4Mw==,9599,simonw,2020-10-30T00:50:01Z,2020-10-30T00:52:29Z,OWNER,"Here's what the prototype looks like so far:  ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",732798913,Navigation menu plus plugin hook, https://github.com/simonw/datasette/issues/1064#issuecomment-719105197,https://api.github.com/repos/simonw/datasette/issues/1064,719105197,MDEyOklzc3VlQ29tbWVudDcxOTEwNTE5Nw==,9599,simonw,2020-10-30T00:51:16Z,2020-10-30T00:51:16Z,OWNER,"I used a `<details><summary>` for this: https://github.com/simonw/datasette/blob/0d7ac764861d84be24d661cf4104ce61ea11a82a/datasette/templates/base.html#L16-L36 I added a bit of JavaScript so that clicking outside the menu would close it: https://github.com/simonw/datasette/blob/0d7ac764861d84be24d661cf4104ce61ea11a82a/datasette/templates/base.html#L59-L74","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",732798913,Navigation menu plus plugin hook, https://github.com/simonw/datasette/issues/1034#issuecomment-719094027,https://api.github.com/repos/simonw/datasette/issues/1034,719094027,MDEyOklzc3VlQ29tbWVudDcxOTA5NDAyNw==,9599,simonw,2020-10-30T00:11:17Z,2020-10-30T00:11:17Z,OWNER,"Demos: https://latest.datasette.io/fixtures/binary_data.csv?_size=max ```csv rowid,data 1,http://latest.datasette.io/fixtures/binary_data/1.blob?_blob_column=data 2,http://latest.datasette.io/fixtures/binary_data/2.blob?_blob_column=data 3, ``` https://latest.datasette.io/fixtures.csv?sql=select+rowid%2C+data+from+binary_data+order+by+rowid+limit+1001&_size=max ```csv rowid,data 1,http://latest.datasette.io/fixtures.blob?sql=select+rowid%2C+data+from+binary_data+order+by+rowid+limit+1001&_size=max&_blob_column=data&_blob_hash=f3088978da8f9aea479ffc7f631370b968d2e855eeb172bea7f6c7a04262bb6d 2,http://latest.datasette.io/fixtures.blob?sql=select+rowid%2C+data+from+binary_data+order+by+rowid+limit+1001&_size=max&_blob_column=data&_blob_hash=b835b0483cedb86130b9a2c280880bf5fadc5318ddf8c18d0df5204d40df1724 3, ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",725184645,Better way of representing binary data in .csv output, https://github.com/simonw/datasette/issues/1063#issuecomment-719066706,https://api.github.com/repos/simonw/datasette/issues/1063,719066706,MDEyOklzc3VlQ29tbWVudDcxOTA2NjcwNg==,9599,simonw,2020-10-29T22:46:28Z,2020-10-29T22:46:28Z,OWNER,I'm not going to do the base64 thing unless someone asks for it.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, 
""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",732685643,.csv should link to .blob downloads, https://github.com/simonw/datasette/issues/1051#issuecomment-719053669,https://api.github.com/repos/simonw/datasette/issues/1051,719053669,MDEyOklzc3VlQ29tbWVudDcxOTA1MzY2OQ==,9599,simonw,2020-10-29T22:12:16Z,2020-10-29T22:12:16Z,OWNER,"https://latest.datasette.io/fixtures?sql=select+rowid%2C+data+from+binary_data+order+by+rowid+limit+101 now looks like this: <img width=""816"" alt=""fixtures__select_rowid__data_from_binary_data_order_by_rowid_limit_101_and_understanding_Google_auths_-_for_your_information"" src=""https://user-images.githubusercontent.com/9599/97638086-15e7dc80-19f9-11eb-90a9-686e012c4db9.png""> ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",729096595,Better display of binary data on arbitrary query results page, https://github.com/simonw/datasette/issues/1034#issuecomment-719050754,https://api.github.com/repos/simonw/datasette/issues/1034,719050754,MDEyOklzc3VlQ29tbWVudDcxOTA1MDc1NA==,9599,simonw,2020-10-29T22:04:52Z,2020-10-29T22:04:52Z,OWNER,I'm going to link to. the new `.blob` representation using the new `?_blob_hash=xxx` argument to ensure that the content served is the expected binary blob.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",725184645,Better way of representing binary data in .csv output, https://github.com/simonw/datasette/issues/1063#issuecomment-719050390,https://api.github.com/repos/simonw/datasette/issues/1063,719050390,MDEyOklzc3VlQ29tbWVudDcxOTA1MDM5MA==,9599,simonw,2020-10-29T22:04:00Z,2020-10-29T22:04:00Z,OWNER,This will close #1034.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",732685643,.csv should link to .blob downloads, https://github.com/simonw/datasette/pull/1061#issuecomment-719049115,https://api.github.com/repos/simonw/datasette/issues/1061,719049115,MDEyOklzc3VlQ29tbWVudDcxOTA0OTExNQ==,22429695,codecov[bot],2020-10-29T22:00:57Z,2020-10-29T22:00:57Z,NONE,"# [Codecov](https://codecov.io/gh/simonw/datasette/pull/1061?src=pr&el=h1) Report > Merging [#1061](https://codecov.io/gh/simonw/datasette/pull/1061?src=pr&el=desc) into [main](https://codecov.io/gh/simonw/datasette/commit/d6f9ff71378c4eab34dad181c23cfc143a4aef2d?el=desc) will **increase** coverage by `0.07%`. > The diff coverage is `96.87%`. 
[](https://codecov.io/gh/simonw/datasette/pull/1061?src=pr&el=tree) ```diff @@ Coverage Diff @@ ## main #1061 +/- ## ========================================== + Coverage 91.13% 91.20% +0.07% ========================================== Files 27 28 +1 Lines 3677 3697 +20 ========================================== + Hits 3351 3372 +21 + Misses 326 325 -1 ``` | [Impacted Files](https://codecov.io/gh/simonw/datasette/pull/1061?src=pr&el=tree) | Coverage Δ | | |---|---|---| | [datasette/plugins.py](https://codecov.io/gh/simonw/datasette/pull/1061/diff?src=pr&el=tree#diff-ZGF0YXNldHRlL3BsdWdpbnMucHk=) | `82.35% <ø> (ø)` | | | [datasette/views/base.py](https://codecov.io/gh/simonw/datasette/pull/1061/diff?src=pr&el=tree#diff-ZGF0YXNldHRlL3ZpZXdzL2Jhc2UucHk=) | `93.77% <0.00%> (ø)` | | | [datasette/app.py](https://codecov.io/gh/simonw/datasette/pull/1061/diff?src=pr&el=tree#diff-ZGF0YXNldHRlL2FwcC5weQ==) | `96.38% <100.00%> (+0.15%)` | :arrow_up: | | [datasette/blob\_renderer.py](https://codecov.io/gh/simonw/datasette/pull/1061/diff?src=pr&el=tree#diff-ZGF0YXNldHRlL2Jsb2JfcmVuZGVyZXIucHk=) | `100.00% <100.00%> (ø)` | | | [datasette/utils/asgi.py](https://codecov.io/gh/simonw/datasette/pull/1061/diff?src=pr&el=tree#diff-ZGF0YXNldHRlL3V0aWxzL2FzZ2kucHk=) | `92.13% <100.00%> (+0.17%)` | :arrow_up: | | [datasette/views/database.py](https://codecov.io/gh/simonw/datasette/pull/1061/diff?src=pr&el=tree#diff-ZGF0YXNldHRlL3ZpZXdzL2RhdGFiYXNlLnB5) | `97.04% <100.00%> (+0.07%)` | :arrow_up: | | [datasette/views/table.py](https://codecov.io/gh/simonw/datasette/pull/1061/diff?src=pr&el=tree#diff-ZGF0YXNldHRlL3ZpZXdzL3RhYmxlLnB5) | `95.86% <100.00%> (-0.22%)` | :arrow_down: | ------ [Continue to review full report at Codecov](https://codecov.io/gh/simonw/datasette/pull/1061?src=pr&el=continue). > **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta) > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data` > Powered by [Codecov](https://codecov.io/gh/simonw/datasette/pull/1061?src=pr&el=footer). Last update [d6f9ff7...1196d08](https://codecov.io/gh/simonw/datasette/pull/1061?src=pr&el=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments). 
","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",732634375,.blob output renderer, https://github.com/simonw/datasette/pull/1061#issuecomment-719042601,https://api.github.com/repos/simonw/datasette/issues/1061,719042601,MDEyOklzc3VlQ29tbWVudDcxOTA0MjYwMQ==,9599,simonw,2020-10-29T21:45:35Z,2020-10-29T21:50:42Z,OWNER,Moving the CSV work to a separate issue.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",732634375,.blob output renderer, https://github.com/simonw/datasette/issues/1063#issuecomment-719043108,https://api.github.com/repos/simonw/datasette/issues/1063,719043108,MDEyOklzc3VlQ29tbWVudDcxOTA0MzEwOA==,9599,simonw,2020-10-29T21:46:48Z,2020-10-29T21:46:48Z,OWNER,Remove this `xfail` and `import pytest`: https://github.com/simonw/datasette/blob/503a5b7b4080a26ef9ceb1ecd1a4a6f4ef4ffc59/tests/test_csv.py#L83-L96,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",732685643,.csv should link to .blob downloads, https://github.com/simonw/datasette/pull/1061#issuecomment-719035336,https://api.github.com/repos/simonw/datasette/issues/1061,719035336,MDEyOklzc3VlQ29tbWVudDcxOTAzNTMzNg==,9599,simonw,2020-10-29T21:29:29Z,2020-10-29T21:29:29Z,OWNER,Those display_rows have already been processed by the `render_cell` plugin hook: https://github.com/simonw/datasette/blob/d6f9ff71378c4eab34dad181c23cfc143a4aef2d/datasette/views/database.py#L320-L346,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",732634375,.blob output renderer, https://github.com/simonw/datasette/pull/1061#issuecomment-719033013,https://api.github.com/repos/simonw/datasette/issues/1061,719033013,MDEyOklzc3VlQ29tbWVudDcxOTAzMzAxMw==,9599,simonw,2020-10-29T21:27:14Z,2020-10-29T21:27:14Z,OWNER,"Next challenge: link to `.blob` downloads from https://latest.datasette.io/fixtures?sql=select+rowid%2C+data+from+binary_data This will be a bit tricky. Here's how that template works at the moment: https://github.com/simonw/datasette/blob/d6f9ff71378c4eab34dad181c23cfc143a4aef2d/datasette/templates/query.html#L69-L77","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",732634375,.blob output renderer, https://github.com/simonw/datasette/issues/1062#issuecomment-719031901,https://api.github.com/repos/simonw/datasette/issues/1062,719031901,MDEyOklzc3VlQ29tbWVudDcxOTAzMTkwMQ==,9599,simonw,2020-10-29T21:25:54Z,2020-10-29T21:25:54Z,OWNER,Relevant code: https://github.com/simonw/datasette/blob/d6f9ff71378c4eab34dad181c23cfc143a4aef2d/datasette/views/base.py#L258-L345,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",732674148,Refactor .csv to be an output renderer - and teach register_output_renderer to stream all rows, https://github.com/simonw/datasette/issues/1050#issuecomment-719021514,https://api.github.com/repos/simonw/datasette/issues/1050,719021514,MDEyOklzc3VlQ29tbWVudDcxOTAyMTUxNA==,9599,simonw,2020-10-29T21:05:08Z,2020-10-29T21:05:08Z,OWNER,"Idea: what if Datasette had a custom SQLite function that could be used to generate URLs to the row-level BLOB download for a value? 
Then custom SQL query authors could use that function to link to the relevant content. This could be expanded to exposing other `datasette.urls` functionality as well.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",729057388,Switch to .blob render extension for BLOB downloads, https://github.com/simonw/datasette/issues/1050#issuecomment-719001701,https://api.github.com/repos/simonw/datasette/issues/1050,719001701,MDEyOklzc3VlQ29tbWVudDcxOTAwMTcwMQ==,9599,simonw,2020-10-29T20:26:44Z,2020-10-29T20:26:44Z,OWNER,I'll do the rest of the work on this in the pull request #1061.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",729057388,Switch to .blob render extension for BLOB downloads, https://github.com/simonw/datasette/issues/1050#issuecomment-718989895,https://api.github.com/repos/simonw/datasette/issues/1050,718989895,MDEyOklzc3VlQ29tbWVudDcxODk4OTg5NQ==,9599,simonw,2020-10-29T20:04:15Z,2020-10-29T20:04:15Z,OWNER,I'll use `hashlib.sha256` for these hashes.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",729057388,Switch to .blob render extension for BLOB downloads, https://github.com/simonw/datasette/issues/1050#issuecomment-718987852,https://api.github.com/repos/simonw/datasette/issues/1050,718987852,MDEyOklzc3VlQ29tbWVudDcxODk4Nzg1Mg==,9599,simonw,2020-10-29T20:00:32Z,2020-10-29T20:00:32Z,OWNER,"The reason I like the `?_blob_hash=` solution is that it feels really misleading to provide a link to ""download this binary"" which could conceivably download some other data.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",729057388,Switch to .blob render extension for BLOB downloads, https://github.com/simonw/datasette/issues/1050#issuecomment-718980944,https://api.github.com/repos/simonw/datasette/issues/1050,718980944,MDEyOklzc3VlQ29tbWVudDcxODk4MDk0NA==,9599,simonw,2020-10-29T19:46:19Z,2020-10-29T19:46:19Z,OWNER,"Had an idea in https://github.com/simonw/datasette/issues/1051#issuecomment-718980659 > OK, alternative idea. The `.blob` output renderer from #1050 gets to see multiple rows at once. > > For an arbitrary SQL query, how about if I link to this? > > `/db.blob?sql=...&_blob_column=data&_blob_hash=bc4c24181ed3ce666` > > Then the output renderer loops through all of the `data` results that are available to it and, if one of them hashes to that value, serves up that data? > > If no matches are found it can show an error message telling you that the link has expired (presumably because the underlying database has changed since the link was generated). > > I think this might be the best solution to the problem. ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",729057388,Switch to .blob render extension for BLOB downloads, https://github.com/simonw/datasette/issues/1051#issuecomment-718980659,https://api.github.com/repos/simonw/datasette/issues/1051,718980659,MDEyOklzc3VlQ29tbWVudDcxODk4MDY1OQ==,9599,simonw,2020-10-29T19:45:42Z,2020-10-29T19:45:42Z,OWNER,"OK, alternative idea. The `.blob` output renderer from #1050 gets to see multiple rows at once. For an arbitrary SQL query, how about if I link to this? 
`/db.blob?sql=...&_blob_column=data&_blob_hash=bc4c24181ed3ce666` Then the output renderer loops through all of the `data` results that are available to it and, if one of them hashes to that value, serves up that data? If no matches are found it can show an error message telling you that the link has expired (presumably because the underlying database has changed since the link was generated). I think this might be the best solution to the problem.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",729096595,Better display of binary data on arbitrary query results page, https://github.com/simonw/datasette/issues/1053#issuecomment-718976679,https://api.github.com/repos/simonw/datasette/issues/1053,718976679,MDEyOklzc3VlQ29tbWVudDcxODk3NjY3OQ==,9599,simonw,2020-10-29T19:37:57Z,2020-10-29T19:37:57Z,OWNER,https://docs.datasette.io/en/latest/writing_plugins.html#designing-urls-for-your-plugin,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",729604838,Document recommendations for plugin authors to design URLs, https://github.com/simonw/datasette/pull/1049#issuecomment-718528252,https://api.github.com/repos/simonw/datasette/issues/1049,718528252,MDEyOklzc3VlQ29tbWVudDcxODUyODI1Mg==,82988,psychemedia,2020-10-29T09:20:34Z,2020-10-29T09:20:34Z,CONTRIBUTOR,That workaround is probably fine. I was trying to work out whether there might be other situations where a pre-external package load might be useful but couldn't offhand bring any other examples to mind. The static plugins option also looks interesting.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",729017519,Add template block prior to extra URL loaders, https://github.com/simonw/datasette/issues/1050#issuecomment-718346019,https://api.github.com/repos/simonw/datasette/issues/1050,718346019,MDEyOklzc3VlQ29tbWVudDcxODM0NjAxOQ==,9599,simonw,2020-10-29T04:05:07Z,2020-10-29T04:05:07Z,OWNER,"Yes, confirmed - this is a bug where if the `BLOB` column contains a `null` you get a nasty exception if you try to download it.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",729057388,Switch to .blob render extension for BLOB downloads, https://github.com/simonw/datasette/issues/1050#issuecomment-718342036,https://api.github.com/repos/simonw/datasette/issues/1050,718342036,MDEyOklzc3VlQ29tbWVudDcxODM0MjAzNg==,9599,simonw,2020-10-29T03:49:57Z,2020-10-29T03:49:57Z,OWNER,"@thadk from that error it looks like the problem may have been that you had a BLOB column containing a `null` value? If so that's definitely a bug, I'll fix that.","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",729057388,Switch to .blob render extension for BLOB downloads, https://github.com/simonw/datasette/pull/1049#issuecomment-718340847,https://api.github.com/repos/simonw/datasette/issues/1049,718340847,MDEyOklzc3VlQ29tbWVudDcxODM0MDg0Nw==,9599,simonw,2020-10-29T03:45:47Z,2020-10-29T03:48:26Z,OWNER,"[thebe](https://thebelab.readthedocs.io/en/latest/examples/minimal_example.html) is the first time I've seen a library that requires you to set up some global JavaScript configuration before loading the script itself. 
I'm hesitant to add an extra template block just to cover that one case since it's such a rare pattern. But it's important that `thebelab` can be used with Datasette. Would this pattern work for you instead? ```html+jinja {% block extra_head %} <script type=""text/x-thebe-config""> { requestKernel: true, binderOptions: { repo: ""binder-examples/requirements"", }, } </script> <script src=""https://unpkg.com/thebelab@latest/lib/index.js""></script> {% endblock %} ``` ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",729017519,Add template block prior to extra URL loaders, https://github.com/simonw/datasette/pull/1049#issuecomment-718341542,https://api.github.com/repos/simonw/datasette/issues/1049,718341542,MDEyOklzc3VlQ29tbWVudDcxODM0MTU0Mg==,9599,simonw,2020-10-29T03:48:12Z,2020-10-29T03:48:12Z,OWNER,"You could use Datasette's new `{{ urls.static_plugins(...) }}` template option - see https://docs.datasette.io/en/latest/internals.html#internals-datasette-urls - to generate a link to code that was bundled with the plugin: ```html+jinja {% block extra_head %} <script type=""text/x-thebe-config""> { requestKernel: true, binderOptions: { repo: ""binder-examples/requirements"", }, } </script> <script src=""{{ urls.static_plugins(""datasette-thebelab-plugin"", ""thebelab-index.js"")""></script> {% endblock %} ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",729017519,Add template block prior to extra URL loaders, https://github.com/simonw/datasette/issues/1050#issuecomment-718317997,https://api.github.com/repos/simonw/datasette/issues/1050,718317997,MDEyOklzc3VlQ29tbWVudDcxODMxNzk5Nw==,283343,thadk,2020-10-29T02:24:50Z,2020-10-29T02:29:24Z,NONE,"Unsolicited feedback for an unreleased feature of the [current](https://github.com/simonw/datasette/commit/5e0b72247ecab4ce0fcec599b77a83d73a480872) unreleased GitHub version (I casually wanted to access a blob row) – the existing #1036 route doesn't support special characters in database or table names (e.g. `@()` ). Maybe this is motivation for your new idea here. Also I got this error/crash with my blob and wasn't able to get the file: https://gist.github.com/thadk/28ac32af0e88747ce9056c90b0b19d34","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",729057388,Switch to .blob render extension for BLOB downloads, https://github.com/simonw/datasette/pull/1060#issuecomment-718243062,https://api.github.com/repos/simonw/datasette/issues/1060,718243062,MDEyOklzc3VlQ29tbWVudDcxODI0MzA2Mg==,22429695,codecov[bot],2020-10-28T22:23:33Z,2020-10-28T22:23:33Z,NONE,"# [Codecov](https://codecov.io/gh/simonw/datasette/pull/1060?src=pr&el=h1) Report > Merging [#1060](https://codecov.io/gh/simonw/datasette/pull/1060?src=pr&el=desc) into [main](https://codecov.io/gh/simonw/datasette/commit/abcf0222496d8148b2e585ffa0ff192270a04b06?el=desc) will **increase** coverage by `6.42%`. > The diff coverage is `100.00%`. 
[](https://codecov.io/gh/simonw/datasette/pull/1060?src=pr&el=tree) ```diff @@ Coverage Diff @@ ## main #1060 +/- ## ========================================== + Coverage 84.71% 91.13% +6.42% ========================================== Files 28 27 -1 Lines 3957 3677 -280 ========================================== - Hits 3352 3351 -1 + Misses 605 326 -279 ``` | [Impacted Files](https://codecov.io/gh/simonw/datasette/pull/1060?src=pr&el=tree) | Coverage Δ | | |---|---|---| | [datasette/cli.py](https://codecov.io/gh/simonw/datasette/pull/1060/diff?src=pr&el=tree#diff-ZGF0YXNldHRlL2NsaS5weQ==) | `73.63% <100.00%> (+0.13%)` | :arrow_up: | | [datasette/version.py](https://codecov.io/gh/simonw/datasette/pull/1060/diff?src=pr&el=tree#diff-ZGF0YXNldHRlL3ZlcnNpb24ucHk=) | `100.00% <100.00%> (ø)` | | ------ [Continue to review full report at Codecov](https://codecov.io/gh/simonw/datasette/pull/1060?src=pr&el=continue). > **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta) > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data` > Powered by [Codecov](https://codecov.io/gh/simonw/datasette/pull/1060?src=pr&el=footer). Last update [abcf022...4725d46](https://codecov.io/gh/simonw/datasette/pull/1060?src=pr&el=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments). ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",731827081,New explicit versioning mechanism, https://github.com/simonw/sqlite-utils/issues/191#issuecomment-718170295,https://api.github.com/repos/simonw/sqlite-utils/issues/191,718170295,MDEyOklzc3VlQ29tbWVudDcxODE3MDI5NQ==,9599,simonw,2020-10-28T19:50:16Z,2020-10-28T19:50:16Z,OWNER,"I think I made a mistake when I designed the initial decorator. I should have had it work like this: ```python @db.register_function() def reverse_string(s): return """".join(reversed(list(s))) ``` As this leaves open the option to add new parameters in the future. 
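A sketch of how that could look on a stripped-down stand-in for the sqlite-utils Database class - supporting both call styles plus the new keyword (the implementation details here are assumptions):

```python
import inspect
import sqlite3


class Database:
    # Stand-in for sqlite_utils.Database, just enough to show the decorator
    def __init__(self, path=':memory:'):
        self.conn = sqlite3.connect(path)

    def register_function(self, fn=None, deterministic=False):
        def register(fn):
            # deterministic=True needs Python 3.8+ and SQLite 3.8.3+
            kwargs = {'deterministic': True} if deterministic else {}
            n_args = len(inspect.signature(fn).parameters)
            self.conn.create_function(fn.__name__, n_args, fn, **kwargs)
            return fn
        if fn is None:
            # Called as @db.register_function(...) with parentheses
            return register
        # Called bare as @db.register_function
        return register(fn)
```

Both `@db.register_function` and `@db.register_function(deterministic=True)` then work against the same method.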
To avoid breaking backwards compatibility I'll use the hack that detects the argument this time, but in the future I'll try to remember to always design decorators to be called like `@decorator()`.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",731740458,Idea: @db.register_function(deterministic=True), https://github.com/simonw/sqlite-utils/issues/191#issuecomment-718168730,https://api.github.com/repos/simonw/sqlite-utils/issues/191,718168730,MDEyOklzc3VlQ29tbWVudDcxODE2ODczMA==,9599,simonw,2020-10-28T19:47:20Z,2020-10-28T19:47:20Z,OWNER,"https://stackoverflow.com/a/3931903 looks useful: ```python def trace(*args): def _trace(func): def wrapper(*args, **kwargs): print enter_string func(*args, **kwargs) print exit_string return wrapper if len(args) == 1 and callable(args[0]): # No arguments, this is the decorator # Set default values for the arguments enter_string = 'entering' exit_string = 'exiting' return _trace(args[0]) else: # This is just returning the decorator enter_string, exit_string = args return _trace ``` Can improve that code with `functools.wraps`.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",731740458,Idea: @db.register_function(deterministic=True), https://github.com/simonw/datasette/pull/1059#issuecomment-718078447,https://api.github.com/repos/simonw/datasette/issues/1059,718078447,MDEyOklzc3VlQ29tbWVudDcxODA3ODQ0Nw==,9599,simonw,2020-10-28T17:07:59Z,2020-10-28T17:08:14Z,OWNER,"> #### 0.6.0 (2020-10-27) > > - aiofiles is now tested on ppc64le. > - Added name and mode properties to async file objects. [#82](https://github.com/Tinche/aiofiles/pull/82) > - Fixed a DeprecationWarning internally. [#75](https://github.com/Tinche/aiofiles/pull/75) > - Python 3.9 support and tests.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",731445447,"Update aiofiles requirement from <0.6,>=0.4 to >=0.4,<0.7", https://github.com/simonw/datasette/pull/1059#issuecomment-717938992,https://api.github.com/repos/simonw/datasette/issues/1059,717938992,MDEyOklzc3VlQ29tbWVudDcxNzkzODk5Mg==,22429695,codecov[bot],2020-10-28T13:38:46Z,2020-10-28T13:38:46Z,NONE,"# [Codecov](https://codecov.io/gh/simonw/datasette/pull/1059?src=pr&el=h1) Report > Merging [#1059](https://codecov.io/gh/simonw/datasette/pull/1059?src=pr&el=desc) into [main](https://codecov.io/gh/simonw/datasette/commit/7d9fedc176717a7e3d22a96575ae0aada5a65440?el=desc) will **not change** coverage. > The diff coverage is `n/a`. [](https://codecov.io/gh/simonw/datasette/pull/1059?src=pr&el=tree) ```diff @@ Coverage Diff @@ ## main #1059 +/- ## ======================================= Coverage 84.71% 84.71% ======================================= Files 28 28 Lines 3957 3957 ======================================= Hits 3352 3352 Misses 605 605 ``` ------ [Continue to review full report at Codecov](https://codecov.io/gh/simonw/datasette/pull/1059?src=pr&el=continue). > **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta) > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data` > Powered by [Codecov](https://codecov.io/gh/simonw/datasette/pull/1059?src=pr&el=footer). Last update [7d9fedc...e46327a](https://codecov.io/gh/simonw/datasette/pull/1059?src=pr&el=lastupdated). 
Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments). ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",731445447,"Update aiofiles requirement from <0.6,>=0.4 to >=0.4,<0.7", https://github.com/simonw/datasette/issues/1057#issuecomment-717531272,https://api.github.com/repos/simonw/datasette/issues/1057,717531272,MDEyOklzc3VlQ29tbWVudDcxNzUzMTI3Mg==,9599,simonw,2020-10-27T20:51:09Z,2020-10-27T20:51:09Z,OWNER,"That works! <img width=""1370"" alt=""Banners_and_Alerts_and_SQLite___Mike_Bostock___Observable"" src=""https://user-images.githubusercontent.com/9599/97360439-717c6380-185b-11eb-9996-e0064e4b22d4.png""> ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",730797787,--cors should enable /fixtures.db CORS access, https://github.com/simonw/datasette/issues/1058#issuecomment-717527606,https://api.github.com/repos/simonw/datasette/issues/1058,717527606,MDEyOklzc3VlQ29tbWVudDcxNzUyNzYwNg==,9599,simonw,2020-10-27T20:44:06Z,2020-10-27T20:44:06Z,OWNER,Example: https://github.com/simonw/datasette/blob/5a1519796037105bc20bcf2f91a76e022926c204/datasette/views/database.py#L26-L32,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",730802994,Database download should implement cascading permissions, https://github.com/simonw/datasette/pull/1056#issuecomment-717489501,https://api.github.com/repos/simonw/datasette/issues/1056,717489501,MDEyOklzc3VlQ29tbWVudDcxNzQ4OTUwMQ==,22429695,codecov[bot],2020-10-27T19:39:41Z,2020-10-27T19:39:41Z,NONE,"# [Codecov](https://codecov.io/gh/simonw/datasette/pull/1056?src=pr&el=h1) Report > Merging [#1056](https://codecov.io/gh/simonw/datasette/pull/1056?src=pr&el=desc) into [main](https://codecov.io/gh/simonw/datasette/commit/26bb4a268127da2c38f4241abe45444b2a6f7874?el=desc) will **not change** coverage. > The diff coverage is `n/a`. [](https://codecov.io/gh/simonw/datasette/pull/1056?src=pr&el=tree) ```diff @@ Coverage Diff @@ ## main #1056 +/- ## ======================================= Coverage 84.70% 84.70% ======================================= Files 28 28 Lines 3955 3955 ======================================= Hits 3350 3350 Misses 605 605 ``` ------ [Continue to review full report at Codecov](https://codecov.io/gh/simonw/datasette/pull/1056?src=pr&el=continue). > **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta) > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data` > Powered by [Codecov](https://codecov.io/gh/simonw/datasette/pull/1056?src=pr&el=footer). Last update [26bb4a2...a7b2aab](https://codecov.io/gh/simonw/datasette/pull/1056?src=pr&el=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments). 
","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",730752399,"Radical new colour scheme and base styles, courtesy of @natbat", https://github.com/simonw/sqlite-utils/pull/189#issuecomment-717361487,https://api.github.com/repos/simonw/sqlite-utils/issues/189,717361487,MDEyOklzc3VlQ29tbWVudDcxNzM2MTQ4Nw==,9599,simonw,2020-10-27T16:24:04Z,2020-10-27T16:24:04Z,OWNER,"This is great, thank you very much.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",729818242,Allow iterables other than Lists in m2m records, https://github.com/simonw/sqlite-utils/pull/189#issuecomment-717359145,https://api.github.com/repos/simonw/sqlite-utils/issues/189,717359145,MDEyOklzc3VlQ29tbWVudDcxNzM1OTE0NQ==,35681,adamwolf,2020-10-27T16:20:32Z,2020-10-27T16:20:32Z,CONTRIBUTOR,"No problem. I added a test. Let me know if it looks sufficient or if you want me to to tweak something! If you don't mind, would you tag this PR as ""hacktoberfest-accepted""? If you do mind, no problem and I'm sorry for asking :) My kiddos like the shirts.","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",729818242,Allow iterables other than Lists in m2m records, https://github.com/simonw/datasette/issues/1054#issuecomment-717051707,https://api.github.com/repos/simonw/datasette/issues/1054,717051707,MDEyOklzc3VlQ29tbWVudDcxNzA1MTcwNw==,9599,simonw,2020-10-27T07:41:21Z,2020-10-27T07:41:21Z,OWNER,Essentially it's this problem: https://github.com/python-versioneer/python-versioneer/issues/140,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",730199464,Switch from versioneer to concrete version in setup.py, https://github.com/simonw/datasette/issues/1054#issuecomment-717050585,https://api.github.com/repos/simonw/datasette/issues/1054,717050585,MDEyOklzc3VlQ29tbWVudDcxNzA1MDU4NQ==,9599,simonw,2020-10-27T07:38:50Z,2020-10-27T07:38:50Z,OWNER,"Maybe imitate how Django does this, e.g. https://github.com/django/django/commit/6b9b2af7352908d40ca4d31bdb1b80c013cab29a","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",730199464,Switch from versioneer to concrete version in setup.py, https://github.com/simonw/sqlite-utils/pull/189#issuecomment-716756103,https://api.github.com/repos/simonw/sqlite-utils/issues/189,716756103,MDEyOklzc3VlQ29tbWVudDcxNjc1NjEwMw==,9599,simonw,2020-10-26T18:56:19Z,2020-10-26T18:56:19Z,OWNER,"This is a great fix, thanks! 
If you add a unit test somewhere in here I'll merge the PR: https://github.com/simonw/sqlite-utils/blob/main/tests/test_m2m.py","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",729818242,Allow iterables other than Lists in m2m records, https://github.com/simonw/datasette/issues/1051#issuecomment-716681602,https://api.github.com/repos/simonw/datasette/issues/1051,716681602,MDEyOklzc3VlQ29tbWVudDcxNjY4MTYwMg==,9599,simonw,2020-10-26T16:51:58Z,2020-10-26T16:51:58Z,OWNER,"I still need to improve the current binary display on the query page though, where it outputs a Python `b'...'` literal.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",729096595,Better display of binary data on arbitrary query results page, https://github.com/simonw/datasette/issues/1051#issuecomment-716681167,https://api.github.com/repos/simonw/datasette/issues/1051,716681167,MDEyOklzc3VlQ29tbWVudDcxNjY4MTE2Nw==,9599,simonw,2020-10-26T16:51:15Z,2020-10-26T16:51:15Z,OWNER,"Crazy idea: generate a signed URL containing a base64 of the gzip of the binary content (to try and reduce size). No: this will blow through URL limits in various hosting providers and possibly even browsers. It could be made to work a little bit more reliably with some extra JavaScript that turns it into a download on the browser-side, but that would be hideously complicated. Also the signed bit doesn't prevent people from generating SQL queries that generate nasty binary blobs for download. I'm beginning to think that restricting this feature to just table view, not query view, is a better idea. Query view can still get at the binary using JSON and base64.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",729096595,Better display of binary data on arbitrary query results page, https://github.com/simonw/datasette/issues/976#issuecomment-716305890,https://api.github.com/repos/simonw/datasette/issues/976,716305890,MDEyOklzc3VlQ29tbWVudDcxNjMwNTg5MA==,9599,simonw,2020-10-26T05:07:10Z,2020-10-26T05:07:10Z,OWNER,"I used the new `datasette.urls` methods to handle escaping table names. 
https://github.com/simonw/datasette/blob/f5dbe61a4568c0915ec6be820095c2960cf0857c/datasette/utils/__init__.py#L996-L1008","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",708289783,Idea: -o could open to a more convenient location, https://github.com/simonw/datasette/issues/1052#issuecomment-716265360,https://api.github.com/repos/simonw/datasette/issues/1052,716265360,MDEyOklzc3VlQ29tbWVudDcxNjI2NTM2MA==,9599,simonw,2020-10-26T02:17:58Z,2020-10-26T02:17:58Z,OWNER,"The default z-index values for Leaflet are defined here: https://github.com/Leaflet/Leaflet/blob/b346bb8bf7bb80899baa1f4fc1536bae58e7e3e6/dist/leaflet.css#L81-L91 ```css .leaflet-pane { z-index: 400; } .leaflet-tile-pane { z-index: 200; } .leaflet-overlay-pane { z-index: 400; } .leaflet-shadow-pane { z-index: 500; } .leaflet-marker-pane { z-index: 600; } .leaflet-tooltip-pane { z-index: 650; } .leaflet-popup-pane { z-index: 700; } .leaflet-map-pane canvas { z-index: 100; } .leaflet-map-pane svg { z-index: 200; } ``` So a `z-index` of 1000 on the menu should fix this.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",729183332,Column action menu overlapped by Leaflet maps, https://github.com/simonw/datasette/pull/1043#issuecomment-716237524,https://api.github.com/repos/simonw/datasette/issues/1043,716237524,MDEyOklzc3VlQ29tbWVudDcxNjIzNzUyNA==,45380,bollwyvl,2020-10-26T00:14:57Z,2020-10-26T00:14:57Z,CONTRIBUTOR,"Sorry, I was out of the loop this weekend. The missing sdists were in some of the `datasette-*` plugins... i'll capture my findings more concretely in one spot when i have a chance...","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",727915394,Include LICENSE in sdist, https://github.com/simonw/datasette/issues/1051#issuecomment-716204271,https://api.github.com/repos/simonw/datasette/issues/1051,716204271,MDEyOklzc3VlQ29tbWVudDcxNjIwNDI3MQ==,9599,simonw,2020-10-25T20:08:04Z,2020-10-25T20:08:04Z,OWNER,"This is bad though, because if I want to provide binary data in CSV as requested in #1034 I need some way of providing that data. Which suggests to me that the base64 option is the only one that can make sense for arbitrary SQL queries represented as CSV. Download links won't work.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",729096595,Better display of binary data on arbitrary query results page, https://github.com/simonw/datasette/issues/1051#issuecomment-716204090,https://api.github.com/repos/simonw/datasette/issues/1051,716204090,MDEyOklzc3VlQ29tbWVudDcxNjIwNDA5MA==,9599,simonw,2020-10-25T20:06:42Z,2020-10-25T20:06:42Z,OWNER,"Providing a binary download link here is actually extremely difficult. The problem is that the SQL query itself represents data that can change from one moment to the next. It's no good showing a ""Binary: 55 bytes"" message that links to that same SQL query but with a `.blob` extension and arguments to select the particular result, because the data may change in a way that causes that query to return a different row - at which point the download link will give you the wrong data, not the 55 bytes you asked for. 
So providing a download link risks being misleading.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",729096595,Better display of binary data on arbitrary query results page, https://github.com/simonw/datasette/issues/1050#issuecomment-716174203,https://api.github.com/repos/simonw/datasette/issues/1050,716174203,MDEyOklzc3VlQ29tbWVudDcxNjE3NDIwMw==,9599,simonw,2020-10-25T16:27:39Z,2020-10-25T16:53:27Z,OWNER,"Idea: `.blob` output renderer, where you tell it which column you want using `?_blob_column=x`.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",729057388,Switch to .blob render extension for BLOB downloads, https://github.com/simonw/datasette/issues/1050#issuecomment-716175236,https://api.github.com/repos/simonw/datasette/issues/1050,716175236,MDEyOklzc3VlQ29tbWVudDcxNjE3NTIzNg==,9599,simonw,2020-10-25T16:35:20Z,2020-10-25T16:35:20Z,OWNER,"This is clearly a better solution than the one I implemented in #1040 - I don't have to add a new route, I don't have to implement permission checks, it reuses an existing mechanism.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",729057388,Switch to .blob render extension for BLOB downloads, https://github.com/simonw/datasette/pull/1049#issuecomment-716146238,https://api.github.com/repos/simonw/datasette/issues/1049,716146238,MDEyOklzc3VlQ29tbWVudDcxNjE0NjIzOA==,22429695,codecov[bot],2020-10-25T13:13:32Z,2020-10-25T13:13:32Z,NONE,"# [Codecov](https://codecov.io/gh/simonw/datasette/pull/1049?src=pr&el=h1) Report > Merging [#1049](https://codecov.io/gh/simonw/datasette/pull/1049?src=pr&el=desc) into [main](https://codecov.io/gh/simonw/datasette/commit/42f4851e3e7885f1092f104d6c883cea40b12f02?el=desc) will **not change** coverage. > The diff coverage is `n/a`. [](https://codecov.io/gh/simonw/datasette/pull/1049?src=pr&el=tree) ```diff @@ Coverage Diff @@ ## main #1049 +/- ## ======================================= Coverage 84.72% 84.72% ======================================= Files 28 28 Lines 3942 3942 ======================================= Hits 3340 3340 Misses 602 602 ``` ------ [Continue to review full report at Codecov](https://codecov.io/gh/simonw/datasette/pull/1049?src=pr&el=continue). > **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta) > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data` > Powered by [Codecov](https://codecov.io/gh/simonw/datasette/pull/1049?src=pr&el=footer). Last update [42f4851...50a743a](https://codecov.io/gh/simonw/datasette/pull/1049?src=pr&el=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments). ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",729017519,Add template block prior to extra URL loaders, https://github.com/simonw/datasette/issues/838#issuecomment-716123598,https://api.github.com/repos/simonw/datasette/issues/838,716123598,MDEyOklzc3VlQ29tbWVudDcxNjEyMzU5OA==,82988,psychemedia,2020-10-25T10:20:12Z,2020-10-25T10:53:24Z,CONTRIBUTOR,"I'm trying to [run something behind a MyBinder proxy](https://github.com/ouseful-testing/nbsearch), but seem to have something set up incorrectly and not sure what the fix is? 
I'm starting datasette with jupyter-server-proxy setup: ``` # __init__.py def setup_nbsearch(): return { ""command"": [ ""datasette"", ""serve"", f""{_NBSEARCH_DB_PATH}"", ""-p"", ""{port}"", ""--config"", ""base_url:{base_url}nbsearch/"" ], ""absolute_url"": True, # The following needs the labextension installing. # eg in postBuild: jupyter labextension install jupyterlab-server-proxy ""launcher_entry"": { ""enabled"": True, ""title"": ""nbsearch"", }, } ``` where the `base_url` gets automatically populated by the server-proxy. I define the loaders as: ``` # __init__.py from datasette import hookimpl @hookimpl def extra_css_urls(database, table, columns, view_name, datasette): return [ ""/-/static-plugins/nbsearch/prism.css"", ""/-/static-plugins/nbsearch/nbsearch.css"", ] ``` but these seem to also need a base_url prefix set somehow? Currently, the generated HTML loads properly but internal links are incorrect; eg they take the form `<link rel=""stylesheet"" href=""/-/static-plugins/nbsearch/prism.css"">` which resolves to eg `https://notebooks.gesis.org/hub/-/static-plugins/nbsearch/prism.css` rather than the required URL of the form `https://notebooks.gesis.org/binder/jupyter/user/ouseful-testing-nbsearch-0fx1mx67/nbsearch/-/static-plugins/nbsearch/prism.css`. The main css is loaded correctly: `<link rel=""stylesheet"" href=""/binder/jupyter/user/ouseful-testing-nbsearch-0fx1mx67/nbsearch/-/static/app.css?404439"">`","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",637395097,Incorrect URLs when served behind a proxy with base_url set, https://github.com/simonw/datasette/issues/1034#issuecomment-716078777,https://api.github.com/repos/simonw/datasette/issues/1034,716078777,MDEyOklzc3VlQ29tbWVudDcxNjA3ODc3Nw==,9599,simonw,2020-10-25T01:25:11Z,2020-10-25T01:25:11Z,OWNER,"SQLite actually has APIs that could help here: https://www.sqlite.org/c3ref/column_database_name.html - for any given SQL query they identify the origin/table/column that is the source of each resulting column. 
Those aren't exposed in the Python `sqlite3` module though, so using them could be extremely tricky.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",725184645,Better way of representing binary data in .csv output, https://github.com/simonw/datasette/issues/1034#issuecomment-716078605,https://api.github.com/repos/simonw/datasette/issues/1034,716078605,MDEyOklzc3VlQ29tbWVudDcxNjA3ODYwNQ==,9599,simonw,2020-10-25T01:22:22Z,2020-10-25T01:22:22Z,OWNER,For arbitrary CSV the only solution I can think of is to embed the base64 value.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",725184645,Better way of representing binary data in .csv output, https://github.com/simonw/datasette/issues/1034#issuecomment-716078512,https://api.github.com/repos/simonw/datasette/issues/1034,716078512,MDEyOklzc3VlQ29tbWVudDcxNjA3ODUxMg==,9599,simonw,2020-10-25T01:21:11Z,2020-10-25T01:21:11Z,OWNER,"What should happen for CSV export of arbitrary SQL queries, where there's no obvious BLOB to link to?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",725184645,Better way of representing binary data in .csv output, https://github.com/simonw/datasette/issues/1034#issuecomment-716078420,https://api.github.com/repos/simonw/datasette/issues/1034,716078420,MDEyOklzc3VlQ29tbWVudDcxNjA3ODQyMA==,9599,simonw,2020-10-25T01:20:00Z,2020-10-25T01:20:00Z,OWNER,That documentation: https://docs.datasette.io/en/latest/internals.html#absolute-url-request-path,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",725184645,Better way of representing binary data in .csv output, https://github.com/simonw/datasette/issues/1034#issuecomment-716077541,https://api.github.com/repos/simonw/datasette/issues/1034,716077541,MDEyOklzc3VlQ29tbWVudDcxNjA3NzU0MQ==,9599,simonw,2020-10-25T01:09:38Z,2020-10-25T01:10:04Z,OWNER,I should turn `datasette.absolute_url(...)` into a documented internal API on https://docs.datasette.io/en/stable/internals.html#datasette-class,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",725184645,Better way of representing binary data in .csv output, https://github.com/simonw/datasette/issues/1034#issuecomment-716077508,https://api.github.com/repos/simonw/datasette/issues/1034,716077508,MDEyOklzc3VlQ29tbWVudDcxNjA3NzUwOA==,9599,simonw,2020-10-25T01:09:17Z,2020-10-25T01:09:17Z,OWNER,Here's how those absolute `next_url` values are generated: https://github.com/simonw/datasette/blob/5db7ae3ce165ded57c7fb1cfbdb3258b1cf06c10/datasette/views/table.py#L774-L776,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",725184645,Better way of representing binary data in .csv output, https://github.com/simonw/datasette/issues/1034#issuecomment-716077436,https://api.github.com/repos/simonw/datasette/issues/1034,716077436,MDEyOklzc3VlQ29tbWVudDcxNjA3NzQzNg==,9599,simonw,2020-10-25T01:08:35Z,2020-10-25T01:08:42Z,OWNER,"This is actually a bit tricky to implement, for a few reasons: - Need to generate a full URL, including the `https://host/` bit. I've done this for `next_url` in the JSON output before, thankfully. 
- This only makes sense for CSV output for tables. If it's the CSV output of an arbitrary query there's no `/db/table/-/blob/pk/column.blob` page for me to link to. - Need to generate those `/.../-/blob/...` URLs for the data that is being output as CSV.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",725184645,Better way of representing binary data in .csv output, https://github.com/simonw/datasette/issues/1034#issuecomment-713277810,https://api.github.com/repos/simonw/datasette/issues/1034,713277810,MDEyOklzc3VlQ29tbWVudDcxMzI3NzgxMA==,9599,simonw,2020-10-21T03:40:50Z,2020-10-25T01:01:23Z,OWNER,Blocked awaiting #1036 (update: now unblocked),"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",725184645,Better way of representing binary data in .csv output, https://github.com/simonw/datasette/issues/1046#issuecomment-716071507,https://api.github.com/repos/simonw/datasette/issues/1046,716071507,MDEyOklzc3VlQ29tbWVudDcxNjA3MTUwNw==,9599,simonw,2020-10-25T00:06:47Z,2020-10-25T00:06:47Z,OWNER,"I used https://primer.style/octicons/download-16 instead. <img width=""734"" alt=""favicons__favicons__71_rows"" src=""https://user-images.githubusercontent.com/9599/97096021-44417280-161b-11eb-9bc4-88def6afaca4.png""> ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",728895193,Link to blob downloads in the right places, https://github.com/simonw/datasette/pull/1040#issuecomment-713920562,https://api.github.com/repos/simonw/datasette/issues/1040,713920562,MDEyOklzc3VlQ29tbWVudDcxMzkyMDU2Mg==,22429695,codecov[bot],2020-10-21T22:44:12Z,2020-10-24T23:08:14Z,NONE,"# [Codecov](https://codecov.io/gh/simonw/datasette/pull/1040?src=pr&el=h1) Report > Merging [#1040](https://codecov.io/gh/simonw/datasette/pull/1040?src=pr&el=desc) into [main](https://codecov.io/gh/simonw/datasette/commit/bf82b3d6a605c9ddadd5fb739249dfe6defaf635?el=desc) will **increase** coverage by `0.10%`. > The diff coverage is `100.00%`. 
[](https://codecov.io/gh/simonw/datasette/pull/1040?src=pr&el=tree) ```diff @@ Coverage Diff @@ ## main #1040 +/- ## ========================================== + Coverage 84.65% 84.76% +0.10% ========================================== Files 28 28 Lines 3924 3938 +14 ========================================== + Hits 3322 3338 +16 + Misses 602 600 -2 ``` | [Impacted Files](https://codecov.io/gh/simonw/datasette/pull/1040?src=pr&el=tree) | Coverage Δ | | |---|---|---| | [datasette/views/index.py](https://codecov.io/gh/simonw/datasette/pull/1040/diff?src=pr&el=tree#diff-ZGF0YXNldHRlL3ZpZXdzL2luZGV4LnB5) | `98.18% <ø> (+1.69%)` | :arrow_up: | | [datasette/views/special.py](https://codecov.io/gh/simonw/datasette/pull/1040/diff?src=pr&el=tree#diff-ZGF0YXNldHRlL3ZpZXdzL3NwZWNpYWwucHk=) | `92.70% <ø> (-0.82%)` | :arrow_down: | | [datasette/app.py](https://codecov.io/gh/simonw/datasette/pull/1040/diff?src=pr&el=tree#diff-ZGF0YXNldHRlL2FwcC5weQ==) | `96.37% <100.00%> (+0.17%)` | :arrow_up: | | [datasette/views/base.py](https://codecov.io/gh/simonw/datasette/pull/1040/diff?src=pr&el=tree#diff-ZGF0YXNldHRlL3ZpZXdzL2Jhc2UucHk=) | `93.77% <100.00%> (ø)` | | | [datasette/views/table.py](https://codecov.io/gh/simonw/datasette/pull/1040/diff?src=pr&el=tree#diff-ZGF0YXNldHRlL3ZpZXdzL3RhYmxlLnB5) | `96.07% <100.00%> (+0.22%)` | :arrow_up: | ------ [Continue to review full report at Codecov](https://codecov.io/gh/simonw/datasette/pull/1040?src=pr&el=continue). > **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta) > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data` > Powered by [Codecov](https://codecov.io/gh/simonw/datasette/pull/1040?src=pr&el=footer). Last update [bf82b3d...4f3165f](https://codecov.io/gh/simonw/datasette/pull/1040?src=pr&el=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments). ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",726910999,/db/table/-/blob/pk/column.blob download URL, https://github.com/simonw/datasette/issues/1046#issuecomment-716066342,https://api.github.com/repos/simonw/datasette/issues/1046,716066342,MDEyOklzc3VlQ29tbWVudDcxNjA2NjM0Mg==,9599,simonw,2020-10-24T23:02:07Z,2020-10-24T23:02:25Z,OWNER,"A download icon would be nice for the links in the table display. I like this one https://primer.style/octicons/download-24 ```svg <svg xmlns=""http://www.w3.org/2000/svg"" viewBox=""0 0 24 24"" width=""24"" height=""24""> <path d=""M4.97 11.03a.75.75 0 111.06-1.06L11 14.94V2.75a.75.75 0 011.5 0v12.19l4.97-4.97a.75.75 0 111.06 1.06l-6.25 6.25a.75.75 0 01-1.06 0l-6.25-6.25zm-.22 9.47a.75.75 0 000 1.5h14.5a.75.75 0 000-1.5H4.75z""></path> </svg> ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",728895193,Link to blob downloads in the right places, https://github.com/simonw/datasette/issues/1033#issuecomment-716066000,https://api.github.com/repos/simonw/datasette/issues/1033,716066000,MDEyOklzc3VlQ29tbWVudDcxNjA2NjAwMA==,82988,psychemedia,2020-10-24T22:58:33Z,2020-10-24T22:58:33Z,CONTRIBUTOR,"From [the docs](https://docs.datasette.io/en/latest/internals.html#datasette-urls), I note: ``` datasette.urls.instance() Returns the URL to the Datasette instance root page. This is usually ""/"" ``` What about the proxy case? 
Eg if I am using jupyter-server-proxy on a MyBinder or local Jupyter notebook server site, `https://example.com:PORT/weirdpath/datasette`, what does `datasette.urls.instance()` refer to? - [ ] `https://example.com:PORT/weirdpath/datasette` - [ ] `https://example.com:PORT/weirdpath/` - [ ] `https://example.com:PORT/` - [ ] `https://example.com` - [ ] something else?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",725099777,datasette.urls.static_plugins(...) method, https://github.com/simonw/datasette/issues/1033#issuecomment-716048564,https://api.github.com/repos/simonw/datasette/issues/1033,716048564,MDEyOklzc3VlQ29tbWVudDcxNjA0ODU2NA==,9599,simonw,2020-10-24T20:08:31Z,2020-10-24T20:08:31Z,OWNER,Documentation here: https://docs.datasette.io/en/latest/internals.html#datasette-urls,"{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",725099777,datasette.urls.static_plugins(...) method, https://github.com/simonw/datasette/issues/575#issuecomment-716048199,https://api.github.com/repos/simonw/datasette/issues/575,716048199,MDEyOklzc3VlQ29tbWVudDcxNjA0ODE5OQ==,9599,simonw,2020-10-24T20:05:44Z,2020-10-24T20:05:44Z,OWNER,https://docs.datasette.io/en/latest/writing_plugins.html#static-assets,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",497162288,Plugin documentation should cover how to bundle static/templates in setup.py, https://github.com/simonw/datasette/issues/1042#issuecomment-715643763,https://api.github.com/repos/simonw/datasette/issues/1042,715643763,MDEyOklzc3VlQ29tbWVudDcxNTY0Mzc2Mw==,9599,simonw,2020-10-24T00:34:31Z,2020-10-24T00:34:52Z,OWNER,I'm going to rename that template variable from `select_templates` to `templates_considered` too.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",727802081,Plugin hook for loading templates, https://github.com/simonw/datasette/issues/1042#issuecomment-715643646,https://api.github.com/repos/simonw/datasette/issues/1042,715643646,MDEyOklzc3VlQ29tbWVudDcxNTY0MzY0Ng==,9599,simonw,2020-10-24T00:33:46Z,2020-10-24T00:33:46Z,OWNER,"I'd like to do this all in the `datasette.render_template()` method to ensure it's available to plugins as well, not just core code that uses the `BaseView` class. This code is the problem: https://github.com/simonw/datasette/blob/d3e9b0aecb6f8e9b2befd9c654ccb7ce852db3e7/datasette/views/base.py#L114-L133 I think I'll fix this by moving the `select_templates` mechanism into `datasette.render_template()`.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",727802081,Plugin hook for loading templates, https://github.com/simonw/datasette/issues/1045#issuecomment-715641183,https://api.github.com/repos/simonw/datasette/issues/1045,715641183,MDEyOklzc3VlQ29tbWVudDcxNTY0MTE4Mw==,9599,simonw,2020-10-24T00:19:29Z,2020-10-24T00:19:29Z,OWNER,"It turns out it already does that: https://github.com/simonw/datasette/blob/976e5f74aae1fa0d406df6691dc8b5feeebe8788/datasette/app.py#L710-L720 But the documentation doesn't reflect that: > `template` - string > > > The template file to be rendered, e.g. `my_plugin.html`. 
Datasette will search for this file first in the `--template-dir=` location, if it was specified - then in the plugin's bundled templates and finally in Datasette's set of default templates.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",728600048,"Document that datasette.render_template(template, ...) also accepts a list of templates", https://github.com/simonw/datasette/issues/1042#issuecomment-715618333,https://api.github.com/repos/simonw/datasette/issues/1042,715618333,MDEyOklzc3VlQ29tbWVudDcxNTYxODMzMw==,9599,simonw,2020-10-23T22:33:24Z,2020-10-23T22:33:24Z,OWNER,"It wouldn't be a disaster if template-loading plugins were unable to hook into the `/{slug1}/{slug2}.html` custom page mechanism, since plugins can define their own pages already using `register_routes()`.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",727802081,Plugin hook for loading templates, https://github.com/simonw/datasette/issues/1042#issuecomment-715618077,https://api.github.com/repos/simonw/datasette/issues/1042,715618077,MDEyOklzc3VlQ29tbWVudDcxNTYxODA3Nw==,9599,simonw,2020-10-23T22:32:24Z,2020-10-23T22:32:24Z,OWNER,"Another option: the first version of the plugin hook could accept only the template filename. Subsequent releases could add more arguments, since Pluggy allows new arguments to be added without breaking backwards compatibility.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",727802081,Plugin hook for loading templates, https://github.com/simonw/datasette/issues/1042#issuecomment-715617830,https://api.github.com/repos/simonw/datasette/issues/1042,715617830,MDEyOklzc3VlQ29tbWVudDcxNTYxNzgzMA==,9599,simonw,2020-10-23T22:31:26Z,2020-10-23T22:31:26Z,OWNER,"So maybe this should be a `register_template_loader` mechanism that returns a Jinja loader after all? 
That would mean that only the template filename could be used as the input to the plugin, which doesn't seem as useful as emulating the `extra_template_vars()` interface.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",727802081,Plugin hook for loading templates, https://github.com/simonw/datasette/issues/1042#issuecomment-715617405,https://api.github.com/repos/simonw/datasette/issues/1042,715617405,MDEyOklzc3VlQ29tbWVudDcxNTYxNzQwNQ==,9599,simonw,2020-10-23T22:29:53Z,2020-10-23T22:29:53Z,OWNER,"Also consider that `DatasetteRouter` uses `.list_templates()` to gather together `{slug}.html` style templates for the custom page templates mechanism: https://github.com/simonw/datasette/blob/976e5f74aae1fa0d406df6691dc8b5feeebe8788/datasette/app.py#L949-L967 For that to work with the new plugin hook, custom template providing plugins will need a way to provide a list of templates that they know about.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",727802081,Plugin hook for loading templates, https://github.com/simonw/datasette/issues/1042#issuecomment-715616757,https://api.github.com/repos/simonw/datasette/issues/1042,715616757,MDEyOklzc3VlQ29tbWVudDcxNTYxNjc1Nw==,9599,simonw,2020-10-23T22:27:28Z,2020-10-23T22:27:28Z,OWNER,"Almost all of the core template loading happens in the `BaseView.render` method: https://github.com/simonw/datasette/blob/976e5f74aae1fa0d406df6691dc8b5feeebe8788/datasette/views/base.py#L114-L133 The one exception is the 404 handling code here: https://github.com/simonw/datasette/blob/976e5f74aae1fa0d406df6691dc8b5feeebe8788/datasette/app.py#L1034-L1042","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",727802081,Plugin hook for loading templates, https://github.com/simonw/datasette/issues/1042#issuecomment-715614971,https://api.github.com/repos/simonw/datasette/issues/1042,715614971,MDEyOklzc3VlQ29tbWVudDcxNTYxNDk3MQ==,9599,simonw,2020-10-23T22:20:14Z,2020-10-23T22:23:51Z,OWNER,"Alternative plugin hook idea: ```python @hookspec def load_template(template, database, table, columns, view_name, request, datasette): ""Load the specified template, returning the template code as a string"" ``` Imitating the existing `extra_template_vars` family of hooks: https://docs.datasette.io/en/stable/plugin_hooks.html#extra-template-vars-template-database-table-columns-view-name-request-datasette","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",727802081,Plugin hook for loading templates, https://github.com/simonw/datasette/pull/1040#issuecomment-715587715,https://api.github.com/repos/simonw/datasette/issues/1040,715587715,MDEyOklzc3VlQ29tbWVudDcxNTU4NzcxNQ==,9599,simonw,2020-10-23T21:01:07Z,2020-10-23T21:03:10Z,OWNER,"A download icon would be nice for the links in the table display. 
I like this one https://primer.style/octicons/download-24","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",726910999,/db/table/-/blob/pk/column.blob download URL, https://github.com/simonw/datasette/pull/1043#issuecomment-715586711,https://api.github.com/repos/simonw/datasette/issues/1043,715586711,MDEyOklzc3VlQ29tbWVudDcxNTU4NjcxMQ==,9599,simonw,2020-10-23T20:58:26Z,2020-10-23T20:58:26Z,OWNER,I misunderstood - `asgi-csrf` already has an sdist.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",727915394,Include LICENSE in sdist, https://github.com/simonw/datasette/pull/1043#issuecomment-715585140,https://api.github.com/repos/simonw/datasette/issues/1043,715585140,MDEyOklzc3VlQ29tbWVudDcxNTU4NTE0MA==,9599,simonw,2020-10-23T20:54:29Z,2020-10-23T20:54:29Z,OWNER,Thanks. I'll push a source release of `asgi-csrf`.,"{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",727915394,Include LICENSE in sdist, https://github.com/simonw/datasette/pull/1044#issuecomment-715584579,https://api.github.com/repos/simonw/datasette/issues/1044,715584579,MDEyOklzc3VlQ29tbWVudDcxNTU4NDU3OQ==,9599,simonw,2020-10-23T20:53:01Z,2020-10-23T20:53:01Z,OWNER,Thanks for this!,"{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",727916744,Add minimum supported python, https://github.com/simonw/datasette/issues/745#issuecomment-715556545,https://api.github.com/repos/simonw/datasette/issues/745,715556545,MDEyOklzc3VlQ29tbWVudDcxNTU1NjU0NQ==,9599,simonw,2020-10-23T19:47:10Z,2020-10-23T19:47:10Z,OWNER,Dupe of #647 ,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",608613033,Extract the hash-URL mechanism out into a plugin, https://github.com/simonw/datasette/issues/1042#issuecomment-715497419,https://api.github.com/repos/simonw/datasette/issues/1042,715497419,MDEyOklzc3VlQ29tbWVudDcxNTQ5NzQxOQ==,9599,simonw,2020-10-23T18:12:40Z,2020-10-23T18:12:40Z,OWNER,Maybe the template loader can optionally return some extra context to pass to the template. That could be used to solve the templates considered comment.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",727802081,Plugin hook for loading templates, https://github.com/simonw/datasette/issues/1042#issuecomment-715496859,https://api.github.com/repos/simonw/datasette/issues/1042,715496859,MDEyOklzc3VlQ29tbWVudDcxNTQ5Njg1OQ==,9599,simonw,2020-10-23T18:11:27Z,2020-10-23T18:11:27Z,OWNER,"When loading a template the filename is required, but you can optionally also send a set of extra arguments which the template loader can take into consideration.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",727802081,Plugin hook for loading templates, https://github.com/simonw/datasette/issues/1042#issuecomment-715490532,https://api.github.com/repos/simonw/datasette/issues/1042,715490532,MDEyOklzc3VlQ29tbWVudDcxNTQ5MDUzMg==,9599,simonw,2020-10-23T17:57:34Z,2020-10-23T17:57:34Z,OWNER,"A better version of this hook would be passed the database, table and query name depending on what was being rendered. 
This would require some re-thinking of how core templates are loaded, especially since I would want the templates considered comment to continue working.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",727802081,Plugin hook for loading templates, https://github.com/simonw/datasette/pull/1044#issuecomment-714916127,https://api.github.com/repos/simonw/datasette/issues/1044,714916127,MDEyOklzc3VlQ29tbWVudDcxNDkxNjEyNw==,22429695,codecov[bot],2020-10-23T05:12:52Z,2020-10-23T05:12:52Z,NONE,"# [Codecov](https://codecov.io/gh/simonw/datasette/pull/1044?src=pr&el=h1) Report > Merging [#1044](https://codecov.io/gh/simonw/datasette/pull/1044?src=pr&el=desc) into [main](https://codecov.io/gh/simonw/datasette/commit/d0cc6f4c32e1f89238ddec782086b3122f445bd4?el=desc) will **not change** coverage. > The diff coverage is `n/a`. [](https://codecov.io/gh/simonw/datasette/pull/1044?src=pr&el=tree) ```diff @@ Coverage Diff @@ ## main #1044 +/- ## ======================================= Coverage 84.65% 84.65% ======================================= Files 28 28 Lines 3924 3924 ======================================= Hits 3322 3322 Misses 602 602 ``` ------ [Continue to review full report at Codecov](https://codecov.io/gh/simonw/datasette/pull/1044?src=pr&el=continue). > **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta) > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data` > Powered by [Codecov](https://codecov.io/gh/simonw/datasette/pull/1044?src=pr&el=footer). Last update [d0cc6f4...6453ab1](https://codecov.io/gh/simonw/datasette/pull/1044?src=pr&el=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments). ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",727916744,Add minimum supported python, https://github.com/simonw/datasette/pull/1043#issuecomment-714915025,https://api.github.com/repos/simonw/datasette/issues/1043,714915025,MDEyOklzc3VlQ29tbWVudDcxNDkxNTAyNQ==,22429695,codecov[bot],2020-10-23T05:09:09Z,2020-10-23T05:09:09Z,NONE,"# [Codecov](https://codecov.io/gh/simonw/datasette/pull/1043?src=pr&el=h1) Report > Merging [#1043](https://codecov.io/gh/simonw/datasette/pull/1043?src=pr&el=desc) into [main](https://codecov.io/gh/simonw/datasette/commit/d0cc6f4c32e1f89238ddec782086b3122f445bd4?el=desc) will **not change** coverage. > The diff coverage is `n/a`. [](https://codecov.io/gh/simonw/datasette/pull/1043?src=pr&el=tree) ```diff @@ Coverage Diff @@ ## main #1043 +/- ## ======================================= Coverage 84.65% 84.65% ======================================= Files 28 28 Lines 3924 3924 ======================================= Hits 3322 3322 Misses 602 602 ``` ------ [Continue to review full report at Codecov](https://codecov.io/gh/simonw/datasette/pull/1043?src=pr&el=continue). > **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta) > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data` > Powered by [Codecov](https://codecov.io/gh/simonw/datasette/pull/1043?src=pr&el=footer). Last update [d0cc6f4...dc4129c](https://codecov.io/gh/simonw/datasette/pull/1043?src=pr&el=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments). 
","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",727915394,Include LICENSE in sdist, https://github.com/simonw/datasette/issues/1012#issuecomment-714908859,https://api.github.com/repos/simonw/datasette/issues/1012,714908859,MDEyOklzc3VlQ29tbWVudDcxNDkwODg1OQ==,45380,bollwyvl,2020-10-23T04:49:20Z,2020-10-23T04:49:20Z,CONTRIBUTOR,"Good luck on 1.0! It may also be worth lobbying for a `Framework::Datasette::1.0` classifier. This would be a nice way to allow the ecosystem to self-document a bit more [discoverably](https://pypi.org/search/?q=&o=&c=Framework+%3A%3A+Datasette%3A%3A+1.0). I was surprised to see the [PR for `Framework::Jupyter`](https://github.com/pypa/warehouse/pull/1905/files) is a... database migration! Of course, there may be more workflow to it!","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",718540751,For 1.0 update trove classifier in setup.py, https://github.com/simonw/datasette/issues/1042#issuecomment-714868867,https://api.github.com/repos/simonw/datasette/issues/1042,714868867,MDEyOklzc3VlQ29tbWVudDcxNDg2ODg2Nw==,9599,simonw,2020-10-23T02:31:17Z,2020-10-23T02:31:17Z,OWNER,I'll build this in conjunction with a plugin that supports editing templates stored in SQLite.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",727802081,Plugin hook for loading templates, https://github.com/simonw/datasette/issues/1042#issuecomment-714868624,https://api.github.com/repos/simonw/datasette/issues/1042,714868624,MDEyOklzc3VlQ29tbWVudDcxNDg2ODYyNA==,9599,simonw,2020-10-23T02:30:27Z,2020-10-23T02:30:37Z,OWNER,Maybe `register_template_loader(datasette)` which returns an object which is added in at the beginning of the list passed to `ChoiceLoader` here.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",727802081,Plugin hook for loading templates, https://github.com/simonw/datasette/issues/1042#issuecomment-714868207,https://api.github.com/repos/simonw/datasette/issues/1042,714868207,MDEyOklzc3VlQ29tbWVudDcxNDg2ODIwNw==,9599,simonw,2020-10-23T02:29:12Z,2020-10-23T02:29:12Z,OWNER,Relevant code: https://github.com/simonw/datasette/blob/d0cc6f4c32e1f89238ddec782086b3122f445bd4/datasette/app.py#L288-L311,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",727802081,Plugin hook for loading templates, https://github.com/simonw/sqlite-utils/issues/173#issuecomment-714758139,https://api.github.com/repos/simonw/sqlite-utils/issues/173,714758139,MDEyOklzc3VlQ29tbWVudDcxNDc1ODEzOQ==,9599,simonw,2020-10-22T20:57:56Z,2020-10-22T20:57:56Z,OWNER,"I could use `ijson` to provide a progress bar for JSON arrays too. I'd prefer to keep that as an optional dependency though, since `sqlite-utils` is a library dependency for many other projects and it would be using `ijson` purely for the CLI component. Here's how to iterate through a list of objects being read from a file: ```python import json parser = ijson.items(open( ""/tmp/list.json"" ), ""item"") for object in parser: # ... 
```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",707478649,Progress bar for sqlite-utils insert, https://github.com/simonw/datasette/issues/1041#issuecomment-714683801,https://api.github.com/repos/simonw/datasette/issues/1041,714683801,MDEyOklzc3VlQ29tbWVudDcxNDY4MzgwMQ==,9599,simonw,2020-10-22T18:37:47Z,2020-10-22T18:37:47Z,OWNER,"I think I'll do this by looking for URLs that start with `/` - since it's also possible to have full `https://...` URLs in that setting. ```json { ""extra_css_urls"": [ ""/static/styles.css"" ], ""extra_js_urls"": [ ""/static/app.js"" ] } ``` I need to think about the `extra_css_urls` and `extra_js_urls` plugin hooks too: https://docs.datasette.io/en/stable/plugin_hooks.html#extra-css-urls-template-database-table-columns-view-name-request-datasette","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",727627923,extra_js_urls and extra_css_urls should respect base_url setting, https://github.com/simonw/datasette/issues/1041#issuecomment-714682825,https://api.github.com/repos/simonw/datasette/issues/1041,714682825,MDEyOklzc3VlQ29tbWVudDcxNDY4MjgyNQ==,9599,simonw,2020-10-22T18:36:10Z,2020-10-22T18:36:10Z,OWNER,I'll need to update these docs once there's a solution for this in place: https://docs.datasette.io/en/latest/custom_templates.html#serving-static-files,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",727627923,extra_js_urls and extra_css_urls should respect base_url setting, https://github.com/simonw/datasette/issues/1041#issuecomment-714682288,https://api.github.com/repos/simonw/datasette/issues/1041,714682288,MDEyOklzc3VlQ29tbWVudDcxNDY4MjI4OA==,9599,simonw,2020-10-22T18:35:15Z,2020-10-22T18:35:15Z,OWNER,"@psychemedia said: https://github.com/simonw/datasette/issues/1033#issuecomment-714657366 > How does `/-/static` relate to [current guidance docs around `static`](https://docs.datasette.io/en/latest/custom_templates.html?highlight=static#serving-static-files) regarding the `--static option` and metadata formulations such as `""extra_js_urls"": [ ""/static/app.js""]` (I've not managed to get this to work in a Jupyter server proxied set up; the [datasette / jupyter server proxy repo](https://github.com/simonw/jupyterserverproxy-datasette-demo) may provide a useful test example, eg via MyBinder, for folk to crib from?)","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",727627923,extra_js_urls and extra_css_urls should respect base_url setting, https://github.com/simonw/datasette/issues/1033#issuecomment-714681365,https://api.github.com/repos/simonw/datasette/issues/1033,714681365,MDEyOklzc3VlQ29tbWVudDcxNDY4MTM2NQ==,9599,simonw,2020-10-22T18:33:48Z,2020-10-22T18:33:48Z,OWNER,"That's a good question - I hadn't considered that. I'm going to open a new issue to have `extra_js_urls` respect the `base_url` setting, since the static files will be served from a different location.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",725099777,datasette.urls.static_plugins(...) 
method, https://github.com/simonw/datasette/issues/1033#issuecomment-714657366,https://api.github.com/repos/simonw/datasette/issues/1033,714657366,MDEyOklzc3VlQ29tbWVudDcxNDY1NzM2Ng==,82988,psychemedia,2020-10-22T17:51:29Z,2020-10-22T17:51:29Z,CONTRIBUTOR,"How does `/-/static` relate to [current guidance docs around `static`](https://docs.datasette.io/en/latest/custom_templates.html?highlight=static#serving-static-files) regarding the `--static option` and metadata formulations such as `""extra_js_urls"": [ ""/static/app.js""]` (I've not managed to get this to work in a Jupyter server proxied set up; the [datasette / jupyter server proxy repo](https://github.com/simonw/jupyterserverproxy-datasette-demo) may provide a useful test example, eg via MyBinder, for folk to crib from?) ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",725099777,datasette.urls.static_plugins(...) method, https://github.com/simonw/datasette/pull/1031#issuecomment-714289680,https://api.github.com/repos/simonw/datasette/issues/1031,714289680,MDEyOklzc3VlQ29tbWVudDcxNDI4OTY4MA==,299380,frankier,2020-10-22T07:23:52Z,2020-10-22T07:23:52Z,NONE,"The bug is that currently when there are databases passed in, but no -i flag, e.g. in configuration directory mode, inclusion in inspect-data.json does not automatically cause databases to be considered immutable, as described in the documentation. The reason is that the -i flag is specified multiple=True, which means when it is not passed in we will get an empty list [], rather than None. So the current code decides that no databases are immutable rather than falling back to inspect-data.json -- as is presumably intended.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",724369025,Fallback to databases in inspect-data.json when no -i options are passed, https://github.com/simonw/sqlite-utils/issues/171#issuecomment-714219725,https://api.github.com/repos/simonw/sqlite-utils/issues/171,714219725,MDEyOklzc3VlQ29tbWVudDcxNDIxOTcyNQ==,649467,mhalle,2020-10-22T04:38:35Z,2020-10-22T04:38:35Z,NONE,"Thanks. As I said, I think the result (being able to query tree structures like ancestors and descendants) is more important than the implementation, and I agree that this particular sqlite extension is too obscure. Just providing an sqlite utility to build or rebuild a transitive closure table might be more generically useful. I find that hierarchical data shows up pretty frequently in some data science problems.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",707407567,Idea: transitive closure tables for tree structures, https://github.com/simonw/sqlite-utils/issues/171#issuecomment-714208848,https://api.github.com/repos/simonw/sqlite-utils/issues/171,714208848,MDEyOklzc3VlQ29tbWVudDcxNDIwODg0OA==,9599,simonw,2020-10-22T04:07:14Z,2020-10-22T04:07:14Z,OWNER,"I made the `--load-extension` command much more widely supported in #137 - which should be useful for anyone who wants to use this extension. 
It's a bit too obscure for me to want to add direct Python library support relating to that extension though.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",707407567,Idea: transitive closure tables for tree structures, https://github.com/simonw/datasette/pull/1031#issuecomment-714206875,https://api.github.com/repos/simonw/datasette/issues/1031,714206875,MDEyOklzc3VlQ29tbWVudDcxNDIwNjg3NQ==,9599,simonw,2020-10-22T04:01:19Z,2020-10-22T04:01:19Z,OWNER,I don't fully understand the bug you're fixing here. Could you provide a bit more explanation?,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",724369025,Fallback to databases in inspect-data.json when no -i options are passed, https://github.com/simonw/datasette/pull/1040#issuecomment-714206533,https://api.github.com/repos/simonw/datasette/issues/1040,714206533,MDEyOklzc3VlQ29tbWVudDcxNDIwNjUzMw==,9599,simonw,2020-10-22T04:00:25Z,2020-10-22T04:00:25Z,OWNER,I've decided not to offer a configuration option to turn this off. I'll reconsider if someone asks for it.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",726910999,/db/table/-/blob/pk/column.blob download URL, https://github.com/simonw/datasette/issues/998#issuecomment-714205783,https://api.github.com/repos/simonw/datasette/issues/998,714205783,MDEyOklzc3VlQ29tbWVudDcxNDIwNTc4Mw==,9599,simonw,2020-10-22T03:58:13Z,2020-10-22T03:58:13Z,OWNER,This is now live here: https://global-power-plants.datasettes.com/global-power-plants/global-power-plants,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",717699884,Wide tables should scroll horizontally within the page, https://github.com/simonw/datasette/issues/998#issuecomment-714117534,https://api.github.com/repos/simonw/datasette/issues/998,714117534,MDEyOklzc3VlQ29tbWVudDcxNDExNzUzNA==,9599,simonw,2020-10-22T01:12:06Z,2020-10-22T01:12:06Z,OWNER,"Demo:  ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",717699884,Wide tables should scroll horizontally within the page, https://github.com/simonw/datasette/issues/998#issuecomment-714092002,https://api.github.com/repos/simonw/datasette/issues/998,714092002,MDEyOklzc3VlQ29tbWVudDcxNDA5MjAwMg==,9599,simonw,2020-10-22T00:55:10Z,2020-10-22T00:55:10Z,OWNER,This isn't blocked on #987 - it just means that `datasette-cluster-map` will need to learn to look for `.table-wrapper` first and fall back on the table.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",717699884,Wide tables should scroll horizontally within the page, https://github.com/simonw/datasette/issues/998#issuecomment-714090965,https://api.github.com/repos/simonw/datasette/issues/998,714090965,MDEyOklzc3VlQ29tbWVudDcxNDA5MDk2NQ==,9599,simonw,2020-10-22T00:54:30Z,2020-10-22T00:54:30Z,OWNER,"Easiest fix for the column action menu positioning - hide them when the user scrolls the containing div: ```javascript document.querySelector('.table-wrapper').addEventListener( 'scroll', () => document.querySelector('.dropdown-menu').style.display = 'none' ); ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 
0, ""rocket"": 0, ""eyes"": 0}",717699884,Wide tables should scroll horizontally within the page, https://github.com/simonw/datasette/pull/1038#issuecomment-713920461,https://api.github.com/repos/simonw/datasette/issues/1038,713920461,MDEyOklzc3VlQ29tbWVudDcxMzkyMDQ2MQ==,9599,simonw,2020-10-21T22:43:51Z,2020-10-21T22:43:51Z,OWNER,Thanks for spotting this!,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",726154220,DOC: Fix syntax error, https://github.com/simonw/datasette/issues/1036#issuecomment-713899530,https://api.github.com/repos/simonw/datasette/issues/1036,713899530,MDEyOklzc3VlQ29tbWVudDcxMzg5OTUzMA==,9599,simonw,2020-10-21T21:55:00Z,2020-10-21T21:55:00Z,OWNER,"This code needs these permission checks: https://github.com/simonw/datasette/blob/bf82b3d6a605c9ddadd5fb739249dfe6defaf635/datasette/views/table.py#L911-L913","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",725996507,Make it possible to download BLOB data from the Datasette UI, https://github.com/simonw/datasette/issues/1036#issuecomment-713821656,https://api.github.com/repos/simonw/datasette/issues/1036,713821656,MDEyOklzc3VlQ29tbWVudDcxMzgyMTY1Ng==,9599,simonw,2020-10-21T19:22:45Z,2020-10-21T19:41:48Z,OWNER,"So for https://latest.datasette.io/fixtures/binary_data the BLOB download URLs would be: `https://latest.datasette.io/fixtures/-/blob/binary_data/1/data.blob` - that last bit after the primary key is to indicate the `data` column With these headers: - `Content-Disposition: attachment; filename=""binary_data-1-data.blob""` - `X-Content-Type-Options: nosniff` - `Content-Type: application/binary`","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",725996507,Make it possible to download BLOB data from the Datasette UI, https://github.com/simonw/datasette/issues/1036#issuecomment-713830842,https://api.github.com/repos/simonw/datasette/issues/1036,713830842,MDEyOklzc3VlQ29tbWVudDcxMzgzMDg0Mg==,9599,simonw,2020-10-21T19:41:20Z,2020-10-21T19:41:20Z,OWNER,Another useful demo database: https://datasette-render-images-demo.datasette.io/favicons/favicons - see https://datasette-render-images-demo.datasette.io/favicons/favicons.csv,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",725996507,Make it possible to download BLOB data from the Datasette UI, https://github.com/simonw/datasette/issues/1036#issuecomment-713829629,https://api.github.com/repos/simonw/datasette/issues/1036,713829629,MDEyOklzc3VlQ29tbWVudDcxMzgyOTYyOQ==,9599,simonw,2020-10-21T19:38:43Z,2020-10-21T19:38:43Z,OWNER,"Should this work just for BLOB columns, or should it work for other columns too? 
For the moment I'm going to restrict it to BLOBs, since data from other columns is available through the UI whereas BLOB columns are not.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",725996507,Make it possible to download BLOB data from the Datasette UI, https://github.com/simonw/datasette/issues/1036#issuecomment-713818817,https://api.github.com/repos/simonw/datasette/issues/1036,713818817,MDEyOklzc3VlQ29tbWVudDcxMzgxODgxNw==,9599,simonw,2020-10-21T19:17:01Z,2020-10-21T19:17:01Z,OWNER,Actually I like `.blob`,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",725996507,Make it possible to download BLOB data from the Datasette UI, https://github.com/simonw/datasette/issues/1036#issuecomment-713818178,https://api.github.com/repos/simonw/datasette/issues/1036,713818178,MDEyOklzc3VlQ29tbWVudDcxMzgxODE3OA==,9599,simonw,2020-10-21T19:15:38Z,2020-10-21T19:16:34Z,OWNER,"What should the suggested filename be? I think something that includes the table name, primary key and the name of the column would work. How about a file extension? I guess `.binary`, then let the user rename it? Or `.raw`.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",725996507,Make it possible to download BLOB data from the Datasette UI, https://github.com/simonw/datasette/issues/1039#issuecomment-713754844,https://api.github.com/repos/simonw/datasette/issues/1039,713754844,MDEyOklzc3VlQ29tbWVudDcxMzc1NDg0NA==,9599,simonw,2020-10-21T17:58:27Z,2020-10-21T17:58:27Z,OWNER,"Now live: https://latest.datasette.io/fixtures/roadside_attraction_characteristics  ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",726687572,Add an animation to the column actions menu, https://github.com/simonw/datasette/pull/1038#issuecomment-713320666,https://api.github.com/repos/simonw/datasette/issues/1038,713320666,MDEyOklzc3VlQ29tbWVudDcxMzMyMDY2Ng==,22429695,codecov[bot],2020-10-21T05:50:38Z,2020-10-21T05:50:38Z,NONE,"# [Codecov](https://codecov.io/gh/simonw/datasette/pull/1038?src=pr&el=h1) Report > Merging [#1038](https://codecov.io/gh/simonw/datasette/pull/1038?src=pr&el=desc) into [main](https://codecov.io/gh/simonw/datasette/commit/66120a7a1cb592e8a21164cf537f62a4d7ab1dfc?el=desc) will **not change** coverage. > The diff coverage is `n/a`. [](https://codecov.io/gh/simonw/datasette/pull/1038?src=pr&el=tree) ```diff @@ Coverage Diff @@ ## main #1038 +/- ## ======================================= Coverage 84.65% 84.65% ======================================= Files 28 28 Lines 3924 3924 ======================================= Hits 3322 3322 Misses 602 602 ``` ------ [Continue to review full report at Codecov](https://codecov.io/gh/simonw/datasette/pull/1038?src=pr&el=continue). > **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta) > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data` > Powered by [Codecov](https://codecov.io/gh/simonw/datasette/pull/1038?src=pr&el=footer). Last update [66120a7...7fc0cce](https://codecov.io/gh/simonw/datasette/pull/1038?src=pr&el=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments). 
","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",726154220,DOC: Fix syntax error, https://github.com/simonw/datasette/issues/1036#issuecomment-713278349,https://api.github.com/repos/simonw/datasette/issues/1036,713278349,MDEyOklzc3VlQ29tbWVudDcxMzI3ODM0OQ==,9599,simonw,2020-10-21T03:42:29Z,2020-10-21T03:42:29Z,OWNER,Possible URL for this: `/db/table/-/blob/primary-keys` - this would use the `/db/table/-/` namespace proposed in #296.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",725996507,Make it possible to download BLOB data from the Datasette UI, https://github.com/simonw/datasette/issues/998#issuecomment-713269155,https://api.github.com/repos/simonw/datasette/issues/998,713269155,MDEyOklzc3VlQ29tbWVudDcxMzI2OTE1NQ==,9599,simonw,2020-10-21T03:17:07Z,2020-10-21T03:17:07Z,OWNER,"This may require updates to the column action menu JavaScript too, since it was not built with scrolling sideways in mind.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",717699884,Wide tables should scroll horizontally within the page, https://github.com/simonw/datasette/issues/1037#issuecomment-713268905,https://api.github.com/repos/simonw/datasette/issues/1037,713268905,MDEyOklzc3VlQ29tbWVudDcxMzI2ODkwNQ==,9599,simonw,2020-10-21T03:16:36Z,2020-10-21T03:16:36Z,OWNER,Dupe of #998.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",726094754,Add horizontal scrollbar to tables, https://github.com/simonw/datasette/issues/1037#issuecomment-713268498,https://api.github.com/repos/simonw/datasette/issues/1037,713268498,MDEyOklzc3VlQ29tbWVudDcxMzI2ODQ5OA==,9599,simonw,2020-10-21T03:15:44Z,2020-10-21T03:15:44Z,OWNER,"This may require updates to the column action menu JavaScript too, since it was not built with scrolling sideways in mind.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",726094754,Add horizontal scrollbar to tables, https://github.com/simonw/datasette/issues/1037#issuecomment-713267989,https://api.github.com/repos/simonw/datasette/issues/1037,713267989,MDEyOklzc3VlQ29tbWVudDcxMzI2Nzk4OQ==,9599,simonw,2020-10-21T03:14:34Z,2020-10-21T03:14:34Z,OWNER,"This is particularly relevant to the `datasette-cluster-map` plugin - the map is much nicer to use if the table itself can be scrolled. That plugin also makes this harder to build, because the plugin inserts the map as the direct predecessor of the `<table>` element and hence breaks if you try to wrap that in a `<div>`.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",726094754,Add horizontal scrollbar to tables, https://github.com/simonw/datasette/issues/1036#issuecomment-713226726,https://api.github.com/repos/simonw/datasette/issues/1036,713226726,MDEyOklzc3VlQ29tbWVudDcxMzIyNjcyNg==,9599,simonw,2020-10-21T01:04:25Z,2020-10-21T01:04:25Z,OWNER,"Extra security idea: a `blob_download_host` setting which can be used to indicate a host that should be used for downloads - for example `datasettestatic.com`. 
If this setting is populated then binary downloads are served from paths on that host only, and no other Datasette URLs from that host will be served.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",725996507,Make it possible to download BLOB data from the Datasette UI, https://github.com/simonw/datasette/issues/262#issuecomment-713208667,https://api.github.com/repos/simonw/datasette/issues/262,713208667,MDEyOklzc3VlQ29tbWVudDcxMzIwODY2Nw==,9599,simonw,2020-10-21T00:03:18Z,2020-10-21T00:03:18Z,OWNER,"I think I should prioritize the facets component of this, since that could have significant performance wins while also supporting `datasette-graphql`.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323658641,Add ?_extra= mechanism for requesting extra properties in JSON, https://github.com/simonw/datasette/issues/262#issuecomment-713200782,https://api.github.com/repos/simonw/datasette/issues/262,713200782,MDEyOklzc3VlQ29tbWVudDcxMzIwMDc4Mg==,9599,simonw,2020-10-20T23:41:30Z,2020-10-20T23:41:30Z,OWNER,This is now blocking https://github.com/simonw/datasette-graphql/issues/61 because that issue needs a way to turn off suggested facets when retrieving the results of a table query.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323658641,Add ?_extra= mechanism for requesting extra properties in JSON, https://github.com/simonw/datasette/issues/1034#issuecomment-713191819,https://api.github.com/repos/simonw/datasette/issues/1034,713191819,MDEyOklzc3VlQ29tbWVudDcxMzE5MTgxOQ==,9599,simonw,2020-10-20T23:12:58Z,2020-10-20T23:12:58Z,OWNER,"Enzo has a great solution here: https://twitter.com/enzo_mdd/status/1318685442976436226 > Or maybe an option for a url. This keeps the CSV small but allows scripts to download binary data as needed. In #1036 I'm planning on adding a way for users to access BLOB data. I can include that URL in the CSV output.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",725184645,Better way of representing binary data in .csv output, https://github.com/simonw/datasette/issues/1036#issuecomment-713186189,https://api.github.com/repos/simonw/datasette/issues/1036,713186189,MDEyOklzc3VlQ29tbWVudDcxMzE4NjE4OQ==,9599,simonw,2020-10-20T22:56:33Z,2020-10-20T22:56:33Z,OWNER,I think this plus the binary-CSV stuff in #1034 will justify a dedicated section of the documentation to talk about how Datasette handles binary BLOB columns.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",725996507,Make it possible to download BLOB data from the Datasette UI, https://github.com/simonw/datasette/issues/1036#issuecomment-713185871,https://api.github.com/repos/simonw/datasette/issues/1036,713185871,MDEyOklzc3VlQ29tbWVudDcxMzE4NTg3MQ==,9599,simonw,2020-10-20T22:55:36Z,2020-10-20T22:55:36Z,OWNER,I can also use a `Content-Disposition` header to force a download. 
I'm reasonably confident that the combination of `Content-Disposition` and `X-Content-Type-Options: nosniff` and `application/binary` will let me allow users to download the contents of arbitrary BLOB columns without any XSS risk.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",725996507,Make it possible to download BLOB data from the Datasette UI, https://github.com/simonw/datasette/issues/1036#issuecomment-713185173,https://api.github.com/repos/simonw/datasette/issues/1036,713185173,MDEyOklzc3VlQ29tbWVudDcxMzE4NTE3Mw==,9599,simonw,2020-10-20T22:53:41Z,2020-10-20T22:53:41Z,OWNER,"https://security.stackexchange.com/questions/12896/does-x-content-type-options-really-prevent-content-sniffing-attacks says: > In Tangled Web Michal Zalewski says: > > > Refrain from using Content-Type: application/octet-stream and use application/binary instead, especially for unknown document types. Refrain from returning Content-Type: text/plain. > > > > For example, any code-hosting platform must exercise caution when returning executables or source archives as application/octet-stream, because there is a risk they may be misinterpreted as HTML and displayed inline.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",725996507,Make it possible to download BLOB data from the Datasette UI, https://github.com/simonw/datasette/issues/1036#issuecomment-713184374,https://api.github.com/repos/simonw/datasette/issues/1036,713184374,MDEyOklzc3VlQ29tbWVudDcxMzE4NDM3NA==,9599,simonw,2020-10-20T22:51:22Z,2020-10-20T22:51:22Z,OWNER,"From https://hackerone.com/reports/126197: > archive.uber.com mirrors pypi. When downloading `.tar.gz` files from archive.uber.com, the MIME type is `application/octet-stream`. Injecting `<html><script>alert(0)</script>` into the start of the `.tar.gz` causes an XSS in Internet Explorer due to MIME sniffing. So you do have to be careful not to open accidental XSS holes with `application/octet-stream` thanks to (presumably older) versions of IE. 
From that thread it looks like the solution is to add a `X-Content-Type-Options: nosniff` header.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",725996507,Make it possible to download BLOB data from the Datasette UI, https://github.com/simonw/datasette/issues/1036#issuecomment-713183306,https://api.github.com/repos/simonw/datasette/issues/1036,713183306,MDEyOklzc3VlQ29tbWVudDcxMzE4MzMwNg==,9599,simonw,2020-10-20T22:48:10Z,2020-10-20T22:48:10Z,OWNER,Twitter thread: https://twitter.com/dancow/status/1318681053347840005,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",725996507,Make it possible to download BLOB data from the Datasette UI, https://github.com/simonw/datasette/issues/1034#issuecomment-713176082,https://api.github.com/repos/simonw/datasette/issues/1034,713176082,MDEyOklzc3VlQ29tbWVudDcxMzE3NjA4Mg==,9599,simonw,2020-10-20T22:27:33Z,2020-10-20T22:27:33Z,OWNER,"This feels good to me - it's consistent with how other features in Datasette work, and it means users who need the binary data in CSV (for whatever reason) can get it if they want to.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",725184645,Better way of representing binary data in .csv output, https://github.com/simonw/datasette/issues/1034#issuecomment-713175741,https://api.github.com/repos/simonw/datasette/issues/1034,713175741,MDEyOklzc3VlQ29tbWVudDcxMzE3NTc0MQ==,9599,simonw,2020-10-20T22:26:45Z,2020-10-20T22:26:45Z,OWNER,"> New idea: since binary in CSV doesn't make sense anyway, emulate Datasette's HTML UI default and output this: > > id,title,data > 1,Some title,<Binary data: 14 bytes> > 2,Other title,<Binary data: 57 bytes> > > Then allow users to add ?_base64=1 to the URL to get base64 instead > https://twitter.com/simonw/status/1318679950635888641","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",725184645,Better way of representing binary data in .csv output, https://github.com/simonw/datasette/issues/1034#issuecomment-713174690,https://api.github.com/repos/simonw/datasette/issues/1034,713174690,MDEyOklzc3VlQ29tbWVudDcxMzE3NDY5MA==,9599,simonw,2020-10-20T22:23:50Z,2020-10-20T22:23:50Z,OWNER,Or... 
default to `<Binary data: 7 bytes>` and support a `?_base64=1` option which outputs in base64 instead.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",725184645,Better way of representing binary data in .csv output, https://github.com/simonw/datasette/issues/1034#issuecomment-713174341,https://api.github.com/repos/simonw/datasette/issues/1034,713174341,MDEyOklzc3VlQ29tbWVudDcxMzE3NDM0MQ==,9599,simonw,2020-10-20T22:22:53Z,2020-10-20T22:23:14Z,OWNER,"An even easier option: do what the Datasette UI does and output `<Binary data: 7 bytes>` for that CSV cell, as seen on https://latest.datasette.io/fixtures/binary_data","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",725184645,Better way of representing binary data in .csv output, https://github.com/simonw/datasette/issues/1034#issuecomment-713172901,https://api.github.com/repos/simonw/datasette/issues/1034,713172901,MDEyOklzc3VlQ29tbWVudDcxMzE3MjkwMQ==,9599,simonw,2020-10-20T22:19:10Z,2020-10-20T22:20:28Z,OWNER,"I could go with the same format as `datasette-render-binary` but using `0x00` as the format for the hex bytes. 0x15 0x1C 0x02 0xC7 JFIF 0x00 0x01 Problem with this is that it's ambiguous: if the ASCII characters `0x15` occur in the text they will be indistinguishable from those hex bytes. But since representing binary data in CSV fundamentally doesn't make sense I'm not sure if that really matters.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",725184645,Better way of representing binary data in .csv output, https://github.com/simonw/datasette/issues/741#issuecomment-713171742,https://api.github.com/repos/simonw/datasette/issues/741,713171742,MDEyOklzc3VlQ29tbWVudDcxMzE3MTc0Mg==,9599,simonw,2020-10-20T22:16:25Z,2020-10-20T22:16:25Z,OWNER,See also #992 which will rename `--config` to `--setting`.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",607223136,"Replace ""datasette publish --extra-options"" with ""--setting""", https://github.com/simonw/datasette/issues/262#issuecomment-713170979,https://api.github.com/repos/simonw/datasette/issues/262,713170979,MDEyOklzc3VlQ29tbWVudDcxMzE3MDk3OQ==,9599,simonw,2020-10-20T22:14:37Z,2020-10-20T22:14:37Z,OWNER,"I think it's worth having a plugin hook for this - it can be same hook that is used internally. Maybe `register_extra` - it lets you return one or more `extra` implementations, each with a name and an async function that gets called. Things like suggested facets will become `register_extra` hooks. Maybe actual facets too?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323658641,Add ?_extra= mechanism for requesting extra properties in JSON, https://github.com/simonw/datasette/issues/262#issuecomment-713170284,https://api.github.com/repos/simonw/datasette/issues/262,713170284,MDEyOklzc3VlQ29tbWVudDcxMzE3MDI4NA==,9599,simonw,2020-10-20T22:13:01Z,2020-10-20T22:13:01Z,OWNER,In the documentation for `?_extra=` I think I'll emphasize the comma-separated version of it. Also: there will be `?_extra=` values which act as aliases for collection combinations - e.g. 
`?_extra=full` will toggle everything.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323658641,Add ?_extra= mechanism for requesting extra properties in JSON, https://github.com/simonw/datasette/issues/782#issuecomment-712986115,https://api.github.com/repos/simonw/datasette/issues/782,712986115,MDEyOklzc3VlQ29tbWVudDcxMjk4NjExNQ==,9599,simonw,2020-10-20T16:28:46Z,2020-10-20T16:29:51Z,OWNER,"I think this all comes down to how the `?_extras=` mechanism works (see #262), as first hinted at in a30c5b220c15360d575e94b0e67f3255e120b916 (see commit message) when I added this long-forgotten undocumented feature: https://latest.datasette.io/fixtures/attraction_characteristic/2.json?_extras=foreign_key_tables Extras need to be able to execute additional SQL, since that would solve the problem we have now where the expensive ""suggested facets"" code runs on all `.json` output even when its results are not being shown.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",627794879,Redesign default .json format, https://github.com/simonw/datasette/issues/1035#issuecomment-712976314,https://api.github.com/repos/simonw/datasette/issues/1035,712976314,MDEyOklzc3VlQ29tbWVudDcxMjk3NjMxNA==,9599,simonw,2020-10-20T16:21:42Z,2020-10-20T16:21:42Z,OWNER,"Makes me question if `datasette.urls` should grow functionality equivalent to the other path and querystring manipulation methods in `datasette.utils`: https://github.com/simonw/datasette/blob/66120a7a1cb592e8a21164cf537f62a4d7ab1dfc/datasette/utils/__init__.py#L216-L279","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",725743755,"datasette.urls.table(..., format=""json"") argument", https://github.com/simonw/datasette/issues/1035#issuecomment-712965574,https://api.github.com/repos/simonw/datasette/issues/1035,712965574,MDEyOklzc3VlQ29tbWVudDcxMjk2NTU3NA==,9599,simonw,2020-10-20T16:13:57Z,2020-10-20T16:13:57Z,OWNER,"That `renderers[key] = path_with_format(` is in a base class which can be used for both arbitrary queries, canned queries and the table view. 
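A simplified sketch of the behaviour that `format` argument needs - the `table_url` helper name is made up and this shows only the dot-handling rule, not the real implementation:
```python
def table_url(database, table, fmt=None):
    # A format suffix like .json is ambiguous when the table name itself
    # contains a dot, so fall back to ?_format= in that case.
    path = '/{}/{}'.format(database, table)
    if fmt is None:
        return path
    if '.' in table:
        return path + '?_format=' + fmt
    return path + '.' + fmt
```
So `table_url('db', 'table', fmt='json')` gives `/db/table.json` while `table_url('db', 'my.table', fmt='json')` gives `/db/my.table?_format=json`.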
I think that's OK, but it means that the `format=""json""` argument on `datasette.urls.table()` won't be used by Datasette internally, it will just be available for plugins.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",725743755,"datasette.urls.table(..., format=""json"") argument", https://github.com/simonw/datasette/issues/1035#issuecomment-712963959,https://api.github.com/repos/simonw/datasette/issues/1035,712963959,MDEyOklzc3VlQ29tbWVudDcxMjk2Mzk1OQ==,9599,simonw,2020-10-20T16:11:25Z,2020-10-20T16:11:25Z,OWNER,"Relevant code: https://github.com/simonw/datasette/blob/091441a4449beae559a8c0d007376dc85d3aa624/datasette/utils/__init__.py#L681-L696 Only used here: https://github.com/simonw/datasette/blob/091441a4449beae559a8c0d007376dc85d3aa624/datasette/views/base.py#L498-L502","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",725743755,"datasette.urls.table(..., format=""json"") argument", https://github.com/simonw/datasette/issues/1026#issuecomment-712962517,https://api.github.com/repos/simonw/datasette/issues/1026,712962517,MDEyOklzc3VlQ29tbWVudDcxMjk2MjUxNw==,9599,simonw,2020-10-20T16:09:12Z,2020-10-20T16:09:12Z,OWNER,"That `datasette.urls.table(""db"", ""table"") + "".json""` example is bad because if the table name contains a `.` it should be `?_format=json` instead. Maybe `.table()` should have a `format=""json""` option that knows how to do this.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",722738988,How should datasette.client interact with base_url, https://github.com/simonw/datasette/issues/1026#issuecomment-712959034,https://api.github.com/repos/simonw/datasette/issues/1026,712959034,MDEyOklzc3VlQ29tbWVudDcxMjk1OTAzNA==,9599,simonw,2020-10-20T16:03:33Z,2020-10-20T16:03:55Z,OWNER,"Reconsidering this: I think the `.get()` etc methods should automatically add the `base_url` prefix for you, since these APIs are only intended to make internal calls. The clincher on this is when I went to add a section to the `datasette.client` documentation recommending you use `datasette.urls.path()` for every call to them that you make. But there's a problem: to handle table name escaping users are likely to want to use `datasette.urls.table()` anyway, like this: response = await datasette.client.get(datasette.urls.table(""db"", ""table"") + "".json"") This risks adding the `base_url` prefix twice. Maybe the `.table()` method could return a string-like object that is marked as already having the `base_url` prefix added, so the `client.get()` method knows not to add it again. ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",722738988,How should datasette.client interact with base_url, https://github.com/simonw/datasette/issues/991#issuecomment-712855389,https://api.github.com/repos/simonw/datasette/issues/991,712855389,MDEyOklzc3VlQ29tbWVudDcxMjg1NTM4OQ==,24740,furilo,2020-10-20T13:36:41Z,2020-10-20T13:36:41Z,NONE,"Here is one quick sketch (done in Figma :P) for an idea: a possible filter to switch between showing all tables from all databases, or grouping tables by database. 
(the switch is interactive) All tables: https://www.figma.com/proto/BjFrMroEtmVx6EeRjvSrox/Datasette-test?node-id=1%3A2&viewport=536%2C348%2C0.5&scaling=min-zoom Grouped: https://www.figma.com/proto/BjFrMroEtmVx6EeRjvSrox/Datasette-test?node-id=3%3A974&viewport=536%2C348%2C0.5&scaling=min-zoom When only 1 database: https://www.figma.com/proto/BjFrMroEtmVx6EeRjvSrox/Datasette-test?node-id=1%3A162&viewport=536%2C348%2C0.5&scaling=min-zoom If this is useful, I can send some more suggestions/sketches. ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",714377268,Redesign application homepage, https://github.com/simonw/datasette/issues/1023#issuecomment-712607608,https://api.github.com/repos/simonw/datasette/issues/1023,712607608,MDEyOklzc3VlQ29tbWVudDcxMjYwNzYwOA==,9599,simonw,2020-10-20T05:47:42Z,2020-10-20T05:47:42Z,OWNER,Requested alpha testers in https://github.com/simonw/datasette/issues/838#issuecomment-712604364,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",722673818,Fix issues relating to base_url, https://github.com/simonw/datasette/issues/1026#issuecomment-712607227,https://api.github.com/repos/simonw/datasette/issues/1026,712607227,MDEyOklzc3VlQ29tbWVudDcxMjYwNzIyNw==,9599,simonw,2020-10-20T05:46:44Z,2020-10-20T05:46:44Z,OWNER,We have a solution for this now: `datasette.urls` from #1033 can be used by plugins to assemble the correct URLs to pass to `.get()` and friends.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",722738988,How should datasette.client interact with base_url, https://github.com/simonw/datasette/issues/1023#issuecomment-712604541,https://api.github.com/repos/simonw/datasette/issues/1023,712604541,MDEyOklzc3VlQ29tbWVudDcxMjYwNDU0MQ==,9599,simonw,2020-10-20T05:39:44Z,2020-10-20T05:39:44Z,OWNER,Here's the alpha with most of this work ready for people to preview: https://github.com/simonw/datasette/releases/tag/0.51a0,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",722673818,Fix issues relating to base_url, https://github.com/simonw/datasette/issues/838#issuecomment-712604364,https://api.github.com/repos/simonw/datasette/issues/838,712604364,MDEyOklzc3VlQ29tbWVudDcxMjYwNDM2NA==,9599,simonw,2020-10-20T05:39:15Z,2020-10-20T05:39:15Z,OWNER,"OK, I've made a ton of improvements to how the `base_url` setting works - see tickets linked from #1023. I've just pushed out an alpha release with those changes in it: https://github.com/simonw/datasette/releases/tag/0.51a0 @tsibley @tballison @ChristopherWilks I'd really appreciate your help testing this alpha! 
You can install it with: pip install datasette==0.51a0 It should work with just `ProxyPass`, without needing the `ProxyPassReverse` setting.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",637395097,Incorrect URLs when served behind a proxy with base_url set, https://github.com/simonw/datasette/issues/865#issuecomment-712597762,https://api.github.com/repos/simonw/datasette/issues/865,712597762,MDEyOklzc3VlQ29tbWVudDcxMjU5Nzc2Mg==,9599,simonw,2020-10-20T05:22:59Z,2020-10-20T05:22:59Z,OWNER,"OK, this is definitely working now.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",644582921,"base_url doesn't seem to work when adding criteria and clicking ""apply""", https://github.com/simonw/datasette/issues/1025#issuecomment-712593790,https://api.github.com/repos/simonw/datasette/issues/1025,712593790,MDEyOklzc3VlQ29tbWVudDcxMjU5Mzc5MA==,9599,simonw,2020-10-20T05:12:36Z,2020-10-20T05:12:36Z,OWNER,"I'm going to leave the cookies code setting cookies to default to the `""/""` top level.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",722724086,"Fix last remaining links to ""/"" that do not respect base_url", https://github.com/simonw/datasette/issues/782#issuecomment-712590398,https://api.github.com/repos/simonw/datasette/issues/782,712590398,MDEyOklzc3VlQ29tbWVudDcxMjU5MDM5OA==,9599,simonw,2020-10-20T05:03:46Z,2020-10-20T05:04:09Z,OWNER,"OK, https://latest-with-plugins.datasette.io/ is running that now - e.g. https://latest-with-plugins.datasette.io/fixtures/roadside_attractions.json-preview or https://latest-with-plugins.datasette.io/fixtures/compound_three_primary_keys.json-preview ```json { ""rows"": [ { ""pk"": 1, ""name"": ""The Mystery Spot"", ""address"": ""465 Mystery Spot Road, Santa Cruz, CA 95065"", ""latitude"": 37.0167, ""longitude"": -122.0024 }, { ""pk"": 2, ""name"": ""Winchester Mystery House"", ""address"": ""525 South Winchester Boulevard, San Jose, CA 95128"", ""latitude"": 37.3184, ""longitude"": -121.9511 }, { ""pk"": 3, ""name"": ""Burlingame Museum of PEZ Memorabilia"", ""address"": ""214 California Drive, Burlingame, CA 94010"", ""latitude"": 37.5793, ""longitude"": -122.3442 }, { ""pk"": 4, ""name"": ""Bigfoot Discovery Museum"", ""address"": ""5497 Highway 9, Felton, CA 95018"", ""latitude"": 37.0414, ""longitude"": -122.0725 } ], ""total"": 4, ""next_url"": null } ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",627794879,Redesign default .json format, https://github.com/simonw/datasette/issues/782#issuecomment-712585921,https://api.github.com/repos/simonw/datasette/issues/782,712585921,MDEyOklzc3VlQ29tbWVudDcxMjU4NTkyMQ==,9599,simonw,2020-10-20T04:48:01Z,2020-10-20T04:48:01Z,OWNER,I'll update `datasette-json-preview` with that now.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",627794879,Redesign default .json format, https://github.com/simonw/datasette/issues/782#issuecomment-712585687,https://api.github.com/repos/simonw/datasette/issues/782,712585687,MDEyOklzc3VlQ29tbWVudDcxMjU4NTY4Nw==,9599,simonw,2020-10-20T04:47:02Z,2020-10-20T04:47:12Z,OWNER,"Great point about CORS, I hadn't considered that. 
I think I'm going to keep the `Link:` header (added in #1014) because I quite enjoy using it with GitHub and WordPress, but I'm not going to have it be the default way of doing pagination. For the default shape I'm now leaning towards this: ```json { ""total"": 36, ""rows"": [{""id"": 1, ""name"": ""Cleo""}], ""next_url"": ""https://latest-with-plugins.datasette.io/fixtures/facetable.json?_next=5"" } ``` So three keys: `total`, `rows` and `next_url`. Then extra keys can be added using `?_extra=` with various named bundles.","{""total_count"": 3, ""+1"": 3, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",627794879,Redesign default .json format, https://github.com/simonw/datasette/issues/1034#issuecomment-712582699,https://api.github.com/repos/simonw/datasette/issues/1034,712582699,MDEyOklzc3VlQ29tbWVudDcxMjU4MjY5OQ==,9599,simonw,2020-10-20T04:36:04Z,2020-10-20T04:36:14Z,OWNER,Asked for ideas on Twitter: https://twitter.com/simonw/status/1318409558805467136,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",725184645,Better way of representing binary data in .csv output, https://github.com/simonw/datasette/issues/1034#issuecomment-712581994,https://api.github.com/repos/simonw/datasette/issues/1034,712581994,MDEyOklzc3VlQ29tbWVudDcxMjU4MTk5NA==,9599,simonw,2020-10-20T04:33:28Z,2020-10-20T04:33:28Z,OWNER,"The [datasette-render-binary](https://github.com/simonw/datasette-render-binary) plugin does this, which I really like - but without the different coloured fonts I'm not sure how readable it would be as just plain text:  Really the goal here is to find the most human-friendly option, so that people looking at the output have a vague idea what's going on. That's why I'm not leaping at the chance to use base64.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",725184645,Better way of representing binary data in .csv output, https://github.com/simonw/datasette/issues/1034#issuecomment-712580976,https://api.github.com/repos/simonw/datasette/issues/1034,712580976,MDEyOklzc3VlQ29tbWVudDcxMjU4MDk3Ng==,9599,simonw,2020-10-20T04:29:23Z,2020-10-20T04:29:23Z,OWNER,Most obvious option is base64. Any other potential solutions I'm missing?,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",725184645,Better way of representing binary data in .csv output, https://github.com/simonw/datasette/issues/1025#issuecomment-712579674,https://api.github.com/repos/simonw/datasette/issues/1025,712579674,MDEyOklzc3VlQ29tbWVudDcxMjU3OTY3NA==,9599,simonw,2020-10-20T04:24:10Z,2020-10-20T04:24:10Z,OWNER,"Changed my mind, `error.html` needs access to `urls` in order to link to its CSS file. Passing it after all (it already got passed `ds.config(""base_url"")` so `ds` was available previously).","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",722724086,"Fix last remaining links to ""/"" that do not respect base_url", https://github.com/simonw/datasette/issues/782#issuecomment-712569695,https://api.github.com/repos/simonw/datasette/issues/782,712569695,MDEyOklzc3VlQ29tbWVudDcxMjU2OTY5NQ==,222245,carlmjohnson,2020-10-20T03:45:48Z,2020-10-20T03:46:14Z,NONE,"I vote against headers. 
It has a lot of strikes against it: poor discoverability, new developers often don’t know how to use them, makes CORS harder, makes it hard to use eg with JQ, needs ad hoc specification for each bit of metadata, etc. The only advantage of headers is that you don’t need to do .rows, but that’s actually good as a data validation step anyway—if .rows is missing assume there’s an error and do your error handling path instead of parsing the rest.","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",627794879,Redesign default .json format, https://github.com/simonw/datasette/issues/1025#issuecomment-712481127,https://api.github.com/repos/simonw/datasette/issues/1025,712481127,MDEyOklzc3VlQ29tbWVudDcxMjQ4MTEyNw==,9599,simonw,2020-10-19T22:40:37Z,2020-10-20T01:21:36Z,OWNER,Was blocked on #904 - now unblocked.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",722724086,"Fix last remaining links to ""/"" that do not respect base_url", https://github.com/simonw/datasette/issues/1033#issuecomment-712529413,https://api.github.com/repos/simonw/datasette/issues/1033,712529413,MDEyOklzc3VlQ29tbWVudDcxMjUyOTQxMw==,9599,simonw,2020-10-20T01:21:12Z,2020-10-20T01:21:12Z,OWNER,Also refs #1023,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",725099777,datasette.urls.static_plugins(...) method, https://github.com/simonw/datasette/issues/1025#issuecomment-712525557,https://api.github.com/repos/simonw/datasette/issues/1025,712525557,MDEyOklzc3VlQ29tbWVudDcxMjUyNTU1Nw==,9599,simonw,2020-10-20T01:07:02Z,2020-10-20T01:07:02Z,OWNER,"I fixed the `queries.html` one. I'm not going to fix these two: ``` datasette/templates/error.html: <a href=""/"">home</a> datasette/templates/patterns.html: <a href=""/"">home</a> / ``` Because the `error.html` one does not get passed a context (which makes sense since an error has occurred) and the pattern portfolio doesn't need to link to anywhere in particular.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",722724086,"Fix last remaining links to ""/"" that do not respect base_url", https://github.com/simonw/datasette/issues/904#issuecomment-712524699,https://api.github.com/repos/simonw/datasette/issues/904,712524699,MDEyOklzc3VlQ29tbWVudDcxMjUyNDY5OQ==,9599,simonw,2020-10-20T01:04:12Z,2020-10-20T01:04:12Z,OWNER,Documentation is https://docs.datasette.io/en/latest/writing_plugins.html#building-urls-within-plugins and https://docs.datasette.io/en/latest/internals.html#internals-datasette-urls,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",663228985,"datasette.urls.table() / .instance() / .database() methods for constructing URLs, also exposed to templates", https://github.com/simonw/datasette/issues/904#issuecomment-712483066,https://api.github.com/repos/simonw/datasette/issues/904,712483066,MDEyOklzc3VlQ29tbWVudDcxMjQ4MzA2Ng==,9599,simonw,2020-10-19T22:46:48Z,2020-10-19T22:46:48Z,OWNER,"OK, I'm committing to `datasette.urls.table()` and friends, which will be available to (and documented for use in) plugins and will be exposed to templates as `{{ urls.table(...) 
}}`.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",663228985,"datasette.urls.table() / .instance() / .database() methods for constructing URLs, also exposed to templates", https://github.com/simonw/datasette/issues/1020#issuecomment-712482504,https://api.github.com/repos/simonw/datasette/issues/1020,712482504,MDEyOklzc3VlQ29tbWVudDcxMjQ4MjUwNA==,9599,simonw,2020-10-19T22:45:01Z,2020-10-19T22:45:01Z,OWNER,"I'm having trouble coming up with the syntax for this. Here's one option: ```python response = await datasette.client.get(path, request=request) ``` But this feels confusing to me. We're not using the `request=` argument as a request - we're using it as a source of some default request values (the cookies and incoming headers, but not the path). We're essentially combining that request with the other arguments passed to `.get()`.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",721068929,Method for datasette.client() to forward on authentication, https://github.com/simonw/datasette/issues/1020#issuecomment-712482015,https://api.github.com/repos/simonw/datasette/issues/1020,712482015,MDEyOklzc3VlQ29tbWVudDcxMjQ4MjAxNQ==,9599,simonw,2020-10-19T22:43:24Z,2020-10-19T22:43:24Z,OWNER,"... unless I want to support authentication mechanisms that work based on incoming IP address instead, in which case there's an argument for copying more over from the incoming request. Tailscale is a good example of a system where authentication based on IP address can actually work well, so this is worth doing. Also, there might be authentication mechanisms which work by setting a custom header on the incoming request (not to mention the `Authorization` header).","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",721068929,Method for datasette.client() to forward on authentication, https://github.com/simonw/datasette/issues/1020#issuecomment-712481568,https://api.github.com/repos/simonw/datasette/issues/1020,712481568,MDEyOklzc3VlQ29tbWVudDcxMjQ4MTU2OA==,9599,simonw,2020-10-19T22:41:59Z,2020-10-19T22:41:59Z,OWNER,"It turns out this works just fine: ```python response = await datasette.client.get(path, cookies=request.cookies) ``` So I don't need a mechanism for this. I'm going to add this to the documentation instead.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",721068929,Method for datasette.client() to forward on authentication, https://github.com/simonw/datasette/issues/1028#issuecomment-712480866,https://api.github.com/repos/simonw/datasette/issues/1028,712480866,MDEyOklzc3VlQ29tbWVudDcxMjQ4MDg2Ng==,9599,simonw,2020-10-19T22:39:51Z,2020-10-19T22:39:51Z,OWNER,Documentation: https://docs.datasette.io/en/latest/spatialite.html,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",723803777,--load-extension=spatialite shortcut, https://github.com/simonw/datasette/issues/1032#issuecomment-712397537,https://api.github.com/repos/simonw/datasette/issues/1032,712397537,MDEyOklzc3VlQ29tbWVudDcxMjM5NzUzNw==,236498,saulpw,2020-10-19T19:37:55Z,2020-10-19T19:37:55Z,NONE,"python-dateutil is awesome, but it can only guess at one date at a time. 
So if you have a column of dates that are (presumably) in the same format, it can't use the full set of dates to deduce the format. Also, once it has parsed a date, you can't get the format it used, whether to parse or render other dates. These limitations prevent it from being a silver bullet for date parsing, though they're not enough for me to stop using it!","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",724878151,Bring date parsing into Datasette core, https://github.com/simonw/datasette/issues/1032#issuecomment-712367285,https://api.github.com/repos/simonw/datasette/issues/1032,712367285,MDEyOklzc3VlQ29tbWVudDcxMjM2NzI4NQ==,9599,simonw,2020-10-19T18:39:32Z,2020-10-19T18:39:32Z,OWNER,https://github.com/digital-land/brownfield-land-collection/blob/a09ddf9960a6af59e72dc02448f7b645e59bf227/bin/harmonise.py#L217-L247 is a beautiful example of this problem.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",724878151,Bring date parsing into Datasette core, https://github.com/simonw/datasette/issues/1032#issuecomment-712365439,https://api.github.com/repos/simonw/datasette/issues/1032,712365439,MDEyOklzc3VlQ29tbWVudDcxMjM2NTQzOQ==,9599,simonw,2020-10-19T18:35:50Z,2020-10-19T18:37:57Z,OWNER,"Maybe I don't need to add `dateutil` as a dependency here? `pendulum` and `arrow` both depend on it so it's pretty deeply embedded in the Python date ecosystem. Adding it as a dependency seems reasonable.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",724878151,Bring date parsing into Datasette core, https://github.com/simonw/datasette/issues/1032#issuecomment-712365236,https://api.github.com/repos/simonw/datasette/issues/1032,712365236,MDEyOklzc3VlQ29tbWVudDcxMjM2NTIzNg==,9599,simonw,2020-10-19T18:35:25Z,2020-10-19T18:35:25Z,OWNER,"Since I'm dealing with tables of data I usually have a whole column of examples, so heuristics that check for numbers-greater-than-12 could actually work well for many cases.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",724878151,Bring date parsing into Datasette core, https://github.com/simonw/datasette/issues/1032#issuecomment-712364532,https://api.github.com/repos/simonw/datasette/issues/1032,712364532,MDEyOklzc3VlQ29tbWVudDcxMjM2NDUzMg==,9599,simonw,2020-10-19T18:34:10Z,2020-10-19T18:34:10Z,OWNER,Lots of great example date data in https://biglocal.datasettes.com/COVID_WARN_Notices,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",724878151,Bring date parsing into Datasette core, https://github.com/simonw/datasette/issues/1032#issuecomment-712364317,https://api.github.com/repos/simonw/datasette/issues/1032,712364317,MDEyOklzc3VlQ29tbWVudDcxMjM2NDMxNw==,9599,simonw,2020-10-19T18:33:42Z,2020-10-19T18:33:42Z,OWNER,"Related challenge: timezones. 
I think I'll punt on those for the moment and just concentrate on dates, but I should keep them in mind.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",724878151,Bring date parsing into Datasette core, https://github.com/simonw/datasette/issues/1032#issuecomment-712363825,https://api.github.com/repos/simonw/datasette/issues/1032,712363825,MDEyOklzc3VlQ29tbWVudDcxMjM2MzgyNQ==,9599,simonw,2020-10-19T18:32:43Z,2020-10-19T18:32:43Z,OWNER,"Horrible thought: I bet there is data out there that uses more than one date format in the same table! So this needs to be exposed as a visible per-column setting. Column action menu can help here, although it's not yet the full solution because it isn't yet visible on mobile.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",724878151,Bring date parsing into Datasette core,