id,node_id,number,title,user,user_label,state,locked,assignee,assignee_label,milestone,milestone_label,comments,created_at,updated_at,closed_at,author_association,pull_request,body,repo,repo_label,type,active_lock_reason,performed_via_github_app,reactions,draft,state_reason
1054244712,I_kwDOBm6k_c4-1n9o,1510,Datasette 1.0 documented template context (maybe via API docs),9599,simonw,open,0,,,3268330,Datasette 1.0,3,2021-11-15T23:23:58Z,2023-06-28T02:05:21Z,,OWNER,,Documented context plus protective unit tests. Goal is that custom templates built for 1.x will not break without a 2.x release.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1510/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1054243511,I_kwDOBm6k_c4-1nq3,1509,Datasette 1.0 JSON API (and documentation),9599,simonw,open,0,,,3268330,Datasette 1.0,3,2021-11-15T23:22:45Z,2022-03-15T20:38:56Z,,OWNER,,"The new JSON API in a stable, documented form.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1509/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1054246919,I_kwDOBm6k_c4-1ogH,1511,Review plugin hooks for Datasette 1.0,9599,simonw,open,0,,,3268330,Datasette 1.0,1,2021-11-15T23:26:05Z,2021-11-16T01:20:14Z,,OWNER,,I need to perform a detailed review of the plugin interface - especially the plugin hooks like [register_facet_classes()](https://docs.datasette.io/en/stable/plugin_hooks.html#register-facet-classes) which I don't yet have complete confidence in.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1511/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1055469073,I_kwDOBm6k_c4-6S4R,1513,Research: CTEs and union all to calculate facets AND query at the same time,9599,simonw,closed,0,,,,,12,2021-11-16T22:26:45Z,2021-11-16T23:41:46Z,2021-11-16T23:41:46Z,OWNER,,"Consider this page: https://global-power-plants.datasettes.com/global-power-plants/global-power-plants?_search=plant&_facet=owner&_facet=country_long&_facet=primary_fuel
Datasette needs to run the main query for the rows on that page, a count query for the total number of rows, then a separate query for each of the three specified facets.
This is a `_search=` query, so it needs to execute the FTS code once for the rows, again for the count, and then three more times for each of the facets.
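One possible shape for such a combined query (a sketch only; the FTS table name and columns are assumptions based on the demo URL above):
```sql
-- Sketch: run the FTS match once in a CTE, then derive the count and
-- each facet from it with UNION ALL (table/column names are assumptions)
with matches as (
  select * from [global-power-plants]
  where rowid in (
    select rowid from [global-power-plants_fts]
    where [global-power-plants_fts] match 'plant'
  )
)
select 'count' as facet, null as value, count(*) as n from matches
union all
select 'owner', owner, count(*) from matches group by owner
union all
select 'country_long', country_long, count(*) from matches group by country_long
union all
select 'primary_fuel', primary_fuel, count(*) from matches group by primary_fuel;
```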
Could running that query as a CTE and doing the other queries as part of the same large query produce significant speed improvements?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1513/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1056746091,I_kwDOBm6k_c4-_Kpr,1515,Handle foreign keys that point to a non-existent table,9599,simonw,open,0,,,,,0,2021-11-17T23:40:13Z,2021-11-18T01:31:56Z,,OWNER,,"Spotted in https://github.com/simonw/datasette-graphql/issues/79
Demo: https://datasette-graphql-demo.datasette.io/fixtures/bad_foreign_key
The foreign key links to a 404 page.
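For reference, SQLite happily accepts a schema like this (a sketch; the demo table's actual definition may differ):
```sql
-- Sketch: a foreign key that references a table which does not exist.
-- SQLite accepts this at CREATE time; it only fails on writes, and only
-- when PRAGMA foreign_keys = ON is set.
create table bad_foreign_key (
  id integer primary key,
  other_id integer references no_such_table(id)
);
```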
![B87009C7-CFCA-4DF9-8FBA-FA3E6CA28EC2](https://user-images.githubusercontent.com/9599/142334788-4d1a4acd-bc87-4426-b333-d46b221afcec.jpeg)
",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1515/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1049946823,I_kwDOBm6k_c4-lOrH,1502,"Full-text search: No support for unary ""-"" operator",516827,gustavorps,open,0,,,,,0,2021-11-10T15:11:19Z,2021-11-10T15:11:19Z,,NONE,,"Reference: https://www.sqlite.org/fts3.html#set_operations_using_the_standard_query_syntax
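The operator in question, in raw FTS (a sketch with a hypothetical table name):
```sql
-- Sketch of the set-operation syntax the issue refers to: match rows
-- containing 'manafort' but not 'freedman' (requires the standard /
-- enhanced query syntax to be available in the SQLite build)
select * from documents_fts where documents_fts match 'manafort -freedman';
```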
Test: https://fara.datasettes.com/fara/FARA_All_ShortForms?_search=manafort+-freedman&_sort=rowid",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1502/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1050163432,I_kwDOBm6k_c4-mDjo,1503,`?_nocol=` removes that column from the filter interface,9599,simonw,closed,0,,,,,1,2021-11-10T18:22:50Z,2021-11-14T05:08:27Z,2021-11-14T04:53:07Z,OWNER,,"e.g. on https://latest.datasette.io/fixtures/sortable?_nocol=sortable
This causes weird behaviour when you e.g. facet by a hidden column, since selecting facets and then re-submitting the form will clear the selected filter.
![nocol-bug](https://user-images.githubusercontent.com/9599/141171135-aded71d1-a4cb-4b7f-a4ea-26828fa98906.gif)
",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1503/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1051277222,I_kwDOBm6k_c4-qTem,1504,Link to ?_size=max at bottom of table page,9599,simonw,open,0,,,,,0,2021-11-11T19:06:33Z,2021-11-11T19:06:33Z,,OWNER,,"This can have text such as ""Show 1,000 rows per page"", based on the max size limit setting. Would make it easier for people to see more data at once without having to know how to hack the URL, similar to the `...` for facet sizes I added in #1337.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1504/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1052247023,I_kwDOBm6k_c4-uAPv,1505,Datasette should have an option to output CSV with semicolons,9599,simonw,open,0,,,,,1,2021-11-12T18:02:21Z,2021-11-16T11:40:52Z,,OWNER,,,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1505/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1052826038,I_kwDOBm6k_c4-wNm2,1506,Columns beginning with an underscore do not facet correctly,9599,simonw,closed,0,,,,,1,2021-11-14T02:20:32Z,2021-11-14T04:45:21Z,2021-11-14T04:45:21Z,OWNER,,"Datasette treats columns that start with an underscore as querystring parameters it should ignore!
Discovered in https://github.com/simonw/git-history/issues/14#issuecomment-968192464",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1506/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1052851176,I_kwDOBm6k_c4-wTvo,1507,ReadTheDocs build failed for 0.59.2 release,9599,simonw,closed,0,,,,,6,2021-11-14T05:24:34Z,2021-11-14T05:41:55Z,2021-11-14T05:41:55Z,OWNER,,"I had to cancel the 0.59.2 release because ReadTheDocs was failing to build the documentation.
https://readthedocs.org/projects/datasette/builds/15268454/
```
/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/0.59.2/bin/python -m sphinx -T -b html -d _build/doctrees -D language=en . _build/html
Running Sphinx v1.8.5
loading translations [en]... done
making output directory...
building [mo]: targets for 0 po files that are out of date
building [html]: targets for 27 source files that are out of date
updating environment: 27 added, 0 changed, 0 removed
reading sources... [ 3%] authentication
Traceback (most recent call last):
File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/0.59.2/lib/python2.7/site-packages/sphinx/cmd/build.py"", line 304, in build_main
app.build(args.force_all, filenames)
File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/0.59.2/lib/python2.7/site-packages/sphinx/application.py"", line 341, in build
self.builder.build_update()
File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/0.59.2/lib/python2.7/site-packages/sphinx/builders/__init__.py"", line 347, in build_update
len(to_build))
File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/0.59.2/lib/python2.7/site-packages/sphinx/builders/__init__.py"", line 360, in build
updated_docnames = set(self.read())
File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/0.59.2/lib/python2.7/site-packages/sphinx/builders/__init__.py"", line 468, in read
self._read_serial(docnames)
File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/0.59.2/lib/python2.7/site-packages/sphinx/builders/__init__.py"", line 490, in _read_serial
self.read_doc(docname)
File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/0.59.2/lib/python2.7/site-packages/sphinx/builders/__init__.py"", line 534, in read_doc
doctree = read_doc(self.app, self.env, self.env.doc2path(docname))
File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/0.59.2/lib/python2.7/site-packages/sphinx/io.py"", line 318, in read_doc
pub.publish()
File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/0.59.2/lib/python2.7/site-packages/docutils/core.py"", line 219, in publish
self.apply_transforms()
File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/0.59.2/lib/python2.7/site-packages/docutils/core.py"", line 200, in apply_transforms
self.document.transformer.apply_transforms()
File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/0.59.2/lib/python2.7/site-packages/sphinx/transforms/__init__.py"", line 90, in apply_transforms
Transformer.apply_transforms(self)
File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/0.59.2/lib/python2.7/site-packages/docutils/transforms/__init__.py"", line 171, in apply_transforms
transform.apply(**kwargs)
File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/0.59.2/lib/python2.7/site-packages/sphinx/transforms/__init__.py"", line 245, in apply
apply_source_workaround(n)
File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/0.59.2/lib/python2.7/site-packages/sphinx/util/nodes.py"", line 94, in apply_source_workaround
for classifier in reversed(node.parent.traverse(nodes.classifier)):
TypeError: argument to reversed() must be a sequence
Exception occurred:
File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/0.59.2/lib/python2.7/site-packages/sphinx/util/nodes.py"", line 94, in apply_source_workaround
for classifier in reversed(node.parent.traverse(nodes.classifier)):
TypeError: argument to reversed() must be a sequence
The full traceback has been saved in /tmp/sphinx-err-vkl0oE.log, if you want to report the issue to the developers.
Please also report this if it was a user error, so that a better error message can be provided next time.
A bug report can be filed in the tracker at . Thanks!
```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1507/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1006016302,I_kwDOBm6k_c479pcu,1477,Consider adding request to the documented default template context,9599,simonw,open,0,,,,,0,2021-09-24T02:34:09Z,2021-09-24T02:34:09Z,,OWNER,,I made a plugin for this today but I think perhaps it should be a default thing instead: https://datasette.io/plugins/datasette-template-request,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1477/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
999902754,I_kwDOBm6k_c47mU4i,1473,base logo link visits `undefined` rather than href url,192568,mroswell,open,0,,,,,2,2021-09-18T04:17:04Z,2021-09-19T00:45:32Z,,CONTRIBUTOR,,"I have two connected sites:
http://www.SaferOrToxic.org
(a Hugo website)
and:
http://disinfectants.SaferOrToxic.org/disinfectants/listN
(a datasette table page)
The latter is linked as ""The List"" in the former's menu.
(I'd love a prettier URL, but that's what I've got.)
On:
http://disinfectants.SaferOrToxic.org/disinfectants/listN
... all the other menu links should point back to:
https://www.SaferOrToxic.org
And they do!
But the logo, for some reason--though it has an href pointing to:
https://www.SaferOrToxic.org
keeps going to this instead:
https://disinfectants.saferortoxic.org/disinfectants/undefined
What is causing that? How can I fix it?
In #1284 back in March, I was doing battle with the index.html template, in a still unresolved issue. (I wanted only a single table page at the root.)
But I thought, well, if I can't resolve that, at least I could just point the main website to the datasette page (""The List,"") and then have the List point back to the home website.
The menu hrefs to https://www.SaferOrToxic.org work just fine, exactly as they should, from the datasette page. Even the Home link works properly.
But the logo link keeps rewriting to: https://disinfectants.saferortoxic.org/disinfectants/undefined
This is the HTML:
```
```
Is this somehow related to cloudflare?
Or something in the datasette code?
I'm starting to think it's a cloudflare issue.
Can I at least rule out it being a datasette issue?
My repository is here:
https://github.com/mroswell/list-N
(BTW, I couldn't figure out how to reference a local image, either, on the datasette side, which is why I'm using the image from the www home page.)
",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1473/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1021550542,I_kwDOBm6k_c4845_O,1482,Support Python 3.10,9599,simonw,closed,0,,,,,2,2021-10-09T00:30:52Z,2021-10-24T22:21:40Z,2021-10-24T22:19:55Z,OWNER,,"I started work on this in #1481 where I found a Python 3.10 bug that needs a workaround in Janus, see:
- https://github.com/aio-libs/janus/issues/358
This is a tracking issue for anything else that shows up.
This is also needed for the Homebrew package to upgrade to 3.10:
- https://github.com/Homebrew/homebrew-core/pull/86932",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1482/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1021849766,I_kwDOBm6k_c486DCm,1483,Running a search on page 2 of results should not preserve ?_next=,9599,simonw,closed,0,,,,,0,2021-10-10T01:18:12Z,2021-10-13T21:08:10Z,2021-10-13T21:08:10Z,OWNER,,Reported by @eigenfoo in https://github.com/simonw/datasette/issues/1470,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1483/reactions"", ""total_count"": 2, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 1, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1006781949,I_kwDOBm6k_c48AkX9,1478,Documentation Request: Feature alternative ID instead of default ID,192568,mroswell,open,0,,,,,0,2021-09-24T19:56:13Z,2021-09-25T16:18:54Z,,CONTRIBUTOR,,"My data already has an ID that comes from a federal agency.
Would love to have documentation on how to modify the template to:
- Remove the generated ID from the table
- Link the federal ID to the detail page
- and to ensure that the JSON file uses that as the ID. I'd be happy to include the database ID in the export, but not as a key.
I don't want to remove the ID from the database, though, because my experience with the federal agency is that data often has anomalies. I don't want all hell to break loose if they end up applying the same ID to multiple rows (which they haven't done yet). I just don't want it to display in the table or the data exports.
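One way to cover the first two bullets without template surgery might be the `render_cell` plugin hook; a rough sketch, where the table and column names are made up and `federal_id` is assumed to be the key used in row URLs:
```python
# Rough sketch using the render_cell plugin hook (not a template edit):
# link the federal ID column to the row's detail page. Table and column
# names here are hypothetical.
from datasette import hookimpl
from markupsafe import Markup, escape

@hookimpl
def render_cell(value, column, table, database, datasette):
    if table == 'listN' and column == 'federal_id':
        href = datasette.urls.table(database, table) + '/' + str(value)
        return Markup('<a href=""{}"">{}</a>'.format(escape(href), escape(value)))
    return None  # None falls back to Datasette's default rendering
```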
Perhaps this isn't a template issue, maybe more of a db manipulation...
Margie",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1478/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1010112818,I_kwDOBm6k_c48NRky,1479,"Win32 ""used by another process"" error with datasette publish",76450761,kirajano,open,0,,,,,7,2021-09-28T19:12:00Z,2023-09-07T02:14:16Z,,NONE,,"I unfortunately was not successful in deploying to fly.io. Please see below the details of the three scenarios that I tried. I am also new to datasette.
Failed to deploy. Attaching logs:
1. Tried with an app created via `flyctl apps create frosty-fog-8565` and then ran `datasette publish fly covid.db --app frosty-fog-8565`
```
Deploying frosty-fog-8565
==> Validating app configuration
--> Validating app configuration done
Services
TCP 80/443 ⇢ 8080
Error error connecting to docker: An unknown error occured.
Traceback (most recent call last):
File ""c:\users\grott\anaconda3\lib\runpy.py"", line 193, in _run_module_as_main
""__main__"", mod_spec)
File ""c:\users\grott\anaconda3\lib\runpy.py"", line 85, in _run_code
exec(code, run_globals)
File ""C:\Users\grott\Anaconda3\Scripts\datasette.exe\__main__.py"", line 7, in
File ""c:\users\grott\anaconda3\lib\site-packages\click\core.py"", line 829, in __call__
return self.main(*args, **kwargs)
File ""c:\users\grott\anaconda3\lib\site-packages\click\core.py"", line 782, in main
rv = self.invoke(ctx)
File ""c:\users\grott\anaconda3\lib\site-packages\click\core.py"", line 1259, in invoke
return _process_result(sub_ctx.command.invoke(sub_ctx))
File ""c:\users\grott\anaconda3\lib\site-packages\click\core.py"", line 1259, in invoke
return _process_result(sub_ctx.command.invoke(sub_ctx))
File ""c:\users\grott\anaconda3\lib\site-packages\click\core.py"", line 1066, in invoke
return ctx.invoke(self.callback, **ctx.params)
File ""c:\users\grott\anaconda3\lib\site-packages\click\core.py"", line 610, in invoke
return callback(*args, **kwargs)
File ""c:\users\grott\anaconda3\lib\site-packages\datasette_publish_fly\__init__.py"", line 156, in fly
""--remote-only"",
File ""c:\users\grott\anaconda3\lib\contextlib.py"", line 119, in __exit__
next(self.gen)
File ""c:\users\grott\anaconda3\lib\site-packages\datasette\utils\__init__.py"", line 451, in temporary_docker_directory
tmp.cleanup()
File ""c:\users\grott\anaconda3\lib\tempfile.py"", line 811, in cleanup
_shutil.rmtree(self.name)
File ""c:\users\grott\anaconda3\lib\shutil.py"", line 516, in rmtree
return _rmtree_unsafe(path, onerror)
File ""c:\users\grott\anaconda3\lib\shutil.py"", line 395, in _rmtree_unsafe
_rmtree_unsafe(fullname, onerror)
File ""c:\users\grott\anaconda3\lib\shutil.py"", line 404, in _rmtree_unsafe
onerror(os.rmdir, path, sys.exc_info())
File ""c:\users\grott\anaconda3\lib\shutil.py"", line 402, in _rmtree_unsafe
os.rmdir(path)
PermissionError: [WinError 32] The process cannot access the file because it is being used by another process: 'C:\\Users\\grott\\AppData\\Local\\Temp\\tmpgcm8cz66\\frosty-fog-8565'
```
2. Also tried with an app that gets autogenerated when running `flyctl launch`. This also generates the .toml file. Then ran `datasette publish fly covid.db --app dark-feather-168`, **but got a different error now**
```Deploying dark-feather-168
==> Validating app configuration
Error not possible to validate configuration: server returned Post ""https://api.fly.io/graphql"": unexpected EOF
Traceback (most recent call last):
File ""c:\users\grott\anaconda3\lib\runpy.py"", line 193, in _run_module_as_main
""__main__"", mod_spec)
exec(code, run_globals)
File ""C:\Users\grott\Anaconda3\Scripts\datasette.exe\__main__.py"", line 7, in
File ""c:\users\grott\anaconda3\lib\site-packages\click\core.py"", line 829, in __call__
return self.main(*args, **kwargs)
File ""c:\users\grott\anaconda3\lib\site-packages\click\core.py"", line 782, in main
rv = self.invoke(ctx)
File ""c:\users\grott\anaconda3\lib\site-packages\click\core.py"", line 1259, in invoke
return _process_result(sub_ctx.command.invoke(sub_ctx))
File ""c:\users\grott\anaconda3\lib\site-packages\click\core.py"", line 1259, in invoke
return _process_result(sub_ctx.command.invoke(sub_ctx))
File ""c:\users\grott\anaconda3\lib\site-packages\click\core.py"", line 1066, in invoke
return ctx.invoke(self.callback, **ctx.params)
File ""c:\users\grott\anaconda3\lib\site-packages\click\core.py"", line 610, in invoke
return callback(*args, **kwargs)
File ""c:\users\grott\anaconda3\lib\site-packages\datasette_publish_fly\__init__.py"", line 156, in fly
""--remote-only"",
File ""c:\users\grott\anaconda3\lib\contextlib.py"", line 119, in __exit__
next(self.gen)
File ""c:\users\grott\anaconda3\lib\site-packages\datasette\utils\__init__.py"", line 451, in temporary_docker_directory
tmp.cleanup()
File ""c:\users\grott\anaconda3\lib\tempfile.py"", line 811, in cleanup
_shutil.rmtree(self.name)
File ""c:\users\grott\anaconda3\lib\shutil.py"", line 516, in rmtree
return _rmtree_unsafe(path, onerror)
File ""c:\users\grott\anaconda3\lib\shutil.py"", line 395, in _rmtree_unsafe
_rmtree_unsafe(fullname, onerror)
File ""c:\users\grott\anaconda3\lib\shutil.py"", line 404, in _rmtree_unsafe
onerror(os.rmdir, path, sys.exc_info())
File ""c:\users\grott\anaconda3\lib\shutil.py"", line 402, in _rmtree_unsafe
os.rmdir(path)
PermissionError: [WinError 32] The process cannot access the file because it is being used by another process: 'C:\\Users\\grott\\AppData\\Local\\Temp\\tmpnoyewcre\\dark-feather-168'
```
These are also the contents of the generated **.toml file** in the second scenario:
```
# fly.toml file generated for dark-feather-168 on 2021-09-28T20:35:44+02:00
app = ""dark-feather-168""
kill_signal = ""SIGINT""
kill_timeout = 5
processes = []
[env]
[experimental]
allowed_public_ports = []
auto_rollback = true
[[services]]
http_checks = []
internal_port = 8080
processes = [""app""]
protocol = ""tcp""
script_checks = []
[services.concurrency]
hard_limit = 25
soft_limit = 20
type = ""connections""
[[services.ports]]
handlers = [""http""]
port = 80
[[services.ports]]
handlers = [""tls"", ""http""]
port = 443
[[services.tcp_checks]]
grace_period = ""1s""
interval = ""15s""
restart_limit = 6
timeout = ""2s""
```
3. Also trying `datasette package covid.db` to create a local Dockerfile, to later push it via `flyctl deploy`, fails as well.
```[+] Building 147.3s (11/11) FINISHED
=> [internal] load build definition from Dockerfile 0.2s
=> => transferring dockerfile: 396B 0.0s
=> [internal] load .dockerignore 0.1s
=> => transferring context: 2B 0.0s
=> [internal] load metadata for docker.io/library/python:3.8 4.7s
=> [auth] library/python:pull token for registry-1.docker.io 0.0s
=> [internal] load build context 0.1s
=> => transferring context: 82.37kB 0.0s
=> [1/5] FROM docker.io/library/python:3.8@sha256:530de807b46a11734e2587a784573c12c5034f2f14025f838589e6c0e3 108.3s
=> => resolve docker.io/library/python:3.8@sha256:530de807b46a11734e2587a784573c12c5034f2f14025f838589e6c0e3b5 0.0s
=> => sha256:56182bcdf4d4283aa1f46944b4ef7ac881e28b4d5526720a4e9ba03a4730846a 2.22kB / 2.22kB 0.0s
=> => sha256:955615a668ce169f8a1443fc6b6e6215f43fe0babfb4790712a2d3171f34d366 54.93MB / 54.93MB 21.6s
=> => sha256:911ea9f2bd51e53a455297e0631e18a72a86d7e2c8e1807176e80f991bde5d64 10.87MB / 10.87MB 15.5s
=> => sha256:530de807b46a11734e2587a784573c12c5034f2f14025f838589e6c0e3b5c5b6 1.86kB / 1.86kB 0.0s
=> => sha256:ff08f08727e50193dcf499afc30594c47e70cc96f6fcfd1a01240524624264d0 8.65kB / 8.65kB 0.0s
=> => sha256:2756ef5f69a5190f4308619e0f446d95f5515eef4a814dbad0bcebbbbc7b25a8 5.15MB / 5.15MB 6.4s
=> => sha256:27b0a22ee906271a6ce9ddd1754fdd7d3b59078e0b57b6cc054c7ed7ac301587 54.57MB / 54.57MB 37.7s
=> => sha256:8584d51a9262f9a3a436dea09ba40fa50f85802018f9bd299eee1bf538481077 196.45MB / 196.45MB 82.3s
=> => sha256:524774b7d3638702fe9ae0ea3fcfb81b027dfd75cc2fc14f0119e764b9543d58 6.29MB / 6.29MB 26.6s
=> => extracting sha256:955615a668ce169f8a1443fc6b6e6215f43fe0babfb4790712a2d3171f34d366 5.4s
=> => sha256:9460f6b75036e38367e2f27bb15e85777c5d6cd52ad168741c9566186415aa26 16.81MB / 16.81MB 40.5s
=> => extracting sha256:2756ef5f69a5190f4308619e0f446d95f5515eef4a814dbad0bcebbbbc7b25a8 0.6s
=> => extracting sha256:911ea9f2bd51e53a455297e0631e18a72a86d7e2c8e1807176e80f991bde5d64 0.6s
=> => sha256:9bc548096c181514aa1253966a330134d939496027f92f57ab376cd236eb280b 232B / 232B 40.1s
=> => extracting sha256:27b0a22ee906271a6ce9ddd1754fdd7d3b59078e0b57b6cc054c7ed7ac301587 5.8s
=> => sha256:1d87379b86b89fd3b8bb1621128f00c8f962756e6aaaed264ec38db733273543 2.35MB / 2.35MB 41.8s
=> => extracting sha256:8584d51a9262f9a3a436dea09ba40fa50f85802018f9bd299eee1bf538481077 18.8s
=> => extracting sha256:524774b7d3638702fe9ae0ea3fcfb81b027dfd75cc2fc14f0119e764b9543d58 1.2s
=> => extracting sha256:9460f6b75036e38367e2f27bb15e85777c5d6cd52ad168741c9566186415aa26 2.9s
=> => extracting sha256:9bc548096c181514aa1253966a330134d939496027f92f57ab376cd236eb280b 0.0s
=> => extracting sha256:1d87379b86b89fd3b8bb1621128f00c8f962756e6aaaed264ec38db733273543 0.8s
=> [2/5] COPY . /app 2.3s
=> [3/5] WORKDIR /app 0.2s
=> [4/5] RUN pip install -U datasette 26.9s
=> [5/5] RUN datasette inspect covid.db --inspect-file inspect-data.json 3.1s
=> exporting to image 1.2s
=> => exporting layers 1.2s
=> => writing image sha256:b5db0c205cd3454c21fbb00ecf6043f261540bcf91c2dfc36d418f1a23a75d7a 0.0s
Use 'docker scan' to run Snyk tests against images to find vulnerabilities and learn how to fix them
Traceback (most recent call last):
""__main__"", mod_spec)
File ""c:\users\grott\anaconda3\lib\runpy.py"", line 85, in _run_code
exec(code, run_globals)
File ""C:\Users\grott\Anaconda3\Scripts\datasette.exe\__main__.py"", line 7, in
File ""c:\users\grott\anaconda3\lib\site-packages\click\core.py"", line 829, in __call__
return self.main(*args, **kwargs)
File ""c:\users\grott\anaconda3\lib\site-packages\click\core.py"", line 782, in main
rv = self.invoke(ctx)
File ""c:\users\grott\anaconda3\lib\site-packages\click\core.py"", line 1259, in invoke
return _process_result(sub_ctx.command.invoke(sub_ctx))
File ""c:\users\grott\anaconda3\lib\site-packages\click\core.py"", line 1066, in invoke
return ctx.invoke(self.callback, **ctx.params)
File ""c:\users\grott\anaconda3\lib\site-packages\click\core.py"", line 610, in invoke
return callback(*args, **kwargs)
File ""c:\users\grott\anaconda3\lib\site-packages\datasette\cli.py"", line 283, in package
call(args)
File ""c:\users\grott\anaconda3\lib\contextlib.py"", line 119, in __exit__
next(self.gen)
File ""c:\users\grott\anaconda3\lib\site-packages\datasette\utils\__init__.py"", line 451, in temporary_docker_directory
tmp.cleanup()
File ""c:\users\grott\anaconda3\lib\tempfile.py"", line 811, in cleanup
_shutil.rmtree(self.name)
File ""c:\users\grott\anaconda3\lib\shutil.py"", line 516, in rmtree
return _rmtree_unsafe(path, onerror)
File ""c:\users\grott\anaconda3\lib\shutil.py"", line 395, in _rmtree_unsafe
_rmtree_unsafe(fullname, onerror)
File ""c:\users\grott\anaconda3\lib\shutil.py"", line 404, in _rmtree_unsafe
onerror(os.rmdir, path, sys.exc_info())
File ""c:\users\grott\anaconda3\lib\shutil.py"", line 402, in _rmtree_unsafe
os.rmdir(path)
PermissionError: [WinError 32] The process cannot access the file because it is being used by another process: 'C:\\Users\\grott\\AppData\\Local\\Temp\\tmpkb27qid3\\datasette'```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1479/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1023243105,I_kwDOBm6k_c48_XNh,1486,pipx installation instructions for plugins don't reference pipx inject,41546558,RhetTbull,closed,0,,,,,0,2021-10-12T00:43:42Z,2021-10-13T21:09:11Z,2021-10-13T21:09:11Z,CONTRIBUTOR,,"The datasette [installation instructions](https://github.com/simonw/datasette/blob/main/docs/installation.rst) discuss how to install with pipx, how to upgrade with pipx, and how to upgrade plugins with pipx, but do not mention how to install a plugin with pipx. You discussed this on your [blog](https://til.simonwillison.net/python/installing-upgrading-plugins-with-pipx) but it looks like this didn't make it in when you updated the docs for pipx (#756).
I'll submit a PR shortly to fix this.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1486/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1015646369,I_kwDOBm6k_c48iYih,1480,Exceeding Cloud Run memory limits when deploying a 4.8G database,110420,ghing,open,0,,,,,9,2021-10-04T21:20:24Z,2022-10-07T04:39:10Z,,CONTRIBUTOR,,"When I try to deploy a 4.8G SQLite database to Google Cloud Run, I get this error message:
> Memory limit of 8192M exceeded with 8826M used. Consider increasing the memory limit, see https://cloud.google.com/run/docs/configuring/memory-limits
Unfortunately, the maximum amount of memory that can be allocated to an instance is 8192M.
Naively profiling the memory usage of running Datasette with this database on my MacBook (using Activity Monitor) shows the following when I just start up Datasette locally:
- Real Memory Size: 70.6 MB
- Virtual Memory Size: 4.51 GB
- Shared Memory Size: 2.5 MB
- Private Memory Size: 57.4 MB
I'm trying to understand if there's a query or other operation that gets run during container deployment that causes memory use to be so large and if this can be avoided somehow.
This is somewhat related to #1082, but on a different platform, so I decided to open a new issue.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1480/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1025754125,I_kwDOBm6k_c49I8QN,1488,Upgrade to httpx 0.20.0 (request() got an unexpected keyword argument 'allow_redirects'),9599,simonw,closed,0,,,,,5,2021-10-13T22:37:22Z,2021-10-14T18:03:45Z,2021-10-14T18:03:45Z,OWNER,,This is caused by a change made to `httpx` in https://github.com/encode/httpx/releases/tag/0.20.0,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1488/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1028115674,I_kwDOBm6k_c49R8za,1493,`--get '/:memory:.json?sql=select+3*5'` error with datasette 0.59,1580956,chenrui333,closed,0,,,,,1,2021-10-16T18:22:22Z,2021-10-19T04:39:11Z,2021-10-19T04:39:11Z,NONE,,"👋 trying to upgrade the formula to use the latest release, but ran into a regression test issue with the `--get` command.
My QQ is: is this `datasette --get '/:memory:.json?sql=select+3*5'` supposed to return 15? Thanks!
relates to https://github.com/Homebrew/homebrew-core/pull/87369",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1493/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1033864602,I_kwDOBm6k_c49n4Wa,1496,Named parameters docs should include an example of a cast,9599,simonw,closed,0,,,,,1,2021-10-22T18:56:04Z,2021-10-22T19:38:23Z,2021-10-22T19:34:27Z,OWNER,,"https://docs.datasette.io/en/stable/sql_queries.html#named-parameters
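An example of the kind of cast in question (a sketch; table and parameter names are made up):
```sql
-- Sketch: a named parameter arrives as a SQLite string, so cast it
-- before doing an integer comparison
select * from facetable where pk > cast(:min_pk as integer)
```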
It's not obvious that the values from parameters are always SQLite strings, which means that you can't do e.g. integer comparisons on them without casting them first. The documentation here should include an example of this.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1496/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1034535001,I_kwDOBm6k_c49qcBZ,1497,"Publish to Docker Hub failing with ""libcrypt.so.1: cannot open shared object file""",9599,simonw,closed,0,,,,,18,2021-10-24T22:57:07Z,2023-01-18T17:13:45Z,2021-10-24T23:36:55Z,OWNER,,"This means the Datasette 0.59.1 release has not been published to Docker Hub.
Here's where that failed: https://github.com/simonw/datasette/runs/3991043374?check_suite_focus=true
```
Preparing to unpack .../libc6_2.32-4_amd64.deb ...
debconf: unable to initialize frontend: Dialog
debconf: (TERM is not set, so the dialog frontend is not usable.)
debconf: falling back to frontend: Readline
debconf: unable to initialize frontend: Readline
debconf: (Can't locate Term/ReadLine.pm in @INC (you may need to install the Term::ReadLine module) (@INC contains: /etc/perl /usr/local/lib/x86_64-linux-gnu/perl/5.28.1 /usr/local/share/perl/5.28.1 /usr/lib/x86_64-linux-gnu/perl5/5.28 /usr/share/perl5 /usr/lib/x86_64-linux-gnu/perl/5.28 /usr/share/perl/5.28 /usr/local/lib/site_perl /usr/lib/x86_64-linux-gnu/perl-base) at /usr/share/perl5/Debconf/FrontEnd/Readline.pm line 7.)
debconf: falling back to frontend: Teletype
Checking for services that may need to be restarted...
Checking init scripts...
Unpacking libc6:amd64 (2.32-4) over (2.28-10) ...
Setting up libc6:amd64 (2.32-4) ...
/usr/bin/perl: error while loading shared libraries: libcrypt.so.1: cannot open shared object file: No such file or directory
dpkg: error processing package libc6:amd64 (--configure):
installed libc6:amd64 package post-installation script subprocess returned error exit status 127
Errors were encountered while processing:
libc6:amd64
E: Sub-process /usr/bin/dpkg returned an error code (1)
The command '/bin/sh -c apt-get update && apt-get -y --no-install-recommends install software-properties-common && add-apt-repository ""deb http://httpredir.debian.org/debian sid main"" && apt-get update && apt-get -t sid install -y --no-install-recommends libsqlite3-mod-spatialite && apt-get remove -y software-properties-common && apt clean && rm -rf /var/lib/apt && rm -rf /var/lib/dpkg/info/*' returned a non-zero code: 100
```
Same problem when I attempted to publish using the ""Push specific Docker tag"" workflow: https://github.com/simonw/datasette/runs/3991059912?check_suite_focus=true",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1497/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1072106103,I_kwDOBm6k_c4_5wp3,1542,feature request: order and dependency of plugins (that use js),33631,fs111,open,0,,,,,1,2021-12-06T12:40:45Z,2021-12-15T17:47:08Z,,NONE,,"I have been playing with datasette for the last couple of weeks and it is great! I am a big fan of `datasette-cluster-map` and wanted to enhance it a bit with what I would call a sub-plugin. I basically want to add more controls to the map that cluster map provides. I have been looking into its code and how the plugin management works, but it seems what I am trying to do is not doable without hacks in js.
Basically what I would like to have is a way to say: load my plugin after the plugins I depend on have been loaded and rendered. There seems to be no prior art for plugins having these dependencies on the js level, so I was wondering if that could be added, or, if it already exists, how to do it.
Basically what I want to do is:
my-awesome-plugin has a dependency on datasette-cluster-map. Whenever datasette-cluster-map has finished rendering on page load, call my plugin, but no earlier. To make that work, datasette probably needs some total order in which plugins are loaded and initialized.
Since I am new to datasette, I may be missing something obvious, so please let me know if the above makes no sense.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1542/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1057996111,I_kwDOBm6k_c4_D71P,1517,Let `register_routes()` over-ride default routes within Datasette,9599,simonw,closed,0,,,3268330,Datasette 1.0,2,2021-11-19T00:22:15Z,2021-11-19T03:20:00Z,2021-11-19T03:07:27Z,OWNER,,"See https://github.com/simonw/datasette/issues/878#issuecomment-973554024 - right now `register_routes()` can't replace default Datasette routes.
It would be neat if plugins could do this - especially if there was a neat documented way for them to then re-dispatch to the original route code after making some kind of modification.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1517/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1058072543,I_kwDOBm6k_c4_EOff,1518,Complete refactor of TableView and table.html template,9599,simonw,open,0,,,3268330,Datasette 1.0,45,2021-11-19T02:55:16Z,2022-03-15T18:35:49Z,,OWNER,,"Split from #878. The current `TableView` class is by far the most complex part of Datasette, and the most difficult to work on: https://github.com/simonw/datasette/blob/0.59.2/datasette/views/table.py
In #878 I started exploring a new pattern for building views. In doing so it became clear that `TableView` is the first beast that I need to slay - if I can refactor that into something neat the pattern for building other views will emerge as a natural consequence.
I've been trying to build this as a `register_routes()` plugin, as originally suggested in #870 - though unfortunately it looks like those plugins can't replace existing Datasette default views at the moment, see #1517. [UPDATE: I was wrong about this, plugins can over-ride default views just fine]
I also know that I want to have a fully documented template context for `table.html` as a major step on the way to Datasette 1.0, see #1510.
All of this adds up to the `TableView` refactor being a major project that will unblock a whole flurry of other things - so I'm going to work on that in this separate issue.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1518/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1058790545,I_kwDOBm6k_c4_G9yR,1519,base_url is omitted in JSON and CSV views,157158,phubbard,closed,0,,,,,22,2021-11-19T18:10:45Z,2021-12-01T17:50:09Z,2021-11-20T19:11:21Z,NONE,,"I have a datasette deployment, using Apache2 to reverse proxy:
ProxyPass /ged http://thor.phfactor.net:8001
ProxyPreserveHost On
In settings.json I have
```json
{
""base_url"": ""/ged/"",
""trace_debug"": 1,
""template_debug"": 1
}
```
and datasette works correctly. However, if you view a query and then click on the 'This data as json, CSV' links, both omit the base_url prefix and are therefore 404s.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1519/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1058803238,I_kwDOBm6k_c4_HA4m,1520,Pattern for avoiding accidental URL over-rides,9599,simonw,open,0,,,,,1,2021-11-19T18:28:05Z,2021-11-19T18:29:26Z,,OWNER,,"Following #1517 I'm experimenting with a plugin that does this:
```python
@hookimpl
def register_routes():
return [
(r""/(?P[^/]+)/(?P[^/]+?)$"", Table().view),
]
```
This is supposed to replace the default table page with new code... but there's a problem: `/-/versions` on that instance now returns 404 `Database '-' does not exist`!
Need to figure out a pattern to avoid that happening. Plugins get to add their routes before Datasette's default routes, which is why this is happening here.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1520/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1058815557,I_kwDOBm6k_c4_HD5F,1521,Docker configuration for exercising Datasette behind Apache mod_proxy,9599,simonw,closed,0,,,,,10,2021-11-19T18:46:18Z,2021-11-19T20:32:29Z,2021-11-19T20:32:29Z,OWNER,,"> Having a live demo running on Cloud Run that proxies through Apache and uses `base_url` would be incredibly useful for replicating and debugging this kind of thing. I wonder how hard it is to run Apache and `mod_proxy` in the same Docker container as Datasette?
_Originally posted by @simonw in https://github.com/simonw/datasette/issues/1519#issuecomment-974310208_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1521/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1058896236,I_kwDOBm6k_c4_HXls,1522,Deploy a live instance of demos/apache-proxy,9599,simonw,closed,0,,,,,34,2021-11-19T20:32:55Z,2021-11-23T03:00:34Z,2021-11-20T18:51:56Z,OWNER,,"> I'll get this working on my laptop first, but then I want to get it up and running on Cloud Run - maybe with a GitHub Actions workflow in this repo that re-deploys it on manual execution.
_Originally posted by @simonw in https://github.com/simonw/datasette/issues/1521#issuecomment-974322178_
I started by following https://ahmet.im/blog/cloud-run-multiple-processes-easy-way/ - see example in https://github.com/ahmetb/multi-process-container-lazy-solution",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1522/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1059209412,I_kwDOBm6k_c4_IkDE,1523,Come up with a more elegant solution for base_url than ds.urls.path(),9599,simonw,open,0,,,,,0,2021-11-20T19:05:22Z,2021-11-20T19:05:22Z,,OWNER,,"While fixing #1519 I added a lot of ugly code that looks like this: https://github.com/simonw/datasette/blob/08947fa76433d18988aa1ee1d929bd8320c75fe2/datasette/facets.py#L228-L230
See these two commits in particular: fe687fd0207c4c56c4778d3e92e3505fc4b18172 and 08947fa76433d18988aa1ee1d929bd8320c75fe2
It would be great to come up with a less verbose and error-prone way of handling this problem.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1523/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1059219106,I_kwDOBm6k_c4_Imai,1524,"Improve Apache proxy documentation, link to demo",9599,simonw,closed,0,,,,,4,2021-11-20T20:03:14Z,2021-11-20T23:34:03Z,2021-11-20T23:34:03Z,OWNER,,"> The latest demo is now live at https://datasette-apache-proxy-demo.fly.dev/prefix/fixtures/sortable?_facet=pk2
_Originally posted by @simonw in https://github.com/simonw/datasette/issues/1519#issuecomment-974697824_
I'm going to put out 0.59.3 bugfix release with this, but I'd like to first improve the documentation on https://docs.datasette.io/en/stable/deploying.html#apache-proxy-configuration to highlight the new demo.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1524/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1059549523,I_kwDOBm6k_c4_J3FT,1526,"Add to vercel.json, rather than overwriting it.",192568,mroswell,closed,0,,,,,2,2021-11-22T00:47:12Z,2021-11-22T04:49:45Z,2021-11-22T04:13:47Z,CONTRIBUTOR,,"I'd like to be able to add to vercel.json. But Datasette overwrites whatever I put in that file. I originally reported this here:
https://github.com/simonw/datasette-publish-vercel/issues/51
In that case, I wanted to do a rewrite... and now I need to do 301 redirects (because we had to rename our site).
Can this be addressed?
",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1526/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1059555791,I_kwDOBm6k_c4_J4nP,1527,Columns starting with an underscore behave poorly in filters,9599,simonw,closed,0,,,7571612,Datasette 0.60,7,2021-11-22T01:01:36Z,2022-01-14T00:57:08Z,2022-01-14T00:57:08Z,OWNER,,"Similar bug to #1525 (and #1506 before it). Start on https://latest.datasette.io/fixtures/facetable?_facet=_neighborhood - then select a neighborhood - then try to remove that filter using the little ""x"" and submitting the form again.
![filter-bug](https://user-images.githubusercontent.com/9599/142786754-31d265a2-944d-4ea2-af6f-305d445a2ccb.gif)
",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1527/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1059509927,I_kwDOBm6k_c4_Jtan,1525,"""Links from other tables"" broken for columns starting with underscore",9599,simonw,closed,0,,,,,3,2021-11-21T22:55:08Z,2021-11-30T06:39:01Z,2021-11-30T06:34:35Z,OWNER,,"Same bug as #1506, this time it's this link on the row page:
",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1525/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1060631257,I_kwDOBm6k_c4_N_LZ,1528,"Add new `""sql_file""` key to Canned Queries in metadata?",15178711,asg017,open,0,,,,,3,2021-11-22T21:58:01Z,2022-06-10T03:23:08Z,,CONTRIBUTOR,,"Currently for canned queries, you have to inline SQL in your `metadata.yaml` like so:
```yaml
databases:
fixtures:
queries:
neighborhood_search:
sql: |-
select neighborhood, facet_cities.name, state
from facetable
join facet_cities on facetable.city_id = facet_cities.id
where neighborhood like '%' || :text || '%'
order by neighborhood
title: Search neighborhoods
```
This works fine, but for a few reasons I usually have my canned queries already written in separate `.sql` files. I'd like to re-use those instead of re-writing them.
So, I'd like to see a new `""sql_file""` key that works like so:
`metadata.yaml`:
```yaml
databases:
fixtures:
queries:
neighborhood_search:
sql_file: neighborhood_search.sql
title: Search neighborhoods
```
`neighborhood_search.sql`:
```sql
select neighborhood, facet_cities.name, state
from facetable
join facet_cities on facetable.city_id = facet_cities.id
where neighborhood like '%' || :text || '%'
order by neighborhood
```
Both of these would work in the exact same way, where Datasette would instead open + include `neighborhood_search.sql` on startup.
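Resolution could be as simple as something like this (a sketch of the idea only, with a hypothetical helper name; not actual Datasette internals):
```python
# Sketch: resolve a canned query's SQL from either key. Hypothetical
# helper, not Datasette's actual code; paths resolve relative to the
# directory containing metadata.yaml.
from pathlib import Path

def canned_query_sql(query_config, metadata_dir):
    if 'sql' in query_config:
        return query_config['sql']
    if 'sql_file' in query_config:
        return (Path(metadata_dir) / query_config['sql_file']).read_text()
    raise ValueError('canned query needs either sql or sql_file')
```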
A few reasons why I'd like to keep my canned queries SQL separate from metadata.yaml:
- Keeping SQL in standalone SQL files means syntax highlighting and other text editor integrations in my code
- Multiline strings in yaml, while functional, are a tad cumbersome and are hard to edit
- Works well with other tools (can pipe `.sql` files into the `sqlite3` CLI, or use with other SQLite clients easier)
- Typically my canned queries are quite long compared to everything else in my metadata.yaml, so I'd love to separate it where possible
Let me know if this is a feature you'd like to see, I can try to send up a PR if this sounds right!",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1528/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1073712378,I_kwDOBm6k_c4__4z6,1544,Code that detects the label column for a table is case-sensitive,9599,simonw,closed,0,,,,,2,2021-12-07T20:01:25Z,2021-12-07T20:03:43Z,2021-12-07T20:03:43Z,OWNER,,I just noticed that a column called `Name` is not being picked up as the label column for a table.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1544/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1065429936,I_kwDOBm6k_c4_gSuw,1532,Use datasette-table Web Component to guide the design of the JSON API for 1.0,9599,simonw,open,0,,,3268330,Datasette 1.0,4,2021-11-28T20:37:18Z,2022-03-16T20:13:34Z,,OWNER,,"I realized that one of the reasons I'm having trouble committing to nailing down the JSON API for 1.0 is that I don't use it much myself - I use the `?_shape=array` one quite often, but I don't have any projects that are using the default, more fully-featured API.
As an experiment I built a Web Component for embedding Datasette tables on pages - https://github.com/simonw/datasette-table - and I think it's actually going to be a really useful tool for helping me dog food the v1.0 API design.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1532/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1065431383,I_kwDOBm6k_c4_gTFX,1533,"Add `Link: rel=""alternate""` header pointing to JSON for a table/query",9599,simonw,closed,0,,,3268330,Datasette 1.0,4,2021-11-28T20:43:25Z,2022-02-02T07:56:51Z,2022-02-02T07:49:33Z,OWNER,,"Originally explored in https://github.com/simonw/datasette-notebook/issues/2#issuecomment-980789406 - I wanted an efficient way to scan a list of URLs and figure out which if any of those corresponded to Datasette tables, canned queries or SQL output that could be represented as a table on a page.
It looks like a neat way to do that is with a `Link:` header like this:
`Link: http://127.0.0.1:8058/fixtures/compound_three_primary_keys.json; rel=""alternate""; type=""application/datasette+json""`
I can put a `
> The query_only pragma prevents data changes on database files when enabled. When this pragma is enabled, any attempt to CREATE, DELETE, DROP, INSERT, or UPDATE will result in an [SQLITE_READONLY](https://www.sqlite.org/rescode.html#readonly) error. However, the database is not truly read-only. You can still run a [checkpoint](https://www.sqlite.org/wal.html#ckpt) or a [COMMIT](https://www.sqlite.org/lang_transaction.html) and the return value of the [sqlite3_db_readonly()](https://www.sqlite.org/c3ref/db_readonly.html) routine is not affected.
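Enabling it on a connection takes only a couple of lines (a sketch using the standard library `sqlite3` module):
```python
# Sketch: open a connection read-only AND set query_only as a second
# layer of protection (standard sqlite3 APIs)
import sqlite3

conn = sqlite3.connect('file:fixtures.db?mode=ro', uri=True)
conn.execute('PRAGMA query_only = 1')
```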
Would it be worth adding this as an extra protection against accidental writes to a DB file over a read-only connection?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1539/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1068791148,I_kwDOBm6k_c4_tHVs,1540,Idea: hover to reveal details of linked row,9599,simonw,open,0,,,,,6,2021-12-01T19:28:07Z,2021-12-09T23:38:39Z,,OWNER,,"
Hovering over that could work a little bit like GitHub issue links:
![hover](https://user-images.githubusercontent.com/9599/144300537-9cd9e9af-ac16-42db-842f-37661bc94063.gif)
",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1540/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1069881276,I_kwDOBm6k_c4_xRe8,1541,Different default layout for row page,9599,simonw,open,0,,,,,1,2021-12-02T18:56:36Z,2021-12-02T18:56:54Z,,OWNER,,"The row page displays as a table even though it only has one table row.
Maybe default to the same display as the narrow page version, even for wide pages?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1541/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1955676270,I_kwDOBm6k_c50kUBu,2201,Discord invite link is invalid,11708906,andrewsanchez,open,0,,,,,0,2023-10-21T21:50:05Z,2023-10-21T21:50:05Z,,NONE,,"https://datasette.io/discord leads to https://discord.com/invite/ktd74dm5mw and returns the following:
",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2201/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1977726056,I_kwDOBm6k_c514bRo,2203,custom plugin not seen as sql function,7113541,LyzardKing,open,0,,,,,0,2023-11-05T10:30:19Z,2023-11-05T10:30:19Z,,NONE,,"Hi, I'm not sure if this is the right repo for this issue.
I'm using datasette with the parquet plugin (to read a DuckDB database) and the jellyfish plugin. Both work perfectly.
Now I need to create a simple plugin that uses the python rouge package and returns a similarity score (similarly to how the jellyfish plugin works).
If I create a custom plugin, even the example hello_world one, copied directly from the tutorial, I get the following error:
```duckdb.duckdb.CatalogException: Catalog Error: Scalar Function with name hello_world does not exist!```
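For reference, the tutorial-style plugin in question uses the `prepare_connection` hook, roughly like this (a sketch from memory of the documented example):
```python
# Sketch: the prepare_connection hook registers a SQL function on each
# connection Datasette opens (this is the documented hello_world pattern)
from datasette import hookimpl

@hookimpl
def prepare_connection(conn):
    conn.create_function('hello_world', 0, lambda: 'Hello world!')
```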
Since the jellyfish plugin doesn't do anything more complex, I'm wondering if there is some other kind of issue with my setup.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2203/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1978023780,I_kwDOBm6k_c515j9k,2205,request.post_vars() method obliterates form keys with multiple values,9599,simonw,open,0,,,8755003,Datasette 1.0a-next,3,2023-11-05T23:25:08Z,2023-11-06T04:10:34Z,,OWNER,,"https://github.com/simonw/datasette/blob/452a587e236ef642cbc6ae345b58767ea8420cb5/datasette/utils/asgi.py#L137-L139
In GET requests you can do `?foo=1&foo=2` - you can do the same in POST requests, but the `dict()` call here eliminates those duplicates.
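A duplicate-preserving parse is straightforward (a sketch; the function name is made up):
```python
# Sketch: accumulate repeated keys into lists instead of collapsing
# them with dict()
from urllib.parse import parse_qsl

def post_vars_multi(body: str) -> dict:
    multi = {}
    for key, value in parse_qsl(body):
        multi.setdefault(key, []).append(value)
    return multi

# post_vars_multi('foo=1&foo=2') == {'foo': ['1', '2']}
```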
You can't even try calling `post_body()` and implement your own custom parsing because of:
- #2204",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2205/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1978022687,I_kwDOBm6k_c515jsf,2204,request.post_body() can only be called once,9599,simonw,open,0,,,,,0,2023-11-05T23:22:03Z,2023-11-05T23:23:23Z,,OWNER,,"This code here:
https://github.com/simonw/datasette/blob/452a587e236ef642cbc6ae345b58767ea8420cb5/datasette/utils/asgi.py#L127-L135
It consumes the messages, which means if you try to call it a second time you won't be able to get at the body.
This is efficient - we don't end up with a `request` object property with potentially megabytes of content that we never look at again - but it's inconvenient for cases like middleware or functions where we don't know if the body has been consumed yet or not.
Potential solution: set `request._body` the first time it is called, and return that on subsequent calls.
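That could look something like this (a sketch against the generic ASGI receive loop, not the actual method):
```python
# Sketch of the caching idea: drain the ASGI receive channel once,
# then serve the cached bytes on later calls (not Datasette's code)
async def post_body(self):
    if not hasattr(self, '_body'):
        body = b''
        more_body = True
        while more_body:
            message = await self.receive()
            body += message.get('body', b'')
            more_body = message.get('more_body', False)
        self._body = body
    return self._body
```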
Potential optimization: only do this for bodies that are shorter than a certain threshold - maybe 1MB - and raise an exception if you attempt to call `post_body()` multiple times against one of those larger bodies.
I'm a bit nervous about that option though, since it could result in errors that don't show up in testing but do show up in production.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2204/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1994845152,I_kwDOBm6k_c525uvg,2207,ModuleNotFoundError: No module named 'click_default_group',283441,honzajavorek,open,0,,,,,0,2023-11-15T14:04:32Z,2023-11-15T14:04:32Z,,NONE,,"No matter what I do, I'm getting this error:
```
$ datasette
Traceback (most recent call last):
File ""/Users/honza/Library/Caches/pypoetry/virtualenvs/juniorguru-Lgaxwd2n-py3.11/bin/datasette"", line 5, in
from datasette.cli import cli
File ""/Users/honza/Library/Caches/pypoetry/virtualenvs/juniorguru-Lgaxwd2n-py3.11/lib/python3.11/site-packages/datasette/cli.py"", line 6, in
from click_default_group import DefaultGroup
ModuleNotFoundError: No module named 'click_default_group'
```
I have datasette in my dependencies like this:
```toml
[tool.poetry.group.dev.dependencies]
datasette = {version = ""1.0a7"", allow-prereleases = true}
```
I had the latest regular version (not pre-release) there originally, but the result was the same:
```toml
[tool.poetry.group.dev.dependencies]
datasette = ""0.64.5""
```
Full pyproject.toml is at https://github.com/honzajavorek/junior.guru/ - previously Datasette worked for me, but I guess something got upgraded and now I can't even launch it.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2207/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1994857251,I_kwDOBm6k_c525xsj,2208,No suggested facets when a column named 'value' is included,198537,rgieseke,open,0,,,,,1,2023-11-15T14:11:17Z,2023-11-15T14:18:59Z,,CONTRIBUTOR,,"When a column named 'value' is included, no suggested facets are shown, because the query uses an alias of 'value'.
https://github.com/simonw/datasette/blob/452a587e236ef642cbc6ae345b58767ea8420cb5/datasette/facets.py#L168-L174
Currently the following is shown (from https://latest.datasette.io/fixtures/facetable)
![image](https://github.com/simonw/datasette/assets/198537/a919509a-ea88-461b-b25b-8b776720c7c5)
When I add a column named 'value' only the JSON facets are processed.
![image](https://github.com/simonw/datasette/assets/198537/092bd0b3-4c20-434e-88f8-47e2b8994a1d)
I think that not using aliases could be a solution (except if someone wants to use a column named `count(*)`, though this seems unlikely). I'll open a PR with that.
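A hypothetical illustration of the clash (a sketch only; name resolution for bare aliases varies between clauses and engines):
```python
import sqlite3

conn = sqlite3.connect(':memory:')
conn.executescript('''
    create table t (category text, value integer);
    insert into t values ('a', 1), ('a', 2), ('b', 1);
''')

# The facet-suggestion pattern aliases the faceted column as 'value'.
# With a real column also named 'value', the bare name in the group by
# clause becomes ambiguous and may resolve to the table column rather
# than the alias:
sql = '''
    select category as value, count(*) as n
    from t
    group by value
    order by n desc
'''
print(conn.execute(sql).fetchall())
```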
There is also a TODO with a similar question in the same file. I have not looked into that yet.
https://github.com/simonw/datasette/blob/452a587e236ef642cbc6ae345b58767ea8420cb5/datasette/facets.py#L512",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2208/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
2028698018,I_kwDOBm6k_c5463mi,2213,feature request: gzip compression of database downloads,536941,fgregg,open,0,,,,,1,2023-12-06T14:35:03Z,2023-12-06T15:05:46Z,,CONTRIBUTOR,,"At the bottom of database pages, datasette gives users the opportunity to download the underlying sqlite database. It would be great if that could be served gzip compressed.
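One low-tech sketch of the idea (pre-compressing a copy so a proxy or CDN can serve it with `Content-Encoding: gzip`; the filenames here are hypothetical):
```python
import gzip
import shutil

# Write a gzip-compressed copy of the database alongside the original
with open('records.db', 'rb') as src:
    with gzip.open('records.db.gz', 'wb') as dst:
        shutil.copyfileobj(src, dst)
```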
This is similar to #1213, but in my case I don't need Datasette to compress HTML and JSON because my CDN layer does that for me. Cloudflare, at least, will not compress a mimetype of ""application""
(see list of mimetype: https://developers.cloudflare.com/speed/optimization/content/brotli/content-compression/)",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2213/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
2019811176,I_kwDOBm6k_c54Y99o,2211,Unreachable exception handlers for `sqlite3.OperationalError`,1214074,mattparmett,open,0,,,,,0,2023-12-01T00:50:22Z,2023-12-01T00:50:22Z,,NONE,,"There are several places where `sqlite3.OperationalError` is caught as part of an exception handler which catches multiple exceptions, but is then caught again immediately afterwards by a dedicated exception handler.
Because the exception will be caught by the first handler, the logic in the second handler is unreachable and will never be executed. If this is intended behavior, the second handler can be removed. If this is not intended, and the second handler should be the one that catches this exception, then `sqlite3.OperationalError` should be removed from the tuple of exceptions in the first handler.
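The pattern in miniature (a standalone illustration, not Datasette's code):
```python
import sqlite3

try:
    # Opening a missing file read-only raises sqlite3.OperationalError
    sqlite3.connect('file:missing.db?mode=ro', uri=True)
except (sqlite3.OperationalError, ValueError) as ex:
    print('combined handler caught:', ex)
except sqlite3.OperationalError as ex:
    # Unreachable: the handler above already caught the exception
    print('dedicated handler caught:', ex)
```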
This issue was found via a CodeQL query on the repository, and I've listed the occurrences found by the query below. There may be other instances of this issue in the code that were not surfaced by the query. I'd be happy to share the query if others would like to view or run it.
One example:
https://github.com/simonw/datasette/blob/452a587e236ef642cbc6ae345b58767ea8420cb5/datasette/views/database.py#L534-L537
Other instances:
https://github.com/simonw/datasette/blob/main/datasette/views/base.py#L266-L270
https://github.com/simonw/datasette/blob/main/datasette/views/base.py#L452-L456",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2211/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
2029908157,I_kwDOBm6k_c54_fC9,2214,CSV export fails for some `text` foreign key references,2874,precipice,open,0,,,,,1,2023-12-07T05:04:34Z,2023-12-07T07:36:34Z,,NONE,,"I'm starting this issue without a clear reproduction in case someone else has seen this behavior, and to use the issue as a notebook for research.
I'm using Datasette with the [SWITRS](https://iswitrs.chp.ca.gov/) data set, which is a California Highway Patrol collection of traffic incident data from the past decade or so. I receive data from them in CSV and want to work with it in Datasette, then export it to CSV for mapping in Felt.com.
Their data makes extensive use of codes for incident column data (`1` for `Monday` and so on), some of it integer codes and some of it letter/text codes. The text codes are sometimes blank or `-`. During import, I'm creating lookup tables for foreign key references to make the Datasette UI presentation of the data easier to read.
If I import the data and set up the integer foreign keys, everything works fine, but if I set up the text foreign keys, CSV export starts to fail.
The foreign key configuration is as follows:
```bash
# Some tables use integer ids, like sensible tables do. Let's import them first
# since we favor them.
for TABLE in DAY_OF_WEEK CHP_SHIFT POPULATION SPECIAL_COND BEAT_TYPE COLLISION_SEVERITY
do
sqlite-utils create-table records.db $TABLE id integer name text --pk=id
sqlite-utils insert records.db $TABLE lookup-tables/$TABLE.csv --csv
sqlite-utils add-foreign-key records.db collisions $TABLE $TABLE id
sqlite-utils create-index records.db collisions $TABLE
done
# *Other* tables use letter keys, like they were raised by WOLVES. Let's put them
# at the end of the import queue.
for TABLE in WEATHER_1 WEATHER_2 LOCATION_TYPE RAMP_INTERSECTION SIDE_OF_HWY \
PRIMARY_COLL_FACTOR PCF_CODE_OF_VIOL PCF_VIOL_CATEGORY TYPE_OF_COLLISION MVIW \
PED_ACTION ROAD_SURFACE ROAD_COND_1 ROAD_COND_2 LIGHTING CONTROL_DEVICE \
STWD_VEHTYPE_AT_FAULT CHP_VEHTYPE_AT_FAULT PRIMARY_RAMP SECONDARY_RAMP
do
sqlite-utils create-table records.db $TABLE key text name text --pk=key
sqlite-utils insert records.db $TABLE lookup-tables/$TABLE.csv --csv
sqlite-utils add-foreign-key records.db collisions $TABLE $TABLE key
sqlite-utils create-index records.db collisions $TABLE
done
```
You can see the full code and import script here: https://github.com/radical-bike-lobby/switrs-db
If I run this code and then hit the CSV export link in the Datasette interface (the simple link or the ""advanced"" dialog), export fails after a small number of CSV rows are written. I am not seeing any detailed error messages but this appears in the logging output:
```
INFO: 127.0.0.1:57885 - ""GET /records/collisions.csv?_facet=PRIMARY_RD&PRIMARY_RD=ASHBY+AV&_labels=on&_size=max HTTP/1.1"" 200 OK
Caught this error:
```
(No other output follows `Caught this error:` other than a blank line.)
I've stared at the rows directly after the error occurs and can't yet see what is causing the problem. I'm going to set up a development environment and see if I get any more detailed error output, and then stare more at some problematic lines to see if I can get a simple reproduction.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2214/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
2023057255,I_kwDOBm6k_c54lWdn,2212,Can't filter with numbers,605070,fzakaria,open,0,,,,,0,2023-12-04T05:26:29Z,2023-12-04T05:26:29Z,,NONE,,"I have a schema that uses numbers for a column (actually it's a boolean 1 or 0 but SQLite doesn't have Boolean).
I can't seem to get the facet to work, or even filter on this column.
My guess is that Datasette is ""stringifying"" the number and it's not matching?
Example: https://debian-sqlelf.fly.dev/debian/elf_symbols?_sort_desc=name&_facet=exported&exported=0",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2212/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1087913724,I_kwDOBm6k_c5A2D78,1577,Drop support for Python 3.6,9599,simonw,closed,0,,,3268330,Datasette 1.0,6,2021-12-23T18:17:03Z,2022-01-25T23:30:03Z,2022-01-20T04:31:41Z,OWNER,,"*Original title: Decide when to drop support for Python 3.6*
> `context_vars` can solve this but they were introduced in Python 3.7: https://www.python.org/dev/peps/pep-0567/
>
> Python 3.6 support ends in a few days time, and it looks like Glitch has updated to 3.7 now - so maybe I can get away with Datasette needing 3.7 these days?
>
> Tweeted about that here: https://twitter.com/simonw/status/1473761478155010048
_Originally posted by @simonw in https://github.com/simonw/datasette/issues/1576#issuecomment-999878907_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1577/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1087919372,I_kwDOBm6k_c5A2FUM,1578,Confirm if documented nginx proxy config works for row pages with escaped characters in their primary key,9599,simonw,open,0,,,,,4,2021-12-23T18:27:59Z,2021-12-24T21:33:19Z,,OWNER,,"Found this while working on https://github.com/simonw/datasette-tiddlywiki
Then clicking on `/tiddlywiki/tiddlers/%24%3A%2FDefaultTiddlers` returns a 404.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1578/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1087931918,I_kwDOBm6k_c5A2IYO,1579,`.execute_write(... block=True)` should be the default behaviour,9599,simonw,closed,0,,,7571612,Datasette 0.60,7,2021-12-23T18:54:28Z,2022-01-13T22:28:08Z,2021-12-23T19:18:26Z,OWNER,,"Every single piece of code I've written against the write APIs has used the `block=True` option to wait for the result.
Without that, it instead fires the write into the queue but then continues even before it has finished executing.
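A small sketch of the difference, assuming a Datasette version where `Datasette(memory=True)` works without `files=` (see #1563):
```python
import asyncio
from datasette.app import Datasette

async def main():
    ds = Datasette(memory=True)
    db = ds.add_memory_database('demo')

    # Fire-and-forget: queues the write and returns immediately
    await db.execute_write('create table docs (title text)')

    # block=True: waits for the write to finish before continuing
    await db.execute_write(
        'insert into docs (title) values (?)', ['hello'], block=True
    )

asyncio.run(main())
```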
`block=True` should clearly be the default behaviour here!",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1579/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1089529555,I_kwDOBm6k_c5A8ObT,1581,"when hashed urls are turned on, the _memory db has improperly long-lived cache expiry",536941,fgregg,closed,0,,,,,1,2021-12-28T00:05:48Z,2022-03-24T04:08:18Z,2022-03-24T04:08:18Z,CONTRIBUTOR,,"If hashed URLs are on, then a `-000` suffix is added to the `_memory` database, and the cache settings are set just as if it were a normal hashed database.
In particular, this header is set:
`cache-control: max-age=31536000`
This is not appropriate because the `_memory-000` database isn't really hashed based on the contents of the databases (see #1561).
Either the cache-control header should be changed, or the _memory db should have a hash suffix that does depend on the contents of the databases.
",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1581/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1076057610,I_kwDOBm6k_c5AI1YK,1546,validating the sql,50336793,jadsongmatos,closed,0,,,,,1,2021-12-09T21:35:57Z,2021-12-18T02:05:17Z,2021-12-18T02:05:16Z,NONE,,Could someone tell me which part of the code is responsible for validating the SQL that guarantees that only a table can be read?,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1546/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1075893249,I_kwDOBm6k_c5AINQB,1545,Custom pages don't work on windows,559711,ryascott,closed,0,,,,,3,2021-12-09T18:53:05Z,2022-02-03T02:08:31Z,2022-02-03T01:58:35Z,NONE,,"It seems that custom pages don't work when put in templates/pages
To reproduce on Datasette version 0.59.4, using PowerShell on Windows 10 with Python 3.10.0:
```
mkdir -p templates/pages
echo ""hello world"" >> templates/pages/about.html
```
Start Datasette:
```
datasette --template-dir templates/
```
Navigate to http://127.0.0.1:8001/about and receive:
```
Error 404:
Database not found: about
```
",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1545/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1076388044,I_kwDOBm6k_c5AKGDM,1547,Writable canned queries fail to load custom templates,127565,wragge,closed,0,,,7571612,Datasette 0.60,6,2021-12-10T03:31:48Z,2022-01-13T22:27:59Z,2021-12-19T21:12:00Z,CONTRIBUTOR,,"I've created a canned query with `""write"": true` set. I've also created a custom template for it, but the template doesn't seem to be found. If I look in the HTML I see (`stock_exchange` is the db name):
``
My non-writeable canned queries pick up custom templates as expected, and if I look at their HTML I see the canned query name added to the templates considered (the canned query here is `date_search`):
``
So it seems like the writeable canned query is behaving differently for some reason. Is it an authentication thing? I'm using the built in `--root` authentication.
Thanks!
",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1547/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1077628073,I_kwDOBm6k_c5AO0yp,1550,Research option for returning all rows from arbitrary query,9599,simonw,open,0,,,,,2,2021-12-11T19:31:11Z,2021-12-11T23:43:24Z,,OWNER,,"Inspired by thinking about #1549 - returning ALL rows from an arbitrary query is a lot easier if you just run that query and keep iterating over the cursor.
I've avoided doing that in the past because it could tie up a connection for a long time - but in private instances this wouldn't be such a problem.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1550/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1077620955,I_kwDOBm6k_c5AOzDb,1549,Redesign CSV export to improve usability,536941,fgregg,open,0,,,3268330,Datasette 1.0,5,2021-12-11T19:02:12Z,2022-04-04T11:17:13Z,,CONTRIBUTOR,,"*Original title: Set content type for CSV so that browsers will attempt to download instead opening in the browser*
Right now, if the user clicks on the CSV related to a table or a query, the response header for the content type is
""content-type: text/plain; charset=utf-8""
Most browsers will try to open a file with this content-type in the browser.
This is not what most people want to do, and lots of folks don't know that, if they want to download the CSV and open it in a spreadsheet program, they next need to save the page through their browser.
It would be great if the response header could be something like
```
Content-Type: text/csv
Content-Disposition: attachment; filename=""MyVerySpecial.csv""
```
which would lead browsers to open a download dialog.
",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1549/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1077893013,I_kwDOBm6k_c5AP1eV,1551,`keep_blank_values=True` when parsing `request.args`,9599,simonw,closed,0,,,7571612,Datasette 0.60,3,2021-12-12T19:53:07Z,2022-01-13T22:26:04Z,2021-12-12T20:02:01Z,OWNER,,"This code in `TableView` wouldn't be necessary: https://github.com/simonw/datasette/blob/492f9835aa7e90540dd0c6324282b109f73df71b/datasette/views/table.py#L396-L399
If that happened here instead: https://github.com/simonw/datasette/blob/492f9835aa7e90540dd0c6324282b109f73df71b/datasette/utils/asgi.py#L98-L100
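For reference, the stdlib behaviour the fix hinges on:
```python
from urllib.parse import parse_qs

qs = '_facet=&_sort=name'

print(parse_qs(qs))
# {'_sort': ['name']} - the blank _facet= is silently dropped

print(parse_qs(qs, keep_blank_values=True))
# {'_facet': [''], '_sort': ['name']} - blank values survive
```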
_Originally posted by @simonw in https://github.com/simonw/datasette/issues/1518#issuecomment-991827468_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1551/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1078702875,I_kwDOBm6k_c5AS7Mb,1552,Allow to set `facets_array` in metadata (like current `facets`),3556,davidbgk,closed,0,,,7571612,Datasette 0.60,9,2021-12-13T16:00:44Z,2022-01-13T22:26:15Z,2021-12-16T18:47:48Z,CONTRIBUTOR,,"For now, you can set a `facets` value (array) in your metadata file but I couldn't find a way to set a `facets_array` in order to provide default facets for arrays (like tags). My use-case is to access to [that kind of view](https://latest.datasette.io/fixtures/facetable?_facet_array=tags) by default without URL's parameters as with other default facets.
_I'm new to datasette, and I'm willing to help with a PR if that is not already implemented and I missed it!_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1552/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1079111498,I_kwDOBm6k_c5AUe9K,1553,if csv export is truncated in non streaming mode set informative response header,536941,fgregg,open,0,,,,,3,2021-12-13T22:50:44Z,2021-12-16T19:17:28Z,,CONTRIBUTOR,,"Streaming mode is currently not enabled for custom queries, so the queries will be truncated at the max row limit.
It would be great if, when a response is truncated, a header signalling that were set on the response.
I need to write some pagination code for getting full results back for a custom query, and it would make the code much better if I could reliably know when there is nothing more to limit/offset.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1553/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1079149656,I_kwDOBm6k_c5AUoRY,1555,Optimize all those calls to index_list and foreign_key_list,9599,simonw,closed,0,,,7571612,Datasette 0.60,27,2021-12-13T23:50:56Z,2022-01-13T22:27:32Z,2021-12-19T20:55:59Z,OWNER,,"On the first hit to a restarted index I'm seeing this in the SQL traces: https://latest-with-plugins.datasette.io/github/commits?_trace=1
I imagine this could be sped up a lot using tricks like this one from the SQLite documentation: https://sqlite.org/pragma.html#pragfunc
```sql
SELECT DISTINCT m.name || '.' || ii.name AS 'indexed-columns'
FROM sqlite_schema AS m,
pragma_index_list(m.name) AS il,
pragma_index_info(il.name) AS ii
WHERE m.type='table'
ORDER BY 1;
```
https://latest-with-plugins.datasette.io/fixtures?sql=SELECT+DISTINCT+m.name+%7C%7C+%27.%27+%7C%7C+ii.name+AS+%27indexed-columns%27%0D%0A++FROM+sqlite_schema+AS+m%2C%0D%0A+++++++pragma_index_list%28m.name%29+AS+il%2C%0D%0A+++++++pragma_index_info%28il.name%29+AS+ii%0D%0A+WHERE+m.type%3D%27table%27%0D%0A+ORDER+BY+1%3B",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1555/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1081318247,I_kwDOBm6k_c5Ac5tn,1556,"Show count of facet values always, not just for `?_facet_size=max`",9599,simonw,closed,0,,,7571612,Datasette 0.60,1,2021-12-15T17:49:01Z,2022-01-13T22:26:07Z,2021-12-15T17:58:06Z,OWNER,,"> You've caused me to rethink this feature - I no longer think there's value in only showing these numbers if `?_facet_size=max` as opposed to all of the time.
_Originally posted by @simonw in https://github.com/simonw/datasette/issues/1423#issuecomment-995023410_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1556/reactions"", ""total_count"": 1, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 1, ""rocket"": 0, ""eyes"": 0}",,completed
1082564912,I_kwDOBm6k_c5AhqEw,1557,`?_nosuggest=1` parameter for disabling facet suggestions on table view,9599,simonw,closed,0,,,7571612,Datasette 0.60,1,2021-12-16T19:21:42Z,2022-01-13T22:26:48Z,2021-12-16T19:24:59Z,OWNER,,"Found I wanted this while I was debugging #625 just to clean up the debug traces, but it makes sense as a partner to `?_nofacet=1` and `?_nocount=1` from #1350 and #1353.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1557/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1082584499,I_kwDOBm6k_c5Ahu2z,1558,Redesign `facet_results` JSON structure prior to Datasette 1.0,9599,simonw,open,0,,,3268330,Datasette 1.0,3,2021-12-16T19:45:10Z,2023-01-09T15:31:17Z,,OWNER,,"> Decision: as an initial fix I'm going to de-duplicate those keys by using `tags__array` etc - with a `_2` on the end if that key is already used.
>
> I'll open a separate issue to redesign this better for Datasette 1.0.
_Originally posted by @simonw in https://github.com/simonw/datasette/issues/625#issuecomment-996130862_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1558/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1082746149,I_kwDOBm6k_c5AiWUl,1560,"Table page title has ""where where"" in it",9599,simonw,closed,0,,,7571612,Datasette 0.60,0,2021-12-17T00:05:48Z,2022-01-13T22:28:35Z,2022-01-13T22:20:15Z,OWNER,,"Just noticed this while working on #1518.
```
% curl -s 'https://latest.datasette.io/fixtures/facetable?_sort=pk&on_earth__exact=1' | grep -C 1 '<title>'
<title>fixtures: facetable: 14 rows
where where on_earth = 1 sorted by pk</title>
```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1560/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1082765654,I_kwDOBm6k_c5AibFW,1561,"add hash id to ""_memory"" url if hashed url mode is turned on and crossdb is also turned on",536941,fgregg,closed,0,,,,,3,2021-12-17T00:45:12Z,2022-03-19T04:45:40Z,2022-03-19T04:45:40Z,CONTRIBUTOR,,"If hashed_url mode is turned on and crossdb is also turned on, then queries to _memory should have a hash_id.
One way that it could work is to have the _memory hash be a hash of all the individual databases.
Otherwise, crossdb queries can get quite out of date if using aggressive caching.
",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1561/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1083657868,I_kwDOBm6k_c5Al06M,1565,Documented JavaScript variables on different templates made available for plugins,9599,simonw,open,0,,,,,8,2021-12-17T22:30:51Z,2021-12-19T22:37:29Z,,OWNER,,"While working on https://github.com/simonw/datasette-leaflet-freedraw/issues/10 I found myself writing this atrocity to figure out the SQL query used for a specific table page:
```javascript
let innerSql = Array.from(document.getElementsByTagName(""span"")).filter(
el => el.innerText == ""View and edit SQL""
)[0].parentElement.getAttribute(""title"")
```
This is obviously bad - it's very brittle, and will break if I ever change the text on that link (like localizing it for example).
Instead, I think pages like that one should have a block of script at the bottom something like this:
```javascript
window.datasette = window.datasette || {};
datasette.view_name = 'table';
datasette.table_sql = 'select * from ...';
```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1565/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1083669410,I_kwDOBm6k_c5Al3ui,1566,Release Datasette 0.60,9599,simonw,closed,0,,,7571612,Datasette 0.60,6,2021-12-17T22:58:12Z,2022-01-14T01:59:55Z,2022-01-14T01:59:55Z,OWNER,,Using this as a tracking issue. I'm hoping to get the bulk of the JSON redesign work from the refactor in #1554 in for this release.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1566/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1083573206,I_kwDOBm6k_c5AlgPW,1563,Datasette(... files=) should not be a required argument,9599,simonw,closed,0,,,7571612,Datasette 0.60,2,2021-12-17T19:54:18Z,2022-01-13T22:27:18Z,2021-12-18T02:19:40Z,OWNER,,"```pycon
>>> ds = Datasette(memory=True)
Traceback (most recent call last):
File """", line 1, in
TypeError: __init__() missing 1 required positional argument: 'files'
>>> ds = Datasette(memory=True, files=[])
```
I wanted to create an in-memory Datasette for running some tests, no point in forcing me to pass `files=[]` to do that.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1563/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1083581011,I_kwDOBm6k_c5AliJT,1564,_prepare_connection not called on write connections,9599,simonw,closed,0,,,7571612,Datasette 0.60,1,2021-12-17T20:06:47Z,2022-01-20T21:29:43Z,2021-12-18T01:58:44Z,OWNER,,"I was trying to initialize SpatiaLite in a write connection:
```pycon
>>> from datasette.app import Datasette
>>> ds = Datasette(memory=True, files=[], sqlite_extensions=[""spatialite""])
>>> db = ds.add_memory_database('geo')
>>> await db.execute_write(""select InitSpatialMetadata(1)"")
UUID('3f143baa-4e3d-5842-a36f-4fa2f683b72f')
no such function: InitSpatialMetadata
```
It looks like the code that loads additional modules only works on read-only connections, not on write connections:
https://github.com/simonw/datasette/blob/92a5280d2e75c39424a75ad6226fc74400ae984f/datasette/database.py#L146-L153
Compared to:
https://github.com/simonw/datasette/blob/92a5280d2e75c39424a75ad6226fc74400ae984f/datasette/database.py#L124-L132",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1564/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1083921371,I_kwDOBm6k_c5Am1Pb,1570,Separate db.execute_write() into three methods,9599,simonw,closed,0,,,7571612,Datasette 0.60,2,2021-12-18T18:45:54Z,2022-01-13T22:27:38Z,2021-12-18T18:57:25Z,OWNER,,"> Rather than adding a `executemany=True` parameter, I'm now thinking a better design might be to have three methods:
>
> - `db.execute_write(sql, params=None, block=False)`
> - `db.execute_write_script(sql, block=False)`
> - `db.execute_write_many(sql, params_seq, block=False)`
_Originally posted by @simonw in https://github.com/simonw/datasette/issues/1555#issuecomment-997267416_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1570/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1083927147,I_kwDOBm6k_c5Am2pr,1571,Track number of executions for execute_write_many() in traces,9599,simonw,closed,0,,,7571612,Datasette 0.60,0,2021-12-18T19:16:17Z,2022-01-13T22:27:49Z,2021-12-19T20:30:40Z,OWNER,,"Spotted while working on #1555
There's no indication there of how many times `execute_write_many()` executed the SQL.
Solving this is a tiny bit tricky because `params_seq` is an iterator that we don't want to exhaust before passing it to `conn.executemany()` - so we need to instead wrap it in something that counts how many times it was called.
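A minimal sketch of such a counting wrapper (hypothetical, not the shipped implementation):
```python
class CountingIterator:
    # Wraps an iterator and counts items as they are consumed,
    # without exhausting it up front
    def __init__(self, inner):
        self._inner = iter(inner)
        self.count = 0

    def __iter__(self):
        return self

    def __next__(self):
        item = next(self._inner)  # propagates StopIteration when done
        self.count += 1
        return item

params_seq = CountingIterator({'id': i} for i in range(3))
for params in params_seq:
    pass  # stand-in for conn.executemany() consuming the iterator
print(params_seq.count)  # 3
```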
But then we need a way to attach that to the trace here: https://github.com/simonw/datasette/blob/d637ed46762fdbbd8e32b86f258cd9a53c1cfdc7/datasette/database.py#L115-L122
So probably need to redesign the `trace()` decorator to allow extra pairs to be attached to it within the `with` statement.
",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1571/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1083718998,I_kwDOBm6k_c5AmD1W,1567,Remove undocumented sqlite_functions mechanism,9599,simonw,closed,0,,,7571612,Datasette 0.60,0,2021-12-18T01:51:10Z,2022-01-13T22:27:04Z,2021-12-18T01:54:46Z,OWNER,,"I added this in 0b8c1b0a6da9cb8ac0d28cc90dd783de87554036 but it's never been documented and the same thing can now be achieved using the `prepare_connection` plugin hook.
https://github.com/simonw/datasette/blob/0c91e59d2bbfc08884cfcf5d1b902a2f4968b7ff/datasette/app.py#L262
https://github.com/simonw/datasette/blob/0c91e59d2bbfc08884cfcf5d1b902a2f4968b7ff/datasette/app.py#L551-L552
It's used here in the tests:
https://github.com/simonw/datasette/blob/69244a617b1118dcbd04a8f102173f04680cf08c/tests/fixtures.py#L156",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1567/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1083726550,I_kwDOBm6k_c5AmFrW,1568,Trace should show queries on the write connection too,9599,simonw,closed,0,,,7571612,Datasette 0.60,2,2021-12-18T02:34:12Z,2022-01-13T22:27:23Z,2021-12-18T02:42:34Z,OWNER,,"> Here's why - `trace` only applies to read, not write SQL operations: https://github.com/simonw/datasette/blob/7c8f8aa209e4ba7bf83976f8495d67c28fbfca24/datasette/database.py#L209-L211
_Originally posted by @simonw in https://github.com/simonw/datasette/issues/1555#issuecomment-997128508_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1568/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1083895395,I_kwDOBm6k_c5Amu5j,1569,"db.execute_write(..., executescript=True) parameter",9599,simonw,closed,0,,,7571612,Datasette 0.60,2,2021-12-18T18:20:47Z,2022-01-13T22:27:27Z,2021-12-18T18:34:18Z,OWNER,,"> Idea: teach `execute_write` to accept an optional `executescript=True` parameter, like this:
```diff
diff --git a/datasette/database.py b/datasette/database.py
index 468e936..1a424f5 100644
--- a/datasette/database.py
+++ b/datasette/database.py
@@ -94,10 +94,14 @@ class Database:
f""file:{self.path}{qs}"", uri=True, check_same_thread=False
)
- async def execute_write(self, sql, params=None, block=False):
+ async def execute_write(self, sql, params=None, executescript=False, block=False):
+ assert not (executescript and params), ""Cannot use params with executescript=True""
def _inner(conn):
with conn:
- return conn.execute(sql, params or [])
+ if executescript:
+ return conn.executescript(sql)
+ else:
+ return conn.execute(sql, params or [])
with trace(""sql"", database=self.name, sql=sql.strip(), params=params):
results = await self.execute_write_fn(_inner, block=block)
```
_Originally posted by @simonw in https://github.com/simonw/datasette/issues/1555#issuecomment-997248364_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1569/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1084185188,I_kwDOBm6k_c5An1pk,1573,Make trace() a documented internal API,9599,simonw,open,0,,,,,1,2021-12-19T20:32:56Z,2021-12-19T21:13:13Z,,OWNER,,"This should be documented so plugin authors can use it to add their own custom traces: https://github.com/simonw/datasette/blob/8f311d6c1d9f73f4ec643009767749c17b5ca5dd/datasette/tracer.py#L28-L52
Including the new `kwargs` pattern I added in #1571: https://github.com/simonw/datasette/blob/f65817000fdf87ce8a0c23edc40784ebe33b5842/datasette/database.py#L128-L132",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1573/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1084007781,I_kwDOBm6k_c5AnKVl,1572,"""Query took"" should be ""Queries took""",9599,simonw,closed,0,,,7571612,Datasette 0.60,0,2021-12-19T04:03:00Z,2022-01-13T22:27:43Z,2021-12-19T04:03:24Z,OWNER,,"This is misleading, since usually there have been more than one query executed:
![CleanShot 2021-12-18 at 20 02 35@2x](https://user-images.githubusercontent.com/9599/146663457-9c4c2900-5cc0-4650-a565-bb1ff0b8a725.png)
",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1572/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1084257842,I_kwDOBm6k_c5AoHYy,1575,__call__() got an unexpected keyword argument 'specname',9599,simonw,closed,0,,,,,1,2021-12-20T01:24:04Z,2021-12-20T01:48:03Z,2021-12-20T01:47:57Z,OWNER,,"> I've installed the alpha version but get an error when starting up Datasette:
```
Traceback (most recent call last):
File ""/Users/tim/.pyenv/versions/stock-exchange/bin/datasette"", line 5, in
from datasette.cli import cli
File ""/Users/tim/.pyenv/versions/3.8.5/envs/stock-exchange/lib/python3.8/site-packages/datasette/cli.py"", line 15, in
from .app import Datasette, DEFAULT_SETTINGS, SETTINGS, SQLITE_LIMIT_ATTACHED, pm
File ""/Users/tim/.pyenv/versions/3.8.5/envs/stock-exchange/lib/python3.8/site-packages/datasette/app.py"", line 31, in
from .views.database import DatabaseDownload, DatabaseView
File ""/Users/tim/.pyenv/versions/3.8.5/envs/stock-exchange/lib/python3.8/site-packages/datasette/views/database.py"", line 25, in
from datasette.plugins import pm
File ""/Users/tim/.pyenv/versions/3.8.5/envs/stock-exchange/lib/python3.8/site-packages/datasette/plugins.py"", line 29, in
mod = importlib.import_module(plugin)
File ""/Users/tim/.pyenv/versions/3.8.5/lib/python3.8/importlib/__init__.py"", line 127, in import_module
return _bootstrap._gcd_import(name[level:], package, level)
File ""/Users/tim/.pyenv/versions/3.8.5/envs/stock-exchange/lib/python3.8/site-packages/datasette/filters.py"", line 9, in
@hookimpl(specname=""filters_from_request"")
TypeError: __call__() got an unexpected keyword argument 'specname'
```
_Originally posted by @wragge in https://github.com/simonw/datasette/issues/1547#issuecomment-997511968_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1575/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1087181951,I_kwDOBm6k_c5AzRR_,1576,Traces should include SQL executed by subtasks created with `asyncio.gather`,9599,simonw,closed,0,,,3268330,Datasette 1.0,12,2021-12-22T20:52:02Z,2022-02-05T05:21:35Z,2022-02-05T05:19:53Z,OWNER,,"I tried running some parallel SQL queries using `asyncio.gather()` but the SQL that was executed didn't show up in the trace rendered by https://datasette.io/plugins/datasette-pretty-traces
I realized that was because traces are keyed against the current task ID, which changes when a sub-task is run using `asyncio.gather` or similar.
The faceting and suggest faceting queries are missing from this trace:
![image](https://user-images.githubusercontent.com/9599/147153855-2d611f07-922a-4d18-9e6e-4be89e010dc4.png)
> The reason they aren't showing up in the traces is that traces are stored just for the currently executing `asyncio` task ID: https://github.com/simonw/datasette/blob/ace86566b28280091b3844cf5fbecd20158e9004/datasette/tracer.py#L13-L25
>
> This is so traces for other incoming requests don't end up mixed together. But there's no current mechanism to track async tasks that are effectively ""child tasks"" of the current request, and hence should be tracked the same.
>
> https://stackoverflow.com/a/69349501/6083 suggests that you pass the task ID as an argument to the child tasks that are executed using `asyncio.gather()` to work around this kind of problem.
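A sketch of that suggestion (every name here is a hypothetical stand-in for the tracer internals):
```python
import asyncio

traces = {}

def record_trace(task_id, sql):
    # Stand-in for the tracer: file the query under the given task ID
    traces.setdefault(task_id, []).append(sql)

async def run_query(sql, trace_id):
    record_trace(trace_id, sql)
    await asyncio.sleep(0)  # stand-in for actually executing the SQL

async def handle_request():
    # Capture the parent task's ID and pass it to every sub-task so
    # their traces are keyed against the originating request
    trace_id = id(asyncio.current_task())
    await asyncio.gather(
        run_query('select count(*) from t', trace_id),
        run_query('select * from t limit 10', trace_id),
    )
    return traces[trace_id]

print(asyncio.run(handle_request()))
```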
_Originally posted by @simonw in https://github.com/simonw/datasette/issues/1518#issuecomment-999870993_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1576/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1104691662,I_kwDOBm6k_c5B2EHO,1600,plugins --all example should use cog,9599,simonw,closed,0,,,,,1,2022-01-15T11:47:49Z,2022-01-20T05:06:21Z,2022-01-20T05:04:16Z,OWNER,,The example output for `datasette plugins --all` on this page has gotten out of date: https://docs.datasette.io/en/stable/plugins.html#seeing-what-plugins-are-installed,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1600/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1105916061,I_kwDOBm6k_c5B6vCd,1601,Add KNN and data_licenses to hidden tables list,25778,eyeseast,closed,0,,,,,5,2022-01-17T14:19:57Z,2022-01-20T21:29:44Z,2022-01-20T04:38:54Z,CONTRIBUTOR,,"They're generated by Spatialite and not very interesting in most cases.
",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1601/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1090810196,I_kwDOBm6k_c5BBHFU,1583,consider adding deletion step of cloudbuild artifacts to gcloud publish,536941,fgregg,open,0,,,,,1,2021-12-30T00:33:23Z,2021-12-30T00:34:16Z,,CONTRIBUTOR,,"Right now, as part of the publish process, images and other artifacts are stored in gcloud's cloud storage before being deployed to Cloud Run.
After successfully deploying, it would be nice if the script deleted these artifacts. Otherwise, if you have a regularly scheduled build process, you can end up paying to store lots of out-of-date artifacts.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1583/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1091257796,I_kwDOBm6k_c5BC0XE,1584,give error with recursive sql,58088336,tunguyenatwork,open,0,,,,,0,2021-12-30T18:53:16Z,2021-12-30T18:53:16Z,,NONE,,"I got an error ""near ""WITH"": syntax error"" after I upgraded to version 0.59 from 0.52.4. This error is related to recursive SQL; it worked great on the previous version but fails after the upgrade. Below is an example of the SQL:
```sql
WITH RECURSIVE manager_of(position, super_position) AS (
  SELECT position,
         case ifnull(INDIRECT_SUPER_POSITION, '') when '' then super_position
              else INDIRECT_SUPER_POSITION end as SUPER_POSITION
  FROM position
  WHERE super_position <> 'SGV000000001'
    AND super_position != ''
    AND position <> super_position
),
chain_manager_of_position(position, level) AS (
  SELECT super_position, 1 as level
  FROM manager_of
  WHERE super_position != ''
    AND (position = :pos OR position IN (SELECT position FROM employee WHERE employee = :ein))
  UNION ALL
  SELECT super_position, level + 1 as level
  FROM manager_of
  JOIN chain_manager_of_position USING (position)
)
SELECT *
FROM chain_manager_of_position
LEFT JOIN employee USING (position)
WHERE employee IS NOT NULL
ORDER BY level
LIMIT 1
```
",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1584/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1091838742,I_kwDOBm6k_c5BFCMW,1585,Firebase caching for `publish cloudrun`,9599,simonw,open,0,,,,,1,2022-01-01T15:38:15Z,2022-01-01T15:40:38Z,,OWNER,,"https://gist.github.com/steren/03d3e58c58c9a53fd49bb78f58541872 has a recipe for this, via https://twitter.com/steren/status/1477038411114446848
Could this enable easier vanity URLs of the format `https://$project_id.web.app/`? How about CDN caching?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1585/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1096536240,I_kwDOBm6k_c5BW9Cw,1586,run analyze on all databases as part of start up or publishing,536941,fgregg,open,0,,,,,1,2022-01-07T17:52:34Z,2022-02-02T07:13:37Z,,CONTRIBUTOR,,"Running `analyze;` lets sqlite's query planner make *much* better use of any indices.
It might be nice if `analyze` was run as part of the start-up of ""serve"" or ""publish"".",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1586/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1097040427,I_kwDOBm6k_c5BY4Ir,1587,Add `sqlite_stat1`(-4) tables to hidden table list,9599,simonw,closed,0,,,,,2,2022-01-08T21:28:20Z,2022-01-20T04:12:59Z,2022-01-20T04:12:59Z,OWNER,,"> Running `ANALYZE` creates a new visible table called `sqlite_stat1`: https://www.sqlite.org/fileformat.html#the_sqlite_stat1_table
>
> This should be added to the default list of hidden tables in Datasette.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1587/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1097101917,I_kwDOBm6k_c5BZHJd,1588,`explain query plan select` is too strict about whitespace,9599,simonw,closed,0,,,7571612,Datasette 0.60,3,2022-01-09T04:22:42Z,2022-01-13T22:28:19Z,2022-01-13T20:35:05Z,OWNER,,"`explain query plan select * from facetable` is allowed: https://latest.datasette.io/fixtures?sql=explain+query+plan+select+*+from+facetable
But... `explain query plan select * from facetable` (with two spaces before the `select`) returns a ""Statement must be a SELECT"" error: https://latest.datasette.io/fixtures?sql=explain+query+plan++select+*+from+facetable",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1588/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1099723916,I_kwDOBm6k_c5BjHSM,1590,Table+query JSON and CSV links broken when using `base_url` setting,1001306,eelkevdbos,closed,0,,,7571612,Datasette 0.60,11,2022-01-11T23:46:39Z,2022-01-14T01:16:34Z,2022-01-14T01:16:08Z,NONE,,"Datasette appends the prefix found in the `base_url` setting twice if a `base_url` is set.
In the following ASGI example, I'm hosting a custom Datasette instance:
```python
# asgi.py
import pathlib
from asgi_cors import asgi_cors
from channels.routing import URLRouter
from django.urls import re_path
from datasette.app import Datasette
datasette_ = Datasette(
files=[],
settings={
""base_url"": ""/datasettes/"",
""plugins"": {}
},
config_dir=pathlib.Path('.'),
)
application = URLRouter([
re_path(r""^datasettes/.*"", asgi_cors(datasette_.app(), allow_all=True)),
])
```
Running it with:
```shell
$ daphne -p 8002 asgi:application
```
Using a simple query on the `_memory` database:
```sql
select sqlite_version()
```
http://localhost:8002/datasettes/_memory?sql=select+sqlite_version%28%29
It renders the following upon inspection:
![image](https://user-images.githubusercontent.com/1001306/149038851-aa842950-126a-467c-9a86-fae13bce6221.png)
I am using datasette version `0.59.4`",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1590/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1100015398,I_kwDOBm6k_c5BkOcm,1591,Maybe let plugins define custom serve options?,9599,simonw,open,0,,,,,7,2022-01-12T08:18:47Z,2022-01-15T11:56:59Z,,OWNER,,"https://twitter.com/psychemedia/status/1481171650934714370
> can extensions be passed their own cli args? eg `--ext-tiddlywiki-dbname tiddlywiki2.sqlite` ?
I've thought something like this might be useful for other plugins in the past, too.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1591/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1100499619,I_kwDOBm6k_c5BmEqj,1592,Row pages should show links to foreign keys,9599,simonw,open,0,,,,,1,2022-01-12T15:50:20Z,2022-01-12T15:52:17Z,,OWNER,,Refs #1518 refactor.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1592/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1102568047,I_kwDOBm6k_c5Bt9pv,1596,Documentation page warning of changes coming in 1.0,9599,simonw,open,0,,,,,0,2022-01-13T23:26:04Z,2022-01-13T23:26:04Z,,OWNER,,I should start this relatively soon.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1596/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1102359726,I_kwDOBm6k_c5BtKyu,1594,"Add a CLI reference page to the docs, inspired by sqlite-utils",9599,simonw,closed,0,,,7571612,Datasette 0.60,3,2022-01-13T20:55:08Z,2022-01-13T22:28:22Z,2022-01-13T21:38:48Z,OWNER,,"Thought of this while posting this comment: https://github.com/simonw/datasette/issues/1591#issuecomment-1012506595
I added https://sqlite-utils.datasette.io/en/stable/cli-reference.html to `sqlite-utils` in https://github.com/simonw/sqlite-utils/issues/383 and I _really_ like it - it's a page showing the `--help` output of every CLI command for that tool.
It's maintained using `cog`. One of the benefits is that I get a free commit history of changes to `--help` at https://github.com/simonw/sqlite-utils/commits/main/docs/cli-reference.rst",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1594/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1102484126,I_kwDOBm6k_c5BtpKe,1595,Release notes for 0.60,9599,simonw,closed,0,,,7571612,Datasette 0.60,4,2022-01-13T22:23:14Z,2022-01-14T01:37:39Z,2022-01-14T01:37:39Z,OWNER,,,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1595/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1102612922,I_kwDOBm6k_c5BuIm6,1597,"""datasette inspect"" has no help summary",9599,simonw,closed,0,,,,,1,2022-01-14T00:02:16Z,2022-01-14T00:07:36Z,2022-01-14T00:07:36Z,OWNER,,"Made obvious by the new CLI reference page added in #1594. https://docs.datasette.io/en/latest/cli-reference.html#datasette-inspect-help
```
Commands:
serve* Serve up specified SQLite database files with a web UI
inspect
install Install Python packages - e.g.
```
```
Usage: datasette inspect [OPTIONS] [FILES]...
Options:
--inspect-file TEXT
--load-extension TEXT Path to a SQLite extension to load
--help Show this message and exit.
```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1597/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1102637351,I_kwDOBm6k_c5BuOkn,1598,Replace update-docs-help.py script with cog,9599,simonw,closed,0,,,,,1,2022-01-14T00:33:27Z,2022-01-14T00:47:57Z,2022-01-14T00:47:57Z,OWNER,,"I introduced `cog` in #1594 - I can use this to replace the older `update-docs-help.py` mechanism:
https://github.com/simonw/datasette/blob/76d66d5b2bf10249c0beaac0999b93ac8d757f48/tests/test_docs.py#L36-L53",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1598/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1102966378,I_kwDOBm6k_c5Bve5q,1599,Add architecture documentation,9599,simonw,open,0,,,,,0,2022-01-14T04:55:38Z,2022-01-14T04:56:03Z,,OWNER,,"Inspired by https://matklad.github.io/2021/02/06/ARCHITECTURE.md.html
Good example: https://github.com/rust-analyzer/rust-analyzer/blob/d7c99931d05e3723d878bea5dc26766791fa4e69/docs/dev/architecture.md",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1599/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1121121305,I_kwDOBm6k_c5C0vQZ,1618,"Reconsider policy on blocking queries containing the string ""pragma""",770231,strada,open,0,,,,,6,2022-02-01T19:39:46Z,2022-02-02T19:42:03Z,,NONE,,"First of all, thanks for creating this cool project, and also supporting publishing to various hosting services out of the box.
While testing it out, I noticed that legitimate queries such as
```
select * from books where title like 'Pragmatic%'
```
or
```
select * from books where title = 'The Pragmatic Programmer'
```
are blocked, due to the regular expression check here:
https://github.com/simonw/datasette/blob/main/datasette/utils/__init__.py#L185
Example as seen from a Datasette instance:
https://fivethirtyeight.datasettes.com/polls?sql=select+*+from+books+where+title+like+%27Pragmatic%25%27%0D%0A
I'd propose a regular expression like
```
re.compile(f""pragma_(?!({'|'.join(allowed_pragmas)}))""),
```
instead of
```
re.compile(f""pragma(?!_({'|'.join(allowed_pragmas)}))""),
```
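A quick comparison of the two patterns, with a hypothetical allow-list (assuming, as the blocked 'Pragmatic%' example suggests, that the check is applied to a lowercased copy of the query):
```python
import re

allowed_pragmas = ('function_list', 'table_info')  # hypothetical subset
current = re.compile('pragma(?!_(%s))' % '|'.join(allowed_pragmas))
proposed = re.compile('pragma_(?!(%s))' % '|'.join(allowed_pragmas))

# The blocked query above, lowercased as the check would see it
sql = 'select * from books where title like \'Pragmatic%\''.lower()

print(bool(current.search(sql)))   # True: the legitimate query is blocked
print(bool(proposed.search(sql)))  # False: the query is allowed
```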
I can create a pull request with this change, unless the maintainers think it would allow unwanted queries to be executed.
",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1618/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1121583414,I_kwDOBm6k_c5C2gE2,1619,JSON link on row page is 404 if base_url setting is used,9599,simonw,open,0,,,,,5,2022-02-02T07:09:53Z,2023-03-24T15:38:04Z,,OWNER,,"On my local environment:
datasette fixtures.db -p 3344 --setting base_url /foo/bar/
Then hit http://127.0.0.1:3344/foo/bar/fixtures/table%2Fwith%2Fslashes.csv/3
But... that `json` link goes here, which is a 404:
http://127.0.0.1:3344/foo/bar/foo/bar/fixtures/table%2Fwith%2Fslashes.csv/3?_format=json",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1619/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1121618041,I_kwDOBm6k_c5C2oh5,1620,"Link: rel=""alternate"" to JSON for queries too",9599,simonw,closed,0,,,3268330,Datasette 1.0,3,2022-02-02T08:02:42Z,2022-02-02T21:53:02Z,2022-02-02T21:33:00Z,OWNER,,"Following:
- #1533
I implemented it for tables and rows but I should have done queries as well.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1620/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1122413719,I_kwDOBm6k_c5C5qyX,1621,Test against Python 3.11 dev version,9599,simonw,closed,0,,,3268330,Datasette 1.0,0,2022-02-02T21:38:57Z,2022-03-19T04:04:49Z,2022-02-02T21:58:54Z,OWNER,,"To avoid another surprise like we got with 3.10: https://simonwillison.net/2021/Oct/9/finding-and-reporting-a-bug/
From a quick GitHub code search it looks like `3.11-dev` should work: https://cs.github.com/urllib3/urllib3/blob/7bec77e81aa0a194c98381053225813f5347c9d2/.github/workflows/ci.yml#L60",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1621/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1122416919,I_kwDOBm6k_c5C5rkX,1623,/-/patterns returns link: alternate JSON header to 404,9599,simonw,closed,0,,,3268330,Datasette 1.0,2,2022-02-02T21:42:49Z,2022-03-19T04:04:49Z,2022-02-02T21:48:56Z,OWNER,,"Bug from:
- #1620
```
% curl -s -I 'https://latest.datasette.io/-/patterns' | grep link
link: https://latest.datasette.io/-/patterns.json; rel=""alternate""; type=""application/json+datasette""
```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1623/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1122427321,I_kwDOBm6k_c5C5uG5,1624,Index page `/` has no CORS headers,9599,simonw,open,0,,,,,2,2022-02-02T21:56:10Z,2022-09-28T16:54:22Z,,OWNER,,"Compare the following:
```
% curl -I 'https://latest.datasette.io/fixtures'
HTTP/1.1 200 OK
link: https://latest.datasette.io/fixtures.json; rel=""alternate""; type=""application/json+datasette""
cache-control: max-age=5
referrer-policy: no-referrer
access-control-allow-origin: *
access-control-allow-headers: Authorization
access-control-expose-headers: Link
content-type: text/html; charset=utf-8
x-databases: _memory, _internal, fixtures, extra_database
Date: Wed, 02 Feb 2022 21:55:49 GMT
Server: Google Frontend
Transfer-Encoding: chunked
% curl -I 'https://latest.datasette.io/'
HTTP/1.1 200 OK
link: https://latest.datasette.io/.json; rel=""alternate""; type=""application/json+datasette""
content-type: text/html; charset=utf-8
x-databases: _memory, _internal, fixtures, extra_database
Date: Wed, 02 Feb 2022 21:55:52 GMT
Server: Google Frontend
Transfer-Encoding: chunked
```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1624/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1122450452,I_kwDOBm6k_c5C5zwU,1625,Try running tests against macOS and Windows in addition to Ubuntu,9599,simonw,open,0,,,,,0,2022-02-02T22:25:57Z,2022-02-02T22:25:57Z,,OWNER,,"I already do this for `sqlite-utils`: https://github.com/simonw/sqlite-utils/blob/3.22.1/.github/workflows/test.yml
Related:
- #1617
- #1545",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1625/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1122557010,I_kwDOBm6k_c5C6NxS,1627,Get the tests passing against Windows,9599,simonw,open,0,,,,,0,2022-02-03T01:23:06Z,2022-02-03T01:23:32Z,,OWNER,,"> OK, the tests do NOT pass against Windows! https://github.com/simonw/datasette/runs/5044105941
_Originally posted by @simonw in https://github.com/simonw/datasette/issues/1626#issuecomment-1028515161_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1627/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1108300685,I_kwDOBm6k_c5CD1ON,1604,Option to assign a domain/subdomain using `datasette publish cloudrun`,9599,simonw,open,0,,,,,1,2022-01-19T16:21:17Z,2022-01-19T16:23:54Z,,OWNER,,Looks like this API should be able to do that: https://twitter.com/steren/status/1483835859191304192 - https://cloud.google.com/run/docs/reference/rest/v1/namespaces.domainmappings/create,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1604/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1108235694,I_kwDOBm6k_c5CDlWu,1603,A proper favicon,9599,simonw,closed,0,,,3268330,Datasette 1.0,19,2022-01-19T15:24:55Z,2022-03-19T04:04:49Z,2022-01-20T06:07:31Z,OWNER,,"Tips here: https://adamj.eu/tech/2022/01/18/how-to-add-a-favicon-to-your-django-site/ - I think a PNG served at `/favicon.ico` is the best option, since Safari doesn't support SVG yet.
Relevant code: https://github.com/simonw/datasette/blob/cb29119db9115b1f40de2fb45263ed77e3bfbb3e/datasette/app.py#L182-L183
I can reuse the icon for https://datasette.io/desktop",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1603/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1108846067,I_kwDOBm6k_c5CF6Xz,1606,Tests failing against Python 3.6,9599,simonw,closed,0,,,,,3,2022-01-20T04:22:44Z,2022-01-20T04:36:42Z,2022-01-20T04:36:42Z,OWNER,,"https://github.com/simonw/datasette/runs/4877484366
```
E File ""/opt/hostedtoolcache/Python/3.6.15/x64/lib/python3.6/site-packages/uvicorn/server.py"", line 67, in run
E return asyncio.run(self.serve(sockets=sockets))
E AttributeError: module 'asyncio' has no attribute 'run'
```
I think this may mean `uvicorn` has dropped support for Python 3.6.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1606/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1108671952,I_kwDOBm6k_c5CFP3Q,1605,Scripted exports,25778,eyeseast,open,0,,,,,10,2022-01-19T23:45:55Z,2022-11-30T15:06:38Z,,CONTRIBUTOR,,"Posting this while I'm thinking about it: I mentioned at the end of [this thread](https://twitter.com/eyeseast/status/1483893011658551299) that I'm usually doing `datasette --get` to export canned queries.
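The pattern looks roughly like this (hypothetical database and query names):
```bash
# Export a canned query as JSON and as streamed CSV without running a server
datasette data.db --get ""/data/my_canned_query.json"" > exports/my_canned_query.json
datasette data.db --get ""/data/my_canned_query.csv?_stream=on"" > exports/my_canned_query.csv
```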
I used to use a tool called [datafreeze](https://github.com/pudo/datafreeze) to do scripted exports, but that project looks dead now. The ergonomics of it are pretty nice, though, and the `Freezefile.yml` structure is actually not too far from Datasette's canned queries.
This is related to the idea for `datasette query` (#1356) but I think it's a distinct feature. It's most likely a plugin, but I want to raise it here because it's probably something other people have thought about.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1605/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1109884720,I_kwDOBm6k_c5CJ38w,1609,"Ensure ""pip install datasette"" still works with Python 3.6",9599,simonw,closed,0,,,,,12,2022-01-21T00:08:10Z,2022-01-24T19:20:09Z,2022-01-21T02:24:13Z,OWNER,,"## Original title: Can I keep ""pip install datasette"" working on Python 3.6?
I dropped support for 3.6 in:
- #1577
I'm getting reports that `pip3 install datasette` throws an error on that Python, even though I haven't made that new release yet - presumably due to lack of pinning of Uvicorn: https://twitter.com/ldodds/status/1484289475195080706
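(For context, a hypothetical sketch of the packaging metadata involved - `pip` 9+ reads `python_requires` from each release and installs the newest one whose constraint matches the running interpreter:)
```python
# setup.py of a hypothetical dependency
from setuptools import setup

setup(
    name=""example-package"",
    version=""2.0"",
    # pip on Python 3.6 should fall back to the last release without this constraint
    python_requires="">=3.7"",
)
```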
Is it possible to get `pip` on that version of Python to install the highest possible version of the packages that are still known to support Python 3.6?
If so, how?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1609/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1109783030,I_kwDOBm6k_c5CJfH2,1607,More detailed information about installed SpatiaLite version,9599,simonw,closed,0,,,3268330,Datasette 1.0,2,2022-01-20T21:28:03Z,2022-02-09T06:42:02Z,2022-02-09T06:32:28Z,OWNER,,"https://www.gaia-gis.it/gaia-sins/spatialite-sql-5.0.0.html#version has a whole bunch of interesting functions for things like `freexl_version()` and `geos_version()` and `HasMathSQL()` and suchlike.
These could be shown on the `/-/versions` page.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1607/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1109808154,I_kwDOBm6k_c5CJlQa,1608,Documentation should clarify /stable/ vs /latest/,9599,simonw,closed,0,,,,,15,2022-01-20T22:02:59Z,2023-03-26T23:41:12Z,2022-01-20T22:53:17Z,OWNER,,"It's not currently clear what the difference between https://docs.datasette.io/en/latest/ and https://docs.datasette.io/en/stable/ is - I should fix that.
On Twitter: https://twitter.com/simonw/status/1484285006243528705",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1608/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1113384383,I_kwDOBm6k_c5CXOW_,1611,Avoid ever running count(*) against SpatiaLite KNN table,9599,simonw,open,0,,,,,1,2022-01-25T03:32:54Z,2022-02-02T06:45:47Z,,OWNER,,"Got this in a trace:
Looks like running `count(*)` against KNN took 83s! It ignored the time limit. And still only returned a count of 0.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1611/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1114147905,I_kwDOBm6k_c5CaIxB,1612,Move canned queries closer to the SQL input area,639012,jsfenfen,closed,0,,,3268330,Datasette 1.0,5,2022-01-25T17:06:39Z,2022-03-19T04:04:49Z,2022-01-25T18:34:21Z,CONTRIBUTOR,,"*Original title: Consider placing example queries above the sql input?*
Hi! Have been enjoying deploying ad hoc datasettes for collaborators to pick over!
I keep finding myself manually ""fixing"" the database.html template so that the ""example queries"" (canned queries) appear directly *over* the sql box, so they act more as a suggestion for collaborators who aren't inclined to write their own queries.
My sense is that any time I go to the trouble of writing canned queries, my users should see them.
(( I have also considered a client-side reactive-ish option where selecting a query just places the raw SQL in the box and doesn't execute it, but this seems to end up being an inconvenience, rather than a teaching tool. ))
",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1612/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1114628238,I_kwDOBm6k_c5Cb-CO,1613,Improvements to help make Datasette a better tool for learning SQL,9599,simonw,open,0,,,,,5,2022-01-26T04:56:07Z,2022-01-26T16:41:46Z,,OWNER,,Tracking issue for the general goal of making Datasette a better tool for learning SQL.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1613/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1115435536,I_kwDOBm6k_c5CfDIQ,1614,Try again with SQLite codemirror support,9599,simonw,open,0,,,,,1,2022-01-26T20:05:20Z,2022-12-23T21:27:10Z,,OWNER,,"I tried and failed to implement autocomplete a while ago. Relevant code:
https://github.com/codemirror/legacy-modes/blob/8f36abca5f55024258cd23d9cfb0203d8d244f0d/mode/sql.js#L335
Sounds like upgrading to CodeMirror 6 ASAP would be worthwhile since it has better accessibility and touch screen support: https://codemirror.net/6/",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1614/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1117132741,I_kwDOBm6k_c5ClhfF,1615,Potential simplified publishing mechanism,369053,aidansteele,closed,0,,,,,2,2022-01-28T08:34:50Z,2022-02-02T07:34:21Z,2022-02-02T07:34:17Z,NONE,,"Hi,
Forewarning: this idea is one I've only been thinking about for a while and it's not fully fleshed-out yet.
I love Datasette and what it stands for. I was thinking about how we could make it accessible to more people, especially those without access to credit cards required for a lot of hosting options. Or they might not feel comfortable signing up for said services.
So I was thinking I might create a service that hosts Datasette instances for folks. I'd probably stick it on AWS Lambda and limit requests to something like n/month to avoid bankrupting myself. If I did build such a hypothetical service, I was thinking I would rely on GitHub Actions to do the heavy lifting.
E.g. user `johndoe` creates a repo `my-animals` with a couple of files: `dogs.csv`, `cats.csv` and the following GitHub Actions workflow:
```yaml
# .github/workflows/push.yml
on:
  push
# this allows the publish action to use OIDC to authenticate johndoe/my-animals
permissions:
  id-token: write
  contents: read
jobs:
  publish:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/setup-python@v2
      - run: pip install sqlite-utils
      - uses: actions/checkout@v2
      - run: |
          set -eux
          sqlite-utils create-database animals.db
          sqlite-utils insert animals.db dogs dogs.csv --csv
          sqlite-utils insert animals.db cats cats.csv --csv
      - uses: datasette-hub/publish@v1
        with:
          db: animals.db
          metadata: meta.yml
      # this step is helpful for debugging why the
      # generated sqlite db was rejected
      - uses: actions/upload-artifact@v2
        if: failure()
        with:
          path: animals.db
          retention-days: 1
```
This would then cause a Datasette instance to be available at `https://johndoe-my-animals.datasette-hub.test/`. It feels like this could significantly reduce the friction to someone being able to go from data set to Datasette.
What do you think? Does this address a real need? Or am I perhaps misunderstanding the main friction points? As a bonus: it feels like this would pair well with [git scraping](https://simonwillison.net/2020/Oct/9/git-scraping/).",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1615/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1138008042,I_kwDOBm6k_c5D1J_q,1636,"""permissions"" property in metadata for configuring arbitrary permissions",9599,simonw,closed,0,,,8711695, Datasette 1.0a2,14,2022-02-15T00:25:59Z,2022-12-13T02:40:50Z,2022-12-13T02:40:50Z,OWNER,,"The `""allow""` block mechanism can already be used to configure various default permissions. When adding permissions to `datasette-tiddlywiki` I realized it would be good to be able to configure arbitrary permissions such as `edit-tiddlywiki` there too.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1636/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1125576543,I_kwDOBm6k_c5DFu9f,1630,Review datasette.utils and decide which functions should be documented for 1.0,9599,simonw,open,0,,,3268330,Datasette 1.0,0,2022-02-07T06:39:52Z,2022-02-07T06:39:52Z,,OWNER,,"Follows:
- #1176",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1630/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1126604194,I_kwDOBm6k_c5DJp2i,1632,"datasette one.db one.db opens database twice, as one and one_2",9599,simonw,closed,0,,,3268330,Datasette 1.0,6,2022-02-07T23:14:47Z,2022-03-19T04:04:49Z,2022-02-07T23:50:01Z,OWNER,,"> ```
> % mkdir /tmp/data
> % cp ~/Dropbox/Development/datasette/fixtures.db /tmp/data
> % datasette /tmp/data/*.db /tmp/data/created.db --create -p 8852
> ...
> INFO: Uvicorn running on http://127.0.0.1:8852 (Press CTRL+C to quit)
> ^CINFO: Shutting down
> % datasette /tmp/data/*.db /tmp/data/created.db --create -p 8852
> ...
> INFO: 127.0.0.1:49533 - ""GET / HTTP/1.1"" 200 OK
> ```
> The first time I ran Datasette I got two databases - `fixtures` and `created`
>
> BUT... when I ran Datasette the second time it looked like this:
>
> This is the same result you get if you run:
>
> datasette /tmp/data/fixtures.db /tmp/data/created.db /tmp/data/created.db
>
> This is caused by this Datasette issue:
> - https://github.com/simonw/datasette/issues/509
>
> So... either I teach Datasette to de-duplicate multiple identical file paths passed to the command, or I can't use `/data/*.db` in the `Dockerfile` here and I need to go back to other solutions for the challenge described in this comment: https://github.com/simonw/datasette-publish-fly/pull/12#issuecomment-1031971831
_Originally posted by @simonw in https://github.com/simonw/datasette-publish-fly/pull/12#issuecomment-1032029874_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1632/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1129052172,I_kwDOBm6k_c5DS_gM,1633,base_url or prefix does not work with _exact match,6613091,henrikek,open,0,,,,,2,2022-02-09T21:45:07Z,2022-04-28T09:12:56Z,,NONE,,"When I hit the ""Apply"" button to search with ""_exact"" syntax for a column, the URL prefix is removed from the URL.
![image](https://user-images.githubusercontent.com/6613091/153293758-0b757d55-5757-4987-992e-9426e69a7956.png)
And the result is:
![image](https://user-images.githubusercontent.com/6613091/153294672-87be7809-bb7b-455d-bf1a-41e90bbfa4ae.png)
If I add the marked row to url_builder.py it seems to work:
![image](https://user-images.githubusercontent.com/6613091/153295231-bdd52e37-efcf-4b21-9d37-69f182a922f4.png)
",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1633/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1131295060,I_kwDOBm6k_c5DbjFU,1634,Update Dockerfile generated by `datasette publish`,9599,simonw,open,0,,,3268330,Datasette 1.0,4,2022-02-11T00:07:26Z,2022-03-11T17:38:08Z,,OWNER,,"The generated `Dockerfile` currently looks something like this:
```Dockerfile
FROM python:3.8
COPY . /app
WORKDIR /app
ENV DATASETTE_SECRET 'edab49cbc5d5f6f33238f54852037e3fee710821960b73edd2ce743454182ae2'
RUN pip install -U datasette datasette-auth-passwords datasette-tiddlywiki datasette-graphql
RUN datasette inspect fixtures.db other.db --inspect-file inspect-data.json
ENV PORT 8080
EXPOSE 8080
CMD datasette serve --host 0.0.0.0 -i fixtures.db -i other.db --cors --inspect-file inspect-data.json --metadata metadata.json --create --port $PORT /data/*.db
```
This is still on Python 3.8, and it generates a pretty large image compared to the `Dockerfile` used for https://hub.docker.com/datasetteproject/datasette - https://github.com/simonw/datasette/blob/0.60.2/Dockerfile
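A slimmer generated `Dockerfile` might look more like this (a sketch only - the base image and `--no-cache-dir` are assumptions, not decisions):
```Dockerfile
FROM python:3.10-slim-bullseye
COPY . /app
WORKDIR /app
ENV DATASETTE_SECRET 'edab49cbc5d5f6f33238f54852037e3fee710821960b73edd2ce743454182ae2'
RUN pip install --no-cache-dir -U datasette datasette-auth-passwords datasette-tiddlywiki datasette-graphql
RUN datasette inspect fixtures.db other.db --inspect-file inspect-data.json
ENV PORT 8080
EXPOSE 8080
CMD datasette serve --host 0.0.0.0 -i fixtures.db -i other.db --cors --inspect-file inspect-data.json --metadata metadata.json --create --port $PORT /data/*.db
```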
Here's the code that generates it: https://github.com/simonw/datasette/blob/7d24fd405f3c60e4c852c5d746c91aa2ba23cf5b/datasette/utils/__init__.py#L389-L400",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1634/reactions"", ""total_count"": 2, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 2, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1157182254,I_kwDOBm6k_c5E-TMu,1646,Configuration directory mode does not pick up other file extensions than .db,15640196,dnsos,closed,0,,,,,3,2022-03-02T13:15:23Z,2022-10-07T23:06:17Z,2022-10-07T23:03:35Z,NONE,,"Hello, I've been trying to run Datasette with the [configuration directory mode](https://docs.datasette.io/en/stable/settings.html#configuration-directory-mode) with a structure such as this one:
```plain
some-directory/
    example.sqlite3
    another-example.db
    one-more.custom
    [...]
```
(In my scenario I can't just change the filename extension without other problems arising)
Now databases with the `.sqlite3` or the custom filename extension are ignored by Datasette in this case. I'm aware that the docs state that a `.db` extension is required, but I was wondering if there is a reason for this restriction, or if any workaround is available? When I run `datasette example.sqlite3` or `datasette one-more.custom` the databases are served by Datasette without a problem.
",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1646/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1142107925,I_kwDOBm6k_c5EEy8V,1638,`filters_from_request` plugin hook docs should mention that returning an async function is allowed,9599,simonw,open,0,,,,,0,2022-02-18T00:08:26Z,2022-02-18T00:08:26Z,,OWNER,,"https://docs.datasette.io/en/stable/plugin_hooks.html#filters-from-request-request-database-table-datasette doesn't mention that you can return an `async` function - but you can, and in fact Datasette itself uses that here: https://github.com/simonw/datasette/blob/aa7f0037a46eb76ae6fe9bf2a1f616c58738ecdf/datasette/filters.py#L43-L47",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1638/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1148638868,I_kwDOBm6k_c5EdtaU,1639,Make datasette-redirect-forbidden unnecessary,9599,simonw,open,0,,,,,0,2022-02-23T22:18:46Z,2022-02-23T22:18:46Z,,OWNER,,"I wrote `datasette-redirect-forbidden` today because I needed 403 errors to redirect to `/-/login` and it was the quickest way to solve that problem.
This should be a feature of Datasette core.
- https://github.com/simonw/datasette-redirect-forbidden/issues/2",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1639/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1148725876,I_kwDOBm6k_c5EeCp0,1640,"Support static assets where file length may change, e.g. logs",57859326,broccolihighkicks,open,0,,,,,2,2022-02-24T00:34:42Z,2022-03-05T01:19:25Z,,NONE,,"This is a bit of an oxymoron: a ""static"" asset whose contents keep changing.
I am serving a log.txt file for a background process using the Datasette --static CLI. This is useful as I can observe a background process from the web UI to see any errors that occur (instead of spelunking the logs via docker exec/ssh etc).
I get this error, which I think is because Datasette assumes that the size of the content does not change (but appending new log lines means the content length changes).
```python
Traceback (most recent call last):
  File ""/usr/local/lib/python3.9/site-packages/datasette/app.py"", line 1181, in route_path
    response = await view(request, send)
  File ""/usr/local/lib/python3.9/site-packages/datasette/utils/asgi.py"", line 305, in inner_static
    await asgi_send_file(send, full_path, chunk_size=chunk_size)
  File ""/usr/local/lib/python3.9/site-packages/datasette/utils/asgi.py"", line 280, in asgi_send_file
    await send(
  File ""/usr/local/lib/python3.9/site-packages/asgi_csrf.py"", line 104, in wrapped_send
    await send(event)
  File ""/usr/local/lib/python3.9/site-packages/uvicorn/protocols/http/h11_impl.py"", line 460, in send
    output = self.conn.send(event)
  File ""/usr/local/lib/python3.9/site-packages/h11/_connection.py"", line 468, in send
    data_list = self.send_with_data_passthrough(event)
  File ""/usr/local/lib/python3.9/site-packages/h11/_connection.py"", line 501, in send_with_data_passthrough
    writer(event, data_list.append)
  File ""/usr/local/lib/python3.9/site-packages/h11/_writers.py"", line 58, in __call__
    self.send_data(event.data, write)
  File ""/usr/local/lib/python3.9/site-packages/h11/_writers.py"", line 78, in send_data
    raise LocalProtocolError(""Too much data for declared Content-Length"")
h11._util.LocalProtocolError: Too much data for declared Content-Length
```
Thanks, I am finding Datasette very useful.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1640/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1149310456,I_kwDOBm6k_c5EgRX4,1641,Tweak mobile keyboard settings,9599,simonw,open,0,,,,,1,2022-02-24T13:47:10Z,2022-02-24T13:49:26Z,,OWNER,,"https://developer.apple.com/library/archive/documentation/StringsTextFonts/Conceptual/TextAndWebiPhoneOS/KeyboardManagement/KeyboardManagement.html#//apple_ref/doc/uid/TP40009542-CH5-SW12
`autocorrect=""off""` is worth experimenting with.
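The markup change under discussion is tiny (this particular combination of attribute values is just one to try):
```html
<!-- Hypothetical: disable iOS autocorrection on the SQL editor textarea -->
<textarea name=""sql"" autocorrect=""off"" autocapitalize=""off"" spellcheck=""false""></textarea>
```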
Twitter: https://twitter.com/forestgregg/status/1496842959563726852",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1641/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1152072027,I_kwDOBm6k_c5Eqzlb,1642,Dependency issue with asgiref and uvicorn,9599,simonw,closed,0,,,,,1,2022-02-26T18:00:35Z,2022-03-05T01:11:27Z,2022-03-05T01:11:17Z,OWNER,,"```
ERROR: After October 2020 you may experience errors when installing or updating packages. This is because pip will change the way that it resolves dependency conflicts.
We recommend you use --use-feature=2020-resolver to test your packages with the new resolver before it becomes the default.
datasette 0.60.2 requires asgiref<3.5.0,>=3.2.10, but you'll have asgiref 3.5.0 which is incompatible.
```
That's after I forced an upgrade of `uvicorn` due to this warning:
```
ERROR: After October 2020 you may experience errors when installing or updating packages. This is because pip will change the way that it resolves dependency conflicts.
We recommend you use --use-feature=2020-resolver to test your packages with the new resolver before it becomes the default.
uvicorn 0.13.1 requires click==7.*, but you'll have click 8.0.4 which is incompatible.
```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1642/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1154399841,I_kwDOBm6k_c5Ezr5h,1645,"Sensible `cache-control` headers for static assets, including those served by plugins",697092,curiousleo,open,0,,,3268330,Datasette 1.0,4,2022-02-28T18:12:03Z,2022-03-08T02:59:29Z,,NONE,,"## What I'm seeing
With `default_cache_ttl = 86400`, I see the following:
A table view returns `Cache-control: max-age=86400`:
![Screenshot_20220228_190000](https://user-images.githubusercontent.com/697092/156034352-4d64683e-39c8-49af-81df-0217a5957bbd.png)
A static asset returns no `Cache-control` header:
![Screenshot_20220228_185933](https://user-images.githubusercontent.com/697092/156034363-d0b03cc2-5889-4ed2-b601-8c1846b8469a.png)
## What I expected to see
I expected the static asset to return a `Cache-control` header indicating that this response can be cached.
## Why this matters
I'm productionising a Datasette deployment right now and was looking into putting it behind a Varnish instance. I was surprised to see requests for static assets being served from Datasette rather than Varnish, this is what led me to look more closely at the response headers.
While Datasette serves those static assets pretty quickly, I don't see why Datasette should serve them. By their nature, static assets like images and JS files are very cacheable, so it should be easy to serve them from a cache like Varnish.
(Note that Varnish can easily be configured to override this header, enabling caching for static assets. But it would be better if this override was not necessary.)
## Discussion
It seems clear to me that serving static assets without a `Cache-control` header is not ideal.
I see two options here:
A. Static assets use the same logic as table / SQL views to set the `Cache-control` header based on `default_cache_ttl`.
B. An additional setting for static assets is introduced (`default_static_cache_ttl`, say).",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1645/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1160407071,I_kwDOBm6k_c5FKmgf,1647,Test failures with SQLite 3.37.0+ due to column affinity case,9599,simonw,closed,0,,,,,5,2022-03-05T17:37:46Z,2022-03-05T19:56:28Z,2022-03-05T19:47:04Z,OWNER,,"These three tests are failing on my local machine:
```
FAILED tests/test_internals_database.py::test_table_column_details[facetable-expected0] - AssertionError: assert [Column(cid=0, name='pk', type='INTEGER', no...
FAILED tests/test_internals_database.py::test_table_column_details[sortable-expected1] - AssertionError: assert [Column(cid=0, name='pk1', type='varchar(30)'...
FAILED tests/test_table_html.py::test_sort_links - AssertionError: assert [{'a_href': None,\n 'attrs': {'class': ['col-Link'],\n 'data-column': '...
```
I ran `pytest --lf -vv` and the output had things like this in it:
```
E - Column(cid=1, name='created', type='text', notnull=0, default_value=None, is_pk=0, hidden=0),
E ? ^^^^
E + Column(cid=1, name='created', type='TEXT', notnull=0, default_value=None, is_pk=0, hidden=0),
...
E {'a_href': '/fixtures/sortable?_sort=sortable_with_nulls_2',
E 'attrs': {'class': ['col-sortable_with_nulls_2'],
E 'data-column': 'sortable_with_nulls_2',
E 'data-column-not-null': '0',
E - 'data-column-type': 'real',
E ? ^^^^
E + 'data-column-type': 'REAL',
E ? ^^^^
```
Something is causing column types to come back in uppercase where previously they were lowercase.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1647/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1160750713,I_kwDOBm6k_c5FL6Z5,1650,Implement redirects from old % encoding to new dash encoding,9599,simonw,closed,0,,,3268330,Datasette 1.0,5,2022-03-06T23:40:02Z,2022-03-07T19:26:15Z,2022-03-07T19:26:14Z,OWNER,,"> One big advantage to this scheme is that redirecting old links to `%2F` pages (e.g. https://fivethirtyeight.datasettes.com/fivethirtyeight/twitter-ratio%2Fsenators) is easy - if you see a `%` in the `raw_path`, redirect to that page with the `%` replaced by `-`.
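A minimal sketch of that rule (assuming `raw_path` is already decoded to a string - not the actual implementation):
```python
from datasette.utils.asgi import Response

def maybe_redirect(raw_path):
    # Old %-encoded table paths become their dash-encoded equivalents
    if ""%"" in raw_path:
        return Response.redirect(raw_path.replace(""%"", ""-""), status=301)
    return None
```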
_Originally posted by @simonw in https://github.com/simonw/datasette/issues/1439#issuecomment-1060044007_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1650/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1161584460,I_kwDOBm6k_c5FPF9M,1651,Get rid of the no-longer necessary ?_format=json hack for tables called x.json,9599,simonw,closed,0,,,3268330,Datasette 1.0,8,2022-03-07T15:40:42Z,2022-03-19T04:04:50Z,2022-03-15T18:25:42Z,OWNER,,"Tidy up from:
- #1439",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1651/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1161937073,I_kwDOBm6k_c5FQcCx,1653,Mechanism to default a table to sorting by multiple columns,9599,simonw,open,0,,,,,2,2022-03-07T21:20:11Z,2022-03-07T21:23:39Z,,OWNER,,"### Discussed in https://github.com/simonw/datasette/discussions/1652
Originally posted by **zaneselvans** March 7, 2022
It's easy to tell datasette to sort tables using a single column, as [described in the docs](https://docs.datasette.io/en/stable/metadata.html#setting-a-default-sort-order):
```yaml
databases:
  ferc1:
    tables:
      f1_edcfu_epda:
        sort: created_time
```
But is there some way to tell it to sort using a composite key, like you would in an `ORDER BY` clause instead? For example, the way it's being done **[in this query](https://data.catalyst.coop/ferc1?sql=select%0D%0A++rowid%2C%0D%0A++respondent_id%2C%0D%0A++report_year%2C%0D%0A++spplmnt_num%2C%0D%0A++row_number%2C%0D%0A++row_seq%2C%0D%0A++row_prvlg%2C%0D%0A++acct_num%2C%0D%0A++depr_plnt_base%2C%0D%0A++est_avg_srvce_lf%2C%0D%0A++net_salvage%2C%0D%0A++apply_depr_rate%2C%0D%0A++mrtlty_crv_typ%2C%0D%0A++avg_remaining_lf%2C%0D%0A++report_prd%0D%0Afrom%0D%0A++f1_edcfu_epda%0D%0Awhere%0D%0A++respondent_id+%3D+210%0D%0A++AND+report_year+%3D+2020%0D%0Aorder+by%0D%0A++report_year%2C+report_prd%2C+respondent_id%2C+spplmnt_num%2C+row_number%0D%0Alimit%0D%0A++1000)** on our Datasette?
```sql
SELECT
  respondent_id,
  report_year,
  spplmnt_num,
  row_number,
  row_seq,
  row_prvlg,
  acct_num,
  depr_plnt_base,
  est_avg_srvce_lf,
  net_salvage,
  apply_depr_rate,
  mrtlty_crv_typ,
  avg_remaining_lf,
  report_prd
FROM
  f1_edcfu_epda
WHERE
  respondent_id = 210
  AND report_year = 2020
ORDER BY
  report_year, report_prd, respondent_id, spplmnt_num, row_number
LIMIT
  1000
```
The problem here is that by default it's using `rowid` (the SQLite assigned autoincrementing integer key) to order the records. The table **should** have a natural composite primary key, but the original database that this data is being migrated from doesn't enforce unique primary keys, so there are dupes, and we don't want to drop those rows. The records are somehow getting jumbled in the database (the `rowid` ordering isn't lined up with the expected ordering based on the composite primary key, though it's close), and this jumbling is confusing to users who expect to see the data ordered by the natural primary key.
I've tried setting the `sort` metadata parameter to a list of column names, a tuple of column names, a quoted string of comma-separated column names, a quoted string of a tuple of column names...
```yaml
databases:
  ferc1:
    tables:
      f1_edcfu_epda:
        sort: ""(report_year, report_prd, respondent_id, spplmnt_num, row_number)""
```
and they all give me server errors like:
```
Cannot sort table by (report_year, report_prd, respondent_id, spplmnt_num, row_number)
```
",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1653/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1161969891,I_kwDOBm6k_c5FQkDj,1654,Adopt a code of conduct,9599,simonw,closed,0,,,,,5,2022-03-07T22:00:24Z,2022-03-07T22:19:35Z,2022-03-07T22:19:35Z,OWNER,,"This is long overdue, especially given the size of the project now.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1654/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1163369515,I_kwDOBm6k_c5FV5wr,1655,query result page is using 400mb of browser memory 40x size of html page and 400x size of csv data,536941,fgregg,open,0,,,,,8,2022-03-09T00:56:40Z,2023-10-17T21:53:17Z,,CONTRIBUTOR,,"[this page](https://labordata.bunkum.us/opdr-8335ea3?sql=with+most_recent_lu+as+%28%0D%0A++select%0D%0A++++*%0D%0A++from%0D%0A++++%28%0D%0A++++++select%0D%0A++++++++*%0D%0A++++++from%0D%0A++++++++lm_data%0D%0A++++++order+by%0D%0A++++++++f_num%2C%0D%0A++++++++receive_date+desc%0D%0A++++%29+t%0D%0A++group+by%0D%0A++++f_num%0D%0A%29%0D%0Aselect%0D%0A++aff_abbr+%7C%7C+coalesce%28%27+local+%27+%7C%7C+desig_num%2C+%27+%27+%7C%7C+unit_name%29+as+abbr_local_name%2C%0D%0A++coalesce%28%0D%0A++++regexp_match%28%27%28.*%3F%29%28%2C%3F+AFL-CIO%24%29%27%2C+union_name%29%2C%0D%0A++++regexp_match%28%27%28.*%3F%29%28+IND%24%29%27%2C+union_name%29%2C%0D%0A++++union_name%0D%0A++%29+%7C%7C+coalesce%28%27+local+%27+%7C%7C+desig_num%2C+%27+%27+%7C%7C+unit_name%29+as+full_local_name%2C%0D%0A++*%0D%0Afrom%0D%0A++most_recent_lu%0D%0Awhere+%28desig_num+IS+NOT+NULL+OR+unit_name+IS+NOT+NULL%29+AND+desig_name+%21%3D+%27HQ%27%0D%0Alimit%0D%0A++5000+offset+0)
is using about 400MB of memory in Firefox 97 on macOS. If you download the HTML for the page, it's about 11MB, and if you get the CSV for the data it's about 1MB.
It's using over 1GB on Chrome 99.
I found this because I was trying to figure out why editing the SQL was getting very slow.
",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1655/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1174162781,I_kwDOBm6k_c5F_E1d,1666,Refactor URL routing to enable testing,9599,simonw,closed,0,,,3268330,Datasette 1.0,3,2022-03-19T03:52:29Z,2022-03-19T16:32:03Z,2022-03-19T16:32:03Z,OWNER,,"I ran into some bugs earlier with URL routing - having more robust testing around this (especially since they are defined using regular expressions) would be really useful.
- A utility function that resolves a path against a list of regexes and returns the match (see the sketch after this list)
- Make the routes and regular expressions available from a private Datasette method
- Add tests that exercise them
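The resolver in the first item could be as simple as this (a rough sketch, not the final API):
```python
import re

def resolve_routes(routes, path):
    ""Match path against (regex, view) pairs and return the first hit""
    for regex, view in routes:
        match = re.match(regex, path)
        if match:
            return match, view
    return None, None
```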
Related:
- #1660",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1666/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1174404647,I_kwDOBm6k_c5F__4n,1669,Release 0.61 alpha,9599,simonw,closed,0,,,,,2,2022-03-20T00:35:35Z,2022-03-20T01:24:36Z,2022-03-20T01:24:36Z,OWNER,,"> I'm going to release this as a 0.61 alpha so I can more easily depend on it from `datasette-hashed-urls`.
_Originally posted by @simonw in https://github.com/simonw/datasette/issues/1668#issuecomment-1073136896_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1669/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1174306154,I_kwDOBm6k_c5F_n1q,1668,"Introduce concept of a database `route`, separate from its name",9599,simonw,closed,0,,,3268330,Datasette 1.0,20,2022-03-19T16:48:28Z,2022-03-20T16:43:16Z,2022-03-20T16:43:16Z,OWNER,,"Some issues came up in the new `datasette-hashed-urls` plugin relating to the way it renames databases on startup to achieve unique URLs that depend on the database SHA-256 content:
- https://github.com/simonw/datasette-hashed-urls/issues/10
- https://github.com/simonw/datasette-hashed-urls/issues/9
- https://github.com/simonw/datasette-hashed-urls/issues/8
All three of these could be addressed by making the ""path"" concept for a database (the `/foo` bit where it is served) work independently of the database's name, which would be used for default display and also as the alias when configuring cross-database aliases.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1668/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1174302994,I_kwDOBm6k_c5F_nES,1667,Make route matched pattern groups more consistent,9599,simonw,closed,0,,,3268330,Datasette 1.0,3,2022-03-19T16:32:35Z,2022-03-19T20:37:42Z,2022-03-19T20:37:41Z,OWNER,,"> ... highlights how inconsistent the way the capturing works is. Especially `as_format` which can be `None` or `""""` or `.json` or `json` or not used at all in the case of `TableView`.
https://github.com/simonw/datasette/blob/764738dfcb16cd98b0987d443f59d5baa9d3c332/tests/test_routes.py#L12-L36
_Originally posted by @simonw in https://github.com/simonw/datasette/issues/1666#issuecomment-1073039670_
Part of:
- #1660",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1667/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1168995756,I_kwDOBm6k_c5FrXWs,1657,Tilde encoding: use ~ instead of - for dash-encoding,9599,simonw,closed,0,,,3268330,Datasette 1.0,12,2022-03-14T22:55:17Z,2022-03-15T18:25:11Z,2022-03-15T18:01:58Z,OWNER,,Refs #1439,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1657/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1169840669,I_kwDOBm6k_c5Fulod,1658,Revert main to version that passes tests,9599,simonw,closed,0,,,3268330,Datasette 1.0,1,2022-03-15T15:37:02Z,2022-03-19T04:04:50Z,2022-03-15T15:42:58Z,OWNER,,"> I've made a real mess of this. I'm going to revert Datasette `main` back to the last commit that passed the tests and try this again in a branch.
_Originally posted by @simonw in https://github.com/simonw/datasette/issues/1657#issuecomment-1068125636_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1658/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1170144879,I_kwDOBm6k_c5Fvv5v,1660,Refactor and simplify Datasette routing and views,9599,simonw,closed,0,,,3268330,Datasette 1.0,8,2022-03-15T19:56:56Z,2022-03-21T19:19:12Z,2022-03-21T19:19:01Z,OWNER,,"While working on:
- https://github.com/simonw/datasette/issues/1657
- https://github.com/simonw/datasette/issues/1439
It became very clear that the least maintainable part of Datasette at the moment is the way routing to the database, table and row views work - in particular the subclassing mechanism with BaseView and DataView, but also the complex variety of ways in which the URL routes capture different named regular expression groups.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1660/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1170355774,I_kwDOBm6k_c5FwjY-,1661,Remove Hashed URL mode,9599,simonw,closed,0,,,3268330,Datasette 1.0,10,2022-03-15T23:13:56Z,2022-03-19T00:37:37Z,2022-03-19T00:37:36Z,OWNER,,"It's now handled by a plugin instead:
- #647
- https://github.com/simonw/datasette-hashed-urls/issues/3
https://github.com/simonw/datasette-hashed-urls
Sub-tasks:
- [x] Remove hashed URL mode implementation
- [x] Update documentation
- [x] Ensure `--setting hash_urls 1` shows a useful message",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1661/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1170497629,I_kwDOBm6k_c5FxGBd,1662,[feature request] Publish to fully static website,32609395,contrun,closed,0,,,,,1,2022-03-16T03:32:28Z,2022-03-19T00:42:23Z,2022-03-19T00:42:23Z,NONE,,"It seems that currently every `datasette publish` option requires a real backend server which is able to query the database and send results back to the frontend. There are a few projects that download portions of data on demand from a SQLite database at a static URL and present it directly to the user. These methods leverage WebAssembly under the hood. I think Datasette is a perfect use case for this technology. Below are a few examples of querying a SQLite database directly from the frontend.
* [Using sqlite3 as a notekeeping document graph with automatic reference indexing](https://epilys.github.io/bibliothecula/notekeeping.html)
* [Hosting SQLite databases on Github Pages - (or any static file hoster) - phiresky's blog](https://phiresky.github.io/blog/2021/hosting-sqlite-databases-on-github-pages/)
* [Static torrent website with peer-to-peer queries over BitTorrent on 2M records](https://boredcaveman.xyz/post/0x2_static-torrent-website-p2p-queries.html)",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1662/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1170554975,I_kwDOBm6k_c5FxUBf,1663,Document the internals that were used in datasette-hashed-urls,9599,simonw,closed,0,,,3268330,Datasette 1.0,2,2022-03-16T05:17:08Z,2022-03-19T04:04:50Z,2022-03-17T21:32:38Z,OWNER,,"The https://github.com/simonw/datasette-hashed-urls plugin used a couple of currently undocumented features:
- `db.hash`
- `Datasette(..., immutables=[...])`",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1663/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1190828163,I_kwDOBm6k_c5G-piD,1698,Add a warning about bots and Cloud Run,9599,simonw,closed,0,,,,,1,2022-04-03T05:57:17Z,2022-04-03T06:10:24Z,2022-04-03T06:10:24Z,OWNER,,Recommend the https://github.com/simonw/datasette-block-robots plugin if you are going to run a large database in Cloud Run (one with a lot of rows).,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1698/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1189113609,I_kwDOBm6k_c5G4G8J,1697,"`Request.fake(..., url_vars={})`",9599,simonw,closed,0,,,3268330,Datasette 1.0,1,2022-04-01T01:48:40Z,2022-04-01T02:02:18Z,2022-04-01T02:02:10Z,OWNER,,"I just created an alternative `.fake()` method because I wanted to fake the `url_vars` captured in the route as well:
```python
from datasette.utils.asgi import Request
class Request(Request):
    @classmethod
    def fake(cls, path_with_query_string, method=""GET"", scheme=""http"", url_vars=None):
        """"""Useful for constructing Request objects for tests""""""
        path, _, query_string = path_with_query_string.partition(""?"")
        scope = {
            ""http_version"": ""1.1"",
            ""method"": method,
            ""path"": path,
            ""raw_path"": path_with_query_string.encode(""latin-1""),
            ""query_string"": query_string.encode(""latin-1""),
            ""scheme"": scheme,
            ""type"": ""http"",
        }
        if url_vars:
            scope[""url_route""] = {
                ""kwargs"": url_vars
            }
        return cls(scope, None)
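
# Hypothetical usage in a test (illustrative, not from the issue):
#   request = Request.fake(""/db/table.json"", url_vars={""database"": ""db"", ""table"": ""table""})
#   assert request.url_vars == {""database"": ""db"", ""table"": ""table""}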
```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1697/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1174655187,I_kwDOBm6k_c5GA9DT,1671,Filters fail to work correctly against calculated numeric columns returned by SQL views because type affinity rules do not apply,9308268,rayvoelker,open,0,,,,,8,2022-03-20T19:17:24Z,2022-03-22T17:43:12Z,,NONE,,"I found a strange behavior. I'm not sure if it's related to views and boolean values, or if something else weird is going on, but I'll provide an example that may help show what I'm seeing.
```bash
#!/bin/bash
echo ""\""id\"",\""expiration_date\""
0,2018-01-04
1,2019-01-05
2,2020-01-06
3,2021-01-07
4,2022-01-08
5,2023-01-09
6,2024-01-10
7,2025-01-11
8,2026-01-12
9,2027-01-13
"" > test.csv
csvs-to-sqlite test.csv test.db
sqlite-utils create-view --replace test.db test_view ""select id, expiration_date, case when julianday('NOW') >= julianday(expiration_date) then 1 else 0 end as has_expired FROM test""
```
```bash
datasette test.db
```
![image](https://user-images.githubusercontent.com/9308268/159178745-9c6152f7-eac6-4bf9-bef5-a2d63d3ee13f.png)
![image](https://user-images.githubusercontent.com/9308268/159178824-c8952137-270c-42a4-ad1c-f6ad2c51e499.png)
![image](https://user-images.githubusercontent.com/9308268/159178877-23e00b36-443a-43ef-83e5-e0bdddd3fdcd.png)
![image](https://user-images.githubusercontent.com/9308268/159178918-65922cc7-2514-4735-a72d-4904b99976d4.png)
Thanks again and let me know if you want me to provide anything else!",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1671/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1174423568,I_kwDOBm6k_c5GAEgQ,1670,Ship Datasette 0.61,9599,simonw,closed,0,,,,,6,2022-03-20T02:47:54Z,2022-03-23T18:32:32Z,2022-03-23T18:32:03Z,OWNER,,"Let the alpha bake for a while, since #1668 is a big last-minute change.
After shipping, release a new `datasette-hashed-urls` that depends on it, also this:
- https://github.com/simonw/datasette-hashed-urls/issues/11",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1670/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1174697144,I_kwDOBm6k_c5GBHS4,1672,Refactor CSV handling code out of DataView,9599,simonw,open,0,,,3268330,Datasette 1.0,1,2022-03-20T21:47:00Z,2022-03-20T21:52:39Z,,OWNER,,"> I think the way to get rid of most of the remaining complexity in `DataView` is to refactor how CSV stuff works - pulling it in line with other export formats and extracting the streaming mechanism. Opening a fresh issue for that.
_Originally posted by @simonw in https://github.com/simonw/datasette/issues/1660#issuecomment-1073355032_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1672/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1174708375,I_kwDOBm6k_c5GBKCX,1673,Streaming CSV spends a lot of time in `table_column_details`,9599,simonw,open,0,,,,,1,2022-03-20T22:25:28Z,2022-03-20T22:34:06Z,,OWNER,,"At least I think it does. I tried running `py-spy top -p $PID` against a Datasette process that was trying to do:
datasette covid.db --get '/covid/ny_times_us_counties.csv?_size=10&_stream=on'
While investigating:
- #1355
And spotted this:
```
datasette covid.db --get '/covid/ny_times_us_counties.csv?_size=10&_stream=on' (python v3.10.2)
Total Samples 5800
GIL: 71.00%, Active: 98.00%, Threads: 4
%Own %Total OwnTime TotalTime Function (filename:line)
8.00% 8.00% 4.32s 4.38s sql_operation_in_thread (datasette/database.py:212)
5.00% 5.00% 3.77s 3.93s table_column_details (datasette/utils/__init__.py:614)
6.00% 6.00% 3.72s 3.72s _worker (concurrent/futures/thread.py:81)
7.00% 7.00% 2.98s 2.98s _read_from_self (asyncio/selector_events.py:120)
5.00% 6.00% 2.35s 2.49s detect_fts (datasette/utils/__init__.py:571)
4.00% 4.00% 1.34s 1.34s _write_to_self (asyncio/selector_events.py:140)
```
Relevant code: https://github.com/simonw/datasette/blob/798f075ef9b98819fdb564f9f79c78975a0f71e8/datasette/utils/__init__.py#L609-L625
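One direction would be to memoize those lookups for the duration of a streaming response - a rough sketch, assuming `table_column_details(conn, table)` keeps its current signature:
```python
from datasette.utils import table_column_details

# Hypothetical cache so a streamed CSV runs the PRAGMA queries once per table
_details_cache = {}

def cached_table_column_details(conn, db_name, table):
    key = (db_name, table)
    if key not in _details_cache:
        _details_cache[key] = table_column_details(conn, table)
    return _details_cache[key]
```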
",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1673/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1174717287,I_kwDOBm6k_c5GBMNn,1674,Tweak design of /.json,9599,simonw,open,0,,,3268330,Datasette 1.0,1,2022-03-20T22:58:01Z,2022-03-20T22:58:40Z,,OWNER,,"https://latest.datasette.io/.json
Currently:
```json
{
    ""_memory"": {
        ""name"": ""_memory"",
        ""hash"": null,
        ""color"": ""a6c7b9"",
        ""path"": ""/_memory"",
        ""tables_and_views_truncated"": [],
        ""tables_and_views_more"": false,
        ""tables_count"": 0,
        ""table_rows_sum"": 0,
        ""show_table_row_counts"": false,
        ""hidden_table_rows_sum"": 0,
        ""hidden_tables_count"": 0,
        ""views_count"": 0,
        ""private"": false
    },
    ""fixtures"": {
        ""name"": ""fixtures"",
        ""hash"": ""645005884646eb941c89997fbd1c0dd6be517cb1b493df9816ae497c0c5afbaa"",
        ""color"": ""645005"",
        ""path"": ""/fixtures"",
        ""tables_and_views_truncated"": [
            {
                ""name"": ""compound_three_primary_keys"",
                ""columns"": [
                    ""pk1"",
                    ""pk2"",
                    ""pk3"",
                    ""content""
                ],
                ""primary_keys"": [
                    ""pk1"",
                    ""pk2"",
                    ""pk3""
                ],
                ""count"": 1001,
                ""hidden"": false,
                ""fts_table"": null,
                ""num_relationships_for_sorting"": 0,
                ""private"": false
            },
```
As-of this issue the `""path""` key is confusing, it doesn't match what https://latest.datasette.io/-/databases returns:
- #1668",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1674/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1175690070,I_kwDOBm6k_c5GE5tW,1676,"Reconsider ensure_permissions() logic, can it be less confusing?",9599,simonw,open,0,,,3268330,Datasette 1.0,3,2022-03-21T17:14:57Z,2022-12-02T01:23:40Z,,OWNER,,"> Updated documentation: https://github.com/simonw/datasette/blob/e627510b760198ccedba9e5af47a771e847785c9/docs/internals.rst#await-ensure_permissionsactor-permissions
>
>> This method allows multiple permissions to be checked at once. It raises a `datasette.Forbidden` exception if any of the checks are denied before one of them is explicitly granted.
>>
>> This is useful when you need to check multiple permissions at once. For example, an actor should be able to view a table if either one of the following checks returns `True` or not a single one of them returns `False`:
>
> That's pretty hard to understand! I'm going to open a separate issue to reconsider if this is a useful enough abstraction given how confusing it is.
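For reference, a sketch of the call shape being discussed, following the documented signature (the permission names mirror the docs example):
```python
async def check_table_access(datasette, request, database, table):
    # Checks run in order: an explicit True grants access immediately; an
    # explicit False before any True raises datasette.Forbidden
    await datasette.ensure_permissions(
        request.actor,
        [
            (""view-table"", (database, table)),
            (""view-database"", database),
            ""view-instance"",
        ],
    )
```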
_Originally posted by @simonw in https://github.com/simonw/datasette/issues/1675#issuecomment-1074177827_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1676/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1175694248,I_kwDOBm6k_c5GE6uo,1677,Remove `check_permission()` from `BaseView`,9599,simonw,closed,0,,,3268330,Datasette 1.0,1,2022-03-21T17:18:18Z,2022-03-21T18:45:04Z,2022-03-21T18:45:03Z,OWNER,,"Follow-on from:
- #1675
Refs:
- #1660",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1677/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1175648453,I_kwDOBm6k_c5GEvjF,1675,Extract out `check_permissions()` from `BaseView`,9599,simonw,closed,0,,,,,7,2022-03-21T16:39:46Z,2022-03-21T17:14:31Z,2022-03-21T17:13:21Z,OWNER,,"> I'm going to refactor this stuff out and document it so it can be easily used by plugins:
https://github.com/simonw/datasette/blob/4a4164b81191dec35e423486a208b05a9edc65e4/datasette/views/base.py#L69-L103
_Originally posted by @simonw in https://github.com/simonw/datasette/issues/1660#issuecomment-1074136176_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1675/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1175715988,I_kwDOBm6k_c5GFACU,1678,Make `check_visibility()` a documented API,9599,simonw,closed,0,,,3268330,Datasette 1.0,1,2022-03-21T17:30:34Z,2022-03-21T19:04:03Z,2022-03-21T19:01:46Z,OWNER,,"Spotted this while working on:
- #1677
https://github.com/simonw/datasette/blob/e627510b760198ccedba9e5af47a771e847785c9/datasette/utils/__init__.py#L1005-L1021",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1678/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1175854982,I_kwDOBm6k_c5GFh-G,1679,Research: how much overhead does the n=1 time limit have?,9599,simonw,closed,0,,,3268330,Datasette 1.0,11,2022-03-21T19:27:46Z,2022-03-21T21:55:57Z,2022-03-21T21:55:56Z,OWNER,,"https://github.com/simonw/datasette/blob/1a7750eb29fd15dd2eea3b9f6e33028ce441b143/datasette/utils/__init__.py#L181-L200
```python
import time
from contextlib import contextmanager


@contextmanager
def sqlite_timelimit(conn, ms):
    deadline = time.perf_counter() + (ms / 1000)
    # n is the number of SQLite virtual machine instructions that will be
    # executed between each check. It's hard to know what to pick here.
    # After some experimentation, I've decided to go with 1000 by default and
    # 1 for time limits that are less than 50ms
    n = 1000
    if ms < 50:
        n = 1

    def handler():
        if time.perf_counter() >= deadline:
            return 1

    conn.set_progress_handler(handler, n)
    try:
        yield
    finally:
        conn.set_progress_handler(None, n)
```
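One way to get a rough number (a hypothetical micro-benchmark, not from the issue):
```python
import sqlite3
import time

conn = sqlite3.connect("":memory:"")
conn.execute(""create table t (n integer)"")
conn.executemany(""insert into t values (?)"", [(i,) for i in range(100000)])

def timed(label):
    start = time.perf_counter()
    conn.execute(""select count(*) from t where n % 7 = 0"").fetchall()
    print(label, time.perf_counter() - start)

timed(""no handler"")
conn.set_progress_handler(lambda: 0, 1)  # n=1: check after every VM instruction
timed(""n=1 handler"")
```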
How often do I set a time limit of 50 or less? How much slower does it go thanks to this code?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1679/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1175894898,I_kwDOBm6k_c5GFrty,1680,Consider simplifying permissions for 1.0,9599,simonw,open,0,,,3268330,Datasette 1.0,0,2022-03-21T20:17:29Z,2022-03-21T20:17:29Z,,OWNER,,"Permission checks right now can express one of three opinions:
- `False` means ""do not grant this permission""
- `True` means ""grant this permission""
- `None` means ""I have no opinion""
But... there's also a concept of a ""default"" for a given permission check, which might be `False` or `True`.
I worry this is too complicated. Could this be simplified before 1.0? In particular the default concept.
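To make the current behavior concrete, here is a simplified sketch of how a single check resolves (my paraphrase, not code from Datasette):
```python
# Simplified sketch of three-valued permission resolution plus a default
def resolve(hook_results, default):
    for result in hook_results:  # results from permission_allowed hooks
        if result is not None:   # True or False is a real opinion
            return result
    return default  # every hook abstained, fall back to the default
```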
See also:
- #1676 ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1680/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1177101697,I_kwDOBm6k_c5GKSWB,1681,Potential bug in numeric handling where_clause for filters,9599,simonw,open,0,,,,,2,2022-03-22T17:43:50Z,2022-03-22T17:49:09Z,,OWNER,,"> Note that Datasette does already have special logic to convert parameters to integers for numeric comparisons like `>`:
>
> https://github.com/simonw/datasette/blob/c4c9dbd0386e46d2bf199f0ed34e4895c98cb78c/datasette/filters.py#L203-L212
>
> Though... it looks like there's a bug in that? It doesn't account for `float` values - `""3.5"".isdigit()` return `False` - probably for the best, because `int(3.5)` would break that value anyway.
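A float-aware conversion might look like this (a sketch, not the shipped fix):
```python
# Hypothetical: fall back to float() before giving up on a numeric comparison
def to_number(value):
    try:
        return int(value)
    except ValueError:
        try:
            return float(value)
        except ValueError:
            return value  # leave non-numeric strings untouched
```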
_Originally posted by @simonw in https://github.com/simonw/datasette/issues/1671#issuecomment-1075432283_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1681/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1178521513,I_kwDOBm6k_c5GPs-p,1682,SQL queries against databases with different routes are broken,9599,simonw,closed,0,,,,,1,2022-03-23T18:42:57Z,2022-03-23T18:48:16Z,2022-03-23T18:48:16Z,OWNER,,"500 error on https://datasette-hashed-urls-preview.vercel.app/fixtures-09f8f95?sql=select+*+from+facetable
Here's the trace:
```
File ""/Users/simon/.local/share/virtualenvs/datasette-hashed-urls-ssI2fO50/lib/python3.10/site-packages/datasette/views/database.py"", line 54, in data
return await QueryView(self.ds).data(
File ""/Users/simon/.local/share/virtualenvs/datasette-hashed-urls-ssI2fO50/lib/python3.10/site-packages/datasette/views/database.py"", line 232, in data
self.ds.get_database(database), sql
File ""/Users/simon/.local/share/virtualenvs/datasette-hashed-urls-ssI2fO50/lib/python3.10/site-packages/datasette/app.py"", line 401, in get_database
return self.databases[name]
KeyError: 'fixtures-aa7318b'
```
It looks like this is a Datasette bug, which is frustrating because I just shipped Datasette 0.61 five minutes ago!
_Originally posted by @simonw in https://github.com/simonw/datasette-hashed-urls/issues/13#issuecomment-1076693667_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1682/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1179928510,I_kwDOBm6k_c5GVEe-,1683,allow_facet: False should be respected by column cog menu,9599,simonw,closed,0,,,,,0,2022-03-24T19:05:06Z,2022-03-24T19:16:36Z,2022-03-24T19:16:36Z,OWNER,,"The column cog menu currently shows ""Facet by this"" even if faceting is disabled for the Datasette instance.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1683/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1179998071,I_kwDOBm6k_c5GVVd3,1684,Mechanism for disabling faceting on large tables only,9599,simonw,open,0,,,,,1,2022-03-24T20:06:11Z,2022-03-24T20:13:19Z,,OWNER,,"Forest turned off faceting on https://labordata.bunkum.us/ because it was causing performance problems on some of the huge tables - but it would be nice if it could still be an option on smaller tables such as https://labordata.bunkum.us/voluntary_recognitions-4421085/voluntary_recognitions
One option: a new setting that automatically disables faceting (and facet suggestion) for tables that have either more than X rows or that are so big that the count could not be completed within the time limit.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1684/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1181037277,I_kwDOBm6k_c5GZTLd,1686,heroku bails if app name specified in datasette publish is the same as existing app,2115933,tlongers,open,0,,,,,0,2022-03-25T17:10:34Z,2022-03-25T17:10:34Z,,NONE,,"Seems that `heroku` does not accept an app overwrite triggered by specifying the app name using `datasette publish`, as below:
```
datasette publish heroku some.db --name ""jazzy-name""
```
The resulting error has the below traceback:
```
Creating jazzy-name... !
▸ Name jazzy-name is already taken
Traceback (most recent call last):
  File ""/opt/homebrew/bin/datasette"", line 33, in <module>
    sys.exit(load_entry_point('datasette==0.60.1', 'console_scripts', 'datasette')())
  File ""/opt/homebrew/Cellar/datasette/0.60.1/libexec/lib/python3.10/site-packages/click/core.py"", line 1128, in __call__
    return self.main(*args, **kwargs)
  File ""/opt/homebrew/Cellar/datasette/0.60.1/libexec/lib/python3.10/site-packages/click/core.py"", line 1053, in main
    rv = self.invoke(ctx)
  File ""/opt/homebrew/Cellar/datasette/0.60.1/libexec/lib/python3.10/site-packages/click/core.py"", line 1659, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File ""/opt/homebrew/Cellar/datasette/0.60.1/libexec/lib/python3.10/site-packages/click/core.py"", line 1659, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File ""/opt/homebrew/Cellar/datasette/0.60.1/libexec/lib/python3.10/site-packages/click/core.py"", line 1395, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File ""/opt/homebrew/Cellar/datasette/0.60.1/libexec/lib/python3.10/site-packages/click/core.py"", line 754, in invoke
    return __callback(*args, **kwargs)
  File ""/opt/homebrew/Cellar/datasette/0.60.1/libexec/lib/python3.10/site-packages/datasette/publish/heroku.py"", line 127, in heroku
    create_output = check_output(cmd).decode(""utf8"")
  File ""/opt/homebrew/Cellar/python@3.10/3.10.2/Frameworks/Python.framework/Versions/3.10/lib/python3.10/subprocess.py"", line 420, in check_output
    return run(*popenargs, stdout=PIPE, timeout=timeout, check=True,
  File ""/opt/homebrew/Cellar/python@3.10/3.10.2/Frameworks/Python.framework/Versions/3.10/lib/python3.10/subprocess.py"", line 524, in run
    raise CalledProcessError(retcode, process.args,
subprocess.CalledProcessError: Command '['heroku', 'apps:create', 'jazzy-name', '--json']' returned non-zero exit status 1.
```
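One manual workaround (assuming the standard Heroku CLI commands - treat the exact flags as an assumption) is to destroy the existing app and then republish:
    heroku apps:destroy --app jazzy-name --confirm jazzy-name
    datasette publish heroku some.db --name jazzy-name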
It's a solid failsafe, but does `datasette publish` have a way to force an overwrite?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1686/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1181364043,I_kwDOBm6k_c5Gai9L,1687,Make show_json.html or a similar mechanism stable for plugins,9599,simonw,open,0,,,,,0,2022-03-25T23:42:45Z,2022-03-25T23:42:45Z,,OWNER,,"I used `show_json.html` in the new `datasette-packages` plugin, which means it will break if that template changes:
- https://github.com/simonw/datasette-packages/issues/3
It would be useful if it (or something like it) was documented and stable for plugins to use.
Also relevant:
- #878",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1687/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1181432624,I_kwDOBm6k_c5Gazsw,1688,[plugins][documentation] Is it possible to serve per-plugin static folders when writing one-off (single file) plugins?,9020979,hydrosquall,closed,0,,,,,3,2022-03-26T01:17:44Z,2022-03-27T01:01:14Z,2022-03-26T21:34:47Z,CONTRIBUTOR,,"I'm trying to make a small plugin that depends on static assets, by following the guide [here](https://docs.datasette.io/en/stable/writing_plugins.html#writing-one-off-plugins). I made a `plugins/` directory with `datasette_nteract_data_explorer.py`.
I am trying to follow the example of `datasette_vega`, and serving static assets. I created a `statics/` directory within `plugins/` to serve my JS and CSS.
https://github.com/simonw/datasette-vega/blob/00de059ab1ef77394ba9f9547abfacf966c479c4/datasette_vega/__init__.py#L13
Unfortunately, datasette doesn't seem to be able to find my assets.
Input:
```bash
datasette ~/Library/Safari/History.db --plugins-dir=plugins/
```
![Image 2022-03-25 at 9 18 17 PM](https://user-images.githubusercontent.com/9020979/160218979-a3ff474b-5255-4a76-85d1-6f90ab2e3b44.jpg)
Output:
![Image 2022-03-25 at 9 11 00 PM](https://user-images.githubusercontent.com/9020979/160218733-ca5144cf-f23f-43d8-a8d3-e3a871e57f3a.jpg)
I suspect this issue might go away if I move away from ""one-off"" plugin mode, but it's been a while since I created a new Python package, so I'm not sure how much work it is to go from ""one-off"" to ""packaged for PyPI"". I'd like to avoid repackaging a new `tar.gz` file and/or reinstalling my library repeatedly while developing new Python code.
1. Is there a way to serve static assets when using the `plugins/` directory method instead of installing plugins as a new Python package? (See the note after this list.)
2. If not, is there a way I can work on developing a plugin without creating and repackaging tar.gz files after every change, or is that the recommended path?
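(One possible workaround, assuming Datasette's documented `--static mount:directory` serve option applies here: `datasette ~/Library/Safari/History.db --plugins-dir=plugins/ --static statics:plugins/statics/` should serve those files under `/statics/` without any packaging step.)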
Thanks for your help!
",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1688/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1182227211,I_kwDOBm6k_c5Gd1sL,1692,[plugins][feature request]: Support additional script tag attributes when loading custom JS,9020979,hydrosquall,open,0,,,,,2,2022-03-27T01:16:03Z,2022-03-30T06:14:51Z,,CONTRIBUTOR,,"## Motivation
- The build system for my new [plugin](https://github.com/hydrosquall/datasette-nteract-data-explorer) has two output JS files, one for browsers that support ES modules, one for browsers that don't. At present, I'm only passing one of them into Datasette.
- I'd like to specify the non-es-module script as a fallback for older browsers. I don't want to load it by default, because browsers will only need one, and it's heavy, so for now I'm only supporting modern browsers.
To be able to support legacy browsers without slowing down users with modern browsers, I would like to be able to set additional HTML attributes on the fallback script tag: `nomodule` and `defer`. My injected scripts should look something like this:
```html
<script type=""module"" src=""/index.my-es-module-bundle.js""></script>
<script src=""/index.my-legacy-fallback-bundle.js"" nomodule defer></script>
```
## Proposal
To achieve this, I propose adding optional properties to the values accepted by the `extra_js_urls` hook and by the custom JS field in `metadata.json` [described here](https://docs.datasette.io/en/stable/custom_templates.html#custom-css-and-javascript).
Under this API, I'd write something like this to get the above HTML rendered in Datasette.
```json
{
    ""extra_js_urls"": [
        {
            ""url"": ""/index.my-es-module-bundle.js"",
            ""module"": true
        },
        {
            ""url"": ""/index.my-legacy-fallback-bundle.js"",
            ""nomodule"": """",
            ""defer"": true
        }
    ]
}
```
## Resources
- [MDN on the script tag](https://developer.mozilla.org/en-US/docs/Web/HTML/Element/script)
- There may be other properties that could be added that are potentially valuable, like `async` or `referrerpolicy`, but I don't have an immediate need for those.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1692/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1182065616,I_kwDOBm6k_c5GdOPQ,1689,datasette.add_message() documentation is incorrect,9599,simonw,closed,0,,,,,1,2022-03-26T20:49:42Z,2022-03-26T21:35:57Z,2022-03-26T20:51:21Z,OWNER,,"https://docs.datasette.io/en/0.61.1/internals.html#add-message-request-message-message-type-datasette-info says:
`.add_message(request, message, message_type=datasette.INFO)`
But in the code it's:
https://github.com/simonw/datasette/blob/6b99e4a66ba0ed8fca8ee41ceb7206928b60d5d1/datasette/app.py#L582",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1689/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1182141761,I_kwDOBm6k_c5Gdg1B,1690,"Idea: `datasette.set_actor_cookie(response, actor)`",9599,simonw,open,0,,,,,2,2022-03-26T22:41:52Z,2022-03-26T22:43:00Z,,OWNER,,"I just wrote this code in a plugin and it felt like it could benefit from an abstraction: https://github.com/simonw/datasette-auth0/blob/152e6eb21e96e9b73bd9c205f9749a1297d0ef0b/datasette_auth0/__init__.py#L79-L92
```python
redirect_response = Response.redirect(""/"")
expires_at = int(time.time()) + (24 * 60 * 60)
redirect_response.set_cookie(
    ""ds_actor"",
    datasette.sign(
        {
            ""a"": profile_response.json(),
            ""e"": baseconv.base62.encode(expires_at),
        },
        ""actor"",
    ),
)
return redirect_response
```
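A sketch of what that abstraction might look like (a hypothetical method on the Datasette class - the name and signature are assumptions):
```python
def set_actor_cookie(self, response, actor, expire_after=24 * 60 * 60):
    # Signs the actor dictionary and sets it as the ds_actor cookie,
    # with an expiry timestamp encoded the same way as the snippet above
    expires_at = int(time.time()) + expire_after
    response.set_cookie(
        'ds_actor',
        self.sign(
            {'a': actor, 'e': baseconv.base62.encode(expires_at)},
            'actor',
        ),
    )
```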
",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1690/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1182143895,I_kwDOBm6k_c5GdhWX,1691,Bug in pytest-httpx example,9599,simonw,closed,0,,,,,0,2022-03-26T22:45:30Z,2022-03-26T22:46:09Z,2022-03-26T22:46:09Z,OWNER,,"https://docs.datasette.io/en/0.61.1/testing_plugins.html#testing-outbound-http-calls-with-pytest-httpx says:
```python
async def test_outbound_http_call(httpx_mock):
    httpx_mock.add_response(
        url='https://www.example.com/',
        data='Hello world',
    )
```
That's wrong - `data=` should be `text=`.
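The corrected version, per the pytest-httpx README linked below:
```python
async def test_outbound_http_call(httpx_mock):
    httpx_mock.add_response(
        url='https://www.example.com/',
        text='Hello world',
    )
```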
https://github.com/Colin-b/pytest_httpx/blob/v0.20.0/README.md#reply-with-custom-body",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1691/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1185868354,I_kwDOBm6k_c5GrupC,1695,Option to un-filter facet not shown for `?col__exact=value`,9599,simonw,open,0,,,,,2,2022-03-30T04:44:02Z,2022-03-30T04:46:18Z,,OWNER,,"Spotted this on a page with `COUNTY__exact=Lee` in the URL:
![CleanShot 2022-03-29 at 21 41 46@2x](https://user-images.githubusercontent.com/9599/160752849-a9039343-3770-4655-920b-f19e25687a57.png)
With `COUNTY=Lee` you get this instead:
",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1695/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1186696202,I_kwDOBm6k_c5Gu4wK,1696,Show foreign key label when filtering,9599,simonw,open,0,,,,,2,2022-03-30T16:18:54Z,2023-01-29T20:56:20Z,,OWNER,,"For example here:
3 corresponds to ""Human Related: Other"" - it would be neat to display this in this area of the page somehow.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1696/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1193090967,I_kwDOBm6k_c5HHR-X,1699,Proposal: datasette query,25778,eyeseast,open,0,,,,,6,2022-04-05T12:36:43Z,2022-04-11T01:32:12Z,,CONTRIBUTOR,,"I started sketching out a plugin to add a `datasette query` subcommand to export data from the command line. This is based on discussions in #1356 and #1605. Before I get too far down this rabbit hole, I figure it's worth getting some feedback here (unless this should happen in `Discussions`). Here's what I'm thinking:
At its most basic, it will write the results of a query to STDOUT.
```sh
datasette query -d data.db 'select * from data' > results.json
```
This isn't much improvement over using [sqlite-utils](https://github.com/simonw/sqlite-utils). To make better use of datasette and its ecosystem, run `datasette query` using a canned query defined in a `metadata.yml` file.
For example, using the metadata file from [alltheplaces-datasette](https://github.com/eyeseast/alltheplaces-datasette/blob/main/metadata.yml):
```sh
cd alltheplaces-datasette
datasette query -d alltheplaces.db -m metadata.yml count_by_spider
```
That query would be good to get as CSV, and we can auto-discover metadata and databases in the current directory:
```sh
cd alltheplaces-datasette
datasette query count_by_spider -f csv
```
In this case, `count_by_spider` is a canned query defined on the `alltheplaces` database. If the same query is defined on multiple databases or it's otherwise unclear which database `query` should use, pass the `-d` or `--database` option.
If a query takes parameters, I can pass them in at runtime, using the `--param` or `-p` option:
```sh
datasette query -d data.db -p value something 'select * from neighborhoods where some_column = :value'
```
I'm very interested in feedback on this, including whether it should be a plugin or in Datasette core. (I don't have a strong opinion about this, but I'm prototyping it as a plugin to start.)",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1699/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1194790504,I_kwDOBm6k_c5HNw5o,1701,Use + for spaces instead of ~20,9599,simonw,closed,0,,,3268330,Datasette 1.0,0,2022-04-06T15:40:48Z,2022-04-06T15:55:10Z,2022-04-06T15:55:05Z,OWNER,,"Tilde encoding introduced in #1657 means that database files with spaces in the name - e.g. the Apple Mail `Envelope Index` database - end up with URLs like this:
http://127.0.0.1:8001/Envelope~20Index
I think this would be prettier:
http://127.0.0.1:9933/Envelope+Index",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1701/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1196327155,I_kwDOBm6k_c5HToDz,1702,Be more consistent with column quoting,9599,simonw,open,0,,,,,0,2022-04-07T16:59:20Z,2022-04-07T16:59:20Z,,OWNER,,"This tutorial made me notice that Datasette is pretty inconsistent with how column quoting works: https://datasette.io/tutorials/learn-sql
It has examples of each of `""table_name""` and `[table_name]` and `table_name`, and it uses single quoted values too.
Datasette should generate SQL as consistently as possible to support learners.
That tutorial should also provide a tiny bit of extra information about what's going on here.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1702/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1197925865,I_kwDOBm6k_c5HZuXp,1704,File PRs against incompatible plugins pinning to datasette<1.0,9599,simonw,open,0,,,3268330,Datasette 1.0,0,2022-04-08T23:15:30Z,2022-04-08T23:15:30Z,,OWNER,,"As part of the preparation for the 1.0 release, test all existing known plugins against the alpha.
For any that break, submit a PR suggesting they pin to a version <1.0 - and include a link to the documentation on how to upgrade the plugin for 1.0.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1704/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1197926598,I_kwDOBm6k_c5HZujG,1705,How to upgrade your plugin for 1.0 documentation,9599,simonw,open,0,,,8755003,Datasette 1.0a-next,1,2022-04-08T23:16:47Z,2022-12-13T05:29:05Z,,OWNER,,"Among other things, needed by:
- #1704",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1705/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1198822563,I_kwDOBm6k_c5HdJSj,1706,"[feature] immutable mode for a directory, not just individual sqlite file",9020979,hydrosquall,open,0,,,,,4,2022-04-10T00:50:57Z,2022-12-09T19:11:40Z,,CONTRIBUTOR,,"## Motivation
- I have a directory of sqlite databases
- I'd like to use immutable mode when opening them for better performance [docs](https://docs.datasette.io/en/0.54/performance.html#immutable-mode)
- Currently, using this flag with a directory throws the following error:
IsADirectoryError: [Errno 21] Is a directory: '/name-of-directory'
## Proposal
Immutable flag works for both single files and directories
datasette -i /folder-of-sqlite-files",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1706/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1200224939,I_kwDOBm6k_c5Hifqr,1707,[feature] expanded detail page,536941,fgregg,open,0,,,,,1,2022-04-11T16:29:17Z,2022-04-11T16:33:00Z,,CONTRIBUTOR,,"Right now, if click on the detail page for a row you get the info for the row and links to related tables:
![Screenshot 2022-04-11 at 12-27-26 lm20 filing](https://user-images.githubusercontent.com/536941/162786802-90ac1a71-4624-47c4-ae55-b783f4f6c92d.png)
It would be very cool if there was an option to expand the rows of the related tables from within this detail view.
If you had that, then datasette could fulfill a pretty common use case where you want to search for an entity and get a consolidated detail view of what you know about that entity.
",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1707/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1200649124,I_kwDOBm6k_c5HkHOk,1708,Datasette 1.0 alpha upcoming release notes,9599,simonw,open,0,,,8755003,Datasette 1.0a-next,2,2022-04-11T22:57:12Z,2022-12-13T05:29:06Z,,OWNER,,"I'm going to try writing the release notes first, to see if that helps unblock me.
# ⚠️ Any release notes in this issue are a draft, and should not be treated as the real thing ⚠️ ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1708/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1200649502,I_kwDOBm6k_c5HkHUe,1709,Redesigned JSON API with ?_extra= parameters,9599,simonw,open,0,,,8755003,Datasette 1.0a-next,1,2022-04-11T22:57:49Z,2022-12-13T05:29:06Z,,OWNER,,This will be the single biggest breaking change for the 1.0 release.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1709/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1200649889,I_kwDOBm6k_c5HkHah,1710,Guide for plugin authors to upgrade their plugins for 1.0,9599,simonw,closed,0,,,,,1,2022-04-11T22:58:25Z,2022-04-11T23:04:01Z,2022-04-11T23:03:25Z,OWNER,,I'll also encourage testing against both Datasette 0.x and Datasette 1.0 using a GitHub Actions matrix.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1710/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1200650491,I_kwDOBm6k_c5HkHj7,1711,Template context powered entirely by the JSON API format,9599,simonw,open,0,,,8755003,Datasette 1.0a-next,1,2022-04-11T22:59:27Z,2022-12-13T05:29:06Z,,OWNER,,Datasette 1.0 will have a stable template context. I'm going to achieve this by refactoring the templates to work only with keys returned by the API (or some of its extras) - then the API documentation will double up as template documentation.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1711/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1202227104,I_kwDOBm6k_c5HqIeg,1712,"Make """" easier to read",9599,simonw,closed,0,,,,,3,2022-04-12T18:17:07Z,2022-04-12T19:12:22Z,2022-04-12T18:44:20Z,OWNER,,"`Binary: 2,427,344 bytes` would be nicer - even better, include a tooltip showing that size translated using this function: https://github.com/simonw/datasette/blob/138e4d9a53e3982137294ba383303c3a848cfca4/datasette/utils/__init__.py#L837-L846
![CleanShot 2022-04-12 at 11 15 04@2x](https://user-images.githubusercontent.com/9599/163027324-b0b6092e-6e11-438b-8077-789025d0bb37.png)
",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1712/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1203943272,I_kwDOBm6k_c5Hwrdo,1713,Datasette feature for publishing snapshots of query results,9599,simonw,open,0,,,,,5,2022-04-14T01:42:00Z,2022-07-04T05:16:35Z,,OWNER,,"https://twitter.com/simonw/status/1514392335718645760
> Maybe [@datasetteproj](https://twitter.com/datasetteproj) should grow a feature that lets you cache the results of a query and give that snapshot a stable permalink
>
> A plugin that publishes the JSON output of a query to an S3 bucket would be pretty neat... especially if it could also be configured to re-publish the results on a schedule
A lot of people said they would find this useful.
Probably going to build this as a plugin.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1713/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1221849746,I_kwDOBm6k_c5I0_KS,1732,Custom page variables aren't decoded,52649,tannewt,open,0,,,,,2,2022-04-30T14:55:46Z,2022-05-03T01:50:45Z,,NONE,,"I have a page `templates/filer/{filer_id}.html`. It uses `filer_id` in a `sql()` call to fetch data. With 0.61.1 this no longer works because the spaces in IDs isn't preserved. Instead, the escaped version is passed into the template and the id isn't present in my db.
Datasette should unescape the URL components before passing them into the template.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1732/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1223234932,I_kwDOBm6k_c5I6RV0,1733,Get Datasette compatible with Pyodide,9599,simonw,closed,0,,,,,9,2022-05-02T19:01:58Z,2022-05-04T15:14:01Z,2022-05-02T20:15:27Z,OWNER,,"I've already got this working as a prototype. Here are the changes I had to make:
- Replace the two dependencies that don't publish pure Python wheels to PyPI: `click-default-group` and `python-baseconv`
- Get Datasette to work without threading - which it turns out is exclusively used for database connections
- Make the `uvicorn` dependency optional (only needed when Datasette runs in the CLI)
TODO:
- [x] Switch to `click-default-group-wheel`
- [x] https://github.com/simonw/datasette/issues/1734
- [x] Work around `uvicorn` import error
- [x] https://github.com/simonw/datasette/issues/1735
- [x] #1737
Goal is to be able to do the following directly in https://pyodide.org/en/stable/console.html
```python
import micropip
await micropip.install(""datasette"")
from datasette.app import Datasette
ds = Datasette()
await ds.client.get(""/.json"")
```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1733/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1223241647,I_kwDOBm6k_c5I6S-v,1734,Remove python-baseconv dependency,9599,simonw,closed,0,,,,,3,2022-05-02T19:08:37Z,2022-05-02T23:25:49Z,2022-05-02T19:39:20Z,OWNER,,"> I was going to vendor `baseconv.py`, but then I reconsidered - what if there are plugins out there that expect `import baseconv` to work because they have depended on Datasette?
>
> I used https://cs.github.com/ and as far as I can tell there aren't any!
>
> So I'm going to remove that dependency and work out a smarter way to do this - probably by providing a utility function within Datasette itself.
_Originally posted by @simonw in https://github.com/simonw/datasette/issues/1733#issuecomment-1115258737_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1734/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1223263540,I_kwDOBm6k_c5I6YU0,1735,Datasette setting to disable threading (for Pyodide),9599,simonw,closed,0,,,,,1,2022-05-02T19:31:08Z,2022-05-02T23:25:49Z,2022-05-02T20:13:52Z,OWNER,,"> I'm going to add a Datasette setting to disable threading entirely, designed for usage in this particular case.
>
> I thought about adding a new setting, then I noticed this:
>
> datasette mydatabase.db --setting num_sql_threads 10
>
> I'm going to let users set that to `0` to disable threaded execution of SQL queries.
_Originally posted by @simonw in https://github.com/simonw/datasette/issues/1733#issuecomment-1115278325_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1735/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1223459734,I_kwDOBm6k_c5I7IOW,1737,Automated test for Pyodide compatibility,9599,simonw,closed,0,,,,,4,2022-05-02T23:24:25Z,2022-05-02T23:40:50Z,2022-05-02T23:40:50Z,OWNER,,"Refs:
- #1733
Need something in the test suite such that if Datasette breaks against Pyodide in the future we hear about it.
I'm thinking this is an opportunity to use [shot-scraper javascript](https://github.com/simonw/shot-scraper#scraping-pages-using-javascript).",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1737/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1223527226,I_kwDOBm6k_c5I7Ys6,1738,"""Cannot use _sort and _sort_desc at the same time""",9599,simonw,closed,0,,,8303187,Datasette 0.62,2,2022-05-03T01:06:24Z,2022-08-14T16:13:55Z,2022-08-14T16:13:55Z,OWNER,,"Triggered this error while playing with the sort desc checkbox and the apply button that are only visible on this page at mobile screen width:
https://latest.datasette.io/fixtures/compound_three_primary_keys?_sort_desc=pk1
Navigate to that page (with the browser narrow enough to show the box), un-check the box and click Apply:
![sort-bug](https://user-images.githubusercontent.com/9599/166390804-cb289b29-63dc-4986-b7f9-81cf2ae04914.gif)
Also notable: I managed to get to a page with `?_sort_desk=pk1` in the URL three times by clicking around with that button.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1738/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1223699280,I_kwDOBm6k_c5I8CtQ,1739,.db downloads should be served with an ETag,9599,simonw,closed,0,,,,,6,2022-05-03T05:11:21Z,2022-05-04T18:21:18Z,2022-05-03T14:59:51Z,OWNER,,"I noticed that my Pyodide Datasette prototype is downloading the same database file every single time rather than browser caching it:
![image](https://user-images.githubusercontent.com/9599/166407074-dee19587-0667-4424-9e88-d3b5b90fd819.png)
",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1739/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1212823665,I_kwDOBm6k_c5ISjhx,1715,Refactor TableView to use asyncinject,9599,simonw,closed,0,,,,,13,2022-04-22T21:43:39Z,2022-12-01T21:15:18Z,2022-04-28T22:26:56Z,OWNER,,"I've been working on a dependency injection mechanism in a separate library:
- https://github.com/simonw/asyncinject
I think it's ready to try out with Datasette to see if it's a pattern that will work here.
I'm going to attempt to refactor `TableView` to use it. There are two overall goals here:
- Use `asyncinject` to add parallel execution of some aspects of the table page - most notably I want to be able to execute the `count(*)` query, the `select ...` query, the various faceting queries and the facet suggestion queries in parallel - and measure if doing so is good for performance (a sketch of the general idea follows this list).
- Use it to execute different output formats (possibly with some changes to the existing `register_output_renderer()` plugin hook). I want CSV and JSON to use the same mechanism that plugins use.
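A minimal sketch of the parallel execution idea using `asyncio.gather()` (illustrative only - the table and facet queries here are assumptions):
```python
import asyncio

async def table_page_queries(db, table):
    # db.execute() is awaitable, so the count, rows and facet queries
    # can be started together and awaited as a group
    count, rows, facets = await asyncio.gather(
        db.execute(f'select count(*) from [{table}]'),
        db.execute(f'select * from [{table}] limit 100'),
        db.execute(f'select owner, count(*) from [{table}] group by owner'),
    )
    return count, rows, facets
```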
Stretch goal is to get this working with streaming data too, see:
- #1101",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1715/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1212838949,I_kwDOBm6k_c5ISnQl,1716,Configure git blame to ignore Black commit,9599,simonw,closed,0,,,,,1,2022-04-22T21:56:37Z,2022-04-22T22:02:19Z,2022-04-22T22:02:19Z,OWNER,,"GitHub can support this in blame views now too:
https://docs.github.com/en/repositories/working-with-files/using-files/viewing-a-file#ignore-commits-in-the-blame-view",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1716/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1213683988,I_kwDOBm6k_c5IV1kU,1718,Code examples in the documentation should be formatted with Black,9599,simonw,closed,0,,,,,12,2022-04-24T15:22:50Z,2022-04-24T16:24:14Z,2022-04-24T16:18:03Z,OWNER,,"For example on this page: https://docs.datasette.io/en/stable/writing_plugins.html#packaging-a-plugin
I wonder if there's an easy way for me to enforce this for Sphinx documentation?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1718/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1214859703,I_kwDOBm6k_c5IaUm3,1719,Refactor `RowView` and remove `RowTableShared`,9599,simonw,closed,0,,,,,3,2022-04-25T18:06:24Z,2022-12-01T21:15:19Z,2022-04-25T18:33:44Z,OWNER,,"> The `RowTableShared` class is making this a whole lot more complicated.
>
> I'm going to split the `RowView` view out into an entirely separate `views/row.py` module.
_Originally posted by @simonw in https://github.com/simonw/datasette/issues/1715#issuecomment-1108875068_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1719/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1215174094,I_kwDOBm6k_c5IbhXO,1720,Design plugin hook for extras,9599,simonw,closed,0,,,,,14,2022-04-26T00:08:10Z,2022-12-01T21:15:19Z,2022-04-26T20:20:27Z,OWNER,,"Refs:
- #262
- #1709
I realized that this is a really natural plugin hook - and if I design it as a hook I can implement Datasette's core extras as default plugins.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1720/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1216436131,I_kwDOBm6k_c5IgVej,1721,"Implement plugin hooks: `register_table_extras`, `register_row_extras`, `register_query_extras`",9599,simonw,open,0,,,8755003,Datasette 1.0a-next,0,2022-04-26T20:21:49Z,2022-12-13T05:29:07Z,,OWNER,,"Designed in:
- #1720
Part of:
- #262
- #1709",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1721/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1216479167,I_kwDOBm6k_c5Igf-_,1722,`db.primary_keys()` and `db.table_columns()` don't show up in traces,9599,simonw,open,0,,,,,0,2022-04-26T21:08:36Z,2022-04-26T21:08:36Z,,OWNER,,"Noticed this while working on:
- #1715
This code here isn't showing up in traces: https://github.com/simonw/datasette/blob/579f59dcec43a91dd7d404e00b87a00afd8515f2/datasette/views/table.py#L218-L220
Because those functions don't use the regular trace-instrumented `db.execute()` code path - they work directly against a connection instead: https://github.com/simonw/datasette/blob/579f59dcec43a91dd7d404e00b87a00afd8515f2/datasette/utils/__init__.py#L610-L626
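One possible fix (a sketch - it assumes the same `trace()` context manager that `db.execute()` uses) would be to wrap the direct-connection helpers:
```python
from datasette.tracer import trace

def traced_table_columns(conn, table):
    # Wrapping the query makes it show up in ?_trace=1 output
    sql = f'PRAGMA table_info([{table}])'
    with trace('sql', sql=sql):
        return [row[1] for row in conn.execute(sql)]
```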
",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1722/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1216508080,I_kwDOBm6k_c5IgnCw,1723,Research running SQL in table view in parallel using `asyncio.gather()`,9599,simonw,closed,0,,,,,5,2022-04-26T21:42:48Z,2022-04-27T18:53:44Z,2022-04-26T22:19:09Z,OWNER,,"Spun off from:
- #1715",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1723/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1216619276,I_kwDOBm6k_c5IhCMM,1724,?_trace=1 doesn't work on Global Power Plants demo,9599,simonw,closed,0,,,,,3,2022-04-27T00:15:02Z,2022-04-27T06:15:14Z,2022-04-27T00:18:30Z,OWNER,,"https://global-power-plants.datasettes.com/global-power-plants/global-power-plants?_trace=1 is not showing the trace JSON at the bottom of the page.
Confirmed that `trace_debug` is `true` on https://global-power-plants.datasettes.com/-/settings
Possibly related:
- https://github.com/simonw/datasette-total-page-time/issues/1",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1724/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1216622905,I_kwDOBm6k_c5IhDE5,1725,Performance question - what is happening in this gap?,9599,simonw,open,0,,,,,0,2022-04-27T00:21:11Z,2022-04-27T00:21:11Z,,OWNER,,"Trace from https://latest-with-plugins.datasette.io/github/commits?_facet=repo&_trace=1&_facet=committer
![CleanShot 2022-04-26 at 17 20 06@2x](https://user-images.githubusercontent.com/9599/165413811-db2cd599-2acc-46ce-b9c2-f9bc45b879e9.png)
What's going on in that gap? Can I improve the tracing output to show some non-SQL queries to figure that out?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1725/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1217014076,I_kwDOBm6k_c5Iiik8,1726,Security page in the documentation,9599,simonw,open,0,,,,,0,2022-04-27T08:43:30Z,2022-04-27T08:43:30Z,,OWNER,,"A page talking about how to run Datasette securely, and security concerns to take into account.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1726/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1217759117,I_kwDOBm6k_c5IlYeN,1727,Research: demonstrate if parallel SQL queries are worthwhile,9599,simonw,open,0,,,,,32,2022-04-27T18:54:21Z,2022-09-26T14:48:31Z,,OWNER,,"I added parallel SQL query execution here:
- https://github.com/simonw/datasette/issues/1723
My hunch is that this will take advantage of multiple cores, since Python's `sqlite3` module releases the GIL once a query is passed to SQLite.
I'd really like to prove this is the case though. Just not sure how to do it!
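One way to test it (a sketch, assuming a Datasette `Database` object named `db` and a reasonably expensive query):
```python
import asyncio
import time

async def compare(db, sql, n=4):
    # Same query n times sequentially, then n times concurrently
    start = time.perf_counter()
    for _ in range(n):
        await db.execute(sql)
    sequential = time.perf_counter() - start

    start = time.perf_counter()
    await asyncio.gather(*(db.execute(sql) for _ in range(n)))
    parallel = time.perf_counter() - start
    print(f'sequential={sequential:.3f}s parallel={parallel:.3f}s')
```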
Larger question: is this performance optimization actually improving performance at all? Under what circumstances is it worthwhile?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1727/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1218133366,I_kwDOBm6k_c5Imz12,1728,Writable canned queries fail with useless non-error against immutable databases,127565,wragge,closed,0,,,8303187,Datasette 0.62,13,2022-04-28T03:10:34Z,2022-08-14T16:34:40Z,2022-08-14T16:34:40Z,CONTRIBUTOR,,"I've been banging my head against a wall for a while and would appreciate any pointers...
- I have a writeable canned query to update rows in the db.
- I'm using the github-oauth plugin for authentication.
- I have `allow` set on the query to accept my GitHub id and a GH organisation.
- Authentication seems to work as expected both locally and on Cloudrun -- viewing `/-/actor` gives the same result in both environments
- I can access the 'padlocked' canned query in both environments.
Everything seems to be the same, but the canned query works perfectly when run locally, and fails when I try it on Cloudrun. I'm redirected back to the canned query page and the db is not changed. There's nothing in the Cloud Run logs to indicate an error.
Any clues as to where I should be looking for the problem?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1728/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1219385669,I_kwDOBm6k_c5IrllF,1729,Implement ?_extra and new API design for TableView,9599,simonw,open,0,,,8755003,Datasette 1.0a-next,12,2022-04-28T22:28:14Z,2022-12-13T05:29:07Z,,OWNER,,"Part of:
- #262
- #1518",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1729/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1219398983,I_kwDOBm6k_c5Iro1H,1730,SQL tracing should much more closely track the SQL query execution,9599,simonw,open,0,,,,,0,2022-04-28T22:41:04Z,2022-04-28T22:41:10Z,,OWNER,,"In #1727 I realized that the SQL tracing was measuring a whole bunch of stuff outside of the SQL query itself.
I started experimenting with this fix for that but it didn't work - I got back an empty JSON array of traces for some reason:
```diff
diff --git a/datasette/database.py b/datasette/database.py
index ba594a8..d7f9172 100644
--- a/datasette/database.py
+++ b/datasette/database.py
@@ -7,7 +7,7 @@ import sys
 import threading
 import uuid
 
-from .tracer import trace
+from .tracer import trace, trace_child_tasks
 from .utils import (
     detect_fts,
     detect_primary_keys,
@@ -207,30 +207,31 @@ class Database:
                 time_limit_ms = custom_time_limit
 
             with sqlite_timelimit(conn, time_limit_ms):
-                try:
-                    cursor = conn.cursor()
-                    cursor.execute(sql, params if params is not None else {})
-                    max_returned_rows = self.ds.max_returned_rows
-                    if max_returned_rows == page_size:
-                        max_returned_rows += 1
-                    if max_returned_rows and truncate:
-                        rows = cursor.fetchmany(max_returned_rows + 1)
-                        truncated = len(rows) > max_returned_rows
-                        rows = rows[:max_returned_rows]
-                    else:
-                        rows = cursor.fetchall()
-                        truncated = False
-                except (sqlite3.OperationalError, sqlite3.DatabaseError) as e:
-                    if e.args == (""interrupted"",):
-                        raise QueryInterrupted(e, sql, params)
-                    if log_sql_errors:
-                        sys.stderr.write(
-                            ""ERROR: conn={}, sql = {}, params = {}: {}\n"".format(
-                                conn, repr(sql), params, e
+                with trace(""sql"", database=self.name, sql=sql.strip(), params=params):
+                    try:
+                        cursor = conn.cursor()
+                        cursor.execute(sql, params if params is not None else {})
+                        max_returned_rows = self.ds.max_returned_rows
+                        if max_returned_rows == page_size:
+                            max_returned_rows += 1
+                        if max_returned_rows and truncate:
+                            rows = cursor.fetchmany(max_returned_rows + 1)
+                            truncated = len(rows) > max_returned_rows
+                            rows = rows[:max_returned_rows]
+                        else:
+                            rows = cursor.fetchall()
+                            truncated = False
+                    except (sqlite3.OperationalError, sqlite3.DatabaseError) as e:
+                        if e.args == (""interrupted"",):
+                            raise QueryInterrupted(e, sql, params)
+                        if log_sql_errors:
+                            sys.stderr.write(
+                                ""ERROR: conn={}, sql = {}, params = {}: {}\n"".format(
+                                    conn, repr(sql), params, e
+                                )
                             )
-                        )
-                        sys.stderr.flush()
-                    raise
+                            sys.stderr.flush()
+                        raise
 
             if truncate:
                 return Results(rows, truncated, cursor.description)
@@ -238,9 +239,8 @@ class Database:
             else:
                 return Results(rows, False, cursor.description)
 
-        with trace(""sql"", database=self.name, sql=sql.strip(), params=params):
-            results = await self.execute_fn(sql_operation_in_thread)
-        return results
+        with trace_child_tasks():
+            return await self.execute_fn(sql_operation_in_thread)
 
     @property
     def size(self):
```
_Originally posted by @simonw in https://github.com/simonw/datasette/issues/1727#issuecomment-1111602802_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1730/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1239008850,I_kwDOBm6k_c5J2cZS,1744,`--nolock` feature for opening locked databases,9599,simonw,closed,0,,,,,7,2022-05-17T18:25:16Z,2022-05-17T19:46:38Z,2022-05-17T19:40:30Z,OWNER,,"The getting started docs currently suggest you try this to browse your Chrome history:
datasette ~/Library/Application\ Support/Google/Chrome/Default/History
But if Chrome is running you will likely get this error:
sqlite3.OperationalError: database is locked
Turns out there's a workaround for this which I just spotted [on the SQLite forum](https://sqlite.org/forum/forumpost/86a67f6995):
> You can do this using a [URI filename](https://sqlite.org/uri.html):
> ```
> sqlite3 'file:places.sqlite?mode=ro&nolock=1'
> ```
> That opens the file `places.sqlite` in read-only mode with locking disabled. This isn't safe, in that changes to the database made by other connections are likely to cause this connection to return incorrect results or crash. Read-only mode should at least mean that you don't corrupt the database in the process.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1744/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1239080102,I_kwDOBm6k_c5J2tym,1745,Documentation on running cog,9599,simonw,closed,0,,,,,1,2022-05-17T19:41:06Z,2022-05-17T19:45:51Z,2022-05-17T19:43:45Z,OWNER,,Noticed that `cog -r docs/*.rst` isn't documented in https://docs.datasette.io/en/latest/contributing.html#editing-and-building-the-documentation,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1745/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1237586379,I_kwDOBm6k_c5JxBHL,1742,?_trace=1 fails with datasette-geojson for some reason,9599,simonw,open,0,,,,,4,2022-05-16T19:06:05Z,2022-05-16T19:42:13Z,,OWNER,,view-source:https://calands.datasettes.com/calands/CPAD_2020a_SuperUnits.geojson?_sort=id&id__exact=4&_labels=on&_trace=1 is showing me a blank page.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1742/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1237871948,I_kwDOBm6k_c5JyG1M,1743,`datasette.utils.to_css_class()` should be a documented internal,9599,simonw,open,0,,,,,0,2022-05-16T23:57:26Z,2022-05-16T23:57:26Z,,OWNER,,"Because I'm using it in this plugin:
- https://github.com/simonw/datasette-upload-dbs/issues/1",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1743/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1243498298,I_kwDOBm6k_c5KHkc6,1746,Switch documentation theme to Furo,9599,simonw,closed,0,,,,,21,2022-05-20T18:42:17Z,2022-05-20T21:28:29Z,2022-05-20T21:28:29Z,OWNER,,"https://github.com/pradyunsg/furo
I just did this for `shot-scraper` and I really like it: https://shot-scraper.datasette.io/en/latest/
- https://github.com/simonw/shot-scraper/issues/77",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1746/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1243512344,I_kwDOBm6k_c5KHn4Y,1747,Add tutorials to the getting started guide,9599,simonw,closed,0,,,,,1,2022-05-20T19:01:52Z,2022-05-20T19:12:30Z,2022-05-20T19:05:34Z,OWNER,,On https://docs.datasette.io/en/stable/getting_started.html,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1747/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1243517592,I_kwDOBm6k_c5KHpKY,1748,Add copy buttons next to code examples in the documentation,9599,simonw,closed,0,,,,,2,2022-05-20T19:09:00Z,2022-05-20T19:15:00Z,2022-05-20T19:11:32Z,OWNER,,Similar to the ones in `datasette-copyable` which are implemented here: https://github.com/executablebooks/sphinx-copybutton/tree/f84c001a0507f8ec46779d0701b079a265564583,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1748/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1247315144,I_kwDOBm6k_c5KWITI,1749,LDAP auth plugin,380241,benswift,open,0,,,,,0,2022-05-25T01:35:12Z,2022-05-25T01:35:12Z,,NONE,,"A [search of the plugins directory](https://datasette.io/plugins?q=ldap) doesn't turn up anything, but is it possible to set up a Datasette app which uses my organisation's LDAP for auth?
If not, how much work would it be to write one (I _may_ have some spare cycles on my team to do this, but we haven't written a datasette plugin before).",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1749/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1251700382,I_kwDOBm6k_c5Km26e,1750,Allow `label_column` to specify array of columns,408765,knutwannheden,open,0,,,,,0,2022-05-28T18:45:48Z,2022-05-28T18:45:48Z,,NONE,,"I think it would be great if the Datasette metadata would allow the `label_column` table key to list multiple columns. Something like:
```json
""tables"": {
""person"": {
""label_column"": [""first_name"", ""last_name""]
},
```
It would even be interesting with a ""label expression"" similar to a Python f-string. E.g. `{row.last_name}, {row.first_name}`.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1750/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1251710928,I_kwDOBm6k_c5Km5fQ,1751,Add scrollbars to table presentation in default layout,408765,knutwannheden,closed,0,,,,,1,2022-05-28T19:44:57Z,2022-05-28T19:52:17Z,2022-05-28T19:52:17Z,NONE,,"(As you will be able to tell from the terminology I use, I am not a frontend guy, but I hope you will understand.)
When a table is wide and needs horizontal scrolling to see the columns towards the end, the user needs to scroll horizontally. However, since the container for the HTML table (`div` with class `table-wrapper`) isn't limited by the window size, I first need to vertically scroll near to the bottom of the page in order to scroll horizontally. Then I can scroll back up again. This isn't very user friendly. Instead, I think it would make sense to constrain the table's size (when necessary), so that the vertical and horizontal scrollbars either always are visible or at least not far out of reach.
I understand that I could provide my own template and / or CSS, but I think it would probably make sense to adjust the default in this regard.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1751/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1251739062,I_kwDOBm6k_c5KnAW2,1752,Research if I can drop Janus,9599,simonw,open,0,,,,,0,2022-05-28T22:46:52Z,2022-05-28T22:46:52Z,,OWNER,,"> It seems to me Janus dependency is not necessary, `async with app.database_write_mutex(): out = await app.transaction(func)` may be enough.
Comment here: https://lobste.rs/s/fki4tj/architecture_notes_datasette#c_a2ihon",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1752/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1266207143,I_kwDOBm6k_c5LeMmn,1755,Gunicorn,1176293,ar-jan,open,0,,,,,0,2022-06-09T14:18:46Z,2022-06-09T14:18:46Z,,NONE,,"I've read issue #514 which resulted in running Datasette via systemd as recommended approach. We've also adopted this (for now), but I notice that Uvicorn [says the following](https://www.uvicorn.org/#running-with-gunicorn):
> Uvicorn includes a Gunicorn worker class allowing you to run ASGI applications, with all of Uvicorn's performance benefits, while also giving you Gunicorn's fully-featured process management.
>
> This allows you to increase or decrease the number of worker processes on the fly, restart worker processes gracefully, or perform server upgrades without downtime.
>
> For production deployments we recommend using gunicorn with the uvicorn worker class.
We usually deploy Python applications via Gunicorn for these process management features (e.g. `--daemon` and `--pid`). Is this something that would/could work with Datasette as well?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1755/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1266329095,I_kwDOBm6k_c5LeqYH,1756,Mechanism for creating databases in WAL mode,9599,simonw,open,0,,,,,0,2022-06-09T15:39:28Z,2022-06-09T15:39:28Z,,OWNER,,"The `--create` option currently creates databases if they are missing, but does not enable WAL mode for them.
It turns out WAL mode is useful for databases that are accepting writes!
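For reference, switching a database to WAL mode is a single pragma - a sketch of what such an option might do for each newly created file, using Python's `sqlite3`:
```python
import sqlite3

conn = sqlite3.connect('data.db')
# Journal mode is persistent: once set, the database stays in WAL mode
conn.execute('PRAGMA journal_mode=wal;')
conn.close()
```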
I think a `--create-wal` option that both creates them AND sets WAL mode on any that are created would be a good idea.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1756/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1280799259,I_kwDOBm6k_c5MV3Ib,1761,ensure_ascii=False,1473102,mustafa0x,open,0,,,,,0,2022-06-22T19:58:13Z,2022-06-22T19:58:30Z,,NONE,,"Hi, thanks for the project!
For the JSON output, I would consider defaulting to `ensure_ascii=False` (UTF-8 seems pretty universal) or making it an option. When dealing with non-Latin text, `ensure_ascii=True` (the default) can triple the size of the output.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1761/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1306492437,I_kwDOBm6k_c5N334V,1770,`handle_exception` plugin hook for custom error handling,9599,simonw,closed,0,,,8303187,Datasette 0.62,14,2022-07-15T20:52:49Z,2022-08-14T15:25:51Z,2022-08-14T15:25:51Z,OWNER,,"I need this for a couple of plugins, both of which are broken at the moment:
- https://github.com/simonw/datasette-sentry/issues/1
- https://github.com/simonw/datasette-show-errors/issues/2",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1770/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1306984363,I_kwDOBm6k_c5N5v-r,1771,minor a11y: