id,node_id,number,title,user,user_label,state,locked,assignee,assignee_label,milestone,milestone_label,comments,created_at,updated_at,closed_at,author_association,pull_request,body,repo,repo_label,type,active_lock_reason,performed_via_github_app,reactions,draft,state_reason 1054244712,I_kwDOBm6k_c4-1n9o,1510,Datasette 1.0 documented template context (maybe via API docs),9599,simonw,open,0,,,3268330,Datasette 1.0,3,2021-11-15T23:23:58Z,2023-06-28T02:05:21Z,,OWNER,,Documented context plus protective unit tests. Goal is that custom templates built for 1.x will not break without a 2.x release.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1510/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1054243511,I_kwDOBm6k_c4-1nq3,1509,Datasette 1.0 JSON API (and documentation),9599,simonw,open,0,,,3268330,Datasette 1.0,3,2021-11-15T23:22:45Z,2022-03-15T20:38:56Z,,OWNER,,"The new JSON API in a stable, documented form.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1509/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1054246919,I_kwDOBm6k_c4-1ogH,1511,Review plugin hooks for Datasette 1.0,9599,simonw,open,0,,,3268330,Datasette 1.0,1,2021-11-15T23:26:05Z,2021-11-16T01:20:14Z,,OWNER,,I need to perform a detailed review of the plugin interface - especially the plugin hooks like [register_facet_classes()](https://docs.datasette.io/en/stable/plugin_hooks.html#register-facet-classes) which I don't yet have complete confidence in.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1511/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1055469073,I_kwDOBm6k_c4-6S4R,1513,Research: CTEs and union all to calculate facets AND query at the same time,9599,simonw,closed,0,,,,,12,2021-11-16T22:26:45Z,2021-11-16T23:41:46Z,2021-11-16T23:41:46Z,OWNER,,"Consider this page: https://global-power-plants.datasettes.com/global-power-plants/global-power-plants?_search=plant&_facet=owner&_facet=country_long&_facet=primary_fuel Datasette needs to run the main query for the rows on that page, a count query for the total query, then a separate query for each of those three specified facets. This is a `_search=` query, so it needs to execute the FTS code once for the rows, again for the count, and then three more times for each of the facets. Could running that query as a CTE and doing the other queries as part of the same large query produce significant speed improvements?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1513/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1056746091,I_kwDOBm6k_c4-_Kpr,1515,Handle foreign keys that point to a non-existent table,9599,simonw,open,0,,,,,0,2021-11-17T23:40:13Z,2021-11-18T01:31:56Z,,OWNER,,"Spotted in https://github.com/simonw/datasette-graphql/issues/79 Demo: https://datasette-graphql-demo.datasette.io/fixtures/bad_foreign_key The foreign key links to a 404 page. 
![B87009C7-CFCA-4DF9-8FBA-FA3E6CA28EC2](https://user-images.githubusercontent.com/9599/142334788-4d1a4acd-bc87-4426-b333-d46b221afcec.jpeg) ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1515/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1049946823,I_kwDOBm6k_c4-lOrH,1502,"Full-text search: No support to unary ""-"" operator",516827,gustavorps,open,0,,,,,0,2021-11-10T15:11:19Z,2021-11-10T15:11:19Z,,NONE,,"Reference: https://www.sqlite.org/fts3.html#set_operations_using_the_standard_query_syntax Test: https://fara.datasettes.com/fara/FARA_All_ShortForms?_search=manafort+-freedman&_sort=rowid",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1502/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1050163432,I_kwDOBm6k_c4-mDjo,1503,`?_nocol=` removes that column from the filter interface,9599,simonw,closed,0,,,,,1,2021-11-10T18:22:50Z,2021-11-14T05:08:27Z,2021-11-14T04:53:07Z,OWNER,,"e.g. on https://latest.datasette.io/fixtures/sortable?_nocol=sortable This causes weird behaviour when you e.g. facet by a hidden column, since selecting facets and then re-submitting the form will clear the selected filter. ![nocol-bug](https://user-images.githubusercontent.com/9599/141171135-aded71d1-a4cb-4b7f-a4ea-26828fa98906.gif) ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1503/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1051277222,I_kwDOBm6k_c4-qTem,1504,Link to ?_size=max at bottom of table page,9599,simonw,open,0,,,,,0,2021-11-11T19:06:33Z,2021-11-11T19:06:33Z,,OWNER,,"This can have text such as ""Show 1,000 rows per page"", based on the max size limit setting. Would make it easier for people to see more data at once without having to know how to hack the URL, similar to the `...` for facet sizes I added in #1337.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1504/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1052247023,I_kwDOBm6k_c4-uAPv,1505,Datasette should have an option to output CSV with semicolons,9599,simonw,open,0,,,,,1,2021-11-12T18:02:21Z,2021-11-16T11:40:52Z,,OWNER,,,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1505/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1052826038,I_kwDOBm6k_c4-wNm2,1506,Columns beginning with an underscore do not facet correctly,9599,simonw,closed,0,,,,,1,2021-11-14T02:20:32Z,2021-11-14T04:45:21Z,2021-11-14T04:45:21Z,OWNER,,"Datasette treats columns that start with an underscore as querystring parameters it should ignore! 
Discovered in https://github.com/simonw/git-history/issues/14#issuecomment-968192464",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1506/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1052851176,I_kwDOBm6k_c4-wTvo,1507,ReadTheDocs build failed for 0.59.2 release,9599,simonw,closed,0,,,,,6,2021-11-14T05:24:34Z,2021-11-14T05:41:55Z,2021-11-14T05:41:55Z,OWNER,,"I had to cancel the 0.59.2 release because ReadTheDocs was failing to build the documentation. https://readthedocs.org/projects/datasette/builds/15268454/ ``` /home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/0.59.2/bin/python -m sphinx -T -b html -d _build/doctrees -D language=en . _build/html Running Sphinx v1.8.5 loading translations [en]... done making output directory... building [mo]: targets for 0 po files that are out of date building [html]: targets for 27 source files that are out of date updating environment: 27 added, 0 changed, 0 removed reading sources... [ 3%] authentication Traceback (most recent call last): File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/0.59.2/lib/python2.7/site-packages/sphinx/cmd/build.py"", line 304, in build_main app.build(args.force_all, filenames) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/0.59.2/lib/python2.7/site-packages/sphinx/application.py"", line 341, in build self.builder.build_update() File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/0.59.2/lib/python2.7/site-packages/sphinx/builders/__init__.py"", line 347, in build_update len(to_build)) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/0.59.2/lib/python2.7/site-packages/sphinx/builders/__init__.py"", line 360, in build updated_docnames = set(self.read()) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/0.59.2/lib/python2.7/site-packages/sphinx/builders/__init__.py"", line 468, in read self._read_serial(docnames) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/0.59.2/lib/python2.7/site-packages/sphinx/builders/__init__.py"", line 490, in _read_serial self.read_doc(docname) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/0.59.2/lib/python2.7/site-packages/sphinx/builders/__init__.py"", line 534, in read_doc doctree = read_doc(self.app, self.env, self.env.doc2path(docname)) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/0.59.2/lib/python2.7/site-packages/sphinx/io.py"", line 318, in read_doc pub.publish() File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/0.59.2/lib/python2.7/site-packages/docutils/core.py"", line 219, in publish self.apply_transforms() File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/0.59.2/lib/python2.7/site-packages/docutils/core.py"", line 200, in apply_transforms self.document.transformer.apply_transforms() File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/0.59.2/lib/python2.7/site-packages/sphinx/transforms/__init__.py"", line 90, in apply_transforms Transformer.apply_transforms(self) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/0.59.2/lib/python2.7/site-packages/docutils/transforms/__init__.py"", line 171, in apply_transforms transform.apply(**kwargs) File 
""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/0.59.2/lib/python2.7/site-packages/sphinx/transforms/__init__.py"", line 245, in apply apply_source_workaround(n) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/0.59.2/lib/python2.7/site-packages/sphinx/util/nodes.py"", line 94, in apply_source_workaround for classifier in reversed(node.parent.traverse(nodes.classifier)): TypeError: argument to reversed() must be a sequence Exception occurred: File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/0.59.2/lib/python2.7/site-packages/sphinx/util/nodes.py"", line 94, in apply_source_workaround for classifier in reversed(node.parent.traverse(nodes.classifier)): TypeError: argument to reversed() must be a sequence The full traceback has been saved in /tmp/sphinx-err-vkl0oE.log, if you want to report the issue to the developers. Please also report this if it was a user error, so that a better error message can be provided next time. A bug report can be filed in the tracker at . Thanks! ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1507/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1006016302,I_kwDOBm6k_c479pcu,1477,Consider adding request to the documented default template context,9599,simonw,open,0,,,,,0,2021-09-24T02:34:09Z,2021-09-24T02:34:09Z,,OWNER,,I made a plugin for this today but I think perhaps it should be a default thing instead: https://datasette.io/plugins/datasette-template-request,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1477/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 999902754,I_kwDOBm6k_c47mU4i,1473,base logo link visits `undefined` rather than href url,192568,mroswell,open,0,,,,,2,2021-09-18T04:17:04Z,2021-09-19T00:45:32Z,,CONTRIBUTOR,,"I have two connected sites: http://www.SaferOrToxic.org (a Hugo website) and: http://disinfectants.SaferOrToxic.org/disinfectants/listN (a datasette table page) The latter is linked as ""The List"" in the former's menu. (I'd love a prettier URL, but that's what I've got.) On: http://disinfectants.SaferOrToxic.org/disinfectants/listN ... all the other menu links should point back to: https://www.SaferOrToxic.org And they do! But the logo, for some reason--though it has an href pointing to: https://www.SaferOrToxic.org Keeps going to this instead: https://disinfectants.saferortoxic.org/disinfectants/undefined What is causing that? How can I fix it? In #1284 back in March, I was doing battle with the index.html template, in a still unresolved issue. (I wanted only a single table page at the root.) But I thought, well, if I can't resolve that, at least I could just point the main website to the datasette page (""The List,"") and then have the List point back to the home website. The menu hrefs to https://www.SaferOrToxic.org work just fine, exactly as they should, from the datasette page. Even the Home link works properly. But the logo link keeps rewriting to: https://disinfectants.saferortoxic.org/disinfectants/undefined This is the HTML: ``` ``` Is this somehow related to cloudflare? Or something in the datasette code? I'm starting to think it's a cloudflare issue. Can I at least rule out it being a datasette issue? 
My repository is here: https://github.com/mroswell/list-N (BTW, I couldn't figure out how to reference a local image, either, on the datasette side, which is why I'm using the image from the www home page.) ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1473/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1021550542,I_kwDOBm6k_c4845_O,1482,Support Python 3.10,9599,simonw,closed,0,,,,,2,2021-10-09T00:30:52Z,2021-10-24T22:21:40Z,2021-10-24T22:19:55Z,OWNER,,"I started work on this in #1481 where I found a Python 3.10 bug that needs a workaround in Janus, see: - https://github.com/aio-libs/janus/issues/358 This is a tracking issue for anything else that shows up. This is also needed for the Homebrew package to upgrade to 3.10: - https://github.com/Homebrew/homebrew-core/pull/86932",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1482/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1021849766,I_kwDOBm6k_c486DCm,1483,Running a search on page 2 of results should not preserve ?_next=,9599,simonw,closed,0,,,,,0,2021-10-10T01:18:12Z,2021-10-13T21:08:10Z,2021-10-13T21:08:10Z,OWNER,,Reported by @eigenfoo in https://github.com/simonw/datasette/issues/1470,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1483/reactions"", ""total_count"": 2, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 1, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1006781949,I_kwDOBm6k_c48AkX9,1478,Documentation Request: Feature alternative ID instead of default ID,192568,mroswell,open,0,,,,,0,2021-09-24T19:56:13Z,2021-09-25T16:18:54Z,,CONTRIBUTOR,,"My data already has an ID that comes from a federal agency. Would love to have documentation on how to modify the template to: - Remove the generated ID from the table - Link the federal ID to the detail page - and to ensure that the JSON file uses that as the ID. I'd be happy to include the database ID in the export, but not as a key. I don't want to remove the ID from the database, though, because my experience with the federal agency is that data often has anomalies. I don't want all hell to break loose if they end up applying the same ID to multiple rows (which they haven't done yet). I just don't want it to display in the table or the data exports. Perhaps this isn't a template issue, maybe more of a db manipulation... Margie",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1478/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1010112818,I_kwDOBm6k_c48NRky,1479,"Win32 ""used by another process"" error with datasette publish",76450761,kirajano,open,0,,,,,7,2021-09-28T19:12:00Z,2023-09-07T02:14:16Z,,NONE,,"I unfortunately was not successful to deploy to fly.io. Please see the details above of the three scenarios that I took. I am also new to datasette. Failed to deploy. Attaching logs: 1. 
Tried with an app created via `flyctl apps create frosty-fog-8565` and the ran `datasette publish fly covid.db --app frosty-fog-8565` ``` Deploying frosty-fog-8565 ==> Validating app configuration --> Validating app configuration done Services TCP 80/443 ⇢ 8080 Error error connecting to docker: An unknown error occured. Traceback (most recent call last): File ""c:\users\grott\anaconda3\lib\runpy.py"", line 193, in _run_module_as_main ""__main__"", mod_spec) File ""c:\users\grott\anaconda3\lib\runpy.py"", line 85, in _run_code exec(code, run_globals) File ""C:\Users\grott\Anaconda3\Scripts\datasette.exe\__main__.py"", line 7, in File ""c:\users\grott\anaconda3\lib\site-packages\click\core.py"", line 829, in __call__ return self.main(*args, **kwargs) File ""c:\users\grott\anaconda3\lib\site-packages\click\core.py"", line 782, in main rv = self.invoke(ctx) File ""c:\users\grott\anaconda3\lib\site-packages\click\core.py"", line 1259, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""c:\users\grott\anaconda3\lib\site-packages\click\core.py"", line 1259, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""c:\users\grott\anaconda3\lib\site-packages\click\core.py"", line 1066, in invoke return ctx.invoke(self.callback, **ctx.params) File ""c:\users\grott\anaconda3\lib\site-packages\click\core.py"", line 610, in invoke return callback(*args, **kwargs) File ""c:\users\grott\anaconda3\lib\site-packages\datasette_publish_fly\__init__.py"", line 156, in fly ""--remote-only"", File ""c:\users\grott\anaconda3\lib\contextlib.py"", line 119, in __exit__ next(self.gen) File ""c:\users\grott\anaconda3\lib\site-packages\datasette\utils\__init__.py"", line 451, in temporary_docker_directory tmp.cleanup() File ""c:\users\grott\anaconda3\lib\tempfile.py"", line 811, in cleanup _shutil.rmtree(self.name) File ""c:\users\grott\anaconda3\lib\shutil.py"", line 516, in rmtree return _rmtree_unsafe(path, onerror) File ""c:\users\grott\anaconda3\lib\shutil.py"", line 395, in _rmtree_unsafe _rmtree_unsafe(fullname, onerror) File ""c:\users\grott\anaconda3\lib\shutil.py"", line 404, in _rmtree_unsafe onerror(os.rmdir, path, sys.exc_info()) File ""c:\users\grott\anaconda3\lib\shutil.py"", line 402, in _rmtree_unsafe os.rmdir(path) PermissionError: [WinError 32] The process cannot access the file because it is being used by another process: 'C:\\Users\\grott\\AppData\\Local\\Temp\\tmpgcm8cz66\\frosty-fog-8565' ``` 2. Tried also with an app that gets autogenerate when running `flyctl launch`. This also generates the .toml file. 
Ran then `datasette publish fly covid.db --app dark-feather-168` **but different error now** ```Deploying dark-feather-168 ==> Validating app configuration Error not possible to validate configuration: server returned Post ""https://api.fly.io/graphql"": unexpected EOF Traceback (most recent call last): File ""c:\users\grott\anaconda3\lib\runpy.py"", line 193, in _run_module_as_main ""__main__"", mod_spec) exec(code, run_globals) File ""C:\Users\grott\Anaconda3\Scripts\datasette.exe\__main__.py"", line 7, in File ""c:\users\grott\anaconda3\lib\site-packages\click\core.py"", line 829, in __call__ return self.main(*args, **kwargs) File ""c:\users\grott\anaconda3\lib\site-packages\click\core.py"", line 782, in main rv = self.invoke(ctx) File ""c:\users\grott\anaconda3\lib\site-packages\click\core.py"", line 1259, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""c:\users\grott\anaconda3\lib\site-packages\click\core.py"", line 1259, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""c:\users\grott\anaconda3\lib\site-packages\click\core.py"", line 1066, in invoke return ctx.invoke(self.callback, **ctx.params) File ""c:\users\grott\anaconda3\lib\site-packages\click\core.py"", line 610, in invoke return callback(*args, **kwargs) File ""c:\users\grott\anaconda3\lib\site-packages\datasette_publish_fly\__init__.py"", line 156, in fly ""--remote-only"", File ""c:\users\grott\anaconda3\lib\contextlib.py"", line 119, in __exit__ next(self.gen) File ""c:\users\grott\anaconda3\lib\site-packages\datasette\utils\__init__.py"", line 451, in temporary_docker_directory tmp.cleanup() File ""c:\users\grott\anaconda3\lib\tempfile.py"", line 811, in cleanup _shutil.rmtree(self.name) File ""c:\users\grott\anaconda3\lib\shutil.py"", line 516, in rmtree return _rmtree_unsafe(path, onerror) File ""c:\users\grott\anaconda3\lib\shutil.py"", line 395, in _rmtree_unsafe _rmtree_unsafe(fullname, onerror) File ""c:\users\grott\anaconda3\lib\shutil.py"", line 404, in _rmtree_unsafe onerror(os.rmdir, path, sys.exc_info()) File ""c:\users\grott\anaconda3\lib\shutil.py"", line 402, in _rmtree_unsafe os.rmdir(path) PermissionError: [WinError 32] The process cannot access the file because it is being used by another process: 'C:\\Users\\grott\\AppData\\Local\\Temp\\tmpnoyewcre\\dark-feather-168' ``` These are also the contents of the generated **.toml file** in 2 scenario: ``` # fly.toml file generated for dark-feather-168 on 2021-09-28T20:35:44+02:00 app = ""dark-feather-168"" kill_signal = ""SIGINT"" kill_timeout = 5 processes = [] [env] [experimental] allowed_public_ports = [] auto_rollback = true [[services]] http_checks = [] internal_port = 8080 processes = [""app""] protocol = ""tcp"" script_checks = [] [services.concurrency] hard_limit = 25 soft_limit = 20 type = ""connections"" [[services.ports]] handlers = [""http""] port = 80 [[services.ports]] handlers = [""tls"", ""http""] port = 443 [[services.tcp_checks]] grace_period = ""1s"" interval = ""15s"" restart_limit = 6 timeout = ""2s"" ``` 3. But also trying `datasette package covid.db` to create a local DOCKERFILE to later try to push it via `flyctl deploy` fails as well. 
```[+] Building 147.3s (11/11) FINISHED => [internal] load build definition from Dockerfile 0.2s => => transferring dockerfile: 396B 0.0s => [internal] load .dockerignore 0.1s => => transferring context: 2B 0.0s => [internal] load metadata for docker.io/library/python:3.8 4.7s => [auth] library/python:pull token for registry-1.docker.io 0.0s => [internal] load build context 0.1s => => transferring context: 82.37kB 0.0s => [1/5] FROM docker.io/library/python:3.8@sha256:530de807b46a11734e2587a784573c12c5034f2f14025f838589e6c0e3 108.3s => => resolve docker.io/library/python:3.8@sha256:530de807b46a11734e2587a784573c12c5034f2f14025f838589e6c0e3b5 0.0s => => sha256:56182bcdf4d4283aa1f46944b4ef7ac881e28b4d5526720a4e9ba03a4730846a 2.22kB / 2.22kB 0.0s => => sha256:955615a668ce169f8a1443fc6b6e6215f43fe0babfb4790712a2d3171f34d366 54.93MB / 54.93MB 21.6s => => sha256:911ea9f2bd51e53a455297e0631e18a72a86d7e2c8e1807176e80f991bde5d64 10.87MB / 10.87MB 15.5s => => sha256:530de807b46a11734e2587a784573c12c5034f2f14025f838589e6c0e3b5c5b6 1.86kB / 1.86kB 0.0s => => sha256:ff08f08727e50193dcf499afc30594c47e70cc96f6fcfd1a01240524624264d0 8.65kB / 8.65kB 0.0s => => sha256:2756ef5f69a5190f4308619e0f446d95f5515eef4a814dbad0bcebbbbc7b25a8 5.15MB / 5.15MB 6.4s => => sha256:27b0a22ee906271a6ce9ddd1754fdd7d3b59078e0b57b6cc054c7ed7ac301587 54.57MB / 54.57MB 37.7s => => sha256:8584d51a9262f9a3a436dea09ba40fa50f85802018f9bd299eee1bf538481077 196.45MB / 196.45MB 82.3s => => sha256:524774b7d3638702fe9ae0ea3fcfb81b027dfd75cc2fc14f0119e764b9543d58 6.29MB / 6.29MB 26.6s => => extracting sha256:955615a668ce169f8a1443fc6b6e6215f43fe0babfb4790712a2d3171f34d366 5.4s => => sha256:9460f6b75036e38367e2f27bb15e85777c5d6cd52ad168741c9566186415aa26 16.81MB / 16.81MB 40.5s => => extracting sha256:2756ef5f69a5190f4308619e0f446d95f5515eef4a814dbad0bcebbbbc7b25a8 0.6s => => extracting sha256:911ea9f2bd51e53a455297e0631e18a72a86d7e2c8e1807176e80f991bde5d64 0.6s => => sha256:9bc548096c181514aa1253966a330134d939496027f92f57ab376cd236eb280b 232B / 232B 40.1s => => extracting sha256:27b0a22ee906271a6ce9ddd1754fdd7d3b59078e0b57b6cc054c7ed7ac301587 5.8s => => sha256:1d87379b86b89fd3b8bb1621128f00c8f962756e6aaaed264ec38db733273543 2.35MB / 2.35MB 41.8s => => extracting sha256:8584d51a9262f9a3a436dea09ba40fa50f85802018f9bd299eee1bf538481077 18.8s => => extracting sha256:524774b7d3638702fe9ae0ea3fcfb81b027dfd75cc2fc14f0119e764b9543d58 1.2s => => extracting sha256:9460f6b75036e38367e2f27bb15e85777c5d6cd52ad168741c9566186415aa26 2.9s => => extracting sha256:9bc548096c181514aa1253966a330134d939496027f92f57ab376cd236eb280b 0.0s => => extracting sha256:1d87379b86b89fd3b8bb1621128f00c8f962756e6aaaed264ec38db733273543 0.8s => [2/5] COPY . 
/app 2.3s => [3/5] WORKDIR /app 0.2s => [4/5] RUN pip install -U datasette 26.9s => [5/5] RUN datasette inspect covid.db --inspect-file inspect-data.json 3.1s => exporting to image 1.2s => => exporting layers 1.2s => => writing image sha256:b5db0c205cd3454c21fbb00ecf6043f261540bcf91c2dfc36d418f1a23a75d7a 0.0s Use 'docker scan' to run Snyk tests against images to find vulnerabilities and learn how to fix them Traceback (most recent call last): ""__main__"", mod_spec) File ""c:\users\grott\anaconda3\lib\runpy.py"", line 85, in _run_code exec(code, run_globals) File ""C:\Users\grott\Anaconda3\Scripts\datasette.exe\__main__.py"", line 7, in File ""c:\users\grott\anaconda3\lib\site-packages\click\core.py"", line 829, in __call__ return self.main(*args, **kwargs) File ""c:\users\grott\anaconda3\lib\site-packages\click\core.py"", line 782, in main rv = self.invoke(ctx) File ""c:\users\grott\anaconda3\lib\site-packages\click\core.py"", line 1259, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""c:\users\grott\anaconda3\lib\site-packages\click\core.py"", line 1066, in invoke return ctx.invoke(self.callback, **ctx.params) File ""c:\users\grott\anaconda3\lib\site-packages\click\core.py"", line 610, in invoke return callback(*args, **kwargs) File ""c:\users\grott\anaconda3\lib\site-packages\datasette\cli.py"", line 283, in package call(args) File ""c:\users\grott\anaconda3\lib\contextlib.py"", line 119, in __exit__ next(self.gen) File ""c:\users\grott\anaconda3\lib\site-packages\datasette\utils\__init__.py"", line 451, in temporary_docker_directory tmp.cleanup() File ""c:\users\grott\anaconda3\lib\tempfile.py"", line 811, in cleanup _shutil.rmtree(self.name) File ""c:\users\grott\anaconda3\lib\shutil.py"", line 516, in rmtree return _rmtree_unsafe(path, onerror) File ""c:\users\grott\anaconda3\lib\shutil.py"", line 395, in _rmtree_unsafe _rmtree_unsafe(fullname, onerror) File ""c:\users\grott\anaconda3\lib\shutil.py"", line 404, in _rmtree_unsafe onerror(os.rmdir, path, sys.exc_info()) File ""c:\users\grott\anaconda3\lib\shutil.py"", line 402, in _rmtree_unsafe os.rmdir(path) PermissionError: [WinError 32] The process cannot access the file because it is being used by another process: 'C:\\Users\\grott\\AppData\\Local\\Temp\\tmpkb27qid3\\datasette'```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1479/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1023243105,I_kwDOBm6k_c48_XNh,1486,pipx installation instructions for plugins don't reference pipx inject,41546558,RhetTbull,closed,0,,,,,0,2021-10-12T00:43:42Z,2021-10-13T21:09:11Z,2021-10-13T21:09:11Z,CONTRIBUTOR,,"The datasette [installation instructions](https://github.com/simonw/datasette/blob/main/docs/installation.rst) discuss how to install with pipx, how to upgrade with pipx, and how to upgrade plugins with pipx but do not mention how to install a plugin with pipx. You discussed this on your [blog](https://til.simonwillison.net/python/installing-upgrading-plugins-with-pipx) but looks like this didn't make it in when you updated the docs for pipx (#756). 
I'll submit a PR shortly to fix this.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1486/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1015646369,I_kwDOBm6k_c48iYih,1480,Exceeding Cloud Run memory limits when deploying a 4.8G database,110420,ghing,open,0,,,,,9,2021-10-04T21:20:24Z,2022-10-07T04:39:10Z,,CONTRIBUTOR,,"When I try to deploy a 4.8G SQLite database to Google Cloud Run, I get this error message: > Memory limit of 8192M exceeded with 8826M used. Consider increasing the memory limit, see https://cloud.google.com/run/docs/configuring/memory-limits Unfortunately, the maximum amount of memory that can be allocated to an instance is 8192M. Naively profiling the memory usage of running Datasette with this database locally on my MacBook shows the following memory usage (using Activity Monitor) when I just start up Datasette locally: - Real Memory Size: 70.6 MB - Virtual Memory Size: 4.51 GB - Shared Memory Size: 2.5 MB - Private Memory Size: 57.4 MB I'm trying to understand if there's a query or other operation that gets run during container deployment that causes memory use to be so large and if this can be avoided somehow. This is somewhat related to #1082, but on a different platform, so I decided to open a new issue.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1480/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1025754125,I_kwDOBm6k_c49I8QN,1488,Upgrade to httpx 0.20.0 (request() got an unexpected keyword argument 'allow_redirects'),9599,simonw,closed,0,,,,,5,2021-10-13T22:37:22Z,2021-10-14T18:03:45Z,2021-10-14T18:03:45Z,OWNER,,This is caused by a change made to `httpx` in https://github.com/encode/httpx/releases/tag/0.20.0,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1488/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1028115674,I_kwDOBm6k_c49R8za,1493,`--get '/:memory:.json?sql=select+3*5'` error with datasette 0.59,1580956,chenrui333,closed,0,,,,,1,2021-10-16T18:22:22Z,2021-10-19T04:39:11Z,2021-10-19T04:39:11Z,NONE,,"👋 trying to upgrade the formula to use the latest release, but runs into some regression test issue with `--get` command. My QQ is does this `datasette --get '/:memory:.json?sql=select+3*5'` supposed to return 15? Thanks! relates to https://github.com/Homebrew/homebrew-core/pull/87369",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1493/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1033864602,I_kwDOBm6k_c49n4Wa,1496,Named parameters docs should include an example of a cast,9599,simonw,closed,0,,,,,1,2021-10-22T18:56:04Z,2021-10-22T19:38:23Z,2021-10-22T19:34:27Z,OWNER,,"https://docs.datasette.io/en/stable/sql_queries.html#named-parameters It's not obvious that the values from parameters are always SQLite strings, which means that you can't do e.g. integer comparisons on them without casting them first. 
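For example, a query like this needs a cast before the comparison behaves as expected (a sketch - the table and parameter names here are made up):

```sql
-- :horsepower arrives as a SQLite string, so cast it before comparing
select * from machines
where cast(:horsepower as integer) > 50
```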
The documentation here should include an example of this.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1496/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1034535001,I_kwDOBm6k_c49qcBZ,1497,"Publish to Docker Hub failing with ""libcrypt.so.1: cannot open shared object file""",9599,simonw,closed,0,,,,,18,2021-10-24T22:57:07Z,2023-01-18T17:13:45Z,2021-10-24T23:36:55Z,OWNER,,"This means the Datasette 0.59.1 release has not been published to Docker Hub. Here's where that failed: https://github.com/simonw/datasette/runs/3991043374?check_suite_focus=true ``` Preparing to unpack .../libc6_2.32-4_amd64.deb ... debconf: unable to initialize frontend: Dialog debconf: (TERM is not set, so the dialog frontend is not usable.) debconf: falling back to frontend: Readline debconf: unable to initialize frontend: Readline debconf: (Can't locate Term/ReadLine.pm in @INC (you may need to install the Term::ReadLine module) (@INC contains: /etc/perl /usr/local/lib/x86_64-linux-gnu/perl/5.28.1 /usr/local/share/perl/5.28.1 /usr/lib/x86_64-linux-gnu/perl5/5.28 /usr/share/perl5 /usr/lib/x86_64-linux-gnu/perl/5.28 /usr/share/perl/5.28 /usr/local/lib/site_perl /usr/lib/x86_64-linux-gnu/perl-base) at /usr/share/perl5/Debconf/FrontEnd/Readline.pm line 7.) debconf: falling back to frontend: Teletype Checking for services that may need to be restarted... Checking init scripts... Unpacking libc6:amd64 (2.32-4) over (2.28-10) ... Setting up libc6:amd64 (2.32-4) ... /usr/bin/perl: error while loading shared libraries: libcrypt.so.1: cannot open shared object file: No such file or directory dpkg: error processing package libc6:amd64 (--configure): installed libc6:amd64 package post-installation script subprocess returned error exit status 127 Errors were encountered while processing: libc6:amd64 E: Sub-process /usr/bin/dpkg returned an error code (1) The command '/bin/sh -c apt-get update && apt-get -y --no-install-recommends install software-properties-common && add-apt-repository ""deb http://httpredir.debian.org/debian sid main"" && apt-get update && apt-get -t sid install -y --no-install-recommends libsqlite3-mod-spatialite && apt-get remove -y software-properties-common && apt clean && rm -rf /var/lib/apt && rm -rf /var/lib/dpkg/info/*' returned a non-zero code: 100 ``` Same problem when I attempted to publish using the ""Push specific Docker tag"" workflow: https://github.com/simonw/datasette/runs/3991059912?check_suite_focus=true",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1497/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1072106103,I_kwDOBm6k_c4_5wp3,1542,feature request: order and dependency of plugins (that use js),33631,fs111,open,0,,,,,1,2021-12-06T12:40:45Z,2021-12-15T17:47:08Z,,NONE,,"I have been playing with datasette for the last couple of weeks and it is great! I am a big fan of `datasette-cluster-map` and wanted to enhance it a bit with a what I would call a sub-plugin. I basically want to add more controls to the map that cluster map provides. I have been looking into its code and how the plugin management works, but it seems what I am trying to do is not doable without hacks in js. 
Basically what I would like to have is a way to say load my plugin after the plugins I depend on have been loaded and rendered. There seems to be no prior art where plugins have these dependencies on the js level, so I was wondering if that could be added or, if it already exists, how to do it. Basically what I want to do is: my-awesome-plugin has a dependency on datasette-cluster-map. Whenever datasette-cluster-map has finished rendering on page load, call my plugin, but no earlier. To make that work datasette probably needs some total order in which plugins are loaded and initialized. Since I am new to datasette, I may be missing something obvious, so please let me know if the above makes no sense.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1542/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1057996111,I_kwDOBm6k_c4_D71P,1517,Let `register_routes()` over-ride default routes within Datasette,9599,simonw,closed,0,,,3268330,Datasette 1.0,2,2021-11-19T00:22:15Z,2021-11-19T03:20:00Z,2021-11-19T03:07:27Z,OWNER,,"See https://github.com/simonw/datasette/issues/878#issuecomment-973554024_ - right now `register_routes()` can't replace default Datasette routes. It would be neat if plugins could do this - especially if there was a neat documented way for them to then re-dispatch to the original route code after making some kind of modification.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1517/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1058072543,I_kwDOBm6k_c4_EOff,1518,Complete refactor of TableView and table.html template,9599,simonw,open,0,,,3268330,Datasette 1.0,45,2021-11-19T02:55:16Z,2022-03-15T18:35:49Z,,OWNER,,"Split from #878. The current `TableView` class is by far the most complex part of Datasette, and the most difficult to work on: https://github.com/simonw/datasette/blob/0.59.2/datasette/views/table.py In #878 I started exploring a new pattern for building views. In doing so it became clear that `TableView` is the first beast that I need to slay - if I can refactor that into something neat the pattern for building other views will emerge as a natural consequence. I've been trying to build this as a `register_routes()` plugin, as originally suggested in #870 - though unfortunately it looks like those plugins can't replace existing Datasette default views at the moment, see #1517. [UPDATE: I was wrong about this, plugins can over-ride default views just fine] I also know that I want to have a fully documented template context for `table.html` as a major step on the way to Datasette 1.0, see #1510. 
All of this adds up to the `TableView` refactor being a major project that will unblock a whole flurry of other things - so I'm going to work on that in this separate issue.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1518/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1058790545,I_kwDOBm6k_c4_G9yR,1519,base_url is omitted in JSON and CSV views,157158,phubbard,closed,0,,,,,22,2021-11-19T18:10:45Z,2021-12-01T17:50:09Z,2021-11-20T19:11:21Z,NONE,,"I have a datasette deployment, using Apache2 to reverse proxy: ProxyPass /ged http://thor.phfactor.net:8001 ProxyPreserveHost On In settings.json I have ```json { ""base_url"": ""/ged/"", ""trace_debug"": 1, ""template_debug"": 1 } ``` and datasette works correctly. However, if you view a query and then click on the 'This data as json, CSV' both links omit the base_url prefix and are therefore 404.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1519/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1058803238,I_kwDOBm6k_c4_HA4m,1520,Pattern for avoiding accidental URL over-rides,9599,simonw,open,0,,,,,1,2021-11-19T18:28:05Z,2021-11-19T18:29:26Z,,OWNER,,"Following #1517 I'm experimenting with a plugin that does this: ```python @hookimpl def register_routes(): return [ (r""/(?P[^/]+)/(?P[^/]+?)$"", Table().view), ] ``` This is supposed to replace the default table page with new code... but there's a problem: `/-/versions` on that instance now returns 404 `Database '-' does not exist`! Need to figure out a pattern to avoid that happening. Plugins get to add their routes before Datasette's default routes, which is why this is happening here.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1520/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1058815557,I_kwDOBm6k_c4_HD5F,1521,Docker configuration for exercising Datasette behind Apache mod_proxy,9599,simonw,closed,0,,,,,10,2021-11-19T18:46:18Z,2021-11-19T20:32:29Z,2021-11-19T20:32:29Z,OWNER,,"> Having a live demo running on Cloud Run that proxies through Apache and uses `base_url` would be incredibly useful for replicating and debugging this kind of thing. I wonder how hard it is to run Apache and `mod_proxy` in the same Docker container as Datasette? _Originally posted by @simonw in https://github.com/simonw/datasette/issues/1519#issuecomment-974310208_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1521/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1058896236,I_kwDOBm6k_c4_HXls,1522,Deploy a live instance of demos/apache-proxy,9599,simonw,closed,0,,,,,34,2021-11-19T20:32:55Z,2021-11-23T03:00:34Z,2021-11-20T18:51:56Z,OWNER,,"> I'll get this working on my laptop first, but then I want to get it up and running on Cloud Run - maybe with a GitHub Actions workflow in this repo that re-deploys it on manual execution. 
_Originally posted by @simonw in https://github.com/simonw/datasette/issues/1521#issuecomment-974322178_ I started by following https://ahmet.im/blog/cloud-run-multiple-processes-easy-way/ - see example in https://github.com/ahmetb/multi-process-container-lazy-solution",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1522/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1059209412,I_kwDOBm6k_c4_IkDE,1523,Come up with a more elegant solution for base_url than ds.urls.path(),9599,simonw,open,0,,,,,0,2021-11-20T19:05:22Z,2021-11-20T19:05:22Z,,OWNER,,"While fixing #1519 I added a lot of ugly code that looks like this: https://github.com/simonw/datasette/blob/08947fa76433d18988aa1ee1d929bd8320c75fe2/datasette/facets.py#L228-L230 See these two commits in particular: fe687fd0207c4c56c4778d3e92e3505fc4b18172 and 08947fa76433d18988aa1ee1d929bd8320c75fe2 It would be great to come up with a less verbose and error-prone way of handling this problem.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1523/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1059219106,I_kwDOBm6k_c4_Imai,1524,"Improve Apache proxy documentation, link to demo",9599,simonw,closed,0,,,,,4,2021-11-20T20:03:14Z,2021-11-20T23:34:03Z,2021-11-20T23:34:03Z,OWNER,,"> The latest demo is now live at https://datasette-apache-proxy-demo.fly.dev/prefix/fixtures/sortable?_facet=pk2 _Originally posted by @simonw in https://github.com/simonw/datasette/issues/1519#issuecomment-974697824_ I'm going to put out 0.59.3 bugfix release with this, but I'd like to first improve the documentation on https://docs.datasette.io/en/stable/deploying.html#apache-proxy-configuration to highlight the new demo.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1524/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1059549523,I_kwDOBm6k_c4_J3FT,1526,"Add to vercel.json, rather than overwriting it.",192568,mroswell,closed,0,,,,,2,2021-11-22T00:47:12Z,2021-11-22T04:49:45Z,2021-11-22T04:13:47Z,CONTRIBUTOR,,"I'd like to be able to add to vercel.json. But Datasette overwrites whatever I put in that file. I originally reported this here: https://github.com/simonw/datasette-publish-vercel/issues/51 In that case, I wanted to do a rewrite... and now I need to do 301 redirects (because we had to rename our site). Can this be addressed? ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1526/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1059555791,I_kwDOBm6k_c4_J4nP,1527,Columns starting with an underscore behave poorly in filters,9599,simonw,closed,0,,,7571612,Datasette 0.60,7,2021-11-22T01:01:36Z,2022-01-14T00:57:08Z,2022-01-14T00:57:08Z,OWNER,,"Similar bug to #1525 (and #1506 before it). Start on https://latest.datasette.io/fixtures/facetable?_facet=_neighborhood - then select a neighborhood - then try to remove that filter using the little ""x"" and submitting the form again. 
![filter-bug](https://user-images.githubusercontent.com/9599/142786754-31d265a2-944d-4ea2-af6f-305d445a2ccb.gif) ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1527/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1059509927,I_kwDOBm6k_c4_Jtan,1525,"""Links from other tables"" broken for columns starting with underscore",9599,simonw,closed,0,,,,,3,2021-11-21T22:55:08Z,2021-11-30T06:39:01Z,2021-11-30T06:34:35Z,OWNER,,"Same bug as #1506, this time it's this link or the row page: ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1525/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1060631257,I_kwDOBm6k_c4_N_LZ,1528,"Add new `""sql_file""` key to Canned Queries in metadata?",15178711,asg017,open,0,,,,,3,2021-11-22T21:58:01Z,2022-06-10T03:23:08Z,,CONTRIBUTOR,,"Currently for canned queries, you have to inline SQL in your `metadata.yaml` like so: ```yaml databases: fixtures: queries: neighborhood_search: sql: |- select neighborhood, facet_cities.name, state from facetable join facet_cities on facetable.city_id = facet_cities.id where neighborhood like '%' || :text || '%' order by neighborhood title: Search neighborhoods ``` This works fine, but for a few reasons, I usually have my canned queries already written in separate `.sql` files. I'd like to instead re-use those instead of re-writing it. So, I'd like to see a new `""sql_file""` key that works like so: `metadata.yaml`: ```yaml databases: fixtures: queries: neighborhood_search: sql_file: neighborhood_search.sql title: Search neighborhoods ``` `neighborhood_search.sql`: ```sql select neighborhood, facet_cities.name, state from facetable join facet_cities on facetable.city_id = facet_cities.id where neighborhood like '%' || :text || '%' order by neighborhood ``` Both of these would work in the exact same way, where Datasette would instead open + include `neighborhood_search.sql` on startup. 
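A rough sketch of how that key could be resolved at load time (a hypothetical helper - a real implementation would also need to validate paths so a `sql_file` can't escape the config directory):

```python
from pathlib import Path

def canned_query_sql(metadata_dir, query):
    # Hypothetical: prefer inline 'sql', otherwise read 'sql_file'
    # relative to the directory containing metadata.yaml
    if 'sql' in query:
        return query['sql']
    if 'sql_file' in query:
        return (Path(metadata_dir) / query['sql_file']).read_text()
    raise ValueError('canned query needs either sql or sql_file')
```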
A few reasons why I'd like to keep my canned queries SQL separate from metadata.yaml: - Keeping SQL in standalone SQL files means syntax highlighting and other text editor integrations in my code - Multiline strings in yaml, while functional, are a tad cumbersome and are hard to edit - Works well with other tools (can pipe `.sql` files into the `sqlite3` CLI, or use with other SQLite clients easier) - Typically my canned queries are quite long compared to everything else in my metadata.yaml, so I'd love to separate it where possible Let me know if this is a feature you'd like to see, I can try to send up a PR if this sounds right!",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1528/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1073712378,I_kwDOBm6k_c4__4z6,1544,Code that detects the label column for a table is case-sensitive,9599,simonw,closed,0,,,,,2,2021-12-07T20:01:25Z,2021-12-07T20:03:43Z,2021-12-07T20:03:43Z,OWNER,,I just noticed that a column called `Name` is not being picked up as the label column for a table.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1544/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1065429936,I_kwDOBm6k_c4_gSuw,1532,Use datasette-table Web Component to guide the design of the JSON API for 1.0,9599,simonw,open,0,,,3268330,Datasette 1.0,4,2021-11-28T20:37:18Z,2022-03-16T20:13:34Z,,OWNER,,"I realized that one of the reasons I'm having trouble committing to nailing down the JSON API for 1.0 is that I don't use it much myself - I use the `?_shape=array` one quite often, but I don't have any projects that are using the default, more fully-featured API. As an experiment I built a Web Component for embedding Datasette tables on pages - https://github.com/simonw/datasette-table - and I think it's actually going to be a really useful tool for helping me dog food the v1.0 API design.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1532/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1065431383,I_kwDOBm6k_c4_gTFX,1533,"Add `Link: rel=""alternate""` header pointing to JSON for a table/query",9599,simonw,closed,0,,,3268330,Datasette 1.0,4,2021-11-28T20:43:25Z,2022-02-02T07:56:51Z,2022-02-02T07:49:33Z,OWNER,,"Originally explored in https://github.com/simonw/datasette-notebook/issues/2#issuecomment-980789406 - I wanted an efficient way to scan a list of URLs and figure out which if any of those corresponded to Datasette tables, canned queries or SQL output that could be represented as a table on a page. It looks like a neat way to do that is with ` Link:` header like this: `Link: http://127.0.0.1:8058/fixtures/compound_three_primary_keys.json; rel=""alternate""; type=""application/datasette+json""` I can put a ` The query_only pragma prevents data changes on database files when enabled. When this pragma is enabled, any attempt to CREATE, DELETE, DROP, INSERT, or UPDATE will result in an [SQLITE_READONLY](https://www.sqlite.org/rescode.html#readonly) error. However, the database is not truly read-only. 
You can still run a [checkpoint](https://www.sqlite.org/wal.html#ckpt) or a [COMMIT](https://www.sqlite.org/lang_transaction.html) and the return value of the [sqlite3_db_readonly()](https://www.sqlite.org/c3ref/db_readonly.html) routine is not affected. Would it be worth adding this as an extra protection against accidental writes to a DB file over a read-only connection?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1539/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1068791148,I_kwDOBm6k_c4_tHVs,1540,Idea: hover to reveal details of linked row,9599,simonw,open,0,,,,,6,2021-12-01T19:28:07Z,2021-12-09T23:38:39Z,,OWNER,," Hovering over that could work a little bit like GitHub issue links: ![hover](https://user-images.githubusercontent.com/9599/144300537-9cd9e9af-ac16-42db-842f-37661bc94063.gif) ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1540/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1069881276,I_kwDOBm6k_c4_xRe8,1541,Different default layout for row page,9599,simonw,open,0,,,,,1,2021-12-02T18:56:36Z,2021-12-02T18:56:54Z,,OWNER,,"The row page displays as a table even though it only has one table row. maybe default to the same display as the narrow page version, even for wide pages?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1541/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1955676270,I_kwDOBm6k_c50kUBu,2201,Discord invite link is invalid,11708906,andrewsanchez,open,0,,,,,0,2023-10-21T21:50:05Z,2023-10-21T21:50:05Z,,NONE,,"https://datasette.io/discord leads to https://discord.com/invite/ktd74dm5mw and returns the following: ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2201/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1977726056,I_kwDOBm6k_c514bRo,2203,custom plugin not seen as sql function,7113541,LyzardKing,open,0,,,,,0,2023-11-05T10:30:19Z,2023-11-05T10:30:19Z,,NONE,,"Hi, I'm not sure if this is the right repo for this issue. I'm using datasette with the parquet (to read a duckdb), and jellyfish plugins. Both work perfectly. Now I need to create a simple plugin that uses the python rouge package and returns a similarity score (similarly to how the jellyfish plugin works). 
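The kind of plugin I am trying to write looks like this, using the documented `prepare_connection` hook (the scoring function body here is just a placeholder):

```python
from datasette import hookimpl

def rouge_score(a, b):
    # Placeholder: the real version would call the rouge package here
    return 0.0

@hookimpl
def prepare_connection(conn):
    # Register a custom SQL function, the same way datasette-jellyfish does
    conn.create_function('rouge_score', 2, rouge_score)
```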
If I create a custom plugin, even the example hello_world one, copied directly from the tutorial, I get the following error: ```duckdb.duckdb.CatalogException: Catalog Error: Scalar Function with name hello_world does not exist!``` Since the jellyfish plugin doesn't do anything more complex, I'm wondering if there is some other kind of issue with my setup.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2203/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1978023780,I_kwDOBm6k_c515j9k,2205,request.post_vars() method obliterates form keys with multiple values,9599,simonw,open,0,,,8755003,Datasette 1.0a-next,3,2023-11-05T23:25:08Z,2023-11-06T04:10:34Z,,OWNER,,"https://github.com/simonw/datasette/blob/452a587e236ef642cbc6ae345b58767ea8420cb5/datasette/utils/asgi.py#L137-L139 In GET requests you can do `?foo=1&foo=2` - you can do the same in POST requests, but the `dict()` call here eliminates those duplicates. You can't even try calling `post_body()` and implement your own custom parsing because of: - #2204",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2205/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1978022687,I_kwDOBm6k_c515jsf,2204,request.post_body() can only be called once,9599,simonw,open,0,,,,,0,2023-11-05T23:22:03Z,2023-11-05T23:23:23Z,,OWNER,,"This code here: https://github.com/simonw/datasette/blob/452a587e236ef642cbc6ae345b58767ea8420cb5/datasette/utils/asgi.py#L127-L135 It consumes the messages, which means if you try to call it a second time you won't be able to get at the body. This is efficient - we don't end up with a `request` object property with potentially megabytes of content that we never look at again - but it's inconvenient for cases like middleware or functions where we don't know if the body has been consumed yet or not. Potential solution: set `request._body` the first time it is called, and return that on subsequent calls. Potential optimization: only do this for bodies that are shorter than a certain threshold - maybe 1MB - and raise an exception if you attempt to call `post_body()` multiple times against one of those larger bodies. 
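For reference, the simple always-cache version might look something like this (rough sketch, not tested):

```python
async def post_body(self):
    if not hasattr(self, '_body'):
        # Consume the ASGI messages once and remember the result
        body = b''
        more_body = True
        while more_body:
            message = await self.receive()
            assert message['type'] == 'http.request', message
            body += message.get('body', b'')
            more_body = message.get('more_body', False)
        self._body = body
    return self._body
```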
I'm a bit nervous about that option though, since it could result in errors that don't show up in testing but do show up in production.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2204/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1994845152,I_kwDOBm6k_c525uvg,2207,ModuleNotFoundError: No module named 'click_default_group,283441,honzajavorek,open,0,,,,,0,2023-11-15T14:04:32Z,2023-11-15T14:04:32Z,,NONE,,"No matter what I do, I'm getting this error: ``` $ datasette Traceback (most recent call last): File ""/Users/honza/Library/Caches/pypoetry/virtualenvs/juniorguru-Lgaxwd2n-py3.11/bin/datasette"", line 5, in from datasette.cli import cli File ""/Users/honza/Library/Caches/pypoetry/virtualenvs/juniorguru-Lgaxwd2n-py3.11/lib/python3.11/site-packages/datasette/cli.py"", line 6, in from click_default_group import DefaultGroup ModuleNotFoundError: No module named 'click_default_group' ``` I have datasette in my dependencies like this: ```toml [tool.poetry.group.dev.dependencies] datasette = {version = ""1.0a7"", allow-prereleases = true} ``` I had the latest regular version (not pre-release) there originally, but the result was the same: ```toml [tool.poetry.group.dev.dependencies] datasette = ""0.64.5"" ``` Full pyproject.toml is at https://github.com/honzajavorek/junior.guru/ Previously datasette worked for me, but I guess something had to upgrade and now I can't even launch it.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2207/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1994857251,I_kwDOBm6k_c525xsj,2208,No suggested facets when a column named 'value' is included,198537,rgieseke,open,0,,,,,1,2023-11-15T14:11:17Z,2023-11-15T14:18:59Z,,CONTRIBUTOR,,"When a column named 'value' is included there are no suggested facets is shown as the query uses an alias of 'value'. https://github.com/simonw/datasette/blob/452a587e236ef642cbc6ae345b58767ea8420cb5/datasette/facets.py#L168-L174 Currently the following is shown (from https://latest.datasette.io/fixtures/facetable) ![image](https://github.com/simonw/datasette/assets/198537/a919509a-ea88-461b-b25b-8b776720c7c5) When I add a column named 'value' only the JSON facets are processed. ![image](https://github.com/simonw/datasette/assets/198537/092bd0b3-4c20-434e-88f8-47e2b8994a1d) I think that not using aliases could be a solution (except if someone wants to use a column named `count(*)` though this seems to be unlikely). I'll open a PR with that. There is also a TODO with a similar question in the same file. I have not looked into that yet. https://github.com/simonw/datasette/blob/452a587e236ef642cbc6ae345b58767ea8420cb5/datasette/facets.py#L512",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2208/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 2028698018,I_kwDOBm6k_c5463mi,2213,feature request: gzip compression of database downloads,536941,fgregg,open,0,,,,,1,2023-12-06T14:35:03Z,2023-12-06T15:05:46Z,,CONTRIBUTOR,,"At the bottom of database pages, datasette gives users the opportunity to download the underlying sqlite database. It would be great if that could be served gzip compressed. 
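In the meantime, a plugin built on the `asgi_wrapper` hook might be able to do this (a sketch - it assumes Starlette is installed, which Datasette itself does not depend on):

```python
from datasette import hookimpl
from starlette.middleware.gzip import GZipMiddleware

@hookimpl
def asgi_wrapper(datasette):
    def wrap_with_gzip(app):
        # Compress responses over 1 KB when the client sends
        # Accept-Encoding: gzip, including .db downloads
        return GZipMiddleware(app, minimum_size=1024)
    return wrap_with_gzip
```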
This is similar to #1213, but for me I don't need datasette to compress html and json, because my CDN layer does that for me. However, cloudflare (at least) will not compress a mimetype of ""application"" (see the list of compressible mimetypes: https://developers.cloudflare.com/speed/optimization/content/brotli/content-compression/)",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2213/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 2019811176,I_kwDOBm6k_c54Y99o,2211,Unreachable exception handlers for `sqlite3.OperationalError`,1214074,mattparmett,open,0,,,,,0,2023-12-01T00:50:22Z,2023-12-01T00:50:22Z,,NONE,,"There are several places where `sqlite3.OperationalError` is caught as part of an exception handler which catches multiple exceptions, but is then caught again immediately afterwards by a dedicated exception handler. Because the exception will be caught by the first handler, the logic in the second handler is unreachable and will never be executed. If this is intended behavior, the second handler can be removed. If this is not intended, and the second handler should be the one that catches this exception, then `sqlite3.OperationalError` should be removed from the tuple of exceptions in the first handler. This issue was found via a CodeQL query on the repository, and I've listed the occurrences found by the query below. There may be other instances of this issue in the code that were not surfaced by the query. I'd be happy to share the query if others would like to view or run it. One example: https://github.com/simonw/datasette/blob/452a587e236ef642cbc6ae345b58767ea8420cb5/datasette/views/database.py#L534-L537 Other instances: https://github.com/simonw/datasette/blob/main/datasette/views/base.py#L266-L270 https://github.com/simonw/datasette/blob/main/datasette/views/base.py#L452-L456",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2211/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 2029908157,I_kwDOBm6k_c54_fC9,2214,CSV export fails for some `text` foreign key references,2874,precipice,open,0,,,,,1,2023-12-07T05:04:34Z,2023-12-07T07:36:34Z,,NONE,,"I'm starting this issue without a clear reproduction in case someone else has seen this behavior, and to use the issue as a notebook for research. I'm using Datasette with the [SWITRS](https://iswitrs.chp.ca.gov/) data set, which is a California Highway Patrol collection of traffic incident data from the past decade or so. I receive data from them in CSV and want to work with it in Datasette, then export it to CSV for mapping in Felt.com. Their data makes extensive use of codes for incident column data (`1` for `Monday` and so on), some of it integer codes and some of it letter/text codes. The text codes are sometimes blank or `-`. During import, I'm creating lookup tables for foreign key references to make the Datasette UI presentation of the data easier to read. If I import the data and set up the integer foreign keys, everything works fine, but if I set up the text foreign keys, CSV export starts to fail. The foreign key configuration is as follows:

```
# Some tables use integer ids, like sensible tables do. Let's import them first
# since we favor them.
for TABLE in DAY_OF_WEEK CHP_SHIFT POPULATION SPECIAL_COND BEAT_TYPE COLLISION_SEVERITY
do
  sqlite-utils create-table records.db $TABLE id integer name text --pk=id
  sqlite-utils insert records.db $TABLE lookup-tables/$TABLE.csv --csv
  sqlite-utils add-foreign-key records.db collisions $TABLE $TABLE id
  sqlite-utils create-index records.db collisions $TABLE
done

# *Other* tables use letter keys, like they were raised by WOLVES. Let's put them
# at the end of the import queue.
for TABLE in WEATHER_1 WEATHER_2 LOCATION_TYPE RAMP_INTERSECTION SIDE_OF_HWY \
  PRIMARY_COLL_FACTOR PCF_CODE_OF_VIOL PCF_VIOL_CATEGORY TYPE_OF_COLLISION MVIW \
  PED_ACTION ROAD_SURFACE ROAD_COND_1 ROAD_COND_2 LIGHTING CONTROL_DEVICE \
  STWD_VEHTYPE_AT_FAULT CHP_VEHTYPE_AT_FAULT PRIMARY_RAMP SECONDARY_RAMP
do
  sqlite-utils create-table records.db $TABLE key text name text --pk=key
  sqlite-utils insert records.db $TABLE lookup-tables/$TABLE.csv --csv
  sqlite-utils add-foreign-key records.db collisions $TABLE $TABLE key
  sqlite-utils create-index records.db collisions $TABLE
done
```

You can see the full code and import script here: https://github.com/radical-bike-lobby/switrs-db If I run this code and then hit the CSV export link in the Datasette interface (the simple link or the ""advanced"" dialog), export fails after a small number of CSV rows are written. I am not seeing any detailed error messages but this appears in the logging output:

```
INFO: 127.0.0.1:57885 - ""GET /records/collisions.csv?_facet=PRIMARY_RD&PRIMARY_RD=ASHBY+AV&_labels=on&_size=max HTTP/1.1"" 200 OK
Caught this error:
```

(No other output follows `error:` other than a blank line.) I've stared at the rows directly after the error occurs and can't yet see what is causing the problem. I'm going to set up a development environment and see if I get any more detailed error output, and then stare more at some problematic lines to see if I can get a simple reproduction.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2214/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 2023057255,I_kwDOBm6k_c54lWdn,2212,Can't filter with numbers,605070,fzakaria,open,0,,,,,0,2023-12-04T05:26:29Z,2023-12-04T05:26:29Z,,NONE,,"I have a schema that uses numbers for a column (actually it's a boolean 1 or 0 but SQLite doesn't have Boolean). I can't seem to get the facet to work or even filtering on this column. My guess is that Datasette is ""stringifying"" the number and it's not matching? Example: https://debian-sqlelf.fly.dev/debian/elf_symbols?_sort_desc=name&_facet=exported&exported=0",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2212/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1087913724,I_kwDOBm6k_c5A2D78,1577,Drop support for Python 3.6,9599,simonw,closed,0,,,3268330,Datasette 1.0,6,2021-12-23T18:17:03Z,2022-01-25T23:30:03Z,2022-01-20T04:31:41Z,OWNER,,"*Original title: Decide when to drop support for Python 3.6* > `context_vars` can solve this but they were introduced in Python 3.7: https://www.python.org/dev/peps/pep-0567/ > > Python 3.6 support ends in a few days time, and it looks like Glitch has updated to 3.7 now - so maybe I can get away with Datasette needing 3.7 these days? 
> > Tweeted about that here: https://twitter.com/simonw/status/1473761478155010048 _Originally posted by @simonw in https://github.com/simonw/datasette/issues/1576#issuecomment-999878907_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1577/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1087919372,I_kwDOBm6k_c5A2FUM,1578,Confirm if documented nginx proxy config works for row pages with escaped characters in their primary key,9599,simonw,open,0,,,,,4,2021-12-23T18:27:59Z,2021-12-24T21:33:19Z,,OWNER,,"Found this while working on https://github.com/simonw/datasette-tiddlywiki Then clicking on `/tiddlywiki/tiddlers/%24%3A%2FDefaultTiddlers` returns a 404.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1578/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1087931918,I_kwDOBm6k_c5A2IYO,1579,`.execute_write(... block=True)` should be the default behaviour,9599,simonw,closed,0,,,7571612,Datasette 0.60,7,2021-12-23T18:54:28Z,2022-01-13T22:28:08Z,2021-12-23T19:18:26Z,OWNER,,"Every single piece of code I've written against the write APIs has used the `block=True` option to wait for the result. Without that, it instead fires the write into the queue but then continues even before it has finished executing. `block=True` should clearly be the default behaviour here!",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1579/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1089529555,I_kwDOBm6k_c5A8ObT,1581,"when hashed urls are turned on, the _memory db has improperly long-lived cache expiry",536941,fgregg,closed,0,,,,,1,2021-12-28T00:05:48Z,2022-03-24T04:08:18Z,2022-03-24T04:08:18Z,CONTRIBUTOR,,"if hashed_urls are on, then a -000 suffix is added to the `_memory` database, and the cache settings are set just as if it was a normal hashed database. in particular, this header is set: `cache-control: max-age=31536000` this is not appropriate because the `_memory-000` database isn't really hashed based on the contents of the databases (see #1561). Either the cache-control header should be changed, or the _memory db should have a hash suffix that does depend on the contents of the databases. 
",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1581/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1076057610,I_kwDOBm6k_c5AI1YK,1546,validating the sql,50336793,jadsongmatos,closed,0,,,,,1,2021-12-09T21:35:57Z,2021-12-18T02:05:17Z,2021-12-18T02:05:16Z,NONE,,Could someone tell me that part of the code is responsible for validating the sql that guarantees that only a table can be read,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1546/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1075893249,I_kwDOBm6k_c5AINQB,1545,Custom pages don't work on windows,559711,ryascott,closed,0,,,,,3,2021-12-09T18:53:05Z,2022-02-03T02:08:31Z,2022-02-03T01:58:35Z,NONE,,"It seems that custom pages don't work when put in templates/pages To reproduce on datasette version 0.59.4 using PowerShell on WIndows 10 with Python 3.10.0 mkdir -p templates/pages echo ""hello world"" >> templates/pages/about.html Start datasette datasette --template-dir templates/ Navigate to [http://127.0.0.1:8001/about](url) and receive: Error 404: Database not found: about ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1545/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1076388044,I_kwDOBm6k_c5AKGDM,1547,Writable canned queries fail to load custom templates,127565,wragge,closed,0,,,7571612,Datasette 0.60,6,2021-12-10T03:31:48Z,2022-01-13T22:27:59Z,2021-12-19T21:12:00Z,CONTRIBUTOR,,"I've created a canned query with `""write"": true` set. I've also created a custom template for it, but the template doesn't seem to be found. If I look in the HTML I see (`stock_exchange` is the db name): `` My non-writeable canned queries pick up custom templates as expected, and if I look at their HTML I see the canned query name added to the templates considered (the canned query here is `date_search`): `` So it seems like the writeable canned query is behaving differently for some reason. Is it an authentication thing? I'm using the built in `--root` authentication. Thanks! ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1547/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1077628073,I_kwDOBm6k_c5AO0yp,1550,Research option for returning all rows from arbitrary query,9599,simonw,open,0,,,,,2,2021-12-11T19:31:11Z,2021-12-11T23:43:24Z,,OWNER,,"Inspired by thinking about #1549 - returning ALL rows from an arbitrary query is a lot easier if you just run that query and keep iterating over the cursor. 
I've avoided doing that in the past because it could tie up a connection for a long time - but in private instances this wouldn't be such a problem.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1550/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1077620955,I_kwDOBm6k_c5AOzDb,1549,Redesign CSV export to improve usability,536941,fgregg,open,0,,,3268330,Datasette 1.0,5,2021-12-11T19:02:12Z,2022-04-04T11:17:13Z,,CONTRIBUTOR,,"*Original title: Set content type for CSV so that browsers will attempt to download instead opening in the browser* Right now, if the user clicks on the CSV related to a table or a query, the response header for the content type is ""content-type: text/plain; charset=utf-8"" Most browsers will try to open a file with this content-type in the browser. This is not what most people want to do, and lots of folks don't know that if they want to download the CSV and open it in a spreadsheet program, they next need to save the page through their browser. It would be great if the response header could be something like

```
'Content-type: text/csv');
'Content-disposition: attachment;filename=MyVerySpecial.csv');
```

which would lead browsers to open a download dialog. ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1549/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1077893013,I_kwDOBm6k_c5AP1eV,1551,`keep_blank_values=True` when parsing `request.args`,9599,simonw,closed,0,,,7571612,Datasette 0.60,3,2021-12-12T19:53:07Z,2022-01-13T22:26:04Z,2021-12-12T20:02:01Z,OWNER,,"This code in `TableView` wouldn't be necessary: https://github.com/simonw/datasette/blob/492f9835aa7e90540dd0c6324282b109f73df71b/datasette/views/table.py#L396-L399 If that happened here instead: https://github.com/simonw/datasette/blob/492f9835aa7e90540dd0c6324282b109f73df71b/datasette/utils/asgi.py#L98-L100 _Originally posted by @simonw in https://github.com/simonw/datasette/issues/1518#issuecomment-991827468_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1551/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1078702875,I_kwDOBm6k_c5AS7Mb,1552,Allow to set `facets_array` in metadata (like current `facets`),3556,davidbgk,closed,0,,,7571612,Datasette 0.60,9,2021-12-13T16:00:44Z,2022-01-13T22:26:15Z,2021-12-16T18:47:48Z,CONTRIBUTOR,,"For now, you can set a `facets` value (array) in your metadata file but I couldn't find a way to set a `facets_array` in order to provide default facets for arrays (like tags). My use-case is to access [that kind of view](https://latest.datasette.io/fixtures/facetable?_facet_array=tags) by default, without URL parameters, as with other default facets. 
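For illustration, here is how the existing `facets` key is set, with the proposed key alongside it - `facets_array` is the suggestion here, not something Datasette supports yet:

```python
from datasette.app import Datasette

ds = Datasette(
    files=['fixtures.db'],
    metadata={
        'databases': {
            'fixtures': {
                'tables': {
                    'facetable': {
                        'facets': ['city_id'],        # works today
                        'facets_array': ['tags'],     # proposed equivalent for array columns
                    }
                }
            }
        }
    },
)
```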
_I'm new to datasette, and I'm willing to help with a PR if that is not already implemented and I missed it!_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1552/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1079111498,I_kwDOBm6k_c5AUe9K,1553,if csv export is truncated in non streaming mode set informative response header,536941,fgregg,open,0,,,,,3,2021-12-13T22:50:44Z,2021-12-16T19:17:28Z,,CONTRIBUTOR,,"Streaming mode is currently not enabled for custom queries, so those queries will be truncated at the max row limit. It would be great if, when a response is truncated, a header signalling that were set on the response. I need to write some pagination code for getting full results back for a custom query, and it would make the code much better if I could reliably know when there is nothing more to limit/offset. ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1553/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1079149656,I_kwDOBm6k_c5AUoRY,1555,Optimize all those calls to index_list and foreign_key_list,9599,simonw,closed,0,,,7571612,Datasette 0.60,27,2021-12-13T23:50:56Z,2022-01-13T22:27:32Z,2021-12-19T20:55:59Z,OWNER,,"On the first hit to a restarted index I'm seeing this in the SQL traces: https://latest-with-plugins.datasette.io/github/commits?_trace=1 I imagine this could be sped up a lot using tricks like this one from the SQLite documentation: https://sqlite.org/pragma.html#pragfunc

```sql
SELECT DISTINCT m.name || '.' || ii.name AS 'indexed-columns'
  FROM sqlite_schema AS m,
       pragma_index_list(m.name) AS il,
       pragma_index_info(il.name) AS ii
 WHERE m.type='table'
 ORDER BY 1;
```

https://latest-with-plugins.datasette.io/fixtures?sql=SELECT+DISTINCT+m.name+%7C%7C+%27.%27+%7C%7C+ii.name+AS+%27indexed-columns%27%0D%0A++FROM+sqlite_schema+AS+m%2C%0D%0A+++++++pragma_index_list%28m.name%29+AS+il%2C%0D%0A+++++++pragma_index_info%28il.name%29+AS+ii%0D%0A+WHERE+m.type%3D%27table%27%0D%0A+ORDER+BY+1%3B",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1555/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1081318247,I_kwDOBm6k_c5Ac5tn,1556,"Show count of facet values always, not just for `?_facet_size=max`",9599,simonw,closed,0,,,7571612,Datasette 0.60,1,2021-12-15T17:49:01Z,2022-01-13T22:26:07Z,2021-12-15T17:58:06Z,OWNER,,"> You've caused me to rethink this feature - I no longer think there's value in only showing these numbers if `?_facet_size=max` as opposed to all of the time. 
_Originally posted by @simonw in https://github.com/simonw/datasette/issues/1423#issuecomment-995023410_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1556/reactions"", ""total_count"": 1, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 1, ""rocket"": 0, ""eyes"": 0}",,completed 1082564912,I_kwDOBm6k_c5AhqEw,1557,`?_nosuggest=1` parameter for disabling facet suggestions on table view,9599,simonw,closed,0,,,7571612,Datasette 0.60,1,2021-12-16T19:21:42Z,2022-01-13T22:26:48Z,2021-12-16T19:24:59Z,OWNER,,"Found I wanted this while I was debugging #625 just to clean up the debug traces, but it makes sense as a partner to `?_nofacet=1` and `?_nocount=1` from #1350 and #1353.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1557/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1082584499,I_kwDOBm6k_c5Ahu2z,1558,Redesign `facet_results` JSON structure prior to Datasette 1.0,9599,simonw,open,0,,,3268330,Datasette 1.0,3,2021-12-16T19:45:10Z,2023-01-09T15:31:17Z,,OWNER,,"> Decision: as an initial fix I'm going to de-duplicate those keys by using `tags__array` etc - with a `_2` on the end if that key is already used. > > I'll open a separate issue to redesign this better for Datasette 1.0. _Originally posted by @simonw in https://github.com/simonw/datasette/issues/625#issuecomment-996130862_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1558/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1082746149,I_kwDOBm6k_c5AiWUl,1560,"Table page title has ""where where"" in it",9599,simonw,closed,0,,,7571612,Datasette 0.60,0,2021-12-17T00:05:48Z,2022-01-13T22:28:35Z,2022-01-13T22:20:15Z,OWNER,,"Just noticed this while working on #1518.

```
% curl -s 'https://latest.datasette.io/fixtures/facetable?_sort=pk&on_earth__exact=1' | grep -C 1 ''
<head>
<title>fixtures: facetable: 14 rows where where on_earth = 1 sorted by pk
```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1560/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1082765654,I_kwDOBm6k_c5AibFW,1561,"add hash id to ""_memory"" url if hashed url mode is turned on and crossdb is also turned on",536941,fgregg,closed,0,,,,,3,2021-12-17T00:45:12Z,2022-03-19T04:45:40Z,2022-03-19T04:45:40Z,CONTRIBUTOR,,"If hashed_url mode is turned on and crossdb is also turned on, then queries to _memory should have a hash_id. One way that it could work is to have the _memory hash be a hash of all the individual databases. Otherwise, crossdb queries can get quite out of date if using aggressive caching. 
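A sketch of what that combined hash could look like, reusing the existing per-database content hashes (`db.hash`):

```python
import hashlib

def memory_database_hash(databases):
    # Combine every attached database's content hash, in a stable
    # order, so the suffix changes whenever any database changes.
    combined = hashlib.sha256()
    for db in sorted(databases, key=lambda d: d.name):
        combined.update((db.hash or '').encode('utf-8'))
    return combined.hexdigest()[:7]
```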
",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1561/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1083657868,I_kwDOBm6k_c5Al06M,1565,Documented JavaScript variables on different templates made available for plugins,9599,simonw,open,0,,,,,8,2021-12-17T22:30:51Z,2021-12-19T22:37:29Z,,OWNER,,"While working on https://github.com/simonw/datasette-leaflet-freedraw/issues/10 I found myself writing this atrocity to figure out the SQL query used for a specific table page: ```javascript let innerSql = Array.from(document.getElementsByTagName(""span"")).filter( el => el.innerText == ""View and edit SQL"" )[0].parentElement.getAttribute(""title"") ``` This is obviously bad - it's very brittle, and will break if I ever change the text on that link (like localizing it for example). Instead, I think pages like that one should have a block of script at the bottom something like this: ```javascript window.datasette = window.datasette || {}; datasette.view_name = 'table'; datasette.table_sql = 'select * from ...'; ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1565/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1083669410,I_kwDOBm6k_c5Al3ui,1566,Release Datasette 0.60,9599,simonw,closed,0,,,7571612,Datasette 0.60,6,2021-12-17T22:58:12Z,2022-01-14T01:59:55Z,2022-01-14T01:59:55Z,OWNER,,Using this as a tracking issue. I'm hoping to get the bulk of the JSON redesign work from the refactor in #1554 in for this release.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1566/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1083573206,I_kwDOBm6k_c5AlgPW,1563,Datasette(... 
files=) should not be a required argument,9599,simonw,closed,0,,,7571612,Datasette 0.60,2,2021-12-17T19:54:18Z,2022-01-13T22:27:18Z,2021-12-18T02:19:40Z,OWNER,,"```pycon
>>> ds = Datasette(memory=True)
Traceback (most recent call last):
  File ""<stdin>"", line 1, in <module>
TypeError: __init__() missing 1 required positional argument: 'files'
>>> ds = Datasette(memory=True, files=[])
```

I wanted to create an in-memory Datasette for running some tests, no point in forcing me to pass `files=[]` to do that.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1563/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1083581011,I_kwDOBm6k_c5AliJT,1564,_prepare_connection not called on write connections,9599,simonw,closed,0,,,7571612,Datasette 0.60,1,2021-12-17T20:06:47Z,2022-01-20T21:29:43Z,2021-12-18T01:58:44Z,OWNER,,"I was trying to initialize SpatiaLite in a write connection:

```pycon
>>> from datasette.app import Datasette
>>> ds = Datasette(memory=True, files=[], sqlite_extensions=[""spatialite""])
>>> db = ds.add_memory_database('geo')
>>> await db.execute_write(""select InitSpatialMetadata(1)"")
UUID('3f143baa-4e3d-5842-a36f-4fa2f683b72f')
no such function: InitSpatialMetadata
```

It looks like the code that loads additional modules only works on read-only connections, not on write connections: https://github.com/simonw/datasette/blob/92a5280d2e75c39424a75ad6226fc74400ae984f/datasette/database.py#L146-L153 Compared to: https://github.com/simonw/datasette/blob/92a5280d2e75c39424a75ad6226fc74400ae984f/datasette/database.py#L124-L132",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1564/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1083921371,I_kwDOBm6k_c5Am1Pb,1570,Separate db.execute_write() into three methods,9599,simonw,closed,0,,,7571612,Datasette 0.60,2,2021-12-18T18:45:54Z,2022-01-13T22:27:38Z,2021-12-18T18:57:25Z,OWNER,,"> Rather than adding a `executemany=True` parameter, I'm now thinking a better design might be to have three methods:
>
> - `db.execute_write(sql, params=None, block=False)`
> - `db.execute_write_script(sql, block=False)`
> - `db.execute_write_many(sql, params_seq, block=False)`

_Originally posted by @simonw in https://github.com/simonw/datasette/issues/1555#issuecomment-997267416_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1570/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1083927147,I_kwDOBm6k_c5Am2pr,1571,Track number of executions for execute_write_many() in traces,9599,simonw,closed,0,,,7571612,Datasette 0.60,0,2021-12-18T19:16:17Z,2022-01-13T22:27:49Z,2021-12-19T20:30:40Z,OWNER,,"Spotted while working on #1555. There's no indication there of how many times `execute_write_many()` executed the SQL. Solving this is a tiny bit tricky because `params_seq` is an iterator that we don't want to exhaust before passing it to `conn.executemany()` - so we need to instead wrap it in something that counts how many times it was called. 
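For example, a hypothetical wrapper along these lines:

```python
class CountingIterator:
    # Wraps params_seq so conn.executemany() can consume it as usual
    # while keeping a running count of the parameter sets yielded.
    def __init__(self, inner):
        self.inner = iter(inner)
        self.count = 0

    def __iter__(self):
        return self

    def __next__(self):
        value = next(self.inner)
        self.count += 1
        return value

# wrapped = CountingIterator(params_seq)
# conn.executemany(sql, wrapped)
# wrapped.count now holds the number of executions
```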
But then we need a way to attach that to the trace here: https://github.com/simonw/datasette/blob/d637ed46762fdbbd8e32b86f258cd9a53c1cfdc7/datasette/database.py#L115-L122 So probably need to redesign the `trace()` decorator to allow extra pairs to be attached to it within the `with` statement. ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1571/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1083718998,I_kwDOBm6k_c5AmD1W,1567,Remove undocumented sqlite_functions mechanism,9599,simonw,closed,0,,,7571612,Datasette 0.60,0,2021-12-18T01:51:10Z,2022-01-13T22:27:04Z,2021-12-18T01:54:46Z,OWNER,,"I added this in 0b8c1b0a6da9cb8ac0d28cc90dd783de87554036 but it's never been documented and the same thing can now be achieved using the `prepare_connection` plugin hook. https://github.com/simonw/datasette/blob/0c91e59d2bbfc08884cfcf5d1b902a2f4968b7ff/datasette/app.py#L262 https://github.com/simonw/datasette/blob/0c91e59d2bbfc08884cfcf5d1b902a2f4968b7ff/datasette/app.py#L551-L552 It's used here in the tests: https://github.com/simonw/datasette/blob/69244a617b1118dcbd04a8f102173f04680cf08c/tests/fixtures.py#L156",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1567/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1083726550,I_kwDOBm6k_c5AmFrW,1568,Trace should show queries on the write connection too,9599,simonw,closed,0,,,7571612,Datasette 0.60,2,2021-12-18T02:34:12Z,2022-01-13T22:27:23Z,2021-12-18T02:42:34Z,OWNER,,"> Here's why - `trace` only applies to read, not write SQL operations: https://github.com/simonw/datasette/blob/7c8f8aa209e4ba7bf83976f8495d67c28fbfca24/datasette/database.py#L209-L211 _Originally posted by @simonw in https://github.com/simonw/datasette/issues/1555#issuecomment-997128508_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1568/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1083895395,I_kwDOBm6k_c5Amu5j,1569,"db.execute_write(..., executescript=True) parameter",9599,simonw,closed,0,,,7571612,Datasette 0.60,2,2021-12-18T18:20:47Z,2022-01-13T22:27:27Z,2021-12-18T18:34:18Z,OWNER,,"> Idea: teach `execute_write` to accept an optional `executescript=True` parameter, like this:

```diff
diff --git a/datasette/database.py b/datasette/database.py
index 468e936..1a424f5 100644
--- a/datasette/database.py
+++ b/datasette/database.py
@@ -94,10 +94,14 @@ class Database:
             f""file:{self.path}{qs}"", uri=True, check_same_thread=False
         )
 
-    async def execute_write(self, sql, params=None, block=False):
+    async def execute_write(self, sql, params=None, executescript=False, block=False):
+        assert not executescript and params, ""Cannot use params with executescript=True""
         def _inner(conn):
             with conn:
-                return conn.execute(sql, params or [])
+                if executescript:
+                    return conn.executescript(sql)
+                else:
+                    return conn.execute(sql, params or [])
 
         with trace(""sql"", database=self.name, sql=sql.strip(), params=params):
             results = await self.execute_write_fn(_inner, block=block)
```

_Originally posted by @simonw in https://github.com/simonw/datasette/issues/1555#issuecomment-997248364_",107914493,datasette,issue,,,"{""url"": 
""https://api.github.com/repos/simonw/datasette/issues/1569/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1084185188,I_kwDOBm6k_c5An1pk,1573,Make trace() a documented internal API,9599,simonw,open,0,,,,,1,2021-12-19T20:32:56Z,2021-12-19T21:13:13Z,,OWNER,,"This should be documented so plugin authors can use it to add their own custom traces: https://github.com/simonw/datasette/blob/8f311d6c1d9f73f4ec643009767749c17b5ca5dd/datasette/tracer.py#L28-L52 Including the new `kwargs` pattern I added in #1571: https://github.com/simonw/datasette/blob/f65817000fdf87ce8a0c23edc40784ebe33b5842/datasette/database.py#L128-L132",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1573/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1084007781,I_kwDOBm6k_c5AnKVl,1572,"""Query took"" should be ""Queries took""",9599,simonw,closed,0,,,7571612,Datasette 0.60,0,2021-12-19T04:03:00Z,2022-01-13T22:27:43Z,2021-12-19T04:03:24Z,OWNER,,"This is misleading, since usually there have been more than one query executed: ![CleanShot 2021-12-18 at 20 02 35@2x](https://user-images.githubusercontent.com/9599/146663457-9c4c2900-5cc0-4650-a565-bb1ff0b8a725.png) ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1572/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1084257842,I_kwDOBm6k_c5AoHYy,1575,__call__() got an unexpected keyword argument 'specname',9599,simonw,closed,0,,,,,1,2021-12-20T01:24:04Z,2021-12-20T01:48:03Z,2021-12-20T01:47:57Z,OWNER,,"> I've installed the alpha version but get an error when starting up Datasette: ``` Traceback (most recent call last): File ""/Users/tim/.pyenv/versions/stock-exchange/bin/datasette"", line 5, in from datasette.cli import cli File ""/Users/tim/.pyenv/versions/3.8.5/envs/stock-exchange/lib/python3.8/site-packages/datasette/cli.py"", line 15, in from .app import Datasette, DEFAULT_SETTINGS, SETTINGS, SQLITE_LIMIT_ATTACHED, pm File ""/Users/tim/.pyenv/versions/3.8.5/envs/stock-exchange/lib/python3.8/site-packages/datasette/app.py"", line 31, in from .views.database import DatabaseDownload, DatabaseView File ""/Users/tim/.pyenv/versions/3.8.5/envs/stock-exchange/lib/python3.8/site-packages/datasette/views/database.py"", line 25, in from datasette.plugins import pm File ""/Users/tim/.pyenv/versions/3.8.5/envs/stock-exchange/lib/python3.8/site-packages/datasette/plugins.py"", line 29, in mod = importlib.import_module(plugin) File ""/Users/tim/.pyenv/versions/3.8.5/lib/python3.8/importlib/__init__.py"", line 127, in import_module return _bootstrap._gcd_import(name[level:], package, level) File ""/Users/tim/.pyenv/versions/3.8.5/envs/stock-exchange/lib/python3.8/site-packages/datasette/filters.py"", line 9, in @hookimpl(specname=""filters_from_request"") TypeError: __call__() got an unexpected keyword argument 'specname' ``` _Originally posted by @wragge in https://github.com/simonw/datasette/issues/1547#issuecomment-997511968_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1575/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 
1087181951,I_kwDOBm6k_c5AzRR_,1576,Traces should include SQL executed by subtasks created with `asyncio.gather`,9599,simonw,closed,0,,,3268330,Datasette 1.0,12,2021-12-22T20:52:02Z,2022-02-05T05:21:35Z,2022-02-05T05:19:53Z,OWNER,,"I tried running some parallel SQL queries using `asyncio.gather()` but the SQL that was executed didn't show up in the trace rendered by https://datasette.io/plugins/datasette-pretty-traces I realized that was because traces are keyed against the current task ID, which changes when a sub-task is run using `asyncio.gather` or similar. The faceting and suggest faceting queries are missing from this trace: ![image](https://user-images.githubusercontent.com/9599/147153855-2d611f07-922a-4d18-9e6e-4be89e010dc4.png) > The reason they aren't showing up in the traces is that traces are stored just for the currently executing `asyncio` task ID: https://github.com/simonw/datasette/blob/ace86566b28280091b3844cf5fbecd20158e9004/datasette/tracer.py#L13-L25 > > This is so traces for other incoming requests don't end up mixed together. But there's no current mechanism to track async tasks that are effectively ""child tasks"" of the current request, and hence should be tracked the same. > > https://stackoverflow.com/a/69349501/6083 suggests that you pass the task ID as an argument to the child tasks that are executed using `asyncio.gather()` to work around this kind of problem. _Originally posted by @simonw in https://github.com/simonw/datasette/issues/1518#issuecomment-999870993_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1576/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1104691662,I_kwDOBm6k_c5B2EHO,1600,plugins --all example should use cog,9599,simonw,closed,0,,,,,1,2022-01-15T11:47:49Z,2022-01-20T05:06:21Z,2022-01-20T05:04:16Z,OWNER,,The example output for `datasette plugins --all` on this page has got out of date: https://docs.datasette.io/en/stable/plugins.html#seeing-what-plugins-are-installed,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1600/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1105916061,I_kwDOBm6k_c5B6vCd,1601,Add KNN and data_licenses to hidden tables list,25778,eyeseast,closed,0,,,,,5,2022-01-17T14:19:57Z,2022-01-20T21:29:44Z,2022-01-20T04:38:54Z,CONTRIBUTOR,,"They're generated by Spatialite and not very interesting in most cases. ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1601/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1090810196,I_kwDOBm6k_c5BBHFU,1583,consider adding deletion step of cloudbuild artifacts to gcloud publish,536941,fgregg,open,0,,,,,1,2021-12-30T00:33:23Z,2021-12-30T00:34:16Z,,CONTRIBUTOR,,"Right now, as part of the publish process, images and other artifacts are stored to gcloud's cloud storage before being deployed to cloudrun. After successfully deploying, it would be nice if the script deleted these artifacts. 
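A sketch of that cleanup with the `google-cloud-storage` client - the `_cloudbuild` staging-bucket name and `source/` prefix follow Cloud Build conventions, but treat the details here as assumptions:

```python
from google.cloud import storage

def delete_cloudbuild_sources(project_id):
    # Cloud Build stages uploaded sources in <project>_cloudbuild;
    # remove them once the deploy has succeeded.
    client = storage.Client(project=project_id)
    bucket = client.bucket(f'{project_id}_cloudbuild')
    for blob in bucket.list_blobs(prefix='source/'):
        blob.delete()
```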
Otherwise, if you have a regularly scheduled build process, you can end up paying to store lots of out-of-date artifacts.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1583/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1091257796,I_kwDOBm6k_c5BC0XE,1584,give error with recursive sql,58088336,tunguyenatwork,open,0,,,,,0,2021-12-30T18:53:16Z,2021-12-30T18:53:16Z,,NONE,,"I got an error ""near ""WITH"": syntax error"" after I upgraded to version 0.59 from 0.52.4. This error is related to recursive sql. It works great on the previous version but it failed after I upgraded. Below is an example of the sql: WITH RECURSIVE manager_of(position, super_position) AS (SELECT position, case ifnull(INDIRECT_SUPER_POSITION,'') when '' then super_position else INDIRECT_SUPER_POSITION end as SUPER_POSITION FROM position where super_position<>'SGV000000001' and super_position!='' and position <> super_position),chain_manager_of_position(position, level) AS (SELECT super_position, 1 as level FROM manager_of WHERE super_position!='' and (position=:pos or position in (Select position from employee where employee=:ein)) UNION ALL SELECT super_position, level+1 as level FROM manager_of JOIN chain_manager_of_position USING(position)) SELECT * FROM chain_manager_of_position left join employee using(position) where employee is not NULL order by level limit 1",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1584/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1091838742,I_kwDOBm6k_c5BFCMW,1585,Fire base caching for `publish cloudrun`,9599,simonw,open,0,,,,,1,2022-01-01T15:38:15Z,2022-01-01T15:40:38Z,,OWNER,,"https://gist.github.com/steren/03d3e58c58c9a53fd49bb78f58541872 has a recipe for this, via https://twitter.com/steren/status/1477038411114446848 Could this enable easier vanity URLs of the format `https://$project_id.web.app/`? How about CDN caching?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1585/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1096536240,I_kwDOBm6k_c5BW9Cw,1586,run analyze on all databases as part of start up or publishing,536941,fgregg,open,0,,,,,1,2022-01-07T17:52:34Z,2022-02-02T07:13:37Z,,CONTRIBUTOR,,"Running `analyze;` lets sqlite's query planner make *much* better use of any indices. 
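As a sketch, a plugin could already approximate this with the existing `startup` hook - untested, and the details are assumptions:

```python
from datasette import hookimpl

@hookimpl
def startup(datasette):
    async def run_analyze():
        # Refresh the query planner's statistics for every mutable database
        for database in datasette.databases.values():
            if database.is_mutable:
                await database.execute_write('ANALYZE', block=True)
    return run_analyze
```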
It might be nice if the analyze was run as part of the start up of ""serve"" or ""publish"".",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1586/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1097040427,I_kwDOBm6k_c5BY4Ir,1587,Add `sqlite_stat1`(-4) tables to hidden table list,9599,simonw,closed,0,,,,,2,2022-01-08T21:28:20Z,2022-01-20T04:12:59Z,2022-01-20T04:12:59Z,OWNER,,"> Running `ANALYZE` creates a new visible table called `sqlite_stat1`: https://www.sqlite.org/fileformat.html#the_sqlite_stat1_table > > This should be added to the default list of hidden tables in Datasette.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1587/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1097101917,I_kwDOBm6k_c5BZHJd,1588,`explain query plan select` is too strict about whitespace,9599,simonw,closed,0,,,7571612,Datasette 0.60,3,2022-01-09T04:22:42Z,2022-01-13T22:28:19Z,2022-01-13T20:35:05Z,OWNER,,"`explain query plan select * from facetable` is allowed: https://latest.datasette.io/fixtures?sql=explain+query+plan+select+*+from+facetable But... `explain query plan  select * from facetable` (with two spaces before the `select`) returns a ""Statement must be a SELECT"" error: https://latest.datasette.io/fixtures?sql=explain+query+plan++select+*+from+facetable",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1588/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1099723916,I_kwDOBm6k_c5BjHSM,1590,Table+query JSON and CSV links broken when using `base_url` setting,1001306,eelkevdbos,closed,0,,,7571612,Datasette 0.60,11,2022-01-11T23:46:39Z,2022-01-14T01:16:34Z,2022-01-14T01:16:08Z,NONE,,"Datasette appends the prefix found in the `base_url` setting twice if a `base_url` is set. In the following asgi example, I'm hosting a custom Datasette instance:

```python
# asgi.py
import pathlib

from asgi_cors import asgi_cors
from channels.routing import URLRouter
from django.urls import re_path

from datasette.app import Datasette

datasette_ = Datasette(
    files=[],
    settings={
        ""base_url"": ""/datasettes/"",
        ""plugins"": {}
    },
    config_dir=pathlib.Path('.'),
)

application = URLRouter([
    re_path(r""^datasettes/.*"", asgi_cors(datasette_.app(), allow_all=True)),
])
```

Running it with:

```shell
$ daphne -p 8002 asgi:application
```

Using a simple query on the `_memory` table:

```sql
select sqlite_version()
```

http://localhost:8002/datasettes/_memory?sql=select+sqlite_version%28%29 It renders the following upon inspection: ![image](https://user-images.githubusercontent.com/1001306/149038851-aa842950-126a-467c-9a86-fae13bce6221.png) I am using datasette version `0.59.4`",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1590/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1100015398,I_kwDOBm6k_c5BkOcm,1591,Maybe let plugins define custom serve options?,9599,simonw,open,0,,,,,7,2022-01-12T08:18:47Z,2022-01-15T11:56:59Z,,OWNER,,"https://twitter.com/psychemedia/status/1481171650934714370 > can extensions be passed their own cli args? 
eg `--ext-tiddlywiki-dbname tiddlywiki2.sqlite` ? I've thought something like this might be useful for other plugins in the past, too.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1591/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1100499619,I_kwDOBm6k_c5BmEqj,1592,Row pages should show links to foreign keys,9599,simonw,open,0,,,,,1,2022-01-12T15:50:20Z,2022-01-12T15:52:17Z,,OWNER,,Refs #1518 refactor.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1592/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1102568047,I_kwDOBm6k_c5Bt9pv,1596,Documentation page warning of changes coming in 1.0,9599,simonw,open,0,,,,,0,2022-01-13T23:26:04Z,2022-01-13T23:26:04Z,,OWNER,,I should start this relatively soon.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1596/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1102359726,I_kwDOBm6k_c5BtKyu,1594,"Add a CLI reference page to the docs, inspired by sqlite-utils",9599,simonw,closed,0,,,7571612,Datasette 0.60,3,2022-01-13T20:55:08Z,2022-01-13T22:28:22Z,2022-01-13T21:38:48Z,OWNER,,"Thought of this while posting this comment: https://github.com/simonw/datasette/issues/1591#issuecomment-1012506595 I added https://sqlite-utils.datasette.io/en/stable/cli-reference.html to `sqlite-utils` in https://github.com/simonw/sqlite-utils/issues/383 and I _really_ like it - it's a page showing the `--help` output of every CLI command for that tool. It's maintained using `cog`. One of the benefits is that I get a free commit history of changes to `--help` at https://github.com/simonw/sqlite-utils/commits/main/docs/cli-reference.rst",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1594/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1102484126,I_kwDOBm6k_c5BtpKe,1595,Release notes for 0.60,9599,simonw,closed,0,,,7571612,Datasette 0.60,4,2022-01-13T22:23:14Z,2022-01-14T01:37:39Z,2022-01-14T01:37:39Z,OWNER,,,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1595/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1102612922,I_kwDOBm6k_c5BuIm6,1597,"""datasette inspect"" has no help summary",9599,simonw,closed,0,,,,,1,2022-01-14T00:02:16Z,2022-01-14T00:07:36Z,2022-01-14T00:07:36Z,OWNER,,"Made obvious by the new CLI reference page added in #1594. https://docs.datasette.io/en/latest/cli-reference.html#datasette-inspect-help ``` Commands: serve* Serve up specified SQLite database files with a web UI inspect install Install Python packages - e.g. ``` ``` Usage: datasette inspect [OPTIONS] [FILES]... Options: --inspect-file TEXT --load-extension TEXT Path to a SQLite extension to load --help Show this message and exit. 
```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1597/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1102637351,I_kwDOBm6k_c5BuOkn,1598,Replace update-docs-help.py script with cog,9599,simonw,closed,0,,,,,1,2022-01-14T00:33:27Z,2022-01-14T00:47:57Z,2022-01-14T00:47:57Z,OWNER,,"I introduced `cog` in #1594 - I can use this to replace the older `update-docs-help.py` mechanism: https://github.com/simonw/datasette/blob/76d66d5b2bf10249c0beaac0999b93ac8d757f48/tests/test_docs.py#L36-L53",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1598/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1102966378,I_kwDOBm6k_c5Bve5q,1599,Add architecture documentation,9599,simonw,open,0,,,,,0,2022-01-14T04:55:38Z,2022-01-14T04:56:03Z,,OWNER,,"Inspired by https://matklad.github.io/2021/02/06/ARCHITECTURE.md.html Good example: https://github.com/rust-analyzer/rust-analyzer/blob/d7c99931d05e3723d878bea5dc26766791fa4e69/docs/dev/architecture.md",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1599/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1121121305,I_kwDOBm6k_c5C0vQZ,1618,"Reconsider policy on blocking queries containing the string ""pragma""",770231,strada,open,0,,,,,6,2022-02-01T19:39:46Z,2022-02-02T19:42:03Z,,NONE,,"First of all, thanks for creating this cool project, and also supporting publishing to various hosting services out of the box. While testing out, I noticed legitimate queries such as ``` select * from books where title like 'Pragmatic%' ``` or ``` select * from books where title = 'The Pragmatic Programmer' ``` are blocked, due to the regular expression check here: https://github.com/simonw/datasette/blob/main/datasette/utils/__init__.py#L185 Example as seen from a Datasette instance: https://fivethirtyeight.datasettes.com/polls?sql=select+*+from+books+where+title+like+%27Pragmatic%25%27%0D%0A I'd propose a regular expression like ``` re.compile(f""pragma_(?!({'|'.join(allowed_pragmas)}))""), ``` instead of ``` re.compile(f""pragma(?!_({'|'.join(allowed_pragmas)}))""), ``` I can create a pull request with this change, unless the maintainers think it would allow unwanted queries to be executed. ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1618/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1121583414,I_kwDOBm6k_c5C2gE2,1619,JSON link on row page is 404 if base_url setting is used,9599,simonw,open,0,,,,,5,2022-02-02T07:09:53Z,2023-03-24T15:38:04Z,,OWNER,,"On my local environment: datasette fixtures.db -p 3344 --setting base_url /foo/bar/ Then hit http://127.0.0.1:3344/foo/bar/fixtures/table%2Fwith%2Fslashes.csv/3 But... 
that `json` link goes here, which is a 404: http://127.0.0.1:3344/foo/bar/foo/bar/fixtures/table%2Fwith%2Fslashes.csv/3?_format=json",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1619/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1121618041,I_kwDOBm6k_c5C2oh5,1620,"Link: rel=""alternate"" to JSON for queries too",9599,simonw,closed,0,,,3268330,Datasette 1.0,3,2022-02-02T08:02:42Z,2022-02-02T21:53:02Z,2022-02-02T21:33:00Z,OWNER,,"Following: - #1533 I implemented it for tables and rows but I should have done queries as well.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1620/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1122413719,I_kwDOBm6k_c5C5qyX,1621,Test against Python 3.11 dev version,9599,simonw,closed,0,,,3268330,Datasette 1.0,0,2022-02-02T21:38:57Z,2022-03-19T04:04:49Z,2022-02-02T21:58:54Z,OWNER,,"To avoid another surprise like we got with 3.10: https://simonwillison.net/2021/Oct/9/finding-and-reporting-a-bug/ From a quick GitHub code search it looks like `3.11-dev` should work: https://cs.github.com/urllib3/urllib3/blob/7bec77e81aa0a194c98381053225813f5347c9d2/.github/workflows/ci.yml#L60",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1621/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1122416919,I_kwDOBm6k_c5C5rkX,1623,/-/patterns returns link: alternate JSON header to 404,9599,simonw,closed,0,,,3268330,Datasette 1.0,2,2022-02-02T21:42:49Z,2022-03-19T04:04:49Z,2022-02-02T21:48:56Z,OWNER,,"Bug from: - #1620 ``` % curl -s -I 'https://latest.datasette.io/-/patterns' | grep link link: https://latest.datasette.io/-/patterns.json; rel=""alternate""; type=""application/json+datasette"" ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1623/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1122427321,I_kwDOBm6k_c5C5uG5,1624,Index page `/` has no CORS headers,9599,simonw,open,0,,,,,2,2022-02-02T21:56:10Z,2022-09-28T16:54:22Z,,OWNER,,"Compare the following: ``` % curl -I 'https://latest.datasette.io/fixtures' HTTP/1.1 200 OK link: https://latest.datasette.io/fixtures.json; rel=""alternate""; type=""application/json+datasette"" cache-control: max-age=5 referrer-policy: no-referrer access-control-allow-origin: * access-control-allow-headers: Authorization access-control-expose-headers: Link content-type: text/html; charset=utf-8 x-databases: _memory, _internal, fixtures, extra_database Date: Wed, 02 Feb 2022 21:55:49 GMT Server: Google Frontend Transfer-Encoding: chunked % curl -I 'https://latest.datasette.io/' HTTP/1.1 200 OK link: https://latest.datasette.io/.json; rel=""alternate""; type=""application/json+datasette"" content-type: text/html; charset=utf-8 x-databases: _memory, _internal, fixtures, extra_database Date: Wed, 02 Feb 2022 21:55:52 GMT Server: Google Frontend Transfer-Encoding: chunked ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1624/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, 
""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1122450452,I_kwDOBm6k_c5C5zwU,1625,Try running tests against macOS and Windows in addition to Ubuntu,9599,simonw,open,0,,,,,0,2022-02-02T22:25:57Z,2022-02-02T22:25:57Z,,OWNER,,"I already do this for `sqlite-utils`: https://github.com/simonw/sqlite-utils/blob/3.22.1/.github/workflows/test.yml Related: - #1617 - #1545",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1625/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1122557010,I_kwDOBm6k_c5C6NxS,1627,Get the tests passing against Windows,9599,simonw,open,0,,,,,0,2022-02-03T01:23:06Z,2022-02-03T01:23:32Z,,OWNER,,"> OK, the tests do NOT pass against Windows! https://github.com/simonw/datasette/runs/5044105941 > > _Originally posted by @simonw in https://github.com/simonw/datasette/issues/1626#issuecomment-1028515161_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1627/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1108300685,I_kwDOBm6k_c5CD1ON,1604,Option to assign a domain/subdomain using `datasette publish cloudrun`,9599,simonw,open,0,,,,,1,2022-01-19T16:21:17Z,2022-01-19T16:23:54Z,,OWNER,,Looks like this API should be able to do that: https://twitter.com/steren/status/1483835859191304192 - https://cloud.google.com/run/docs/reference/rest/v1/namespaces.domainmappings/create,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1604/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1108235694,I_kwDOBm6k_c5CDlWu,1603,A proper favicon,9599,simonw,closed,0,,,3268330,Datasette 1.0,19,2022-01-19T15:24:55Z,2022-03-19T04:04:49Z,2022-01-20T06:07:31Z,OWNER,,"Tips here: https://adamj.eu/tech/2022/01/18/how-to-add-a-favicon-to-your-django-site/ - I think a PNG served at `/favicon.ico` is the best option, since safari doesn't support SVG yet. 
Relevant code: https://github.com/simonw/datasette/blob/cb29119db9115b1f40de2fb45263ed77e3bfbb3e/datasette/app.py#L182-L183 I can reuse the icon for https://datasette.io/desktop",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1603/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1108846067,I_kwDOBm6k_c5CF6Xz,1606,Tests failing against Python 3.6,9599,simonw,closed,0,,,,,3,2022-01-20T04:22:44Z,2022-01-20T04:36:42Z,2022-01-20T04:36:42Z,OWNER,,"https://github.com/simonw/datasette/runs/4877484366 ``` E File ""/opt/hostedtoolcache/Python/3.6.15/x64/lib/python3.6/site-packages/uvicorn/server.py"", line 67, in run E return asyncio.run(self.serve(sockets=sockets)) E AttributeError: module 'asyncio' has no attribute 'run' ``` I think this may mean `uvicorn` has dropped support for Python 3.6.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1606/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1108671952,I_kwDOBm6k_c5CFP3Q,1605,Scripted exports,25778,eyeseast,open,0,,,,,10,2022-01-19T23:45:55Z,2022-11-30T15:06:38Z,,CONTRIBUTOR,,"Posting this while I'm thinking about it: I mentioned at the end of [this thread](https://twitter.com/eyeseast/status/1483893011658551299) that I'm usually doing `datasette --get` to export canned queries. I used to use a tool called [datafreeze](https://github.com/pudo/datafreeze) to do scripted exports, but that project looks dead now. The ergonomics of it are pretty nice, though, and the `Freezefile.yml` structure is actually not too far from Datasette's canned queries. This is related to the idea for `datasette query` (#1356) but I think it's a distinct feature. It's most likely a plugin, but I want to raise it here because it's probably something other people have thought about.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1605/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1109884720,I_kwDOBm6k_c5CJ38w,1609,"Ensure ""pip install datasette"" still works with Python 3.6",9599,simonw,closed,0,,,,,12,2022-01-21T00:08:10Z,2022-01-24T19:20:09Z,2022-01-21T02:24:13Z,OWNER,,"## Original title: Can I keep ""pip install datasette"" working on Python 3.6? I dropped support for 3.6 in: - #1577 I'm getting reports that `pip3 install datasette` throws an error on that Python, even though I haven't made that new release yet - presumably due to lack of pinning of Uvicorn: https://twitter.com/ldodds/status/1484289475195080706 Is it possible to get `pip` on that version of Python to install the highest possible version of the packages that are still known to support Python 3.6? 
If so, how?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1609/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1109783030,I_kwDOBm6k_c5CJfH2,1607,More detailed information about installed SpatiaLite version,9599,simonw,closed,0,,,3268330,Datasette 1.0,2,2022-01-20T21:28:03Z,2022-02-09T06:42:02Z,2022-02-09T06:32:28Z,OWNER,,"https://www.gaia-gis.it/gaia-sins/spatialite-sql-5.0.0.html#version has a whole bunch of interesting functions for things like `freexl_version()` and `geos_version()` and `HasMathSQL()` and suchlike. These could be shown on the `/-/versions` page.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1607/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1109808154,I_kwDOBm6k_c5CJlQa,1608,Documentation should clarify /stable/ vs /latest/,9599,simonw,closed,0,,,,,15,2022-01-20T22:02:59Z,2023-03-26T23:41:12Z,2022-01-20T22:53:17Z,OWNER,,"It's not currently clear what the difference between https://docs.datasette.io/en/latest/ and https://docs.datasette.io/en/stable/ is - I should fix that. On Twitter: https://twitter.com/simonw/status/1484285006243528705",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1608/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1113384383,I_kwDOBm6k_c5CXOW_,1611,Avoid ever running count(*) against SpatiaLite KNN table,9599,simonw,open,0,,,,,1,2022-01-25T03:32:54Z,2022-02-02T06:45:47Z,,OWNER,,"Got this in a trace: Looks like running `count(*)` against KNN took 83s! It ignored the time limit. And still only returned a count of 0.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1611/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1114147905,I_kwDOBm6k_c5CaIxB,1612,Move canned queries closer to the SQL input area,639012,jsfenfen,closed,0,,,3268330,Datasette 1.0,5,2022-01-25T17:06:39Z,2022-03-19T04:04:49Z,2022-01-25T18:34:21Z,CONTRIBUTOR,,"*Original title: Consider placing example queries above the sql input?* Hi! Have been enjoying deploying ad hoc datasettes for collaborators to pick over! I keep finding myself manually ""fixing"" the database.html template so that the ""example queries"" (canned queries) appear directly *over* the sql box? So they are sorta more a suggestion for collaborators who aren't inclined to write their own queries? My sense is any time I go to the trouble of writing canned queries my users should see 'em? (( I have also considered a client-side reactive-ish option where selecting a query just places the raw SQL in the box and doesn't execute it, but this seems to end up being an inconvenience, rather than a teaching tool. 
)) ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1612/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1114628238,I_kwDOBm6k_c5Cb-CO,1613,Improvements to help make Datasette a better tool for learning SQL,9599,simonw,open,0,,,,,5,2022-01-26T04:56:07Z,2022-01-26T16:41:46Z,,OWNER,,Tracking issue for the general goal of making Datasette a better tool for learning SQL.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1613/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1115435536,I_kwDOBm6k_c5CfDIQ,1614,Try again with SQLite codemirror support,9599,simonw,open,0,,,,,1,2022-01-26T20:05:20Z,2022-12-23T21:27:10Z,,OWNER,,"I tried and failed to implement autocomplete a while ago. Relevant code: https://github.com/codemirror/legacy-modes/blob/8f36abca5f55024258cd23d9cfb0203d8d244f0d/mode/sql.js#L335 Sounds like upgrading to CodeMirror 6 ASAP would be worthwhile since it has better accessibility and touch screen support: https://codemirror.net/6/",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1614/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1117132741,I_kwDOBm6k_c5ClhfF,1615,Potential simplified publishing mechanism,369053,aidansteele,closed,0,,,,,2,2022-01-28T08:34:50Z,2022-02-02T07:34:21Z,2022-02-02T07:34:17Z,NONE,,"Hi, Forewarning: this idea is one I've only been thinking about for a while and it's not fully fleshed-out yet. I love Datasette and what it stands for. I was thinking about how we could make it accessible to more people, especially those without access to credit cards required for a lot of hosting options. Or they might not feel comfortable signing up for said services. So I was thinking I might create a service that hosts Datasette instances for folks. I'd probably stick it on AWS Lambda and limit requests to something like n/month to avoid bankrupting myself. If I did build such a hypothetical service, I was thinking I would rely on GitHub Actions to do the heavy lifting. E.g. user `johndoe` creates a repo `my-animals` with a couple of files: `dogs.csv`, `cats.csv` and the following GitHub Actions workflow: ```yaml # .github/workflows/push.yml on: push # this allows the publish action to use OIDC to authenticate johndoe/my-animals permissions: id-token: write contents: read jobs: publish: runs-on: ubuntu-latest steps: - uses: actions/setup-python@v2 - run: pip install sqlite-utils - uses: actions/checkout@v2 - run: | set -eux sqlite-utils create-database animals.db sqlite-utils insert animals.db dogs dogs.csv --csv sqlite-utils insert animals.db cats cats.csv --csv - uses: datasette-hub/publish@v1 with: db: animals.db metadata: meta.yml # this step is helpful for debugging why the # generated sqlite db was rejected - uses: actions/upload-artifact@v2 if: failure() with: path: animals.db retention-days: 1 ``` This would then cause a Datasette instance to be available at `https://johndoe-my-animals.datasette-hub.test/`. It feels like this could significantly reduce the friction to someone being able to go from data set to Datasette. What do you think? Does this address a real need? Or am I perhaps misunderstanding the main friction points? 
As a bonus: it feels like this would pair well with [git scraping](https://simonwillison.net/2020/Oct/9/git-scraping/).",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1615/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1138008042,I_kwDOBm6k_c5D1J_q,1636,"""permissions"" property in metadata for configuring arbitrary permissions",9599,simonw,closed,0,,,8711695, Datasette 1.0a2,14,2022-02-15T00:25:59Z,2022-12-13T02:40:50Z,2022-12-13T02:40:50Z,OWNER,,"The `""allow""` block mechanism can already be used to configure various default permissions. When adding permissions to `datasette-tiddlywiki` I realized it would be good to be able to configure arbitrary permissions such as `edit-tiddlywiki` there too.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1636/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1125576543,I_kwDOBm6k_c5DFu9f,1630,Review datasette.utils and decide which functions should be documented for 1.0,9599,simonw,open,0,,,3268330,Datasette 1.0,0,2022-02-07T06:39:52Z,2022-02-07T06:39:52Z,,OWNER,,"Follows: - #1176",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1630/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1126604194,I_kwDOBm6k_c5DJp2i,1632,"datasette one.db one.db opens database twice, as one and one_2",9599,simonw,closed,0,,,3268330,Datasette 1.0,6,2022-02-07T23:14:47Z,2022-03-19T04:04:49Z,2022-02-07T23:50:01Z,OWNER,,"> ``` > % mkdir /tmp/data > % cp ~/Dropbox/Development/datasette/fixtures.db /tmp/data > % datasette /tmp/data/*.db /tmp/data/created.db --create -p 8852 > ... > INFO: Uvicorn running on http://127.0.0.1:8852 (Press CTRL+C to quit) > ^CINFO: Shutting down > % datasette /tmp/data/*.db /tmp/data/created.db --create -p 8852 > ... > INFO: 127.0.0.1:49533 - ""GET / HTTP/1.1"" 200 OK > ``` > The first time I ran Datasette I got two databases - `fixtures` and `created` > > BUT... when I ran Datasette the second time it looked like this: > > > > This is the same result you get if you run: > > datasette /tmp/data/fixtures.db /tmp/data/created.db /tmp/data/created.db > > This is caused by this Datasette issue: > - https://github.com/simonw/datasette/issues/509 > > So... either I teach Datasette to de-duplicate multiple identical file paths passed to the command, or I can't use `/data/*.db` in the `Dockerfile` here and I need to go back to other solutions for the challenge described in this comment: https://github.com/simonw/datasette-publish-fly/pull/12#issuecomment-1031971831 _Originally posted by @simonw in https://github.com/simonw/datasette-publish-fly/pull/12#issuecomment-1032029874_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1632/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1129052172,I_kwDOBm6k_c5DS_gM,1633,base_url or prefix does not work with _exact match,6613091,henrikek,open,0,,,,,2,2022-02-09T21:45:07Z,2022-04-28T09:12:56Z,,NONE,,"When I hit the ""Apply"" button to search with ""_exact"" for a column, the URL prefix is removed from the URL.
![image](https://user-images.githubusercontent.com/6613091/153293758-0b757d55-5757-4987-992e-9426e69a7956.png) And the result is: ![image](https://user-images.githubusercontent.com/6613091/153294672-87be7809-bb7b-455d-bf1a-41e90bbfa4ae.png) If I add the marked row to url_builder.py it seems to work: ![image](https://user-images.githubusercontent.com/6613091/153295231-bdd52e37-efcf-4b21-9d37-69f182a922f4.png) ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1633/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1131295060,I_kwDOBm6k_c5DbjFU,1634,Update Dockerfile generated by `datasette publish`,9599,simonw,open,0,,,3268330,Datasette 1.0,4,2022-02-11T00:07:26Z,2022-03-11T17:38:08Z,,OWNER,,"The generated `Dockerfile` currently looks something like this: ```Dockerfile FROM python:3.8 COPY . /app WORKDIR /app ENV DATASETTE_SECRET 'edab49cbc5d5f6f33238f54852037e3fee710821960b73edd2ce743454182ae2' RUN pip install -U datasette datasette-auth-passwords datasette-tiddlywiki datasette-graphql RUN datasette inspect fixtures.db other.db --inspect-file inspect-data.json ENV PORT 8080 EXPOSE 8080 CMD datasette serve --host 0.0.0.0 -i fixtures.db -i other.db --cors --inspect-file inspect-data.json --metadata metadata.json --create --port $PORT /data/*.db ``` This is still on Python 3.8, and it generates a pretty large image compared to the `Dockerfile` used for https://hub.docker.com/datasetteproject/datasette - https://github.com/simonw/datasette/blob/0.60.2/Dockerfile Here's the code that generates it: https://github.com/simonw/datasette/blob/7d24fd405f3c60e4c852c5d746c91aa2ba23cf5b/datasette/utils/__init__.py#L389-L400",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1634/reactions"", ""total_count"": 2, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 2, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1157182254,I_kwDOBm6k_c5E-TMu,1646,Configuration directory mode does not pick up other file extensions than .db,15640196,dnsos,closed,0,,,,,3,2022-03-02T13:15:23Z,2022-10-07T23:06:17Z,2022-10-07T23:03:35Z,NONE,,"Hello, I've been trying to run Datasette with the [configuration directory mode](https://docs.datasette.io/en/stable/settings.html#configuration-directory-mode) with a structure such as this one: ```plain some-directory/ example.sqlite3 another-example.db one-more.custom [...] ``` (In my scenario I can't just change the filename extension without other problems arising) Now databases with the `.sqlite3` or the custom filename extension are ignored by Datasette in this case. I'm aware that the docs state that a `.db` extension is required, but I was wondering if there is a reason for restricting this or any workaround available? When I run `datasette example.sqlite3` or `datasette one-more.custom` the databases are served by Datasette without a problem.
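A minimal sketch of the behaviour I am hoping for (illustrative names - not Datasette's actual implementation):

```python
# Hypothetical sketch: treat several common SQLite extensions as databases
# when scanning a configuration directory, not just .db
from pathlib import Path

DB_EXTENSIONS = {'.db', '.sqlite', '.sqlite3'}

def find_databases(config_dir):
    return sorted(
        path for path in Path(config_dir).iterdir()
        if path.suffix in DB_EXTENSIONS
    )
```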
",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1646/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1142107925,I_kwDOBm6k_c5EEy8V,1638,`filters_from_request` plugin hook docs should mention that returning an async function is allowed,9599,simonw,open,0,,,,,0,2022-02-18T00:08:26Z,2022-02-18T00:08:26Z,,OWNER,,"https://docs.datasette.io/en/stable/plugin_hooks.html#filters-from-request-request-database-table-datasette doesn't mention that you can return an `async` function - but you can, and in fact Datasette itself uses that here: https://github.com/simonw/datasette/blob/aa7f0037a46eb76ae6fe9bf2a1f616c58738ecdf/datasette/filters.py#L43-L47",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1638/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1148638868,I_kwDOBm6k_c5EdtaU,1639,Make datasette-redirect-forbidden unneccessary,9599,simonw,open,0,,,,,0,2022-02-23T22:18:46Z,2022-02-23T22:18:46Z,,OWNER,,"I wrote `datasette-redirect-forbidden` today because I needed 403 errors to redirect to `/-/login` and it was the quickest way to solve that problem. This should be a feature of Datasette core. - https://github.com/simonw/datasette-redirect-forbidden/issues/2",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1639/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1148725876,I_kwDOBm6k_c5EeCp0,1640,"Support static assets where file length may change, e.g. logs",57859326,broccolihighkicks,open,0,,,,,2,2022-02-24T00:34:42Z,2022-03-05T01:19:25Z,,NONE,,"This is a bit of an oxymoron. I am serving a log.txt file for a background process using the Datasette --static CLI. This is useful as I can observe a background process from the web UI to see any errors that occur (instead of spelunking the logs via docker exec/ssh etc). I get this error, which I think is because Datasette assumes that the size of the content does not change (but appending new log lines means the content length changes). 
```python Traceback (most recent call last): File ""/usr/local/lib/python3.9/site-packages/datasette/app.py"", line 1181, in route_path response = await view(request, send) File ""/usr/local/lib/python3.9/site-packages/datasette/utils/asgi.py"", line 305, in inner_static await asgi_send_file(send, full_path, chunk_size=chunk_size) File ""/usr/local/lib/python3.9/site-packages/datasette/utils/asgi.py"", line 280, in asgi_send_file await send( File ""/usr/local/lib/python3.9/site-packages/asgi_csrf.py"", line 104, in wrapped_send await send(event) File ""/usr/local/lib/python3.9/site-packages/uvicorn/protocols/http/h11_impl.py"", line 460, in send output = self.conn.send(event) File ""/usr/local/lib/python3.9/site-packages/h11/_connection.py"", line 468, in send data_list = self.send_with_data_passthrough(event) File ""/usr/local/lib/python3.9/site-packages/h11/_connection.py"", line 501, in send_with_data_passthrough writer(event, data_list.append) File ""/usr/local/lib/python3.9/site-packages/h11/_writers.py"", line 58, in __call__ self.send_data(event.data, write) File ""/usr/local/lib/python3.9/site-packages/h11/_writers.py"", line 78, in send_data raise LocalProtocolError(""Too much data for declared Content-Length"") h11._util.LocalProtocolError: Too much data for declared Content-Length ERROR: Exception in ASGI application Traceback (most recent call last): File ""/usr/local/lib/python3.9/site-packages/datasette/app.py"", line 1181, in route_path response = await view(request, send) File ""/usr/local/lib/python3.9/site-packages/datasette/utils/asgi.py"", line 305, in inner_static await asgi_send_file(send, full_path, chunk_size=chunk_size) File ""/usr/local/lib/python3.9/site-packages/datasette/utils/asgi.py"", line 280, in asgi_send_file await send( File ""/usr/local/lib/python3.9/site-packages/asgi_csrf.py"", line 104, in wrapped_send await send(event) File ""/usr/local/lib/python3.9/site-packages/uvicorn/protocols/http/h11_impl.py"", line 460, in send output = self.conn.send(event) File ""/usr/local/lib/python3.9/site-packages/h11/_connection.py"", line 468, in send data_list = self.send_with_data_passthrough(event) File ""/usr/local/lib/python3.9/site-packages/h11/_connection.py"", line 501, in send_with_data_passthrough writer(event, data_list.append) File ""/usr/local/lib/python3.9/site-packages/h11/_writers.py"", line 58, in __call__ self.send_data(event.data, write) File ""/usr/local/lib/python3.9/site-packages/h11/_writers.py"", line 78, in send_data raise LocalProtocolError(""Too much data for declared Content-Length"") h11._util.LocalProtocolError: Too much data for declared Content-Length ``` Thanks, I am finding Datasette very useful.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1640/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1149310456,I_kwDOBm6k_c5EgRX4,1641,Tweak mobile keyboard settings,9599,simonw,open,0,,,,,1,2022-02-24T13:47:10Z,2022-02-24T13:49:26Z,,OWNER,,"https://developer.apple.com/library/archive/documentation/StringsTextFonts/Conceptual/TextAndWebiPhoneOS/KeyboardManagement/KeyboardManagement.html#//apple_ref/doc/uid/TP40009542-CH5-SW12 `autocorrect=""off""` is worth experimenting with. 
Twitter: https://twitter.com/forestgregg/status/1496842959563726852",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1641/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1152072027,I_kwDOBm6k_c5Eqzlb,1642,Dependency issue with asgiref and uvicorn,9599,simonw,closed,0,,,,,1,2022-02-26T18:00:35Z,2022-03-05T01:11:27Z,2022-03-05T01:11:17Z,OWNER,,"``` ERROR: After October 2020 you may experience errors when installing or updating packages. This is because pip will change the way that it resolves dependency conflicts. We recommend you use --use-feature=2020-resolver to test your packages with the new resolver before it becomes the default. datasette 0.60.2 requires asgiref<3.5.0,>=3.2.10, but you'll have asgiref 3.5.0 which is incompatible. ``` That's after I forced an upgrade of `uvicorn` due to this warning: ``` ERROR: After October 2020 you may experience errors when installing or updating packages. This is because pip will change the way that it resolves dependency conflicts. We recommend you use --use-feature=2020-resolver to test your packages with the new resolver before it becomes the default. uvicorn 0.13.1 requires click==7.*, but you'll have click 8.0.4 which is incompatible. ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1642/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1154399841,I_kwDOBm6k_c5Ezr5h,1645,"Sensible `cache-control` headers for static assets, including those served by plugins",697092,curiousleo,open,0,,,3268330,Datasette 1.0,4,2022-02-28T18:12:03Z,2022-03-08T02:59:29Z,,NONE,,"## What I'm seeing With `default_cache_ttl = 86400`, I see the following: A table view returns `Cache-control: max-age=86400`: ![Screenshot_20220228_190000](https://user-images.githubusercontent.com/697092/156034352-4d64683e-39c8-49af-81df-0217a5957bbd.png) A static asset returns no `Cache-control` header: ![Screenshot_20220228_185933](https://user-images.githubusercontent.com/697092/156034363-d0b03cc2-5889-4ed2-b601-8c1846b8469a.png) ## What I expected to see I expected the static asset to return a `Cache-control` header indicating that this response can be cached. ## Why this matters I'm productionising a Datasette deployment right now and was looking into putting it behind a Varnish instance. I was surprised to see requests for static assets being served from Datasette rather than Varnish, this is what led me to look more closely at the response headers. While Datasette serves those static assets pretty quickly, I don't see why Datasette should serve them. By their nature, static assets like images and JS files are very cacheable, so it should be easy to serve them from a cache like Varnish. (Note that Varnish can easily be configured to override this header, enabling caching for static assets. But it would be better if this override was not necessary.) ## Discussion It seems clear to me that serving static assets without a `Cache-control` header is not ideal. I see two options here: A. Static assets use the same logic as table / SQL views to set the `Cache-control` header based on `default_cache_ttl`. B. 
An additional setting for static assets is introduced (`default_static_cache_ttl`, say).",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1645/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1160407071,I_kwDOBm6k_c5FKmgf,1647,Test failures with SQLite 3.37.0+ due to column affinity case,9599,simonw,closed,0,,,,,5,2022-03-05T17:37:46Z,2022-03-05T19:56:28Z,2022-03-05T19:47:04Z,OWNER,,"These three tests are failing on my local machine: ``` FAILED tests/test_internals_database.py::test_table_column_details[facetable-expected0] - AssertionError: assert [Column(cid=0, name='pk', type='INTEGER', no... FAILED tests/test_internals_database.py::test_table_column_details[sortable-expected1] - AssertionError: assert [Column(cid=0, name='pk1', type='varchar(30)'... FAILED tests/test_table_html.py::test_sort_links - AssertionError: assert [{'a_href': None,\n 'attrs': {'class': ['col-Link'],\n 'data-column': '... ``` I ran `pytest --lf -vv` and the output had things like this in it: ``` E - Column(cid=1, name='created', type='text', notnull=0, default_value=None, is_pk=0, hidden=0), E ? ^^^^ E + Column(cid=1, name='created', type='TEXT', notnull=0, default_value=None, is_pk=0, hidden=0), ... E {'a_href': '/fixtures/sortable?_sort=sortable_with_nulls_2', E 'attrs': {'class': ['col-sortable_with_nulls_2'], E 'data-column': 'sortable_with_nulls_2', E 'data-column-not-null': '0', E - 'data-column-type': 'real', E ? ^^^^ E + 'data-column-type': 'REAL', E ? ^^^^ ``` Something is causing column types to come back in uppercase where previously they were lowercase.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1647/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1160750713,I_kwDOBm6k_c5FL6Z5,1650,Implement redirects from old % encoding to new dash encoding,9599,simonw,closed,0,,,3268330,Datasette 1.0,5,2022-03-06T23:40:02Z,2022-03-07T19:26:15Z,2022-03-07T19:26:14Z,OWNER,,"> One big advantage to this scheme is that redirecting old links to `%2F` pages (e.g. https://fivethirtyeight.datasettes.com/fivethirtyeight/twitter-ratio%2Fsenators) is easy - if you see a `%` in the `raw_path`, redirect to that page with the `%` replaced by `-`. 
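A sketch of that rule (hypothetical - not the shipped implementation):

```python
# Hypothetical sketch: an old-style %-encoded path becomes a redirect
# to the equivalent dash-encoded path.
def redirect_for_old_encoding(raw_path):
    if '%' in raw_path:
        return raw_path.replace('%', '-')  # target for a 302 redirect
    return None  # no redirect needed
```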
_Originally posted by @simonw in https://github.com/simonw/datasette/issues/1439#issuecomment-1060044007_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1650/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1161584460,I_kwDOBm6k_c5FPF9M,1651,Get rid of the no-longer necessary ?_format=json hack for tables called x.json,9599,simonw,closed,0,,,3268330,Datasette 1.0,8,2022-03-07T15:40:42Z,2022-03-19T04:04:50Z,2022-03-15T18:25:42Z,OWNER,,"Tidy up from: - #1439",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1651/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1161937073,I_kwDOBm6k_c5FQcCx,1653,Mechanism to default a table to sorting by multiple columns,9599,simonw,open,0,,,,,2,2022-03-07T21:20:11Z,2022-03-07T21:23:39Z,,OWNER,,"### Discussed in https://github.com/simonw/datasette/discussions/1652
Originally posted by **zaneselvans** March 7, 2022 It's easy to tell datasette to sort tables using a single column, as [described in the docs](https://docs.datasette.io/en/stable/metadata.html#setting-a-default-sort-order): ```yaml databases: ferc1: tables: f1_edcfu_epda: sort: created_time ``` But is there some way to tell it to sort using a composite key, like you would in an `ORDER BY` clause instead? For example, the way it's being done **[in this query](https://data.catalyst.coop/ferc1?sql=select%0D%0A++rowid%2C%0D%0A++respondent_id%2C%0D%0A++report_year%2C%0D%0A++spplmnt_num%2C%0D%0A++row_number%2C%0D%0A++row_seq%2C%0D%0A++row_prvlg%2C%0D%0A++acct_num%2C%0D%0A++depr_plnt_base%2C%0D%0A++est_avg_srvce_lf%2C%0D%0A++net_salvage%2C%0D%0A++apply_depr_rate%2C%0D%0A++mrtlty_crv_typ%2C%0D%0A++avg_remaining_lf%2C%0D%0A++report_prd%0D%0Afrom%0D%0A++f1_edcfu_epda%0D%0Awhere%0D%0A++respondent_id+%3D+210%0D%0A++AND+report_year+%3D+2020%0D%0Aorder+by%0D%0A++report_year%2C+report_prd%2C+respondent_id%2C+spplmnt_num%2C+row_number%0D%0Alimit%0D%0A++1000)** on our Datasette? ```sql SELECT respondent_id, report_year, spplmnt_num, row_number, row_seq, row_prvlg, acct_num, depr_plnt_base, est_avg_srvce_lf, net_salvage, apply_depr_rate, mrtlty_crv_typ, avg_remaining_lf, report_prd FROM f1_edcfu_epda WHERE respondent_id = 210 AND report_year = 2020 ORDER BY report_year, report_prd, respondent_id, spplmnt_num, row_number LIMIT 1000 ``` The problem here is that by default it's using `rowid` (the SQLite assigned autoincrementing integer key) to order the records, but the table **should** have a natural composite primary key, but the original database that this data is being migrated from doesn't enforce unique primary keys, so there are dupes, and we don't want to drop those rows, and the records are somehow getting jumbled in the database (the `rowid` ordering isn't lined up with the expected ordering based on the composite primary key, though it's close) and this jumbling is confusing to users that expect to see the data ordered based on the natural primary key. I've tried setting the `sort` metadata parameter to a list of column names, a tuple of column names, a quoted string of comma-separated column names, a quoted string of a tuple of column names... ```yaml databases: ferc1: tables: f1_edcfu_epda: sort: ""(report_year, report_prd, respondent_id, spplmnt_num, row_number)"" ``` and they all give me server errors like: ``` Cannot sort table by (report_year, report_prd, respondent_id, spplmnt_num, row_number) ```
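One workaround that might unblock this, sketched with `sqlite-utils` (an assumption on my part, not a documented fix - the table and column names come from the example above): rebuild the table so that `rowid` order matches the natural composite key.

```python
# Sketch: materialize the rows in composite-key order so the default
# rowid ordering lines up with the natural primary key.
import sqlite_utils

db = sqlite_utils.Database('ferc1.db')
db.execute('''
    CREATE TABLE f1_edcfu_epda_sorted AS
    SELECT * FROM f1_edcfu_epda
    ORDER BY report_year, report_prd, respondent_id, spplmnt_num, row_number
''')
db.conn.commit()
```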
",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1653/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1161969891,I_kwDOBm6k_c5FQkDj,1654,Adopt a code of conduct,9599,simonw,closed,0,,,,,5,2022-03-07T22:00:24Z,2022-03-07T22:19:35Z,2022-03-07T22:19:35Z,OWNER,,"This is long overdue, especially given the size of the project now.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1654/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1163369515,I_kwDOBm6k_c5FV5wr,1655,query result page is using 400mb of browser memory 40x size of html page and 400x size of csv data,536941,fgregg,open,0,,,,,8,2022-03-09T00:56:40Z,2023-10-17T21:53:17Z,,CONTRIBUTOR,,"[this page](https://labordata.bunkum.us/opdr-8335ea3?sql=with+most_recent_lu+as+%28%0D%0A++select%0D%0A++++*%0D%0A++from%0D%0A++++%28%0D%0A++++++select%0D%0A++++++++*%0D%0A++++++from%0D%0A++++++++lm_data%0D%0A++++++order+by%0D%0A++++++++f_num%2C%0D%0A++++++++receive_date+desc%0D%0A++++%29+t%0D%0A++group+by%0D%0A++++f_num%0D%0A%29%0D%0Aselect%0D%0A++aff_abbr+%7C%7C+coalesce%28%27+local+%27+%7C%7C+desig_num%2C+%27+%27+%7C%7C+unit_name%29+as+abbr_local_name%2C%0D%0A++coalesce%28%0D%0A++++regexp_match%28%27%28.*%3F%29%28%2C%3F+AFL-CIO%24%29%27%2C+union_name%29%2C%0D%0A++++regexp_match%28%27%28.*%3F%29%28+IND%24%29%27%2C+union_name%29%2C%0D%0A++++union_name%0D%0A++%29+%7C%7C+coalesce%28%27+local+%27+%7C%7C+desig_num%2C+%27+%27+%7C%7C+unit_name%29+as+full_local_name%2C%0D%0A++*%0D%0Afrom%0D%0A++most_recent_lu%0D%0Awhere+%28desig_num+IS+NOT+NULL+OR+unit_name+IS+NOT+NULL%29+AND+desig_name+%21%3D+%27HQ%27%0D%0Alimit%0D%0A++5000+offset+0) is using about 400 mb in firefox 97 on mac os x. if you download the html for the page, it's about 11mb and if you get the csv for the data its about 1mb. it's using over a 1G on chrome 99. i found this because, i was trying to figure out why editing the SQL was getting very slow. ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1655/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1174162781,I_kwDOBm6k_c5F_E1d,1666,Refactor URL routing to enable testing,9599,simonw,closed,0,,,3268330,Datasette 1.0,3,2022-03-19T03:52:29Z,2022-03-19T16:32:03Z,2022-03-19T16:32:03Z,OWNER,,"I ran into some bugs earlier with URL routing - having more robust testing around this (especially since they are defined using regular expressions) would be really useful. - A utility function that resolves a path against a list of reflexes and returns the match - Make the routes and regular expressions available from a private Datasette method - Add tests that exercise them Related: - #1660",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1666/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1174404647,I_kwDOBm6k_c5F__4n,1669,Release 0.61 alpha,9599,simonw,closed,0,,,,,2,2022-03-20T00:35:35Z,2022-03-20T01:24:36Z,2022-03-20T01:24:36Z,OWNER,,"> I'm going to release this as a 0.61 alpha so I can more easily depend on it from `datasette-hashed-urls`. 
_Originally posted by @simonw in https://github.com/simonw/datasette/issues/1668#issuecomment-1073136896_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1669/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1174306154,I_kwDOBm6k_c5F_n1q,1668,"Introduce concept of a database `route`, separate from its name",9599,simonw,closed,0,,,3268330,Datasette 1.0,20,2022-03-19T16:48:28Z,2022-03-20T16:43:16Z,2022-03-20T16:43:16Z,OWNER,,"Some issues came up in the new `datasette-hashed-urls` plugin relating to the way it renames databases on startup to achieve unique URLs that depend on the database SHA-256 content: - https://github.com/simonw/datasette-hashed-urls/issues/10 - https://github.com/simonw/datasette-hashed-urls/issues/9 - https://github.com/simonw/datasette-hashed-urls/issues/8 All three of these could be addressed by making the ""path"" concept for a database (the `/foo` bit where it is served) work independently of the database's name, which would be used for default display and also as the alias when configuring cross-database aliases.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1668/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1174302994,I_kwDOBm6k_c5F_nES,1667,Make route matched pattern groups more consistent,9599,simonw,closed,0,,,3268330,Datasette 1.0,3,2022-03-19T16:32:35Z,2022-03-19T20:37:42Z,2022-03-19T20:37:41Z,OWNER,,"> ... highlights how inconsistent the way the capturing works is. Especially `as_format` which can be `None` or `""""` or `.json` or `json` or not used at all in the case of `TableView`. https://github.com/simonw/datasette/blob/764738dfcb16cd98b0987d443f59d5baa9d3c332/tests/test_routes.py#L12-L36 _Originally posted by @simonw in https://github.com/simonw/datasette/issues/1666#issuecomment-1073039670_ Part of: - #1660",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1667/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1168995756,I_kwDOBm6k_c5FrXWs,1657,Tilde encoding: use ~ instead of - for dash-encoding,9599,simonw,closed,0,,,3268330,Datasette 1.0,12,2022-03-14T22:55:17Z,2022-03-15T18:25:11Z,2022-03-15T18:01:58Z,OWNER,,Refs #1439,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1657/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1169840669,I_kwDOBm6k_c5Fulod,1658,Revert main to version that passes tests,9599,simonw,closed,0,,,3268330,Datasette 1.0,1,2022-03-15T15:37:02Z,2022-03-19T04:04:50Z,2022-03-15T15:42:58Z,OWNER,,"> I've made a real mess of this. I'm going to revert Datasette`main` back to the last commit that passed the tests and try this again in a branch. 
_Originally posted by @simonw in https://github.com/simonw/datasette/issues/1657#issuecomment-1068125636_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1658/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1170144879,I_kwDOBm6k_c5Fvv5v,1660,Refactor and simplify Datasette routing and views,9599,simonw,closed,0,,,3268330,Datasette 1.0,8,2022-03-15T19:56:56Z,2022-03-21T19:19:12Z,2022-03-21T19:19:01Z,OWNER,,"While working on: - https://github.com/simonw/datasette/issues/1657 - https://github.com/simonw/datasette/issues/1439 It became very clear that the least maintainable part of Datasette at the moment is the way routing to the database, table and row views works - in particular the subclassing mechanism with BaseView and DataView, but also the complex variety of ways in which the URL routes capture different named regular expression groups.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1660/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1170355774,I_kwDOBm6k_c5FwjY-,1661,Remove Hashed URL mode,9599,simonw,closed,0,,,3268330,Datasette 1.0,10,2022-03-15T23:13:56Z,2022-03-19T00:37:37Z,2022-03-19T00:37:36Z,OWNER,,"It's now handled by a plugin instead: - #647 - https://github.com/simonw/datasette-hashed-urls/issues/3 https://github.com/simonw/datasette-hashed-urls Sub-tasks: - [x] Remove hashed URL mode implementation - [x] Update documentation - [x] Ensure `--setting hash_urls 1` shows a useful message",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1661/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1170497629,I_kwDOBm6k_c5FxGBd,1662,[feature request] Publish to fully static website,32609395,contrun,closed,0,,,,,1,2022-03-16T03:32:28Z,2022-03-19T00:42:23Z,2022-03-19T00:42:23Z,NONE,,"It seems that currently all `datasette publish` options require a real backend server which is able to query the database and send results back to the frontend. There are a few projects that download portions of data on demand from a SQLite database URL and present it directly to the user. These methods leverage WebAssembly under the hood. I think Datasette is a perfect use case for this technology. Below are a few examples of querying a SQLite database directly from the frontend.
* [Using sqlite3 as a notekeeping document graph with automatic reference indexing](https://epilys.github.io/bibliothecula/notekeeping.html) * [Hosting SQLite databases on Github Pages - (or any static file hoster) - phiresky's blog](https://phiresky.github.io/blog/2021/hosting-sqlite-databases-on-github-pages/) * [Static torrent website with peer-to-peer queries over BitTorrent on 2M records](https://boredcaveman.xyz/post/0x2_static-torrent-website-p2p-queries.html)",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1662/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1170554975,I_kwDOBm6k_c5FxUBf,1663,Document the internals that were used in datasette-hashed-urls,9599,simonw,closed,0,,,3268330,Datasette 1.0,2,2022-03-16T05:17:08Z,2022-03-19T04:04:50Z,2022-03-17T21:32:38Z,OWNER,,"The https://github.com/simonw/datasette-hashed-urls used a couple of currently undocumented features: - `db.hash` - `Datasette(..., immutables=[...])`",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1663/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1190828163,I_kwDOBm6k_c5G-piD,1698,Add a warning about bots and Cloud Run,9599,simonw,closed,0,,,,,1,2022-04-03T05:57:17Z,2022-04-03T06:10:24Z,2022-04-03T06:10:24Z,OWNER,,Recommend the https://github.com/simonw/datasette-block-robots plugin if you are going to run a large database in Cloud Run (one with a lot of rows).,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1698/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1189113609,I_kwDOBm6k_c5G4G8J,1697,"`Request.fake(..., url_vars={})`",9599,simonw,closed,0,,,3268330,Datasette 1.0,1,2022-04-01T01:48:40Z,2022-04-01T02:02:18Z,2022-04-01T02:02:10Z,OWNER,,"I just created an alternative `.fake()` method because I wanted to fake the `url_vars` captured in the route as well: ```python from datasette.utils.asgi import Request class Request(Request): @classmethod def fake(cls, path_with_query_string, method=""GET"", scheme=""http"", url_vars=None): """"""Useful for constructing Request objects for tests"""""" path, _, query_string = path_with_query_string.partition(""?"") scope = { ""http_version"": ""1.1"", ""method"": method, ""path"": path, ""raw_path"": path_with_query_string.encode(""latin-1""), ""query_string"": query_string.encode(""latin-1""), ""scheme"": scheme, ""type"": ""http"", } if url_vars: scope[""url_route""] = { ""kwargs"": url_vars } return cls(scope, None) ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1697/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1174655187,I_kwDOBm6k_c5GA9DT,1671,Filters fail to work correctly against calculated numeric columns returned by SQL views because type affinity rules do not apply,9308268,rayvoelker,open,0,,,,,8,2022-03-20T19:17:24Z,2022-03-22T17:43:12Z,,NONE,,"I found a strange behavior, and I'm not sure if it's related to views and boolean values perhaps, or if there's something else weird going on here, but I'll provide an example that may help show what I'm 
seeing happen. ```bash #!/bin/bash echo ""\""id\"",\""expiration_date\"" 0,2018-01-04 1,2019-01-05 2,2020-01-06 3,2021-01-07 4,2022-01-08 5,2023-01-09 6,2024-01-10 7,2025-01-11 8,2026-01-12 9,2027-01-13 "" > test.csv csvs-to-sqlite test.csv test.db sqlite-utils create-view --replace test.db test_view ""select id, expiration_date, case when julianday('NOW') >= julianday(expiration_date) then 1 else 0 end as has_expired FROM test"" ``` ```bash datasette test.db ``` ![image](https://user-images.githubusercontent.com/9308268/159178745-9c6152f7-eac6-4bf9-bef5-a2d63d3ee13f.png) ![image](https://user-images.githubusercontent.com/9308268/159178824-c8952137-270c-42a4-ad1c-f6ad2c51e499.png) ![image](https://user-images.githubusercontent.com/9308268/159178877-23e00b36-443a-43ef-83e5-e0bdddd3fdcd.png) ![image](https://user-images.githubusercontent.com/9308268/159178918-65922cc7-2514-4735-a72d-4904b99976d4.png) Thanks again and let me know if you want me to provide anything else!",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1671/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1174423568,I_kwDOBm6k_c5GAEgQ,1670,Ship Datasette 0.61,9599,simonw,closed,0,,,,,6,2022-03-20T02:47:54Z,2022-03-23T18:32:32Z,2022-03-23T18:32:03Z,OWNER,,"Let the alpha bake for a while, since #1668 is a big last-minute change. After shipping, release a new `datasette-hashed-urls` that depends on it, also this: - https://github.com/simonw/datasette-hashed-urls/issues/11",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1670/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1174697144,I_kwDOBm6k_c5GBHS4,1672,Refactor CSV handling code out of DataView,9599,simonw,open,0,,,3268330,Datasette 1.0,1,2022-03-20T21:47:00Z,2022-03-20T21:52:39Z,,OWNER,,"> I think the way to get rid of most of the remaining complexity in `DataView` is to refactor how CSV stuff works - pulling it in line with other export factors and extracting the streaming mechanism. Opening a fresh issue for that. _Originally posted by @simonw in https://github.com/simonw/datasette/issues/1660#issuecomment-1073355032_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1672/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1174708375,I_kwDOBm6k_c5GBKCX,1673,Streaming CSV spends a lot of time in `table_column_details`,9599,simonw,open,0,,,,,1,2022-03-20T22:25:28Z,2022-03-20T22:34:06Z,,OWNER,,"At least I think it does. 
I tried running `py-spy top -p $PID` against a Datasette process that was trying to do: datasette covid.db --get '/covid/ny_times_us_counties.csv?_size=10&_stream=on' While investigating: - #1355 And spotted this: ``` datasette covid.db --get /covid/ny_times_us_counties.csv?_size=10&_stream=on' (python v3.10.2) Total Samples 5800 GIL: 71.00%, Active: 98.00%, Threads: 4 %Own %Total OwnTime TotalTime Function (filename:line) 8.00% 8.00% 4.32s 4.38s sql_operation_in_thread (datasette/database.py:212) 5.00% 5.00% 3.77s 3.93s table_column_details (datasette/utils/__init__.py:614) 6.00% 6.00% 3.72s 3.72s _worker (concurrent/futures/thread.py:81) 7.00% 7.00% 2.98s 2.98s _read_from_self (asyncio/selector_events.py:120) 5.00% 6.00% 2.35s 2.49s detect_fts (datasette/utils/__init__.py:571) 4.00% 4.00% 1.34s 1.34s _write_to_self (asyncio/selector_events.py:140) ``` Relevant code: https://github.com/simonw/datasette/blob/798f075ef9b98819fdb564f9f79c78975a0f71e8/datasette/utils/__init__.py#L609-L625 ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1673/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1174717287,I_kwDOBm6k_c5GBMNn,1674,Tweak design of /.json,9599,simonw,open,0,,,3268330,Datasette 1.0,1,2022-03-20T22:58:01Z,2022-03-20T22:58:40Z,,OWNER,,"https://latest.datasette.io/.json Currently: ```json { ""_memory"": { ""name"": ""_memory"", ""hash"": null, ""color"": ""a6c7b9"", ""path"": ""/_memory"", ""tables_and_views_truncated"": [], ""tables_and_views_more"": false, ""tables_count"": 0, ""table_rows_sum"": 0, ""show_table_row_counts"": false, ""hidden_table_rows_sum"": 0, ""hidden_tables_count"": 0, ""views_count"": 0, ""private"": false }, ""fixtures"": { ""name"": ""fixtures"", ""hash"": ""645005884646eb941c89997fbd1c0dd6be517cb1b493df9816ae497c0c5afbaa"", ""color"": ""645005"", ""path"": ""/fixtures"", ""tables_and_views_truncated"": [ { ""name"": ""compound_three_primary_keys"", ""columns"": [ ""pk1"", ""pk2"", ""pk3"", ""content"" ], ""primary_keys"": [ ""pk1"", ""pk2"", ""pk3"" ], ""count"": 1001, ""hidden"": false, ""fts_table"": null, ""num_relationships_for_sorting"": 0, ""private"": false }, ``` As-of this issue the `""path""` key is confusing, it doesn't match what https://latest.datasette.io/-/databases returns: - #1668",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1674/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1175690070,I_kwDOBm6k_c5GE5tW,1676,"Reconsider ensure_permissions() logic, can it be less confusing?",9599,simonw,open,0,,,3268330,Datasette 1.0,3,2022-03-21T17:14:57Z,2022-12-02T01:23:40Z,,OWNER,,"> Updated documentation: https://github.com/simonw/datasette/blob/e627510b760198ccedba9e5af47a771e847785c9/docs/internals.rst#await-ensure_permissionsactor-permissions > >> This method allows multiple permissions to be checked at onced. It raises a `datasette.Forbidden` exception if any of the checks are denied before one of them is explicitly granted. >> >> This is useful when you need to check multiple permissions at once. For example, an actor should be able to view a table if either one of the following checks returns `True` or not a single one of them returns `False`: > > That's pretty hard to understand! 
I'm going to open a separate issue to reconsider if this is a useful enough abstraction given how confusing it is. _Originally posted by @simonw in https://github.com/simonw/datasette/issues/1675#issuecomment-1074177827_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1676/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1175694248,I_kwDOBm6k_c5GE6uo,1677,Remove `check_permission()` from `BaseView`,9599,simonw,closed,0,,,3268330,Datasette 1.0,1,2022-03-21T17:18:18Z,2022-03-21T18:45:04Z,2022-03-21T18:45:03Z,OWNER,,"Follow-on from: - #1675 Refs: - #1660",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1677/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1175648453,I_kwDOBm6k_c5GEvjF,1675,Extract out `check_permissions()` from `BaseView`,9599,simonw,closed,0,,,,,7,2022-03-21T16:39:46Z,2022-03-21T17:14:31Z,2022-03-21T17:13:21Z,OWNER,,"> I'm going to refactor this stuff out and document it so it can be easily used by plugins: https://github.com/simonw/datasette/blob/4a4164b81191dec35e423486a208b05a9edc65e4/datasette/views/base.py#L69-L103 _Originally posted by @simonw in https://github.com/simonw/datasette/issues/1660#issuecomment-1074136176_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1675/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1175715988,I_kwDOBm6k_c5GFACU,1678,Make `check_visibility()` a documented API,9599,simonw,closed,0,,,3268330,Datasette 1.0,1,2022-03-21T17:30:34Z,2022-03-21T19:04:03Z,2022-03-21T19:01:46Z,OWNER,,"Spotted this while working on: - #1677 https://github.com/simonw/datasette/blob/e627510b760198ccedba9e5af47a771e847785c9/datasette/utils/__init__.py#L1005-L1021",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1678/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1175854982,I_kwDOBm6k_c5GFh-G,1679,Research: how much overhead does the n=1 time limit have?,9599,simonw,closed,0,,,3268330,Datasette 1.0,11,2022-03-21T19:27:46Z,2022-03-21T21:55:57Z,2022-03-21T21:55:56Z,OWNER,,"https://github.com/simonw/datasette/blob/1a7750eb29fd15dd2eea3b9f6e33028ce441b143/datasette/utils/__init__.py#L181-L200 ```python @contextmanager def sqlite_timelimit(conn, ms): deadline = time.perf_counter() + (ms / 1000) # n is the number of SQLite virtual machine instructions that will be # executed between each check. It's hard to know what to pick here. # After some experimentation, I've decided to go with 1000 by default and # 1 for time limits that are less than 50ms n = 1000 if ms < 50: n = 1 def handler(): if time.perf_counter() >= deadline: return 1 conn.set_progress_handler(handler, n) try: yield finally: conn.set_progress_handler(None, n) ``` How often do I set a time limit of 50 or less?
How much slower does it go thanks to this code?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1679/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1175894898,I_kwDOBm6k_c5GFrty,1680,Consider simplifying permissions for 1.0,9599,simonw,open,0,,,3268330,Datasette 1.0,0,2022-03-21T20:17:29Z,2022-03-21T20:17:29Z,,OWNER,,"Permission checks right now can express one of three opinions: - `False` means ""do not grant this permission"" - `True` means ""grant this permission"" - `None` means ""I have no opinion"" But... there's also a concept of a ""default"" for a given permission check, which might be `False` or `True`. I worry this is too complicated. Could this be simplified before 1.0? In particular the default concept. See also: - #1676 ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1680/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1177101697,I_kwDOBm6k_c5GKSWB,1681,Potential bug in numeric handling where_clause for filters,9599,simonw,open,0,,,,,2,2022-03-22T17:43:50Z,2022-03-22T17:49:09Z,,OWNER,,"> Note that Datasette does already have special logic to convert parameters to integers for numeric comparisons like `>`: > > https://github.com/simonw/datasette/blob/c4c9dbd0386e46d2bf199f0ed34e4895c98cb78c/datasette/filters.py#L203-L212 > > Though... it looks like there's a bug in that? It doesn't account for `float` values - `""3.5"".isdigit()` returns `False` - probably for the best, because `int(3.5)` would break that value anyway. _Originally posted by @simonw in https://github.com/simonw/datasette/issues/1671#issuecomment-1075432283_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1681/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1178521513,I_kwDOBm6k_c5GPs-p,1682,SQL queries against databases with different routes are broken,9599,simonw,closed,0,,,,,1,2022-03-23T18:42:57Z,2022-03-23T18:48:16Z,2022-03-23T18:48:16Z,OWNER,,"500 error on https://datasette-hashed-urls-preview.vercel.app/fixtures-09f8f95?sql=select+*+from+facetable Here's the trace: ``` File ""/Users/simon/.local/share/virtualenvs/datasette-hashed-urls-ssI2fO50/lib/python3.10/site-packages/datasette/views/database.py"", line 54, in data return await QueryView(self.ds).data( File ""/Users/simon/.local/share/virtualenvs/datasette-hashed-urls-ssI2fO50/lib/python3.10/site-packages/datasette/views/database.py"", line 232, in data self.ds.get_database(database), sql File ""/Users/simon/.local/share/virtualenvs/datasette-hashed-urls-ssI2fO50/lib/python3.10/site-packages/datasette/app.py"", line 401, in get_database return self.databases[name] KeyError: 'fixtures-aa7318b' ``` It looks like this is a Datasette bug, which is frustrating because I just shipped Datasette 0.61 five minutes ago!
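A sketch of the likely shape of the fix (hypothetical helper - the actual patch may differ): resolve databases by route as well as by name.

```python
# Hypothetical sketch: fall back to matching on route when the name lookup
# fails, since a database's route can differ from its name.
def get_database_by_route_or_name(datasette, route_or_name):
    for db in datasette.databases.values():
        if getattr(db, 'route', None) == route_or_name:
            return db
    return datasette.databases[route_or_name]  # may still raise KeyError
```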
_Originally posted by @simonw in https://github.com/simonw/datasette-hashed-urls/issues/13#issuecomment-1076693667_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1682/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1179928510,I_kwDOBm6k_c5GVEe-,1683,allow_facet: False should be respected by column cog menu,9599,simonw,closed,0,,,,,0,2022-03-24T19:05:06Z,2022-03-24T19:16:36Z,2022-03-24T19:16:36Z,OWNER,,"The column cog menu currently shows ""Facet by this"" even if faceting is disabled for the Datasette instance.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1683/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1179998071,I_kwDOBm6k_c5GVVd3,1684,Mechanism for disabling faceting on large tables only,9599,simonw,open,0,,,,,1,2022-03-24T20:06:11Z,2022-03-24T20:13:19Z,,OWNER,,"Forest turned off faceting on https://labordata.bunkum.us/ because it was causing performance problems on some of the huge tables - but it would be nice if it could still be an option on smaller tables such as https://labordata.bunkum.us/voluntary_recognitions-4421085/voluntary_recognitions One option: a new setting that automatically disables faceting (and facet suggestion) for tables that have either more than X rows or that are so big that the count could not be completed within the time limit.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1684/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1181037277,I_kwDOBm6k_c5GZTLd,1686,heroku bails if app name specified in datasette publish is the same as existing app,2115933,tlongers,open,0,,,,,0,2022-03-25T17:10:34Z,2022-03-25T17:10:34Z,,NONE,,"It seems that `heroku` does not accept an app overwrite triggered by specifying the app name using `datasette publish`, as below: ``` datasette publish heroku some.db --name ""jazzy-name"" ``` The resulting error has the below traceback: ``` Creating jazzy-name... !
▸ Name jazzy-name is already taken Traceback (most recent call last): File ""/opt/homebrew/bin/datasette"", line 33, in sys.exit(load_entry_point('datasette==0.60.1', 'console_scripts', 'datasette')()) File ""/opt/homebrew/Cellar/datasette/0.60.1/libexec/lib/python3.10/site-packages/click/core.py"", line 1128, in __call__ return self.main(*args, **kwargs) File ""/opt/homebrew/Cellar/datasette/0.60.1/libexec/lib/python3.10/site-packages/click/core.py"", line 1053, in main rv = self.invoke(ctx) File ""/opt/homebrew/Cellar/datasette/0.60.1/libexec/lib/python3.10/site-packages/click/core.py"", line 1659, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/opt/homebrew/Cellar/datasette/0.60.1/libexec/lib/python3.10/site-packages/click/core.py"", line 1659, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/opt/homebrew/Cellar/datasette/0.60.1/libexec/lib/python3.10/site-packages/click/core.py"", line 1395, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/opt/homebrew/Cellar/datasette/0.60.1/libexec/lib/python3.10/site-packages/click/core.py"", line 754, in invoke return __callback(*args, **kwargs) File ""/opt/homebrew/Cellar/datasette/0.60.1/libexec/lib/python3.10/site-packages/datasette/publish/heroku.py"", line 127, in heroku create_output = check_output(cmd).decode(""utf8"") File ""/opt/homebrew/Cellar/python@3.10/3.10.2/Frameworks/Python.framework/Versions/3.10/lib/python3.10/subprocess.py"", line 420, in check_output return run(*popenargs, stdout=PIPE, timeout=timeout, check=True, File ""/opt/homebrew/Cellar/python@3.10/3.10.2/Frameworks/Python.framework/Versions/3.10/lib/python3.10/subprocess.py"", line 524, in run raise CalledProcessError(retcode, process.args, subprocess.CalledProcessError: Command '['heroku', 'apps:create', 'jazzy-name', '--json']' returned non-zero exit status 1. ``` It's a solid failsafe, but does `datasette publish` have a way to force an overwrite?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1686/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1181364043,I_kwDOBm6k_c5Gai9L,1687,Make show_json.html or a similar mechanism stable for plugins,9599,simonw,open,0,,,,,0,2022-03-25T23:42:45Z,2022-03-25T23:42:45Z,,OWNER,,"I used `show_json.html` in the new `datasette-packages` plugin, which means it will break if that template changes: - https://github.com/simonw/datasette-packages/issues/3 It would be useful if it (or something like it) was documented and stable for plugins to use. Also relevant: - #878",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1687/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1181432624,I_kwDOBm6k_c5Gazsw,1688,[plugins][documentation] Is it possible to serve per-plugin static folders when writing one-off (single file) plugins?,9020979,hydrosquall,closed,0,,,,,3,2022-03-26T01:17:44Z,2022-03-27T01:01:14Z,2022-03-26T21:34:47Z,CONTRIBUTOR,,"I'm trying to make a small plugin that depends on static assets, by following the guide [here](https://docs.datasette.io/en/stable/writing_plugins.html#writing-one-off-plugins). I made a `plugins/` directory with `datasette_nteract_data_explorer.py`. I am trying to follow the example of `datasette_vega`, and serving static assets. 
I created a `statics/` directory within `plugins/` to serve my JS and CSS. https://github.com/simonw/datasette-vega/blob/00de059ab1ef77394ba9f9547abfacf966c479c4/datasette_vega/__init__.py#L13 Unfortunately, datasette doesn't seem to be able to find my assets. Input:
```bash
datasette ~/Library/Safari/History.db --plugins-dir=plugins/
```
![Image 2022-03-25 at 9 18 17 PM](https://user-images.githubusercontent.com/9020979/160218979-a3ff474b-5255-4a76-85d1-6f90ab2e3b44.jpg) Output: ![Image 2022-03-25 at 9 11 00 PM](https://user-images.githubusercontent.com/9020979/160218733-ca5144cf-f23f-43d8-a8d3-e3a871e57f3a.jpg) I suspect this issue might go away if I move away from ""one-off"" plugin mode, but it's been a while since I created a new python package so I'm not sure how much work there is to go between ""one off"" and ""packaged for PyPI"". I'd like to try to avoid needing to repackage a new `tar.gz` file and/or reinstall my library repeatedly when developing new python code.
1. Is there a way to serve static assets when using the `plugins/` directory method instead of installing plugins as a new python package?
2. If not, is there a way I can work on developing a plugin without creating and repackaging tar.gz files after every change, or is that the recommended path?
Thanks for your help! ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1688/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1182227211,I_kwDOBm6k_c5Gd1sL,1692,[plugins][feature request]: Support additional script tag attributes when loading custom JS,9020979,hydrosquall,open,0,,,,,2,2022-03-27T01:16:03Z,2022-03-30T06:14:51Z,,CONTRIBUTOR,,"## Motivation
- The build system for my new [plugin](https://github.com/hydrosquall/datasette-nteract-data-explorer) has two output JS files, one for browsers that support ES modules, one for browsers that don't. At present, I'm only passing one of them into Datasette.
- I'd like to specify the non-es-module script as a fallback for older browsers. I don't want to load it by default, because browsers will only need one, and it's heavy, so for now I'm only supporting modern browsers.
To be able to support legacy browsers without slowing down users with modern browsers, I would like to be able to set additional HTML attributes on the fallback script tag, `nomodule` and `defer`. My injected scripts should look something like this: ```html ``` ## Proposal To achieve this, I propose additional optional properties to the API accepted by the `extra_js_urls` hook and the custom JS field in the `metadata.json` [described here](https://docs.datasette.io/en/stable/custom_templates.html#custom-css-and-javascript). Under this API, I'd write something like this to get the above HTML rendered in Datasette.
```json
{
  ""extra_js_urls"": [
    {
      ""url"": ""/index.my-es-module-bundle.js"",
      ""module"": true
    },
    {
      ""url"": ""/index.my-legacy-fallback-bundle.js"",
      ""nomodule"": """",
      ""defer"": true
    }
  ]
}
```
## Resources
- [MDN on the script tag](https://developer.mozilla.org/en-US/docs/Web/HTML/Element/script)
- There may be other properties that could be added that are potentially valuable, like `async` or `referrerpolicy`, but I don't have an immediate need for those.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1692/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1182065616,I_kwDOBm6k_c5GdOPQ,1689,datasette.add_message() documentation is incorrect,9599,simonw,closed,0,,,,,1,2022-03-26T20:49:42Z,2022-03-26T21:35:57Z,2022-03-26T20:51:21Z,OWNER,,"https://docs.datasette.io/en/0.61.1/internals.html#add-message-request-message-message-type-datasette-info says: `.add_message(request, message, message_type=datasette.INFO)` But in the code it's: https://github.com/simonw/datasette/blob/6b99e4a66ba0ed8fca8ee41ceb7206928b60d5d1/datasette/app.py#L582",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1689/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1182141761,I_kwDOBm6k_c5Gdg1B,1690,"Idea: `datasette.set_actor_cookie(response, actor)`",9599,simonw,open,0,,,,,2,2022-03-26T22:41:52Z,2022-03-26T22:43:00Z,,OWNER,,"I just wrote this code in a plugin and it felt like it could benefit from an abstraction: https://github.com/simonw/datasette-auth0/blob/152e6eb21e96e9b73bd9c205f9749a1297d0ef0b/datasette_auth0/__init__.py#L79-L92
```python
redirect_response = Response.redirect(""/"")
expires_at = int(time.time()) + (24 * 60 * 60)
redirect_response.set_cookie(
    ""ds_actor"",
    datasette.sign(
        {
            ""a"": profile_response.json(),
            ""e"": baseconv.base62.encode(expires_at),
        },
        ""actor"",
    ),
)
return redirect_response
```
",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1690/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1182143895,I_kwDOBm6k_c5GdhWX,1691,Bug in pytest-httpx example,9599,simonw,closed,0,,,,,0,2022-03-26T22:45:30Z,2022-03-26T22:46:09Z,2022-03-26T22:46:09Z,OWNER,,"https://docs.datasette.io/en/0.61.1/testing_plugins.html#testing-outbound-http-calls-with-pytest-httpx says:
```python
async def test_outbound_http_call(httpx_mock):
    httpx_mock.add_response(
        url='https://www.example.com/',
        data='Hello world',
    )
```
That's wrong - `data=` should be `text=`.
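For clarity, here is the corrected version of that example - the only change is swapping `data=` for `text=`, per the linked pytest-httpx README:
```python
async def test_outbound_http_call(httpx_mock):
    httpx_mock.add_response(
        url='https://www.example.com/',
        text='Hello world',
    )
```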
https://github.com/Colin-b/pytest_httpx/blob/v0.20.0/README.md#reply-with-custom-body",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1691/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1185868354,I_kwDOBm6k_c5GrupC,1695,Option to un-filter facet not shown for `?col__exact=value`,9599,simonw,open,0,,,,,2,2022-03-30T04:44:02Z,2022-03-30T04:46:18Z,,OWNER,,"Spotted this on a page with `COUNTY__exact=Lee` in the URL: ![CleanShot 2022-03-29 at 21 41 46@2x](https://user-images.githubusercontent.com/9599/160752849-a9039343-3770-4655-920b-f19e25687a57.png) With `COUNTY=Lee` you get this instead: ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1695/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1186696202,I_kwDOBm6k_c5Gu4wK,1696,Show foreign key label when filtering,9599,simonw,open,0,,,,,2,2022-03-30T16:18:54Z,2023-01-29T20:56:20Z,,OWNER,,"For example here: 3 corresponds to ""Human Related: Other"" - it would be neat to display this in this area of the page somehow.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1696/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1193090967,I_kwDOBm6k_c5HHR-X,1699,Proposal: datasette query,25778,eyeseast,open,0,,,,,6,2022-04-05T12:36:43Z,2022-04-11T01:32:12Z,,CONTRIBUTOR,,"I started sketching out a plugin to add a `datasette query` subcommand to export data from the command line. This is based on discussions in #1356 and #1605. Before I get too far down this rabbit hole, I figure it's worth getting some feedback here (unless this should happen in `Discussions`). Here's what I'm thinking: At its most basic, it will write the results of a query to STDOUT.
```sh
datasette query -d data.db 'select * from data' > results.json
```
This isn't much improvement over using [sqlite-utils](https://github.com/simonw/sqlite-utils). To make better use of datasette and its ecosystem, run `datasette query` using a canned query defined in a `metadata.yml` file. For example, using the metadata file from [alltheplaces-datasette](https://github.com/eyeseast/alltheplaces-datasette/blob/main/metadata.yml):
```sh
cd alltheplaces-datasette
datasette query -d alltheplaces.db -m metadata.yml count_by_spider
```
That query would be good to get as CSV, and we can auto-discover metadata and databases in the current directory:
```sh
cd alltheplaces-datasette
datasette query count_by_spider -f csv
```
In this case, `count_by_spider` is a canned query defined on the `alltheplaces` database. If the same query is defined on multiple databases or it's otherwise unclear which database `query` should use, pass the `-d` or `--database` option. If a query takes parameters, I can pass them in at runtime, using the `--param` or `-p` option:
```sh
datasette query -d data.db -p value something 'select * from neighborhoods where some_column = :value'
```
I'm very interested in feedback on this, including whether it should be a plugin or in Datasette core.
(I don't have a strong opinion about this, but I'm prototyping it as a plugin to start.)",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1699/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1194790504,I_kwDOBm6k_c5HNw5o,1701,Use + for spaces instead of ~20,9599,simonw,closed,0,,,3268330,Datasette 1.0,0,2022-04-06T15:40:48Z,2022-04-06T15:55:10Z,2022-04-06T15:55:05Z,OWNER,,"Tilde encoding introduced in #1657 means that database files with spaces in the name - e.g. the Apple Mail `Envelope Index` database - end up with URLs like this: http://127.0.0.1:8001/Envelope~20Index I think this would be prettier: http://127.0.0.1:9933/Envelope+Index",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1701/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1196327155,I_kwDOBm6k_c5HToDz,1702,Be more consistent with column quoting,9599,simonw,open,0,,,,,0,2022-04-07T16:59:20Z,2022-04-07T16:59:20Z,,OWNER,,"This tutorial made me notice that Datasette is pretty inconsistent with how column quoting works: https://datasette.io/tutorials/learn-sql It has examples of each of `""table_name""` and `[table_name]` and `table_name`, and it uses single quoted values too. Datasette should generate SQL as consistently as possible to support learners. That tutorial should also provide a tiny bit of extra information about what's going on here.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1702/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1197925865,I_kwDOBm6k_c5HZuXp,1704,File PRs against incompatible plugins pinning to datasette<1.0,9599,simonw,open,0,,,3268330,Datasette 1.0,0,2022-04-08T23:15:30Z,2022-04-08T23:15:30Z,,OWNER,,"As part of the preparation for the 1.0 release, test all existing known plugins against the alpha. 
For any that break, submit a PR suggesting they pin to a version <1.0 - and include a link to the documentation on how to upgrade the plugin for 1.0.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1704/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1197926598,I_kwDOBm6k_c5HZujG,1705,How to upgrade your plugin for 1.0 documentation,9599,simonw,open,0,,,8755003,Datasette 1.0a-next,1,2022-04-08T23:16:47Z,2022-12-13T05:29:05Z,,OWNER,,"Among other things, needed by: - #1704",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1705/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1198822563,I_kwDOBm6k_c5HdJSj,1706,"[feature] immutable mode for a directory, not just individual sqlite file",9020979,hydrosquall,open,0,,,,,4,2022-04-10T00:50:57Z,2022-12-09T19:11:40Z,,CONTRIBUTOR,,"## Motivation - I have a directory of sqlite databases - I'd like to use immutable mode when opening them for better performance [docs](https://docs.datasette.io/en/0.54/performance.html#immutable-mode) - Currently using this flag throws the following error IsADirectoryError: [Errno 21] Is a directory: '/name-of-directory' ## Proposal Immutable flag works for both single files and directories datasette -i /folder-of-sqlite-files",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1706/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1200224939,I_kwDOBm6k_c5Hifqr,1707,[feature] expanded detail page,536941,fgregg,open,0,,,,,1,2022-04-11T16:29:17Z,2022-04-11T16:33:00Z,,CONTRIBUTOR,,"Right now, if click on the detail page for a row you get the info for the row and links to related tables: ![Screenshot 2022-04-11 at 12-27-26 lm20 filing](https://user-images.githubusercontent.com/536941/162786802-90ac1a71-4624-47c4-ae55-b783f4f6c92d.png) It would be very cool if there was an option to expand the rows of the related tables from within this detail view. If you had that then datasette could fulfill a pretty common use case where you want to search for an entity and get a consolidate detail view about what you know about that entity. ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1707/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1200649124,I_kwDOBm6k_c5HkHOk,1708,Datasette 1.0 alpha upcoming release notes,9599,simonw,open,0,,,8755003,Datasette 1.0a-next,2,2022-04-11T22:57:12Z,2022-12-13T05:29:06Z,,OWNER,,"I'm going to try writing the release notes first, to see if that helps unblock me. 
# ⚠️ Any release notes in this issue are a draft, and should not be treated as the real thing ⚠️ ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1708/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1200649502,I_kwDOBm6k_c5HkHUe,1709,Redesigned JSON API with ?_extra= parameters,9599,simonw,open,0,,,8755003,Datasette 1.0a-next,1,2022-04-11T22:57:49Z,2022-12-13T05:29:06Z,,OWNER,,This will be the single biggest breaking change for the 1.0 release.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1709/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1200649889,I_kwDOBm6k_c5HkHah,1710,Guide for plugin authors to upgrade their plugins for 1.0,9599,simonw,closed,0,,,,,1,2022-04-11T22:58:25Z,2022-04-11T23:04:01Z,2022-04-11T23:03:25Z,OWNER,,I'll also encourage testing against both Datasette 0.x and Datasette 1.0 using a GitHub Actions matrix.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1710/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1200650491,I_kwDOBm6k_c5HkHj7,1711,Template context powered entirely by the JSON API format,9599,simonw,open,0,,,8755003,Datasette 1.0a-next,1,2022-04-11T22:59:27Z,2022-12-13T05:29:06Z,,OWNER,,Datasette 1.0 will have a stable template context. I'm going to achieve this by refactoring the templates to work only with keys returned by the API (or some of its extras) - then the API documentation will double up as template documentation.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1711/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1202227104,I_kwDOBm6k_c5HqIeg,1712,"Make """" easier to read",9599,simonw,closed,0,,,,,3,2022-04-12T18:17:07Z,2022-04-12T19:12:22Z,2022-04-12T18:44:20Z,OWNER,,"`Binary: 2,427,344 bytes` would be nicer - even better, include a tooltip showing that size translated using this function: https://github.com/simonw/datasette/blob/138e4d9a53e3982137294ba383303c3a848cfca4/datasette/utils/__init__.py#L837-L846 ![CleanShot 2022-04-12 at 11 15 04@2x](https://user-images.githubusercontent.com/9599/163027324-b0b6092e-6e11-438b-8077-789025d0bb37.png) ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1712/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1203943272,I_kwDOBm6k_c5Hwrdo,1713,Datasette feature for publishing snapshots of query results,9599,simonw,open,0,,,,,5,2022-04-14T01:42:00Z,2022-07-04T05:16:35Z,,OWNER,,"https://twitter.com/simonw/status/1514392335718645760 > Maybe [@datasetteproj](https://twitter.com/datasetteproj) should grow a feature that lets you cache the results of a query and give that snapshot a stable permalink > > A plugin that publishes the JSON output of a query to an S3 bucket would be pretty neat... especially if it could also be configured to re-publish the results on a schedule A lot of people said they would find this useful. 
Probably going to build this as a plugin.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1713/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1221849746,I_kwDOBm6k_c5I0_KS,1732,Custom page variables aren't decoded,52649,tannewt,open,0,,,,,2,2022-04-30T14:55:46Z,2022-05-03T01:50:45Z,,NONE,,"I have a page `templates/filer/{filer_id}.html`. It uses `filer_id` in a `sql()` call to fetch data. With 0.61.1 this no longer works because the spaces in IDs isn't preserved. Instead, the escaped version is passed into the template and the id isn't present in my db. Datasette should unescape the url component before passing them into the template.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1732/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1223234932,I_kwDOBm6k_c5I6RV0,1733,Get Datasette compatible with Pyodide,9599,simonw,closed,0,,,,,9,2022-05-02T19:01:58Z,2022-05-04T15:14:01Z,2022-05-02T20:15:27Z,OWNER,,"I've already got this working as a prototype. Here are the changes I had to make: - Replace the two dependencies that don't publish pure Python wheels to PyPI: `click-default-group` and `python-baseconv` - Get Datasette to work without threading - which it turns out is exclusively used for database connections - Make the `uvicorn` dependency optional (only needed when Datasette runs in the CLI) TODO: - [x] Switch to `click-default-group-wheel` - [x] https://github.com/simonw/datasette/issues/1734 - [x] Work around `uvicorn` import error - [x] https://github.com/simonw/datasette/issues/1735 - [x] #1737 Goal is to be able to do the following directly in https://pyodide.org/en/stable/console.html ```python import micropip await micropip.install(""datasette"") from datasette.app import Datasette ds = Datasette() await ds.client.get(""/.json"") ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1733/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1223241647,I_kwDOBm6k_c5I6S-v,1734,Remove python-baseconv dependency,9599,simonw,closed,0,,,,,3,2022-05-02T19:08:37Z,2022-05-02T23:25:49Z,2022-05-02T19:39:20Z,OWNER,,"> I was going to vendor `baseconv.py`, but then I reconsidered - what if there are plugins out there that expect `import baseconv` to work because they have depended on Datasette? > > I used https://cs.github.com/ and as far as I can tell there aren't any! > > So I'm going to remove that dependency and work out a smarter way to do this - probably by providing a utility function within Datasette itself. _Originally posted by @simonw in https://github.com/simonw/datasette/issues/1733#issuecomment-1115258737_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1734/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1223263540,I_kwDOBm6k_c5I6YU0,1735,Datasette setting to disable threading (for Pyodide),9599,simonw,closed,0,,,,,1,2022-05-02T19:31:08Z,2022-05-02T23:25:49Z,2022-05-02T20:13:52Z,OWNER,,"> I'm going to add a Datasette setting to disable threading entirely, designed for usage in this particular case. 
> > I thought about adding a new setting, then I noticed this: > > datasette mydatabase.db --setting num_sql_threads 10 > > I'm going to let users set that to `0` to disable threaded execution of SQL queries. _Originally posted by @simonw in https://github.com/simonw/datasette/issues/1733#issuecomment-1115278325_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1735/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1223459734,I_kwDOBm6k_c5I7IOW,1737,Automated test for Pyodide compatibility,9599,simonw,closed,0,,,,,4,2022-05-02T23:24:25Z,2022-05-02T23:40:50Z,2022-05-02T23:40:50Z,OWNER,,"Refs: - #1733 Need something in the test suite such that if Datasette breaks against Pyodide in the future we hear about it. I'm thinking this is an opportunity to use [shot-scraper javascript](https://github.com/simonw/shot-scraper#scraping-pages-using-javascript).",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1737/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1223527226,I_kwDOBm6k_c5I7Ys6,1738,"""Cannot use _sort and _sort_desc at the same time""",9599,simonw,closed,0,,,8303187,Datasette 0.62,2,2022-05-03T01:06:24Z,2022-08-14T16:13:55Z,2022-08-14T16:13:55Z,OWNER,,"Triggered this error while playing with the sort desc checkbox and the apply button that are only visible on this page at mobile screen width: https://latest.datasette.io/fixtures/compound_three_primary_keys?_sort_desc=pk1 Navigate to that page (with the browser narrow enough to show the box), un-check the box and click Apply: ![sort-bug](https://user-images.githubusercontent.com/9599/166390804-cb289b29-63dc-4986-b7f9-81cf2ae04914.gif) Also notable: I managed to get to a page with `?_sort_desk=pk1` in the URL three times by clicking around with that button.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1738/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1223699280,I_kwDOBm6k_c5I8CtQ,1739,.db downloads should be served with an ETag,9599,simonw,closed,0,,,,,6,2022-05-03T05:11:21Z,2022-05-04T18:21:18Z,2022-05-03T14:59:51Z,OWNER,,"I noticed that my Pyodide Datasette prototype is downloading the same database file every single time rather than browser caching it: ![image](https://user-images.githubusercontent.com/9599/166407074-dee19587-0667-4424-9e88-d3b5b90fd819.png) ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1739/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1212823665,I_kwDOBm6k_c5ISjhx,1715,Refactor TableView to use asyncinject,9599,simonw,closed,0,,,,,13,2022-04-22T21:43:39Z,2022-12-01T21:15:18Z,2022-04-28T22:26:56Z,OWNER,,"I've been working on a dependency injection mechanism in a separate library: - https://github.com/simonw/asyncinject I think it's ready to try out with Datasette to see if it's a pattern that will work here. I'm going to attempt to refactor `TableView` to use it. 
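For a sense of the pattern, here is a minimal sketch (assuming the `Registry`-based API from the asyncinject README - registered `async def` functions declare their dependencies as parameter names, and independent dependencies can execute concurrently):
```python
from asyncinject import Registry

async def count():
    return 42  # stand-in for the count(*) query

async def rows():
    return ['row1', 'row2']  # stand-in for the select ... query

async def page(count, rows):
    # count() and rows() have no dependency on each other,
    # so the registry can run them in parallel before this executes
    return {'count': count, 'rows': rows}

registry = Registry(count, rows, page)
# result = await registry.resolve(page)
```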
There are two overall goals here: - Use `asyncinject` to add parallel execution of some aspects of the table page - most notably I want to be able to execute the `count(*)` query, the `select ...` query, the various faceting queries and the facet suggestion queries in parallel - and measure if doing so is good for performance. - Use it to execute different output formats (possibly with some changes to the existing `register_output_renderer()` plugin hook). I want CSV and JSON to use the same mechanism that plugins use. Stretch goal is to get this working with streaming data too, see: - #1101",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1715/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1212838949,I_kwDOBm6k_c5ISnQl,1716,Configure git blame to ignore Black commit,9599,simonw,closed,0,,,,,1,2022-04-22T21:56:37Z,2022-04-22T22:02:19Z,2022-04-22T22:02:19Z,OWNER,,"GitHub can support this in blame views now too: https://docs.github.com/en/repositories/working-with-files/using-files/viewing-a-file#ignore-commits-in-the-blame-view",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1716/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1213683988,I_kwDOBm6k_c5IV1kU,1718,Code examples in the documentation should be formatted with Black,9599,simonw,closed,0,,,,,12,2022-04-24T15:22:50Z,2022-04-24T16:24:14Z,2022-04-24T16:18:03Z,OWNER,,"For example on this page: https://docs.datasette.io/en/stable/writing_plugins.html#packaging-a-plugin I wonder if there's an easy way for me to enforce this for Sphinx documentation?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1718/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1214859703,I_kwDOBm6k_c5IaUm3,1719,Refactor `RowView` and remove `RowTableShared`,9599,simonw,closed,0,,,,,3,2022-04-25T18:06:24Z,2022-12-01T21:15:19Z,2022-04-25T18:33:44Z,OWNER,,"> The `RowTableShared` class is making this a whole lot more complicated. > > I'm going to split the `RowView` view out into an entirely separate `views/row.py` module. 
_Originally posted by @simonw in https://github.com/simonw/datasette/issues/1715#issuecomment-1108875068_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1719/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1215174094,I_kwDOBm6k_c5IbhXO,1720,Design plugin hook for extras,9599,simonw,closed,0,,,,,14,2022-04-26T00:08:10Z,2022-12-01T21:15:19Z,2022-04-26T20:20:27Z,OWNER,,"Refs: - #262 - #1709 I realized that this is a really natural plugin hook - and if I design it as a hook I can implement Datasette's core extras as default plugins.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1720/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1216436131,I_kwDOBm6k_c5IgVej,1721,"Implement plugin hooks: `register_table_extras`, `register_row_extras`, `register_query_extras`",9599,simonw,open,0,,,8755003,Datasette 1.0a-next,0,2022-04-26T20:21:49Z,2022-12-13T05:29:07Z,,OWNER,,"Designed in: - #1720 Part of: - #262 - #1709",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1721/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1216479167,I_kwDOBm6k_c5Igf-_,1722,`db.primary_keys()` and `db.table_columns()` don't show up in traces,9599,simonw,open,0,,,,,0,2022-04-26T21:08:36Z,2022-04-26T21:08:36Z,,OWNER,,"Noticed this while working on: - #1715 This code here isn't showing up in traces: https://github.com/simonw/datasette/blob/579f59dcec43a91dd7d404e00b87a00afd8515f2/datasette/views/table.py#L218-L220 Because those functions don't use the regular trace-instrumented `db.execute()` code path - they work directly against a connection instead: https://github.com/simonw/datasette/blob/579f59dcec43a91dd7d404e00b87a00afd8515f2/datasette/utils/__init__.py#L610-L626 ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1722/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1216508080,I_kwDOBm6k_c5IgnCw,1723,Research running SQL in table view in parallel using `asyncio.gather()`,9599,simonw,closed,0,,,,,5,2022-04-26T21:42:48Z,2022-04-27T18:53:44Z,2022-04-26T22:19:09Z,OWNER,,"Spun off from: - #1715",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1723/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1216619276,I_kwDOBm6k_c5IhCMM,1724,?_trace=1 doesn't work on Global Power Plants demo,9599,simonw,closed,0,,,,,3,2022-04-27T00:15:02Z,2022-04-27T06:15:14Z,2022-04-27T00:18:30Z,OWNER,,"https://global-power-plants.datasettes.com/global-power-plants/global-power-plants?_trace=1 is not showing the trace JSON at the bottom of the page. 
Confirmed that `trace_debug` is `true` on https://global-power-plants.datasettes.com/-/settings Possibly related: - https://github.com/simonw/datasette-total-page-time/issues/1",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1724/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1216622905,I_kwDOBm6k_c5IhDE5,1725,Performance question - what is happening in this gap?,9599,simonw,open,0,,,,,0,2022-04-27T00:21:11Z,2022-04-27T00:21:11Z,,OWNER,,"Trace from https://latest-with-plugins.datasette.io/github/commits?_facet=repo&_trace=1&_facet=committer ![CleanShot 2022-04-26 at 17 20 06@2x](https://user-images.githubusercontent.com/9599/165413811-db2cd599-2acc-46ce-b9c2-f9bc45b879e9.png) What's going on in that gap? Can I improve the tracing output to show some non-SQL queries to figure that out?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1725/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1217014076,I_kwDOBm6k_c5Iiik8,1726,Security page in the documentation,9599,simonw,open,0,,,,,0,2022-04-27T08:43:30Z,2022-04-27T08:43:30Z,,OWNER,,"A page talking about how to run Datasette securely, and security concerns to take into account.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1726/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1217759117,I_kwDOBm6k_c5IlYeN,1727,Research: demonstrate if parallel SQL queries are worthwhile,9599,simonw,open,0,,,,,32,2022-04-27T18:54:21Z,2022-09-26T14:48:31Z,,OWNER,,"I added parallel SQL query execution here: - https://github.com/simonw/datasette/issues/1723 My hunch is that this will take advantage of multiple cores, since Python's `sqlite3` module releases the GIL once a query is passed to SQLite. I'd really like to prove this is the case though. Just not sure how to do it! Larger question: is this performance optimization actually improving performance at all? Under what circumstances is it worthwhile?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1727/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1218133366,I_kwDOBm6k_c5Imz12,1728,Writable canned queries fail with useless non-error against immutable databases,127565,wragge,closed,0,,,8303187,Datasette 0.62,13,2022-04-28T03:10:34Z,2022-08-14T16:34:40Z,2022-08-14T16:34:40Z,CONTRIBUTOR,,"I've been banging my head against a wall for a while and would appreciate any pointers... - I have a writeable canned query to update rows in the db. - I'm using the github-oauth plugin for authentication. - I have `allow` set on the query to accept my GitHub id and a GH organisation. - Authentication seems to work as expected both locally and on Cloudrun -- viewing `/-/actor` gives the same result in both environments - I can access the 'padlocked' canned query in both environments. Everything seems to be the same, but the canned query works perfectly when run locally, and fails when I try it on Cloudrun. I'm redirected back to the canned query page and the db is not changed. There's nothing in the Cloudstor logs to indicate an error. 
Any clues as to where I should be looking for the problem?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1728/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1219385669,I_kwDOBm6k_c5IrllF,1729,Implement ?_extra and new API design for TableView,9599,simonw,open,0,,,8755003,Datasette 1.0a-next,12,2022-04-28T22:28:14Z,2022-12-13T05:29:07Z,,OWNER,,"Part of: - #262 - #1518",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1729/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1219398983,I_kwDOBm6k_c5Iro1H,1730,SQL tracing should much more closely track the SQL query execution,9599,simonw,open,0,,,,,0,2022-04-28T22:41:04Z,2022-04-28T22:41:10Z,,OWNER,,"In #1727 I realized that the SQL tracing was measuring a whole bunch of stuff outside of the SQL query itself. I started experimenting with this fix for that but it didn't work - I got back an empty JSON array of traces for some reason: ```diff diff --git a/datasette/database.py b/datasette/database.py index ba594a8..d7f9172 100644 --- a/datasette/database.py +++ b/datasette/database.py @@ -7,7 +7,7 @@ import sys import threading import uuid -from .tracer import trace +from .tracer import trace, trace_child_tasks from .utils import ( detect_fts, detect_primary_keys, @@ -207,30 +207,31 @@ class Database: time_limit_ms = custom_time_limit with sqlite_timelimit(conn, time_limit_ms): - try: - cursor = conn.cursor() - cursor.execute(sql, params if params is not None else {}) - max_returned_rows = self.ds.max_returned_rows - if max_returned_rows == page_size: - max_returned_rows += 1 - if max_returned_rows and truncate: - rows = cursor.fetchmany(max_returned_rows + 1) - truncated = len(rows) > max_returned_rows - rows = rows[:max_returned_rows] - else: - rows = cursor.fetchall() - truncated = False - except (sqlite3.OperationalError, sqlite3.DatabaseError) as e: - if e.args == (""interrupted"",): - raise QueryInterrupted(e, sql, params) - if log_sql_errors: - sys.stderr.write( - ""ERROR: conn={}, sql = {}, params = {}: {}\n"".format( - conn, repr(sql), params, e + with trace(""sql"", database=self.name, sql=sql.strip(), params=params): + try: + cursor = conn.cursor() + cursor.execute(sql, params if params is not None else {}) + max_returned_rows = self.ds.max_returned_rows + if max_returned_rows == page_size: + max_returned_rows += 1 + if max_returned_rows and truncate: + rows = cursor.fetchmany(max_returned_rows + 1) + truncated = len(rows) > max_returned_rows + rows = rows[:max_returned_rows] + else: + rows = cursor.fetchall() + truncated = False + except (sqlite3.OperationalError, sqlite3.DatabaseError) as e: + if e.args == (""interrupted"",): + raise QueryInterrupted(e, sql, params) + if log_sql_errors: + sys.stderr.write( + ""ERROR: conn={}, sql = {}, params = {}: {}\n"".format( + conn, repr(sql), params, e + ) ) - ) - sys.stderr.flush() - raise + sys.stderr.flush() + raise if truncate: return Results(rows, truncated, cursor.description) @@ -238,9 +239,8 @@ class Database: else: return Results(rows, False, cursor.description) - with trace(""sql"", database=self.name, sql=sql.strip(), params=params): - results = await self.execute_fn(sql_operation_in_thread) - return results + with trace_child_tasks(): + return await self.execute_fn(sql_operation_in_thread) 
@property def size(self): ``` _Originally posted by @simonw in https://github.com/simonw/datasette/issues/1727#issuecomment-1111602802_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1730/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1239008850,I_kwDOBm6k_c5J2cZS,1744,`--nolock` feature for opening locked databases,9599,simonw,closed,0,,,,,7,2022-05-17T18:25:16Z,2022-05-17T19:46:38Z,2022-05-17T19:40:30Z,OWNER,,"The getting started docs currently suggest you try this to browse your Chrome history: datasette ~/Library/Application\ Support/Google/Chrome/Default/History But if Chrome is running you will likely get this error: sqlite3.OperationalError: database is locked Turns out there's a workaround for this which I just spotted [on the SQLite forum](https://sqlite.org/forum/forumpost/86a67f6995): > You can do this using a [URI filename](https://sqlite.org/uri.html): > ``` > sqlite3 'file:places.sqlite?mode=ro&nolock=1' > ``` > That opens the file `places.sqlite` in read-only mode with locking disabled. This isn't safe, in that changes to the database made by other corrections are likely to cause this connection to return incorrect results or crash. Read-only mode should at least mean that you don't corrupt the database in the process.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1744/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1239080102,I_kwDOBm6k_c5J2tym,1745,Documentation on running cog,9599,simonw,closed,0,,,,,1,2022-05-17T19:41:06Z,2022-05-17T19:45:51Z,2022-05-17T19:43:45Z,OWNER,,Noticed that `cog -r docs/*.rst` isn't documented in https://docs.datasette.io/en/latest/contributing.html#editing-and-building-the-documentation,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1745/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1237586379,I_kwDOBm6k_c5JxBHL,1742,?_trace=1 fails with datasette-geojson for some reason,9599,simonw,open,0,,,,,4,2022-05-16T19:06:05Z,2022-05-16T19:42:13Z,,OWNER,,view-source:https://calands.datasettes.com/calands/CPAD_2020a_SuperUnits.geojson?_sort=id&id__exact=4&_labels=on&_trace=1 is showing me a blank page.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1742/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1237871948,I_kwDOBm6k_c5JyG1M,1743,`datasette.utils.to_css_class()` should be a documented internal,9599,simonw,open,0,,,,,0,2022-05-16T23:57:26Z,2022-05-16T23:57:26Z,,OWNER,,"Because I'm using it in this plugin: - https://github.com/simonw/datasette-upload-dbs/issues/1",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1743/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1243498298,I_kwDOBm6k_c5KHkc6,1746,Switch documentation theme to Furo,9599,simonw,closed,0,,,,,21,2022-05-20T18:42:17Z,2022-05-20T21:28:29Z,2022-05-20T21:28:29Z,OWNER,,"https://github.com/pradyunsg/furo I just did this for `shot-scraper` and I really like it: 
https://shot-scraper.datasette.io/en/latest/ - https://github.com/simonw/shot-scraper/issues/77",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1746/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1243512344,I_kwDOBm6k_c5KHn4Y,1747,Add tutorials to the getting started guide,9599,simonw,closed,0,,,,,1,2022-05-20T19:01:52Z,2022-05-20T19:12:30Z,2022-05-20T19:05:34Z,OWNER,,On https://docs.datasette.io/en/stable/getting_started.html,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1747/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1243517592,I_kwDOBm6k_c5KHpKY,1748,Add copy buttons next to code examples in the documentation,9599,simonw,closed,0,,,,,2,2022-05-20T19:09:00Z,2022-05-20T19:15:00Z,2022-05-20T19:11:32Z,OWNER,,Similar to the ones in `datasette-copyable` which are implemented here: https://github.com/executablebooks/sphinx-copybutton/tree/f84c001a0507f8ec46779d0701b079a265564583,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1748/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1247315144,I_kwDOBm6k_c5KWITI,1749,LDAP auth plugin,380241,benswift,open,0,,,,,0,2022-05-25T01:35:12Z,2022-05-25T01:35:12Z,,NONE,,"A [search of the plugins directory](https://datasette.io/plugins?q=ldap) doesn't turn up anything, but is is possible to set up a Datasette app which uses my organisation's LDAP for auth? If not, how much work would it be to write one (I _may_ have some spare cycles on my team to do this, but we haven't written a datasette plugin before).",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1749/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1251700382,I_kwDOBm6k_c5Km26e,1750,Allow `label_column` to specify array of columns,408765,knutwannheden,open,0,,,,,0,2022-05-28T18:45:48Z,2022-05-28T18:45:48Z,,NONE,,"I think it would be great if the Datasette metadata would allow the `label_column` table key to list multiple columns. Something like: ```json ""tables"": { ""person"": { ""label_column"": [""first_name"", ""last_name""] }, ``` It would even be interesting with a ""label expression"" similar to a Python f-string. E.g. `{row.last_name}, {row.first_name}`.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1750/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1251710928,I_kwDOBm6k_c5Km5fQ,1751,Add scrollbars to table presentation in default layout,408765,knutwannheden,closed,0,,,,,1,2022-05-28T19:44:57Z,2022-05-28T19:52:17Z,2022-05-28T19:52:17Z,NONE,,"(As you will be able to tell from the terminology I use, I am not a frontend guy, but I hope you will understand.) When a table is wide and needs horizontal scrolling to see the columns towards the end, the user needs to scroll horizontally. 
However, since the container for the HTML table (`div` with class `table-wrapper`) isn't limited by the window size, I first need to vertically scroll near to the bottom of the page in order to scroll horizontally. Then I can scroll back up again. This isn't very user friendly. Instead, I think it would make sense to constrain the table's size (when necessary), so that the vertical and horizontal scrollbars either always are visible or at least not far out of reach. I understand that I could provide my own template and / or CSS, but I think it would probably make sense to adjust the default in this regard.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1751/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1251739062,I_kwDOBm6k_c5KnAW2,1752,Research if I can drop Janus,9599,simonw,open,0,,,,,0,2022-05-28T22:46:52Z,2022-05-28T22:46:52Z,,OWNER,,"> It seems to me Janus dependency is not necessary, `async with app.database_write_mutex(): out = await app.transaction(func)` may be enough. Comment here: https://lobste.rs/s/fki4tj/architecture_notes_datasette#c_a2ihon",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1752/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1266207143,I_kwDOBm6k_c5LeMmn,1755,Gunicorn,1176293,ar-jan,open,0,,,,,0,2022-06-09T14:18:46Z,2022-06-09T14:18:46Z,,NONE,,"I've read issue #514 which resulted in running Datasette via systemd as recommended approach. We've also adopted this (for now), but I notice that Uvicorn [says the following](https://www.uvicorn.org/#running-with-gunicorn): > Uvicorn includes a Gunicorn worker class allowing you to run ASGI applications, with all of Uvicorn's performance benefits, while also giving you Gunicorn's fully-featured process management. > > This allows you to increase or decrease the number of worker processes on the fly, restart worker processes gracefully, or perform server upgrades without downtime. > > For production deployments we recommend using gunicorn with the uvicorn worker class. We usually deploy Python applications via Gunicorn for these process management features (e.g. `--daemon` and `--pid`). Is this something that would/could work with Datasette as well?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1755/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1266329095,I_kwDOBm6k_c5LeqYH,1756,Mechanism for creating databases in WAL mode,9599,simonw,open,0,,,,,0,2022-06-09T15:39:28Z,2022-06-09T15:39:28Z,,OWNER,,"The `--create` option currently creates databases if they are missing, but does not enable WAL mode for them. It turns out WAL mode is useful for databases that are accepting writes! 
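For reference, a minimal sketch of what enabling WAL mode at creation time involves - it is a single pragma against the new connection (the file name here is hypothetical):
```python
import sqlite3

conn = sqlite3.connect('my-new-database.db')  # creates the file if it is missing
conn.execute('PRAGMA journal_mode=wal;')  # WAL mode persists in the database file
conn.close()
```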
I think a `--create-wal` option that both creates them AND sets WAL mode on any that are created would be a good idea.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1756/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1280799259,I_kwDOBm6k_c5MV3Ib,1761,ensure_ascii=False,1473102,mustafa0x,open,0,,,,,0,2022-06-22T19:58:13Z,2022-06-22T19:58:30Z,,NONE,,"Hi, thanks for the project! For the JSON output, I would consider defaulting to `ensure_ascii=False` (UTF-8 seems pretty universal) or making it an option. When dealing with non-Latin text, `ensure_ascii=True` (the default) can triple the size of the output.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1761/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1306492437,I_kwDOBm6k_c5N334V,1770,`handle_exception` plugin hook for custom error handling,9599,simonw,closed,0,,,8303187,Datasette 0.62,14,2022-07-15T20:52:49Z,2022-08-14T15:25:51Z,2022-08-14T15:25:51Z,OWNER,,"I need this for a couple of plugins, both of which are broken at the moment: - https://github.com/simonw/datasette-sentry/issues/1 - https://github.com/simonw/datasette-show-errors/issues/2",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1770/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1306984363,I_kwDOBm6k_c5N5v-r,1771,minor a11y: `",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1939/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1486036269,I_kwDOBm6k_c5Ykx0t,1941,Mechanism for supporting key rotation for DATASETTE_SECRET,9599,simonw,open,0,,,,,1,2022-12-09T05:24:53Z,2022-12-09T05:25:20Z,,OWNER,,"Currently if you change `DATASETTE_SECRET` all existing signed tokens - both cookies and API tokens and potentially other things too - will instantly expire. Adding support for key rotation would allow keys to be rotated on a semi-regular basis without logging everyone out / invalidating every API token instantly. Can model this on how Django does it: https://github.com/django/django/commit/0dcd549bbe36c060f536ec270d34d9e7d4b8e6c7",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1941/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1487738738,I_kwDOBm6k_c5YrRdy,1942,Option for plugins to request that JSON be served on the page,9599,simonw,open,0,,,3268330,Datasette 1.0,1,2022-12-10T01:08:53Z,2022-12-10T01:11:30Z,,OWNER,,"Idea came from a conversation with @hydrosquall - what if a Datasette plugin could say ""I'd like the JSON for a page to be included in a variable on the HTML page""? `datasette-cluster-map` already needs this - the first thing it does when the page loads is `fetch()` a JSON representation of that same data. This idea fits with my overall goals to unify the JSON and HTML context too. 
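Purely as an illustration (no such mechanism exists yet - the element id and data shape here are made up), the rendered page could embed the data like this so plugins can read it without an extra `fetch()`:
```html
<script id=""datasette-page-data"" type=""application/json"">
  {""rows"": [], ""count"": 0}
</script>
```
A plugin could then call `JSON.parse(document.getElementById('datasette-page-data').textContent)` instead of re-fetching the page as JSON.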
Refs: - #1711",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1942/reactions"", ""total_count"": 1, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 1, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1493390939,I_kwDOBm6k_c5ZA1Zb,1947,UI to create reduced scope tokens from the `/-/create-token` page,9599,simonw,closed,0,,,8711695, Datasette 1.0a2,22,2022-12-13T05:10:48Z,2022-12-14T05:22:00Z,2022-12-14T05:13:24Z,OWNER,,"Split from: - #1855",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1947/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1493404423,I_kwDOBm6k_c5ZA4sH,1948,500 error on permission debug page when testing actors with _r,9599,simonw,open,0,,,,,1,2022-12-13T05:22:03Z,2022-12-13T05:22:19Z,,OWNER,," The 500 error is silent unless you are looking at the DevTools network pane. ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1948/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1493306655,I_kwDOBm6k_c5ZAg0f,1945,`view-instance` should not be checked for /-/actor.json,9599,simonw,closed,0,,,8711695, Datasette 1.0a2,0,2022-12-13T04:01:46Z,2022-12-13T04:11:56Z,2022-12-13T04:11:56Z,OWNER,,"Spotted this while testing: - #1855 ``` export TOKEN=$(datasette create-token root --secret s -a foo) curl -H ""Authorization: Bearer $TOKEN"" http://localhost:8002/-/actor.json ``` Returned a Forbidden error (and not in JSON either).",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1945/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1493339206,I_kwDOBm6k_c5ZAoxG,1946,`datasette --get` mechanism for sending tokens,9599,simonw,closed,0,,,8711695, Datasette 1.0a2,2,2022-12-13T04:25:05Z,2022-12-13T04:36:57Z,2022-12-13T04:36:57Z,OWNER,,"> For the tests for `datasette create-token` it would be useful if `datasette --get` had a mechanism for sending an `Authorization: Bearer X` header. 
_Originally posted by @simonw in https://github.com/simonw/datasette/issues/1855#issuecomment-1347731288_ ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1946/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1493471221,I_kwDOBm6k_c5ZBI_1,1949,`.json` errors should be returned as JSON,9599,simonw,open,0,,,8755003,Datasette 1.0a-next,10,2022-12-13T06:14:12Z,2022-12-15T00:46:27Z,,OWNER,,"Eg the error in this issue: - #1945 ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1949/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1495241162,I_kwDOBm6k_c5ZH5HK,1950,"Bad ?_sort returns a 500 error, should be a 400",9599,simonw,closed,0,,,,,2,2022-12-13T22:08:16Z,2022-12-13T22:23:22Z,2022-12-13T22:23:22Z,OWNER,,"https://latest.datasette.io/fixtures/facetable?_sort=bad ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1950/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1495431932,I_kwDOBm6k_c5ZInr8,1951,`datasette.create_token(...)` method for creating signed API tokens,9599,simonw,closed,0,,,8711695, Datasette 1.0a2,6,2022-12-14T01:25:34Z,2022-12-14T02:43:45Z,2022-12-14T02:42:05Z,OWNER,,"I need this for: - #1947 And I can refactor this to use it too: - #1855 By making this a documented internal API it can be used by other plugins too. It's also going to be really useful for writing tests.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1951/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1495716243,I_kwDOBm6k_c5ZJtGT,1952,Improvements to /-/create-token restrictions interface,9599,simonw,open,0,,,8755003,Datasette 1.0a-next,1,2022-12-14T05:22:39Z,2022-12-14T05:23:13Z,,OWNER,,"> It would be neat not to show write permissions against immutable databases too - and not hard from a performance perspective since it doesn't involve hundreds more permission checks. > > That will need permissions to grow a flag for if they need a mutable database though, which is a bigger job. 
_Originally posted by @simonw in https://github.com/simonw/datasette/issues/1947#issuecomment-1350414402_ Also, DO show the `_memory` database there if Datasette was started in `--crossdb` mode.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1952/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1495821607,I_kwDOBm6k_c5ZKG0n,1953,Release notes for Datasette 1.0a2,9599,simonw,closed,0,,,8711695, Datasette 1.0a2,2,2022-12-14T06:26:40Z,2022-12-15T02:02:15Z,2022-12-15T02:01:08Z,OWNER,,"https://github.com/simonw/datasette/milestone/27?closed=1 https://github.com/simonw/datasette/compare/1.0a1...9ad76d279e2c3874ca5070626a25458ce129f126",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1953/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1496652622,I_kwDOBm6k_c5ZNRtO,1955,"invoke_startup() is not run in some conditions, e.g. gunicorn/uvicorn workers, breaking lots of things",32839123,Rik-de-Kort,closed,0,,,,,36,2022-12-14T13:39:56Z,2022-12-19T04:34:16Z,2022-12-18T02:45:18Z,NONE,,"In the past (pre-September 14, #1809) I had a running deployment of Datasette on Azure WebApps by emulating the call in cli.py to Gunicorn: `gunicorn -w 2 -k uvicorn.workers.UvicornWorker app:app`. My most recent deployment, however, fails loudly by shouting that `Datasette.invoke_startup()` was not called. It does not seem to be possible to call `invoke_startup` when running using a uvicorn command directly like this (I've reproduced this locally using `uvicorn`). Two candidates that I have tried:
* Uvicorn has a `--factory` option, but the app factory has to be synchronous, so no `await invoke_startup` there
* `asyncio.get_event_loop().run_until_complete` is also not an option because `uvicorn` already has the event loop running.
One additional option is:
* Use Gunicorn's [server hooks](https://docs.gunicorn.org/en/stable/settings.html#server-hooks) to call `invoke_startup`. These are also synchronous, but I might be able to get ahead of the event loop starting here.
In my current deployment setup, it does not appear to be possible to use `datasette serve` directly, so I'm stuck either
* Trying to rework my complete deployment setup, for instance, using Azure functions as described [here](https://github.com/simonw/azure-functions-datasette)
* Or digging into the ASGI spec and writing a wrapper for the sole purpose of launching Datasette using a direct Uvicorn invocation.
Questions for the maintainers:
* Is this intended behaviour/will not support/etc.? If so, I'd be happy to add a PR with a couple of lines in the documentation.
* If this is not intended behaviour, what is a good way to fix it? I could have a go at the ASGI spec thing (I think the Azure Functions thing is related) and provide a PR with the wrapper here, but I'm all ears!
Almost forgot, minimal reproducer:
```python
from datasette import Datasette

ds = Datasette(files=['./global-power-plants.db'])
app = ds.app()
```
Save as app.py in the same folder as global-power-plants.db, and then try running `uvicorn app:app`.
Opening the Datasette instance from the plain reproducer in the browser will show the error message.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1955/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1497288666,I_kwDOBm6k_c5ZPs_a,1956,Handle abbreviations properly in permission_allowed_actor_restrictions,9599,simonw,closed,0,,,8711695, Datasette 1.0a2,2,2022-12-14T19:54:21Z,2022-12-14T20:04:29Z,2022-12-14T20:04:28Z,OWNER,,"This code currently assumes abbreviations are:
```python
action_initials = """".join([word[0] for word in action.split(""-"")])
```
https://github.com/simonw/datasette/blob/1a3dcf494376e32f7cff110c86a88e5b0a3f3924/datasette/default_permissions.py#L182-L208 That's no longer correct: they are now registered by the new plugin hook: - #1939 ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1956/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1497577017,I_kwDOBm6k_c5ZQzY5,1957,Reconsider row value truncation on query page,9599,simonw,open,0,,,,,1,2022-12-14T23:49:47Z,2022-12-14T23:50:50Z,,OWNER,,"Consider this example: https://ripgrep.datasette.io/repos?sql=select+json_group_array%28full_name%29+from+repos ```sql select json_group_array(full_name) from repos ``` ![CleanShot 2022-12-14 at 15 48 32@2x](https://user-images.githubusercontent.com/9599/207739709-8177f683-f938-49a1-8225-42791fad88fe.png) My intention here was to get a string of JSON I can copy and paste elsewhere - see: https://til.simonwillison.net/sqlite/compare-before-after-json The truncation isn't helping here.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1957/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1497909798,I_kwDOBm6k_c5ZSEom,1958,datasette --root running in Docker doesn't reliably show the magic URL,11729897,davidhaley,closed,0,,,,,11,2022-12-13T16:29:13Z,2022-12-16T00:59:12Z,2022-12-16T00:55:19Z,NONE,,"I followed these steps: `docker run datasetteproject/datasette pip install datasette-upload-csvs` `docker commit $(docker ps -lq) datasette-with-plugins` `docker run -p 8001:8001 -v $(pwd):/mnt datasette-with-plugins datasette --root -p 8001 -h 0.0.0.0` Visited: http://127.0.0.1:8001/-/plugins ![image](https://user-images.githubusercontent.com/11729897/207392071-d939cd5e-1d96-4e11-b0be-dc06dd207866.png) Visited: http://localhost:8001/-/upload-csvs ![image](https://user-images.githubusercontent.com/11729897/207389241-3e96ca66-ca74-4a16-8b7d-4427ee862c5e.png) I may have missed a step? Thank you. --- Ubuntu 22.04.1 LTS",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1958/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1499081664,I_kwDOBm6k_c5ZWivA,1959,Refactor test suite to use mostly `async def` tests,9599,simonw,closed,0,,,,,9,2022-12-15T21:02:54Z,2022-12-17T21:49:37Z,2022-12-17T21:49:36Z,OWNER,,"I got blocked working on this issue due to weird and hard-to-debug test suite problems: - #1955 The test suite has needed a major upgrade for several years now.
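The pattern I want to move to looks roughly like this (a sketch - exact fixture and helper names may end up different):
```python
import pytest
from datasette.app import Datasette

@pytest.mark.asyncio
async def test_homepage_json():
    ds = Datasette(memory=True)
    # datasette.client runs httpx requests directly against the ASGI app,
    # so no server process or synchronous TestClient is involved
    response = await ds.client.get('/.json')
    assert response.status_code == 200
```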
It has a LOT of `def test_...` synchronous functions that could be upgraded to `async def` for better performance and less test complexity - I've used the new `async def` pattern in plugins and new tests for a couple of years now. Hopefully I can get more of the tests to use in-memory named databases too, ideally so I can fix this consistent problem: - #1843",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1959/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1509783085,I_kwDOBm6k_c5Z_XYt,1969,sql-formatter javascript is not now working with CloudFlare rocketloader,536941,fgregg,open,0,,,,,0,2022-12-23T21:14:06Z,2023-01-10T01:56:33Z,,CONTRIBUTOR,,"This is probably not a bug with datasette, but I thought you might want to know, @simonw. I noticed today that my CloudFlare proxied datasette instance lost the ""Format SQL"" option. I'm pretty sure it was there last week. In the CloudFlare settings, if I turn off [Rocket Loader](https://developers.cloudflare.com/fundamentals/speed/rocket-loader/), I get the ""Format SQL"" option back. Rocket Loader works by asynchronously loading the javascript, so maybe there was a recent change that doesn't play well with the asynch loading? I'm up to date with https://github.com/simonw/datasette/commit/e03aed00026cc2e59c09ca41f69a247e1a85cc89",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1969/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1500636982,I_kwDOBm6k_c5Zcec2,1962,"Alternative, async-friendly pattern for `make_app_client()` and similar - fully retire `TestClient`",9599,simonw,open,0,,,,,1,2022-12-16T17:56:51Z,2022-12-16T21:55:29Z,,OWNER,,"In this issue I replaced a whole bunch of places that used the non-async `app_client` fixture with an async `ds_client` fixture instead: - #1959 But I didn't get everything, and a lot of tests are still using the old `TestClient` mechanism as a result. The main work here is replacing all of the `app_client_...` fixtures which use variants on the default client - and changing the tests that call `make_app_client()` to do something else instead. This requires some careful thought. I need to come up with a really nice pattern for creating variants on the `ds_client` default fixture - and do so in a way that minimizes the number of open files, refs: - #1843",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1962/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1501778647,I_kwDOBm6k_c5Zg1LX,1964,Cog menu is not keyboard accessible (also no ARIA),9599,simonw,open,0,,,,,1,2022-12-18T06:36:28Z,2022-12-18T06:37:28Z,,OWNER,,"This menu here: https://latest.datasette.io/fixtures/attraction_characteristic You can tab to it (see the outline) and hit space or enter to open it, but you can't then navigate the items in the open menu using the keyboard. 
![cog-menu](https://user-images.githubusercontent.com/9599/208284973-2a04cdab-ed95-4316-979c-67fe5f7787db.gif) ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1964/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1501713288,I_kwDOBm6k_c5ZglOI,1963,0.63.3 bugfix release,9599,simonw,closed,0,,,,,2,2022-12-18T02:48:15Z,2022-12-18T03:26:55Z,2022-12-18T03:26:55Z,OWNER,,"I'm going to ship a release which back-ports these two fixes: - https://github.com/simonw/datasette/issues/1958 - https://github.com/simonw/datasette/issues/1955",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1963/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1501900064,I_kwDOBm6k_c5ZhS0g,1966,Broken link to live demo in Getting started docs,7551922,lbellomo,closed,0,,,,,1,2022-12-18T13:17:00Z,2022-12-31T19:15:19Z,2022-12-31T19:15:10Z,NONE,,The link in [Play with a live demo in Getting started](https://github.com/simonw/datasette/blob/main/docs/getting_started.rst#play-with-a-live-demo) to [https://fivethirtyeight.datasettes.com/fivethirtyeight](https://fivethirtyeight.datasettes.com/fivethirtyeight) is broken and the datasette is no longer working (maybe due to the end of the free tier).,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1966/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1504352503,I_kwDOBm6k_c5Zqpj3,1968,Allow to hide some queries in metadata.yml,562352,CharlesNepote,open,0,,,,,0,2022-12-20T10:45:41Z,2022-12-20T10:45:41Z,,NONE,,"By default all queries are displayed. But there are many cases where it would be interesting to hide the queries by default: * the website is targeting non-tech people * the query is veeeeeery long ([eg.](https://mirabelle.openfoodfacts.org/products/energy_calculator)) * reading the query is not important for the users, they only want to see the result Of course, the user still could have the option to see the query. 
It could be an option in the metadata file: ```yml databases: awesome_db: tables: products: hide_sql: true queries: great_query: hide_sql: true sql: select * from products where code = :barcode ``` The priority could be: * no option in the metadata and nothing in the URL: query displayed * hide_sql in the metadata and nothing in the URL: query displayed as asked in the metadata * hide_sql in the metadata and &_hide_sql= in the URL: query as asked in the URL See also: #1824 ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1968/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1524076587,I_kwDOBm6k_c5a15Ar,1979,More useful error message if enable_load_extension is not available,9599,simonw,closed,0,,,,,5,2023-01-07T19:13:19Z,2023-01-08T00:21:23Z,2023-01-08T00:21:23Z,OWNER,,"I get this from: datasette --load-extension spatialite --get /-/versions.json ``` File ""/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/site-packages/datasette/app.py"", line 614, in _prepare_connection conn.enable_load_extension(True) AttributeError: 'sqlite3.Connection' object has no attribute 'enable_load_extension' ``` It would be useful if Datasette caught this error and output something more friendly.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1979/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1524867951,I_kwDOBm6k_c5a46Nv,1980,"""Cannot sort table by id"" when sortable_columns is used",9599,simonw,open,0,,,,,2,2023-01-09T03:21:33Z,2023-01-09T03:23:53Z,,OWNER,,"I had an instance with this in `metadata.yml`: ```yaml databases: timezones: tables: timezones: sortable_columns: - tzid ``` When I clicked on the ""Apply"" button here: It sent me to `/timezones/timezones?_sort=id&id__exact=133` with the error message: > 500: Cannot sort table by id",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1980/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1524983536,I_kwDOBm6k_c5a5Wbw,1981,Canned query field labels truncated,9599,simonw,open,0,,,,,1,2023-01-09T06:04:24Z,2023-01-09T06:05:44Z,,OWNER,,"Eg here on mobile: https://timezones.datasette.io/timezones/by_point?longitude=-0.1406632&latitude=50.8246776 ![107A1894-D1DA-4158-9EA3-40C840DD10E3](https://user-images.githubusercontent.com/9599/211248895-c922ce61-95d3-47ca-9314-dcff7c86afab.jpeg) ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1981/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1525815985,I_kwDOBm6k_c5a8hqx,1983,Make CustomJSONEncoder a documented public API,9599,simonw,open,0,,,,,3,2023-01-09T15:27:05Z,2023-01-09T15:35:58Z,,OWNER,,It's used by `datasette-geojson` here: https://github.com/eyeseast/datasette-geojson/commit/902bf135a5a33a0dc8264673d00a59a67cb05152,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1983/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1515185383,I_kwDOBm6k_c5aT-Tn,1971,Upgrade for Sphinx 6.0 
(once Furo has support for it),9599,simonw,closed,0,,,,,3,2022-12-31T19:04:35Z,2023-01-10T02:02:34Z,2023-01-10T02:02:34Z,OWNER,,"A deployment of #1967 to ReadTheDocs just failed like this: https://readthedocs.org/projects/datasette/builds/19045460/ ``` Running Sphinx v6.0.0 making output directory... done building [mo]: targets for 0 po files that are out of date building [html]: targets for 28 source files that are out of date updating environment: [new config] 28 added, 0 changed, 0 removed reading sources... [ 3%] authentication reading sources... [ 7%] binary_data reading sources... [ 10%] changelog Traceback (most recent call last): File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/statemachine.py"", line 299, in next_line self.line = self.input_lines[self.line_offset] File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/statemachine.py"", line 1136, in __getitem__ return self.data[i] IndexError: list index out of range During handling of the above exception, another exception occurred: Traceback (most recent call last): File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/statemachine.py"", line 226, in run self.next_line() File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/statemachine.py"", line 302, in next_line raise EOFError EOFError During handling of the above exception, another exception occurred: Traceback (most recent call last): File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/sphinx/cmd/build.py"", line 281, in build_main app.build(args.force_all, args.filenames) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/sphinx/application.py"", line 344, in build self.builder.build_update() File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/sphinx/builders/__init__.py"", line 310, in build_update self.build(to_build, File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/sphinx/builders/__init__.py"", line 326, in build updated_docnames = set(self.read()) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/sphinx/builders/__init__.py"", line 433, in read self._read_serial(docnames) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/sphinx/builders/__init__.py"", line 454, in _read_serial self.read_doc(docname) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/sphinx/builders/__init__.py"", line 510, in read_doc publisher.publish() File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/core.py"", line 224, in publish self.document = self.reader.read(self.source, self.parser, File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/sphinx/io.py"", line 103, in read self.parse() File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/readers/__init__.py"", line 76, in parse self.parser.parse(self.input, document) File 
""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/sphinx/parsers.py"", line 78, in parse self.statemachine.run(inputlines, document, inliner=self.inliner) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 169, in run results = StateMachineWS.run(self, input_lines, input_offset, File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/statemachine.py"", line 233, in run context, next_state, result = self.check_line( File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/statemachine.py"", line 445, in check_line return method(match, context, next_state) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 3024, in text self.section(title.lstrip(), source, style, lineno + 1, messages) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 325, in section self.new_subsection(title, lineno, messages) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 391, in new_subsection newabsoffset = self.nested_parse( File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 279, in nested_parse state_machine.run(block, input_offset, memo=self.memo, File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 195, in run results = StateMachineWS.run(self, input_lines, input_offset) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/statemachine.py"", line 233, in run context, next_state, result = self.check_line( File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/statemachine.py"", line 445, in check_line return method(match, context, next_state) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 2785, in underline self.section(title, source, style, lineno - 1, messages) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 325, in section self.new_subsection(title, lineno, messages) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 391, in new_subsection newabsoffset = self.nested_parse( File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 279, in nested_parse state_machine.run(block, input_offset, memo=self.memo, File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 195, in run results = StateMachineWS.run(self, input_lines, input_offset) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/statemachine.py"", line 233, in run context, next_state, result = 
self.check_line( File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/statemachine.py"", line 445, in check_line return method(match, context, next_state) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 1273, in bullet i, blank_finish = self.list_item(match.end()) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 1295, in list_item self.nested_parse(indented, input_offset=line_offset, File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 279, in nested_parse state_machine.run(block, input_offset, memo=self.memo, File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 195, in run results = StateMachineWS.run(self, input_lines, input_offset) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/statemachine.py"", line 239, in run result = state.eof(context) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 2725, in eof self.blank(None, context, None) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 2716, in blank paragraph, literalnext = self.paragraph( File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 416, in paragraph textnodes, messages = self.inline_text(text, lineno) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 425, in inline_text nodes, messages = self.inliner.parse(text, lineno, File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 649, in parse before, inlines, remaining, sysmessages = method(self, match, File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 792, in interpreted_or_phrase_ref nodelist, messages = self.interpreted(rawsource, escaped, role, File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 889, in interpreted nodes, messages2 = role_fn(role, rawsource, text, lineno, self) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/sphinx/ext/extlinks.py"", line 101, in role title = caption % part TypeError: not all arguments converted during string formatting Exception occurred: File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/sphinx/ext/extlinks.py"", line 101, in role title = caption % part TypeError: not all arguments converted during string formatting The full traceback has been saved in /tmp/sphinx-err-kq7ylgqo.log, if you want to report the issue to the developers. Please also report this if it was a user error, so that a better error message can be provided next time. A bug report can be filed in the tracker at . Thanks! 
```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1971/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1515186569,I_kwDOBm6k_c5aT-mJ,1972,Fix Sphinx warning about extlink extension,9599,simonw,closed,0,,,,,0,2022-12-31T19:12:04Z,2022-12-31T19:13:26Z,2022-12-31T19:13:26Z,OWNER,,"``` [sphinx-autobuild] > sphinx-build -b html /Users/simon/Dropbox/Development/datasette/docs /Users/simon/Dropbox/Development/datasette/docs/_build Running Sphinx v5.3.0 loading pickled environment... done WARNING: extlinks: Sphinx-6.0 will require a caption string to contain exactly one '%s' and all other '%' need to be escaped as '%%'. ``` _Originally posted by @simonw in https://github.com/simonw/datasette/issues/1971#issuecomment-1368266904_ ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1972/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1515182998,I_kwDOBm6k_c5aT9uW,1970,"Path ""None"" in _internal database table",9599,simonw,closed,0,,,,,2,2022-12-31T18:51:05Z,2022-12-31T19:22:58Z,2022-12-31T18:52:49Z,OWNER,,"See https://latest.datasette.io/_internal/databases (after https://latest.datasette.io/login-as-root) ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1970/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1515815014,I_kwDOBm6k_c5aWYBm,1973,render_cell plugin hook's row object is not a sqlite.Row,193185,cldellow,open,0,,,,,4,2023-01-01T20:27:46Z,2023-01-29T00:40:31Z,,CONTRIBUTOR,,"From https://docs.datasette.io/en/stable/plugin_hooks.html#render-cell-row-value-column-table-database-datasette: > row - sqlite.Row > The SQLite row object that the value being rendered is part of This appears to actually be a [CustomRow](https://github.com/simonw/datasette/blob/f0fadc28ddb9f82e5cc1ecaa51e8a342eb6dc528/datasette/utils/__init__.py#L773-L789), but I think that's unrelated to my issue. I have a table: ```sql CREATE TABLE IF NOT EXISTS ""dss_job_stats""( job_id integer not null references dss_job(id) on delete cascade, host text not null, // other columns elided as irrelevant primary key (job_id, host) ); ``` On datasette 0.63.2, the `render_cell` hook receives a `row` value that looks like: ``` CustomRow([('job_id', {'value': 2, 'label': '2'}), ('host', 'cldellow.com')]) ``` I expected the `job_id` value to be `2`, but it's actually `{'value': 2, 'label': '2'}`. I can work around this, but was wondering if this was intended behaviour?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1973/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1516815571,I_kwDOBm6k_c5aaMTT,1975,_col=id can cause id column to export twice in CSV export,9599,simonw,open,0,,,,,0,2023-01-03T00:25:15Z,2023-01-03T00:25:21Z,,OWNER,,"https://datasette.simonwillison.net/simonwillisonblog/blog_entry.csv?_col=id&_col=title&_col=body&_labels=on&_size=1 ```csv id,id,title,body 1,1,WaSP Phase II,""

The Web Standards project has launched Phase II.

"" ``` That should not have two `id` columns.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1975/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1522778923,I_kwDOBm6k_c5aw8Mr,1978,Document datasette.urls.row and row_blob,25778,eyeseast,closed,0,,,,,2,2023-01-06T15:45:51Z,2023-01-09T14:30:00Z,2023-01-09T14:30:00Z,CONTRIBUTOR,,"These are in the codebase but not in documentation. I think everything else in this class is documented. ```python class Urls: ... def row(self, database, table, row_path, format=None): path = f""{self.table(database, table)}/{row_path}"" if format is not None: path = path_with_format(path=path, format=format) return PrefixedUrlString(path) def row_blob(self, database, table, row_path, column): return self.table(database, table) + ""/{}.blob?_blob_column={}"".format( row_path, urllib.parse.quote_plus(column) ) ``` ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1978/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,not_planned 1528448642,I_kwDOBm6k_c5bGkaC,1985,Don't let Datasette(path) without a list cause weird errors,9599,simonw,closed,0,,,,,1,2023-01-11T05:17:44Z,2023-01-11T18:25:04Z,2023-01-11T18:25:04Z,OWNER,,"I got a confusing `sqlite3.OperationalError: disk I/O error` error in my tests, it turned out it was because this: ```python ds = Datasette(path) ``` Should have been this: ```python ds = Datasette([path]) ``` _Originally posted by @simonw in https://github.com/simonw/datasette-faiss/issues/1#issuecomment-1378252673_ ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1985/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1529452371,I_kwDOBm6k_c5bKZdT,1987,installpython3.com is now a spam website,9599,simonw,closed,0,,,,,4,2023-01-11T17:55:12Z,2023-01-11T18:29:26Z,2023-01-11T18:29:25Z,OWNER,,"Need to stop linking to it from the docs. I'll link to https://www.python.org/about/gettingstarted/ instead.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1987/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1529707837,I_kwDOBm6k_c5bLX09,1988,Reconsider pattern where plugins could break existing template context,9599,simonw,open,0,,,3268330,Datasette 1.0,4,2023-01-11T21:13:43Z,2023-01-11T21:25:05Z,,OWNER,,"> I hadn't run into an issue with plugins like `datasette-template-sql` interfering with the existing context for other features before! Definitely not a good thing. 
_Originally posted by @simonw in https://github.com/simonw/datasette-write/issues/6#issuecomment-1379490596_ ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1988/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1531991339,I_kwDOBm6k_c5bUFUr,1989,Suggestion: Hiding columns,116795,pax,open,0,,,,,3,2023-01-13T09:33:32Z,2023-03-31T06:18:05Z,,NONE,,As there's the possibility of [hiding tables](https://docs.datasette.io/en/stable/metadata.html#hiding-tables) - I've run into the **need to hide specific columns** - data that's either not relevant for the public or can't be shown due to privacy reasons. ,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1989/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1532000914,I_kwDOBm6k_c5bUHqS,1990,Suggestion: Highlight error messages ('These facets timed out'),116795,pax,open,0,,,,,0,2023-01-13T09:40:58Z,2023-01-13T09:40:58Z,,NONE,,"I had trouble figuring out why faceting didn't work in some instances, it took a while before I noticed the _These facets timed out_ notice. It might help if that notice were highlighted - or given a highlight that fades out, if a permanent one would be too visually disturbing.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1990/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1533673397,I_kwDOBm6k_c5baf-1,1991,fts5 tables are not auto-detected and hidden,83819,keturn,open,0,,,,,0,2023-01-15T06:00:42Z,2023-01-20T04:54:24Z,,NONE,,"I set up a [Datasette instance](https://huggingface.co/spaces/Sygil/INE-dataset-explorer/tree/main) and was following the docs on full-text search. When I used fts4, datasette automatically hid the FTS tables and added the FTS search box where appropriate, but when I changed to fts5 it no longer does either. If I [manually set](https://huggingface.co/spaces/keturn/INED-datasette/blob/main/metadata.json#L9) `fts_table` for a view, then search does work as expected. My table and view creation code looks like this:
```py
connection.execute(""""""CREATE TABLE IF NOT EXISTS captions(image_key text PRIMARY KEY, caption text NOT NULL)
"""""")

connection.execute(""""""CREATE VIRTUAL TABLE captions_fts USING fts5(caption, image_key UNINDEXED, content=captions)
"""""")
```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1991/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1536851861,I_kwDOBm6k_c5bmn-V,1994,Stuck on loading screen,10913053,jackhagley,open,0,,,,,1,2023-01-17T18:33:49Z,2023-01-23T08:21:08Z,,NONE,,"Can’t actually open it!
Downloaded today from the releases tab Running macOS13.1 ``` bin/python3.9 --version Python 3.9.6 Took 83ms bin/python3.9 --version Python 3.9.6 Took 113ms bin/pip install datasette>=0.59 datasette-app-support>=0.11.6 datasette-vega>=0.6.2 datasette-cluster-map>=0.17.1 datasette-pretty-json>=0.2.1 datasette-edit-schema>=0.4 datasette-configure-fts>=1.1 datasette-leaflet>=0.2.2 --disable-pip-version-check Requirement already satisfied: datasette>=0.59 in lib/python3.9/site-packages (0.63) Requirement already satisfied: datasette-app-support>=0.11.6 in lib/python3.9/site-packages (0.11.6) Requirement already satisfied: datasette-vega>=0.6.2 in lib/python3.9/site-packages (0.6.2) Requirement already satisfied: datasette-cluster-map>=0.17.1 in lib/python3.9/site-packages (0.17.2) Requirement already satisfied: datasette-pretty-json>=0.2.1 in lib/python3.9/site-packages (0.2.2) Requirement already satisfied: datasette-edit-schema>=0.4 in lib/python3.9/site-packages (0.5.1) Requirement already satisfied: datasette-configure-fts>=1.1 in lib/python3.9/site-packages (1.1) Requirement already satisfied: datasette-leaflet>=0.2.2 in lib/python3.9/site-packages (0.2.2) Requirement already satisfied: click>=7.1.1 in lib/python3.9/site-packages (from datasette>=0.59) (8.1.3) Requirement already satisfied: hupper>=1.9 in lib/python3.9/site-packages (from datasette>=0.59) (1.10.3) Requirement already satisfied: pint>=0.9 in lib/python3.9/site-packages (from datasette>=0.59) (0.20.1) Requirement already satisfied: PyYAML>=5.3 in lib/python3.9/site-packages (from datasette>=0.59) (6.0) Requirement already satisfied: httpx>=0.20 in lib/python3.9/site-packages (from datasette>=0.59) (0.23.0) Requirement already satisfied: aiofiles>=0.4 in lib/python3.9/site-packages (from datasette>=0.59) (22.1.0) Requirement already satisfied: asgi-csrf>=0.9 in lib/python3.9/site-packages (from datasette>=0.59) (0.9) Requirement already satisfied: asgiref>=3.2.10 in lib/python3.9/site-packages (from datasette>=0.59) (3.5.2) Requirement already satisfied: uvicorn>=0.11 in lib/python3.9/site-packages (from datasette>=0.59) (0.19.0) Requirement already satisfied: itsdangerous>=1.1 in lib/python3.9/site-packages (from datasette>=0.59) (2.1.2) Requirement already satisfied: click-default-group-wheel>=1.2.2 in lib/python3.9/site-packages (from datasette>=0.59) (1.2.2) Requirement already satisfied: janus>=0.6.2 in lib/python3.9/site-packages (from datasette>=0.59) (1.0.0) Requirement already satisfied: pluggy>=1.0 in lib/python3.9/site-packages (from datasette>=0.59) (1.0.0) Requirement already satisfied: Jinja2>=2.10.3 in lib/python3.9/site-packages (from datasette>=0.59) (3.1.2) Requirement already satisfied: mergedeep>=1.1.1 in lib/python3.9/site-packages (from datasette>=0.59) (1.3.4) Requirement already satisfied: sqlite-utils in lib/python3.9/site-packages (from datasette-app-support>=0.11.6) (3.30) Requirement already satisfied: packaging in lib/python3.9/site-packages (from datasette-app-support>=0.11.6) (21.3) Requirement already satisfied: python-multipart in lib/python3.9/site-packages (from asgi-csrf>=0.9->datasette>=0.59) (0.0.5) Requirement already satisfied: httpcore<0.16.0,>=0.15.0 in lib/python3.9/site-packages (from httpx>=0.20->datasette>=0.59) (0.15.0) Requirement already satisfied: certifi in lib/python3.9/site-packages (from httpx>=0.20->datasette>=0.59) (2022.9.24) Requirement already satisfied: rfc3986[idna2008]<2,>=1.3 in lib/python3.9/site-packages (from httpx>=0.20->datasette>=0.59) (1.5.0) Requirement 
already satisfied: sniffio in lib/python3.9/site-packages (from httpx>=0.20->datasette>=0.59) (1.3.0) Requirement already satisfied: h11<0.13,>=0.11 in lib/python3.9/site-packages (from httpcore<0.16.0,>=0.15.0->httpx>=0.20->datasette>=0.59) (0.12.0) Requirement already satisfied: anyio==3.* in lib/python3.9/site-packages (from httpcore<0.16.0,>=0.15.0->httpx>=0.20->datasette>=0.59) (3.6.2) Requirement already satisfied: idna>=2.8 in lib/python3.9/site-packages (from anyio==3.*->httpcore<0.16.0,>=0.15.0->httpx>=0.20->datasette>=0.59) (3.4) Requirement already satisfied: typing-extensions>=3.7.4.3 in lib/python3.9/site-packages (from janus>=0.6.2->datasette>=0.59) (4.4.0) Requirement already satisfied: MarkupSafe>=2.0 in lib/python3.9/site-packages (from Jinja2>=2.10.3->datasette>=0.59) (2.1.1) Requirement already satisfied: tabulate in lib/python3.9/site-packages (from sqlite-utils->datasette-app-support>=0.11.6) (0.9.0) Requirement already satisfied: python-dateutil in lib/python3.9/site-packages (from sqlite-utils->datasette-app-support>=0.11.6) (2.8.2) Requirement already satisfied: sqlite-fts4 in lib/python3.9/site-packages (from sqlite-utils->datasette-app-support>=0.11.6) (1.0.3) Requirement already satisfied: pyparsing!=3.0.5,>=2.0.2 in lib/python3.9/site-packages (from packaging->datasette-app-support>=0.11.6) (3.0.9) Requirement already satisfied: six>=1.5 in lib/python3.9/site-packages (from python-dateutil->sqlite-utils->datasette-app-support>=0.11.6) (1.16.0) Took 784ms ``` STUCK",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1994/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1538197093,I_kwDOBm6k_c5brwZl,1995,foreign_keys error 500,137183,jonschoning,open,0,,,,,0,2023-01-18T15:27:36Z,2023-01-18T16:44:01Z,,NONE,,"**Error 500 expected string or bytes-like object** [espial-new.sqlite3.zip](https://github.com/simonw/datasette/files/10447965/espial-new.sqlite3.zip) run `datasette espial-new.sqlite3` & click on any table other than `User` ``` /home/jon/.local/lib/python3.10/site-packages/datasette/app.py:814 in │ │ expand_foreign_keys │ │ │ │ 811 │ │ │ from {other_table} │ │ 812 │ │ │ where {other_column} in ({placeholders}) │ │ 813 │ │ """""".format( │ │ ❱ 814 │ │ │ other_column=escape_sqlite(fk[""other_column""]), │ │ 815 │ │ │ label_column=escape_sqlite(label_column), │ │ 816 │ │ │ other_table=escape_sqlite(fk[""other_table""]), │ │ 817 │ │ │ placeholders="", "".join([""?""] * len(set(values))), │ │ │ │ ╭───────────────────────────── locals ──────────────────────────────╮ │ │ │ column = 'user_id' │ │ │ │ database = 'espial-new' │ │ │ │ db = │ │ │ │ fk = { │ │ │ │ │ 'column': 'user_id', │ │ │ │ │ 'other_table': 'user', │ │ │ │ │ 'other_column': None │ │ │ │ } │ │ │ │ foreign_keys = [ │ │ │ │ │ { │ │ │ │ │ │ 'column': 'user_id', │ │ │ │ │ │ 'other_table': 'user', │ │ │ │ │ │ 'other_column': None │ │ │ │ │ } │ │ │ │ ] │ │ │ │ label_column = 'name' │ │ │ │ labeled_fks = {} │ │ │ │ self = │ │ │ │ table = 'bookmark' │ │ │ │ values = [] │ │ │ ╰───────────────────────────────────────────────────────────────────╯ │ │ │ │ /home/jon/.local/lib/python3.10/site-packages/datasette/utils/__init__.py:346 │ │ in escape_sqlite │ │ │ │ 343 │ │ 344 │ │ 345 def escape_sqlite(s): │ │ ❱ 346 │ if _boring_keyword_re.match(s) and (s.lower() not in reserved_words) │ │ 347 │ │ return s │ │ 348 │ else: │ │ 349 │ │ return f""[{s}]"" │ │ │ │ ╭─ locals ─╮ │ │ │ s = 
None │ │ │ ╰──────────╯ │ ╰─────────────────────────────────────────────────────────────────────────────────╯ TypeError: expected string or bytes-like object Traceback (most recent call last): File ""/home/jon/.local/lib/python3.10/site-packages/datasette/app.py"", line 1354, in route_path response = await view(request, send) File ""/home/jon/.local/lib/python3.10/site-packages/datasette/views/base.py"", line 134, in view return await self.dispatch_request(request) File ""/home/jon/.local/lib/python3.10/site-packages/datasette/views/base.py"", line 91, in dispatch_request return await handler(request) File ""/home/jon/.local/lib/python3.10/site-packages/datasette/views/base.py"", line 361, in get response_or_template_contexts = await self.data(request, **data_kwargs) File ""/home/jon/.local/lib/python3.10/site-packages/datasette/views/table.py"", line 158, in data return await self._data_traced(request, default_labels, _next, _size) File ""/home/jon/.local/lib/python3.10/site-packages/datasette/views/table.py"", line 603, in _data_traced await self.ds.expand_foreign_keys( File ""/home/jon/.local/lib/python3.10/site-packages/datasette/app.py"", line 814, in expand_foreign_keys other_column=escape_sqlite(fk[""other_column""]), File ""/home/jon/.local/lib/python3.10/site-packages/datasette/utils/__init__.py"", line 346, in escape_sqlite if _boring_keyword_re.match(s) and (s.lower() not in reserved_words): TypeError: expected string or bytes-like object INFO: 127.0.0.1:38574 - ""GET /espial-new/bookmark HTTP/1.1"" 500 Internal Server Error INFO: 127.0.0.1:38574 - ""GET /-/static/app.css?d59929 HTTP/1.1"" 200 OK ``` Schema: ``` CREATE TABLE IF NOT EXISTS ""user"" ( ""id"" INTEGER PRIMARY KEY, ""name"" VARCHAR NOT NULL, ""password_hash"" VARCHAR NOT NULL, ""api_token"" VARCHAR NULL, ""private_default"" BOOLEAN NOT NULL, ""archive_default"" BOOLEAN NOT NULL, ""privacy_lock"" BOOLEAN NOT NULL, CONSTRAINT ""unique_user_name"" UNIQUE (""name"") ); CREATE TABLE IF NOT EXISTS ""bookmark"" ( ""id"" INTEGER PRIMARY KEY, ""user_id"" INTEGER NOT NULL REFERENCES ""user"" ON DELETE RESTRICT ON UPDATE RESTRICT, ""slug"" VARCHAR NOT NULL DEFAULT (Lower(Hex(Randomblob(6)))), ""href"" VARCHAR NOT NULL, ""description"" VARCHAR NOT NULL, ""extended"" VARCHAR NOT NULL, ""time"" TIMESTAMP NOT NULL, ""shared"" BOOLEAN NOT NULL, ""to_read"" BOOLEAN NOT NULL, ""selected"" BOOLEAN NOT NULL, ""archive_href"" VARCHAR NULL, CONSTRAINT ""unique_user_href"" UNIQUE (""user_id"", ""href""), CONSTRAINT ""unique_user_slug"" UNIQUE (""user_id"", ""slug"") ); CREATE TABLE IF NOT EXISTS ""bookmark_tag"" ( ""id"" INTEGER PRIMARY KEY, ""user_id"" INTEGER NOT NULL REFERENCES ""user"" ON DELETE RESTRICT ON UPDATE RESTRICT, ""tag"" VARCHAR NOT NULL, ""bookmark_id"" INTEGER NOT NULL REFERENCES ""bookmark"" ON DELETE RESTRICT ON UPDATE RESTRICT, ""seq"" INTEGER NOT NULL, CONSTRAINT ""unique_user_tag_bookmark_id"" UNIQUE (""user_id"", ""tag"", ""bookmark_id""), CONSTRAINT ""unique_user_bookmark_id_tag_seq"" UNIQUE (""user_id"", ""bookmark_id"", ""tag"", ""seq"") ); CREATE TABLE IF NOT EXISTS ""note"" ( ""id"" INTEGER PRIMARY KEY, ""user_id"" INTEGER NOT NULL REFERENCES ""user"" ON DELETE RESTRICT ON UPDATE RESTRICT, ""slug"" VARCHAR NOT NULL DEFAULT (Lower(Hex(Randomblob(10)))), ""length"" INTEGER NOT NULL, ""title"" VARCHAR NOT NULL, ""text"" VARCHAR NOT NULL, ""is_markdown"" BOOLEAN NOT NULL, ""shared"" BOOLEAN NOT NULL DEFAULT false, ""created"" TIMESTAMP NOT NULL, ""updated"" TIMESTAMP NOT NULL ); CREATE INDEX idx_bookmark_time ON 
bookmark (user_id, time DESC); CREATE INDEX idx_bookmark_tag_bookmark_id ON bookmark_tag (bookmark_id, id, tag, seq); CREATE INDEX idx_note_user_created ON note (user_id, created DESC); ``` ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1995/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1557507274,I_kwDOBm6k_c5c1azK,2005,`extra_template_vars` should be OK to return `None`,9599,simonw,open,0,,,,,1,2023-01-26T01:40:45Z,2023-01-26T01:41:50Z,,OWNER,,"Got this exception and had to make sure it always returned `{}`: ``` File "".../python3.11/site-packages/datasette/app.py"", line 1049, in render_template assert isinstance(extra_vars, dict), ""extra_vars is of type {}"".format( AssertionError: extra_vars is of type ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2005/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1558644003,I_kwDOBm6k_c5c5wUj,2006,Teach `datasette publish` to pin to `datasette<1.0` in a 0.x release,9599,simonw,open,0,,,3268330,Datasette 1.0,2,2023-01-26T19:17:40Z,2023-01-26T19:20:53Z,,OWNER,,"I just realized that when I ship Datasette 1.0 there may be automated deployments out there which could deploy the 1.0 version by accident, potentially breaking any customizations that aren't compatible with the 1.0 changes. I can hopefully help avoid that by shipping one last entry in the `0.x` series that ensures `datasette publish` pins to `<1.0` when it installs Datasette itself.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2006/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1551113681,I_kwDOBm6k_c5cdB3R,1998,`datasette --version` should also show the SQLite version,9599,simonw,open,0,,,,,2,2023-01-20T16:11:30Z,2023-01-20T18:19:06Z,,OWNER,,Idea came up here: https://discord.com/channels/823971286308356157/823971286941302908/1066026473003159783,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1998/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1552368054,I_kwDOBm6k_c5ch0G2,2000,rewrite_sql hook,193185,cldellow,open,0,,,,,1,2023-01-23T01:02:52Z,2023-01-23T06:08:01Z,,CONTRIBUTOR,,"I'm not sold that this is a good idea, but thought it'd be worth writing up a ticket. Proposal: add a hook like ```python def rewrite_sql(datasette, database, request, fn, sql, params) ``` It would be called from Database.execute, Database.execute_write, Database.execute_write_script, Database.execute_write_many before running the user's SQL. `fn` would indicate which method was being used, in case that's relevant for the SQL inspection -- for example `execute` only permits a single statement. The hook could return a SQL statement to be executed instead, or an async function to be awaited on that returned the SQL to be executed. 
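To make that concrete, a sketch of what a plugin using the proposed hook might look like (entirely hypothetical - neither the hook nor this plugin exists today):
```python
from datasette import hookimpl

@hookimpl
def rewrite_sql(datasette, database, request, fn, sql, params):
    # Hypothetical policy: refuse queries that call readfile(), and
    # transparently rewrite references to a renamed table
    if 'readfile(' in sql.lower():
        raise ValueError('readfile() is not allowed here')
    return sql.replace('old_events', 'events')
```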
Plugins that could be written with this hook: - https://github.com/cldellow/datasette-ersatz-table-valued-functions would use this to avoid monkey-patching - a plugin to inspect and reject unsafe Spatialite function calls (reported by [Simon in Discord](https://discord.com/channels/823971286308356157/823971286941302908/1066438832293159004)) - a plugin to do more general rewrites of queries to enforce table or row-level security, for example, based on the currently logged in actor's ID - a plugin to maintain audit tables when users write to a table - a plugin to cache expensive queries (eg the queries that drive facets) - these could allow stale reads if previously cached, then refresh them in an offline queue Flaws with this idea: `execute_fn` and `execute_write_fn` would not go through this hook, which limits the guarantees you can make about it for security purposes.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2000/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1553615704,I_kwDOBm6k_c5cmktY,2001,Datasette is not compatible with SQLite's strict quoting compilation option,406380,gwk,open,0,,,,,4,2023-01-23T19:10:07Z,2023-01-25T04:59:58Z,,NONE,,"I have linked Python3.11 on macOS against recent SQLite that was compiled using `-DSQLITE_DQS=0`. This option disables interpretation of double-quoted identifiers as string literals, described in the SQLite docs as a ""MySQL 3.x misfeature"". See https://www.sqlite.org/quirks.html#dblquote for background. Datasette uses the double-quote syntax in a number of key places, and is thus completely broken in this environment. My experience was to `pip install datasette`, then run `datasette serve -I my-data.db`. When I visit `http://127.0.0.1:8001` I get a 500 response. The error: `sqlite3.OperationalError: no such column: geometry_columns` The responsible SQL: `'select 1 from sqlite_master where tbl_name = ""geometry_columns""'` I then installed datasette from GitHub master in development mode and changed the offending SQL to use correct quotes: `""select 1 from sqlite_master where tbl_name = 'geometry_columns'""`. 
With this change, I get a little further, but have the same problem with the first table name in my database (in my case, ""Meta""): ``` OperationalError: no such column: Meta Traceback (most recent call last): File ""/Users/gwk/external/datasette/datasette/app.py"", line 1522, in route_path response = await view(request, send) ^^^^^^^^^^^^^^^^^^^^^^^^^ File ""/Users/gwk/external/datasette/datasette/views/base.py"", line 151, in view return await self.dispatch_request(request) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File ""/Users/gwk/external/datasette/datasette/views/base.py"", line 105, in dispatch_request response = await handler(request) ^^^^^^^^^^^^^^^^^^^^^^ File ""/Users/gwk/external/datasette/datasette/views/index.py"", line 70, in get ""fts_table"": await db.fts_table(table), ^^^^^^^^^^^^^^^^^^^^^^^^^ File ""/Users/gwk/external/datasette/datasette/database.py"", line 363, in fts_table return await self.execute_fn(lambda conn: detect_fts(conn, table)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File ""/Users/gwk/external/datasette/datasette/database.py"", line 213, in execute_fn return await asyncio.get_event_loop().run_in_executor( ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File ""/usr/local/py/Python.framework/Versions/3.11/lib/python3.11/concurrent/futures/thread.py"", line 58, in run result = self.fn(*self.args, **self.kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File ""/Users/gwk/external/datasette/datasette/database.py"", line 211, in in_thread return fn(conn) ^^^^^^^^ File ""/Users/gwk/external/datasette/datasette/database.py"", line 363, in return await self.execute_fn(lambda conn: detect_fts(conn, table)) ^^^^^^^^^^^^^^^^^^^^^^^ File ""/Users/gwk/external/datasette/datasette/utils/__init__.py"", line 588, in detect_fts rows = conn.execute(detect_fts_sql(table)).fetchall() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ sqlite3.OperationalError: no such column: Meta INFO: 127.0.0.1:50258 - ""GET / HTTP/1.1"" 500 Internal Server Error ``` I will try to continue playing with this, but I also hope that the datasette developers will enable this mode in a test environment as I am unlikely to be able to exercise all of the SQL in the codebase, or make a pull request very soon. Note that the DQS setting compile-time option can be overridden at runtime with calls to the C API: ``` sqlite3_db_config(db, SQLITE_DBCONFIG_DQS_DDL, 0, (void*)0); sqlite3_db_config(db, SQLITE_DBCONFIG_DQS_DML, 0, (void*)0); ``` As far as I can tell, `sqlite3_db_config` is not exposed in Python, but perhaps we could figure out how to invoke it using `ctypes`. 
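For reference, the two quoting behaviours can be compared from stock Python (a sketch - the commented-out line is the form that fails under `-DSQLITE_DQS=0` with `no such column: hello`):
```python
import sqlite3

conn = sqlite3.connect(':memory:')
conn.execute('CREATE TABLE t (name text)')

# Misfeature form - an identifier under -DSQLITE_DQS=0, a string otherwise:
# conn.execute('SELECT * FROM t WHERE name = ""hello""')

# Standards-compliant forms that work under any build:
conn.execute(""SELECT * FROM t WHERE name = 'hello'"")
conn.execute('SELECT * FROM t WHERE name = ?', ('hello',))
```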
",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2001/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1554032168,I_kwDOBm6k_c5coKYo,2002,Document how actors are displayed,9599,simonw,open,0,,,,,0,2023-01-24T00:08:49Z,2023-01-24T00:08:49Z,,OWNER,,"https://github.com/simonw/datasette/blob/e4ebef082de90db4e1b8527abc0d582b7ae0bc9d/datasette/utils/__init__.py#L1052-L1056 This logic should be reflected in the documentation on https://docs.datasette.io/en/stable/authentication.html#actors",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2002/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1575880841,I_kwDOBm6k_c5d7giJ,2020,"Documentation refers to ""off"" setting; doesn't seem to work, ""false"" does",1350673,dmick,open,0,,,,,0,2023-02-08T10:38:10Z,2023-02-08T10:38:10Z,,NONE,,"https://docs.datasette.io/en/stable/settings.html#suggest-facets, among others, suggests using ""off"" to disable the setting; however, this doesn't appear to work in the JSON config files, where it apparently needs to be a ""JSON boolean"" and have the values ""true"" or ""false"". Perhaps the Python code is more flexible?...but either way, the documentation probably should mention it.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2020/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1560662739,I_kwDOBm6k_c5dBdLT,2007,`render_cell()` hook should take an optional `request` argument,9599,simonw,closed,0,,,,,1,2023-01-28T03:13:00Z,2023-08-09T17:15:03Z,2023-01-28T03:34:26Z,OWNER,,From Discord: https://discordapp.com/channels/823971286308356157/996877076982415491/1068227071156965486,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2007/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1563264257,I_kwDOBm6k_c5dLYUB,2010,Row page should default to card view,9599,simonw,open,0,,,3268330,Datasette 1.0,1,2023-01-30T21:49:37Z,2023-01-30T21:52:06Z,,OWNER,,"Datasette currently uses the same table layout on the row pages as it does on the table pages: https://datasette.io/content/pypi_packages?_sort=name&name__exact=datasette-column-inspect https://datasette.io/content/pypi_packages/datasette-column-inspect If you shrink down to mobile width you get this instead, on both of those pages: I think that view, which I think of as the ""card view"", is plain better if you're looking at just a single row - and it (or a variant of it) should be the default presentation on the row page. 
",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2010/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1564769997,I_kwDOBm6k_c5dRH7N,2011,"Applied facet did not result in an ""x"" icon to dismiss it",9599,simonw,open,0,,,,,1,2023-01-31T17:57:44Z,2023-01-31T17:58:54Z,,OWNER,,"![CleanShot 2023-01-31 at 09 55 56@2x](https://user-images.githubusercontent.com/9599/215843684-1761a230-d490-4f87-be6d-186319366794.png) That's against this data https://data.sfgov.org/City-Management-and-Ethics/Supplier-Contracts/cqi5-hm2d imported using https://datasette.io/plugins/datasette-socrata It's for `Contract Type` of `Non-Purchasing Contract (Rents, etc.)` - so possible that some of the spaces or punctuation in either the name of the value tripped up the code that decides if the X icon should be displayed.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2011/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1564774831,I_kwDOBm6k_c5dRJGv,2012,Missing space in database summary,9599,simonw,open,0,,,,,0,2023-01-31T18:01:13Z,2023-01-31T18:01:13Z,,OWNER,,"Spotted this on an instance index page: ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2012/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1565179870,I_kwDOBm6k_c5dSr_e,2013,Datasette uses non-standard quoting for identifiers,193185,cldellow,open,0,,,,,0,2023-02-01T00:05:39Z,2023-02-01T00:06:30Z,,CONTRIBUTOR,,"Related to #2001, but where #2001 was about literals, this is about identifiers From https://www.sqlite.org/lang_keywords.html: > ""keyword"" A keyword in double-quotes is an identifier. > [keyword] A keyword enclosed in square brackets is an identifier. This is not standard SQL. This quoting mechanism is used by MS Access and SQL Server and is included in SQLite for compatibility. Datasette uses this quoting here -- https://github.com/simonw/datasette/blob/0b4a28691468b5c758df74fa1d72a823813c96bf/datasette/utils/__init__.py#L345-L349, in some of the other DB access code, and in some of the test fixtures. Migrating to standard double quote identifiers would make it easier to get Datasette working with alternative backends",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2013/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1571207083,I_kwDOBm6k_c5dprer,2016,Database metadata fields like description are not available in the index page template's context,9993,palewire,open,0,,,3268330,Datasette 1.0,1,2023-02-05T02:25:53Z,2023-02-05T22:56:43Z,,NONE,,"When looping through `databases` in the index.html template, I'd like to print the description of each database alongside its name. But it appears that isn't passed in from the view, unless I'm missing it. 
It would be great to have that.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2016/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1571711808,I_kwDOBm6k_c5drmtA,2018,`check_visibility` gives confusing (wrong?) results if permission is `None`,193185,cldellow,open,0,,,,,0,2023-02-06T01:03:08Z,2023-02-06T01:03:46Z,,CONTRIBUTOR,,"I'm trying to gate access to an edit UI on the user having `update-row` on the underlying view or table. I expected [datasette.check_visibility](https://docs.datasette.io/en/latest/internals.html#await-check-visibility-actor-action-none-resource-none-permissions-none) to be a good way to do this: ```python visible, private = await datasette.check_visibility( request.actor, permissions=[ (""update-row"", (database, table)), ], ) if not visible: return None ``` But `visible` is returning true, even when there is no explicit `update-row` permission. (In this case, `request.actor` is `None`.) Based on [the update-row permissions docs](https://docs.datasette.io/en/latest/authentication.html#update-row), I expected this to be default deny, and so no explicit permission would result in false. I think the root cause is that `check_visibility` calls `ensure_permissions` and expects it to throw if the permission is not available. But `ensure_permissions` does not throw when `permission_allowed` returns None: https://github.com/simonw/datasette/blob/1.0a2/datasette/app.py#L825-L829",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2018/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1573424830,I_kwDOBm6k_c5dyI6-,2019,Refactor out the keyset pagination code,9599,simonw,open,0,,,,,14,2023-02-06T23:04:00Z,2023-02-08T01:40:46Z,,OWNER,,"While working on: - #1999 I noticed that some of the most complex code in the existing table view is the code that implements keyset pagination: https://github.com/simonw/datasette/blob/0b4a28691468b5c758df74fa1d72a823813c96bf/datasette/views/table.py#L417-L493 Extracting that into a utility function would simplify that code a lot.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2019/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1592327343,I_kwDOBm6k_c5e6Pyv,2029,"Sorry Simon, didn't know how else to contact you",5804626,llchristopherson,open,0,,,,,0,2023-02-20T19:02:53Z,2023-02-20T19:02:53Z,,NONE,,"Hi Simon, Would you be willing to chat with me about Datasette? I have some questions. I am working on a project to evaluate data ingestion tools for a research organization and I ran across Datasette. I have looked through a lot of your documentation, but still have some questions, which are very specific. If you would be willing to write me back about this, my email is laura@renci.org. 
Thanks, Laura",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2029/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1577548579,I_kwDOBm6k_c5eB3sj,2021,Docker images for 1.0 alphas?,1563881,meowcat,open,0,,,,,0,2023-02-09T09:35:52Z,2023-02-09T09:35:52Z,,NONE,,"Hi, would you consider putting 1.0alpha images on Dockerhub? (Also, how usable are the alphas?)",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2021/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1578609658,I_kwDOBm6k_c5eF6v6,2022,Error 500 - not clear the cause,1667631,DavidPratten,closed,0,,,,,1,2023-02-09T20:57:17Z,2023-02-09T21:13:50Z,2023-02-09T21:13:50Z,NONE,,"On the database that I have sent via linkedIn, datasette works great, but the following URL gives a 500 error. http://127.0.0.1:8001/literature/authors_papers?authorId=100550354 The cause of the error is not apparent. Is this expected behaviour? David",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2022/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1579695809,I_kwDOBm6k_c5eKD7B,2023,Error: Invalid setting 'hash_urls' in settings.json in 0.64.1,80409402,mlaparie,closed,0,,,,,2,2023-02-10T13:35:01Z,2023-02-10T15:40:00Z,2023-02-10T15:39:59Z,NONE,,"On a Debian machine, using datasette 0.64.1 installed with `pip3`, I am getting a `datasette[114272]: Error: Invalid setting 'hash_urls' in settings.json` in `journalctl -xe`. The same settings work on 0.54.1 on another Debian server. This is my `settings.json`: ```json { ""default_page_size"": 200, ""max_returned_rows"": 8000, ""num_sql_threads"": 3, ""sql_time_limit_ms"": 1000, ""default_facet_size"": 30, ""facet_time_limit_ms"": 200, ""facet_suggest_time_limit_ms"": 50, ""hash_urls"": false, ""allow_facet"": true, ""allow_download"": true, ""suggest_facets"": true, ""default_cache_ttl"": 5, ""default_cache_ttl_hashed"": 31536000, ""cache_size_kb"": 0, ""allow_csv_stream"": true, ""max_csv_mb"": 100, ""truncate_cells_html"": 2048, ""force_https_urls"": false, ""template_debug"": false, ""base_url"": ""/pclim/db/"" } ``` This looks ok to me. Would you have any ideas?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2023/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1579973223,I_kwDOBm6k_c5eLHpn,2024,Mention WAL mode in documentation,9599,simonw,open,0,,,,,1,2023-02-10T16:11:10Z,2023-02-10T16:11:53Z,,OWNER,,It's not currently obvious from the docs how you can ensure that Datasette runs well in situations where other processes may update the underlying SQLite files.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2024/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1590183272,I_kwDOBm6k_c5eyEVo,2027,"How to redirect from ""/"" to a specific db/table",1350673,dmick,open,0,,,,,4,2023-02-18T03:14:01Z,2023-03-08T04:42:22Z,,NONE,,"Using nginx to redirect public IP to the local uvicorn server as 'normal'. 
I can't figure out how to redirect such that '/' results in accessing the one db/table I want to serve; redirecting / to /db/table breaks some of the CSS; fooling with base_url doesn't seem to help. Can someone explain this, if it's possible?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2027/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1594383280,I_kwDOBm6k_c5fCFuw,2030,How to use Datasette with apache webserver on GCP?,19700859,gk7279,closed,0,,,,,2,2023-02-22T03:08:49Z,2023-02-22T21:54:39Z,2023-02-22T21:54:39Z,NONE,,"Hi Simon and Datasette team- I have installed apache2 webserver inside GCP VM using apt. I can see my ""Hello World"" index.html if I use the external IP of this GCP VM in a browser. However, when I try to run datasette with different combinations of -h and -p, I am still unable to access the webpage. I cannot install Docker on this VM. Any pointers to use datasette with the already existing apache2 webserver on GCP are appreciated. Thanks.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2030/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1605959201,I_kwDOBm6k_c5fuP4h,2032,datasette errors when foreign key integrity is enabled,193185,cldellow,open,0,,,,,0,2023-03-02T01:27:51Z,2023-03-02T01:31:58Z,,CONTRIBUTOR,,"By default, [SQLite does not enforce foreign key constraints](https://www.sqlite.org/foreignkeys.html#fk_enable). I typically enable these checks by running: ```sql PRAGMA foreign_keys = ON; ``` inside of a `prepare_connection` hook. If a plugin causes the schema to change (e.g. datasette-scraper creating a new table, or datasette-edit-schema changing a column), then https://github.com/simonw/datasette/blob/0b4a28691468b5c758df74fa1d72a823813c96bf/datasette/utils/internal_db.py#L71-L77 will fail with: ``` FOREIGN KEY constraint failed ``` This could be resolved by either: - deleting from the `tables` column last - changing the schema so that the foreign keys have [ON DELETE CASCADE](https://www.sqlite.org/foreignkeys.html#fk_actions) Let me know if you'd be open to a PR that addresses this -- since foreign key constraints aren't enabled by default, I guess it's questionable whether this is a bug. I think I can work around this by inspecting the database parameter in `prepare_connection` and trying not to enable fkey checks on the `_internal` database.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2032/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1612296210,I_kwDOBm6k_c5gGbAS,2033,`datasette install -r requirements.txt`,9599,simonw,closed,0,,,,,2,2023-03-06T22:17:17Z,2023-03-06T22:54:52Z,2023-03-06T22:27:34Z,OWNER,,"Would be useful for cases where you want to install a whole set of plugins in one go, e.g.
when running tutorials in GitHub Codespaces.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2033/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1615692818,I_kwDOBm6k_c5gTYQS,2035,Potential feature: special support for `?a=1&a=2` on the query page,9599,simonw,open,0,,,3268330,Datasette 1.0,14,2023-03-08T18:05:03Z,2023-03-31T16:09:08Z,,OWNER,,"From a discussion on Discord: https://discord.com/channels/823971286308356157/996877076982415491/1082789517062320138 The key idea is to make it easier for people to implement `where id in (...)` that's populated from query string arguments. What if you could add `?id=11&id=32&id=62` to the URL and have that made available as a list that can be used in the query?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2035/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1615862295,I_kwDOBm6k_c5gUBoX,2036,"`publish cloudrun` reuses image tags, which can lead to very surprising deploy problems",9599,simonw,closed,0,,,,,6,2023-03-08T20:11:44Z,2023-03-08T20:57:34Z,2023-03-08T20:57:34Z,OWNER,,"See this issue: - https://github.com/simonw/datasette.io/issues/141",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2036/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1615891776,I_kwDOBm6k_c5gUI1A,2037,Test failure: FAILED tests/test_cli.py::test_install_requirements - FileNotFoundError,9599,simonw,closed,0,,,,,3,2023-03-08T20:30:06Z,2023-03-09T22:33:39Z,2023-03-09T22:33:39Z,OWNER,,"> FAILED tests/test_cli.py::test_install_requirements - FileNotFoundError: [Errno 2] No such file or directory From https://github.com/simonw/datasette/actions/runs/4348548218/jobs/7597208191 ``` =================================== FAILURES =================================== __________________________ test_install_requirements ___________________________ run_module = @mock.patch(""datasette.cli.run_module"") def test_install_requirements(run_module): runner = CliRunner() > with runner.isolated_filesystem(): /home/runner/work/datasette/datasette/tests/test_cli.py:184: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /opt/hostedtoolcache/Python/3.9.16/x64/lib/python3.9/contextlib.py:119: in __enter__ return next(self.gen) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , temp_dir = None @contextlib.contextmanager def isolated_filesystem( self, temp_dir: t.Optional[t.Union[str, os.PathLike]] = None ) -> t.Iterator[str]: """"""A context manager that creates a temporary directory and changes the current working directory to it. This isolates tests that affect the contents of the CWD to prevent them from interfering with each other. :param temp_dir: Create the temporary directory under this directory. If given, the created directory is not removed when exiting. .. versionchanged:: 8.0 Added the ``temp_dir`` parameter. 
"""""" > cwd = os.getcwd() E FileNotFoundError: [Errno 2] No such file or directory /opt/hostedtoolcache/Python/3.9.16/x64/lib/python3.9/site-packages/click/testing.py:466: FileNotFoundError ``` Not sure why it only affected the ""[Calculate test coverage](https://github.com/simonw/datasette/actions/workflows/test-coverage.yml)"" one.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2037/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1618249044,I_kwDOBm6k_c5gdIVU,2038,Consider a `strict_templates` setting,9599,simonw,open,0,,,,,2,2023-03-10T02:09:13Z,2023-03-10T02:11:06Z,,OWNER,,"A setting which turns on Jinja strict mode, so any templates that access undefined variables raise a hard error. Prototype here: ```diff diff --git a/datasette/app.py b/datasette/app.py index 40416713..1428a3f0 100644 --- a/datasette/app.py +++ b/datasette/app.py @@ -200,6 +200,7 @@ SETTINGS = ( ""Allow display of SQL trace debug information with ?_trace=1"", ), Setting(""base_url"", ""/"", ""Datasette URLs should use this base path""), + Setting(""strict_templates"", False, ""Raise errors for undefined template variables""), ) _HASH_URLS_REMOVED = ""The hash_urls setting has been removed, try the datasette-hashed-urls plugin instead"" OBSOLETE_SETTINGS = { @@ -399,11 +400,14 @@ class Datasette: ), ] ) + env_extras = {} + if self.setting(""strict_templates""): + env_extras[""undefined""] = StrictUndefined self.jinja_env = Environment( loader=template_loader, autoescape=True, enable_async=True, - undefined=StrictUndefined, + **env_extras, ) self.jinja_env.filters[""escape_css_string""] = escape_css_string self.jinja_env.filters[""quote_plus""] = urllib.parse.quote_plus ``` Explored this idea a bit in: - #1999",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2038/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1620515757,I_kwDOBm6k_c5glxut,2039,Subtle bug with `--load-extension` and `--static` flags with absolute Windows paths with`C:\`,15178711,asg017,open,0,,,,,0,2023-03-12T21:18:52Z,2023-03-12T21:18:52Z,,CONTRIBUTOR,,"From the Datasette discord: A user tried running the following command on windows: ``` datasette --load-extension=""C:\spatialite\mod_spatialite-5.0.1-win-x86\mod_spatialite.dll"" ``` This failed with `""The specified module could not be found""`, because the entrypoint option introduced in #1789 splits the input differently. Instead of loading the extension found at `""C:\spatialite\mod_spatialite-5.0.1-win-x86\mod_spatialite.dll""`, it instead tried to load the extension at `""C""` with entrypoint `""\spatialite\mod_spatialite-5.0.1-win-x86\mod_spatialite.dll"". This is hard because most absolute windows paths have a colon in them, like `C:\foo.txt` or `D:\bar.txt`. I'd image the `--static` flag is also vulnerable to this type of bug. The ""solution"" is to use a relative path instead, but that doesn't feel that great. 
",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2039/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1633077183,I_kwDOBm6k_c5hVse_,2041,Remove obsolete table POST code,9599,simonw,closed,0,,,8755003,Datasette 1.0a-next,2,2023-03-21T01:01:40Z,2023-03-21T01:17:44Z,2023-03-21T01:17:43Z,OWNER,,"Spotted this in: - #1999 `POST /db/table` currently executes obsolete code for inserting a row - I replaced that with `/db/table/-/insert` in https://github.com/simonw/datasette/commit/6e788b49edf4f842c0817f006eb9d865778eea5e but forgot to remove the old code.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2041/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1636616315,I_kwDOBm6k_c5hjMh7,2042,Gather feedback on new ?_extra= design,9599,simonw,open,0,,,,,0,2023-03-22T23:07:43Z,2023-03-22T23:08:19Z,,OWNER,,"Now that I've landed: - #1999 See also: - #262 I want to get some feedback from people on the design of the new `?_extra=` feature, before freezing it into Datasette 1.0. The big change is that the default JSON representation is now MUCH slimmer - it only gives you keys for `""next""` and `""rows""`, where rows is a list of JSON objects (not a list of arrays as was previously the default) - for example https://latest.datasette.io/fixtures/sortable.json If you want extra stuff you can ask for it with the new `?_extra=` parameter - e.g. https://latest.datasette.io/fixtures/sortable.json?_extra=columns&_extra=suggested_facets You can use `?_extra=extras` to see a list of available extras: https://latest.datasette.io/fixtures/sortable.json?_extra=extras ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2042/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1641013220,I_kwDOBm6k_c5hz9_k,2045,First column on a view page has no facet option in cog menu,9599,simonw,open,0,,,3268330,Datasette 1.0,0,2023-03-26T18:02:47Z,2023-03-26T18:02:48Z,,OWNER,,"e.g. first column on this page - cog menu has no option to facet. 
https://datasette.io/content/tools ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2045/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1657861026,I_kwDOBm6k_c5i0POi,2054,"Make detailed notes on how table, query and row views work right now",9599,simonw,open,0,,,,,13,2023-04-06T18:21:09Z,2023-04-07T20:14:38Z,,OWNER,,"Research to help influence the following: - #2049 - #2053 - #2050 - #262 ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2054/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1646068413,I_kwDOBm6k_c5iHQK9,2048,Test failures encountered while packaging for GNU Guix,8332263,Apteryks,open,0,,,,,0,2023-03-29T15:36:54Z,2023-03-29T15:36:54Z,,NONE,,"Hello, While reviewing a packaged submitted to Guix to add `datasette`, the test suite produces the following errors: ``` =================================== FAILURES =================================== _________________________ test_row_strange_table_name __________________________ [gw21] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python app_client = def test_row_strange_table_name(app_client): response = app_client.get( ""/fixtures/table~2Fwith~2Fslashes~2Ecsv/3.json?_shape=objects"" ) > assert response.status == 200 E assert 400 == 200 E + where 400 = .status /tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_api.py:701: AssertionError ----------------------------- Captured stderr call ----------------------------- ERROR: conn=, sql = 'select rowid, * from [table%7E2Fwith%7E2Fslashes%7E2Ecsv] where ""rowid""=:p0', params = {'p0': '3'}: no such table: table%7E2Fwith%7E2Fslashes%7E2Ecsv _______________ test_database_page_for_database_with_dot_in_name _______________ [gw15] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python app_client_with_dot = def test_database_page_for_database_with_dot_in_name(app_client_with_dot): response = app_client_with_dot.get(""/fixtures~2Edot.json"") > assert response.status == 200 E assert 302 == 200 E + where 302 = .status /tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_api.py:633: AssertionError ___________________ test_tilde_encoded_database_names[fo%o] ____________________ [gw6] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python db_name = 'fo%o' @pytest.mark.asyncio @pytest.mark.parametrize(""db_name"", (""foo"", r""fo%o"", ""f~/c.d"")) async def test_tilde_encoded_database_names(db_name): ds = Datasette() ds.add_memory_database(db_name) response = await ds.client.get(""/.json"") assert db_name in response.json().keys() path = response.json()[db_name][""path""] # And the JSON for that database response2 = await ds.client.get(path + "".json"") > assert response2.status_code == 200 E assert 302 == 200 E + where 302 = .status_code /tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_api.py:983: AssertionError __________________ test_tilde_encoded_database_names[f~/c.d] ___________________ [gw7] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python db_name = 'f~/c.d' @pytest.mark.asyncio @pytest.mark.parametrize(""db_name"", (""foo"", r""fo%o"", ""f~/c.d"")) async def test_tilde_encoded_database_names(db_name): ds = 
Datasette() ds.add_memory_database(db_name) response = await ds.client.get(""/.json"") assert db_name in response.json().keys() path = response.json()[db_name][""path""] # And the JSON for that database response2 = await ds.client.get(path + "".json"") > assert response2.status_code == 200 E assert 302 == 200 E + where 302 = .status_code /tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_api.py:983: AssertionError ______________ test_database_with_space_in_name[/searchable.json] ______________ [gw21] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python app_client_two_attached_databases = path = '/searchable.json' @pytest.mark.parametrize( ""path"", ( ""/"", "".json"", ""/searchable"", ""/searchable.json"", ""/searchable_view"", ""/searchable_view.json"", ), ) def test_database_with_space_in_name(app_client_two_attached_databases, path): > response = app_client_two_attached_databases.get( ""/extra~20database"" + path, follow_redirects=True ) /tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_api.py:920: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /gnu/store/mcclmphjgbrgpa0v037a4nlq336482g8-python-asgiref-3.4.1/lib/python3.9/site-packages/asgiref/sync.py:223: in __call__ return call_result.result() /gnu/store/65i3nhcwmz0p8rqbg48gaavyky4g4hwk-python-3.9.9/lib/python3.9/concurrent/futures/_base.py:438: in result return self.__get_result() /gnu/store/65i3nhcwmz0p8rqbg48gaavyky4g4hwk-python-3.9.9/lib/python3.9/concurrent/futures/_base.py:390: in __get_result raise self._exception /gnu/store/mcclmphjgbrgpa0v037a4nlq336482g8-python-asgiref-3.4.1/lib/python3.9/site-packages/asgiref/sync.py:292: in main_wrap result = await self.awaitable(*args, **kwargs) /tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/utils/testing.py:66: in get return await self._request( /tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/utils/testing.py:156: in _request httpx_response = await self.ds.client.request( /tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/app.py:1602: in request return await client.request( /gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1527: in request return await self.send(request, auth=auth, follow_redirects=follow_redirects) /gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1614: in send response = await self._send_handling_auth( /gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1642: in _send_handling_auth response = await self._send_handling_redirects( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = request = follow_redirects = True history = [, , , , , , ...] async def _send_handling_redirects( self, request: Request, follow_redirects: bool, history: typing.List[Response], ) -> Response: while True: if len(history) > self.max_redirects: > raise TooManyRedirects( ""Exceeded maximum allowed redirects."", request=request ) E httpx.TooManyRedirects: Exceeded maximum allowed redirects. 
/gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1672: TooManyRedirects ___________________ test_database_with_space_in_name[.json] ____________________ [gw19] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python app_client_two_attached_databases = path = '.json' @pytest.mark.parametrize( ""path"", ( ""/"", "".json"", ""/searchable"", ""/searchable.json"", ""/searchable_view"", ""/searchable_view.json"", ), ) def test_database_with_space_in_name(app_client_two_attached_databases, path): > response = app_client_two_attached_databases.get( ""/extra~20database"" + path, follow_redirects=True ) /tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_api.py:920: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /gnu/store/mcclmphjgbrgpa0v037a4nlq336482g8-python-asgiref-3.4.1/lib/python3.9/site-packages/asgiref/sync.py:223: in __call__ return call_result.result() /gnu/store/65i3nhcwmz0p8rqbg48gaavyky4g4hwk-python-3.9.9/lib/python3.9/concurrent/futures/_base.py:438: in result return self.__get_result() /gnu/store/65i3nhcwmz0p8rqbg48gaavyky4g4hwk-python-3.9.9/lib/python3.9/concurrent/futures/_base.py:390: in __get_result raise self._exception /gnu/store/mcclmphjgbrgpa0v037a4nlq336482g8-python-asgiref-3.4.1/lib/python3.9/site-packages/asgiref/sync.py:292: in main_wrap result = await self.awaitable(*args, **kwargs) /tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/utils/testing.py:66: in get return await self._request( /tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/utils/testing.py:156: in _request httpx_response = await self.ds.client.request( /tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/app.py:1602: in request return await client.request( /gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1527: in request return await self.send(request, auth=auth, follow_redirects=follow_redirects) /gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1614: in send response = await self._send_handling_auth( /gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1642: in _send_handling_auth response = await self._send_handling_redirects( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = request = follow_redirects = True history = [, , , , , , ...] async def _send_handling_redirects( self, request: Request, follow_redirects: bool, history: typing.List[Response], ) -> Response: while True: if len(history) > self.max_redirects: > raise TooManyRedirects( ""Exceeded maximum allowed redirects."", request=request ) E httpx.TooManyRedirects: Exceeded maximum allowed redirects. 
/gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1672: TooManyRedirects ______________ test_database_with_space_in_name[/searchable_view] ______________ [gw22] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python app_client_two_attached_databases = path = '/searchable_view' @pytest.mark.parametrize( ""path"", ( ""/"", "".json"", ""/searchable"", ""/searchable.json"", ""/searchable_view"", ""/searchable_view.json"", ), ) def test_database_with_space_in_name(app_client_two_attached_databases, path): > response = app_client_two_attached_databases.get( ""/extra~20database"" + path, follow_redirects=True ) /tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_api.py:920: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /gnu/store/mcclmphjgbrgpa0v037a4nlq336482g8-python-asgiref-3.4.1/lib/python3.9/site-packages/asgiref/sync.py:223: in __call__ return call_result.result() /gnu/store/65i3nhcwmz0p8rqbg48gaavyky4g4hwk-python-3.9.9/lib/python3.9/concurrent/futures/_base.py:438: in result return self.__get_result() /gnu/store/65i3nhcwmz0p8rqbg48gaavyky4g4hwk-python-3.9.9/lib/python3.9/concurrent/futures/_base.py:390: in __get_result raise self._exception /gnu/store/mcclmphjgbrgpa0v037a4nlq336482g8-python-asgiref-3.4.1/lib/python3.9/site-packages/asgiref/sync.py:292: in main_wrap result = await self.awaitable(*args, **kwargs) /tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/utils/testing.py:66: in get return await self._request( /tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/utils/testing.py:156: in _request httpx_response = await self.ds.client.request( /tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/app.py:1602: in request return await client.request( /gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1527: in request return await self.send(request, auth=auth, follow_redirects=follow_redirects) /gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1614: in send response = await self._send_handling_auth( /gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1642: in _send_handling_auth response = await self._send_handling_redirects( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = request = follow_redirects = True history = [, , , , , , ...] async def _send_handling_redirects( self, request: Request, follow_redirects: bool, history: typing.List[Response], ) -> Response: while True: if len(history) > self.max_redirects: > raise TooManyRedirects( ""Exceeded maximum allowed redirects."", request=request ) E httpx.TooManyRedirects: Exceeded maximum allowed redirects. 
/gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1672: TooManyRedirects _____________________ test_database_with_space_in_name[/] ______________________ [gw18] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python app_client_two_attached_databases = path = '/' @pytest.mark.parametrize( ""path"", ( ""/"", "".json"", ""/searchable"", ""/searchable.json"", ""/searchable_view"", ""/searchable_view.json"", ), ) def test_database_with_space_in_name(app_client_two_attached_databases, path): > response = app_client_two_attached_databases.get( ""/extra~20database"" + path, follow_redirects=True ) /tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_api.py:920: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /gnu/store/mcclmphjgbrgpa0v037a4nlq336482g8-python-asgiref-3.4.1/lib/python3.9/site-packages/asgiref/sync.py:223: in __call__ return call_result.result() /gnu/store/65i3nhcwmz0p8rqbg48gaavyky4g4hwk-python-3.9.9/lib/python3.9/concurrent/futures/_base.py:438: in result return self.__get_result() /gnu/store/65i3nhcwmz0p8rqbg48gaavyky4g4hwk-python-3.9.9/lib/python3.9/concurrent/futures/_base.py:390: in __get_result raise self._exception /gnu/store/mcclmphjgbrgpa0v037a4nlq336482g8-python-asgiref-3.4.1/lib/python3.9/site-packages/asgiref/sync.py:292: in main_wrap result = await self.awaitable(*args, **kwargs) /tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/utils/testing.py:66: in get return await self._request( /tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/utils/testing.py:156: in _request httpx_response = await self.ds.client.request( /tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/app.py:1602: in request return await client.request( /gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1527: in request return await self.send(request, auth=auth, follow_redirects=follow_redirects) /gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1614: in send response = await self._send_handling_auth( /gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1642: in _send_handling_auth response = await self._send_handling_redirects( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = request = follow_redirects = True history = [, , , , , , ...] async def _send_handling_redirects( self, request: Request, follow_redirects: bool, history: typing.List[Response], ) -> Response: while True: if len(history) > self.max_redirects: > raise TooManyRedirects( ""Exceeded maximum allowed redirects."", request=request ) E httpx.TooManyRedirects: Exceeded maximum allowed redirects. 
/gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1672: TooManyRedirects ________________ test_database_with_space_in_name[/searchable] _________________ [gw20] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python app_client_two_attached_databases = path = '/searchable' @pytest.mark.parametrize( ""path"", ( ""/"", "".json"", ""/searchable"", ""/searchable.json"", ""/searchable_view"", ""/searchable_view.json"", ), ) def test_database_with_space_in_name(app_client_two_attached_databases, path): > response = app_client_two_attached_databases.get( ""/extra~20database"" + path, follow_redirects=True ) /tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_api.py:920: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /gnu/store/mcclmphjgbrgpa0v037a4nlq336482g8-python-asgiref-3.4.1/lib/python3.9/site-packages/asgiref/sync.py:223: in __call__ return call_result.result() /gnu/store/65i3nhcwmz0p8rqbg48gaavyky4g4hwk-python-3.9.9/lib/python3.9/concurrent/futures/_base.py:438: in result return self.__get_result() /gnu/store/65i3nhcwmz0p8rqbg48gaavyky4g4hwk-python-3.9.9/lib/python3.9/concurrent/futures/_base.py:390: in __get_result raise self._exception /gnu/store/mcclmphjgbrgpa0v037a4nlq336482g8-python-asgiref-3.4.1/lib/python3.9/site-packages/asgiref/sync.py:292: in main_wrap result = await self.awaitable(*args, **kwargs) /tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/utils/testing.py:66: in get return await self._request( /tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/utils/testing.py:156: in _request httpx_response = await self.ds.client.request( /tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/app.py:1602: in request return await client.request( /gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1527: in request return await self.send(request, auth=auth, follow_redirects=follow_redirects) /gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1614: in send response = await self._send_handling_auth( /gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1642: in _send_handling_auth response = await self._send_handling_redirects( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = request = follow_redirects = True history = [, , , , , , ...] async def _send_handling_redirects( self, request: Request, follow_redirects: bool, history: typing.List[Response], ) -> Response: while True: if len(history) > self.max_redirects: > raise TooManyRedirects( ""Exceeded maximum allowed redirects."", request=request ) E httpx.TooManyRedirects: Exceeded maximum allowed redirects. 
/gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1672: TooManyRedirects ___________ test_database_with_space_in_name[/searchable_view.json] ____________ [gw23] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python app_client_two_attached_databases = path = '/searchable_view.json' @pytest.mark.parametrize( ""path"", ( ""/"", "".json"", ""/searchable"", ""/searchable.json"", ""/searchable_view"", ""/searchable_view.json"", ), ) def test_database_with_space_in_name(app_client_two_attached_databases, path): > response = app_client_two_attached_databases.get( ""/extra~20database"" + path, follow_redirects=True ) /tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_api.py:920: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /gnu/store/mcclmphjgbrgpa0v037a4nlq336482g8-python-asgiref-3.4.1/lib/python3.9/site-packages/asgiref/sync.py:223: in __call__ return call_result.result() /gnu/store/65i3nhcwmz0p8rqbg48gaavyky4g4hwk-python-3.9.9/lib/python3.9/concurrent/futures/_base.py:438: in result return self.__get_result() /gnu/store/65i3nhcwmz0p8rqbg48gaavyky4g4hwk-python-3.9.9/lib/python3.9/concurrent/futures/_base.py:390: in __get_result raise self._exception /gnu/store/mcclmphjgbrgpa0v037a4nlq336482g8-python-asgiref-3.4.1/lib/python3.9/site-packages/asgiref/sync.py:292: in main_wrap result = await self.awaitable(*args, **kwargs) /tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/utils/testing.py:66: in get return await self._request( /tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/utils/testing.py:156: in _request httpx_response = await self.ds.client.request( /tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/app.py:1602: in request return await client.request( /gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1527: in request return await self.send(request, auth=auth, follow_redirects=follow_redirects) /gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1614: in send response = await self._send_handling_auth( /gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1642: in _send_handling_auth response = await self._send_handling_redirects( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = request = follow_redirects = True history = [, , , , , , ...] async def _send_handling_redirects( self, request: Request, follow_redirects: bool, history: typing.List[Response], ) -> Response: while True: if len(history) > self.max_redirects: > raise TooManyRedirects( ""Exceeded maximum allowed redirects."", request=request ) E httpx.TooManyRedirects: Exceeded maximum allowed redirects. 
/gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1672: TooManyRedirects ________________ test_weird_database_names[database (1).sqlite] ________________ [gw7] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python tmpdir = local('/tmp/guix-build-datasette-0.64.2.drv-0/pytest-of-nixbld/pytest-0/popen-gw7/test_weird_database_names_data0') filename = 'database (1).sqlite' @pytest.mark.parametrize( ""filename"", [""test-database (1).sqlite"", ""database (1).sqlite""] ) def test_weird_database_names(tmpdir, filename): # https://github.com/simonw/datasette/issues/1181 runner = CliRunner() db_path = str(tmpdir / filename) sqlite3.connect(db_path).execute(""vacuum"") result1 = runner.invoke(cli, [db_path, ""--get"", ""/""]) assert result1.exit_code == 0, result1.output filename_no_stem = filename.rsplit(""."", 1)[0] expected_link = '{}'.format( tilde_encode(filename_no_stem), filename_no_stem ) assert expected_link in result1.output # Now try hitting that database page result2 = runner.invoke( cli, [db_path, ""--get"", ""/{}"".format(tilde_encode(filename_no_stem))] ) > assert result2.exit_code == 0, result2.output E AssertionError: E E assert 1 == 0 E + where 1 = .exit_code /tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_cli.py:321: AssertionError _____________ test_weird_database_names[test-database (1).sqlite] ______________ [gw6] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python tmpdir = local('/tmp/guix-build-datasette-0.64.2.drv-0/pytest-of-nixbld/pytest-0/popen-gw6/test_weird_database_names_test0') filename = 'test-database (1).sqlite' @pytest.mark.parametrize( ""filename"", [""test-database (1).sqlite"", ""database (1).sqlite""] ) def test_weird_database_names(tmpdir, filename): # https://github.com/simonw/datasette/issues/1181 runner = CliRunner() db_path = str(tmpdir / filename) sqlite3.connect(db_path).execute(""vacuum"") result1 = runner.invoke(cli, [db_path, ""--get"", ""/""]) assert result1.exit_code == 0, result1.output filename_no_stem = filename.rsplit(""."", 1)[0] expected_link = '{}'.format( tilde_encode(filename_no_stem), filename_no_stem ) assert expected_link in result1.output # Now try hitting that database page result2 = runner.invoke( cli, [db_path, ""--get"", ""/{}"".format(tilde_encode(filename_no_stem))] ) > assert result2.exit_code == 0, result2.output E AssertionError: E E assert 1 == 0 E + where 1 = .exit_code /tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_cli.py:321: AssertionError _ test_row_html_compound_primary_key[/fixtures/compound_primary_key/a~2Fb,~2Ec~2Dd-expected1] _ [gw11] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python app_client = path = '/fixtures/compound_primary_key/a~2Fb,~2Ec~2Dd' expected = [['a/b', '.c-d', 'c']] @pytest.mark.parametrize( ""path,expected"", ( ( ""/fixtures/compound_primary_key/a,b"", [ [ 'a', 'b', 'c', ] ], ), ( ""/fixtures/compound_primary_key/a~2Fb,~2Ec~2Dd"", [ [ 'a/b', '.c-d', 'c', ] ], ), ), ) def test_row_html_compound_primary_key(app_client, path, expected): response = app_client.get(path) > assert response.status == 200 E assert 302 == 200 E + where 302 = .status /tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_html.py:370: AssertionError _ test_css_classes_on_body[/fixtures/table~2Fwith~2Fslashes~2Ecsv-expected_classes5] _ [gw3] linux -- Python 3.9.9 
/gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python app_client = path = '/fixtures/table~2Fwith~2Fslashes~2Ecsv' expected_classes = ['table', 'db-fixtures', 'table-tablewithslashescsv-fa7563'] @pytest.mark.parametrize( ""path,expected_classes"", [ (""/"", [""index""]), (""/fixtures"", [""db"", ""db-fixtures""]), (""/fixtures?sql=select+1"", [""query"", ""db-fixtures""]), ( ""/fixtures/simple_primary_key"", [""table"", ""db-fixtures"", ""table-simple_primary_key""], ), ( ""/fixtures/neighborhood_search"", [""query"", ""db-fixtures"", ""query-neighborhood_search""], ), ( ""/fixtures/table~2Fwith~2Fslashes~2Ecsv"", [""table"", ""db-fixtures"", ""table-tablewithslashescsv-fa7563""], ), ( ""/fixtures/simple_primary_key/1"", [""row"", ""db-fixtures"", ""table-simple_primary_key""], ), ], ) def test_css_classes_on_body(app_client, path, expected_classes): response = app_client.get(path) > assert response.status == 200 E assert 302 == 200 E + where 302 = .status /tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_html.py:238: AssertionError _ test_templates_considered[/fixtures/table~2Fwith~2Fslashes~2Ecsv-table-fixtures-tablewithslashescsv-fa7563.html, *table.html] _ [gw3] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python app_client = path = '/fixtures/table~2Fwith~2Fslashes~2Ecsv' expected_considered = 'table-fixtures-tablewithslashescsv-fa7563.html, *table.html' @pytest.mark.parametrize( ""path,expected_considered"", [ (""/"", ""*index.html""), (""/fixtures"", ""database-fixtures.html, *database.html""), ( ""/fixtures/simple_primary_key"", ""table-fixtures-simple_primary_key.html, *table.html"", ), ( ""/fixtures/table~2Fwith~2Fslashes~2Ecsv"", ""table-fixtures-tablewithslashescsv-fa7563.html, *table.html"", ), ( ""/fixtures/simple_primary_key/1"", ""row-fixtures-simple_primary_key.html, *row.html"", ), ], ) def test_templates_considered(app_client, path, expected_considered): response = app_client.get(path) > assert response.status == 200 E assert 302 == 200 E + where 302 = .status /tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_html.py:264: AssertionError _ test_alternate_url_json[/fixtures/table~2Fwith~2Fslashes~2Ecsv-http://localhost/fixtures/table~2Fwith~2Fslashes~2Ecsv.json] _ [gw21] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python app_client = path = '/fixtures/table~2Fwith~2Fslashes~2Ecsv' expected = 'http://localhost/fixtures/table~2Fwith~2Fslashes~2Ecsv.json' @pytest.mark.parametrize( ""path,expected"", ( # Instance index page (""/"", ""http://localhost/.json""), # Table page (""/fixtures/facetable"", ""http://localhost/fixtures/facetable.json""), ( ""/fixtures/table~2Fwith~2Fslashes~2Ecsv"", ""http://localhost/fixtures/table~2Fwith~2Fslashes~2Ecsv.json"", ), # Row page ( ""/fixtures/no_primary_key/1"", ""http://localhost/fixtures/no_primary_key/1.json"", ), # Database index page ( ""/fixtures"", ""http://localhost/fixtures.json"", ), # Custom query page ( ""/fixtures?sql=select+*+from+facetable"", ""http://localhost/fixtures.json?sql=select+*+from+facetable"", ), # Canned query page ( ""/fixtures/neighborhood_search?text=town"", ""http://localhost/fixtures/neighborhood_search.json?text=town"", ), # /-/ pages ( ""/-/plugins"", ""http://localhost/-/plugins.json"", ), ), ) def test_alternate_url_json(app_client, path, expected): response = app_client.get(path) > assert response.status == 200 E assert 302 == 200 E + where 302 = .status 
/tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_html.py:948: AssertionError _ test_edit_sql_link_on_canned_queries[/fixtures/~F0~9D~90~9C~F0~9D~90~A2~F0~9D~90~AD~F0~9D~90~A2~F0~9D~90~9E~F0~9D~90~AC-/fixtures?sql=select+id%2C+name+from+facet_cities+order+by+id+limit+1%3B] _ [gw18] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python app_client = path = '/fixtures/~F0~9D~90~9C~F0~9D~90~A2~F0~9D~90~AD~F0~9D~90~A2~F0~9D~90~9E~F0~9D~90~AC' expected = '/fixtures?sql=select+id%2C+name+from+facet_cities+order+by+id+limit+1%3B' @pytest.mark.parametrize( ""path,expected"", [ ( ""/fixtures/neighborhood_search"", ""/fixtures?sql=%0Aselect+_neighborhood%2C+facet_cities.name%2C+state%0Afrom+facetable%0A++++join+facet_cities%0A++++++++on+facetable._city_id+%3D+facet_cities.id%0Awhere+_neighborhood+like+%27%25%27+%7C%7C+%3Atext+%7C%7C+%27%25%27%0Aorder+by+_neighborhood%3B%0A&text="", ), ( ""/fixtures/neighborhood_search?text=ber"", ""/fixtures?sql=%0Aselect+_neighborhood%2C+facet_cities.name%2C+state%0Afrom+facetable%0A++++join+facet_cities%0A++++++++on+facetable._city_id+%3D+facet_cities.id%0Awhere+_neighborhood+like+%27%25%27+%7C%7C+%3Atext+%7C%7C+%27%25%27%0Aorder+by+_neighborhood%3B%0A&text=ber"", ), (""/fixtures/pragma_cache_size"", None), ( # /fixtures/𝐜𝐢𝐭𝐢𝐞𝐬 ""/fixtures/~F0~9D~90~9C~F0~9D~90~A2~F0~9D~90~AD~F0~9D~90~A2~F0~9D~90~9E~F0~9D~90~AC"", ""/fixtures?sql=select+id%2C+name+from+facet_cities+order+by+id+limit+1%3B"", ), (""/fixtures/magic_parameters"", None), ], ) def test_edit_sql_link_on_canned_queries(app_client, path, expected): response = app_client.get(path) > assert response.status == 200 E assert 302 == 200 E + where 302 = .status /tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_html.py:841: AssertionError _______________________ test_table_with_slashes_in_name ________________________ [gw9] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python app_client = def test_table_with_slashes_in_name(app_client): response = app_client.get( ""/fixtures/table~2Fwith~2Fslashes~2Ecsv.json?_shape=objects"" ) > assert response.status == 200 E assert 302 == 200 E + where 302 = .status /tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_table_api.py:141: AssertionError __________________ test_custom_query_with_unicode_characters ___________________ [gw8] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python app_client = def test_custom_query_with_unicode_characters(app_client): # /fixtures/𝐜𝐢𝐭𝐢𝐞𝐬.json response = app_client.get( ""/fixtures/~F0~9D~90~9C~F0~9D~90~A2~F0~9D~90~AD~F0~9D~90~A2~F0~9D~90~9E~F0~9D~90~AC.json?_shape=array"" ) > assert [{""id"": 1, ""name"": ""San Francisco""}] == response.json /tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_table_api.py:1042: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/utils/testing.py:40: in json return json.loads(self.text) /gnu/store/65i3nhcwmz0p8rqbg48gaavyky4g4hwk-python-3.9.9/lib/python3.9/json/__init__.py:346: in loads return _default_decoder.decode(s) /gnu/store/65i3nhcwmz0p8rqbg48gaavyky4g4hwk-python-3.9.9/lib/python3.9/json/decoder.py:337: in decode obj, end = self.raw_decode(s, idx=_w(s, 0).end()) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , s = '', idx = 0 def raw_decode(self, s, idx=0): """"""Decode a JSON document from ``s`` (a ``str`` beginning with a 
JSON document) and return a 2-tuple of the Python representation and the index in ``s`` where the document ended. This can be used to decode a JSON document from a string that may have extraneous data at the end. """""" try: obj, end = self.scan_once(s, idx) except StopIteration as err: > raise JSONDecodeError(""Expecting value"", s, err.value) from None E json.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0) /gnu/store/65i3nhcwmz0p8rqbg48gaavyky4g4hwk-python-3.9.9/lib/python3.9/json/decoder.py:355: JSONDecodeError _ test_searchable[/fixtures/searchable.json?_search=te*+AND+do*&_searchmode=raw-expected_rows3] _ [gw13] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python app_client = path = '/fixtures/searchable.json?_search=te*+AND+do*&_searchmode=raw' expected_rows = [[1, 'barry cat', 'terry dog', 'panther'], [2, 'terry dog', 'sara weasel', 'puma']] @pytest.mark.parametrize( ""path,expected_rows"", [ ( ""/fixtures/searchable.json?_search=dog"", [ [1, ""barry cat"", ""terry dog"", ""panther""], [2, ""terry dog"", ""sara weasel"", ""puma""], ], ), ( # Special keyword shouldn't break FTS query ""/fixtures/searchable.json?_search=AND"", [], ), ( # Without _searchmode=raw this should return no results ""/fixtures/searchable.json?_search=te*+AND+do*"", [], ), ( # _searchmode=raw ""/fixtures/searchable.json?_search=te*+AND+do*&_searchmode=raw"", [ [1, ""barry cat"", ""terry dog"", ""panther""], [2, ""terry dog"", ""sara weasel"", ""puma""], ], ), ( # _searchmode=raw combined with _search_COLUMN ""/fixtures/searchable.json?_search_text2=te*&_searchmode=raw"", [ [1, ""barry cat"", ""terry dog"", ""panther""], ], ), ( ""/fixtures/searchable.json?_search=weasel"", [[2, ""terry dog"", ""sara weasel"", ""puma""]], ), ( ""/fixtures/searchable.json?_search_text2=dog"", [[1, ""barry cat"", ""terry dog"", ""panther""]], ), ( ""/fixtures/searchable.json?_search_name%20with%20.%20and%20spaces=panther"", [[1, ""barry cat"", ""terry dog"", ""panther""]], ), ], ) def test_searchable(app_client, path, expected_rows): response = app_client.get(path) > assert expected_rows == response.json[""rows""] E AssertionError: assert [[1, 'barry cat', 'terry dog', 'panther'],\n [2, 'terry dog', 'sara weasel', 'puma']] == [] E Left contains 2 more items, first extra item: [1, 'barry cat', 'terry dog', 'panther'] E Full diff: E [ E - , E + [1, E + 'barry cat', E + 'terry dog', E + 'panther'], E + [2, E + 'terry dog', E + 'sara weasel', E + 'puma'], E ] /tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_table_api.py:402: AssertionError _____ test_searchmode[table_metadata1-_search=te*+AND+do*-expected_rows1] ______ [gw20] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python table_metadata = {'searchmode': 'raw'}, querystring = '_search=te*+AND+do*' expected_rows = [[1, 'barry cat', 'terry dog', 'panther'], [2, 'terry dog', 'sara weasel', 'puma']] @pytest.mark.parametrize( ""table_metadata,querystring,expected_rows"", [ ( {}, ""_search=te*+AND+do*"", [], ), ( {""searchmode"": ""raw""}, ""_search=te*+AND+do*"", _SEARCHMODE_RAW_RESULTS, ), ( {}, ""_search=te*+AND+do*&_searchmode=raw"", _SEARCHMODE_RAW_RESULTS, ), # Can be over-ridden with _searchmode=escaped ( {""searchmode"": ""raw""}, ""_search=te*+AND+do*&_searchmode=escaped"", [], ), ], ) def test_searchmode(table_metadata, querystring, expected_rows): with make_app_client( metadata={""databases"": {""fixtures"": {""tables"": {""searchable"": table_metadata}}}} ) 
as client: response = client.get(""/fixtures/searchable.json?"" + querystring) > assert expected_rows == response.json[""rows""] E AssertionError: assert [[1, 'barry cat', 'terry dog', 'panther'],\n [2, 'terry dog', 'sara weasel', 'puma']] == [] E Left contains 2 more items, first extra item: [1, 'barry cat', 'terry dog', 'panther'] E Full diff: E [ E - , E + [1, E + 'barry cat', E + 'terry dog', E + 'panther'], E + [2, E + 'terry dog', E + 'sara weasel', E + 'puma'], E ] /tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_table_api.py:442: AssertionError _ test_searchmode[table_metadata2-_search=te*+AND+do*&_searchmode=raw-expected_rows2] _ [gw20] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python table_metadata = {}, querystring = '_search=te*+AND+do*&_searchmode=raw' expected_rows = [[1, 'barry cat', 'terry dog', 'panther'], [2, 'terry dog', 'sara weasel', 'puma']] @pytest.mark.parametrize( ""table_metadata,querystring,expected_rows"", [ ( {}, ""_search=te*+AND+do*"", [], ), ( {""searchmode"": ""raw""}, ""_search=te*+AND+do*"", _SEARCHMODE_RAW_RESULTS, ), ( {}, ""_search=te*+AND+do*&_searchmode=raw"", _SEARCHMODE_RAW_RESULTS, ), # Can be over-ridden with _searchmode=escaped ( {""searchmode"": ""raw""}, ""_search=te*+AND+do*&_searchmode=escaped"", [], ), ], ) def test_searchmode(table_metadata, querystring, expected_rows): with make_app_client( metadata={""databases"": {""fixtures"": {""tables"": {""searchable"": table_metadata}}}} ) as client: response = client.get(""/fixtures/searchable.json?"" + querystring) > assert expected_rows == response.json[""rows""] E AssertionError: assert [[1, 'barry cat', 'terry dog', 'panther'],\n [2, 'terry dog', 'sara weasel', 'puma']] == [] E Left contains 2 more items, first extra item: [1, 'barry cat', 'terry dog', 'panther'] E Full diff: E [ E - , E + [1, E + 'barry cat', E + 'terry dog', E + 'panther'], E + [2, E + 'terry dog', E + 'sara weasel', E + 'puma'], E ] /tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_table_api.py:442: AssertionError =========================== short test summary info ============================ FAILED tests/test_api.py::test_row_strange_table_name - assert 400 == 200 FAILED tests/test_api.py::test_database_page_for_database_with_dot_in_name - ... FAILED tests/test_api.py::test_tilde_encoded_database_names[fo%o] - assert 30... FAILED tests/test_api.py::test_tilde_encoded_database_names[f~/c.d] - assert ... FAILED tests/test_api.py::test_database_with_space_in_name[/searchable.json] FAILED tests/test_api.py::test_database_with_space_in_name[.json] - httpx.Too... FAILED tests/test_api.py::test_database_with_space_in_name[/searchable_view] FAILED tests/test_api.py::test_database_with_space_in_name[/] - httpx.TooMany... FAILED tests/test_api.py::test_database_with_space_in_name[/searchable] - htt... FAILED tests/test_api.py::test_database_with_space_in_name[/searchable_view.json] FAILED tests/test_cli.py::test_weird_database_names[database (1).sqlite] - As... 
FAILED tests/test_cli.py::test_weird_database_names[test-database (1).sqlite] FAILED tests/test_html.py::test_row_html_compound_primary_key[/fixtures/compound_primary_key/a~2Fb,~2Ec~2Dd-expected1] FAILED tests/test_html.py::test_css_classes_on_body[/fixtures/table~2Fwith~2Fslashes~2Ecsv-expected_classes5] FAILED tests/test_html.py::test_templates_considered[/fixtures/table~2Fwith~2Fslashes~2Ecsv-table-fixtures-tablewithslashescsv-fa7563.html, *table.html] FAILED tests/test_html.py::test_alternate_url_json[/fixtures/table~2Fwith~2Fslashes~2Ecsv-http://localhost/fixtures/table~2Fwith~2Fslashes~2Ecsv.json] FAILED tests/test_html.py::test_edit_sql_link_on_canned_queries[/fixtures/~F0~9D~90~9C~F0~9D~90~A2~F0~9D~90~AD~F0~9D~90~A2~F0~9D~90~9E~F0~9D~90~AC-/fixtures?sql=select+id%2C+name+from+facet_cities+order+by+id+limit+1%3B] FAILED tests/test_table_api.py::test_table_with_slashes_in_name - assert 302 ... FAILED tests/test_table_api.py::test_custom_query_with_unicode_characters - j... FAILED tests/test_table_api.py::test_searchable[/fixtures/searchable.json?_search=te*+AND+do*&_searchmode=raw-expected_rows3] FAILED tests/test_table_api.py::test_searchmode[table_metadata1-_search=te*+AND+do*-expected_rows1] FAILED tests/test_table_api.py::test_searchmode[table_metadata2-_search=te*+AND+do*&_searchmode=raw-expected_rows2] =========== 22 failed, 1049 passed, 3 skipped in 1522.28s (0:25:22) ============ error: in phase 'check': uncaught exception: %exception #<&invoke-error program: ""/gnu/store/ziqwkzz6znb5d3c245xn0cq5ra2ly0w3-python-pytest-7.1.3/bin/pytest"" arguments: (""-vv"" ""-n"" ""24"" ""-m"" ""not serial"") exit-status: 1 term-signal: #f stop-signal: #f> phase `check' failed after 1523.3 seconds ``` The tests run in a private namespace without internet connectivity, and the Python dependencies are at: ``` python-aiofiles@0.6.0 python-asgi-csrf@0.9 python-asgiref@3.4.1 + python-beautifulsoup4@4.11.1 python-black@22.3.0 python-click-default-group@1.2.2 python-click@8.1.3 + python-cogapp@3.3.0 python-httpx@0.23.0 python-hupper@1.10.3 python-itsdangerous@2.0.1 + python-janus@1.0.0 python-jinja2@3.1.1 python-mergedeep@1.3.4 python-pint@0.20.1 python-pluggy@1.0.0 + python-pytest-asyncio@0.17.2 python-pytest-runner@5.2 python-pytest-timeout@2.0.2 + python-pytest-xdist@2.5.0 python-pytest@7.1.3 python-pyyaml@6.0 python-setuptools@64.0.3 + python-trustme@0.9.0 python-uvicorn@0.17.6 ``` With Python 3.9.9. Thank you!",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2048/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1646734246,I_kwDOBm6k_c5iJyum,2049,Custom SQL queries should use new JSON ?_extra= format,9599,simonw,open,0,,,8755003,Datasette 1.0a-next,4,2023-03-30T00:42:53Z,2023-04-05T23:29:27Z,,OWNER,,"Related: - #262 I've made the change to the table view, now I need the new format to work for arbitrary SQL queries too. 
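For illustration only — this is a sketch of what I'd expect, not settled behaviour — a query response should mirror the slim table default, with extras opt-in via `?_extra=`:

```python
import httpx

# Sketch: fetch a query as JSON and inspect the keys, assuming queries
# end up with the same slim default shape as the new table JSON.
response = httpx.get(
    ""https://latest.datasette.io/fixtures.json"",
    params={""sql"": ""select 1 + 1"", ""_extra"": ""columns""},
)
data = response.json()
print(sorted(data.keys()))  # hoping for something like ['columns', 'rows']
```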
Note that this incorporates both arbitrary SQL queries and canned queries.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2049/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1649791661,I_kwDOBm6k_c5iVdKt,2050,Row page JSON should use new ?_extra= format,9599,simonw,open,0,,,8755003,Datasette 1.0a-next,1,2023-03-31T17:56:53Z,2023-03-31T17:59:49Z,,OWNER,,"https://latest.datasette.io/fixtures/facetable/2.json Related: - #2049 - #1709 ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2050/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1649793525,I_kwDOBm6k_c5iVdn1,2051,`?_extra=row_urls` for table pages,9599,simonw,open,0,,,,,0,2023-03-31T17:58:36Z,2023-03-31T17:58:36Z,,OWNER,,Provides URLs to the JSON version of those rows. Maybe it persists the `?_shape=` option too? Not sure about that.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2051/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1662951875,I_kwDOBm6k_c5jHqHD,2057,DeprecationWarning: pkg_resources is deprecated as an API,9599,simonw,closed,0,,,,,25,2023-04-11T17:41:20Z,2023-09-21T22:09:10Z,2023-09-21T22:09:10Z,OWNER,,"Got this running tests against Python 3.11. ``` ../../../.local/share/virtualenvs/datasette-big-local-6Yn-280V/lib/python3.11/site-packages/datasette/app.py:14: in import pkg_resources ../../../.local/share/virtualenvs/datasette-big-local-6Yn-280V/lib/python3.11/site-packages/pkg_resources/__init__.py:121: in warnings.warn(""pkg_resources is deprecated as an API"", DeprecationWarning) E DeprecationWarning: pkg_resources is deprecated as an API ``` I ran with `pytest -Werror --pdb -x` to get the debugger for that warning, but it turned out searching the code worked better. It's used in these two places: https://github.com/simonw/datasette/blob/5890a20c374fb0812d88c9b0ef26a838bfa06c76/datasette/plugins.py#L43-L50 https://github.com/simonw/datasette/blob/5890a20c374fb0812d88c9b0ef26a838bfa06c76/datasette/app.py#L1037",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2057/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1663399821,I_kwDOBm6k_c5jJXeN,2058,"500 ""attempt to write a readonly database"" error caused by ""PRAGMA schema_version""",9599,simonw,open,0,,,,,9,2023-04-11T23:57:50Z,2023-04-13T16:35:21Z,,OWNER,,"I've not been able to replicate this myself yet, but I've seen log files from a user affected by it. 
``` File ""/usr/local/lib/python3.11/site-packages/datasette/views/base.py"", line 89, in dispatch_request await self.ds.refresh_schemas() File ""/usr/local/lib/python3.11/site-packages/datasette/app.py"", line 371, in refresh_schemas await self._refresh_schemas() File ""/usr/local/lib/python3.11/site-packages/datasette/app.py"", line 386, in _refresh_schemas schema_version = (await db.execute(""PRAGMA schema_version"")).first()[0] ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File ""/usr/local/lib/python3.11/site-packages/datasette/database.py"", line 267, in execute results = await self.execute_fn(sql_operation_in_thread) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File ""/usr/local/lib/python3.11/site-packages/datasette/database.py"", line 213, in execute_fn return await asyncio.get_event_loop().run_in_executor( ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File ""/usr/local/lib/python3.11/concurrent/futures/thread.py"", line 58, in run result = self.fn(*self.args, **self.kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File ""/usr/local/lib/python3.11/site-packages/datasette/database.py"", line 211, in in_thread return fn(conn) ^^^^^^^^ File ""/usr/local/lib/python3.11/site-packages/datasette/database.py"", line 237, in sql_operation_in_thread cursor.execute(sql, params if params is not None else {}) sqlite3.OperationalError: attempt to write a readonly database ``` That's running the official Datasette Docker image on https://fly.io/ - it's causing 500 errors on every page of their site.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2058/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1665053646,I_kwDOBm6k_c5jPrPO,2059,"""Deceptive site ahead"" alert on Heroku deployment",1186275,mtdukes,open,0,,,,,1,2023-04-12T18:34:51Z,2023-04-13T01:13:01Z,,NONE,,"I deployed a fairly basic instance of Datasette (`datasette-auth-passwords` is the only plugin) using Heroku. The deployed URL now gives a ""Deceptive site ahead"" warning to users. Is there way around this? Maybe a way to add ownership verification [through Google's search console](https://search.google.com/search-console/welcome)? 
",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2059/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1665510265,I_kwDOBm6k_c5jRat5,2060,Clean up a bunch of warnings from ruff,9599,simonw,open,0,,,,,0,2023-04-13T01:23:02Z,2023-04-13T01:23:02Z,,OWNER,,"See: - #2056 `ruff` spots a bunch of warnings about things like unused variables - would be good to clean up as many of these as possible.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2060/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1686033652,I_kwDOBm6k_c5kftT0,2065,Datasette cannot be installed with Rye,9599,simonw,closed,0,,,,,4,2023-04-27T03:35:42Z,2023-04-27T05:09:36Z,2023-04-27T05:09:36Z,OWNER,,"https://github.com/mitsuhiko/rye I tried this: rye install datasette But now: ``` % ~/.rye/shims/datasette Traceback (most recent call last): File ""/Users/simon/.rye/shims/datasette"", line 5, in from datasette.cli import cli File ""/Users/simon/.rye/tools/datasette/lib/python3.11/site-packages/datasette/cli.py"", line 17, in from .app import ( File ""/Users/simon/.rye/tools/datasette/lib/python3.11/site-packages/datasette/app.py"", line 14, in import pkg_resources ModuleNotFoundError: No module named 'pkg_resources' ``` I think that's because `setuptools` is not included in Rye.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2065/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1686042269,I_kwDOBm6k_c5kfvad,2066,Failing test: httpx.InvalidURL: URL too long,9599,simonw,closed,0,,,,,10,2023-04-27T03:48:47Z,2023-04-27T04:27:50Z,2023-04-27T04:27:50Z,OWNER,,"https://github.com/simonw/datasette/actions/runs/4815723640/jobs/8574667731 ``` def urlparse(url: str = """", **kwargs: typing.Optional[str]) -> ParseResult: # Initial basic checks on allowable URLs. # --------------------------------------- # Hard limit the maximum allowable URL length. if len(url) > MAX_URL_LENGTH: > raise InvalidURL(""URL too long"") E httpx.InvalidURL: URL too long /opt/hostedtoolcache/Python/3.7.16/x64/lib/python3.7/site-packages/httpx/_urlparse.py:155: InvalidURL =========================== short test summary info ============================ FAILED tests/test_csv.py::test_max_csv_mb - httpx.InvalidURL: URL too long ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2066/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1690765434,I_kwDOBm6k_c5kxwh6,2067,Litestream-restored db: errors on 3.11 and 3.10.8; but works on py3.10.7 and 3.10.6,39538958,justmars,open,0,,,,,1,2023-05-01T12:42:28Z,2023-05-03T00:16:03Z,,NONE,,"Hi! Wondering if this issue is limited to my local system or if it affects others as well. It seems like 3.11 errors out on a ""litestream-restored"" database. On further investigation, it also appears to conk out on 3.10.8 but works on 3.10.7 and 3.10.6. To demo issue I created a test database, replicated it to an aws s3 bucket, then restored the same under various .pyenv-versioned shells where I test whether I can read the database via the sqlite3 cli. 
```sh # create new shell with 3.11.3 litestream restore -o data/db.sqlite s3://mytestbucketxx/db sqlite3 data/db.sqlite # SQLite version 3.41.2 2023-03-22 11:56:21 # Enter "".help"" for usage hints. # sqlite> .tables # _litestream_lock _litestream_seq movie # sqlite> ``` However this gets me an `OperationalError` when reading via datasette:
Error on 3.11.3 and 3.10.8 ```sh datasette data/db.sqlite ``` ```console /tester/.venv/lib/python3.11/site-packages/pkg_resources/__init__.py:121: DeprecationWarning: pkg_resources is deprecated as an API warnings.warn(""pkg_resources is deprecated as an API"", DeprecationWarning) Traceback (most recent call last): File ""/tester/.venv/bin/datasette"", line 8, in sys.exit(cli()) ^^^^^ File ""/tester/.venv/lib/python3.11/site-packages/click/core.py"", line 1130, in __call__ return self.main(*args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^ File ""/tester/.venv/lib/python3.11/site-packages/click/core.py"", line 1055, in main rv = self.invoke(ctx) ^^^^^^^^^^^^^^^^ File ""/tester/.venv/lib/python3.11/site-packages/click/core.py"", line 1657, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File ""/tester/.venv/lib/python3.11/site-packages/click/core.py"", line 1404, in invoke return ctx.invoke(self.callback, **ctx.params) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File ""/tester/.venv/lib/python3.11/site-packages/click/core.py"", line 760, in invoke return __callback(*args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^ File ""/tester/.venv/lib/python3.11/site-packages/datasette/cli.py"", line 143, in wrapped return fn(*args, **kwargs) ^^^^^^^^^^^^^^^^^^^ File ""/tester/.venv/lib/python3.11/site-packages/datasette/cli.py"", line 615, in serve asyncio.get_event_loop().run_until_complete(check_databases(ds)) File ""/Users/mv/.pyenv/versions/3.11.3/lib/python3.11/asyncio/base_events.py"", line 653, in run_until_complete return future.result() ^^^^^^^^^^^^^^^ File ""/tester/.venv/lib/python3.11/site-packages/datasette/cli.py"", line 660, in check_databases await database.execute_fn(check_connection) File ""/tester/.venv/lib/python3.11/site-packages/datasette/database.py"", line 213, in execute_fn return await asyncio.get_event_loop().run_in_executor( ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File ""/Users/mv/.pyenv/versions/3.11.3/lib/python3.11/concurrent/futures/thread.py"", line 58, in run result = self.fn(*self.args, **self.kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File ""/tester/.venv/lib/python3.11/site-packages/datasette/database.py"", line 211, in in_thread return fn(conn) ^^^^^^^^ File ""/tester/.venv/lib/python3.11/site-packages/datasette/utils/__init__.py"", line 951, in check_connection for r in conn.execute( ^^^^^^^^^^^^^ sqlite3.OperationalError: unable to open database file ```
Works on 3.10.7, 3.10.6 ```sh # create new shell with 3.10.7 / 3.10.6 litestream restore -o data/db.sqlite s3://mytestbucketxx/db datasette data/db.sqlite # ... # INFO: Uvicorn running on http://127.0.0.1:8001 (Press CTRL+C to quit) ```
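(Not part of the original report: a minimal isolation sketch that could help rule Datasette in or out, by checking whether the stdlib `sqlite3` module alone can read the restored file under each interpreter. The `movie` table is the one shown by `.tables` above.)
```python
# Hypothetical isolation step: run this under each pyenv interpreter to
# see whether plain sqlite3, with no Datasette involved, can read the
# restored database.
import sqlite3
import sys

print(sys.version)
conn = sqlite3.connect('data/db.sqlite')
print(conn.execute('select count(*) from movie').fetchone())
conn.close()
```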
In both scenarios, the only dependencies were the pinned python version and the latest Datasette version 0.64.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2067/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1698865182,I_kwDOBm6k_c5lQqAe,2069,[BUG] Cannot insert new data to deployed instance,31861128,yqlbu,open,0,,,,,1,2023-05-07T02:59:42Z,2023-05-07T03:17:35Z,,NONE,,"## Summary Recently, I deployed an instance of datasette to Vercel with the following plugins: - datasette-auth-tokens - datasette-insert With the above plugins, I was able to insert new data to local sqlite db. However, when it comes to the deployment on Vercel, things behave differently. I observed some errors from the logs console on Vercel: ```console File ""/var/task/datasette/database.py"", line 179, in _execute_writes conn = self.connect(write=True) File ""/var/task/datasette/database.py"", line 93, in connect assert not (write and not self.is_mutable) AssertionError ``` I think it is a potential bug. ## Reproduce
metadata.json
```json { ""plugins"": { ""datasette-insert"": { ""allow"": { ""id"": ""*"" } }, ""datasette-auth-tokens"": { ""tokens"": [ { ""token"": { ""$env"": ""INSERT_TOKEN"" }, ""actor"": { ""id"": ""repeater"" } } ], ""param"": ""_auth_token"" } } } ```
commands
```bash # deploy datasette publish vercel remote.db \ --project=repeater-bot-sqlite \ --metadata metadata.json \ --install datasette-auth-tokens \ --install datasette-insert \ --vercel-json=vercel.json # test insert cat fixtures/dogs.json | curl --request POST -d @- -H ""Authorization: Bearer "" \ 'https://repeater-bot-sqlite.vercel.app/-/insert/remote/dogs?pk=id' ```
logs
```console Traceback (most recent call last): File ""/var/task/datasette/app.py"", line 1354, in route_path response = await view(request, send) File ""/var/task/datasette/app.py"", line 1500, in async_view_fn response = await async_call_with_supported_arguments( File ""/var/task/datasette/utils/__init__.py"", line 1005, in async_call_with_supported_arguments return await fn(*call_with) File ""/var/task/datasette_insert/__init__.py"", line 14, in insert_or_upsert response = await insert_or_upsert_implementation(request, datasette) File ""/var/task/datasette_insert/__init__.py"", line 91, in insert_or_upsert_implementation table_count = await db.execute_write_fn(write_in_thread, block=True) File ""/var/task/datasette/database.py"", line 167, in execute_write_fn raise result File ""/var/task/datasette/database.py"", line 179, in _execute_writes conn = self.connect(write=True) File ""/var/task/datasette/database.py"", line 93, in connect assert not (write and not self.is_mutable) AssertionError ```
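A hedged reading of the traceback: the database appears to have been deployed in immutable mode (Vercel's filesystem is read-only), so the write-connection guard quoted above fires. A simplified, self-contained sketch of that guard:
```python
# Simplified model of the guard from datasette/database.py seen in the
# traceback: write connections are refused when the database is not
# mutable, which seems to be the case for databases baked into a deploy.
def connect(write=False, is_mutable=False):
    assert not (write and not is_mutable)

connect(write=False)  # a read connection is fine
connect(write=True)   # raises AssertionError, matching the logs above
```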
",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2069/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1702354223,I_kwDOBm6k_c5ld90v,2070,Mechanism for deploying a preview of a branch using Vercel,9599,simonw,closed,0,,,,,2,2023-05-09T16:21:45Z,2023-05-09T16:25:00Z,2023-05-09T16:24:31Z,OWNER,,"I prototyped that here: https://github.com/simonw/one-off-actions/blob/main/.github/workflows/deploy-datasette-branch-preview.yml It deployed the `json-extras-query` branch here: https://datasette-preview-json-extras-query.vercel.app/",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2070/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1708030220,I_kwDOBm6k_c5lznkM,2073,Faceting doesn't work against integer columns in views,9599,simonw,open,0,,,,,2,2023-05-12T18:20:10Z,2023-05-12T18:24:07Z,,OWNER,,"Spotted this issue here: https://til.simonwillison.net/datasette/baseline I had to do this workaround: ```sql create view baseline as select _key, spec, '' || json_extract(status, '$.is_baseline') as is_baseline, json_extract(status, '$.since') as baseline_since, json_extract(status, '$.support.chrome') as baseline_chrome, json_extract(status, '$.support.edge') as baseline_edge, json_extract(status, '$.support.firefox') as baseline_firefox, json_extract(status, '$.support.safari') as baseline_safari, compat_features, caniuse, usage_stats, status from [index] ``` I think the core issue here is that, against a table, `select * from x where integer_column = '1'` works correctly, due to some kind of column type conversion mechanism... but this mechanism doesn't work against views.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2073/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1726236847,I_kwDOBm6k_c5m5Eiv,2078,Resolve the difference between `wrap_view()` and `BaseView`,9599,simonw,closed,0,,,,,16,2023-05-25T17:44:32Z,2023-05-26T00:18:46Z,2023-05-26T00:18:46Z,OWNER,,"There are two patterns for implementing views in Datasette at the moment. I want to combine those. 
Part of: - #2053",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2078/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1726531350,I_kwDOBm6k_c5m6McW,2079,Datasette should serve Access-Control-Max-Age,9599,simonw,closed,0,,,,,8,2023-05-25T21:50:50Z,2023-05-25T22:56:28Z,2023-05-25T22:08:35Z,OWNER,,"Currently the CORS headers served are: https://github.com/simonw/datasette/blob/9584879534ff0556e04e4c420262972884cac87b/datasette/utils/__init__.py#L1139-L1143 Serving `Access-Control-Max-Age: 600` would allow browsers to cache that for 10 minutes, avoiding additional CORS pre-flight OPTIONS requests during that time.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2079/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1727478903,I_kwDOBm6k_c5m9zx3,2081,Update Endpoints defined in metadata throws 403 Forbidden after a while,15085007,cutmasta-kun,open,0,,,,,0,2023-05-26T11:52:30Z,2023-05-26T11:52:30Z,,NONE,,"Hello. I expose an endpoint to update `tasks`: ``` { ""title"": ""My Datasette Instance"", ""databases"": { ""tasks"": { ""queries"": { ""update_task"": { ""sql"": ""UPDATE tasks SET status = :status, result = :result, systemMessage = :systemMessage WHERE queueID = :queueID"", ""write"": true, ""on_success_message"": ""Task updated"", ""on_success_redirect"": ""/tasks/tasks.json"", ""on_error_message"": ""Task update failed"", ""on_error_redirect"": ""/tasks.json"", ""params"": [""queueID"", ""taskData"", ""status"", ""result"", ""systemMessage""] } } } } } ``` This works really well! But after a while, the Datasette Instanz answers with **403 Forbidden**. I have to delete the database and recreate it in order to work again. Any help here? (´。_。`)",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2081/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1761613778,I_kwDOBm6k_c5pABfS,2084,Support facets for columns that contain timestamps,19492893,devxpy,open,0,,,,,0,2023-06-17T03:33:54Z,2023-06-17T03:33:54Z,,NONE,," Django has this very nice filter for datetime fields - It would be nice to have something similar to facet by a field that contains a timestamp in datasette too - Which doesn't seem to do anything with timestamps right now... ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2084/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1762180409,I_kwDOBm6k_c5pCL05,2085,Interactive row selection in Datasette ,24938923,learning4life,open,0,,,,,0,2023-06-18T08:29:45Z,2023-06-18T08:31:23Z,,NONE,,"Simon did a excellent [prototype](https://til.simonwillison.net/datasette/row-selection-prototype) of an interactive row selection in Datasette. I hope this [functionality](https://camo.githubusercontent.com/3d4a0f31fb6a27fd279f809af5b53dc3b76faa63c7721e228951c5252b645a77/68747470733a2f2f7374617469632e73696d6f6e77696c6c69736f6e2e6e65742f7374617469632f323032332f6461746173657474652d7069636b65722e676966) can be turned into a Datasette plugin. 
",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2085/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1764792125,I_kwDOBm6k_c5pMJc9,2086,Show information on startup in directory configuration mode,9599,simonw,open,0,,,,,0,2023-06-20T07:13:33Z,2023-06-20T07:13:33Z,,OWNER,,"https://discord.com/channels/823971286308356157/823971286941302908/1120516587036889098 > One thing that would be helpful would be message at launch indicating a metadata.json is getting picked up. I'm using directory mode and was editing the wrong file for awhile before I realize nothing I was doing was having any effect.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2086/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1765870617,I_kwDOBm6k_c5pQQwZ,2087,`--settings settings.json` option,9599,simonw,open,0,,,,,2,2023-06-20T17:48:45Z,2023-07-14T17:02:03Z,,OWNER,,"https://discord.com/channels/823971286308356157/823971286941302908/1120705940728066080 > May I add a request to the whole metadata / settings ? Allow to pass `--settings path/to/settings.json` instead of having to rely exclusively on directory mode to centralize settings (this would reflect the behavior of providing metadata)",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2087/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1794097871,I_kwDOBm6k_c5q78LP,2095,"Introduce ""dark mode"" CSS",3315059,jamietanna,open,0,,,,,0,2023-07-07T19:15:58Z,2023-07-07T19:15:58Z,,NONE,,Using [the CSS media query `prefers-color-scheme`](https://developer.mozilla.org/en-US/docs/Web/CSS/@media/prefers-color-scheme) we can provide a dark-mode version of Datasette,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2095/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1780973290,I_kwDOBm6k_c5qJ37q,2089,codespell test failure,9599,simonw,closed,0,,,,,5,2023-06-29T14:40:10Z,2023-06-29T14:48:11Z,2023-06-29T14:48:10Z,OWNER,,"https://github.com/simonw/datasette/actions/runs/5413443676/jobs/9838999356 ``` codespell docs/*.rst --ignore-words docs/codespell-ignore-words.txt codespell datasette -S datasette/static --ignore-words docs/codespell-ignore-words.txt shell: /usr/bin/bash -e {0} env: pythonLocation: /opt/hostedtoolcache/Python/3.9.17/x64 LD_LIBRARY_PATH: /opt/hostedtoolcache/Python/3.9.17/x64/lib docs/metadata.rst:192: displaing ==> displaying ``` This failure is legit, it found a spelling mistake: https://github.com/simonw/datasette/blob/ede62036180993dbd9d4e5d280fc21c183cda1c3/docs/metadata.rst#L192",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2089/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1781005740,I_kwDOBm6k_c5qJ_2s,2090,Adopt ruff for linting,9599,simonw,open,0,,,,,2,2023-06-29T14:56:43Z,2023-06-29T15:05:04Z,,OWNER,,https://beta.ruff.rs/docs/,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2090/reactions"", 
""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1781022369,I_kwDOBm6k_c5qKD6h,2091,Drop support for Python 3.7,9599,simonw,closed,0,,,,,3,2023-06-29T15:06:38Z,2023-08-23T18:18:18Z,2023-08-23T18:18:18Z,OWNER,,"It's EOL now, as of 2023-06-27 (two days ago): https://devguide.python.org/versions/ ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2091/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1781047747,I_kwDOBm6k_c5qKKHD,2092,test_homepage intermittent failure,9599,simonw,closed,0,,,,,2,2023-06-29T15:20:37Z,2023-06-29T15:26:28Z,2023-06-29T15:24:13Z,OWNER,,"e.g. in https://github.com/simonw/datasette/actions/runs/5413590227/jobs/9839373852 ``` =================================== FAILURES =================================== ________________________________ test_homepage _________________________________ [gw0] linux -- Python 3.7.17 /opt/hostedtoolcache/Python/3.7.17/x64/bin/python ds_client = @pytest.mark.asyncio async def test_homepage(ds_client): response = await ds_client.get(""/.json"") assert response.status_code == 200 assert ""application/json; charset=utf-8"" == response.headers[""content-type""] data = response.json() assert data.keys() == {""fixtures"": 0}.keys() d = data[""fixtures""] assert d[""name""] == ""fixtures"" assert d[""tables_count""] == 24 assert len(d[""tables_and_views_truncated""]) == 5 assert d[""tables_and_views_more""] is True # 4 hidden FTS tables + no_primary_key (hidden in metadata) assert d[""hidden_tables_count""] == 6 # 201 in no_primary_key, plus 6 in other hidden tables: > assert d[""hidden_table_rows_sum""] == 207, data E AssertionError: {'fixtures': {'color': '9403e5', 'hash': None, 'hidden_table_rows_sum': 0, 'hidden_tables_count': 6, ...}} E assert 0 == 207 ``` My guess is that this is a timing error, where very occasionally the ""count rows but stop counting if it exceeds a time limit"" thing fails.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2092/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1781530343,I_kwDOBm6k_c5qL_7n,2093,"Proposal: Combine settings, metadata, static, etc. into a single `datasette.yaml` File",15178711,asg017,open,0,,,,,8,2023-06-29T21:18:23Z,2023-09-11T20:19:32Z,,CONTRIBUTOR,,"Very often I get tripped up when trying to configure my Datasette instances. For example: if I want to change the port my app listen too, do I do that with a CLI flag, a `--setting` flag, inside `metadata.json`, or an env var? If I want to up the time limit of SQL statements, is that under `metadata.json` or a setting? Where does my plugin configuration go? Normally I need to look it up in Datasette docs, and I quickly find my answer, but the number of places where ""config"" goes it overwhelming. - Flat CLI flags like `--port`, `--host`, `--cors`, etc. - `--setting`, like `default_page_size`, `sql_time_limit_ms` etc - Inside `metadata.json`, including plugin configuration Typically my Datasette deploys are extremely long shell commands, with multiple `--setting` and other CLI flags. 
## Proposal: Consolidate all ""config"" into `datasette.toml` I propose that we add a new `datasette.toml` that combines ""settings"", ""metadata"", and other common CLI flags like `--port` and `--cors` into a single file. It would be similar to ""Cargo.toml"" in Rust projects, ""package.json"" in Node projects, and ""pyproject.toml"" in Python, etc. A sample of what it could look like: ```toml # ""top level"" configuration that are currently CLI flags on `datasette serve` [config] port = 8020 host = ""0.0.0.0"" cors = true # replaces multiple `--setting` flags [settings] base_url = ""/app/datasette/"" default_allow_sql = true sql_time_limit_ms = 3500 # replaces `metadata.json`. # The contents of datasette-metadata.json could be defined in this file instead, but supporting separate files is nice (since those are easy to machine-generate) [metadata] include=""./datasette-metadata.json"" # plugin-specific [plugins] [plugins.datasette-auth-github] client_id = {env = ""DATASETTE_AUTH_GITHUB_CLIENT_ID""} client_secret = {env = ""GITHUB_CLIENT_SECRET""} [plugins.datasette-cluster-map] latitude_column = ""lat"" longitude_column = ""lon"" ``` ## Pros - Instead of multiple files and CLI flags, everything could be in one tidy file - Editing config in a separate file is easier than editing CLI flags, since you don't have to kill a process + edit a command every time - New users will know to ""just edit my `datasette.toml`"" instead of needing to learn metadata + settings + CLI flags - Better dev experience for multiple environments. For example, could have `datasette -c datasette-dev.toml` for local dev environments (enables SQL, debug plugins, long timeouts, etc.), and a `datasette -c datasette-prod.toml` for ""production"" (lower timeouts, fewer plugins, monitoring plugins, etc.) ## Cons - Yet another config-management system. Now Datasette users will need to know about metadata, settings, CLI flags, _and_ `datasette.toml`. However with enough documentation + announcements + examples, I think we can get ahead of it. - If toml is chosen, would need to add a toml parser for Python versions <3.11 - Multiple sources of config require priority. For example: Would `--setting default_allow_sql off` override the value inside `[settings]`? What about `--port`? ## Other Notes ### Toml I chose toml over json because toml supports comments. I chose toml over yaml because Python 3.11 has builtin support for it. I also find toml easier to work with since it doesn't have the odd ""gotchas"" that YAML has (e.g. `3.10` resolving to `3.1`, Norway `NO` resolving to `false`, etc.). It also mimics `pyproject.toml` which is nice. Happy to change my mind about this, however. ### Plugin config will be difficult Plugin config is currently in `metadata.json` in two places: 1. Top level, under `""plugins.[plugin-name]""`. This fits well into `datasette.toml` as `[plugins.plugin-name]` 2. Table level, under `""databases.[db-name].tables.[table-name].plugins.[plugin-name]""`. This doesn't fit that well into `datasette.toml`, unless it's nested under `[metadata]`? ### Extensions, static, one-off plugins? We could also include equivalents of `--plugins-dir`, `--static`, and `--load-extension` into `datasette.toml`, but I'd imagine there's a few security concerns there to think through. ### Explicitly list which plugins to use? I believe Datasette by default will load all installed plugins on startup, but maybe `datasette.toml` can specify a list of plugins to use? 
For example, a dev version of `datasette.toml` can specify `datasette-pretty-traces`, but the prod version can leave it out",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2093/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1783304750,I_kwDOBm6k_c5qSxIu,2094,JS Plugin Hooks for the Code Editor,15178711,asg017,open,0,,,,,0,2023-07-01T00:51:57Z,2023-07-01T00:51:57Z,,CONTRIBUTOR,,"When #2052 merges, I'd like to add support to add extensions/functions to the Datasette code editor. I'd eventually like to build a JS plugin for [`sqlite-docs`](https://github.com/asg017/sqlite-docs), to add things like: - Inline documentation for tables/columns on hover - Inline docs for custom functions that are loaded in - More detailed autocomplete for tables/columns/functions I did some hacking to see what this would look like, see here: There can be a new hook that allows JS plugins to add new ""extension"" in the CodeMirror editorview here: https://github.com/simonw/datasette/blob/8cd60fd1d899952f1153460469b3175465f33f80/datasette/static/cm-editor-6.0.1.js#L25 Will need some more planning. For example, the Codemirror bundle in Datasette has functions that we could re-export for plugins to use (so we don't load 2 version of `""@codemirror/autocomplete""`, for example. ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2094/reactions"", ""total_count"": 1, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 1, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1795051447,I_kwDOBm6k_c5q_k-3,2097,Drop Python 3.7,9599,simonw,closed,0,,,,,0,2023-07-08T18:39:44Z,2023-08-23T18:18:00Z,2023-08-23T18:18:00Z,OWNER,,"> I'm going to drop Python 3.7. _Originally posted by @simonw in https://github.com/simonw/datasette/issues/1153#issuecomment-1627455892_ It's not supported any more: https://devguide.python.org/versions/",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2097/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1811824307,I_kwDOBm6k_c5r_j6z,2105,When reverse proxying datasette with nginx an URL element gets erronously added,2235371,aki-k,open,0,,,,,3,2023-07-19T12:16:53Z,2023-07-21T21:17:09Z,,NONE,,"I use this nginx config: ``` location /datasette-llm { return 302 /datasette-llm/; } location /datasette-llm/ { proxy_set_header Upgrade $http_upgrade; proxy_set_header Connection ""Upgrade""; proxy_http_version 1.1; proxy_set_header X-Real-IP $remote_addr; proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for; proxy_set_header X-Forwarded-Proto https; proxy_set_header X-Forwarded-Host $http_host; proxy_set_header Host $host; proxy_max_temp_file_size 0; proxy_pass http://127.0.0.1:8001/datasette-llm/; proxy_redirect http:// https://; proxy_buffering off; proxy_request_buffering off; proxy_set_header Origin ''; client_max_body_size 0; auth_basic ""datasette-llm""; auth_basic_user_file /etc/nginx/custom-userdb; } ``` Then I start datasette with this command: ``` datasette serve --setting base_url /datasette-llm/ $(llm logs path) ``` Everything else works right, except the links in ""This data as json, CSV"". 
They get an extra URL element ""datasette-llm"" like this: https://192.168.1.3:5432/datasette-llm/datasette-llm/logs.json?sql=select+*+from+_llm_migrations https://192.168.1.3:5432/datasette-llm/datasette-llm/logs.csv?sql=select+*+from+_llm_migrations&_size=max When I remove that extra ""datasette-llm"" from the URL, those links work too.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2105/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1803264272,I_kwDOBm6k_c5re6EQ,2101,alter: true support for JSON write API,9599,simonw,open,0,,,,,1,2023-07-13T15:24:11Z,2023-07-13T15:24:18Z,,OWNER,,"Requested here: https://discord.com/channels/823971286308356157/823971286941302908/1129034187073134642 > The former datasette-insert plugin had an option `?alter=1` to auto-add new columns. Does the JSON write API also have this?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2101/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1805076818,I_kwDOBm6k_c5rl0lS,2102,API tokens with view-table but not view-database/view-instance cannot access the table,9599,simonw,closed,0,9599,simonw,,,20,2023-07-14T15:34:27Z,2023-08-29T16:32:36Z,2023-08-29T16:32:35Z,OWNER,,"> Spotted a problem while working on this: if you grant a token access to view table for a specific table but don't also grant view database and view instance permissions, that token is useless. > > This was a deliberate design decision in Datasette - it's documented on https://docs.datasette.io/en/1.0a2/authentication.html#access-permissions-in-metadata > >> If a user cannot access a specific database, they will not be able to access tables, views or queries within that database. If a user cannot access the instance they will not be able to access any of the databases, tables, views or queries. > > I'm now second-guessing if this was a good decision. 
_Originally posted by @simonw in https://github.com/simonw/datasette-auth-tokens/issues/7#issuecomment-1636031702_ ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2102/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1808116827,I_kwDOBm6k_c5rxaxb,2103,data attribute on Datasette tables exposing the primary key of the row,9599,simonw,open,0,,,,,0,2023-07-17T16:18:25Z,2023-07-17T16:18:25Z,,OWNER,,Maybe put it on the `` but probably better to go on the `td.type-pk`.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2103/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1808215339,I_kwDOBm6k_c5rxy0r,2104,Tables starting with an underscore should be treated as hidden,9599,simonw,open,0,,,,,2,2023-07-17T17:13:53Z,2023-07-18T22:41:37Z,,OWNER,,"Plugins can then take advantage of this pattern, for example: - https://github.com/simonw/datasette-auth-tokens/pull/8",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2104/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1816857442,I_kwDOBm6k_c5sSwti,2106,`datasette install -e` option,9599,simonw,closed,0,,,,,3,2023-07-22T18:33:42Z,2023-07-26T18:28:33Z,2023-07-22T18:42:54Z,OWNER,,"As seen in LLM and now in `sqlite-utils` too: - https://github.com/simonw/sqlite-utils/issues/570 Useful for developing plugins, see tutorial at https://llm.datasette.io/en/stable/plugins/tutorial-model-plugin.html",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2106/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1822936521,I_kwDOBm6k_c5sp83J,2110,Merge database index page and query view,9599,simonw,closed,0,,,9700784,Datasette 1.0a3,1,2023-07-26T18:21:57Z,2023-07-26T19:53:25Z,2023-07-26T19:53:25Z,OWNER,,"Refs: - #2109 The idea here is that hitting `/content` without a `?sql=` will show an empty result set AND default to including a bunch of extras about the list of tables in the database. Then I won't have to think about `/content` and `/content?sql=` as separate pages any more.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2110/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1822934563,I_kwDOBm6k_c5sp8Yj,2109,Plan for getting the new JSON format query views working,9599,simonw,closed,0,,,9700784,Datasette 1.0a3,5,2023-07-26T18:20:18Z,2023-07-27T00:24:47Z,2023-07-26T18:25:34Z,OWNER,,"I've been stuck on this for too long. 
I'm breaking it down into a full milestone: https://github.com/simonw/datasette/milestone/29",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2109/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1822940964,I_kwDOBm6k_c5sp98k,2115,Ensure all tests pass against new query view JSON,9599,simonw,closed,0,,,9700784,Datasette 1.0a3,0,2023-07-26T18:25:20Z,2023-08-08T02:01:39Z,2023-08-08T02:01:38Z,OWNER,,- #2109 ,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2115/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1822937426,I_kwDOBm6k_c5sp9FS,2111,Implement new /content.json?sql=...,9599,simonw,closed,0,,,9700784,Datasette 1.0a3,4,2023-07-26T18:22:39Z,2023-08-08T02:00:37Z,2023-08-08T02:00:22Z,OWNER,,"This will be the base that the remaining work builds on top of. Refs: - #2109 ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2111/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1822938661,I_kwDOBm6k_c5sp9Yl,2112,Build HTML version of /content?sql=...,9599,simonw,closed,0,,,9700784,Datasette 1.0a3,5,2023-07-26T18:23:34Z,2023-08-08T02:01:09Z,2023-08-08T02:01:01Z,OWNER,,"This will help make the hook as robust as possible. - #2109 ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2112/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1822939274,I_kwDOBm6k_c5sp9iK,2113,Implement and document extras for the new query view page,9599,simonw,open,0,,,8755003,Datasette 1.0a-next,3,2023-07-26T18:24:01Z,2023-08-09T17:35:22Z,,OWNER,,- #2109 ,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2113/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1822940263,I_kwDOBm6k_c5sp9xn,2114,Implement canned queries against new query JSON work,9599,simonw,closed,0,,,9700784,Datasette 1.0a3,3,2023-07-26T18:24:50Z,2023-08-09T15:26:58Z,2023-08-09T15:26:57Z,OWNER,,- #2109 ,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2114/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1822813627,I_kwDOBm6k_c5spe27,2108,some (many?) 
SQL syntax errors are not throwing errors with a .csv endpoint,536941,fgregg,open,0,,,,,0,2023-07-26T16:57:45Z,2023-07-26T16:58:07Z,,CONTRIBUTOR,,"here's a CTE query that should always fail with a syntax error: ```sql with foo as (nonsense) select * from foo; ``` when we make this query against the default endpoint, we do indeed get a 400 status code the problem is returned to the user: https://global-power-plants.datasettes.com/global-power-plants?sql=with+foo+as+%28nonsense%29+select+*+from+foo%3B but, if we use the csv endpoint, we get a 200 status code and no indication of a problem: https://global-power-plants.datasettes.com/global-power-plants.csv?sql=with+foo+as+%28nonsense%29+select+*+from+foo%3B same with this bad sql ```sql select a, from foo; ``` https://global-power-plants.datasettes.com/global-power-plants?sql=select%0D%0A++a%2C%0D%0Afrom%0D%0A++foo%3B vs https://global-power-plants.datasettes.com/global-power-plants.csv?sql=select%0D%0A++a%2C%0D%0Afrom%0D%0A++foo%3B but, datasette catches this bad sql at both endpoints: ```sql slect a from foo; ``` https://global-power-plants.datasettes.com/global-power-plants?sql=slect%0D%0A++a%0D%0Afrom%0D%0A++foo%3B https://global-power-plants.datasettes.com/global-power-plants.csv?sql=slect%0D%0A++a%0D%0Afrom%0D%0A++foo%3B ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2108/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1822949756,I_kwDOBm6k_c5sqAF8,2116,Turn DatabaseDownload into an async view function,9599,simonw,closed,0,,,9700784,Datasette 1.0a3,3,2023-07-26T18:31:59Z,2023-07-26T18:44:00Z,2023-07-26T18:44:00Z,OWNER,,"A minor refactor, but it is a good starting point for this new branch. Refs: - #2109",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2116/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1822982933,I_kwDOBm6k_c5sqIMV,2117,Figure out what to do about `DatabaseView.name`,9599,simonw,closed,0,,,9700784,Datasette 1.0a3,1,2023-07-26T18:58:06Z,2023-08-08T02:02:07Z,2023-08-08T02:02:07Z,OWNER,,"In the old code: https://github.com/simonw/datasette/blob/08181823990a71ffa5a1b57b37259198eaa43e06/datasette/views/database.py#L34-L35 This `name` class attribute was later used by some of the plugin hooks, passed as `view_name`: https://github.com/simonw/datasette/blob/18dd88ee4d78fe9d760e9da96028ae06d938a85c/datasette/hookspecs.py#L50-L54 Figure out how that should work once I've refactored those classes to view functions instead. 
Refs: - #2109 ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2117/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1823428714,I_kwDOBm6k_c5sr1Bq,2120,Add __all__ to datasette/__init__.py,9599,simonw,open,0,,,,,0,2023-07-27T01:07:10Z,2023-07-27T01:07:10Z,,OWNER,,"Currently looks like this: https://github.com/simonw/datasette/blob/08181823990a71ffa5a1b57b37259198eaa43e06/datasette/__init__.py#L1-L6 Adding `__all__ = [""Permission"", ""Forbidden""...]` would let me get rid of those `# noqa` comments.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2120/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1823393475,I_kwDOBm6k_c5srsbD,2119,"database color shows only on index page, not other pages",9599,simonw,closed,0,,,3268330,Datasette 1.0,3,2023-07-27T00:19:39Z,2023-08-11T05:25:45Z,2023-08-11T05:16:24Z,OWNER,,"I think this has been a bug for a long time. https://latest.datasette.io/ currently shows: Those colors are based on a hash of the database name. But when you click through to https://latest.datasette.io/fixtures It's red on all sub-pages too.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2119/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1824457306,I_kwDOBm6k_c5svwJa,2122,Parameters on canned queries: fixed or query-generated list?,1563881,meowcat,open,0,,,,,0,2023-07-27T14:07:07Z,2023-07-27T14:07:07Z,,NONE,,"Hi, currently parameters in canned queries are just text fields. It would be cool to have one of the options below. Would you accept a PR doing something in this direction? (Possibly this could even work as a plugin.) * adding facets, which would work like facets on tables or views, giving a list of selectable options (and leaving parameters as is) * making it possible to provide a query which returns selectable values for a parameter, e.g. 
``` calendar_entries_current_instrument: sql: | select * from calendar_entries where DTEND_UNIX > UNIXEPOCH() and DTSTART_UNIX < UNIXEPOCH() + :days *24*60*60 and current = 1 and MACHINE = :instrument order by DTSTART_UNIX params: days: sql: ""SELECT VALUE FROM generate_series(1, 30, 1)"" # this obviously requires the corresponding sqlite extension instrument: sql: ""SELECT DISTINCT MACHINE FROM calendar_entries"" ``` * making it possible to provide a fixed list of parameters ``` calendar_entries_current_instrument: sql: | select * from calendar_entries where DTEND_UNIX > UNIXEPOCH() and DTSTART_UNIX < UNIXEPOCH() + :days *24*60*60 and current = 1 and MACHINE = :instrument order by DTSTART_UNIX params: days: values: [1, 2, 3, 5, 10, 20, 30] instrument: values: [supermachine, crappymachine, boringmachine] ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2122/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1825007061,I_kwDOBm6k_c5sx2XV,2123,datasette serve when invoked with --reload interprets the serve command as a file,79087,cadeef,open,0,,,,,2,2023-07-27T19:07:22Z,2023-09-18T13:02:46Z,,NONE,,"When running `datasette serve` with the `--reload` flag, the serve command is picked up as a file argument: ``` $ datasette serve --reload test_db Starting monitor for PID 13574. Error: Invalid value for '[FILES]...': Path 'serve' does not exist. Press ENTER or change a file to reload. ``` If a 'serve' file is created it launches properly (albeit with an empty database called serve): ``` $ touch serve; datasette serve --reload test_db Starting monitor for PID 13628. INFO: Started server process [13628] INFO: Waiting for application startup. INFO: Application startup complete. INFO: Uvicorn running on http://127.0.0.1:8001 (Press CTRL+C to quit) ``` Version (running from HEAD on main): ``` $ datasette --version datasette, version 1.0a2 ``` This issue appears to have existed for awhile as https://github.com/simonw/datasette/issues/1380#issuecomment-953366110 mentions the error in a different context. I'm happy to debug and land a patch if it's welcome.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2123/reactions"", ""total_count"": 2, ""+1"": 2, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1843391585,I_kwDOBm6k_c5t3-xh,2134,Add writable canned query demo to latest.datasette.io,9599,simonw,closed,0,,,,,5,2023-08-09T14:31:30Z,2023-08-10T01:22:46Z,2023-08-10T01:05:56Z,OWNER,,"This would be useful while working on: - #2114",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2134/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1843600087,I_kwDOBm6k_c5t4xrX,2135,Release notes for 1.0a3,9599,simonw,closed,0,,,9700784,Datasette 1.0a3,3,2023-08-09T16:09:26Z,2023-08-09T19:17:07Z,2023-08-09T19:17:06Z,OWNER,,118 commits! 
https://github.com/simonw/datasette/compare/1.0a2...26be9f0445b753fb84c802c356b0791a72269f25,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2135/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1843710170,I_kwDOBm6k_c5t5Mja,2136,Query view shouldn't return `columns`,9599,simonw,closed,0,,,9700784,Datasette 1.0a3,4,2023-08-09T17:23:57Z,2023-08-09T19:03:04Z,2023-08-09T19:03:04Z,OWNER,,"I just noticed that https://latest.datasette.io/fixtures/roadside_attraction_characteristics.json?_labels=on&_size=1 returns: ```json { ""ok"": true, ""next"": ""1"", ""rows"": [ { ""rowid"": 1, ""attraction_id"": { ""value"": 1, ""label"": ""The Mystery Spot"" }, ""characteristic_id"": { ""value"": 2, ""label"": ""Paranormal"" } } ], ""truncated"": false } ``` But https://latest.datasette.io/fixtures.json?sql=select+rowid%2C+attraction_id%2C+characteristic_id+from+roadside_attraction_characteristics+order+by+rowid+limit+1 returns: ```json { ""rows"": [ { ""rowid"": 1, ""attraction_id"": 1, ""characteristic_id"": 2 } ], ""columns"": [ ""rowid"", ""attraction_id"", ""characteristic_id"" ], ""ok"": true, ""truncated"": false } ``` The `columns` key in the query response is inconsistent with the table response.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2136/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1843821954,I_kwDOBm6k_c5t5n2C,2137,Redesign row default JSON,9599,simonw,open,0,,,8755003,Datasette 1.0a-next,1,2023-08-09T18:49:11Z,2023-08-09T19:02:47Z,,OWNER,,"This URL here: https://latest.datasette.io/fixtures/simple_primary_key/1.json?_extras=foreign_key_tables ```json { ""database"": ""fixtures"", ""table"": ""simple_primary_key"", ""rows"": [ { ""id"": ""1"", ""content"": ""hello"" } ], ""columns"": [ ""id"", ""content"" ], ""primary_keys"": [ ""id"" ], ""primary_key_values"": [ ""1"" ], ""units"": {}, ""foreign_key_tables"": [ { ""other_table"": ""foreign_key_references"", ""column"": ""id"", ""other_column"": ""foreign_key_with_blank_label"", ""count"": 0, ""link"": ""/fixtures/foreign_key_references?foreign_key_with_blank_label=1"" }, { ""other_table"": ""foreign_key_references"", ""column"": ""id"", ""other_column"": ""foreign_key_with_label"", ""count"": 1, ""link"": ""/fixtures/foreign_key_references?foreign_key_with_label=1"" }, { ""other_table"": ""complex_foreign_keys"", ""column"": ""id"", ""other_column"": ""f3"", ""count"": 1, ""link"": ""/fixtures/complex_foreign_keys?f3=1"" }, { ""other_table"": ""complex_foreign_keys"", ""column"": ""id"", ""other_column"": ""f2"", ""count"": 0, ""link"": ""/fixtures/complex_foreign_keys?f2=1"" }, { ""other_table"": ""complex_foreign_keys"", ""column"": ""id"", ""other_column"": ""f1"", ""count"": 1, ""link"": ""/fixtures/complex_foreign_keys?f1=1"" } ], ""query_ms"": 4.226590999678592, ""source"": ""tests/fixtures.py"", ""source_url"": ""https://github.com/simonw/datasette/blob/main/tests/fixtures.py"", ""license"": ""Apache License 2.0"", ""license_url"": ""https://github.com/simonw/datasette/blob/main/LICENSE"", ""ok"": true, ""truncated"": false } ``` That `?_extras=` should be `?_extra=` - plus the row JSON should be redesigned to fit the new default JSON representation.",107914493,datasette,issue,,,"{""url"": 
""https://api.github.com/repos/simonw/datasette/issues/2137/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1844213115,I_kwDOBm6k_c5t7HV7,2138,on_success_message_sql option for writable canned queries,9599,simonw,closed,0,,,8755003,Datasette 1.0a-next,2,2023-08-10T00:20:14Z,2023-08-10T00:39:40Z,2023-08-10T00:34:26Z,OWNER,,"> Or... how about if the `on_success_message` option could define a SQL query to be executed to generate that message? Maybe `on_success_message_sql`. - https://github.com/simonw/datasette/issues/2134",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2138/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1838266862,I_kwDOBm6k_c5tkbnu,2126,Permissions in metadata.yml / metadata.json,36199671,ctsrc,closed,0,,,,,3,2023-08-06T16:24:10Z,2023-08-11T05:52:30Z,2023-08-11T05:52:29Z,NONE,,"https://docs.datasette.io/en/latest/authentication.html#other-permissions-in-metadata says the following: > For all other permissions, you can use one or more ""permissions"" blocks in your metadata. > To grant access to the permissions debug tool to all signed in users you can grant permissions-debug to any actor with an id matching the wildcard * by adding this a the root of your metadata: ```yaml permissions: debug-menu: id: '*' ``` I tried this. My `metadata.yml` file looks like: ```yaml permissions: debug-menu: id: '*' permissions-debug: id: '*' plugins: datasette-auth-passwords: myuser_password_hash: $env: ""PASSWORD_HASH_MYUSER"" ``` And then I run ```zsh datasette -m metadata.yml tiddlywiki.db --root ``` And I open a session for the ""root"" user of datasette with the link given. I open a private browser session and log in as ""myuser"" from http://127.0.0.1:8001/-/login Then I check http://127.0.0.1:8001/-/actor which confirms that I am logged in as the ""myuser"" actor ```json { ""actor"": { ""id"": ""myuser"" } } ``` In the session where I am logged in as ""myuser"" I then try to go to http://127.0.0.1:8001/-/permissions But all I get there as the logged in user ""myuser"" is > Forbidden > > Permission denied And then if I check the http://127.0.0.1:8001/-/permissions as the datasette ""root"" user from another browser session, I see: > permissions-debug checked at 2023-08-06T16:22:58.997841 ✗ (used default) > > Actor: {""id"": ""myuser""} It seems that in spite of having tried to give the `permissions-debug` permission to the ""myuser"" user in my `metadata.yml` file, datasette does not agree that ""myuser"" has permission `permissions-debug`.. 
What do I need to do differently so that my ""myuser"" user is able to access http://127.0.0.1:8001/-/permissions ?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2126/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1838469176,I_kwDOBm6k_c5tlNA4,2127,Context base class to support documenting the context,9599,simonw,open,0,,,3268330,Datasette 1.0,3,2023-08-07T00:01:02Z,2023-08-10T01:30:25Z,,OWNER,,"This idea first came up here: - https://github.com/simonw/datasette/issues/2112#issuecomment-1652751140 If `datasette.render_template(...)` takes an optional `Context` subclass as an alternative to a context dictionary, I could then use dataclasses to define the context made available to specific templates - which then gives me something I can use to help document what they are. Also refs: - https://github.com/simonw/datasette/issues/1510",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2127/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1840324765,I_kwDOBm6k_c5tsSCd,2129,CSV ?sql= should indicate errors,9599,simonw,open,0,,,3268330,Datasette 1.0,1,2023-08-07T23:13:04Z,2023-08-08T02:02:21Z,,OWNER,,"> https://latest.datasette.io/_memory.csv?sql=select+blah is a blank page right now: ```bash curl -I 'https://latest.datasette.io/_memory.csv?sql=select+blah' ``` ``` HTTP/2 200 access-control-allow-origin: * access-control-allow-headers: Authorization, Content-Type access-control-expose-headers: Link access-control-allow-methods: GET, POST, HEAD, OPTIONS access-control-max-age: 3600 content-type: text/plain; charset=utf-8 x-databases: _memory, _internal, fixtures, fixtures2, extra_database, ephemeral date: Mon, 07 Aug 2023 23:12:15 GMT server: Google Frontend ``` _Originally posted by @simonw in https://github.com/simonw/datasette/issues/2118#issuecomment-1668688947_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2129/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1840329615,I_kwDOBm6k_c5tsTOP,2130,Render plugin mechanism needs `error` and `truncated` fields,9599,simonw,closed,0,,,9700784,Datasette 1.0a3,2,2023-08-07T23:19:19Z,2023-08-08T01:51:54Z,2023-08-08T01:47:42Z,OWNER,,"While working on: - https://github.com/simonw/datasette/pull/2118 It became clear that the `render` callback function documented here: https://docs.datasette.io/en/0.64.3/plugin_hooks.html#register-output-renderer-datasette Needs to grow the ability to be told if an error occurred (an `error` string) and if the results were truncated (a `truncated` boolean).",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2130/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1840417903,I_kwDOBm6k_c5tsoxv,2131,Refactor code that supports templates_considered comment,9599,simonw,open,0,,,3268330,Datasette 1.0,1,2023-08-08T01:28:36Z,2023-08-09T15:27:41Z,,OWNER,,"I ended up duplicating it here: https://github.com/simonw/datasette/blob/7532feb424b1dce614351e21b2265c04f9669fe2/datasette/views/database.py#L164-L167 I think it should move to 
`datasette.render_template()` - and maybe have a renamed template variable too.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2131/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1841343173,I_kwDOBm6k_c5twKrF,2132,Get form fields on query page working again ,9599,simonw,closed,0,,,9700784,Datasette 1.0a3,1,2023-08-08T13:39:05Z,2023-08-08T13:45:10Z,2023-08-08T13:45:09Z,OWNER,,"Caused by: - #2112 https://latest.datasette.io/fixtures?sql=select+pk1%2C+pk2%2C+pk3%2C+content+from+compound_three_primary_keys+where+%22pk1%22+%3D+%3Ap0+order+by+pk1%2C+pk2%2C+pk3+limit+101&p0=b The `:p0` form field is missing. Submitting the form results in this error: ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2132/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1841501975,I_kwDOBm6k_c5twxcX,2133,[feature request]`datasette install plugins.json` options,54462,HaveF,closed,0,,,,,9,2023-08-08T15:06:50Z,2023-08-10T00:31:24Z,2023-08-09T22:04:46Z,NONE,,"Hi, simon ❤️ `datasette plugins --all > plugins.json` could generate all plugins info. On another machine, it would be great to install all plugins just by `datasette install plugins.json`",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2133/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1846076261,I_kwDOBm6k_c5uCONl,2139,border-color: ##ff0000 bug - two hashes,9599,simonw,closed,0,,,8755003,Datasette 1.0a-next,2,2023-08-11T01:22:58Z,2023-08-11T05:16:24Z,2023-08-11T05:16:24Z,OWNER,,"Spotted this on https://latest.datasette.io/extra_database ```html
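<!-- the original snippet was stripped from this export; per the issue title it presumably contained a doubled hash, along the lines of style=""border-color: ##ff0000"" -->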
```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2139/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1847201263,I_kwDOBm6k_c5uGg3v,2140,Remove all remaining documentation instances of '$ ',9599,simonw,closed,0,,,,,1,2023-08-11T17:42:13Z,2023-08-11T17:52:25Z,2023-08-11T17:45:00Z,OWNER,,"For example this: https://github.com/simonw/datasette/blob/4535568f2ce907af646304d0ebce2500ebd55677/docs/authentication.rst?plain=1#L33-L35 The problem with that `$ ` prefix is that it prevents users from copying and pasting the raw command. https://docs.datasette.io/en/stable/authentication.html#using-the-root-actor",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2140/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1855885427,I_kwDOBm6k_c5unpBz,2143,De-tangling Metadata before Datasette 1.0,15178711,asg017,open,0,,,,,24,2023-08-18T00:51:50Z,2023-08-24T18:28:27Z,,CONTRIBUTOR,,"Metadata in Datasette is a really powerful feature, but is a bit difficult to work with. It was initially a way to add ""metadata"" about your ""data"" in Datasette instances, like descriptions for databases/tables/columns, titles, source URLs, licenses, etc. But it later became the go-to spot for other Datasette features that have nothing to do with metadata, like permissions/plugins/canned queries. Specifically, I've found the following problems when working with Datasette metadata: 1. Metadata cannot be updated without re-starting the entire Datasette instance. 2. The `metadata.json`/`metadata.yaml` has become a kitchen sink of unrelated (imo) features like plugin config, authentication config, canned queries 3. The Python APIs for defining extra metadata are a bit awkward (the `datasette.metadata()` class, `get_metadata()` hook, etc.) ## Possible solutions Here's a few ideas of Datasette core changes we can make to address these problems. ### Re-vamp the Datasette Python metadata APIs The Datasette object has a single `datasette.metadata()` method that's a bit difficult to work with. There's also no Python API for inserted new metadata, so plugins have to rely on the `get_metadata()` hook. The `get_metadata()` hook can also be improved - it doesn't work with async functions yet, so you're quite limited to what you can do. (I'm a bit fuzzy on what to actually do here, but I imagine it'll be very small breaking changes to a few Python methods) ### Add an optional `datasette_metadata` table Datasette should detect and use metadata stored in a new special table called `datasette_metadata`. This would be a regular table that a user can edit on their own, and would serve as a ""live updating"" source of metadata, than can be changed while the Datasette instance is running. Not too sure what the schema would look like, but I'd imagine: ```sql CREATE TABLE datasette_metadata( level text, target any, key text, value any, primary key (level, target) ) ``` Every row in this table would map to a single metadata ""entry"". - `level` would be one of ""datasette"", ""database"", ""table"", ""column"", which is the ""level"" the entry describes. For example, `level=""table""` means it is metadata about a specific table, `level=""database""` for a specific database, or `level=""datasette""` for the entire Datasette instance. 
- `target` would ""point"" to the specific object the entry metadata is about, and would depend on what `level` is specific. - `level=""database""`: `target` would be the string name of the database that the metadata entry is about. ex `""fixtures""` - `level=""table""`: `target` would be a JSON array of two strings. The first element would be the database name, and the second would be the table name. ex `[""fixtures"", ""students""]` - `level=""column""`: `target` would be a JSON array of 3 strings: The database name, table name, and column name. Ex `[""fixtures"", ""students"", ""student_id""`] - `key` would be the type of metadata entry the row has, similar to the current ""keys"" that exist in `metadata.json`. Ex `""about_url""`, `""source""`, `""description""`, etc - `value` would be the text value of be metadata entry. The literal text value of a description, about_url, column_label, etc A quick sample: level | target | key | value -- | -- | -- | -- datasette | NULL | title | my datasette title... db | fixtures | source | table | [""fixtures"", ""students""] | label_column | student_name column | [""fixtures"", ""students"", ""birthdate""] | description | This `datasette_metadata` would be configured with other tools, and hopefully not manually by end users. Datasette Core could also offer a UI for editing entries in `datasette_metadata`, to update descriptions/columns on the fly. ### Re-vamp `metadata.json` and move non-metadata config to another place The motivation behind this is that it's awkward that `metadata.json` contains config about things that are not strictly metadata, including: - Plugin configuration - [Authentication/permissions](https://docs.datasette.io/en/latest/authentication.html#access-permissions-in-metadata) (ex the `allow` key on datasettes/databases/tables - Canned queries. might be controversial, but in my mind, canned queries are application-specific code and configuration, and don't describe the data that exists in SQLite databases. I think we should move these outside of `metadata.json` and into a different file. The `datasette.json` idea in #2093 may be a good solution here: plugin/permissions/canned queries can be defined in `datasette.json`, while `metadata.json`/`datasette_metadata` will strictly be about documenting databases/tables/columns. ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2143/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1857234285,I_kwDOBm6k_c5usyVt,2145,If a row has a primary key of `null` various things break,9599,simonw,open,0,,,,,23,2023-08-18T20:06:28Z,2023-08-21T17:30:01Z,,OWNER,,"Stumbled across this while experimenting with `datasette-write-ui`. 
The error I got was a 500 on the `/db` page: > `'NoneType' object has no attribute 'encode'` Tracked it down to this code, which assembles the URL for a row page: https://github.com/simonw/datasette/blob/943df09dcca93c3b9861b8c96277a01320db8662/datasette/utils/__init__.py#L120-L134 That's because `tilde_encode` can't handle `None`: https://github.com/simonw/datasette/blob/943df09dcca93c3b9861b8c96277a01320db8662/datasette/utils/__init__.py#L1175-L1178 ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2145/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1858228057,I_kwDOBm6k_c5uwk9Z,2147,Plugin hook for database queries that are run,18899,jackowayed,open,0,,,,,6,2023-08-20T18:43:50Z,2023-08-24T03:54:35Z,,NONE,,"I'm interested in making a plugin that saves every query that gets run to a table in the database. (I know about datasette-query-history but thought it would be good to have a server-side option.) As far as I can tell reading the docs, there isn't really a hook setup to allow this. Maybe I could hack it with some of the hooks that are passed requests, but that doesn't seem good. I'm a little surprised this isn't possible, so I thought I would open an issue and see if that's a deeply considered decision or just ""haven't needed it yet."" I'm potentially interested in implementing the hook if the latter.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2147/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1876407598,I_kwDOBm6k_c5v17Uu,2169,execute-sql on a database should imply view-database/view-permission,9599,simonw,closed,0,,,,,0,2023-08-31T22:45:56Z,2023-08-31T22:46:28Z,2023-08-31T22:46:28Z,OWNER,,"I noticed that a token with `execute-sql` permission alone did not work, because it was not allowed to view the instance of the database.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2169/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1876353656,I_kwDOBm6k_c5v1uJ4,2168,Consider a request/response wrapping hook slightly higher level than asgi_wrapper(),9599,simonw,open,0,,,,,6,2023-08-31T21:42:04Z,2023-09-10T17:54:08Z,,OWNER,,"There's a long justification for why this might be needed here: - https://github.com/simonw/datasette-auth-tokens/issues/10#issuecomment-1701820001 Short version: it would be neat if it was possible to stash some data on the `request` object such that a later plugin/middleware-type-thing could use that to influence the final returned response - similar to the kinds of things you can do with Django middleware. The `asgi_wrapper()` mechanism doesn't have access to the request or response objects - it gets `scope` and can mess around with `receive` and `send`, but those are pretty low-level primitives. 
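To illustrate just how low-level that is, here is a minimal sketch of an `asgi_wrapper()` plugin - the hook signature is as documented, the pass-through body is purely illustrative:

```python
from datasette import hookimpl


@hookimpl
def asgi_wrapper(datasette):
    def wrap(app):
        async def wrapped_app(scope, receive, send):
            # Only raw ASGI primitives are available here: inspecting or
            # changing the response means intercepting send() messages by
            # hand rather than working with a Response object.
            await app(scope, receive, send)

        return wrapped_app

    return wrap
```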
Since Datasette has well-defined `request` and `response` objects now it might be nice to have a middleware layer that can manipulate those directly.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2168/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1863810783,I_kwDOBm6k_c5vF37f,2150,form label { width: 15% } is a bad default,9599,simonw,closed,0,,,,,4,2023-08-23T18:22:27Z,2023-08-23T18:37:18Z,2023-08-23T18:35:48Z,OWNER,,"See: - https://github.com/simonw/datasette-configure-fts/issues/14 - https://github.com/simonw/datasette-auth-tokens/issues/12",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2150/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1865232341,I_kwDOBm6k_c5vLS_V,2153,Datasette --get --actor option,9599,simonw,closed,0,,,,,5,2023-08-24T14:00:03Z,2023-08-28T20:19:15Z,2023-08-28T20:15:53Z,OWNER,,"I experimented with a prototype of this here: - https://github.com/simonw/datasette/issues/2102#issuecomment-1691037971_ Which lets me run requests as if they belonged to a specific actor like this: ```bash datasette fixtures.db --get '/fixtures/facetable.json' --actor '{ ""_r"": { ""r"": { ""fixtures"": { ""facetable"": [ ""vt"" ] } } }, ""a"": ""user"" }' ``` Really useful for testing actors an `_r` options. Is this worth adding as a feature?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2153/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1865649347,I_kwDOBm6k_c5vM4zD,2156,datasette -s/--setting option for setting nested configuration options,9599,simonw,open,0,,,,,4,2023-08-24T18:09:27Z,2023-08-28T19:33:05Z,,OWNER,,"> I've been thinking about what it might look like to allow command-line arguments to be used to define _any_ of the configuration options in `datasette.yml`, as alternative and more convenient syntax. > > Here's what I've come up with: > ``` > datasette \ > -s settings.sql_time_limit_ms 1000 \ > -s plugins.datasette-auth-tokens.manage_tokens true \ > -s plugins.datasette-auth-tokens.manage_tokens_database tokens \ > mydatabase.db tokens.db > ``` > Which would be equivalent to `datasette.yml` containing this: > ```yaml > plugins: > datasette-auth-tokens: > manage_tokens: true > manage_tokens_database: tokens > settings: > sql_time_limit_ms: 1000 > ``` More details in https://github.com/simonw/datasette/issues/2143#issuecomment-1690792514 ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2156/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1865869205,I_kwDOBm6k_c5vNueV,2157,"Proposal: Make the `_internal` database persistent, customizable, and hidden",15178711,asg017,open,0,,,,,3,2023-08-24T20:54:29Z,2023-08-31T02:45:56Z,,CONTRIBUTOR,,"The current `_internal` database is used by Datasette core to cache info about databases/tables/columns/foreign keys of databases in a Datasette instance. It's a temporary database created at startup, that can only be seen by the root user. 
See an [example `_internal` DB here](https://latest.datasette.io/_internal), after [logging in as root](https://latest.datasette.io/login-as-root). The current `_internal` database has a few rough edges: - It's part of `datasette.databases`, so many plugins have to specifically exclude `_internal` from their queries [examples here](https://github.com/search?q=datasette+hookimpl+%22_internal%22+language%3APython+-path%3Adatasette%2F&ref=opensearch&type=code) - It's only used by Datasette core and can't be used by plugins or 3rd parties - It's created from scratch at startup and stored in memory. Why is fine, the performance is great, but persistent storage would be nice. Additionally, it would be really nice if plugins could use this `_internal` database to store their own configuration, secrets, and settings. For example: - `datasette-auth-tokens` [creates a `_datasette_auth_tokens` table](https://github.com/simonw/datasette-auth-tokens/blob/main/datasette_auth_tokens/__init__.py#L15) to store auth token metadata. This could be moved into the `_internal` database to avoid writing to the gues database - `datasette-socrata` [creates a `socrata_imports`](https://github.com/simonw/datasette-socrata/blob/1409aa9b4d2fc3aff286b52e73af33b5786d56d0/datasette_socrata/__init__.py#L190-L198) table, which also can be in `_internal` - `datasette-upload-csvs` [creates a `_csv_progress_`](https://github.com/simonw/datasette-upload-csvs/blob/main/datasette_upload_csvs/__init__.py#L154) table, which can be in `_internal` - `datasette-write-ui` wants to have the ability for users to toggle whether a table appears editable, which can be either in `datasette.yaml` or on-the-fly by storing config in `_internal` In general, these are specific features that Datasette plugins would have access to if there was a central internal database they could read/write to: - **Dynamic configuration**. Changing the `datasette.yaml` file works, but can be tedious to restart the server every time. Plugins can define their own configuration table in `_internal`, and could read/write to it to store configuration based on user actions (cell menu click, API access, etc.) - **Caching**. If a plugin or Datasette Core needs to cache some expensive computation, they can store it inside `_internal` (possibly as a temporary table) instead of managing their own caching solution. - **Audit logs**. If a plugin performs some sensitive operations, they can log usage info to `_internal` for others to audit later. - **Long running process status**. Many plugins (`datasette-upload-csvs`, `datasette-litestream`, `datasette-socrata`) perform tasks that run for a really long time, and want to give continue status updates to the user. They can store this info inside` _internal` - **Safer authentication**. Passwords and authentication plugins usually store credentials/hashed secrets in configuration files or environment variables, which can be difficult to handle. Now, they can store them in `_internal` ## Proposal - We remove `_internal` from [`datasette.databases`](https://docs.datasette.io/en/latest/internals.html#databases) property. - We add new `datasette.get_internal_db()` method that returns the `_internal` database, for plugins to use - We add a new `--internal internal.db` flag. 
If provided, then the `_internal` DB will be sourced from that file, and further updates will be persisted to that file (instead of an in-memory database) - When creating internal.db, create a new `_datasette_internal` table to mark it a an ""datasette internal database"" - In `datasette serve`, we check for the existence of the `_datasette_internal` table. If it exists, we assume the user provided that file in error and raise an error. This is to limit the chance that someone accidentally publishes their internal database to the internet. We could optionally add a `--unsafe-allow-internal` flag (or database plugin) that allows someone to do this if they really want to. ## New features unlocked with this These features don't really need a standardized `_internal` table per-say (plugins could currently configure their own long-time storage features if they really wanted to), but it would make it much simpler to create these kinds of features with a persistent application database. - **`datasette-comments`** : A plugin for commenting on rows or specific values in a database. Comment contents + threads + email notification info can be stored in `_internal` - **Bookmarks**: ""Bookmarking"" an SQL query could be stored in `_internal`, or a URL link shortener - **Webhooks**: If a plugin wants to either consume a webhook or create a new one, they can store hashed credentials/API endpoints in `_internal`",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2157/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1872043170,I_kwDOBm6k_c5vlRyi,2163,Rename core_X to catalog_X in the internals,9599,simonw,closed,0,,,,,1,2023-08-29T16:45:00Z,2023-08-29T17:01:31Z,2023-08-29T17:01:31Z,OWNER,,"Discussed with Alex this morning. We think the American spelling is fine here (it's shorter than `catalogue`) and that it's a slightly less lazy name than `core_`. Follows: - https://github.com/simonw/datasette/issues/2157",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2163/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1874255116,I_kwDOBm6k_c5vtt0M,2164,Ability to only load a specific list of plugins,9599,simonw,closed,0,,,,,1,2023-08-30T19:33:41Z,2023-09-08T04:35:46Z,2023-08-30T22:12:27Z,OWNER,,"I'm going to try and get this working through an environment variable, so that you can start Datasette and it will only load a subset of plugins including those that use the `register_commands()` hook. Initial research on this: - https://github.com/pytest-dev/pluggy/issues/422",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2164/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1875739055,I_kwDOBm6k_c5vzYGv,2167,Document return type of await ds.permission_allowed(),9599,simonw,open,0,,,,,0,2023-08-31T15:14:23Z,2023-08-31T15:14:23Z,,OWNER,,"The return type isn't documented here: https://github.com/simonw/datasette/blob/4c3ef033110407f3b3dbce501659d523724985e0/docs/internals.rst#L327-L350 On inspecting the code I'm not 100% sure if it's possible for this. method to return `None`, or if it can only return `True` or `False`. Need to confirm that. 
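For reference, the call shape in question - a usage sketch, assuming a `datasette` instance and an `actor` dictionary in scope:

```python
async def check(datasette, actor):
    # documented call shape; the open question is whether this await
    # can ever yield None rather than True/False
    return await datasette.permission_allowed(
        actor, 'view-instance', default=False
    )
```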
https://github.com/simonw/datasette/blob/4c3ef033110407f3b3dbce501659d523724985e0/datasette/app.py#L822C15-L853",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2167/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1895266807,I_kwDOBm6k_c5w93n3,2184,Design decision - should configuration be exposed at /-/config ?,9599,simonw,open,0,,,,,0,2023-09-13T21:07:08Z,2023-09-13T21:07:38Z,,OWNER,,"> This made me think. That `{""$env"": ""ENV_VAR""}` hack was introduced back here: > > - https://github.com/simonw/datasette/issues/538 > > The problem it was solving was that metadata was visible to everyone with access to the instance at `/-/metadata` but plugins clearly needed a way to set secret settings. > > Now that this stuff is moving to config, we have some decisions to make: > > 1. Add `/-/config` to let people see the configuration of their instance, and keep the `$env` trick for secret settings. > 2. Say all configuration aside from metadata is secret and make `$env` optional or ditch it entirely. > 3. Allow plugins to announce which of their configuration options are secret so we can automatically redact them from `/-/config` > > I've found `/-/metadata` extraordinarily useful as a user of Datasette - it really helps me understand exactly what's going on if I run into any problems with a plugin, if I can quickly check what the settings look like. > > So I'm leaning towards option 1 or 3. _Originally posted by @simonw in https://github.com/simonw/datasette/pull/2183#discussion_r1325076924_ Also refs: - #2093",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2184/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1884408624,I_kwDOBm6k_c5wUcsw,2177,Move schema tables from _internal to _catalog,9599,simonw,open,0,,,,,1,2023-09-06T16:58:33Z,2023-09-06T17:04:30Z,,OWNER,,"This came up in discussion over: - https://github.com/simonw/datasette/pull/2174 ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2177/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1886350562,I_kwDOBm6k_c5wb2zi,2178,Don't show foreign key links to tables the user cannot access,9599,simonw,closed,0,,,,,5,2023-09-07T17:56:41Z,2023-09-07T23:28:27Z,2023-09-07T23:28:27Z,OWNER,,"Spotted this problem while working on this plugin: - https://github.com/simonw/datasette-public It's possible to make a table public to any users - but then you may end up with situations like this: That table is public, but the foreign key links go to tables that are NOT public. We're also leaking the names of the values in those private tables here, which we shouldn't do. So this is a tiny bit of an information leak. 
Since this only affects people who have configured a table to be public that has foreign keys to a table that is private I don't think this is worth issuing a vulnerability report about - I very much doubt anyone is running Datasette configured in a way that could result in problems because of this.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2178/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1886649402,I_kwDOBm6k_c5wc_w6,2179,Flaky test: test_hidden_sqlite_stat1_table,9599,simonw,closed,0,,,,,0,2023-09-07T22:48:43Z,2023-09-07T22:51:19Z,2023-09-07T22:51:19Z,OWNER,,"This test here: https://github.com/simonw/datasette/blob/fbcb103c0cb6668018ace539a01a6a1f156e8d6a/tests/test_api.py#L1011-L1020 It failed for me like this: `E AssertionError: assert [('normal', False), ('sqlite_stat1', True), ('sqlite_stat4', True)] in ([('normal', False), ('sqlite_stat1', True)],)` Looks like some builds of SQLite include a `sqlite_stat4` table.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2179/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1886791100,I_kwDOBm6k_c5wdiW8,2180,Plugin hook: `actors_from_ids()`,9599,simonw,closed,0,,,,,6,2023-09-08T01:16:41Z,2023-09-10T17:44:14Z,2023-09-08T04:28:03Z,OWNER,,"In building Datasette Cloud we realized that a bunch of the features we are building need a way of resolving an actor ID to the actual actor, in order to display something more interesting than just an integer ID. Social plugins in particular need this - comments by X, CSV uploaded by X, that kind of thing. I think the solution is a new plugin hook: `actors_from_ids(datasette, ids)` which can return a list of actor dictionaries. The default implementation can return `[{""id"": ""...""}]` for the IDs passed to it. Pluggy has a `firstresult=True` option which is relevant here, since this is the first plugin hook we will have implemented where only one plugin should provide an answer.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2180/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1910269679,I_kwDOBm6k_c5x3Gbv,2196,Discord invite link returns 401,1892194,Olshansk,closed,0,,,,,2,2023-09-24T15:16:54Z,2023-10-13T00:07:08Z,2023-10-12T21:54:54Z,NONE,,"I found the link to the datasette discord channel via [this query](https://github.com/search?q=repo%3Asimonw%2Fdatasette%20discord&type=code). 
The following video should be self explanatory: https://github.com/simonw/datasette/assets/1892194/8cd33e88-bcaa-41f3-9818-ab4d589c3f02 Link for reference: https://discord.com/invite/ktd74dm5mw",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2196/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1898927976,I_kwDOBm6k_c5xL1do,2186,Mechanism for register_output_renderer hooks to access full count,9599,simonw,open,0,,,3268330,Datasette 1.0,2,2023-09-15T18:57:54Z,2023-09-15T19:27:59Z,,OWNER,,"The cause of this bug: - https://github.com/simonw/datasette-export-notebook/issues/17 Is that `datasette-export-notebook` was consulting `data[""filtered_table_rows_count""]` in the render output plugin function in order to show the total number of rows that would be exported. That field is no longer available by default - the `""count""` field is only available if `?_extra=count` was passed. It would be useful if plugins like this could access the total count on demand, should they need to.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2186/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1899310542,I_kwDOBm6k_c5xNS3O,2187,Datasette for serving JSON only,19705106,geofinder,open,0,,,,,0,2023-09-16T05:48:29Z,2023-09-16T05:48:29Z,,NONE,,"Hi, is there any way to use datasette for serving json only without displaying webpage? I've tried to search about this in documentation but didn't get any information",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2187/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1900026059,I_kwDOBm6k_c5xQBjL,2188,"Plugin Hooks for ""compile to SQL"" languages",15178711,asg017,open,0,,,,,2,2023-09-18T01:37:15Z,2023-09-18T06:58:53Z,,CONTRIBUTOR,,"There's a ton of tools/languages that compile to SQL, which may be nice in Datasette. Some examples: - Logica https://logica.dev - PRQL https://prql-lang.org - Malloy, but not sure if it works with SQLite? https://github.com/malloydata/malloy It would be cool if plugins could extend Datasette to use these languages, in both the code editor and API usage. A few things I'd imagine a `datasette-prql` or `datasette-logica` plugin would do: - `prql=` instead of `sql=` - Code editor support (syntax highlighting, autocomplete) - Hide/show SQL",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2188/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1901416155,I_kwDOBm6k_c5xVU7b,2189,Server hang on parallel execution of queries to named in-memory databases,9599,simonw,closed,0,,,,,31,2023-09-18T17:23:18Z,2023-09-21T22:26:21Z,2023-09-21T22:26:21Z,OWNER,,"I've started to encounter a bug where queries to tables inside named in-memory databases sometimes trigger server hangs. I'm still trying to figure out what's going on here - on one occasion I managed to Ctrl+C the server and saw an exception that mentioned a thread lock, but usually hitting Ctrl+C does nothing and I have to `kill -9` the PID instead. This is all running on my M2 Mac. 
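For reference, a minimal sketch of the kind of access pattern involved - illustrative only, not a confirmed reproduction:

```python
import asyncio

from datasette.app import Datasette


async def main():
    ds = Datasette()
    db = ds.add_memory_database('mem1')  # a named in-memory database
    await db.execute_write('create table t (id integer)')
    # several queries against the same named in-memory database running
    # in parallel - the hangs show up intermittently under concurrency
    await asyncio.gather(*[db.execute('select * from t') for _ in range(10)])


asyncio.run(main())
```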
I've seen the bug in the Datasette 1.0 alphas and in Datasette 0.64.3 - but reverting to 0.61 appeared to fix it.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2189/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1907655261,I_kwDOBm6k_c5xtIJd,2193,"""Test DATASETTE_LOAD_PLUGINS"" test shows errors but did not fail the CI run",9599,simonw,closed,0,,,,,6,2023-09-21T19:49:34Z,2023-09-21T21:56:43Z,2023-09-21T21:56:43Z,OWNER,,"> That passed on 3.8 but should have failed: https://github.com/simonw/datasette/actions/runs/6266341481/job/17017099801 - the ""Test DATASETTE_LOAD_PLUGINS"" test shows errors but did not fail the CI run. _Originally posted by @simonw in https://github.com/simonw/datasette/issues/2057#issuecomment-1730201226_ ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2193/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1907695234,I_kwDOBm6k_c5xtR6C,2194,"Deploy failing with ""plugins/alternative_route.py: Not a directory""",9599,simonw,closed,0,,,,,8,2023-09-21T20:17:49Z,2023-09-21T22:08:19Z,2023-09-21T22:08:19Z,OWNER,,"https://github.com/simonw/datasette/actions/runs/6266449018/job/17017460074 This is a bit of a mystery, I don't think I've changed anything recently that could have broken this.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2194/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1907765514,I_kwDOBm6k_c5xtjEK,2195,`datasette publish` needs support for the new config/metadata split,9599,simonw,open,0,,,,,9,2023-09-21T21:08:12Z,2023-09-21T22:57:48Z,,OWNER,,"> ... which raises the challenge that `datasette publish` doesn't yet know what to do with a config file! _Originally posted by @simonw in https://github.com/simonw/datasette/issues/2194#issuecomment-1730259871_ ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2195/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1930008379,I_kwDOBm6k_c5zCZc7,2197,click-default-group-wheel dependency conflict,1176293,ar-jan,closed,0,,,,,3,2023-10-06T11:49:20Z,2023-10-12T21:53:17Z,2023-10-12T21:53:17Z,NONE,,"I upgraded my dependencies, then ran into this problem running `datasette inspect`: > env/lib/python3.9/site-packages/datasette/cli.py"", line 6, in > from click_default_group import DefaultGroup > ModuleNotFoundError: No module named 'click_default_group' Turns out the released version of datasette still depends on `click-default-group-wheel`, so `click-default-group` doesn't get installed/recognized: ``` $ virtualenv venv $ source venv/bin/activate $ pip install datasette $ pip list | grep click-default-group click-default-group 1.2.4 click-default-group-wheel 1.2.3 $ python -c ""from click_default_group import DefaultGroup"" Traceback (most recent call last): File """", line 1, in ModuleNotFoundError: No module named 'click_default_group' $ pip install --force-reinstall click-default-group ... ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. 
This behaviour is the source of the following dependency conflicts. datasette 0.64.4 requires click-default-group-wheel>=1.2.2, which is not installed. Successfully installed click-8.1.7 click-default-group-1.2.4 ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2197/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1931794126,I_kwDOBm6k_c5zJNbO,2198,--load-extension=spatialite not working with Windows,363004,hcarter333,open,0,,,,,0,2023-10-08T12:50:22Z,2023-10-08T12:50:22Z,,NONE,,"Using each of `python -m datasette counties.db -m metadata.yml --load-extension=SpatiaLite` and `python -m datasette counties.db --load-extension=""C:\Windows\System32\mod_spatialite.dll""` and `python -m datasette counties.db --load-extension=C:\Windows\System32\mod_spatialite.dll` I got the error: ``` File ""C:\Users\m3n7es\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.11_qbz5n2kfra8p0\LocalCache\local-packages\Python311\site-packages\datasette\database.py"", line 209, in in_thread self.ds._prepare_connection(conn, self.name) File ""C:\Users\m3n7es\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.11_qbz5n2kfra8p0\LocalCache\local-packages\Python311\site-packages\datasette\app.py"", line 596, in _prepare_connection conn.execute(""SELECT load_extension(?, ?)"", [path, entrypoint]) sqlite3.OperationalError: The specified module could not be found. ``` I finally tried modifying the code in app.py to read: ``` def _prepare_connection(self, conn, database): conn.row_factory = sqlite3.Row conn.text_factory = lambda x: str(x, ""utf-8"", ""replace"") if self.sqlite_extensions: conn.enable_load_extension(True) for extension in self.sqlite_extensions: # ""extension"" is either a string path to the extension # or a 2-item tuple that specifies which entrypoint to load. #if isinstance(extension, tuple): # path, entrypoint = extension # conn.execute(""SELECT load_extension(?, ?)"", [path, entrypoint]) #else: conn.execute(""SELECT load_extension('C:\Windows\System32\mod_spatialite.dll')"") ``` At which point the counties example worked. Is there a correct way to install/use the extension on Windows? My method will cause issues if there's a second extension to be used. On an unrelated note, my next step is to figure out how to write a query across the two loaded databases supplied from the command line: `python -m datasette rm_toucans_23_10_07.db counties.db -m metadata.yml --load-extension=SpatiaLite` ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2198/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1940346034,I_kwDOBm6k_c5zp1Sy,2199,Detailed upgrade instructions for metadata.yaml -> datasette.yaml,9599,simonw,open,0,,,3268330,Datasette 1.0,7,2023-10-12T16:21:25Z,2023-10-12T22:08:42Z,,OWNER,,"> `Exception: Datasette no longer accepts plugin configuration in --metadata. Move your ""plugins"" configuration blocks to a separate file - we suggest calling that datasette..json - and start Datasette with datasette -c datasette..json. See https://docs.datasette.io/en/latest/configuration.html for more details.` > > I think we should link directly to documentation that tells people how to perform this upgrade. 
_Originally posted by @simonw in https://github.com/simonw/datasette/issues/2190#issuecomment-1759947021_ ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2199/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1751214236,I_kwDOC8SPRc5oYWic,36,Getting sqlite_master may not be modified when creating dogsheep index,8711912,khushmeeet,open,0,,,,,0,2023-06-11T03:21:53Z,2023-06-11T03:21:53Z,,NONE,,"When creating a `dogsheep` index from `config.yml` file on pocket.db (created using pocket-to-sqlite), I am getting this error ``` Traceback (most recent call last): File ""/Users/khushmeeet/.pyenv/versions/3.11.2/bin/dogsheep-beta"", line 8, in sys.exit(cli()) ^^^^^ File ""/Users/khushmeeet/.pyenv/versions/3.11.2/lib/python3.11/site-packages/click/core.py"", line 1130, in __call__ return self.main(*args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^ File ""/Users/khushmeeet/.pyenv/versions/3.11.2/lib/python3.11/site-packages/click/core.py"", line 1055, in main rv = self.invoke(ctx) ^^^^^^^^^^^^^^^^ File ""/Users/khushmeeet/.pyenv/versions/3.11.2/lib/python3.11/site-packages/click/core.py"", line 1657, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File ""/Users/khushmeeet/.pyenv/versions/3.11.2/lib/python3.11/site-packages/click/core.py"", line 1404, in invoke return ctx.invoke(self.callback, **ctx.params) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File ""/Users/khushmeeet/.pyenv/versions/3.11.2/lib/python3.11/site-packages/click/core.py"", line 760, in invoke return __callback(*args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^ File ""/Users/khushmeeet/.pyenv/versions/3.11.2/lib/python3.11/site-packages/dogsheep_beta/cli.py"", line 36, in index run_indexer( File ""/Users/khushmeeet/.pyenv/versions/3.11.2/lib/python3.11/site-packages/dogsheep_beta/utils.py"", line 32, in run_indexer ensure_table_and_indexes(db, tokenize) File ""/Users/khushmeeet/.pyenv/versions/3.11.2/lib/python3.11/site-packages/dogsheep_beta/utils.py"", line 91, in ensure_table_and_indexes table.add_foreign_key(*fk) File ""/Users/khushmeeet/.pyenv/versions/3.11.2/lib/python3.11/site-packages/sqlite_utils/db.py"", line 2155, in add_foreign_key self.db.add_foreign_keys([(self.name, column, other_table, other_column)]) File ""/Users/khushmeeet/.pyenv/versions/3.11.2/lib/python3.11/site-packages/sqlite_utils/db.py"", line 1116, in add_foreign_keys cursor.execute( sqlite3.OperationalError: table sqlite_master may not be modified ``` Command I ran to get this error ``` dogsheep-beta index pocket.db config.yml ``` Dogsheep version ``` dogsheep-beta, version 0.10.2 ``` Python version ``` Python 3.11.2 ```",197431109,dogsheep-beta,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-beta/issues/36/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1817281557,I_kwDOC8SPRc5sUYQV,37,cannot use jinja filters in display?,10352819,rprimet,closed,0,,,,,1,2023-07-23T20:09:54Z,2023-07-23T20:18:27Z,2023-07-23T20:18:26Z,NONE,,"Hi, I'm trying to have a display function in Dogsheep's `config.yml` that includes something like this: ```

{{ display.title }} (source)
{{ display.snippet|safe }}
``` Unfortunately, rendering fails with a message 'urls is undefined'. The same happens if I'm trying to build a row URL manually, using filters like `quote_plus` (as my keys are URLs). Any hints? Thanks!",197431109,dogsheep-beta,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-beta/issues/37/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1888477283,I_kwDOC8SPRc5wj-Bj,38,Run `rebuild_fts` after building the index,9599,simonw,open,0,,,,,0,2023-09-08T23:17:45Z,2023-09-08T23:17:45Z,,MEMBER,,"In: - https://github.com/simonw/datasette.io/issues/152#issuecomment-1712323347 This turned out to be the fix: ```bash dogsheep-beta index dogsheep-index.db templates/dogsheep-beta.yml sqlite-utils rebuild-fts dogsheep-index.db ```",197431109,dogsheep-beta,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-beta/issues/38/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1515883470,I_kwDOC8tyDs5aWovO,24,DOC: xml.etree.ElementTree.ParseError due to healthkit version 12 ,6231413,mmngreco,open,0,,,,,2,2023-01-01T23:00:38Z,2023-03-30T10:17:31Z,,NONE,,"Hi @simonw I hope you find this issue ok, the idea is provide some documentation to other users like me about how to solve this problem and save some time. Following the instructions from the `README.md` I've faced this error: ```bash (venv) mgreco@pop-os apple-health master* (23:44|0s) $ healthkit-to-sqlite apple_health_export/export.xml healthkit.db --xml Importing from HealthKit [------------------------------------] 0% Traceback (most recent call last): File ""/home/mgreco/github/mmngreco/apple-health/venv/bin/healthkit-to-sqlite"", line 33, in sys.exit(load_entry_point('healthkit-to-sqlite', 'console_scripts', 'healthkit-to-sqlite')()) File ""/home/mgreco/github/mmngreco/apple-health/venv/lib/python3.10/site-packages/click/core.py"", line 1130, in __call__ return self.main(*args, **kwargs) File ""/home/mgreco/github/mmngreco/apple-health/venv/lib/python3.10/site-packages/click/core.py"", line 1055, in main rv = self.invoke(ctx) File ""/home/mgreco/github/mmngreco/apple-health/venv/lib/python3.10/site-packages/click/core.py"", line 1404, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/home/mgreco/github/mmngreco/apple-health/venv/lib/python3.10/site-packages/click/core.py"", line 760, in invoke return __callback(*args, **kwargs) File ""/home/mgreco/github/mmngreco/apple-health/.deps/healthkit-to-sqlite/healthkit_to_sqlite/cli.py"", line 57, in cli convert_xml_to_sqlite(fp, db, progress_callback=bar.update, zipfile=zf) File ""/home/mgreco/github/mmngreco/apple-health/.deps/healthkit-to-sqlite/healthkit_to_sqlite/utils.py"", line 25, in convert_xml_to_sqlite for tag, el in find_all_tags( File ""/home/mgreco/github/mmngreco/apple-health/.deps/healthkit-to-sqlite/healthkit_to_sqlite/utils.py"", line 12, in find_all_tags for event, el in parser.read_events(): File ""/home/mgreco/github/mmngreco/apple-health/venv/lib/python3.10/xml/etree/ElementTree.py"", line 1324, in read_events raise event File ""/home/mgreco/github/mmngreco/apple-health/venv/lib/python3.10/xml/etree/ElementTree.py"", line 1296, in feed self._parser.feed(data) xml.etree.ElementTree.ParseError: syntax error: line 156, column 0 ``` So, after debugging and searching on internet I found this useful link: 
https://discussions.apple.com/thread/254202523 (etresoft, the real hero). Which basically says that the xml given by the health app (healthkit version 12) has some bugs but fortunately, they can be solved with a couple of commads: 1. Uncompress the zip and move the new folder where `export.xml` is. 1. Create a `patch.txt` with the following content ```diff --- export.xml 2022-09-18 15:17:09.000000000 -0400 +++ export-fixed.xml 2022-09-18 16:37:08.000000000 -0400 @@ -15,6 +15,7 @@ HKCharacteristicTypeIdentifierBiologicalSex CDATA #REQUIRED HKCharacteristicTypeIdentifierBloodType CDATA #REQUIRED HKCharacteristicTypeIdentifierFitzpatrickSkinType CDATA #REQUIRED + HKCharacteristicTypeIdentifierCardioFitnessMedicationsUse CDATA #IMPLIED > - + - + - device CDATA #IMPLIED - - -> ]> ``` 1. Apply the path with the command: `patch < patch.txt` 1. Fix endDates with the command `sed 's/startDate/endDate/2' export.xml > export-fixed.xml` 1. Try again `healthkit-to-sqlite export-fixed.xml healthkit.db --xml`",197882382,healthkit-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/24/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1041778507,I_kwDOCGYnMM4-GEdL,334,Filter by datetime objects using rows_where(),11642379,viseshrp,closed,0,,,,,0,2021-11-02T00:44:08Z,2021-11-13T19:23:21Z,2021-11-13T19:23:21Z,NONE,,"Firstly, thanks for this nice utility. It would be nice to have an example in the docs on how to filter by date range using `rows_where()`. This doesn't seem to work: ``` table.rows_where('datetime(created) between datetime(""2021-10-31T17:29:59.277428-04:00"") AND datetime(""2021-11-01T03:44:04.544651+00:00"")') ``` I could probably just use `db.query()`, which works for the above, but it would be nice if I could pass in `datetime` objects in `rows_where()`. 
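In the meantime, a workaround sketch that works today (hypothetical `mydb.db` and `mytable`): `rows_where()` accepts a separate arguments collection, so datetime values can be passed as ISO-format strings and bound as named parameters:

```python
import sqlite_utils

db = sqlite_utils.Database('mydb.db')  # hypothetical database file
rows = db['mytable'].rows_where(
    'datetime(created) between datetime(:start) and datetime(:end)',
    {
        'start': '2021-10-31T17:29:59-04:00',
        'end': '2021-11-01T03:44:04+00:00',
    },
)
for row in rows:
    print(row)
```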
Thanks.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/334/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1042569687,I_kwDOCGYnMM4-JFnX,335,sqlite-utils index-foreign-keys fails due to pre-existing index,596279,zaneselvans,closed,0,,,,,11,2021-11-02T16:22:11Z,2021-11-14T22:55:56Z,2021-11-14T22:55:56Z,NONE,,"While running the command: ```sh sqlite-utils index-foreign-keys $SQLITE_DIR/pudl.sqlite ``` I got the following error: ``` Traceback (most recent call last): File ""/home/zane/miniconda3/envs/pudl-dev/bin/sqlite-utils"", line 8, in sys.exit(cli()) File ""/home/zane/miniconda3/envs/pudl-dev/lib/python3.9/site-packages/click/core.py"", line 829, in __call__ return self.main(*args, **kwargs) File ""/home/zane/miniconda3/envs/pudl-dev/lib/python3.9/site-packages/click/core.py"", line 782, in main rv = self.invoke(ctx) File ""/home/zane/miniconda3/envs/pudl-dev/lib/python3.9/site-packages/click/core.py"", line 1259, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/home/zane/miniconda3/envs/pudl-dev/lib/python3.9/site-packages/click/core.py"", line 1066, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/home/zane/miniconda3/envs/pudl-dev/lib/python3.9/site-packages/click/core.py"", line 610, in invoke return callback(*args, **kwargs) File ""/home/zane/miniconda3/envs/pudl-dev/lib/python3.9/site-packages/sqlite_utils/cli.py"", line 454, in index_foreign_keys db.index_foreign_keys() File ""/home/zane/miniconda3/envs/pudl-dev/lib/python3.9/site-packages/sqlite_utils/db.py"", line 902, in index_foreign_keys table.create_index([fk.column]) File ""/home/zane/miniconda3/envs/pudl-dev/lib/python3.9/site-packages/sqlite_utils/db.py"", line 1563, in create_index self.db.execute(sql) File ""/home/zane/miniconda3/envs/pudl-dev/lib/python3.9/site-packages/sqlite_utils/db.py"", line 421, in execute return self.conn.execute(sql) sqlite3.OperationalError: index idx_generators_eia860_report_date already exists ``` This DB was created with the foreign key constraint `PRAGMA` enabled and a bunch of column-level `CHECK` constraints. Is this an expected behavior? Should one not try to index foreign keys if FK constraints are already being enforced within the DB? I'm also noticing that the size of the DB after FK indexes have been added went from 483MB to 835MB, which seems like a much bigger jump than when I've done this previously. Software versions... * sqlite-utils 3.17.1 * sqlite 3.36.0 * SQLAlchemy 1.4.26 (used to create the DB)",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/335/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1044267332,I_kwDOCGYnMM4-PkFE,336,"sqlite-util tranform --column-order mangles columns of type ""timestamp""",536941,fgregg,closed,0,,,,,1,2021-11-04T01:15:38Z,2023-05-08T21:13:38Z,2023-05-08T21:13:38Z,CONTRIBUTOR,,"Reproducible code below: ```bash > echo 'create table bar (baz text, created_at timestamp default CURRENT_TIMESTAMP)' | sqlite3 foo.db > sqlite3 foo.db SQLite version 3.36.0 2021-06-18 18:36:39 Enter "".help"" for usage hints. 
sqlite> .schema bar CREATE TABLE bar (baz text, created_at timestamp default CURRENT_TIMESTAMP); sqlite> .exit > sqlite-utils transform foo.db bar --column-order baz sqlite3 foo.db SQLite version 3.36.0 2021-06-18 18:36:39 Enter "".help"" for usage hints. sqlite> .schema bar CREATE TABLE IF NOT EXISTS ""bar"" ( [baz] TEXT, [created_at] FLOAT DEFAULT 'CURRENT_TIMESTAMP' ); sqlite> .exit > sqlite-utils transform foo.db bar --column-order baz > sqlite3 foo.db SQLite version 3.36.0 2021-06-18 18:36:39 Enter "".help"" for usage hints. sqlite> .schema bar CREATE TABLE IF NOT EXISTS ""bar"" ( [baz] TEXT, [created_at] FLOAT DEFAULT '''CURRENT_TIMESTAMP''' ); ``` ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/336/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1053087862,I_kwDOCGYnMM4-xNh2,338,"dict, list, tuple should all map to TEXT",9599,simonw,closed,0,,,,,0,2021-11-15T00:28:01Z,2021-11-15T00:36:03Z,2021-11-15T00:36:03Z,OWNER,,"> This relates to the fact that dictionaries, lists and tuples get special treatment and are converted to JSON strings, using this code: https://github.com/simonw/sqlite-utils/blob/e8d958109ee290cfa1b44ef7a39629bb50ab673e/sqlite_utils/db.py#L2937-L2947 > > So the `COLUMN_TYPE_MAPPING` should include those too - right now it looks like this: https://github.com/simonw/sqlite-utils/blob/e8d958109ee290cfa1b44ef7a39629bb50ab673e/sqlite_utils/db.py#L165-L188 _Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/322#issuecomment-968401459_",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/338/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1053122092,I_kwDOCGYnMM4-xV4s,339,`table.lookup()` option to populate additional columns when creating a record,9599,simonw,closed,0,,,,,4,2021-11-15T01:41:17Z,2021-11-15T02:02:34Z,2021-11-15T02:02:00Z,OWNER,,"> For the commits table I feel like I want a version of `table.lookup()` that can be passed additional columns to populate only if the record does not exist yet. 
_Originally posted by @simonw in https://github.com/simonw/git-history/issues/12#issuecomment-967455017_",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/339/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1053136495,I_kwDOCGYnMM4-xZZv,341,`hash_id: Optional[Any]` should be `hash_id: Optional[str]`,9599,simonw,closed,0,,,,,0,2021-11-15T02:12:39Z,2021-11-15T02:19:31Z,2021-11-15T02:19:31Z,OWNER,,"In a few places: https://github.com/simonw/sqlite-utils/blob/54a2269e91ce72b059618662ed133a85f3d42e4a/sqlite_utils/db.py#L642 https://github.com/simonw/sqlite-utils/blob/54a2269e91ce72b059618662ed133a85f3d42e4a/sqlite_utils/db.py#L751 https://github.com/simonw/sqlite-utils/blob/54a2269e91ce72b059618662ed133a85f3d42e4a/sqlite_utils/db.py#L1049 https://github.com/simonw/sqlite-utils/blob/54a2269e91ce72b059618662ed133a85f3d42e4a/sqlite_utils/db.py#L1230 But it's correct here: https://github.com/simonw/sqlite-utils/blob/54a2269e91ce72b059618662ed133a85f3d42e4a/sqlite_utils/db.py#L2470",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/341/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1004613267,I_kwDOCGYnMM474S6T,328,Invalid JSON output when no rows,12752,gravis,closed,0,,,,,3,2021-09-22T18:37:26Z,2021-09-22T20:21:34Z,2021-09-22T20:20:18Z,NONE,,"`sqlite-utils query` generates a JSON output with the result from the query: ```json [{...},{...}] ``` If no rows are returned by the query, I'm expecting an empty JSON array: ```json [] ``` But actually I'm getting an empty string. To be consistent, the output should be `[]` when the request succeeds (return code == `0`).",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/328/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1005891028,I_kwDOCGYnMM479K3U,329,Rethink approach to [ and ] in column names (currently throws error),9599,simonw,closed,0,,,,,12,2021-09-23T22:14:24Z,2021-11-15T02:57:51Z,2021-11-15T02:57:51Z,OWNER,,"> I think it's best to still keep `[` and `]` out of column names though. Transforming them into `(` and `)` seems reasonable - but should that happen here or in `sqlite-utils`? I think in `sqlite-utils`. _Originally posted by @simonw in https://github.com/simonw/datasette-app/issues/121#issuecomment-926200398_ This is a rethinking of the solution to: - https://github.com/simonw/sqlite-utils/issues/86",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/329/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1026794056,I_kwDOCGYnMM49M6JI,331,Mypy error: found module but no type hints or library stubs,53032010,andreaslongo,closed,0,,,,,2,2021-10-14T20:29:50Z,2021-11-14T23:21:08Z,2021-11-14T23:21:08Z,NONE,,"``` Python 3.9.5 mypy 0.910 sqlite-utils 3.17.1 ``` While using sqlite-utils as a library, when I use mypy for static type checking, it throws an error: ``` mypy . 
src/etl.py:5: error: Skipping analyzing ""sqlite_utils"": found module but no type hints or library stubs import sqlite_utils ^ src/etl.py:5: note: See https://mypy.readthedocs.io/en/stable/running_mypy.html#missing-imports test/test_etl.py:4: error: Skipping analyzing ""sqlite_utils"": found module but no type hints or library stubs import sqlite_utils ^ Found 2 errors in 2 files (checked 7 source files) ``` When I add a `py.typed` file to the sqlite-utils package to mark it as PEP 561 compatible, the error goes away. ``` al@nbal ..b/python3.9/site-packages/sqlite_utils (git)-[main] % la total 200 drwx------ 3 al al 4096 Oct 14 22:00 . drwx------ 117 al al 4096 Oct 12 21:12 .. -rw------- 1 al al 64409 Oct 12 21:11 cli.py -rw------- 1 al al 109092 Oct 12 21:11 db.py -rw------- 1 al al 0 Oct 14 22:00 py.typed -rw------- 1 al al 684 Oct 12 21:11 recipes.py -rw------- 1 al al 7988 Oct 12 21:11 utils.py -rw------- 1 al al 113 Oct 12 21:11 __init__.py ``` I would like to suggest adding a `py.typed` file to the repository. See also the mypy docs on creating PEP 561 compatible packages: https://mypy.readthedocs.io/en/stable/installed_packages.html#creating-pep-561-compatible-packages ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/331/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1028056713,I_kwDOCGYnMM49RuaJ,332,`sqlite-utils memory --flatten` option to flatten nested JSON,22523840,rdtq,closed,0,,,,,1,2021-10-16T14:04:42Z,2021-11-14T23:05:05Z,2021-11-14T23:05:05Z,NONE,,"currently --flatten option works only for `insert` command, it would be cool if it worked for `memory` as well to query nested json",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/332/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1071531082,I_kwDOCGYnMM4_3kRK,349,A way of creating indexes on newly created tables,9599,simonw,open,0,,,,,3,2021-12-05T18:56:12Z,2021-12-07T01:04:37Z,,OWNER,,"I'm writing code for https://github.com/simonw/git-history/issues/33 that creates a table inside a loop: ```python item_pk = db[item_table].lookup( {""_item_id"": item_id}, item_to_insert, column_order=(""_id"", ""_item_id""), pk=""_id"", ) ``` I need to look things up by `_item_id` on this table, which means I need an index on that column (the table can get very big). But there's no mechanism in SQLite utils to detect if the table was created for the first time and add an index to it. And I don't want to run `CREATE INDEX IF NOT EXISTS` every time through the loop. This should work like the `foreign_keys=` mechanism. 
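A workaround sketch in the meantime - track first-time table creation in the calling code, so `CREATE INDEX IF NOT EXISTS` only runs once per table (hypothetical `history.db` and stand-in loop data):

```python
import sqlite_utils

db = sqlite_utils.Database('history.db')  # hypothetical database
indexed_tables = set()  # tables indexed during this run

for item_table, item_id in [('items', 'a'), ('items', 'b')]:  # stand-in loop
    db[item_table].lookup({'_item_id': item_id}, pk='_id')
    if item_table not in indexed_tables:
        db[item_table].create_index(['_item_id'], if_not_exists=True)
        indexed_tables.add(item_table)
```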
",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/349/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1072435124,I_kwDOCGYnMM4_7A-0,350,Optional caching mechanism for table.lookup(),9599,simonw,open,0,,,,,3,2021-12-06T17:54:25Z,2021-12-06T17:56:57Z,,OWNER,,"Inspired by work on `git-history` where I used this pattern: ```python column_name_to_id = {} def column_id(column): if column not in column_name_to_id: id = db[""columns""].lookup( {""namespace"": namespace_id, ""name"": column}, foreign_keys=((""namespace"", ""namespaces"", ""id""),), ) column_name_to_id[column] = id return column_name_to_id[column] ``` If you're going to be doing a large number of `table.lookup(...)` calls and you know that no other script will be modifying the database at the same time you can presumably get a big speedup using a Python in-memory cache - maybe even a LRU one to avoid memory bloat.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/350/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1072780607,I_kwDOCGYnMM4_8VU_,351,Support `--import xml.etree.ElementTree` in `sqlite-utils convert`,9599,simonw,closed,0,,,,,1,2021-12-07T00:40:29Z,2021-12-11T00:11:25Z,2021-12-11T00:11:25Z,OWNER,,"It's not possible to use a module that requires a nested import, such as `xml.etree.ElementTree`, at the moment. I found and fixed this bug in `git-history`, I should replicate that fix (and accompanying documentation) here: https://github.com/simonw/git-history/issues/39",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/351/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1072792507,I_kwDOCGYnMM4_8YO7,352,`sqlite-utils insert --extract colname`,9599,simonw,open,0,,,,,4,2021-12-07T00:55:44Z,2022-02-03T22:59:36Z,,OWNER,,"Is there a reason I've not added `--extract` as an option for `sqlite-utils insert` next? 
There's an `extracts=` option for the various `table.insert()` etc methods - last line in this code block: https://github.com/simonw/sqlite-utils/blob/213a0ff177f23a35f3b235386366ff132eb879f1/sqlite_utils/db.py#L2483-L2495",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/352/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1058196641,I_kwDOCGYnMM4_Esyh,342,Extra options to `lookup()` which get passed to `insert()`,9599,simonw,closed,0,,,,,7,2021-11-19T06:53:03Z,2021-11-19T07:26:54Z,2021-11-19T07:26:54Z,OWNER,,"For https://github.com/simonw/git-history/issues/12 I found myself wanting to pass extra options to `lookup()` to set the column order, primary key etc.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/342/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1063388037,I_kwDOCGYnMM4_YgOF,343,Provide function to generate hash_id from specified columns,82988,psychemedia,closed,0,,,,,4,2021-11-25T10:12:12Z,2022-03-02T04:25:25Z,2022-03-02T04:25:25Z,NONE,,"Hi, I note that you define `_hash()` to create a `hash_id` from non-id column values in a table [here](https://github.com/simonw/sqlite-utils/blob/8f386a0d300d1b1c76132bb75972b755049fb742/sqlite_utils/db.py#L2996). It would be useful to be able to call a complementary function to generate a corresponding `_id` from a subset of specified columns when adding items to another table, eg to support the creation of foreign keys. Or is there a better pattern for doing that?",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/343/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1066474200,I_kwDOCGYnMM4_kRrY,344,Support STRICT tables,9599,simonw,closed,0,,,,,14,2021-11-29T20:32:23Z,2023-12-08T05:22:39Z,2023-12-08T05:22:39Z,OWNER,,"New in SQLite 3.37.0, released a few days ago: https://www.sqlite.org/stricttables.html",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/344/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1066501534,I_kwDOCGYnMM4_kYWe,345,`table.strict` introspection boolean for identifying STRICT mode tables,9599,simonw,closed,0,,,,,2,2021-11-29T21:05:10Z,2021-11-29T22:45:26Z,2021-11-29T22:44:36Z,OWNER,,"> From the STRICT docs: >> The SQLite parser accepts a comma-separated list of table options after the final close parenthesis in a CREATE TABLE statement. As of this writing (2021-08-23) only two options are recognized: >> >> - STRICT >> - [WITHOUT ROWID](https://www.sqlite.org/withoutrowid.html) > > So I think I need to read the `CREATE TABLE` statement from the `sqlite_master` table, split on the last `)`, split those tokens on `,` and see if `strict` is in there (case insensitive).
_Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/344#issuecomment-982020757_",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/345/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1066563554,I_kwDOCGYnMM4_knfi,346,Way to test SQLite 3.37 (and potentially other versions) in CI,9599,simonw,open,0,,,,,5,2021-11-29T22:21:06Z,2021-11-29T23:12:49Z,,OWNER,,"> Need to figure out a good pattern for testing this in CI too - it will currently skip the new tests if it doesn't have SQLite 3.37 or higher. _Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/344#issuecomment-982076924_",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/346/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1067771698,I_kwDOCGYnMM4_pOcy,348,Command for creating an empty database,9599,simonw,closed,0,,,7558727,3.21,6,2021-11-30T23:24:27Z,2022-01-13T07:06:59Z,2022-01-09T20:33:20Z,OWNER,,"I sometimes find the need to create an empty SQLite database file - for example if I want to enable WAL on it before using it with another script. I currently do that like this: sqlite3 my.db vacuum sqlite-utils enable-wal my.db It would be nice if `sqlite-utils` had a convenience command for doing this.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/348/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1976986318,I_kwDOCGYnMM511mrO,599,Cannot find spatialite on arm64 linux,37802088,MikeCoats,closed,0,,,,,1,2023-11-03T22:05:51Z,2023-11-04T01:06:31Z,2023-11-04T00:33:28Z,CONTRIBUTOR,,"Initially, I found an issue in `datasette` where it wouldn’t find `spatialite` when running on my Radxa Rock 5B - an RK3588 powered SBC, running the arm64 build of Debian Bullseye. I confirmed the same behaviour on my Raspberry Pi 4 - a BCM2711 powered SBC, running the arm64 build of Debian Bookworm. ``` $ datasette --load-extension=spatialite example.db Error: Could not find SpatiaLite extension ``` I did some digging and realised the issue originates in this project. Even with the `libsqlite3-mod-spatialite` package installed, `pytest` skips all of the GIS tests in the project. ``` $ apt list --installed | grep spatial […] libsqlite3-mod-spatialite/stable,now 5.0.1-3 arm64 [installed] $ ls -l /usr/lib/*/*spatial* lrwxrwxrwx 1 root root 23 Dec 1 2022 /usr/lib/aarch64-linux-gnu/mod_spatialite.so -> mod_spatialite.so.7.1.0 lrwxrwxrwx 1 root root 23 Dec 1 2022 /usr/lib/aarch64-linux-gnu/mod_spatialite.so.7 -> mod_spatialite.so.7.1.0 -rw-r--r-- 1 root root 7348584 Dec 1 2022 /usr/lib/aarch64-linux-gnu/mod_spatialite.so.7.1.0 ``` ``` $ pytest tests/test_get.py ...... [ 73%] tests/test_gis.py ssssssssssss [ 75%] tests/test_hypothesis.py .... [ 75%] ``` I tracked the issue down to the [`find_sqlite()` function in the `utils.py`](https://github.com/simonw/sqlite-utils/blob/622c3a5a7dd53a09c029e2af40c2643fe7579340/sqlite_utils/utils.py#L60) file. The [`SPATIALITE_PATHS`](https://github.com/simonw/sqlite-utils/blob/main/sqlite_utils/utils.py#L34-L39) array doesn’t have an entry for the location of this module on arm64 linux. 
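For illustration, the fix is presumably one more entry in that tuple - something like this sketch (the existing entries here are assumptions based on the linked `utils.py`, not an exact copy):

```python
# sqlite_utils/utils.py (sketch - existing entries abbreviated)
SPATIALITE_PATHS = (
    ""/usr/lib/x86_64-linux-gnu/mod_spatialite.so"",
    ""/usr/local/lib/mod_spatialite.dylib"",
    # Debian arm64 (aarch64) location, as shown by the ls output above:
    ""/usr/lib/aarch64-linux-gnu/mod_spatialite.so"",
)
```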
",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/599/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1977155641,I_kwDOCGYnMM512QA5,601,Move plugin directory into documentation,9599,simonw,open,0,,,,,0,2023-11-04T04:07:52Z,2023-11-04T04:07:52Z,,OWNER,,"https://github.com/simonw/sqlite-utils-plugins should be in the official documentation. I can use the same pattern as https://llm.datasette.io/en/stable/plugins/directory.html https://til.simonwillison.net/readthedocs/stable-docs",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/601/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1978603203,I_kwDOCGYnMM517xbD,602,`sqlite-utils transform` removes the `AUTOINCREMENT` keyword,4472046,ArsTapatun,open,0,,,,,0,2023-11-06T08:48:43Z,2023-11-06T08:48:43Z,,NONE,,"### Context We ran into this bug randomly, noticing that deleted `ROWID` would get reused after migrating the DB. Using `transform` to change any column in the table will also unexpectedly strip away the `AUTOINCREMENT` keyword from the primary key definition, even if it was not the transformation target. ### Reproducible example **Original database** ```sql $ sqlite3 test.db << EOF CREATE TABLE mytable ( col1 INTEGER PRIMARY KEY AUTOINCREMENT, col2 TEXT NOT NULL ) EOF $ sqlite3 test.db "".schema mytable"" CREATE TABLE mytable ( col1 INTEGER PRIMARY KEY AUTOINCREMENT, col2 TEXT NOT NULL ); ``` **Modified database after sqlite-utils** ```sql $ sqlite-utils transform test.db mytable --rename col2 renamedcol2 $ sqlite3 test.db ""SELECT sql FROM sqlite_master WHERE name = 'mytable';"" CREATE TABLE IF NOT EXISTS ""mytable"" ( [col1] INTEGER PRIMARY KEY, [renamedcol2] TEXT NOT NULL ); ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/602/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1988525411,I_kwDOCGYnMM52hn1j,603,Pyhton 3.12 Bug report,1324252,constantinedev,open,0,,,,,1,2023-11-10T22:57:48Z,2023-12-08T05:10:31Z,,NONE,,"I start with new python3 verison 3.12.0 Also have the error where connect DataBase ``` Traceback (most recent call last): File ""/home/t/Development/python/FKPJ/ClinicSYS/run.py"", line 1, in import re, os, io, json, sqlite_utils, requests, pytz, logging File ""/home/t/.local/lib/python3.12/site-packages/sqlite_utils/__init__.py"", line 1, in from .db import Database File ""/home/t/.local/lib/python3.12/site-packages/sqlite_utils/db.py"", line 277, in class Database: File ""/home/t/.local/lib/python3.12/site-packages/sqlite_utils/db.py"", line 306, in Database filename_or_conn: Optional[Union[str, pathlib.Path, sqlite3.Connection]] = None, ^^^^^^^^^^^^^^^^^^ ``` This bug come from `sqlite-utils` since's v3.33. Anyone get the same ? As well now of the resolved plan just keep the sqlite-utils version in python3.12 with v3.32.1 [tested] but where are the sqlite3.Connection problem.... This won't happen on python version down to 3.11[tested] Just the python3.12.0, I have test this error are come from the sqlite3 connection The error say from `sqlite_utils` and with the sqlite3 Connection, what can I do. 
Let fix together.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/603/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 2007893839,I_kwDOCGYnMM53rgdP,605,Insert fails with `Error: Python int too large to convert to SQLite INTEGER`; can we use `NUMERIC` here?,12229877,Zac-HD,closed,0,,,,,1,2023-11-23T10:19:46Z,2023-12-08T05:07:54Z,2023-12-08T05:07:54Z,NONE,,"I'm currently working on a new feature for Hypothesis, where we can dump a tidy jsonlines table of all the test cases we tried - including arguments, outcomes, timings, coverage, etc. Exploring this seems like a perfect cases for `sqlite-utils` and `datasette`, but I pretty quickly ran into an integer overflow problem and don't want to recommend that experience to my users. I originally went to report this as a bug... and then found https://github.com/simonw/sqlite-utils/issues/309#issuecomment-895581038 almost exactly matched my repro 😅 https://github.com/simonw/sqlite-utils/issues/110#issuecomment-626391063 suggests that using `NUMERIC` would avoid this overflow error, although ""If the TEXT value is a well-formed integer literal that is too large to fit in a 64-bit signed integer, it is converted to REAL."" suggests that this would come at the cost of rounding to the nearest float value. Maybe I should just convert large integers to float before writing out my json? After a bit more hacking, ""manually cast large integers to float"" seems like a decent solution for my particular case, but having written it up I thought I might as well post this issue anyway - I hope it's useful feedback, and won't mind at all if you close as wontfix if it's not.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/605/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 2029161033,I_kwDOCGYnMM548opJ,606,str and int as aliases for text and integer,9599,simonw,closed,0,,,,,2,2023-12-06T18:35:49Z,2023-12-06T19:44:04Z,2023-12-06T18:49:32Z,OWNER,,"I keep making this mistake: ```bash sqlite-utils add-column content.db assets _since int ``` ``` Usage: sqlite-utils add-column [OPTIONS] PATH TABLE COL_NAME [[integer|float|b lob|text|INTEGER|FLOAT|BLOB|TEXT]] Try 'sqlite-utils add-column -h' for help. Error: Invalid value for '[[integer|float|blob|text|INTEGER|FLOAT|BLOB|TEXT]]': 'int' is not one of 'integer', 'float', 'blob', 'text', 'INTEGER', 'FLOAT', 'BLOB', 'TEXT'. ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/606/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1077102934,I_kwDOCGYnMM5AM0lW,353,"Allow passing a file of code to ""sqlite-utils convert""",536941,fgregg,closed,0,,,,,8,2021-12-10T18:06:14Z,2021-12-11T01:38:29Z,2021-12-11T01:09:39Z,CONTRIBUTOR,,"sqlite-utils is so nice, but the ergonomics of the multiline code in kind of tough. It's really hard (maybe impossible) to make the newlines play well with Makefiles. 
It would be great to write your code fragment in a separate file and direct it into sqlite-utils, either like ```sqlite-utils convert my.db my_table my_column < custom_code.py``` or ```sqlite-utils convert my.db my_table my_column --custom-code=custom_code.py``` Thanks, as ever, for these great tools!",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/353/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1077243232,I_kwDOCGYnMM5ANW1g,354,Test failure in test_rebuild_fts,9599,simonw,closed,0,,,,,7,2021-12-10T21:27:55Z,2021-12-11T01:08:46Z,2021-12-11T01:08:46Z,OWNER,,"Not sure why this has only just started failing, but I'm getting this: https://github.com/simonw/sqlite-utils/runs/4488687639 ``` E sqlite3.DatabaseError: database disk image is malformed sqlite_utils/db.py:425: DatabaseError _______________________ test_rebuild_fts[searchable_fts] _______________________ fresh_db = > table_to_fix = 'searchable_fts' @pytest.mark.parametrize(""table_to_fix"", [""searchable"", ""searchable_fts""]) def test_rebuild_fts(fresh_db, table_to_fix): table = fresh_db[""searchable""] table.insert(search_records[0]) table.enable_fts([""text"", ""country""]) # Run a search rows = list(table.search(""tanuki"")) assert len(rows) == 1 assert { ""rowid"": 1, ""text"": ""tanuki are running tricksters"", ""country"": ""Japan"", ""not_searchable"": ""foo"", }.items() <= rows[0].items() # Delete from searchable_fts_data fresh_db[""searchable_fts_data""].delete_where() # This should have broken the index with pytest.raises(sqlite3.DatabaseError): list(table.search(""tanuki"")) # Running rebuild_fts() should fix it > fresh_db[table_to_fix].rebuild_fts() ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/354/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1077322009,I_kwDOCGYnMM5ANqEZ,355,Allow users to pass a full convert() function definition,9599,simonw,closed,0,,,,,4,2021-12-10T23:59:58Z,2021-12-11T00:51:15Z,2021-12-11T00:49:31Z,OWNER,,"> I think the fix for this is to change the rules about what code is accepted in both the `-` mode and the literal code string mode: you can pass in a Python expression, OR a fragment that gets turned into a function, OR code that implements its own `def convert(value)` function. So this would work too: > ```sh > sqlite-utils convert my.db mytable col1 ' > def convert(value): > return value.upper() > ' > ``` _Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/353#issuecomment-991381679_",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/355/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1077431957,I_kwDOCGYnMM5AOE6V,356,`sqlite-utils insert --convert` option,9599,simonw,closed,0,,,,,11,2021-12-11T07:24:48Z,2022-01-06T06:30:13Z,2022-01-06T06:28:53Z,OWNER,,"Idea came to me while re-reading this: https://simonwillison.net/2021/Aug/6/sqlite-utils-convert/ This is a bit of a hack: ``` cat /tmp/log.txt | \ jq --raw-input '{line: .}' --compact-output | \ sqlite-utils insert /tmp/logs.db log - --nl ``` Would be great if you could pipe lines to `insert` and transform them on the way in.
A `--convert python-code` option, modeled after `sqlite-utils convert`, could do this.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/356/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1079422215,I_kwDOCGYnMM5AVq0H,357,pytest-runner is not required,4067843,pgajdos,closed,0,,,,,1,2021-12-14T07:51:24Z,2021-12-16T20:43:19Z,2021-12-16T20:43:13Z,NONE,,Deprecated pytest-runner is not necessary for running the testsuite.,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/357/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1082651698,I_kwDOCGYnMM5Ah_Qy,358,Support for CHECK constraints,11597658,luxint,open,0,,,,,7,2021-12-16T21:19:45Z,2022-09-25T07:15:59Z,,NONE,,"Hi, I noticed the `transform.table()` method doesn't have an option to add/change or drop a check constraint (see https://sqlite.org/lang_createtable.html -> 3.7 Check Constraints). Would be great to have this as an option! ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/358/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1090798237,I_kwDOCGYnMM5BBEKd,359,Use RETURNING if available to populate last_pk,9599,simonw,open,0,,,,,0,2021-12-29T23:43:23Z,2021-12-29T23:43:23Z,,OWNER,,"Inspired by this: https://news.ycombinator.com/item?id=29729283 > Because SQLite is effectively serializing all the writes for us, we have zero locking in our code.
We used to have to lock when inserting new items (to get the LastInsertRowId), but the newer version of SQLite supports the RETURNING keyword, so we don't even have to lock on inserts now.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/359/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1091819089,I_kwDOCGYnMM5BE9ZR,360,MemoryError,559453,nzaar9,closed,0,,,,,1,2022-01-01T13:39:17Z,2022-03-21T04:22:46Z,2022-03-21T04:22:46Z,NONE,,"Hi, when dealing with a large JSON file (~170GB) I got the following error: ``` Traceback (most recent call last): File ""/usr/local/bin/sqlite-utils"", line 8, in sys.exit(cli()) File ""/usr/lib/python3/dist-packages/click/core.py"", line 1126, in __call__ return self.main(*args, **kwargs) File ""/usr/lib/python3/dist-packages/click/core.py"", line 1051, in main rv = self.invoke(ctx) File ""/usr/lib/python3/dist-packages/click/core.py"", line 1657, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/usr/lib/python3/dist-packages/click/core.py"", line 1393, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/usr/lib/python3/dist-packages/click/core.py"", line 752, in invoke return __callback(*args, **kwargs) File ""/usr/local/lib/python3.9/dist-packages/sqlite_utils/cli.py"", line 1300, in memory rows, format_used = rows_from_file(csv_fp, format=format, encoding=encoding) File ""/usr/local/lib/python3.9/dist-packages/sqlite_utils/utils.py"", line 185, in rows_from_file return rows_from_file(buffered, format=Format.JSON) File ""/usr/local/lib/python3.9/dist-packages/sqlite_utils/utils.py"", line 156, in rows_from_file decoded = json.load(fp) File ""/usr/lib/python3.9/json/__init__.py"", line 293, in load return loads(fp.read(), MemoryError ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/360/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1094974713,I_kwDOCGYnMM5BQ_z5,362,upsert --detect-types is broken,9599,simonw,closed,0,,,,,0,2022-01-06T05:12:10Z,2022-01-06T06:54:45Z,2022-01-06T06:28:34Z,OWNER,,"Noticed this thanks to syntax highlighting in VS Code showing an unused variable - need to fix it and add a test.
",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/362/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1094981339,I_kwDOCGYnMM5BRBbb,363,Better error message if `--convert` code fails to return a dict,9599,simonw,closed,0,,,,,4,2022-01-06T05:26:28Z,2022-02-03T22:52:30Z,2022-02-03T22:51:30Z,OWNER,,"Here's the traceback if your `--convert` function doesn't return a dict right now: ``` % sqlite-utils insert /tmp/all.db blah /tmp/log.log --convert 'all.upper()' --all Traceback (most recent call last): File ""/Users/simon/.local/share/virtualenvs/sqlite-utils-C4Ilevlm/bin/sqlite-utils"", line 33, in sys.exit(load_entry_point('sqlite-utils', 'console_scripts', 'sqlite-utils')()) File ""/Users/simon/.local/share/virtualenvs/sqlite-utils-C4Ilevlm/lib/python3.8/site-packages/click/core.py"", line 1137, in __call__ return self.main(*args, **kwargs) File ""/Users/simon/.local/share/virtualenvs/sqlite-utils-C4Ilevlm/lib/python3.8/site-packages/click/core.py"", line 1062, in main rv = self.invoke(ctx) File ""/Users/simon/.local/share/virtualenvs/sqlite-utils-C4Ilevlm/lib/python3.8/site-packages/click/core.py"", line 1668, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/Users/simon/.local/share/virtualenvs/sqlite-utils-C4Ilevlm/lib/python3.8/site-packages/click/core.py"", line 1404, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/Users/simon/.local/share/virtualenvs/sqlite-utils-C4Ilevlm/lib/python3.8/site-packages/click/core.py"", line 763, in invoke return __callback(*args, **kwargs) File ""/Users/simon/Dropbox/Development/sqlite-utils/sqlite_utils/cli.py"", line 949, in insert insert_upsert_implementation( File ""/Users/simon/Dropbox/Development/sqlite-utils/sqlite_utils/cli.py"", line 834, in insert_upsert_implementation db[table].insert_all( File ""/Users/simon/Dropbox/Development/sqlite-utils/sqlite_utils/db.py"", line 2602, in insert_all first_record = next(records) File ""/Users/simon/Dropbox/Development/sqlite-utils/sqlite_utils/db.py"", line 3044, in fix_square_braces for record in records: File ""/Users/simon/Dropbox/Development/sqlite-utils/sqlite_utils/cli.py"", line 831, in docs = (decode_base64_values(doc) for doc in docs) File ""/Users/simon/Dropbox/Development/sqlite-utils/sqlite_utils/utils.py"", line 86, in decode_base64_values to_fix = [ File ""/Users/simon/Dropbox/Development/sqlite-utils/sqlite_utils/utils.py"", line 89, in if isinstance(doc[k], dict) TypeError: string indices must be integers ``` It would be nicer if that returned a more useful error message. 
_Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/361#issuecomment-1006295276_",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/363/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1095570074,I_kwDOCGYnMM5BTRKa,364,`--batch-size 1` doesn't seem to commit for every item,9599,simonw,closed,0,,,7558727,3.21,16,2022-01-06T18:18:50Z,2022-01-10T19:27:17Z,2022-01-10T05:36:19Z,OWNER,,"I'm trying this, but it doesn't seem to write anything to the database file until I hit `CTRL+C`: ``` heroku logs --app=simonwillisonblog --tail | grep 'measure#nginx.service' | \ sqlite-utils insert /tmp/herokutail.db log - --import re --convert ""$(cat < Relevant documentation: https://www.sqlite.org/lang_analyze.html _Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/365#issuecomment-1007633376_",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/366/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1097087280,I_kwDOCGYnMM5BZDkw,368,Offer `python -m sqlite_utils` as an alternative to `sqlite-utils`,9599,simonw,closed,0,,,7558727,3.21,3,2022-01-09T02:29:30Z,2022-01-10T19:27:20Z,2022-01-09T02:40:50Z,OWNER,,"> Add this to `sqlite_utils/cli.py`: > > ```python > if __name__ == ""__main__"": > cli() > ``` > Now the tool can be run using `python -m sqlite_utils.cli --help` _Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/364#issuecomment-1008214998_",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/368/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1097091527,I_kwDOCGYnMM5BZEnH,369,Research how much of a difference analyze / sqlite_stat1 makes,9599,simonw,closed,0,,,,,11,2022-01-09T03:03:36Z,2022-02-03T21:07:41Z,2022-02-03T21:07:35Z,OWNER,,"> Is there a downside to having a `sqlite_stat1` table if it has wildly incorrect statistics in it? _Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/365#issuecomment-1008163050_ More generally: how much of a difference does the `sqlite_stat1` table created by `ANALYZE` make to queries? I'm particularly interested in `group by` / `count *` queries since Datasette uses those for faceting.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/369/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1097129710,I_kwDOCGYnMM5BZN7u,372,Idea: `suffix` and `stem` file columns,9599,simonw,closed,0,,,7558727,3.21,1,2022-01-09T07:48:53Z,2022-01-10T19:27:34Z,2022-01-09T20:17:00Z,OWNER,,"For https://sqlite-utils.datasette.io/en/stable/cli.html#inserting-data-from-files Given a file called `dogs.jpg` stem would be `dogs` and ext would be `jpg`. 
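In standard library terms these would presumably behave like `pathlib` (illustrative):

```python
from pathlib import Path

p = Path(""dogs.jpg"")
print(p.stem)    # dogs
print(p.suffix)  # .jpg - pathlib keeps the leading dot; a suffix column would likely strip it
```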
Need to decide what happens for `dogs.and.cats.jpg.gz`.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/372/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1097128334,I_kwDOCGYnMM5BZNmO,371,Support mutating row in `--convert` without returning it,9599,simonw,closed,0,,,7558727,3.21,6,2022-01-09T07:38:44Z,2022-01-10T19:27:30Z,2022-01-09T20:06:15Z,OWNER,,"Currently you have to do this: ``` $ sqlite-utils insert dogs.db dogs dogs.json --convert ' row[""is_good""] = 1 return row' ``` Would be neat if this worked too: ``` $ sqlite-utils insert dogs.db dogs dogs.json \ --convert 'row[""is_good""] = 1' ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/371/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1097135732,I_kwDOCGYnMM5BZPZ0,373,List `--fmt` options in the docs ,9599,simonw,closed,0,,,7558727,3.21,3,2022-01-09T08:22:11Z,2022-01-10T19:27:24Z,2022-01-09T17:49:00Z,OWNER,,https://sqlite-utils.datasette.io/en/stable/cli.html#table-formatted-output currently cheats and tells the user to run `--help` - can fix this using `cog`. ,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/373/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1097135860,I_kwDOCGYnMM5BZPb0,374,`--fmt` should imply `-t`,9599,simonw,closed,0,,,7558727,3.21,4,2022-01-09T08:23:07Z,2022-01-10T19:27:26Z,2022-01-09T18:07:59Z,OWNER,,Not sure why I didn't implement this.,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/374/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1097251014,I_kwDOCGYnMM5BZrjG,375,`sqlite-utils bulk` command,9599,simonw,closed,0,,,7558727,3.21,3,2022-01-09T17:12:38Z,2022-01-11T02:12:58Z,2022-01-11T02:10:55Z,OWNER,,"The `.executemany()` method is a very efficient way to execute the same SQL query against a huge list of parameters. `sqlite-utils insert` supports a bunch of ways of loading a list of dictionaries - from CSV, TSV, JSON, newline JSON and more thanks to: - #361 What if you could load a list of dictionaries and provide a SQL query with `:named` parameters that correspond to keys in those dictionaries instead? 
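A rough illustration of the underlying mechanism, using the standard library directly - `executemany()` can bind `:named` parameters from each dictionary in a list (table and data here are hypothetical):

```python
import sqlite3

conn = sqlite3.connect(""chickens.db"")
conn.execute(""CREATE TABLE IF NOT EXISTS chickens (id INTEGER PRIMARY KEY, name TEXT)"")
rows = [{""id"": 1, ""name"": ""Azi""}, {""id"": 2, ""name"": ""Lila""}]
# Each dictionary supplies values for the :id and :name placeholders
conn.executemany(""INSERT INTO chickens (id, name) VALUES (:id, :name)"", rows)
conn.commit()
```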
This would need to be a new command - I thought about adding a `--sql` option to `insert` but that doesn't make sense as that command already requires a table name.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/375/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1097436959,I_kwDOCGYnMM5BaY8f,376,`--nl` mode should ignore blank lines,9599,simonw,closed,0,,,7558727,3.21,0,2022-01-10T04:10:54Z,2022-01-10T19:27:41Z,2022-01-10T04:12:46Z,OWNER,,Spotted this while manually testing #364 - there's no reason `--nl` should crash if you feed it an empty line in between JSON objects.,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/376/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1098309897,I_kwDOCGYnMM5BduEJ,378,analyze=True parameter for some methods,9599,simonw,closed,0,,,7558727,3.21,0,2022-01-10T19:54:52Z,2022-01-11T01:08:11Z,2022-01-11T01:08:09Z,OWNER,,"This would cause `ANALYZE` to be run against the relevant table at the end of executing the method. > Having browsed the API reference I think the methods that would benefit from an `analyze=True` parameter are: - [x] `table.create_index` - [x] `table.insert_all` - [x] `table.upsert_all` - [x] `table.delete_where` _Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/366#issuecomment-1009288898_",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/378/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1098544628,I_kwDOCGYnMM5BenX0,379,CLI options for running ANALYZE,9599,simonw,closed,0,,,7558727,3.21,0,2022-01-11T01:09:16Z,2022-01-11T01:38:01Z,2022-01-11T01:36:48Z,OWNER,,"> The Python methods are all done now, next step is the CLI options. I'll do those in a separate issue. 
_Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/366#issuecomment-1009508865_ - [x] `sqlite-utils analyze` command - [x] `sqlite-utils create-index --analyze` option (see #365) - [x] `sqlite-utils insert --analyze` option - [x] `sqlite-utils upsert --analyze` option In #378 I also added `.delete_where(..., analyze=True)` but there isn't currently a `sqlite-utils delete-where` CLI command - deletions via CLI are expected to be handled using SQL queries.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/379/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1098574572,I_kwDOCGYnMM5Beurs,380,Release notes for 3.21,9599,simonw,closed,0,,,7558727,3.21,1,2022-01-11T02:12:30Z,2022-01-11T02:34:26Z,2022-01-11T02:34:26Z,OWNER,,For these commits: https://github.com/simonw/sqlite-utils/compare/3.20...129141572f249ea290e2a075437e2ebaad215859,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/380/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1099584685,I_kwDOCGYnMM5BilSt,381,`sqlite-utils rows` options `--limit` and `--offset`,9599,simonw,closed,0,,,,,2,2022-01-11T20:23:12Z,2022-01-11T23:33:37Z,2022-01-11T23:19:36Z,OWNER,,Because I often want to use it just to preview a few rows from the database. Piping through `| head -n 20` works for JSON and CSV (they stream) but not for `--table`.,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/381/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1099585611,I_kwDOCGYnMM5BilhL,382,`--where` option for `sqlite-rows`,9599,simonw,closed,0,,,,,1,2022-01-11T20:24:23Z,2022-01-11T23:33:14Z,2022-01-11T23:32:47Z,OWNER,,CLI equivalent of `table.rows_where()` - should accept parameters too. Work on this at the same time as #381.,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/382/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1099586786,I_kwDOCGYnMM5Bilzi,383,Add documentation page with the output of `--help`,9599,simonw,closed,0,,,,,4,2022-01-11T20:25:58Z,2022-01-11T22:55:05Z,2022-01-11T21:44:05Z,OWNER,,"Can be maintained using `cog` from #373. 
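A sketch of the `cog` pattern that could generate the page (the marker comments and exact output formatting here are assumptions - the real docs may wrap this differently):

```python
# [[[cog
import cog
from click.testing import CliRunner
from sqlite_utils import cli

result = CliRunner().invoke(cli.cli, [""--help""])
cog.out(""\n{}\n"".format(result.output))
# ]]]
# [[[end]]]
```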
Similar in purpose to the API reference page, but this is for the CLI.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/383/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1099897648,I_kwDOCGYnMM5Bjxsw,384,Add examples to every `--help`,9599,simonw,closed,0,,,,,0,2022-01-12T05:31:25Z,2022-01-26T03:15:02Z,2022-01-26T03:15:02Z,OWNER,,Everything on https://sqlite-utils.datasette.io/en/stable/cli-reference.html would benefit from an example.,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/384/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1122446693,I_kwDOCGYnMM5C5y1l,394,Test against Python 3.11-dev,9599,simonw,open,0,,,,,1,2022-02-02T22:21:03Z,2022-02-03T21:06:35Z,,OWNER,,"Same as: - https://github.com/simonw/datasette/issues/1621",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/394/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1107557831,I_kwDOCGYnMM5CA_3H,386,"Better ""contributing"" documentation",9599,simonw,closed,0,,,,,0,2022-01-19T02:11:48Z,2022-01-19T02:15:21Z,2022-01-19T02:15:21Z,OWNER,,"This page jumps straight into running the tests: https://sqlite-utils.datasette.io/en/latest/contributing.html It should add a little more about expected collaboration styles - opening an issue before filing a pull request - and probably link to https://simonwillison.net/2022/Jan/12/how-i-build-a-feature/",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/386/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1111293050,I_kwDOCGYnMM5CPPx6,387,Python library docs should start with a self contained example,9599,simonw,closed,0,,,,,1,2022-01-22T06:23:56Z,2022-01-26T01:37:17Z,2022-01-26T01:35:30Z,OWNER,,You have to read a lot of stuff in a lot of different places to get started with the Python library. 
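Something short and self-contained along these lines could open the page (database and rows hypothetical):

```python
import sqlite_utils

db = sqlite_utils.Database(""dogs.db"")
# First insert creates the table, inferring column types from the data
db[""dogs""].insert_all(
    [{""id"": 1, ""name"": ""Cleo""}, {""id"": 2, ""name"": ""Pancakes""}],
    pk=""id"",
)
for row in db[""dogs""].rows:
    print(row)
```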
Add a getting started introduction to https://sqlite-utils.datasette.io/en/stable/python-api.html,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/387/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1123851690,I_kwDOCGYnMM5C_J2q,396,"mypy failure, sqlite_utils/utils.py:56",9599,simonw,closed,0,,,,,0,2022-02-04T06:08:09Z,2022-02-04T06:10:33Z,2022-02-04T06:10:33Z,OWNER,,"https://github.com/simonw/sqlite-utils/runs/5062725880?check_suite_focus=true > `sqlite_utils/utils.py:56: error: Incompatible return value type (got ""None"", expected ""str"")`",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/396/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1123849278,I_kwDOCGYnMM5C_JQ-,395,"""apt-get: command not found"" error on macOS",9599,simonw,closed,0,,,,,1,2022-02-04T06:03:42Z,2022-02-04T06:10:58Z,2022-02-04T06:10:58Z,OWNER,,"Yeah, `apt-get` isn't a thing on macOS so 4a2a3e2fd0d5534f446b3f1fee34cb165e4d86d2 (to test #79 against real SpatiaLite) broke.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/395/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1123903919,I_kwDOCGYnMM5C_Wmv,397,Support IF NOT EXISTS for table creation,738408,rafguns,closed,0,,,,,3,2022-02-04T07:41:15Z,2022-02-06T01:30:46Z,2022-02-06T01:29:01Z,NONE,,"Currently, I have a bunch of code that looks like this: ```python subjects = db[""subjects""] if db[""subjects""].exists() else db[""subjects""].create({ ... }) ``` It would be neat if sqlite-utils could simplify that by supporting `CREATE TABLE IF NOT EXISTS`, so that I'd be able to write, e.g. ```python subjects = db[""subjects""].create({...}, if_not_exists=True) ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/397/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1114543475,I_kwDOCGYnMM5CbpVz,388,Link to stable docs from older versions,9599,simonw,closed,0,,,,,7,2022-01-26T01:55:46Z,2023-03-26T23:43:12Z,2022-01-26T02:00:22Z,OWNER,,"https://sqlite-utils.datasette.io/en/2.14.1/ isn't showing a link to the stable release right now. 
I should also apply the same fix I used for Datasette in: - https://github.com/simonw/datasette/issues/1608 TIL: https://til.simonwillison.net/readthedocs/link-from-latest-to-stable",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/388/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1114544727,I_kwDOCGYnMM5CbppX,389,Plausible analytics for documentation,9599,simonw,closed,0,,,,,2,2022-01-26T01:58:35Z,2022-01-26T02:07:41Z,2022-01-26T02:07:41Z,OWNER,,"```html ``` _Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/388#issuecomment-1021785268_",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/389/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1114557284,I_kwDOCGYnMM5Cbstk,390,`sqlite-utils upsert` should require `--pk` more elegantly,9599,simonw,closed,0,,,,,1,2022-01-26T02:20:31Z,2022-01-26T03:20:25Z,2022-01-26T03:19:43Z,OWNER,,"Currently throws an ugly traceback: ``` % echo '[ {""id"": 1, ""name"": ""Lila""}, {""id"": 1, ""name"": ""Lila""} ]' | sqlite-utils upsert data.db chickens - Traceback (most recent call last): File ""/Users/simon/.local/share/virtualenvs/sqlite-utils-C4Ilevlm/bin/sqlite-utils"", line 33, in sys.exit(load_entry_point('sqlite-utils', 'console_scripts', 'sqlite-utils')()) File ""/Users/simon/.local/share/virtualenvs/sqlite-utils-C4Ilevlm/lib/python3.8/site-packages/click/core.py"", line 1137, in __call__ return self.main(*args, **kwargs) File ""/Users/simon/.local/share/virtualenvs/sqlite-utils-C4Ilevlm/lib/python3.8/site-packages/click/core.py"", line 1062, in main rv = self.invoke(ctx) File ""/Users/simon/.local/share/virtualenvs/sqlite-utils-C4Ilevlm/lib/python3.8/site-packages/click/core.py"", line 1668, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/Users/simon/.local/share/virtualenvs/sqlite-utils-C4Ilevlm/lib/python3.8/site-packages/click/core.py"", line 1404, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/Users/simon/.local/share/virtualenvs/sqlite-utils-C4Ilevlm/lib/python3.8/site-packages/click/core.py"", line 763, in invoke return __callback(*args, **kwargs) File ""/Users/simon/Dropbox/Development/sqlite-utils/sqlite_utils/cli.py"", line 1104, in upsert insert_upsert_implementation( File ""/Users/simon/Dropbox/Development/sqlite-utils/sqlite_utils/cli.py"", line 906, in insert_upsert_implementation db[table].insert_all( File ""/Users/simon/Dropbox/Development/sqlite-utils/sqlite_utils/db.py"", line 2615, in insert_all raise PrimaryKeyRequired(""upsert() requires a pk"") sqlite_utils.db.PrimaryKeyRequired: upsert() requires a pk ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/390/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1114640101,I_kwDOCGYnMM5CcA7l,392,`sqlite-utils bulk --batch-size` option,9599,simonw,closed,0,,,,,4,2022-01-26T05:17:11Z,2022-01-26T18:17:59Z,2022-01-26T18:17:59Z,OWNER,,"> Could add support for `--batch-size` as seen in `insert`/`upsert` too - causing it to break the list up into batches and commit for each one. 
_Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/391#issuecomment-1021876055_",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/392/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1114638930,I_kwDOCGYnMM5CcApS,391,`sqlite-utils bulk` progress bar,9599,simonw,closed,0,,,,,2,2022-01-26T05:14:49Z,2022-01-26T05:17:20Z,2022-01-26T05:16:51Z,OWNER,,"It can easily have a progress bar because it works by looping through an iterator: https://github.com/simonw/sqlite-utils/blob/a9fca7efa4184fbb2a65ca1275c326950ed9d3c1/sqlite_utils/cli.py#L1014-L1018 Should also support the `--silent` option if I add this.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/391/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1118585417,I_kwDOCGYnMM5CrEJJ,393,Better documentation for insert-replace,9599,simonw,closed,0,,,,,1,2022-01-30T15:40:23Z,2022-02-03T22:13:24Z,2022-02-03T22:13:24Z,OWNER,,"Currently: https://sqlite-utils.datasette.io/en/stable/python-api.html#insert-replacing-data > If you want to insert a record or replace an existing record with the same primary key, using the replace=True argument to .insert() or .insert_all(): Should describe the exception you get first, then how to use replace to avoid it.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/393/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1124237013,I_kwDOCGYnMM5DAn7V,398,Add SpatiaLite helpers to CLI,25778,eyeseast,closed,0,,,,,9,2022-02-04T14:01:28Z,2022-02-16T01:02:29Z,2022-02-16T00:58:07Z,CONTRIBUTOR,,"Now that #385 is merged, add CLI versions of those methods. ```sh # init spatialite sqlite-utils init-spatialite database.db # or maybe/also sqlite-utils create database.db --enable-wal --spatialite # add geometry columns # needs a database, table, geometry column name, type, with optional SRID and not-null # this needs to create a table if it doesn't already exist sqlite-utils add-geometry-column database.db table-name geometry --srid 4326 --not-null # spatial index an existing table/column sqlite-utils create-spatial-index database.db table-name geometry ``` Should be mostly straightforward. The one thing worth highlighting in docs is that geometry columns can only be added to existing tables. Trying to add a geometry column to a table that doesn't exist yet might mean you have a schema like `{""rowid"": int, ""geometry"": bytes}`. Might be worth nudging people to explicitly create a table first, then add geometry columns. ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/398/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1124731464,I_kwDOCGYnMM5DCgpI,399,"Make it easier to insert geometries, with documentation and maybe code",9599,simonw,open,0,,,,,25,2022-02-05T00:11:26Z,2023-05-16T03:11:52Z,,OWNER,,"In playing with the new SpatiaLite helpers from #385 I noticed that actually populating geometry columns is still a little bit tricky. 
Here's what I ended up doing: ```python import httpx, sqlite_utils db = sqlite_utils.Database(""/tmp/spatial.db"") attractions = httpx.get(""https://latest.datasette.io/fixtures/roadside_attractions.json?_shape=array"").json() db[""attractions""].insert_all(attractions, pk=""pk"") # Schema of that table is now: # CREATE TABLE [attractions] ( # [pk] INTEGER PRIMARY KEY, # [name] TEXT, # [address] TEXT, # [latitude] FLOAT, # [longitude] FLOAT # ) db.init_spatialite() db[""attractions""].add_geometry_column(""point"", ""POINT"") db.execute("""""" update attractions set point = GeomFromText( 'POINT(' || longitude || ' ' || latitude || ')', 4326 ) """""") ``` That last line took some figuring out - especially the need for the SRID of `4326`, without which I got this error: > `IntegrityError: attractions.point violates Geometry constraint [geom-type or SRID not allowed]` It would be good to both document this in more detail, but ideally also to come up with a more obvious pattern for inserting common types of spatial data. Also related: - #398 - #79",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/399/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1125077063,I_kwDOCGYnMM5DD1BH,400,`sqlite-utils create-table` ... `--if-not-exists`,9599,simonw,closed,0,,,,,1,2022-02-06T01:32:53Z,2022-02-06T01:34:53Z,2022-02-06T01:34:46Z,OWNER,,"Inspired by: - #397 To match the option on `create-index`: https://sqlite-utils.datasette.io/en/stable/cli-reference.html#create-index ``` --if-not-exists Ignore if index already exists ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/400/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1125081640,I_kwDOCGYnMM5DD2Io,401,Update SpatiaLite example in the documentation,9599,simonw,closed,0,,,,,2,2022-02-06T02:02:07Z,2022-02-06T02:05:03Z,2022-02-06T02:03:24Z,OWNER,,"This one here: https://sqlite-utils.datasette.io/en/3.23/python-api.html#converting-column-values-using-sql-functions It should take advantage of the new methods from: - #79",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/401/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1125297737,I_kwDOCGYnMM5DEq5J,402,Advanced class-based `conversions=` mechanism,9599,simonw,open,0,,,,,14,2022-02-06T19:47:41Z,2022-02-16T10:18:55Z,,OWNER,,"The `conversions=` parameter works like this at the moment: https://sqlite-utils.datasette.io/en/3.23/python-api.html#converting-column-values-using-sql-functions ```python db[""places""].insert( {""name"": ""Wales"", ""geometry"": wkt}, conversions={""geometry"": ""GeomFromText(?, 4326)""}, ) ``` This proposal is to support values in that dictionary that are objects, not strings, which can represent more complex conversions - spun out from #399. 
New proposed mechanism: ```python from sqlite_utils.utils import LongitudeLatitude db[""places""].insert( { ""name"": ""London"", ""point"": (-0.118092, 51.509865) }, conversions={""point"": LongitudeLatitude}, ) ``` Here `LongitudeLatitude` is a magical value which does TWO things: it sets up the `GeomFromText(?, 4326)` SQL function, and it handles converting the `(51.509865, -0.118092)` tuple into a `POINT({} {})` string. This would involve a change to the `conversions=` contract - where it usually expects a SQL string fragment, but it can also take an object which combines that SQL string fragment with a Python conversion function. Best of all... this resolves the `lat, lon` vs. `lon, lat` dilemma because you can use `from sqlite_utils.utils import LongitudeLatitude` OR `from sqlite_utils.utils import LatitudeLongitude` depending on which you prefer! _Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/399#issuecomment-1030739566_",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/402/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1126692066,I_kwDOCGYnMM5DJ_Ti,403,Document how to add a primary key to a rowid table using `sqlite-utils transform --pk`,536941,fgregg,closed,0,,,,,4,2022-02-08T01:39:40Z,2022-02-09T04:22:43Z,2022-02-08T19:33:59Z,CONTRIBUTOR,,"*Original title: Add option for adding a new, serial, primary key* Sometimes we have tables that don't have primary keys, but ought to have them. We *can* use rowid for that, but it would often be nicer to have an explicit primary key. Using the current value of rowid would be fine.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/403/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1128120451,I_kwDOCGYnMM5DPcCD,404,Add example of `--convert` to the help for `sqlite-utils insert`,9599,simonw,closed,0,,,,,2,2022-02-09T06:49:09Z,2022-02-09T06:56:35Z,2022-02-09T06:55:16Z,OWNER,,"https://sqlite-utils.datasette.io/en/3.23/cli-reference.html#insert would be more useful if it included an example of `--convert` in action. I can maybe use an example from https://simonwillison.net/2022/Jan/11/sqlite-utils/",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/404/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1128139375,I_kwDOCGYnMM5DPgpv,405,"`Database(memory_name=""name"")` constructor argument",9599,simonw,closed,0,,,,,2,2022-02-09T07:15:03Z,2022-02-16T01:23:16Z,2022-02-16T01:23:16Z,OWNER,,"SQLite in-memory databases can be named, in which case multiple connections can be opened to a shared in-memory database running within the same process. Datasette supports this - sqlite-utils could support it too.
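For reference, the underlying SQLite mechanism via the standard library (the database name here is hypothetical):

```python
import sqlite3

# Two connections to the same named in-memory database within one process
uri = ""file:shared_db?mode=memory&cache=shared""
c1 = sqlite3.connect(uri, uri=True)
c2 = sqlite3.connect(uri, uri=True)
c1.execute(""CREATE TABLE t (id INTEGER)"")
c1.execute(""INSERT INTO t VALUES (1)"")
c1.commit()
print(c2.execute(""SELECT * FROM t"").fetchall())  # [(1,)]
```

The database lives as long as at least one connection stays open.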
https://docs.datasette.io/en/0.60.2/internals.html#database-ds-path-none-is-mutable-false-is-memory-false-memory-name-none",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/405/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1128466114,I_kwDOCGYnMM5DQwbC,406,Creating tables with custom datatypes,82988,psychemedia,open,0,,,,,5,2022-02-09T12:16:31Z,2022-09-15T18:13:50Z,,NONE,,"Via https://stackoverflow.com/a/18622264/454773 I note the ability to register custom handlers for novel datatypes that can map into and out of things like sqlite `BLOB`s. From a quick look and a quick play, I didn't spot a way to do this in `sqlite_utils`? For example: ```python # Via https://stackoverflow.com/a/18622264/454773 import sqlite3 import numpy as np import io def adapt_array(arr): """""" http://stackoverflow.com/a/31312102/190597 (SoulNibbler) """""" out = io.BytesIO() np.save(out, arr) out.seek(0) return sqlite3.Binary(out.read()) def convert_array(text): out = io.BytesIO(text) out.seek(0) return np.load(out) # Converts np.array to TEXT when inserting sqlite3.register_adapter(np.ndarray, adapt_array) # Converts TEXT to np.array when selecting sqlite3.register_converter(""array"", convert_array) ``` ```python from sqlite_utils import Database db = Database('test.db') # Reset the database connection to used the parsed datatype # sqlite_utils doesn't seem to support eg: # Database('test.db', detect_types=sqlite3.PARSE_DECLTYPES) db.conn = sqlite3.connect(db_name, detect_types=sqlite3.PARSE_DECLTYPES) # Create a table the old fashioned way # but using the new custom data type vector_table_create = """""" CREATE TABLE dummy (title TEXT, vector array ); """""" cur = db.conn.cursor() cur.execute(vector_table_create) # sqlite_utils doesn't appear to support custom types (yet?!) # The following errors on the ""array"" datatype """""" db[""dummy""].create({ ""title"": str, ""vector"": ""array"", }) """""" ``` We can then add / retrieve records from the database where the datatype of the `vector` field is a custom registered `array` type (which is to say, a `numpy` array): ```python import numpy as np db[""dummy""].insert({'title':""test1"", 'vector':np.array([1,2,3])}) for row in db.query(""SELECT * FROM dummy""): print(row['title'], row['vector'], type(row['vector'])) """""" test1 [1 2 3] """""" ``` It would be handy to be able to do this idiomatically in `sqlite_utils`.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/406/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1145882578,I_kwDOCGYnMM5ETMfS,408,`deterministic=True` fails on versions of SQLite prior to 3.8.3,24938923,learning4life,closed,0,,,,,6,2022-02-21T14:36:43Z,2022-03-13T16:54:09Z,2022-03-02T00:38:11Z,NONE,,"Hi, love your work. 
I am unable to lookup indexes in a database using sqlite-utils: ` sqlite-utils indexes city_spec.db --table` or `sqlite-utils indexes city_spec.db MyTable ` **Software** sqlite-utils, version 3.24 sqlite3 --version: 3.36.0 **Output:** Traceback (most recent call last): File ""/opt/app-root/bin/sqlite-utils"", line 8, in sys.exit(cli()) File ""/opt/app-root/lib64/python3.8/site-packages/click/core.py"", line 1128, in __call__ return self.main(*args, **kwargs) File ""/opt/app-root/lib64/python3.8/site-packages/click/core.py"", line 1053, in main rv = self.invoke(ctx) File ""/opt/app-root/lib64/python3.8/site-packages/click/core.py"", line 1659, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/opt/app-root/lib64/python3.8/site-packages/click/core.py"", line 1395, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/opt/app-root/lib64/python3.8/site-packages/click/core.py"", line 754, in invoke return __callback(*args, **kwargs) File ""/opt/app-root/lib64/python3.8/site-packages/click/decorators.py"", line 26, in new_func return f(get_current_context(), *args, **kwargs) File ""/opt/app-root/lib64/python3.8/site-packages/sqlite_utils/cli.py"", line 2123, in indexes ctx.invoke( File ""/opt/app-root/lib64/python3.8/site-packages/click/core.py"", line 754, in invoke return __callback(*args, **kwargs) File ""/opt/app-root/lib64/python3.8/site-packages/sqlite_utils/cli.py"", line 1624, in query db.register_fts4_bm25() File ""/opt/app-root/lib64/python3.8/site-packages/sqlite_utils/db.py"", line 403, in register_fts4_bm25 self.register_function(rank_bm25, deterministic=True) File ""/opt/app-root/lib64/python3.8/site-packages/sqlite_utils/db.py"", line 399, in register_function register(fn) File ""/opt/app-root/lib64/python3.8/site-packages/sqlite_utils/db.py"", line 392, in register self.conn.create_function(name, arity, fn, **kwargs) sqlite3.NotSupportedError: deterministic=True requires SQLite 3.8.3 or higher ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/408/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1149661489,I_kwDOCGYnMM5EhnEx,409,`with db:` for transactions,9599,simonw,open,0,,,,,3,2022-02-24T19:22:06Z,2022-10-01T03:42:50Z,,OWNER,,This can be a documented wrapper around `with db.conn:`.,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/409/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1171599874,I_kwDOCGYnMM5F1TIC,415,Convert with `--multi` and `--dry-run` flag does not work,3976183,dotcs,closed,0,,,,,2,2022-03-16T21:59:46Z,2022-03-21T04:18:24Z,2022-03-21T04:18:24Z,NONE,,"It's not possible to combine `--multi` and `--dry-run` flag in the `convert` command. 
Let's first create a simple database from JSON string ```console $ echo '[{""foo"": ""abc""}]' | sqlite-utils insert demo.db demo - $ sqlite-utils query demo.db ""SELECT * FROM demo"" [{""foo"": ""abc""}] ``` and then try to convert the ""foo"" column with a static value ""bar"" (see docs [Converting a column into multiple columns](https://sqlite-utils.datasette.io/en/stable/cli.html#converting-a-column-into-multiple-columns)) ```console $ sqlite-utils convert demo.db demo foo '{""foo"": ""bar""}' --multi --dry-run Traceback (most recent call last): File ""/home/dotcs/anaconda3/envs/tools/bin/sqlite-utils"", line 8, in sys.exit(cli()) File ""/home/dotcs/anaconda3/envs/tools/lib/python3.9/site-packages/click/core.py"", line 1128, in __call__ return self.main(*args, **kwargs) File ""/home/dotcs/anaconda3/envs/tools/lib/python3.9/site-packages/click/core.py"", line 1053, in main rv = self.invoke(ctx) File ""/home/dotcs/anaconda3/envs/tools/lib/python3.9/site-packages/click/core.py"", line 1659, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/home/dotcs/anaconda3/envs/tools/lib/python3.9/site-packages/click/core.py"", line 1395, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/home/dotcs/anaconda3/envs/tools/lib/python3.9/site-packages/click/core.py"", line 754, in invoke return __callback(*args, **kwargs) File ""/home/dotcs/anaconda3/envs/tools/lib/python3.9/site-packages/sqlite_utils/cli.py"", line 2686, in convert for row in db.conn.execute(sql, where_args).fetchall(): sqlite3.OperationalError: user-defined function raised exception ``` But without the `--dry-run` flag it does work as expected: ```console $ sqlite-utils convert demo.db demo foo '{""foo"": ""bar""}' --multi $ sqlite-utils query demo.db ""SELECT * FROM demo"" [{""foo"": ""bar""}] ``` ```console $ sqlite-utils --version sqlite-utils, version 3.25.1 ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/415/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1173023272,I_kwDOCGYnMM5F6uoo,416,Options for how `r.parsedate()` should handle invalid dates,638427,mattkiefer,closed,0,,,,,11,2022-03-17T23:29:55Z,2022-05-03T21:36:49Z,2022-03-21T04:01:39Z,NONE,,"Exceptions are normal expected behavior when typecasting an invalid format. However, r.parsedate() is really just re-formatting strings and keeping the type as text. So it may be better to print-and-pass on exception so the user can see a complete list of invalid values -- while also allowing for the parser to reformat the remaining valid values. 
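For example, a print-and-pass variant of the recipe could look like this - a sketch only, and `parsedate_or_pass` is my name, not part of `sqlite-utils`:

```python
from dateutil import parser

def parsedate_or_pass(value, dayfirst=False, yearfirst=False):
    # reformat parseable strings; report and keep the original on failure
    try:
        return parser.parse(
            value, dayfirst=dayfirst, yearfirst=yearfirst
        ).date().isoformat()
    except (parser.ParserError, OverflowError):
        print('not a date: %r' % value)
        return value
```

That way the run completes and prints every invalid value for fixing afterwards, where the current behaviour aborts on the first bad value: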
``` sqlite-utils convert idfpr.db license ""Expiration Date"" ""r.parsedate(value)"" [#######-----------------------------] 21% 00:01:57Traceback (most recent call last): File ""/usr/local/lib/python3.9/dist-packages/sqlite_utils/db.py"", line 2336, in convert_value return fn(v) File """", line 2, in fn File ""/usr/local/lib/python3.9/dist-packages/sqlite_utils/recipes.py"", line 8, in parsedate parser.parse(value, dayfirst=dayfirst, yearfirst=yearfirst).date().isoformat() File ""/usr/lib/python3/dist-packages/dateutil/parser/_parser.py"", line 1374, in parse return DEFAULTPARSER.parse(timestr, **kwargs) File ""/usr/lib/python3/dist-packages/dateutil/parser/_parser.py"", line 652, in parse raise ParserError(""String does not contain a date: %s"", timestr) dateutil.parser._parser.ParserError: String does not contain a date: / / ``` In this case, I had just one variation of an invalid date: ' / / '. But theoretically there could be many values that would have to be fixed one at a time with the current exception handling. ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/416/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1160034488,I_kwDOCGYnMM5FJLi4,411,Support for generated columns,25778,eyeseast,open,0,,,,,8,2022-03-04T20:41:33Z,2022-03-11T22:32:43Z,,CONTRIBUTOR,,"This is a fairly new feature -- SQLite version 3.31.0 (2020-01-22) -- that I, admittedly, haven't gotten to work yet. But it looks _incredibly_ useful: https://dgl.cx/2020/06/sqlite-json-support I'm not sure if this is an option on `add-column` or a separate command like `add-generated-column`. Either way, it needs an argument to populate it. 
It could be something like this: ```sh sqlite-utils add-column data.db table-name generated --as 'json_extract(data, ""$.field"")' --virtual ``` More here: https://www.sqlite.org/gencol.html",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/411/reactions"", ""total_count"": 2, ""+1"": 2, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1160182768,I_kwDOCGYnMM5FJvvw,412,Optional Pandas integration,9599,simonw,open,0,,,,,13,2022-03-05T01:49:27Z,2022-06-14T15:36:29Z,,OWNER,,"It would be neat if there was a way to use this more seamlessly with Pandas, in particular Pandas dataframes - but without making Pandas a required dependency.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/412/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1166587040,I_kwDOCGYnMM5FiLSg,413,Display autodoc type information more legibly,9599,simonw,closed,0,,,,,5,2022-03-11T15:58:20Z,2022-03-11T18:07:10Z,2022-03-11T18:07:10Z,OWNER,,"https://sqlite-utils.datasette.io/en/3.25/reference.html#sqlite_utils.db.Table.insert looks like this at the moment: ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/413/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1166731361,I_kwDOCGYnMM5Fiuhh,414,I forgot to include the changelog in the 3.25.1 release,9599,simonw,closed,0,,,,,7,2022-03-11T18:32:36Z,2022-03-11T18:40:39Z,2022-03-11T18:40:39Z,OWNER,,"I pushed a release for https://github.com/simonw/sqlite-utils/releases/tag/3.25.1 but forgot to include the release notes in `docs/changelog.rst` This means https://sqlite-utils.datasette.io/en/stable/changelog.html isn't showing them.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/414/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1175744654,I_kwDOCGYnMM5GFHCO,417,insert fails on JSONL with whitespace,9954,blaine,closed,0,,,,,3,2022-03-21T17:58:14Z,2022-03-25T21:19:06Z,2022-03-25T21:17:13Z,NONE,,"Any JSON that is newline-delimited and has whitespace (newlines) between the start of a JSON object and an attribute fails due to a parse error. e.g. given the valid JSONL: ```{ ""attribute"": ""value"" } { ""attribute"": ""value2"" } ``` I would expect that `sqlite-utils insert --nl my.db mytable file.jsonl` would properly import the data into `mytable`. However, the following error is thrown instead: `json.decoder.JSONDecodeError: Expecting property name enclosed in double quotes: line 2 column 1 (char 2)` It makes sense that since the file is intended to be newline separated, the thing being parsed is ""{"" (which obviously fails), however the default newline-separated output of `jq` isn't compact. Using `jq -c` avoids this problem, but the fix is unintuitive and undocumented. Proposed solutions: 1. Default to a ""loose"" newline-separated parse; this could be implemented internally as [the equivalent of] a `jq -c` filter ahead of the insert step. 2. 
Catch the JSONDecodeError (or pre-empt it in the case of a record === ""{\n"") and give the user a ""it looks like your json isn't _actually_ newline-delimited; try running it through `jq -c` instead"" error message. It might just have been too early in the morning when I was playing with this, but running pipes of data through sqlite-utils without the 'knack' of it led to some false starts.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/417/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1178456794,I_kwDOCGYnMM5GPdLa,418,Add generated files to .gitignore,25778,eyeseast,closed,0,,,,,0,2022-03-23T17:48:12Z,2022-03-24T21:01:44Z,2022-03-24T21:01:44Z,CONTRIBUTOR,,"I end up with these in my local directory: .hypothesis/ Pipfile Pipfile.lock pyproject.toml Might as well gitignore them.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/418/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1178546862,I_kwDOCGYnMM5GPzKu,420,Document how to use a `--convert` function that runs initialization code first,770231,strada,closed,0,,,,,12,2022-03-23T19:07:36Z,2022-08-28T11:34:37Z,2022-03-25T20:07:33Z,NONE,,"When I have an insert command with transform like this: ``` cat items.json | jq '.data' | sqlite-utils insert listings.db listings - --convert ' d = enchant.Dict(""en_US"") row[""is_dictionary_word""] = d.check(row[""name""]) ' --import=enchant --ignore ``` I noticed as the number of rows increases the operation becomes quite slow, likely due to the creation of the `d = enchant.Dict(""en_US"")` object for each row. 
Is there a way to share that instance `d` between transform function calls, like a shared context?",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/420/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1180427792,I_kwDOCGYnMM5GW-YQ,421,"""Error: near ""("": syntax error"" when using sqlite-utils indexes CLI",24938923,learning4life,closed,0,,,,,8,2022-03-25T07:12:51Z,2022-04-13T22:41:59Z,2022-04-13T22:41:59Z,NONE,,"This bug relates to https://github.com/simonw/sqlite-utils/issues/408#issuecomment-1066139147 **New error when using CLI: ""sqlite-utils indexes global.db --table""** ``` (app-root) sqlite-utils indexes global.db --table Error: near ""("": syntax error (app-root) sqlite-utils --version sqlite-utils, version 3.25.1 (app-root) sqlite3 --version 3.36.0 2021-06-18 18:36:39 (app-root) python --version Python 3.8.11 ``` Dockerfile ``` FROM centos/python-38-centos7 USER root RUN yum update -y RUN yum upgrade -y # epel RUN yum -y install epel-release && yum clean all # SQLite RUN yum -y install zlib-devel geos geos-devel proj proj-devel freexl freexl-devel libxml2-devel WORKDIR /build/ COPY sqlite-autoconf-3360000.tar.gz ./ RUN tar -zxf sqlite-autoconf-3360000.tar.gz WORKDIR /build/sqlite-autoconf-3360000 RUN ./configure RUN make RUN make install # RUN /opt/app-root/bin/python3.8 -m pip install --upgrade pip RUN pip install sqlite-utils ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/421/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1181236173,I_kwDOCGYnMM5GaDvN,422,Reconsider not running convert functions against null values,9599,simonw,open,0,,,,,1,2022-03-25T20:22:40Z,2022-03-25T20:23:21Z,,OWNER,,"I just got caught out by the fact that `None` values are not processed by the `.convert()` mechanism https://github.com/simonw/sqlite-utils/blob/0b7b80bd40fe86e4d66a04c9f607d94991c45c0b/sqlite_utils/db.py#L2504-L2510 I had run this code while working on #420 and I wasn't sure why it didn't work: ``` $ sqlite-utils add-column content.db articles score float $ sqlite-utils convert content.db articles score ' import random random.seed(10) def convert(value): global random return random.random() ' ``` The reason it didn't work is that the newly added `score` column was full of `null` values. I fixed it by doing this instead: $ sqlite-utils add-column content.db articles score float --not-null-default 1.0 But this indicates to me that the design of `convert()` here may be incorrect.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/422/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1205687423,I_kwDOCGYnMM5H3VR_,426,CLI docs should link to Python docs and vice versa,9599,simonw,closed,0,9599,simonw,,,1,2022-04-15T16:05:15Z,2023-07-22T22:13:22Z,2023-07-22T22:13:22Z,OWNER,,"For every command/API method there should be a link to the equivalent in the other form factor. 
Maybe also link to the API and CLI reference pages too.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/426/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1199158210,I_kwDOCGYnMM5HebPC,423,.extract() doesn't set foreign key when extracted columns contain NULL value,37447552,jlieth,closed,0,,,,,1,2022-04-10T20:05:30Z,2022-08-27T14:45:04Z,2022-08-27T14:45:04Z,NONE,,"I've run into an issue with `extract` and I don't believe this is the intended behaviour. I'm working with a database with music listening information. Currently it has one large table `listens` that contains all information. I'm trying to normalize the database by extracting relevant columns to separate tables (`artists`, `tracks`, `albums`). Not every track has an album. A simplified demonstration with just `track_title` and `album_title` columns: ```ipython In [1]: import sqlite_utils In [2]: db = sqlite_utils.Database(memory=True) In [3]: db[""listens""].insert_all([ ...: {""id"": 1, ""track_title"": ""foo"", ""album_title"": ""bar""}, ...: {""id"": 2, ""track_title"": ""baz"", ""album_title"": None} ...: ], pk=""id"") Out[3]: ``` The track in the first row has an album, the second track doesn't. Now I extract album information into a separate column: ```ipython In [4]: db[""listens""].extract(columns=[""album_title""], table=""albums"", fk_column=""album_id"") Out[4]:
In [5]: list(db[""albums""].rows) Out[5]: [{'id': 1, 'album_title': 'bar'}, {'id': 2, 'album_title': None}] In [6]: list(db[""listens""].rows) Out[6]: [{'id': 1, 'track_title': 'foo', 'album_id': 1}, {'id': 2, 'track_title': 'baz', 'album_id': None}] ``` This behaves as expected -- the `album` table contains entries for both the existing album and the NULL album. The `listens` table has a foreign key only for the first row (since the album in the second row was empty). Now I want to extract the track information as well. Album information belongs to the track so I want to extract both columns to a new table. ```ipython In [7]: db[""listens""].extract(columns=[""track_title"", ""album_id""], table=""tracks"", fk_column=""track_id"") Out[7]:
In [8]: list(db[""tracks""].rows) Out[8]: [{'id': 1, 'track_title': 'foo', 'album_id': 1}, {'id': 2, 'track_title': 'baz', 'album_id': None}] In [9]: list(db[""listens""].rows) Out[9]: [{'id': 1, 'track_id': 1}, {'id': 2, 'track_id': None}] ``` Extracting to the `tracks` table worked fine (both tracks are present with correct columns). However, the `listens` table only has a foreign key to the newly created tracks for the first row, the foreign key in the second row is NULL. Changing the order of extracts doesn't help. I poked around in the source a bit and I believe [this line](https://github.com/simonw/sqlite-utils/blob/433813612ff9b4b501739fd7543bef0040dd51fe/sqlite_utils/db.py#L1737) (essentially comparing `NULL = NULL`) is the problem, but I don't know enough about SQL to create a reliable fix myself.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/423/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1200866134,I_kwDOCGYnMM5Hk8NW,424,Better error message if you try to create a table with no columns,9599,simonw,closed,0,,,,,1,2022-04-12T02:43:20Z,2022-04-13T22:40:15Z,2022-04-13T22:40:10Z,OWNER,,"Seen here: - https://github.com/simonw/geojson-to-sqlite/issues/30 Attempting to create a table with no columns produced this confusing error: ``` File ""/Users/simon/.local/pipx/venvs/geojson-to-sqlite/lib/python3.9/site-packages/geojson_to_sqlite/utils.py"", line 69, in import_features db[table].create(column_types, pk=pk) File ""/Users/simon/.local/pipx/venvs/geojson-to-sqlite/lib/python3.9/site-packages/sqlite_utils/db.py"", line 863, in create self.db.create_table( File ""/Users/simon/.local/pipx/venvs/geojson-to-sqlite/lib/python3.9/site-packages/sqlite_utils/db.py"", line 517, in create_table self.execute(sql) File ""/Users/simon/.local/pipx/venvs/geojson-to-sqlite/lib/python3.9/site-packages/sqlite_utils/db.py"", line 236, in execute return self.conn.execute(sql) sqlite3.OperationalError: near "")"": syntax error ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/424/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1203842656,I_kwDOCGYnMM5HwS5g,425,`sqlite3.NotSupportedError`: deterministic=True requires SQLite 3.8.3 or higher,9599,simonw,closed,0,,,,,5,2022-04-13T22:16:53Z,2023-04-15T20:14:58Z,2022-04-13T22:48:57Z,OWNER,,"Got this error while investigating: - #421 Even though I was using the `LD_PRELOAD` trick from https://til.simonwillison.net/sqlite/ld-preload to use a newer version of SQLite. 
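One way to guard against this at registration time - a sketch, with a name of my choosing, and not necessarily the fix that shipped:

```python
import sqlite3

def register_function(conn, name, arity, fn):
    # prefer deterministic=True so SQLite can optimize, but degrade gracefully
    try:
        conn.create_function(name, arity, fn, deterministic=True)
    except sqlite3.NotSupportedError:
        # underlying SQLite is older than 3.8.3
        conn.create_function(name, arity, fn)
    except TypeError:
        # Python < 3.8: create_function() has no deterministic keyword
        conn.create_function(name, arity, fn)
```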
_Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/421#issuecomment-1098531354_",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/425/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1224112817,I_kwDOCGYnMM5I9nqx,430,Document how to use `PRAGMA temp_store` to avoid errors when running VACUUM against huge databases,9308268,rayvoelker,open,0,,,,,2,2022-05-03T13:33:58Z,2022-06-14T23:26:37Z,,NONE,,"I'm trying to figure out a way to get the `table.extract()` method to complete successfully -- I'm not sure if maybe the cause (and a possible solution) of this on Ubuntu Server 22.04 is to adjust some of the PRAGMA values within SQLite itself ... on another Linux system (PopOS), using this method on this same database appears to work just fine. Here's the bit that's causing the error, and the resulting error output: ```python # combine these columns into 1 table ""bib_properties"" : # best_title # bib_level_code # mat_type # material_code # best_author db[""circ_trans""].extract( [""best_title"", ""bib_level_code"", ""mat_type"", ""material_code"", ""best_author""], table=""bib_properties"", fk_column=""bib_properties_id"" ) db[""circ_trans""].extract( [""call_number""], table=""call_number"", fk_column=""call_number_id"", rename={""call_number"": ""value""} ) ``` ```python --------------------------------------------------------------------------- OperationalError Traceback (most recent call last) Input In [17], in () 1 # combine these columns into 1 table ""bib_properties"" : 2 # best_title 3 # bib_level_code 4 # mat_type 5 # material_code 6 # best_author ----> 7 db[""circ_trans""].extract( 8 [""best_title"", ""bib_level_code"", ""mat_type"", ""material_code"", ""best_author""], 9 table=""bib_properties"", 10 fk_column=""bib_properties_id"" 11 ) 13 db[""circ_trans""].extract( 14 [""call_number""], 15 table=""call_number"", 16 fk_column=""call_number_id"", 17 rename={""call_number"": ""value""} 18 ) File ~/jupyter/venv/lib/python3.10/site-packages/sqlite_utils/db.py:1764, in Table.extract(self, columns, table, fk_column, rename) 1761 column_order.append(c.name) 1763 # Drop the unnecessary columns and rename lookup column -> 1764 self.transform( 1765 drop=set(columns), 1766 rename={magic_lookup_column: fk_column}, 1767 column_order=column_order, 1768 ) 1770 # And add the foreign key constraint 1771 self.add_foreign_key(fk_column, table, ""id"") File ~/jupyter/venv/lib/python3.10/site-packages/sqlite_utils/db.py:1526, in Table.transform(self, types, rename, drop, pk, not_null, defaults, drop_foreign_keys, column_order) 1524 with self.db.conn: 1525 for sql in sqls: -> 1526 self.db.execute(sql) 1527 # Run the foreign_key_check before we commit 1528 if pragma_foreign_keys_was_on: File ~/jupyter/venv/lib/python3.10/site-packages/sqlite_utils/db.py:465, in Database.execute(self, sql, parameters) 463 return self.conn.execute(sql, parameters) 464 else: --> 465 return self.conn.execute(sql) OperationalError: database or disk is full ``` This database is about 17G in total size, so I'm assuming the error is coming from the vacuum ... where i'm assuming it's maybe trying to do the temp storage in a location that doesn't have sufficient room. 
The disk space is more than ample on the host in question (1.8T is free in the directory where the sqlite db resides) The `/tmp` directory however is limited on a smaller disk associated with the OS I'm trying to think if there's a way to set the `PRAGMA temp_store` or maybe if it's `temp_store_directory` that I'm after ... to use the same local directory of where the file is located (maybe this is a property of the version of sqlite on the system?) ```python # SET the temp file store to be a file ... print(db.execute('PRAGMA temp_store').fetchall()) print(db.execute('PRAGMA temp_store=FILE').fetchall()) print(db.execute('PRAGMA temp_store').fetchall()) # the users home directory ... print(db.execute(""PRAGMA temp_store_directory='/home/plchuser/'"").fetchall()) print(db.execute(""PRAGMA sqlite3_temp_directory='/home/plchuser/'"").fetchall()) print(db.execute(""PRAGMA temp_store_directory"").fetchall()) print(db.execute(""PRAGMA sqlite3_temp_directory"").fetchall()) ``` ```text [(1,)] [] [(1,)] [] [] [('/home/plchuser/',)] [] ``` Here's the docs on the Temporary File Storage Locations https://www.sqlite.org/tempfiles.html",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/430/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1212701569,I_kwDOCGYnMM5ISFuB,427,"sqlite-utils convert date parsing recipe complains about trying to parse ""*""",1385831,wdccdw,closed,0,,,,,1,2022-04-22T19:27:10Z,2022-07-02T13:59:59Z,2022-07-02T13:59:32Z,NONE,,"Missing values in my dataset are denoted by a single asterisk. I am trying to parse string dates into dates. This works fine for columns without missing values, but, when the column contains ""*"", I get the following: ``` $ sqlite-utils convert ${dbfile} details dob 'r.parsedate(value)' [------------------------------------] 0%Traceback (most recent call last): File ""/usr/local/Cellar/sqlite-utils/3.25.1/libexec/lib/python3.9/site-packages/sqlite_utils/db.py"", line 2508, in convert_value return fn(v) File """", line 2, in fn File ""/usr/local/Cellar/sqlite-utils/3.25.1/libexec/lib/python3.9/site-packages/sqlite_utils/recipes.py"", line 8, in parsedate parser.parse(value, dayfirst=dayfirst, yearfirst=yearfirst).date().isoformat() File ""/usr/local/Cellar/sqlite-utils/3.25.1/libexec/lib/python3.9/site-packages/dateutil/parser/_parser.py"", line 1368, in parse return DEFAULTPARSER.parse(timestr, **kwargs) File ""/usr/local/Cellar/sqlite-utils/3.25.1/libexec/lib/python3.9/site-packages/dateutil/parser/_parser.py"", line 643, in parse raise ParserError(""Unknown string format: %s"", timestr) dateutil.parser._parser.ParserError: Unknown string format: * Traceback (most recent call last): File ""/usr/local/bin/sqlite-utils"", line 33, in sys.exit(load_entry_point('sqlite-utils==3.25.1', 'console_scripts', 'sqlite-utils')()) File ""/usr/local/Cellar/sqlite-utils/3.25.1/libexec/lib/python3.9/site-packages/click/core.py"", line 1128, in __call__ return self.main(*args, **kwargs) File ""/usr/local/Cellar/sqlite-utils/3.25.1/libexec/lib/python3.9/site-packages/click/core.py"", line 1053, in main rv = self.invoke(ctx) File ""/usr/local/Cellar/sqlite-utils/3.25.1/libexec/lib/python3.9/site-packages/click/core.py"", line 1659, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/usr/local/Cellar/sqlite-utils/3.25.1/libexec/lib/python3.9/site-packages/click/core.py"", line 1395, in invoke return 
ctx.invoke(self.callback, **ctx.params) File ""/usr/local/Cellar/sqlite-utils/3.25.1/libexec/lib/python3.9/site-packages/click/core.py"", line 754, in invoke return __callback(*args, **kwargs) File ""/usr/local/Cellar/sqlite-utils/3.25.1/libexec/lib/python3.9/site-packages/sqlite_utils/cli.py"", line 2698, in convert db[table].convert( File ""/usr/local/Cellar/sqlite-utils/3.25.1/libexec/lib/python3.9/site-packages/sqlite_utils/db.py"", line 2524, in convert self.db.execute(sql, where_args or []) File ""/usr/local/Cellar/sqlite-utils/3.25.1/libexec/lib/python3.9/site-packages/sqlite_utils/db.py"", line 458, in execute return self.conn.execute(sql, parameters) sqlite3.OperationalError: user-defined function raised exception ``` ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/427/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1215216249,I_kwDOCGYnMM5Ibrp5,428,Research adding support for savepoints,9599,simonw,open,0,,,,,1,2022-04-26T01:04:01Z,2022-04-26T01:05:29Z,,OWNER,,"https://www.sqlite.org/lang_savepoint.html Savepoints are like regular transactions except they have names and can be nested. Would there be any value in adding support to them to `sqlite-utils`, potentially as some kind of context manager? Something like this: ```python with db.savepoint(""name""): # do stuff with db.savepoint(""name2""): # do more stuff raise Release # Rolls back to before ""name2"" savepoint ``` I've never used this feature so I'm not comfortable adding anything like this without a bunch of extra research.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/428/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1239034903,I_kwDOCGYnMM5J2iwX,433,CLI eats my cursor,7908073,chapmanjacobd,closed,0,,,,,10,2022-05-17T18:52:52Z,2023-11-04T00:46:30Z,2023-11-04T00:46:30Z,CONTRIBUTOR,,"I'm not sure why this happens but `sqlite-utils` makes my terminal cursor disappear after running commands like `sqlite-utils insert`. I've only noticed this behavior in `sqlite-utils`, not in any other CLI tools I can still type commands after it runs but the text cursor is invisible",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/433/reactions"", ""total_count"": 5, ""+1"": 5, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1227571375,I_kwDOCGYnMM5JK0Cv,431,Allow making m2m relation of a table to itself,738408,rafguns,open,0,,,,,3,2022-05-06T08:30:43Z,2022-06-23T14:12:51Z,,NONE,,"I am building a database, in which one of the tables has a many-to-many relationship to itself. As far as I can see, this is not (yet) possible using `.m2m()` in sqlite-utils. This may be a bit of a niche use case, so feel free to close this issue if you feel it would introduce too much complexity compared to the benefits. Example: suppose I have a table of people, and I want to store the information that John and Mary have two children, Michael and Suzy. 
It would be neat if I could do something like this: ```python from sqlite_utils import Database db = Database(memory=True) db[""people""].insert({""name"": ""John""}, pk=""name"").m2m( ""people"", [{""name"": ""Michael""}, {""name"": ""Suzy""}], m2m_table=""parent_child"", pk=""name"" ) db[""people""].insert({""name"": ""Mary""}, pk=""name"").m2m( ""people"", [{""name"": ""Michael""}, {""name"": ""Suzy""}], m2m_table=""parent_child"", pk=""name"" ) ``` But if I do that, the many-to-many table `parent_child` has only one column: ``` CREATE TABLE [parent_child] ( [people_id] TEXT REFERENCES [people]([name]), PRIMARY KEY ([people_id], [people_id]) ) ``` This could be solved by adding one or two keyword_arguments to `.m2m()`, e.g. `.m2m(..., left_name=None, right_name=None)` or `.m2m(..., names=(None, None))`.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/431/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1236693079,I_kwDOCGYnMM5JtnBX,432,"Support `rows_where()`, `delete_where()` etc for attached alias databases",11597658,luxint,open,0,,,,,5,2022-05-16T06:38:58Z,2022-06-14T22:16:48Z,,NONE,,"Hi, I noticed `rows_where()` doesn't return any rows from tables which are from attached databases. The `exists()` function returns false. As far as I can see this is because the `table_names()` function only looks for table names in the current database and not in attached (or temp) databases. Besides, `rows_where()`, also `insert_all()` and `delete_where()` didn't do what I was expecting because of this. For the moment I've patched `table_names()` for myself, see below but I'm not sure what the total impact is on the other functions like lookup truncate etc which all use `exists()`. Also `view_names()` doesn't look for views in attached or temp databases. ```python def table_names(self, fts4: bool = False, fts5: bool = False) -> List[str]: ""A list of string table names in this database."" where = [""type = 'table'""] if fts4: where.append(""sql like '%USING FTS4%'"") if fts5: where.append(""sql like '%USING FTS5%'"") dbs = [x[1] for x in self.execute('pragma database_list').fetchall()] lst=[] for db in dbs: sql = ""select name from {} where {}"".format(db+"".sqlite_master"","" AND "".join(where)) lst.extend(r[0] for r in self.execute(sql).fetchall()) return lst ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/432/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1257724585,I_kwDOCGYnMM5K91qp,441,Combining `rows_where()` and `search()` to limit which rows are searched,1448859,betatim,closed,0,,,,,4,2022-06-02T06:01:55Z,2022-06-14T21:57:57Z,2022-06-14T21:54:38Z,NONE,,"What is the right way to limit a full text search query to some rows of a table? For example, I have a table that contains the following columns: `title`, `content`, `owner` (each row represents a document). The `owner` column is a username. It feels right to store all documents in one table, instead of having one table per owner. In particular because I'd like to full text search all documents, only documents owned by one user and documents owned by a set of users. I tried to combine `.rows_where(""owner = ?"", ""1234"")` and `.search()` from the `Table` class but I don't think that is meant to work. 
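Written out by hand, the query I'm after looks something like this - a sketch, with `documents`, `documents_fts` and `owner` standing in for my real schema:

```python
import sqlite_utils

db = sqlite_utils.Database('documents.db')

# join the FTS table back to the original rows, then filter on owner
rows = db.query(
    '''
    select documents.* from documents
    join documents_fts on documents.rowid = documents_fts.rowid
    where documents_fts match :query
      and documents.owner = :owner
    ''',
    {'query': 'search terms', 'owner': '1234'},
)
for row in rows:
    print(row)
```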
I discovered `.search_sql()` as a way to generate the FTS SQL statement. By hand I can edit it to add a `AND [original].[owner] = :owner` to the `where` clause. This seems to do what I want. My two questions: 1. is adding a `AND ...` to the `where` clause actually the right thing to do or should I be doing something else (my SQL skills are low)? 2. is there a built-in to sqlite-utils way to achieve this? Right now I am thinking I will make my own version of `search_sql()` that generates a query that contains an additional `owner = :owner` for my particular use-case. Bonus question: is this generally useful/something to add to sqlite-utils or too niche?",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/441/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1243151184,I_kwDOCGYnMM5KGPtQ,434,`detect_fts()` identifies the wrong table if tables have names that are subsets of each other,559711,ryascott,closed,0,,,,,3,2022-05-20T13:28:31Z,2022-06-14T23:24:09Z,2022-06-14T23:24:09Z,NONE,,"Windows 10 Python 3.9.6 When I was running a full text search through the Python library, I noticed that the query was being run on a different full text search table than the one I was trying to search. I took a look at the following function https://github.com/simonw/sqlite-utils/blob/841ad44bacaff05ec79ef78166d12e80c82ba6d7/sqlite_utils/db.py#L2213 and noticed: ```python sql LIKE '%VIRTUAL TABLE%USING FTS%content=%{table}%' ``` My database contains tables with similar names and %{table}% was matching another table that ended differently in its name. I have included a sample test that shows this occurring: I search for Marsupials in db[""books""] and The Clue of the Broken Blade is returned. This occurs since the search for Marsupials was ""successfully"" done against db[""booksb""] and rowid 1 is returned. ""The Clue of the Broken Blade"" has a rowid of 1 in db[""books""] and this is what is returned from the search. ```python def test_fts_search_with_similar_table_names(fresh_db): db = Database(memory=True) db[""books""].insert_all( [ { ""title"": ""The Clue of the Broken Blade"", ""author"": ""Franklin W. 
Dixon"", }, { ""title"": ""Habits of Australian Marsupials"", ""author"": ""Marlee Hawkins"", }, ] ) db[""booksb""].insert( { ""title"": ""Habits of Australian Marsupials"", ""author"": ""Marlee Hawkins"", } ) db[""booksb""].enable_fts([""title"", ""author""]) db[""books""].enable_fts([""title"", ""author""]) query = ""Marsupials"" assert [ { ""rowid"": 1, ""title"": ""Habits of Australian Marsupials"", ""author"": ""Marlee Hawkins"", }, ] == list(db[""books""].search(query)) ``` ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/434/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1243704847,I_kwDOCGYnMM5KIW4P,435,Switch to Furo documentation theme,9599,simonw,closed,0,,,,,2,2022-05-20T21:46:39Z,2022-05-20T21:56:10Z,2022-05-20T21:54:43Z,OWNER,,"As seen in: - https://github.com/simonw/datasette/issues/1746 - https://github.com/simonw/shot-scraper/issues/77",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/435/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1243715381,I_kwDOCGYnMM5KIZc1,436,"Add ""copy to clipboard"" button to code examples in documentation",9599,simonw,closed,0,,,,,0,2022-05-20T21:53:23Z,2022-05-20T21:57:53Z,2022-05-20T21:57:53Z,OWNER,,"Follows: - #435 Imitates: - https://github.com/simonw/datasette/issues/1748 I'll use https://github.com/executablebooks/sphinx-copybutton - here's the Datasette commit: https://github.com/simonw/datasette/commit/1465fea4798599eccfe7e8f012bd8d9adfac3039",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/436/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1250161887,I_kwDOCGYnMM5Kg_Tf,438,illegal UTF-16 surrogate,4068,frafra,closed,0,,,,,2,2022-05-26T22:49:52Z,2022-05-27T08:21:53Z,2022-05-27T08:21:53Z,NONE,,"I am trying to insert `https://artsdatabanken.no/Fab2018/api/export/csv` into a SQLite database, but I have an error when using `sqlite-utils`: ``` sqlite-utils insert --csv --delimiter "";"" --encoding=""utf-16-le"" --pk ""Id"" csv fremmedart test.db [------------------------------------] 0% Error: 'utf-16-le' codec can't decode bytes in position 98-99: illegal UTF-16 surrogate The input you provided uses a character encoding other than utf-8. You can fix this by passing the --encoding= option with the encoding of the file. If you do not know the encoding, running 'file filename.csv' may tell you. It's often worth trying: --encoding=latin-1 ``` I tried to convert the file using `iconv -f ""utf-16le"" -t ""utf-8""`, but I still get a similar error (slightly different position): ``` sqlite-utils insert --csv --delimiter "";"" --encoding=utf-8 --pk ""Id"" csv_utf8 fremmedart test.db [------------------------------------] 0% Error: 'utf-8' codec can't decode byte 0xd9 in position 99: invalid continuation byte The input you provided uses a character encoding other than utf-8. You can fix this by passing the --encoding= option with the encoding of the file. If you do not know the encoding, running 'file filename.csv' may tell you. 
It's often worth trying: --encoding=latin-1 ``` I have no issues reading such file using this Python code: ```python content = open('csv', encoding='utf-16-le').read()) ``` `in2csv` works too.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/438/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1250495688,I_kwDOCGYnMM5KiQzI,439,Misleading progress bar against utf-16-le CSV input,4068,frafra,open,0,,,,,12,2022-05-27T08:34:49Z,2022-06-15T03:53:43Z,,NONE,,"The program crashes without any error. ``` wget ""https://artsdatabanken.no/Fab2018/api/export/csv"" sqlite-utils create-database test.db sqlite-utils insert --csv --delimiter "";"" --encoding ""utf-16-le"" test test.db csv [------------------------------------] 0% [#################-------------------] 49% 00:00:01 ``` I would like to highlight various issues: 1. sqlite-utils catches exceptions without printing the stacktrace and/or reraising the exception, so there is no easy way to use `pdb` or similar to debug the program, solution: add a debug option 2. Silent crash: this is related to (1.), and it happens when there is a catch-all mechanism; solution: let the program fail.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/439/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1250629388,I_kwDOCGYnMM5KixcM,440,CSV files with too many values in a row cause errors,4068,frafra,closed,0,,,,,20,2022-05-27T10:54:44Z,2022-06-14T22:23:01Z,2022-06-14T20:12:46Z,NONE,,"*Original title: csv.DictReader can have None as key* In some cases, `csv.DictReader` can have `None` as key for unnamed columns, and a list of values as value. 
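A quick demonstration of that stdlib behaviour (the extra values fall under the default `restkey` of `None`):

```python
import csv
import io

# the header declares two columns but the data row carries three values
reader = csv.DictReader(io.StringIO('a,b\n1,2,3'))
print(next(reader))
# {'a': '1', 'b': '2', None: ['3']}
```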
`sqlite_utils.utils.rows_from_file` cannot handle that: ```python url=""https://artsdatabanken.no/Fab2018/api/export/csv"" db = sqlite_utils.Database("":memory"") with urlopen(url) as fab: reader, _ = sqlite_utils.utils.rows_from_file(fab, encoding=""utf-16le"") db[""fab2018""].insert_all(reader, pk=""Id"") ``` Result: ``` Traceback (most recent call last): File """", line 3, in File ""/home/user/.local/pipx/venvs/sqlite-utils/lib/python3.8/site-packages/sqlite_utils/db.py"", line 2924, in insert_all chunk = list(chunk) File ""/home/user/.local/pipx/venvs/sqlite-utils/lib/python3.8/site-packages/sqlite_utils/db.py"", line 3454, in fix_square_braces if any(""["" in key or ""]"" in key for key in record.keys()): File ""/home/user/.local/pipx/venvs/sqlite-utils/lib/python3.8/site-packages/sqlite_utils/db.py"", line 3454, in if any(""["" in key or ""]"" in key for key in record.keys()): TypeError: argument of type 'NoneType' is not iterable ``` Code: https://github.com/simonw/sqlite-utils/blob/59be60c471fd7a2c4be7f75e8911163e618ff5ca/sqlite_utils/db.py#L3454 `sqlite-utils insert` from command line is not affected by this issue.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/440/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1269886084,I_kwDOCGYnMM5LsOyE,442,`maximize_csv_field_size_limit()` utility function,9599,simonw,closed,0,,,,,2,2022-06-13T19:54:54Z,2022-06-14T21:55:15Z,2022-06-14T21:31:49Z,OWNER,,"This code here runs only if `cli.py` is imported: https://github.com/simonw/sqlite-utils/blob/7ddf5300886a32d6daf60cf1d71efe492b65c87e/sqlite_utils/cli.py#L50-L59 I found myself needing the same fix in another library: - https://github.com/simonw/datasette-socrata/issues/13 It should be a documented utility function.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/442/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1269998342,I_kwDOCGYnMM5LsqMG,443,Make `utils.rows_from_file()` a documented API,9599,simonw,closed,0,,,,,2,2022-06-13T21:53:24Z,2022-06-20T19:49:37Z,2022-06-14T20:12:46Z,OWNER,,"> `rows_from_file()` isn't part of the documented API but maybe it should be! _Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/440#issuecomment-1154385916_",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/443/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1271426387,I_kwDOCGYnMM5LyG1T,444,CSV `extras_key=` and `ignore_extras=` equivalents for CLI tool,9599,simonw,open,0,,,,,5,2022-06-14T22:22:47Z,2022-07-07T16:39:18Z,,OWNER,,"> I forgot to add equivalents of `extras_key=` and `ignore_extras=` to the CLI tool - will do that in a separate issue. 
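For reference, the Python side looks roughly like this - a sketch from memory, so treat the exact keyword names as an assumption:

```python
import io
from sqlite_utils.utils import rows_from_file

# assumption: rows_from_file() accepts ignore_extras= and extras_key= after the #440 fix
rows, format = rows_from_file(
    io.BytesIO(b'a,b\n1,2,3'),
    extras_key='rest',  # surplus values are collected under this key
)
print(list(rows))  # something like [{'a': '1', 'b': '2', 'rest': ['3']}]
```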
_Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/440#issuecomment-1155767915_",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/444/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1277295119,I_kwDOCGYnMM5MIfoP,445,`sqlite_utils.utils.TypeTracker` should be a documented API,9599,simonw,closed,0,,,,,3,2022-06-20T19:08:28Z,2022-06-20T19:49:02Z,2022-06-20T19:46:58Z,OWNER,,"I've used it in a couple of external places now: - https://github.com/simonw/datasette-socrata/blob/32fb256a461bf0e790eca10bdc7dd9d96c20f7c4/datasette_socrata/__init__.py#L264-L280 - https://github.com/simonw/datasette-lite/blob/caa8eade10f0321c64f9f65c4561186f02d57c5b/webworker.js#L55-L64 Refs: - https://github.com/simonw/datasette-lite/issues/32",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/445/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1277328147,I_kwDOCGYnMM5MInsT,446,Use Just to automate running tests and linters locally,9599,simonw,closed,0,,,,,2,2022-06-20T19:51:09Z,2022-06-21T19:28:35Z,2022-06-20T19:54:50Z,OWNER,,I keep committing code that fails additional tests like `mypy` and `flake8` and `black`. Automate those using Just.,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/446/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1278571700,I_kwDOCGYnMM5MNXS0,447,Incorrect syntax highlighting in docs CLI reference,9599,simonw,closed,0,,,,,3,2022-06-21T14:53:10Z,2022-06-21T18:48:47Z,2022-06-21T18:48:46Z,OWNER,,"https://sqlite-utils.datasette.io/en/stable/cli-reference.html#insert ![CE020DDA-27FB-49C3-9EA6-37457DC4C321](https://user-images.githubusercontent.com/9599/174830380-06530537-b870-41c0-a8af-03c7fa720c6f.jpeg) It looks like Python keywords are being incorrectly highlighted here.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/447/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1279144769,I_kwDOCGYnMM5MPjNB,448,Reading rows from a file => AttributeError: '_io.StringIO' object has no attribute 'readinto',236907,mungewell,closed,0,,,,,5,2022-06-21T21:48:27Z,2023-05-08T22:01:00Z,2023-05-08T22:01:00Z,NONE,,"Attempting to run the example given here (without extra bracket ;-): https://sqlite-utils.datasette.io/en/stable/python-api.html#reading-rows-from-a-file ``` from sqlite_utils.utils import rows_from_file import io rows, format = rows_from_file(io.StringIO(""id,name\n1,Cleo"")) print(list(rows), format) # Outputs [{'id': '1', 'name': 'Cleo'}] Format.CSV ``` Gives error ``` >""c:\Program Files\Python37\python.exe"" test2.py Traceback (most recent call last): File ""test2.py"", line 4, in rows, format = rows_from_file(io.StringIO(""id,name\n1,Cleo"")) File ""C:\Users\swood\Downloads\sqlite-utils-main-20220621\sqlite-utils-main\sqlite_utils\utils.py"", line 300, in rows_from_file first_bytes = buffered.peek(2048).strip() AttributeError: '_io.StringIO' object has no attribute 'readinto' ``` I am running Python on Windows. 
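For what it's worth, the same call appears to run if the stream is binary, since the format sniffing reads raw bytes - a guess at the cause rather than a confirmed diagnosis:

```python
import io
from sqlite_utils.utils import rows_from_file

# io.BytesIO supports readinto(), which the buffered peek requires; io.StringIO does not
rows, format = rows_from_file(io.BytesIO(b'id,name\n1,Cleo'))
print(list(rows), format)
```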
``` >""c:\Program Files\Python37\python.exe"" Python 3.7.4 (tags/v3.7.4:e09359112e, Jul 8 2019, 20:34:20) [MSC v.1916 64 bit (AMD64)] on win32 Type ""help"", ""copyright"", ""credits"" or ""license"" for more information. ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/448/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1279863844,I_kwDOCGYnMM5MSSwk,449,Utilities for duplicating tables and creating a table with the results of a query,1690072,davidleejy,closed,0,,,,,4,2022-06-22T09:41:43Z,2022-07-15T21:46:13Z,2022-07-15T21:21:36Z,CONTRIBUTOR,,"is there a duplicate table functionality? Otherwise, I'd be happy to submit a PR. In sqlite3 it would look like: ```python import sqlite3 as sl con = sl.connect('prompt-tune.db') def db_duplicate_table(table_name, table_name_new, con=con): # Duplicates table `table_name` to a new table `table_name_new`. try: cur = con.cursor() cur.execute(f""""""CREATE TABLE {table_name_new} AS SELECT * FROM {table_name}"""""") except Exception as e: print(e) finally: cur.close() db_duplicate_table('orig_table', 'new_table') ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/449/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1306548397,I_kwDOCGYnMM5N4Fit,454,CLI command for duplicating tables,9599,simonw,closed,0,,,,,1,2022-07-15T21:31:27Z,2022-07-15T21:48:23Z,2022-07-15T21:45:51Z,OWNER,,"CLI equivalent of: - #449",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/454/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1292060682,I_kwDOCGYnMM5NA0gK,450,Add --ignore option to more commands,9599,simonw,closed,0,,,,,9,2022-07-02T13:52:02Z,2022-07-15T22:39:09Z,2022-07-15T22:37:45Z,OWNER,,"As seen in https://sqlite-utils.datasette.io/en/stable/cli-reference.html#add-foreign-key Could make this TIL trick unnecessary: https://til.simonwillison.net/bash/ignore-errors",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/450/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1298531653,I_kwDOCGYnMM5NZgVF,451,Make sqlite_utils.utils.chunks a documented function,9599,simonw,closed,0,,,,,2,2022-07-08T06:01:04Z,2022-07-15T22:09:34Z,2022-07-15T21:59:33Z,OWNER,,I want to use it in another project: https://github.com/simonw/sqlite-utils/blob/8a9fe6498faf783a1fdeb1793e661ad194a05267/sqlite_utils/utils.py#L471-L474,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/451/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1303169663,I_kwDOCGYnMM5NrMp_,453,'unclosed file' warning when using insert_upsert_implementation from Python,311257,makkus,closed,0,,,,,1,2022-07-13T09:34:35Z,2022-07-15T21:52:25Z,2022-07-15T21:52:21Z,NONE,,"I'm using the `[insert_upsert_implementation](https://github.com/simonw/sqlite-utils/blob/main/sqlite_utils/cli.py)` function directly in my Python code to import a csv file with all the bells and 
whistles `sqlite-utils` provides, but I'm getting a resource warning that a io.TextWrapper object is not closed. The warning goes away when wrapping the code from [this line](https://github.com/simonw/sqlite-utils/blob/42440d6345c242ee39778045e29143fb550bd2c2/sqlite_utils/cli.py#L924) in a try/finally block like: ``` try: ... ... finally: decoded.close() ``` (might be that `sniff_buffer` must also be closed if non null, but I might be wrong) I suspect Python closes the reference automatically when the sqlite-utils cli run is done, but since my code doesn't exit, I'm getting the warning. Alternatively, it'd be cool if the 'import csv/tsv' functionality could be added directly to the Database class.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/453/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1324659241,I_kwDOCGYnMM5O9LIp,459,Single quoted transform recipes on Windows do not work as expected ,19921,shakeel,open,0,,,,,0,2022-08-01T16:14:54Z,2022-08-01T16:14:54Z,,CONTRIBUTOR,,"Trying to follow the tutorial for sqlite-utils and datasette https://datasette.io/tutorials/clean-data on Windows 11 OS `Microsoft Windows [Version 10.0.22622.440]`, with sqlite-utils and datasette installed using pipx. ``` pipx list package datasette 0.61.1, installed using Python 3.10.4 - datasette.exe package sqlite-utils 3.28, installed using Python 3.10.4 - sqlite-utils.exe ``` In the step to transform dates into ISO dates the quoted value `'r.parsedatetime(value)'` is copied verbatim into the columns instead of applying the output of the Python recipe. ``` sqlite-utils convert manatees.db locations \ REPDATE created_date last_edited_date \ 'r.parsedatetime(value)' --dry-run 1975/01/31 00:00:00+00 --- becomes: r.parsedatetime(value) Would affect 13568 rows ``` However, if I change the code from single quotes to double quotes, it works as expected. ``` sqlite-utils convert manatees.db locations \ REPDATE created_date last_edited_date \ ""r.parsedatetime(value)"" --dry-run 1975/01/31 00:00:00+00 --- becomes: 1975-01-31T00:00:00+00:00 Would affect 13568 rows ``` Specifying the transform code recipe should work with single quotes on Windows.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/459/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1310243385,I_kwDOCGYnMM5OGLo5,456,feature request: pivot command,536941,fgregg,open,0,,,,,5,2022-07-20T00:58:08Z,2022-07-20T17:50:50Z,,CONTRIBUTOR,,pivoting long-format table to wide-format tables is pretty common and kind of pain. 
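the usual workaround is conditional aggregation, something like this sketch (`measurements`, `height` and `weight` are made-up names):

```python
import sqlite_utils

db = sqlite_utils.Database('data.db')

# pivot a long-format table measurements(id, key, value) into one row per id
wide = db.query(
    '''
    select id,
      max(case when key = 'height' then value end) as height,
      max(case when key = 'weight' then value end) as weight
    from measurements
    group by id
    '''
)
for row in wide:
    print(row)
```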
would love to see this feature in sqlite-utils!,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/456/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1320243134,I_kwDOCGYnMM5OsU--,458,Support custom names for registered functions,9599,simonw,closed,0,,,8355157,3.29,1,2022-07-28T00:13:00Z,2022-08-27T03:56:01Z,2022-07-28T00:13:57Z,OWNER,,"In this example: ```python @db.register_function def reverse_string(s): return """".join(reversed(list(s))) print(db.execute('select reverse_string(""hello"")').fetchone()[0]) ``` There's currently no way to over-ride the automatically selected name for the SQL function.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/458/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1326349129,I_kwDOCGYnMM5PDntJ,461,Consider including animated SVG console demos,9599,simonw,open,0,,,,,1,2022-08-02T20:10:04Z,2022-08-02T20:12:14Z,,OWNER,,"I recorded this one using https://github.com/nbedos/termtosvg - with `pipx install termtosvg` and then `termtosvg` - execute demo - `exit` to save. ![sqlite-utils-insert-json](https://user-images.githubusercontent.com/9599/182464206-f4976af4-eda8-4020-8257-4ada1867fb44.svg) ```json [ { ""id"": 1, ""name"": ""Catimus"" }, { ""id"": 2, ""name"": ""Feliopia"" } ] ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/461/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1338001039,I_kwDOCGYnMM5PwEaP,464,Link from documentation to source code,9599,simonw,closed,0,,,,,5,2022-08-13T16:19:57Z,2022-08-17T23:38:03Z,2022-08-17T23:38:03Z,OWNER,,Twitter conversation asking for ways to automate this here: https://twitter.com/simonw/status/1558260492015046656,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/464/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1348169997,I_kwDOCGYnMM5QW3EN,467,Mechanism for ensuring a table has all the columns,9599,simonw,closed,0,,,8355157,3.29,13,2022-08-23T15:50:23Z,2022-08-27T23:19:41Z,2022-08-27T23:17:56Z,OWNER,,Suggested by @jefftriplett on Discord: https://discord.com/channels/823971286308356157/997738192360964156/1011655389063958600,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/467/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1352932716,I_kwDOCGYnMM5QpB1s,471,sqlite-utils query --functions mechanism for registering extra functions,9599,simonw,closed,0,,,8355157,3.29,12,2022-08-27T03:57:53Z,2022-09-07T03:46:26Z,2022-08-27T05:10:57Z,OWNER,,"It would be really cool if you could register additional custom SQL functions for use with the `sqlite-utils query` command - something like this: ``` sqlite-utils data.db 'update images set domain = extract_domain(url)' --functions ' from urllib.parse import urlparse def extract_domain(url): return urlparse(url).netloc ' ``` Every function defined in that code block would be registered with the connection, unless the 
name began with an underscore.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/471/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1352931464,I_kwDOCGYnMM5QpBiI,469,sqlite-utils rows --order option,9599,simonw,closed,0,,,8355157,3.29,1,2022-08-27T03:49:51Z,2022-08-27T04:30:49Z,2022-08-27T04:10:32Z,OWNER,,"For consistency with `search`: https://sqlite-utils.datasette.io/en/stable/cli-reference.html#search ``` -o, --order TEXT Order by ('column' or 'column desc') ``` I wanted to run `sqlite-utils rows db.db mytable --order 'rowid desc'` to see the most recently imported rows.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/469/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1352932038,I_kwDOCGYnMM5QpBrG,470,Upgrade `--load-extension` to accept entrypoints like Datasette,9599,simonw,closed,0,,,8355157,3.29,6,2022-08-27T03:53:20Z,2022-08-27T05:55:49Z,2022-08-27T05:55:48Z,OWNER,,"Imitate: - https://github.com/simonw/datasette/pull/1789 ``` # would load default entrypoint like before datasette data.db --load-extension ext # loads the extensions with the ""sqlite3_foo_init"" entrypoint datasette data.db --load-extension ext:sqlite3_foo_init # loads the extensions with the ""sqlite3_bar_init"" entrypoint datasette data.db --load-extension ext:sqlite3_bar_init ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/470/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1352946135,I_kwDOCGYnMM5QpFHX,472,Reuse the locals/globals fix from --functions for other code accepting options,9599,simonw,closed,0,,,8355157,3.29,2,2022-08-27T05:12:05Z,2022-08-27T05:20:12Z,2022-08-27T05:20:12Z,OWNER,,"I figured out a workaround for the ugly `global x` hack here: - https://github.com/simonw/sqlite-utils/issues/471#issuecomment-1229120653",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/472/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1353074021,I_kwDOCGYnMM5QpkVl,474,Add an option for specifying column names when inserting CSV data,14294,hubgit,open,0,,,,,3,2022-08-27T15:29:59Z,2022-08-31T03:42:36Z,,NONE,,"https://sqlite-utils.datasette.io/en/stable/cli.html#csv-files-without-a-header-row > The first row of any CSV or TSV file is expected to contain the names of the columns in that file. > If your file does not include this row, you can use the `--no-headers` option to specify that the tool should not use that first row as headers. > If you do this, the table will be created with column names called `untitled_1` and `untitled_2` and so on. You can then rename them using the `sqlite-utils transform ... --rename` command. It would be nice to be able to specify the column names when importing CSV/TSV without a header row, via an extra command line option.
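In the meantime, one workaround in the Python API is to hand the column names to `csv.DictReader` yourself instead of renaming afterwards - a minimal sketch, with file, table and column names invented for illustration:

```python
import csv
import sqlite_utils

db = sqlite_utils.Database('data.db')
with open('no_headers.csv', newline='') as f:
    # Supplying fieldnames means every row, including the first,
    # is treated as data rather than as a header row
    rows = csv.DictReader(f, fieldnames=['id', 'name', 'score'])
    db['mytable'].insert_all(rows)
```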
(renaming a column of a large table can take a long time, which makes it an inconvenient workaround)",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/474/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1353441389,I_kwDOCGYnMM5Qq-Bt,477,Conda Forge,49702524,thewchan,closed,0,,,,,2,2022-08-28T19:03:08Z,2022-09-07T03:46:55Z,2022-09-07T03:46:55Z,NONE,,"Hello! I have successfully put this package onto Conda Forge, and I am extending the invitation for the owner/maintainers of this package to be maintainers on Conda Forge as well. Let me know if you are interested! Thanks. https://github.com/conda-forge/sqlite-utils-feedstock",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/477/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1353189941,I_kwDOCGYnMM5QqAo1,475,table.default_values introspection property,9599,simonw,closed,0,,,8355157,3.29,1,2022-08-27T22:33:31Z,2022-08-27T22:44:46Z,2022-08-27T22:43:02Z,OWNER,,"> Interesting challenge with `default_value`: I need to be able to tell if the default values passed to `.create()` differ from those in the database already. > > Introspecting that is a bit tricky: > > ```pycon > >>> import sqlite_utils > >>> db = sqlite_utils.Database(memory=True) > >>> db[""blah""].create({""id"": int, ""name"": str}, not_null=(""name"",), defaults={""name"": ""bob""}) >
> >>> db[""blah""].columns > [Column(cid=0, name='id', type='INTEGER', notnull=0, default_value=None, is_pk=0), Column(cid=1, name='name', type='TEXT', notnull=1, default_value=""'bob'"", is_pk=0)] > ``` > Note how a default value of the Python string `bob` is represented in the results of `PRAGMA table_info()` as `default_value=""'bob'""` - it's got single quotes added to it! > > So comparing default values from introspecting the database needs me to first parse that syntax. This may require a new table introspection method. _Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/468#issuecomment-1229279539_",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/475/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1353196970,I_kwDOCGYnMM5QqCWq,476,Release notes for 3.29,9599,simonw,closed,0,,,8355157,3.29,2,2022-08-27T23:21:21Z,2022-08-28T04:07:15Z,2022-08-28T04:07:03Z,OWNER,,https://github.com/simonw/sqlite-utils/compare/3.28...104f37fa4d2e7e5999c1d829267b62c737f74d3e,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/476/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1353481513,I_kwDOCGYnMM5QrH0p,478,`sqlite-utils tables data.db table1 table2`,9599,simonw,open,0,,,,,1,2022-08-28T22:05:53Z,2022-08-28T22:22:35Z,,OWNER,,"The `sqlite-utils tables` command currently lists all tables. If you have a huge table in there then running it with `--counts` can get expensive, because of the huge table. Would be useful if it could accept an optional list of tables that it should execute against, as an alternative to the default of all of them. This should be a backwards compatible change. Current design is: https://sqlite-utils.datasette.io/en/stable/cli-reference.html#tables ``` Usage: sqlite-utils tables [OPTIONS] PATH List the tables in the database Example: sqlite-utils tables trees.db ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/478/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1355193529,I_kwDOCGYnMM5Qxpy5,479,OperationalError: cannot VACUUM from within a transaction,7908073,chapmanjacobd,open,0,,,,,0,2022-08-30T05:34:24Z,2022-08-30T05:34:24Z,,CONTRIBUTOR,,"Maybe when calling `.vacuum()` and other DB-level write-lock operations `sqlite_utils` could guard against this error message by automatically committing first? ``` 46 db[""media""].optimize() # type: ignore ---> 47 db.vacuum() File ~/.local/lib/python3.10/site-packages/sqlite_utils/db.py:1047, in Database.vacuum(self) 1045 def vacuum(self): 1046 ""Run a SQLite ``VACUUM`` against the database."" -> 1047 self.execute(""VACUUM;"") File ~/.local/lib/python3.10/site-packages/sqlite_utils/db.py:470, in Database.execute(self, sql, parameters) 468 return self.conn.execute(sql, parameters) 469 else: --> 470 return self.conn.execute(sql) OperationalError: cannot VACUUM from within a transaction ``` It might also be nice to add a sentence or two about how transactions are committed on the [docs page](https://sqlite-utils.datasette.io/en/latest/python-api.html#detect-fts). 
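A minimal sketch of that caller-side guard, assuming the pending implicit transaction is what trips `VACUUM` (table and file names invented):

```python
import sqlite_utils

db = sqlite_utils.Database('media.db')
db['media'].insert_all([{'path': '/tmp/a'}, {'path': '/tmp/b'}], pk='path')

# VACUUM must run outside a transaction, so end any open
# implicit transaction on the underlying connection first
db.conn.commit()
db.vacuum()
```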
When I was swapping out my sqlite3 code for this library it was nice that everything was pretty much drop-in but I was/am unsure what to do about the places I explicitly call `.commit()` in my code Related to https://github.com/simonw/sqlite-utils/issues/121",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/479/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1373224657,I_kwDOCGYnMM5R2b7R,488,`sqlite-utils transform` should set empty strings to null when converting text columns to integer/float,9599,simonw,open,0,,,,,5,2022-09-14T15:51:30Z,2022-12-23T17:38:55Z,,OWNER,,"``` /tmp % echo ""id,age,weight\n1,3,2.5\n2,,"" | sqlite-utils insert test.db test - --csv /tmp % sqlite-utils schema test.db CREATE TABLE [test] ( [id] TEXT, [age] TEXT, [weight] TEXT ); /tmp % sqlite-utils transform test.db test --type age integer --type weight float /tmp % sqlite-utils schema test.db CREATE TABLE ""test"" ( [id] TEXT, [age] INTEGER, [weight] FLOAT ); /tmp % sqlite-utils rows test.db test [{""id"": ""1"", ""age"": 3, ""weight"": 2.5}, {""id"": ""2"", ""age"": """", ""weight"": """"}] ``` It would be neat if this resulted in the following instead: ``` {""id"": ""2"", ""age"": null, ""weight"": null} ``` Related Discord discussion: https://discord.com/channels/823971286308356157/823971286941302908/1019635490833567794",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/488/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1374939463,I_kwDOCGYnMM5R8-lH,489,Ability to load JSON records held in a file with a single top level key that is a list of objects,9599,simonw,open,0,,,,,9,2022-09-15T18:46:03Z,2022-09-15T20:56:10Z,,OWNER,,"It's very common for JSON to look like this: ```json { ""Version"": ""5.5.52.6"", ""List"": [ { ""Description"": ""Nonpartisan"", ""Id"": 1, ""ExternalId"": """" }, { ""Description"": ""Undeclared"", ""Id"": 2, ""ExternalId"": """" } ] } ``` This example taken from the records downloaded from https://www.elections.alaska.gov/election-results/e/ Right now you can't import this into `sqlite-utils` - you need to run it through `jq .List` first. 
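A small Python sketch of that unwrapping as a client-side workaround, with file and table names assumed:

```python
import json
import sqlite_utils

db = sqlite_utils.Database('elections.db')
with open('results.json') as f:
    data = json.load(f)

# If the document is an object with exactly one key whose value
# is a list, unwrap that list and insert its items as rows
if isinstance(data, dict) and len(data) == 1:
    (value,) = data.values()
    if isinstance(value, list):
        data = value

db['records'].insert_all(data)
```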
But since this is so common, it would be neat if `sqlite-utils` could have a rule of thumb that says ""if it's an object, but it has a single key that is a list of objects, use that instead"".",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/489/reactions"", ""total_count"": 2, ""+1"": 2, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1359604075,I_kwDOCGYnMM5RCelr,481,"Idea: `sqlite-utils create-table tablename --sql ""select ...""`",9599,simonw,open,0,,,,,0,2022-09-02T01:41:24Z,2022-09-02T01:42:08Z,,OWNER,,"Could offer syntactic sugar for: ```sql create table foo as select * from bar ``` ``` sqlite-utils create-table data.db foo --sql ""select * from bar"" ``` https://sqlite-utils.datasette.io/en/stable/cli-reference.html#create-table",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/481/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1361355564,I_kwDOCGYnMM5RJKMs,482,balanced table default column_order,7908073,chapmanjacobd,closed,0,,,,,1,2022-09-05T03:00:18Z,2022-10-10T17:43:02Z,2022-09-06T20:17:27Z,CONTRIBUTOR,,"Is there any performance or size difference with column order in SQLite? similar to this https://www.cybertec-postgresql.com/en/column-order-in-postgresql-does-matter/ It might be interesting to have an option to create with an optimized column order. I'm assuming this would look something like INTEGER columns, REAL columns, BLOB columns, TEXT columns, NULL columns. NULL columns at the end because they are more likely to be TEXT and it is impossible to know if they will become INTEGER (Of course, any schema evolution would reduce optimization but maybe column order could also be re-evaluated when schema changes) edit: this is easy to accomplish with the existing `transform` method: ``` int_columns = [k for k, v in table_columns.items() if v == int] db[table].transform(column_order=[*int_columns]) ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/482/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1363766973,I_kwDOCGYnMM5RSW69,484,Expose convert recipes to `sqlite-utils --functions`,9599,simonw,open,0,,,,,11,2022-09-06T20:15:08Z,2022-09-07T19:09:52Z,,OWNER,,"`--functions` was added in: - #471 It would be useful if the `r.jsonsplit()` and similar recipes for `sqlite-utils convert` could be used in these blocks of code too: https://sqlite-utils.datasette.io/en/stable/cli.html#sqlite-utils-convert-recipes",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/484/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1363765916,I_kwDOCGYnMM5RSWqc,483,`sqlite-utils install` command,9599,simonw,closed,0,,,,,2,2022-09-06T20:13:55Z,2022-09-26T19:04:43Z,2022-09-26T18:57:15Z,OWNER,,"With the addition of `--functions` in: - #471 In addition to the existing `convert` command, there are now very good reasons to want to install additional packages into the same virtual environment as `sqlite-utils` itself, to allow them to be used with those features. This isn't easy if you installed the tool with `pipx` or `brew install sqlite-utils`.
Datasette solved this problem with the `datasette install` command: - https://github.com/simonw/datasette/issues/925 `sqlite-utils` could benefit from the same idea.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/483/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1366423176,I_kwDOCGYnMM5RcfaI,485,Progressbar not shown when inserting/upserting jsonlines file,99098079,MischaU8,closed,0,,,,,1,2022-09-08T14:13:18Z,2022-09-15T20:39:52Z,2022-09-15T20:37:52Z,CONTRIBUTOR,,"When inserting or upserting a jsonlines file, no progressbar is shown. Expected behavior is that, just like with .csv/.tsv files, a progressbar is also shown for a jsonlines file (--nl) unless --silent is provided. ```bash sqlite-utils upsert mydb.db posts posts.jl --nl --pk post_id (silence) ``` Currently `file_progress` is only called within the tsv/csv logic, however I think it can be safely wrapped around all the input formats that use `decoded`: https://github.com/simonw/sqlite-utils/blob/main/sqlite_utils/cli.py#L963",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/485/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1367835380,I_kwDOCGYnMM5Rh4L0,487,Specify foreign key against compound key in other table,540968,ryanfox,closed,0,,,,,2,2022-09-09T13:32:09Z,2022-09-11T04:00:44Z,2022-09-11T04:00:44Z,NONE,,"When inserting rows via the library, is it possible to specify a foreign key to a compound primary key? For example, suppose I create a table: ``` db = Database('events.db') db['events'].insert_all([ {'venue': 'Times Square', 'date': '2022-12-31', 'title': 'Rockin New Year Eve'}, {'venue': 'Wembley Stadium', 'date': '2022-06-05', 'title': 'FA Cup'}, {'venue': 'Times Square', 'date': '2021-12-31', 'title': 'Rockin New Year Eve'}, ], pk=('date', 'venue')) ``` And I want to add related data in another table: ``` act = {'name': 'Rick Astley', 'venue': 'Times Square', 'date': '2021-12-31' } db['performers'].insert(act, pk=) ``` Is it possible to specify a value for `pk` that will point to the compound primary key in `events`? SQLite does support it: https://www.sqlite.org/foreignkeys.html#fk_composite",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/487/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1382457780,I_kwDOCGYnMM5SZqG0,490,Ability to insert multi-line files,6180701,jeqo,closed,0,,,,,4,2022-09-22T13:29:22Z,2022-09-26T18:24:44Z,2022-09-23T16:37:58Z,NONE,,"I was looking into how to parse application log files that contain multiline text (e.g. Java stack traces) into sqlite. I can see that at the moment `--lines` helps, but falls short when processing multi-line texts. I wonder if this functionality would be useful for sqlite-utils. A similar approach to Elastic logstash/filebeat can be adopted: https://www.elastic.co/guide/en/beats/filebeat/current/multiline-examples.html Potential changes: - add a `--multiline` option - additional properties for - multiline-pattern (regex expression) - multiline-negate: true/false - multiline-what: previous or next Or if this is achievable in a different way, please share.
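Until something like `--multiline` exists, one rough way to get the same effect from Python is to split the text on a start-of-record pattern - a sketch only, where the pattern, file and table names are all assumptions:

```python
import re
import sqlite_utils

db = sqlite_utils.Database('logs.db')
with open('app.log') as f:
    text = f.read()

# Treat each line beginning with '[' as the start of a record;
# everything up to the next such line (stack traces included)
# stays attached to that record
starts = [m.start() for m in re.finditer(r'^\[', text, re.MULTILINE)]
db['logs'].insert_all(
    {'entry': text[a:b].rstrip('\n')}
    for a, b in zip(starts, starts[1:] + [len(text)])
)
```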
Thanks!",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/490/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1383646615,I_kwDOCGYnMM5SeMWX,491,Ability to merge databases and tables,8904453,sgraaf,open,0,,,,,7,2022-09-23T11:10:55Z,2023-06-14T22:14:24Z,,NONE,,"Hi! Let me firstly say that I am a big fan of your work -- I follow your tweets and blog posts with great interest 😄. Now onto the matter at hand: I think it would be great if `sqlite-utils` included a `merge` or `combine` command, with the purpose of combining different SQLite databases into a single SQLite database. This way, the newly ""merged"" database would contain all differently named tables contained in the databases to be merged as-is, as well as a concatenation of all tables of the same name. This could look something like this: ```bash sqlite-utils merge cats.db dogs.db > animals.db ``` I imagine this is rather straightforward if all databases involved in the merge contain differently named tables (i.e. no chance of conflicts), but things get slightly more complicated if two or more of the databases to be merged contain tables with the same name. Not only do you have to ""do something"" with the primary key(s), but these tables could also simply have different schemas (and therefore be incompatible for concatenation to begin with). Anyhow, I would love your thoughts on this, and, if you are open to it, work together on the design and implementation!",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/491/reactions"", ""total_count"": 2, ""+1"": 2, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1386530156,I_kwDOCGYnMM5SpMVs,492,Idea: ability to pass extra variables to `--convert` scripts,9599,simonw,open,0,,,,,1,2022-09-26T18:30:45Z,2022-09-26T18:33:19Z,,OWNER,,"Got this idea from this example in https://jeqo.github.io/notes/2022-09-24-ingest-logs-sqlite/ ```bash sqlite-utils insert /tmp/kafka-logs.db logs server.log.2022-09-24-21 --text --convert "" import re r = re.compile(r'^\[(?P\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2},\d{3})\] (?P\w+) (?P(.+(\n(?\!\[).+|)+))', re.MULTILINE) def convert(text): rows = [m.groupdict() for m in r.finditer(text)] for row in rows: row.update({'server': 'localhost'}) row.update({'component': 'broker'}) return rows "" ``` And the accompanying note: > The `row.update` allows to label rows as I’m planning to ingest logs from different hosts and potentially different components. This made me think: it might be neat if you could inject additional variable values into that script with extra command-line options, to make this kind of reuse easier.
Something like this: ```bash sqlite-utils insert /tmp/kafka-logs.db logs server.log.2022-09-24-21 --text --convert "" import re r = re.compile(r'^\[(?P\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2},\d{3})\] (?P\w+) (?P(.+(\n(?\!\[).+|)+))', re.MULTILINE) def convert(text): rows = [m.groupdict() for m in r.finditer(text)] for row in rows: row.update({'server': server}) row.update({'component': component}) return rows "" --var server ""localhost"" --var component ""broker"" ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/492/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1386562662,I_kwDOCGYnMM5SpURm,493,Tiny typographical error in install/uninstall docs,9599,simonw,open,0,,,,,3,2022-09-26T19:00:42Z,2022-10-25T21:31:15Z,,OWNER,,"Added in: - #483 I don't know how to fix this in Sphinx: I'm getting this: https://sqlite-utils.datasette.io/en/latest/cli.html#cli-install > The [insert –convert](https://sqlite-utils.datasette.io/en/latest/cli.html#cli-insert-convert) and [query –functions](https://sqlite-utils.datasette.io/en/latest/cli.html#cli-query-functions) options But I want it to display `insert --convert` and not `insert –convert` there. Here's the code: https://github.com/simonw/sqlite-utils/blob/85247038f70d7eb2f3e272cfeaa4c44459cafba8/docs/cli.rst#L2125",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/493/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1386593843,I_kwDOCGYnMM5Spb4z,494,Document how to use Just,9599,simonw,closed,0,,,,,2,2022-09-26T19:25:12Z,2022-09-26T19:32:36Z,2022-09-26T19:26:39Z,OWNER,,"I'm using `just` a lot now, based on this file - I should add that to https://sqlite-utils.datasette.io/en/latest/contributing.html https://github.com/simonw/sqlite-utils/blob/afbd2b2cba45cccb305c3d4638d18db4dd3d4bbd/Justfile#L1-L24",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/494/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1392690202,I_kwDOCGYnMM5TAsQa,495,Support JSON values returned from .convert() functions,649467,mhalle,closed,0,,,,,3,2022-09-30T16:33:49Z,2022-10-25T21:23:37Z,2022-10-25T21:23:28Z,NONE,,"When using the convert function on a JSON column, the result of the conversion function must be a string. If the return value is either a dict (object) or a list (array), the convert call will error out with an unhelpful user defined function exception. It makes sense that since the original column value was a string and required conversion to data structures, the result should be converted back into a JSON string as well. However, other functions auto-convert to JSON string representation, so the fact that convert doesn't could be surprising. At least the documentation should note this requirement, because the sqlite error messages won't readily reveal the issue.
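Until that changes, the workaround is to serialize the structure back to a string inside the conversion function - a minimal sketch, with table and column names assumed:

```python
import json
import sqlite_utils

db = sqlite_utils.Database('data.db')

# convert() callables must currently return a string, so a dict
# or list result has to go through json.dumps() explicitly
db['posts'].convert(
    'tags',
    lambda value: json.dumps(sorted(json.loads(value)))
)
```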
If only sqlite's JSON column type meant something :)",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/495/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1393202060,I_kwDOCGYnMM5TCpOM,496,devrel/python api: Pylance type hinting,7908073,chapmanjacobd,open,0,,,,,4,2022-10-01T03:03:34Z,2023-05-03T05:53:27Z,,CONTRIBUTOR,,"Pylance is generally pretty good at figuring out stuff but `sqlite-utils` has some quirks which make type hinting kinda useless. Maybe you don't care but I thought I would bring it to your attention. For example: ``` db[""subs""].insert_all(subs, pk=""index"") ``` ``` Cannot access member ""insert_all"" for type ""View"" Member ""insert_all"" is unknown ``` `insert_all` and all the other methods show up as type issues because the program can't know whether something is a View or a Table. Fair enough. But that basically throws all type checking out the window. `pk=""index""` also shows up as a type issue: ``` Argument of type ""Literal['index']"" cannot be assigned to parameter ""pk"" of type ""Default"" in function ""insert_all"" ""Literal['index']"" is incompatible with ""Default"" ``` I think this is because DEFAULT is an empty class? maybe a few small changes could be made to make the library more type-friendly The interim solution is of course to turn off type hints completely for the line ``` db[""subs""].insert_all(subs, pk=""index"") # type: ignore ``` ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/496/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1393212964,I_kwDOCGYnMM5TCr4k,497,column_names,7908073,chapmanjacobd,closed,0,,,,,1,2022-10-01T03:34:21Z,2022-10-25T21:09:28Z,2022-10-25T21:09:28Z,CONTRIBUTOR,,"It would be nice to have a `column_names`. Similar to `table_names`. Or if you could get one or all of the following syntax to work for both Database and Table that might be even better: Style 1 - `if 'table1' in db` - `if 'col1' in db['table1']` Style 2 - `if 'table1' in db.tables` - `if 'col1' in db['table1'].columns` maybe the table ones actually work but I'm too lazy to check. I just know that I have to do: `[c.name for c in db['table1'].columns]` Edit: This is possible with `columns_dict`. I have actually used that before but I forgot about it.
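For reference, the spellings that do work today look like this (a sketch; the table and column names are placeholders):

```python
import sqlite_utils

db = sqlite_utils.Database('data.db')

'table1' in db.table_names()          # table membership check
'col1' in db['table1'].columns_dict   # column check via the {name: type} dict
column_names = [c.name for c in db['table1'].columns]
```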
Feel free to close, but I do think accessing this data could be more consistent and intuitive.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/497/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1422954582,I_kwDOCGYnMM5U0JBW,502,Fix tests for Python 3.11,9599,simonw,closed,0,,,,,1,2022-10-25T19:20:31Z,2022-10-25T19:23:47Z,2022-10-25T19:23:47Z,OWNER,,"The way errors are represented has changed: https://github.com/simonw/sqlite-utils/actions/runs/3323588047/jobs/5494127154 ``` _________________________ test_query_invalid_function __________________________ db_path = '/tmp/pytest-of-runner/pytest-0/test_query_invalid_function0/test.db' def test_query_invalid_function(db_path): result = CliRunner().invoke( cli.cli, [db_path, ""select bad()"", ""--functions"", ""def invalid_python""] ) assert result.exit_code == 1 > assert ( result.output.strip() == ""Error: Error in functions definition: invalid syntax (<string>, line 1)"" ) E AssertionError: assert 'Error: Error...ing>, line 1)' == 'Error: Error...ing>, line 1)' E - Error: Error in functions definition: invalid syntax (<string>, line 1) E ? ^^^^^^ ^^^^^^ E + Error: Error in functions definition: expected '(' (<string>, line 1) E ? ^^^^^^^ ^^^ ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/502/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1423000702,I_kwDOCGYnMM5U0UR-,503,test_recreate failing on Windows Python 3.11,9599,simonw,closed,0,,,,,10,2022-10-25T20:01:41Z,2022-10-25T20:47:34Z,2022-10-25T20:45:43Z,OWNER,,"https://github.com/simonw/sqlite-utils/actions/runs/3323672128/jobs/5494726927 Related: - #502 ``` FAILED tests/test_recreate.py::test_recreate[True-True] - PermissionError: [WinError 32] The process cannot access the file because it is being used by another process: 'C:\\Users\\runneradmin\\AppData\\Local\\Temp\\pytest-of-runneradmin\\pytest-0\\test_recreate_True_True_0\\data.db' FAILED tests/test_recreate.py::test_recreate[False-True] - PermissionError: [WinError 32] The process cannot access the file because it is being used by another process: 'C:\\Users\\runneradmin\\AppData\\Local\\Temp\\pytest-of-runneradmin\\pytest-0\\test_recreate_False_True_0\\data.db' ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/503/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1423069384,I_kwDOCGYnMM5U0lDI,504,"db.close() method, calling db.conn.close()",9599,simonw,closed,0,,,,,1,2022-10-25T20:50:50Z,2022-10-25T21:00:29Z,2022-10-25T20:57:47Z,OWNER,,"I ended up needing to use `db.conn.close()` to fix this issue: - #503 I think `.close()` should be a method on `Database` itself.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/504/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1423182778,I_kwDOCGYnMM5U1Au6,505,Release sqlite-utils 
3.30,9599,simonw,closed,0,,,,,2,2022-10-25T22:20:05Z,2022-10-25T22:41:26Z,2022-10-25T22:41:16Z,OWNER,,https://github.com/simonw/sqlite-utils/compare/3.29...defa2974c6d3abc19be28d6b319649b8028dc966,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/505/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1413610718,I_kwDOCGYnMM5UQfze,500,Turn --flatten into a documented utility function,9599,simonw,closed,0,,,,,4,2022-10-18T17:43:36Z,2022-10-18T18:02:10Z,2022-10-18T18:00:40Z,OWNER,,The `--flatten` implementation isn't currently available to Python code - people have to roll their own implementation. Feedback from a conversation at DjangoCon.,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/500/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1413641049,I_kwDOCGYnMM5UQnNZ,501,Tests failing due to updated tabulate library,9599,simonw,closed,0,,,,,4,2022-10-18T18:07:52Z,2022-10-18T18:23:40Z,2022-10-18T18:23:40Z,OWNER,,"Failure here: https://github.com/simonw/sqlite-utils/actions/runs/3275786702/jobs/5391063221 I figured out the problem: ```diff diff --git a/docs/cli-reference.rst b/docs/cli-reference.rst index b88e38a..82b4b6c 100644 --- a/docs/cli-reference.rst +++ b/docs/cli-reference.rst @@ -112,11 +112,15 @@ See :ref:`cli_query`. --tsv Output TSV --no-headers Omit CSV headers -t, --table Output as a formatted table - --fmt TEXT Table format - one of fancy_grid, fancy_outline, - github, grid, html, jira, latex, latex_booktabs, - latex_longtable, latex_raw, mediawiki, moinmoin, - orgtbl, pipe, plain, presto, pretty, psql, rst, - simple, textile, tsv, unsafehtml, youtrack + --fmt TEXT Table format - one of asciidoc, double_grid, + double_outline, fancy_grid, fancy_outline, github, + grid, heavy_grid, heavy_outline, html, jira, + latex, latex_booktabs, latex_longtable, latex_raw, + mediawiki, mixed_grid, mixed_outline, moinmoin, + orgtbl, outline, pipe, plain, presto, pretty, + psql, rounded_grid, rounded_outline, rst, simple, + simple_grid, simple_outline, textile, tsv, + unsafehtml, youtrack --json-cols Detect JSON cols and output them as JSON, not escaped strings -r, --raw Raw output, first column of first row @@ -176,11 +180,15 @@ See :ref:`cli_memory`. --tsv Output TSV --no-headers Omit CSV headers -t, --table Output as a formatted table - --fmt TEXT Table format - one of fancy_grid, fancy_outline, - github, grid, html, jira, latex, latex_booktabs, - latex_longtable, latex_raw, mediawiki, moinmoin, - orgtbl, pipe, plain, presto, pretty, psql, rst, - simple, textile, tsv, unsafehtml, youtrack + --fmt TEXT Table format - one of asciidoc, double_grid, + double_outline, fancy_grid, fancy_outline, github, + grid, heavy_grid, heavy_outline, html, jira, + latex, latex_booktabs, latex_longtable, latex_raw, + mediawiki, mixed_grid, mixed_outline, moinmoin, + orgtbl, outline, pipe, plain, presto, pretty, + psql, rounded_grid, rounded_outline, rst, simple, + simple_grid, simple_outline, textile, tsv, + unsafehtml, youtrack --json-cols Detect JSON cols and output them as JSON, not escaped strings -r, --raw Raw output, first column of first row @@ -401,11 +409,14 @@ See :ref:`cli_search`. 
--tsv Output TSV --no-headers Omit CSV headers -t, --table Output as a formatted table - --fmt TEXT Table format - one of fancy_grid, fancy_outline, - github, grid, html, jira, latex, latex_booktabs, - latex_longtable, latex_raw, mediawiki, moinmoin, - orgtbl, pipe, plain, presto, pretty, psql, rst, simple, - textile, tsv, unsafehtml, youtrack + --fmt TEXT Table format - one of asciidoc, double_grid, + double_outline, fancy_grid, fancy_outline, github, + grid, heavy_grid, heavy_outline, html, jira, latex, + latex_booktabs, latex_longtable, latex_raw, mediawiki, + mixed_grid, mixed_outline, moinmoin, orgtbl, outline, + pipe, plain, presto, pretty, psql, rounded_grid, + rounded_outline, rst, simple, simple_grid, + simple_outline, textile, tsv, unsafehtml, youtrack --json-cols Detect JSON cols and output them as JSON, not escaped strings --load-extension TEXT Path to SQLite extension, with optional :entrypoint @@ -651,11 +662,14 @@ See :ref:`cli_tables`. --tsv Output TSV --no-headers Omit CSV headers -t, --table Output as a formatted table - --fmt TEXT Table format - one of fancy_grid, fancy_outline, - github, grid, html, jira, latex, latex_booktabs, - latex_longtable, latex_raw, mediawiki, moinmoin, - orgtbl, pipe, plain, presto, pretty, psql, rst, simple, - textile, tsv, unsafehtml, youtrack + --fmt TEXT Table format - one of asciidoc, double_grid, + double_outline, fancy_grid, fancy_outline, github, + grid, heavy_grid, heavy_outline, html, jira, latex, + latex_booktabs, latex_longtable, latex_raw, mediawiki, + mixed_grid, mixed_outline, moinmoin, orgtbl, outline, + pipe, plain, presto, pretty, psql, rounded_grid, + rounded_outline, rst, simple, simple_grid, + simple_outline, textile, tsv, unsafehtml, youtrack --json-cols Detect JSON cols and output them as JSON, not escaped strings --columns Include list of columns for each table @@ -689,11 +703,14 @@ See :ref:`cli_views`. --tsv Output TSV --no-headers Omit CSV headers -t, --table Output as a formatted table - --fmt TEXT Table format - one of fancy_grid, fancy_outline, - github, grid, html, jira, latex, latex_booktabs, - latex_longtable, latex_raw, mediawiki, moinmoin, - orgtbl, pipe, plain, presto, pretty, psql, rst, simple, - textile, tsv, unsafehtml, youtrack + --fmt TEXT Table format - one of asciidoc, double_grid, + double_outline, fancy_grid, fancy_outline, github, + grid, heavy_grid, heavy_outline, html, jira, latex, + latex_booktabs, latex_longtable, latex_raw, mediawiki, + mixed_grid, mixed_outline, moinmoin, orgtbl, outline, + pipe, plain, presto, pretty, psql, rounded_grid, + rounded_outline, rst, simple, simple_grid, + simple_outline, textile, tsv, unsafehtml, youtrack --json-cols Detect JSON cols and output them as JSON, not escaped strings --columns Include list of columns for each view @@ -732,11 +749,15 @@ See :ref:`cli_rows`. 
--tsv Output TSV --no-headers Omit CSV headers -t, --table Output as a formatted table - --fmt TEXT Table format - one of fancy_grid, fancy_outline, - github, grid, html, jira, latex, latex_booktabs, - latex_longtable, latex_raw, mediawiki, moinmoin, - orgtbl, pipe, plain, presto, pretty, psql, rst, - simple, textile, tsv, unsafehtml, youtrack + --fmt TEXT Table format - one of asciidoc, double_grid, + double_outline, fancy_grid, fancy_outline, github, + grid, heavy_grid, heavy_outline, html, jira, + latex, latex_booktabs, latex_longtable, latex_raw, + mediawiki, mixed_grid, mixed_outline, moinmoin, + orgtbl, outline, pipe, plain, presto, pretty, + psql, rounded_grid, rounded_outline, rst, simple, + simple_grid, simple_outline, textile, tsv, + unsafehtml, youtrack --json-cols Detect JSON cols and output them as JSON, not escaped strings --load-extension TEXT Path to SQLite extension, with optional @@ -768,11 +789,14 @@ See :ref:`cli_triggers`. --tsv Output TSV --no-headers Omit CSV headers -t, --table Output as a formatted table - --fmt TEXT Table format - one of fancy_grid, fancy_outline, - github, grid, html, jira, latex, latex_booktabs, - latex_longtable, latex_raw, mediawiki, moinmoin, - orgtbl, pipe, plain, presto, pretty, psql, rst, simple, - textile, tsv, unsafehtml, youtrack + --fmt TEXT Table format - one of asciidoc, double_grid, + double_outline, fancy_grid, fancy_outline, github, + grid, heavy_grid, heavy_outline, html, jira, latex, + latex_booktabs, latex_longtable, latex_raw, mediawiki, + mixed_grid, mixed_outline, moinmoin, orgtbl, outline, + pipe, plain, presto, pretty, psql, rounded_grid, + rounded_outline, rst, simple, simple_grid, + simple_outline, textile, tsv, unsafehtml, youtrack --json-cols Detect JSON cols and output them as JSON, not escaped strings --load-extension TEXT Path to SQLite extension, with optional :entrypoint @@ -804,11 +828,14 @@ See :ref:`cli_indexes`. --tsv Output TSV --no-headers Omit CSV headers -t, --table Output as a formatted table - --fmt TEXT Table format - one of fancy_grid, fancy_outline, - github, grid, html, jira, latex, latex_booktabs, - latex_longtable, latex_raw, mediawiki, moinmoin, - orgtbl, pipe, plain, presto, pretty, psql, rst, simple, - textile, tsv, unsafehtml, youtrack + --fmt TEXT Table format - one of asciidoc, double_grid, + double_outline, fancy_grid, fancy_outline, github, + grid, heavy_grid, heavy_outline, html, jira, latex, + latex_booktabs, latex_longtable, latex_raw, mediawiki, + mixed_grid, mixed_outline, moinmoin, orgtbl, outline, + pipe, plain, presto, pretty, psql, rounded_grid, + rounded_outline, rst, simple, simple_grid, + simple_outline, textile, tsv, unsafehtml, youtrack --json-cols Detect JSON cols and output them as JSON, not escaped strings --load-extension TEXT Path to SQLite extension, with optional :entrypoint diff --git a/docs/cli.rst b/docs/cli.rst index 8bc4176..1d67e88 100644 --- a/docs/cli.rst +++ b/docs/cli.rst @@ -187,10 +187,15 @@ Available ``--fmt`` options are: cog.out(""\n"" + ""\n"".join('- ``{}``'.format(t) for t in tabulate.tabulate_formats) + ""\n\n"") .. 
]]] +- ``asciidoc`` +- ``double_grid`` +- ``double_outline`` - ``fancy_grid`` - ``fancy_outline`` - ``github`` - ``grid`` +- ``heavy_grid`` +- ``heavy_outline`` - ``html`` - ``jira`` - ``latex`` @@ -198,15 +203,22 @@ Available ``--fmt`` options are: - ``latex_longtable`` - ``latex_raw`` - ``mediawiki`` +- ``mixed_grid`` +- ``mixed_outline`` - ``moinmoin`` - ``orgtbl`` +- ``outline`` - ``pipe`` - ``plain`` - ``presto`` - ``pretty`` - ``psql`` +- ``rounded_grid`` +- ``rounded_outline`` - ``rst`` - ``simple`` +- ``simple_grid`` +- ``simple_outline`` - ``textile`` - ``tsv`` - ``unsafehtml`` ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/501/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1429029604,I_kwDOCGYnMM5VLULk,506,Make `cursor.rowcount` accessible (wontfix),9599,simonw,closed,0,,,,,3,2022-10-30T21:51:55Z,2022-11-01T17:37:47Z,2022-11-01T17:37:13Z,OWNER,,"In building this Datasette feature on top of `sqlite-utils` I thought it might be useful to expose the number of rows that had been affected by a bulk insert or update - the `cursor.rowcount`: - https://github.com/simonw/datasette/issues/1866 This isn't currently exposed by `sqlite-utils`.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/506/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1430325103,I_kwDOCGYnMM5VQQdv,507,conn.execute: UnicodeEncodeError: 'utf-8' codec can't encode character,7908073,chapmanjacobd,closed,0,,,,,1,2022-10-31T18:49:51Z,2022-11-01T00:40:17Z,2022-11-01T00:40:16Z,CONTRIBUTOR,,"I'm not really sure what caused this and it happened in the middle of my program (after running for 35775 seconds). ``` Extracting metadata 49.9% (chunk 9893 of 19831) ... File ""/home/xk/.local/lib/python3.10/site-packages/xklb/fs_extract.py"", line 90, in extract_chunk args.db[""media""].insert_all(utils.list_dict_filter_bool(media), pk=""path"", alter=True, replace=True) File ""/home/xk/.local/lib/python3.10/site-packages/sqlite_utils/db.py"", line 3107, in insert_all self.insert_chunk( File ""/home/xk/.local/lib/python3.10/site-packages/sqlite_utils/db.py"", line 2872, in insert_chunk result = self.db.execute(query, params) File ""/home/xk/.local/lib/python3.10/site-packages/sqlite_utils/db.py"", line 483, in execute return self.conn.execute(sql, parameters) UnicodeEncodeError: 'utf-8' codec can't encode character '\udcc3' in position 62: surrogates not allowed ``` This might be relevant: https://stackoverflow.com/questions/31898353/python-cant-encode-with-surrogateescape I'm going to try re-running with ```py def execute( self, sql: str, parameters: Optional[Union[Iterable, dict]] = None ) -> sqlite3.Cursor: """""" Execute SQL query and return a ``sqlite3.Cursor``. 
:param sql: SQL query to execute :param parameters: Parameters to use in that query - an iterable for ``where id = ?`` parameters, or a dictionary for ``where id = :id`` """""" try: if self._tracer: self._tracer(sql, parameters) if parameters is not None: return self.conn.execute(sql, parameters) else: return self.conn.execute(sql) except UnicodeEncodeError: sql = sql.encode('utf-8', 'surrogatepass').decode('utf-8') if parameters is not None: parameters = parameters.encode('utf-8', 'surrogatepass').decode('utf-8') return self.execute(sql, parameters) ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/507/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1432377191,I_kwDOCGYnMM5VYFdn,509,`sqlite-utils transform` breaks DEFAULT string values and STRFTIME(),2199875,kennysong,closed,0,,,,,0,2022-11-02T02:32:23Z,2023-05-08T21:13:38Z,2023-05-08T21:13:38Z,NONE,,"Very nice library! Our team found sqlite-utils through @simonw's [comment on the ""Simple declarative schema migration for SQLite"" article](https://news.ycombinator.com/item?id=31249823), and we were excited to use it, but unfortunately `sqlite-utils transform` seems to break our DB. Running `sqlite-utils transform` to modify a column mangles their DEFAULT values: - Default string values are wrapped in extra single quotes - Function expressions such as [`STRFTIME()`](https://www.sqlite.org/lang_datefunc.html) are turned into strings! ------ Here are steps to reproduce: **Original database** ``` $ sqlite3 test.db << EOF CREATE TABLE mytable ( col1 TEXT DEFAULT 'foo', col2 TEXT DEFAULT (STRFTIME('%Y-%m-%d %H:%M:%f', 'NOW')) ) EOF $ sqlite3 test.db ""SELECT sql FROM sqlite_master WHERE name = 'mytable';"" CREATE TABLE mytable ( col1 TEXT DEFAULT 'foo', col2 TEXT DEFAULT (STRFTIME('%Y-%m-%d %H:%M:%f', 'NOW')) ) ``` **Modified database after sqlite-utils** ``` $ sqlite3 test.db ""INSERT INTO mytable DEFAULT VALUES; SELECT * FROM mytable;"" foo|2022-11-02 02:26:58.038 $ sqlite-utils transform test.db mytable --rename col1 renamedcol1 $ sqlite3 test.db ""SELECT sql FROM sqlite_master WHERE name = 'mytable';"" CREATE TABLE ""mytable"" ( [renamedcol1] TEXT DEFAULT '''foo''', [col2] TEXT DEFAULT 'STRFTIME(''%Y-%m-%d %H:%M:%f'', ''NOW'')' ) $ sqlite3 test.db ""INSERT INTO mytable DEFAULT VALUES; SELECT * FROM mytable;"" foo|2022-11-02 02:26:58.038 'foo'|STRFTIME('%Y-%m-%d %H:%M:%f', 'NOW') ``` (Related: #336)",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/509/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1434911255,I_kwDOCGYnMM5VhwIX,510,Cannot enable FTS5 despite it being available,1176293,ar-jan,closed,0,,,,,3,2022-11-03T16:03:49Z,2022-11-18T18:37:52Z,2022-11-17T10:36:28Z,NONE,,"When I do `sqlite-utils enable-fts my.db table_name column_name` (with or without `--fts5`), I get an FTS4 virtual table instead of the expected FTS5. FTS5 is however available and Python/SQLite versions do not seem to be the issue. I can manually create the FTS5 virtual table, and then Datasette also works with it from this same Python environment. `>>> sqlite3.version` `2.6.0` `>>> sqlite3.sqlite_version` `3.39.4` `PRAGMA compile_options;` includes `ENABLE_FTS5`. `sqlite-utils, version 3.30`. 
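For reference, the manual route the reporter describes looks roughly like this - a sketch with placeholder table and column names, not the tool's own code path:

```python
import sqlite_utils

db = sqlite_utils.Database('my.db')

# Confirm FTS5 really is compiled in, then create the virtual table by hand
options = [row[0] for row in db.execute('PRAGMA compile_options').fetchall()]
assert 'ENABLE_FTS5' in options
db.execute('CREATE VIRTUAL TABLE table_name_fts USING fts5(column_name)')
```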
Any ideas what's happening and how to fix?",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/510/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1436539554,I_kwDOCGYnMM5Vn9qi,511,"[insert_all, upsert_all] IntegrityError: constraint failed",7908073,chapmanjacobd,closed,0,,,,,2,2022-11-04T19:21:48Z,2022-11-04T22:59:54Z,2022-11-04T22:54:09Z,CONTRIBUTOR,,"My understanding is that `INSERT OR IGNORE` will ignore when inserts would cause duplicate keys so I'm not sure exactly why the error is raised from `sqlite3`. ``` import argparse from pathlib import Path from xklb import db, utils from xklb.utils import log def parse_args() -> argparse.Namespace: parser = argparse.ArgumentParser() parser.add_argument(""database"") parser.add_argument(""dbs"", nargs=""*"") parser.add_argument(""--upsert"") parser.add_argument(""--db"", ""-db"", help=argparse.SUPPRESS) parser.add_argument(""--verbose"", ""-v"", action=""count"", default=0) args = parser.parse_args() if args.db: args.database = args.db Path(args.database).touch() args.db = db.connect(args) log.info(utils.dict_filter_bool(args.__dict__)) return args def merge_db(args, source_db): source_db = str(Path(source_db).resolve()) s_db = db.connect(argparse.Namespace(database=source_db, verbose=args.verbose)) for table in [s for s in s_db.table_names() if not ""_fts"" in s and not s.startswith(""sqlite_"")]: log.info(""[%s]: %s"", source_db, table) with s_db.conn: data = s_db[table].rows with args.db.conn: if args.upsert: args.db[table].upsert_all(data, pk=args.upsert.split("",""), alter=True) else: args.db[table].insert_all(data, alter=True, replace=True) def merge_dbs(): args = parse_args() for s_db in args.dbs: merge_db(args, s_db) if __name__ == ""__main__"": merge_dbs() ``` ``` $ lb-dev merge video.db tube_71.db --upsert path -vv SQL: INSERT OR IGNORE INTO [media]([path]) VALUES(?); - params: ['https://archive.org/details/088ghostofachanceroygetssackedrevengeofthelivinglunchdvdripxvidphz'] ... 
File ~/.local/lib/python3.10/site-packages/sqlite_utils/db.py:3122, in Table.insert_all(self, records, pk, foreign_keys, column_order, not_null, defaults, batch_size, hash_id, hash_id_columns, alter, ignore, replace, truncate, extracts, conversions, columns, upsert, analyze) 3116 all_columns += [ 3117 column for column in record if column not in all_columns 3118 ] 3120 first = False -> 3122 self.insert_chunk( 3123 alter, 3124 extracts, 3125 chunk, 3126 all_columns, 3127 hash_id, 3128 hash_id_columns, 3129 upsert, 3130 pk, 3131 conversions, 3132 num_records_processed, 3133 replace, 3134 ignore, 3135 ) 3137 if analyze: 3138 self.analyze() File ~/.local/lib/python3.10/site-packages/sqlite_utils/db.py:2887, in Table.insert_chunk(self, alter, extracts, chunk, all_columns, hash_id, hash_id_columns, upsert, pk, conversions, num_records_processed, replace, ignore) 2885 for query, params in queries_and_params: 2886 try: -> 2887 result = self.db.execute(query, params) 2888 except OperationalError as e: 2889 if alter and ("" column"" in e.args[0]): 2890 # Attempt to add any missing columns, then try again File ~/.local/lib/python3.10/site-packages/sqlite_utils/db.py:484, in Database.execute(self, sql, parameters) 482 self._tracer(sql, parameters) 483 if parameters is not None: --> 484 return self.conn.execute(sql, parameters) 485 else: 486 return self.conn.execute(sql) IntegrityError: constraint failed > /home/xk/.local/lib/python3.10/site-packages/sqlite_utils/db.py(484)execute() 482 self._tracer(sql, parameters) 483 if parameters is not None: --> 484 return self.conn.execute(sql, parameters) 485 else: 486 return self.conn.execute(sql) ``` ``` sqlite3 --version 3.36.0 2021-06-18 18:36:39 ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/511/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1450952393,I_kwDOCGYnMM5We8bJ,512,mypy failures in CI,9599,simonw,closed,0,,,,,3,2022-11-16T06:22:48Z,2022-11-16T07:49:51Z,2022-11-16T07:49:50Z,OWNER,,"https://github.com/simonw/sqlite-utils/actions/runs/3472012235 failed on Python 3.11: Truncated output: ``` sqlite_utils/db.py:2467: note: PEP 484 prohibits implicit Optional. Accordingly, mypy has changed its default to no_implicit_optional=True sqlite_utils/db.py:2467: note: Use https://github.com/hauntsaninja/no_implicit_optional to automatically upgrade your codebase sqlite_utils/db.py:2530: error: Incompatible default for argument ""where"" (default has type ""None"", argument has type ""str"") [assignment] sqlite_utils/db.py:2530: note: PEP 484 prohibits implicit Optional. 
Accordingly, mypy has changed its default to no_implicit_optional=True sqlite_utils/db.py:2530: note: Use https://github.com/hauntsaninja/no_implicit_optional to automatically upgrade your codebase sqlite_utils/db.py:2658: error: Argument 1 to ""count_where"" of ""Queryable"" has incompatible type ""Optional[str]""; expected ""str"" [arg-type] Found 23 errors in 1 file (checked 51 source files) ``` Best look at https://github.com/hauntsaninja/no_implicit_optional",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/512/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1453134846,I_kwDOCGYnMM5WnRP-,513,Add or document streamlined workflow for importing Datasette csv / json exports,19328961,henry501,open,0,,,,,0,2022-11-17T10:54:47Z,2022-11-17T10:54:47Z,,NONE,,"I'm working on some small front-end enhancements to the laion-aesthetic-datasette project, and I wanted to partially populate a database directly using exports from the existing Datasette instance instead of downloading the parquet files and creating my own multi-GB database. There have been a number of small issues that are certainly related to my relative lack of familiarity with the toolkit, but that are still surprising. For example: a CSV export of the images table (http://laion-aesthetic.datasette.io/laion-aesthetic-6pls.csv?sql=select+rowid%2C+url%2C+text%2C+domain_id%2C+width%2C+height%2C+similarity%2C+punsafe%2C+pwatermark%2C+aesthetic%2C+hash%2C+__index_level_0__+from+images+order+by+random%28%29+limit+100) has nested single quotes, double quotes, and commas that aren't handled by rows_from_file. Similarly, the json output has to be manually transformed to add the column names and remove extraneous information before sqlite_utils can import it. 
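One such manual transformation, sketched in Python against Datasette's default JSON shape (a top-level object with columns and rows keys; file and table names are assumptions):

```python
import json
import sqlite_utils

db = sqlite_utils.Database('local.db')
with open('images.json') as f:
    export = json.load(f)

# Re-attach the column names to each row before inserting
rows = [dict(zip(export['columns'], row)) for row in export['rows']]
db['images'].insert_all(rows)
```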
I was able to work through these issues, but as an enhancement it would be really helpful to create or document a clear workflow that avoids the friction of this data transformation.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/513/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1465194249,I_kwDOCGYnMM5XVRcJ,514,upsert of new row with check constraints fails,193185,cldellow,closed,0,,,,,5,2022-11-26T16:12:23Z,2023-05-08T21:50:52Z,2023-05-08T21:50:51Z,NONE,,"(I originally opened this in https://github.com/simonw/datasette-insert/issues/20, but I see that that library depends on sqlite-utils) In the case of a new row, upsert first adds the row, specifying only its pkeys: https://github.com/simonw/sqlite-utils/blob/965ca0d5f5bffe06cc02cd7741344d1ddddf9d56/sqlite_utils/db.py#L2783-L2787 This means that a table with NOT NULL (or other constraint) columns that aren't part of the pkey can't have new rows upserted.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/514/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1479914599,I_kwDOCGYnMM5YNbRn,516,Feature request: output number of ignored/replaced rows for insert command,9599,simonw,open,0,,,,,4,2022-12-06T18:59:21Z,2022-12-06T19:08:14Z,,OWNER,,"https://hachyderm.io/@briandorsey/109468185742876820 > I'm fiddling with piping json to `insert -ignore` I'd love to see the count of records inserted & ignored, but didn't see a way to do that in the help/docs. > > Example: `xh ""https://hachyderm.io/api/v1/timelines/tag/rust?max_id=109443380308326328"" | sqlite-utils insert aoc.db aoc - --pk=id --ignore`",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/516/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1487757143,I_kwDOCGYnMM5YrV9X,517,Drop support for Python 3.6,9599,simonw,closed,0,,,,,1,2022-12-10T01:23:31Z,2022-12-10T01:36:36Z,2022-12-10T01:36:36Z,OWNER,,"CI has started failing for Python 3.6: https://github.com/simonw/sqlite-utils/actions/runs/3576322798 It's fixable by switching away from `ubuntu-latest` according to: - https://github.com/actions/setup-python/issues/355#issuecomment-1335042510 But https://endoflife.date/python says that 3.6 end of life was almost 6 years ago, and end of security support nearly 1 year ago. 
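The usual mechanism for a drop like this is the python_requires gate in setup.py - a sketch of the idea, not the actual diff:

```python
# setup.py (excerpt)
from setuptools import setup

setup(
    name='sqlite-utils',
    # pip on Python 3.6 will resolve to the last release published
    # without this constraint, so old environments keep working
    python_requires='>=3.7',
)
```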
So I'm OK dropping support entirely - Python 3.6 users will still be able to install version 3.30, just not any releases that come next.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/517/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1487764628,I_kwDOCGYnMM5YrXyU,518,flake8 ValueError: Error code '#' supplied to 'extend-ignore' option...,9599,simonw,closed,0,,,,,0,2022-12-10T01:30:24Z,2022-12-10T01:36:46Z,2022-12-10T01:36:46Z,OWNER,,"> `Error code '#' supplied to 'extend-ignore' option does not match '^[A-Z]{1,3}[0-9]{0,3}$'` https://github.com/simonw/sqlite-utils/actions/runs/3662011265/jobs/6190770361 I think from this: https://github.com/simonw/sqlite-utils/blob/e660635cea6c32f4022818380b1e1ee88e7c93a6/setup.cfg#L1-L3 ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/518/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1516644980,I_kwDOCGYnMM5aZip0,520,rows_from_file() raises confusing error if file-like object is not in binary mode,9599,simonw,closed,0,,,,,3,2023-01-02T19:00:14Z,2023-05-08T22:08:07Z,2023-05-08T22:08:07Z,OWNER,,"I got this error: ``` File ""/Users/simon/Dropbox/Development/openai-to-sqlite/openai_to_sqlite/cli.py"", line 27, in embeddings rows, _ = rows_from_file(input) ^^^^^^^^^^^^^^^^^^^^^ File ""/Users/simon/.local/share/virtualenvs/openai-to-sqlite-jt4obeb2/lib/python3.11/site-packages/sqlite_utils/utils.py"", line 305, in rows_from_file first_bytes = buffered.peek(2048).strip() ^^^^^^^^^^^^^^^^^^^ ``` From this code: ```python @cli.command() @click.argument( ""db_path"", type=click.Path(file_okay=True, dir_okay=False, allow_dash=False), ) @click.option( ""-i"", ""--input"", type=click.File(""r""), default=""-"", ) def embeddings(db_path, input): ""Store embeddings for one or more text documents"" click.echo(""Here is some output"") db = sqlite_utils.Database(db_path) rows, _ = rows_from_file(input) print(list(rows)) ``` The error went away when I changed it to `type=click.File(""rb"")`. 
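One possible shape for that fix, sketched here rather than taken from the library itself:

```python
import io

def ensure_binary(fp):
    # rows_from_file() peeks at raw bytes, so re-wrap text-mode
    # file objects as a bytes buffer before handing them over
    if isinstance(fp, io.TextIOBase):
        return io.BytesIO(fp.read().encode('utf-8'))
    return fp
```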
This should either be called out in the documentation or `rows_from_file()` should be fixed to handle text-mode files in addition to binary files.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/520/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1550536442,I_kwDOCGYnMM5ca076,521,Custom JSON encoder,31504,janrito,open,0,,,,,0,2023-01-20T09:19:40Z,2023-01-20T09:19:40Z,,NONE,,"It would be nice if we could specify a custom encoder (and decoder) for types that will need extra deserialisation – e.g., sets, enums or sparse matrices – or even project-specific types",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/521/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1553425465,I_kwDOCGYnMM5cl2Q5,522,Add COLUMN_TYPE_MAPPING for timedelta,81377,maport,closed,0,,,,,0,2023-01-23T16:49:54Z,2023-11-04T00:49:51Z,2023-11-04T00:49:51Z,NONE,,"Currently trying to create a column with Python type `datetime.timedelta` results in an error:
```
>>> import datetime
>>> from sqlite_utils import Database
>>> db = Database(""test.db"")
>>> test_tbl = db['test']
>>> test_tbl.insert({'col1': datetime.timedelta()})
Traceback (most recent call last):
  File ""<stdin>"", line 1, in <module>
  File ""/usr/local/lib/python3.10/dist-packages/sqlite_utils/db.py"", line 2979, in insert
    return self.insert_all(
  File ""/usr/local/lib/python3.10/dist-packages/sqlite_utils/db.py"", line 3082, in insert_all
    self.create(
  File ""/usr/local/lib/python3.10/dist-packages/sqlite_utils/db.py"", line 1574, in create
    self.db.create_table(
  File ""/usr/local/lib/python3.10/dist-packages/sqlite_utils/db.py"", line 961, in create_table
    sql = self.create_table_sql(
  File ""/usr/local/lib/python3.10/dist-packages/sqlite_utils/db.py"", line 852, in create_table_sql
    column_type=COLUMN_TYPE_MAPPING[column_type],
KeyError: <class 'datetime.timedelta'>
```
The reason this would be useful is that `MySQLdb` uses `timedelta` for MySQL `TIME` columns:
```
>>> import MySQLdb
>>> conn = MySQLdb.connect(host='database', user='user', passwd='pw')
>>> csr = conn.cursor()
>>> csr.execute(""SELECT CAST('11:20' AS TIME)"")
>>> tuple(csr)
((datetime.timedelta(seconds=40800),),)
```
So currently any attempt to convert a MySQL DB with a `TIME` column using `db-to-sqlite` will result in the above error. I was rather surprised that `MySQLdb` uses `timedelta` for `TIME` columns but I see that [this column type](https://dev.mysql.com/doc/refman/8.0/en/time.html) is intended for time intervals as well as the time of day so it makes sense. ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/522/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1575131737,I_kwDOCGYnMM5d4ppZ,525,Repeated calls to `Table.convert()` fail,167893,mcarpenter,closed,0,,,,,4,2023-02-07T22:40:47Z,2023-05-08T21:59:41Z,2023-05-08T21:54:02Z,CONTRIBUTOR,,"## Summary When using the API, repeated calls to `Table.convert()` on the same column do not work correctly: all conversions quietly use the callable (function, lambda) from the first call to `convert()`; subsequent invocations with different callables are silently ignored. 
## Example
```python
from sqlite_utils import Database

db = Database(memory=True)
table = db['table']
col = 'x'
table.insert_all([{col: 1}])
print(table.get(1))

table.convert(col, lambda x: x*2)
print(table.get(1))

def zeroize(x):
    return 0
#zeroize = lambda x: 0
#zeroize.__name__ = 'zeroize'

table.convert(col, zeroize)
print(table.get(1))
```
Output:
```
{'x': 1}
{'x': 2}
{'x': 4}
```
Expected:
```
{'x': 1}
{'x': 2}
{'x': 0}
```

## Explanation
This is some relevant [documentation](https://github.com/simonw/sqlite-utils/blob/1491b66dd7439dd87cd5cd4c4684f46eb3c5751b/docs/python-api.rst#registering-custom-sql-functions:~:text=By%20default%20registering%20a%20function%20with%20the%20same%20name%20and%20number%20of%20arguments%20will%20have%20no%20effect).
* `Table.convert()` takes a `Callable` to perform data conversion on a column
* The `Callable` is passed to `Database.register_function()`
* `Database.register_function()` uses the callable's `__name__` attribute for registration
* (Aside: all lambdas have a `__name__` of `<lambda>`: I thought this was the problem, and it was close, but not quite)
* However `convert()` first wraps the callable by local function [`convert_value()`](https://github.com/simonw/sqlite-utils/blob/fc221f9b62ed8624b1d2098e564f525c84497969/sqlite_utils/db.py#L2661)
* Consequently `register_function()` sees name `convert_value` for all invocations from `convert()`
* `register_function()` silently ignores registrations using the same name, retaining only the first such registration

There's a mismatch between the comments and the code: https://github.com/simonw/sqlite-utils/blob/fc221f9b62ed8624b1d2098e564f525c84497969/sqlite_utils/db.py#L404 but actually the existing function is returned/used instead (as the ""registering custom sql functions"" doc I linked above says too). Seems like this can be rectified to match the comment?

## Suggested fix
I think there are four things:
1. The call to `register_function()` from `convert()` should have an explicit `name=` parameter (to continue using `convert_value()` and the progress bar).
2. For functions, this name can be the real function name. (I understand the sqlite api needs a name, and it's nice if those are recognizable names where possible). For lambdas would `'lambda-{uuid}'` or similar be acceptable?
3. `register_function()` really should throw an error on repeated attempts to register a duplicate (function, arity)-pair.
4. A test? I haven't looked at the test framework here but seems this should be testable.

## See also
- #458 ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/525/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1560651350,I_kwDOCGYnMM5dBaZW,523,Feature request: trim all leading and trailing white space for all columns for all tables in a database,536941,fgregg,open,0,,,,,1,2023-01-28T02:40:10Z,2023-01-28T02:41:14Z,,CONTRIBUTOR,,"It's pretty common that I need to trim leading or trailing white space from lots of columns in a database as part of an initial ETL. 
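A minimal sketch of the same idea through the existing Python API, before the SQL recipe below (assumes every `TEXT` column of every table should be trimmed):
```python
from sqlite_utils import Database

db = Database('example.db')
for table in db.tables:
    text_cols = [c.name for c in table.columns if c.type.upper() == 'TEXT']
    if text_cols:
        # convert() accepts a list of columns; falsey cells (NULL, '')
        # are skipped by default, so no guard clause is needed here
        table.convert(text_cols, lambda value: value.strip())
```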
I use the following recipe a lot, and it would be great to include this functionality in sqlite-utils.

`trimify.sql`
```sql
select 'select group_concat(''update [' || name || '] set ['' || name || ''] = trim(['' || name || ''])'', ''; '') || ''; '' as sql_to_run from pragma_table_info('''||name||''');' from sqlite_schema;
```
then something like:
```bash
sqlite3 example.db < scripts/trimify.sql > table_trim.sql && \
  sqlite3 example.db < table_trim.sql > trim.sql && \
  sqlite3 example.db < trim.sql
```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/523/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1572766460,I_kwDOCGYnMM5dvoL8,524,Transformation type `--type DATETIME`,21095447,4l1fe,closed,0,,,,,15,2023-02-06T15:18:42Z,2023-02-15T12:10:54Z,2023-02-15T12:10:54Z,NONE,,"Hey. Currently I do the transformation with the type `--type TEXT`, but I noticed, using the SQLAlchemy-based library [dataset](https://github.com/pudo/dataset), that reading and writing differ depending on the column type (`TEXT` vs. `DATETIME`). Is it possible to alter a column type to `DATETIME` somehow using sqlite-utils?",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/524/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1578790070,I_kwDOCGYnMM5eGmy2,527,`Table.convert()` skips falsey values,167893,mcarpenter,closed,0,,,,,5,2023-02-10T00:00:52Z,2023-05-09T21:15:05Z,2023-05-08T21:03:24Z,CONTRIBUTOR,,"# Summary
By design, `Table.convert()` does [not attempt](https://github.com/simonw/sqlite-utils/blob/fc221f9b62ed8624b1d2098e564f525c84497969/sqlite_utils/db.py#L2663) conversion of falsey values (`None`, `""""`, `0`, ...). This is surprising (it directly contradicts the docstring) and `convert()` may quietly skip cells where the user assumed a conversion would take place.

# Example
Increment a column of integers by one
``` python
from sqlite_utils import Database

db = Database(memory=True)
table = db['table']
col = 'x'
table.insert_all([{col: 0}, {col:1}])
print(table.get(1)) # 0
print(table.get(2)) # 1
print()

table.convert(col, lambda x: x+1)
print(table.get(1)) # got 0, expected 1 ⚠⚠⚠
print(table.get(2)) # got 2, expected 2
```
Another example might be, say, transforming cells containing empty string to `NULL`.

# Discussion
This was, I think, a pragmatic choice so that consumers can skip writing guard clauses for these falsey values (particularly from the CLI). But this surprising undocumented behavior can lead to incorrect data. I don't think this is a good trade-off between convenience and correctness. In the absence of this convenience users will either have to write guard clauses into their conversion expressions (or adapt the called function to do the same), so:
``` python
fn(value) if value else value
```
instead of:
``` python
fn(value)
```
This is more typing and sometimes I will forget, and there will be errors. (But they will be noisy errors, which is a good thing). Such a change will certainly inconvenience some existing consumers; there will be some breakage. But I think this is worth it to avoid quietly not converting some values by default, which can lead to quietly bad data. 
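In the meantime, a workaround sketch for the increment example above: a plain SQL `UPDATE` reaches the falsey cells that `convert()` skips (assuming no per-row Python logic is needed):
```python
from sqlite_utils import Database

db = Database('example.db')
# Updates every row, including the x = 0 row that convert() ignores
with db.conn:
    db.conn.execute('update [table] set [x] = [x] + 1')
```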
I have a PR that I will attach, please take a look and see what you think.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/527/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1581090327,I_kwDOCGYnMM5ePYYX,529,Microsoft line endings,7908073,chapmanjacobd,closed,0,,,,,1,2023-02-12T02:20:48Z,2023-06-14T23:12:12Z,2023-06-14T23:11:47Z,CONTRIBUTOR,,"sqlite-utils prints `\r\n` but [it should probably](https://devblogs.microsoft.com/commandline/extended-eol-in-notepad/) print `\n` (unless the platform is detected as Windows?) It has tripped me up a few times when piping the output of sqlite-utils to other programs: ``` $ sqlite-utils --no-headers --csv ~/lb/fs/d.db 'select path from media limit 1' | cat -A /mnt/d7/file^M$ $ sqlite-utils --no-headers --csv ~/lb/fs/d.db 'select path from media limit 1' | tr -d '\r' | cat -A /mnt/d7/file$ ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/529/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1595340692,I_kwDOCGYnMM5fFveU,530,"add ability to configure ""on delete"" and ""on update"" attributes of foreign keys:",536941,fgregg,open,0,,,,,2,2023-02-22T15:44:14Z,2023-05-08T20:39:01Z,,CONTRIBUTOR,,"sqlite supports these, and it would be quite nice to be able to add them with sqlite-utils. https://www.sqlite.org/foreignkeys.html#fk_actions",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/530/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1620254998,I_kwDOCGYnMM5gkyEW,532,Show more information when JSON can't be imported with sqlite-utils insert,83080728,voltagex,closed,0,,,,,2,2023-03-12T06:41:44Z,2023-05-08T20:32:16Z,2023-05-08T20:32:02Z,NONE,,"I am currently trying to import the [JSON export of my data from Discord](https://support.discord.com/hc/en-us/articles/360004027692-Requesting-a-Copy-of-your-Data), specifically `activity/reporting/events-*.json` ``` sqlite-utils.exe insert test.db reporting events-2023-00000-of-00001.json [###################################-] 99% 00:00:00 Error: Invalid JSON - use --csv for CSV or --tsv for TSV files ``` Please show more information as to *why* this is invalid, if possible. 
I am using version 3.30 with Python 3.10 on Windows 11.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/532/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1620516340,I_kwDOCGYnMM5glx30,533,ReadTheDocs error: not all arguments converted during string formatting,9599,simonw,closed,0,,,,,2,2023-03-12T21:21:05Z,2023-03-12T21:25:33Z,2023-03-12T21:25:33Z,OWNER,,"This came up as a failure running tests for: - #531 Traceback on https://readthedocs.org/projects/sqlite-utils/builds/19749348/ ``` File ""/home/docs/checkouts/readthedocs.org/user_builds/sqlite-utils/envs/531/lib/python3.8/site-packages/docutils/parsers/rst/states.py"", line 889, in interpreted nodes, messages2 = role_fn(role, rawsource, text, lineno, self) File ""/home/docs/checkouts/readthedocs.org/user_builds/sqlite-utils/envs/531/lib/python3.8/site-packages/sphinx/ext/extlinks.py"", line 103, in role title = caption % part TypeError: not all arguments converted during string formatting Exception occurred: File ""/home/docs/checkouts/readthedocs.org/user_builds/sqlite-utils/envs/531/lib/python3.8/site-packages/sphinx/ext/extlinks.py"", line 103, in role title = caption % part TypeError: not all arguments converted during string formatting ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/533/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1622640374,I_kwDOCGYnMM5gt4b2,534, ResourceWarning: unclosed file,1244826,djhenderson,closed,0,,,,,1,2023-03-14T03:02:18Z,2023-05-08T19:56:29Z,2023-05-08T19:56:29Z,NONE,,"Issuing either ``` py -Wdefault -m sqlite_utils insert dogs.db dogs dogs0.csv --csv [#############-----------------------] 36% [####################################] 100%C:\Users\Doug\AppData\Local\Programs\Python\Python311\Lib\site-packages\sqlite_utils\cli.py:1187: ResourceWarning: unclosed file <_io.TextIOWrapper name='dogs0.csv' encoding='utf-8-sig'> insert_upsert_implementation( ResourceWarning: Enable tracemalloc to get the object allocation traceback ``` or ``` set pythonwarnings=default sqlite-utils insert dogs.db dogs dogs0.csv --csv [#############-----------------------] 36% [####################################] 100%C:\Users\Doug\AppData\Local\Programs\Python\Python311\Lib\site-packages\sqlite_utils\cli.py:1187: ResourceWarning: unclosed file <_io.TextIOWrapper name='dogs0.csv' encoding='utf-8-sig'> insert_upsert_implementation( ResourceWarning: Enable tracemalloc to get the object allocation traceback ``` exhibits a ResourceWarning indicating that the CSV file being loaded is not closed. 
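A sketch of the likely shape of a fix - close the handle deterministically once the rows have been consumed (the names here are illustrative, not the actual cli.py internals):
```python
import csv

def iter_csv_rows(path):
    # The with-block guarantees the file is closed when the generator
    # is exhausted or garbage collected, silencing the ResourceWarning
    with open(path, newline='', encoding='utf-8-sig') as fp:
        yield from csv.DictReader(fp)
```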
sqlite-utils --version sqlite-utils, version 3.30 py --version Python 3.11.2 Windows Version 10.0.19045 Build 19045 SQLite version 3.41.0 2023-02-21 18:09:37 ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/534/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1655860104,I_kwDOCGYnMM5ismuI,535,rows: --transpose or psql extended view-like functionality,7908073,chapmanjacobd,closed,0,,,,,2,2023-04-05T15:37:33Z,2023-06-15T08:39:49Z,2023-06-14T22:05:28Z,CONTRIBUTOR,,"It would be nice if the rows subcommand had a flag, perhaps called `--transpose` which would print in long form instead of wide. Similar to extended display mode in psql (`\x`) In other words instead of this: ``` sqlite-utils rows --limit 5 --fmt github track_metadata.db songs ``` | track_id | title | song_id | release | artist_id | artist_mbid | artist_name | duration | artist_familiarity | artist_hotttnesss | year | track_7digitalid | shs_perf | shs_work | |--------------------|-------------------|--------------------|--------------------------------------|--------------------|--------------------------------------|------------------|------------|----------------------|---------------------|--------|--------------------|------------|------------| | TRMMMYQ128F932D901 | Silent Night | SOQMMHC12AB0180CB8 | Monster Ballads X-Mas | ARYZTJS1187B98C555 | 357ff05d-848a-44cf-b608-cb34b5701ae5 | Faster Pussy cat | 252.055 | 0.649822 | 0.394032 | 2003 | 7032331 | -1 | 0 | | TRMMMKD128F425225D | Tanssi vaan | SOVFVAK12A8C1350D9 | Karkuteillä | ARMVN3U1187FB3A1EB | 8d7ef530-a6fd-4f8f-b2e2-74aec765e0f9 | Karkkiautomaatti | 156.551 | 0.439604 | 0.356992 | 1995 | 1514808 | -1 | 0 | | TRMMMRX128F93187D9 | No One Could Ever | SOGTUKN12AB017F4F1 | Butter | ARGEKB01187FB50750 | 3d403d44-36ce-465c-ad43-ae877e65adc4 | Hudson Mohawke | 138.971 | 0.643681 | 0.437504 | 2006 | 6945353 | -1 | 0 | | TRMMMCH128F425532C | Si Vos Querés | SOBNYVR12A8C13558C | De Culo | ARNWYLR1187B9B2F9C | 12be7648-7094-495f-90e6-df4189d68615 | Yerba Brava | 145.058 | 0.448501 | 0.372349 | 2003 | 2168257 | -1 | 0 | | TRMMMWA128F426B589 | Tangle Of Aspens | SOHSBXH12A8C13B0DF | Rene Ablaze Presents Winter Sessions | AREQDTE1269FB37231 | | Der Mystic | 514.298 | 0 | 0 | 0 | 2264873 | -1 | 0 | The output would look something like this: ``` $ for col in (sqlite-columns track_metadata.db songs) sqlite-utils --fmt github track_metadata.db ""select $col from songs order by rowid desc limit 5"" end ``` | track_id | |--------------------| | TRYYYVU12903CD01E3 | | TRYYYDJ128F9310A21 | | TRYYYMG128F4260ECA | | TRYYYJO128F426DA37 | | TRYYYUS12903CD2DF0 | | title | |-------------------------------------| | Fernweh feat. Sektion Kuchikäschtli | | Faraday | | Novemba | | Jago Chhadeo | | O Samba Da Vida | | song_id | |--------------------| | SOWXJXQ12AB0189F43 | | SOLXGOR12A81C21EB7 | | SOHODZI12A8C137BB3 | | SOXQYIQ12A8C137FBB | | SOTXAME12AB018F136 | | release | |---------------------------------| | So Oder So | | The Trance Collection Vol. 2 | | Dub_Connected: electronic music | | Naale Baba Lassi Pee Gya | | Pacha V.I.P. 
| | artist_id | |--------------------| | AR7PLM21187B990D08 | | ARCMCOK1187B9B1073 | | ARZ3R6M1187B9AF750 | | ART5FZD1187B9A7FCF | | AR7Z4J81187FB3FC59 | | artist_mbid | |--------------------------------------| | 3af2b07e-c91c-4160-9bda-f0b9e3144ed3 | | 4ac5f3de-c5ad-475e-ad50-41f1ef9dba20 | | 8b97e9c8-61f5-4615-9a96-276f24204e34 | | 2357c400-9109-42b6-b3fe-9e2d9f8e3872 | | 9d50cb20-7e42-45cc-b0dd-154c3e92a577 | | artist_name | |----------------| | Texta | | Elude | | Gabriel Le Mar | | Kuldeep Manak | | Kiko Navarro | | duration | |------------| | 295.079 | | 484.519 | | 553.038 | | 244.166 | | 217.443 | | artist_familiarity | |----------------------| | 0.552977 | | 0.403668 | | 0.556918 | | 0.4015 | | 0.528617 | | artist_hotttnesss | |---------------------| | 0.454869 | | 0.256935 | | 0.336914 | | 0.374866 | | 0.411595 | | year | |--------| | 2004 | | 0 | | 0 | | 0 | | 0 | | track_7digitalid | |--------------------| | 8486723 | | 5472456 | | 2219291 | | 1632096 | | 7522478 | | shs_perf | |------------| | -1 | | -1 | | -1 | | -1 | | -1 | | shs_work | |------------| | 0 | | 0 | | 0 | | 0 | | 0 | ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/535/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1695428235,I_kwDOCGYnMM5lDi6L,538,`table.upsert_all` fails to write rows when `not_null` is present,1231935,xavdid,closed,0,,,,,9,2023-05-04T07:30:38Z,2023-05-08T20:06:35Z,2023-05-08T19:27:02Z,NONE,,"I found an odd bug today, where calls to `table.upsert_all` don't write rows if you include the `not_null` kwarg. ## Repro Example ```py from sqlite_utils import Database db = Database(""upsert-test.db"") db[""comments""].upsert_all( [{""id"": 1, ""name"": ""david""}], pk=""id"", not_null=[""name""], ) assert list(db[""comments""].rows) # err! ``` The schema is correctly created: ```sql CREATE TABLE [comments] ( [id] INTEGER PRIMARY KEY, [name] TEXT NOT NULL ) ``` But no rows are created. Removing either the `not_null` kwargs works as expected, as does an `insert_all` call. 
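A workaround sketch until this is fixed: create the table (and its NOT NULL constraint) up front, after which `upsert_all` works without the `not_null` kwarg:
```python
from sqlite_utils import Database

db = Database('upsert-test.db')
# Declare the schema once, with the constraint, before upserting
db['comments'].create({'id': int, 'name': str}, pk='id', not_null={'name'}, if_not_exists=True)
db['comments'].upsert_all([{'id': 1, 'name': 'david'}], pk='id')
assert list(db['comments'].rows)  # passes
```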
## Version Info
- Python: `3.11.0`
- sqlite-utils: `3.30`
- sqlite: `3.39.5 2022-10-14`",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/538/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1699174055,I_kwDOCGYnMM5lR1an,539,"`--raw-lines` option, like `--raw` for multiple lines",9599,simonw,closed,0,,,,,4,2023-05-07T18:07:46Z,2023-05-07T18:43:24Z,2023-05-07T18:26:18Z,OWNER,,I wanted to output newline-separated output of the first column of every row in the results - like `--raw` but for more than one line.,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/539/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1699184583,I_kwDOCGYnMM5lR3_H,540,sphinx.builders.linkcheck build error,9599,simonw,closed,0,,,,,4,2023-05-07T18:37:09Z,2023-05-08T04:56:13Z,2023-05-07T18:42:36Z,OWNER,,"https://readthedocs.org/projects/sqlite-utils/builds/20512693/
```
Running Sphinx v6.2.1

Traceback (most recent call last):
  File ""/home/docs/checkouts/readthedocs.org/user_builds/sqlite-utils/envs/latest/lib/python3.8/site-packages/sphinx/registry.py"", line 442, in load_extension
    mod = import_module(extname)
  File ""/home/docs/checkouts/readthedocs.org/user_builds/sqlite-utils/envs/latest/lib/python3.8/importlib/__init__.py"", line 127, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File ""<frozen importlib._bootstrap>"", line 1014, in _gcd_import
  File ""<frozen importlib._bootstrap>"", line 991, in _find_and_load
  File ""<frozen importlib._bootstrap>"", line 975, in _find_and_load_unlocked
  File ""<frozen importlib._bootstrap>"", line 671, in _load_unlocked
  File ""<frozen importlib._bootstrap_external>"", line 783, in exec_module
  File ""<frozen importlib._bootstrap>"", line 219, in _call_with_frames_removed
  File ""/home/docs/checkouts/readthedocs.org/user_builds/sqlite-utils/envs/latest/lib/python3.8/site-packages/sphinx/builders/linkcheck.py"", line 20, in <module>
    from requests import Response
  File ""/home/docs/checkouts/readthedocs.org/user_builds/sqlite-utils/envs/latest/lib/python3.8/site-packages/requests/__init__.py"", line 43, in <module>
    import urllib3
  File ""/home/docs/checkouts/readthedocs.org/user_builds/sqlite-utils/envs/latest/lib/python3.8/site-packages/urllib3/__init__.py"", line 38, in <module>
    raise ImportError(
ImportError: urllib3 v2.0 only supports OpenSSL 1.1.1+, currently the 'ssl' module is compiled with OpenSSL 1.0.2n 7 Dec 2017. 
See: https://github.com/urllib3/urllib3/issues/2168

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File ""/home/docs/checkouts/readthedocs.org/user_builds/sqlite-utils/envs/latest/lib/python3.8/site-packages/sphinx/cmd/build.py"", line 280, in build_main
    app = Sphinx(args.sourcedir, args.confdir, args.outputdir,
  File ""/home/docs/checkouts/readthedocs.org/user_builds/sqlite-utils/envs/latest/lib/python3.8/site-packages/sphinx/application.py"", line 225, in __init__
    self.setup_extension(extension)
  File ""/home/docs/checkouts/readthedocs.org/user_builds/sqlite-utils/envs/latest/lib/python3.8/site-packages/sphinx/application.py"", line 404, in setup_extension
    self.registry.load_extension(self, extname)
  File ""/home/docs/checkouts/readthedocs.org/user_builds/sqlite-utils/envs/latest/lib/python3.8/site-packages/sphinx/registry.py"", line 445, in load_extension
    raise ExtensionError(__('Could not import extension %s') % extname,
sphinx.errors.ExtensionError: Could not import extension sphinx.builders.linkcheck (exception: urllib3 v2.0 only supports OpenSSL 1.1.1+, currently the 'ssl' module is compiled with OpenSSL 1.0.2n 7 Dec 2017. See: https://github.com/urllib3/urllib3/issues/2168)

Extension error:
Could not import extension sphinx.builders.linkcheck (exception: urllib3 v2.0 only supports OpenSSL 1.1.1+, currently the 'ssl' module is compiled with OpenSSL 1.0.2n 7 Dec 2017. See: https://github.com/urllib3/urllib3/issues/2168)
```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/540/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1701018909,I_kwDOCGYnMM5lY30d,543,Tests broken on Windows due to new convert() lambda names,9599,simonw,closed,0,,,,,0,2023-05-08T22:11:29Z,2023-05-08T22:19:04Z,2023-05-08T22:19:04Z,OWNER,,"https://github.com/simonw/sqlite-utils/actions/runs/4920084038/jobs/8788501314
```python
sql = 'update [example] set [dt] = lambda_-9223371942137158589([dt]);'
```
From:
- #526",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/543/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1700840265,I_kwDOCGYnMM5lYMNJ,541,Get tests to pass with `pytest -Werror`,9599,simonw,open,0,,,,,1,2023-05-08T19:57:23Z,2023-05-08T19:59:35Z,,OWNER,,"Inspired by:
- #534",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/541/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1700936245,I_kwDOCGYnMM5lYjo1,542,Remove `skip_false=True` and `--no-skip-false` in `sqlite-utils` 4.0,9599,simonw,open,0,,,9374594,4.0 backwards incomatible changes,1,2023-05-08T21:04:28Z,2023-05-08T21:07:41Z,,OWNER,,"Following:
- #527

The only reason I didn't remove this mis-feature entirely is that it represents a backwards incompatible change. 
I'll make that change in 4.0.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/542/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1718607907,I_kwDOCGYnMM5mb-Aj,551,Make as many examples in the CLI docs as possible copy-and-pastable,9599,simonw,closed,0,,,,,6,2023-05-21T19:04:10Z,2023-05-21T21:04:04Z,2023-05-21T20:57:24Z,OWNER,,"e.g. in this section: https://sqlite-utils.datasette.io/en/stable/cli.html#running-queries-directly-against-csv-or-json The little copy button will also copy the `$ ` which breaks the examples when copied.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/551/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1718572201,I_kwDOCGYnMM5mb1Sp,547,No need to show common values if everything is null,9599,simonw,closed,0,,,,,1,2023-05-21T17:05:07Z,2023-05-21T17:19:21Z,2023-05-21T17:19:21Z,OWNER,,"Noticed this: ``` % sqlite-utils analyze-tables content.db repos -c delete_branch_on_merge --common-limit 20 --no-least repos.delete_branch_on_merge: (1/1) Total rows: 158 Null rows: 158 Blank rows: 0 Distinct values: 0 Most common: 158: None ``` The `158: None` there is duplicate information considering we already know there are 158/158 null rows.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/547/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1718576761,I_kwDOCGYnMM5mb2Z5,548,analyze-tables should validate provide --column names,9599,simonw,closed,0,,,,,1,2023-05-21T17:20:24Z,2023-05-21T17:35:52Z,2023-05-21T17:35:52Z,OWNER,,"Noticed this while testing: - #547 If you pass a non-existent column to `-c/--column` you don't get an error message.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/548/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1718595700,I_kwDOCGYnMM5mb7B0,550,AttributeError: 'EntryPoints' object has no attribute 'get' for flake8 on Python 3.7,9599,simonw,closed,0,,,,,3,2023-05-21T18:24:39Z,2023-05-21T18:42:25Z,2023-05-21T18:41:58Z,OWNER,,"https://github.com/simonw/sqlite-utils/actions/runs/5039064797/jobs/9036965488 ``` Traceback (most recent call last): File ""/opt/hostedtoolcache/Python/3.7.16/x64/bin/flake8"", line 8, in sys.exit(main()) File ""/opt/hostedtoolcache/Python/3.7.16/x64/lib/python3.7/site-packages/flake8/main/cli.py"", line 22, in main app.run(argv) File ""/opt/hostedtoolcache/Python/3.7.16/x64/lib/python3.7/site-packages/flake8/main/application.py"", line 363, in run self._run(argv) File ""/opt/hostedtoolcache/Python/3.7.16/x64/lib/python3.7/site-packages/flake8/main/application.py"", line 350, in _run self.initialize(argv) File ""/opt/hostedtoolcache/Python/3.7.16/x64/lib/python3.7/site-packages/flake8/main/application.py"", line 330, in initialize self.find_plugins(config_finder) File ""/opt/hostedtoolcache/Python/3.7.16/x64/lib/python3.7/site-packages/flake8/main/application.py"", line 153, in find_plugins self.check_plugins = plugin_manager.Checkers(local_plugins.extension) File 
""/opt/hostedtoolcache/Python/3.7.16/x64/lib/python3.7/site-packages/flake8/plugins/manager.py"", line 357, in __init__ self.namespace, local_plugins=local_plugins File ""/opt/hostedtoolcache/Python/3.7.16/x64/lib/python3.7/site-packages/flake8/plugins/manager.py"", line 238, in __init__ self._load_entrypoint_plugins() File ""/opt/hostedtoolcache/Python/3.7.16/x64/lib/python3.7/site-packages/flake8/plugins/manager.py"", line 254, in _load_entrypoint_plugins eps = importlib_metadata.entry_points().get(self.namespace, ()) AttributeError: 'EntryPoints' object has no attribute 'get' Error: Process completed with exit code 1. ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/550/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1718612569,I_kwDOCGYnMM5mb_JZ,552,Document how to setup shell auto-completion,9599,simonw,closed,0,,,,,1,2023-05-21T19:20:41Z,2023-05-21T21:05:16Z,2023-05-21T21:03:40Z,OWNER,,"https://click.palletsprojects.com/en/8.1.x/shell-completion/ This works for `zsh`: eval ""$(_SQLITE_UTILS_COMPLETE=zsh_source sqlite-utils)"" This will probably work for `bash`: eval ""$(_SQLITE_UTILS_COMPLETE=bash_source sqlite-utils)"" Need to add this to the installation docs here: https://sqlite-utils.datasette.io/en/stable/installation.html - along with the pattern for adding that to `.zshrc` or whatever.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/552/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1718515590,I_kwDOCGYnMM5mbneG,544,New options for analyze-tables --common-limit --no-most and --no-least,9599,simonw,closed,0,,,,,2,2023-05-21T14:03:19Z,2023-05-21T17:03:06Z,2023-05-21T16:19:31Z,OWNER,,"The ""least common"" section is frequently uninteresting, especially for huge tables with a large number of repeated-once values. sqlite-utils analyze-tables content.db repos --common-limit 20 --no-least",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/544/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1718517882,I_kwDOCGYnMM5mboB6,545,Try out Trogon for a tui interface,9599,simonw,closed,0,,,,,6,2023-05-21T14:08:25Z,2023-05-21T19:33:13Z,2023-05-21T18:41:58Z,OWNER,,https://github.com/Textualize/trogon,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/545/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1720096994,I_kwDOCGYnMM5mhpji,554,"`IndexError` when doing `.insert(..., pk='id')` after `insert_all`",1231935,xavdid,open,0,,,,,1,2023-05-22T17:13:02Z,2023-05-22T17:18:33Z,,NONE,,"I believe this is related to https://github.com/simonw/sqlite-utils/issues/98. When `pk` is specified by table A's `insert` call, it throws an index error if a different table has written a row with a higher rowid than exists in the first table. 
Here's a basic example:
```py
from sqlite_utils import Database


def test_pk_for_insert(fresh_db):
    user = {""id"": ""abc"", ""name"": ""david""}

    fresh_db[""users""].insert(user, pk=""id"")

    fresh_db[""comments""].insert_all(
        [
            {""id"": ""def"", ""text"": ""ok""},
            {""id"": ""ghi"", ""text"": ""great""},
        ],
    )

    fresh_db[""users""].insert(
        user,
        ignore=True,
        # BUG: when specifying pk on the second insert call
        # db.py goes into a block it doesn't expect and we get the error
        pk=""id"",
    )


if __name__ == ""__main__"":
    db = Database(""bug.db"")
    if db[""users""].exists():
        raise ValueError(
            ""bug only shows on a new database - remove bug.db before running the script""
        )
    test_pk_for_insert(db)
```
The error is:
```py
  File ""/Users/david/projects/reddit-to-sqlite/.venv/lib/python3.11/site-packages/sqlite_utils/db.py"", line 2960, in insert_chunk
    row = list(self.rows_where(""rowid = ?"", [self.last_rowid]))[0]
          ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~^^^
IndexError: list index out of range
```
The issue is in this block: https://github.com/simonw/sqlite-utils/blob/2747257a3334d55e890b40ec58fada57ae8cfbfd/sqlite_utils/db.py#L2954-L2958

Relevant locals are:
- `pk`: `'id'`
- `result.lastrowid`: `2`

What's most interesting is the comment `# self.last_rowid will be 0 if a ""INSERT OR IGNORE"" happened`, which doesn't seem to be the case here. ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/554/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1733198948,I_kwDOCGYnMM5nToRk,555,Filter table by a large bunch of ids,10843208,redraw,open,0,,,,,1,2023-05-31T00:29:51Z,2023-06-14T22:01:57Z,,NONE,,"Hi! This might be a question related to both SQLite & sqlite-utils, and you might be more experienced with them. I have a large bunch of ids, and I'm wondering which is the best way to query them in terms of performance, and simplicity if possible. The naive approach would be something like `select * from table where rowid in (?, ?, ?...)` but that wouldn't scale if ids are >1k. Another approach might be creating a temp table, or in-memory db table, inserting all the ids into that table and then joining with the target one. I failed to attach an in-memory db both using sqlite-utils and plain sql's execute(), so my closest approach is something like:
```python
def filter_existing_video_ids(video_ids):
    db = get_db()  # contains a ""videos"" table
    db.execute(""CREATE TEMPORARY TABLE IF NOT EXISTS tmp (video_id TEXT NOT NULL PRIMARY KEY)"")
    db[""tmp""].insert_all([{""video_id"": video_id} for video_id in video_ids])
    for row in db[""tmp""].rows_where(""video_id not in (select video_id from videos)""):
        yield row[""video_id""]
    db[""tmp""].drop()
```
That kinda worked, but I couldn't find an option in sqlite-utils's `create_table()` to mark it as a temporary table. Also, the `tmp` table is never dropped at the end, not even when calling `.drop()`, despite being created with the keyword `TEMPORARY`. 
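As an aside, an alternative sketch that avoids the temporary table entirely by passing the ids in as a JSON array (assumes SQLite's JSON1 functions are available):
```python
import json

def filter_existing_video_ids(db, video_ids):
    # json_each() turns the serialized id list into a queryable table
    sql = (
        'select value as video_id from json_each(:ids) '
        'where value not in (select video_id from videos)'
    )
    for row in db.query(sql, {'ids': json.dumps(list(video_ids))}):
        yield row['video_id']
```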
From what I've read, though, I believe it should be dropped automatically when the connection/session ends.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/555/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1740026046,I_kwDOCGYnMM5ntrC-,556,Support storing incrementally piped values,601708,mcint,open,0,,,,,1,2023-06-04T00:45:23Z,2023-06-04T01:21:15Z,,CONTRIBUTOR,,"I'm trying to use sqlite-utils to store data generated incrementally. There are a few aspects of this that I don't currently know how to handle. I would like an option to apply writes incrementally, line-by-line as they are received. I would like an option to echo incremental progress. And, it would be nice to have both.

In particular, I'm using CoreLocationCLI -w -j to generate newline-delimited JSON. One variant of the command:

`stdbuf -oL CoreLocationCLI -w -j | pee 'sqlite-utils insert loc.db loc -' nl`

`pee`, from `moreutils`, is like `tee` but spawns and pipes to the processes created by invoking each of its arguments, so, for gratuitous demonstration, `pee 'sponge out.log' cat` would behave like `tee`.

It looks like I can get what I want with:

`stdbuf -oL CoreLocationCLI -w -j | while read line; do <<<""$line"" sqlite-utils insert loc.db loc -; echo ""$line""; done | nl` ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/556/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1740150327,I_kwDOCGYnMM5nuJY3,557,Aliased ROWID option for tables created from alter=True commands,7908073,chapmanjacobd,closed,0,,,,,2,2023-06-04T05:29:28Z,2023-06-14T06:09:21Z,2023-06-05T19:26:26Z,CONTRIBUTOR,,"> If you use INTEGER PRIMARY KEY column, the VACUUM does not change the values of that column. However, if you use unaliased rowid, the VACUUM command will reset the rowid values.

ROWID should never be used with foreign keys but the simple act of aliasing rowid to id (which is what happens when one does `id integer primary key` DDL) makes it OK. It would be convenient if there were more options to use a string column (e.g. filepath) as the PK, and be able to use it during upserts, but when creating a foreign key, to create an integer column which aliases rowid. I made an attempt to switch to integer primary keys here but it is not going well... In my use case the path column is a business key. Yes, it should be as simple as including the `id` column in any select statement where I plan on using `upsert` but it would be nice if this could be abstracted away somehow https://github.com/chapmanjacobd/library/commit/788cd125be01d76f0fe2153335d9f6b21db1343c https://github.com/chapmanjacobd/library/actions/runs/5173602136/jobs/9319024777",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/557/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1754174496,I_kwDOCGYnMM5ojpQg,558,Ability to define unique columns when creating a table,1910303,aguinane,open,0,,,,,0,2023-06-13T06:56:19Z,2023-08-18T01:06:03Z,,NONE,,"When creating a new table, it would be good to have an option to set unique columns similar to how not_null is set. 
```python
from sqlite_utils import Database

columns = {""mRID"": str, ""name"": str}
db = Database(""example.db"")
db[""ExampleTable""].create(columns, pk=""mRID"", not_null=[""mRID""], if_not_exists=True)
db[""ExampleTable""].create_index([""mRID""], unique=True, if_not_exists=True)
```
So something like this would add the UNIQUE flag to the table definition:
```python
db[""ExampleTable""].create(columns, pk=""mRID"", not_null=[""mRID""], unique=[""mRID""], if_not_exists=True)
```
```sql
CREATE TABLE ExampleTable (
    mRID TEXT PRIMARY KEY NOT NULL UNIQUE,
    name TEXT
);
```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/558/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1777548699,I_kwDOCGYnMM5p8z2b,561,`--stop-after` option for `insert` and `upsert` commands,9599,simonw,closed,0,,,,,1,2023-06-27T18:44:15Z,2023-06-27T18:50:09Z,2023-06-27T18:50:08Z,OWNER,,I found myself wanting to insert rows from an 849MB CSV file without processing the whole thing: https://huggingface.co/datasets/jerpint-org/HackAPrompt-Playground-Submissions/tree/main,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/561/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1773450152,I_kwDOCGYnMM5ptLOo,559,sqlean support,9599,simonw,closed,0,,,,,0,2023-06-25T19:27:26Z,2023-06-25T23:25:53Z,2023-06-25T23:25:53Z,OWNER,,"If sqlean is available, use that.

Refs:
- https://github.com/nalgeon/sqlean.py/issues/1#issuecomment-1605707788

This will provide a good workaround for:
- #235 ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/559/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1784794489,I_kwDOCGYnMM5qYc15,562,Explore the intersection between sqlite-utils and dataclasses,9599,simonw,open,0,,,,,1,2023-07-02T19:23:08Z,2023-07-02T19:26:39Z,,OWNER,,"> Aside: this makes me think it might be cool if `sqlite-utils` had a way of working with dataclasses rather than just dicts, and knew how to create a SQLite table to match a dataclass and maybe how to code-generate dataclasses for a specific table schema (dynamically or even using code-generation that can be written to disk, for better editor integrations).

_Originally posted by @simonw in https://github.com/simonw/llm/issues/65#issuecomment-1616742529_ ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/562/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1785360409,I_kwDOCGYnMM5qanAZ,563,`--empty-null` option when importing CSV,9599,simonw,closed,0,,,,,1,2023-07-03T05:23:36Z,2023-07-03T05:44:43Z,2023-07-03T05:42:30Z,OWNER,,"CSV files with empty cells (which come through as the empty string) are common and a bit gross. Having an option that means ""and if it's an empty string, store `null` instead"" would be cool. 
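Per row, the behaviour would presumably look something like this sketch:
```python
def empty_to_null(row):
    # Turn empty-string cells (the CSV import default) into None,
    # which sqlite-utils stores as SQL NULL
    return {key: (None if value == '' else value) for key, value in row.items()}

# e.g. {'id': '1', 'name': ''} becomes {'id': '1', 'name': None}
```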
I brainstormed name options here https://chat.openai.com/share/c947b738-ee7d-419c-af90-bc84e90987da",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/563/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1786243905,I_kwDOCGYnMM5qd-tB,564,Document that running `db.transform()` tidies up the schema indentation,9599,simonw,closed,0,,,,,0,2023-07-03T13:59:28Z,2023-07-22T22:15:34Z,2023-07-22T22:15:34Z,OWNER,,"> ... and it turns out running `.transform()` with no arguments still fixes the format of the schema!

```pycon
>>> db[""log""].add_column(""foo"", str)
>>> db[""log""].add_column(""bar"", str)
>>> db[""log""].add_column(""baz"", str)
>>> print(db[""log""].schema) CREATE TABLE ""log"" ( [id] INTEGER PRIMARY KEY, [name2] TEXT, [age] INTEGER, [weight] FLOAT , [foo] TEXT, [bar] TEXT, [baz] TEXT) >>> db[""log""].transform()
>>> print(db[""log""].schema) CREATE TABLE ""log"" ( [id] INTEGER PRIMARY KEY, [name2] TEXT, [age] INTEGER, [weight] FLOAT, [foo] TEXT, [bar] TEXT, [baz] TEXT ) ``` _Originally posted by @simonw in https://github.com/simonw/llm/issues/65#issuecomment-1618347727_ ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/564/reactions"", ""total_count"": 1, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 1}",,completed 1786258502,I_kwDOCGYnMM5qeCRG,565,Table renaming: db.rename_table() and sqlite-utils rename-table,9599,simonw,closed,0,,,,,6,2023-07-03T14:07:42Z,2023-07-22T22:12:40Z,2023-07-22T22:12:40Z,OWNER,,"> I find myself wanting two new features in `sqlite-utils`: > - The ability to have the new transformed table set to a specific name, while keeping the old table around > - The ability to rename a table (`sqlite-utils` doesn't have a table rename function at all right now) _Originally posted by @simonw in https://github.com/simonw/llm/issues/65#issuecomment-1618375042_ ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/565/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1795219865,I_kwDOCGYnMM5rAOGZ,566,`--no-headers` doesn't work on most formats,33625,zellyn,open,0,,,,,2,2023-07-09T03:43:36Z,2023-07-09T04:13:35Z,,NONE,,"Version 3.33 ``` sqlite-utils query library.db 'select asin from audible' --fmt plain --no-headers | head -3 asin 0062804006 0062891421 ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/566/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1801394744,I_kwDOCGYnMM5rXxo4,567,Plugin system,15178711,asg017,closed,0,,,,,9,2023-07-12T17:02:14Z,2023-07-22T22:59:37Z,2023-07-22T22:59:36Z,CONTRIBUTOR,,"I'd like there to be a plugin system for sqlite-utils, similar to the datasette/llm plugins. I'd like to make plugins that would do things like: - Register SQLite extensions for more SQL functions + virtual tables - Register new subcommands - Different input file formats for `sqlite-utils memory` - Different output file formats (in addition to `--csv` `--tsv` `--nl` etc. A few real-world use-cases of plugins I'd like to see in sqlite-utils: - Register many of my sqlite extensions in sqlite-utils (`sqlite-http`, `sqlite-lines`, `sqlite-regex`, etc.) - New subcommands to work with `sqlite-vss` vector tables - Input/ouput Parquet/Avro/Arrow IPC files with `sqlite-arrow`",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/567/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1816876211,I_kwDOCGYnMM5sS1Sz,571,`.transform(keep_table=...)` option,9599,simonw,closed,0,,,,,1,2023-07-22T19:49:29Z,2023-07-22T22:32:18Z,2023-07-22T22:32:18Z,OWNER,,">> Also need a design for an option for the `.transform()` method to indicate that the new table should be created with a new name without dropping the old one. > > I think `keep_table=""name_of_table""` is good for this. 
_Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/565#issuecomment-1646657324_ ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/571/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1816877910,I_kwDOCGYnMM5sS1tW,572,Don't test Python 3.7 against textual,9599,simonw,closed,0,,,,,2,2023-07-22T19:57:03Z,2023-07-22T22:16:50Z,2023-07-22T22:16:50Z,OWNER,,"Spotted this in the GitHub Actions logs:

![IMG_5046](https://github.com/simonw/sqlite-utils/assets/9599/81fb1093-cd8a-4019-a612-2e49b500c933) ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/572/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1816919568,I_kwDOCGYnMM5sS_4Q,575,Python API ability to opt-out of connection plugins,9599,simonw,closed,0,,,,,2,2023-07-22T23:01:13Z,2023-07-22T23:17:22Z,2023-07-22T23:08:22Z,OWNER,,"Plugins affecting the CLI by default makes sense to me. I'm less confident about them _always_ affecting users of the Python API. I'm going to have them apply by default, but I'm going to add a mechanism to opt-out on an individual database basis. Basically this:
```python
from sqlite_utils import Database

db = Database(memory=True, execute_plugins=False)
# Anything using db from here on will not execute plugins
```
cc @asg017

Refs:
- #567
- #574 ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/575/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1816918185,I_kwDOCGYnMM5sS_ip,574,`prepare_connection()` plugin hook,9599,simonw,closed,0,,,,,3,2023-07-22T22:52:47Z,2023-07-22T23:13:14Z,2023-07-22T22:59:10Z,OWNER,,"> Splitting off an issue for `prepare_connection()` since Alex got the PR in seconds before I shipped 3.34! 
_Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/567#issuecomment-1646686424_ ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/574/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1816851056,I_kwDOCGYnMM5sSvJw,568,"table.create(..., replace=True)",9599,simonw,closed,0,,,,,7,2023-07-22T18:12:22Z,2023-07-22T19:25:35Z,2023-07-22T19:15:44Z,OWNER,,"Found myself using this pattern to quickly prototype a schema:
```python
import sqlite_utils

db = sqlite_utils.Database(memory=True)

print(db[""answers_chunks""].create({
    ""id"": int,
    ""content"": str,
    ""embedding_type_id"": int,
    ""embedding"": bytes,
    ""embedding_content_md5"": str,
    ""source"": str,
}, pk=""id"", transform=True).schema)
```
Using `replace=True` to drop and then recreate the table would be neat here, and would be consistent with other places that use `replace=True`.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/568/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1816852402,I_kwDOCGYnMM5sSvey,569,register_command plugin hook,9599,simonw,closed,0,,,,,3,2023-07-22T18:17:27Z,2023-07-22T19:19:35Z,2023-07-22T19:19:35Z,OWNER,,"> I'm going to start by adding the `register_command` hook using the exact same pattern as Datasette and LLM.

_Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/567#issuecomment-1646643450_ ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/569/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1816857105,I_kwDOCGYnMM5sSwoR,570,`sqlite-utils install -e` option,9599,simonw,closed,0,,,,,0,2023-07-22T18:32:23Z,2023-07-22T18:55:59Z,2023-07-22T18:32:56Z,OWNER,,"As seen in LLM.

Needed while working on:
- #567",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/570/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1816997390,I_kwDOCGYnMM5sTS4O,576,Backfill the release notes prior to 0.4,9599,simonw,closed,0,,,,,2,2023-07-23T05:41:42Z,2023-07-23T05:49:51Z,2023-07-23T05:48:21Z,OWNER,,"Currently the changelog starts at 0.4: https://sqlite-utils.datasette.io/en/3.34/changelog.html#id115

I want the other releases - according to https://pypi.org/project/sqlite-utils/#history there are three missing: ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/576/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1817289521,I_kwDOCGYnMM5sUaMx,577,Get `add_foreign_keys()` to work without modifying `sqlite_master`,9599,simonw,closed,0,,,,,9,2023-07-23T20:40:18Z,2023-08-18T17:43:11Z,2023-08-18T00:48:10Z,OWNER,,"https://github.com/simonw/sqlite-utils/blob/13ebcc575d2547c45e8d31288b71a3242c16b886/sqlite_utils/db.py#L1165-L1174

This is the only place in the code that attempts to modify `sqlite_master` directly, which fails on some Python installations. Could this use the `.transform()` trick instead? 
Or automatically switch to that trick if it hits an error?",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/577/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1818838294,I_kwDOCGYnMM5saUUW,578,Plugin hook for adding new output formats,9599,simonw,open,0,,,,,5,2023-07-24T17:29:18Z,2023-08-07T15:41:49Z,,OWNER,,"> What would it take to add a format hook? I'm still thinking about my GIS workflow, and being able to do `sqlite-utils query ... --geojson` would be nice. It's the one place my Datasette workflow is messy, having to do `datasette . --get /path/to/query.geojson --setting max_rows_returned 10000 --load-extension spatialite`.
> I know the current pattern is `--csv`, but maybe `--format geojson` is more future-proof.

https://discord.com/channels/823971286308356157/997738192360964156/1133076679011602432",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/578/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1821108702,I_kwDOCGYnMM5si-ne,579,Special handling for SQLite column of type `JSON`,15178711,asg017,open,0,,,,,0,2023-07-25T20:37:23Z,2023-07-25T20:37:23Z,,CONTRIBUTOR,,"`sqlite-utils` should detect and have special handling for columns with a declared `JSON` type. For example:
```sql
CREATE TABLE ""dogs"" (
  id INTEGER PRIMARY KEY,
  name TEXT,
  friends JSON
);
```

## Automatic Nesting
According to [""Nested JSON Values""](https://sqlite-utils.datasette.io/en/stable/cli.html#nested-json-values), sqlite-utils will only expand JSON if the `--json-cols` flag is passed. It looks like it'll try to `json.load` all text columns to test if they contain JSON, which can get expensive on non-json columns. Instead, `sqlite-utils` should by default (i.e. without the `--json-cols` flag) do the `maybe_json()` operation on columns with a declared `JSON` type. So the above table would expand the `""friends""` column as expected, without the `--json-cols` flag:
```bash
sqlite-utils dogs.db ""select * from dogs"" | python -mjson.tool
```
```
[
    {
        ""id"": 1,
        ""name"": ""Cleo"",
        ""friends"": [
            {
                ""name"": ""Pancakes""
            },
            {
                ""name"": ""Bailey""
            }
        ]
    }
]
```

---

I'm sure there's other ways `sqlite-utils` can specially handle JSON columns, so keeping this open while I think of more",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/579/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1822918995,I_kwDOCGYnMM5sp4lT,580,Add way to export to a csv file using the Python library,44324811,kevinlinxc,open,0,,,,,0,2023-07-26T18:09:26Z,2023-07-26T18:09:26Z,,NONE,,"According to the documentation, we can make a csv output using the CLI tool, but not the Python library. 
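In the meantime, a workaround sketch using the stdlib `csv` module on top of `db.query()` (table and file names here are illustrative):
```python
import csv
from sqlite_utils import Database

db = Database('example.db')
with open('out.csv', 'w', newline='') as fp:
    rows = db.query('select * from mytable')  # generator of dicts
    first = next(rows, None)
    if first is not None:
        writer = csv.DictWriter(fp, fieldnames=list(first))
        writer.writeheader()
        writer.writerow(first)
        writer.writerows(rows)
```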
Could we have the latter?",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/580/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1823160748,I_kwDOCGYnMM5sqzms,581,`sqlite-utils convert --pdb` option,9599,simonw,closed,0,,,,,1,2023-07-26T21:02:50Z,2023-07-26T21:07:45Z,2023-07-26T21:06:10Z,OWNER,,While using `sqlite-utils convert` I realized it would be handy if you could pass `--pdb` to have it open the debugger at the first instance of a failed conversion.,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/581/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1839344979,I_kwDOCGYnMM5toi1T,582,Handling CSV/file input that contains NUL bytes,1448859,betatim,open,0,,,,,0,2023-08-07T12:24:14Z,2023-08-07T12:24:14Z,,NONE,,"I was using sqlite-utils to create a DB from a CSV and it turns out the CSV contains a NUL byte. When the processing reaches the line that contains the NUL an exception is raised. I'm wondering if there is something that can be done in `sqlite-utils` to say ""skip lines with encoding errors"" or some such. I think it isn't super straightforward though as the exception comes from inside the `csv` module that does all the parsing. Concretely the file is the `KernelVersions.csv` from https://www.kaggle.com/datasets/kaggle/meta-kaggle This is the command and output: ``` $ sqlite-utils insert --csv kaggle.db kaggle KernelVersions.csv [------------------------------------] 0% [#####################---------------] 60% 00:04:24Traceback (most recent call last): File ""/home/foobar/miniconda/envs/meta-kaggle/bin/sqlite-utils"", line 10, in sys.exit(cli()) File ""/home/foobar/miniconda/envs/meta-kaggle/lib/python3.10/site-packages/click/core.py"", line 1128, in __call__ return self.main(*args, **kwargs) File ""/home/foobar/miniconda/envs/meta-kaggle/lib/python3.10/site-packages/click/core.py"", line 1053, in main rv = self.invoke(ctx) File ""/home/foobar/miniconda/envs/meta-kaggle/lib/python3.10/site-packages/click/core.py"", line 1659, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/home/foobar/miniconda/envs/meta-kaggle/lib/python3.10/site-packages/click/core.py"", line 1395, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/home/foobar/miniconda/envs/meta-kaggle/lib/python3.10/site-packages/click/core.py"", line 754, in invoke return __callback(*args, **kwargs) File ""/home/foobar/miniconda/envs/meta-kaggle/lib/python3.10/site-packages/sqlite_utils/cli.py"", line 1223, in insert insert_upsert_implementation( File ""/home/foobar/miniconda/envs/meta-kaggle/lib/python3.10/site-packages/sqlite_utils/cli.py"", line 1085, in insert_upsert_implementation db[table].insert_all( File ""/home/foobar/miniconda/envs/meta-kaggle/lib/python3.10/site-packages/sqlite_utils/db.py"", line 3198, in insert_all chunk = list(chunk) File ""/home/foobar/miniconda/envs/meta-kaggle/lib/python3.10/site-packages/sqlite_utils/db.py"", line 3742, in fix_square_braces for record in records: File ""/home/foobar/miniconda/envs/meta-kaggle/lib/python3.10/site-packages/sqlite_utils/cli.py"", line 1071, in docs = (decode_base64_values(doc) for doc in docs) File ""/home/foobar/miniconda/envs/meta-kaggle/lib/python3.10/site-packages/sqlite_utils/cli.py"", line 1068, in docs = 
    docs = (verify_is_dict(doc) for doc in docs)
  File ""/home/foobar/miniconda/envs/meta-kaggle/lib/python3.10/site-packages/sqlite_utils/cli.py"", line 1003, in <genexpr>
    docs = (dict(zip(headers, row)) for row in reader)
_csv.Error: line contains NUL
```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/582/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1855836914,I_kwDOCGYnMM5undLy,583,Get rid of test.utils.collapse_whitespace,9599,simonw,closed,0,,,,,1,2023-08-17T23:31:09Z,2023-08-18T00:59:19Z,2023-08-18T00:59:19Z,OWNER,,"I have a neater pattern for this now - instead of: https://github.com/simonw/sqlite-utils/blob/1dc6b5aa644a92d3654f7068110ed7930989ce71/tests/test_create.py#L472-L475

I now prefer: https://github.com/simonw/sqlite-utils/blob/1dc6b5aa644a92d3654f7068110ed7930989ce71/tests/test_create.py#L1163-L1171",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/583/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1855894222,I_kwDOCGYnMM5unrLO,585,CLI equivalents to `transform(add_foreign_keys=)`,9599,simonw,closed,0,,,,,7,2023-08-18T01:07:15Z,2023-08-18T01:51:16Z,2023-08-18T01:51:15Z,OWNER,,"The new options added in:

- #577

Deserve consideration in the CLI as well. https://github.com/simonw/sqlite-utils/blob/d2bcdc00c6ecc01a6e8135e775ffdb87572b802b/sqlite_utils/db.py#L1706-L1708",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/585/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1856075668,I_kwDOCGYnMM5uoXeU,586,.transform() fails to drop column if table is part of a view,9599,simonw,open,0,,,,,3,2023-08-18T05:25:22Z,2023-08-18T06:13:47Z,,OWNER,,"I got this error trying to drop a column from a table that was part of a SQL view:

> error in view plugins: no such table: main.pypi_releases

Upon further investigation I found that this pattern seemed to fix it:

```python
def transform_the_table(conn):
    # Run this in a transaction:
    with conn:
        # We have to read all the views first, because we need to drop and recreate them
        db = sqlite_utils.Database(conn)
        views = {v.name: v.schema for v in db.views if table.lower() in v.schema.lower()}
        for view in views.keys():
            db[view].drop()
        db[table].transform(
            types=types,
            rename=rename,
            drop=drop,
            column_order=[p[0] for p in order_pairs],
        )
        # Now recreate the views
        for name, schema in views.items():
            db.create_view(name, schema)
```

So grab a copy of any view that might reference this table, start a transaction, drop those views, run the transform, recreate the views again.

> I wonder if this should become an option in `sqlite-utils`? Maybe a `recreate_views=True` argument for `table.transform(...)`? Should it be opt-in or opt-out? 
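If that option were added, usage might look something like this (a sketch of the proposed, not-yet-implemented `recreate_views=True` argument; the database, table and column names are placeholders):

```python
import sqlite_utils

db = sqlite_utils.Database('pypi.db')
# Hypothetical: transform() would itself drop and recreate any views
# that reference this table, inside the same transaction.
db['pypi_releases'].transform(drop={'stale_column'}, recreate_views=True)
```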
_Originally posted by @simonw in https://github.com/simonw/datasette-edit-schema/issues/35#issuecomment-1683370548_ ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/586/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1857851384,I_kwDOCGYnMM5uvI_4,587,New .add_foreign_key() can break if PRAGMA legacy_alter_table=ON and there's an invalid foreign key reference,9599,simonw,closed,0,,,,,3,2023-08-19T20:01:26Z,2023-08-19T20:04:33Z,2023-08-19T20:04:32Z,OWNER,,"Extremely detailed story of how I got to this point:

- https://github.com/simonw/llm/issues/162

Steps to reproduce (only if that pragma is on though):

```bash
python -c '
import sqlite_utils
db = sqlite_utils.Database(memory=True)
db.execute(""""""
CREATE TABLE ""logs"" (
   [id] INTEGER PRIMARY KEY,
   [model] TEXT,
   [prompt] TEXT,
   [system] TEXT,
   [prompt_json] TEXT,
   [options_json] TEXT,
   [response] TEXT,
   [response_json] TEXT,
   [reply_to_id] INTEGER,
   [chat_id] INTEGER REFERENCES [log]([id]),
   [duration_ms] INTEGER,
   [datetime_utc] TEXT
);
"""""")
db[""logs""].add_foreign_key(""reply_to_id"", ""logs"", ""id"")
'
```

This succeeds in some environments, fails in others.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/587/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1868713944,I_kwDOCGYnMM5vYk_Y,588,`table.get(column=value)` option for retrieving things not by their primary key,9599,simonw,open,0,,,,,1,2023-08-28T00:41:23Z,2023-08-28T00:41:54Z,,OWNER,,"This came up working on this feature:

- https://github.com/simonw/llm/pull/186

I have a table with this schema:

```sql
CREATE TABLE [collections] (
   [id] INTEGER PRIMARY KEY,
   [name] TEXT,
   [model] TEXT
);
CREATE UNIQUE INDEX [idx_collections_name] ON [collections] ([name]);
```

So the primary key is an integer (because it's going to have a huge number of rows foreign key related to it, and I don't want to store a larger text value thousands of times), but there is a unique constraint on the `name` - that would be the primary key column if not for all of those foreign keys.

Problem is, fetching the collection by name is actually pretty inconvenient. 
Fetch by numeric ID:

```python
try:
    table[""collections""].get(1)
except NotFoundError:
    # It doesn't exist
```

Fetching by name:

```python
def get_collection(db, collection):
    rows = db[""collections""].rows_where(""name = ?"", [collection])
    try:
        return next(rows)
    except StopIteration:
        raise NotFoundError(""Collection not found: {}"".format(collection))
```

It would be neat if, for columns where we know that we should always get 0 or one result, we could do this instead:

```python
try:
    collection = table[""collections""].get(name=""entries"")
except NotFoundError:
    # It doesn't exist
```

The existing `.get()` method doesn't have any non-positional arguments, so using `**kwargs` like that should work: https://github.com/simonw/sqlite-utils/blob/1260bdc7bfe31c36c272572c6389125f8de6ef71/sqlite_utils/db.py#L1495",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/588/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1879209560,I_kwDOCGYnMM5wAnZY,589,Mechanism for de-registering registered SQL functions,9599,simonw,open,0,,,,,3,2023-09-03T19:32:39Z,2023-09-03T19:36:34Z,,OWNER,,I used a custom SQL function in a migration script and then realized that it should be de-registered before the end of the script to avoid leaking into the calling code.,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/589/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1879214365,I_kwDOCGYnMM5wAokd,590,Ability to tell if a Database is an in-memory one,9599,simonw,open,0,,,,,1,2023-09-03T19:50:15Z,2023-09-03T19:50:36Z,,OWNER,,"Currently the constructor accepts `memory=True` or `memory_name=...` and uses those to create a connection, but does not record what those values were: https://github.com/simonw/sqlite-utils/blob/1260bdc7bfe31c36c272572c6389125f8de6ef71/sqlite_utils/db.py#L307-L349 This makes it hard to tell if a database object is for an in-memory or a file-based database, which is sometimes useful to know.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/590/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1886771493,I_kwDOCGYnMM5wddkl,592,`table.transform()` should preserve `rowid` values,9599,simonw,closed,0,,,,,6,2023-09-08T00:42:38Z,2023-09-10T17:46:41Z,2023-09-09T00:45:32Z,OWNER,,"I just spotted a bug when using https://datasette.io/plugins/datasette-configure-fts and https://datasette.io/plugins/datasette-edit-schema at the same time.

Steps to reproduce:

- Configure FTS for a table, then run a test search
- Edit the schema for that table and change the order of columns
- Run the test search again

I got the wrong search results, which I think is because the `_fts` table pointed to the first table by `rowid` but those `rowid` values were entirely rewritten as a consequence of running `table.transform()` on the table. Reconfiguring FTS on the table fixed the problem. 
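A quick way to see the failure mode (a sketch, not taken from the issue itself; it assumes `transform()` rewrites `rowid` values, which is exactly what this report describes):

```python
import sqlite_utils

db = sqlite_utils.Database(memory=True)
db['docs'].insert_all([{'title': 'one'}, {'title': 'two'}])
db['docs'].enable_fts(['title'], create_triggers=True)
before = [r['rowid'] for r in db.query('select rowid from docs order by title')]
db['docs'].transform(column_order=['title'])
after = [r['rowid'] for r in db.query('select rowid from docs order by title')]
# If before != after, the external docs_fts index now points at the wrong rows.
print(before, after)
```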
I think `table.transform()` should be able to preserve `rowid` values.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/592/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1891614971,I_kwDOCGYnMM5wv8D7,594,Represent compound foreign keys in table.foreign_keys output,9599,simonw,open,0,,,,,2,2023-09-12T03:48:24Z,2023-09-12T03:51:13Z,,OWNER,,"Given this schema:

```sql
CREATE TABLE departments (
  campus_name TEXT NOT NULL,
  dept_code TEXT NOT NULL,
  dept_name TEXT,
  PRIMARY KEY (campus_name, dept_code)
);
CREATE TABLE courses (
  course_code TEXT PRIMARY KEY,
  course_name TEXT,
  campus_name TEXT NOT NULL,
  dept_code TEXT NOT NULL,
  FOREIGN KEY (campus_name, dept_code) REFERENCES departments(campus_name, dept_code)
);
```

The output of `db[""courses""].foreign_keys` right now is:

```
[ForeignKey(table='courses', column='campus_name', other_table='departments', other_column='campus_name'),
 ForeignKey(table='courses', column='dept_code', other_table='departments', other_column='dept_code')]
```

Which suggests two normal foreign keys, not one compound foreign key.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/594/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1907281675,I_kwDOCGYnMM5xrs8L,595,Cascading DELETE not working with Table.delete(pk),123451970,cycle-data,closed,0,,,,,1,2023-09-21T15:46:41Z,2023-09-25T09:38:57Z,2023-09-25T09:38:13Z,NONE,,"Hi! I noticed that when I am trying to use the delete method of the Table object, the record gets properly deleted from the table, but the cascading delete triggers on foreign keys do not activate. `self.db[""contact""].delete(contact_id)` I tried querying the database directly via DB Browser and the triggers work without any issue. Looked up the source code, and behind the scenes this method is just querying the database normally, so I'm not exactly sure where this behavior comes from. Thank you in advance for your time! ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/595/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1920416843,I_kwDOCGYnMM5ydzxL,597,sqlite-utils insert-files should be able to convert fields,1737541,grimnight,open,0,,,,,0,2023-09-30T22:20:47Z,2023-09-30T22:20:47Z,,NONE,,"Currently, using both `insert-files` and `convert` is needed in order to create sqlar files; it would be more convenient if it could be done with just one command. 
```shell
~ ❯ cat test.py
import os

class Example:
    def __init__(self, arg1, arg2):
        self.arg1 = arg1

~ ❯ sqlite-utils insert-files test.sqlar sqlar test.py -c name:name -c data:content -c mode:mode -c mtime:mtime -c sz:size --pk=name
  [####################################]  100%
~ ❯ sqlite-utils convert test.sqlar sqlar data ""zlib.compress(value)"" --import=zlib --where ""name = 'test.py'""
  [####################################]  100%
~ ❯ cat test.py | sqlite-utils convert test.sqlar sqlar data ""zlib.compress(sys.stdin.buffer.read())"" --import=zlib --import=sys --where ""name = 'test.py'"" # Alternative way
  [####################################]  100%
~ ❯ sqlite3 test.sqlar ""SELECT hex(data) FROM sqlar WHERE name = 'test.py';"" | python3 -c ""import sys, zlib; sys.stdout.buffer.write(zlib.decompress(bytes.fromhex(sys.stdin.read())))""
import os

class Example:
    def __init__(self, arg1, arg2):
        self.arg1 = arg1

~ ❯ rm test.py
~ ❯ sqlar -l test.sqlar
test.py
~ ❯ sqlar -x test.sqlar
~ ❯ cat test.py
import os

class Example:
    def __init__(self, arg1, arg2):
        self.arg1 = arg1

```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/597/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1871935751,I_kwDOD079W85vk3kH,40, ImportError: cannot import name 'formatargspec' from 'inspect',36752421,hosslikw,closed,0,,,,,0,2023-08-29T15:36:31Z,2023-08-31T03:18:07Z,2023-08-31T03:18:06Z,NONE,,"I get the following error when running ""pip3 install dogsheep-photos""

"" from inspect import ismethod, isclass, formatargspec
ImportError: cannot import name 'formatargspec' from 'inspect' (/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/inspect.py). Did you mean: 'formatargvalues'?""

Python 3.12.0rc1
sqlite 3.43.0
datasette, version 0.64.3",256834907,dogsheep-photos,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-photos/issues/40/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1373210675,I_kwDODD6af85R2Ygz,13,fails before generating views. 
ERR: table sqlite_master may not be modified",116795,pax,open,0,,,,,4,2022-09-14T15:41:50Z,2023-04-11T03:46:17Z,,NONE,,"generates checkins.db but seems to fail before generating views.

Note: it worked on an Ubuntu WSL but fails on macOS 12.5.1.

Later edit: I suspect this is a problem with my local set-up; `dogsheep-beta index` also throws the same error.

Full error:

Importing 2591 checkins  [###################################-]   98%  00:00:00
Traceback (most recent call last):
  File ""/Users/pax/devbox/envAll/bin/swarm-to-sqlite"", line 8, in <module>
    sys.exit(cli())
  File ""/Users/pax/devbox/envAll/lib/python3.8/site-packages/click/core.py"", line 829, in __call__
    return self.main(*args, **kwargs)
  File ""/Users/pax/devbox/envAll/lib/python3.8/site-packages/click/core.py"", line 782, in main
    rv = self.invoke(ctx)
  File ""/Users/pax/devbox/envAll/lib/python3.8/site-packages/click/core.py"", line 1066, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File ""/Users/pax/devbox/envAll/lib/python3.8/site-packages/click/core.py"", line 610, in invoke
    return callback(*args, **kwargs)
  File ""/Users/pax/devbox/envAll/lib/python3.8/site-packages/swarm_to_sqlite/cli.py"", line 77, in cli
    ensure_foreign_keys(db)
  File ""/Users/pax/devbox/envAll/lib/python3.8/site-packages/swarm_to_sqlite/utils.py"", line 145, in ensure_foreign_keys
    db[fk.table].add_foreign_key(fk.column, fk.other_table, fk.other_column)
  File ""/Users/pax/devbox/envAll/lib/python3.8/site-packages/sqlite_utils/db.py"", line 2123, in add_foreign_key
    self.db.add_foreign_keys([(self.name, column, other_table, other_column)])
  File ""/Users/pax/devbox/envAll/lib/python3.8/site-packages/sqlite_utils/db.py"", line 1086, in add_foreign_keys
    cursor.execute(
sqlite3.OperationalError: table sqlite_master may not be modified",205429375,swarm-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/swarm-to-sqlite/issues/13/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1661617056,I_kwDODD6af85jCkOg,15,ambiguous column name: createdAt - on checkin_details view,9599,simonw,closed,0,,,,,0,2023-04-11T01:07:47Z,2023-04-11T03:16:37Z,2023-04-11T03:16:37Z,MEMBER,,"It looks like Swarm changed their schema and now both `venues` and `checkins` have `createdAt` fields. Which breaks this view: https://github.com/dogsheep/swarm-to-sqlite/blob/719b6e96a016d0ca8b316d3bed9c2a7a0cb499ee/swarm_to_sqlite/utils.py#L171-L188",205429375,swarm-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/swarm-to-sqlite/issues/15/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1063982712,I_kwDODEm0Qs4_axZ4,60,Execution on Windows,1733616,bernard01,open,0,,,,,1,2021-11-26T00:24:34Z,2022-10-14T16:58:27Z,,NONE,,"My installation on Windows using pip has been successful. I have Python 3.6. How do I run twitter-to-sqlite? I cannot even figure out how ""auth"" is a command. I have python on my path: C:\prog\python\Python36;C:\prog\python\Python36\Scripts Where should the commands be executed, and where are the files created? 
Could some basics please be added to the documentation to get beginners started?",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/60/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1088816961,I_kwDODEm0Qs5A5gdB,62,KeyError: 'created_at' for private accounts?,6764957,swyxio,closed,0,,,,,2,2021-12-26T17:51:51Z,2022-03-12T02:36:32Z,2022-02-24T18:10:18Z,NONE,,"hey Simon! i was running `twitter-to-sqlite user-timeline twitter.db` for [my private alt](https://twitter.com/swyxio) and ran into this error:
![image](https://user-images.githubusercontent.com/6764957/147416165-46b69c30-100a-406f-8534-8612b75547ae.png)

```bash
Traceback (most recent call last):
  File ""/Users/swyx/Work/datasette/env/bin/twitter-to-sqlite"", line 8, in <module>
    sys.exit(cli())
  File ""/Users/swyx/Work/datasette/env/lib/python3.9/site-packages/click/core.py"", line 1128, in __call__
    return self.main(*args, **kwargs)
  File ""/Users/swyx/Work/datasette/env/lib/python3.9/site-packages/click/core.py"", line 1053, in main
    rv = self.invoke(ctx)
  File ""/Users/swyx/Work/datasette/env/lib/python3.9/site-packages/click/core.py"", line 1659, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File ""/Users/swyx/Work/datasette/env/lib/python3.9/site-packages/click/core.py"", line 1395, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File ""/Users/swyx/Work/datasette/env/lib/python3.9/site-packages/click/core.py"", line 754, in invoke
    return __callback(*args, **kwargs)
  File ""/Users/swyx/Work/datasette/env/lib/python3.9/site-packages/twitter_to_sqlite/cli.py"", line 291, in user_timeline
    profile = utils.get_profile(db, session, **kwargs)
  File ""/Users/swyx/Work/datasette/env/lib/python3.9/site-packages/twitter_to_sqlite/utils.py"", line 133, in get_profile
    save_users(db, [profile])
  File ""/Users/swyx/Work/datasette/env/lib/python3.9/site-packages/twitter_to_sqlite/utils.py"", line 453, in save_users
    transform_user(user)
  File ""/Users/swyx/Work/datasette/env/lib/python3.9/site-packages/twitter_to_sqlite/utils.py"", line 285, in transform_user
    user[""created_at""] = parser.parse(user[""created_at""])
KeyError: 'created_at'
```
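A sketch of the kind of guard that would avoid this crash (an assumption about a possible fix, not the project's actual patch; `transform_user` and the `dateutil` parser usage are taken from the traceback above):

```python
from dateutil import parser

def transform_user(user):
    # Restricted or private profiles may omit fields such as created_at,
    # so only parse it when the API actually returned it.
    if 'created_at' in user:
        user['created_at'] = parser.parse(user['created_at'])
```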
this looks awfully like #37 but it can't be, because i'm authed into my account and obviously i have perms to read my own account. wonder if there's any diagnostic methods i should apply here? just filing an issue for others to find while i investigate.",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/62/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1077560091,I_kwDODEm0Qs5AOkMb,61,"Data Pull fails for ""Essential"" level access to the Twitter API (for Documentation)",57161638,jmnickerson05,open,0,,,,,1,2021-12-11T14:59:41Z,2022-10-31T14:47:58Z,,NONE,,"Per Twitter documentation: https://developer.twitter.com/en/docs/twitter-api/getting-started/about-twitter-api#v2-access-leve This isn't any fault of twitter-to-sqlite of course, but it should probably be documented as a side-note. ![image](https://user-images.githubusercontent.com/57161638/145681272-8c85b3b9-be95-44ff-9760-1bafa4917ce2.png) And this is how I'm surfacing the message from utils.py: ![image](https://user-images.githubusercontent.com/57161638/145681005-2776c0ad-9822-4461-b43a-450ab2e828eb.png) ",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/61/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1091850530,I_kwDODEm0Qs5BFFEi,63,Import archive error 'withheld_in_countries',521097,pauloxnet,open,0,,,,,0,2022-01-01T16:58:59Z,2022-01-01T16:58:59Z,,NONE,,"Importing the twitter archive I received this error: ```bash $ twitter-to-sqlite import archive.db twitter-2021-12-31-.zip birdwatch-note-rating: not yet implemented birdwatch-note: not yet implemented branch-links: not yet implemented community-tweet: not yet implemented contact: not yet implemented device-token: not yet implemented direct-message-mute: not yet implemented mute: not yet implemented periscope-account-information: not yet implemented periscope-ban-information: not yet implemented periscope-broadcast-metadata: not yet implemented periscope-comments-made-by-user: not yet implemented periscope-expired-broadcasts: not yet implemented periscope-followers: not yet implemented periscope-profile-description: not yet implemented professional-data: not yet implemented protected-history: not yet implemented reply-prompt: not yet implemented screen-name-change: not yet implemented smartblock: not yet implemented spaces-metadata: not yet implemented sso: not yet implemented Traceback (most recent call last): File ""/home/paulox/.virtualenvs/dogsheep/bin/twitter-to-sqlite"", line 8, in sys.exit(cli()) File ""/home/paulox/.virtualenvs/dogsheep/lib/python3.9/site-packages/click/core.py"", line 1128, in __call__ return self.main(*args, **kwargs) File ""/home/paulox/.virtualenvs/dogsheep/lib/python3.9/site-packages/click/core.py"", line 1053, in main rv = self.invoke(ctx) File ""/home/paulox/.virtualenvs/dogsheep/lib/python3.9/site-packages/click/core.py"", line 1659, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/home/paulox/.virtualenvs/dogsheep/lib/python3.9/site-packages/click/core.py"", line 1395, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/home/paulox/.virtualenvs/dogsheep/lib/python3.9/site-packages/click/core.py"", line 754, in invoke return __callback(*args, **kwargs) File 
""/home/paulox/.virtualenvs/dogsheep/lib/python3.9/site-packages/twitter_to_sqlite/cli.py"", line 759, in import_ archive.import_from_file(db, filename, content) File ""/home/paulox/.virtualenvs/dogsheep/lib/python3.9/site-packages/twitter_to_sqlite/archive.py"", line 246, in import_from_file db[table_name].insert_all(rows, pk=pk, replace=True) File ""/home/paulox/.virtualenvs/dogsheep/lib/python3.9/site-packages/sqlite_utils/db.py"", line 2625, in insert_all self.insert_chunk( File ""/home/paulox/.virtualenvs/dogsheep/lib/python3.9/site-packages/sqlite_utils/db.py"", line 2406, in insert_chunk result = self.db.execute(query, params) File ""/home/paulox/.virtualenvs/dogsheep/lib/python3.9/site-packages/sqlite_utils/db.py"", line 422, in execute return self.conn.execute(sql, parameters) sqlite3.OperationalError: table archive_tweet has no column named withheld_in_countries ``` I found only a single tweet with the key `withheld_in_countries` in `tweet.js` that seems the problems: ```JSON [ { ""tweet"" : { ""retweeted"" : false, ""source"" : ""Twitter for Android"", ""entities"" : { ""hashtags"" : [ { ""text"" : ""NowOnAndroid"", ""indices"" : [ ""64"", ""77"" ] } ], ""symbols"" : [ ], ""user_mentions"" : [ { ""name"" : ""Periscope"", ""screen_name"" : ""PeriscopeCo"", ""indices"" : [ ""3"", ""15"" ], ""id_str"" : ""1111111111"", ""id"" : ""222222222"" } ], ""urls"" : [ { ""url"" : ""https://t.co/xxxxxxxxx"", ""expanded_url"" : ""https://vine.co/v/xxxxxxxxx"", ""display_url"" : ""vine.co/v/xxxxxxxxxx"", ""indices"" : [ ""78"", ""101"" ] } ] }, ""display_text_range"" : [ ""0"", ""101"" ], ""favorite_count"" : ""0"", ""id_str"" : ""1111111111111111111111"", ""truncated"" : false, ""retweet_count"" : ""0"", ""withheld_in_countries"" : [ ""TR"" ], ""id"" : ""000000000000000000"", ""possibly_sensitive"" : false, ""created_at"" : ""Fri Aug 14 06:04:03 +0000 2015"", ""favorited"" : false, ""full_text"" : ""RT @periscopeco: Travel the world. LIVE. The Global Map is here #NowOnAndroid https://t.co/NZXdsPWROk"", ""lang"" : ""en"" } } ] ``` I solved the error removing the key from the `tweet.js` but I'm reporting this error to improve the project.",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/63/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1097332098,I_kwDODEm0Qs5BZ_WC,64,Include all entities for tweets,111631,max,open,0,,,,,0,2022-01-09T23:35:28Z,2022-01-09T23:35:28Z,,NONE,,"Per our conversation [on Twitter](https://twitter.com/mschoening/status/1480312477246054401): It would be neat if all entities (including URLs) were captured. This way you can ensure, that URLs are parsed out exactly the same way Twitter parses URLs – we all know parsing URLs with a regex ain't fun. Right now, I believe the tool filters out all entities that are not of type `media`.",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/64/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1524431805,I_kwDODEm0Qs5a3Pu9,72,"Import thread, including self- and others' replies",601708,mcint,open,0,,,,,0,2023-01-08T09:51:06Z,2023-01-08T09:51:06Z,,NONE,,"statuses-lookup, home-timeline, mentions (only for auth'ed user) don't cover this. 
`twitter-to-sqlite fetch-thread tw-group1.db 1234123412341234` twitter-to-sqlite focuses on archiving users, but does not easily support archiving conversations or community activity. For reference, this is [implemented in twarc](https://sourcegraph.com/github.com/DocNow/twarc/-/blob/twarc/client.py?L708-766&subtree=true), using a search, optionally recursively. Other research suggests that this formerly, or currently, requires a [search query](https://stackoverflow.com/a/30480103/1020467), use of [undocumented `related_results` api](https://stackoverflow.com/a/9419346/1020467), or with requested inclusion of [newer conversation_id](https://stackoverflow.com/a/68115718/1020467) with subsequent query. ",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/72/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1816830546,I_kwDODEm0Qs5sSqJS,73,Twitter v1 API shutdown,6341745,david-perez,open,0,,,,,0,2023-07-22T16:57:41Z,2023-07-22T16:57:41Z,,NONE,,"I've been using this project reliably over the past two years to periodically download my liked tweets, but unfortunately since 19th July I get: ``` [2023-07-19 21:00:04.937536] File ""/home/pi/code/liked-tweets/lib/python3.7/site-packages/twitter_to_sqlite/utils.py"", line 202, in fetch_timeline [2023-07-19 21:00:04.937606] raise Exception(str(tweets[""errors""])) [2023-07-19 21:00:04.937678] Exception: [{'message': 'You currently have access to a subset of Twitter API v2 endpoints and limited v1.1 endpoints (e.g. media post, oauth) only. If you need access to this endpoint, you may need a different access level. You can learn more here: https://developer.twitter.com/en/portal/product', 'code': 453}] ``` It appears like Twitter has now shut down their v1 endpoints, which is rather gracious of them, considering they [announced they'd be deprecated on 29th April](https://twittercommunity.com/t/reminder-to-migrate-to-the-new-free-basic-or-enterprise-plans-of-the-twitter-api/189737). Unfortunately [retrieving likes using the v2 API](https://developer.twitter.com/en/docs/twitter-api/tweets/likes/introduction) is not part of their [free plan](https://developer.twitter.com/en/portal/products). In fact, with the free plan one can only post and delete tweets and retrieve information about oneself. So I'm afraid this is the end of this very nice project. It was very useful, thank you! 
",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/73/reactions"", ""total_count"": 1, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 1}",, 1353411865,I_kwDODEpn8M5Qq20Z,1,Problem with my user,2467,fernand0,open,0,,,,,0,2022-08-28T16:59:37Z,2022-08-28T16:59:37Z,,NONE,,"If I call the program with: inaturalist-to-sqlite inaturalist.db ftricas the program exits with an error: `Importing 36 observations Traceback (most recent call last): File ""/home/ftricas/.pyenv/versions/3.10.6/bin/inaturalist-to-sqlite"", line 8, in sys.exit(cli()) File ""/home/ftricas/.pyenv/versions/3.10.6/lib/python3.10/site-packages/click/core.py"", line 1130, in __call__ return self.main(*args, **kwargs) File ""/home/ftricas/.pyenv/versions/3.10.6/lib/python3.10/site-packages/click/core.py"", line 1055, in main rv = self.invoke(ctx) File ""/home/ftricas/.pyenv/versions/3.10.6/lib/python3.10/site-packages/click/core.py"", line 1404, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/home/ftricas/.pyenv/versions/3.10.6/lib/python3.10/site-packages/click/core.py"", line 760, in invoke return __callback(*args, **kwargs) File ""/home/ftricas/.pyenv/versions/3.10.6/lib/python3.10/site-packages/inaturalist_to_sqlite/cli.py"", line 51, in cli save_observation(observation, db) File ""/home/ftricas/.pyenv/versions/3.10.6/lib/python3.10/site-packages/inaturalist_to_sqlite/utils.py"", line 34, in save_observation db[""observations""] File ""/home/ftricas/.pyenv/versions/3.10.6/lib/python3.10/site-packages/sqlite_utils/db.py"", line 2965, in insert return self.insert_all( File ""/home/ftricas/.pyenv/versions/3.10.6/lib/python3.10/site-packages/sqlite_utils/db.py"", line 3068, in insert_all self.create( File ""/home/ftricas/.pyenv/versions/3.10.6/lib/python3.10/site-packages/sqlite_utils/db.py"", line 1564, in create self.db.create_table( File ""/home/ftricas/.pyenv/versions/3.10.6/lib/python3.10/site-packages/sqlite_utils/db.py"", line 951, in create_table sql = self.create_table_sql( File ""/home/ftricas/.pyenv/versions/3.10.6/lib/python3.10/site-packages/sqlite_utils/db.py"", line 765, in create_table_sql foreign_keys = self.resolve_foreign_keys(name, foreign_keys or []) File ""/home/ftricas/.pyenv/versions/3.10.6/lib/python3.10/site-packages/sqlite_utils/db.py"", line 702, in resolve_foreign_keys other_table = table.guess_foreign_table(column) File ""/home/ftricas/.pyenv/versions/3.10.6/lib/python3.10/site-packages/sqlite_utils/db.py"", line 2061, in guess_foreign_table raise NoObviousTable( sqlite_utils.db.NoObviousTable: No obvious foreign key table for column 'taxon' - tried ['taxon', 'taxons'] ` If I call the program with your user everything seems to go well and then, I can call the program with my own user without problems. Moreover, I can call the program again with my own user and everything goes well now. Additional info, the command: sqlite-utils tables inaturalist.db shows that the correct name can be 'taxons'. 
There is another small problem with a warning: warnings.warn(""urllib3 ({}) or chardet ({})/charset_normalizer ({}) doesn't match a supported "" ",206202864,inaturalist-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/inaturalist-to-sqlite/issues/1/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1485017981,I_kwDODEpn8M5Yg5N9,2,table identifications has no column named previous_observation_taxon,520541,heaversm,open,0,,,,,0,2022-12-08T16:47:17Z,2022-12-08T16:47:17Z,,NONE,,"Installed successfully with pip and ran `inaturalist-to-sqlite inaturalist.db simonw` and got the error: ``` sqlite3.OperationalError: table identifications has no column named previous_observation_taxon ```",206202864,inaturalist-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/inaturalist-to-sqlite/issues/2/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1123393829,I_kwDODFE5qs5C9aEl,10,sqlite3.OperationalError: no such table: main.my_activity,69208826,glxblt14,open,0,,,,,1,2022-02-03T17:59:29Z,2022-03-20T02:38:07Z,,NONE,,"Hello, When i run the command `google-takeout-to-sqlite my-activity db.db takeout-20220203T174446Z-001.zip`, i get this error : ``` Traceback (most recent call last): File ""c:\users\julie\appdata\local\programs\python\python39-32\lib\runpy.py"", line 197, in _run_module_as_main return _run_code(code, main_globals, None, File ""c:\users\julie\appdata\local\programs\python\python39-32\lib\runpy.py"", line 87, in _run_code exec(code, run_globals) File ""C:\Users\julie\AppData\Local\Programs\Python\Python39-32\Scripts\google-takeout-to-sqlite.exe\__main__.py"", line 7, in File ""c:\users\julie\appdata\local\programs\python\python39-32\lib\site-packages\click\core.py"", line 1128, in __call__ return self.main(*args, **kwargs) File ""c:\users\julie\appdata\local\programs\python\python39-32\lib\site-packages\click\core.py"", line 1053, in main rv = self.invoke(ctx) File ""c:\users\julie\appdata\local\programs\python\python39-32\lib\site-packages\click\core.py"", line 1659, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""c:\users\julie\appdata\local\programs\python\python39-32\lib\site-packages\click\core.py"", line 1395, in invoke return ctx.invoke(self.callback, **ctx.params) File ""c:\users\julie\appdata\local\programs\python\python39-32\lib\site-packages\click\core.py"", line 754, in invoke return __callback(*args, **kwargs) File ""c:\users\julie\appdata\local\programs\python\python39-32\lib\site-packages\google_takeout_to_sqlite\cli.py"", line 31, in my_activity utils.save_my_activity(db, zf) File ""c:\users\julie\appdata\local\programs\python\python39-32\lib\site-packages\google_takeout_to_sqlite\utils.py"", line 19, in save_my_activity db[""my_activity""].create_index([""time""]) File ""c:\users\julie\appdata\local\programs\python\python39-32\lib\site-packages\sqlite_utils\db.py"", line 629, in create_index self.db.conn.execute(sql) sqlite3.OperationalError: no such table: main.my_activity ``` Thank you for your help Sorry for my bad English EDIT: i used the json format",206649770,google-takeout-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/google-takeout-to-sqlite/issues/10/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 
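The `no such table: main.my_activity` crash above happens because the import never created the `my_activity` table before the code tried to add an index to it. A defensive sketch of a workaround (an assumption, not the project's code; it uses documented sqlite-utils APIs):

```python
import sqlite_utils

db = sqlite_utils.Database('db.db')
# Only add the index if the import actually created the table.
if 'my_activity' in db.table_names():
    db['my_activity'].create_index(['time'], if_not_exists=True)
```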
1557599877,I_kwDODFE5qs5c1xaF,12,location history changes,14809320,gerardrbentley,open,0,,,,,0,2023-01-26T03:57:25Z,2023-01-26T03:57:25Z,,NONE,,"not sure if each download is unique, but I had to change some things to work with the takeout zip I made 2023-01-25

filename changed from ""Location History.json"" to ""Records.json""

`""timestampMs""` is not present, `""timestamp""` is roughly iso timestamp

```py
def get_timestamp_ms(raw_timestamp):
    try:
        return datetime.datetime.strptime(raw_timestamp, ""%Y-%m-%dT%H:%M:%SZ"").timestamp()
    except ValueError:
        return datetime.datetime.strptime(raw_timestamp, ""%Y-%m-%dT%H:%M:%S.%fZ"").timestamp()

def save_location_history(db, zf):
    location_history = json.load(
        zf.open(""Takeout/Location History/Records.json"")
    )
    db[""location_history""].upsert_all(
        (
            {
                ""id"": id_for_location_history(row),
                ""latitude"": row[""latitudeE7""] / 1e7,
                ""longitude"": row[""longitudeE7""] / 1e7,
                ""accuracy"": row[""accuracy""],
                ""timestampMs"": get_timestamp_ms(row[""timestamp""]),
                ""when"": row[""timestamp""],
            }
            for row in location_history[""locations""]
        ),
        pk=""id"",
    )

def id_for_location_history(row):
    # We want an ID that is unique but can be sorted by in
    # date order - so we use the isoformat date + the first
    # 6 characters of a hash of the JSON
    first_six = hashlib.sha1(
        json.dumps(row, separators=("","", "":""), sort_keys=True).encode(""utf8"")
    ).hexdigest()[:6]
    return ""{}-{}"".format(
        row['timestamp'],
        first_six,
    )
```

example locations from mine

```json
{
  ""latitudeE7"": 427220206,
  ""longitudeE7"": -923423972,
  ""accuracy"": 10,
  ""deviceTag"": -1312429967,
  ""deviceDesignation"": ""PRIMARY"",
  ""timestamp"": ""2019-01-08T23:31:50.867Z""
}
```

```json
{
  ""latitudeE7"": 427011317,
  ""longitudeE7"": -923448300,
  ""accuracy"": 5,
  ""deviceTag"": -1312429967,
  ""deviceDesignation"": ""PRIMARY"",
  ""timestamp"": ""2019-01-08T23:33:53Z""
},
```",206649770,google-takeout-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/google-takeout-to-sqlite/issues/12/reactions"", ""total_count"": 2, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 2}",, 1071071397,I_kwDODFdgUs4_10Cl,69,View that combines issues and issue comments,9599,simonw,open,0,,,,,1,2021-12-04T00:34:33Z,2021-12-04T00:34:52Z,,MEMBER,,I want to see a reverse chronologically ordered interface onto both issues and comments - essentially a unified log of comments and issues opened across one or multiple projects.,207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/69/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1177059481,I_kwDODFdgUs5GKICZ,71,Store commit parents,64686,carltongibson,closed,0,,,,,0,2022-03-22T17:06:48Z,2022-04-22T12:44:04Z,2022-04-22T12:44:04Z,NONE,,"Hi @simonw 👋

Currently, stored commit data doesn't quite give me the information I'm needing...

Committer date and author date are not 100% reliable for dividing a commit history up by release or branch. A PR created before a release but merged after can have earlier dates… — this can be quite frustrating if you're trying to pin down commits for a release: _It should be there!_, but then isn't. (This gets worse using release branches.)

Would you be open to adding the `sha` of a `parent` of a commit to the commit table? (As an FK? 🤔 — likely not feasible.) 
It's part of the [response body](https://docs.github.com/en/rest/reference/commits#get-a-commit):

```
""parents"": [
    {
      ""url"": ""https://api.github.com/repos/octocat/Hello-World/commits/6dcb09b5b57875f334f61aebed695e2e4193db5e"",
      ""sha"": ""6dcb09b5b57875f334f61aebed695e2e4193db5e""
    }
],
```

I think this list should only have a single entry. (🤔 — not sure why it's a list then...)

With this it would be possible to build/reconstruct a chain of commits from the history, that I don't **think** is available as yet (unless you know a better way). It is certainly possible to get sequential lists of commits out of git directly, so the same would be possible combining tools, but wondering if a single tool could do it.

What do you think? Thanks! 🏅 ",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/71/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1211283427,I_kwDODFdgUs5IMrfj,72,feature: display progress bar when downloading multi-page responses,9020979,hydrosquall,open,0,,,,,1,2022-04-21T16:37:12Z,2022-04-21T17:29:31Z,,NONE,,"## Motivation

For a long running command (longer than 1 minute) for a big table (like pull requests or commits), it can be tricky to know if the script is still running, or if a rate limit/error was encountered.

We know how many pages there are, so it may be possible to indicate how many remain. https://github.com/dogsheep/github-to-sqlite/blob/a6e237f75a4b86963d91dcb5c9582e3a1b3349d6/github_to_sqlite/utils.py#L367

## Resources

- Using the existing Click API:
  - https://click.palletsprojects.com/en/5.x/utils/#showing-progress-bars
- Loading spinner: https://github.com/pavdmyt/yaspin
- Progress bar: https://github.com/tqdm/tqdm",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/72/reactions"", ""total_count"": 3, ""+1"": 3, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1308461063,I_kwDODFdgUs5N_YgH,74,500 error in github-to-sqlite demo,9599,simonw,closed,0,,,,,5,2022-07-18T19:39:32Z,2022-07-18T21:16:18Z,2022-07-18T21:14:22Z,MEMBER,,"https://github-to-sqlite.dogsheep.net/github/issue_comments throws a 500:

> `cannot import name 'etree' from 'markdown.util' (/usr/local/lib/python3.8/site-packages/markdown/util.py)`

https://console.cloud.google.com/run/detail/us-central1/github-to-sqlite/metrics?project=datasette-222320 suggests this started happening 3 days ago.",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/74/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1363244199,I_kwDODFdgUs5RQXSn,75,Fetch repos doesn't support organisations,2757699,OverkillGuy,open,0,,,,,0,2022-09-06T12:55:06Z,2022-09-06T12:55:06Z,,NONE,,"Say I want to get all my Github Org's repos info, for data analysis. Not just the public repos, but also the private/internal repos. 
The endpoints are different for organisations, and this tool doesn't take it into account: https://github.com/dogsheep/github-to-sqlite/blob/ace13ec3d98090d99bd71871c286a4a612c96a50/github_to_sqlite/utils.py#L453 https://github.com/dogsheep/github-to-sqlite/blob/ace13ec3d98090d99bd71871c286a4a612c96a50/github_to_sqlite/utils.py#L455 The endpoint for organisation repos is instead ([source](https://docs.github.com/en/rest/repos/repos#list-organization-repositories)): `url = ""https://api.github.com/orgs/{}/repos"".format(username)` Let's add support for organisation repo scraping.",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/75/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1410548368,I_kwDODFdgUs5UE0KQ,77,Feature: Support GitHub discussions,631242,frosencrantz,open,0,,,,,0,2022-10-16T16:53:38Z,2022-10-16T16:53:38Z,,CONTRIBUTOR,,"Hi @simonw I've been a happy user of this tool. Thank you for writing it and sharing it. I wanted to suggest a feature request to support Discussions. For example the VisiData project has discussions https://github.com/saulpw/visidata/discussions , and it would be useful if there was a way to pull that data into the database. However, I'm not offering a pull request.",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/77/reactions"", ""total_count"": 2, ""+1"": 2, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1505411725,I_kwDODFdgUs5ZusKN,78,self-hosted or corp github enterprise,549431,ebdavison,open,0,,,,,0,2022-12-20T22:51:45Z,2022-12-20T22:51:45Z,,NONE,,"We use github enterprise at work and I would like to use this tool to pull info from that site rather than the public github.com instance. Is there an option for this? If not, can one be added for a custom repo URL?",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/78/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1570375808,I_kwDODFdgUs5dmgiA,79,Deploy demo job is failing due to rate limit,9599,simonw,open,0,,,,,2,2023-02-03T20:05:01Z,2023-12-08T14:50:15Z,,MEMBER,,https://github.com/dogsheep/github-to-sqlite/actions/runs/4080058087/jobs/7032116511,207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/79/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1246826792,I_kwDODLZ_YM5KUREo,10,"When running `auth` command, don't overwrite an existing auth.json file",11887,ashanan,closed,0,,,,,3,2022-05-24T16:42:20Z,2022-09-07T15:07:38Z,2022-08-22T16:17:19Z,NONE,,"Ran the `auth` command in the same directory I'd previously set up an auth.json file for `twitter-to-sqlite` and it was completely overwritten. Not the biggest issue, but still unexpected. 
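A minimal sketch of the merge-instead-of-overwrite behaviour suggested next (it assumes the flat JSON-object layout these dogsheep tools use for `auth.json`):

```python
import json
import pathlib

def save_auth(new_keys, path='auth.json'):
    # Merge new credentials into any existing file rather than clobbering it.
    p = pathlib.Path(path)
    existing = json.loads(p.read_text()) if p.exists() else {}
    existing.update(new_keys)
    p.write_text(json.dumps(existing, indent=4))
```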
Ideally, for me, the keys would just be added to the existing file, but getting a warning and a chance to back out would be a good solution as well.",213286752,pocket-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/pocket-to-sqlite/issues/10/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1345452427,I_kwDODLZ_YM5QMfmL,11,"-a option is used for ""--auth"" and for ""--all""",2467,fernand0,closed,0,,,,,3,2022-08-21T10:50:48Z,2022-08-21T21:11:57Z,2022-08-21T21:11:57Z,NONE,,"I'm not sure which option is best, instead of -a -all.",213286752,pocket-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/pocket-to-sqlite/issues/11/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1795187493,I_kwDODLZ_YM5rAGMl,12,Switch to pyproject.toml,9599,simonw,closed,0,,,,,2,2023-07-09T01:06:56Z,2023-07-09T01:19:43Z,2023-07-09T01:19:42Z,MEMBER,,First of my CLI tools to use https://til.simonwillison.net/python/pyproject,213286752,pocket-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/pocket-to-sqlite/issues/12/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1205867842,I_kwDODtX3eM5H4BVC,4,Retrieve the top-level story for a comment,1755789,telotortium,open,0,,,,,0,2022-04-15T20:25:39Z,2022-04-15T20:25:39Z,,NONE,,"I think that each comment inserted into the database should include a column `onstory` that contains the ID of the story on which the comment was made. This is exactly equivalent to the link after ""on:"" at the top of an HN comment page ([example](https://news.ycombinator.com/item?id=18358028)). 
We could do this either by directly retrieving the HTML page and using Beautiful Soup to find that link, or alternatively recurse up the tree in the Firebase API using the `parent` field (probably using `functools.lru_cache` in case a person has commented a bunch of times on the same story).",248903544,hacker-news-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/hacker-news-to-sqlite/issues/4/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1943259395,I_kwDOEhK-wc5z08kD,16, time data '2014-11-21T11:44:12.000Z' does not match format '%Y%m%dT%H%M%SZ',3746270,linonetwo,open,0,,,,,0,2023-10-14T13:24:39Z,2023-10-14T13:24:39Z,,NONE,," ``` evernote-to-sqlite enex evernote.db ./我的笔记.enex Importing from ENEX [#####-------------------------------] 14% Traceback (most recent call last): File ""/usr/local/bin/evernote-to-sqlite"", line 8, in sys.exit(cli()) ^^^^^ File ""/usr/local/lib/python3.11/site-packages/click/core.py"", line 1157, in __call__ return self.main(*args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^ File ""/usr/local/lib/python3.11/site-packages/click/core.py"", line 1078, in main rv = self.invoke(ctx) ^^^^^^^^^^^^^^^^ File ""/usr/local/lib/python3.11/site-packages/click/core.py"", line 1688, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File ""/usr/local/lib/python3.11/site-packages/click/core.py"", line 1434, in invoke return ctx.invoke(self.callback, **ctx.params) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File ""/usr/local/lib/python3.11/site-packages/click/core.py"", line 783, in invoke return __callback(*args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^ File ""/usr/local/lib/python3.11/site-packages/evernote_to_sqlite/cli.py"", line 31, in enex save_note(db, note) File ""/usr/local/lib/python3.11/site-packages/evernote_to_sqlite/utils.py"", line 46, in save_note ""created"": convert_datetime(created), ^^^^^^^^^^^^^^^^^^^^^^^^^ File ""/usr/local/lib/python3.11/site-packages/evernote_to_sqlite/utils.py"", line 111, in convert_datetime return datetime.datetime.strptime(s, ""%Y%m%dT%H%M%SZ"").isoformat() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File ""/usr/local/Cellar/python@3.11/3.11.5/Frameworks/Python.framework/Versions/3.11/lib/python3.11/_strptime.py"", line 568, in _strptime_datetime tt, fraction, gmtoff_fraction = _strptime(data_string, format) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File ""/usr/local/Cellar/python@3.11/3.11.5/Frameworks/Python.framework/Versions/3.11/lib/python3.11/_strptime.py"", line 349, in _strptime raise ValueError(""time data %r does not match format %r"" % ValueError: time data '2014-11-21T11:44:12.000Z' does not match format '%Y%m%dT%H%M%SZ' ``` enex is exported by evernote mac client ",303218369,evernote-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/evernote-to-sqlite/issues/16/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1616347574,I_kwDOJHON9s5gV4G2,1,Initial proof of concept with ChatGPT,9599,simonw,closed,0,,,,,3,2023-03-09T03:44:39Z,2023-03-09T03:51:55Z,2023-03-09T03:51:55Z,MEMBER,,I'm using ChatGPT to figure out enough AppleScript to get at my notes data.,611552758,apple-notes-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/1/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, 
""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1616354999,I_kwDOJHON9s5gV563,2,First working version,9599,simonw,closed,0,,,,,7,2023-03-09T03:53:00Z,2023-03-09T05:10:22Z,2023-03-09T05:10:22Z,MEMBER,,"It's going to shell out to `osascript` as seen in: - #1 I'm going with that option because https://appscript.sourceforge.io/status.html warns against the other potential methods: > Apple eliminated its Mac Automation department in 2016. The future of AppleScript and its related technologies is unclear. Caveat emptor. But `osascript` looks pretty stable to me.",611552758,apple-notes-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/2/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1616422013,I_kwDOJHON9s5gWKR9,3,`apple-notes-to-sqlite --dump` option,9599,simonw,closed,0,,,,,0,2023-03-09T05:05:49Z,2023-03-09T05:06:14Z,2023-03-09T05:06:14Z,MEMBER,,"Option that doesn't write to the database at all, it just outputs all the notes to stdout as newline-delimited JSON.",611552758,apple-notes-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/3/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1616429236,I_kwDOJHON9s5gWMC0,4,Support incremental updates,9599,simonw,open,0,,,,,2,2023-03-09T05:14:00Z,2023-03-09T18:20:56Z,,MEMBER,,"Running this script can take several hours against a large notes database. Would be neat if it could run against just the notes that have been modified since it last ran. Could pull the max `updated` date and then keep on looping until it finds one modified before then. Problem is I don't actually know what order it iterates over the notes in.",611552758,apple-notes-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/4/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1616440856,I_kwDOJHON9s5gWO4Y,5,Configure full text search,9599,simonw,open,0,,,,,0,2023-03-09T05:20:46Z,2023-03-09T05:20:46Z,,MEMBER,,"FTS would be useful. Maybe even extract the plain text from the notes to make that index easier to create, rather than creating it against the HTML. 
Can use the `plaintext` property for that.",611552758,apple-notes-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/5/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1617602868,I_kwDOJHON9s5gaqk0,6,Character encoding problem,9599,simonw,open,0,,,,,2,2023-03-09T16:44:34Z,2023-04-14T15:22:09Z,,MEMBER,,"I ran against a recent note with this in it: > Or just ""Actions ⚙️ "" And got back: > `Actions ‚öôÔ∏è` Pasting that into https://ftfy.vercel.app/?s=Actions+%E2%80%9A%C3%B6%C3%B4%C3%94%E2%88%8F%C3%A8+ gives this: ```python s = 'Actions â\x80\x9aöôÃ\x94â\x88\x8fè' s = s.encode('latin-1') s = s.decode('utf-8') s = s.encode('macroman') s = s.decode('utf-8') print(s) ``` ",611552758,apple-notes-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/6/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1617938730,I_kwDOJHON9s5gb8kq,9,"Default to just storing plaintext, store HTML if `--html` is passed",9599,simonw,open,0,,,,,0,2023-03-09T20:19:06Z,2023-03-09T20:19:06Z,,MEMBER,,"The full `body` version of the notes can get HUGE, due to embedded images. It turns out for my own purposes I'm usually happy with just the `plaintext` version. I'm tempted to say you don't get HTML unless you pass a `--html` option.",611552758,apple-notes-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/9/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1617769847,I_kwDOJHON9s5gbTV3,7,Folder support,9599,simonw,closed,0,,,,,6,2023-03-09T18:21:33Z,2023-03-09T20:48:18Z,2023-03-09T20:48:18Z,MEMBER,,Notes can live in folders. These relationships should be exported too.,611552758,apple-notes-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/7/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1617823309,I_kwDOJHON9s5gbgZN,8,Increase performance using macnotesapp,41546558,RhetTbull,closed,0,,,,,1,2023-03-09T18:51:05Z,2023-03-14T22:00:22Z,2023-03-14T22:00:21Z,NONE,,"Neat project! You can probably increase performance using my python interface to Notes, [macnotesapp](https://github.com/RhetTbull/macnotesapp), which uses Scripting Bridge and bulk queries for much better performance than AppleScript. Another related project is [PyXA](https://github.com/SKaplanOfficial/PyXA) which uses Scripting Bridge to access Notes (and many other apps) and can return all the notes at once as opposed to calling AppleScript for each note. macnotesapp allows you to access multiple accounts and folders as well. 
```python
from macnotesapp import NotesApp

# NotesApp() provides interface to Notes.app
notesapp = NotesApp()

# Get list of notes (Note objects for each note)
notes = notesapp.notes()
note = notes[0]
print(
    note.id,
    note.account,
    note.folder,
    note.name,
    note.body,
    note.plaintext,
    note.password_protected,
)
print(note.asdict())
```",611552758,apple-notes-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/8/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1617962395,I_kwDOJHON9s5gcCWb,10,Include schema in README,9599,simonw,closed,0,,,,,0,2023-03-09T20:38:59Z,2023-03-09T20:48:18Z,2023-03-09T20:48:18Z,MEMBER,,As seen in other tools like https://github.com/simonw/git-history,611552758,apple-notes-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/10/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1618130434,I_kwDOJHON9s5gcrYC,11,Implement a SQL view to make it easier to query files in a nested folder,9599,simonw,open,0,,,,,3,2023-03-09T23:19:28Z,2023-03-09T23:24:01Z,,MEMBER,,"Working with nested data in SQL is tricky, can I make it easier with a view or canned query?",611552758,apple-notes-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/11/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1650981564,I_kwDOJHON9s5iZ_q8,12,Error running pytest,14314871,amlestin,open,0,,,,,0,2023-04-02T15:02:36Z,2023-04-02T15:07:10Z,,NONE,,"`______________________________________________________ ERROR collecting tests/test_apple_notes_to_sqlite.py _______________________________________________________
ImportError while importing test module '/Users/lol/development/apple-notes-to-sqlite/tests/test_apple_notes_to_sqlite.py'.
Hint: make sure your test modules/packages have valid Python names.
Traceback:
/opt/homebrew/Cellar/python@3.9/3.9.16/Frameworks/Python.framework/Versions/3.9/lib/python3.9/importlib/__init__.py:127: in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
tests/test_apple_notes_to_sqlite.py:2: in <module>
    from apple_notes_to_sqlite.cli import cli, COUNT_SCRIPT, FOLDERS_SCRIPT
E   ModuleNotFoundError: No module named 'apple_notes_to_sqlite'`

Solution: This is likely a PYTHONPATH issue due to having pytest installed both globally and in the venv. 
We can guarantee the tests run because `python -m pytest` adds the current directory to `sys.path` automatically. The alternative is to activate the venv, install pytest, deactivate, then activate the venv again (https://stackoverflow.com/questions/35045038/how-do-i-use-pytest-with-virtualenv)",611552758,apple-notes-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/12/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 276192732,MDExOlB1bGxSZXF1ZXN0MTU0MjQ2ODE2,145,Fix pytest version conflict,9599,simonw,closed,0,,,,,0,2017-11-22T20:15:34Z,2017-11-22T20:17:54Z,2017-11-22T20:17:52Z,OWNER,simonw/datasette/pulls/145,"https://travis-ci.org/simonw/datasette/jobs/305929426 pkg_resources.VersionConflict: (pytest 3.2.1 (/home/travis/virtualenv/python3.5.3/lib/python3.5/site-packages), Requirement.parse('pytest==3.2.3'))",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/145/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 280662866,MDExOlB1bGxSZXF1ZXN0MTU3MzY1ODEx,168,Upgrade to Sanic 0.7.0,9599,simonw,closed,0,,,,,1,2017-12-09T01:25:08Z,2017-12-09T03:00:34Z,2017-12-09T03:00:34Z,OWNER,simonw/datasette/pulls/168,,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/168/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 273595473,MDExOlB1bGxSZXF1ZXN0MTUyMzYwNzQw,81,:fire: Removes DS_Store,50527,jefftriplett,closed,0,,,,,2,2017-11-13T22:07:52Z,2017-11-14T02:24:54Z,2017-11-13T22:16:55Z,CONTRIBUTOR,simonw/datasette/pulls/81,,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/81/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 273816720,MDExOlB1bGxSZXF1ZXN0MTUyNTIyNzYy,89,SQL syntax highlighting with CodeMirror,15543,tomdyson,closed,0,,,,,1,2017-11-14T14:43:33Z,2017-11-15T02:03:01Z,2017-11-15T02:03:01Z,CONTRIBUTOR,simonw/datasette/pulls/89,"Addresses #13 Future enhancements could include autocompletion of table and column names, e.g.
with

```javascript
extraKeys: {""Ctrl-Space"": ""autocomplete""},
hintOptions: {tables: {
    users: [""name"", ""score"", ""birthDate""],
    countries: [""name"", ""population"", ""size""]
}}
```

(see https://codemirror.net/doc/manual.html#addon_sql-hint and source at http://codemirror.net/mode/sql/)",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/89/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 273961179,MDExOlB1bGxSZXF1ZXN0MTUyNjMxNTcw,94,Initial add simple prod ready Dockerfile refs #57,247192,macropin,closed,0,,,,,1,2017-11-14T22:09:09Z,2017-11-15T03:08:04Z,2017-11-15T03:08:04Z,CONTRIBUTOR,simonw/datasette/pulls/94,"Multi-stage build based off the official python:3.6-slim image. Example usage:

```
docker run --rm -t -i -p 9000:8001 -v $(pwd)/db:/db datasette datasette serve /db/chinook.db
```",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/94/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 274284246,MDExOlB1bGxSZXF1ZXN0MTUyODcwMDMw,104,[WIP] Add publish to heroku support,21148,jacobian,closed,0,,,,,6,2017-11-15T19:56:22Z,2017-11-21T20:55:05Z,2017-11-21T20:55:05Z,CONTRIBUTOR,simonw/datasette/pulls/104," Refs #90 ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/104/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 274343647,MDExOlB1bGxSZXF1ZXN0MTUyOTE0NDgw,107,add support for ?field__isnull=1,3433657,raynae,closed,0,,,,,4,2017-11-15T23:36:36Z,2017-11-17T15:12:29Z,2017-11-17T13:29:22Z,CONTRIBUTOR,simonw/datasette/pulls/107,Is this what you had in mind for [this issue](https://github.com/simonw/datasette/issues/64)?,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/107/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 274733145,MDExOlB1bGxSZXF1ZXN0MTUzMjAxOTQ1,114,"Add spatialite, switch to debian and local build",54999,ingenieroariel,closed,0,,,,,1,2017-11-17T02:37:09Z,2017-11-17T03:50:52Z,2017-11-17T03:50:52Z,CONTRIBUTOR,simonw/datasette/pulls/114,"Improves the Dockerfile to support spatial datasets, work with the local datasette code (Friendly with git tags and Dockerhub) and moves to slim debian, a small image easy to extend via apt packages for sqlite.",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/114/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 274877366,MDExOlB1bGxSZXF1ZXN0MTUzMzA2ODgy,115,Add keyboard shortcut to execute SQL query,198537,rgieseke,closed,0,,,,,1,2017-11-17T14:13:33Z,2017-11-17T15:16:34Z,2017-11-17T14:22:56Z,CONTRIBUTOR,simonw/datasette/pulls/115,"Very cool tool, thanks a lot! This PR adds a `Shift-Enter` shortcut to execute the SQL query.
I used CodeMirror's keyboard handling.",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/115/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 274900388,MDExOlB1bGxSZXF1ZXN0MTUzMzI0MzAx,117,Don't prevent tabbing to `Run SQL` button,198537,rgieseke,closed,0,,,,,1,2017-11-17T15:27:50Z,2017-11-19T20:30:24Z,2017-11-18T00:53:43Z,CONTRIBUTOR,simonw/datasette/pulls/117,"Mentioned in #115 Here you go!",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/117/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 275048699,MDExOlB1bGxSZXF1ZXN0MTUzNDMyMDQ1,118,Foreign key information on row and table pages,9599,simonw,closed,0,,,,,0,2017-11-18T03:13:27Z,2017-11-18T03:15:57Z,2017-11-18T03:15:50Z,OWNER,simonw/datasette/pulls/118,,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/118/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 291451116,MDExOlB1bGxSZXF1ZXN0MTY1MDI5ODA3,182,Add db filesize next to download link,3433657,raynae,closed,0,,,,,0,2018-01-25T04:58:56Z,2019-03-22T13:50:57Z,2019-02-06T04:59:38Z,CONTRIBUTOR,simonw/datasette/pulls/182,"Took a stab at #172, will this do the trick?",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/182/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 287240246,MDExOlB1bGxSZXF1ZXN0MTYxOTgyNzEx,178,"If metadata exists, add it to heroku launch command",82988,psychemedia,closed,0,,,,,1,2018-01-09T21:42:21Z,2018-01-15T09:42:46Z,2018-01-14T21:05:16Z,CONTRIBUTOR,simonw/datasette/pulls/178,"The heroku build doesn't seem to make use of any provided `metadata.json` file. Add the `--metadata` switch to the Heroku web launch command if a `metadata.json` file is available. Addresses: https://github.com/simonw/datasette/issues/177",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/178/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 289375133,MDExOlB1bGxSZXF1ZXN0MTYzNTIzOTc2,180,make html title more readable in query template,56477,ryanpitts,closed,0,,,,,0,2018-01-17T18:56:03Z,2018-04-03T16:03:38Z,2018-04-03T15:24:05Z,CONTRIBUTOR,simonw/datasette/pulls/180,tiny tweak to make this easier to visually parse—I think it matches your style in other templates,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/180/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 289425975,MDExOlB1bGxSZXF1ZXN0MTYzNTYxODMw,181,"add ""format sql"" button to query page, uses sql-formatter",1957344,bsmithgall,closed,0,,,,,7,2018-01-17T21:50:04Z,2019-11-11T03:08:25Z,2019-11-11T03:08:25Z,NONE,simonw/datasette/pulls/181,"Cool project! This fixes #136 using the suggested [sql formatter](https://github.com/zeroturnaround/sql-formatter) library.
I included the minified version in the bundle and added the relevant scripts to the codemirror includes instead of adding new files, though I could also add new files. I wanted to keep it all together, since the result of the format needs access to the editor in order to properly update the codemirror instance.",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/181/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 310850458,MDExOlB1bGxSZXF1ZXN0MTc5MTA4OTYx,192,New ?_shape=objects/object/lists param for JSON API,9599,simonw,closed,0,,,,,0,2018-04-03T14:02:58Z,2018-04-03T14:53:00Z,2018-04-03T14:52:55Z,OWNER,simonw/datasette/pulls/192,Refs #122,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/192/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 319371036,MDExOlB1bGxSZXF1ZXN0MTg1MzA3NDA3,246,?_shape=array and _timelimit=,9599,simonw,closed,0,,,,,0,2018-05-02T00:18:54Z,2018-05-02T00:20:41Z,2018-05-02T00:20:40Z,OWNER,simonw/datasette/pulls/246,,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/246/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 322591993,MDExOlB1bGxSZXF1ZXN0MTg3NjY4ODkw,257,Refactor views,9599,simonw,closed,0,,,,,5,2018-05-13T13:00:50Z,2018-05-14T03:04:25Z,2018-05-14T03:04:24Z,OWNER,simonw/datasette/pulls/257,"* Split out view classes from main `app.py`
* Run [black](https://github.com/ambv/black) against resulting code to apply opinionated source code formatting
* Run [isort](https://github.com/timothycrosley/isort) to re-order my imports

Refs #256 ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/257/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 322741659,MDExOlB1bGxSZXF1ZXN0MTg3NzcwMzQ1,258,Add new metadata key persistent_urls which removes the hash from all database urls,247131,philroche,closed,0,,,,,3,2018-05-14T09:39:18Z,2018-05-21T07:38:15Z,2018-05-21T07:38:15Z,NONE,simonw/datasette/pulls/258,"Add new metadata key ""persistent_urls"" which removes the hash from all database urls when set to ""true"". This PR is just to gauge if this, or something like it, is something you would consider merging? I understand the reason why the substring of the hash is included in the url but there are some use cases where the urls should persist across deployments. For bookmarks, for example, or for scripts that use the JSON API. This is the initial commit for this feature.
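To make the proposal concrete, here is a sketch of the rewrite `persistent_urls` implies (a hypothetical helper; the 7-character hex hash segment is an assumption based on Datasette's hashed URL style, not taken from the PR):

```python
import re

def strip_hash(path):
    # hypothetical: '/mydb-a1b2c3f/mytable' -> '/mydb/mytable'
    return re.sub(r'^/(?P<name>[^/]+)-[0-9a-f]{7}(?=/|$)', r'/\g<name>', path)

print(strip_hash('/mydb-a1b2c3f/mytable'))  # /mydb/mytable
```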
Tests and documentation updates to follow.",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/258/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 323459939,MDExOlB1bGxSZXF1ZXN0MTg4MzEyNDEx,261,Facets improvements plus suggested facets,9599,simonw,closed,0,,,,,0,2018-05-16T03:52:39Z,2018-05-16T15:27:26Z,2018-05-16T15:27:25Z,OWNER,simonw/datasette/pulls/261,Refs #255,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/261/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 324836533,MDExOlB1bGxSZXF1ZXN0MTg5MzE4NDUz,277,Refactor inspect logic,45057,russss,closed,0,,,,,2,2018-05-21T08:49:31Z,2018-05-22T16:07:24Z,2018-05-22T14:03:07Z,CONTRIBUTOR,simonw/datasette/pulls/277,"This pulls the logic for inspect out into a new file which makes it a bit easier to understand. This was going to be the first part of an implementation for #276, but it seems like that might take a while so I'm going to PR a few bits of refactoring individually.",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/277/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 325352370,MDExOlB1bGxSZXF1ZXN0MTg5NzA3Mzc0,279,Add version number support with Versioneer,198537,rgieseke,closed,0,,,,,4,2018-05-22T15:39:45Z,2018-05-22T19:35:23Z,2018-05-22T19:35:22Z,CONTRIBUTOR,simonw/datasette/pulls/279,"I think that's all for getting Versioneer support, I've been happily using it in a couple of projects ...

```
In [2]: datasette.__version__
Out[2]: '0.22+3.g6e12445'
```

Repo: https://github.com/warner/python-versioneer Versioneer Licence: Public Domain (CC0-1.0) Closes #273 ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/279/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 325373747,MDExOlB1bGxSZXF1ZXN0MTg5NzIzNzE2,280,Build Dockerfile with recent Sqlite + Spatialite,565628,r4vi,closed,0,,,,,10,2018-05-22T16:33:50Z,2018-06-28T11:26:23Z,2018-05-23T17:43:35Z,CONTRIBUTOR,simonw/datasette/pulls/280,"This solves #278 without bloating the Dockerfile too much; the image size is now 495MB (original was ~240MB), but it could be reduced significantly if we only copied the output of the compilation of spatialite and friends to /usr/local/lib instead of the entirety of it; however, that will take more time. In the Python code, change references to `import sqlite3` to `import pysqlite3` and it should use the compiled SQLite 3.23.1. You don't need to try/except because pysqlite3 falls back to builtin sqlite3 if there is no compiled version.
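The import swap described above can also be written defensively, which is the pattern Datasette ultimately merged in PR #361 ("Import pysqlite3 if available"); a small sketch (the PR's own container output follows below):

```python
# Prefer the compiled pysqlite3 when installed, otherwise fall back to
# the standard library module -- per the PR note, pysqlite3 itself also
# falls back internally, so the try/except is belt-and-braces.
try:
    import pysqlite3 as sqlite3
except ImportError:
    import sqlite3

print(sqlite3.sqlite_version)  # e.g. 3.23.1 with the compiled build
```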
```bash
$ docker run --rm -it datasette spatialite
SpatiaLite version ..: 4.4.0-RC0   Supported Extensions:
    - 'VirtualShape'          [direct Shapefile access]
    - 'VirtualDbf'            [direct DBF access]
    - 'VirtualXL'             [direct XLS access]
    - 'VirtualText'           [direct CSV/TXT access]
    - 'VirtualNetwork'        [Dijkstra shortest path]
    - 'RTree'                 [Spatial Index - R*Tree]
    - 'MbrCache'              [Spatial Index - MBR cache]
    - 'VirtualSpatialIndex'   [R*Tree metahandler]
    - 'VirtualElementary'     [ElemGeoms metahandler]
    - 'VirtualKNN'            [K-Nearest Neighbors metahandler]
    - 'VirtualXPath'          [XML Path Language - XPath]
    - 'VirtualFDO'            [FDO-OGR interoperability]
    - 'VirtualGPKG'           [OGC GeoPackage interoperability]
    - 'VirtualBBox'           [BoundingBox tables]
    - 'SpatiaLite'            [Spatial SQL - OGC]
PROJ.4 version ......: Rel. 4.9.3, 15 August 2016
GEOS version ........: 3.5.1-CAPI-1.9.1 r4246
TARGET CPU ..........: x86_64-linux-gnu
the SPATIAL_REF_SYS table already contains some row(s)
SQLite version ......: 3.23.1
Enter "".help"" for instructions
SQLite version 3.23.1 2018-04-10 17:39:29
Enter "".help"" for instructions
Enter SQL statements terminated with a "";""
spatialite>
```

```bash
$ docker run --rm -it datasette python -c ""import pysqlite3; print(pysqlite3.sqlite_version)""
3.23.1
```",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/280/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 325553991,MDExOlB1bGxSZXF1ZXN0MTg5ODYwMDUy,281,Reduces image size using Alpine + Multistage (re: #278),487897,iMerica,closed,0,,,,,1,2018-05-23T05:27:05Z,2018-05-26T02:10:38Z,2018-05-26T02:10:38Z,NONE,simonw/datasette/pulls/281,"Hey Simon! I got the image size down from 256MB to 110MB. Seems to be working okay, but you might want to test it a bit more. Example output of `docker run --rm -it datasette`

```
Serve! files=() on port 8001
[2018-05-23 05:23:08 +0000] [1] [INFO] Goin' Fast @ http://127.0.0.1:8001
[2018-05-23 05:23:08 +0000] [1] [INFO] Starting worker [1]
```

Related: https://github.com/simonw/datasette/issues/278 ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/281/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 312355154,MDExOlB1bGxSZXF1ZXN0MTgwMTg4Mzk3,196,_sort= and _sort_desc= parameters to table view,9599,simonw,closed,0,,,,,0,2018-04-09T00:07:21Z,2018-04-09T05:10:29Z,2018-04-09T05:10:23Z,OWNER,simonw/datasette/pulls/196,See #189 ,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/196/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 313494458,MDExOlB1bGxSZXF1ZXN0MTgxMDMzMDI0,200,Hide Spatialite system tables,45057,russss,closed,0,,,,,3,2018-04-11T21:26:58Z,2018-04-12T21:34:48Z,2018-04-12T21:34:48Z,CONTRIBUTOR,simonw/datasette/pulls/200,They were getting on my nerves.,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/200/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 313785206,MDExOlB1bGxSZXF1ZXN0MTgxMjQ3NTY4,202,Raise 404 on nonexistent table URLs,45057,russss,closed,0,,,,,2,2018-04-12T15:47:06Z,2018-04-13T19:22:56Z,2018-04-13T18:19:15Z,CONTRIBUTOR,simonw/datasette/pulls/202,"Currently they just 500.
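The gist of the PR's change, as a sketch (using Sanic's NotFound exception, which Datasette was built on at the time; the helper name is hypothetical):

```python
from sanic.exceptions import NotFound

def assert_table_exists(table_names, table):
    # surface a clean 404 for unknown tables instead of letting a
    # failed lookup bubble up as a 500
    if table not in table_names:
        raise NotFound(f'Table not found: {table}')
```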
Also cleaned the logic up a bit, I hope I didn't miss anything. This is issue #184.",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/202/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 314256802,MDExOlB1bGxSZXF1ZXN0MTgxNjAwOTI2,204,Initial units support,45057,russss,closed,0,,,,,0,2018-04-13T21:32:49Z,2018-04-14T09:44:33Z,2018-04-14T03:32:54Z,CONTRIBUTOR,simonw/datasette/pulls/204,"Add support for specifying units for a column in metadata.json and rendering them on display using [pint](https://pint.readthedocs.io/en/latest/). Example table metadata:

```json
""license_frequency"": {
    ""units"": {
        ""frequency"": ""Hz"",
        ""channel_width"": ""Hz"",
        ""height"": ""m"",
        ""antenna_height"": ""m"",
        ""azimuth"": ""degrees""
    }
}
```

[Example result](https://wtr-api.herokuapp.com/wtr-663ea99/license_frequency/1) This works surprisingly well! I'd like to add support for using units when querying but this PR is pretty usable as-is. (Pint doesn't seem to support decibels though - it thinks they're decibytes - which is an annoying omission.) (ref ticket #203)",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/204/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 314323977,MDExOlB1bGxSZXF1ZXN0MTgxNjQ0ODA1,206,Fix sqlite error when loading rows with no incoming FKs,45057,russss,closed,0,,,,,0,2018-04-14T12:08:17Z,2018-04-14T14:32:42Z,2018-04-14T14:24:25Z,CONTRIBUTOR,simonw/datasette/pulls/206,"This fixes `ERROR: conn=, sql = 'select ', params = {'id': '1'}` caused by an invalid query loading incoming FKs when none exist. The error was ignored due to async but it still got printed to the console.",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/206/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 314329002,MDExOlB1bGxSZXF1ZXN0MTgxNjQ3NzE3,207,Link foreign keys which don't have labels,45057,russss,closed,0,,,,,1,2018-04-14T13:27:14Z,2018-04-14T15:00:00Z,2018-04-14T15:00:00Z,CONTRIBUTOR,simonw/datasette/pulls/207,"This renders unlabeled FKs as simple links. I can't see why this would cause any major problems. ![image](https://user-images.githubusercontent.com/45057/38768722-ea15a000-3fef-11e8-8664-ffd7aa4894ea.png) Also includes bonus fixes for two minor issues:

* In foreign key link hrefs the primary key was escaped using HTML escaping rather than URL escaping. This broke some non-integer PKs.
* Print tracebacks to console when handling 500 errors.",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/207/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 314319372,MDExOlB1bGxSZXF1ZXN0MTgxNjQyMTE0,205,Support filtering with units and more,45057,russss,closed,0,,,,,3,2018-04-14T10:47:51Z,2018-04-14T15:24:04Z,2018-04-14T15:24:04Z,CONTRIBUTOR,simonw/datasette/pulls/205,"The first commit:

* Adds units to exported JSON
* Adds units key to metadata skeleton
* Adds some docs for units

The second commit adds filtering by units by the first method I mentioned in #203: ![image](https://user-images.githubusercontent.com/45057/38767463-7193be16-3fd9-11e8-8a5f-ac4159415c6d.png) [Try it here](https://wtr-api.herokuapp.com/wtr-663ea99/license_frequency?frequency__gt=50GHz&height__lt=50ft). I think it integrates pretty neatly. The third commit adds support for registering custom units with Pint from metadata.json. Probably pretty niche, but I need decibels!",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/205/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 314340944,MDExOlB1bGxSZXF1ZXN0MTgxNjU0ODM5,208,Return HTTP 405 on InvalidUsage rather than 500,45057,russss,closed,0,,,,,0,2018-04-14T16:12:50Z,2018-04-14T18:00:39Z,2018-04-14T18:00:39Z,CONTRIBUTOR,simonw/datasette/pulls/208,"This also stops it filling up the logs. This happens for HEAD requests at the moment - which perhaps should be handled better, but that's a different issue.",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/208/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 314455877,MDExOlB1bGxSZXF1ZXN0MTgxNzIzMzAz,209, Don't duplicate simple primary keys in the link column,45057,russss,closed,0,,,,,6,2018-04-15T21:56:15Z,2018-04-18T08:40:37Z,2018-04-18T01:13:04Z,CONTRIBUTOR,simonw/datasette/pulls/209,"When there's a simple (single-column) primary key, it looks weird to duplicate it in the link column. This change removes the second PK column and treats the link column as if it were the PK column from a header/sorting perspective. This might make it a bit more difficult to tell what the link for the row is, I'm not sure yet. I feel like the alternative is to change the link column to just have the text ""view"" or something, instead of repeating the PK. (I doubt it makes much more sense with compound PKs.) Bonus change in this PR: fix urlencoding of links in the displayed HTML.
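On the HTML-escaping vs URL-encoding distinction called out in these two PRs, a small illustrative standard-library snippet (not from either changeset); the PR's before/after screenshots follow:

```python
from html import escape
from urllib.parse import quote_plus

pk = 'weird/pk value?&'
print(escape(pk))      # weird/pk value?&amp;    (HTML-escaping: wrong for an href)
print(quote_plus(pk))  # weird%2Fpk+value%3F%26  (URL-encoding: what the link needs)
```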
Before: ![image](https://user-images.githubusercontent.com/45057/38783830-e2ababb4-40ff-11e8-97fb-25e286a8c920.png) After: ![image](https://user-images.githubusercontent.com/45057/38783835-ebf6b48e-40ff-11e8-8c47-6a864cf21ccc.png)",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/209/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 314469126,MDExOlB1bGxSZXF1ZXN0MTgxNzMxOTU2,210,"Start of the plugin system, based on pluggy",9599,simonw,closed,0,,,,,0,2018-04-16T00:51:30Z,2018-04-16T00:56:16Z,2018-04-16T00:56:16Z,OWNER,simonw/datasette/pulls/210,Refs #14,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/210/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 314504812,MDExOlB1bGxSZXF1ZXN0MTgxNzU1MjIw,212,New --plugins-dir=plugins/ option,9599,simonw,closed,0,,,,,0,2018-04-16T05:19:28Z,2018-04-16T05:22:18Z,2018-04-16T05:22:01Z,OWNER,simonw/datasette/pulls/212,Refs #211,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/212/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 315316214,MDExOlB1bGxSZXF1ZXN0MTgyMzU3NjEz,222,Fix for plugins in Python 3.5,9599,simonw,closed,0,,,,,0,2018-04-18T03:21:01Z,2018-04-18T04:26:50Z,2018-04-18T03:24:21Z,OWNER,simonw/datasette/pulls/222,,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/222/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 316365426,MDExOlB1bGxSZXF1ZXN0MTgzMTM1NjA0,232,Fix a typo,45281,lsb,closed,0,,,,,1,2018-04-20T18:20:04Z,2018-04-21T00:19:08Z,2018-04-21T00:19:08Z,CONTRIBUTOR,simonw/datasette/pulls/232,It looks like this was the only instance of it: https://github.com/simonw/datasette/search?utf8=%E2%9C%93&q=SOLite&type=,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/232/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 332998752,MDExOlB1bGxSZXF1ZXN0MTk1MzM5MTEx,311,"?_labels=1 to expand foreign keys (in csv and json), refs #233",9599,simonw,closed,0,,,,,2,2018-06-16T16:31:12Z,2018-06-16T22:20:31Z,2018-06-16T22:20:31Z,OWNER,simonw/datasette/pulls/311,"Output looks something like this:

    {
        ""rowid"": 233,
        ""TreeID"": 121240,
        ""qLegalStatus"": {
            ""value"": 2,
            ""label"": ""Private""
        },
        ""qSpecies"": {
            ""value"": 16,
            ""label"": ""Sycamore""
        },
        ""qAddress"": ""91 Commonwealth Ave"",
        ...
}",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/311/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 333120982,MDExOlB1bGxSZXF1ZXN0MTk1NDEzMjQx,315,Streaming mode for downloading all rows as a CSV,9599,simonw,closed,0,,,,,0,2018-06-18T03:06:59Z,2018-06-18T03:29:13Z,2018-06-18T03:21:02Z,OWNER,simonw/datasette/pulls/315,Refs #266,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/315/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 334592281,MDExOlB1bGxSZXF1ZXN0MTk2NTI2ODYx,322,Feature/in operator,2691848,4e1e0603,closed,0,,,,,0,2018-06-21T17:41:51Z,2018-06-21T17:45:25Z,2018-06-21T17:45:25Z,NONE,simonw/datasette/pulls/322,,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/322/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 334731076,MDExOlB1bGxSZXF1ZXN0MTk2NjI4MzA0,324,Speed up Travis by reusing pip wheel cache across builds,9599,simonw,closed,0,,,,,0,2018-06-22T03:20:08Z,2018-06-24T01:03:47Z,2018-06-24T01:03:47Z,OWNER,simonw/datasette/pulls/324,From https://atchai.com/blog/faster-ci/ - refs #323 ,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/324/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 326987229,MDExOlB1bGxSZXF1ZXN0MTkwOTAxNDI5,293,Support for external database connectors,11912854,jsancho-gpl,closed,0,,,,,1,2018-05-28T11:02:45Z,2018-09-11T14:32:45Z,2018-09-11T14:32:45Z,FIRST_TIME_CONTRIBUTOR,simonw/datasette/pulls/293,"I think it would be nice that Datasette could work with other file formats that aren't SQLite, like files with PyTables format. I've tried to accomplish that using external connectors published with entry points. These external connectors must have a structure similar to the structure [PyTables Datasette connector](https://github.com/PyTables/datasette-pytables) has.",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/293/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 330323860,MDExOlB1bGxSZXF1ZXN0MTkzMzYxMzQx,307,"Initial sketch of custom URL routing, refs #306",9599,simonw,closed,0,,,,,1,2018-06-07T15:26:48Z,2018-06-07T15:29:54Z,2018-06-07T15:29:41Z,OWNER,simonw/datasette/pulls/307,See #306 for background on this.,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/307/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 344695978,MDExOlB1bGxSZXF1ZXN0MjA0MDI5MTQy,349,"publish_subcommand hook + default plugins mechanism, used for publish heroku/now",9599,simonw,closed,0,,,,,1,2018-07-26T05:03:22Z,2018-07-26T05:28:54Z,2018-07-26T05:16:00Z,OWNER,simonw/datasette/pulls/349,"This change introduces a new plugin hook, publish_subcommand, which can be used to implement new subcommands for the ""datasette publish"" command family. 
I've used this new hook to refactor out the ""publish now"" and ""publish heroku"" implementations into separate modules. I've also added unit tests for these two publishers, mocking the subprocess.call and subprocess.check_output functions. As part of this, I introduced a mechanism for loading default plugins. These are defined in the new ""default_plugins"" list inside datasette/app.py.

Closes #217 (Plugin support for ""datasette publish"")
Closes #348 (Unit tests for ""datasette publish"")
Refs #14, #59, #102, #103, #146, #236, #347",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/349/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 345821778,MDExOlB1bGxSZXF1ZXN0MjA0ODUxNTEx,353,render_cell(value) plugin hook,9599,simonw,closed,0,,,,,0,2018-07-30T15:57:08Z,2018-08-05T00:14:57Z,2018-08-05T00:14:57Z,OWNER,simonw/datasette/pulls/353,Closes #352.,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/353/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 347058326,MDExOlB1bGxSZXF1ZXN0MjA1NzcwOTk2,1,Make .indexes compatible with older SQLite versions,9599,simonw,closed,0,,,,,0,2018-08-02T15:17:05Z,2018-08-02T15:17:30Z,2018-08-02T15:17:30Z,OWNER,simonw/sqlite-utils/pulls/1,Older SQLite versions return a different set of columns from the PRAGMA we are using.,140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/1/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 348534997,MDExOlB1bGxSZXF1ZXN0MjA2ODYzODAz,358,"Bump versions of pytest, pluggy and beautifulsoup4",9599,simonw,closed,0,,,,,0,2018-08-08T00:44:38Z,2018-08-08T01:11:13Z,2018-08-08T01:11:13Z,OWNER,simonw/datasette/pulls/358,,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/358/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 351017365,MDExOlB1bGxSZXF1ZXN0MjA4NzE5MDQz,361," Import pysqlite3 if available, closes #360 ",9599,simonw,closed,0,,,,,0,2018-08-16T00:52:21Z,2018-08-16T00:58:57Z,2018-08-16T00:58:57Z,OWNER,simonw/datasette/pulls/361,,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/361/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 340733753,MDExOlB1bGxSZXF1ZXN0MjAxMDc1NTMy,341,Bump aiohttp to fix compatibility with Python 3.7,9599,simonw,closed,0,,,,,0,2018-07-12T17:41:24Z,2018-07-12T18:07:38Z,2018-07-12T18:07:38Z,OWNER,simonw/datasette/pulls/341,Tests failed here: https://travis-ci.org/simonw/datasette/jobs/403223333,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/341/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 341235633,MDExOlB1bGxSZXF1ZXN0MjAxNDUxMzMy,345,Allow app names for `datasette publish heroku`,45057,russss,closed,0,,,,,1,2018-07-14T13:12:34Z,2018-07-14T14:09:54Z,2018-07-14T14:04:44Z,CONTRIBUTOR,simonw/datasette/pulls/345,"Lets you supply the `-n` parameter for Heroku deploys, which
also lets you update existing Heroku deployments.",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/345/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 359075028,MDExOlB1bGxSZXF1ZXN0MjE0NjUzNjQx,364,Support for other types of databases using external connectors,11912854,jsancho-gpl,open,0,,,,,0,2018-09-11T14:31:47Z,2018-09-11T14:31:47Z,,FIRST_TIME_CONTRIBUTOR,simonw/datasette/pulls/364,"This PR is related to #293, but now all commits have been merged. The purpose is to support other file formats that aren't SQLite, like files with PyTables format. I've tried to accomplish that using external connectors published with entry points. The modifications in the original datasette code are minimal and many are in a separated file.",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/364/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 361764460,MDExOlB1bGxSZXF1ZXN0MjE2NjUxMzE3,365,fix small doc typo,418191,jaywgraves,closed,0,,,,,2,2018-09-19T14:02:02Z,2019-12-19T02:30:33Z,2018-09-19T17:15:43Z,CONTRIBUTOR,simonw/datasette/pulls/365,,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/365/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 355299310,MDExOlB1bGxSZXF1ZXN0MjExODYwNzA2,363,Search all apps during heroku publish,436032,kevboh,open,0,,,,,1,2018-08-29T19:25:10Z,2018-08-31T14:39:45Z,,FIRST_TIME_CONTRIBUTOR,simonw/datasette/pulls/363,Adds the `-A` option to include apps from all organizations when searching app names for publish.,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/363/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 374675798,MDExOlB1bGxSZXF1ZXN0MjI2MzE0ODYy,367,Mark codemirror files as vendored,48517,jaap3,closed,0,,,,,2,2018-10-27T18:41:25Z,2019-05-03T21:12:09Z,2019-05-03T21:11:20Z,CONTRIBUTOR,simonw/datasette/pulls/367,"GitHub lists datasette as a Javascript project, primarily because of the vendored codemirror files. This is somewhat confusing when you're looking for datasette, knowing it's written in Python. Luckily it's possible exclude certain files from GitHub's code statistics: https://github.com/github/linguist#using-gitattributes",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/367/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 374676773,MDExOlB1bGxSZXF1ZXN0MjI2MzE1NTEz,368,Update installation instructions,48517,jaap3,closed,0,,,,,0,2018-10-27T18:52:31Z,2019-05-03T18:18:43Z,2019-05-03T18:18:42Z,CONTRIBUTOR,simonw/datasette/pulls/368,"I was writing this as a response to your tweet, but decided I might just make it a pull request. I feel like it might be confusing to those unfamiliar with Python's `-m` flag and the built-in `venv` module to omit the space between the flag and its argument. By adding a space and prefixing the second occurrence of `venv` with a `./` it's maybe a bit clearer what the arguments are and what they do. 
By also using `python3 -m pip` it becomes even clearer that `-m` is a special flag that makes the python executable do neat things.",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/368/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 386459810,MDExOlB1bGxSZXF1ZXN0MjM1MTk0Mjg2,390,tiny typo in customization docs,418191,jaywgraves,closed,0,,,,,1,2018-12-01T13:44:42Z,2019-12-19T02:30:35Z,2018-12-16T21:32:56Z,CONTRIBUTOR,simonw/datasette/pulls/390,was looking to add some custom templates to my use of datasette and saw this small typo.,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/390/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 382471625,MDExOlB1bGxSZXF1ZXN0MjMyMTcyMTA2,389,Bump dependency versions,9599,simonw,closed,0,,,,,2,2018-11-20T02:23:12Z,2019-11-13T19:13:41Z,2019-11-13T19:13:41Z,OWNER,simonw/datasette/pulls/389,,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/389/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 403028630,MDExOlB1bGxSZXF1ZXN0MjQ3NTc2OTQy,4,Fts5,9599,simonw,closed,0,,,,,0,2019-01-25T06:54:05Z,2019-01-25T06:54:33Z,2019-01-25T06:54:33Z,OWNER,simonw/sqlite-utils/pulls/4,,140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/4/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 403396009,MDExOlB1bGxSZXF1ZXN0MjQ3ODYxNDE5,5,Run Travis tests against Python 3.8-dev,9599,simonw,closed,0,,,,,0,2019-01-26T02:30:55Z,2019-01-26T02:37:54Z,2019-01-26T02:37:54Z,OWNER,simonw/sqlite-utils/pulls/5,,140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/5/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 403499298,MDExOlB1bGxSZXF1ZXN0MjQ3OTIzMzQ3,404,Experiment: run Jinja in async mode,9599,simonw,closed,0,,,,,3,2019-01-27T00:28:44Z,2019-11-12T05:02:18Z,2019-11-12T05:02:13Z,OWNER,simonw/datasette/pulls/404,"See http://jinja.pocoo.org/docs/2.10/api/#async-support Tests all pass. Have not checked performance difference yet. Creating pull request to run tests in Travis. This is not ready to merge - I'm not yet sure if this is a good idea.",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/404/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 405801771,MDExOlB1bGxSZXF1ZXN0MjQ5NjgwOTQ0,9,:pencil: Updates my_database.py to my_database.db,50527,jefftriplett,closed,0,,,,,0,2019-02-01T17:35:43Z,2019-02-24T03:55:04Z,2019-02-24T03:55:04Z,CONTRIBUTOR,simonw/sqlite-utils/pulls/9,I noticed that both `.py` and `.db` were used in the docs and assumed you'd prefer `.db`. 
,140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/9/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 394751072,MDExOlB1bGxSZXF1ZXN0MjQxNDE4NDQz,392,Fix some regex DeprecationWarnings,9599,simonw,closed,0,,,,,0,2018-12-29T02:10:28Z,2018-12-29T02:22:28Z,2018-12-29T02:22:28Z,OWNER,simonw/datasette/pulls/392,,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/392/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 413778585,MDExOlB1bGxSZXF1ZXN0MjU1NjU4MTEy,12,"Support for numpy types, closes #11",9599,simonw,closed,0,,,,,0,2019-02-24T03:57:32Z,2019-02-24T04:02:20Z,2019-02-24T04:02:20Z,OWNER,simonw/sqlite-utils/pulls/12,,140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/12/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 413887019,MDExOlB1bGxSZXF1ZXN0MjU1NzI1MDU3,413,Update spatialite.rst,28597217,joelondon,closed,0,,,,,1,2019-02-25T00:08:35Z,2019-03-15T05:06:45Z,2019-03-15T05:06:45Z,CONTRIBUTOR,simonw/datasette/pulls/413,a line of sql added to create the idx_ in the python recipe,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/413/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 407073223,MDExOlB1bGxSZXF1ZXN0MjUwNjI4Mjc1,407,Heroku --include-vcs-ignore,9599,simonw,closed,0,,,,,1,2019-02-06T04:06:20Z,2019-02-06T04:31:30Z,2019-02-06T04:15:47Z,OWNER,simonw/datasette/pulls/407,"Should mean `datasette publish heroku` can work under Travis, unlike this failure: https://travis-ci.org/simonw/fivethirtyeight-datasette/builds/488047550

```
2.25s$ datasette publish heroku fivethirtyeight.db -m metadata.json -n fivethirtyeight-datasette
tar: unrecognized option '--exclude-vcs-ignores'
Try 'tar --help' or 'tar --usage' for more information.
▸ Command failed: tar cz -C /tmp/tmpuaxm7i8f --exclude-vcs-ignores --exclude
▸ .git --exclude .gitmodules . >
▸ /tmp/f49440e0-1bf3-4d3f-9eb0-fbc2967d1fd4.tar.gz
▸ tar: unrecognized option '--exclude-vcs-ignores'
▸ Try 'tar --help' or 'tar --usage' for more information.
▸ The command ""datasette publish heroku fivethirtyeight.db -m metadata.json -n fivethirtyeight-datasette"" exited with 0.
```

The fix for that issue is to call the heroku command like this:

    heroku builds:create -a app_name --include-vcs-ignore
",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/407/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 427429265,MDExOlB1bGxSZXF1ZXN0MjY2MDM1Mzgy,424,Column types in inspected metadata,45057,russss,closed,0,,,,,2,2019-03-31T18:46:33Z,2019-04-29T18:30:50Z,2019-04-29T18:30:46Z,CONTRIBUTOR,simonw/datasette/pulls/424,"This PR does two things:

* Adds the sqlite column type for each column to the inspected table info.
* Stops binary columns from being rendered to HTML, unless a plugin handles it.

There's a bit more detail in the changeset descriptions.
These changes are intended as a precursor to a plugin which adds first-class support for Spatialite geographic primitives, and perhaps more useful geo-stuff.",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/424/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 431756352,MDExOlB1bGxSZXF1ZXN0MjY5MzY0OTI0,426,Upgrade to Jinja2==2.10.1,9599,simonw,closed,0,,,,,1,2019-04-10T23:03:08Z,2019-04-22T21:23:22Z,2019-04-10T23:13:31Z,OWNER,simonw/datasette/pulls/426,"https://nvd.nist.gov/vuln/detail/CVE-2019-10906 This is only a security issue of concern if evaluating templates from untrusted sources, which isn't something I would ever expect a Datasette user to do.",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/426/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 421348146,MDExOlB1bGxSZXF1ZXN0MjYxNDE4Mjg1,416,URL hashing now optional: turn on with --config hash_urls:1 (#418),9599,simonw,closed,0,,,,,8,2019-03-15T04:26:06Z,2019-03-17T22:55:04Z,2019-03-17T22:55:04Z,OWNER,simonw/datasette/pulls/416,,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/416/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 438048318,MDExOlB1bGxSZXF1ZXN0Mjc0MTc0NjE0,437,Add inspect and prepare_sanic hooks,45057,russss,closed,0,,,,,2,2019-04-28T11:53:34Z,2019-06-24T16:38:57Z,2019-06-24T16:38:56Z,CONTRIBUTOR,simonw/datasette/pulls/437,"This adds two new plugin hooks: The `inspect` hook allows plugins to add data to the inspect dictionary. The `prepare_sanic` hook allows plugins to hook into the web router. I've attached a warning to this hook in the docs in light of #272 but I want this hook now... On quick inspection, I don't think it's worthwhile to try and make this hook independent of the web framework (but it looks like Starlette would make the hook implementation a bit nicer). Ref #14",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/437/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 438240541,MDExOlB1bGxSZXF1ZXN0Mjc0MzEzNjI1,439,[WIP] Add primary key to the extra_body_script hook arguments,45057,russss,closed,0,,,,,2,2019-04-29T10:08:23Z,2019-05-01T09:58:32Z,2019-05-01T09:58:30Z,CONTRIBUTOR,simonw/datasette/pulls/439,"This allows the row to be identified on row pages. The context here is that I want to access the row's data to plot it on a map. I considered passing the entire template context through to the hook function. This would expose the actual row data and potentially avoid a further fetch request in JS, but it does make the plugin API a lot more leaky. 
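For context, the kind of plugin this hook change enables (a sketch; the argument list is abbreviated and the JavaScript payload is illustrative, not from the PR):

```python
from datasette import hookimpl

@hookimpl
def extra_body_script(view_name):
    # only act on row pages, which #443 makes distinguishable
    if view_name != 'row':
        return ''
    # fetch the row's JSON representation client-side, e.g. to plot it
    return 'fetch(location.pathname + ".json").then(r => r.json()).then(console.log);'
```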
(At any rate, using the selected row data is tricky in my case because of Spatialite's infuriating custom binary representation...)",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/439/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 438437973,MDExOlB1bGxSZXF1ZXN0Mjc0NDY4ODM2,441,Add register_output_renderer hook,45057,russss,closed,0,,,,,8,2019-04-29T18:03:21Z,2019-05-01T23:01:57Z,2019-05-01T23:01:57Z,CONTRIBUTOR,simonw/datasette/pulls/441,"This changeset refactors out the JSON renderer and then adds a hook and dispatcher system to allow custom output renderers to be registered. The CSV output renderer is untouched because supporting streaming renderers through this system would be significantly more complex, and probably not worthwhile. We can't simply allow hooks to be called at request time because we need a list of supported file extensions when the request is being routed in order to resolve ambiguous database/table names. So, renderers need to be registered at startup. I've tried to make this API independent of Sanic's request/response objects so that this can remain stable during the switch to ASGI. I'm using dictionaries to keep it simple and to make adding additional options in the future easy. Fixes #440",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/441/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 438450757,MDExOlB1bGxSZXF1ZXN0Mjc0NDc4NzYx,442,Suppress rendering of binary data,45057,russss,closed,0,,,,,2,2019-04-29T18:36:41Z,2019-05-03T18:26:48Z,2019-05-03T16:44:49Z,CONTRIBUTOR,simonw/datasette/pulls/442,"Binary columns (including spatialite geographies) get shown as ugly binary strings in the HTML by default. Nobody wants to see that mess. Show the size of the column in bytes instead. If you want to decode the binary data, you can use a plugin to do it.",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/442/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 439480260,MDExOlB1bGxSZXF1ZXN0Mjc1Mjc1NjEw,443,Pass view_name to extra_body_script hook,45057,russss,closed,0,,,,,0,2019-05-02T08:38:36Z,2019-05-03T13:12:20Z,2019-05-03T13:12:20Z,CONTRIBUTOR,simonw/datasette/pulls/443,"At the moment it's not easy to tell whether the hook is being called in (for example) the row or table view, as in both cases the `database` and `table` parameters are provided. This passes the `view_name` added in #441 to the `extra_body_script` hook.",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/443/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 439487648,MDExOlB1bGxSZXF1ZXN0Mjc1MjgxMzA3,444,Add a max-line-length setting for flake8,45057,russss,closed,0,,,,,0,2019-05-02T08:58:57Z,2019-05-04T09:44:48Z,2019-05-03T13:11:28Z,CONTRIBUTOR,simonw/datasette/pulls/444,"This stops my automatic editor linting from flagging lines which are too long. It's been lingering in my checkout for ages. 
160 is an arbitrarily large number - we could alter it if we have any opinions (but I find the line length limit to be my least favourite part of PEP8).",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/444/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 439836586,MDExOlB1bGxSZXF1ZXN0Mjc1NTU4NjEy,445,"Extract facet code out into a new plugin hook, closes #427",9599,simonw,closed,0,,,,,0,2019-05-03T00:02:41Z,2019-05-03T18:17:18Z,2019-05-03T00:11:27Z,OWNER,simonw/datasette/pulls/445,"Datasette previously only supported one type of faceting: exact column value counting. With this change, faceting logic is extracted out into one or more separate classes which can implement other patterns of faceting - this is discussed in #427, but potential upcoming facet types include facet-by-date, facet-by-JSON-array, facet-by-many-2-many and more. A new plugin hook, register_facet_classes, can be used by plugins to add in additional facet classes. Each class must implement two methods: suggest(), which scans columns in the table to decide if they might be worth suggesting for faceting, and facet_results(), which executes the facet operation and returns results ready to be displayed in the UI.",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/445/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 440159137,MDExOlB1bGxSZXF1ZXN0Mjc1ODAxNDYz,447,Use dist: xenial and python: 3.7 on Travis,9599,simonw,closed,0,,,,,1,2019-05-03T18:07:07Z,2019-05-03T18:17:05Z,2019-05-03T18:16:53Z,OWNER,simonw/datasette/pulls/447,,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/447/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 440237422,MDExOlB1bGxSZXF1ZXN0Mjc1ODYxNTU5,449,Apply black to everything,9599,simonw,closed,0,,,,,0,2019-05-03T21:57:26Z,2019-05-04T02:17:14Z,2019-05-04T02:15:15Z,OWNER,simonw/datasette/pulls/449,"I've been hesitating on this for literally months, because I'm not at all excited about the giant diff that will result. But I've been using black on many of my other projects (most actively [sqlite-utils](https://github.com/simonw/sqlite-utils)) and the productivity boost is undeniable: I don't have to spend a single second thinking about code formatting any more! So it's worth swallowing the one-off pain and moving on in a new, black-enabled world.",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/449/reactions"", ""total_count"": 4, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 4, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 440304714,MDExOlB1bGxSZXF1ZXN0Mjc1OTA5MTk3,450,Coalesce hidden table count to 0,45057,russss,closed,0,,,,,2,2019-05-04T09:37:10Z,2019-05-11T18:10:09Z,2019-05-11T18:10:09Z,CONTRIBUTOR,simonw/datasette/pulls/450,"For some reason I'm hitting a `None` here with an FTS table.
I'm not entirely sure why but this makes the logic work the same as with non-hidden tables.",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/450/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 440325850,MDExOlB1bGxSZXF1ZXN0Mjc1OTIzMDY2,452,SQL builder utility classes,45057,russss,open,0,,,,,0,2019-05-04T13:57:47Z,2019-05-04T14:03:04Z,,CONTRIBUTOR,simonw/datasette/pulls/452,"This adds a straightforward set of classes to aid in the construction of SQL queries. My plan for this was to allow plugins to manipulate the Datasette-generated SQL in a more structured way. I'm not sure that's going to work, but I feel like this is still a step forward - it reduces the number of intermediate variables in `TableView.data` which aids readability, and also factors out a lot of the boring string concatenation. There are a fair number of minor structure changes in here too as I've tried to make the ordering of `TableView.data` a bit more logical. As far as I can tell, I haven't broken anything...",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/452/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 442402832,MDExOlB1bGxSZXF1ZXN0Mjc3NTI0MDcy,458,setup: add tests to package exclusion,7725188,hellerve,closed,0,,,,,1,2019-05-09T19:47:21Z,2020-07-21T01:14:42Z,2019-05-10T01:54:51Z,CONTRIBUTOR,simonw/datasette/pulls/458,"This PR fixes #456 by adding `tests` to the package exclusion list. Cheers",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/458/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 432792459,MDExOlB1bGxSZXF1ZXN0MjcwMTkxMDg0,430,"?_where= parameter on table views, closes #429",9599,simonw,closed,0,,,,,0,2019-04-13T01:15:09Z,2019-04-13T01:37:23Z,2019-04-13T01:37:23Z,OWNER,simonw/datasette/pulls/430,,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/430/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 432893491,MDExOlB1bGxSZXF1ZXN0MjcwMjUxMDIx,432,"Refactor facets to a class and new plugin, refs #427",9599,simonw,closed,0,,,,,4,2019-04-13T20:04:45Z,2019-05-03T00:04:24Z,2019-05-03T00:04:24Z,OWNER,simonw/datasette/pulls/432,WIP for #427,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/432/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 434321685,MDExOlB1bGxSZXF1ZXN0MjcxMzM4NDA1,434,"""datasette publish cloudrun"" command to publish to Google Cloud Run",10352819,rprimet,closed,0,,,,,8,2019-04-17T14:41:18Z,2019-05-03T21:50:44Z,2019-05-03T13:59:02Z,CONTRIBUTOR,simonw/datasette/pulls/434,"This is a very rough draft to start a discussion on a possible datasette cloud run publish plugin (see issue #400). The main change was to dynamically set the listening port in `make_dockerfile` to satisfy cloud run's [requirements](https://cloud.google.com/run/docs/reference/container-contract). This was done by running `datasette` through `sh` to get environment variable substitution. Not sure if that's the right approach? 
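For background on the port issue (a sketch, not the PR's Dockerfile changes): Cloud Run's container contract supplies the listening port in the `PORT` environment variable, which has to reach the server somehow, e.g.:

```python
import os

# Cloud Run injects PORT at runtime; fall back to Datasette's usual
# default of 8001 when it is unset (assumption for local runs)
port = int(os.environ.get('PORT', 8001))
print(f'datasette serve /db/mydb.db --port {port}')
```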
",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/434/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 451261628,MDExOlB1bGxSZXF1ZXN0Mjg0MzkwMTk3,497,Upgrade pytest to 4.6.1,9599,simonw,closed,0,,,,,0,2019-06-03T01:45:34Z,2019-06-03T02:06:32Z,2019-06-03T02:06:27Z,OWNER,simonw/datasette/pulls/497,,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/497/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 451705509,MDExOlB1bGxSZXF1ZXN0Mjg0NzQzNzk0,500,Fix typo in install step: should be install -e,32314,tmcw,closed,0,,,,,1,2019-06-03T21:50:51Z,2019-06-11T18:48:43Z,2019-06-11T18:48:40Z,CONTRIBUTOR,simonw/datasette/pulls/500,,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/500/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 452901999,MDExOlB1bGxSZXF1ZXN0Mjg1Njk4MzEw,501,Test against Python 3.8-dev using Travis,9599,simonw,closed,0,,,,,3,2019-06-06T08:37:53Z,2019-11-11T03:23:29Z,2019-11-11T03:23:29Z,OWNER,simonw/datasette/pulls/501,,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/501/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 445873563,MDExOlB1bGxSZXF1ZXN0MjgwMjA0Mjc2,479,doc typo fix,98555,IgnoredAmbience,closed,0,,,,,1,2019-05-19T22:54:25Z,2019-05-20T16:42:29Z,2019-05-20T16:42:29Z,CONTRIBUTOR,simonw/datasette/pulls/479,Fix typo in performance doc page,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/479/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 445875242,MDExOlB1bGxSZXF1ZXN0MjgwMjA1NTAy,480,Split pypi and docker travis tasks,813732,glasnt,closed,0,,,4471010,Datasette 0.29,1,2019-05-19T23:14:37Z,2019-07-07T20:03:20Z,2019-07-07T20:03:20Z,CONTRIBUTOR,simonw/datasette/pulls/480,"Resolves #478 This *should* work, but because this is a change that'll only really be testable on a) this repo, b) master branch, this might fail fast if I didn't get the configurations right. Looking at #478 it should just be as simple as splitting out the docker and pypi processes into separate jobs, but it might end up being more complicated than that, depending on what pre-processes the pypi deployment needs, and how travisci treats deployment steps without scripts in general. 
",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/480/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 464040911,MDExOlB1bGxSZXF1ZXN0Mjk0NDAwNDQ2,539,Secret plugin configuration options,9599,simonw,closed,0,,,,,2,2019-07-04T03:21:20Z,2019-07-04T05:36:45Z,2019-07-04T05:36:45Z,OWNER,simonw/datasette/pulls/539,Refs #538 ,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/539/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 464786717,MDExOlB1bGxSZXF1ZXN0Mjk0OTkyNTc4,542,extra_template_vars plugin hook,9599,simonw,closed,0,,,,,5,2019-07-05T22:19:17Z,2019-07-06T00:05:57Z,2019-07-06T00:05:56Z,OWNER,simonw/datasette/pulls/542,Refs #541,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/542/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 464894812,MDExOlB1bGxSZXF1ZXN0Mjk1MDY1Nzk2,544,--plugin-secret option,9599,simonw,closed,0,,,4471010,Datasette 0.29,1,2019-07-06T22:18:20Z,2019-07-08T02:06:31Z,2019-07-08T02:06:31Z,OWNER,simonw/datasette/pulls/544,"Refs #543 - [x] Zeit Now v1 support - [x] Solve escaping of ENV in Dockerfile - [x] Heroku support - [x] Unit tests - [x] Cloud Run support - [x] Documentation ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/544/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 464987783,MDExOlB1bGxSZXF1ZXN0Mjk1MTI3MjEz,546,Facet by delimiter,9599,simonw,open,0,,,,,2,2019-07-07T20:06:05Z,2019-11-18T23:46:01Z,,OWNER,simonw/datasette/pulls/546,Refs #510,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/546/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 465728430,MDExOlB1bGxSZXF1ZXN0Mjk1NzExNTA0,554,Fix static mounts using relative paths and prevent traversal exploits,3243482,abdusco,closed,0,,,,,4,2019-07-09T11:32:02Z,2019-07-11T16:29:26Z,2019-07-11T16:13:19Z,CONTRIBUTOR,simonw/datasette/pulls/554,"While debugging why my static mounts using a relative path (`--static mystatic:rel/path/to/dir`) not working, I noticed that the requests fail no matter what, returning 404 errors. The reason is that datasette tries to prevent traversal exploits by checking if the path is relative to its registered directory. This check fails when the mount is a relative directory, because `/abs/dir/file` obviously not under `dir/file`. https://github.com/simonw/datasette/blob/81fa8b6cdc5457b42a224779e5291952314e8d20/datasette/utils/asgi.py#L303-L306 This also has the consequence of returning any requested file, because when `/abs/dir/../../evil.file` resolves `aiofiles` happily returns it to the client after it resolves the path itself. The solution is to make sure we're checking relativity of paths after they're fully resolved. 
I've implemented the mentioned changes and also updated the tests.",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/554/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 465773546,MDExOlB1bGxSZXF1ZXN0Mjk1NzQ4MjY4,556,Add support for running datasette as a module,3243482,abdusco,closed,0,,,,,1,2019-07-09T13:13:30Z,2019-07-11T16:07:45Z,2019-07-11T16:07:44Z,CONTRIBUTOR,simonw/datasette/pulls/556,"This PR allows running datasette using `python -m datasette` command in addition to just running the executable. This function is quite useful when debugging a plugin in a project because IDEs like PyCharm can easily start a debug session when datasette is run as a module in contrast to trying to attach a debugger to a running process. ![image](https://user-images.githubusercontent.com/3243482/60890448-fc4ede80-a263-11e9-8b42-d2a3db8d1a59.png) ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/556/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 466996584,MDExOlB1bGxSZXF1ZXN0Mjk2NzM1MzIw,557,Get tests running on Windows using Travis CI,9599,simonw,closed,0,,,,,4,2019-07-11T16:36:57Z,2021-07-10T23:39:48Z,2021-07-10T23:39:48Z,OWNER,simonw/datasette/pulls/557,Refs #511,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/557/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 467623820,MDExOlB1bGxSZXF1ZXN0Mjk3MjQzMDcz,559,Bump to uvicorn 0.8.4,9599,simonw,closed,0,,,,,0,2019-07-12T22:30:29Z,2019-07-13T22:34:58Z,2019-07-13T22:34:58Z,OWNER,simonw/datasette/pulls/559,"https://github.com/encode/uvicorn/commits/0.8.4 Query strings will now be included in log files: https://github.com/encode/uvicorn/pull/384",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/559/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 467862459,MDExOlB1bGxSZXF1ZXN0Mjk3NDEyNDY0,38,table.update() method,9599,simonw,closed,0,,,,,2,2019-07-14T17:03:49Z,2019-07-28T15:43:51Z,2019-07-28T15:43:51Z,OWNER,simonw/sqlite-utils/pulls/38,"Refs #35 Still to do: - [x] Unit tests - [x] Switch to using `.get()` - [x] Better exceptions, plus unit tests for what happens if pk does not exist - [x] Documentation - [x] Ensure compound primary keys work properly - [x] `alter=True` support",140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/38/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 467928674,MDExOlB1bGxSZXF1ZXN0Mjk3NDU5Nzk3,40,.get() method plus support for compound primary keys,9599,simonw,closed,0,,,,,1,2019-07-15T03:43:13Z,2019-07-15T04:28:57Z,2019-07-15T04:28:52Z,OWNER,simonw/sqlite-utils/pulls/40,"- [x] Tests for the `NotFoundError` exception - [x] Documentation for `.get()` method - [x] Support `--pk` multiple times to define CLI compound primary keys - [x] Documentation for compound primary keys",140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/40/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 
0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 469828961,MDExOlB1bGxSZXF1ZXN0Mjk4OTYyNTUx,561,Fix typos,15278512,minho42,closed,0,,,,,0,2019-07-18T15:13:35Z,2019-07-26T10:25:45Z,2019-07-26T10:25:45Z,CONTRIBUTOR,simonw/datasette/pulls/561,,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/561/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 459587155,MDExOlB1bGxSZXF1ZXN0MjkwODk3MTA0,518,Port Datasette from Sanic to ASGI + Uvicorn,9599,simonw,closed,0,9599,simonw,3268330,Datasette 1.0,12,2019-06-23T15:18:42Z,2019-06-24T13:42:50Z,2019-06-24T03:13:09Z,OWNER,simonw/datasette/pulls/518,"Most of the code here was fleshed out in comments on #272 (Port Datasette to ASGI) - this pull request will track the final pieces: - [x] Update test harness to more correctly simulate the `raw_path` issue - [x] Use `raw_path` so table names containing `/` can work correctly - [x] Bug: JSON not served with correct content-type - [x] Get ?_trace=1 working again - [x] Replacement for `@app.listener(""before_server_start"")` - [x] Bug: `/fixtures/table%2Fwith%2Fslashes.csv?_format=json` downloads as CSV - [x] Replace Sanic request and response objects with my own classes, so I can remove Sanic dependency - [x] Final code tidy-up before merging to master",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/518/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 459689615,MDExOlB1bGxSZXF1ZXN0MjkwOTcxMjk1,524,"Sort commits using isort, refs #516",9599,simonw,closed,0,,,,,1,2019-06-24T05:04:48Z,2023-08-23T01:31:08Z,2023-08-23T01:31:08Z,OWNER,simonw/datasette/pulls/524,Also added a lint unit test to ensure they stay sorted. #516,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/524/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 460396952,MDExOlB1bGxSZXF1ZXN0MjkxNTM0NTk2,529,Use keyed rows - fixes #521,1383872,nathancahill,closed,0,,,,,1,2019-06-25T12:33:48Z,2019-06-25T12:35:07Z,2019-06-25T12:35:07Z,NONE,simonw/datasette/pulls/529,"Supports template syntax like this: ``` {% for row in display_rows %}

{{ row[""First_Name""] }} {{ row[""Last_Name""] }}

... ```",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/529/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 462094937,MDExOlB1bGxSZXF1ZXN0MjkyODc5MjA0,32,db.add_foreign_keys() method,9599,simonw,closed,0,,,,,1,2019-06-28T15:40:33Z,2019-06-29T06:27:39Z,2019-06-29T06:27:39Z,OWNER,simonw/sqlite-utils/pulls/32,"Refs #31. Still TODO: - [x] Unit tests - [x] Documentation",140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/32/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 462423972,MDExOlB1bGxSZXF1ZXN0MjkzMTE3MTgz,34,sqlite-utils index-foreign-keys / db.index_foreign_keys(),9599,simonw,closed,0,,,,,0,2019-06-30T16:43:40Z,2019-06-30T23:50:55Z,2019-06-30T23:50:55Z,OWNER,simonw/sqlite-utils/pulls/34,"Refs #33 - [x] `sqlite-utils index-foreign-keys` command - [x] `db.index_foreign_keys()` method - [x] unit tests - [x] documentation",140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/34/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 463492395,MDExOlB1bGxSZXF1ZXN0MjkzOTYyNDA1,533,"Support cleaner custom templates for rows and tables, closes #521",9599,simonw,closed,0,,,,,1,2019-07-03T00:40:18Z,2019-07-03T03:23:06Z,2019-07-03T03:23:06Z,OWNER,simonw/datasette/pulls/533,"- [x] Rename `_rows_and_columns.html` to `_table.html` - [x] Unit test - [x] Documentation",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/533/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 463534974,MDExOlB1bGxSZXF1ZXN0MjkzOTk0NDQz,536,"Switch to ~= dependencies, closes #532",9599,simonw,closed,0,,,,,0,2019-07-03T04:12:16Z,2019-07-03T04:32:55Z,2019-07-03T04:32:55Z,OWNER,simonw/datasette/pulls/536,,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/536/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 463531894,MDExOlB1bGxSZXF1ZXN0MjkzOTkyMzgy,535,"Added asgi_wrapper plugin hook, closes #520",9599,simonw,closed,0,,,,,0,2019-07-03T03:58:00Z,2019-07-03T04:06:26Z,2019-07-03T04:06:26Z,OWNER,simonw/datasette/pulls/535,,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/535/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 481887482,MDExOlB1bGxSZXF1ZXN0MzA4MjkyNDQ3,55,Ability to introspect and run queries against views,9599,simonw,closed,0,,,,,1,2019-08-17T13:40:56Z,2019-08-23T12:19:42Z,2019-08-23T12:19:42Z,OWNER,simonw/sqlite-utils/pulls/55,See #54 ,140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/55/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 471684708,MDExOlB1bGxSZXF1ZXN0MzAwMjg2NTM1,45,"Implemented table.lookup(...), closes 
#44",9599,simonw,closed,0,,,,,0,2019-07-23T13:03:30Z,2019-07-23T13:07:00Z,2019-07-23T13:07:00Z,OWNER,simonw/sqlite-utils/pulls/45,,140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/45/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 471797101,MDExOlB1bGxSZXF1ZXN0MzAwMzc3NTk5,47,extracts= table parameter,9599,simonw,closed,0,,,,,0,2019-07-23T16:30:29Z,2019-07-23T17:00:43Z,2019-07-23T17:00:43Z,OWNER,simonw/sqlite-utils/pulls/47,Still needs docs. Refs #46,140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/47/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 472104705,MDExOlB1bGxSZXF1ZXN0MzAwNTgwMjIx,8,Use less RAM,9599,simonw,closed,0,,,,,0,2019-07-24T06:35:01Z,2019-07-24T06:35:52Z,2019-07-24T06:35:52Z,MEMBER,dogsheep/healthkit-to-sqlite/pulls/8,Closes #7,197882382,healthkit-to-sqlite,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/8/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 473288428,MDExOlB1bGxSZXF1ZXN0MzAxNDgzNjEz,564,First proof-of-concept of Datasette Library,9599,simonw,open,0,,,,,1,2019-07-26T10:22:26Z,2023-02-07T15:14:11Z,,OWNER,simonw/datasette/pulls/564,"Refs #417. Run it like this: datasette -d ~/Library Uses a new plugin hook - available_databases() ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/564/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1, 473733752,MDExOlB1bGxSZXF1ZXN0MzAxODI0MDk3,51,"Fix for too many SQL variables, closes #50",9599,simonw,closed,0,,,,,1,2019-07-28T11:30:30Z,2019-07-28T11:59:32Z,2019-07-28T11:59:32Z,OWNER,simonw/sqlite-utils/pulls/51,,140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/51/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 476436920,MDExOlB1bGxSZXF1ZXN0MzAzOTkwNjgz,53,Work in progress: m2m() method for creating many-to-many records,9599,simonw,closed,0,,,,,0,2019-08-03T10:03:56Z,2019-08-04T03:38:10Z,2019-08-04T03:37:33Z,OWNER,simonw/sqlite-utils/pulls/53,"- [x] `table.insert({""name"": ""Barry""}).m2m(""tags"", lookup={""tag"": ""Coworker""})` - [x] Explicit table name `.m2m(""humans"", ..., m2m_table=""relationships"")` - [x] Automatically use an existing m2m table if a single obvious candidate exists (a table with two foreign keys in the correct directions) - [x] Require the explicit `m2m_table=` argument if multiple candidates for the m2m table exist - [x] Documentation Refs #23",140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/53/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 487847945,MDExOlB1bGxSZXF1ZXN0MzEzMDA3NDgz,56,Escape the table name in populate_fts and search.,49260,amjith,closed,0,,,,,2,2019-09-01T06:29:05Z,2019-09-02T17:23:21Z,2019-09-02T17:23:21Z,CONTRIBUTOR,simonw/sqlite-utils/pulls/56,"The table names weren't escaped using double quotes in the populate_fts method. 
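Before the reproducible case below, here is the escaping rule the fix applies, as an illustrative helper rather than the library's exact internals: wrap the identifier in double quotes and double any embedded quotes, which makes names like http://example.com safe in the generated SQL.
```
def escape_sqlite_identifier(name):
    # SQLite identifier quoting: wrap in double quotes, double embedded quotes
    return '"{}"'.format(name.replace('"', '""'))

table = "http://example.com"
sql = "INSERT INTO {fts} (rowid, name) SELECT rowid, name FROM {table}".format(
    fts=escape_sqlite_identifier(table + "_fts"),
    table=escape_sqlite_identifier(table),
)
# INSERT INTO "http://example.com_fts" (rowid, name) SELECT rowid, name FROM "http://example.com"
```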
Reproducible case:
```
>>> import sqlite_utils
>>> db = sqlite_utils.Database(""abc.db"")
>>> db[""http://example.com""].insert_all([
...     {""id"": 1, ""age"": 4, ""name"": ""Cleo""},
...     {""id"": 2, ""age"": 2, ""name"": ""Pancakes""}
... ], pk=""id"")
>>> db[""http://example.com""].enable_fts([""name""])
Traceback (most recent call last):
  File """", line 1, in 
    db[""http://example.com""].enable_fts([""name""])
  File ""/home/amjith/.virtualenvs/itsysearch/lib/python3.7/site-packages/sqlite_utils/db.py"", line 705, in enable_fts
    self.populate_fts(columns)
  File ""/home/amjith/.virtualenvs/itsysearch/lib/python3.7/site-packages/sqlite_utils/db.py"", line 715, in populate_fts
    self.db.conn.executescript(sql)
sqlite3.OperationalError: unrecognized token: "":""
>>>
```",140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/56/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 487987958,MDExOlB1bGxSZXF1ZXN0MzEzMTA1NjM0,57,Add triggers while enabling FTS,49260,amjith,closed,0,,,,,4,2019-09-02T04:23:40Z,2019-09-03T01:03:59Z,2019-09-02T23:42:29Z,CONTRIBUTOR,simonw/sqlite-utils/pulls/57,"This adds the option for a user to set up triggers in the database to keep their FTS table in sync with the parent table. Ref: https://sqlite.org/fts5.html#external_content_and_contentless_tables I would prefer to make the creation of triggers the default behavior, but that will break existing usage where people have been calling `populate_fts` after inserting new rows. I am happy to make changes to the PR as you see fit. ",140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/57/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 488343304,MDExOlB1bGxSZXF1ZXN0MzEzMzg0OTI2,571,detect_fts now works with alternative table escaping,9599,simonw,closed,0,,,,,0,2019-09-03T00:23:39Z,2019-09-03T00:32:28Z,2019-09-03T00:32:28Z,OWNER,simonw/datasette/pulls/571,Fixes #570,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/571/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 488341021,MDExOlB1bGxSZXF1ZXN0MzEzMzgzMzE3,60,db.triggers and table.triggers introspection,9599,simonw,closed,0,,,,,0,2019-09-03T00:04:32Z,2019-09-03T00:09:42Z,2019-09-03T00:09:42Z,OWNER,simonw/sqlite-utils/pulls/60,Closes #59,140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/60/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 505666744,MDExOlB1bGxSZXF1ZXN0MzI3MDUxNjcz,15,"twitter-to-sqlite import command, refs #4",9599,simonw,closed,0,,,,,0,2019-10-11T06:37:14Z,2019-10-11T06:45:01Z,2019-10-11T06:45:01Z,MEMBER,dogsheep/twitter-to-sqlite/pulls/15,,206156866,twitter-to-sqlite,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/15/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 505814865,MDExOlB1bGxSZXF1ZXN0MzI3MTY5NzQ4,589,Display metadata footer on custom SQL queries,2657547,rixx,closed,0,,,,,0,2019-10-11T12:10:28Z,2019-10-14T08:58:23Z,2019-10-14T03:53:22Z,CONTRIBUTOR,simonw/datasette/pulls/589,Closes #408,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/589/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, 
""rocket"": 0, ""eyes"": 0}",0, 505818256,MDExOlB1bGxSZXF1ZXN0MzI3MTcyNTQ1,590,Handle spaces in DB names,2657547,rixx,closed,0,,,,,3,2019-10-11T12:18:22Z,2019-11-04T23:16:31Z,2019-11-04T23:16:30Z,CONTRIBUTOR,simonw/datasette/pulls/590,Closes #503,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/590/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 505837199,MDExOlB1bGxSZXF1ZXN0MzI3MTg4MDg3,591,Sort databases on homepage by argument order,2657547,rixx,closed,0,,,,,1,2019-10-11T12:57:38Z,2019-10-14T08:57:50Z,2019-10-14T03:52:34Z,CONTRIBUTOR,simonw/datasette/pulls/591,Closes #585,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/591/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 505950145,MDExOlB1bGxSZXF1ZXN0MzI3Mjc5ODE4,592,Offer SQL formatting,2657547,rixx,closed,0,,,,,1,2019-10-11T16:35:49Z,2019-10-14T08:57:12Z,2019-10-14T03:46:13Z,CONTRIBUTOR,simonw/datasette/pulls/592,"SQL code will be formatted on page load, and can additionally be formatted by clicking the ""Format SQL"" button. Closes #136",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/592/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 506300941,MDExOlB1bGxSZXF1ZXN0MzI3NTQxMDQ2,595,bump uvicorn to 0.9.0 to be Python-3.8 friendly,4312421,stonebig,closed,0,,,,,9,2019-10-13T10:00:04Z,2019-11-12T04:46:48Z,2019-11-12T04:46:48Z,NONE,simonw/datasette/pulls/595,"as uvicorn-0.9 is needed to get websockets-8.0.2, which is needed to have Python-3.8 compatibility",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/595/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 508553387,MDExOlB1bGxSZXF1ZXN0MzI5MzI0MzY4,24,Tweet source extraction and new migration system,9599,simonw,closed,0,,,,,0,2019-10-17T15:24:56Z,2019-10-17T15:49:29Z,2019-10-17T15:49:24Z,MEMBER,dogsheep/twitter-to-sqlite/pulls/24,Closes #12 and #23,206156866,twitter-to-sqlite,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/24/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 509267608,MDExOlB1bGxSZXF1ZXN0MzI5ODkwMzIw,599,Fix for /foo v.s. 
/foo-bar issue in #597,9599,simonw,closed,0,,,,,0,2019-10-18T19:22:55Z,2019-10-18T22:51:07Z,2019-10-18T22:51:07Z,OWNER,simonw/datasette/pulls/599,Refs #597,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/599/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 509340359,MDExOlB1bGxSZXF1ZXN0MzI5OTQ3MTgw,601,Don't auto-format SQL on page load,9599,simonw,closed,0,,,,,5,2019-10-18T22:37:39Z,2019-10-20T02:29:49Z,2019-10-18T23:56:45Z,OWNER,simonw/datasette/pulls/601,Refs #600,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/601/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 499954048,MDExOlB1bGxSZXF1ZXN0MzIyNTI5Mzgx,578,Added support for multi arch builds,887095,heussd,closed,0,,,,,3,2019-09-29T18:43:03Z,2019-11-13T19:13:15Z,2019-11-13T19:13:15Z,NONE,simonw/datasette/pulls/578,Minor changes in Dockerfile and new Makefile to support Docker multi architecture builds. `make`will build one image per architecture and push them as one Docker manifest to Docker Hub. Feel free to change `IMAGE_NAME ` to `datasetteproject/datasette` to update your official Docker Hub image(s).,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/578/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 501773982,MDExOlB1bGxSZXF1ZXN0MzIzOTgzNzMy,579,New connection pooling,9599,simonw,open,0,,,,,1,2019-10-02T23:22:19Z,2019-11-15T22:57:21Z,,OWNER,simonw/datasette/pulls/579,See #569,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/579/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 514899195,MDExOlB1bGxSZXF1ZXN0MzM0NDQ4MjU4,609,Update to latest black,9599,simonw,closed,0,,,,,0,2019-10-30T18:42:35Z,2019-10-30T18:49:01Z,2019-10-30T18:49:01Z,OWNER,simonw/datasette/pulls/609,,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/609/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 516763727,MDExOlB1bGxSZXF1ZXN0MzM1OTgwMjQ2,8,"stargazers command, refs #4",9599,simonw,closed,0,,,,,5,2019-11-03T00:37:36Z,2020-05-02T20:00:27Z,2020-05-02T20:00:26Z,MEMBER,dogsheep/github-to-sqlite/pulls/8,Needs tests. 
Refs #4.,207052882,github-to-sqlite,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/8/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 519032008,MDExOlB1bGxSZXF1ZXN0MzM3ODQ3NTcz,64,test_insert_upsert_all_empty_list,9599,simonw,closed,0,,,,,0,2019-11-07T04:24:45Z,2019-11-07T04:32:38Z,2019-11-07T04:32:38Z,OWNER,simonw/sqlite-utils/pulls/64,,140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/64/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 519039316,MDExOlB1bGxSZXF1ZXN0MzM3ODUzMzk0,65,Release 1.12.1,9599,simonw,closed,0,,,,,0,2019-11-07T04:51:29Z,2019-11-07T04:58:48Z,2019-11-07T04:58:47Z,OWNER,simonw/sqlite-utils/pulls/65,,140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/65/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 519979091,MDExOlB1bGxSZXF1ZXN0MzM4NjQ3Mzc4,1,Add parkrun-to-sqlite,1101318,mrw34,closed,0,,,,,0,2019-11-08T12:05:32Z,2020-10-12T00:35:16Z,2020-10-12T00:35:16Z,CONTRIBUTOR,dogsheep/dogsheep.github.io/pulls/1,,214746582,dogsheep.github.io,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep.github.io/issues/1/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 520718056,MDExOlB1bGxSZXF1ZXN0MzM5MjM2NjQ3,623,Test against Python 3.8 in Travis,9599,simonw,closed,0,,,,,2,2019-11-11T03:24:54Z,2019-11-11T03:45:35Z,2019-11-11T03:45:35Z,OWNER,simonw/datasette/pulls/623,Needed for #622,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/623/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 520728483,MDExOlB1bGxSZXF1ZXN0MzM5MjQ0ODg4,624,Bump pint to 0.9,9599,simonw,closed,0,,,,,0,2019-11-11T04:07:07Z,2019-11-11T04:19:02Z,2019-11-11T04:19:02Z,OWNER,simonw/datasette/pulls/624,,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/624/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 521323012,MDExOlB1bGxSZXF1ZXN0MzM5NzIyNzkw,627,"Support Python 3.8, stop supporting Python 3.5",9599,simonw,closed,0,,,,,2,2019-11-12T04:36:33Z,2020-04-05T10:23:58Z,2019-11-12T05:09:12Z,OWNER,simonw/datasette/pulls/627,Refs #622,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/627/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 521346800,MDExOlB1bGxSZXF1ZXN0MzM5NzQyNDMy,630,Use python:3.8 base Docker image,9599,simonw,closed,0,,,,,0,2019-11-12T06:02:37Z,2019-11-12T06:03:10Z,2019-11-12T06:03:10Z,OWNER,simonw/datasette/pulls/630,Closes #629,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/630/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 509535510,MDExOlB1bGxSZXF1ZXN0MzMwMDc2MjYz,602,Offer to format readonly 
SQL,2657547,rixx,closed,0,,,,,3,2019-10-20T02:29:32Z,2019-11-04T07:29:33Z,2019-11-04T02:39:56Z,CONTRIBUTOR,simonw/datasette/pulls/602,"Following discussion in #601, this PR adds a ""Format SQL"" button to read-only SQL (if the SQL actually differs from the formatting result). It also removes a console error on readonly SQL queries.",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/602/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 509612217,MDExOlB1bGxSZXF1ZXN0MzMwMTI5MzU4,603,always pop as_format off args dict,6025893,chris48s,closed,0,,,,,2,2019-10-20T15:44:22Z,2019-10-30T19:12:22Z,2019-10-21T02:03:09Z,CONTRIBUTOR,simonw/datasette/pulls/603,closes #563,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/603/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 529376481,MDExOlB1bGxSZXF1ZXN0MzQ2MjY0OTI2,67,Run tests against 3.5 too,9599,simonw,closed,0,,,,,2,2019-11-27T14:20:35Z,2019-12-31T01:29:44Z,2019-12-31T01:29:43Z,OWNER,simonw/sqlite-utils/pulls/67,,140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/67/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 530513784,MDExOlB1bGxSZXF1ZXN0MzQ3MTc5MDgx,644,Validate metadata json on startup,6025893,chris48s,closed,0,,,,,1,2019-11-30T00:32:15Z,2021-07-28T17:58:45Z,2021-07-28T17:58:45Z,CONTRIBUTOR,simonw/datasette/pulls/644,"This PR adds a sanity check which builds up a marshmallow schema on-the-fly based on the structure of the database(s) on startup and then validates the metadata json against it. In case of invalid data, this will raise with a descriptive error e.g: ``` marshmallow.exceptions.ValidationError: {'databases': {'fixtures': {'tables': {'not_a_table': ['Unknown field.']}}}} ``` Closes #260 --- This was intended to be fairly self-contained, but then while I was working on it, I hit some problems getting the tests to pass in the context of the test suite as a whole. My tests passed in isolation, but then failed while doing a full test suite run. That's when the worms started coming out of the can :bug: After some sleuthing, it turned out this was essentially the result of several issues intersecting: * There are certain events in the application lifecycle where the metadata schema can be modified after it is loaded e.g: https://github.com/simonw/datasette/blob/a562f2965552fb2dbbbd74df245c9965ee23d886/datasette/app.py#L299-L320 This means that sometimes what goes in isn't always exactly what comes out when you call `/-/metadata`. * Because the test fixtures use session scope for performance reasons if one unit test performs an action which mutates the metadata, that can impact on other unit tests which run after it using the same fixture. * Because the `self._metadata` property was being set with a simple assignment `self._metadata = metadata`, that created an object reference to the test fixture data, so operating on `self._metadata` was actually modifying the test fixture `METADATA` meaning that depending on when it was loaded in the test suite lifecycle, `METADATA` had different content, which was somewhat unexpected. 
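That aliasing behaviour reduces to a small sketch (names invented for illustration): plain assignment stores a reference to the caller's dict, so later in-place edits leak back into the shared fixture, while copy.deepcopy gives the app a private copy.
```
import copy

METADATA = {"databases": {"fixtures": {"tables": {}}}}

class App:
    def __init__(self, metadata):
        # was: self._metadata = metadata  -- an alias to the caller's dict
        self._metadata = copy.deepcopy(metadata)

app = App(METADATA)
app._metadata["databases"]["fixtures"]["tables"]["added"] = {}
# The module-level fixture stays untouched only because of the deepcopy:
assert "added" not in METADATA["databases"]["fixtures"]["tables"]
```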
As such, I've added some band-aids in 3552024 and 6859fd8:
* Switching the metadata object to a `deepcopy` of the input prevents us directly mutating the input fixture.
* I've switched some of the tests to use a fixture with function scope instead of session scope so we're working on a clean copy that hasn't been mutated by other tests where necessary, but keeping session scope in most cases for performance.
* I haven't really addressed the fact that sometimes the metadata object gets mutated in place, so the object that is served from `/-/metadata` isn't necessarily always exactly the same as the file you fed into it on init. I'm not sure how much of a problem that is. The way the tests were written makes me think it was unexpected, but getting into it feels like too much scope creep for this PR so it's probably best addressed as another issue.",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/644/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 521923131,MDExOlB1bGxSZXF1ZXN0MzQwMjExMTQ5,631,bugfix issue 572,3683993,qwo,closed,0,,,,,1,2019-11-13T02:46:50Z,2019-11-13T04:28:43Z,2019-11-13T04:28:42Z,CONTRIBUTOR,simonw/datasette/pulls/631,closes bugfix issue #572 ,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/631/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 522566332,MDExOlB1bGxSZXF1ZXN0MzQwNzQzMjIw,635,Use Jinja async mode,9599,simonw,closed,0,,,,,0,2019-11-14T01:20:57Z,2019-11-14T23:14:23Z,2019-11-14T23:14:23Z,OWNER,simonw/datasette/pulls/635,Refs #628. Still needs documentation.,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/635/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 539985017,MDExOlB1bGxSZXF1ZXN0MzU0ODY5Mzkx,652,Quick (and uninformed and perhaps misguided) attempt to add a url for hosting datasette at a particular host/URI,132978,terrycojones,closed,0,,,,,1,2019-12-18T23:37:16Z,2020-03-24T22:14:50Z,2020-03-24T22:14:50Z,NONE,simonw/datasette/pulls/652,"As usual, I don't really know what I'm doing... so this is just a suggested approach. I've not written tests, I've not run the tests, I don't know if I've missed some absolute URLs that would need to have the leading slash dropped. BUT, I tested it with `--config base_url:http://127.0.0.1:8001/` on the command line and from what little I know about datasette it's at least working in some obvious cases. My changes are based on what I saw in https://github.com/simonw/datasette/commit/8da2db4b71096b19e7a9ef1929369b8483d448bf (thanks!) I'm happy to be more thorough on this if you think it's worth pursuing.
Fixes #394 (he said, optimistically).",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/652/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 541331755,MDExOlB1bGxSZXF1ZXN0MzU2MDA0MjQy,653,allow leading comments in SQL input field,418191,jaywgraves,closed,0,,,,,8,2019-12-21T14:19:52Z,2020-02-05T02:35:41Z,2020-02-05T02:13:25Z,CONTRIBUTOR,simonw/datasette/pulls/653,"this changes the SQL validation to allow for lines that are commented out my main use case for this is that I like to write a succession of queries when trying to solve a problem. In most native SQL clients there is a key binding that will run just the current highlighted query or the program is smart enough to run just the query that the cursor is in if it's properly delimited with a ';'. Typically my workflow will start with a single simple query and I'll copy/paste it to a new query below when I want to make big changes while debugging. This makes it easy to go back to a working version above when the query doesn't work. Since datasette sends the whole query to the DB I have to comment out the older queries by prefixing each line with `--`. This gets caught by the validators when I use my typical strategy of copy/pasting each successive query below the last one. so this is just a simple fix to allow for a query to be sent to the DB with leading comments. ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/653/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 543355051,MDExOlB1bGxSZXF1ZXN0MzU3NjQwMTg2,6,don't break if source is missing,78035,mfa,closed,0,,,,,1,2019-12-29T10:46:47Z,2020-03-28T02:28:11Z,2020-03-28T02:28:11Z,CONTRIBUTOR,dogsheep/swarm-to-sqlite/pulls/6,broke for me. very old checkins in 2010 had no source set.,205429375,swarm-to-sqlite,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/swarm-to-sqlite/issues/6/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 543717994,MDExOlB1bGxSZXF1ZXN0MzU3OTc0MzI2,3,Add todoist-to-sqlite,706257,bcongdon,closed,0,,,,,0,2019-12-30T04:02:59Z,2020-10-12T00:35:58Z,2020-10-12T00:35:57Z,CONTRIBUTOR,dogsheep/dogsheep.github.io/pulls/3,"Really enjoying getting into the dogsheep/datasette ecosystem. 
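The leading-comment change in #653 above amounts to skipping comment-only lines before the is-this-a-SELECT check; this sketch is an assumption about the shape of that logic, not Datasette's actual validator:
```
import re

COMMENT_LINE = re.compile(r"^\s*--.*?\n")

def validate_select(sql):
    stripped = sql
    while COMMENT_LINE.match(stripped):
        stripped = COMMENT_LINE.sub("", stripped, count=1)
    if not stripped.lstrip().lower().startswith("select"):
        raise ValueError("Statement must be a SELECT")
    return sql  # the comments still go to SQLite, which ignores them

validate_select("-- earlier attempt:\n-- select * from t1\nselect * from t2")
```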
I made a downloader for Todoist, and I think/hope others might find this useful",214746582,dogsheep.github.io,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep.github.io/issues/3/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 543738004,MDExOlB1bGxSZXF1ZXN0MzU3OTkyNTg4,72,Fixed implementation of upsert,9599,simonw,closed,0,,,,,0,2019-12-30T05:08:05Z,2019-12-30T05:29:24Z,2019-12-30T05:29:24Z,OWNER,simonw/sqlite-utils/pulls/72,Refs #66,140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/72/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 546078359,MDExOlB1bGxSZXF1ZXN0MzU5ODIyNzcz,75,Explicitly include tests and docs in sdist,15092,jayvdb,closed,0,,,,,1,2020-01-07T04:53:20Z,2020-01-31T00:21:27Z,2020-01-31T00:21:27Z,CONTRIBUTOR,simonw/sqlite-utils/pulls/75,Also exclude 'tests' from runtime installation.,140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/75/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 552773632,MDExOlB1bGxSZXF1ZXN0MzY1MjE4Mzkx,660,"gcloud run is now GA, s/beta//",813732,glasnt,closed,0,,,,,1,2020-01-21T10:08:38Z,2020-01-22T03:41:09Z,2020-01-21T23:28:12Z,CONTRIBUTOR,simonw/datasette/pulls/660,,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/660/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 557077945,MDExOlB1bGxSZXF1ZXN0MzY4NzM0NTAw,663,"-p argument for datasette package, plus tests - refs #661",9599,simonw,closed,0,,,,,1,2020-01-29T19:47:50Z,2020-01-29T22:46:43Z,2020-01-29T22:46:43Z,OWNER,simonw/datasette/pulls/663,,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/663/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 557830332,MDExOlB1bGxSZXF1ZXN0MzY5MzQ4MDg0,78,"New conversions= feature, refs #77",9599,simonw,closed,0,,,,,0,2020-01-31T00:02:33Z,2020-09-22T07:48:29Z,2020-01-31T00:24:31Z,OWNER,simonw/sqlite-utils/pulls/78,,140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/78/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 557892819,MDExOlB1bGxSZXF1ZXN0MzY5Mzk0MDQz,80,on_create mechanism for after table creation,9599,simonw,closed,0,,,,,5,2020-01-31T03:38:48Z,2020-01-31T05:08:04Z,2020-01-31T05:08:04Z,OWNER,simonw/sqlite-utils/pulls/80,"I need this for `geojson-to-sqlite`, in particular https://github.com/simonw/geojson-to-sqlite/issues/6",140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/80/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 565064079,MDExOlB1bGxSZXF1ZXN0Mzc1MTgwODMy,672,--dirs option for scanning directories for SQLite databases,9599,simonw,open,0,,,,,15,2020-02-14T02:25:52Z,2020-03-27T01:03:53Z,,OWNER,simonw/datasette/pulls/672,Refs #417.,107914493,datasette,pull,,,"{""url"": 
""https://api.github.com/repos/simonw/datasette/issues/672/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 570101428,MDExOlB1bGxSZXF1ZXN0Mzc5MTkyMjU4,683,.execute_write() and .execute_write_fn() methods on Database,9599,simonw,closed,0,,,3268330,Datasette 1.0,14,2020-02-24T19:51:58Z,2020-05-30T18:40:20Z,2020-02-25T04:45:08Z,OWNER,simonw/datasette/pulls/683,"See #682 - [x] Come up with design for `.execute_write()` and `.execute_write_fn()` - [x] Build some quick demo plugins to exercise the design - [x] Write some unit tests - [x] Write the documentation",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/683/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 570327466,MDExOlB1bGxSZXF1ZXN0Mzc5Mzc4Nzgw,686,?_searchmode=raw option,9599,simonw,closed,0,,,,,0,2020-02-25T05:45:50Z,2020-02-25T05:56:09Z,2020-02-25T05:56:04Z,OWNER,simonw/datasette/pulls/686,Closes #676,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/686/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 558715564,MDExOlB1bGxSZXF1ZXN0MzcwMDI0Njk3,4,Add beeminder-to-sqlite,706257,bcongdon,closed,0,,,,,0,2020-02-02T15:51:36Z,2020-10-12T00:36:16Z,2020-10-12T00:36:16Z,CONTRIBUTOR,dogsheep/dogsheep.github.io/pulls/4,,214746582,dogsheep.github.io,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep.github.io/issues/4/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 559522877,MDExOlB1bGxSZXF1ZXN0MzcwNjc1MDA3,664,Datasette.render_template() method,9599,simonw,closed,0,,,,,5,2020-02-04T06:53:59Z,2020-02-04T20:26:18Z,2020-02-04T20:26:18Z,OWNER,simonw/datasette/pulls/664,Refs #577,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/664/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 561469252,MDExOlB1bGxSZXF1ZXN0MzcyMjczNjA4,33,Upgrade to sqlite-utils 2.2.1,9599,simonw,closed,0,,,,,1,2020-02-07T07:32:12Z,2020-03-20T19:21:42Z,2020-03-20T19:21:41Z,MEMBER,dogsheep/twitter-to-sqlite/pulls/33,,206156866,twitter-to-sqlite,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/33/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 562085508,MDExOlB1bGxSZXF1ZXN0MzcyNzYzOTA2,666,"Use inspect-file, if possible, for total row count",13896256,kevindkeogh,closed,0,,,,,3,2020-02-08T22:10:35Z,2020-03-09T02:47:15Z,2020-02-25T20:19:29Z,CONTRIBUTOR,simonw/datasette/pulls/666,"For large tables, counting the number of rows in the table can take a signficant amount of time. 
Instead, where an inspect-file is provided for an immutable database, look up the row-count for a plain count(*).",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/666/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 563348959,MDExOlB1bGxSZXF1ZXN0MzczNzc1Nzg4,669,fix db-to-sqlite command in ecosystem doc page,883348,adipasquale,closed,0,,,,,1,2020-02-11T17:05:41Z,2020-02-22T02:32:18Z,2020-02-22T02:32:17Z,CONTRIBUTOR,simonw/datasette/pulls/669,the `--connection` parameter has become positional,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/669/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 573088799,MDExOlB1bGxSZXF1ZXN0MzgxNjY2Nzc3,688,Don't count rows on homepage for DBs > 100MB,9599,simonw,closed,0,,,,,0,2020-02-29T01:01:06Z,2020-02-29T01:08:30Z,2020-02-29T01:08:29Z,OWNER,simonw/datasette/pulls/688,Closes #649.,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/688/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 589801352,MDExOlB1bGxSZXF1ZXN0Mzk1MjU4Njg3,96,Add type conversion for Panda's Timestamp,32605365,b0b5h4rp13,closed,0,,,,,2,2020-03-29T14:13:09Z,2020-03-31T04:40:49Z,2020-03-31T04:40:48Z,CONTRIBUTOR,simonw/sqlite-utils/pulls/96,"Add type conversion for Panda's Timestamp, if Panda library is present in system (thanks for this project, I was about to do the same thing from scratch)",140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/96/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 592844348,MDExOlB1bGxSZXF1ZXN0Mzk3NzQ5NjUz,714,--metadata accepts YAML as well as JSON,9599,simonw,closed,0,,,,,1,2020-04-02T18:36:02Z,2020-04-02T19:30:54Z,2020-04-02T19:30:54Z,OWNER,simonw/datasette/pulls/714,Refs #713. Still needs tests and documentation.,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/714/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 594553553,MDExOlB1bGxSZXF1ZXN0Mzk5MTY2NDMz,719,asgi: check raw_path is not None,193185,cldellow,closed,0,,,,,1,2020-04-05T16:53:58Z,2020-05-04T17:14:26Z,2020-05-04T17:14:26Z,CONTRIBUTOR,simonw/datasette/pulls/719,"The ASGI spec (https://asgi.readthedocs.io/en/latest/specs/www.html#http) seems to imply that `None` is a valid value, so we need to check the value itself, not just whether the key is present. In particular, the [mangum](https://github.com/erm/mangum) adapter passes `None` for this key's value. 
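The distinction being made about raw_path reduces to a couple of lines; this is an illustrative sketch of the check, not the patch itself: a `"raw_path" in scope` test passes even when the value is None, so the code has to test the value.
```
def path_from_scope(scope):
    raw_path = scope.get("raw_path")  # may be missing *or* present-but-None
    if raw_path is not None:
        return raw_path.decode("latin-1")  # ASGI raw_path is bytes
    return scope["path"]

assert path_from_scope({"path": "/db", "raw_path": None}) == "/db"  # mangum's case
assert path_from_scope({"path": "/db/table", "raw_path": b"/db/table%2Fcsv"}) == "/db/table%2Fcsv"
```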
This change permits mangum to be used to front datasette in Amazon API Gateway + AWS Lambda deployments.",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/719/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 585597133,MDExOlB1bGxSZXF1ZXN0MzkxOTI0NTA5,703,WIP implementation of writable canned queries,9599,simonw,closed,0,,,,,3,2020-03-21T22:23:51Z,2020-06-03T00:08:14Z,2020-06-02T23:57:35Z,OWNER,simonw/datasette/pulls/703,Refs #698.,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/703/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1, 587302139,MDExOlB1bGxSZXF1ZXN0MzkzMjc0NDMz,708,"base_url configuration setting, refs #394",9599,simonw,closed,0,,,5234079,Datasette 0.39,2,2020-03-24T21:52:00Z,2020-03-25T00:18:44Z,2020-03-25T00:18:44Z,OWNER,simonw/datasette/pulls/708,Pull request implementing #394,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/708/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 603242257,MDExOlB1bGxSZXF1ZXN0NDA2MDY3MDE5,728,"Update mergedeep requirement from ~=1.1.1 to >=1.1.1,<1.4.0",27856297,dependabot-preview[bot],closed,0,,,,,0,2020-04-20T13:33:23Z,2020-05-04T16:45:58Z,2020-05-04T16:45:49Z,CONTRIBUTOR,simonw/datasette/pulls/728,"Updates the requirements on [mergedeep](https://github.com/clarketm/mergedeep) to permit the latest version.
Commits
  • 3d6e7b4 v1.3.0 - support additive merging of Counter types
  • 56a258a v1.2.1 - tidy docs and variable names
  • 61ab213 v1.2.0 - support both TYPESAFE_REPLACE and TYPESAFE_ADDITIVE merge strategies...
  • b331bb5 cleanup Makefile
  • 6f577bf officially label support for python3.8
  • 84faf37 use pipenv for managing dev dependencies
  • 3a8761a Update README.md
  • See full diff in compare view
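As a side note on the requirement change in this PR's title, the difference between ~=1.1.1 and >=1.1.1,<1.4.0 can be checked with the packaging library (a sketch; packaging is used here purely for illustration):
```
from packaging.specifiers import SpecifierSet
from packaging.version import Version

old = SpecifierSet("~=1.1.1")         # compatible release: >=1.1.1, ==1.1.*
new = SpecifierSet(">=1.1.1,<1.4.0")  # admits the 1.2.x and 1.3.x lines

assert Version("1.3.0") not in old
assert Version("1.3.0") in new
```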

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`. ---
Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR: - `@dependabot rebase` will rebase this PR - `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it - `@dependabot merge` will merge this PR after your CI passes on it - `@dependabot squash and merge` will squash and merge this PR after your CI passes on it - `@dependabot cancel merge` will cancel a previously requested merge and block automerging - `@dependabot reopen` will reopen this PR if it is closed - `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually - `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself) - `@dependabot use these labels` will set the current labels as the default for future PRs for this repo and language - `@dependabot use these reviewers` will set the current reviewers as the default for future PRs for this repo and language - `@dependabot use these assignees` will set the current assignees as the default for future PRs for this repo and language - `@dependabot use this milestone` will set the current milestone as the default for future PRs for this repo and language - `@dependabot badge me` will comment on this PR with code to add a ""Dependabot enabled"" badge to your readme Additionally, you can set the following in your Dependabot [dashboard](https://app.dependabot.com): - Update frequency (including time of day and day of week) - Pull request limits (per update run and/or open at any time) - Out-of-range updates (receive only lockfile updates, if desired) - Security updates (receive only security updates, if desired)
",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/728/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 604001627,MDExOlB1bGxSZXF1ZXN0NDA2Njc3MjA1,730,"Update pytest-asyncio requirement from ~=0.10.0 to >=0.10,<0.12",27856297,dependabot-preview[bot],closed,0,,,,,1,2020-04-21T13:32:35Z,2020-05-04T13:27:24Z,2020-05-04T13:27:23Z,CONTRIBUTOR,simonw/datasette/pulls/730,"Updates the requirements on [pytest-asyncio](https://github.com/pytest-dev/pytest-asyncio) to permit the latest version.
Commits
  • 1026c39 0.11.0
  • ab2b140 Test on Python 3.8, drop 3.3 and 3.4
  • 6397a22 plugin: Use pytest 5.4.0 new Function API
  • 21a0f94 Replace yield_fixture() by fixture()
  • 964b295 Added min hypothesis version so that bugfix for https://github.com/Hypothesis...
  • 4a11a20 Add max supported pytest version to < 5.4.0 to prevent fails until #141 is fi...
  • b305594 Change event_loop to module scope in hypothesis tests, fixing #145.
  • d5a0f47 Enable test_subprocess to be run on win, by changing to ProactorEventLoop in ...
  • d07cd2d Fix required pytest version
  • 86cd9a6 Handle BaseExceptions from loop.run_until_complete (#126)
  • Additional commits viewable in compare view

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`.
",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/730/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 605546606,MDExOlB1bGxSZXF1ZXN0NDA3OTI5MTI4,734,"Update janus requirement from ~=0.4.0 to >=0.4,<0.6",27856297,dependabot-preview[bot],closed,0,,,,,0,2020-04-23T13:43:45Z,2020-05-04T16:48:14Z,2020-05-04T16:48:04Z,CONTRIBUTOR,simonw/datasette/pulls/734,"Updates the requirements on [janus](https://github.com/aio-libs/janus) to permit the latest version.
Changelog

Sourced from janus's changelog.

0.5.0 (2020-04-23)

  • Remove explicit loop arguments and forbid creating queues outside event loops #246

0.4.0 (2018-07-28)

  • Add py.typed macro #89
  • Drop python 3.4 support and fix minimal version python3.5.3 #88
  • Add property that indicates if queue is closed #86

0.3.2 (2018-07-06)

  • Fixed python 3.7 support #97

0.3.1 (2018-01-30)

  • Fixed bug with join() in case tasks are added by sync_q.put() #75

0.3.0 (2017-02-21)

  • Expose unfinished_tasks property #34

0.2.4 (2016-12-05)

  • Restore tarball deploying

0.2.3 (2016-07-12)

  • Fix exception type

0.2.2 (2016-07-11)

  • Update asyncio.async() to use asyncio.ensure_future() #6

0.2.1 (2016-03-24)

  • Fix python setup.py test command #4

0.2.0 (2015-09-20)

... (truncated)
Commits
  • 8e89b45 Bump to 0.5.0
  • ec8592b Fix up Python 3.8 loop argument warnings (#246)
  • 2543af6 Bump coverage from 5.0.4 to 5.1
  • 03d1b36 Bump tox from 3.14.5 to 3.14.6
  • 8219c38 Bump coverage from 5.0.3 to 5.0.4
  • 85ec71d Bump pytest from 5.4.0 to 5.4.1
  • 3b974c9 Bump pytest from 5.3.5 to 5.4.0
  • 282dc12 Bump mypy from 0.761 to 0.770
  • 1364fb3 Bump tox from 3.14.4 to 3.14.5
  • dc519bb Bump tox from 3.14.3 to 3.14.4
  • Additional commits viewable in compare view

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`.
",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/734/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 607067303,MDExOlB1bGxSZXF1ZXN0NDA5MTIzODk3,737,"Custom pages mechanism, refs #648",9599,simonw,closed,0,,,,,4,2020-04-26T17:31:41Z,2020-04-26T18:46:43Z,2020-04-26T18:46:43Z,OWNER,simonw/datasette/pulls/737,"Refs #648. TODO: - [x] Pass a `view_name` to `render_template()` - [x] Mechanism for custom status code / headers / redirect - [x] Documentation",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/737/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 607107849,MDExOlB1bGxSZXF1ZXN0NDA5MTUzODcw,739,Configuration directory mode,9599,simonw,closed,0,,,,,3,2020-04-26T20:37:46Z,2020-04-27T16:30:25Z,2020-04-27T16:30:25Z,OWNER,simonw/datasette/pulls/739,"Refs #731 TODO: - [x] Decide how to combine explicit command-line options with items detected from the directory structure - [x] Add unit tests - [x] Implement `inspect-data.json` mechanism for populating `immutables` - [x] Add documentation",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/739/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 596245802,MDExOlB1bGxSZXF1ZXN0NDAwNTc4OTc5,720,"Update beautifulsoup4 requirement from ~=4.8.1 to >=4.8.1,<4.10.0",27856297,dependabot-preview[bot],closed,0,,,,,0,2020-04-08T01:24:38Z,2020-05-04T17:14:51Z,2020-05-04T17:14:46Z,CONTRIBUTOR,simonw/datasette/pulls/720,"Updates the requirements on [beautifulsoup4](http://www.crummy.com/software/BeautifulSoup/bs4/) to permit the latest version. Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`. [//]: # (dependabot-automerge-start) [//]: # (dependabot-automerge-end) --- **Note:** This repo was added to Dependabot recently, so you'll receive a maximum of 5 PRs for your first few update runs. Once an update run creates fewer than 5 PRs we'll remove that limit. You can always request more updates by clicking `Bump now` in your [Dependabot dashboard](https://app.dependabot.com).
",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/720/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 596245923,MDExOlB1bGxSZXF1ZXN0NDAwNTc5MDc3,721,"Update pytest requirement from ~=5.2.2 to >=5.2.2,<5.5.0",27856297,dependabot-preview[bot],closed,0,,,,,0,2020-04-08T01:25:04Z,2020-05-04T17:13:49Z,2020-05-04T17:13:41Z,CONTRIBUTOR,simonw/datasette/pulls/721,"Updates the requirements on [pytest](https://github.com/pytest-dev/pytest) to permit the latest version.
Release notes

Sourced from pytest's releases.

5.4.1

pytest 5.4.1 (2020-03-13)

Bug Fixes

  • #6909: Revert the change introduced by #6330, which required all arguments to @pytest.mark.parametrize to be explicitly defined in the function signature.

    The intention of the original change was to remove what was expected to be an unintended/surprising behavior, but it turns out many people relied on it, so the restriction has been reverted.

  • #6910: Fix crash when plugins return an unknown stats while using the --reportlog option.

Changelog

Sourced from pytest's changelog.

Commits
  • 3d0f3ba Preparing release version 5.4.1
  • b9e2cd0 Merge pull request #6914 from nicoddemus/revert-6330
  • a84fcbf Revert "[parametrize] enforce explicit argnames declaration (#6330)"
  • 59c1bfa Merge pull request #6913 from nicoddemus/backport-6910
  • 3267f64 Merge pull request #6910 from nicoddemus/resultlog-logreport
  • c9fd1bd Preparing release version 5.4.0
  • 93aa988 Merge pull request #6901 from RonnyPfannschmidt/regendoc-fix-simple
  • 7996724 Merge pull request #6902 from RoyalTS/filterwarnings-docfix
  • 90ee8a7 docfix
  • 378a75d run and fix tox -e regen to prepare 5.4
  • Additional commits viewable in compare view

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`. --- **Note:** This repo was added to Dependabot recently, so you'll receive a maximum of 5 PRs for your first few update runs. Once an update run creates fewer than 5 PRs we'll remove that limit. You can always request more updates by clicking `Bump now` in your [Dependabot dashboard](https://app.dependabot.com).
",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/721/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 596246006,MDExOlB1bGxSZXF1ZXN0NDAwNTc5MTM2,722,"Update jinja2 requirement from ~=2.10.3 to >=2.10.3,<2.12.0",27856297,dependabot-preview[bot],closed,0,,,,,0,2020-04-08T01:25:24Z,2020-05-04T17:13:26Z,2020-05-04T17:13:16Z,CONTRIBUTOR,simonw/datasette/pulls/722,"Updates the requirements on [jinja2](https://github.com/pallets/jinja) to permit the latest version.
Release notes

Sourced from jinja2's releases.

2.11.1

This fixes an issue in async environment when indexing the result of an attribute lookup, like {{ data.items[1:] }}.

Changelog

Sourced from jinja2's changelog.

Version 2.11.1

Released 2020-01-30

  • Fix a bug that prevented looking up a key after an attribute ({{ data.items[1:] }}) in an async template. #1141

Version 2.11.0

Released 2020-01-27

  • Drop support for Python 2.6, 3.3, and 3.4. This will be the last version to support Python 2.7 and 3.5.
  • Added a new ChainableUndefined class to support getitem and getattr on an undefined object. #977
  • Allow {%+ syntax (with NOP behavior) when lstrip_blocks is disabled. #748
  • Added a default parameter for the map filter. #557
  • Exclude environment globals from meta.find_undeclared_variables. #931
  • Float literals can be written with scientific notation, like 2.56e-3. #912, #922
  • Int and float literals can be written with the '_' separator for legibility, like 12_345. #923
  • Fix a bug causing deadlocks in LRUCache.setdefault. #1000
  • The trim filter takes an optional string of characters to trim. #828
  • A new jinja2.ext.debug extension adds a {% debug %} tag to quickly dump the current context and available filters and tests. #174, #798, #983
  • Lexing templates with large amounts of whitespace is much faster. #857, #858
  • Parentheses around comparisons are preserved, so {{ 2 * (3 < 5) }} outputs "2" instead of "False". #755, #938
  • Add new boolean, false, true, integer and float tests. #824
  • The environment's finalize function is only applied to the output of expressions (constant or not), not static template data. #63
  • When providing multiple paths to FileSystemLoader, a template can have the same name as a directory. #821
  • Always return Undefined when omitting the else clause in a {{ 'foo' if bar }} expression, regardless of the environment's undefined class. Omitting the else clause is a valid shortcut and should not raise an error when using StrictUndefined. #710, #1079
  • Fix behavior of loop control variables such as length and revindex0 when looping over a generator. #459, #751, #794, #993
  • Async support is only loaded the first time an environment enables it, in order to avoid a slow initial import. #765
  • In async environments, the |map filter will await the filter call if needed. #913
  • In for loops that access loop attributes, the iterator is not advanced ahead of the current iteration unless length, revindex, nextitem, or last are accessed. This makes it less likely to break groupby results. #555, #1101
  • In async environments, the loop attributes length and revindex work for async iterators. #1101
  • In async environments, values from attribute/property access will be awaited if needed. #1101
  • PackageLoader doesn't depend on setuptools or pkg_resources. #970
  • PackageLoader has limited support for PEP 420 namespace packages. #1097
  • Support os.PathLike objects in FileSystemLoader and ModuleLoader. #870
  • NativeTemplate correctly handles quotes between expressions. "'{{ a }}', '{{ b }}'" renders as the tuple ('1', '2') rather than the string '1, 2'. #1020
  • Creating a NativeTemplate directly creates a NativeEnvironment instead of a default Environment. #1091
  • After calling LRUCache.copy(), the copy's queue methods point to the correct queue. #843
  • Compiling templates always writes UTF-8 instead of defaulting to the system encoding. #889
  • |wordwrap filter treats existing newlines as separate paragraphs to be wrapped individually, rather than creating short intermediate lines. #175
  • Add break_on_hyphens parameter to |wordwrap filter. #550
  • Cython compiled functions decorated as context functions will be passed the context. #1108
  • When chained comparisons of constants are evaluated at compile time, the result follows Python's behavior of returning False if any comparison returns False, rather than only the last one. #1102
  • Tracebacks for exceptions in templates show the correct line numbers and source for Python >= 3.7. #1104
  • Tracebacks for template syntax errors in Python 3 no longer show internal compiler frames. #763
  • Add a DerivedContextReference node that can be used by extensions to get the current context and local variables such as loop. #860
  • Constant folding during compilation is applied to some node types that were previously overlooked. #733
  • TemplateSyntaxError.source is not empty when raised from an included template. #457
... (truncated)
Commits
  • b85283e release version 2.11.1
  • 3d5bfc6 Merge pull request #1143 from pallets/bugfix/attribute-access
  • d61c1ea add changelog
  • 15d7e61 Added regression test for slicing of attributes
  • 05dee9b Fix attribute access in async code. Fixes #1141
  • bbdafe3 release version 2.11.0
  • 9ff27f6 add python 3.8 classifier, clean up changelog
  • d312609 isolate bytecode cache tests
  • 9849979 import Markup from markupsafe, fix flake8 import warnings
  • c6d864c increment bytecode cache version
  • Additional commits viewable in compare view

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`. --- **Note:** This repo was added to Dependabot recently, so you'll receive a maximum of 5 PRs for your first few update runs. Once an update run creates fewer than 5 PRs we'll remove that limit. You can always request more updates by clicking `Bump now` in your [Dependabot dashboard](https://app.dependabot.com).
",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/722/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 598891570,MDExOlB1bGxSZXF1ZXN0NDAyNjQ1OTg0,725,"Update aiofiles requirement from ~=0.4.0 to >=0.4,<0.6",27856297,dependabot-preview[bot],closed,0,,,,,3,2020-04-13T13:32:47Z,2020-05-04T18:16:54Z,2020-05-04T16:17:49Z,CONTRIBUTOR,simonw/datasette/pulls/725,"Updates the requirements on [aiofiles](https://github.com/Tinche/aiofiles) to permit the latest version.
Commits

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`.
",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/725/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 614806683,MDExOlB1bGxSZXF1ZXN0NDE1Mjg2MTA1,763,Documentation + improvements for db.execute() and Results class,9599,simonw,closed,0,,,,,0,2020-05-08T15:16:02Z,2020-06-11T16:05:48Z,2020-05-08T16:05:46Z,OWNER,simonw/datasette/pulls/763,"Refs #685 Still TODO: - [x] Implement `results.first()` - [x] Implement `results.single_value()` - [x] Unit tests for the above ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/763/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 608752766,MDExOlB1bGxSZXF1ZXN0NDEwNDY5Mjcy,746,"shutil.Error, not OSError",9599,simonw,closed,0,,,,,1,2020-04-29T03:30:51Z,2020-04-29T07:07:24Z,2020-04-29T07:07:23Z,OWNER,simonw/datasette/pulls/746,Refs #744,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/746/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 611874514,MDExOlB1bGxSZXF1ZXN0NDEyOTUxMTkx,753,"Update pytest-asyncio requirement from ~=0.10.0 to >=0.10,<0.13",27856297,dependabot-preview[bot],closed,0,,,,,0,2020-05-04T13:27:19Z,2020-05-04T17:41:01Z,2020-05-04T17:40:49Z,CONTRIBUTOR,simonw/datasette/pulls/753,"Updates the requirements on [pytest-asyncio](https://github.com/pytest-dev/pytest-asyncio) to permit the latest version.
Commits
  • b8e2a45 0.12.0
  • 06580c6 Update changelog
  • b45de23 Fixed failing test case, 'test_asyncio_marker_without_loop'.
  • 238cced Put event_loop first among the fixtures of asyncio tests, fixes #154.
  • e5e3dc7 Added unittests for issue #154.
  • a7e5795 0.12.0 open for business!
  • 1026c39 0.11.0
  • ab2b140 Test on Python 3.8, drop 3.3 and 3.4
  • 6397a22 plugin: Use pytest 5.4.0 new Function API
  • 21a0f94 Replace yield_fixture() by fixture()
  • Additional commits viewable in compare view

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`.
",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/753/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 625922239,MDExOlB1bGxSZXF1ZXN0NDI0MDMyNDQ1,769,Backport of Python 3.8 shutil.copytree,9599,simonw,closed,0,,,5471110,Datasette 0.43,0,2020-05-27T18:17:15Z,2020-05-27T20:21:56Z,2020-05-27T18:17:44Z,OWNER,simonw/datasette/pulls/769,Closes #744,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/769/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 625991831,MDExOlB1bGxSZXF1ZXN0NDI0MDg1MjY0,772,Test that plugin hooks are unit tested,9599,simonw,closed,0,,,5471110,Datasette 0.43,0,2020-05-27T20:01:32Z,2020-05-27T20:21:56Z,2020-05-27T20:16:03Z,OWNER,simonw/datasette/pulls/772,Refs #771,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/772/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 627836898,MDExOlB1bGxSZXF1ZXN0NDI1NTMxMjA1,783,Authentication: plugin hooks plus default --root auth mechanism,9599,simonw,closed,0,,,,,0,2020-05-30T22:25:47Z,2020-06-01T01:16:44Z,2020-06-01T01:16:43Z,OWNER,simonw/datasette/pulls/783,See #699,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/783/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 629595228,MDExOlB1bGxSZXF1ZXN0NDI2ODkxNDcx,796,New WIP writable canned queries,9599,simonw,closed,0,,,3268330,Datasette 1.0,9,2020-06-03T00:08:00Z,2020-06-03T15:16:52Z,2020-06-03T15:16:50Z,OWNER,simonw/datasette/pulls/796,"Refs #698. Replaces #703 Still todo: - [x] Unit tests - ~~Figure out `.json` mode~~ - [x] Flash message solution - ~~CSRF protection~~ - [x] Better error message display on errors - [x] Documentation - ~~Maybe widgets?~~ I'll do these later",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/796/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 631300342,MDExOlB1bGxSZXF1ZXN0NDI4MjEyNDIx,798,CSRF protection,9599,simonw,closed,0,,,5512395,Datasette 0.44,5,2020-06-05T04:22:35Z,2020-06-06T00:43:41Z,2020-06-05T19:05:58Z,OWNER,simonw/datasette/pulls/798,Refs #793,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/798/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 632645865,MDExOlB1bGxSZXF1ZXN0NDI5MzY2NjQx,803,Canned query permissions,9599,simonw,closed,0,,,,,0,2020-06-06T18:20:00Z,2020-06-06T19:40:21Z,2020-06-06T19:40:20Z,OWNER,simonw/datasette/pulls/803,Refs #800. Closes #786,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/803/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 632919570,MDExOlB1bGxSZXF1ZXN0NDI5NjEzODkz,809,Publish secrets,9599,simonw,closed,0,,,5512395,Datasette 0.44,4,2020-06-07T02:00:31Z,2020-06-11T16:02:13Z,2020-06-11T16:02:03Z,OWNER,simonw/datasette/pulls/809,Refs #787. 
Will need quite a bit of manual testing since this involves code which runs against Heroku and Cloud Run.,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/809/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 622672640,MDExOlB1bGxSZXF1ZXN0NDIxNDkxODEw,768,Use dirs_exist_ok=True,9599,simonw,closed,0,,,5471110,Datasette 0.43,0,2020-05-21T17:53:44Z,2020-05-27T20:21:56Z,2020-05-21T17:53:51Z,OWNER,simonw/datasette/pulls/768,Refs #744,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/768/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 638230433,MDExOlB1bGxSZXF1ZXN0NDM0MDU1NzUy,844,Action to run tests and upload coverage report,9599,simonw,closed,0,,,,,1,2020-06-13T20:52:47Z,2020-06-13T21:36:52Z,2020-06-13T21:36:50Z,OWNER,simonw/datasette/pulls/844,Refs #843,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/844/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 638270441,MDExOlB1bGxSZXF1ZXN0NDM0MDg1MjM1,848,Reload support for config_dir mode.,49260,amjith,closed,0,,,,,1,2020-06-14T02:34:46Z,2020-07-03T02:44:54Z,2020-07-03T02:44:53Z,CONTRIBUTOR,simonw/datasette/pulls/848,"A reference implementation for adding support to reload when datasette is in the config_dir mode. This implementation is flawed since it is watching the entire directory and any changes to the database will reload the server and adding unrelated files to the directory will also reload the server. ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/848/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 638375985,MDExOlB1bGxSZXF1ZXN0NDM0MTYyMzE2,29,Fixed bug in SQL query for photo scores,41546558,RhetTbull,closed,0,,,,,1,2020-06-14T15:39:22Z,2020-12-04T22:32:36Z,2020-12-04T22:32:27Z,CONTRIBUTOR,dogsheep/dogsheep-photos/pulls/29,"The join on ZCOMPUTEDASSETATTRIBUTES used the wrong columns. In most of the Photos database tables, table.ZASSET joins with ZGENERICASSET.Z_PK",256834907,dogsheep-photos,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-photos/issues/29/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 644610729,MDExOlB1bGxSZXF1ZXN0NDM5MjAzODA4,866,"Update pytest-asyncio requirement from <0.13,>=0.10 to >=0.10,<0.15",27856297,dependabot-preview[bot],closed,0,,,,,1,2020-06-24T13:21:47Z,2020-06-24T18:50:57Z,2020-06-24T18:50:56Z,CONTRIBUTOR,simonw/datasette/pulls/866,"Updates the requirements on [pytest-asyncio](https://github.com/pytest-dev/pytest-asyncio) to permit the latest version.
Commits
  • 53f3da7 Prepare for release
  • e99569d A line is added to the changelog.
  • 4099b63 One import is not needed
  • 68513b3 Clarify names and comments, according to yanlend comments 26 May
  • 907e8f2 FIX new test_cases on python 3.5 & 3.6
  • 51d986c To solve test cases that fail:
  • f97e900 1) Test case (test_async_fixtures_with_finalizer) refactoring to pass on pyth...
  • c1131f8 1) A new test case that fails with 0.12.0, and pass with this commit.
  • 7a255bc 0.13.0 open for business
  • b8e2a45 0.12.0
  • Additional commits viewable in compare view

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`.
",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/866/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 635037204,MDExOlB1bGxSZXF1ZXN0NDMxNDc4NzI0,819,register_routes() plugin hook,9599,simonw,closed,0,,,5512395,Datasette 0.44,0,2020-06-09T01:20:44Z,2020-06-09T03:12:08Z,2020-06-09T03:12:07Z,OWNER,simonw/datasette/pulls/819,Refs #215,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/819/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 651844316,MDExOlB1bGxSZXF1ZXN0NDQ1MDIzMzI2,118,Add insert --truncate option,79913,tsibley,closed,0,,,,,9,2020-07-06T21:58:40Z,2020-07-08T17:26:21Z,2020-07-08T17:26:21Z,CONTRIBUTOR,simonw/sqlite-utils/pulls/118," Deletes all rows in the table (if it exists) before inserting new rows. SQLite doesn't implement a TRUNCATE TABLE statement but does optimize an unqualified DELETE FROM. This can be handy if you want to refresh the entire contents of a table but a) don't have a PK (so can't use --replace), b) don't want the table to disappear (even briefly) for other connections, and c) have to handle records that used to exist being deleted. Ideally the replacement of rows would appear instantaneous to other connections by putting the DELETE + INSERT in a transaction, but this is very difficult without breaking other code as the current transaction handling is inconsistent and non-systematic. There exists the possibility for the DELETE to succeed but the INSERT to fail, leaving an empty table. This is not much worse, however, than the current possibility of one chunked INSERT succeeding and being committed while the next chunked INSERT fails, leaving a partially complete operation.",140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/118/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 652816158,MDExOlB1bGxSZXF1ZXN0NDQ1ODMzOTA4,120,Fix query command's support for DML,79913,tsibley,closed,0,,,,,1,2020-07-08T01:36:34Z,2020-07-08T05:14:04Z,2020-07-08T05:14:04Z,CONTRIBUTOR,simonw/sqlite-utils/pulls/120,See commit messages for details. I ran into this while investigating another feature/issue.,140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/120/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 655974395,MDExOlB1bGxSZXF1ZXN0NDQ4MzU1Njgw,30,Handle empty bucket on first upload. Allow specifying the endpoint_url for services other than S3 (like b2 and digitalocean spaces),110038,scanner,open,0,,,,,0,2020-07-13T16:15:26Z,2020-07-13T16:15:26Z,,FIRST_TIME_CONTRIBUTOR,dogsheep/dogsheep-photos/pulls/30,"Finally got around to trying dogsheep-photos but I want to use backblaze's b2 service instead of AWS S3. Had to add a way to optionally specify the endpoint_url to connect to. Then with the bucket being empty the initial key retrieval would fail. Probably a better way to see that the bucket is empty than doing a test inside the paginator loop. 
Also probably a better way to specify the endpoint_url as we get and test for it twice using the same code in two different places but did not want to spend too much time worrying about it.",256834907,dogsheep-photos,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-photos/issues/30/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 646448486,MDExOlB1bGxSZXF1ZXN0NDQwNzM1ODE0,868,initial windows ci setup,702729,joshmgrant,open,0,,,,,3,2020-06-26T18:49:13Z,2021-07-10T23:41:43Z,,FIRST_TIME_CONTRIBUTOR,simonw/datasette/pulls/868,Picking up the work done on #557 with a new PR. Seeing if I can get this working.,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/868/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 646734280,MDExOlB1bGxSZXF1ZXN0NDQwOTQ2ODE3,869,Magic parameters for canned queries,9599,simonw,closed,0,,,5533512,Datasette 0.45,1,2020-06-27T18:37:21Z,2020-06-28T02:58:18Z,2020-06-28T02:58:17Z,OWNER,simonw/datasette/pulls/869,"Implementation for #842 TODO: - [x] Add tests for built-in magic parameters - [x] Magic parameters should not show up as blank form fields on the query page - [x] Update documentation for new `_request_X` (now called `_header_X`) implementation where X is a key from the ASGI scope - [x] Make sure these only work for canned queries, not for arbitrary SQL queries (security issue) - [x] Add test for the `register_magic_parameters` plugin hook - [x] Add documentation for the `register_magic_parameters` plugin hook ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/869/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 648749062,MDExOlB1bGxSZXF1ZXN0NDQyNTA1MDg4,883,Skip counting hidden tables,3243482,abdusco,open,0,,,,,4,2020-07-01T07:38:08Z,2020-07-02T00:25:44Z,,CONTRIBUTOR,simonw/datasette/pulls/883,"Potential fix for https://github.com/simonw/datasette/issues/859. Disabling table counts for hidden tables speeds up database page quite a bit. In my setup it reduced load time by 2/3 (~300 -> ~90ms)",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/883/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 650305298,MDExOlB1bGxSZXF1ZXN0NDQzODIzMDQw,890,Load only python files from plugins-dir.,49260,amjith,closed,0,,,,,2,2020-07-03T02:47:32Z,2020-07-03T03:08:33Z,2020-07-03T03:08:33Z,CONTRIBUTOR,simonw/datasette/pulls/890,"The current behavior for `--plugins-dir` is to load every file in that folder as a python module. This can result in errors if there are non-python files in the plugins dir (such as .mypy_cache). This PR restricts the module loading to only python files. ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/890/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0,