id,node_id,number,title,user,state,locked,assignee,milestone,comments,created_at,updated_at,closed_at,author_association,pull_request,body,repo,type,active_lock_reason,performed_via_github_app,reactions,draft,state_reason 1121618041,I_kwDOBm6k_c5C2oh5,1620,"Link: rel=""alternate"" to JSON for queries too",9599,closed,0,,3268330,3,2022-02-02T08:02:42Z,2022-02-02T21:53:02Z,2022-02-02T21:33:00Z,OWNER,,"Following: - #1533 I implemented it for tables and rows but I should have done queries as well.",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1620/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1108846067,I_kwDOBm6k_c5CF6Xz,1606,Tests failing against Python 3.6,9599,closed,0,,,3,2022-01-20T04:22:44Z,2022-01-20T04:36:42Z,2022-01-20T04:36:42Z,OWNER,,"https://github.com/simonw/datasette/runs/4877484366 ``` E File ""/opt/hostedtoolcache/Python/3.6.15/x64/lib/python3.6/site-packages/uvicorn/server.py"", line 67, in run E return asyncio.run(self.serve(sockets=sockets)) E AttributeError: module 'asyncio' has no attribute 'run' ``` I think this may mean `uvicorn` has dropped support for Python 3.6.",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1606/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1157182254,I_kwDOBm6k_c5E-TMu,1646,Configuration directory mode does not pick up other file extensions than .db,15640196,closed,0,,,3,2022-03-02T13:15:23Z,2022-10-07T23:06:17Z,2022-10-07T23:03:35Z,NONE,,"Hello, I've been trying to run Datasette with the [configuration directory mode](https://docs.datasette.io/en/stable/settings.html#configuration-directory-mode) with a structure such as this one: ```plain some-directory/ example.sqlite3 another-example.db one-more.custom [...] ``` (In my scenario I can't just change the filename extension without other problems arising) Now databases with the `.sqlite3` or the custom filename extension are ignored by Datasette in this case. I'm aware that the docs state that a `.db` extension is required, but I was wondering if there is a reason for restricting this or any workaround available? When I run `datasette example.sqlite3` or `datasette one-more.custom` the databases are served by Datasette without a problem. ",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1646/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1174162781,I_kwDOBm6k_c5F_E1d,1666,Refactor URL routing to enable testing,9599,closed,0,,3268330,3,2022-03-19T03:52:29Z,2022-03-19T16:32:03Z,2022-03-19T16:32:03Z,OWNER,,"I ran into some bugs earlier with URL routing - having more robust testing around this (especially since they are defined using regular expressions) would be really useful. 
- A utility function that resolves a path against a list of routes and returns the match - Make the routes and regular expressions available from a private Datasette method - Add tests that exercise them Related: - #1660",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1666/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1174302994,I_kwDOBm6k_c5F_nES,1667,Make route matched pattern groups more consistent,9599,closed,0,,3268330,3,2022-03-19T16:32:35Z,2022-03-19T20:37:42Z,2022-03-19T20:37:41Z,OWNER,,"> ... highlights how inconsistent the way the capturing works is. Especially `as_format` which can be `None` or `""""` or `.json` or `json` or not used at all in the case of `TableView`. https://github.com/simonw/datasette/blob/764738dfcb16cd98b0987d443f59d5baa9d3c332/tests/test_routes.py#L12-L36 _Originally posted by @simonw in https://github.com/simonw/datasette/issues/1666#issuecomment-1073039670_ Part of: - #1660",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1667/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1175690070,I_kwDOBm6k_c5GE5tW,1676,"Reconsider ensure_permissions() logic, can it be less confusing?",9599,open,0,,3268330,3,2022-03-21T17:14:57Z,2022-12-02T01:23:40Z,,OWNER,,"> Updated documentation: https://github.com/simonw/datasette/blob/e627510b760198ccedba9e5af47a771e847785c9/docs/internals.rst#await-ensure_permissionsactor-permissions > >> This method allows multiple permissions to be checked at onced. It raises a `datasette.Forbidden` exception if any of the checks are denied before one of them is explicitly granted. >> >> This is useful when you need to check multiple permissions at once. For example, an actor should be able to view a table if either one of the following checks returns `True` or not a single one of them returns `False`: > > That's pretty hard to understand! I'm going to open a separate issue to reconsider if this is a useful enough abstraction given how confusing it is. _Originally posted by @simonw in https://github.com/simonw/datasette/issues/1675#issuecomment-1074177827_",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1676/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1181432624,I_kwDOBm6k_c5Gazsw,1688,[plugins][documentation] Is it possible to serve per-plugin static folders when writing one-off (single file) plugins?,9020979,closed,0,,,3,2022-03-26T01:17:44Z,2022-03-27T01:01:14Z,2022-03-26T21:34:47Z,CONTRIBUTOR,,"I'm trying to make a small plugin that depends on static assets, by following the guide [here](https://docs.datasette.io/en/stable/writing_plugins.html#writing-one-off-plugins). I made a `plugins/` directory with `datasette_nteract_data_explorer.py`. I am trying to follow the example of `datasette_vega`, and serving static assets. I created a `statics/` directory within `plugins/` to serve my JS and CSS. https://github.com/simonw/datasette-vega/blob/00de059ab1ef77394ba9f9547abfacf966c479c4/datasette_vega/__init__.py#L13 Unfortunately, datasette doesn't seem to be able to find my assets. 
Input: ```bash datasette ~/Library/Safari/History.db --plugins-dir=plugins/ ``` ![Image 2022-03-25 at 9 18 17 PM](https://user-images.githubusercontent.com/9020979/160218979-a3ff474b-5255-4a76-85d1-6f90ab2e3b44.jpg) Output: ![Image 2022-03-25 at 9 11 00 PM](https://user-images.githubusercontent.com/9020979/160218733-ca5144cf-f23f-43d8-a8d3-e3a871e57f3a.jpg) I suspect this issue might go away if I move away from ""one-off"" plugin mode, but it's been a while since I created a new python package so I'm not sure how much work there is to go between ""one off"" and ""packaged for PyPI"". I'd like to try to avoid needing to repackage a new `tar.gz` file and or reinstall my library repeatedly when developing new python code. 1. Is there a way to serve a static assets when using the `plugins/` directory method instead of installing plugins as a new python package? 2. If not, is there a way I can work on developing a plugin without creating and repackaging tar.gz files after every change, or is that the recommended path? Thanks for your help! ",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1688/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1202227104,I_kwDOBm6k_c5HqIeg,1712,"Make ""<Binary: 2,427,344 bytes>"" easier to read",9599,closed,0,,,3,2022-04-12T18:17:07Z,2022-04-12T19:12:22Z,2022-04-12T18:44:20Z,OWNER,,"`Binary: 2,427,344 bytes` would be nicer - even better, include a tooltip showing that size translated using this function: https://github.com/simonw/datasette/blob/138e4d9a53e3982137294ba383303c3a848cfca4/datasette/utils/__init__.py#L837-L846 ![CleanShot 2022-04-12 at 11 15 04@2x](https://user-images.githubusercontent.com/9599/163027324-b0b6092e-6e11-438b-8077-789025d0bb37.png) ",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1712/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1223241647,I_kwDOBm6k_c5I6S-v,1734,Remove python-baseconv dependency,9599,closed,0,,,3,2022-05-02T19:08:37Z,2022-05-02T23:25:49Z,2022-05-02T19:39:20Z,OWNER,,"> I was going to vendor `baseconv.py`, but then I reconsidered - what if there are plugins out there that expect `import baseconv` to work because they have depended on Datasette? > > I used https://cs.github.com/ and as far as I can tell there aren't any! > > So I'm going to remove that dependency and work out a smarter way to do this - probably by providing a utility function within Datasette itself. _Originally posted by @simonw in https://github.com/simonw/datasette/issues/1733#issuecomment-1115258737_",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1734/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1214859703,I_kwDOBm6k_c5IaUm3,1719,Refactor `RowView` and remove `RowTableShared`,9599,closed,0,,,3,2022-04-25T18:06:24Z,2022-12-01T21:15:19Z,2022-04-25T18:33:44Z,OWNER,,"> The `RowTableShared` class is making this a whole lot more complicated. > > I'm going to split the `RowView` view out into an entirely separate `views/row.py` module. 
_Originally posted by @simonw in https://github.com/simonw/datasette/issues/1715#issuecomment-1108875068_",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1719/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1216619276,I_kwDOBm6k_c5IhCMM,1724,?_trace=1 doesn't work on Global Power Plants demo,9599,closed,0,,,3,2022-04-27T00:15:02Z,2022-04-27T06:15:14Z,2022-04-27T00:18:30Z,OWNER,,"https://global-power-plants.datasettes.com/global-power-plants/global-power-plants?_trace=1 is not showing the trace JSON at the bottom of the page. Confirmed that `trace_debug` is `true` on https://global-power-plants.datasettes.com/-/settings Possibly related: - https://github.com/simonw/datasette-total-page-time/issues/1",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1724/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1362363685,I_kwDOBm6k_c5RNAUl,1800,Remove upper bound dependencies as a default policy,9599,closed,0,,,3,2022-09-05T18:23:45Z,2022-09-05T18:39:52Z,2022-09-05T18:35:41Z,OWNER,,"https://iscinumpy.dev/post/bound-version-constraints/ has convinced me not to use upper bound dependencies unless I'm certain they are needed. Relevant PR: - https://github.com/simonw/datasette/pull/1799 Also: https://github.com/simonw/datasette/blob/ba35105eee2d3ba620e4f230028a02b2e2571df2/setup.py#L45-L46 https://github.com/simonw/datasette/blob/ba35105eee2d3ba620e4f230028a02b2e2571df2/setup.py#L48-L49 https://github.com/simonw/datasette/blob/ba35105eee2d3ba620e4f230028a02b2e2571df2/setup.py#L51-L55 https://github.com/simonw/datasette/blob/ba35105eee2d3ba620e4f230028a02b2e2571df2/setup.py#L57-L59 https://github.com/simonw/datasette/blob/ba35105eee2d3ba620e4f230028a02b2e2571df2/setup.py#L75-L78 https://github.com/simonw/datasette/blob/ba35105eee2d3ba620e4f230028a02b2e2571df2/setup.py#L81-L82 ",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1800/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1385026210,I_kwDOBm6k_c5SjdKi,1819,Preserve query on timeout,2182,closed,0,,,3,2022-09-25T13:32:31Z,2022-09-26T23:16:15Z,2022-09-26T23:06:06Z,CONTRIBUTOR,,"If a query hits the timeout it shows a message like: > SQL query took too long. The time limit is controlled by the [sql_time_limit_ms](https://docs.datasette.io/en/stable/settings.html#sql-time-limit-ms) configuration option. But the query is lost. Hitting the browser back button shows the query _before_ the one that errored. It would be nice if the query that errored was preserved for more tweaking. 
This would make it similar to how ""invalid syntax"" works since #1346 / #619.",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1819/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1386854246,I_kwDOBm6k_c5Sqbdm,1822,Switch to keyword-only arguments for a bunch of internal methods,9599,open,0,,3268330,3,2022-09-26T23:20:38Z,2022-09-27T00:44:04Z,,OWNER,,"This is a good idea, and one that needs to happen before Datasette 1.0: > While you are adding features, would you be future-proofing your APIs if you switched over some arguments over to keyword-only arguments or would that be too disruptive? > > Thinking out loud: > > ``` > async def render_template( > self, templates, *, context=None, plugin_context=None, request=None, view_name=None > ): > ``` _Originally posted by @jefftriplett in https://github.com/simonw/datasette/issues/1817#issuecomment-1256781274_ ",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1822/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1388631785,I_kwDOBm6k_c5SxNbp,1826,render_cell documentation example doesn't match the method signature,66709385,closed,0,,,3,2022-09-28T02:37:59Z,2022-09-28T04:30:28Z,2022-09-28T04:05:16Z,NONE,,"Open Datasette stable doc at https://docs.datasette.io/en/stable/plugin_hooks.html?highlight=render_cell#render-cell-row-value-column-table-database-datasette render_cell plugin hook method signature is `render_cell(row, value, column, table, database, datasette)`, the example shown inline uses `render_cell(value)`. ![image](https://user-images.githubusercontent.com/66709385/192674691-34265b81-6cdd-41d2-8424-aa12f8bc8c94.png) ",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1826/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1422973111,I_kwDOBm6k_c5U0Ni3,1854,Flaky test: test_serve_localhost_http,9599,closed,0,,,3,2022-10-25T19:37:35Z,2022-10-25T19:53:02Z,2022-10-25T19:53:02Z,OWNER,,Failing on Python 3.10 at the moment: https://github.com/simonw/datasette/actions/runs/3323629947/jobs/5494340302,107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1854/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1423336122,I_kwDOBm6k_c5U1mK6,1856,allow_signed_tokens setting for disabling API signed token mechanism,9599,closed,0,,8658075,3,2022-10-26T02:20:55Z,2022-11-15T19:57:05Z,2022-10-26T02:58:35Z,OWNER,,"Had some design thoughts here: https://github.com/simonw/datasette/issues/1852#issuecomment-1291272280 I liked this option the most: --setting allow_create_tokens off",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1856/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1423369494,I_kwDOBm6k_c5U1uUW,1859,datasette create-token CLI command,9599,closed,0,,8658075,3,2022-10-26T03:12:59Z,2022-11-15T19:59:00Z,2022-10-26T04:31:39Z,OWNER,,The CLI equivalent of the `/-/create-token` page.,107914493,issue,,,"{""url"": 
""https://api.github.com/repos/simonw/datasette/issues/1859/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1410305897,I_kwDOBm6k_c5UD49p,1845,Reconsider the Datasette first-run experience,9599,open,0,,,3,2022-10-15T22:21:31Z,2022-10-16T08:54:53Z,,OWNER,,"Had a really interesting conversation today about how hard it is to get from ""I installed Datasette"" to ""I've done something useful with it"": https://news.ycombinator.com/item?id=33216789#33218590 Spending some time focusing on that first-run experience feels very worthwhile.",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1845/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1420090659,I_kwDOBm6k_c5UpN0j,1848,Private database page should show padlock on every table,9599,closed,0,,,3,2022-10-24T02:28:38Z,2022-10-24T02:50:29Z,2022-10-24T02:42:34Z,OWNER,,"Following: - #1829 https://latest.datasette.io/_internal looks like this: But those queries and tables are private too, and should also show the padlock icon.",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1848/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1426253476,I_kwDOBm6k_c5VAuak,1869,Release 0.63,9599,closed,0,,,3,2022-10-27T20:53:01Z,2022-10-27T22:24:38Z,2022-10-27T22:11:33Z,OWNER,,"Most of the release notes are already written: - https://github.com/simonw/datasette/releases/tag/0.63a0 - https://github.com/simonw/datasette/releases/tag/0.63a1",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1869/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1428560020,I_kwDOBm6k_c5VJhiU,1872,"SITE-BUSTING ERROR: ""render_template() called before await ds.invoke_startup()""",192568,closed,0,,,3,2022-10-30T02:28:39Z,2022-10-30T06:26:01Z,2022-10-30T06:26:01Z,CONTRIBUTOR,,"1. My https://list.saferdisinfectants.org/disinfectants/listN page (linked from https://SaferDisinfectants.org ) has been running beautifully for a year and a half, including a GitHub Actions workflow that's been routinely updating the database. 2. I received a recent report that the list page is down. I don't know when it went down, but the content is replaced with: ""render_template() called before await ds.invoke_startup()"" 3. The local datasette repo runs without incident. 4. The site is hosted on vercel, linked to my github repo. Perhaps some vercel changes were made, but not by anyone on our side. Here is a screenshot of the current project settings: Here a screenshot of the latest deployment status: This is my repository: https://github.com/mroswell/list-N (I notice: datasette==0.59 in my requirements.txt file) Because it's been long while since I actively worked on this or any other datasette project, I forget a lot of what I knew at one point. Perhaps some configuration file could be missing? Or perhaps I just need to know the right incantation to add to that vercel settings page. Help is welcome as the nonprofit org is soon hosting its annual conference, and we'd love to have the page working again. 
",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1872/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1450312343,I_kwDOBm6k_c5WcgKX,1892,Merge 1.0-dev branch back to main,9599,closed,0,,8658075,3,2022-11-15T20:04:25Z,2022-11-29T19:40:23Z,2022-11-29T19:40:23Z,OWNER,,"I'm committed enough to the 1.0 work now that I'm ready for the `main` branch to reflect that instead. If I need to make any dot-releases against 0.63 I can do those from a branch.",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1892/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1470320227,I_kwDOBm6k_c5Xo05j,1923,latest.datasette.io Cloud Run deploys failing,9599,closed,0,,,3,2022-11-30T22:49:34Z,2022-11-30T23:04:56Z,2022-11-30T23:04:56Z,OWNER,,"https://github.com/simonw/datasette/actions/runs/3587402085/jobs/6038106719v ``` Warning: ""service_account_key"" has been deprecated. Please switch to using google-github-actions/auth which supports both Workload Identity Federation and Service Account Key JSON authentication. For more details, see https://github.com/google-github-actions/setup-gcloud#authorization Error: google-github-actions/setup-gcloud failed with: failed to execute command `gcloud --quiet auth activate-service-account *** --key-file -`: /opt/hostedtoolcache/gcloud/275.0.0/x64/lib/googlecloudsdk/core/console/console_io.py:544: SyntaxWarning: ""is"" with a literal. Did you mean ""==""? if answer is None or (answer is '' and default is not None): ERROR: gcloud failed to load: module 'collections' has no attribute 'MutableMapping' ```",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1923/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1525815985,I_kwDOBm6k_c5a8hqx,1983,Make CustomJSONEncoder a documented public API,9599,open,0,,,3,2023-01-09T15:27:05Z,2023-01-09T15:35:58Z,,OWNER,,It's used by `datasette-geojson` here: https://github.com/eyeseast/datasette-geojson/commit/902bf135a5a33a0dc8264673d00a59a67cb05152,107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1983/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1515185383,I_kwDOBm6k_c5aT-Tn,1971,Upgrade for Sphinx 6.0 (once Furo has support for it),9599,closed,0,,,3,2022-12-31T19:04:35Z,2023-01-10T02:02:34Z,2023-01-10T02:02:34Z,OWNER,,"A deployment of #1967 to ReadTheDocs just failed like this: https://readthedocs.org/projects/datasette/builds/19045460/ ``` Running Sphinx v6.0.0 making output directory... done building [mo]: targets for 0 po files that are out of date building [html]: targets for 28 source files that are out of date updating environment: [new config] 28 added, 0 changed, 0 removed reading sources... [ 3%] authentication reading sources... [ 7%] binary_data reading sources... 
[ 10%] changelog Traceback (most recent call last): File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/statemachine.py"", line 299, in next_line self.line = self.input_lines[self.line_offset] File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/statemachine.py"", line 1136, in __getitem__ return self.data[i] IndexError: list index out of range During handling of the above exception, another exception occurred: Traceback (most recent call last): File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/statemachine.py"", line 226, in run self.next_line() File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/statemachine.py"", line 302, in next_line raise EOFError EOFError During handling of the above exception, another exception occurred: Traceback (most recent call last): File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/sphinx/cmd/build.py"", line 281, in build_main app.build(args.force_all, args.filenames) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/sphinx/application.py"", line 344, in build self.builder.build_update() File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/sphinx/builders/__init__.py"", line 310, in build_update self.build(to_build, File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/sphinx/builders/__init__.py"", line 326, in build updated_docnames = set(self.read()) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/sphinx/builders/__init__.py"", line 433, in read self._read_serial(docnames) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/sphinx/builders/__init__.py"", line 454, in _read_serial self.read_doc(docname) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/sphinx/builders/__init__.py"", line 510, in read_doc publisher.publish() File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/core.py"", line 224, in publish self.document = self.reader.read(self.source, self.parser, File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/sphinx/io.py"", line 103, in read self.parse() File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/readers/__init__.py"", line 76, in parse self.parser.parse(self.input, document) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/sphinx/parsers.py"", line 78, in parse self.statemachine.run(inputlines, document, inliner=self.inliner) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 169, in run results = StateMachineWS.run(self, input_lines, input_offset, File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/statemachine.py"", line 233, in run context, next_state, result = self.check_line( File 
""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/statemachine.py"", line 445, in check_line return method(match, context, next_state) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 3024, in text self.section(title.lstrip(), source, style, lineno + 1, messages) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 325, in section self.new_subsection(title, lineno, messages) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 391, in new_subsection newabsoffset = self.nested_parse( File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 279, in nested_parse state_machine.run(block, input_offset, memo=self.memo, File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 195, in run results = StateMachineWS.run(self, input_lines, input_offset) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/statemachine.py"", line 233, in run context, next_state, result = self.check_line( File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/statemachine.py"", line 445, in check_line return method(match, context, next_state) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 2785, in underline self.section(title, source, style, lineno - 1, messages) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 325, in section self.new_subsection(title, lineno, messages) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 391, in new_subsection newabsoffset = self.nested_parse( File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 279, in nested_parse state_machine.run(block, input_offset, memo=self.memo, File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 195, in run results = StateMachineWS.run(self, input_lines, input_offset) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/statemachine.py"", line 233, in run context, next_state, result = self.check_line( File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/statemachine.py"", line 445, in check_line return method(match, context, next_state) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 1273, in bullet i, blank_finish = self.list_item(match.end()) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 1295, in list_item self.nested_parse(indented, 
input_offset=line_offset, File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 279, in nested_parse state_machine.run(block, input_offset, memo=self.memo, File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 195, in run results = StateMachineWS.run(self, input_lines, input_offset) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/statemachine.py"", line 239, in run result = state.eof(context) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 2725, in eof self.blank(None, context, None) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 2716, in blank paragraph, literalnext = self.paragraph( File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 416, in paragraph textnodes, messages = self.inline_text(text, lineno) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 425, in inline_text nodes, messages = self.inliner.parse(text, lineno, File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 649, in parse before, inlines, remaining, sysmessages = method(self, match, File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 792, in interpreted_or_phrase_ref nodelist, messages = self.interpreted(rawsource, escaped, role, File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 889, in interpreted nodes, messages2 = role_fn(role, rawsource, text, lineno, self) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/sphinx/ext/extlinks.py"", line 101, in role title = caption % part TypeError: not all arguments converted during string formatting Exception occurred: File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/sphinx/ext/extlinks.py"", line 101, in role title = caption % part TypeError: not all arguments converted during string formatting The full traceback has been saved in /tmp/sphinx-err-kq7ylgqo.log, if you want to report the issue to the developers. Please also report this if it was a user error, so that a better error message can be provided next time. A bug report can be filed in the tracker at . Thanks! 
```",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1971/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1531991339,I_kwDOBm6k_c5bUFUr,1989,Suggestion: Hiding columns,116795,open,0,,,3,2023-01-13T09:33:32Z,2023-03-31T06:18:05Z,,NONE,,As there's the possibility of [hiding tables](https://docs.datasette.io/en/stable/metadata.html#hiding-tables) - I've run into the **need of hiding specific columns** - data that's either not relevant for public or can't be shown due to privacy reasons. ,107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1989/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1615891776,I_kwDOBm6k_c5gUI1A,2037,Test failure: FAILED tests/test_cli.py::test_install_requirements - FileNotFoundError,9599,closed,0,,,3,2023-03-08T20:30:06Z,2023-03-09T22:33:39Z,2023-03-09T22:33:39Z,OWNER,,"> FAILED tests/test_cli.py::test_install_requirements - FileNotFoundError: [Errno 2] No such file or directory From https://github.com/simonw/datasette/actions/runs/4348548218/jobs/7597208191 ``` =================================== FAILURES =================================== __________________________ test_install_requirements ___________________________ run_module = @mock.patch(""datasette.cli.run_module"") def test_install_requirements(run_module): runner = CliRunner() > with runner.isolated_filesystem(): /home/runner/work/datasette/datasette/tests/test_cli.py:184: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /opt/hostedtoolcache/Python/3.9.16/x64/lib/python3.9/contextlib.py:119: in __enter__ return next(self.gen) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , temp_dir = None @contextlib.contextmanager def isolated_filesystem( self, temp_dir: t.Optional[t.Union[str, os.PathLike]] = None ) -> t.Iterator[str]: """"""A context manager that creates a temporary directory and changes the current working directory to it. This isolates tests that affect the contents of the CWD to prevent them from interfering with each other. :param temp_dir: Create the temporary directory under this directory. If given, the created directory is not removed when exiting. .. versionchanged:: 8.0 Added the ``temp_dir`` parameter. 
"""""" > cwd = os.getcwd() E FileNotFoundError: [Errno 2] No such file or directory /opt/hostedtoolcache/Python/3.9.16/x64/lib/python3.9/site-packages/click/testing.py:466: FileNotFoundError ``` Not sure why it only affected the ""[Calculate test coverage](https://github.com/simonw/datasette/actions/workflows/test-coverage.yml)"" one.",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2037/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1781022369,I_kwDOBm6k_c5qKD6h,2091,Drop support for Python 3.7,9599,closed,0,,,3,2023-06-29T15:06:38Z,2023-08-23T18:18:18Z,2023-08-23T18:18:18Z,OWNER,,"It's EOL now, as of 2023-06-27 (two days ago): https://devguide.python.org/versions/ ",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2091/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1811824307,I_kwDOBm6k_c5r_j6z,2105,When reverse proxying datasette with nginx an URL element gets erronously added,2235371,open,0,,,3,2023-07-19T12:16:53Z,2023-07-21T21:17:09Z,,NONE,,"I use this nginx config: ``` location /datasette-llm { return 302 /datasette-llm/; } location /datasette-llm/ { proxy_set_header Upgrade $http_upgrade; proxy_set_header Connection ""Upgrade""; proxy_http_version 1.1; proxy_set_header X-Real-IP $remote_addr; proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for; proxy_set_header X-Forwarded-Proto https; proxy_set_header X-Forwarded-Host $http_host; proxy_set_header Host $host; proxy_max_temp_file_size 0; proxy_pass http://127.0.0.1:8001/datasette-llm/; proxy_redirect http:// https://; proxy_buffering off; proxy_request_buffering off; proxy_set_header Origin ''; client_max_body_size 0; auth_basic ""datasette-llm""; auth_basic_user_file /etc/nginx/custom-userdb; } ``` Then I start datasette with this command: ``` datasette serve --setting base_url /datasette-llm/ $(llm logs path) ``` Everything else works right, except the links in ""This data as json, CSV"". 
They get an extra URL element ""datasette-llm"" like this: https://192.168.1.3:5432/datasette-llm/datasette-llm/logs.json?sql=select+*+from+_llm_migrations https://192.168.1.3:5432/datasette-llm/datasette-llm/logs.csv?sql=select+*+from+_llm_migrations&_size=max When I remove that extra ""datasette-llm"" from the URL, those links work too.",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2105/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1816857442,I_kwDOBm6k_c5sSwti,2106,`datasette install -e` option,9599,closed,0,,,3,2023-07-22T18:33:42Z,2023-07-26T18:28:33Z,2023-07-22T18:42:54Z,OWNER,,"As seen in LLM and now in `sqlite-utils` too: - https://github.com/simonw/sqlite-utils/issues/570 Useful for developing plugins, see tutorial at https://llm.datasette.io/en/stable/plugins/tutorial-model-plugin.html",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2106/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1822939274,I_kwDOBm6k_c5sp9iK,2113,Implement and document extras for the new query view page,9599,open,0,,8755003,3,2023-07-26T18:24:01Z,2023-08-09T17:35:22Z,,OWNER,,- #2109 ,107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2113/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1822940263,I_kwDOBm6k_c5sp9xn,2114,Implement canned queries against new query JSON work,9599,closed,0,,9700784,3,2023-07-26T18:24:50Z,2023-08-09T15:26:58Z,2023-08-09T15:26:57Z,OWNER,,- #2109 ,107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2114/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1822949756,I_kwDOBm6k_c5sqAF8,2116,Turn DatabaseDownload into an async view function,9599,closed,0,,9700784,3,2023-07-26T18:31:59Z,2023-07-26T18:44:00Z,2023-07-26T18:44:00Z,OWNER,,"A minor refactor, but it is a good starting point for this new branch. Refs: - #2109",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2116/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1823393475,I_kwDOBm6k_c5srsbD,2119,"database color shows only on index page, not other pages",9599,closed,0,,3268330,3,2023-07-27T00:19:39Z,2023-08-11T05:25:45Z,2023-08-11T05:16:24Z,OWNER,,"I think this has been a bug for a long time. https://latest.datasette.io/ currently shows: Those colors are based on a hash of the database name. But when you click through to https://latest.datasette.io/fixtures It's red on all sub-pages too.",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2119/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1843600087,I_kwDOBm6k_c5t4xrX,2135,Release notes for 1.0a3,9599,closed,0,,9700784,3,2023-08-09T16:09:26Z,2023-08-09T19:17:07Z,2023-08-09T19:17:06Z,OWNER,,118 commits! 
https://github.com/simonw/datasette/compare/1.0a2...26be9f0445b753fb84c802c356b0791a72269f25,107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2135/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1838266862,I_kwDOBm6k_c5tkbnu,2126,Permissions in metadata.yml / metadata.json,36199671,closed,0,,,3,2023-08-06T16:24:10Z,2023-08-11T05:52:30Z,2023-08-11T05:52:29Z,NONE,,"https://docs.datasette.io/en/latest/authentication.html#other-permissions-in-metadata says the following: > For all other permissions, you can use one or more ""permissions"" blocks in your metadata. > To grant access to the permissions debug tool to all signed in users you can grant permissions-debug to any actor with an id matching the wildcard * by adding this a the root of your metadata: ```yaml permissions: debug-menu: id: '*' ``` I tried this. My `metadata.yml` file looks like: ```yaml permissions: debug-menu: id: '*' permissions-debug: id: '*' plugins: datasette-auth-passwords: myuser_password_hash: $env: ""PASSWORD_HASH_MYUSER"" ``` And then I run ```zsh datasette -m metadata.yml tiddlywiki.db --root ``` And I open a session for the ""root"" user of datasette with the link given. I open a private browser session and log in as ""myuser"" from http://127.0.0.1:8001/-/login Then I check http://127.0.0.1:8001/-/actor which confirms that I am logged in as the ""myuser"" actor ```json { ""actor"": { ""id"": ""myuser"" } } ``` In the session where I am logged in as ""myuser"" I then try to go to http://127.0.0.1:8001/-/permissions But all I get there as the logged in user ""myuser"" is > Forbidden > > Permission denied And then if I check the http://127.0.0.1:8001/-/permissions as the datasette ""root"" user from another browser session, I see: > permissions-debug checked at 2023-08-06T16:22:58.997841 ✗ (used default) > > Actor: {""id"": ""myuser""} It seems that in spite of having tried to give the `permissions-debug` permission to the ""myuser"" user in my `metadata.yml` file, datasette does not agree that ""myuser"" has permission `permissions-debug`.. What do I need to do differently so that my ""myuser"" user is able to access http://127.0.0.1:8001/-/permissions ?",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2126/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1838469176,I_kwDOBm6k_c5tlNA4,2127,Context base class to support documenting the context,9599,open,0,,3268330,3,2023-08-07T00:01:02Z,2023-08-10T01:30:25Z,,OWNER,,"This idea first came up here: - https://github.com/simonw/datasette/issues/2112#issuecomment-1652751140 If `datasette.render_template(...)` takes an optional `Context` subclass as an alternative to a context dictionary, I could then use dataclasses to define the context made available to specific templates - which then gives me something I can use to help document what they are. 
Also refs: - https://github.com/simonw/datasette/issues/1510",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2127/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1865869205,I_kwDOBm6k_c5vNueV,2157,"Proposal: Make the `_internal` database persistent, customizable, and hidden",15178711,open,0,,,3,2023-08-24T20:54:29Z,2023-08-31T02:45:56Z,,CONTRIBUTOR,,"The current `_internal` database is used by Datasette core to cache info about databases/tables/columns/foreign keys of databases in a Datasette instance. It's a temporary database created at startup, that can only be seen by the root user. See an [example `_internal` DB here](https://latest.datasette.io/_internal), after [logging in as root](https://latest.datasette.io/login-as-root). The current `_internal` database has a few rough edges: - It's part of `datasette.databases`, so many plugins have to specifically exclude `_internal` from their queries [examples here](https://github.com/search?q=datasette+hookimpl+%22_internal%22+language%3APython+-path%3Adatasette%2F&ref=opensearch&type=code) - It's only used by Datasette core and can't be used by plugins or 3rd parties - It's created from scratch at startup and stored in memory. Why is fine, the performance is great, but persistent storage would be nice. Additionally, it would be really nice if plugins could use this `_internal` database to store their own configuration, secrets, and settings. For example: - `datasette-auth-tokens` [creates a `_datasette_auth_tokens` table](https://github.com/simonw/datasette-auth-tokens/blob/main/datasette_auth_tokens/__init__.py#L15) to store auth token metadata. This could be moved into the `_internal` database to avoid writing to the gues database - `datasette-socrata` [creates a `socrata_imports`](https://github.com/simonw/datasette-socrata/blob/1409aa9b4d2fc3aff286b52e73af33b5786d56d0/datasette_socrata/__init__.py#L190-L198) table, which also can be in `_internal` - `datasette-upload-csvs` [creates a `_csv_progress_`](https://github.com/simonw/datasette-upload-csvs/blob/main/datasette_upload_csvs/__init__.py#L154) table, which can be in `_internal` - `datasette-write-ui` wants to have the ability for users to toggle whether a table appears editable, which can be either in `datasette.yaml` or on-the-fly by storing config in `_internal` In general, these are specific features that Datasette plugins would have access to if there was a central internal database they could read/write to: - **Dynamic configuration**. Changing the `datasette.yaml` file works, but can be tedious to restart the server every time. Plugins can define their own configuration table in `_internal`, and could read/write to it to store configuration based on user actions (cell menu click, API access, etc.) - **Caching**. If a plugin or Datasette Core needs to cache some expensive computation, they can store it inside `_internal` (possibly as a temporary table) instead of managing their own caching solution. - **Audit logs**. If a plugin performs some sensitive operations, they can log usage info to `_internal` for others to audit later. - **Long running process status**. Many plugins (`datasette-upload-csvs`, `datasette-litestream`, `datasette-socrata`) perform tasks that run for a really long time, and want to give continue status updates to the user. They can store this info inside` _internal` - **Safer authentication**. 
Passwords and authentication plugins usually store credentials/hashed secrets in configuration files or environment variables, which can be difficult to handle. Now, they can store them in `_internal` ## Proposal - We remove `_internal` from [`datasette.databases`](https://docs.datasette.io/en/latest/internals.html#databases) property. - We add new `datasette.get_internal_db()` method that returns the `_internal` database, for plugins to use - We add a new `--internal internal.db` flag. If provided, then the `_internal` DB will be sourced from that file, and further updates will be persisted to that file (instead of an in-memory database) - When creating internal.db, create a new `_datasette_internal` table to mark it a an ""datasette internal database"" - In `datasette serve`, we check for the existence of the `_datasette_internal` table. If it exists, we assume the user provided that file in error and raise an error. This is to limit the chance that someone accidentally publishes their internal database to the internet. We could optionally add a `--unsafe-allow-internal` flag (or database plugin) that allows someone to do this if they really want to. ## New features unlocked with this These features don't really need a standardized `_internal` table per-say (plugins could currently configure their own long-time storage features if they really wanted to), but it would make it much simpler to create these kinds of features with a persistent application database. - **`datasette-comments`** : A plugin for commenting on rows or specific values in a database. Comment contents + threads + email notification info can be stored in `_internal` - **Bookmarks**: ""Bookmarking"" an SQL query could be stored in `_internal`, or a URL link shortener - **Webhooks**: If a plugin wants to either consume a webhook or create a new one, they can store hashed credentials/API endpoints in `_internal`",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2157/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1930008379,I_kwDOBm6k_c5zCZc7,2197,click-default-group-wheel dependency conflict,1176293,closed,0,,,3,2023-10-06T11:49:20Z,2023-10-12T21:53:17Z,2023-10-12T21:53:17Z,NONE,,"I upgraded my dependencies, then ran into this problem running `datasette inspect`: > env/lib/python3.9/site-packages/datasette/cli.py"", line 6, in > from click_default_group import DefaultGroup > ModuleNotFoundError: No module named 'click_default_group' Turns out the released version of datasette still depends on `click-default-group-wheel`, so `click-default-group` doesn't get installed/recognized: ``` $ virtualenv venv $ source venv/bin/activate $ pip install datasette $ pip list | grep click-default-group click-default-group 1.2.4 click-default-group-wheel 1.2.3 $ python -c ""from click_default_group import DefaultGroup"" Traceback (most recent call last): File """", line 1, in ModuleNotFoundError: No module named 'click_default_group' $ pip install --force-reinstall click-default-group ... ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts. datasette 0.64.4 requires click-default-group-wheel>=1.2.2, which is not installed. 
Successfully installed click-8.1.7 click-default-group-1.2.4 ```",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2197/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1004613267,I_kwDOCGYnMM474S6T,328,Invalid JSON output when no rows,12752,closed,0,,,3,2021-09-22T18:37:26Z,2021-09-22T20:21:34Z,2021-09-22T20:20:18Z,NONE,,"`sqlite-utils query` generates a JSON output with the result from the query: ```json [{...},{...}] ``` If no rows are returned by the query, I'm expecting an empty JSON array: ```json [] ``` But actually I'm getting an empty string. To be consistent, the output should be `[]` when the request succeeds (return code == `0`).",140912432,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/328/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1071531082,I_kwDOCGYnMM4_3kRK,349,A way of creating indexes on newly created tables,9599,open,0,,,3,2021-12-05T18:56:12Z,2021-12-07T01:04:37Z,,OWNER,,"I'm writing code for https://github.com/simonw/git-history/issues/33 that creates a table inside a loop: ```python item_pk = db[item_table].lookup( {""_item_id"": item_id}, item_to_insert, column_order=(""_id"", ""_item_id""), pk=""_id"", ) ``` I need to look things up by `_item_id` on this table, which means I need an index on that column (the table can get very big). But there's no mechanism in SQLite utils to detect if the table was created for the first time and add an index to it. And I don't want to run `CREATE INDEX IF NOT EXISTS` every time through the loop. This should work like the `foreign_keys=` mechanism. 
",140912432,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/349/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1072435124,I_kwDOCGYnMM4_7A-0,350,Optional caching mechanism for table.lookup(),9599,open,0,,,3,2021-12-06T17:54:25Z,2021-12-06T17:56:57Z,,OWNER,,"Inspired by work on `git-history` where I used this pattern: ```python column_name_to_id = {} def column_id(column): if column not in column_name_to_id: id = db[""columns""].lookup( {""namespace"": namespace_id, ""name"": column}, foreign_keys=((""namespace"", ""namespaces"", ""id""),), ) column_name_to_id[column] = id return column_name_to_id[column] ``` If you're going to be doing a large number of `table.lookup(...)` calls and you know that no other script will be modifying the database at the same time you can presumably get a big speedup using a Python in-memory cache - maybe even a LRU one to avoid memory bloat.",140912432,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/350/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1097087280,I_kwDOCGYnMM5BZDkw,368,Offer `python -m sqlite_utils` as an alternative to `sqlite-utils`,9599,closed,0,,7558727,3,2022-01-09T02:29:30Z,2022-01-10T19:27:20Z,2022-01-09T02:40:50Z,OWNER,,"> Add this to `sqlite_utils/cli.py`: > > ```python > if __name__ == ""__main__"": > cli() > ``` > Now the tool can be run using `python -m sqlite_utils.cli --help` _Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/364#issuecomment-1008214998_",140912432,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/368/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1097135732,I_kwDOCGYnMM5BZPZ0,373,List `--fmt` options in the docs ,9599,closed,0,,7558727,3,2022-01-09T08:22:11Z,2022-01-10T19:27:24Z,2022-01-09T17:49:00Z,OWNER,,https://sqlite-utils.datasette.io/en/stable/cli.html#table-formatted-output currently cheats and tells the user to run `--help` - can fix this using `cog`. ,140912432,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/373/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1097251014,I_kwDOCGYnMM5BZrjG,375,`sqlite-utils bulk` command,9599,closed,0,,7558727,3,2022-01-09T17:12:38Z,2022-01-11T02:12:58Z,2022-01-11T02:10:55Z,OWNER,,"The `.executemany()` method is a very efficient way to execute the same SQL query against a huge list of parameters. `sqlite-utils insert` supports a bunch of ways of loading a list of dictionaries - from CSV, TSV, JSON, newline JSON and more thanks to: - #361 What if you could load a list of dictionaries and provide a SQL query with `:named` parameters that correspond to keys in those dictionaries instead? 
This would need to be a new command - I thought about adding a `--sql` option to `insert` but that doesn't make sense as that command already requires a table name.",140912432,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/375/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1123903919,I_kwDOCGYnMM5C_Wmv,397,Support IF NOT EXISTS for table creation,738408,closed,0,,,3,2022-02-04T07:41:15Z,2022-02-06T01:30:46Z,2022-02-06T01:29:01Z,NONE,,"Currently, I have a bunch of code that looks like this: ```python subjects = db[""subjects""] if db[""subjects""].exists() else db[""subjects""].create({ ... }) ``` It would be neat if sqlite-utils could simplify that by supporting `CREATE TABLE IF NOT EXISTS`, so that I'd be able to write, e.g. ```python subjects = db[""subjects""].create({...}, if_not_exists=True) ```",140912432,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/397/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1149661489,I_kwDOCGYnMM5EhnEx,409,`with db:` for transactions,9599,open,0,,,3,2022-02-24T19:22:06Z,2022-10-01T03:42:50Z,,OWNER,,This can be a documented wrapper around `with db.conn:`.,140912432,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/409/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1175744654,I_kwDOCGYnMM5GFHCO,417,insert fails on JSONL with whitespace,9954,closed,0,,,3,2022-03-21T17:58:14Z,2022-03-25T21:19:06Z,2022-03-25T21:17:13Z,NONE,,"Any JSON that is newline-delimited and has whitespace (newlines) between the start of a JSON object and an attribute fails due to a parse error. e.g. given the valid JSONL: ```{ ""attribute"": ""value"" } { ""attribute"": ""value2"" } ``` I would expect that `sqlite-utils insert --nl my.db mytable file.jsonl` would properly import the data into `mytable`. However, the following error is thrown instead: `json.decoder.JSONDecodeError: Expecting property name enclosed in double quotes: line 2 column 1 (char 2)` It makes sense that since the file is intended to be newline separated, the thing being parsed is ""{"" (which obviously fails), however the default newline-separated output of `jq` isn't compact. Using `jq -c` avoids this problem, but the fix is unintuitive and undocumented. Proposed solutions: 1. Default to a ""loose"" newline-separated parse; this could be implemented internally as [the equivalent of] a `jq -c` filter ahead of the insert step. 2. Catch the JSONDecodeError (or pre-empt it in the case of a record === ""{\n"") and give the user a ""it looks like your json isn't _actually_ newline-delimited; try running it through `jq -c` instead"" error message. 
It might just have been too early in the morning when I was playing with this, but running pipes of data through sqlite-utils without the 'knack' of it led to some false starts.",140912432,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/417/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1227571375,I_kwDOCGYnMM5JK0Cv,431,Allow making m2m relation of a table to itself,738408,open,0,,,3,2022-05-06T08:30:43Z,2022-06-23T14:12:51Z,,NONE,,"I am building a database, in which one of the tables has a many-to-many relationship to itself. As far as I can see, this is not (yet) possible using `.m2m()` in sqlite-utils. This may be a bit of a niche use case, so feel free to close this issue if you feel it would introduce too much complexity compared to the benefits. Example: suppose I have a table of people, and I want to store the information that John and Mary have two children, Michael and Suzy. It would be neat if I could do something like this: ```python from sqlite_utils import Database db = Database(memory=True) db[""people""].insert({""name"": ""John""}, pk=""name"").m2m( ""people"", [{""name"": ""Michael""}, {""name"": ""Suzy""}], m2m_table=""parent_child"", pk=""name"" ) db[""people""].insert({""name"": ""Mary""}, pk=""name"").m2m( ""people"", [{""name"": ""Michael""}, {""name"": ""Suzy""}], m2m_table=""parent_child"", pk=""name"" ) ``` But if I do that, the many-to-many table `parent_child` has only one column: ``` CREATE TABLE [parent_child] ( [people_id] TEXT REFERENCES [people]([name]), PRIMARY KEY ([people_id], [people_id]) ) ``` This could be solved by adding one or two keyword_arguments to `.m2m()`, e.g. `.m2m(..., left_name=None, right_name=None)` or `.m2m(..., names=(None, None))`.",140912432,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/431/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1243151184,I_kwDOCGYnMM5KGPtQ,434,`detect_fts()` identifies the wrong table if tables have names that are subsets of each other,559711,closed,0,,,3,2022-05-20T13:28:31Z,2022-06-14T23:24:09Z,2022-06-14T23:24:09Z,NONE,,"Windows 10 Python 3.9.6 When I was running a full text search through the Python library, I noticed that the query was being run on a different full text search table than the one I was trying to search. I took a look at the following function https://github.com/simonw/sqlite-utils/blob/841ad44bacaff05ec79ef78166d12e80c82ba6d7/sqlite_utils/db.py#L2213 and noticed: ```python sql LIKE '%VIRTUAL TABLE%USING FTS%content=%{table}%' ``` My database contains tables with similar names and %{table}% was matching another table that ended differently in its name. I have included a sample test that shows this occurring: I search for Marsupials in db[""books""] and The Clue of the Broken Blade is returned. This occurs since the search for Marsupials was ""successfully"" done against db[""booksb""] and rowid 1 is returned. ""The Clue of the Broken Blade"" has a rowid of 1 in db[""books""] and this is what is returned from the search. ```python def test_fts_search_with_similar_table_names(fresh_db): db = Database(memory=True) db[""books""].insert_all( [ { ""title"": ""The Clue of the Broken Blade"", ""author"": ""Franklin W. 
Dixon"", }, { ""title"": ""Habits of Australian Marsupials"", ""author"": ""Marlee Hawkins"", }, ] ) db[""booksb""].insert( { ""title"": ""Habits of Australian Marsupials"", ""author"": ""Marlee Hawkins"", } ) db[""booksb""].enable_fts([""title"", ""author""]) db[""books""].enable_fts([""title"", ""author""]) query = ""Marsupials"" assert [ { ""rowid"": 1, ""title"": ""Habits of Australian Marsupials"", ""author"": ""Marlee Hawkins"", }, ] == list(db[""books""].search(query)) ``` ",140912432,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/434/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1277295119,I_kwDOCGYnMM5MIfoP,445,`sqlite_utils.utils.TypeTracker` should be a documented API,9599,closed,0,,,3,2022-06-20T19:08:28Z,2022-06-20T19:49:02Z,2022-06-20T19:46:58Z,OWNER,,"I've used it in a couple of external places now: - https://github.com/simonw/datasette-socrata/blob/32fb256a461bf0e790eca10bdc7dd9d96c20f7c4/datasette_socrata/__init__.py#L264-L280 - https://github.com/simonw/datasette-lite/blob/caa8eade10f0321c64f9f65c4561186f02d57c5b/webworker.js#L55-L64 Refs: - https://github.com/simonw/datasette-lite/issues/32",140912432,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/445/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1278571700,I_kwDOCGYnMM5MNXS0,447,Incorrect syntax highlighting in docs CLI reference,9599,closed,0,,,3,2022-06-21T14:53:10Z,2022-06-21T18:48:47Z,2022-06-21T18:48:46Z,OWNER,,"https://sqlite-utils.datasette.io/en/stable/cli-reference.html#insert ![CE020DDA-27FB-49C3-9EA6-37457DC4C321](https://user-images.githubusercontent.com/9599/174830380-06530537-b870-41c0-a8af-03c7fa720c6f.jpeg) It looks like Python keywords are being incorrectly highlighted here.",140912432,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/447/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1353074021,I_kwDOCGYnMM5QpkVl,474,Add an option for specifying column names when inserting CSV data,14294,open,0,,,3,2022-08-27T15:29:59Z,2022-08-31T03:42:36Z,,NONE,,"https://sqlite-utils.datasette.io/en/stable/cli.html#csv-files-without-a-header-row > The first row of any CSV or TSV file is expected to contain the names of the columns in that file. > If your file does not include this row, you can use the `--no-headers` option to specify that the tool should not use that first row as headers. > If you do this, the table will be created with column names called `untitled_1` and `untitled_2` and so on. You can then rename them using the `sqlite-utils transform ... --rename` command. It would be nice to be able to specify the column names when importing CSV/TSV without a header row, via an extra command line option. 
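For comparison, the rename workaround via the Python API looks something like this (the database, table and column names here are illustrative):

```python
import sqlite_utils

db = sqlite_utils.Database(""data.db"")
# after inserting with --no-headers, rename the generated untitled_N columns
db[""mytable""].transform(rename={""untitled_1"": ""name"", ""untitled_2"": ""age""})
```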
(renaming a column of a large table can take a long time, which makes it an inconvenient workaround)",140912432,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/474/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1386562662,I_kwDOCGYnMM5SpURm,493,Tiny typographical error in install/uninstall docs,9599,open,0,,,3,2022-09-26T19:00:42Z,2022-10-25T21:31:15Z,,OWNER,,"Added in: - #483 I don't know how to fix this in Sphinx: I'm getting this: https://sqlite-utils.datasette.io/en/latest/cli.html#cli-install > The [insert –convert](https://sqlite-utils.datasette.io/en/latest/cli.html#cli-insert-convert) and [query –functions](https://sqlite-utils.datasette.io/en/latest/cli.html#cli-query-functions) options But I want it to display `insert --convert` and not `insert –convert` there. Here's the code: https://github.com/simonw/sqlite-utils/blob/85247038f70d7eb2f3e272cfeaa4c44459cafba8/docs/cli.rst#L2125",140912432,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/493/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1392690202,I_kwDOCGYnMM5TAsQa,495,Support JSON values returned from .convert() functions,649467,closed,0,,,3,2022-09-30T16:33:49Z,2022-10-25T21:23:37Z,2022-10-25T21:23:28Z,NONE,,"When using the convert function on a JSON column, the result of the conversion function must be a string. If the return value is either a dict (object) or a list (array), the convert call will error out with an unhelpful user-defined function exception. It makes sense that since the original column value was a string and required conversion to data structures, the result should be converted back into a JSON string as well. However, other functions auto-convert to JSON string representation, so the fact that convert doesn't could be surprising. At least the documentation should note this requirement, because the sqlite error messages won't readily reveal the issue. If only sqlite's JSON column type meant something :)",140912432,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/495/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1429029604,I_kwDOCGYnMM5VLULk,506,Make `cursor.rowcount` accessible (wontfix),9599,closed,0,,,3,2022-10-30T21:51:55Z,2022-11-01T17:37:47Z,2022-11-01T17:37:13Z,OWNER,,"In building this Datasette feature on top of `sqlite-utils` I thought it might be useful to expose the number of rows that had been affected by a bulk insert or update - the `cursor.rowcount`: - https://github.com/simonw/datasette/issues/1866 This isn't currently exposed by `sqlite-utils`.",140912432,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/506/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1434911255,I_kwDOCGYnMM5VhwIX,510,Cannot enable FTS5 despite it being available,1176293,closed,0,,,3,2022-11-03T16:03:49Z,2022-11-18T18:37:52Z,2022-11-17T10:36:28Z,NONE,,"When I do `sqlite-utils enable-fts my.db table_name column_name` (with or without `--fts5`), I get an FTS4 virtual table instead of the expected FTS5. FTS5 is however available and Python/SQLite versions do not seem to be the issue. 
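One way to double-check that the interpreter running sqlite-utils really can create FTS5 tables - a diagnostic sketch, not a fix:

```python
import sqlite3

conn = sqlite3.connect("":memory:"")
# this raises ""no such module: fts5"" if the FTS5 module is unavailable
conn.execute(""CREATE VIRTUAL TABLE probe USING fts5(content)"")
print([row[0] for row in conn.execute(""PRAGMA compile_options"") if ""FTS5"" in row[0]])
```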
I can manually create the FTS5 virtual table, and then Datasette also works with it from this same Python environment. `>>> sqlite3.version` `2.6.0` `>>> sqlite3.sqlite_version` `3.39.4` `PRAGMA compile_options;` includes `ENABLE_FTS5`. `sqlite-utils, version 3.30`. Any ideas what's happening and how to fix?",140912432,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/510/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1450952393,I_kwDOCGYnMM5We8bJ,512,mypy failures in CI,9599,closed,0,,,3,2022-11-16T06:22:48Z,2022-11-16T07:49:51Z,2022-11-16T07:49:50Z,OWNER,,"https://github.com/simonw/sqlite-utils/actions/runs/3472012235 failed on Python 3.11: Truncated output: ``` sqlite_utils/db.py:2467: note: PEP 484 prohibits implicit Optional. Accordingly, mypy has changed its default to no_implicit_optional=True sqlite_utils/db.py:2467: note: Use https://github.com/hauntsaninja/no_implicit_optional to automatically upgrade your codebase sqlite_utils/db.py:2530: error: Incompatible default for argument ""where"" (default has type ""None"", argument has type ""str"") [assignment] sqlite_utils/db.py:2530: note: PEP 484 prohibits implicit Optional. Accordingly, mypy has changed its default to no_implicit_optional=True sqlite_utils/db.py:2530: note: Use https://github.com/hauntsaninja/no_implicit_optional to automatically upgrade your codebase sqlite_utils/db.py:2658: error: Argument 1 to ""count_where"" of ""Queryable"" has incompatible type ""Optional[str]""; expected ""str"" [arg-type] Found 23 errors in 1 file (checked 51 source files) ``` Best look at https://github.com/hauntsaninja/no_implicit_optional",140912432,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/512/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1516644980,I_kwDOCGYnMM5aZip0,520,rows_from_file() raises confusing error if file-like object is not in binary mode,9599,closed,0,,,3,2023-01-02T19:00:14Z,2023-05-08T22:08:07Z,2023-05-08T22:08:07Z,OWNER,,"I got this error: ``` File ""/Users/simon/Dropbox/Development/openai-to-sqlite/openai_to_sqlite/cli.py"", line 27, in embeddings rows, _ = rows_from_file(input) ^^^^^^^^^^^^^^^^^^^^^ File ""/Users/simon/.local/share/virtualenvs/openai-to-sqlite-jt4obeb2/lib/python3.11/site-packages/sqlite_utils/utils.py"", line 305, in rows_from_file first_bytes = buffered.peek(2048).strip() ^^^^^^^^^^^^^^^^^^^ ``` From this code: ```python @cli.command() @click.argument( ""db_path"", type=click.Path(file_okay=True, dir_okay=False, allow_dash=False), ) @click.option( ""-i"", ""--input"", type=click.File(""r""), default=""-"", ) def embeddings(db_path, input): ""Store embeddings for one or more text documents"" click.echo(""Here is some output"") db = sqlite_utils.Database(db_path) rows, _ = rows_from_file(input) print(list(rows)) ``` The error went away when I changed it to `type=click.File(""rb"")`. 
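For reference, the working pattern with a binary file handle looks like this (`data.csv` is a stand-in path):

```python
from sqlite_utils.utils import rows_from_file

# rows_from_file() peeks at raw bytes to detect the format, so the
# file-like object must be opened in binary mode
with open(""data.csv"", ""rb"") as fp:
    rows, format_used = rows_from_file(fp)
    print(list(rows), format_used)
```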
This should either be called out in the documentation or `rows_from_file()` should be fixed to handle text-mode files in addition to binary files.",140912432,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/520/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1718595700,I_kwDOCGYnMM5mb7B0,550,AttributeError: 'EntryPoints' object has no attribute 'get' for flake8 on Python 3.7,9599,closed,0,,,3,2023-05-21T18:24:39Z,2023-05-21T18:42:25Z,2023-05-21T18:41:58Z,OWNER,,"https://github.com/simonw/sqlite-utils/actions/runs/5039064797/jobs/9036965488 ``` Traceback (most recent call last): File ""/opt/hostedtoolcache/Python/3.7.16/x64/bin/flake8"", line 8, in sys.exit(main()) File ""/opt/hostedtoolcache/Python/3.7.16/x64/lib/python3.7/site-packages/flake8/main/cli.py"", line 22, in main app.run(argv) File ""/opt/hostedtoolcache/Python/3.7.16/x64/lib/python3.7/site-packages/flake8/main/application.py"", line 363, in run self._run(argv) File ""/opt/hostedtoolcache/Python/3.7.16/x64/lib/python3.7/site-packages/flake8/main/application.py"", line 350, in _run self.initialize(argv) File ""/opt/hostedtoolcache/Python/3.7.16/x64/lib/python3.7/site-packages/flake8/main/application.py"", line 330, in initialize self.find_plugins(config_finder) File ""/opt/hostedtoolcache/Python/3.7.16/x64/lib/python3.7/site-packages/flake8/main/application.py"", line 153, in find_plugins self.check_plugins = plugin_manager.Checkers(local_plugins.extension) File ""/opt/hostedtoolcache/Python/3.7.16/x64/lib/python3.7/site-packages/flake8/plugins/manager.py"", line 357, in __init__ self.namespace, local_plugins=local_plugins File ""/opt/hostedtoolcache/Python/3.7.16/x64/lib/python3.7/site-packages/flake8/plugins/manager.py"", line 238, in __init__ self._load_entrypoint_plugins() File ""/opt/hostedtoolcache/Python/3.7.16/x64/lib/python3.7/site-packages/flake8/plugins/manager.py"", line 254, in _load_entrypoint_plugins eps = importlib_metadata.entry_points().get(self.namespace, ()) AttributeError: 'EntryPoints' object has no attribute 'get' Error: Process completed with exit code 1. ```",140912432,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/550/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1816918185,I_kwDOCGYnMM5sS_ip,574,`prepare_connection()` plugin hook,9599,closed,0,,,3,2023-07-22T22:52:47Z,2023-07-22T23:13:14Z,2023-07-22T22:59:10Z,OWNER,,"> Splitting off an issue for `prepare_connection()` since Alex got the PR in seconds before I shipped 3.34! _Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/567#issuecomment-1646686424_ ",140912432,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/574/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1816852402,I_kwDOCGYnMM5sSvey,569,register_command plugin hook,9599,closed,0,,,3,2023-07-22T18:17:27Z,2023-07-22T19:19:35Z,2023-07-22T19:19:35Z,OWNER,,"> I'm going to start by adding the `register_command` hook using the exact same pattern as Datasette and LLM. 
_Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/567#issuecomment-1646643450_ ",140912432,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/569/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1856075668,I_kwDOCGYnMM5uoXeU,586,.transform() fails to drop column if table is part of a view,9599,open,0,,,3,2023-08-18T05:25:22Z,2023-08-18T06:13:47Z,,OWNER,,"I got this error trying to drop a column from a table that was part of a SQL view: > error in view plugins: no such table: main.pypi_releases Upon further investigation I found that this pattern seemed to fix it: ```python def transform_the_table(conn): # Run this in a transaction: with conn: # We have to read all the views first, because we need to drop and recreate them db = sqlite_utils.Database(conn) views = {v.name: v.schema for v in db.views if table.lower() in v.schema.lower()} for view in views.keys(): db[view].drop() db[table].transform( types=types, rename=rename, drop=drop, column_order=[p[0] for p in order_pairs], ) # Now recreate the views for name, schema in views.items(): db.create_view(name, schema) ``` So grab a copy of any view that might reference this table, start a transaction, drop those views, run the transform, recreate the views again. > I wonder if this should become an option in `sqlite-utils`? Maybe a `recreate_views=True` argument for `table.transform(...)`? Should it be opt-in or opt-out? _Originally posted by @simonw in https://github.com/simonw/datasette-edit-schema/issues/35#issuecomment-1683370548_ ",140912432,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/586/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1857851384,I_kwDOCGYnMM5uvI_4,587,New .add_foreign_key() can break if PRAGMA legacy_alter_table=ON and there's an invalid foreign key reference,9599,closed,0,,,3,2023-08-19T20:01:26Z,2023-08-19T20:04:33Z,2023-08-19T20:04:32Z,OWNER,,"Extremely detailed story of how I got to this point: - https://github.com/simonw/llm/issues/162 Steps to reproduce (only if that pragma is on though): ```bash python -c ' import sqlite_utils db = sqlite_utils.Database(memory=True) db.execute("""""" CREATE TABLE ""logs"" ( [id] INTEGER PRIMARY KEY, [model] TEXT, [prompt] TEXT, [system] TEXT, [prompt_json] TEXT, [options_json] TEXT, [response] TEXT, [response_json] TEXT, [reply_to_id] INTEGER, [chat_id] INTEGER REFERENCES [log]([id]), [duration_ms] INTEGER, [datetime_utc] TEXT ); """""") db[""logs""].add_foreign_key(""reply_to_id"", ""logs"", ""id"") ' ``` This succeeds in some environments, fails in others.",140912432,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/587/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1879209560,I_kwDOCGYnMM5wAnZY,589,Mechanism for de-registering registered SQL functions,9599,open,0,,,3,2023-09-03T19:32:39Z,2023-09-03T19:36:34Z,,OWNER,,I used a custom SQL function in a migration script and then realized that it should be de-registered before the end of the script to avoid leaking into the calling code.,140912432,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/589/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, 
""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1246826792,I_kwDODLZ_YM5KUREo,10,"When running `auth` command, don't overwrite an existing auth.json file",11887,closed,0,,,3,2022-05-24T16:42:20Z,2022-09-07T15:07:38Z,2022-08-22T16:17:19Z,NONE,,"Ran the `auth` command in the same directory I'd previously set up an auth.json file for `twitter-to-sqlite` and it was completely overwritten. Not the biggest issue, but still unexpected. Ideally, for me, the keys would just be added to the existing file, but getting a warning and a chance to back out would be a good solution as well.",213286752,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/pocket-to-sqlite/issues/10/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1345452427,I_kwDODLZ_YM5QMfmL,11,"-a option is used for ""--auth"" and for ""--all""",2467,closed,0,,,3,2022-08-21T10:50:48Z,2022-08-21T21:11:57Z,2022-08-21T21:11:57Z,NONE,,"I'm not sure which option is best, instead of -a -all.",213286752,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/pocket-to-sqlite/issues/11/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1616347574,I_kwDOJHON9s5gV4G2,1,Initial proof of concept with ChatGPT,9599,closed,0,,,3,2023-03-09T03:44:39Z,2023-03-09T03:51:55Z,2023-03-09T03:51:55Z,MEMBER,,I'm using ChatGPT to figure out enough AppleScript to get at my notes data.,611552758,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/1/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1618130434,I_kwDOJHON9s5gcrYC,11,Implement a SQL view to make it easier to query files in a nested folder,9599,open,0,,,3,2023-03-09T23:19:28Z,2023-03-09T23:24:01Z,,MEMBER,,"Working with nested data in SQL is tricky, can I make it easier with a view or canned query?",611552758,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/11/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 322741659,MDExOlB1bGxSZXF1ZXN0MTg3NzcwMzQ1,258,Add new metadata key persistent_urls which removes the hash from all database urls,247131,closed,0,,,3,2018-05-14T09:39:18Z,2018-05-21T07:38:15Z,2018-05-21T07:38:15Z,NONE,simonw/datasette/pulls/258,"Add new metadata key ""persistent_urls"" which removes the hash from all database urls when set to ""true"" This PR is just to gauge if this, or something like it, is something you would consider merging? I understand the reason why the substring of the hash is included in the url but there are some use cases where the urls should persist across deployments. For bookmarks for example or for scripts that use the JSON API. This is the initial commit for this feature. 
Tests and documentation updates to follow.",107914493,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/258/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 313494458,MDExOlB1bGxSZXF1ZXN0MTgxMDMzMDI0,200,Hide Spatialite system tables,45057,closed,0,,,3,2018-04-11T21:26:58Z,2018-04-12T21:34:48Z,2018-04-12T21:34:48Z,CONTRIBUTOR,simonw/datasette/pulls/200,They were getting on my nerves.,107914493,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/200/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 314319372,MDExOlB1bGxSZXF1ZXN0MTgxNjQyMTE0,205,Support filtering with units and more,45057,closed,0,,,3,2018-04-14T10:47:51Z,2018-04-14T15:24:04Z,2018-04-14T15:24:04Z,CONTRIBUTOR,simonw/datasette/pulls/205,"The first commit: * Adds units to exported JSON * Adds units key to metadata skeleton * Adds some docs for units The second commit adds filtering by units by the first method I mentioned in #203: ![image](https://user-images.githubusercontent.com/45057/38767463-7193be16-3fd9-11e8-8a5f-ac4159415c6d.png) [Try it here](https://wtr-api.herokuapp.com/wtr-663ea99/license_frequency?frequency__gt=50GHz&height__lt=50ft). I think it integrates pretty neatly. The third commit adds support for registering custom units with Pint from metadata.json. Probably pretty niche, but I need decibels!",107914493,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/205/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 403499298,MDExOlB1bGxSZXF1ZXN0MjQ3OTIzMzQ3,404,Experiment: run Jinja in async mode,9599,closed,0,,,3,2019-01-27T00:28:44Z,2019-11-12T05:02:18Z,2019-11-12T05:02:13Z,OWNER,simonw/datasette/pulls/404,"See http://jinja.pocoo.org/docs/2.10/api/#async-support Tests all pass. Have not checked performance difference yet. Creating pull request to run tests in Travis. 
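The relevant Jinja API, roughly - `enable_async=True` on the `Environment`, then `render_async()` in place of `render()`:

```python
import asyncio
from jinja2 import Environment

env = Environment(enable_async=True)
template = env.from_string(""Hello {{ name }}!"")

async def main():
    # render_async() can await async callables passed in the context
    print(await template.render_async(name=""Datasette""))

asyncio.run(main())
```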
This is not ready to merge - I'm not yet sure if this is a good idea.",107914493,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/404/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 452901999,MDExOlB1bGxSZXF1ZXN0Mjg1Njk4MzEw,501,Test against Python 3.8-dev using Travis,9599,closed,0,,,3,2019-06-06T08:37:53Z,2019-11-11T03:23:29Z,2019-11-11T03:23:29Z,OWNER,simonw/datasette/pulls/501,,107914493,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/501/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 505818256,MDExOlB1bGxSZXF1ZXN0MzI3MTcyNTQ1,590,Handle spaces in DB names,2657547,closed,0,,,3,2019-10-11T12:18:22Z,2019-11-04T23:16:31Z,2019-11-04T23:16:30Z,CONTRIBUTOR,simonw/datasette/pulls/590,Closes #503,107914493,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/590/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 499954048,MDExOlB1bGxSZXF1ZXN0MzIyNTI5Mzgx,578,Added support for multi arch builds,887095,closed,0,,,3,2019-09-29T18:43:03Z,2019-11-13T19:13:15Z,2019-11-13T19:13:15Z,NONE,simonw/datasette/pulls/578,Minor changes in Dockerfile and new Makefile to support Docker multi architecture builds. `make` will build one image per architecture and push them as one Docker manifest to Docker Hub. Feel free to change `IMAGE_NAME` to `datasetteproject/datasette` to update your official Docker Hub image(s).,107914493,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/578/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 509535510,MDExOlB1bGxSZXF1ZXN0MzMwMDc2MjYz,602,Offer to format readonly SQL,2657547,closed,0,,,3,2019-10-20T02:29:32Z,2019-11-04T07:29:33Z,2019-11-04T02:39:56Z,CONTRIBUTOR,simonw/datasette/pulls/602,"Following discussion in #601, this PR adds a ""Format SQL"" button to read-only SQL (if the SQL actually differs from the formatting result). It also removes a console error on readonly SQL queries.",107914493,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/602/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 562085508,MDExOlB1bGxSZXF1ZXN0MzcyNzYzOTA2,666,"Use inspect-file, if possible, for total row count",13896256,closed,0,,,3,2020-02-08T22:10:35Z,2020-03-09T02:47:15Z,2020-02-25T20:19:29Z,CONTRIBUTOR,simonw/datasette/pulls/666,"For large tables, counting the number of rows in the table can take a significant amount of time. 
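A sketch of the alternative described in the next sentence - reading the cached count out of `inspect-data.json` instead of running the query (the key layout shown is an assumption about `datasette inspect` output):

```python
import json

with open(""inspect-data.json"") as fp:
    inspect_data = json.load(fp)
# assumed shape: {database: {""tables"": {table: {""count"": n}}}}
count = inspect_data[""fixtures""][""tables""][""my_table""][""count""]
```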
Instead, where an inspect-file is provided for an immutable database, look up the row-count for a plain count(*).",107914493,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/666/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 585597133,MDExOlB1bGxSZXF1ZXN0MzkxOTI0NTA5,703,WIP implementation of writable canned queries,9599,closed,0,,,3,2020-03-21T22:23:51Z,2020-06-03T00:08:14Z,2020-06-02T23:57:35Z,OWNER,simonw/datasette/pulls/703,Refs #698.,107914493,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/703/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1, 607107849,MDExOlB1bGxSZXF1ZXN0NDA5MTUzODcw,739,Configuration directory mode,9599,closed,0,,,3,2020-04-26T20:37:46Z,2020-04-27T16:30:25Z,2020-04-27T16:30:25Z,OWNER,simonw/datasette/pulls/739,"Refs #731 TODO: - [x] Decide how to combine explicit command-line options with items detected from the directory structure - [x] Add unit tests - [x] Implement `inspect-data.json` mechanism for populating `immutables` - [x] Add documentation",107914493,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/739/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 598891570,MDExOlB1bGxSZXF1ZXN0NDAyNjQ1OTg0,725,"Update aiofiles requirement from ~=0.4.0 to >=0.4,<0.6",27856297,closed,0,,,3,2020-04-13T13:32:47Z,2020-05-04T18:16:54Z,2020-05-04T16:17:49Z,CONTRIBUTOR,simonw/datasette/pulls/725,"Updates the requirements on [aiofiles](https://github.com/Tinche/aiofiles) to permit the latest version.
Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`.
",107914493,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/725/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 646448486,MDExOlB1bGxSZXF1ZXN0NDQwNzM1ODE0,868,initial windows ci setup,702729,open,0,,,3,2020-06-26T18:49:13Z,2021-07-10T23:41:43Z,,FIRST_TIME_CONTRIBUTOR,simonw/datasette/pulls/868,Picking up the work done on #557 with a new PR. Seeing if I can get this working.,107914493,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/868/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 688386219,MDExOlB1bGxSZXF1ZXN0NDc1NjY1OTg0,142,"insert_all(..., alter=True) should work for new columns introduced after the first 100 records",96218,closed,0,,,3,2020-08-28T22:22:57Z,2020-08-30T07:28:23Z,2020-08-28T22:30:14Z,CONTRIBUTOR,simonw/sqlite-utils/pulls/142,Closes #139.,140912432,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/142/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 726910999,MDExOlB1bGxSZXF1ZXN0NTA3OTAzMzky,1040,/db/table/-/blob/pk/column.blob download URL,9599,closed,0,,6026070,3,2020-10-21T22:39:15Z,2020-10-24T23:09:20Z,2020-10-24T23:09:19Z,OWNER,simonw/datasette/pulls/1040,"Refs #1036. Still needs: - [x] Comprehensive tests across all of the code branches, plus permissions - [x] A bit more refactoring to share logic cleanly with `RowView` - ~~A configuration option to disable this feature (probably)~~",107914493,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1040/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 735663855,MDExOlB1bGxSZXF1ZXN0NTE1MDE0ODgz,195,table.search() improvements plus sqlite-utils search command,9599,closed,0,,,3,2020-11-03T22:02:08Z,2020-11-06T18:30:49Z,2020-11-06T18:30:42Z,OWNER,simonw/sqlite-utils/pulls/195,Refs #192. Still needs tests.,140912432,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/195/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 729818242,MDExOlB1bGxSZXF1ZXN0NTEwMjM1OTA5,189,Allow iterables other than Lists in m2m records,35681,closed,0,,,3,2020-10-26T18:47:44Z,2020-10-27T16:28:37Z,2020-10-27T16:24:21Z,CONTRIBUTOR,simonw/sqlite-utils/pulls/189,"I was playing around with sqlite-utils, creating a Roam Research dogsheep-style importer for Datasette, and ran into a slight snag. I wanted to use a generator to add an order column in an importer. It looked something like: ``` def order_generator(iterable, attr=None): if attr is None: attr = ""order"" order: int = 0 for i in iterable: i[attr] = order order += 1 yield i ``` When I used this with `insert_all` and other things, it worked fine--but it didn't work as the `records` argument to `m2m`. I dug into it, and sqlite-utils is explicitly checking if the records argument is a list or a tuple. I flipped the check upside down, and now it checks if the argument is a mapping. If it's a mapping, it wraps it in a list, otherwise it leaves it alone. (I get that it might not really make sense to put the order column on the second table. 
I changed my import schema a bit, and no longer have a real example, but maybe this change still makes sense.) The automated tests still pass, but I did not add any new ones. Let me know what you think! I'm really loving Datasette and its ecosystem; thanks for everything!",140912432,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/189/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 752749485,MDExOlB1bGxSZXF1ZXN0NTI4OTk3NjE0,1112,Fix --metadata doc usage,50527,closed,0,,6055094,3,2020-11-28T19:19:51Z,2020-11-28T23:28:21Z,2020-11-28T19:53:48Z,CONTRIBUTOR,simonw/datasette/pulls/1112,"I stumbled on this while trying to figure out how to configure datasette-ripgrep via https://github.com/simonw/datasette-ripgrep/issues/15 You may not want to update the changelog (those are annoying) so I added two commits in case that's easier. ",107914493,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1112/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 778126516,MDExOlB1bGxSZXF1ZXN0NTQ4MjcxNDcy,1170,Install Prettier via package.json,3637,closed,0,,6346396,3,2021-01-04T14:18:03Z,2021-01-24T21:21:01Z,2021-01-04T19:52:34Z,CONTRIBUTOR,simonw/datasette/pulls/1170,This adds a package.json with Prettier and means that developers/CI will use the same version. It also ensures that NPM packages are cached on GitHub Actions which fixes #1169.,107914493,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1170/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 771872303,MDExOlB1bGxSZXF1ZXN0NTQzMjQ2NTM1,59,Remove unneeded exists=True for -a/--auth flag.,631242,closed,0,,,3,2020-12-21T06:03:55Z,2021-05-22T14:06:19Z,2021-05-19T16:08:12Z,CONTRIBUTOR,dogsheep/github-to-sqlite/pulls/59,The file does not need to exist when using an environment variable.,207052882,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/59/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 797649915,MDExOlB1bGxSZXF1ZXN0NTY0NjA4MjY0,1211,Use context manager instead of plain open,4488943,closed,0,,,3,2021-01-31T07:58:10Z,2021-03-11T16:15:50Z,2021-03-11T16:15:50Z,CONTRIBUTOR,simonw/datasette/pulls/1211,"A context manager with open closes the files after usage. Fixes: https://github.com/simonw/datasette/issues/1208 When the object is already a pathlib.Path I used the read_text/write_text functions. In some cases pathlib.Path.open was used in a context manager; it is basically the same as the builtin open. 
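The two patterns described, side by side (the file name is illustrative):

```python
from pathlib import Path

# a with block closes the file deterministically instead of waiting
# for garbage collection to do it
with open(""metadata.json"") as fp:
    metadata = fp.read()

# for pathlib.Path objects, read_text() opens and closes internally
metadata = Path(""metadata.json"").read_text()
```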
Tests are passing: 850 passed, 5 xfailed, 10 xpassed",107914493,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1211/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 793002853,MDExOlB1bGxSZXF1ZXN0NTYwNzYwMTQ1,1204,WIP: Plugin includes,9599,open,0,,,3,2021-01-25T03:59:06Z,2021-12-17T07:10:49Z,,OWNER,simonw/datasette/pulls/1204,"Refs #1191 Next steps: - [ ] Get comfortable that this pattern is the right way to go - [ ] Implement it for all of the other pages, not just the table page - [ ] Add a new set of plugin tests that exercise ALL of these new hook locations - [ ] Document, then ship",107914493,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1204/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1, 793086333,MDExOlB1bGxSZXF1ZXN0NTYwODMxNjM4,1206,Release 0.54,9599,closed,0,,,3,2021-01-25T06:45:47Z,2021-01-25T17:33:30Z,2021-01-25T17:33:29Z,OWNER,simonw/datasette/pulls/1206,Refs #1201,107914493,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1206/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 810507413,MDExOlB1bGxSZXF1ZXN0NTc1MTg3NDU3,1229,ensure immutable databses when starting in configuration directory mode with,295329,closed,0,,,3,2021-02-17T20:18:26Z,2022-04-22T13:16:36Z,2021-03-29T00:17:32Z,CONTRIBUTOR,simonw/datasette/pulls/1229,"fixes #1224 This PR ensures all databases found in a configuration directory that match the files in `inspect-data.json` will be set to `immutable` as outlined in https://docs.datasette.io/en/latest/settings.html#configuration-directory-mode specifically on building the `datasette` instance it checks: - if `immutables` is an empty tuple - as passed by the cli code - if `immutables` is the default function value `None` - when it's not explicitly set And correctly builds the immutable database list from the `inspect-data[file]` keys. Note for this to work the `inspect-data.json` file must contain `file` paths which are relative to the configuration directory otherwise the file paths won't match and the dbs won't be set to immutable. I couldn't find an easy way to test this due to the way `make_app_client` works, happy to take directions on adding a test for this. I've updated the relevant docs as well, i.e. 
use the `inspect` cli cmd from the config directory path to create the relevant file ``` cd $config_dir datasette inspect *.db --inspect-file=inspect-data.json ``` https://docs.datasette.io/en/latest/performance.html#using-datasette-inspect",107914493,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1229/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 827341657,MDExOlB1bGxSZXF1ZXN0NTg5MjYzMjk3,1256,Minor type in IP adress,6371750,closed,0,,,3,2021-03-10T08:28:22Z,2021-03-10T18:26:46Z,2021-03-10T18:26:40Z,CONTRIBUTOR,simonw/datasette/pulls/1256,127.0.01 replaced by 127.0.0.1,107914493,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1256/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 837956424,MDExOlB1bGxSZXF1ZXN0NTk4MjEzNTY1,1271,Use SQLite conn.interrupt() instead of sqlite_timelimit(),9599,open,0,,,3,2021-03-22T17:34:20Z,2021-03-22T21:49:27Z,,OWNER,simonw/datasette/pulls/1271,"Refs #1270, #1268, #1249 Before merging this I need to do some more testing (to make sure that expensive queries really are properly cancelled). I also need to delete a bunch of code relating to the old mechanism of cancelling queries. [See comment below: this doesn't actually cancel the query due to a thread-local confusion]",107914493,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1271/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1, 904537568,MDExOlB1bGxSZXF1ZXN0NjU1Njg0NDc3,1346,Re-display user's query with an error message if an error occurs,9599,closed,0,,,3,2021-05-28T02:04:20Z,2021-06-02T03:46:21Z,2021-06-02T03:46:21Z,OWNER,simonw/datasette/pulls/1346,Refs #619,107914493,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1346/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 929748885,MDExOlB1bGxSZXF1ZXN0Njc3NTU0OTI5,293,Test against Python 3.10-dev,9599,closed,0,,,3,2021-06-25T01:40:39Z,2021-10-13T21:49:33Z,2021-10-13T21:49:33Z,OWNER,simonw/sqlite-utils/pulls/293,,140912432,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/293/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 988325628,MDExOlB1bGxSZXF1ZXN0NzI3MjY1MDI1,1455,Add scientists to target groups,198537,closed,0,,,3,2021-09-04T16:28:58Z,2021-09-04T16:32:21Z,2021-09-04T16:31:38Z,CONTRIBUTOR,simonw/datasette/pulls/1455,"Not sure if you want them mentioned explicitly (it's already a long list), but following up on https://twitter.com/simonw/status/1434176989565382656",107914493,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1455/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 991575770,MDExOlB1bGxSZXF1ZXN0NzMwMDIwODY3,1467,Add Authorization header when CORS flag is set,3058200,closed,0,,,3,2021-09-08T22:14:41Z,2021-10-17T02:29:07Z,2021-10-14T18:54:18Z,NONE,simonw/datasette/pulls/1467,"This PR adds the [`Access-Control-Allow-Headers`](https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/Access-Control-Allow-Headers) flag 
when CORS mode is enabled. This would fix https://github.com/simonw/datasette-auth-tokens/issues/4. When making cross-origin requests, the server must respond with all allowable HTTP headers. A Datasette instance using auth tokens must accept the `Authorization` HTTP header in order for cross-origin authenticated requests to take place. Please let me know if there's a better way of doing this! I couldn't figure out a way to change the app's response from the plugin itself, so I'm starting here. If you'd rather this logic live in the plugin, I'd love any guidance you're able to give.",107914493,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1467/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 400229984,MDU6SXNzdWU0MDAyMjk5ODQ=,401,How to pass configuration to plugins?,1055831,closed,0,,,3,2019-01-17T11:20:41Z,2019-01-18T11:48:13Z,2019-01-18T06:49:07Z,NONE,,"Hi, Firstly, thanks for your work on datasette, it is a hugely useful tool! I've been working on a fork [https://github.com/dazzag24/datasette-cluster-map] of datasette-cluster-map to allow the tileserver to be easily switched. Primarily because the tiles being served in the current version use localised text for labels and I'd like to have English used for these names instead. It uses http://leaflet-extras.github.io/leaflet-providers/preview/ to allow you to simply set the tile provider using a call like so: ``` let tiles = L.tileLayer.provider('Esri.WorldTopoMap'); ``` instead of the current: ``` let tiles = L.tileLayer('https://{s}.tile.openstreetmap.org/{z}/{x}/{y}.png', { maxZoom: 19, detectRetina: true, attribution: '© OpenStreetMap contributors' }), ``` However I've got stuck in trying to work out how to pass the provider string to the plugin. In the documentation: https://datasette.readthedocs.io/en/stable/plugins.html you discuss configuration of plugins and use an example of passing in which latitude and longitude columns should be used. However I cannot seem to see anywhere in the current datasette-cluster-map code where these config params are passed in or used. Can you please point me to an example or how to pass configuration from the metadata.json down into a plugin. Once I've over come this issue I was wondering if you would be interested in taking this change into your version? Many thanks Darren",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/401/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
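The general mechanism, for anyone landing here: plugin settings live under a `plugins` key in `metadata.json`, and a plugin reads its own section back with `datasette.plugin_config()`. A sketch - the hook used and the `tile_provider` setting name are illustrative, not the actual datasette-cluster-map API:

```python
import json
from datasette import hookimpl

# metadata.json would contain something like:
# {""plugins"": {""datasette-cluster-map"": {""tile_provider"": ""Esri.WorldTopoMap""}}}

@hookimpl
def extra_body_script(datasette):
    # read this plugin's section of the metadata back out
    config = datasette.plugin_config(""datasette-cluster-map"") or {}
    provider = config.get(""tile_provider"", ""OpenStreetMap"")
    return ""window.TILE_PROVIDER = {};"".format(json.dumps(provider))
```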